Where are datasets downloaded using torchvision?
I am new to Colab. When I use torchvision to download a dataset, like
trainset = torchvision.datasets.CIFAR10(root='./data', train=True, download=True)
my understanding is that I am downloading the data into the folder ./data. But where is ./data on my computer? I am on Windows 10. Thanks.
As stated in an article from Towards Data Science:
There is one big issue with Google Colab, often discussed before, which is the storage of your data. Notebooks, for example, Jupyter notebooks, often use data files stored locally, on your computer. This is often done using a simple read_csv statement or comparable. But Google Colaboratory is running in the Cloud. The Cloud’s local is not your local. Therefore a read_csv statement will search for the file on Google’s side rather than on your side. And then it will not find it.
Long story short: the downloaded data is stored temporarily on the cloud instance's local storage. If your Colab notebook disconnects, rest assured that such data is lost forever.
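For illustration only (not part of the original answer), here is a minimal sketch you could run in a Colab cell to see that ./data resolves on the cloud VM rather than on your Windows machine; the printed path is typically under /content, and the listed file names are simply what a CIFAR-10 download usually produces:

import os
import torchvision

# Download CIFAR-10 relative to the notebook's current working directory.
trainset = torchvision.datasets.CIFAR10(root='./data', train=True, download=True)

# On Colab this usually prints /content/data, a folder on the VM's ephemeral disk.
print(os.path.abspath('./data'))

# Typically shows something like ['cifar-10-python.tar.gz', 'cifar-10-batches-py'].
print(os.listdir('./data'))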
How to solve the problem (of losing the downloaded data): use Google Drive together with your Google Colaboratory notebook. For more on that, read the article mentioned above.
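As a rough sketch of that Drive-based workaround (assuming a Colab runtime and a Google account with Drive space; the folder name MyDrive/cifar10 is an arbitrary example, not taken from the article):

from google.colab import drive
import torchvision

# Mount your Google Drive into the Colab VM's filesystem.
drive.mount('/content/drive')

# Point torchvision at a Drive folder instead of the VM's ephemeral ./data,
# so the downloaded files survive a runtime disconnect.
drive_root = '/content/drive/MyDrive/cifar10'
trainset = torchvision.datasets.CIFAR10(root=drive_root, train=True, download=True)

On later runs the files are already present in Drive, so torchvision verifies them instead of downloading them again.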