How to batch download data from URLs in R

I have many urls. Each url points to a csv file, and each csv file has its own name.

I want to download the data from the urls and save it on my computer.

I tried the code from "batch download zipped files in R", but it failed.

So I would like to know whether there is a simple way to batch download data from the urls and save it on my computer.

urls = c(
'http://minio.tapdata.org.cn:9000/tap-bj-1km/input_v3/Tile_162_lonlat.csv.zip?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=24BJXNVDJVVCUTC9CQZ1%2F20220418%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20220418T011215Z&X-Amz-Expires=604800&X-Amz-SignedHeaders=host&X-Amz-Signature=3d203f787748209654fc863992c6b51f206df3146dd8054cf8b4aea1ffc9150f',
'http://minio.tapdata.org.cn:9000/tap-bj-1km/input_v3/2010/1/China_PM25_1km_2010_001_162.csv.zip?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=24BJXNVDJVVCUTC9CQZ1%2F20220418%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20220418T043413Z&X-Amz-Expires=604800&X-Amz-SignedHeaders=host&X-Amz-Signature=f3f8e4bbac9227e30e314dc8dd4dc0802a3e54719a07a7754ccae4609f0df330',
'http://minio.tapdata.org.cn:9000/tap-bj-1km/input_v3/2010/1/China_PM25_1km_2010_002_162.csv.zip?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=24BJXNVDJVVCUTC9CQZ1%2F20220418%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20220418T043413Z&X-Amz-Expires=604800&X-Amz-SignedHeaders=host&X-Amz-Signature=c27655cfbb61a7a6f9c9cb2c2e01624037e84c5bc0c4aecb59bb2975e2c21466',
'http://minio.tapdata.org.cn:9000/tap-bj-1km/input_v3/2010/1/China_PM25_1km_2010_003_162.csv.zip?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=24BJXNVDJVVCUTC9CQZ1%2F20220418%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20220418T043413Z&X-Amz-Expires=604800&X-Amz-SignedHeaders=host&X-Amz-Signature=c7baba0660d28263033dc0df5db4cdb851d1a7b6a36e5b369e3dfe658b8f5305'
)
df_urls = data.frame(url = urls) # all the urls saved in an R data frame

To download the files into your working directory, we can use the downloader package. The zip file name is extracted from each url with gsub().
library(downloader)
lapply(urls, function(x){
  # extract the base file name from the url and append .zip
  nam = gsub(".*[/]([^.]+)[.].*", "\\1", x)
  nam = paste0(nam, '.zip')
  # download the zip file to your working directory (binary mode)
  download.file(x, nam, mode = 'wb')
})