Extracting compressed files with a .gz extension while downloading them from an FTP server
I created a function that downloads .gz files from a given FTP server, and I would like to extract them on the fly while downloading and then delete the compressed files. How can I do this?
import getpass
import os
import pathlib
from ftplib import FTP
from urllib.parse import urlparse

sinex_domain = "ftp://cddis.gsfc.nasa.gov/gnss/products/bias/2013"

def download(sinex_domain):
    user = getpass.getuser()
    sinex_parse = urlparse(sinex_domain)
    sinex_connetion = FTP(sinex_parse.netloc)
    sinex_connetion.login()
    sinex_connetion.cwd(sinex_parse.path)
    sinex_files = sinex_connetion.nlst()
    sinex_userpath = "C:\\Users\\" + user + "\\DCBviz\\sinex"
    pathlib.Path(sinex_userpath).mkdir(parents=True, exist_ok=True)

    for fileName in sinex_files:
        local_filename = os.path.join(sinex_userpath, fileName)
        file = open(local_filename, 'wb')
        sinex_connetion.retrbinary('RETR ' + fileName, file.write, 1024)
        # want to extract files in this loop
        file.close()
    sinex_connetion.quit()

download(sinex_domain)
Although there may be a smarter way that avoids holding each file's entire contents in memory, these files appear to be small (a few tens of kilobytes uncompressed), so it should be enough to read the compressed data into a BytesIO buffer and then decompress it in memory before writing it to the output file. (The compressed data is never saved to disk.)
You would add these imports:
import gzip
from io import BytesIO
Then your main loop becomes:
for fileName in sinex_files:
    local_filename = os.path.join(sinex_userpath, fileName)
    # strip the .gz suffix so the output file gets the uncompressed name
    if local_filename.endswith('.gz'):
        local_filename = local_filename[:-3]
    # download the compressed data into an in-memory buffer
    data = BytesIO()
    sinex_connetion.retrbinary('RETR ' + fileName, data.write, 1024)
    data.seek(0)
    # decompress in memory and write the plain file to disk
    uncompressed = gzip.decompress(data.read())
    with open(local_filename, 'wb') as file:
        file.write(uncompressed)
(Note that file.close() is no longer needed; the with statement closes the file automatically.)
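As for the "smarter way" hinted at above: you could decompress each chunk as it arrives in the retrbinary callback, so that neither the compressed data nor the full uncompressed data ever has to sit in memory at once. A minimal sketch, reusing the names from the code above and relying only on the standard-library zlib module (passing 16 + zlib.MAX_WBITS makes decompressobj accept gzip-framed data):

import zlib

for fileName in sinex_files:
    local_filename = os.path.join(sinex_userpath, fileName)
    if local_filename.endswith('.gz'):
        local_filename = local_filename[:-3]
    # a decompressor configured to expect a gzip header and trailer
    decompressor = zlib.decompressobj(16 + zlib.MAX_WBITS)
    with open(local_filename, 'wb') as out:
        # decompress each downloaded block as it arrives and write it straight out
        sinex_connetion.retrbinary(
            'RETR ' + fileName,
            lambda block: out.write(decompressor.decompress(block)),
            1024)
        # write out anything still buffered inside the decompressor
        out.write(decompressor.flush())

The compressed bytes still never touch the disk, but only one block at a time is held in memory, which would matter if the files were ever much larger than a few tens of kilobytes.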