Google Datalab: how to import pickle

Is it possible in Google Datalab to read a pickle/joblib model from Google Cloud Storage using a %%storage clause?

Add the following code to an empty cell and run it:

%%storage read --object <path-to-gcs-bucket>/my_pickle_file.pkl --variable test_pickle_var

Then run the following code:

import pickle
from io import BytesIO

pickle.load(BytesIO(test_pickle_var))
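
The same pattern should also work for a joblib model; a minimal sketch, assuming a model was saved with joblib.dump and uploaded as my_model.joblib (hypothetical name):

# Assumption: my_model.joblib (hypothetical) was read into model_bytes via
#   %%storage read --object <path-to-gcs-bucket>/my_model.joblib --variable model_bytes
from io import BytesIO
import joblib

model = joblib.load(BytesIO(model_bytes))  # joblib.load accepts a file-like object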

I used the code below to upload a pandas DataFrame to Google Cloud Storage as a pickle file and read it back:

from datalab.context import Context
import datalab.storage as storage
import pandas as pd
from io import BytesIO
import pickle

# Use lists (not sets) so the column order is deterministic
df = pd.DataFrame(data=[[1, 2, 3], [4, 5, 6]], columns=['a', 'b', 'c'])

# Create a local pickle file
df.to_pickle('my_pickle_file.pkl')

# Create a bucket in GCS
sample_bucket_name = Context.default().project_id + '-datalab-example'
sample_bucket_path = 'gs://' + sample_bucket_name
sample_bucket = storage.Bucket(sample_bucket_name)
if not sample_bucket.exists():
    sample_bucket.create()

# Write pickle to GCS
sample_item = sample_bucket.item('my_pickle_file.pkl')
with open('my_pickle_file.pkl', 'rb') as f:
    sample_item.write_to(bytearray(f.read()), 'application/octet-stream')

# Read Method 1 - Read pickle from GCS using %storage read (note single % for line magic)
path_to_pickle_in_gcs = sample_bucket_path + '/my_pickle_file.pkl'
%storage read --object $path_to_pickle_in_gcs --variable remote_pickle_1
df_method1 = pickle.load(BytesIO(remote_pickle_1))
print(df_method1)

# Read Alternate Method 2 - Read pickle from GCS using storage.Bucket.item().read_from()
remote_pickle_2 = sample_bucket.item('my_pickle_file.pkl').read_from()
df_method2 = pickle.load(BytesIO(remote_pickle_2))
print(df_method2)
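
As a quick sanity check, both round-tripped frames should match the original DataFrame (a small sketch using the variables above):

# Both read methods should reproduce the original DataFrame exactly
print(df.equals(df_method1))  # True
print(df.equals(df_method2))  # True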

Note: there is a known issue where %storage commands do not work when they are on the first line of a cell. Put a comment or a line of Python code on the first line instead.
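
For example, the following cell layout avoids the issue (a sketch reusing the variable defined above):

# placeholder comment so the %storage line magic is not the first line of the cell
%storage read --object $path_to_pickle_in_gcs --variable remote_pickle_1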