I'm trying to upload files from a Datalab instance to my Google Cloud Storage bucket using the Python API, but I can't figure it out. The code example Google provides in its documentation doesn't seem to work in Datalab. I'm currently using gsutil commands, but I'd like to understand how to do this with the Python API.
File directory (I want to upload the Python files located in the checkpoints folder):
!ls -R
.:
checkpoints README.md tpot_model.ipynb
./checkpoints:
pipeline_2020.02.29_00-22-17.py pipeline_2020.02.29_06-33-25.py
pipeline_2020.02.29_00-58-04.py pipeline_2020.02.29_07-13-35.py
pipeline_2020.02.29_02-00-52.py pipeline_2020.02.29_08-45-23.py
pipeline_2020.02.29_02-31-57.py pipeline_2020.02.29_09-16-41.py
pipeline_2020.02.29_03-02-51.py pipeline_2020.02.29_11-13-00.py
pipeline_2020.02.29_05-01-17.py
Current code:
import google.datalab.storage as storage
from pathlib import Path

bucket = storage.Bucket('machine_learning_data_bucket')

for file in Path('').rglob('*.py'):
    # API CODE GOES HERE
Current working solution:
!gsutil cp checkpoints/*.py gs://machine_learning_data_bucket
Here is the code that worked for me:
from google.cloud import storage
from pathlib import Path

storage_client = storage.Client()
bucket = storage_client.bucket('bucket')

# Recursively find every .py file under the folder and upload each one,
# using the bare filename as the blob name.
for file in Path('/home/jupyter/folder').rglob('*.py'):
    blob = bucket.blob(file.name)
    blob.upload_from_filename(str(file))
    print("File {} uploaded to {}.".format(file.name, bucket.name))
Output:
File file1.py uploaded to bucket.
File file2.py uploaded to bucket.
File file3.py uploaded to bucket.
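One caveat: `file.name` keeps only the bare filename, so two files with the same name in different subfolders would overwrite each other in the bucket. A minimal sketch (the root path and helper name are illustrative, not from the original code) that derives blob names from the path relative to the upload root, preserving the subdirectory structure:

```python
from pathlib import Path

def blob_name_for(path: Path, root: Path) -> str:
    # Use the path relative to the upload root as the blob name,
    # converted to POSIX separators, so the folder structure survives.
    return path.relative_to(root).as_posix()

root = Path('/home/jupyter/folder')  # hypothetical upload root
name = blob_name_for(root / 'checkpoints' / 'pipeline_2020.02.29_00-22-17.py', root)
print(name)  # checkpoints/pipeline_2020.02.29_00-22-17.py
```

You would then pass this name to `bucket.blob(...)` instead of `file.name` before calling `upload_from_filename` as above.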