Python script to write(+lock) / read a file in Azure

I am new to Python programming and to Azure.

I need to write a script that will be executed by two processes.

Both processes will run the same Python script. I know Azure has storage accounts where you can put files, and I found this: https://docs.microsoft.com/en-us/python/api/azure-storage-file/azure.storage.file.fileservice.fileservice?view=azure-python

and this: https://github.com/Azure/azure-storage-python

Here is some pseudocode to illustrate what I need to achieve:

function useStorageFile
   if (fileFromStorage == null)
      createFileInStorage;
      lockFileInStorage;
      executeDockerCommand;
      writeResultOfCommandInStorageFile;
   else
      X: if (fileFromStorage.status != 'locked')
            readResultFromFile;
         else
            wait 1s;
            continue X;
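
Rendered as a rough Python skeleton, the same flow would look like this (the storage object and its helpers are hypothetical placeholders to show the structure, not a real Azure API):

import time

def execute_docker_command():
    # Placeholder for the real docker invocation.
    return "command output"

def use_storage_file(storage):
    # 'storage' is a hypothetical wrapper around the shared file.
    if storage.get_file() is None:
        # First process: create and lock the file, run the command,
        # and write its result.
        storage.create_file()
        storage.lock_file()
        storage.write_result(execute_docker_command())
    else:
        # Second process: poll until the lock is released, then read.
        while storage.get_file().status == 'locked':
            time.sleep(1)
        return storage.read_result()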

Is it possible to lock/unlock a file in Azure? How could I achieve that in Python, for example? Thank you.

EDIT I managed to write a file to Blob storage using a Python script. The question now is: how can I lock the file while the first process writes the command result to it, and have the second process read it as soon as the Blob storage lock (if that option exists...) is released by the first process? Here is the Python script I used:

import os, uuid, sys
from azure.storage.blob import BlockBlobService, PublicAccess

def run_sample():
    try:
        # Create the BlockBlobService that is used to call the Blob service for the storage account
        block_blob_service = BlockBlobService(account_name='xxxxxx', account_key='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx')

        # Create a container called 'quickstartblobs'.
        container_name ='quickstartblobs'
        block_blob_service.create_container(container_name)

        # Set the permission so the blobs are public.
        block_blob_service.set_container_acl(container_name, public_access=PublicAccess.Container)

        # Create a file in Documents to test the upload and download.
        local_path=os.path.abspath(os.path.curdir)
        local_file_name ='youss.txt'
        full_path_to_file =os.path.join(local_path, local_file_name)

        # Write text to the file.
        with open(full_path_to_file, 'w') as f:
            f.write("Hello, World!")

        print("Temp file = " + full_path_to_file)
        print("\nUploading to Blob storage as blob" + local_file_name)

        # Upload the created file, use local_file_name for the blob name
        block_blob_service.create_blob_from_path(container_name, local_file_name, full_path_to_file)

        # List the blobs in the container
        print("\nList blobs in the container")
        generator = block_blob_service.list_blobs(container_name)
        for blob in generator:
            print("\t Blob name: " + blob.name)

        # Download the blob(s).
        # Add '_DOWNLOADED' as prefix to '.txt' so you can see both files in Documents.
        full_path_to_file2 = os.path.join(local_path, local_file_name.replace('.txt', '_DOWNLOADED.txt'))
        print("\nDownloading blob to " + full_path_to_file2)
        block_blob_service.get_blob_to_path(container_name, local_file_name, full_path_to_file2)

        sys.stdout.write("Sample finished running. When you hit <any key>, the sample will be deleted and the sample "
                         "application will exit.")
        sys.stdout.flush()
        input()

        # Clean up resources. This includes the container and the temp files
        block_blob_service.delete_container(container_name)
        os.remove(full_path_to_file)
        os.remove(full_path_to_file2)
    except Exception as e:
        print(e)


# Main method.
if __name__ == '__main__':
    run_sample()


Azure Blob Storage has a feature called Lease that you can use here. Essentially, leasing acquires an exclusive lock on a resource (in your case, a blob), and only one process can hold a lease on a blob at a time. Once a lease has been acquired on a blob, no other process can modify or delete it.

So what you need to do is try to acquire a lease on the blob before writing to it. If the blob is already leased, the acquire call fails with an error (HTTP status code 409, LeaseAlreadyPresent; note also that writing to a leased blob without supplying the lease ID fails with 412, PreconditionFailed). Assuming you don't get an error, you can go ahead and update the file. Once the file has been updated, you can release the lock manually (break or release the lease) or let the lease expire automatically. If you do get the error, you should wait and check the blob's lease status periodically (say, every 5 seconds). As soon as you find that the blob is no longer leased, you can read its contents.
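
Here is a minimal sketch of that flow, using the same legacy azure-storage-blob SDK as your script (the account credentials, container, and blob name reuse your script's placeholders, and the blob is assumed to already exist, since a lease can only be acquired on an existing blob):

import time

from azure.common import AzureConflictHttpError
from azure.storage.blob import BlockBlobService

block_blob_service = BlockBlobService(account_name='xxxxxx', account_key='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx')
container_name = 'quickstartblobs'
blob_name = 'youss.txt'

try:
    # Try to take the exclusive lock. A lease is either infinite (-1)
    # or 15-60 seconds long; a finite lease expires on its own if the
    # holder never releases it.
    lease_id = block_blob_service.acquire_blob_lease(container_name, blob_name, lease_duration=60)

    # We hold the lease: produce the result and write it to the blob.
    # The lease_id must accompany the write, or it fails with 412.
    result = "output of the docker command"  # stand-in for the real result
    block_blob_service.create_blob_from_text(container_name, blob_name, result, lease_id=lease_id)

    # Release the lease so the other process can read the blob.
    block_blob_service.release_blob_lease(container_name, blob_name, lease_id)
except AzureConflictHttpError:
    # Another process holds the lease: poll its status every 5 seconds
    # until it is released (or expires), then read the contents.
    while block_blob_service.get_blob_properties(container_name, blob_name).properties.lease.status == 'locked':
        time.sleep(5)
    print(block_blob_service.get_blob_to_text(container_name, blob_name).content)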