How to launch an AWS cloud formation stack with package dependencies?

I am trying to get this repo working: https://github.com/mydatastack/google-analytics-to-s3

A link is provided to launch the AWS CloudFormation stack, but it no longer works because the S3 bucket containing the template is no longer active.

So I tried to launch the stack myself with sam deploy --guided. It starts building the stack but fails partway through with the following error:

C:\Users\Me\GAS3\cloudformation>sam deploy --guided

Configuring SAM deploy
======================

        Looking for config file [samconfig.toml] :  Found
        Reading default arguments  :  Success

        Setting default arguments for 'sam deploy'
        =========================================
        Stack Name [GA_2_S3]:
        AWS Region [eu-central-1]:
        Parameter Name [pipes]:
        Parameter Stage [local]:
        Parameter AdminEmail [info@project.com]:
        Parameter FallbackEmail [info@project.com]:
        Parameter S3AlarmPeriod [60]:
        #Shows you resources changes to be deployed and require a 'Y' to initiate deploy
        Confirm changes before deploy [Y/n]: y
        #SAM needs permission to be able to create roles to connect to the resources in your template
        Allow SAM CLI IAM role creation [Y/n]: y
        Save arguments to configuration file [Y/n]: y
        SAM configuration file [samconfig.toml]:
        SAM configuration environment [default]:

        Looking for resources needed for deployment: Found!

                Managed S3 bucket: aws-sam-cli-managed-default-samclisourcebucket-1vcjy21utm1w6
                A different default S3 bucket can be set in samconfig.toml

        Saved arguments to config file
        Running 'sam deploy' for future deployments will use the parameters saved above.
        The above parameters can be changed by modifying samconfig.toml
        Learn more about samconfig.toml syntax at
        https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-config.html


File with same data already exists at GA_2_S3/d5396e95465bde0f60dbd769db9fe763, skipping upload
File with same data already exists at GA_2_S3/df3bbd85d54385405a650fc656f1ac19, skipping upload
File with same data already exists at GA_2_S3/2c01865beec56ebee30ae5b24e6f50e3, skipping upload
File with same data already exists at GA_2_S3/4adb166d233b6e3a1badf491522b0bcc, skipping upload
Error: Unable to upload artifact ./collector-ga.yaml referenced by Location parameter of GoogleAnalyticsCollectorStack resource.
Unable to upload artifact ../functions/lambda-layers/paramiko/ referenced by ContentUri parameter of SFTPLayer resource.
Parameter ContentUri of resource SFTPLayer refers to a file or folder that does not exist C:\Users\Me\GAS3\functions\lambda-layers\paramiko

Checking the folders, there is no ./lambda-layers/ folder or paramiko package on GitHub. I tried downloading the paramiko package from GitHub and creating the referenced /functions/lambda-layers/paramiko/ folder, but that did not work.

Looking at ./collector-ga.yaml, this is the part that fails:

  SFTPLayer:
    Condition: SFTPUploadActivate
    Type: AWS::Serverless::LayerVersion
    Properties:
      LayerName: paramiko
      Description: paramkio lib for sftp connect
      ContentUri: ../functions/lambda-layers/paramiko/
      CompatibleRuntimes:
        - python3.7
      LicenseInfo: MIT should be added here 
      RetentionPolicy: Retain

The ContentUri location specified for paramiko does not exist on GitHub, so it must be built in some other way, since the original repo was designed to be launched from a working Launch Stack button.
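
My working assumption (which may well be wrong) is that this folder is simply meant to be populated locally before running sam deploy, so something along these lines would recreate it:

    pip install paramiko -t ../functions/lambda-layers/paramiko/python/

(run from the cloudformation folder; I am not sure whether the packages belong directly in that folder or under a python/ prefix).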

My question is: how do I launch this stack with the required paramiko package?

You need a proper deployment pipeline. If you are on AWS, you can use AWS CodePipeline to deploy your Lambda functions. Since you have build dependencies, you need a build stage that actually fetches and builds everything your Lambda deployment package or layer requires.
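
For example, the build stage can be a CodeBuild project whose buildspec installs the layer content and then runs SAM. The sketch below assumes the repo layout shown in the error output above; the folder names and deploy options are illustrative, not taken from the repo:

    # buildspec.yml (rough sketch)
    version: 0.2
    phases:
      build:
        commands:
          # populate the folder that SFTPLayer's ContentUri points at
          - pip install paramiko -t functions/lambda-layers/paramiko/python/
          # package and deploy from the folder containing the template and samconfig.toml
          - cd cloudformation
          - sam build
          - sam deploy --no-confirm-changeset --capabilities CAPABILITY_IAM --resolve-s3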

Alternatively, you can use a pre-built or public paramiko layer, such as this one. That way you keep your code separate from its dependencies.
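
If you go that route, you would drop (or leave deactivated via its condition) the SFTPLayer resource and attach the existing layer to the consuming function by ARN instead, roughly like this. The resource name, handler, code path, and layer ARN below are placeholders, not values from the repo:

      UploaderFunction:
        Type: AWS::Serverless::Function
        Properties:
          Runtime: python3.7
          Handler: app.handler
          CodeUri: ../functions/uploader/
          Layers:
            # ARN of whichever public or pre-built paramiko layer you choose
            - arn:aws:lambda:eu-central-1:123456789012:layer:paramiko:1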