How to pass a parameter to a Bitbucket pipeline pipe included in a step definition?

I have the following Bitbucket pipeline:

# This image is used in both "Zip and Upload to S3 steps"
image: atlassian/default-image:3

pipelines:
  default:
    - step:
        script:
          # Include this step so that we get a successful build whenever a
          # commit is made to a branch. This allows us to merge to the staging
          # branch, which requires a prior successful build.
          - echo "The current branch is $BITBUCKET_BRANCH"

  branches:
    staging:
      - step:
          # We have decided to include the Zip and Upload to S3 sections of the
          # pipeline in the same step because of a bug affecting artifacts in
          # pipelines. See https://jira.atlassian.com/browse/BCLOUD-21666 for
          # details.
          name: Zip & Upload to S3
          services:
            - docker
          oidc: true
          script:
            # Build the zip file containing the contents of the applications
            # folder
            - zip -r $BITBUCKET_REPO_SLUG.zip "applications"
            # Upload the zip file to S3
            - pipe: atlassian/aws-code-deploy:1.1.1
              variables:
                AWS_DEFAULT_REGION: $AWS_REGION
                AWS_OIDC_ROLE_ARN: $AWS_OIDC_ROLE_ARN
                COMMAND: "upload"
                APPLICATION_NAME: $APPLICATION_NAME
                ZIP_FILE: $BITBUCKET_REPO_SLUG.zip
                S3_BUCKET: $S3_BUCKET_STAGING
                VERSION_LABEL: $BITBUCKET_REPO_SLUG

      - step:
          name: Deploy code
          deployment: staging
          oidc: true
          script:
            - pipe: atlassian/aws-code-deploy:1.1.1
              variables:
                AWS_DEFAULT_REGION: $AWS_REGION
                AWS_OIDC_ROLE_ARN: $AWS_OIDC_ROLE_ARN
                APPLICATION_NAME: $APPLICATION_NAME
                COMMAND: "deploy"
                DEPLOYMENT_GROUP: $DEPLOYMENT_GROUP_STAGING
                S3_BUCKET: $S3_BUCKET_STAGING
                WAIT: "true"
                FILE_EXISTS_BEHAVIOR: OVERWRITE
                VERSION_LABEL: $BITBUCKET_REPO_SLUG

    production:
      - step:
          # We have decided to include the Zip and Upload to S3 sections of the
          # pipeline in the same step because of a bug affecting artifacts in
          # pipelines. See https://jira.atlassian.com/browse/BCLOUD-21666 for
          # details.
          name: Zip & Upload to S3
          services:
            - docker
          oidc: true
          script:
            # Build the zip file
            - zip -r $BITBUCKET_REPO_SLUG.zip "applications"
            # Upload the zip file to S3
            - pipe: atlassian/aws-code-deploy:1.1.1
              variables:
                AWS_DEFAULT_REGION: $AWS_REGION
                AWS_OIDC_ROLE_ARN: $AWS_OIDC_ROLE_ARN
                COMMAND: "upload"
                APPLICATION_NAME: $APPLICATION_NAME
                ZIP_FILE: $BITBUCKET_REPO_SLUG.zip
                S3_BUCKET: $S3_BUCKET_PROD
                VERSION_LABEL: $BITBUCKET_REPO_SLUG

      - step:
          name: Deploy code
          deployment: production
          oidc: true
          script:
            - pipe: atlassian/aws-code-deploy:1.1.1
              variables:
                AWS_DEFAULT_REGION: $AWS_REGION
                AWS_OIDC_ROLE_ARN: $AWS_OIDC_ROLE_ARN
                APPLICATION_NAME: $APPLICATION_NAME
                COMMAND: "deploy"
                DEPLOYMENT_GROUP: $DEPLOYMENT_GROUP_PROD
                S3_BUCKET: $S3_BUCKET_PROD
                WAIT: "true"
                FILE_EXISTS_BEHAVIOR: OVERWRITE
                VERSION_LABEL: $BITBUCKET_REPO_SLUG

As you can see, the staging and production branches are almost identical. The only differences are:

  1. Whether "staging" or "production" is used as the deployment value.
  2. The S3_BUCKET variable differs depending on whether we are deploying to staging or production.

To avoid duplicating code, I would like to create a step definition that I can reuse. For example, the definition of the "Zip & Upload to S3" step would look like this:

definitions:
  steps:
    - step: &EchoCurrentBranch
        name: Echo current branch
        script:
          - echo "The current branch is $BITBUCKET_BRANCH"

    - step: &ZipAndUploadToS3
        name: ⬆️ Zip & Upload to S3
        services:
          - docker
        oidc: true
        script:
          # Build the zip file containing the contents of the applications
          # folder
          - zip -r $BITBUCKET_REPO_SLUG.zip "applications"
          # Upload the zip file to S3
          - pipe: atlassian/aws-code-deploy:1.1.1
            variables:
              AWS_DEFAULT_REGION: $AWS_REGION
              AWS_OIDC_ROLE_ARN: $AWS_OIDC_ROLE_ARN
              COMMAND: "upload"
              APPLICATION_NAME: $APPLICATION_NAME
              ZIP_FILE: $BITBUCKET_REPO_SLUG.zip
              S3_BUCKET: <I want this variable to be changeable>
              VERSION_LABEL: $BITBUCKET_REPO_SLUG

The problem is that I cannot find a way to pass different S3 bucket values to the pipe for different invocations of the defined step. How can I pass a different S3 bucket value to the pipe depending on which branch is being deployed?

I know it is possible to pass different values into a step definition (as shown, for example, in Reusable Bitbucket pipelines configuration with YAML anchors), but I cannot find anything that shows how to pass pipe variables into a defined step.
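
For context, the anchor-override pattern from that post lets you override top-level step keys (name, deployment, and so on) when the anchor is reused, but not a single nested pipe variable such as S3_BUCKET. A minimal sketch of that pattern (step and anchor names here are illustrative, not from the configuration above):

definitions:
  steps:
    - step: &BuildAndTest
        name: Build and test
        script:
          - echo "running build"

pipelines:
  branches:
    staging:
      # Reuse the step as-is
      - step: *BuildAndTest
    production:
      # Reuse the step but override one of its top-level keys
      - step:
          <<: *BuildAndTest
          name: Build and test (production)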

I see that your definition mentions deployments, but have you tried actually using them?

This feature lets you define different values for the same variable per environment.

Populate your deployments with the variables and adjust your pipeline definition. The result should look something like:

pipelines:
  branches:
    staging:
      - step:
          <<: *ZipAndUploadToS3
          deployment: upload-staging
      - step:
          <<: *Deploy
          deployment: deploy-staging

    production:
      - step: 
          <<: *ZipAndUploadToS3
          deployment: upload-production
      - step:
          <<: *Deploy
          deployment: deploy-production
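
For completeness, here is a sketch of how the definitions section could look with this approach. It assumes deployment environments named upload-staging, deploy-staging, upload-production and deploy-production have been created under Repository settings > Deployments, each with its own S3_BUCKET and DEPLOYMENT_GROUP deployment variables, so the anchored steps can reference those variables without any per-branch overrides:

definitions:
  steps:
    - step: &ZipAndUploadToS3
        name: ⬆️ Zip & Upload to S3
        services:
          - docker
        oidc: true
        script:
          - zip -r $BITBUCKET_REPO_SLUG.zip "applications"
          - pipe: atlassian/aws-code-deploy:1.1.1
            variables:
              AWS_DEFAULT_REGION: $AWS_REGION
              AWS_OIDC_ROLE_ARN: $AWS_OIDC_ROLE_ARN
              COMMAND: "upload"
              APPLICATION_NAME: $APPLICATION_NAME
              ZIP_FILE: $BITBUCKET_REPO_SLUG.zip
              # Resolved per environment: the deployment declared by the
              # calling step determines which S3_BUCKET value is used
              S3_BUCKET: $S3_BUCKET
              VERSION_LABEL: $BITBUCKET_REPO_SLUG

    - step: &Deploy
        name: Deploy code
        oidc: true
        script:
          - pipe: atlassian/aws-code-deploy:1.1.1
            variables:
              AWS_DEFAULT_REGION: $AWS_REGION
              AWS_OIDC_ROLE_ARN: $AWS_OIDC_ROLE_ARN
              APPLICATION_NAME: $APPLICATION_NAME
              COMMAND: "deploy"
              DEPLOYMENT_GROUP: $DEPLOYMENT_GROUP
              S3_BUCKET: $S3_BUCKET
              WAIT: "true"
              FILE_EXISTS_BEHAVIOR: OVERWRITE
              VERSION_LABEL: $BITBUCKET_REPO_SLUG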

More information about deployments: https://support.atlassian.com/bitbucket-cloud/docs/variables-and-secrets/#Deployment-variables