How to deploy multiple newly pushed Cloud Functions using Google Cloud Build and Source Repositories?

I have a project folder that contains a separate folder for each Cloud Function, for example:

Project_Folder
    -Cloud-Function-Folder1
         -main.py
         -requirements.txt
         -cloudbuild.yaml
    -Cloud-Function-Folder2
         -main.py
         -requirements.txt
         -cloudbuild.yaml
    -Cloud-Function-Folder3
         -main.py
         -requirements.txt
         -cloudbuild.yaml
            --------- and so on!

What I have now: I push the code from each Cloud Function folder to Source Repositories one at a time (a separate repo per function folder). Each push fires a trigger that starts Cloud Build, which then deploys the function. My cloudbuild.yaml file looks like this:

steps:
- name: 'python:3.7'
  entrypoint: 'bash'
  args:
    - '-c'
    - |
        pip3 install -r requirements.txt
        pytest

- name: 'gcr.io/cloud-builders/gcloud'
  args:
    - functions
    - deploy
    - Function
    - --runtime=python37
    - --source=.
    - --entry-point=function_main
    - --trigger-topic=Function
    - --region=europe-west3

Now, what I want is a single source repository: whenever I change the code of one Cloud Function and push it, only that function gets deployed, and the rest stay as they are.


Update

I have now also tried the layout below, but it still deploys all the functions at once, even when I am working on only one of them.

Project_Folder
    -Cloud-Function-Folder1
         -main.py
         -requirements.txt
    -Cloud-Function-Folder2
         -main.py
         -requirements.txt
    -Cloud-Function-Folder3
         -main.py
         -requirements.txt
    -cloudbuild.yaml
    -requirements.txt

The cloudbuild.yaml file looks like this:

steps:
- name: 'python:3.7'
  entrypoint: 'bash'
  args:
    - '-c'
    - |
        pip3 install -r requirements.txt
        pytest

- name: 'gcr.io/cloud-builders/gcloud'
  args:
    - functions
    - deploy
    - Function1
    - --runtime=python37
    - --source=./Cloud-Function-Folder1
    - --entry-point=function1_main
    - --trigger-topic=Function1
    - --region=europe-west3

- name: 'gcr.io/cloud-builders/gcloud'
  args:
    - functions
    - deploy
    - Function2
    - --runtime=python37
    - --source=./Cloud-Function-Folder2
    - --entry-point=function2_main
    - --trigger-topic=Function2
    - --region=europe-west3

If you move to a single source repository, you need a cloudbuild.yaml configuration file for it. Connect this single repo to Cloud Build, then create a build trigger and select this repo as the source; you also need to configure the deployment. Whenever you push new code to the repository, the build and deployment of the Cloud Functions is triggered automatically.

It is more complex, and you have to work within Cloud Build's limitations and constraints.

Here is what I do:

  • Get the directories updated since the last commit
  • Loop over these directories and do what I want
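The "updated directories" step can be sketched outside Cloud Build as a plain shell pipeline; the file paths below are made up to mimic the layout in the question:

```shell
# Simulate `git diff --name-only` output and keep only the unique
# top-level directories that contain changed files; files at the repo
# root (no "/") are filtered out by the grep.
printf '%s\n' \
  'Cloud-Function-Folder1/main.py' \
  'Cloud-Function-Folder1/requirements.txt' \
  'Cloud-Function-Folder3/main.py' \
  'README.md' \
| grep '/' | cut -d'/' -f1 | uniq
```

Note that `uniq` only collapses adjacent duplicates; that is fine here because `git diff --name-only` lists paths in sorted order.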

Assumption 1: all subfolders are deployed with the same commands

For this, I put a single cloudbuild.yaml at the root of the repository rather than in the subfolders:

steps:
- name: 'gcr.io/cloud-builders/git'
  entrypoint: /bin/bash
  args:
    - -c
    - |
        # Cloud Build doesn't keep the .git directory, so clone the repo to recover it
        git clone --branch $BRANCH_NAME https://github.com/guillaumeblaquiere/cloudbuildtest.git /tmp/repo
        # Move only the .git directory into the workspace
        mv /tmp/repo/.git .
        # Diff this commit against the previous one and store the changed top-level directories in a file
        git diff --name-only --diff-filter=AMDR @~..@ | grep "/" | cut -d"/" -f1 | uniq > /workspace/diff

# Do what you want, by performing a loop in to the directory
- name: 'python:3.7'
  entrypoint: /bin/bash
  args:
    - -c
    - |
       for i in $$(cat /workspace/diff); do
       cd $$i
           # No strong isolation between each function, take care of conflicts!!
           pip3 install -r requirements.txt
           pytest
       cd ..
       done

- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: /bin/bash
  args:
    - -c
    - |
       for i in $$(cat /workspace/diff); do
       cd $$i
           gcloud functions deploy .........           
       cd ..
       done
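The elided deploy command could look something like the following for the layout in the question; the entry point and the convention that the folder name doubles as the function name and Pub/Sub topic are assumptions, not part of the original answer:

```shell
# Hypothetical concrete form of the elided command, run inside the loop
# where $i is the changed folder name (e.g. Cloud-Function-Folder1)
gcloud functions deploy "$i" \
  --runtime=python37 \
  --source=. \
  --entry-point=function_main \
  --trigger-topic="$i" \
  --region=europe-west3
```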

Assumption 2: the deployment is specific to each subfolder

For this, I put one cloudbuild.yaml at the root of the repository and another in each subfolder:

steps:
- name: 'gcr.io/cloud-builders/git'
  entrypoint: /bin/bash
  args:
    - -c
    - |
        # Cloud Build doesn't keep the .git directory, so clone the repo to recover it
        git clone --branch $BRANCH_NAME https://github.com/guillaumeblaquiere/cloudbuildtest.git /tmp/repo
        # Move only the .git directory into the workspace
        mv /tmp/repo/.git .
        # Diff this commit against the previous one and store the changed top-level directories in a file
        git diff --name-only --diff-filter=AMDR @~..@ | grep "/" | cut -d"/" -f1 | uniq > /workspace/diff

# Do what you want, by performing a loop in to the directory. Here launch a cloud build
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: /bin/bash
  args:
    - -c
    - |
       for i in $$(cat /workspace/diff); do
       cd $$i
           gcloud builds submit
       cd ..
       done

Watch out for the timeout here: you can trigger many Cloud Builds this way, and they take time.
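As a sketch, the overall build timeout can be raised with the top-level `timeout` field of the root cloudbuild.yaml; the one-hour value below is an arbitrary assumption (the default is 10 minutes):

```yaml
# Root cloudbuild.yaml (steps omitted); `timeout` is a top-level field
timeout: 3600s
```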


If you want to run the build manually, don't forget to add $BRANCH_NAME as a substitution variable:

gcloud builds submit --substitutions=BRANCH_NAME=master

This one is quite simple, but you control the behavior on the build-trigger side rather than in cloudbuild.yaml. Conceptually, you restrict what triggers a Cloud Build, limiting it to certain changes in the repository.

So, use a glob include filter on the build triggers page:

You create one trigger per Cloud Function (or Cloud Run service) and set its "Included files filter (glob)" as follows:

  • Cloud-Function1-Trigger

    Project_Folder/Cloud-Function-Folder1/**

  • Cloud-Function2-Trigger

    Project_Folder/Cloud-Function-Folder2/**

...

Assumptions:

  1. For each trigger, the repo and branch are set so that the repo root contains Project_Folder/
  2. The repo and branch are set appropriately so that the trigger can locate and access the files under the path Project_Folder/Cloud-Function-Folder1/*
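One way to script the per-function triggers is to import them from YAML definitions. This is a sketch under the assumption that the repo is named my-repo in Cloud Source Repositories, using the trigger fields from the Cloud Build API (`triggerTemplate`, `includedFiles`, `filename`):

```yaml
# trigger-function1.yaml -- import with:
#   gcloud beta builds triggers import --source=trigger-function1.yaml
name: cloud-function1-trigger
triggerTemplate:
  repoName: my-repo        # assumption: repo name in Cloud Source Repositories
  branchName: master
includedFiles:
- Project_Folder/Cloud-Function-Folder1/**
filename: Project_Folder/Cloud-Function-Folder1/cloudbuild.yaml
```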

When I have more than 2-3 Cloud Functions, I tend to use Terraform to create all the required triggers automatically.
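A minimal Terraform sketch of that approach, assuming the google provider is already configured; the repo and branch names below are placeholders:

```terraform
# One Cloud Build trigger per function folder, generated with for_each
variable "function_folders" {
  type    = set(string)
  default = ["Cloud-Function-Folder1", "Cloud-Function-Folder2", "Cloud-Function-Folder3"]
}

resource "google_cloudbuild_trigger" "function" {
  for_each    = var.function_folders
  description = "Deploy ${each.value} on change"

  trigger_template {
    repo_name   = "my-repo"   # placeholder: Cloud Source Repositories repo name
    branch_name = "master"
  }

  included_files = ["Project_Folder/${each.value}/**"]
  filename       = "Project_Folder/${each.value}/cloudbuild.yaml"
}
```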

You can do this by creating one folder per function, like this:

Project_Folder
    -Cloud-Function-Folder1
         -main.py
         -requirements.txt
         -cloudbuild.yaml
    -Cloud-Function-Folder2
         -main.py
         -requirements.txt
         -cloudbuild.yaml
    -Cloud-Function-Folder3
         -main.py
         -requirements.txt
         -cloudbuild.yaml
            --------- and so on!

and creating a cloudbuild.yaml in each directory that looks like this:

steps:
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - functions
      - deploy
      - Cloud_Function_1
      - --source=.
      - --trigger-http
      - --runtime=python37
      - --allow-unauthenticated
    dir: "Cloud-Function-Folder1"

Then create one trigger per function in Cloud Build, with an included-files filter that matches only files from that function's folder, and manually set the build configuration to functions-folder-name/cloudbuild.yaml for each trigger.
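As a hedged sketch of creating such a trigger from the command line (the repo name and branch pattern are placeholders; check `gcloud beta builds triggers create cloud-source-repositories --help` for the exact flags on your SDK version):

```shell
gcloud beta builds triggers create cloud-source-repositories \
  --repo=my-repo \
  --branch-pattern='^master$' \
  --build-config=Cloud-Function-Folder1/cloudbuild.yaml \
  --included-files='Cloud-Function-Folder1/**'
```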

In this blog post by Torbjorn Zetterlund you can read about the entire process of deploying multiple Cloud Functions from a single GitHub repository, with filters so that only the changed functions are deployed.