GitLab CI/CD: keeping docker env files
I need to store .env files for docker-compose. One option is to keep their contents in masked variables in GitLab CI/CD, but that doesn't seem safe to me: compromising a single GitLab account would be enough to compromise quite a few applications.

Instead, I'd like to keep the .env files in a directory on the server and copy them into the freshly pulled repository path in the first CI/CD job. I tried artifacts for this, but they were uploaded to GitLab where they can be viewed, and I still couldn't find them in the later jobs (an ls in after_script didn't show them).

How can I copy the .env files into all jobs without uploading them to GitLab?
.gitlab-ci.yml
before_script:
  - docker info
  - docker compose --version

copy_env_files:
  script:
    - cp /home/myuser/myapp/env.* .
  rules:
    - if: $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "master"
  artifacts:
    paths:
      - env.*

build_image:
  script:
    - docker-compose -f docker-compose.yml up -d --build
  rules:
    - if: $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "master"
  artifacts:
    paths:
      - env.*

collect_static_files:
  script:
    - docker-compose exec web python manage.py collectstatic --no-input
  rules:
    - if: $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "master"
  artifacts:
    paths:
      - env.*

migrate_database:
  script:
    - docker-compose exec web python manage.py migrate --no-input
  rules:
    - if: $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "master"
  artifacts:
    paths:
      - env.*

after_script:
  - docker container ls
  - pwd
  - ls
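As an aside on why the artifacts never appeared in later jobs: with no `stages` defined, all four jobs land in the default `test` stage and run in parallel, so no job runs late enough to receive another job's artifacts. A minimal sketch of the stage wiring that would make artifacts propagate (still with the upload drawback the question describes):

```yaml
stages:
  - prepare
  - build

copy_env_files:
  stage: prepare
  script:
    - cp /home/myuser/myapp/env.* .
  artifacts:
    paths:
      - env.*

build_image:
  stage: build
  # Downloads copy_env_files' artifacts into the workspace before script runs.
  dependencies:
    - copy_env_files
  script:
    - docker-compose -f docker-compose.yml up -d --build
```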
How could I copy .env files into all jobs and not upload them on gitlab?
By integrating your gitlab-ci jobs with an external vault, the sensitive data stays stored there securely.

See for example "Authenticating and reading secrets with HashiCorp Vault", though that feature is only available in GitLab Premium.

You can also use external secrets in CI:
- Configure your vault and secrets.
- Generate your JWT and provide it to your CI job.
- Runner contacts HashiCorp Vault and authenticates using the JWT.
- HashiCorp Vault verifies the JWT.
- HashiCorp Vault checks the bounded claims and attaches policies.
- HashiCorp Vault returns the token.
- Runner reads secrets from the HashiCorp Vault.
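The flow above maps onto the `secrets` keyword in `.gitlab-ci.yml` (GitLab Premium). A sketch, where the Vault mount, path, and field names are placeholders for your own setup:

```yaml
read_secrets:
  secrets:
    DATABASE_PASSWORD:
      # Reads field "db_password" at path "myapp" in the "secret" engine mount.
      vault: myapp/db_password@secret
  script:
    # By default the secret is written to a temp file whose path is
    # exposed in $DATABASE_PASSWORD.
    - cat "$DATABASE_PASSWORD"
```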
It turned out I should add `cp /home/myuser/myapp/env.* .` to before_script instead of keeping it as a separate job.

I also fixed my Django --no-input error (by adding -T to docker-compose exec), which only surfaced once docker built successfully.
before_script:
  - docker info
  - docker compose --version
  - cp /home/myuser/myproject/env.* .

build_image:
  script:
    - docker-compose -f docker-compose.yml up -d --build
  rules:
    - if: $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "master"

collect_static_files:
  script:
    - docker-compose exec -T web python manage.py collectstatic --no-input
  rules:
    - if: $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "master"

migrate_database:
  script:
    - docker-compose exec -T web python manage.py migrate --no-input
  rules:
    - if: $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "master"

after_script:
  - docker container ls
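If the host directory is ever missing or empty, the bare cp fails with a terse error from the shell. A small defensive sketch (copy_env is a hypothetical helper, not part of the pipeline above) that fails fast with a clear message:

```shell
#!/bin/sh
# Hypothetical helper: copy every env.* file from a host directory
# into the current checkout, failing loudly if none are found.
copy_env() {
  src=$1
  found=0
  for f in "$src"/env.*; do
    [ -e "$f" ] || continue   # skip the unexpanded glob when nothing matches
    cp "$f" . && found=1
  done
  if [ "$found" -eq 0 ]; then
    echo "no env.* files found in $src" >&2
    return 1
  fi
}
```

In before_script this would replace the cp line with `copy_env /home/myuser/myproject`.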