Is there a way to cache a DockerHub image in a Bitbucket pipeline?
I am using an external Docker image from DockerHub.
In every step, the Docker image is pulled from DockerHub again and again. Yes, this is the intended workflow.
My question is: can we cache this image so that it is not pulled from DockerHub in every step? The image does not change often, since it only has Node and Meteor pre-installed.
So, is it possible to cache the Docker image?
Original bitbucket-pipelines.yml:
image: tasktrain/node-meteor-mup

pipelines:
  branches:
    '{develop}':
      - step:
          name: "Client: Install Dependencies"
          caches:
            - node
          script:
            - npm install
            - npm run setup-meteor-client-bundle
          artifacts:
            - node_modules/**
      - step:
          name: "Client: Build for Staging"
          script:
            - npm run build-browser:stag
          artifacts:
            - dist/**
      - step:
          name: "Client: Deploy to Staging"
          deployment: staging
          script:
            - pipe: atlassian/aws-s3-deploy:0.2.2
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: $AWS_DEFAULT_REGION
                S3_BUCKET: $S3_STAGING_BUCKET_NAME
                LOCAL_PATH: 'dist'
                ACL: "public-read"
                DELETE_FLAG: "true"
                EXTRA_ARGS: "--follow-symlinks --quiet"
      - step:
          name: "Server: Build and Deploy to Staging"
          script:
            - cd server
            - mup setup --config=.deploy/mup-settings.stag.js
            - mup deploy --config=.deploy/mup-settings.stag.js --settings=meteor-settings.stag.json
It is indeed possible to cache dependencies: docker is one of Bitbucket Pipelines' pre-defined caches.
pipelines:
  default:
    - step:
        services:
          - docker
        caches:
          - docker
        script:
          - docker pull my-own-repository:5000/my-image
As the OP noted in a comment on the other answer, defining a Docker cache does not apply to the build image itself. The image

image: tasktrain/node-meteor-mup

is always downloaded for each step, and the step's script is then executed inside that image. As far as I know, the Docker cache

  services:
    - docker
  caches:
    - docker

only applies to images that are pulled or built within a step.
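For images pulled inside a step's script, a possible workaround (not part of the original answers; the cache name dockerimage, the directory docker-image-cache, and the image name are illustrative) is to persist the image tarball in a custom cache with docker save and restore it with docker load. Note this cannot help with the top-level build image, only with images used via docker commands within a step:

```yaml
definitions:
  caches:
    # Custom cache: the directory holding the saved image tarball
    dockerimage: docker-image-cache

pipelines:
  default:
    - step:
        services:
          - docker
        caches:
          - dockerimage
        script:
          # Restore the image from the cache if a tarball exists
          - docker load -i docker-image-cache/image.tar || true
          # Pull only downloads layers not already present locally
          - docker pull tasktrain/node-meteor-mup
          # Save the image back into the cached directory for the next run
          - mkdir -p docker-image-cache
          - docker save -o docker-image-cache/image.tar tasktrain/node-meteor-mup
```

Whether this is faster than pulling from DockerHub depends on image size and cache transfer times, so it is worth measuring before adopting it.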
However, according to this blog post, Bitbucket Pipelines has recently started caching public build images internally:
Public image caching – Behind the scenes, Pipelines has recently started caching public Docker images, resulting in a noticeable boost to startup time to all builds running on our infrastructure.
There is also an open feature request to cache private build images as well.