
I am trying to build a multi-GB Docker image for a Python machine learning application in GitHub Actions, using buildx.

To speed up the builds, I want to split the application Dockerfile into an "application" image that builds FROM a "packages" image, so that if the Python requirements haven't changed I don't have to rebuild them every time (i.e. a sort of cache layer).
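The split might look like the following pair of Dockerfiles (the file names, base image, and tag scheme here are my assumptions for illustration, not part of the question):

```dockerfile
# Dockerfile.packages — rebuilt only when requirements.txt changes
FROM python:3.11-slim
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
```

```dockerfile
# Dockerfile.app — BASE points at the "packages" image in the local registry
ARG BASE=localhost:5000/packages:latest
FROM ${BASE}
COPY . /app
WORKDIR /app
CMD ["python", "main.py"]
```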

My GitHub Actions workflow goes something like this:

  1. Determine whether a "packages" image tagged with the hash of the current requirements.txt file already exists.
  2. If not, build the "packages" image, tag it with the hash of the requirements.txt file, push it to AWS ECR as well as to a local registry, and skip to step 4.
  3. If yes, pull the "packages" image and push it to the local registry.
  4. Use local registry "packages" image as the FROM in a multi-stage build of the "application" image.
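Sketched as a shell script, the steps above might look like this (the registry addresses, repository name, Dockerfile names, and the tag_for / build_or_fetch_packages helpers are my assumptions for illustration):

```shell
#!/usr/bin/env bash
set -euo pipefail

ECR="123456789012.dkr.ecr.us-east-1.amazonaws.com"  # hypothetical ECR registry
LOCAL="localhost:5000"                              # hypothetical local registry

# The cache key: a short content hash of requirements.txt used as the image tag.
tag_for() {
  sha256sum "$1" | cut -c1-12
}

build_or_fetch_packages() {
  local tag
  tag=$(tag_for requirements.txt)

  # Step 1: does a "packages" image with this tag already exist in ECR?
  if aws ecr describe-images --repository-name packages \
       --image-ids imageTag="$tag" >/dev/null 2>&1; then
    # Step 3: pull the cached image and push it into the local registry.
    docker pull "$ECR/packages:$tag"
    docker tag "$ECR/packages:$tag" "$LOCAL/packages:$tag"
    docker push "$LOCAL/packages:$tag"
  else
    # Step 2: build once, pushing to both ECR and the local registry.
    docker buildx build -f Dockerfile.packages \
      -t "$ECR/packages:$tag" -t "$LOCAL/packages:$tag" --push .
  fi

  # Step 4: build the application image FROM the local-registry copy.
  docker buildx build -f Dockerfile.app \
    --build-arg BASE="$LOCAL/packages:$tag" -t application:latest .
}
```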

Step 3 is supposed to speed up the builds, but since I have to go through a local registry for step 4 to work, I lose all the leverage I gained.

Can I docker pull an image directly into a local registry? Right now, going through docker tag && docker push to get it into the local registry takes extra minutes.

Any help is appreciated.
