GitLab job artifacts are not passed to BuildKit

Summary

I am trying to migrate from Kaniko to BuildKit. I build an AWX Execution Environment with a GitLab pipeline consisting of two jobs. This works great on Kaniko, but not with BuildKit. A job artifact is passed to the second job, BuildKit copies two files from it, and then it fails with ls: cannot access '_build/': No such file or directory.

The job artifact is passed and used as the build context, but inside the BuildKit build container it isn't available. Instead of having to mount individual files as secrets, the build context directory should be available to the build.
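A possible workaround sketch (my assumption, not yet verified in this pipeline): with the dockerfile.v0 frontend, the context only lands in the image filesystem through COPY/ADD, but a RUN step can still read it through a bind mount of the build context. The mount target path /ctx below is an arbitrary choice:

```dockerfile
# Hypothetical sketch: bind-mount the build context into a RUN step.
# With no "from=" specified, RUN --mount=type=bind mounts the build context.
FROM quay.io/rockylinux/rockylinux:10-ubi AS base

# The context (including _build/) is visible at /ctx only during this RUN step;
# it is not persisted into the resulting image layer.
RUN --mount=type=bind,target=/ctx ls -al /ctx/_build/
```

The mount is read-only by default and scoped to the single RUN instruction, so it avoids copying the whole context into a layer just to inspect it.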

Steps to reproduce

Hopefully all the info below is sufficient.

  • SSH_HOSTS, SSH_CONFIG = CI/CD variables of type File.

  • ${CI_PROJECT_DIR}/${IMAGE_NAME_EE}/context = path where create_awx_docker_file outputs the created Dockerfile & _build context.

.gitlab-ci.yml

stages:
  - create_docker_file
  - build_ansible_ee

variables:
  IMAGE_NAME_EE: "ansible-ee-img"

create_awx_docker_file:
  image:
    name: python:3.13-slim
    entrypoint: [""]
  tags:
    - kubernetes
  stage: create_docker_file
  before_script:
    - mkdir -p ~/.ssh && cat "$SSH_HOSTS" | base64 -d > ~/.ssh/known_hosts && chmod 0600 ~/.ssh/known_hosts
    - cat "$SSH_CONFIG" | base64 -d > ~/.ssh/config && chmod 0600 ~/.ssh/config
  script:
    - pip install ansible-builder
    - cd $CI_PROJECT_DIR/$IMAGE_NAME_EE
    - ansible-builder create
  artifacts:
    when: on_success
    paths:
      - $CI_PROJECT_DIR/$IMAGE_NAME_EE/context
    expire_in: 10 min

build_ansible_ee:
  stage: build_ansible_ee
  variables:
    BUILDKITD_FLAGS: --oci-worker-no-process-sandbox
  image:
    name: moby/buildkit:rootless
    entrypoint: [""]
  tags:
    - kubernetes
  before_script:
    - mkdir -p ~/.docker
    - echo "{\"auths\":{\"$CI_REGISTRY\":{\"username\":\"$CI_REGISTRY_USER\",\"password\":\"$CI_JOB_TOKEN\"}}}" > ~/.docker/config.json
  script:
    - ls -al ${CI_PROJECT_DIR}/${IMAGE_NAME_EE}/context/_build
    - |
      buildctl-daemonless.sh build --progress=plain \
        --frontend dockerfile.v0 \
        --local context=${CI_PROJECT_DIR}/${IMAGE_NAME_EE}/context \
        --local dockerfile=${CI_PROJECT_DIR}/${IMAGE_NAME_EE}/context \
        --output type=image,name=${CI_REGISTRY_IMAGE}/${IMAGE_NAME_EE}:${CI_PIPELINE_IID},push=true \
        --no-cache
  needs:
    - job: create_awx_docker_file
      artifacts: true

Actual behavior

Sharing my testing/results so far:

  • Jobs: 1. create_awx_docker_file, 2. build_ansible_ee. The artifact from create_awx_docker_file is passed to build_ansible_ee.

  • The before_script/script of job build_ansible_ee shows that the artifact content is present.

  • Kaniko shows the /builds/ folder at / inside the build container; BuildKit does not.

  • BuildKit logs show COPY _build/scripts/ /output/scripts/ and COPY _build/scripts/entrypoint /opt/builder/bin/entrypoint as DONE. I validated that /output/scripts/ contains the files from _build/scripts.

  • The Dockerfile is found and copied from the artifact; otherwise the following WARNings could not be produced:

#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 3.88kB done
#1 WARN: FromAsCasing: 'as' and 'FROM' keywords' casing do not match (line 10)
  • _build is not in the Git repo; it is created solely by the create_awx_docker_file job.

Expected behavior

ansible-builder create creates a Dockerfile and a _build folder, and the artifact contains the specified files. For this example it contains /builds/path/to/repo/to/context/_build/root/.ssh/config and /builds/path/to/repo/to/context/_build/root/.ssh/known_hosts.

I expect the isolated BuildKit build environment to have the job artifact available, similar to what Kaniko outputs:

Using files from context: [/builds/path/to/repo/to/context/_build/root/.ssh/config] 
INFO[0015] ADD _build/root/.ssh/config /root/.ssh/config
INFO[0015] Taking snapshot of files...
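My current understanding of the difference (hedged, not confirmed by either project's maintainers): Kaniko executes instructions inside the unpacked build context, so a relative path like _build/ resolves during RUN, whereas a BuildKit RUN step only sees the image rootfs, where nothing from the context exists until a COPY/ADD places it there. A minimal contrast, assuming that model:

```dockerfile
FROM quay.io/rockylinux/rockylinux:10-ubi AS base

# Fails under BuildKit: the build context is not part of the image filesystem.
# RUN ls -al _build/

# Works under BuildKit: COPY first pulls _build from the transferred context.
COPY _build /tmp/_build
RUN ls -al /tmp/_build
```

This would also explain why the two COPY instructions in my Dockerfile succeed while the subsequent RUN ls -al _build/ fails.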

Relevant logs and/or screenshots

AWX Dockerfile from the job artifact, followed by the job log
ARG EE_BASE_IMAGE="quay.io/rockylinux/rockylinux:10-ubi"
ARG PYCMD="/usr/bin/python3"
ARG PKGMGR_PRESERVE_CACHE=""
ARG ANSIBLE_GALAXY_CLI_COLLECTION_OPTS=""
ARG ANSIBLE_GALAXY_CLI_ROLE_OPTS=""
ARG ANSIBLE_INSTALL_REFS="ansible-core ansible-runner"
ARG PKGMGR="/usr/bin/dnf"

# Base build stage
FROM $EE_BASE_IMAGE as base
USER root
ENV PIP_BREAK_SYSTEM_PACKAGES=1
ARG EE_BASE_IMAGE
ARG PYCMD
ARG PKGMGR_PRESERVE_CACHE
ARG ANSIBLE_GALAXY_CLI_COLLECTION_OPTS
ARG ANSIBLE_GALAXY_CLI_ROLE_OPTS
ARG ANSIBLE_INSTALL_REFS
ARG PKGMGR

COPY _build/scripts/ /output/scripts/
COPY _build/scripts/entrypoint /opt/builder/bin/entrypoint
RUN ls -al
RUN ls -al _build/
COPY _build /tmp/
RUN ls -al /tmp/
...

job log

Running with gitlab-runner 18.4.0
  on gitlab-runner-kubernetes xxxxxxxxxx, system ID: xxxxxxxxxxxxxx
Preparing the "kubernetes" executor
00:00
Using Kubernetes namespace: namespace
Using Kubernetes executor with image moby/buildkit:rootless ...
Using attach strategy to execute scripts...
Using effective pull policy of [] for container build
Using effective pull policy of [] for container helper
Using effective pull policy of [] for container init-permissions
Preparing environment
00:04
Using FF_USE_POD_ACTIVE_DEADLINE_SECONDS, the Pod activeDeadlineSeconds will be set to the job timeout: 1h0m0s...
Waiting for pod namespace/runner-xxxxxxxxxxxxxx to be running, status is Pending
Running on runner-xxxxxxx-concurrent-0-csttkgdy via gitlab-runner-xxxxxxxxxxxxxx...
Getting source from Git repository
00:01
Gitaly correlation ID: xxxxxxxxxxxxxxxxxxxxxxxxxxxx
Fetching changes with git depth set to 20...
Initialized empty Git repository in /builds/path/to/repo/.git/
Created fresh repository.
Checking out xxxxxx as detached HEAD (ref is main)...
Skipping Git submodules setup
Downloading artifacts
00:01
Downloading artifacts for create_awx_docker_file (4049)...
Downloading artifacts from coordinator... ok        correlation_id=xxxxxxxxxxxxxxxxxxxxxxx host=sub.domain.com id=4049 responseStatus=200 OK token=xxxxx
Executing "step_script" stage of the job script
00:09
$ mkdir -p ~/.docker
$ echo "{\"auths\":{\"$CI_REGISTRY\":{\"username\":\"$CI_REGISTRY_USER\",\"password\":\"$CI_JOB_TOKEN\"}}}" > ~/.docker/config.json
$ ls -al ${CI_PROJECT_DIR}/${IMAGE_NAME_EE}/context/_build
total 32
drwxr-xr-x    5 root     root          4096 Oct  3 14:56 .
drwxr-xr-x    3 root     root          4096 Oct  3 14:56 ..
-rw-rw-rw-    1 root     root           172 Oct  3 14:56 bindep.txt
drwxr-xr-x    2 root     root          4096 Oct  3 14:56 certs
drwxr-xr-x    3 root     root          4096 Oct  3 14:56 files
-rw-rw-rw-    1 root     root           176 Oct  3 14:56 requirements.txt
-rw-rw-rw-    1 root     root          1005 Oct  3 14:56 requirements.yml
drwxr-xr-x    2 root     root          4096 Oct  3 14:56 scripts
$ buildctl-daemonless.sh build --progress=plain \ # collapsed multi-line command
#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 3.88kB done
#1 WARN: FromAsCasing: 'as' and 'FROM' keywords' casing do not match (line 10)
#1 WARN: FromAsCasing: 'as' and 'FROM' keywords' casing do not match (line 47)
#1 WARN: FromAsCasing: 'as' and 'FROM' keywords' casing do not match (line 65)
#1 WARN: FromAsCasing: 'as' and 'FROM' keywords' casing do not match (line 86)
#1 DONE 0.0s
#2 [internal] load metadata for quay.io/rockylinux/rockylinux:10-ubi
#2 DONE 1.1s
#3 [internal] load .dockerignore
#3 transferring context: 2B done
#3 DONE 0.0s
#4 [internal] load build context
#4 transferring context: 44.96kB done
#4 DONE 0.0s
#5 [base  1/24] FROM quay.io/rockylinux/rockylinux:10-ubi@sha256:eca03145dd5e0b2a281eef164d391e4758b4a5962d29b688d15a72cef712fbb4
#5 resolve quay.io/rockylinux/rockylinux:10-ubi@sha256:eca03145dd5e0b2a281eef164d391e4758b4a5962d29b688d15a72cef712fbb4 done
#5 sha256:7761adaa10fcf2967630524f4a844022cd1252823a6f679ca08ab21711b3702f 0B / 97.47MB 0.2s
#5 sha256:7761adaa10fcf2967630524f4a844022cd1252823a6f679ca08ab21711b3702f 5.24MB / 97.47MB 0.3s
#5 sha256:7761adaa10fcf2967630524f4a844022cd1252823a6f679ca08ab21711b3702f 19.92MB / 97.47MB 0.5s
#5 sha256:7761adaa10fcf2967630524f4a844022cd1252823a6f679ca08ab21711b3702f 36.70MB / 97.47MB 0.6s
#5 sha256:7761adaa10fcf2967630524f4a844022cd1252823a6f679ca08ab21711b3702f 54.53MB / 97.47MB 0.8s
#5 sha256:7761adaa10fcf2967630524f4a844022cd1252823a6f679ca08ab21711b3702f 70.25MB / 97.47MB 0.9s
#5 sha256:7761adaa10fcf2967630524f4a844022cd1252823a6f679ca08ab21711b3702f 88.08MB / 97.47MB 1.1s
#5 sha256:7761adaa10fcf2967630524f4a844022cd1252823a6f679ca08ab21711b3702f 97.47MB / 97.47MB 1.1s done
#5 extracting sha256:7761adaa10fcf2967630524f4a844022cd1252823a6f679ca08ab21711b3702f
#5 extracting sha256:7761adaa10fcf2967630524f4a844022cd1252823a6f679ca08ab21711b3702f 2.5s done
#5 DONE 3.6s
#6 [base  2/24] COPY _build/scripts/ /output/scripts/
#6 DONE 0.9s
#7 [base  3/24] COPY _build/scripts/entrypoint /opt/builder/bin/entrypoint
#7 DONE 1.1s
#8 [base  4/24] RUN ls -al
#8 1.144 total 72
#8 1.144 drwxr-xr-x  19 root   root   4096 Oct  3 14:56 .
#8 1.144 drwxr-xr-x  19 root   root   4096 Oct  3 14:56 ..
#8 1.144 -rw-r--r--   1 root   root   1014 Jun  6 16:43 .profile
#8 1.144 dr-xr-xr-x   2 root   root   4096 Oct 29  2024 afs
#8 1.144 lrwxrwxrwx   1 root   root      7 Oct 29  2024 bin -> usr/bin
#8 1.144 dr-xr-xr-x   2 root   root   4096 Oct 29  2024 boot
#8 1.144 drwxr-xr-x   5 root   root    360 Oct  3 14:56 dev
#8 1.144 drwxr-xr-x  48 root   root   4096 Oct  3 14:56 etc
#8 1.144 drwxr-xr-x   2 root   root   4096 Oct 29  2024 home
#8 1.144 lrwxrwxrwx   1 root   root      7 Oct 29  2024 lib -> usr/lib
#8 1.144 lrwxrwxrwx   1 root   root      9 Oct 29  2024 lib64 -> usr/lib64
#8 1.144 drwxr-xr-x   2 root   root   4096 Oct 29  2024 media
#8 1.144 drwxr-xr-x   2 root   root   4096 Oct 29  2024 mnt
#8 1.144 drwxr-xr-x   3 root   root   4096 Oct  3 14:56 opt
#8 1.144 drwxr-xr-x   3 root   root   4096 Oct  3 14:56 output
#8 1.144 dr-xr-xr-x 557 nobody nobody    0 Oct  3 14:56 proc
#8 1.144 dr-xr-x---   3 root   root   4096 Oct  3 14:56 root
#8 1.144 drwxr-xr-x   2 root   root   4096 Jun  6 16:43 run
#8 1.144 lrwxrwxrwx   1 root   root      8 Oct 29  2024 sbin -> usr/sbin
#8 1.144 drwxr-xr-x   2 root   root   4096 Oct 29  2024 srv
#8 1.144 drwxr-xr-x   2 root   root   4096 Jun  6 16:43 sys
#8 1.144 drwxrwxrwt   2 root   root   4096 Oct 29  2024 tmp
#8 1.144 drwxr-xr-x  12 root   root   4096 Oct  3 14:56 usr
#8 1.144 drwxr-xr-x  18 root   root   4096 Oct  3 14:56 var
#8 DONE 1.2s
#9 [base  5/24] RUN ls -al _build/
#9 1.144 ls: cannot access '_build/': No such file or directory
#9 ERROR: process "/bin/sh -c ls -al _build/" did not complete successfully: exit code: 2
------
 > [base  5/24] RUN ls -al _build/:
1.144 ls: cannot access '_build/': No such file or directory
------
Dockerfile:24
--------------------
  22 |     COPY _build/scripts/entrypoint /opt/builder/bin/entrypoint
  23 |     RUN ls -al
  24 | >>> RUN ls -al _build/
  25 |     COPY _build /tmp/
  26 |     RUN ls -al /tmp/
--------------------
error: failed to solve: process "/bin/sh -c ls -al _build/" did not complete successfully: exit code: 2
Cleaning up project directory and file based variables
00:01
ERROR: Job failed: command terminated with exit code 1

Environment description

GitLab v18.4.1 deployed in Kubernetes. GitLab Runner with the Kubernetes executor.

Used GitLab Runner version

GitLab Runner 18.4.0

Edited by D1StrX