Manually refresh JobVariables prior to ConfigExec

What does this MR do?

This modification ensures that the ConfigExec stage is always provided with the most up-to-date JobVariables when constructing the environment for the subsequent command.

Why was this MR needed?

Issue #29660 seems to be caused by cases where GetAllVariables is called while preparing a command in dynamicConfig; however, since b.allVariables is not nil, changes to the default variables are not captured in the variables returned. This is most apparent with the CI_CONCURRENT_ID and CI_CONCURRENT_PROJECT_ID variables, since custom executor drivers may rely on them to construct the builds_dir.

I want to note that I'm not confident enough in my understanding of all the executors, so I've kept the change focused on the custom executor only. If I'm wrong and this behavior negatively affects other executor types, I could rework the modification to occur before func (e *executor) Prepare... is invoked for any of the executors.

What's the best way to test this MR?

At the moment, the easiest way to test this is to manually create a simple custom executor that can run at least two jobs concurrently. Then include a bash script like the one below as the config_exec:

#!/bin/bash
# Capture the environment each job's config_exec stage sees.
mkdir -p "/tmp/${CUSTOM_ENV_CI_JOB_ID}"
env >> "/tmp/${CUSTOM_ENV_CI_JOB_ID}/env"

Running a pipeline with multiple concurrent jobs without these changes, you'll notice that CI_CONCURRENT_PROJECT_ID is 0 for all jobs, even though they were started by the runner at the same time and should therefore have distinct ProjectRunnerID values.

What are the relevant issue numbers?

Closes #29660