Evaluate child pipeline deployments in parent pipeline context / order
## Problem to solve
If one makes deployments in dynamically generated child pipelines, the "Skip outdated deployment jobs" setting does not always work as expected and may result in out-of-order deployments.
The current implementation seems to be based on the creation time of the pipeline containing the deployment job (or perhaps the deployment job's creation time). If several commits are made in rapid succession, as can occur with a merge train, the variance in the generation job's run time may cause the child pipelines to be created out of order.
Example from our project:
- Merge order: A is first, then B, then C.
- Deployments: C, then A, with B skipped.
- Child pipeline creation times:
  - A: `2022-03-17T17:13:05.760Z`
  - B: `2022-03-17T17:12:53.088Z`
  - C: `2022-03-17T17:13:02.717Z`
B was skipped and has the oldest creation time (specifically, older than C, which deployed first); A deployed last and has the latest creation time. Presumably B was skipped because of a scheduling delay before C's pipeline was created.
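The ordering can be confirmed directly from the timestamps above. Sorting the child pipelines by creation time gives B, C, A rather than the merge order A, B, C:

```python
from datetime import datetime

# Child pipeline creation times from the example above.
created = {
    "A": "2022-03-17T17:13:05.760Z",
    "B": "2022-03-17T17:12:53.088Z",
    "C": "2022-03-17T17:13:02.717Z",
}

def parse_ts(ts: str) -> datetime:
    # datetime.fromisoformat does not accept a trailing "Z",
    # so replace it with an explicit UTC offset.
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

# A creation-time-based "outdated" check sees this order,
# not the merge order.
order = sorted(created, key=lambda k: parse_ts(created[k]))
print(order)  # ['B', 'C', 'A']
```

So a check keyed on child pipeline creation time treats B as older than C (skipped) and A as the newest (deployed last), exactly the behavior observed.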
Given that the pipeline generation job can vary in scheduling, run time, artifact upload, and the time GitLab takes to create the pipeline, it is essentially impossible to control the order in which the child pipelines are created. Assuming the check for "outdated" deployments uses the job trigger time, which is roughly the same as the pipeline creation time, this behavior makes sense. It also follows that a manually triggered (or re-triggered) deployment would be allowed, since its date would be later than any initial jobs; that would also allow for rollback deployments.
The only workaround seems to be to have the generation jobs synchronize themselves, waiting until the jobs from the prior commits complete.
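One way such a synchronization step could look is a small script, run at the start of the generation job, that polls the GitLab pipelines API (`GET /projects/:id/pipelines` with `ref` and `status` filters) until no pipeline on the same ref that was created earlier is still running. This is only a sketch; the function names are hypothetical, and the pipeline ID, ref, and token would come from CI variables such as `CI_PIPELINE_ID` and `CI_COMMIT_REF_NAME`.

```python
import json
import time
import urllib.request
from datetime import datetime

def parse_ts(ts: str) -> datetime:
    # GitLab returns ISO 8601 timestamps with a trailing "Z".
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

def older_pipelines_running(pipelines, my_created_at, my_id):
    """Return True if any other listed pipeline was created before ours."""
    mine = parse_ts(my_created_at)
    return any(
        p["id"] != my_id and parse_ts(p["created_at"]) < mine
        for p in pipelines
    )

def wait_for_older_pipelines(project_id, ref, my_id, my_created_at, token,
                             poll_seconds=15):
    """Block until no older pipeline on the same ref is still running.

    Hypothetical helper: polls GET /projects/:id/pipelines?ref=...&status=running
    on gitlab.com; adjust the host for self-managed instances.
    """
    url = (f"https://gitlab.com/api/v4/projects/{project_id}"
           f"/pipelines?ref={ref}&status=running")
    while True:
        req = urllib.request.Request(url, headers={"PRIVATE-TOKEN": token})
        with urllib.request.urlopen(req) as resp:
            pipelines = json.load(resp)
        if not older_pipelines_running(pipelines, my_created_at, my_id):
            return
        time.sleep(poll_seconds)
```

This trades pipeline concurrency for ordering, and still leaves a race window between the check and the child pipeline actually being created, which is why a fix in the outdated-deployment check itself would be preferable.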
## Proposal
For non-manually triggered deployments, it would be desirable to change the mechanism for outdated-deployment canceling to be based on the creation time of the job's root (top-level parent) pipeline.
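The intended semantics could be sketched as follows. This is purely illustrative, not GitLab's implementation; the `Deployment` fields and `is_outdated` function are hypothetical, and the root-pipeline timestamps in the usage example are made up (only the child timestamps come from the example above).

```python
from dataclasses import dataclass
from datetime import datetime

def parse_ts(ts: str) -> datetime:
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

@dataclass
class Deployment:
    child_created_at: str   # child pipeline creation time (current basis)
    root_created_at: str    # root pipeline creation time (proposed basis)
    manual: bool = False    # manually (re-)triggered deployment

def is_outdated(candidate: Deployment, latest: Deployment) -> bool:
    """Proposed "skip outdated" check: compare root pipeline creation
    times, and never skip manual deployments so rollbacks stay possible."""
    if candidate.manual:
        return False
    return parse_ts(candidate.root_created_at) < parse_ts(latest.root_created_at)

# With merge order A, B, C, the root pipelines are created in that order
# regardless of generation-job variance (root timestamps hypothetical).
# Once C has deployed, A's later-arriving child deployment is correctly
# flagged as outdated:
c = Deployment(child_created_at="2022-03-17T17:13:02.717Z",
               root_created_at="2022-03-17T17:12:30.000Z")
a = Deployment(child_created_at="2022-03-17T17:13:05.760Z",
               root_created_at="2022-03-17T17:12:10.000Z")
print(is_outdated(a, c))  # True: A's root predates C's root
```

Under the current child-pipeline-based comparison, A would instead look newer than C and be allowed through, which is the out-of-order deployment described above.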