Child/parent pipelines
Problem to solve
As pipelines grow more complex, a few related problems start to emerge:
- The staged structure, where all jobs in a stage must complete before the first job in the next stage begins, causes arbitrary waits and slows things down
- Configuration for the single global pipeline becomes very long and convoluted, making it hard to understand
- Imports exacerbate the above item, and create the potential for namespace collisions where jobs are unintentionally duplicated
- Pipeline UX can become unwieldy with so many jobs and stages to work with
Need statement: The user needs a way to keep jobs from unnecessarily depending on other jobs within the same pipeline context, so that independent chains of jobs can proceed in parallel without blocking each other.
Intended users
Further details
Proposal
If a parent (originating) pipeline were able to trigger a set of concurrently running child pipelines, each of these problems could be solved:
- Child pipelines would execute each of their jobs still according to a stage sequence, but would be free to continue forward through their stages without waiting for unrelated jobs to finish.
- Configuration would be distributed across the child pipeline configurations, reducing the cognitive load required to understand any one of them.
- Imports would be done at the child pipeline level, reducing the likelihood of collisions.
- Each pipeline would contain only the steps relevant to it, making it easier to understand what's going on.
This approach also brings some additional benefits:
- By using existing triggering functionality, you can take advantage of the `only: changes` keyword to trigger pipelines only when certain files change (this is valuable for monorepos, for example).
- By keeping the base (parent) `.gitlab-ci.yml` as a normal pipeline, it can have its own behaviors and sequencing in relation to triggers.
- Also, because the parent remains a normal pipeline, anyone who doesn't use this feature sees no difference: everything works exactly as you'd expect, with no special configuration.
- By taking advantage of status attribution (https://gitlab.com/gitlab-org/gitlab-ee/issues/11238), the parent pipeline can wait for the child to succeed without any other special code or configuration, wait for it to complete but not care about the result, or simply trigger it and not follow it at all.
All of this will work with `include`, so you can retain composability within the configuration.
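As an illustration of that composability, a child configuration file could itself use `include` to pull in shared definitions. The file paths and job below are hypothetical, for sketch purposes only:

```yaml
# microservice_a/config.yml (hypothetical child pipeline configuration)
include:
  - local: ci/shared-defaults.yml   # hypothetical shared snippet in the same repository

test:
  stage: test
  script:
    - bundle exec rspec   # illustrative test command
```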
This first issue will allow one level of child pipelines (i.e., one parent triggering n child pipelines); child pipelines will not be able to trigger further downstream pipelines. We will address this in a fast-follow issue (https://gitlab.com/gitlab-org/gitlab-ce/issues/63566), where we will allow some additional levels, with some limit to prevent infinite recursion.
Implementation
Introduce a new syntax for triggering a child pipeline by pointing to a configuration YAML file within the repository:

```yaml
microservice_a:
  trigger:
    local: microservice_a/config.yml
    strategy: depend
  only:
    changes:
      - microservice_a/**/*
```
In this example, a trigger job called `microservice_a` in the parent pipeline would use the YAML in `microservice_a/config.yml` as the configuration for a child pipeline. The child would run only if there are changes in the `microservice_a` folder, and would be treated as a dependency (i.e., the parent fails if the child pipeline fails). Alternatively, a `wait` strategy would wait for the child to finish without caring whether it passes or fails.
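For contrast, a sketch of the same trigger job using the `wait` strategy mentioned above (syntax as proposed in this issue, not necessarily final):

```yaml
microservice_a:
  trigger:
    local: microservice_a/config.yml
    strategy: wait   # wait for the child pipeline to finish, but don't fail on its result
```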
A pipeline triggered in this way would start running independently but would have status attribution to a parent pipeline (via https://gitlab.com/gitlab-org/gitlab-ee/issues/11238). If you look at the parent pipeline's page, you should see whether the child pipeline is running or completed, and clicking on it should take you to the child pipeline's page.
By implementing things in this way, we do not need to radically change the user experience within GitLab, and it works largely the way you'd expect. There is always a parent pipeline, since child pipelines can only be triggered from a parent.
- The `latest` tag will not show on child pipelines
- Child pipelines will receive a dedicated tag indicating that they are child pipelines
Real-life Example
GitLab QA and gitlab-qa#6 (closed), where we would like to trigger a pipeline for GitLab CE or GitLab EE checks depending on the project the triggering MR belongs to. This may also be an interesting feature for GitLab Omnibus, making it easier to build EE/CE images.
Considerations
- Child pipelines should not contribute to `Commit Status`
- Child pipelines should not be taken for Merge Request status (again, only the parent pipeline matters, but since it has status attribution this should be fine)
- Likely we should show child pipelines as part of `Commit#Pipelines` and `Merge Request#Pipelines`
- Likely we should consider a design that shows the parent pipeline and allows you to "expand" it to also see the child pipelines
- `variables:` of the trigger job should be passed to the child pipeline, allowing you to fine-tune it
- Should the child pipeline inherit all settings of the parent pipeline when running `trigger: local:`?
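For instance, passing `variables:` from the trigger job down to the child pipeline might look like this (a sketch under the proposed syntax; the variable name is illustrative):

```yaml
microservice_a:
  variables:
    DEPLOY_ENV: staging   # illustrative; would be available inside the child pipeline
  trigger:
    local: microservice_a/config.yml
```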
Future Enhancement
- For the MVC, a parent can trigger child pipelines, but child pipelines cannot trigger further child pipelines. This could potentially be opened up in the future.
- Add support for multiple pipelines in different files (https://gitlab.com/gitlab-org/gitlab-ce/issues/28592) - or do we get this for "free" with include/extends given the above implementation?
- Filtering of child pipelines (the ability to hide them, or to hide them by default) will not be supported in the MVC
Permissions and Security
Documentation
Testing
What does success look like, and how can we measure that?
Links / references
- `pipeline` to include with local variables: https://gitlab.com/gitlab-org/gitlab-ce/issues/56214
- Slack channel (internal link only): #f_child_par_pipelines has been defined.