Parent/Child pipelines MVC
Problem to solve
As pipelines grow more complex, a few related problems start to emerge:
- The staged structure, where all jobs in a stage must complete before the first job in the next stage begins, causes arbitrary waits and slows things down
- Configuration for the single global pipeline becomes very long and convoluted, making it hard to understand
- Imports exacerbate the above item, and create the potential for namespace collisions where jobs are unintentionally duplicated
- Pipeline UX can become unwieldy with so many jobs and stages to work with
Additionally, sometimes the behavior of a pipeline needs to be more dynamic. Being able to choose whether or not to start sub-pipelines is powerful, especially when the YAML can be generated on the fly.
Intended users
Proposal
If a parent (originating) pipeline were able to trigger a set of concurrently running child pipelines, you could solve each of these problems:
- Child pipelines would execute each of their jobs still according to a stage sequence but would be free to continue forward through their stages without waiting for unrelated jobs to finish.
- The configuration would be distributed out into each of the child pipeline configurations, reducing cognitive load to understand everything.
- Imports would be done at the child pipeline level, reducing the likelihood of collisions
- Each pipeline would have only the steps relevant, making it easier to understand what's going on.
You also get some nice benefits for doing things this way:
- By using existing triggering functionality, you can take advantage of `only: changes` type keywords to trigger pipelines only when certain files change (this is valuable for monorepos, for example).
- By keeping the base (parent) `.gitlab-ci.yml` as a normal pipeline, it can have its own behaviors and sequencing in relation to triggers.
- Also by keeping it a normal pipeline, if someone doesn't use this feature, it just works exactly as you'd expect. No special configuration.
- By taking advantage of status attribution (https://gitlab.com/gitlab-org/gitlab-ee/issues/11238), the pipeline can wait for success of the child without any other special code/configuration required, wait for it to complete but not care about the result, or can just trigger it and not follow it at all.
All of this will work with includes, so you can retain composability within the configuration.
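As a sketch of the kind of parent `.gitlab-ci.yml` this proposal implies (job names, stage names, file paths, and the child YAML files below are hypothetical; the `trigger: include:` syntax is the one proposed in the Implementation section):

```yaml
# Hypothetical parent .gitlab-ci.yml for a monorepo.
# The parent stays a normal pipeline (its own jobs and stages)
# and fans out child pipelines only when the relevant files change.
stages:
  - build
  - deploy

lint:                        # ordinary job; runs in the parent as usual
  stage: build
  script: echo "linting shared code"

microservice_a:              # trigger job: spawns a child pipeline
  stage: deploy
  trigger:
    include:
      local: config/microservice_a.yml
    strategy: depend         # parent adopts the child's success/failure
  only:
    changes:
      - microservice_a/*

microservice_b:
  stage: deploy
  trigger:
    include:
      local: config/microservice_b.yml
    strategy: depend
  only:
    changes:
      - microservice_b/*
```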
This first issue will allow for one level of child pipelines (i.e., one parent, n pipelines) and child pipelines will not be able to trigger further downstream pipelines. We will address this in a fast-follow issue (#29651 (closed)) where we will allow for some additional levels, with some limit to prevent infinite recursion.
Implementation
Introduce a new syntax for triggering a child pipeline by using the include
syntax:
```yaml
microservice_a:
  trigger:
    include:
      local: config/microservice_a.yml
    strategy: depend
  only:
    changes:
      - microservice_a/*
```
For this example, a trigger job called `microservice_a` would be created in the parent pipeline, using the YAML from the repository at `config/microservice_a.yml` as the child pipeline's configuration. It would only run if there are changes in the `microservice_a` folder, and the parent would treat the child as a dependency (i.e., fail if the child pipeline fails). Alternatively, the `wait` strategy would wait for the child to finish, but not care whether it passes or fails.
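A minimal sketch of that alternative, assuming the `wait` strategy keyword described above:

```yaml
microservice_a:
  trigger:
    include:
      local: config/microservice_a.yml
    strategy: wait   # wait for the child to finish, but ignore its result
```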
This approach would follow the same syntax as the `include` keyword (so it supports `local`, `file`, `template`, and `remote`). It would also support multiple includes, which would be merged following existing `include` rules:
```yaml
microservice_a:
  trigger:
    include:
      - local: config/microservice_a.yml
      - local: config/all_microservices.yml
    strategy: depend
  only:
    changes:
      - microservice_a/*
```
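Because the same `include` sources would be supported, the child configuration could also come from another project's file, a template, or a remote URL. A hedged sketch (the project path, template name, and URL below are placeholders):

```yaml
microservice_a:
  trigger:
    include:
      - file: ci/microservice_a.yml               # hypothetical file in another project
        project: my-group/ci-templates
      - template: Jobs/Build.gitlab-ci.yml        # a GitLab-provided template
      - remote: https://example.com/ci/shared.yml # placeholder URL
    strategy: depend
```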
A pipeline triggered in this way would start running independently but would have status attribution to a parent pipeline (via https://gitlab.com/gitlab-org/gitlab-ee/issues/11238). On the parent pipeline's page, you should see that the child pipeline is running or completed, and clicking on it should take you to the child pipeline's own pipeline page.
By implementing things in this way, we do not need to radically change the user experience within GitLab and it works largely in the way you'd expect. There is always a parent pipeline since child pipelines can only be triggered from a parent.
UI implications
- Child pipelines will only be accessible through their parent pipelines to start off with. This reduces scope and complexity; a follow-up issue has been created to look into expanding this if need be: #37051
- Child pipelines will thus not be featured in pipeline lists
- Child pipelines are not considered for the merge request widget or commit status
- The pipeline graph, mini-pipeline graph dropdown, and job lists will thus allow the user to navigate to the child pipeline.
- Child pipelines will not get tags such as `Latest`
Pipeline detail view
Info widget
- Child pipelines do get a `Child pipeline (parent)` tag on the pipeline detail view, similar to a moved issue status badge, which is clickable and navigates to the parent pipeline page.
- Tooltip states: `This is a child pipeline within the parent pipeline`
Pipeline graph view
- Lines connecting the current pipeline nodes with the representations of the child/downstream pipelines are hidden/deleted.
- Individual nodes for child or parent pipelines will get a `child`/`parent` tag to differentiate them from multi-project pipelines.
Job list
- In job lists, the trigger job will get an `[icon] child pipeline` tag with the `trigger-source` icon. This tag is an anchor towards the child pipeline. => This has been considered out of scope for now.
Mini pipeline graph
- No changes needed
Tasks
- [p1] CI YAML syntax to define child pipeline creation. We could use a `Feature.enabled?(:ci_child_pipelines)` flag that is `off` by default; this will enable us to work on the rest of the dependencies. backend
- [p1] Change `Ci::CreateCrossProjectPipelineService`/`Worker` to be able to create pipelines for the same project; ensure the child pipeline inherits all settings of the parent pipeline when running `trigger: local:`. backend
- [p1] Rename `Ci::CreateCrossProjectPipelineService` to `Ci::CreateDownstreamPipelineService`/`Worker`. backend
- [p1] Define limitations so that a child pipeline cannot create further child pipelines, and so that `trigger:include` has a limited number of files to be included. backend
- [p1] Child pipeline trigger job should not return the triggered array from the endpoint. backend
- [p1] Allow the frontend to distinguish between a child vs. downstream and a parent vs. upstream pipeline. backend
- [p2] Filter out child pipelines from being displayed in the merge request pipelines tab and the pipelines index. backend
- [p2] A trigger job that triggers a child pipeline should show that it is not a normal trigger job. frontend UX
- [p2] A trigger job that triggers a child pipeline should link back to the child pipeline detail view. frontend
Tasks to be handled in separate issues:
- [p2] Allow variables of the trigger job to be passed to the child pipeline, allowing it to be fine-tuned (#37348 (closed), but as part of this MVC scope). backend
- [p2] Flip the defaults for the feature flag to `default_enabled: true`. This will essentially activate the feature on production. (#37351 (closed)) backend
Supporting dynamic jobs follow-up
What this would leave out for now, though, is support for dynamically generated `.gitlab-ci.yml` files. I have created issue #35632 (closed) as a follow-up to support this, scheduled for immediate follow-up in %12.7. The way this would work is by improving the `include` keyword to support a new `artifact` keyword from which the dynamically generated CI YAML could be fetched. This would also add the ability for includes in general to support runtime-generated YAML passed through artifacts.
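A hedged sketch of what this could look like, assuming the `artifact:`/`job:` keywords proposed in the follow-up issue (the generator script and file names below are placeholders):

```yaml
# A job generates the child pipeline's YAML at runtime
# and publishes it as an artifact.
generate-config:
  stage: build
  script:
    - ./generate-ci-config.sh > generated-config.yml   # placeholder generator
  artifacts:
    paths:
      - generated-config.yml

# The trigger job fetches the generated YAML from that job's
# artifacts and runs it as a child pipeline.
child-pipeline:
  stage: deploy
  trigger:
    include:
      - artifact: generated-config.yml
        job: generate-config
```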
Real-life Example
GitLab QA and gitlab-qa#6 (closed), where we would like to trigger a pipeline for GitLab CE or GitLab EE checks depending on the project that the triggering MR belongs to. This may also be an interesting feature for GitLab Omnibus, to build EE/CE images more easily. This is no longer the case with GitLab since we merged our projects, but the case is still illustrative.
Permissions and Security
Permissions and user context of the child pipeline will follow the parent pipeline, so there is no new security implication with this feature.
Documentation
Testing
What does success look like, and how can we measure that?
Links / references
- `pipeline` to include w/ local variables: https://gitlab.com/gitlab-org/gitlab-ce/issues/56214
- Slack channel (internal link only): #f_child_par_pipelines has been defined.
This page may contain information related to upcoming products, features and functionality. It is important to note that the information presented is for informational purposes only, so please do not rely on the information for purchasing or planning purposes. Just like with all projects, the items mentioned on the page are subject to change or delay, and the development, release, and timing of any products, features, or functionality remain at the sole discretion of GitLab Inc.