Option to report pipeline as passed when being blocked by further manual stages
## Problem to solve
It is not possible to mark a pipeline as `success` when it is blocked by further stages that must be triggered manually (deploys). That is misleading for developers, because from their point of view the pipeline is successful.
Our pipeline looks like this:
- Static check
- Code quality...
- Test cleanup
- Deploy to staging
- Staging cleanup
- Deploy to prod
- Prod cleanup
Jobs in the first three stages run automatically. However, for obvious reasons, the deploys need to be triggered manually. When a deploy (staging/prod) is triggered, it should run to completion, and only after it finishes should the corresponding cleanup stage run.
Therefore I added this to their definition:

```yaml
when: manual
allow_failure: false
```
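For context, here is a minimal sketch of how the pipeline is wired up (stage names match the list above; the `script` commands are placeholders):

```yaml
stages: [static-check, code-quality, test-cleanup, deploy-staging, staging-cleanup, deploy-prod, prod-cleanup]

deploy_staging:
  stage: deploy-staging
  script: ./deploy.sh staging   # placeholder
  when: manual
  allow_failure: false          # without this, the pipeline reports passed but cleanup runs immediately

staging_cleanup:
  stage: staging-cleanup
  script: ./cleanup.sh staging  # placeholder; should run only after the deploy actually finished
```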
The problem is that when the automatic part of a pipeline finishes successfully, its status is `blocked`. That is confusing for developers, because for them "tests passed = successful pipeline".
If I did not add `allow_failure: false`, the pipeline would report `passed`, but then `Staging cleanup` and `Prod cleanup` would run automatically instead of waiting for their "parent" deploy job. That is not the desired behavior either.
The reason why `Deploy to staging` and `Staging cleanup` are separate stages is that I need to run the cleanup regardless of whether the deploy succeeds or fails. Having this logic in the same job would be cumbersome, because I would have to check the status of the pipeline before every cleanup command. (Edit: I found out that I could use `after_script` for this, so this point is moot.)
I can think of two solutions to this problem.
## Report pipeline status from a job
A new configuration parameter (or pipeline API endpoint?) could be introduced that would allow setting the pipeline's status.
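As a rough illustration of what I mean (the parameter below does not exist today; the name is made up):

```yaml
test_cleanup:
  stage: test-cleanup
  script: ./cleanup.sh tests   # placeholder
  # Hypothetical parameter: once this job succeeds, report the whole
  # pipeline as passed, even though later manual stages are still blocked.
  report_pipeline_status: success
```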
## Wait for manual jobs
A configuration parameter could be introduced that would behave similarly to `dependencies`, with the difference that the job would wait for the upstream job even if that upstream job is manual and has not been triggered yet.
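Sketched with a made-up parameter name (`wait_for` is hypothetical, not existing syntax), the cleanup job could then look like this:

```yaml
staging_cleanup:
  stage: staging-cleanup
  script: ./cleanup.sh staging   # placeholder
  # Hypothetical parameter: wait for the manual job to be triggered and
  # finish, instead of running as soon as the previous stage is "done".
  wait_for: ["Deploy to staging"]
```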