Merge train/Release train/Merge when master succeeds: run build on merged code before merging
- As a user, when I merge a feature branch I want to be sure that master is still green afterwards.
- Being 100% sure requires testing the result of merging the feature branch into master, but master changes all the time.
- Some repositories get one merge a minute, and the build+test time is longer than the interval between merges.
- This calls for a release train, in which merges are queued in a sequence.
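The back-of-envelope math behind this can be sketched quickly. The numbers here are assumptions for illustration (the issue only says "one merge a minute" and "build+test time is longer"):

```python
# Assumed numbers: merges arrive once a minute, and a full build+test
# of the merged result takes 30 minutes. With a naive "retest against
# the latest master before every merge" scheme, only one MR can merge
# per pipeline run, while 30 new MRs arrive in the meantime.
merge_interval_min = 1
pipeline_min = 30

# MRs that pile up during a single pipeline run (one merges, the rest wait):
backlog_growth_per_pipeline = pipeline_min // merge_interval_min - 1
print(backlog_growth_per_pipeline)  # 29
```

The queue grows without bound unless merges are sequenced, which is exactly what the release train below does.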
How the feature works:
- Enable 'Merge when pipeline succeeds' https://docs.gitlab.com/ce/user/project/merge_requests/merge_when_pipeline_succeeds.html
- Enable 'Merge when master succeeds' or 'Release train', a new feature
- All merge requests are tested as a feature branch on every commit, as you would expect.
- If the feature branch is green, the merge request shows a button 'Merge when master succeeds' or 'Add as the 3rd MR to the merge train'
- When you click 'Merge when master succeeds' on the first MR, a pipeline starts for that branch merged with master
- If another MR's button was already pressed and its merge pipeline is still running, the second MR enters a merge queue.
- Repeat for all other MRs: they are sequenced by when the button was pressed and form a release train
- If an MR's merge pipeline fails, that MR is not merged and all following MRs are retested without it
- Possible optimization: If a branch is the next one to be merged and it is already based on HEAD of master there is no need to run the tests again and it can just be merged when the button is pushed. Related: https://gitlab.com/gitlab-org/gitlab-ce/issues/27337
- Possible optimization: test subsequent MRs assuming the previous MRs succeed, but don't commit the merges until we know the previous MRs actually do succeed. If one fails, throw away the optimistic pipelines and start over, based on a master without the failed MR. This is optional or future work, because if you have the compute capacity to run a bunch of pipelines in parallel, why not just increase the parallelism of your pipeline so it runs faster in the first place?
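The queueing behavior described above can be sketched as a small simulation. All names here (`MergeTrain`, `MergeRequest`, `pipeline_passes`) are hypothetical and only illustrate the proposed semantics, not GitLab's actual implementation:

```python
from collections import deque
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class MergeRequest:
    title: str
    # Stand-in for the CI pipeline run on this MR merged with the
    # train's current tip; returns True if the merged result is green.
    pipeline_passes: Callable[[str], bool]


class MergeTrain:
    def __init__(self, master: str):
        self.master = master                # current tip of master
        self.queue = deque()                # MRs, ordered by button press
        self.merged: List[str] = []
        self.dropped: List[str] = []

    def add(self, mr: MergeRequest) -> None:
        """'Merge when master succeeds' pressed: append the MR to the train."""
        self.queue.append(mr)

    def process(self) -> None:
        """Test each queued MR merged with the current master, in order."""
        while self.queue:
            mr = self.queue.popleft()
            if mr.pipeline_passes(self.master):
                # Merge commits land on master; the next MR in the train
                # is tested against this new tip.
                self.master = f"{self.master}+{mr.title}"
                self.merged.append(mr.title)
            else:
                # A failed MR leaves the train; later MRs are tested
                # against a master that does not include it.
                self.dropped.append(mr.title)


train = MergeTrain("master@abc123")
train.add(MergeRequest("mr-1", lambda base: True))
train.add(MergeRequest("mr-2", lambda base: False))
train.add(MergeRequest("mr-3", lambda base: True))
train.process()
# mr-1 and mr-3 merge in order; mr-2 is dropped, and mr-3 was
# tested against a master that never contained mr-2.
```

The optimistic variant would start the pipelines for mr-2 and mr-3 in parallel, speculating on mr-1's success, and discard those speculative results if mr-1 fails.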
Old description:
When I started working with GitLab CI I expected to be able to set up a workflow similar to what I see done on GitHub, where a pull request triggers the CI to run the test suite on the merged code, ensuring that the master resulting from the merge passes all tests. As far as I can see this is not possible in GitLab, and I think it would be a really nice option to have.
Judging from other issues posted around e.g. http://feedback.gitlab.com/forums/176466-general/suggestions/5683182-merge-requests-should-trigger-a-build-on-the-main, https://github.com/gitlabhq/gitlabhq/issues/7240, https://gitlab.com/gitlab-org/gitlab-ci-multi-runner/issues/270 this is a feature that a lot of people are looking for. I'm re-posting it here, since this seems to be the place to make feature requests.
If this is not a good workflow maybe some of you experienced gitlab CI developers could explain why, and how to best incorporate CI to test merges before actually merging them.
In our case, we don't want every fork to have to configure a runner and be tested on every commit, since the test suite requires computational resources on an HPC cluster. A central runner triggered in the merging process would therefore be the most convenient for us.