Draft POC CI Pipelines for packages
🎪 Context
See #324196 (closed).
This MR is a Proof of Concept to explore the idea of having pipelines run each time a package is pushed to the GitLab Package Registry.
☄ Changes
Two projects needed changes for this support to happen: the GitLab rails backend and the GitLab runner project.
Rails backend
(Changes that you can see in this MR)
- Added `Packages::Push` model to link a package file to a `sha`.
- Added `Packages::CreatePipelineService`, a service that will create a pipeline object out of a `Push` instance (see the sketch after this list).
- Added a new pipeline source event: `package_push_event`
- Updated the chain classes handling the `Ci::Pipeline` to make them accept a `Push`
- Added an API to download a package file linked to a `Push` from a given `sha` (used by the GitLab runner).
- Added a call to `Packages::CreatePipelineService` when an NPM package is uploaded.
- Updated the NPM endpoint that returns package files. The package file is only returned if a green pipeline exists.
- Added pipeline status to the package files page (UI is not definitive!)
  - Updated the related presenters (`n+1` issue there)
- Added pipeline status to the packages page (UI is not definitive!)
  - Updated the related presenters (`n+1` issue there)
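For reference, here is a minimal sketch of what `Packages::CreatePipelineService` could look like. Only the names `Packages::Push`, `Packages::CreatePipelineService` and `package_push_event` come from this MR; the `Push` attributes and the exact `Ci::CreatePipelineService` invocation are assumptions for illustration.

```ruby
# Rough sketch only, not the actual diff. The Push attributes and the exact
# Ci::CreatePipelineService call below are assumptions.
module Packages
  class CreatePipelineService
    # `push` is assumed to respond to #project, #user and #sha.
    def initialize(push)
      @push = push
    end

    def execute
      ::Ci::CreatePipelineService
        .new(@push.project, @push.user, ref: @push.project.default_branch) # which ref/sha to use is one of the open questions below
        .execute(:package_push_event, push: @push)
    end
  end
end
```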
GitLab runner
- Added `curl` and `tar` to the base images
  - Not happy with that change
- Added support for pipelines that are created with a package push during the "fetch git source" step (illustrated after this list)
  - This will `curl` the push API with the `sha` from the pipeline to get the package.
  - The package file will be extracted to the build dir under `pkg` (not happy with this change)
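The runner change itself is written in Go, but the protocol is simple enough to illustrate with a small Ruby sketch: download the package file from the push API using the pipeline's `sha`, then extract it under `pkg` in the build directory. The endpoint path, the `JOB-TOKEN` auth and the file names below are assumptions.

```ruby
require 'net/http'
require 'fileutils'

# Illustration of the "fetch git source" replacement for package push pipelines.
# The endpoint path and the JOB-TOKEN auth are assumptions for this sketch.
def fetch_package(gitlab_url, project_id, sha, job_token, build_dir)
  uri = URI("#{gitlab_url}/api/v4/projects/#{project_id}/packages/pushes/#{sha}/download")
  archive = File.join(build_dir, 'package.tgz')

  Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == 'https') do |http|
    request = Net::HTTP::Get.new(uri)
    request['JOB-TOKEN'] = job_token
    File.binwrite(archive, http.request(request).body)
  end

  # NPM packages are gzipped tarballs; extract under `pkg` as described above.
  pkg_dir = File.join(build_dir, 'pkg')
  FileUtils.mkdir_p(pkg_dir)
  system('tar', '-xzf', archive, '-C', pkg_dir)
end
```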
🤔 Questions
Rails backend
- `Ci::Pipeline` is an object that has several references to a git repository, but in the case of a package push we only have a `sha` and that's it. How can we cleanly support that case?
- How do we integrate with packages that need background processing (eg. nuget or rubygems packages)? Do we need to wait for the metadata extraction? Perhaps not?
  - The metadata extraction could be a "mandatory" job of that pipeline?
- Not all packages are represented by a single package archive. Some of them are git tags. How do we support that?
- Are pipeline notifications working?
- How can we handle two `.gitlab-ci.yml` files (one for git repository changes and one for package pushes)?
  - Custom `Configuration::Content` chain class? (one possible shape is sketched after this list)
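To make that last question more concrete, here is one possible shape for a dedicated `Config::Content` source, assuming the existing source classes expose `#content` and `#source`. The CI file name, the `package_push?` predicate and the repository read are all hypothetical.

```ruby
# Hypothetical Config::Content source for package push pipelines. The CI file
# name, the package_push? predicate and the repository read are assumptions.
module Gitlab
  module Ci
    module Pipeline
      module Chain
        module Config
          class Content
            class PackagePush < Source
              PACKAGES_CI_FILE = '.gitlab-ci-packages.yml'

              def content
                return unless pipeline.package_push?

                # Read the package-specific CI file from the default branch,
                # since the push sha does not point to a git commit.
                pipeline.project.repository.blob_data_at(pipeline.project.default_branch, PACKAGES_CI_FILE)
              end

              def source
                :package_push_source
              end
            end
          end
        end
      end
    end
  end
end
```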
GitLab runner
- `curl` is needed to get the package file, but how we extract things from the package archive is highly dependent on the package type. For example, NPM packages are `tar` archives but nuget packages are `zip` archives. It doesn't seem wise to include in the runner image all the tools needed to extract archives.
  - How can we have a more dynamic behavior? Let the rails backend tell the runner which tool is needed for the extraction? (sketched after this list)
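A minimal sketch of the "let the rails backend tell the runner" option, assuming the backend keeps a package-type-to-archive-format mapping and exposes it to the runner with the job. The module, the method and the payload idea are hypothetical.

```ruby
# Hypothetical backend-side mapping that could be sent to the runner along with
# the job, so the runner knows which extraction tool to use.
module Packages
  module ArchiveFormat
    FORMATS = {
      'npm'      => :tar, # NPM packages are gzipped tarballs
      'nuget'    => :zip, # NuGet packages are zip archives
      'rubygems' => :tar  # .gem files are tar-based
    }.freeze

    def self.for(package_type)
      FORMATS.fetch(package_type, :tar)
    end
  end
end
```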
🔭 Possible follow-ups/evolutions
Rails backend
- The `Push` object can be used to build an audit log on uploads. We could store who pushed the package (User or DeployToken). The UI could display those `Push` objects as an activity. (see the sketch after this list)
- Move the metadata extraction background jobs to CI "mandatory" / "forced" jobs = users would have full visibility of what is happening with their package.
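A small sketch of the audit-log idea, assuming `Packages::Push` stores a polymorphic reference to whoever pushed. The column and association names are made up.

```ruby
# Hypothetical shape of the Push model for the audit-log follow-up.
module Packages
  class Push < ApplicationRecord
    belongs_to :package_file, class_name: 'Packages::PackageFile'

    # User or DeployToken, as mentioned above.
    belongs_to :pushed_by, polymorphic: true, optional: true
  end
end
```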