# Draft: POC CI Pipelines for packages

## 🎪 Context
See #324196 (closed).
This MR is a Proof of Concept exploring the idea of having pipelines run each time a package is pushed to the GitLab Package Registry.
## ☄ Changes
Two projects needed changes to support this: the GitLab Rails backend and the GitLab runner project.
### Backend (Rails)

Changes that you can see in this MR:
- Added a `Packages::Push` model to link a package file to a `sha`.
- Added `Packages::CreatePipelineService`, a service that creates a pipeline object out of a `Push` instance.
- Added a new pipeline source event: `package_push_event`.
- Updated the chain classes handling the `Ci::Pipeline` to make them accept a `Push`.
- Added an API to download a package file linked to a `Push` from a given `sha` (used by the GitLab runner).
- Added a call to `Packages::CreatePipelineService` when an NPM package is uploaded.
- Updated the NPM endpoint that returns package files: a package file is only returned if a green pipeline exists.
- Added pipeline status to the package files page (UI is not definitive!)
  - Updated the related presenters (`n+1` issue there)
- Added pipeline status to the packages page (UI is not definitive!)
  - Updated the related presenters (`n+1` issue there)
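To make the new pieces above concrete, here is a minimal plain-Ruby sketch (no Rails) of how `Packages::CreatePipelineService` might build a pipeline from a `Push`. Only the names `Push`, `Packages::CreatePipelineService`, and the `package_push_event` source come from this MR; the attributes and internals are assumptions for illustration.

```ruby
# A Push links a package file to a sha (attributes assumed for this sketch).
Push = Struct.new(:sha, :package_file, :project, keyword_init: true)

# Stand-in for Ci::Pipeline: only the fields this sketch needs.
Pipeline = Struct.new(:source, :sha, :project, keyword_init: true)

module Packages
  class CreatePipelineService
    def initialize(push)
      @push = push
    end

    # Build a pipeline whose source is the new package_push_event,
    # carrying only the sha: there is no git ref in this case.
    def execute
      Pipeline.new(
        source: :package_push_event,
        sha: @push.sha,
        project: @push.project
      )
    end
  end
end

push = Push.new(sha: 'abc123', package_file: 'pkg-1.0.0.tgz', project: 'group/project')
pipeline = Packages::CreatePipelineService.new(push).execute
puts pipeline.source # => package_push_event
puts pipeline.sha    # => abc123
```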
### GitLab runner

- Added `curl` and `tar` to the base images.
  - Not happy with that change.
- Added support for pipelines created with a package push during the "fetch git source" step.
  - The runner will `curl` the push API with the `sha` from the pipeline to get the package.
  - The package file will be extracted into the build dir under `pkg` (not happy with this change).
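The runner-side fetch step above can be sketched as follows. This is written in Ruby purely for illustration (the real runner is Go), and the API path, flags, and archive name are assumptions; the MR only states that the runner curls the push API with the pipeline's `sha` and extracts the archive into the build dir under `pkg`.

```ruby
# Build the shell commands the "fetch git source" step would run for a
# package-push pipeline. All paths and the endpoint URL are hypothetical.
def package_fetch_commands(gitlab_url:, project_id:, sha:, build_dir:)
  archive = "#{build_dir}/package.tgz"
  [
    # Download the package file linked to the Push for this sha.
    "curl --fail --silent --output #{archive} " \
      "#{gitlab_url}/api/v4/projects/#{project_id}/packages/pushes/#{sha}/download",
    # Extract into the build dir under pkg/ (NPM packages are tar archives).
    "mkdir -p #{build_dir}/pkg",
    "tar -xzf #{archive} -C #{build_dir}/pkg"
  ]
end

cmds = package_fetch_commands(
  gitlab_url: 'https://gitlab.example.com',
  project_id: 42,
  sha: 'abc123',
  build_dir: '/builds/group/project'
)
puts cmds
```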
## 🤔 Questions
### Backend (Rails)

- `Ci::Pipeline` is an object with several references to a git repository, but in the case of a package push we only have a `sha` and that's it. How can we cleanly support that case?
- How do we integrate with packages that need background processing (e.g. NuGet or RubyGems packages)? Do we need to wait for the metadata extraction? Perhaps not?
  - The metadata extraction could be a "mandatory" job of that pipeline?
- Not all packages are represented by a single package archive. Some of them are git tags. How do we support that?
- Are pipeline notifications working?
- How can we handle two `.gitlab-ci.yml` files (one for git repository changes and one for package pushes)?
  - A custom `Configuration::Content` chain class?
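One way the "custom `Configuration::Content` chain class" question could be answered is sketched below: pick a different CI config file depending on the pipeline source. The file name `.gitlab-ci-package.yml` and the selection logic are pure assumptions, not anything decided in this MR.

```ruby
# Hypothetical content-selection step: package-push pipelines read their own
# config file, everything else keeps the regular .gitlab-ci.yml.
class PackageAwareContent
  DEFAULT_CI_FILE = '.gitlab-ci.yml'
  PACKAGE_CI_FILE = '.gitlab-ci-package.yml' # assumed file name

  def initialize(pipeline_source)
    @pipeline_source = pipeline_source
  end

  def ci_config_path
    @pipeline_source == :package_push_event ? PACKAGE_CI_FILE : DEFAULT_CI_FILE
  end
end

puts PackageAwareContent.new(:package_push_event).ci_config_path # => .gitlab-ci-package.yml
puts PackageAwareContent.new(:push).ci_config_path               # => .gitlab-ci.yml
```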
### GitLab runner

- `curl` is needed to get the package file, but how we extract things from the package archive is highly dependent on the package type. For example, NPM packages are `tar` archives but NuGet packages are `zip` archives. It doesn't seem wise to include in the runner image all the tools needed to extract archives.
  - How can we have a more dynamic behavior? Let the Rails backend tell the runner which tool is needed for the extraction?
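The "let the backend tell the runner which tool to use" idea could look something like this sketch: the backend keeps a per-package-type registry of extraction commands and hands the resulting command to the runner. The mapping, method names, and commands are assumptions for illustration only.

```ruby
# Hypothetical registry mapping package type to an extraction command.
EXTRACTORS = {
  npm:      ->(archive, dest) { "tar -xzf #{archive} -C #{dest}" },  # NPM: tar archive
  nuget:    ->(archive, dest) { "unzip -o #{archive} -d #{dest}" },  # NuGet: zip archive
  rubygems: ->(archive, dest) { "gem unpack #{archive} --target=#{dest}" }
}.freeze

def extraction_command(package_type, archive, dest)
  extractor = EXTRACTORS.fetch(package_type) do
    raise ArgumentError, "no extractor registered for #{package_type}"
  end
  extractor.call(archive, dest)
end

puts extraction_command(:npm, 'pkg.tgz', 'pkg/')    # => tar -xzf pkg.tgz -C pkg/
puts extraction_command(:nuget, 'pkg.nupkg', 'pkg/')
```

This keeps the runner image small: it only needs the tools the backend can actually name, and new package types become a backend-side change.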
## 🔭 Possible follow-ups/evolutions
### Backend (Rails)

- The `Push` object can be used to build an audit log of uploads. We could store who pushed the package (a User or a DeployToken). The UI could display those `Push` objects as an activity feed.
- Move the metadata extraction background jobs to CI "mandatory"/"forced" jobs, so that users would have full visibility into what is happening with their package.
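The audit-log follow-up could be sketched as below: a `Push` records who uploaded the package, whether a User or a DeployToken, and the UI renders it as an activity line. These are plain-Ruby stand-ins; the field names and rendering are assumptions, not part of this MR.

```ruby
# Stand-ins for the models involved; attributes assumed for this sketch.
Push        = Struct.new(:sha, :pushed_by, :created_at, keyword_init: true)
User        = Struct.new(:username, keyword_init: true)
DeployToken = Struct.new(:name, keyword_init: true)

# Render one activity line like the UI could display.
def activity_entry(push)
  actor =
    case push.pushed_by
    when User        then "@#{push.pushed_by.username}"
    when DeployToken then "deploy token '#{push.pushed_by.name}'"
    end
  "#{push.created_at}: package pushed by #{actor} (#{push.sha})"
end

push = Push.new(sha: 'abc123',
                pushed_by: User.new(username: 'dfernandez'),
                created_at: '2021-04-01')
puts activity_entry(push)
# => 2021-04-01: package pushed by @dfernandez (abc123)
```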
Edited by David Fernandez