
Maven plugin managing dependencies between separate jobs


Release notes

Problem to solve

The Jenkins Maven plugin uses the Freestyle Jenkins project type and tries to correlate independent builds based on their Maven POM files. The Freestyle project type is not supported by our current Jenkins importer.

Regarding capabilities for managing dependencies, either within a monorepo or across repositories, the Maven plugin documentation says:

Jenkins automatically creates project dependencies between projects which declare SNAPSHOT dependencies between each other.

We don't have such a capability today. The closest feature we have is rules:changes, but it is very hard to declaratively define those rules in YAML when the dependency graph is very large.
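For illustration, a minimal sketch of the rules:changes workaround for a Maven monorepo is below. The module names and paths (core, service-a, service-b) are hypothetical; the point is that every job has to re-list the paths of all of its upstream modules by hand, which quickly becomes unmanageable as the dependency graph grows.

```yaml
# Hypothetical monorepo with three Maven modules: core, service-a, and
# service-b, where both services depend on core. Each job must manually
# enumerate the paths of every upstream module it depends on.
build-core:
  stage: build
  script: mvn -pl core -am package
  rules:
    - changes:
        - core/**/*

build-service-a:
  stage: build
  script: mvn -pl service-a -am package
  rules:
    - changes:
        - service-a/**/*
        - core/**/*   # upstream paths repeated by hand in every consumer

build-service-b:
  stage: build
  script: mvn -pl service-b -am package
  rules:
    - changes:
        - service-b/**/*
        - core/**/*
```

Any change to the module structure means updating these path lists in every affected job, which is what makes this approach brittle compared to the automatic wiring Jenkins derives from SNAPSHOT dependencies.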

Intended users

User experience goal

Proposal

Further details

Additional context from a customer:

The Maven plugin gets inside a Maven build by attaching to the Maven intercept hook, and determines the inputs and outputs of a given build. These are fingerprinted, and when matching fingerprints are found in other jobs, connections are established between them. This means that I can make a change in a root component and cause it to build the entire tree. Conversely, I can make a change near a leaf and it will only build the components downstream of that. On top of that, if a developer makes a change in a component dependency, the tree is auto-rewired.

In a pipeline scenario, the only way I see this working is with the dependency map maintained independently of the code (which is potentially brittle), and the pipeline would have to contain code to determine the next step to execute as a function of changes against that dependency tree.

In addition, since module builds are independent jobs, the triggers in Jenkins are smart enough not to fire the downstream job if they sense another change in flight that will do the same thing.

I see pipelines/yaml as a brute force approach to builds – essentially a way to distribute a batch build job across multiple nodes. Scaling out the build is one approach. Limiting the build to what is actually required is the other.

The problem I see is that pipelines, both in Jenkins and GitLab, are written more to support modular, version-managed dependencies (with components that publish by version, just like OSS) and not for modular but interdependent codebases.
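For reference, a rough sketch of the workaround the customer describes above (a dependency map maintained independently of the code, plus pipeline logic that decides what to build) could look like the following, using GitLab's dynamic child pipelines. The deps.json file and generate_pipeline.py script are assumptions for illustration, not existing tooling.

```yaml
# Sketch only: a parent job computes the set of affected modules from a
# hand-maintained dependency map and generates a child pipeline containing
# jobs for just those modules and their downstream consumers.
generate-pipeline:
  stage: build
  script:
    # deps.json and generate_pipeline.py are hypothetical; the script would
    # diff the changed files against the dependency map and emit YAML.
    - git diff --name-only "$CI_COMMIT_BEFORE_SHA" "$CI_COMMIT_SHA" > changed-files.txt
    - python3 generate_pipeline.py --deps deps.json --changes changed-files.txt > child-pipeline.yml
  artifacts:
    paths:
      - child-pipeline.yml

build-affected-modules:
  stage: test
  trigger:
    include:
      - artifact: child-pipeline.yml
        job: generate-pipeline
    strategy: depend
```

As the customer notes, the dependency map itself has to be kept in sync with the POMs by hand (or by extra tooling), which is exactly the brittleness the Jenkins plugin avoids by deriving the graph automatically from SNAPSHOT dependencies.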

Permissions and Security

Documentation

Availability & Testing

Available Tier

What does success look like, and how can we measure that?

What is the type of buyer?

Is this a cross-stage feature?

Links / references
