add "pipelines_json.log" logfile to use pipeline data with 3rd party software

Problem to solve

The GitLab pipeline view is slow when a pipeline has more than 1500 jobs and sometimes leads to timeouts (see the referenced issue), especially while the pipeline is still running.

An optional alternative would be to feed the data into Elasticsearch and use Kibana as the frontend for statistics.
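
As an illustration of that alternative, here is a minimal sketch of how JSON-lines pipeline data could be bulk-indexed into Elasticsearch with the official Python client; the host, index name, and logfile path are placeholder assumptions, not part of this proposal:

```python
# Hypothetical sketch: ship JSON-lines pipeline data into Elasticsearch
# so Kibana can be used as the statistics frontend. Host, index name,
# and logfile path are placeholders.
import json
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://localhost:9200")  # placeholder Elasticsearch endpoint

def actions(path="pipelines_json.log", index="gitlab-pipelines"):
    """Yield one bulk-index action per JSON line in the logfile."""
    with open(path, encoding="utf-8") as log:
        for line in log:
            yield {"_index": index, "_source": json.loads(line)}

# Bulk-index all log entries; Kibana can then visualize the index.
helpers.bulk(es, actions())
```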

The pipeline/job data in api_json.log and production_json.log is currently very limited and does not contain the information needed to get an overview of, or more detail about, the status of pipelines and jobs, such as job name, stage, job URL, duration, etc.

Of course, this could also be achieved with a custom script that makes the necessary API calls and writes the results to a logfile.
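
For illustration, a minimal sketch of such a script, using the documented pipelines and jobs endpoints of the GitLab REST API; the instance URL, project ID, token, and output path are placeholders:

```python
# Hypothetical sketch: poll the GitLab API and append pipeline/job data
# as JSON lines to a logfile. Endpoint paths follow the documented
# GitLab REST API; project ID, token, and file path are placeholders.
import json
import requests

GITLAB_URL = "https://gitlab.example.com"  # placeholder instance URL
PROJECT_ID = 123                           # placeholder project ID
TOKEN = "glpat-..."                        # placeholder access token
LOGFILE = "pipelines_json.log"

headers = {"PRIVATE-TOKEN": TOKEN}

def fetch(path, **params):
    """GET a GitLab API path and return the decoded JSON body."""
    resp = requests.get(f"{GITLAB_URL}/api/v4{path}", headers=headers, params=params)
    resp.raise_for_status()
    return resp.json()

with open(LOGFILE, "a", encoding="utf-8") as log:
    # List recent pipelines for the project (paginated; first page only here).
    for pipeline in fetch(f"/projects/{PROJECT_ID}/pipelines", per_page=20):
        # Fetch the pipeline's jobs to get name, stage, status, duration, URL.
        jobs = fetch(f"/projects/{PROJECT_ID}/pipelines/{pipeline['id']}/jobs", per_page=100)
        entry = {
            "pipeline": pipeline,
            "jobs": [
                {
                    "id": j["id"],
                    "name": j["name"],
                    "stage": j["stage"],
                    "status": j["status"],
                    "duration": j.get("duration"),
                    "web_url": j["web_url"],
                }
                for j in jobs
            ],
        }
        # One JSON object per line, so log shippers can ingest it directly.
        log.write(json.dumps(entry) + "\n")
```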

Further details

Proposal

Add something like job traces, but for pipelines, or add an (optional) logfile such as pipelines_json.log that basically contains the same data as the corresponding API calls.

What does success look like, and how can we measure that?

A structured (JSON) logfile that contains all the pipeline data.
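
For example, a single log entry could look roughly like the following; the field names are borrowed from the existing pipelines/jobs API responses, and the exact schema is of course open:

```json
{
  "pipeline": {
    "id": 4711,
    "status": "running",
    "ref": "main",
    "sha": "a91957a8",
    "duration": 1820,
    "web_url": "https://gitlab.example.com/group/project/-/pipelines/4711"
  },
  "jobs": [
    {
      "id": 93105,
      "name": "rspec 1/20",
      "stage": "test",
      "status": "success",
      "duration": 412.3,
      "web_url": "https://gitlab.example.com/group/project/-/jobs/93105"
    }
  ]
}
```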

Links / references

gitlab-ce#57487
