Add a "pipelines_json.log" logfile to use pipeline data with third-party software
Problem to solve
The GitLab pipeline view is slow when you have more than 1500 jobs and sometimes leads to timeouts (see the referenced issue), especially while the pipeline is still running.
An optional alternative would be to feed the data into Elasticsearch and use Kibana as a frontend for statistics.
The pipeline/job data in production_json.log is currently very limited and doesn't contain the information needed to get an overview of, or more detail about, the status of pipelines and jobs - such as job name, stage, job URL, duration, etc.
Of course, this could also be achieved with a custom script that makes the necessary API calls and writes the results to a logfile.
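Such a workaround script could be a minimal sketch like the one below, which polls the GitLab pipelines and jobs API endpoints and appends one JSON line per job to a logfile. The instance URL, project ID, and token are placeholders, and the selected fields are illustrative assumptions, not a fixed schema.

```python
# Hypothetical workaround sketch: fetch pipeline/job data via the GitLab
# REST API and append it as JSON lines to a logfile.
# GITLAB_URL, PROJECT_ID and TOKEN are placeholders you'd have to fill in.
import json
import urllib.request

GITLAB_URL = "https://gitlab.example.com"  # assumption: your instance URL
PROJECT_ID = 42                            # assumption: your project ID
TOKEN = "glpat-..."                        # assumption: a personal access token


def api_get(path):
    """GET a GitLab API path and decode the JSON response."""
    req = urllib.request.Request(
        f"{GITLAB_URL}/api/v4{path}",
        headers={"PRIVATE-TOKEN": TOKEN},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def job_log_line(pipeline, job):
    """Flatten one pipeline/job pair into a single JSON log line."""
    return json.dumps({
        "pipeline_id": pipeline["id"],
        "pipeline_status": pipeline["status"],
        "job_id": job["id"],
        "job_name": job["name"],
        "stage": job["stage"],
        "status": job["status"],
        "duration": job.get("duration"),
        "web_url": job.get("web_url"),
    }, sort_keys=True)


def dump_pipelines(logfile="pipelines_json.log"):
    """Append one JSON line per job for every pipeline in the project."""
    with open(logfile, "a") as fh:
        for pipeline in api_get(f"/projects/{PROJECT_ID}/pipelines"):
            pid = pipeline["id"]
            for job in api_get(f"/projects/{PROJECT_ID}/pipelines/{pid}/jobs"):
                fh.write(job_log_line(pipeline, job) + "\n")
```

The resulting file can then be shipped to Elasticsearch (or any other log consumer), but it duplicates work the application could do natively - which is the point of this request.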
Proposal
Add something like job traces, but for pipelines, or add an (optional) logfile such as pipelines_json.log which basically contains the same data as the corresponding API call.
What does success look like, and how can we measure that?
A structured (JSON) logfile that contains all the pipeline data.
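As an illustration of what one entry in such a logfile might look like (the field names here are assumptions, not a proposed schema):

```json
{"pipeline_id": 1234, "pipeline_status": "running", "job_id": 5678, "job_name": "rspec", "stage": "test", "status": "success", "duration": 42.5, "web_url": "https://gitlab.example.com/group/project/-/jobs/5678"}
```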