GitLab feedback - feature proposal: Charts - Overall statistics are misleading

Dear GitLab team, I am fairly new to CI/CD development and have been using GitLab for about a month now. If I am right, I have found a really small bug; maybe you can fix this one soon :)

Problem to solve

The numbers on the CI / CD "Overall statistics" page do not match the expected values. Example: --> https://gitlab.com/gitlab-org/gitlab-foss/pipelines/charts shows 70001 (success) / 155303 (total) = ~45.07 %, BUT the page reports a success ratio of 48 %.

I can only guess that the difference comes from "skipped" pipelines. Maybe there are even more statuses that are ignored in the statistics.
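The mismatch above can be reproduced with a few lines of arithmetic. This is a minimal sketch, assuming the page excludes skipped pipelines from the denominator; the skipped count below is back-computed from the displayed 48 % and is purely illustrative, not a real number from GitLab.

```python
# Counts from the example above; `skipped` is an assumed value chosen
# so that the ratio excluding skipped pipelines lands at ~48 %.
success = 70001
total = 155303   # all pipelines, including skipped ones
skipped = 9468   # illustrative assumption, not a real GitLab figure

# Ratio over all pipelines (what a reader would expect to see):
ratio_all = success / total * 100

# Ratio excluding skipped pipelines (what the page seems to show):
ratio_excl = success / (total - skipped) * 100

print(f"{ratio_all:.2f} % vs. {ratio_excl:.2f} %")  # → 45.07 % vs. 48.00 %
```

So a gap of roughly 9000-10000 skipped pipelines would fully explain the difference between the two numbers.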

Skipped pipelines imply that ..

  1. .. somebody is working on the project (which is good)
  2. .. there might be problems to solve, such as semantic errors, low concentration on work, or whatever. This is interesting for Scrum Masters and similar roles. Therefore the charts misrepresent the team's success and suppress (maybe needed) interaction to help the team.

Please include "skipped" (and other ignored statuses) in the charts - they do not appear there yet.

Intended users

Everyone from managers and team leads to all developers who start runners ...

  • Parker (Product Manager)
  • Delaney (Development Team Lead)

--> Parker & Delaney are no longer misled by the ratio/charts.

  • Sasha (Software Developer)
  • Presley (Product Designer)
  • Devon (DevOps Engineer)
  • Sidney (Systems Administrator)
  • Sam (Security Analyst)
  • Dana (Data Analyst)

--> Everyone can see how many pipelines were skipped that day -> this reveals problems earlier than before.

Further details

Please provide more detailed charts.

Proposal

Include the skipped pipelines (and similar statuses) in the charts. Maybe add a hover action on the success ratio which reveals the calculation.

Permissions and Security

None - it is open.

Documentation

None

Testing

What additional test coverage or changes to tests will be needed? Testing that the ratio is correct, including a setup with skipped pipelines.

What does success look like, and how can we measure that?

Skipped pipelines (and other ignored statuses) are visible.

What is the type of buyer?

I think it is a DevOps-relevant issue which should be included in every tier.

Links / references

None
