Add a "quality report" generated by AI
Release notes
A customer asked me yesterday whether we can ask the AI to provide an efficiency score for our code. The idea would be, as with code explanation or test case generation, to submit a piece of code or a file and ask the AI to rate it, so we know whether it is worth refactoring.
Problem to solve
Over time, developers add more and more methods and complexity to a class, or keep using a "good old" framework when a newer one could replace it. It would be helpful to have the AI generate a score that tells us whether our code could be significantly improved.
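As a rough illustration of the flow (submit a piece of code, get a rating back), the sketch below shows what such a request could look like. Everything in it is hypothetical: `build_quality_prompt`, `ask_model` and the 1-10 scale are assumptions made for illustration, not an existing GitLab Duo API.

```python
# Hypothetical sketch only: submit a snippet to an AI model and ask for a
# quality score. `ask_model` is a stand-in for whatever backend the feature
# would call; it is not an existing GitLab API.

def build_quality_prompt(source: str) -> str:
    """Wrap the submitted code in a scoring prompt."""
    return (
        "Rate the following code for maintainability and efficiency on a "
        "scale of 1 (worth refactoring) to 10 (clean), and explain briefly:\n\n"
        + source
    )


def ask_model(prompt: str) -> str:
    """Placeholder for the real model call; returns a canned example answer."""
    return "Score: 4/10 - the class mixes persistence, validation and reporting logic."


snippet = '''
class ReportBuilder:
    """Imagine a class that has accumulated many unrelated methods over the years."""
'''

print(ask_model(build_quality_prompt(snippet)))
```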
Proposal
Intended users
- Delaney (Development Team Lead)
- Sasha (Software Developer)
- Priyanka (Platform Engineer)
- Simone (Software Engineer in Test)
- Isaac (Infrastructure Engineer)
Feature Usage Metrics
If this ships as a new feature in the UI, tracking its usage like any other UI feature should be sufficient.
Does this feature require an audit event?
This page may contain information related to upcoming products, features and functionality. It is important to note that the information presented is for informational purposes only, so please do not rely on the information for purchasing or planning purposes. Just like with all projects, the items mentioned on the page are subject to change or delay, and the development, release, and timing of any products, features, or functionality remain at the sole discretion of GitLab Inc.