
RTV - Service classes to retrieve an LLM code fix for a vuln

What does this MR do and why?

This MR introduces the service classes needed to provide Resolve This Vulnerability functionality in GitLab. None of this code is wired up to user interaction yet; it is intended to be exposed through the subscription WebSocket workflow currently used by all AI interactions, hence its registration within the LLM CompletionsFactory.
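
For orientation, here is a minimal sketch of how such a completion might be registered in the CompletionsFactory. This is not the literal diff in this MR; the :resolve_vulnerability key and the service_class/prompt_class hash shape are assumptions for illustration only.

# Hypothetical registration entry; the key name and hash shape are assumed.
module Gitlab
  module Llm
    class CompletionsFactory
      COMPLETIONS = {
        resolve_vulnerability: {
          service_class: ::Gitlab::Llm::Completions::ResolveVulnerability,
          prompt_class: ::Gitlab::Llm::Templates::Vulnerabilities::ResolveVulnerability
        }
      }.freeze
    end
  end
end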

How to set up and validate locally

Testing this isn't straightforward unless you have:

  1. A local GDK configuration with a project that can produce valid SAST vulnerabilities
  2. Local GDK Vertex AI LLM credentials configured (a hedged configuration sketch follows this list).
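
For step 2, something along these lines in the GDK Rails console can point your instance at a GCP project. The vertex_ai_project and vertex_ai_credentials setting names are assumptions based on the usual Vertex AI application settings and may differ across GitLab versions:

# Assumed setting names and values; adjust for your GitLab version and GCP setup.
ApplicationSetting.current.update!(
  vertex_ai_project: 'my-gcp-project',                         # hypothetical GCP project ID
  vertex_ai_credentials: File.read('/path/to/vertex-sa.json')  # hypothetical service account key
)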

With that in place, you can run the following script in your GDK Rails console:

message = Gitlab::Llm::AiMessage.new(role: 'user', user: User.first, resource: Vulnerability.sast.last)
Gitlab::Llm::Completions::ResolveVulnerability.new(message, Gitlab::Llm::Templates::Vulnerabilities::ResolveVulnerability).execute

If successful, this should return a merge request reference (for example, !5) to a merge request created on the project associated with the vulnerability, built from the patch provided by the Vertex LLM.
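
To confirm the result, you can inspect the newest merge request on the vulnerability's project from the same console session. This is only an illustrative follow-up check, not part of the MR:

# Illustrative check: print the reference and title of the most recent MR
# on the vulnerability's project.
vulnerability = Vulnerability.sast.last
merge_request = vulnerability.project.merge_requests.last
puts merge_request.to_reference # e.g. "!5"
puts merge_request.title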

MR acceptance checklist

This checklist encourages us to confirm any changes have been analyzed to reduce risks in quality, performance, reliability, security, and maintainability.

Related to #426575 (closed)
