Fix Explain This Vulnerability Disregarding `include_source_code`
While verifying Explain This Vulnerability: Handle Null Prompt ... (#416521 - closed), @dftian identified that the response being received was that of a no-code response while attempting to manufacture a situation where a null prompt should have been produced.
Further investigation shows a discrepancy in how the Template class is used between the AiAction and the GraphQL query: while the GraphQL API presents a null prompt where appropriate, the Completion class does not pass the options to the `to_prompt` method correctly, so the `include_source_code` parameter is disregarded.
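To make the failure mode concrete, here is a stand-alone sketch of the pattern (not the actual GitLab template class; the class name, constructor signature, and option handling are illustrative only):

```ruby
# Stand-alone illustration of the pitfall (not the real template class):
# options accepted by the constructor are never consulted, so only the
# argument given to to_prompt controls whether code is included.
class PromptTemplate
  def initialize(vulnerability, params = {})
    @vulnerability = vulnerability
    @params = params # stored, but never read by to_prompt
  end

  def to_prompt(options = {})
    return nil unless options.fetch(:include_source_code, true)

    "Explain this vulnerability: #{@vulnerability}"
  end
end

# Buggy call shape: include_source_code: false ends up in the constructor's
# params and is ignored, so to_prompt falls back to its default.
PromptTemplate.new('CVE-2023-0001', include_source_code: false).to_prompt
# => "Explain this vulnerability: CVE-2023-0001"

# Fixed call shape: the option reaches to_prompt and suppresses the prompt.
PromptTemplate.new('CVE-2023-0001').to_prompt(include_source_code: false)
# => nil
```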
This behaviour is significantly problematic: users who do not want their code sent to the AI are actively misled by the interface about what prompt will be sent, and their code is sent to the LLM anyway.
Implementation Plan
- Patch the completions code to ensure that the `include_source_code` directive is applied appropriately.
diff --git a/ee/lib/gitlab/llm/completions/explain_vulnerability.rb b/ee/lib/gitlab/llm/completions/explain_vulnerability.rb
index 39e0e162338c..de681922df13 100644
--- a/ee/lib/gitlab/llm/completions/explain_vulnerability.rb
+++ b/ee/lib/gitlab/llm/completions/explain_vulnerability.rb
@@ -30,7 +30,7 @@ def formatted_error_response(message)
def response_for(user, vulnerability, options)
Rails.cache.fetch(cache_key(vulnerability, options), expires_in: 5.minutes, skip_nil: true) do
- prompt = ai_prompt_class.new(vulnerability, options).to_prompt
+ prompt = ai_prompt_class.new(vulnerability).to_prompt(options)
if prompt.to_s.empty?
formatted_error_response(NULL_PROMPT_ERROR)
else