Support new Claude 3.5 Sonnet model version

What does this MR do and why?

This MR only adds support for the new claude-3-5-sonnet-20241022 model in gitlab-rails.

Some features have not yet fully migrated their prompts to the AI Gateway (see Prompt Migration to AI Gateway, &14259 - closed), so support for Anthropic model changes must also be applied to the monolith. Beta and experimental features sometimes still call Anthropic's API directly, which means the new model has to be supported within `gitlab-rails`. To allow users to test changes with the newest Claude model, this MR only adds support for the model, without a full rollout to other features.

See &14259 (closed) for more details about how group::custom models is decoupling prompt changes from monolith changes.
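Because some prompts still resolve their model inside the monolith, adding the new version amounts to defining a new model constant next to the existing ones. A minimal sketch, assuming a hypothetical module layout (the actual constant names and file locations in gitlab-rails may differ):

```ruby
# Hypothetical sketch: the real gitlab-rails module layout may differ.
module Gitlab
  module Llm
    module Anthropic
      # Existing Claude 3.5 Sonnet model id.
      CLAUDE_3_5_SONNET = 'claude-3-5-sonnet-20240620'
      # New model version added by this MR.
      CLAUDE_3_5_SONNET_V2 = 'claude-3-5-sonnet-20241022'
    end
  end
end
```

Features that still call Anthropic directly can then opt in to `CLAUDE_3_5_SONNET_V2` individually, without a broader rollout.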


How to set up and validate locally

To test whether the change works as intended, take any experimental or beta feature's prompt and update its parameter definition to use CLAUDE_3_5_SONNET_V2. I chose the duo_code_review feature as an example (templates/review_merge_request).

  1. Create an MR with new draft files.
  2. Run the following slash command: /duo_code_review.
  3. Ensure the response is generated with the new model.
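The parameter-definition change from the setup step could look roughly like this. The method name and surrounding shape are illustrative only, not the actual duo_code_review prompt template:

```ruby
# Illustrative only: the real parameter definition for duo_code_review lives
# in the monolith's prompt templates and may be shaped differently.
CLAUDE_3_5_SONNET_V2 = 'claude-3-5-sonnet-20241022'

def payload_parameters
  {
    model: CLAUDE_3_5_SONNET_V2, # switched from the previous Sonnet constant
    max_tokens: 4096,
    temperature: 0.0
  }
end
```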

Example response output


```ruby
=> #<Gitlab::Llm::Anthropic::ResponseModifiers::ReviewMergeRequest:0x000000034221db68
 @ai_response=
  {"id"=>"msg_014u2EsdtEB4UfHajbAHGsNE",
   "type"=>"message",
   "role"=>"assistant",
   "model"=>"claude-3-5-sonnet-20241022",
   "content"=>
    [{"type"=>"text",
      "text"=>
       "<review>\n<comment priority=\"2\" line=\"16\">\nConsider adding a guard clause for `resource` to ensure it's not nil before calling `ai_review_merge_request_allowed?`. While `super` might handle this, being explicit about the requirement would make the code more robust.\n\n<from>\n      super && resource.ai_review_merge_request_allowed?(user)\n</from>\n<to>\n      super && resource&.ai_review_merge_request_allowed?(user)\n</to>\n</comment>\n</review>"}],
   "stop_reason"=>"end_turn",
   "stop_sequence"=>nil,
   "usage"=>{"input_tokens"=>1014, "output_tokens"=>128}}>
```
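To double-check which model produced the response, the `model` field of the response hash can be inspected in a Rails console. This sketch stands in a plain hash (trimmed to the relevant keys) for the real `@ai_response` shown above:

```ruby
# Stand-in for the @ai_response hash shown above (trimmed to relevant keys).
ai_response = {
  "model" => "claude-3-5-sonnet-20241022",
  "stop_reason" => "end_turn",
  "usage" => { "input_tokens" => 1014, "output_tokens" => 128 }
}

# The response should report the new model version, not the older Sonnet id.
raise "old model still in use" unless ai_response["model"] == "claude-3-5-sonnet-20241022"
```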
