feat: allow codestral to be used for code generations and completions

Bruno Cardoso requested to merge bc/custom-models-codestral into main

What does this merge request do and why?

Allow codestral models to be used for code generations and completions via the litellm provider.

Part of gitlab-org/gitlab#467437 (closed).

How to set up and validate locally
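
The model_endpoint in the example below points at a local Ollama server's OpenAI-compatible API. Assuming you serve codestral through Ollama, the model can be pulled and the server started with:

ollama pull codestral
ollama serve

(ollama serve is only needed if the server is not already running.)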

  1. Make a request to the /v2/code/generations endpoint, specifying litellm as the model provider and codestral as the model name (a similar request to the completions endpoint is sketched after the example response):
curl -X 'POST' \
  'http://localhost:5052/v2/code/generations' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{
  "current_file": {
    "file_name": "app.py",
    "language_identifier": "python",
    "content_above_cursor": "def hello_world():",
    "content_below_cursor": ""
  },
  "model_provider": "litellm",
  "model_endpoint": "http://127.0.0.1:11434/v1",
  "model_name": "codestral",
  "stream": false,
  "choices_count": 1,
  "context": [],
  "prompt_version": 3,
  "prompt": [{"role": "user", "content": "help me"}]
}'

Example response:

{
  "id": "id",
  "model": {
    "engine": "litellm",
    "name": "openai/codestral",
    "lang": "python"
  },
  "experiments": [],
  "object": "text_completion",
  "created": 1718897380,
  "choices": [
    {
      "text": "\n Of course, I'd be happy to help you. Could you please provide more details about the topic or issue you're facing? It could be anything from a math problem, a question about programming, understanding a concept, troubleshooting a technical issue, or even just having a conversation. The more specific your request is, the better I can assist you.",
      "index": 0,
      "finish_reason": "length"
    }
  ]
}
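
A similar request exercises completions. This is a minimal sketch that assumes the completions endpoint is /v2/code/completions and that it accepts the same litellm model fields as the generations request above; the exact payload schema is not shown in this MR:

curl -X 'POST' \
  'http://localhost:5052/v2/code/completions' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{
  "current_file": {
    "file_name": "app.py",
    "language_identifier": "python",
    "content_above_cursor": "def hello_world():",
    "content_below_cursor": ""
  },
  "model_provider": "litellm",
  "model_endpoint": "http://127.0.0.1:11434/v1",
  "model_name": "codestral",
  "stream": false,
  "choices_count": 1
}'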

Merge request checklist

  • Tests added for new functionality. If not, please raise an issue to follow up.
  • Documentation added/updated, if needed.