
GitLab Self-Managed - AI Gateway throws errors when connecting to o3 or gpt-5

Summary

When attempting to use openai/o3 or openai/gpt-5 for GitLab Duo Chat on our self-managed GitLab instance, the AI Gateway returns 500 status codes and fails to complete requests to the OpenAI API.

Steps to reproduce

  • Set up a self-managed GitLab instance with the AI Gateway
  • Navigate to GitLab Duo Self-Hosted in the Duo admin dashboard
  • Add a self-hosted model pointing at https://api.openai.com/v1 with model identifier openai/gpt-5 or openai/o3, and include a valid OpenAI API key
  • Click Test connection; the request fails
  • Change the model identifier to openai/gpt-4o; Test connection now succeeds


What is the current bug behavior?

For gpt-5, the request fails and logs the following: `litellm.BadRequestError: OpenAIException - Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.`

For o3, the request fails and logs: `litellm.UnsupportedParamsError: O-series models don't support temperature=0.1. Only temperature=1 is supported. To drop unsupported openai params from the call, set litellm.drop_params = True`
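Both failures stem from the same underlying issue: newer OpenAI models reject legacy request parameters that the gateway still sends. A minimal sketch of the kind of remapping an updated litellm (or the gateway itself) would need to apply before forwarding the request; `adapt_params` and the model-prefix checks here are hypothetical illustrations, not actual gateway code:

```python
def adapt_params(model: str, params: dict) -> dict:
    """Adjust legacy OpenAI chat parameters for reasoning-style models.

    Hypothetical helper for illustration only. gpt-5 and o-series
    models reject 'max_tokens' (expecting 'max_completion_tokens')
    and only accept the default temperature of 1.
    """
    params = dict(params)  # avoid mutating the caller's dict
    if model.startswith(("gpt-5", "o1", "o3", "o4")):
        # Rename max_tokens to the parameter these models accept.
        if "max_tokens" in params:
            params["max_completion_tokens"] = params.pop("max_tokens")
        # Drop any non-default temperature, mirroring what the
        # litellm error message suggests drop_params would do.
        if params.get("temperature") not in (None, 1):
            params.pop("temperature", None)
    return params


print(adapt_params("gpt-5", {"max_tokens": 256, "temperature": 0.1}))
```

With this shape of remapping, the gpt-5 request above would go out with `max_completion_tokens=256` and no `temperature`, while a gpt-4o request would pass through unchanged.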

What is the expected correct behavior?

Requests should be completed successfully.

Relevant logs and/or screenshots

Output of checks

Results of GitLab environment info

GitLab Omnibus container: gitlab/gitlab-ee:18.3.1-ee.0

AI Gateway container: registry.gitlab.com/gitlab-org/modelops/applied-ml/code-suggestions/ai-assist/model-gateway:self-hosted-v18.3.0-ee



Possible fixes

I suspect the fix may be as simple as updating the litellm package bundled with the AI Gateway, since the errors point at litellm's parameter translation for these newer models, but I haven't verified this.

Patch release information for backports

If the bug fix needs to be backported in a patch release to a version under the maintenance policy, please follow the steps on the patch release runbook for GitLab engineers.

Refer to the internal "Release Information" dashboard for information about the next patch release, including the targeted versions, expected release date, and current status.

High-severity bug remediation

To remediate high-severity issues requiring an internal release for single-tenant SaaS instances, refer to the internal release process for engineers.
