Extend feature flag to log LLM input/output in V2 Chat endpoint

Problem

When we receive a bug report from GitLab team members, we want to understand what is happening under the hood.

Proposal

  • Push the expanded_ai_logging feature flag to the AI Gateway. See https://docs.gitlab.com/ee/development/ai_features/index.html#push-feature-flags-to-ai-gateway
  • Log the user input and the LLM/agent output in the AI Gateway.
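As a rough illustration of the proposal above, logging in the AI Gateway could be gated on the pushed feature flag. This is only a sketch: the function name `log_llm_exchange` and the `enabled_flags` argument (standing in for however the gateway surfaces the flags forwarded from the Rails monolith) are hypothetical, not the actual AI Gateway API.

```python
import logging

logger = logging.getLogger("ai_gateway")


def log_llm_exchange(user_input: str, llm_output: str, enabled_flags: set[str]) -> bool:
    """Log the prompt and completion only when the expanded_ai_logging
    feature flag has been pushed from the monolith.

    `enabled_flags` is a placeholder for the gateway's real mechanism of
    exposing pushed feature flags on the request context.
    """
    if "expanded_ai_logging" not in enabled_flags:
        # Flag is off: emit nothing, so user content stays out of the logs.
        return False
    logger.info("LLM input: %s", user_input)
    logger.info("LLM output: %s", llm_output)
    return True
```

With the flag enabled only for the affected users, the bug-report conversation can be reproduced from the gateway logs without logging every user's content by default.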
Edited Sep 09, 2024 by Shinya Maeda