Extend feature flag to log LLM input/output in V2 Chat endpoint
## Problem
When we receive a bug report from GitLab team members, we want to be able to understand what is going on under the hood.
## Proposal
- Push the `expanded_ai_logging` feature flag to AI Gateway. See https://docs.gitlab.com/ee/development/ai_features/index.html#push-feature-flags-to-ai-gateway
- Log the user input and LLM/Agent output in AI Gateway (see the sketch below).
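
A minimal sketch of how the gateway-side logging could be gated on the pushed flag. The function names (`is_feature_enabled`, `log_llm_io`) and the idea that pushed flags arrive as a set of strings per request are assumptions for illustration, not the actual AI Gateway API.

```python
import logging

logger = logging.getLogger("ai_gateway.chat")

EXPANDED_AI_LOGGING = "expanded_ai_logging"


def is_feature_enabled(enabled_flags: set[str], flag: str) -> bool:
    """Check whether a feature flag pushed from the GitLab monolith is
    enabled for the current request (hypothetical helper)."""
    return flag in enabled_flags


def log_llm_io(enabled_flags: set[str], user_input: str, llm_output: str) -> None:
    """Log the raw user input and LLM/Agent output, but only when the
    expanded_ai_logging feature flag was pushed for this request."""
    if not is_feature_enabled(enabled_flags, EXPANDED_AI_LOGGING):
        return
    logger.info("LLM input: %s", user_input)
    logger.info("LLM/Agent output: %s", llm_output)


if __name__ == "__main__":
    logging.basicConfig(level=logging.INFO)
    flags = {"expanded_ai_logging"}  # example set of flags pushed from the Rails side
    log_llm_io(flags, "example user question", "example agent answer")
```

With this shape, the extra logging stays off by default and only appears for users or groups where the flag has been enabled and pushed, which keeps sensitive input/output out of logs in the normal case.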