
Fix LLM AI client not returning an HTTP response for 204 responses

Stan Hu requested to merge sh-fix-llm-ai-client-204-response into master

What does this MR do and why?

Previously, Llm::AiGateway::Client#request returned nil if the AI Gateway returned a 204 No Content response. This made it impossible to discern whether the request succeeded or the server returned a 5xx error.

This happened because run_retry_with_exponential_backoff returned nil whenever the response body was blank. To fix this, return the HTTParty response even when the body is blank.
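
For illustration, here is a minimal Ruby sketch of the idea behind the fix. The class name, URL handling, and retry constants are hypothetical and simplified, not the actual Llm::AiGateway::Client code; the point is that the retry helper hands back the HTTParty response object instead of nil so callers can inspect the status code themselves.

```ruby
require 'httparty'

# Illustrative client, not the real GitLab implementation.
class ExampleAiGatewayClient
  MAX_RETRIES = 3

  def request(url)
    run_retry_with_exponential_backoff do
      HTTParty.post(url, body: { prompt: 'hello' }.to_json)
    end
  end

  private

  def run_retry_with_exponential_backoff
    retries = 0

    begin
      response = yield

      raise 'server error' if response.code >= 500

      # Before the fix: returning nil when the body was blank made a
      # 204 No Content success indistinguishable from a failed request.
      # After the fix: always return the HTTParty response so the caller
      # can check response.code and response.success? directly.
      response
    rescue StandardError
      retries += 1
      raise if retries > MAX_RETRIES

      sleep(2**retries) # exponential backoff between attempts
      retry
    end
  end
end
```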

References

None. I discovered this in development.

MR acceptance checklist

Please evaluate this MR against the MR acceptance checklist. It helps you analyze changes to reduce risks in quality, performance, reliability, security, and maintainability.

