Pass token usage metrics to clients

Problem to solve

As a first step toward instrumenting token usage per user in LLM calls (&36), let's return input/output token counts from the AI Gateway to clients. Clients can then associate individual calls with users; the AIGW only receives anonymized user IDs, so it cannot perform this association itself.
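As a rough illustration of the idea, the gateway could attach a per-call usage object to its response and leave user attribution to the client. This is a minimal sketch only; the field and function names below are hypothetical and do not reflect the AI Gateway's actual schema.

```python
# Hypothetical sketch of client-facing token usage metadata.
# Field names (input_tokens, output_tokens, total_tokens) are
# illustrative assumptions, not the gateway's real response schema.
from dataclasses import dataclass


@dataclass
class TokenUsage:
    input_tokens: int
    output_tokens: int


def build_usage_metadata(usage: TokenUsage) -> dict:
    """Shape the per-call token counts the gateway returns to clients.

    The gateway only sees anonymized user IDs, so mapping a call to a
    real user is left to the client, which already knows which user
    issued the request.
    """
    return {
        "usage": {
            "input_tokens": usage.input_tokens,
            "output_tokens": usage.output_tokens,
            "total_tokens": usage.input_tokens + usage.output_tokens,
        }
    }


# A client receiving this payload can pair it with the request it sent,
# associating the token counts with the (non-anonymized) user it knows.
metadata = build_usage_metadata(TokenUsage(input_tokens=120, output_tokens=45))
```

The key design point is that the gateway stays stateless with respect to user identity: it only reports counts, and attribution happens client-side.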

Proposal

Further details

Links / references

Edited by Alejandro Rodríguez