
chore(deps): update dependency litellm to v1.55.2

Soos requested to merge renovate/litellm-1.x-lockfile into main

This MR contains the following updates:

| Package | Type | Update | Change |
| --- | --- | --- | --- |
| litellm | dependencies | minor | 1.54.0 -> 1.55.2 |
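
For a quick post-merge smoke test, a minimal sketch is below. It is not part of this MR: it assumes the updated lockfile has been installed, that an OpenAI-compatible API key is available in the environment, and the model name is purely illustrative.

```python
# Hypothetical post-update check (not part of this MR): confirm the installed
# litellm version and exercise the basic completion API once.
from importlib.metadata import version

import litellm

# The lockfile update in this MR should resolve litellm to 1.55.2.
print(version("litellm"))

# Assumes OPENAI_API_KEY is set; "gpt-4o-mini" is only an example model name.
response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```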

⚠️ Warning

Some dependencies could not be looked up. Check the warning logs for more information.

WARNING: this job ran in a Renovate pipeline that doesn't support the configuration required for common-ci-tasks Renovate presets.


Release Notes

BerriAI/litellm (litellm)

v1.55.2

Compare Source

What's Changed

Full Changelog: https://github.com/BerriAI/litellm/compare/v1.55.1...v1.55.2

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.55.2
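
Once the container above is up, the proxy exposes an OpenAI-compatible API on port 4000. The sketch below (not from the release notes) shows one way to call the /chat/completions endpoint that the load-test tables in these notes measure; the API key and model name are placeholders that depend on your proxy configuration.

```python
# Illustrative client call against the proxy started above (localhost:4000).
# "sk-1234" and "gpt-4o-mini" are placeholders -- substitute the key and a
# model actually configured on your proxy.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # LiteLLM proxy from the docker run above
    api_key="sk-1234",
)

reply = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello from v1.55.2"}],
)
print(reply.choices[0].message.content)
```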

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed | 250.0 | 285.12950797290193 | 6.288841435893255 | 0.0033415735578603907 | 1882 | 1 | 149.6715149999659 | 2193.2730590000347 |
| Aggregated | Passed | 250.0 | 285.12950797290193 | 6.288841435893255 | 0.0033415735578603907 | 1882 | 1 | 149.6715149999659 | 2193.2730590000347 |

v1.55.1

Compare Source

What's Changed

Full Changelog: https://github.com/BerriAI/litellm/compare/v1.55.0...v1.55.1

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.55.1

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed | 250.0 | 274.17864765330575 | 6.170501674094568 | 0.0 | 1846 | 0 | 212.15181599995958 | 2203.3609819999356 |
| Aggregated | Passed | 250.0 | 274.17864765330575 | 6.170501674094568 | 0.0 | 1846 | 0 | 212.15181599995958 | 2203.3609819999356 |

v1.55.0

Compare Source

What's Changed

Full Changelog: https://github.com/BerriAI/litellm/compare/v1.54.1...v1.55.0

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.55.0

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Passed | 250.0 | 286.19507948581224 | 5.886697197840291 | 0.0033409178194326278 | 1762 | 1 | 211.68456200001629 | 3578.4067740000296 |
| Aggregated | Passed | 250.0 | 286.19507948581224 | 5.886697197840291 | 0.0033409178194326278 | 1762 | 1 | 211.68456200001629 | 3578.4067740000296 |

v1.54.1

Compare Source

What's Changed

Full Changelog: https://github.com/BerriAI/litellm/compare/v1.54.0...v1.54.1

Docker Run LiteLLM Proxy

docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.54.1

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed | 280.0 | 340.7890831504466 | 5.986291177372485 | 0.0 | 1788 | 0 | 236.28402200000664 | 4047.592437999981 |
| Aggregated | Failed | 280.0 | 340.7890831504466 | 5.986291177372485 | 0.0 | 1788 | 0 | 236.28402200000664 | 4047.592437999981 |

Configuration

📅 Schedule: Branch creation - "every weekend" (UTC), Automerge - At any time (no schedule defined).

🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.

♻️ Rebasing: Whenever MR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this MR and you won't be reminded about this update again.


  • If you want to rebase/retry this MR, check this box

This MR has been generated by Renovate Bot.
