The LiteLLM Blog

Guides, announcements, and best practices from the LiteLLM team.

Latest

v1.81.6 - Logs v2 with Tool Call Tracing

Deploy this version

v1.81.3-stable - Performance - 25% CPU Usage Reduction

v1.81.0-stable - Claude Code - Web Search Across All Providers

v1.80.15-stable - Manus API Support

v1.80.11-stable - Google Interactions API

[Preview] v1.80.10.rc.1 - Agent Gateway: Azure Foundry & Bedrock AgentCore

v1.80.8-stable - Introducing A2A Agent Gateway

v1.80.5-stable - Gemini 3.0 Support

v1.80.0-stable - Introducing Agent Hub: Register, Publish, and Share Agents

v1.79.3-stable - Built-in Guardrails on AI Gateway

v1.79.1-stable - Guardrail Playground

v1.79.0-stable - Search APIs

v1.78.5-stable - Native OCR Support

v1.78.0-stable - MCP Gateway: Control Tool Access by Team, Key

v1.77.7-stable - 2.9x Lower Median Latency

v1.77.5-stable - MCP OAuth 2.0 Support

v1.77.3-stable - Priority Based Rate Limiting

v1.77.2-stable - Bedrock Batches API

v1.76.3-stable - Performance, Video Generation & CloudZero Integration

This release has a known issue where startup leads to Out of Memory errors when deploying on Kubernetes. We recommend waiting to upgrade to this version.

v1.76.1-stable - Gemini 2.5 Flash Image

v1.76.0-stable - RPS Improvements

LiteLLM is hiring a Founding Backend Engineer in San Francisco.

v1.75.8-stable - Team Member Rate Limits

v1.75.5-stable - Redis latency improvements

v1.74.15-stable

v1.74.9-stable - Auto-Router

v1.74.7-stable

v1.74.3-stable

v1.74.0-stable

v1.73.6-stable

v1.73.0-stable - Set default team for new users

Known Issues

v1.72.6-stable - MCP Gateway Permission Management

v1.72.2-stable

v1.72.0-stable

v1.71.1-stable - 2x Higher Requests Per Second (RPS)

v1.70.1-stable - Gemini Realtime API Support

v1.69.0-stable - Loadbalance Batch API Models

v1.68.0-stable

v1.67.4-stable - Improved User Management

responses_api, ui_improvements, security

v1.67.0-stable - SCIM Integration

Key Highlights

sso, unified_file_id, cost_tracking

v1.66.0-stable - Realtime API Cost Tracking

sso, unified_file_id, cost_tracking

v1.65.4-stable

v1.65.0-stable - Model Context Protocol

v1.65.0-stable is live now. Here are the key highlights of this release:

mcp, custom_prompt_management

v1.65.0 - Team Model Add - update

v1.65.0 updates the /model/new endpoint to prevent non-team admins from creating team models.

management endpoints, team models, ui

v1.63.14-stable

These are the changes since v1.63.11-stable.

credential management, thinking content, responses api

v1.63.11-stable

These are the changes since v1.63.2-stable.

credential management, thinking content, responses api

v1.63.2-stable

These are the changes since v1.61.20-stable.

llm translation, thinking, reasoning_content

v1.63.0 - Anthropic 'thinking' response update

v1.63.0 fixes the Anthropic 'thinking' response on streaming to return the signature block (GitHub Issue).

llm translation, thinking, reasoning_content

v1.61.20-stable

These are the changes since v1.61.13-stable.

llm translation, rerank, ui

v1.59.8-stable

Get a 7-day free trial for LiteLLM Enterprise here.

admin ui, logging, db schema

v1.59.0

Get a 7-day free trial for LiteLLM Enterprise here.

admin ui, logging, db schema

v1.57.8-stable

alerting, prometheus, secret management, management endpoints, ui, prompt management, finetuning, batch

langfuse, humanloop, alerting

v1.57.7

langfuse, management endpoints, ui, prometheus, secret management

v1.57.3 - New Base Docker Image

docker image, security, vulnerability

v1.56.4

deepgram, fireworks ai, vision, admin ui, dependency upgrades

v1.56.3

guardrails, logging, virtual key management, new models

v1.56.1

key management, budgets/rate limits, logging, guardrails

v1.55.10

batches, guardrails, team management, custom auth

v1.55.8-stable

A new LiteLLM Stable release just went out. Here are 5 updates since v1.52.2-stable.

langfuse, fallbacks, new models