
LiteLLM

LLM Gateway to manage authentication, load balancing, and spend tracking across 100+ LLMs. All in the OpenAI format.

Overview

Python SDK and Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. Supported providers include Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, SageMaker, HuggingFace, vLLM, and NVIDIA NIM.

Call 100+ LLMs in OpenAI format. [Bedrock, Azure, OpenAI, VertexAI, Anthropic, Groq, etc.]

LiteLLM Proxy Server (AI Gateway) | Hosted Proxy | Enterprise Tier. The project has 39K+ GitHub stars; GitHub reports its license as NOASSERTION (i.e. not automatically classifiable).
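The "OpenAI format" claim above can be sketched with a minimal client that targets a locally running LiteLLM proxy. This is a hedged illustration: the base URL, port, model alias, and API key below are assumptions, not values shipped with the project.

```python
# Sketch: sending an OpenAI-style /chat/completions request to a LiteLLM
# gateway. The URL, model name, and key are illustrative placeholders.
import json
import urllib.request


def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-format chat completion request for the gateway."""
    payload = {
        "model": model,  # alias the proxy routes to a concrete provider
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer sk-...",  # your proxy key (placeholder)
        },
        method="POST",
    )


if __name__ == "__main__":
    req = build_chat_request("http://localhost:4000", "gpt-4o", "Hello!")
    # Requires a running proxy, so the network call is left commented out:
    # with urllib.request.urlopen(req) as resp:
    #     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the gateway speaks the OpenAI wire format, the same request body works whether the alias resolves to Bedrock, Azure, Anthropic, or any other backend.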

Getting Started

Source: GitHub README

git clone https://github.com/BerriAI/litellm.git
cd litellm
make install-dev    # Install development dependencies
make format         # Format your code
make lint           # Run all linting checks
make test-unit      # Run unit tests
make format-check   # Check formatting only

What Users Say

From "A gentle introduction to LiteLLM: A useful addition to any LLM technology stack" by Tituslhy (Jul 11, 2025, 14 min read): "Imagine this scenario: you're part of a generative AI team at a company…"


Normalized Features

Source: tool-features-normalized.json

Docker, Docker Compose, npm, one-click deploy, pip, SSO.
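The Docker Compose and pip deployment paths listed above both boot the proxy from a `config.yaml` that maps client-facing model aliases to concrete providers. The fragment below is a hedged sketch of that shape; the alias and environment-variable names are illustrative assumptions.

```yaml
# Sketch of a LiteLLM proxy config.yaml (names are placeholders)
model_list:
  - model_name: gpt-4o                    # alias clients send in requests
    litellm_params:
      model: openai/gpt-4o                # provider/model the proxy calls
      api_key: os.environ/OPENAI_API_KEY  # read from the environment
```

With a file like this, the same alias can later be repointed at a different provider without any client-side changes, which is the core of the gateway's load-balancing and routing story.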

Features

Authentication & Access

  • Single Sign-On (SSO)