Compare LiteLLM and Portkey side by side. Both are tools in the LLM Gateways category.
| | LiteLLM | Portkey |
| --- | --- | --- |
| Category | LLM Gateways | LLM Gateways |
| Pricing | Open Source | Freemium |
| Best For | Engineering teams who want an open-source, self-hosted LLM proxy for provider management | Engineering teams who need a reliable, observable gateway for production LLM applications |
| Website | litellm.ai | portkey.ai |
| Key Features | OpenAI-compatible interface to 100+ providers, spend tracking, rate limiting, team management | Unified API for 200+ LLMs, automatic retries, fallbacks, load balancing, caching, observability, guardrails, virtual keys |
| Use Cases | Self-hosted gateway for managing multiple LLM providers and teams | Reliable, observable gateway for production LLM applications and cross-team usage management |
LiteLLM is an open-source LLM proxy that translates OpenAI-format API calls to 100+ LLM providers. It provides a standardized interface for calling models from Anthropic, Google, Azure, AWS Bedrock, and dozens more. LiteLLM is popular as a self-hosted gateway with features like spend tracking, rate limiting, and team management.
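As a minimal sketch of that unified interface, the snippet below uses LiteLLM's Python SDK to send the same OpenAI-format request to two different providers; the model identifiers and the API-key environment variables they rely on are illustrative, not prescriptive.

```python
# Sketch: one OpenAI-style call shape, two providers. Assumes
# OPENAI_API_KEY and ANTHROPIC_API_KEY are set; model names are examples.
from litellm import completion

messages = [{"role": "user", "content": "Summarize LLM gateways in one sentence."}]

# LiteLLM translates the same request format to each provider's API.
openai_resp = completion(model="gpt-4o-mini", messages=messages)
anthropic_resp = completion(model="anthropic/claude-3-5-sonnet-20240620", messages=messages)

print(openai_resp.choices[0].message.content)
print(anthropic_resp.choices[0].message.content)
```

The same calls can be pointed at a self-hosted LiteLLM proxy instead of the SDK, which is where the spend tracking, rate limiting, and team management features apply.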
Portkey is an AI gateway that provides a unified API for 200+ LLMs with built-in reliability features including automatic retries, fallbacks, load balancing, and caching. The platform includes observability with detailed request logs, cost tracking, and performance analytics. Portkey also offers guardrails, access controls, and virtual keys for managing LLM usage across teams.
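A hedged sketch of Portkey's client usage is below; it assumes an account with a virtual key already mapped to an upstream provider credential, and the key values and model name are placeholders.

```python
# Sketch, assuming a Portkey account and a pre-configured virtual key.
from portkey_ai import Portkey

client = Portkey(
    api_key="PORTKEY_API_KEY",         # Portkey account key (placeholder)
    virtual_key="openai-virtual-key",  # maps to a stored provider credential
)

resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello from the gateway"}],
)
print(resp.choices[0].message.content)
```

Reliability behavior such as retries, fallbacks, and caching is configured on the gateway side rather than in each request, so application code stays close to the plain OpenAI call shape shown here.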
LLM Gateways are unified API platforms and proxies that aggregate multiple LLM providers behind a single endpoint, providing model routing, fallback, caching, rate limiting, cost optimization, and access control.
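To make the routing-with-fallback idea concrete, here is a short sketch using LiteLLM's Router as one example of the pattern; the "primary"/"backup" group names, model choices, and the assumption that provider keys are already set in the environment are all illustrative.

```python
# Sketch of gateway-style routing: if the primary model group fails,
# the request is retried against the backup group. Names are placeholders.
from litellm import Router

router = Router(
    model_list=[
        {"model_name": "primary", "litellm_params": {"model": "gpt-4o-mini"}},
        {"model_name": "backup", "litellm_params": {"model": "anthropic/claude-3-5-sonnet-20240620"}},
    ],
    fallbacks=[{"primary": ["backup"]}],  # fallback order for the "primary" group
)

resp = router.completion(
    model="primary",
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```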
Browse all LLM Gateways tools →