LiteLLM LLM Gateway (Proxy Server)
About
LiteLLM Proxy Server gives you a single gateway for calling 100+ LLMs through one unified interface, with spend tracking and budgets per virtual key and per user.
You can set budgets and rate limits per project, API key, and model on the OpenAI-compatible proxy.
It also translates inputs to each provider's completion, embedding, and image_generation endpoints, and supports retry/fallback logic across multiple LLM deployments.
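As a rough sketch of how the proxy routes requests, a minimal `config.yaml` might map a public model name to a provider deployment. The model names, environment-variable reference, and rate-limit values below are illustrative assumptions, not values from this listing:

```yaml
# Hypothetical LiteLLM proxy config: one model alias routed to a provider,
# with assumed per-deployment rate limits (rpm = requests/min, tpm = tokens/min).
model_list:
  - model_name: gpt-4o            # name clients send in their requests
    litellm_params:
      model: openai/gpt-4o        # provider/model the proxy forwards to
      api_key: os.environ/OPENAI_API_KEY
      rpm: 60
      tpm: 100000
```

Clients then call the proxy with any OpenAI-compatible SDK, passing a virtual key issued by the proxy as the API key; budgets attached to that key are enforced server-side.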