
LiteLLM LLM Gateway (Proxy Server)

OpenAI Proxy Server (LLM Gateway) to call 100+ LLM APIs using the OpenAI format: Bedrock, Huggingface, VertexAI, TogetherAI, Azure OpenAI, OpenAI, and more. Get started with the open-source LiteLLM project here: https://github.com/BerriAI/litellm (12,000+ GitHub stars).
View offer on AWS
Purchase this listing from Webvar in AWS Marketplace using your AWS account. In AWS Marketplace, you can quickly launch pre-configured software with just a few clicks. AWS handles billing and payments, and charges appear on your AWS bill.

About

With LiteLLM Proxy Server, you get a proxy server that calls 100+ LLMs through a unified, OpenAI-compatible interface, letting you track spend and set budgets per virtual key and per user.
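
For example, once the proxy is running, any OpenAI-compatible client can point at it and authenticate with a virtual key. A minimal sketch, assuming the proxy listens at http://localhost:4000 and a virtual key "sk-1234" has already been issued (both values are placeholders):

```python
# Minimal sketch: call the LiteLLM proxy with the OpenAI Python SDK.
# The base_url and api_key below are placeholder values for a locally
# running proxy and a virtual key issued by it.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:4000",  # LiteLLM proxy endpoint (assumed)
    api_key="sk-1234",                 # virtual key, tracked for spend/budgets
)

response = client.chat.completions.create(
    model="gpt-4o",  # any model name configured on the proxy
    messages=[{"role": "user", "content": "Hello from the proxy!"}],
)
print(response.choices[0].message.content)
```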

You can set budgets and rate limits per project, API key, and model on the proxy.
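
As a sketch of how that looks in practice, a budgeted, rate-limited virtual key can be issued through the proxy's key-management API. The endpoint and field names below (/key/generate, max_budget, rpm_limit) follow LiteLLM's documentation but may vary by version; the URL, master key, and model names are placeholders:

```python
# Hedged sketch: generate a virtual key with a spend budget and rate limit.
import requests

resp = requests.post(
    "http://localhost:4000/key/generate",
    headers={"Authorization": "Bearer sk-master-key"},  # proxy master/admin key (placeholder)
    json={
        "models": ["gpt-4o", "claude-3-sonnet"],  # models this key may call (placeholders)
        "max_budget": 25.0,   # USD spend cap for the key
        "rpm_limit": 60,      # requests per minute
        "duration": "30d",    # key expiry
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["key"])  # the generated virtual key
```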

The proxy also translates inputs to each provider's completion, embedding, and image_generation endpoints, and applies retry/fallback logic across multiple LLM deployments.
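
Because of that translation layer, the same OpenAI-format client can call non-OpenAI backends. A sketch using embeddings against a Bedrock model exposed under an alias configured on the proxy; "titan-embeddings" is a hypothetical alias, and retries/fallbacks between deployments are handled server-side by the proxy's router configuration:

```python
# Sketch: OpenAI-format embeddings request routed to a non-OpenAI provider.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

embedding = client.embeddings.create(
    model="titan-embeddings",  # hypothetical alias mapped to a Bedrock embedding model
    input=["LiteLLM translates this to the provider's embedding API"],
)
print(len(embedding.data[0].embedding))  # dimensionality of the returned vector
```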


How it works

Search

Search 25,000+ products and services vetted by AWS.

Request private offer

Our team will send you an offer link to view.

Purchase

Accept the offer in your AWS account, and start using the software.

Manage

All your transactions will be consolidated into one bill in AWS.
