Simplify using LLM APIs from OpenAI, Azure, Cohere, Anthropic, Replicate, Google 🤩

TLDR
• Call all LLM APIs using the ChatGPT format - completion(model, messages)
• Consistent outputs and exceptions for all LLM APIs
• Logging and error tracking for all models
Hi Product Hunt,
@krrish_dholakia and I are super excited to be launching 🚅 liteLLM
An open-source Python library that standardizes LLM inputs and outputs across 50+ models (OpenAI, Anthropic, Cohere, etc.): https://github.com/BerriAI/litellm
🚨 Why did we build liteLLM?
We needed simplicity. Our code was getting extremely complicated managing and translating calls between the Azure, OpenAI, Cohere, Anthropic, and Replicate LLM APIs.
🚀 How does it work?
liteLLM is a Python package that lets you call 50+ LLM APIs through a consistent input/output interface - the ChatGPT format. Here's an example of how liteLLM works:
from litellm import completion
messages = [{"role": "user", "content": "Hey, how's it going?"}]
response = completion(model="gpt-3.5-turbo", messages=messages)  # OpenAI
response = completion("claude-2", messages)                      # Anthropic
response = completion("command-nightly", messages)               # Cohere
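Whichever provider you call, the reply comes back in the OpenAI chat-completion shape, so downstream code only needs one parsing path. Here's a minimal sketch of reading a response and handling a failed call (the error handling is an assumption about how provider failures surface, not an exact list of exception types):

from litellm import completion

messages = [{"role": "user", "content": "Hey, how's it going?"}]
try:
    response = completion("claude-2", messages)
    # every provider's reply uses the OpenAI chat-completion layout
    print(response["choices"][0]["message"]["content"])
except Exception as err:
    # assumption: provider failures surface as ordinary Python exceptions
    print(f"LLM call failed: {err}")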
📞 Interested in using liteLLM?
You can try out liteLLM today for free: https://github.com/BerriAI/litellm/
If you're interested in liteLLM, you can meet with me at https://calendly.com/d/4mp-gd3-k... or reach me on Twitter: https://twitter.com/ishaan_jaff