LiteLLM Integration

LiteLLM provides callbacks, making it easy to log your completion responses.

Using Callbacks

First, sign up to get an app ID on the LLMonitor dashboard.

With these two lines of code, you can instantly log your responses across all providers with LLMonitor:

litellm.success_callback = ["llmonitor"]
litellm.failure_callback = ["llmonitor"]

Complete code

import os

import litellm
from litellm import completion

## set env variables
os.environ["LLMONITOR_APP_ID"] = "YOUR APP ID"
# Optional: os.environ["LLMONITOR_API_URL"] = "self-hosting-url"
os.environ["OPENAI_API_KEY"] = "your-openai-key"
os.environ["COHERE_API_KEY"] = "your-cohere-key"
# set callbacks
litellm.success_callback = ["llmonitor"]
litellm.failure_callback = ["llmonitor"]
# openai call
response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hi 👋 - i'm openai"}],
    user="some_user_id",
)

# cohere call
response = completion(
    model="command-nightly",
    messages=[{"role": "user", "content": "Hi 👋 - i'm cohere"}],
    user="some_user_id",
)

Questions? We're here to help.