# Using ChatLiteLLM() - Langchain

## Pre-Requisites

```shell
!pip install litellm langchain
```

## Quick Start

```python
import os
from langchain_community.chat_models import ChatLiteLLM
from langchain_core.prompts import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    AIMessagePromptTemplate,
    HumanMessagePromptTemplate,
)
from langchain_core.messages import AIMessage, HumanMessage, SystemMessage

os.environ["OPENAI_API_KEY"] = ""  # set your OpenAI API key here

# ChatLiteLLM is a drop-in LangChain chat model backed by LiteLLM
chat = ChatLiteLLM(model="gpt-3.5-turbo")
messages = [
    HumanMessage(
        content="what model are you"
    )
]
chat.invoke(messages)
```
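
Because ChatLiteLLM routes requests through LiteLLM, switching providers is a matter of changing the model string and setting the matching API key; the LangChain code itself stays the same. Below is a minimal sketch assuming an Anthropic key and the `claude-3-haiku-20240307` model name (swap in whichever LiteLLM-supported provider and model you actually use):

```python
import os
from langchain_community.chat_models import ChatLiteLLM
from langchain_core.messages import HumanMessage

# Assumption: an Anthropic key; any LiteLLM-supported provider works the same way.
os.environ["ANTHROPIC_API_KEY"] = ""

# Same ChatLiteLLM interface, different underlying provider/model.
chat = ChatLiteLLM(model="claude-3-haiku-20240307", temperature=0, max_tokens=256)
response = chat.invoke([HumanMessage(content="what model are you")])
print(response.content)
```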

## Use LangChain ChatLiteLLM + Langfuse

Check out the Langfuse integration section of the docs for more details on how to integrate Langfuse with ChatLiteLLM.
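
In short, LiteLLM can report every successful call to Langfuse through its callback mechanism, so requests made via ChatLiteLLM are traced without changing your LangChain code. Here is a minimal sketch, assuming the `litellm.success_callback` hook, the standard `LANGFUSE_PUBLIC_KEY` / `LANGFUSE_SECRET_KEY` environment variables, and that the `langfuse` package is installed (see the Langfuse docs for the exact setup):

```python
import os
import litellm
from langchain_community.chat_models import ChatLiteLLM
from langchain_core.messages import HumanMessage

# Assumption: standard Langfuse credentials; LANGFUSE_HOST defaults to Langfuse Cloud.
os.environ["LANGFUSE_PUBLIC_KEY"] = ""
os.environ["LANGFUSE_SECRET_KEY"] = ""
os.environ["OPENAI_API_KEY"] = ""

# Log every successful LiteLLM call (including those made through ChatLiteLLM) to Langfuse.
litellm.success_callback = ["langfuse"]

chat = ChatLiteLLM(model="gpt-3.5-turbo")
print(chat.invoke([HumanMessage(content="what model are you")]).content)
```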