Hello,
I was wondering whether there is an implementation of chat models for the OpenRouter platform/API. I have viewed this link, which works fine for building and testing purposes, but I am not finding it helpful when deploying the application. If anyone knows of an official implementation (either LangChain or community) in the works, I would really appreciate hearing about any related developments.
I am aware that I could likely build a custom class to interact with this, but ideally I would like it to behave like the other chat models and natively implement exception handling, message management for the LLM, and so on. I am also unsure how to set up a way to easily switch between models, as the API allows (automatically adjusting the request/message formats, tool calling, and so on).
Thank you.
My understanding is that you should be able to use ChatOpenAI directly with OpenRouter, since they implement OpenAI's chat completions endpoint. All that's required is updating the `base_url`.
This sample code already works as the practical equivalent of a ChatOpenRouter implementation using OpenRouter’s OpenAI-compatible API.
You need to set OPENROUTER_API_KEY in your environment for this code to work:
import os

from dotenv import load_dotenv
from langchain.agents import create_agent
from langchain_openai import ChatOpenAI

# Load environment variables
load_dotenv()

# Get API key
OPENROUTER_KEY = os.getenv("OPENROUTER_API_KEY")
if not OPENROUTER_KEY:
    raise ValueError("Missing OPENROUTER_API_KEY in environment")

# Set OpenAI-compatible environment variables for OpenRouter
os.environ["OPENAI_API_KEY"] = OPENROUTER_KEY
os.environ["OPENAI_BASE_URL"] = "https://openrouter.ai/api/v1"

# Create LLM
llm = ChatOpenAI(
    model="mistralai/mistral-7b-instruct:free",
    temperature=0.7,
    api_key=OPENROUTER_KEY,
)

# Create agent
agent = create_agent(
    llm,
    tools=[],
    system_prompt="You are a helpful assistant. Answer in simple words.",
)

# User input
user_question = "What is artificial intelligence in simple terms?"

# Invoke agent
response = agent.invoke({
    "messages": [
        {"role": "user", "content": user_question}
    ]
})

# Extract and print the AI response safely
if "messages" in response and len(response["messages"]) > 0:
    ai_message = response["messages"][-1]
    print(ai_message.content)
else:
    print("No response received from the agent.")