Hi, LangChain team! Is there a way to access `annotations` and `reasoning` values from `AIMessageChunk`?
For context, we’d like to use Perplexity’s Sonar Deep Research via OpenRouter. Using OpenAI’s `AsyncOpenAI` client directly returns `ChatCompletionChunk` instances that contain `annotations` and `reasoning` values. However, when the same model is used via LangChain’s `ChatOpenAI` wrapper inside a graph and the graph response is streamed, I can’t seem to access these values.
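For reference, this is roughly how we pick those values out of the raw streamed deltas today. The dict shape mirrors what we see from OpenRouter for this model, but the `extract_extras` helper and the exact field names are our own assumptions, not a documented API:

```python
# Illustrative: pull the provider-specific fields out of a raw streamed delta.
# In the real code, `delta` comes from chunk.choices[0].delta on each
# ChatCompletionChunk yielded by the AsyncOpenAI streaming client.

def extract_extras(delta: dict) -> dict:
    """Collect fields that aren't standard OpenAI delta keys."""
    standard = {"role", "content", "tool_calls", "function_call", "refusal"}
    return {k: v for k, v in delta.items() if k not in standard}

# Sample delta shaped like what OpenRouter returns for sonar-deep-research
# (field names may vary by provider).
sample_delta = {
    "role": "assistant",
    "content": "",
    "reasoning": "Searching for recent AI papers...",
    "annotations": [
        {"type": "url_citation", "url_citation": {"url": "https://example.com"}}
    ],
}

extras = extract_extras(sample_delta)
print(extras)  # the non-standard keys we'd like to surface through LangChain
```

These are the two keys we’d like to see preserved somewhere on the `AIMessageChunk` side.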
Graph definition:
```python
import os
from typing import TypedDict

from langgraph.graph import END, StateGraph, START
from langchain_openai import ChatOpenAI

OPENROUTER_BASE_URL = "https://openrouter.ai/api/v1"


class GraphState(TypedDict):
    topic: str
    report: str


async def node1(state: GraphState):
    messages = [
        {
            "role": "system",
            "content": "Summarize the latest research papers about the provided topic",
        },
        {
            "role": "user",
            "content": state["topic"],
        },
    ]
    llm = ChatOpenAI(
        api_key=os.getenv("OPENROUTER_API_KEY"),
        base_url=OPENROUTER_BASE_URL,
        model="perplexity/sonar-deep-research",
    )
    response = await llm.ainvoke(messages)
    return {"report": response.content}


builder = StateGraph(GraphState)
builder.add_node("node1", node1)
builder.add_edge(START, "node1")
builder.add_edge("node1", END)
graph = builder.compile()
```
```python
chunks = []
async for mode, chunk in graph.astream(
    {"topic": "Artificial Intelligence"}, stream_mode=["updates", "messages"]
):
    chunks.append((mode, chunk))
    print(mode)
    print(chunk)
    print("--------------------------------")
```
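In case it clarifies where we looked: with `stream_mode=["messages"]`, each payload is a `(message_chunk, metadata)` pair, and we expected the extra fields to land in the chunk’s `additional_kwargs`. A self-contained sketch with stand-in objects (the `FakeChunk` class and the placement of the `reasoning` key are hypothetical, just to show the unpacking pattern):

```python
# Hypothetical sketch: with stream_mode=["updates", "messages"], graph.astream
# yields (mode, payload) tuples; for "messages" events the payload is a
# (message_chunk, metadata) pair. FakeChunk stands in for AIMessageChunk, and
# we assume provider extras would surface in additional_kwargs.

class FakeChunk:
    def __init__(self, content, additional_kwargs=None):
        self.content = content
        self.additional_kwargs = additional_kwargs or {}

# Simulated stream events in the shape graph.astream would produce.
events = [
    (
        "messages",
        (FakeChunk("Hello", {"reasoning": "thinking..."}), {"langgraph_node": "node1"}),
    ),
    ("updates", {"node1": {"report": "Hello"}}),
]

reasoning_parts = []
for mode, payload in events:
    if mode == "messages":
        message_chunk, metadata = payload
        extra = message_chunk.additional_kwargs.get("reasoning")
        if extra is not None:
            reasoning_parts.append(extra)

print(reasoning_parts)  # -> ['thinking...']
```

In the real run, `message_chunk.additional_kwargs` comes back empty, which is what prompted this question.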
Is there a way to use LangGraph’s streaming capabilities to stream these values as well?
Thanks!