The issue I am facing is that I need to calculate token consumption for a multi-agent LangChain project. When I use astream_events on my agent executor it works fine; however, when I pass stream_usage=True as a parameter it says:

TypeError: AgentExecutorIterator.__init__() got an unexpected keyword argument 'stream_usage'

stream_usage=True works fine when I use the LangChain ChatOpenAI client's astream_events. Is it not supported? I failed to find anything in the documentation. Any help would be much appreciated.
stream_usage=True is only supported for direct LLM calls, not for AgentExecutor. The AgentExecutorIterator doesn't accept this parameter because it's a higher-level construct that manages multiple LLM calls internally.
To track token usage in multi-agent workflows, use callbacks instead:
```python
from langchain.callbacks import get_openai_callback

with get_openai_callback() as cb:
    async for event in agent_executor.astream_events(...):
        # Process events
        pass

print(f"Total tokens: {cb.total_tokens}")
```
Or implement a custom callback handler to capture usage from individual LLM calls within your agent workflow.
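A minimal sketch of such a handler, assuming a recent langchain-core version where each chat generation's AIMessage carries usage_metadata (the class name TokenUsageHandler is just illustrative):

```python
from langchain_core.callbacks import AsyncCallbackHandler
from langchain_core.outputs import LLMResult


class TokenUsageHandler(AsyncCallbackHandler):
    """Accumulates token usage across every LLM call made inside the agent."""

    def __init__(self) -> None:
        self.input_tokens = 0
        self.output_tokens = 0
        self.total_tokens = 0

    async def on_llm_end(self, response: LLMResult, **kwargs) -> None:
        # Recent langchain-core versions attach usage_metadata to the
        # AIMessage on each chat generation; older versions may only
        # populate response.llm_output["token_usage"] instead.
        for generations in response.generations:
            for gen in generations:
                message = getattr(gen, "message", None)
                usage = getattr(message, "usage_metadata", None) or {}
                self.input_tokens += usage.get("input_tokens", 0)
                self.output_tokens += usage.get("output_tokens", 0)
                self.total_tokens += usage.get("total_tokens", 0)
```

Pass an instance via config={"callbacks": [handler]} when calling agent_executor.astream_events (or via callbacks when constructing the executor) and read handler.total_tokens once the stream finishes.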
Thank you for clarifying. It turns out I needed to update my libraries to get usage_metadata through the OpenAI callback.
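For anyone else who lands here, a minimal sketch of the working setup, assuming up-to-date langchain-openai and langchain-community packages: stream_usage belongs on the chat model itself, not on the executor (the model name below is just an example).

```python
import asyncio

from langchain_openai import ChatOpenAI
from langchain_community.callbacks import get_openai_callback

# stream_usage is a ChatOpenAI parameter, so set it on the model,
# not on the AgentExecutor built on top of it.
llm = ChatOpenAI(model="gpt-4o-mini", stream_usage=True)


async def main() -> None:
    with get_openai_callback() as cb:
        async for chunk in llm.astream("Say hello in five words."):
            pass
    # With stream_usage=True the streamed chunks carry usage_metadata,
    # which get_openai_callback aggregates into token counts.
    print(cb.prompt_tokens, cb.completion_tokens, cb.total_tokens)


asyncio.run(main())
```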