Hey folks!
I’m using LangGraph Cloud as my backend and the NewAgentChat UI on the frontend.
With gpt-5-mini, I’d like to stream not just the final response tokens but also the model’s “thinking” traces (on_thinking events).
What’s the recommended way to surface those on_thinking events from LangGraph Cloud and display them in NewAgentChat?
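For context, here’s a minimal sketch of my current backend streaming setup using the Python `langgraph_sdk` client (the deployment URL and the `"agent"` assistant name are placeholders for my actual deployment). With `stream_mode="messages"` I get the response tokens, but I’m not sure how the thinking traces are supposed to come through:

```python
# Rough sketch of my current streaming call (URL and assistant name are placeholders).
import asyncio
from langgraph_sdk import get_client


async def main():
    client = get_client(url="https://my-deployment.example.com")  # placeholder URL
    thread = await client.threads.create()

    # Streams LLM message tokens from the run; today this surfaces the
    # assistant's answer, and I'd like the "thinking" traces alongside it.
    async for chunk in client.runs.stream(
        thread["thread_id"],
        "agent",  # placeholder assistant / graph name
        input={"messages": [{"role": "user", "content": "Hello!"}]},
        stream_mode="messages",
    ):
        print(chunk.event, chunk.data)


asyncio.run(main())
```

On the frontend I’m consuming this with the chat UI’s stream hook, so ideally whatever surfaces the thinking traces would also flow through there.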