React SDK: How to listen to and differentiate multiple LLM streams

The SDK’s useStream hook is designed to simplify the most common use case: streaming chat messages into a single, cohesive history.

It handles a lot of complexity for a standard chat interface.
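For context, here is roughly what that standard case looks like (a minimal sketch, assuming a local dev server on port 2024 and an assistant id of `agent`):

```tsx
import { useStream } from "@langchain/langgraph-sdk/react";
import type { Message } from "@langchain/langgraph-sdk";

export function Chat() {
  // Aggregates every streamed token into one cohesive message history.
  const thread = useStream<{ messages: Message[] }>({
    apiUrl: "http://localhost:2024",
    assistantId: "agent",
    messagesKey: "messages",
  });

  return (
    <div>
      {thread.messages.map((m) => (
        <div key={m.id}>{String(m.content)}</div>
      ))}
      <button
        disabled={thread.isLoading}
        onClick={() =>
          thread.submit({ messages: [{ type: "human", content: "hi" }] })
        }
      >
        Send
      </button>
    </div>
  );
}
```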

However, some advanced use cases require more granular control over stream routing than the current aggregation allows.

We need support for scenarios such as:

  • Concurrent LLM Streaming: Simultaneously streaming outputs from multiple LLMs within the same graph run (e.g., an LLM for the main answer and a second LLM for real-time fact-checking or hint generation) and routing those token streams to separate UI components on the client (see the sketch after this list).

  • Streaming Non-Message Fields: Streaming tokens or structured data from a graph state field other than the main messages field. This is necessary for use cases like Generative UI or streaming JSON objects that are not part of the standard chat history.
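Today, the closest workaround I can see for the concurrent case is to drop below `useStream` to the raw SDK client and route tokens by the node that produced them. A rough sketch (the `answer` and `fact_check` node names are hypothetical, and I'm assuming the `messages-tuple` stream mode, which pairs each token chunk with metadata such as `langgraph_node`):

```ts
import { Client } from "@langchain/langgraph-sdk";

const client = new Client({ apiUrl: "http://localhost:2024" });

async function run() {
  const thread = await client.threads.create();

  // One buffer per UI component, keyed by the graph node that owns it.
  const buffers: Record<string, string> = { answer: "", fact_check: "" };

  const stream = client.runs.stream(thread.thread_id, "agent", {
    input: { messages: [{ type: "human", content: "Is the earth flat?" }] },
    streamMode: "messages-tuple",
  });

  for await (const { event, data } of stream) {
    if (event !== "messages") continue;
    // Each event pairs a token chunk with metadata about its origin.
    const [chunk, metadata] = data as [
      { content?: unknown },
      { langgraph_node?: string }
    ];
    const node = metadata?.langgraph_node;
    if (node && node in buffers && typeof chunk.content === "string") {
      buffers[node] += chunk.content; // route token to that node's component
    }
  }
}
```

This works, but it gives up the message aggregation and optimistic updates that `useStream` provides, which is why first-class routing support would help.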

Thanks for checking.

Hey @hieusmiths! Take a look at this guide on how to populate UI components in your frontend dynamically. You can stream LLM output tokens as well as elements held in the state of the graph.
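For the non-message case specifically: anything your graph writes to state shows up in `thread.values` as the run streams, so a custom field can drive its own component, independent of the chat history. A minimal sketch (the `chart` field and `ChartView` component here are just placeholders):

```tsx
import { useStream } from "@langchain/langgraph-sdk/react";
import type { Message } from "@langchain/langgraph-sdk";

// Placeholder state shape: alongside `messages`, the graph writes a
// structured `chart` field that never enters the chat history.
type State = {
  messages: Message[];
  chart?: { title: string; points: number[] };
};

function ChartView({ title, points }: { title: string; points: number[] }) {
  return <pre>{title}: {points.join(", ")}</pre>;
}

export function Dashboard() {
  const thread = useStream<State>({
    apiUrl: "http://localhost:2024",
    assistantId: "agent",
    messagesKey: "messages",
  });

  // `thread.values` reflects the latest streamed graph state, so any
  // field -- not just `messages` -- can feed its own component.
  const chart = thread.values?.chart;

  return (
    <div>
      {chart && <ChartView title={chart.title} points={chart.points} />}
      {thread.messages.map((m) => (
        <div key={m.id}>{String(m.content)}</div>
      ))}
    </div>
  );
}
```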

Ah, so to stream from multiple LLMs concurrently, we can use custom events to route them, right?
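Something like this is what I have in mind: each node tags its tokens via `config.writer`, and the client splits them back out in `onCustomEvent`. (The node names and state setters below are made up, and I'm assuming `config.writer` and `onCustomEvent` behave the way the custom-streaming docs describe.)

```ts
// Graph node (LangGraph JS): streams one LLM and tags every token.
import { MessagesAnnotation } from "@langchain/langgraph";
import type { LangGraphRunnableConfig } from "@langchain/langgraph";
import { AIMessage } from "@langchain/core/messages";
import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI({ model: "gpt-4o-mini" });

async function answerNode(
  state: typeof MessagesAnnotation.State,
  config: LangGraphRunnableConfig
) {
  let full = "";
  for await (const chunk of await llm.stream(state.messages)) {
    const token = String(chunk.content);
    full += token;
    // Emit a tagged custom chunk so the client can tell streams apart.
    config.writer?.({ stream: "answer", token });
  }
  return { messages: [new AIMessage(full)] };
}
// A parallel fact-check node would do the same with `stream: "fact_check"`.
```

And on the client, the tag decides which component receives the token:

```tsx
import { useStream } from "@langchain/langgraph-sdk/react";

// Inside a React component:
const thread = useStream({
  apiUrl: "http://localhost:2024",
  assistantId: "agent",
  onCustomEvent: (event) => {
    // `event` is whatever the node passed to config.writer.
    const { stream, token } = event as { stream: string; token: string };
    // setAnswer / setFactCheck are placeholder React state setters.
    if (stream === "answer") setAnswer((prev) => prev + token);
    if (stream === "fact_check") setFactCheck((prev) => prev + token);
  },
});
```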