LLM Cache for LangGraph

I came across this post while searching for an LLM cache in LangGraph: How to set the Cache? · langchain-ai/langgraph · Discussion #1230 · GitHub

I didn't find anything in the documentation. Is this feature still to be implemented? Is there any workaround?

Hey Arthur,

There are two types of caches mentioned in that post:

The LLM cache is a LangChain feature that you can use within LangGraph (docs for the LLM cache here).
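To add some detail: the LangChain LLM cache is an exact-match lookup keyed on the prompt plus the model configuration (enabled globally with `set_llm_cache(...)` from `langchain_core.globals`, e.g. with `InMemoryCache` from `langchain_core.caches`). Here is a dependency-free sketch of that pattern; `SimpleLLMCache` and `cached_llm_call` are illustrative names, not LangChain APIs:

```python
import hashlib
import json

class SimpleLLMCache:
    """Exact-match cache: the key is the prompt plus the model settings,
    so only a repeated identical call returns the stored response."""

    def __init__(self):
        self._store = {}

    def _key(self, prompt, **model_params):
        # Serialize deterministically so equivalent calls hash identically.
        raw = json.dumps({"prompt": prompt, **model_params}, sort_keys=True)
        return hashlib.sha256(raw.encode()).hexdigest()

    def lookup(self, prompt, **model_params):
        return self._store.get(self._key(prompt, **model_params))

    def update(self, prompt, response, **model_params):
        self._store[self._key(prompt, **model_params)] = response

cache = SimpleLLMCache()

def cached_llm_call(prompt, model="some-model"):
    hit = cache.lookup(prompt, model=model)
    if hit is not None:
        return hit  # cache hit: skip the expensive call
    response = f"response to: {prompt}"  # stand-in for a real LLM call
    cache.update(prompt, response, model=model)
    return response
```

Note the key includes the model parameters: the same prompt sent with a different temperature or model name is a cache miss, which is the behavior you want.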

The other idea was to cache LangGraph execution itself; you can now cache expensive nodes in LangGraph.
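In recent LangGraph versions this is done, I believe, by passing a `CachePolicy` (from `langgraph.types`) to `add_node(...)` and a cache backend such as `InMemoryCache` (from `langgraph.cache.memory`) to `compile(...)`; check the node-caching docs for your version. Conceptually, the runtime keys the cache on the node's input state and honors an optional TTL. A plain-Python sketch of that mechanism (names here are illustrative, not LangGraph APIs):

```python
import time

class NodeCache:
    """Per-node cache with an optional TTL, keyed on the node's input."""

    def __init__(self):
        self._store = {}  # key -> (expires_at or None, value)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        expires_at, value = entry
        if expires_at is not None and time.monotonic() > expires_at:
            del self._store[key]  # expired: evict and report a miss
            return None
        return value

    def set(self, key, value, ttl=None):
        expires_at = time.monotonic() + ttl if ttl is not None else None
        self._store[key] = (expires_at, value)

def cached_node(fn, cache, ttl=None):
    """Wrap a graph node so repeated calls with the same input skip fn."""
    def wrapper(state):
        # Default key: the whole input state (LangGraph also lets you
        # supply a custom key function on the cache policy).
        key = repr(sorted(state.items()))
        hit = cache.get(key)
        if hit is not None:
            return hit
        result = fn(state)
        cache.set(key, result, ttl=ttl)
        return result
    return wrapper
```

A wrapped node runs once per distinct input until the TTL lapses, which is exactly what you want for expensive, deterministic nodes.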


Hi nhuang,

Is it possible to use Redis instead of InMemoryCache for caching expensive nodes?

Did you find any solution?
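For the LLM cache, LangChain does ship a Redis backend (`RedisCache` in `langchain_community.cache`, wrapping a `redis.Redis` client). For LangGraph's node cache, the built-in backend is in-memory, but my understanding is that the cache is pluggable via a base class (in `langgraph.cache.base`), so a Redis-backed version only needs the same get/set behavior plus serialization. A dependency-free sketch of that idea, using a stand-in client whose `get`/`setex` mirror redis-py's methods of the same names (`RedisNodeCache` and `FakeRedis` are illustrative names, not library APIs):

```python
import pickle

class FakeRedis:
    """Stand-in exposing the two methods the sketch uses; redis-py's
    real client offers get(key) and setex(key, ttl_seconds, value)."""

    def __init__(self):
        self._data = {}

    def get(self, key):
        return self._data.get(key)

    def setex(self, key, ttl_seconds, value):
        self._data[key] = value  # TTL is ignored in the stand-in

class RedisNodeCache:
    """Node cache backed by a Redis-like client: values are pickled so
    arbitrary node outputs survive the round trip through byte storage."""

    def __init__(self, client, prefix="node-cache:", ttl_seconds=300):
        self.client = client
        self.prefix = prefix  # namespace keys to avoid collisions
        self.ttl_seconds = ttl_seconds

    def get(self, key):
        raw = self.client.get(self.prefix + key)
        return None if raw is None else pickle.loads(raw)

    def set(self, key, value):
        # setex stores the value with an expiry, so Redis evicts stale
        # entries for us instead of the cache tracking TTLs itself.
        self.client.setex(self.prefix + key, self.ttl_seconds,
                          pickle.dumps(value))
```

Swapping `FakeRedis()` for a real `redis.Redis(...)` client would give you shared, cross-process node caching, since every worker reads and writes the same keys.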