LangGraph/LangChain and serverless architecture

Are LangGraph and LangChain compatible with serverless architectures? Are there any limitations or potential issues that don’t exist in traditional server environments?

Some additional considerations:

  • Do they rely on long-lived processes or persistent memory between invocations?
  • Are there cold start issues or performance penalties with certain serverless providers?
  • How well do they handle concurrent executions or state management in a stateless environment?
  • Are there specific deployment best practices or architectural patterns recommended for serverless setups?


I can share my experience: I tried AWS Lambda, and I can confirm that the default in-memory checkpointer won't work across multiple invocations. Even if you mitigate cold starts with provisioned concurrency, you can't rely on a warm Lambda function to persist state.
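The failure mode is easy to reproduce in miniature: anything held in the Lambda process (module-level variables, the default in-memory checkpointer) vanishes whenever a new container spins up, while an external store survives. A stand-in sketch, with no LangGraph dependency, where each `make_handler()` call plays the role of a cold start:

```python
# Why process-local checkpoints fail across serverless invocations.
# "external_store" stands in for a durable backend (Postgres, Cosmos DB, ...).

external_store = {}  # survives "invocations" here; in reality, a database


def make_handler():
    """Each call represents a fresh container (cold start): local state is empty."""
    local_checkpoints = {}  # in-memory checkpointer: lost on every cold start

    def handler(thread_id, message):
        # Local memory only sees turns handled by this container.
        local_history = local_checkpoints.setdefault(thread_id, [])
        local_history.append(message)

        # The external store sees the full conversation regardless of container.
        durable_history = external_store.setdefault(thread_id, [])
        durable_history.append(message)
        return len(local_history), len(durable_history)

    return handler


# First invocation lands on container A.
invoke_a = make_handler()
print(invoke_a("thread-1", "hi"))         # (1, 1)

# Second invocation lands on a new container B (cold start).
invoke_b = make_handler()
print(invoke_b("thread-1", "follow-up"))  # (1, 2): local memory forgot turn 1
```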

I think an external checkpointer like PostgresSaver would help here. Would love to hear some thoughts!
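For reference, here's a sketch of wiring PostgresSaver into a per-invocation handler. It assumes the `langgraph` and `langgraph-checkpoint-postgres` packages and a reachable Postgres instance; the `respond` node and the connection string are placeholders:

```python
def run_turn(db_uri: str, thread_id: str, user_message: str):
    """One serverless invocation: rebuild the graph, attach the durable checkpointer."""
    from langgraph.checkpoint.postgres import PostgresSaver
    from langgraph.graph import StateGraph, MessagesState, START

    def respond(state: MessagesState):
        # Placeholder node; a real app would call an LLM here.
        return {"messages": [("ai", f"echo: {user_message}")]}

    builder = StateGraph(MessagesState)
    builder.add_node("respond", respond)
    builder.add_edge(START, "respond")

    with PostgresSaver.from_conn_string(db_uri) as checkpointer:
        checkpointer.setup()  # creates the checkpoint tables on first run
        graph = builder.compile(checkpointer=checkpointer)
        # thread_id ties checkpoints to one conversation across invocations
        config = {"configurable": {"thread_id": thread_id}}
        return graph.invoke({"messages": [("user", user_message)]}, config)
```

Because the checkpointer lives in Postgres, it doesn't matter which warm or cold instance handles the next turn for the same `thread_id`.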

I’m doing this with Azure Functions and my own CosmosDBMemorySaver.
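For anyone curious what an external saver fundamentally has to do, here's a toy dict-backed stand-in. It is deliberately much simpler than LangGraph's real checkpointer interface; it only shows the core round-trip of serializing state per `thread_id` and restoring it on a later invocation, which is what a Cosmos DB or Postgres saver does underneath:

```python
import json


class ToyCheckpointSaver:
    """Simplified stand-in for an external checkpointer (Cosmos DB, Postgres, ...).

    Real LangGraph savers implement a richer interface; this only shows the
    core round-trip: serialize state per thread_id, restore it next invocation.
    """

    def __init__(self, backend):
        self.backend = backend  # any dict-like durable store

    def put(self, thread_id, state):
        # Serialize so the backend only ever sees plain text.
        self.backend[thread_id] = json.dumps(state)

    def get(self, thread_id):
        raw = self.backend.get(thread_id)
        return json.loads(raw) if raw is not None else None


durable = {}  # stands in for a database container keyed by thread_id
saver = ToyCheckpointSaver(durable)
saver.put("thread-1", {"messages": ["hi"]})
# A later invocation, possibly on a different instance, sees the same state:
print(saver.get("thread-1"))  # {'messages': ['hi']}
```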

So… it works

Yeah, we're also adding an external Postgres checkpointer for maintaining short-term/thread memory. In general, I think the in-memory checkpointer is only for dev purposes.
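One pattern that makes the dev/prod split explicit is choosing the backend from configuration, so local runs stay in memory and deployed instances always get the durable store. A hypothetical sketch (`CHECKPOINT_DSN` and both store classes are made-up stand-ins, not LangGraph APIs):

```python
import os


class MemoryStore:
    """Dev only: state lives in the process and dies with it."""

    def __init__(self):
        self.data = {}


class PostgresStore:
    """Prod: state lives in the database, shared by all instances."""

    def __init__(self, dsn):
        self.dsn = dsn  # a real implementation would open a connection pool here


def make_checkpointer():
    # Hypothetical env var: set only in deployed environments.
    dsn = os.environ.get("CHECKPOINT_DSN")
    return PostgresStore(dsn) if dsn else MemoryStore()
```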