Can I deploy a LangGraph agent myself on a VPS or AWS?

Can someone share a GitHub repo link where they have used LangGraph agents with FastAPI and built a web app that streams the AI response with Vercel’s AI SDK or something similar?

Yes, you can absolutely deploy a LangGraph agent on a VPS or AWS.

I’ve shared some details about my setup in this other post:
Has anyone integrated a multi-agentic system in LangGraph with FastAPI?

Unfortunately, I can’t share my full codebase, but I’ve posted a few public examples in that thread.
If you’re already comfortable building containerized FastAPI apps on AWS (ECS/Fargate), the deployment should be pretty straightforward.
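
For reference, here’s a rough sketch of what the core service can look like. This is illustrative only, not my actual setup: the model choice, node names, and request shape are placeholders. It’s a single-node LangGraph agent exposed through FastAPI, which you can containerize and run on a VPS or on ECS/Fargate.

```python
# Minimal sketch (illustrative names, not production code): a one-node
# LangGraph agent exposed through a FastAPI endpoint, easy to containerize
# and run on a VPS or on AWS ECS/Fargate behind a load balancer.
from fastapi import FastAPI
from pydantic import BaseModel
from langgraph.graph import StateGraph, MessagesState, START, END
from langchain_openai import ChatOpenAI  # assumes an OpenAI-backed chat model

llm = ChatOpenAI(model="gpt-4o-mini")  # placeholder model choice

def call_model(state: MessagesState):
    # Single node: call the model on the accumulated conversation messages.
    return {"messages": [llm.invoke(state["messages"])]}

builder = StateGraph(MessagesState)
builder.add_node("agent", call_model)
builder.add_edge(START, "agent")
builder.add_edge("agent", END)
graph = builder.compile()

app = FastAPI()

class ChatRequest(BaseModel):
    message: str

@app.post("/chat")
async def chat(req: ChatRequest):
    # Run the graph to completion and return the final assistant message.
    result = await graph.ainvoke({"messages": [("user", req.message)]})
    return {"reply": result["messages"][-1].content}
```

In the container you’d run it with something like `uvicorn main:app --host 0.0.0.0 --port 8000` and point your ECS target group (or the reverse proxy on your VPS) at that port.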

Let me know if you have any specific questions. Happy to help point you in the right direction.

This is helpful. My app features a ChatGPT-like chat interface. Do you have any library suggestions for streaming in the frontend (Vercel’s AI SDK’s LangChain adapter is outdated)?

By the way, I came across the useStream hook from the LangChain team (you can see it in their example app, for instance). It asks for a LangSmith API key (optional), which is required only if I host on the LangGraph Platform, right? If I self-host, I don’t need the API key or a LangGraph Pro/Enterprise subscription. Correct me if I’m wrong.
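
To sketch the self-hosted streaming side (this is an assumed setup, not from any repo in this thread): you can stream tokens over Server-Sent Events straight from FastAPI, with no LangGraph Platform deployment involved; LangSmith tracing only happens if you opt in via its environment variables.

```python
# Rough sketch (assumed endpoint shape): streaming tokens from a self-hosted
# LangGraph graph over Server-Sent Events. No LangGraph Platform deployment
# and no LangSmith API key required; tracing only happens if you explicitly
# set the LangSmith tracing environment variables.
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

# Assumption: the compiled graph from the earlier sketch lives in agent.py.
from agent import graph

app = FastAPI()

class ChatRequest(BaseModel):
    message: str

async def token_stream(message: str):
    # stream_mode="messages" yields (message_chunk, metadata) tuples with
    # LLM token chunks as they are generated.
    async for chunk, _meta in graph.astream(
        {"messages": [("user", message)]},
        stream_mode="messages",
    ):
        if chunk.content:
            yield f"data: {chunk.content}\n\n"
    yield "data: [DONE]\n\n"

@app.post("/chat/stream")
async def chat_stream(req: ChatRequest):
    return StreamingResponse(token_stream(req.message), media_type="text/event-stream")
```

On the frontend you can read this with `fetch` plus a stream reader (or an `EventSource`), or adapt the event format to whatever your streaming hook expects. As far as I know, useStream is built to talk to a LangGraph Server API rather than a plain SSE endpoint, so with a setup like this you’d write your own small client instead.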