Foundation course: Getting Set Up

I started the Foundation course today using VS Code.

I've been stuck for an hour on this error message at the `langgraph dev` step.

What did I do wrong? I placed the .env file in the same root folder as requirements.txt.

Error output below:

```
C:\Users\HJ\Downloads\LangChain\002\langchain-academy\module-1\studio> langgraph dev
INFO:langgraph_api.cli: Welcome to LangGraph
This in-memory server is designed for development and testing. For production use, please use LangGraph Platform.
2025-08-15T12:10:50.800132Z [info] Using langgraph_runtime_inmem [langgraph_runtime] api_variant=local_dev langgraph_api_version=0.2.130
2025-08-15T12:10:51.030948Z [info] Using auth of type=noop [langgraph_api.auth.middleware]
2025-08-15T12:10:51.101066Z [info] Starting In-Memory runtime with langgraph-api=0.2.130 and in-memory runtime=0.6.13 [langgraph_runtime_inmem.lifespan]
2025-08-15T12:10:51.675789Z [info] Starting thread TTL sweeper with interval 5 minutes [langgraph_api.thread_ttl]
2025-08-15T12:10:51.691729Z [info] Registering graph with id 'simple_graph' [langgraph_api.graph]
2025-08-15T12:10:57.955596Z [info] Shutting down remote graphs [langgraph_api.graph]
2025-08-15T12:10:58.002493Z [error] Traceback (most recent call last):
  File "C:\Users\HJ\Downloads\LangChain\002\langchain-academy\lc-academy-env\Lib\site-packages\starlette\routing.py", line 694, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "C:\Users\HJ\AppData\Local\Programs\Python\Python311\Lib\contextlib.py", line 210, in __aenter__
    return await anext(self.gen)
  File "C:\Users\HJ\Downloads\LangChain\002\langchain-academy\lc-academy-env\Lib\site-packages\langgraph_runtime_inmem\lifespan.py", line 79, in lifespan
    await graph.collect_graphs_from_env(True)
  File "C:\Users\HJ\Downloads\LangChain\002\langchain-academy\lc-academy-env\Lib\site-packages\langgraph_api\graph.py", line 407, in collect_graphs_from_env
    graph = await run_in_executor(None, _graph_from_spec, spec)
  File "C:\Users\HJ\Downloads\LangChain\002\langchain-academy\lc-academy-env\Lib\site-packages\langgraph_api\utils\config.py", line 135, in run_in_executor
    return await asyncio.get_running_loop().run_in_executor(
  File "C:\Users\HJ\AppData\Local\Programs\Python\Python311\Lib\concurrent\futures\thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "C:\Users\HJ\Downloads\LangChain\002\langchain-academy\lc-academy-env\Lib\site-packages\langgraph_api\utils\config.py", line 126, in wrapper
    return func(*args, **kwargs)
  File "C:\Users\HJ\Downloads\LangChain\002\langchain-academy\module-1\studio\./router.py", line 17, in <module>
    llm = ChatOpenAI(model="gpt-4o")
  File "C:\Users\HJ\Downloads\LangChain\002\langchain-academy\lc-academy-env\Lib\site-packages\langchain_core\load\serializable.py", line 130, in __init__
    super().__init__(*args, **kwargs)
  File "C:\Users\HJ\Downloads\LangChain\002\langchain-academy\lc-academy-env\Lib\site-packages\pydantic\main.py", line 253, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
  File "C:\Users\HJ\Downloads\LangChain\002\langchain-academy\lc-academy-env\Lib\site-packages\langchain_openai\chat_models\base.py", line 789, in validate_environment
    self.root_client = openai.OpenAI(**client_params, **sync_specific)  # type: ignore[arg-type]
  File "C:\Users\HJ\Downloads\LangChain\002\langchain-academy\lc-academy-env\Lib\site-packages\openai\_client.py", line 130, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
2025-08-15T12:10:58.002493Z [error] Application startup failed. Exiting. [uvicorn.error]
```

I faced the same issue. Here's what I did to solve it.

Create a .env file inside the module-X/studio folder (not the repo root). Add the necessary keys there, including OPENAI_API_KEY.
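For example, a minimal module-1/studio/.env might look like this (the values below are placeholders, and LANGSMITH_API_KEY is only needed if you're using tracing/Studio):

```
OPENAI_API_KEY=YOUR_OPENAI_KEY_HERE
LANGSMITH_API_KEY=YOUR_LANGSMITH_KEY_HERE
```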

Now try `langgraph dev` again and it should work.
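If you're not sure whether your .env file is being picked up at all, here's a quick stdlib-only sanity check you can run from the studio folder. This is my own hypothetical helper, not part of the course code; it just parses simple `KEY=VALUE` lines the way a .env loader would:

```python
# check_env.py - stdlib-only sanity check that a .env file defines OPENAI_API_KEY.
# Not part of the course materials; run it from the module-X/studio directory.
import os

def load_dotenv_file(path=".env"):
    """Minimal .env parser: reads KEY=VALUE lines, ignores blanks and '#' comments."""
    loaded = {}
    try:
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                loaded[key.strip()] = value.strip().strip('"').strip("'")
    except FileNotFoundError:
        pass  # no .env here - loaded stays empty
    os.environ.update(loaded)
    return loaded

if __name__ == "__main__":
    vars_loaded = load_dotenv_file()
    status = "found" if os.getenv("OPENAI_API_KEY") else "MISSING"
    print(f"OPENAI_API_KEY: {status} ({len(vars_loaded)} vars loaded from .env)")
```

If it prints MISSING while you're standing in the studio folder, the .env is in the wrong place or the key name is misspelled.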


So I ended up doing one extra step: in a Command Prompt opened at the studio path, enter

```
set OPENAI_API_KEY=YOUR_KEY_HERE
set LANGSMITH_API_KEY=YOUR_KEY_HERE
```

then run `langgraph dev` and it worked.
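One caveat: `set` only works in Command Prompt (cmd.exe), and only for that session. If your terminal is PowerShell (often the default in VS Code on Windows), the equivalent is:

```powershell
# PowerShell syntax; these apply to the current session only.
$env:OPENAI_API_KEY = "YOUR_KEY_HERE"
$env:LANGSMITH_API_KEY = "YOUR_KEY_HERE"
```

Either way you'll need to set them again in each new terminal, which is why the .env-in-studio-folder approach from the earlier reply is the more durable fix.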