Error importing `content` from `langchain_core.messages` with LangChain v1

I am using LangChain v1, and I got this error:
```
File "/home/mhkone/AI Progress/Lang_Ecosystem/.venv/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 60, in <module>
    from langchain_core.messages import content as types
ImportError: cannot import name 'content' from 'langchain_core.messages' (/home/mhkone/AI Progress/Lang_Ecosystem/.venv/lib/python3.12/site-packages/langchain_core/messages/__init__.py)
Could not import python module for graph:
GraphSpec(id='agent', path='./src/deep_research_from_scratch/research_agent_scope.py', module=None, variable='scope_research', config={}, description=None)
This error likely means you haven’t installed your project and its dependencies yet. Before running the server, install your project:

If you are using requirements.txt:
python -m pip install -r requirements.txt

If you are using pyproject.toml or setuptools:
python -m pip install -e .

Make sure to run this command from your project’s root directory (where your setup.py or pyproject.toml is located)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/mhkone/AI Progress/Lang_Ecosystem/.venv/lib/python3.12/site-packages/starlette/routing.py", line 694, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "/usr/local/lib/python3.12/contextlib.py", line 210, in __aenter__
    return await anext(self.gen)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/mhkone/AI Progress/Lang_Ecosystem/.venv/lib/python3.12/site-packages/langgraph_runtime_inmem/lifespan.py", line 80, in lifespan
    await graph.collect_graphs_from_env(True)
  File "/home/mhkone/AI Progress/Lang_Ecosystem/.venv/lib/python3.12/site-packages/langgraph_api/graph.py", line 422, in collect_graphs_from_env
    raise GraphLoadError(spec, exc) from exc
langgraph_api.utils.errors.GraphLoadError: Failed to load graph 'agent' from ./src/deep_research_from_scratch/research_agent_scope.py: cannot import name 'content' from 'langchain_core.messages' (/home/mhkone/AI Progress/Lang_Ecosystem/.venv/lib/python3.12/site-packages/langchain_core/messages/__init__.py)
[uvicorn.error] api_variant=local_dev langgraph_api_version=0.4.44 thread_name=MainThread
2025-10-28T03:11:39.156579Z [error ] Application startup failed. Exiting. [uvicorn.error] api_variant=local_dev langgraph_api_version=0.4.44 thread_name=MainThread
```

hi @MHKone

There is likely a version mismatch between `langchain-openai` and `langchain-core` causing `from langchain_core.messages import content` to fail; the `GraphLoadError` is just a cascade of that import failure.

Which one do you have in your project: pyproject.toml or requirements.txt?

  1. Upgrade and align langchain, langchain-core, and langchain-openai in requirements.txt or pyproject.toml
  2. Reinstall your project (pip install -r requirements.txt or pip install -e .)
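One way to confirm the mismatch before reinstalling is a quick importability check. This is a generic sketch (the `has_module` helper is mine, not a LangChain API); it assumes, per the traceback, that langchain-openai v1 needs the `langchain_core.messages.content` module, which older langchain-core releases do not ship.

```python
import importlib.util


def has_module(name: str) -> bool:
    """Return True if `name` is importable in the current environment."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # The parent package itself is not installed.
        return False


# If this prints False inside your venv, the installed langchain-core is
# too old for the v1 integration packages and needs to be upgraded.
print("langchain_core.messages.content importable:",
      has_module("langchain_core.messages.content"))
```

Running it inside the project's virtual environment tells you whether the upgrade actually took effect, independent of what the requirements file says.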

@pawel-twardziak I encountered the same problem. I have tried editing the versions in requirements.txt many times, but no luck so far!

```
superset_app | File "/app/docker/pythonpath_dev/superset_config.py", line 153, in init_custom_views
superset_app | from superset_chat.ai_superset_assistant import AISupersetAssistantView
superset_app | File "/app/.venv/lib/python3.11/site-packages/superset_chat/ai_superset_assistant.py", line 15, in <module>
superset_app | from superset_chat.app.server.llm import get_stream_agent_responce
superset_app | File "/app/.venv/lib/python3.11/site-packages/superset_chat/app/server/llm.py", line 19, in <module>
superset_app | from langchain_mcp_adapters.client import MultiServerMCPClient
superset_app | File "/app/.venv/lib/python3.11/site-packages/langchain_mcp_adapters/client.py", line 31, in <module>
superset_app | from langchain_mcp_adapters.tools import load_mcp_tools
superset_app | File "/app/.venv/lib/python3.11/site-packages/langchain_mcp_adapters/tools.py", line 11, in <module>
superset_app | from langchain_core.messages.content import (
superset_app | ModuleNotFoundError: No module named 'langchain_core.messages.content'
```

What do you mean by "upgrade and align langchain, langchain-core, langchain-openai"? What does "align" mean here? Setting all of them to the same version doesn't work (it causes a dependency version error), and leaving the versions unspecified so pip decides leads to ModuleNotFoundError!

```
langchain-core >= 0.3.36,<0.4
#langchain-community

langchain-openai >= 0.3.36,<0.4

#langchain-mcp-adapters>=0.0.6
superset-chat
psycopg2-binary
Pillow
```

hi @quang

could you share your full requirements.txt file?

Nvm, after reading the source code and trying different combinations, the set below works for me. The problem is `langchain-mcp-adapters`.
```
langchain-mcp-adapters<=0.1.14
langchain-core
langchain
langchain-neo4j
langchain-community
langchain-openai
superset-chat==0.1.0a23
psycopg2-binary
Pillow
```

When I checked the source code and compared each version, I found out that with langchain-core 1.0.0 they renamed `langchain_core.messages.content_blocks` to `langchain_core.messages.content`.

Then I went back to the `langchain-mcp-adapters` source code (which imports that module) and realized the same thing: the new version imports `.content` instead. So after some trial and error, I decided to pin the old `langchain-mcp-adapters`, and it works.
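For anyone who controls the importing code (rather than a third-party package), the rename can also be bridged with a try/except import fallback instead of a pin. A minimal sketch, assuming the 1.0+ name is `content` and the pre-1.0 name is `content_blocks` as described above; the helper name is hypothetical:

```python
def import_content_module():
    """Return langchain-core's message-content module under whichever
    name the installed version provides (hypothetical helper)."""
    try:
        # langchain-core >= 1.0 ships the module as `content`
        from langchain_core.messages import content
        return content
    except ImportError:
        # older langchain-core used the name `content_blocks`
        from langchain_core.messages import content_blocks
        return content_blocks
```

This keeps one codebase working across both core versions, at the cost of papering over a mismatch that a pinned requirements file would surface explicitly.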
Edit: Nvm, encountered another error about the temperature parameter for a specific Open model, but I think I can fix it. What a f*cking headache. F*ck dependency management in Python!

hi @quang
sh*t happens, and we can do nothing about it :person_shrugging: