Hi, given the following example code:
from langchain_core.messages import AIMessage
from langgraph.checkpoint.memory import InMemorySaver
from langgraph.graph import START, END, MessagesState, StateGraph


def dummy_node(state: MessagesState) -> dict:
    return {"messages": [AIMessage(content="Dummy response")]}


builder = StateGraph(MessagesState)
builder.add_node(dummy_node)
builder.add_edge(START, "dummy_node")
builder.add_edge("dummy_node", END)
graph = builder.compile(checkpointer=InMemorySaver())

config = {
    "configurable": {
        # Checkpoints are accessed by thread_id
        "thread_id": "9",
    }
}

for question in ["Dummy question 1", "Dummy question 2", "Dummy question 3"]:
    input_message = {
        "role": "user",
        "content": question,
    }
    for step in graph.stream(
        {"messages": input_message}, config, stream_mode="values"
    ):
        step["messages"][-1].pretty_print()

state = graph.get_state(config).values
state["messages"]
and the output:
================================ Human Message =================================
Dummy question 1
================================== Ai Message ==================================
Dummy response
================================ Human Message =================================
Dummy question 2
================================== Ai Message ==================================
Dummy response
================================ Human Message =================================
Dummy question 3
================================== Ai Message ==================================
Dummy response
{'role': 'user',
'content': '[HumanMessage(content=\'[HumanMessage(content="[HumanMessage(content=\\\'Dummy question 1\\\', additional_kwargs={}, response_metadata={}, id=\\\'0be043f4-c304-4257-97a4-da35569e811a\\\'), AIMessage(content=\\\'Dummy response\\\', additional_kwargs={}, response_metadata={}, id=\\\'7329176f-957f-4583-bf5a-99dffd5e6999\\\')]", additional_kwargs={}, response_metadata={}, id=\\\'89a836dc-0f0e-41ab-a136-cdad4e83bf0c\\\'), HumanMessage(content=\\\'Dummy question 2\\\', additional_kwargs={}, response_metadata={}, id=\\\'dac14c20-b9e9-4a12-9c07-8c8a208b976a\\\'), AIMessage(content=\\\'Dummy response\\\', additional_kwargs={}, response_metadata={}, id=\\\'7b628c44-cf44-4424-8e4b-7fd589fca134\\\')]\', additional_kwargs={}, response_metadata={}, id=\'f748f55f-d557-4088-ae51-9c729b5a2507\'), HumanMessage(content=\'Dummy question 3\', additional_kwargs={}, response_metadata={}, id=\'1385d7bd-8c09-4cf5-af55-28f73bbe8148\'), AIMessage(content=\'Dummy response\', additional_kwargs={}, response_metadata={}, id=\'9bd4bba5-ab88-4f05-af3f-f8c11ec67aad\')]'}
Similarly, when we inspect the last step["messages"], the previous messages are represented as a string rather than as a list of message objects:
{'messages': [HumanMessage(content='[HumanMessage(content="[HumanMessage(content=\'Dummy question 1\', additional_kwargs={}, response_metadata={}, id=\'0be043f4-c304-4257-97a4-da35569e811a\'), AIMessage(content=\'Dummy response\', additional_kwargs={}, response_metadata={}, id=\'7329176f-957f-4583-bf5a-99dffd5e6999\')]", additional_kwargs={}, response_metadata={}, id=\'89a836dc-0f0e-41ab-a136-cdad4e83bf0c\'), HumanMessage(content=\'Dummy question 2\', additional_kwargs={}, response_metadata={}, id=\'dac14c20-b9e9-4a12-9c07-8c8a208b976a\'), AIMessage(content=\'Dummy response\', additional_kwargs={}, response_metadata={}, id=\'7b628c44-cf44-4424-8e4b-7fd589fca134\')]', additional_kwargs={}, response_metadata={}, id='f748f55f-d557-4088-ae51-9c729b5a2507'),
HumanMessage(content='Dummy question 3', additional_kwargs={}, response_metadata={}, id='1385d7bd-8c09-4cf5-af55-28f73bbe8148'),
AIMessage(content='Dummy response', additional_kwargs={}, response_metadata={}, id='9bd4bba5-ab88-4f05-af3f-f8c11ec67aad')]}
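For what it's worth, the escaping pattern in the dumps above looks like repeated repr() wrapping: each turn the previous message list appears to be collapsed into a string and embedded as the content of the next message, so the quotes escape one level deeper per turn. A plain-Python sketch (no LangGraph involved, just my guess at the mechanism) reproduces the same shape:

```python
# Sketch: if each turn's "content" were the repr() of the previous
# message list, quoting would nest one level deeper every turn,
# matching the escaping pattern in the output above.
messages = ["Dummy question 1"]
for question in ["Dummy question 2", "Dummy question 3"]:
    # Previous turns collapsed into a single string, new turn appended
    messages = [repr(messages), question]

# messages[0] now contains a stringified list whose first element is
# itself a stringified list -- the same nested-quote pattern.
print(messages[0])
```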
I’m wondering why the messages in the state are represented as a string rather than kept as a list of BaseMessage objects.
Is this due to a mistake in my example code, or is this just an internal representation, with the messages being parsed back into a list of Message objects before being passed to a chat model?