An exception when invoking tools via ChatOpenAI()

When I use ChatOpenAI as the model for create_agent, an exception occurs during tool invocation: after ToolMessages whose content field is a list are added to the messages list, the model raises an error about an unexpected type. The issue goes away when I switch the model to ChatDeepSeek(). Does this mean ChatOpenAI() does not accept list-typed data in message content?

File "D:\Python\Lib\site-packages\openai\_base_client.py", line 1669, in request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': 'Failed to deserialize the JSON body into the target type: messages[10]: invalid type: sequence, expected a string at line 1 column 7900', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_request_error'}}
During task with name 'model' and id 'e8b3cc18-0716-fb5d-728a-670eda60c58b'

I'm calling an MCP service, and the data format returned by the third party isn't fixed. So if I use create_agent directly to create an agent, the type restrictions are not very developer-friendly. Whenever I run into this issue, I have to handle it myself, which is quite troublesome.
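One possible workaround until the model accepts list content: normalize list-typed tool message content into a plain string before it reaches the Chat Completions endpoint. This is a minimal sketch with a hypothetical helper name (flatten_tool_content); it assumes the list content is made up of strings and/or dict blocks with a "type": "text" shape, as many MCP servers return:

```python
import json

def flatten_tool_content(content):
    """Hypothetical helper: coerce list-typed tool message content
    into a single string, since the Chat Completions API expects
    a string for tool message content."""
    if isinstance(content, str):
        return content
    parts = []
    for block in content:
        if isinstance(block, str):
            parts.append(block)
        elif isinstance(block, dict) and block.get("type") == "text":
            parts.append(block.get("text", ""))
        else:
            # Fall back to JSON for non-text blocks.
            parts.append(json.dumps(block, ensure_ascii=False))
    return "\n".join(parts)

# Example:
flatten_tool_content([{"type": "text", "text": "hello"}, {"score": 1}])
# → 'hello\n{"score": 1}'
```

You could apply this to each ToolMessage's content before appending it to the messages list, so the rest of the agent loop stays unchanged.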

hi @Huimin-station

have you tried the OpenAI Responses API?

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o",
    use_responses_api=True, 
)

Not yet; what does it do?

The Responses API has first-class support for tool calling, so it handles these messages better.


Thank you, I'll give it a try right away.
