ChatMLX & create_react_agent

Does anyone know how to use MLX with an agent in LangGraph?

```python
from langchain_community.llms.mlx_pipeline import MLXPipeline
from langchain_community.chat_models.mlx import ChatMLX
from langchain_core.messages import HumanMessage
from langgraph.prebuilt import create_react_agent
from pydantic import BaseModel

# loading the model
llm = MLXPipeline.from_model_id(
    "mlx-community/Llama-3.2-3B-Instruct",
    pipeline_kwargs={"max_tokens": 1024, "temp": 0.1},
)
model = ChatMLX(llm=llm)

# creating the agent
class AlbumStudio(BaseModel):
    conditions: int

agent = create_react_agent(
    model=model,
    tools=[internet_search_DDGO, process_content],
    prompt="You're a wizard who extracts whole numbers from web pages. You must always answer with a whole number.",
    response_format=AlbumStudio,
)

# launching the agent
messages = [
    HumanMessage(content="blabla"),
]
result = agent.invoke({"messages": messages})
```

This code gives me the following error: `ValueError: Last message must be a HumanMessage!`

The ChatMLX integration is maintained by the community! I took a quick look at the implementation, and right now it requires that the messages list passed in each invocation end with a HumanMessage.

This doesn't work with the ReAct tool-calling agent, because the agent appends AI and Tool messages to the list as it calls tools in a loop, so by the second model call the list no longer ends in a HumanMessage.
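To illustrate why the loop trips the check, here is a minimal, self-contained sketch. The message classes and the `validate_last_is_human` helper are plain-Python stand-ins for illustration, not the actual ChatMLX implementation:

```python
# Simplified stand-ins for langchain_core message types, so this sketch
# runs without any dependencies. Hypothetical names for illustration only.
class BaseMessage:
    def __init__(self, content):
        self.content = content

class HumanMessage(BaseMessage): pass
class AIMessage(BaseMessage): pass
class ToolMessage(BaseMessage): pass

def validate_last_is_human(messages):
    # Mirrors the restriction described above: the list must end
    # with a HumanMessage.
    if not messages or not isinstance(messages[-1], HumanMessage):
        raise ValueError("Last message must be a HumanMessage!")

# First model call: fine, the user question is last.
validate_last_is_human([HumanMessage("blabla")])

# Second model call in a ReAct loop: the agent has appended its tool call
# and the tool's result, so the list now ends in a ToolMessage.
transcript = [
    HumanMessage("blabla"),
    AIMessage("(tool call: internet_search_DDGO)"),
    ToolMessage("search results here"),
]
try:
    validate_last_is_human(transcript)
except ValueError as e:
    print(e)  # Last message must be a HumanMessage!
```

The ReAct agent cannot avoid this transcript shape, which is why the error surfaces only once tools start being called.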

Feel free to raise a PR against langchain-community that relaxes this behavior! Alternatively, you could use a different chat model provider that doesn't have this restriction.