Problems with tool calls using ChatOllama

Hi. I've been testing langchain-ollama for a personal OSS project. I tried making tool calls with it, but no tool call is ever executed. Can anyone see why? The model just returns plain JSON echoing the question…

hi @renvins

This is expected behavior; see Models - Docs by LangChain:

When binding user-defined tools, the model’s response includes a request to execute a tool. When using a model separately from an agent, it is up to you to execute the requested tool and return the result back to the model for use in subsequent reasoning. When using an agent, the agent loop will handle the tool execution loop for you.

You can use an agent if you want it to execute the tool for you.

Your code will look something like this:

from langchain.tools import tool
from langchain.agents import create_agent
from langchain_ollama import ChatOllama

@tool
def get_current_time(timezone: str) -> str:
    """Return the current time for a specific timezone."""
    return f"stub time in {timezone}"

model = ChatOllama(model="qwen2.5-coder:7b")
agent = create_agent(model, tools=[get_current_time])

result = agent.invoke(
    {"messages": [{"role": "user", "content": "What time is it in Shanghai?"}]}
)
# The agent returns the full message history; the final AI message holds the answer.
print(result["messages"][-1].content)

Or, if you want to keep using the model directly, bind the tools and execute the requested tool calls yourself:

model_with_tools = ChatOllama(model="qwen2.5-coder:7b").bind_tools([get_current_time])
msg = model_with_tools.invoke("What time is it in Shanghai?")

for tool_call in msg.tool_calls:
    # Invoking the tool with the full tool call executes it with the
    # model-generated arguments and returns a ToolMessage.
    tool_result = get_current_time.invoke(tool_call)
    print(tool_result.content)