CLI: Reading a PDF fails with DeepAgents CLI

I am only starting to kick the tires of the deepagents CLI, so bear with me. I started by hooking into a local model via Ollama and then switched to OpenAI 5.4.

My prompt:

>help me summarize this submission @file.pdf

The file exists. With ChatGPT 5.4, this is the error that is thrown:

Error: Agent error: {'error': 'BadRequestError', 'message': 'An internal error occurred'}

For Ollama with Qwen36:

Error: Agent error: {'error': 'ValueError', 'message': 'Blocks of type file not supported.'}

From the docs, I believe PDF files are supported.

I suspect it’s user error on my end, but I am curious as to what I might be missing.

Hi @Btibert3,

I'm not sure it's possible to handle PDF files with Ollama…
@wfh ? :slight_smile:

Hello @Btibert3,

I was able to replicate this error on my machine as well.
This happens when you use the OpenAI environment default model, which is gpt-5.2. By default, the CLI does not initialize the model with the Responses API, and the file parameter is only supported in the Responses API, which causes the Bad Request error.

My replication on LangSmith: LangSmith

I have created an issue for this on DeepAgent repo (and also a fix for it): fix(cli): route OpenAI env-default model through Responses API to support PDF file inputs · Issue #2959 · langchain-ai/deepagents · GitHub

Also @Btibert3, until the fix above is released for OpenAI, you can pass model parameters to enable the Responses API:

deepagents --model-params '{"use_responses_api": true}'
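The value of `--model-params` is a JSON object, so shell quoting matters: single-quote the whole value so the inner double quotes reach the CLI intact. A quick stdlib-only sanity check (nothing deepagents-specific; this just parses the string the way the CLI presumably does before forwarding it as model kwargs):

```python
import json

# The flag's value must be valid JSON. In the shell, wrap it in single
# quotes so the inner double quotes survive; true/false are lowercase.
raw = '{"use_responses_api": true}'

params = json.loads(raw)
print(params)  # {'use_responses_api': True}
```

If `json.loads` raises a `JSONDecodeError` on your string, the CLI would reject it too.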

Update: this only applies to OpenAI; for Ollama, @pawel-twardziak might be right.

Awesome, thank you. I am only getting started, but it is good to know I can pass in that flag in the short run.

Thanks for flagging; this will be fixed in the next release, following this PR.