Does langchain-openai support passing Gemini 3’s thought-signatures metadata via OpenRouter?

We are currently calling various LLMs through the chain langchain-openai → OpenRouter → various models.
The newly released Gemini 3 Pro requires the thoughtSignatures metadata to be passed back during tool calls, but this parameter is only supported in langchain-google-genai, not in langchain-openai.

Is there any way to work around or solve this issue? Thank you.
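For context, a minimal sketch of the general workaround we considered: keep the provider's assistant message dict verbatim (all keys, including any opaque signature fields) when rebuilding the conversation for the next request, instead of reconstructing it from role/content/tool_calls only. This is plain dict handling, not a langchain-openai API; the "thought_signature" key below is illustrative, not a confirmed field name.

```python
def preserve_assistant_turn(history: list[dict], assistant_msg: dict) -> list[dict]:
    """Append the assistant message exactly as the API returned it.

    Rebuilding the message from parsed fields drops provider-specific
    extras (e.g. a hypothetical "thought_signature" key), which is what
    breaks Gemini 3 tool-calling round-trips through OpenAI-style clients.
    """
    return history + [dict(assistant_msg)]  # shallow copy, every key kept


# Example: a fake API response carrying an extra, provider-specific field.
raw_assistant_msg = {
    "role": "assistant",
    "content": None,
    "tool_calls": [{"id": "call_1", "type": "function",
                    "function": {"name": "get_weather", "arguments": "{}"}}],
    "thought_signature": "opaque-token",  # illustrative field name
}
history = preserve_assistant_turn(
    [{"role": "user", "content": "What's the weather?"}],
    raw_assistant_msg,
)
assert history[-1]["thought_signature"] == "opaque-token"
```

The catch, as described above, is that langchain-openai reconstructs messages from its own message classes, so the extra field is dropped before the request is serialized; this sketch only works if you control the raw message payloads yourself.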


I have the same issue. It would be great to have support for this, or at least a workaround.


Yeah, I have the same issue too. I would appreciate any help. Thanks!

Not currently possible. See this tracking issue for a dedicated OpenRouter integration: OpenRouter · Issue #34328 · langchain-ai/langchain · GitHub


I’m having the same issue when using langchain/Google. It was handled in langchain/vertex, so this is a known problem that was resolved at one point. Any chance the new TypeScript langchain package could be updated to handle thought signatures?

Hey @Godrules500, could you elaborate on what you’re seeing regarding thought signatures not being supported in @langchain/google? We do have support for this in that new package.

I’ll also call out that we now have a first-party OpenRouter integration: https://www.npmjs.com/package/@langchain/openrouter