We are currently calling various LLMs through the chain langchain-openai → OpenRouter → various models.
The newly released Gemini 3 Pro requires that the thoughtSignatures returned with its tool calls be passed back on the next request, but this parameter is only supported in langchain-google-genai, not in langchain-openai.
Is there any way to work around or solve this issue? Thank you.
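Until this is supported natively, one possible workaround is to round-trip the opaque reasoning fields yourself: capture the raw assistant message from the provider response, and when you replay that turn (e.g. alongside the tool result), copy the signature field back onto the message before sending. The sketch below assumes OpenRouter surfaces the signature under a key such as `reasoning_details`; the exact field name is an assumption here, not a confirmed API, so check your raw responses:

```python
# Hypothetical workaround sketch: carry provider-specific reasoning fields
# (e.g. Gemini thought signatures) from the raw assistant response into the
# replayed message on the next request. The "reasoning_details" key is an
# assumption about OpenRouter's response shape -- verify against your payloads.
from typing import Any


def preserve_reasoning_fields(
    raw_assistant_msg: dict[str, Any],
    replayed_msg: dict[str, Any],
    fields: tuple[str, ...] = ("reasoning_details",),
) -> dict[str, Any]:
    """Return a copy of replayed_msg with opaque reasoning fields carried over."""
    out = dict(replayed_msg)
    for key in fields:
        if key in raw_assistant_msg:
            out[key] = raw_assistant_msg[key]
    return out


# Example: an assistant tool-call message as the provider returned it...
raw = {
    "role": "assistant",
    "tool_calls": [{"id": "call_1"}],
    "reasoning_details": [{"type": "reasoning.encrypted", "data": "opaque-sig"}],
}
# ...and the replayed version, which has dropped the signature field.
replayed = {"role": "assistant", "tool_calls": [{"id": "call_1"}]}

fixed = preserve_reasoning_fields(raw, replayed)
print(fixed["reasoning_details"])
```

In a LangChain pipeline this would mean hooking the point where chat history is serialized back into request messages, since that is where the field gets dropped.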
I’m having the same issue when using @langchain/google. It was handled in @langchain/vertex, so this is a known problem that has been resolved once before. Any chance the new TypeScript langchain package can be updated to handle thought signatures?
Hey @Godrules500, could you elaborate on what you’re seeing regarding thought signatures not being supported in @langchain/google? We do have support for this in that new package.