We currently call a range of LLMs through the chain langchain-openai → OpenRouter → upstream models.
The newly released Gemini 3 Pro requires the thoughtSignatures returned with a tool call to be passed back on the follow-up request, but this parameter is only supported by langchain-google-genai, not by langchain-openai.
Is there any way to work around or solve this issue? Thank you.