Unable to pass maxTokens and extract Reasoning Summary in AzureChatOpenAI

Hi @Nikfury

try this:

```ts
const ai = await llm.invoke(messages, {
    reasoning: {
        effort: "high",
        summary: "auto", // summary config if desired
    },
});
```

Your issue might be related to How to extract GPT-5 reasoning summaries with @langchain/openai? - #15 by pawel-twardziak, since AzureChatOpenAI still extends the OpenAI chat model class.