Unable to pass maxTokens and extract Reasoning Summary in AzureChatOpenAI

Hi @pawel-twardziak

Let me know if I'm missing something, but the solution you suggested in How to extract GPT-5 reasoning summaries with @langchain/openai? - #2 by pawel-twardziak suggests:

  1. passing useResponsesApi: true when initializing the LLM
  2. and passing the reasoning config in invoke:

```ts
reasoning: {
    effort: "high",
    summary: "auto" /* summary config if desired */,
},
```
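Concretely, following that suggestion, my setup looks roughly like this (the deployment name and API version below are placeholders, not my actual values):

```ts
import { AzureChatOpenAI } from "@langchain/openai";

// Placeholders: substitute your own deployment name and API version.
const llm = new AzureChatOpenAI({
  azureOpenAIApiDeploymentName: "gpt-5",
  azureOpenAIApiVersion: "2025-04-01-preview",
  useResponsesApi: true, // step 1 from the suggested solution
});

// Step 2: reasoning config passed at invoke time, as suggested in the thread.
const result = await llm.invoke("Explain X briefly", {
  reasoning: {
    effort: "high",
    summary: "auto" /* summary config if desired */,
  },
});
```

This is the call that fails with the 404 below.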
  • Using that with AzureChatOpenAI gives the error { code: '404', message: 'Resource not found' }. The same Azure GPT-5 resource works if I remove both of the parameters above, but then I can't extract the reasoning summary.

  • Also, I am able to extract the reasoning summary when calling the same Azure GPT-5 resource through the official OpenAI Responses API.
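For comparison, the direct call via the official openai SDK against the same resource does return the summary. A rough sketch of what works for me (endpoint, API version, and model/deployment name are placeholders):

```ts
import { AzureOpenAI } from "openai";

// Placeholders: substitute your own endpoint, API version, and deployment.
const client = new AzureOpenAI({
  endpoint: "https://my-resource.openai.azure.com",
  apiVersion: "2025-04-01-preview",
});

const response = await client.responses.create({
  model: "gpt-5",
  input: "Explain X briefly",
  reasoning: { effort: "high", summary: "auto" },
});

// Reasoning summaries come back as output items of type "reasoning".
for (const item of response.output) {
  if (item.type === "reasoning") {
    console.log(item.summary);
  }
}
```

So the resource itself clearly supports reasoning summaries; the 404 only appears when going through AzureChatOpenAI with useResponsesApi: true.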