When using

```ts
import { AzureChatOpenAI } from '@langchain/openai';
```

with gpt-oss-120b (and even other GPT models), the response doesn't include the reasoning content or the reasoning tokens, either in the returned message or in LangSmith.
If I include `__includeRawResponse: true`, I can see that Azure is returning `reasoning_content`, but the LangChain `AIMessageChunk` and its content blocks don't include it.
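For context, this is roughly how I'm digging the reasoning text out of the raw response today. The chunk shape below (`choices[0].delta.reasoning_content`) is what my gpt-oss deployment appears to return; the field names are assumptions based on what I see in the raw payload, not a documented LangChain API:

```typescript
// Hypothetical shape of a raw Azure streaming chunk, as observed when
// __includeRawResponse: true attaches it to additional_kwargs.__raw_response.
type RawChunk = {
  choices?: Array<{
    delta?: { reasoning_content?: string; content?: string };
  }>;
};

// Pull the reasoning text out of a raw chunk, if present.
function extractReasoning(raw: RawChunk): string | undefined {
  return raw.choices?.[0]?.delta?.reasoning_content;
}

// Mocked chunk, mimicking what Azure sends for gpt-oss-120b.
const chunk: RawChunk = {
  choices: [
    { delta: { reasoning_content: 'Thinking about the answer...', content: '' } },
  ],
};

console.log(extractReasoning(chunk)); // "Thinking about the answer..."
```

So the data is there on the wire; it just never surfaces on the `AIMessageChunk` itself.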
Using version `"@langchain/openai": "1.1.3"`.
Thanks!
Hi @hesamzkr
Could you share your code? I can debug what is going on inside the source code.
Hi, thank you. Yep, here's how I'm initializing the model:
```ts
import { AzureChatOpenAI } from '@langchain/openai';

const model = new AzureChatOpenAI({
  maxRetries: 2,
  azureOpenAIApiKey: azureCredentials.apiKey,
  azureOpenAIApiInstanceName: azureCredentials.apiInstanceName,
  azureOpenAIApiVersion: '2025-01-01-preview',
  streaming: true,
  metadata: {
    ls_provider: 'azure',
    ls_model_name: 'gpt-oss-120b'
  },
  reasoning: {
    effort: 'high'
  },
  model: 'gpt-oss-120b',
  temperature: 0,
  maxTokens: 1000,
  azureOpenAIApiDeploymentName: azureCredentials.apiDeploymentName
});

const response = await model.invoke(messages);
```