@langchain/aws v1.0 ChatBedrockConverse: ‘Maximum tokens exceeds model limit’

I’m getting this error when using ChatBedrockConverse from @langchain/aws, even though the maxTokens property is set higher than the limit mentioned in the message:

> The maximum tokens you requested exceeds the model limit of 4096. Try again with a maximum tokens value that is lower than 4096.

Is this some configuration issue on the AWS side, or something that changed in LangChain 1.0? I wasn’t getting this error before upgrading.
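For reference, here is roughly how I’m setting it up. This is a minimal sketch; the model ID, region, and maxTokens value here are placeholders, not my exact values:

```typescript
import { ChatBedrockConverse } from "@langchain/aws";

// Hypothetical reproduction: model ID, region, and maxTokens are
// placeholder assumptions, not the exact values from my app.
const llm = new ChatBedrockConverse({
  model: "anthropic.claude-3-sonnet-20240229-v1:0",
  region: "us-east-1",
  // Set higher than 4096, which is what the error message complains about.
  maxTokens: 8192,
});

const response = await llm.invoke("Hello");
console.log(response.content);
```

Note that Bedrock enforces a per-model maximum output token limit server-side, so if the chosen model caps output at 4096 tokens, any higher maxTokens value will be rejected by the Converse API regardless of what the client library allows.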

Can you provide a code snippet? What is the model being used?