I’m experiencing a critical issue where ChatBedrockConverse.astream() consistently drops the first portion of the LLM’s response. The stream starts mid-sentence, missing what appears to be several tokens from the beginning; the rest of the response streams in fine.
Specifically, this happens with the model id us.anthropic.claude-sonnet-4-20250514-v1:0.
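A minimal reproduction sketch of what I'm seeing. This assumes the langchain-aws package is installed and AWS credentials are configured; missing_prefix is a hypothetical helper I wrote to quantify the dropped text by comparing the streamed output against a non-streaming call, not part of any library API:

```python
import asyncio

# Assumed dependency; uncomment when langchain-aws is installed:
# from langchain_aws import ChatBedrockConverse

MODEL_ID = "us.anthropic.claude-sonnet-4-20250514-v1:0"  # model id from the report


def missing_prefix(streamed: str, full: str) -> str:
    """Return the leading part of `full` that never appeared in `streamed`.

    If the streamed text is a proper suffix of the full response, the
    returned string is exactly the dropped prefix; otherwise returns "".
    """
    if full.endswith(streamed):
        return full[: len(full) - len(streamed)]
    return ""


async def main() -> None:
    llm = ChatBedrockConverse(model=MODEL_ID)
    prompt = "Explain streaming in one short paragraph."

    # Non-streaming reference answer (note: responses are not deterministic,
    # so this comparison is only indicative, not exact).
    full = llm.invoke(prompt).content

    # Collect streamed chunks; chunk content may be a plain string or a
    # list of content blocks depending on the model/provider.
    parts = []
    async for chunk in llm.astream(prompt):
        if isinstance(chunk.content, str):
            parts.append(chunk.content)
    streamed = "".join(parts)

    print("dropped prefix:", repr(missing_prefix(streamed, full)))


# asyncio.run(main())  # uncomment to run against Bedrock
```

In my runs, the concatenated astream() output is missing the opening tokens that a plain invoke() on the same prompt returns.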