[BUG] The provided model doesn't support on-demand throughput #508
Comments
Hi, I have the same problem with Claude 3.5 Sonnet on Bedrock in Ireland. Did you manage to find a solution?
Hi, I have tried various methods to fix it, but none worked for Claude 3 Opus. BTW, Claude 3.5 Sonnet works fine in my environment (us-west-2).
Hi. Ah yes, OK. I'm in eu-west-1, where I think 3.5 is fairly newly added, and it has cross-region inference. I don't know if that could cause any problems?
I experienced the same issue when attempting to use Anthropic Claude 3.5 Sonnet in the Paris region.
It seems that the model can only be used via something called an inference profile, due to cross-region inference. Anyway, it boils down to simply changing the model ID to the corresponding inference profile ID. HTH.
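To make the fix above concrete, here is a minimal, hedged sketch of deriving a cross-region inference profile ID from a plain Bedrock model ID. The `eu.`/`us.`/`apac.` prefix convention follows AWS Bedrock's cross-region inference profiles; the function name and mapping table are illustrative, not code from this repo.

```python
# Sketch (assumptions: prefix convention per AWS cross-region inference
# profiles; helper names are illustrative).

REGION_PREFIXES = {
    "eu": "eu.",    # e.g. eu-west-1 (Ireland), eu-west-3 (Paris)
    "us": "us.",    # e.g. us-east-1, us-west-2
    "ap": "apac.",  # e.g. ap-northeast-1
}

def to_inference_profile_id(model_id: str, region: str) -> str:
    """Prefix the model ID with its region group so ConverseStream
    resolves it as an inference profile rather than on-demand throughput."""
    prefix = REGION_PREFIXES.get(region.split("-")[0])
    if prefix is None:
        return model_id  # no known profile group; keep the plain model ID
    if model_id.startswith(prefix):
        return model_id  # already a profile ID
    return prefix + model_id

# Example: Claude 3.5 Sonnet in eu-west-1 (Ireland)
print(to_inference_profile_id(
    "anthropic.claude-3-5-sonnet-20240620-v1:0", "eu-west-1"))
```

Whether a given model actually has a profile in your region group still needs to be checked in the Bedrock console; the prefix alone does not guarantee availability.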
@virtualstaticvoid Can I ask where I should make this update?
In my case, I changed it in the bedrock.py file (bedrock-claude-chat/backend/app/bedrock.py, line 273 at commit a759f3d). Basically, anywhere the model ID is referenced.
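For orientation, this is roughly where the model ID feeds into the ConverseStream call. A hedged sketch of a bedrock.py-style helper follows; the function and parameter names are assumptions for illustration, not the repo's actual code, but the `modelId`/`messages`/`inferenceConfig` keys match the boto3 `converse_stream` API.

```python
# Sketch (assumption: helper name is illustrative; the keyword argument
# shape matches boto3's bedrock-runtime converse_stream API).

def build_converse_stream_args(model_id: str, user_message: str) -> dict:
    """Assemble keyword arguments for a converse_stream call.
    Swapping the plain model ID for the inference profile ID here
    is the whole fix."""
    return {
        "modelId": model_id,  # use the inference profile ID here
        "messages": [
            {"role": "user", "content": [{"text": user_message}]}
        ],
        "inferenceConfig": {"maxTokens": 1024, "temperature": 0.7},
    }

# With boto3 (not executed here):
# client = boto3.client("bedrock-runtime", region_name="eu-west-1")
# response = client.converse_stream(
#     **build_converse_stream_args(
#         "eu.anthropic.claude-3-5-sonnet-20240620-v1:0", "Hello"))
```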
Describe the bug
When starting a conversation with Claude 3 (Opus), an "An error occurred while responding." error is shown, and the Lambda function (WebSocketHandler) logs in CloudWatch show: "Failed to run stream handler: An error occurred (ValidationException) when calling the ConverseStream operation: The provided model doesn't support on-demand throughput."
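One way to handle this error defensively in a stream handler is to catch the ValidationException and retry once with the region-prefixed inference profile ID. The sketch below is a hedged illustration: the error substring matches the CloudWatch log above, but the retry policy and names are assumptions, not the project's actual handler.

```python
# Sketch (assumptions: retry-once policy and helper names are illustrative;
# the error substring matches the ValidationException text in the logs).

ON_DEMAND_ERROR = "doesn't support on-demand throughput"

def call_with_profile_fallback(invoke, model_id: str, region_prefix: str):
    """Call invoke(model_id); if the on-demand-throughput error surfaces,
    retry once with the region-prefixed inference profile ID."""
    try:
        return invoke(model_id)
    except Exception as exc:
        if ON_DEMAND_ERROR not in str(exc):
            raise  # unrelated failure: propagate unchanged
        return invoke(f"{region_prefix}.{model_id}")
```

In practice `invoke` would wrap the boto3 `converse_stream` call; keeping the fallback logic separate makes it easy to unit-test with a stub.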