Support for explicitPromptCaching param in invoke_model and InvokeModelWithResponseStream in Bedrock #3323
Labels
feature-request: This issue requests a feature.
service-api: This issue is caused by the service API, not the SDK implementation.
Describe the feature
AWS Bedrock now supports LLM prompt caching (as a beta feature) and has added a new explicitPromptCaching parameter to invoke_model and invoke_model_with_response_stream. However, this parameter is not yet reflected in the boto request shape, so trying to use it fails validation. It would be great if we could add it!
Use Case
We'd love to use the new Bedrock prompt caching feature to reduce time to first token.
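For illustration, here is a minimal sketch of the failure mode. The "enabled" value passed to explicitPromptCaching is an assumption for illustration only; botocore currently raises ParamValidationError for the unmodeled parameter regardless of its value:

```python
import json

import boto3
from botocore.exceptions import ParamValidationError

client = boto3.client("bedrock-runtime")

try:
    client.invoke_model(
        modelId="anthropic.claude-3-5-sonnet-20240620-v1:0",  # any model id; illustration only
        body=json.dumps({"prompt": "hello"}),  # placeholder body; validation fails before it is read
        explicitPromptCaching="enabled",  # hypothetical value; param is not in the request shape yet
    )
except ParamValidationError as err:
    # botocore rejects the unknown top-level parameter client-side,
    # before any request is sent to the service.
    print(err)
```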
Proposed Solution
Add explicitPromptCaching to InvokeModelRequest and InvokeModelWithResponseStream in botocore/data/bedrock-runtime/<date>/service-2.json.
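As a quick way to confirm whether a given botocore release has picked up the shape change, you can inspect the loaded service model. This sketch uses standard botocore introspection and is not specific to this feature:

```python
import botocore.session

session = botocore.session.get_session()
service_model = session.get_service_model("bedrock-runtime")

# List the modeled top-level input members for InvokeModel;
# once the shape update lands, explicitPromptCaching should appear here.
operation = service_model.operation_model("InvokeModel")
print(sorted(operation.input_shape.members))
```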
Other Information
No response
Acknowledgements
SDK version used
1.35.*
Environment details (OS name and version, etc.)
Any environment