AWS Bedrock provides a number of language models, including Anthropic's Claude models, via the Bedrock Converse API.
Authentication
Authentication is handled through {paws.common}, so if authentication
does not work for you automatically, you'll need to follow the advice
at https://www.paws-r-sdk.com/#credentials. In particular, if your
org uses AWS SSO, you'll need to run aws sso login at the terminal.
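As a minimal sketch (assuming your ambient AWS credentials or a named profile resolve via {paws.common}; "my-sso-profile" is a placeholder, not a real profile name):

```r
library(ellmer)

# If credentials are picked up automatically, no extra arguments are
# needed; otherwise point at a named AWS profile (run `aws sso login`
# first if your org uses AWS SSO).
chat <- chat_aws_bedrock(profile = "my-sso-profile")
```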
Arguments
- system_prompt
A system prompt to set the behavior of the assistant.
- base_url
The base URL to the endpoint; the default is the standard AWS Bedrock endpoint for your region.
- model
The model to use for the chat (defaults to "anthropic.claude-sonnet-4-5-20250929-v1:0"). We regularly update the default, so we strongly recommend explicitly specifying a model for anything other than casual use. Use models_aws_bedrock() to see all options. While ellmer provides a default model, there's no guarantee that you'll have access to it, so you'll need to specify a model that you can access. If you're using cross-region inference, you'll need to use the inference profile ID, e.g. model = "us.anthropic.claude-sonnet-4-5-20250929-v1:0".
- profile
AWS profile to use.
- params
Common model parameters, usually created by
params().
- api_args
Named list of arbitrary extra arguments appended to the body of every chat API call.
- api_headers
Named character vector of arbitrary extra headers appended to every chat API call.
- echo
One of the following options:
- none: don't emit any output (default when running in a function).
- output: echo text and tool-calling output as it streams in (default when running at the console).
- all: echo all input and output.
Note this only affects the chat() method.
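Putting the arguments above together, a hedged sketch of a typical call (the model ID and prompt are illustrative; pick a model you actually have access to via models_aws_bedrock()):

```r
library(ellmer)

# Cross-region inference profile ID, per the model argument above.
chat <- chat_aws_bedrock(
  system_prompt = "You are a terse assistant.",
  model = "us.anthropic.claude-sonnet-4-5-20250929-v1:0",
  params = params(temperature = 0.2),
  echo = "output"
)
chat$chat("Tell me a joke about R.")
```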
Value
A Chat object.
See also
Other chatbots:
chat_anthropic(),
chat_azure_openai(),
chat_cloudflare(),
chat_databricks(),
chat_deepseek(),
chat_github(),
chat_google_gemini(),
chat_groq(),
chat_huggingface(),
chat_mistral(),
chat_ollama(),
chat_openai(),
chat_openai_compatible(),
chat_openrouter(),
chat_perplexity(),
chat_portkey()
