Anthropic provides a number of chat-based models under the Claude moniker. Note that a Claude Pro membership does not give you the ability to call models via the API; instead, you will need to sign up for (and pay for) a developer account.

Usage

chat_anthropic(
  system_prompt = NULL,
  params = NULL,
  max_tokens = deprecated(),
  model = NULL,
  api_args = list(),
  base_url = "https://api.anthropic.com/v1",
  beta_headers = character(),
  api_key = anthropic_key(),
  echo = NULL
)

models_anthropic(
  base_url = "https://api.anthropic.com/v1",
  api_key = anthropic_key()
)

Arguments

system_prompt

A system prompt to set the behavior of the assistant.

params

Common model parameters, usually created by params().
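
For example, a minimal sketch (assuming temperature and max_tokens are among the parameters accepted by params()):

chat <- chat_anthropic(
  system_prompt = "You are a terse assistant.",
  params = params(temperature = 0.2, max_tokens = 500)
)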

max_tokens

Maximum number of tokens to generate before stopping. Deprecated: set this via params(max_tokens = ...) instead.

model

The model to use for the chat (defaults to "claude-sonnet-4-20250514"). We regularly update the default, so we strongly recommend explicitly specifying a model for anything other than casual use. Use models_anthropic() to see all options.
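
For example, list the available models and then pin one explicitly (the model id below is the current default noted above):

models_anthropic()
chat <- chat_anthropic(model = "claude-sonnet-4-20250514")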

api_args

Named list of arbitrary extra arguments appended to the body of every chat API call. Combined with the body object generated by ellmer with modifyList().
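
For example, a sketch assuming the Anthropic Messages API accepts a metadata field containing a user_id:

chat <- chat_anthropic(
  api_args = list(metadata = list(user_id = "example-user-id"))
)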

base_url

The base URL to the endpoint; the default uses Anthropic.

beta_headers

Optionally, a character vector of beta headers used to opt in to Claude features that are still in beta.
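
For example (the header value below is a placeholder, not a real beta feature name):

chat <- chat_anthropic(
  beta_headers = c("some-beta-feature-2025-01-01")
)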

api_key

API key to use for authentication.

You generally should not supply this directly, but instead set the ANTHROPIC_API_KEY environment variable. The best place to set this is in .Renviron, which you can easily edit by calling usethis::edit_r_environ().
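
For example:

usethis::edit_r_environ()
# then add a line like the following to .Renviron and restart R:
# ANTHROPIC_API_KEY=<your-key>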

echo

One of the following options:

  • none: don't emit any output (default when running in a function).

  • output: echo text and tool-calling output as it streams in (default when running at the console).

  • all: echo all input and output.

Note this only affects the chat() method.
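
For example, to stream text and tool-calling output even when running inside a function:

chat <- chat_anthropic(echo = "output")
chat$chat("Tell me a joke about statisticians")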

Value

A Chat object.

Examples

chat <- chat_anthropic()
#> Using model = "claude-sonnet-4-20250514".
chat$chat("Tell me three jokes about statisticians")
#> Here are three jokes about statisticians:
#> 
#> 1. **The Optimist, Pessimist, and Statistician**
#> An optimist sees the glass as half full. A pessimist sees it as half 
#> empty. A statistician says, "We need a larger sample size to draw any 
#> meaningful conclusions."
#> 
#> 2. **The Drowning Statistician**
#> A statistician was drowning in a lake with an average depth of 3 feet.
#> His last words were, "But the mean said I should be able to stand!"
#> 
#> 3. **The Honest Statistician**
#> A statistician's wife asks him, "Do you love me or do you love 
#> statistics more?" He replies, "Well, I'd say I love you... but that 
#> could just be sampling bias since you're the only wife I've had."