Anthropic provides a number of chat-based models under the Claude moniker. Note that a Claude Pro membership does not give you the ability to call models via the API; instead, you will need to sign up for (and pay for) a developer account.
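Once you have an API key, it can be supplied through the ANTHROPIC_API_KEY environment variable (see the api_key argument below). A minimal sketch, with a placeholder key value:

# For the current R session only; the key value is a placeholder.
Sys.setenv(ANTHROPIC_API_KEY = "<your-api-key>")
# For a persistent setup, add ANTHROPIC_API_KEY=<your-api-key> to .Renviron,
# e.g. by calling usethis::edit_r_environ().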
Usage
chat_anthropic(
  system_prompt = NULL,
  turns = NULL,
  params = NULL,
  max_tokens = deprecated(),
  model = NULL,
  api_args = list(),
  base_url = "https://api.anthropic.com/v1",
  beta_headers = character(),
  api_key = anthropic_key(),
  echo = NULL
)
Arguments
- system_prompt
A system prompt to set the behavior of the assistant.
- turns
A list of Turns to start the chat with (i.e., continuing a previous conversation). If not provided, the conversation begins from scratch.
- params
Common model parameters, usually created by params() (see the sketch after this list).
- max_tokens
Maximum number of tokens to generate before stopping. Deprecated: use params() instead.
- model
The model to use for the chat. The default, NULL, will pick a reasonable default and tell you which model it chose. We strongly recommend explicitly choosing a model for all but the most casual use.
- api_args
Named list of arbitrary extra arguments appended to the body of every chat API call. Combined with the body object generated by ellmer using modifyList().
- base_url
The base URL to the endpoint; the default uses Anthropic's API.
- beta_headers
Optionally, a character vector of beta headers used to opt in to Claude features that are still in beta.
- api_key
API key to use for authentication. You generally should not supply this directly; instead, set the ANTHROPIC_API_KEY environment variable. The best place to set this is in .Renviron, which you can easily edit by calling usethis::edit_r_environ().
- echo
One of the following options:
- none: don't emit any output (default when running in a function).
- text: echo text output as it streams in (default when running at the console).
- all: echo all input and output.
Note this only affects the chat() method.
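Putting several of these arguments together (a sketch: the model name is taken from the example below, and the system prompt and temperature value are purely illustrative):

library(ellmer)
# Explicit model, a custom system prompt, tuned sampling, and streamed output.
chat <- chat_anthropic(
  system_prompt = "You are a terse assistant.",
  model = "claude-3-7-sonnet-latest",
  params = params(temperature = 0.2),
  echo = "text"
)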
Value
A Chat object.
See also
Other chatbots: chat_aws_bedrock(), chat_azure_openai(), chat_cortex_analyst(), chat_databricks(), chat_deepseek(), chat_github(), chat_google_gemini(), chat_groq(), chat_ollama(), chat_openai(), chat_openrouter(), chat_perplexity()
Examples
chat <- chat_anthropic()
#> Using model = "claude-3-7-sonnet-latest".
chat$chat("Tell me three jokes about statisticians")
#> # Three Jokes About Statisticians
#>
#> 1. Why did the statistician drown crossing the river?
#> It was 3 feet deep on average.
#>
#> 2. A statistician's wife gave birth to twins. He was delighted. He
#> rang the minister who asked, "Are they boys or girls?" "Yes," he
#> replied, "both kinds, and the probability of each is 0.5."
#>
#> 3. How many statisticians does it take to change a light bulb?
#> "Well, technically... one, but we need a sample size of at least 30
#> bulbs to be statistically significant."
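A further sketch showing the turns argument, which continues a previous conversation. This assumes the returned Chat object exposes a get_turns() method for extracting its turns:

# Resume a conversation by passing its turns to a new chat (sketch; assumes
# the Chat object provides get_turns()).
first <- chat_anthropic(system_prompt = "Be terse.")
first$chat("Remember the number 42.")
resumed <- chat_anthropic(turns = first$get_turns())
resumed$chat("What number did I ask you to remember?")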