Sign up at https://www.perplexity.ai.
Perplexity AI is a platform for running LLMs that can search the web in real time, allowing them to answer questions with information that may not have been available when the model was trained.
This function is a lightweight wrapper around chat_openai() with the defaults tweaked for Perplexity AI.
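As a minimal usage sketch, assuming this page documents chat_perplexity() from the ellmer package and that your API key is available via the PERPLEXITY_API_KEY environment variable:

library(ellmer)

# Create a chat object; specifying the model explicitly is recommended
# over relying on the default, which changes over time
chat <- chat_perplexity(
  system_prompt = "You are a concise research assistant.",
  model = "llama-3.1-sonar-small-128k-online"
)

# Ask a question that benefits from real-time web search
chat$chat("Summarise this week's major AI news in three bullet points.")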
Arguments
- system_prompt
A system prompt to set the behavior of the assistant.
- base_url
The base URL to the endpoint; the default uses Perplexity AI's API.
- api_key
API key to use for authentication. You generally should not supply this directly, but instead set the PERPLEXITY_API_KEY environment variable. The best place to set this is in .Renviron, which you can easily edit by calling usethis::edit_r_environ().
- model
The model to use for the chat (defaults to "llama-3.1-sonar-small-128k-online"). We regularly update the default, so we strongly recommend explicitly specifying a model for anything other than casual use.
- seed
Optional integer seed that the model uses to try to make output more reproducible.
- api_args
Named list of arbitrary extra arguments appended to the body of every chat API call. These are combined with the body object generated by ellmer using modifyList(). See the example after this list.
- echo
One of the following options: "none" (don't emit any output; the default when running in a function), "output" (echo text and tool-calling output as it streams in; the default when running at the console), or "all" (echo all input and output). Note this only affects the chat() method.
- api_headers
Named character vector of arbitrary extra headers appended to every chat API call.
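A sketch of how api_args, echo, and api_headers fit together; the body field and header name below are illustrative assumptions, not parameters required by the Perplexity API:

library(ellmer)

# Extra body fields are merged into every request with modifyList();
# temperature is an illustrative OpenAI-style body parameter
chat <- chat_perplexity(
  model = "llama-3.1-sonar-small-128k-online",
  echo = "output",                               # stream text and tool-calling output to the console
  api_args = list(temperature = 0.2),
  api_headers = c("X-Example-Header" = "demo")   # hypothetical header, for illustration only
)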
Value
A Chat object.
See also
Other chatbots: chat_anthropic(), chat_aws_bedrock(), chat_azure_openai(), chat_cloudflare(), chat_databricks(), chat_deepseek(), chat_github(), chat_google_gemini(), chat_groq(), chat_huggingface(), chat_mistral(), chat_ollama(), chat_openai(), chat_openrouter(), chat_portkey()