Cloudflare Workers AI hosts a variety of open-source AI models. To use the Cloudflare API, you must have an Account ID and an Access Token, which you can obtain by following Cloudflare's instructions.
Built on top of chat_openai_compatible().
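As a quick illustration, here is a minimal sketch of creating a client. It assumes CLOUDFLARE_ACCOUNT_ID and CLOUDFLARE_API_KEY are already set in your environment (for example via .Renviron), and passes the documented default model explicitly.

library(ellmer)

# Minimal sketch: the account ID and credentials are picked up from the
# CLOUDFLARE_ACCOUNT_ID and CLOUDFLARE_API_KEY environment variables.
chat <- chat_cloudflare(
  system_prompt = "You are a terse assistant.",
  model = "meta-llama/Llama-3.3-70b-instruct-fp8-fast"
)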
Arguments
- account
The Cloudflare account ID. Taken from the CLOUDFLARE_ACCOUNT_ID env var, if defined.
- system_prompt
A system prompt to set the behavior of the assistant.
- params
Common model parameters, usually created by params().
- api_key
Deprecated: use credentials instead.
- credentials
Override the default credentials. You generally should not need this argument; instead set the CLOUDFLARE_API_KEY environment variable. The best place to set this is in .Renviron, which you can easily edit by calling usethis::edit_r_environ(). If you do need additional control, this argument takes a zero-argument function that returns either a string (the API key) or a named list (added as additional headers to every request); see the sketch after this argument list.
- model
The model to use for the chat (defaults to "meta-llama/Llama-3.3-70b-instruct-fp8-fast"). We regularly update the default, so we strongly recommend explicitly specifying a model for anything other than casual use.
- api_args
Named list of arbitrary extra arguments appended to the body of every chat API call. Combined with the body object generated by ellmer with modifyList().
- echo
One of the following options:
  - none: don't emit any output (default when running in a function).
  - output: echo text and tool-calling output as it streams in (default when running at the console).
  - all: echo all input and output.
Note this only affects the chat() method.
- api_headers
Named character vector of arbitrary extra headers appended to every chat API call.
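To illustrate several of the arguments above in one call, the following sketch supplies credentials programmatically, sets model parameters, and adds an extra body field. The token variable name and the max_tokens field are assumptions for illustration, not part of ellmer.

chat <- chat_cloudflare(
  model = "meta-llama/Llama-3.3-70b-instruct-fp8-fast",
  # Zero-argument function returning the API key as a string; returning a
  # named list of headers would also work. MY_CF_TOKEN is a hypothetical name.
  credentials = function() Sys.getenv("MY_CF_TOKEN"),
  # Common model parameters created with params()
  params = params(temperature = 0.2),
  # Extra body fields, merged into each request with modifyList()
  api_args = list(max_tokens = 512),
  # Don't stream output from the chat() method
  echo = "none"
)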
Value
A Chat object.
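Continuing the sketch above, the returned Chat object is used through its methods, for example chat():

chat$chat("Is R a functional programming language?")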
See also
Other chatbots:
chat_anthropic(),
chat_aws_bedrock(),
chat_azure_openai(),
chat_databricks(),
chat_deepseek(),
chat_github(),
chat_google_gemini(),
chat_groq(),
chat_huggingface(),
chat_mistral(),
chat_ollama(),
chat_openai(),
chat_openai_compatible(),
chat_openrouter(),
chat_perplexity(),
chat_portkey()
