To use chat_ollama(), first download and install Ollama. Then install one or more models, either from the command line (e.g. with ollama pull llama3.1) or from within R using the ollamar package (e.g. ollamar::pull("llama3.1")).
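For example, assuming Ollama is already installed and its local server is running on the default port, pulling and listing models from R might look like this (a sketch using the ollamar package):

```r
# Pull a model through the ollamar package
# (assumes a local Ollama server is reachable)
ollamar::pull("llama3.1")

# Check which models are available locally
ollamar::list_models()
```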

Built on top of chat_openai_compatible().

Known limitations

  • Tool calling is not supported with streaming (i.e. when echo is "output" or "all").

  • Models default to a 2048-token input context, and there's no way to increase it from ellmer, except by creating a custom model with a different default.

  • Tool calling generally seems quite weak, at least with the models I have tried it with.

Usage

chat_ollama(
  system_prompt = NULL,
  base_url = Sys.getenv("OLLAMA_BASE_URL", "http://localhost:11434"),
  model,
  params = NULL,
  api_args = list(),
  echo = NULL,
  api_key = NULL,
  credentials = NULL,
  api_headers = character()
)

models_ollama(base_url = "http://localhost:11434", credentials = NULL)

Arguments

system_prompt

A system prompt to set the behavior of the assistant.

base_url

The base URL to the endpoint. The default is the local Ollama server, http://localhost:11434, or the value of the OLLAMA_BASE_URL environment variable if set.

model

The model to use for the chat. Use models_ollama() to see all options.

params

Common model parameters, usually created by params().
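As a sketch, assuming a local server with llama3.1 already pulled, sampling settings can be tuned through params():

```r
# Lower temperature for more deterministic output and cap the
# response length; params() maps these onto the provider's API
chat <- chat_ollama(
  model = "llama3.1",
  params = params(temperature = 0.2, max_tokens = 500)
)
```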

api_args

Named list of arbitrary extra arguments appended to the body of every chat API call. Combined with the body object generated by ellmer with modifyList().
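For body fields that params() does not cover, api_args passes them through verbatim. A minimal sketch, assuming the model llama3.1 is available and using the standard stop field from the OpenAI-compatible chat API:

```r
# Add a stop sequence to every request body
chat <- chat_ollama(
  model = "llama3.1",
  api_args = list(stop = c("\n\n"))
)
```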

echo

One of the following options:

  • none: don't emit any output (default when running in a function).

  • output: echo text and tool-calling output as it streams in (default when running at the console).

  • all: echo all input and output.

Note this only affects the chat() method.

api_key

[Deprecated] Use credentials instead.

credentials

Ollama doesn't require credentials for local usage and in most cases you do not need to provide credentials.

However, if you're accessing an Ollama instance hosted behind a reverse proxy or secured endpoint that enforces bearer-token authentication, you can set the OLLAMA_API_KEY environment variable or provide a callback function to credentials.
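A minimal sketch of both approaches, assuming a hypothetical proxied endpoint at https://ollama.example.com and that credentials accepts a zero-argument function returning the token (consult the ellmer documentation for the exact shape it expects):

```r
# Option 1: set the environment variable before creating the chat
Sys.setenv(OLLAMA_API_KEY = "my-secret-token")
chat <- chat_ollama(
  base_url = "https://ollama.example.com",  # hypothetical proxied endpoint
  model = "llama3.1"
)

# Option 2: supply a callback, useful when the token is fetched dynamically
chat <- chat_ollama(
  base_url = "https://ollama.example.com",
  model = "llama3.1",
  credentials = function() "my-secret-token"
)
```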

api_headers

Named character vector of arbitrary extra headers appended to every chat API call.

Value

A Chat object.

Examples

if (FALSE) { # \dontrun{
chat <- chat_ollama(model = "llama3.2")
chat$chat("Tell me three jokes about statisticians")
} # }