
PortkeyAI provides an AI Gateway that connects to a variety of LLM providers through a single Universal API endpoint.

Authentication

API keys and LLM provider configurations are stored inside the Portkey application; locally you only need a Portkey API key and the name of a virtual key.
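
As a sketch, once a provider has been configured in Portkey, it is referenced by the virtual key's name (the name "openai-prod" below is purely illustrative; you create and name virtual keys in the Portkey dashboard):

# The provider's own API key lives in Portkey; locally you only need the
# Portkey API key (PORTKEY_API_KEY) plus the name of a virtual key.
chat <- chat_portkey(virtual_key = "openai-prod")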

Usage

chat_portkey(
  system_prompt = NULL,
  base_url = "https://api.portkey.ai/v1",
  api_key = portkeyai_key(),
  virtual_key = NULL,
  model = NULL,
  params = NULL,
  api_args = list(),
  echo = NULL
)

models_portkey(
  base_url = "https://api.portkey.ai/v1",
  api_key = portkeyai_key(),
  virtual_key = NULL
)

Arguments

system_prompt

A system prompt to set the behavior of the assistant.

base_url

The base URL to the endpoint; the default uses Portkey's AI Gateway.

api_key

API key to use for authentication.

You generally should not supply this directly, but instead set the PORTKEY_API_KEY environment variable. The best place to set this is in .Renviron, which you can easily edit by calling usethis::edit_r_environ().
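
For example, a .Renviron entry might look like the following (the key value is a placeholder):

# Open .Renviron with usethis::edit_r_environ() and add a line such as:
# PORTKEY_API_KEY=pk-xxxxxxxxxxxxxxxx
# Restart R so the variable is picked up, then check it:
Sys.getenv("PORTKEY_API_KEY")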

virtual_key

A virtual key that identifies the LLM provider and its API key stored in Portkey. See the Portkey documentation for details.

model

The model to use for the chat (defaults to "gpt-4o"). We regularly update the default, so we strongly recommend explicitly specifying a model for anything other than casual use. Use models_portkey() to see all options.

params

Common model parameters, usually created by params().
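
For instance, a sketch using ellmer's params() helper (the specific parameters shown, temperature and max_tokens, are illustrative; use whichever your provider supports):

chat <- chat_portkey(
  virtual_key = Sys.getenv("PORTKEY_VIRTUAL_KEY"),
  model = "gpt-4o",
  params = params(temperature = 0.2, max_tokens = 500)
)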

api_args

Named list of arbitrary extra arguments appended to the body of every chat API call. These are combined with the body generated by ellmer using modifyList().
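
As a hedged sketch, an extra body field could be passed like this (the "user" field is an illustrative OpenAI-style argument; any field you pass must be accepted by the underlying provider):

chat <- chat_portkey(
  virtual_key = Sys.getenv("PORTKEY_VIRTUAL_KEY"),
  # merged into the request body via modifyList()
  api_args = list(user = "analytics-team")
)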

echo

One of the following options:

  • none: don't emit any output (default when running in a function).

  • text: echo text output as it streams in (default when running at the console).

  • all: echo all input and output.

Note this only affects the chat() method.

Value

A Chat object.

Examples

if (FALSE) { # \dontrun{
chat <- chat_portkey(virtual_key = Sys.getenv("PORTKEY_VIRTUAL_KEY"))
chat$chat("Tell me three jokes about statisticians")
} # }
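
A further sketch, assuming the virtual key name is stored in the PORTKEY_VIRTUAL_KEY environment variable; models_portkey() lists the models available through that provider:

if (FALSE) { # \dontrun{
# List models available through the configured provider
models_portkey(virtual_key = Sys.getenv("PORTKEY_VIRTUAL_KEY"))

# Set a system prompt and pick an explicit model
chat <- chat_portkey(
  system_prompt = "You are a terse assistant.",
  virtual_key = Sys.getenv("PORTKEY_VIRTUAL_KEY"),
  model = "gpt-4o"
)
chat$chat("Summarise Portkey's AI Gateway in one sentence")
} # }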