The Snowflake provider allows you to interact with LLMs available through the Cortex LLM REST API.
Authentication
chat_snowflake() picks up the following ambient Snowflake credentials:

- A static OAuth token defined via the SNOWFLAKE_TOKEN environment variable.
- Key-pair authentication credentials defined via the SNOWFLAKE_USER and SNOWFLAKE_PRIVATE_KEY (which can be a PEM-encoded private key or a path to one) environment variables.
- Posit Workbench-managed Snowflake credentials for the corresponding account.
- Viewer-based credentials on Posit Connect. Requires the connectcreds package.
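For instance, key-pair authentication can be configured by setting the relevant environment variables before calling chat_snowflake(). A minimal sketch; the account, user, and key path below are placeholders, not real values:

```r
library(ellmer)

# Key-pair authentication: chat_snowflake() reads these ambient credentials.
# The account, user, and key path are illustrative placeholders.
Sys.setenv(
  SNOWFLAKE_ACCOUNT = "testorg-test_account",
  SNOWFLAKE_USER = "me@example.com",
  SNOWFLAKE_PRIVATE_KEY = "path/to/rsa_key.p8"
)

chat <- chat_snowflake()
```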
Known limitations
Note that Snowflake-hosted models do not support images, tool calling, or structured outputs.
See chat_cortex() to chat with the Snowflake Cortex Analyst rather than a general-purpose model.
Arguments
- system_prompt
A system prompt to set the behavior of the assistant.
- turns
A list of Turns to start the chat with (i.e., continuing a previous conversation). If not provided, the conversation begins from scratch.
- account
A Snowflake account identifier, e.g. "testorg-test_account". Defaults to the value of the SNOWFLAKE_ACCOUNT environment variable.
- credentials
A list of authentication headers to pass into httr2::req_headers(), a function that returns them when called, or NULL, the default, to use ambient credentials.
- model
The model to use for the chat. The default, NULL, will pick a reasonable default and tell you about it. We strongly recommend explicitly choosing a model for all but the most casual use.
- api_args
Named list of arbitrary extra arguments appended to the body of every chat API call.
- echo
One of the following options:
  - none: don't emit any output (default when running in a function).
  - text: echo text output as it streams in (default when running at the console).
  - all: echo all input and output.
Note this only affects the chat() method.
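As a sketch of how these arguments fit together (the model name and the token-based credentials function are assumptions for illustration; check which models your Snowflake account actually exposes):

```r
library(ellmer)

# Explicitly choose a model rather than relying on the default;
# the model name here is illustrative only.
chat <- chat_snowflake(
  system_prompt = "You are a terse assistant.",
  account = "testorg-test_account",
  credentials = function() {
    # Return a list of authentication headers for httr2::req_headers()
    list(Authorization = paste("Bearer", Sys.getenv("SNOWFLAKE_TOKEN")))
  },
  model = "mistral-large2",
  echo = "none"
)

chat$chat("What is Snowflake Cortex?")
```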
Value
A Chat object.