The Snowflake provider allows you to interact with large language models available through the Cortex LLM REST API.
Authentication
`chat_snowflake()` picks up the following ambient Snowflake credentials:

- A static OAuth token defined via the `SNOWFLAKE_TOKEN` environment variable.
- Key-pair authentication credentials defined via the `SNOWFLAKE_USER` and `SNOWFLAKE_PRIVATE_KEY` (which can be a PEM-encoded private key or a path to one) environment variables.
- Posit Workbench-managed Snowflake credentials for the corresponding `account`.
- Viewer-based credentials on Posit Connect. Requires the connectcreds package.
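As an illustration, here is a minimal sketch of opening a chat using ambient credentials. It assumes `SNOWFLAKE_ACCOUNT` and `SNOWFLAKE_TOKEN` are already set in the environment; the prompt text is purely a placeholder.

```r
library(ellmer)

# Assumes SNOWFLAKE_ACCOUNT and SNOWFLAKE_TOKEN are set in the
# environment, so chat_snowflake() can pick them up automatically.
chat <- chat_snowflake()

# Placeholder prompt; any text works here.
chat$chat("Summarize what Snowflake Cortex is in one sentence.")
```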
Known limitations
Note that Snowflake-hosted models do not support images or tool calling.
See `chat_cortex_analyst()` to chat with the Snowflake Cortex Analyst rather than a general-purpose model.
Arguments
- `system_prompt`: A system prompt to set the behavior of the assistant.
- `account`: A Snowflake account identifier, e.g. `"testorg-test_account"`. Defaults to the value of the `SNOWFLAKE_ACCOUNT` environment variable.
- `credentials`: A list of authentication headers to pass into `httr2::req_headers()`, a function that returns them when called, or `NULL` (the default) to use ambient credentials.
- `model`: The model to use for the chat (defaults to `"claude-3-7-sonnet"`). We regularly update the default, so we strongly recommend explicitly specifying a model for anything other than casual use.
- `params`: Common model parameters, usually created by `params()`.
- `api_args`: Named list of arbitrary extra arguments appended to the body of every chat API call. Combined with the body object generated by ellmer using `modifyList()`.
- `echo`: One of the following options:
  - `"none"`: don't emit any output (default when running in a function).
  - `"output"`: echo text and tool-calling output as it streams in (default when running at the console).
  - `"all"`: echo all input and output.

  Note this only affects the `chat()` method.
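Putting the arguments together, here is a hedged sketch of a more explicit configuration. The account identifier and system prompt are placeholders, and the model is pinned explicitly per the recommendation above.

```r
library(ellmer)

chat <- chat_snowflake(
  system_prompt = "You are a concise assistant.",  # placeholder prompt
  account = "testorg-test_account",                # placeholder identifier
  model = "claude-3-7-sonnet",                     # pin the model explicitly
  params = params(temperature = 0.2),
  echo = "none"                                    # suppress streaming output
)
```

Pinning `model` keeps behavior stable if the package's default model changes in a future release.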
Value
A Chat object.