GitHub (via Azure) hosts a number of open source and OpenAI models. To access the GitHub model marketplace, you will need to apply for and be accepted into the beta access program. See https://github.com/marketplace/models for details.
This function is a lightweight wrapper around chat_openai() with the defaults tweaked for the GitHub model marketplace.
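For orientation, a minimal sketch of calling the wrapper with its defaults. It assumes chat_github() is loaded from the ellmer package (the package name is an assumption here) and that your GitHub credentials are already configured:

library(ellmer)  # assumed package providing chat_github()

# Create a chat client with the GitHub marketplace defaults; with model = NULL
# a reasonable default model is picked for you
chat <- chat_github()

# Send a single prompt; by default the reply streams to the console
chat$chat("Summarise what a system prompt does, in one sentence.")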
Usage
chat_github(
  system_prompt = NULL,
  turns = NULL,
  base_url = "https://models.inference.ai.azure.com/",
  api_key = github_key(),
  model = NULL,
  seed = NULL,
  api_args = list(),
  echo = NULL
)
Arguments
- system_prompt
A system prompt to set the behavior of the assistant.
- turns
A list of Turns to start the chat with (i.e., continuing a previous conversation). If not provided, the conversation begins from scratch.
- base_url
The base URL to the endpoint; the default uses GitHub's Azure-hosted models endpoint.
- api_key
The API key to use for authentication. You generally should not supply this directly; instead, manage your GitHub credentials as described in https://usethis.r-lib.org/articles/git-credentials.html. In headless environments, this will also look in the GITHUB_PAT environment variable.
- model
The model to use for the chat. The default, NULL, picks a reasonable default and tells you about it. We strongly recommend explicitly choosing a model for all but the most casual use (see the sketch after this list).
- seed
Optional integer seed that the model uses to try to make output more reproducible.
- api_args
Named list of arbitrary extra arguments appended to the body of every chat API call.
- echo
One of the following options:
- none: don't emit any output (default when running in a function).
- text: echo text output as it streams in (default when running at the console).
- all: echo all input and output.
Note this only affects the chat() method.
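A sketch that pulls several of these arguments together. The model id "gpt-4o-mini" is purely illustrative and may not match what the marketplace currently offers, and the GITHUB_PAT value is a placeholder:

# In headless environments, authentication can fall back to the GITHUB_PAT
# environment variable, e.g. set in .Renviron (placeholder shown):
#   GITHUB_PAT=<your-personal-access-token>

chat <- chat_github(
  system_prompt = "You are a terse assistant.",
  model = "gpt-4o-mini",  # illustrative id; name the model explicitly
  seed = 1014,            # ask the service for more reproducible output
  echo = "none"           # suppress streaming output from the chat() method
)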
Value
A Chat object.
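The returned Chat object is stateful, so repeated chat() calls continue the same conversation (the accumulated turns are what the turns argument lets you resume from later). A brief sketch:

chat <- chat_github(system_prompt = "Answer in one sentence.")
chat$chat("What is a GitHub personal access token?")
chat$chat("And how should I store it securely?")  # builds on the previous turn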
See also
Other chatbots: chat_bedrock(), chat_claude(), chat_cortex(), chat_databricks(), chat_gemini(), chat_groq(), chat_ollama(), chat_openai(), chat_perplexity()