LLM
The llm verb connects a call to an AI language model.
Currently, OpenAI's Realtime API is the only supported provider; support for other LLMs will be rolled out shortly.
session.llm({
  "vendor": "openai",
  "model": "gpt-4o-realtime-preview-2024-10-01",
  "auth": {
    "apiKey": "YOUR_API_KEY"
  },
  "actionHook": "/final",
  "eventHook": "/event",
  "toolHook": "/toolCall",
  "events": [
    "conversation.item.*",
    "response.audio_transcript.done",
    "input_audio_buffer.committed"
  ],
  "llmOptions": {
    "response_create": {
      "modalities": ["text", "audio"],
      "instructions": "Please assist the user with their request.",
      "voice": "alloy",
      "output_audio_format": "pcm16",
      "temperature": 0.8,
      "max_output_tokens": 4096
    },
    "session_update": {
      "tools": [
        {
          "name": "get_weather",
          "type": "function",
          "description": "Get the weather at a given location",
          "parameters": {
            "type": "object",
            "properties": {
              "location": {
                "type": "string",
                "description": "Location to get the weather from"
              },
              "scale": {
                "type": "string",
                "enum": ["fahrenheit", "celsius"]
              }
            },
            "required": ["location", "scale"]
          }
        }
      ],
      "tool_choice": "auto",
      "input_audio_transcription": {
        "model": "whisper-1"
      },
      "turn_detection": {
        "type": "server_vad",
        "threshold": 0.8,
        "prefix_padding_ms": 300,
        "silence_duration_ms": 500
      }
    }
  }
});
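When the model decides to invoke the get_weather function defined above, the toolHook endpoint ("/toolCall" in this example) is called. The sketch below shows one way such a handler might look using a plain Express server; the request body shape ({ name, args }) and the JSON reply format are assumptions for illustration, so verify them against the request your application actually receives.

```js
const express = require('express');
const app = express();
app.use(express.json());

// Hypothetical handler for the "toolHook": "/toolCall" endpoint above.
// Payload and reply shapes are assumptions, not a documented contract.
app.post('/toolCall', (req, res) => {
  const { name, args } = req.body || {};
  if (name === 'get_weather') {
    const { location, scale } = args || {};
    // Stubbed lookup; a real handler would call a weather API here.
    const temperature = scale === 'celsius' ? 22 : 72;
    return res.json({ location, scale, temperature });
  }
  res.status(400).json({ error: `unsupported tool: ${name}` });
});

app.listen(3000);
```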
You can use the following attributes with the llm verb:
| Option | Description | Required |
|---|---|---|
| vendor | Name of the LLM vendor. | Yes |
| model | Name of the LLM model. | Yes |
| auth | Object containing authentication credentials; format varies depending on the model (see below). | No |
| connectOptions | Object containing information such as the URI to connect to. | No |
| actionHook | Webhook that will be called when the LLM session ends. | No |
| eventHook | Webhook that will be called when a requested LLM event occurs (e.g., transcript). | No |
| toolHook | Webhook that will be called when the LLM wants to call a function. | No |
| events | Array of event names listing the events requested (wildcards allowed). | No |
| llmOptions | Object containing instructions for the LLM; format depends on the LLM model. | No |
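As a rough illustration of the eventHook and actionHook attributes, the sketch below logs the transcript events requested in the example above and acknowledges the end-of-session callback. The endpoint paths match the example; the event payload field names (type, transcript) are assumptions based on the OpenAI Realtime event names and should be checked against the events you actually receive.

```js
const express = require('express');
const app = express();
app.use(express.json());

// Hypothetical eventHook endpoint ("/event" in the example above); the payload
// field names used here are assumptions, not a documented contract.
app.post('/event', (req, res) => {
  const evt = req.body || {};
  if (evt.type === 'response.audio_transcript.done') {
    console.log('assistant transcript:', evt.transcript);
  }
  res.sendStatus(200);
});

// Hypothetical actionHook endpoint ("/final"), called when the LLM session ends.
app.post('/final', (req, res) => {
  console.log('LLM session ended:', req.body);
  res.sendStatus(200);
});

app.listen(3001);
```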