Using your own custom LLM
Use Bolna Voice AI with your own custom LLM
We expect your custom LLM to be an OpenAI-compatible server.
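To make that expectation concrete, here is a minimal sketch (standard library only) of what an OpenAI-compatible chat-completions endpoint could look like. This is an illustrative stub, not part of Bolna: the model name, port, and echo-style reply logic are placeholders.

```python
# Minimal sketch of an OpenAI-compatible chat-completions server using only
# the Python standard library. The response body follows the OpenAI
# "chat.completion" shape; the model name and reply text are placeholders.
import json
import time
import uuid
from http.server import BaseHTTPRequestHandler, HTTPServer


def build_completion(reply_text, model="my-custom-llm"):
    """Build a response body in the OpenAI chat.completion shape."""
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex[:12]}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": reply_text},
                "finish_reason": "stop",
            }
        ],
    }


class ChatHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/v1/chat/completions":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length) or b"{}")
        # Echo-style stub: a real server would run model inference here.
        last_user = next(
            (m["content"] for m in reversed(request.get("messages", []))
             if m.get("role") == "user"),
            "",
        )
        body = json.dumps(build_completion(f"You said: {last_user}")).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


# To try it locally, uncomment the line below and point the dashboard's
# LLM URL at http://<your-host>:8000/v1
# HTTPServer(("0.0.0.0", 8000), ChatHandler).serve_forever()
```

Any server that accepts `POST /v1/chat/completions` and answers in this shape should satisfy the compatibility requirement.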
Adding your Custom LLM using the dashboard
- Click on the LLM select dropdown as shown in the image.
- From the dropdown, click on "Add your own LLM".
- A dialog box will be displayed. Fill in the following details:
  - LLM URL: the endpoint of your custom LLM
  - LLM Name: a name for your custom LLM
  Then click on "Add Custom LLM" to connect this LLM to Bolna.
- Refresh the page.
- In the LLM settings tab, choose "Custom" in the first dropdown (the LLM provider).
- In the LLM settings tab, you'll now see your custom LLM model name appearing. Select it and save the agent.
Following the above steps ensures the agent uses your custom LLM URL.
Demo video
Here’s a working video highlighting the flow:
Add custom LLM demo video
Adding your Custom LLM using APIs
You can also connect and use your own custom LLM via the Create Agent API.
For a custom LLM, simply set `provider` in the `llm_agent` key to `custom` and add an OpenAI-compatible `base_url`.
"llm_agent": {
"max_tokens": 100.0,
"presence_penalty": 0.0,
"base_url": "https://custom.llm.model/v1", # add your custom LLM base_url
"extraction_details": null,
"top_p": 0.9,
"agent_flow_type": "streaming",
"request_json": false,
"routes": null,
"min_p": 0.1,
"frequency_penalty": 0.0,
"stop": null,
"provider": "custom", # add provider = custom
"top_k": 0.0,
"temperature": 0.2,
"model": "custom-llm-model", # add a name for your custom LLM
"family": "llama"
}
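A sketch of sending this payload to the Create Agent API is below (standard library only). The endpoint path and bearer-token header are assumptions for illustration; check the Create Agent API reference for the exact URL, authentication, and the other fields the full agent payload requires.

```python
# Sketch of creating an agent with a custom LLM via the Create Agent API.
# The endpoint path and auth scheme below are placeholders, not the
# documented Bolna values; only the llm_agent fields mirror the doc above.
import json
import urllib.request

llm_agent = {
    "provider": "custom",                        # must be "custom"
    "base_url": "https://custom.llm.model/v1",   # your OpenAI-compatible server
    "model": "custom-llm-model",                 # a name for your custom LLM
    "family": "llama",
    "agent_flow_type": "streaming",
    "max_tokens": 100,
    "temperature": 0.2,
}


def create_agent(api_base, token, llm_agent):
    """POST the agent config; payload shape beyond llm_agent is simplified."""
    body = json.dumps({"llm_agent": llm_agent}).encode()
    req = urllib.request.Request(
        f"{api_base}/agent",  # hypothetical path, see the API reference
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

The key point is that `provider` stays `"custom"` while `base_url` points at your own server; everything else follows the normal Create Agent flow.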