Ollama
The Ollama integration adds a conversation agent in Home Assistant powered by a local Ollama server.
This conversation agent is unable to control your house; it can only query information that Home Assistant has provided to it. To answer questions about your house, Home Assistant needs to share its details with Ollama, including areas, devices, and their states. The Ollama conversation agent can be used in automations, but not as a sentence trigger.
This integration requires an external Ollama server, which is available for macOS, Linux, and Windows. Follow the download instructions to install the server. Once installed, configure Ollama to be accessible over the network.
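By default, an Ollama server only listens on localhost, so Home Assistant cannot reach it from another machine. One way to expose it over the network is the `OLLAMA_HOST` environment variable; this sketch assumes the default port 11434 and a manually started server (not a systemd service, where the variable would go in the service's environment instead):

```shell
# Ollama listens on 127.0.0.1:11434 by default; setting OLLAMA_HOST
# makes it accept connections from other machines on the network.
OLLAMA_HOST=0.0.0.0 ollama serve

# From the Home Assistant host, verify the server is reachable
# (replace <ollama-host> with the server's address):
curl http://<ollama-host>:11434/api/tags
```

If the server is reachable, the `curl` call returns a JSON list of the models installed on it.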
Configuration
To add the Ollama integration to your Home Assistant instance, use this My button:
If the above My button doesn’t work, you can also perform the following steps manually:
- Browse to your Home Assistant instance.
- Go to Settings > Devices & Services.
- In the bottom right corner, select the Add Integration button.
- From the list, select Ollama.
- Follow the instructions on screen to complete the setup.
Options
Options for Ollama can be set via the user interface, by taking the following steps:
- Browse to your Home Assistant instance.
- Go to Settings > Devices & Services.
- If multiple instances of Ollama are configured, choose the instance you want to configure.
- Select the integration, then select Configure.
- Model: Name of the Ollama model to use, such as `mistral` or `llama2:13b`. Models will be automatically downloaded during setup.
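Setup downloads the chosen model automatically, but you can also pull it ahead of time on the Ollama server so setup does not wait on a large download. A sketch using the `llama2:13b` model named above:

```shell
# Pre-download a model on the Ollama server before configuring
# the integration (model name is just the example from above).
ollama pull llama2:13b

# Confirm which models are available locally:
ollama list
```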
- Prompt template: The starting text for the AI language model to generate new text from. This text can include information about your Home Assistant instance, devices, and areas, and is written using Home Assistant Templating.
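As a sketch, a prompt template might combine fixed instructions with Home Assistant template helpers; this example assumes the standard `now()` and `areas()` template functions:

```
You are a voice assistant for a Home Assistant smart home.
The current time is {{ now().strftime("%H:%M") }}.
The house has the following areas: {{ areas() | join(", ") }}.
Answer questions about the house using only this information.
```

Because the template is rendered each time the agent is invoked, state queries such as `states('sensor.outdoor_temperature')` (a hypothetical entity) always reflect current values.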