Chat with local LLMs using Open WebUI
Open WebUI provides an intuitive interface for managing Large Language Models (LLMs) and supports both Ollama and OpenAI-compatible APIs.
On Olares, you can install Ollama and chat with a local LLM through Open WebUI.
Learning objectives
- Download models directly from the Open WebUI interface.
- Start your first local AI conversation.
Prerequisites
Hardware
- Olares One connected to a stable network.
- Sufficient disk space to download models.
User permissions
- Admin privileges to install Ollama from the Market.
Step 1: Install Ollama and Open WebUI
Open Market and search for "Ollama".

Click Get, then Install, and wait for installation to complete.
Repeat the same steps to install "Open WebUI".

Step 2: Create Open WebUI admin account
- Open the Open WebUI app.
- On the welcome page, click Get started.
- Enter your name, email, and password to create the account.

INFO
All your data, including login details, is stored locally on your device.
First account is Admin
The first account created on Open WebUI is granted administrator privileges, including control over user management and system settings.
Step 3: Download models
Because the Ollama API is pre-configured for Open WebUI on Olares OS, you can download models directly within the Open WebUI interface, without using the command line:
- On the main chat page, click the model selector in the top-left.
- Enter the name of the model you want to download.
- Select Pull from Ollama.com and wait for the download to complete.
Check Ollama library
If you are unsure which model to download, check the Ollama Library to explore available models.
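As an alternative to the dropdown, the same local Ollama API can pull models programmatically. Below is a minimal Python sketch; it assumes Ollama's default REST endpoint (`http://localhost:11434`) and uses a placeholder model name, so adjust both for your setup:

```python
import json
import urllib.request

# Assumption: Ollama is listening on its default port 11434.
OLLAMA_URL = "http://localhost:11434"


def progress_line(event: dict) -> str:
    """Render one streamed pull event as a short progress string.

    Ollama streams JSON objects such as
    {"status": "pulling ...", "total": 1000, "completed": 250}.
    """
    status = event.get("status", "")
    total = event.get("total")
    completed = event.get("completed")
    if total and completed is not None:
        pct = 100 * completed / total
        return f"{status}: {pct:.0f}%"
    return status


def pull_model(name: str) -> None:
    """Stream a model download through the Ollama REST API (/api/pull)."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/pull",
        data=json.dumps({"name": name}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        for raw in resp:  # one JSON object per line
            print(progress_line(json.loads(raw)))


# Uncomment with Ollama running; the model name is only an example.
# pull_model("llama3.2")
```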
Step 4: Chat with your local LLM
- On the main chat page, click the model selector in the top-left and choose the model you just downloaded.
- Enter your prompt in the text box and press Enter to start chatting.
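The chat interface talks to the same local model, so you can also reach it from your own scripts. A hedged sketch of a non-streaming chat request, again assuming Ollama's default endpoint and route (`/api/chat`) with a placeholder model name:

```python
import json
import urllib.request

# Assumption: Ollama's default endpoint; substitute the model you pulled.
OLLAMA_URL = "http://localhost:11434"


def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming chat request for the Ollama REST API."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/chat",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )


def chat(model: str, prompt: str) -> str:
    """Send one prompt and return the assistant's reply text."""
    with urllib.request.urlopen(build_chat_request(model, prompt)) as resp:
        return json.load(resp)["message"]["content"]


# Requires a running Ollama with the model already downloaded:
# print(chat("llama3.2", "Hello!"))
```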

Troubleshooting
Download progress disappears
When downloading a model from the dropdown menu, the progress bar may disappear before the download completes.
To resume the download:
- Click the model selector again.
- Enter the exact same model name.
- Select Pull from Ollama.com. The download will resume from where it left off.
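To confirm that a download actually finished, you can query the Ollama API for the models already on disk. This sketch assumes the `/api/tags` route on the default endpoint and only demonstrates parsing its response shape:

```python
import json

def downloaded_models(tags_response: str) -> list[str]:
    """Extract model names from an /api/tags JSON response body."""
    return [m["name"] for m in json.loads(tags_response).get("models", [])]


# With Ollama running, the response body would come from, e.g.:
#   urllib.request.urlopen("http://localhost:11434/api/tags").read()
sample = '{"models": [{"name": "llama3.2:latest"}]}'
print(downloaded_models(sample))  # ['llama3.2:latest']
```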