This is a tutorial for installing Open WebUI on Ubuntu. It covers installing Docker, Ollama, and Open WebUI, and it also covers the Docker networking error that stops you from installing models, which I ran into repeatedly when pulling models from Ollama.
Open WebUI is software that lets you use Ollama models through a web interface instead of a terminal, meaning you can access them remotely from any device.
Here is the original documentation I have used.
sudo apt-get update
sudo apt-get install ca-certificates curl
sudo install -m 0755 -d /etc/apt/keyrings
sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
sudo chmod a+r /etc/apt/keyrings/docker.asc
echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu \
  $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | \
  sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
sudo apt-get update
sudo apt-get install docker-ce docker-ce-cli containerd.io docker-compose-plugin
sudo docker run hello-world
Download Ollama from https://ollama.com/. On Linux, the site provides a one-line install script (check the site for the current command): curl -fsSL https://ollama.com/install.sh | sh
Open a browser and navigate to http://127.0.0.1:11434/. If Ollama is running, the page shows the message "Ollama is running".
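If you would rather check from a script whether the Ollama API is reachable (useful later for diagnosing the model-install error), here is a minimal Python sketch; the helper name is my own, not from the original guide:

```python
# Quick connectivity check for an Ollama server (helper name is illustrative).
# Open WebUI talks to Ollama over plain HTTP, so a simple GET against the
# base URL is enough to tell whether the API answers at all.
from urllib.request import urlopen
from urllib.error import URLError


def is_ollama_reachable(base_url: str, timeout: float = 3.0) -> bool:
    """Return True if an HTTP server answers at base_url with status 200."""
    try:
        with urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (URLError, OSError):
        # Connection refused, DNS failure, timeout, etc.
        return False


if __name__ == "__main__":
    # Note: inside a bridge-networked container, 127.0.0.1 is the container
    # itself, not the host -- which is exactly why the default `docker run`
    # below cannot see Ollama.
    print(is_ollama_reachable("http://127.0.0.1:11434/"))
```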
Follow these steps to install Open WebUI with Docker.
Start by pulling the latest Open WebUI Docker image from the GitHub Container Registry.
docker pull ghcr.io/open-webui/open-webui:main
Run the container with default settings. This command includes a volume mapping to ensure persistent data storage.
docker run -d -p 3000:8080 -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main
Volume mapping (-v open-webui:/app/backend/data): ensures persistent storage of your data, preventing data loss between container restarts.
Port mapping (-p 3000:8080): exposes the WebUI on port 3000 of your local machine.
With the default bridge network, however, 127.0.0.1 inside the container refers to the container itself, not to your machine, so Open WebUI cannot reach Ollama and you cannot install models. The fix is to remove the container and re-run it with host networking, pointing OLLAMA_BASE_URL at the local Ollama server.
sudo docker rm -f open-webui
Instead of the name, you can also pass the container ID, the long string shown by sudo docker ps -a.
sudo docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
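The same host-network command can be captured in a Compose file, which keeps the flags in one place and is easier to re-create later. A sketch of what this might look like (file layout assumed, not from the original guide):

```yaml
# docker-compose.yml -- equivalent of the host-network `docker run` above
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    network_mode: host            # lets the container reach Ollama on 127.0.0.1:11434
    environment:
      - OLLAMA_BASE_URL=http://127.0.0.1:11434
    volumes:
      - open-webui:/app/backend/data
    restart: always

volumes:
  open-webui:
```

Start it with: sudo docker compose up -d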