TwinTwinIndustries

Just a twin that makes games for twins.

This is a tutorial for installing Open WebUI on Ubuntu. It covers installing Docker and running Open WebUI alongside Ollama. It also covers the Docker issue that stops you from installing models, since I had a lot of trouble pulling models from Ollama.

Open WebUI is software that gives you a web interface for your Ollama models, so instead of using a terminal you can access them from a browser on any device on your network.

Here is the original documentation I used.

Installing Docker


  1. Open your terminal.

  2. Set up Docker's apt repository:

    Terminal Code

    sudo apt-get update
    sudo apt-get install ca-certificates curl
    sudo install -m 0755 -d /etc/apt/keyrings
    sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc
    sudo chmod a+r /etc/apt/keyrings/docker.asc
    echo \
      "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu \
      $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | \
      sudo tee /etc/apt/sources.list.d/docker.list > /dev/null

  3. Install Docker Engine:

    Terminal Code

    sudo apt-get update
    sudo apt-get install docker-ce docker-ce-cli containerd.io docker-compose-plugin

  4. Verify Docker Installation:

    Terminal Code

    sudo docker run hello-world
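
Optionally, if you don't want to type sudo in front of every docker command, the usual post-install step is to add your user to the docker group and then log out and back in (the rest of this guide keeps using sudo either way):

Terminal Code

sudo usermod -aG docker $USER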
          

Install and Verify Ollama


  1. Download Ollama from https://ollama.com/ and install it for Linux (see the commands after this list).

  2. Verify Ollama Installation:
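
    On Linux, the download page gives you a one-line install script (the command below is what it shows at the time of writing). To verify the install, check the version and make sure the Ollama server answers on its default port, 11434:

    Terminal Code

    curl -fsSL https://ollama.com/install.sh | sh
    ollama --version
    curl http://127.0.0.1:11434

    The last command should come back with a short "Ollama is running" message.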

Quick Start with Docker 🐳

Follow these steps to install Open WebUI with Docker.

Step 1: Pull the Open WebUI Image

Start by pulling the latest Open WebUI Docker image from the GitHub Container Registry.

Terminal Code

docker pull ghcr.io/open-webui/open-webui:main
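
If you want to double-check that the image downloaded, you can list it by name:

Terminal Code

sudo docker images ghcr.io/open-webui/open-webui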
          

Step 2: Run the Container

Run the container with default settings. This command includes a volume mapping to ensure persistent data storage.

Terminal Code

docker run -d -p 3000:8080 -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main
          
  • Volume Mapping (-v open-webui:/app/backend/data): Ensures persistent storage of your data. This prevents data loss between container restarts.
  • Port Mapping (-p 3000:8080): Exposes the WebUI on port 3000 of your local machine.
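
Once the container is running, Open WebUI should be reachable in your browser at http://localhost:3000 (from the port mapping above). You can confirm the container is up with:

Terminal Code

sudo docker ps --filter name=open-webui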

Issues Installing Models


I had issues installing models: Open WebUI wouldn't let me pull anything from Ollama, so this is what I did:
  1. Remove the existing Open WebUI container. If you try to run the container again under the same name, Docker refuses and prints the ID of the existing container ('the long string of numbers and letters'):

    Terminal Code

    sudo docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main

    Then remove that container with sudo docker rm, stopping it first with sudo docker stop if it is still running:

    Terminal Code

    sudo docker rm "The Long string"

  2. Run the container again with host networking and OLLAMA_BASE_URL set, so the WebUI can reach Ollama directly (I haven't verified whether every flag here is strictly necessary; a combined version of steps 1 and 2 is sketched after this list):

    Terminal Code

    sudo docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  3. Then follow this GIF:

    Tutorial GIF


  4. Now you should be able to install any of the models Ollama has to offer. Check them out at http://ollama.com/library.
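
If the error message with the long container ID is hard to find, here is a rough combined version of steps 1 and 2 that does the same cleanup by container name instead (open-webui, the name used in the commands above) and then checks that Ollama is reachable on its default port:

Terminal Code

# stop and remove the old container by name instead of by ID
sudo docker stop open-webui
sudo docker rm open-webui
# re-create it with host networking so it can talk to Ollama on 127.0.0.1:11434
sudo docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
# quick check that the Ollama server answers
curl http://127.0.0.1:11434

Note that with --network=host the -p 3000:8080 mapping no longer applies, so the WebUI should now be reachable on port 8080 instead of 3000.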