Container Wonderland: Running Open-WebUI and Ollama Smoothly

Running Open-WebUI with Docker: A Comprehensive Guide

If you’re looking to deploy Open-WebUI and Ollama using Docker, you’re in the right place. Docker is a powerful tool for containerization that allows developers to package applications and their dependencies into a single unit, ensuring consistency across different environments. This guide will walk you through the process of running Open-WebUI and Ollama using Docker commands.

Prerequisites

Before we begin, make sure you have the following installed on your machine:

  1. Docker: You can download and install Docker from the official Docker website (https://docs.docker.com/get-docker/).
  2. Basic Knowledge of Docker: Familiarity with Docker commands and concepts will be helpful.
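
Before moving on, you can confirm that Docker is installed and the daemon is running:

docker --version
docker info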

Step 1: Running Open-WebUI

First, let’s run the Open-WebUI container. The command below does the following:

  • Adds a host entry (--add-host=host.docker.internal:host-gateway): This makes the hostname host.docker.internal resolve to the Docker host from inside the container.
  • Maps Ports (-p 8181:8080 and -p 3000:8181): Docker's -p flag takes host:container order, so host port 8181 forwards to the container's port 8080 (where Open-WebUI listens), and host port 3000 forwards to the container's port 8181.
  • Mounts a Volume (-v open-webui:/app/backend/data): This creates a named volume open-webui and mounts it to /app/backend/data in the container, allowing data persistence.
  • Sets Environment Variables (-e OLLAMA_BASE_URL=http://host.docker.internal:11434 and -e MAIN_LOG_LEVEL=debug): Points Open-WebUI at the Ollama API and sets the logging level to debug. Inside the container, localhost would refer to the container itself, so the base URL uses host.docker.internal (made resolvable by the --add-host flag above) to reach the Ollama container that Step 2 publishes on host port 11434.
  • Names the Container (--name open-webui): Assigns the name open-webui to the container.
  • Specifies the Image (ghcr.io/open-webui/open-webui:main): Uses the Open-WebUI image from the GitHub Container Registry.
docker run --add-host=host.docker.internal:host-gateway \
  -p 8181:8080 -p 3000:8181 \
  -v open-webui:/app/backend/data \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -e MAIN_LOG_LEVEL=debug \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
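
The command above runs in the foreground, which is handy for watching the logs on first start. If you would rather run Open-WebUI in the background like the Ollama container in the next step, the same command can be started detached with a restart policy, for example:

docker run -d --restart always \
  --add-host=host.docker.internal:host-gateway \
  -p 8181:8080 -p 3000:8181 \
  -v open-webui:/app/backend/data \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -e MAIN_LOG_LEVEL=debug \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main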

Step 2: Running Ollama

Next, let’s run the Ollama container. This command does the following:

  • Runs the Container in Detached Mode (-d): The container runs in the background.
  • Mounts a Volume (-v ollama:/root/.ollama): Creates a named volume ollama and mounts it at /root/.ollama, the directory where Ollama stores its models and configuration inside the container, ensuring data persistence.
  • Maps Ports (-p 11434:11434): The container’s port 11434 is mapped to the same port on the host.
  • Names the Container (--name ollama): Assigns the name ollama to the container.
  • Specifies the Image (ollama/ollama): Uses the Ollama image.
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
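
With Ollama running, you can pull a model into the container before opening the UI. The model name below is only an example; substitute whichever model you want to serve:

docker exec -it ollama ollama pull llama3

Because the volume is mounted at /root/.ollama, models pulled this way persist across container restarts.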

Verifying the Setup

After running the above commands, verify that the containers are running smoothly:

  1. List Running Containers:

    docker ps

    You should see both open-webui and ollama containers listed.

  2. Access Open-WebUI:
    Open your browser and navigate to http://localhost:8181 to access Open-WebUI.

  3. Check Logs:
    To check the logs for troubleshooting, use:

    docker logs open-webui
    docker logs ollama
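
  4. Probe the Services with curl (optional):
    Assuming curl is installed on your host, the first command asks Ollama for its list of locally available models and the second checks that Open-WebUI answers on port 8181:

    curl http://localhost:11434/api/tags
    curl -I http://localhost:8181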

Conclusion

By following this guide, you have successfully set up and run Open-WebUI and Ollama using Docker. This setup allows you to leverage containerization for a consistent and portable development environment. Feel free to explore further customizations and configurations to suit your specific needs.
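
If you later want to manage both containers as a single unit, one natural next step is a Docker Compose file. The sketch below mirrors the commands above (the service and volume names are just suggestions); because both services share a Compose network, Open-WebUI can reach Ollama directly by its service name instead of going through the host:

services:
  ollama:
    image: ollama/ollama
    container_name: ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama:/root/.ollama

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    depends_on:
      - ollama
    ports:
      - "8181:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
      - MAIN_LOG_LEVEL=debug
    volumes:
      - open-webui:/app/backend/data

volumes:
  ollama:
  open-webui:

Save it as docker-compose.yml and bring everything up with docker compose up -d.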

Happy Dockering!