Use Open WebUI to Easily Run Local AI LLMs on Your Computer
Summary
Open WebUI is a self-hosted AI chat platform that lets users run AI models on their own machines, customise their experience, and retain full control over their data.
The platform can be deployed with Docker or Kubernetes, or installed directly via Python, and it works with local model runners such as Ollama as well as any OpenAI-compatible API.
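As a minimal sketch of the Python route (assuming Python 3.11 and pip are available; check the project's documentation for current requirements):

```bash
# Install Open WebUI from PyPI
pip install open-webui

# Start the server; by default it listens on port 8080
open-webui serve
```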
To install Open WebUI with Docker, navigate to the directory where you want to store the project files and create a docker-compose.yml file containing the service definition; the official example is available in the project's documentation, and a sketch is shown below.
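A rough sketch of such a docker-compose.yml, pairing Open WebUI with an Ollama container, might look like the following; the image tags, port mapping, and volume names follow the project's published examples, but verify them against the current documentation:

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama            # persist downloaded models
    restart: unless-stopped

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                     # expose the UI at http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434   # point the UI at the Ollama container
    volumes:
      - open-webui:/app/backend/data    # persist chats, users, and settings
    depends_on:
      - ollama
    restart: unless-stopped

volumes:
  ollama:
  open-webui:
```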
After starting the containers, open http://localhost:3000 in a web browser; the first account you create becomes the admin account, which you then use to sign in and access the dashboard.
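Assuming the docker-compose.yml sketched above, the containers can be started from the same directory with:

```bash
# Start the containers in the background
docker compose up -d

# Follow the logs to confirm the UI is up, then browse to http://localhost:3000
docker compose logs -f open-webui
```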
The platform offers a number of features, including plugin integration, API connections, and the ability to manage multiple chats at once, as well as a built-in memory system for storing useful content across conversations.
Open WebUI also lets users pull AI models via Ollama, giving them control over which models are installed and the ability to switch between them from within the interface.
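As an illustration, assuming the Compose setup above with an ollama container, a model can be pulled from the command line and will then appear in Open WebUI's model selector; the model name llama3.2 here is just an example:

```bash
# Pull a model into the Ollama container
docker compose exec ollama ollama pull llama3.2

# List the models Ollama has available locally
docker compose exec ollama ollama list
```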