Open WebUI

Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline. It supports various LLM runners such as Ollama and OpenAI-compatible APIs, with a built-in inference engine for RAG, making it a powerful AI deployment solution.
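Because Open WebUI exposes an OpenAI-compatible API, models hosted on a local instance can be queried programmatically. The sketch below shows how such a chat-completion request could be assembled; the base URL, API key, and model name are placeholders for your own instance, and the `/api/chat/completions` endpoint path is an assumption based on Open WebUI's OpenAI-compatible interface.

```python
import json
import urllib.request

# Placeholders: substitute your own instance URL, API key, and model name.
BASE_URL = "http://localhost:3000"
API_KEY = "sk-your-api-key"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for a local Open WebUI instance."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/api/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

req = build_chat_request("llama3", "Hello!")
# Sending the request requires a running instance:
# urllib.request.urlopen(req)
```

Only the standard library is used here; with a running instance, `urllib.request.urlopen(req)` would return the JSON completion.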

Documentation: link

Community: link

See the left menu for the various features available in Open WebUI, or see the table of contents below.

Table of Contents for Using Open WebUI

[Accessing LLM models](../computing/computing-openwebui-accessingmodel1.qmd)


Various features to add


Updating software


Creating/Sharing/Uploading LLM apps using Models