OpenWebUI Main Page

Respect my work

Please do not share website content, slides, or notes with anyone who is NOT currently taking this course. This is my own intellectual work, and I hope you can respect that.

Open WebUI

Open WebUI is an extensible, feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline. It supports various LLM runners such as Ollama and OpenAI-compatible APIs, with a built-in inference engine for RAG, making it a powerful AI deployment solution.

Documentation: link

Community: link
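To make "OpenAI-compatible API" concrete, here is a minimal Python sketch that sends a chat request to a self-hosted endpoint using the openai client library. The base URL, API key, and model name are placeholder assumptions, not values from this course; substitute whatever your own Open WebUI (or Ollama) deployment uses.

```python
# Minimal sketch of calling an OpenAI-compatible endpoint.
# The base_url, api_key, and model below are placeholder assumptions;
# replace them with the values from your own deployment.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:3000/api",  # assumed local Open WebUI address
    api_key="YOUR_API_KEY",                # e.g., a key generated in your account settings
)

response = client.chat.completions.create(
    model="llama3.1",                      # any model name available on your server
    messages=[
        {"role": "user", "content": "In one sentence, what is retrieval-augmented generation?"}
    ],
)
print(response.choices[0].message.content)
```

Because the endpoint follows the OpenAI API format, the same client code works whether the request is served by Open WebUI, Ollama, or OpenAI itself; only the base URL, key, and model name change.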

See the left menu for the various things you can do with Open WebUI, or use the table of contents below.

Table of Contents for Using Open WebUI

[Accessing LLM models](../computing/computing-openwebui-accessingmodel1.qmd)


Various features to add


Updating software


Creating/Sharing/Uploading LLM apps using Models