Downloading Ollama (Do this first)

Why do I need Ollama?

Ollama allows you to run LLMs (large language models) on your own computer. While Ollama can be used from the terminal app on your laptop, doing so is a bit challenging.

This is why we will use Open WebUI to work with Ollama in a more user-friendly manner. To use Open WebUI, we will have to install Docker, then use Docker to install Open WebUI and connect it to Ollama.

In short: download Ollama in the first step. Then download Docker and set it up in the second step.
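As a preview of the second step, the Open WebUI documentation suggests a single Docker command that starts Open WebUI and points it at the Ollama running on your machine. (The port, volume name, and image tag below follow Open WebUI's published example; your setup may differ, and we will walk through this properly in the Docker step.)

```shell
# Start Open WebUI in Docker, connected to the Ollama service
# already running on this machine (per Open WebUI's docs)
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main

# Then open http://localhost:3000 in your browser
```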

Steps to download Ollama

Ollama is free and easy to download. The steps below will show you how. Still have problems? I am only an email away.

Step A: Download Ollama here

  • Download Ollama at https://ollama.com/ (180MB for Mac/781MB for Windows)
  • The installation screenshots below are for Windows, but the Mac process is similar

Open Downloaded File and Click Install


Installing


Once installed, you can see it in your taskbar.


Ollama is technically just a process. It runs in the background.

By default, Ollama (and Docker, later on) should only run when you launch the app. Do check, though, that the app is not set to auto-launch when your laptop starts up, so it does not take up memory when you are not using it.
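If you want to confirm the background process is actually up (or that it stopped when you quit the app), you can ask it directly: by default, Ollama's server listens on port 11434 and answers with a short status message at its root URL. A quick terminal check might look like this:

```shell
# Probe the default Ollama port (11434). The root endpoint replies
# with a short status string when the server is up.
if curl -sf http://localhost:11434 >/dev/null 2>&1; then
  echo "Ollama is running"
else
  echo "Ollama is not running"
fi
```

If you see "Ollama is not running" right after quitting the app, the background process has shut down as expected.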

That’s all the steps for downloading Ollama. You should set up Docker next.


(Optional) Want to try using the Terminal/Command Prompt to interact with LLMs on Ollama? You can try this

We will install and use Open WebUI to run LLMs on Ollama. But if you want to, you can also run LLMs on Ollama straight from your terminal. Watch this video if you want to try, or look at the commands in Ollama’s documentation here.

Do note one thing: even small LLMs like phi or tinyllama still require about 1GB of hard-drive space each.
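If you do try the terminal route, the basic flow from Ollama's documentation looks like the sketch below. (tinyllama is just one small model used as an example here; you can swap in any model from Ollama's library.)

```shell
# Download a small model (roughly 1GB of disk space, as noted above)
ollama pull tinyllama

# Chat with it interactively in the terminal; type /bye to exit
ollama run tinyllama

# See which models you have already downloaded
ollama list
```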