Fusion of Engineering, Control, Coding, Machine Learning, and Science

Download and Run Phi 4 Large Language Model Locally

In this tutorial, we explain how to download and run an unofficial release of Microsoft’s Phi 4 Large Language Model (LLM) on a local computer.

The YouTube tutorial is given below.

The first step is to install Ollama. Go to the official Ollama download page:

https://www.ollama.com/download/windows

and click the Download button to obtain the Ollama installer.

Then, to verify the installation, open a Windows Command Prompt and type

ollama

If the installation succeeded, this command prints Ollama's usage information and the list of available subcommands.
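If you prefer to check the installation programmatically, the short Python sketch below (not part of the original tutorial) looks up the ollama executable on the system PATH:

```python
import shutil

def find_tool(name: str):
    """Return the full path of an executable found on PATH, or None if absent."""
    return shutil.which(name)

# After a successful installation, this should print the path of the Ollama
# binary; if it prints the fallback message, reopen the terminal (so the
# updated PATH is picked up) or reinstall Ollama.
print(find_tool("ollama") or "ollama not found - reopen the terminal or reinstall")
```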

Next, we need to download the Phi 4 model. Go to the website:

https://ollama.com/vanilj/Phi-4/tags

and choose the appropriate model tag.

The figure above shows the available model variants. Let us select the Q8_0 model (an 8-bit quantized variant). Click on this model and copy the command for downloading and running it.

Next, open a Windows Command Prompt and paste the command:

ollama run vanilj/Phi-4:Q8_0

This command will download the model on the first run and then start an interactive chat session in Ollama.
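Besides the interactive prompt, the downloaded model can also be queried from code through Ollama's local REST API, which by default listens on port 11434. The sketch below is a minimal example under that assumption; the model name matches the tag pulled above, and the request only works while the Ollama server is running locally:

```python
import json
import urllib.request

# Default local Ollama endpoint for one-shot text generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> bytes:
    """Serialize a non-streaming generate request for Ollama's REST API."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode("utf-8")

def ask(prompt: str, model: str = "vanilj/Phi-4:Q8_0") -> str:
    """Send a prompt to the locally served model and return the generated text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires the Ollama server to be running locally):
# print(ask("Explain quantization in one sentence."))
```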
