Introducing Ollama: The Local Powerhouse for Language Models

Use Cases:

Imagine it’s early in the morning, and you’re thirsty for the latest news on Drupal. You’d usually hop onto Drupal.org or sift through various websites to find quality articles, but you only want the freshest, top-notch reads from this week. ChatGPT, Microsoft Copilot, or any other big tech company’s LLM tool can help with that. But here’s the twist: with Ollama, you can get that for free, and even automate it to email you those articles every day.

Now, let’s say you’re knee-deep in coding for a Drupal project. Occasionally, you hit a roadblock or need to decipher someone else’s code. Instead of scrolling through endless forums or staring at your screen puzzled, Ollama is your go-to buddy. Just like how you’d ask ChatGPT for help, Ollama can review your code or clarify tricky parts, all locally on your machine.

Or, imagine you’re writing code for a Drupal site and need a second pair of eyes on your work. Maybe you’re stuck, or you need to understand what someone else’s code does. Instead of searching the web or wading through endless Stack Overflow threads, Ollama can give you answers quickly and locally, as the example below shows.
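
For example, once Ollama is installed (the steps follow below), you can pass a prompt straight to ollama run from the terminal. Here is a minimal sketch, assuming a llama3 model has already been pulled and using a placeholder file name (mymodule.module) for the code you want explained:

    # Ask a local model to explain a Drupal module file.
    # mymodule.module is a placeholder; point it at your own (reasonably small) file.
    ollama run llama3 "Explain what this Drupal code does: $(cat mymodule.module)"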

Ollama is open-source, which means it’s there for everyone to use, whether it’s for personal projects or bigger work tasks.

Get Ollama on Your Computer

  • Visit the Ollama download page.
  • Select the download that fits your operating system.
  • Download the file, open it, and follow the installation steps.
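
If you’re on Linux, the download page also offers a one-line install script; at the time of writing it looks roughly like the sketch below, but double-check the page for the current command:

    # Install Ollama via the official install script, then confirm the CLI works.
    curl -fsSL https://ollama.com/install.sh | sh
    ollama --version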

Running Ollama:

To get Ollama working, do this:

  • Open your computer’s terminal window.
  • Type in ollama run llama3 and press Enter.
  • Ollama will download the Llama 3 model if it isn’t already on your machine, then drop you into an interactive chat prompt.
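
Under the hood, Ollama also runs a local server (on port 11434 by default), so you can send prompts to it from scripts as well as from the terminal. A minimal sketch using curl, assuming llama3 has already been downloaded:

    # Send a one-off prompt to the local Ollama API and get the whole
    # reply back as a single JSON object (streaming disabled).
    curl http://localhost:11434/api/generate -d '{
      "model": "llama3",
      "prompt": "Explain what a Drupal hook is in one short paragraph.",
      "stream": false
    }'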

Try different models:

  • Check out the Ollama Library to see what’s out there.
  • Choose a model and get it with a command like ollama pull phi3.
  • This downloads the model’s manifest and weights to your local machine.
  • Then run it with ollama run <model_name>:<tag>.
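
Putting it together, a typical session looks something like this (phi3:mini is just an example tag here; each model’s page in the Ollama Library lists the tags that actually exist):

    # Download a specific tagged variant, check it is available locally, then chat with it.
    ollama pull phi3:mini
    ollama list
    ollama run phi3:mini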

That’s all for now. In our next article, we’ll explore how to create custom LLM models using Ollama.

Video Explanation

If you prefer a video version of this content, check out this video.

Posts in this series