Running Ollama locally for fun and profit


Want to test LLMs locally for free to see which one suits your needs?

Then you will want to get into Ollama.

In my efforts to get ready for a month or more without the use of my right hand, I started researching AI tools like crazy to help me better interface with my computer.

As always, I DO NOT LET AI WRITE FOR ME, but if it can let me do basic tasks like browsing the web with just my voice, that would be great.

One of the useful tools I found for local experimentation with LLMs was Ollama. It allows you to download and run a wide variety of models with just a few commands.
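To give you a feel for it, here is roughly what that looks like once Ollama is installed. I am using llama3.2 as a stand-in model name; any model from the Ollama library works the same way:

```bash
# Pull a model from the Ollama library (llama3.2 is just an example;
# swap in whichever model you want to try)
ollama pull llama3.2

# Chat with it interactively right in the terminal
ollama run llama3.2

# See which models you have downloaded locally
ollama list
```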

It integrates seamlessly with Open WebUI, which is great for local experimentation, but at scale you would want something more like Strands Agents, which I have talked about before.
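If you are curious how the two fit together, here is a minimal docker compose sketch of the pairing. The images, ports, and the OLLAMA_BASE_URL variable are the commonly documented defaults, so treat this as a starting point and check it against the Open WebUI docs rather than copying it blindly:

```yaml
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"   # Ollama's default API port
    volumes:
      - ollama:/root/.ollama   # persist downloaded models across restarts

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"   # browse to http://localhost:3000
    environment:
      # Point the UI at the ollama service on the compose network
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - open-webui:/app/backend/data   # persist chats and settings
    depends_on:
      - ollama

volumes:
  ollama:
  open-webui:
```

With that running, you get a ChatGPT-style web interface in the browser that talks to whatever models you have pulled into Ollama.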

I can do a deep dive if enough people are interested, but Techno Tim already has an amazing tutorial, which I partially based my local setup on.

Let me know if you want to hear more about hosting LLMs both locally and at scale. If you want access to the docker compose setup for the local AI arsenal I am building to help me work one-handed, message me and I will send you a link.