How to Run LLMs Locally on Inexpensive Hardware

This weekend, I had some fun installing Ollama on my existing hardware, with varying results.

The surprise winner?

The Dell R630 handled the LLM workload best of all my old machines.

I streamed this on FranksWorldTV.
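
If you want to poke at a local Ollama install yourself, here's a minimal sketch of how you might query it from Python. It assumes Ollama is running on its default port (11434) and that you've already pulled a model; the model name "llama3" is just an illustration, so swap in whatever you actually pulled.

```python
import requests

# Assumes Ollama is running locally on its default port (11434) and that
# a model (here "llama3", purely as an example) has already been pulled
# with `ollama pull llama3`.
OLLAMA_URL = "http://localhost:11434/api/generate"

def ask(prompt: str, model: str = "llama3") -> str:
    """Send a single prompt to the local Ollama server and return its reply."""
    resp = requests.post(
        OLLAMA_URL,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,  # older hardware can take a while to generate
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(ask("In one sentence, why run an LLM on a local server?"))
```

Nothing fancy, but it's enough to benchmark how different boxes handle the same prompt.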
