More ways to host AI models locally, privately, and securely, bypassing big data's chokehold on your personal data
After my previous post on running Ollama locally, Karl Schmidt reached out and brought another technology to my attention: "Jan," an open-source ChatGPT alternative that runs 100% offline.
It seems to have a handful of extensions, but what I'm really looking for is one that could substitute for Cursor or Claude Code. I'm sure they'll get there in the near future.
For those of you who are privacy-minded and still want your own LLM agents running either locally or in the cloud, I wanted to make sure this was on your radar.
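If you want to poke at a locally hosted model from a script, here is a minimal sketch using only Python's standard library against an OpenAI-compatible chat endpoint, which tools like Jan and Ollama can expose on your machine. The port and model name below are assumptions; swap in whatever your own local server actually reports.

```python
# Minimal sketch: query a locally hosted model over an OpenAI-compatible
# HTTP endpoint using only the Python standard library.
# The URL, port, and model name are assumptions about your local setup.
import json
import urllib.request

LOCAL_ENDPOINT = "http://localhost:1337/v1/chat/completions"  # assumed local server address
MODEL_NAME = "llama3"  # placeholder; use a model you have downloaded locally

payload = {
    "model": MODEL_NAME,
    "messages": [
        {"role": "user", "content": "Summarize why local LLM hosting matters for privacy."}
    ],
}

request = urllib.request.Request(
    LOCAL_ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    reply = json.loads(response.read())
    # Responses follow the OpenAI chat-completion shape, so the generated
    # text lives under choices[0].message.content.
    print(reply["choices"][0]["message"]["content"])
```

Nothing here leaves your machine: the request goes to localhost, so the prompt and the response stay entirely under your control.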
If you are interested in the legal side of things, one of the first topics I plan to cover in the CTOThinkTank Mastermind Group is the legal ramifications of generating code from an intellectual property standpoint.
If you're considering joining, let me know what other topics you'd like covered in that mastermind group.