Other models exist and can be quickly installed from Hugging Face into Ollama, but for now, focus on what's available. To download a model, type "ollama pull mistral". This will download the model, which can then be run with "ollama run mistral".

I hope this gets some people started. Ollama also has an API which can be interfaced with from various languages, meaning you can use it for tasks such as summarizing nmap output.
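As a rough sketch of that idea, the short Python script below sends a saved nmap scan to Ollama's /api/generate endpoint on its default local port (11434) and prints the model's summary. It assumes the mistral model has already been pulled; the file name "scan.txt" and the prompt wording are just placeholders for illustration.

    import json
    import urllib.request

    # Assumes Ollama is running locally on its default port (11434)
    # and "mistral" has already been pulled with "ollama pull mistral".
    OLLAMA_URL = "http://localhost:11434/api/generate"

    # Hypothetical input file: raw nmap output saved beforehand,
    # e.g. nmap -sV <target> -oN scan.txt
    with open("scan.txt") as f:
        scan = f.read()

    payload = {
        "model": "mistral",
        "prompt": "Summarize the key findings in this nmap scan:\n\n" + scan,
        "stream": False,  # return a single JSON object instead of a stream
    }

    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

    with urllib.request.urlopen(req) as resp:
        result = json.loads(resp.read().decode("utf-8"))

    # The generated text is returned in the "response" field.
    print(result["response"])

The same pattern works for any model you have pulled, just change the "model" field, and any other tool output you want summarized can be swapped in for the nmap scan.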