> Unless you like unnecessary risks. In that case, go ahead, genius.
what an off-putting start
I have R1:1.5B running on my 8 GB RAM M4 Mac mini. Don't know where I would use it, as it is too weak to solve actual problems, but it does run.
Set up a local AI with DeepSeek R1 on a dedicated Linux machine using Ollama—no cloud, no subscriptions, just raw AI power at your fingertips.
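For anyone who wants the short version before clicking through: the flow the article describes boils down to a few shell commands. A sketch below, assuming a recent Linux box; the install one-liner is Ollama's official script, and the 7b tag is just one of the distill sizes, so swap in whatever fits your RAM:

```sh
# Install Ollama via the official install script (Linux)
curl -fsSL https://ollama.com/install.sh | sh

# Pull one of the R1 distills; 7b is a Qwen-based distill, not full R1
ollama pull deepseek-r1:7b

# Chat with it locally -- no cloud, no account
ollama run deepseek-r1:7b
```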
Sorry if you guys are getting overwhelmed with DeepSeek submissions these days. This will be my one and only for a while. It is cool to have a counterweight to all these paid models.
Are there any security concerns over DeepSeek as there are over TikTok?
Saw this in the article
>I would not recommend running this on your main system. Unless you like unnecessary risks.
I like this. However, I did not find any minimum specs or speed numbers. Maybe I missed them? Can someone point me in the right direction, please?
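Not from the article, but as a rough back-of-envelope: a Q4-quantized model takes about 0.5 bytes per parameter plus some overhead for context, so the distills land roughly where the comment below sketches. Treat these as ballpark figures, not official minimum specs:

```sh
# Rough Q4 memory ballpark (my heuristic, not from the article):
#   1.5b -> ~1-2 GB    7b -> ~4-5 GB    14b -> ~9-10 GB
# Once a model is loaded, Ollama reports what it actually uses:
ollama ps
```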
Maybe you should add "distills" to the title, as this is about installing Ollama to grab the 7b or 14b R1-Qwen distills, not "R1".
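Easy enough to verify locally, if anyone wants to: `ollama show` prints a model's metadata, and on the small deepseek-r1 tags it reports a Qwen architecture underneath (exact output depends on your Ollama version; only the 671b tag is the actual R1):

```sh
# Inspect what a tag actually is; the small ones report a Qwen architecture
ollama show deepseek-r1:7b
```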