Running DeepSeek R1 on Your Own (cheap) Hardware – The fast and easy way

by BimJeam on 2/1/2025, 12:16 PM with 19 comments

by cwizou on 2/1/2025, 2:10 PM

Maybe you should add "distills" to the title, as this is about installing Ollama to grab the 7b or 14b R1-Qwen distills, not "R1".
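
For anyone curious what that route looks like in practice, here is a minimal sketch using the official `ollama` Python client. The tag name is an assumption based on how the distills are published on the Ollama model library; adjust it to whatever your install actually pulled:

```python
import ollama

# Note: the "deepseek-r1" tags on the Ollama library are Qwen/Llama
# distills, not the full 671B R1 (tag assumed; check your library).
ollama.pull("deepseek-r1:7b")

reply = ollama.chat(
    model="deepseek-r1:7b",
    messages=[{"role": "user", "content": "Summarize quicksort in two sentences."}],
)
print(reply["message"]["content"])
```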

by ghostie_plz on 2/1/2025, 2:15 PM

> Unless you like unnecessary risks. In that case, go ahead, genius.

what an off-putting start

by Euphorbium on 2/1/2025, 2:32 PM

I have R1:1.5B running on my 8 GB RAM M4 Mac mini. Don't know where I would use it, as it is too weak to solve actual problems, but it does run.
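
If anyone wants to poke at a model that small themselves, a quick sketch of what "it does run" means, assuming the 1.5B distill tag from the Ollama library and a default local install:

```python
import ollama

# Stream tokens from the 1.5B distill -- small enough to fit in 8 GB of RAM.
# Tag name assumed from the Ollama library; adjust if yours differs.
stream = ollama.chat(
    model="deepseek-r1:1.5b",
    messages=[{"role": "user", "content": "Write a haiku about small models."}],
    stream=True,
)
for chunk in stream:
    print(chunk["message"]["content"], end="", flush=True)
print()
```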

by BimJeam on 2/1/2025, 12:16 PM

Set up a local AI with DeepSeek R1 on a dedicated Linux machine using Ollama—no cloud, no subscriptions, just raw AI power at your fingertips.

by BimJeam on 2/1/2025, 1:18 PM

Sorry if you guys are getting overwhelmed with DeepSeek submissions these days. This will be my one and only for a while. It is cool to have a counterweight to all these paid models.

by assimpleaspossi on 2/1/2025, 1:58 PM

Are there any security concerns over DeepSeek as there are over TikTok?

Saw this in the article:

>I would not recommend running this on your main system. Unless you like unnecessary risks.

by donclark on 2/1/2025, 2:06 PM

I like this. However, I did not find any minimum specs or speed figures. Maybe I missed them? Can someone point me in the right direction, please?
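
For the speed half of the question, one rough way to measure it on your own hardware: Ollama's generate endpoint reports token counts and timings in its response, so a short script gives tokens per second for whatever model you pulled. A sketch, assuming the default local port and a 7b distill tag:

```python
import requests

# Rough throughput benchmark against a local Ollama server.
# eval_count and eval_duration come from Ollama's /api/generate
# response (duration is in nanoseconds); the model tag is an assumption.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "deepseek-r1:7b",
        "prompt": "Explain what a hash map is in three sentences.",
        "stream": False,
    },
    timeout=600,
).json()

tokens = resp["eval_count"]
seconds = resp["eval_duration"] / 1e9
print(f"{tokens} tokens in {seconds:.1f} s -> {tokens / seconds:.1f} tokens/s")
```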