Not sure what to do next. I'm building a home setup to test various LLMs and gain hands-on experience with custom-built gear. I want to run Llama 3.1, Hermes 3, and Qwen 2.5 Coder.
My specs:
- Intel Core i7-13700
- RTX 4090
- 1TB NVMe SSD
Any other options I should be aware of? And how much did you invest in your current setup?
> Not sure what to do next.
What do you mean? Set up the models themselves on your machine and do what you wanted to do with them. If you hit any bottlenecks along the way, address them then.
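If you want a concrete starting point, here's a minimal sketch using llama-cpp-python with a GGUF quant, which is one common way to run these models on a single 4090 (24 GB VRAM). The model path and parameter values are placeholders, not recommendations:

```python
# Minimal sketch: run a local GGUF model with llama-cpp-python on the 4090.
# Install with CUDA support enabled, e.g.:
#   CMAKE_ARGS="-DGGML_CUDA=on" pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-3.1-8b-instruct-q4_k_m.gguf",  # placeholder path to a downloaded quant
    n_gpu_layers=-1,   # offload all layers to the GPU
    n_ctx=8192,        # context window; raise it if you have VRAM to spare
)

out = llm(
    "Explain the difference between VRAM and system RAM in one paragraph.",
    max_tokens=256,
)
print(out["choices"][0]["text"])
```

Tools like Ollama or LM Studio wrap this kind of thing up for you if you'd rather not touch Python at all.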
Check out https://www.reddit.com/r/LocalLLaMA/ if you want to socialise with other people doing this and see their setups, etc :)