Ask HN: What's the largest LLM you can run on your MacBook?

by _false on 7/8/2024, 9:07 AM with 0 comments

I'm planning to get a new laptop and am curious how different MacBook specs compare in terms of LLM generation performance. In particular I'd like to know:

1. What's the largest model you can load onto your GPU (unified memory)? A rough sizing sketch follows below.
2. Which Apple GPU do you have (M1 through M3 Max)?
3. How much memory (RAM) do you have?
4. (Ideally) how many tokens/s do you get?
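For context, here's my own back-of-envelope sizing sketch (my assumptions, not measured numbers): I'm assuming 4-bit quantization and that macOS only lets the GPU address roughly 65-75% of unified memory by default.

    # Back-of-envelope check: does a quantized model fit in the
    # GPU-visible slice of unified memory? All constants are assumptions.
    def fits(params_billion, bits_per_weight, ram_gb,
             gpu_fraction=0.70, overhead_gb=2.0):
        model_gb = params_billion * bits_per_weight / 8  # 1B params at 8 bits ~ 1 GB
        return model_gb + overhead_gb <= ram_gb * gpu_fraction

    print(fits(70, 4, 64))  # ~35 GB of weights + overhead on 64 GB -> fits
    print(fits(70, 4, 32))  # same model on a 32 GB machine -> does not fit

If what you actually see with llama.cpp, Ollama, or MLX differs from this, that's exactly the kind of data point I'm after.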
