Is Local the Future of AI?

31 points by wils124


mudkip

Yes, I agree that the value of Apple's chips is hard to beat, but there's still a massive bottleneck in hardware accessibility. The 128 GB MacBook described costs over $5,000 on Apple's website, and in the consumer space even the most commonly recommended GPU, a 3090 with 24 GB of VRAM, goes for $700 used at minimum. This effectively prices out non-professional users who don't have that kind of money to spend just to reach parity with the $20 subscription on their phones (and even those who do have to accept that the local model will always be dumber than ChatGPT, and that their hardware will go out of date quickly).

There's also a noticeable disconnect between the hardware we have and the primary focus of open-source labs, which is to scale up and cater to their enterprise customers; just look at GLM-5's jump to 744B parameters, more than double GLM-4.5's 355B. We really need some kind of Cambrian explosion in cheap hardware for local models to be feasible.
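
To put the size gap in numbers, here's a rough back-of-envelope sketch (my own estimate, ignoring KV cache and runtime overhead, and assuming you'd hold all weights in memory rather than offloading inactive MoE experts) of what it takes just to store weights at those parameter counts:

    # Rough memory needed to hold model weights at a given quantization level.
    # Ignores KV cache, activations, and framework overhead, so real numbers are higher.
    def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
        return params_billion * bits_per_weight / 8

    for name, params in [("GLM-4.5 (355B)", 355), ("GLM-5 (744B)", 744)]:
        for bits in (16, 8, 4):
            print(f"{name} @ {bits}-bit: ~{weight_memory_gb(params, bits):,.0f} GB")

Even at 4-bit that's roughly 372 GB for a 744B model, which is far beyond a 24 GB 3090 or a 128 GB MacBook without heavy offloading.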