Dell seems to be the first to realise we don't actually care about AI PCs
38 points by eatonphil
I saw this insane AI laptop ad the other day. It was bragging about how the AI features give you more time in the day, and it showed the user off doing other things (swimming, hiking). It was just insane to me: have we really hit a point where computers brag that you get to use them less?
I would unironically love a computer that I have to use less. But it is not computers or technology that would make that happen.
I saw one recently that was, for whatever reason, claiming you could get more work done on your holiday. I felt such a sense of unreality that I had to check with my partner that I wasn't misreading the ad. How is that a selling point?!??
> hit a point where computers brag that you get to use them less?
Generally speaking, computers increase productivity. So far that has been used to compress timelines and squeeze ever more work into the same timebox. If you ever wonder why work these days is more exhausting than in the 1990s, that plays a big part.
The productivity increase can be spent on "do the same with less", too.
As I understand it, the evidence from the relevant economics research is that IT did not increase productivity ("Productivity Paradox", a more expansive version of the Solow Paradox).
The increased productivity has lifted the GDP of many countries, and that has in turn enabled higher salaries (at least in the Nordics with strong labor unions, which is where my frame of reference is). If society had opted for shorter workdays when the productivity increase took hold, salaries wouldn't have risen as much.
https://www.epi.org/productivity-pay-gap/
Higher productivity has not been directly associated with wage increases. That chart is for the US, but it looks similar across first-world economies. Real wages don't track productivity.
From what I've seen of Nordic statistics, at least in Sweden, real wages have only slightly outpaced inflation.
What you say is true however for China or emerging economies, where productivity increases have a significant impact on income.
But they didn't get to choose shorter workdays, and salaries didn't rise as much in many countries after all...
Also, the Nordics take work/life balance seriously. So more tasks in the same timebox doesn't mean taking home more overhead (more context switches, more people involved, more things to wrap up, more follow-up questions).
I mean, working hours have been progressively going down for a while now, while wages in real terms have been stagnant or growing at the same time. Even in the US this is true.
I don’t understand the pitch for an AI PC. Even if you love using LLMs, you can rent a cloud server with over a hundred gigs of RAM for a couple dollars an hour to see how good local LLMs are. They don’t work very well. All the usual issues with LLMs (being constantly confidently wrong) are amplified. And if you’re using a model that fits in consumer GPU memory it is much, much worse.
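For anyone who wants to check this themselves, here's a minimal sketch of loading a quantised GGUF model with llama-cpp-python; the model filename is just a placeholder for whatever you actually download:

    # pip install llama-cpp-python
    # Minimal local inference with a quantised GGUF model.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./models/some-7b-model.Q4_K_M.gguf",  # placeholder: any GGUF file you have
        n_ctx=4096,        # context window
        n_gpu_layers=-1,   # offload all layers to the GPU if one is available
    )

    out = llm("Explain what an NPU is in one sentence.", max_tokens=64)
    print(out["choices"][0]["text"])

Even on a beefy cloud box, the gap to the hosted frontier models becomes obvious pretty quickly.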
Yes, people are pushing 4-bit quantisations very hard, but unlike e.g. 8-bit quantisation, 4-bit really leads to huge drops in performance. There are many studies on this phenomenon, e.g. this one. Let's see how model sizes will develop in the future.
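To make the gap concrete, here's a toy sketch of plain symmetric round-to-nearest quantisation on random weights. Real LLM quantisers (GPTQ, AWQ, the llama.cpp K-quants, etc.) are much smarter than this, but the basic effect of halving the bit width is the same:

    # Toy comparison of symmetric round-to-nearest quantisation at 8 vs 4 bits.
    import numpy as np

    rng = np.random.default_rng(0)
    weights = rng.normal(0, 0.02, size=100_000)   # roughly LLM-like weight scale

    def quantize(w, bits):
        qmax = 2 ** (bits - 1) - 1                # 127 for int8, 7 for int4
        scale = np.abs(w).max() / qmax            # one scale for the whole tensor
        q = np.clip(np.round(w / scale), -qmax, qmax)
        return q * scale                          # dequantised weights

    for bits in (8, 4):
        err = np.abs(quantize(weights, bits) - weights)
        print(f"{bits}-bit: mean abs error = {err.mean():.6f}")

With only 15 usable levels instead of 255, each weight lands much further from its original value.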
let's skip 4 bit and go straight to 1.58 bit quantization
> 1.58
Sierpiński's triangle?
that's the number of bits in a ternary digit :D
I'm saying, let's just have three values, -1, 0, +1
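For reference, the arithmetic behind the name: log2(3) ≈ 1.585, so a three-valued weight carries about 1.58 bits. Here's a toy sketch of absmean-style ternary quantisation, loosely in the spirit of the BitNet b1.58 paper (simplified, not their actual training setup):

    # Toy ternary ("1.58-bit") quantisation: every weight becomes -1, 0 or +1.
    import math
    import numpy as np

    print(math.log2(3))                      # ~1.585 bits per ternary weight

    rng = np.random.default_rng(0)
    w = rng.normal(0, 0.02, size=10)

    scale = np.abs(w).mean()                 # absmean scaling
    ternary = np.clip(np.round(w / scale), -1, 1)

    print(ternary)                           # values in {-1, 0, +1}
    print(ternary * scale)                   # dequantised approximation of w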
Balanced ternary is awesome, and I recently wrote a paper (currently in press) presenting the first balanced ternary floating-point number format, but despite some promising papers I don't see it happening for LLMs. You can scale an LLM up massively, which makes each individual weight matter less, but that gives you fewer and fewer gains while also making the effect of the quantisation harder to understand.
microsoft is one step ahead of you! https://github.com/microsoft/BitNet
> bitnet.cpp is the official inference framework for 1-bit LLMs (e.g., BitNet b1.58). It offers a suite of optimized kernels that support fast and lossless inference of 1.58-bit models on CPU and GPU (NPU support will be coming next).
Every company is obsessed with this absolute nonsense; Dell is not unique in pushing this garbage.
The article is about how they are unique in not pushing the garbage.
And the article is misleading. I went through Kevin Terwilliger's video talks from that day at CES and it's all AI, AI, AI. Local AI on the PC. The quotes in this article are decontextualised excerpts from Terwilliger talking about how much Dell wants you to buy a PC because of the AI, AI, AI.
booooo to PCGamer
Even if they do talk about AI, that doesn't mean we aren't getting a hint that they can see consumers are not interested in having AI shoved down their throats. That's at least the takeaway I had.
What does AI in your computer usually mean? GPUs and lots of RAM. Both very expensive.
What does Dell mostly sell? Meh specced business laptops with coil whine.
As if it's not their target market ;)
(Yes, I read the other comments, and I don't really care if it's misreporting - I just don't see how Dell specifically would benefit in any way other than "customer demands AI, we have AI at home", and you could replace AI with any feature, imagined or real.)
It's been really baffling to see companies try to use AI in consumer-facing marketing. That might work to garner investor interest, but consumers only care about actual features. Whether those are AI or not typically doesn't matter to them.