What if the main reason why these AI corporations are buying up so much hardware is to prevent the consumers from buying hardware because they don't want people to be able to run local LLMs? They're afraid that their business models will collapse when people can use AI without paying them subscription fees, etc.

SomeOrdinaryGamers recently put up this video saying exactly that: their business models will collapse if people can run local LLMs on their PCs, laptops, smartphones, etc.

#AI #LLMs

in reply to Neil E. Hodges

This might also have to do with what Bezos said earlier. :/

'That's not going to last': Jeff Bezos believes AI will force you to rent your PC from the cloud, and the RAM crisis is accelerating it — Tom's Hardware

But the real kicker comes when Bezos mentions a brewery he visited: the company had to build its own power station to "improve the efficiency of their brewery with electricity." Since there was no power grid, they had to generate their own electricity.

"At the time, that's what everyone did. If a hotel wanted electricity, they had their own electric generator," Bezos explains. "I looked at this, and I thought, this is what computation is like today; everyone has their own data center."

Interestingly, he continues to say, "that's not going to last. It makes no sense. You're going to buy compute off the grid. That's AWS."

in reply to Neil E. Hodges

>their business models will collapse if people can run local LLMs on their PCs, laptops, smartphones, etc.
True, but their business model is already collapsing: even when people pay, the companies actually lose money on each "token".
It's sold to the average joe as if it were going to supercharge people's jobs, and so far their models have little use outside of research, programming, and a few niche sectors.
They are lying simply because that's what they do, which is not surprising coming from companies that distribute proprietary software/hardware.


in reply to Neil E. Hodges

He gets some things pretty wrong; for example, Copilot can also access Anthropic (Claude) models, depending on what you or your company pays for and enables. He tends to confuse the company/model names a lot.

As far as Sora goes, I'm not surprised OpenAI was losing money. I know 5 seconds of video using hunyuanvideo1.5_720p_fp16 can take 8~10 hours on an AMD R9700. I've heard this is significantly faster on consumer NVIDIA cards, but they also eat a lot more power. (Curious how long it takes on a 5090.)

I've heard xAI/Grok charges $60/month for making up to 50x 5sec clips per day, and it only takes 1~2 min. I imagine they're probably hemorrhaging money there as well.
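For a rough sense of the economics, a back-of-envelope sketch from the figures quoted above ($60/month, up to 50 five-second clips per day). The 30-day month and the assumption that a heavy user maxes out the quota every day are mine, not from the source:

```python
# Back-of-envelope: what the quoted xAI/Grok subscription implies per clip,
# assuming (hypothetically) a user generates the full 50 clips every day.
monthly_fee = 60.00        # USD/month, figure quoted above
clips_per_day = 50         # daily cap quoted above
days_per_month = 30        # assumed

clips_per_month = clips_per_day * days_per_month   # 1500 clips
price_per_clip = monthly_fee / clips_per_month     # revenue per clip

print(f"{clips_per_month} clips -> ${price_per_clip:.3f} per 5-second clip")
# -> 1500 clips -> $0.040 per 5-second clip
```

So for such a user, any inference cost above about four cents per clip (GPU time, power, overhead) would mean the provider loses money on them; real average usage is presumably far lower, which is the only way the price could pencil out.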

in reply to Neil E. Hodges

The story about electricity grids is funny, because utility companies in the US are building micro grids: https://www.canarymedia.com/articles/solar/california-utility-clean-energy-microgrids-wildfires

With electricity we went from big to small; only the solar and battery systems of the last 5-15 years made it possible to generate electricity cheaply at home.

With computers we went from big to very small, but that shift happened with Intel years ago, and the Raspberry Pi is another example of it.