

Yet more companies laying off employees not because AI is replacing them, but because they need more money to fund their AI. I can't remember the last time I saw the sunk cost fallacy at this scale.


in reply to Marcus Hutchins :verified:

hopefully this will be an extinction-level event for the tech industry as it is, which is awful. I'll meet you there to wipe out the dinosaurs so something better can take their place

I already see the seeds of a much, much better second construction being planted; they just have to compete directly with Wall Street's billions. But what if they blow themselves up with their own incompetence?

in reply to Marcus Hutchins :verified:

so. much. fomo.

Like, there's no money in AI, but these dudes are so afraid of not being the one to make the theoretical future money that they're mortgaging their futures.

I kinda hope the bubble popping just completely tanks FAANG. It's time to burn down the forest and make room for new growth.

in reply to Dave Alvarado

@dave my thoughts exactly. Let's let nerds with home labs build a new future for us; we don't need corporations for our tech products
in reply to Marcus Hutchins :verified:

in 5 years there will be no more new senior developers and the supply chain will be completely broken. At that point, old-fashioned senior developers will rise from the ashes and they will be incredibly expensive. I don't believe those rumors that in 18 months AI will be able to write fully functional complex code from scratch.
in reply to Jan Antoš

@janantos the capabilities of generative AI have plateaued because we're running into the limits of what you can do with the available computing, the cost-effective computing, the available training data, etc.
in reply to Fluffy Kitty Cat

@fluffykittycat @janantos iirc they're running out of internet to scrape for data as well. Like at this point they've already gone through all of StackOverflow and all the open source stuff on GitHub; there's not a ton of new publicly available code to index and train on, and it's still not good enough

And once you start training AI on AI-generated things, it degrades and gets filled with bias extremely quickly
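
A toy sketch of that failure mode (my own illustration in Python, nothing from an actual LLM pipeline): fit a simple model to real data once, then keep refitting it on its own samples. The real data is never seen again, so estimation error compounds generation after generation.

```python
# Toy "model trained on model output" loop: a Gaussian refit on its own
# samples, generation after generation. The real data is used exactly once.
import numpy as np

rng = np.random.default_rng(0)
real = rng.normal(loc=0.0, scale=1.0, size=200)   # the original data
mu, sigma = real.mean(), real.std()

for gen in range(1, 31):
    synthetic = rng.normal(mu, sigma, size=200)    # "AI-generated" data
    mu, sigma = synthetic.mean(), synthetic.std()  # retrain on it only
    if gen % 10 == 0:
        print(f"generation {gen}: mu={mu:+.3f} sigma={sigma:.3f}")
# Each refit only sees the previous generation's output, so the estimates
# drift away from the real distribution, and run long enough the variance
# eventually collapses -- a toy version of the "model collapse" effect.
```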

in reply to greenpepper22

@greenpepper22 @janantos yeah, they already scraped the whole internet. The low-hanging fruit is picked; they have to make their own training data now
in reply to Fluffy Kitty Cat

They've reached the plateau.

And the plateau is ...
- It does NOT work well.
- Its delivered results are NOT valuable.
>>> BUT <<<
- It is still VERY EXPENSIVE!!!

...

Hmmm...
🤔

in reply to Jeff Grigg

@JeffGrigg @janantos LLMs were exactly one discrete step forward. Further advances will take more basic science. Too bad that instead of being modest about what they were selling, the tech industry went all in.
in reply to Fluffy Kitty Cat

@fluffykittycat @janantos
Yes. Exactly!

I am "classically trained" in AI technologies.

Current *hype* is about the "Generative AI" sector of that *ONLY*.

Computer AI is older than I am!
It's helpful to realize that the test still cited as the benchmark for "Artificial General Intelligence" (AGI), the Turing Test, goes back to Alan Turing's 1950 paper!

https://en.wikipedia.org/wiki/Turing_test

"Classic AI" is part of our everyday lives, without us even realizing it.

in reply to Jeff Grigg

@JeffGrigg @fluffykittycat I see myself more as "classical" old-fashioned ML "educated". I think the problem happened when managers started looking at it as ML == AI (NN) and completely started ditching all the good stuff such as (S)ARIMA models and ensemble trees. My favourite interview question is "what's the difference between linear and logistic regression"; you have no idea how many so-called professionals fail to move forward.
in reply to Fluffy Kitty Cat

they are completely different things and that's the trap. No pros and cons, just completely different use cases. Linear regression you can, in simple terms, think of as a trend line (it predicts a continuous outcome; the output is a real number), while logistic regression predicts a categorical outcome, usually binary (true(1)/false(0)).
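
A minimal sketch of that distinction (my own example, assuming scikit-learn, which nobody in the thread actually mentioned): the same one-feature input, but one model returns a real number and the other returns a class label.

```python
# Linear vs. logistic regression on the same toy input.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))

# Linear regression: continuous target, output is a real number (a trend line).
y_cont = 2.5 * X[:, 0] + 1.0 + rng.normal(scale=1.0, size=200)
lin = LinearRegression().fit(X, y_cont)
print(lin.predict([[4.0]]))           # roughly 11.0, a continuous value

# Logistic regression: categorical target, output is a class (here: x > 5 or not).
y_cls = (X[:, 0] > 5).astype(int)
clf = LogisticRegression().fit(X, y_cls)
print(clf.predict([[4.0]]))           # 0 or 1, a category
print(clf.predict_proba([[4.0]]))     # class probabilities, not a trend value
```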
in reply to Fluffy Kitty Cat

@fluffykittycat @janantos
Funny, I figured it was more because there's a ceiling on how much imitation can substitute for thought.
Unknown parent

Fluffy Kitty Cat
@JackPine some European countries have been doing this bit by bit already. It seems like they're choosing open source solutions over proprietary-but-European ones, which is great to see. A lot of the tech is already comparable in functionality; it just doesn't have the network effect, but politics has made the American tech monopolies unpalatable
in reply to Marcus Hutchins :verified:

well, we need a lot of people creating real software if we're going to replace all the proprietary software with open source.
in reply to Reiner Jung

@prefec2 you'd be surprised at what we can already do with existing tech; it's just complacency and inertia holding back many companies at this point


in reply to Marcus Hutchins :verified:

The tech industry is pushing these fads out so rapidly because they need to justify their existence. They know there haven't been any major innovations in a very long time, and they're scared that we'll realize it without these "distractions".