I don't think this is unique. Most bubbles historically, as far back as the South Sea Bubble, have had a lot of people aware of the irrationality but investing in an attempt to profit from it.
I'd even go so far as to say this is exactly what makes bubbles so volatile, as opposed to normal "market corrections". If the dotcom boom had been all people who genuinely believed they were sensibly evaluating the internet's financial potential, I don't think we'd have seen them jump ship quite so quickly.
I won't predict the future, but another point about historic bubbles: they almost all go on much further than people think they will before collapse.
Commoditization of this scale of compute is definitely going to be a boon for many fields of research. Unfortunately fundamental public research is exactly what is being cut right now in the US.
Long term, I think the real winners are going to be in robotics. It's still an unsolved field, but Waymo proves that even a nearly 20-year slog to the finish line is viable. And robotics infrastructure may be more robust to obsolescence than the underlying compute. I find it odd that so many companies are making humanoid robots, though... over-engineering that reeks of bubble economics and possible fraud.
If you want your robot to be a helper around the general population's houses, for example, you would aim to make a general-purpose bot capable of stairs, ladders, lying down, reaching high, stepping over things, and holding awkward weights and loads while doing all of the above. Pinch, twist, push, pull, in all the degrees of motion a human has, etc.
If we applied the same logic, there should be a massive effort to ditch wheelchairs and build exoskeletons instead.
All the investment in AI should help bring infrastructure up to a higher level; power distribution and cooling, for example, are at a much higher level than they otherwise would have been.
Who knows what use that might have if it suddenly becomes incredibly cheap.
(this is my silver lining thinking)
What's the corresponding infrastructure of AI? The major cost - the GPUs - are effectively obsolete after 3-5 years. The physical location of the datacenters, power, cooling and fibre that connects them might be the lasting infrastructure. Is datacenter location important? Are we actually building up new power sources (apart from endless announcements about FANGs opening nuclear power stations, which as far as I'm aware have not happened yet)?
Is it all about the actual GPUs, though? Is that the only "infrastructure" being built? A list, off the top of my head, of things that I'd say do last:
1. Data center buildings (take a while to build, contents completely aside).
2. Organisations and processes for running operations and procurement in said data centers - doesn't take decades to build for sure, but it's something worthwhile to already have.
3. Advances in the actual chips, i.e. more powerful processing units.
4. Advances in chip fabrication.
5. Chip fabrication facilities and organisations (similar to #1 and #2).
So sure, GPUs are highly temporary. But a lot of the things being developed and built around them much less so.
I do think one possible bubble burst scenario is that we'll have cheap compute available for decades but not a lot of great ideas of what to do with it. That is not unlike the 2000s I suppose.
The GPU hardware rots and becomes obsolete in a matter of years, but the national infrastructure required to support the physical sites isn't going away. Things such as...
- improved power distribution networks
- logistics arrangements to build and support the DC sites
- lots and lots of new fibre interconnects to support the massive bandwidth needs
- hopefully: better power delivery planning laws
- plumbing infrastructure, because all that hardware requires cooling
Some of the DC sites will be decommissioned from their initial use, but given the physical security requirements, might morph into handy higher-security industrial facilities with only small repurposing. Such reuse cases would especially benefit from improved logistics (see above).
I had a friend who got a Sun cluster for basically free when the 2000 dot com bubble burst. And when we were doing recreational math contests a couple of years later it was slower than our laptops.
So is it very likely that a load of today's GPU compute will still be competitive next year or the year after?
The AI bubble bursting will kill investment in the next gen hardware in the west.
But China will come to market with its first gen, which it is currently building to replace its dependency on the West, and will leapfrog the West, etc. China isn't really dependent on competing in our AI bubble; it's using AI for its own purposes and will plough on even when the Western bubble bursts. Seems obvious?
Still, there has been so much talk about the AI bubble bursting this past week, and this is the best writeup.
We are not getting the same insane gains from node shrinks anymore.
Imagine the bubble pops tomorrow. You would have an excess of compute using current gen tech, and the insane investments required to get to the next node shrink using our current path might no longer be economically justifiable while such an excess of compute exists.
It might be that you need to have a much bigger gap than what we are currently seeing in order to actually get enough of a boost to make it worthwhile.
Not saying that is what would happen, I'm just saying it's not impossible either.
Regarding robot form factor: I'd rather have R2-D2 than C-3PO. I don't want anything approaching the uncanny valley; I want a machine that does handy things!
I think at least with CPUs the depreciation has slowed down a lot compared to 15 years ago.
I like the term "democratize investing" here. "We're granting the masses the privilege of dumping their lifesavings into this overhyped project, so we can make a clean exit".
> Yes, retail can buy Nvidia, but they can’t access pre-IPO rounds where the real speculation happens. This concentration among professional investors won’t prevent a bubble, but it might prevent the kind of widespread financial devastation that followed previous crashes.
The flow of money to spur innovation is exactly like "Cambrian Explosion". We should do this more often, with biotech and future fields to come.
OTOH all the VR headsets gathering dust now didn't turn out to be quite as useful as those fiber optic cables. And I'm not sure what will remain after the AI bubble pops except for a massive matrix multiplication overcapacity ;)
I also wouldn't call all the money being funneled into a single technology a "Cambrian Explosion", it's the opposite of that, an organism being propped up that wouldn't survive on its own in a competitive environment.
[0] "millions of ordinary investors watched their retirement accounts and college funds evaporate. The same middle-class Americans who had been told they were foolish not to participate in the ‘new economy’ now faced financial ruin. Teachers’ pension funds were halved. Family savings meant for homes and education vanished"
And pray we don't enter a "lost decade" (which is closer to 30 years, now) like Japan.
What year is this from? The author might want to do a recent news search.
When you are selling 5 dollars for 1 dollar, doubling revenue is easy. It just creates more losses; same with OpenAI.
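To make the quip concrete, here's a toy sketch with made-up numbers (not OpenAI's actual unit economics): if every dollar of revenue costs five dollars to deliver, doubling revenue exactly doubles the losses.

```python
# Hypothetical unit economics: sell for $1 what costs $5 to serve.
def revenue_and_profit(units, price=1.0, cost=5.0):
    revenue = units * price
    profit = revenue - units * cost
    return revenue, profit

rev1, p1 = revenue_and_profit(1_000_000)  # $1M revenue, -$4M profit
rev2, p2 = revenue_and_profit(2_000_000)  # revenue doubles...
print(rev2 / rev1, p2 / p1)  # 2.0 2.0 -- losses double right along with it
```

Growth only helps once the gross margin flips positive; until then, scaling just scales the hole.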
This article is based on Altman's "bubble" comment.
"AI is an existential risk for humanity, that's why we have to dump all resources we have into building it".
"It's critically important that AI as an industry is regulated, but also we'll pull out of the EU if they try to regulate us"
"AI is an existential risk for humanity..." ...so you should trust only us to build it.
"AI as an industry should be regulated..." ...to make it harder for newcomers to the market.
There is absolutely nothing else left to invest in when it comes to software development, this is it.
It’s so painfully obvious but so many AI doomers use it as evidence.
He doesn’t want a talent war with Meta and Apple. And Meta has responded by signaling a truce in the talent war by saying they’re freezing AI hiring.
Things seem to be slowly changing in Japan.
At this point it is wrong to speak about Japan's "lost decade" - it should be "lost decades".
Of course, assuming that this would be the only source of economic gains is already a laughably bearish vision. It's just that that's all you need for the bubble thesis to fall flat.
If that's true, then we are in a bubble by definition. When AI development eventually stagnates, failing to deliver on these promises, valuations will correct fast (and painfully). What happens then to Nvidia and other hardware companies? And what about the massive AI investments currently propping up the economy [1]? These would also be slashed, messing up the entire supply chain that's gearing up to meet this demand.
While I agree the technology is great and useful, I believe we are in bubble territory. I believe it's unlikely to be as transformative as the CEOs and VCs funding these companies claim.
[1] https://sherwood.news/markets/the-ai-spending-boom-is-eating...
Also whatever LLM productivity gains are currently happening are being massively subsidized. Once companies switch out of lighting money on fire mode most of these products will get dramatically worse and more expensive. Maintaining a cutting edge LLM isn't a railroad that you build once and can run and manage for centuries at a fraction of the initial price, they require constant expensive investment.
“When I see a bubble forming, I rush in to buy, adding fuel to the fire,” goes one of George Soros’s well-known quotes. “That is not irrational.”
I read quotes like this and am reminded how commonly people forget that money is just a competitive resource we use to outbid each other for _real_ things. Money moves around; it isn't lost or "completely vaporized", someone receives it on the other side of the transaction. It is still in circulation and can still be used to outbid people for real things, just by different people.
Also, pets.com still exists, it just forwards to petsmart.com.
The 2014 doc was a pretty wild read for me when it came out - it changed my perspective quite a bit.
[1]:https://www.bankofengland.co.uk/-/media/boe/files/quarterly-...
[2]: https://www.goodreads.com/book/show/58796370-can-t-we-just-p...
That's true, but the thing that's lost is the economic/productive capacity that the money was spent on, that could have been used for other (better) purposes.
For example, if I raise $100mn in a frothy market and spend it employing 100 engineers on $1mn/yr salaries for 1 year before ultimately going bankrupt, it's true that the money doesn't disappear, as it was simply transferred from the VCs to the engineers. But what's spent/consumed is the engineers' time. Society can never get those 100 person-years back, and the VCs have to write their capital investment down to 0.
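Checking the toy numbers above (purely illustrative figures, taken straight from the comment): the $100mn covers exactly one year of burn, after which 100 person-years are gone regardless of where the money ended up.

```python
# Illustrative figures from the example above, not real data.
raised = 100_000_000           # VC money in
engineers = 100
salary = 1_000_000             # $1mn/yr each (deliberately frothy)

annual_burn = engineers * salary       # $100M per year
runway_years = raised / annual_burn    # 1.0 year of runway
person_years = engineers * runway_years

print(runway_years, person_years)  # 1.0 100.0
# The $100M changed hands (VCs -> engineers), but the 100 person-years
# of labour are consumed either way; that's the real cost to society.
```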
The other comments are separately true: money is created by bank borrowing and destroyed by loans being repaid or going bad. Periods of speculation often result in increasing leverage (e.g. borrowing to buy stocks/houses), which does result in the destruction of money when it unwinds (as well as damage to banks' balance sheets, which can become problematic when it happens at a large enough scale; see 2008).
But money is an abstraction of wealth, and wealth absolutely can be destroyed, in multiple ways:
1. It can be physically destroyed - if I break a window, that's wealth that is destroyed. That window now needs to be replaced, which costs materials and labor, which could've gone to building something new instead.
2. It can be spent on things that end up not used. If five years from now, those millions of GPUs are no longer in use, we created them for nothing instead of creating more of something people would use.
3. Wealth can be spent on the less important things, rather than the more important things. This is not exactly wealth being destroyed, just built more slowly, because instead of building lots of new wealth (via innovation, say) we're creating less valuable things.
I don't think any of the above are relevant to AI, btw.
On the other hand, the monetary value of the stock market (and other assets) going up and down does create or destroy "money". From a financial point of view, it's not a zero sum game.
Is AI a bubble? Probably.
Does that make it meaningless? Not at all.
Carlota Perez’s Technological Revolutions and Financial Capital: The Dynamics of Bubbles and Golden Ages (2003) is strikingly prescient--and worth revisiting
[thread]
* https://twitter.com/rubyscanlon/status/1958891869489836076#m

Perez goes back to the technology of canals:
* https://en.wikipedia.org/wiki/Technological_Revolutions_and_...
People getting excited for something (perceived as) new is part of the human character.
If you're worried about money, this is horrible news. If you just want to get shit done, it's great news, but only if you can avoid losing personal agency long enough to survive the crash.
We're headed towards the hockey stick in terms of what people using AI can do. I'm rapidly learning that even ChatGPT 5 can get confused and lose sight of goals, not in the hallucination way, but in the bog-standard way people end up trapped in rabbit holes. I'm learning how to talk to it and get it back on track.
AI really can be productive, but it still needs guidance to be really useful.
E.g. someone borrowing against their higher property value(s) to put a down payment on another property.
Leverage is the amplifier. And I don’t see many self-circulating capital flows. I expect contractions to be reasonable for this bubble, or more realistically industry stagflation.
Here's the mechanism in simple terms:
When US manufacturing jobs moved to China in the 2000s, American workers saw their incomes drop dramatically - like a factory worker going from $30/hour at Ford to $12/hour at Walmart. Instead of accepting lower living standards, the system created an alternative solution through housing and credit.
As home prices rose rapidly (often 10-15% annually), workers could borrow against their home's appreciation through equity loans and refinancing. A worker whose house went from $150,000 to $300,000 could borrow $50,000 to maintain their lifestyle - buying trucks, boats, and continuing to consume as if their income hadn't dropped.
This created a win-win illusion: China got manufacturing jobs, US companies got higher profits from cheap labor, Americans got cheaper goods at stores like Walmart, and workers felt wealthy despite earning less. Nobody complained because everyone seemed to benefit in the short term.
The system worked as long as home prices kept rising, allowing people to keep borrowing against appreciation. But when housing prices stopped climbing around 2005, the illusion collapsed - workers were left with lower wages, massive debt, and no way to keep borrowing.
This mechanism essentially allowed America to maintain consumption by borrowing against future wealth rather than addressing the fundamental problem of job losses. The 2008 financial crisis was the inevitable result when this unsustainable system finally broke down.
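A minimal sketch of that mechanism, using made-up numbers in the same range as the comment (10-15% annual appreciation, an income gap borrowed against the rising equity):

```python
# Hypothetical household: wages dropped, consumption propped up by
# borrowing against home appreciation. All numbers are illustrative.
home_value = 150_000.0
appreciation = 0.12     # within the 10-15%/yr range cited above
wage_gap = 20_000.0     # annual income shortfall vs the old factory job
debt = 0.0

for year in range(5):
    home_value *= 1 + appreciation   # paper wealth goes up...
    debt += wage_gap                 # ...and gets borrowed to keep consuming

print(round(home_value), debt)  # 264351 100000.0
# The loop only "works" while appreciation stays positive; set it to 0
# and the household is left holding the debt plus the lower wage,
# which is exactly the post-2005 collapse described above.
```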
The "fearmongering" he is trying to create can be seen as self-serving, so his opinions should be taken with a very big grain of salt.
Blockchain, NFTs and 3D printing are still around and have vacuumed up billions and billions without the average person being able to tell an impact on their lives.
But at the time it was going to be the next big thing transforming everything.
Same as 3D printing. Certainly cool and useful in some niche contexts, but it has not disrupted manufacturing.
Housing is back …
Dotcom came back…
Nothing was a bubble. Dotcom rolled into a new paradigm shift with mobile in less than a decade. These aren't even significant timelines when you think about it.
So you pull out of the AI hype today, fine. These past recent bubbles show that everything ramps back up within five years.
AI-is-hype people are delusional. The computer has never been able to do what it’s doing today. We could only dream of it.
Sure, but do the math. It doesn’t work out yet. This stuff burns money and energy. Either revenue has to go up A LOT or costs do have to come down A LOT (or quality has to suffer by using smaller models).
Ironic how you can contradict yourself without realizing. The fact that something "came back", meant it WAS a bubble that popped.
https://en.wikipedia.org/wiki/AI_winter
I've known about it since 2010 or before, and anti-tech Luddites will act like it was never a thing. Shut up.
It looks likely it could take all the big software companies with it, and all the big cloud providers. It may well kill most GPU vendors, most datacentre and hosting companies. Industrial-scale LLMs are propping up the entire cloud business, and that itself was bloated and overgrown. SaaS was a mistake. Anything -aaS was a mistake.
I'd _like_ to see this kill off MS, Oracle etc.
Intel is teetering. NVidia is probably screwed. AMD may follow.
There's geopolitics here too. China wants Taiwan and has actively been divesting from Western hardware and software. So has Russia. Lots of Linux growth there: it's free, it works, they can just take it.
And there's rapid climate change too, which is starting to become visible.
Everyone who manufactures in Taiwan may well be doomed. But ditto everyone in the tropics, in the newer tech centres: Malaysia, Thailand, etc.
Everyone who gets chips from Taiwan is probably screwed. Everyone who assembles in PacRim and SE Asia too.
That will take down most Western companies.
Apple might weather it: it sells hardware, and it has its own unique OS family. But others make its hardware for it -- in those areas.
Chinese tech may bloom.
Small scale individual FOSS will be OK.
Stuff reusing legacy tech, that can run on old kit.
Everyone's deprecating x86-32. That may bite them hard.
Everything dependent on virtual stuff and public cloud, everything dependent on K8s and remote datacentres, everything you can't run locally on kit you own that sits in premises you own.
That includes a lot of the games industry.
Non-commercial OSes will be OK.
Bad times for RHEL and the clones. Bad times for SUSE and maybe Canonical.
Maybe OK for Debian. Good for Arch & Alpine & Slackware.
Stuff that needs GPUs, bad. Stuff that works fine in standard def on CPU graphics, good.
But I am just indulging my own biases and skepticism here, I freely admit.
The clearest example is in AI-generated visual content. If you dig through what people are doing, it's clear that only a very small % of users are actually getting truly high-quality, ready-for-production content, while the rest are just prompting pure slop. There is a skill level to this that hasn't really permeated the mainstream.
Once that happens, we might see some of that 95% waste figure change to maybe 50% waste.