The Meta link does not support the point. It's actually implying an MTBF of over 5 years at 90% utilization, even if you assume there's no bathtub curve. Pretty sure that lines up with the depreciation period.
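For anyone who wants to check the arithmetic, here's a rough sketch using the figures commonly quoted from Meta's Llama 3 paper (roughly 16,384 GPUs, a ~54-day training window, ~419 unexpected interruptions); treat the exact numbers as illustrative rather than authoritative:

```python
# Back-of-envelope MTBF from fleet-wide failure counts.
# Figures below are the commonly cited Llama 3 pre-training numbers;
# they are illustrative, not an official spec.
gpus = 16_384
days = 54
failures = 419

gpu_days_per_failure = gpus * days / failures   # ~2,100 GPU-days per failure
mtbf_years = gpu_days_per_failure / 365.25

print(round(mtbf_years, 1))  # ~5.8 years per GPU
```

Even before adjusting for utilization below 100%, that's comfortably over 5 years per GPU, which is the point above.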
The Google link is even worse. It links to https://www.tomshardware.com/pc-components/gpus/datacenter-g...
That article makes a big claim, does not link to any source. It vaguely describes the source, but nobody who was actually in that role would describe themselves as the "GenAI principal architect at Alphabet". Like, those are not the words they would use. It would also be pointless to try to stay anonymous if that really were your title.
It looks like the ultimate source of the quote is this Twitter screenshot of an unnamed article (whose text can't be found with search engines): https://x.com/techfund1/status/1849031571421983140
That is not merely an unofficial source. It's just made-up trash that the blog author lapped up, despite its obviously unreliable nature, because it confirmed his beliefs.
You're assuming this is normal, for the MTBF to line up with the depreciation schedule. But isn't the MTBF of data center hardware usually quite a bit longer than the depreciation schedule? If I recall correctly, for servers it's typically double or triple, roughly. Maybe less for GPUs (I'm not directly familiar), but a quick web search suggests these periods shouldn't line up for GPUs either.
But you can see how that works: go to colab.research.google.com. Type in some code, "!nvidia-smi" for instance. Click the down arrow next to "Connect" and select "Change runtime type". 3 out of 5 GPU options are nVidia GPUs.
Frankly, unless you rewrite your models, you don't really have a choice but to use nVidia GPUs, thanks, ironically, to Facebook (the authors of PyTorch). There is pytorch/XLA automatic translation to TPU, but it doesn't work for "big" models. And a point of advice: you want stuff to work on TPUs? Do what Googlers do: use Jax ( https://github.com/jax-ml/jax ). Oh, and look at the commit logs of that repository to get your mind blown, btw.
In other words, Google rents out nVidia GPUs to their cloud customers (with the hardware physically present in Google datacenters).
> When companies buy expensive stuff, for accounting purposes they pretend they haven’t spent the money; instead they “depreciate” it over a few years.
There's no pretending. It's accounting. When you buy an asset, you own it; it is now part of your balance sheet. You incur a cost when the value of the asset falls, i.e. it depreciates. If you spend 20k on a car, you are not pretending not to have spent 20k by considering it an asset; you spent money, but now you have something of similar value as an asset. Your cost is the depreciation as the years go by and the car becomes less valuable. That's a very misleading way to put it.
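To make the mechanics concrete, here's a minimal sketch of straight-line depreciation for the 20k car above, using a hypothetical salvage value and useful life:

```python
def straight_line(cost, salvage, useful_life_years):
    """Annual depreciation expense under straight-line accounting."""
    return (cost - salvage) / useful_life_years

# Hypothetical numbers: the 20k car is worth ~5k after a 5-year useful life.
annual_expense = straight_line(20_000, 5_000, 5)
print(annual_expense)  # 3000.0 of expense recognized each year
```

The cash left on day one, but the expense shows up 3k at a time as the asset loses value. Nothing is being hidden; the balance sheet carries the car at its declining book value the whole way.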
> Management gets to pick your depreciation period, (...)
They don't. GAAP, IFRS, or whatever other accounting rules apply to the company do. There's some degree of freedom in certain situations, but it's not "management wants". And it's funny that the author thinks companies are generally interested in defining longer useful lives, when in most cases (depending on other tax considerations) it's the opposite: depreciation is a non-cash expense, but it produces real cash by reducing your taxable income, and the sooner you get that money the better. There's more nuance to this (tax vs. accounting, how much freedom management has vs. what industry practice and auditors will allow), but my point is, again, that "management gets to pick" is not an accurate representation of what goes on.
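The "sooner is better" timing effect is easy to see in a toy present-value calculation (the tax rate and discount rate below are hypothetical): the same total deduction is worth more in cash terms when it arrives earlier.

```python
def pv_tax_shield(depreciation_schedule, tax_rate, discount_rate):
    """Present value of the cash tax savings a depreciation schedule produces.

    Each year's depreciation reduces taxable income, saving
    (depreciation * tax_rate) in cash that year.
    """
    return sum(
        d * tax_rate / (1 + discount_rate) ** (year + 1)
        for year, d in enumerate(depreciation_schedule)
    )

cost, tax, r = 100_000, 0.21, 0.08          # hypothetical rates
slow = [cost / 5] * 5                        # five-year straight line
fast = [cost]                                # immediate expensing

print(pv_tax_shield(fast, tax, r) > pv_tax_shield(slow, tax, r))  # True
```

Both schedules deduct the same 100k in total, but front-loading it yields a larger present value of tax savings, which is exactly why companies usually don't want *longer* useful lives for tax purposes.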
> It’s like this. The Big-Tech giants are insanely profitable but they don’t have enough money lying around to build the hundreds of billions of dollars worth of data centers the AI prophets say we’re going to need.
Actually, they do. Meta is the one with the least, but it could still easily raise that money. Meta just thinks it's a better deal to share risk with investors who, at the moment, have a very strong appetite to own these assets. Meta is actually paying a higher rate through these SPVs than it would by funding them outright. Now, personally, I don't know how I'd feel about that particular deal as an investor, because you need to dig a little deeper into their balance sheet to get a good snapshot of what's going on, but it's not a trick; arguably it can make economic sense.
Actually the author has worked for Google, Amazon (VP-level), Sun, and DEC; and was a co-creator of XML.
2. That level of seniority does, on the other hand, expose them to a lot of the shenanigans going on in those companies, which could credibly lead them to develop a "big tech bad" mindset.
BUT (my point)
is that the article is terrible at reflecting all of that, and makes wrong and misleading statements about it.
The idea that companies depreciating assets is them "pretending they haven't spent the money" or that "management gets to pick your depreciation period" is simply wrong.
Do you think either of those statements is accurate?
P.S. Maybe you make a good point, I said that I suspected based on those statements that he had little financial knowledge. tbh I didn't know the author, hence the "suspect". But now that you say that it might be that he is so biased in this particular topic that he can't make a fair representation of his point. Irrespective of that, I will say it again: statements like the ones I've commented are absurd.
Wouldn't AI largely be a race to the bottom? As such, even if expensive employees get replaced, the value captured by replacing them might not be that big. It might only barely cover the costs of inference, for example. So might it be that profits will actually be a lot lower than the cost of the employees being replaced?
To the second point, the race to the bottom won't be evenly distributed across all markets or market segments. A lot of AI-economy predictions focus on the idea that nothing else will change or be affected by second and third order dynamics, which is never the case with large disruptions. When something that was rare becomes common, something else that was common becomes rare.
"Special Purpose Vehicles" reminds me of "Special Purpose Entities" from the 90s and 00s, e.g., for synthetic leases
I thought there was a US IRS rule, changed sometime in the past 10-15 years, that made companies depreciate computer hardware over 1 year. Am I misremembering?
I thought that rule was the reason why many companies increased the lifetime of employee laptops from 3 to 5 years.
Somewhere around 1999, my high school buddy worked overtime shifts to afford a CPU he had waited forever to buy. Wait for it: it was a 1 GHz CPU!
Except for the physical buildings, permitting, and power grid build-out.
This is how "serverless" became a thing btw.
Those are extremely localized at a bunch of data centers, and how much of that will see further use? And how much grid work has really happened? (There are a lot of announcements about plans to maybe build nuclear reactors, etc., but those projects take a lot of time, if they get done at all.)
nVidia managed to pivot their customer base from crypto mining to AI.
As much as there is market for somewhat-less-expensive data centers. (Data centers where somebody else already paid the cost of construction.)
And where they are doesn't matter. The internet is good at shipping bits to various places.
This means that society as a whole is perhaps significantly poorer than if LLMs had been properly valued (i.e. not a bubble), or had simply never happened at all.
Unfortunately it will likely be the poorest and most vulnerable in our societies that will bear the brunt. 'Twas ever thus.
I think the first part of this is probably true, but I don’t think everyone knows it. A lot of people are acting like they don’t know it.
It feels like a bubble to me, but I don’t think anyone can say to a certainty that it is, or that it will pop.
Or they're acting like they think there's going to be significant stock price growth between now and the bubble popping. Behaviors aren't significantly different between those two scenarios.
Putting your statement another way, if you and I can see the bubble, then it's almost a certainty that the average tech CEO also sees a bubble. They're just hoping that when the music stops, they won't be the one left holding the bag.
I’m guessing the author meant it tongue in cheek but really meant “everyone I know or follow knows it’s a bubble”
It's more accurate to say that bubbles rely on most people being blind to the bubble's nature.
When the bubble pops, do you fire _even more_ people? What does that look like given the decimation in the job market already?
Equivocating about what YOU would prefer to call it is wasted effort that I don't care to engage in.
I think people need to realize that if the bubble gets bad enough, there will absolutely, positively, 100% be a bailout. Trump doesn't care who you are or what you did, as long as you pay enough (both money and praise) you get whatever you want, and Big Tech has already put many down payments. I mean, they ask him "Why did you pardon CZ after he defrauded people? Why did you pardon Hernandez after he smuggled tons of cocaine in?" and he plainly says he doesn't know who they are. And why should he? They paid, there's no need to know your customers personally, there's too many of them.
[citation needed]
> Anyhow, there will be a crash and a hangover. I think the people telling us that genAI is the future and we must pay it fealty richly deserve their impending financial wipe-out. But still, I hope the hangover is less terrible than I think it will be.
Yup. We really seem to be at a point where everyone has their guns drawn under the table and we're just waiting for the first shot—like we're living in a real-world, global version of Uncut Gems.
People have been calling Bitcoin a bubble since it was introduced. Has it popped? No. Has it reached the popularity and usability crypto shills said it would? Also no.
AI on the other hand has the potential to put literally millions of individuals out of work. At a minimum, it is already augmenting the value of highly-skilled intellectual workers. This is the final capitalism cheat code. A worker who does not sleep or take time off.
There will be layoffs and there will be bankruptcies. Yes. But AI is never going to be rolled back. We are never going to see a pre-AI world ever again, just like Bitcoin never really went away.
Renewed interest by the Trump clan, with Lutnick's Cantor Fitzgerald handling Tether collateral in Nayib Bukele's paradise, wasn't easy to predict either.
Neither was the recent selloff. It would be hilarious if it was for a slush fund for Venezuelan rebels or army generals (bribing the military was the method of choice in Syria before the fall of Assad).
Bitcoin/crypto doesn't have earnings reports, but many crypto-adjacent companies have crashed down to earth. It would have been worse but regulation, or sometimes lack thereof, stopped them from going public so the bleeding was limited.
The Bitcoin bubble, if anything, deflated. But I'd still disagree with this characterisation because the market capitalisation of Bitcoin only seems to be going up.
Going by the logic of supply and demand, as more and more Bitcoin is mined, the price should drop because there's more supply. But what I've observed is that the value has been climbing over the past few years, and has remained relatively stable.
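The supply side here is actually quantifiable: Bitcoin's block subsidy started at 50 BTC and halves every 210,000 blocks, so new issuance slows over time and total supply asymptotes toward 21 million coins rather than growing without bound. A quick sketch (ignoring satoshi-level rounding in the real protocol):

```python
def total_supply(halvings=33):
    """Approximate total BTC ever issued, summing the geometric
    halving schedule: 210,000 blocks per era, subsidy halving each era.

    33 eras is enough for the sum to converge for this illustration;
    the real protocol uses integer satoshi arithmetic.
    """
    subsidy, total = 50.0, 0.0
    for _ in range(halvings):
        total += 210_000 * subsidy
        subsidy /= 2
    return total

print(round(total_supply() / 1e6, 2))  # ~21.0 million BTC hard cap
```

So "more and more Bitcoin is mined" is true but the *rate* of new supply keeps shrinking, which weakens the supply-glut argument.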
In any case, it's hard to argue that more people are using Bitcoin and crypto now compared to 5 years ago. Sure, NFTs ended up fizzling out, but, to be honest, they were a stupid idea from the beginning, anyway.
(And putting masses of people out of work and thereby radically destabilizing capitalist societies, to the extent it is a payoff, is a payoff with a bomb attached.)
AI companies are releasing useful things right this second. Even if those things still require human oversight, they can significantly accelerate many tasks.
This has been true since, say, 1955.
> This is the final capitalism cheat code. A worker who does not sleep or take time off.
That's the hope driving the current AI bubble. It has never been true, nor will it be with the current state of the art in AI. This realization is what is deflating the bubble.
I mean, to one degree or another, this is correct. Some things are not going back into the genie's bottle.
The technology will remain, of course, just like we still have railways, and houses.
But, and this is key, AI is not going away for as long as the potential to replace human labour remains there.