Am I missing something, or has the need for vast GPU horsepower been solved? Those requirements weren't in DCs before, and they're only going up. Whatever way you look at it, there's got to be an increase in power consumption somewhere, no?
You can pick and choose your comparisons and make an increase appear or not.
Take weather forecasting as an example. It uses massively powerful computers today. If you compare that with the lack of forecasts two hundred years ago, there obviously is an increase in power usage (no electricity was used then), or there obviously isn't (today's result is something we didn't have then, so it would be an apples-to-nothing comparison).
If you say "the GPUs are using power now that they weren't using before" you're implicitly doing the former kind of comparison. Which is obviously correct or obviously wrong ;)
AI is still very near the beginning of its optimization process. We're still using (relatively) general-purpose processors to run it. Dedicated accelerators are beginning to appear. Many software optimizations will be found. FPGAs and ASICs will be designed and fabbed. Process nodes will continue to shrink. Moore's law will continue to drive costs down exponentially over time, as it has for every other workload.
There's absolutely no guarantee of this. The continuation of Moore's law is far from certain (NVIDIA think it's dead already).
Perhaps that's what Jensen says publicly, but Nvidia's next-generation chip contains more transistors than the last, and the one after that will too.
Let me know when they align their trillions of dollars behind smaller, less complex designs; then I'll believe they think Moore's law is out of juice.
Until then, they can sit with the group of people who've been vocally wrong about Moore's law's end for the last 50 years.
Our chips are still overwhelmingly 2D in design, just a few dozen layers thick but billions of transistors wide. We have quite a ways to go based on a first principles analysis alone. And indeed, that's what chip engineers like Jim Keller say: https://www.youtube.com/watch?v=c01BlUDIlK4
So ask yourself how it benefits Jensen to convince you otherwise.
Of course they've been a thing, but in specialised situations: render farms, maybe, or backroom mining centers. It's disingenuous to claim there isn't exponential growth in GPU usage.
Jesting aside, the use of digital computation has exploded exponentially, for sure. But alongside that explosion, fueled by it and fueling it in return, the cost (in energy and dollars) of each computation has plummeted exponentially.
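To make that concrete, here's a toy back-of-the-envelope sketch (Python, with entirely made-up growth rates, not real figures): total energy is just computations performed times energy per computation, so when demand grows exponentially while efficiency also improves exponentially, whether total energy rises or falls depends on which rate is larger.

    # Toy sketch; the growth and efficiency rates below are hypothetical assumptions.
    def total_energy(years, demand_growth, efficiency_gain,
                     base_ops=1.0, base_joules_per_op=1.0):
        """Relative total energy after `years`, given yearly demand-growth
        and efficiency-gain factors."""
        ops = base_ops * demand_growth ** years                        # computations performed
        joules_per_op = base_joules_per_op / efficiency_gain ** years  # energy per computation
        return ops * joules_per_op

    # Scenario A: demand doubles yearly, efficiency improves only 1.5x/year -> energy climbs.
    print(total_energy(10, demand_growth=2.0, efficiency_gain=1.5))  # ~17.8x starting energy

    # Scenario B: efficiency improves faster than demand grows -> energy falls.
    print(total_energy(10, demand_growth=1.5, efficiency_gain=2.0))  # ~0.06x starting energy

Which scenario the real world resembles is exactly what the thread is arguing about; the point is only that "more GPUs" and "more total power" aren't automatically the same claim.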