Wonder if the AI rush will result in a situation where the state of the art is so far beyond what's needed for gaming that gpus won't be a bottleneck anymore.
Expect a lot of creatively bankrupt tech demos with eye-watering hardware requirements.
Graphics programming is a lost art, buried deep below an *unreal* number of abstraction layers
> Wonder if the AI rush will result in a situation where the state of the art is so far beyond what's needed for gaming that gpus won't be a bottleneck anymore.
I dunno, it seems the scaling is different for AI. AI is more about horizontal scaling (more cards, more nodes), while gaming is more about vertical scaling on a single card (once you're past native 4K there's not much resolution left to chase).
I've successfully run (not trained) local models * on my Mac mini, which cost less than a single video card anyway.
* Ones that fit in my RAM. They were probably slower than the FOMO hardware, but good enough.
I guess the limited amount of RAM on consumer cards is also a deliberate way to segment them from the datacenter parts.
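The "fits in my RAM" test above is easy to ballpark: weight memory is roughly parameter count times bytes per weight. A minimal sketch, where the ~20% runtime overhead and the 16 GB machine are my assumptions, not measurements:

```python
# Back-of-envelope check: does a local model fit in RAM?
# The overhead factor and RAM figure below are illustrative assumptions.

def model_ram_gb(params_billions: float, bits_per_weight: int,
                 overhead: float = 1.2) -> float:
    """Rough estimate: parameters * bytes per weight, plus ~20%
    (assumed) for KV cache and runtime buffers."""
    weight_bytes = params_billions * 1e9 * (bits_per_weight / 8)
    return weight_bytes * overhead / 1e9

# A 7B model at 4-bit quantization vs. full fp16:
for label, bits in [("7B @ 4-bit", 4), ("7B @ fp16", 16)]:
    gb = model_ram_gb(7, bits)
    fits = "fits" if gb <= 16 else "does not fit"
    print(f"{label}: ~{gb:.1f} GB -> {fits} in a 16 GB machine")
```

Which is why quantized 7B-class models run fine on a base Mac mini while the fp16 versions don't.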