I am kinda sad we have reached the point where native resolution is no longer the standard for upper-mid to lower-high tier GPUs. Surely games should run natively at non-4K resolutions on my 700€+ GPU...
And now antialiasing is so good that you can start from a lower resolution and still fake even higher quality
It's really the same problem as in synthesizing audio. 44.1 kHz is adequate for most audio purposes, but if you generate sounds with content past the Nyquist frequency, it will alias and fold back in undesirable ways, distorting the audible content. So you oversample, filter out the high-frequency content, and downsample to antialias (roughly the equivalent of SSAA), or you build the audio from band-limited impulses or steps.
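As a concrete illustration, here's a minimal Python sketch of that oversample-filter-downsample approach, assuming NumPy/SciPy; the sample rates, the 4x factor, and the test frequency are just illustrative:

```python
# Minimal sketch of antialiased synthesis via oversampling, the audio
# analogue of SSAA. Rates and the 4x factor are illustrative assumptions.
import numpy as np
from scipy.signal import decimate

SR = 44_100      # target sample rate
OS = 4           # oversampling factor: synthesize at 176.4 kHz
FREQ = 3000.0    # a sawtooth this high has many harmonics past Nyquist
DUR = 1.0        # seconds

def naive_saw(freq, sr, dur):
    """Naive sawtooth: harmonics above sr/2 fold back as inharmonic junk."""
    t = np.arange(int(sr * dur)) / sr
    return 2.0 * ((t * freq) % 1.0) - 1.0

aliased = naive_saw(FREQ, SR, DUR)   # synthesized straight at 44.1 kHz

# Oversample, then let decimate() low-pass filter below the target
# Nyquist before discarding samples -- the "SSAA" of audio.
clean = decimate(naive_saw(FREQ, SR * OS, DUR), OS)
```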
By games I mean modern AAA first- or third-person games. 2D games and the like will often still run at full resolution.
New monitors default to 60 Hz, but folks looking to game are convinced by ads that they lost that last round not because of the SBMM algorithm, but because the other player undoubtedly had a 240 Hz 4K monitor rendering the player coming around the corner a tick faster.
Competitive gaming and Twitch are what pushed the current priorities, and the hardware makers were only too happy to oblige.
For a bit of background: modern games tend to do game processing and rendering in parallel, which means the frame being processed by the rendering system is the previous frame. Once rendering has been submitted to the graphics card, it can take one or more additional frames before the result is actually visible on the monitor. So you end up with a lag of 3+ frames rather than the single frame you had in old DOS games and the like. Having a faster monitor, and being able to render frames at that faster rate, therefore gives you some real benefit.
This is also why using frame generation can actually hurt the gaming experience: instead of waiting 3+ frames to see your input reflected on screen, you end up with something like 7+ frames, because the fake in-between frames don't actually respond to any input.
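To put rough numbers on it, here's a back-of-the-envelope latency model in Python; the stage counts and the one-frame hold-back for interpolation are assumptions for illustration, not measurements from any particular engine:

```python
# Back-of-the-envelope input-to-photon latency for a pipelined renderer.
# Stage counts are illustrative assumptions, not engine measurements.

def latency_ms(real_fps: float, pipeline_frames: float) -> float:
    """Delay from sampling input to the result reaching the screen."""
    return pipeline_frames * 1000.0 / real_fps

# Simulation (1) + rendering the previous frame (1) + display queue (1+).
PIPELINE = 3

print(f"real 60 fps:          {latency_ms(60, PIPELINE):.1f} ms")   # 50.0 ms
print(f"real 240 fps:         {latency_ms(240, PIPELINE):.1f} ms")  # 12.5 ms

# 2x frame generation: the panel shows 120 fps, but input is still sampled
# at the real 60 fps, and the interpolator holds back roughly one extra
# real frame so it has two real frames to blend between. Counted in
# output frames, that lands around 2 * 3 + 1 = 7+ frames of visible lag.
print(f"60 fps + 2x framegen: {latency_ms(60, PIPELINE + 1):.1f} ms")  # ~66.7 ms
```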
I recall playing games at 100 FPS on my 100 Hz CRT. People seriously interested in multiplayer shooters at the time turned vsync off and aimed for even higher frame rates. It was with this in mind I was quick to upgrade to a 144 Hz display when they got cheap enough: I was taking back territory from when the relatively awful (but much more convenient) flat screens took over.
> because the other player undoubtedly had a 240 Hz 4K monitor rendering the player coming around the corner a tick faster.
I play 99% single player games and in most of those, response time differences at that scale seem inconsequential. The important difference to me is in motion clarity. It's much easier to track moving objects and anticipate where they will be when you get more frames of animation along their path. This makes fast-paced games much more fun to play, especially first person games where you're always rapidly shifting your view around.
For me, it's not quite as big of a jump as, say, when we went from SD to HD TV, but it's still a big enough leap that I don't consider it gimmicky.
Gaming in 4K, on the other hand, I don't really care for. QHD is plenty, but I do find 4K makes for slightly nicer desktop use.
Edit: I'll add that I almost always limit FPS anyway because my GPU turns into a jet engine under high load and I hate fan noise, but that's a different problem.
It’s only when LCDs appeared that 60 Hz started being a thing on PCs, and 60 fps followed as a consequence, because the display can’t show more anyway.
It’s true that competitive gaming has pushed the priority of performance, but that already happened in the 90s with Quake II. There’s nothing fake about it either. At the time, a lot of play happened at LANs, not online. The person with the better PC got better results, something repeatedly reproduced by rotating people around the available PCs.