The number of lost souls parroting the old "human eye can only see 30 fps" has gone down considerably over the years. The last 10 years were fantastic in that regard, despite the whole RGB craze.
Even CS servers have a 100 Hz heartbeat these days. Of course, by the time we get 1 kHz displays I'll be too old to enjoy them myself, but it will still likely put a bittersweet smile on my face.
Personally I believe the newly announced 300 Hz 27" 1440p monitors[0] are going to be the perfect sweet spot for the foreseeable future. I imagine it will be a long time before technology emerges that is a noticeable improvement over this.
[0]: https://www.nvidia.com/en-us/geforce/news/new-g-sync-monitor...
Realistically I think the two sweet spots are 120 Hz and 240 Hz - not necessarily because they are the best of the best, but because they are each divisible by both 24 and 30 (the most common frame rates of film and television) AND they offer two tiers of increased performance for different hardware requirements. You can run a much more taxing game at 120, and then if you want to spend the big bucks on the latest hardware, move up to 240.
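To make the divisibility point concrete, here is a quick arithmetic check (my own illustration; the 144 Hz row is added purely for contrast and isn't from the comment above):

    # 120 and 240 Hz divide evenly by both 24 and 30 fps, so every source
    # frame is held for a whole number of refreshes (no pulldown judder).
    for refresh in (120, 144, 240):
        for fps in (24, 30):
            print(f"{refresh} Hz / {fps} fps = {refresh / fps:g} refreshes per frame")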
As for resolution I completely agree with you - 1440p really is the sweet spot for 27" monitors. If display/DPI scaling improves across operating systems, then I think 4K will eventually become the norm for 27" monitors; it will show some improvement, but again with diminishing returns, like the difference between 120 and 240. That being said, as more film content moves to 4K, I think we will also start to see 1440p become less popular, since people will want to view content at a resolution that doesn't require scaling.
All of this however is nothing compared to the improvement that a true HDR display brings. A high end monitor that can show a large increase in dynamic range is such a game changer, and I do not think most people realize it yet. It brings us so much closer to how the human eye really sees that I think it is equivalent to the jump from LaserDisc resolution to 4K. And on top of that, now that cameras are also shooting with such massive dynamic ranges, it is going to make older content just look plain in comparison.
Yes! I have been crying about latency for nearly 10 years [1]. Computing has always optimised for throughput, and before Pro Gaming there just wasn't a marketable incentive for companies to work on minimising latency. Now there finally is!
Even in the best case scenario, the lowest latency is still 25ms, and in most cases we are still above 50ms. I think it is worth posting [2], Microsoft Research's work on input latency. It would be nice if we could get average end-to-end system latency down to the sub-10ms level, which is what I hope work on VR will bring us.
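For a sense of where those tens of milliseconds come from, here is a rough budget with purely illustrative numbers (my assumptions, roughly matching the ~25ms best case above, not measurements from the Microsoft work):

    # Illustrative end-to-end input latency budget; every figure below is an
    # assumption for the sake of the example, not a measurement.
    stages_ms = {
        "mouse polling (1000 Hz, avg wait)": 0.5,
        "game simulation (one 120 fps tick)": 8.3,
        "render queue + GPU": 7.0,
        "display scanout (mid-screen, 144 Hz)": 5.5,
        "panel response": 4.0,
    }
    for stage, ms in stages_ms.items():
        print(f"{stage:38s} {ms:5.1f} ms")
    print(f"{'total':38s} {sum(stages_ms.values()):5.1f} ms")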
Part of that however is also highly related to motion blur - many big directors have done tests in theaters showing "HFR" content (like 60fps), and audiences on average distinctly said they did not like it. The D-Day scene from Saving Private Ryan is a good example - it was not shown HFR, but they intentionally made the shutter speed faster to give it that staccato, jerky, gritty sort of feel. While in photography we use all kinds of shutter speeds for different effects (think of using a super fast shutter speed to freeze the propellers of a plane, or a very long shutter speed in a landscape photo so a river becomes a nice smooth blur), the movie industry mostly abides by the "180 degree shutter" rule, meaning the shutter speed is 1/(2 x frame rate). So for most cinema shot at 24fps the shutter speed is 1/48 of a second.
The importance of this is that because you are not shooting isolated still frames but a series of frames to be played back quickly, the exposure time adds a motion blur effect that smooths the transition between frames and creates a certain artistic look. There are technical limitations to this blurring (medium-fast pans across a scene are a great example - the whole frame becomes too blurred and is hard to see), but any scene with slower moving objects such as people gets a sort of natural motion blur that many cinematographers believe is an artistically ideal choice.
Now that being said, you do not need to abide by the 180 degree shutter with modern cameras (like Saving Private Ryan), and one can theoretically choose a variety of shutter speeds for different scenes regardless of what FPS one is shooting at. A fast pan could be shot at something like 1/120, and even at 24fps it will appear much sharper and easier to make out individual objects (although perhaps not quite as smooth in the panning motion). However, you are theoretically limited on the slow end to a shutter speed of roughly 1 over your frame rate (otherwise the shutter would be open LONGER than the frame itself and defeat the purpose of shooting "frames" in the first place). So we could move to 48fps content, still shoot at 1/48, and have the same amount of motion blur PER FRAME while doubling the number of frames, which would be a large improvement in a technical sense. I haven't seen any films shot this way, but I have experimented quite a bit with my own camera at these kinds of speeds and it works quite well. You can also shoot at 24fps and drag the shutter to 1/24 to gain a full stop (double the amount of light) vs normal 24fps footage if you are shooting in a very dark environment that is already pushing the limits of your camera's sensor. This of course introduces even more motion blur, but depending on the scene it may not be very noticeable, or it may even introduce interesting artistic looks.
TLDR: I think we should move to 48 FPS and shoot most content at a variety of shutter speeds, the most common being the already standard 1/48s shutter, and either increase or decrease that within reason depending on the nature of the scene and the desired artistic outcome.
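To make the shutter arithmetic concrete, here is a minimal sketch of the relationship described above (exposure time = (shutter angle / 360) / fps); the printed values just reproduce the examples from the comment:

    # 180-degree shutter rule: exposure_time = (shutter_angle / 360) / fps
    def exposure_time(fps, shutter_angle=180):
        return (shutter_angle / 360) / fps

    print(exposure_time(24))        # 1/48 s - the classic cinema look
    print(exposure_time(48))        # 1/96 s - 48 fps at a 180-degree shutter
    print(exposure_time(48, 360))   # 1/48 s - same per-frame blur, twice the frames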
That being said, in the Quake days I don't think monitors could go over 60 Hz anyway, so even at 120fps you were not gaining an advantage similar to what we have today. From what I remember, however, there were other advantages to high FPS in games like Counter-Strike in terms of player movement - the monitor might have "smoothed" the motion back down to 60fps, but it still resulted in a more accurate experience.
I forget how refresh rates worked on CRTs though - maybe those could go higher than 60?
And of course you can overclock an LCD monitor quite easily - most will not go much higher, but there are some that I got to 90 Hz, which (in my opinion) is a massive improvement compared to 60; that 30 Hz difference is a much, much larger jump than the next one from 90 to 120 Hz.
Yes, even the standard VGA mode 13h (320x200x8) is 70Hz, and many CRTs could do 85Hz. By Quake 3's time, CRTs that could do 120Hz and above were very common. Personally I have such a CRT, as well as another that can do 160Hz.
Also FWIW, the refresh rate is only part of the story - CRTs have practically instant "response time", so 120Hz on a CRT vs 120Hz on an LCD feels very different (in favor of the CRT). Supposedly OLED could be made to come close, but personally I haven't seen such a case (and people who have both OLEDs and CRTs still say that CRTs are better there). I have a 165Hz LCD and it doesn't hold a candle to the CRTs I have around in terms of motion feel.
Nowadays you can find small-ish CRTs for dirt cheap on Facebook Marketplace, etc. (some people even give them away for free) - I recommend trying to find one that can do 120Hz, if for no other reason than to experience the liquid-butter smoothness of FPS motion (and to join us in lamenting its loss in modern monitor tech :-P). It's also kinda amusing that when those were new, chances are the PCs they were used with couldn't reach high framerates (and low framerates do not feel as bad on a CRT as on an LCD, though I'm not sure if that is related).
I couldn't understand "congestion control relied on packet loss" - could somebody explain? Thanks!
Does it mean "congestion control is triggered by a packet loss event, which is a signal that the buffer is full"?
Input lag is the time between when you perform an action and when the computer shows it on screen. It depends on your frame rate, refresh rate, and peripheral polling rate, as well as how well the game schedules things (which is what LatencyFleX tries to optimize).
Network ping, on the other hand, is often hidden away. Whether you are on a 2ms ping or a 100ms ping, the bullet always goes where you aim: this is done through rollback netcode [1], which rewinds the server state to the time the action was performed. I'm not saying that having low ping is pointless - it has an effect on things like peeker's advantage - but the effect of network ping is drastically different from the effect of input lag.
However, what you are aiming at may not actually be where you see it. If it's another player or something critical to multiplayer gameplay, then what you see at that location is whatever the server last told you, and latency means that information is delayed. So you may think you have a clear hit on a target, but with latency the target may have moved and you haven't gotten an update yet. Or by the time your fire command gets to the server, the target has already moved.
This can get pretty confusing, especially if the game does client-side hit effects (like bullet impacts). You shoot, you see immediate feedback of the bullet hitting, but due to latency your target moved and you missed - so the feedback is false. But if you don't do the client-side hit effect, there's a strange, subtle delay that feels wrong: the client tells the server it fired, the server determines the hit location, the server sends back a "hit here" message, and only then does the client draw the hit effect.
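A toy sketch of the server-side rewind ("lag compensation") idea being described - this is my own illustration, not any engine's actual code, with positions simplified to single numbers:

    # The server keeps a short history of target positions and validates a shot
    # against where the target was when the shooter actually fired, i.e. the
    # current server time minus the shooter's latency.
    from collections import deque

    HISTORY_SECONDS = 1.0

    class TargetHistory:
        def __init__(self):
            self.samples = deque()          # (timestamp, position) pairs

        def record(self, t, pos):
            self.samples.append((t, pos))
            while self.samples and self.samples[0][0] < t - HISTORY_SECONDS:
                self.samples.popleft()

        def position_at(self, t):
            best = None                     # newest sample at or before time t
            for ts, pos in self.samples:
                if ts <= t:
                    best = pos
            return best

    def validate_hit(server_time, shooter_latency, aim_point, target, tolerance=0.5):
        rewound = target.position_at(server_time - shooter_latency)
        return rewound is not None and abs(rewound - aim_point) <= tolerance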
Valve has a writeup at https://developer.valvesoftware.com/wiki/Source_Multiplayer_... which covers some of the issues with network latency.
It definitely uses a lot of network-theory language so it reads like a foreign language to me (in a good way).
Fortunately for us humans, that seems to stop at 120 Hz, because most games can't even hold that at a steady rate with a 3090.
Now whether a 300+W gaming device is interesting in the long run will be answered this year by your electricity bill!
I don't know where you got the notion that it stops at 120 Hz. It's been proven again and again that even on a monitor with a low refresh rate, you still get a better experience by having more fps available - and better still when you have both the frames and the refresh rate.
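One back-of-the-envelope way to see why (my numbers, purely illustrative): the higher the fps, the fresher the frame the monitor scans out, even at a fixed 60 Hz refresh.

    # At a fixed 60 Hz refresh, rendering faster means the most recently
    # completed frame is newer (roughly half a frame time old on average).
    for fps in (60, 120, 300):
        frame_time_ms = 1000 / fps
        print(f"{fps:3d} fps -> frame shown is on average ~{frame_time_ms / 2:.1f} ms old")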
Not that big of a deal when you are all getting together in real life. You can see if you were at a disadvantage due to equipment. Online, you don't get to see.
[1] In a particular game I discovered that abusing the R/G/B controls to produce something that, in normal conditions, looks almost like one of those colorblind simulations would give you a massive advantage, to the point of most players calling hacks.
I'll be honest, it sounds like you have no idea how competitive gaming works, or any sport at all. Your comments sound exactly like thinking one can be better at football by buying more expensive boots.
There are definitely real advantages to running higher framerates, but it's all diminishing returns. If your rig is fast enough to maintain CONSISTENT frame times, that is probably more important than going for the absolute highest number - there is a reason many people set a minimum/maximum FPS target. Consistency is important.
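A tiny frame-cap sketch of what such a target looks like in practice (illustrative only; render_frame is a hypothetical callback, and a real engine would use its own pacing mechanism):

    # Trade peak fps for steady pacing: sleep until the next 1/target_fps boundary.
    import time

    def run_capped(render_frame, target_fps=120, frames=600):
        frame_budget = 1.0 / target_fps
        next_deadline = time.perf_counter()
        for _ in range(frames):
            render_frame()
            next_deadline += frame_budget
            sleep_for = next_deadline - time.perf_counter()
            if sleep_for > 0:
                time.sleep(sleep_for)   # consistent frame times beat occasional bursts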
Most of the really competitive games aren't all that hard to run these days either - even an older but still decent gaming computer does great. Considering how cheap it is now to build something that will run most games well vs how it used to be, I think the pay-to-play aspect has actually REDUCED quite a bit. You can buy a quality mouse these days for $20-30 that would blow away what we had 10 years ago, and that mouse will probably last you as long as you want.
Reminds me of an old racquetball tournament where they would put everyone in the same division (as in, the top players intermingled with the bottom), but depending on the player's skill they were given a kind of massive handicap. If you were in the very top tier, they gave you a racquet that had been strung in a very clever circular way so that there was a literal HOLE right in the middle of the racquet, where the sweet spot was!
Spoiler: Those guys still often won
Not really new, mind. LAN parties that had that one person who spent way more than everyone else were a thing. Did they automatically win? No. But they punched above their weight at the party.
Also, higher refresh will give skilled players an advantage, but it's def not pay to win.