By the way, I thought these AI techniques were for increasing resolution (upscaling), not frame rate. Why doesn't it work that way?
Just moving my mouse around, I can tell the difference between 60 and 144 fps when I move my pointer from my main monitor (144 Hz) to my second monitor (60 Hz).
Watching text scroll is noticeably smoother, with less eye-tracking motion blur, at 144 Hz than at 60.
An object moving across my screen at 144 fps travels fewer pixels per frame than it does at 60 fps, and that gain in motion fluidity is noticeable.
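For concreteness, here's a back-of-the-envelope sketch in Python (the 1920 px/s pan speed is just an example number I picked, roughly one screen-width per second on a 1080p panel, not anything measured):

```python
# Rough sketch: how far a panning object moves between consecutive
# frames at different refresh rates. On a sample-and-hold display,
# the eye-tracking blur extent is roughly this same per-frame
# distance, since the panel holds each frame for its full duration.

def per_frame_travel_px(speed_px_per_s: float, fps: float) -> float:
    """Pixels the object moves between consecutive frames."""
    return speed_px_per_s / fps

speed = 1920.0  # assumed example: one 1080p screen-width per second

for hz in (60.0, 144.0):
    print(f"{hz:>5.0f} Hz: {per_frame_travel_px(speed, hz):5.1f} px per frame")

# 60 Hz -> 32.0 px per frame; 144 Hz -> ~13.3 px per frame
```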
Though some CRT emulation techniques require even higher refresh rates than that to reproduce realistic 'flicker' effects.
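My rough understanding of why (an assumption on my part, based on how black-frame-insertion and beam-simulation shaders are usually described) is that each content frame gets split into several displayed subframes, so the panel needs a multiple of the content rate:

```python
# Hypothetical back-of-the-envelope check: if a CRT-style shader
# splits each 60 fps content frame into N subframes (flashed image,
# black periods, or a rolling scan), the panel must refresh at
# N * 60 Hz to show them all.

content_fps = 60
for subframes in (2, 4, 8):  # 2 ~ plain black frame insertion
    print(f"{subframes} subframes -> needs {subframes * content_fps} Hz panel")

# 2 -> 120 Hz; 4 -> 240 Hz; 8 -> 480 Hz
```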