Just remember that stuff like Red Dead Redemption ran on those things with all of 512 MB of memory (unified on the 360, split 256/256 on the PS3). It ran and looked better than Borderlands 4 does on current consoles.
The 360/PS3 generation was a huge jump forward but very limited by today's standards. RDR was one of the better-looking games of the generation, but it could not maintain a steady 30 fps at 1080p/i (and I'm not sure it was even true 1080).
The PC version came later with higher-resolution textures and other graphical improvements, so it compares more favourably to modern games when you play it today. It still had problems running on all but the highest-end PCs of the time.
Of course, even low-end PCs can run it today without breaking a sweat, because hardware has become that much more powerful.
The performance problems in modern games, though, are often caused not by fill-rate-versus-resolution bottlenecks but by poor engine-architecture decisions, such as triggering shader recompilation in the hot path.
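To make the hot-path point concrete, here's a minimal, self-contained C++ sketch. It isn't any real engine or graphics API: `compile_variant` just sleeps to stand in for a driver-side pipeline compile, and the names (`get_lazy`, `Cache`) are made up for illustration. It shows why compiling a shader variant on first use inside the frame loop produces a visible hitch, while pre-warming the cache at load time keeps every frame cheap:

```cpp
// Toy model, not a real engine: "compile_variant" burns 100 ms to stand in
// for an expensive driver-side shader/pipeline compile.
#include <chrono>
#include <cstdio>
#include <string>
#include <thread>
#include <unordered_map>

struct Pipeline { std::string name; };
using Cache = std::unordered_map<std::string, Pipeline>;

// Stand-in for the driver compile (often tens to hundreds of ms per variant).
Pipeline compile_variant(const std::string& key) {
    std::this_thread::sleep_for(std::chrono::milliseconds(100));
    return Pipeline{key};
}

// Anti-pattern: compile on first use, inside the per-frame hot path.
const Pipeline& get_lazy(Cache& cache, const std::string& key) {
    auto it = cache.find(key);
    if (it == cache.end())  // first frame that needs this variant...
        it = cache.emplace(key, compile_variant(key)).first;  // ...stalls here
    return it->second;
}

int main() {
    Cache cache;

    // The fix: enumerate known variants during loading and pre-warm the
    // cache, so the frame loop only ever does a hash lookup.
    for (const char* key : {"opaque", "skinned", "foliage"})
        cache.emplace(key, compile_variant(key));

    for (int frame = 0; frame < 3; ++frame) {
        auto t0 = std::chrono::steady_clock::now();
        get_lazy(cache, "opaque");   // cache hit: microseconds
        get_lazy(cache, "skinned");  // cache hit
        get_lazy(cache, "glass");    // missed in pre-warm: ~100 ms hitch on frame 0
        double ms = std::chrono::duration<double, std::milli>(
                        std::chrono::steady_clock::now() - t0).count();
        std::printf("frame %d: %.1f ms\n", frame, ms);
    }
}
```

Real engines do the equivalent with D3D12/Vulkan pipeline caches and pre-warming passes; the point is only that the expensive compile has to happen at load time, never mid-frame.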
But I'm confused about why you think fill rate isn't an issue. If you're upgrading from 1080p to 4K, your GPU needs at the very least 4x the pixel-pushing power, and even then that only maintains the same level of detail; you bought a 4K screen for more detail.
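For what it's worth, the arithmetic behind "at least 4x" is just pixel counts; here's a trivial C++ snippet spelling it out (the fps figures in the comment assume fill rate is the only bottleneck):

```cpp
// Pixel counts per frame: the whole "at least 4x" argument in a few lines.
#include <cstdio>

int main() {
    constexpr long px_1080p = 1920L * 1080;  // 2,073,600 pixels
    constexpr long px_4k    = 3840L * 2160;  // 8,294,400 pixels
    std::printf("4K / 1080p = %ldx the pixels\n", px_4k / px_1080p);  // 4x
    // At a fixed fill rate that is 4x the shading work per frame: a game
    // holding 60 fps at 1080p trends toward 15 fps at 4K, all else equal.
}
```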
Maybe that's even related to its good performance on consoles back then: Rockstar invested a lot of development time and sacrificed portability for performance. Basically the opposite of the trade-off modern games make with Unreal Engine 5.
The game felt like it had significant input lag, and at 720p with upscaling the text became very hard to read. The game's visual style of "glitch" effects also translates badly under upscaling, and I had a genuinely hard time understanding what I was looking at on screen.
Perhaps the situation is better on OLED.
It's the #10 most-played game on Steam Deck.
The game on new machines is quite impressive, unlike anything else being made.