No...I have no idea what you are talking about...the downsampling I am aware of is when you have a monitor of one resolution...say 1080p...and then you set the resolution in the game higher than that...say 1440p...so that when it is "downsampled" to 1080p, the card has to render all those extra pixels even though the display doesn't physically have them...which some say makes the game look more "smooth" (which I don't agree with)
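Just to spell out what I mean, here is a rough sketch of that idea in Python...the Pillow resize is only standing in for what the card/driver actually does, and the resolutions are just the ones from this thread:

from PIL import Image

# the card renders the full 2560x1440 image internally...
rendered = Image.new("RGB", (2560, 1440))

# ...then it gets scaled down to the monitor's native 1920x1080 before display
displayed = rendered.resize((1920, 1080), Image.LANCZOS)

print(rendered.size, "->", displayed.size)   # (2560, 1440) -> (1920, 1080)

Point being, the card does all the work of the higher resolution even though the monitor never shows it.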
From your post it looks like you are doing just the opposite...you are posting screenshots comparing FPS at 1440p...so I assume you have a 1440p monitor...but you seem to be telling me that you are downsampling to a lower resolution than 1440p...and that just doesn't make any sense at all...
And no, if you are talking about FPS, any game with dark shadows and simple repetitive shapes, specifically squares and rectangles, is NOT a good example...it's the absolute worst example...screenshots like that are not pushing that card at all...like I said, if I was being selective, I've got games I can get over 1,000 FPS in at 4K...but would that be an accurate representation of what the framerate would really be like in actual gameplay in any game?
I am not sure what the word is for trying to trick a 1440p monitor into displaying an image as if it were 1080p...but I know that word is not downsampling...downsampling is the exact opposite of that...getting a 1080p monitor to display an image that was rendered at 1440p...
Look at what you typed:
"downsampling from 1440p to 1080p is exactly the same as rendering native 1440p"
:dahellwhat::yeah ok:
Did you mean "downsampling from 1080p to 1440p is exactly the same as rendering native 1440p"? Because it absolutely IS NOT...not even close...1080p is about 2.07 million pixels and 1440p is about 3.69 million, so you would be missing over 1.6 million pixels...
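The math takes one line...these are just the standard 1920x1080 and 2560x1440 resolutions:

# quick pixel math for the two resolutions being argued about
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_1440p = 2560 * 1440   # 3,686,400 pixels

print(pixels_1440p - pixels_1080p)   # 1,612,800 pixels that 1080p simply does not have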