Good day.
I am curious about something I only just realised:
I don’t know if it happens only in Fedora KDE, but when I have a screen resolution selected (native or not; be it my 2K monitor at 1440p, my CRT at 1024x768, or my 4K TV at 1080p)
and the game’s resolution doesn’t match it, the window (borderless or fullscreen) doesn’t run at the game’s resolution but at the desktop’s instead, at least according to what MangoHud and the screenshot’s details tell me.
I noticed this with my CRT too (the scanlines weren’t separating to reveal the black lines of void at 480p when selected in a game), but I only realised it now while having the i5-3470T + GT 1030 PC run games on the 4K TV to test performance.
DOOM 2016 and Helldivers 2 have the in-game resolution set to 720p, but MangoHud reports 1080p, and so does the screenshot.
.
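For anyone wanting to verify what the compositor is actually outputting, a quick check (the output names below are just examples; your system will report its own) could be:

```shell
# Plasma (Wayland or X11): list outputs with their current and available modes
kscreen-doctor -o

# On an X11 session, xrandr also shows the active mode (marked with *)
xrandr --query | grep '\*'
```

If the active mode stays at the desktop resolution while the game claims 720p, the scaling is happening in the compositor/GPU rather than via a real modeset.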
Here’s a screenshot to show what I mean.
It’s 640x480, fullscreen, on a 1440p monitor (main computer: Ryzen 5 5600X and RTX 2070).
Notice how the pixels are softer.
Even with 640x480 as the selected desktop resolution, the on-screen presentation is somewhat softer than what Windows produces.
[ You obviously can’t see it here, BUT if you download the image (as long as you have a 1440p monitor; for 1080p or 2160p you’d need a 540p screenshot/game) and set your selected resolution to 640x480, you’d see that the pixels are softer. ]
.
Is there a way to “negate this soft look”?
I do not remember exactly how the Nvidia driver on Windows handled fullscreen, non-native resolutions, but I know this:
when I played borderless on Windows, the game still covered the entire screen, and when I went exclusive fullscreen, the game would actually change the resolution it was running at, with a very brief “black rectangle resize” (the funny animation) followed by the new resolution when launching the game.
.
Nvidia Control Panel allowed for TWO different kinds of scaling:
GPU-scaling and Monitor-scaling.
If the GPU scaled, it would maintain the 1440p (native, selected-resolution) output and smear any lower “fullscreen resolution” with a soft filter, even integer ones like 720p and 480p, just like Fedora KDE does here.
If scaling was instead left to the monitor, the display would truly switch to the selected resolution; integer resolutions would then present clean pixels without vaseline.
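On X11, the same two behaviours can be reproduced with xrandr (DP-1 is just a placeholder for whatever `xrandr --query` lists on your machine):

```shell
# "Monitor scaling": actually modeset the display to 640x480
xrandr --output DP-1 --mode 640x480

# "GPU scaling": keep the native mode and let the GPU stretch
# a 640x480 framebuffer over it (the soft, filtered result)
xrandr --output DP-1 --mode 2560x1440 --scale-from 640x480
```

On a Wayland session these commands don’t apply, which is part of why I’m asking how KDE handles it there.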
.
My inquiry is:
Is this normal? Is it a KDE-only thing? Does it impact performance in any way? (At least on my main PC it seems not to, but maybe it hits harder on weaker PCs, like the GT 1030 one.)
Is there a setting to make the monitor actually change resolution instead of scaling to the monitor’s native one (monitor scaling instead of GPU scaling, though I suspect it isn’t a simple toggle here)?
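In case it helps anyone reproduce or test this: on Plasma, the output mode itself can be changed manually with kscreen-doctor (HDMI-A-1 and the mode strings are just examples; run `kscreen-doctor -o` to see your own output names and available modes):

```shell
# Force a real modeset of the output to 640x480@60
kscreen-doctor output.HDMI-A-1.mode.640x480@60

# And back to native afterwards
kscreen-doctor output.HDMI-A-1.mode.2560x1440@60
```

That changes the desktop resolution, though; what I’m really after is games doing this themselves in fullscreen, like on Windows.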