This is a follow-up to my question from a few months ago. I keep getting corruption on my external monitor. I’m attaching a picture to illustrate; it comes in bands across the whole screen, not just in one window. I have to turn it off and on again, which messes up my window sizes and locations because they have to go to the laptop screen and back again. I’m trying to figure out where the problem might be. I’m not even sure if it’s software, integrated graphics card, discrete graphics card, or the monitor itself. I don’t have extra equipment to swap in and out to narrow it down, so I’m hoping someone here has some insight.
Originally, this corruption only happened when I was running a particular program that I wrote, and only when both the SDL and Eigen libraries were being called at once. But now it also happens when Firefox is running, so I know it’s not just a memory leak on my part. (Plus I never did understand how my userspace program could have corrupted what I’m pretty sure is protected video memory.)
The corruption does not happen when I’m using Windows, which makes it seem like it’s not the monitor itself. But on the other hand, it never happens on my laptop’s monitor, so I’m not sure.
I’m on Fedora 35 Workstation (GNOME), using Nouveau and Wayland. I have a Dell P2720D monitor connected by DisplayPort to a Dell 7779 laptop with Intel HD Graphics 620 and NVIDIA GeForce 940MX.
EDIT: It finally occurred to me to try an Xorg session. So far no corruption, so maybe it’s a Wayland thing. Any ideas would still be appreciated.
EDIT2: Nope, sorry, still present on Xorg after all.
Possibly a nouveau driver issue. Some have fixed that type of error by installing the nvidia drivers from rpmfusion: https://rpmfusion.org/Howto/NVIDIA
Or, from the GNOME Software screen, enable the third-party repos, then install the nvidia driver with sudo dnf install akmod-nvidia
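If you'd rather do the whole thing from a terminal, this is roughly what that howto boils down to (double-check against the page itself in case the URLs have changed):

# enable the RPM Fusion free and nonfree repos
sudo dnf install https://mirrors.rpmfusion.org/free/fedora/rpmfusion-free-release-$(rpm -E %fedora).noarch.rpm https://mirrors.rpmfusion.org/nonfree/fedora/rpmfusion-nonfree-release-$(rpm -E %fedora).noarch.rpm
# install the NVIDIA kernel module package (it builds against your kernel after install, so give it a few minutes before rebooting)
sudo dnf install akmod-nvidia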
No, using the Nvidia driver doesn’t fix the problem. I guess it must be hardware. But would it be most likely to be my monitor, my discrete GPU, or my integrated GPU? Is my discrete GPU even being used if I’m just using things like a browser, terminal emulator, or LibreOffice, and not running a CUDA app or playing a graphics-intensive game?
Hi, in GNOME Settings > Displays, is there any setting for your refresh rate? If refresh rates are available, you could try decreasing it to around 59-60 Hz for your external monitor.
The other thing: on the same settings screen, you could also try toggling Adjust for TV.
If all of the above still doesn't work and you have an HDMI cable available, maybe you could try switching from DisplayPort to HDMI.
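If GNOME doesn't show a refresh rate control, and you're in an Xorg session, you could also try forcing it from a terminal. This is only a sketch, and the output name DP-1 below is just a guess, so list the outputs first and use whatever name xrandr actually reports:

xrandr                                     # lists each output and its available modes
xrandr --output DP-1 --mode 2560x1440 --rate 60   # hypothetical output name for the P2720D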
Thanks! No, there are no refresh rate options or Adjust for TV. I’ll try using my HDMI cable for diagnostic purposes, but it’s not a good long-term solution because for some reason I can’t get full resolution over HDMI. If using HDMI stops the corruption, would that suggest a problem with the monitor or with the graphics card?
[EDIT: and on HDMI the screen goes black for a second every few minutes… I’m currently being reminded that that’s the other reason I use DP now.]
EDIT2: Nope, corruption still happens with an HDMI connection. Does this suggest it's my monitor, or does it suggest it's my graphics card? I've read that generally Fedora will only use the dedicated GPU for stuff that clearly needs it; if that's true, this is probably either a monitor or an integrated GPU issue? I can't figure out why it happens on Linux and not Windows, or why it happens more frequently depending on which applications are running.
It happens reliably when I run a program I wrote (a normal, user-space program) that calls both the Eigen and SDL2 libraries. When it only calls one or the other of those libraries, it happens far less frequently. Recently it’s started happening even when I’m not running this program. As best I can tell so far, Firefox seems to be the common denominator in these cases. It may even happen more on certain websites. Playing chess on lichess.org seems to set it off especially, but it’s not a dramatic enough difference for me to be sure.
The monitor’s built-in menu display is not corrupted. And the monitor’s refresh function does not fix it, though turning the monitor off and on again does. I’m guessing this points toward something on the computer side. But so far I’ve never seen it under Windows, though I’m not on Windows often. So I’d be tempted to blame some part of the Fedora ecosystem, except that it doesn’t depend on Wayland vs Xorg or on mesa/nouveau vs nvidia.
You could try searching the internet for “multi monitor nvidia screen tearing linux” and you'll find some cases that are most likely similar to yours.
My first thought is always refresh rate, vsync, or something else related to that. But since you mention Firefox, maybe it's about hardware acceleration.
Multi-monitor setups using different monitor models may have slightly different refresh rates. If vsync is enabled by the driver, it will sync to only one of these refresh rates, which can cause the appearance of screen tearing on the incorrectly synced monitors.
You need to consider how a laptop with a discrete GPU works in Linux. By default the built-in (integrated) GPU handles the laptop screen and the dGPU handles the external monitor. This means there may be a mismatch between the two signal paths, which could cause the distortion.
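If you want to see how X has split things up on your machine (this only works in an Xorg session, not Wayland), something like this should list the GPUs X knows about and their roles:

xrandr --listproviders   # typically shows the Intel iGPU and the NVIDIA/nouveau dGPU as separate providers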
Installing the nvidia driver is step 1. The second step is configuring the nvidia driver to handle both screens properly. That involves copying /usr/share/X11/xorg.conf.d/nvidia.conf to /etc/X11/xorg.conf.d/nvidia.conf then rebooting. This config file allows the nvidia driver to handle display on both screens.
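In other words, roughly:

sudo mkdir -p /etc/X11/xorg.conf.d           # in case the directory doesn't exist yet
sudo cp /usr/share/X11/xorg.conf.d/nvidia.conf /etc/X11/xorg.conf.d/nvidia.conf
sudo reboot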
There is a 3rd step you may use if you choose, and that involves setting the nvidia GPU as primary so the IGP is ignored and nvidia handles all display on all screens. Doing that is a simple edit of the above file /etc/X11/xorg.conf.d/nvidia.conf and adding Option "Primary" "yes" to both stanzas in that file.
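Just to illustrate where the line goes (the surrounding contents are whatever is already in your copy of the file, I'm not reproducing them here), each stanza ends up looking something like:

Section "OutputClass"          # or whatever section type the stanza in your file already is
    ...                        # existing lines, left untouched
    Option "Primary" "yes"     # the added line, once in each stanza
EndSection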
Setting the nvidia dGPU as primary does use a small amount of extra power to drive the GPU, which may shorten battery run time to some extent, but to me the benefits outweigh the lost run time. I only see about a half hour of lost run time this way when starting with a full charge.
Thank you. I copied the file as you suggested. This seemed to make the external monitor not work at all under Wayland, so I switched to Xorg. Corruption was still present, so I added the “primary” “yes” option as you suggested. Corruption is still present. Based on what you’ve said about the dGPU always handling the external monitor, I’m beginning to suspect my dGPU is failing. Does that seem like the right guess at this point? (I did as Syaifur hinted and turned off graphics acceleration in Firefox, and I’m still waiting to see if that keeps Firefox from corrupting the screen, but my own program definitely still corrupts it.)
You should have the ‘nvidia-settings’ package installed (and if you don’t, you should install it), which gives you an ‘NVIDIA X Server Settings’ icon in the activities menu.
With that you may be able to see the monitor settings.
If you are using GNOME, the Settings control panel has configs for each monitor [ settings → display → single display ] where you can select orientation, refresh rate, resolution, and scale for each. You can also choose to use a single display, mirror the displays, or join the displays and orient them side by side or one above the other. (Joined displays also allows setting the configs for each, although mirror does not.)
On my system the built-in screen uses 144 Hz refresh and the external monitor uses 60 Hz at the same time.
There is one additional item that may be of interest. Some systems have shown issues with the 495 driver, and users have had to fall back to the 470 driver (and X only instead of Wayland).
If you want to try that, you can do this:

sudo dnf swap akmod-nvidia-495* akmod-nvidia-470xx --allowerasing

which will remove all the 495 driver packages and replace them with the 470xx packages for Fedora 35. You will not be able to use Wayland after that switch, but it may resolve the video corruption issues.
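After the reboot you can confirm which driver version actually got built and loaded with something like:

modinfo -F version nvidia   # prints the version of the nvidia kernel module currently installed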
I have nvidia-settings, but it doesn’t show display settings. Just GPU-0, Application Profiles, and Configuration. GNOME does show the external monitor, but it gives no options for refresh rate when connected over DisplayPort. As for downgrading from 495 to 470, I may try that, but doesn’t it seem unlikely given that both the nvidia and nouveau drivers have the same problem?
EDIT: nevermind; I’m going to try it anyway. NVIDIA’s site still lists 470 as the production driver for GeForce 940MX.
No, downgrading to 470 doesn’t change the issue. That’s three different drivers, both nvidia and nouveau and both Wayland and Xorg all showing similar problems. I even tried a live USB and got corruption there. I’d think surely it’s a failing GPU at this point, right? But I just can’t reproduce it on Windows, even running Nvidia demos. Although when I tried running two demos at once, the screen blacked out for a moment and one demo crashed, but the screen was fine. Maybe the problem does happen on Windows, but the drivers there are able to detect it and recover?
Is there a way to turn off hardware acceleration globally? I tried the method here (Option “GLX” “Disable”), but I just got a black screen on boot; I couldn’t even use the virtual consoles.
How about finding a way to make your program not use hardware acceleration (make it run on the CPU instead of being GPU-intensive)? But I don’t know how; it would probably depend on the libraries your code uses(?).
But if the above isn’t possible (is your program something like crypto mining?), disabling hardware acceleration globally might break your program, or it could go the other way around: your program might keep forcing hardware acceleration and mess up your system.
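If you just want a quick test without touching the code, there are environment variables that push rendering onto the CPU. This is only a sketch, ./my_sim is a made-up name for your program, and it assumes the program renders through Mesa and/or SDL's renderer API:

# force Mesa to use its software (llvmpipe) renderer for anything going through OpenGL
LIBGL_ALWAYS_SOFTWARE=1 ./my_sim
# if the program uses SDL_Renderer, ask SDL for its software renderer instead of an accelerated one
SDL_RENDER_DRIVER=software ./my_sim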
That’s not really the issue anymore… the corruption does happen really fast with my own program (it’s a robotics simulation), but lately it’s been happening every few minutes with other programs. I’ve got acceleration disabled in Firefox now, but I’m still having trouble. I think GNOME itself may use GPU acceleration? I just can’t seem to get around it. It’s frustrating because I know my integrated Intel card could power the external monitor just fine, but apparently the ports are wired to the dGPU (though I’m not sure how to confirm that).
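(Maybe something like this would confirm the wiring, if I'm reading the sysfs layout right; the card numbering is just an example and can differ between boots:

ls /sys/class/drm/          # connectors show up as card0-eDP-1, card1-DP-1, and so on
readlink /sys/class/drm/card0/device /sys/class/drm/card1/device   # PCI address of each card
lspci | grep -E 'VGA|3D'    # match those addresses to the Intel and NVIDIA GPUs

I haven't dug into it yet though.)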
I remember people talking about a “pipeline” option or something else related to screen tearing. You could open this YouTube video at around minute 17:50. There’s also a way to set it with the command line, I believe, but the way shown in that video is a lot easier, I think. Not sure if this could help.
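I believe the command-line version is the NVIDIA composition pipeline option, set through nvidia-settings in an Xorg session with the proprietary driver. Something like the following, which is only a sketch and may need the MetaMode string adjusted for your setup:

nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"

It only lasts until X restarts unless you save it into the X configuration.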