Low performance Nvidia GPU vs iGPU on laptop?

I’ve been running this Dell XPS laptop for around 3 years. It has an Nvidia 3050 Ti dGPU and an Intel i7-12700H CPU with Xe graphics. I originally had problems getting it to run with the RPM Fusion drivers (back in Fedora 36, if memory serves), but I did manage to get the drivers installed and it has run fine on every Fedora release since.

However, I’ve never managed to get the 3050 Ti to run - the iGPU has always been active in graphics-heavy workloads. I gave up trying. But I just found a helpful Reddit thread, and sure enough, if I stick DRI_PRIME=1 (no idea what it actually means or does) in front of the command name at a terminal prompt then, lo and behold, the Nvidia GPU is used. I’m very happy to find that it does actually work under Fedora (I can boot into Windows 11, where the 3050 Ti automatically gets chosen for GPU-intensive workloads, but I prefer not to, thanks).
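(For anyone else who finds this: my understanding is that DRI_PRIME is a Mesa environment variable that asks the open-source graphics stack to offload rendering to the other GPU - “PRIME render offload”. A quick way to confirm which GPU a command would actually use, assuming glxinfo from the glx-utils package is installed:)

```
# Default renderer - usually the iGPU on an Optimus laptop
glxinfo -B | grep "OpenGL renderer"

# Ask Mesa to offload rendering to the secondary GPU
DRI_PRIME=1 glxinfo -B | grep "OpenGL renderer"
```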

However (again): running glmark2 gives a score of 757 using the 3050 Ti, whereas the iGPU gets a much more robust 4949 (my latest AMD 9950X iGPU gets 18000+, but that’s another story).
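(For reference, these were just back-to-back runs of the same benchmark, with and without the offload variable - the exact numbers will obviously vary with thermals and driver versions.)

```
glmark2                # iGPU - scored 4949 here
DRI_PRIME=1 glmark2    # 3050 Ti - scored 757 here
```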

This does seem to be a “known” issue, but I haven’t yet found out why. Does anyone on this forum know, or have anecdotal info that might help narrow it down?

I’m just happy that, after 3 years of my 3050 Ti sitting in my laptop sucking up electricity and not being productive, I’ve finally managed to get it to do something.

Automatically switching between the iGPU and dGPU on laptops is the responsibility of Nvidia’s Optimus circuitry (and code), which doesn’t appear to work well under Linux. Is that correct / a known issue?
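Related to that: as far as I can tell, the closest Linux equivalent to Optimus’s automatic switching is PRIME render offload plus switcheroo-control, which Fedora ships. A hedged sketch, assuming the switcheroo-control package is installed and its service is running:

```
# List the GPUs the session knows about and which one is the default
switcherooctl list

# Launch a program on the discrete GPU (it sets the offload env vars for you)
switcherooctl launch glmark2
```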

(As an aside, I’m trying once again to get the Nvidia drivers working with a 3060 on my latest (9950X) desktop. I’m hoping to upgrade to a 5000-series card, but the prices are stupid, if you can even find a card, and show no signs of ever not being stupid again. And Jensen has ensured the 4000-series cards can no longer be purchased, except from secondary-market scalpers.)

Thanks. Alas, KDE.

While the ability to easily (or even automatically) switch between iGPU and dGPU would be nice, now that I know about DRI_PRIME=1 I can do it from the command line.

I’d really like to know why there’s such a significant difference in glmark2 scores between the two GPUs. I really expected the 3050 Ti to trounce the iGPU, but it’s turning out the other way around. Is it a bug? A feature? Some driver issue? Other software? Hardware? Simply the expected outcome? Enquiring minds…
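One thing that might help narrow it down (a guess on my part, not a diagnosis): check which kernel driver the 3050 Ti is actually bound to. If DRI_PRIME=1 routes through nouveau rather than the proprietary driver, a low score wouldn’t be surprising.

```
# Show the VGA/3D controllers and the kernel driver each one is using
lspci -nnk | grep -EA3 'VGA|3D controller'

# Is the proprietary module loaded, or nouveau?
lsmod | grep -E '^(nvidia|nouveau)'
```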

envycontrol looks interesting, and also a little bit scary - it’s going to rebuild the initramfs, and it uses rpm-ostree, which I’m not that familiar with.
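My understanding (from the project README, so treat the exact flags as an assumption) is that it’s essentially just a mode switcher:

```
# Show the current graphics mode
envycontrol --query

# Switch to hybrid mode (iGPU by default, dGPU available for offload), then reboot
sudo envycontrol -s hybrid
```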

Ah, interesting, thanks. I tried DRI_PRIME=2 (and 3), and the score for the 3050 Ti improved significantly, from 757 to 2993. It’s still destroyed by the 12th-gen i7 iGPU, however, at 5503 (latest test carried out at the same time as the others here). DRI_PRIME=3 scored 1365. Interestingly, glmark2’s output would seem to imply that any value greater than 1 for DRI_PRIME is ignored and 1 used anyway - in which case, what explains the wide variation between the different values? (I did try DRI_PRIME=1 again and there was a slight improvement, but that’s down to the laptop running cooler than in the previous test.)
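One possible explanation for the inconsistency (an assumption on my part): DRI_PRIME is Mesa’s offload mechanism, and newer Mesa versions also accept an explicit device tag rather than a bare index, while the proprietary Nvidia driver documents its own offload variables. Comparing the reported renderer for each might show which path DRI_PRIME=1 is really taking:

```
# Mesa-style offload; newer Mesa also accepts an explicit PCI tag,
# e.g. DRI_PRIME=pci-0000_01_00_0 (address is illustrative - check lspci for yours)
DRI_PRIME=1 glxinfo -B | grep "OpenGL renderer"

# Proprietary-driver render offload, as documented by Nvidia
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia \
    glxinfo -B | grep "OpenGL renderer"
```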

I’m still surprised, shocked really, that the iGPU is scoring so much better than the 3050 Ti on this one test at least.

I should boot the machine into Windows and run FurMark to see if there’s a platform dependency (except I feel that a developer gets an ulcer every time I boot Windows). If there isn’t, it makes me wonder why Dell included the Nvidia chip, other than that they can charge more, obviously (and CUDA, maybe). Subjectively, this processor - the i7-12700H - and its iGPU have been extremely impressive (4K screen!), except that it overheats like crazy and throttles frequently (partly my fault - I keep tons of browser instances on the go at all times). I don’t play games, not on a laptop; can the 3050 Ti really be that much better than the iGPU for running games? That would seem to be the prime motivation for sticking an Nvidia chip in the laptop. I have a subsequent model - the 9530, with a 13th-gen i7-13620H (fewer cores!), no Nvidia chip, and no 4K screen - and it runs a lot cooler.

For completeness, I just ran glmark2 on the 13th-gen version of the laptop: the iGPU scores 4464 and the onboard Arc A370M scores 2321. The machine’s fans were not audibly running before or during the tests.

It seems that on these Dell XPS 95xx laptops the Intel iGPUs outperform the onboard dGPUs (on this one test, at least). My subjective experience that the 12th-gen Intel chip outperforms its 13th-gen equivalent (again, the latter has fewer cores, something that surprised me when I realized I had been shortchanged!) is also borne out by the data. For my money (of which I spent a fair amount when they came out), the Alder Lake generation is a very good Intel product - it’s even more parsimonious with power (on the desktop i9) than online reviews led me to believe.

Thanks again for putting me onto other DRI_PRIME values! I need to research it more.

Ditto. I should have noted that.