FUCK NVIDIA

>doing cuda/ml on linux
>purposely buy a fucking non-Founders Edition 1080, namely an MSI GAMING X, which has one of the biggest saggy fucking coolers
>install non-autistic DE: Cinnamon/Unity/KDE
>every fucking compatible nvidia driver keeps the gpu IDLING (i.e. scrolling down on Sup Forums or pdfs) at precisely 60 DEGREES CELSIUS, so the fans start revving and after 10 seconds turn themselves off

How the fuck is this allowed? Ambient temperature is 21C, under W10 I get 40C idle, and the only DE that keeps it around 50C is LXDE or other autistic sysadmin tiling WMs

THIS ISN'T A FRIENDLY LINUX THREAD POST GET THE FUCK OUT OF HERE, THIS IS A REAL FUCKING PROBLEM

how is that nvidia's fault?
t. idle at 25c 1080ti

devtalk.nvidia.com/default/topic/1008405/will-de-experience-on-nvidia-blob-be-ever-as-smooth-as-on-intel-/

Why do you buy a gaming gpu if you want to run linux?
Buy a cheap amd card and be happy

Is it in adaptive power mode?
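You can check/force it with nvidia-settings (assuming the proprietary blob, and [gpu:0] being the first card):

nvidia-settings -q "[gpu:0]/GPUPowerMizerMode"         # 0 = adaptive, 1 = prefer maximum performance
nvidia-settings -a "[gpu:0]/GPUPowerMizerMode=0"       # force adaptive if it's stuck on max perf
nvidia-smi --query-gpu=pstate,clocks.sm --format=csv   # pstate should drop to P8 or so at idle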

Some people use the GPU to run calculations on it. I'm at a university chair where half the people are involved with neural networks, running them on nVidia GPUs under Linux.

So you bought an expensive gpu to do what exactly on linux?

idk but I'm comfortable at 55c on i3 on my msi 970

right, but if you do that you don't need the gpu to run display drivers

Yes, I've tried that, it's not making any difference. Also worth noting that the
nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
setting doesn't really work on heavy DEs: you always get an initial tear when scrolling and it only locks afterwards.
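For anyone who still wants it, the persistent version goes in xorg.conf via the metamodes option instead of assigning it at runtime (rough sketch only, the identifiers depend on your config):

Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
    # same metamode string as above, applied at X startup
    Option     "metamodes" "nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }"
EndSection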

A few more things you can try are underclocking a bit or turning off your DE's visual effects. I suspect the effects etc. make your gpu clock up to its highest levels, causing the fan revs.
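For the underclock, the blob exposes clock offsets once Coolbits is enabled, roughly like this ([3] is usually the top perf level on Pascal, and -200 is just an example value, adjust for your card):

# xorg.conf Device section needs: Option "Coolbits" "12"
nvidia-settings -a "[gpu:0]/GPUGraphicsClockOffset[3]=-200"   # pull the top perf level down by 200 MHz
nvidia-smi dmon -s pc                                         # watch power/clocks while scrolling to see if it still boosts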

IIRC most of the available libraries have the display drivers as dependencies, along with CUDA and CUDNN. Unless they're developing a machine learning framework, it's a massive pain to work directly with the GPU for general purpose ML.
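Which at least makes it easy to sanity-check that the stack actually sees the card, e.g. (assuming pytorch and a 2.x tensorflow are installed):

python -c "import torch; print(torch.cuda.is_available())"                            # True if the driver + CUDA stack is usable
python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"    # should list the 1080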

Because people game on Linux.

kek

most likely you have a background miner using part of your GPU

Happened to me on Debian, Ubuntu and Mint (newest versions). Arch also had high GPU use, but it seems that was not a miner, just an unrelated issue

openSUSE and CentOS don't seem to have this problem
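Easy to check either way, the blob's nvidia-smi lists every process touching the card:

nvidia-smi                                                                        # process table at the bottom shows anything using the GPU
nvidia-smi --query-compute-apps=pid,process_name,used_gpu_memory --format=csv    # CUDA/compute processes only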

The real fucking problem is that you're using linux

Why do they advertise CUDA if it's fucking garbage?

Bullshit

I use Ubuntu and have no fan problems with CUDA or any library depending on it (tested on tensorflow and pytorch).

Maybe you're just one of those unlucky people for whom Linux fails for random reasons ¯\_(ツ)_/¯

LOW QUALITY BAIT

post your temps and case/nr of fans
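e.g. something like this (nvidia-smi from the blob for the card, sensors from lm-sensors for the case/CPU side):

nvidia-smi --query-gpu=temperature.gpu,fan.speed --format=csv   # GPU temp + fan %
sensors                                                         # motherboard fan RPMs and CPU temps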

either post proofs or fuck off, FUDer

I don't even know what the fuck is going on in this thread.
>750Ti
>KDE
GPU idles at like 30°C like it did in W7 and everything is buttery smooth. Not sure what the fuss is all about here

>he thinks graphics cards are for gaming
Back to right now, you inbred mouth-breathing retarded soy boy.

>biggest market for GPUs
>hurr durr it isn't

>biggest market for GPUs
>not literally 100% every GPU being purposed for coin mining at some point in time