>doing CUDA/ML on Linux
>purposely buy a fucking non-founders-edition 1080, namely an MSI GAMING X, which has one of the biggest saggy fucking coolers
>install a non-autistic DE: Cinnamon/Unity/KDE
>every fucking compatible nvidia driver keeps the GPU IDLING (i.e. scrolling on Sup Forums or PDFs) at precisely 60 DEGREES CELSIUS, so the fans start revving and then shut themselves off after 10 seconds
How the fuck is this allowed? Ambient temperature is 21°C, under W10 I get 40°C idle, and the only DEs that keep it around 50°C are LXDE and other autistic sysadmin tiling WMs.
THIS ISN'T A FRIENDLY LINUX THREAD POST GET THE FUCK OUT OF HERE, THIS IS A REAL FUCKING PROBLEM
Why do you buy a gaming GPU if you want to run Linux? Buy a cheap AMD card and be happy.
Landon Sanders
Is it in adaptive power mode?
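For anyone wanting to actually check: the PowerMizer mode can be queried and forced from the command line. A sketch, assuming the proprietary driver's nvidia-settings tool, a running X session, and that the card is gpu:0 (needs NVIDIA hardware, so no automated test here):

```shell
# Query the current PowerMizer mode (0 = adaptive, 1 = prefer maximum performance)
nvidia-settings -q '[gpu:0]/GPUPowerMizerMode'

# Force adaptive mode so clocks (and therefore temps) drop at idle
nvidia-settings -a '[gpu:0]/GPUPowerMizerMode=0'
```

If the mode is already 0 and the clocks still sit high at idle, something else is keeping the GPU busy.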
Charles Stewart
Some people use the GPU to run computations on it. I'm at a university chair where half the people work on neural networks, running them on Nvidia GPUs under Linux.
Levi Wright
So you bought an expensive GPU to do what, exactly, on Linux?
Ryder Bailey
idk, but I'm comfortable at 55°C with i3 on my MSI 970
Thomas Baker
Right, but if you do that you don't need the GPU to run display drivers.
Ryan Smith
Yes, I've tried that, and it's not making any difference. Also worth noting that the nvidia-settings --assign CurrentMetaMode="nvidia-auto-select +0+0 { ForceFullCompositionPipeline = On }" setting doesn't really work on heavy DEs: you always get an initial tear when scrolling, and it only locks afterwards.
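The one-shot nvidia-settings call only takes effect after X is already up, which matches the "initial tear" behavior. The same metamode can instead be set persistently in xorg.conf so the composition pipeline is active from the first frame — a sketch assuming a single-GPU layout, with option names per the nvidia driver README:

```
Section "Screen"
    Identifier "Screen0"
    Option     "metamodes" "nvidia-auto-select +0+0 {ForceFullCompositionPipeline=On}"
EndSection
```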
Hunter James
A few more things you can try: underclock a bit, or turn off your DE's visual effects. I suspect the effects make your GPU clock up to its highest levels, causing the fan revs.
Leo Ramirez
IIRC most of the available libraries have the display drivers as dependencies, along with CUDA and cuDNN. Unless they're developing a machine learning framework, it's a massive pain to work with the GPU directly for general-purpose ML.
Anthony Jackson
Because people game on Linux.
Connor Morales
kek
Caleb Rogers
Most likely you have a background miner using part of your GPU.
Happened to me on Debian, Ubuntu and Mint, newest versions. Arch also had high GPU use, but there it seems it was not a miner but an unrelated issue.
openSUSE and CentOS don't seem to have this problem.
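Easy to verify rather than guess: the plain nvidia-smi process table shows everything using the card, and the utilization query shows whether anything is busy while you're "idle". A sketch assuming nvidia-smi is on the PATH (needs NVIDIA hardware, so no automated test here):

```shell
# The process table at the bottom lists every process using the GPU;
# an entry you don't recognize would be the "background miner"
nvidia-smi

# Overall utilization while idle - anything well above 0% means something is running
nvidia-smi --query-gpu=utilization.gpu,temperature.gpu --format=csv,noheader
```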
Leo Long
The real fucking problem is that you're using linux
Kayden Scott
Why do they advertise CUDA if it's fucking garbage?
Joseph Anderson
Bullshit
John Parker
I use Ubuntu and have no fan problems with CUDA or any library depending on it (tested on tensorflow and pytorch).
Maybe you're just one of those unlucky people that Linux fails for random reasons ¯\_(ツ)_/¯
Tyler Perez
LOW QUALITY BAIT
Nicholas Thomas
post your temps and case/nr of fans
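For posting actual numbers instead of anecdotes, nvidia-smi can log the relevant sensors in one line. A sketch assuming the proprietary driver's nvidia-smi is installed (needs NVIDIA hardware, so no automated test here):

```shell
# Log temperature, fan speed, power draw and SM clock once per second (Ctrl-C to stop)
nvidia-smi --query-gpu=temperature.gpu,fan.speed,power.draw,clocks.sm \
           --format=csv -l 1
```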
Bentley Peterson
either post proof or fuck off, FUDder
Lucas Clark
I don't even know what the fuck is going on in this thread.
>750Ti
>KDE
GPU idles at like 30°C like it did in W7 and everything is buttery smooth. Not sure what the fuss is all about here.
Kayden Campbell
>he thinks graphics cards are for gaming
Back to right now, you inbred mouth-breathing retarded soy boy.
Gabriel Reyes
>biggest market for GPUs
>hurr durr it isn't
David Davis
>biggest market for GPUs
>not literally every GPU being repurposed for coin mining at some point in time