There are modern graphics cards that consume 80 times as much power as your average smartphone

>there are modern graphics cards that consume 80 times as much power as your average smartphone

>while having 160 times the raw compute power
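Taking those two ratios at face value, the perf-per-watt comparison is a one-liner; the 240 W board power and the resulting ~3 W phone figure below are just illustrative assumptions, not measurements:

# back-of-envelope perf/watt from the thread's ratios (all inputs assumed)
gpu_power_w = 240.0                      # assumed desktop GPU board power
phone_power_w = gpu_power_w / 80         # "80 times as much power" -> ~3 W
compute_ratio = 160.0                    # "160 times the raw compute power"
perf_per_watt_vs_phone = compute_ratio / 80
print(f"phone side: ~{phone_power_w:.0f} W, GPU perf/watt advantage: {perf_per_watt_vs_phone:.1f}x")

So even by the OP's own numbers the big card is about twice as efficient per watt, it just burns 80 times more of them.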

>they make more powerful hardware instead of optimizing software

>a big v8 consumes more gas than a little 4 cylinder
MIND=BLOWN

>There are electronic cigarettes that consume double the power of a modern graphics card

If a video card draws 200 watts but only has a 100 W TDP heat rating, where does all that extra power go? It's not doing mechanical work or putting out much light, and the heat output is supposedly already covered by the TDP figure.
If it's not heat, light or motion, where's that power going?
Would a completely efficient graphics card produce zero heat, or are there inherent inefficiencies in the transistor switching and silicon currents?

TDP is basically a marketing number; in the end it all goes to heat.

There will never be a perfectly efficient graphics card, only more performance per watt of power you put into it.

>Every single watt your hardware consumes goes directly towards heating your room

It's all heat and light in the end on a computer.

Except the fans

It's funny, because my Commodore doesn't really use that much more fuel than my gf's Focus. You have to be a bit more gentle on the accelerator, but it's still far faster.

modern graphics cards are retarded

they are massively overpowered

for 90 bucks you can get

(3/4)*(1 million)*(1 billion)*(32 bits)

1s and 0s per second from some random Nvidia GPU I just found on Wikipedia
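Plugging that expression in literally (the FLOPS figure is whatever the poster grabbed off Wikipedia, so treat it as an assumption), it comes out to a few petabytes of bits per second:

# literal evaluation of (3/4) * (1 million) * (1 billion) * (32 bits)
ops_per_second = 0.75 * 1e6 * 1e9        # the poster's Wikipedia number
bits_per_second = ops_per_second * 32    # 32-bit values per operation
print(f"{bits_per_second:.1e} bits/s (~{bits_per_second / 8 / 1e15:.0f} PB/s of 1s and 0s)")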

That's a good thing

otherwise there would be no use for a desktop you nigger and everyone would be smartphone users

It's consumed by the bit bucket to reduce entropy.

>dumbphone cuck too scared of being obsolete and irrelevant
lmao

youtube.com/watch?v=JmxUsGiGp3w

my netbook gets hotter just running Firefox than my desktop does playing video games

Yeah, but your Commodore is obviously a V6.
My '98 VT SS Series II with the 5.7 Litre Gen III V8 uses a fair amount of fuel, but it's worth it.

Nah mate, SS VE. I can get her below 9 L/100 km if I'm going easy, but I usually sit at about 11. Pretty happy with that, considering the power I get from it.

I need to buy an FX-8350 and an R9 290 then.

They're a hardware company, what the fuck do you expect?

I don't suppose the Radeon RX 480 would come with a letter that says: "This is exactly the same as the RX 460. Developers, optimize your fucking software."

>fell for the video game meme
>pc uses so much power even when idle that the electricity bill increased substantially
idk, how much can I sell a 7970 + 8350 with 16 GB DDR3 for in Australia?

>netbook
there's your problem

>7970
>8350
>literally the thirstiest garbage ayymd has ever produced
jesus christ user, no wonder your bill increased

it was free

7970 was fine, power use was within margin of error of the 680, which was its launch competitor.

Of course, now five years after launch the 680 performs like a 7870, which uses two-thirds the power, so w/e

>TFW you will never have an RWD sedan with a proper engine and a proper gearbox that handles like an E39 M5.

>Almost 5 days with an R9 290 at idle

That kinetic energy eventually ends up as heat too, due to friction. So your computer is indeed a very fancy heating device.
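For scale, here's what the "fancy heater" works out to with some assumed numbers (a 200 W card in a sealed 30 m³ room, ignoring losses through the walls, so it's an upper bound):

# idealized room-heating rate for a 200 W card (all inputs assumed)
power_w = 200.0                            # assumed steady draw, all ending up as heat
room_volume_m3 = 30.0                      # assumed small bedroom
air_mass_kg = room_volume_m3 * 1.2         # air density ~1.2 kg/m^3
heat_capacity_j_per_kg_k = 1005.0          # specific heat of air
rise_per_hour = power_w * 3600 / (air_mass_kg * heat_capacity_j_per_kg_k)
print(f"~{rise_per_hour:.0f} K per hour before losses")   # roughly 20 K/h

Real rooms leak heat long before that, but the point stands: every joule the card draws ends up warming something.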

The extra 100 W goes to the fan and the RGB LEDs

If you need to heat your room, buy a Dell OptiPlex 755 SFF and an Intel Q6600. Playing games, or doing anything at all really, burns up the graphics card and turns the CPU into a stove you can cook your dinner on.

What graphics card has a maximum power rating above its TDP? As far as I'm aware, the TDP figure accounts for losses from the power regulation circuitry and is always higher than the "observed power" measured in RMS.

Nearly all the power used in electronics essentially goes into opening a gate, which lets current through to open another gate, and so on.
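That gate charging and discharging is the textbook dynamic-power term, roughly P ≈ α · C · V² · f. A toy estimate (every input below is a made-up assumption, picked only to show the shape of the formula):

# toy CMOS dynamic power: P = alpha * C * V^2 * f (all inputs assumed)
alpha = 0.1            # fraction of the chip's gates switching each cycle
c_switched = 1.0e-7    # total switched capacitance in farads
v = 1.0                # supply voltage in volts
f = 1.5e9              # clock frequency in hertz
print(f"dynamic power ~ {alpha * c_switched * v**2 * f:.0f} W")   # ~15 W for this toy chip

The V² term is why undervolting cuts power so dramatically, and why a phone SoC tuned for the efficiency sweet spot and a desktop GPU pushed for maximum clocks land so far apart per watt.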

>implying there is a single chipset for all phones

But is your time?

>What graphics card has a maximum power rating above its tdp?
TDP describes how much heat the cooler needs to be able to dissipate (it isn't a direct power-draw limit), so most GPUs and CPUs can and will briefly pull more power than the TDP, as long as the spikes are short enough not to overheat anything and the average power stays below TDP.
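A minimal sketch of that idea, assuming nothing about any vendor's actual boost algorithm: let requests spike above TDP as long as the rolling-average draw stays under it, otherwise clamp.

from collections import deque

# toy power limiter: spikes above TDP are fine, a sustained average above it is not
TDP_W = 100.0     # heat budget the cooler is designed for
WINDOW = 10       # number of recent samples in the rolling average

def allow_boost(history: deque, requested_w: float) -> bool:
    # would drawing requested_w keep the windowed average at or under TDP?
    samples = list(history)[-(WINDOW - 1):] + [requested_w]
    return sum(samples) / len(samples) <= TDP_W

history = deque(maxlen=WINDOW)
for t, request in enumerate([60, 60, 60, 150, 150, 150, 150, 150]):
    draw = request if allow_boost(history, request) else TDP_W
    history.append(draw)
    print(f"t={t}: requested {request:.0f} W, drew {draw:.0f} W")

The first 150 W spikes go through because the earlier 60 W samples left headroom; once the average catches up, the draw gets clamped back to TDP.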

There are also modern graphics cards that consume as much power as incandescent lamps!

>there are 100 W light bulbs that consume over twice the power of your average 40 W light bulb

You'd better get a pair of good old GTX 480s. Best grilling experience I've ever had.