Can someone please explain to me HOW THE FUCK these tiny things need to draw hundreds of watts?
Especially considering that the amount of "useful" work that is output (digital signals) isn't dependent on the amount of power put in.
I can light an entire house with energy-saving bulbs for less power than it takes to run a single chip at max load.

CPU chips can do a lot more math than your energy saving lightbulbs can

Modern CPUs have a higher power density than the sun.

ece.cmu.edu/~ece322/LECTURES/Lecture13/Lecture13.03.pdf

>Can someone please explain to me HOW THE FUCK these tiny things need to draw hundreds of watts?
Because transistors have resistance.

can you play crysis on your lightbulbs?

It is possibru tu roon kompooters withoot poweroo

Read about CMOS. Basically, every clock cycle (switching the transistors) requires power. And at 1 GHz, that's a BILLION transitions every second. It adds up.
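Here's the back-of-envelope version of "it adds up". The standard CMOS dynamic power formula is P = α·C·V²·f; every number below is an illustrative assumption, not a measured value for any real chip.

```python
# CMOS dynamic power: P = alpha * N * C * V^2 * f
# All constants are rough, assumed figures for a modern CPU.
alpha = 0.02          # activity factor: fraction of transistors switching per cycle
n_transistors = 5e9   # transistor count
c_per_tr = 5e-16      # effective switched capacitance per transistor, ~0.5 fF
v = 1.1               # supply voltage in volts
f = 3e9               # clock frequency, 3 GHz

power = alpha * n_transistors * c_per_tr * v**2 * f
print(f"{power:.1f} W")  # 181.5 W
```

Each individual switch costs almost nothing; multiplying by billions of transistors and billions of cycles per second is what gets you into the hundreds of watts.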

No, because transistors need a certain amount of current to work. For power efficiency, more resistance is better (on the base, at least). Also, most of the power consumption happens while switching on or off, which is why a processor consumes more at higher clock speeds.

>how does a transistor work

>Can someone please explain to me HOW THE FUCK these tiny things need to draw hundreds of watts?
They literally do not.

The GPUs themselves almost never draw over 150w. Even on super hungry boards like the 390X. It's the aux, memory, and waste energy that brings it up in the 300+ range.

>Especially considering that the amount of "useful" work that is output (digital signals) isn't dependent on the amount of power put in.
You can't switch transistors with no power. The more transistors, the more power needed; and the more you utilize those transistors, the more power it draws.

Are you really this dumb?

>No because transistors need a certain amount of current to work
Yeah, because THEY HAVE RESISTANCE.

Everything has resistance; the problem is when the resistance is too low.

is a lot of lemmings

That's not really a problem.
Current quantum computers use superconductors for the processor, and thus the processor itself takes extremely little power to run. It's amazing, really.

The only problem is that we need a lot of power to cool the processor down to close enough to absolute zero for it to gain that superconductivity.
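And "a lot of power" is an understatement even in the ideal case. The Carnot limit puts a floor on how much work it takes to pump heat out of a cold stage; the temperatures below are assumed round numbers for a dilution-fridge-style setup.

```python
# Ideal (Carnot) cooling cost: removing Q watts of heat from a stage at
# temperature tc, rejecting it at th, takes at least W = Q * (th - tc) / tc.
tc = 0.015   # cold stage, ~15 mK (assumed)
th = 300.0   # room temperature, K
q = 1.0      # watt of heat to remove

w_min = q * (th - tc) / tc
print(f"{w_min:.0f} W")  # 19999 W
```

So even a perfect refrigerator burns ~20 kW per watt removed at that temperature, and real cryocoolers are far from perfect.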

Who the fuck was talking about quantum computers, we were talking about graphics processors you idiot

Same concept applies you fool.

Lowering the resistance is a big deal, it's part of the reason we reduce the size of the process and why doing so reduces the power consumption of the processors.

Of course we also increase the number of transistors on the processors, so it does equal out to some degree.

That is because resistance causes voltage drops

Which is power consumption. The voltage drop is because power is converted into heat.

When did juggalos get from fucking magnets to fucking transistors?

Most dynamic power is consumed by the clock network. Imagine toggling an RC circuit billions of times a second. Even with small capacitances and resistances, it adds up to a whole lot.
Most passive power consumption is in performance-intensive and always-on areas, which have to use faster and leakier transistors.
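Quick sketch of the clock-network claim: swinging a capacitance C through voltage V costs C·V² per charge/discharge cycle, so the clock net alone burns roughly C·V²·f. The 20 nF clock-tree capacitance here is an assumed ballpark, not a real chip's figure.

```python
# Power to toggle the clock distribution network: P = C_clk * V^2 * f
c_clk = 20e-9   # assumed total clock-tree capacitance, 20 nF
v = 1.1         # supply voltage, volts
f = 3e9         # clock frequency, 3 GHz

p_clock = c_clk * v**2 * f
print(f"{p_clock:.1f} W")  # 72.6 W just to wiggle the clock
```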

Because pic related. Energy required to switch a transistor off/on is converted to heat. A billion of those, 4 billion times per second: it adds up.

Switching transistors costs energy.

the sun has the thermal density of a pile of compost

>Because transistors have resistance.
because transistors don't have enough resistance.

>Of course we also increase the number of transistors on the processors
We can only do that so much; at the frequencies modern processors run at, the speed of electricity is a limiting factor.

Because more power means the signals are more likely to have propagated through the processor by the time the next clock cycle comes around. Less power means you risk latching stale values from the previous clock cycle (because signals travel through the logic too slowly), leading to data/system instability.

This isn't always the case, though, because sometimes you get a really well-made processor that doesn't impede those signals as much, making them more likely to be where they need to be every clock cycle for the same amount of power. This is why overclocking can sometimes be done without adjusting voltages. It's also why you may need to increase voltage when you hit instability.
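A toy model of that trade-off: gate delay shrinks as supply voltage rises (very roughly k·V/(V−Vt)², an alpha-power-law form with α = 2), and the chip is only stable if the slowest path settles within one clock period. All constants here are made up for illustration.

```python
# Toy timing model: does the critical path settle before the next clock edge?
def critical_path_delay(v, vt=0.35, k=2.0e-10):
    """Critical-path delay in seconds at supply voltage v (volts). Illustrative."""
    return k * v / (v - vt) ** 2

def stable_at(v, f_hz):
    """True if the critical path settles within one clock period."""
    return critical_path_delay(v) < 1.0 / f_hz

print(stable_at(1.0, 2.8e9))  # False: at 1.0 V the signals miss the edge
print(stable_at(1.2, 2.8e9))  # True: extra voltage speeds up switching
```

Same clock, different voltage: the overclock that fails at 1.0 V passes at 1.2 V, which is exactly the "bump the vcore" ritual.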

The average Joe could do everything he wants, desktop wise, using a low-wattage device like a Raspberry Pi.

Because x86 is literally the most bloated, most abstracted, worst fucking architecture in the world. CISC needs to die.

Billions of tiny parts consuming tiny amounts of energy billions of times per second.

Literally untrue. The GPU still eats up the bulk of the power. Even GDDR5 on its own barely uses anything, like 20-30W at the absolute most. The memory controller on the GPU is what eats up real power. Board power is usually no more than 20-30W too. Even in the worst case scenario it's the GPU itself using up nearly 240-250W in real world power usage. The reason why HBM was a big deal was because the memory controller uses next to nothing, letting them cut like a good 50W + like 5W used by the memory itself.
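Taking the figures claimed in that post at face value (they're this poster's estimates, not a datasheet), the budget for a ~300 W card works out like this:

```python
# Rough power budget for a ~300 W card, using the numbers claimed above.
budget = {
    "GPU core (incl. memory controller)": 245,  # watts
    "GDDR5 DRAM chips": 25,
    "board (VRM losses, fan, misc.)": 30,
}
total = sum(budget.values())
print(total)  # 300
```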

/Thread

I thought CPUs use FETs? Not just PNP/NPN transistors?

Not CPU

This could be, but they'd still need a certain amount of current. I'm not sure if they use FETs; they serve the same function in digital logic, but FETs are a little more complicated, although you do need an extra resistor with the normal PNP/NPN transistors.

Guys, it is very simple. Each transistor is tiny, so you need very little voltage to push current through lots of them, like a little more than 1 volt. However, each transistor needs electrons moving through it; that's the current, or amps. So: high amps, low voltage.
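The "high amps, low voltage" point in numbers (chip power and rail voltage are assumed round figures):

```python
# P = V * I, so at low core voltage the current is enormous.
p = 150.0  # watts drawn by the chip (assumed)
v = 1.2    # core voltage in volts (assumed)

i = p / v
print(f"{i:.0f} A")  # 125 A
```

125 amps at 1.2 V is why CPU/GPU power delivery needs many parallel VRM phases and fat power planes.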

Welcome to the start of the thread.

Welcome to the middle of the thread.

Hey man, don't call our sun literal shit, he does a lot of hard work up there :(

what has x86 got to do with an nvidia gpu?

>he

Sexism much

>idiots posting about topics they have zero knowledge on.
CPU arch hasn't mattered in 20 years.

I'd like to know the answer to this.

Oddly enough, this statement is true.

>They literally do not.

Nonsense.

CPUs do way fewer FLOPS than GPUs, and my AMD FX at 4.6 GHz uses over 200 watts at full tilt doing AVX stress tests.

Besides, 200 watts is only like 0.27 horsepower.

A gas-powered weed eater uses like 3 times that.
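The conversion, for anyone checking:

```python
# Watts to mechanical horsepower: 1 hp is about 745.7 W.
watts = 200.0
hp = watts / 745.7
print(f"{hp:.2f} hp")  # 0.27 hp
```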

All transistors used in modern CPUs are FETs. FET just means field-effect transistor. Planar gates in 200nm strained silicon are FETs. Every modern transistor is.
The electric field from the gate is what controls the flow of current from source to drain.

You can render and play games on a RasPi?

>Lowering the resistance is [...] the reason we reduce the size of the process and why doing so reduces the power consumption of the processors.
>thinner wires have lower resistance
the absolute state of Sup Forums basic electrical education
this is why you don't trust Sup Forums for anything more than blind fanboism
go read plebbit
reddit.com/r/askscience/comments/2ykv66/why_doesnt_intel_increase_the_size_of_the/

>implying Sol isn't a boy's name

That's just another part of it, user.

Increasing the die size reduces the yields from production: not just the number of working chips from a wafer but also the percentage of working chips from a wafer.
But the fact remains that you get power savings by reducing the process used. There is a reason the term "die shrink" is thrown around. It's because no significant changes were made to the architecture but it was produced at a smaller process. The die became smaller because they didn't increase the number of transistors to fill out to the previous die size.
So what did they gain? Reduced power consumption.
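The idealized (Dennard-style) version of that gain: shrink the linear dimensions by a factor s < 1, so capacitance scales to s·C and supply voltage to s·V, and dynamic power C·V²·f at a fixed clock scales by s³. Real shrinks stopped following this cleanly long ago; the numbers are illustrative.

```python
# Idealized die shrink: same design, same transistor count, same clock.
s = 0.7            # a classic "full node" linear shrink factor
old_power = 200.0  # watts before the shrink (assumed)

new_power = old_power * s**3  # C -> s*C and V -> s*V, so P -> s^3 * P
print(f"{new_power:.0f} W")  # 69 W
```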

Also, I don't like the way you cut "part" out when you quoted me. I can understand cutting the preceding part but cutting the word "part" makes it sound like I said that was the only reason it was done, which is a misquote.

Because it's a rock we tricked into eternal slavery with THE POWER OF THUNDER, and this is its own personal revenge upon us.