Why do combuters ged hot?

They get stressed out doing so many things so quickly

...

Transistors are leaky.
Some of the power is used to do the actual transisting; some of it just leaks through and turns straight into heat.
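
This split is real: chip power is roughly dynamic (switching) power plus static (leakage) power. A back-of-the-envelope sketch — every number below (activity factor, capacitance, voltage, frequency, leakage current) is an assumed illustrative value, not a real chip spec:

```python
# Rough CPU power split (illustrative numbers only, all assumed).
# Dynamic power: P_dyn = alpha * C * V^2 * f  -- energy spent actually switching
# Static power:  P_leak = V * I_leak          -- current that leaks straight to heat

alpha = 0.2      # activity factor: fraction of capacitance switching per cycle (assumed)
C = 150e-9       # effective switched capacitance in farads (assumed)
V = 1.1          # supply voltage in volts (assumed)
f = 3e9          # clock frequency in hertz (assumed)
I_leak = 10.0    # total leakage current in amperes (assumed)

P_dyn = alpha * C * V**2 * f   # ~109 W of "useful" switching power
P_leak = V * I_leak            # ~11 W that does no work at all, pure heat
print(P_dyn, P_leak)
```

Note the V² in the dynamic term: that's why undervolting cools a chip so much, and why leakage got worse as transistors shrank.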

elecdriciby

What if we just plugged the leak?

cover it in foam

That looks like shit

Well, ask intel.
They've been trying that for years.

You use red leds instead of blue

But blue is hotter than red

Colorbs have tempbrature now?

[that "jew song" with a bunch of micro jews trying comically to plug the transistors intensifies]

Back in the day computers were really big and expensive. Heating the entire hall that you kept the computer in would have cost a fortune. Instead, they would put heating elements inside the computer (the whole computer was the CPU back then) to heat it up. As time went on, computers became smaller and smaller, but the heating elements stuck. Today, it would be too expensive to remove the heating elements, since that would mean the CPU design would have to be completely remade, 40 years of work down the drain. So we just keep making them smaller and smaller, and the heating elements make them hotter and hotter as they get smaller, which is why modern computers have much bigger CPU coolers than computers in the 90s.

Hope that helps.

Wtf I want a cold computer now

joule effect
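
"Joule effect" = resistive heating: current through a resistance dissipates P = I²R as heat. A tiny worked example with assumed numbers (the current and resistance are made-up, core-rail-scale values):

```python
# Joule heating: P = I^2 * R, all of it ends up as heat.
I = 50.0   # current draw in amperes (assumed)
R = 0.02   # effective resistance in ohms (assumed)

P = I**2 * R
print(P)   # 50.0 -- 50 W dissipated as heat
```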

pick one
> a.
> b. all the porn makes it hot and heavy
> c. resistance

Stefan-Boltzmann law
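
The Stefan-Boltzmann law gives the power a surface radiates: P = ε·σ·A·(T⁴ − T_amb⁴). Plugging in rough numbers for a bare die (area, emissivity, and temperatures below are assumptions) shows radiation alone sheds well under a watt, which is why CPUs need heatsinks and airflow instead of just glowing:

```python
# Radiated power from a bare CPU die via Stefan-Boltzmann (illustrative numbers).
sigma = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)
eps = 0.9               # emissivity (assumed)
A = 2e-4                # die area ~2 cm^2, in m^2 (assumed)
T = 373.0               # die surface at 100 C, in kelvin (assumed)
T_amb = 293.0           # 20 C surroundings, in kelvin (assumed)

P_rad = eps * sigma * A * (T**4 - T_amb**4)
print(P_rad)  # ~0.12 W -- nowhere near the ~100 W a CPU produces
```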

Can't. Too much pressure builds up and they asplobe

It's all about transistors: when the current goes through due to conservation of energy it can't all stay as electrical energy, so some turns into magnetic energy, sound, light (very little), and heat. Since you're putting a lot of energy into a small space, it can get extremely hot, so you need cooling to prevent extra resistance or outright melting.

Chad elections ramming into virgin ions

I'd genuinely like to know why CPUs produce so much waste heat.

Apparently the majority of this heat doesn't actually come from computations, but is due to how inefficient our chip designs are?

Is it possible to do computations that actually absorb heat?
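
There is a known theoretical floor here: Landauer's principle says erasing one bit of information must dissipate at least k_B·T·ln 2 of heat, and real chips run many orders of magnitude above that, so almost all CPU heat really is engineering inefficiency rather than fundamental. (Reversible computing tries to dodge the limit by never erasing bits; no computation absorbs net heat.) A quick scale check — the erasure rate is an assumed order of magnitude, not a measured figure:

```python
import math

# Landauer limit: minimum heat dissipated per erased bit.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

e_landauer = k_B * T * math.log(2)
print(e_landauer)    # ~2.9e-21 J per bit

# A chip erasing, say, 1e18 bits per second (assumed order of magnitude)
# would have a Landauer floor of only a few milliwatts:
print(e_landauer * 1e18)  # ~0.003 W, versus ~100 W real chips dissipate
```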

pls stop frogposting

>when the current goes through due to conservation of energy it can't all stay as electrical energy
stop talking out of your ass. The problem is practical, not theoretical.

physics

it's not though. there's nothing stopping me from drawing up something where a transistor does perfectly efficient amplification, just like there's nothing stopping me from describing a theoretical device that spins forever with zero friction, or a simple circuit where V_in = V_out. conservation of energy has nothing to do with it; it's a matter of efficiency.

Ok im in. Kill frogposters on sight.