250 FUCKING WATTS

Daily reminder that when you run a GPU like that nowadays you're fucking running a fucking SLI setup's worth of power. Do you know what the fuck 250 WATTS is in normal modern computing? It's inefficient as fuck in terms of energy use. Jesus fucking christ. Have you ever even seen how much a regular high-end card draws? It's rarely more than 150.
Let me educate you on something, you stupid shits. When you make a CPU you actually have to try to make it faster. But when you make a GPU, that's a different story. It's fucking easy: you slap on more cores and it gets faster, because GPUs are not CPUs. They are not serial machines. They are by definition parallel devices.
Hence in practice NVIDIA or AMD just slap more cores into them, essentially making bigger chips, essentially making more expensive chips, very easily, and then they charge more. Not to say the two shits aren't a cartel of gpu patents too, but that's another subject. Moar cores works for GPUs, not for CPUs.

And that is why GPUs are extremely expensive compared to all your other devices.

Now go fuck yourself.
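
To put a number on the "moar cores" point, here's a toy Amdahl's-law sketch. The parallel fractions below are made-up assumptions for illustration, not measurements of any real workload; the point is just the shape of the curves.

```python
# Toy Amdahl's-law sketch: why adding cores pays off for GPU-style
# (almost fully parallel) work but stalls for CPU-style work.
# Parallel fractions are illustrative assumptions, not real data.

def speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: speedup over a single core."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for cores in (1, 2, 4, 8, 16, 3584):   # 3584 = GTX 1080 Ti CUDA core count
    gpu_like = speedup(0.999, cores)    # shader work: ~fully parallel
    cpu_like = speedup(0.60, cores)     # typical desktop app: lots of serial code
    print(f"{cores:5d} cores   GPU-like x{gpu_like:6.1f}   CPU-like x{cpu_like:4.2f}")
```

At 60% parallel you cap out at 2.5x no matter how many cores you throw at it; at 99.9% parallel the 3584-core case is still pulling roughly 780x. That asymmetry is the whole "slap more cores" argument.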

You know what I think you need?
A nice pancake. A nice, fluffy pancake dripping with syrup.

daaaaaaamn dude that's probably something like $2 per year in electricity

fuck off fatty

no

100% of your computer's wattage goes directly to heat or fan-spinning

>$2 per year
That wasn't the point, you stupid shit. It's to prove that those "cards" are not very efficient technology to begin with. They are gigantic chips.

That's why they are expensive.

They are 2 chips in 1.

my electric cigarette uses more power than that

> essentially making bigger chips
It's not that easy: the bigger the chip, the lower the yield.
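
Rough numbers for the yield point, using the classic Poisson yield model. The defect density is an assumed illustrative value (real fab numbers aren't public); the two big die areas are the published ones for GP104 (GTX 1080) and GP102 (GTX 1080 Ti).

```python
import math

# Poisson yield model: yield = exp(-D * A), D = defects per cm^2,
# A = die area in cm^2. D is an assumed illustrative value.
D = 0.2  # defects per cm^2 (assumption)

dies = [(120, "small mobile chip (assumed size)"),
        (314, "GP104 / GTX 1080"),
        (471, "GP102 / GTX 1080 Ti")]

for area_mm2, name in dies:
    y = math.exp(-D * area_mm2 / 100.0)  # mm^2 -> cm^2
    print(f"{name:32s} {area_mm2:4d} mm^2 -> ~{y:.0%} good dies")
```

Bigger die means fewer candidates per wafer AND a smaller fraction of them working, so cost per good die grows much faster than area.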

so?

big fat fps dawg

Nothing is efficient
Show me that kind of performance for less wattage

Wrong. Sound and light escape. And even if you have boarded windows, electromagnetic radiation escapes and you lose energy.

> They are gigantic chips

Get a slower GPU then. 16/14nm GPUs will still be produced for a couple years. Also, that's the max TDP.

That wasn't the point of the OP. The point is that those chips are technologically not very impressive. It's very easy to make a bigger, more expensive GPU because they work in parallel at all times; they are not CPUs, which depend on the software.

But that's part of the point! They are both bigger, and the yield is shittier. It's why they are so expensive.

Fat ass.

>Electronic cigarette
>Having a vape and not calling it a vape
>Saying your box mod is an electronic cigarette
>Existing

Fuck off

(You)

My computer doesn't make non-fan sounds or visible light.
Electromagnetics isn't a power draw until it affects something through work, and the radiation is the heat. "Electromagnetic radiation" doesn't exist

>it's very easy to make a 1080ti
Then do it fag, show us how much you save

this car use mor gas than honda car

honda is 4 cylinder and chevvy is 8, they just make bigger, total trash, too much gas

You can't, you bong. AMD and NVIDIA are a cartel of gpu patents.

Even intel has to kiss ass to make iGPUs: seekingalpha.com/article/4042977-new-details-intels-ip-deal-collaboration-amd

faggot

Your weight is perfect to split your head from your body, you won't even feel it.

> IT'S PERFECTLY FINE TO PAY $800 FOR A FUCKING CARD THAT WILL BE SHIT IN 4 YEARS
suit yourself faggot
the rest of humanity, i.e. 99.99% of computer users, knows this is a bad product

So you're blaming nvidia for "forcing" nvidia to release cards that you think aren't as good as they could be?
What are you even trying to say? I don't understand at all. Are you saying GPUs are bad?

And? A V12 Ferrari isn't efficient either but that's not really the selling point.

No, you fucking moron. The whole fucking point is that these products are inefficient pieces of shit. They cost $800 and they will be absolute garbage in 4 years at that price.

They do absolutely nothing impressive.

They just bundle chips into bigger ones.

>My computer doesn't make non-fan sounds or visible light
Your screen. Inaudible coil whine as part of normal operation. Sound from fans passing through walls and windows.
>Electromagnetics isn't a power draw
You're converting electricity to wifi. Wifi escapes the confines of your house, and dissipates into the surroundings, hence you lose energy.
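
For a sense of scale on this tangent, a back-of-envelope comparison; all the small-signal figures below are rough guesses rather than measurements. WiFi transmit power is regulation-capped around 100 mW, and coil whine is somewhere down in the microwatts acoustically.

```python
# Back-of-envelope: how much of a 250 W GPU's draw could escape the
# room as anything other than heat. Small-signal figures are rough
# guesses, not measurements.
gpu_draw_w   = 250.0
wifi_tx_w    = 0.1    # ~20 dBm regulatory cap for consumer WiFi
coil_whine_w = 1e-6   # acoustic power, generous guess
led_light_w  = 0.05   # indicator LEDs, guess

escaping = wifi_tx_w + coil_whine_w + led_light_w
print(f"Non-heat losses: {escaping:.3f} W ({escaping / gpu_draw_w:.4%} of draw)")
print(f"Heat dumped into the room: ~{gpu_draw_w - escaping:.1f} W")
```

Technically correct that energy escapes, but it's a rounding error: effectively the full 250 W ends up as heat.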

That's another shit product for most people. Only 0.1% of humanity can say with a straight face that a high end ferrari is good technology. It's not good technology, it's overdesigned.

Good technology != overdesigned

only imbeciles believe that.

>The point is that those chips are technologically not very impressive

The architecture that allows them to pack that much computing power into a single chip and only end up drawing 250 watts is extremely impressive. Just because the GPUs based on that architecture are just different sizes of the same design doesn't make the family as a whole any less impressive.

>only end up drawing 250 watts
Are you fucking kidding me? A mid-range card, which right now can run 1080p on most games admirably, rarely goes above 120-150W, AND EVEN THAT IS PUSHING IT compared to most people's needs. When someone goes 4K with a card that draws 300W and pretends that's not overdesigned shit that will drop in performance/price like a rock within 2-3 years, he is an idiot.

t. Electric company

it totally is though

More like $140 per year in Europe

wow, i don't think my fucking housefire of a 290 does that

>Complaining about $12 per month.
What are you, poor?
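
For anyone who actually wants to check the $2/year vs $140/year claims, the arithmetic is below. Hours per day and tariffs are assumptions; plug in your own.

```python
# Annual electricity cost of a 250 W card at full load.
# Hours/day and tariffs are assumptions; adjust for your own case.
tdp_kw = 0.250
hours_per_day = 6                    # assumed heavy gaming load
kwh_per_year = tdp_kw * hours_per_day * 365

for region, price in [("US", 0.12), ("EU", 0.25)]:   # $/kWh, rough tariffs
    yearly = kwh_per_year * price
    print(f"{region}: ~${yearly:.0f}/year (~${yearly / 12:.0f}/month)")
```

So "$2 per year" is off by more than an order of magnitude, while the European figure and the $12/month are both in the right ballpark at heavy daily use.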

But all computers are inefficient.
Also, what will be the big GPU of 2021, O wise seer of the future?

But what about a card that can run 4K?
I mean, the 1080ti hopefully isn't anywhere near the power consumption of a 1060 (which is more than enough for 1080p, why would anybody need more!!?!?!?)

>4K renders 4x as much graphical work
>Uses only 2x the power
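
The arithmetic behind that greentext, with published TDPs (treating "graphical work" as raw pixel count, which is a simplification):

```python
# Pixel-count ratio behind the ">4x the work for 2x the power" post.
pixels_1080p = 1920 * 1080
pixels_4k    = 3840 * 2160
ratio = pixels_4k / pixels_1080p
print(f"4K / 1080p pixels: {ratio:.0f}x")

# Published TDPs: GTX 1060 (the "enough for 1080p" card) vs GTX 1080 Ti.
tdp_1060, tdp_1080ti = 120, 250
print(f"TDP: {tdp_1080ti / tdp_1060:.1f}x for {ratio:.0f}x the pixels")
```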

What is the point of this thread?

I'm sorry you cannot afford electricity or a good GPU, anon; now take your (You) and fuck off

My screen is not part of the computer, and I don't fucking use wifi as I'm not a complete dipshit
I already excluded fan-noise (maybe try reading?)
The sonic energy of coil whine is DEFINITELY not enough to make a single watt of difference
Try again, hun

>more CUDA cores than GTX 1080
>more TUs than GTX 1080
>more ROPS than GTX 1080
>wider memory bus than GTX 1080
>more RAM than GTX 1080
>faster RAM than GTX 1080
WHAT DO YOU MEAN IT USES MORE POWER!?
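
Putting approximate launch specs on that list (from memory; double-check before quoting):

```python
# Approximate launch specs behind the greentext comparison.
#                cores  ROPs  bus(bit)  VRAM(GB)  mem(Gbps)  TDP(W)
specs = {
    "GTX 1080":    (2560,  64,   256,      8,       10.0,     180),
    "GTX 1080 Ti": (3584,  88,   352,     11,       11.0,     250),
}
labels = ["CUDA cores", "ROPs", "bus width", "VRAM", "mem speed", "TDP"]
for label, b, t in zip(labels, specs["GTX 1080"], specs["GTX 1080 Ti"]):
    print(f"{label:10s}: {b:6g} -> {t:6g}  (+{(t / b - 1) * 100:.0f}%)")
```

Roughly 40% more of almost everything for roughly 39% more TDP, which is the joke: the power scales about as fast as the hardware does.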

Hm, you still get better perf with the same consumption, so what's wrong exactly here?

No, it's the exact same performance as a 780 Ti, because they just pack MOAR CORES in until they hit 250W

High end shit will consume more power.
It's always been like that with electronics.

You want more work done? You need more power.
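
That's roughly the physics: CMOS dynamic power scales as P ≈ N·C·V²·f, so more cores at the same voltage and clock means proportionally more watts. A sketch, with the per-core capacitance, voltage, and clock below being assumptions tuned to land near plausible TDPs rather than real silicon figures:

```python
# CMOS dynamic power: P ~ N * C * V^2 * f. Constants are illustrative
# assumptions tuned to land near plausible TDPs, not real silicon data.
def dynamic_power(n_cores: int, cap_f: float, volts: float, freq_hz: float) -> float:
    return n_cores * cap_f * volts**2 * freq_hz

C, V, F = 4.7e-11, 1.0, 1.5e9   # per-core switched capacitance, voltage, clock
for cores in (2560, 3584):       # GTX 1080 vs GTX 1080 Ti shader counts
    print(f"{cores} cores -> ~{dynamic_power(cores, C, V, F):.0f} W")
```

Same voltage and clock, 40% more cores, 40% more watts. The only way off that treadmill is a better process node or a lower voltage.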

>Hence in practice NVIDIA or AMD just slap more cores into them
You stupid fucking retard.
GPU work is much more parallel than CPU work, so slapping more cores IS helpful.

But the 780 Ti already consumes 250W.
How did they slap more cores on it?

Because they're Jews, they tricked you by lying about TDP since 1997

Are you saying that the 780Ti has the same performance as a 1080Ti because they have the same TDP? literally kys

Shit-eating Nvidiot Shill

>BUT IT's HIGH WATTAGE

The fuck? It's super low! My GTX 780 is 250 watts and it's a fraction of the power of a 1080ti.

...

So the 980 Ti and the 780 Ti should have very different power consumption, if you're right.
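
Ballpark launch numbers for the three 250 W flagships in question (FP32 TFLOPS from public specs; gaming performance doesn't track TFLOPS exactly, so treat these as rough):

```python
# FP32 throughput vs. TDP across three 250 W flagships (ballpark
# launch specs; gaming perf won't track TFLOPS exactly).
cards = {
    "GTX 780 Ti":  (5.0,  250),
    "GTX 980 Ti":  (5.6,  250),
    "GTX 1080 Ti": (11.3, 250),
}
for name, (tflops, tdp) in cards.items():
    print(f"{name:12s}: {tflops:5.1f} TFLOPS / {tdp} W = "
          f"{tflops / tdp * 1000:.0f} GFLOPS/W")
```

Same 250 W TDP across three generations, more than double the throughput per watt by Pascal. Equal wattage does not mean equal performance.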

Ha Ha ha is you beings serious my dude

i like pancake