147.3W POWER CONSUMPTION

>147.3W POWER CONSUMPTION

IT'S OVER, AYYMD IS FINISHED & BANKRUPT

NO OVERCLOCKING HEADROOM AT ALL

AYYMD DRONES CONFIRMED ON SUICIDE WATCH

>overclocked
>in furmark
>at 1.15v

You didn't even try.

>reference model

HELLO SHILL

Piss off.

Even at the default 1266MHz it requires 1.15V; this GPU is shit at power efficiency.

Kyle was right all along

>1.15v

0.8375v for 850MHz.

Shitpost harder. Kyle was laughably wrong, and everything you posted just proves it.

Can I underclock it? AMD has always given its cards an ample amount of voltage room.

It worked great on 290Xs, so I don't see why not.

I never said shit.

>That voltage

AMD is a fucking joke. That voltage is higher than a stock GTX970's, and too much for 14nm.

Because this is a different chip, and a FinFET one on top of that; FinFETs have less leakage, so they have a narrower usable voltage range.

This is all done to save money, but it just breeds bad press. They could have scrapped everything over 1.1V and had generally lower power consumption across the board, but they probably qualify everything up to 1.17V so they don't have to scrap chips that miss the 1.11V cut. The downside is that hungrier chips make it out into the wild.
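To put rough numbers on that binning tradeoff: dynamic power scales roughly with V^2 at a fixed clock, so a die shipped at 1.17V burns about 11% more dynamic power than one that makes the 1.11V cut. A minimal Python sketch, assuming only the standard CMOS approximation (nothing AMD has published):

    # Rough CMOS dynamic-power scaling: P is proportional to C * V^2 * f.
    # At a fixed clock the ratio between two voltage bins is just (v_hi / v_lo)^2.
    def relative_dynamic_power(v_high, v_low):
        return (v_high / v_low) ** 2

    # Bins from the post above: chips qualified at 1.17V vs. the 1.11V cut.
    ratio = relative_dynamic_power(1.17, 1.11)
    print(f"1.17V bin draws ~{(ratio - 1) * 100:.0f}% more dynamic power")  # ~11%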

Note that the 1.15V in the OP is with an overclock.

Are you being retarded on purpose?

Oh? Explain why you think that.

Nvidiots don't have to pretend to be retarded.

1.1500v is with the overclock, not the stock config.

>2229 RPM
Damn, that must be loud as fuck. My card sounds like a jet engine above 2k RPM.

That's still pretty high for a card with that performance on a node shrink.

A GTX 970 at 900.2MHz core clock runs at 0.9310v;
at 696.4MHz core clock it runs at 0.8500v.

Which explains why those PCBs are carrying those heavy power phases relative to the die size. Seems AMD fucked up.

I guess we'll find out soon. I'm on a 670, so either way I need an upgrade.

What voltage @ 1500MHz?

What's the voltage matter if the power consumption is low? Also, as someone already said, AMD is pretty liberal with its voltage ranges across all its GPUs and CPUs to save cost, so chances are pretty high this can be undervolted.

Also take note that if the performance is 980-tier at around 110-120W at stock clocks, the card is still a good 60% more efficient than Maxwell, which puts it pretty much right there around Pascal.

Depends on the GPU. I can hit 1519 at 1.212V, which is stock for most aftermarket cards, but I can hit 1506 at 1.170v, or 1506 at 1.150v if I keep the card below 55°C, which can only be done with watercooling.

Most cards can hit 1500, which is about average, without increasing voltage past what BOOST 2.0 itself provides, which is based on ASIC quality and temperature. To get a card to run more than 1.212v at stock clocks without modifying anything, you need a chip that could overclock to 1600MHz; those give around 1.218v out of the box, but that takes really, really good silicon. We're talking more than 85% ASIC here, which is super rare.
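Purely to illustrate the mechanism described above, here's a toy lookup in Python; every threshold and number in it is invented, since the real BOOST 2.0 tables are NVIDIA's and not public:

    # Hypothetical sketch only; NOT NVIDIA's actual BOOST 2.0 table or algorithm.
    # Idea from the post above: better silicon (higher ASIC quality) ships with
    # lower stock voltage, and the card backs off further when it runs cool.
    def boost_voltage(asic_quality_pct, temp_c):
        if asic_quality_pct >= 85.0:   # the rare >85% ASIC golden samples
            base = 1.150
        elif asic_quality_pct >= 70.0:
            base = 1.175
        else:
            base = 1.212               # the common stock ceiling mentioned above
        if temp_c < 55.0:              # assumed back-off, echoing the 55°C remark
            base -= 0.020
        return round(base, 3)

    print(boost_voltage(86.0, 50.0))   # cool golden chip: 1.13
    print(boost_voltage(65.0, 75.0))   # typical warm chip: 1.212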

>What's the voltage matter if the power consumption is low?

>1300MHz
>147w

It doesn't. Either way it doesn't matter for home use.
That 2k RPM at 58% is worrying though, but the heatsink is tiny and nobody sane buys reference cards anyway.

That's overclocked, and holy shit, since when have GPU-Z sensor readouts been accurate in the least?

So the guy who claimed sources told him AMD had managed to create a product that runs significantly hotter and less efficient than Pascal was right all along?

That's typical for a reference cooler fan. They're so small that they don't usually get loud until around 1600 RPM, but they get annoying quickly.

Uh, since forever? It reads through the Microsoft API itself, which is no different from any other monitoring software like HWMonitor.

>1266MHz
>1322MHz

What's your point?

No, the RX physically cannot give off as much heat as Nvidia cards. You are just a moron.

And since when could you read rail wattage with a fucking software tool? Guess that makes wall meters and oscilloscopes useless.

Also, I'm gonna start posting GPU-Z readouts with 110W for the 480. I guess they're correct too? Wow, look at this, 80W, and it's overclocked!

1/5

dunno, thinking of life

More overclocked, 110W this time!

Amazeballs

You have to admit this is pretty disgustingly shit for a midrange card though.

It's expected that high-end cards run hot, but this is a midrange card at relatively low clock speeds.

We would almost believe you

I get your point but look at the power graph there for a moment.

>1080p
>medium settings

More crap.

No, I understand heat transfer, and I'm not a huge idiot like you.

Last one.

Different applications or games will use different parts of the GPU, meaning in some games parts of the GPU die, the physical silicon, go unused. This isn't limited to GPUs; CPUs do the same. Modern Intel CPUs can idle down to 2 watts, which is less than DDR3 RAM at 1600MHz at idle.

No you don't

>a midrange card
who started this meme?
970 and 390 are midrange
960 and 380 are low end
370 and 950 are the budget tier

You sound like a nice guy

You're talking about last generation grandpa.

By that logic the 580 is still a high end card

>gpu load 15-50%

Is that weird?

You're exposing yourself as tech-illiterate, my friend. Voltage comparisons between two different architectures are completely meaningless. Skylake CPUs use higher voltages than Haswell CPUs (significantly higher, in fact: 1.45V is safe on air, a level that would have a Haswell on fire) despite the drop from 22nm to 14nm. Despite that, they use less power under load than the Haswell chips, and the slightly higher TDP is entirely down to the stronger iGPU.
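A toy calculation of why voltage alone tells you nothing across architectures: dynamic power goes as C * V^2 * f, and a 14nm process can cut switched capacitance enough to more than offset a higher voltage. The capacitance figures below are invented purely to make the point, not measured Haswell/Skylake data:

    # Toy numbers: the capacitance values are invented, not real chip data.
    # Dynamic power approximation: P = C * V^2 * f.
    def dynamic_power(c_farads, volts, hertz):
        return c_farads * volts ** 2 * hertz

    haswell_like = dynamic_power(2.0e-9, 1.25, 4.0e9)  # 22nm-ish: more C, less V
    skylake_like = dynamic_power(1.2e-9, 1.45, 4.0e9)  # 14nm-ish: less C, more V

    print(f"Haswell-like: {haswell_like:.1f} W")  # 12.5 W in this toy model
    print(f"Skylake-like: {skylake_like:.1f} W")  # 10.1 W despite higher voltage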

what?

Yes, it is still a high-end card, because it was going for $400+ when it came out.

What's your point? That GPUs have power spikes and whoever screenshots that power spike wins the shitposting contest?

Holy fuck, GPU-Z is a worthless tool for reading something like power consumption, possibly worse than watt meters due to an even smaller sample size and refresh rate.
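For what it's worth, here's a little Python simulation of that sampling complaint, with every number in it made up for illustration (this is not how GPU-Z actually samples): a trace that averages ~111W can still hand a once-per-second poller a 147W MAX to screenshot.

    # Synthetic demo of why a slow poller's MAX field misleads. The trace,
    # spike length and polling interval are all invented for illustration.
    import random

    random.seed(0)
    ms = 60_000                              # one minute at 1ms resolution
    trace = [110.0] * ms                     # steady ~110W baseline
    for _ in range(200):                     # sprinkle brief ~147W spikes
        start = random.randrange(ms - 10)
        for t in range(start, start + 10):   # each spike lasts ~10ms
            trace[t] = 147.0

    true_avg = sum(trace) / len(trace)
    polled = trace[::1000]                   # a tool polling once per second
    print(f"true average: {true_avg:.1f} W")     # ~111 W
    print(f"polled max:   {max(polled):.1f} W")  # 147W headline on most runs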

So basically it's a 390X?

Fuck it, waiting till Volta and Vega.

Oh my friend, you forgot to mention how Skylake has improved instruction sets, including AVX2, which is why they can run at higher voltages without burning themselves ;)

>MAX
Can't GPU-Z post the average reading?

Yes, but that's not good for clickbait.

These pictures were taken from the live stage though; it made sense there.

HOLY SHIT GTX1080 CONFIRMED FOR 320W HOUSEFIRE NEEDS 120% MORE POWER THAN RX480 FOR 60% MORE PERFORMANCE

That one in the OP is a separate Chiphell image; it's watermarked.
Everything else wasn't watermarked.
Could it be a different source?

Except it isn't?
Holy shit, you're dense.

Post again when you read the thread.

>XFX

This explains everything.

I'm no fan of Nvidia, but isn't that about right? Is the GTX1080 the same size?

>overclock
>MAX
>MAX
>MAX

This shitpost isn't worth the effort to dignify with a shitpost.

150mm2 bigger

Oh wow, look at the desperation from AMD fanboys ITT

Oy vey, you know what I meant. Die size.

>shitpost
>get anally probed by your superiors
>"d-desperation!"

Don't you have a house to watch out for? It's summer and your GP104 might suddenly combust.
I recommend heat sensors connected to your own server so you can monitor it away from home.

AMD fanboys are such fucking hypocrites. I remember when the 1080 FE overclocked from 1600MHz to well over 2000MHz and they called it shit for overclocking.

Now we have proof the 480 doesn't even overclock further than 10%, and none of you idiots are even mentioning it.

Another card that runs hot as shit and can't OC. Looks like AMD delivered in typical fashion again.

Yes, and it is ~150mm2 bigger than the 480.

You mean a 400MHz overclock with a 10% 3D performance increase, if it doesn't throttle?

Haha.

Pascal has horrendous clock scaling; it's all down to lower IPC than Maxwell.
Nvidia blew their entire load on GP100, honestly.

It's
a
fucking
$200 card
Why the fucking hell are you arguing about low-end graphics cards? You twats fucking know nobody will change their minds after any argument. What are you trying to accomplish? Do you buy a GPU just for shitposting? Don't you have better things to do with it?

I'm more interested in the GPU and its characteristics than the card.
Now sod off.

Then you're a rare minority in a world of arguing fanboys.

>Quoting everyone ITT
Here's your (You).

There are a number of those on Sup Forums; they're hard to find elsewhere, so it's worth trudging through the dirt and mud, I believe.

These fucking threads are getting fucking old.
For fuck's sake, at least argue about Loonix vs Winblows.

>AMD is a struggling company
>Major GPU release

This is an interesting time for the gaming industry, but apparently your walnut does not comprehend this.

There are more of those on Sup Forums than on other websites. Still, these threads turn into the same shit arguments; I guess it's kinda normal after all.

>revision
>C7

HOLY SHIT HOW DID NOBODY SEE THIS?
This GPU could have been released fucking 4 months ago!

>itt butthurt nvidiots

It probably was ready, but they had to build up stock rather than do a paper launch.

what did they mean by this?

no but seriously, what are these artifacts called?

Of course I do not comprehend why some people compare a GPU like the GTX 1070 versus the RX 480. Of course the 1070 will be better.

I just go nuts because most of those arguments are kinda shitty.

Is that just the way the game handles shadows? I remember Far Cry 3/4 had the same issue when they loaded in detail too fast.

>AMD claim 150W card, it actually peaks at 147W
>Nvidia claim 150W card, it actually averages 150W and peaks at 200W

We see it every single time.

POWER OF A 980 for $200

Fuck, I hate the AMD fanboys for hyping this shit up.

Nvidia fanboys reduced to actually lying about AMD cards to make them seem worse than they are

Remember the days when nvidia fanboys didn't have to make shit up?

>Nvidia fanboys threatened this badly by a $200 mainstream reference card

O I am laffin

I'm inclined to say it's an AMD thing, because I used to have it on my 6950 but don't on my 770. Could just be the 6950 being too slow, though.

Peak is irrelevant for both.

AMD should from now on just hide the TDP and share it only with OEMs.
The sheer number of twats, normies, 'professional gamers' and shitposters from Sup Forums and other subhuman dwellings who think TDP == power consumption is ridiculous.

>I think TDP is power consumption
See, that's where you're wrong.
150W is a TDP. AMD usually rates TDP as a maximum, while Nvidia treats it as more of a minimum.

No, not really?

They had an elaborate blog post about the way they filtered everything to give the game a certain look; one of those filters had the nasty side effect of those dots.

It would happen in more games then. I have a 280X and quite a few demanding games, but I have only ever seen this weird loading-in during Far Cry when they pan around too fast. The GTA benchmark is just getting to the fast part too.

EVERYONE LAUGH AT AMD

This is pretty realistic though. AMD looks good in benchmarks, but in real life (benchmarks ARE NOT real life) they're usually 2-3 times slower than the Nvidia equivalent. AMD is like those concept cars that can do anything but are just cheap shit with a coat of paint on them.

Any 28nm 200-250mm2 chip can use >200W, you fucking dipshit; the absolute power usage limit is still a function of total active area. A 230mm2 die and any other 230mm2 die on the same or a similar process generally have similar power draw limits.
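Spelling that area argument out with an assumed (not measured) density figure: if a 28nm-class process can sustain on the order of 1W/mm2, any 200-250mm2 die lands at a similar ceiling, whoever makes it.

    # Back-of-the-envelope density math for the "limit scales with area" claim.
    # The W/mm2 figure is an assumed ballpark, not a published process spec.
    SUSTAINED_W_PER_MM2 = 1.0   # assumed ceiling for a 28nm-class process

    def power_ceiling(die_area_mm2):
        return die_area_mm2 * SUSTAINED_W_PER_MM2

    for area in (200, 230, 250):
        print(f"{area}mm2 die -> ~{power_ceiling(area):.0f}W ceiling")
    # 200mm2 -> ~200W, 230mm2 -> ~230W, 250mm2 -> ~250W: the >200W range above.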

>Like I said 2 days ago, most review samples are having trouble reaching anything beyond 1350 MHz. The highest reported clock is 1379 MHz. The member of the PCGamesHardware team confirmed that their sample did not reach 1400 MHz.

>IT OVERCLOCKS TO +1600 MHZ AND REACHES 1070 PERFORMANCE
>BELIEVE ME, WCCFTECH SAID SO
Only 3 more days until all their dreams get crushed.