Anyone have 480X benchmarks?

I'm predicting: 480X > 980/390X

Other urls found in this thread:

videocardz.com/60253/amd-radeon-r9-480-3dmark11-benchmarks
videocardz.com/60819/amd-radeon-rx-480-confirmed-as-polaris-67dfc7
techspot.com/review/1093-amd-radeon-380x/page8.html
tomshardware.com/reviews/nvidia-geforce-gtx-1080-pascal,4572-10.html
tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-12.html
tomshardware.com/reviews/amd-radeon-r9-390x-r9-380-r7-370,4178-9.html

There are none yet, and there likely won't be either. If AMD actually produced a 40CU die and is only showcasing a cut-down version as the RX 480, then Apple probably bought the cream of the crop again, like they did with Tonga.

Nvidia won

The 480 is better than the 980 and 390X. 480X is probably just under the 980Ti.

Only one.

My tip hurts when i lay it into the amd fan

Source besides the announcement even? Oh shit that's right you don't have any source. Fuck off with your hype bullshit.

Stay on funnyjunk retard

TEAM RED WON !!

has anyone else started to read it as Poolaris?

> Angry birds benchmarking

970 < 480 < 980
you're also probably right.

I'm really curious what the FP64 will be. For some reason I think it's 1/16, which is kind of lame.
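Quick back-of-the-envelope, assuming the ~5.2 TFLOPS FP32 figure quoted later in the thread and the FP64 ratios GCN parts have shipped with (the actual Polaris ratio isn't confirmed here):

fp32_tflops = 5.2                     # rumoured FP32 throughput for the RX 480
for divisor in (2, 4, 8, 16):         # FP64 rates GCN parts have shipped with: 1/2 down to 1/16
    print(f"1/{divisor}: {fp32_tflops / divisor:.3f} TFLOPS FP64")
# at 1/16 that would be only ~0.325 TFLOPS of double precision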

I'm curious to know the frametimes since they have not released shit. They are keeping so much behind the curtains it's a little suspicious. I mean the pro duo was a fucking failure, who even bought one?

videocardz.com/60253/amd-radeon-r9-480-3dmark11-benchmarks
videocardz.com/60819/amd-radeon-rx-480-confirmed-as-polaris-67dfc7

Will there be anything else?

That would be if it's a 390x
It's supposed to be a Fury

Not him, but you do know the RX 480X will only have marginal performance gains over a standard 480 (if the 480X even exists). We can see this by looking at the performance difference between previous gen 280,380 -> 280,380x. We already know AMD aren't going to be competing with the GTX 1070 till Vega, so there's no way they can have a 980 Ti equivalent performance card.

280x*

That was just me showing my source.
>The 480 is better than the 980 and 390X.
I make no claims to be an expert on any of this but that much is clear. If the difference between the 480X (if it exists, the RX thing gives me doubts) and the 480 is similar to the difference between the 380X and the 380 then that's going to put it just below the 980Ti. Purely from a numbers standpoint of course.

They said at the presentation that they'll have cards in the $200-300 range, and the 8gb RX 480 supposedly only costs $229. AMD has a presentation at the PC Gaming Show at E3 in a couple weeks, they might announce something then.

That's probably the price for 3rd party cards with gaymer fan shrouds and LEDs.

The Pro Duo is a workstation card, not a gaymur card. It has the highest compute performance of any card under $3500.

Neither AMD nor Nvidia has ever done such a thing.
AIBs are AIBs; how much they price their custom cards is none of AMD's or Nvidia's concern.

You're way off with those numbers. There's an 8-10% performance gap between a 380 and a 380x going by TechPowerUp's relative performance chart and by a TechSpot review, so if the 480 is around 390X level then the 480X, with its marginally better performance over the 480, wouldn't be anywhere close to the 980 Ti, because there is a 35% difference between a 390X and a 980 Ti.

95/70= 1.35

Here's my sources btw

techspot.com/review/1093-amd-radeon-380x/page8.html

>As expected, in the dozen games tested the R9 380X was never more than 14% faster than the R9 380 and on average was ~10% faster.

Also pic related.
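Putting rough numbers on the two estimates being argued over (all assumptions: the 480 sits at 390X level, the 480X gains ~10% like the 380X over the 380, and the 980 Ti sits ~35% over the 390X per the chart above):

r9_390x   = 100.0              # normalise the 390X (and a hypothetical 390X-level 480) to 100
rx_480x   = r9_390x * 1.10     # assume the 480X gains ~10% over the 480, like the 380X over the 380
gtx_980ti = r9_390x * 1.35     # 95/70 from the relative performance chart, ~35% over the 390X
gap = (gtx_980ti / rx_480x - 1) * 100
print(f"980 Ti would still be ~{gap:.0f}% ahead of a 480X")   # ~23%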

>mfw I can throw my GTX 770 in the garbage

just 1 month until I can trash this nvidia meme I fell for

>it's more than okay when nvidia does it.

funny thing is you can probably sell it for $200

some people are retarded

>8-10% performance gap between a 380 and a 380x
Yeah, I just took that 10% (perhaps a generous estimate on my part) and multiplied the 480's 3DM11 score by 1.1. That puts it just below the 980Ti's score in that chart. I really didn't pour a lot of effort into this, so it's a rough estimate with plenty of room for error.

>june 29

It won't best a 980 Ti but it will be miles cheaper and will come close. AMD did good.

You also have to take into consideration that the benchmark was done at a 1260 MHz clock with early drivers.

Higher clocks and mature drivers could put the RX 480 closer to FuryX in performance.

The RX 480 has an advertised 5.2TFLOPS compute power. 36CU means that the card is clocked at 1130-1150mhz.
GCN has far too high an IPC and too short a pipeline to target excessively high clocks.
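Reproducing that arithmetic (assuming GCN's 64 shaders per CU and 2 FP32 FLOPs per shader per clock):

cus = 36
shaders = cus * 64                                # 2304 stream processors
advertised_tflops = 5.2
clock_mhz = advertised_tflops * 1e12 / (shaders * 2) / 1e6
print(f"~{clock_mhz:.0f} MHz")                    # ~1128 MHz, matching the 1130-1150 MHz estimate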

>come close

Sorry, but this has already been disproven. A 30-35% performance difference between it and the 980 Ti is nowhere near close. It'll be just above a 980 at absolute best, but realistically everyone knows it'll be about 390X performance in real-world gaming, so just under a 980.

wait what? what's the story?

390x is pretty much as fast as the 980

You should see what a lot of Nvidia owners think their card is worth. At 200 burgers polaris is going to basically gut the second hand market for 970's and 980's (as well as anything slower than them).

Prices are already starting to crash.

>Prices are already starting to crash.
They'll rebound when the card actually comes out though.
Or rather, when actual benchmarks come out

The 480X doesn't exist yet

How so? Even the worst case rumours put the 480 at 970 to 980 levels of performance - that is going to instantly invalidate the high(er) price these cards sell for.

Gamer children will buy them because Nvidia.

>rumours
Rumours are always 0% accurate.
If you believe any of them, you're choosing which ones to believe and which ones not to believe on purely arbitrary grounds.

>"Leaked benchmarks"
>Way before the announcement
Moron

this

Yeah, they just guessed what AMD named it. Get out of here you mindless shill.

>Hey why aren't you defending this GPU from a company that has been proven to lie about their hardware yields before the product is even released or out of NDA you mindless shill
Grow up you fucking idiot

Are you talking about AMD or Nvidia or both?

Lying about their shit before it's available isn't anything that's exclusive to either company.

...

The third party got an engineering sample and benched it, AMD didn't distribute it. Unless you mean to suggest AMD owns them, which is... autistic at best.

No it isn't, that's why I don't defend any fucking company until the product is released.
Nvidia lies, AMD lies, every company lies.

No they didn't, you fucking idiot, the benchmarks were on the Futuremark site. Nobody got an "engineering sample and benched it". Stop spewing bullshit.

Mate that's literally what they do for a living.

If you think Futuremark got a sample to benchmark, you're actually retarded.

What? Do you think AMD couldn't bench their own card? Lol retard.

You are fucking stupid

I think there is no 480X

Why are their GPUs so high-IPC compared to the competition, but their CPUs the exact opposite?

Because Bulldozer was machine-designed, and half the problem with it was GloFo fucking up target power and clock goals.
The other half was that nobody knows how to write code which doesn't take a dump on core 0 while the rest kinda stand there.

Anyone take a guess why the clockspeeds on the 480 are so low, and why the TDP was relatively high for a 14nm chip?

I would have expected much more from a die shrink

GCN was a sound arch from inception, the concept behind it was already proven.
The Bulldozer family was based on a whitepaper from a rookie, and though it works in theory, the execution was nothing like the concept.

AMD has a wider design so clocks are low.
What's wrong with the TDP? 150W is its maximum load, gaming won't even be at 120W

fuck me dude i am so ready for the rx480 and ZEN. Swear to god Im buying that shit day 1. Keep my 800 dollar PC strong.

Because the GPU and CPU guys have got nothing to do with each other outside of memory management and interconnects/buses

>GCN was a sound arch from inception, the concept behind it was already proven.

GCN hasn't lived up to the hype at all if you compare it to the last few VLIW architectures it replaced. The 4000 series and 5000 series literally destroyed NVIDIA; AMD hasn't been able to replicate that with GCN and has instead just played second fiddle since.

>AMD has a wider design so clocks are low.
That makes no sense
>What's wrong with the TDP? 150W is its maximum load, gaming won't even be at 120W
It's not wrong, but it's not very low either for the level of performance it offers. Nvidia already had that last generation with the 970.

Judging from the low clock speeds and relatively high tdp I'm guessing that they haven't changed much about the architecture (except for the die shrink) and it's yet another iteration of GCN instead of the brand new architecture that many fanboys were touting

Two RX 480's barely beat a GTX 1080. The RX 480 basically has the horsepower of an R5 340.

Good god.

Are low clock speeds a massive issue considering that overclocking is a thing anyway?

GCN isn't a high clocking arch. Never has been and never will be.
The sweet spot is going to be right around 800-850mhz no matter what specific GPU in the family you're looking at. Even the Polaris 11 demo showed this with an 850mhz core clock and 0.8375v vcore.

Polaris 10 is a somewhat small die comparatively, and AMD chose to push clocks as high as reasonably possible. The alternative would have been to fab a larger die, but yields would have decreased, and Polaris 10 was made to be a cheap mid range.
If it were clocked lower it would likely be 100w or lower under load. Seeing how they undervolt and underclock is going to be amazing. Winning the silicon lottery with Polaris is going to net fucking astounding power savings.
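Rough illustration of why that undervolting headroom matters, using the usual dynamic-power approximation P ∝ f·V². The 850 MHz / 0.8375 V point is the Polaris 11 demo mentioned above, the 1260 MHz clock is from the leaked bench, and the 1.15 V at that clock is purely a guess:

f_hi, v_hi = 1260, 1.15       # leaked clock; the voltage here is an assumption
f_lo, v_lo = 850, 0.8375      # Polaris 11 demo operating point
ratio = (f_lo * v_lo**2) / (f_hi * v_hi**2)
print(f"dynamic power at the low point: ~{ratio:.0%} of the high point")   # ~36%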

>Judging from the low clock speeds and relatively high tdp I'm guessing that they haven't changed much about the architecture

it's even worse than before. the 480 has the same TDP as a 1080 yet is 70% slower than the 1080.

Yet also more than half the price.

Most people genuinely do not care about power draw.

Exactly, it seems Nvidia still is miles ahead in terms of performance per watt

Just like in the past, AMD is competing on price point again and not on performance or efficiency

>Most people genuinely do not care about power draw.

I do not care much about it either, but it does say a lot about the architecture used.

Look at the difference in performance per watt between Kepler and Maxwell; it was huge even without a die shrink because it was a brand new architecture.

>Most people genuinely do not care about power draw.

most people do care about heat and noise, however

I guess some people do, yeah. Personally, I'm not that fussed, but I know that some people do like silent rigs.

I use headphones, so I'm not bothered by noise unless it's actually like jet-engine-tier.

1080 is 180W TDP but it's closer to 190-200W in usage.
480 is 150W TDP but it won't be close to that outside of Furmark. Did you even see that it's a single 6-pin? It can't draw more than 150W at any time without breaching ATX spec, and no OEM would ever buy such a card.

>480 is 150W TDP but it won't be close to that outside of Furmark. Did you even see that it's a single 6-pin? It can't draw more than 150W at any time without breaching ATX spec, and no OEM would ever buy such a card.

thank you anonymous AMD employee, i never would have known this before the NDA lifted and reviews were out without your help

There are better ways of dealing with getting told, like not responding and further embarrassing yourself.

>Did you even see that it's a single 6-pin? It can't draw more than 150W at any time without breaching ATX spec, and no OEM would ever buy such a card.

6-pin and 8-pin connectors allow the same current to flow through them; the 8-pin just has two extra ground wires. So the 480 can draw 75w from the PCI-E bus and up to 150w from the connector itself. AMD also has a history of egregiously lying about TDP (like the 290x and its supposed 275w TDP, despite it drawing 400w under moderate load).
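For reference, the nominal PCI-E spec budgets being argued about (these are the paper ratings, not what the wiring can physically carry):

pcie_slot = 75     # W, PCI-E x16 slot
six_pin   = 75     # W, 6-pin spec rating
eight_pin = 150    # W, 8-pin spec rating
print("6-pin card, by spec:", pcie_slot + six_pin, "W")     # 150 W
print("8-pin card, by spec:", pcie_slot + eight_pin, "W")   # 225 W
# as the post above argues, the connectors can physically deliver more than the
# spec rating; these are the nominal limits, not hard electrical ones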

The GTX 1080 draws about 175w nominal. Even if you "overclock" it, it'll hit around 200w because the GPU is absolutely incapable of sustaining its clock speeds under a heavy load. That and line variability is absolutely worse in Pascal vs Maxwell.

Also for the record, this stuff everyone keeps repeating about ATX specs and how much power can be pulled via PCI-E and 6pin or 8pin connectors. All of it is 100% wrong. Boards can deliver absurd amounts of power over PCI-E, they just shouldn't do it for sustained periods. The reality is that if I wanted to design a peripheral ASIC which pulled 120w with no external power connector, I could do it entirely over PCI-E. Some boards might fry after a while, or some might fail to deliver that target power consistently leading to component instability, but they'd still do it.

These links show the most in-depth analysis of power consumption on the web for the cards in question. Only Tom's is doing this; no one else has picked up on it despite being thoroughly embarrassed for their lack of technical prowess.
tomshardware.com/reviews/nvidia-geforce-gtx-1080-pascal,4572-10.html
tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-12.html

28nm Maxwell shows better stability in line draw than 16nm Pascal. Pascal's dynamic power is out of control. The only saving grace for the architecture is how it handles workloads in sporadic bursts. If you were to measure real-time clock rate at the same resolution (likely impossible since you can't see inside the shader core), you'd see true clock speeds and p-states bouncing around nonstop.

tomshardware.com/reviews/amd-radeon-r9-390x-r9-380-r7-370,4178-9.html

Factory overclocked R9 390X 8GB card draws just under 300w in gaming workloads. It'll pull over 360w during a Furmark stress test. The reference model draws 53w less in the same bench.

Don't pull things out of your ass, user.

>Factory overclocked R9 390X 8GB card draws just under 300w in gaming workloads.

> Maximum: 492W

Doesn't really matter when the power draw of the competition is still vastly inferior.

And let's be honest, it is

>I don't understand literally anything
>all I do is shitpost

The millisecond variations in power consumption don't matter for cumulative power draw; the average is what matters. Or are you now claiming that the GTX 1080 is a 300w+ card under a normal gaming workload?
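Toy example of why millisecond spikes barely move the average (the numbers are made up):

samples_w = [180] * 990 + [300] * 10     # 1 ms samples: mostly ~180 W with brief 300 W spikes
average_w = sum(samples_w) / len(samples_w)
print(f"average draw: {average_w:.1f} W")   # ~181 W despite the 300 W peaks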

On the upside, GCN can do >muh async compute because that was the point of its design.

The only games where it has actually seen benefits from that are games that nobody plays, though.

>it's another "AMD targets midrange and Nvidia high end" episode

>millisecond variation

nice try

GCN was a straight uplift in IPC vs VLIW4. GCN has better performance per clock, and per transistor.

Your comparison is totally irrelevant.

guys we already have it
The 480 is stronger than the 980 but not near 980ti strength. Not that it matters since you are getting 980 strength for 200

You're still just shitposting.
The average draw there is only 250w. That is all that matters for cumulative power consumption. That graph is showing the 8GB 390X drawing 250w in a gaming workload.

Are you legitimately retarded or do you honestly think you're being clever right now?

> GCN has better performance per clock, and per transistor.
But it's still shit compared to the competition; the 390 has about a billion more transistors than the 970 yet they perform on par.
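Rough numbers for that, using the commonly cited counts (Hawaii/Grenada ~6.2 billion, GM204 ~5.2 billion) and treating the two cards as roughly equal in gaming performance:

r9_390  = 6.2e9    # transistors, Hawaii/Grenada
gtx_970 = 5.2e9    # transistors, GM204 (partially disabled in the 970)
print(f"the 390 spends ~{r9_390 / gtx_970 - 1:.0%} more transistors for similar performance")   # ~19%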

And how is it irrelevant that the only benefits are in games nobody plays? You guys keep bringing up async like it matters; I'll agree it matters when I start seeing it in games that I actually play.

>you are getting 980 strength for 200
Well, you think that until you actually use it and realize the drivers are shit.

>"leaked" benchmarks from 3 months before card release

Lol.

>GCN has better performance per clock, and per transistor.

only for artificial compute applications that can fully utilize the GPU; GCN is an inferior design for >gaymes, which is the only place AMD even has any customers these days.

those exact leaks of the 1070 turned out pretty accurate

The three Fiji owners probably killed themselves by now lol

HOLY SHIT HE'S SO POOR HAHAHAHAHAHAHAHAHAHAHAHHAHA

Because you're comparing an old architecture to a new one; Maxwell is one generation newer.
Compare the Nano, which is GCN at its sweet spot, and you won't see any difference in perf/watt between AMD and Nvidia

Nano is top binned

You'd need something like the full laptop 980 for comparison

HOW COULD YOU GET SO POOR! LET ME KNOW SO IT NEVER HAPPENS TO ME HAHAHAHAHAHAHAHAHHAHAGAGGAG

HOLY SHIT HIS DAD IS SO RICH HAHAHAHAHAHAHAHAHAHAHAHHAHA