Vega RX benchmarks

>same performance as vega frontier edition
>same power draw as vega frontier edition
>but m-m-muh gaymen drivers

videocardz.com/71090/amd-radeon-rx-vega-3dmark-fire-strike-performance

wait™ the drivers & wait™ Navi

Other urls found in this thread:

nordichardware.se/nyheter/radeon-rx-vega-prislapp-sverige.html
gamersnexus.net/guides/2990-vega-frontier-edition-undervolt-benchmarks-improve-performance
digiworthy.com/2017/06/30/amd-vega-frontier-tile-based-rasterizer/
overclock.net/t/1633406/so-what-went-wrong-with-vega/120
anandtech.com/show/10536/nvidia-maxwell-tile-rasterization-analysis

forgot
>$850

this
AMDead

just run it on my 1070:

Graphics Score 20 771

So explain to me why anyone would buy a GPU that is twice the price, uses twice the power, and is a year late to the party?

Don't question AMD fanboys

That explains why they're poor.

Price is unknown, that "leaked" price is complete BS
Power consumption is unknown but likely 250-275 watts
You are comparing a partner GPU to a reference GPU

How many schmeckles are you being paid to shill? This is old news, no need to rehash

yeah yeah, just wait™ for drivers / non-reference cards / Navi, and let the AMD wait→disappointment loop continue

>but m-m-muh gaymen drivers

Whilst this is a ridiculous argument to begin with, nothing in your post disproves it whatsoever. The claim is that drivers will be ready for launch that will boost its performance. How would pre-release leaks using pre-release drivers be in any way relevant to that theory?

Almost every post in this thread is anti-AMD circlejerk shitposts. This board is a fucking cesspool.

you can always go back to r/amd

since when is the price announced?

...

Shh can't ask difficult questions, the answer isn't printed on OP's Shill Instructions

Rumor posted on Swedish site nordichardware.se/nyheter/radeon-rx-vega-prislapp-sverige.html

>More expensive than a 1080, whilst being barely-on-par performance-wise.
>Waiting all this time for fucking nothing

Does that offend you?

>Sup Forumsnu
>pro nvidia

Oh boy...

>what is occasional Sup Forums spillover that happens every GPU launch
?

But all the "thermi" and 3.5 GB memes. Sup Forums is the amd board. They hate nvidia and intel with a passion.

And don't forget, all this with a 375 W TDP and a year later than the competition.

Wtf is AMD thinking

they aren't thinking

I'm calling BS.

Considering the results were posted by RTG marketer, it's BS.

375 watt TDP was only on the watercooled Vega FE. I swear you idiots just show up and spout dumb shit all day.

Fire Raja

If RX Vega is a cheaper 1080, I'm still going to buy it. But that's the best-case scenario for Vega at this point, and AMD is going to lose a shit ton of money trying to sell it cheap compared to Nvidia, given HBM2 and a larger die.

>replying to obvious bait
?

are you frustrated

Really?

Here, I'll just copy/paste from the other shitposting thread:

Threadly reminder that anyone who claims to "know" how RX Vega is going to perform is a clueless shitposter since all of Vega's new uarch features remain disabled in Vega FE's gaming drivers.

These include Advanced Clock Gating and Adaptive Voltage and Frequency Scaling; it should also have Primitive Shaders, Tile-Based Rasterization, and the High Bandwidth Cache Controller.

Proof:
gamersnexus.net/guides/2990-vega-frontier-edition-undervolt-benchmarks-improve-performance

GN's article proves that Vega's Advanced Clock Gating isn't working at all, and that Adaptive Voltage & Frequency Scaling are in a failsafe mode where 1.2v is being used regardless of clocks. This is why GN was able to increase the performance of Vega FE by undervolting it to 1.09v and ~280w.
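For anyone wondering why a 0.11 V drop buys that much, it lines up with the textbook dynamic-power relationship, roughly P ∝ f·V². A back-of-the-envelope sketch (the voltages are from GN's article; the 350 W stock figure is a hypothetical stand-in, and the scaling model is a simplification, not AMD's actual power curve):

```python
# Rough dynamic-power estimate: dynamic power scales roughly with f * V^2.
# Voltages (1.2 V stock failsafe vs 1.09 V undervolt) are from the GN article;
# the 350 W stock board power is an assumed placeholder for illustration.

def scaled_power(p_stock, v_stock, v_new, f_stock=1.0, f_new=1.0):
    """Estimate power at a new voltage/frequency, assuming P ∝ f * V^2."""
    return p_stock * (f_new / f_stock) * (v_new / v_stock) ** 2

stock_watts = 350.0  # hypothetical board power at the 1.2 V failsafe
est = scaled_power(stock_watts, 1.2, 1.09)
print(round(est))  # → 289, same ballpark as GN's observed ~280 W
```

Same clocks, lower voltage, roughly 18% less dynamic power, which is why the card stops throttling and gains performance.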

digiworthy.com/2017/06/30/amd-vega-frontier-tile-based-rasterizer/

Here's proof that Tile Based Rasterization is not working on the current Vega gaming drivers.

overclock.net/t/1633406/so-what-went-wrong-with-vega/120

HBCC and Primitive Shaders are also currently disabled in the gaming drivers.

Just so you can at least TRY to educate yourself a little, here is a nice article about what the implementation of TBR meant for Nvidia going from Kepler to Maxwell:

anandtech.com/show/10536/nvidia-maxwell-tile-rasterization-analysis
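For the gist of what tile-based rasterization buys: instead of shading triangles in submission order across the whole framebuffer, the screen is split into small tiles and geometry is binned per tile, so each tile's working set stays in on-chip cache instead of hammering memory bandwidth. A toy sketch of the binning step (screen size, tile size, and triangles are invented for the demo; real hardware binning is far more involved):

```python
# Toy illustration of tile binning: assign each triangle's bounding box to
# every screen tile it overlaps, so a tile can later be shaded with only
# its own geometry resident on-chip. All sizes/coordinates are made up.

TILE = 16  # tile edge in pixels (real GPUs use similarly small tiles)

def bin_triangles(triangles, width, height):
    """Map tile (tx, ty) -> indices of triangles whose bbox overlaps it."""
    bins = {}
    for i, tri in enumerate(triangles):
        xs = [p[0] for p in tri]
        ys = [p[1] for p in tri]
        x0, x1 = max(min(xs), 0), min(max(xs), width - 1)
        y0, y1 = max(min(ys), 0), min(max(ys), height - 1)
        for ty in range(y0 // TILE, y1 // TILE + 1):
            for tx in range(x0 // TILE, x1 // TILE + 1):
                bins.setdefault((tx, ty), []).append(i)
    return bins

tris = [((2, 2), (30, 4), (5, 28)),      # bbox spans the 4 top-left tiles
        ((40, 40), (44, 47), (47, 41))]  # fits entirely inside one tile
print(bin_triangles(tris, 64, 64))
```

Shading then walks tile by tile, which is the bandwidth win Maxwell got over Kepler and what Vega's drivers supposedly haven't enabled yet.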

What does this mean? It means that until we see IF, and how well, RTG's driver team manages to implement these Vega uarch features by the launch at Siggraph, it's actually impossible to know how RX Vega will perform compared to Vega FE in gaming workloads.

Will RTG's driver team fail to implement ANY of these by Siggraph and it flops at launch? Who knows? How well will any of these actually be implemented by RTG? Who knows? There is way too large a gap between the floor and the ceiling here to make confident predictions.

They try so hard.

>anyone who claims to "know" how RX Vega is going to perform is a clueless shitposter

I disagree.

Have a stroke, Jensen.

No matter how good the R300 was, all it showed is that, for one reason or another, people buy Nvidia even when they have an inferior product. See also: 970.

No, R300 sold INCREDIBLY well.
It was Evergreen that sold like shit compared to how good it was.
A shame, but whatever.