Official RX Vega Review Thread

Almost 2 years of hype for this? Honestly, what a disappointment...

oh fuck it.

I will pick up Volta and get over with GPU bullshit.

At least Linux users now have a high performance-ish GPU with good open source support.

Literally the one and only small upside to Vega.

>literally an overclocked fury
fuck you guys weren't memeing

>big massive die
>power hungry
>12+ tflops
>bad game perf
GAMERS BTFO

It also comes free with a 500W home heater

>just wait
Hope you enjoy an 8-month wait.

not him, but if you don't want to wait the obvious choice is still to buy Nvidia, retard.

Meanwhile desperate amdrones are "just waiting" for better drivers as usual

Benchmark of video decoding out yet?
Just want to watch HDR 4K/60fps videos and occasionally game on the side.

Vega 56 looks pretty attractive. BTFOs a 1070 in most cases at lower cost. Hopefully there are AIB partner cards with higher clocks and good coolers that will put it close to the 1080. Vega 64 kinda seems disappointing though, I was hoping for something to at least come close to the 1080 Ti. Honestly though I would really just like playable 4k on high/ultra.

It's not even in stock 10 minutes after launch, lol.

>literally OC'd 1070
predicted this like 7 months ago, wish I could find the post in the archive

>BTFOs a 1070
HAHAHAHAHAHAHAHAHAHAHAHAHAHA

Bought a 1080 last week after seeing the shitshow that is vega.
Luckily I only waited 2 weeks.
Feel bad for the cucks that waited a whole fucking year for this.

So by Q1 2018 drivers will add 15% performance, and maybe improve power draw by 10%.

But that won't matter because then 6 months later Volta will destroy it in every way. Unless AMD can get Navi to release within 3 months of Volta and be 20% better in price or performance they are fucked.

based

The madmen at RTG hard launched RX Vega with primitive shaders still not working, and with DSBR apparently having made it only into the literal test drivers sent to reviewers.

translate.google.ca/translate?sl=auto&tl=en&js=y&prev=_t&hl=en&ie=UTF-8&u=https://www.computerbase.de/2017-08/radeon-rx-vega-64-56-test/&edit-text=&act=url

Not to mention none of the reviewers seem to have asked RTG what the hell is going on with the memory bandwidth on Vega, either.

>primitive shaders still not working, and with DSBR having apparently made it only into the literal test drivers sent to reviewers.
WTF. AMD is this stupid?

Too fucking late now, AyyMD is finished.

And yet they've still sold every single card they sent to retailers. OcUK sold through the 1000 they got in under an hour. Really makes you think (that dumb gaymers aren't as important as they think).

Yes, shill, even Anandtech agrees. At 4k and 1440p Vega 56 wins nearly every benchmark, and many of them by a comfortable margin. The 1070 is now a dead card except for people who are overly concerned with power draw.

Apparently. I can't believe they hard launched in this state. How the fuck do you hard launch with the drivers barely half finished?!

>Nvidia will get away with selling 300mm2 dies for flagship prices for at least the next 5 years now

Just kill me.

More like understaffed.

WTF Raja? God dammit get your shit together.

But it's not Raja, it's the mooks.

At least the 56 is not completely terrible performance-wise, and a reasonable option if you are "AMD or nothing!" at 1080p to 1440p. Provided it is sold at MSRP and without miner markup.

Yeah, but he's the guy whose face is associated with it.

doesn't really matter, it's sold out anyway, who cares. AyyMD knew it would sell out in seconds, everybody knew, so no need for game optimizations and drivers right away; better to work on fixing any mining bugs instead

AyyMD isn't finished, they're just getting started

Rayydeon tech group is on suicide watch though

>Intel slayers
>finished
?
They are fucking murdering Intel across every fucking CPU market.

>murdering
>still 10% behind in IPC and way behind in overall serial performance because their arch is hamstrung by a shitty low power mobile process

>serial performance in current year
?
Perf/watt > niggahurtz.
Or you want another rounds of B I B E L I N E S?
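Since nobody ever does the arithmetic: perf/watt is just a ratio, and a chip can lose the raw benchmark while winning it. Toy sketch with completely made-up numbers, not real benchmarks of any card or CPU:

```python
# Toy perf/watt comparison. All numbers are illustrative, not measured.
def perf_per_watt(score, watts):
    """Higher is better: performance points per watt consumed."""
    return score / watts

# Hypothetical parts: one clocks higher, one sips power.
housefire = perf_per_watt(score=120, watts=300)   # 0.4 pts/W
efficient = perf_per_watt(score=100, watts=150)   # ~0.67 pts/W

# The slower chip wins on perf/watt despite losing the raw benchmark.
assert efficient > housefire
```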

>le everything is easily parallelizable meme

>muh overclocking
>MUH VIDEO GAYMES

The server market is roughly one million times more important than dumb Sup Forumstards, and AMD have a clock speed ADVANTAGE over Intel in that sector, where people don't want to run a 350W housefire. Enjoy jerking off to your Call of Duty: White Genocide benchmarks while AMD make bank through the server and mining markets.

feels good not to be poor :3

>le housefires clocks for good ebin singlethread are more important than perf/watt meme

>AMD GPoos are shit for games
>AMD moar coarz are getting beaten by OC'd 2600k
SAD!

So, would this be worth upgrading to from a 1060?

if you want a house fire yes, otherwise get a 1080ti

A 1080ti is a waste of money though.

I am beyond butt blasted that they pulled a Jensen and locked down the cards. It's like Intel selling the 8700k with an unlocked multiplier but not allowing you to change the voltage from 1.25 or something.
Vega 56 would have been the most awesome card to throw high-end cooling on, like an open loop or a 280/360mm AIO, and crank it up.
Now, with GPU Boost 3.0 acting against the average overclocker's best interest, Pascal's voltage and power limits locked down by way of an encrypted BIOS, and AMD following suit and locking down Vega, GPU overclocking is dead.

>It's like Intel selling the 8700k with an unlocked multiplier but not allowing you to change the voltage from 1.25
WHAT?

>implying Vega 56 won't be unavailable for months on end thanks to crypto miners and won't fade into complete irrelevancy by the time Volta releases
Being an AMDrone is suffering.

It's a rhetorical example. You can "overclock" Vega and Pascal, but you can't increase the voltage or power limits (past 300W on Vega 64, or without the shunt mod on Pascal, but that doesn't matter because it needs volts too). The closest thing to this would be a CPU with an unlocked multiplier but locked voltage.

Ah, that.
Who cares, GPU OCing was always a fucking meme.
Also they have to segment the shit somehow.

> Pienempi tulos parempi (sekuntia)
> Smaller is better (seconds)
> Hurr durr it's worse than nvidia look at this graph gtx has bigger numbers lol

WHERES MY PRIM SHADERS YOU RTG POONIGGERS
REEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEE

By the time you can actually buy the card at MSRP, it'll be enabled.

I care. A shitload. Outside of performance bracket, it is by far the most important consideration when buying a card for me. I was not aware of the motherfuckery and anti-consumer practices with Pascal when I purchased my 1060, so I sold it and bought a 980 Ti. I've been happily tweaking it, making my own BIOS, and benchmarking ever since. It runs as fast as a stock 1080 now.
I was very excited for Vega because it would be the first high-end card in a while that you actually own and can do what you wish with, but that is not the case.
They don't do it with CPUs. Even Intel, lords of segmentation, let you run your CPU as fast as your cooling allows at any voltage you wish. Same with Ryzen: put in whatever voltage you want, doesn't matter. But GPUs? Oh no, no fun allowed there. What if John Doe bought a 1070 for a monitor we could have sold him a 1080 for???
In the AMD ad that said "poor Volta", it was actually "poor voltage" with the "ge" crossed out to spell Volta. I thought it was AMD attacking Nvidia for locking down their GPUs. It was an angle they could have taken to make their hardware more appealing.
I am so fucking revolted. I had been looking forward to getting my Vega for a while now, yet it's now useless.

Vega hits 14nm LPP limits pretty fast anyway.
OCing is a meme.
REEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEE
ENABLE IT

I don't speak baguette, sorry.

>By the time you can actually buy the card at MSRP,
You mean somewhere around the time Navi will hit shelves?

Are we expecting a decent performance increase as the drivers mature, like Ryzen?

Implying you won't be on a waiting list while the Vega 56 is stroking off some coin miner in the back room.

Depends on how prim shaders would impact performance.
But yes, everything, especially DSBR, needs tuning.

Yes, there's still features they didn't even enable yet.

It's really not. My 6600k is shit without its OC, and my 980 Ti is slower than a 1070 without it. With heavy overclocks (4.8GHz and 1500MHz) I effectively have near-7700k/1080 performance.
That aside, it's FUN, and I spend more time tinkering and benchmarking than playing games.

>OCing is a meme.

Newfag detected.

So the 56 is comfortably above the 1070 and the 64 is about 10% better than the 1080 @ 4k. Do you think we'll see the 56 beat the 1080 and the 64 match the 1080 Ti?

No one knows.

It doesn't matter how much you overclock the i5 - it'll never come close to a 7700K when a program wants more than four threads to work with. In fact, it won't even touch a 2600K.
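The thread-scaling point above is just Amdahl's law: how much N cores help depends entirely on what fraction of the work parallelizes. Quick sketch; the parallel fractions are illustrative, not measured from any program:

```python
# Amdahl's law: speedup from n cores when only a fraction p of the
# work can run in parallel. The p values below are illustrative.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# A mostly-parallel workload keeps gaining past 4 cores...
assert amdahl_speedup(0.95, 8) > amdahl_speedup(0.95, 4)

# ...while a mostly-serial one barely cares how many cores you add,
# which is why clocks and IPC still matter for it.
assert amdahl_speedup(0.20, 8) - amdahl_speedup(0.20, 4) < 0.1
```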

Certainly seems to be the case in that game. I saw massive uplift in WoW though.

GIVE ME PRIM SHADERS REEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEEE

I'm guessing 10-15% improvement from current numbers.

Did you even order one? Goddamn.

I was a bit too late.
Now i wait™.

what sort of performance can we expect from Vega Nano? Assuming nothing changes.

I don't either. It took exactly 10 seconds in google translate.
Laziness is not an excuse for believing everything you're told.

what would the upper limit of non-reference memory clocks be?

So what's the point of buying the Vega 64, when I could spend $50 more and get a 1080?

probably around 1200-1250.

1070 performance

Even the lowest Vega beats the 1070, albeit at a higher power draw.

I wonder how much this would affect performance for Vega 56.

how is this pic implying the primitive shaders aren't working?

JUST WAIT LMAO

...

Mining

Wait for Vega...wait for drivers...wait for Navi....

...

You can't buy it anyway because it's out of stock 10 minutes after launch.

Look at the link, dumbass.

...

I probably shouldn't have used that image but alright I understand. I was planning for my next build to be Ryzen and Vega but nevermind.

...

Truth is, if you actually built an AMD system you'd know it's better.

how much performance are HBCC and primitive shaders going to bring?

HBCC? Barely any; it's an enterprise feature.
Prim shaders?
A hefty chunk.

>500W
>1070 performance
AMDrones are still in shock and dismay but give them a day and Sup Forums will be turned again into designated indian street.

So is 56 shit? Are miners taking everything regardless?

I thought there was a coin crash or something in July

raja plz, stop smoking cow dung posting about your magic carpet driver updates.

people here are discussing that Vega is bandwidth starved. isn't HBCC exactly the thing that fixes this? and what's a primitive shader doing? does it convert the regular shading? I don't understand how you can get rid of the vertex and geometry shaders

AMD cards are absolutely beasts at compute like always

HBCC lets the GPU work with datasets larger than local memory.
It doesn't help with bandwidth starvation.
And the prim shader REPLACES the vertex and geometry stages.
But its inner workings are still a mystery.
Well, besides position and attribute shading being separate in prim shaders.
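Conceptually, HBCC just treats local HBM2 as a page cache over a bigger backing pool (system RAM/NVMe). A toy LRU sketch of that idea, purely conceptual and nothing like the real hardware:

```python
from collections import OrderedDict

# Toy model of the HBCC idea: local memory as an LRU page cache over a
# larger backing pool. Purely conceptual; the real hardware differs.
class PageCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.pages = OrderedDict()  # page_id -> resident flag

    def touch(self, page_id):
        """Access a page; returns True on a hit, False on a fault."""
        if page_id in self.pages:
            self.pages.move_to_end(page_id)
            return True
        if len(self.pages) >= self.capacity:
            self.pages.popitem(last=False)  # evict least-recently-used
        self.pages[page_id] = True
        return False

cache = PageCache(capacity=2)
assert cache.touch("a") is False  # fault: first touch
assert cache.touch("a") is True   # hit: now resident
cache.touch("b")
cache.touch("c")                  # evicts "a"
assert cache.touch("a") is False  # fault again
```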

AMD should stop marketing their cards to the gaming area.

It's clearly not designed for it.

They should pander to miners and people who need raw computing power.

Actually enabling core uarch features detailed in the white paper does not count as "magic drivers"; it counts as actually getting the drivers into the condition they should have been in when the card launched, if RTG's driver team wasn't literally three fucking dudes.

Here, straight from the horse's mouth:

radeon.com/_downloads/vega-whitepaper-11.6.17.pdf

"In a typical scene, around half of the geometry will be discarded through various techniques such as frustum culling, back-face culling, and small-primitive culling. The faster these primitives are discarded, the faster the GPU can start rendering the visible geometry. Furthermore, traditional geometry pipelines discard primitives after vertex processing is completed, which can waste computing resources and create bottlenecks when storing a large batch of unnecessary attributes. Primitive shaders enable early culling to save those resources. The “Vega” 10 GPU includes four geometry engines which would normally be limited to a maximum throughput of four primitives per clock, but this limit increases to more than 17 primitives per clock when primitive shaders are employed."
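As a rough illustration of one of the culling techniques the whitepaper names, back-face culling is just a sign test on a triangle's screen-space winding. Generic sketch, not RTG's implementation:

```python
# Generic back-face culling sketch: a screen-space triangle whose
# signed area is <= 0 faces away (assuming counter-clockwise front
# faces) and can be discarded before any attribute work is done.
def signed_area(a, b, c):
    return 0.5 * ((b[0] - a[0]) * (c[1] - a[1])
                  - (c[0] - a[0]) * (b[1] - a[1]))

def is_front_facing(a, b, c):
    return signed_area(a, b, c) > 0.0

# CCW triangle: kept and rendered.
assert is_front_facing((0, 0), (1, 0), (0, 1))
# Same triangle wound CW (back-facing): culled early.
assert not is_front_facing((0, 0), (0, 1), (1, 0))
```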

They fucking hard launched the card with this disabled, the fucking madmen.

It's not just compute. Vega 10 is pretty beastly at pure geometry workloads when it isn't bandwidth bottlenecked.

Vega is literally pushing pixels & large datasets: the uarch.
Volta is pure compute.

so we are literally waiting for magic drivers, wtf amd

Time to sell my 1070 and get a Vega 56 if they ever become available