How did AMD succeed with Ryzen/Threadripper while monumentally fucking up Vega?

Two different divisions.

How did you brilliantly succeed with Sup Forums shitposting but can't run a profitable blog?

Priorities.

Vega ain't bad considering it's priced lower

>Put a white man (Keller) in charge
>Zen is a success
>Put a street shitter in charge
>Constant failures

Try harder please

RTG is Raja's designated shitting division.

How is Vega a "monumental" fuckup? It performs on par with a 1080, has fantastic build quality, HBM stacks that will last forever, excellent overclocking room, and even better compute performance.

...

Haha
Very well, AMD shill
Here's your 2 rupees

this

This right here my dude. You've got the main AMD division which handles CPUs and embedded stuff then you've got the "Radeon Technologies Group" which is the GPU division of AMD.

CPU side is doing good shit right now with Ryzen and Threadripper. GPU side is shitting itself in the enthusiast graphics card market right now with DOA Vega 64 and the barely worth noting Vega 56.

Can Sup Forums leave already? It's a compute monster. What's not to love?

>>>t_d

Bad cost-efficiency. Low yields, high-cost components, and an architectural design that sounds great on paper but will probably require further iterations and a die shrink or two to a high-yield process before production cost per unit is low enough to sell at a more aggressive price point.

This.
Power companies love it too!

Both Vega and Ryzen were built for data centers.
Turns out that data center CPUs work well as normal consumer CPUs, but data center GPUs don't work that well as consumer gaymen GPUs.

Gaymers BTFO
>>>/pcbg/

>monumentally fucking up Vega
Fucking up for who? The damn thing is sold out everywhere.

Because only like 100 were listed up for sale. Most of Europe still hasn't gotten a single card.

We can't; the new NPPs aren't finished yet.

Do we think the GPU and CPU software teams are separate?

Obviously a team of software engineers are frantically working on unlocking the fine wine in Vega. I'm just wondering if that same team would be the ones working on the NPT bug that has plagued AMD processors all this time.

As soon as it's fixed I'm buying Ryzen.

Got mine in my tiny European shithole for launch price :3

hmmm

The NPT bug is fixed, friend. Though you'll need to apply the kernel patch yourself on most distros.
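If you want to sanity-check it on your own box instead of taking my word for it, here's a minimal sketch, assuming a Linux host with the kvm_amd module loaded and module parameters exposed under /sys/module like on typical distros (the script itself is just my own illustration, not any official tool):

[code]
# Minimal sketch: check whether KVM on an AMD host has NPT enabled.
# Assumption: kvm_amd is loaded and exposes its "npt" parameter via sysfs,
# as it does on typical Linux setups. Not an official AMD/kernel tool.
from pathlib import Path

NPT_PARAM = Path("/sys/module/kvm_amd/parameters/npt")

def npt_state():
    """Return True/False for the npt parameter, or None if kvm_amd isn't loaded."""
    if not NPT_PARAM.exists():
        return None
    value = NPT_PARAM.read_text().strip()
    # Older kernels report the int form "1"/"0", newer ones the bool form "Y"/"N".
    return value in ("1", "Y", "y")

if __name__ == "__main__":
    state = npt_state()
    if state is None:
        print("kvm_amd not loaded; nothing to check.")
    elif state:
        print("NPT enabled; with a patched kernel, GPU passthrough perf should be fine.")
    else:
        print("NPT disabled; that's the old workaround that tanks guest CPU performance.")
[/code]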

I'll take the b8.

Vega 64 is $600-$700.
A GTX 1080 is around $550.

With Vega 64, you'll get
>High power usage (~400W)
>High temps
>Thermal throttling because of said temps
>Disappointing gaymen performance for the price

A GTX 1080 will get you
>Reasonable power usage (~200W)
>Minimal to no thermal throttling depending on cooler
>Decent geymon performance for the price
>Anywhere from $50-$150 still in your wallet


Choice seems obvious to me ¯\_(ツ)_/¯

>inb4 go back to Sup Forums
>inb4 nvidia shill
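To put the power argument in perspective, here's a back-of-the-envelope sketch using the ~400W and ~200W figures above; the electricity rate and hours of gaming per week are purely my own assumptions, so plug in your own numbers:

[code]
# Back-of-the-envelope yearly electricity cost difference between the cards.
# The 400 W / 200 W draw figures come from the comparison above; the
# electricity rate and weekly gaming hours are illustrative assumptions.
VEGA64_WATTS = 400
GTX1080_WATTS = 200
HOURS_PER_WEEK = 20        # assumed gaming time
USD_PER_KWH = 0.12         # assumed electricity price

def yearly_cost(watts):
    kwh_per_year = watts / 1000 * HOURS_PER_WEEK * 52
    return kwh_per_year * USD_PER_KWH

delta = yearly_cost(VEGA64_WATTS) - yearly_cost(GTX1080_WATTS)
print("Vega 64:  $%.2f/year" % yearly_cost(VEGA64_WATTS))
print("GTX 1080: $%.2f/year" % yearly_cost(GTX1080_WATTS))
print("Difference: ~$%.2f/year" % delta)   # roughly $25/year under these assumptions
[/code]

So the extra power draw is real, but under those assumptions it's tens of dollars a year, not the gap between the cards' price tags.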

Cry baby bitches like you are funny. I buy a graphics renderer to render my fucking graphics.

>AMD RX VEGA 64™
>$499
>1 gallon of petrol and a box of matches
>$3
Not a good investment desu.

AGESA 1.0.0.6 fixes this.

Maybe if you're a gamer manchild. It's easily the best option for my compute workloads.

Sold out because there were only like 16,000 in stock around the world lmao. Miners probably bought up 1/3 of them.

Raja Koduri is a snake oil salesman. And Jim Keller is a certified shit wrecker.

Seriously speaking, Vega was in development for a long time, during a period when AMD didn't have the resources.

The card itself is a beast, no question about it. They finally have a better triangle renderer than Nvidia.
BUT
Given the lack of money, what I think basically went down is that they decided to put all of their time into developing a good pro driver (which we already saw shits on the P6000) and left the gaming driver for later.
Almost every new feature the card has is not enabled. Even the primitive shaders aren't working as intended. In its current state it's literally just a shrunk Fiji.

Go buy a fucking FE Vega then, you dipshit. FE Vega is literally aimed towards rendering and neural network fags like yourself. It's marketed for that shit and it performs well in that shit.
RX Vega is marketed for gaymen. It doesn't perform so well in gayms. Therefore, it's trash.

Why the fuck would I spend twice as much for identical OpenCL performance? Are you retarded?

(checked)
kek

AMD doesn't gimp their consumer cards for most compute tasks. Buying a pro card is silly.
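For anyone wondering what "compute workloads" even means here, a minimal pyopencl sketch is below. The SAXPY kernel is just my own toy example and assumes you have pyopencl, numpy and a working OpenCL runtime installed; the point is that the exact same code path runs on RX Vega and FE Vega because it's the same silicon:

[code]
# Minimal pyopencl SAXPY sketch: y = a*x + y on whatever OpenCL device is found.
# Purely illustrative; assumes pyopencl + numpy are installed and an OpenCL
# runtime (e.g. AMD's driver) is present. Consumer and pro Vega expose the
# same compute capability to this code, which is the point being made above.
import numpy as np
import pyopencl as cl

KERNEL_SRC = """
__kernel void saxpy(const float a,
                    __global const float *x,
                    __global float *y)
{
    int i = get_global_id(0);
    y[i] = a * x[i] + y[i];
}
"""

ctx = cl.create_some_context()          # picks an available OpenCL device
queue = cl.CommandQueue(ctx)
prog = cl.Program(ctx, KERNEL_SRC).build()

n = 1 << 20
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)
expected = 2.0 * x + y

mf = cl.mem_flags
x_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=x)
y_buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=y)

prog.saxpy(queue, (n,), None, np.float32(2.0), x_buf, y_buf)
cl.enqueue_copy(queue, y, y_buf)
queue.finish()

print("max error:", float(np.max(np.abs(y - expected))))
[/code]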

>render my fucking graphics
BLACKED 3d renders?

Because unlike Intel, Nvidia was actually doing stuff for the last 6 years.

>RX Vega 64: $600-$700
>FE Vega: $1,000
Check your math, buddy.

Can someone explain to a casual what went wrong with Vega? The hype was really high before it came out, so what exactly happened?

Gaymers BTFO
>>>/pcbg/

Like what? There hasn't been a major hardware change in Nvidia's lineup since Maxwell came in.
All of their lines so far are just die shrinks and microcode updates.
The good stuff will come once FP16 comes into play and Nvidia is forced to enable all the CUDA cores on the gaming cards, skyrocketing their TDP.

In short: RX Vega is a workstation card with shitty drivers.

Basically, they were hyper-focused on creating a strong workstation card. They ended up doing that and released it as FE Vega. RX Vega is literally FE Vega, but gimped with shitty drivers that don't even use the card's full potential in games.

Not to be that user, but the dude said he only paid ~$500 for it. That sounds like it'd be half of $1000. Obviously that doesn't hold true anymore.

the "gaming" drivers have nothing of the new tech enabled what so ever
on its current state its using fury x drivers since its almost identical

AMD lies about pricing, unfinished drivers

AMD lied pajeets died

Nah dude. $499 is the old MSRP for Vega 64.

Bought mine for 500. Not sure what you're talking about.

maxwell was massive though

Link? Citation?

No, the NPT bug is still present.

If he bought it at 500, he still bought it at 500, and the FE would've cost him twice as much, you retard.

The gaming driver and the pro driver are one and the same. Just a few features enabled/disabled underneath.

It's not a totally different driver.

Sounded like you hadn't bought one yet. Lucky you for getting at MSRP.

The problems stack up several tiers high:

Miners
Cost of production
Shit marketing department
Product performance is disappointing, it's over a year late, and it uses too much power
Hype train, then disappointment

It is, because you don't need HBCC as much on a workstation, nor the culling tech.
Don't forget this is the first time AMD has awesome culling and triangle rendering performance (they stated it was 11, but the whitepaper they released says they went up to 17...).

>Zen = Jim Keller = White
>Vega = Rajeesh = POO

Whatever happened to the people screaming that the FE wasn't for gaming and RX Vega and gaming drivers would fix everything?

Now they don't care about gaming, or they're waiting for instructions from redteam+.

Like where? I've been spamming that F5 key since launch a few days ago and I've yet to see a single Vega card below 650€.
I guess I'm going to skip yet another GPU generation. That old R9 290 is still more than fine for 1080p.

so you are implying that the card itself is shit and not the drivers?

>sold out worldwide
>Vega a failure

Get a job and learn the meaning of success.

There were literally 16,000 shipped out worldwide.

Not to mention that it's literally shit for the price.

I have no idea why it has disappointing gaming performance, probably shit drivers given that it's AMD. It's just funny that so many people here and on Sup Forums said you can't judge Vega gaming performance with Vega FE since it's "not a gaming card." Whenever someone pointed out that Titans and Quadros get roughly the same performance as their GeForce equivalents in games, they just screamed "This is d-different!"

The hardware is essentially the same. So your low level driver is going to be the same. The higher level stuff is going to be programmed in the same way. Maybe a module that is focused on gaming will be disabled in the pro driver and vice versa.

But we aren't talking chalk and cheese here. It's essentially the same driver.

It's the Nintendo strategy of bragging about selling out of a low stock.

16,000...
LOL, this is just sad. You don't even know what you're talking about.

Just some local store. I figured I wouldn't get one for weeks. Just wanted to see if they even had listings yet, and they had 2 in stock at 500. I almost cried when I fucked up my PayPal login the first time and figured someone else would snatch it.

Ryzen: Designed mainly by white people in America.

Vega: Designed mainly by Poo and Chinese

On the extreme off-chance you want a serious answer, it's because RTG is way, way understaffed for driver development, and Vega FE and RX Vega hit hard launch with hilariously unfinished drivers.

Vega, being GCN 5, carries over Fiji's fundamental limitation of only being able to process 4 triangles per clock versus the 6 triangles per clock that Pascal can handle at the front end of the rendering pipeline. This limitation is not much of an issue in professional rendering and professional compute applications, where Vega already performs very well, but for gaming specifically this was the key bottleneck that prevented Fiji from ever being able to really stretch its legs in games, as it would be heavily front end bottlenecked.

Vega was designed to include primitive shaders and primitive culling to allow processing way more than four triangles per clock (up to 17 they assert in the whitepaper), and thus to be able to keep the rest of the rendering pipeline saturated at all times unlike Fiji.

However, their driver team is so understaffed that they hit hard launch with primitive shaders still not working, and so Vega is still bottlenecking on triangle throughput in games. This can be empirically demonstrated in that Vega 56 at the same core and memory clock will perform exactly as fast as an air-cooled Vega 64 in games.

It's this front end bottleneck that explains why Vega FE can trade blows with full GP102 in pro rendering applications like Creo, Solidworks and 3ds Max while struggling to match GP104 in games. It remains to be seen whether RTG's driver devs can actually get it working any time soon, but if they actually do really succeed in substantially increasing Vega's triangle/clock rate then Vega will see big gains in gaming.
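To put rough numbers on that front-end argument, here's a quick sketch using the per-clock figures from the post above (4 for Vega today, 6 for Pascal, up to 17 claimed for primitive shaders); the clock speeds are my own ballpark assumptions, not anything from AMD or this post:

[code]
# Rough peak front-end triangle throughput from the per-clock figures above.
# Triangles-per-clock values are from the post; clock speeds are ballpark
# assumptions for an air-cooled Vega 64 and a GTX 1080, for illustration only.
def tri_rate(tris_per_clock, clock_ghz):
    """Peak front-end rate in billions of triangles per second."""
    return tris_per_clock * clock_ghz

cases = [
    ("Vega 64, current drivers   (~4 tri/clk @ ~1.5 GHz)", 4, 1.5),
    ("GTX 1080, Pascal           (~6 tri/clk @ ~1.7 GHz)", 6, 1.7),
    ("Vega 64, primitive shaders (~17 tri/clk @ ~1.5 GHz)", 17, 1.5),
]

for name, tpc, ghz in cases:
    print("%s: ~%.1f Gtris/s peak" % (name, tri_rate(tpc, ghz)))
[/code]

Under those assumptions the front end alone explains the gaming shortfall, and on paper the gap disappears if the primitive shader path ever actually ships.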

>It's a compute monster
It isn't just a compute monster. Its geometry performance trades blows with a P6000 on half-finished, non-certified drivers in pro rendering applications.

The only area in which Vega is underperforming right now is gaming.

asus.com/us/Graphics-Cards/ROG-STRIX-RXVEGA64-O8G-GAMING/
12+1 phase power

wccftech.com/amd-radeon-rx-vega-limited-quantities-launch/

And yet OcUK said they expect way more than 90,000, both AIB and stock, in less than a month...
hmmm
WCCFTech being WCCFTech, as always.

They dropped the ball on gaming since they realized that it is a niche market compared to servers and the data meme market. Vega excels at compute-heavy applications while it sucks for gaming.

because AMD makes CPUs and ATI makes GPUs

2006 was a good year :(

Vega was developed by Pajeets and Chinks.

Ryzen was developed by White engineers.

Go figure out the rest, OP!

Literally came here to post this

edit: rip my inbox :)

No one posted this yet

Because instead of focusing on brute force and GDDR5X, they gambled on HBM2 and it failed. And then, because they thought people might bitch about only having 8 gigs of VRAM, they created HBC, which further decreases the overall brute force of the GPU.

They could have just made Vega run on GDDR5X, had it release a year earlier, and been able to mix and match how much VRAM they wanted on the card. And we wouldn't have HBC taking 15% of the die, which might have allowed more CUs on the card.

Imagine that: we could actually have had a competitor to the 1080 Ti a year ago, or at least a competitor to the 1080, with a competitor to the 1080 Ti released later.

And it would have been much cheaper, or at least helped AMD make more money.

I believe this to be the final nail in the coffin for HBM on consumer cards, now that GDDR6 is a few months away.

HBM doesn't take up die space, idiot. They're placed on an interposer together, not on the same silicon wafer.

HBM is staying because it's ridiculously power and space efficient compared to GDDR.
The only reason Vega cards aren't less than half the size is because of the cooler.

>This limitation is not much of an issue in professional rendering and professional compute applications, where Vega
I hope this is a joke. A complex CGI scene could have tens of millions of polys, maybe hundreds of millions, because those CGI monkeys don't give a shit about efficient resource usage when designing assets. A complex game will have maybe a few million after tessellation.

Budget

Lisa made it clear Zen is their focus.

The HBM stack actually runs extremely cool, and the card itself at load never goes much above 80C. What the fuck are you talking about with "high temps"? Typical gaymen shitposter that belongs on Sup Forums.

The HBM stack is also power efficient on Vega, UNLIKE on Nvidia cards where the GDDR5 runs quite hot and is power hungry in itself, creating another failpoint. The wattage differences are also NOT 200W; in real-world instances the cards won't be much more than ~75 watts apart with both overclocked reasonably, and if you live in the first world power is not expensive, so who cares. Again, both are viable options, and like I said, the PCB on the 56 and 64 Vega is actually the same as the $1,500 variant and extremely solid.

Would've come out even worse because of AMD's godawful GDDR5X controllers. Supposedly the memory controller alone on the RX 480 consumed ~40W of power, and going with GDDR5/GDDR5X on Vega would've meant ~80W of power literally just going to memory, for what's already a card that'll easily do 400W if you let it.
(Info is all from Buildzoid's Vega rant)

who AMD+nVidia here?

>Vega ain't bad considering it's priced lower
But it isn't. That lower price was only for the first limited batch that went on sale on release day because AMD gave vendors $100 vouchers to artificially lower the price. All the other ones are now more expensive than their Nvidia counterparts. Vega is a fucking joke.

We won't be able to say definitively where Vega sits until the drivers are sorted out in a few months. Launch reviews all used drivers that had features still missing.
At present it looks like Vega 56 at least is competitive with the GTX 1070 and nearly the 1080. It does appear heavily memory bottlenecked, and there seems to be a problem with effective memory bandwidth altogether. That might get fixed with firmware, but at present a small memory OC can get a Vega 56 at stock clocks equal to a reference GTX 1080. That isn't stellar, but it's not a total trainwreck.

Vega 64, however, at least right now, is 25% to 32% slower than the GTX 1080 Ti on average, while still drawing more power.
But when it comes to compute and pro use it seems Vega is pretty good.

There's a lot of inconsistency when looking at performance, most of it seemingly memory bandwidth related, so we'll have to see how long it takes AMD to unfuck this with updates. Even a 10% uplift would at least make Vega 64 somewhat less embarrassing, and it would make the 56 very impressive.

Testing of Vega FE in Creo, Solidworks and 3ds Max shows it trading blows with a P6000 in those applications whereas the exact same GPU struggles against a 1080 in games.

From what I've been led to believe, this difference results primarily from Vega's front end bottlenecking much more on gaming workloads than on those pro 3D rendering workloads.

videocardz.com/72123/amd-issues-official-statement-regarding-radeon-rx-vega-64-pricing
>Our initial launch quantities included standalone Radeon RX Vega 64 at SEP of $499 [...] We are working with our partners to restock all SKUs of Radeon RX Vega 64 including the standalone cards
>standalone Radeon RX Vega 64 at SEP of $499
>SEP of $499
>$499
MSRP hasn't changed, stop lying.

Didn't Nvidia gain almost 50% on those with the new driver?

Nobody is saying HBM is bad. On paper it's a fantastic idea, but why have it if the card is still a normal-length card and in the end it delays the product by a year? It doesn't give any benefits to the normal consumer.

AMD doesn't have GDDR5X controllers, so you don't know that for sure. Even if it was a shitty controller, they could have released it a year ago, and it still would have been better.

That was the Titan Xp, not the P6000 which is their $6,000 pro card that has certified drivers.

>Vega releases over a year later than the 1080 FE, is more expensive, and uses way more power
>performs on par with or worse than the 1080 in DX11, marginally better in DX12
>turning on AA makes the card shit itself
>crossfire doesn't work
>consumer drivers still don't allow for proper voltage/power control
>some advertised features aren't enabled in the drivers yet
Did I mention it's over a year after Pascal's debut? Now you tell me, how do you consider Vega to be good?

AMD needs to hurry up with the Zen APUs

Ryzen and Nvidia?