Nvidia is Done

>AMD told attendees that the AMD and NVIDIA systems had a $300 difference in AMD's favor

guru3d.com/news-story/amd-rx-vega-shown-against-gtx-1080-at-budapest-event.html

Nvidiot BTFO, #redteam win

For 1080-level performance at a 300 W TDP they sure as fuck better price it around $300, or no one will bother.

In the article they say it had a slight advantage of maybe 5 frames, and you're saying Nvidia is done?

That $300 is for the whole system, including the FreeSync monitor, which doesn't carry the G-Sync tax. So the GPU itself is at most $100 cheaper.

Actually, them having to resort to the premium attached to G-Sync (which is actually superior to FreeSync in some regards) as a selling point speaks volumes about their card, and not in a favorable way.

Vega is late to the party and the performance we have seen so far has not been groundbreaking. The power efficiency isn't looking too good either. They can only compete on price, and even that might prove difficult considering the die size and the cost of HBM2 memory.

Love the BTFO rhetoric btw. Nice start for a thread.

Mike, Vega is DOA; the difference is the G-Sync tax. If it had been faster they would have shown real FPS numbers.

...oh and say it with me: 1070 level performance

Everyone has said it's 1080 performance. Let the meme die. It was never funny.

(No, it's not 1070 performance in every other game but the ones AMD showed us)

Got a source for your claim? I think the OP already had enough trolling for one thread.

AdoredTV pretty much confirmed the system that was running better was the Vega one... they could still have used DX12 on both, giving Vega an advantage, but then they would have come right out and said which system is which and shown FPS. AdoredTV also goes on to say G-Sync monitors just cost more, and that's where the $300 gap comes from.

youtube.com/watch?v=HyyOP3ZI6qc

What do anons think the chance is that this isn't the top-end Vega GPU?

TBP =/= TDP =/= actual power draw or heat

it's a meme guys...

An overclocked 1070 should beat Vega easily.

TBP =/= TDP ~= power draw or heat

FTFY

Slim, at least on a reasonable timescale. I don't see why AMD would pull any punches this close to launch, especially considering the image Vega has right now.

So do any of you shitposters want to actually try and give a technical explanation of how it's possible for Vega FE to perform like a $5,000 Quadro P6000 in pro workloads while also performing like a 1080 in games?

Those two things should not both be true at the same time, and any attempt to read the tea leaves on RX Vega that doesn't address this contradiction is just low-effort shitposting.

>So do any of you shitposters want to actually try and give a technical explanation of how it's possible for Vega FE to perform like a $5,000 Quadro P6000 in pro workloads while also performing like a 1080 in games?
The Quadros are the same chips as GeForce cards. Certain functions are disabled in the GeForce (and normally Radeon) cards to facilitate selling the WS versions at higher prices.

Right, so how can a Vega FE perform like pic related compared to a P6000 while simultaneously being 1080 level in games, whereas a P6000 utterly destroys a 1080 in games?

Dude, it depends on the benchmark...

Because it's not.

Are professional and gaming workloads really so similar? Honest question, I am by no means an expert on the subject.

Perhaps they went for a microarchitecture that just lends itself better to professional use cases. Of course it could also be that the drivers just can't extract the raw power the card has when it comes to games, for one reason or another.

Across the preponderance of pro workloads it's at least tied with the P6000.

The FreeSync monitor that was used costs $799
The G-Sync monitor that was used costs $1299
>$500 difference just for the monitor alone
It's not the first time the AMD marketing team has used dodgy ways of promoting a product. When they focus on "but it's cheaper" rather than "it performs better", there's good reason to be sceptical, especially since they have failed to provide any real-world performance benchmarks.

>It's an overclocker's dream!
It's shit at overclocking.
>Make. Some. Noise.
Then nothing but silence for 6+ months
>Don't miss Capsaicin! We're revealing some spicy info about Vega! You'll love it!
They gave away a couple of free VEGA t-shirts, and then told us that Vega is called RX Vega, which turned out to be a lie with the release of Vega FE.

You don't seem to understand the question, maybe this will help:

How can Vega FE perform ~95-100% as well as a $5,000 Quadro P6000 in pro workloads, but perform like a 1080 in games when a P6000 utterly destroys a 1080 in games?

Do you see in pic related how much faster the P6000 is than the 1080? It poses a logical contradiction. It should not be possible for Vega FE to be as fast as a P6000 in pro workloads while being this much slower than a P6000 in gaming workloads.

So anyone who wants to draw a conclusion on how RX Vega will perform has to explain this contradiction.

...

>2D benchmark
>relevant

Maybe because the drivers are shit, and also they need to reduce the TDP for the RX version?

Let's post some babes

I mean, testers were able to significantly undervolt it at stock clocks, which is a bit odd, especially since it's supposed to have power gating. Kind of suspicious, no?

Promotional models are the worst. I don't even know why, but for some reason I loathe them and the whole idea of promotional modeling.

AMD has always done better in compute workloads. That's why people buy them for mining.

(Roxanne lyrics omitted)

We'll know for sure eventually, next month. Everything now is speculation on limited data.

youtube.com/watch?v=VI4ssGtfdxw

Who the fuck cares about price when they can barely match a year-old GTX 1080 performance-wise?

Who cares I just want a GPU that isn't nvidiaids

Because Vega is a piece of shit for gayming. Simple!

At $300 I'd say it's alright. Not good, but not dead in the water either. If they try to sell that POS at something like $500 they'd deserve all the memes. No one would want a Chernobyl recreation in their tower for the performance and the price of a 1080.

>who the fuck cares if 480 is 970 for $200
>who the fuck cares if 1060 is 390 for $200

yeah, give me higher prices, I want to buy a 1060-class GPU for $600, it's midrange dude!

this TDP-for-GPUs meme has gotten out of hand

two simple things: CPU TDP matters because the power has to go through the socket and you have to buy the cooler yourself

GPU TDP doesn't matter because the cooler is attached and already sized for the card, and you power it straight from the PSU.

it costs less than 3% more to run a 300 W card vs a 250 W card.
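Rough numbers, if anyone cares; the 4 h of load per day and $0.12/kWh below are my own assumptions, not from the thread:

# sketch: yearly electricity cost of a 300 W card vs a 250 W card
# assumed: 4 hours of gaming load per day, $0.12 per kWh
hours_per_year = 4 * 365
price_per_kwh = 0.12
for watts in (250, 300):
    kwh = watts / 1000 * hours_per_year
    print(f"{watts} W: {kwh:.0f} kWh/year, ${kwh * price_per_kwh:.2f}/year")
# 250 W -> ~365 kWh ($43.80), 300 W -> ~438 kWh ($52.56), under $9/year difference at these assumptions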

You're completely ignoring the fact that the cooling solutions on hotter cards are noisier.
youtube.com/watch?v=QQhqOKKAq7o

Simple. More raw compute, but shitty front end. All AMD GPUs are like this.

The image you quoted says the exact opposite, Creo being geometry bound

First of all, Vega only has like 5% more ALUs than the P6000, and way fewer ROPs.
So there's no large theoretical TFLOPS advantage for AMD like there was before.
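For reference, peak FP32 is just 2 FLOPs (one FMA) per ALU per clock, so the claim is easy to sanity-check; the boost clocks below are approximate figures I'm assuming:

def peak_tflops(alus, clock_ghz):
    # peak FP32 = 2 FLOPs (one FMA) per ALU per clock cycle
    return 2 * alus * clock_ghz / 1000

print(peak_tflops(4096, 1.60))    # Vega FE: 4096 ALUs @ ~1.6 GHz boost   -> ~13.1 TFLOPS
print(peak_tflops(3840, 1.645))   # Quadro P6000: 3840 ALUs @ ~1.645 GHz  -> ~12.6 TFLOPS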

DELET_ THIS

>having to bring in monitors to cause a price difference

Top fucking kek, this doesn't bode well for AMD

The price difference is in the monitor you stupid fuck, not the gpu.

G-Sync monitors are much more expensive than FreeSync monitors.

...

Which translates to a $300 win for Nvidia in Europe, the Middle East and East Asia, because of AMD's inability to make distributors sell it for its actual price.

In which third-world shithole do you live where there's a whopping 300-buck difference? And is there even any monitor that comes in both a FreeSync and a G-Sync version? I can't find any in Sweden (the specs are the same but the design and name are different, and even then, the price difference is like 150 bucks).

The G-Sync monitor used in the comparison is $500 more expensive than the FreeSync one.

>So anyone who wants to draw a conclusion on how RX Vega will perform has to explain this contradiction.
First explain to me how Ryzen can perform like a 7900X in professional workloads but perform like a 2600K in games. It should not be possible.

You ignorant fucking swine

There's an even bigger difference than $300 between the monitors, meaning the Vega card might even cost more while performing the same as the 1080.

Stupid Swede

>Over 1 year late
>350W - 440W HOUSEFIRES
>Still losing to GTX 1080 310mm2

CPU =/= GPU

>yfw AMDrones are actually excited for this piece of trash

>they still deludedly believe it will somehow be cheaper than a 1080 while the gpu is obviously much more expensive to manufacture (much bigger die, HBM2)

Oh wow, so they are using a top-of-the-line Asus ROG 34" 21:9 3440x1440 100Hz G-Sync monitor for Nvidia

but using the cheapest shit from a Hong Kong eBay listing (1920x1080, 144Hz, 16:9) for their FreeSync monitor?

totally fair comparison my dude. I can find that 500-buck difference by doing what AMD is doing, but I can also find only a 150-buck difference if I choose the same specs for both monitors.

Isn't the Vega FE like 1000 bucks or some shit? IIRC the Titan Xp, which uses the same GPU as the Quadro, is around the same price at like 1.2k. Obviously they'll cut the Vega FE down to make the RX Vega, like Nvidia did with the 1080 Ti, but even then the 1080 Ti is like 700 bucks, so you might be onto something.

All in all Vega is looking like trash for gaming though anyway.

Samsung is an AMD Epyc partner

maybe AMD gets discount HBM2 prices

Who. Are. You. Quoting.

Come on....

youtube.com/watch?v=HyyOP3ZI6qc
It's all explained here

The person I quoted

twitter.com/AMD_UK/status/887653978049318914

NVIDIOT BTFO

>"AMD to show off the impossible at Siggraph 2017"

Holy shit, prepare your anus Nvidia. RX Vega is gonna fucking beat 1080Ti and cost 300 bucks less

It's either gonna be cheap as dirt, or really fucking fast and expensive.
Honestly I think the market would prefer the first: 1080+ performance for $400? Give us another Ryzen.

The guys running the event said both monitors are 1440p ultrawide 100 Hz, with FreeSync and G-Sync respectively.

FreeSync monitor
>amazon.com/dp/B01N4UQIGT
>$749
>On special from a normal price of $799

G-Sync monitor
>amazon.com/Asus-PG348Q-34-Inch-Ultra-wide-Monitor/dp/B01C83BE6U
>$1182.99
>On special from a normal price of $1299

Those are the best prices I can be bothered to find.
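Spelling out the gap from the prices quoted above:

# price gap between the two monitors, using the figures listed above
freesync_sale, freesync_list = 749.00, 799.00
gsync_sale, gsync_list = 1182.99, 1299.00
print(round(gsync_sale - freesync_sale, 2))  # 433.99 at the current sale prices
print(round(gsync_list - freesync_list, 2))  # 500.0 at list price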

This is some pretty pathetic stuff

Instead of focusing on the performance or the actual price difference between the cards, they went out of their way to find two monitors with the same resolution and the largest possible price difference. AND they show it in one of the most AMD-favored games, of course, most likely running the 1080 on the worst API and the RX Vega on the best one.

This straight away tells you two things: the performance is not significantly higher than the 1080's, and the price is not lower than the 1080's.

Any marketer worth their salt would have pointed out either of those things if they were actually the case.

And we aren't even talking about the power consumption; with rumoured TDPs of 375 W on the top Vega card it's going to be insane.

And keep in mind, they are 'announcing', not even releasing, this A YEAR LATER than the competition.

AMD is officially done for in the high-end market. As if not beating Pascal wasn't enough, Volta is on its way, which will be the nail in the coffin.

>AMD shows off the impossible
A card that draws 600W and barely performs better than a 1080

...

>2017
>the return of 3-slot cooling

AMD is cheap unreliable trash for poor people. Was and always will be. Literally the equivalent of poor people buying expensive phones.

don't forget that they are expanding the PCB height as well (not the length, but the height, shown in pic related)

devblogs.nvidia.com/parallelforall/inside-volta/

>The new Volta SM is 50% more energy efficient than the previous generation Pascal design, enabling major boosts in FP32 and FP64 performance in the same power envelope.

>50% more energy efficient
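Taking that sentence at face value (this is just my literal reading of the quote, not Nvidia's own figures):

# reading ">50% more energy efficient" literally as perf-per-watt
pascal = 1.0
volta = 1.5 * pascal
print(volta)                      # 1.5x the throughput in the same power envelope
print(round(pascal / volta, 2))   # 0.67 -> roughly two-thirds the power for the same throughput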

THANK YOU BASED NVIDIA

AYYMD IS FINISHED & BANKRUPT

AYYMDRONES CONFIRMED ON SUICIDE WATCH

- They can't compete with the competitor's products.
- "Our product is just as good in practice. Even though it's slower you don't notice the difference."
- Adaptive sync makes it much harder to spot the difference.
- If it had been faster they would have shown real numbers.
- They used a GTX 1080, which means it's nowhere near the GTX 1080 Ti. Not that using a GTX 1080 Ti would really change much.

Vega might be a decent architecture for APUs but it seems to have serious scalability issues. Either AMD should rethink their approach of keeping GCN alive forever, or they should just ditch GloFo until the 7nm process is ready. On top of that, RTG's marketing team does ridiculous things. "Poor Volta" seems like it's going to come back to bite them.

>Creo
>benefiting from moar ALUs
?

>buying reference
>ever

it's true about reference cards, but the same can be said about the reference 1080: it's hot and loud.

hardware schedulers, more cores

...

The same reason the 390X lost to the GTX 980 in games but was vastly superior in compute.
The hardware scheduler is not necessary for games.

What the fuck does scheduling have to do with Vega's bonkers geometry perf?

You're confusing compute with graphics/geometry workloads.

Maya, AutoCAD, 3ds Max, and Creo are NOT compute, they're graphics workloads like games, and Nvidia has always done fantastically in them.

no, they are different in a few ways, especially where accuracy of the rendered image is concerned (it doesn't necessarily matter if your textures are slightly misplaced when gaming)

Well it's the reason why AMD GPUs are good at compute.
But something with Vega is broken

>Are professional and gaming workloads really so similar? Honest question, I am by no means an expert on the subject.
Depends very much on which professional workloads you have. AMD has consistently offered far more raw compute than Nvidia in the past, even when its gaming performance was comparatively inefficient, so that can translate to better results in professional applications.

I read that Vega FE in some cases has 350-400 W of power draw. Can't wait for that power bill.

Scheduling has NOTHING to do with compute perf you moron.
GCN historically had moar ALUs than nvidia products that competed with it.
That's not the case for Vega, it's still 4k ALUs.
Guess why.

Anons, can you give me details on this reference to hardware schedulers?
Does AMD have more of them on-chip vs Nvidia? Can someone give me some good sauce?

I wouldn't be surprised if they didn't enable G-Sync and used DX12 on the 1080 machine. Might explain why the AMD machine felt "smoother", as some have reported.

>"AMD to show off the impossible at Siggraph 2017"

Has anyone drawn the connection that this might refer to the vega cube?

Why does this remind me of that song by Arctic Monkeys?

I had a screencap explaining GPU hardware schedulers.
Can't find it.

Basically it allows more efficient compute, at the cost of drastically increased power usage and temps.
Example: GTX 480.

>"AMD are going to start making high-tech fingerboxes!"
All my wildest dreams have come true

What?
Are you fucking retarded?
GF100 was hot and hungry because JHH thought he could fab a 500mm^2 die on a new node without preparation.
It has nothing to do with hardware scheduling.

I think it's a dying 'trade' in the western world, because of how tacky it is.
>we get it, they've got tits
I'd much rather they served a purpose than be eye candy. Not that many of them are that attractive.
Serve me a beer or some nibbles. Stop standing there like a twat.
It would be much better if they had product knowledge, but that's expecting too much for what is a temp position.
Even at car shows they're thinning out. Blokes are much more interested in what they came for than some bimbo. Unless that bimbo built that car.

So has anyone explained why some 5 billion transistors in Vega are just sitting there?
Because they could have just shrunk Fiji, but they didn't.
They could have doubled Polaris 10, but didn't.
They could have released both a year earlier.

So what exactly is going on?

>Not understanding green text
>Typing. Like. This.
autism

Yeah, I need that sauce, anon. See if you can find it. I think it's a granularity type of deal. Nvidia has a couple of high-level ones. So I'm wondering what AMD has over this? Increased scheduling granularity and more schedulers?

It's a mystery.
Ask RTG for whitepapers if you want to know.

Sandbagging.