Literally a rebranded and overclocked R9

>literally a rebranded and overclocked R9
Can AMD get sued for false marketing at this point? Considering how vega was supposed to be this brand new and revolutionary architecture?

Other urls found in this thread:

gamersnexus.net/guides/2977-vega-fe-vs-fury-x-at-same-clocks-ipc
youtube.com/watch?v=yudueaG5_rE
pcper.com/reviews/Graphics-Cards/Radeon-Vega-Frontier-Edition-16GB-Air-Cooled-Review/Professional-Testing-SPEC

Way to cherry pick your benchmarks.

>this hard in denial
here are all of them
gamersnexus.net/guides/2977-vega-fe-vs-fury-x-at-same-clocks-ipc

...

How the fuck did they manage to fuck this up so bad

I'm pretty sure the Vega Frontier Edition has no driver optimization for gaming workloads on the new architecture.
That, or the new architecture gives no benefits to games at all.

I didn't see it in the article.
Here's the proper picture.

Sure, right after you sue Nvidia for advertising the 970 as a 4GB card when it was crippled to 3.5

You guys still haven't accepted 1070 level performance? Sad!

>I don't know what a rebrand is: the shitpost

they were
everyone got 30 bucks

The FE card had a hardware flaw that had to be fixed. This is why it performs badly in certain tasks, and phenomenally in others. This is why Vega was delayed. If you think FE is indicative of RX Vega in terms of performance, there's a problem.

>complains about cherry picking about something that isn't cherry picked
>cherry picks the ONLY benchmark out of like 20 that's not a total embarrassment
HAHAHAHAHAHAHAHAHAHAHAHAHA

All of the other benchmarks are gaming workloads, which only tell you how good the card is at one thing.
SPEC is a combination of different workloads and tells you how the card performs under various different workloads.

1. The fact that you think another company doing something else is relevant just shows how much of a pathetic fanboy you are
2. see

Should have just die-shrunk Fiji and used 8GB GDDR5X.

What a complete joke, Lisa Su needs to get rid of the Pajeet in charge.

They simply disabled anything related to gaymen on FE. Maybe it does less cache flushes due to ROPs being clients of L2 but it doesn't seem to work right now.
It's a glorified compute board.

If all the other benchmarks are "just gaming workloads" then that benchmark is just "mesh workloads". Also, nobody even cares how it performs in workstation loads since actual workstation cards from both AMD and Nvidia BTFO it

>Just Wait (TM)
AMD might as well adopt this as their slogan since all their "fans" seem to like it so much

FE was not optimized for gaymen workloads. At all. In fact, most graphics-related features are simply disabled (for stability reasons or whatever). Only packed math seems to be working.

We all just waited two quarters for Maxwell2. It was decent, though only one SKU (GM200) was any good.

Are you retarded?

The FE card is explicitly designed and stated to be a prosumer card, meant for game developers.

How are game development benchmarks invalid?

This is just dumb.

>this retard literally thinks there's magical "gaymen" switches on graphics cards that will dramatically affect performance.
AMD knows they can't compete on games, so they release this """"""Workstation card""""" to buy them some time while they try to reach at least 1080 performance

You need to go back. Back to the Sup Forumseddit.

>two quarters
Yeah, and AMDrones have been saying "just wait" since bulldozer launched 25 quarters ago

so the vega FE is barely better than the fury, putting it in a stupid space between the fury and the 1080ti.

Wow it's fucking nothing, AMD does it again. Why can't they release a decent product, like they did with Ryzen?

CPUs are GPUs now? Your Sup Forums is showing.

>then that benchmark is just "mesh workloads"
But that is a factually retarded statement.
>AMD and Nvidia BTFO it
Not at the price/performance they don't
Also this is THE FASTEST AMD CARD

>The FE card is explicitly designed and stated to be a prosumer card
see How are game development benchmarks invalid?
>nobody even cares how it performs in workstation loads since workstation cards from both AMD and Nvidia BTFO it

Does this come as a surprise? It's running fiji drivers, of course it's going to perform in a similar manner.
As fun as all this sensationalism is, we should really wait for release.

However, it is AMD's fault for not giving free cards to reputable websites and letting hurrdurrgaymers nexus control the narrative.

ryzen is a heck of a lot better than bulldozer though. and pretty well priced for 6 and 8 core processors

Yes, you've been saying "just wait" for both ever since bulldozer.

You need to go back.

Pascal is literally an overclocked Maxwell. You get the lawsuit going against AMD and I'll get the lawsuit going against Nvidia. We can end the tyranny of gayming once and for all!

>identical flops = identical clock
HAHAHAHAHAHHAHAHAHA this is literally how retarded AMDrones are

>Pascal is literally an overclocked Maxwell
With moar ALUs.

Hmm maybe the chips have different ALU count?
Wow really makes you think.

>so desperate to shitpost that he failed at reading

>identical flops = identical clock
>HAHAHAHAHAHHAHAHAHA this is literally how retarded AMDrones are

Image actually doesn't support that statement.
Interesting how MOAR ALUS =/= Better gaming performance

Your meds.

Wow.

Hahahahah man, Moore's law hitting the ceiling fucking hard

You can only slap on so many ALUs before the GPU starts choking somewhere, see Fiji.

Looks like IPC is up 30-40% on average. And then we have the clockspeed adding another 10-15%.

That's as impressive a GPU arch jump as anything Nvidia has ever put out. Butthurt gamers can cry in a corner, AMD is making money where it counts.
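Worth noting that the two gains in the post above multiply rather than add. A quick sketch using the post's own (unverified) 30-40% IPC and 10-15% clock figures:

```python
# Toy arithmetic for the claim above: independent IPC and clock gains
# compound multiplicatively. The percentages are the post's estimates,
# not measured data.
def combined_speedup(ipc_gain: float, clock_gain: float) -> float:
    """Total speedup from an IPC improvement plus a clock improvement."""
    return (1 + ipc_gain) * (1 + clock_gain)

low = combined_speedup(0.30, 0.10)   # 1.30 * 1.10 = 1.43
high = combined_speedup(0.40, 0.15)  # 1.40 * 1.15 = 1.61
print(f"{low:.2f}x to {high:.2f}x")
```

So if both figures held, the combined uplift would land around 1.43x-1.61x, not a simple 40-55% sum.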

Can't wait for another 4 weeks of these shitty threads.

Don't worry, they'll talk very briefly about RX Vega at SIGGRAPH. It'll be all about shilling Instinct and Vega SSG (this one is REALLY nice).

F I N E W I N E
D R I V E R S

RTG should deliver working drivers first before delivering better ones.

I agree it was shady for them to pull that shit but I loved following the perception of the card.

>before 3.5
wow, benchmarks show this is a great card for the price!

>after 3.5
wtf, this card is horrible, no one buy it ever!

970 is good card for the buck, if you know that it's 3.5.

not him but you're a fucking retard. It would still be good for the buck if they lied about it having 12 GB and you believed it

No, no it wouldn't. It would have been terribly ass when I tried to populate all the memory.

>R9
who says this?
it's like you're saying "i5"

how many claimed that money and got it?

>And then we have the clockspeed adding another 10-15

Which is the real question: why does performance jump only 10-15% when the clockspeed jumps +60%?

That's really, really weird. It should scale linearly.

Funny, this. I understand why people would make a big deal in terms of muh spec change and marketing, but in terms of performance the benchmark results were never affected by the spec change, and I've yet to see any evidence whatsoever of performance degradation due to the split config in games. I remember searching up and down YouTube for benches but found nothing. Even recent benchmarks by HUB and that fat German guy showed no slowdown, even with vram-hog games that came out in the last year.

thanks for correcting the record fellow pede :^)

It only scales linearly on CPUs, or at least close to linearly, since the clockspeed affects the entire chip.

Unlike GPUs, where there are different clock domains and overclocking the core doesn't do anything for the other clocks.

There's a bottleneck somewhere obstructing the potential +60% jump in performance. The Maxwell to Pascal performance jump was pretty much all due to bumped-up clock speeds.

Is the bottleneck due to mistakes in the hardware architecture / some process issue, or is it completely within the domain of stuff that can be tweaked in software (drivers)?
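For what it's worth, the numbers being argued about are roughly consistent with an Amdahl-style bottleneck. A toy model, assuming (purely for illustration) that only some fraction of the frame time actually scales with the core clock:

```python
# Toy Amdahl-style model of the bottleneck being discussed: if only a
# fraction p of the frame time is bound by the core clock (the rest
# sits in other clock domains or on memory), a +60% core clock gives
# far less than +60% FPS. The 30% figure below is an assumption for
# illustration, not a measured value.
def speedup(core_gain: float, clock_bound_fraction: float) -> float:
    """Overall speedup when only part of the work scales with core clock."""
    p = clock_bound_fraction
    return 1.0 / ((1 - p) + p / (1 + core_gain))

# If ~30% of frame time were core-clock-bound, +60% clock would give:
print(f"{speedup(0.60, 0.30):.2f}x")  # ~1.13x, i.e. roughly +13%
```

Under that assumption a 60% overclock lands in exactly the +10-15% range people are seeing, without needing to decide whether the non-scaling part is hardware or drivers.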

To this day we have no idea what the clock domains of GCN designs or Nvidia designs are, or whether there are really any more than memory and the processing unit, or whether there's a separate clock domain for the schedulers, shader arrays, backend, etc.

You clearly don't know anything at all with a comment like that, and for that matter CPUs definitely hit gain limits with clock increases too.

This board is full of retards.

>there seems to be a bottleneck in hardware or software
>WE HAVE NO IDEA WHERE THE BOTTLENECK MIGHT BE IN HARDWARE YOURE A RETARD HURR DURR

truly Sup Forums at its finest

>j-j-just a rebrand guise
>Proven WRONG
>HUEHUEHUEHUEHUEH


sure is 12 up in here

>wait for drivers to activate XYZ

no, the 3.5+0.5 was not what they lost on.

It's not a hardware issue, it's literally a driver one. What should be there is not currently implemented in the driver, and they had to use Fiji as a fallback mode because there will be situations where the working driver doesn't work for older games and shit. This way, when it works it really fucking works, and when it fails it doesn't crash.

There is no way in hell a card that computes this much better at the exact same clock as Fiji won't perform better than Fiji in graphics rendering, but we will get shitposting like this till RX Vega comes out.

because intel is incompetent and has sat on its dick for nearly 20 years
nvidia has had to deal with the 4000, the 5000, the 6000, the 7000 and the 200 series, all GPUs that were better at launch than the Nvidia counterparts they were up against.

you can laugh at amd's gpu side, but the market never rewarded them when they did good, in fact it actively punished them while nvidia made fucking bank and kept improving, at least till maxwell, as pascal is pretty much just a die shrink and volta is an unknown outside of the high fucking end teslas.

Except that the 3.5 gb meme was found somewhere in January or February, 4 or 5 months after the release, and the card still sold like hotcakes pretty much throughout the whole of 2015 and early 2016. I had a 970 gtx myself and honest to god never had any issues with the 3.5 meme. I've even been called a shill back then because I was posting smooth, non-stuttering webms of the few games that actually used more than 3.5gb of vram, like Lords of the Fallen, using my 970. Nvidia fucked up, but at least they somehow made the drivers ignore that last slower section in games that can chug more than 4 gigs of vram. Also, contrary to popular belief, the slow memory bit wasn't accessed at all if not necessary. The 970 was a great 1080p card and could easily compete with a stock 480 and 1060 when OCed.

I love how you spout hardware-issue claims when AMD has released no official statement, and when the first benchmarks showed up you shills were literally dead quiet on Sup Forums, just so you could end up using reddit comment arguments (more like speculations) like yours.

>shills are going again with the rebrand meme
Buy an ad faggot

>youtube.com/watch?v=yudueaG5_rE
>Gayming mode is a placebo switch
>No performance difference between Pro Mode and Gayming Mode
>AMDrone fanboys everywhere continue the chant "b-b-but it's not designed for games!!"
AMD have taken a page out of the Nvidia/Intel guide, and shafted their red army followers!

yeap, it's a rebranded 390
oh wait
pcper.com/reviews/Graphics-Cards/Radeon-Vega-Frontier-Edition-16GB-Air-Cooled-Review/Professional-Testing-SPEC

another nvidia shill gets blown the fuck away as usual

For gaming purposes, it is

a card that beats almost everything nvidia has to offer in the pro field

we know that the game mode is there to test games on vega uarch

>muh gaymin
Why don't we see how Shitquadro do on games?

Didn't Raja admit in an interview that they have been more or less ignoring R&D in the GPU department for years?

you mean every murican

A Quadro P6000 stomps 2xRX580 at 1800MHz in 1080p/4K

literally A SINGLE game being benchmarked on a p6000
and you conclude it stomps everything LOL

It has been confirmed that gaming mode only changes the aesthetics of the Crimson UI, nothing else.