FINALLY

A tech reviewer who, unlike the so-called ones, actually does their fucking job and tests shit out.

Ryzen tested with an RX 480 vs the 6900K (at 720p so it's less GPU-bottlenecked) instead of relying on Nvidia """drivers""" to work properly.
And other tests

>thetechaltar.com/amd-ryzen-1800x-performance/2/

Many of the same games, but at 1080p with a GTX 1080, and the 6900K wins instead.

They tested with "Threaded Optimization" on/off, but didn't try turning MSI mode on.
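For anyone wondering what "MSI mode" means here: it's Message Signaled Interrupts for the GPU, which Windows exposes as a per-device registry flag (the thing the various MSI-mode utilities toggle). A quick Python sketch of how you could check it yourself, assuming the usual registry location those utilities edit; this isn't anything the review itself does:

import winreg  # Windows-only; run from an elevated prompt

ENUM_PCI = r"SYSTEM\CurrentControlSet\Enum\PCI"
MSI_KEY = r"Device Parameters\Interrupt Management\MessageSignaledInterruptProperties"

def nvidia_msi_states():
    # Walk the PCI device tree and report the MSISupported flag for every
    # Nvidia device (PCI vendor ID 10DE). A missing key means the device is
    # still on old line-based interrupts.
    states = {}
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ENUM_PCI) as pci:
        for i in range(winreg.QueryInfoKey(pci)[0]):
            dev = winreg.EnumKey(pci, i)
            if "VEN_10DE" not in dev:
                continue
            with winreg.OpenKey(pci, dev) as devkey:
                for j in range(winreg.QueryInfoKey(devkey)[0]):
                    inst = winreg.EnumKey(devkey, j)
                    try:
                        with winreg.OpenKey(devkey, inst + "\\" + MSI_KEY) as msi:
                            value, _ = winreg.QueryValueEx(msi, "MSISupported")
                            states[dev + "\\" + inst] = bool(value)
                    except OSError:
                        states[dev + "\\" + inst] = None
    return states

if __name__ == "__main__":
    for device, enabled in nvidia_msi_states().items():
        print(device, "-> MSI enabled" if enabled else "-> MSI not set (line-based)")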

>Ryzen + Vega

This will be sweet if Nvidia's drivers being shit is true

inb4 everyone tests Vega with a 7700K or some Haswell chip

It looks like Vega + 1700 is going to slaughter Vega + 7700K, at least in DX12/Vulkan/AZDO, eh?

Nvidia's DX12 support is awful; BF1 runs SLOWER in DX12 than in DX11.

These are also using the 1080. Sadly there's no RX 480 at 720p low comparison or anything like that here, so we can't get an idea of how it'd be with Vega.
Same 3000MHz RAM for all CPUs tested.

7700K with the 1080 wins in 19 games.
1800X wins in 12 games.
The 1700X and 1700, stock, still beat the 7700K in a few.
>1800X is faster in Arma 3? I didn't think that'd be the case.

Those are good results, considering how poorly Nvidia's drivers are working, given this was with the 1080.
Seems like the 1700+Vega may end up almost beating the 7700K across the board.

>testing any Novideo with DX12

People never learn

Or 1600X+Vega, I should say.
Will probably be faster in 90% of games.

Yeah, but I said Vega + 7700k. So no shitty Nvidia DX12 drivers holding it back there.

7700k+1080 in DX12 would be even worse for them both.

>50% utilization on TitanXP

Great drivers, Nvidia

It really just seems like the """tech reviewers""" were creating misleading results and shilling on purpose, doesn't it?

No, they're just retarded and clueless on how hardware and software work.

Makes sense.

It's been a month. A big company like Nvidia should fix it faster than Ashes of the Benchmark.

I agree that it's not some conspiracy, though. Just incompetence all around.

rbt.asia/g/thread/S59317047#p59317856

Oh look, it's almost like I said this might be a problem some 3 weeks ago?

DX12 is shit. Nvidia is sabotaging microshit and you idiots don't even realize that this is a good thing.

>bitsandchips

clickbait bullshit. nobody believes anything those filthy dagos say.

AMD is just incompetent as fuck. They didn't even send Asetek the AM4 bracket dimensions before Ryzen launched. Trying to blame Nvidia for AMD's own problems is hilarious.

but user, Nvidia gets REKT in Vulkan too, which is what we all truly want and need game devs to switch to.


because fuck DirectX, Vulkan is better for end users in all cases.

Lots of us have posts on anandtech and other forums noting the same thing 3-4 weeks ago, yep.

Fucking retarded """hardware reviewers"""

The conclusion of the review in the OP was:
>Ryzen is a superior CPU, though one can find the worst way to benchmark something to show its weaknesses

>which is what we all truly want and need game devs to switch to.
Then why have all the threads about this so far been nothing but Ryzen fags talking about DX12? This has nothing to do with Vulkan or even DX12, really, just AMD drones trying to cling to anything to defend their subpar CPU.

Because game devs are making DX11 and DX12 games, which is why it matters.

While it's all good and well to hope for Vulkan all the way, it probably won't happen. Thus, with Nvidia being shit at DX12, especially on Ryzen, it's a fair argument that they should at least get their shit together, since DX12 is what game devs use now and will use even more going forward.

Being loyal to any brand is fucking retarded though, I hope you know this.

>though one can find the worst way to benchmark something to show its weaknesses

Yeah, this is one of the few reviewers who didn't use slower RAM for Ryzen. 3000MHz on them all.
This is also one of the few reviewers who didn't crank 16x AA and other settings in the Ryzen-favored games to create a GPU bottleneck (like Tom's Hardware and TechPowerUp did), which makes games where Ryzen is better come out even with the 7700K, while keeping settings lower in the 7700K-favored games with Nvidia DX12 so the 7700K pulls far ahead there. The end result looks like the 7700K is even in some games and better in others, when really it's better in some and worse in others, and that's mostly just with an Nvidia GPU.

HMM. Go figure when you do a fair and even methodology, you get reasonable results that show it's a good CPU.
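To put the GPU-bottleneck point in concrete terms, here's a toy model with made-up numbers (mine, not the review's): the framerate you actually see is roughly capped by whichever of the CPU or GPU is slower for that scene, so cranking settings until the GPU is the cap hides the CPU difference entirely.

def delivered_fps(cpu_fps, gpu_fps):
    # The delivered framerate is limited by the slower of the two sides.
    return min(cpu_fps, gpu_fps)

# Hypothetical CPUs: how many frames per second each could prepare.
cpus = {"CPU A": 180.0, "CPU B": 140.0}

# Same GPU, two setting levels: heavy (16x AA etc.) vs light/720p.
scenarios = {"heavy settings (GPU-bound)": 90.0,
             "low settings / 720p (CPU-bound)": 250.0}

for label, gpu_fps in scenarios.items():
    print(label)
    for cpu, cpu_fps in cpus.items():
        print("  %s: %.0f fps" % (cpu, delivered_fps(cpu_fps, gpu_fps)))
# Heavy settings: both CPUs print 90 fps and the ~29% CPU gap vanishes.
# Low settings: the 180 vs 140 gap shows up again.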

Because there aren't enough Vulkan games to test.
There's like 3 or 4? And one of them still uses an OpenGL rendering pipeline so hardly sees benefits.

Mad Max's Vulkan beta just came out, but most """tech reviewers""" are too ignorant to know it exists. Feral Interactive is making a Vulkan update for all their games from 2015 onward, it seems.
The Mad Max Vulkan beta raises FPS to 175%-350% of what OpenGL gets.

Intel+Nvidia is going to get pretty BTFO if someone actually bothers testing a 7500+1060 against a 1500X+RX480, even at stock, and Nvidia doesn't fix this.

The all-AMD complete system is like $80 cheaper, yet it's going to outperform it by so much.

>Being loyal to any brand is fucking retarded though, I hope you know this.
Tell that to this guy. I'm not the one making a GPU driver issue that has nothing to do with the CPU all about Ryzen.

While we do appreciate your shekels here at bintel, the fact is, Ryzen sales and reviews have been cast in a very negative light due to Nvidia having piss-poor drivers.

While it IS Nvidia's fault, AMD is the one who pays for the idiot reviewers and the masses that blindly believe them.

I'm sorry that your hatred for any company you dislike, don't own parts from, or have no intention of buying from blinds you from seeing such things.

FYI, I'm on an Intel CPU and Nvidia GPU.

You're a delusional paranoiac, which is why no one takes you seriously. If you want people to care about this, then try making a thread without obvious pro-AMD bias.

>calling me biased
>for pointing to a review with actual even methodology in their testing and a wide variety of games, old and new, tested
>not those who shill around reviews where Ryzen has 2400MHz RAM while Intel has 3200MHz
>those people that you're probably one of

Whoa. You're some kind of fucktard, huh.

5 shekels have been deposited into your account

>a review with even testing methodology across the board
>tons of games and applications tested
>OBVIOUS PRO-AMD BIAS REEEEEEEEEEEEEEEE

youtube.com/watch?v=nLRCK7RfbUg

Go get cucked. Even with AMD's drivers Ryzen still performs like shit.

Says the paedophile.

I'm not calling you biased because you linked to a review. I'm calling you biased because of the narrative you're pushing in your posts.

I don't take issue with the review or their methodology. OP is a mentally deranged individual who is working very hard to skew the discussion and make it all about how great Ryzen is, rather than about Nvidia's fuck-up.

>GPU bottleneck to test CPUs
k

There is so much mental retardation being flung around on this thread.

Nvidia is obviously incompetent in DX12. Either by hook or by crook, they do not want DX12 to happen.

AMD is at fault for not having a high end GPU for Ryzen.

These test results show that the tech press are incompetent at best, crooked at worst.

And AMD needs to hire more fucking people.

That's what I have learned this last month about the industry.

>OP is mentally deranged
>for pointing out there's finally another review out there that uses the same speed RAM on all systems
>and for pointing out that the reviewer actually compared various settings
You're the mentally deranged one.
You seem triggered hard by fair testing methodology.
Is that you, Steve?

>amd is shit
>blame nvidia

What am I supposed to do with my life if AMD makes good products? I should just kill myself then

>I'm a faggot who doesn't understand computers beyond their use as Xbox consoles

Try and look like a dumb fuck harder.

>1080p+ "realistic use case" results are bad
>use 720p which no one with this setup actually uses

top kek

The Pro Duo exists, but it seems like literally no "tech reviewer" has one.

That isn't the point though.

>Reverse course! Ryzen doesn't suck!

The Pro Duo is an impractical card to test with.

I was planning on a 1700X and a Vega anyway, so good.

I'm happy with my Intel, and I don't need anything else, I certainly don't need AMD, AMD can go fuck itself and rot for all I care.

...

It's not because Nvidia told us not to test it!

We'd never do that! Nvidia just makes the best GPUs you can buy!

This is as staged a benchmark as it can be. The 6900K is a lot faster than an 1800X: it has more cache, no CCX nonsense, and higher IPC. It losing to Ryzen is retarded and makes no sense, and the price of the product reflects that as well.

...

every dual-GPU card is terrible

Shilling and delusion aren't enough to properly articulate this. What do you call this kind of mentality?

Shitposting aside, it's probably mostly just a failure of AMD marketing in not sending cards to reviewers.

There aren't enough to send to the thousand-plus reviewers that'd want one. It was a high-end workstation card and not really geared toward games, especially in the numbers they produced.

>ayymd need 2 gpu to compete with the 1080 non ti

it's biased
BIASED
fuck!

>"compete"
>35% faster
>merely "competes"

You shouldn't do this, technical deep dives and curiosity are bad, only do what everyone else does and only test with Qualified Vendor Approved hardware and software.

Or else you'll never get a review kit or support from us again!

Yes, AMD needs two GPUs because they can't compete with full Pascal.

>those two GPUs combined cost less than a third of what the single Nvidia one does

AMD drone falseflagging. From the consumer perspective, the real Intel CPU Ryzen is competing with is the i7-7700K. Nobody but AMD gives a shit about the 6900K.

Maybe it's because AMD hasn't released their highend GPU yet?

At least not for desktops.

>$800

k

Is that a FP64 or FP16 card?

fucking this

> 99% of reviewers are comparing the 1800X to the 7700K
> 7700K is a quad-core
> games benefit more from single-thread performance
> The 7700K easily hits 5GHz
> 7700k is indeed better for games
> Then the conclusion is that Intel is better

WHY THE FUCK can't those Intel shills compare the 1800X to an 8-core Intel? The damage control is unbelievable.

ALWAYS compare the 1800X to the 6900K.

wait™

It's a machine learning accelerator at 12.5 TFLOPS FP32, 25 TFLOPS FP16.

FP64 is reserved for Vega 20, which I think is 2018.
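For what it's worth, the FP32 number falls out of the usual peak-throughput formula. Quick sanity check in Python; the 4096 stream processors are Vega 10's shader count, and the ~1.53GHz clock is my assumption, picked to match the 12.5 TFLOPS figure quoted above:

# Peak throughput = shaders x 2 FLOPs per clock (one FMA) x clock speed.
shaders = 4096          # Vega 10 stream processors
clock_ghz = 1.526       # assumed boost clock, chosen to match the quoted figure

fp32_tflops = shaders * 2 * clock_ghz / 1000
fp16_tflops = fp32_tflops * 2   # packed math: two FP16 ops per FP32 lane per clock

print("FP32: %.1f TFLOPS, FP16: %.1f TFLOPS" % (fp32_tflops, fp16_tflops))
# -> FP32: 12.5 TFLOPS, FP16: 25.0 TFLOPS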

Yeah, sucks that AMD prioritized enterprise/datacenter before consumers.

HOL UP HOL UP

Wait. I have a conspiracy brewing in my head
wait wait wait
what if, nvidia did this on purpose to gimp amd performance in the CPU world? so amd would lose revenue and be less competitive with GPUs?

But wait! What if it doesn't stop there. maybe because AMD don't have a card to compete with the gtx 1080 nVidia thought they could get away with it?

What IF?!?! What if the exact same thing happened at intel when AMD couldn't compete on the CPU level. Intel could have optimised their architecture to work better with nvidia cards.

It would explain why the 980 Ti beat the Fury X despite having 3 fewer teraflops of performance. In fact the Fury X should beat the 1070 handily, but it doesn't. Is it all a conspiracy??

All this time amd have been superior to nvidia but we couldn't see it! Because if you bought an intel cpu they would be gimping amd performance, and if you bought an amd cpu you would be gimping the performance of any graphics card. So reviewers would only test gpus with intel cpus and nvidia would get an artificial advantage.

it all makes sense now! Why the 8+ teraflop Fury X lost to the objectively less powerful 980 Ti. And all the previous generations of Nvidia superiority. It was all because AMD couldn't compete in the CPU market, so their own products were at the mercy of Intel. Now that AMD are only competing in the CPU market, their products are at the mercy of Nvidia!!!!

this has to be true, it makes so much sense in my head. This is why we need multi-GPU support, so that manufacturers end up gimping themselves as well if they try to gimp the competition.

tell me why im wrong, i dare you.

>sucks that AMD prioritized enterprise/datacenter before consumers
> enterprise/datacenter aren't consumers
ayy lmao

Because that's the price range AMD decided to stick Ryzen in, you idiot. It's only natural that people are looking at Ryzen and comparing it to what else they could get for that money. The difference between 8 and 4 cores is nothing more than another bullet point in a list for most people.

jesus christ you sure drank the kurry aid

Nice delusion, bruh

>because it's cheaper, the fact that it often outperforms this $1000 part at half the price or less doesn't matter

The 7700K is irrelevant for high-end users anyway.

Intel shills just want to compare Ryzen to the 7700K in gaming and to the 6900K/6950X when it fits them.

No, not really? Nvidia released enterprise cards before the gamer crap as well.

The 7700K is only better in some very limited use cases like playing some older titles on a 144 Hz monitor. The 7700K loses badly to Ryzen in much more relevant cases like BF1 multiplayer, where it turns into a stuttering mess while Ryzen's 8 cores deliver perfectly smooth performance.

Oh wow? No shit.
Why do you think that is?
Why do you think Intel is changing its release cadence to servers first?

Because the PC market is stagnating in shipments, while enterprise is growing.
Honestly you fuckers have 0 perception or management abilities.

>Because that's the price range AMD decided to stick Ryzen in you idiot

HAHAHAHAHAHAHA

This is probably the most genius bit of shilling I've ever seen.

>Sure, it outperforms our $1000 CPU, but it's half the price so you can't compare them.

It's going to be the 580 in two weeks, with a 1340MHz boost clock on reference boards, for the same price as or cheaper than a 480 8GB. The ROG Strix 480 is 1330MHz in OC mode, so I'm guessing aftermarket 580 OC cards are going to be in the 1400MHz range.

...

I can't give a shit about Polaris even if it's clocked 15% higher and has GDDR5X

You should, because its competition will be a Pascal refresh 1060 since Volta is now safely Q2 2018 territory.

Not terrible, just underutilized.

Actually, I fucked up. It has nothing to do with price ranges. The 7700K is simply the Intel CPU the vast majority of people are inclined to purchase because of its great value, so Ryzen would have had to compete with it regardless, while nobody really gives a fuck about the 6900K. AMD tried to steer the conversation toward the 6900K, but it was only marginally effective.

Shills act like they can simultaneously point at the 6900K in gaming when the 6900K is better in one game (which is rare, as we can see when Nvidia drivers aren't holding Ryzen back), but then point at the 7700K in gaming when the 7700K is better in another game.

That's even funnier than people ignoring cost and CPU utilization left over.

Since no one has posted the vids yet:
This new round of discussions started here:
youtube.com/watch?v=gFBKFz9n2hc (AdoredTV "what to trust")

and everybody started to pay a bit more attention here:
youtube.com/watch?v=0tfTZjugDeg (AdoredTV "Ryzen of the Tomb Raider")

It got some good supporting arguments here (I'd recommend watching this one above any other):
youtube.com/watch?v=nIoZB-cnjc0&feature=youtu.be (nerdtechGasm "AMD vs NV Drivers: A Brief History and Understanding Scheduling & CPU Overhead")

And as usual, the moronic run-of-the-mill youtubers didn't get it, and I'm quite sure this is only the first vid on the subject, since youtubers love this kind of circlejerking:
youtube.com/watch?v=nLRCK7RfbUg (Hardware Unboxed "Does Ryzen Work Better With AMD GPUs?")

Anyone with ~40 mins to spare, I'd recommend watching Adored's 2nd vid and then nerdtechGasm's.

I've seen them.

I just believe they're less complete tests and harder to digest than the test in the OP.

None of those test with MSI enabled on Nvidia either.

The nerdtechgasm one is VERY good, though. I would recommend watching that.

It is true, watch this video
youtube.com/watch?v=0tfTZjugDeg

>CPU utilization left over.
Very important thing too; those extra cycles can and will be used when multitasking, preventing frame drops.

First off, what a terrible bench; the 1080 is just ahead of the 980 Ti.

And the Fury X is close to a 980 Ti.

please kys yourself

>kurry aid

LMAO!

>video games

we have separate boards for a reason, you fucking faggot.

I disagree.
They're deliberately misleading. First, they shifted the focus of CPU reviews onto gaming to an extent I've never seen before. Not to say that games never featured in CPU reviews, but I'm an old fag who's never seen Cinebench or wPrime, Intel's favorites up until January 2017, lose the limelight to games like this.
Second, and this is very debatable, the methodology and especially the choice of benchmark games always seem to carry one or two outliers that end up skewing the final averages toward green+blue. I say it's debatable because, regarding gaming benchmarks, it's unarguable that their choices usually represent very popular titles, but any moron who has played with graphs and averages in Excel for 10 minutes knows it's quite easy to paint the scenario you desire and still preserve an unbiased façade; "there are lies, goddamned lies and statistics".
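On the outlier point, a quick illustration with made-up numbers (not from any review) of how a single cherry-picked title drags the headline arithmetic mean while the median of the very same data barely moves:

from statistics import mean, median

# Relative results for a hypothetical CPU X as a % of CPU Y across a game suite.
balanced_suite = [103, 98, 101, 97, 102, 99, 100]   # roughly a wash overall
with_outlier = balanced_suite + [135]               # add one lopsided title

for label, data in (("balanced suite", balanced_suite),
                    ("with one outlier", with_outlier)):
    print("%-17s mean %.1f%%  median %.1f%%" % (label, mean(data), median(data)))
# balanced suite    mean 100.0%  median 100.0%
# with one outlier  mean 104.4%  median 100.5%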

You haven't even seen my final form.

This is just the beginning.

The review covers both.

But we already know that Zen shit stomps for servers and workstations.

Please stop making threads about Ryzen, a failed and outdated CPU. The piece of shit is a massive flop, and AMD will soon be bought out by INTEL.

Why are you triggered?

Say that about the 295X2.

It's definitely AMD's fault here for not allowing motherboard manufacturers and GPU makers to test their CPU.
Although, as Level1Techs notes, AMD has every right to be paranoid about these companies and keep their shit secret until launch. So frankly, it doesn't matter. I wouldn't recommend buying either Ryzen or Kaby Lake for a couple of months, because they are both riddled with a host of problems that will get fixed eventually.

Cue in: Joker will be the next one to release something that just proves he didn't even understand the subject; Tech Yes City will come next, probably counter-arguing his old boyfriend; Paul and Bitwit will miss the point entirely on their weekly show; Wendell will get invited to another podcast and will talk some sense, although briefly; Linus will do his weekend show and won't bring anything new after a whole week of reddit freaking out over this topic; and JayZ will release a 3-minute vid making a brief commentary on this, because he knows he's a moron but he's quite good at skipping the bandwagon to save face, and he'll bring the abridged and digested version of this debacle, usually a reasonable closing point.

Look at the resolution, you dumb shit.
The Fury, and GCN in general, scales far better with resolution.

That's why the RX 480 handily beats the 1060 on average at 1440p, and the Fury X handily beats the 980 at 1440p.
So at 4K, yeah, the Fury X can certainly beat the 980 Ti in some games.

You have no clue what you're talking about, and this whole thing of GPUs performing differently relative to one another at different resolutions perplexes you, so why the fuck are you commenting?