Ryzen 5 series performance is literally half that of competing Intel CPUs

>Ryzen 5 series performance is literally half that of competing Intel CPUs
Where were you when AMD is kill?

>spic
no thanks

I don't think we're looking at the same chart.

it looks really nice considering it costs half as much

These cherries sure are delicious, mmm. Now, what were you saying?

Are you even trying? You seem lazy.

Hey, you cherrypicked the wrong chart. Let me help you with that.

Total Shill: Intelhammer is my favorite game.

>Battlefield
>Doom
>Metro Redux
>Rise of Tomb Raider
>Total War Warhammer
>Total War Warhammer

Okay, but now, where are the benchmarks of more serious applications? Like Blender, Mathematica, PovRay, h.265 transcoding...

Can Ryzen render bewbs that big?

There are some benchmarks out from Puget Systems, but they're pretty old. Solidworks and Lightroom results were really poor on Ryzen systems, which may suggest a rebuild is necessary for this platform and that they're significantly affected by special optimizations.

...

Looks pretty good.
What's the problem?

he's an intel shill

>Novidya drivers

Discarded

>ROTR
>1600 gets 20 fps more

it's competing against an i7

Chapuzas?

at 3.6GHz with 2400MHz RAM

it costs less than an i5

>literally half that of competing Intel CPUs

Oops, you forgot to post the chart showing this.

>Nvidia the way it's meant to be gimped!

why are they comparing a 1600 to a 6700k?
the 1600 is over $100 cheaper
almost $150 cheaper if you include the free cooler

what is this nigger shit?

i just want a reviewer to compare CPUs that are the same price. is that so fucking hard?

Really makes you think.

So is he just a dumb shill faggot?

>why are they comparing a 1600 to a 6700k?
>the 1600 is over $100 cheaper
No, it's over $250 cheaper.

6700k+z270 motherboard+decent cooler is about $550.
1600+motherboard = $290.

>when crying intelshills lose touch with reality so bad they post images that don't even remotely support the 'points' they try to make

Is this the site that used 4GHz RAM for the Intel processors and 2GHz RAM for the AMD ones?

How many times are we going to go through the whole "AMD releases a line of products, lackluster launch, drivers improve, it BTFOs the competition" cycle until people get it?

Ryzen just needs better drivers, and when devs start taking advantage of its strengths it'll be GOAT (just like their GPUs)

Until AMD gets its shit together and launches a goat product from the get go?

Instead of having to wait for the performance increase that inevitably comes?

AYYMD RYPOO HOUSEFIRES

But that'll never happen because they invest in future technologies

AMD Fine Wine technology is real for a reason.

fuck yes broheim, 1600 confirmed for 60fps 1080p ultra
works better on red cards
>RX580 1500MHz $200
75Hz FreeSync monitors are cheap
this is looking good

Then we will go through this every fucking time until AMD gets a fucking clue that building for tomorrow today doesn't mean shit to a customer who just wants the best right now.

Why does the R5 get such a massive increase over the R7 in tomb raider?

Because Tomb Raider is fucking broken since Nvidia got involved.

For gaming it really feels like the 1600(x) is the one to get.

Glad i waited.

How lame, they didn't even overclock the i7 to 5GHz or more. How is that fair.

you know some devs aren't lazy and are willing to optimize their games, right?

>Is this the site that used 4GHz RAM for the Intel processors and 2GHz RAM for the AMD ones?
Pretty much.
3466 for Intel and 2400 for AMD, iirc.

Because they didn't retest the 1700; they just reused past results for some of them. For that game, it's just because the engine is retarded and schedules threads wrong.
I'm sure there's lots more fucked up with their methodology, BIOS, Windows settings, etc., as well.

The R7 is bottlenecked by AMD's retarded Infinity Fabric design

Fair enough, but anyone who knows better and is choosing between a 7700K and a 1700X should just go with the 1700X; we know how it'll end.

I literally got it for free with an FX-8350

How's your fourth generation Skylake?

>Competing Intel CPU
>CPU that costs 50% more and has on average 10% higher performance in games and 20% lower performance in multithreaded apps.

Selling promises is never a good business strategy. Sell results.

The i5 is going to be interesting. It will have higher maximums but lower minimums, and because of that the averages will be in favor of the 1600.

Does that mean this test is more unfair to the i7 than any of the others because of optimisations made for it since release?

Fucking shills

Should average FPS in gaymes even be shown?

0.1% minimums, 1% minimums, and 10% minimums should be the only numbers anyone cares about.

It's meaningless to get 250 max FPS when it still keeps dropping to 80 on a 144Hz monitor.
It's meaningless to get 90 average when it drops under 60.

It looks like the 1600, especially with an OC to 3.8-3.9GHz, is not going to drop under 60 in just about any game, while a 7600K OC'd to 4.8 or even 5GHz does on tons of them.
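For anyone wondering how those 0.1%/1% minimums actually get computed: below is a minimal Python sketch, not any particular reviewer's pipeline. The function name and the millisecond frametime units are assumptions; the idea is just to sort the frametime capture and report the FPS at the slowest cutoff.

```python
# Minimal sketch: derive "1% / 0.1% lows" from a frametime capture.
# Assumes frametimes are in milliseconds; names are illustrative.
def percentile_low_fps(frametimes_ms, pct):
    """FPS at the slowest `pct` percent cutoff of a frametime capture."""
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, round(len(worst) * pct / 100))     # size of the worst slice
    cutoff_ms = worst[n - 1]                      # frametime at the boundary
    return 1000.0 / cutoff_ms                     # convert to FPS

frametimes = [16.7] * 990 + [33.3] * 10   # mostly 60fps, a few 30fps dips
print(percentile_low_fps(frametimes, 1.0))   # ~30 fps "1% low"
print(percentile_low_fps(frametimes, 0.1))   # ~30 fps "0.1% low"
```

Note the plain average of that capture is still ~59 fps, which is exactly why the lows matter more.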

Either an Intel shill, or at the peak of Mount Stupid

oy vey let's pay $1000 for 8 cores instead of $300
r5 also uses infinity fabric, with a 3+3 configuration

it only shows that TW: Warhammer is apparently a huge pile of shit, with that huge a discrepancy.

>blue: avg fps
>blue: min fps
ok

oiiiiiiiiiii veeeeeeeeeey

Shit, I just noticed this. He better delete that fucking video now.

The problem with selling promises, and being AMD, is the whole "over a decade of failing to deliver" thing, m8.
Personally I'm waiting a month to decide on my build, to give time for shit to come out and be improved. However, not every buyer is willing to do that.

>The R7 is bottlenecked by AMD's retarded Infinity Fabric design

The R5 uses the IF crossbar too, dumbass. It's the exact same dual-CCX setup, just with one core disabled in each.

They shouldn't.
They should compare the 1400 with the 6700K
so it's on equal terms, 4C8T vs 4C8T

>gaymes are the only measure of performance
who cares, any NEET with that much time to play games will buy a G4560 because that's all their saved allowance can afford

I'm going to downgrade my i7-2600K to a G4560 just to spite you.

1700 should be compared to the 7700k because they're most similarly priced!

1400 should be compared to the 7700k because both are 4c/8t!

EYE FIVES ARE THE BEST AND NOTHING CAN BE COMPARED TO THEM.

do it

i spent my neethood a few years ago with a 3770K

shit was cheap back then because strayan dollars were worth 10% more than those of the burger

>graphs are actually green and red

kek

why the fuck doesn't having 2 more cores have any effect on performance? fucking game developers.

so the 1600X will perform like the 1800X and be faster than the 7700K? looks sweet desu

Are you retarded?

The 1060's min fps is way higher than the overrated RX480's

>Doom (OpenGL)
>terrible

Welp I was planning to buy a Ryzen for my OpenGL work but nevermind then.

>1800X faster than the 7700K

Go kys delusional faggot

that's how he got fat hehe?

You're supposed to compare the RX480 on the 1800X vs the GTX 1060 on the 1800X, pleb. The RX480 on the 1800X scores higher than the GTX 1060 on the 1800X, showing that Nvidia is gimped in DX12.

You tech illiterates really surprise me every time.

You don't understand jack shit and try to cover it up with buzz/meme words.
Why don't you at least read one article on the matter instead of showing how dumb you are.

Goes for the rest of you guys too.

Sure you were... sure you were.

>hurr durr
>you just don't understand!

Why is "minimum framerate" even a stat, assuming it's literally the lowest the framerate was measured at? Surely a more useful measure would be the average framerate of say, the highest 2% of the frametimes.

>hurr durr
>muuuh gimping

Anyone using that word should be permabanned from this board, as that person obviously doesn't possess even superficial knowledge about core tech.

>0.1% minimums, 1% minimums, and 10% minimums should be the only numbers anyone cares about.

I agree these should have priority, but I wouldn't discard the average completely

>muh 25% single core
It gets shrekt in anything even remotely multithreaded and that's facts. Faggot. p.s. kys

people don't like hiccups, lag, etc.

a single slow frame isn't a "hiccup".

It doesn't explain whether it's the lowest group of frames over a second, 0.1% minimums, 1% minimums, or just what the framerate would be if you took the single slowest frametime.

Why do you even comment when you don't know what that guy is asking?

Because it doesn't matter if you have 100 average fps if you go down to 30 min fps in the most intensive moments

You missed the point. I'm saying that the "minimum framerate" could easily be a statistical outlier. If it were, say, the average of the bottom 0.1% or 1% or whatever, that'd give a much better picture (sketched below).

You're right, but we're not the reviewers. So unless people start demanding it, we're not going to see it become standard
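For what it's worth, here is that suggestion sketched out under the same assumptions as the earlier snippet (milliseconds, illustrative names): average the slowest 1% of frametimes instead of reporting the single worst frame, so one outlier can't define the number.

```python
# Sketch of the proposed metric: mean FPS over the slowest `pct` percent
# of frames, which a single outlier frame can no longer dominate.
def tail_average_fps(frametimes_ms, pct=1.0):
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, round(len(worst) * pct / 100))
    mean_ms = sum(worst[:n]) / n                  # average the worst slice
    return 1000.0 / mean_ms

frametimes = [16.7] * 999 + [100.0]               # one 100ms outlier frame
print(min(1000.0 / t for t in frametimes))        # "min FPS": 10, pure outlier
print(tail_average_fps(frametimes, 1.0))          # ~40 fps, a steadier picture
```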

That's an AMD game dumbass. Nvidia isn't humping anything.

>Being a phoneposting moron
>Getting cucked by autocorrect
>Being too dumb to read a chart
>Or correctly infer its meaning
How embarrassing for you.

>looking at amd 5

found the poorfag

>H-he's going to get a cheaper and better CPU than me
>Better call him a poorfag!

but everyone, buy a 7600k! no applications on your computer will ever use over 4 threads!!!!

>a single slow frame isn't a "hiccup".
it's not a single frame, don't be stupid
it's average minimum fps, meaning how low it will go on average

the ideal measure for cpu performance would be a frametime graph an hour long
because just measuring averages is retarded and doesn't tell you when each frame happens in time. it can go from 190 to 70 (or from 70 to 40, where the average would be ~55) every other frame, which would make it unplayable, but the averages would make it look like it's running at ~130 fps
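A quick check of those numbers (the frametime pattern is the anon's hypothetical, not a measurement):

```python
# Alternate "190 fps" and "70 fps" frames, then compare the naive average
# of per-frame FPS values with the true frame throughput over wall time.
frametimes_ms = [1000 / 190, 1000 / 70] * 500     # alternating fast/slow

naive_avg = sum(1000 / t for t in frametimes_ms) / len(frametimes_ms)
true_fps = len(frametimes_ms) / (sum(frametimes_ms) / 1000)

print(round(naive_avg))   # 130: the number a bar chart would show
print(round(true_fps))    # ~102: and neither figure reveals the judder
```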

fun thing with the R5s is that you get a 6850K ($500) for $250

How is the 1700X losing to the 1600?

It's an older benchmark from before updates improved performance.

The Nvidia single-threaded driver plus Total Warhammer's first-core usage is the main cause of the performance differences.

Total Warhammer (and many other games) typically use the first core/thread for the main task and the other threads for auxiliary systems.

The Nvidia driver also runs on that first thread/core.

What would fix this discrepancy is if Nvidia changed the driver's thread affinity to whichever core is used least, or at least moved it off the default thread.

That would bring the performance back in line.
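Nobody outside Nvidia can move the driver's thread, but the affinity mechanism itself is simple. A minimal Linux-only Python sketch; the pin_current_thread helper and the choice of core 0 as the "default" core are assumptions for illustration:

```python
# Pin a busy worker thread away from core 0 so it doesn't contend with
# whatever is hogging the default core. Linux-only: os.sched_setaffinity
# accepts a native thread id there.
import os
import threading

def pin_current_thread(cores):
    """Restrict the calling thread to the given set of CPU cores."""
    os.sched_setaffinity(threading.get_native_id(), cores)

def worker():
    available = os.sched_getaffinity(0)                # cores we may run on
    pin_current_thread(available - {0} or available)   # avoid core 0 if possible
    # ... actual work goes here ...

t = threading.Thread(target=worker)
t.start()
t.join()
```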

...

Give it to me straight.

will the 1500x be any good?

Uhhh, isn't the r5 1600 $220?

Best bang for the buck on the market.

yes

>people still testing BF1 DX12 on Nvidia cards

is that a catheter in the middle?

>not spending that extra $40 for 2 more cores

Some people don't need the higher power draw for just gaymen.
Some people don't want to OC, and the 1500X boosts to 3.9GHz stock.

It also has more L3 per core, so it theoretically should have higher IPC (quick math after this post).
It's also better binned (seems confirmed by people getting 1400s and 1600s early and doing OCs), so if you do OC you can probably get another 100-200MHz out of it compared to the 1600.

As nice as $30 more for 2 more cores is, someone who doesn't need them ought not to spend the money. Especially when they could just get a 2nd-gen Ryzen Athlon next year that's a single CCX and can clock to 4.5GHz or so, and just use the 1500X as something good enough for now.
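The "more L3 per core" arithmetic, for the skeptical. Cache sizes are from memory (both R5 parts shipped with the full 16MB of L3 enabled across their two CCXes); verify against a spec sheet before relying on them:

```python
# L3 cache per core, 1500X vs 1600. Sizes assumed, not sourced here.
l3_mb = 16
print(l3_mb / 4)  # 1500X, 4 cores: 4.0 MB of L3 per core
print(l3_mb / 6)  # 1600,  6 cores: ~2.67 MB of L3 per core
```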

If you have a functioning PC at all you should just wait for R2, don't be an early adoptard