Reasons to buy Ryzen instead of a cheaper, slightly better 7700K

>reasons to buy Ryzen instead of a cheaper, slightly better 7700K

There are none. Cores are a meme. Those faggots who wasted their money on an i7 2600K back in the day are left wishing today that they'd just bought an i3 2100 instead, and so it will be with people who choose Ryzen over Kaby Lake.

MOAR CORES

Oh wait...

Don't be stupid. Those CPUs don't only have more cores. They also have enormously more cache, for example.
Of course, going for the -E line is falling for +10% performance at +90% more money.
Also, 4/8 processors are still the sweet spot, and they will remain so for a while.

>reasons to buy 7700k instead of a much cheaper, slightly better ryzen5

I swear to fucking god, if I see you posting that stupid benchmark again with that filename, I'm going to kill you. It's like I can't browse a single thread without this shit or the other retard posting about Netflix.

truth hurts I bet.

This! 4/8 will remain the gold standard until Intel tells me otherwise. Anything more is a meme. Cores literally don't do anything over a certain amount.

>Dual cores are the future!
Why isn't everyone still using 1c/2t Bentium 4 HTs?

CAN YOU NOT shitpost in this thread please, it's for serious discussion only. Thanks.

>until Intel tells me otherwise.
No, it's just what benchmarks prove. E.g. the AMD shills here often post benchmarks that supposedly show more cores being better for games, but the same graphs show the 7700K being only about 4% worse. That alone proves that NOT getting a 7700K instead of a 6900K, for example, is utterly stupid in cost-effectiveness terms, since you can pay less than $400 and get something barely behind a $1,000 chip.
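Back-of-the-envelope, using the figures quoted in this post (~4% worse, under $400 vs $1,000):

\[
\frac{0.96}{400} \approx 2.4\times10^{-3}\ \text{perf}/\$
\qquad\text{vs}\qquad
\frac{1.00}{1000} = 1.0\times10^{-3}\ \text{perf}/\$
\]

i.e. roughly 2.4x the performance per dollar, which is the whole cost-effectiveness argument in one line.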

Why even have dual cores
there is literally no reason to have more than 1 core

Obvious shitposting aside (until AMD tells me otherwise), 4c/8t is legitimately on the small side. Threads are irrelevant to a discussion about core count; they're equivalent to a 10-30% IPC boost. They aren't real cores.

The sweet spot today is 6 core processors. Most games will use 4, leaving you two for background programs and for future games.

Of course, for any actual work, it basically boils down to more coars

I'm using a dual core 15W celeron and it's great

For Ryzen, the 8/16 might be a poor buy compared to their 6/12, partly because 6 cores might overclock better than 8.
At the same time, though, if their 4/8 is binned well (e.g. the "X" model, 1400X), it might actually have good potential.

This is not speculation: we KNOW the 7700K is extremely cost-effective, since it's only 8% worse than $1,000 chips in games that can use multithreading well.
If the 1400X can get near it, it might be a sweet spot too.

In any case, either the 1400X or the 6/12 (1600X?) will be the sweet spot for most regular Sup Forums people: gamers, but also desktop power users. The 8/16 will have overclocking issues for sure, and very few games can even use it fully.

The 8/16 apparently overclocks well on motherboards with sufficient VRM; it's heat-constrained (as opposed to hitting a voltage or architecture wall).

So giant Noctuas or triple rads might get 4.6GHz out of an 1800X.

You guys are retards. Everyone knows games are poorly optimized for multiple cores because devs are lazy shits. Everyone knows by now that Shilltel's single-core performance is still slightly ahead. The point is you're getting almost the same performance in muh games for less money, and way, WAY better performance in multithreaded applications (aka real work).

>what is vulkan or dx12

Games are getting better about threading. CS:GO is probably the oldest game that can effectively use eight cores.

>devs are lazy shits.
That high-school-kid meme again. It's not because they are "lazy". Go learn what a mutex lock is, and then realize an interactive application needs thousands of lock acquisitions per second just to keep its shared state from being corrupted.
They are not video encoding.
Those locks slow them down.
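A minimal Python sketch of that point; the names (world_state, update_entity) are made up for illustration, not taken from any real engine. Every thread touching shared game state has to take the lock, and whatever runs under the lock executes serially no matter how many cores exist:

import threading

# Shared mutable game state: every worker reads and writes it.
world_state = {"entities": [0] * 1000}
state_lock = threading.Lock()

def update_entity(i):
    # Without the lock this read-modify-write is a data race;
    # with it, this section runs serially across all threads.
    with state_lock:
        world_state["entities"][i] += 1

def worker():
    for i in range(1000):
        update_entity(i)

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(world_state["entities"][0])  # 4: each of the 4 threads applied its update exactly once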

If you gave devs a button where all they had to do was click it to optimize perfectly across all available cores and multiple gpus, I still don't think they'd click it. Idk

Only performs better in gaymens. And gaymen is for fags. You aren't a faggot are you user?

In general, games will always depend on serial performance to a large extent. Interactive applications are de facto a web of interactions that cannot be fully parallelized. They are not offline rendering; the user affects the data on every move.
Sure, you can improve it over time, and some methods are cleverer than others, but you will NEVER make a game 100% parallel.
tl;dr: it's hard, but partly depending on serial performance is also inevitable.
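For reference, this is exactly what Amdahl's Law (brought up later in the thread) formalizes: if a fraction s of a frame's work is inherently serial, then no matter how many cores N you add, the speedup is capped:

\[
S(N) = \frac{1}{s + \frac{1-s}{N}}, \qquad \lim_{N \to \infty} S(N) = \frac{1}{s}
\]

So a game loop that is even 25% serial can never run more than 4x faster, whether it gets 8 cores or 800.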

stop posting kid
youtube.com/watch?v=r0fgEVEgK_k

I disagree with you, man. I'm not him, but many developers don't try very hard to optimize their games simply because computers are so powerful today (e.g. go look at some old PSP games; I don't remember exactly how much RAM or processing power they had, but the games looked pretty good even though the specs were very weak).

I'm not talking only about CPU optimization, but about code optimization in general.

Oh look it's you again
>Physics is serial
>Visual effects are serial
>AI is serial
>None of them can be run on separate cores
>Amdahl's Law is gospel despite multiple desktop applications exceeding its projected gain

Do Intel shills never sleep?

Kind of hard to sleep while you're busy shelling pakistanis

So, if you go Intel, you never have to sleep? Even if performance is lower, that is kinda tortoise and the hare, isn't it? We shall persevere with our lack of tiredness! AMD BTFO!

you won't be feeding the jewish machine

I'm so sorry.

>Still worse
>Still the same price

DAMAGE CONTROL

STOP STOP STOP YOU'RE COMMITTING ANOTHER HOLOCAUST STOP IT THINK OF THE SIXTY TRILLION WHO DIED DUE TO NOT BEING BORN

Oy vey, this is not fair, we're the chosen people, the people of God, this can't be happening to us.

NO NO NO NO NO NO NO NO NO NO NO NO NO NO NO NO

>worse
literally within margin of error. And GTA5 cannot into 8+ threads.

Ryzen at its worst is just as good as Skylake/Kaby Lake by the looks of it.

>the only valid measurement of CPU performance is entertainment applications whose developers struggle to scale their application performance with multiple cores

I'm not a dumb Sup Forums manchild that plays gaymes

More cores actually are extremely important, because every application I use at work is heavily multithreaded and scales very well.

>r7 1700 stock
>3 ghz
>avg fps 85

>i7 7700k
>5ghz
>avg fps 88

KEK LMAO

Not a shill or anything.
Averages are very close, but minimums matter quite a bit, since dips usually leave a stuttery feeling.
Maximum is a relatively useless stat, though.

This

Ryzen is a stuttery mess, very bad gaming CPU

Lmao now that is really pathetic

AMD's $400 CPU can't even beat Intel's $340 CPU

Ryzen is dead on arrival.

That's not what I said faggot.
I said minimum frames are to be taken into consideration.
Don't fucking use me as a leg up to propel your shilling.

Minimums are actually just as bad as maximums.
The 99th percentile frame is what you want; that'll show you what the microstutters are going to look like.
As an example, I'll give you a set of numbers where the minimum is the same as the AMD chip's but the 99th percentile frame is actually quite a bit higher.

28, 68, 79, 80, 50, 88, 89, 120, 110, 132, 100, 90, 85, 85, 88, 52, 60, 55, 72, 77, 78, 99, 30, 54, 59, 76, 88, 90, 85, 87, 88, 87, 84, 80, 81, 89, 90, 71, 77, 74, 77, 62, 58, 49

In that, the average should be around 77, maximum 132, minimum 28.

However, the 99th percentile should be around 50-60 or so; you only see the 28 for a moment as the benchmark starts up.
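For anyone who wants to check those figures, a quick sketch with numpy, assuming the usual benchmark convention that "99th percentile frames" means the fps value 99% of frames stay above (i.e. the 1st percentile of the fps samples):

import numpy as np

# The 44 fps samples posted above.
fps = np.array([28, 68, 79, 80, 50, 88, 89, 120, 110, 132, 100, 90,
                85, 85, 88, 52, 60, 55, 72, 77, 78, 99, 30, 54, 59,
                76, 88, 90, 85, 87, 88, 87, 84, 80, 81, 89, 90, 71,
                77, 74, 77, 62, 58, 49])

print(fps.mean())             # ~77.8, matching "around 77"
print(fps.max(), fps.min())   # 132 and 28
print(np.percentile(fps, 1))  # ~28.9: with only 44 samples, the startup 28 still dominates the strict 1% tail
print(np.percentile(fps, 5))  # ~49.2: closer to the 50-60 ballpark once the startup dip stops dominating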

It's not just games; the vast majority of applications are not optimized for multithreading. This is why Ryzen fails at so many basic tasks.

AMDtards mad they will never be able to play 4K Netflix.

DX12 will change all that. All we need is game devs getting off their dumb dead asses and using it as standard.

what the fuck are you on about

plays just fine on my AMD FX

>not filtering the image md5

>Minimums are actually just as bad as maximums.
That's not what I have found.
I have had games that run 120-144 fps on average, but with so much microstutter that one frame will easily take up to 10 times longer than the rest.
The result is a very nice-looking fps score, average, and even maximum, but the stutter is there, and depending on the game it can be really annoying.
Simply put, I don't care if the game can sometimes reach 4 times its average.
But I do care what minimum it hits at times, because that is somewhat of an assurance that whatever dips happen (frequent or infrequent) are not as bad.
Ideally, the closer the minimum gets to the average, the better.

In your own example, I could give you a set where a huge portion is 160 but with frequent dips to 28.
The average might look okay, but the frequent dips will feel like murder.

Again, I am simply pointing out that minimums are something to consider which may or may not indicate a bad experience (it really depends on how frequent and how bad the dips are), while maximums are usually useless.
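A toy numeric version of that point (numbers made up for illustration): a single frame taking 10x longer barely moves the average fps, but it's exactly the frame you feel:

# 1000 frames: 999 at a steady 7 ms, one 10x spike at 70 ms.
frame_times_ms = [7.0] * 999 + [70.0]

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
print(1000 / avg_ms)               # ~141.6 "average fps" - looks great on a chart
print(1000 / max(frame_times_ms))  # ~14.3 fps during the spike - the stutter you notice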

>bent pins
>chipped ceramics
>out of the box


Just say no to AMD

Uhh no it doesn't, you literally need an Intel Kaby Lake processor for 4K Netflix.