AMD's 6 core

>AMD's 6 core
$215
$70 for motherboard
$285 total

>Intel's 6 core
$389
$219 for motherboard
$608 total
>Over twice the cost and
>IT'S FUCKING SLOWER

No wonder Intel priced these so high. Only a handful of idiots will buy them, and they need to make as much money from those idiots as possible.
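For what it's worth, the cost math above checks out; a trivial sketch using the prices quoted in the OP (boards unspecified beyond their prices):

```python
# Platform cost comparison using the prices quoted above.
amd_total = 215 + 70      # R5 1600 + $70 board
intel_total = 389 + 219   # i7-7800X + $219 board

print(f"AMD:   ${amd_total}")                          # $285
print(f"Intel: ${intel_total}")                        # $608
print(f"Intel / AMD: {intel_total / amd_total:.2f}x")  # ~2.13x, i.e. over twice the cost
```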

Other urls found in this thread:

techspot.com/review/1450-core-i7-vs-ryzen-5-hexa-core/
youtube.com/watch?v=UfNMn7RWgLw

Why are you not factoring in delidding costs and the risk of frying both your socket AND CPU like Steve did?

Who the fuck is Steve

The guy who did the benchmark you posted, retard.

I stopped taking OCed benchmarks seriously. OCing is a huge waste of time for a 1-2% improvement in framerates in most cases. Tons of troubleshooting and extra equipment for the most marginal of gains; just get a better CPU.

>Intel's 16 core
>$700 on ebay for an entire system

And it's slower while using way more power. What a garbage CPU.

Just give it slightly more voltage than you really need instead of trying to keep it on a razor's edge.

PUBG's update was worth it. Arma 3 and Argo updates when?

>4GHz overclocked 1600 uses less power than stock 7800X
makes u think

No it doesn't. It's the process it's built on. It's also why it basically can't go past 4GHz clock speeds.

>benchmarking CPUs with a GPU limited test
really makes you think

>if shitel doesn't win the benchmark is not valid

>Ryzen 1600: 139 fps
>Ryzen 1600 @ 4.0 GHz: 139 fps
So you're telling us the 1600X, 1700, 1700x, and 1800X are total scams?

>if shitel doesn't win the benchmark is not valid
Before the update this game was a sin to mention. Now it's okay to post, though, since it's GPU bound.

Not a scam, just not a necessity if you want to only play gaymes with your build.

$608 for a shit 6-core is a scam, though.

Whatever goy. Buy intel pls.

>game probably doesn't use more than 8 threads
>minimum fps changes according to clockspeed
>hurr-durr it's gpu limited

oy gevalt

Congrats, you just discovered why having a 5GHz i7 doesn't help in most games.

No it doesn't what, retard?
It says right there that the stock 7800X consumes more power than a 1600 overclocked to 4GHz on all cores.
That's an easy-as-fuck to read graph. There is something seriously wrong with you if you can't read it.

If it was GPU limited then the numbers would be the same.

Is everything ok anon? You just sort of

>Whatever goy.
>Buy intel pls.
k. Good luck with the thread friend.

The worst part about this is that that's total system power consumption

GPU+chipset+aux is probably ~250 watts.
So it's more like 128 watts for the overclocked 1600 and 196 for the 7800X. Roughly 50% more power for often worse performance.
What a disaster. Nice mesh fabric, Intel.
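For anyone following along, a minimal sketch of the estimate being made here; the ~250 W non-CPU figure is the poster's guess, and the system-level readings below are hypothetical values chosen to reproduce the 128 W / 196 W numbers, not measurements taken from the chart:

```python
def cpu_only_watts(system_watts, non_cpu_watts=250):
    """Back out a rough CPU-only draw from a wall-meter reading by
    subtracting everything that isn't the CPU (GPU, chipset, fans, PSU loss)."""
    return system_watts - non_cpu_watts

# Hypothetical total-system readings that would give the figures quoted above.
print(cpu_only_watts(378))  # 128 W for the 1600 @ 4 GHz
print(cpu_only_watts(446))  # 196 W for the stock 7800X
```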

I should be asking you that question. What you said doesn't even make sense, while all I said was that high single-core CPU performance is pointless in video games because they're bottlenecked by the GPU 99% of the time.

>minimum fps changes according to clockspeed
Please learn to do second grade math. The largest change is a 1.9% increase for a 34% increase in clock speed. It's GPU limited.
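The percentage math, for anyone who wants to check it (the fps and clock figures here are hypothetical numbers picked to match the percentages being quoted, not the chart's actual values):

```python
def pct_change(old, new):
    """Percentage change from old to new."""
    return (new - old) / old * 100

# Hypothetical figures close to the quoted percentages:
# minimum fps 105 -> 107 while the all-core clock goes 3.0 GHz -> 4.0 GHz.
print(f"{pct_change(105, 107):.1f}% more fps")    # 1.9%
print(f"{pct_change(3.0, 4.0):.1f}% more clock")  # ~33%
```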

It's obviously not GPU bound when minimums are concerned.

You don't seem to understand how games utilize CPUs. It's not a perfectly parallelized task; dips do not scale with clocks one-for-one.

Anything that's CPU limited will -- get this -- scale with CPU.

Only somewhat in the average framerate. If the dips are induced by the CPU, like in that PUBG benchmark, they will not necessarily scale with clocks, and may come from other places - like memory controller or driver issues.

Then it's not a CPU benchmark, moron.

Every benchmark is a CPU benchmark, you ARE a moron if you cannot comprehend the difference between averages and minimums.

I am sort of silly too wasting my time arguing with a literal retard, though.

What you're looking for is a synthetic benchmark. Try cpuboss.com
>but muh future tellings of cpu
All a CPU-bound video game benchmark will tell you is how that specific game runs today. Video games aren't going to be pushing a CPU until we get another Supreme Commander.

>Every benchmark is a CPU benchmark
Ah yes, let me just run CrystalDiskMark on one of the laptop drives I have in my i7 4790 and on the SSD I have in my i5 750, and we'll see that the i5 750 is a faster processor than the i7 4790.

Wasn't Supreme Commander single-threaded? That hardly pushed the CPU; it just pushed one core.

>w-wait for coffee lake ..

Yes, but at the time it was (and probably still is) the most you can push a CPU using a video game.

No way, there are games that can push more than 8 cores. Even shitty ones like Ashes of the Singularity. The only thing Supreme Commander is gonna push is your 50 dollar Pentium.

Saying a game is CPU heavy is vague and not informative. It could be heavily single-threaded and max out even high-end CPUs on one core. Most games don't use cores evenly either; even if the game uses multiple threads, it might still max out one. Others only use up to a certain number of threads, and then diminishing returns hits hard. Others might require a minimum number, most commonly 4 now, to play the game with acceptable performance. A game doesn't have to literally show 100% CPU usage to scale off of CPU power.
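If you actually want to see this on your own machine, a rough sketch using the psutil library (not anything from this thread's benchmarks): sample per-core load while the game runs, and a single-thread-bound game shows one core pinned while the rest idle.

```python
import psutil

# Sample per-core CPU utilization once a second for ten seconds while a game runs.
# One core near 100% with the rest mostly idle = effectively single-thread bound,
# no matter how many cores the CPU has.
for _ in range(10):
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    print(" ".join(f"{c:5.1f}" for c in per_core), f"| busiest: {max(per_core):.1f}%")
```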

>40% more FPS at 480p low
How will AMD ever recover?

OY VEYYYYY

WE NEED THAT MONEY TO SUPPORT ISRAEL YOU STUPID GOYIM.

STOP BUYING AMD

You're not funny.

>not playing at 320x240

>Not testing in DX7 even if it's not supported

>muh CPU bottleneck

Well, if results with Nvidia's 1080 Ti don't matter because of a "CPU bottleneck", what does matter, then? Two Titan XPs in SLI at 1280x720?

techspot.com/review/1450-core-i7-vs-ryzen-5-hexa-core/


>cpuboss

Get out

>this massive stutter on a 4 core

I thought 7700k was the KANG of gaming???

>PUBG

a game optimised like shit being used as a benchmark

L O L

Only the KANG when it's not thermal throttling :^)

>stutterlake-x

Sasuga, Intel.

Jesus christ, what the fuck happened?

>benchmarking with settings people actually use instead of arbitrarily low resolutions where context switches matter more than throughput
Oh no, can't have that.

nu-intel happened, enjoying them (((innovations))) goy?

Always the same... AMD must use strange tests and settings so it comes out better... Not honest, not at all.

Working hard for this $10 Starbucks card I see..

>FOUR CORES ARE ENOUGH FOR GAYMEN
AHAHAHAHAAH

I implore you to head over to the video and leave the same comment; I'm sure you'll find a lot of like-minded people who will definitely agree with your assessment. youtube.com/watch?v=UfNMn7RWgLw

Frametime spikes past 38 ms are dips below ~26 FPS. Let that sink in.
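The conversion is just the reciprocal of the frametime; a quick sketch:

```python
def frametime_to_fps(frametime_ms):
    """Instantaneous FPS for a given per-frame render time in milliseconds."""
    return 1000.0 / frametime_ms

print(frametime_to_fps(38))    # ~26 FPS during a 38 ms spike
print(frametime_to_fps(16.7))  # ~60 FPS when frametimes hold at 16.7 ms
```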

>y-you're cherrypicking
>it only happens a few times
>m-muh clocks

This is why I hate reviewers using 1% minimums: they hide this type of stutter.
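A rough illustration of how that masking works, with synthetic frametime data (assumes numpy; not data from any benchmark in this thread): one nasty hitch barely moves the 1% low, but shows up clearly as a worst-case frametime.

```python
import numpy as np

# Synthetic trace: ~100 seconds of steady 16.7 ms frames (60 FPS) with one 120 ms hitch.
frametimes = np.full(6000, 16.7)
frametimes[3000] = 120.0

p99 = np.percentile(frametimes, 99)                  # cut-off used for the "1% low"
print(f"1% low: {1000 / p99:.1f} FPS")               # still ~60 FPS
print(f"worst:  {1000 / frametimes.max():.1f} FPS")  # ~8 FPS during the hitch
```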

You forgot to add a heatsink to the Intel cost.

Frametime measures the time it takes to render a frame.

you don't say?

Thank you PS4 and Xbox One, can't wait for more and more lovely console ports.

>quadcore
>enough in 2017

Just pointing out for people

>not running it headless

>not playing with two Titan XPs on an i3
pleb

>Measuring FPS instead of the CPUID

*Vendor ID

>techspot.com/review/1450-core-i7-vs-ryzen-5-hexa-core/

That's great. Holy shit the 1600 is awesome.

But what the fuck is happening with Deus Ex: Mankind Divided on DX12? Wasn't it supposed to run better than on DX11?

>Wasn't it supposed to run better than on DX11?
>square
Ass backwards game from ass backwards publisher.

>dx12
>running better than dx11
Maybe on paper.

Ryzen, Nvidia, and DX12 are gimped when all 3 are put together. This will be the case until Nvidia patches their drivers or AMD releases a better video card.

Nvidia has fake dx12. Sometimes it gets gimped by single-core performance.

And yet the $215 R5 1600 shits on the head of the $400 i7 7800X on Total War: Warhammer while using DX12 too?
All of my keks

Still pulls more frames than Polaris and Vega.

It depends on how the game uses the API and how Nvidia's drivers work with that particular API call order.

It's not single core performance that's gimping it.

The X-series are scams. Literally small factory overclocks that you can do yourself with minimal effort.

The 1700 and 1800 aren't really shining in gaming, but anyone serious about it can make use of those extra cores.

Jaguar cores are meaningless just like all Bulldozer cores were.

INTEL REBRANDXEON HOUSEFIRES

>small low power OoO core is meaningless
?

No, I mean Jaguar suffers from the same issues that plagued the FX CPUs: the module architecture was designed poorly, which cripples multithreaded performance.

Bobcat uses normal x86 OoO cores instead of "modules", you cockgobbling moron.

Then why do console CPUs perform worse than an Intel dual core in games? Also, console games don't even use all of the console's cores anyway; most use at most 6. Using current-gen consoles as an example for games to follow is bad, since the AMD CPU is easily the worst aspect of the PS4 and cripples the entire platform.

It's a LOW POWER core.
Do you know what LOW POWER is?

It literally says "modules" right here. PS4 and Xbone CPU's aren't "8 cores" anymore than the Bulldozer trash was. AMD CPU's being in consoles hasn't helped game optimization at all, it's held gaming back if anything. Current gen games aren't any more impressive than last gen, the CPU's are just too weak.

Please, idiot.

Please stop.
Please.

Bobcat is not Bulldozer, you retarded cockmongler.

>Has no clue what he's talking about
>Links Wikipedia screencap

Come on

>video game "benchmark"
>gpu not listed for each system
Why do people do this?

...

>gpu not listed for each system

see

It's a 1080 Ti.

Why is it not on the fucking benchmark table? I just don't get it.

Because it's not relevant; there are no better graphics cards out there that would make that information relevant.