COVFEFE FAKE

hardwarecanucks.com/forum/hardware-canucks-reviews/76333-i7-2600k-vs-i7-8700k-upgrading-worthwhile.html

Oy vey! Upgrade now, goy!
Y AREN'T U UPGRADING!?

>3.7MB
>888x545
Sorry OP but I thought you should know this completely invalidates whatever this fucking thread is about.

...

i'm waiting on a 45nm phenom ii x6.
might as well wait for 7nm.

delete this post

It's not my problem that Sup Forums's rules on file size are so shitty. A 4MB limit is fucking retarded in this modern day and age.

Why don't you start by not saving the image as a png?

Nah, schlomo.

Use .jpg you fucking animal

>Deliberately gimping your image with artifacts in this modern day and age
KYS

>~3MB for a fucking 400p
Kill yourself, Sup Forums-tarded.

>4MB
>shit resolution
>learn2compress

Even though I checked 'em, you're still wrong. You're the animal if you're still schlicking to JPEG in this modern day and age. Grow up already and get out of your cave.

Not only that, the image is so small you can't even see the fucking benchmarks; it's just a bunch of memes smushed together. What a failure.

Sheesh, fine, you little whining underage fucks. Have at you!

See, it wasn't so difficult. By the way, you're wrong and this thread is shit.

lmao poorfags on capped data plans detected

It's just a smaller file format, which is useful in this context; you could have created a readable picture.
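Since half the thread can't into basic image handling, here's a rough Pillow sketch (filenames and the quality ladder are invented for illustration, not OP's actual workflow) of how you'd re-encode a screenshot to squeeze under the 4MB limit without nuking readability:

import os
from PIL import Image

LIMIT = 4 * 1024 * 1024                            # the board's 4MB cap
img = Image.open("benchmarks.png").convert("RGB")  # JPEG can't store alpha

# Walk the quality down until the file fits; text stays legible well above q=60.
for quality in range(95, 55, -5):
    img.save("benchmarks.jpg", "JPEG", quality=quality, optimize=True)
    if os.path.getsize("benchmarks.jpg") <= LIMIT:
        break

print("saved at quality", quality, "->", os.path.getsize("benchmarks.jpg"), "bytes")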

> 1440p
> 2xMSAA
Videocard benchmark.

Your image post isn't readable on my 4K display, you fucking poorfag.

Give me a 20MB picture, but at least make it fucking readable.

Cry more, Brian.

What? In the tests that really use the processor, the difference is amazing. I didn't think it was that much faster. As an AMD user, if this article proved anything to me, it's that Covfefe Lake is amazing.
>B.but muh gayms
Fuck them.

.png was smaller so i didn't use it :)

>1440p is mainstream resolution nowadays (literally in every 2-and-a-half system)
>Literally 1~8 FPS difference between the 2600K and 8700K on the same settings and same config in 8 out of 10 games
>Your benchmarks are a lie, give us 1080p!
>1080p, same exact results
>That's all while the 8700K has more cores and threads and technically should give MUCH better FPS in games which utilize more cores/threads
>WAAAAH, YOU LIE, IT'S ALL WRONG, GIVE US HANDBRAKE AND WINRAR WITH BLENDER!
Ayy lmao

Just get RyZen+/Zen APU, no need to wait for 7nm necessarily. 12nm Zen is great.

> HANDBRAKE AND WINRAR WITH BLENDER
Game benches can't ensure 100% authenticity, since some games tax the CPU more, others tax the video card more, and many of them aren't programmed to use more than 8 or even 4 threads. CPU e-peen should be measured with CPU benchmarks; games may be used just to see how well it will run on a given CPU.
BTW, WinRAR isn't good in that regard either.
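To put numbers on that: a back-of-the-envelope Amdahl's law sketch in Python (the parallel fractions below are my guesses for illustration, not measurements) shows why a mostly-serial game loop stops caring about cores long before a renderer does:

# speedup(p, n) = 1 / ((1 - p) + p / n), where p is the parallel
# fraction of the workload and n is the number of cores.
def speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for cores in (2, 4, 6, 8, 12):
    game = speedup(0.50, cores)      # game-ish: ~half the frame is serial
    render = speedup(0.95, cores)    # Blender-ish: almost fully parallel
    print(f"{cores:2d} cores | game x{game:.2f} | renderer x{render:.2f}")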

Sorry, can't hear you, dude; weren't you just bragging about your 4K display and calling me a poorfag?

>Video games
>Benchmarking anything but GPU

CPU benchmarks in games are irrelevant; all you want to know is whether your CPU will bottleneck your GPU or not.

>.png
AMD fags everyone

The point of that comparison was to determine whether upgrading from a six-year-old Intel CPU is worth buying a """new""" more expensive Inturd "flagship" mainstream CPU. Games, in this regard, are a perfect catalyst, as the 2500K and 2600K are famously GAMING processors. And that comparison concluded there was no more than a ~8% increase in gaming performance on Intel's CPUs IN SIX YEARS. Which makes Inturd's "modern" trash ABSO-EFFING-LUTELY IRRELEVANT and dead-in-the-water, considering that Zen exists. Just a daily reminder that RyZen 7 performs like TWO i7 2600Ks in sheer IPC and PPC terms, making it a MUCH better upgrade than ANY of today's Inturd offerings. What has essentially been shown is that Intel hasn't done JACK SHIT since Sandy, and ALL of those Ivys, Haswells, Broadwells, Skylakes, and the rest are SLIGHTLY TWEAKED REBADGES OF SANDY. IT'S ALL THE SAME SHIT! IT'S SANDY! SIX YEARS LATER!

Maybe Sup Forums should be the appropriate board for gaming topics?

Stop being such a retard. We're discussing the relevance of hardware here.

You don't even own a 2600K or 2700K, why do you care?

I am on 4930K

png is literally the flac of maymays, and all you faggots (including hiroshimoot with his gay file size limit) are just salty you don't have decent data plans.

Stay mad, kikes.

KYS, fag

I'm "upgrading" from a 2500k to a 8350k and migrate from ATX to mATX. Blow me, OP.

>"upgrading" from a 2500K to a 8350K
The only person you're blowing here is yourself, kid.

[spoiler]I would if I could[/spoiler]

I'm not on either side beyond personal taste, but I hate that OP wasn't able to post the images properly, so in the next few minutes I will be posting them.

Not only are you a shit-eater (an 8350 in 2017, lel), but also a fag. I won't be surprised at all if you also have some Apple shitware.

Only took you 20 minutes to find a pic on the web? Impressive!!
Your shit's still 25-30% slower than a 7700K, let alone an 8700K.

The 8350K is an i3, not an FX. Learn to double-read.

Read the thread, dumbo.

You seem to be mad and uninformed. I'm not talking about an FX-8350, but an i3-8350K.
I currently don't own any Apple products.

right lol sorry

>find a pic on the web
Nice try, kiddo. Very impressive indeed. Alrightie, how about this then:

How about a blender or cinebench comparison?

>I have no arguments, so I'll just scream "SHOW ME THE SYNTHETICS!" so no one sees how badly I've shat myself. Yes, I'm a good mamma's boy today indeed
Great play, kiddo. You are truly a genius.
I still have to remind you that it's ~8% in 6 years, in games. And that's all that matters, not schlicking to Handbrake.

Then you're even more of a moron, as you're deliberately spending money on "4 core pseudo-XXXXTREME" gimped garbage. Typical Intbecile, lol.

I am just correcting your novice mistake. I have no intention of getting an 8350K.

Nothing about an i3 screams extreme; I think you're once again confusing something here. Maybe Intel's new i9 line?

Yeah, yeah, keep whining.

>Nothing about an i3 screams extreme
It's a cut EX, which is a cut Xeon. A "4 core i3" shouldn't exist, but Intbeciles like you prove it can sell purely because Intel loves milking each and every last drop from you retards, so it might as well.

> "4 core i3" doesn't exist,
The Coffee Lake i3 is a quad core without SMT.

So what you are saying is that the new 4 core i3 line and 6 core i5 line do not actually exist at all because, of course, former generations were built around 2C/4T and 4C/4T. If that makes perfect sense to you, great.
Meanwhile, I will be enjoying my new CPU.

Cinebench and Blender are professional software used to make money. Your games are bargain-bin-tier disasters that are very often ports from other systems.

LOL, morons. Proved yet once again how utterly autistic Inturd fanboys are. Absolute sheeple.

A Covfefe Lake i3 has the same stepping as a Kaby Lake i5/i7; it uses the same die.
No, it is not a cut EX. A cut EX die would be too big. Only the i5 through i9 X-series would use an EX die.

DELID THIS

Pot calling the kettle black much?

...

Yeah yeah, and GayForce 970 has 4GB, uh-huh. Just fucking off yourself already.

Even if your 25-30% claim were true, it's still not worth the upgrade. 99% of people aren't going to use it anyway.

I'm not an Intel fanboy, but at least follow the industry and read the spec sheet.

Besides, Semen Lake is DOA with the mandatory delid.

>i9s are shit binned faulty Xeons
>All EX are cuts of Xeon
>4 core i3 is a cut of i9
>NOP, NOT A CUT OF XEON
KYS

>i9s are shit binned faulty Xeons
They are, though. What are you on about, you knuckle-dragging autist?
>4 core i3 is a cut of i9
The 4 core i3 uses a different die from Shitlake-X and Xeon (except the 4 core Xeons on 1151).

Wow, you accidentally forgot to include the results that show a large gap in your post! Don't worry, bro, I got you! Also:

>GTX 1070
So it's GPU-limited in most games, yet the CPU is already a slight bottleneck in many and a major one in others. I guess that's fine, then, if you don't own and never plan to own any card more powerful than a 1070.

Your picture is completely unreadable, but good thing you didn't gimp it with unnoticeable compression. Fucking retard.

>GTA V
>Literally the most retardedly coded game out there, which performs BETTER the lower the core count is, among other things
Nice try.

>1440p is mainstream resolution nowadays (literally in every 2-and-a-half system)

Yeah, that isn't even close to being true. 71.38% of systems recorded in the Steam hardware survey use 1080p. 3.40% are gaming at 1440p and 0.52% at 4K. Statistically almost irrelevant.
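For scale, the same survey data as plain arithmetic (the percentages are straight from the post above, nothing else assumed):

# Steam hardware survey shares quoted above, as percentages of systems.
shares = {"1080p": 71.38, "1440p": 3.40, "4K": 0.52}

print(shares["1080p"] / shares["1440p"])  # ~21 systems at 1080p per one at 1440p
print(shares["1080p"] / shares["4K"])     # ~137 systems at 1080p per one at 4K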

>Your benchmarks are a lie, give us 1080p!
>1080p, same exact results

Because the benchmarks are using a GTX 1070 and maxing every setting out. Many aren't even hitting a consistent 60fps as a result. So yes, congratulations, a 2600K will keep up in a situation where you intentionally cripple yourself below 60fps to create a GPU bottleneck. A round of applause for the retard!
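The logic in one toy model (all FPS caps below are invented purely for illustration): the frame rate you see is capped by whichever side is slower, so a GPU-bound test hides any CPU gap, and raising the GPU ceiling (lower settings or a faster card) exposes it:

# Effective FPS is limited by the slower of the two components.
def fps(cpu_cap: int, gpu_cap: int) -> int:
    return min(cpu_cap, gpu_cap)

# Maxed-out settings, GTX 1070-ish ceiling: both CPUs look identical.
print(fps(cpu_cap=90, gpu_cap=55), fps(cpu_cap=140, gpu_cap=55))    # 55 55
# Lower settings / faster GPU raise the ceiling: the CPU gap appears.
print(fps(cpu_cap=90, gpu_cap=200), fps(cpu_cap=140, gpu_cap=200))  # 90 140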

This makes me feel a lot better about the 3970X + 1070 build that's coming together in my living room.
It's okay, the 3970, Rampage IV and 64GB of RAM were all free.
>2017
>paying for Intel

I like how there's a lack of actual games that aren't 10 years old and use multiple cores. Might as well have tested in CoD4. Check out AC:O, for example. More are coming, since 6+ cores is now the pleb market.

It has tests with a 1080 Ti, you nimrod. Same exact shit: ~8% difference in 8 out of 10 tested games. The Witcher 3 tests are especially hilarious.

You also fail to mention that Intel's GPUs dominate the survey, too.
Hmmm

>systems recorded in the Steam
Off yourself. Steam is a cesspool of retarded kindergartner-tier trollie schoolboys. Real people play by other means. In this modern day and age Steam doesn't represent jack, as it's an echo chamber.

>Because the benchmarks are using a GTX 1070 and maxing every setting out
>PLEASE, PLAY ONLY ON MINIMAL SETTINGS! WHY AREN'T YOU PLAYING ON MINIMAL GRAPHICS WITH 800x600 RES IN 2017!?

>Might as well be tested in CoD4
It was.
Same shit as BF1.

Butthurt incoming

>Gaymer'sAsses
>Synthetics
At least you tried.

Should I read this with a magnifying glass or what?

Literally shows what I was talking about earlier: RyZen 7 performs exactly like two full 2600Ks. RyZen 7 is cheaper than Covfefe. Covfefe = DOA.

See here.

If you have anything but a 1080 Ti, you are not going to notice much difference upgrading from Sandy Bridge to Coffee Lake in gaming. If you want to boost your productivity, however, go for either Coffee Lake or Ryzen (but wait for Ryzen+ in the second quarter of 2018 if you go that route; even if you don't need Ryzen+, original Ryzen prices will plummet).

No more synthetic than a game. It's cool if all you do is game at 60fps, but don't hold others to your low, arbitrary performance metrics. Your mobo is getting super old, and since you have to OC the shit out of your CPU for it to stay remotely relevant, it's not going to last much longer. Are you really so poor you can't afford an upgrade after 8 years?

Are you blind? The 1070 shows the same results, meaning a 1060 will show the same shit too. It's NOT only about "top tier"; it's across most of the tiers. It's NOT about the absolute performance, BUT about the DIFFERENCE in performance between a SIX-YEAR-OLD CPU and a "fresh", MORE EXPENSIVE CPU with "MOAR CORES". And that difference is just ~8%! Across everything!
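And the arithmetic on that ~8%, for anyone who wants it spelled out: compounded over six years, it works out to barely over a percent a year.

# Annualized gain implied by ~8% total over 6 years: (1.08)^(1/6) - 1.
print(f"{((1.08 ** (1 / 6)) - 1) * 100:.2f}% per year")   # ~1.29%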

>No more synthetic than a game
Holy shit, what a truly autistic moron you are, calling a game a synthetic. Holy fuck, I never thought there could be posters with an IQ under 21 on Sup Forums, but here we are; you've just proven there can be.

It's almost as if games are a horrible way to measure performance or something

Not an argument.

So for pretty much any game other than GTA V, Coffee Lake barely gives you a performance boost over Sandy Bridge.

Nice, just saved myself $1500

If you are upgrading to an 8700K *FOR GAMING*, you will most likely want a faster GPU to go with it, so effectively a GTX 1080 Ti or the upcoming Volta. This means you will need to spend A LOT of cash in the process. Simply upgrading to an 8700K while using a GTX 1070 won't net you much of a performance increase.
For productivity applications (and for better features like USB 3.1, M.2, etc.), yes. For gaming, no. You would get more performance upgrading your GPU to a GTX 1080 Ti on a 2600K if you want a boost in gaming and don't want to empty your bank account in the process.

It's all about costs vs needs/wants here.

Did you read the article? Even with a 1080 Ti, for most games the 8700K gave you very little over a 2600K. Only a handful of games benefited from it.

More like $700-800, retard.
That's less than $100 a year.
Is this the new poorfag thread?

Nope, I priced it out a night or two ago
$450 for the proc (because no one is selling them at MSRP)
$300-400 for the mobo
$200 for RAM
$150 for a M.2 NVME SSD

JUST IN GAYMANG ISN'T CPU INTENSIVE. STOP THE FUCKING INTERNET.
Any other enlightening observations?
Oh oh.
How about how wireless N won't speed up browsing vs AC in all but the most edge-bound use cases?

oh, forgot, $80 for a new cooler
add in a new graphics card and you're pushing $1700-$2000

They're not. It's just an excuse of whining fanboy faggots like you.

Game applications aren't synthetic, you dumb illiterate fuck.

If I'd wanted to "boost productivity", I'd just get Threadripper, not Coffin Fake. We're not talking about "productivity" here, however. Both the i5 2500K and i7 2600K are GAYMING CPUs first and foremost, and so is the 8700K (it's being advertised as the 7700K's successor, which is already utterly retarded in itself since the 7700K sucks ass in games due to its INEPT stuttering and other problems). That's why their performance in games is all that matters, NOT synthetics or anything else. And that performance difference is ~8% between the two, regardless of the GPUs and settings used. That's 6 years. In 6 years Intel only managed to increase performance by a measly ~8% (and that's in the best, Intel-compiler-biased cases; in many it's actually no more than 2-4%). And this is with "two more cores", while being horse-cum-splattered under the lid, with RFID under the cover and a hidden MINIX, and costing more money. Literal DOA garbage.

>How about how wireless N won't speed up browsing vs AC in all but the most edge-bound use cases?
>when you're so mad you get the order of the standards wrong

It's just common sense about sane image compression and not having to deal with useless data.

My Sandy is on a Z77 board with an updated BIOS. It has AC and it works just fine.