INTRASH ETERNALLY BTFO

phoronix.com/scan.php?page=article&item=amd-epyc-7401p&num=1

>you need $5,000+ worth of Intrash "processors" in a 2S config to compete with a ~$1,000 mid-tier EPYC in 1S.

>two top EPYC 7601 chips consume less energy than one "Golden inTURD™".

>even TWO Golden inTURDs™ in 2S can't beat ONE mid-tier EPYC 7401P in 1S in Ebizzy (and AMD COMPLETELY OBLITERATES them in C-Ray). Absolutely EBYNzzy.

>literally ANYONE can afford the 7401P, and most working people can even afford the 7601 relatively easily; meanwhile, even the "Silver inTURD™" is overpriced as fuck for the underperforming, power-hungry space heater that it is, never mind the Golden inTURD™.

>this is not even remotely close to Zen's final form, this is not even STARSHIP™ yet.

B T F O
T
F
O


Intel scales like garbage (already less than 78% at just 2S), while Zen scales nearly perfectly (almost 100% at 2S and still more than 80% at 4S). Imagine putting two top EPYC chips in 2S and comparing that to four golden Xeons in a 4S. Now THAT would be a true laugh-out-loud moment all right.
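Those scaling percentages are just simple arithmetic over throughput numbers. A minimal sketch (the throughput figures below are made up purely to match the percentages quoted above, not real benchmark data):

```python
def scaling_efficiency(throughput_1s, throughput_ns, sockets):
    """Fraction of perfect linear scaling an N-socket system achieves
    relative to a single socket."""
    return throughput_ns / (throughput_1s * sockets)

# Hypothetical throughputs, normalized to 100 for one socket:
intel_2s = scaling_efficiency(100, 156, 2)  # 0.78 -> "less than 78% at 2S"
zen_2s   = scaling_efficiency(100, 198, 2)  # 0.99 -> "almost 100% at 2S"
zen_4s   = scaling_efficiency(100, 324, 4)  # 0.81 -> "more than 80% at 4S"
```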

...

these synthetics don't show us anything real.
just go about your day and wait for the real benchmarks, where Intel will be back on top

I know Intel dropped the ball but there's no need to shitpost and samefag this much dude.

Also, the only anons with jobs and the money to put a server CPU in their homes are already going AMD, because Intel hasn't made a good one since the LGA 2011-v3 socket.

Jesus, how pathetic

Thank god I went for an AMD processor, Intlel really fucked up their current gen big time.

...

...

>meanwhile, at non-server segment


Jesus Christ. That thing fucking demolishes intel. I want to see how they react to that.

>0000

checked

...

Quads of truth have spoken

>I want to see how they react to that
>Bribe-cheque e-mails to reviewers and OEMs
>Deploy the "PcPoo+JayZ+GaymersNexus+LinuxTurdTips+Tom's+TPU" army
>Get Ryan Shrout to spin made-up FUD fairy tales about AMD's products in the next PcPoo Podcast episode

...

>inb4 they resort to MUH SINGLE CORE meme

>77
Checked 'em. Also, see the post on "MUH SINGLE CORE" - they've fucked up there big-time too.

>Unironically and sincerely implying LGA 2011 wasn't garbage in every iteration
Give it a rest.

OHHHYYYYYYY

It was pretty good; you were just too poor, or too much of a NEET gaymer, to enjoy all dem PCI-e lanes and memory bandwidth while doing real work.

You forgot your avatar, here, let me help you:

>all dem PCI-e lanes
Which are nonexistent on Inturd. Now Threadripper, on the other hand...

phoronix.com/scan.php?page=news_item&px=AMD-EPYC-Linux-4.15-First-Test
OH FUCK
AYYYY

SHREEEEEEEEEEEEEEEEEEEEEEEEEE

Enterprise customers don't give a shit about normie gaymen reviewers. Clouds and big hosting companies are the only thing that matters here - if Amazon or Google decided to buy AMD, Intel is literally finished

I meant hardware reviewers in general, not just gaymen.

Intel literally can't bribe big companies like Google Amazon or Microsoft.

>End of 2017
>Someone still sincerely believes this

M8 Intel is thinking of actually fucking suing Microsoft because of "muh x86 emulation"

Yes they're this retarded, their bribing is literally worthless

It's neat because if it was fairly close, Intel would just market the fuck out of their customers and make shit up about AMD.
But the difference is colossal, EPYC just completely destroys Intel's server parts. Intel can't make up enough bullshit to cover a 200% deficit.

No, he's right. They can't bribe them.
However, they can release complete bullshit marketing propaganda along the lines of "you get what you pay for" and "AMD is garbage, go for quality Intel chips, they're 3 times the price, make more heat, and perform worse because they're more reliable unlike untested amd garbage"
Which is what they are doing. If you check their marketing material, I'm not even exaggerating, it reads like a series of Sup Forums shitposts

Datacenters have engineers, and the moment they see they can save thousands of dollars in energy bills while increasing performance, they will switch to whatever processor does that.

You know why? Because that's an easy way to get a promotion from the CEO.
>Oh hey boss I just saved you 400K in energy bill while making everything two times faster, so how about that bonus?

Not to mention, save millions in construction costs of having to expand data centres to accommodate increased numbers of servers, when instead they can massively increase density.
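The back-of-the-envelope math behind that promotion pitch is trivial. A sketch with entirely made-up numbers (fleet size, per-server wattage, and electricity price are all illustrative assumptions):

```python
def annual_energy_cost(watts_per_server, servers, dollars_per_kwh):
    """Annual electricity cost of a fleet running 24/7."""
    kwh_per_year = watts_per_server * servers * 24 * 365 / 1000
    return kwh_per_year * dollars_per_kwh

# Hypothetical fleet: 5,000 servers at $0.10/kWh, saving 100 W per server.
old = annual_energy_cost(500, 5000, 0.10)
new = annual_energy_cost(400, 5000, 0.10)
savings = old - new  # ~$438,000/year, in the ballpark of the "400K" above
```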

Can you even buy an Epyc processor?

You can buy an EPYC-powered server from Dell, HP, SuperMicro, and a few other vendors, but supply is understandably limited for EPYC right now, so finding a bare processor at retail is unlikely.

>this is not even remotely close to Zen's final form

Thank Keller our Great Shitwrecker.

The CEO is reading this shit though, and fires you for buying 'cheap Chinese garbage'.

Say his name.
Say it loud and clear.
You know you want to.

Intel is definitely the platform of the future, just look at how badly AMD sucks in comparison to Intel:

Based Jim Keller.

Looks like a draw to me, AMD won 2, Intel won 2. Ultimately companies will go with the more reliable supplier, i.e. Intel.

>Looks like a draw to me
Sure it does.
Until you consider the fact that the AMD chip is priced at a little over $1,000 and is competing against a 2S Intel system with 2 x $2,000 Xeons.

Suddenly those "victories" don't seem so much like a win, do they?

...

>2S Intel system with 2 x $2,000 Xeons
It's actually $2,700-2,800 each, lel. And the mid-tier 1S EBYN that beats that "Golden inTURD™" garbage in Ebizzy and C-Ray barely costs $1,080 right at this very moment.
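The price argument is easy to sanity-check. A sketch using the prices quoted in this thread, and (generously to Intel) treating the benchmark scores as equal, even though the Ebizzy/C-Ray results above have the 1S EPYC ahead:

```python
def perf_per_dollar(score, price):
    """Normalized performance per dollar spent on CPUs."""
    return score / price

# Same normalized score for both systems; prices from the posts above:
epyc  = perf_per_dollar(1.0, 1080)       # one $1,080 EPYC 7401P
xeons = perf_per_dollar(1.0, 2 * 2750)   # two ~$2,750 Xeon Golds
ratio = epyc / xeons  # ~5.1x better perf/$ for the 1S EPYC
```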

Jim "The Contract" Keller

wccftech.com/amd-epyc-powered-hpe-server-breaks-performance-world-records-spec-cpu-benchmark
IT HATH BEGUN!


Checked



SHREEEEEEEEEEEEEEEEEEEEEEEEEEEEE

JIM

KELLER

2600k owner here.
Those numbers are trash.
They removed a CPU bottleneck and put it all on the GPU.
Trash

I used to be an Intel fanboy, but that's changing.

I was going to build a new rig with intel but I've changed my mind.

So you're telling me the 8700K bottlenecks with a 1080 GPU?

I want Jim Keller to come into my room while I sleep and play with my asshole

I'm not sure whether I want to compile the Linux kernel any faster than this.

It's been a part of my morning routine for a long time now. I wake up, start compiling it, and then go take my morning shit. I wanted to time my shits so that I'd finish the moment the compile was done, since that felt right. This used to be pretty easy. I could take my Donald Duck comics with me to the shitter, read for a bit, and relax while taking a shit. But with each new generation of CPUs, I've had to get faster and faster. I'd need to run to the bathroom. I'd need to push my turds out fast. I'd need to skip washing my hands just so I could get back to my computer the moment the compilation finished.

The next AMD CPU I buy will probably seal the deal. I need to give up. I don't have time to rush to the bathroom anymore. The morning ritual is over.

Nice try, but you'll have to work much harder than that to sway people of good conscience into buying your "new" toothpaste-TIM'd, RFID-tagged, hidden-Minix-rootkitted trash.

Just compile more stuff, or build with PGO - that will triple your compile time.
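For anyone who actually wants to time their morning compile, a minimal sketch. The `make -j8` invocation is the assumption (a configured kernel tree, GNU make on PATH); the sketch is demonstrated with a no-op command so it runs anywhere:

```python
import subprocess
import sys
import time

def timed_run(cmd):
    """Run a command and return (elapsed seconds, exit code)."""
    start = time.monotonic()
    rc = subprocess.run(cmd).returncode
    return time.monotonic() - start, rc

# In a configured kernel tree this would be: timed_run(["make", "-j8"])
# Demonstrated here with a harmless no-op so the sketch is self-contained:
elapsed, rc = timed_run([sys.executable, "-c", "pass"])
```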

What would be really good right now is if you just stopped posting, my man

THANK YOU BASED JIM


that pic is literally you, since 1440p is more GPU-dependent

AMD is the most American company now, and Mommy Su is at the helm. Perfect timeline to be alive.

>If I'd wanted to "boost productivity", I'd just get Threadripper, not Coffin Fake. We're not talking about "productivity" here, however. Both the i5 2500K and the i7 2600K are GAYMING CPUs first and foremost, and so is the 8700K (it's being advertised as the 7700K's successor, which is already absurd in itself since the 7700K sucks ass in games due to stuttering and other problems). That's why their performance in games is all that matters, NOT synthetics or any other shit. And that performance difference is ~8% between the two, regardless of the GPUs and settings used. That's 6 years. In 6 years Intel only managed to increase performance by a measly ~8% (and that's in the best, Intel-compiler-biased cases; in many it's actually no more than ~4%). And this is with "two more cores", with toothpaste TIM under the lid, with RFID, with hidden Minix, and at a higher price. Literal DOA garbage.

>And the most hilarious thing about that is The Witcher 3, which is Intel compiler-fucked as hell. The Witcher 3 should theoretically benefit the most from a new Intel processor, but it actually doesn't, and the difference is so negligible it's downright laughable considering that Sandy came out 4 years before The Witcher 3: a 2 FPS difference on a 1070 and 5 FPS on a 1080 Ti. And Deus Ex is 1.5 FPS on a 1080 Ti, and on a 1070 the 2600K actually BEATS the fucking 8700K! EL-OH-EL! SIX COARZ, HIGHER FREECUMZEES! AYY LMAO!

GPU bottleneck. Run the game at 720p and then come say that again.

>Not only 1440p, but 1080p too. The difference is barely noticeable (margin of error) regardless of resolution, settings, or GPU. The jump from a 1070 to a 1080 Ti is obviously noticeable, sure, but in a comparison between the same GPU/RAM setup on a 2600K and on an 8700K it's almost nonexistent.

>The point is - you won't see a big difference even with a 1080 Ti. FPS will be higher on a 1080 Ti than on a 1070, obviously, but the ~8% gap between the CPUs is the same across all GPUs and settings. It's downright laughable, as it shows that the i7 2600K STILL DOESN'T BOTTLENECK EVEN THE HEAVIEST OF MODERN YOBA, SIX FULL YEARS LATER, so the 8700K's "relevance factor" is literally NONEXISTENT, since "better productivity" can be had with much cheaper Zen.

>turn this real life workload into a synthetic one, Intel will win then!

Low-res game benchmarks are brain damage. What they're meant to do (predict future game performance) was debunked almost half a decade ago, because, surprise surprise, game engines have changed a lot in half a decade.

This. Intel fans hate seeing actual CPU loads that stress the chip fully, but love these insignificant legacy edge cases that literally no one in their right mind uses in real life.

It's kinda disgusting.

Thank
You
Beautiful bastard

THANK YOU BASED JIM

THANK YOU BASED JIM

...

>people running a $400 CPU with a $600 GPU on a $70 1280x720 display
yea sure.

1. Who are you quoting?
2. i7s don't stutter; that's just a bad meme some AMD fanboy created.
3. Yes, Intel has gone basically nowhere in 7 years.

You don't seem to understand what a CPU benchmark is.

The thing is, an 8700K doesn't sacrifice anywhere: you get good gaymen AND workload performance.
That being said, Intel needs to get a new arch out.

>i7s don't stutter
youtu.be/11NfsMykyAk?t=5m1s

>I7s don't stutter, that's just a bad meme

>a 8700k doesn't sacrifice anywhere. You get good Gaymen


>The pic-related shows us that:
>Lower "average" FPS on RyZen, but it must be noted that: a) RyZen doesn't stutter at all, while Inturd stutters like fuck all the time, b) RyZen has much smoother overall experience because minimal FPS is much higher than on Inturd, c) RyZen has much more accurate and better load distribution across cores (which adds even more to the improving the overall quality of the playing experience, alongside two of the previously mentioned factors). Basically what this means is - higher average FPS doesn't mean jack shit in this modern day and age. Only frame-pacing and minimal FPS matters, and both of these are way better on Zen than on Inturd. In other words - if you're buying a CPU for quality gaming you have to be a total idiot to buy Inturd instead of Zen. But dirty kikes would try to sway you into thinking otherwise, of course. Do NOT get pixie-dusted by Jews.

>P.S.
>And if it's productivity - Zen still completely and utterly obliterates Inturd. This is truly a bad time for anyone to buy anything Inturd-related or branded. Just don't. Don't be a moron. Know better. Get Zen.

He does this every single day, pretty much all day.

>10 hours ago
>2 minutes ago

...

Legal concerns aside, it would only make sense if the bribe were higher than the cost savings from using AMD. Google has hundreds of thousands of installed CPUs; Intel doesn't have that kind of money.

DELET DIS

All thanks to the AMD GLUE TECHNOLOGY™.

>Sup Forums completely forgot Inturd's dirty tactics and shady deals of the early-to-mid 2000s, without which AMD would never have had to create the abominable abortion that is FX

Mr. "Certified Shit Wrecker" Keller

>while Zen scales nearly perfectly
You'd be better off saying this in a thread where OP's picture clearly shows far less than 100% scaling given the number of cores. Not that it scales badly, but it's a long way from 100%.

All of what you say is true, but nevertheless you're not measuring CPU performance unless you do that.

>those c-ray results
I know it's an SISD FPU-based best-case scenario for EPYC, but dear lord what a shitstomping with 16 cores beating 40 by a wide, wide margin.
>intel's face when

Min-fps numbers are meaningless.
They are outliers, not hard data.
1% lows show a real, repeatable scenario for fps dips.
You can read about the "i7 stutter" myth here.

gamersnexus.net/guides/3130-best-cpus-of-2017-round-up-gaming-blender-premiere
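The difference between "min fps" and "1% lows" is easy to show: the minimum is a single outlier frame, while the 1% low averages the slowest 1% of frames. A sketch, assuming per-frame render times in milliseconds (the frame data is invented for illustration):

```python
def one_percent_low_fps(frame_times_ms):
    """FPS computed from the average of the slowest 1% of frames
    (at least one frame is always included)."""
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 200 smooth frames at 10 ms (100 fps) plus a single 50 ms hitch:
frames = [10.0] * 200 + [50.0]
min_fps = 1000.0 / max(frames)        # 20 fps: one outlier dominates
low_1pct = one_percent_low_fps(frames)  # ~33 fps: the hitch is averaged out
```

This is why a single stutter frame tanks the "min" number while the 1% low stays representative of what the run actually felt like.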

I was talking about per-CPU scaling, not per-core scaling, you dumb fuck. OP's pic has only one EPYC tested, while the Xeon Gold has 1S and 2S results. And the 2S has shitty scaling, while EPYC isn't represented at 2S in that pic.

>You can read
>GaymersNexus

>Y AREN'T U TESTING OUR POOCESSORS IN 640X480 WITH GRAPHICS ON MINIMUM IN 2017!? YYYYY!!!??

Nice argument, soyboy.

You need to try harder than that, kid.
I have biceps the size of a 2l can.

what the fuck is a soyboy

Don't bother. He's a soyboy himself, so he calls everyone he doesn't like a soyboy.
Soyboy: a generic insult for a weak, passive man.