Meltdown Performance Impact Facts

Facts from someone with a degree in the field:

What is BARELY AFFECTED (0-5% slowdown):
- Pure computational performance (rendering, number crunching, video games)
- Workloads that fit in RAM (small scene rendering, video games, basically everything)
- Rare large file operations (>1 GB)
So, 95% of computer usage by YOU personally. Your own personal computer is practically unaffected; you may never notice any decrease in performance. That's what benchmarks usually measure, and they, obviously, show that the performance hit is negligible. You are NOT FUCKED.

What is FUCKED UP (5-30% slowdown):
- Workloads that don't fit in RAM
- Frequent small file operations
- Workloads that rely on many calls to kernel like full-disk encryption using kernel-based crypto facilities
- Parallel computing with a lot of work sharing
So, 95% of real-world high-performance computer usage. Companies ARE FUCKED.

Other urls found in this thread:

en.wikipedia.org/wiki/Lazy_evaluation
support.apple.com/en-us/HT208394
theverge.com/2018/1/4/16851132/meltdown-spectre-google-cpu-patch-performance-slowdown
security.googleblog.com/2018/01/more-details-about-mitigations-for-cpu_4.html
businessinsider.com/google-amazon-performance-hit-meltdown-spectre-fixes-overblown-2018-1

What will NON-RETARDS do:
- Increase chunk sizes to read/write data to disks less often
- Increase chunk sizes to move data between instances less often
- Cache more; get more RAM so everything caches more by itself
- Do less kernel-mediated IPC (e.g. through msgsnd); move to shared memory
- Do less loopback/socket-based IPC on one machine, it's a lazy degeneracy
- MEASURE performance impact on toy cases before spending time on implementation (toy sketch right after this list)
- MEASURE performance impact on real cases before deploying
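A minimal sketch of what that measurement can look like, assuming a Linux box with gcc; the iteration count and /dev/zero are arbitrary choices. It times a tight loop of 1-byte read() calls - each iteration is one kernel entry, which is exactly what KPTI makes pricier - so running it on patched and unpatched kernels shows your actual per-syscall toll:

#include <fcntl.h>
#include <stdio.h>
#include <time.h>
#include <unistd.h>

static double now_sec(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void) {
    enum { N = 1000000 };
    int fd = open("/dev/zero", O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }
    char byte;
    double t0 = now_sec();
    for (int i = 0; i < N; i++)
        if (read(fd, &byte, 1) != 1)   /* one kernel entry per iteration */
            return 1;
    double t1 = now_sec();
    printf("%.0f ns per 1-byte read()\n", (t1 - t0) / N * 1e9);
    close(fd);
    return 0;
}

Swap the 1-byte read for your workload's real unit of kernel traffic and watch the number move as you grow the chunk size - that's the whole "increase chunk sizes" story in one loop.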

What should HI-TECH DEGENERATES do:
- Install GNU+Linux and ascend
- Collect $20 from class action lawsuit
- Happily play your stupid games with unnoticeable 0-3% slowdown
- Stop creating threads with obvious benchmark results, it only spreads misinformation
- Consider masturbating to meaningless numbers less; get curious, get educated, use this opportunity to learn about computer internals, and share the knowledge with concerned non-tech people and your hi-tech fellows

NOT SO FAST SLOWPOKE

What will RETARDS do:
- switch to AMD

It's the Showdown of the Century!

But don't retards already have AMD?

Not yet, they are still on the inferior Intel 8xxx series.

>what will non-retards do
>they'll slow down a shit ton
really makes you think

>linux
>games

>Collect $20 from class action lawsuit
Where?

>implying that abolishing lazy computing practices and buying more RAM will slow down anything

>a degree
Literally worthless opinion pajeet

*Although to be fair I agree with you

>>parallel computing with a lot of work sharing
Does this mean muh xeon phi cluster running stupidly parallel stuff is gonna slow down significantly?

Check out the GNU+Linux game selection on Steam, it'll last you through a few lifetimes. Wine has become pretty good, too; I recently finished Doom on it, ran at native speed with no bugs and no tweaking, worked literally out of the box.

I play the sims 4 natively on wine fine

Depends on the workload profile. If you spread it across the cluster and each node then crunches its data for a whole day - minor slowdown. If the load doesn't fit on one node at all and the nodes have to constantly exchange intermediate results - yeah, you're fucked.

>>Facts from someone with a degree in the field:
>source: my dad works at Nintendo

like fine wine xDDDDD

servers don't need to be patched, server admins have full control over what code gets executed

- I've read in the news that there is some mumbo-jumbo flaw in Intel processors and it's severe. Is our infrastructure patched, is our client data safe?
- Ackschually, I know better than every security specialist in the world, so no, we won't be patching, I'm in full control
- You're fired, smartass

a 5-30% slowdown is unacceptable to a company like facebook. they'll avoid it if they can.

And if I need to have cores in one socket talking to each other all the time am I totally fucked?

So backend servers that only run a certain set of applications could technically skip the patch thus avoiding the performance penalty without any increased security risk, right?

>without any increased security risk
anon...

from what I've read you need to run malicious code, you don't just get hacked out of nowhere

i play bf2 at 60 fps on a 4 year old 6300, who's the retard for paying more ;)

If they communicate via shared memory - no; if you pass messages through the kernel - yes, but you can always move to shared memory and implement message passing over it (sketch below).
They'd rather move their databases to AMD than accept that any RCE/minor malware on any machine can lead to a total data compromise without even escalating privileges. Imagine every handgun on the planet turning into a nuclear missile, you can't just ignore that.
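A bare-bones sketch of "message passing over shared memory", under big assumptions: one POSIX shm segment, a single-slot mailbox, C11 atomics for the handoff, and busy-waiting instead of proper futex-based sleeping. The name /meltdown_demo and the mailbox struct are made up for illustration; a real system wants a ring buffer and back-pressure.

#include <fcntl.h>
#include <stdatomic.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <sys/wait.h>
#include <unistd.h>

/* Single-slot mailbox in shared memory: after setup, passing a message
   costs zero syscalls, unlike msgsnd/msgrcv which enter the kernel
   (and pay the KPTI toll) on every send and receive. */
struct mailbox {
    atomic_int full;   /* 0 = empty, 1 = message ready */
    char msg[256];
};

int main(void) {
    int fd = shm_open("/meltdown_demo", O_CREAT | O_RDWR, 0600);
    if (fd < 0) { perror("shm_open"); return 1; }
    if (ftruncate(fd, sizeof(struct mailbox)) < 0) return 1;
    struct mailbox *mb = mmap(NULL, sizeof *mb, PROT_READ | PROT_WRITE,
                              MAP_SHARED, fd, 0);
    if (mb == MAP_FAILED) { perror("mmap"); return 1; }
    atomic_store(&mb->full, 0);

    if (fork() == 0) {                    /* child = consumer */
        while (!atomic_load(&mb->full))   /* spin; real code would sleep on a futex */
            ;
        printf("got: %s\n", mb->msg);
        return 0;
    }
    strcpy(mb->msg, "hello, no syscall per message");  /* payload first... */
    atomic_store(&mb->full, 1);                        /* ...then publish */
    wait(NULL);
    shm_unlink("/meltdown_demo");
    return 0;
}

Build with gcc -std=c11 (add -lrt on older glibc). The publish order matters: payload first, flag second, and the atomic store keeps it that way.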

>They'll rather move their databases to AMD
do you even have an inkling of an idea what the cost of that would be

Well, a Facebook database breach may cost as much as Facebook itself. They're over-provisioned as fuck, that 0-30% increase in load should be tolerable for them.

>a 5-30% slowdown is unacceptable to a company like facebook. they'll avoid it if they can.
so facebook will leave user data and internal project X's wide open to theft and destruction, which would result in bankruptcy, just because they didn't want to buy 30% more computing power? no, they will eat the cost for now, but intel and mexico are going to pay.

tl;dr

Intel is fucked in the server market.

That's why the Intel CEO dumped stock.

Sounds like gaming may still be affected by everyone's ping getting worse.

>wide open to theft and destruction
[citation needed]

What's wrong with lazy computing?

Nothing, until you have to remove all those hacks because otherwise everything will be 30% slower.

Explain this, OP.

Could be a number of things. Personally I don't trust the testing method.

>shill shill shill
I'm running my own tests now, my very first run was about 20 points higher than the next four, which were all within a few points.

Whether this is caused by residual heat resulting in quicker throttling, I don't know, but that's beside the point. It gives a better idea of where the processor really is under heavy loads.

These are meaningless numbers with no explanation of what is tested.

Yes? 30% of non-browser desktop games run natively, not counting WINE. 100% of console-emulated games and browser games. Almost all DX9-and-lower games will work perfectly under WINE. So overall you'll have access to over 70% of the games you'd have on Windows, which is still thousands of games.

>browser games
woooow

You never used cinebench?

It should be fine, as it is all about raw computing power.

Tuning can be a bitch in large systems like you find in companies and research institutions, especially since much of what needs to be tuned is written in-house and not maintained terribly well.

Posted my results

>intel loyalty thread
the fuck

I'm amazed that you faggots still think this is actually going to change anything. Here's what's going to happen:
>Intel patches bug
>Companies that use Intel CPUs take a 30% hit
>Company does absofuckinglutely nothing because they were over-provisioning from the start, as ANY major corporation would
>If they threaten to "switch to AMD", Intel gives them a more favourable contract and courts them back over
It's not like I enjoy this, but every fucking time you people get my hopes up and nothing happens.

>- Happily play your stupid games with unnoticeable 0-3% slowdown
What games?

>more shared memory
>When Spectre exists

We need to share LESS memory


>full-disk encryption
who doesn't use disk encryption nowadays?

sry brah, already ripped out my intel CPU and mobo and replacing it with threadripper. intel fucked up. they are done. toast. fuck them like the niggers they are.

95% of real-world high-performance computer usage, meaning datacenters have to invest 5-30% more money in CPUs. wowee.

Or finally go Epyc.

>not getting the obvious pun
U don't belong here, kys
en.wikipedia.org/wiki/Lazy_evaluation

He's an imperative faggot at best

If you lower performance by a third you need to buy 50% more equipment, not 30% more: each box now does only 2/3 of the work it used to, so matching old capacity takes 1/(2/3) = 1.5x the boxes.

Just how fucked am I if I was planning on using a Windows virtual machine with GPU passthrough?

>95% of real-world high-performance computer usage, meaning datacenters have to invest 5-30% more money in CPUs. wowee.


time to invest in intel stocks goys

Very. KPTI does hit VMs.


>investing in Intel while every datacenter is in the middle of jumping on the EPYC bandwagon
Go ahead, throw your life away.


Yeah, chinks have already jumped on the EPYC cock. I wonder when kikegle or kikezon will do that.


the CEO sold out but stocks are soaring already lol

investors are irrational, the world we live in is irrational, anon.

remember how the capacitor plague crippled motherboard manufacturers in the mid 2000s?
me neither lol.

everyone will keep buying intel chips like nothing happened.

Laptop garbage.

>everyone will keep buying intel chips like nothing happened.
Except they'll need more chips and more power for the same computational tasks. They'll just lay off a few thousand people.

Okay but will I still be able to watch 4k porn on my iPad?

>- Workloads that rely on many calls to kernel like full-disk encryption using kernel-based crypto facilities
So basically, if you are a gentoo user with full disk encryption, FUCK YOU.
Did I sum it up correctly?

>just rewrite everything!!!
Are you fucking retarded?

It's 1 fucking VM, guys. The cloud corps are up in arms because they run those at scale. Unless you could barely run it before, it's not enough to matter, especially since you'd be passing the hardware through directly.

>Switch to linux faggot
>But games
>BROWSER GAMES LMAO YOU'RE DUMB
>O-okay
The linux experience

Nah, I understood that he pretended to misunderstand what I meant by 'laziness'.
I'm an all-paradigms faggot, actually, all programming paradigms are good.
I think around 30*sqrt(2) = 42.42640.
Well, not exactly, because here it actually hit mostly companies, normal people can play their games normally.
No idea, can anyone try to run this on iOS? Apple admits that it is possible:
support.apple.com/en-us/HT208394
Use Ryzen then lmao.
Nah, just a few fixes if you know what you're doing.
And latest Doom (oldest too), and Talos Principle and tons of games on Steam lol. Browser games you can play everywhere, it's nothing to brag about.

>And latest Doom (oldest too), and Talos Principle and tons of games on Steam lol. Browser games you can play everywhere, it's nothing to brag about.
See:

>Companies ARE FUCKED.

Wrong.

>Apple: “Our testing with public benchmarks has shown that the changes in the December 2017 updates resulted in no measurable reduction in the performance of macOS and iOS as measured by the GeekBench 4 benchmark, or in common Web browsing benchmarks such as Speedometer, JetStream, and ARES-6.”

>Microsoft: “The majority of Azure customers should not see a noticeable performance impact with this update. We’ve worked to optimize the CPU and disk I/O path and are not seeing noticeable performance impact after the fix has been applied.”

>Amazon: “We have not observed meaningful performance impact for the overwhelming majority of EC2 workloads.”

>Google: “On most of our workloads, including our cloud infrastructure, we see negligible impact on performance.”

I do the same on my Haswell Celeron. Who's the retard now? :^^^)

What do you think their PR statements are gonna say?
"Holy shit virtualization is so much more expensive with all this extra overhead and we'll probably even need more mitigations in the future. Fuck our whole business model sucks now?"

>as measured by the Geekbench 4 benchmark
They are not even trying to hide anymore.

>6300
>Haswell that came out after faildozer
K

See

Well, that's nice then, which means SOME companies are fucked up to 30% while others are not.

>bf2
>2005
>4 years old 6300
Now who is the fucking retard

See what? I'm not sharpshooting, there's a lot of games on GNU+systemd+Linux.

>not able to play latest battlefront ii or any EA game

Fucking disgusting!

I think this shitstorm is actually good, as it puts AMD back on the map and drives competition.

And I don't think companies that run Intel will switch to AMD, because that would require changing every fucking thing from the boards to the code, which is mostly optimized for Intel.

Companies will take the hit but will probably consider having both architectures for fallback.

60% of the time, it works every time

When talking about "effects on performance" you're talking about the patches, not the bug itself, which is a mere security flaw, right? Or are you this retarded?

What sort of idiot would continue to rely on intel after this travesty, on top of the ME disgrace. Intel are just sloppy.

What about those intel multicore compiling performance hits? :^)

Compilers relying on FS performance instead of caching/buffering everything by themselves: a build touches thousands of small files, and every open/stat/read is a kernel entry that now costs more.
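To make that concrete, a toy sketch of the access pattern (not what any real compiler does): slurp a whole file with a handful of syscalls instead of paying a kernel crossing per tiny read. The slurp() helper name is made up; feed it any file.

#include <stdio.h>
#include <stdlib.h>

/* Read a whole file in one go: one open, a couple of big reads,
   one close. A tool that getc()s its way through instead makes a
   kernel round-trip (now pricier under KPTI) far more often,
   unless stdio's buffer saves it. */
static char *slurp(const char *path, long *len) {
    FILE *f = fopen(path, "rb");
    if (!f) return NULL;
    fseek(f, 0, SEEK_END);
    *len = ftell(f);
    rewind(f);
    char *buf = (*len >= 0) ? malloc(*len + 1) : NULL;
    if (buf && fread(buf, 1, (size_t)*len, f) == (size_t)*len)
        buf[*len] = '\0';
    else { free(buf); buf = NULL; }
    fclose(f);
    return buf;
}

int main(int argc, char **argv) {
    if (argc < 2) return 1;
    long n;
    char *src = slurp(argv[1], &n);
    if (!src) { perror(argv[1]); return 1; }
    printf("%ld bytes read with a handful of syscalls\n", n);
    free(src);
    return 0;
}

Same lesson as the chunk-size bullets up top: do the work in userspace and cross into the kernel as rarely as you can.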

T. Inthell

More like The Slowdown of the Century

>average programmer
>knows what they're doing
yeah, sure

How about video games with server reliance?

How much is emulation going to be impacted, as in CEMU and Android emulation?

>Nah, just a few fixes if you know what you're doing.
t. unemployed armchair "programmer"

theverge.com/2018/1/4/16851132/meltdown-spectre-google-cpu-patch-performance-slowdown
Pure bullshit
AMD shills on full force

>theverge

>f-fake news!
security.googleblog.com/2018/01/more-details-about-mitigations-for-cpu_4.html
businessinsider.com/google-amazon-performance-hit-meltdown-spectre-fixes-overblown-2018-1