NoVideo IS affected!

pcper.com/news/General-Tech/NVIDIA-addresses-Spectre-vulnerabilities

>"it's not affected!", they whined
>"stop producing lies, you AMD shill!", they screamed
>"you're just trying to sell Radeons, you poojet!", they autistically screeched

Well, how does it feel NOW, you greenassed ostrich-brained morons? Nvidia is fully affected too. Enjoy your immense buttmad coupled with 30~67% performance drops soon.

Or...you know...you could've just stopped being nVidiots and joined the #BETTERED. It's up to you to decide whether to be cucked over fully, cucked over partially, or not cucked over at all.

After all, just a daily reminder:
>Patch affects syscalls directly (see the sketch after this list).
>Nvidia's video cards don't have an internal syscall module and heavily rely on external syscalls from the OS, thus potentially making nVidia cards susceptible to performance drops and essentially compromised even if paired with an AMD CPU.
>If paired with an Intel CPU, however, nVidia GPUs get a 100% confirmed performance drop.
>AMD is absolutely flawless, unaffected in both the security of its CPUs and the performance of its CPUs and GPUs. Radeon GPUs are absolutely not affected by syscall performance drops because Radeons have had an internal integrated syscall module as far back as GCN 1.1; nVidia, however, deliberately got rid of theirs back in the GTX 5xx days.
>Worst combination of hardware EVER from this point on - Intel CPU + Nvidia GPU.
>Fine, but potentially compromised combination of hardware - AMD CPU + nVidia GPU (the compromising is purely on nVidia's side if it ever happens, as AMD is absolutely flawless).
>Absolutely flawless, impenetrable, invulnerable combination - AMD CPU + AMD GPU.
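
If you want numbers instead of screeching, syscall overhead is trivially measurable. Below is a minimal sketch (assuming a Linux box with gcc; the iteration count is arbitrary) that times a cheap syscall in a tight loop, so you can run it before and after the patches land and compare the per-call cost:

```c
/* syscall_bench.c - time a cheap syscall in a tight loop.
 * Build: gcc -O2 syscall_bench.c -o syscall_bench
 * Run before and after the KPTI/Spectre patches and compare. */
#define _GNU_SOURCE
#include <stdio.h>
#include <time.h>
#include <unistd.h>
#include <sys/syscall.h>

#define ITERS 10000000L

int main(void)
{
    struct timespec start, end;

    clock_gettime(CLOCK_MONOTONIC, &start);
    for (long i = 0; i < ITERS; i++) {
        /* syscall(SYS_getpid) forces a real kernel entry every time;
         * the libc getpid() wrapper may cache the result. */
        syscall(SYS_getpid);
    }
    clock_gettime(CLOCK_MONOTONIC, &end);

    double ns = (end.tv_sec - start.tv_sec) * 1e9
              + (end.tv_nsec - start.tv_nsec);
    printf("%.1f ns per syscall\n", ns / ITERS);
    return 0;
}
```

Whatever the patch adds to each kernel entry shows up directly in that per-call number, so any syscall-heavy workload scales accordingly.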

Friendly reminder:
1050 Ti is STILL the best GPU that was EVER created by humanity.

My systems are either Intel+AMD or AMD+nVidia though

Enjoying being a cuckold yet?

Amd Ryzen + Amd R7

Feels good man

>R7
>Not R9 or RX
>Good

I think this is bad reporting. The article seems to be confusing the issue. Firstly it needlessly mentions GPUs and SoCs together. Secondly it implies that there are vulnerabilities with NVIDIA's GPUs without providing any evidence for it. From NVIDIA's bulletin it looks like the issue is with the display drivers. On January 3 NVIDIA released the following statement:

"NVIDIA’s core business is GPU computing. We believe our GPU hardware is immune to the reported security issue and are updating our GPU drivers to help mitigate the CPU security issue. As for our SoCs with ARM CPUs, we have analyzed them to determine which are affected and are preparing appropriate mitigations."

There's nothing in these new bulletins that contradicts that assessment. The new bulletins address exactly those issues outlined in the January 3rd bulletin: SoCs with ARM CPUs and display drivers that run on CPUs. As for AMD, AMD's SoCs are similarly affected. We'll have to see if AMD releases display driver updates with security fixes in the upcoming weeks. My guess would be that they will.

>mfw bought a 1060 for mining
>mfw this is gonna affect my resale value

Nice buttmad noVideot pasta, matey.

so is every software ever. even firefox

...

nvidia.custhelp.com/app/answers/detail/a_id/4611
The article literally just copy-pastes nvidia's post, and they did specify geforce, quadro etc. as affected by this.
Even the fucking title is "NVIDIA GPU Display Driver Security Update for Speculative Side Channel"

DELID DIS.

helps that they run their warp scheduling on the cpu instead of the gpu

would love to see the performance after the patch LOL

English language, motherfucker. Speak it.

>Nvidia's video cards don't have internal syscall module and heavily rely on external syscalls from OS, thus potentially making nVidia cards susceptible to performance drops and being essentially compromised even if paired with an AMD CPU.
Fermi cards had a hardware scheduler.

Did you read OP fully?
It clearly says they got rid of it after 5xx. Fermi was 4xx and 5xx.

nvidia's scheduler, aka warp, runs only on the cpu. it's the main reason nvidia has had low tdp since kepler

and i was surprised that they told everyone they weren't affected, considering that they don't encrypt or do anything else on the fly
turns out, in a traditional nvidia twist, they LIED once more

So "Warp" is a marketing scheme internal name, I see. Didn't know that one. What fucking autists. The only thing that will warp there, is nGlide wrapper around their sorry asses, lol.

>Did you read OP fully?
No, I stopped reading after the dumb statement. I now see that he included it after saying that blanket statement though.

>dumb
Nice projection, kid.

pretty much, that's why they are fighting for vulkan and dx12 to stay a single-render api just like dx11
because if games move to a dual-render setup (which is the norm for such apis), the overhead from their scheduler clogging the cpu will be comparable to what amd had when playing gameworks games

>DX12
DED

Shit, I was going to buy a 1080 to replace my r9 280x. Time to wait(tm) again for the aftermath.

>wait for aftermath
There will be no "aftermath", as it cannot be fully fixed on Intel's and nVidia's side, simply because these problems exist on a hardware and not a software level. The ride will NEVER end. Not until they make completely new hardware from scratch. And Intel just recently CANCELLED THE FUCK OUT OF their next poocessor line, because they're STUCK and DON'T know what to do. Nvidia will highly likely follow next with "cancellation/moving further down the timeline" of the GayForce 2xxx series, because they haven't been using internal syscall modules for almost a decade now, and since they're directly affected by this - they're fucked until they return it. Which they can't anytime soon. Absolute cuckolds.

just to be clear
screencap this and watch whether any of those reputable youtubers do any before-and-after benchmarks

Ryzen + Vega feelsgoodman

Stop listening to this butthurt AYYMDPOORFAG

Nvidia schedulers are on the GPU and people already benchmarked and there's no performance loss

Did this shit really have to happen, now? I have a Ryzen 5 1600 but I also bought a GTX 1080. Am I fucked, or is this just a meme by radeonfags? I guess it could be worse, my older system had an Ivy Bridge i5 and a GTX 1050 Ti

they benchmarked a fix that is supposed to come from nvidia on jan 8?

my god your shill academy must be shit tier

images.anandtech.com/doci/10325/GeForce_GTX_1080_SM_Diagram_FINAL.png

Warp schedulers are on the GPU; this AYYMDPOORFAG that doesn't understand GPU architectures has been nothing but a lying piece of shit

How isn't it dumb?
>Nvidia's video cards don't have internal syscall module
>there's nvidia cards with exactly that
>Nvidia schedulers are on the GPU
>Additional die space is acquired by replacing the complex hardware scheduler with a simple software scheduler.

Not the anon you're replying to, but the AMD CPU + Nvidia GPU combo should be all right, because AMD doesn't need the Meltdown patch which slows things down.

Do you even READ?

nvidia.custhelp.com/app/answers/detail/a_id/4611

>Variant 1 (CVE-2017-5753): Mitigations are provided with the security update included in this bulletin. NVIDIA expects to work together with its ecosystem partners on future updates to further strengthen mitigations.

Nvidia is adding mitigations for Spectre because Spectre affects ALL CPUs, and Nvidia GPUs run on those platforms, something that you can't seem to understand

>Variant 3 (CVE-2017-5754): At this time, NVIDIA has no reason to believe that the NVIDIA GPU Display Driver is vulnerable to this variant.

Not affected by Meltdown at all, but hey, let's FUD because you're such a loser at life, Bondrewd

Ok, I was about to get terminal depression if it was going to cuck my performance since my CPU arrives on monday

Kepler removed (large) parts of Fermi's scoreboarding hardware, which (in Fermi!) allowed tracking of variable instruction latencies without any compiler knowledge about them. According to Nvidia, the main culprit was that this part was a power hog, so in order to trim down the fat, they moved fixed instruction latencies (most math ones) into the compiler's scheduling policies. What I am very unsure about is how this compiler scheduling turns out in practice, when (known) math instructions alternate with (unpredictable) memory operations.
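
To make that concrete, here's a toy C sketch of the scheduling idea (not Nvidia's compiler, obviously; the constants and loop counts are invented for illustration). A dependent chain stalls on every result, while several independent chains let fixed-latency math overlap, which is exactly the kind of ordering decision that moved from Fermi's scoreboarding hardware into Kepler's compiler:

```c
/* ilp_demo.c - toy illustration of static scheduling / latency hiding.
 * Build: gcc -O1 ilp_demo.c -o ilp_demo
 * (-O1 so the compiler keeps the loops recognizable) */
#include <stdio.h>
#include <time.h>

#define N 200000000L

static double now(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec * 1e-9;
}

int main(void)
{
    volatile double seed = 1.000000001; /* volatile blocks constant folding */
    double t0, t1;

    /* Dependent chain: every multiply must wait for the previous one. */
    double x = seed;
    t0 = now();
    for (long i = 0; i < N; i++)
        x *= 1.000000001;
    t1 = now();
    printf("dependent chain: %.2fs (x=%g)\n", t1 - t0, x);

    /* Same number of multiplies split over four independent chains:
     * the fixed-latency ops can overlap in the pipeline, the kind of
     * ordering a static scheduler can plan at compile time. */
    double a = seed, b = seed, c = seed, d = seed;
    t0 = now();
    for (long i = 0; i < N; i += 4) {
        a *= 1.000000001;
        b *= 1.000000001;
        c *= 1.000000001;
        d *= 1.000000001;
    }
    t1 = now();
    printf("4 independent:   %.2fs (sum=%g)\n", t1 - t0, a + b + c + d);
    return 0;
}
```

The second loop does the same number of multiplies but typically runs several times faster, because nothing has to wait on the previous result.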

come back to me when you actually have some knowledge about the matter beyond idiotic press-release diagrams

yes, and we have all seen the effects of syscalls having tremendous overhead once they pass a certain threshold, which a gpu that uses the cpu this much will basically hit...

>Your Shield device or GPU is not vulnerable to CVE-2017-5754, aka Meltdown; however, the two variants of Spectre could theoretically be used to infect you.

When you can't read

images.anandtech.com/doci/5699/GeForce_GTX_680_SM_Diagram_FINAL.png

Again, warp schedulers are on Kepler, you don't know shit

shit, nvidia changed the name of their scheduler to a catchier one

>ITS ONLY ON KEPLER

fucking hell, shills are so fucking idiotic they don't even know why the whole async fiasco happened 2 years ago. guess where nvidia does the context switch, senpai, and when you guess right, guess why it was always so slow

pro tip: no hit

>people unironically replying to poor bait or a literal 12yo
>people replying to posts with ellipses and #betterred
Sup Forums is worse than Sup Forums

lolololol didn't read but the name of the thread is enough for my ayy soul GET REKT GREENS

I haven't slept in 2 days but this post destroyed my sides

I can't stop laughing

If /ve/ga/ reference - batman, man.

...

>Am I fucked
Purely potentially - yes. This has been neither proven nor refuted thus far. We will see soon.

Again, the waiting game is on.

Where do you guys get the nerve to shit on nVidia, given how AMD dealt with Vega? And I'm saying this as a Ryzen + RX580 user.

Because it's fun to shit on companies with shitty practices.

It's bondrewd, a known butthurt amdrone shitting up various sites and forums 24/7.

There are literally 0 problems with /ve/ga/, it's good. The problems are with functions unavailable due to drivers and with devs not fully utilizing HBM2's potential yet. I dunno about the reference cards, but third-party veggies are super-solid cards.

Fix your defective card or buy a new one, kid.

So the Switch will need an update?

A good move from the banks to slow down the mass arrival of bitgoon miners; now the jews can monopolize cryptocurrency by making everyone shit at it. A fine move

Who cares what they call it, nVidia does it mainly on the CPU.

Ayy

>tfw 1600 + 1060

I knew I should've bought a 580

Bondrewd, just end your suffering and bite the bullet.

The 1060 performs roughly 20% better gaymen-FPS-wise, so getting an RX 480 or 580 instead is only better if you're strongly concerned about security. Basically, for peace of mind there's no better combination than AMD CPU + AMD GPU, but UNTIL the possibility of nVidia GPUs being compromised is actually proven, AMD CPU + nVidia GPU will stay the best combo raw-performance-wise.

NVIDIA IS FINISHED

>dodged the Intel bullet
>bought a 1080
>dem mixed feels
Somebody make a wojack out of this

What the fuck? The 480 and 1060 trade blows. There is no 20% delta, you fucking nvidia shill.

Oh sweetie if you fell for the nvidiajew then you are done for

yes there is

if you downclock the 480 to oblivion, that is

Wow.

You are largely fine, because syscall performance is not affected on AMD CPUs. Expect similar performance as before. An AMD GPU would only be advantageous over an nVidia one if you were using an Intel CPU.
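
If your kernel is new enough to carry the mitigation patches (roughly 4.15+, which is an assumption about your setup), you can ask it directly instead of taking anyone's word for it. A minimal sketch that reads the sysfs vulnerability files:

```c
/* vulnstatus.c - print the kernel's own view of the CPU bugs.
 * Build: gcc vulnstatus.c -o vulnstatus */
#include <stdio.h>

int main(void)
{
    const char *files[] = {
        "/sys/devices/system/cpu/vulnerabilities/meltdown",
        "/sys/devices/system/cpu/vulnerabilities/spectre_v1",
        "/sys/devices/system/cpu/vulnerabilities/spectre_v2",
    };

    for (int i = 0; i < 3; i++) {
        char line[256];
        FILE *f = fopen(files[i], "r");
        if (!f) {
            printf("%s: not present (kernel too old?)\n", files[i]);
            continue;
        }
        if (fgets(line, sizeof line, f))
            printf("%s: %s", files[i], line); /* line keeps its '\n' */
        fclose(f);
    }
    return 0;
}
```

On AMD the meltdown entry typically reads "Not affected", while a patched Intel box reports "Mitigation: PTI", i.e. the page-table isolation that makes syscalls more expensive.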

>480 and 1060 trade blows
Not in the slightest.
A 6GB 1060 is cheaper than a 4GB RX 480 or a 4GB RX 580.
And the 6GB 1060 gets roughly 10~15 more goyming FPS across the board than the 8GB RX 580. There are some titles in which Radeon does better, but they're few and far between. The 1060 does better in general, across the entirety of the PC MASTURACE's 19000+ titles released throughout all these years.

>syscall performance is not affected on AMD CPUs
But it's the nVidia GPU that does external syscalls in games, not the CPU...

lol

it doesn't affect gaming, so it doesn't matter

Yes, since AMD CPUs are not slowed down by syscalls, the nVidia GPU will not be slowed down either when it requests syscalls from the AMD CPU.

those syscalls literally don't affect cpu performance, you must be a retard if you think this will affect game fps, which is mainly limited by the GPU and not the CPU!

...

this

OP is grasping at straws to try to shill for amd
benchmarks or it didn't happen

GayForce does external syscalls to the OS, not the CPU. The OS is affected by performance degradation on syscalls, as the patch is applied to both the OS and the BIOS.

>those syscalls literally don't affect cpu performance
Of course they don't affect an AMD CPU. But the noVideo GayPoo is not a CPU. And it doesn't rely on CPU syscalls, but on OS syscalls. But nice try anyway.

i don't really care, i'll keep buying nvidia, since amd is always overpriced in my country
I mainly buy on price/performance, so if nvidia is fastest i'll buy it, you can keep memeing

>I don't really care, I'll keep buying GayForce
Here we go, people.
Typical ostrich-brained noVideot in a nutshell, ladies and gentlemen.

what do you expect? novidia shoved a botnet into their drivers and nobody did anything

>buy best price/performance
>ostrich
In Pubg a GTX 970 outperforms both the 580 and the 570; such an amazing buy, the 580, at only 350 euros (same as the 970 on release 5 years ago)

...

i am computer illiterate
what does this mean

literally nothing

It's the same underaged OP who spams anti-Nvidia shit. Nothing

>noVideo
>nVidiots
>#BETTERED
>greenassed

>botnet on their drivers
Huh?

Intel has a massive security bug affecting everything they've shat out in the last 10 years.
AMD is absolutely flawless, completely unaffected.
Security patches got released. Security """""(((patches)))""""" create a performance drop anywhere from 30% to 67%.
Patch affects syscalls directly.
Nvidia's video cards don't have an internal syscall module and heavily rely on external syscalls from the OS, thus potentially making nVidia cards susceptible to performance drops and essentially compromised even if paired with an AMD CPU. Alas, this has been neither proven nor refuted thus far. We will see soon.
If paired with an Intel CPU, however, nVidia GPUs get a 100% confirmed performance drop.
AMD is absolutely flawless, unaffected in both the security of its CPUs and the performance of its CPUs and GPUs. Radeon GPUs are absolutely not affected by syscall performance drops because Radeons have had an internal integrated syscall module as far back as GCN 1.1; nVidia, however, deliberately got rid of theirs back in the GTX 5xx days.
Worst combination of hardware EVER from this point on - Intel CPU + Nvidia GPU.
Fine, but potentially compromised combination of hardware - AMD CPU + nVidia GPU (the compromising is purely on nVidia's side if it ever happens, as AMD is absolutely flawless).
Absolutely flawless, impenetrable, invulnerable combination - AMD CPU + AMD GPU.
Brian Krzanich sold several million dollars' worth of Intel stock several days before this became public, while Intel's board of directors fully knew what was happening.
Covfefe Lake was released even though Intel already knew it was affected.
Each and EVERY Intel processor released in the last 10 years is affected.
Even if AMD encounters some future attempts at bypassing its absolutely flawless SEV protection system, any and all potential holes can be fixed easily on the software level.
Intel's holes are hardware and cannot be fully fixed.

>he doesnt know
nvidia users

...

source

No, I'm a glorious Radeon race representative, didn't use GayForce since GTX 285 days, that's why I'm asking.

This might have to do with CUDA. CUDA JIT-compiles a virtual ISA, and anything JIT-based is affected by Spectre.
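
The JIT part is real: the driver compiles PTX, CUDA's virtual ISA, into native GPU code at module-load time. A minimal sketch of that step via the CUDA driver API (assumes cuda.h and a working driver are installed; the do-nothing kernel is purely illustrative):

```c
/* ptx_jit.c - minimal sketch of the CUDA driver JIT-compiling PTX.
 * Build: gcc ptx_jit.c -lcuda -o ptx_jit */
#include <stdio.h>
#include <cuda.h>

/* A do-nothing kernel in PTX, CUDA's virtual ISA. */
static const char *ptx =
    ".version 6.0\n"
    ".target sm_30\n"
    ".address_size 64\n"
    ".visible .entry noop()\n"
    "{\n"
    "    ret;\n"
    "}\n";

int main(void)
{
    CUdevice dev;
    CUcontext ctx;
    CUmodule mod;
    CUfunction fn;

    if (cuInit(0) != CUDA_SUCCESS) {
        fprintf(stderr, "no CUDA driver\n");
        return 1;
    }
    cuDeviceGet(&dev, 0);
    cuCtxCreate(&ctx, 0, dev);

    /* This is the JIT step: the driver compiles the PTX text into
     * native code for whatever GPU is installed. */
    if (cuModuleLoadData(&mod, ptx) != CUDA_SUCCESS) {
        fprintf(stderr, "PTX JIT failed\n");
        return 1;
    }
    cuModuleGetFunction(&fn, mod, "noop");
    cuLaunchKernel(fn, 1, 1, 1, 1, 1, 1, 0, NULL, NULL, NULL);
    cuCtxSynchronize();
    printf("PTX JIT-compiled and kernel launched\n");

    cuModuleUnload(mod);
    cuCtxDestroy(ctx);
    return 0;
}
```

Whether that JIT path is actually exploitable via Spectre is a separate question; the sketch only shows where the on-the-fly compilation happens.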

>uses every other botnet like gooshit/microshit/apshit/etc
>whines that nvidia collects a few card parameters

you have to log in with facebook to use the glorious gameworks options
they also put telemetry on their arm cpu and don't tell what it does

This is a brainlet question

But why would sensitive info go through my GPU?

NO THIS CAN'T BE HAPPENING

source

>you don't have to log in with facebook, you can log in with whatever email account you want
>AMD has an ARM processor similar to Intel's ME built into their cpus, nobody knows what it actually does, just like intel's ME

>>AMD has an ARM processor similar to Intel's ME built into their cpus, nobody knows what it actually does, just like intel's ME
actually, we do know what it does, besides being the obligatory nsa botnet

>tfw R7 1700 + 8GB RX 470
Feels good man.

>Tfw Phenom II X4 955 + RX 460
the dragon platform still lives