So, the shadows are working in Prey on Nvidia GPUs after a patch. And wow, look at that FPS drop. 25-30 FPS lost across the board. Sweet Jesus. youtube.com/watch?v=YC2ZlLoo14w

It's at least playable

How's this gayme?

I bought it but haven't played it yet.

>completely different game
>"switched to DX12"

You shills are getting really desperate, aren't you?

Wow so I actually found the source for this shill cocksucking bullshit, and SURPRISE it's testing a pre-release version of the fucking game. techreport.com/review/30639/examining-early-directx-12-performance-in-deus-ex-mankind-divided/3

"From our results and our subjective experience, it's clear that the developers behind Deus Ex: Mankind Divided have a lot of optimizing to do for Radeons before the game's DirectX 12 mode goes gold in a week and change"

Not surprisingly, in the release version it's fine. youtube.com/watch?v=oNF8c9y9GxE

A long time ago tech magazines would do image quality comparisons between different GPUs. Why did they stop doing that?

They're lazy shits.
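
It's not even hard to automate, which makes it worse. Here's a minimal sketch in C of the kind of diff tool a reviewer could run, using the stb_image single-header loader (github.com/nothings/stb). Assumes same-resolution lossless screenshots from each card; the tolerance of 8 is an arbitrary pick for illustration, not anyone's real methodology:

#include <stdio.h>
#include <stdlib.h>
#define STB_IMAGE_IMPLEMENTATION
#include "stb_image.h"

int main(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s card_a.png card_b.png\n", argv[0]);
        return 1;
    }
    int w1, h1, w2, h2, n;
    unsigned char *a = stbi_load(argv[1], &w1, &h1, &n, 3);
    unsigned char *b = stbi_load(argv[2], &w2, &h2, &n, 3);
    if (!a || !b || w1 != w2 || h1 != h2) {
        fprintf(stderr, "load failed or resolution mismatch\n");
        return 1;
    }
    /* count pixels where any RGB channel differs beyond a small
       tolerance (to ignore dithering/compression noise) */
    long long differing = 0, total = (long long)w1 * h1;
    for (long long i = 0; i < total * 3; i += 3) {
        int d0 = abs(a[i]     - b[i]);
        int d1 = abs(a[i + 1] - b[i + 1]);
        int d2 = abs(a[i + 2] - b[i + 2]);
        if (d0 > 8 || d1 > 8 || d2 > 8)
            differing++;
    }
    printf("%lld of %lld pixels differ (%.2f%%)\n",
           differing, total, 100.0 * differing / total);
    stbi_image_free(a);
    stbi_image_free(b);
    return 0;
}

Run it on PNG captures, not JPEG, or the tolerance ends up doing all the work.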

nvidia kept getting caught having weird anomalies like op's pics, where shadows were not being rendered, color wasn't as rich, AA issues, etc.

it's not a meme when you here /g/ shitpost about color issues and whatnot with nvidia cards. this has been a long-standing issue with nvidia.

Decade old issue nobody wants to talk about

>here
hear* and before anyone accuses me of being an amd shill, i'm using a 1080 so fuck off.

DELET

>nvidia kept getting caught having weird anomalies like op's pics, where shadows were not being rendered, color wasn't as rich, AA issues, etc.

[citation needed]

...

STOP, DELETE THIS RIGHT NOW

That just looks like a lower resolution entirely.

Also, not only the graphics, but the physics changes too. Even between cards of the same company you can have worse or better physics, but no one gives a shit.

Kek
Pretty smart though.
>Have the drivers nerf graphical settings so all the reviewers see nice high FPS numbers
>The 'general opinion' of the card is sorted at this point so then they reintroduce the features quietly to hopefully get people to be like 'wow it's even better now, thanks nvidia!'
>reviewers won't bother retesting the cards

Too late though since you already got nvidia'd "The way it's meant to be played"

The Tech Report is decent, don't shit on them because some dicksucker took a paragraph out of context.

Not many reasons to do it anymore.
They used to do comparisons up to the AMD 7000 series, which had lower default quality than the 6000 series (High on the 6000 series is equivalent to Very High on the 7000 series). The next AMD "gen" was the 200 series, which was just the 7000 series rebranded with only the 285/290/290X being new; after that came the Fury X.

At some point AMD, with a driver update, started using the same tricks Nvidia does, like bilinear filtering instead of trilinear in scenes where trilinear is not needed, etc.

Nowadays it's all up to the game developer to implement their AF (anisotropic filtering). You can force it through the Control Panel, but that's hit or miss since it uses a "generic" implementation that has to work across all games, while the in-game implementation is tailored to the game engine.
Sometimes the control panel AF gives lower quality than the in-game AF, and sometimes the opposite: the in-game AF is awful and the control panel setting provides better quality.

Some cases where the control panel AF is worse: Battlefield 4. Some where it's better: Just Cause 3.
Nowadays you can just set AF x16 and only take a very tiny performance hit.
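
For reference, this is all an in-game AF toggle really amounts to under the hood. A minimal sketch using the classic OpenGL GL_EXT_texture_filter_anisotropic extension; just an illustration, assuming an active GL context, a bound GL_TEXTURE_2D, and that the extension is present:

#include <GL/gl.h>
#include <GL/glext.h>

/* set trilinear filtering plus the maximum supported anisotropy
   (capped at 16x) on the currently bound texture */
void set_texture_filtering(void)
{
    GLfloat max_aniso = 1.0f;

    /* trilinear: the "expensive" base filter that drivers have
       historically swapped for bilinear where they hoped nobody
       would notice */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* ask the driver what the hardware supports (16.0 on anything
       remotely recent), then apply it per texture -- this is the
       "tailored" part a control panel override can't do */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
    if (max_aniso > 16.0f)
        max_aniso = 16.0f;
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    max_aniso);
}

The engine gets to decide which textures actually need it, which is exactly why the in-game path can beat the driver's blanket override.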

Now the question is: which reviewers are going to bother updating their articles? I bet we won't see TOMS SHILLWARE update, for example.

>I don't know what delta color compression is: the post

Do you even know what delta color compression is?

What you're describing is literally optimization. Frankly, it should be considered amazing that graphics card manufacturers, green and red alike, can squeeze out these kinds of improvements on a card-by-card, case-by-case basis. Reducing raw rendering workload with clever methods generally imperceptible to most people is how they keep their hardware relevant. It's the reason drivers exist, you twat.
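
Since the term keeps getting thrown around: here's the rough idea behind delta color compression as a toy C program. This is NOT the real hardware algorithm (that's proprietary, lossless, and done per tile in the memory controller), just the concept: store an anchor pixel plus deltas, and smooth tiles cost far fewer bits of bandwidth than noisy ones.

#include <stdint.h>
#include <stdio.h>

#define TILE 8

/* returns how many bits the worst-case delta in a tile needs,
   i.e. how well this tile would compress */
static int tile_delta_bits(const uint8_t px[TILE * TILE])
{
    uint8_t anchor = px[0];
    int max_delta = 0;
    for (int i = 1; i < TILE * TILE; i++) {
        int d = px[i] - anchor;
        if (d < 0) d = -d;
        if (d > max_delta) max_delta = d;
    }
    int bits = 0;
    while ((1 << bits) <= max_delta)
        bits++;
    return bits; /* 0 bits = flat tile, 8 bits = incompressible */
}

int main(void)
{
    uint8_t smooth[TILE * TILE], noisy[TILE * TILE];
    for (int i = 0; i < TILE * TILE; i++) {
        smooth[i] = 128 + (i % 3);     /* near-flat tile, tiny deltas */
        noisy[i] = (uint8_t)(i * 97);  /* high-frequency noise */
    }
    printf("smooth tile: %d bits/delta\n", tile_delta_bits(smooth));
    printf("noisy tile:  %d bits/delta\n", tile_delta_bits(noisy));
    return 0;
}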

I don't even know what you are sperging about.

Happens on all GPUs.

Never going to buy an nvidia card again after their "driver updates? Better sign up with facebook" shit. Also my old gtx 560ti died after pretty much exactly 2 years; it was maybe one month out of warranty when I started getting problems, and it kept getting worse, and the new drivers just kept shitting on it.

Can someone post the amd / nvidia comparison with a bunch of naked anime girls around a table?

It's kind of like this one

youtube.com/watch?v=nSqWqbLMMrI


Too bad for AMD.

wtf, how can an AMD card from 2009 completely BTFO their own high-end card from last year?

People like this should be banned from Sup Forums.
It's one thing to shitpost and post obvious fake shit for lulz.
It's another to seriously misrepresent things and shill like that.

It seems it was more of a problem in the past, and this one may have just been a game issue.
Maybe an efficient algorithm required intrinsic shader functions available only on AMD GPUs and they initially didn't write a fallback for Nvidia ones. It might not be Nvidia's driver.
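
Something like this, hypothetically. All the names below are made up for illustration; the point is just that the engine has to write the dispatch itself, and if the portable branch is missing or broken, the effect silently degrades on the other vendor's cards without the driver doing anything wrong:

#include <stdbool.h>
#include <stdio.h>

typedef enum { SHADER_PORTABLE, SHADER_AMD_INTRINSICS } shader_variant_t;

static shader_variant_t pick_shadow_shader(bool amd_intrinsics_supported)
{
    /* if the engine ships only the intrinsics path and never adds
       this branch, shadows silently break (or get skipped) on other
       vendors' GPUs -- a game bug, not a driver cheat */
    return amd_intrinsics_supported ? SHADER_AMD_INTRINSICS
                                    : SHADER_PORTABLE;
}

int main(void)
{
    printf("AMD path:    %d\n", pick_shadow_shader(true));
    printf("Nvidia path: %d\n", pick_shadow_shader(false));
    return 0;
}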

In the past? Definitely. Nvidia did that shady shit a lot when AMD was butt-devastating them in the 4000-7000 series days.

That just looks like contrast... the shadows are there, but with less contrast and not blended right.

Na, lots of them worked on AMD ones fine.

>but the physics changes too
Examples, please?

I think you wandered into the wrong thread here

You know there's a timer on post deletion, right?

Yeah, what's your point?

My point is below my eyes :^)

In DOOM they reduced the render distance.
>>>google

Well, it's a lot closer than it was before. They're neck and neck in most scenes, with the 1060 gaining a slight 10 FPS advantage at times. Not really a blowout result.

The GTX 1080 press driver happened to render less snow than normal in Ashes of the Singularity and thus improved fps slightly.
Of course this was just an accident and not intentional.

youtu.be/YC2ZlLoo14w?t=31 - no shadows around the small poles at the helicopter base in either version.
youtu.be/YC2ZlLoo14w?t=75 - flat lighting on the neck in either version (was it different with AMD?).
youtu.be/YC2ZlLoo14w?t=117 - no shadow from the LCD in either version.

Ok, are those present with AMD too?

Putting aside corporate cheating, how would this happen? Is it possible to make a driver where shadows don't work for one game, even one using the same engine as another where they do work?

They used a beta driver that was 2 weeks old and not even available to the public; a public driver without the issue was out by the time they showcased Polaris. Also, the benchmarks were found in the Ashes database and were made the same day, so it's not like they ran them 2 weeks before the presentation. AMD had no excuse for using that Nvidia beta driver when the public driver was available.

>it's testing a pre-release version of the fucking game.
Latency on the Nvidia cards seems fine though, definitely looks like an issue on AMD's side

They used the driver that was sent to journalists to expose Nvidia's cheating.

Some games can render at the wrong depth, which the driver can fix; it actually happens quite often with the skybox and effects like smoke and walls. In this specific case the problem was the game itself and not the driver. People demand a lot from drivers nowadays instead of telling game developers to fix their shitty broken games. I personally think Nvidia/AMD deserve all the backlash they get when a game doesn't work right, for trying to push that "GameReady" driver bullshit instead of releasing stable drivers every quarter or so.
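
And for the anon above asking how a driver can break (or fix) shadows in exactly one game: drivers carry per-application profile tables keyed on the executable name. Here's a made-up sketch of the idea; no real driver exposes anything like this publicly, and all the names and flags are invented for illustration:

#include <stdio.h>
#include <string.h>

struct app_profile {
    const char *exe_name;
    unsigned    clamp_skybox_depth : 1; /* fix smoke/skybox depth bugs */
    unsigned    disable_shader_opt : 1; /* work around a miscompile    */
};

/* hypothetical profile table; real drivers ship these in binary blobs */
static const struct app_profile profiles[] = {
    { "prey.exe",       .clamp_skybox_depth = 1 },
    { "brokengame.exe", .disable_shader_opt = 1 },
};

static const struct app_profile *lookup_profile(const char *exe)
{
    for (size_t i = 0; i < sizeof profiles / sizeof profiles[0]; i++)
        if (strcmp(profiles[i].exe_name, exe) == 0)
            return &profiles[i];
    return NULL; /* no profile: default driver behaviour */
}

int main(void)
{
    const struct app_profile *p = lookup_profile("prey.exe");
    printf("prey.exe gets skybox workaround: %d\n",
           p ? p->clamp_skybox_depth : 0);
    return 0;
}

That's why one game on the same engine can behave differently: its exe simply matches a different profile (or none at all).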

You seriously don't believe this, do you?
It was a bug and it was patched. If they wanted to cheat they wouldn't have fixed it; it was fixed even before Polaris was showcased.

they fixed it 5 days after all the gtx 1080 reviews were out. All of them were done with a driver that conveniently forgot to render half the scene. How would that happen by accident?

It happens all over the place, and I doubt it's all intentional either. I remember years ago when the Catalyst control panel launched with an "optimisations" tick box which tried to improve performance for unnoticeable degradation. I suspect all game-ready drivers do something similar and people eat that shit up. But when the Catalyst thing came out, people were up in arms.

Doom and Ashes are the only two examples that have come to light that I can remember recently. But with the amount of shadow and texture flicker still prevalent in games, I don't think it's possible to tell whether it's dev incompetence, driver bugs, or aggressive optimisations from devs or driver teams.

I tend not to update drivers that often unless something is truly broken. For example, I dodged a bullet on nier2, where more recent amd drivers cause problems, but I'm on 16.11 or something without issue (apart from crashing when I drop internet connection; available offline my arse).

This would be a good argument if it weren't for the fact that:
a. it was fixed.
b. sequential benchmarks with the fixed driver showed no performance difference at all with the snow rendered correctly.

>I remember years ago when the Catalyst control panel launched with an "optimisations" tick box which tried to improve performance for unnoticeable degradation.
Not that guy, but that was mainly to fight the overuse of tessellation in some games where it gave no noticeable benefit but a huge performance hit. Crysis was the original sinner IIRC.

Damn that's shady.

It's shady, it's a play, it's a shadowplay™