WATCH_DOGS 2 OFFICIAL BENCHMARKS BY NVIDIA

Source: geforce.com/whats-new/guides/watch-dogs-2-graphics-and-performance-guide

>SLI gets 10 to 15 less FPS than non-SLI on average
what the fugg

Today's driver update should fix it, right? From the 11/28 patch notes:

> Game Ready
> Provides the optimal experience for Watch Dogs 2.
> Application SLI Profiles
> Added or updated the following SLI profiles:
> • Watch Dogs 2 - updated SLI profile
> • Call of Duty: Infinite Warfare - disabled SLI
> • Call of Duty: Modern Warfare Remastered - disabled SLI

>Need a 1060 just to get close to 60 fps on 1080

wew
fuck this industry

jesus christ those drops for AA and fog

>need a low-mid tier budget card to reach 60 fps/1080 on a current gen title

????

>falling for the SLI meme

>980 barely gets 60 fps at 1920 x 1080
Has it happened? Is my rig obsolete? I didn't want to buy another GPU until the 1100 series was out, but I cannot accept playing games on my PC under 60 fps.

>1060
>low-mid

What the hell is mid to you, then? A 1070?

Makes sense to me

1050/Ti - low
1060 - mid-low
1070 - mid
1080 - mid-high
Titan X - high

The 9 series has been obsolete for quite a while now, with the exception of the 980 Ti

>Fifth best video card out there outside of a Titan
>Low end

Holy fuck dude

>1080
>mid

Alright, I can clearly see you don't know what you're talking about.

Then why can I still get 90 fps in Battlefield 1, but Watch Dogs 2 runs at console levels?

>GTX 1070 costs $400
>cpu costs $330
>monitor costs $500
>mfw I've been going on logicalincrements for the last 3 years and never did the build because too expensive and parts get outdated the year after

>1080 - mid-high
Jesus Christ.

W-What about AMD?

English isn't your first language, is it? I said the 1080 is mid-high, and the 1070 is mid.

because the bf1 engine runs well

>watch dogs 2

literally who cares

Hey retard, you're still implying a 1080 is within mid-range like a big fucking idiot.

anyone who uses SLI is fucking retarded

>$500 monitor
You don't need a 1440p G-Sync monitor. I paid $200 for a 144 Hz 1080p monitor, and I'm perfectly happy with it. I'd never spend more than $250 on a monitor.

a single GTX 1080 beats out two GTX 1080s in SLI

what the fuck is this retarded benchmark?

No one does, we're just here to circlejerk over GPUs.

He's right though. The 1080 is an upper-mid-range card. It's the small-die version of Pascal. The high end is the large die, used in the Titan and the Ti from each series.
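
Since the small-die/large-die point keeps getting contested, here are the approximate Pascal die sizes as a rough sketch. These are my ballpark numbers from public spec listings, not from the thread or Nvidia's article:

```python
# Approximate Pascal die sizes in mm^2, per public spec listings (rough figures).
pascal_dies = {
    "GP107 (1050 / 1050 Ti)": 132,
    "GP106 (1060)": 200,
    "GP104 (1070 / 1080)": 314,  # the "small die" the post refers to
    "GP102 (Titan X)": 471,      # the "large die" high end
}
for chip, area in pascal_dies.items():
    print(f"{chip:<24} ~{area} mm^2")
```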

Am I gonna have to sell my 970 soon? ;_;

As long as the Titan X is on the market, that's exactly what the 1080 is. It's the upper end of the middle ground. What is so hard to understand about that?

someone forgot to do the SLI at Ubisoft

That would make sense if the game actually looked good and had some impressive technology in it. But it doesn't.

>51.2fps at 2k

Hang in there 980 ti, I want a couple more years out of you

No, this is not acceptable. Unless your game specifically goes for new records in graphics quality, it should run fine on high-tier cards from at least 3 generations back.

>1070 - mid

Hello newfag who bought his first computer parts after 2010.

You people are deluded as fuck and are the reason why companies are getting away with this so easily.

Titans aren't consumer cards, more like "prosumer."

Well to be fair here

Ultra presets usually contain some experimental-technology settings which absolutely murder frame rates

But then you will never see real colors on your monitors either.

>the brand spanking new cards can't even get above 100fps on 1080

seriously?

sounds fun

...

>Playing this garbage ever

Oh, you sweet summer child. I've been building systems since before you were born. To call the 1070 anything other than a middle-tier card is ignorant.

1080 is a high-enthusiast card.

Are you really gaming at 4k on a 980ti and expecting it to last?

I'm still doing fine at 1440p on a 780, at least until the 1080ti comes out.

No one cares about SLI/Crossfire anymore. Why would a company bother optimizing for multiple cards when less than 1% of its users have a setup that could take advantage of it?

>$450 GPU
>Mid-tier
l o l

No, you really fucking haven't. You are clueless as fuck; if you weren't, you wouldn't have said what you just said, because then you'd realize how stupid it makes you sound.

That's fine, I have my TV for "real" colors

hello there poorfags
pc gaming is not for you

Hello razer-kid, hardware discussion is not for you.

This, I don't understand why people don't realize that ultra settings aren't exactly "optimized" since by definition they expect the user to brute force whatever technologies they're using. You can't predict what card it will take to "max out" every setting in a given game.

>I'd never want to play at 2K

ok loser

Post a pic of your Speccy. Let's see how rich you are.
You show yours, and I'll show you mine.

Well let's see here, you're what, 16? I built my first PC in 1998, which means I've been building systems since before you were born, just like I said.

>Realizing you're being victimized by manufacturers is being a poorfag

From that statement, I'm guessing that you are a 17 year old kid.

Are you mentally retarded or something? The post says TWO KAY

PC master race faggots like you are the reason console players shitpost about PC.

Stop blaming the cards or your perception of their rankings. This is entirely on Watch Dogs 2 either being shit or being a shit port. There are hundreds of bad Unity games on Steam that won't run at more than 10-20 FPS even on a Titan X, because they are just that badly made, even though they look worse than Minecraft. Does that mean the systems are shit? Of course not.

Why would I play at 2k when my PS4 Pro can play at 4k?

good luck selling that meme card. The RX 480 and 1060 are cheaper and better; there's absolutely no reason to buy a GTX 970 unless you're selling it for a very low price.

300 dollaridoo nvidia card has been low-mid for as long as I can remember

I mean, it isn't, and you won't find a single hardware enthusiast who agrees with you. Saying something doesn't make it true. The high end for Pascal is the Titan XP and 1080 Ti. The 1080, like the 980 and 780, is a mid-range part.

>Trying this hard to sound old
Fucking pathetic.

Why are game devs forcing GPU makers to subsidize their laziness so they don't have to optimize their games?

Upscaled 1800p checkerboarding is not 4k
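
Back-of-napkin pixel math for the checkerboard argument. This is a rough model that just assumes checkerboard rendering shades about half the target pixels each frame and reconstructs the rest:

```python
# Pixel counts: native resolutions vs. a rough checkerboard-rendering model,
# where only ~half the target pixels are actually shaded each frame.
def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)     # 8,294,400 px
native_1800p = pixels(3200, 1800)  # 5,760,000 px
cb_4k = native_4k // 2             # ~4,147,200 px shaded per frame (model)
native_1080p = pixels(1920, 1080)  # 2,073,600 px

print(f"native 4K:         {native_4k:>9,} px")
print(f"native 1800p:      {native_1800p:>9,} px")
print(f"4K checkerboard:  ~{cb_4k:>9,} px shaded/frame")
print(f"native 1080p:      {native_1080p:>9,} px")
```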

no need to be so butthurt
you can always buy a console and play video games at 30fps in sub-720p lol

>thinks he's playing on genuine 4k.

My fucking sides

>Are you really gaming at 4k on a 980ti and expecting it to last?
Not him but I do. I'm just waiting on that 1080ti or whatever finally makes 4k 60fps in modern titles feasible.

at 30fps, maybe, gratz

This. The 8800 GT was ~$230 when it came out. The prices have inflated madly.
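
For anyone wondering, plain inflation doesn't close that gap. A rough sketch, assuming roughly a 1.17x cumulative US CPI increase from 2007 to 2016 (my ballpark figure, not from the thread):

```python
# Inflation-adjusting the 8800 GT launch price (back-of-napkin).
# ASSUMPTION: ~1.17x cumulative US CPI from late 2007 to 2016 (approximate).
launch_price_2007 = 230   # 8800 GT MSRP in USD, per the post above
cpi_multiplier = 1.17     # assumed cumulative inflation factor
adjusted = launch_price_2007 * cpi_multiplier
print(f"8800 GT in 2016 dollars: ~${adjusted:.0f}")  # ~$269
print("GTX 1070 MSRP:            $379 ($449 Founders Edition)")
```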

>1998
>17 years old
Know how I know you're underage?

It's a conspiracy. GPU manufacturers are pushing shit optimization and barely functional new technologies onto triple A games to force people to upgrade their GPU.

yeah, I'm not butthurt. I just want to see the shitty rig you have. As promised, here's mine

It's obviously a false-flagging console cuck.

Well you just said yourself you're not expecting it to last since you're upgrading within 1 gen.

>why are game devs intentionally holding back optimization on PC so consoles can keep up?

FTFY

Or I could build a $800 PC and play games made by people who know what the fuck they're doing, like D44M.

This is a fucking cop out

For years Ubisoft has made an example of themselves, consistently putting out the worst optimization efforts ever

Ultra graphics were always reserved for the high-end video cards, and in Nvidia's lineup that has always meant the x80 and x90 GPUs

This fucking notion that the 1070 is mid-tier is the most baseless garbage ever. Nvidia itself says its new mid-range card is the 1060, and that the 1070 and 1080 are its high-end and enthusiast models

Saying that ultra graphics are meant to run like shit and that it's OK for them to be unoptimized is a slap in the face and a blatant, blind defense of Ubisoft and their garbage game support. This is the SAME shit they pulled with Watch Dogs 1, where ultra ran like ass and nobody ate it up, because it was self-evidently shit

SPECCY THREAD!

>$800
yeah lol
only thing you're gonna play with that is undertale

But the PS4 Pro runs lots of games at native 4k, user.

Hasn't SLI been shit for a long time now?

Err, I do as in I do play. I messed that sentence up.

>buying a low tier gpu

lol at least get a mid tier gpu like titan x.

logicalincrements' $740 build has an RX 480, aka ~150 fps on average at 1080p in DOOM.

yeah it's a meme to get retards to buy two cards

>WATCH_DOGS 2
For an attosecond I thought I should pretend to care

>native 1080p
>renders at "native 3k" then upscales to 4k even though it's not native at all

>saying "2k" for 1440
What? 2k is 2048×1080. 1440p is QHD / WQHD.
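
Since this comes up every thread, here are the standard names and pixel counts. These are the usual DCI/consumer definitions, nothing from the article:

```python
# Common resolution names vs. actual dimensions (standard definitions).
resolutions = {
    "2K (DCI)":    (2048, 1080),  # what "2K" formally means
    "1080p / FHD": (1920, 1080),
    "1440p / QHD": (2560, 1440),  # what people usually mean by "2K"
    "4K UHD":      (3840, 2160),
    "4K (DCI)":    (4096, 2160),
}
for name, (w, h) in resolutions.items():
    print(f"{name:<12} {w}x{h} = {w * h:>9,} px")
```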

Name 1 recent game that is not an indie or a last gen remaster that runs at native 4K

>SLI-ing two mid-tier cards
Fucking retard.
Fuck off, autismo.

You are small time.

no not really, but apparently watch_dawgs 2 was not designed for it at all

actually it's "gpu's", there's 2 there

whilst you are posting from a rig that would struggle to run any game bar Minesweeper

Why not indie or last gen? Why do you need to move the goal posts like that? You should really just learn to accept when you're wrong, it's unseemly for a man to have to resort to such pathetic tactics in an argument.

are you from the future?

Not him, but it partially implies that AAA devs don't give a fuck about their games, to the point that their quality is absolute shit. AAA games were supposed to be top-tier in quality, performance, graphics, and everything in between. But they're fucking not in any way, and they're prime examples of bad development

You guys are fucking stupid, jesus christ. Resolution means jack fucking shit when you take shit such as supersampling/obnoxious amounts of AA and crank it up the ass.
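
He has a point on the numbers side, at least. A quick sketch showing that 1080p with 4x supersampling renders the same pixel count per frame as native 4K:

```python
# Supersampling multiplies the internally rendered resolution, so the output
# resolution alone says little about the actual GPU load.
base_w, base_h = 1920, 1080
for factor in (1, 2, 4):  # supersampling factor (samples per output pixel)
    rendered = base_w * base_h * factor
    print(f"1080p with {factor}x SSAA: {rendered:>9,} px rendered per frame")
# 4x SSAA at 1080p -> 8,294,400 px, the same count as native 3840x2160.
```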

>shiggydiggy

Is the case a refrigerator?