Dear Sup Forums

I was browsing Sup Forums just now and came across this little pic: average and minimum FPS in Rise of the Tomb Raider with different GPUs.

The pic suggests that fluid animation (60 FPS) at 1920x1080 is only possible on a GTX 980/Radeon R9 Fury or better.

This shocked me a bit. Has this little train of rising GPU demands and poor optimization in games started to accelerate a bit too fast?

I used to buy a new card, from the GTX x60 series, once every 2-3 years and it was fine. Now it looks like they'd rather have me burn money on an x70 or even x80 card every year just to be able to play properly.

Or is this just the deal with "AAA" titles?

Isn't this a bit too far?

Anybody?

you don't have to run every game on max settings, numbnuts

This is for maximum graphical settings. Any decent midrange card can run this at 1080p60, you'll just have to go without the ridiculous multi-dimensional AA or super ultra ambient occlusion HD+ shit.

Ultra and sometimes even High settings are a waste. Shit like shadows and AO kills FPS.

Because shit like SSAO, tesselation, and high resolution textures didn't exist back then.

Yeah, there's a lot of games that could always be optimized better, but graphics have gotten more demanding to the point that it's straining hardware more than ever. Especially considering node shrinks are becoming harder to achieve.

>my gpu isn't even on there
Living the dream

That's a bullshot. If you don't believe me then feel free to download the game for yourself since it's in open beta.

Devs don't have the budget to optimize games because the PC market is borderline irrelevant.
Nvidia makes marketing deals with publishers.
The payment comes in the form of Nvidia-paid devs working on the game.
They have no interest in writing efficient code because their business is selling GPUs.
They implement shit like ultra shadow settings that cut fps by 30% compared to high, or that only work on 6GB GPUs.

Moral of the story: Figure out which settings cost the most fps and turn those to high instead of ultra.
You can't even see the difference.
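A crude sketch of what "figure out which settings cost the most fps" means in practice (this is a hypothetical harness, not anything from the game — real measurement would use the game's built-in benchmark or an overlay like MSI Afterburner): time the same workload per settings preset and compare average FPS.

```python
import time

def measure_fps(render_frame, duration=0.5):
    """Run `render_frame` in a loop for `duration` seconds and
    return the average frames per second achieved."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration:
        render_frame()
        frames += 1
    return frames / (time.perf_counter() - start)

# Stand-in workloads: pretend Ultra shadows cost ~50% more frame time.
def frame_high():
    time.sleep(0.010)   # ~100 fps ceiling

def frame_ultra():
    time.sleep(0.015)   # ~66 fps ceiling

if __name__ == "__main__":
    for name, fn in [("high", frame_high), ("ultra", frame_ultra)]:
        print(f"shadows={name}: {measure_fps(fn):.0f} fps")
```

Run each preset one at a time and the biggest fps deltas tell you which setting to drop from Ultra to High first.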

Many multi-plat games are poorly optimized for PC. Rise of the Tomb Raider is a good example of this; I can barely push 50 fps on my 980 Ti at High settings, 1080p.

Developers build around console specifications and consider the PC platform as an afterthought.

Tomb Raider was well optimized.
Rise of the Tomb Raider isn't because Nvidia got involved.

>That's a bullshot

No, it's a well-known fact that literal whos often do better texturing than "crowned" developers.

This is why you actually mod any game on PC.

My GTX 770 sure got nerfed quickly.

I bought it mid-2014 and it played everything on very high settings with 60fps. Now I can only play games on low and I don't even get 60fps (Witcher 3, Hitman, etc.)

Gonna order a GTX 1080 just to play at 1080p, I would wait for AMD but I heard Polaris is gonna be midrange at best. I just want silky smooth 60fps.

>That's a bullshot.
Kek, who started this meme?

Anyone that has actually played the game.

I play Halo MCC just fine at max settings 60fps.

The problem is DirectX 11 and developers who can't fucking optimize.

I've seen better-looking Korean and Chinese F2P MMOs designed for shit computers.

Unlike you I actually have played the game.

This is why
Nvidia holds a majority of the high-end GPU market. People who buy these cards actually ENJOY the prospect of spending more money every few years. Look it up: people get "upgrade fever" and just want to buy new hardware.

Nvidia doesn't optimize that well for older architectures anymore. It's quite likely they actively ignore or even sabotage older generations of their own cards in order to sell new ones.

Mate your CPU is on fire

Boot it up right now then, instead of posting HUD-less shots that for all we know are bullshots (they are)

The game just doesn't run very well, but as with most games, turning down a few resource-intensive features will net you a whole lot of FPS for an almost unnoticeable fidelity difference.

Fallout 4's God Rays come to mind. There's also almost no visual difference in that game between High and Ultra, except for the huge percentage increase in FPS you get on High.

You can turn off the HUD by pressing F9.
Play the game before you spout bullshit.

If only MHO wasn't optimized like shit.

Nice reading comprehension, mongoloid. I wasn't saying you can't hide the hud.

It's actually really well-optimized.

No, no it isn't. It's worse than Tree of Savior.

What? That's on PC?
I don't enjoy it, It's not fun fighting enemies and my fps tanks into the high 40s/low 50s in W3, hitman gets low fps but it is playable as it doesn't require a high level of precision. Doom runs like shit on my card as well.

They do sabotage their older cards. Look at GameWorks: it was really there to force Kepler users to upgrade to Maxwell. AMD has a neat workaround, but Kepler owners don't get any of that.

Yeah, Nvidia is a garbage company, but I just want to enjoy my games at 1080p 60fps and I'm getting impatient.

U fucking wotm8?
Game runs amazingly well considering the visuals, see .

The bullshot?

>Shows performance metrics from MSI Afterburner right in the image
>"BULLSHOT!"
user, are your parents related?

>it's impossible to place that over an image

So that would be a yes, then.

Then you should upgrade to AMD instead. You know the 1080 will get gimped again in 18 months, then you'll happily buy a 1180.

>bought a 980
>only play japanese games that use a tenth of the power

Flagship cards are a meme unless you play western games, and western games are shit

Is that why you still haven't shown an actual in-game screenshot yet?

Doesn't make sense at this point to get an R9 390 because GTX 1070 will be coming out soon.

Why is trolling this fun for you?

For $400. You can find R9 390s under $300

You've actually played the game while apparently I haven't. Why is it so hard to produce a screenshot?

This
"max" doesn't even fucking mean anything
>oh God I can't run hairworks at 16x AA at 60fps
>ugh I have to use high shadows instead of ultra shadows
>I'd rather have hbao+ instead of a higher resolution
>I need AA at 1440p
Fucking kids these days

And Polaris 10 will bring Fury performance at $299 retail

Anyone buying a gpu before next month is retarded

A lot of developers just don't take the time to optimize the games on PC. A lot of these games hardly have anything amazing graphically, but they're a pain in the ass to run perfectly for no fucking reason.

You done being retarded?

looks like a last gen game

>$300 for slightly worse than Fury performance vs $380 for slightly better than 980ti performance

If this is what happens, then AMD is fucked.

So does RotTR.

AAA titles are poorly optimized, breaking news

then it's far ahead of the other monhun games
plus, it is a last gen game

You can thank Denuvo for that.

But user, Mad Max used Denuvo, and it ran pretty well.

>My old card struggles on new games!
Holy shit!

youtube.com/watch?v=DvCsT_dk-1E

Mad Max was also an average-looking game.

gee wasn't helpful on this one.

How much am I losing with a GTX 960 at PCIe 2.0 x4, down from 2.0 x16? It's literally the only working slot I have.
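For back-of-the-envelope numbers (assuming PCIe 2.0's roughly 500 MB/s of usable payload per lane per direction, after 8b/10b encoding overhead):

```python
# Approximate usable PCIe 2.0 bandwidth per direction.
# 5 GT/s per lane with 8b/10b encoding leaves ~500 MB/s payload per lane.
PCIE2_MBPS_PER_LANE = 500

def pcie2_bandwidth_mbps(lanes):
    """Approximate usable PCIe 2.0 bandwidth (MB/s) for a given lane count."""
    return lanes * PCIE2_MBPS_PER_LANE

x4 = pcie2_bandwidth_mbps(4)    # 2000 MB/s
x16 = pcie2_bandwidth_mbps(16)  # 8000 MB/s
print(f"x4:  {x4} MB/s")
print(f"x16: {x16} MB/s")
print(f"x4 is {x4 / x16:.0%} of x16 bandwidth")
```

So x4 leaves a quarter of the link bandwidth; in practice a midrange card rarely saturates the link outside of loading and asset streaming, so the in-game hit is usually much smaller than that ratio suggests.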

He'll never be done. People like to pretend to have a double digit IQ a little too often.

It looked very nice but it had very little on screen. Just empty deserts and 6 enemies tops. And no AA.

This came up while I was watching that video. Strangely fitting

youtube.com/watch?v=_tlzzSxeenk

It'll be pretty bad.

You realize that the games themselves are coded with a bias towards GCN and Maxwell, actually making the GTX 780 less efficient, which is why it's beaten horribly by cards that it easily beat at launch.

Kepler is now performing much worse than GCN

Hopefully NVLink fixes this.

That's for a high-end card though; I don't think it'll matter at all with a GTX 960.

Wait a bit.
With the power gap of the next GPUs coming out, devs will just push optimization even lower on their priority list,
and /v/ will be full of threads saying this kind of shit:
>what, you don't have a 1070? fucking poorfag

Look at the last generation of console ports.
In the first few years, you could get by with some 256MB X1800 or whatever. But once the generation was in full swing, you needed at least an 8800 GT 512MB for most games at 720p medium/low.

The same will happen this generation. The 750 Ti has already been BTFO and the 950/960/7870 aren't doing so well.

I'd say a 390 or 980 are going to be the minimum effective GPU for most games until 2022.

>But once the generation was in full swing, you needed at least an 8800 GT 512MB for most games at 720p medium/low.
I assume you mean 1920x1080 high.
Hell, even BF4 ran well on medium settings at 1600x900 on an 8800 GTS.
youtube.com/watch?v=NtXRKUj_fGc