He fell for the "midrange GPU" meme because he really thought a GTX 960 was enough due to his poverty

>he fell for the "midrange GPU" meme because he really thought a GTX 960 was enough due to his poverty.

Maybe you should have spent more time flipping burgers to save up for a real card, poorfucks.

maybe... maybe just turn down the antialiasing

>This thread again
Did you really enjoy the last one so much? Was it that successful?

moderate AA at 1080p shouldn't be much of a problem with cards like the 980Ti and 1080

>having to turn down antialiasing on his shitty card

oh god poorfucks make me laugh

>unoptimised console trash

>16x AA
>can't even tell the difference from 2x
>caring about other people's choices

is this bait

>Trusting a site that can't even make proper charts

>GTX970 SLI on par with GTX1070

wew lad, looks like I'm not upgrading this season then.

keep dreaming loser

Just how badly optimized is this game?

>That feeling when in the future the 1070 price will dip enough for a 1070 SLI

ALL SYSTEMS FULL POWER

For a midrange GPU, the RX 480 is doing really well.

>Original Titan the same as a 960.
>Beaten by a 280x.
NVIDIA doesn't gimp, guys.

I use a GTX950 that I got last year for $120 and am completely satisfied with it

>high end cards struggling to get 60fps at 1080p
Must be a poorly designed shit game

>playing newest AAA unoptimized trash

>mfw a/v/tists fell for the Paxwell meme

>DirectX 12

Yeah, I don't really give a shit, because I'm never installing that virus they're calling Windows 10 on any machine I own.

>idiots do benchmarks at ULTRA 16xAA
>When literally nobody needs that shit and it ruins your FPS

>barely breaks 60fps average on a fucking gtx 1080

Very badly optimized. It would be one thing if the game looked absolutely incredible, but it doesn't. I mean it doesn't look bad, but there's nothing to justify how badly it performs.

That said, hopefully this is just one of those outliers where it performs like dogshit at literally maxed settings but there's one single thing fucking it up. So you turn down antialiasing one notch and suddenly get an extra 30 frames or whatever.

>in the future the 1070 price will dip enough for a 1070 SLI
this is what I told myself I would be doing: I bought a single 1070 now and planned to go SLI in the future when the price comes back down to Earth

but let's be honest, by the time that happens we'll be better off just getting a single gtx 1170 or gtx 1260, or whatever card the company that buys out AMD makes. Technology marches on, after all

...

960 was not midrange though, it was bottom tier trash with a midrange price tag, because they can gouge the price and AMD can't compete.
The last usable "midrange" card nvidia did was the 660 or around that period. Decent price, decent performance.

The way I see it, the 1270 or whatever card exists at that time will probably be around $400, like the price of the 1070 right now. The 1070 by then will be somewhere in the ballpark of $200.

I'll have to look at benchmarks when that time rolls around, but it should be enough to last another 2 or so years. And I want to run an SLI rig at least once to say I did it.

*670

I have the 760 and it's pretty good desu yo. The 1060 also looks nice. Agreed on the 960 being poop though, it's like 5% better performance than a 760.

the game will use 6 GB of VRAM if you let it.

What a piece of poorly optimized shit.

1080 >> 1080 SLI
kek

Obviously a CPU bottleneck. Who knows what the next generation of CPUs will bring.

the creators are dumb
what did you expect faggot

That depends on stock.
Prices should fall, but sometimes they just don't. The R9 290X is still 300 bucks even though it's 2 generations old.

My friend bought a 4K monitor
Told me:
>I turned off AA, you don't need it at 4K
>I laughed
He got mad

True Story!

That's fair enough. This is all speculative and I'm making a bet.

If it drops, I go SLI for another generation or two. If it doesn't and cost is prohibitive, I get the next gen card and my PSU lasts several more years than it would have under SLI load. No real way to lose.

750ti ftw

This...so much this...

Why is the 280X placed below the Titan in the graph when it performs better?

Also, the 7970 (basically a 280X) released on January 9, 2012.

The GeForce GTX Titan released on February 19, 2013.

you're retarded, the 960 was literally useless because AMD's equivalents were way better

>AMD can't compete

Except the 380 was better than the 960 for less money, and the 380X was better than both. But yeah, AMD "can't compete" because retards like you won't buy their cards even when they are better. Because they "can't compete". Hurr.

Why on earth would you need more than 2x AA at 4K?

>the power of a 1070, only with 3.5GB
>no async
>no dx12
>no vulkan
>already due for gimping
>lel sli
wew lad

so it's badly optimised because it's actually using async and compute shaders, and not the single render path that nvidia calls async?

yeah, "badly optimised". We've been calling it since the 480 launched... Deus Ex will showcase just how many lies nvidia has told.
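
For anyone who doesn't know what "async" actually means here: in DX12 a game can feed the GPU through a separate compute queue alongside the graphics queue, instead of one serial render path. A minimal C++ sketch of the idea, assuming a D3D12 device already exists (the `device` pointer and the omitted error handling are just for illustration):

#include <d3d12.h>

// Graphics (direct) queue: the traditional render path.
D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
ID3D12CommandQueue* gfxQueue = nullptr;
device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

// Separate compute queue: work submitted here can overlap with
// graphics work, which is what "async compute" refers to.
D3D12_COMMAND_QUEUE_DESC computeDesc = {};
computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
ID3D12CommandQueue* computeQueue = nullptr;
device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

Whether the two queues actually execute concurrently depends on the hardware scheduler, which is the whole Maxwell vs GCN argument in this thread.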

on dx11 it still runs like fucking ass
it's badly optimized, period

>32x AA
>can't handle it at 130fps? You some kind of poorfag?

>implying anybody said this

and he still gets more fps than you

Until DX12 launches.

dx12 already launched. still getting more fps than you.

In what games other than Tomb Raider?

I'm just waiting to see how nvidia cards will tank, especially the legacy Maxwells, once these DX12 options are enabled

what do you think?

You what?

In my current library, I don't think there's even a single game that can fully utilize my 960, often even at maximum settings.

Best buy I ever made. It will be really easy to upgrade with no regrets once I manage to get employed and have actual disposable income.

In fact, it should be a really good PhysX card.

Now, the thing I actually regret is my Celeron. I should have gone with an AMD for 20 bucks, not a Celeron for 45, since I'm upgrading to a Pentium now anyway.

>playing newest games on midrange cards with VHQ settings

(you)

why do we make gpus a pissing contest when the real issue is modern games made by developers who couldn't program their way out of a paper bag. look how fucking unoptimized this garbage is

I have a 750ti and it is massive overkill for most of the stuff I play.

960 here, and same.

>1080 sli
>10 fps more

Sad days when devs fucking expect money for horse shit programming.

fucking faggot. I live in a third world shithole, you can literally buy a gtx 1080 in the US for the same amount of money I paid for my gtx 960 here.

kill yourself

If this were 6 years ago I'd agree, but AA doesn't have much performance impact on modern cards.

what the fuck, is that real?
are those framerates all divided by 10?
who would release software in this state?

Nixxes also did RotTR and that DX12 port sucks too. Nixxes can't code for shit.