He fell for the "midrange GPU" meme because he really thought a GTX 960 was enough due to his poverty

>he fell for the "midrange GPU" meme because he really thought a GTX 960 was enough due to his poverty.

Maybe you should have spent more time flipping burgers to save up for a real card, poorfucks.

What are the settings, and what's the difference between each setting?

These are important things, because shadows from medium to max usually look about the same but use different processing under the hood, and some of that processing takes up far more resources.

I grabbed a 960 as a holdover card early this year while waiting for the 1000 line, but this gen fell a bit short of what I was expecting.

I'm hoping it lasts until the 1100 series, as it's been doing very well so far.

>1060 trading blows with 980
>1070 outperforming Titan X
>1080 destroying the Titan X
>Pascal Titan X capable of 4K 60fps

>this gen fell short

Kinda important to know what settings it's on. Probably max.

Turn things down a bit and you're golden.

Same here. I don't see the point in upgrading, especially with this low-yield issue jacking up prices. I'll wait for HBM2. The 960 is good for everything but a couple of games, but it's a shame about DXMD.

>1060 retails at same price as 970
>1070 same price as 980
>1080 same price as 980ti
I'm not him, but it's not a big improvement when you factor in price

Invalid benchmark, because Radeons aren't supposed to take a performance hit in DX12 vs DX11, and this is an AMD Gaming Evolved game.

>two of the best cards in the market running simultaneously can barely push a game beyond 60fps
These game devs really need to get their shit together.

Looking at the site this comes from, lowering settings seems to just lower the RAM use, not give more frames, or I should say, not significantly more frames.

From what I can tell, DX12 won't be in at launch and the game isn't out yet, so this has to be a beta build.

Nonsense.

The 1060 and the 560 Ti have the same MSRP, you're being silly.

The GTX 1080 SLI is disappointing. It's especially bad since even the Fury X is having trouble and it excels with DX12.

1080 SLI can't stretch its legs at such a low res; whatever the bottleneck is, it's not on the GPU side. I mean, 1080 SLI averages 71 FPS at 1080p and 51 FPS at 4K, so there's only a loss of 20 FPS for 4x as many pixels. It's by far the fastest option at 4K too (as you would obviously expect).
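Quick back-of-the-envelope in Python if anyone wants to sanity-check that; the FPS numbers are just the ones quoted above, everything else is arithmetic:

```python
# Frame-time vs pixel-count check for the 1080 SLI figures quoted above.
pixels_1080p = 1920 * 1080        # ~2.07 million pixels
pixels_4k    = 3840 * 2160        # ~8.29 million pixels

fps_1080p, fps_4k = 71, 51
frame_ms_1080p = 1000 / fps_1080p  # ~14.1 ms per frame
frame_ms_4k    = 1000 / fps_4k     # ~19.6 ms per frame

print(f"pixel ratio:      {pixels_4k / pixels_1080p:.1f}x")     # 4.0x
print(f"frame-time ratio: {frame_ms_4k / frame_ms_1080p:.2f}x")  # ~1.39x
```

If the GPUs were the limit at 1080p, frame time would grow roughly with the pixel count (~4x), not ~1.4x, which is why the bottleneck there looks like it's on the CPU/engine side.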

So what's this shit I hear about my glorious Pascal card not being good enough?

Then it's just a horribly optimised game.

DX:HR ran fine on a toaster so this is disappointing.

Anything ran fine on a toaster back around 2010. Xbox 360 software demand was at its zenith, so everything was cut out for that thing. PC versions differed little besides frame rate and resolution. The current consoles however are completely irrelevant in comparison, so devs take more risks, use more future tech so their sales on PC can last as long as possible. That strategy can result in a performance loss. That's the price you pay for progress and the well-deserved final death of console gaming.

>thinking console gaming will ever die
Top kek

>these frames at 1080
holy shit why

All things considered, it is dying. It'll be undead at best. I'll be looking pretty stupid if Microsoft's Xbox Scorpio project and PS4k turn into tremendous successes, but we all know they'll be closer to 2160x1440 than 4k resolution, and the vast majority of former console consumers won't give a shit anyway.

why the fuck does this game look so mediocre

Consoles are using "tiled rendering" at 1900p, upscaled to 4k, with some aliasing. It won't be true 4k, but close enough that frame rates will hit closer to 60 frames a second.
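Rough pixel math behind that, taking the 1900p figure at face value (the exact technique varies per game, this is just the raw pixel count with a 16:9 aspect assumed):

```python
# Pixel-count comparison: rendering at ~1900p vs native 4K.
def pixels(height, aspect=16 / 9):
    return round(height * aspect) * height

native_4k = pixels(2160)   # 3840 x 2160 = 8,294,400
rendered  = pixels(1900)   # ~3378 x 1900 = ~6,418,200

print(f"fraction of native 4K actually rendered: {rendered / native_4k:.0%}")  # ~77%
```

Rendering roughly three quarters of the native pixel count and upscaling the rest is where the claimed frame-rate headroom would come from.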

The GTX 960 was a laughable, overpriced card when it came out.

I don't know why people do this. It's like they did no research.

>It won't be true 4k, but close enough that frame rates will hit closer to 60 frames a second.
If it's running console tier hardware, that will actually be really fucking sick.

I don't think that's on the highest settings.
The game actually looks really good, but it and Rise of the Tomb Raider are some of the most taxing shit to date for GPUs.

That's on the highest settings. It's not a demanding game, actually. And it's still an open world made by a small team, which is impressive. It runs smoothly this way, well above 30fps, on my 1080. Only a handful of games can do that. I just like it because of the art and the scale. The hut I'm standing on in the first screenshot is barely visible in the distance here. If only the tech were a little better, you could see it clearly. Can't wait for someone to make a game with 4K and higher in mind.

>If it's running console tier hardware, that will actually be really fucking sick.

No one will buy a 4K set just for that; instead, they'll be upset about their new PS4s and Xbones being made obsolete by something that looks only marginally better on a 1080p TV. Most of these people would agree wholeheartedly if you told them a TV should last half a lifetime and a console should last a decade. Console gaming can no longer grow; it's impossible.

I haven't seen anything that suggests it's dying. In fact, isn't this generation (the PS4 at least, not sure about the XBone) selling a fair bit faster than the last one?
Tons of people have been buying, and are still buying, 4K TVs when there is basically no content to use them with, so when these consoles come out and give people an actual reason to buy a 4K TV, sales will increase even more. Also, the prices of 4K TVs are coming down a fair bit. I just saw a catalogue with a 4K 48" TV for $449 AUD, while the same brand's 1080p TV at the same size would be $350 or so. Very few people buying a new TV will be going with 1080p now.

Nah, shadows are usually based on shadow map tech and generated from a depth buffer. The resolution is the only thing that matters, so it's 1024, 2048, or 4096 shadow maps. They're always pretty fast to render.

Did you know that in RenderMan, when rendering professional CG, shadow maps over 2048 are usually a waste of time? Now you know.
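Quick sketch of why the map resolution is the main cost knob, assuming a single square 32-bit depth map per light (a simplification; real engines use cascades, 16/24-bit formats, atlases, and so on):

```python
# Shadow-map memory cost at common resolutions, assuming one square
# 32-bit depth map per light (a simplification for illustration).
BYTES_PER_TEXEL = 4  # 32-bit depth

for res in (1024, 2048, 4096):
    texels = res * res
    mib = texels * BYTES_PER_TEXEL / (1024 ** 2)
    print(f"{res}x{res}: {texels:>10,} texels, {mib:5.0f} MiB per map")

# 1024x1024:  1,048,576 texels,     4 MiB
# 2048x2048:  4,194,304 texels,    16 MiB
# 4096x4096: 16,777,216 texels,    64 MiB
```

Each doubling of the resolution quadruples the texels to rasterize and sample, which is also why going past 2048 buys you so little.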

Actually, at least where I live, most content is 720p at best, and 1080p is reserved for a few channels. 720p fits evenly into 4K (an exact 3x scale), so it wouldn't look as bad as it does on a 1080p screen.

Also, 4K is coming down to 1080p prices, and the only 1080p set I've seen that was worth a damn was a 40-50 inch OLED.

My fucking god do OLEDs look good.

Not really, shadows are fucking demanding things, and the visual difference between max and medium is usually insignificant.

There are a few games that scale the shadow map directly with quality, but in most games I look at, the map maxes out somewhere around medium and some extra filters are layered on top for the higher-end settings.
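For what it's worth, the pattern usually looks something like this; the tier names and numbers below are made up for illustration, not pulled from any actual game:

```python
# Hypothetical shadow-quality tiers illustrating the pattern described above:
# the map stops growing around medium, higher tiers mostly add filtering.
SHADOW_TIERS = {
    "low":    {"map_size": 1024, "filter": "hard shadows (1 tap)"},
    "medium": {"map_size": 2048, "filter": "PCF, 4 taps"},
    "high":   {"map_size": 2048, "filter": "PCF, 16 taps"},
    "ultra":  {"map_size": 2048, "filter": "PCF, 16 taps + contact hardening"},
}

for tier, cfg in SHADOW_TIERS.items():
    print(f"{tier:>6}: {cfg['map_size']} map, {cfg['filter']}")
```

Which is why medium and max often look nearly identical in screenshots while max still costs noticeably more GPU time.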

Because you have shit opinions and taste.

How the fuck can they run this on consoles then? Is the shit performance because of Denuvo?

Cause on consoles it's
A. Optimized to shit
B. Piss poor graphics

because it's made by slavs

This is why you have to defend your game.

I bought a GTX 960 because it's a silent card, which is more important to me than raw power.

Yeah, because you can turn down settings. Every PC game must be played at maxed-out settings, right?

Consoles are running it on low details, I reckon. The game plays well on PC aside from Ultra settings, which are hardly any different from High.
youtube.com/watch?v=02uxM3vyClY

Why's the RX 480 doing that much better than the R9 290? This looks like bollocks to me.

Enjoy your silent 16 fps.

>295x2 is still a beast

How can Nvidia compete.

DX12 performance is irrelevant to me

Kek, also they listed the 295X2 as a 4GB card, it's 8 gigs tho.

Things are going to run even worse in dx11, you massive tool.

It's still 4 GB per GPU

And that's nearly as irrelevant to me

I'm hyped for this game, but what the fuck? How unoptimised is this piece of shit that the crowns of last gen can't get 60? Since when was Deus Ex fucking Crysis?

Your existence is irrelevant.

Since you're unable to get the hint:
I'm not even running Windows, so why would I care about the DX performance of my GPU?

>he's a freetard
Yes, you're irrelevant.

That's still a bit disingenuous though; even if only 4 gigs of VRAM are usable during gaming, it's still an 8GB card. Might as well have it say 3.5 for the 970 as well.

I got a 960, went from an 8800, don't bully. There wasn't much that interested me enough to warrant a bigger upgrade. VR would warrant an upgrade, but nothing is happening there for a while.

I was planning to hold off on a complete system build until Zen.

They don't just add up all the memory on setups with 2 cards either.

The GTX 970 issue is tricky: technically it actually has 4GB that could be used, it's just that 0.5GB of it is crippled so badly they could have just left it off the card.
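Rough sketch of both memory points, using the widely reported figures (mirrored VRAM under alternate-frame rendering, and the 970's segmented pool):

```python
# Dual-GPU cards: with alternate-frame rendering each GPU keeps its own
# full copy of the assets, so the VRAM does not add up.
per_gpu_gb, gpu_count = 4, 2
print(f"295X2: {per_gpu_gb * gpu_count} GB on the board, "
      f"{per_gpu_gb} GB usable per frame")

# GTX 970: all 4 GB are addressable, but the last 0.5 GB sits behind a much
# slower path, so spilling past 3.5 GB is what actually hurts.
fast_gb, slow_gb = 3.5, 0.5
print(f"GTX 970: {fast_gb + slow_gb:.0f} GB addressable, "
      f"{fast_gb} GB at full speed")
```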

I have a 960 4GB / AMD 8350 / 6GB RAM.

I kind of want to play Deus Ex. The last game I played was Far Cry 4, and this comp ran it fine on medium settings. If I ran Deus Ex on low, what frame rates do you think I'll get?

>>Pascal Titan X capable of 4K 60fps
Not true, it can't get 60 in all games and its price has been bumped up to $1200

I had a 960 and I ran Far Cry 4 on high/ultra. You have some bottlenecks?

When I played it I only had 4GB of RAM, so it was probably that.