PC graphics settings

Would you rather run a game at high resolution with all the "goodies" (bloom, ssao, fxaa, etc) turned off or run a game at lower resolution and have all the "goodies" turned on

FPS > Native Res > Graphics > AA

Fps takes priority.
I would obviously run the game at as high a resolution as possible with stuff turned off, as long as it meant the framerate was high and stable enough.

>bloom
>goodie

Always native resolution (or higher than native resolution) at a stable 60 FPS. Resolution makes the single most obvious visual difference, while graphics settings often have effects that are barely noticeable even when they take massive resources to render.

Nah.
FPS stability > Image quality > Graphical fidelity

FPS > Native Res > AA > Graphics
It's true because my post ends in a 5.

>comparing 1080p shots
>in a resized image of height 524
>doesn't even divide the original dimensions
This triggers my autism.

I usually aim for native res and then tune down the settings to get a good framerate. Before I upgraded I would sometimes run 1600x900 in windowed mode instead, though. I refuse to run non-native fullscreen; it looks like shit.

>AA lowest
Nigger what

lol wrong faggot
End in 7 and Horizon gets ebin

The sweetspot between performance and fidelity is different for almost every game.

It's the classic 'you don't need AA at high resolutions' argument

My thoughts exactly. If you're going to upload a comparison the resolution of the image needs to be high enough to actually see the difference

autists want to see the pixels when playing shooters

Framerate comes first. It HAS to be at least 60.
Native resolution.
AA
View distance
And the filler graphical options come last.

Turn off Depth of field, Bloom, motion blur, film grain, vignette, and any other pointless movie shit that's used to advertise the game.

look at the gladiator's arm wrap detail, there's a difference

I like jagged edges, it makes games look like games. I don't know, maybe it's N64 nostalgia.

Not every game needs a 60fps framerate.
It's mandatory in fast paced games that require fast reactions.
Everything else is perfectly playable with +40

They aren't saying there isn't a difference, they're pointing out how the comparison image in OP is made in a really dumb and self-defeating way.

What are some post-processing effects that can/should be turned off?

I turn SSAO off.

this nigga got it

fxaa, grain and motion blur

>Turn off Depth of field, Bloom, motion blur, film grain, vignette, and any other pointless movie shit that's used to advertise the game.

This is probably the main thing I do before anything else when it comes to adjusting settings. Fuck blur of any kind. I want crystal clear images when I play video games, I don't give a single fuck about what the human eye does when it comes to distance, peripheral vision, etc. Fuck that noise.

Also fuck chromatic aberration. I love Bloodborne as much as the next guy but that shit pisses me the fuck off. All these types of things should be able to be turned off, no exceptions.

>Playing at anything between 30 and 60

in a serious way, this is true

FPS is most important, it has to be at least 60. After playing games at 80+ FPS for so long, 30 honestly looks so choppy it makes my eyes hurt or makes me feel sick.

>Everything else is perfectly playable with +40

Sure, but hitting the 60 mark usually means consistent framerates, and consistency is the main thing people want in addition to raw FPS. 99% of the time anything below that means it's not consistent and thus garbage. No one wants shitty fluctuating framerates.

Never ever peacock.

I don't care as long as the game is playable. That's seriously all I ask for.

Now this, I can get behind.

>png
fucking why? it was a gif

Speaking of AA

Which form of AA is the best?

i'll do both for a game and see what i like better. i prefer fps to eye candy

>png
>being this bad at shitposting

Supersampling.
Of course, it's also the slowest and takes the most resources. That's why FXAA and so on exist.
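If anyone's curious what supersampling actually does, here's a minimal numpy sketch (the `render` function is a made-up stand-in, not any real engine code): render at N times the target resolution, then average each NxN block down to one output pixel.

```python
import numpy as np

# Minimal sketch of supersampling (SSAA), assuming numpy; `render` is a
# made-up stand-in for a real renderer, not any game's actual code.
def supersample(render, w, h, factor=2):
    """Render at factor-times the target size, then box-filter down."""
    big = render(w * factor, h * factor)           # (h*factor, w*factor)
    # Average every factor x factor block into one output pixel.
    return big.reshape(h, factor, w, factor).mean(axis=(1, 3))

def render(w, h):
    """Toy 'renderer': a hard diagonal edge, the classic aliasing case."""
    y, x = np.mgrid[0:h, 0:w]
    return (x > y).astype(float)

aa = supersample(render, 4, 4, factor=4)
# Edge pixels now take intermediate values instead of jumping straight
# from 0.0 to 1.0, which is exactly what softens the jaggies.
```

The cost is obvious from the sketch: the renderer has to produce factor² as many pixels, which is why full SSAA is so much heavier than cheap post-process filters like FXAA.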

The only reason it's a problem is because of display frequencies. Play a game with G-Sync and you won't notice the difference above 40.

It's not always a question of "need". Games are choppy at lower frame rates, it's distracting. You can definitely play a game at 30 FPS and it's not a game killer even with relatively fast paced games, but in every case it's less enjoyable than a smooth high frame rate of 60 or above.

>playing at anything between 30 and 60

Gsync actually makes this totally okay

I just remembered that stupid AA that's used on some console games (Far Cry 4) that leaves after-images or some kind of motion blur whenever you look around.
>I heard that you liked motion blur, so I put motion blur on top of the motion blur you already had
Why does that thing even fucking exist?

>Chromatic aberration.

Oh right I forgot everyone owns a gsync(tm) monitor

Mid res+AA

Even if I could have high res and bloom I wouldn't. I'd rather have fucking low res and no bloom. Fuck bloom.

N64 didn't have jaggies though, it had hardware AA to take care of that. What it did have, as a result, was everything looking soft and blurry.

>game has a visual feature that uses a lot of resources and slows game down
>turning it all the way down has no visual difference, but runs better anyways
>has another taxing visual option that cannot be turned off in-game

Fucking Dying Light.

My current rig still has no problem getting 60fps on ultra in almost every game that isn't an optimisation shitshow.
But I still think that while 60fps is some kind of sweet spot, it's not the mandatory number some autists want it to be.

But I have also been playing games on PC for almost 25 years. Hardware jumps back in the day were bigger than they have been in the last 5 years, and that's probably why I'm ok with sub-60 in some games. Back then hardware was more expensive and got obsolete faster.

Which one? Motion blur or depth of field?

Is it true that once you get to 4k you basically don't need anti-aliasing anymore? That's pretty cool if it's true.

I don't think anyone legitimately believes it's a mandatory number. Tons of people who play games on PC get framerates below 60, whether it's because they can't afford good enough hardware or the game is poorly optimized. It doesn't stop them from enjoying the game if it's any good.

I sometimes get sub 60 FPS too, because my CPU is unsuitable for gaming, it's annoying but it doesn't kill it.

My guess would be chromatic aberration
Fuck this trend and devs that think it's the best thing since sliced bread.

No, not true. Of course, if you had something like a 15" screen at 4k, aliasing would be quite difficult to notice, but you still get aliasing at 4k; the extra pixels just make the jaggies finer and less obvious.

This exactly. I might compromise a little bit on the FPS if it means I can get my native res. But this is the right priority.

FPBP

>2001+16

>still being poor

>Implying I have to choose between them

I wish I understood how to use upscaling, just to try it.

Nigga, I'd disable most of that shit to get native resolution.
Effects in videogames are mostly lousy attempts at faking camera effects anyway, the worst being shit like excessive bloom and sunshafts and such. Motion blur is useless. Depth of field is usually overblown as fuck as well.
The one setting I find important enough to keep is occlusion in some form.

I wish games had settings for specular mapping as well. The models in some games look entirely like plastic because of excessive amounts of it.

AA kills performance. I use 2x or 0x depending on the game; 2x is enough most of the time anyway.

Looks prebuilt - disgusting.
Nice specs tho.

>2 grafix
>sli disabled

por que?

Those are some really high idle temps for the 980s

The problem with a "I'll always turn this off" mindset is that technology and techniques change.
In 2006 bloom was done by blurring the brightest parts of the image and wasn't worth a damn. Now that games use physically based shading, they don't even give you an option for it half the time because it makes 0.004% difference to the image. In 2008 motion blur was done by smudging everything in the same direction as the camera movement. Now that everything has compute shaders and the math's been worked out, most games can do it realistically and subtly.
Everything is better now. Only the things that are objectively bad like chromatic aberration are worth disabling. It's funny that people talk about film grain, because in a few years time you won't be able to turn it off. It'll be part of the temporal sampling solution, a random offset to make volumetric light shafts and sparsely sampled depth of field look better.
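That grain-as-sampling point is easy to demo with plain old dithering; here's a toy numpy sketch (the 8-level quantizer is a made-up stand-in for whatever precision-limited step the renderer actually performs) showing how adding noise before rounding trades hard banding for grain:

```python
import numpy as np

# Toy dithering demo, assuming numpy: an 8-level quantizer stands in for
# whatever precision-limited step the renderer actually performs.
rng = np.random.default_rng(0)
levels = 8

# A smooth horizontal gradient, 64 rows of the same 256-step ramp.
gradient = np.tile(np.linspace(0.0, 1.0, 256), (64, 1))

# Plain quantization: hard visible bands, only 8 distinct values.
banded = np.floor(gradient * (levels - 1) + 0.5) / (levels - 1)

# Add sub-quantization-step noise (the "grain") before rounding.
noise = rng.uniform(-0.5, 0.5, gradient.shape) / (levels - 1)
dithered = np.floor((gradient + noise) * (levels - 1) + 0.5) / (levels - 1)
dithered = np.clip(dithered, 0.0, 1.0)

# Each column of `dithered` now averages out to the true gradient value,
# so the hard steps dissolve into noise instead of visible bands.
```

Same principle as the random offsets used for sparse temporal sampling: the noise is doing real work, which is why it can end up baked into the pipeline rather than being an optional filter.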

Not true, but it means you don't need MSAA anymore. At 4k, FXAA works.

Quite simple. Play a game fullscreen at a rendering resolution lower than your monitor's native resolution, the image will be upscaled.
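For anyone wanting to poke at it, here's a rough numpy sketch of what the scaler does, using nearest-neighbour for simplicity (real GPU/monitor scalers usually interpolate with bilinear or better):

```python
import numpy as np

# Nearest-neighbour upscale sketch, assuming numpy: each output pixel
# copies the closest source pixel. Real scalers typically interpolate.
def upscale_nearest(img, out_h, out_w):
    in_h, in_w = img.shape
    ys = (np.arange(out_h) * in_h) // out_h   # map each output row to a source row
    xs = (np.arange(out_w) * in_w) // out_w   # map each output col to a source col
    return img[ys][:, xs]

# Stand-in for a 16:9 frame rendered below native res, then upscaled 2x.
frame = np.arange(9 * 16, dtype=float).reshape(9, 16)
scaled = upscale_nearest(frame, 18, 32)
```

Because the output pixels no longer map 1:1 onto source pixels, a real scaler has to interpolate between them, and that interpolation is where the non-native blur comes from.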

...

I turn off motion blur and chromatic aberration.

Nah, all the parts were bought by me and then put together.

Speccy just shows it that way, same reason it doesn't show overclocks on my CPU.

Those are the STRIX models. The fans are off most of the time so that's why the idle temps are slightly high compared to other cards. It makes the PC quiet AF though when you're just doing shit outside of gaming.

Right one is modded, im sure.

Shadows at least at medium > FPS > res > 2xMSAA > HD textures > rest of graphics options > highest MSAA possible

Obviously. There's no native way to turn off the Vaseline filter that covers everything in DaS.

3rd Person: 30+ fps
1st Person: 60+ fps

They both have the same texture mod for the Elite Knight armor.
The point of the image is to show the sort of difference in visual fidelity a higher resolution will provide, not to shitpost in some dark souls thread.

Nope. Dark souls actually has pretty insanely (for the time) high rez textures. But the game ran like absolute dogshit on consoles so they were downsampled to all fuck. Even on pc it took a while to get running well.

I mean yeah technically it's modded, but the modding wasn't adding new textures, just revealing the ones already in the game.

Always turn off Chromatic Aberration and Depth of Field. Motion Blur on a case by case basis. Some games it works, some it's awful.

I can barely notice AA.
If I focus really hard, maybe. But past 2x it really is worthless. The returns are terrible when you consider how much it hits performance.

>It's funny that people talk about film grain, because in a few years time you won't be able to turn it off. It'll be part of the temporal sampling solution, a random offset to make volumetric light shafts and sparsely sampled depth of field look better.
Upgraded my 9-year-old computer a few months back and noticed the shitty film grain in Deus Ex: MD that's impossible to turn off. mfw I learned this is due to how graphics are rendered now.

Ugh... I was waiting to load the dance animation.

Sorry buddy, while the original Dark Souls has extremely high quality assets, that specific picture is modded beyond dsfix. It's not the point, however.
See

Right one has SSAO where the left one doesn't

I can't understand all the shit Asus gets for their cards lately, I have the same one (but only one) and it's a preddy gud card.
It can also be OC'd very easily and still does a fine job without dying a fiery death.

Admittedly the Maximus VII Hero I have has had a couple of USB-based issues that are kind of annoying, especially with the Corsair K70. Still a pretty decent board, and it has had very few hiccups otherwise.

I can understand some of the hate with their monitors though. They seem to have extremely bad quality control and refurbishing. Love the rest of their stuff though, and their customer assistance has been pretty good.

>Turn off any form of AA
>Game runs at 260 FPS
Fucking master race gaming is just a meme.

CA, Grain, DoF, motion blur and bloom in some cases.

Both screenshots taken with the same dsfix presets, 'vssao2' iirc, the only difference between the two is the rendering resolution.

Always turn off motion blur, bloom, depth of field.
And set aa to a lower setting.

C H R O M A T I C A B E R R A T I O N
the biggest offender