"Ultra" settings has lost its meaning and is no longer something people generally should build for

A lot of the build help requests we see on here are from people wanting to "max out" games, but I generally find that this is an outdated goal, as even average gaming PCs are supremely powerful compared to what they used to be.

Here's a video that describes what I'm talking about

>youtube.com/watch?v=d5ZsaavKNR8

Maxing out a game these days usually means enabling "enthusiast" (read: dumb) effects that completely kill the framerate on even the best GPUs for something you'd be hard pressed to actually notice while playing the game. Even in comparison screenshots it's virtually impossible to see a difference in image quality.

Around a decade ago, the difference between medium quality and "ultra" settings was massive. We're talking muddy textures vs. realistic-looking textures. At times it was almost the difference between playing an N64 game and a PS2 game in terms of texture resolution, draw distance, etc.

Look at this screenshot of W3 at 1080p on Ultra settings, and then compare it to this screenshot of W3 running at 1080p on High settings. If you're being honest, can you actually tell the difference without squinting at very minor details? Keep in mind that these are screenshots. It's usually even less noticeable in motion.

(1/2)

the human eye can't see above 1080p

cdn4.dualshockers.com/wp-content/uploads/2015/05/witcher3_03_Ultra.png

cdn4.dualshockers.com/wp-content/uploads/2015/05/witcher3_03_High.png

Why is this relevant? Because achieving 100 FPS on Ultra is about $400 more expensive than achieving the same framerate on High, and I can't help but feel that most of the people asking for build help on here aren't as likely to notice the difference between the two as those of us on the helping side are.

The second problem is that benchmarks are often done using the absolute max settings (with good reason, mind), but this gives a skewed view of the capabilities of mid-range cards like the 580, 1070, etc. These cards are more than capable of running everything at the highest meaningful settings at very high framerates, but they can look like poor choices when benchmarks are run with incredibly taxing yet almost unnoticeable settings enabled.

I can't help but feel like people are being guided in the wrong direction when they get recommended a 1080 Ti for 1080p/144Hz gaming. Is it just me?

TL;DR: People are suggesting/buying hardware way above their actual desired performance targets because they simply don't know better, and we're giving them the wrong advice and/or they're asking the wrong question.

(2/2)

Go the fuck back to Sup Forums

>people are stupid
you don't say

This is a thread about giving advice to people on building PCs. Highly relevant to Sup Forums

This is shit anyone who bought a mid-range GPU to save some cash already knows.

This is why I've been buying mid-tier cards since the days of Voodoo cards. Usually $150-$200 will get you a really good card capable of playing most games on high enough settings that it doesn't matter. I'm still running a GTX 760.

The human eye also can't see more than ~56.5 fps

So 1060 or 1050ti?

hey what do ya know, for once op isn't a faggot. I've thought basically the same thing for years. One argument I can think of is that getting, say, a 1080ti over a 1070 means you'll be able to run more games on higher settings further into the future. Also, going from medium to high or ultra usually results in better particle effects, or fewer frame dips in situations with lots of particles, which is something you won't be able to see in 99% of side-by-side comparison shots like the ones you posted. I still think you're generally right though

I imagine a lot of these people get into a vicious cycle and just buy top of the line gpus every year anyways for "future-proofing" which almost entirely defeats the purpose.

1060

That's what I thought - good on you user

or just don't use AA

This belongs on Sup Forums more than GPU flamewars do. And he has a point.

friendly reminder that games are for kids and shouldn't be discussed outside of

Yes, I can see the difference immediately and obviously. The trees in the distance look like absolute shit in the High screenshot, like something out of a low budget PS2 game.

friendly reminder that shitposting is for kids and shouldn't be posted outside of

the human eye can't see above 240p if you are far enough away from the screen

This is a good thing though, isn't it? Graphics are basically as good now as they're ever going to get. It means that developers are going to start diverting funds toward quality of gameplay, story, music, and their game's actual aesthetic, instead of just improving raw graphical power.

We all know that's not gonna happen

In W3 the most taxing settings are mostly AA and proprietary ones like HairWorks.

Things like ambient occlusion are also very taxing at the highest levels.

At 1080p a 1080 Ti is overkill, but a single Ti can't sustain 144 Hz at 1440p.
Even with SLI 1080s I sometimes drop AA or some other setting to keep 140+ fps.

The thing with image fidelity vs. speed is that I will always go with speed. Once you're actually playing you pay less attention to very subtle detail, but a low framerate can really be shit.

>he doesn't go at a leisurely pace in RPGs and enjoy the atmosphere and scenery.

Fag

They will if they realize that they're not getting any return on that investment. You pour millions of dollars into a product's graphics and get no real improvement in graphical capabilities. No real edge over your competitors. No extra profit. So you start putting money into something else that'll actually drive sales.

A 1080 Ti can't even do 144 Hz on High at 1440p.

Stop playing at 1080p and get a 4K monitor if you can't see the difference between High and Ultra. It's fucking obvious.

But yes, Ultra is useless for you if you're a poorfag.

Now fuck off back to Sup Forums.

Exactly. Look at how popular a game like PlayerUnknown's Battlegrounds is with shit-tier graphics

The way I see it, Ultra is the screenshot setting: you set it to Ultra to take a nice screenshot and then go back to sensible settings to play the game. The difference is not noticeable at all while playing, but the little (yet expensive) improvements on Ultra make screenshots look a lot better. Looking at the OP image, you can notice that things like the trees in the distance look noticeably worse on High settings; you probably wouldn't notice it when actually playing the game, but in a static screenshot you can see the difference.

This has already been happening for years - sort of.

Graphical quality more-or-less plateaued in fucking 2007 when Crysis was released. It has been 10 years since then and the peak difference is still marginal, and many "AAA" titles today have even worse graphics.

Around the same time we started seeing the rise of games that focus less on graphics - indie gaming started to boom and even some big developers started releasing big-hit games with simpler graphics (LOL, DOTA2, Overwatch, Mario, Zelda, etc). Minecraft went public in 2009.

The bigger issue is that achieving the graphical quality of 2007 Crysis is STILL a huge development undertaking. Hardware advancements have made it easier to run intensive graphics, but the software development side hasn't improved in the same way, so it still takes just as much work as ever to create a game with that level of detail. This means that although graphics aren't really getting any better, the massive costs involved still mean that developers aren't going to be "diverting funds towards quality of gameplay" etc.

...

That makes sense. I suppose eventually the toolbase will catch up, too, and making amazing graphics will largely be a task for artists instead of programmers.

>RX vega launches
>All of a sudden threads about Ultra settings being a meme

That game looks good maxed out, you probably have shit specs.

you Sup Forumsoogeyman spammers are literally worse than Sup Forums, you don't belong here

fuck off /prog/fugee, your board was deleted for a reason, go back to your containment thread

so are you going to even bother disputing the points in the OP or just throw (((tinfoil))) innuendo around and pretend to be smart

Maybe if we were in 2013

hello this is technology board not game board
pls go to game board to post game threads

The human eye also cannot love more than one waifu

This is an outdated meme. Unmodded Crysis was visually impressive for the time (and still does look quite good), but it's not even close to being any sort of "gold standard" for videogame graphics. The difference between 2007 and 2017 may not be as big as 1997 and 2007, but videogames with a realistic aesthetic do look significantly better than they did 10 years ago.

Bought an r9 280x in 2012 and it still runs everything at high 60+ fps. Not much of a gaymer anymore so I'm riding this thing out till it dies.

...

>computer games
>computer

I have been making this point in threads for over a year. Knowing what to turn up/down to strike a balance between graphical detail and performance isn't that hard, and you'd be surprised how good a game can look while still running smoothly on what would normally be considered low-end hardware. Presets are cancer, and settings above High are unoptimized.

With that being said, while you may not be able to pick out individual details to show how a higher setting is objectively better, you can genuinely see a difference while playing. It could also be partly placebo, but the only way to find that out is to run blind tests with frame caps and whatnot, so you get a fluid 60fps+ and graphical detail is the only factor that differs. Personally I'm on the side of higher FOV, draw distance, and refresh rate over resolution and detail. But it is nice to get the whole package.

Huh? Even in your screenshot, the distance stuff looks waaaay better on ultra.

The only difference between The Witcher 3's high and ultra settings for textures is ultra allows the use of more VRAM. They're literally the same textures.

>2kliksphilip
Awwwww shieeeet, I recognized that voice.
I used to watch his videos about hammer editor.
Good watch.

Did the human eye get upgraded recently or something? I always thought it was 24 or 30 fps.

The problem is that the Ultra preset gives a consistent standard; once you go about tweaking individual settings you start biasing towards cost-effectiveness and against performance.

The end-all-be-all solution as I see it would be to benchmark from an all-low baseline and incrementally bump each feature in the engine to calculate the performance cost of each feature and of the different technologies available, then test for resolution costs as well. Weight RAM target clocks and capacity for resolution scaling; otherwise, test at the highest commonly available resolution.

There are just as many holes in that solution, though, given that different architectures (especially now) handle the same features in different ways. Who knows how hardware approaches like MCM will scale with something like shadows vs. AA compared to monolithic dies; furthermore, the impact of drivers and engine optimizations would complicate both progression and regression testing.

It would promote better optimization though, which I feel is the sole boon of this method versus just benchmarking Ultra.
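
To make the idea concrete, here's a rough sketch of what that incremental pass could look like, assuming a hypothetical run_benchmark(settings) helper that launches the game with a given settings dict and returns the average FPS it measures. The setting names and levels are made up for illustration; the point is the shape of the method, not a real benchmarking API.

# Sketch: per-feature cost relative to an all-low baseline.
# run_benchmark(settings) is a hypothetical helper supplied by the caller.

BASELINE = {"shadows": "low", "aa": "off", "occlusion": "off", "textures": "low"}

FEATURE_LEVELS = {
    "shadows": ["medium", "high", "ultra"],
    "aa": ["2xMSAA", "4xMSAA", "8xMSAA"],
    "occlusion": ["SSAO", "HBAO+"],
}

def feature_costs(run_benchmark):
    """Return the % FPS cost of each feature level vs. the all-low baseline."""
    base_fps = run_benchmark(BASELINE)
    costs = {}
    for feature, levels in FEATURE_LEVELS.items():
        for level in levels:
            # Bump exactly one feature at a time, keep everything else at baseline.
            settings = dict(BASELINE, **{feature: level})
            fps = run_benchmark(settings)
            costs[(feature, level)] = round(100 * (1 - fps / base_fps), 1)
    return costs

The output is a table of "this setting costs you roughly X% of your baseline framerate", which is exactly the number a buyer needs and which an Ultra-only benchmark hides.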

Sweet spot begins at 240fps and the limit is above 480fps

The eye doesn't see in frames, it sees in photons hitting cones and rods. 24fps was the standard embraced in the early days of film as it was the lowest number of frames that the human eye perceived as fluid, go lower and things appear like a slideshow, go faster and burn through the company's wallet.

The whole industry doesn't have to change, but there is definitely an opportunity for a hardware review site which tests games on medium/high settings.

There is no need to test at variable settings assuming you can make rough calculations for feature cost, especially if the cost is approximately linear.

For example:
All low - 240FPS
All low, 2xMSAA - 216FPS - 10%
All low, 8xMSAA - 144FPS - 40%

You know 4xMSAA is the sweet spot for 1080p@30", which is what you're targeting, so you'd expect roughly a 20% loss at low settings. Or maybe you're targeting 1440p, which means you'll want to see how the VRAM weighs in at that resolution and what kind of performance to expect relative to 1080p, or 4K.
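
Here's that estimate written out, assuming the cost scales roughly linearly with MSAA sample count (an assumption for the sake of the example, not a measured fact):

# Estimate the 4xMSAA cost from the measured 2x and 8x numbers above,
# assuming cost is roughly proportional to sample count.
base_fps = 240.0
measured = {2: 216.0, 8: 144.0}   # samples -> FPS at otherwise-low settings

cost_per_sample = {s: (1 - fps / base_fps) / s for s, fps in measured.items()}
avg_cost = sum(cost_per_sample.values()) / len(cost_per_sample)   # ~0.05 per sample

est_loss_4x = 4 * avg_cost                      # ~0.20, i.e. about a 20% loss
est_fps_4x = base_fps * (1 - est_loss_4x)       # ~192 FPS
print(f"estimated 4xMSAA: {est_loss_4x:.0%} loss, ~{est_fps_4x:.0f} FPS")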

...

>Maxing out a game these days usually means that you're enabling "enthusiast" (read: dumb) effects that completely kill the framerate on even the best of GPU's for something you'd be hard pressed to actually notice while playing the game.

Let me tell you my story. My one and only PC until just three months ago was a Dell XPS 630i that I bought new in the spring of 2007. It came with a Core 2 Duo CPU. I eventually put a Radeon HD 6850 GPU in there. That was my gaming rig until three months ago. I stripped every bit of detail and resolution out of every game for the last couple of years just to get a max of 15 fps in new titles like Witcher 3, Fallout 4, and so on. Modded Minecraft was essentially unplayable once I built a few machines in a single chunk.

Three months ago I put together an i7-7700K with 16 GB of DDR4-4200 and a GTX 1080 8GB on a Z270 motherboard, along with a pair of 512GB SSDs, displayed on a 1080p 144Hz monitor. Maxing out for me means "I FINALLY CAN, AND I LOVE IT", because I have been min-ing out for most of the last five years suffering with an old PC.

I'm not sure what you mean about "effects that kill the framerate", because I max out everything and still get 144 fps v-synced to my 144Hz monitor in just about every game out there today. I'm getting an average of about 120 fps in Witcher 3.

fuck off back to Sup Forums you retarded low IQ manchild!

Who here /1080Ti and just put everything on ultra anyway/?

...

Sounds like something a console plebian would say.
Look at the shading on the grass and trees. Ultra looks way better.

...

Thank you for your public service announcement. It ain't going to have much effect though because the most vocal people regarding consumer tech are the "thought leaders" that have invested a lot of their identities into the artificial enthusiast market that the video card companies shape and use as a marketing vehicle.

Maybe you're satisfied with medium settings on your shitbox; I'm not satisfied with current graphics even on Ultra. There is still so much more to be done. Not planning to give up on quality any time soon.

>people are waking up to the scammy "$800 GPU for dem maxed out 4K 60 fps experience, good goyim" meme
Finally

>It means that developers are going to start diverting funds toward quality of gameplay, story, music, and their game's actual aesthetic, instead of just improving raw graphical power.

If the gamedevs of the games that you play aren't doing this already then don't hold your breath for them to start now.

The same goes for the Arma series, except that all these games can look good with the right settings but are better to play with some turned off.

>RX vega launches
>All of a sudden threads about Ultra settings being a meme
Exactly.

>So you start putting money into something else that'll actually drive sales.

Microtransactions?

Especially lootboxes

Here you go, you can see 2k now.

The most annoying people are those who whine that you need a 1080 Ti to play 2160p/60 maxed out, and then recommend a 144Hz monitor instead, not realizing 144 fps is even harder to hit consistently than 4K. I have a 1080 and can't hit 144 fps in PUBG at 1440p on very low settings.

ITT: Blatant self-delusion from pitiful poorfags. Sad.

I agree. There's also very little detail difference between ultra and high which isn't worth the toll on performance in my opinion.

I'm still running a GTX 680 4GB and most newer games still hit 60fps on high settings instead of ultra.

I have been eyeing one of the new AMD cards though so I could get a slight increase in performance with less heat output.

4K @ 60fps = 3840 × 2160 × 60 = 497,664,000 pixels per second

1440p @ 144fps = 2560 × 1440 × 144 = 530,841,600 pixels per second

You also have to factor in that resolution is 100% GPU-dependent, while high framerates also require a strong CPU.
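
For anyone who wants to check the arithmetic behind those two numbers, here's the quick pixels-per-second calculation:

# Pixels pushed to the display per second at each target.
def pixel_rate(width, height, fps):
    return width * height * fps

print(f"{pixel_rate(3840, 2160, 60):,}")    # 4K @ 60     -> 497,664,000
print(f"{pixel_rate(2560, 1440, 144):,}")   # 1440p @ 144 -> 530,841,600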

That's pretty cute.
I'd recommend everyone in this thread get over there right now.

Anyways, Sup Forums is an anime website and video games are technology
Furthermore I don't get why people focus so much on the textures when the shitty way that the straw roof over here fails to blend into the air is far worse looking.

Where I live the 3GB and 6GB models have a $100 difference. Should I pull the trigger on the 3GB model, or save up to get the full package?

I think you are right. Next up, my purchases include a 1440p monitor because I find 1080p small for work, and a GTX 1070 for two reasons:
1. I want something for the next 5 years, with good graphics but not ultra
2. Linux drivers are good

>video games are technology

Gosh I guess SOFTWARE suddenly ISN'T TECHNOLOGY.
Better remove all the browser threads, all the video player threads, as those are obviously Sup Forums or Sup Forums content, and so on and so forth.
Need I point out that the subject of the thread is in regards to the technical aspects of the game and not how it plays or how the story is?
Are you genuinely incompetent?

...

Blame outdated apis still being mainstream. Nothing will change until Vulkan gains popularity

making software is technology
playing games is not :/

1070 isn't going to last 5 years at 1440p unless you only play esports titles. Just try using a 570 at 1080p in modern games.

Your point?
This thread is just as much about playing games as the MPV threads are about watching movies.

have you tried actually syncing your eyes' refresh rate to a monitor? it's pretty difficult

I'm not that user.
The 3GB variant not only has reduced memory, Nvidia has also disabled 10 percent of the processing cores, from 1,280 down to 1,152.
It's for you to weigh up the pros and cons; look at a few tests and decide.
I bought the 1060 6GB near launch and it was a huge boost over the 760 I had before; it can handle all the games I'm playing without issues. Oh, and most of the time it doesn't even get hot enough for the fans to kick in. I like that because it keeps things very clean and silent.

You don't make the rules, computer snob. We can't all use our computers like you, to download an internet's worth of gay porn.