Have we hit a plateau in GPU technology?

There doesn't seem to be much point in going past 4K for non-VR, and super-wide monitors are useless shit.
Until VR becomes relevant (which won't be for a while), what reason could there be to ever go past a 1080?

Yes, though realtime graphics could still improve over the next decades.

It seems more like you're complaining about hitting a plateau in video game graphics rather than GPU technology. GPUs are getting faster each year, which is why miners are buying them up at ridiculous prices, and companies like Pixar will keep making better movies with better GPU tech.

There hasn't been a point in increasing resolution for some time now, but because it's a "number" that can be "higher", people will keep buying newer stuff with higher numbers.

There is a point: selling stuff to people who want "better" stuff than their peers. Since they aren't knowledgeable on the subject, they need a number to cling to.

If I can see aliasing the resolution is still too low.

Will 1080p graphics demands keep increasing at the speed they did before? I mean, what else is there to improve? There are only so many pixels.
How are they going to sell future GPUs if modern ones can handle everything at 1080p and less than 5% of people have monitors above that resolution?
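
For scale, the raw pixel math (plain Python; nothing assumed beyond the standard resolutions):

```
# Back-of-the-envelope: pixels per frame and per second at 60fps for
# common resolutions. Raw shading work scales roughly with this.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # 1080p as the reference point
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>6}: {px / 1e6:5.2f} MPix/frame, "
          f"{px * 60 / 1e6:7.1f} MPix/s @60fps, "
          f"{px / base:4.2f}x the work of 1080p")
```

4K is exactly 4x the pixels of 1080p, which is why a card that crushes 1080p merely copes at 4K.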

Convince gamers that they need higher resolutions. Also, I can see the price of higher-resolution monitors coming down over time.


We've plateaued because Vega is a failure, and Nvidia is taking their sweet time releasing consumer Ampere/Volta.

Competition does not accelerate roadmaps (unless it's a third rebadge of the same uArch, courtesy of Intel).

Maybe, if you see graphics as 3D photographs, because Frostbite has mastered photogrammetry. Like Unreal Engine, they kind of cheat by using baked lights and shadows.

But there's a lot of work left for physical realism; you can't destroy or change the physical qualities of realistic environments yet.
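
For anyone wondering what "baked" means in practice: the expensive global illumination is precomputed offline into a lightmap texture, so the runtime cost is a fetch and a multiply. A toy numpy sketch of the runtime side only (the arrays are hypothetical stand-ins, not any engine's actual data):

```
import numpy as np

# Hypothetical stand-ins: a baked lightmap and an albedo texture. In a
# real engine the lightmap comes from an offline global-illumination bake.
lightmap = np.random.rand(64, 64, 3).astype(np.float32)  # precomputed irradiance
albedo   = np.random.rand(64, 64, 3).astype(np.float32)  # surface base color

def shade_baked(u, v):
    """Runtime shading with baked lighting: a texture fetch and a multiply.
    No light or shadow is computed here; all of that happened offline."""
    x = int(u * (lightmap.shape[1] - 1))
    y = int(v * (lightmap.shape[0] - 1))
    return albedo[y, x] * lightmap[y, x]  # diffuse albedo * baked irradiance

print(shade_baked(0.5, 0.5))
```

It's also why you can't destroy or move anything: changing the geometry would invalidate the precomputed lightmap.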

We might see hardware geared towards path tracing in the near future. It's an ongoing topic of research at Nvidia.
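
A toy sketch of why path tracing eats so much compute: every pixel is a Monte Carlo estimate over random hemisphere directions, and the noise only falls off as 1/sqrt(N). Plain numpy, with a made-up "sky" standing in for the scene:

```
import numpy as np

rng = np.random.default_rng(0)

def sky(direction):
    """Toy environment light: brighter toward straight up. Stands in for
    whatever the traced ray would actually hit in a real scene."""
    return max(direction[2], 0.0)

def sample_hemisphere():
    """Uniformly sample a direction on the upper hemisphere."""
    v = rng.normal(size=3)
    v /= np.linalg.norm(v)
    v[2] = abs(v[2])
    return v

def shade(n_samples):
    """Monte Carlo estimate of diffuse lighting at one surface point:
    average incoming light weighted by cos(theta) over random directions."""
    total = 0.0
    for _ in range(n_samples):
        d = sample_hemisphere()
        total += sky(d) * d[2]  # d[2] is cos(theta) against the up-facing normal
    # Uniform hemisphere pdf is 1/(2*pi), hence the 2*pi factor.
    return total / n_samples * 2 * np.pi

# Noise shrinks as 1/sqrt(N): 100x the samples buys only 10x less noise.
for n in (16, 256, 4096):
    estimates = [shade(n) for _ in range(8)]
    print(f"{n:5d} samples/pixel: mean={np.mean(estimates):.3f}, "
          f"stddev={np.std(estimates):.3f}")
```

Multiply that per-pixel cost by 8 million pixels and 60 frames a second and the appeal of dedicated hardware is obvious.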

I'm going to be pedantic and point out that Pixar movies are rendered entirely in software (on CPUs), not on graphics hardware.

Can't wait for the VR meme to die.

Shit is a useless waste of money.

I think they're actually moving towards industry rendering hardware now, but you're right up until recently.

>Have we hit a plateau in GPU technology?
no. we still need cheap 4k60 with enough overhead that devs can actually start doing things with the additional power beyond just bumping up the res.

the bump to 4k has been an annoying, albeit necessary, detour that's stood in the way of better visual effects.

Photoreal empty static environments are nearly a solved problem, yes. See also Ethan Carter and other walking simulators.

People are a lot harder. Rise of the Tomb Raider is the best I've seen in person in terms of realtime character/facial rendering on commodity hardware. (Not best animation, but best rendering.) The Last of Us 2 looks like it will improve on that, if that trailer with the hanging scene is realtime. These games still don't look anything close to real, and they don't try to. They're sidestepping the uncanny valley with some great stylisation, rather than attempting to cross it.

Graphics tech will have peaked when a game:
>has photoreal environments
>has characters indistinguishable from human actors (this requires both photoreal rendering and very advanced animation)
>has photoreal interactions between characters and environments
>has scenes filled with every detail the developer wishes to have
>is sharp enough (both overall render target res and quality of individual textures/model tessellation) that my eyes would not be able to see any further improvement (this is a much higher bar than people think it is)
I'm not completely convinced that the above is actually possible in realtime. I hope to be proven wrong in my lifetime. In the meantime, games are getting more beautiful all the time even if they don't look more real; I can enjoy that and look forward to what the future brings.

The percentage performance increase each GPU generation is nowhere near what it was 10 years ago.
We're waiting on a proper GPU jump and a new generation of consoles for a real step up in computer graphics.

which would be relevant if resolution were the only thing that changes as graphics improve

No.

Yes, and same with CPU technology. My i5 3470 and GTX 970 should last me at least 3 more years.

you won't be saying that when Nvidia starts releasing their "late-life updates"

Same. My Ryzen 1600X and RX 480 should last me for like 4-5 years.

they don't need to gimp it; the 970's 3.5GB does that on its own

I miss going from Doom to Duke 3D, then Quake and Unreal Tournament.
Each one was jaw-dropping the first time you saw it.

Nowadays it's basically all the same shit over and over.

Why would you even update?
Most driver updates focus on support for new GPU architectures. If you have an old GPU, stick with the last stable driver from before support for the next architecture was added.

I want 4k 144hz

I know this isn't really a GPU problem, but animations in video games are absolute shit. We've achieved nearly photorealistic graphics in a couple of games, but characters still moonwalk around.
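
The classic cause is the animation playback rate being decoupled from the character's actual root velocity. A hypothetical sketch of the usual mitigation (all constants made up): scale the walk-cycle speed to match real displacement so the feet stop sliding.

```
# Hypothetical sketch: stop feet from sliding by tying the walk-cycle
# playback rate to how fast the character actually moves.
WALK_CYCLE_DISTANCE = 1.6  # meters covered by one loop of the authored cycle
WALK_CYCLE_DURATION = 1.0  # seconds per loop at authored speed

def playback_rate(actual_speed_mps):
    """Scale animation speed so stride length matches real displacement.
    At the authored speed the rate is 1.0; if gameplay code moves the
    character faster or slower, the cycle speeds up or slows down instead
    of letting the feet slide (the 'moonwalk')."""
    authored_speed = WALK_CYCLE_DISTANCE / WALK_CYCLE_DURATION
    return actual_speed_mps / authored_speed

for speed in (0.8, 1.6, 3.2):
    print(f"moving at {speed} m/s -> play walk cycle at {playback_rate(speed):.2f}x")
```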

PC gaming isn't what it was anymore. Without a four-figure investment (in dollars or euros), you can't play anything even remotely recent with your friends. And even with that, people are still numb to games, if threads about it here are anything to go by. I quit games years ago, and this is one of the reasons.

Games still have more room for detail and render distance. There is still growth to be had even if it isn't in resolution.

When the fuck are we getting more 4K content? Who gives a shit about resolution when we hardly have any 4K content out there?

> 4k 144hz 16:10

Life is nothing but pain

So you don't play them and you haven't done so in years, but you're convinced that they're all terrible and the hardware is too expensive?

The truth is that PC games scale better now than ever before. You can buy a game and expect it to run well on your mid-range laptop or entry-level desktop.

One thing I'll grant you is that if someone's budget is particularly low and they don't need a computer for anything else, they are likely to be better served with a console.

>people are still numb to games, if threads about it here are anything to go by
I suggest you don't pay attention to those threads. I actually avoid most commentary/criticism of the games I like, and I only read a bare minimum of reviews before making a purchasing decision.

I prefer going into a game knowing that it is generally considered to be of high quality, but otherwise as blind as possible. Afterwards, I might google for trivia/lore/etc but I usually don't participate in critical debates or try to defend/condemn the game.

Either playing it made me feel good, or not. If it did, the fact that other people didn't get as much out of it does not take anything away from my own enjoyment. If it didn't, I'll have another data point to avoid similar things in the future and I might warn anyone who is considering it and has similar tastes to me, but otherwise I leave it alone. My most hated game is probably someone else's GOTYAY.

I take the same approach to film, TV, music, books, everything. If the dopamine/epinephrine flows, it's "good". If I'm bored, it's "bad". It's as simple as that.

Despite the above, I still find myself posting in Sup Forums and watching trailers sometimes. Can't help myself. But when I see long arguments about whether a game is good or not, I can't help but feel that everyone involved is missing the point.

Upgrades are cheaper than ever. Your PC isn't outdated in 2 years like it was in the 2000s. You could still rock a 2600K and DDR3 from 2011, throw in a GTX 1060, and do 60 FPS at 1080p all day, every day.

Ultrawide is great, dude.
I'm never going back to 16:9

Can confirm. I didn't expect it to be so great, but now that I have one I'm never going back to ratiolets.

How about improving monitors first? Consistent backlight, neutral blacks and whites, more colors (more than 10-bit for sure), wide viewing angles... and all that affordable. That would be sweet.
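
On the "more than 10-bit" point, the raw arithmetic (just powers of two, nothing assumed):

```
# Distinct representable colors per bit depth: (2^bits)^3 for RGB.
for bits in (8, 10, 12):
    levels = 2 ** bits
    colors = levels ** 3
    print(f"{bits:2d}-bit/channel: {levels:5d} levels/channel -> {colors:.3e} colors")
```

8-bit gives ~16.8 million colors, 10-bit ~1.07 billion; the gains show up mostly as smoother gradients rather than "more" visible colors.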

No. Scalable multi-chip-module GPUs are coming.

I'm waiting on this as well. In a 32" monitor of course.

lol, of all the grotesque wastes of money, time, and resources in the world, VR is the one you pick.

Just upgraded from a bargain-bin 1080p screen to an Acer XR38QCK. It's ridiculous how good it is.
see pic for details

Ayy LMAO
samefagging to defend your poor display choice

why so aggressive, friendo?

The amount of space and bandwidth they eat is the current bottleneck.

>tfw still using a gtx 760 hooked up to a 4:3 vga LCD monitor

>failure
>beats nvidia with new drivers

I already have a 2600, but a GPU of that caliber would cost me a month's salary where I live. Given that I've already quit gaming, I'll decide what to do after my HD 4670 dies.

>digital to analog to digital

A 1080 can't consistently maintain 60fps at 4K. As for upgrading, I believe high-refresh-rate 4K should be the endgame, at least for the foreseeable future. Higher-resolution textures and better geometry are also definite reasons to upgrade.

What game is this?

Vega is 50% faster than Fury X computationally and both of those architectures leave the R9 290X/390X in the dust.

Volta is right on track, and Vega is a decent architecture that was delayed far too many times.

Several of those points rest mostly on game engines and attention to detail from the developers. As for the final point, you're correct that that is an extremely high bar but a couple decades ago we barely had realtime 3D rendering on commercially available hardware.

I agree, 1996-2006 totally knocked my socks off. Now it's like WOW, LOOK AT THAT SLIGHTLY MORE REALISTIC DIRT, WATER, AND LIGHTING!!!

I want 4K @ 250Hz
Thin Bezel, No curve, No ultrawide.

yes, we hit a plateau, and unlike prior ones, where the wall in the way was some technique that hadn't been invented yet, the way forward here is a sheer 110-degree cliff face.

we can't push poly counts much further (outside of tessellation on curved things) to improve graphics, and textures are still iffy, but that's mostly because cameras will let you get close enough to things to wall-lick.

the next big area where graphics are going to get a massive boost is path tracing, and we are getting close-ish to being able to do that; however, it needs such a retarded amount of processing power that it's still far off.

The next push forward is likely what AMD is doing: pushing half precision over full. From what I've read, half is more than enough for path tracing, and for real-time games quarter precision might be enough too. If we went down that road, path tracing could be implemented within the next few years; however, it may be a setting that demands a separate GPU, like PhysX did in its early versions. Imagine it: a 25-50 teraflop card dedicated to lighting alone.
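
To illustrate the precision trade-off: FP16 is plenty for individual radiance samples, but a naive FP16 running sum stalls once the accumulator outgrows the 10-bit mantissa, which is why accumulation generally stays at higher precision even when the shading math is half. A quick numpy demo (made-up sample distribution):

```
import numpy as np

# 100k 'radiance samples' around 1.0 -- the kind of per-pixel running
# sum a progressive path tracer keeps frame after frame.
samples = np.random.default_rng(0).uniform(0.5, 1.5, 100_000).astype(np.float16)

acc16 = np.float16(0.0)
acc32 = np.float32(0.0)
for s in samples:
    acc16 = np.float16(acc16 + s)  # half-precision running sum
    acc32 += np.float32(s)         # same samples, fp32 accumulator

print("fp16 accumulator mean:", acc16 / len(samples))  # stalls, badly wrong
print("fp32 accumulator mean:", acc32 / len(samples))  # ~1.0 as expected
print("true mean:            ", samples.astype(np.float64).mean())
```

Once the FP16 sum passes ~4096, adding a sample smaller than 2.0 rounds away to nothing, so the average collapses.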

At least this is how I see it.
We also still have room for a comprehensive physics suite that could make games better, but no one will do this.

There are many aspects to GPU technology and graphics in general.
I would like complete 100% photorealism in games, so that they'd be indistinguishable from real video footage, running on at least an 8K/120Hz/24-inch OLED screen.

4k60 or 1440p144

>in one game

I already have 1440p @ 165Hz

It can on high, but not ultra.

Ultra settings are diminishing returns, though a 1080 Ti can do ultra 4K60.

don't be pedantic

No.

But we are coming to a point where the nanometer meme will stop us eventually. I predict about 8 years until Nvidia and AMD effectively move to a 3-4 year cycle for GPU series.

It really depends on the game.

For the current rendering styles, yes. The next big breakthrough will be when we can finally render point clouds where each point is one pixel and the points subdivide/merge based on how close you are to an object. Combine that with a real-time, high-resolution ray tracer that can handle more than 5 rays at a time for lighting.
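
A toy sketch of the distance-based subdivide/merge rule (all constants hypothetical): aim for roughly one point per covered pixel, so the point budget falls with the square of the distance.

```
def target_point_count(base_points, distance, screen_height_px=2160, fov_scale=1.0):
    """Toy LOD rule: roughly one point per pixel the object covers.
    Projected size shrinks ~1/distance, so the needed point count drops
    with distance squared. All constants here are made up."""
    projected_px = screen_height_px * fov_scale / max(distance, 1e-6)
    return int(min(base_points, projected_px ** 2))

for d in (1, 5, 25, 100):
    print(f"distance {d:4d} m -> {target_point_count(10_000_000, d):,} points")
```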

Mmmmmm yes.

New technology for nanoscale cooling channels in the GPU could be a game changer. For CPUs too.

Still remember the last big jump in graphics was Doom 3. That shit legit blew me away. Haven't had that same feeling about a game since, which is why I still play the living shit out of the series.

Also, Zandronum FTW.

>tfw using a GTX 1050 Ti paired with a high-end 1024x768 Sony LCD from 2006 that I got for $30 from a thrift store
I'm hesitant to upgrade because it works pretty well for my needs and still looks great. Also has the superior 4:3 ratio for web browsing.

>last big jump in graphics was doom 3
naw, it was PBR. All Doom 3 gave us was real-time per-pixel lighting; before that we had per-vertex lighting.
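
To make the difference concrete: per-vertex (Gouraud) lighting evaluates the light at the three corners and interpolates the resulting colors; per-pixel interpolates the normal instead and evaluates the light at every fragment, which is what keeps highlights from getting smeared away. A tiny numpy sketch with made-up normals and a plain Lambert term:

```
import numpy as np

light_dir = np.array([0.0, 0.0, 1.0])  # light shining straight down the normal

def diffuse(normal):
    """Lambertian term: N . L, clamped at zero."""
    n = normal / np.linalg.norm(normal)
    return max(np.dot(n, light_dir), 0.0)

# Vertex normals of a curved triangle (hypothetical values).
n0 = np.array([ 0.9, 0.0, 0.44])
n1 = np.array([-0.9, 0.0, 0.44])
n2 = np.array([ 0.0, 0.0, 1.00])

b = (1/3, 1/3, 1/3)  # sample the pixel at the triangle's center

# Per-vertex (Gouraud): light each vertex, then interpolate the *colors*.
gouraud = b[0]*diffuse(n0) + b[1]*diffuse(n1) + b[2]*diffuse(n2)

# Per-pixel: interpolate the *normal*, then light it.
normal = b[0]*n0 + b[1]*n1 + b[2]*n2
per_pixel = diffuse(normal)

print(f"per-vertex: {gouraud:.3f}   per-pixel: {per_pixel:.3f}")
```

The per-pixel result hits the full 1.0 at the center while Gouraud averages it down to ~0.63; that difference is a lost highlight.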

>4k monitors
>most textures are still less than 2K
We have a long way to go.

I have a brand new Dell IPS 16:9 monitor in the box, but I don't want to cave in to the HDMI/DisplayPort and widescreen meme just yet.

wtf i hate nvidia now

>VR
>ever relevant
Please stop thinking VR will ever be a thing. This shit needs to die, right the fuck now. Nobody gives a living fuck about subversive, alienating, useless pseudo-technology.

Game graphics have already hit a plateau in the level of effort the developers are willing to put in.

Though there's nothing wrong with focusing on good art styles and not trying to go for photorealism.

It's called diminishing returns.
/thread

>diminishing returns
This!

Can't wait for gamers to start pulling insane audiophile orgonite bullshit once their technology matures.

I want 21:9 240Hz.
But most games don't even run at 60fps. There was this F2P game trend where everything ran at shit framerates. Then companies started to optimize, but then Greenlight, Early Access, and PUBG happened, and it's okay again to make games that run at 30fps on a Core i9.

Fuck this gay earth. Or just fuck people who give money to half-finished shit games.

> Muh 120fps
> Muh 1080ti
> Muh max settings
> Muh 8K OLED
We're pretty close already desu. We already have the crt meme going, which is like the audiophile vinyl meme.

Glorious! My body is primed and ready to give back some hate to the people who think I'm crazy for listening to 24-bit FLAC.

1440p 144hz

Judging a product off its dopamine rushes is awful; clicker games make you release dopamine too.

Please do not compare videophiles to audiophiles. Audiophiles embrace a fucking pseudoscience that companies are more than happy to propagate. The thing about audio autism versus video autism is that, up to a point, investment in video technology generates quantifiable returns. I have a CRT monitor that's about 10 years old but still stomps most things on the market (2K resolution, 160Hz max, was a $5,000 graphics monitor when it was new). I did professional video editing with it for quite some time. Even now I can ask a normalfag what the difference in picture is between a 1080p LCD and the graphics monitor, and it's night and day. On the other hand, if you ask a normalfag what the difference is between something downloaded from YouTube and a pristine vinyl recording, they likely won't notice shit. You have to train your ears to care about that fucking shit.

No, we're getting hit by idiotic devs who don't know how to code.

Dude, games can't even use multiple CPU threads properly yet.

Once they fix that and we can start getting 30x more draw calls and so on, we'll be back to the GPU being the limit.
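
The usual fix, and what Vulkan/DX12 were designed around: record command lists for slices of the scene on worker threads, then submit them in order from a single thread. A toy sketch of just the threading pattern (Python stand-ins, not a real graphics API):

```
from concurrent.futures import ThreadPoolExecutor

def record_commands(chunk):
    """Stand-in for recording a command buffer; a real engine would be
    calling into Vulkan/DX12 here. Names are illustrative only."""
    return [f"draw(object_{i})" for i in chunk]

objects = list(range(10_000))
n_workers = 4
chunks = [objects[i::n_workers] for i in range(n_workers)]

# Record each chunk's draw calls on its own worker thread...
with ThreadPoolExecutor(max_workers=n_workers) as pool:
    buffers = list(pool.map(record_commands, chunks))

# ...then submit the finished buffers from one thread, in a fixed order.
submitted = [cmd for buf in buffers for cmd in buf]
print(len(submitted), "draw calls recorded across", n_workers, "threads")
```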

>There doesn't seem to be much point in going past 4K for non-VR
you're retarded and have never seen a display above 4K in your entire fucking life.
Also, just rendering at a higher resolution makes things look better, regardless of where your display's resolution tops out.
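
That "rendering higher looks better" effect is supersampling: shade at N times the pixel density, then filter down to the display. A minimal numpy sketch (the checker pattern is a made-up stand-in for a scene with hard edges):

```
import numpy as np

def render(width, height):
    """Stand-in renderer: a high-frequency checker pattern, the kind of
    detail that aliases badly at native resolution."""
    y, x = np.mgrid[0:height, 0:width]
    return ((x // 2 + y // 2) % 2).astype(np.float32)

def supersample(width, height, factor):
    """Render at factor x the resolution, then average each factor-by-factor
    block down to one display pixel (ordered-grid SSAA, box filter)."""
    hi = render(width * factor, height * factor)
    return hi.reshape(height, factor, width, factor).mean(axis=(1, 3))

native = render(8, 8)          # hard 0/1 edges: visible aliasing
ssaa4x = supersample(8, 8, 4)  # edges resolve to in-between shades
print("native:", native[0, :4], " 4x SSAA:", ssaa4x[0, :4])
```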

"modern" gpus have been able to play everything at 1080p since 2012.
the software/firmware for the hardware will no longer be supported and will require bullshit gimping tactics in order to sell the next shit hardware which will have the same thing done to it in just 3 years, rinse and repeat.