4K is dead on PC

I'm starting to feel that 4K will never take off on PC. In all honesty, it's been dead in the water for a long time. We are two years into 4K tech, and the PC has lost its graphical crown to the PS4 Pro, with the upcoming Scorpio console making it even harder for PC to catch up. This is due to a number of reasons.

>Screens
Have you seen the price of 32" 4K monitors? Absolutely fucking insane; they cost more than their TV counterparts. Some may argue you can get cheaper 24" 4K panels, however these run at lower refresh rates or lack features that PC gamers require. And the stats back it up: less than 0.5% of the PC Steam gaming community has 4K displays. It's quite clear they are either waiting for cheaper displays or waiting for the displays to get better.

>Not enough VRAM
If you look at the picture above, most PC gamers are still on 1GB of VRAM, which is nowhere near enough to run games at 4K resolution. In fact, the only recommended GPUs for 4K gaming are either a Titan X or a 1080, and both of these cards run into the hundreds for 45fps or so performance.
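
For scale, here's a hedged back-of-envelope sketch of what the framebuffer alone costs at 4K; the buffer layout below is an illustrative assumption, and in practice it's textures and assets sized for 4K, not the framebuffer, that really eat VRAM:

```python
# Back-of-envelope VRAM math for a 4K framebuffer (illustrative layout).
width, height = 3840, 2160
bytes_per_pixel = 4  # RGBA8; HDR formats like RGBA16F would double this

color_buffer = width * height * bytes_per_pixel      # ~32 MiB
depth_buffer = width * height * 4                    # 32-bit depth, ~32 MiB
double_buffered = 2 * color_buffer + depth_buffer    # ~95 MiB

print(f"{double_buffered / 2**20:.0f} MiB")  # ~95 MiB before any assets
# On a 1GB card that alone is ~10% of VRAM, and texture sets
# sized for 4K easily consume several times the rest.
```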

>No support for HDR
At this time there is no real support for HDR in games, nor support from AMD or Nvidia. Add the fact that the hardware is not even out there for gamers who want to use monitors; maybe next year it might change.

>PC is not about cutting edge graphics / high end resolutions anymore.
The PC platform is not about cutting edge graphics anymore; I can barely recall a PC-only title that pushed the platform. The market caters to a mid-range, semi-pro crowd who prefer streaming games and playing with friends; they don't need the latest and greatest to play games.

>1366x768
>25% of users play on a cheap laptop
explains a lot.

I feel no need to go past 1280

I agree, and I also think consoles should stop trying to push it. They should focus on making the graphics better and maintaining 60 fps at 1080p. Look how many PS4 4K games average 20 fps.

monitor resolution =/= render resolution

PC monitors last 10+ years, so widespread adoption will take a long while.

if it ever even happens.

The most popular GPU is a 970, and that's capable of 4K 30fps for basically any game, 60fps for most of them.

>PC has lost its graphical crown to the PS4 Pro
gr8 b8

Also, it's stupid to bring up HDR because PC monitors have had better specs than these new HDR televisions for many years. And graphics in general aren't getting pushed; they're just milking them for the most part, as most people don't give a single fuck.

I'd rather they focused on better picture quality and smooth framerate on lower resolutions.

HDR is just fancy talk for it being able to take advantage of 10-bit color, right?

PC has had that for ages.

Yes and better contrast. It's shit 1080p monitors on PC have had forever.

>PC has had that for ages.
No, HDR is more than a 10-bit screen. Nevertheless, the current TVs on the market have shit HDR, so PC gamers aren't missing much. A good 10-bit panel is probably better anyway.

Sorry, OP. Looks like you'll have to delete your thread and try again. No one's taking your bait this time. Hope you have fun on your Playstation though, lol.

Doesn't the majority of Steam gamers rely on iGPUs?

>PC has lost its graphical crown to the PS4 Pro

Having to turn every graphics setting apart from resolution down to the PC equivalent of medium is not a 'graphical crown'.

Nice concern trolling though

He took it straight off of neogaf

No effort at all

Fuck, the PS4 HDR threads were cringe-filled. The fanboys had no idea what they were talking about.

>ignoring xbox has hdr support too
>ignoring windows has hdr support
>ignoring neo-hdr games have been on pc
>ignoring PC monitors shitting on $9999 HDR televisions
>ignoring ENB/sweetfx/reshade letting you adjust EVERYTHING in any PC game to your liking instead of hoping a dev gives good support for the new 'HDR' specification (protip: they don't)

Barely anything more; the specs are lower than a decent monitor's, especially when you factor in input lag.

It was never a majority, just a higher percentage than any one GPU and those days are behind us.

>At this time there is no real support for HDR
Any IPS worth its salt supports HDR, stop spewing this dumb meme

You know software HDR in games such as Half-Life 2: Lost Coast and HDR on the actual TV are two completely different things, right?

>a better resolution is dead.

This is fucking retarded. Anything that is better is obviously going to become the norm in the long run.

yes did you read my post

It's funny cause as graphics get better I swear the games get worse.

Also, the internet needs to catch up big time.

Yes, you were talking about ENB/sweetfx/reshade

yes do you know what those are?

I didn't think the pro even did true 4k

4k is a meme

>Anything that is better is obviously going to become the norm in the long run.

Tell that to the Nazis

Software shaders that have nothing to do with HDR by the TV?

yes do you know what you're getting at?

Not anymore. Do you?

4k will never "die" because there's very little downside to supporting it. Even if a game is made with 1080p in mind (textures, etc), rendering it at 4k will provide a visual boost if you want it.

If you want it. That's always been what stupidly high end PC shit has been about. It was the same way when SLI first came out, likely before you were born: people with a ton of money were obsessed with performance and fidelity, and dumped money into a sub-optimal outlet for it.

For the rest of us? Eh, fuck it, when 4K becomes price/power efficient I'll look into it. It doesn't need to become mainstream to "take off."

this

That you can't get the same HDR effect as HDR TVs no matter the software you use on normal 10-bit screens?

yes did you read my post because I didn't say you could.

I did say monitors have wider specifications than shitty HDR TVs, though. And mods translate those wider specs into something tangible you can actually see, without praying to your sonysama that he supports the hot new thing.

>can't get HDR on 10 bit screens
sit your dumb ass down, 10-bit color is what enables HDR, retard.

That's not what I said

There are 10 bit screens
And there are HDR 10 bit screens

It's just a matter of GPU technology catching up. The Titan X is really the only GPU that can put out 60fps at max settings right now; the 1080 can in certain games. Once GPU tech is to the point where midrange GPUs ($200-$300) can comfortably do 4K, there will be a lot more people buying them. There's no point in buying a 4K display when your PC can't play games on it comfortably. And I'd imagine the displays will be cheaper by then as well. It's not a gimmick like 3D or VR that may or may not take off; higher resolutions will always become more popular in the long run.

HDR is a marketing term for a set specification. A spec that monitors surpassed years ago.

>The Titan X is really the only GPU that can put out 60fps at max settings right now
It can't output 60fps at 4K with literally everything turned on and maxed in Rise of the Tomb Raider

>At this time there is no real support for HDR in games, nor support from AMD or Nvidia. Add the fact that the hardware is not even out there for gamers who want to use monitors; maybe next year it might change.

False. Even my U2410 from 2009 supports HDR

>muh max settings
always grasping at straws

It will take years for PC games to ship with proper HDR support. Unreal Engine doesn't even support it yet. The only reason the consoles have HDR titles is because Sony/Microsoft push for it.

OK, but it's good for the majority of games. Plus, AA really isn't needed at that high a resolution, so you don't need all of that turned to max.

Even if you exclude AA

You don't really need max settings m8, and even now $200-$300 GPUs can do 4k pretty comfortably.

>support from AMD or Nvidia

Nigga are you dumb

>I'm starting to feel that 4K will never take off on PC
Name 500 games worth playing in 4K.

Shadow Warrior 2 is only on PC at the moment and it has proper HDR support.

That's not true; Alien: Isolation also does.

>there are 1080p screens
>and then there are FULL HD 1080p CERTIFIED screens
you're a retard and you fell for bs marketing. sit the fuck down.

>consoles can't even do 1080p
>thinking consoles will be able to do 4k gaming

lul

As the hardware gets cheaper, resolutions will go up. I mean, maybe the console DRM boxes will be able to do 1440p when PC is starting to average 4K.

It will take off when 4k equipment is actually affordable. The biggest barrier is the graphics card. It takes a Titan XP to push out 4k at anything better than cinematic 30 FPS.

What's not true?
SW2 does have HDR support.

4K is literally a meme. 1440p165 > 4K60

Misread, thought you wrote Shadow Warrior 2 is the only PC game that has HDR support.

and not a single person cares

>It takes a Titan XP to push out 4k at anything better than cinematic 30 FPS.
Only on absolute max settings.
With proper settings management even a GTX 1060 could pull off 4k at respectable framerates.

>The PC platform is not about cutting edge graphics anymore

This is the only part of your post worth anything more than bad bait.

If that were true, it would actually be a good thing. The worst problem with PC gaming is the hardware rat race, and it didn't use to be at the forefront of PC gaming at all.

PC works best when it's doing what it does best - serving all needs at once. Nobody should need the latest and greatest to play games, and that should never be the benchmark. It should, however, always be part of the package, to show "hey, if you really want to get into this and you don't mind throwing the dosh at it, look what you can make it do."

That's where 4k gaming is right now, and that's not a bad thing in the slightest.

>comparing framerate to resolution
165hz is smooth but a 1440p screen is not sharper than 2160p. Depends on what you give a shit about.

Consoles don't have proper HDR support either. And they never will.

This is just PBR 2.0, marketing nonsense that amounts to nothing.

As of this moment, there is practically no support on the PC for this new HDR standard. Outside of 4 or so games, there is nothing else. Implementing it is currently fraught with problems, due to how new the technology is. And few people have any real understanding of what it is to begin with, which is why people like you are claiming it's already supported in 10-bit monitors.

However, from a technical aspect it's a far bigger deal than anything you can get with ReShade/ENB, and that's speaking as someone who makes graphical mods. No contrast/saturation boost or tonemapping change or colour filter can match the effect of variable dynamic range.

The difference between 1440p and 2160p is negligible in comparison to the difference between 150+ FPS and 60 FPS. We should be pushing for higher framerates first and foremost
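
As a rough sanity check on that trade-off, here's a hedged pixels-per-second comparison; it counts raw fill only and ignores per-frame CPU and geometry costs, which don't scale with resolution:

```python
# Pixels per second each target asks the GPU to shade.
px_1440p165 = 2560 * 1440 * 165   # ~608 million px/s
px_4k60 = 3840 * 2160 * 60        # ~498 million px/s

print(f"{px_1440p165 / 1e6:.0f}M vs {px_4k60 / 1e6:.0f}M px/s")
# 1440p at 165Hz actually demands MORE raw fill than 4K at 60Hz,
# so "which is harder to drive" depends on what you value:
# motion smoothness scales with framerate, sharpness with resolution.
```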

Now why would I want to play at shit settings and a high resolution unless I was literally forced to?

>due to how new the technology is
good god you're an idiot

Guys, please, nobody will go out in a rush to buy the latest new tech available every month.

It will take years for people to move from full hd to 4k.

Because in pretty much every game there's little to no difference between ultra and high settings.

I mean, high framerate 4K is out there, if you have ludicrous amounts of money to throw at it. Here's a 144Hz 4K monitor:
144hzmonitors.com/monitors/asus-computex-2016-27-inch-4k-144hz-gaming-monitor/

I have a 144hz monitor, but really above 120 fps I see negligible gains in smoothness. Would rather play a game at rock solid 90 fps at 4k than 144 at 1440p, but that's just my preference.

>expecting mass market adoption within two years of availability

>implying 4k on PC is fucked because of consoles and not lack of support by Microsoft

>4K is dead on PC
Because we can spot a meme when we see one.
See that glorious 'noticeable increase in fidelity' line when you sit this close? That's 4x the GPU bandwidth, just for 60fps.
What is way more than 'noticeable' is higher fps, for 120Hz or 144Hz capable monitors.
If the Pro would just lock the res at 1080p and push for 60fps they would have way more idorts, but they chase the 4K buzzword because it's what normalfag console buyers want to hear.
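
For what it's worth, the 4x figure is just pixel-count arithmetic; a minimal sketch, assuming shading cost scales linearly with pixels (real bandwidth scaling is messier):

```python
# Pixel-count arithmetic behind the "4x" claim.
px_1080 = 1920 * 1080   # 2,073,600 pixels per frame
px_4k = 3840 * 2160     # 8,294,400 pixels per frame

print(px_4k / px_1080)  # 4.0 -> four times the pixels per frame
# At the same 60fps target, that's roughly 4x the shading work
# per second, before AA and memory-bandwidth overheads.
```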

4K is the new stereoscopic 3D. No one is buying into it.

>falling for the EVERYTHING MAXED meme
nigga high is fine for 95% of settings, and the diminishing returns you receive on visuals vs. performance are not worth it unless you have multiple top end GPUs.

The PC has lost its graphical crown to low-to-mid range PC hardware (PS4 Pro).

The people who post this kind of thing, what were they thinking? Were they even thinking?

It's nearly off the shelf parts, only downclocked to run worse so that it won't overheat in that little box.

That monitor isn't even on the market yet, but I can easily tell the difference between 120 and 150 at this point, and in terms of actually powering it we're much closer to powering a 1440p165hz monitor than a 4k144hz monitor.

forgot my ">"s

No reason to get into 4K gaming when most new games can't even run at Ultra 1080p 60 fps on the newest cards.

>yfw all those laptop users.

The only games that push tech are AAA games, and AAA games became overpriced trash so no one cares about the SJW tech demos anymore.

Everyone just plays actual fun multiplayer games like LoL, HS, SC2, Rocket League, Dota, etc. None of these games have high requirements.

m8 I'm a pixel peeping autist. I look at each pixel at 400% zoom just to make sure it's perfect. More pixels ain't gonna cut it if they are lower quality pixels.

>this same thread is on neofag

Good job OP, you doxxed yourself.

>mfw enjoying glorious 480p resolution on my 135" monitor from 40' away

How many PS4 Pros are running on a 4K TV?

I'd say like 90% of them...

PCfats are all running shit tier 1366x768 or 1080p.. KEK

I'm going to go to bed tonight hoping you develop vision loss, ruining your life once you realize everything will slowly become more blurry over time. Corrective lenses may improve it, but it'll never be as good as just pure 20/20.

after seeing ARK's ps4 pro performance I just can't not laugh at sonyggers.

What the fuck am I reading?

>PBR
>nonsense
You're an idiot. If it weren't for PBR, none of this would matter.

>he thinks TV HDR and graphics HDR are the same thing

This chart is useless for PC games. Unless a game has perfect antialiasing, you'll be able to see the difference between 1080p and 2160p from much further than that.

>hurr durr pbr saved gaming
fucking retard, please keep spouting buzzwords that have no definition.

>he thinks TV HDR is anything special
>he believes in the PBR meme

Since you're a Todd poster I'll just ask: does Fallout 4 use PBR?

>the PC has lost its graphical crown to the PS4 Pro
Are you serious?
youtube.com/watch?v=6__TvBzAVj8

What do you mean "proper settings"?

Not trying to be an asshole. It's just that I just got a 1060 and wanted to know what you meant.

By the way, sweet screenshot. That's an R9 playing at 4K? The FPS is over 60, sounds crazy.

>no definition

>Physically-based rendering promises photorealistic lighting in 3D environments by offering a mathematical, less production-intensive approach to the rendering of light.
>PBR employs mathematical equations that determine how a light source will diffuse (diffusion) or reflect (specularity) when it interacts with two primary types of materials: Metals and non-metals.
>Diffusion and specularity (effectively reflectivity) are the two means through which light interacts with a surface. When light connects with a metallic, glossy surface, a significant percentage of the light will be reflected off that object; light connecting with softer, non-shiny surfaces (think: wood, cloth) will be absorbed and diffused across that surface, making it brighter without reflecting as much outward light. In instances of semi-transparent but non-shiny materials (the skin of a human ear or hand), sub-surface scattering diffuses the light underneath the surface and produces a new color output. “Albedo” is used to describe the color of light that scatters out of a surface.
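
That quoted description maps onto the "metallic workflow" most engines use. Purely as a hedged illustration (a toy Blinn-Phong stand-in, not a real microfacet BRDF; every name and constant here is an assumption for the sketch):

```python
import math

def shade(n_dot_l, n_dot_h, base_color, metallic, roughness):
    """Toy split of light into diffuse and specular terms, PBR-style."""
    # Metals have essentially no diffuse term; non-metals diffuse most light.
    diffuse = [c / math.pi * n_dot_l * (1.0 - metallic) for c in base_color]
    # Blinn-Phong lobe standing in for a microfacet term:
    # lower roughness -> higher exponent -> tighter, brighter highlight.
    shininess = 2.0 / max(roughness ** 2, 1e-4)
    lobe = max(n_dot_h, 0.0) ** shininess * n_dot_l
    # F0 reflectance: dielectrics reflect ~4% untinted,
    # metals reflect with the color of their base "albedo".
    f0 = [0.04 + (c - 0.04) * metallic for c in base_color]
    specular = [lobe * f for f in f0]
    return [d + s for d, s in zip(diffuse, specular)]

# The same orange surface as a glossy metal vs. a rough non-metal:
print(shade(0.8, 0.98, (0.9, 0.6, 0.2), metallic=1.0, roughness=0.3))
print(shade(0.8, 0.98, (0.9, 0.6, 0.2), metallic=0.0, roughness=0.8))
```

The point of the quoted definition is that metallic and roughness describe the material itself, so the same surface reacts plausibly under any light instead of being hand-tuned per scene.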

What do you think the 400% zoom is for, dumbass?

RX 480 is better than a 1060...

Too bad PC users are retards that easily fall for intel/nvidia marketing.

note that 99% of 'PBR' games don't adhere to this definition :)

>This chart is useless for PC games.
It's your choice to pick slightly smoother edges at 25fps, mate, not mine
oh but you have a titan, right

HDR is basically a spec for brightness and contrast levels, along with 10-bit color; that's it. Just because it's now a spec with a name tagged on doesn't mean screens which already match or exceed this spec didn't exist before somebody put a name to it. The increased bit depth does nothing for dynamic range (aka contrast, when talking about devices which display images), since that depends on the quality of the device itself; it will only help with gradients, and in some situations with detail in very dark or very bright areas, which aren't really a problem in the vast majority of games. You'll get a major improvement in image quality when you switch to an OLED screen or some other tech which can actually display blacks.

Basically "HDR" is a label they slap on something to tell you it has non-shit image quality, but that doesn't mean everything before it was actually shit.

What are you talking about? I have a 1080 and a 50-inch 4K Samsung and literally play every game maxed out at 4K. Most people buying TVs and monitors now buy 4K ones. How is it "not taking off"?

In my case, usually high settings.
TF2 I run with everything maxed except for sun shadow resolution.
BF1 I run with pic related settings.

is this a good monitor?
lg.com/us/monitors/lg-24MP88HV-S-led-monitor

Reminder for retards that TV/monitor HDR is NOT the same thing as Reshade HDR or the settings in Lost Coast/Crysis/Far Cry 2.

My 2-year-old PC does 4K better than your shitty consoles. It also does it at roughly 45fps depending on the game, so that's also stupid as fuck.

Honestly though, it really is a nice bait thread. Almost subtle enough, but not quite.

Claiming TV HDR is nothing special is like saying 120 FPS is nothing special because TVs only go up to 60hz. The difference is there, but there's no way to see it without a decent screen supporting it.
Fallout 4 uses PBR, bolted onto their Gamebryo engine in a lazy way. I'm pretty sure they aren't using proper values for lighting and in places they just forgot to implement things.

See, I was the only person who brought up graphics mods, because graphics mods can maximize any display, including HDR10 displays.

>Fallout 4 uses PBR
Hook, line, and sinker. Keep believing those lies, you retard.

HDR TVs aren't anything special, I own one and my monitor is much nicer to look at.
>TVs only go up to 60hz
Non stop retardation huh.