4K

Is 4K still a useless meme?

Or is it time to upgrade to something like this?

5k or go home plebs

>Not 120hz

Wait for OLED

Monitors in general are a meme. Having a monitor implies you can't touch type.

The most retarded shit I've heard in a while.

>OLED
How long though?

Dell wants to retail theirs at $5000. So it's probably gonna be a few years before they hit sub $500.

>How long though?
Until you can afford it

>Dell wants to retail theirs at $5000. So it's probably gonna be a few years before they hit sub $500.
That's their flagship 4K model (specs comparable to monitors that cost $2000+ today), although to be fair it's hard to make an OLED display that's *not* a flagship compared to current-gen TFT.

Monitors are an investment that lasts a long time - if you buy a shoddy LCD now because of “muh resolution” while ignoring all of the other specs, you'll be kicking yourself once OLED monitors start flooding the market and making your display look like a muddy joke in comparison.

Have you seen one of them in-person though?

Is it really that much better or is it another gimmick?

1080p is enough on a 24 inch screen and always will be.

What about 27+ though?

1440p or 4k is viable then, but we're talking about desktop monitors, most of which are (and should be) 24".

I don't know.

My shit has been feeling too small to me recently.

I was thinking about getting a 27'' as my main monitor.

Someone hasn't used a 2560x1440 screen at 25 inches yet.

I have a 28" 4k monitor and I wish it was 32" or even a little bigger.
There is a Philips 40" 4k that is really nice but imho it's too big for my viewing distance.

It's a gimmick, don't listen to this retard.

Get an IPS display, they do the same fucking thing as oled for cheaper.

If you have ever seen an Apple Watch before, it uses an OLED display. Most beautiful display I've ever seen.

I haven't seen OLED in person, but they're better in every measurable regard than today's highest-end displays (which already beat the shit out of your average consumer display)

>Get an IPS display, they do the same fucking thing as oled for cheaper.
Enjoy your 1000:1 contrast, IPS glow and 0.2 cd/m^2 black points

Enjoy your burn in display

>implying OLED suffers from burn in

How would you know that if you haven't seen it though?

I wouldn't feel comfortable paying 5x the price for some better numbers on paper if my eyes can't tell any difference.

it's pretty cool. text looks a lot sharper and it gives you some more space to work with.
however, win7 doesn't really fully support 4k scaling. not sure about win8/win10 or loonix. you can just scale up everything, but that only works if you only have one monitor or all of your monitors are 4k (scaling affects all monitors).
if you don't use windows' scaling, lots of programs won't scale properly. they'll be usable, but text might be hard to read.
also, forget about 4k gaming unless you own at least two high-end GPUs.

If you need the real estate, sure. Other than that, there is no content besides some porn and for gaymen you need a top tier video card that will still only barely handle it. I would just wait until it's closer to becoming standard in content and they'll be cheap as shit

Because I'm well-educated enough on display specifications and human perception to have a good understanding of how to translate numbers to perception?

I know how bright 1 cd/m^2 is, I know how saturated BT.709 is, I know how to tell D55 from D65, I've seen the difference between 60 Hz and 120 Hz. I know what overdrive looks like, what 5 ms pixel response times look like, what black frame insertion looks like, etc.

It's not exactly rocket science once you get down to the basics - every display technology is just the sum of its measurable numbers.

>resolution over PPI
Have you seen a retina macbook?

Another pretentious cuck on Sup Forums. What else is new?

I've got it, and I don't think you'd actually have a problem with it if you're already used to multiple monitors. It's actually narrower than a pair of 24" monitors side by side and about as tall as one turned vertically.

It's rare that I use applications maximized or full screen other than certain games. For work I dedicate one half of the screen to my text editor, and the other half to testing and debug output. That's really probably the best way to think about it in most use cases: as a pair of 1920x2160 monitors side by side with no bezel.

>resolution
>ppi
Don't forget your viewing distance, which is one of the biggest factors affecting apparent pixel density

Even a 24" 1080p screen can be made retina by placing it far enough away

It's a question of ecosystem. If you buy one 4k screen you will be looking out for 4k content and a PC that can display 4k content and an internet connection that can stream 4k content (good luck finding any) without stuttering etc.

This is why I'm determined to stick around on 1080p for ~5-10 years to come. I can actually torrent good quality 1080p stuff (~25GB movies) in a reasonable time.

>I have no argument so I'll just call you names instead
good one, I forgot this website has no fucking clue about technology

The point is that no matter what the numbers are, two identical displays can look different to you and me.

The thing is I want a bigger monitor, but I've heard that 1080p starts to look like shit on anything over 24''.

1080p content still looks fine on a 2160p screen. That's actually one of the big advantages of 2160p vs 1440p or 2880p: you can run 1080p content on it simply by having 4 pixels act as 1 pixel, so there are no weird scaling artifacts from a non-integer scaling ratio. The same goes for 720p content, since it's the same deal with 9 pixels acting as 1.
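If you want to see the "4 pixels act as 1" thing concretely, here's a minimal sketch (my own, purely illustrative) of integer nearest-neighbour upscaling with numpy: each 1080p pixel becomes an exact 2x2 block on a 2160p panel, so nothing gets interpolated or smeared.

[code]
# Minimal sketch of integer "pixel doubling": every source pixel maps to an
# exact NxN block, so 1080p -> 2160p introduces no interpolation blur.
import numpy as np

def integer_upscale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Repeat each pixel `factor` times along height and width (H, W, C)."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)   # dummy black frame
frame_2160p = integer_upscale(frame_1080p, 2)
print(frame_2160p.shape)   # (2160, 3840, 3)

# 720p works the same way on a 2160p panel: factor 3, each pixel becomes 3x3.
[/code]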

>implying that's not the whole fucking drawback

Well yeah, your text and vector content will look sharper but your movies will look exactly the same while you are paying a lot more for the screen.

It's up to personal preference, just do what you think feels right, OP.

1080p - 20-24" monitors
1440p - 24-32" monitors
2160p - 32-48" monitors
2880p - 24-32" & 48" - 60" monitors

meanwhile my 1080p 55 oled can output 60hz and cost me 1.2k in usd

Actually my monitor is only slightly sharper than a typical one. But it is almost four times the size.

I usually watch movies either 1:1 on the screen while doing something else, or I put it in full screen mode and sit back on my couch.

I regard 4K as useless.

>But user you're a fucking faggot...

I know but let's be realistic here.

Not even a 1080 runs 4K at 60 FPS.

21:9 plays movies as it should and you get more space in editing software.

120-144Hz is just plain better for games no doubt.

Ideally you'd go 21:9 120Hz IPS but that costs $1000 and up...

So the only logical thing is to pick 21:9 or 144Hz....

4K not there yet with GPUs.

And by the time it is, there are going to be 144Hz 4K panels anyways, in like 4 years.

I mean look back to 2012... IPS was like the shit back then and it was $700 for a good monitor. Now decent IPS runs for half that.

True, for example if your vision is severely impaired then you would probably not give a shit about anything below 5 arcmin/px

I was implicitly assuming you had normal, functioning eyesight

It's not only about your eyesight faggot, but also your perception.

Can someone explain to me why OLED isn't a thing yet?

Can't they just chain 5~6 of those 5" smartphone OLED panels into a monitor?

You're probably thinking of phosphor degradation which has nothing to do with burn-in.

Phosphor degradation: Color shifts over time (e.g. white becomes more yellow). All displays involving phosphors (OLED, LED-LCD, CCFL-LCD, CRT, etc.) suffer from this.

Image retention: Static images displayed for long periods of time remain visible for some amount of time afterwards, but the effect is not permanent (e.g. goes away after a few hours or when displaying non-static content for a while). OLED and LCD displays both suffer from this

Burn in: Static images displayed for long periods of time permanently damage the display, remaining visible forever. Plasma and CRT suffer from this (which is why they need screensavers)

Image retention is completely benign and harmless. Phosphor degradation is a serious threat, but all displays suffer from it and it essentially only means that their quality decreases over time. Modern OLED lifespan is not that much worse than what you'd get out of IPS etc. either way, but some of the earlier products were especially bad in this department.

>UD590
I tried out that monitor for about two weeks. Didn't like it. So I picked up an LG 27UD68. Such a better display.

It's an IPS; 5 ms response compared to the 1 ms of the Sammy, but the quality is so much better. 4K gaming is a meme.

>Not even a 1080 runs 4K at 60 FPS.
stopped reading here. Go back to if you think monitors are only used for VIDYA GAEMS

As long as you're not using nearest neighbour sampling, almost all content can be freely scaled to arbitrary (integer or non-integer) resolutions with little perceivable difference between the two

If you *are* using nearest neighbour sampling, then you might as well buy a 1080p display if you enjoy seeing pixels and aliasing that much
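For anyone who wants to see why non-integer NN scaling looks so rough, here's a tiny sketch (my own numbers, not from this thread) of what happens when 1080 source rows get nearest-neighbour-mapped onto 1440 output rows:

[code]
# Why nearest neighbour at a non-integer ratio aliases: mapping 1440 output
# rows back to 1080 source rows means some source rows are drawn once and
# some twice, in an irregular pattern (uneven "pixel" sizes on screen).
from collections import Counter

src, dst = 1080, 1440
mapping = [round(i * src / dst) for i in range(dst)]   # NN source row per output row
hits = Counter(mapping)
print(Counter(hits.values()))   # ~720 source rows drawn once, ~360 drawn twice

# Compare 1080 -> 2160: every source row is drawn exactly twice, so the image
# stays uniform -- that's the integer-scaling case from earlier in the thread.
[/code]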

1080 does run 4k at 60 fps in most games at ultra if you turn those fucking gimmicks like hairworks off. If you just set the preset to high I can assure you it runs 99% of games.
I think the sweet spot for 4k will be next nvidia 1170 with current TitanXP performance.

>but your movies will look exactly the same while you are paying a lot more for the screen.
Unless you're watching 4K videos

“Your vision” i

>OLED

ITT so many richfags.

Posted from my Raspberry Pi 3 Model B

>Not even a 1080 runs 4K at 60 FPS.
Depends on what you're playing. There are a lot of games that even my old card handles fine in 2160p, and for everything else 1080p still works just fine.

>21:9 plays movies as it should and you get more space in editing software.
A 36" 21:9 3360x1440 still doesn't have as much room for use in editing software as 40" 16:9 3840x2160 and the displays are often not that different in price and large format 2160p displays can be found for less than $1000. Plus if the size and aspect ratio appeals to you you can always just create a 3360x1440p window and work inside of that.

144hz 4k displays may not be here yet, but ultra high refresh rates are hardly a requirement for gaming, and if your main concern is getting shit done the higher resolution and greater amount of space pays off massively.

>“Your vision” i
implies both physical and psychological components, which together make up your sense.*

Blindness can occur in the eye or in the brain, but the distinction doesn't matter for my argument - you're either visually impaired or not

Stop trying to be pedantic, because it's a pointless non-argument that has nothing to do with the technology or displays in question and at best reflects your inability to comprehend implied facts (i.e. legit autism).

>Posted from my cum stained used chinkpad with a 1366*768 screen*

They are. You'd profit a lot more from a 21:9 at 1440p than you would from a 4K for productivity, there's literally no argument there.

Shit at 4K simply becomes too small to be useful even on a 27"~28"

You're much better off getting a 34" 21:9 1440p for productivity.

You're also much better off getting a normal 1080p with swivel if you want to write a lot so you can have 1080×1920

Do you have anything to back that up? Because I sure do.

youtube.com/watch?v=LuViSYChy7Q

It simply isn't 60 locked. It dips way too often.

>Can someone explain to me why OLED isn't a thing yet?
It is a thing, and has been a thing for many years. It's just currently too expensive for consumers like you

>Can't they just Chain 5~6 of them 5" OLED for smartphones into a monitor?
No, yields go down quadratically as your dimension increases. This is why monitors get drastically more expensive as you go up the size classes (compare a 24" 4K display against a 32" 4K display)

How does 1080p content look on a 4k monitor? Is it more blurry than a native 1080p monitor because it's not a 1:1 pixel map?

My GTX 970 runs most of my games just fine at 4K res, and for the rest I usually just have to turn it down to 1440p - very rarely to 1080p (for the odd AAA title)

I'm not talking about single slabs here.

I'm talking about literally using 6 phone displays linked next to each other, each with its own controller and shit.

>You're much better off getting a 34" 21:9 1440p for productivity.

40" 2160p monitors exist. Your logic is faulty.

Did you even read my post? ALL of those benchmarks are of the most demanding titles with everything maxed except for a few monstrous perf hits like hairworks.
Select high preset or turn down the AA and you get 60fps+.
The titan XP does get 60+ fps even with those retarded settings maxed.
I prefer playing on high at 4k compared to ultra at 1080 or 1440.
144hz is nice but I take resolution over that any day.

>How does 1080p content look on a 4k monitor?
Depends on the type of content and on your scaling algorithm

>Is it more blurry than a native 1080p monitor because it's not a 1:1 pixel map?
Depends on the type of content and on your scaling algorithm

What games do you run at 4k just fine? Record some with shadow play so I can see how fine it actually is instead of taking your bullshit word for it.

1. It will be non-uniform as fuck
2. The backlighting will be convoluted (most TFTs are edge-lit)
3. The seams will almost surely be visible
4. Your cost still goes up quadratically.

To fill a single 30" area with 5" phone displays, you need (30/5)^2 displays, which is 36. Go multiply the price of a phone times 36 and you get effectively the same price.

>The most retarded shit I've heard in a while.

This is battlefield 4 on a 970.

What does turn down AA mean? To what, x2 from x16?
Shadows to what, medium?

What does high preset mean, using below-1080p textures?

What about Ambient Occlusion? Is that off?

You're gonna have to make a screenshot for me.

Forgot link
imgur.com/a/a7iCQ
So yeah I'm sure a 1080 could max any title released so far except for the most demanding ones.

>What games do you run at 4k just fine?
Indie games, stuff like SC2 or WoW, puzzle games (like the witness or infinifactory), older stuff in general

>Record some with shadow play so I can see how fine it actually is instead of taking your bullshit word for it.
Does shadow play run on Linux?

3 year old game. But call me impressed.

How often does it actually dip below that 60 when shit starts to happen?

this entire thread contains almost no useful information

Also, for examples of games I needed to turn down to 1440p: Talos Principle, SOMA or Firewatch

I'm pretty sure you can get OBS to use Nvidia's hardware encoder.

This entire thread has asked no interesting questions

>tfw been eyeing the xps 15 with the 4k monitor for awhile
Talk me out of it Sup Forums.

Isn't it actually better to turn it down to 1080p since it scales better for 4K?

I'm hopeful that Vulkan will get Talos up to 2160p@60 for me before I upgrade, but if it doesn't arrive in time for my card I'm sure that the Vega card I'm planning to upgrade to will do that just fine.

What questions do you have user?

Not really, no. 1440p is still going to beat the shit out of 1080p in terms of sharpness etc.

Just make sure you aren't using a completely braindead algorithm to upscale (like nearest neighbour, which introduces severe aliasing for non-integer scaling ratios)

Even so, I would prefer 1440p over 1080p even with NN upscaling. The image is simply sharper.

It depends on the scaling method.
If we're talking nearest neighbor scaling, turning it down to 1080p would look better on a 4k screen.
But with methods that try to smooth the image out, 1440p would look better.

Yes, although the scaling on my monitor isn't that bad. I still usually either aim for 2160p full screen, 1440p windowed (I use this for a lot of strategy games), or 1080p full screen.

Up close the scaling is more noticeable than the blockier pixels at 1080p, and from 10 feet the difference between upscaled 1440p and 1080p isn't enough to justify the performance hit of the higher resolution.

I'm on a much older 3GB 7950 though so I'm sure not as many things run well in 4k for me as for somebody with a 970.

This is not quite the question you were interested in, but it's closely related.

In this example, 1440p is the one that's native (1:1) and 1080p is upscaled to 1440p size with varying scaling methods.

The effect you see with “NN” (nearest neighbour) sampling from 1080p->1440p is similar to the effect you get with NN sampling from 1440p->2160p

i got an oled tv and it's majestic
but now i need a monitor too. it's hard to look at LCD, it spoiled me

>Up close the scaling is more noticeable than the blockier pixels at 1080p
Now that's extremely subjective

I would *much* rather have the sharper 1440p image in exchange for some added aliasing over the significantly blockier 1080p image.

But then again, I'm playing on 32" so the blocking at 1080p is probably much more prohibitively obvious to me

That particular model probably isn't even IPS if they're advertising 1ms response times, I'd advise to stay away from TN no matter the resolution.

I use a 27" 4K IPS monitor, the high DPI makes a huge difference compared to the more typical 27"/1440p screen and to 24"/1080p as well. Everything looks like a jagged, blurry mess compared to 4K. Fonts on 4K look gorgeous and so do games. Video is also sharp and beautiful, but isn't really worth taking into account because there's not much in term of 4K video content out there.

Here's another interesting comparison, incidentally

It compares native 1080p (1:1), native 1440p (1:1) and 1080p upscaled to 1440p (jinc)

To get the right simulated visual effect, stand back from the display so the images appear to your eye about the same size as the ones in the earlier comparison do when sitting normally

I'm guessing most people have a monitor already... So I think the best option is to just get a 4k monitor and switch to your other one for specific games when say a witcher 3 equivalent comes out in 2017

What I hate about IPS is that when the screen is black you get a fucking flashlight to your face instead of actually turning black.

I actually love 4k and everything looks amazing, the real problem is that I use Win10 and they fucked up the UI scaling, so some programs don't scale properly and some seem blurry. There's a tool that makes the DPI scaling like it was on Win8, so I'm fine.

My 1070 is a 4k card, I'm just not autistic about Ultra settings

The real question is if HDR is worth it or it's just a meme.
I've yet to see it side by side a non-HDR screen

>What I hate about IPS is that when the screen is black you get a fucking flashlight to your face instead of actually turning black.
Most high-quality IPS panels have essentially the same static contrast as TN panels

If you want to avoid it, you're either buying VA (expensive), CRT (obsolete) or OLED (very expensive)

Pick your poison

HDR is a marketing buzz word that's being applied to multiple different and individually entirely unrelated concepts

I can guarantee you that “HDR” will be used mainly to mislead customers into buying cheap “HDR LCD” displays that have absolutely nothing in common with the type of true, static dynamic range you get out of OLED.

You don't really need AA at 4K at all (on a 27" display, viewed at regular monitor viewing distance like 60-70cm). It depends on the game/engine/scene, but in some cases aliasing isn't actually visible at all. If it is, some form of lightweight post-processing AA is generally all you need, like SMAA. I used to play games at 2560x1440 with at least 4x MSAA and 4K really is fine without AA at all.

As for settings, it depends on your hardware of course. You will probably want to disable AA or turn it down to some lightweight post-processing method anyway (it's also mostly useless, as I've said). I'm running GTX 1080 SLI and can max pretty much everything other than AA and get 60+ FPS. In some very demanding games, like TW3 for instance, some other settings may need tweaking, like HairWorks. If you really wanted HW, it should probably be turned down to low tessellation and 2x MSAA, because it brings a huge performance hit and those settings make almost no distinguishable difference at 4K. ROTTR on the other hand runs beautifully using DX12 multi-GPU, everything maxed other than AA, PureHair and VXAO.

That's true and I don't like it either, but it's not like we have too many options. The advantages of IPS vs. TN far outweigh the issues for me, so I use IPS until OLED or something else comes along for reasonable prices. IPS is very obviously imperfect but it's pretty much the best you can reasonably get.

Why 24"?

>ips does the same as oled
Just look at a VA panel to see what proper blacks look like. OLED actually has proper blacks as well, TN and IPS never do.

VA's got nothing on OLED

>You don't really need AA at 4K at all (on a 27" disaplay, viewed at regular monitor viewing distance like 60-70cm)
can confirm

nothing pisses me off more than game benchmark websites doing all of their 4K benchmarks with 8xAA enabled

Still enjoying my Dell Ultrasharp 2407wfp from ~10 years ago.

How long until worthwhile upgrade is released?

U2515h. Such a perfect monitor and cheap for what it is too

Point is that VA has 5000+:1 compared to IPS' 1000:1. The black level is also off the charts in comparison. No one says that VA is on the level of OLED, especially since VA has different weaknesses as a result of the high contrast ratio, but it makes it pretty clear how abysmal IPS contrast ratios and black levels are.
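Those ratios translate into black levels pretty directly: contrast is just white luminance divided by black luminance, so for a given white level you can back out the black point. Quick sanity check (the 120 cd/m^2 white level is my assumption, the ratios are the ones quoted above):

[code]
# Black level implied by a contrast ratio at a given white level.
# Assumption (mine): both panels calibrated to a typical 120 cd/m^2 white.
white_cd_m2 = 120

for name, contrast in (("IPS", 1000), ("VA", 5000), ("OLED", float("inf"))):
    black = white_cd_m2 / contrast
    print(f"{name:4s}: black level ~{black:.3f} cd/m^2")
# IPS : ~0.120 cd/m^2  (in line with the ~0.1-0.2 figure quoted earlier ITT)
# VA  : ~0.024 cd/m^2
# OLED: ~0.000 cd/m^2  (pixels switch off entirely)
[/code]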

I'm waiting for oled because that's the closest thing to plasma 4k I'm ever going to get.