Did you fall for the 4k meme?

I stood in front of a standard 1080p set and a 4k set right next to it. I am such an asshole that I put my nose an inch away from each screen for 5 minutes. Studied the greens, the rain forest, the stupid tiger footage, the dolphins, etc. Conclusion: it is a scam. There is absolutely no difference visually. I said VISUALLY. Now, there may be a technical difference, PPI etc. But in the end, I did not notice a difference between a Blu-ray playing on a 1080p HD screen and a 4k disc on a 4k screen. Sorry. It's overpriced horseshit.

Other urls found in this thread:

en.wikipedia.org/wiki/PenTile_matrix_family

I did the same thing and noticed a big difference
Sorry you're too blind and/or retarded to see it

He's blind and/or retarded to see it what?

For media consumption, 1080p is fine.
For desktop use, 4K is much nicer for large amounts of text.

I've never seen a 4k display smaller than 32 inches because all my friends are poorfags and own 720p/1080p monitors like me. I have no idea how good 4k is on 27-inch and smaller displays, but I believe it should be a major improvement: I used a 720p monitor for years, and upgrading to a 1080p monitor was a massive improvement. I guess 4k TVs are a meme, but I wouldn't say the same about monitors.

Yes. The biggest thing, ironically, is that fonts are noticeably clearer. Useful for 漢字 and any other squiggly Chinese-esque characters, not really useful for anything else.

If you're talking about TVs, then you're blind because there's a huge difference. Every good TV nowadays is 4k so you have no choice anyway. HDR, 10bit, local dimming zones - no 1080p TV has all those features.

For monitors, yeah it's pointless.

For monitors over 22", 1080 is absolutely abysmal. Need QHD up to about 28" then 4K.
In large monitors the blending of colours to make white begins to fail when the DPI drops. This causes terrible eye strain because eyes focus differently on strong primary colours.
Also the loss of smoothness of the text makes the space unusable.
High DPI monitors used to be the standard before everything stagnated at 1080.
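If you want numbers on that, here's a minimal sketch of the standard diagonal/PPI formula (the sizes below are just example picks, not from anyone's chart):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a display of the given resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

# 1080p spread over bigger panels vs. QHD and 4k at 27"
for name, w, h, d in [('1080p @ 22"', 1920, 1080, 22),
                      ('1080p @ 27"', 1920, 1080, 27),
                      ('1440p @ 27"', 2560, 1440, 27),
                      ('4k    @ 27"', 3840, 2160, 27)]:
    print(f"{name}: {ppi(w, h, d):.0f} PPI")
```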

HDR is the real game changer.

Play Forza Horizon 3 and then play it in HDR.

Nearly. There was a 43-inch 4k monitor on special that I almost purchased, but I decided against it because it didn't have Adobe RGB.

HDR is just 10-bit color, I don't get it

TV HDR: Expanding the TV's contrast ratio and color palette to offer a more realistic, natural image than what's possible with today's HDTVs.

Photo HDR: Combining multiple images with different exposures to create a single image that mimics a greater dynamic range.
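For the photo side, a minimal sketch of merging bracketed exposures, assuming OpenCV is installed and that the three filenames are placeholder shots of the same scene:

```python
import cv2

# Placeholder filenames for three bracketed exposures of the same scene
exposures = [cv2.imread(p) for p in ["shot_dark.jpg", "shot_mid.jpg", "shot_bright.jpg"]]

# Mertens exposure fusion weights each pixel by contrast, saturation and
# well-exposedness, then blends -- no tone mapping or response curve needed.
fused = cv2.createMergeMertens().process(exposures)

# The result is float in roughly [0, 1]; scale back to 8-bit to save it.
cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```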

Same here user I looked at a 4k monitor and a 1080p monitor and I saw no difference!

I hate rich people. I bet they don't even know what love feels like; they buy products because they are sad people who never got to enjoy the feeling of friendship or a woman's touch. Whenever I see a person who buys the newest products I just realize how much better I am! They are just consumerist phonies; they need an excuse to be consumers, and these corporations allow them to be consumers.

yes? that's what i said.

My guess is retards are using HDR to boost saturation and contrast to make the image "pop" and the retard consumer thinks "wow" and that's that.

>he doesn't run 4k
fucking lmao

4k is only good on 40"-45" and there are no good monitors in that size.

hdr is just a marketing buzzword.

I can't help but respond with something condescending about wasting time on startup scripts when you could be writing useful code, but at the same time I want to see what it actually does.

No, but I fell for freesync meme.

>10 months ago
>Spend over four-hundred Australian monopoly money on a 144Hz 1080p Freesync monitor
>Connect it to my RX 480
>Enable freesync in Radeon Settings
>Start playing games
>"Wow! Freesync is amazing! It feels so smooth and crisp!"
>3 weeks ago I realise freesync wasn't enabled in the monitor settings
>That means for almost 10 months freesync wasn't even working
>Literally no difference in image quality with freesync on or off

Freesync is a load of shit. The picture quality is no better than my 7 year old 23" 1080p 60Hz monitor, and there's no difference between 60Hz and 144Hz.

What? Let me guess, 21" screens? Kek
I don't even have a 4k monitor and I call bullshit; the difference is obvious and the extra screen space is an unarguable advantage.

the startup class just sets up dependency injection, the rest of it is a website for gathering data on language reading patterns for PhD thesis research

it has more pixels

it's for people who want more pixels

I bought an AMD GPU (2013) because they can pump pixels

go to bed jamal

24" 1440p monitors are where it's at.

what's the second monitor used for?

interesting, what are you hoping to find?

>60=144

Get out of here you retard, if you didn't even bother turning freesync on you probably didn't even set the refresh rate properly. There's a massive difference from 60 to 144; try playing at 144, then watch it drop to 60.

I agree, OP. Look at a 4k image. Now look at a 1080p one. No difference. Sorry, losers, it's a Total Scam.

Two things: correlations between text difficulty and reading speed, and correlations between text difficulty/reading speed and comprehension.
The site collects more data than anyone could ever need, so there might be some other interesting patterns that arise. Some users will improve faster than others; can I find out why? Who knows. Maybe showing a running timer will make users read the text faster; will it have a negative effect on comprehension, or will it just be distracting and cause them to read slower? So many tests to do. I'm just doing the programming and stats, someone else is doing the real work.
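Not the actual thesis code, just a minimal sketch of that kind of check, assuming a hypothetical per-session export with difficulty, wpm and comprehension columns:

```python
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical export from the site; filename and column names are placeholders
df = pd.read_csv("reading_sessions.csv")  # difficulty, wpm, comprehension

# Rank correlations: difficulty vs speed, and both vs comprehension
for x, y in [("difficulty", "wpm"),
             ("difficulty", "comprehension"),
             ("wpm", "comprehension")]:
    rho, p = spearmanr(df[x], df[y])
    print(f"{x} vs {y}: rho={rho:+.2f}, p={p:.3g}")
```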

But what if you're using a 37" screen? 1080p at that size or bigger would give you pixels the size of bricks.

Liar, I use a 50" and at 2 feet you already can't see pixels.

I'll have to agree, there's no difference between 4k and 1080p. The only reason I got a 4k panel on my P50 was that they were only offering TN panels for the 1080p displays, and I wanted something with a wider color gamut and better contrast ratio.

Fucking windows 10 still does a shitty job supporting 4k as well.

At 2 feet I can see the pixels on my 45-inch TV, you're so full of BS.

probably bc it's a shitty tv with a shitty tn panel

I'm talking about plasmas and OLEDs

For LCDs it might be 3 feet or whatever; measure it with a ruler:
>walk back till u no longer see pixels
>take ruler and measure distance
>post and tell us fagit

That guy really is a stupid nigger

How do you measure 'not seeing pixels'? If it's a single black dot on a white background, you will always see it (in the context of modern displays), because that's how our eyes work. You will never be close enough to see the space between pixels. So what do you mean by 'not seeing pixels'? Is drawing a moderately sized circle with a 1-pixel-wide line in Paint to see if it's jaggy a good test? Does it stop looking jaggy if the pixels are small enough?
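One crude but common way to pin that down is to assume roughly 1 arcminute of visual acuity (the 20/20 figure) and ask at what distance a single pixel subtends less than that. A minimal sketch of that calculation, with the acuity number as the big assumption:

```python
import math

def pixel_invisible_beyond(width_px, height_px, diagonal_in, acuity_arcmin=1.0):
    """Distance (inches) beyond which one pixel subtends less than acuity_arcmin."""
    pixel_pitch_in = diagonal_in / math.hypot(width_px, height_px)
    return pixel_pitch_in / math.tan(math.radians(acuity_arcmin / 60))

for name, w, h, d in [('42" 1080p', 1920, 1080, 42),
                      ('42" 4k   ', 3840, 2160, 42)]:
    print(f"{name}: ~{pixel_invisible_beyond(w, h, d) / 12:.1f} ft")
```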

t.assburgers

42-inch TV
100 cm (3.28 feet) distance till pixels can't be seen

Right, so we just proved "seeing pixels" on a TV is a complete meme unless you fucking cram your face against the screen, which is bad anyway even at 8k res.

>i cant see pixels at 2 feet on my 50 inch tv
that was the basis for what I said; if I can see them at 2 feet on my 42-inch TV, that just shows you're full of shit.

read motherfucker

>t. reddit

I will fall for it as soon as good screens get cheaper.

Yeah, got a Samsung 43" 4k TV for 400€.

Don't care about the 4k though.

I don't have a TN panel, so what's your point? Your chart doesn't back up your 2-foot claim either.

don't post her here.

>Samsung 43" 4k TV for 400€.
>samsung
>400€
>Don't care about the 4k though.
good, because you may have got one of those fake 4k PenTile displays:
en.wikipedia.org/wiki/PenTile_matrix_family

>AMDrones in damage control
Face it freesync is crap and high refresh rates are a meme

the chart is the OPTIMUM distance, brainlet.
autist

I fell for the 25" 1440p monitor. I have bad vision and sit 4 feet away from my monitor so it's hard to read text without zooming in to 150% or more. I would imagine that 4k would be much, much worse.

As expected of Sup Forums, everybody is arguing about 'muh dpi', but no one has thought about how the quality of a display could be measured apart from 'muh dpi', which doesn't reflect our perception of the image.

There are many TVs that size though.
>inb4 muh gaymes, pre-emptive

1080p is too low for 27"; 1440p is great. 4k might even be overkill there.

That's why you use scaling, dipshit. Most 4k displays are way too small to run at native res with 100% scaling. The point is not primarily to display more information on screen but to make everything sharper. For example, a 27" 4k screen should be scaled to match 1440p.
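The arithmetic behind that, as a quick sketch (not any OS's actual scaling code): the scale factor divides the native resolution to give the effective desktop size, while everything is still drawn with the full pixel count.

```python
# Effective (logical) resolution after UI scaling: native pixels / scale factor.
def effective_resolution(width_px, height_px, scale_pct):
    factor = scale_pct / 100
    return round(width_px / factor), round(height_px / factor)

# A 27" 4k panel at 150% behaves like a 1440p desktop, and at 200% like 1080p,
# but UI elements are rendered with more physical pixels, so text is sharper.
print(effective_resolution(3840, 2160, 150))  # (2560, 1440)
print(effective_resolution(3840, 2160, 200))  # (1920, 1080)
```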

So what scaling factor do I use on mine? Is there a way to calculate this?

Nah, you're just dumb as shit.

It can actually be an advantage. For example, if you play a competitive shooter with lots of dark and bright areas on the map you'll be able to see more clearly in both lighting conditions without adjusting brightness/gamma.

You don't calculate the scaling factor, you choose it in Windows settings or whatever crap OS you happen to be using.

Yeah. Fuck my life right. It's terrible, I assure you.

But how do I know which one to use? I'm running Ubuntu Budgie.

You try. Run the display at 100% and if text is too small use 125%. Or 150%. Whatever Ubuntu allows you to use as a next step.

Sadly the average human eyes max out at about 1200p, with each eye seeing about 600p.

Some people, myself included, can truly see the full 4k and it is nicer to look at. Mainstream acceptance is unlikely because most just physically can't see well enough. Too little bandwidth on the optical nerve.

>max out at about 1200p
Wow, what levels of retard we've hit. It's not about resolution; it's about pixel density and distance to the screen. Do you really think your eyes would "max out" if you looked at a 100" 4k display from 2 feet away?

the obvious b8 is obvious nigger

I don't think Budgie allows you to run fractional scaling, only 200% scaling.
If you want fractional scaling, your only option is Unity or Plasma, I think.

>Loonix
you're shit out of luck then. it usually takes like 10 years until new standards work properly with Loonix

apparently not that obvious

Not him, but lurking on Sup Forums I'm never sure if those people are just pretending to be retarded.

>there are people who fell for the 1080p meme

I put a 1080p screen and a 720p screen right next to each other, I couldn't tell the difference. There is literally no reason to spend money on anything higher than 720p, you cannot tell the difference. Technology should stop progressing past 720p, because it's as good as it'll ever get.

>Watched 1080p content on a 1080p and a 4K next to it.
OP, you should probably instead compare something that renders a game at 1080p and at 2160p on the same size of monitor.

HDR is the only way to get the blackest blacks possible. This is the only reason to upgrade to 4K

Baka, do you realize HDR boosts contrast, not actual black levels?

The only way to get "da blackest" is OLED

This.

HDR seems like marketing trash, just like 2160p (3840 wide) gets called 4K even though it's not, because we don't call 1920x1080 2K.
Though there's a special type of retard that calls 1440p 2K when it's closer to 3K than 2K.

HDR is shit.
Focus on chroma 4:4:4 and how many bits of color the monitor can render.

A 10-bit per channel monitor offers way better color than the typical 8-bit.
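The arithmetic behind that claim, as a quick sketch: every extra bit doubles the number of steps per channel, which is where the banding difference comes from.

```python
# Levels per channel and total displayable colors at different bit depths
for bits in (8, 10, 12):
    levels = 2 ** bits       # steps per channel, e.g. 8-bit -> 256
    colors = levels ** 3     # three channels: R, G, B
    print(f"{bits}-bit: {levels} levels/channel, {colors:,} colors")
# 8-bit  -> 256 levels,  16,777,216 colors
# 10-bit -> 1024 levels, 1,073,741,824 colors
```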

HDR is great if it is footage captured with a camera that has a high dynamic range

No...
"HDR" is for those that want to over saturate sources with low contrast and color vibrancy.

Over-saturating "HDR" would be overkill.

A meme?
4k is fucking top-tier for getting rid of a useless extra monitor.
You can fit fucking 4 of them in it.

That shit's a dream for editing of any kind, be it code or drawing.
I regularly draw in 4-8k for downscaling to target resolution.

If you're running at 144 Hz, you have so little tearing that you don't need freesync. It's most useful at lower refresh rates.

Your image is wrong. Actually it's plain misleading. Most bit depth converters will simply dither the color when converting to lower bit depths. There's no "10bit" in that image.

Are you retarded? I can tell the difference between a 1080p screen and 4k screen from 6 feet away.

I still use a 1080p screen at home though, because I don't care about upgrading.

Hey, at least even if 4K is a meme there will never be another "upgrade". Having anything beyond 4K is a definite meme, even normal people can see it.

I think some people in this thread have a false notion of HDR. Its foremost use is with high-nit displays that can output higher amounts of physical light. Conventional displays like PC monitors only output 300 cd/m2 at best. HDR TVs target 1000 cd/m2, with some high-end displays going up to a whopping 4000 cd/m2. It's a physical property, not something emulated like dynamic contrast.
The primary use is to deliver a more believable picture that more accurately matches the extreme dynamic range our eyes perceive. Extended luminance also makes it easier to see display content in broad daylight.

>Sup Forums in ten years
>'you can't afford 16K? what a poorfag'

Are you that retarded?

It's meant to simulate it for people viewing on 8-bit.
To a person viewing it, a true 10-bit image would look the same as the dithered 8-bit one.

10-bit gets a similar result to dithered 8-bit, but without the dithering, which is a form of jitter and artifacts used to hide the banding that doesn't happen on 10-bit.

t.12-bit monitor user.
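A minimal numpy sketch of that point, using an exaggerated 4-bit quantization of a smooth gradient as the test case: plain truncation produces visible bands, while adding about one step of noise before quantizing (dithering) trades the bands for fine grain that averages back to the original ramp.

```python
import numpy as np

# Smooth 0..1 horizontal gradient, like a sky or studio backdrop
gradient = np.linspace(0.0, 1.0, 1920).reshape(1, -1).repeat(256, axis=0)

def quantize(img, bits):
    levels = 2 ** bits - 1
    return np.round(img * levels) / levels

BITS = 4                                                    # exaggerated for effect
banded = quantize(gradient, BITS)                           # visible staircase
noise = (np.random.rand(*gradient.shape) - 0.5) / (2 ** BITS - 1)
dithered = quantize(gradient + noise, BITS)                 # steps hidden as grain

# Column averages: dithering recovers the smooth ramp on average,
# plain quantization keeps the staircase error.
print("max column error, banded:  ", np.abs(banded.mean(axis=0) - gradient[0]).max())
print("max column error, dithered:", np.abs(dithered.mean(axis=0) - gradient[0]).max())
```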

1) TVs are all fucking glossy
2) gaymers use TVs

Doublenigger