Why are people pushing this garbage?

Other urls found in this thread:

techdissected.com/ask-ted/ask-ted-how-many-ppi-can-the-human-eye-see/
youtube.com/watch?v=sz_m6N1IYuc

"bigger is better" mentality

Because it's the future...

People don't.

TV salesmen do.

To sell us shit we don’t need

8k is coming out soon

And no fucks are given.

>We dont need resolution higher than 1080p on 50" tvs
What

Yes, we don't.

It's the only affordable "upgrade" consumers can understand.

Everything else, like contrast and color improvements, is expensive.

I'll bet many consumers "upgraded" from a high quality 1080p TV to a garbage quality 4K TV.
And they would be happy too because they don't judge the screen for what it is, only for what it says on the box.

People are this stupid.

Well to be fair, and not citing this chart, 1600p is about the best people can see when sitting the optimal distance from a TV for movie watching.

We got 4k instead of a reasonable resolution because it's cheaper to manufacture from existing infrastructure and it scales 1080p perfectly.

I guess it's not totally useless though for shit like font rendering and video games where aliasing is visible.

Nah, I'd love an 8K display at ~32" for PC use.

Was this graph made by a blind person?

Well maybe that's stopped now because of the HDR meme. There's some attempt to improve contrast and even the cheaper 4k sets with local dimming are a good stop gap until OLED is improved and competitively priced.

FONT RENDERING
O
N
T
R
E
N
D
E
R
I
N
G

t. resolutionlet
You can never have too high a resolution. If you can't afford it, or consider it diminishing returns, then don't buy it; but for people who aren't visually impaired and can afford it, there is no reason not to go 4K and eventually 8K. I bet you think no one needs more than 80 columns either.

I can confirm that chart is bullshit.

Higher resolution unnecessarily uses more bandwidth and processing power, inflates the price, etc.

In the case of LCD it also blocks out more of the backlight.

>I'll bet many consumers "upgraded" from a high quality 1080p TV to a garbage quality 4K TV.
Thanks to that there are lots of really good 1080p TVs still in stock getting thrown out on sale. Mongs just read 4K on the box and jump for it like hungry dogs. They end up buying rebranded garbage chink TVs.

What does the t stand for?

Because of chroma subsampling, you retarded faggot. 4K video is true 1080p video.

Yes, even Blu-ray discs are 4:2:0 (blaze it, faggot); 4:4:4 video is just too damn big in file size even for Blu-ray.
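
To spell out what that means in numbers: in 4:2:0 the two chroma planes are stored at half the width and half the height of the luma plane. A rough sketch (plain Python, just the standard plane arithmetic, names made up for illustration):

# 4:2:0 chroma subsampling: one Cb/Cr sample per 2x2 block of luma pixels.
def plane_sizes(width, height):
    luma = (width, height)              # Y plane: full resolution
    chroma = (width // 2, height // 2)  # Cb/Cr planes: a quarter of the samples
    return luma, chroma

luma, chroma = plane_sizes(3840, 2160)  # a "4K" UHD frame
print("luma plane:  ", luma)            # (3840, 2160)
print("chroma plane:", chroma)          # (1920, 1080) -- 1080p worth of color

So the brightness detail really is 4K, but the color detail is only 1080p worth, which is what the "true 1080p" jab is about.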

Do you also drive only at the speed where your car has the best fuel efficiency even if it's well below the speed limit?

adding to this: I'm assuming 4K videos and not 4K TVs which are a huge fucking meme until we get widespread 8K content.

>Not being able to see any more detail is as useful as getting somewhere faster

anything that needs to be scaled so it can be legible is idiotic

It's just a higher resolution, dimwit. They happen.

Retina display is really good.

>I bet you think no one needs more than 80 columns either.
You LITERALLY don't.

Imagine being this blind

You also need a 4K image sensor to make a "true" 1080p video.
Each pixel records just one color, the other two are interpolated from adjacent pixels.

For "true" 4K you'd need an 8K image sensor, which I'm not even sure exists yet.

4K and beyond only make sense for massive screen sizes.

Videophiles are the only market that even gives a shit about it.

1080p is really good enough for the normies; HDTV manufacturers are just bloody desperate. The 3D fad failed. HDR didn't work either. Now they are trying the 4K fad to get units sold.

I don't think it is going to work either, because you need a really massive screen to make it worthwhile, and that's simply beyond the budget of the overwhelming majority of normies (both the screen itself and the room needed to house it).

You aren't going to notice the difference between 4:2:2 and 4:4:4 in real life situations. Just like you don't notice quantization errors in your audio, jitter, DCT, etc.

Oh sorry. Did you get offended for still owning a 1366x768 chink laptop display?

Not exactly; pro cameras (i.e. RED) can record raw 4:4:4 true 4K video, but no one will ever see that because the video gets butchered to 4:2:0 in editing software and/or the final Blu-ray/web encode.

We have yet to see true 4:4:4 4K video recording on phones.

t. newfag

Doubt it but I know for a fact you can tell the difference between 4:2:0 video and 4:4:4 video especially around edges of things where 2 different colors meet.

No, it is not.

There are practical limits to human visual acuity, even if you are a freak with 20/10 vision (a tiny minority).

4K for computer monitors is near the practical limit of human visual acuity at typical computer screen sizes (30-40"), and going beyond that is just silliness.

That still doesn't stop videophile nonsense from being sold to gullible idiots.

>Not exactly, pro cameras (ie red) can record raw 4:4:4 true 4K video

Only the ones that say "8K" (which, I just looked up, do exist now).

I’m just waiting for all these different HDR formats to eventually converge into a single industry standard. Right now it’s just a goddamn mess.

Also the exclusive bundling of streaming content with certain smart TV manufacturers (like Samsung with Amazon HDR10+) makes my stomach turn.

Also only a minority of streaming content is available in 4K, much less in some form of HDR.

Yeah this is true

But I had to stare at it for a really long time to see that the red sticker looks like shit. Everything else looks normal.

If I'm looking at objects, and not at details within objects, the way people actually watch video, it doesn't matter at all.

I like that stuff like this is improved behind the scenes and I don't need to give a fuck to benefit from it, even if the benefit is microscopic. But bandwidth is a limiting factor right now.

>“If the average reading distance is 1 foot (12 inches = 305 mm), p @0.4 arc minute is 35.5 microns or about 720 ppi/dpi. p @1 arc minute is 89 microns or about 300 dpi/ppi. This is why magazines are printed at 300 dpi – it’s good enough for most people. Fine art printers aim for 720, and that’s the best it need be. Very few people stick their heads closer than 1 foot away from a painting or photograph.”

techdissected.com/ask-ted/ask-ted-how-many-ppi-can-the-human-eye-see/

Even with okay 20/20 vision you still need at least 300 PPI at 12 inches from a screen to not see any pixelation, even minor.

Even 4K 30 inch monitors only achieve half that PPI.
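
The arithmetic behind those numbers is easy to redo yourself. A quick sketch (plain Python, using the 1 and 0.4 arc-minute figures from that quote):

import math

def required_ppi(viewing_distance_in, arc_minutes):
    """Pixel density needed so one pixel subtends the given angle at that distance."""
    angle_rad = math.radians(arc_minutes / 60.0)   # arc minutes -> degrees -> radians
    pixel_pitch_in = viewing_distance_in * math.tan(angle_rad)
    return 1.0 / pixel_pitch_in

print(required_ppi(12, 1.0))   # ~286 ppi  (the "300 dpi is good enough" figure)
print(required_ppi(12, 0.4))   # ~716 ppi  (the "fine art print" figure)

def actual_ppi(h_res, v_res, diagonal_in):
    """Actual PPI of a display from its resolution and diagonal size."""
    return math.hypot(h_res, v_res) / diagonal_in

print(actual_ppi(3840, 2160, 30))  # ~147 ppi for a 30" 4K monitor

Which is where the "half that PPI" figure comes from: a 30" 4K panel sits around 147 ppi, roughly half the ~300 ppi you'd need at a 12-inch viewing distance.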

Question for anons

What is the difference between HDR content and SDR content when both are viewed on an HDR capable display? Is it just 10 bit color or higher?

Watching HDR vs SDR comparisons on youtube is hilarious. SDR does not look like someone puked gray on my screen. The contrast is the same.
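
For what it's worth, it's more than just 10-bit. HDR10-style content also assumes BT.2020 primaries, carries mastering metadata, and above all uses a different transfer curve (PQ / SMPTE ST 2084) that maps code values to absolute luminance up to 10,000 nits, versus SDR's roughly 100-nit gamma curve. A rough sketch of the two curves (plain Python; the PQ constants are the published ST 2084 ones, the SDR side is a simplified gamma-2.4 / 100-nit assumption):

# PQ (SMPTE ST 2084) EOTF vs. a simplified SDR gamma curve.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(code):
    """Normalized HDR code value (0..1) -> luminance in nits (cd/m^2)."""
    p = code ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def sdr_eotf(code, peak_nits=100, gamma=2.4):
    """Simplified SDR: plain power-law gamma scaled to a ~100-nit display."""
    return peak_nits * code ** gamma

for v in (0.25, 0.5, 0.75, 1.0):
    print(f"code {v:.2f}:  SDR {sdr_eotf(v):7.2f} nits   HDR/PQ {pq_eotf(v):8.2f} nits")

And a lot of those YouTube comparisons are SDR uploads, so the "SDR" side has to be artificially washed out to fake the difference; a real SDR signal on an HDR display just gets mapped to its normal ~100-nit curve.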

>LITERALLY buying a TV that records everything you say and sends it to the manufacturer
>Even worse, buying from a foreign company so when those recordings are sent to HQ the NSA can consider them 'foreign transmissions' and legally keep copies of them all
y'all are fuckin stupid

Utter horseshit, especially for gayming; there's a night and day difference when it comes to things like aliasing and shimmering, visible from distances much further than that.

But there are also plenty of good quality 4K sets on sale right now. Something like the 40" Samsung MU6120 barely costs any more than a 1080p TV and is very decent for SDR content. Step up a bit more and you can get something like the 49" Sony XE9005, with FALD and great HDR rendering. This is the perfect time of year to buy a TV, when they're clearing all the old stock.

Also, people don't really have a choice any more if they want a TV bigger than 32" either. Manufacturers aren't making 1080p sets any bigger than that these days. And 32" is pretty damn tiny by today's standards.

source? As far as I know red 4K sensors are 4K res and save to raw 4:4:4 video

Don't take my word for it, try it yourself. Play vidya and record it with a lossless 4:4:4 codec like the one Fraps has. Then convert that to 4:2:0 at a high-quality CRF like 16 and compare the videos side by side. There's clearly a huge fucking difference.

The only reason people won't notice it is because they don't have access to the original 4:4:4 video to compare it to.
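
If you don't want to mess with a capture tool and an encoder, you can fake the same experiment in a few lines. A minimal sketch (numpy only, rough BT.601 full-range math, and a synthetic red/green edge standing in for game footage; the edge is deliberately not aligned to the 2x2 chroma grid, which is where 4:2:0 smears the worst):

import numpy as np

def rgb_to_ycbcr(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 + 0.564 * (b - y)
    cr = 128 + 0.713 * (r - y)
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.403 * (cr - 128)
    b = y + 1.773 * (cb - 128)
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return np.clip(np.stack([r, g, b], axis=-1), 0, 255)

def subsample_420(y, cb, cr):
    """Average chroma over 2x2 blocks, then blow it back up (what 4:2:0 storage does)."""
    def down_up(c):
        small = (c[0::2, 0::2] + c[0::2, 1::2] + c[1::2, 0::2] + c[1::2, 1::2]) / 4
        return np.repeat(np.repeat(small, 2, axis=0), 2, axis=1)
    return y, down_up(cb), down_up(cr)

# Hard red/green vertical edge at an odd column, so chroma blocks straddle it.
img = np.zeros((64, 64, 3), dtype=np.float64)
img[:, :31] = [255, 0, 0]
img[:, 31:] = [0, 255, 0]

degraded = ycbcr_to_rgb(*subsample_420(*rgb_to_ycbcr(img)))
print("max per-pixel error along the edge:", np.abs(degraded - img).max())

The luma stays put, but the color values right at the edge come back noticeably wrong, which is the kind of fringing the anons above are talking about around hard color edges.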

Near practical limits, a.k.a seeing a noticeable difference.

You would be hard-pressed to find a difference between 4K and 5K in a double-blind test without specialized tools and images.

Silly videophiles will still insist that's a noticeable difference though.

Not him but RED uses 8k sensors

The only thing people do with screens is watch video

dpi is not ppi, at all. Dpi needs to be higher to produce the same detail as measured in ppi

Also being 1 foot from my monitor makes me want to kill myself.

4k TVs are retarded because of how little 4k content is actually available. In burger land streaming services have trouble streaming at 1080p, good fucking luck getting 4k.

>shortneckfags thinking they know anything about proper usage of screens

Not the user you were replying to but in the context of standard computer displays (ones that aren't using pentile or something) then DPI and PPI are interchangeable.

Well that train picture is also clearly shit

Someone could easily make a real example using screenshot comparison

Videophiles are the red-headed cousins of audiophiles, who nit-pick and insist that silly nonsense makes a noticeable difference ($5K cables, wooden knobs, etc.)

The rest of the world just laughs at them for falling for overpriced snake oil crap.

>Still thinks videos are the only thing people use digital displays for

Too bad there's very little 8K video content to take advantage of it.

I'm not a videophile but you can't ignore how powerful human vision is, even average at that. The only reason we all don't have 8K monitors and 4:4:4 8K video is because all of this is too fucking expensive right now.

You got that in reverse, you need higher ppi to produce the same dpi.

you sir are an idiot :\

Actually, human vision kinda sucks for color acuity, visual acuity and how easily it goes south with age and diseases/disorders.

It is rather easy to fool as well with simple optical illusions.

>source?

Google "Bayer sensor"

>As far as I know red 4K sensors are 4K res and save to raw 4:4:4 video

No, barely any sensors are.

I think Sigma is actually the only ones making sensors with 3 colors per pixel (they call them "Foveon") and they make a fucking big deal about it to the point where they call their 26MP sensors "39MP equivalent".

In the old days they used prisms to split the light into 3 separate sensors.
But modern cameras have a single sensor.

Also most "4K" content is encoded with 420 blaze it faggot chroma sub sampling meaning it's actually 1080p content. 8K content which is virtually non-existent is required for 4K TVs.

like i said, don't take my word for it. try it yourself, there's a huge difference.

Videophiles = digital imagery in general. It's not just limited to video playback either.

I sit 13 ft away from my 65" Samsung. Everything from 480p to 1080p looks great on it. Even some low-res media from the library looks decent, even though it's being upscaled to 1080p. Nothing I've got uses/outputs 4K. My cable box doesn't support 4K, and as far as I know the cable company has no plans to "upgrade" to 4K support any time soon. Also, given that some of my media is low-res, wouldn't further upscaling to 4K render it like shit? Seems like it would, just like trying to enlarge an 8x10 print to poster size; the result would look like garbage due to the required pixels not being there to start with.

8K Bayer sensors.

Which only have 4K worth of red and blue pixels.
And two times 4K worth of greens.
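
Quick back-of-the-envelope on that, assuming a standard RGGB Bayer layout at 7680x4320 (plain Python, just counting photosites):

w, h = 7680, 4320                 # an "8K" Bayer sensor
total = w * h                     # 33,177,600 photosites in total
red   = total // 4                # 1 in every 4 photosites is red
blue  = total // 4                # 1 in every 4 is blue
green = total // 2                # 2 in every 4 are green

print(red, blue, green)           # 8294400  8294400  16588800
print(3840 * 2160)                # 8294400 -> exactly "4K worth" of red or of blue

So the red and blue counts of an 8K Bayer sensor land exactly on a 4K frame's pixel count, with twice that in green.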

yes it would

If you're interested in seeing the best 1080p image your tv can output get a roku/chromecast and play 4K content on it.

8K video won't be a thing for at least a century so 4K tvs are a meme right now.

Why is Sup Forums so fucking stupid?

>65" 4k 3d 240hz motionMeme Samsung TV
>Paid $200 because some jackass at the store thought it was a "display tv" since samsung moved the ports to an external box and they didn't have the box
>Bought box on Amazon for $50

If not for that I'd still be on a 720p plasma.

personally I don't want a whole cinema for my desk
I can barely even handle a 19" 1080p monitor

I read some articles saying that 8K + HDR12 content takes the pixel grid out of the equation and tricks your brain into thinking it's seeing through a window, like, depth and all.

Great buy, I'm jelly, but like everyone said you'll have to wait a while for 8K content to get the best possible 4K image.

That's fine, you just keep on keeping on - I'll buy the 8k 32" in your place ;)

I already run a 32" 1440p...

Are we even going to be alive when "16K" video, a.k.a. true 8K video, becomes widespread?

>The 3D fad failed.
3D was practically killed off by the hardware manufacturers. There were no standards for glasses or display tech. The cost premium for the privilege was way too high. Content producers were half assing their way through most of the content they produced.

On the PC side, nVidia and AMD both loaded up on proprietary garbage and devs supported neither well. nVidia required that the entire hardware stack be blessed by them or it wouldn't work, and it didn't for the most part.

The only good part about it was the market deployment of 144hz monitors.

>In the future retina display will mean you can see the actor's retina

I have never owned a monitor with a higher resolution than 720p, what's so great about 4k and 8k? Or 1440p for that matter, isn't 1080p the perfect balance between resolution, power usage, and performance?

I'm not using it for video though...

>2003
>Both AMD and nVidia have built in 3D glasses support

>2013
>Both AMD and nVidia have proprietary glasses and restrict you from using them without specific monitors

Currently watching blade runner 2 1080p on a 4K samsung meme tv

Looks crisp as fuck. Seriously can’t imagine a need for 4K content

>native 3d glasses support
Not really, it was a few third party vendors that managed to get them to bundle drivers. Game support was still very limited. Support was silently dropped around the GeForce 3 era since devs forgot the stuff existed.

In the last attempt, it was the card manufacturers themselves attempting to push their own gear. The reason everything was so restrictive is because they learned a few things from computers in the 90's: people will strap high end toys to low end gear and blame the toy for the subpar experience. Locking down the requirements kept the experience standards at a baseline. My only complaint is that since they dropped support, they should just turn the safeties off and let people do what they want. Other manufacturers had similar issues as well. Oculus did a poor job of handling people when they started whining that their five-year-old Acer laptop couldn't handle proper VR.

For gaming I don't think it's necessary at all
I mean the extra detail is incredible and looks amazing but I use my 4k monitor for productivity and photo/video editing. If I didn't do any of that though I would've gone with higher refresh rate.

>Not really,
Yes really. nVidia used to include stereoscopic 3D output in the drivers standard. When they decided to do "3D Vision" they removed this from the driver unless you had a "3D Vision compatible" monitor and their $300 glasses.

holy shit

Most normies have good eyes

>1600x1200 75hz
>CONTRAST
youtube.com/watch?v=sz_m6N1IYuc

>more bandwidth
>(((inflates the price)))
OOOOLLLOLLOLOLOLOL

>21inch

I'm sure, you'll probably get a headache. I saw some TV at this store and I was like, wtf am I looking at? Really pretty good looking 3D and depth and stuff, but if I stared long enough... I don't know. You can't beat real life, and they should use something that uses real-life analog stuff, like a CRT.

>Why are people pushing this garbage?
Because YIFY rips get
>V:10 A:10 thanks yify
comments.

People are fucking dumb and the only thing they can understand or appreciate are easily measurable things that the manufacturer can slap on the box.

Good luck finding a decent 1080p TV these days. They are all bargain-bin pieces of shit designed to be as low-priced as possible.

Why are 3rd worlders still browsing my board? Can you 'le humble tech' cuckolds please fuck off to another board?

High quality 1080p... yeah... fuck no.
High quality 1080p was around at a time when most TVs under $3000 had dynamic contrast, not local dimming.
Even the good TVs had so much lag with dynamic contrast that it was always better to turn it off.

No, most people didn't have good 1080p screens. Hell, the only redeeming thing about the 1080p set in my living room is that it's a VA panel with a 3500:1 contrast ratio.

My computer monitor is now a TCL 605p.
It has some issues, the very edges are dim, but beyond that: 6500:1 native contrast, 7500:1 with local dimming, and the local dimming is fast enough to not be annoying, so black means black.

Meanwhile that TV in the living room isn't even displaying 8-bit color correctly, and another TV the wife uses is a fucking 6-bit piece of trash that was $400 or so a few years back.

Yeah, no, most people didn't buy top-end panels, and $600 now gets you 4K, local dimming that's worth a damn, very high contrast, and 93% DCI-P3 coverage.
THAT is a fucking good display. It just happens that we are getting video that demands better displays, so people are starting to put good shit out. I mean fuck me, I can see a difference between RGB values of 12 12 13 and 14 14 16; that small of a change and I can see the color difference, holy shit.

4K is being pushed partially because cable broadcasts 720p, and satellite fucking sucks, and 720p scales into 4K perfectly (a clean 3x), unlike into 1080p (1.5x).

Sure, some people will buy bad 4K TVs, but that's on them for not doing research.

buy buy buy

>implying the retards who would buy any crap as long as it has 4K written on it had a high quality 1080p TV to begin with

>connecting your TV to the internet

To some degree it may be a result of the TV's built-in upscaling algorithm, although it is true that there isn't that much of a difference between 4K and 1080p. 4K is god tier for monitors though.

>4K is god tier for monitors though.
LOL

What's wrong with it?
>inb4 gaymer manchild who needs gorillion Hz refresh rate

>Reduces brightness
>Shits all over colors
>Doesn't improve image quality unless you keep your face glued to the screen
>Same desktop space as 1080p
>60hz
>Reduces performance even when doing something as mundane as browsing the web
1080p is perfect for 24". 1440p is perfect for 27". The only place 4k makes sense is in a 36-38" monitor.

30 inch 4k Rec2020 TVs are nice.