>go to Best Buy
>the shiny new 4k TVs on display
>so sharp you can see the pimples on people's skin
>yet the input lag, it's still there
>still, all these years later they can't make the buggers have motion quality as smooth as a CRT
It annoys me so.
That's just TVs. They always have a bunch of stupid fucking shit running in the background. Even with game mode on they have a bunch of crap running in case a dumbass user connects something that isn't intended for gaming. if you want something that has low input lag get a gaming monitor. mine has a 1ms lag and runs at 1440p 144hz.
Anyone serious about gaming uses a monitor, the TV industry doesn't care about gaming. Linus even mentioned that LG said game mode still has some processing because they were afraid normies would be too dumb/lazy to turn off game mode when watching a movie. Just save up for that bigass Nvidia gaming TV I guess.
anime doesn't have this problem
>motion quality doesn't affect TV broadcasts with moving pictures
Uh...
It doesn't to any degree that actually matters.
Enjoy your smear trails when watching sports on TV.
Most TVs have black frame insertion and some high end ones even have backlight strobing. Problem is this drastically reduces brightness, which makes colors appear less vivid. Same problem was on CRTs, it's hard to make them brighter especially on bigger screen sizes.
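Rough back-of-envelope on that brightness tradeoff, just a python sketch with made-up numbers (not measurements of any actual set): with strobing/BFI the image is only lit for a fraction of each refresh, so average light output drops roughly in proportion.

# Illustrative numbers only, not measurements of any real TV.
frame_time_ms = 1000 / 60            # 60 Hz refresh, ~16.7 ms per frame
full_brightness_nits = 400           # assumed brightness with the backlight always on

for duty_cycle in (1.0, 0.5, 0.25):  # fraction of each frame the image is actually lit
    persistence_ms = frame_time_ms * duty_cycle
    average_nits = full_brightness_nits * duty_cycle
    print(f"duty {duty_cycle:.2f}: lit for {persistence_ms:4.1f} ms per frame, ~{average_nits:.0f} nits average")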
That's fucking nothing. You're not actually watching 4K video when you play "4K" video on it lmfao.
Literally kill yourself and stop spamming every thread. Luma is much more important than chroma, you can kick and scream all day about "4k being 1080p" but it will never be true.
oh look, it's THIS retard again.
>luma doesn't matter
end yourself.
>muh luma on fucking COLOR video
Humans are more sensitive to changes in brightness than color, that's the entire reason luma is left alone. I bet you're the same retard who was whining a week or two ago about Blurays using uncompressed audio, and how they should heavily compress audio to make more room for chroma because "audio quality doesn't matter for movies."
what do you think HDR is?
More levels of brightness.
How many circles are we gonna have to run around each other before you finally accept how wrong you are, you fucking DUMB SHIT?
10-bit video, yes it will give more luma levels per sample, which means jack shit when you are still missing 75% of all chroma samples.
>throwing out 75% of luma data and missing out on HDR is "better"
The guy in the Youtube video is a colossal retard and so are you. A 10-bit panel and wider color gamut are basically worthless unless you have the brightness levels to back them up.
The guy on youtube didn't say anything wrong. It's just this retard misinterpreting him.
Most current generation panels don't really have issues with smear, but choppiness is inherent to sample-and-hold displays.
>buying from best buy
Kill yourself.
>gaming TV
Is it covered in LEDs and does it have a huge heatsink?
I don't trust shipping monitors or TVs in the mail.
Did you know the Geek Squad has been found snooping on customers' hard disks and calling the FBI if they think there's something illegal?
Shut up, Ian.
To be fair, I was only there to find directions on how to get away from there.
M8 the "HDR" isn't there for "better colors". It's there because 10-bit video encoders better guess colors than 8-bit video encoders. In fact lossless 10-bit 4:4:4 looks the same as lossless 8-bit 4:4:4.
It's a lost cause mate, might as well tell them chroma subsampling was a mistake
there are literally no monitors that have sub 9ms input lag, and the 1ms response time is a bullshit number too.
It's why you need black frame insertion to kill motion blur.
the closest thing to being good in this regard is oled, which has actual sub-ms response times, however burn in makes them unusable for computer monitors at the moment. they either need to become commodity goods that you replace every other year without a care (oleds can easily become this but no one has triggered the race to the bottom yet) or lcds need to be replaced with a self-emissive tech that doesn't burn in.
crts, even high refresh ones, make me sick to watch. I get bad headaches. it's just the nature of how they work and how good my eyes are at seeing the refresh rate, if i'm far away it's not an issue, but at monitor distance... lcds don't have this because the pixels hold their image instead of flickering
tcl 405 and 605 both have 15ms response times with all the good shit (local dimming and hdr) enabled. that's better than a good deal of the higher end lcds.
and "the tv industry doesn't care about gaming"? are you fucking retarded? the only reason response times are now sub 200ms is because of gamers. remember how many wiis/ps2s/xbox360/ps3/bone/ps4's have sold, and a damn good number of those buyers chose the tv based on how well it would play games. hell, it's the whole reason sony tvs in 1080p will play at 120hz
with more devices streaming to the tv, and processing on the boxes themselves, tvs are starting to just pass through the signal rather than process it.
you want motion interpolation to stop the stutter on panning, and as for smear, motion blur is a thing on the camera end, so the smear does not exist the same way it does on monitors or games where a pristine image is rendered.
most lower end ones do, the choppiness is a lack of interpolation when the monitor is 60 hz but the video is 59.94 or whatever. it's really only seen in slow pans so sports are where you see it the most (rough math on how often a frame gets doubled is in the sketch below).
most tvs don't have black frame insertion or strobing, I was looking into this shit recently.
with brightness, the main issue is uniformity and making the tvs thin as fuck. if they gave up thinness, they could blast four 100 watt leds at the screen, possibly even make an led array that was 400 watts worth of little leds and you would get a few thousand zones of local dimming.
the issue is so easy to pinpoint, but its so hard to solve because consumers are fucking retards and they chased a trend that ultimately fucked them hard.
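Regarding the 60 Hz panel vs 59.94 source point above, here's a quick python sketch of how often a frame has to be shown twice if the TV just repeats frames (standard NTSC rates assumed, nothing measured):

# 59.94 fps content on a fixed 60 Hz panel: the panel refreshes slightly faster
# than new frames arrive, so occasionally one frame gets displayed twice.
source_fps = 60000 / 1001        # NTSC "59.94"
panel_hz = 60.0

refreshes_per_repeat = panel_hz / (panel_hz - source_fps)
seconds_per_repeat = refreshes_per_repeat / panel_hz
print(f"one duplicated frame roughly every {refreshes_per_repeat:.0f} refreshes (~{seconds_per_repeat:.1f} s)")
# 24000/1001 fps film on the same panel can't divide evenly at all, which is
# where 3:2 cadence and the visible judder in slow pans come from.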
>so the smear does not exist the same way it does on monitors or games where a pristine image is rendered.
I've tried out NES games on a Sony Bravia. Games like SMB with 60 fps animation leave ghost trails, believe me, they do. I've seen it with my own eyes.
CRTs couldn't scale as well. They tried solving that with rear projection but the image was a bit blurry and they weighed like 100lbs or more.
unironically, my 605p has 72 leds and I would assume a reasonably large heatsink considering it's dissipating nearly 175 watts of power on the leds alone.
>the issue is so easy to pinpoint, but its so hard to solve because consumers are fucking retards and they chased a trend that ultimately fucked them hard
For a good 60 years, we had CRTs with pristine, smooth motion quality. No matter what the brand of TV or the decade it was made in, or the size, you could guarantee it wouldn't have frame stutter or ghosting. Yet now apparently we're willing to accept that as normal and ok.
did you miss the
>motion blur is a thing on the camera end
part of my post?
If you watch a movie or any tv, there is motion blur on the camera end, so any flaw in the tv is going to be mitigated entirely by that alone
as for the smear, every monitor has it, there is no getting around it without getting an oled, black frame insertion and strobing HEAVILY mitigate the appearance, but it's still there.
>as for the smear, every monitor has it, there is no getting around it without getting an oled, black frame insertion and strobing HEAVILY mitigate the appearance, but it's still there.
Well, we used to have strobing in every TV. This feature has since been, ah, lost?
CRTs do have ghosting when the phosphor doesn't decay fast enough, it's especially noticeable with moving white on black. Take a dark background and move your mouse, you can see a clear phosphor trail. Also you're forgetting that the only reason movies didn't have judder was because of 3:2 pulldown, which causes combing artifacts during fast motion. Modern progressive displays simply repeat frames to convert 24 to 60, instead of combining them. Take off your nostalgia glasses, CRTs were nowhere near perfect.
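Since pulldown keeps coming up: a toy python sketch of the classic 3:2 cadence (the frame letters are just placeholders), showing where the mixed fields that cause combing come from.

# 3:2 pulldown: 4 film frames become 10 interlaced fields (24 fps -> ~60 fields/s).
def pulldown_fields(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 2 if i % 2 == 0 else 3      # alternate 2 fields, 3 fields per film frame
        fields.extend([frame] * repeats)
    return fields

print(pulldown_fields(["A", "B", "C", "D"]))
# -> ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
# Paired into display frames: (A,A) (B,B) (B,C) (C,D) (D,D); the mixed pairs are
# where combing shows up during fast motion.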
no we fucking didn't.
we had 20-25 years of good crts at most, having seen most of the old ones, they are objectively worse than lcds even when lcds were just starting and every one had noticeably bad blur.
and let me tell you this, because I highly doubt you owned one
we had an hd crt, it would do 1080i or 720p, it was a 36 inch widescreen
want to know why this shit fucking died?
The cocksucker weighed nearly 300lbs.
For living rooms, you had a crt for smaller shit, and you had rear projection for bigger, and if you've never seen a rear projection set, even a shit tn panel was preferable to that shit
a projector would have been a good replacement for rear projection, however $5000+ for a piece of shit that warps the colors, burns in, and has a $100-500 cost to replace the lightbulb...
plasmas were a decent replacement for crts, I can't remember the picture too well on ours, but I do remember the 4 times we had to hire a repairman to replace a board that would burn out, and when I looked into it myself, a $75 or $175 board was burning out every half year for the entirety of the time we owned it.
look, I just want an lcd that's 3-5 inches thick, is properly cooled, and possibly has a VERY high powered backlight that can be properly diffused over an area, I don't want a return to the 36 inch backbreaker we had
Oh, and just to make that crt even shittier, there was absolutely no way to get a grip on it until you had it off the ground, and even then the grip areas were sharp, so you got that cutting into your fingers the whole time, and then the thing slammed down when the person helping you lift the bastard eventually let go of their end first.
the strobing made me violently ill, and makes a good number of people sick. If you have tolerance for it, good, but if you don't... what can I say?
>did not read the post he's replying to
>instead went off on a bunch of completely irrelevant tangents and personal anecdotes in a giant wall of text
read it, crts died for a reason and I stated why.
unless you want literal garbage in your living room, lcd is the only choice, or at least was the only choice for a good 20 year period of time if you weren't poverty tier.
now, laser projection could largely replace lcds but it won't happen.
>the strobing made me violently ill, and makes a good number of people sick
Not him but you really don't notice CRT flicker when you're watching TV. It can be an issue with computer monitors.
You made me do a thing
While more color is nice, luma matters more than anything else.
>we had 20-25 years of good crts at most, having seen most of the old ones
And somehow, even TVs from the 1950s did not have sample and hold blur or compression artifacts. What happened is that, in trying to correct one problem (the weight and bulk of CRTs), they introduced a bunch of new problems that had never existed before.
Did you actually just encode a 4:2:0 4K video file to a 4:2:0 1080p file?
you do, it all depends on how big the tv is and how close you are to it.
a 32 inch tv you are sitting 6 feet away from is fairly small, but a 15 inch you are sitting 2 feet away from likely takes up more of your field of view
Where are you going with this?
took the example straight from the video that he linked to where the person already did all the work
You or the guy must have done something wrong because the pic you posted has even fewer chroma samples on the left.
Many of the technical issues with flat panels (including motion quality, proper compatibility with analog signals, light gun compatibility, etc) were a long way from being solved in the late 2000s when the industry decided that it was absolutely necessary to replace CRTs with this still very immature technology. In my opinion, they should have waited some more years--it was way too early to phase out CRTs. It shouldn't have been done until the problems with flat panels had been solved first. If the day comes when the motion quality and other issues are solved, then we can say definitively that we don't need CRTs anymore. But as it is, we've been forced to use the equivalent of a car that looks sleek and modern, but it has a leaf spring suspension with a solid axle.
>compression artifacts
What the fuck are you on about? You realize broadcasts were in 480i, right? VHS was like 250x480i and even Laserdisc was only like 480x480i. VHS and even Beta had shit color accuracy compared to what we have now.
compression artifacts... how shit of a monitor did you use that it had compression artifacts? your display is literally just displaying the content. artifacts come from how the video was compressed, truly uncompressed video is something around 50+gb per minute, so they compress, and this compression introduces artifacts, then when it gets broadcast, it may be compressed again.
as for sample and hold, no clue what you are talking about on it being blurry, with crts they had to keep refreshing the same image over and over again just like lcds do, unless you are talking about burn in, and yes, crts had this issue too, though far less prevalent than it was on lcds they used in the early days.
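On the "50+gb per minute" point, a quick back-of-envelope in python (resolutions and 8-bit RGB are assumed purely for illustration):

# Raw, uncompressed data rates with no chroma subsampling, 3 bytes per pixel.
def raw_gb_per_minute(width, height, fps, bytes_per_pixel=3):
    return width * height * bytes_per_pixel * fps * 60 / 1e9

for name, w, h, fps in [("480p30 (SD)", 720, 480, 30),
                        ("1080p60", 1920, 1080, 60),
                        ("2160p60 (4K)", 3840, 2160, 60)]:
    print(f"{name:13s} ~{raw_gb_per_minute(w, h, fps):5.1f} GB per minute uncompressed")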
look,
image is based on
youtu.be
so unless the guy in the video fucked up majorly, 4k 4:2:0 looks significantly better than 1080p 4:4:4 because we rely on brightness more than we do color
considering that even between the 1080p 4:2:0 and 1080p 4:4:4 there is not a whole hell of a lot of difference, and even then he had to zoom in for it to be instantly noticeable, it's clear (or should be clear to anyone) that the chroma sampling barely matters in motion. the only reason it does on computers is we have pixel-wide fonts and pixel-wide details, so the flaws get pointed out, however in motion there is VERY little difference between 4:2:0 and 4:4:4
now onto colors, the one the human eye is best at distinguishing differences in is green, and even that doesn't hold a candle to our ability to see brightness.
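Not the image from the video, just a toy numpy sketch of what 4:2:0 actually does: luma stays at full resolution, each 2x2 block of chroma collapses to one stored sample (the "missing 75%"), and the display stretches it back out.

import numpy as np

h, w = 8, 8
luma = np.random.rand(h, w)              # Y: kept at full resolution
chroma = np.random.rand(h, w)            # Cb (Cr is treated the same way)

# 4:2:0 storage: average each 2x2 block of chroma into a single sample.
chroma_420 = chroma.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Display side: nearest-neighbour upsample back to full resolution.
chroma_back = np.repeat(np.repeat(chroma_420, 2, axis=0), 2, axis=1)

print("stored chroma samples:", chroma_420.size, "of", chroma.size,
      f"({100 * (1 - chroma_420.size / chroma.size):.0f}% thrown away)")
print("mean chroma error after the round trip:", float(np.abs(chroma - chroma_back).mean()))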
The poor color accuracy on VHS was not really an issue with the display, but because they were using composite NTSC, which squeezes the color signal into a narrow bandwidth and loses detail. If you used RGB, you'd have perfect color accuracy but it used too much bandwidth to be suitable for TV broadcasts.
no, that is not the case at all
however, having video mastered to a contrast ratio of 27,000 and 64,000 (I believe HDR10 and Dolby Vision respectively) makes it look that way, but honestly, 10bit really helps in areas where there are subtle changes in color that would otherwise result in banding or colors being overblown when compressed, red tends to have this issue more than the others.
no, at least not when 10bit is the source, and it's not 8bit upsampled to 10bit
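On the banding point above, a tiny numpy sketch (the ramp values are arbitrary, just standing in for a dark sky gradient) showing how a shallow gradient collapses into visible steps at 8-bit but gets roughly 4x finer steps at 10-bit:

import numpy as np

ramp = np.linspace(0.10, 0.15, 1920)     # a very gentle brightness gradient across the screen

for bits in (8, 10):
    levels = 2 ** bits - 1
    quantized = np.round(ramp * levels) / levels
    print(f"{bits}-bit: the ramp survives as {len(np.unique(quantized))} distinct bands")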
>input lag
Like it matters when you're just using the thing to watch movies?
they should have been doing that regardless, the only difference is they were paid to do it, and that introduces an incentive to plant evidence.
how else would you look at tvs in person before you make a major purchase, unless you are one of those people who can afford to shit away several thousand dollars like it's nothing.
>as for sample and hold, no clue what you are talking about on it being blurry, with crts they had to keep refreshing the same image over and over again just like lcds do
Do you not even know how different display types work? LCD pixels are solid state and will retain their state as long as power is applied. A CRT has phosphors which rapidly decay in luminosity after the electron beam passes over them. The vertical retrace phase then produces momentary blackness as the electron beam shuts off and goes back to the upper left corner of the screen.
As things would have it, flicker is what allows that smooth CRT motion, because it's related to how the human brain processes images. The momentary blackness during the vertical retrace gives your brain a "pause" to process the image and this is lost on a sample and hold display, causing it to smear.
And compression artifacts are an issue with the source as well, it made no sense to bring it up.
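For the smear itself, a rough rule of thumb (the speed and persistence values here are just examples): if your eye tracks something moving across the screen, it smears by about the distance it travels while each frame stays lit.

def blur_px(speed_px_per_s, persistence_ms):
    # Eye-tracking blur on a sample-and-hold display, in pixels.
    return speed_px_per_s * persistence_ms / 1000

speed = 960                               # object crossing half a 1080p screen every second
for label, persistence_ms in [("60 Hz sample-and-hold", 16.7),
                              ("120 Hz sample-and-hold", 8.3),
                              ("strobed / CRT-like", 1.5)]:
    print(f"{label:25s} ~{blur_px(speed, persistence_ms):4.1f} px of smear")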
the picture wasn't just blurry, unless you looked at it straight on you didn't see a picture at all, and this is from 5 feet away on the same couch as a friend, just one seat off center.
they weighed more than crts just because rear projection meant you were getting a 40+ inch tv
Thing is when you resize 4K 4:2:0 video to 1080p on a display you are now getting all the chroma values for each pixel and no chroma upscaling is being performed. So yeah that guy fucked up somehow.
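The arithmetic behind that, as a quick sketch (plane sizes only, no actual video involved):

# In 2160p 4:2:0, the chroma planes are stored at half resolution each way,
# which happens to be exactly 1920x1080 - one chroma sample per 1080p pixel.
luma_4k = (3840, 2160)
chroma_4k = (luma_4k[0] // 2, luma_4k[1] // 2)
target_1080p = (1920, 1080)

print("4K 4:2:0 chroma plane:", chroma_4k)
print("1080p output grid:    ", target_1080p)
print("chroma samples per output pixel:",
      (chroma_4k[0] * chroma_4k[1]) / (target_1080p[0] * target_1080p[1]))   # -> 1.0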
I think this luma > chroma meme exists because people haven't seen video on oled/amoled displays.
Nope he's right, most human eyes can only see around 1 million colors. 8-bit is 16 million colors and 10-bit is like 1 billion colors. 10-bit is the 24-bit/192 kHz of the video world if the encoder isn't actually making use of it.
Why can't I get a 32" 4K TV? And WTF is with the influx of bedroom sized 720p TVs?
1080p is disappearing and small 4K tvs don't exist, yet I can get smaller 4k monitors & 4k phones.
Don't even start to tell me I can't see it, because FFS you can!
>4:2:0
this thread has so many retarded comments, i won't even bother giving (You) a serious reply, just know that the posts you're reading are retarded
32" 4k looks identical to 1080p beyond 4.5 feet viewing distance.
The reason 4k TVs are usually 49-55" at the smallest is because that size is optimal for a 4-5 foot viewing distance.
If you're 6-8+ feet away you need a 65-85" 4k.
32" 4k is only really useful at a desk where you can have your viewing distance under 3 feet.
I don't get it either. It's kind of like how S-video inputs disappeared from TVs after about 2008 or so but composite and component stayed. They kept the best and crappiest inputs, but dropped the middle-tier one? Huh?
Right?
>muh luma
lmao
>continuing to pretend luma doesn't matter
Even if chroma information still matters, luma is the more significant part, and 4k 420 at native 4k WILL look better than 1080p 444 at 1080p.
1080p 444 is certainly better than 1080p 420, but 4k 420 is better than both if being viewed at 4k.
Lol what kind of cuck shed sized house do you live in?
I have a 46" telly 1080p and sit 3m away
9-10 feet in retard units.
man we need a proper objective comparison ffs
>Implying there are 4:4:4 TVs with 4k HDR
Feel free to sit further back, but you lose out on the apparent image quality.
I was stating values for optimal viewing distance only.
I think in that case it was just for cost reasons because composite/component use standard RCA connectors instead of the stupid S-video DIN plug.
>all these years later they can't make the buggers have motion quality as smooth as a CRT
you need a model with black frame insertion
youtube.com
Companies like samsung and LG have thousands of models but they never tell normies which ones are best, they simply advertise buzzwords like "HDR" and "TrueHD Surround"
rtings.com
rtings.com
rtings.com
The industry didn't "decide" to replace CRT with LCD, plasma was meant to do that. It was between rear projection, LCD and plasma (SED and FED were stuck in patent hell of course). Projection was never going to be mainstream, it was too heavy, too blurry and had shit viewing angles. Plasma was great but had burn-in and manufacturing issues, also the price never dropped enough to be attractive to the mainstream.
That left LCD. Relatively lightweight, sharp pixels, falling prices, and the greatest feature of all: little to no burn in. LCD was the definition of "good enough" and that's why it won.
There aren't but:
A.) HDMI outputs lossless 4:4:4 video from the player/PC
B.) Resizing 4K video to fit on a 1080p display eliminates chroma up-sampling so you view a native image.
my god, do I have to call you a dipshit too?
lcds largely refresh the same way that crts did, they can't hold images for shit as that physically breaks them, think about it, why would there be a lower limit for freesync or gsync
youtube.com
you are on your own looking for the exact reason they break, I forget, but on a standard tv or monitor, regardless of motion, it's refreshing everything at its stated rate.
as for the flicker... yes and no.
crts had instant response times, if you took a high speed photo there would be no ghost of the old image, meanwhile with lcds, there are ghosts of the old images even in high speed photography where there should be none.
what we are talking about with smear are, I think, two different things entirely.
I'm talking about pixel response times leaving a several-ms ghost, you are talking about a type of motion blur.
TVs are not for video games.
The issue with plasma displays is mainly that the pixels are pretty big and they couldn't make them smaller than 32". Also it's still sample and hold, so while the motion quality is better than an LCD, it's still not as good as if it had CRT flicker.
>not buying a 4k LED projector instead
>Why can't I get a 32" 4K TV?
You can. Buy a TV box/media center for 100 dollars + this: LG.com
Pro tip: you can also find good iiyama monitors and even a 32" IPS TV for a very good price with good quality
A good monitor without flickering and a 3000:1 contrast ratio is all you need. If you need extra processing to have a better picture (like contrast, edge enhancement or noise reduction) you are a pleb
>crts had instant response times, if you took a high speed photo there would be no ghost of the old image, meanwhile with lcds, there are ghosts of the old images even in high speed photography where there should be none.
And there we go. We've somehow lost that instant response time that was taken for granted on any and all TVs for 60 years.
everything I have ever owned, even in an era where svideo was a thing, was either component or composite. its removal, or it not being on a device at all, has a lot to do with how often it was used.
This chart is a meme. There's something subjectively impressive about tiny pixels, it's why literally everyone who sees an 8k monitor is visibly impressed and compares it to looking out a window. You can't really get that window effect at 720 no matter what size or distance.
but my sister said otherwise
That's a different argument.
This chart is for apparent image quality and viewing distance.
If you want high PPI, by all means, go for it.
But that won't change the optimal viewing distance.
To be honest, I've never used an S-video input. I had a Proscan 27" for 15 years and never once used the S-video connector. I kind of regret that I never got a Playstation S-video cable for it back when I had a PS2.
>LCD was the definition of "good enough" and that's why it won
>the issue is so easy to pinpoint, but its so hard to solve because consumers are fucking retards and they chased a trend that ultimately fucked them hard.
This user was right, it would seem.
it was never taken for granted, however, in the living room where video is played, you either have tv shows shot at 30-60hz that will have motion blur, or you have movies shot at 24fps that will still have more motion blur even if you upsample to 60
the response time was an easy toss away for watching video.
and as I said, going hd with a crt was fucking heavy as hell along with back projection sucking every dick put in front of it.
while I would like instant response times, I don't miss it, what I do miss is actual blacks, and those are finally back if you have an oled, or good enough if you have nice full-array local dimming.
yea, kind of wanted one, but then you find out that some games were made specifically with composite in mind because of the way it would blur pixels just enough, so you would effectively get more apparent resolution/color than the console could actually output when they dealt with gradients.
I've just realized something Sup Forums, if playing 4K video on a 1080p screen gives you "true 1080p video" then is there a benefit in playing vidya in 4K resolution with no AA but resizing it to fit on a 1080p screen?
it's called DSR (Dynamic Super Resolution) in the Nvidia settings and it's been around for a while.
Renders the game at 4k and then downsamples to 1080p.
Looks way better than having AA.
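A toy numpy sketch of what that downsampling does to a hard edge (the "render" here is just a test pattern; a real game would rasterize the scene at 4K first):

import numpy as np

hi = np.zeros((2160, 3840))
hi[:, 1901:1941] = 1.0                                 # hard vertical edge rendered at 4K

lo = hi.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))    # 2x2 box filter down to 1080p

print("downsampled edge values:", lo[0, 948:953])
# -> roughly [0. 0. 0.5 1. 1.]; the in-between value is the smoothed (anti-aliased) edge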
that's 27"
>while I would like instant response times, I don't miss it, what I do miss is actual blacks
A lot of CRTs didn't have good blacks anyway, certainly nothing prior to the 80s. What all of them did have was instant response times and now we...do not.
LCD "flicker" is caused by the reversal of polarity in order to preserve the pixel lifespan. It's not noticeable to the average person's eyes though, which adds some motion blur. CRTs have real flicker because they have to physically draw the entire screen over and over again, and the phosphor image fades faster than the refresh so there's a brief moment of blackness between each refresh. This is why black frame insertion is on LCD TV's, it emulates a CRT image.
not necessarily
>LCD was the definition of "good enough" and that's why it won
lcd was likely always going to be the replacement, the issue was when
when lcd came onto the market for high end home use, it was less "good enough so it won" and more "it finally got good enough to replace".
Mix that in with plasmas apparently being break prone (we went through 3 warranty repairs and 1 out of warranty before we gave the fuck up and got an lcd for the living room),
lcd came faster to some areas because it was the only option, not because it was the good option, but with the issues of scaling crts to large sizes, the viewing angle issues rear projection had, and projection's cost and recurring cost problems, lcd was going to take over, it was just a matter of when.
as for sed, I was looking into this as the replacement for lcd, however it never took off because after the patent hell was over, it was going to cost twice what lcd did, and no one wanted to step up to manufacture it, even though I would have gladly paid double the price of my lcd for THAT much better of a display at the time
>CRTs have real flicker because they have to physically draw the entire screen over and over again, and the phosphor image fades faster than the refresh so there's a brief moment of blackness between each refresh. This is why black frame insertion is on LCD TV's, it emulates a CRT image.
Yes, like I said, the flicker creates a "pause" that gives your brain time to process the image.
once you have per pixel illumination, where you could have a single pixel shine, this goes out the window, but it's still an overall "how big do you need it / what resolution do you need" chart.
It's called supersampling, downsampling or SSAA. It basically eliminates aliasing at the cost of framerate.
I disagree, you shouldn't snoop on people's private data
You fix the problem and get out
LCD panels up to the late 2000s had really bad response times which led to ghosting--the picture would update faster than the pixels could change state. I have a Bravia from 2010 that still has a lot of ghosting issues. The newest panels have mostly eliminated this issue, but the sample and hold thing is still a problem unless black frame insertion is used.
Plasma was popular for a few years mostly because it didn't have ghosting issues but it had many other drawbacks, the biggest being that they can't go smaller than 32".