Do you believe in the HDR hype?
Is it just a fad, or is it truly the next big thing in television?
Never seen it in action. I think 1080p movies look good enough; then again, most of what I watch and consume is still 480p or below and it satisfies me just as well.
Likely a fad. I remember back when the very first LED TV with full zone lighting was developed, it had not only incredible contrast ratio but was also incredibly bright. Even though content that could take advantage of those dynamics didn't exist yet, what was obvious is that it put quite a bit of strain on people's eyes.
While our eyes have 20 stops of adaptive dynamic range, our static DR is only about 7 stops, and this is important because static DR is what comes into play when we focus on any object such as a screen. If the dynamics of said static object start to exceed what we can take in at once, it causes eye strain.
While display makers can claim all the contrast ratio they want, most of it comes from black levels; it's peak brightness that's the key here. For example, most computer screens are set (or should be set) between 100 and 140 cd/m² for best viewing comfort, and HDR likely won't be able to stretch its legs at those illumination levels.
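To put rough numbers on those stops (each stop is a doubling of luminance) - the 7 and 20 figures are just the ones above, not gospel:
[code]
# Each photographic "stop" is a factor of 2 in luminance.
static_stops = 7       # approximate static dynamic range of the eye (figure from above)
adaptive_stops = 20    # approximate adaptive dynamic range (figure from above)

print(2 ** static_stops)     # ~128:1 contrast usable within a single fixation
print(2 ** adaptive_stops)   # ~1,000,000:1 once the eye is allowed to adapt

# At a 120 cd/m2 desktop white level, 7 stops down lands around 1 cd/m2:
print(120 / 2 ** static_stops)   # ~0.94 cd/m2
[/code]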
fpbp because this is me
I bought one of those curved 4K screens once, and ended up only watching the Andy Griffith Show on it before realizing that 1080p shows from the '50s (which is all I watch) aren't going to look any better or clearer. I sold it
>The Next Big Thing™
It's a fad.
I only believe in good-quality display panels, not in shitty software gimmicks
i think it's pretty legit
but desu i want brighter screens
sick of having to shut curtains when i wanna play games or watch movies during the day
wasn't sony developing a 40000 nit screen?
More than likely it's going to be dynamic contrast repackaged yet again under a new name.
HDR will not compensate for a TV having a shitty cheap panel, and most do, with low contrast and bad colour reproduction.
There's more than enough range in the current standards; people do professional graphics work on them that then goes to print, and medical imaging work on LCDs
Displays are already capable of HDR, this is just another image processing gimmick that will change what you are watching into something different than it originally was.
Not exactly... The idea is that as you increase the range of tones that can be displayed (by raising brightness or covering a wider color space), gaps in the color depth become more visible, with 8-bit potentially not providing enough gradations. This is a real problem that is currently solved with expensive 10-bit displays that you need to drive from a pro graphics card, and even then you can only output 10-bit from OpenGL-accelerated programs. So while this does exist as you say, it's currently a niche within the prosumer segment.
HDR as it's being marketed is simply the availability of high-depth color output for the masses.
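To make the "not enough gradations" point concrete, here's a rough sketch using a naive linear quantisation; real transfer curves (gamma, PQ) space the steps perceptually, so treat the numbers as illustrative only:
[code]
def levels(bits):
    # Number of distinct code values at a given bit depth.
    return 2 ** bits

def step_size_nits(peak_nits, bits):
    # Naive linear quantisation: evenly spaced steps from 0 to peak.
    return peak_nits / (levels(bits) - 1)

for bits in (8, 10):
    print(f"{bits}-bit: {levels(bits)} levels, "
          f"~{step_size_nits(1000, bits):.2f} cd/m2 per step at a 1000-nit peak")
# 8-bit: 256 levels, ~3.92 cd/m2 per step
# 10-bit: 1024 levels, ~0.98 cd/m2 per step
[/code]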
I have seen a 4k OLED hdr tv with hdr content in the local media markt and it was awesome
it costed like $3000
HDR is basically a marketing buzz word to sell a few incremental technical improvements to people who don't really understand the underlying technology.
In other words: ‘HDR’ is a meaningless concept. Yes, HDR devices are technologically better than non-HDR devices, but it's not really an inherent difference - they're just incrementally better (lower black point, higher white point, better contrast, etc.).
As for whether you should buy HDR, it depends on the underlying technology more than anything (as usual). Don't buy a buzz word because it's a buzz word, buy the underlying technology due to the technological specs themselves.
In this case, you should only buy real OLED displays - but stay away from non-OLED “HDR” which tries to fake it by doing shitty active dimming and other such bullcrap.
Waiting for that 120hz 4k oled to drop on price.
I want my cursor to look and move smooth as silk.
If our devices can't display "HDR", then how can they demonstrate it with images/video on our devices?
You're confusing different things here. Color gamut and brightness curve are different things.
>such as from raising brightness
Most of the real benefit of HDR is not raising the brightness but rather lowering the black point. You can already raise the brightness as much as you want, but the problem is that the blacks go up alongside them.
Decrease the black point (e.g. by using emissive technologies like OLED instead of backlit technologies), and you get a better dynamic range automatically.
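To put numbers on it (made-up but plausible panel figures, purely for illustration):
[code]
def contrast_ratio(peak_nits, black_nits):
    return peak_nits / black_nits

# Cranking the backlight on an LCD raises the blacks along with the whites,
# so the ratio barely moves:
print(contrast_ratio(300, 0.3))     # 1000:1
print(contrast_ratio(600, 0.6))     # still 1000:1

# Lowering the black point (emissive pixels, OLED-style) is what actually
# grows the dynamic range:
print(contrast_ratio(300, 0.0005))  # 600000:1
[/code]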
>or showing a greater color space
Technically unrelated to HDR, although I'll excuse you for confusing it because they usually get marketed together (as part of the UltraHD umbrella term of technologies).
>the more obvious gaps in the color depth become visible, with 8-bit potentially not providing enough gradations.
10-bit is the mandatory minimum for BT.2020 / UltraHD / PQ / HLG etc., but to claim “8-bit potentially not providing enough gradations” is still pushing it - keep in mind that the vast majority of consumers are currently doing just fine with 7- or even 6-bit displays, thanks to dithering. It's the same with Hi10P video, for example - 10-bit gradients still look better than 8-bit gradients on an 8-bit display, since you can dither the gradient to preserve the gradation.
Consumer GPUs can do 10 bit decoding/rendering -> 8 bit dithering, so this is really a non-issue in practice.
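If anyone wants to see it, here's a toy sketch of that dither step in plain numpy; it's not any particular GPU's pipeline, just the idea:
[code]
import numpy as np

rng = np.random.default_rng(0)

# A very shallow 10-bit gradient: only 4 ten-bit codes wide (400..403),
# i.e. narrower than a single 8-bit step (one 8-bit step = 4 ten-bit codes).
ramp10 = np.linspace(400, 403, 1920)

# Naive truncation to 8 bit collapses the whole ramp into one flat band.
truncated = np.floor(ramp10 / 4).astype(np.uint8)

# Adding one 8-bit step's worth of noise before truncating ("dithering")
# keeps the ramp visible as a statistical mix of two adjacent 8-bit codes.
dithered = np.floor((ramp10 + rng.uniform(0.0, 4.0, ramp10.shape)) / 4)
dithered = np.clip(dithered, 0, 255).astype(np.uint8)

print(np.unique(truncated))   # [100] - one flat band
print(np.unique(dithered))    # [100 101] - two codes mixed
# The dithered average tracks the true 10-bit average, the truncated one doesn't:
print(truncated.mean(), dithered.mean(), ramp10.mean() / 4)
[/code]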
It's easy: they can't. The demonstrations are complete and total, utter bullshit. Don't buy into them a single bit.
The most hilarious of all are the people who say “HDR looks so amazing!” when using an HDR-aware video player like mpv or madVR to view it on their consumer devices.
It's the same fucking thing as SDR unless you actually have a display that supports it.
>costed
Kill yourself
AMOLED is true HDR. The rest is post processing.
I can personally say HDR looks amazing, because I watched Star Wars in IMAX with Laser, which uses 4K HDR laser projectors.
It really helps with movies that have lots of dark scenes (like Star Wars); everything is so clear even in dark scenes that would normally be washed out on a normal TV or monitor. You can see the subtle differences in the very darkest parts of the scenes in the corners, all the fine dark shadow details. It's quite nice.
sorry
costinged
>It's the same fucking thing as SDR unless you actually have a display that supports it.
Actually, I should correct myself somewhat. There is a technological difference between an HDR-encoded video file displayed on an SDR screen using an HDR-aware video player and an SDR version of the same video:
The HDR->SDR mapping is tunable. For example, in mpv, you can adjust the brightness and contrast controls to bring different parts of the scene “into focus”. Think of it like centering your limited dynamic range onto a particular part of the image's brightness spectrum, to inspect details in that brightness range more closely.
See here for some examples of how this would work/look in practice: github.com
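If it helps, here's a toy Reinhard-style curve showing what such a tunable mapping does conceptually; the "exposure" knob below is my own illustration, not mpv's actual implementation:
[code]
def reinhard_tonemap(nits, exposure=1.0, peak_sdr=100.0, peak_hdr=1000.0):
    """Map an HDR luminance (cd/m2) into the range [0, peak_sdr].

    'exposure' is the tunable knob: raising it pushes shadows and midtones
    up (revealing dark detail, compressing highlights sooner); lowering it
    does the opposite.
    """
    x = exposure * nits / peak_hdr   # normalise and apply the exposure knob
    y = x / (1.0 + x)                # Reinhard curve: soft highlight roll-off
    return y * peak_sdr

# With exposure=4, a 1000-nit highlight lands at 80 cd/m2 and a 50-nit
# midtone at ~17 cd/m2 on the SDR screen:
for nits in (0.5, 5, 50, 500, 1000):
    print(nits, "->", round(reinhard_tonemap(nits, exposure=4.0), 2), "cd/m2")
[/code]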
it's probably the only thing that will create a need for new hardware
HDR isn't just about enabling 10000:1 screens. It's about a set standard, like 4K, to enable higher-bandwidth video streams for the masses
just overall much more color and contrast data, which is also why GPU manufacturer AMD says that 4K content equals 1080p HDR content in terms of bandwidth
>HDR is a standard that affects resolution
you blew all your credibility from the get go
HDR is literally just a blanket term for a family of transfer curves (notably PQ). Resolution doesn't even play into it
You're probably thinking of UltraHD
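For reference, PQ (SMPTE ST 2084) is an absolute curve: it maps a 0..1 signal value straight to a luminance in cd/m2, like this (constants per the spec):
[code]
def pq_eotf(signal):
    """SMPTE ST 2084 (PQ) EOTF: nonlinear signal in [0, 1] -> luminance in cd/m2."""
    m1 = 2610 / 16384          # 0.1593017578125
    m2 = 2523 / 4096 * 128     # 78.84375
    c1 = 3424 / 4096           # 0.8359375
    c2 = 2413 / 4096 * 32      # 18.8515625
    c3 = 2392 / 4096 * 32      # 18.6875

    e = signal ** (1 / m2)
    y = max(e - c1, 0.0) / (c2 - c3 * e)
    return 10000.0 * y ** (1 / m1)

# Signal 0.5 comes out around 92 cd/m2, signal 1.0 at the full 10000 cd/m2:
for s in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(s, "->", round(pq_eotf(s), 2), "cd/m2")
[/code]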
>AMD says that 4K content equals 1080p HDR content in terms of bandwidth
bullshit
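At least for raw, uncompressed 4:4:4 video at the same frame rate the numbers don't support it: the extra bit depth adds 25% while the pixel count quadruples (maybe AMD meant compressed bitrates or link bandwidth, but as stated it doesn't hold):
[code]
def raw_mbps(width, height, bits_per_sample, fps=24, samples_per_pixel=3):
    # Uncompressed 4:4:4 video: no chroma subsampling, no blanking overhead.
    return width * height * samples_per_pixel * bits_per_sample * fps / 1e6

uhd_8bit  = raw_mbps(3840, 2160, 8)    # "4K" SDR, 8-bit
fhd_10bit = raw_mbps(1920, 1080, 10)   # 1080p HDR, 10-bit

print(round(uhd_8bit), "Mbit/s vs", round(fhd_10bit), "Mbit/s")
print("ratio:", round(uhd_8bit / fhd_10bit, 2))   # ~3.2x more data for 4K
[/code]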