Why are some hardware display devices marketed as having HDR...

why are some hardware display devices marketed as having HDR? I thought that was merely taking camera footage under several different exposure settings, and I fail to see how this would translate to display technology.

Software HDR is a thing.
It's not much different from other TV settings like "magicolor" and "intellisense" or whatever

so they add a bit of extra gamma to dark areas of the image and then slap on an extra $500? it's not like information can be recovered with such pseudo-HDR, and that's the whole purpose of real HDR: to get as much information from the scene as possible.

Is that Olivia Munn?

yes

I want to slide down her bannister, if you know what I mean

I want to sniff her ass.

it wouldn't be particularly unique i don't think, as far as ass smells go


>I thought that was merely taking camera footage under several different exposure settings and fail to see how this would translate to display technology.

That's the meme usage, HDR photography.

HDR in terms of displays refers to a larger dynamic range the display is capable of outputting

the monitor use of HDR is the meme, not the photography use, because it's being used out of context and effectively turned into a buzzword.

Right, HDR also uses 10 bits per color instead of the traditional 8, so you actually do get more color.

"more color" would imply extending the gamut range. and don't get me wrong, "HDR" does use an extended gamut. more bits simply means better differentiation between similar colours, the benefit of which is reduced banding artifacts.

They're both actually fine; the idea is the range of brightness you can either capture, in the case of a camera, or output, in the case of a display.

If you ever try to take a photo of a mix of sunlight and shadow you will run into the problem of constrained dynamic range: expose for the bright areas and the shadows go too dark; expose for the shadows and the bright areas blow out. The wider the camera's dynamic range, the more you can capture in the dark areas when exposed for the bright, and vice versa.
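To make that concrete, here's a rough sketch of the bracketing-and-merge idea using OpenCV. The filenames and exposure times are made-up placeholders, not anything from this thread:

```python
import cv2
import numpy as np

# Three bracketed shots of the same scene; filenames and exposure
# times are hypothetical placeholders.
files = ["under.jpg", "normal.jpg", "over.jpg"]
times = np.array([1 / 250, 1 / 60, 1 / 15], dtype=np.float32)

images = [cv2.imread(f) for f in files]

# Merge the brackets into one floating-point radiance map: highlights
# come from the short exposure, shadows from the long one.
hdr = cv2.createMergeDebevec().process(images, times=times)

# Tone-map the result back down so it fits an ordinary 8-bit display.
ldr = cv2.createTonemapDrago(gamma=2.2).process(hdr)
cv2.imwrite("fused.jpg", np.clip(ldr * 255, 0, 255).astype(np.uint8))
```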

For displays, you probably haven't really thought about it but your display has a fairly limited range of brightness if you don't touch the brightness setting and don't have an OLED display.
If you pump up the brightness everything gets uniformly brighter: blacks become more grey, all your darker colors get lighter. Turn it down and the same is true in reverse: your display becomes uniformly darker.
The idea behind HDR displays is that at any brightness setting the range of apparent brightness between dark colors and light colors will be greater.
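For what it's worth, HDR10 doesn't use plain gamma for this; it uses the SMPTE ST 2084 "PQ" curve, which maps 10-bit code values to absolute brightness up to 10,000 nits. A minimal sketch of that transfer function:

```python
# SMPTE ST 2084 "PQ" EOTF constants.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code, bits=10):
    """Map a PQ-encoded code value to absolute luminance in cd/m^2."""
    n = code / (2 ** bits - 1)      # normalise the code value to [0, 1]
    p = n ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

# Most code values are spent on dark tones, where the eye is most
# sensitive; the top of the range covers huge brightness jumps.
for code in (0, 256, 512, 768, 1023):
    print(code, "->", round(pq_to_nits(code), 3), "nits")
```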

Not this fucking bullshit that they promote.

Right, so I'd call extending the range and increasing the number of possible color values "more color".


okay but why can't computers display orange?

Clearly orange doesn't exist.

Seriously though, why can't computers display orange properly? I mean there is shit like this, but compared to orange in real life it's absolutely mediocre.

Also, why does this color orange render differently in the thumbnail? There is something very strange about computers and the color orange.

sure. but again, that depends on the hardware capturing that extra colour in the first place, storing it in a format that ensures the data isn't lost, using a content delivery method that doesn't fuck with it, and using software and hardware capable of fully reproducing it as intended. you're pretty much required to overhaul the whole ecosystem if you want to reap the benefits of extended gamut, dynamic range and bit depth.

Right, or have it rendered locally, like in a game, assuming the game supports it. I think that's what it will mainly be used for on PCs in the immediate future, maybe with Netflix and so on following.

because of RGB
There just isn't a nice way to make saturated orange out of red, green and blue.
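You can check this with the standard XYZ-to-sRGB matrix: a monochromatic orange around 600 nm needs a negative amount of the blue primary, i.e. it sits outside the sRGB triangle, so the display can only show a desaturated stand-in. Rough sketch (the 600 nm chromaticity is approximate):

```python
import numpy as np

# CIE XYZ -> linear sRGB matrix (IEC 61966-2-1).
XYZ_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

# Approximate CIE 1931 chromaticity of a monochromatic ~600 nm orange.
x, y = 0.627, 0.372
xyz = np.array([x / y, 1.0, (1 - x - y) / y])  # xyY -> XYZ with Y = 1

rgb = XYZ_TO_SRGB @ xyz
print(rgb)  # the blue component comes out negative: out of gamut
```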

that's because your iToddler machine uses a colour profile for its display which alters the colours. in the 2 pictures that you posted, i see 3 different versions of orange. the first one is closest to the real deal, as it is bright. the second picture you've saved with your colour profile data embedded, which means it's a representation of how you see it. it's definitely a lot duller than the original. i think if you keep screencapping the same image it will gradually lose pretty much all of its colour.
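if you want to avoid that, convert to sRGB yourself before uploading. a sketch with Pillow (filename is a placeholder; needs a Pillow build with LittleCMS, which the standard wheels have):

```python
import io
from PIL import Image, ImageCms

# Convert whatever profile is embedded in the file to plain sRGB, so
# the colours survive sites that strip the profile. Placeholder name.
im = Image.open("shot.png")
icc = im.info.get("icc_profile")

if icc:
    src = ImageCms.ImageCmsProfile(io.BytesIO(icc))
    dst = ImageCms.createProfile("sRGB")
    im = ImageCms.profileToProfile(im, src, dst, outputMode="RGB")

im.save("shot_srgb.png")
```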

yes, i can see it being far less problematic with games, as you only have to worry about the hardware and software on one machine instead of 2 or 3 like you would for video production.

>why does this color orange render differently in the thumbnail?
Because red suffers the most in JPEG compression: colour data is stored at lower resolution than brightness, and the thumbnail gets re-encoded on top of that.
You should know this.
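If you're saving the image yourself you can sidestep part of this: Pillow lets you disable chroma subsampling when writing a JPEG. Sketch, with a placeholder filename:

```python
from PIL import Image

# JPEG stores colour (chroma) at lower resolution than brightness, so
# saturated reds and oranges smear the most. Placeholder input name.
im = Image.open("orange.png").convert("RGB")

im.save("default.jpg", quality=85)                      # usually 4:2:0 chroma
im.save("full_chroma.jpg", quality=85, subsampling=0)   # 4:4:4, full-res chroma
```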

that's just to do with having deeper blacks. we've never called that HDR in the past.

tfw no gf

it's because 4chin strips the color profile on upload, so everything just gets interpreted as sRGB

either use .PNG, which almost everything just treats as sRGB (it can technically embed a profile, but it's usually ignored), or make sure to save your image as sRGB

oh wait, you just uploaded a PNG file. Do you have an Nvidia card by chance? They have a fucked-up color range even when manually selecting 0-255 in the Nvidiot control panel kek

works on my machine

It's not just deep blacks, it's the ability to have a wide range of actual brightness.
CRTs had deep blacks, but weren't very bright.
Plasmas had deep blacks and were fairly bright, but sucked lots of power, had laggy response, and suffered fairly easily from burn-in and image persistence.
LCD panels (TN, IPS, etc.) generally suck at this range: either grey blacks to get bright colors, or dim colors to get dark blacks. The problem is actually getting each pixel opaque enough to block out the bright backlight, and preventing it bleeding around the pixels.
OLED has the best shot because each pixel is its own light source: blacks are true black and the limit on brightness is the maximum output of the pixels, but again there are burn-in problems.
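To put rough numbers on that, contrast ratio is just peak white divided by black level. These figures are ballpark illustrations, not measurements:

```python
# (peak white in nits, black level in nits) -- illustrative only.
panels = {
    "CRT":    (120, 0.01),
    "Plasma": (200, 0.02),
    "LCD":    (350, 0.30),    # backlight bleed keeps blacks grey
    "OLED":   (700, 0.0005),  # per-pixel emitters, near-true black
}

for name, (peak, black) in panels.items():
    print(f"{name}: {peak / black:,.0f}:1 contrast")
```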