>>57431624

I like the way actual video looks nothing like the "simulated" SDR image and much more like the "actual" HDR image.

youtu.be/tO01J-M3g0U?list=PLyqf6gJt7KuGArjMwHmgprtDeY8WDa8YX&t=29
In fact, the actual video on my display looks oversaturated in comparison.

Looks like a shitty SweetFX preset.

So this is the next gimmick the industry is going for.
I wondered what the next big thing was going to be after 3D and resolution whoring.
Apparently it's the colors.

This looks like one of those videos they continuously play as a preview on TVs and monitors that are on display at stores.
It's oversaturated to hell and back, so it would make even the shittiest of displays look good, which it does.

HDR doesn't actually make the images more colorful though.

I don't understand something. How the shit is HDR supposed to improve contrast by increasing bit depth? You can display finer gradations of color since it's 10-bit, of course, but how the fuck does that affect contrast? How is feeding your screen thirty zero bits any different from feeding it twenty-four? It's still the darkest value it can go to.

Is it just marketing shit for displays which come with better contrast, which could very well exist with 24bpc as well? I just don't see how a screen's black level or max brightness could possibly depend on the number of color bits you feed it. Is "HDR" basically more colors + some sort of minimum contrast ratio certification which a good-quality SDR screen could get too?

It doesn't affect contrast.
You have to sell it to normies somehow though, so they try to market it like that.

what the fuck they literally just halved the saturation on the left one
what the fuck

It's funny how they advertise HDR. Like they'll show the non-HDR as a shitty, greyed out picture, while the HDR version is just a standard picture with boosted contrast.

Gotta sell new TVs/monitors somehow I guess, since they have to wait for content creators to catch up to the 4k/8k meme resolution.

>Sanjeev Verma, Product Manager,

I've seen actual HDR TVs at a shop (the super expensive OLED ones), and they do exactly that. It's false advertising

Displays have always been able to show what HDR's main advantage is: shadow detail and avoiding blown-out skies. That's always been a limitation of the recording device. HDR on new displays just adds more bits for color gradation and higher contrast ratios thanks to greater light output from the panel. But for simply revealing shadow detail and blown-out highlights, HDR in the recording sense is still something you can see on a standard display.

POOLETE THIS

Yeah that's what I thought. Can 10b color really make that big of a difference though? I haven't ever seen proper 10b content on a 10b screen, but other than potentially decreasing banding issues (which aren't really a huge deal anyway) I can't imagine how it would have some sort of huge impact. Most people apparently can't even tell 720p from 1080p or 1080p from 4K, are they really going to notice 2 extra color bits per channel?

2 bits are a huge fucking deal. It means 4 times the color resolution.
It is apparent mostly in video games, but none of them support it right now apart from Shadow Warrior 2 (I think).
The ability to display skies is much better, and shadows and dark areas are much more defined.
Thanks to the PS4 Pro we will see a lot more games with 10-bit.
I think the change is necessary and good.
No longer will gradients look fucking retarded.
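
Here's what those 2 bits mean in plain numbers, as a throwaway Python sketch (the function name and the 5% figure are made up for illustration, not tied to any game or panel): quantize the darkest sliver of a gradient at both depths and count the available steps.

# Illustrative only: distinct code values available across a dark 5% sliver
# of the signal range, where banding is most visible.
def count_steps(lo, hi, bits):
    levels = 2 ** bits
    return int(round(hi * (levels - 1))) - int(round(lo * (levels - 1))) + 1

print("8-bit: ", count_steps(0.0, 0.05, 8))    # 14 distinct values
print("10-bit:", count_steps(0.0, 0.05, 10))   # 52 distinct values -- roughly 4x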

SW2 does have the option, yeah. I'd have to see it for myself. I know we're going from 256 to 1024 values per channel, but I haven't really seen any glaring issues with shadow detail and skies, even in games, as long as you're running them on a decent monitor which doesn't crush blacks or clip the whites.

What the fuck kind of device was used to take the left picture then? I've seen chink phones with better sensors. Is this even legal?

>desaturate photo
>slightly oversaturate the same picture and put it next to it
>LOOG MA INNOVASHUN XDDDDD

Now presenting to you: Ultra-HDR. Compatible with every display!

10bit doesn't make it HDR.

High contrast makes it HDR.
But if you stick with 8 bit HDR will look like garbage, it's barely enough for regular screens.

ITT /g/ doesn't understand what HDR is and thinks it has anything to do with color.

Pretty much. The 3D thing flopped hard, the 4K thing flopped hard, so this is the new buzzword for why you absolutely MUST run out and buy a new TV. It's going to flop hard as well.

I prefer XtremeHDR+
It looks so much better and makes the colors pop out.

3D and 4K are memes.

But better contrast really makes a difference.
Especially when it comes to darker blacks.

Yes, the industry needs a new hype every year to sell new TVs.
But at least this is a useful hype, unlike most years.

>4k
>meme
Maybe for videos and video games, but certainly not for everyday use.

For TVs, i.e. movies.

HDR is also not meant for spreadsheets.

I prefer Nvidia G-UHDR desu

HDR in film and photography is about the ability to expose so that bright lights aren't over exposed while dark areas remain visible.

The way HDR TVs are advertised is with saturation levels.

made it HDR for you

NOW THIS IS HD™

It was already bad enough. Either RED has shitty colors or whoever was grading the image was DUDE WEED.

Holy shit, now I know why the matrix dudes wear sunglasses even when using computers

2160p60 HDR porn when?

LCDs can still barely keep up with 8-bit, so what the fuck is the point? Where is the monitor that will show this?

>The way HDR TVs are advertised is with saturation levels.

Advertised, yes.
Because you can't actually show HDR on a non-HDR screen so the marketing geniuses decided to show saturation instead.
Sucks but such is life in normie land.

>not 4k 120fps 3D hdr

And in monitors and video coding it's purely about higher contrast levels and a wider gamut without banding. Dithering pretty much already makes that happen, and 99.999% of content is 8-bit, so the real-world gain will be minimal.

fugg :DDD

They'll be here right before Christmas I suspect.

They could make HDR screens 15 years ago, but they had to get on the megapixel race first.

>How the shit is HDR supposed to improve contrast by increasing bit depth?
HDR is not about increasing the bit depth, it's about increasing the dynamic range. They're related but ultimately orthogonal.

>You can display finer gradations of color since it's 10b of course, but how the fuck does that affect contrast?
HDR is *not* the same thing as 10-bit SDR (as was used with UltraHD or Hi10P content). HDR is about increasing the brightness range you can represent, from 0.1-120 to more like 0.001-10,000 nits

>Is it just marketing shit for displays which come with better contrast, which could very well exist with 24bpc as well?
Pretty much. Brighter whites and darker blacks. That's what HDR is all about.

Note that youtube adding HDR is actually a bigger benefit than just increasing contrast: Since HDR sort of implies BT.2020, we're also getting a much wider color gamut out of the box.
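
For reference, ST.2084 (the PQ curve used for HDR10-style content) is just a fixed formula mapping a normalized code value to absolute luminance in cd/m2, topping out at 10,000 nits. Here's a minimal Python sketch; the constants are the ones from the published spec as I recall them, and the function name is mine, so double-check before relying on it:

# Minimal sketch of the ST.2084 (PQ) EOTF: normalized signal in [0,1] -> cd/m2.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal):
    p = signal ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

print(pq_eotf(0.0))   # 0.0 nits
print(pq_eotf(0.5))   # ~92 nits -- half the signal range is still "SDR-ish" brightness
print(pq_eotf(1.0))   # 10000.0 nits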

and what is the technology? not lcd. oled? enjoy paying $10000 for that TV

btw, mpv supports this but it misdetects the gamma (it detects it as bt.1886 but it should be st2084, I think)

>3D porn

>tfw no foveon digital film camera

Just improves LCD and OLED screens, yes.

>enjoy paying $10000 for that TV

I doubt it will even be more expensive than 2016 models.
It's just improvements to be able to keep selling new TVs.

Since I never fell for the full-HD meme, let alone the 4K meme, I've saved plenty of money to replace my good old 720p TV.

that looks so comfy

I understand that, but it misrepresents the current state of the devices and the feature.

I feel your autism, bro.

>Foveon
Where is that? looks a lot like where I live

You can buy HDR OLED TVs for like $5000 right now.

It's irrelevant whether it makes a difference or not. People aren't going to throw away their TV and buy a new one to get it.

Is this your first time seeing a marketing picture?

Meh, my D800 has a better sensor.

>Sony sensor
>good

Uhm....yes, Sony sensor good.

they usually at least try

>improving LCD

How? They're at their contrast limits already. If display manufacturers could push higher native contrast numbers than the ~2000:1 the very best panels manage, they would have done so long ago.

I have a Dell Ultrasharp 2408wfp monitor which supports a color gamut outside the standard 16 million colors.

Will I be able to see any benefit from this?

The spec sheet says it has

>110% Color Gamut (CIE 1976) - With Dell TrueColor Technology, you’ll see more color than average monitor of 72% color gamut.

>If display manufacturers could push higher native contrast numbers than the ~2000:1 the very best panels manage, they would have done so long ago

No, because they were focused on making higher resolution panels with shitty contrast because nobody gave a shit about anything except how many "K's" it had.

>which supports a color gamut outside the standard 16 million colors.
You're thinking of bit depth. A monitor can support 10bpc color but that doesn't mean it has a wide gamut. Likewise there are plenty of 8-bit monitors with a wide gamut.

In either case, no, you won't notice any real world benefits.

yeah the backlight thing, which isn't native and looks like shit, it's just localized dynamic contrast

Are there any videos with a wide color gamut that I could play on my monitor to see it using its full potential?

Windows assumes sRGB and performs no correction for desktop elements, so you're probably already seeing it, just stretched from sRGB to your monitor's native gamut.
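
To make the "stretching" concrete, this is roughly what a color-managed path would do instead: convert the sRGB values into the panel's native primaries before display (BT.2020 below is just a stand-in for a wide native gamut, and the helper names are made up). Sending the values raw skips this step, which is where the oversaturation comes from. Matrix values are the commonly published ones; treat the exact digits as approximate.

# Sketch: linear sRGB -> CIE XYZ -> linear BT.2020. Gamma handling omitted.
SRGB_TO_XYZ = [
    [0.4124564, 0.3575761, 0.1804375],
    [0.2126729, 0.7151522, 0.0721750],
    [0.0193339, 0.1191920, 0.9503041],
]
XYZ_TO_BT2020 = [
    [ 1.7166512, -0.3556708, -0.2533663],
    [-0.6666844,  1.6164812,  0.0157685],
    [ 0.0176399, -0.0427706,  0.9421031],
]

def mat_vec(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def srgb_to_bt2020(rgb_linear):
    return mat_vec(XYZ_TO_BT2020, mat_vec(SRGB_TO_XYZ, rgb_linear))

print(srgb_to_bt2020([1.0, 0.0, 0.0]))  # ~[0.63, 0.07, 0.02]
# i.e. sRGB red only needs ~63% of the BT.2020 red primary. Feeding [1, 0, 0]
# straight to a wide-gamut panel shows the panel's full primary instead --
# that's the oversaturation you're seeing on the desktop.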

Yes, the colors look very over-saturated on everything unless I put the monitor in sRGB mode, and then it looks dull compared to my other normal monitor.

The overall image is still crazy good. I wish there was a way to use it as my main desktop monitor without funky colors, where things only look more colorful when they actually have more color.

It's exactly the same with actual HDR monitors/TVs. I've seen a few. They grey out the normal picture to make it appear worse.

No they don't. Show me literally a single video tech or monitor marketing picture that even remotely tries

get OLED and you have it native

the comparisons are mostly overdone because they compare it to the oldest tech grandpa possible.. if you compare a new HDR IPS/MVA Monitor with an OLED one, OLED would still look better.

but compare that with TN, it most likely wouldn't.

I really hope quantum dot makes its appearance with the same momentum.

You're confusing HDR with 10bit video.
They're two different things.

>the comparisons are mostly overdone because they compare it to the oldest tech grandpa possible..
No, the comparisons are always completely and absolutely made up. There is no technological basis whatsoever in them

SUCH VVVVVVVVVVVVVVVVVVVVVVVIBRANCE

>some sort of minimum contrast ratio certification which a good-quality SDR screen could get too?

Unfortunately I don't think this is the case. An HDR rated display can have any contrast ratio.

But yeah it is confusing and the facts are fudged. But what consumers need to know is that HDR video and good contrast ratios go hand in hand, because you need good contrast to take advantage of HDR video.

HDR is basically a marketing label used to mislead the public into buying “HDR” LCD panels. It's a bait-and-switch: Show people fancy OLED tech at conventions, get them hyped about OLED, sell them shitty “dynamic HDR” LCD panels instead.

>HDR is not about increasing the bit depth, it's about increasing the dynamic range. They're related but ultimately orthogonal.

Yes it is, depending on which technology named HDR you're talking about.

>Pretty much. Brighter whites and darker blacks. That's what HDR is all about.

That's static contrast ratio.

HDR video is 10 bit
We've always had high-contrast displays, and we didn't call them HDR. You could, for example, have an OLED display that can show perfect blacks and whites as bright as the sun while only displaying 8-bit video.

en.wikipedia.org/wiki/High-dynamic-range_video

That's the best simulation you can present on an SDR display though. Black is actually dark grey, and white is bright grey. Your eyes just adjust.

>yeah the backlight thing, which isn't native and looks like shit, it's just localized dynamic contrast

Compared to what? OLED is a good while away from being cheap and long lasting.

Then again the active matrix backlight with a decent resolution is a long way from being cheap too.

>but other than potentially decreasing banding issues (which aren't really a huge deal anyway)
Really? It's a pretty huge deal.

>Yes it is, depending on which technology named HDR you're talking about.
I'm talking about the term `HDR` which literally stands for `high dynamic range`, and in this context refers to SMPTE ST.2084 and ARIB STD-B67

>That's static contrast ratio.
Sort of. The point of HDR is not just pushing the static contrast number up, but also increasing the overall envelope (i.e. brighter brights, not just darker darks)

In a way, static contrast ratio is a number that tells you almost nothing about the real black/white points. For the purpose of clarity it's best to focus on the concrete numbers rather than their ratio.

A typical, high-quality SDR display has a black point of like 0.1 cd/m2 and a white point of like 100 cd/m2, giving you a contrast of about 1000:1

A high-contrast SDR display might have a black point of 0.001 and a white point of 100 cd/m2, giving it a static contrast of 100,000:1. But it's still not HDR

An HDR display might have a black point of 0.01 and a white point of 2,000 cd/m2, giving you a static contrast of 200,000:1 but a greatly increased dynamic range (meaningful brightness spectrum).
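
Spelled out as arithmetic (same illustrative figures as above, not measurements of any real panel):

# The ratio alone hides the fact that the HDR example's envelope reaches
# 20x brighter highlights than the high-contrast SDR one.
displays = {
    "typical SDR":       (0.1,   100),    # black point, white point in cd/m2
    "high-contrast SDR": (0.001, 100),
    "HDR example":       (0.01,  2000),
}
for name, (black, white) in displays.items():
    print(f"{name:18s} black={black:<6} white={white:<5} contrast={white / black:,.0f}:1")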

>HDR video is 10 bit
That's an implementation detail, really. You could have an 8 bit HDR video, it would just be lower quality. You could also have a 12 bit HDR video. (In fact, the spec allows for the latter iirc)

You can make up for minor differences in bit depth by dithering heavily, but the problem with dithering is that it's extremely expensive to encode.

(This is why 10-bit encoding is more efficient than 8-bit encoding on anything with gradients - you don't have to dither)

So really, while 10-bit isn't strictly necessary, it's simply more efficient bit-wise.
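
Toy Python illustration of the dithering point, if anyone cares (the names and numbers are made up, this is nothing like a real encoder): for a level that falls between two 8-bit codes, plain quantization locks onto one code and produces a band, while dithered quantization averages out to (almost) the true level at the cost of fine noise.

import random
random.seed(0)

STEP = 1 / 255                         # one 8-bit step, normalized signal

def quantize(x):
    return round(x * 255) / 255

target = 0.0123                        # sits between two 8-bit codes
plain    = [quantize(target) for _ in range(10000)]
dithered = [quantize(target + (random.random() - 0.5) * STEP) for _ in range(10000)]

print(sum(plain) / len(plain))         # ~0.01176: snapped to code 3, a flat band
print(sum(dithered) / len(dithered))   # ~0.0123: the band dissolves into grain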

>8 bit ought to be enough for everybody

This is the same thing I got when I set the display output range in my video player from 16-235 to 0-255.

11 years ago.
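
For reference, that setting is just the standard limited-to-full range expansion for 8-bit video; a minimal sketch (assuming plain 8-bit luma, helper name made up):

# Limited range puts black at code 16 and white at code 235.
def limited_to_full(y):
    return max(0, min(255, round((y - 16) * 255 / 219)))

print(limited_to_full(16))    # 0   (black)
print(limited_to_full(235))   # 255 (white)
print(limited_to_full(128))   # 130 (mid grey shifts a bit)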

/p/ here

i want to murder you all

this is some atrocious Dunning-Krugering

That's from shit encoding though.

The image literally displays a lossless 8 bit grayscale ramp

>HDR
>marketing alphabet soup that has nothing to do with the photographic technique it's named after
>there are two wildly different competing standards - one open and easily implemented, the other a Dolby abortion that requires your display to include a proprietary image processing chip
Not only is it meaningless marketing speak, it's three or four separate and arguably unnecessary improvements: increased bit depth, a different color space, vastly increased brightness, and contrast-boosting techniques like local dimming.

>marketing alphabet soup that has nothing to do with the photographic technique it's named after
Fuck off, the `HDR` from photography is pure cancer and has nothing to do with actual dynamic range. That's just shitty faux-HDR tone mapping.

This is about true HDR, end-to-end, which requires an HDR camera, HDR encoding and an HDR display. No tone mapping involved.

>the other a Dolby abortion that requires your display include a proprietary image processing chip
Wait, why? Do they have patents on the algorithm or what?

If it were up to /g/, we'd all be using 480i CRT eye cancer crates.

/g/ here

I implement the HDR support in your image editors and media players

feel free to fuck off

I watched it in 144p for the luls

you madman

>photographic technique it's named after
I'm getting second-hand embarrassment here; kill yourself. HDR tech in TVs is not derived from the tone-mapping techniques used in photography. If anything it attempts to do the opposite: actually displaying a greater range of contrast, rather than compressing it into a lower dynamic range.

>arguably unnecessary improvements
You're a giant faggot

But AMD has FreeHDR which does the same shit you fanboy faggot