Explain HDR to me

Explain HDR to me

Other urls found in this thread:

dpreview.com/news/6758360083/benq-announces-32-inch-4k-high-dynamic-range-monitor
youtube.com/watch?v=wthhc1s0Pig
youtube.com/watch?v=-X4K69BJKD4&t
twitter.com/NSFWRedditImage

stuff

Something that is bright won't just be white, it will actually be brighter than the rest of the picture.

Multiple light bounces. Usually an additional pass of a .rad compile.

...

memes and fools with cash to spend

So, HDR is just higher contrast?

It's as shit as your monitor

More detail in dark and light areas, a wider colour spectrum, and more nuance in colour.

It's bloom v2.0

Imagine 2 pixels of the same color, one is actually brighter than the other. But everything else is the same.

So HDR is supposed to be bottom right?

Sony wants to get back in the electronics business, so they're using the PS4 Pro's ability to play HDR as the latest must have feature.

Yes

...

if the term "high dynamic range" isn't self explanatory, you need to go back to school

more colors

Well, the meme was that the eye can't see more than 30fps, which is false; the eye doesn't see in fps at all.
But as far as nits go, aka candelas per square metre (the unit of luminance), TV backlighting leaves a lot to be desired.
The fluorescent tubes in the back, and even the LEDs, cannot change their light level individually, so you get a very compressed brightness and color space.
Dolby also really didn't like the bandwidth hike for just the 'noticeable' improvement of UHD.
Since they've worked with the movie industry for years, they came up with a standard which doesn't add much bandwidth cost, but the image quality improvement from light to dark areas is astounding.

Pic related.
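The Dolby-developed standard mentioned above is the PQ curve (SMPTE ST 2084), which maps a normalized signal onto absolute brightness up to 10,000 nits. A minimal Python sketch of its decoding (EOTF) side, using the constants from the spec; the function name is mine:

```python
# SMPTE ST 2084 (PQ) EOTF constants, as published in the spec.
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69

def pq_eotf(signal: float) -> float:
    """Decode a normalized PQ signal (0..1) to luminance in nits (cd/m^2)."""
    e = signal ** (1.0 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return 10000.0 * (num / den) ** (1.0 / M1)

# Code value 1.0 decodes to the 10,000-nit peak; real TVs clip far below that.
```

Note how little of the signal range is spent on extreme brightness: code value 0.5 decodes to under 100 nits, which is why PQ adds so little bandwidth for such a big perceived range.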

BULLSHIT

(You)

That's interesting. Why does no monitor manufacturer make HDR monitors yet, while TVs are all about this crap?

higher range of colors
higher contrast
more colors
more brightness

more realism

dpreview.com/news/6758360083/benq-announces-32-inch-4k-high-dynamic-range-monitor

do what

I miss my CRT. I thought "upgrading" to LCD would give me better picture, lol. I was such a fool.

Okay, when will I be able to use normal monitors that don't cost over $2k because they're for mastering photos?

They are, but the normie market is always bigger, so they have to beta test all the hiccup models and let the tech stretch its legs before it's seasoned and we get it.

Never, normie displays are dominated by cheap TN panels.

>High Definition Radio
>radio that outputs in 720p

get a job

Because none of them are dumb enough to waste their time on a gimmick. HDR is basically just a better emulation of the really fucking annoying effect you get when you go from a dark room to the middle of the day.

Screens can already display enough colors, and they can approximate the effect of bright light sources on the human eye.

Getting a job doesn't change the fact that this is a monitor that is not for gaming: the refresh rate and response times usually leave a lot to be desired, not to mention an insane price that is only worth it if you're working with photos, since no other media is capable of displaying the true scale of 99% RGB palette. Not to mention that this won't bring RGB gaming content, since no one would actually buy this shit.

>99% RGB palette
>RGB gaming content

>RGB gaming content
HDR obviously

true,
but if you had a job you would get a 32inch HDR low response time TV for your desk.

Hardly that long. Everyone pushes HDR these days.
And with Intel having integrated 10-bit video output in Kaby Lake, I doubt it will be long until everyone has some device that can output it.

>tfw you buy a cheap HDR HDTV and it has a worse color gamut than your old non-HDR HDTV

Nigger I don't give a fuck about correct terms within the photofag industry, I just want them gaymes and shit. I understand the concept

>Have brown eyes
>Will never be able to perceive the extended color range offered by HDR
It sucks, bros.

still trying to meme this huh

High
Dynamic
Range

>tfw peasants are being bombarded with 4K and HDR ads and they are buying them, so they are getting cheaper and cheaper

youtube.com/watch?v=wthhc1s0Pig

>explain 1080p to me
>explain color television to me
It's just darker blacks and brighter whites, but more technical than what adjusting your contrast can give you

Again, CRT did it before

>HDR is basically just a better emulation of the really fucking annoying effect you get when you go from a dark room to the middle of the day.
That's part of tone mapping, compression from a higher dynamic range to a lower one. With HDR displays this isn't necessary or the effect is at least diminished.
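One common global tone-mapping operator for that compression is Reinhard's; a minimal sketch, assuming plain scalar luminance (the function name is mine):

```python
def reinhard(luminance: float) -> float:
    """Reinhard global operator: compresses [0, inf) into [0, 1)."""
    return luminance / (1.0 + luminance)

# Low luminances pass through nearly unchanged; highlights are rolled
# off smoothly instead of clipping, which is the compression from a
# higher dynamic range to a lower one that tone mapping performs.
```

On a true HDR display this curve can be much gentler, or skipped entirely, because the panel can show more of the original range directly.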

PC subhumans don't have HDR yet so they don't care about it. Based Microsoft is introducing it to PC in the Windows 10 Creators Update. Once PC subhumans get it, you can be sure this board will get a thread a day about how good HDR is.

ugh, I can't stand his smug face and voice.
Do you have a normal person describing it instead of chad?

HDR monitors have existed for a while, but only enthusiasts use them (photo/video editors, artists, etc.)

>linus
>chad

no they fucking didn't
none of them could get bright enough to do HDR

>have massive color blindness
>actually cant see shit
>faggots like you say this

Die scum

I'll have the one on the left

>building a retro PC setup because I can
>got everything lined up except the monitor, thinking I'll have to use the ghetto ass 19" LCD from 2008 sitting in my closet
>rummage through uni's electronics recycling depot, spot a fairly clean looking CRT
>turns out it's a rather high-end model and in mint condition
Sure beats fiddling with DOSBox and/or having to suffer through the old LCD experience again

Funny how as soon as 4K screens are cheap and 1080p screens dirt cheap that they release something new.

It's a buzzword way to say 10-bit color + higher brightness without losing detail at the extreme ends of black/white. I really wish they'd gone with something else; idiots are confusing it with the old HDR lighting effects.
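For scale, the jump from 8-bit to 10-bit per channel is 256 vs 1024 distinct code values, which is what buys the smoother gradients near black and white. A toy sketch (my own code, not any standard's):

```python
def quantize(value: float, bits: int) -> float:
    """Snap a 0..1 intensity to the nearest code value an N-bit channel can store."""
    levels = (1 << bits) - 1   # 255 for 8-bit, 1023 for 10-bit
    return round(value * levels) / levels

# The same mid-grey lands on slightly different representable values:
grey_8bit = quantize(0.3, 8)    # coarser steps, ~1/255 apart
grey_10bit = quantize(0.3, 10)  # four times finer, ~1/1023 apart
```

The 10-bit result sits much closer to the true value, which is why banding in dark gradients mostly disappears.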

Who the fuck uses Windows 10? It's shit.

HDR in photography isn't really related to HDR in rendering at all.

A lot of monitors already have 10-bit output

It's the most used OS on Steam. If you're not using it why are you on the video game board?

I thought 10-bit was something different

FYI: most TVs on the market only cover like 80% of the range they advertise

HDR isn't just having a 10-bit panel, it's also being able to hit a certain level of brightness without washing out the colors and losing detail.

>what are UHD televisions
Just get a 40 inch Samsung. They cost like $600 and have 20ms input lag

Windows 10 is only shit if you don't know shit about computers.

The Windows 10 core is very fast and snappy and reliable.

Just disable the Metro shit and half of the services and you're done.

Shitty cards are also the most used on Steam, so you have one of those too? Most stuff on Steam is a laptop, pajeet.
Also most computers already have a 10-bit capable output, but there is a lack of content

In a nutshell, computer graphics can technically generate colors beyond the scope of 256 shades per color channel

However, monitors cannot.

Traditional HDR adjusts/normalizes the color range of a pixel. Imagine a pixel that was supposed to be twice as blue as it was red, but red was already at the displayable maximum and blue wanted twice that maximum: it would appear physically 1:1 red:blue. So the image gets tone-mapped such that the red gets halved (OVERSIMPLIFIED/IN A NUTSHELL)
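That normalization can be sketched in a few lines of Python (a toy of mine, not any engine's actual code): if any channel overflows the displayable maximum, scale the whole pixel down so the channel ratios survive instead of clipping.

```python
def tonemap_pixel(rgb):
    """Scale an HDR pixel into 0..1 while preserving channel ratios."""
    peak = max(rgb)
    if peak <= 1.0:          # already displayable, leave it alone
        return tuple(rgb)
    return tuple(c / peak for c in rgb)

# The example from the post: red at maximum, blue at twice maximum.
# Naive clipping would show them 1:1; scaling keeps blue twice as strong.
pixel = tonemap_pixel((1.0, 0.0, 2.0))  # -> (0.5, 0.0, 1.0)
```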

Its not

10-bit displays are often found in IPS screens that are worth their salt. 10-bit HDR technology isn't new; Sony wants to pretend it's ONLY ON PS4 PRO but it's been affordably available for about 3-4 years

That's nice you underaged idiot. My body is also capable of flight, but there is a lack of wings.

Is this the same HDR?

Putting it simply, yes.
It also needs a much fatter data stream; you'll need at least 20 Mb/s to have proper HDR.

No, completely different

Then they're not actually the same color, fuckhead.

That is literally the HDR effect in photography emulated in a video game.

To your eyes, not to PC, you dumb fuck

>Also most computers already have a 10-bit capable output
Not really. Not being able to output 10bit in OpenGL makes it kind of pointless.
And I doubt anyone itt has a Quadro or FirePro.

Old/traditional HDR

monitors are now starting to offload the process

It's HDR in rendering but it still has to get tone mapped for a non-HDR display.

>be blind
>REALLY ACTUALLY can't SEE ANYTHING
>faggots like you say this

rot in the deepest parts of hell

>Not being able to output 10bit in OpenGL makes it kind of pointless.
What the fuck uses OpenGL today? It's all DX and shit

Are there any cheap monitors that actually have good black levels and don't look like complete shit during dark movies/games? My current one is giving me cancer with how bad it is

Take a pen, write something.
That is the "color"; it doesn't change. What changes is how it looks to you when you read it inside vs. in daylight vs. at night.

there are 1080p IPS displays for

Most computers having 10-bit outputs is not actually true. Jewvidia refused to let any consumer cards have 10-bit output (until the most recent 1000 series cards), because they wanted people to buy shitty Quadro cards instead. AMD has had 10-bit output, but only since the 7000 series, for the same reason, obviously with FirePro instead.

If you have a consumer desktop graphics card that isn't a 7000-series-or-newer AMD card or a 1000 series Jewvidia card, you do not have the ability to output 10-bit.

It's a hardware issue, not a software one. You can output 10-bit with OpenGL if you have a capable card.

Well, if you have one you can download an HDR test image/video and check for yourself how it looks. 10-bit is the primary requirement; how good the backlight is at making it pop varies across TVs, and neither Blu-ray players nor computers have a means to control it.
Yep, I was surprised when I checked my monitor with a test image. It's not even marketed as 10-bit, it might be using dithering, but at 1440p that must be invisible.
What does that mean? Windows 10 isn't going to magically make any content. The internet is still sRGB anyway. Stop showing your butthurt, pajeet.
>Not being able to output 10bit in OpenGL makes it kind of pointless.
What? I'm pretty sure you can if you have a card that supports it

You're a retard, m8.

You keep your telemetry and forced updates.

>What the fuck uses OpenGL today? It's all DX and shit
lmao

If you fucking ask why are you being an idiot?

Nope. It's a newer process that allows a digital image to display a higher range of lumens. Instead of the display interpreting a whole range of luminance values as the same shade of grey, you see every minute difference, like you would in real life. Higher contrast just removes the mid-range greys and forces them toward the brightest and darkest ends.

>tfw wide gamut IPS

feels good

Isn't new Doom OpenGL?

youtube.com/watch?v=-X4K69BJKD4&t just watch this to get an idea. Don't listen to Sup Forums when it comes to technical matters if you want to actually learn something

Yes, OpenGL and Vulkan.

Hold on. Hasn't GTX had 10-bit in DX for a while now? It's just Nvidia fagging out on OpenGL (because of Adobe and shit).
Does the GTX 10xx series support 10-bit in OpenGL?

Super simple analogy

Imagine going up in resolution from a 480p display to a 1080p display

but instead of the number of pixels, it's the number of shades of each color in the visible range that's going up.

It's difficult to imagine it in reality, so computers tend to adjust the color range of the screen depending on how bright the other colors are

AMD got 10-bit with 7000 series and Nvidia got 10-bit with 1000 series. Software doesn't make a difference.

I work with OGL daily on a 1070
Can I use a 10 bit framebuffer or not?

Yes.

It's a Jew tactic to get you to part with your cash.

To make it easy, let's pretend your current monitor displays color with every channel coded in 8 bits, and the brightness of every pixel also in 8 bits. An HDR display is one with a higher range of brightness: your old monitor could choose brightness from 256 different values, your new one can choose from, let's say, 1024 different values.

There is also HDR in games, which is about rendering the scene in a way higher brightness range than your display can output, then simulating a camera diaphragm by picking a brightness reference point and darkening much darker points / brightening much brighter points.

There is also HDR in photography, which is about taking a couple of photos exposed differently and picking out the details from each photo, so no detail is lost. Then you compress it all down to the 0-255 range and everything looks like shit. Art.
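The "simulating a camera diaphragm" part of game HDR can be sketched with a simple exposure curve (my own illustrative code, not any particular engine's):

```python
import math

def expose(luminance: float, exposure: float) -> float:
    """Map unbounded scene luminance into the 0..1 display range;
    'exposure' plays the role of the diaphragm/aperture setting."""
    return 1.0 - math.exp(-luminance * exposure)

# Opening up the aperture (higher exposure) brightens dark detail;
# stopping down preserves detail in very bright areas, which is why
# game HDR looks like your eyes adjusting when you leave a dark room.
```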

just go look at an HDR tv at best buy