How the fuck do I get rid of color banding, this shit pisses me off.

Other urls found in this thread:

x264.nl/x264/10bit_02-ateme-why_does_10bit_save_bandwidth.pdf
en.wikipedia.org/wiki/Gamut

In images or video?

turn on dithering in graphics drivers

or if it's a video, turn on dithering in the video player

buy a 48-bit deep color monitor

In general.
>turn on dithering in graphics drivers
And how do I do that?

>In general.

Buy a new monitor and make your own content.

Most TN panels are dithered 6-bit, see frame rate control.

All video codecs outside of specialty studio codecs compress the color information, see chroma subsampling.

Most video and images are 8-bit, which results in 256 possible shades per RGB channel. Create and store the content in 16-bit to get 65536 values per channel.
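
If you want to check the arithmetic yourself, here's a trivial C sketch (just 2^bits, nothing monitor specific):

#include <stdio.h>

/* Levels per channel = 2^bits; plain arithmetic, nothing hardware specific. */
int main(void) {
    int depths[] = {6, 8, 10, 16};
    for (int i = 0; i < 4; i++)
        printf("%2d-bit channel: %lu levels\n", depths[i], 1UL << depths[i]);
    return 0;
}

That prints 64, 256, 1024 and 65536 levels respectively.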

You can get 16-bit grayscale radiological diagnostic monitors for $$$. But you can only watch one color channel at a time and you have to create your own content.

Banding isn't noticeable unless you have something in 8-bit with lots of gradients.

wow someone on Sup Forums knows what they're talking about

What do you mean "make your own content"? I don't see how that's going to stop the banding in my Steam window.

check this box and buy a new monitor
your shitty TN panel can't handle it

To fix that you have to create a Steam clone that renders its UI with 16-bit precision.

Make sure temporal dithering is enabled on your GPU, it does help.

this bit shit is retarded?

24bit? isn't that supposed to be way better than 10bit these autists always cry about in monitors and encoding?

24-bit is 8 bits per 3 channels (red, green and blue)
10 bits per channel is clearly superior to 8 bits per channel

then why not call 10bit... 30bits?

How many bits does one shade take?

all of em

You need to encode your own videos in 10-bit and shit. Or produce your own anime and movies from scratch.

You mean 30-bit. 24-bit is 8 bits per channel and 10 bits per channel is 30-bit. Retard.

>turn on dithering in graphics drivers
>And how do I do that?
AMD already does that, Nvidia only does it in DirectX 11 Fullscreen mode.

so what's the difference between the 8-bit and 24-bit in the OP image if 24-bit is just 8 bits per channel?

then why don't they call "10bit" videos 10bits ??? i always see 10bit without the s

How do I know how many bits my monitor is?

see pic related

count them

When the time is right, you'll know.

>why don't they call "10bit" videos 10bits
10bit and 10-bit are different moron.

autism is a hell of a drug

because the number of channels might not be 3, so the combined/pixel bit depth is only used when the number of channels is fixed, like with displays
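
Rough C sketch of how the per-channel and per-pixel numbers relate (the formats listed are just the common examples, nothing exhaustive):

#include <stdio.h>

/* Per-pixel depth is just per-channel depth times the channel count, which is
   why the combined number is only quoted when the channel count is fixed. */
int main(void) {
    struct { const char *name; int channels; int bits; } fmt[] = {
        {"RGB,  8 bits/channel", 3, 8},   /* the usual "24-bit"                */
        {"RGBA, 8 bits/channel", 4, 8},   /* "32-bit" once alpha is counted    */
        {"RGB, 10 bits/channel", 3, 10},  /* "10bit" video, i.e. 30 bits/pixel */
    };
    for (int i = 0; i < 3; i++) {
        int total = fmt[i].channels * fmt[i].bits;
        printf("%s -> %d bits/pixel, %.0f possible values per pixel\n",
               fmt[i].name, total, (double)(1ULL << total));
    }
    return 0;
}

The 30 bits/pixel case works out to 1073741824 values, which is where the "1.07 billion colors" marketing number comes from.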

Radiological monitors are 10 bit; anything over that is placebo tier.

>sauce is 8-bit
>encode it to 10-bit x265
fucking animefags

tv sauce has lots of grain
cr sauce to hevc 10bit is retarded, tho

>Or produce your own anime and movies from scratch.

Implying you don't do that already

>2017
>same old arguments as 2014
Educate yourself already.
x264.nl/x264/10bit_02-ateme-why_does_10bit_save_bandwidth.pdf

wot ees hardwear combat-ability?

profile=opengl-hq

h.265 has a 10-bit Main profile (Main 10), so that won't be a problem.

For h.264 the majority of the problems came with things like TVs and media playing boxes that supported h.264 and mkv. Luckily these devices also sucked dick at mkv and, particularly, softsub support, so they weren't worth playing mkv animu on anyway. Some groups kept doing shittier 8bit releases in mp4 containers for those plebs.

Open C:\Windows\System32\drivers
Right click all files, compress them with an archive, name it 'backup'.
Type "amd" into the search field. ("amd" Will be different based on graphics card brand.)
Delete any *.sys files you find that start with "amd".
Create a file on your desktop named "dither.txt" and paste the following code into it with Notepad -
#include
#include
#include
int w = scrn_width();
int h = scrn_height();
for (int x = 0; x += 2; x < w) for (int y = 0; y += 2; y < h) scrn_out(rgb(x, y));

Rename dither.txt to dither.sys and put it in your drivers folder.

The 8-bit in OP's image isn't 8 bits per color channel, it's 8 bits in total. 256 possible color combinations. The 24-bit one is about 17 million possible color combinations.

>1.07 billion
There are not even that many colours.

>he thinks there are a finite number of visible colors

And for nVidia? There are no nVidia.sys files. Pls help.

Get a IPS panel monitor that supports 8 bit per channel without dithering.

Most TN panels do 6bit + dither to get 8bit res. Good panels do 8 bit without dithering, or 8bit + dither for 10bit.
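
Toy C sketch of the FRC idea, not any actual panel's algorithm: alternate between the two nearest 6-bit levels over a few frames so the average comes out at the 8-bit value.

#include <stdio.h>

/* Toy frame-rate-control (FRC) sketch: approximate an 8-bit value on a 6-bit
   panel by alternating between the two nearest 6-bit levels over frames.
   Not a real panel's algorithm, just the general idea. */
int frc_6bit(int value8, int frame) {
    int low  = value8 >> 2;   /* nearest 6-bit level below                  */
    int frac = value8 & 3;    /* leftover 0..3, in quarters of a 6-bit step */
    /* show the higher level on 'frac' out of every 4 frames */
    return (frame % 4) < frac ? low + 1 : low;
}

int main(void) {
    int value8 = 130;         /* 8-bit level we want to fake */
    int sum = 0;
    for (int frame = 0; frame < 4; frame++) {
        int level6 = frc_6bit(value8, frame);
        printf("frame %d: 6-bit level %d\n", frame, level6);
        sum += level6;
    }
    printf("average over 4 frames: %.2f (target %d/4 = %.2f)\n",
           sum / 4.0, value8, value8 / 4.0);
    return 0;
}

For 130 it shows 33, 33, 32, 32, which averages to 32.5, exactly 130 in 8-bit terms.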

You probably need to pay for their premium subscription and then you can access it

I think the 2515H does that
It's not that expensive and has pretty small pixels

Your for loop is in the wrong order. The condition goes in the middle.

Do I have to keep paying the monthly $10 for Club GeForce Elite, or can I just buy it once enable dither and cancel my subscription and still have dithering enabled afterwards?

Does this even matter if the content itself is 8-bit?

No

No. But for any 10bit content you have it will make a difference.

Assuming you have a card that will output 30bit color.

Well technically there are, due to quantum mechanics

And assuming you're able to tell the difference between sub 2 LSB color hues. I think most people are not. It's like the lossless vs lossy music debate

That's beside the point.
People don't even perceive the 16 million colors our 8-bit monitors produce the same way; that's no excuse to limit ourselves to a smaller range of colors.

>People don't even perceive the 16million colors our 8 bit monitors produce the same, that's no excuse to limit ourselves down to a smaller range of colors.

Actually that's definitely an excuse to do that. What's the point of trying to replicate something to a higher accuracy than our eyes can even perceive, and sacrifice performance and storage space in the process?

A much better goal is to improve gamut and contrast ratio. In fact the only reason we might ever need 10bit color channels is precisely if the gamut and contrast ratio improve so much that one can suddenly see the difference between individual colors in the 8bit version

Fucking deal with it

>Actually that's definitely an excuse to do that.
It's not.
You should always aim for higher accuracy, and that includes contrast ratio (improving gamut includes increasing the number of producible colors anyway).

>ctrl f daiz
>0 results

...

There is a point of diminishing returns.
We reached that with ~48 kHz at 16-bit for audio.

Current video doesn't meet that yet, but once it does there is little to be gained in trying to improve it.

We wouldn't bother with 1GHz 256bit audio reproduction for a reason.

Since I switched to Nvidia the color banding is insane.

>(improving gamut includes increasing the number of producible colors anyway).

Increasing gamut isn't the same as increasing the color resolution. One doesn't necessarily include the other, no.

Uh, yeah, it does.
en.wikipedia.org/wiki/Gamut

Widening the gamut is naturally going to increase the number of colors that can be produced.

No, you're the one who needs to read that wiki article, it seems.

You can widen the gamut without increasing the number of colors that can be produced, because gamut is only about where the borders of the color space lie. Resolution is about how tightly the individual possible colors are spaced within those borders. All normal monitors with 8-bit color channels can display the same number of colors, even though they have different gamuts. There is an equal number of possible colors within the color space; changing the gamut only changes how those colors look. A wider gamut can mean that the colors FF0000, 00FF00 and 0000FF look redder, greener and bluer respectively, but it doesn't add a new color in between, for example, FF0000 and FE0000. There are still ~17 million possible colors.

The reason a higher color resolution is more beneficial for higher gamuts and higher contrast ratios is that when you stretch the color space but keep the same color resolution, the distance and difference between two incremental colors becomes larger, so it's easier to tell them apart. For gamuts and contrasts that we see in real life, it's dubious that 8-bit is enough to avoid banding. But for the gamut and contrast in today's monitors, it is enough, because the colors are packed tightly together since the color space is smaller than in real life.
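
Crude 1-D illustration in C of the spacing argument (arbitrary luminance units, nothing colorimetrically rigorous): the bit depth fixes the number of codes, widening the range only spreads them out.

#include <stdio.h>

/* Toy 1-D illustration: an 8-bit channel always has 256 codes; widening the
   range ("gamut") only spreads those same codes further apart. */
static void show(const char *name, double range) {
    int codes = 256;   /* 2^8, fixed by the bit depth, not by the gamut */
    printf("%-12s range %6.1f -> %d codes, step %.4f per code\n",
           name, range, codes, range / (codes - 1));
}

int main(void) {
    show("narrow gamut", 100.0);   /* arbitrary units                            */
    show("wide gamut",   400.0);   /* 4x wider: same 256 codes, 4x bigger steps  */
    return 0;
}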

buy a panasonic gh5 and shoot 10bit.

you can always dither it in post process.
it will still look better than if they encoded straight to 8bit, because you'd need to add some grain for it not to fall apart.
10bit just makes it easy for them.
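
Minimal C sketch of what "dither it in post" amounts to, here just random noise of about one output step before truncating 10-bit to 8-bit; real filters use smarter error diffusion or grain synthesis:

#include <stdio.h>
#include <stdlib.h>

/* Minimal dither-on-downconvert sketch: add roughly one output step of noise
   before rounding 10-bit (0..1023) down to 8-bit (0..255), so flat gradients
   break up into a mix of neighboring levels instead of hard bands. */
int to_8bit_dithered(int value10) {
    double noise = (rand() / (double)RAND_MAX) - 0.5;      /* -0.5 .. +0.5 of an 8-bit step */
    int out = (int)((value10 + 4.0 * noise) / 4.0 + 0.5);  /* 4 ten-bit units per 8-bit step */
    if (out < 0) out = 0;
    if (out > 255) out = 255;
    return out;
}

int main(void) {
    /* a shallow 10-bit gradient that would collapse into bands at 8-bit */
    for (int v = 510; v < 520; v++)
        printf("10-bit %d -> 8-bit %d\n", v, to_8bit_dithered(v));
    return 0;
}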