(upscaled, banding)
(upscaled, PQ)
Here is an animation of dimming the lights down: the luminance is scaled
down from 10^2 to 10^-5. There is no objectionable banding at low light levels
in the second animation. (There is a slight one, but that's due to GifCam.) At
low light levels you can see how the dithering is working hard to cover the
long band, but notice that the pattern is not flickering.
Unfortunately, you won't be able to see the fine dark shades at very low
luminance levels with your eyes adapted to normal (office) light. But if you
turn all the lights off and make the surround of the gif black as well, you
will see the fine, banding-free shades down to the lowest levels. This could be
used (though those times are over now) to mimic an HDR display with an sRGB
(Rec. 709) one.
You basically dim the room low; your eyes adapt to the low light and become
more sensitive to intensity changes, so the dim shades appear to you as
normal. The bright shades can now be used for the really bright lights,
highlights and so on. That way you can create a relative contrast on the
screen similar to real HDR images. This technique isn't new, but it was never
really applied: you would need to sit in a dark room (the freaks do anyways),
and you would also need a method to get rid of the banding at low light
levels. Hence, there aren't that many games out there that play in the dark.
However, this problem will be solved with the new HDR TVs and consoles,
because the brighter displays will allow you to build darker games! You don't
need to sit in the dark anymore, because an HDR display has so much more
luminance (cd/m²) than standard displays that the highlights still look very
bright on screen, even with your eyes adapted to normal lighting levels.
However, the banding problem basically still exists; it was addressed by
sending 12 bits of HDR data (10 bits for now) to the HDR display. Mind you, a
plain 12 bits aren't sufficient to suppress the bands at low light levels; 16
bits would be needed. But then came Dolby (Dolby Vision), having created a
perceptual quantizer (PQ) for HDR displays. Using this PQ to quantize the HDR
buffer down to 12 bits won't produce any visible banding at the lowest light
levels. Well, it does produce banding, but you won't be able to see it. You
may now imagine what the disadvantage of a 10-bit HDR TV is: it will likely
produce banding at low light levels until the TV takes care of it in some way
(perhaps by making the image noisier to cover the bands).
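For reference, the PQ curve mentioned above is standardized as SMPTE ST 2084.
Here is a minimal NumPy sketch of the encode/decode pair (the function names
are mine; the constants are the exact rationals from the spec), which also
shows why a 12-bit PQ step in the dark is too small to see:

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants, exact rational values from the spec.
M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # ~78.84
C1 = 3424 / 4096           # ~0.8359
C2 = 2413 / 4096 * 32      # ~18.85
C3 = 2392 / 4096 * 32      # ~18.69

def pq_encode(luminance_nits):
    """Absolute luminance in cd/m² (0..10000) -> PQ signal in [0, 1]."""
    y = np.clip(np.asarray(luminance_nits, dtype=np.float64) / 10000.0, 0.0, 1.0)
    ym1 = y ** M1
    return ((C1 + C2 * ym1) / (1.0 + C3 * ym1)) ** M2

def pq_decode(signal):
    """PQ signal in [0, 1] -> absolute luminance in cd/m²."""
    e = np.clip(np.asarray(signal, dtype=np.float64), 0.0, 1.0) ** (1.0 / M2)
    ym1 = np.maximum(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * ym1 ** (1.0 / M1)

# One 12-bit PQ code step around 0.01 cd/m² corresponds to a luminance
# difference well below a thousandth of a cd/m² -- too small to see.
step = 1.0 / (2**12 - 1)
e = pq_encode(0.01)
print(pq_decode(e + step) - pq_decode(e))
```

The key property is that the step size scales with luminance: the quantizer
spends its codes where the eye is most sensitive, which is exactly what a
linear 12-bit encoding fails to do in the darks.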
The problem of banding doesn't exist only for monochromatic shades as seen
here; it happens for dark colored gradients as well. It's very likely that you
have seen banding on a lot of (dark) gradients. If no PQ is available, you can
still use standard quantization plus dithering or random noise to cover the
bands down to a given darkness quite well. It's not perfect, but it will
smooth out most of them. However, when applying the same techniques to
textures, one needs to know that dithering will make the image compress much
less, which was perhaps the reason why some earlier dither patterns were kept
so primitive for certain scenes, as can be seen here in the game LOOM by
Lucasfilm Games:
There were only so many kilobytes on a diskette.
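To sketch the dither-before-quantize idea in code (a minimal NumPy version;
I'm assuming simple triangular-PDF noise here, and the function name is mine,
not from any particular engine):

```python
import numpy as np

def quantize_dithered(img, bits=8, rng=None):
    """Quantize a float image in [0, 1] to `bits` bits, adding triangular-PDF
    noise of about +/- 1 LSB before rounding, so bands average out to the
    original gradient instead of snapping to hard steps."""
    if rng is None:
        rng = np.random.default_rng(0)
    levels = 2 ** bits - 1
    # The difference of two uniforms in [0, 1) is triangular noise in (-1, 1) LSB.
    tpdf = rng.random(img.shape) - rng.random(img.shape)
    q = np.round(img * levels + tpdf)
    return np.clip(q, 0, levels) / levels

# A dark horizontal gradient: plain rounding collapses it to a few wide bands,
# while the dithered version keeps the correct *average* level everywhere.
gradient = np.tile(np.linspace(0.0, 0.02, 1024), (64, 1))
banded   = np.round(gradient * 255) / 255
dithered = quantize_dithered(gradient)
print(len(np.unique(banded)))   # → 6 distinct levels, i.e. visible bands
print(abs(dithered.mean(axis=0) - gradient[0]).max())  # well under one 8-bit step
```

The trade-off mentioned above shows up immediately: the dithered image is full
of per-pixel noise, which is exactly what generic image compressors struggle
with.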