OK, so this is the latest catchphrase for next gen quality graphics, but I'm wagering most of us aren't too sure about what it actually does.
I wanted to ask the question on GAF too, but while typing out the post, I thought about it and it kinda makes sense.
Anyway, the issue is that a traditional color display covers a fixed range of color values... about 16.7 million colors from the 24 bits of color in a 32-bit pixel (with the other 8 bits going to the alpha channel). The only problem is that this doesn't account for the variances in intensity that exist in natural lighting situations.
I mean, if you take a black crow out in sunlight, retain its intensity and color values, and bring it into a moderately low-lit indoor setting, the crow will be noticeably whiter/brighter than a white shirt or sheet of paper.
Current display devices don't really account for this kind of natural dynamic range in intensity... nor can current monitors even express the wide range of values that can exist, with a relatively low contrast ratio of 300-800:1, as opposed to the hundred-thousand-plus-to-one range we might experience in a day-to-day situation.
On the upside, our perceptual system works such that, even though our optical system receives a huge range of color and intensity values, color is understood relatively. That is to say, something of moderate intensity brought into a bright place (an LCD screen in the sun) is perceived as dark, while that same intensity in a low-light situation (an LCD screen in a poorly lit room) can be perceived as bright.
In such a situation, while our monitors can't display the full range of colors that our optical system is capable of receiving, they can still approximate color reproduction in relative terms quite well.
On the other hand, with 32-bit video output we only get that relatively flat range of intensity values; if something is brighter than the brightest white the video can output, it's represented as the same color as another value that might be 3-4 times less intense but just happens to sit at the threshold of the display device.
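A quick sketch of that clipping behaviour, purely my own illustration (the units and the to_ldr helper are made up, not any real API): with a plain low-dynamic-range output, everything above the output ceiling lands on the same white.

```python
# Hypothetical scene intensities in arbitrary linear units; anything above
# the display's ceiling of 1.0 simply clips in a plain LDR pipeline.
scene = [0.25, 1.0, 3.0, 4.0]   # the last two are "brighter than white"

def to_ldr(value, ceiling=1.0, levels=255):
    clamped = min(value, ceiling)            # everything over the ceiling clips
    return round(clamped / ceiling * levels)

print([to_ldr(v) for v in scene])  # [64, 255, 255, 255] -- 3.0 and 4.0 look identical
```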
On the other hand, if we use more bits to encode the color, the intensity values can be kept; when represented on a 24-bit monitor, the color range would, I guess, be compressed, so that colors above a certain intensity threshold display as the brightest thing the monitor is capable of, with colors below that threshold scaled down relative to that brightest element.
I guess in simple terms, it might be something like...
Say 128 is the highest possible intensity value, and the display is only capable of 32 intensity levels. When the brightest color on screen has an intensity value (IV) of 16, all colors can be reproduced on the monitor naturally, as intended.
OTOH, when the brightest color on screen has an IV of 72, the original color with an IV of 16 will be shown as a pixel with an IV of about 7 (16/(72/32)), while the color with an IV of 72 displays at 32, the monitor's max...
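Here's that arithmetic written out as code, just to check it holds together (a toy linear scale-by-brightest-value; the tone_map function is my own invention, and real tone mapping uses fancier curves than this):

```python
# Toy linear tone map: scene intensities can go up to 128, the display can
# only show 32 levels, and everything is scaled relative to the brightest
# value currently on screen.
def tone_map(scene_iv, scene_max, display_max=32):
    scale = max(scene_max / display_max, 1.0)  # only compress if the scene exceeds the display
    return round(scene_iv / scale)

# Brightest pixel on screen is IV 16: nothing needs compressing.
print(tone_map(16, scene_max=16))   # 16
# Brightest pixel on screen is IV 72: 16 maps to ~7, 72 maps to the display max of 32.
print(tone_map(16, scene_max=72))   # 7
print(tone_map(72, scene_max=72))   # 32
```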
I guess the bonus of HDR is that when they introduce displays capable of producing a relatively natural color/intensity range, as in the link above, HDR video devices will be ready to use the display in a natural manner.
Well, this is what I've come to understand just thinking about it; although I'm not quite sure it's right at all, and it could just be some convincing-sounding tripe. Anyone else want to confirm, deny, explain, query this?