Just thought that a thread like this would be useful in case there's still someone who doesn't know how the HDR calibration screens on consoles work.
First of all, let's understand HDR correctly:
Yes, we all know what it means and what it's supposed to do: more brightness range and detail in both dark and bright zones. But let's take a look at the theory behind the scenes.
HDR is a standard and it's ABSOLUTE. And what does this even mean?
Well, it means that even though displays vary in their HDR capabilities, the standard specifies an absolute luminance range from 0 to 10,000 nits.
But modern displays are not even close to that. OLEDs (2020 and earlier) can output a max of 700 or 800 nits. LED TVs can go above 1,500.
And what does absolute mean? It means that colors in HDR are driven by luminance, so the accuracy of their representation depends on the display's ability to actually output the nits the input signal asks for.
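To make "absolute" concrete, here's a minimal sketch of the PQ transfer function (the EOTF from SMPTE ST 2084) in Python. The code is my own illustration, but the constants come from the standard; the point is that the same signal value always decodes to the same number of nits, on any display:

```python
# A minimal sketch of the SMPTE ST 2084 (PQ) EOTF: it maps a normalized
# 0..1 signal value to an absolute luminance in nits, topping out at 10,000.
def pq_eotf(signal: float) -> float:
    """Convert a normalized PQ signal (0.0-1.0) to absolute nits."""
    m1 = 2610 / 16384           # constants defined by ST 2084
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(pq_eotf(0.508))  # ~100 nits, roughly SDR reference white
print(pq_eotf(0.75))   # ~1000 nits
print(pq_eotf(1.0))    # 10000 nits, the ceiling of the format
```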
SDR, in contrast, covers a much smaller brightness range and is relative: it has a window it can move within, as long as contrast and gamma respect the brightness variation between shades.
Even though it technically specifies a peak luminance of 100 nits, you can raise brightness and contrast to suit your viewing conditions without being afraid of clipping whites. At least on TVs. On PC monitors, the maximum contrast setting usually exceeds the panel's capabilities, so you'll need to find the exact contrast value at which the monitor starts clipping.
In HDR the story is a bit different. The TV must squeeze all the possible luminance levels into its modest capabilities, and this is where tone mapping comes into play.
Each manufacturer has its own algorithm for this, and it's related to how they define the HDR PQ luminance function:
The X axis is the color value (with its associated brightness), and the Y axis is the luminance emitted by the display.
With dynamic tone mapping off, TVs apply a fixed function that lets them display a wider range of brightness at the expense of accuracy.
We have an ideal reference curve, the one with no roll-off that just hard clips at its max brightness.
Here the display would give each color its correct luminance level. Take an 820-nit OLED, for example: every color under 820 nits would be accurate, but everything above that would just clip and not be displayed.
This is "fixed" on the real PQ curve by applying a roll-off, which lowers the luminance and darkens all colors beyond a certain point in order to handle a wider range of brightness. But obviously this is not accurate to the reference standard.
And as you can see, everything still tops out at the display's maximum of 820 nits.
Each TV does this differently.
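To see the two behaviours side by side, here's a toy Python sketch for that 820-nit example. The 600-nit knee and the square-root roll-off shape are made up for illustration; the real manufacturer curves are proprietary:

```python
# Toy comparison of the two curves described above for an 820-nit display.
# Neither function is any manufacturer's actual algorithm.
DISPLAY_PEAK = 820.0

def hard_clip(nits: float) -> float:
    """Reference behaviour: accurate up to the peak, everything above clips."""
    return min(nits, DISPLAY_PEAK)

def roll_off(nits: float, knee: float = 600.0) -> float:
    """Tracks the signal up to a knee, then compresses the rest of the
    0-10,000 nit range into the remaining headroom: nothing clips until
    the very top, but everything above the knee is darkened (inaccurate)."""
    if nits <= knee:
        return nits
    excess = (nits - knee) / (10000.0 - knee)            # 0..1 over the compressed range
    return knee + (DISPLAY_PEAK - knee) * excess ** 0.5  # arbitrary soft shoulder

for n in (500, 820, 1000, 4000):
    print(n, hard_clip(n), round(roll_off(n)))  # roll-off darkens everything above 600
```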
Let's then talk about the calibration screens.
Those follow the HGIG guidelines for HDR. They specify 3 key values for the tone mapper (which is not the same thing as the dynamic tone mapper; in HDR the TV is always tone mapping):
- Max Full Frame Tone Map Luminance (MaxFFTML): the max level of brightness the TV can display before clipping WHEN MOST OF THE SCREEN IS REALLY BRIGHT, also called the primary HDR range. This is usually the first step of the calibration.
- Max Tone Map Luminance (MaxTML): the same as above but for small bright windows and specular highlights, also called the extended HDR range. It is higher than MaxFFTML.
- Min Tone Map Luminance (MinTML): pure black will be tone mapped to this level.
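Conceptually, a game engine can use these three values to keep its output inside what the display can actually show. Here's a minimal sketch; the names and the clamp are my own illustration, since HGIG defines the metadata, not any particular API:

```python
# Hypothetical container for the three HGIG calibration values.
from dataclasses import dataclass

@dataclass
class HgigCalibration:
    max_fftml: float  # peak nits when most of the screen is bright
    max_tml: float    # peak nits for small windows and specular highlights
    min_tml: float    # the level pure black is tone mapped to

def clamp_luminance(nits: float, cal: HgigCalibration) -> float:
    """Keep the game's requested luminance inside the calibrated range,
    so the TV is never asked for brightness it would have to clip."""
    return max(cal.min_tml, min(nits, cal.max_tml))

cal = HgigCalibration(max_fftml=750.0, max_tml=820.0, min_tml=0.0)
print(clamp_luminance(1500.0, cal))  # 820.0: the highlight is held at peak
```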
With this in mind, let's sort out what to do in the calibration screens:
For the bright limits, there are two options:
- With Dynamic Tone Mapping off, just follow the instructions: make the logo clip and dial one step back. This tells the tone mapper the maximum luminance the display can handle.
- With HGIG Dynamic Tone Mapping (which some modern TVs have), it's better to stay at the exact step where the logo just clipped. Why?
Because each step of the slider represents 100 nits. So if you leave it where you can still see the logo, there can still be luminance levels above it that don't clip.
Let's say that on this 820-nit OLED, the step where the logo clips corresponds to 850 nits, while the previous step corresponds to 750 nits. So by stopping where nothing clips, you are giving up 70 nits of brightness (from 750 to 820).
But if you go one step further, you actually get the display's full brightness range, at the cost of losing a bit of highlight detail (the 30 nits between 820 and 850 clip). The numbers are run in the sketch below.
Obviously you can adjust this to your liking.
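Here's that trade-off as quick arithmetic (the panel peak and slider values are the illustrative numbers from the example, not measurements):

```python
# Trade-off between the two slider positions on the hypothetical 820-nit panel.
panel_peak = 820   # what the display can actually output
step_below = 750   # last slider step where the logo is still visible
step_above = 850   # first slider step where the logo has clipped

wasted_headroom = panel_peak - step_below   # 70 nits of panel brightness unused
clipped_detail  = step_above - panel_peak   # 30 nits of signal detail lost

print(f"Stop below: waste {wasted_headroom} nits of panel headroom")
print(f"Stop above: clip {clipped_detail} nits of highlight detail")
```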
And for the dark limit, do the exact same: leave it at the step where the logo just clipped.
The reasoning is that in the HDR standard, pure black must be 0 or as close to 0 nits as possible. So the only reason to leave this with the logo still visible is if your viewing conditions are so bright that you need blacks raised to appreciate shadow detail.
And there's an obvious fact:
You must run the calibration tool with the dynamic tone mapping you want already engaged. There's no point in turning it off for calibration and then engaging it in game.
And what about HGIG?
If your TV has it, use it.
What HGIG does is follow the PQ curve very accurately until it hard clips at max brightness, with no roll-off. So all colors mapped into this range are as accurate as possible.
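In other words, HGIG mode behaves like the reference curve sketched earlier: track the PQ signal one-to-one, then hard clip at the calibrated peak. A minimal sketch, again using the illustrative 820-nit figure:

```python
# HGIG behaviour as described above: no roll-off, just a hard clip
# at the calibrated peak (MaxTML); 820 nits is the running example.
def hgig_tonemap(requested_nits: float, max_tml: float = 820.0) -> float:
    """Everything inside the range is displayed exactly; the rest clips."""
    return min(requested_nits, max_tml)

print(hgig_tonemap(100.0))   # 100.0 -> shown exactly as mastered
print(hgig_tonemap(3900.0))  # 820.0 -> out-of-range highlight hard clips
```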
And I think this is everything.
I'll leave the sources here:
- The official HGIG documentation:
- Vincent Teoh's explanation:
It's not that I trust Vincent blindly, but after some research and thought, I completely agree with him.
Sooo, there you go.