Here I go again, bringing some fresh 'news' (experiences) on HDR and TVs. Sorry, I'm a nerd about these things.
This may make no sense at first, but bear with me.
I'm sure almost everyone with an HDR TV and a console knows Vincent Teoh and already knows about HGIG tone mapping and how to correctly set up the TV and the console for correct HDR display.
But to put everyone here into context: what HGIG tone mapping on the TV (for those sets that have it) actually does is disable any tone mapping, so the TV follows the HDR PQ luminance curve as-is and simply clips at the display's maximum luminance.
In the case of OLEDs, maximum luminance sits around 800 nits, so with HGIG enabled, everything above 800 nits clips to full white and all that detail is lost.
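If you want to see what that clipping means in numbers, here's a minimal Python sketch (mine, not anything a TV actually runs) of the standard PQ EOTF from SMPTE ST 2084, with an HGIG-style hard clip at an assumed 800-nit panel peak:

```python
# Minimal sketch, not any TV's actual firmware: the SMPTE ST 2084 (PQ) EOTF
# plus an HGIG-style behaviour that tracks the curve exactly and then
# hard-clips at the panel peak. The 800-nit peak is an assumed OLED figure.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128           # standard PQ constants
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Normalized PQ code value (0..1) -> absolute luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def hgig_display(signal: float, peak_nits: float = 800) -> float:
    """HGIG-style: follow the PQ curve as-is, clip anything above the panel peak."""
    return min(pq_eotf(signal), peak_nits)

for s in (0.5, 0.7, 0.75, 0.9):
    print(f"signal {s:.2f}: source {pq_eotf(s):7.1f} nits -> displayed {hgig_display(s):6.1f} nits")
```

Running it shows that a signal around code value 0.9 encodes roughly 3,900 nits in the source, but on an 800-nit panel with HGIG it simply flattens to 800 along with everything else above the peak.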
But the thing is, HDR content can go as high as 4,000 nits of luminance (and Dolby Vision supports up to 10,000), and even if those maximums are partly future-proofing, we can't really know what maximum nit value each HDR game is 'mastered' to.
So as soon as you engage HGIG and limit the console's output to your TV's maximum luminance, you are actually leaving detail out for the sake of luminance accuracy.
HDR TVs already have a built-in tone mapping curve (on LG OLEDs, this is what you get with Dynamic Tone Mapping set to Off) which, instead of trying to track the luminance exactly according to the PQ curve, tries to retain as much detail as it can from 1,000+ nit sources.
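To make the trade-off concrete, here's a purely illustrative roll-off tone map; this is not LG's actual algorithm, the 400-nit knee and 4,000-nit content maximum are just assumptions, and the linear compression is a simplification of what is really a smooth curve. The point is that brightness accuracy above the knee is sacrificed so that highlights land below the panel peak instead of clipping:

```python
# Illustrative only -- not any manufacturer's real curve. Track the source
# luminance up to a knee point, then compress everything between the knee and
# an assumed content maximum into the headroom below the panel peak.

def rolloff_tone_map(src_nits: float, peak_nits: float = 800,
                     knee_nits: float = 400, content_max: float = 4000) -> float:
    """Map source luminance to display luminance with a highlight roll-off."""
    if src_nits <= knee_nits:
        return src_nits                      # accurate below the knee
    # Squeeze [knee, content_max] into [knee, peak]; detail is dimmed, not lost.
    t = (src_nits - knee_nits) / (content_max - knee_nits)
    return knee_nits + t * (peak_nits - knee_nits)

for nits in (200, 800, 1500, 4000):
    print(f"{nits:5d} nits in -> {rolloff_tone_map(nits):6.1f} nits out")
```

With those assumed numbers, an 800-nit highlight comes out dimmer than it should (around 440 nits), but a 4,000-nit highlight still maps to 800 instead of being crushed into flat white, which is exactly the accuracy-versus-detail trade this thread is about.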
I was thinking about this yesterday, and then I came to a stupid realization. HDR can make the image pop... but it was never about that pop! (Or at least, not only.)
It's called High Dynamic Range for... having a high dynamic range, not high brightness. HDR is about detail! It's about the number of brightness levels and colors! And guess what: the image has more detail with HGIG off, with the default TV tone mapping.
(And the console tone mapping set accordingly).
Yes, the average brightness is lower, but overall the image looks a lot richer and fuller since there's a lot more data packed in. But obviously, since each game's HDR is different, results will vary.
Supporting this 'theory' is the fact that a lot of color science, chroma subsampling for example, is built around the human eye being much more sensitive to light (luminance) than to color, so favoring 'quantity of detail' over brightness accuracy may have a reason to be a thing.
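Chroma subsampling literally throws away color resolution while keeping brightness resolution intact; here's a toy 4:2:0-style sketch of the idea (my own simplification, using plain 2x2 averaging for the chroma planes):

```python
# Toy 4:2:0-style subsampling: luma (Y) stays at full resolution, chroma
# (Cb/Cr) is averaged over 2x2 blocks -- the classic case of trading colour
# detail for the brightness detail the eye cares more about.
import numpy as np

def subsample_420(y: np.ndarray, cb: np.ndarray, cr: np.ndarray):
    """Keep Y full-res; downsample Cb/Cr by 2 in both directions."""
    def down2(c: np.ndarray) -> np.ndarray:
        h, w = c.shape
        return c.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return y, down2(cb), down2(cr)

y, cb, cr = (np.random.rand(4, 4) for _ in range(3))
y2, cb2, cr2 = subsample_420(y, cb, cr)
print(y2.shape, cb2.shape, cr2.shape)   # (4, 4) (2, 2) (2, 2)
```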
I'm actually testing it on FF7R, which looks amazing to me now, and GoW:R.
GoW:R didn't change much, but I would need to play more.
So I invite you to at least give it a try and see how it goes for you. I'm just throwing my thoughts and impressions out here.
EDIT
Let me show you the tone mapping curves from a Vincent Teoh video about the LG E9:
You can see that with HGIG, as I already mentioned, the TV only handles signal levels from 0 to about 70, accurately following the PQ curve and clipping everything above that.
But with HGIG off, it tracks slightly below the corresponding brightness along the curve, but handles signal levels up to about 90, hence retaining more detail.
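If you want to sanity-check those axis values, here's a quick sketch using the inverse of the same standard ST 2084 PQ formula as above ("signal level" here just means the normalized PQ code value as a percentage): an ~800-nit panel tops out around level 73, while a 4,000-nit source reaches roughly level 90, which lines up with the two curves.

```python
# PQ encoding (inverse EOTF) with the standard ST 2084 constants:
# convert absolute luminance in nits to a normalized signal level.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Absolute luminance in nits -> normalized PQ code value (0..1)."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

for nits in (800, 1000, 4000, 10000):
    print(f"{nits:5d} nits -> signal level {100 * pq_encode(nits):.0f}")
# ~800 nits sits around level 73, 4000 nits around level 90, 10000 nits at 100.
```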
What else do you need me to explain : /