
High Dynamic Range lighting

Zaptruder

Banned
OK, so this is the latest catchphrase for next-gen quality graphics, but I'm wagering most of us aren't too sure about what it actually does.

I wanted to ask the question on GAF too, but while typing out the post, I thought about it and it kinda makes sense.

Anyway, the issue is that traditional color displays work with a fixed range of color values... about 16.7 million colors in a 32-bit pixel (24 bits of color, with 8 bits going to the alpha channel). The only problem is that this doesn't account for the variances in intensity that exist in natural lighting situations.

I mean, if you take a black crow out in sunlight, keep its exact intensity and color value, and bring it into a moderately low-lit indoor setting, the crow will be noticeably whiter/brighter than a white shirt or a sheet of paper.

Current display devices don't really account for this kind of natural dynamic range in intensity... nor can current monitors even express the wide range of values that can exist; they have a relatively low contrast ratio of 300-800:1, as opposed to the hundred-thousand-plus that we might experience in a day-to-day situation.

On the upside, our perceptual system works such that, even though our optical system receives a huge range of color and intensity values, color is understood in relative terms. That is to say, something of a moderate intensity value brought into a bright place (an LCD screen in the sun) is perceived as dark, while that same intensity in a low-light situation (an LCD screen in a poorly lit room) can be perceived as bright.

In such a situation, while our monitors can't display the full range of colors that our optical system is capable of receiving, they can still reproduce color in relative terms quite well.

On the other hand, when coupled with 32-bit video output, we only get that relatively flat range of intensity values; if something is brighter than the brightest white the video can output, then it's represented as the same color as another value that might be 3-4 times less intense, but happens to sit right at the threshold of the display device.
On the other hand, if we use more bits to encode the color, the intensity values can be kept; when represented on a 24-bit monitor, the color range would, I guess, be compressed, so that colors above a certain intensity threshold display as the brightest thing the display is capable of, with other colors below that value scaled down relative to that brightest element.
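
Something like this in rough Python terms, just to illustrate the clamping problem (all the numbers are made up by me):

# With a fixed 0..1 output range, anything brighter than "white" collapses to the same value.
display_max = 1.0
intensities = [0.3, 1.1, 3.8]                      # the last one is ~3-4x the second
shown = [min(v, display_max) for v in intensities]
print(shown)                                        # [0.3, 1.0, 1.0] - 1.1 and 3.8 become indistinguishable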

I guess in simple terms, it might be something like...

128 being the highest possible intensity value, and the display being capable of only 32 intensity values. When a color with an intensity value (IV) of 16 is the highest value on screen, all colors are reproduced on the monitor naturally, as intended.
OTOH, when a color with an IV of 72 is on screen, the original color with the IV of 16 will be shown as a pixel with an IV of about 7 (16/(72/32)), while the color with the IV of 72 displays at 32, the monitor's max...
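
Or as a few lines of Python, just to check my own arithmetic (same made-up 128/32/72/16 numbers as above):

display_max = 32

def to_display(iv, brightest_on_screen):
    # Scale everything relative to the brightest value on screen,
    # but only when that value exceeds what the display can show.
    scale = max(brightest_on_screen / display_max, 1.0)
    return iv / scale

print(to_display(16, brightest_on_screen=16))   # 16.0 - fits, shown as intended
print(to_display(16, brightest_on_screen=72))   # ~7.1 - i.e. 16/(72/32)
print(to_display(72, brightest_on_screen=72))   # 32.0 - pinned at the monitor's max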

I guess the bonus of HDR is that when they introduce displays capable of producing a relatively natural color/intensity range, as in the link above, HDR video devices will be ready to use the display in a natural manner.

Well, this is what I've come to understand just thinking about it; although I'm not quite sure it's right at all, and it could just be some convincing-sounding tripe. Anyone else want to confirm, deny, explain, query this? :P
 
No one cares for HDR?

Sure, everyone is ready to jump on the 'explain anisotropic filtering' bandwagon, but no one wants to discuss something that's... a little more worthwhile in terms of actually finding out about new stuff?

I mean, look at the link I gave... which looking back at my post I didn't give at all!

http://www.cs.ubc.ca/~heidrich/Projects/HDRDisplay/

Anyway, they're talking about displays capable of achieving the kind of high contrast that's more akin to reality: by using a matrix of individually programmable white LEDs as the backlight, they can increase the brightness of local areas dramatically without washing out other areas of the image.

But what's the likelihood of something like this becoming consumer technology? Would it be too costly?

Does anyone know anything about how good the contrast and brightness will be for the future flat panel tech? Klee?
 
I was going to respond with the link in your second post, but I couldn't put my finger on it! Thanks for posting it..

I'm interested in this stuff too, but I think it's a bit away in terms of consumer availability. I'd like to think it could be available soon though..
 
Neat stuff. I haven't bothered to look up HDR in technical terms, partly because a lot of it is over my head, so it was nice to see it explained a bit in terms of the visual accuracy it's able to achieve (or mimic, as may be the case). All I knew was it looked way cooler than bloom lighting. :)

Here's the demo I linked to in the aniso thread, since it's more on topic here; a nice little visual demonstration. Again, only PS 2.0-capable video cards are able to run it.

http://www.daionet.gr.jp/~masa/rthdribl/#System
 
I don't understand what's so difficult about displaying white and black in the same frame - didn't Splinter Cell do that?

[image: sc31.jpg]


Maybe someone could clarify it...
 
Well, my whole post above was trying to clarify that idea.

But I'll take the example of that image...

If we were to make that light more intense with a 32-bit color value, you can't do much more than flood the area with more white to create a larger halo around that spotlight.

Moreover, if the sun were out, the game would probably still end up rendering the spotlight as brighter, even though in reality sunlight is actually more intense than most spotlights, because that's how the spotlight needed to be rendered in order to look accurate or decent in its target setting (i.e. at night); rendering daylight brighter than that would obviously mean the whole sky would be too friggin' white.

On the other hand, with 128-bit color, colors can scale dynamically based on the brightest value... so if that light were still on when the sun came out, you would see the brightness of the sun wash out the brightness of the light, subsequently making the light look perceptually darker, as would occur in reality.
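
Putting some rough numbers on that (purely illustrative, not from any actual game): say the spotlight is 10 units of intensity and the sun is 100.

spotlight, sun = 10.0, 100.0   # made-up relative intensities
display_max = 1.0

def on_screen(value, brightest_visible):
    return display_max * value / brightest_visible

# At night the spotlight is the brightest thing visible, so it fills the display's range:
print(on_screen(spotlight, brightest_visible=spotlight))   # 1.0
# With the sun in frame, the same spotlight only reaches a tenth of the range:
print(on_screen(spotlight, brightest_visible=sun))          # 0.1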
 
joaomgcd said:
This is that special effect that forza uses when you drive out of a tunnel, isn't it?

No idea what Forza does, but I suspect they 'cheat' with the effect: manually brighten the outside section dramatically while you're inside the tunnel, then reduce the brightness upon exiting. Well... that's basically what HDR does, but it's a cheat in the sense that it's done manually, they're not using real representative color values, and it can only be done in a controlled, linear environment like driving through tunnels.

Well, the practical effect of HDR will be like that, except more consistent across the board and accurate.
 
So if I have understood this correctly, HDR emulates the light intensity going from one extreme to another... i.e. from bright to dark and vice versa... and this is done only for a couple of seconds or less, right?
 
Not just that, I'd think; if you put the entire range on screen at once, the colors would scale according to what's present on screen.
 
No one cares for HDR?

All future textures I shoot for the sake of (something vaguely realistic in) 3ds Max will most probably be taken at various exposures and combined into an HDR map.
 
PS3 might support 128-bit HDR. It's pure speculation based on the E3 presser for now. So if it's really an unmodified G70, then 64-bit HDR (fp16) is pretty much all we'll have. But as mentioned countless times, that's still a wide range, and should be easily doable by both next-gen machines. Heavenly Sword already uses fp16, and Deano already mentioned hitting limitations (although in strange situations, looking at the sun). 128-bit HDR might not be necessary, but if it is handled efficiently, it would be nice to have.
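
To put some very rough numbers on the range difference (my own numpy sketch, nothing RSX-specific):

import numpy as np

values = np.array([0.5, 1.0, 8.0, 500.0, 70000.0])

# 8 bits per channel: anything above "white" (1.0) clips to 255.
print(np.clip(values * 255.0, 0, 255).astype(np.uint8))   # [127 255 255 255 255]

# fp16 per channel ("64bit HDR"): real intensities survive up to ~65504, then overflow to inf.
print(values.astype(np.float16))

# fp32 per channel ("128bit HDR"): effectively unlimited range for lighting purposes.
print(values.astype(np.float32))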

Anyway, I didn't know HDR had anything to do with monitors. I mean, monitors/TVs display movies just fine. I wasn't aware that lighting was off in any way due to limited range. I thought HDR only referred to GPUs. :? PEACE.
 
Pimpwerx said:
PS3 might support 128-bit HDR. It's pure speculation based on the E3 presser for now. So if it's really an unmodified G70, then 64-bit HDR (fp16) is pretty much all we'll have. But as mentioned countless times, that's still a wide range, and should be easily doable by both next-gen machines. Heavenly Sword already uses fp16, and Deano already mentioned hitting limitations (although in strange situations, looking at the sun). 128-bit HDR might not be necessary, but if it is handled efficiently, it would be nice to have.

Anyway, I didn't know HDR had anything to do with monitors. I mean, monitors/TVs display movies just fine. I wasn't aware that lighting was off in any way due to limited range. I thought HDR only referred to GPUs. :? PEACE.

Well, if you create a monitor that's able to display every color and intensity value the human optical system can receive, then technically it wouldn't be a high dynamic range... just a massive range!

Seriously though, monitor technology can still get better, in order to produce a wider range of colors and intensities; simulated dynamic range is interesting... in situations where, theoretically, more light should be present, the scene actually ends up darkening to accommodate the range of the light. In an immediate transition environment it's not too bad... but if you download the HDR demo linked in the anisotropic thread, the noticeable brightness shift when moving the camera around gives the user a much stronger perception that things have gotten darker, as opposed to our eyes adjusting for brighter values.
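
The 'eye adjusting' behaviour in that demo looks to me like an exposure value chasing the scene's average brightness over a few frames. A toy version in Python (all the names and constants are my own guesses, not taken from the demo):

def adapt_exposure(exposure, scene_avg_luminance, target=0.18, speed=0.15):
    # Nudge exposure toward whatever would map the scene average to mid-grey (0.18).
    desired = target / max(scene_avg_luminance, 1e-6)
    return exposure + (desired - exposure) * speed

exposure = 1.0
for frame in range(60):
    # The camera swings toward the sun at frame 20: average luminance jumps from 0.2 to 5.0.
    scene_avg = 0.2 if frame < 20 else 5.0
    exposure = adapt_exposure(exposure, scene_avg)

print(round(exposure, 3))   # ~0.037 - so everything else in the frame now renders much darker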
 
All it really is is the equivalent of a camera aperture. Let's say it's an overcast day and you want to take a picture of the clouds in the sky: you have to adjust the aperture (the thing that lets the light into the lens) to compensate for how bright the sky is. The picture will reveal a beautiful sky, but the ground and surrounding buildings will be near silhouette.

If you were to take a picture of the buildings instead and adjust the aperture accordingly, the sky would blow out to absolute white.

Now if you were to take a picture in a forest, for instance, where very little light is breaking through, the open areas outside the forest would also blow out very harshly.

The Forza example is pretty apt, but I'm pretty sure it's an easy fake. HDR is going to take some getting used to, and sometimes it will be used wrong. The human eye is much better at handling and equalizing quick changes in light intensity than a camera is.
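
The aperture analogy in rough Python terms (the luminance numbers are just made up to illustrate):

# Made-up relative luminances for an overcast street scene.
scene = {"sky": 20.0, "building": 1.0, "alley_shadow": 0.05}

def photograph(luminances, exposure):
    # Film/display can only record 0..1: bright areas blow out, dim areas sink toward black.
    return {name: round(min(lum * exposure, 1.0), 4) for name, lum in luminances.items()}

print(photograph(scene, exposure=0.05))  # exposed for the sky: buildings at 0.05, shadows near silhouette
print(photograph(scene, exposure=1.0))   # exposed for the buildings: the sky blows out to 1.0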
 
Hmm... from what I understand about cameras, if too much light enters the camera, then values below a certain intensity will get crushed and appear black, even if the human eye would detect a perceptual difference between them.

HDR, OTOH, can preserve even the subtle differences between two relatively low intensity values while a high intensity value is present. How well that comes across would of course depend on how the high intensity values are translated onto the screen... if it's done linearly, between the lowest and highest IV on screen, then even if there is a technical difference, the perceptual difference may not be very detectable. But if the color values are scaled according to some curve that emphasizes the range of IVs actually on screen, rather than just a linear function between the lowest and highest, it would give better shadow detail than a normal photograph would be able to.
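
A crude way to see what I mean: linear scaling by the brightest value versus a curve that compresses the highlights but keeps shadow separation. The curve here is just the classic x/(1+x) operator, my own pick for illustration rather than anything a particular game uses:

hdr = [0.02, 0.05, 0.10, 8.0]           # two close shadow values, a dim value, and a very bright one

linear = [v / max(hdr) for v in hdr]     # everything scaled down by the brightest value on screen
curved = [v / (1.0 + v) for v in hdr]    # a curve that squashes the highlight but spares the shadows

print([round(v, 4) for v in linear])     # [0.0025, 0.0063, 0.0125, 1.0] - shadow detail crushed together
print([round(v, 4) for v in curved])     # [0.0196, 0.0476, 0.0909, 0.8889] - shadow differences kept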
 
Just so you guys know, Valve posted a video of HDR in their upcoming The Lost Coast release. Looks impressive. I'm not sure how to link to it, since it came through Steam.
 
God's Hand said:
Just so you guys know, Valve posted a video of HDR in their upcoming The Lost Coast release. Looks impressive. I'm not sure how to link to it, since it came through Steam.

the video is in the post above you.
 
Zaptruder said:
Well, if you create a monitor that's able to display every color and intensity value the human optical system can receive, then technically it wouldn't be a high dynamic range... just a massive range!

Seriously though, monitor technology can still get better, in order to produce a wider range of colors and intensities; simulated dynamic range is interesting... in situations where, theoretically, more light should be present, the scene actually ends up darkening to accommodate the range of the light. In an immediate transition environment it's not too bad... but if you download the HDR demo linked in the anisotropic thread, the noticeable brightness shift when moving the camera around gives the user a much stronger perception that things have gotten darker, as opposed to our eyes adjusting for brighter values.
Zap, you talking about white balance here? I think that's what it's called. Where you adjust the levels in a scene to better fit the range? I mean, the scene darkening just sounds like that. You darken everything less than the max, and scale them to fit, I guess. Anyway, is that what you're referring to?

Anyway, that Valve vid wasn't as useful for me as that demo. What type of HDR is Lost Coast using? PEACE.
 
Pimpwerx said:
PS3 might support 128bit HDR. It's pure speculation based on the E3 presser for now.

I think he clearly says RSX will support 128-bit pixel precision for HDR.
Quote

"With the RSX we're introducing a 128-bits with floating point percision for each and every single pixel" [...] "RSX will bring that to the game console world."
 