Wow, this is what real HDR looks like:

Burger

Member
[Attached images: Comptest3.jpg, Comptest6.jpg, Comptest7.jpg, Comptest14.jpg, Comptest9.jpg, Comptest5.jpg, OnBed1_forumsize.jpg, GMan3_forumsize.jpg]


More here

Looks incredible. Done with HL2 models in an HDR rendering program.

If X360, PS3, Nintendo, whatever can do realtime HDR next gen, that's where we are going to see some real improvements in graphics. This stuff looks insane.
 
That's called Image Based Lighting. Use a chrome sphere to take an HDR image of a real place, and use that image to light a rendered scene. Paul Debevec is a pioneer in the field; check out "Fiat Lux" and "The Parthenon".

To be clear: those are rendered images composited onto photographs. Nothing is real-time, so don't get excited thinking that's what games are going to look like.
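For anyone curious what the diffuse half of that idea looks like in code, here's a minimal offline sketch (nothing from Debevec's actual tools; the function name and data layout are made up for illustration): take an HDR environment map in latitude-longitude layout and integrate the cosine-weighted light over the hemisphere around a surface normal.

```python
# Minimal image-based lighting sketch (hypothetical, NumPy only): estimate the
# diffuse light reaching a surface with a given normal from an HDR environment
# map stored in latitude-longitude layout.
import numpy as np

def diffuse_irradiance(env_map, normal):
    """env_map: (H, W, 3) float array of linear HDR radiance.
    normal: unit-length (3,) vector. Returns an RGB irradiance estimate."""
    h, w, _ = env_map.shape
    theta = (np.arange(h) + 0.5) / h * np.pi            # polar angle per row
    phi = (np.arange(w) + 0.5) / w * 2.0 * np.pi        # azimuth per column
    phi, theta = np.meshgrid(phi, theta)
    dirs = np.stack([np.sin(theta) * np.cos(phi),
                     np.cos(theta),
                     np.sin(theta) * np.sin(phi)], axis=-1)
    # Solid angle of each pixel (rows near the poles cover less of the sphere).
    d_omega = np.sin(theta) * (np.pi / h) * (2.0 * np.pi / w)
    # Lambertian cosine term; directions behind the surface contribute nothing.
    cos_term = np.clip(dirs @ normal, 0.0, None)
    weight = (cos_term * d_omega)[..., None]
    return (env_map * weight).sum(axis=(0, 1))

# Example: light a surface facing straight up with a synthetic "sky" probe.
env = np.ones((64, 128, 3)) * [0.4, 0.6, 1.0]            # uniform blue sky
print(diffuse_irradiance(env, np.array([0.0, 1.0, 0.0])))
```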
 
This kind of HDR in real time? In your dreams :P

Though I've heard of a few upcoming games (Wreckless 2, for example) which are said to be implementing some kind of HDR, but it's obviously a dumbed-down version of this.
 
In computer graphics and cinematography, high dynamic range imaging (HDRI for short) is a set of techniques that allow a far greater dynamic range of exposures than normal digital imaging techniques. The intention is to accurately represent the wide range of intensity levels found in real scenes, ranging from direct sunlight to the deepest shadows.

For example, this gives you the opportunity to shoot a scene and have total control of the final image from the beginning to the end of your photography project. A concrete example is that it gives you the possibility to re-expose: basically, it is like being on set or on location, capturing the widest range of information possible, and choosing afterwards what you want.

From wikipedia.
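To make the "re-expose" point concrete, here's a toy sketch (hypothetical function names, assuming the HDR frame is just a float array of linear radiance): the exposure is chosen after the fact, and only then is the image squashed down to something a display can show.

```python
# Toy re-exposure of an HDR frame (illustrative only): the linear float
# radiance is kept around, so exposure can be picked after the shoot instead
# of being baked in by the camera.
import numpy as np

def expose(hdr, stops=0.0, gamma=2.2):
    """hdr: (H, W, 3) float array of linear radiance. stops: exposure shift
    in photographic stops. Returns an 8-bit image ready for display."""
    scaled = hdr * (2.0 ** stops)           # brighten/darken by whole stops
    clipped = np.clip(scaled, 0.0, 1.0)     # drop what the display can't show
    return (clipped ** (1.0 / gamma) * 255).astype(np.uint8)

# The same HDR data can be "re-shot" at different exposures afterwards.
hdr = np.random.rand(4, 4, 3) * 16.0        # fake scene with values above 1.0
for_window = expose(hdr, stops=-4.0)        # expose for the bright window
for_shadows = expose(hdr, stops=+1.0)       # expose for the shadows
```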
 
edit - better explanation already posted

I wonder how The Lost Coast is going to look compared to the X360 games that are launching this year.
 
What this is showcasing is really IBL. I honestly thought I invented IBL a few years ago and was doing research to see if anyone had thought of doing the same (basically my idea was a more primitive version: giving EVERY object a reflection map and changing the intensity of the reflections based on how reflective the surface is), and then I found out there were papers written on it back in '98. :P

Ever since then I've decided not to be shy about sharing my ideas here, so that at least I have proof I actually thought of these things years before they were implemented.

The issue with IBL is that you can't possibly capture a real-time reflection for everything. It's hard enough for games like PGR2 and FM to update their reflections at 30 and 15 frames per second. Pre-recorded reflection maps make sense in some cases, but won't look that realistic when the environment changes.

I do think that pre-rendered ray-traced lightmaps should be more accurate in this regard, and while they do much of what IBL does, they simply aren't that high-res.
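For what it's worth, the "reflection map per object, scaled by reflectivity" idea above can be sketched like this (purely illustrative, not any engine's actual API): look up the mirror direction in a pre-captured environment map and blend it with the base colour by a reflectivity factor.

```python
# Sketch of a per-object reflection map scaled by surface reflectivity
# (hypothetical helper names and data layout).
import numpy as np

def reflect(view_dir, normal):
    """Mirror the (unit) view direction about the (unit) surface normal."""
    return view_dir - 2.0 * np.dot(view_dir, normal) * normal

def sample_latlong(env_map, direction):
    """Nearest-neighbour lookup of a direction in a lat-long environment map."""
    h, w, _ = env_map.shape
    x, y, z = direction
    theta = np.arccos(np.clip(y, -1.0, 1.0))        # polar angle
    phi = np.arctan2(z, x) % (2.0 * np.pi)          # azimuth
    row = min(int(theta / np.pi * h), h - 1)
    col = min(int(phi / (2.0 * np.pi) * w), w - 1)
    return env_map[row, col]

def shade(base_color, reflectivity, env_map, view_dir, normal):
    """Blend the base colour with the reflected environment by reflectivity."""
    refl_color = sample_latlong(env_map, reflect(view_dir, normal))
    return (1.0 - reflectivity) * base_color + reflectivity * refl_color

env = np.random.rand(32, 64, 3)                     # stand-in reflection map
print(shade(np.array([0.8, 0.2, 0.2]), 0.3, env,
            view_dir=np.array([0.0, 0.0, -1.0]),
            normal=np.array([0.0, 0.0, 1.0])))
```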
 
aoi tsuki said:
A very interesting example of the technique, along with some great motion capture, is here:

http://www.ict.usc.edu/graphics/research/afrf/animatable_facial_reflectance_fields.avi

Motion capture is one area next gen really needs to improve. The canned and static motions of today will look even more jarring next gen as realism improves. I think that's the main thing motivating me to watch videos of Madden '06 on X360. :)

I've always thought motion capture for character animation looks crappy.

What developers need to work on is the transition stages between two sets of mocapped animation. Currently most transitions are flat out jarring.

Half-Life 2's character animation remains the bar for this gen IMO.
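Just to illustrate the transition problem: the crudest fix is to cross-fade the end of one clip into the start of the next instead of cutting, something like this rough sketch (hypothetical data layout: each clip is a frames x joints x 3 array of joint positions).

```python
# Minimal cross-fade between two mocap clips so the hand-off isn't a hard pop.
import numpy as np

def crossfade(clip_a, clip_b, blend_frames):
    """Return one sequence: clip_a, a blended transition, then clip_b."""
    tail = clip_a[-blend_frames:]                   # end of the outgoing clip
    head = clip_b[:blend_frames]                    # start of the incoming clip
    t = np.linspace(0.0, 1.0, blend_frames)[:, None, None]
    ease = t * t * (3.0 - 2.0 * t)                  # smoothstep easing
    transition = (1.0 - ease) * tail + ease * head
    return np.concatenate([clip_a[:-blend_frames], transition,
                           clip_b[blend_frames:]])

walk = np.random.rand(120, 20, 3)                   # stand-in "walk" clip
idle = np.random.rand(120, 20, 3)                   # stand-in "idle" clip
blended = crossfade(walk, idle, blend_frames=15)
```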
 