Shameless...
http://www.gamespy.com/articles/584/584744p1.html
This technology could be very cool when applied to first-person shooters. Imagine that you creep up to the corner of an alley. Then, when you want to peek around the corner, instead of strafing your character over, you could just move your head and literally 'peek.' Dr. Marks showed a demo of a street scene where he moved his head to look down the street and then moved it back to duck behind cover as the bullets flew.
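As a rough illustration of how a head-peek mechanic like that might work, here's a minimal sketch. This isn't Sony's code; the camera reading, function names, dead zone, and lean limit are all assumptions. It just maps a sideways head offset from a depth camera onto a sideways lean of the in-game camera:

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    x: float  # lateral head offset from center, in meters (assumed camera output)
    z: float  # distance from the camera, in meters

def peek_offset(head: HeadPose, dead_zone: float = 0.05, max_lean: float = 0.4) -> float:
    """Map a sideways head movement to a sideways camera lean in the game.

    Movements inside the dead zone are ignored so small fidgets don't shift
    the view; larger leans are clamped to max_lean meters.
    """
    if abs(head.x) < dead_zone:
        return 0.0
    lean = head.x - dead_zone * (1 if head.x > 0 else -1)
    return max(-max_lean, min(max_lean, lean))

# Leaning 20 cm to the right peeks the in-game camera out about 15 cm.
print(peek_offset(HeadPose(x=0.20, z=1.5)))  # ~0.15
```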
Sure, that's cool. But once the camera knows how close or far away you are, it opens up other possibilities, specifically in the way that digital objects interact with the real world depicted on the screen. In another demo, Dr. Marks held up a wand that attracted butterflies that swirled around it on the screen. As he moved the wand, the butterflies flew to chase it. But then, when he passed the wand behind his back, the butterflies on the screen flew behind him. It really looked like they were flying around him. The illusion was much more complete than you can get with today's technology.
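The behind-the-back effect is essentially a per-pixel depth test between the virtual object and the live camera image. A hedged sketch of the idea, with the depth values and the "drawn"/"occluded" decision standing in for whatever the real compositor does:

```python
def composite_butterfly(butterfly_depth: float, scene_depth_at_pixel: float) -> str:
    """Decide whether a virtual butterfly is hidden by something real.

    butterfly_depth: how far the virtual butterfly is from the camera (meters).
    scene_depth_at_pixel: depth the camera measured for the real scene at the
    same pixel (e.g., Dr. Marks's body or the wand).
    """
    if butterfly_depth > scene_depth_at_pixel:
        return "occluded"     # the real object is closer, so the butterfly hides behind it
    return "drawn on top"     # the butterfly is closer, so it's composited over the video

# Wand passed behind the back: the body is at 1.2 m, the butterfly trails at 1.5 m.
print(composite_butterfly(1.5, 1.2))  # occluded -- it looks like it flew behind him
```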
In his next demo, Dr. Marks moved around and a skeletal version of himself on the screen moved to match. He'd wave his arms and the skeleton would do the same. Physics was built into the simulation, so when he punched his arms forward, the skeleton punched, and it could hit objects around the virtual room. Because the camera was measuring depth, it could track where he was in 3D space -- standing in certain spots triggered certain actions, for instance. The EyeToy's motion tracking looks pretty primitive in comparison. Imagine the gaming possibilities of this kind of interface! Your whole body would literally be involved in the on-screen action, stepping into another character.
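That "standing in certain spots triggers certain actions" part boils down to checking the tracked player position against 3D volumes in the room. A minimal sketch, assuming made-up zone coordinates and a fake tracker reading in place of whatever the real SDK provides:

```python
from dataclasses import dataclass

@dataclass
class Zone:
    name: str
    min_corner: tuple[float, float, float]  # x, y, z in meters, relative to the camera
    max_corner: tuple[float, float, float]

def active_zone(player_pos: tuple[float, float, float], zones: list[Zone]) -> str | None:
    """Return the name of the first zone the tracked player is standing in."""
    x, y, z = player_pos
    for zone in zones:
        (x0, y0, z0), (x1, y1, z1) = zone.min_corner, zone.max_corner
        if x0 <= x <= x1 and y0 <= y <= y1 and z0 <= z <= z1:
            return zone.name
    return None

zones = [
    Zone("open_door", (-0.5, 0.0, 1.0), (0.5, 2.0, 1.5)),
    Zone("pull_lever", (1.0, 0.0, 2.0), (2.0, 2.0, 2.5)),
]
# Tracker says the player is roughly centered, about 1.2 m from the camera.
print(active_zone((0.1, 1.0, 1.2), zones))  # open_door
```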