HL2 + Johnny Lee head tracking effect + faceAPI = OMG

quadriplegicjon said:
do you guys really want to be flicking your head to the side and doing all that crap for a full game? i can see how it would be fun for a little while, but any extended play would get incredibly annoying. imo.
I'm going to load the coast and Nova Prospekt levels into the mod sometime today and see how bad the strain gets.

If I don't post in this thread again, I've probably broken my neck and died.

ninjavanish said:
When I load it up and hit "load test map" nothing happens
I had this issue when components of the mod didn't install in the right place. There's the mod folder in the SourceMods folder, but there also has to be new stuff added to the Source SDK Base folder.

If you have this stuff installed to Program Files/Valve/Steam, copy over the folders into the proper Steam folders (Program Files/Steam...).
 
I tried some distance tests with the face api demo from their site, using the pseye (just cos it's a better camera than my webcam).

At its default setting it's meant to only pick up faces close to it, but you can up the maximum range in the settings. I messed around with it a bit, and got it to pick up my face at what the software reported was 2.5 metres. The settings allowed you to go up to about 3m.

I can't speak for its tracking capability at that distance because the demo basically stops when it finds your face; it's a limited demo really, just to 'prove' it works. What it does appear to detect is pretty cool: your eyes, the outline of them, your eyebrow shape, your nose, your mouth and mouth shape, etc. Head rotation seems fine. If all of that works and is robustly trackable at 2-3m, that's pretty impressive. You could have facial expression mapping on characters in games and stuff.

More specifically, its specification says it requires a minimum distance of 24 pixels between the outer corners of the eyes in order to do the tracking, so maybe one could work out the maximum tracking distance for the typical head (for a given camera) based on that :p
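
If anyone wants to try that maths, here's a rough sketch using a simple pinhole-camera model. The camera field of view and the typical eye-corner separation are my own assumptions for illustration, not faceAPI or PS Eye specifications:

```python
import math

# Back-of-the-envelope estimate of maximum tracking distance from a
# "24 pixels between the outer eye corners" requirement.
def max_tracking_distance(image_width_px, hfov_deg, eye_sep_m, min_px=24):
    """Pinhole model: pixels_on_sensor = eye_sep * focal_px / distance."""
    focal_px = (image_width_px / 2) / math.tan(math.radians(hfov_deg) / 2)
    return eye_sep_m * focal_px / min_px

# 640px-wide image, ~56 degree horizontal FOV (assumed for one of the
# PS Eye's zoom settings), ~9cm between outer eye corners (assumed):
print(round(max_tracking_distance(640, 56.0, 0.09), 2))  # ~2.26 m
```

Which lines up reasonably with the ~2.5m the demo reported.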

There was another demo I tried that seemed to have a better distance allowance still, although it's hard to compare robustness since the faceAPI demo, like I say, cuts out after a couple of seconds.
 
Thanks for the help. I got it working. The tech is okay, but my hair is too long right now and keeps getting in my face and throwing off the tracking.
 
quadriplegicjon said:
do you guys really want to be flicking your head to the side and doing all that crap for a full game? i can see how it would be fun for a little while, but any extended play would get incredibly annoying. imo.

The head flicking to turn looks stupid for anyone with a mouse. I haven't got a cam, but I like the look of the rest of the ideas; most of them seem to translate to digital inputs in HL2, so obviously tweaking the game to accept analogue control over these features would be better.
 
Zaptruder said:
There's also all the things that conventional 3D games on 2D displays are already doing... but the other part that they've been missing out on is the parallax perspective shift, that this sort of head tracking enables.
This is not correct. There is parallax (no need to add any words to that) in every single 3D game on the market. All head-tracking does is link parallax to shifts in your head position. That makes it feel more natural, but you're not getting any visual effect that you weren't getting already.

And Johnny Lee didn't even come close to developing this technique. I experienced it myself at a live demonstration at my college in 1993 (though in that system the camera was on the head detecting the screen, rather than under the screen detecting the head).
 
beermonkey@tehbias said:
Hat, ponytail holder, duct tape.

Don't own a hat. Hair isn't nearly long enough to be in a ponytail.

It usually just goes back and isn't a problem but doing lots of head movements gets my hair in my face.

Looking for duct tape.
 
Gabyskra said:
"well natal does 3d scanning, could potentially put yourself in a game in 3d, actually, imagine that, with MK9... id love to rip my own head off"
I can't believe that people actually buy that argument. It does not. So many people have expectations that are too high; they are all going to be disappointed.

It is BS to call it "3D scanning", but pinging IR beams off of a body does give you an image map that could be translated into depth. That is a powerful function. I'm sure once Natal gets into homes there will be plenty of people disappointed by their expectations (and by the "EyeToy+" games that are bound to be the bulk of its support), but in the games that use Natal properly, I do think it's possible for some of those expectations to be met.
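
To illustrate why a per-pixel depth map is such a powerful function: it lets you pick the player out of the scene by depth alone, which is hard with a plain colour image. This is just a toy sketch with invented depth values and thresholds, not how Natal's actual pipeline works:

```python
# Each value is a per-pixel distance in metres, as an IR depth camera
# might report. The player sits closer than the background wall.
def segment_player(depth_map, near_m=0.8, far_m=3.0):
    """Mark pixels whose depth falls in an assumed 'player zone'."""
    return [[near_m <= d <= far_m for d in row] for row in depth_map]

scene = [
    [5.0, 5.0, 1.5, 5.0],   # 1.5m pixels = the player; 5.0m = the wall
    [5.0, 1.6, 1.6, 5.0],
]
print(segment_player(scene))
# [[False, False, True, False], [False, True, True, False]]
```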

I wish Sony had implemented that into PlayStation Eye; they had discussed the technology several years ago, but it must have been too expensive at the time.

gofreak said:
I think the question of tracking at sofa distances is an interesting one, although I think it applies equally to any camera. I don't see how a depth reading would solve that problem if it's primarily linked to resolution (which I'm guessing it is). I'd be curious to see Sony's implementation in the PS3 SDK since I presume they wouldn't release it if it wasn't usable at reasonable distance.

I’m also very much looking forward to demos of Natal and PSMC (to a lesser extent though, as the camera-based object tracking has been demonstrated plenty of times and can be done at home) in normal homes. Especially with Natal, I fear a lot of camera orienting (to set up the difference between shooting the family room versus shooting a dude on the couch) and focusing to get it right every time you play a game with different functions. The wow-factor of seeing Natal has well worn off for me after having some time to digest it, I’m very impressed with the number of tracking points it identifies but I’m not sure I saw anything in any of those body tracking demos that couldn’t be ported to PlayStation Eye.

We also have yet to hear with either system (as far as I know?) what resolution and framerate the cameras are capturing at (PS Eye can do 640x480@60Hz and 320x240@120Hz, and I imagine both resolutions would be optionally used by developers depending on the game; Project Natal apparently does its computations at 30fps, but at what resolution, and is that both cameras at 30 or just the IR camera?) The motion detection was laggy in the Jimmy Fallon demonstration, but the disastrous "avatar shoe" demo redeemed itself a bit in my eyes by how little lag there was on a simple modified application in the OS, so I don't really know how that will turn out.
 
Welp, plan foiled by some roadblocks: scripted sequences in several places in the HL2 maps are broken; they won't fire, and thus you can't really advance. Looks like the bsps have a few other dependencies I'm not aware of. The choreographed scenes are probably stored somewhere else in the game files.

Played for around half an hour, however, and regularly used head gestures (such as the ironsights, zoom, and spin). No neck strain I could detect, but the aforementioned issues ended my playthrough just before the first Gunship fight (never received the citizen greeting, and the briefing sequence never took place; Odessa just stood there holding the rocket launcher).
 
Kritz said:
I can't wait until software similar to this makes it into full retail games. Just the ability to have the camera rotate by me moving my head would be mindblowingly awesome. I'd love to see it in games like STALKER and Battlefield.

Isn't this what Natal COULD allow for?
 
Swifty said:
I own a TrackIR unit. Image recognition software like faceAPI and the software supporting Natal is going to make TrackIR obsolete. Why use the infrared spectrum and wearable reflectors when you can use the visible spectrum and not have to wear anything?

Though it makes me wonder why no one has coded these types of gestures and behavior already using the TrackIR API. faceAPI and TrackIR already have the same amount of head tracking data available to the programmer: pitch, yaw, and roll. Though, the gesture recognition capabilities of faceAPI blew my mind. I'm not sure if that's gonna be as relevant as head tracking though.
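
For anyone curious what gesture recognition on top of raw pitch/yaw/roll might look like, here's a minimal sketch of detecting a quick "head flick" from a stream of yaw angles. The sample rate and velocity threshold are made-up numbers, not anything from faceAPI or the TrackIR API:

```python
# Detect a fast yaw "flick" from successive head-yaw samples (degrees),
# as a tracker like faceAPI or TrackIR might report each frame.
def detect_flick(yaw_samples, dt=1/30, vel_threshold_dps=120.0):
    """Return 'left'/'right' if yaw velocity crosses the threshold, else None."""
    for prev, curr in zip(yaw_samples, yaw_samples[1:]):
        velocity = (curr - prev) / dt  # degrees per second
        if velocity > vel_threshold_dps:
            return "right"
        if velocity < -vel_threshold_dps:
            return "left"
    return None

print(detect_flick([0.0, 10.0, 25.0]))   # a sharp turn -> 'right'
print(detect_flick([0.0, 1.0, 2.0]))     # slow drift -> None
```

A real implementation would obviously need smoothing and a cooldown so one flick doesn't fire repeatedly, but the core idea is just thresholding angular velocity.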

EDIT: There are numerous TrackIR enabled games with head tracking features out on the PC already. Head tracking is pretty cool and definitely vital in games like ARMA II and various flight simulators. However, TrackIR is freaking expensive. For those who want to try out head tracking in PC games, there are other alternatives. One is FreeTrack which uses a regular webcam and IR reflectors to make a "poor man's" TrackIR.

But with faceAPI, you just need a webcam. Here's a video of a guy playing a TrackIR-enabled game by feeding faceAPI output into PPJoy, which in turn emulates the TrackIR drivers.

http://www.youtube.com/watch?v=eNE9FfFMeh0

He put the source code in the description. I might have some fun tinkering with it in Visual Studio. :D
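
The glue code in setups like that mostly boils down to scaling head angles into virtual joystick axis values. A tiny sketch of that mapping (the 30-degree range and signed 16-bit axis are my assumptions, not PPJoy's documented interface):

```python
# Map a head angle (degrees) onto a signed 16-bit virtual joystick
# axis, the sort of range a virtual-joystick driver typically expects.
def pose_to_axis(angle_deg, max_angle=30.0, axis_max=32767):
    """Clamp the angle to +/-max_angle, then scale linearly to +/-axis_max."""
    clamped = max(-max_angle, min(max_angle, angle_deg))
    return int(clamped / max_angle * axis_max)

print(pose_to_axis(0.0))    # centred -> 0
print(pose_to_axis(30.0))   # full right -> 32767
print(pose_to_axis(-45.0))  # beyond range, clamped -> -32767
```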

You have peripheral vision and you have eyeballs that can change direction. This isn't a problem at all when I play PC games with head tracking enabled.

I can imagine that TrackIR is more efficient because it only has to recognise a few fixed points. I'm kinda wondering the same thing though, as to why face tracking tech hasn't been made compatible with games like ARMA. Maybe there's not enough commercial incentive for doing so: it's hard to make any money off it when practically every webcam supports it.
 
Liabe Brave said:
This is not correct. There is parallax (no need to add any words to that) in every single 3D game on the market. All head-tracking does is link parallax to shifts in your head position. That makes it feel more natural, but you're not getting any visual effect that you weren't getting already.

And Johnny Lee didn't even come close to developing this technique. I experienced it myself at a live demonstration at my college in 1993 (though in that system the camera was on the head detecting the screen, rather than under the screen detecting the head).

I'm quite aware that Johnny Lee isn't the inventor of the technique. I'm simply saying that he's commonly credited with it (which he is).

And while I should've been clearer (or more technically accurate) with regards to the parallax comment, it's pretty obvious in the context of the discussion what I meant.

With regards to the bolded part, it's like saying that 3D won't give you a visual effect that you weren't already getting. The effect we're talking about is a huge part of how people perceive 3D (along with all the various other cues).

Botolf said:
You're gonna have to explain what you mean by parallax perspective thing (I don't follow). There was something in the video that looked like it allowed you to rotate and angle your game head independent of your gun.

Basically, if you haven't already got it, I'm talking about the effect that's kinda like making the monitor into a 'fish tank' (or a window). That is, look dead on, and it gives you one view. But move your head to the side and you get to see the angle and perspective change as you might expect a 3D object to change in viewing angle and perspective. Rightly or wrongly, I've been referring to this effect as the 'parallax perspective effect.'
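
For the curious, the maths behind that 'fish tank' effect is just an asymmetric viewing frustum recomputed from the head position every frame. A minimal sketch below: the screen size, near plane, and head positions are made-up illustrative numbers, and the four bounds it returns are the same left/right/bottom/top parameters OpenGL's glFrustum takes:

```python
# Off-axis ("fish tank") projection: compute asymmetric frustum bounds
# at the near plane for a viewer at (head_x, head_y, head_z), with the
# screen centred at the origin in the same physical units (metres here).
def offaxis_frustum(head_x, head_y, head_z, half_w, half_h, near):
    """Return (left, right, bottom, top) frustum bounds at the near plane."""
    scale = near / head_z  # similar triangles: near plane vs. screen plane
    left   = (-half_w - head_x) * scale
    right  = ( half_w - head_x) * scale
    bottom = (-half_h - head_y) * scale
    top    = ( half_h - head_y) * scale
    return left, right, bottom, top

# Head dead-centre, 0.6m from a 0.5m x 0.3m screen: symmetric frustum.
print(offaxis_frustum(0.0, 0.0, 0.6, 0.25, 0.15, 0.1))
# Head moved 10cm to the right: the frustum skews, shifting the view.
print(offaxis_frustum(0.1, 0.0, 0.6, 0.25, 0.15, 0.1))
```

Feed the head tracker's position into this each frame and the monitor behaves like a window.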
 