Oculus Rift - Dev Kit Discussion [Orders Arriving]

I was just reading a comment from someone saying that sniper with Hydra + Rift is a lot of fun. I'm wondering if there's a control mode where you aim with the Hydra sticks held in front of you, because that could be very cool.
 
Sniping in TF2 with the bow is so much fun; the added depth perception makes quick shots easier at most distances. The Demoman also benefits a lot from the Rift; it's even easier to place grenades wherever you want them to go.
 
None of the things you mentioned are conceptually different from what we have now. Screens will get higher-fidelity, sensors will become wireless, Moore's Law will keep cranking on. Of course, we can refine that experience for decades to come, but the next completely novel thing will be direct neural stimulation.

mmmmmmm
 
None of the things you mentioned are conceptually different from what we have now. Screens will get higher-fidelity, sensors will become wireless, Moore's Law will keep cranking on. Of course, we can refine that experience for decades to come, but the next completely novel thing will be direct neural stimulation.

No, I don't think that will ever be the case.

Likely, the best we'll get is just some method of projecting directly onto the retinas.
 
Likely, the best we'll get is just some method of projecting directly onto the retinas.

There is already considerable research and development in the area of direct brain interface technologies.

It's primarily targeted at people with disabilities at this point in time, but it's not a large stretch to think that, as the technology improves, we will see applications for able-bodied people as well.

Additionally, the problem is not intractable. Worst comes to worst, we can simply feed the brain the raw information and let neuroplasticity sort out the signal patterns. This is pretty much how our brain comes to learn in the first place, and it has been shown to provide a form of vision for blind people who have had a stimulating dot-matrix patch attached to the skin on their backs.
 
I'd almost want them to just pop in the A cups at the factory after blowing any dust out of the screen area. The amount of dust in that area in the video was driving me crazy, and I know I'll be in there with a can of compressed air if it turns out not to be dust resistant in any way.
 
HOLY COW SHIT

How to blow your mind:

1. Download UDK
2. Open one of the huge UT3 maps or Epic Citadel
3. Launch it, go to the console and type showhud, then slomo 8, then fly
4. Push forward on the controller
5. FIND YOUR FACE BECAUSE IT JUST MELTED OFF
 
No, I don't think that will ever be the case.
You shouldn't say such things. Maybe not in our lifetime. But that comes back to my original point that we're stuck with this basic configuration for a long-ass time, aside from incremental upgrades and screwing around with haptics. And maybe a little galvanic vestibular stimulation, if Palmer gets his way. :D
 
Can't wait to get my Rift.

Will provide detailed impressions (although I suspect I'll just say largely the same: fantastic experience, but a ways to go yet).

Got my Razer Hydra today. It's... pretty much as I expected; good sensors, accurate, and yet, a completely lacking experience in terms of implementation, and just a mismatch with the existing 2D display paradigm.

But it feels like a very solid, competent device for 3D motion control... and will lend itself especially well for VR.

Would've been disappointed with it if I had bought it to use with existing games. But I'm excited for its potential with VR.
 
Can't wait to get my Rift.

Will provide detailed impressions (although I suspect I'll just say largely the same: fantastic experience, but a ways to go yet).

Got my Razer Hydra today. It's... pretty much as I expected; good sensors, accurate, and yet, a completely lacking experience in terms of implementation, and just a mismatch with the existing 2D display paradigm.

But it feels like a very solid, competent device for 3D motion control... and will lend itself especially well for VR.

Would've been disappointed with it if I had bought it to use with existing games. But I'm excited for its potential with VR.
I was just thinking yesterday that I should see if it can work in a lightgun configuration, holding a single Hydra controller with both hands. I think the cord is long enough.

Another option would be movement controlled with the left analog stick, and the gun free-moving around the display with the right motion control. Are the hybrid modes mentioned by MotionCreator already doing this? I don't think I ever got them to work.
 
I was just thinking yesterday that I should see if it can work in a lightgun configuration, holding a single Hydra controller with both hands. I think the cord is long enough.

Another option would be movement controlled with the left analog stick, and the gun free-moving around the display with the right motion control. Are the hybrid modes mentioned by MotionCreator already doing this? I don't think I ever got them to work.

This vid breaks down the main modes you can use Hydra in:

http://www.youtube.com/watch?v=NQkATuxo2c8

It sounds like you're describing the Mouselook mode.
 
Q&A with Denny Unger of CloudHead Games about The Gallery: Six Elements. A game designed from the ground up for (but not exclusive to) the Oculus Rift.

A few quotes:

Denny Unger said:
A lot of people in the industry obsess about the hardware limitations but really, once you accept those limitations and work with what you have, it’s easy to create compelling experiences. The technology will get better, but what we have right now on both motion control and HMD fronts is absolutely enough to start making good games. And Indie developers can take those risks to break us out of the current gaming stagnation.

Denny Unger said:
You’ll need to consider how you are handling your camera interactions and how to hook that up to your animation sub-tree. Accounting for little things like how your eyes travel through space with a natural neck motion and an appropriate field of view. How much of your body you can see, what those body proportions are and how tall you are. We’ve found that it can be quite jarring to embody a skinny player or a tall player if that isn’t how you are in reality, so creating tools to adjust that in a pre-launch setting can be important. You want to make players feel comfortable in their virtual skin first and foremost.

Denny Unger said:
General body movement is another key area. You can’t get away with instant start/stop motions in VR. There needs to be subtle ease-in, ease-out and a sense of mass to the motions. Otherwise people feel like they are being violently accelerated and that can cause nausea issues.
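To make that ease-in/ease-out point concrete, here's a minimal sketch (mine, not from the interview; the smoothing time and speeds are just illustrative guesses) of easing player velocity toward the stick input instead of applying it instantly:

```python
# Minimal sketch (illustrative numbers, not from the interview): ease player
# velocity toward the input direction instead of starting/stopping instantly,
# to give motion a sense of mass and reduce VR discomfort.

import math

def ease_velocity(current_vel, target_vel, dt, smoothing_time=0.25):
    """Exponentially ease current_vel toward target_vel.

    smoothing_time is roughly the time constant (seconds) of the ease;
    the exponential form makes the feel independent of frame rate.
    """
    alpha = 1.0 - math.exp(-dt / smoothing_time)
    return current_vel + (target_vel - current_vel) * alpha

# Example: starting from rest, the player pushes the stick fully forward
# (1.0 m/s desired). Velocity ramps up smoothly over a few tenths of a second.
vel = 0.0
for frame in range(10):
    vel = ease_velocity(vel, 1.0, dt=1.0 / 60.0)
    print(f"frame {frame}: {vel:.3f} m/s")
```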

Denny Unger said:
Latency and frame rates are discussed often. Keeping your frame rate up is vital, but it’s also true that it’s a bit more forgiving than you might think. The higher the better of course, but we’ve found that anything in the range of 45-60 fps is workable.

A lot more where that came from. A very interesting read on game development for VR. Highly recommended.
 
I played around with my Rift some more -- standing up this time around, with a gamepad, and with correctly configured settings -- and experienced none of the nausea that I felt the first time around. I wish I had the time right now to do some actual development for it.
 
I played around with my Rift some more -- standing up this time around, with a gamepad, and with correctly configured settings -- and experienced none of the nausea that I felt the first time around. I wish I had the time right now to do some actual development for it.
Have you tried any of the UDK (with or without supersampling) stuff yet?
 
I was just thinking yesterday that I should see if it can work in a lightgun configuration, holding a single Hydra controller with both hands. I think the cord is long enough.

Another option would be movement controlled with the left analog stick, and the gun free-moving around the display with the right motion control. Are the hybrid modes mentioned by MotionCreator already doing this? I don't think I ever got them to work.

This vid breaks down the main modes you can use Hydra in:

http://www.youtube.com/watch?v=NQkATuxo2c8

It sounds like you're describing the Mouselook mode.

Mouselook is the only mode I use; I feel it's the best for FPS games. The downside is you have to use ratcheting to disable the right wand in order to recenter your wrist. But I use a very low sensitivity with my mouse and have to pick it up off the mat anyway, so I kind of liken it to that.
 
I think I've seen at least two comments talking about UDK feeling mushy and Unity feeling crisp. But I thought I saw someone here mention that UDK felt very low latency, so I'm not sure what the exact situation is. I noticed comments about latency testing in the UDK code, so maybe there will eventually be a way to measure it.

Weren't there some UDK engine updates to try to lower latency specifically for the Rift? I'm not sure how much it helped though.
 
Mouselook is the only mode I use; I feel it's the best for FPS games. The downside is you have to use ratcheting to disable the right wand in order to recenter your wrist. But I use a very low sensitivity with my mouse and have to pick it up off the mat anyway, so I kind of liken it to that.

I've been messing with some of the different modes the past few days. I actually think all of them are good choices. It seems like it essentially boils down to your preference for how ratcheting of the camera is done, whether it's manual with a button press, a stick turn, or some sort of motion. I'm probably going to end up sticking with the Hybrid mode. It works well, seems decently accurate, and is comfortable for me.

I also started experimenting with the Hydra's Sixense MotionCreator editor. I wanted to add support for a game, Dead Rising 2: Off the Record, and the process to do so wasn't as horrible as I had thought it would be. I hadn't even realized that they had included this editing ability in there. It took a little while, but I think it's working fine with the Hydra now.
 
Isn't the nausea from the fact that you're making unnatural movements with the controller? Your body isn't used to those kinds of movements because that's not how it moves, so it's tricking itself, right?
 
Actually, after looking through all the modes with that video, I feel like none of them offer what I was wanting to try, which is keeping the view steady and only moving the gun arm around. =P As in, your target reticule would move around the screen but the left analog stick would move the view itself.
 
Apparently I was right last week when I said that the Rifts on eBay were being trolled by some guy who wouldn't pay. People saw the price that auction supposedly ended at, figured Rifts were going for a grand, and the next person that came along had a bidding war on his hands that ended at $1,350, which apparently was actually paid.

If that isn't a grand trolling, I don't know what is.
 
I was just thinking yesterday that I should see if it can work in a lightgun configuration, holding a single Hydra controller with both hands. I think the cord is long enough.

Another option would be movement controlled with the left analog stick, and the gun free-moving around the display with the right motion control. Are the hybrid modes mentioned by MotionCreator already doing this? I don't think I ever got them to work.

Something like the GunCon 3 would seem like a great option if it had PC support

[Image: GunCon 3]
 
Isn't the nausea from the fact that you're making unnatural movements with the controller? Your body isn't used to those kinds of movements because that's not how it moves, so it's tricking itself, right?

In part, yeah, but I've read that it can also be just as nauseating if you don't properly configure the Rift's inter-pupillary distance, which is only exacerbated by the unnatural movements mismatching in your mind. So it's not just the control scheme; there are a couple of things on top of that.

Nice first-day impressions, especially of Museum of the Microstar http://www.youtube.com/watch?v=9qqH_L8H-7s

Ugh, watching that just made me want it even more. I played MotM without the Rift and could instantly tell it'd be awesome in the Rift, but seeing people talk about it just makes me so damn impatient.
 
Actually, after looking through all the modes with that video, I feel like none of them offer what I was wanting to try, which is keeping the view steady and only moving the gun arm around. =P As in, your target reticule would move around the screen but the left analog stick would move the view itself.

I'd be curious to move the strafe inputs off the main analog sticks or keys and onto secondary buttons (e.g. LB/RB or Q/E). Strafing isn't a natural human movement, but it has become common in games because it is so convenient. I'd like to try making the left analog stick forward/back/turn, like tank controls, and have 'side step' buttons which don't move you continuously left/right, just one step each time you press, so it's like a quick evasive move (roughly sketched below).

Then the head for looking around, detached gun aim with hydra or mouse.
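
For what it's worth, here's a rough sketch of the one-step-per-press idea; the step distance and duration are just my own placeholder numbers:

```python
# Toy sketch (placeholder numbers): instead of continuous strafing, a button
# press queues a fixed-length side step that plays out over a short time,
# like a quick evasive move.

STEP_DISTANCE = 0.8   # metres moved per press
STEP_DURATION = 0.25  # seconds the step takes

class SideStep:
    def __init__(self):
        self.time_left = 0.0
        self.direction = 0.0   # -1 = left, +1 = right

    def press(self, direction):
        # Ignore presses while a step is still playing out.
        if self.time_left <= 0.0:
            self.direction = direction
            self.time_left = STEP_DURATION

    def update(self, dt):
        """Return the lateral offset (metres) to apply this frame."""
        if self.time_left <= 0.0:
            return 0.0
        dt = min(dt, self.time_left)
        # Constant speed here; an ease-out curve would feel less abrupt.
        offset = self.direction * STEP_DISTANCE * (dt / STEP_DURATION)
        self.time_left -= dt
        return offset

# Example: one press to the right, simulated at 60 fps.
step = SideStep()
step.press(+1)
total = sum(step.update(1.0 / 60.0) for _ in range(30))
print(round(total, 3))   # ~0.8 metres over a quarter of a second
```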

Museum of the Microstar looks great, definitely trying that when my Rift eventually turns up.
 
Pretty much any content that comes out with Rift support will get my money. I'll try every third party mod that I can get my hands on as well.

I'm surprised we haven't seen any Mirror's Edge or Skyrim let's plays yet. Does anyone know where the support level is on either of those? Is there some central place that collates the status of various games?

Edit: I found a video of someone testing a video player in the Rift (just as a proof of concept I guess). What video did he show, you ask? A My Little Pony fan video set to the theme from Card Captor Sakura. Classic.

I want to know the porn implications of this; I know that video is going to look like hot garbage through the Rift until at least the consumer model (if not consumer model v2) but still.
 
Pretty much any content that comes out with Rift support will get my money. I'll try every third party mod that I can get my hands on as well.

I'm surprised we haven't seen any Mirror's Edge or Skyrim let's plays yet. Does anyone know where the support level is on either of those? Is there some central place that collates the status of various games?

Edit: I found a video of someone testing a video player in the Rift (just as a proof of concept I guess). What video did he show, you ask? A My Little Pony fan video set to the theme from Card Captor Sakura. Classic.

I want to know the porn implications of this; I know that video is going to look like hot garbage through the Rift until at least the consumer model (if not consumer model v2) but still.

A quick look at the MTBS3D forums tells me:

Mirror's Edge has head tracking. I've been told it mostly works with the Rift dev kit, but the key to adjust the zoom isn't working.

and

Did you change the FOV settings in Skyrim? Increasing the FOV to 120 really helps things. The only thing that I find odd is the distortion of the image when you tilt your head side to side. Other than this it looks great and the tracking is great!!!!

So from that, it looks like Skyrim has some solid support with a few bugs and Mirror's Edge is also almost working, but with a few bugs.
 
So from that, it looks like Skyrim has some solid support with a few bugs and Mirror's Edge is also almost working, but with a few bugs.

It sounds like support should be ironed out by the time I get a Rift later this month / early next month (hopefully).

Thanks.
 
I played around with my Rift some more -- standing up this time around, with a gamepad, and with correctly configured settings -- and experienced none of the nausea that I felt the first time around. I wish I had the time right now to do some actual development for it.
Then what are you doing with the DevKit? Bring it over to Vienna! :D
Btw, was there any response from the Oculus team about the AT <-> AU mismatch? Hopefully my Rift isn't going to Vienna/AU. lol
 
Actually, after looking through all the modes with that video, I feel like none of them offer what I was wanting to try, which is keeping the view steady and only moving the gun arm around. =P As in, your target reticule would move around the screen but the left analog stick would move the view itself.

I've only had the hydra for a few days, but I'm pretty certain you can set this configuration using sixense's software.

However, the sticks on the hydra are not analogue as they're designed to replace WASD movement.
 
Actually, after looking through all the modes with that video, I feel like none of them offer what I was wanting to try, which is keeping the view steady and only moving the gun arm around. =P As in, your target reticule would move around the screen but the left analog stick would move the view itself.

If I can gather the required programming/scripting chops...

The control scheme I want to try is:

1. Left motion controller: yaw left/right = slow turning; tilt left/right = moderate turning; yaw + tilt left/right = fast turning (see the sketch after this list).

2. Right Analog stick = turning left and right.

3. Left analog stick = set movement heading.

4. Knee raise + foot fall = forward movement (essentially walking on the spot - with movement synced to the movement of your legs).

5. Right motion = right hand

6. Right trigger = Grasping/action

7. Left Analog Stick button = toggle use controller as a hand. (Left trigger becomes active as grasping/action as well).

8. Kinect for knee and foot tracking = alternating foot falls will produce forward motion in sync with the expected variable-rate motion we experience in reality while walking. The higher you raise your knees, the longer the stride. The faster your foot falls, the quicker the strides.

9. Additional motion controller (hillcrest or some such) for translational head tracking.

I think that'll probably be about as good and intuitive as we can get it for a while.
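
And for item 1, here's a minimal sketch of how the yaw/tilt combination could map to turn speed; the thresholds and rates are just placeholder guesses on my part:

```python
# Toy mapping (placeholder thresholds/rates) for item 1: yaw alone turns
# slowly, tilt alone turns at a moderate speed, yaw + tilt turns quickly.

SLOW_DEG_PER_S = 30.0
MODERATE_DEG_PER_S = 60.0
FAST_DEG_PER_S = 120.0
DEADZONE_DEG = 10.0   # ignore small unintentional rotations

def turn_rate(yaw_deg, tilt_deg):
    """Return a signed turn rate (deg/s) from controller yaw and tilt angles.

    Positive angles mean 'to the right'; if yaw and tilt disagree, yaw wins.
    """
    yaw_active = abs(yaw_deg) > DEADZONE_DEG
    tilt_active = abs(tilt_deg) > DEADZONE_DEG
    if yaw_active and tilt_active:
        rate = FAST_DEG_PER_S
        direction = 1.0 if yaw_deg > 0 else -1.0
    elif tilt_active:
        rate = MODERATE_DEG_PER_S
        direction = 1.0 if tilt_deg > 0 else -1.0
    elif yaw_active:
        rate = SLOW_DEG_PER_S
        direction = 1.0 if yaw_deg > 0 else -1.0
    else:
        return 0.0
    return direction * rate

# Example: yaw 25 degrees right with no tilt -> slow right turn.
print(turn_rate(25.0, 0.0))    # 30.0
print(turn_rate(25.0, 20.0))   # 120.0
```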
 
Anybody consider combining Oculus VR with binaural audio for a project?

Is there a difference between binaural and stereo?

Edit: Having looked it up...no...I don't think anyone is going to go out of their way to record audio with two microphones spaced at the average distance between the human ears. I'm sure some games try to replicate this, or at least try to find a balance between generic stereo and binaural audio.

this is binaural, http://www.youtube.com/watch?v=IUDTlvagjJA

now you know the answer ;)

but the binaural question was handled in the Rift panel @ SXSW or whatever that show was called, and they concluded that it would be awesome but the computations are extremely hard and the result is very personal, i.e. what sounds natural for one person is "uncanny valley" territory for the other

I'm using Dolby Headphone right now (Xonar DGX) and quite pleased with that, but it only does horizontal surround so no heights ... Creative has a bit more advanced algorithm but only on their most expensive cards

I haven't tried surround sound processing since the early days of creative. Have they gotten any better at positional audio? I used to turn it right off in games like Unreal Tournament.

I'll bet Creative (or Dolby) charges quite a bit if you want to implement their tech in your games.
 
Is there a difference between binaural and stereo?

this is binaural, http://www.youtube.com/watch?v=IUDTlvagjJA

now you know the answer ;)

but the binaural question was handled in the Rift panel @ SXSW or whatever that show was called, and they concluded that it would be awesome but the computations are extremely hard and the result is very personal, i.e. what sounds natural for one person is "uncanny valley" territory for the other

I'm using Dolby Headphone right now (Xonar DGX) and quite pleased with that, but it only does horizontal surround so no heights ... Creative has a bit more advanced algorithm but only on their most expensive cards
 
Shouldn't that stuff be super easy to code into a game? You know where the player is and where the sound is coming from, so you could calculate the time delay you need between the stereo channels.
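
Something like this sketch is what I mean by calculating the delay (my own simplification: it only does the time difference and ignores head shadowing and ear shape, which is exactly what the fancier solutions add):

```python
# Toy sketch of delay-only panning: given 2D positions (x forward, y to the
# listener's left) and the direction the listener faces, compute the
# interaural time difference (ITD) to apply between the stereo channels.
# Ear spacing and the "no head shadowing" simplification are assumptions.

import math

SPEED_OF_SOUND = 343.0   # m/s
EAR_SPACING = 0.18       # metres between the ears (rough average)

def interaural_delay(listener_pos, listener_facing_deg, source_pos):
    """Return (delay_left_s, delay_right_s): extra delay per channel."""
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    # Angle of the source relative to where the listener is facing
    # (+90 degrees = directly to the listener's left).
    source_angle = math.degrees(math.atan2(dy, dx)) - listener_facing_deg
    # Extra path length to the far ear, capped by the ear spacing.
    path_diff = EAR_SPACING * math.sin(math.radians(source_angle))
    itd = abs(path_diff) / SPEED_OF_SOUND
    # Delay the ear that is farther from the source.
    if path_diff > 0:    # source on the left -> right ear hears it later
        return (0.0, itd)
    return (itd, 0.0)

# Example: a source 5 m directly to the listener's left delays the right
# channel by roughly half a millisecond.
print(interaural_delay((0.0, 0.0), 0.0, (0.0, 5.0)))
```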
 
I haven't tried surround sound processing since the early days of creative. Have they gotten any better at positional audio? I used to turn it right off in games like Unreal Tournament.

I'll bet Creative (or Dolby) charges quite a bit if you want to implement their tech in your games.

Like I said, I'm very happy with Dolby Headphone in games like Red Orchestra 2 and BF3 ... I've got a Xonar DGX (with the UNi drivers) --> Headphone Amp --> Sennheiser 595 setup, and the positioning and sound quality are great :) movies sound awesome too

On the PC side, the technologies are only implemented in hardware/drivers, not in the games themselves (unlike some console titles).

Dolby Headphone mixes 5.1 sound back down to stereo. Creative CMSS-3D does the same, with some extra height elements in certain titles I believe, but it depends on the drivers/game... It's at least as good as Dolby Headphone, but the latter (in my experience) always works in any surround sound game without additional settings. Anyway, the Asus Dolby Headphone card was €40, so I grabbed that one.

A game like BF3 has positional audio in software (to be used when you use stereo headphones) but it can't compete with Dolby Headphone or the Creative solution (check out some videos on youtube to compare).

So yeah positional audio has gotten better, but there is unfortunately no uniform standard ... Dolby Headphone is the most compatible, but there could be much better solutions if there had been better standards (in DirectX for example)

Google around if you want to know more... plenty of posts on sites like Head-Fi, probably :p

Is there a difference between binaural and stereo?

Edit: Having looked it up...no...I don't think anyone is going to go out of their way to record audio with two microphones spaced at the average distance between the human ears. I'm sure some games try to replicate this, or at least try to find a balance between generic stereo and binaural audio.

Shouldn't that stuff be super easy to code into a game? You know where the player is and where the sound is coming from, so you could calculate the time delay you need between the stereo channels.

well, that's the problem, right ... you need to model not only the position of the ears, but also the shape of the ears themselves, as they all lead to different sounds (reverb, that kind of stuff)

so to actually model, live and on the fly, the sound a certain set of human ears would capture, you'd need a really, really sophisticated audio engine ...

and as there have been virtually no improvements in audio quality in the last decade (DTS, Dolby, Dolby Headphone, and Creative CMSS are all pretty old techniques by PC standards), I seriously doubt we will have something in a year or two. But hey, once all the visual issues with VR and the displays it uses have been conquered, audio is probably the next field of interest :)
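
to illustrate the per-ear modelling in the simplest possible terms, here's a toy sketch; the "HRIRs" here are fake stand-ins (real ones come from measured datasets such as MIT KEMAR, and they're personal, which is the whole problem):

```python
# Toy sketch of binaural rendering: convolve a mono source with a per-ear
# head-related impulse response (HRIR). The impulse responses below are fake
# stand-ins; measured HRIRs also encode the ear's frequency response.

import numpy as np

def toy_hrir(delay_samples, gain, length=64):
    """Stand-in impulse response: a single delayed, attenuated spike."""
    h = np.zeros(length)
    h[delay_samples] = gain
    return h

def render_binaural(mono, hrir_left, hrir_right):
    """Convolve a mono signal with left/right impulse responses."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=-1)

# Example: a short click rendered as if it came from the listener's right:
# the left ear hears it slightly later and quieter.
click = np.zeros(256)
click[0] = 1.0
out = render_binaural(click,
                      hrir_left=toy_hrir(delay_samples=25, gain=0.6),
                      hrir_right=toy_hrir(delay_samples=0, gain=1.0))
print(out.shape)   # (319, 2) -- stereo output, lengthened by the convolution
```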
 
Anybody consider combining Oculus VR with binaural audio for a project?
I'm interested in real-time computation of 3D sound (wave) propagation for video games. Actually, I'm looking for a mathematician specializing in partial differential equations (PDEs, as well as FEM theory) who will work with me on such a model for video games. If someone is interested in exploring new fields, hit me up.
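
To give an idea of the kind of computation I mean, here's a tiny finite-difference (FDTD, not FEM) sketch of the 2D acoustic wave equation; a real model would be 3D, handle geometry properly, and have to run fast enough to feed the audio output in real time:

```python
# Minimal FDTD sketch of the 2D acoustic wave equation on a square grid,
# with an impulsive point source and p = 0 clamped at the walls (a crude
# boundary condition). Grid size and spacing are illustrative.

import numpy as np

C = 343.0                    # speed of sound, m/s
DX = 0.05                    # grid spacing, m
DT = DX / (C * np.sqrt(2))   # time step at the 2D stability (CFL) limit
N = 200                      # grid is N x N cells (10 m x 10 m)
STEPS = 300

courant2 = (C * DT / DX) ** 2

p_prev = np.zeros((N, N))
p = np.zeros((N, N))
p[N // 2, N // 2] = 1.0      # impulsive source in the middle of the room

for _ in range(STEPS):
    # Discrete Laplacian via neighbour sums (leapfrog time stepping).
    lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
           np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4.0 * p)
    p_next = 2.0 * p - p_prev + courant2 * lap
    p_next[0, :] = p_next[-1, :] = p_next[:, 0] = p_next[:, -1] = 0.0
    p_prev, p = p, p_next

# Pressure sampled at a "listener" cell; streaming many such samples per
# second to the sound card is the (expensive) real-time version of this idea.
print(p[N // 2 + 40, N // 2])
```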
 
Man, despite all the imperfections of the kit (motion sickness, low resolution, etc.), I'm really tempted to buy one.

Anyone dare to guess how long before the next version is released?
 
Man, despite all the imperfections of the kit (motion sickness, low resolution, etc.), I'm really tempted to buy one.

Anyone dare to guess how long before the next version is released?

The roadmap (which appears to be what they're planning and hoping for, not carved in stone) says Q3. Since the backlog runs until maybe June, it might be a good idea to just sit tight for a bit.
 
I really wonder what the shipping hold up is, why the batches are so small and why they only send out one a week, if they really have 10k units completed as has been reported. Surely negotiating a bulk shipping agreement isn't that difficult. They absolutely need to be more transparent about this.
 
I really wonder what the shipping hold up is, why the batches are so small and why they only send out one a week, if they really have 10k units completed as has been reported. Surely negotiating a bulk shipping agreement isn't that difficult. They absolutely need to be more transparent about this.
They have been a bit more transparent. They did an update (can't remember if it was on the Kickstarter or elsewhere) that talked about the shipping order, international shipping, and I think some of the reasons for the different things they're doing, and it mentioned that the delay is at the factory and that they did NOT have 10k units already assembled, as was (if you believe them) erroneously reported.
 
I really wonder what the shipping hold up is, why the batches are so small and why they only send out one a week, if they really have 10k units completed as has been reported. Surely negotiating a bulk shipping agreement isn't that difficult. They absolutely need to be more transparent about this.

They do not. As per their international shipment update...

Right now, the main bottleneck is the number of units coming off the line at the factory. We don't have a huge stock of units sitting in our warehouse waiting to ship out. As soon as we have enough units ready to ship a large batch, they ship out the same day.

I'm not sure where everyone is getting this figure of 10,000 units completed; it's simply not true.
 