
The High-end VR Discussion Thread (HTC Vive, Oculus Rift, Playstation VR)

Froli

Member
RIPmotion: VR running in place locomotion

Anyone seen this? It seems much better than teleportation movement!

[GIF: RIPmotion running-in-place locomotion demo]
 

spekkeh

Banned
Does it solve any of the motion sickness issues though? Because you are still not really moving so I would assume this solves nothing.

Yeah, might as well use the trackpad. At best it narrows the disconnect between standing still while moving and actually walking around, but it won't help much against sickness.
 

spekkeh

Banned
No, but I do have a number of peer-reviewed articles on cybersickness to my name.
Which admittedly says little more than I'm not a total dunce on the subject, but still.
 

bloodydrake

Cool Smoke Luke
No, but I do have a number of peer-reviewed articles on cybersickness to my name.
Which admittedly says little more than I'm not a total dunce on the subject, but still.

What would it hurt to try it before dismissing it? He's made it available to try.
 
I have concerns that I would like to see you try to address. The more you go on, the more I can dissect all the faults in your thinking. Are you scared to go down the rabbit hole?
Except you’re not dissecting anything. You’re just spouting nonsense and demonstrating your fundamental misunderstanding of VR. For example…

“Teleporting doesn't make me nauseous" is proof that teleporting doesn't make a person nauseous.
"Ratcheting sounds ridiculous" is NOT proof that ratcheting doesn't make people nauseous…

If I pushed off a wall that wasn't there in VR, I wouldn't FEEL my body going in the opposite direction. I'd just SEE it going in the opposite direction. Seeing locomotion without feeling it is the #1 cause of nausea in VR.
https://youtu.be/godnB5PhoDU?t=4m30s

See, this is why people are hostile towards your idea. You're pulling these claims straight out of your ass, with no direct experience with your proposed locomotion technique. I can tell you right now that any VR head movement that you don't physically feel IN YOUR HEAD, will cause nausea. You can't just shift that feeling to your hand and expect your brain to just magically rewire.
I still can’t find the specific video* where he refers directly to camera control, so hopefully this will do. It’s specifically about ratcheting again, but he talks about similarly violent camera movement.
https://youtu.be/OcyKhtHOc5U?t=24m30s

*YouTube needs searchable transcripts ><

Again, nausea isn't caused by a disconnect with what your hands touch - it's caused by your pesky inner ear detecting false movement.
Except your brain doesn’t consider the results to be "false," because those results are a direct product of the instructions being sent to the hand. Your brain thinks, “I’d like to be closer to tha… <Burns>Excellent.</Burns>” Your brain is actually quite comfortable with the idea of producing results with the hands. Most of your brain is devoted to it, in fact.
Dig those tiny feet.

I’ve told you before Walkabout doesn't require you to reverse to gain open space- you simply pivot to face a spacious direction in your room, while in-game you're still facing the chest. The long-winded essay that you continued to write has just been voided. *frowny_face
Yeah, my bad; I was watching it while I was doing something else and mostly just noticed that he spent a lot of time in the CB, but I guess he was demoing, so whatever. Anyway, it would be a lot more efficient if a button tap would cast rays from the helmet to find “the far end” of the CB, and simply snap-rotate the VE so the chest then lies along that path. Then the fairy dust can appear to tell the user which way to turn to relocate their loot without strangling themselves. That’d be faster, and probably less disruptive.
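Something like this, say (a back-of-the-napkin Python sketch, taking "CB" as a rectangular chaperone bounds; every name and number here is made up):

```python
import math

def snap_rotate_ve(head_xz, cb_half_extents, chest_dir_xz, samples=72):
    """Toy sketch: find the real-world yaw with the most open floor
    space inside a rectangular chaperone bounds (CB), then return the
    rotation to apply to the virtual environment (VE) so the in-game
    direction to the chest lines up with that open stretch."""
    hx, hz = cb_half_extents
    px, pz = head_xz

    def free_distance(theta):
        # Distance from the head to the nearest CB wall along theta.
        dx, dz = math.cos(theta), math.sin(theta)
        t = math.inf
        if dx > 1e-9:
            t = min(t, (hx - px) / dx)
        elif dx < -1e-9:
            t = min(t, (-hx - px) / dx)
        if dz > 1e-9:
            t = min(t, (hz - pz) / dz)
        elif dz < -1e-9:
            t = min(t, (-hz - pz) / dz)
        return t

    # "Cast rays" all around the room, keep the most spacious direction.
    best = max(range(samples),
               key=lambda i: free_distance(2 * math.pi * i / samples))
    best_theta = 2 * math.pi * best / samples
    chest_theta = math.atan2(chest_dir_xz[1], chest_dir_xz[0])
    return best_theta - chest_theta  # yaw to snap-rotate the VE by
```

The fairy dust would then just point the user along best_theta after the snap.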

Just answer the question :/ What is the reason you claim that a Move sphere can track better than a theoretical Move-sphere-sized Touch constellation pattern?
Much like stars, actually, the comparatively small IR markers get lost easily, especially at longer ranges where they fill less of the camera’s view. So they use widely spaced arrangements of markers with a fair bit of redundancy, so when individual stars inevitably twinkle out of sight, they can still recognize the constellation by the stars which are visible tonight… err, visible this frame. So there might be a dozen LEDs on the front of the mask because during any given frame, they’re only gonna be able to see a random selection of six or ten of them. But, that’s enough for them to say, “Nah, that’s definitely Orion, cuz there’s the belt… yeah… and see Betelgeuse and Bellatrix up above?” Hence the name.
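If you want that idea in code form, a toy version might look like this. It's definitely not Oculus's actual pipeline, just the general shape of the redundancy trick, leaning on OpenCV's solvePnP for the pose math; the LED layout and visibility threshold below are invented:

```python
import numpy as np
import cv2  # OpenCV's solvePnP does the actual pose recovery

# Invented constellation: LED id -> 3D position on the device (metres).
# A real headset has dozens of these; this layout is purely illustrative.
LED_MODEL = {
    0: (0.00, 0.02, 0.05),  1: (0.03, 0.01, 0.04),  2: (-0.03, 0.01, 0.04),
    3: (0.05, -0.01, 0.02), 4: (-0.05, -0.01, 0.02), 5: (0.00, -0.03, 0.05),
    6: (0.02, 0.04, 0.03),  7: (-0.02, 0.04, 0.03),
}
MIN_VISIBLE = 4  # assumed minimum for a usable PnP solution

def pose_from_visible_leds(visible, camera_matrix):
    """visible: {led_id: (u, v) pixel coords} for the LEDs identified
    this frame. Any MIN_VISIBLE+ subset of the model is enough; the
    rest can 'twinkle out' without losing the pose."""
    if len(visible) < MIN_VISIBLE:
        return None  # too few stars tonight; coast on the IMU instead
    ids = sorted(visible)
    obj = np.array([LED_MODEL[i] for i in ids], dtype=np.float64)
    img = np.array([visible[i] for i in ids], dtype=np.float64)
    ok, rvec, tvec = cv2.solvePnP(obj, img, camera_matrix, None)
    return (rvec, tvec) if ok else None
```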

As for why Move is "better," do you generally find it easier to spot Orion or an internally lit blimp of similar diameter in the night sky? "Easier to spot" is "better" when we're talking about trying to spot stuff.


Regarding the glide idea, I think it would rely on pretty specific game conditions to be useful, so it's probably not a good general fit, but could still be fun as a novel input method for a game built around it.
Specific in what way? It would basically just serve as an auto-run function. Or instead of skates-on-ice, make it socks-on-linoleum and have it peter out after a short distance, while still allowing them to cover twelve paces with three good pushes, for example. Really, any kind of momentum you tack on at the end will help hint to the player that they’re moving through the environment. But as long as the user is setting the velocity “manually” and it then drops off at a predictable rate — even 0 — that should be pretty comfortable, I suspect.
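Something like this minimal sketch is all I'm imagining (one invented friction constant; the velocity only ever decays after the push):

```python
import math

FRICTION = 2.5     # 1/s; invented "socks on linoleum" decay constant
STOP_SPEED = 0.05  # m/s; below this, snap to a full stop

def glide_update(vel, dt):
    """One frame of post-push gliding. vel is (vx, vz) in m/s. The
    player sets vel "manually" with a push; afterwards it only ever
    decays, at this fixed and predictable rate."""
    decay = math.exp(-FRICTION * dt)
    vx, vz = vel[0] * decay, vel[1] * decay
    if math.hypot(vx, vz) < STOP_SPEED:
        return (0.0, 0.0)  # petered out
    return (vx, vz)
```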

Edit: just saw the one you linked, https://youtu.be/MtY12ziHuII?t=2m38s - good example. I think it would be pretty difficult to move without accidentally causing motion sickness if the grip applied to full 6DOF movement like that. Problem stems from the relative ease of moving your lightweight hand without having a real body mass behind it. It would be sensitive to pretty fine hand movements which would make your vision tremble. Would wind up needing to have some smoothing applied I think, which could introduce latency.
Well, like I said, we can constrain the user’s movements arbitrarily. We just need to keep it reasonably predictable, like introducing appropriate planar restrictions. You have 6DOF control in the pool, but if you’re holding on to the wall at the ice rink, you can really only change your x, y, and yaw. See what I mean?
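In code terms, the planar restriction is just a per-context mask over the hand-driven 6DOF delta. A trivial sketch, with axis names straight from the example above:

```python
# Per-context DOF masks: which components of a hand-driven 6DOF delta
# the game lets through. Names follow the pool/rink example above.
POOL      = dict(x=1, y=1, z=1, roll=1, pitch=1, yaw=1)  # full 6DOF
RINK_WALL = dict(x=1, y=1, z=0, roll=0, pitch=0, yaw=1)  # planar + turn

def constrain(delta, mask):
    """Zero out whichever movement components the context disallows."""
    return {axis: delta[axis] * mask[axis] for axis in delta}
```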

The grip idea would work well for rotating a 3D model while working on it with your other hand in a VR editor. Just could get a bit unpleasant if that extended to direct control over moving your whole universe in the same way. Comfort should be fine with 2D surface movement though.
I actually found a video where Rick Marks and Doc_Ok were talking about how awesome and useful ratcheting was, and how sad it was that nobody would even give it a chance. Well, it was more like Rick mentioned ratcheting is a thing, and Doc_Ok immediately started gushing about it (after politely waiting for Rick to finish). He was describing a little more freedom than you’d probably want to give players in a “typical game situation,” but like I said, he was pretty excited someone else finally brought it up. Doc concludes with, “Everybody frowns upon it, but once they try it, everybody loves it. It’s just the most natural thing.”
https://youtu.be/ojuqO0wzweE?t=37m3s

Worth acknowledging as an input concept, and there are some applications where it has a place. Just don't try to stretch it too far, I'd say.
Well, what’s nice about it is that as Doc_Ok points out, the technique itself allows for some crazy and useful possibilities in the abstract, and I think we can simply allow/disallow certain techniques at certain times so that player locomotion is only allowed in non-game-breaking-yet-predictable ways. For example, we could allow players to simply fling themselves into the air like Thor, or we can make it so that gravity still affects them and they plop right back to Earth as soon as they release the button. I think the flexibility there may be one of its greatest strengths, really.
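And the core of the technique itself is tiny. Here's a toy sketch (everything invented for illustration), where the gravity flag is exactly that Thor-vs-plop design switch:

```python
class RatchetLocomotion:
    """Toy grab-the-world ("ratcheting") movement. While grip is held,
    the playspace is dragged opposite the hand's motion, so pulling
    your hand back pulls you forward. The gravity flag is the design
    switch: False = fling yourself like Thor, True = plop back down."""

    def __init__(self, apply_gravity_on_release=True):
        self.anchor = None             # hand position at grip-press
        self.offset = [0.0, 0.0, 0.0]  # playspace offset in the world
        self.apply_gravity = apply_gravity_on_release

    def grip_down(self, hand_pos):
        self.anchor = list(hand_pos)

    def grip_held(self, hand_pos):
        if self.anchor is None:
            return
        for i in range(3):
            # World moves opposite the hand: hand back => player forward.
            self.offset[i] -= hand_pos[i] - self.anchor[i]
        self.anchor = list(hand_pos)

    def grip_up(self):
        self.anchor = None
        if self.apply_gravity:
            # Crude: a real game would animate the fall back to the floor.
            self.offset[1] = 0.0
```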

You should watch the rest of the panel, because it’s pretty good stuff. They talk at some length about the power and flexibility provided by having a pair of 6DOF controllers, and we’ve really just begun to tap it. But since we control the functionality we expose to the users, we can offer them the simple ability to reach out and pull themselves forward as Rick suggested, or the crazy and convenient travel-via-world-scaling that Doc_Ok described, or anything in between.

But at the end of the day, we need to figure out how to use these powerful input devices to provide users with enough functionality to ensure that they always have an effective way to move from A to B, but not so much functionality they turn into game-breaking gods … unless that’s your game, of course. While it’s probably not the ultimate solution for all problems, I think giving users the ability to simply reach out and say, “I’d like to be right here,” is a reasonable enough place to start. Once we have some basic control over positioning, perhaps similar mechanics could be used to control traversal, or maybe we can think of better ways to use the 6DOF to describe that. The “best way” will likely depend on the type of traversal we’re trying to replicate, whether walking, climbing, swimming, flying, or whatever.

Best application I can think of would be a game where you work in zero-G and can push through a space station - that lets you get the glide periods in to give your arms a break.
Sure, but like I said, skates are a thing too. And once everyone is used to skating around everywhere, they won’t even care if their toons actually have wheels on their feet or not. They’ll be buying ankle-wing mods and shit like that instead.


I think ratcheting could certainly evolve into a valid choice for many game designs, either in addition to or as the sole means of locomotion / movement (in as much as there is a 'toggle' for the ratcheting function.) Being new to the concept, I'd have to play something with a strong action/arcade element to see how well it could be done simultaneously with aiming and interacting (it seems like ratcheting could be achieved in a limited way with a single arm, with the other used for aiming or environment interaction/manipulation). In a twin stick shooter you use one stick for movement and the other for aiming and it gets to be second nature - almost.

Yeah, it’s really just another tool in the chest, and to some extent, it’s just a function of a 6DOF controller. It’s already designed specifically to allow you to grab and manipulate stuff in a virtual environment, and that includes the environment itself. We can basically allow the player as much or as little freedom as we choose.

On a semi-unrelated note, the input lag in that move video (yes I know it's PS3 era move) is so bad, I really really hope that PSVR can keep input latency in the sub 5 ms window, or it will make any locomotion/interaction scheme look bad.
5ms! o_O No, most of the headsets are ~18ms, I think, which is “good enough,” apparently. I suspect we won’t hit 5ms any time soon though.


RIPmotion: VR running in place locomotion

Anyone seen this? It seems much better than teleportation movement!
That's clever. Anyone know why he can't just use the headset to detect the movement? Too much stabilization done by the rest of your body to get a good read from there? Regardless, sticking the wand in his belt is clearly just a proof of concept. Seems like you should be able to get all of the needed data from a Bluetooth IMU clipped to your belt. He's basically just using the wand as a networked pedometer so he knows when to move the avatar forward. Just the jostling in your head should help minimize sickness though.
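To make the "networked pedometer" part concrete, here's a toy sketch of step detection from the vertical bounce of a belt-mounted tracker; the thresholds are pure guesses:

```python
class BeltPedometer:
    """Toy step detector for a belt-mounted tracker/IMU: count vertical
    bounces as steps and turn cadence into forward avatar speed."""

    BOUNCE = 0.02  # m of vertical swing that counts as a step (a guess)
    STRIDE = 0.7   # m of virtual travel per step (also a guess)

    def __init__(self):
        self.low = self.high = None
        self.steps = []  # timestamps of detected steps

    def update(self, tracker_y, now):
        self.low = tracker_y if self.low is None else min(self.low, tracker_y)
        self.high = tracker_y if self.high is None else max(self.high, tracker_y)
        if self.high - self.low > self.BOUNCE:
            self.steps.append(now)
            self.low = self.high = tracker_y  # re-arm for the next bounce
        self.steps = [t for t in self.steps if now - t < 5.0]  # stay bounded

    def forward_speed(self, now, window=1.5):
        # Cadence (steps/s) over the recent window times stride length.
        cadence = sum(1 for t in self.steps if now - t < window) / window
        return cadence * self.STRIDE  # m/s to feed the avatar
```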


This would get so tiring so quick. Seriously, just test it out by jogging in place intermittently for 15-20 minutes. Then imagine that with a Vive on your head/face. That said, it could be great for a workout app.
Yeah, I was thinking it might be nicer using more subtle movements, but you might need an "appropriate" amount of jostling in your ears based on how fast you're moving. You could probably train yourself to be comfortable with less jostling and/or more speed though, I'd imagine.
 

cakefoo

Member
Except your brain doesn’t consider the results to be "false," because those results are a direct product of the instructions being sent to the hand. Your brain thinks, “I’d like to be closer to tha… <Burns>Excellent.</Burns>” Your brain is actually quite comfortable with the idea of producing results with the hands. Most of your brain is devoted to it, in fact.
Your hands and brain communicate a lot. That doesn't mean that your head will re-map those signals with no qualms.

Anyway, it would be a lot more efficient if a button tap would cast rays from the helmet to find “the far end” of the CB, and simply snap-rotate the VE so the chest then lies along that path.
A constructive idea.


Much like stars, actually, the comparatively small IR markers get lost easily, especially at longer ranges where they fill less of the camera’s view. So they use widely spaced arrangements of markers with a fair bit of redundancy, so when individual stars inevitably twinkle out of sight, they can still recognize the constellation by the stars which are visible tonight… err, visible this frame. So there might be a dozen LEDs on the front of the mask because during any given frame, they’re only gonna be able to see a random selection of six or ten of them. But, that’s enough for them to say, “Nah, that’s definitely Orion, cuz there’s the belt… yeah… and see Betelgeuse and Bellatrix up above?” Hence the name.
Who's to say the cameras aren't high-res enough to differentiate smaller clusters of IR LEDs?

As for why Move is "better," do you generally find it easier to spot Orion or an internally lit blimp of similar diameter in the night sky? "Easier to spot" is "better" when we're talking about trying to spot stuff.
Show me a QR code, and I'll have no idea what it leads to. Show it to a phone, and it will decipher it instantly. So I don't find your example all that relatable.

(Running in place is) clever... snip... Just the jostling in your head should help minimize sickness though.
I can't find the post, but I believe it was a Hover Junkers developer who claimed that walking in place on the Virtuix Omni still causes motion sickness: even though your head is bobbing up and down and your legs are moving, there's no horizontal movement for your inner ear to match against what's happening in the virtual world. Bobbing up and down in place would therefore fail the same test.
 
So I just got to try out The Lab and a couple of other things using my Razer Hydras, which I literally had to dust off. Tracking clearly isn't 1:1 but I've got them pretty well set up and they work okay for maybe half of the experiences in the lab. The catapult works fine. The bullet hell thing worked really well. I tried the demo of The Brookhaven Experiment too.

They twitch more and more the further from the base station they get with the current implementation, but it's nice being able to get a sense of these things without waiting for my friend's Vive or the Touch controllers.

Plus, as janky as the current implementation is, it's also nice to use my Hydras again for something. I'm not sure why they're so twitchy, as they never were before. Hopefully they can get them working better.

It's impressive how fun something as simple as Brookhaven is, even with super twitchy guns that aren't motion tracked that well. As twitchy as they get, it doesn't seem to affect the aiming much. Maybe some autoaim is happening?

Unfortunately the twitchiness really prevents you from enjoying the Longbow game in the Lab. I was hoping that one would be plenty playable, but the tracking isn't nearly good enough for it.

I can't get the Budget Cuts demo to install unfortunately. I'll mess around with that tomorrow. Anyways, time for some Pinball FX 2 before I hit the sack. It's still my favorite VR game.
 
This was on gamestop.com/vr

[Screenshot of gamestop.com/vr]

"PERSISTENCE - Just a few years ago, if you were to put on an Oculus Rift developer kit, turning your head would result in the image smearing. This "screen door" effect was created by images lingering too long on the screen. Over the years, Oculus and others working on consumer virtual reality have reduced image persistence to minimize the lingering after effects."

I'm sort of new to VR Stuff (I have a Gear VR and plan on getting a Vive)
Isn't the screen door effect the "grid"/lines you see between the pixels, and not what they said in this?
 

Wallach

Member
This was on gamestop.com/vr

"PERSISTENCE - Just a few years ago, if you were to put on an Oculus Rift developer kit, turning your head would result in the image smearing. This "screen door" effect was created by images lingering too long on the screen. Over the years, Oculus and others working on consumer virtual reality have reduced image persistence to minimize the lingering after effects."

I'm sort of new to VR Stuff (I have a Gear VR and plan on getting a Vive)
Isn't the screen door effect the "grid"/lines you see between the pixels, and not what they said in this?

Yeah, it's not caused by ghosting. In fact, as you move your head, the gaps between pixels are harder to perceive than if the image is perfectly still. Low-persistence screens are used to minimize ghosting, but I don't know why they conflate the two in that blurb.
 

Exuro

Member
Holy crap charge is pending! I didn't get the email last week. WHOOO. Hopefully this week I can actually participate in this thread.
 

Soi-Fong

Member
Just got done playing Vanishing Realms... Jesus... I think besides Audioshield, this is the killer app for the Vive. The battle system is real nice. For nearly an hour, presence was literally only broken when I hit a box or my treadmill in the basement.

For anyone with a Vive, you guys owe it to yourselves to get Vanishing Realms. This is the future of dungeon crawling and my gosh, I cannot wait for an MMORPG with these mechanics. Seriously, it would be fucking glorious.
 
Just got done playing Vanishing Realms... Jesus... I think besides Audioshield, this is the killer app for the Vive. The battle system is real nice. For nearly an hour, presence was literally only broken when I hit a box or my treadmill in the basement.

For anyone with a Vive, you guys owe it to yourselves to get Vanishing Realms. This is the future of dungeon crawling and my gosh, I cannot wait for an MMORPG with these mechanics. Seriously, it would be fucking glorious.

Ha, I really enjoyed what I've played of it so far. I'm such a wuss, tho. That first skellie almost gave me a heart attack.:p
 

Arulan

Member
Roughly two weeks since I received my Vive and even the simplest of room-scale experiences continue to impress me. I find myself pacing around areas just to be there. I'm convinced that the most rewarding VR games will be those that find clever ways to apply room-scale to the core of their design.

I'd love to see something resembling a dungeon crawler (blobber) where searching for secret passages involved closely examining your surrounding walls, from banging objects on them to search for hollow sounds, to moving a torch around to check for air currents.
 


artsi

Member
Shame about exclusivity, but the game looks good.

Thanks to Oculus delaying my order to mid-June, I have plenty of time to decide whether I want it too, in addition to the Vive.
 

Paganmoon

Member
I wonder if we'll start to see "can only be played on Logitech Keyboards" deals sometime down the road. Peripheral exclusivity is a new and scary thing in the PC space.
 

artsi

Member
I wonder if we'll start to see "can only be played on Logitech Keyboards" deals sometime down the road. Peripheral exclusivity is a new and scary thing in the PC space.

Keyboards aren't cutting-edge technology; they all use standards set in stone ages ago, so no.

VR is new and these things will smooth out as time goes on. That'll be after both Oculus and Valve are done with the cockfighting and VR API development moves under a consortium instead of a single company with its own interests.
 

Raticus79

Seek victory, not fairness

These are funded by Oculus. It's basically the same situation as first party console games.
http://ca.ign.com/articles/2015/07/13/oculus-luckey-offers-explanation-for-exclusive-rift-games

So I just got to try out The Lab and a couple of other things using my Razer Hydras, which I literally had to dust off. Tracking clearly isn't 1:1 but I've got them pretty well set up and they work okay for maybe half of the experiences in the lab. The catapult works fine. The bullet hell thing worked really well. I tried the demo of The Brookhaven Experiment too.

They twitch more and more the further from the base station they get with the current implementation, but it's nice being able to get a sense of these things without waiting for my friend's Vive or the Touch controllers.

Plus, as janky as the current implementation is, it's also nice to use my Hydras again for something. I'm not sure why they're so twitchy, as they never were before. Hopefully they can get them working better.

It's impressive how fun something as simple as Brookhaven is, even with super twitchy guns that aren't motion tracked that well. As twitchy as they get, it doesn't seem to affect the aiming much. Maybe some autoaim is happening?

Unfortunately the twitchiness really prevents you from enjoying the Longbow game in the Lab. I was hoping that one would be plenty playable, but the tracking isn't nearly good enough for it.

I can't get the Budget Cuts demo to install unfortunately. I'll mess around with that tomorrow. Anyways, time for some Pinball FX 2 before I hit the sack. It's still my favorite VR game.

Yeah, they get worse with distance because they rely on a magnetic field. Can't cover the same area for that reason.
(I actually took the $100 hit to refund my STEM preorder once I heard about the Lighthouse system)
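For a sense of how brutal that falloff is: a dipole field drops roughly as 1/r^3, so the usable signal collapses quickly as you move away from the base. Toy numbers below, not measured Hydra specs:

```python
# Why magnetic trackers get twitchy with range: a dipole field falls
# off roughly as 1/r^3, so at fixed sensor noise the positional jitter
# blows up as you walk away from the base station.
for r in (0.5, 1.0, 1.5, 2.0):      # metres from the base station
    rel_signal = (0.5 / r) ** 3     # field relative to the 0.5 m level
    print(f"{r:.1f} m: signal down to {rel_signal:.1%} of the 0.5 m level")
# 1.0 m: 12.5%, 1.5 m: 3.7%, 2.0 m: 1.6%
```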
 

Zalusithix

Member
Keyboards aren't cutting-edge technology; they all use standards set in stone ages ago, so no.

VR is new and these things will smooth out as time goes on. That'll be after both Oculus and Valve are done with the cockfighting and VR API development moves under a consortium instead of a single company with its own interests.

A universal VR API will require some third party with enough clout to step in and force it. I don't see Valve and Oculus getting along on that front if left to their own devices, given they have rather opposing ideals. The best we're likely going to see in the near to mid term is the most-used engines (Unreal, CryEngine, Unity) supporting both and unifying their implementation from a dev's standpoint.
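From the dev's standpoint, that unification would look something like one interface with a thin backend per vendor SDK. A bare-bones sketch; every name here is invented:

```python
from abc import ABC, abstractmethod

class HmdBackend(ABC):
    """Sketch of the abstraction the engines are building: game code
    talks to this interface, and a thin backend per vendor SDK hides
    the differences. All names here are invented for illustration."""

    @abstractmethod
    def head_pose(self): ...
    @abstractmethod
    def submit_frame(self, left_eye, right_eye): ...

class OpenVRBackend(HmdBackend):
    def head_pose(self): ...           # would call into OpenVR here
    def submit_frame(self, l, r): ...

class OculusBackend(HmdBackend):
    def head_pose(self): ...           # would call the Oculus SDK here
    def submit_frame(self, l, r): ...

def make_backend(runtime):
    return {"openvr": OpenVRBackend, "oculus": OculusBackend}[runtime]()
```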
 

artsi

Member
A universal VR API will require some third party with enough clout to step in and force it. I don't see Valve and Oculus getting along on that front if left to their own devices, given they have rather opposing ideals. The best we're likely going to see in the near to mid term is the most-used engines (Unreal, CryEngine, Unity) supporting both and unifying their implementation from a dev's standpoint.

It takes time, but I'm sure it'll happen at some point as more HMD manufacturers enter the market.
Valve might license OpenVR openly, but being the sole developer could be straining them too.

Could also be that the consortium developed API will come completely from somewhere else (Google?).

Looking at the past (a somewhat comparable situation): SGI was the sole developer of IRIS GL, a 3D graphics API, until they said fuck it in 1992 and formed a consortium to develop and maintain what's now called OpenGL.

It should be noted that despite that, DirectX (Microsoft-developed) is still alive and kicking. Doesn't mean VR has to go the same road, though.
 

Zalusithix

Member
It takes time, but I'm sure it'll happen at some point as more HMD manufacturers enter the market.
Valve might license OpenVR openly, but being the sole developer could be straining them too.

Could also be that the consortium developed API will come completely from somewhere else (Google?).

Looking at the past (a somewhat comparable situation): SGI was the sole developer of IRIS GL, a 3D graphics API, until they said fuck it in 1992 and formed a consortium to develop and maintain what's now called OpenGL.

It should be noted that despite that, DirectX (Microsoft-developed) is still alive and kicking. Doesn't mean VR has to go the same road, though.

Short of Microsoft rolling a VR API into the traditional stack of DirectX/XInput/DirectInput, or the Khronos Group doing something similar, I only see OpenVR as having a real shot at being a universal API - assuming they partner up with other major players. Oculus wants to completely tie the API to their store, which is a nonstarter. OSVR just doesn't have the support needed in general to gain critical mass. Google is too focused on their own platforms to care much about the PC, and frankly they're too fickle of a company to rely on for an industry standard.

Anyhow, a VR API to rule them all will happen eventually, somehow, but that's probably years away. In the interim things are going to be a bit rocky.
 

artsi

Member
Short of Microsoft rolling a VR API into the traditional stack of DirectX/XInput/DirectInput, or the Khronos Group doing something similar, I only see OpenVR as having a real shot at being a universal API - assuming they partner up with other major players. Oculus wants to completely tie the API to their store, which is a nonstarter. OSVR just doesn't have the support needed in general to gain critical mass. Google is too focused on their own platforms to care much about the PC, and frankly they're too fickle of a company to rely on for an industry standard.

Anyhow, a VR API to rule them all will happen eventually, somehow, but that's probably years away. In the interim things are going to be a bit rocky.

Yeah, overall I think everything is a wild card right now.

Looking at the past, IRIS GL was the closed but better-performing API (like the Oculus SDK now). They made it open after they started losing market share to the (back then) more open PHIGS, in time taking the lead again with that move.

Then DirectX (100% closed API) came and kind of wiped the floor with both in the PC gaming space.

I actually wouldn't consider it impossible for Facebook to make the same move in the future; right now they do have plenty of open-source projects in the web technology space. But like I said, it's a wild card and no one knows what the situation will be in 5-10 years.

I'm just very interested in what will happen. Meanwhile, I think it's better to get both the Rift and the Vive if I want to play every game without hiccups.
 

pj

Banned
Except you’re not dissecting anything. You’re just spouting nonsense and demonstrating your fundamental misunderstanding of VR.

Incredible. You are probably the least informed person posting consistently in this thread, and have the audacity to say someone else misunderstands VR?

The thing you bolded and underlined as an example of his "fundamental misunderstanding" is 100% true. What do you believe is the #1 cause of nausea in VR? (aside from using a shitty non-Sony Playstation® VR system)

The reason "grab the world" movement doesn't make you sick is because your brain believes that is what's actually happening. You are moving the world around while your body is stationary. Whether it makes sense to use in a specific game or if it's more or less immersion breaking than roomscale/teleporting are separate matters entirely. Matters that you probably shouldn't comment on because you have hands on experience with neither.
 
The thing you bolded and underlined as an example of his "fundamental misunderstanding" is 100% true. What do you believe is the #1 cause of nausea in VR? (aside from using a shitty non-Sony Playstation® VR system)

Huh?

People are sure getting mad about VR. Should call it Virtual Rage.
 
So I've been messing about a bit more with my Hydras and SteamVR. First of all, it further confirms that the Rift can absolutely do room scale. Heck, if I'm doing it today with the Razer Hydras, then I'll definitely be doing it with the Touch controllers, even if I need to get some USB extension cables.

I got Budget Cuts to work... mostly. For whatever reason, the Hydras do not remotely recognize throwing motions. I'm guessing throwing in Budget Cuts and in The Lab is based not on the positional tracking but on accelerometers, and I don't think the Hydra has those.
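If a dev wanted throwing to work on position-only controllers like these, finite-differencing a short history of tracked positions at release would be the obvious workaround. A toy sketch; the window length is a guess (too short is noisy, too long lags the hand):

```python
from collections import deque

class ThrowEstimator:
    """Toy sketch: estimate a release velocity from positions alone by
    finite-differencing a short history of tracked hand positions."""

    def __init__(self, window=0.1):  # seconds of history; a guess
        self.window = window
        self.samples = deque()       # (timestamp, (x, y, z)) pairs

    def update(self, t, pos):
        self.samples.append((t, pos))
        while self.samples and t - self.samples[0][0] > self.window:
            self.samples.popleft()

    def release_velocity(self):
        if len(self.samples) < 2:
            return (0.0, 0.0, 0.0)
        (t0, p0), (t1, p1) = self.samples[0], self.samples[-1]
        dt = (t1 - t0) or 1e-6
        return tuple((b - a) / dt for a, b in zip(p0, p1))
```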

Even without being able to use the throwing knives, I was able to beat the Budget Cuts demo and it was a fun experience. I'm not sure how engaging teleporting will be for everything else using it, but it was absolutely engaging in Budget Cuts.

And contrary to what the game claims (about being roomscale only) it's a standing 360 game. Yes, you can teleport into a room and walk around it. But you can just as easily teleport around it, and most people I've seen play it, play it this way.

It's definitely one to pick up whatever your play space. I wouldn't recommend buying the full version to play with Hydras (again, you can't use the throwing knives!) but the atmosphere is great, and it's incredibly tense. The teleportation gun has enough limitations to its range and firing speed to not undermine the tension.

It's great stuff.

Catlateral Damage in VR is going to need more space than I've got though. Teleporting up to stuff to knock it over isn't nearly as cool as teleporting around the office in Budget Cuts. I can imagine running around a room and knocking stuff over being a *lot* more fun.

I'm certainly not going to make up my mind about motion controls based on the current Hydra implementation, but of the stuff I've been able to try out, certainly a lot of it felt like it would work perfectly well as a standing 360 experience... but at least one feels massively limited if left as a standing experience.

Again, space limitations in my house (which we now officially own! WOOO) rear their head. I've got plans for the multipurpose room in the basement, but it's going to take some funds and we won't have those for a while.

If anything I'm more eager to try out the Vive and I hope my buddy Jerry gets his soon so I can try these games out in a larger space with perfectly tracked motion controls. But I think the Touch is going to cover a lot of these games just fine, Budget Cuts included... even if you have to use Steam VR to play them with chaperone.
 

Cyriades

Member
OSVR version 1.4 unboxing

[Image: OSVR HDK 1.4 store gallery shot]


So I got my 1.4 a few days earlier than it was actually supposed to ship. Awesome. Thanks Razer!

NOTE: I actually noticed that the teardown reference material I have is for an HDK 1.2, so I can't offer a direct hardware comparison between 1.3 and 1.4. This is based on just my experience with the 1.3, not a teardown of one. If you have teardown images of a 1.3, please contribute to the conversation! Pop the face plate off and give us a picture so we can compare notes.

After spending an evening looking over my 1.4, what has really changed and what's the same? Not much except the screen diffuser and the rubbery nose bit, it appears.

Behold! The HDK 1.4 rubbery nose bit. For me, it does nothing for comfort but does protect against outside light quite well. Also take note of the still very thin foam padding. Wasn't there supposed to be more of this stuff?
https://goo.gl/photos/3r9bgPiZdquLChxMA

First, of course, everyone wants to see the new screen diffuser and know what it does. What it does is a great job of virtually eliminating the screen door effect. What you pay for that luxury is a slight pearlescence on whites and a slight texture to the image. Kind of like rear-projection, actually. Check out the pics below. No tricks of the camera; that's what you see.

This change alone makes this HMD more attractive, I think, because the image is markedly more pleasant to look at. I sure am a fan of it.

All images were taken with a Galaxy S5 jammed into the eye-cup until it looked centred, naturally distorted, and focused on the phone display. I encourage you to view the full resolution images, as they show off the effect of the diffuser well.

https://goo.gl/photos/JSnkrUQB1oPEaxTNA
https://goo.gl/photos/1oZhcvzcWAWiDWaq9
https://goo.gl/photos/1DCygFzsphRTH9Nr6
https://goo.gl/photos/qSqTnc2o5Kjcz6Wx7 (Source content: http://fallout.wikia.com/wiki/Fallout_4)


Now, the 1.4 doesn't come without its downsides, and the biggest one that I and a number of other 1.4 owners seem to have is a weird bubble/fish-eye effect centred in the middle of the FOV. It distorts everything around it like you are literally viewing the image through a drop of water. It isn't in the displayed image itself. This fish-eye distortion makes viewing moderately straining, as you see the image 'roll' around the edges.

Another downside seems to be the newest OLED firmware, or its interaction with the device, which appears to have trouble retaining settings following a reboot or a disconnect/reconnect (which I do frequently). For two whole days I've tried to get the headset to stop reverting back to desktop SBS mode and away from 60Hz 10% persistence. 1.92 has the issue; 1.91 seems better. COM connections also seem much more difficult to successfully open.

Distortion
Ignore any horizontal distortion seen at the top or bottom. That is just due to the position of the camera and is not present when wearing the headset. Instead, note the spherical distortion right in the middle of the image. For the forum page it's quite obvious. For the file listing, look right at the entries for 5802 and 5807, which are basically at the center of the 'bubble'.

https://goo.gl/photos/uRAwdN2ZirofALGG6
https://goo.gl/photos/wt4VBYyagSPwBgse7


I've removed hardware comparisons as the only reference material I have is for a 1.2 and I don't think it's fair to point out hardware differences while missing a generation to compare against. Instead, I'll just post the images below and let you take a look. I checked under the board as well and there was nothing remarkable.

[Headset images]
https://goo.gl/photos/VL14pibdtCn7ZrQVA
https://goo.gl/photos/UYBWV2cxYqjBHkdk6
https://goo.gl/photos/KeBwzgXjzuFmezrq5
https://goo.gl/photos/vPUEAzgQFpfnyJyg7
https://goo.gl/photos/d1KhPATsCRm83Nm79


http://www.osvr.org/forum/viewtopic.php?f=10&t=3831
 

Cyriades

Member
OSVR gets Native jMonkeyEngine support


Apr 17, 2016

It's been a long time coming, but I finally have come so far as to be able to commit an initial version of a native OSVR integration.
It's very basic so far, only extended mode, and hard-coded distortion due to that. More features will be added in the coming weeks, and hopefully direct rendering too.
In parallel I'm working on the stand-alone JNI project (this is where most of the work has been put, by far).

Why would anyone be interested in this when OSVR can be used in jMonkeyVR (which is also way more feature-rich) via the Steam plugin?

I can think of two scenarios:

You don't need Steam installed. I guess most gamers have it already, but for some types of (especially non-game) applications, you may not want to have to install Steam.

You are targeting OSVR for one reason or another. OSVR is entirely open-source and also has plugins for Vive and Oculus, so it may eventually be a viable option. It's still quite early in development, though.

https://hub.jmonkeyengine.org/t/native-osvr-support/35669
 