That's kind of what I've been leaning towards as well, just based on ease of use.
It'd be weird if it didn't have touchpad detection, but I haven't gotten a chance to play with it yet.
I've mainly worked in UE4 over Unity in the past, but only with the Oculus Rift. This is also the first time I've really worked with motion controls in UE4 - with HL2VR we just worked with the Sixense API directly. But yeah, UE4 is super easy to use thus far. In just a weekend of work, I've already cloned a bunch of gameplay systems I've seen in smaller games that I want to use to build a larger, more fully featured game. It's pretty easy to go from concept to execution in UE4. Also, as part of the exercise, I've done absolutely everything in Blueprint. I've been trying to learn Blueprint for a while now, and it's starting to click.
UE4's abstraction for the motion controllers is awesome - I tested it, and it works interchangeably with my Razer Hydra and my Vive controllers. But that abstraction is also why it seems to be lacking a touch state. You get a variety of states for the controller, but no touch on/off. You can detect a press of the touchpad (it's seen as a thumbstick click), you can read an X/Y position on the touchpad (it's seen as thumbstick X/Y), and you can treat the touchpad as 4 buttons in a diamond cluster (they're face buttons 1-4), but I can't find anything that reports touch state.
The Steam Controller API itself has a separate state for touch on/off. I've only been working with UE4's abstraction layer for motion controls, so perhaps the Vive API itself has a touch state that UE4 just doesn't report right now.
Did you follow any particular guides to implement the gunplay or are you figuring it out as you go along?
I'm curious what the process is like to decouple aiming compared to making a traditional FPS.
Figuring it out on my own. I just started with a completely blank UE4 project and began from there.
UE4 takes care of a lot of the abstraction for you. Your default character pawn's camera will automatically inherit the VR camera if you've enabled the option. You have to add a SteamVR component to your pawn so that it'll track around the room. You also add a couple of motion controller components - one for each hand - and can bind each to a mesh to represent your limbs. There are 2 different SteamVR state machines that return the information you need - one for the head tracking component, and one for the hand tracking component (which handles both left and right). From these state machines you can get real-world position and orientation, as well as control SteamVR directly (turn Chaperone on and off, interface with the Vive controllers, etc.). Your camera position is separate from your pawn position, but you can get the relative position by comparing the pawn's location against the real-world location of the camera (or hands).
So, like, I'm not really starting out with an FPS template. I've just got my head and hands tracked, and I've been designing concepts around that. Like, I don't have to program "aim" into my game - I just look at which direction my hands are facing and use that as a forward vector.
VR game development is really very different from normal game development, and there are few tutorials or guides that explain how concepts work, either programmatically or functionally. For example, there's no drag-and-drop VR teleportation system yet, so I had to create one from scratch. It involved some trajectory calculation: I wound up projecting a small sphere mesh that I gave rubber properties around the scene, and as soon as it touched a ground object, it would stick and spawn a new camera location for the viewport there.