Figuring it out on my own. I just started with a completely blank UE4 project and went from there.
UE4 takes care of a lot of the abstraction for you. Your default character pawn's camera will automatically inherit the VR camera if you've chosen that option. You have to add a SteamVR component to your pawn so that it'll track around the room, and you also add a couple of motion controller components, one for each hand, which you can bind to a mesh to represent your limbs. There are two different SteamVR state machines that return the information you need: one for the head-tracking component and one for the hand-tracking component (which handles both left and right). From these state machines you can get real-world position and orientation, and you can also control SteamVR directly (turn the chaperone on and off, interface with the Vive controllers, etc.). Your camera position is separate from your pawn position, but you can get relative position by comparing the location of the pawn against the real-world location of the camera (or hands).
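Roughly, the component setup looks something like this if you lay it out in C++ (just a sketch of the general pattern, not my exact project; the class name and defaults are placeholders, while the camera and motion controller components are the stock engine ones):

```cpp
// Sketch of a minimal VR pawn: a plain scene component as the tracking origin,
// a camera locked to the HMD, and one motion controller component per hand.
#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Camera/CameraComponent.h"
#include "MotionControllerComponent.h"
#include "VRPawn.generated.h"

UCLASS()
class AVRPawn : public APawn
{
    GENERATED_BODY()

public:
    AVRPawn()
    {
        // Everything tracked (HMD, controllers) moves relative to this origin.
        VROrigin = CreateDefaultSubobject<USceneComponent>(TEXT("VROrigin"));
        RootComponent = VROrigin;

        // The camera follows the headset automatically when locked to the HMD.
        Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
        Camera->SetupAttachment(VROrigin);
        Camera->bLockToHmd = true;

        // One motion controller per hand; attach a mesh to each to show limbs.
        LeftHand = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftHand"));
        LeftHand->SetupAttachment(VROrigin);
        LeftHand->Hand = EControllerHand::Left;   // newer engine versions use MotionSource

        RightHand = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("RightHand"));
        RightHand->SetupAttachment(VROrigin);
        RightHand->Hand = EControllerHand::Right;
    }

    UPROPERTY() USceneComponent* VROrigin;
    UPROPERTY() UCameraComponent* Camera;
    UPROPERTY() UMotionControllerComponent* LeftHand;
    UPROPERTY() UMotionControllerComponent* RightHand;
};
```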
So, like, I'm not really starting out with an FPS template. I've just got my head and hands tracked, and I've been designing concepts around that. Like, I don't have to program "aim" into my game - I just look at which direction my hands are facing and use that as a forward vector.
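As a rough illustration of the hands-as-aim idea (assuming the RightHand component from the sketch above; TraceAlongRightHand is a made-up helper, and the line trace is just one example of using that forward vector):

```cpp
// Rough illustration of hand-based aim: use the motion controller's forward
// vector as the aim direction instead of anything camera-based.
void AVRPawn::TraceAlongRightHand()
{
    const FVector AimOrigin    = RightHand->GetComponentLocation();
    const FVector AimDirection = RightHand->GetForwardVector();

    FHitResult Hit;
    if (GetWorld()->LineTraceSingleByChannel(
            Hit,
            AimOrigin,
            AimOrigin + AimDirection * 10000.f,  // trace 100 m in UE units
            ECC_Visibility))
    {
        // Hit.ImpactPoint is whatever the hand is pointing at.
    }
}
```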
VR game development is really very different from normal game development, and there are few tutorials or guides that explain how concepts work, both programmatically and functionally. Like, there is no drag-and-drop VR teleportation system yet, so I had to create one from scratch. It involved some trajectory calculation: I wound up actually projecting a small sphere mesh (that I'd given rubbery physics properties) around the scene, and as soon as it touched a ground object it would stick and spawn a camera location for the viewport on my hand.
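If it helps anyone, a stripped-down version of just the trajectory part might look something like this. It's not my rubber-sphere setup, just a rough sketch that steps a simple ballistic arc out from the hand and treats the first reasonably flat ground hit as the teleport target; the function name and tuning values are made up:

```cpp
// Simplified stand-in for the teleport arc: step a point along a ballistic
// trajectory from the hand and accept the first reasonably flat hit as the
// teleport destination.
#include "Engine/World.h"

bool ComputeTeleportTarget(UWorld* World,
                           const FVector& HandLocation,
                           const FVector& HandForward,
                           FVector& OutTarget)
{
    const float LaunchSpeed = 900.f;   // cm/s, tuning value
    const float Gravity     = -980.f;  // cm/s^2
    const float TimeStep    = 0.05f;   // seconds per simulation step
    const int32 MaxSteps    = 60;      // ~3 seconds of arc

    FVector Position = HandLocation;
    FVector Velocity = HandForward * LaunchSpeed;

    for (int32 Step = 0; Step < MaxSteps; ++Step)
    {
        const FVector Next = Position + Velocity * TimeStep;
        Velocity.Z += Gravity * TimeStep;

        // Trace each segment of the arc and stop at the first thing it hits.
        FHitResult Hit;
        if (World->LineTraceSingleByChannel(Hit, Position, Next, ECC_Visibility))
        {
            // Only accept ground-like surfaces (normal pointing mostly up).
            if (Hit.ImpactNormal.Z > 0.7f)
            {
                OutTarget = Hit.ImpactPoint;
                return true;
            }
            return false;
        }
        Position = Next;
    }
    return false;  // the arc never landed on anything
}
```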