So the term for general VR sickness, sometimes erroneously called motion sickness, is actually an umbrella term for several similar sicknesses with different causes.
Some people get vertigo from playing VR - technically not a fault of the tech (it's acting as intended) but rather poor software design. I can easily induce vertigo in a demo by placing someone high up, the same way you'd get vertigo doing that in real life. Which is to say: doing things in VR that would make you sick in real life will probably make you sick in VR as well. You can't expect to be Solid Snake in Metal Gear Solid: The Twin Snakes - part of the reason we don't do backflips left and right all over the place is that doing so, for an average person, would end in vomit. It's no different in VR. Some early demos do stupid shit like making you run at the IRL equivalent of 70 mph through rotating corridors, and then people wonder why they get sick. With Half Life 2 VR, we've actually spent significant amounts of time adjusting walking and running speeds and redesigning areas to cut back on the crazy jumps you need to make, because expecting players to do that is unrealistic. We're not superhumans IRL.
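To give a rough idea of what that kind of comfort tuning looks like in code, here's a minimal sketch. The speeds and names are mine, picked for illustration - they are not the actual Half Life 2 VR values:

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical comfort tuning - illustrative ballpark human locomotion
// speeds, not any shipping game's real numbers.
constexpr float kWalkSpeed   = 1.4f;  // m/s, average real-world walking pace
constexpr float kSprintSpeed = 6.0f;  // m/s, a fast but believable run
constexpr float kSickSpeed   = 31.0f; // m/s, ~70 mph - the "why am I sick" zone

// Clamp whatever the game logic asks for to something a human body expects.
float ComfortClampSpeed(float requestedSpeed) {
    return std::min(requestedSpeed, kSprintSpeed);
}

int main() {
    std::printf("requested %.1f m/s -> allowed %.1f m/s\n",
                kSickSpeed, ComfortClampSpeed(kSickSpeed));
}
```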
Some experience sickness because they can perceive the latency between their head moving and the world updating - low-persistence strobed OLED has effectively solved this. By blanking the display to black a fraction of a second after each frame scans out, we take advantage of a natural phenomenon where our brains "fill in" the gaps between the pieces of animation. Hence, by simply not drawing anything to the screen at all for most of each refresh, our brains do the missing work for us, which winds up feeling much more comfortable.
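For the curious, here's a toy calculation of what a low-persistence schedule looks like, assuming a 90 Hz panel lit for roughly 2 ms per refresh - illustrative numbers, not any specific headset's real drive timings:

```cpp
#include <cstdio>

// Sketch of a low-persistence strobe schedule for a hypothetical 90 Hz panel.
int main() {
    const double refreshHz = 90.0;
    const double frameMs   = 1000.0 / refreshHz;  // ~11.1 ms per refresh
    const double litMs     = 2.0;                 // pixels actually emitting light
    const double blackMs   = frameMs - litMs;     // panel dark, brain fills the gap

    std::printf("frame period: %.2f ms\n", frameMs);
    std::printf("  lit:   %.2f ms (%.0f%% duty cycle)\n",
                litMs, 100.0 * litMs / frameMs);
    std::printf("  black: %.2f ms - no stale image smearing while your head turns\n",
                blackMs);
}
```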
Some experience vestibulocochlear disconnect, where the cochlear fluid in their inner ear isn't moving the way their eyes say it should be. This is essentially unsolvable at the moment, and how badly it hits you depends on your personal limits. There are two schools of thought on how to solve it:
1) electrical stimulation of the cochlea to make you feel like your cochlear fluid is actually moving (good luck getting a guinea pig for that one - zapping your brain with an electric charge),
Or 2) actually making the person move IRL to bring their cochlear fluid into harmony with what they see.
In terms of vestibulocochlear disconnect, not all motions are created equal. Cardinal translation isn't bad - moving forward, backward, strafing - because we use parallax cues to figure out the expected motion and our bodies adjust, only feeling discomfort at the start of the motion, and mild discomfort at that. Rotation is the killer, so the solution is to either put the player in a swivel chair so they can physically rotate, or use an omnidirectional treadmill.
Our studies have uncovered an interesting phenomenon, however - expected motion severely limits your discomfort. To demonstrate this, we built a demo using positionally tracked hands. In the demo, you can reach out and grab the world by closing your hand, at which point the movement of your hand translates the world around you. In essence, you are grabbing and shaking the world... and nobody gets sick. Play back the same translation without using your hands, and people get green around the gills.
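A minimal sketch of that grab-the-world idea, assuming you get a tracked hand position and a grip state each frame - the structure and names here are hypothetical, not our demo's actual code:

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }

// While the hand is closed, drag the world by the hand's frame-to-frame
// delta, so every bit of motion is one the player initiated and expects.
struct GrabLocomotion {
    Vec3 worldOffset{0, 0, 0};  // applied to everything except the player
    Vec3 lastHandPos{0, 0, 0};
    bool wasGripping = false;

    void Update(Vec3 handPos, bool gripping) {
        if (gripping && wasGripping) {
            worldOffset = worldOffset + (handPos - lastHandPos);
        }
        lastHandPos = handPos;
        wasGripping = gripping;
    }
};

int main() {
    GrabLocomotion loco;
    loco.Update({0, 1, 0}, true);      // close hand
    loco.Update({0, 1, -0.3f}, true);  // pull the closed hand 30 cm back
    std::printf("world dragged to z = %.2f\n", loco.worldOffset.z);
}
```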
With that in mind, we're toying with a hand-operated method of locomotion where you basically "swim" through the environment without using your feet, and it's producing neat results.
Still other people get sick because they can perceive the screen flicker - 90 Hz is where it becomes imperceptible pretty much universally. At 75 Hz on the DK2, I can perceive it in my periphery, but it doesn't make me sick. 120 Hz is preferred over 90, however, because 60, 30, 24, etc. divide evenly into 120, allowing native-frequency playback of, say, standard television or movie content on a virtual screen. Samsung can actually drive Gear VR up to 70 Hz, but since it has a focus on media playback, 60 Hz was deemed the better refresh rate because it could natively play back video content at its natural frequency.
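The divisibility argument is easy to verify in a few lines - here's a throwaway check of which common content frame rates map to a whole number of refreshes per frame:

```cpp
#include <cstdio>

// A content rate plays back judder-free only if each content frame can be
// held for a fixed whole number of display refreshes.
int main() {
    const int displays[] = {90, 120};
    const int contents[] = {24, 30, 60};
    for (int hz : displays) {
        std::printf("%d Hz display:\n", hz);
        for (int fps : contents) {
            if (hz % fps == 0)
                std::printf("  %2d fps -> each frame held %d refreshes, clean\n",
                            fps, hz / fps);
            else
                std::printf("  %2d fps -> doesn't divide evenly, judders\n", fps);
        }
    }
}
```

Running it shows 120 handles all three rates cleanly, while 90 only handles 30 fps - which is exactly why 120 is nicer for a virtual movie screen.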
EDIT: Forgot one - some people got sick with DK1 simply because the tracking wasn't close enough to their IRL position, and we have strong proprioception in our heads and hands (the ability to know where we are in 3D space without visual cues). Our heads would say we were moving into one position, and our eyes would say another. The biggest offender was the lack of any positional tracking in DK1 at all - only pitch, yaw, and roll, no X, Y, or Z. DK2 does X, Y, and Z in a forward-facing 180 degrees. CV1 and Morpheus will do X, Y, and Z in full 360 degrees.
This is something people often miss - the need to match our proprioception as closely as possible is incredibly, massively important, and it didn't really approach acceptable levels until the last 5 or so years. You hear it repeated often: we need sub-millimeter accuracy in our positional tracking, or else most people can tell that it's "off."
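If you wanted to sanity-check a tracker against that sub-millimeter bar, it would look something like this toy example - the samples here are fabricated; a real test would log headset positions while the HMD sits perfectly still:

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Toy jitter check: RMS deviation of stationary position samples (meters)
// compared against the oft-quoted 1 mm bar.
int main() {
    std::vector<double> x = {0.0000, 0.0004, -0.0003, 0.0006, -0.0005};

    double mean = 0.0;
    for (double v : x) mean += v;
    mean /= x.size();

    double rms = 0.0;
    for (double v : x) rms += (v - mean) * (v - mean);
    rms = std::sqrt(rms / x.size());

    std::printf("jitter RMS: %.2f mm -> %s\n", rms * 1000.0,
                rms < 0.001 ? "under 1 mm, most people won't notice"
                            : "over 1 mm, feels \"off\"");
}
```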
In short, nobody can tell you whether your girlfriend will get sick without knowing why she gets sick playing FPS games in the first place. She'll just have to try it. My sister-in-law got violently ill from DK1 for over a day, yet she could do Gear VR for hours. Apparently the low persistence solved her sim sickness.
EDIT: I typed this on my phone, I'm going through and correcting all the mistakes.