Actually, the only way to get a genuine representation of how the game looks is to capture and encode it at 60 fps, because of potential frame timing issues.
A game can output 30 frames per second but do it in a bad way. For example, here is a game outputting a perfect 30 fps:
1 1 2 2 3 3 4 4 5 5 6 6 7 7 8 8 9 9
Every frame is duplicated.
In that case, if I encode it at 30 fps, you get:
1 2 3 4 5 6 7 8 9
Perfect: nothing is lost, and your PC/PS4/whatever will duplicate each frame again on playback, so you get exactly what the console output in the first place.
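The well-behaved case can be sketched in a few lines, using integers as stand-ins for frames (a toy model, not any real capture pipeline):

```python
# Toy model: each integer stands for one distinct rendered frame.
# A well-timed 30 fps game on a 60 Hz output shows every frame exactly twice.
source_60fps = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 7, 7, 8, 8, 9, 9]

# Encoding at 30 fps here just means keeping every second captured frame.
encoded_30fps = source_60fps[::2]
print(encoded_30fps)  # [1, 2, 3, 4, 5, 6, 7, 8, 9]

# On playback, the display duplicates each frame again at 60 Hz,
# reconstructing the original stream exactly.
played_back = [f for frame in encoded_30fps for f in (frame, frame)]
assert played_back == source_60fps
```

Because every frame was duplicated exactly once, decimation and playback are perfect inverses of each other.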
Now let's take a game with bad frame timing:
1 1 2 2 2 3 4 4 5 5 5 6 7 8 8 8 9 9
You still get the same number of frames, but sometimes a frame will be shown for 50 ms, another for only 16 ms. Lots of judder, very annoying.
If I encode at 30 fps, I have two ways of converting. Either I use a dumb "drop every second frame" filter, which outputs this:
1 2 2 4 5 5 7 8 9
Which means that when you play back the file, some frames will be displayed for 66 ms while others simply vanish, making the game look even worse than it actually is.
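The dumb filter is easy to model with the same toy integer frames:

```python
# The same badly-timed 60 fps capture, as toy integer frames.
bad_60fps = [1, 1, 2, 2, 2, 3, 4, 4, 5, 5, 5, 6, 7, 8, 8, 8, 9, 9]

# Dumb decimation: keep every second frame, blind to content.
dumb_30fps = bad_60fps[::2]
print(dumb_30fps)  # [1, 2, 2, 4, 5, 5, 7, 8, 9]

# Frames 3 and 6 have vanished entirely, while 2 and 5 are duplicated:
# on 30 fps playback those duplicates linger on screen for ~66 ms each.
```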
Or I can use a smart filter that looks for the frames with the most changes from the previous frame. In that case I get this:
1 2 3 4 5 6 7 8 9
Perfect output again. Great, but not at all a true representation of what the game will actually look like.
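A simplified sketch of the smart filter, modeled here as dropping frames identical to the previous one (with real video, "identical" would be a pixel-difference threshold, roughly what ffmpeg's mpdecimate filter does):

```python
# The same badly-timed 60 fps capture, as toy integer frames.
bad_60fps = [1, 1, 2, 2, 2, 3, 4, 4, 5, 5, 5, 6, 7, 8, 8, 8, 9, 9]

# "Smart" decimation, simplified: keep a frame only when it differs
# from the previously kept one, discarding the duplicates.
smart_30fps = [f for i, f in enumerate(bad_60fps)
               if i == 0 or f != bad_60fps[i - 1]]
print(smart_30fps)  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
```

Note that the output only lands on a clean 30 fps here because every long-lived frame is balanced by a short-lived one; either way, the evenly-spaced result hides the judder the player would actually see.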
So 60 fps is the only solution. I could re-encode a 30 fps version automatically, but it simply would not give a good representation of the game unless it's perfectly locked at 30...