That's because min/max/average is a really bad way to measure performance. For example, here's 3 minutes of gameplay in Firefall: the first chart is min/max/average, the second is a plot of the FPS over time.
Minimum 83 FPS, pretty good, right? Wrong. Here's a plot of the exact same data showing, in milliseconds, how long it took to render each frame.
There are a huge number of frames taking over 16.7ms (60fps) to render, with some reaching 70ms (~14fps). These are the kind of stutters one would encounter when running out of VRAM.
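To make that concrete (this is just a sketch, not the actual Firefall capture), here's how you'd count the frames that blow past the 60fps budget in a frame-time log. `frame_times_ms` is a made-up list standing in for real per-frame render times:

```python
# Count frames that miss the 60 fps budget in a frame-time log.
# frame_times_ms is hypothetical data standing in for a real capture.
frame_times_ms = [8.3, 9.1, 70.2, 8.7, 16.9, 12.4, 8.5, 68.0, 9.0]

BUDGET_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps

stutters = [t for t in frame_times_ms if t > BUDGET_MS]
print(f"{len(stutters)} of {len(frame_times_ms)} frames missed the 60 fps budget")
print(f"worst frame: {max(frame_times_ms):.1f} ms "
      f"(~{1000 / max(frame_times_ms):.0f} fps)")
```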
A much more accurate measurement of the 'average' experience is 99th Percentile Frame Latency, which is the frame time that 99% of all frames were rendered at or under.
In instances where there aren't major stutters, this should generally line up with average FPS. But that's the thing: average FPS completely misses situations where things are going wrong. It only reports once a second, so it can average a few 70ms frames in with a pile of 8ms frames and still show you 80fps.
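Here's a rough sketch of that effect (the frame times are made up, and the percentile is a simple nearest-rank calculation rather than any specific benchmarking tool's method): the same second of data looks great as an average FPS and terrible as a 99th percentile frame time.

```python
# One second of hypothetical frame times: mostly 8 ms frames plus a few 70 ms spikes.
frame_times_ms = [70.0] * 6 + [8.0] * 74   # ~1000 ms of "gameplay"

# Average FPS over the second: frames rendered divided by elapsed time.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# 99th percentile frame time: the frame time that 99% of frames come in at or under.
sorted_times = sorted(frame_times_ms)
idx = int(0.99 * (len(sorted_times) - 1))
p99_ms = sorted_times[idx]

print(f"average: {avg_fps:.0f} fps")                      # ~79 fps, looks smooth
print(f"99th percentile frame time: {p99_ms:.1f} ms "
      f"(~{1000 / p99_ms:.0f} fps)")                      # ~70 ms, reveals the stutter
```

The average comes out around 80fps even though a handful of frames took 70ms each, which is exactly the kind of stutter that number hides.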
The more I'm researching this stuff, the more I can't believe that FPS has been a standard for so long. It's sooooooo bad.
I have medium-small hands, and I definitely cannot claw grip my Sensei. The Spawn/Xornet is the way to go for claw, but since you're left-handed, that's definitely not a good fit. I'll do some research for you and let you know if I find some good stuff.