Sounds like some may not understand the value of the 1% low, which is arguably more important than the average frame rate. You need to see the 1% low in the context of the average to gauge how "consistent" the performance is. It's only 1% though, right? Well, here's what that could look like:
For example, let's say you have a GPU that renders 1000 frames. 1% of 1000 is 10 frames, right? Now say the GPU is rendering at ~100fps (10ms/frame), which works out to roughly 10 seconds to complete those 1000 frames. However, let's say the 1% low is only 60fps (~16.7ms/frame). That means 10 of those 1000 frames take roughly 67% longer than the "average" frame in this example (extreme, but just to illustrate the point). At ~100fps, that actually looks like 10 big frametime spikes (10ms jumping to ~16.7ms) spread over a span of 10 seconds, so roughly one big spike every second for 10 seconds. That would yield an absolutely atrocious gameplay experience despite the average being close to 100fps.
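If you want to sanity-check that arithmetic yourself, here's a tiny Python sketch. The numbers are just the hypothetical example above, not real capture data:

```python
# Rough sketch of the arithmetic above (hypothetical numbers, not a real capture)
frames = 1000
avg_frametime_ms = 10.0           # ~100 fps average
low_frametime_ms = 1000 / 60      # ~16.7 ms, i.e. the ~60 fps 1% low

slow_frames = int(frames * 0.01)                  # 1% of 1000 = 10 frames
run_length_s = frames * avg_frametime_ms / 1000   # ~10 s of gameplay
spike_penalty = low_frametime_ms / avg_frametime_ms - 1

print(f"{slow_frames} frames spike from {avg_frametime_ms:.1f} ms to "
      f"~{low_frametime_ms:.1f} ms (~{spike_penalty:.0%} longer), "
      f"roughly one spike per second over ~{run_length_s:.0f} s")
```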
That example was a bit extreme to illustrate my point, but make no mistake: the 1% low number is extremely impactful to the true performance of a game and the overall experience you will get. Ideally, the 1% low would be very close or equal to the average, which yields the most consistent and smooth experience (i.e. a locked FPS). In practice it's almost never equal, but the closer, the better. More optimized games will generally have those two numbers closer together. On the other hand, you get this nonsense:
Here the average is clearly ~60fps but the 1% low is closer to 50. Those frametime spikes are horrible and completely ruin the experience (as Alex from DF clearly articulated in the video).
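For anyone wondering how that 1% low number actually gets produced: capture tools differ a bit (some average the slowest 1% of frames, others report the 99th-percentile frametime), so treat this as an illustrative sketch under the "average of the slowest 1%" convention, with made-up function names and a toy trace that mimics the chart above:

```python
import numpy as np

def one_percent_low(frametimes_ms):
    """1% low FPS, assuming the 'average of the slowest 1% of frames' convention."""
    ft = np.sort(np.asarray(frametimes_ms, dtype=float))  # slowest frames last
    worst = ft[-max(1, len(ft) // 100):]                   # slowest 1% of frames
    return 1000.0 / worst.mean()                           # ms per frame -> fps

def average_fps(frametimes_ms):
    return 1000.0 / np.mean(frametimes_ms)

# Toy trace: mostly ~16.7 ms frames (60 fps) with periodic ~20 ms spikes (50 fps)
trace = ([16.7] * 99 + [20.0]) * 10
print(f"avg: {average_fps(trace):.1f} fps, 1% low: {one_percent_low(trace):.1f} fps")
# -> avg comes out near 60 fps while the 1% low sits at 50 fps,
#    which is exactly the kind of gap shown in the chart
```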
The fact remains that the console performance here is very consistent: the 30, 40, and 60fps modes rarely stray from their targets. The result is a much better-feeling game than the PC experience with its wildly variable frametimes. I'm sure they'll improve it on PC with further optimization patches eventually, but the console version is much better optimized at the moment.