Frame Latency - Why you should stop talking about FPS

Have you got any results like this with SLI/crossfire?

Not that I'll be ditching my SLI setup for anything else anytime soon, but I'm just curious.
 
Interesting write-up, didn't even know such a thing was a thing. So this is how long it takes for one frame to render?

Is this the graph? 5min match of BF3, all settings max, 64players.
[image: spDD5gY.png]
Yep, that's it.
 
I misspoke. You wouldn't need a normal distribution, but you would need to know the mean. Standard deviation alone wouldn't be enough: a game with a stdev of 0, sounds great... but not if it's running at a steady 100ms/frame.

If you are calculating the standard deviation, you already have your mean. And if the standard deviation is 0 in both cases, then of course we'll take the 60 fps game over the 50 fps one.
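
To put rough numbers on both points, here's a minimal sketch (entirely made-up frame times, in milliseconds) of why the stdev alone isn't enough but the mean/stdev pair tells the story:

[code]
# Hypothetical frame-time logs (ms) showing why you want both the mean and the
# standard deviation, not the standard deviation alone.
import statistics

steady_but_slow = [100.0] * 300        # stdev 0, but a constant 10 fps
fast_but_uneven = [10.0, 23.3] * 150   # averages ~60 fps, alternating frame times

for name, frame_times in (("steady 100 ms", steady_but_slow),
                          ("uneven ~60 fps", fast_but_uneven)):
    mean_ms = statistics.mean(frame_times)
    stdev_ms = statistics.pstdev(frame_times)
    print(f"{name}: mean {mean_ms:.1f} ms ({1000.0 / mean_ms:.0f} fps), stdev {stdev_ms:.1f} ms")
[/code]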
 
One question: is it possible for a frame to take longer than 16.7ms to render at 60fps with vsync? I assume it can't: if it did, it would miss the window for the screen refresh, which should mean the game did not maintain 60fps.

As an AMD user I think this should replace fps in every benchmark and card review ever.
It's a way more meaningful metric.

So sick of the hopelessly uneven framerates in tons of games on my amd card.

This answer goes for both. Here's Skyrim's frame latency with vsync turned on (is there any other setting :/) running at an indicated 60 fps. Worst of all, AFAIK Skyrim already limits frame latency by rendering up to 3 frames ahead, which is what causes some input lag.
[image: q6gr6Zt.png]


As you can guess, it is jerky to play. I tend not to notice the big but short spikes, but a thick line is really awful to play with. Vsync that constantly switched between 17 and 33 ms frame times was especially horrible compared to a different game running at the same frame rate (40ish) but without vsync. A constant 33 ms is much better than that shit.
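
To illustrate the vsync answer above, here's a rough sketch of why a frame that misses the 16.7 ms window shows up as a 33.3 ms frame. This is my own simplification: it assumes plain double-buffered 60 Hz vsync, with no triple buffering or render-ahead queue.

[code]
# Simplified model of double-buffered vsync at 60 Hz: a finished frame is only
# shown on a refresh boundary, so any render time over 16.7 ms costs a whole
# extra refresh (33.3 ms), anything over 33.3 ms costs two (50 ms), and so on.
import math

REFRESH_MS = 1000.0 / 60.0  # ~16.7 ms per refresh

def displayed_frame_time(render_ms):
    """Round a render time up to the next whole refresh interval."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (10.0, 16.0, 18.0, 30.0, 40.0):
    print(f"rendered in {render_ms:4.1f} ms -> displayed after {displayed_frame_time(render_ms):.1f} ms")
[/code]
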
Have you got any results like this with SLI/crossfire?

Not that I'll be ditching my SLI setup for anything else anytime soon, but I'm just curious.

It depends on the config, and like mkenyon said, SLI is more consistent. When it's bad it looks like the aforementioned horrible vsync jitter. TechReport checked it out in their first article on frame latencies.

 
Have you got any results like this with SLI/crossfire?

Not that I'll be ditching my SLI setup for anything else anytime soon, but I'm just curious.
No, but I can tell you SLI will be more consistent. They do this by adding a whole frame of input lag to allow for lining up the output more precisely.
 
Hope this new method of benchmarking games takes off; Tom's Hardware plans to feature it in their reviews going forward, along with their usual average fps data.

More info by TechReport for those interested: Inside the second: A new look at game benchmarking
Tom's is doing a shit job of representing the data though. They're either purposefully dropping some really key information from their charts (95th percentile? seriously?) or don't understand why it's important. Then they fuel the fires of the AMD vs. Intel CPU wars and get a bunch of hits. Really really questionable stuff.
 
Tom's is doing a shit job of representing the data though. They're either purposefully dropping some really key information from their charts (95th percentile? seriously?) or don't understand why it's important. Then they fuel the fires of the AMD vs. Intel CPU wars and get a bunch of hits. Really really questionable stuff.

Heh that's true, I just stumbled upon their initiative while I was looking for TechReport's old article, which your OP reminded me of.
 
This is why I would like to see devs lock the frame rate in certain cases and build the game to maintain that frame rate throughout.
 
Tom's is doing a shit job of representing the data though. They're either purposefully dropping some really key information from their charts (95th percentile? seriously?) or don't understand why it's important. Then they fuel the fires of the AMD vs. Intel CPU wars and get a bunch of hits. Really really questionable stuff.

so tempted to add this to their comments to see if they reply
 
Tom's is doing a shit job of representing the data though. They're either purposefully dropping some really key information from their charts (95th percentile? seriously?) or don't understand why it's important. Then they fuel the fires of the AMD vs. Intel CPU wars and get a bunch of hits. Really really questionable stuff.

Using the %iles is a way to keep their articles neat and clean while still conveying relevant data.
 
People have different requirements though. As a competitive gamer, I really don't care about jarring visuals; I want the lowest possible frame time, since that means more frequent and faster polling of my input.
 
People have different requirements though. As a competitive gamer, I really don't care about jarring visuals; I want the lowest possible frame time, since that means more frequent and faster polling of my input.

I have a feeling that as we get more and more data on this issue, people will begin to figure out which hardware choices will improve their desired gaming outcomes. It's not as if you can take this data and shop around to reduce bad latency times beyond the current method yet.
 
This answer goes for both. Here's Skyrim's frame latency with vsync turned on (is there any other setting :/) running at an indicated 60 fps. Worst of all, AFAIK Skyrim already limits frame latency by rendering up to 3 frames ahead, which is what causes some input lag.
[image: q6gr6Zt.png]


As you can guess, it is jerky to play. I tend not to notice the big but short spikes, but a thick line is really awful to play with. Vsync that constantly switched between 17 and 33 ms frame times was especially horrible compared to a different game running at the same frame rate (40ish) but without vsync. A constant 33 ms is much better than that shit.

Wow, that's disappointing.

The only way that would be happening is if the video card ends up outputting more than 1 frame in a 1/60s time slice to make up for the missed frame. I wonder if that's how they achieve higher benchmark numbers...by essentially outputting useless frames.
 
In the comments section of the Tom's article it looks like they just don't care about it as much as others do.

We absolutely did take latency into account in our conclusion.
I think the problem is that you totally misunderstand the point of measuring latency, and the impact of the results. Please read page 2, and the commentary next to the charts.

To summarize, latency is only relevant if it's significant enough to notice. If it's not significant (and really, it wasn't in any of the tests we took except maybe in some dual-core examples), then, obviously, the frame rate is the relevant measurement.

*IF* the latency *WAS* horrible, say, with a high-FPS CPU, then in that case latency would be taken into account in the recommendations. But the latencies were very small, and so they don't really factor in much. Any CPUs that could handle at least four threads did great, the latencies are so imperceptible that they don't matter.
 
Using the %iles is a way to keep their articles neat and clean while still conveying relevant data.
Yes, but they are choosing 95% instead of 99%, thus covering up that 2% (compared to 99%) of data where things are going wrong.

Look at what this chart would indicate if the top 2.5% was thrown out as compared to the top .5%.

[image: skyrim-latency.gif]

In the comments section of the Tom's article it looks like they just don't care about it as much as others do.
Oh wow. That response clearly demonstrates they don't even know what the data means.
 
Yes, but they are choosing 95% instead of 99%, thus covering up that 2% (compared to 99%) of data where things are going wrong.

Look at what this chart would indicate if the top 2.5% was thrown out as compared to the top .5%.

[image: http://techreport.com/r.x/amd-fx-8350/skyrim-latency.gif]

Either you are confused about 95%ile or I am. The 95%ile mark is showing the worst 5% of frames with respect to latency.

In the comments section of the Tom's article it looks like they just don't care about it as much as others do.

This sounds about right.
 
Great read! I need to do some benchmarking with Batman Arkham City and Darksiders 2 as those two were the ones that immediately came to mind.
 
Either you are confused about 95%ile or I am. The 95%ile mark is showing the worst 5% of frames with respect to latency.
The 95% throws out the top and bottom 2.5% of data to say 'What is the frame time that I can expect 95% of all frames to be rendered at?'

It's a metric long used to assess server performance.
 
The 95% throws out the top and bottom 2.5% of data to say 'What is the frame time that I can expect 95% of all frames to be rendered at?'

It's a metric long used to assess server performance.

No, what you are referring to is 2 standard deviations away from the mean = 95% of the area under the distribution starting from the mean in both directions.

95%ile is the top 5%.
 
When I say FPS, I mean average.

Here's why:

In recent comparisons, AMD gets better "FPS" but higher micro-stuttering.
nVidia has been getting lower "FPS" but much less micro-stuttering.

Either way, I don't care. I just want high frame rates.
 
Great thread, I've been on this train since the first techreport article. I really hope it becomes even more widely used in the future.
 
Really glad this thing has reared its ugly face, now every time I mention the word "SLI", 15 seconds later there's a "OMG LATENCY" comment, really is beautiful.

Especially because I can't feel the difference whatsoever. You could sit me down in front of a new generation single and SLI PC and promise me eternal wealth and I would never be able to tell.
 
No, what you are referring to is 2 standard deviations away from the mean = 95% of the area under the distribution starting from the mean in both directions.

95%ile is the top 5%.
Hrm. Well, where I get my argument from is TechReport:

One way to address that question is to rip a page from the world of server benchmarking. In that world, we often measure performance for systems processing lots of transactions. Oftentimes the absolute transaction rate is less important than delivering consistently low transaction latencies. For instance, here is an example where the cheaper Xeons average more requests per second, but the pricey "big iron" Xeons maintain lower response times under load. We can quantify that reality by looking at the 99th percentile response time, which sounds like a lot of big words but is a fundamentally simple concept: for each system, 99% of all requests were processed within X milliseconds. The lower that number is, the quicker the system is overall. (Ruling out that last 1% allows us to filter out any weird outliers.)

Oddly enough, we want to ask the same basic question about gaming performance. We want our systems to ensure consistently low frame times, and doing so is arguably more important than achieving the highest FPS rate.

http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking/3
Really glad this thing has reared its ugly face, now every time I mention the word "SLI", 15 seconds later there's a "OMG LATENCY" comment, really is beautiful.

Especially because I can't feel the difference whatsoever. You could sit me down in front of a new generation single and SLI PC and promise me eternal wealth and I would never be able to tell.
Not everyone can be a pro gamer Sethos, it's okay. :P
 
Hrm. Well, where I get my argument from is TechReport:

We can quantify that reality by looking at the 99th percentile response time, which sounds like a lot of big words but is a fundamentally simple concept: for each system, 99% of all requests were processed within X milliseconds. The lower that number is, the quicker the system is overall. (Ruling out that last 1% allows us to filter out any weird outliers.)

They are saying the same exact thing Tom's website is saying. 99%ile is looking at the 1%.
 
So in a perfect world we would like to shop for a card that gets a +60fps on the 0.1% and 1% of this graph, right???

[image: spDD5gY.png]


Also, is this what is commonly called "microstuttering"? I remember my CrossFire setup saying I was hitting 100fps yet it felt more like 25fps sometimes.
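
For what it's worth, here's one way numbers like the "1%" and "0.1%" marks on that kind of chart can be pulled out of a raw frame-time log. The data below is randomly generated just for illustration, and the helper name is my own:

[code]
# Turn a frame-time log (ms) into fps figures at the 99th and 99.9th percentile
# frame times, i.e. the values the worst 1% / 0.1% of frames exceed.
import numpy as np

frame_times_ms = np.random.default_rng(0).gamma(shape=9.0, scale=1.8, size=10_000)  # fake capture

def percentile_fps(times_ms, pct):
    """fps equivalent of the frame time that pct% of frames come in under."""
    return 1000.0 / np.percentile(times_ms, pct)

print(f"average fps       : {1000.0 / frame_times_ms.mean():.1f}")
print(f"fps at 99th pct   : {percentile_fps(frame_times_ms, 99.0):.1f}")
print(f"fps at 99.9th pct : {percentile_fps(frame_times_ms, 99.9):.1f}")
[/code]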
 
Not everyone can be a pro gamer Sethos, it's okay. :P

Which is funny because I'm an excellent player when it comes to FPS games and most people wouldn't notice SLI frame latency whatsoever. However, now it's suddenly a 'thing' and there are graphs and write-ups everywhere, and everybody suddenly feels a 3-day delay and is a professor in frame latency.
 
I've noticed something very frustrating but common and I've mentioned it quite a few times in many threads. I often get stutter when in full screen mode but when I switch to windowed mode it vanishes. Is there a known link between full screen mode and frame latency?
 
They are saying the same exact thing Tom's website is saying. 99%ile is looking at the 1%.
Maybe I'm not understanding how this: "(Ruling out that last 1% allows us to filter out any weird outliers.)" equates to "focusing on the 1%". That seems to indicate the opposite.
 
Maybe I'm not understanding how this: "(Ruling out that last 1% allows us to filter out any weird outliers.)" equates to "focusing on the 1%". That seems to indicate the opposite.

Edit: I wasn't clear with my wording from the previous post.

It indicates the opposite because they are using the term "percentile" incorrectly. Even if it was reversed (which it is not), there is no cutting off of the 5% at 2.5% on each end.

95%ile means - top 5%
99%ile means - top 1%

The copy machine people are looking for the number X over the shortest 99% of cases. That is not the 99%ile.

Returning to your graph:

[image: skyrim-latency.gif]


The 95% would include all data from the 95% mark to "100%"
 
It indicates the opposite because they are using the term "percentile" incorrectly. Even if it was reversed (which it is not), there is no cutting off of the 5% at 2.5% on each end.
*edit*

okay, I get what you are saying now. What's the accurate term then for looking at the 99% of results?

*edit 2*

Actually, isn't it more accurate to say 99% = the number separating the 99% from 1%? Or, the number separating 95% from the 5%? It would be the number right on the line, correct?
 
Really glad this thing has reared its ugly face, now every time I mention the word "SLI", 15 seconds later there's a "OMG LATENCY" comment, really is beautiful.

Especially because I can't feel the difference whatsoever. You could sit me down in front of a new generation single and SLI PC and promise me eternal wealth and I would never be able to tell.

I'm exactly the same, so unless data like this pulls out some really ugly results I don't really care, as it makes no difference to me.
 
Can't it go either way? Like I said, doesn't the number represent the point in which the two percentiles are separated?

*edit the edit to edit!*

(not editing, just thinking of how our discussion here looks funny)
 
okay, I get what you are saying now. What's the accurate term then for looking at the 99% of results?

Actually, isn't it more accurate to say 99% = the number separating the 99% from 1%? Or, the number separating 95% from the 5%? It would be the number right on the line, correct?

Can't it go either way? Like I said, doesn't the number represent the point in which the two percentiles are separated?

Well, I'm not sure what the technical term for looking at the bottom 99% of the data is, lol.

You wouldn't want to use a number or data point to separate the data because then you wouldn't have any place for that number to go. I suppose if we had the numbers 1-100, the 95%ile numbers would be 96 and up. But then 95 would be part of the lower group.

If we had 40 frames of data, the 10 frames with the highest latency would be the ones above the 75%ile mark.
 
Okay, I think we are saying the same thing but in different ways.

Where it gets confusing is that the higher % you go in frame time, the WORSE the result is. This is different than the normal way of looking at percentile. Having a number LOWER than the 4% above it misrepresents that 4% of data.
 
Okay, I think we are saying the same thing but in different ways.

Where it gets confusing is that the higher % you go in frame time, the WORSE the result is. This is different than the normal way of looking at percentile. Having a number LOWER than the 4% above it misrepresents that 4% of data.

percentiles aren't about your opinion/definition of the best or worst. they are based on how you set your data up. we could easily take 95%ile of lowest latency - so that we could see the 5% of frames with the lowest rendering time. it's just that tom's website is gathering the "high render time" frames. you can easily make your own chart to show results how you want ;)
 
Which is funny because I'm an excellent player when it comes to FPS games and most people wouldn't notice SLI frame latency whatsoever. However, now it's suddenly a 'thing' and there are graphs and write-ups everywhere, and everybody suddenly feels a 3-day delay and is a professor in frame latency.
Oh come on, take a good hearted jab.

Me, NVIDIA, and most of the gaming community would agree with your sentiment because most people don't even know about it.

Every setup is different though, so if the person already has a substantial amount of input lag due to their monitor and input devices, that 1 frame might be the thing that pushes it over the limits of perception.
 
I misspoke. You wouldn't need a normal distribution, but you would need to know the mean. Standard deviation alone wouldn't be enough: a game with a stdev of 0, sounds great... but not if it's running at a steady 100ms/frame.

Yeah, basically we still need the FPS, but accompanied by the frame latency standard deviation.
 
percentiles aren't about your opinion/definition of the best or worst. they are based on how you set your data up. we could easily take 95%ile of lowest latency - so that we could see the 5% of frames with the lowest rendering time. it's just that tom's website is gathering the "high render time" frames. you can easily make your own chart to show results how you want ;)
Now that I understand this whole thing better, when you say 99th percentile, you are not looking at the top 1%. You are looking at the number that separates the top 1% from the bottom 99%. Thus, when you say 95th percentile, you are looking at the number that separates the top 5% from the bottom 95%.

In this light, I think the purpose of the metric, my critique of Tom's approach, and TechReport's use of it all hold up.
 
Now that I understand this whole thing better, when you say 99th percentile, you are not looking at the top 1%. You are looking at the number that separates the top 1% from the bottom 99%. Thus, when you say 95th percentile, you are looking at the number that separates the top 5% from the bottom 95%.

It can be used either way. In the context of Tom's graphs, 95%ile clearly refers to all cases outside of that 95%. From your wikipedia link "For example, if a score is in the 86th percentile, it is higher than 85% of the other scores." so 86%ile means 85% of scores below. The grouping above is more or less irrelevant.

Percentiles just split up the data so we can look at it in meaningful ways.
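
A quick sketch with made-up frame times shows why the choice between 95th and 99th matters in practice: a run can have a clean-looking 95th-percentile number even while 2% of its frames are big stutters.

[code]
# Made-up run of 1000 frames: mostly 16.7 ms, a plateau of 33.3 ms frames,
# and 2% of frames at a nasty 80 ms.
import numpy as np

frame_times_ms = np.array([16.7] * 940 + [33.3] * 40 + [80.0] * 20)

p95 = np.percentile(frame_times_ms, 95)  # 95% of frames are at or under this
p99 = np.percentile(frame_times_ms, 99)  # 99% of frames are at or under this

print(f"95th percentile frame time: {p95:.1f} ms")  # 33.3 ms - the 80 ms stutters are invisible
print(f"99th percentile frame time: {p99:.1f} ms")  # 80.0 ms - they show up here
[/code]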
 
Question,

is this the reason why there are some games that are 30fps and yet some praise the game for how smooth it is DESPITE being only 30fps?
E.g (Resistance Fall of Man)

Or more specifically, is the smooth(isity) of gameplay dependent more on the standard deviation of latency than on the framerate?

So if we played a game at 20fps but the latency std/dev was 0ms
vs a game played at 40fps with a latency std/dev of 10ms

Are you suggesting that this could cause a lower FPS to appear on par?
 
Question,

is this the reason why there are some games that are 30fps and yet some praise the game for how smooth it is DESPITE being only 30fps?
E.g (Resistance Fall of Man)

Or more specifically, is the smooth(isity) of gameplay dependent more on the standard deviation of latency than on the framerate?

So if we played a game at 20fps but the latency std/dev was 0ms
vs a game played at 40fps with a latency std/dev of 10ms

Are you suggesting that this could cause a lower FPS to appear on par?
That does seem to follow.

You know, if 30 FPS wasn't complete crap from the beginning. :P

Genuinely though, yes. A frame latency graph showing a consistent 33.3 ms frame time would appear smoother than one that oscillates between 16.7 and 33.3 ms.
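
To put numbers on that (hypothetical frame times again): a locked 30 fps and a game that alternates between 16.7 ms and 33.3 ms come out very differently in consistency even though the second one "wins" on average fps.

[code]
# Locked 30 fps vs a "faster" game that alternates 16.7 / 33.3 ms every frame:
# the second averages ~40 fps but is far less consistent frame to frame.
import statistics

locked_30fps = [33.3] * 100
oscillating  = [16.7, 33.3] * 50

for name, times in (("locked 30 fps", locked_30fps), ("oscillating vsync", oscillating)):
    avg_fps = 1000.0 / statistics.mean(times)
    jitter = statistics.pstdev(times)
    print(f"{name}: avg {avg_fps:.0f} fps, frame-to-frame stdev {jitter:.1f} ms")
[/code]
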
It can be used either way. In the context of Tom's graphs, 95%ile clearly refers to all cases outside of that 95%. From your wikipedia link "For example, if a score is in the 86th percentile, it is higher than 85% of the other scores." so 86%ile means 85% of scores below. The grouping above is more or less irrelevant.

Percentiles just split up the data so we can look at it in meaningful ways.
Thank you sincerely for helping me with that. GAF is the best.
 