Frame Latency - Why you should stop talking about FPS

The 99th percentile is essentially a replacement for average FPS, since what you're saying is "99% of frames will be rendered within X ms".

Granted, it can be read as "1% of all frames were rendered above X ms", but that's not the intent, as the frame times in that worst 1% can climb dramatically past that number.
 
Thanks for the thread. Took a few minutes to really grasp all this new knowledge and the new graphs, but it makes much more sense. 16.7 or bust!


Edit: Totally off topic, but I just noticed I'm a member now. Sexy
 
The 99th percentile is essentially a replacement for average FPS, since what you're saying is "99% of frames will be rendered within X ms".

Granted, it can be read as "1% of all frames were rendered above X ms", but that's not the intent, as the frame times in that worst 1% can climb dramatically past that number.

It's not necessarily indicative of overall quality. A few problem spots do not break a game. Now, if you've got consistent stuttering or hitching, that's a problem, but the 99th percentile doesn't make a distinction between the two. My point is that there are lots of ways to analyze this data, and choosing a single metric isn't the way to go. Let's make use of ALL of our analytical tools.

I think coming up with a set of metrics is the way to go (e.g. FPS, 99th percentile frame latency, median frame latency, 99th percentile FPS, etc.). It's also valuable to look at the distribution of frame latencies. The cumulative distribution you're using now is nice, but a frequency distribution could be valuable too.
 
It's not necessarily indicative of overall quality. A few problem spots do not break a game. Now, if you've got consistent stuttering or hitching, that's a problem, but the 99th percentile doesn't make a distinction between the two. My point is that there are lots of ways to analyze this data, and choosing a single metric isn't the way to go. Let's make use of ALL of our analytical tools.

I think coming up with a set of metrics is the way to go (e.g. FPS, 99th percentile frame latency, median frame latency, 99th percentile FPS, etc.). It's also valuable to look at the distribution of frame latencies. The cumulative distribution you're using now is nice, but a frequency distribution could be valuable too.
99th percentile frame latency and FPS are the same thing :P

Just divide 1000 by the frame time number to arrive at the FPS number.

And yes, while I agree with you that multiple statistics need to be used, there is absolutely no use whatsoever for second-based polling FPS numbers.
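
If it helps, here's a rough sketch (plain Python, frame times completely made up) of pulling the usual numbers out of the same raw frame-time data, including the divide-1000-by-the-frame-time conversion:

# All numbers invented, purely to show the arithmetic.
frame_times_ms = [16.7, 15.9, 17.2, 16.4, 33.1, 16.8, 16.5, 41.0, 16.6, 16.9]

# Average FPS: total frames divided by total time in seconds.
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

# 99th percentile frame time: 99% of frames completed within this many ms.
p99_ms = sorted(frame_times_ms)[int(round(0.99 * (len(frame_times_ms) - 1)))]

print(f"average FPS: {avg_fps:.1f}")
print(f"99th percentile: {p99_ms:.1f} ms, or {1000.0 / p99_ms:.0f} in FPS terms")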
Ouch, this is not good for AMD's mind share at all.
Eh, I don't know. Their single cards are kicking ass and outperforming NVIDIA cards that cost $100 more.
 
99th percentile frame latency and FPS are the same thing :P

Just divide 1000 by the frame time number to arrive at the FPS number.

And yes, while I agree with you that multiple statistics need to be used, there is absolutely no use whatsoever for second-based polling FPS numbers.

I really don't understand how you think they're the same. I know what frame latency is, but a 99th percentile is nothing like an average. They're fundamentally different metrics. One tells you how bad the worst 1% of frame latencies are; the other tells you what your average number of frames per second is (and you can divide 1000 by FPS to get the average frame latency in ms). They're very different mathematical processes with very different meanings.

Sure, second-based polling is arbitrary, but that doesn't mean looking at a distribution of means lacks value. Let me give you an example. Let's say you have a 100-second sequence. You get 40 fps for 99 seconds, and for 1 second it dips to 20 fps because it's loading a new section or something. That doesn't make the game unplayable, but the 99th percentile makes no distinction between this and 1 slow frame each second, which would be much worse for the player.

In the former situation, a mean like FPS lets you know your expected experience, ignoring a few hiccups.
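
Here's that example as a toy Python sketch (all the numbers are invented) so you can see what each summary actually reports:

# Toy model of the example: 99 seconds at 40 fps (25 ms frames),
# then 1 second at 20 fps (50 ms frames) while a new section loads.
frame_times_ms = [25.0] * (99 * 40) + [50.0] * 20

avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
p99_ms = sorted(frame_times_ms)[int(0.99 * (len(frame_times_ms) - 1))]

# Per-second FPS trace: count how many frames land in each second of the run.
per_second, elapsed, count = [], 0.0, 0
for ft in frame_times_ms:
    elapsed += ft
    count += 1
    if elapsed >= 1000.0:
        per_second.append(count)
        elapsed -= 1000.0
        count = 0

print(f"average FPS: {avg_fps:.1f}")             # ~39.8, barely dented by the dip
print(f"99th percentile: {p99_ms:.1f} ms")        # the dip is under 1% of all frames here
print(f"slowest second: {min(per_second)} fps")   # the per-second view is what finds it

The mean and the percentile both mostly shrug at a one-off hiccup like that; a per-second (or per-frame) trace is what actually shows you where it happened.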
 
I really don't understand how you think they're the same. I know what frame latency is, but a 99th percentile is nothing like an average. They're fundamentally different metrics. One tells you how bad the worst 1% of frame latencies are; the other tells you what your average number of frames per second is (and you can divide 1000 by FPS to get the average frame latency in ms). They're very different mathematical processes with very different meanings.

Sure, second-based polling is arbitrary, but that doesn't mean looking at a distribution of means lacks value. Let me give you an example. Let's say you have a 100-second test sequence. You get 40 fps for 99 seconds, and for 1 second it dips to 20 fps because it's loading a new section or something.

I think coming up with a set of metrics is the way to go (e.g. FPS, 99th percentile frame latency, median frame latency, 99th percentile FPS, etc.). It's also valuable to look at the distribution of frame latencies. The cumulative distribution you're using now is nice, but a frequency distribution could be valuable too.

That is what I was referring to.

I definitely understand that average FPS and 99th percentile are different.

You can also look at the 99th percentile as a baseline: the number within which 99% of all frames are rendered. It's a dividing point, not an analysis of the numbers above or below it.
 
That is what I was referring to.

I definitely understand that average FPS and 99th percentile are different.

You can also look at the 99th percentile as a baseline: the number within which 99% of all frames are rendered. It's a dividing point, not an analysis of the numbers above or below it.

I don't think we disagree that much, haha. I'm just saying that using only the 99th percentile can lead to unexpected results (e.g. games that are perfectly playable looking like they aren't).
 
reading that pc per article and getting my mind fucking blown

it might deserve its own thread if this one wasn't so short

"High framerate PC game stutter explained!" or some such sensational ridiculousness

edit: also, jesus christ

[attached image: BF3_1920x1080_STUT.png]
 
Yep yep. And it's not even just stutter that makes this so insane. It's completely tossing everything we thought we knew about game performance in the garbage.

So glad people are doing this.
 
I have a Radeon HD 5970 (two GPUs on one card), and the observed frame rate usually feels a lot lower than what the FPS counter tells me. I hope AMD's issues can be fixed through driver updates, though the 5xxx series is old enough that they might not even bother.
 
Reading the Anandtech article, I find this paragraph in particular very strange:

As a result FRAPS is best described as a coarse tool. It can see particularly egregious stuttering situations – like what AMD has been experiencing as of late – but it cannot see everything. It cannot see stuttering issues the context queue hides, and it's particularly blind to what's going on in multi-GPU scenarios.

It's pretty clear that Fraps nearly mirrors the FCAT stuff in PCPerspective's tests for single GPUs. How that is 'coarse' is beyond me.
 
Yep yep. And it's not even just stutter that makes this so insane. It's completely tossing everything we thought we knew about game performance in the garbage.

So glad people are doing this.

I found it amazing that dropped/runt frames were literally making CrossFire setups perform worse than single cards in some situations.
 
Reading the Anandtech article, I find this paragraph in particular very strange:



It's pretty clear that Fraps nearly mirrors the FCAT stuff in PCPerspective's tests for single GPUs. How that is 'coarse' is beyond me.


It is a coarse tool because it captures frame timing at the point frames are submitted, before DirectX and the rest of the pipeline have touched them.

Even in the PCPer article they trash FRAPS for the same reason.

There is also a completely different discussion on the advantages and differences of capturing data from FRAPS versus capturing data with our Frame Rating methodology. I believe that the advantages of hardware capture outweigh the concerns currently, but in reality the data that FRAPS generates isn’t that important, it just happens to be the closest data point to another metric we would love to know more about: game time.

If the results were mirroring FRAPS, then the observed frame rate and the actual FRAPS frame rate would've been identical for all GPU setups tested.
 
The point is, though, that they bash it in theory and then the end results (other than dual-GPU configurations) are right on the money. That seems really inconsistent.
 
Let's see if I can't take a stab at this; tell me if this is basically what you're saying.

So you have two games, both a stable 60 fps.

Game A's frames get shown within that second, each equally spaced apart in time, thus smooth.

Game B's frames get shown within that second, but the times they show up vary from slow to fast; by the end of that second, 60 frames still got in?

Thus, while they are both technically 60 fps, game A is smoother, and game B is the game with that stutter you can't really pinpoint but can tell something is off?
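
Something like this, if I sketch it out in Python with made-up frame times:

# Two games, both "60 fps": 60 frames delivered in roughly one second.
game_a = [16.7] * 60           # game A: every frame ~16.7 ms apart
game_b = [8.3, 25.0] * 30      # game B: frames alternate fast/slow, but 60 still land

for name, times in (("A", game_a), ("B", game_b)):
    fps = len(times) / (sum(times) / 1000.0)
    print(f"game {name}: {fps:.0f} fps, worst frame {max(times):.1f} ms")

Both print ~60 fps, but game B's slowest frames are paced like a 40 fps game, which is the stutter you feel but can't see on the FPS counter.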
 
That is the essence of what has been uncovered by switching to this new methodology.

However, the new methodology also accurately captures frame rate.

The other nifty thing being shown in the PCPer article is that there are these 'runt' frames that exist as anomalies. These are counted towards FPS data but, in fact, do absolutely nothing to increase the perceived frame rate or smoothness. They can even throw smoothness off.
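
A rough sketch of why runts inflate the number (Python, with an invented millisecond cutoff; as I understand it the actual Frame Rating analysis flags runts by how few scanlines a frame occupies in the captured video, not by a fixed time threshold):

# Rough illustration of the runt-frame idea (cutoff and frame times invented).
# A runt is on screen for so little time it adds nothing you can actually see,
# but a plain FPS counter still counts it as a frame.
frame_times_ms = [16.7, 1.2, 16.5, 0.9, 17.0, 1.1, 16.4, 16.8, 1.0, 16.6]
RUNT_CUTOFF_MS = 4.0   # made-up threshold for this sketch

total_s = sum(frame_times_ms) / 1000.0
reported_fps = len(frame_times_ms) / total_s                                   # counts everything
observed_fps = len([t for t in frame_times_ms if t >= RUNT_CUTOFF_MS]) / total_s

print(f"reported FPS (runts counted): {reported_fps:.0f}")
print(f"observed FPS (runts dropped): {observed_fps:.0f}")

The reported number looks great on paper; the observed number is closer to what your eyes actually get.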
 
The point is, though, that they bash it in theory and then the end results (other than dual-GPU configurations) are right on the money. That seems really inconsistent.

I feel the same way. It seems like they're anxious to validate their testing methods out of a need to separate themselves from Fraps/Scott's methods.

Have you considered updating the OP with some of these articles released today?

http://techreport.com/review/24553/inside-the-second-with-nvidia-frame-capture-tools

http://anandtech.com/show/6862/fcat-the-evolution-of-frame-interval-benchmarking-part-1

http://www.pcper.com/reviews/Graphi...ils-Capture-based-Graphics-Performance-Testin
 
I feel the same way. It seems like they're anxious to validate their testing methods out of a need to separate themselves from Fraps/Scott's methods.

Have you considered updating the OP with some of these articles released today?

http://techreport.com/review/24553/inside-the-second-with-nvidia-frame-capture-tools

http://anandtech.com/show/6862/fcat-the-evolution-of-frame-interval-benchmarking-part-1

http://www.pcper.com/reviews/Graphi...ils-Capture-based-Graphics-Performance-Testin
Yeah, will update with TechReport and Anand, already added the PC Per one.

I think I'm actually going to split these off into their own thread and link back to this one for further discussion to see if more people check it out.
 
Pretty damning for AMD.

Sooner all the other sites update to this methodology the better.

Is it just CrossFire/SLI stuff? Don't care, and definitely not many people use those, and they're already known to have issues compared to single cards (e.g. the whole microstuttering thing in the past). It's unfair to rush to blanket-condemn AMD over that; it seems like some fanboyism creeping into it. Don't get me wrong, AMD needs to sort "it", whatever "it" is, but trying to make it seem like there's something intrinsically wrong with AMD cards strikes me the wrong way. It's just something they can fix in drivers.

Interestingly, Wasson started all this "AMD sucks at advanced metrics" stuff, and his reviews actually seem to show Nvidia cards doing poorly just as often as AMD. It seems kinda random, really. Case in point: their new 650 Ti Boost article, which shows the card doing worse than AMD cards in 99th-percentile metrics.

In a strange reversal of roles, it's Nvidia who has suffered from unruly frame latencies this time. Both versions of the GTX 650 Ti Boost fared poorly in Crysis 3 and Sleeping Dogs. They scored easy victories in the other games, but their 99th-percentile frame latencies weren't substantially higher than those of the Radeons.

Until there's consistency I don't know what to think. Anandtech is working on something, though (they say FRAPS is not a good tool for this, so they're working on something else), and I'm guessing it'll be the gold standard because that site tends to be.

One thing's for sure, Wasson started a major change and deserves a ton of credit.

But I also wonder if we aren't going a little TOO far with this stuff sometimes. I mean, I've been playing PC games for many years, and you know, my eyes aren't the most sensitive, but games seemed smooth enough to me in the past, before all this frame latency stuff was a "thing" and we just used blind FPS as a measuring stick... I wonder if human eyes can really notice a lot of this stuff. I mean, if PC games were really a stutterfest, well, we would have had a lot of issues playing them in the past.

It'd be nice to get a consistent metric out of this that can be used across sites along with FPS one day. We're not there yet; it's early days.

This whole thing reminds me a tad of the sabermetrics stat revolution in baseball, almost, on a much smaller, more narrow scale. A new way of thinking about things.
 
Interesting bit from TechReport's findings:

Going forward, there's still tons of work to be done. For starters, we need to spend quite a bit more time understanding the problems of multi-GPU micro-stuttering, runt frames, and the like. The presence of these things in our benchmark results may not be all that noteworthy if overall performance is high enough. The stakes are pretty low when the GPUs are constantly slinging out new frames in 20 milliseconds or less. I've not been able to perceive a problem with micro-stuttering in cases like that, and I suspect those who claim to are seeing extreme cases or perhaps other issues entirely. Our next order of business will be putting multi-GPU teams under more stress to see how micro-stuttering affects truly low-frame-rate situations where animation smoothness is threatened. We have a start on this task, but we need to collect lots more data before we are ready to draw any conclusions. Stay tuned for more on that front. I'm curious to see what other folks who have these tools in their hands have discovered, too.
 
New article up on Anandtech - AMD Comments on GPU Stuttering, Offers Driver Roadmap & Perspective on Benchmarking

Is it just CrossFire/SLI stuff? Don't care, and definitely not many people use those, and they're already known to have issues compared to single cards (e.g. the whole microstuttering thing in the past). It's unfair to rush to blanket-condemn AMD over that; it seems like some fanboyism creeping into it. Don't get me wrong, AMD needs to sort "it", whatever "it" is, but trying to make it seem like there's something intrinsically wrong with AMD cards strikes me the wrong way. It's just something they can fix in drivers.

Interestingly, Wasson started all this "AMD sucks at advanced metrics" stuff, and his reviews actually seem to show Nvidia cards doing poorly just as often as AMD. It seems kinda random, really. Case in point: their new 650 Ti Boost article, which shows the card doing worse than AMD cards in 99th-percentile metrics.

I've only ever had ATI/AMD cards so no, there is no fanboyism creeping in.

Edit - I tell a lie, I think I remember having an Nvidia GPU in a Dell machine back in 2000.
 
I'm snipping a lot of your post because there are a ton of interesting discussion points you bring to the table.
It's unfair to rush to blanket-condemn AMD over that; it seems like some fanboyism creeping into it. Don't get me wrong, AMD needs to sort "it", whatever "it" is, but trying to make it seem like there's something intrinsically wrong with AMD cards strikes me the wrong way. It's just something they can fix in drivers.
Conversely, I think it is silly to disassociate a card from its performance, even if it's down to drivers. Definitely not fanboyism here, as my main machine is running a 7970. If anything, I feel slightly more loyal to AMD than NVIDIA. Heck, I didn't even use Intel CPUs until two years ago (of course, way back in P3 and earlier days I did as well).
Until there's consistency I don't know what to think. Anandtech is working on something, though (they say FRAPS is not a good tool for this, so they're working on something else), and I'm guessing it'll be the gold standard because that site tends to be.
The TechReport article posts the FRAPS results in comparison to the FCAT results, and they are extremely similar outside of CrossFire/SLI results.

I wonder if human eyes can really notice a lot of this stuff. I mean, if PC games were really a stutterfest, well, we would have had a lot of issues playing them in the past.
The only experience I've had with CrossFire was on 5870s with a 60 Hz screen, and it was mostly broken. I am really curious myself, though, and I think I'm going to order a second 7970 specifically so I can try this stuff out.

It'd be nice to get a consistent metric out of this that can be used across sites along with FPS one day. We're not there yet; it's early days.
This mindset is something I've been trying to battle a bit elsewhere, so I'm just gonna copy/pasta two posts of mine on OCN, apologies if some of it seems a bit off.

Look at it this way: FPS and frame times are measuring the same thing. Frame times measure it in millimeters while FPS measures it in meters. It's an inherently more accurate way of testing.

How the data is presented is where it can differ. The reason frame times are being used to measure fluidity is that they're accurate enough to do so, while FPS isn't. You can still use frame times to determine things like average frame rate.

So frame times show you how long each frame took to render. FPS shows you the same thing, but it lumps together all of the frames rendered over the course of one second instead.

In a perfect world, if you have 120 consecutive frames each drawn in 8.3 ms, you would have 120 FPS: 1000 ms / 8.3 ms ≈ 120 frames in one second.

How the data is put into different statistics is how you can find variance and mean.

The important thing to note is that FPS only polls data once a second, and then averages it. Over the course of that second, you could have some frames being rendered in 16.7 ms, some in 33.3 ms, and some in 8.3 ms. Let's say this averaged out to 60 frames per second. That isn't indicative of what actually transpired, though.

So what I'm saying is that frame time testing is inherently more accurate and will supplant FPS testing. FPS testing isn't an additional test that provides more information; it's a less accurate and redundant test once you put your frame time data into similar statistics. The 99th percentile, for example, is more or less a replacement for average FPS.
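
If you want to see what that once-a-second polling hides, here's a tiny sketch (Python, frame times picked arbitrarily to land near 60 fps):

# What a once-a-second FPS counter reports vs. the frame times underneath it.
frame_times_ms = [8.3, 8.3, 16.7, 33.3] * 15    # 60 frames in ~999 ms

window_ms = sum(frame_times_ms)
counter_fps = len(frame_times_ms) * 1000.0 / window_ms

print(f"{len(frame_times_ms)} frames in {window_ms:.0f} ms -> counter reads {counter_fps:.0f} fps")
print(f"individual frames ranged from {min(frame_times_ms)} ms to {max(frame_times_ms)} ms")

The counter happily reports ~60 fps for that second even though individual frames varied by a factor of four; the raw frame times are the only place that variation survives.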
 
Scott Wasson said:
I've not been able to perceive a problem with micro-stuttering in cases like that, and I suspect those who claim to are seeing extreme cases or perhaps other issues entirely.

I haven't witnessed the effect in person, but if PCPer's results are to be believed, I don't know how anyone could not notice it when it's basically halving the framerate in a game like BF3. I guess if Tech Report is testing with a 60 Hz monitor, that might mask the issue quite a bit since it would be harder to notice a drop from ~110 FPS to ~55 FPS on a 60 Hz display.

But for Sleeping Dogs, PCPer's 7970 CF setup is getting an observed FPS of around 30, compared to a reported FPS consistently over 60... I think that would be pretty hard to miss.
 
I have a Radeon HD 5970 (two GPUs on one card), and the observed frame rate usually feels a lot lower than what the FPS counter tells me. I hope AMD's issues can be fixed through driver updates, though the 5xxx series is old enough that they might not even bother.
I'm hopeful that it's actually across the board, whatever it is that they're able to fix. I haven't read anything saying whether it's only 7xxx or it's across the board, but I imagine that there'll be at least some kind of fix coming our way, given that it's an issue they've only just realised they have, but one that appears to have been around for at least 4-5 years.
 
Another site on-board, good to see.

Yeah, average FPS is still (in my opinion) the most important denominator in terms of determining how fast a game can be rendered.

Which is not the case as we've seen with PCPer's work.
 
Pretty significantly. TechReport has done a few reviews since then.

I have a ton of data on a number of esports titles I'll be redoing with the new drivers as well, and I'll report back when I have it together.

This is no different from FPS in what it represents; it's just more accurate in that it's the raw data rather than an average over the course of a second. So, yes, downsampling is harder for your PC to render, so each frame will take longer than it would have otherwise.

Oh wow, this explains why I'm experiencing mouselag when I enable SSAA in some titles even when I have a solid 60+ frames per second. Nice to have a definite answer.

So you can compare it to a highway then? There are 60 cars on the road but some of the cars are slower than others, so it takes longer for them to get from point A (GPU) to point B (Your monitor) thus introducing (in my example) mouse-lag?
 
Good shit. <3 Guru3D.
Agreed, they were one of my go-to websites, and I was getting a bit sad to see them not including frame latency data as the primary metric in their reviews. Glad to see them coming around.

They're one of those websites that I just expect 100% professionalism out of. No silly exposés, tightly controlled variables, and always fair reviews.
Oh wow, this explains why I'm experiencing mouselag when I enable SSAA in some titles even when I have a solid 60+ frames per second. Nice to have a definite answer.

So you can compare it to a highway then? There are 60 cars on the road but some of the cars are slower than others, so it takes longer for them to get from point A (GPU) to point B (Your monitor) thus introducing (in my example) mouse-lag?
Yes, definitely.
 
I haven't witnessed the effect in person, but if PCPer's results are to be believed, I don't know how anyone could not notice it when it's basically halving the framerate in a game like BF3. I guess if Tech Report is testing with a 60 Hz monitor, that might mask the issue quite a bit since it would be harder to notice a drop from ~110 FPS to ~55 FPS on a 60 Hz display.

But for Sleeping Dogs, PCPer's 7970 CF setup is getting an observed FPS of around 30, compared to a reported FPS consistently over 60... I think that would be pretty hard to miss.

For SLI and CrossFire though, there isn’t even a debate and the real-world value of adding a second HD 7970 to your system is near zero.
I can definitely tell the difference between 30fps and 60fps, and my BF3 experience doesn't line up with their conclusions. That would indicate a severe 7xxx specific driver issue, or a problem with their methodology. I would like to see them test AMD GPUs from a different series, and test the same card with updated drivers.
 
They're only looking at the end of the pipeline. I'm not sold on the discounting of runt frames, as the way the game feels is much more dependent on the FRAPS data from the beginning of the pipeline. What they are testing is basically what the monitor is getting, which is not totally indicative of what the engine is doing, and as a result, of how the game feels and is perceived.
 
Agreed, they were one of my go-to websites, and I was getting a bit sad to see them not including frame latency data as the primary metric in their reviews. Glad to see them coming around.

They're one of those websites that I just expect 100% professionalism out of. No silly exposés, tightly controlled variables, and always fair reviews.
Their description of the problem is the first one I've seen that's made 100% sense without getting overly technical (with the diagram of the frames and how the nVidia cards are evenly timed where the ATI ones aren't). That 30 minute video on PC... whatever, nowhere near as concise as the G3D one.

It's also an awesome site because the creators of MSI Afterburner, RadeonPro AND SweetFX all frequent there. I actually just read that AMD have sent Japamd (creator of RadeonPro) three LCD screens so he can play with/test Eyefinity and RP. Pretty neat.
 
I can definitely tell the difference between 30fps and 60fps, and my BF3 experience doesn't line up with their conclusions. That would indicate a severe 7xxx specific driver issue, or a problem with their methodology. I would like to see them test AMD GPUs from a different series, and test the same card with updated drivers.

Ah, I missed that last quote. Yeah, it would be interesting to see if the problem translates to older series.

They're only looking at the end of the pipeline. I'm not sold on the discounting of runt frames, as the way the game feels is much more dependent on the FRAPS data from the beginning of the pipeline. What they are testing is basically what the monitor is getting, which is not totally indicative of what the engine is doing, and as a result, of how the game feels and is perceived.

That's a good point. Even though the game looks like it's running at 30fps, it would still have input latency more like a 60fps game.

That raises an interesting question: are there any games/engines that detach the simulation speed from the rendering speed? Could you have a game that simulates at 200fps but only sends data to the GPU at 60fps?
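
I'm guessing a fixed simulation timestep is how you'd do it; here's a bare-bones sketch of the idea (Python, rates picked arbitrarily, not any particular engine's actual loop):

import time

SIM_DT = 1.0 / 200.0      # advance the game state at a fixed 200 Hz
RENDER_DT = 1.0 / 60.0    # hand data to the renderer/GPU at roughly 60 Hz

def update(state, dt):
    state["t"] += dt       # placeholder for physics / game logic
    return state

def render(state):
    pass                   # placeholder draw; a real engine might interpolate here

state = {"t": 0.0}
accumulator = 0.0
last = time.perf_counter()
next_render = last

while state["t"] < 1.0:                    # run the toy loop for 1 simulated second
    now = time.perf_counter()
    accumulator += now - last
    last = now

    # step the simulation in fixed increments, regardless of how fast we render
    while accumulator >= SIM_DT:
        state = update(state, SIM_DT)
        accumulator -= SIM_DT

    # only push a frame out at the (slower) render interval
    if now >= next_render:
        render(state)
        next_render += RENDER_DT

    time.sleep(0.0005)                     # don't spin the CPU flat out in a sketch

Input sampling and simulation run at the fast rate, so the game can react more often than the display updates, even though the renderer only ever sees about 60 frames a second.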
 
That is theoretically what the Lucid Virtu MVP does. Vsync'd with 16.7ms frame times, but every frame is still being rendered.
 
Frame Rating: Catalyst 13.8 Brings Frame Pacing to AMD Radeon

AMD deserves a lot of credit for stepping up and addressing these frame pacing issues with the 13.8 beta driver. It took a brand new testing methodology to really prove that there was an issue with CrossFire and even internally at AMD it seemed there was a debate if the results we published were "real." Not only does this driver validate everything we have worked on for the last two years but the fact that AMD has decided to enable the frame pacing fix by default emphasizes that fact even more. Evenly paced frames results in a smoother animation and does not mean that your input latency increases in any way.

Users of AMD CrossFire systems in the HD 7000 family should install this new driver and see if they can see or feel the difference. I am confident that users will! In fact, I would love to get some feedback from readers and gamers in the comments section below about their experiences with the 13.8 drivers. Was it better? The same? Worse??

I'll continue to pester AMD to get the rest of the issues fixed: DX9 games, Eyefinity, 4K, etc. But today AMD gets to hold its head up high for improving CrossFire dramatically for a majority of its users.

http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Catalyst-138-Brings-Frame-Pacing-AMD-Radeon
 
Wait, are they saying that the improvements for frame pacing are only available to those using CFX 7xxx series? YOU HAVE TO BE KIDDING ME. It's not like I've been waiting all year to get my 5970/5850 running without microstutter.

Coulda sworn another article (Anandtech?) explicitly mentioned the 5xxx and 6xxx series CF also benefit. I've been playing FC3 for the past while with the new update and while it feels a bit smoother, the overall FPS tended to sway a bit with my 5770s anyway, so I'm really not the best judge, heh.
 