Frame Latency - Why you should stop talking about FPS

Cool thread. This stuff can be fun to chew on -- the latest Tech Report post about comparing frame times to the internal game simulation opens up a whole new can of worms. Should be interesting to see where this all leads, and how AMD/Nvidia respond.

I don't even know what that means. Is CF outputting a black frame, simply duplicating the last frame that went up, or is it a null frame?

It's outputting a tiny frame that is too small to matter. If you're not running vsync, you can have several different frames onscreen at the same time (which is what causes tearing). But sometimes (like in their CF testing) a displayed frame is only a few pixels tall. It's still counted in your FRAPS FPS averages, but it's visually almost worthless. In an extreme case, FRAPS might say you're getting 60 FPS, but half of the frames are "runts", so visually it will look no smoother than 30 FPS.

PCPer's video here does a good job of showing how the "runt" frames look and how they cause stutter when viewed in real time.
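To put rough numbers on the 60-vs-30 example above, here's a minimal sketch (hypothetical frame log, arbitrary cutoff for what counts as a runt) of how counting every frame inflates the number versus counting only frames big enough to see:

```python
# Hypothetical frame log for one second of output: (timestamp_s, visible_scanlines).
# Every other frame is a "runt" that only occupies a few scanlines on screen.
frames = []
t = 0.0
for i in range(60):
    t += 1.0 / 60                              # a new frame roughly every 16.7 ms
    scanlines = 1080 if i % 2 == 0 else 4      # half the frames are runts
    frames.append((t, scanlines))

RUNT_CUTOFF = 20                               # arbitrary "too small to matter" threshold

fraps_style_fps = len(frames)                  # a FRAPS-style counter counts every frame
visible_fps = sum(1 for _, h in frames if h >= RUNT_CUTOFF)

print(f"FRAPS-style FPS: {fraps_style_fps}")          # 60
print(f"Frames you can actually see: {visible_fps}")  # 30
```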
 
It's a good theory, and we'll need to see a bit more. Scott Wasson's take on what's been presented so far:

http://techreport.com/blog/24415/as-the-second-turns-frame-captures-crossfire-and-more

Taking things a step further, it's important to note that frame delivery timing itself is not the be-all, end-all solution that one might think, just because it monitors the very end of the pipeline. The truth is, the content of the frames matters just as much to the smoothness of the resulting animation. A constant, evenly spaced stream of frames that is out of sync with the game engine's simulation timing could depict a confusing, stuttery mess. That's why solutions like Nvidia's purported frame metering technology for SLI aren't necessarily a magic-bullet solution to the trouble with multi-GPU schemes that use alternate frame rendering.

In fact, as Intel's Andrew Lauritzen has argued, interruptions in game engine simulation timing are the most critical contributor to less-than-smooth animation. Thus, to the extent that Fraps timestamps correspond to the game engine's internal timing, the Fraps result is just as important as the timing indicated by those colored overlays in the frame captures. The question of how closely Fraps timestamps match up with a game's internal engine timing is a complex one that apparently will vary depending on the game engine in question. Mark at ABT has demonstrated that Fraps data looks very much like the timing info exposed by several popular game engines, but we probably need to dig into this question further with top-flight game developers.

Peel back this onion another layer or two, and things can become confusing and difficult in a hurry. The game engine has its timing, which determines the content of the frames, and the display has its own independent refresh loop that never changes. Matching up the two necessarily involves some slop. If you force the graphics card to wait for a display refresh before flipping to a new frame, that's vsync. Partial frames aren't displayed, so you won't see tearing, but frame output rates are quantized to the display refresh rate or a subset of it. Without vsync, the display refresh constraint doesn't entirely disappear. Frames still aren't delivered when ready, exactly—fragments of them are, if the screen is being painted at the time.
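To illustrate the quantization described in that last paragraph, here's a rough sketch with made-up render times, assuming a 60 Hz display and simple double buffering (a frame that misses a refresh waits for the next one):

```python
import math

REFRESH = 1.0 / 60    # 60 Hz display: one refresh every ~16.7 ms

# Made-up per-frame render times in seconds; a couple just miss a refresh window.
render_times = [0.012, 0.018, 0.015, 0.021, 0.016]

t_shown_prev = 0.0
for rt in render_times:
    t_ready = t_shown_prev + rt                           # frame finishes rendering here
    t_shown = math.ceil(t_ready / REFRESH) * REFRESH      # vsync: flip waits for the next refresh
    print(f"rendered in {rt * 1000:4.1f} ms -> reached the screen "
          f"{(t_shown - t_shown_prev) * 1000:4.1f} ms after the previous frame")
    t_shown_prev = t_shown
```

With these numbers every on-screen interval comes out as a multiple of 16.7 ms, even though the render times themselves vary smoothly.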
 
It's a good theory, and we'll need to see a bit more. Scott Wasson's take on what's been presented so far:

http://techreport.com/blog/24415/as-the-second-turns-frame-captures-crossfire-and-more

This is important since the end-user experience depends on the monitor's refresh. But it also makes frame latency that much more important, because the monitor's refresh timing benefits from more consistent GPU frame rendering. In other words, intermittent spikes in frame rendering times can throw the GPU output and the monitor's display timing off in a way that is more noticeable to the user, with or without vsync. I'm not much of a techie, but that's sort of how I see it.
 
This is huge though. It's years of what we thought we knew thrown for a loop. This literally is a paradigm shift in how we can test for performance. It's not some buzzword to wow people.
 
This is huge though. It's years of what we thought we knew thrown for a loop. This literally is a paradigm shift in how we can test for performance. It's not some buzzword to wow people.

On the one hand, sure. On the other hand, not really. It's only a method of quantifying what users have experienced first hand already. Things like micro-stutter and the overall gameplay not feeling smooth are noticed by people, despite whatever frame rate a tool was reporting at the time.
 
This is huge though. It's years of what we thought we knew thrown for a loop. This literally is a paradigm shift in how we can test for performance. It's not some buzzword to wow people.

I don't know, it is very interesting, but it occurred to me today that most of the time it just backs up what you see in the FPS charts.

Unless there is an anomaly, the cards that do best in latency are the same ones that do best in FPS. That's usually what you see in Scott Wasson's testing.

Plus, the whole hubbub over AMD cards doing badly in Wasson's recent testing got me wondering: what is it I really want in a card? I'd rather have a card that does better in 99% of frames than one that has fewer dips but also a lower average, I think. I can live with a card that does well 99% of the time and has the occasional hitch, because 99% of the time it's doing well.

I think if there's still "one number" that's my go-to, it's gotta be FPS.
 
Unless there is an anomaly, the cards that do best in latency are the same ones that do best in FPS. That's usually what you see in Scott Wasson's testing.

Well, yeah. The whole point of this is to expose the anomalies that wouldn't get caught by a traditional FPS comparison. Those PCPer 7970 Crossfire results show an FPS score that's almost twice as high as the cards are actually putting out to the screen.
 
This is huge though. It's years of what we thought we knew thrown for a loop. This literally is a paradigm shift in how we can test for performance. It's not some buzzword to wow people.


Ah I know, I was joking. It's still interesting information, though.
 
On the one hand, sure. On the other hand, not really. It's only a method of quantifying what users have experienced first hand already. Things like micro-stutter and the overall gameplay not feeling smooth are noticed by people, despite whatever frame rate a tool was reporting at the time.
Considering AMD admitted to not testing for frame latency and is now actively targeting it, I'd say it's a big deal when someone like PCPer runs a test and CrossFire 7970s run BF3 worse than a single card. It's quantified now.
 
On the one hand, sure. On the other hand, not really. It's only a method of quantifying what users have experienced first hand already. Things like micro-stutter and the overall gameplay not feeling smooth are noticed by people, despite whatever frame rate a tool was reporting at the time.

Without ever being able to quantify this, nothing was ever going to change. Now that we can, the change is already underway. That's why this is important: it gets game developers as well as driver teams focused on delivering smooth gameplay rather than a high FPS number. It's a check to keep them honest.
 
That stuttering made Oblivion almost unplayable for me on the PC, and I was getting a decent average FPS.

Whatever was causing it, devs need to figure out how to smooth it out.
 
On the one hand, sure. On the other hand, not really. It's only a method of quantifying what users have experienced first hand already. Things like micro-stutter and the overall gameplay not feeling smooth are noticed by people, despite whatever frame rate a tool was reporting at the time.
No, it's a method of capturing *everything* more accurately. It's grabbing information on every single frame rendered. Seeing variance is only one aspect of frame time testing. If you want to look at any data accurately, would you rather have 30+ values averaged over one second, or all of the data?
I don't know, it is very interesting, but it occurred to me today that most of the time it just backs up what you see in the FPS charts.

Unless there is an anomaly, the cards that do best in latency are the same ones that do best in FPS. That's usually what you see in Scott Wasson's testing.

Plus, the whole hubbub over AMD cards doing badly in Wasson's recent testing got me wondering: what is it I really want in a card? I'd rather have a card that does better in 99% of frames than one that has fewer dips but also a lower average, I think. I can live with a card that does well 99% of the time and has the occasional hitch, because 99% of the time it's doing well.

I think if there's still "one number" that's my go-to, it's gotta be FPS.
See above. I can convert the values into FPS if you'd like, but even then, second-based polling is missing tons of information.

This isn't just for measuring variance. It's everything, but more accurately.
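For what it's worth, here's a small sketch with made-up frame times showing what gets lost when a second's worth of frames is collapsed into one number:

```python
# Made-up frame times (ms), bucketed per second of gameplay. The second bucket
# hides a nasty 120 ms hitch that a per-second average smooths away.
second_1 = [16.7] * 60
second_2 = [15.0] * 52 + [120.0]

for label, bucket in (("second 1", second_1), ("second 2", second_2)):
    fps = len(bucket) * 1000.0 / sum(bucket)      # what a per-second counter reports
    print(f"{label}: {fps:.0f} FPS, worst frame {max(bucket):.0f} ms")
```

Both seconds report roughly 60 FPS, but only the raw frame-time data shows that one of them contained a hitch you would actually feel.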
 
Glad I paid attention to this thread; this is a very real issue, and I'm glad AMD seems to be fixing it already. I tested out the beta driver, and there's a noticeable difference. For the sake of comparison, I picked an easy-to-run yet modern game, Sonic & All Stars Racing Transformed.

I noticed that even if Fraps constantly reported 60 FPS, the game didn't seem as smooth as it should be compared to the 60 FPS videos I recorded. I had been noticing it in various games before, but this driver helped clear up those doubts completely.

I benchmarked the game with the new Beta driver, so here's the data if you're interested on it:

Intel i5 2500k OC @ 4.3 GHZ
Sapphire AMD Radeon HD 6870 @ stock
(frame time graph of the benchmark run)

Benchmark's CSV File

The random spikes are just the parts where the game needs to load a new resource (sound, All-Stars Move, etc.), but it seems to average out pretty well. The lower frame latency was very noticeable in-game: I started to notice animation details I never saw before, and my timing was a bit off, even though I've played this quite a lot already. I should probably roll back the drivers and benchmark again.

Not much point to this post other than to say that anyone with an AMD card who doubts this should definitely try out the beta drivers. I didn't really expect this to have quite an effect on single-card setups too.
 
Just stumbled across this after my noobness in the PC thread. Very, very interesting stuff. I had CrossFire 6950s before my soon-to-be-SLI'd 670s, and the stuttering drove me nuts. Will definitely read up on these.
 
It's awesome to see because it's really the only way to force AMD and Nvidia to seriously focus on this aspect. It's paying off faster than I would have expected. I'd love to see a similar revolution happen with televisions and input lag.
 
Yep.

Even beyond that, it's been extremely effective at finding things like "average" frame rate. Second-based polling is on borrowed time.
 
There really should be a flashing, all-bold marquee for this thread's title, along the lines of "READ THIS IF YOU EVER PLAN TO TALK ABOUT GAME PERFORMANCE AGAIN".
 
It's not a problem, just a new methodology for testing.

But yes, frame spikes occur in a lot of games, regardless of platform.
 
Anyone in this thread own Trials Evolution? I'd love to see how bad things were in that.

It's awesome to see because it's really the only way to force AMD and Nvidia to seriously focus on this aspect. It's paying off faster than I would have expected. I'd love to see a similar revolution happen with televisions and input lag.

LTTP, but at least we have the measuring guide now. It's up to TV manufacturers to actually care now that it's out there:

http://www.displaylag.com/display-database/
 
So wouldn't the effect of frame latency show in the "Minimum FPS" value? If it takes a rather long period of time to render one frame due to latency, wouldn't that cause the computer to render fewer frames for that second?

I have always looked at minimum FPS to determine my graphical settings instead of FPS.

EDIT: I do understand the general issue with frame latency. I am glad it's being discussed. It was driving me crazy as a PC gamer, but I could never figure out exactly what was happening.
 

A very illuminating article. I lol'd when the Anand team came across a measuring tool that was too dense for them to put to use in time for this publication.

One question that immediately comes to mind: I recall certain reviewers deciding to create their own benchmark tools instead of using Fraps, but they didn't explain why as thoroughly as this article does. I'll take the time to check who is using what tool in their frame latency tests in the future.

I think AMD is deflecting a bit there.

Most gamers who care about their frame latency will have their display queue length limited to 1, which renders most of their arguments against using FRAPS as a benchmark tool void.

Wow. All that effort they made and you still didn't get it: the reason Nvidia and AMD dislike Fraps for measuring frame latency is that it measures frames being made before the Present command.

For those who haven't read the article yet, Fraps tracks frames before they reach the operating system, and the OS sits between the program and the GPU.
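A conceptual sketch of where that timestamp lives (the render/present functions below are stand-ins, not real API calls): a Fraps-style counter records the gap between successive Present calls, and everything downstream of the OS and driver is invisible to it.

```python
import time

def render_frame():
    """Stand-in for the CPU-side work of building a frame (hypothetical)."""
    time.sleep(0.016)

def present():
    """Stand-in for the Present/SwapBuffers call that hands the frame to the OS,
    driver, and eventually the display. What happens after this point is what
    Fraps-style timing never sees."""
    pass

timestamps = []
for _ in range(5):
    render_frame()
    timestamps.append(time.perf_counter())   # recorded at Present, not at the screen
    present()

frame_times_ms = [(b - a) * 1000 for a, b in zip(timestamps, timestamps[1:])]
print([f"{ft:.1f} ms" for ft in frame_times_ms])
```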
 
Anyone else find it cool that AMD is including the option in the drivers, though?

As an Nvidia user, I am slightly impressed by that dedication... obviously AMD was rattled by these revelations.

Also... Durante is right about the whole "queue next frame to 1" thing. AMD is deflecting partially.
 
So right now, what's the moral of the story? I'm running CF 6950s and I definitely notice this stuff.

Should I abandon multi-card entirely? I've read many of these links and hear a lot of bad stuff about Crossfire, but how is SLI in comparison? What's the outlook? Should I just cash in my cards and buy the beefiest single card I can afford?
 
So wouldn't the effect of frame latency show in the "Minimum FPS" value? If it takes a rather long period of time to render one frame due to latency, wouldn't that cause the computer to render fewer frames for that second?

I have always looked at minimum FPS to determine my graphical settings instead of FPS.

EDIT: I do understand the general issue with frame latency. I am glad it's being discussed. It was driving me crazy as a PC gamer, but I could never figure out exactly what was happening.

You could be rendering a scene at 186 fps. Sure, a nasty latency spike would drop the number per second, but if you're still rendering at 117 fps after that, you're going to wonder why your "locked at 60fps" looks like crud.
 
You could be rendering a scene at 186 fps. Sure, a nasty latency spike would drop the number per second, but if you're still rendering at 117 fps after that, you're going to wonder why your "locked at 60fps" looks like crud.

Lol. My machine can't render many games maxed @ 186 fps, but fair point nonetheless. Thanks for the reply.
 
So wouldn't the effect of frame latency show in the "Minimum FPS" value? If it takes a rather long period of time to render one frame due to latency, wouldn't that cause the computer to render fewer frames for that second?

Yep -- the frame latency data just shows a more complete picture. Minimum FPS just shows the single lowest value, but it doesn't really give you a good idea of how often that happens.

These new ways of showing the data can say "99% of the time, your framerate will be at least this good".
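For example, here's a rough sketch of how a "99th percentile" number falls out of raw frame-time data (made-up samples, and a simple sort-and-index pick rather than any particular site's exact method):

```python
import random

random.seed(1)
# Made-up frame times (ms): mostly ~16 ms, with a handful of nasty spikes.
frame_times = [random.gauss(16, 1.5) for _ in range(1000)]
frame_times += [random.uniform(40, 80) for _ in range(10)]

frame_times.sort()
p99 = frame_times[int(0.99 * len(frame_times))]   # rough 99th-percentile pick

print(f"99% of frames finished in {p99:.1f} ms or less "
      f"(roughly {1000 / p99:.0f} FPS or better)")
```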
 
Wow. All that effort they made and you still didn't get it: the reason Nvidia and AMD dislike Fraps for measuring frame latency is that it measures frames being made before the Present command.

For those who haven't read the article yet, Fraps tracks frames before they reach the operating system, and the OS sits between the program and the GPU.

Read this post here on why Fraps is just fine. It breaks down a bit for multi-card setups, but not for single cards.

So wouldn't the effect of frame latency show in the "Minimum FPS" value? If it takes a rather long period of time to render one frame due to latency, wouldn't that cause the computer to render fewer frames for that second?

I have always looked at minimum FPS to determine my graphical settings instead of FPS.

EDIT: I do understand the general issue with frame latency. I am glad it's being discussed. It was driving me crazy as a PC gamer, but I could never figure out exactly what was happening.
No, because FPS is an average of all frames over one second. Even if one frame took 100 ms to render (equivalent to 10 FPS on its own), you still have frames over the next 900 ms to average it out.
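To make that concrete, a quick worked example with hypothetical numbers: one 100 ms hitch still leaves room for plenty of smooth frames in the same second, so the per-second count barely moves.

```python
hitch_ms = 100.0
smooth_frame_ms = 16.7                                        # a typical "smooth" frame
smooth_frames = int((1000.0 - hitch_ms) / smooth_frame_ms)    # 53 frames fit in the other 900 ms

fps_that_second = 1 + smooth_frames
print(f"FPS counter for that second: {fps_that_second}")      # 54 -- looks healthy
print(f"Worst frame in that second: {hitch_ms:.0f} ms")       # the hitch you actually feel
```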
 
Also, here is an article that further explores Fraps as a benchmark tool by comparing what Fraps records with what is actually happening in the game engine.

http://alienbabeltech.com/main/?p=33060

Conclusion:

We feel reasonably confident as we proceed that Fraps is as useful a tool in reporting frame times as it is for frame rates. It is not perfect, but it conveys a reasonable assessment of what is actually being experienced in-game by the user regarding smoothness or its lack.
 
http://www.pcper.com/

just put an article online on their video-capture-based frame counting method, and they've also uploaded a bunch of vids on their YouTube channel: http://www.youtube.com/user/pcper

edit: watch this video first (20 min), then read the article

This is the most important article about video game performance, possibly ever. Holy crap is this extensive and mind-blowing.

*edit*

I also like that this article essentially validates Fraps for single GPU benchmarks. Multi GPU benchmarks is where things start to fall apart. AMD has some work to do.

*edit 2*

Holy crap look at this graph:

(Crysis 3 1920x1080 frame time plot)
 
I'm just halfway through the article now and my mind has been blown to pieces so much already that the pieces are now just particles of dust
 
I'm just halfway through the article now and my mind has been blown to pieces so much already that the pieces are now just particles of dust
No fucking kidding. This is so good.

This answers every little question that I have ever had about random stuff going wrong. I've seen this all, as I am sure most of us have, and here we have the answers. Amazing.

*edit*

Okay, time to start saving for one of these capture systems.
 
This is brilliant; I'm so glad PC tech sites are working towards identifying and exposing frametime issues that interfere with smooth gameplay. See, game journalists, this is how you should do things! This movement is sure to impact PC game smoothness in a very real way.
 
Read most of it. There's more to understand than I'm going to be able to absorb for now. I'm really looking forward to them covering the lower price points and seeing how things stack up with AMD vs nVidia on single GPU setups.
 
Well, from everything that I've been able to see, this testing on single GPUs lines up almost exactly with Fraps frame time data.

If you want to see how the lower-end cards stack up, check out Tech Report, as they've been doing frame time/latency testing as a central part of their reviews for quite a while now.
 
I think both metrics have their value. I also think looking at median frame latencies would be interesting. However, there's got to be a balancing act of sorts. Yes, frame latency is important, but at what point are you taking too large of a hit in terms of overall fps in attempting to fix high percentile frame latency? Saying one metric or the other isn't valuable isn't the way to go about this. Each of them has their uses.
 
I think both metrics have their value. I also think looking at median frame latencies would be interesting. However, there's got to be a balancing act of sorts. Yes, frame latency is important, but at what point are you taking too large of a hit in terms of overall fps in attempting to fix high percentile frame latency? Saying one metric or the other isn't valuable isn't the way to go about this. Each of them has their uses.
They're both the same thing, except FPS polls information once a second and then averages it. Measuring frame times isn't just to determine variance. You can determine a more accurate frame rate as well.

It's literally the same set of data, with FPS attempting to measure centimeters with a stick that can only measure meters.
 
They're both the same thing, except FPS polls information once a second and then averages it. Measuring frame times isn't just to determine variance. You can determine a more accurate frame rate as well.

It's literally the same set of data, with FPS attempting to measure centimeters with a stick that can only measure meters.

It's obviously all the same data, but the way you analyze it is different, as are the conclusions you draw from it.

I'm talking about 99th percentile vs. a mean reading like FPS. One looks at outliers, the other at the expected value. That said, if you've got a frame latency graph, that doesn't tell you much about general IQ, which is where FPS's value is. Sure, you can average frame latencies, but that's my point. An average still has value.
 