FYI, Microsoft had a presentation on this a few years ago based on the 360: What’s in a Frame: Latency, Tearing, Jitter, and Frame Rate on Xbox 360
I don't even know what that means. Is CF outputting a black frame, or simply duplicating the last frame that went up? Or is it a null frame?
Taking things a step further, it's important to note that frame delivery timing itself is not the be-all, end-all solution that one might think, just because it monitors the very end of the pipeline. The truth is, the content of the frames matters just as much to the smoothness of the resulting animation. A constant, evenly spaced stream of frames that is out of sync with the game engine's simulation timing could depict a confusing, stuttery mess. That's why solutions like Nvidia's purported frame metering technology for SLI aren't necessarily a magic-bullet solution to the trouble with multi-GPU schemes that use alternate frame rendering.
In fact, as Intel's Andrew Lauritzen has argued, interruptions in game engine simulation timing are the most critical contributor to less-than-smooth animation. Thus, to the extent that Fraps timestamps correspond to the game engine's internal timing, the Fraps result is just as important as the timing indicated by those colored overlays in the frame captures. The question of how closely Fraps timestamps match up with a game's internal engine timing is a complex one that apparently will vary depending on the game engine in question. Mark at ABT has demonstrated that Fraps data looks very much like the timing info exposed by several popular game engines, but we probably need to dig into this question further with top-flight game developers.
Peel back this onion another layer or two, and things can become confusing and difficult in a hurry. The game engine has its timing, which determines the content of the frames, and the display has its own independent refresh loop that never changes. Matching up the two necessarily involves some slop. If you force the graphics card to wait for a display refresh before flipping to a new frame, that's vsync. Partial frames aren't displayed, so you won't see tearing, but frame output rates are quantized to the display refresh rate or a subset of it. Without vsync, the display refresh constraint doesn't entirely disappear. Frames still aren't delivered exactly when ready; fragments of them are, if the screen is being painted at the time.
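The quantization described above can be sketched in a few lines. This is a simplified model, not anything from the article: with vsync on, a finished frame waits for the next refresh, so the interval between displayed frames is always a whole number of refresh periods.

```python
import math

# Simplified model of vsync quantization (illustration only):
# a finished frame must wait for the next refresh, so the displayed
# interval is the smallest whole number of refresh periods that
# covers the render time.

def displayed_interval_ms(render_time_ms, refresh_hz=60.0):
    """Smallest whole number of refresh periods covering the render time."""
    period = 1000.0 / refresh_hz                      # ~16.67 ms at 60 Hz
    periods = max(1, math.ceil(render_time_ms / period))
    return periods * period

# A frame that takes 17 ms just misses the 16.67 ms window and waits a
# full extra refresh, so output drops straight from 60 fps to 30 fps:
for t in (10.0, 17.0, 34.0):
    interval = displayed_interval_ms(t)
    print(f"{t:5.1f} ms render -> {interval:5.2f} ms displayed ({1000.0/interval:.0f} fps)")
```

This is why, with vsync, frame rates snap to divisors of the refresh rate (60, 30, 20 fps at 60 Hz) instead of varying continuously.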
It's a good theory, and we'll need to see a bit more. Scott Wasson's take on what's been presented so far:
http://techreport.com/blog/24415/as-the-second-turns-frame-captures-crossfire-and-more
GDDR5 is the new GHz, Frame Latency is the new FPS. Gotta do your homework for next gen!
This is huge though. It's years of what we thought we knew thrown for a loop. This literally is a paradigm shift in how we can test for performance. It's not some buzzword to wow people.
Unless there is an anomaly, the same cards will do best at latency as do best with FPS. That's usually what you see in Scott Wasson's testing.
Considering AMD admitted to not testing for frame latency and is now actively targeting it, I'd say it's a big deal when someone like PCPer runs a test and CrossFire 7970s run BF3 worse than a single card. It's quantified now.
On the one hand sure. On the other hand not really. It's only a method of quantifying of what users have experienced first hand already. Things like micro-stutter and the overall gameplay not feeling smooth is noticed by people, despite whatever frame rate a tool was reporting at the time.
No, it's a method of capturing *everything* more accurately. It's grabbing information on every single frame rendered. Seeing variance is only one aspect of frame-time testing. If you want to look at any data accurately, would you rather have 30+ values averaged over one second, or all of the data?
See above. I can convert the values into FPS if you'd like, but even then, second-based polling is missing tons of information.

I don't know; it is very interesting, but it occurred to me today that most of the time it just backs up what you see in the FPS charts.
Unless there is an anomaly, the same cards will do best at latency as do best with FPS. That's usually what you see in Scott Wasson's testing.
Plus, the whole hubbub over AMD cards doing badly in Wasson's recent testing got me wondering: what is it I really want in a card? I'd rather have a card that does better in 99% of frames than one that has fewer dips but also a lower average, I think. I can live with a card that does well 99% of the time and has the occasional hitch, because 99% of the time it's doing well.
I think if there's still "one number" that's my go to, it's gotta be FPS.
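The "99% of frames" reading above is essentially TechReport's 99th-percentile frame time: the time under which 99% of frames finished. A minimal sketch, with made-up frame times, of how the two numbers can tell different stories:

```python
import math

# Sketch of the 99th-percentile frame-time idea; the frame times here
# are invented for illustration.

def percentile_frame_time(frame_times_ms, pct=99.0):
    """Frame time (ms) that pct% of frames finished at or under."""
    ordered = sorted(frame_times_ms)
    idx = max(0, math.ceil(len(ordered) * pct / 100.0) - 1)
    return ordered[idx]

# 97 smooth ~16 ms frames plus three 100 ms hitches:
times = [16.0] * 97 + [100.0] * 3
avg_fps = 1000.0 * len(times) / sum(times)
print(f"average: {avg_fps:.0f} fps")                              # still looks healthy
print(f"99th percentile: {percentile_frame_time(times):.0f} ms")  # hitches exposed
```

The average FPS for this run still looks respectable, while the 99th-percentile frame time lands right on the hitches, which is the behavior a player actually notices.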
NP, I'm just repeating what's happening. Credit should go to TechReport and PC Perspective.

Thanks for the thread, mkenyon.
Haha, if only... There really should be a flashing, all-bold marquee for this thread's title, along the lines of "READ THIS IF YOU EVER PLAN TO TALK ABOUT GAME PERFORMANCE AGAIN".
It's awesome to see because it's really the only way to force AMD and Nvidia to seriously focus on this aspect. It's paying off faster than I would have expected. I'd love to see a similar revolution happen with televisions and input lag.
I think AMD is deflecting a bit there.
Most gamers who care about their frame latency will have their render-ahead queue limited to 1 frame, which renders most of the arguments against using Fraps as a benchmarking tool void.
So wouldn't the effect of frame latency show in the "Minimum FPS" value? If it takes a rather long period of time to render one frame due to latency, wouldn't that cause the computer to render fewer frames for that second?
I have always looked at minimum FPS to determine my graphical settings instead of FPS.
EDIT: I do understand the general issue with frame latency. I am glad it's being discussed. It was driving me crazy as a PC gamer, but I could never figure out exactly what was happening.
You could be rendering a scene at 186 fps. Sure, a nasty latency spike would drop the number for that second, but if you're still rendering at 117 fps after it, you're going to wonder why your "locked at 60 fps" game looks like crud.
Wow. After all that effort they made, you still didn't get it: the reason Nvidia and AMD dislike Fraps for measuring frame latency is that it timestamps frames at the Present call, before they go any further down the pipeline.
For those who haven't read the article yet: Fraps tracks frames before they reach the operating system, and the OS sits between the program and the GPU.
No, because FPS is an average of all frames over one second. Even if you had one frame take 100 ms to render (a 10 fps pace), you still have frames over the next 900 ms to average it out.
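A quick sketch of that averaging arithmetic, with invented numbers: one 100 ms hitch followed by fast frames for the rest of the second still produces a high per-second FPS figure.

```python
# Illustration only: a 100 ms hitch hidden inside a healthy-looking
# per-second FPS average.

hitch = [100.0]            # one frame takes 100 ms (a 10 fps pace)
smooth = [9.0] * 100       # 100 fast frames fill the remaining 900 ms
second = hitch + smooth    # exactly 1000 ms of frame times

fps = 1000.0 * len(second) / sum(second)
print(f"reported FPS for this second: {fps:.0f}")   # looks great
print(f"worst frame time: {max(second):.0f} ms")    # the hitch is still there
```

A per-second counter reports over 100 fps for that second, yet the player saw a visible 100 ms stall; only the raw frame-time data keeps that information.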
We feel reasonably confident as we proceed that Fraps is as useful a tool for reporting frame times as it is for frame rates. It is not perfect, but it conveys a reasonable assessment of what the user actually experiences in-game with regard to smoothness or the lack of it.
This is the most important article about video game performance, possibly ever. Holy crap, is this extensive and mind-blowing.

http://www.pcper.com/ just put an article online on their video-capture-based frame-counting method, and they've also uploaded a bunch of vids on their YouTube channel: http://www.youtube.com/user/pcper
edit : watch this video first (20 min), then read the article
No fucking kidding. This is so good.

I'm just halfway through the article now, and my mind has been blown to pieces so much already that the pieces are now just particles of dust.
I think both metrics have their value. I also think looking at median frame latencies would be interesting. However, there's got to be a balancing act of sorts. Yes, frame latency is important, but at what point are you taking too large a hit in overall FPS while attempting to fix high-percentile frame latency? Saying one metric or the other isn't valuable isn't the way to go about this. Each of them has its uses.
They're both the same thing, except FPS polls information once a second and then averages it. Measuring frame times isn't just for determining variance; you can derive a more accurate frame rate from them as well.
It's literally the same set of data, with FPS attempting to measure centimeters with a stick that can only measure meters.