[quote]Interesting write-up, didn't even know such a thing was a thing(?). So this is how long it takes for 1 frame to render?[/quote]
Yep, that's it.
Is this the graph? 5 min match of BF3, all settings maxed, 64 players.
[image: BF3 frame time graph]
I misspoke. You wouldn't need a normal distribution, but you would need to know the mean. Standard deviation alone wouldn't be enough: a game with a stdev of 0 sounds great... but not if it's running at a steady 100 ms/frame.
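To make that concrete, here's a throwaway sketch (synthetic numbers, nothing measured from a real game) comparing a trace with zero spread but an awful mean against one with a little jitter around a good mean:

```python
import statistics

# Hypothetical frame-time traces in milliseconds, purely for illustration.
steady_but_slow = [100.0] * 600          # stdev 0, but 100 ms/frame = 10 fps
fast_with_jitter = [15.0, 18.0] * 300    # small jitter around ~16.5 ms (~60 fps)

for name, trace in [("steady 100 ms", steady_but_slow),
                    ("jittery ~16.5 ms", fast_with_jitter)]:
    mean = statistics.mean(trace)
    spread = statistics.pstdev(trace)
    print(f"{name}: mean {mean:.1f} ms, stdev {spread:.1f} ms, avg {1000 / mean:.0f} fps")
```

Judged on standard deviation alone, the 10 fps trace "wins", which is exactly why you need the mean (or the whole distribution) alongside it.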
One question: is it possible for a frame to take longer than 16.7 ms to render at 60 fps with vsync? I assume it can't. If it did, it would miss the window for the screen refresh, which should imply that it did not actually maintain 60 fps.
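For reference, the arithmetic behind that 16.7 ms figure, plus a simplified model of what happens on a miss (my assumption of plain double-buffered vsync, ignoring triple buffering and driver queues):

```python
import math

REFRESH_HZ = 60
FRAME_BUDGET_MS = 1000 / REFRESH_HZ   # ~16.67 ms between refreshes at 60 Hz

def effective_display_time(render_ms: float) -> float:
    """Simplified double-buffered vsync: a frame that misses its refresh
    waits for the next one, so its effective time snaps up to a multiple
    of the refresh interval."""
    intervals = max(1, math.ceil(render_ms / FRAME_BUDGET_MS))
    return intervals * FRAME_BUDGET_MS

print(f"frame budget at 60 Hz: {FRAME_BUDGET_MS:.1f} ms")
print(f"a 17 ms frame effectively takes: {effective_display_time(17):.1f} ms")  # ~33.3 ms
```

So under that model a frame can certainly take longer than 16.7 ms to render; it just means the display skips a refresh and the output is no longer a true 60 fps.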
As an AMD user, I think this should replace fps in every benchmark and card review ever.
It's a way more meaningful metric.
So sick of the hopelessly uneven frame rates in tons of games on my AMD card.
Have you got any results like this with SLI/crossfire?
Not that I'll be ditching my SLI setup for anything else anytime soon, but I'm just curious.
[quote]Have you got any results like this with SLI/crossfire? Not that I'll be ditching my SLI setup for anything else anytime soon, but I'm just curious.[/quote]
No, but I can tell you SLI will be more consistent. They do this by adding a whole frame of input lag to allow for lining up the output more precisely.
[quote]Hope this new method of benchmarking games takes off; Tom's Hardware plans to feature it in their reviews going forward, along with their usual average fps data.[/quote]
Tom's is doing a shit job of representing the data, though. They're either purposefully dropping some really key information from their charts (95th percentile? Seriously?) or they don't understand why it's important. Then they fuel the fires of the AMD vs. Intel CPU wars and get a bunch of hits. Really, really questionable stuff.
More info by TechReport for those interested: Inside the second: A new look at game benchmarking
People have different requirements, though. As a competitive gamer, I really don't care about jarring visuals; I want the lowest frame times possible, since that means more frequent and faster polling of my input.
This answer goes for both. Here's Skyrim's frame latency with vsync turned on (is there any other setting? :/), running at an indicated 60 fps. Worst of all, AFAIK Skyrim limits frame latency by rendering 3 frames ahead, which is what causes some input lag.
[image: Skyrim frame latency graph]
As you can guess, it is jerky to play. I tend not to notice the big but short spikes, but a thick line is really awful to play with. Vsync that constantly switched between 17 and 33 ms frame times was especially horrible compared to a different game running at the same frame rate (40-ish) but without vsync. A constant 33 ms is much better than that shit.
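That also shows why an average-fps number hides the problem; a rough sketch with made-up traces (nothing measured from Skyrim):

```python
# Illustrative only: vsync judder alternating between refresh slots vs. a steady pace.
alternating = [16.7, 33.3] * 100   # every other frame misses the 60 Hz window
steady      = [25.0] * 200         # same average pace, no judder

for name, trace in [("alternating 16.7/33.3 ms", alternating),
                    ("steady 25 ms", steady)]:
    avg_fps = 1000 / (sum(trace) / len(trace))
    print(f"{name}: avg {avg_fps:.0f} fps, worst frame {max(trace):.1f} ms")
```

Both traces report the same ~40 fps average, but one of them hitches on every other frame; that difference is exactly what frame-time graphs capture and fps averages throw away.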
We absolutely did take latency into account in our conclusion.
I think the problem is that you totally misunderstand the point of measuring latency, and the impact of the results. Please read page 2, and the commentary next to the charts.
To summarize, latency is only relevant if it's significant enough to notice. If it's not significant (and really, it wasn't in any of the tests we ran, except maybe in some dual-core examples), then, obviously, the frame rate is the relevant measurement.
*IF* the latency *WAS* horrible, say, with a high-FPS CPU, then in that case latency would be taken into account in the recommendations. But the latencies were very small, so they don't really factor in much. Any CPU that could handle at least four threads did great; the latencies are so imperceptible that they don't matter.
[quote]Using the %iles is a way to keep their articles neat and clean while still conveying relevant data.[/quote]
Yes, but they are choosing 95% instead of 99%, thus covering up that 2% (compared to 99%) of data where things are going wrong. Look at what this chart would indicate if the top 2.5% were thrown out, as compared to the top 0.5%:
http://techreport.com/r.x/amd-fx-8350/skyrim-latency.gif

[quote]In the comments section of the Tom's article it looks like they just don't care about it as much as others do.[/quote]
Oh wow. That response clearly demonstrates they don't even know what the data means.
Either you are confused about 95%ile or I am. The 95%ile mark is showing the worst 5% of frames with respect to latency.
[quote="Anton668, post: 47629407"]In the comments section of the Tom's article it looks like they just don't care about it as much as others do.[/quote]
This sounds about right.
[quote]Either you are confused about 95%ile or I am. The 95%ile mark is showing the worst 5% of frames with respect to latency.[/quote]
The 95% throws out the top and bottom 2.5% of data to say, "What is the frame time that I can expect 95% of all frames to be rendered at?"
It's a metric long used to assess server performance.
[quote]No, what you are referring to is 2 standard deviations away from the mean = 95% of the area under the distribution, starting from the mean in both directions. 95%ile is the top 5%.[/quote]
Hrm. Well, where I get my argument from is TechReport:
One way to address that question is to rip a page from the world of server benchmarking. In that world, we often measure performance for systems processing lots of transactions. Oftentimes the absolute transaction rate is less important than delivering consistently low transaction latencies. For instance, here is an example where the cheaper Xeons average more requests per second, but the pricey "big iron" Xeons maintain lower response times under load. We can quantify that reality by looking at the 99th percentile response time, which sounds like a lot of big words but is a fundamentally simple concept: for each system, 99% of all requests were processed within X milliseconds. The lower that number is, the quicker the system is overall. (Ruling out that last 1% allows us to filter out any weird outliers.)
Oddly enough, we want to ask the same basic question about gaming performance. We want our systems to ensure consistently low frame times, and doing so is arguably more important than achieving the highest FPS rate.
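Applied to frame times, that 99th-percentile idea boils down to something like this (a minimal sketch over synthetic data with a nearest-rank percentile, not anyone's actual benchmark log):

```python
import random

# Synthetic frame-time log in milliseconds: mostly ~16 ms, plus some bad frames.
random.seed(0)
frame_times = [random.gauss(16.0, 1.5) for _ in range(1000)]
frame_times += [random.uniform(40.0, 70.0) for _ in range(20)]   # ~2% slow frames

def percentile(values, pct):
    """Nearest-rank percentile: the value below which `pct` percent of samples fall."""
    ordered = sorted(values)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

avg_fps = 1000 / (sum(frame_times) / len(frame_times))
print(f"average: {avg_fps:.0f} fps")
print(f"99th percentile frame time: {percentile(frame_times, 99):.1f} ms")
# Reads as: 99% of all frames were rendered within that many milliseconds.
```

The average fps barely notices those twenty slow frames; the 99th-percentile frame time is the number that flags how bad the near-worst case gets.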
[quote]Really glad this thing has reared its ugly face, now every time I mention the word "SLI", 15 seconds later there's an "OMG LATENCY" comment, really is beautiful. Especially because I can't feel the difference whatsoever. You could sit me down in front of a new-generation single and SLI PC and promise me eternal wealth, and I would never be able to tell.[/quote]
Not everyone can be a pro gamer, Sethos, it's okay.
[quote]Hrm. Well, where I get my argument from is TechReport: "We can quantify that reality by looking at the 99th percentile response time, which sounds like a lot of big words but is a fundamentally simple concept: for each system, 99% of all requests were processed within X milliseconds. The lower that number is, the quicker the system is overall. (Ruling out that last 1% allows us to filter out any weird outliers.)"[/quote]
They are saying the same exact thing Tom's website is saying. 99%ile is looking at the 1%.

[quote]They are saying the same exact thing Tom's website is saying. 99%ile is looking at the 1%.[/quote]
Maybe I'm not understanding how this: "(Ruling out that last 1% allows us to filter out any weird outliers.)" equates to "focusing on the 1%". That seems to indicate the opposite.
Someone pass the word on to Digital Foundry.
*edit* It indicates the opposite because they are using the term "percentile" incorrectly. Even if it were reversed (which it is not), there is no cutting off of the 5% at 2.5% on each end.
Yeah, they can add this information to their 720p -> 360p downsampled screenshots.
Okay, I get what you are saying now. What's the accurate term, then, for looking at the 99% of results?
Actually, isn't it more accurate to say the 99th percentile is the number separating the 99% from the 1%? Or the number separating the 95% from the 5%? It would be the number right on the line, correct?
Can't it go either way? Like I said, doesn't the number represent the point at which the two groups are separated?
Okay, I think we are saying the same thing but in different ways.
Where it gets confusing is that the higher a percentile you go in frame time, the WORSE the result is. This is different from the normal way of looking at percentiles. Reporting a number LOWER than the 4% of frames above it misrepresents that 4% of the data.
[quote]Which is funny, because I'm an excellent player when it comes to FPS games and most people wouldn't notice SLI frame latency whatsoever. However, now it's suddenly a "thing" and there are graphs and write-ups everywhere; now everybody suddenly feels a 3-day delay and is a professor in frame latency.[/quote]
Oh come on, take a good-hearted jab.
[quote]To everyone that didn't read the OP thoroughly: AMD has already launched drivers that make their cards a lot better in this aspect.[/quote]
Will those drivers affect latency for 6000-series cards like my 6850? Skyrim has had really bad latency for me.
[quote]Hey OP, what are your system specs? Great thread btw, I'm glad more people are becoming aware of this.[/quote]
For all of my listed info and graphs:
[quote]Percentiles aren't about your opinion/definition of the best or worst; they are based on how you set your data up. We could easily take the 95%ile of lowest latency, so that we could see the 5% of frames with the lowest rendering time. It's just that Tom's website is gathering the "high render time" frames. You can easily make your own chart to show results how you want![/quote]
Now that I understand this whole thing better, when you say 99th percentile, you are not looking at the top 1%. You are looking at the number that separates the top 1% from the bottom 99%. Thus, when you say 95th percentile, you are looking at the number that separates the top 5% from the bottom 95%.
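A quick check of that "number on the line" reading, on synthetic frame times (nearest-rank style, purely illustrative):

```python
# 200 synthetic frame times in ms: 190 steady frames plus 10 spikes of varying size.
frames = [16.7] * 190 + [30.0 + 3.0 * i for i in range(10)]   # spikes: 30..57 ms

ordered = sorted(frames)
rank = round(0.99 * len(ordered))      # nearest-rank 99th percentile
p99 = ordered[rank - 1]

share = sum(f <= p99 for f in ordered) / len(ordered)
print(f"99th percentile: {p99:.0f} ms; frames at or below it: {share:.0%}")
```

The 99th percentile comes out as a single threshold (51 ms for this made-up data), with 99% of the frames at or below it and the worst 1% above it, which matches the "number that separates" reading.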
[quote]Question: is this the reason why there are some games that are 30 fps and yet some praise the game for how smooth it is DESPITE being only 30 fps? E.g. Resistance: Fall of Man.
Or, more specifically, is the smoothness of gameplay dependent more on the standard deviation of latency than on the frame rate?
So if we played a game at 20 fps but the latency std dev was 0 ms, vs. a game played at 40 fps with a latency std dev of 10 ms, are you suggesting that this could cause a lower FPS to appear on par?[/quote]
That does seem to follow.
[quote]It can be used either way. In the context of Tom's graphs, 95%ile clearly refers to all cases outside of that 95%. From your Wikipedia link: "For example, if a score is in the 86th percentile, it is higher than 85% of the other scores." So 86%ile means 85% of scores are below it; the grouping above is more or less irrelevant. Percentiles just split up the data so we can look at it in meaningful ways.[/quote]
Thank you sincerely for helping me with that. GAF is the best.