Got a link proving that mouthful? And no, a link to a dude talking on another forum isn't going to cut it.
I don't need dudes talking on another forum - although, to be honest, I find the request itself a bit silly, since I could dig up posts from engineers, insiders and reviewers going into these things in detail. But I digress: I wouldn't bother with that kind of "linking" anyway, because I'm lazy, finding those enlightening posts can be hard, and you're not exactly asking nicely.
Rather, I'll backtrack and reinforce my logic. It's really simple: if something was and felt responsive last year, and now the Leo Bodnar reports it as lagged to hell, then the Leo Bodnar IS the discrepancy. Simple cause-and-effect logic.
Reviewers realized as much right away, and they realized the Leo Bodnar method wasn't accurate on plasmas, because:
Panasonic ST50 via the fast camera method: 16 ms
Panasonic ET60 via the fast camera method: 34 ms
Panasonic ST50 via the Leo Bodnar: 47 ms
Panasonic ET60 via the Leo Bodnar: 48 ms
(I picked these two TVs for a reason; you'll understand soon enough.)
They're not the same, no. One of them, in reality, doesn't have a perceptible 47 ms of lag at all. Emphasis on perceptible - or, better yet, seen the other way around: it's way more fluid, so perhaps it's the Leo Bodnar that is perceiving it incorrectly, even if what it's reading is as accurate as it can get.
The Leo Bodnar is fine for LCDs, albeit still comparing apples to oranges against older figures (i.e. "that 2011 TV was measured back then at 33 ms of lag, so this 40 ms 2013 TV is such a step down!" - perhaps not so!). It matters to know how a figure was measured, and before the fall of 2012 everything, unless otherwise noted, is the fast camera method.
Cutting to the chase, though, I can link you to this:
It’s Harsher On Plasma TVs
When we first got our hands on the Leo Bodnar device, we were surprised when we obtained (nearly) the same 48ms figure from a Panasonic ST50 PDP (plasma display panel) and a new Panasonic ET60 LED LCD (both running in their fastest Game mode). From our experience of playing a decent amount of first-person shooter games online, the Panasonic ST50 is a total joy to play on compared to the LCD. The former feels considerably smoother than the latter, but both are returning basically the same figure.
Or, put another way, we can believe the figures returned by the lag tester, but began to wonder if it’s being harsher on plasmas.
An LCD-based display updates the screen from top to bottom, one line at a time, which means that a player’s brain cannot make sense of a part of the image until it has been completely rendered. The LCD’s top-to-bottom addressing can be seen with the Leo Bodnar lag tester: measuring the top patch tends to give a lower number than measuring the centre patch from our tests. However, on a PDP, the result is always the same on both patches.
Because plasma displays work by flashing the screen several times just to draw one video frame, on a PDP, an intermediate image doesn’t look half-drawn in the same way that it would on an LCD. Instead, it would have very low gradation (and brightness). In theory, this means that the player has a better chance of seeing the entire gameplay screen, albeit not at full quality, since the subfield drive throws out different steps of the dynamic range quickly just to draw one fully-gradated frame.
This is the key difference: on the LCD, obviously our eyes can’t make sense of parts of the frame which haven’t been drawn yet (parts of the frame are either fully rendered or not), but on the plasma, we get extra temporal precision in the feedback loop, since we can see rough versions of the frames before they’re even fully drawn. And, in a fast-paced game, our brain doesn’t care if it’s seeing incomplete images – it should still be able to make out rough outlines and shapes.
The incomplete frames don’t necessarily even have to be coherent to our eyes. Even if we can detect the screen responding to our finger movements at all, it should be enough to make the game feel much more responsive.
In isolation, and for slow-paced games, this is all basically moot. But in a first-person shooter (even one which only runs at 30 frames per second) or racing game, etc, where the entire screen is moving and split-second decisions count, we think the PDP’s subfield drive helps tremendously in making the gameplay feel smooth. After all, in reality, playing fast-paced games is a continuous feedback loop between the player and the screen.
How does this explain why plasma televisions that feel much more responsive are shortchanged by the Leo Bodnar input lag tester which returns a higher figure? Well, we surmised that the flashing white bars need to hit a specific brightness threshold before they can be picked up by the device’s photosensor for lag time calculation: if you decrease or increase the on-screen luminance using the TV’s [Contrast] or [Backlight] control, the Leo Bodnar’s lag number should rise or drop correspondingly.
A plasma’s subframe, while not bright enough to trigger the photosensor, can readily be perceived by us in the sensorial feedback loop, thus accounting for the discrepancy between the displayed input lag figure and the actual responsiveness of a PDP. Ironically, the older stopwatch/camera method – though inconsistent – is capable of capturing subframes before they’re fully drawn (since it’s not limited by any luminance threshold, and the shutter speed is much higher than the panel refresh rate), and so more accurately reflects how responsive a PDP is.
This is the reason why we continue to run both tests on most HDTVs we review despite the photo method being such a labour-intensive process.
Source:
http://www.hdtvtest.co.uk/news/input-lag
I rest my case.
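To make the quoted explanation concrete, here's a rough Python sketch of that threshold idea. To be clear, the 16 ms base lag, the 8 subfields and the 80% trigger level are made-up illustration values (the Bodnar's actual trigger level isn't something I know); the point is only to show why a sensor that waits for a bright, fully-formed patch reports a later time on a subfield-driven plasma than our eyes do, since we register the very first dim subfield.

```python
# Toy model of the "luminance threshold" explanation, with made-up numbers.
# Both panels start lighting the patch 16 ms after the signal; the LCD jumps
# straight to full brightness, while the plasma builds it up over 8 subfield flashes.

FRAME_MS = 1000 / 60            # one 60 Hz frame, ~16.7 ms
SENSOR_THRESHOLD = 0.8          # hypothetical trigger level of the photosensor
EYE_THRESHOLD = 0.01            # "any light at all" - what our eyes pick up on

def lcd_luminance(t_ms, base_lag_ms=16.0):
    """LCD: dark until the scan-out reaches the patch, then full brightness."""
    return 1.0 if t_ms >= base_lag_ms else 0.0

def plasma_luminance(t_ms, base_lag_ms=16.0, subfields=8):
    """Plasma: brightness accumulates subfield by subfield across the frame."""
    if t_ms < base_lag_ms:
        return 0.0
    flashed = min(subfields, int((t_ms - base_lag_ms) / (FRAME_MS / subfields)) + 1)
    return flashed / subfields

def first_crossing(luminance, level, step_ms=0.1):
    """First time the panel output reaches `level` (brute-force time scan)."""
    i = 0
    while luminance(i * step_ms) < level:
        i += 1
    return i * step_ms

for name, panel in (("LCD", lcd_luminance), ("Plasma", plasma_luminance)):
    seen = first_crossing(panel, EYE_THRESHOLD)         # what the player notices
    measured = first_crossing(panel, SENSOR_THRESHOLD)  # what a threshold tester reports
    print(f"{name}: first light ~{seen:.1f} ms, threshold crossed ~{measured:.1f} ms")
```

With these toy numbers both panels show their first light at the same instant, but the plasma has to flash most of its subfields before it trips the 80% level, so a threshold-based reading comes out far higher on it - which is exactly the gap between the camera figure and the Bodnar figure that the article is explaining.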
You're suggesting the testing method that involves two separate displays, a high speed camera, and a computer is more reliable than the method that involves a single box with a camera built into it.
No, I'm sure the Leo Bodnar is more accurate for whatever it's looking for (I actually know what it's looking for, but I like the way "whatever" sounds, thank you), but when it comes to grasping how fluid something really feels in direct comparison to something else, I'll believe my eyes before an abstract "automagical" method.
A camera takes photos in a spectrum my eyes understand, against a counter running simultaneously. If the result averages out at 23 ms and the visible result in the photos is a stable (and not overly blurred) image on the TV, then, while there's room for error, I know the thing feels responsive in line with everything else measured that way. Meaning a plasma and an LCD under the fast camera method are oranges to oranges: compare the numbers and the responsiveness should line up.
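For what it's worth, here's a minimal sketch of what that camera-and-counter method boils down to. The photo readings below are invented purely for illustration (I picked them to land on that 23 ms average): each high-speed photo captures the same running millisecond counter on a near-zero-lag reference display and on the TV under test, and the lag is just the averaged difference.

```python
# Minimal sketch of the stopwatch/fast-camera method, with invented readings.
# Each photo captures one running millisecond counter shown simultaneously on a
# (near) zero-lag reference display and on the TV under test; the per-photo
# difference is the TV's lag at that instant, and averaging smooths out the error.

photos = [  # (reference_ms, tv_ms) pairs read off hypothetical photos
    (10234, 10211),
    (10897, 10873),
    (11502, 11481),
    (12066, 12040),
    (12733, 12712),
]

lags = [ref - tv for ref, tv in photos]
average_lag = sum(lags) / len(lags)

print(f"per-photo lag (ms): {lags}")         # [23, 24, 21, 26, 21]
print(f"average lag: {average_lag:.1f} ms")  # 23.0 ms
```

The key point being that whatever a fast exposure captures is whatever your eyes would have caught at that instant, half-drawn subframes included, which is why plasma and LCD figures obtained this way really are oranges to oranges.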
That's night-and-day different from a machine that waits for a 100% white bar to materialize and then tells me the lag is actually "x". I'm sure the machine is right, but waiting for something to register in a "trap" often doesn't tell the whole story: the trap doesn't catch everything, only what it was designed to catch, so in practice it might not be as telling as it's supposed to be, even if the number itself is accurate.
Since it collects no other data or logs (it could, for instance, record from the moment the black pixel first starts to change up to the final result), it's really of no use when something draws incomplete frames that, in quick succession (and given the HUGE refresh rate), are perceived as complete; perhaps being too narrowly accurate is the problem.
Any man of science will tell you the test involving the fewest variables is likely to be the most reliable.
Any man of science will bless the fact that he has more than one piece of automatically obtained data to study.
In regard to this, both figures are data, and it's a good thing a new method came along, but they have to be taken as exactly that - data, ultimately to be weighed and classified by a human.
We're the humans, so we have to establish the criteria here, not the other way around. If the Leo Bodnar tells you a plasma is super lagged, and you pin it against an LCD that supposedly lags just as much and notice the plasma is way more responsive, will you still say the Leo Bodnar is right? That would be silly. Science is about not being a blind believer: get to the root of it, question it, and do your own testing. (For the record, I'm one of those dudes who keeps CRTs around, and I really dislike input lag; if any plasma I owned were lagged... I'd know, and I have several.)
The Leo Bodnar really does a plasma no favours - fact. It doesn't tell the whole story; in reality it tells you very little, because it's looking for the wrong thing there. It's not the wrong thing on LCDs, but it is the wrong thing on plasmas.
See my VT60 lag tests a page or 2 back. I also ran the timer in the U pad browser and it showed 1-2 frames faster than the VT60. I'm fairly confident the bodnar is accurate.
I know; we exchanged messages there.
I can only speak for European VT60s, of course (that's why I made the European distinction back there), and, as before, if there's a difference between the USA and Europe/British models I find that very strange and can't say why it would exist, but the fact is I'll trust the British readings nonetheless; their methodology looks rock solid and I simply don't think they could have been doing it wrong for years now. (For US models I can't speak, but given the source I might also give them more credit than I give you, sorry. I understand you'd stand by your results, as I would if I delved into it and came to the same numbers, but I'm being honest: I don't regard you as a "pro", so if a recognized pro says something with proper backing I'll try to understand why he's saying it, and if I have to choose blindly I'll still go with him.) I found your results very strange, though, because the GT30 result was in line with what was expected... the rest, not really. It's the fast camera method there too, and it didn't feel like you were doing it wrong, but I also don't know if there's a big difference between the equipment you're using and theirs.
Nevertheless, if they say the VT60 is 23 ms via the fast camera method and 41 ms via the Leo Bodnar, then that's what it is for the VT65B as well, unless firmware changed something in a big way.
The other part of the equation/reasoning is that I very recently purchased a VT60E (it's still doing its break-in), and while it's sadly not meant for gaming (65 inches is too big for my gaming room, and I couldn't afford both the 65-incher and a smaller GT60/VT60 for the game room), I tested it yesterday using "Nintendo hard" classics that expect a CRT with 0 ms of lag. I also still have the CRTs to go along with them, so I know exactly how they should feel compared to my Panasonic 42X50. I'll never play them on a modern screen on a daily basis while I own CRTs for the task, but they're a good test bed for input lag and response feel nonetheless, much more so than current-gen games are.
I didn't test them side by side because the VT60 is on the first floor and my CRTs are downstairs. Nevertheless, I also have image-processing equipment (basically a self-assembled SLG in a box) meant for retro gaming on flat panels that adds some lag... to be precise, it adds ~33 ms, and I slightly notice it even when plugging it into a PC display.
I didn't use it while testing, but I mention it to illustrate the following: although it's very hard to say, I really don't think I experienced more than 33 ms of lag there; it behaved like hooking up directly to my X50 without the SLG+GBS8220+SyncStripper shenanigans (turning that chain on over the TV makes it 33+16 ms, or 49 ms of lag - mildly noticeable, though not something I can't deal with for most games).
I can't vouch for it beyond "it felt like" - I'm aware of the placebo effect and all that, and I don't even know if I can tell 16 ms from 33 ms in a palpable way; but I do notice lag once processing reaches 50 ms, which is not so far from 41 ms.
I really don't think for a second that the VT60 is 41 ms at all.