I've met the guy at fighting game tournaments. He's serious about the issue and the entire website was born out of a community-wide frustration that there wasn't a reliable central database for the subject. His numbers are solid as long as you know how to interpret them.
- The tester that he uses only tests at 1080p and only through HDMI. The results won't tell you how well a 1080p television will handle a 720p, 480p, or lower-res signal. They won't tell you if the component or composite inputs are handled differently. They'll tell you how well a 4K television will handle a 1080p signal, but not how it'll handle content that is actually 4K resolution.
- The results are reportedly more favorable to LCDs than to plasmas because the tester's light sensor doesn't pick up the picture from plasma displays as quickly as it does from LCD screens. Reportedly.
HDTVtest has a decent write-up that mentions this.
- He only publishes the average of all results for each TV/monitor instead of providing all of the raw data. That's not a big deal, but I can go into a little more detail on this because I own the same testing device. Here's a test that I ran on my new TV:
These are the results for this TV after activating Game mode, turning off as many post-processing features as possible, and at least getting a ballpark estimate of proper color and picture settings. Displaylag also disables as many post-processing features as possible before taking measurements of their own.
You connect the tester to the TV via HDMI, turn it on, and hold it directly against the screen so that the light sensor lines up with one of the 3 bars on the left side of the screen (top, middle, or bottom). The tester will then report the amount of lag at that specific region of the screen. This is important because many TVs don't update all of the pixels at once; they fill in the lines sequentially. (Other TVs actually will update every line at roughly the same time.) The readings tell you how much time has elapsed between the point when the TV received the data for the current frame and the point when that frame was drawn on the part of the screen you're measuring. The readings fluctuate about 0.1 to 0.2 milliseconds in either direction. Displaylag takes these 3 measurements, averages them, and that's the number that gets reported for the TV. (In this case, it would be 17ms.)
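If it helps to see the arithmetic, here's a rough Python sketch of how a number like that comes out of the three readings. The readings in it are made-up placeholders, not the actual figures from my TV:

```python
# Hypothetical top/middle/bottom readings in milliseconds (placeholders, not my TV's numbers).
readings_ms = {"top": 9.0, "middle": 17.0, "bottom": 25.0}

# Displaylag-style figure: the plain average of the three screen regions.
average_lag_ms = sum(readings_ms.values()) / len(readings_ms)

print(f"Reported lag: {average_lag_ms:.1f} ms")  # -> 17.0 ms with these placeholder readings
```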
To better understand the meaning behind those numbers, I went to the trouble of using my tester on a VGA CRT monitor just to confirm that it would function the way that I thought it would (spoiler: it did):
I went through some control tests beforehand to confirm that the extra adapters I had to use to do this didn't add any lag of their own. (I measured about 0.1 to 0.2 ms extra, but that's negligible and within the margin of error.) The numbers above should make sense if you think about how a 60Hz CRT operates. At 60 frames per second, it takes about 16.6 ms to draw a single frame from top to bottom, after which the next frame starts from the top. In these readings, we see that the top bar is just a little over 0 ms, the bottom bar is just a little under 16 ms, and the middle bar is halfway between, as should be expected from a typical "lagless" CRT.
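For anyone who wants the scanout math spelled out, here's a quick sketch; this is just the 60Hz timing worked out in code, not anything measured:

```python
# Expected reading at each sensor bar on a "lagless" 60Hz CRT, assuming the only
# delay is the scanout itself (top of the frame drawn first, bottom drawn last).
FRAME_TIME_MS = 1000.0 / 60.0  # ~16.67 ms per frame at 60Hz

# Rough vertical position of each bar as a fraction of the screen height.
bar_positions = {"top": 0.0, "middle": 0.5, "bottom": 1.0}

for bar, fraction in bar_positions.items():
    print(f"{bar:>6}: ~{FRAME_TIME_MS * fraction:.1f} ms")
# top: ~0.0 ms, middle: ~8.3 ms, bottom: ~16.7 ms -- roughly what my tester showed.
```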
What's interesting, though, is that by Displaylag's method, such a CRT would be reported as having 8 milliseconds of lag. That's technically true, but since I think most gamers consider the "lag" of a gaming TV to be more specifically "how much it lags behind a CRT," it probably makes more sense to subtract about 7-8ms or so from DL's readings instead of using them raw.
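In code terms, the adjustment is trivial; this is just a sketch assuming the CRT baseline is half of a 60Hz frame (~8.3 ms, the average of 0 / 8.3 / 16.7):

```python
# Convert a Displaylag-style average into "lag behind a CRT".
# Assumes the CRT baseline is half of a 60Hz frame (~8.3 ms).
CRT_BASELINE_MS = (1000.0 / 60.0) / 2

def lag_behind_crt(reported_average_ms: float) -> float:
    """Hypothetical helper: subtract the CRT scanout average from a reported figure."""
    return reported_average_ms - CRT_BASELINE_MS

print(f"{lag_behind_crt(17.0):.1f} ms")  # the 17 ms example above -> ~8.7 ms behind a CRT
```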
So those are some things to think about. Displaylag is an enormously important resource imo, although I think it's important not to extrapolate more from the data than what is actually there. Specifically, I think the best use of it is for comparing specific tested TVs of the same type (LCD vs LCD, etc.) against each other.
There are options for it somewhere, probably in "Pro Picture Setup." Can't remember specifically. You can adjust it for each individual HDMI input. They're all set to auto-detect by default but I'm not sure how accurate that is; I set them manually before I connected anything.