
Gamers, now is not a good time to buy a cheap "HDR" TV

jaaz

Member
It doesn't. As far as I know there are no digital testers right now that measure input lag at 4k, so you have to measure it the old-fashioned way. It adds a bit, but I don't know how much.

Sorry, I confused you with spwolf who threw out the 30 ms at 4k on top of the ~20 ms at 1080p (combined >50ms). Was wondering where this number is coming from:

It's 1080p at 20ms, 1080p HDR at 23ms, and 4K adding 30ms to that.

So >50ms combined, and people are lauding the low lag on it.

Because if true, this puts the KS8000 at almost the same input lag as the Vizio and LG OLED at 4k HDR.
 

smisk

Member
Been looking at the 55 inch LG 4K OLED TV. It gets great reviews in most areas, but it sounds like it has ~50ms of input lag. Is this gonna make stuff unplayable? I mostly play on PC but use my TV for Nintendo consoles and the occasional PS3 game.
 

MazeHaze

Banned
Sorry, I confused you with spwolf who threw out the 30 ms at 4k on top of the ~20 ms at 1080p (combined >50ms). Was wondering where this number is coming from:



Because if true, this puts the KS8000 at almost the same input lag as the Vizio and LG OLED at 4k HDR.
This isn't true. Testers don't measure 4k accurately, but I can tell you from personal experience.

The difference between PC mode and game mode at 1080p: 37ms for the former, 21ms for the latter. I notice that immediately. I know this because sometimes the TV auto-switches to PC mode when I boot my computer. As soon as I start a game is when I notice the lag and switch to game mode.

The difference between game mode at 1080p and game mode at 4k is ?. I don't notice it.
So I would conclude that in this instance ? is less than 37ms. I would even guess it's less than 30ms.

My old tv had a lag of 50ms, and I did not like playing pc games on it.

TL;DR: input lag is certainly less than 50ms, I would say less than 37ms, and if I had to guess, it's within 10ms of the 20ms 1080p result.
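
Spelled out (a rough Python sketch; the numbers are from this post, and the premise that an unnoticed change must be smaller than a noticed one is my assumption):

game_1080p = 21   # ms, game mode at 1080p
pc_1080p = 37     # ms, PC mode at 1080p

noticeable_jump = pc_1080p - game_1080p   # a 16ms increase I can feel instantly

# 4k game mode felt the same as 1080p game mode, so its extra lag is
# presumably below that noticeable jump:
upper_bound_4k = game_1080p + noticeable_jump
print(upper_bound_4k)   # 37 -> 4k lag is likely somewhere under ~37ms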
 

EvB

Member
Been looking at the 55 inch LG 4K OLED TV. It gets great reviews in most areas, but it sounds like it has ~50ms of input lag. Is this gonna make stuff unplayable? I mostly play on PC but use my TV for Nintendo consoles and the occasional PS3 game.

60fps games have a minimum of 50ms of lag (66.7ms being most common).

30fps games have a minimum of 100ms.

So a TV with 50ms of latency will make a 60fps game respond like a 30fps game, and a 30fps game feel even worse.
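
Rough math on that (a quick sketch; the game-side figures are the minimums quoted above, not measurements of any particular title):

# Game-side latency figures are this post's claimed minimums.
game_60fps = 50    # ms, minimum for 60fps games
game_30fps = 100   # ms, minimum for 30fps games

for tv_lag in (20, 50):
    total_60 = game_60fps + tv_lag
    total_30 = game_30fps + tv_lag
    print(f"TV lag {tv_lag}ms: 60fps total {total_60}ms, 30fps total {total_30}ms")

# A 50ms TV takes a 60fps game to ~100ms end to end - the same ballpark as
# a 30fps game on a fast display, which is the comparison being made.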
 
It's been stated on a few sites that Netflix and other 4K streaming is compressed to the point where it is basically the quality of Blu-ray. Download a demo to a thumb drive and try it to see what you think.

I'd also mention that the demo sets in most stores aren't calibrated. They are set to whatever the factory defaults are, which are often way off from proper and aimed at displaying the brightest, most over-saturated colors without necessarily being maxed out. They also usually have motion enhancement turned on, causing the Soap Opera Effect. This is no way to judge a set properly, and it will also need to be calibrated in your home, to the room it's in, to get proper viewing. Additionally, store demos have been running non-stop for varying lengths of time, sometimes causing burn-in on some sets.
Thanks, I'll give a thumb drive a go!
 

jeffc919

Member
It's 1080p at 20ms, 1080p HDR at 23ms, and 4K adding 30ms to that.

So >50ms combined, and people are lauding the low lag on it.

Where are you reading 4k adds 30ms? I know this topic keeps coming up because the KS series Samsungs look like the only hope for a sub-30ms input lag TV in 4k/HDR this year but nobody seems to know for sure what the real number is. The poster a few spots up has empirical evidence that it is in the 30s or better but I've seen others, like you, claim it is higher without a source or referencing some other model TV. Would you mind clarifying?
 

smisk

Member
Apparently my parents' ~2009 Vizio LCD TV has about 50ms of input lag and it's never bothered me. To be fair, I don't play FPS or fighting games on console; those would probably be much more noticeable.
 
I knew somewhat about the limitations / wishy-washy standards, but because I was moving I needed a new TV and wanted to future-proof myself with 4k and HDR without skimping on features (true 120hz, movie playback without stutter/soap opera, low input lag, though I'm not sure about HDR mode), and bought one I considered outside my initial price range.

Still having a bit of buyer's remorse now that the PS4 Pro upgrade is almost useless and 4K HDR content might still be too sparse. The blacks are also not very good, and on that chart it only reaches 297 nits. I was also annoyed that I couldn't get something like the P series in my country, as that stuff (and the low 4K prices in general) seems to be US-only...
In hindsight I maybe should have gotten a bigger 1080p OLED screen instead; it probably would have been cheaper too.

Don't get me wrong, once calibrated and set up correctly it looks amazing, but still :p I hope that firmware update will bring some good HDR content to the OG PS4. The OP brings up some valid questions.
 

Karak

Member
There are a couple in the higher cost range that I could see maybe getting, but since there isn't nearly enough media to risk jumping in and getting burned, I think it's fine for most to stay back and see what comes down the pipe a bit. I returned the KS8000. Compared to other TVs, testing side by side, its edge lighting and bad dimming killed my soul. Not as easily noticed until side by side, and then it's night and day.
Sadly it's still got the best input lag.
 

Hip Hop

Member
I'm gonna wait a year or so to get an HDR/4K TV.

With Scorpio being something I'll buy into, I will want to experience it right.
 

LordOfChaos

Member
5000 nits? I'd like to be able to use my eyes in 30 years time please.

5000 nits? Are they fucking insane? 2000 is strong enough most people would have to look away from the damn screen unless I'm reading this wrong.

I think the point is that it's tested on 2% of the screen area. It's not like staring at a 5000-nit laptop screen, which would be crazy; it's for rendering things like glints of sunlight realistically. The whole scene won't be that crazy bright.
 

Cyriades

Member

Damn, 5000 nits is bright..
 
I looked at those charts and that is what causes my confusion. I don't see the necessity of 12 bits + metadata; 12 bits alone makes sense, as does 10 bits + metadata. It looks to me like 12 bits using the PQ curve alone is good enough to be below the Barten ramp from 0.001 nits to 10,000 nits.

The metadata is information for the display. The display decodes the HDR10 metadata, or in Dolby Vision's case the required chip does the decoding, and takes control of the backlight, tone mapping, etc.
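
As a sketch of what that static metadata actually contains (the field names below are illustrative, not any real API; the contents follow SMPTE ST 2086 plus the MaxCLL/MaxFALL content light levels):

# Illustrative only - field names are made up, but this is the information
# HDR10 static metadata carries (SMPTE ST 2086 mastering display colour
# volume plus CTA-861.3 content light levels).
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    red_primary: tuple          # (x, y) chromaticity of the mastering display
    green_primary: tuple
    blue_primary: tuple
    white_point: tuple
    max_mastering_nits: float   # peak luminance the content was graded on
    min_mastering_nits: float
    max_cll: int                # brightest single pixel in the content, nits
    max_fall: int               # highest frame-average light level, nits

# A set that only reaches, say, 1000 nits uses these values to tone-map a
# 4000-nit-graded picture into its own range instead of just clipping it.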
 

wildfire

Banned
As annoying as it was to read the OP admit to being careless about including OLED in a way that misrepresents how it meets UHD, I'm more annoyed with the GAF posters who seem to know better but don't really go into whether or not there are DECENT, CHEAP displays to buy.

The OP's heart was in the right place in advising budget-conscious buyers not to get hoodwinked by marketing terms. But no one (of those who seem to understand why the OP was getting the details wrong) is actually moving this discussion forward for consumers who have to be pickier about tradeoffs when buying a low-cost alternative.

The only thing I personally can say is: if you actually want to take the time to research UHD, then to understand it properly you should read this much better thread.

What is the HDR standard?

Now that is how you explain things.
 

le-seb

Member
Brightness should be set using a calibration test disc. Backlight can be fine to set to zero, or as low as possible.
Calibration discs are better than nothing, but if you want proper screen calibration, you'll need to use a probe.
But who cares about calibrating their display anyway?
I do.
 

Theonik

Member
Most gaming monitors these days are all about gsync or freesync and having a high refresh rate and a nice IPS display.
Most gaming monitors are still TN panels, with some IPS, but neither is great for media. VA gives the best black levels. Once HDR takes off in PC games you might see it more, though.

On top of this, 1000+ nits requires specific lighting, so as not to melt your eyeballs.
It's the other way around. OLED HDR requires controlled room lighting and setup to hit the required black level. 1000 nits is fine. It's only used on highlights. Your eyes can deal with it.

How does the viewing environment affect the brightness you should be looking at to create the HDR effect? I mean, I don't think 500 nits in a pitch-black room causes the same reaction as 500 nits in a store environment. Doesn't this bring more variables into the mix?
The sets are designed to be used in darkness; as with any media display, calibration should be done for the light in the room.

This is easier said than done right now, though. Calibration tools don't really work for HDR yet, and there is no way to really adjust for bright rooms due to how HDR works on many of these sets. Basically, the sets are already at their brightest, so you can't adjust up to compensate.

Something like the Sony ZD9 might get around this, being able to do 1800 nits, but I haven't seen much proper testing of it yet.

Edit: HDR is about dynamic range, i.e. how many steps of brightness you have. It's not about torching your eyes.
 

iMax

Member
Might be waiting forever. I think the internet will eventually mean there won't be OTA broadcasts. I don't know that a 4k signal can even be transmitted OTA. Can it?

4K satellite broadcasts are quite established in Europe now.
 

Mokujin

Member
Very interesting read.

I would like to ask, given that I'm not that interested in 4k right now: what about 1080p PC monitors and UHD features? Are there good choices, or is the market still immature?
 

hesido

Member
No they are not. 12 bit is the max and you can only do 4096 brightness steps in that.

The 4096th step could be 10000 nits? I don't think banding will be an issue when an adjacent pixel can only be 9750 nits instead of 9999 nits. Or maybe they work some math magic and can still carry that much information by storing it in blocks, so you can display all the possible brightness levels inside a block as long as you don't have 4096 pixels in that block (let's say 512 pixels), through lossy or lossless compression.

I barely know what I'm talking about..
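
For what it's worth, the step size can actually be computed. Here's a quick sketch using the published ST 2084 (PQ) constants (it assumes full-range 12-bit codes, which is a simplification):

# Size of the top 12-bit code steps on the PQ (SMPTE ST 2084) curve.
m1 = 2610 / 16384         # 0.1593017578125
m2 = 2523 / 4096 * 128    # 78.84375
c1 = 3424 / 4096          # 0.8359375
c2 = 2413 / 4096 * 32     # 18.8515625
c3 = 2392 / 4096 * 32     # 18.6875

def pq_to_nits(code, bits=12):
    """Map a PQ code value to luminance in nits (cd/m^2)."""
    e = (code / (2 ** bits - 1)) ** (1 / m2)
    return 10000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

for code in (4093, 4094, 4095):
    print(f"step {code - 1} -> {code}: {pq_to_nits(code) - pq_to_nits(code - 1):.1f} nits")

# The very last step comes out around 23 nits at a 10,000-nit peak - roughly
# a 0.2% jump, which is why 12-bit PQ stays under the Barten ramp.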
 

Vintage

Member
For a game to well and truly support WCG, every texture and video will need to be replaced with a 10-bit WCG version.

How impactful will changing textures be? In 3D games, textures are applied to polygons, which are lit depending on light intensity, colour, and angle to the light. That alone can output to a wide colour space. Add reflections and other shader effects, and I'm pretty sure the final output can take advantage of 10-bit channels. Textures are just one of many inputs on the way to the final picture.
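
A toy illustration of that point (Python standing in for shader math; all the numbers are arbitrary):

# Shading happens in floating point, so a single 8-bit texel can land on
# the full range of 10-bit output values once lighting varies across a
# surface. Numbers here are arbitrary.
import numpy as np

albedo = 180 / 255                       # one 8-bit texture sample
light = np.linspace(0.0, 2.0, 4096)      # varying per-pixel light intensity
shaded = np.clip(albedo * light, 0, 1)   # simple diffuse-style shading
out_levels = np.round(shaded * 1023).astype(int)   # quantize to 10-bit output

print(len(np.unique(out_levels)))        # 1024 - every 10-bit level gets hit

A flat, unlit gradient is the opposite case: it stays stuck at the source's 8-bit granularity, which is where replacing assets matters most.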
 

Peltz

Member
Apparently my parents' ~2009 Vizio LCD TV has about 50ms of input lag and it's never bothered me. To be fair, I don't play FPS or fighting games on console; those would probably be much more noticeable.
That's because you aren't accustomed to something faster. You would absolutely feel the difference.
 

EvB

Member
How impactful will changing textures be? In 3D games, textures are applied to polygons, which are lit depending on light intensity, colour, and angle to the light. That alone can output to a wide colour space. Add reflections and other shader effects, and I'm pretty sure the final output can take advantage of 10-bit channels. Textures are just one of many inputs on the way to the final picture.

Yeah I suspect that is about all we are going to see at the moment
 
This is totally anecdotal, but I have an 8000, my dad has an 8500. I love my 8000, but his 8500 has enough edge light bleeding that I would have been shipping it back if it were mine.

My 8500 had some light bleed when I set it up, but I "gently massaged" the offending areas on the advice of someone online and the very minor light bleed went away.
 

DonShula

Member
this HDR seems like a huge opportunity for people to have no idea what any of this shit means

Oh it is. There is so much misinformation in this thread that it would take hours to clean up. We already have multiple threads where people obsess over input lag without knowing how to properly test it. People compare numbers from different sources and assume they used the same measurement methods. And practically no one stops to wonder how that actually affects their gameplay experience beyond assuming "high numbers = bad."

Now we're doing the same thing with HDR and brightness measurements. I feel bad for the OP, as they even explained what the measurements mean and tipped a hat to the source of them, yet it appears almost no one bothered to read that and instead went straight to the chart. Now we have a bunch of posts to the effect of "5000 nits will kill my eyes." Makes me wonder how many people would even notice an HDR source on display without an SDR source right next to it.

But it makes great fodder for baseless discussion so I guess that's something.
 

Koppai

Member
My XBR65X810C doesn't have HDR but I honestly don't think it really matters; it's still 4K and looks awesome. Just bought an HT-CT790 soundbar to go with it yesterday :D
 

Vanillalite

Ask me about the GAF Notebook
If your TV is old

- too small
- cheap model
- maybe only 720p
- maybe on its last legs
- a TV you'd move elsewhere in your house

Then upgrade away

...

If your TV isn't that old

- if you bought a high-end TV last go-round
- if it's already rather big
- if it still works great
- if you want but don't need

Then wait!


There is no reason to rush out and get a TV right now just because, especially when the tech is gonna mature, standards will get worked out, and things will get cheaper.

That being said, there are some good TVs out there at all price levels. So if you have an old cheap 720p TV, or a small 32" set you got when you were young, then by all means go ahead.

Finally, I can't stress this enough: IT'S YOUR MOTHER FUCKING MONEY AND YOUR TV! By all means ask for advice, read reviews, browse around. But you ultimately have to live with your purchase, so your needs, cash allowance, room setup, and ability to care or not care about shit are uniquely you. If you can deal with higher latency than some, then fine. It's your TV, internet be damned.

If you get a TV and you just don't like it despite good reviews, return the damn thing. It's your cash and your TV. You gotta be happy with your purchase. Also, just 'cause you buy an $800 TV and it doesn't stack up to the latest $3000 TV, don't fret. Not everyone has to drive a Veyron, nor should you feel sad if you like your new Honda or Toyota.
 
My XBR65X810C doesn't have HDR but I honestly don't think it really matters; it's still 4K and looks awesome. Just bought an HT-CT790 soundbar to go with it yesterday :D

The 810C is in a weird place - it supports HDR via USB using the built-in video app (try downloading http://demo-uhd3d.com/fiche.php?cat=uhd&id=144 and putting it on a USB stick).

It doesn't support HDR over HDMI though - Sony updated some of the 2015 TVs to support HDR over HDMI and this one was excluded. It's kind of weird, since they later promised the HDR-over-HDMI update for some newer 2016 sets that have worse color. For example, the 750D has an 8-bit panel and is being updated, while the 810C is a 10-bit panel with no update in sight.
 

Hjod

Banned
I was thinking of buying an LG 4K OLED, the B6, but seeing how high the response time is I might just wait. I already have an OLED that I'm really pleased with.
 

Korezo

Member
So I should not buy the LG E6 OLED and just wait until next year for the 100fps LG? I doubt there will be many TVs next year better than the LG OLED now, unless it's another OLED by LG or Panasonic..
 

jeffc919

Member
Man, people here are wrong. KS8000 HDR is 22.6ms. Non-HDR is 20.6ms.

Period. No 20 + 30 crap.

I think they are wrong too but how do you know it is still 22.6 in 4k if the measurement was taken in 1080p? I think people are jumping the gun buying this thing until we know for sure, especially if their main reason is for PS4 Pro which is almost 2 months away.
 

dose

Member
Okay thanks. But where does it say that going to 4k resolution adds 30 ms on top of the ~20ms at 1080p?

Man, people here are wrong. KS8000 HDR is 22.6ms. Non-HDR is 20.6ms.

Period. No 20 + 30 crap.
I posted the source of the +30ms extra lag with 4k input in another thread.
http://uk.rtings.com/tv/reviews/by-...d-vs-1080p-full-hd-tvs-and-upscaling-compared
[rtings.com chart: input lag compared for 1080p vs 4k input]

So that's another 30ms input lag on top of the 26.1ms that it has at 1080p.
Syrus, who is more likely to be right, you or a website that specializes in testing TVs?
 
I posted the source of the +30ms extra lag with 4k input in another thread.

Syrus, who is more likely to be right, you or a website that specializes in testing TVs?

This is confusing and requires further clarification. Rtings specifically states a ~23ms rating for the KS8000 with HDR10 enabled.

HDR10 = a 4K signal. AFAIK, there is no flavor of 1080p you can send over HDMI with HDR10 metadata enabled.
 

Theonik

Member
The 4096th step could be 10000 nits? I don't think banding will be an issue when an adjacent pixel can only be 9750 nits instead of 9999 nits. Or maybe they work some math magic and can still carry that much information by storing it in blocks, so you can display all the possible brightness levels inside a block as long as you don't have 4096 pixels in that block (let's say 512 pixels), through lossy or lossless compression.

I barely know what I'm talking about..
The 10k nits are meant to be a later target, so likely not part of the 12-bit roadmap. But that's one of the main functions of gamma in older display technologies (our eyes aren't linear, and neither are displays). HDR doesn't use a gamma function though; it uses the PQ curve instead.
 

gatti-man

Member
Where are you reading 4k adds 30ms? I know this topic keeps coming up because the KS series Samsungs look like the only hope for a sub-30ms input lag TV in 4k/HDR this year but nobody seems to know for sure what the real number is. The poster a few spots up has empirical evidence that it is in the 30s or better but I've seen others, like you, claim it is higher without a source or referencing some other model TV. Would you mind clarifying?

UHD doesn't add 30ms. I have a 75" Vizio 4K with a high-speed input that's under 30ms total for 4K gaming.

I think they are wrong too but how do you know it is still 22.6 in 4k if the measurement was taken in 1080p? I think people are jumping the gun buying this thing until we know for sure, especially if their main reason is for PS4 Pro which is almost 2 months away.

There are sites that measure input lag and verify manufacturer claims. There is a metric ton of misinformation in this thread; I wouldn't believe anything in here, it's that bad.
 