
Gamers, now is not a good time to buy a cheap "HDR" TV

n0razi

Member
I think 1000 nits is fine... just cause you can buy a 10,000 watt speaker system doesnt mean you should
 

Ranger X

Member
The most important aspect of a TV set for gaming purposes is the response. That's the number one priority if you're serious about gaming, in my opinion.

Whatever fluff they throw at you (4K, HDR and whatnot) will not make your gaming great when those sets are still slowpokes. Personally, I'm not calling it much of an improvement when we go from "2-4 ms with barely any input lag" to "10-15 ms". That's just disastrous.
 

BlueTsunami

there is joy in sucking dick

*When Jared Leto slowly takes off his shirt to reveal his pasty complexion*
 

jstevenson

Sailor Stevenson
Oh man, the blind leading the blind.

A few TVs do hit the new specifications. Also worth noting that some of the TVs on your chart that aren't listed under the UHD Premium spec actually are certified, notably LG's OLEDs (E6/B6/G6/C6), because they have perfect black and thus their effective dynamic range is comparable to a 1000-nit set.

Your Dolby Vision spec on that window is pure FUD.

The bigger problem HDR-wise isn't that some TVs aren't 1000 nits. It's that the HDR content isn't being mastered in a standardized way, nor are the TVs standardized. The key point is that the HDR metadata should carry info about what the content was mastered on and help the TV understand how to display the HDR based on its own capabilities. This doesn't work very well right now.

Dolby Vision doesn't need home TVs to hit that super-high mark, because the Dolby Vision chips use dynamic metadata, and each side has info on its capabilities, which shapes the HDR curves to the actual display you're watching on. It knows the display's capability and can adjust the roll-offs for brightness your TV can't handle correctly (while also adjusting scene-by-scene with dynamic metadata).

In short: it's not the TV capabilities that are the problem right now so much as the lack of standardization in how the content is mastered and played back. This is specific to HDR10; Dolby Vision has this down and, because of that, is the superior format at the moment, despite not being nearly as widely adopted. HDR10 is getting dynamic metadata, but we still need to see a lot of progress in how content is mastered for it.
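To make the "shaping the HDR curve to the display" idea concrete, here's a rough Python sketch of a generic soft-knee roll-off. It is not Dolby's actual algorithm (that's proprietary); the function name, knee position, and numbers are all made up purely for illustration.

```python
# Rough sketch of "rolling off" highlights that exceed what a given display
# can show, instead of hard-clipping them. NOT Dolby's algorithm; a generic
# soft-knee tone map for illustration only.

def roll_off(scene_nits: float, display_peak: float, knee: float = 0.75) -> float:
    """Map scene luminance (nits) to display luminance (nits).

    Below `knee * display_peak` the signal passes through unchanged; above it,
    highlights are compressed smoothly so they never exceed `display_peak`.
    """
    knee_nits = knee * display_peak
    if scene_nits <= knee_nits:
        return scene_nits
    # Compress everything above the knee into the remaining headroom.
    headroom = display_peak - knee_nits
    excess = scene_nits - knee_nits
    return knee_nits + headroom * (excess / (excess + headroom))

# A 4000-nit highlight on displays with different peak capabilities:
for peak in (600, 1000, 4000):
    print(peak, round(roll_off(4000, peak), 1))
# With per-scene (dynamic) metadata, the mapping can also use the scene's own
# peak rather than a single title-wide value, which is the Dolby Vision point.
```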
 
Maybe it's one of those things you have to see first-hand, but I question the benefit of reaching 4000 nits. I would think that at that brightness you're going to cause the iris to contract, effectively clipping off the bottom range of your content (loss of detail in dark areas). Maybe it's good for simulating sunsets or blindness effects, but can't the same effect be achieved just by grading the content?

Also, going below 0.05 nits has rapidly diminishing returns, since light reflected from your environment and from skin/clothing will push the floor close to 0.05 nits anyway, so I would not worry about lower black levels in future standards.

That leads me to say that the primary benefit of HDR occurs in the 100-1000 nit range, along with the expanded colour gamut and 12-bit colour, which will effectively eliminate banding. I would pay more attention to the bit depth, and perhaps to the inclusion of dynamic metadata (although I don't see the necessity of it at 12-bit depth; at 10 bits it makes sense), than to the ability of a set to hit over 1000 nits.
 
The most important aspect of a TV set for gaming purposes is the response. That's the number one priority if you're serious about gaming, in my opinion.

Whatever fluff they throw at you (4K, HDR and whatnot) will not make your gaming great when those sets are still slowpokes. Personally, I'm not calling it much of an improvement when we go from "2-4 ms with barely any input lag" to "10-15 ms". That's just disastrous.

10-15 ms of latency is still less than one frame at 60 fps. I think most people will give up one frame of latency for 4K HDR.
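(For anyone who wants the arithmetic spelled out, here's the quick version; the numbers are just the ones quoted above.)

```python
# Frame time at 60 fps vs. the added display latency quoted above.
frame_time_ms = 1000 / 60                  # ≈ 16.7 ms per frame
for lag_ms in (10, 15):
    print(f"{lag_ms} ms ≈ {lag_ms / frame_time_ms:.2f} frames of extra lag")
# 10 ms ≈ 0.60 frames, 15 ms ≈ 0.90 frames: under one frame either way.
```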
 
The most important aspect of a TV set for gaming purposes is the response. That's the number one priority if you're serious about gaming, in my opinion.

Whatever fluff they throw at you (4K, HDR and whatnot) will not make your gaming great when those sets are still slowpokes. Personally, I'm not calling it much of an improvement when we go from "2-4 ms with barely any input lag" to "10-15 ms". That's just disastrous.

That's less than a frame of difference. You cannot perceive that.
 
How does the viewing environment affect the brightness you should be targeting to create the HDR effect? I mean, I don't think 500 nits in a pitch-black room causes the same reaction as 500 nits in a store environment. Doesn't this bring even more variables into the mix?
 

ShutterMunster

Junior Member
Oh man, the blind leading the blind.

A few TVs do hit the new specifications. Also worth noting that some of the TVs on your chart that aren't listed under the UHD Premium spec actually are certified, notably LG's OLEDs (E6/B6/G6/C6), because they have perfect black and thus their effective dynamic range is comparable to a 1000-nit set.

Your Dolby Vision spec on that window is pure FUD.

The bigger problem HDR-wise isn't that some TVs aren't 1000 nits. It's that the HDR content isn't being mastered in a standardized way, nor are the TVs standardized. The key point is that the HDR metadata should carry info about what the content was mastered on and help the TV understand how to display the HDR based on its own capabilities. This doesn't work very well right now.

Dolby Vision doesn't need home TVs to hit that super-high mark, because the Dolby Vision chips use dynamic metadata, and each side has info on its capabilities, which shapes the HDR curves to the actual display you're watching on. It knows the display's capability and can adjust the roll-offs for brightness your TV can't handle correctly (while also adjusting scene-by-scene with dynamic metadata).

In short: it's not the TV capabilities that are the problem right now so much as the lack of standardization in how the content is mastered and played back. This is specific to HDR10; Dolby Vision has this down and, because of that, is the superior format at the moment, despite not being nearly as widely adopted. HDR10 is getting dynamic metadata, but we still need to see a lot of progress in how content is mastered for it.

Thank you Based Stevenson!
 
My problem with all of this HDR talk is that it flies in the face of a lot of what we've been told about current display calibration. For example, 120 cd/m2 or so is the standard luminance target and is recommended to avoid eye fatigue. How do all these crazy nit levels square with that? Torch mode (Dynamic, etc.) is the scourge of the earth for how unnecessarily bright it is on modern TVs, but now we're trying to push luminance to insane levels?

I'm more interested in its ability to improve shadow and dark detail. Previous tech struggles with that.

I'll be curious to see how calibration standards change with the new specs. Until we have concrete calibration targets and this HDR / wide-color scene is well defined, I'm holding off. Prices are also too high right now for me to adopt.

I have a 4K Vizio, a 2015 M-Series, that does not support HDR. Oh well. The 4K is still a big improvement.

Still, I'll be using the new consoles on my 1080p Panny plasma. Almost nothing touches quality plasmas in terms of PQ from what I've seen. The OLED TVs I've seen certainly can and do, but they are way too expensive right now.

That's because display technology was limited in color. The moment you pushed above 100-120 cd/m2, whites would clip, and a blue sky would start to go white and lose its color saturation. With a wide color gamut (DCI-P3 / Rec. 2020), 10-bit panels, a new EOTF, and higher peak brightness, displays can produce colors they couldn't before: lightning strikes can be bright without washing out everything else in the scene or image, and the display maintains a high contrast ratio. This isn't you cranking up the contrast; it's the display increasing brightness with its LEDs based on the metadata.
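A toy Python sketch of the clipping point being made here: an SDR-referenced display clamps everything above roughly 100 nits to the same white, while an HDR display with more headroom keeps bright highlights distinct. The scene values are invented purely for illustration, and real sets roll off near their peak rather than hard-clipping.

```python
# On an SDR-referenced display everything above ~100 nits gets clamped to the
# same white, so a 1,000-nit lightning strike and a 300-nit cloud become
# indistinguishable. A display with more headroom keeps them apart.

def sdr_clip(nits: float, white: float = 100.0) -> float:
    return min(nits, white)

def hdr_display(nits: float, peak: float = 1000.0) -> float:
    return min(nits, peak)   # real sets roll off near peak; a clamp is enough here

scene = {"shadow": 0.5, "blue sky": 80, "cloud highlight": 300, "lightning": 1000}
for name, nits in scene.items():
    print(f"{name:16s} SDR: {sdr_clip(nits):6.1f}  HDR: {hdr_display(nits):7.1f}")
```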
 
I strongly disagree with the OP. Now is a fine time to buy an HDR TV, if you can afford the good ones. If you're looking at budget options, then don't bother. Do your research, go to dedicated sites about TVs like avsforum, and be prepared to spend $1500+. Get the biggest TV you can afford and move that couch as close to the TV as possible. If you're not willing to do that, then wait it out.
 

Thewonandonly

Junior Member
Question and I'm ignoring the OP sorry ;)

But I'm looking to get a TV and can't decide between the Vizio P-Series, the Samsung KS8000, or this other HDR TV that's $1200 for a 55-inch at Best Buy.

Help!!

Edit: no idea why NX popped in :O
 

III-V

Member
That's because display technology was limited in color. The moment you pushed above 100-120 cd/m2, whites would clip, and a blue sky would start to go white and lose its color saturation. With a wide color gamut (DCI-P3 / Rec. 2020), 10-bit panels, a new EOTF, and higher peak brightness, displays can produce colors they couldn't before: lightning strikes can be bright without washing out everything else in the scene or image, and the display maintains a high contrast ratio. This isn't you cranking up the contrast; it's the display increasing brightness with its LEDs based on the metadata.

Is this 100% accurate? If you only have edge-lit LED rather than full-array, does this still function properly?
 

RoboPlato

I'd be in the dick
Oh man, the blind leading the blind.

A few TVs do hit the new specifications. Also worth noting that some of the TVs on your chart that aren't listed under the UHD Premium spec actually are certified, notably LG's OLEDs (E6/B6/G6/C6), because they have perfect black and thus their effective dynamic range is comparable to a 1000-nit set.

Your Dolby Vision spec on that window is pure FUD.

The bigger problem HDR-wise isn't that some TVs aren't 1000 nits. It's that the HDR content isn't being mastered in a standardized way, nor are the TVs standardized. The key point is that the HDR metadata should carry info about what the content was mastered on and help the TV understand how to display the HDR based on its own capabilities. This doesn't work very well right now.

Dolby Vision doesn't need home TVs to hit that super-high mark, because the Dolby Vision chips use dynamic metadata, and each side has info on its capabilities, which shapes the HDR curves to the actual display you're watching on. It knows the display's capability and can adjust the roll-offs for brightness your TV can't handle correctly (while also adjusting scene-by-scene with dynamic metadata).

In short: it's not the TV capabilities that are the problem right now so much as the lack of standardization in how the content is mastered and played back. This is specific to HDR10; Dolby Vision has this down and, because of that, is the superior format at the moment, despite not being nearly as widely adopted. HDR10 is getting dynamic metadata, but we still need to see a lot of progress in how content is mastered for it.
Is there any info on when dynamic metadata is hitting for HDR10 and what companies will add it in firmware updates for currently released displays?
 
Well, the TV I bought in January is in the top 10! It says it's HDR compatible. HDR programs on Amazon look amazing. Oh well. I still like 3D, so I have that over the newer, nittier TVs. I also game on this here 1440p monitor.
 

Dsyndrome

Member
But even at 50ms, those people did not detect the lag, and they thought their TV was great...
This is where I'm at. I'd been playing on my 4K TV from last year in Standard mode up until a month or two ago and never noticed the input lag. Then again, I don't play FPSes competitively.

Ordered an X930D last week to get HDR. No regrets; the input lag will still be better than what I was already fine with before.
 
Maybe it's one of those things you have to see first-hand, but I question the benefit of reaching 4000 nits. I would think that at that brightness you're going to cause the iris to contract, effectively clipping off the bottom range of your content (loss of detail in dark areas). Maybe it's good for simulating sunsets or blindness effects, but can't the same effect be achieved just by grading the content?

Also, going below 0.05 nits has rapidly diminishing returns, since light reflected from your environment and from skin/clothing will push the floor close to 0.05 nits anyway, so I would not worry about lower black levels in future standards.

That leads me to say that the primary benefit of HDR occurs in the 100-1000 nit range, along with the expanded colour gamut and 12-bit colour, which will effectively eliminate banding. I would pay more attention to the bit depth, and perhaps to the inclusion of dynamic metadata (although I don't see the necessity of it at 12-bit depth; at 10 bits it makes sense), than to the ability of a set to hit over 1000 nits.

Look up the EOTF (PQ), i.e. SMPTE ST 2084. Even at 10-bit you can see some banding; against the Barten ramp you can see that 12-bit is sufficient all the way up to 10,000 nits.

Most of the luminance range is defined by PQ / ST 2084, with tone mapping handling what a display can't reach.
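For reference, here's the PQ (SMPTE ST 2084) EOTF being discussed, as a small Python sketch. The constants are the ones published in the standard; the sample code values are just illustrative points on the curve.

```python
# The PQ (SMPTE ST 2084) EOTF: maps a normalized code value in [0, 1] to
# absolute luminance in nits, up to 10,000. Constants are from the standard.

M1 = 2610 / 16384            # ≈ 0.1593017578125
M2 = 2523 / 4096 * 128       # ≈ 78.84375
C1 = 3424 / 4096             # ≈ 0.8359375
C2 = 2413 / 4096 * 32        # ≈ 18.8515625
C3 = 2392 / 4096 * 32        # ≈ 18.6875

def pq_eotf(v: float) -> float:
    """Normalized PQ code value (0-1) -> luminance in cd/m2 (nits)."""
    vp = v ** (1 / M2)
    return 10000.0 * (max(vp - C1, 0.0) / (C2 - C3 * vp)) ** (1 / M1)

# A few reference points: 0.5 maps to roughly 92 nits, 0.75 to just under
# 1,000 nits, and 1.0 to exactly 10,000 nits.
for v in (0.0, 0.5, 0.75, 1.0):
    print(v, round(pq_eotf(v), 1))
```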
 
Mostly gaming, but I'll also watch a lot of movies on it. And the question is: which one should I get? :)

Without going into tooooo much detail, I say go for the Samsung KS8000. That's the one I'm getting, and here's why:

~20 ms latency in HDR and Game mode (compared to ~60 ms with the Vizio)
4:4:4 chroma at 4K

The things the Vizio does better: a better low-light picture, Dolby Vision support in addition to HDR10, and 120Hz at 1080p.
 
Look up the EOTF (PQ), i.e. SMPTE ST 2084. Even at 10-bit you can see some banding; against the Barten ramp you can see that 12-bit is sufficient all the way up to 10,000 nits.

Most of the luminance range is defined by PQ / ST 2084, with tone mapping handling what a display can't reach.

I looked at those charts, and that's what causes my confusion. I don't see the necessity of 12 bits plus metadata; 12 bits alone makes sense, as does 10 bits plus metadata. It looks to me like 12 bits on the PQ curve alone is enough to stay below the Barten ramp from 0.001 nits to 10,000 nits.
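To put rough numbers on the banding question, here's a small sketch (repeating the PQ EOTF from the block above so it stays self-contained) that compares the relative luminance jump between adjacent code values at 10-bit versus 12-bit. Whether a given step is visible is what the Barten-ramp comparison is about; the mid-range code values picked here are arbitrary.

```python
# Luminance jump between adjacent PQ code values at 10-bit vs 12-bit.
# Smaller relative steps mean less visible banding; the Barten curve is the
# visibility threshold these steps get compared against.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(v: float) -> float:
    vp = v ** (1 / M2)
    return 10000.0 * (max(vp - C1, 0.0) / (C2 - C3 * vp)) ** (1 / M1)

def step_percent(code: int, bits: int) -> float:
    """Relative luminance jump (%) from one code value to the next."""
    levels = 2 ** bits - 1
    lo, hi = pq_eotf(code / levels), pq_eotf((code + 1) / levels)
    return 100.0 * (hi - lo) / lo

# Compare steps around mid-range code values for 10-bit and 12-bit signals:
# the 12-bit steps come out roughly a quarter the size of the 10-bit ones.
for bits in (10, 12):
    mid = (2 ** bits - 1) // 2
    print(f"{bits}-bit step near code {mid}: {step_percent(mid, bits):.3f}%")
```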
 

Gitaroo

Member
Without going into tooooo much detail, I say go for the Samsung KS8000. That's the one I'm getting, and here's why:

~20 ms latency in HDR and Game mode (compared to ~60 ms with the Vizio)
4:4:4 chroma at 4K

The things the Vizio does better: a better low-light picture, Dolby Vision support in addition to HDR10, and 120Hz at 1080p.

Didn't the other thread say that when an HDR signal kicks in, it's the base 20 ms plus another 30 ms, so 50+ ms?
 

Thewonandonly

Junior Member
Without going into tooooo much detail, I say go for the Samsung KS8000. That's the one I'm getting, and here's why:

~20 ms latency in HDR and Game mode (compared to ~60 ms with the Vizio)
4:4:4 chroma at 4K

The things the Vizio does better: a better low-light picture, Dolby Vision support in addition to HDR10, and 120Hz at 1080p.
Well, I've got tinfoil over the windows of my basement, so it's always dark as hell. Should that push me toward the Vizio? I'm leaning toward the Samsung, though.

Also thanks for all the help man :)
 

J-Rzez

Member
I strongly disagree with the OP. Now is a fine time to buy an HDR TV, if you can afford the good ones. If you're looking at budget options, then don't bother. Do your research, go to dedicated sites about TVs like avsforum, and be prepared to spend $1500+. Get the biggest TV you can afford and move that couch as close to the TV as possible. If you're not willing to do that, then wait it out.

I agree the OP is very wrong for this thread, scaring people for no reason. That Dolby Vision graph is odd, and they included OLEDs in the same chart when, in reality, they can come the closest to the recommended specs. Dolby Vision is great and all, but I'm skeptical it catches on due to the royalties and the fact that it's another physical component that must be present, which is why two of the biggest players, Sony and Samsung, won't feature it.

No reason to scare people or make them upset with their purchase. A KS8000 is going to be more than fine for most people. Those with an EF9500 or a newer x6-series OLED are fine as well.

To the person deciding between the Vizio and the Samsung: buy the KS8000.
 
Man,

The EF9500 is a beast, considering it wasn't designed for HDR out of the box.

And AstroLad is right. It's never a good time to buy anything because something better is always around the corner. Just buy what you want and be happy.

I've fired up my LG EF9500 and HDR on Netflix looks god-tier on it, and it's nowhere near the 1000-nit standard. I'll probably upgrade to UHD for some movies I love once the Scorpio is out (my Marantz receiver lacks HDMI 2.0a and is a 1400-dollar bottleneck between not losing sound quality and getting HDMI 2.0a... all so I could save 400 dollars a year ago. Dumb long-term move :/)

I have the 65EF9500 - I didn't think the 2015s supported HDR.

edit: oh and I love my OLED. It's amazing.
 
Didn't the other thread say that when an HDR signal kicks in, it's the base 20 ms plus another 30 ms, so 50+ ms?
That is not correct, at least according to the testing of Rtings:
" We measured an input lag of 62.1ms when sending an HDR signal with 'Game Low Latency' on HDMI1 on the Vizio P. This is quite high for gaming, and will be an issue for a lot of people. The KS8000 is a good choice. We found that when sending an HDR signal, it is necessary to set the color space to 'Native' but then it successfully plays HDR content even with the 'Game' special picture mode. We measured the input lag with HDR metadata and 'Game' mode enabled to be 22.6ms."
Well, I've got tinfoil over the windows of my basement, so it's always dark as hell. Should that push me toward the Vizio? I'm leaning toward the Samsung, though.

Also thanks for all the help man :)
The Vizios are fantastic too, and better in low light. Maybe that's the way to go for you!
 

haxan7

Banned
Even what some people would consider moderately bright TV images have triggered, and continue to trigger, migraines for me.

When I bought a 4K set in May, I made sure to buy a set that allowed backlight dimming.

Not sure this HDR thing is gonna work out.
 

vivftp

Member
So tempted to buy one of the Sony Z9 series TVs next year, but I think I might try to hold on a bit longer. I think I can be comfortable with my 50" Bravia another couple of years... maybe even long enough to see the PS5 come out, then I finally pounce.
 

Dubz

Member
All I know is that when I watch my XBR65850B I fucking love it. I'm no videophile... ignorance is bliss.
 

jstevenson

Sailor Stevenson
Oh man, the blind leading the blind.

A few TVs do hit the new specifications. Also worth noting that some of the TVs on your chart that aren't listed under the UHD Premium spec actually are certified, notably LG's OLEDs (E6/B6/G6/C6), because they have perfect black and thus their effective dynamic range is comparable to a 1000-nit set.

Your Dolby Vision spec on that window is pure FUD.

The bigger problem HDR-wise isn't that some TVs aren't 1000 nits. It's that the HDR content isn't being mastered in a standardized way, nor are the TVs standardized. The key point is that the HDR metadata should carry info about what the content was mastered on and help the TV understand how to display the HDR based on its own capabilities. This doesn't work very well right now.

Dolby Vision doesn't need home TVs to hit that super-high mark, because the Dolby Vision chips use dynamic metadata, and each side has info on its capabilities, which shapes the HDR curves to the actual display you're watching on. It knows the display's capability and can adjust the roll-offs for brightness your TV can't handle correctly (while also adjusting scene-by-scene with dynamic metadata).

In short: it's not the TV capabilities that are the problem right now so much as the lack of standardization in how the content is mastered and played back. This is specific to HDR10; Dolby Vision has this down and, because of that, is the superior format at the moment, despite not being nearly as widely adopted. HDR10 is getting dynamic metadata, but we still need to see a lot of progress in how content is mastered for it.

One other point.

This is really from an accuracy / picture-nerd point of view.

The current TVs doing HDR10 still look awesome; it's just that sometimes there are weird things, like colors being slightly off from what was intended (more or less saturated, etc.).

Most folks will watch/play and go AWESOME. And it does look awesome: it's brighter, far more detailed, and more color-accurate than what many people get on their current TVs at home when they leave them in BRIGHT mode or don't calibrate the picture.

The downsides of HDR10 are just that you lose a lot of control as the display takes over, nobody is exactly sure whether the content looks the way it's supposed to, and there's no real way to calibrate because the content isn't standardized.

For the vast majority of consumers, what's out there right now is a huge step up. I'm personally just a big fan of Dolby Vision because it makes things more plug-and-play for the end user and is standardized, which I think is needed, as even the biggest AV nerds on AVS and elsewhere don't know how to deal with the variation in mastering of HDR10 content and the variation in playback on the displays.

It'll all get sorted out over time, though I truly believe more of it is on the mastering/content side than on the TV/hardware side.
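Here's a tiny sketch of the static-versus-dynamic metadata point: with a single title-wide peak (HDR10-style static metadata), a dark scene gets compressed against a peak it never reaches, while per-scene metadata lets the display map it more faithfully. The linear compression and every number below are made up purely for illustration.

```python
# With static (HDR10-style) metadata a display tone-maps every scene against
# the title-wide peak, even when the current scene never gets near it.
# Dynamic metadata lets it use the scene's own peak instead.

DISPLAY_PEAK = 600.0        # a typical mid-range HDR set, in nits

def tone_map(pixel_nits: float, content_peak: float) -> float:
    """Compress [0, content_peak] into [0, DISPLAY_PEAK] proportionally."""
    scale = min(1.0, DISPLAY_PEAK / content_peak)
    return pixel_nits * scale

title_peak = 4000.0         # MaxCLL-style value for the whole movie
scene_peak = 300.0          # brightest pixel in the current (dark) scene

pixel = 250.0               # a highlight within the dark scene
print("static metadata :", tone_map(pixel, title_peak))   # 37.5 nits, crushed
print("dynamic metadata:", tone_map(pixel, scene_peak))   # 250.0 nits, intact
```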
 
If you want streaming, watch the 4K content from these talented folks instead:

https://www.youtube.com/playlist?list=PLD33E5618740295DF

Pretty phenomenal stuff. It's what originally sold me on 4K.
Can't. My YouTube app won't do 4K and neither will my OG PS4. It's hard to even know if the PS4 Pro is worth it because I can't find a reliable way to play 4K at all. Netflix works, but I didn't feel it was a noticeable jump.

My screen is just 55 inches.
 

Nipo

Member
Do you have a source for this? From rtings.com, I was just reading that the Samsung KS8000 with a 4K source has 20.9 ms of input lag (in Game mode). Where is this 50 ms coming from?

http://www.rtings.com/tv/tests/inputs/input-lag

From the review of the tv when you click the question mark next to input lag:

"What it is: Lowest input lag possible on TV with a 1080p @ 60Hz input."

The table you linked was just pointing out that it's a 4K TV, not that they measured the lag at 4K. The tool they use to measure lag only works at 1080p.
 

jaaz

Member
From the review of the tv when you click the question mark next to input lag:

"What it is: Lowest input lag possible on TV with a 1080p @ 60Hz input."

The table you linked was just pointing out that it's a 4K TV, not that they measured the lag at 4K. The tool they use to measure lag only works at 1080p.

Okay, thanks. But where does it say that going to 4K resolution adds 30 ms on top of the ~20 ms at 1080p?
 

Nipo

Member
Okay, thanks. But where does it say that going to 4K resolution adds 30 ms on top of the ~20 ms at 1080p?

It doesn't. As far as I know, there are no digital testers right now that can measure input lag at 4K, so you'd need to measure it the old-fashioned way. It adds a bit, but I don't know how much.
 