Buyer's remorse: the thread.
The most important aspect of a TV set for gaming purposes is the freaking response time. That's the number one priority if you're serious about gaming, in my opinion.
Whatever fluff they throw at you (4K, HDR and whatnot) will not make your gaming great when those sets are still slowpokes. Personally, I wouldn't call it much of an improvement going from "2-4 ms response + barely any input lag" to "10-15 ms". That's just disastrous.
oh man blind leading the blind.
A few TVs do hit the new specifications. Also worth noting, some of the TVs on your chart that aren't listed under the UHD Premium spec actually are certified, notably LG's OLEDs (E6/B6/G6/C6), because they have essentially perfect blacks, so their contrast is comparable to a 1000-nit set.
Your Dolby Vision spec on that window is pure FUD.
The bigger problem HDR-wise isn't that some TVs aren't 1000 nits. It's that HDR content isn't being mastered in a standardized way, and the TVs aren't standardized either. The key part is that the HDR metadata should carry info on what the content was mastered on and help the TV work out how to display it within its own capabilities. That doesn't work very well right now.
Dolby Vision doesn't need home TVs to hit that super-high mark, because the Dolby Vision chips use dynamic metadata, and each side carries info on its capabilities, which shapes the HDR curves to the actual display you're watching on. It knows the display's capability and can adjust the roll-off for brightness levels your TV can't handle (while also adjusting scene by scene with the dynamic metadata).
In short: it's not the TV capabilities that are the problem right now so much as the lack of standardization in how the content is mastered and played back. This is specific to HDR10; Dolby Vision has this down and, because of that, is the superior format at the moment, despite not being nearly as widely adopted. HDR10 is getting dynamic metadata, but we still need to see a lot of progress in how content is mastered for it.
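To make that dynamic-metadata idea concrete, here's a minimal Python sketch (my own illustration, not Dolby's actual algorithm; the function name, the knee parameter and the numbers are all made up):

```python
# Minimal sketch of metadata-driven tone mapping (illustrative, not Dolby's
# actual algorithm). Assumes per-scene max luminance arrives as dynamic
# metadata and the display reports its own peak; highlights above a "knee"
# are rolled off into the display's remaining headroom instead of clipping.

def tone_map_nits(scene_nits, scene_max_nits, display_peak_nits, knee=0.75):
    """Map a scene-referred luminance (nits) to what this display can show."""
    knee_nits = knee * display_peak_nits            # below this, pass through 1:1
    if scene_nits <= knee_nits:
        return scene_nits
    # Compress everything between the knee and the scene's mastered maximum
    # (from the metadata) into this particular display's remaining headroom.
    headroom = display_peak_nits - knee_nits
    overshoot = (scene_nits - knee_nits) / max(scene_max_nits - knee_nits, 1e-6)
    return knee_nits + headroom * min(overshoot, 1.0)

# A 2,000-nit highlight in a scene mastered to 4,000 nits, shown on a 600-nit
# set: it lands around 515 nits instead of flattening into clipped white.
print(tone_map_nits(2000, scene_max_nits=4000, display_peak_nits=600))
```

Because the scene maximum changes per scene (that's the dynamic part) the curve changes with it, and because the display's peak is known, a 600-nit set and a 1,500-nit set get different roll-offs from the same content.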
Every display device I get I turn the brightness to 0.
Even that's usually too bright *shrug*
My problem with all of this HDR talk is that it flies in the face of a lot of what we've been told about current display calibration. For example, 120 cd/m² or thereabouts is the standard for luminance and is recommended to avoid eye fatigue. How do all these crazy nit levels square with that? Torch mode (Dynamic, etc.) is the scourge of the earth for how unnecessarily bright it is on modern TVs, but now we're trying to push luminance to insane levels?
I'm more interested in its ability to improve shadow and dark detail. Previous tech struggles with that.
I'll be curious to see how calibration standards change with the new formats. Until we have concrete calibration targets and this HDR / wide color scene is well defined, I'm holding off. Prices are also too high right now for me to adopt.
I have a 4K Vizio, a 2015 M series, that does not support HDR. Oh well. But the 4K is still a big improvement.
Still, I'd be using the new consoles on my 1080p Panny plasma. Nothing touches a quality plasma in terms of PQ from what I've seen, except the OLED TVs I've seen, which certainly can and do, but they are way too expensive right now.
Question, and I'm ignoring the OP, sorry.
But I'm looking to get a TV and can't decide between the Vizio P series, the Samsung KS8000, or this other HDR TV that's $1,200 for a 55-inch at Best Buy.
Help!!
That's because display technology was limited in color. The moment you pushed above 100-120 cd/m², whites would clip, and a blue sky would start to go white and lose its color saturation. With a wide color gamut (DCI-P3 / Rec. 2020), 10-bit panels, a new EOTF and higher peak nits, displays can produce colors they couldn't before: lightning strikes can be bright without washing out everything else in the scene, and the display maintains a high contrast ratio. This isn't you cranking up the contrast setting; it's the display increasing brightness with its LEDs based on the metadata.
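To put toy numbers on that clipping point, here's a tiny Python snippet (made-up values, and a plain clip rather than any standard's actual transfer function or a real set's tone mapping):

```python
# Toy illustration: why a bright gradient washes out on an SDR display but
# survives on an HDR one. Scene luminances are in nits; each display here
# simply clips whatever exceeds its peak (real sets roll off more gracefully).

def display_output(scene_nits, display_peak_nits):
    return [min(v, display_peak_nits) for v in scene_nits]

sky_gradient = [300, 600, 900, 1200]         # a bright sky ramping up in luminance
print(display_output(sky_gradient, 100))     # ~100-nit SDR: [100, 100, 100, 100], detail gone
print(display_output(sky_gradient, 1000))    # 1000-nit HDR: [300, 600, 900, 1000], gradient kept
```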
What is the question? What do you plan on using the TV for?
Mostly gaming, but I also watch a lot of movies on it. And the question is which one should I get.
Is there any info on when dynamic metadata is hitting for HDR10 and what companies will add it in firmware updates for currently released displays?
I've got the Sony X8509C and according to that chart it's at 282. I have no idea what that means. All I know is the picture quality is second to none. 4K Netflix is glorious.
This is where I'm at. I've been playing on my 4K TV from last year in Standard mode up until a month or two ago and never noticed the input lag. Then again, I don't play FPSes competitively.
But even at 50ms, those people did not detect the lag, and they thought their TV was great...
Maybe it is one of those things you have to see first-hand, but I question the benefit of reaching 4,000 nits. I would think at that brightness you are going to cause the iris to contract, effectively clipping off the bottom range of your content (loss of detail in dark areas). Maybe it is good for simulating sunsets or blinding effects, but can't the same effect be achieved just by grading the content? Also, going below 0.05 nits has rapidly diminishing returns, since light reflections from your environment and your skin/clothing will push the floor close to 0.05 nits anyway, so I would not worry about lower black levels in future standards. That leads me to say the primary benefit of HDR occurs in the 100-1,000 nit range, plus the expanded colour gamut and 12-bit colour, which will effectively eliminate banding. I would pay more attention to bit depth, and perhaps to the inclusion of dynamic metadata (although I don't get the necessity of that at 12-bit depth; at 10 bits it makes sense), than to the ability of a set to hit over 1,000 nits.
Is this 100% accurate? So if you only have an edge-lit LED and not full array, does this still function properly?
Look up the EOTF (PQ), a.k.a. SMPTE ST 2084. Even at 10-bit you can see some banding; looking at the Barten ramp, you can see 12-bit is perfect up to 10,000 nits.
Most of the luminance values are carried in PQ / ST 2084, with tone mapping on top.
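For anyone curious, the ST 2084 curve itself is short enough to write out. Here's a small Python sketch of the PQ EOTF using its published constants, plus a rough adjacent-code-value comparison at 10-bit versus 12-bit to illustrate the banding point above (the step comparison is a back-of-the-envelope illustration, not a proper Barten-ramp analysis):

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized code value (0..1) to absolute
# luminance in nits, up to 10,000. Constants are the published ST 2084 values.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """ST 2084 EOTF: normalized signal (0..1) -> luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def step_ratio(code, bits):
    """Relative luminance jump between two adjacent code values at a bit depth."""
    levels = 2 ** bits - 1
    lo, hi = pq_eotf(code / levels), pq_eotf((code + 1) / levels)
    return (hi - lo) / lo

# At the same point on the curve, 12-bit steps are about 4x finer than 10-bit
# ones, which is the gist of why 12-bit avoids banding that 10-bit can show.
print(f"10-bit step near mid-signal: {step_ratio(512, 10):.3%}")
print(f"12-bit step near mid-signal: {step_ratio(2048, 12):.3%}")
```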
Without going into tooooo much detail, I say go for the Samsung KS8000. That's the one I'm getting and here's why:
~20ms latency in HDR and Game mode (compared to ~60 with the Vizio)
4:4:4 Chroma in 4K
The things the Vizio does better: a better low-light picture, Dolby Vision support in addition to HDR10, and 120Hz at 1080p.
Well, I got tinfoil in the windows of my basement, so it's always dark as hell. Should that persuade me toward the Vizio? I am leaning toward the Samsung, though.
I strongly disagree with the OP. Now is a fine time to buy an HDR TV, if you can afford the good ones. If you're looking at budget options, then don't bother. Do your research, go to dedicated sites about TVs like avsforum, and be prepared to spend $1500+. Get the biggest TV you can afford and move that couch as close to the TV as possible. If you're not willing to do that, then wait it out.
I'll just wait until OTA broadcasts are at 4K before buying a 4K TV
Man, the EF9500 is a beast for not being designed for HDR out of the box.
And AstroLad is right. It's never a good time to buy anything because something better is always around the corner. Just buy what you want and be happy.
I've fired up my LG EF9500 and HDR on Netflix looks god-tier on it, and it's nowhere near the 1000-nit standard. I'll probably upgrade to UHD for some movies I love once the Scorpio is out (my Marantz receiver lacks HDMI 2.0a, so it's a $1,400 bottleneck that forces a choice between keeping my receiver's sound quality and getting HDMI 2.0a... all so I could save $400 a year ago. Dumb long-term move :/)
Didn't the other thread say that when the HDR signal kicks in, it's the base 20ms + another 30ms, so 50+ms?
That is not correct, at least according to the testing of Rtings.
The Vizios are fantastic too, and better in low light. Maybe that's the way to go for you!
Also, thanks for all the help man!
Didn't the other thread say that when the HDR signal kicks in, it's the base 20ms + another 30ms, so 50+ms?
It's 1080p at 20ms, 1080p HDR at 23ms, and 4K adding 30ms to that.
So over 50ms, and people are claiming low lag on it.
What about the Vizio P then? 4K+HDR+Game mode?
If you want streaming, watch the 4K content from these talented folks instead:
https://www.youtube.com/playlist?list=PLD33E5618740295DF
Pretty phenomenal stuff. It's what originally sold me on 4K.
Can't. My YouTube app won't do 4K and neither will my OG PS4. It's hard to even know if the PS4 Pro is worth it, because I can't find a reliable way to even play 4K. Netflix works, but I don't feel like it was a noticeable jump.
Do you have a source for this? From rtings.com, I was just reading that the Samsung KS8000 with a 4K source has 20.9ms of input lag (in game mode). Where is this 50ms coming from?
http://www.rtings.com/tv/tests/inputs/input-lag
From the review of the tv when you click the question mark next to input lag:
"What it is: Lowest input lag possible on TV with a 1080p @ 60Hz input."
The table you linked was just pointing out that it is a 4K TV, not that they measured the lag at 4K. The tool they use to measure lag only works at 1080p.
Okay, thanks. But where does it say that going to 4K resolution adds 30ms on top of the ~20ms at 1080p?