
Gamers, now is not a good time to buy a cheap "HDR" TV

My problem with all of this HDR talk is that it flies in the face of a lot of what we've been told in terms of current display calibration. For example, 120 cd/m² or thereabouts is the standard luminance target and is recommended to avoid eye fatigue. How do all these crazy nit figures square with that? Torch mode (Dynamic, etc.) is the bane of modern TVs with how unnecessarily bright it is, but now we're trying to push luminance to insane levels?

I'm more interested in its ability to improve shadow and dark detail. Previous tech struggles with that.

I'll be curious to see how calibration standards change with the new formats. Until we have concrete calibration targets and this HDR/wide-color scene is well defined, I'm holding off. Prices are also too high right now for me to adopt.

I have a 4K Vizio, a 2015 M series, that does not support HDR. Oh well. But the 4K is still a big improvement.

Still, I'd be using the new consoles on my 1080p Panny plasma. Nothing touches quality plasmas in terms of PQ from what I've seen, aside from the OLED TVs I've seen, which certainly can and do, but they are way too expensive right now.
 

ghibli99

Member
Most of these sets probably aren't intended to be used as a desktop replacement and the assumption is you will be a distance away from the screen, where the amount of light reaching you is reduced and not so bright.

See the inverse-square law for details (rough sketch below).
I get what you're saying... and I guess at a point earlier in life, I was insanely particular about this stuff. Not sure what happened, LOL. My LG was $500 w/ no tax/shipping, and it came with a free $50 Magic Remote that I sold, so all-in I barely paid more than I would have for a PS4 Pro. My 2005 720p Pioneer plasma cost 10x that (and we still use it).

Without having a high-end HDR set to compare it to side-by-side, my subjective opinion is that it looks really damn good in our office where lighting isn't that controlled. Especially native 4K content, with or without HDR. Happy to be enjoying it now, and I'm OK with replacing it if some other standard(s) somehow make it unusable.
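To put rough numbers on the quoted inverse-square point, here's a minimal sketch. It treats the screen as a point source, which is a simplification (a real TV is an extended source), so the figures are only illustrative:

```python
# Rough illustration of the quoted inverse-square point.
# Assumption: the screen is treated as a point source, which is only a
# crude approximation for a large, close TV; numbers are illustrative.

def relative_intensity(distance_m: float, reference_m: float = 1.0) -> float:
    """Light reaching the viewer relative to the reference distance."""
    return (reference_m / distance_m) ** 2

for d in (1.0, 2.0, 3.0):  # roughly desktop-monitor vs. couch distances
    print(f"{d} m: {relative_intensity(d):.0%} of the light at 1 m")
# 1 m: 100%, 2 m: 25%, 3 m: ~11% -> a bright highlight feels far less
# intense from the couch than the same panel used as a desktop monitor.
```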
 

Syrus

Banned
The KS8000 can do HDR and Game Mode with 22 ms input lag and is UHD Premium certified at 1,400 nits. Nuff said.

Side note: Game Mode doesn't automatically switch to HDR settings. Simply raise backlight and contrast to max and boom, you're good. Kind of a pain in the butt, but it works at least.
 

MazeHaze

Banned
I bought a KS8000 before this recent HDR craze on NeoGAF that's been brought on by that PlayStation conference. No regrets, it's great. Best TV I've ever owned, and there aren't even HDR games yet. Even if TVs come out next year with "better" HDR, this set is still fucking leaps and bounds better than anything else I've ever owned. I love it!

And for everyone scared off by the price, as long as your credit isn't shit, there are plenty of stores that offer two-year financing with no interest. So you can own a killer TV for around 75 bucks a month. That's what I've been doing; I'll pay the rest off with my tax return and everything is great.
 

Trojan

Member
I'm really glad I did some research before I bought the Samsung KS8000 about a month ago. I didn't even know about the competing HDR standards but luckily I got a TV that sets me up pretty well for the coming 4K train.

Seriously though, parsing through HDR/4K features is not easy. There's been so much confusion since the PS4 Pro announcement.
 

Akoi

Member
HDR to me is like a buzzword on TVs. I personally see no reason to buy one. I always turn off all post-processing effects on my TV anyways.

They don't even advertise the HDR tech for monitors AFAIK.
 

CamHostage

Member
Another note as a gamer: because HDR10 requires the TV to analyse the image and control brightness accordingly, TVs are having to do image processing in order to actually display HDR content correctly, resulting in input lag.
Activating GAME MODE doesn't always help, as on some models this actually disables HDR processing altogether. As it stands, many review sites are not clear about their testing of input lag with HDR for gaming.

Question: I was under the impression that the whole idea behind HDR "support" was that it was a signal instruction set for dynamic range and brightness application? Directors master their films in HDR in the lab, game designers structure and supervise their game engine to properly display lighting values with HDR scope in mind, and then the box sends those instructions to the television set for its display.

Is that not what's happening? Because what you're describing feels more like HDR interpretation than HDR instructions. I'm confused why the television would need to do image assessment and manipulation with HDR in effect. (Or am I misunderstanding the scope of the processing we're talking about?) I do understand that TVs on the market with HDR do have lag in modes outside of GAME mode, but I thought that was still signal manipulation the television needed to do to compensate for its picture display deficiencies? Of course, all the image signal data still needs to be put together into a TV picture at some point along the chain, but is most HDR lag happening because of generally applying the HDR signal, or from the TV processing it for that TV's capabilities? Can we generalize how much lag HDR adds to the process (and will that come down to a minimum expectation level as processing gets cheaper and faster and/or new TV tech comes to market?), or will it always wildly depend on the signal processor of the specific television?
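For what it's worth, one plausible answer to the question above: HDR10's static metadata describes the mastering display, not yours, so a set whose panel can't reach the mastered peak has to tone-map each frame down to what it can actually show, and that per-pixel work is one place lag can creep in. A minimal sketch of the idea follows; the 75% knee and the linear compression are made up for illustration and are not any manufacturer's actual algorithm:

```python
# Illustrative only: one simple way a display might tone-map HDR10 content
# mastered for a brighter monitor down to its own peak brightness.
# Real TVs use proprietary curves; this just shows why per-pixel
# processing (and therefore some lag) can be involved.

def tone_map(scene_nits: float, mastering_peak: float = 1000.0,
             panel_peak: float = 500.0) -> float:
    """Map a mastered luminance value onto a dimmer panel.

    Values up to ~75% of the panel's peak pass through unchanged;
    everything above is compressed toward the peak instead of clipped,
    so highlight detail survives.
    """
    knee = panel_peak * 0.75
    if scene_nits <= knee:
        return scene_nits
    excess = scene_nits - knee
    headroom = mastering_peak - knee
    return knee + (panel_peak - knee) * (excess / headroom)

for nits in (100, 400, 800, 1000):
    print(nits, "->", round(tone_map(nits), 1))
```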
 

RoboPlato

I'd be in the dick
HDR to me is like a buzzword on TVs. I personally see no reason to buy one. I always turn off all post-processing effects on my TV anyways.

They don't even advertise the HDR tech for monitors AFAIK.
HDR isn't a post-processing effect. It's the display natively supporting and showing a wider range of brightness and colors.
 

Syrus

Banned
HDR to me is like a buzzword on TVs. I personally see no reason to buy one. I always turn off all post-processing effects on my TV anyways.

They don't even advertise the HDR tech for monitors AFAIK.


Lol, it's not a buzzword. It's actually a significant difference.
 

MazeHaze

Banned
HDR to me is like a buzzword on TVs. I personally see no reason to buy one. I always turn off all post-processing effects on my TV anyways.

They don't even advertise the HDR tech for monitors AFAIK.

Because HDR monitors don't exist yet lol.
 

Akoi

Member
Because HDR monitors don't exist yet lol.

Doesn't that say something?


The fact that the standard is not on other displays and just on TVs commercially tells me it's something that's not quite ready yet.

Edit: reading more and more into this I feel like this is something that still needs a few years.

Also, call me blind or crazy, but 4K is something I don't see myself upgrading to for years, because I personally don't see the use for it on a TV unless I had a way bigger TV (I own a 55"), and I think it's crazy when people buy 4K laptops. (I had the option on my 15.6" laptop.)
 

ZOONAMI

Junior Member
My Panasonic AX800U 4K without HDR looks as good or better than the HDR demos at Best Buy. I'm not really worried about it. There will always be a new picture feature from the TV companies every 1-3 years. They want you to upgrade. 720p > 1080p > 3D > 4K > HDR > better HDR > 8K > it continues forever. Honestly, jumping from a 1080p plasma straight to an 8K set is probably the only thing that makes sense. Not that 4K with HDR isn't worth upgrading to, it's just that a Panasonic 1080p plasma still looks damn good, not much worse than a 4K HDR set.

However, I really don't think 10-bit color support should be tied to HDR, which seems to be happening outside of the PC realm. HDR seems like a meaningless standard that adds other requirements to a potentially much simpler 10-bit color standard. My AX800U has a 10-bit panel but is locked out of showing all those colors in most forms of media because of other requirements. Blu-rays and other media should offer 0-255, 4:4:4, and 10-bit color options that aren't tied to HDR.
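For what it's worth, the 10-bit point is mostly just arithmetic: 10 bits per channel gives four times as many shades as 8 bits, which is why gradients band less. A rough sketch, assuming linear step spacing for simplicity (real gamma/PQ curves space the steps nonlinearly):

```python
# Rough arithmetic on 8-bit vs 10-bit per-channel precision.
steps_8bit = 2 ** 8    # 256 shades per channel
steps_10bit = 2 ** 10  # 1024 shades per channel

# Spread a 0-100 nit ramp across the available steps (linear spacing is a
# simplification; transfer functions actually space steps nonlinearly).
print(f"8-bit:  {100 / steps_8bit:.2f} nits per step")   # ~0.39
print(f"10-bit: {100 / steps_10bit:.2f} nits per step")  # ~0.10
```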
 
4K consoles at this time are just stupid. Consoles are meant to be easy, affordable ways into gaming. Hardly anyone has a 4K TV, and the ones that do probably don't have HDR, or have terrible input lag. But please buy our 400 dollar new console, and you'll need a 1,000 dollar TV too.

I doubt the market for those TVs will pick up by the time Scorpio launches.
 

Theonik

Member
Doesn't that say something?


The fact that the standard is not on other displays and just on TVs commercially tells me it's something that's not quite ready yet.

Edit: reading more and more into this I feel like this is something that still needs a few years.
Monitors exist but only for mastering video.
The reason is you don't need your spreadsheets at 4000 nits. Most monitors are computing-first, with media as an afterthought.
 
I'm sorry but these snobby posts are what turns me off about NeoGAF.

I have a 4K tv. It does HDR but not HDR 1000. Compared to the 1080p tv I had it's night and day. 4K from Netflix is mind blowing. 3D looks like a digital window.

I saw HDR 1000 over the weekend at Best Buy. It was nice, but honestly I had a hard time distinguishing the image from my current set.

On the other hand, I was at my friend's house who has my old 1080p set. The image was so dark I could hardly stand it. There was no shadow detail and so much detail overall was lost.

Is there always a better piece of tech 6 months away? Of course. But if you are ready for 4K and excited for PS4 Pro, you will not be disappointed by the current sets.

How big is your tv?

I've been pretty underwhelmed by Netflix 4K.
 
The whole point of this thread was to put in perspective just how un-HDR a $700 TV is right now.
It's not to suggest you shouldn't buy one at all; there are some amazing ones at $2k which tick all the boxes.
If you want the HDR experience and plan on having the same TV for a couple of years, now is a bad time to buy a cheap 4K TV, because you are only getting a fraction of what your new console will be able to deliver.

Dolby Cinema is Dolby Vision + Dolby Atmos. Dolby Vision in theaters is 100 nits. HDR isn't limited by nits: having 1,024 shades of gray means producing more color and better shadow detail without sacrificing deep blacks. The PQ EOTF (Electro-Optical Transfer Function) determines how fast the display comes out of black and assigns a set level of luminance to each step of gray.
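For anyone curious what "a set level of luminance for each step of gray" means in practice, here's a minimal sketch of the SMPTE ST 2084 (PQ) EOTF, which maps each code value to an absolute luminance:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized signal value (0.0-1.0,
# i.e. a 10-bit code divided by 1023) to absolute luminance in nits.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Luminance in nits for a normalized PQ signal value."""
    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

for code in (0, 256, 512, 768, 1023):          # sample 10-bit code values
    print(code, round(pq_eotf(code / 1023), 2))
# Note the curve comes out of black slowly: roughly half the code values
# sit below ~100 nits, which is where the extra shadow detail lives.
```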
 

Akoi

Member
Monitors exist but only for mastering video.
The reason is you don't need your spreadsheets at 4000 nits. Most monitors are computing-first, with media as an afterthought.

Most gaming monitors these days are all about G-Sync or FreeSync and having a high refresh rate and a nice IPS display.
 

mario_O

Member
Man, fuck hardware. Bought a new 4K AV receiver last year and apparently it doesn't support HDR (HDMI 2.0a) nor DTS:X/Atmos. Now I need a new receiver and more speakers to hang on my ceiling. And now I also need 5,000 nits? :/
 

Alexious

Member
Doesn't that say something?


The fact that the standard is not on other displays and just on TVs commercially tells me it's something that's not quite ready yet.

Edit: reading more and more into this I feel like this is something that still needs a few years.

Also, call me blind or crazy, but 4K is something I don't see myself upgrading to for years, because I personally don't see the use for it on a TV unless I had a way bigger TV (I own a 55"), and I think it's crazy when people buy 4K laptops. (I had the option on my 15.6" laptop.)

It really doesn't. It's extremely easy for developers to implement, it allows them to truly match their artistic vision and it comes at no performance cost.

The only thing that's needed is hardware becoming cheap enough to hit mainstream, which will happen in a year or so.
 

thisisamul

Neo Member
How big is your tv?

I've been pretty underwhelmed by Netflix 4K.

Netflix 4K can be nice - but there is definitely a huge variance in quality among the 20 or so 4K titles.

That said, I did download some reference 4K HDR footage onto a USB drive and hot damn, it looked INCREDIBLE.
 

Metfanant

Member
Actually, isn't it a TERRIBLE time to be buying an HDR TV? lol...

Input lag with HDR enabled is pretty terrible on most sets... LCD manufacturers are still trying to push thin, edge-lit displays on us... and OLED is still a little too much in its infancy to get costs down.

Gotta agree... it's really not the time to be jumping in... I'm holding off for a few more years.
 
Well, based on that chart, choosing the X850C over the KS8000 was a bad decision on my part. It was a tough choice since the prices were identical, but pretty much every review I read gave the Sony the edge for black uniformity.
 

Akoi

Member
It really doesn't. It's extremely easy for developers to implement, it allows them to truly match their artistic vision and it comes at no performance cost.

The only thing that's needed is hardware becoming cheap enough to hit mainstream, which will happen in a year or so.

Like I said, I think it's bizarre that gaming monitors for PCs are not pushing this at all.

I own a nice 1080P 55" Samsung TV from 2014, I might buy into this tech in a few years. I honestly have been looking for a reason to upgrade my monitor (it's a 1080P display from 2010) and I want to be blown away from something and not just gimmicks. If HDR takes off I just might bite and admit I'm wrong about it.
 

julrik

Member
Input lag is the limiting factor for me at the moment. I'll be holding off until the OLEDs can give me a great HDR screen with acceptable input lag for shooters and fighting games.
Me too. I've been so tempted to buy the C6 since I have the funds, but I'm afraid the input lag in 4K + HDR will be too high.

What are the chances that the 2017 OLEDs will have around 30 ms input lag in 4K + HDR? Pipe dream?
 
The KS8000 is a good all-rounder. Even if something better is coming, it's still a great TV, and you can maybe upgrade in 3-5 years' time for something truly mega, or even enjoy it now and sell it in 1-2 years.

I'll be checking them out in 2017; going to need an input for my ColecoVision.

Some downsides of the KS8000 are that the nits drop down to 500 after a few seconds and the local dimming is pretty bad.

I might end up with an LCD but would like an OLED.
 
Actually, isn't it a TERRIBLE time to be buying an HDR TV? lol...

Input lag with HDR enabled is pretty terrible on most sets... LCD manufacturers are still trying to push thin, edge-lit displays on us... and OLED is still a little too much in its infancy to get costs down.

Gotta agree... it's really not the time to be jumping in... I'm holding off for a few more years.

Yeah. Input lag kills it for me.
 

BroBot

Member
[Simpsons gif]

There truly is a Simpsons gif for everything....
 
My Panasonic AX800U 4K without HDR looks as good or better than the HDR demos at Best Buy. I'm not really worried about it. There will always be a new picture feature from the TV companies every 1-3 years. They want you to upgrade. 720p > 1080p > 3D > 4K > HDR > better HDR > 8K > it continues forever. Honestly, jumping from a 1080p plasma straight to an 8K set is probably the only thing that makes sense. Not that 4K with HDR isn't worth upgrading to, it's just that a Panasonic 1080p plasma still looks damn good, not much worse than a 4K HDR set.

However, I really don't think 10-bit color support should be tied to HDR, which seems to be happening outside of the PC realm. HDR seems like a meaningless standard that adds other requirements to a potentially much simpler 10-bit color standard. My AX800U has a 10-bit panel but is locked out of showing all those colors in most forms of media because of other requirements. Blu-rays and other media should offer 0-255, 4:4:4, and 10-bit color options that aren't tied to HDR.

There was no 10-bit content until now, and most content was and is mastered in 4:2:2 or 4:2:0. Those who own UHD Blu-ray content still get 10-bit if their display is only 4K 10-bit; they may just not get the wide color gamut or brighter highlights.
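To put the 4:2:2/4:2:0 figures in perspective, the numbers describe how much chroma resolution is kept relative to luma; a quick back-of-the-envelope sketch for a 3840x2160 frame:

```python
# Chroma samples kept per frame under common subsampling schemes,
# relative to full 4:4:4 (illustrative arithmetic for a 3840x2160 frame).
width, height = 3840, 2160
luma_samples = width * height

schemes = {
    "4:4:4": 1.0,    # full chroma resolution
    "4:2:2": 0.5,    # chroma halved horizontally
    "4:2:0": 0.25,   # chroma halved horizontally and vertically
}
for name, fraction in schemes.items():
    print(name, int(luma_samples * fraction), "chroma samples per plane")
```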
 
I was thinking of getting a new TV this holiday, a 4K one, and putting my current Samsung 1080p HD one in my other room...should I wait then? Seems like there's a lot of conflicting viewpoints.
 

MazeHaze

Banned
Doesn't that say something?


The fact that the standard is not on other displays and just on TVs commercially tells me it's something that's not quite ready yet.

Edit: reading more and more into this I feel like this is something that still needs a few years.

Also, call me blind or crazy, but 4K is something I don't see myself upgrading to for years, because I personally don't see the use for it on a TV unless I had a way bigger TV (I own a 55"), and I think it's crazy when people buy 4K laptops. (I had the option on my 15.6" laptop.)
Well, there isn't much reason for HDR monitors right now considering there aren't any HDR games. HDR TVs make sense this year because HDR video has started to take off. HDR games will finally start to come out this fall, and HDR monitors will follow shortly after.
 

Hazanko

Banned
Actually, isn't it a TERRIBLE time to be buying an HDR TV? lol...

Input lag with HDR enabled is pretty terrible on most sets... LCD manufacturers are still trying to push thin, edge-lit displays on us... and OLED is still a little too much in its infancy to get costs down.

Gotta agree... it's really not the time to be jumping in... I'm holding off for a few more years.

Yeah, that's why I was for 1080p 60fps for the Pro. I knew 4K would be nice, but it just seems too soon to be pushing it. I'll get a 4K HDR set when I think the price and specs are decent.
 
You're incorrectly lumping OLED TVs with LCD ones. OLEDs have a different UHD Premium requirement due to their capability of showing perfect black and lower peak brightness.

On top of this, 1,000+ nits requires specific lighting, so as not to melt your eyeballs.
 

Darklor01

Might need to stop sniffing glue
How big is your tv?

I've been pretty underwhelmed by Netflix 4K.

It's been stated on a few sites that Netflix and other 4K streaming is compressed to the point where it is basically the quality of Blu-ray. Download a demo to a thumb drive and try it to see what you think.

I'd also mention that the sets in most stores aren't calibrated. They are set to whatever the factory defaults are. Factory defaults are often way off from proper settings and aimed at displaying the brightest, most oversaturated colors without necessarily being maxed out. They also usually have motion enhancement turned on, causing the Soap Opera Effect. This is no way to judge a set properly, and it will also need to be calibrated in your home, to the room it is in, to get proper viewing. Additionally, store demos have been running non-stop for long stretches, sometimes causing burn-in on some sets.
 

Bubba77

Member
I love my KS8000. Beats the shit out of my old TV. Why the fuck do I care if something better comes out next year? Guess what? Something better will be out the year after that too. And the year after that? Something better.

It's like computer hardware. The right time to buy is when you want and can afford.

I'm with you, and I love my KS8000. It's a massive upgrade over my old Sony 1080p set. Can't wait to test HDR on Forza Horizon 3.
 

spwolf

Member
I'm sorry but these snobby posts are what turns me off about NeoGAF.

I have a 4K tv. It does HDR but not HDR 1000. Compared to the 1080p tv I had it's night and day. 4K from Netflix is mind blowing. 3D looks like a digital window.

I saw HDR 1000 over the weekend at Best Buy. It was nice, but honestly I had a hard time distinguishing the image from my current set.

On the other hand, I was at my friend's house who has my old 1080p set. The image was so dark I could hardly stand it. There was no shadow detail and so much detail overall was lost.

Is there always a better piece of tech 6 months away? Of course. But if you are ready for 4K and excited for PS4 Pro, you will not be disappointed by the current sets.

A lot of people are all about specs. In that 4K thread, people were mentioning how they are gaming awesomely on the Samsung 4K HDR set at 20 ms, because those are the specs... except those are for 1080p mode, which the Leo Bodnar tester can test; for 4K gaming it is 50 ms :).

But even at 50 ms, those people did not detect the lag and thought their TV was great... on the other hand, they would read a review of some other TV, see 35 or 38 ms, and claim it was likely unplayable :).

In the TV world, you can always wait for something... current entry-level 10-bit HDR sets are able to do 2x the brightness of last year's 1080p sets. There is always going to be something better next year, but that means you missed a year of enjoying new tech.
 

Ploid 3.0

Member
I half-read the title as "Now is a good time to buy" and clicked in to disagree, hah. It's always best to wait and see how things turn out.
 

J-Rzez

Member
Man,

The EF9500 is a beast for not being designed for HDR out of the box.

And AstroLad is right. It's never a good time to buy anything because something better is always around the corner. Just buy what you want and be happy.

I've fired up my LG EF9500 and HDR on Netflix looks god-tier on it, and it's nowhere near the 1,000-nit standard. I'll probably upgrade to UHD for some movies I love once the Scorpio is out (my Marantz receiver lacks HDMI 2.0a, and is a 1,400 dollar bottleneck between not losing sound quality on my receiver and getting HDMI 2.0a... all so I could save 400 dollars a year ago. Dumb long-term move :/)

There's a reason I recommended the EF9500 to people. That TV was, and still is, amazing. I bought it soon after seeing it. Being a plasma guy, I was blown away and retired my old Panny plasma to the bedroom. Its only shortcoming is its input lag.

Sony and Samsung can battle it out over nits, but they're just wringing what little blood is left from LCD. Samsung's hit around 1,300-1,500 nits, and that Sony Z9 I'm sure hits 2,000 or close to it; the thing is ridiculously bright.

The reason people and Dolby focus on the nits number is a mix of the lowest common denominator from LCDs and projectors, whose inferior contrast they have to blow up brightness to overcome. Best part is, when they get that bright they start blooming like crazy, and that includes the Z9. Most people will be happy with their KS8000 though, and shouldn't get too crazy over numbers.

But yes, OLED has different guidelines for HDR; they don't need to hit that high, with 400-540 nits as their line. Those nits hitting 2,000+ is ridiculous. Like I said, that Z9 gets disgustingly bright; if I had that set, its brightness would be on 1 of 100.
 

Faustek

Member
As it stands, many review sites are not clear about their testing of input lag with HDR for gaming.


People should stop using DisplayLag as a holy grail for buying a 4K monitor for 4K gaming. They are pretty clear that they only grade at 1080p output and with all post-processing shut off. Meaning it isn't worth shit in this scenario.

And my personal take: Samsung sucks for gaming if you want all the effects. No better test than me standing in a store and trying them out myself.
 

No_Style

Member
Early HDR reminds me a lot of early 802.11n (aka Draft N) wireless routers. People who hop on early get a superior experience compared to what's otherwise available now, but they will quickly be eclipsed when the standard and hardware get their act together. Draft N products still work to this very day, but you're not receiving the full benefit of the standard.

If you're in no rush, it's best to wait for the next generation of display sets to hit the market around CES time. And no, this isn't just the usual cadence of "something better" always being around the corner. These aren't just incremental improvements.

I'm personally waiting for an 55" 4K HDR OLED that's gaming friendly and won't cost more than $3000 CAD
 