
Gamers, now is not a good time to buy a cheap "HDR" TV

EvB

Member
If you're watching a dark movie and then suddenly a 5,000nit image fills the display, your eyes are going to burst.

Yeah, fortunately we have pupils to deal with some of that. In the video-game world they will perform artificial pupil adaptation to assist with this.
It's a big subject; if you go and look at the link in the OP, you can see more information about how the current TVs perform.

So whilst they can reach 1,500 nits in a small area on the screen, for prolonged periods or in larger areas on the screen (or full screen) the nits are almost a third of that level of brightness, presumably because the LEDs would melt or explode.
I would presume that the same is the case for the bigger standards, which are all to do with peak brightness.
It's not about being able to simply illuminate the screen like some awful super tanning machine in your home, but about having a TV capable of producing that peak brightness in places in your image.

If I take a photography example showing blown highlights (I know photography examples are somewhat contentious), here is a scene which you would have no problem looking at in real life, but, as highlighted in red, these are areas that fall outside the upper end of the camera/display's range.
[Image: http://www.lightstalking.com/wp-content/uploads/2015/11/2015-11-19_10-41-15.png]


Being able to illuminate these areas to better represent how they look in real life makes the image more real. And rather than controlling what the user sees to compensate for this by making areas of the image darker, the user's eyes can adjust in a way that is natural.
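To give a feel for what that artificial pupil adaptation looks like in practice, here's a minimal sketch in Python (the constants and names are just illustrative, not any particular engine's implementation): an exposure value chases the scene's average brightness over time, the way your pupil catches up after a cut from a dark scene to a bright one.

Code:
import math

# Illustrative auto-exposure ("eye adaptation") loop of the kind HDR game
# renderers use. The luminance values below are made up for the example.

ADAPT_SPEED = 1.5    # how quickly the "pupil" adjusts, per second (illustrative)
KEY_VALUE = 0.18     # mid-grey target the exposure aims for

def update_exposure(current_exposure, avg_scene_luminance, dt):
    """Move exposure toward the value that maps the scene's average
    luminance to mid-grey, with exponential smoothing over time."""
    target_exposure = KEY_VALUE / max(avg_scene_luminance, 1e-4)
    blend = 1.0 - math.exp(-ADAPT_SPEED * dt)   # exponential adaptation
    return current_exposure + (target_exposure - current_exposure) * blend

def tone_map(luminance, exposure):
    """Simple Reinhard-style curve: exposed value compressed into 0..1."""
    exposed = luminance * exposure
    return exposed / (1.0 + exposed)

# Example: cut from a dark scene (average luminance 0.05) to a bright one (5.0)
exposure = KEY_VALUE / 0.05
for frame in range(5):
    exposure = update_exposure(exposure, 5.0, dt=1.0 / 30.0)
    print(f"frame {frame}: exposure = {exposure:.3f}, "
          f"highlight (lum 50) maps to {tone_map(50.0, exposure):.2f}")

On an HDR display the renderer can keep more of that top end instead of squashing everything into the 0..1 range, which is the point about the highlights above.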
 

Toki767

Member
I disagree with this. I think there are clear lines when common standards are set and become the norm for some time. Those are the times to jump in when the dust has settled on them. If the standards keep changing, then there's never really a standard. Something as simple as HDMI w/ HDCP on a 1080p set went a long way. I think when it comes to these new sets, simply getting something with HDMI 2.0 w/ HDCP 2.2, 4K and the right HDR will go a long way too. The thing is, right now we're still on the cusp of these becoming standard.

Yeah. The reason I say to wait until CES 2017 is because the UHD Premium standard only came out this year. I'd want to be sure that nothing actually changed with the standard for TVs in 2017 before committing or at least feeling safe in buying a 2016 TV.
 
Yeah. The reason I say to wait until CES 2017 is because the UHD Premium standard only came out this year. I'd want to be sure that nothing actually changed with the standard for TVs in 2017 before committing or at least feeling safe in buying a 2016 TV.

Yep, I've been telling people at a bare minimum to wait until 2017, which I think will be the first breakout year of having a lot of TVs that are probably good enough to get you by for a while. Buying a TV now, unless your TV is about to die or dead, is jumping in just a tad premature when it really is just around the corner.
 
Question, has anyone used one of the Samsung Evolution kits with the Xbox One S? Do they support all of the 4K requirements needed? Does it add any input lag? Thanks
 

SDMG

Member
The Vizio P series fully supports Dolby Vision as well as HDR10, though nits seem lower on the HDR10 side of things.
 
It really just depends. If you need a new TV, then now is definitely the time to go shopping for a new TV.

If you have a relatively new TV then sure. Watch how things play out.

I will say HDR sets past roughly $2000 USD often feature mind-melting images, and I doubt many would regret having one, provided the other ducks are in a row.
 

EvB

Member
The Vizio P series fully supports Dolby Vision as well as HDR10, though nits seem lower on the HDR10 side of things.

Well that isn't technically true, as it only supports Dolby Vision through internal sources (streaming) and isn't a 12-bit panel.

But again, as demonstrated by the above chart, due to its relatively low nit level it isn't going to fully benefit from 10-bit content, which has 1,024 levels of luminance (0-1023) for each pixel.
The whole subject is verging on mind-boggling, and this is before you even start the discussion on active LED zoning, edge-lit screens and FALD.
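To put rough numbers on that, here's a quick sketch using the SMPTE ST 2084 (PQ) curve that HDR10 content is graded with; it assumes full-range 10-bit code values for simplicity (a real signal uses narrow range):

Code:
# Sketch: how much of a 10-bit PQ (SMPTE ST 2084) signal a panel of a given
# peak brightness can actually reproduce. Full-range code values assumed.

m1 = 2610 / 16384          # PQ constants from ST 2084
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def nits_to_pq(nits):
    """Normalised PQ signal value (0..1) for a given luminance in nits."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for peak in (300, 600, 1000, 4000):
    code = round(nits_to_pq(peak) * 1023)
    print(f"{peak:>5}-nit panel tops out around 10-bit code {code} "
          f"({code / 1023:.0%} of the signal range)")

Everything above that code value gets clipped or rolled off by the set's own tone mapping, which is what "not fully benefiting from 10-bit content" means in practice.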

I'm sorry to those I've somehow offended with this thread; I should have made the title "Now is not a good time to buy a cheap "HDR" TV".
 

jeffram

Member
If it looks good or better than what you've got now, who cares?

I was tired of hearing about 900p Xbox One games, I'm already tired of hearing about fauxK PS4 Pro games, and now this. Things can always be better; that's how technology works. By the time we hit Dolby Vision, Dolby Vision won't matter.
 

eso76

Member
Yeah well, as exciting as HDR sounds, these are the absolute first models to implement it, so it has to be expected this is just the beginning.
 

zoukka

Member
I'm sorry but these snobby posts are what turns me off about NeoGAF.

I have a 4K TV. It does HDR but not HDR 1000. Compared to the 1080p TV I had, it's night and day. 4K from Netflix is mind-blowing. 3D looks like a digital window.

I saw HDR 1000 over the weekend at Best Buy. It was nice, but honestly I had a hard time distinguishing the image from my current set.

On the other hand, I was at my friend's house who has my old 1080p set. The image was so dark I could hardly stand it. There was no shadow detail and so much detail overall was lost.

Is there always a better piece of tech 6 months away? Of course. But if you are ready for 4K and excited for PS4 Pro, you will not be disappointed by the current sets.

What a load of shit.
 

Theonik

Member
Dolby Vision is actually aiming for mastering content at 12,000 nits, so that the format is future-proofed.

I couldn't adjust the scale to that as it made the rest of the data unreadable.
No they are not. 12-bit is the max and you can only do 4096 brightness steps in that.
 
Well that isn't technically true, as it only supports Dolby Vision through internal sources (streaming) and isn't a 12-bit panel.

But again, as demonstrated by the above chart, due to its relatively low nit level it isn't going to fully benefit from 10-bit content, which has 1,024 levels of luminance (0-1023) for each pixel.
The whole subject is verging on mind-boggling, and this is before you even start the discussion on active LED zoning, edge-lit screens and FALD.

I'm sorry to those I've somehow offended with this thread; I should have made the title "Now is not a good time to buy a cheap "HDR" TV".

Full screen is 600 nits; the 2% window is lower by design. Vizio doesn't like blooming.

What is this Sony standard?

No they are not. 12-bit is the max and you can only do 4096 brightness steps in that.

Yeah, don't know where he got that from.
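For what it's worth, the 4096 figure is just 2^12. A rough sketch of why those extra two bits matter, reusing the same PQ curve as above (full-range code values assumed, purely illustrative):

Code:
# Sketch: brightness jump between adjacent PQ code values at 10-bit vs 12-bit.
# 12-bit gives 4096 steps instead of 1024, so each step is a smaller
# luminance jump and banding is less likely to be visible.

m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(e):
    """Luminance in nits for a normalised PQ signal value (0..1)."""
    ep = e ** (1 / m2)
    y = max(ep - c1, 0.0) / (c2 - c3 * ep)
    return 10000.0 * y ** (1 / m1)

for bits in (10, 12):
    steps = 2 ** bits
    # code value closest to 1000 nits, and the jump to the next code up
    code = min(range(steps - 1), key=lambda c: abs(pq_to_nits(c / (steps - 1)) - 1000))
    jump = pq_to_nits((code + 1) / (steps - 1)) - pq_to_nits(code / (steps - 1))
    print(f"{bits}-bit: {steps} steps, ~{jump:.1f}-nit jump between adjacent codes near 1000 nits")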
 
I keep debating whether to save a ton of cash and get the 50" Sony KDLW800c or this year's XBR49x800D. Last year's is only 1080p but is 120Hz and has 3D. This year's has 4K and HDR but no 3D and is only 60Hz. Plus it is only 8-bit HDR. I hate being at the cusp of new changes in TV tech.

I thought people were saying the 800D was 10-bit?
 
So this is what my late grandmother was talking about when I used to ask if I could hook up my SNES to the big screen... nit grow please! We're almost there, grandma!
 

New002

Member
It really just depends. If you need a new TV, then now is definitely the time to go shopping for a new TV.

If you have a relatively new TV then sure. Watch how things play out.

I will say HDR sets past roughly $2000 USD often feature mind-melting images, and I doubt many would regret having one, provided the other ducks are in a row.

My sentiments exactly.
 

Warnen

Don't pass gaas, it is your Destiny!
Wait, you mean to tell me there is always something better around the corner!
 

Instro

Member
I'm inclined to get the KS8000 if I get one anytime soon, but more than likely I will wait another year or so.
 

chadskin

Member
I looked into 4K HDR TVs recently amid all the craze but came away with the same conclusion. The technology isn't there yet at an 'impulse buy' price for me.

TV manufacturers also do a terrible job of properly marketing their TVs, like LG's 6xx range that 'supports HDR' but only has an 8-bit panel, and similar shenanigans. Let's see what next year brings.
 
Right now is not a good time to buy an HDR TV because the sales are shitty. Wait until Nov/Dec (yes, even high-end TVs have major sales; they are never door busters or on the first page of a BF ad) or wait until Feb/March/April (2016 models slow down production to make way for newer models, in which case you may find decent prices on discounted models, or you can just upgrade to a newer model, which surprisingly has a way cheaper MSRP than the discontinued model).

EDIT: I'll also say that November is a huge month for high-end TV sales, normally around Black Friday (though usually not on the day itself). Don't wait for a Black Friday sale for high-end TVs; I've seen some spectacular sales on high-end TVs that disappear during/after Black Friday. Also, stay away from Black Friday-only TVs: if you look up the model number online and can't find it (it's usually a weird variant), steer clear. These TVs are usually cheaper and missing features from better sets; they're not the kind of TV one would buy if they're looking for a high-end set.
 
Guys, don't buy a PS4 because the graphics will only get better. Wait for the PS5!

Four years later.

Guys, don't buy a PS5; the graphics will only get better.

I have a friend who's still rocking a 720p TV because he's paranoid that something better is right around the corner. And keeps waiting.

First off, no fucking shit. But second off, you make money, buy stuff, enjoy stuff; when said stuff gets surpassed, sell it, relocate it, repurpose it, donate it - then buy new, better stuff.

I can understand not buying a new TV every year, but technology is always going to be getting better the longer you wait; jump on it - otherwise it'll just keep passing you by. When the TVs now being discussed finally get cheaper/better, HDR 12 will be coming out. It's a cycle you'll never catch up with.
 
Wait, you mean to tell me there is always something better around the corner!

No, that's not the way one should read this. There are hard lines when specific standards are set and met, and once they're established, that's the time to at least jump in. This isn't about how tech is always getting better; it's about the standards being set and being available. To put it in perspective, imagine buying an HDTV that didn't have HDCP, or even HDMI. HDCP and HDMI didn't evolve in a way where you missed out because they kept getting better; they were the standards that allowed you to hook up everything for a decade, and they still do.
 

EvB

Member
I have a friend who's still rocking a 720p TV because he's paranoid that something better is right around the corner.

First off, no fucking shit. But second off, you make money, buy stuff, enjoy stuff; when said stuff gets surpassed, sell it, relocate it, repurpose it, donate it - then buy new, better stuff.

I can understand not buying a new TV every year, but technology is always going to be getting better the longer you wait; jump on it - otherwise it'll just keep passing you by. When the TVs now being discussed finally get cheaper/better, HDR 12 will be coming out.

The whole point of this thread was to put in perspective just how un-HDR a $700 TV is right now.
It's not to suggest you shouldn't buy one at all; there are some amazing ones at $2k which tick all the boxes.
If you want the HDR experience and plan on having the same TV for a couple of years, now is a bad time to buy a cheap 4K TV, because you are only getting a fraction of what your new console will be able to deliver.
 

EvB

Member
No, that's not the way one should read this. There are hard lines when specific standards are set and met, and once they're established, that's the time to at least jump in. This isn't about how tech is always getting better; it's about the standards being set and being available. To put it in perspective, imagine buying an HDTV that didn't have HDCP, or even HDMI. HDCP and HDMI didn't evolve in a way where you missed out because they kept getting better; they were the standards that allowed you to hook up everything for a decade, and they still do.

.
 
Whatever I upgrade to over my 20-inch Samsungs I got for $180 each will be plenty of an upgrade.

I know something will be better next year as HDR matures, but the prices for some of them now are too good anyway.

Not a big deal. In 5 years I'll get something much better than what I get now as well anyway.
 

DieH@rd

Banned
I really don't care about Dolby Vision. KS8000 is more than awesome enough, and for my needs even X800D from Sony should be a tremendous upgrade over my current display.
 

EvB

Member
It's not the same format though. It's like skipping 4K because 8K will be the bee's knees. Maybe.

And I'm not suggesting skipping anything. What people are doing now is like buying an iPhone 5s the month before the 7 comes out. I'm not saying "why the hell have you just bought an iPhone 6S when the iPhone 12 will be better" which is what some people are interpreting this as.
 

Quasar

Member
Yeah. The reason I say to wait until CES 2017 is because the UHD Premium standard only came out this year. I'd want to be sure that nothing actually changed with the standard for TVs in 2017 before committing or at least feeling safe in buying a 2016 TV.

Certainly I've been waiting to see if more sets that also support Dolby Vision come out.

And now I have another reason to wait... for sets without excessive HDR lag.
 

madjackal

Member
Input lag is the limiting factor for me at the moment. I'll be holding off until the OLEDs can give me a great HDR screen with acceptable input lag for shooters and fighting games.
 

EvB

Member
I mentioned in another thread that I prefer gaming on smaller TV screens. But as I'm reading more and more into 4K and HDR, it seems it's only going to be available on screens 40" and up.

Yeah it's pretty irritating, I want to grab a nice TV with all the new technologies, but I have specific space requirements in which I want it to fit and everything is just a smidge bigger than I need.
 

Theonik

Member
Certainly I've been waiting to see if more sets that also support Dolby Vision come out.

And now I have another reason to wait... for sets without excessive HDR lag.
I wouldn't hold my breath on that. Most makers avoid it because it eliminates their chance to differentiate their sets and requires hardware from Dolby.
 

AstroLad

Hail to the KING baby
I disagree with this. I think there are clear lines when common standards are set and become the norm for some time. Those are the times to jump in when the dust has settled on them. If the standards keep changing, then there's never really a standard. Something as simple as HDMI w/ HDCP on a 1080p set went a long way. I think when it comes to these new sets, simply getting something with HDMI 2.0 w/ HDCP 2.2, 4K and the right HDR will go a long way too. The thing is, right now we're still on the cusp of these becoming standard.



9th Generation Pioneer Kuro was probably the best point in history to buy a TV. Glad I jumped on that.
That's totally fair, and I'm certainly in no position to debate with the experts on the TV or PC side. I just do my research before I buy stuff and kind of lose track until the next time I'm ready to buy. I guess the funny thing is the new stuff comes out, and honestly you will have a lot of people saying don't buy this early-adopter garbage wait until the good stuff comes out after people have more experience with the standard or whatever it is. Maybe I'm biased but this has been my experience (told to wait) like 20/20 times when asking if it was a good time to buy a new TV or PC, and I don't think it's just because I always happen to be asking between cycles or anything.
 

JawzPause

Member
Yeah, fortunately we have pupils to deal with some of that. In the video-game world they will perform artificial pupil adaptation to assist with this.
It's a big subject; if you go and look at the link in the OP, you can see more information about how the current TVs perform.

So whilst they can reach 1,500 nits in a small area on the screen, for prolonged periods or in larger areas on the screen (or full screen) the nits are almost a third of that level of brightness, presumably because the LEDs would melt or explode.
I would presume that the same is the case for the bigger standards, which are all to do with peak brightness.
It's not about being able to simply illuminate the screen like some awful super tanning machine in your home, but about having a TV capable of producing that peak brightness in places in your image.

If I take a photography example showing blown highlights (I know photography examples are somewhat contentious), here is a scene which you would have no problem looking at in real life, but, as highlighted in red, these are areas that fall outside the upper end of the camera/display's range.
[Image: http://www.lightstalking.com/wp-content/uploads/2015/11/2015-11-19_10-41-15.png]

Being able to illuminate these areas to better represent how they look in real life makes the image more real. And rather than controlling what the user sees to compensate for this by making areas of the image darker, the user's eyes can adjust in a way that is natural.
This post actually helped me understand HDR a little better, so thanks.
 

ghibli99

Member
I dunno... I'm enjoying mine quite a bit. 43" desktop replacement. Even with my backlight at like 35-40%, I find it to be almost too bright with the lights out. Deadpool looks amazing (the description that the image looks very similar to what your eyes would actually see is quite accurate), and we'll be getting a number of high-profile HDR games for XB1S over the next month. I'm too old to just be constantly waiting. :)
 

EvB

Member
I dunno... I'm enjoying mine quite a bit. 43" desktop replacement. Even with my backlight at like 35-40%, I find it to be almost too bright with the lights out. Deadpool looks amazing (the description that the image looks very similar to what your eyes would actually see is quite accurate), and we'll be getting a number of high-profile HDR games for XB1S over the next month. I'm too old to just be constantly waiting. :)

Most of these sets probably aren't intended to be used as a desktop replacement, and the assumption is that you will be some distance away from the screen, where the amount of light reaching you is reduced.

See the inverse-square law for details.
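If you want the back-of-the-envelope version, here's a tiny sketch of the inverse-square falloff being referred to; it treats the screen as a small source, which is only a fair approximation once you're a few screen-heights away, and the distances are made up:

Code:
# Rough inverse-square comparison: how much less of the screen's light
# reaches your eye at couch distance vs. desk distance. Treats the screen
# as a point source, which is a simplification; distances are illustrative.

DESK_DISTANCE_M = 0.8     # sitting right in front of it, desktop-replacement style
COUCH_DISTANCE_M = 3.0    # typical living-room viewing distance

relative_light = (DESK_DISTANCE_M / COUCH_DISTANCE_M) ** 2
print(f"At {COUCH_DISTANCE_M} m you catch roughly {relative_light:.0%} "
      f"of the light you would at {DESK_DISTANCE_M} m")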
 