
Gamers, now is not a good time to buy a cheap "HDR" TV

Harp

Member
I would agree that the longer you wait, the better off you are. But not because of max brightness. In a year you will be able to get a larger and higher quality TV at a lower price. I just bought the Sony 850c this summer and it's a great TV, and it gets HDR via a firmware update. I also own the Samsung UHD player, and when watching The Revenant the picture is amazing, not because of how bright it is but because of how good the contrast is.

HDR is not only about super bright; it is a combination of bright and dark.

Truth of the matter is, unless you want to run out and buy 30 dollar versions of movies purely for HDR, HDR is not really a reason you should or shouldn't buy a new TV.
 

x3sphere

Member
I picked up one of the new OLEDs this year and it looks absolutely stunning with the HDR content I've seen on Netflix. Can't wait to try it out with games. I don't think now is a bad time to jump in other than price.

I am sure next year's sets will get brighter and they'll lower the input lag more, but none of that would really make me want to wait - I'm more than fine with the current specs.
 

MysteryM

Member
Spot on post by the OP and great advice. I find the rush by misinformed people to pick up an HDR 4K TV staggering.

Personally I won't buy a 4K TV until post-Scorpio, and it will probably be a Sony Backlight Master Drive LCD, but one from after the current generation, when HDR input lag is reduced for gaming.

Seriously, your gut may say buy a TV now for the likes of PS4 Pro, but you are risking being saddled with an inferior early HDR TV. Wait a year or two, let the technology evolve, and then buy one. It's like everyone who bought a UHD TV only for manufacturers to then start pushing UHD Premium.

Some more info about the Sony Backlight Master Drive here, truly staggering.

https://youtu.be/LpTjitk31JA
 

branny

Member
Neither the PS4 Pro nor the Xbox One S (and likely the Scorpio) support Dolby Vision, so I wouldn't be too worried about future-proofing how blind you're going to be.

Like other people are saying, if you're buying a new TV for content that you'd like to enjoy in the near future, holding off on a purchase doesn't seem necessary, especially if you're happy with what you saw in a store.

There will always be more expensive, better-performing TVs coming out, and you're never really safe investing in any sort of technology.
 

EvB

Member
Those nits are nuts. When I get a new TV, one of the first things I do is lower the backlight settings because the factory settings hurt my eyes. Who knew that getting hit in the face with something close to the intensity of the sun when you're trying to unwind at night was desirable? Maybe HDR isn't for me.

Here is an example of a real-life measurement of nits, a flower on a sunny day:

[Image: dolby_vision_nits.png]


It's nothing our eyes aren't used to
 

Vipu

Banned
But back on topic, I think that now is an okay time for people to upgrade to 4K, especially with the PS4 Pro coming out. Technology will always advance and your current hardware will always become outdated/obsolete. So what's left for deciding?

And there goes the "consoles are so cheap" argument again.
 

Nzyme32

Member
Dolby Vision is rather spectacular in person, but the TVs and content to support it properly are still in their early days - meanwhile HDR10 content will work perfectly fine on those displays. They've gone quite far to future-proof it as a standard for HDR in terms of nits, for example for what you get in a real-world scene:

[Image: 3VfEWEL.png]


*Dolby posit mastering on current monitors with the maximum Dolby Vision brightness of 4,000 nits, and up to 10,000 nits in the future, to best cover the range they think is valuable based on their research, rather than restricting the range to 100 nits as is currently the standard, thereby cutting off a large chunk of the range.

Concerning games, the only one I know of is the PC version of Mass Effect Andromeda, which will support Dolby Vision, unlike the consoles, which will support HDR10. Apparently Dolby will be pushing for support in the relevant engines, but I'm not sure how long that will take them.
 
Those numbers are very misrepresentative. Yes, those big numbers with the Samsungs are possible, but only on a very small portion of the screen and for a very short period of time; those 1,400/1,500 nit peaks fall to the 500-600 mark quickly. I'd wager my 2015 SUHD set with a 500-600 nit peak would actually look very close to how the 2016 models do. Hell, Rtings only scored the 2016 models 0.5 better in the HDR category than my 'lowly' set.
 

LCGeek

formerly sane
I'm sorry but these snobby posts are what turns me off about NeoGAF.

I have a 4K TV. It does HDR but not HDR 1000. Compared to the 1080p TV I had, it's night and day. 4K from Netflix is mind blowing. 3D looks like a digital window.

I saw HDR 1000 over the weekend at Best Buy. It was nice, but honestly I had a hard time distinguishing the image from my current set.

On the other hand, I was at my friend's house who has my old 1080p set. The image was so dark I could hardly stand it. There was no shadow detail and so much detail overall was lost.

Is there always a better piece of tech 6 months away? Of course. But if you are ready for 4K and excited for PS4 Pro, you will not be disappointed by the current sets.

Your post is the only snobby post. Him highlighting that things aren't perfect isn't the same as what you're implying. He's also right that it's not a good time to get on board a tech that, while the industry is pumped for the rollout, is still in its beginning. TVs in the next 1 to 3 years will offer more than early-generation 4K or HDR TVs and will support better standards; that's how the flat panel industry works in general with new tech vs evolved tech. Letting people make up their minds based on what's presented is fairer than what you're saying.

Sounds like your friend has a garbage display regardless of lacking HDR features; that's a hallmark trait of no color calibration or shit color range on a set, especially if you're talking blacks or greys. Now that's a snobby line.
 

BumRush

Member
Next year's LGs are going to do it. I can FEEL it. But in regards to the chart... 2,000 nits is fucking insane, and entirely too bright unless you're watching TV at the gates of heaven.
 

AstroLad

Hail to the KING baby
i swear tvs and pcs it's always wait wait wait. there's always something around the corner. from my own personal experience i wouldn't fret about trying to time things out too much...you can always wait six more months and things will be cheaper and/or better, but that's the issue: that's always the case. best advice i can give personally is just don't go ultra-premium sector unless the money doesn't mean much to you, because that's the stuff that looks like a really bad deal pretty quickly.
 

StoOgE

First tragedy, then farce.
Man,

The EF9500 is a beast for not being designed for HDR out of the box.

And AstroLad is right. It's never a good time to buy anything because something better is always around the corner. Just buy what you want and be happy.

I've fired up my LG EF9500 and HDR on Netflix looks god-tier on it, and it's nowhere near the 1000 nits standard. I'll probably upgrade to UHD for some movies I love once the Scorpio is out (my Marantz receiver lacks HDMI 2.0a, and is a 1,400 dollar bottleneck between not losing sound quality on my receiver and getting HDMI 2.0a... all so I could save 400 dollars a year ago. Dumb long-term move :/)
 

Unicorn

Member
I'm the kind of guy that puts my phone and laptop on the lowest brightness setting.


This shit ain't for me, is it?
 

gatti-man

Member
OP, you're falling for marketing BS with this thread. The TVs that meet the UHD Premium standard aren't even good 4K buys. UHD Premium is a BS standard made up by manufacturers. And nit brightness is once again BS. My Vizio P series will burn my eyes out, and look where it's at on your chart.

It is never a good time to buy a TV for new tech.

But good breakdown OP.

Actually it's a terrible breakdown.
 
Is there always a better piece of tech 6 months away? Of course.
This argument comes up in every discussion where people are saying it's a bad time to buy something, and it's a bad argument.

Is there always another advancement right around the corner? Yes, but not all advancements are created equal. Some can and should be disregarded, and others are worth paying attention to. There are sweet spots where it's a much better time to buy than other times. It's those periods when prices have come down and quality has gone up on the last substantial jump, and there's no new substantial jump close enough to concern yourself with.
 

le-seb

Member
Yeah unless you're the type that buys a new TV every couple of years anyway (which is crazy in my opinion but whatever works for people I guess)
Same.

I'll only replace my 8 year old (but still very nice) TV set if it dies or there are affordable OLED ones available.
I feel like it's still too soon for UHD now.
 
By the time it's recommended that you buy a 4K television, 8K will be on the horizon. Now's a fine time to buy one if you get a good deal on one. Just don't expect it to still be top of the line in 5 years.
 

Darklor01

Might need to stop sniffing glue
18,000 NITS roughly = 100 Watt incandescent light bulb.

Edit: I personally do not want to stare directly at that much light.
 
This is becoming such a draaagg

I wish a magic genie lamp guy existed to tell me what the best
TV is to play Final Fantasy XV on PS4 Pro by November.
4K and HDR are obviously a must.
43"-70", and also below $600.
 
Might wanna see if you even like how it looks. From the in-store demos I've seen, it looks like a stabilized Dynamic Contrast to the maxxx.
 

gatti-man

Member
Here is an example of a real-life measurement of nits, a flower on a sunny day:

[Image: dolby_vision_nits.png]


It's nothing our eyes aren't used to

Once again this isn't even relevant. Do you understand what will happen with TVs trying to replicate that image? How many zones do the TVs that meet the standard have? You would need THOUSANDS for your example to be relevant and not destroy image quality. Why? Because LCD TVs don't work like a photo. Unless you have individual backlighting at a very small scale, that 14,000 nit area will bleed into the 2,000 nit area. It will lead to blown-out colors, horrible blacks, and an uneven image without a plasma/OLED-type display.
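
To put rough numbers on the zone argument above, here is a minimal hypothetical sketch (mine, not anything from the thread) of why a highlight and a shadow that share one dimming zone can't both be shown correctly. The 5000:1 native panel contrast and the nit targets are assumed figures chosen only for illustration.

NATIVE_CONTRAST = 5000  # assumed native LCD panel contrast ratio (illustrative)

def surround_luminance(highlight_nits, surround_nits, shares_zone):
    """Luminance a dim surround actually gets next to a bright highlight.

    If both share a dimming zone, the zone backlight must be driven to the
    highlight level, so the surround can only be darkened by the panel's
    native contrast; with enough zones it gets its own backlight level.
    """
    if shares_zone:
        raised_floor = highlight_nits / NATIVE_CONTRAST
        return max(surround_nits, raised_floor)
    return surround_nits

# A 14,000-nit highlight next to a 0.5-nit shadow:
print(surround_luminance(14000, 0.5, shares_zone=True))   # 2.8 nits: the shadow blooms
print(surround_luminance(14000, 0.5, shares_zone=False))  # 0.5 nits: the shadow is preserved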
 

EvB

Member
Do you watch your TV in the garden?
Once again this isn't even relevant. Do you understand what will happen with TVs trying to replicate that image? How many zones do the TVs that meet the standard have? You would need THOUSANDS for your example to be relevant and not destroy image quality. Why? Because LCD TVs don't work like a photo.

Do you look at a flower outside and your eyes burn out?
Dolby's aim is to make images look like real life, and that image is simply stating the measurement of real life.

Also, to be clear, I'm not saying wait for Dolby Vision and its 12,000 nits; I'm simply saying that you might be considering buying a TV that is already dated, because there are a few others (and many more to come) that are closer to what IS considered the HDR standard by TV manufacturers (UHD Premium).
 

Harp

Member
I'm the kind of guy that puts my phone and laptop on the lowest brightness setting.


This shit ain't for me, is it?

The entire screen will never be 1,000 nits, just a small percentage of it. But in reality I don't expect to see a huge jump from what current UHD Blu-rays offer today. They have set the standard for the discs, and there will be some movies that go all out, but the vast majority will be based on the current standard.
 

Madness

Member
I keep debating whether to save a ton of cash and get the 50" Sony KDLW800C or this year's XBR49X800D. Last year's is only 1080p but is 120Hz and has 3D. This year's has 4K and HDR but no 3D and is only 60Hz. Plus it is only 8-bit HDR. I hate being at the cusp of new changes in TV tech.
 
I'm skipping OLED this time. Of course that's the inevitable end game, but for now I'd rather put my money into size and features.

Yeah I didn't realize OLED was still so damn expensive, it's not exactly new tech.

So you think a good 4K HDR 65-inch TV for $1,000 is possible next year? I think Vizio could probably make that happen, at least on sale and not at full price.
 

iMax

Member
no, but I have windows in my house...
Dolby's aim is to make images look like real life.

Also, to be clear, I'm not saying wait for Dolby Vision and its 12,000 nits; I'm simply saying that you might be considering buying a TV that is already dated, because there are a few others that are closer to what should be considered the standard.

If you're watching a dark movie and then suddenly a 5,000-nit image fills the display, your eyes are going to burst.
 
Wait, why is my TV in the Sony 4K HDR blue section? It's only a 1080p set, but according to Rtings it has 10-bit colour depth. *Confused.com* Can I take advantage of any HDR benefits not counting nit brightness... like extended HDR colour range with PS4 Pro? I'm going to assume not.

EDIT: Nm, its colour gamut is not up to HDR spec.
 
Yeah I didn't realize OLED was still so damn expensive, it's not exactly new tech.

So you think a good 4K HDR 65-inch TV for $1,000 is possible next year? I think Vizio could probably make that happen

Hmm... I don't know, I think that's pushing it. The 2016 4K HDR Vizios will certainly drop in price next year, but I would expect the 65" models to be like... $1,200 minimum? And that would be a great deal.

But who knows! I'm no analyst, and this is a time of change for TV tech.
 

gatti-man

Member
no, but I have windows in my house...
Dolby's aim is to make images look like real life.

Also, to be clear, I'm not saying wait for Dolby Vision and its 12,000 nits; I'm simply saying that you might be considering buying a TV that is already dated, because there are a few others (and many more to come) that are closer to what IS considered the HDR standard by TV manufacturers (UHD Premium).

Wtf? Dolby Vision aims to be like real life? Lol. No no no.

And no, UHD Premium is a standard made up by Sony and Samsung to promote their TVs, most of which don't even have Dolby Vision to begin with. And Dolby Vision isn't even on most 4K Blu-rays; it's featured in streaming mostly.

Nits are like 20% of what makes an image, and you're displaying a complete lack of knowledge of how a UHD TV even creates its image in the first place.
 
I bought an LG 4K OLED HDR TV at the weekend and it looks amazing. If you like new tech and have the money, it's a noticeable difference.
 
I'm skipping OLED this time. Of course that's the inevitable end game, but for now I'd rather put my money into size and features.

I want an OLED screen but it has to be at least 80" to be viable.

So, I'm skipping TVs entirely. I'll upgrade my projector room once 4K HDR projectors are reliable and (relatively) affordable. And I need a new receiver too with that.

I'd rather spend 4-5k on a projector than on a small(ish) screen.
 
Hmm... I don't know, I think that's pushing it. The 2016 4K HDR Vizios will certainly drop in price next year, but I would expect the 65" models to be like... $1,200 minimum? And that would be a great deal.

But who knows! I'm no analyst, and this is a time of change for TV tech.

Damn, it's kinda crazy how expensive 4K HDR sets are. It's gonna be a while before this becomes mass market.
 

DorkyMohr

Banned
What this also shows you is exactly why Sony chose to create their own standard, one that has no minimum brightness requirement (4K HDR). It also shows why they have chosen not to publish nit data for their most recent TVs (such as the X800D, which has been recommended in another thread).

I was looking at this model, so given that it's recommended, is it good? Or does the graph mean it's doo-doo?
 
So, there are a load of threads about 4K and HDR, and there is clearly a lot of confusion between 4K, HDR and Wide Color Gamut (WCG).

As gamers, the only component of this we should be interested in is the HDR part. Right now developers are being cagey about what they are doing, what is HDR and what is WCG.

Microsoft have stated that the Xbox One S does not use WCG for games (only movies).
Sony hasn't said a word about what either machine is actually doing. PS3, PS4 and Xbox One all supported 10-bit deep color from their launches, yet we've never seen a game that utilises it.
(Xbox One actually supports 12-bit output also, which will be a requirement for Dolby Vision in future.)

10-bit textures and data require more storage and more bandwidth, so there are performance implications on all the machines.
For a game to well and truly support WCG, every texture and video will need to be replaced with a 10-bit WCG version.
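
Just to give a feel for the overhead being talked about here, a hypothetical back-of-envelope sketch (mine, not the OP's): it assumes plain uncompressed RGB frames and ignores the fact that GPUs often pad 10-bit data out to 16 bits, which would make the real gap larger.

# One uncompressed 4K RGB frame at 8 vs 10 bits per channel (illustrative only).
WIDTH, HEIGHT, CHANNELS = 3840, 2160, 3

def frame_megabytes(bits_per_channel):
    total_bits = WIDTH * HEIGHT * CHANNELS * bits_per_channel
    return total_bits / 8 / 1_000_000

eight_bit = frame_megabytes(8)    # about 24.9 MB
ten_bit = frame_megabytes(10)     # about 31.1 MB, i.e. 25% more raw data
print(f"8-bit frame:  {eight_bit:.1f} MB")
print(f"10-bit frame: {ten_bit:.1f} MB ({ten_bit / eight_bit - 1:.0%} more)")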

So, we probably aren't going to see much in the way of WCG material any time soon, yet the console manufacturers are insistent on talking about it like it is a very real thing. The only assumption that can be made is that they are using the HDR side of things to better control image brightness in things like sunspots and specular highlights.

So, as a guideline to show where TVs are right now in the grand scheme of things: Rtings have done some tests that show a screen's capability to display a bright image on a 2% area of the screen, so something like the sun or perhaps a light source.

[Image: j7NKXCf.gif]

What we can see is that the vast majority of TVs available right now sit right at the very bottom of the chart, falling massively short of the 1,000-nit peak brightness needed to hit the main HDR standard, UHD Premium, and nothing comes even close to what Dolby is suggesting for the future.

What this also shows you is exactly why Sony chose to create their own standard, one that has no minimum brightness requirement (4K HDR). It also shows why they have chosen not to publish nit data for their most recent TVs (such as the X800D, which has been recommended in another thread).


Another note as a gamer: because HDR10 requires the TV to analyse the image and control brightness accordingly, TVs are having to do image processing in order to actually display HDR content correctly, resulting in input lag.
Activating GAME MODE doesn't always help, as on some models this actually disables HDR processing altogether, and as it stands many review sites are not clear about their testing of input lag with HDR for gaming.
So again, the resources for researching an expensive purchase such as a TV are limited, also contributing towards now not being an easy time to make the right purchase.
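
To make the "control brightness accordingly" part concrete, here is a small hypothetical sketch (mine, not the OP's): HDR10 encodes absolute luminance with the SMPTE ST 2084 "PQ" curve, so a set that can't reach the mastered peak has to remap bright values down to whatever it can actually display. The hard clip below is a deliberately naive stand-in for whatever processing a real TV does, and the 600-nit panel peak is just an assumed example figure.

# SMPTE ST 2084 (PQ) constants
M1, M2 = 0.1593017578125, 78.84375
C1, C2, C3 = 0.8359375, 18.8515625, 18.6875

def pq_to_nits(signal):
    """ST 2084 EOTF: non-linear signal value in [0, 1] -> absolute luminance in nits."""
    p = signal ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

def naive_tone_map(nits, panel_peak=600.0):
    """Stand-in for the TV's brightness handling: just clip to the panel's peak."""
    return min(nits, panel_peak)

# A highlight mastered at 75% signal decodes to roughly 1,000 nits, which a
# 600-nit panel cannot show without remapping it down.
for signal in (0.25, 0.50, 0.75, 1.00):
    mastered = pq_to_nits(signal)
    print(f"signal {signal:.2f}: mastered {mastered:7.1f} nits -> displayed {naive_tone_map(mastered):6.1f} nits")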


Anyway, I hope this helps to further add some clarity to the intentional obfuscation from TV manufacturers and now console makers.

What HDR standard did Sony create? I haven't read or heard anything.
 
i swear tvs and pcs it's always wait wait wait. there's always something around the corner. from my own personal experience i wouldn't fret about trying to time things out too much...you can always wait six more months and things will be cheaper and/or better, but that's the issue: that's always the case. best advice i can give personally is just don't go ultra-premium sector unless the money doesn't mean much to you, because that's the stuff that looks like a really bad deal pretty quickly.

I disagree with this. I think there are clear lines when common standards are set and become the norm for some time. Those are the times to jump in, when the dust has settled on them. If the standards keep changing, then there's never really a standard. Something as simple as HDMI with HDCP on a 1080p set went a long way. I think when it comes to these new sets, simply getting something with HDMI 2.0 with HDCP 2.2, 4K and the right HDR will go a long way too. The thing is, right now we're still on the cusp of these becoming standard.

It is never a good time to buy a TV for new tech.

9th Generation Pioneer Kuro was probably the best point in history to buy a TV. Glad I jumped on that.
 