So, there are a load of threads about 4K and HDR, and there is clearly a lot of confusion between 4K, HDR and Wide Color Gamut (WCG).
As gamers, the only component of this we should be interested in is the HDR part. Right now the developers are being cagey about what they are doing: what is HDR and what is WCG.
Microsoft have stated that the Xbox One S does not use WCG for games (only movies).
Sony hasn't said a word about what either machine is actually doing. PS3, PS4 and Xbox One have all supported 10-bit deep color since launch, yet we've never seen a game that utilises it.
(Xbox One actually supports 12-bit output also, which will be a requirement for Dolby Vision in future)
10-bit textures and data require more storage and more bandwidth, so there are performance implications on all of the machines.
For a game to well and truly support WCG, every texture and video would need to be replaced with a 10-bit WCG version.
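To give a rough feel for where that extra cost comes from, here's a back-of-the-envelope sketch assuming uncompressed 4K RGB frames (real game assets are block-compressed, so the exact numbers will differ, but the ~25% increase in raw data is the point):

```python
# Rough comparison of uncompressed frame size at 8-bit vs 10-bit per channel.
WIDTH, HEIGHT, CHANNELS = 3840, 2160, 3  # one 4K RGB frame

def frame_megabytes(bits_per_channel: int) -> float:
    """Uncompressed size of one frame in megabytes."""
    return WIDTH * HEIGHT * CHANNELS * bits_per_channel / 8 / 1_000_000

mb_8bit = frame_megabytes(8)
mb_10bit = frame_megabytes(10)
print(f"8-bit frame:  {mb_8bit:.1f} MB")
print(f"10-bit frame: {mb_10bit:.1f} MB ({(mb_10bit / mb_8bit - 1) * 100:.0f}% more)")
```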
So, we probably aren't going to see much in the way of WCG material any time soon, yet the console manufacturers insist on talking about it like it is a very real thing. The only conclusion I can draw is that they are using the HDR side of things to better control image brightness in things like sunspots and specular highlights.
So, as a guideline to show where TVs are right now in the grand scheme of things: Rtings have run tests that measure a screen's ability to display a bright image on a 2% area of the screen, i.e. something like the sun or another small light source.
What we can see is that the vast majority of TVs available right now sit right at the bottom of the chart, falling massively short of the 1000-nit peak brightness needed to hit the main HDR standard: UHD Premium.
This also shows exactly why Sony chose to create their own standard, "4K HDR", which has no minimum brightness requirement, and why they have chosen not to publish nit figures for their most recent TVs (such as the X800D, which has been recommended in another thread).
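For reference, my understanding of the UHD Premium spec is that a set has to hit one of two luminance targets (on top of 4K resolution, 10-bit input and wide-gamut coverage, which I'm ignoring here). A minimal sketch of that check, with made-up example measurements rather than real review data:

```python
# Sketch of the two UHD Premium luminance paths (as I understand the UHD
# Alliance spec); resolution, bit depth and gamut requirements are omitted.

def meets_uhd_premium_luminance(peak_nits: float, black_nits: float) -> bool:
    """True if the peak/black combination satisfies either luminance path."""
    lcd_path  = peak_nits >= 1000 and black_nits <= 0.05    # typical LED/LCD route
    oled_path = peak_nits >= 540  and black_nits <= 0.0005  # the route OLEDs take
    return lcd_path or oled_path

# Hypothetical measurements, not real figures:
print(meets_uhd_premium_luminance(peak_nits=350, black_nits=0.04))   # False - where most current sets sit
print(meets_uhd_premium_luminance(peak_nits=1200, black_nits=0.03))  # True
```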
Another note as a gamer: because HDR10 implementations require the TV to analyse the image and control brightness accordingly, TVs have to do extra image processing in order to display HDR content correctly, and that processing adds input lag.
Activating GAME MODE doesn't always help, as on some models this actually disables HDR processing altogether, and as it stands many review sites are not clear about whether they test input lag with HDR enabled.
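To give a feel for the kind of per-pixel work involved: HDR10 signals are encoded with the PQ (SMPTE ST 2084) transfer function, so the TV has to decode each value into absolute nits and then tone-map anything brighter than its own panel back down into range. A minimal sketch, using a naive hard clip and an assumed 350-nit panel where real TVs apply much smarter roll-off curves:

```python
# PQ (SMPTE ST 2084) decode plus a crude stand-in for the TV's tone mapping.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(signal: float) -> float:
    """Decode a normalised PQ code value (0..1) into absolute luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def naive_tone_map(nits: float, panel_peak: float = 350) -> float:
    """Hard-clip at the panel's peak; real sets use gentler roll-offs."""
    return min(nits, panel_peak)

for code in (0.5, 0.75, 1.0):
    nits = pq_to_nits(code)
    print(f"PQ {code:.2f} -> {nits:7.1f} nits in the signal, displayed as {naive_tone_map(nits):5.1f}")
```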
So once again, the resources for researching an expensive purchase like a TV are limited, which is another reason why now is not an easy time to make the right choice.
Anyway, I hope this helps add some clarity to the intentional obfuscation from the TV manufacturers and now the console makers.
Update: I've removed all references to Dolby Vision, as even its very presence seems to offend. I have also removed the OLED TVs from this chart, as they are measured slightly differently due to their superior ability to produce deep blacks.
Source of data: http://uk.rtings.com/tv/tests/picture-quality/peak-brightness
TL;DR: Don't buy a cheap one right now, and possibly don't buy an expensive one either, as the HDR mode will bring in extra lag. Wait just a few months (CES is in 3 months) until the next wave of TVs with HDR functionality appears.