Plasma, LCD, OLED, LED, best TV for next gen

Man, you're just lucky that you can buy in USD.

The European prices for OLEDs are so bad. The B6 in Europe costs MORE than the E6 in the US. Let that sink in.

That's why I had to opt for the B6 (and I got it on sale, even). If you look on Amazon Germany right now (as I live in Germany), the 55" E6 would set you back $6023.

That's over six thousand USD when the same thing is available on Amazon US for less than $3500.

As an American living abroad, it makes me feel pretty sick. :\

Why are prices so high in Europe? I'm guessing shipping, taxes, and lack of competition?

In Australia right now one of the retail stores (goodguys) has the 65" C6 & 65" B6 for $5036 AUD (around 3405.63 Euro), and each week prices seem to drop due to heavy competition in Australia.

Edit: Just checked Amazon Germany holy fuck Europe is getting boned hard. 0_0

Edit 2:

Philips unveils 55" OLED TV with Ambilight, 4K, HDR & Android TV

http://www.flatpanelshd.com/news.php?subaction=showfull&id=1472727435
 
I achieved 4K60 in BF1 yesterday on my 4K monitor. What a visual experience. I can only imagine it on a 65" OLED TV, my god. I'd probably pass out.
 
I've got a question about how demanding y'all are about screen uniformity, specifically anyone with a 2016 OLED. I just got my replacement B6 a week ago, and have noticed a bizarre pattern of lighter pixels:

[image attachment: photo of the lighter-pixel pattern]


As with most lighter bands, I'm wondering whether or not it's a big issue. I do use my tv as a monitor, so I'm more frequently exposed to solid color blocks - websites, photo editing and design in Adobe products, darker video game scenes, etc.

Worth noting that I did run the panel clear setting overnight and this pattern remains, so I don't think it's image retention. It's also not a smudge, which I honestly thought it might be at the start.

Would hate to start another ticket after just having gotten this second panel - is this just down to lottery? Anyone else notice uniformity problems? Or am I overreacting, and it might go away after a while?
 
Man, you're just lucky that you can buy in USD.

The European prices for OLEDs are so bad. The B6 in Europe costs MORE than the E6 in the US. Let that sink in.

That's why I had to opt for the B6 (and I got it on sale, even). If you look on Amazon Germany right now (as I live in Germany), the 55" E6 would set you back $6023.

That's over six thousand USD when the same thing is available on Amazon US for less than $3500.

As an American living abroad, it makes me feel pretty sick. :\
I would have had two 65" E6s by now if I could pay dollar prices.
 
Right now, it's got inferior motion resolution compared to plasma, relatively dim HDR compared to LCD, not the best shadow detail, and too much lag for serious gaming in HDR mode.

Agree with your other points but input lag has been fine for me outside of Game mode so I can't see it being a problem with HDR gaming.

I think the E6 is praised for having one of the lowest input lags in HDR, no?
 
This question might not make sense but I'll ask it anyway (since many of you know WAAAAAAAAY more about TV tech than I do):

Are many of the "issues" with current OLEDs - brightness, dark grey uniformity, image retention, input lag - inherent to OLED panels, or just LG's current design? In other words, do you anticipate 2017 models to resolve any of these issues? I'm currently in the process of selling / buying a house, so I'm not in the biggest rush, but I know that the 2017 models will probably come with a +$1000 price tag.

Edit: and thanks. You guys are super helpful.
 
The native refresh rate of these LG OLED panels is 120Hz; it just sucks that we are limited by the shitty HDMI standard, which is always lagging behind DisplayPort and other connection methods (Thunderbolt, for example).

I sadly don't see any TV manufacturer paying the licensing fee for DP 1.4, because most of them simply don't consider PC users/gamers a target audience... which I think is silly as fuck, seeing as how many PC gamers are willing to spend top dollar on products they deem to be pushing technology forward. For a few extra dollars they could end up converting that into many more customers.

Lastly, the other things I'd like to see improved in next year's models:

- Motion resolution: ~600 lines for current sets with the right settings is a shame considering OLED's potential. (If rumors on AVS are true, motion res is getting a huge upgrade in 2017 models.)

- Better handling of just-above-black in the way LG processes the signal.

- A faster SoC that doesn't lag the UI while in HDR mode; that's the only slightly annoying thing I've seen on the E6. webOS overall is freaking amazing.

- Bonus: 1,000 nits! (Honestly, I'd be more than happy if brightness were bumped to 800 nits, haha.)
Agreed. There is a market out there. I've been an early adopter many times, and it has often paid off despite the odds. I've spent countless hours researching 4K in the last two years.
I definitely feel the vibe I got with the 720p projector 10 years ago and when I bought the 1440p Catleap 5 years ago. Paid $300 back then, and it's still going strong!


OLED is so close to nailing it already. I want one so bad I can almost see next year's Black Friday deals. Even the finer points you made align with mine. I think a sub-$2k 50" OLED is out before this year is over, but even a dedicated monitor panel is unlikely to go beyond 60Hz in 2016.

Next year should also see the first GPU to handle 4K properly. There is an ever-growing user base willing to compromise on display size, viewing distance, and budget in order to achieve better immersion. The Vizio P came so close to mastering both TV and monitor functions this year; an OLED is all but guaranteed to finish the job in 2017.
 
This question might not make sense but I'll ask it anyway (since many of you know WAAAAAAAAY more about TV tech than I do):

Are many of the "issues" with current OLEDs - brightness, dark grey uniformity, image retention, input lag - inherent to OLED panels, or just LG's current design? In other words, do you anticipate 2017 models to resolve any of these issues? I'm currently in the process of selling / buying a house, so I'm not in the biggest rush, but I know that the 2017 models will probably come with a +$1000 price tag.

Edit: and thanks. You guys are super helpful.

With each iteration they've improved these specs. They've reduced the input lag and increased the peak brightness considerably over last year's models so I fully expect them to make more strides next year.

But don't let anyone fool you into thinking OLED brightness is an "issue". It's not. It can be better but "issue" isn't the word I'd use for it and it's natural for tech that hasn't matured yet to improve with each iteration anyway.
 
I've got a question about how demanding y'all are about screen uniformity, specifically anyone with a 2016 OLED. I just got my replacement B6 a week ago, and have noticed a bizarre pattern of lighter pixels:

As with most lighter bands, I'm wondering whether or not it's a big issue. I do use my tv as a monitor, so I'm more frequently exposed to solid color blocks - websites, photo editing and design in Adobe products, darker video game scenes, etc.

Worth noting that I did run the panel clear setting overnight and this pattern remains, so I don't think it's image retention. It's also not a smudge, which I honestly thought it might be at the start.

Would hate to start another ticket after just having gotten this second panel - is this just down to lottery? Anyone else notice uniformity problems? Or am I overreacting, and it might go away after a while?

I have something similar on my 930 (a 2015 model), though I rarely notice it these days, so perhaps it fades over time? I used to spot it on large grey objects; other colours seemed fine.
 
I achieved 4k60 in BF1 yesterday on my 4k monitor. What a visual experience. I can only imagine on a 65" OLED TV my god. I'd probably pass out.

It might sound strange, but one of the best 4K experiences I've had is the Telltale Batman game. Looks like a comic book come to life. Hell, I've still got the performance headroom to down sample from an even higher resolution to 4K/60. Gonna give it a go later for that Pixar IQ.

Agree with your other points but input lag has been fine for me outside of Game mode so I can't see it being a problem with HDR gaming.

I think the E6 is praised for having one of the lowest input lag in HDR no?

My point was that fine is not perfect.

The input lag is 56ms in HDR for the E6. You might not notice it, but I would prefer less than 30ms.
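To put those numbers in perspective, here's a small illustrative conversion from milliseconds of lag to frames at a given refresh rate (the 56 ms and 30 ms figures come from the post above; the function name is just for this sketch):

```python
# Convert input lag in milliseconds to frames at a given refresh rate.

def lag_in_frames(lag_ms, refresh_hz=60):
    frame_time_ms = 1000 / refresh_hz  # ~16.7 ms per frame at 60 Hz
    return lag_ms / frame_time_ms

print(round(lag_in_frames(56), 1))  # 3.4 -> over three frames behind at 60 Hz
print(round(lag_in_frames(30), 1))  # 1.8 -> under two frames
```

So 56 ms means the display is showing an image more than three 60Hz frames old, which is why competitive players care even when it "feels fine".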

This question might not make sense but I'll ask it anyway (since many of you know WAAAAAAAAY more about TV tech than I do):

Are many of the "issues" with current OLEDs - brightness, dark grey uniformity, image retention, input lag - inherent to OLED panels, or just LG's current design? In other words, do you anticipate 2017 models to resolve any of these issues? I'm currently in the process of selling / buying a house, so I'm not in the biggest rush, but I know that the 2017 models will probably come with a +$1000 price tag.

Edit: and thanks. You guys are super helpful.

If you can wait, wait. The tech is making significant strides every year and content standards are still being finalized. When you're spending that much money, you shouldn't have any regrets or make compromises, IMO. We will have new models announced and demoed at CES in 4 months, and they will probably be available for purchase as soon as 6 months from now.
 
I have something similar on my 930 (a 2015 model), though I rarely notice it these days, so perhaps it fades over time? I used to spot it on large grey objects; other colours seemed fine.

Yeah, it's definitely most noticeable in grey objects - probably why I first noticed it in Illustrator. I have yet to notice it explicitly in a movie or a game, which is why I'm hesitating to bring it up to customer support. It's just such a weird pattern. I hate panel lottery, especially on TVs that cost thousands.
 
This question might not make sense but I'll ask it anyway (since many of you know WAAAAAAAAY more about TV tech than I do):

Are many of the "issues" with current OLEDs - brightness, dark grey uniformity, image retention, input lag - inherent to OLED panels, or just LG's current design? In other words, do you anticipate 2017 models to resolve any of these issues? I'm currently in the process of selling / buying a house, so I'm not in the biggest rush, but I know that the 2017 models will probably come with a +$1000 price tag.

Edit: and thanks. You guys are super helpful.

Image retention is not really an issue, though I'm not gonna promise you won't get IR if you leave a static image up on the screen for a good 8 hours, lol.

Input lag has nothing to do with the tech; if LG wanted, they could easily produce the fastest-response, lowest-input-lag display on earth, thanks to how insanely fast the pixel response of OLED is. Around 2 frames of input lag is about the norm these days for top-tier displays. I expect next year's models to improve on that number, seeing as how many reviews now point out display input lag, putting more pressure on manufacturers to pay attention to how much processing they still allow in "game" mode.

I expect LG to keep improving on near-black detail and other slight niggles. Their jump from the 2015 to the 2016 model was pretty damn huge. Nothing is inherently wrong with the tech at all. Unlike LCDs, which will forever have huge flaws that simply can't be fixed due to the nature of the technology, the base technology of OLED is fantastic. As it keeps maturing and manufacturers rapidly improve on each release, you'll see OLED take an even larger slice of the high-tier pie (and the midrange too, as prices naturally come down).

It is pretty telling that Sony fast-tracked their Z9D with "Backlight Master Drive" after the crazy amount of buzz LG got with their OLED displays this year, and even after all that, the What Hi-Fi? review slammed the shit out of the unit.

I've got a question about how demanding y'all are about screen uniformity, specifically anyone with a 2016 OLED. I just got my replacement B6 a week ago, and have noticed a bizarre pattern of lighter pixels:

As with most lighter bands, I'm wondering whether or not it's a big issue. I do use my tv as a monitor, so I'm more frequently exposed to solid color blocks - websites, photo editing and design in Adobe products, darker video game scenes, etc.

Worth noting that I did run the panel clear setting overnight and this pattern remains, so I don't think it's image retention. It's also not a smudge, which I honestly thought it might be at the start.

Would hate to start another ticket after just having gotten this second panel - is this just down to lottery? Anyone else notice uniformity problems? Or am I overreacting, and it might go away after a while?

Check dark gray patterns to see how bad it is. Normally it fades over time; I know a lot of owners report huge improvements a couple hundred hours in. Reminds me of how my plasma display settled in after 300 hours.

If it's SUPER bad, though, it might never fade to the point where you don't notice it in normal content; that would warrant a return.
 
It's certainly a fantastic tv, but don't fool yourself. There are some obvious issues that need to be addressed before it can be called perfect.

Right now, it's got inferior motion resolution compared to plasma, relatively dim HDR compared to LCD, not the best shadow detail, and too much lag for serious gaming in HDR mode.

OLED is the future, but there are still some ways to go.

Enjoy your tv though. The wait for next year's model is killing me while you get to experience entertainment nirvana now. :)

KS9500 is all the gaming TV I need :)
 
I checked out the new Sony Z series while I was looking at the OLED C6, and that Sony set is stunning. Now, this is coming from someone who isn't a TV pro who's seen the best there is to offer over the years, but for me it was the closest thing I've seen to feeling like I was looking out a window, lol. The footage they had running was great.
 
I checked out the new Sony Z series while I was looking at the OLED C6, and that Sony set is stunning. Now, this is coming from someone who isn't a TV pro who's seen the best there is to offer over the years, but for me it was the closest thing I've seen to feeling like I was looking out a window, lol. The footage they had running was great.

I'm liking what I'm reading about the Z series from new owners on AVS Forums. I won't be able to afford one of the current gen sets, but I'm excited to see how things develop over the next year or two when I'll be in the market for an upgrade.
 
With each iteration they've improved these specs. They've reduced the input lag and increased the peak brightness considerably over last year's models so I fully expect them to make more strides next year.

But don't let anyone fool you into thinking OLED brightness is an "issue". It's not. It can be better but "issue" isn't the word I'd use for it and it's natural for tech that hasn't matured yet to improve with each iteration anyway.


If you can wait, wait. The tech is making significant strides every year and content standards are still being finalized. When you're spending that much money, you shouldn't have any regrets or make compromises, IMO. We will have new models announced and demoed at CES in 4 months, and they will probably be available for purchase as soon as 6 months from now.


Image retention is not really an issue, though I'm not gonna promise you won't get IR if you leave a static image up on the screen for a good 8 hours, lol.

Input lag has nothing to do with the tech; if LG wanted, they could easily produce the fastest-response, lowest-input-lag display on earth, thanks to how insanely fast the pixel response of OLED is. Around 2 frames of input lag is about the norm these days for top-tier displays. I expect next year's models to improve on that number, seeing as how many reviews now point out display input lag, putting more pressure on manufacturers to pay attention to how much processing they still allow in "game" mode.

I expect LG to keep improving on near-black detail and other slight niggles. Their jump from the 2015 to the 2016 model was pretty damn huge. Nothing is inherently wrong with the tech at all. Unlike LCDs, which will forever have huge flaws that simply can't be fixed due to the nature of the technology, the base technology of OLED is fantastic. As it keeps maturing and manufacturers rapidly improve on each release, you'll see OLED take an even larger slice of the high-tier pie (and the midrange too, as prices naturally come down).

It is pretty telling that Sony fast-tracked their Z9D with "Backlight Master Drive" after the crazy amount of buzz LG got with their OLED displays this year, and even after all that, the What Hi-Fi? review slammed the shit out of the unit.

You guys are amazing. Thank you. I have the time to wait (plus a 60" Samsung LED that can fill in until the next OLED model is released). Seems as though none of these issues are "unfixable" given the tech.
 
Where are the UHD Blu Ray players with Dolby Vision support? I'm really itching for a new player.

Considering no UHD BRs support DV, what are you hoping to play on it? Do you want it for streaming? If so, all TVs with DV that I know of also have streaming apps compatible with DV.
 
Considering no UHD BRs support DV, what are you hoping to play on it? Do you want it for streaming? If so, all TVs with DV that I know of also have streaming apps compatible with DV.

Studios have announced their support for DV so I imagine Blu Ray DV has to be a thing. Otherwise DV is kinda dead in the water.
 
Studios have announced their support for DV so I imagine Blu Ray DV has to be a thing. Otherwise DV is kinda dead in the water.

If there are no UHD Blu-ray players with DV yet, we can assume support for it on UHD BR is still contentious. Dolby trying to push their proprietary format against the open standard is helping nobody. I sincerely hope HDR10 plus dynamic metadata wins this war, and I'm not even that invested in my current HDR10 TV (KS8000); I'm totally open to upgrading to a DV-supporting OLED panel in the next 2 years if it miraculously becomes the norm.
 
Studios have announced their support for DV so I imagine Blu Ray DV has to be a thing. Otherwise DV is kinda dead in the water.

They're expected next year. I suspect Universal, Warner Bros, Paramount, and even Disney are prepared to put full support behind Dolby Vision. Fox is the only studio I have doubts about; they seem to be in bed with Samsung. They even use Samsung displays to color grade their IPs.

I almost forgot: Lionsgate has adopted Dolby Vision.
 
If there are no UHD Blu-ray players with DV yet, we can assume support for it on UHD BR is still contentious. Dolby trying to push their proprietary format against the open standard is helping nobody. I sincerely hope HDR10 plus dynamic metadata wins this war, and I'm not even that invested in my current HDR10 TV (KS8000); I'm totally open to upgrading to a DV-supporting OLED panel in the next 2 years if it miraculously becomes the norm.

Why not support the company that paid for and did the research to bring HDR to the masses? Why shouldn't they get paid for their work?

Maybe they shouldn't have shared their research with SMPTE.
 
OLED competitors are a great thing.

They're not competitors; they're all LG panels. They can only price their sets based on what LG sells them the panels for, which is no doubt at a premium, LG being the sole manufacturer. This is not like LCDs, where they have a dozen-plus panel manufacturers to choose from, all competing with each other on price and specs.
 
Check dark gray patterns to see how bad it is. Normally it fades over time; I know a lot of owners report huge improvements a couple hundred hours in. Reminds me of how my plasma display settled in after 300 hours.

If it's SUPER bad, though, it might never fade to the point where you don't notice it in normal content; that would warrant a return.

Hm, I'll test some patterns when I get a free second. I really don't want to send another damn TV back, but I'd also rather not work through hundreds of hours of content only for the banding to remain.
 
and my newest monitor...

Dell 2715K (Manufactured December 2014)-- 12-bit color (downsamples to 10-bit), 5K (roughly twice the pixels of 4K), 99% of Adobe RGB color gamut, 350 nits. http://accessories.us.dell.com/sna/productdetail.aspx?c=us&l=en&s=bsd&cs=04&sku=up275k3 At 5K resolution, this monitor is more advanced than TVs will be for probably the next 5 years. As a matter of fact, no single cable exists yet supporting 5K resolution, so this monitor requires two DisplayPort cables. Once again, darker than the brightest HDR TVs due to the IPS panel.

The 2715K is such a beast of a monitor. Do you do video/photography work?
 
Another thing to note: if you are planning on getting a 4K TV to connect to your computer, you should have a Maxwell or newer GPU. You 'can' do 4K on a Kepler or other HDMI 1.4 card; however, it will be YUV 4:2:0, which is 4K luma (brightness) and roughly 2K chroma (color).
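A tiny sketch of what that 4:2:0 subsampling means for a 3840x2160 frame (sample counts only; real HDMI signaling adds blanking and encoding overhead not modeled here):

```python
# 4:2:0 keeps luma (Y) at full resolution but samples each chroma plane
# (Cb, Cr) at half resolution in both axes.

def plane_sizes(width, height):
    luma = width * height                  # Y: full resolution
    chroma = (width // 2) * (height // 2)  # Cb or Cr: half in each axis
    return luma, chroma

y, c = plane_sizes(3840, 2160)
print(y)  # 8294400 luma samples (full 4K)
print(c)  # 2073600 chroma samples per plane (1920x1080, i.e. "2K" chroma)

full_444 = 3 * y     # 4:4:4: three full-resolution planes
sub_420 = y + 2 * c  # 4:2:0: exactly half the samples
print(sub_420 / full_444)  # 0.5
```

That halving of the data is exactly what lets 4K fit through the older HDMI link; brightness detail survives, color detail is quartered.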
 
I'm a software developer and I love having as much code on my monitor as possible. I got the monitor as an open-box at Fry's and it was 20% off the list price! And yes, I occasionally do video and photography work. It's been a while since I've contributed, but here is my most recent changeset to avidemux, an open-source video editing solution. https://github.com/mean00/avidemux2/commit/43373c07b4a00962e85c0b5cf23d783226b9dd58 This changeset added h265 support through the x265 library-- admittedly, it is very rough and has been substantially improved by other contributors since my initial commit.

I currently have that monitor at 150% scaling, and some applications handle scaling better than others. Firefox scales beautifully on the monitor, as does Media Player Classic Home Cinema and Windows Explorer due to their native scaling support. Lots of legacy applications have to be upscaled to 150% by Windows and look a little blurry, so I tend to use those on my second monitor (the 4K) which doesn't upscale.

Awesome stuff. I also work in software development (webdev) and have been considering taking the plunge on a 4K monitor (I use 2x1600p at work, 1440p + 2x1080p vertical at home). Sounds like a sweet setup; I think I'll consider upgrading some time this year. Maybe not to the UP2715K, but I'll get my feet wet with the P2715Q.
 
This question might not make sense but I'll ask it anyway (since many of you know WAAAAAAAAY more about TV tech than I do):

Are many of the "issues" with current OLEDs - brightness, dark grey uniformity, image retention, input lag - inherent to OLED panels, or just LG's current design? In other words, do you anticipate 2017 models to resolve any of these issues? I'm currently in the process of selling / buying a house, so I'm not in the biggest rush, but I know that the 2017 models will probably come with a +$1000 price tag.

Edit: and thanks. You guys are super helpful.

They will probably get a bit brighter going forward.

But the current OLED generation has PLENTY of brightness unless you're putting it in the brightest of rooms (and even then? It's fine unless you want to watch tons of dark content).

Part of the key with HDR is the contrast and perceived contrast. Because of the perfect blacks, OLEDs don't need to hit the same brightness to appear as bright.

Uniformity improved from 2015->2016. It is basically a non-issue at this point in normal content. I'd imagine 2017 will be further in that direction.

Image retention isn't an issue in normal content with 2016 models. If you use it as a PC monitor, maybe.

Input lag is something LG can certainly make better going forward; nothing inherent to the design of the panel. Their current game mode is perfectly fine, though.
 
And the current HDMI 2.0 standard does not have the bandwidth to allow 4K60 with both full chroma resolution and 10-bit HDR color, so chroma resolution is sacrificed (4:2:0) in order to provide the 10-bit color. It is expected that future versions of HDMI will solve this problem.
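A rough back-of-the-envelope check of that bandwidth claim (pixel data only, ignoring blanking intervals, so real requirements are somewhat higher; 14.4 Gbps is HDMI 2.0's usable data rate after 8b/10b encoding of the 18 Gbps link):

```python
# Estimate uncompressed video data rates and compare against HDMI 2.0.

HDMI_2_0_DATA_RATE_GBPS = 14.4  # usable payload of the 18 Gbps link

def data_rate_gbps(width, height, fps, bits_per_sample, samples_per_pixel):
    # samples_per_pixel: 3.0 for full chroma (4:4:4), 1.5 for 4:2:0
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

full_chroma = data_rate_gbps(3840, 2160, 60, 10, 3.0)  # ~14.93 Gbps
subsampled = data_rate_gbps(3840, 2160, 60, 10, 1.5)   # ~7.46 Gbps

print(full_chroma > HDMI_2_0_DATA_RATE_GBPS)  # True: 4K60 10-bit 4:4:4 won't fit
print(subsampled < HDMI_2_0_DATA_RATE_GBPS)   # True: 4K60 10-bit 4:2:0 fits
```

Even before accounting for blanking, 4K60 at 10-bit 4:4:4 already overshoots the link, which is why the chroma gets subsampled.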
 
That's funny-- because when HDR was being shown off at CES for the first time, all the reporters called it a "buzzword":

http://www.cnet.com/news/ces-2016-tv-tech-preview/
http://www.techhive.com/article/292...pe-swirling-around-high-dynamic-range-tv.html

Fact of the matter is, PC monitors have supported "HDR" features for years and years.

yeah, but in terms of television and film mastering and content, it's not just a buzzword.

Televisions weren't built to hit that range/spectrum of color, content was mastered to the old TV standards.

Very, very different from realtime computer graphics applications. And even if your PC monitor could handle it, the content you watch on it (be it Blu-ray, Netflix, etc.) wasn't mastered to take advantage of that monitor.
 
Another technology which is due for an upgrade soon is DisplayPort. Currently, DisplayPort only has enough bandwidth for a single 4K monitor, so 5K monitors require two DisplayPort cables and stitch together two images to create a 5K screen.
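The same kind of arithmetic shows why two cables are needed (pixel data only, blanking ignored; 17.28 Gbps is the usable HBR2 rate of a four-lane DisplayPort 1.2 link after 8b/10b encoding):

```python
# Why 5K60 needs two DisplayPort 1.2 links.

DP_1_2_DATA_RATE_GBPS = 17.28  # 4 lanes x 5.4 Gbps x 0.8 (8b/10b)

def data_rate_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

five_k_60 = data_rate_gbps(5120, 2880, 60, 24)  # 8-bit RGB, ~21.2 Gbps

print(five_k_60 > DP_1_2_DATA_RATE_GBPS)       # True: one link isn't enough
print(five_k_60 <= 2 * DP_1_2_DATA_RATE_GBPS)  # True: two tiled links cover it
```

Hence the two-cable setup: the monitor presents itself as two half-width tiles, one per link, and the GPU stitches them together.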
 
I also encourage using DisplayPort to connect your monitors, as graphics cards more frequently have only a single HDMI port, which in the future will be best used for a VR headset, not a monitor.
 
That's funny-- because when HDR was being shown off at CES for the first time, all the reporters called it a "buzzword":

http://www.cnet.com/news/ces-2016-tv-tech-preview/
http://www.techhive.com/article/292...pe-swirling-around-high-dynamic-range-tv.html

Fact of the matter is, PC monitors have supported "HDR" features for years and years.
Well, that settles it then. CNET called it a 'buzzword'. Case closed. They also call 4K and OLED buzzwords in the next sentence.

PC monitors may support some of the features, but they don't adhere to either of the standards, which is kinda important. Especially on a gaming forum.
 
HDR is just a buzzword for TVs. For monitors, you have to actually look at the specifications to determine the number of colors supported and maximum brightness. High end monitors have supported 10-bit and even 12-bit color for a long time now and have different specified nits of brightness.

All 3 of my monitors are already 10-bit color, and many of them were made before HDR was a television feature:

HP ZR30w (Manufactured July 2010)-- 10-bit color, 2K, 99% of Adobe RGB color gamut, 330 nits. http://h20564.www2.hp.com/hpsc/doc/public/display?docId=emr_na-c02159509 If you look at the specs, this monitor currently has a higher color gamut than any television currently in production. Including the OLEDs. It doesn't get as bright as the brightest TVs, but IPS panels are darker than VA panels in general.

Dell UP3214Q (Manufactured November 2013)-- 10-bit color, 4K, 99% of Adobe RGB color gamut, 350 nits. http://accessories.ap.dell.com/sna/productdetail.aspx?c=au&cs=aubsd1&l=en&sku=210-ACBW Once again, this monitor is from 2013 and has a higher color gamut than any television currently in production. It's darker than the brightest HDR TVs, due to the IPS panel.

and my newest monitor...

Dell 2715K (Manufactured December 2014)-- 12-bit color (downsamples to 10-bit), 5K (roughly twice the pixels of 4K), 99% of Adobe RGB color gamut, 350 nits. http://accessories.us.dell.com/sna/productdetail.aspx?c=us&l=en&s=bsd&cs=04&sku=up275k3 At 5K resolution, this monitor is more advanced than TVs will be for probably the next 5 years. As a matter of fact, no single cable exists yet supporting 5K resolution, so this monitor requires two DisplayPort cables. Once again, darker than the brightest HDR TVs due to the IPS panel.

From my experience, the best PC monitors are IPS panels and darker than televisions' vertical alignment panels. That being said, these monitors are PLENTY bright and I have never thought they were too dark. As far as I know, no monitor manufacturer has used the "HDR" buzzword. Just get a monitor with the specifications you desire.

Note that only the highest-end graphics cards can output 10-bit and 12-bit color. This includes AMD FirePro, NVIDIA Quadro, and the very high-end NVIDIA and AMD cards. My Titan X (Maxwell) can output 10-bit color, and I believe my 780 Ti (Kepler) can as well.
Great info. Thanks. A few more questions if you don't mind...

What about the HDR capabilities of something like the Xbox One S and (presumably) the PS4 Neo? Would you be able to take advantage of those on these monitors you mentioned?

Also, how are the black levels on the monitors? I'm a plasma guy so I've always enjoyed the black levels and shadow detail of a plasma. Just curious as to how these stack up in that regard.

yeah, but in terms of television and film mastering and content, it's not just a buzzword.

Televisions weren't built to hit that range/spectrum of color, content was mastered to the old TV standards.

Very, very different from realtime computer graphics applications. And even if your PC monitor could handle it, the content you watch on it (be it Blu-ray, Netflix, etc.) wasn't mastered to take advantage of that monitor.

Ahh. That's kind of what I was thinking. So these monitors wouldn't really be able to use the "HDR" features of the Xbox One S or the Neo?
 
TRega, don't be in denial; HDR is not simply 10-bit colour.

edit: reading your earlier post, why don't you just disable DPI scaling for those applications, so they won't scale on the monitor you have at 150%?
 
Ahh. That's kind of what I was thinking. So these monitors wouldn't really be able to use the "HDR" features of the Xbox One S or the Neo?

No, which is why you won't hear anyone say "use your monitor for HDR" instead of recommending a TV with HDR support.
 
Another factor in choosing a computer monitor is the panel technology being used. TN panels tend to have the highest refresh rates, but they have poor viewing angles and often achieve large color palettes through dithering-- rapid flickering between colors. I prefer IPS panels, as I am willing to sacrifice refresh rate for a better quality display.
 