Plasma, LCD, OLED, LED: best TV for next gen

Hey jstevenson, wondering if you ever got a chance to set up the HDR mode on your OLED for Ratchet & Clank? If so, would you mind sharing the settings?

I am really confused about which color gamut to use in HDR mode... according to the OLED thread on AVS Forum, 'Normal' is most accurate. But of course, everyone has differing opinions on the matter, most of which back up their assertions with facts lol.

What color gamut looks most accurate to you for Ratchet & Clank in HDR?
 
Hey jstevenson, wondering if you ever got a chance to set up the HDR mode on your OLED for Ratchet & Clank? If so, would you mind sharing the settings?

I am really confused about which color gamut to use in HDR mode... according to the OLED thread on AVS Forum, 'Normal' is most accurate. But of course, everyone has differing opinions on the matter, most of which back up their assertions with facts lol.

What color gamut looks most accurate to you for Ratchet & Clank in HDR?

I believe you should be using wide color gamut for HDR content.
 
1. The LG resolves 300 lines of motion resolution. It doubles to 600 with just the mildest motion processing. The increased motion resolution outweighs the minor artifacts it introduces, and it's the preferred calibration for the TV. The artifacts/soap opera effect aren't noticeable at de-judder 1, and the picture quality increase is worth it.

HDR also adds lag. It's unknown if LG will get an HDR Game Mode into the 2016 sets. Fingers are crossed. It's not horrific, but it's probably not ideal for multiplayer gaming.

If input lag is the concern, calibrate your Game Mode for gaming.

2. There's not really a 4K calibration disc yet, but in theory it shouldn't matter for calibrating SDR content. Maybe the sharpness patterns and such. Brightness/color patterns won't be much affected.

5. My point is they are basically the same except for their starting point, so it doesn't matter which one you use if the light in the room is always the same. I have ISF Bright dialed in for daytime/lights on, and ISF Dark for nighttime in dim light or darkness.

I know about HDR and lag, but I'm saying I don't notice a difference when I'm playing a game with HDR on. It feels just like it did on my previous TV, which is what I was hoping for.

What I wanted to know was whether enabling TruMotion increases the lag. If not, then I'll give it a shot, but I wanted to make sure. I only tried it for a second, and it was during a driving sequence, so I wasn't positive.
 
I believe you should be using wide color gamut for HDR content.

Yeah, it is a little confusing, because I understand the logic... but then I have read both these threads:

http://www.avsforum.com/forum/40-ol...106-lg-oled-55-65e6-info-issues-settings.html
http://www.avsforum.com/forum/40-ol...89905-2016-oled-hdr-calibration-settings.html

It does not seem conclusive... especially the first thread; it says 'Color Gamut should be set to Normal for accurate colors in HDR modes' on the first page under Dolby Vision & HDR.

The difference between Normal and Extended color gamut is HUGE in Ratchet & Clank. One is muted, and the other oversaturated. Wide is a good in-between. Grrrr, calibrating HDR is almost impossible as there are no reference discs lol.
 
Yeah, it is a little confusing, because I understand the logic... but then I have read both these threads:

http://www.avsforum.com/forum/40-ol...106-lg-oled-55-65e6-info-issues-settings.html
http://www.avsforum.com/forum/40-ol...89905-2016-oled-hdr-calibration-settings.html

It does not seem conclusive... especially the first thread; it says 'Color Gamut should be set to Normal for accurate colors in HDR modes' on the first page under Dolby Vision & HDR.

The difference between Normal and Extended color gamut is HUGE in Ratchet & Clank. One is muted, and the other oversaturated. Wide is a good in-between. Grrrr, calibrating HDR is almost impossible as there are no reference discs lol.

As someone who typically likes a lot of vibrancy in colors (again, anime fan), which of the three color gamuts would be best?

Should I also have separate settings for 1080p, 4K and 4K+HDR? Or just use the same values for all three?
 
Hey jstevenson, wondering if you ever got a chance to set up the HDR mode on your OLED for Ratchet & Clank? If so, would you mind sharing the settings?

I am really confused about which color gamut to use in HDR mode... according to the OLED thread on AVS Forum, 'Normal' is most accurate. But of course, everyone has differing opinions on the matter, most of which back up their assertions with facts lol.

What color gamut looks most accurate to you for Ratchet & Clank in HDR?

99% sure I'm on Normal.
 
Normal is the correct option for HDR.
What I wanted to know was whether enabling TruMotion increases the lag. If not, then I'll give it a shot, but I wanted to make sure. I only tried it for a second, and it was during a driving sequence, so I wasn't positive.

It adds a lot of lag, yes. Very noticeable to me when I started the FFXV demo and activated HDR. TruMotion is on by default when using HDR; thankfully you can turn it off. Still need an HDR Game Mode for further improvement.
 
I currently have a 65" KS8000, and while I do like the input lag and colors on the set, I'm not super impressed by much else. Namely, I have some irritating banding/DSE, light clouding, and multiple dead pixels, and I'm not really satisfied with the set's overall dimming implementation. The QC leaves a lot to be desired as well.

Thank you for posting this. After all my failed attempts at buying an LCD television, I decided last year to go OLED, no matter the cost. Recently, I've started to falter and think about trying a KS8000 because they are so cheap this year. But every LCD I've ever tried had nasty DSE or banding, and I just can't stand it. I've recommitted myself to OLED thanks to your post. I realize they don't have perfect uniformity either, but based on everything I've seen, DSE is less prevalent than on LCD panels. Shame about the price and input lag, but I think I can deal with that better than DSE. I don't play competitively.
 
Thank you for posting this. After all my failed attempts at buying an LCD television, I decided last year to go OLED, no matter the cost. Recently, I've started to falter and think about trying a KS8000 because they are so cheap this year. But every LCD I've ever tried had nasty DSE or banding, and I just can't stand it. I've recommitted myself to OLED thanks to your post. I realize they don't have perfect uniformity either, but based on everything I've seen, DSE is less prevalent than on LCD panels. Shame about the price and input lag, but I think I can deal with that better than DSE. I don't play competitively.

For what it's worth, I have no DSE on this model, but YMMV.

Also, for the guy with dead pixels: try rubbing them gently with a microfiber cloth. I had two pixels I thought were dead out of the box, but as soon as the cloth touched the screen they turned on.
 
As someone who typically likes a lot of vibrancy in colors (again, anime fan), which of the three color gamuts would be best?

Should I also have separate settings for 1080p, 4K and 4K+HDR? Or just use the same values for all three?

Well Soldier, I will post my settings for you later. There is no difference in settings between resolutions (1080p or 4K), but there are differences between content types and time of day (ISF Day for movies and games, ISF Night for movies and games, and HDR).

The biggest difference between movie and game settings is 'Clear Motion'. This doubles motion resolution from 350 to 650, but it should not be used when playing games, because when engaged it makes the lag feel upwards of 100ms lol.

HDR has its own separate settings, and unfortunately I am still a little confused about how to properly calibrate it. Also, I don't believe these OLED sets are bright enough to view HDR in the daytime. At least in my experience.

99% sure I'm on Normal.

Thanks!

I think for better REC 2020 coverage in HDR you'll want to set Colour Gamut to Wide.

Yeah, this is why I am a little confused... but this is what a calibrator had to say about the color gamut on the AVS forum:

"Normal is most accurate. Extended is very close and ok to use if you like things balanced in the more saturated direction.

As soon as it gets an HDR signal it goes to P3 in a BT2020 container.

Wide and, to a lesser extent, Extended just oversaturate lower saturation levels; they do not extend the gamut out any farther to P3 or BT2020."
 
I thought HDR Premium certified TVs can cover around 70-75% of Rec. 2020, and that UHD content supports it as of May 2016?

Yeah, support, yes.

But it's my understanding almost all of the stuff out there now is mastered towards DCI-P3 (so it's DCI-P3 inside of a Rec2020 container ---- i.e., you're getting rec2020, but everything is just to the limits of DCI-P3, and if you set yourself for wide/extended, it'll just oversaturate... not in a terribly bad way, but in an inaccurate one)

75% of rec2020 is not even close really, so you're gonna end up clipping or not getting the correct image if your TV can't handle it (especially depending on how it handles colors it can't display...)

It's better to have 98% coverage and accurate DCI-P3, which is still way better (and brighter) than Rec.709/the old TV standard, than it is to try and fail miserably at Rec.2020. You get into all sorts of roll-off issues and such, which is where HDR is having its biggest challenges in standardization right now.

Don't quote me on this, because I don't know everyone's implementations, but I am pretty sure XB1/PS4 are all DCI-P3 in terms of target colorspace. And I think all of the UHD discs out there are DCI-P3 too.

Long term the support to move up to rec2020 is there if displays end up getting there for UHD BRD and such, but right now DCI-P3 is the standard for HDR at home.

Also all of this only applies right now (and I'm not up to date in the last month or two) - as this shit changes faaaaaaast.
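
For what it's worth, those coverage figures can be sanity-checked from the published primary chromaticities. This is my own back-of-the-envelope sketch (not something from the thread): comparing the R/G/B triangle areas in the CIE 1931 xy diagram with the shoelace formula puts DCI-P3 at roughly 72% of Rec.2020, right in the 70-75% range mentioned above.

```python
# Compare color-gamut triangle areas in the CIE 1931 xy chromaticity diagram.
# The (x, y) primaries are the published values for each standard.
PRIMARIES = {
    "rec709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "dci_p3":  [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "rec2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts):
    """Shoelace formula for the area of the R/G/B primary triangle."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

rec2020 = triangle_area(PRIMARIES["rec2020"])
p3_share = triangle_area(PRIMARIES["dci_p3"]) / rec2020
rec709_share = triangle_area(PRIMARIES["rec709"]) / rec2020
print(f"DCI-P3 ~{p3_share:.0%} of Rec.2020, Rec.709 ~{rec709_share:.0%}")
```

Note that gamut coverage is also often quoted in the CIE 1976 u'v' diagram, where the percentages come out slightly different, so published numbers vary a few points either way.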
 
Still obsessively playing around with the settings trying to find that happy medium.

As far as 4K+HDR content goes, I think it's pretty much perfect. No complaints there.

But for 1080p content (namely games), things seem a little more washed out and less colorful/vibrant. This is where I'm struggling the most. My opinions on some of the settings:

1. As stated, TruMotion is no good for games. There's a very noticeable input lag (and this is coming from the person who has no problems with how games control under HDR) that makes it not worth using. Haven't tried it with movies yet.

2. I see no downside keeping Super Resolution at High. For the 4K stuff it really spruces everything up, and doesn't seem to affect 1080p stuff in any meaningful way.

3. I'm constantly struggling between the Dynamic Contrast settings. High seems overkill, medium or low seem beneficial, or perhaps I should have it off completely. It's really hard to decide.

4. No idea about Gamma. I'm bouncing between 2.4 and 1.9. Some content seems better/worse with one setting vs the other.

5. Also no idea what Edge Enhancer does, not seeing any notable change keeping it on or off.

6. I also have no idea how H and V sharpness work, or what's ideal.

7. Color Gamut seems totally worth keeping on Wide.

8. Black Level also seems better to keep on low vs high.

I think for 1080p my issue is less the brightness and more that the image is a little soft. Would messing with the sharpening filters help with that? Also debating between color at 70 or 80 (right now I have it at 75).
 
Still obsessively playing around with the settings trying to find that happy medium.

As far as 4K+HDR content goes, I think it's pretty much perfect. No complaints there.

But for 1080p content (namely games), things seem a little more washed out and less colorful/vibrant. This is where I'm struggling the most. My opinions on some of the settings:

1. As stated, TruMotion is no good for games. There's a very noticeable input lag (and this is coming from the person who has no problems with how games control under HDR) that makes it not worth using. Haven't tried it with movies yet.

2. I see no downside keeping Super Resolution at High. For the 4K stuff it really spruces everything up, and doesn't seem to affect 1080p stuff in any meaningful way.

3. I'm constantly struggling between the Dynamic Contrast settings. High seems overkill, medium or low seem beneficial, or perhaps I should have it off completely. It's really hard to decide.

4. No idea about Gamma. I'm bouncing between 2.4 and 1.9. Some content seems better/worse with one setting vs the other.

5. Also no idea what Edge Enhancer does, not seeing any notable change keeping it on or off.

6. I also have no idea how H and V sharpness work, or what's ideal.

7. Color Gamut seems totally worth keeping on Wide.

8. Black Level also seems better to keep on low vs high.

I think for 1080p my issue is less the brightness and more that the image is a little soft. Would messing with the sharpening filters help with that? Also debating between color at 70 or 80 (right now I have it at 75).
Yikes. Some crazy settings you have there. Don't you have an OLED?

Just choose a suitable OLED Light level for your environment, contrast at 80-90, brightness at 51/52, color at 50, sharpness at 0-10, gamma at 2.2 for daytime and 2.4 for nighttime, and turn all post-processing off (Edge Enhancer, Super Resolution), as they cause artifacts. If you want to use Dynamic Contrast, go ahead, but it shouldn't be necessary at all.

Edit: Are you sure you have matching RGB levels for your TV and console? Could be why it's washed out. With the settings above, SDR gaming should definitely not look washed out. At all.
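
On the RGB-levels point: a mismatch between limited-range video (black = 16, white = 235) and full-range (0-255) is the classic cause of a washed-out SDR picture, because limited-range black ends up displayed as dark gray. A minimal sketch of the standard limited-to-full expansion, assuming plain 8-bit levels (my own illustration, not an LG or console formula):

```python
def limited_to_full(code: int) -> int:
    """Expand a limited-range (16-235) 8-bit video level to full range (0-255)."""
    scaled = round((code - 16) * 255 / 219)
    return max(0, min(255, scaled))  # clamp footroom/headroom codes

# When TV and console agree, limited black/white map to true black/white:
print(limited_to_full(16), limited_to_full(235))  # 0 255

# When they disagree, the TV shows code 16 as-is: a dim gray instead of
# black, and nothing ever reaches full white -- the "washed out" look.
```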
 
Yikes. Some crazy settings you have there. Don't you have an OLED?

Just choose a suitable OLED Light level for your environment, contrast at 80-90, brightness at 51/52, color at 50, sharpness at 0-10, gamma at 2.2 for daytime and 2.4 for nighttime, and turn all post-processing off (Edge Enhancer, Super Resolution), as they cause artifacts. If you want to use Dynamic Contrast, go ahead, but it shouldn't be necessary at all.

Edit: Are you sure you have matching RGB levels for your TV and console? Could be why it's washed out. With the settings above, SDR gaming should definitely not look washed out. At all.

Again, a lot of the OLED tech is brand new for me, so I'm pretty much going with what looks pleasing to my eyes. Color might be a tad too high, so I'll continue to play around with it.

And once more, the 4K content with HDR that I've seen is pretty much perfect. No complaints there. Maybe it's switching from that to 1080p that makes it look worse to me than it actually is, but I don't think so.

Here's an example: I've been watching a lot of Bob's Burgers recently on Netflix (love it) on my previous TV:

(screenshot of the Bob's Burgers intro)

On the intro, the white background isn't nearly as bright white as it was in the last TV. It looks more washed out and dimmer, and it carries throughout the episode. On the other hand, I tried another colorful series (Young Justice) and that seemed perfectly fine. Again, I need to watch more content and continue to tinker with things.

Also, what is SDR?
 
Thank you for posting this. After all my failed attempts at buying an LCD television, I decided last year to go OLED, no matter the cost. Recently, I've started to falter and think about trying a KS8000 because they are so cheap this year. But every LCD I've ever tried had nasty DSE or banding and I just can't stand it. I've recommitted myself to OLED thanks to your post. I realize they are not perfect uniformity either but based on everything I've seen the DSE is less prevalent than LCD panels. Shame about the price and input lag, but I think I can deal with that better than DSE. I don't play competitively.

Of course, glad I could help in any way!

For me, after owning a large LCD for the first time in my life, I realized that visible uniformity issues/DSE are the worst, most distracting things by far. It's one thing to see issues in test patterns and solid color screens and slides, but if I see it at all during regular content, either directly or with my peripheral vision, it's a deal breaker.

I am still potentially willing to play the panel lottery and get another Samsung. Maybe try a KS9000 this time, due to its better overall construction, and hope for the best. One other thing I forgot to mention that I love about the Samsung is the HDR highlights on 4K Blu-rays. The 1000 nits were incredible in movies like Star Trek Beyond and X-Men: Apocalypse. But then my enjoyment was hampered by light leaking into the letterbox bars from the top and bottom edge lighting. Always something, it seems.

Even though I know OLED has its own issues and perfection doesn't exist, I have a feeling I'll only get that "blown away" feeling with OLED, and that's exactly what I'm looking for in an upgrade.

On a side note, here are some (less than ideal) pictures I took of my set's uniformity on white and gray, and a strange diagonal dead pixel group in that last photo, if curious: http://m.imgur.com/a/KQ4eW

Good luck on your choice!
 
TV-related PC/receiver question that maybe someone can answer: I think I had this issue before, but now I can't fix it. I'm trying to play Skyrim SE on my TV, but I noticed the framerate was horrible. It must have something to do with my receiver, because it works fine going from my PC directly to my monitor, but when going through the receiver, the Skyrim settings detector says it cannot detect my video card and sets the graphics to low.
I currently have the PC going to the receiver via HDMI, and then HDMI out from the receiver to the TV. Even when I try setting that HDMI input to passthrough, it doesn't work correctly. Any ideas?
 
Started up Forza Horizon 3 for the first time yesterday after getting the early test firmware for the Vizio (P50-C1). This is absolutely the game that has sold me on HDR, and on the TV's capability. It seems whatever firmware I had before yesterday may not have been playing nice with HDR, and I noticed improvements in other titles as well after the update, but it was this game that truly showed me how significant a difference it can be, and that it wasn't what I feared (that the TV I got just wasn't any good at it or something).
 
On the intro, the white background isn't nearly as bright white as it was in the last TV. It looks more washed out and dimmer, and it carries throughout the episode. On the other hand, I tried another colorful series (Young Justice) and that seemed perfectly fine. Again, I need to watch more content and continue to tinker with things.

Also, what is SDR?

The perception of white is highly affected by color temperature. We have two different tablets, and as long as you use just one, it looks normal. If you switch back and forth, one looks yellow and one looks blue.

You just need to let your eyes adjust. What you may be seeking might be nowhere near accurate in the first place. If you want it nailed, hire a calibrator.
 
Started up Forza Horizon 3 for the first time yesterday after getting the early test firmware for the Vizio (P50-C1). This is absolutely the game that has sold me on HDR, and on the TV's capability. It seems whatever firmware I had before yesterday may not have been playing nice with HDR, and I noticed improvements in other titles as well after the update, but it was this game that truly showed me how significant a difference it can be, and that it wasn't what I feared (that the TV I got just wasn't any good at it or something).

Yeah, Matt just sent out the new firmware to people who post on AVS. It's got a lot of fixes, and the main one for me is that it fixes HDR on Xbox and should fix HDR on PS4 Pro when not choosing automatic. Now I'm not sure if that update helps with the input lag (I don't notice it anyway) or if the next update is supposed to address that in early 2017.
 
Yeah, Matt just sent out the new firmware to people who post on AVS. It's got a lot of fixes, and the main one for me is that it fixes HDR on Xbox and should fix HDR on PS4 Pro when not choosing automatic. Now I'm not sure if that update helps with the input lag (I don't notice it anyway) or if the next update is supposed to address that in early 2017.
Matt said the HDR10 lag (which is what these consoles are using) is also addressed in this update, and to refer to his initial post about improvements for numbers. In that post, he said they were going to bring the additional HDR lag down to 0 or really close to it (it was adding 15ms before). So the input lag for SDR and HDR with GLL on should be the same now: ~38ms if Rtings' SDR measurements are accurate, ~45ms according to Matt's stated numbers for SDR.

And yeah, like you I honestly didn't notice the lag to begin with. My main concern was there being a discrepancy between HDR and SDR that could make me start noticing it. But that shouldn't be an issue anymore if they managed to do what they set out to do.
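
To put those millisecond figures in perspective, here's a quick conversion to frames at 60 fps (my own arithmetic, not Matt's numbers):

```python
def lag_in_frames(lag_ms: float, fps: float = 60.0) -> float:
    """Convert input lag in milliseconds to refresh periods ("frames")."""
    return lag_ms * fps / 1000.0

print(round(lag_in_frames(38), 2))       # 2.28 frames at 60 fps
print(round(lag_in_frames(38 + 15), 2))  # the old +15ms HDR penalty ~= one extra frame
```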
 
What is the color gamut supposed to be for movies and games in SDR, and the same for HDR? Normal or Wide? (OLED C6)
 
The perception of white is highly affected by color temperature. We have two different tablets, and as long as you use just one, it looks normal. If you switch back and forth, one looks yellow and one looks blue.

You just need to let your eyes adjust. What you may be seeking might be nowhere near accurate in the first place. If you want it nailed, hire a calibrator.

That could be the case, too. I don't suppose there's an estimated time when my eyes would conceivably adjust?

I've had weird cases where, over time, the picture looks brighter and more vibrant after back-and-forth glances between the TV and my iPad. I don't know if that's the OLED adjusting itself after some time has passed (does it do that?) or my eyes adjusting.
 
TV-related PC/receiver question that maybe someone can answer: I think I had this issue before, but now I can't fix it. I'm trying to play Skyrim SE on my TV, but I noticed the framerate was horrible. It must have something to do with my receiver, because it works fine going from my PC directly to my monitor, but when going through the receiver, the Skyrim settings detector says it cannot detect my video card and sets the graphics to low.
I currently have the PC going to the receiver via HDMI, and then HDMI out from the receiver to the TV. Even when I try setting that HDMI input to passthrough, it doesn't work correctly. Any ideas?

First off, I wouldn't use the Skyrim setting detector. Just set your resolution and settings manually in the launcher. Does it not work when you do this?

Are you using 1080p or 4K?

Also, how old is your receiver? If it's old it may not be 4K/60 capable, or maybe the HDMI cable from the receiver to the TV isn't up to snuff for 4K bandwidth.
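
The 4K/60 question is ultimately a bandwidth question. A rough sketch using the standard 4K60 timing (my own numbers, not from the thread): 8-bit 4:4:4 at 3840x2160/60 needs a 594 MHz pixel clock, and after TMDS 8b/10b coding that's ~17.8 Gbps, beyond what HDMI 1.4-era gear (10.2 Gbps) can pass.

```python
# Estimate TMDS bandwidth for a 4K60 8-bit 4:4:4 HDMI signal.
H_TOTAL, V_TOTAL = 4400, 2250  # 3840x2160 active plus blanking (CTA-861 timing)
REFRESH_HZ = 60
BITS_PER_PIXEL = 24            # 8 bits per channel, RGB / 4:4:4
TMDS_OVERHEAD = 10 / 8         # 8b/10b line coding

pixel_clock_hz = H_TOTAL * V_TOTAL * REFRESH_HZ
gbps = pixel_clock_hz * BITS_PER_PIXEL * TMDS_OVERHEAD / 1e9
print(f"pixel clock {pixel_clock_hz / 1e6:.0f} MHz, ~{gbps:.2f} Gbps")
# HDMI 1.4 tops out at 10.2 Gbps; HDMI 2.0 at 18 Gbps -- hence 4K60 needs 2.0.
```

A receiver that can't do this will typically drop to 4K/30 or refuse to negotiate the mode, which looks exactly like "the framerate is horrible."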
 
Has anyone ever bought an open-box excellent certified item from Best Buy? A BB near me has one of the KS8000 for $940. I might have to go get this today. I don't see myself coming across this at that price again.
 
Has anyone ever bought an open-box excellent certified item from Best Buy? A BB near me has one of the KS8000 for $940. I might have to go get this today. I don't see myself coming across this at that price again.

I saw a deal at Sam's Club for a new one for $899. Check if your Best Buy will price-match it if y'all have a Sam's Club nearby.
 
I saw a deal at Sam's Club for a new one for $899. Check if your Best Buy will price-match it if y'all have a Sam's Club nearby.

I should have specified this is the 55". I just did a search for Sam's Club, and the 49" was the one at $899. Unless there's a Black Friday sale on the 55" at $899.
 
Has anyone ever bought an open-box excellent certified item from Best Buy? A BB near me has one of the KS8000 for $940. I might have to go get this today. I don't see myself coming across this at that price again.

What size? I think the 55" has been as low as $997 new, from Amazon (although it's currently out of stock):

http://camelcamelcamel.com/Samsung-UN55KS8000-55-Inch-Ultra-Smart/product/B01C5TFLSE

Click the "Amazon" tab and look at the price history (it defaults to third party right now, as Amazon apparently has temporarily stopped selling it). I'm guessing we'll see this discounted to $999 or below for Black Friday.

If it's the 65", disregard.
 
What size? I think the 55" has been as low as $997 new, from Amazon (although it's currently out of stock):

http://camelcamelcamel.com/Samsung-UN55KS8000-55-Inch-Ultra-Smart/product/B01C5TFLSE

Click the "Amazon" tab and look at the price history (it defaults to third party right now, as Amazon apparently has temporarily stopped selling it). I'm guessing we'll see this discounted to $999 or below for Black Friday.

If it's the 65", disregard.

$999 seems to be the going price right now at Best Buy, Samsung.com, Amazon, etc. Best Buy's BF price is $997. I don't know how much lower it's going to get. In fact, I think it may bump back up to $1100 or $1200 after the holidays. An additional $60 off now is very tempting.
 
First off, I wouldn't use the Skyrim setting detector. Just set your resolution and settings manually in the launcher. Does it not work when you do this?

Are you using 1080p or 4K?

Also, how old is your receiver? If it's old it may not be 4K/60 capable, or maybe the HDMI cable from the receiver to the TV isn't up to snuff for 4K bandwidth.

No, it doesn't work when setting manually either; it just runs at <10 fps. It's just a 1080p TV, and my receiver is only 1-2 years old (Denon x1220w). I think it's affecting everything currently, because I tried the RE6 benchmark tool and, similarly, the performance was 10x worse through the receiver than directly to my monitor. I feel like it worked correctly a few months ago, because I have played a few PC games on my TV with no noticeable performance hits like this (but maybe I'm crazy and that was before I ever got my receiver).
 
Not enjoying all these comments about the amount of fiddling HDR requires to get right. What the hell are manufacturers doing? HDR10 seems like a right mess at the moment.
 
What is this exactly, and how is it achieved?

Two competing standards for HDR:
http://www.techhive.com/article/307...n-versus-hdr-10-tv-a-format-war-and-more.html

Dolby Vision uses dynamic metadata, which makes it better, but it seems they're going to update HDR10 with that too.
HDR10 is open for everyone to use; Dolby Vision has to be licensed (and runs off a chip, AFAIK), but consoles only support HDR10 for gaming, I think.

So unless you're talking movies, you probably shouldn't worry about Dolby Vision (though your LG OLED should have both).
 
Two competing standards for HDR:
http://www.techhive.com/article/307...n-versus-hdr-10-tv-a-format-war-and-more.html

Dolby Vision uses dynamic metadata, which makes it better, but it seems they're going to update HDR10 with that too.
HDR10 is open for everyone to use; Dolby Vision has to be licensed (and runs off a chip, AFAIK), but consoles only support HDR10 for gaming, I think.

So unless you're talking movies, you probably shouldn't worry about Dolby Vision (though your LG OLED should have both).

So how do I set it up for movies?
 
What is the color gamut supposed to be for movies and games in SDR, and the same for HDR? Normal or Wide? (OLED C6)

I was going to ask this. My LG has a Color Gamut option with Normal and Wide.

I did some Google work, and it seems like Wide needs to be ON for HDR content. Thing is, my TV doesn't have HDR; hell, it isn't even a 4K TV. So I don't know if I should use Normal or Wide.

Both look different color-wise, but I want to get the accurate source picture.
 
I have a question regarding the right format for a USB stick for the TV to read.
I have a Samsung KU6409 and a 128GB USB 3.0 stick, and I want to put large files on it for the TV to read.
Right now the TV can read what's on the stick, but I can't put large files on it.
What format do I have to use so I can put 20GB+ files on there while the TV can still read it?
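
Almost certainly this is FAT32's per-file size cap rather than a TV limitation: FAT32 stores each file's size in a 32-bit field, so a single file tops out just under 4 GiB, and a 20 GB copy fails even though the 128 GB stick has plenty of free space. Reformatting to exFAT (or NTFS) lifts the cap; recent Samsung sets generally read both, though that's worth verifying for the specific model. The arithmetic, as a sketch:

```python
# FAT32 keeps each file's size in a 32-bit field, capping any
# single file at 2**32 - 1 bytes (just under 4 GiB).
FAT32_MAX_FILE = 2**32 - 1

def fits_on_fat32(size_bytes: int) -> bool:
    return size_bytes <= FAT32_MAX_FILE

print(fits_on_fat32(20 * 10**9))  # False -- why the 20GB copy fails
print(FAT32_MAX_FILE / 2**30)     # cap per file in GiB, ~4.0
```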
 
Two competing standards for HDR:
http://www.techhive.com/article/307...n-versus-hdr-10-tv-a-format-war-and-more.html

Dolby Vision uses dynamic metadata, which makes it better, but it seems they're going to update HDR10 with that too.
HDR10 is open for everyone to use; Dolby Vision has to be licensed (and runs off a chip, AFAIK), but consoles only support HDR10 for gaming, I think.

So unless you're talking movies, you probably shouldn't worry about Dolby Vision (though your LG OLED should have both).

Everyone says "dynamic metadata" blah blah blah.

That's only PART of it.

Dolby Vision is just a better standard at the moment because the whole pipeline is standardized. Everything is mastered on one display, and the chipset knows how to handle the roll-offs and adjust to the display it's on.

HDR10 isn't really handling this stuff well. The metadata standards are a mess as a whole and vary wildly between content, and the displays all handle it differently. Some content will look great, and others have issues (and issues that can be completely different - whether it be white or black issues, or slight color / saturation shifts). Hence all the fiddling. Dolby Vision on the other hand just works and looks sensational.

Dynamic metadata should improve things from scene to scene in HDR10, but they really need to get everyone on the same page in terms of the data they are sending, while the TV manufacturers need to do a better job of it too. Either side can cause an issue (improperly mastered content, or a display that can't display the whole range or doesn't handle the metadata well).

All that said, we're getting into details here. The mass audience will probably see the slightly incorrect HDR10 as better than anything they've ever had on their TV before (more colors, brighter, more accurate out of the box) --- it's the AV nerds that this seems specifically designed to torment right now.
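
For the curious, the HDR10 metadata being argued about is a single static block per stream: SMPTE ST 2086 mastering-display info plus the MaxCLL/MaxFALL content light levels. A hypothetical sketch of those fields in code (the class and field names are my own illustration; the values are typical of a 1000-nit P3/D65 master, not from the thread):

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """ST 2086 mastering-display info plus CTA-861.3 content light levels."""
    primaries_xy: dict         # mastering display's R/G/B chromaticities
    white_point_xy: tuple
    max_luminance_nits: float  # peak of the mastering display
    min_luminance_nits: float
    max_cll: int               # brightest single pixel in the content (nits)
    max_fall: int              # highest frame-average light level (nits)

# Typical values for content mastered on a 1000-nit P3/D65 display.
meta = HDR10StaticMetadata(
    primaries_xy={"r": (0.680, 0.320), "g": (0.265, 0.690), "b": (0.150, 0.060)},
    white_point_xy=(0.3127, 0.3290),
    max_luminance_nits=1000.0,
    min_luminance_nits=0.0001,
    max_cll=1000,
    max_fall=400,
)

# This one block covers the whole stream; a 600-nit panel has to invent
# its own roll-off for everything between 600 nits and MaxCLL.
headroom_to_tone_map = meta.max_cll - 600
print(headroom_to_tone_map)  # 400
```

Since nothing in the block says *how* to roll off that headroom, each TV maker improvises, which is exactly the inconsistency described above; Dolby Vision ships per-scene metadata plus a defined mapping instead.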
 
I was going to ask this. My LG has a Color Gamut option with Normal and Wide.

I did some Google work, and it seems like Wide needs to be ON for HDR content. Thing is, my TV doesn't have HDR; hell, it isn't even a 4K TV. So I don't know if I should use Normal or Wide.

Both look different color-wise, but I want to get the accurate source picture.

For HDR10 on an LG OLED, you want Normal.

Wide oversaturates things.

It doesn't make sense, but that's the reality!
 