
Television Displays and Technology Thread: This is a fantasy based on OLED

ElNino

Member
So which UHD Blu-ray players do you guys have?

For my B6 I'm looking at the Samsung K8500, LG UP970 and the Sony X800.

Since apparently the Dolby Vision update for the UP970 was released and then taken away again, I'm leaning towards the X800 for the quieter drive. Do you guys think the Sony might get a DV update sometime in the far away future?
I have the LG UP970, mainly due to it being cheap along with buying the B7. I wanted to get the Oppo, but I couldn't justify the extra $600. I've only used it with a few UHD discs thus far (Guardians 2, Planet Earth 2), but it's okay I guess. Going back to a tray-based loader instead of slot-loading (the PS3 has been my primary Blu-ray player) makes it feel a bit cheap to me, but image quality seems good. I like that I can split the audio and video HDMI connections on the UHD players so that I can have different TV settings for the UHD player versus those for the PS4/X1.

I can't imagine playing at anything other than ISF Dark, or even Cinema/Cinema Home in HDR. Even on ISF Dark, blues and reds in particular are almost retina-searing. I've had calibrated Panasonics for years, and even Standard makes everything look like Willy Wonka, let alone the Vivid picture modes.
Yeah, I feel the same. I wonder if it has anything to do with coming from plasma or LCD to the OLED, as I guess expectations from an LCD might be different than plasma. I came from (and still have) a Samsung plasma and the Cinema/ISF Dark settings (HDR or not) are more than bright enough for me.
 

psychotron

Member
So is Vudu going to support HDR10? I saw a bunch of articles from last year saying they were going to support it as well as DV, but nothing since. Kinda sucks because I use Vudu for most movie viewing.
 

FoxSpirit

Junior Member
When was this ever a feature for the 2017 models? I did a fair bit of research when comparing both years, and reduced image retention wasn't something I ever saw mentioned.

And as far as I’m aware, the dimmer HDR affects both 2016 and 2017 models. Could be wrong there though.
Rtings' review. Compare the numbers between the 2016 and 2017 models. Huge difference.
 
I have the 900E and I was hoping for more firmware updates. I got it a month or two after launch and I think there have only been two updates. It's not a buggy set, but I was hoping my first new TV in 7 years would have more performance improvements via software. I just assumed that was a thing.
 

Kyoufu

Member
I was going to pick up a UHD BD player but I'm going to wait for CES 2018 to see what my options are in regards to players with both Dolby Vision and HDR10+ formats.

According to Vincent Teoh, word on the street is that there should be some HDR10+ announcements at the event.
 

BumRush

Member
I was going to pick up a UHD BD player but I'm going to wait for CES 2018 to see what my options are in regards to players with both Dolby Vision and HDR10+ formats.

According to Vincent Teoh, word on the street is that there should be some HDR10+ announcements at the event.

Smart. It's only a few months now and I could see that tech exploding at CES
 

ApharmdX

Banned
Disagree with bolded. They're closer than you think. Obviously, when pitch-black is involved, there's a visible difference. In evenly lit scenes and HDR scenes, they're very close.

Most quality VA-based LCDs will do a decent job with evenly-lit scenes and HDR where the whole screen is bright. The OLEDs pull away in any scene where there's a lot of contrast difference; particularly in HDR where you have a dark scene with specular highlights. OLED spoils you with the per-pixel brightness accuracy and zero blacks. At this point I can't go back to LCD (my B7 replaced a Vizio LCD with FALD).

The 900E also has a lot of other flaws compared to the 2017 LGs- worse input lag, Android TV, pixel response issues, poor scaling at 1080@120, etc. And it's not that cheap of a set either. That makes it harder to recommend compared to the TCL or Vizio displays.
 
Most quality VA-based LCDs will do a decent job with evenly-lit scenes and HDR where the whole screen is bright. The OLEDs pull away in any scene where there's a lot of contrast difference; particularly in HDR where you have a dark scene with specular highlights. OLED spoils you with the per-pixel brightness accuracy and zero blacks. At this point I can't go back to LCD (my B7 replaced a Vizio LCD with FALD).

The 900E also has a lot of other flaws compared to the 2017 LGs- worse input lag, Android TV, pixel response issues, poor scaling at 1080@120, etc. And it's not that cheap of a set either. That makes it harder to recommend compared to the TCL or Vizio displays.

That's very selective nitpicking if you ask me. Given the fact that the LG OS experience is far from perfect, I'd say they basically even each other out on that front.

And fwiw, my main advice is not 'get the X900E', it's to wait for 2018 models. I think we'll see a big leap in features and HDR tech. However, between the two 2017 models, all things considered (price among them) I had a more pleasant experience with the Sony than the LG.
 

MazeHaze

Banned
That's very selective nitpicking if you ask me. Given the fact that the LG OS experience is far from perfect, I'd say they basically even each other out on that front.

Eh, I wouldn't say high input lag at 1080p is selective nitpicking. Neither is bad upscaling at 1080 120 if you use it as a monitor (a lot of us do)

I owned a Sony TV with Android OS last year (X850D) and it was pretty terrible. Edit: (and the artifacts at 1080p 120Hz were one of the main reasons I returned it too.) I'd say the IR pointer on the LG remote alone is a substantial upgrade. I've owned the KS8000 too, and LG's webOS stomps Tizen as well. It's easily the best imo.
 

ApharmdX

Banned
That's very selective nitpicking if you ask me. Given the fact that the LG OS experience is far from perfect, I'd say they basically even each other out on that front.

And fwiw, my main advice is not 'get the X900E', it's to wait for 2018 models. I think we'll see a big leap in features and HDR tech. However, between the two 2017 models, all things considered (price among them) I had a more pleasant experience with the Sony than the LG.

My experience with LG's OS has been overwhelmingly positive. And my experience with Android TV has been overwhelmingly negative. I don't think that Android TV feels like a premium TV OS, where LG's does. What sorts of problems have you had with LG's OS?
 

Kambing

Member
Played around with my old EC9300 OLED... screw the haters, the curved OLEDs of old continue to be great. For close-proximity usage (5.5 feet away), I honestly prefer the display curved as opposed to flat. Kind of sad we won't see more curved OLEDs.
 

Mrbob

Member
That's very selective nitpicking if you ask me. Given the fact that the LG OS experience is far from perfect, I'd say they basically even each other out on that front.

And fwiw, my main advice is not 'get the X900E', it's to wait for 2018 models. I think we'll see a big leap in features and HDR tech. However, between the two 2017 models, all things considered (price among them) I had a more pleasant experience with the Sony than the LG.
That's going to be the case every year. 2017 had a jump in HDR over 2016. 2018 will jump over 2017. 2019 over 2018. Etc. Next year might be gen 1 HDMI 2.1; not sure if that's truly the year to go in. I'd wait for gen 2 HDMI 2.1 TVs.
 
I'm genuinely curious where all this IR/burn in conversation comes from when discussing 2017 models.

I use my C7 almost exclusively for gaming, probably 90%, and haven't had a single issue yet with even a whiff of IR. I play on ISF Dark, and color simply dropped by a couple points.

I just finished marathoning through the new XCOM expansion on PC, probably ~30ish hours, and there's a HUD for practically the entire game. No IR at all.

I play Overwatch on PC on a pretty regular basis, enough so that I burned some of the HUD into my Panasonic ST60. No IR on the C7. I somehow managed to put better than 80 hours into No Man's Sky since 1.3 dropped. HUD for practically the entire game, and no IR there either.

I believe the X7 sets run a compensation cycle automatically the next time you turn off the set after 4 hours of cumulative use. Maybe that's something that the X6's don't do?

Either way, I have yet to see any IR at all, let alone "burn in", anywhere near my set....

Your experience seems pretty normal. I'm totally buying a B7 by November. The Rtings test isn't scaring me at all yet.
 
Rtings' review. Compare the numbers between the 2016 and 2017 models. Huge difference.

That's interesting. Is it fair to say Rtings tests for IR after calibration? If that's indeed the case, we're looking at a C7 set to 100 contrast, and 16 OLED light, while the B6 is at 100 and 40, respectively. I understand the B6 has much more aggressive ABL, but assuming both panels are accurately calibrated to 100 cd/m², I suppose one could argue the panel on the B6 is probably being driven considerably harder within its specifications to produce the same 100 cd/m², thus being more prone to IR. Like I said, it's all very interesting.
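That "driven harder" argument can be put as a quick back-of-the-envelope sketch. To be clear, this assumes drive scales roughly linearly with the OLED Light setting, which is a simplification, not a measured fact about these panels:

```python
# Rough comparison of how hard each panel is driven to hit the same
# 100 cd/m^2 calibration target. The linear-scaling assumption is mine.

def drive_ratio(light_a, light_b):
    """How much harder panel A is driven than panel B for the same output,
    assuming drive is proportional to the OLED Light setting."""
    return light_a / light_b

# B6 calibrated at OLED Light 40, C7 at 16, both targeting 100 cd/m^2:
print(drive_ratio(40, 16))  # -> 2.5, i.e. the B6 is driven ~2.5x harder
```

If that loose assumption holds even approximately, it would line up with the B6 showing more IR in the Rtings numbers.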
 

kwabi

Member
I hope this is the right thread for this, but couldn't find the answer anywhere else.

I have my PS4 Pro hooked up to my LG C7 (HDMI2 port). When I turn on my PS4 with my PS4 controller, the TV turns on as well. This is pretty convenient and I like this feature! However, when I turn on my TV to watch the native Netflix app/etc., my PS4 will power on by itself. This part is annoying, because I don't want to use my PS4 and have to turn it off manually.

Is there a way I can make it so my PS4 doesn't turn on when I turn on my TV? I do like that my TV turns on when I actually intend to use my PS4 (when I turn on my PS4 via its controller).
 

MazeHaze

Banned
Your experience seems pretty normal. I'm totally buying a B7 by November. The Rtings test isn't scaring me at all yet.
Yeah, I have a B7. I've been pulling up a red test pattern on occasion and haven't seen anything yet. I've played a few hours of Overwatch, a couple dozen hours of Destiny 2, loads of web browsing (my PC is plugged into it) and Netflix. Haven't seen anything yet. I haven't even seen any normal IR, to be honest. Hopefully it stays that way.

221 hours on the panel currently.
 

Crowza

Member
My experience with LG's OS has been overwhelmingly positive. And my experience with Android TV has been overwhelmingly negative. I don't think that Android TV feels like a premium TV OS, where LG's does. What sorts of problems have you had with LG's OS?

This.

LG's version of webOS (3.5) is far superior to Android's implementation on televisions; from a UI perspective it's not even close (I've been using 4K Sony sets for the Android comparison, unless someone has a better implementation to suggest). webOS is on a different level altogether from everything else.

It has all of the streaming apps to fit my needs (Hulu, Amazon, Netflix, SiriusXM, WWE, Plex, Crackle, YouTube, Vudu, Sling, Google Play)... The only thing I see it missing is HBO Go. Your mileage may vary.

I prefer it to using my Xbox One or PS4 streaming apps, which is saying a bit for me, since I would normally take using either of them over any other television operating system.
 

Mrbob

Member
Yeah Web OS works great. It's fast and apps updated frequently. Seems to run better than anything else not Roku related.
 

DieH@rd

Banned
I have one question about the Android TV OS.

Can I somehow set it up so that the "recommended YT feed" located on the very top row of the "Home" listing showcases my YT subscriptions, or my YT recommendations, and not just random crap that is trending in my country?
 

RockmanBN

Member
Upgraded from a 42" Panasonic 1080p TV to the LG C7. Colors look so amazing when I started Destiny 2. One issue I have with the TV menu is that there's an audio lag when moving the cursor; the audio lags more than a second behind on the TV menu.
 

aaaaa0

Member

That sounds like "if a pixel is dimmer because it is worn (and you can tell by measuring the threshold voltage), drive it harder to compensate for the wear," which, to me, seems like it will just wear out the pixel faster.

It also sounds like it happens in real time, not something that has to be run in a separate "compensation cycle".


This one sounds like it's a technique to "address a particular OLED pixel based on some arrangement of reference and data lines, for the purposes of reading the threshold value", and not specifically a technique for a "compensation cycle".

I mean neither of these sounds like it really has to be run separately from just using the TV.

So what is a "compensation cycle"?

Is a "compensation cycle" visible when running on a OLED?
Do you see it cycling through patterns on the screen? You would think it would have to, if it wanted to measure what the threshold voltage is for various luminances of an OLED pixel.
Has anyone tried to measure the power consumption or the thermals when the screen is off to see if there is any actual activity on the screen?
What, if anything, is it actually doing?
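For what it's worth, the mechanism being described, measure each pixel's threshold-voltage shift and then drive worn pixels harder, can be sketched roughly like this. This is a toy model: real panels sense Vth through dedicated reference/data lines, and every number here is made up for illustration:

```python
# Toy model of per-pixel wear compensation: a worn OLED pixel shows a
# higher threshold voltage (Vth), so its drive level gets an offset to
# restore the original current, and thus luminance. Values illustrative.

REFERENCE_VTH = 2.0  # Vth of an unworn pixel (made-up units)

def compensation_cycle(measured_vth, base_drive):
    """Return adjusted drive levels: pixels whose Vth has shifted up
    (worn pixels) are driven harder by the amount of the shift."""
    return [drive + (vth - REFERENCE_VTH)
            for vth, drive in zip(measured_vth, base_drive)]

# Three pixels, two of them worn (Vth shifted up by 0.5 and 0.25):
print(compensation_cycle([2.0, 2.5, 2.25], [5.0, 5.0, 5.0]))
# -> [5.0, 5.5, 5.25]
```

Which also illustrates the concern above: the worn pixels end up driven hardest, so the compensation restores uniformity at the cost of accelerating their wear.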
 

aaaaa0

Member
That is actually astonishingly good for OLED, doubly so for the '16 models (from their reviews, temporary IR was much reduced on the '17s). I would have figured the issue was much closer to that of plasmas. If they did this test on any PDP, the logos would be so clear and well defined it'd look like they were still superimposed on every single one of their clean test patterns. The question is whether what's there is temporary or permanent; do they intend to run a loop without logos for an equal time to see if it wears off?

It might be astonishingly good in your opinion, but it's no longer acceptable for me.

I have been a plasma TV only owner until now (my last one was a Panasonic VT30), and I'm no longer happy with having to do anything at all about burn-in, especially for such an expensive device.

Like I posted earlier, my current personal standard for burn-in resistance is a Dell UltraSharp 24" monitor that I own (VA panel), which I accidentally left on the Windows logon screen displaying static white text on a black background for 6 months straight. The only outcome of that abuse was about a day or so of temporary image retention that went away by itself with no special treatment of the monitor. If OLED can survive that, it goes back on my list for consideration. Until then, it's off my list, especially with what we're seeing from the RTINGS test.

After blowing $10K+ over the years on plasma TVs, I now refuse to pay thousands of dollars for a display device that can't withstand abuse. I'm now willing to trade a marginal amount of picture quality for a device that is completely robust against permanent burn in.

The next TV I'm going to buy will be 75" or bigger with at least 1500 nits, and must be able to take as much abuse as I can dish on it without flinching.

Probably a Z9E or whatever model they're up to by the time HDMI 2.1 is around.

I'm almost certain now the next TV I buy is not going to be an OLED.

Put it this way, I wouldn't pay for a gaming PC where the system would overheat and incur permanent damage if I ran the CPU at 100% straight for two weeks.
Why should I buy a TV that I can't display whatever I want (including static images) for two weeks without permanent damage?
 

dsk1210

Member
Played around with my old EC9300 OLED... screw the haters, the curved OLEDs of old continue to be great. For close-proximity usage (5.5 feet away), I honestly prefer the display curved as opposed to flat. Kind of sad we won't see more curved OLEDs.


Funny you should mention that, when I went from my curved to the B6 I felt the flat screen was bending backwards.

The curved screen is definitely beneficial for close range.
 
Wait until 2018 or get the Sony. If you want to go OLED, get the 2017 series. Neither TV is perfect, though, and that's tech-wise. Not talking picture quality. With both of them, I get the feeling there's some kinks left to be worked out. I believe 2018 models will be built with HDR capability in mind first and foremost.

Prices for 2017 models will drop more during October-December. I'm getting the 65C7 during this period. Had the chance to grab a 65B6 pretty cheap, but as the above poster wrote, the software is messed up.
I wouldn't want to wait for the 2018 models, as that would mean waiting a whole year longer for their prices to drop.

I'd wait until the holiday sales to get the LG C7 or wait until next March for the 2018 OLED, which should improve uniformity and perhaps support variable refresh rate (it's a major update rather than 2017's minor update). Can't really recommend the 900E, it's not really on the same level of performance as the OLEDs. Also take a look at the Sony A1E this year if you like Sony's processing, prices on that should be down this fall.

Both good sets that you'd be happy with. I'll say that the bang-for-the-buck value of the 900E is hard to beat. The OLED will be superior with dark scenes, but the 900E will keep up in medium to bright scenes. You should be aware of the viewing angles on the 900e. As long as you are within a reasonable front facing "cone", it's fine. But, if your room has seats sitting 90 degrees from the TV, the view from them will have noticeably washed out colors.


Thanks for the advice all! I will wait until December at least - assuming I don't see any major sales on those two options, I'll probably end up waiting for the 2018 models. Not totally sure if OLED is worth it yet, but definitely want good HDR support. Gives me some time to wait for a price drop on the 4K Apple TV too.
 

nomis

Member
Ordered an LG UP970 from Costco a day before reading they pulled the firmware... does anyone know if there's a chance that mine will have the DV-enabled firmware out of the box? In that case I would just turn off updates until they re-release it. Then again, my first DV disc will probably be Spider-Man, which gives LG a few weeks to reissue it before I have a problem.
 

Ashhong

Member
Bought an LG OLED55B7 (EU) a couple of days ago, really magnificent TV. A question to everyone with a similar panel - is it normal for it to be slightly curved? I mean, it's barely noticeable; it can only be seen from the side.

Do you mean the top half of the screen from where it's just the panel?

Because yes! Noticed that on mine and it bothers me but I didn't know if this was a rare issue or not. It doesn't affect the video in any way but it's there...
 
My experience with LG's OS has been overwhelmingly positive. And my experience with Android TV has been overwhelmingly negative. I don't think that Android TV feels like a premium TV OS, where LG's does. What sorts of problems have you had with LG's OS?
Eh, I wouldn't say high input lag at 1080p is selective nitpicking. Neither is bad upscaling at 1080 120 if you use it as a monitor (a lot of us do)

I owned a Sony TV with Android OS last year (X850D) and it was pretty terrible. Edit: (and the artifacts at 1080p 120Hz were one of the main reasons I returned it too.) I'd say the IR pointer on the LG remote alone is a substantial upgrade. I've owned the KS8000 too, and LG's webOS stomps Tizen as well. It's easily the best imo.
This.

LG's version of webOS (3.5) is far superior to Android's implementation on televisions; from a UI perspective it's not even close (I've been using 4K Sony sets for the Android comparison, unless someone has a better implementation to suggest). webOS is on a different level altogether from everything else.

It has all of the streaming apps to fit my needs (Hulu, Amazon, Netflix, SiriusXM, WWE, Plex, Crackle, YouTube, Vudu, Sling, Google Play)... The only thing I see it missing is HBO Go. Your mileage may vary.

I prefer it to using my Xbox One or PS4 streaming apps, which is saying a bit for me, since I would normally take using either of them over any other television operating system.
WebOS is really nice and intuitive to control. However, Android TV has been very solid for me on the 2017 model with Android 7.0 upgrade.

The context of my reply was about the entire software experience. For me, settings are a big part of that. That's where LG has a lot to gain. Better picture mode presets, less stuff that makes no sense like the HDMI input on PC Mode thing, having to set TruMotion to User and then the sliders to 0 for the best dejudder, Netflix streams being borked, etcetera. There are a lot of quirks to what is otherwise a really great way to navigate the OS. Also, it would be unfair not to factor the poorer motion and upscaling on LG's side into the software side of things.

Then there's the fuckery with the 2016 models' Game Mode raising black levels...

Anyway; I quite like WebOS. I was just saying the newer Sony Android OS and WebOS are pretty evenly matched imo, all things considered. I was calling it nitpicking because of LG's evident flaws of its own.

That's going to be the case every year. 2017 had a jump in HDR over 2016. 2018 will jump over 2017. 2019 over 2018. Etc. Next year might be gen 1 HDMI 2.1, not sure if that's truly the year to go in. I'd wait for Gen 2 HDMI 2.1 tvs.

It's a better time to go in than before HDMI 2.1, that's for sure. It should put an end to some of the artificial limits put on the panels re: refresh rates and resolutions, at the very least. No way to be sure of its virtues until next year, of course.
 

Bustanen

Member
I hope this is the right thread for this, but couldn't find the answer anywhere else.

I have my PS4 Pro hooked up to my LG C7 (HDMI2 port). When I turn on my PS4 with my PS4 controller, the TV turns on as well. This is pretty convenient and I like this feature! However.. When I turn on my TV to watch the native Netflix app/etc.. my PS4 will power on by itself. This part is annoying, because I don't want to use my PS4 and have to turn if off manually.

Is there a way I can make it so my PS4 doesn't turn on when I turn on my TV? I do like when my TV turns on when I actually intend to use my PS4 (when I turn on my PS4 via its' controller).
It's the TV turning on whatever HDMI-CEC-compatible device is connected. The only way to stop it is to disable CEC completely.
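Roughly, the same CEC link that lets the PS4 wake the TV also lets the TV wake the PS4 when it powers up and selects an input. Here's a heavily simplified toy model of that: the opcode values are the real HDMI-CEC ones, but the device class, names, and dispatch logic are just illustration, not any real CEC stack's API:

```python
# Toy model of the CEC behavior: when the TV turns on and restores its
# last input, it broadcasts <Set Stream Path>, and many sources (like
# the PS4) power on when that path points at them.

SET_STREAM_PATH = 0x86  # real CEC opcode: "active source is at this address"
STANDBY = 0x36          # real CEC opcode: broadcast standby

class CecDevice:
    def __init__(self, name, physical_address):
        self.name = name
        self.physical_address = physical_address  # e.g. HDMI 2 -> 0x2000
        self.powered = False

    def receive(self, opcode, operand=None):
        if opcode == SET_STREAM_PATH and operand == self.physical_address:
            self.powered = True   # wake up: the TV selected our input
        elif opcode == STANDBY:
            self.powered = False

ps4 = CecDevice("PS4 Pro", 0x2000)
ps4.receive(SET_STREAM_PATH, 0x2000)  # TV powers on, restores HDMI 2
print(ps4.powered)  # -> True
```

Which is why there's no half-measure in the TV menu: the wake-the-TV and wake-the-PS4 behaviors ride on the same protocol, so disabling one side's CEC kills both.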
 
The 900E also has a lot of other flaws compared to the 2017 LGs- worse input lag, Android TV, pixel response issues, poor scaling at 1080@120, etc. And it's not that cheap of a set either. That makes it harder to recommend compared to the TCL or Vizio displays.
Huh? What pixel response issues does the x900e have?
 

MazeHaze

Banned
WebOS is really nice and intuitive to control. However, Android TV has been very solid for me on the 2017 model with Android 7.0 upgrade.

The context of my reply was about the entire software experience. For me, settings are a big part of that. That's where LG has a lot to gain. Better picture mode presets, less stuff that makes no sense like the HDMI input on PC Mode thing, having to set TruMotion to User and then the sliders to 0 for the best dejudder, Netflix streams being borked, etcetera. There are a lot of quirks to what is otherwise a really great way to navigate the OS. Also, it would be unfair not to factor the poorer motion and upscaling on LG's side into the software side of things.

Then there's the fuckery with the 2016 models' Game Mode raising black levels...

Anyway; I quite like WebOS. I was just saying the newer Sony Android OS and WebOS are pretty evenly matched imo, all things considered. I was calling it nitpicking because of LG's evident flaws of its own.



It's a better time to go in than before HDMI 2.1, that's for sure. It should put an end to some of the artificial limits put on the panels re: refresh rates and resolutions, at the very least. No way to be sure of its virtues until next year, of course.

I wouldn't call setting TruMotion to zero for dejudder a flaw? That seems "nitpicky". The Samsung TVs are the same way; it's just the lowest dejudder setting. And what Netflix streams are borked? Everything is great over here.

Any quirks of web OS are whatever compared to stuff like Android TV locking up because it can't handle streaming HDR content while adjusting the volume at the same time.
 

Alfredo_V

Neo Member
Do you mean the top half of the screen from where it's just the panel?

Because yes! Noticed that on mine and it bothers me but I didn't know if this was a rare issue or not. It doesn't affect the video in any way but it's there...

Yes, a small bend, but symmetric; it does look like it's designed that way. Actually I called LG about it and they confirmed it. So no need to worry, you should notice if there's any real problem with the picture.

Haha, I posted my question on AVS Forum, and the first response was asking if I was sure about what model I had bought (referencing last year's curved model).
 

Mrbob

Member



It's a better time to go in than before HDMI 2.1, that's for sure. It should put an end to some of the artificial limits put on the panels re: refresh rates and resolutions, at the very least. No way to be sure of its virtues until next year, of course.
This is just a function of TVs getting better every year. You could say that for the 2018 models too, since other devices will have to catch up to HDMI 2.1, and it will take a couple of years for HDMI 2.1 to be useful. It's going to take years for TVs to catch up to the full HDMI 2.1 spec.
 
I wouldn't call setting TruMotion to zero for dejudder a flaw? That seems "nitpicky". The Samsung TVs are the same way; it's just the lowest dejudder setting. And what Netflix streams are borked? Everything is great over here.

Any quirks of web OS are whatever compared to stuff like Android TV locking up because it can't handle streaming HDR content while adjusting the volume at the same time.
Why would you set something to zero and then expect it to do anything? That makes no sense. With other platforms, you just enable True Cinema or whatever it's called, and you're good to go.

Regarding the Netflix thing: check some of the LG AV forums. Many people are having issues with Netflix holding a quality stream; drops to 480p happen often, and it takes a long time to get HDR quality streaming. I've had the same issue on my C7, and it's wired to a 400 Mbit connection.
This is just a function of TVs getting better every year. You could say that for the 2018 models too, since other devices will have to catch up to HDMI 2.1, and it will take a couple of years for HDMI 2.1 to be useful. It's going to take years for TVs to catch up to the full HDMI 2.1 spec.

Sure, but 2017 has been a gap year for innovation. Next year will likely see a relatively large number of improvements and new features.
 
I was going to pick up a UHD BD player but I'm going to wait for CES 2018 to see what my options are in regards to players with both Dolby Vision and HDR10+ formats.

According to Vincent Teoh, word on the street is that there should be some HDR10+ announcements at the event.

Oh yeah? In that case I think I'm going to stick with the launch Samsung one a little longer. It's not like the Dolby Vision firmware update is ready for my TV yet anyway, so that gives me some time.
 

aaaaa0

Member
This is just a function of TVs getting better every year. You could say that for the 2018 models too, since other devices will have to catch up to HDMI 2.1, and it will take a couple of years for HDMI 2.1 to be useful. It's going to take years for TVs to catch up to the full HDMI 2.1 spec.

Yeah, HDMI 2.1 is built to be future proof, no one's going to have 8K or 10K TVs for a while.

That said, I really want VRR, eARC, and 4K@120Hz (for my gaming PC).
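For the curious, 4K@120 does fit in HDMI 2.1's pipe. A quick sanity check for uncompressed 10-bit RGB (this deliberately ignores blanking intervals and the link-encoding overhead, so the real on-the-wire figure is somewhat higher):

```python
# Rough uncompressed-bandwidth estimate for 4K @ 120 Hz, 10-bit RGB.
width, height, fps = 3840, 2160, 120
bits_per_pixel = 3 * 10  # RGB, 10 bits per channel

gbps = width * height * fps * bits_per_pixel / 1e9
print(round(gbps, 1))  # -> 29.9 Gbps of active pixel data
```

Even with overhead on top, that sits comfortably under HDMI 2.1's roughly 42.6 Gbps of usable payload (48 Gbps raw FRL), whereas it's far beyond what HDMI 2.0 can carry, hence the wait.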
 

psychotron

Member
I mean, this is good for an LCD, but for me, as someone who primarily uses OLED/plasma, it probably wouldn't cut it:

http://i.rtings.com/images/reviews/x900e/x900e-motion-blur-large.jpg

It's much better than the Z9D at least.

I've owned plasmas for the last ten years, including a Kuro, Panny V10, ST50 and Samsung C8000. This TV has no issues with blur. It's impressed the hell out of me. The only downside is DSE on the 65". I've tried three of these and they all have it when panning the sky in Uncharted, The Last of Us, etc. It's not noticeable during football, thankfully. Breath of the Wild really brings it out, which must have to do with the color palette.
 