
Television Displays and Technology Thread: This is a fantasy based on OLED

:( I'm coming from exclusively being a plasma TV owner.

It might still be worth checking it out in person if you haven't, assuming it's at any stores near you, since he said he didn't notice it in actual content despite mentioning that the issue exists with VA LCDs.

No idea though since I haven't even looked at 75" TVs lol.
 
IR happens on my B6 all the time. Game HUDs do it and so do all Adobe products. The worst offender is usually any pure color: cranking a straight cyan streak, for example, leaves retention every time, and it's most easily visible afterwards on a grey background. I also use my B6 as a monitor, so I've got Google Chrome open all the damn time, which can leave a bit of retention.

Literally no burn in from any of this. At all.

I don't manually run panel cleaning ever. I've owned this for over a year now.
 

Fantastic to hear, man, that is definitely a relief. And you're not even using the panel cleaner? Nice. From what I was told, an automatic clean happens after a few hours of use, once you turn the set off, correct?

It's just that I have been very careful with this stuff with my plasma TVs, and I was so damn glad to be rid of IR with my previous EC9300 OLED. And then seeing the Netflix post in here and some things on AVS sure doesn't help that fear.

And you have games and things like Chrome opened up a lot daily?
 

Yeah, it runs the automatic cleaner. I just mean I've never gone out of my way to mess with anything and run it manually, as I haven't had the need to. I've had game HUDs, Chrome and Adobe products open for hours and not gotten anything permanent. I've never run anything for more than a day though; this is all just my normal usage.
 

Sounds good man. It's good to know that there's nothing wrong with my panel then.
 

tmdorsey

Member
I think he also said the side-effects of doing this weren't worth it, no?

For HDR content, no, it's not worth it. For SDR content it depends on whether you notice any flickering and on how bright you like your content to be.

Well, it's a week until my new 65-inch Sony XE930 arrives, but I thought that to pass the time I would check in with any fellow 930 owners.

How has the set been for you? In particular with HDR content - I hear this set gets super bright. I can also see from RTINGS (which to this day I still can't say out loud without breaking into a Jamaican accent...) that this set has an issue with 24p playback via 60i and 60p. Has this been an issue for anyone? Has it been fixed yet? The XE900 doesn't have an issue here.

Also, any idea when the DV patch is coming through?


HDR content is pretty awesome on it. I do notice judder sometimes with the 24p via my Directv box, but it's not bad enough to make me regret getting the set. There's also a chance it will be fixed in the future.

The last thing I read was maybe October for the DV update. I really hope the update comes before Stranger Things 2 starts!
 

ApharmdX

Banned
Is that your own ZT60 photo?

X930e can do 1080 lines if you enable some settings like Clear and Custom and up the brightness. Not sure how good it is with 60fps games.

It isn't. These processing features introduce lots of input lag.

And let's not forget that when plasma was at its peak, LCD customers were complaining about plasma TVs flickering as well as the dimmer image you mention, both aspects that are inherently tied to the improved motion resolution.

Early plasmas may have flickered, but on later ones the sub-fields were displayed at such a high rate that the human eye could not discern the flickering. You did get some temporal dithering (it looks like noise) on dark frames, because the pixels would sit between off and low intensity for most of a frame.

One of the big "improvements" of OLED was the super-fast pixel response time. So why not imitate the characteristics of a plasma and let each pixel naturally fade after each update, allowing for full 4K motion resolution?

My only guesses would be that no one has put in the effort to create a board that would allow for this and instead phoned it in retooling LED chips to save money, or that continuously switching each pixel on and off reduces the lifespan of the panel in a big way, which has been an ongoing concern for OLED.

You lose brightness when you have pixels enter an off-state intermittently.
 

holygeesus

Banned
That's a relief to hear then because, fuck, it appears so damn quickly, and the retention of something like the Xbox dash is very clearly visible too. Yeah, after that Netflix thing here I'm perhaps overly scared. But it looked like he didn't do anything wrong either.

These OLEDs have a thing called noise... something, but I was told it's best not to use it too often, or at all, because apparently the set cleans the screen automatically every few hours or so?

The Witness is notorious for IR. It's the only time I have seen it during gaming sessions and that comes on (and disappears) very quickly.
 

Kyoufu

Member
Philips's OLED vs QLED shoot-out at BAFTA in London made me laugh today for a couple of reasons.

The first reason is that they showed QLED is just LCD with Samsung marketing attached: their OLED TV looks noticeably more impressive than Samsung's QLED at the same price, which makes Samsung's QLED TV considerably overpriced, as it couldn't compete with the OLED.

The second reason I laughed was that the result was clear enough without needing 30 AVForums members to attend and judge the two TVs. Their OLED was always going to beat Samsung's overpriced edge-lit LCD. It would have been far more interesting to see the ZD9 and DX902 compared to the OLED instead.

https://www.youtube.com/watch?v=xSMDjf7kZ1Q
 
I'm glad they picked Samsung's "QLED" to test it against.

The whole QLED scam is incredibly disrespectful to customers. They took an Edge-Lit TV, gave it a new name to easily confuse it with OLED, and then jacked the price up to give it the appearance of a premium product.

That price for an Edge-Lit TV is highway robbery.
 

BumRush

Member

It is SO overpriced. I saw them in Best Buy, right next to the TCL P and I actually preferred the TCL.
 

Paragon

Member
Over-exposed photographs or photographs taken at an angle will always reveal the underlying LED structure. That's not representative of what you see in person.
Blooming happens, but it doesn't look like that, and it's less common than OLED/Plasma TV owners would have you believe.
CRTs are still praised for their black level yet they have far lower ANSI contrast and more severe blooming than any of the good FALD LCD displays.

Well, on AVS many other posters and I have posted overexposed 5% gray slides. I can post one here as well. It cuts both ways, but my point is that eventually what you see in an overexposed shot you WILL see in content.
No, you will typically not see most of that. In dim, low-APL scenes, FALD blooming can be visible.
When you have a high contrast image with bright objects on-screen, the brightness tends to mask the blooming since your pupil contracts.
It doesn't mask everything, but a lot more than an over-exposed photograph leads people to believe.

Hold onto that one as long as you can. I'm still in the camp that Plasmas have the highest quality picture & motion handling ever created. It's a shame they got such a bad rep.
CRTs have the best motion handling of any display, followed by LCDs with backlight strobing/scanning, Plasma TVs, OLEDs with black frame insertion, high refresh rate sample & hold LCD, sample-and-hold OLED, then sample-and-hold LCD.
And if you are sensitive to motion artifacts on plasma or DLP displays, I would arguably place them last despite having better motion clarity than sample & hold displays.
OLED has the potential to be best, but only Sony have made any attempt at getting there.

Wow, this post is the worst of the worst. That's what you get for trying to warn people about an objectively immature (in the reliability department) technology: you get called names and painted as someone trying to justify a purchase.
The situation is actually the opposite of what you paint.
On the internet you can't talk about the (once again, objective) problems of OLED without going through a sea of people who either downplay or outright deny them... This is Plasma vs LCD all over again, only this time the bad guy is LCD.
And all this because some kind soul would like OLED recommendations to carry a warning that these sets are a hundredfold more susceptible to image retention and burn-in than other sets, and that buyers should take that into account when choosing a TV.

The AV community on the net is incredibly toxic and can't accept that people might have different needs or sensibilities beyond whatever has been decided to be the "absolute best".
Great post.
It really seems like OLEDs are replacing Plasma TVs in terms of generating toxic responses to any sort of criticism online.
Blaming the owner for simply using their TV, or claiming that affected displays must be defective is a terrible position to take.
Unless it's actual abuse, like deliberately keeping a static image on the screen to burn it in or running the display 24/7, don't blame the owner.

And let's not forget that when plasma was at its peak, LCD customers were complaining about plasma TVs flickering as well as the dimmer image you mention, both aspects that are inherently tied to the improved motion resolution.

My issue with Plasma TVs was not that they flickered, but how they flickered.
Plasma TVs cannot vary the brightness of their pixels, they can only switch them on or off.
So to create a full-color image with lots of gradation, they have to pulse the pixels for varying durations.
This is why you might have seen older plasmas being advertised as "600Hz" displays. To create a 60Hz image, they would actually pulse the subpixels on and off 10x for every frame, varying the duration of each pulse to affect the perceived brightness.
This is what it looks like when measured:

[Image: measured plasma (PDP) phosphor response - light output of the red, green and blue sub-pixels over successive frames]
Note: one frame at 60Hz = 16.67ms, so that is a bright frame followed by a black frame, and then another bright frame.
In this example, the TV is actually only switched on about 30% of the entire frame's duration - or about 5ms total.
Since the duration a frame is held on-screen affects how much motion blur we perceive, that's why Plasma TVs have less apparent motion blur than many other displays.
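As a rough illustration of that relationship, hold time and eye-tracking blur can be put into numbers. This is a sketch of the usual persistence-times-speed approximation; the 960 px/s tracking speed is an assumed example, not from the post:

```python
# Rough sketch of how image persistence translates into perceived motion
# blur on eye-tracked motion. Numbers follow the plasma example above:
# a 60 Hz frame lasts 16.67 ms, and the pixels are lit ~30% of that time.

FRAME_MS = 1000 / 60          # one 60 Hz frame = ~16.67 ms

def persistence_ms(duty_cycle: float, frame_ms: float = FRAME_MS) -> float:
    """Time the image is actually held lit during one frame."""
    return duty_cycle * frame_ms

def blur_px(persistence: float, speed_px_per_s: float) -> float:
    """Approximate eye-tracking blur width: persistence (ms) times speed."""
    return persistence / 1000 * speed_px_per_s

plasma = persistence_ms(0.30)        # ~5 ms, as measured above
sample_hold = persistence_ms(1.0)    # LCD/OLED holding the full frame

# An object moving at 960 px/s smears over ~4.8 px on the plasma
# but ~16 px on a full-persistence sample-and-hold display.
print(blur_px(plasma, 960), blur_px(sample_hold, 960))
```

The same arithmetic is why shorter hold times below always mean less blur, at the cost of brightness and flicker.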

However, you might notice that the response times of all three are very different.
Blue switches on/off almost instantly, while red/green switch on more slowly (the diagonal lines going upwards) and take significantly longer to decay (the diagonal lines going down to 0).

This can result in the image appearing to separate into its component colors: you might catch flashes of color out of the corner of your eye, or see blue/green trails on the leading/trailing edges of moving objects.
Here's an example of that, taken by moving the camera while the TV displays a static image - which approximates what you'd see if you moved your eyes from one side of the screen to the other.
(Source: https://tweakers.net/reviews/3431/6/panasonic-zt60-afscheid-van-een-iconische-plasmaserie-beeldeigenschappen.html)
So while Plasma TVs had less motion blur, they had the potential for severe artifacting in motion.
The way that Plasma TVs pulsed the image many times a second, and the motion artifacts that resulted from it, meant that I would get migraines when watching them.
I really tried, but after owning a couple of Panasonic plasmas and three Kuros, I had to accept that they were not for me.

CRTs on the other hand can vary the intensity of the beam, which allows them to adjust the brightness without having to pulse the pixels on/off like a Plasma TV did.
They would only flash the image once per frame, instead of 10x (or more) per frame.
Even though CRTs only held the image on-screen for maybe 1ms or so, making flicker more noticeable than on plasma TVs, it didn't trigger migraines for me.

I have similar issues with LCD TVs which use PWM to control the backlight intensity, or even LED room lighting.
I recently spent a good bit of money on Philips Hue lights for one room (12 bulbs) and had to return them all because I'd get migraines due to them flickering.

I always took issue with people saying that Plasmas had a very "CRT-like" image.

IIRC, I think Vincent from HDTVTest said that if you max out BFI on the Sonys you can get 1080 lines of motion resolution.
I would be surprised if you can achieve that with BFI alone.
As I understood it, the OLED is literally drawing black frames. Since it's a 120Hz panel, that means 50% image persistence - or 8.33ms.
You would have a very clean image in motion compared to a Plasma TV, but still more motion blur.

Now, if instead of drawing black frames, the TV simply switched the picture on/off like LCDs do with backlight scanning/strobing, the image persistence could be decoupled from the refresh rate/framerate.
With OLED you could also simulate CRT scanning by only illuminating a certain number of lines at once too.

But most TV manufacturers don't want their displays to flicker any more.
 
On the budget front, Samsung's 6300 series is pretty good. I know this topic is all about showing who has the biggest and brightest screen with the blackest blacks, but at this point I'd rather buy a cheap but pretty good TV and wait for the really good stuff. OLED clearly still has problems, and LCD has the usual ones. I'm not very eager to spend a lot of money on compromises. Every TV has issues, and I can live with them if I don't spend too much on one.

Aaaand that brings me to this:

I remember reading that there was going to be some kind of new screen technology coming soon? Not OLED, but it was hyped as the best thing since knee-length dresses. Was it from LG or something? I remember it was still years away from actual consumer products (it's been a while since I read that article, so it's possible I just don't remember it right).
 

The whole thing is a complete joke, I couldn't not go to town on them over it! Philips are pathetic, and AVForums should be embarrassed covering it, and as for the stupid members... well, what can I say: not knowing, and acting shocked over how good an OLED is vs a shitty Q7. Absolute fools.
 

Kyoufu

Member

Okay, I have a third reason to laugh. Philips gave away the OLED being judged to one lucky AVForums member attending the shoot-out. That's cool of them, right?

The winner already owns a fucking OLED

L M A O

 

Macaco84

Member
For HDR content, no it's not worth it. For SDR content it depends on if you noticing any flickering or not and how bright you like your content to be.




HDR content is pretty awesome on it. I do notice judder sometimes with the 24p via my Directv box, but it's not bad enough to make me regret getting the set. There's also a chance it will be fixed in the future.

The last thing I read was maybe October for the DV update. I really hope the update comes before Stranger Things 2 starts!

Really strange that Sony dropped the ball on 24Hz playback on this set, considering they aced everything else. But if that's the biggest issue, I can live with it.

Looking forward to some great HDR gaming on this.
 

aravuus

Member
Great post.
It really seems like OLEDs are replacing Plasma TVs in terms of generating toxic responses to any sort of criticism online.
Blaming the owner for simply using their TV, or claiming that affected displays must be defective is a terrible position to take.
Unless it's actual intentional abuse, like intentionally keeping a static image on the screen to burn it, or keeping the display on 24/7, don't blame the owner.

Really can't blame anyone for overreacting every now and then considering the way OLED is talked about in this thread
 

Weevilone

Member

Especially when you get people taking on the job of burn-in evangelist... talk about toxic. It's like, we get it: you prefer something else.
 

shockdude

Member
I would be surprised if you can achieve that with BFI alone.
As I understood it, the OLED is literally drawing black frames. Since it's a 120Hz panel, that means 50% image persistence - or 8.33ms.
You would have a very clean image in motion compared to a Plasma TV, but still more motion blur.

Now, if instead of drawing black frames, the TV simply switched the picture on/off like LCDs do with backlight scanning/strobing, the image persistence could be decoupled from the refresh rate/framerate.
With OLED you could also simulate CRT scanning by only illuminating a certain number of lines at once too.

But most TV manufacturers don't want their displays to flicker any more.
Are you sure your interpretation of OLED BFI is correct?
I'm not a TV expert but I thought that on an OLED, "drawing a black frame" and "turning off" are functionally/visually identical, i.e. emitting zero nits. Depending on how long the frame is "on", this would make OLED BFI at least as effective as LCD BFI and maybe even CRT for motion blur reduction.
 

tmdorsey

Member
I would be surprised if you can achieve that with BFI alone.
As I understood it, the OLED is literally drawing black frames. Since it's a 120Hz panel, that means 50% image persistence - or 8.33ms.
You would have a very clean image in motion compared to a Plasma TV, but still more motion blur.

Now, if instead of drawing black frames, the TV simply switched the picture on/off like LCDs do with backlight scanning/strobing, the image persistence could be decoupled from the refresh rate/framerate.
With OLED you could also simulate CRT scanning by only illuminating a certain number of lines at once too.

But most TV manufacturers don't want their displays to flicker any more.

I was talking about the Sony LCDs not the OLED.
 

Paragon

Member
Are you sure your interpretation of OLED BFI is correct?
I'm not a TV expert but I thought that on an OLED, "drawing a black frame" and "turning off" are functionally/visually identical, i.e. emitting zero nits. Depending on how long the frame is "on", this would make OLED BFI at least as effective as LCD BFI and maybe even CRT for motion blur reduction.
Part of the issue is that many people call LCD backlight scanning/strobing "BFI" when it's not.
BFI was largely ineffective with LCDs due to slow response times, while shutting the backlight off is instantaneous.

The current OLED panels are limited to 120Hz.
Granted, they only accept 120Hz inputs at 1080p, but you can send them a 120 FPS source.
If you're doing BFI by "drawing" black frames, it means that you are limited to a 60 FPS source at most, since half of your frames are now black.
It also means that the duration of these frames is fixed at 50%, since half your frames are source images and half are black.

Instead of drawing black frames, what you could do instead is "shut off" the OLED panel - similar to what LCDs do with their backlight.
That way you can still draw 120 frames on a 120Hz panel. You just don't keep the frames illuminated the entire time - you switch off the OLEDs part of the way through the refresh.
Not only does it let you display twice as many frames, but it would allow you to control the duration that the image is being held on-screen, rather than it being fixed at 50%.
If you cut that to 10%, there would be significantly less motion blur than 50% - but it would get dimmer and flicker more.
Alternatively you might prefer a brighter image with less flicker and set it to something like 80% persistence.

Both are still "inserting black frames" but one does it by drawing black images, while the other is switching the panel off, independently from any frames that are being drawn.
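The drawn-frames vs. panel-gating distinction above can be put into numbers. A toy sketch: the 120Hz panel figure comes from the post, the duty cycles are illustrative:

```python
# Illustrative comparison: "drawing" black frames on a 120 Hz panel
# vs. gating the emitters off partway through each refresh.

REFRESH_HZ = 120
FRAME_MS = 1000 / REFRESH_HZ  # ~8.33 ms per refresh

def drawn_bfi():
    """Alternate image/black frames: source fps and persistence are fixed."""
    max_source_fps = REFRESH_HZ // 2   # half of all frames are black
    persistence = FRAME_MS             # image held one full refresh = 50% duty
    return max_source_fps, persistence

def gated(duty_cycle: float):
    """Switch the panel off mid-refresh, like a strobing LCD backlight:
    the duty cycle (and so the persistence) is freely tunable."""
    max_source_fps = REFRESH_HZ        # every refresh can be a real frame
    persistence = duty_cycle * FRAME_MS
    return max_source_fps, persistence

print(drawn_bfi())    # 60 fps max, fixed ~8.33 ms hold
print(gated(0.10))    # 120 fps, ~0.83 ms hold: much less blur, dimmer
print(gated(0.80))    # 120 fps, ~6.67 ms hold: brighter, less flicker
```

The point of the sketch is just that gating decouples persistence from the refresh rate, which drawn black frames cannot do.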
I was talking about the Sony LCDs not the OLED.
Ah, my mistake. I think Sony LCDs can reduce persistence to 4ms or so.
 

ApharmdX

Banned
It is SO overpriced. I saw them in Best Buy, right next to the TCL P and I actually preferred the TCL.

I think the Vizio P-series is also better, and much cheaper. Black level performance on those Samsung QLED sets is quite poor since they are edge-lit, though they do have very low input lag.

Over-exposed photographs or photographs taken at an angle will always reveal the underlying LED structure. That's not representative of what you see in person.
Blooming happens, but it doesn't look like that, and it's less common than OLED/Plasma TV owners would have you believe.
CRTs are still praised for their black level yet they have far lower ANSI contrast and more severe blooming than any of the good FALD LCD displays.

As someone who has owned and cross-shopped many different FALD LCDs, they all display various levels of blooming. It shouldn't be at the level of that photo (Vizio's prior implementation could be pretty bad with certain patterns like logos) but it's there. And it's something that I don't see myself dealing with again.

No, you will typically not see most of that. In dim, low-APL scenes, FALD blooming can be visible.
When you have a high contrast image with bright objects on-screen, the brightness tends to mask the blooming since your pupil contracts.
It doesn't mask everything, but a lot more than an over-exposed photograph leads people to believe.

But you will see some of it (blooming and haloes). And that's a problem. Dark/mixed contrast scenes are inconsistent with FALD displays. They can look tremendous or they can disappoint. It depends on the manufacturer's dimming algorithm and the underlying LED structure. OLED does a more consistent job with these scenes.

Anyway, LCD in high-end displays is on its death bed finally. Samsung is in a holding pattern and is basically non-competitive in the segment for 2017. Sony, Philips, Panasonic have gone OLED for their flagship sets, and I believe Sony isn't refreshing the Z9D this year. LCD is more of a mainstream/low-end display technology at this point.

However you might notice that the response time of all three are very different.
Blue switches on/off almost instantly, while red/green switch on slower (diagonal line going upwards) and take significantly longer to decay. (diagonal lines going down to 0)

This can result in the image appearing to separate into separate colors, you might catch flashes of colors out the corner of your eye, or you might see blue/green trails on the leading/trailing edges of moving objects.
Here's an example of that taken by moving the camera while the TV is static - which is what you might see if you looked from one side of the screen to the other with your eye.

So while Plasma TVs had less motion blur, they had the potential for severe artifacting in motion.
The way that Plasma TVs pulsed the image many times a second, and the motion artifacts that resulted from it, meant that I would get migraines when watching them.
I really tried, but after owning a couple of Panasonic plasmas and three Kuros, I had to accept that they were not for me.

Watching plasma TVs up close exacerbated the issue of phosphor decay lag (which is what your example shows, the eye crossing the screen). They look tremendous for most viewing, but for some people, yes, it's unlivable. I can see the color separation myself, but it's tolerable at a reasonable distance. For me, it's a fair price to get a display with a better contrast ratio, far cleaner pixel transitions, and far better motion handling. And OLED hits two of those three advantages, too.

By the way, different manufacturers' panels had varying degrees of phosphor lag. Pioneer and Panasonic were both pretty rough, but Fujitsu/Hitachi had almost none in their ALiS displays. Not sure about Samsung, LG, and NEC; it's been forever since I've seen those.

I always took issue with people saying that Plasmas had a very "CRT-like" image.

They both use the same glowing phosphors. Out of the digital display technologies, plasma was the most CRT-like by far.
 

Nikana

Go Go Neo Rangers!
The Witness is notorious for IR. It's the only time I have seen it during gaming sessions and that comes on (and disappears) very quickly.


First time I played it I was like WTF when I left a puzzle. Box was retained dead center of the screen.

I found throwing energy saving on medium helps a lot, but obviously you lose some of that OLED contrast goodness.
 

BumRush

Member
I'm just waiting until the OLED market gets more competitive and refined

I've decided to just wait until motion handling / uniformity are more comparable to plasma. OLED tech is AMAZING - and I would be super happy jumping in now - but if the plasma ain't broke, don't replace it.
 

Kyoufu

Member

Maybe a daft opinion but I consider non-4K HDR sets as broke considering how amazing 4K HDR looks.

Is that a daft opinion?
 

JG5253

Member

I have a Panasonic TC-50PU54, which I know isn't a high-end plasma, but I love the PQ. Since I moved out on my own and put it in the living room, the glare was unbearable for me and it was too dim for daytime use. I got the TCL P607 as a taste of, and a stopgap for, 4K HDR/Dolby Vision, and I love the set despite some flashlighting. When OLED gets better and more refined I'll have no problem upgrading and moving the TCL P to a bedroom.
 
Just checked out my Sony A1 OLED's screen uniformity on grey, and it's shockingly bad. That being said, how important is it really?

I've not noticed any uniformity issues when playing content.
 

Kyoufu

Member

You answered your own question. If it doesn't affect content then it's not even worth the thought.
 

BumRush

Member

I totally agree that HDR is a massive, massive leap. With my use case there isn't enough HDR content yet to call it "broke", but I get where you're coming from, Kyoufu.
 

tmdorsey

Member

Netflix and Amazon have a decent amount of HDR content, I think, and new movies are getting 4K disc releases with HDR.
 
OLED, just like Plasma and LCD, will get A LOT better in the future. Let's see what LG comes up with for their 2018 line up - their "tick" timeline.
 

BumRush

Member

I have two children - ages 3 and 6 months - so my wife and I barely watch any non-Disney-Princess TV right now (and I've played maybe 10 hours of video games in the last 3 months, if that). If Thrones was in 4K HDR I probably would have bought one already.
 

Kyoufu

Member

1000 nits, HDMI 2.1, 4K @ 120hz and Dynamic HDR are what I'm expecting. Anything else would be a bonus/surprise for me.
 

Weevilone

Member
I have two children - ages 3 and 6 months - so my wife and I barely watch any non-Disney-Princess TV right now (and I've played maybe 10 hours of video games in the last 3 months, if that). If Thrones was in 4K HDR I probably would have bought one already.

Surely you must know that princesses look better in 4K with adequate specular highlighting.
 
Just read on another forum that Russian firmware for the B6 is apparently coming and HLG is happening; however, the HDR game mode remains unchanged. The fuck, man!
 

Darksim

Member
my only guesses would be that no one has put in the effort to create a board that would allow for this and instead phoned it in retooling led chips to save money or that the continuous on and off of each pixel reduces the life span of the panel in a big way which has been an ongoing concern for oled.

The current and previous VR headsets (DK2, and even Gear VR) do this; it's called rolling scan. I didn't hear anything about it reducing lifespan, and I know it's the only way the Dell 4K OLED monitor displays anything, as it apparently reduces image retention. I'd imagine it hasn't made its way onto a TV yet because it reduces the light output OLEDs are still struggling with for HDR.

I mean, anything that solves these issues on OLED is going to be a processing solution (either frame interpolation or black frame insertion), since the display technology is inherently sample-and-hold rather than a pulsed display like a plasma. At this point there isn't a consumer display technology that operates the way those older displays did.

I don't really think you could call it a processing solution; rolling scan is neither interpolation nor BFI, and it's significantly better than either. Once a consumer OLED TV gets a rolling scan option I expect its motion clarity to instantly surpass the "focused field drive" plasmas, though it may be a while, with HDR performance being the clear priority.

CRTs have the best motion handling of any display, followed by LCDs with backlight strobing/scanning, plasma TVs, OLEDs with black frame insertion, high-refresh-rate sample-and-hold LCDs, sample-and-hold OLEDs, then sample-and-hold LCDs.

Even the BenQ blur-reduction monitors with a VT tweak still produce unavoidable strobe crosstalk. I believe the complete lack of it on the FFD plasmas easily puts their motion handling ahead of any current LCDs, despite their own issues with motion.

I always took issue with people saying that Plasmas had a very "CRT-like" image

The lack of eye-tracking blur does deliver a very CRT-like image, far closer than anything the LCDs of the time could produce. I think it takes a worst-case example for the CRT-like image to break down (spinning your camera around in a game), and in some cases, notably scrolling in 2D platformers, they can produce a convincingly blur- and artifact-free image in a situation that would still cause crosstalk on a backlight-scanning LCD.
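The sample-and-hold vs. pulsed-display distinction in this post boils down to pixel persistence: when your eye tracks a moving object, the image smears by roughly (pan speed) × (time each frame stays lit). A back-of-the-envelope sketch (the numbers below are illustrative assumptions, not measurements from this thread):

```python
# Rough eye-tracking motion blur model: a tracked object smears across
# approximately pan_speed * persistence pixels on the retina.

def perceived_blur_px(pan_speed_px_per_s: float, persistence_s: float) -> float:
    """Approximate blur width in pixels for an eye-tracked moving object."""
    return pan_speed_px_per_s * persistence_s

# A 960 px/s pan (a common motion-test speed) under three display modes:
full_hold_60hz = perceived_blur_px(960, 1 / 60)    # sample-and-hold, 60 Hz: frame lit the full 16.7 ms
bfi_50_duty    = perceived_blur_px(960, 0.5 / 60)  # black frame insertion at 50% duty: half the blur
rolling_1ms    = perceived_blur_px(960, 0.001)     # short-persistence rolling scan / strobe: ~1 px
```

This is why BFI halves the smear but a short-persistence rolling scan (or a CRT's phosphor flash) nearly eliminates it, at the cost of light output.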
 

Dave_6

Member
1000 nits, HDMI 2.1, 4K @ 120hz and Dynamic HDR are what I'm expecting. Anything else would be a bonus/surprise for me.

If all this somehow ends up being true, I'll upgrade from my B6. Well, depending on the price and whether they can get the uniformity improved.
 

Lord Error

Insane For Sony
The Witness is notorious for IR. It's the only time I have seen it during gaming sessions and that comes on (and disappears) very quickly.
Not gonna lie, I freaked out when I saw how much IR was left on my LG OLED after playing The Witness one night, lol. I remember having a knot in my throat, and left it playing some colorful video for hours after that - and it was still visible afterwards. It took a lot more time for it to completely disappear, but thankfully it eventually did. From that point on, I don't worry too much, because if *that* disappeared, I don't think permanent burn is going to happen with any kind of normal use.
 

Kyoufu

Member
If all of that is in the 2018 sets...wow.

1000 nits shouldn't be too difficult to achieve since they're at 800 nits in 2017, but the rest will depend on HDMI 2.1's availability. Probably won't happen until 2019 :(

But damn, if that happens next year I'd have to upgrade from the E6.
 

BumRush

Member
1000 nits shouldn't be too difficult to achieve since they're at 800 nits in 2017, but the rest will depend on HDMI 2.1's availability. Probably won't happen until 2019 :(

But damn, if that happens next year I'd have to upgrade from the E6.

Wasn't the '16 to '17 jump only ~700 nits to ~800 nits?
 

Kyoufu

Member
Wasn't the '16 to '17 jump only ~700 nits to ~800 nits?

Actually, I think 800 nits may be under Vivid mode, which no professional calibrator uses. Real-world peak brightness may be 700-ish nits for current models. HDTVTest's 2016 E6 measured 640 nits, I believe. Maybe 1000 nits is asking too much for next year?
 

BumRush

Member
I believe it went from like 620 to 800 or so.

I really don't know that I care too much about the peak brightness though.

Movies being mastered for up to 4000 nits seems insane.

Peak brightness (the way I understand it) impacts HDR highlights more than whole-screen brightness. I'm sure the impact on the latter is important, but it's the highlights that make HDR "pop" more.

Someone clean me up.

Actually, I think 800 nits may be under Vivid mode, which no professional calibrator uses. Real-world peak brightness may be 700-ish nits for current models. HDTVTest's 2016 E6 measured 640 nits, I believe. Maybe 1000 nits is asking too much for next year?

We'd have to know more about the tech and whether it's a bullet-point item for manufacturers going into 2018 (which it should be).
 