Is this really a thing? I'd imagine the pixel-refresh cycle the B7 runs every time you turn it off would negate any accumulated wear, and the only way you'd get burn-in would be by leaving the same static image on screen for a long time uninterrupted.
Compensation cycles help, but you cannot completely eliminate the problem without reducing the brightness of the entire panel to that of the most-used pixel.
What you would hope for is that the wear is spread evenly enough that no one area of the screen degrades faster than the rest of the panel, which would minimize the brightness loss.
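A rough way to picture that trade-off (a minimal sketch, not any manufacturer's actual compensation algorithm; the function name and numbers are made up): to hide uneven wear completely, every pixel has to be limited to what the most-worn pixel can still output, which costs overall brightness.

```python
import numpy as np

def compensate(target, remaining):
    """target: desired luminance per pixel (0..1).
    remaining: fraction of original output each pixel can still reach (0..1)."""
    worst = remaining.min()                 # most-worn pixel sets the ceiling
    drive = target * worst / remaining      # push worn pixels harder, dim healthy ones
    return np.clip(drive, 0.0, 1.0)

# Example: a region (say, where a static HUD sat) has lost 10% of its output.
wear = np.ones((4, 4))
wear[1:3, 1:3] = 0.9
frame = np.ones((4, 4))                     # full-white test frame
drive = compensate(frame, wear)
print(drive * wear)                         # uniform 0.9 everywhere: even, but dimmer
```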
Emissive displays have always been subject to uneven wear. This isn't new.
No, but everyone was saying that it was not an issue for OLEDs and calling people liars for saying otherwise.
Same thing happened when Plasmas were still available to buy.
It's one thing to tell people that "my usage has been varied enough that it's not been a problem" or "I haven't noticed any image retention/burn-in", and another to say that it can't happen.
Or to start blaming people who have encountered it, saying either that they're using it wrong or that their TV must be faulty.
It's the price you pay for the benefits of the technology. If a buyer can't accept that, they can look into LCDs. However, even with the best local dimming, LCDs cannot place a bright pixel next to a black one. Specular highlights suffer, and so do starfields and particle effects in games.
Less than you might think. Stars are rarely 100% white. (note: this is an animated PNG which may not work in all browsers)
OLED is undoubtedly better for things like small points of light, but full-array local dimming is much better than many OLED/Plasma owners would have you believe. A lot of Plasma owners have finally admitted this after switching to one for 4K/HDR.
And I think that higher-brightness HDR is going to be more noticeable to the casual viewer.
It's really hard to recommend an LCD TV over about $1.5–2k for these reasons.
I agree. If my current TV died and I had to replace it with something, I would find it very difficult to justify buying a high-end LCD display rather than an OLED.
But it's not like OLED is problem-free, and it's difficult to recommend to gamers given the potential for burn-in, considering that most games have a static crosshair/HUD.
Even if it only happens to some people, there's no way to know if you will be one of the affected few.
I have a number of friends who still have Panasonic Plasma TVs. They bought them against my recommendation, based on their usage patterns (mostly gaming and letterboxed movies), because people on forums claimed image retention/burn-in no longer existed, and they're now starting to consider upgrading to a 4K HDR display.
All of them have said they don't want to buy another display with the potential for long-term image retention or burn-in after their experience with Plasma TVs, and most have completely ruled out OLED at this point.
There are a few like me who still want an OLED display due to things like the contrast and response time, but are finding it to be a very difficult decision.
Again, who needs their OLED backlight to be at 100, 100% of the time?
Most people like bright pictures.
If you're coming from an LCD rather than a Plasma TV, an OLED might appear dimmer than the TV it's replacing unless it's maxed out.
Erm, that's literally one person, though. Multiple people have chimed in regarding this, saying they've been 'abusing' their OLED TVs like there's no tomorrow with zero problems.
And there's a 25-page topic on AVS Forum with people posting photos of their burned-in OLEDs.
e: Bloodborne looks marvelous with high OLED Light and contrast
e2: and KH2 looks ridiculous lmao, almost too bright. Lots of bright UI elements too, this could be a good stress test game lol.
Well, that's what happens when you push SDR beyond 100 nits. But that's what a lot of people like.
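For a sense of scale (a rough sketch, assuming a plain 2.2 gamma and that the OLED Light setting simply scales peak white; real TVs use more complicated tone curves and ABL):

```python
def displayed_nits(signal_level, peak_nits, gamma=2.2):
    """signal_level: SDR video level 0..1; peak_nits: panel white luminance."""
    return peak_nits * (signal_level ** gamma)

print(displayed_nits(1.0, 100))   # 100.0 nits: SDR reference white
print(displayed_nits(1.0, 350))   # 350.0 nits: the same white HUD element, pushed 3.5x brighter
```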