For those worried about OLED burn-in

C1 (a lot of gaming hours on it, now used as the TV in my bedroom) and S90C (current gaming TV for 2 years, love the TV but fuck Tizen): neither has any burn-in.
 
Clearly I'm no tech expert, as half the stuff you're talking about goes right over my head. I've just heard people complaining about frame rates on OLED TVs when it comes to gaming. As I understand it, a lot of console games are still 30 fps, so that seems problematic for OLED TVs. Then there are the prices...

The TV I picked up was only $500, has most of the gaming features you mentioned, and looks fantastic (at least to me). I'm not sure what tech it uses (TCL QM7).

Most console games on PS5 and Series X are 60fps (probably close to 90%, even), and this trend will continue on PS6. The problem is older games locked to 30fps that never got patches. No such issue on PC.
 
Every live-service/multiplayer game with a static UI that you play for hours on end on a daily basis will burn itself in after 1-2 years; my WoW UI was visible on 3 different OLEDs.
 
If you play a lot of different games, then OLED is fine; if you play the same games all the time for multiple hours a day, then OLED is not fine (particularly Blizzard stuff, their HUDs are way too bright).
1200 hours of Apex Legends, no issue so far.
 
This is my 55" B7 that I used as a full-time PC monitor for 2 years at 140 nits. I also played a wide variety of HDR games, TV, movies, and videos on this one maxed out. Probably got 20,000 hours or more on this thing.

Full size link: https://ibb.co/v6vJFmrq

Now do Blue, Teal, Purple and Green.
 
CRT pixels and plasma pixels aren't organic, but they have the same problem.

It's not the "organic" nature of these emissive displays that causes burn-in.

The holy grail remains an emissive display whose pixels don't wear through progressive brightness loss. MicroLED is such a technology, but it's been extremely difficult to miniaturize it enough to produce a screen smaller than 110 inches, or to mass-produce it at a price normal people can pay.
The "organic" diodes in OLEDs do age faster (especially the blue diodes) than the emitters in non-OLEDs.
All modern displays have issues, but not all issues are the same.
Not getting into the mess that was plasma TVs.
I have a mini-LED display. It's better for displaying text and code than a lot of the LG OLED panels I tested before it... it was significantly better, especially as a monitor, although OLED has improved.
I worked with every QD-OLED-panelled TV from Samsung and Sony since their inception in 2021/22, and on the shop floor the ambient black level raise was evident on all of them.

The lights in the shop were suspended 20+ feet above the TVs, so there was no direct light on the screens. That's why it's called ambient black level raise.

I sometimes did after-hours sessions with staff or customers where the lights were severely dimmed, and you could still see it.

Samsung's use of the matte coating on the S95C/D/F made it even more obvious for sure, but the Sony was still affected.

I don't think it's the end of the world or anything, but it is there.
I hear and believe what you say you experienced, but I still disagree.
I haven't noticed this in real-world use, ever.
When the sun peeked through the window I did, and my setup had to be moved, but that was with light shining directly on the screen.
 
These generations of OLED panels (B7 and C1) max out around 140 nits full field in SDR, if memory serves; any higher and you'll start seeing ABL fluctuations (particularly with PC use).

Microsoft's HDR-to-SDR translation is also completely broken on Windows: they use some bizarre piecewise gamma function that completely fucks up the lower range of the curve, noticeably raises blacks/shadow detail, and generally washes out the image. You want to run the Windows desktop in SDR at all times, and only turn on HDR manually to play games or videos.

That's why so much of the use is in SDR at 140 nits, but it also happens to be a comfortable level in a moderately lit room and is bright for a blacked-out room. Basically all SDR TVs used to run around 140-180 nits in their bright accurate modes (40-50 foot-lamberts). Low-light modes and the SDR technical specification are 100 nits, or about 30 foot-lamberts.
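For the curious, the piecewise-curve problem is easy to see numerically. Here's a minimal sketch, assuming Windows maps SDR content with the standard piecewise sRGB curve while that content was mastered for pure 2.2 gamma (that assumption about the internals is mine, not something documented in this thread):

```python
# Compare a piecewise sRGB-style transfer curve against pure gamma 2.2
# near black. Assumption: SDR content mastered for gamma 2.2 gets decoded
# with the piecewise sRGB curve, whose linear toe lifts the shadows.

def srgb_eotf(v):
    """Piecewise sRGB: linear segment below ~0.04045, power 2.4 above."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22(v):
    """Pure power-law 2.2 gamma, what most SDR content is graded against."""
    return v ** 2.2

for v in (0.02, 0.05, 0.10):
    s, g = srgb_eotf(v), gamma22(v)
    print(f"signal {v:.2f}: piecewise {s:.5f} vs gamma2.2 {g:.5f} "
          f"({s / g:.1f}x brighter)")
```

Near black the linear toe of the piecewise curve puts out several times more light than pure 2.2 gamma (about 8x at a 2% signal in this sketch), which is exactly the raised blacks and washed-out shadows described above.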

That is strictly for PC desktop use, though; this particular B7 also has many thousands of hours of maxed-out HDR use (100 OLED Light, high peak panel brightness) with PC and console games, which would be like 650 or 700 nits on the B7.



It's in a bedroom now, and full-screen white at 140 nits will literally blind you in the dark; it genuinely hurts your eyes if you fall asleep and wake up to it. You guys are wildly underestimating how bright 140 nits full field actually is; our perception of brightness is logarithmic. The only scenario where you wouldn't describe it as bright is a room with open windows on a sunny day.



Not a typo; that is strictly for PC desktop use, which should always be in SDR (see reasons above). The S95B also maxes out around 150 nits on full-field white in SDR, and would only look around 20-30% brighter if you completely maxed out its SDR mode (our perception of light is logarithmic, and 150 nits is much closer to the levelling-off top of the curve than the steep bottom). Most HDR games and movies actually sit primarily in the 150-300 nit APL range; the obscene values are only for specular highlights.
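As a rough sanity check on that 20-30% figure, here's a back-of-the-envelope sketch using Stevens' power law (perceived brightness scaling with roughly the cube root of luminance; the exact exponent for large fields is debatable, and the ~300-nit maxed-out full-field value is an assumption, not a measurement from this thread):

```python
# Back-of-the-envelope perceived-brightness comparison using Stevens'
# power law (exponent ~1/3 is a rough model for extended fields).

def perceived(luminance_nits, exponent=1/3):
    return luminance_nits ** exponent

base = perceived(150)    # SDR pegged at ~150 nits full field
maxed = perceived(300)   # assumed fully maxed-out SDR mode (~300 nits)
print(f"{(maxed / base - 1) * 100:.0f}% perceived increase")  # ~26%
```

Doubling the luminance only buys roughly a quarter more perceived brightness, which is why chasing full-field nits in SDR has such diminishing returns.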
This was a very long-winded technical explanation rationalizing your specific use case. And it's fine: you want to stick to 140 nits because you're in a dark room and don't want ABL kicking in all the time, cool. I'm well aware of the human eye's perception of brightness on a display relative to the environment. But you're missing the point, entirely.

If you advertise a TV or monitor as the best display for gamers, a hobby notorious for bright HUDs and static imagery… and folks can get screen burn-in by using the display out of the box with default settings, doing things that posed no issue on other displays… one might consider that a dealbreaker. Maybe not for you, but definitely for me.

You came in and questioned what people were doing to their displays, essentially framing the discussion as though they were abusing their OLEDs. The reality is much simpler: they turned it on and used it. They didn't alter their behaviors, and they shouldn't have to. It's why I dropped the Steve Jobs reference in my initial reply to you. His insistence at the time that the phone was fine and folks were "holding it wrong" was bog-standard, out-of-touch Silicon Valley psycho shit. Folks were holding the phone the way they always did, which was no problem on any other phone. It was an engineering/tech problem, but naturally, there was an Apple defense force ready to blame users, much like what you are doing here.

The risk of burn-in decreases with each iteration, but it still exists, and it's mostly a matter of how perceptive individuals are. You don't get the same extremes as first-gen OLED displays had, but you still have uneven pixel wear nonetheless, such as kevboard is experiencing. Some will see it, others don't mind it, it is what it is. But please folks, stop running a defense force and quit blaming users for using their display as advertised.
 
I hear and believe what you say you experienced, but I still disagree.
I haven't noticed this in real-world use, ever.
When the sun peeked through the window I did, and my setup had to be moved, but that was with light shining directly on the screen.

It's in a house with much less light than the shop, and you don't have a WOLED-panel OLED next to it to compare, so it's not going to be evident at all.

I had them all lined up in the shop so I could scrutinise them. Like I said, I don't think it's something to care about at all.

It's just that every time I didn't mention these things, people came back and said "why didn't you tell me?", so when I talk about TVs outside the job I tend to keep doing it.

No one ever came back about this ambient black level raise. The main one they came back about was me not explaining how OLEDs' pixel response time means panning shots in 24/25/29.97/30fps content look juddery without motion smoothing, so if the customer preferred that off, I would say it's better to leave it on minimum at least.

I still have the ZD9 that I do most things on, but I also have a Sony AF9 OLED which I use to watch certain films that just don't look nearly as good without OLED's per-pixel dimming, e.g. the recent Nosferatu. I despise MEMC, but I have it set to minimum on the AF9 because it's unpleasant without it, imo.

Almost everything I watch, I still watch on the ZD9; I just prefer the motion. With no MEMC engaged it looks phenomenal: the blur between frames is near-perfect at blending them at 24fps, or in 30fps games with poor or no screen-based motion blur. Add to that the full-screen brightness in HDR games, and the average picture level generally, vs the OLED.

I'm sure recent OLEDs are way better in terms of full-screen brightness than my AF9, but the 25% and 50% windows are still subpar, imo.
 
I've got 7k+ hours down on my 77" CX, no sign of burn-in, and I'm still totally happy with it. I'll only replace it when it breaks; the picture quality is still excellent, and I doubt a C5 offers a big difference.
I have the 55 inch CX as well. Put in countless thousands of hours. Zero issues whatsoever. Brilliant TV.
 
"you are more likely to have a backlight die on an LED which is arguably way worse than burn in."

I've easily changed the backlights on 2 TVs already and they're back to normal. Can I easily change the burned LEDs???
 
I'm sadly OCD, and this doesn't bother me.

If you understand what burn-in is, you will not get it or worry about it.

It's a type of uneven wear, but in order not to worry about it you have to understand uneven wear and how it is and isn't applicable to your panel.

Mixed content won't cause it. Black bars won't cause it. There are likely at least 3 active countermeasures running at any time on an LG to prevent it. You need to understand them.

It's not an issue unless you are weird and also the only one using the TV. You would need to create a perfect environment for it to happen, and then never let anyone else use the set for any other mixed content. If you watch YouTube on your gaming set, that's mixed content. Fuck, if you ever change games, that's mixed content, so there is just nothing to worry about. I worried about it too, when I was young, before I learned.

Other parts of the set will fuck up first. Power supply. If you only ever play one game, then by all means don't get an OLED. However, modern games often have dynamic UIs; Yotei, for example, was UI-less for much of the game. You will never get burn-in from Yotei, ever.

Using it as a PC monitor? It's good to go, mate. Just game occasionally or watch YouTube in full screen from time to time. These things are awesome and not as delicate as you think, and they use them in handhelds with no issue.
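To put the mixed-content argument in concrete terms, here's a toy model of uneven wear (my own sketch; real panels have compensation cycles and far more complex aging curves). The idea: treat each subpixel's degradation as accumulated drive level × hours, and note that visible burn-in tracks the *difference* in wear between a static HUD region and its surroundings, not the total.

```python
# Toy model of cumulative subpixel wear (a sketch, not a real panel model).
# Assumption: degradation ~ sum of (drive level * hours); burn-in risk
# tracks the wear *difference* between a region and its neighbours.

def wear_delta(scenarios):
    """scenarios: list of (hours, hud_region_level, background_level)."""
    hud, background = 0.0, 0.0
    for hours, hud_lvl, bg_lvl in scenarios:
        hud += hours * hud_lvl
        background += hours * bg_lvl
    return hud - background

# 2000 hours of one game with a bright static HUD (HUD at 90%, scene ~40%):
static = wear_delta([(2000, 0.9, 0.4)])

# Same 2000 hours, but only 500 on that game; for the other 1500 the old
# HUD area shows ordinary scene content, so both regions age in step.
mixed = wear_delta([(500, 0.9, 0.4), (1500, 0.4, 0.4)])

print(f"static HUD wear delta: {static:.0f}")  # 1000
print(f"mixed content delta:   {mixed:.0f}")   # 250
```

Same total hours, a quarter of the differential wear; that's the whole intuition behind "mixed content won't cause it".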
 