Clearly I'm no tech expert, as half the stuff you're talking about goes right over my head. I've just heard people complaining about frame rates on OLED TVs when it comes to gaming. As I understand it, a lot of console games are still 30 fps, so that seems problematic for OLED TVs. Then there's the prices...
The TV I picked up was only $500, has most of the gaming features you mentioned, and looks fantastic (at least to me). I'm not sure what tech it uses (TCL QM7).
1200 hours of Apex Legends, no issues so far. If you play a lot of different games then OLED is fine; if you play the same games all the time for multiple hours a day then OLED is not fine (particularly Blizzard stuff, their HUDs are way too bright).
This is my 55" B7 that I used as a full-time PC monitor for 2 years at 140 nits. I also played a wide variety of HDR games, TV, movies, and videos on this one maxed out. Probably got 20,000 hours or more on this thing.
Full size link: https://ibb.co/v6vJFmrq
The "organic" nature of OLED's diodes makes them age faster (especially the blue diodes) than non-OLED displays. CRT pixels and plasma pixels aren't organic, but they have the same problem.
It's not the "organic" nature of these emissive displays that causes burn-in.
The holy grail remains designing an emissive display with pixels that don't wear through progressive brightness loss. MicroLED is such a technology, but it's been extremely difficult to miniaturize it enough to produce a screen smaller than 110 inches, or to mass-produce it at a price normal people can pay.
It was significantly better, especially with monitors, although OLED has improved. I have a mini-LED display. It's better for displaying text and code than a lot of the LG OLED panels I tested before it...
I hear and believe what you say you experienced, but still disagree. I worked with every QD-OLED-panelled TV from Samsung and Sony since their inception in 2021/22, and on the shop floor the ambient black level raise was evident on all of them.
The lights in the shop were suspended 20+ feet above the TVs, so there was no direct light on the screen. That's why it's called ambient black level raise.
I sometimes did after-hours sessions with staff or customers where the lights were severely dimmed, and you could still see it.
Samsung's use of the matte coating on the S95C/D/F made it even more obvious for sure, but the Sony was still affected.
I don't think it's the end of the world or anything, but it is there.
This was a very long-winded technical explanation rationalizing your specific use case. And it's fine: you want to stick to 140 nits because you're in a dark room and don't want ABL kicking in all the time, cool. I'm well aware of how the human eye perceives a display's brightness relative to the environment. But you're missing the point entirely.

These generations of OLED panels (B7 and C1) max out around 140 nits full-field in SDR, if memory serves; any higher and you'll start seeing ABL fluctuations (particularly with PC use). Microsoft's HDR-to-SDR translation is also completely broken on Windows: they use some bizarre piecewise gamma function that completely fucks up the lower range of the curve, noticeably raises blacks/shadow detail, and generally washes out the image. You want to run the Windows desktop in SDR at all times and only turn on HDR manually to play games or videos.

That's why so much of the use is in SDR at 140 nits, but it also happens to be a comfortable level in a moderately lit room and is bright for a blacked-out room. Basically all SDR TVs used to run around 140-180 nits in their bright accurate modes (40-50 foot-lamberts). Low-light modes and the SDR technical specification are 100 nits, or about 30 foot-lamberts.
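As a sanity check on those figures: nits and foot-lamberts are related by a fixed constant (1 fL ≈ 3.426 cd/m², i.e. nits), so the conversion is one division. A minimal sketch of that arithmetic, just to show the quoted ranges line up:

```python
# Convert luminance in nits (cd/m^2) to foot-lamberts.
# 1 fL = (1/pi) candela per square foot, which works out to about 3.426 cd/m^2.
NITS_PER_FOOTLAMBERT = 3.42626

def nits_to_foot_lamberts(nits: float) -> float:
    return nits / NITS_PER_FOOTLAMBERT

for nits in (100, 140, 180):
    print(f"{nits} nits ≈ {nits_to_foot_lamberts(nits):.1f} fL")
# 100 nits ≈ 29.2 fL  (the SDR reference level, roughly 30 fL)
# 140 nits ≈ 40.9 fL and 180 nits ≈ 52.5 fL (the "bright accurate" 40-50 fL range)
```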
That is strictly for PC desktop use, though. This particular B7 also has many thousands of hours of HDR use with PC and console games at maxed-out OLED Light (100) and high peak panel brightness, which would be around 650-700 nits on the B7.
It's in a bedroom now, and full-screen white at 140 nits will literally blind you in the dark; it genuinely hurts your eyes if you fall asleep and wake up to it. You guys are wildly underestimating how bright 140 nits full-field actually is, because our perception of brightness is logarithmic. The only scenario where you wouldn't call it bright is a room with open windows on a sunny day.
Not a typo, that is strictly for PC desktop use, which should always be in SDR (see reasons above). The S95B also maxes out around 150 nits on full-field white in SDR, and would only look around 20-30% brighter if you completely maxed out its SDR mode (our perception of light is logarithmic, and 150 nits is much closer to the leveling-off top of the curve than the steep bottom). Most HDR games and movies actually sit primarily in the 150-300 nits APL range; the obscene values are only for specular highlights.
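To put a rough number on the "only 20-30% brighter" claim: a common rule of thumb is Stevens' power law, where perceived brightness scales with luminance to roughly the 0.33 power. The exponent and the assumption that "maxed out" means about double the luminance (150 → 300 nits) are my own approximations here, not measured S95B figures:

```python
# Back-of-envelope sketch of logarithmic-ish brightness perception.
# Stevens' power law approximation: perceived brightness ~ luminance ** 0.33.
# The 0.33 exponent is a textbook rule of thumb, not a display measurement.
BRIGHTNESS_EXPONENT = 0.33

def perceived_gain(base_nits: float, new_nits: float) -> float:
    """Fractional perceived brightness increase going from base to new luminance."""
    return (new_nits / base_nits) ** BRIGHTNESS_EXPONENT - 1.0

# Doubling full-field luminance from 150 to 300 nits (an assumed "maxed out" value):
print(f"{perceived_gain(150, 300):.0%}")  # about 26% perceived increase, not 100%
```

Doubling the light only reads as about a quarter brighter, which is why a 150-nit full field already feels bright in a dim room.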
I hear and believe what you say you experienced but still disagree.
I haven't noticed this in real time use ever.
When the sun peeked through the window I did, and my setup had to be moved, but that was with light shining directly on the screen.
I have the 55 inch CX as well. Put in countless thousands of hours. Zero issues whatsoever. Brilliant TV.

I got 7k+ hours on my 77" CX, no sign of burn-in, and I'm still totally happy with it. I'll only replace it when it breaks; the picture quality is still excellent, and I doubt a C5 offers a big difference.