Clearly I'm no tech expert, as half the stuff you're talking about goes right over my head. I've just heard people complaining about frame rates on OLED TVs when it comes to gaming. As I understand it, a lot of console games are still 30 fps, so that seems problematic for OLED TVs. Then there's the prices...
The TV I picked up was only $500, has most of the gaming features you mentioned, and looks fantastic (at least to me). I'm not sure what tech it uses (it's a TCL QM7).
> If you play a lot of different games, then OLED is fine; if you play the same games all the time for multiple hours a day, then OLED is not fine (particularly Blizzard stuff, their HUDs are way too bright).

1200 hours of Apex Legends, no issue so far.
This is my 55" B7 that I used as a full-time PC monitor for 2 years at 140 nits. I also played a wide variety of HDR games, TV, movies, and videos on this one maxed out. Probably got 20,000 hours or more on this thing.
Full size link: https://ibb.co/v6vJFmrq
The "organic" nature of OLED's diodes age faster(especially the blue diodes)than non OLED's.CRT pixels and plasma pixels aren't organic but they have the same problem
It's not the "organic" nature of these emissive displays that causes burn-in.

The holy grail remains designing an emissive display whose pixels don't wear out through progressive brightness loss. MicroLED is such a technology, but it has proven extremely difficult to miniaturize it enough to produce a screen smaller than 110 inches, or to mass-produce it at a price normal people can pay.
> I have a mini LED display. It's better for displaying text and code than a lot of LG OLED panels I tested before that...

Was significantly better, especially with monitors, although OLED has improved.
I hear and believe what you say you experienced, but I still disagree. I worked with every QD-OLED-panelled TV from Samsung and Sony since their inception in 2021/22, and on the shop floor the ambient black level raise was evident on all of them.
The lights in the shop were suspended 20+ feet above the TVs, so there was no direct light on the screens. That's why it's called ambient black level raise.
I sometimes did after-hours sessions with staff or customers where the lights were severely dimmed, and you could still see it.
Samsung's use of the matte coating on the S95C/D/F made it even more obvious, for sure, but the Sony was still affected.
I don't think it's the end of the world or anything, but it is there.
> These generations of OLED panels (B7 and C1) max out around 140 nits full field in SDR, if memory serves; any higher and you'll start seeing ABL fluctuations (particularly with PC use).
>
> Microsoft's HDR-to-SDR translation is also completely broken on Windows: they use some bizarre piecewise gamma function that completely fucks up the lower range of the curve, noticeably raises blacks/shadow detail, and generally washes out the image. You want to run the Windows desktop in SDR at all times and only turn on HDR manually to play games or videos.
>
> That's why so much of the use is in SDR at 140 nits, but that also happens to be a comfortable level in a moderately lit room and is bright for a blacked-out room. Basically all SDR TVs used to run around 140-180 nits in their bright accurate modes (40-50 foot-lamberts). Low-light modes, and the SDR technical specification itself, are 100 nits, or 30 foot-lamberts.

This was a very long-winded technical explanation rationalizing your specific use case. And it's fine: you want to stick to 140 nits because you're in a dark room and don't want ABL kicking in all the time, cool. I'm well aware of the human eye's perception of brightness on a display relative to the environment. But you're missing the point, entirely.
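The piecewise-gamma complaint in the quoted explanation can be made concrete. Below is a minimal sketch, assuming (as is widely reported, though not confirmed anywhere in this thread) that Windows composites SDR content with the sRGB piecewise transfer curve while SDR content and calibrated displays target a pure 2.2 power-law gamma; the function names and sample values are mine, chosen only to show where the raised shadows come from:

```python
def srgb_eotf(v: float) -> float:
    """Piecewise sRGB transfer function: a linear segment near black,
    then a 2.4-exponent power segment."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def gamma22_eotf(v: float) -> float:
    """Pure 2.2 power-law gamma, what most SDR content is mastered against."""
    return v ** 2.2

for v in (0.02, 0.05, 0.10, 0.20):
    piecewise, power = srgb_eotf(v), gamma22_eotf(v)
    print(f"signal {v:.2f}: piecewise {piecewise:.5f}, "
          f"gamma 2.2 {power:.5f}, ratio {piecewise / power:.2f}x")

# Near black, the piecewise curve puts out noticeably more light
# (about 1.6x at signal 0.10, more below that), which shows up as
# raised blacks and washed-out shadow detail: the exact complaint
# in the quote above. The gap closes as the signal level rises.
```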
That is strictly for PC desktop use, though; this particular B7 also has many thousands of hours of HDR use with PC and console games at maxed-out OLED Light (100) and high peak panel brightness, which would be like 650 or 700 nits on the B7.
It's in a bedroom now, and full-screen white at 140 nits will literally blind you in the dark; it genuinely hurts your eyes if you fall asleep and wake up to it. You guys are wildly underestimating how bright 140 nits full field actually is; our perception of brightness is logarithmic. The only scenario where you wouldn't call it bright is a room with open windows on a sunny day.
Not a typo; that is strictly for PC desktop use, which should always be in SDR (see the reasons above). The S95B also maxes out around 150 nits on full-field white in SDR and would only look around 20-30% brighter if you completely maxed out its SDR mode (our perception of light is logarithmic, and 150 nits is much closer to the leveling-off peak of the curve than to the steep bottom). Most HDR games and movies actually sit primarily in the 150-300 nits APL range; the obscene values are only for specular highlights.
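A quick back-of-the-envelope check on that 20-30% figure, as a sketch only: it assumes Stevens' power law for brightness with the commonly cited exponent of about 1/3 (standing in for the "perception is logarithmic" point), and it assumes a maxed-out SDR mode around 300 nits full field, which is a guess rather than a measurement:

```python
# Assumption: Stevens' power law, perceived brightness ~ luminance ** 0.33,
# using the commonly cited exponent for large fields.
EXPONENT = 0.33

def perceived(nits: float) -> float:
    return nits ** EXPONENT

base = 150.0   # full-field SDR level quoted above
maxed = 300.0  # assumed maxed-out SDR level; a guess, not a measurement
increase = perceived(maxed) / perceived(base) - 1.0
print(f"{maxed:.0f} nits reads only ~{increase:.0%} brighter than {base:.0f} nits")
# Doubling the luminance comes out to roughly a 26% perceived increase,
# consistent with the 20-30% figure above.
```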
> I hear and believe what you say you experienced, but I still disagree.
I haven't noticed this in real-world use, ever.
When the sun peeked through the window I did, and my setup had to be moved, but that was with light shining directly on the screen.
> I got 7k+ hours down on my 77" CX, no sign of burn-in, and I'm still totally happy with it. I'll only replace it when it breaks; the picture quality is still excellent, and I doubt a C5 offers a big difference.

I have the 55-inch CX as well. Put in countless thousands of hours. Zero issues whatsoever. Brilliant TV.
> I guess some people are unlucky and get burn-in. Others are unlucky and get a pile of bird shit dropped on their head. It happens. But I still wouldn't stay at home just because, under very, very unlikely circumstances, a bird might shit on my head.

Getting burn-in is not a matter of luck but a matter of having a lot of static content on display for long periods of time.
> Had my LG CX since launch with thousands of hours of gaming and 0 issues.

I recently checked my LG CX; it has between 7,000 and 8,000 hours on it by now, and the picture still looks like it did on day one.
I've never worried about it.
I like mini-LED 'cause you get really awesome HDR brightness. It makes HDR more usable in a bright room.
> Buncha poors in this thread. "I drive a lot because it's my hobby and my tires wore out! Why can't someone figure this out yet?!"

Top mini-LEDs are much more expensive than OLEDs. OLEDs have become the poor man's choice, which is why they have amassed so many rabid fanboys. Burn-in essentially means "buy cheap, buy twice."
> Top mini-LEDs are more expensive than OLEDs. OLEDs have become the poor man's choice, which is why they have amassed so many rabid fanboys. Burn-in essentially means "buy cheap, buy twice."

Which ones are more expensive, and by how much?
> Sucks that OLED has the issues it has; however, nothing else really compares. So I guess it is what it is that it has a shorter effective lifespan.

I think on the more modern sets the issue is overblown.
> Top mini-LEDs are much more expensive than OLEDs. OLEDs have become the poor man's choice, which is why they have amassed so many rabid fanboys. Burn-in essentially means "buy cheap, buy twice."
> You pay more for ancient tech that tries super hard to mimic a fraction of OLED's power. Great choice!

So true; whichever way you dress it up, it's still an LCD display. The black levels and HDR are simply made for OLED screens, and look better.
> Top mini-LEDs are much more expensive than OLEDs. OLEDs have become the poor man's choice, which is why they have amassed so many rabid fanboys. Burn-in essentially means "buy cheap, buy twice."

Mini-LEDs are rubbish in terms of image quality, though.
> So true; whichever way you dress it up, it's still an LCD display. The black levels and HDR are simply made for OLED screens, and look better.

Imagine thinking that degrading organic material is the future and not liquid crystals.
> Synthetic micro-LEDs are the future: it's like OLED (self-emitting) but with no downsides.

Bargain, I am off to buy two of them lol.
But that price...
But... I can't fucking stand latency, and OLED is the best game in town when it comes to response time. I've seen plenty of big LCD screens with dogshit lag, and I'd rather have an OLED just for that.
> Do you mean less smearing from the lower pixel response time when you say latency? Or do you mean input lag?

Input lag above all.
> Input lag above all.
You're confusing different things. OLEDs do have really low pixel response time, but that PRT value doesn't have much to do with the input latency you'll experience; the amount of image processing going on and the max refresh rate of the screen are more what determines how much input lag a TV adds to the rest of the chain of lag: the console/PC, the engine, the controller, etc. You can see here that the input lag on a 2025 mini-LED isn't meaningfully different from a 2025 OLED:
0.7 ms of extra lag on the QN90F at 4K60 and 4K120; it's actually lower on the QN90F at max refresh rate because it's 165 Hz vs 144 Hz (a frame is 1000/165 ≈ 6.1 ms vs 1000/144 ≈ 6.9 ms). So the QN90F, an LCD, would give you the lowest input lag of all 2025 TVs. Even the worst 2024 and 2025 models add around 9 ms, mostly Sonys, because they tend to leave more image processing turned on in Game mode. Anything above that is due to a wireless video transfer from an external box or a native 8K panel.
If you mean PRT, then yes, an OLED TV is way faster than any LCD TV and therefore has less pixel smearing. How fast the pixels change could affect your reaction time and therefore make it seem like there's more input lag, but that's a different story. I've heard these two conflated so many times that I have to set it straight:
Response Time ≠ Input Lag
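To put toy numbers on that distinction, here's a rough sketch; every figure below is an illustrative assumption, not a measurement of any particular set. The point is that the input-lag total is dominated by the controller, the game's frame time, and the TV's processing, while the pixel response term only changes how clean the transition looks:

```python
# All numbers are illustrative assumptions, not measurements.

# Input lag: time from your input until the new frame *starts* to appear.
controller_ms   = 8.0        # wireless controller polling/radio, assumed
game_frame_ms   = 1000 / 60  # engine running at 60 fps, about 16.7 ms
display_proc_ms = 5.5        # game-mode processing; similar on a good
                             # OLED and a good mini-LED per the post above
input_lag_ms = controller_ms + game_frame_ms + display_proc_ms

# Response time: how long a pixel takes to finish its color transition
# once the frame arrives. This is where OLED and LCD genuinely differ.
oled_response_ms = 0.2
lcd_response_ms  = 5.0

print(f"input lag chain: {input_lag_ms:.1f} ms (panel tech barely matters)")
print(f"pixel transition: OLED {oled_response_ms} ms vs LCD {lcd_response_ms} ms")
# Swapping the panel changes motion clarity (smearing), not how late the
# image arrives: input lag is dominated by the rest of the chain.
```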