
For those worried about OLED burn-in

And you don't have to worry about burn in but that backlight will fuck up at some point...
Couldn't care less about any failures, as I got a free 6-year warranty on my mini-LED, while the vast majority of TV warranties explicitly state that they do not cover OLED burn-in.
 
I have two OLEDs, both with over 15k hours, one of which has been used exclusively as a PC monitor. Zero burn-in, and even if there were, I wouldn't trade them for an inferior picture out of fear, especially now that I've had experience with these TVs.
I'm more concerned about people's obsession with OLEDs in 2026. Sure, when I bought my S95C it was the shit to have, but now mini-LEDs and soon RGB LEDs are brighter and more affordable (if you don't buy Sony), while having blacks that 99% of us can't distinguish from an OLED's. Sure, last year LG's tandem OLED kept up in brightness, but it's an uphill battle.
Then 99% of you need glasses.
How much time do you spend watching tiny white dots on your screen? Most likely none.

In real scenes a good mini-LED will always look better than an OLED, and you don't have to worry about burn-in.
It's not just white dots; look at the bloom around anything light on a black background, like the time in that image. If you play games or watch movies with a lot of dark areas, they aren't remotely similar, either.

If mini-LED is really better, why are OLEDs rated better in almost every price range?
 
What about VRR Flicker? Oh right, we forgot about that one tiny thing that is actually not so tiny if you're a gamer.
It's really bad on my monitor but not perceptible on my TV. It's so bad on my monitor that I stopped using VRR altogether. High refresh rate at least helps with screen tearing, and v-sync-induced input lag at 82fps isn't as bad as at 60, 40 or, God forbid, 30fps.

Don't know what my TV is doing that makes the flicker so faint I don't see it there but do on my monitor.

As always, perfect tech doesn't exist.
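
The 82fps point is straightforward frame-time arithmetic. A minimal sketch (Python; the "worst-case wait" framing is my own simplification of v-sync latency, not from the post):

[code]
# Frame time shrinks as fps climbs, and with v-sync an input that just
# misses a refresh can wait up to roughly one extra frame time.
for fps in (30, 40, 60, 82, 120):
    frame_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {frame_ms:5.1f} ms/frame, "
          f"up to ~{frame_ms:5.1f} ms extra v-sync wait")
[/code]

At 82fps a frame lasts ~12.2 ms versus 33.3 ms at 30fps, which is why the v-sync penalty feels so much milder.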
 
It's really bad on my monitor but not perceptible on my TV. It's so bad on my monitor that I stopped using VRR altogether. High refresh rate at least helps with screen tearing, and v-sync-induced input lag at 82fps isn't as bad as at 60, 40 or, God forbid, 30fps.

Don't know what my TV is doing that makes the flicker so faint I don't see it there but do on my monitor.

As always, perfect tech doesn't exist.
No. But compared to any of these flat-panel techs, a decent CRT TV will look perfect specifically in motion.
 
Anecdotally, in my 4 years with the LG C1 I've had no issues.
I was really worried about burn-in at first. I was actually going to buy a Samsung QLED instead, but I found out at the last moment that the version of the model I wanted that was sold in my region had a shittier panel (so the great reviews wouldn't be representative of what I'd get), so I ended up going with the C1 instead.

For the first year I was pretty paranoid, I would turn it off even if I was going to take a piss or grab something to drink. But eventually I just started treating it like a normal TV and it has been completely fine. Out of curiosity I still check for burn in like once or twice a year, just did so the other day and couldn't find anything.

That said I usually don't play the same games for very long. It's rare for me to spend more than 100 hours on a game, except for a few multiplayer titles.
 
I've had an LG C1 48" for the past 3 years, have around 5000-6000 hours on it, and haven't experienced a single burn-in issue, but I do take some precautions to help minimise risk: hiding the taskbar on Windows, a black wallpaper, and, unless I'm gaming or watching movies, manually keeping the screen brightness turned down relatively low. I also run pixel refresh regularly and have the pixel-shift feature switched on.
 
I'm also using my monitor for work, so burn-in might actually be more of a real risk in my case. Furthermore, most OLEDs fucking suck ass when it comes to displaying text/code.
 
yup, and you will get burn in on OLED. it's inevitable.

my Samsung S95B has a visible reticle in the center of the screen. it's only really visible if you display a fully uniform and bright color, and even then it's pretty faint, but it's clearly burn in.
doing some pixel refresh cycles would probably reduce it to be even less noticeable,
but claiming OLED TVs will not get burn in is a straight up lie.

depending on what you play this is a bigger or lesser problem of course. but if you play a lot of shooters, maybe even a lot of the same shooter, that reticle and probably some of the HUD will absolutely burn in over the span of a few years.
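
The reticle example fits the usual mental model of burn-in as cumulative, uneven wear. A toy sketch (Python; the drive levels and hours are invented for illustration, not panel data):

[code]
# Toy model: assume pixel wear accumulates roughly in proportion to
# drive level x hours. A static bright HUD element then ages faster
# than the screen average, and the visible artifact is the wear *gap*.
AVG_LEVEL = 0.30      # assumed average picture level of varied content
HUD_LEVEL = 0.90      # assumed level of a white reticle/HUD element
HOURS = 3 * 365 * 3   # ~3 h/day of the same shooter for 3 years

background_wear = AVG_LEVEL * HOURS
hud_wear = HUD_LEVEL * HOURS
print(f"HUD pixels age ~{hud_wear / background_wear:.0f}x faster "
      f"than the screen average")
[/code]

Under those made-up numbers the HUD pixels age three times faster, which is why varied content spreads wear evenly while one game's HUD slowly prints itself into the panel.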

Just because you got burn-in doesn't mean everyone else is lying.


I've had an LG B9 for 6 years now, use it all the time, zero burn-in.
 
I have two OLEDs, both with over 15k hours, one of which has been used exclusively as a PC monitor. Zero burn-in, and even if there were, I wouldn't trade them for an inferior picture out of fear, especially now that I've had experience with these TVs.

Then 99% of you need glasses.

It's not just white dots; look at the bloom around anything light on a black background, like the time in that image. If you play games or watch movies with a lot of dark areas, they aren't remotely similar, either.

If mini-LED is really better, why are OLEDs rated better in almost every price range?
High-end LCDs and OLEDs trade blows in picture quality, but you're acting like OLED is superior in every way, which it simply is not.

The camera captures way more blooming than your eyes will ever see, and every reputable reviewer mentions that as a disclaimer. Blooming on my mini-LED is pretty much imperceptible.
 
I've had two plasma televisions and two OLED televisions now. I just make sure I watch different content with a variety of aspect ratios and keep protective screen settings on, like "Adjust Logo Luminance" set to high and "Screen Move" set to on. I've not had any issues yet except for one plasma that was on its final death rattle anyway.
 
Just because you got burn-in doesn't mean everyone else is lying.


I've had an LG B9 for 6 years now, use it all the time, zero burn-in.

open a fully bright pink or red image, and look at the screen.
I would be very surprised if you didn't see elements burnt in. I think it's almost impossible.
 
open a fully bright pink or red image, and look at the screen.
I would be very surprised if you didn't see elements burnt in. I think it's almost impossible.

I see nothing on a pink image.

Also, corners:

[images: close-up photos of all four screen corners]


No dead pixels either. B2 from 2022.
 
I haven't had any burn-in issues, but I don't play video games on my TV, opting for a monitor instead. Could that be saving my television from burn-in?
 
My C9, used for live TV, has channel logos burned in.
My other one, a CX used for desktop and gaming, has the Dota 2 HUD burned in.

They all burn in, but it's worth it.
 
High-end LCDs and OLEDs trade blows in picture quality, but you're acting like OLED is superior in every way, which it simply is not.

The camera captures way more blooming than your eyes will ever see, and every reputable reviewer mentions that as a disclaimer. Blooming on my mini-LED is pretty much imperceptible.
I didn't say they were superior in every way, just that they are superior… and they are. You literally said mini-LEDs "always look better" and now they trade blows? And since we are talking about "reputable reviewers", why do they all have OLEDs listed as the top TVs until price is a concern?
 
According to some people in this thread nuance and context don't exist, OLED = OLED, and anecdotal evidence is superior to actual testing 🤔
Real-world usage trumps destructive testing, and at this point there has been well over a decade of it. Sure, the engine in my car will fail someday, but dumping sand in the oil will speed that up. That doesn't mean the test means anything.
 
Just because you got burn-in doesn't mean everyone else is lying.


I've had an LG B9 for 6 years now, use it all the time, zero burn-in.
All that means is you've restricted yourself from displaying content that will easily cause burn-in.

Try playing Diablo 4 for 1k hours with the full HUD; you 100% will get burn-in.
 
I've had a couple of OLED TVs, a C1 and a C4. I used the first one for PC gaming from time to time, and I ended up selling it after 2 years of use to get the C4. No burn-in or any other problem.

I recently got a laptop that has an OLED screen and an OLED monitor for my desktop so I guess I'll come back later if I get any burn in. So far it has been great.
 
If your prime activity is gaming, why would you get an OLED TV? Aside from burn-in, they don't do well with low frame rates. Pretty clear they're made for cinema.
 
All that means is you've restricted yourself from displaying content that will easily cause burn-in.

Try playing Diablo 4 for 1k hours with the full HUD; you 100% will get burn-in.

Why would he want to do that?

If your prime activity is gaming, why would you get an OLED TV? Aside from burn-in, they don't do well with low frame rates. Pretty clear they're made for cinema.

Are you kidding me? Ultra-fast pixel response time and low input lag (on LG at least), with support for all the gaming stuff (120Hz+, VRR, HGIG), make them the best panels for gaming.

In the Bravia 9 review you can read that it's bad for gaming because it smears the image in motion, thanks to slow pixel response time in game mode. OLEDs show 30fps how it really looks (like shit), but at 60fps or above they are brilliant. You could say they are WORSE for movies, because 24fps is a super low framerate to show on screen (without motion interpolation)...
 
Are you kidding me? Ultra-fast pixel response time and low input lag (on LG at least), with support for all the gaming stuff (120Hz+, VRR, HGIG), make them the best panels for gaming.

In the Bravia 9 review you can read that it's bad for gaming because it smears the image in motion, thanks to slow pixel response time in game mode. OLEDs show 30fps how it really looks (like shit), but at 60fps or above they are brilliant. You could say they are WORSE for movies, because 24fps is a super low framerate to show on screen (without motion interpolation)...
Clearly I'm no tech expert, as half the stuff you're talking about goes right over my head. I've just heard people complaining about frame rates on OLED TVs when it comes to gaming. As I understand it, a lot of console games are still 30fps, so that seems problematic for OLED TVs. Then there's the prices...

The TV I picked up was only $500, has most of the gaming features you mentioned, and looks fantastic (at least to me), and I'm not sure what tech it uses (TCL QM7).
 
All display tech has problems. But it's hilarious how often OLED enthusiasts gaslight everyone into believing their display tech of choice is objectively superior when it just flat out isn't. They always downplay the risks of burn-in as though that were the only problem.

There is a whole dedicated thread to OLED banding and vignetting on AVS. It's a widespread problem, doesn't matter if it's QD-OLED or W-OLED. Said thread has been going since 2017 with over 24000 posts and it is still a problem, doesn't matter if you're rocking a Sony A95L, an LG G5, or a Samsung S95F. Don't take my word for it, go see for yourselves.


The irony is that OLED is allegedly the best tech for dark scenes, and yet, some dark sequences in movies like Dune or TV shows like House of the Dragon look like you're watching a video from 2006 era Youtube with macroblocking. OLED may have "pure black" but it has issues displaying anything near black, resulting in black crush, or the aforementioned banding/vignetting. This isn't a constant problem, but will show itself in "challenging scenes" when the majority of the screen has low nit content onscreen. Do you like dithering? OLED has it in spades!

Oh and, let me know when OLED plans on addressing fullscreen white. Watching hockey games on an OLED is an exercise in comedy. Sustained 100% white on an OLED is still below what budget LCD TVs can do.

MiniLED has its own problems, but at least folks don't have to create a defense force on its behalf. I own a Bravia 9 and it has blooming. Anyone claiming otherwise lacks attention to detail, it's there. The motion clarity is also inferior to any OLED out there. But, I will never have to ever worry about what is displayed on my screen, or need to run screen retention maintenance cycles (which lower the longevity of your OLED display). It gave me the least amount of compromises out of any display at the time of purchase for my use case, but I am still awaiting something better.
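
The near-black complaints above have a simple numeric root. A rough sketch (Python; the 8-bit source, gamma 2.2 and 200-nit peak are my assumptions, not from the post) of why the steps between adjacent code values are proportionally huge just above black:

[code]
# With a gamma EOTF, the relative luminance jump between neighboring
# code values is largest near black, so low-APL scenes expose banding
# and dithering that midtones would hide.
GAMMA = 2.2
PEAK_NITS = 200.0   # assumed SDR peak
LEVELS = 2**8 - 1   # assumed 8-bit source, typical of streaming video

for code in (1, 2, 3, 4, 16, 64):
    lo = (code / LEVELS) ** GAMMA * PEAK_NITS
    hi = ((code + 1) / LEVELS) ** GAMMA * PEAK_NITS
    print(f"code {code:>2}: {lo:8.4f} -> {hi:8.4f} nits "
          f"(+{(hi - lo) / lo * 100:3.0f}% step)")
[/code]

Going from code 1 to code 2 is a ~360% luminance jump, while 64 to 65 is under 4%; a display that can actually resolve those bottom steps (like an OLED) shows every flaw in the source.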
 
All display tech has problems. But it's hilarious how often OLED enthusiasts gaslight everyone into believing their display tech of choice is objectively superior when it just flat out isn't. They always downplay the risks of burn-in as though that were the only problem.

There is a whole dedicated thread to OLED banding and vignetting on AVS. It's a widespread problem, doesn't matter if it's QD-OLED or W-OLED. Said thread has been going since 2017 with over 24000 posts and it is still a problem, doesn't matter if you're rocking a Sony A95L, an LG G5, or a Samsung S95F. Don't take my word for it, go see for yourselves.


The irony is that OLED is allegedly the best tech for dark scenes, and yet, some dark sequences in movies like Dune or TV shows like House of the Dragon look like you're watching a video from 2006 era Youtube with macroblocking. OLED may have "pure black" but it has issues displaying anything near black, resulting in black crush, or the aforementioned banding/vignetting. This isn't a constant problem, but will show itself in "challenging scenes" when the majority of the screen has low nit content onscreen. Do you like dithering? OLED has it in spades!

Oh and, let me know when OLED plans on addressing fullscreen white. Watching hockey games on an OLED is an exercise in comedy. Sustained 100% white on an OLED is still below what budget LCD TVs can do.

MiniLED has its own problems, but at least folks don't have to create a defense force on its behalf. I own a Bravia 9 and it has blooming. Anyone claiming otherwise lacks attention to detail, it's there. The motion clarity is also inferior to any OLED out there. But, I will never have to ever worry about what is displayed on my screen, or need to run screen retention maintenance cycles (which lower the longevity of your OLED display). It gave me the least amount of compromises out of any display at the time of purchase for my use case, but I am still awaiting something better.

OLED owner here (C2); before that I had a Panasonic plasma, before that a Samsung HDTV, and before that a Sony Trinitron.

Personally, the BIGGEST problem for me with OLED is the stuttering with movies (which, let's be honest, can't be mitigated with "true cinema" mode or even motion smoothing) and the fact that it completely killed 30fps gaming for me - there are so many games still stuck without an FPS patch/boost/remaster, and the sample-and-hold and near-instant response time of these things make 30fps unplayable; no amount of in-game motion blur can fix that (sorry Rofif).

Back in 2022, when I was going to get my new set, I initially opted for a Philips PUS model with an LED VA screen and... things hadn't evolved that much from the LCDs of yore, what with the banding and vignetting, so I said "fuck this, let's see what the OLED fuss is all about" and went with the C2.

It has served me well for over 2 years now, can't really complain. Still, coming from a plasma which A) still had very good blacks compared to LCD and B) could actually handle 30fps content/games really well, the C2 didn't blow me away...
Also, if I had experienced the 30fps stutter with games firsthand I'd have opted for a MiniLED one - it's THAT bad for me, to the point that my next TV isn't going to be an OLED.
Yes, consoles have been targeting "performance/60fps" modes for a good 6 years now, but... there's still a trove of amazing games that are stuck at 30fps which I wouldn't be able to play on an OLED.

PS: I don't hate the tech, but the fandom and circlejerk are as strong as the FromSoftware one; I cringe every time I see a new post on Reddit mentioning "I've finally seen the light"/"I'm blown away!"/"Can't go back to non-OLED" etc etc
 
I leave my LG C1 on pretty much all day, since I got it when they were first released, and have easily 1k+ hours of Diablo 2 on it with zero issues

Edit: it's not perfect tech tho. Playing games in bright daylight is shit

Especially if the game is dark and the room is bright. Makes it really shitty to play horror games during the day. I could fix it with blackout curtains, but I'm pretty lazy and I only game at night really
 
OLED owner here (C2); before that I had a Panasonic plasma, before that a Samsung HDTV, and before that a Sony Trinitron.

Personally, the BIGGEST problem for me with OLED is the stuttering with movies (which, let's be honest, can't be mitigated with "true cinema" mode or even motion smoothing) and the fact that it completely killed 30fps gaming for me - there are so many games still stuck without an FPS patch/boost/remaster, and the sample-and-hold and near-instant response time of these things make 30fps unplayable; no amount of in-game motion blur can fix that (sorry Rofif).

Back in 2022, when I was going to get my new set, I initially opted for a Philips PUS model with an LED VA screen and... things hadn't evolved that much from the LCDs of yore, what with the banding and vignetting, so I said "fuck this, let's see what the OLED fuss is all about" and went with the C2.

It has served me well for over 2 years now, can't really complain. Still, coming from a plasma which A) still had very good blacks compared to LCD and B) could actually handle 30fps content/games really well, the C2 didn't blow me away...
Also, if I had experienced the 30fps stutter with games firsthand I'd have opted for a MiniLED one - it's THAT bad for me, to the point that my next TV isn't going to be an OLED.
Yes, consoles have been targeting "performance/60fps" modes for a good 6 years now, but... there's still a trove of amazing games that are stuck at 30fps which I wouldn't be able to play on an OLED.

PS: I don't hate the tech, but the fandom and circlejerk are as strong as the FromSoftware one; I cringe every time I see a new post on Reddit mentioning "I've finally seen the light"/"I'm blown away!"/"Can't go back to non-OLED" etc etc

OLEDs don't cause stuttering; they have a near-instantaneous pixel response time, which means 30fps games with poor motion blur or none at all will look very bad. LCDs only mask the problem due to their worse pixel response time. If LCDs get as fast, they will have the same problem. It is a problem, but with games it's the devs' fault, not the display's.
 
OLEDs don't cause stuttering; they have a near-instantaneous pixel response time, which means 30fps games with poor motion blur or none at all will look very bad. LCDs only mask the problem due to their worse pixel response time. If LCDs get as fast, they will have the same problem. It is a problem, but with games it's the devs' fault, not the display's.
The fast pixel response time is why, without help, OLEDs don't fare well with under-60Hz content, i.e. 30fps games and 24Hz movies.

LCDs handle 30fps and 24Hz better, but are not as good with faster-moving scenarios like high-fps gaming, due to their slower response times.

I personally have no issues gaming on both.
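
The sample-and-hold point can be put in numbers. A minimal sketch (Python; the 960 px/s pan speed is an arbitrary example of mine) of eye-tracking blur on a full-persistence display:

[code]
# On a sample-and-hold display each frame stays lit for the whole frame
# time, so an object your eye tracks at `speed` px/s smears across
# speed * hold_time pixels per frame on the retina.
def persistence_blur_px(speed_px_s: float, fps: float) -> float:
    hold_time_s = 1.0 / fps   # full persistence: held for the frame time
    return speed_px_s * hold_time_s

for fps in (24, 30, 60, 120):
    print(f"{fps:>3} fps: ~{persistence_blur_px(960, fps):4.0f} px of blur "
          f"at 960 px/s")
[/code]

At 24-30fps each frame is held long enough that the jump between frames is large and visible, which reads as the stutter described above; doubling the framerate halves both the hold time and the jump.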
 
[image]


Don't buy an OLED.

open a fully bright pink or red image, and look at the screen.
I would be very surprised if you didn't see elements burnt in. I think it's almost impossible.

The fuck are you guys doing, 100% maxed-out HDR Vivid mode running the same static HUD/content 90% of the time? I got a 48" C1 which has been a full-time, desktop-only PC monitor (no gaming, movies, or TV) running 8-12 hours every day for over 4 years with no visible burn-in or anomalies on any full-field RGBCMY pattern or 10-point gray percentage pattern. I simply turned on Windows dark mode and auto-hide the taskbar. I suppose the biggest deterrent is I only run it in SDR at ~140 nits, which is the full-field white peak of the panel (to avoid ABL fluctuations). It does however have a bunch of dead sub-pixels along the edges, which is apparently a pretty common failure with WOLEDs, but being 4K sub-pixels they're too small to be visible in actual use. Still, I do plan on getting it replaced under the 5-year extended warranty though.
 
The fuck are you guys doing, 100% maxed-out HDR Vivid mode running the same static HUD/content 90% of the time? I got a 48" C1 which has been a full-time, desktop-only PC monitor (no gaming, movies, or TV) running 8-12 hours every day for over 4 years with no visible burn-in or anomalies on any full-field RGBCMY pattern or 10-point gray percentage pattern. I simply turned on Windows dark mode and auto-hide the taskbar. I suppose the biggest deterrent is I only run it in SDR at ~140 nits, which is the full-field white peak of the panel (to avoid ABL fluctuations). It does however have a bunch of dead sub-pixels along the edges, which is apparently a pretty common failure with WOLEDs, but being 4K sub-pixels they're too small to be visible in actual use. I do plan on getting it replaced under an extended warranty though.
[image]
 
The fuck are you guys doing, 100% maxed-out HDR Vivid mode running the same static HUD/content 90% of the time? I got a 48" C1 which has been a full-time, desktop-only PC monitor (no gaming, movies, or TV) running 8-12 hours every day for over 4 years with no visible burn-in or anomalies on any full-field RGBCMY pattern or 10-point gray percentage pattern. I simply turned on Windows dark mode and auto-hide the taskbar. I suppose the biggest deterrent is I only run it in SDR at ~140 nits, which is the full-field white peak of the panel (to avoid ABL fluctuations). It does however have a bunch of dead sub-pixels along the edges, which is apparently a pretty common failure with WOLEDs, but being 4K sub-pixels they're too small to be visible in actual use. Still, I do plan on getting it replaced under the 5-year extended warranty though.

my burn in is barely visible, but I play a lot of Apex Legends, and that shows.

this is the whole screen:
[image]


this is the center, with a visible reticle, in a close-up:
[image]


this is the bottom left of the screen in a close-up, where the character HUD is faintly visible
[image]


it's slightly more visible in person than on these pictures, but it's not bad.

again, you don't really notice it unless you have a fully uniform image on full brightness. but it's definitely an issue.
 
Correct the "Organic" nature.
CRT pixels and plasma pixels aren't organic, but they have the same problem.

It's not the "organic" nature of these emissive displays that causes burn-in.

The holy grail remains designing an emissive display with pixels that wear in some way other than progressive brightness loss. MicroLED is such a technology, but it's been extremely difficult to miniaturize it enough to produce a screen smaller than 110 inches, or to mass-produce it at a price that normal people can pay.
 
my burn in is barely visible, but I play a lot of Apex Legends, and that shows.

this is the whole screen:
[image]


this is the center, with a visible reticle, in a close-up:
[image]


this is the bottom left of the screen in a close-up, where the character HUD is faintly visible
[image]


it's slightly more visible in person than on these pictures, but it's not bad.

again, you don't really notice it unless you have a fully uniform image on full brightness. but it's definitely an issue.

What model is it?

CRT pixels and plasma pixels aren't organic, but they have the same problem.

It's not the "organic" nature of these emissive displays that causes burn-in.

The holy grail remains designing an emissive display with pixels that wear in some way other than progressive brightness loss. MicroLED is such a technology, but it's been extremely difficult to miniaturize it enough to produce a screen smaller than 110 inches, or to mass-produce it at a price that normal people can pay.

Most plasma burn-in (especially later gens) wasn't actually burn-in, it was temporary image retention (in varying degrees of stubbornness). In my experience it would always wear away, even if it took a while. I'm curious whether, if you stop displaying the offending HUDs/static content, OLED burn-in will similarly disappear over time.
 
All OLEDs get burn-in, all LCDs could suffer a backlight failure; shit happens, things don't last forever. Personally, after having 3 OLEDs and all of them suffering burn-in, I'll never get another for the main TV in the house. For a cinema room or something like that, sure, it's great, but if you're watching TV with channel logos and on-screen news tickers etc., OLEDs will just get fucked.
 
Samsung S95B

I am playing at max backlight, HDR on.
again, it's barely visible, but after years of Apex Legends, it accumulated into a faint reticle and a faint imprint of the HUD.

You flew too close to the sun for a 1st gen QD-OLED running effectively maxed out. I have no doubt you'd be fine blazing it if you were playing a variety of games, but if you got 1 specific HUD on super frequently, you probably want to cut out the HDR and dial back the brightness in general.
 
Did they seriously try to equate reliability based on their statistically insignificant number of test units? I thought this site was smarter than that.
 
You flew too close to the sun for a 1st gen QD-OLED running effectively maxed out. I have no doubt you'd be fine blazing it if you were playing a variety of games, but if you got 1 specific HUD on super frequently, you probably want to cut out the HDR and dial back the brightness in general.

I play tons of different games, but Apex sessions are among the longest at a time.
I have a trio that I play with on the weekends; we play ranked together, and it's usually around 4 to 5 hours at a time.

on a daily basis, outside those ranked sessions, I play it less than 1h on average; maybe it would even average out below 30min. maybe 2 to 3 rounds of mixtape (TDM, Control, Gun Run), but that's it.

I think a reticle forming in the center is a danger for everyone, however, given how insanely popular shooters are and that all of them have some form of dot, lines or circle in the center of the screen at all times.
 
This is my 55" B7 I used as a full time PC monitor for 2 years at 140nits, I also played a wide variety of HDR games, tv, movies, and videos on this one maxed out. Probably got 20,000 hours or more on this thing.

[image]


Full size link: https://ibb.co/v6vJFmrq
See guys? Force your HDR display to be an SDR display by curtailing its advertised capabilities down to 140 nits, and everything will be fine! If you insist on displaying HDR content on your expensive HDR display as advertised by the manufacturer, make sure you adjust your behavior to what the display needs, or it's your fault.
 
See guys? Force your HDR display to be an SDR display by curtailing its advertised capabilities down to 140 nits, and everything will be fine! If you insist on displaying HDR content on your expensive HDR display as advertised by the manufacturer, make sure you adjust your behavior to what the display needs, or it's your fault.

These generations of OLED panels (B7 and C1) max out around 140 nits full-field in SDR if memory serves; any higher and you'll start seeing ABL fluctuations (particularly with PC use). Microsoft's HDR-to-SDR translation is also completely broken on Windows: they use some bizarre piecewise gamma function that completely fucks up the lower range of the curve, noticeably raises blacks/shadow detail, and generally washes out the image. You want to run the Windows desktop in SDR at all times, and only turn on HDR manually to play games or videos. That's why so much use is in SDR at 140 nits, but it also happens to be a comfortable level in a moderately lit room and is bright for a blacked-out room. Basically all SDR TVs used to run around 140-180 nits in their bright accurate modes (40-50 foot-lamberts). Low-light modes and the SDR technical specification are 100 nits, or 30 foot-lamberts.

That is strictly for PC desktop use though, this particular B7 also has many thousands of hours of maxed out 100 OLED light, high peak panel brightness HDR use with PC and console games, which would be like 650 or 700 nits on the B7.
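
To make the "piecewise gamma" complaint concrete: a sketch (Python; sRGB's piecewise curve is used here as an assumed stand-in for whatever transfer function Windows actually applies) of how a linear toe lifts shadows relative to the pure gamma 2.2 most video content targets:

[code]
# A piecewise sRGB-style decode has a linear segment near black, which
# yields far more light at low code values than a pure 2.2 power curve.
def srgb_to_linear(c: float) -> float:
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(c: float) -> float:
    return c ** 2.2

for code in (2, 5, 10, 20):
    c = code / 255
    lift = srgb_to_linear(c) / gamma22_to_linear(c)
    print(f"8-bit code {code:>2}: piecewise decode is {lift:4.1f}x brighter "
          f"than gamma 2.2")
[/code]

The mismatch is invisible in midtones but enormous in the bottom few code values, which is exactly the "raised blacks, washed-out image" symptom described above.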

This is bro after 20k hours using a 140-nit screen :messenger_tears_of_joy: :messenger_tears_of_joy: :messenger_tears_of_joy:


It's in a bedroom now and full screen white at 140 nits will literally blind you in the dark, genuinely hurts your eyes if you fall asleep and wake up to it. You guys are wildly underestimating how bright 140 nits full field actually is, our perception of brightness is logarithmic. The only scenario you wouldn't quantify it as bright is in a room with open windows on a sunny day.

140 nits? is that a typo?

Not a typo; that is strictly for PC desktop use, which should always be in SDR (see reasons above). The S95B also maxes out around 150 nits on full-field white in SDR, and would only look around 20-30% brighter if you completely maxed out its SDR mode (our perception of light is logarithmic, and 150 nits is much closer to the leveling-off peak of the curve than to the steep bottom). Most HDR games and movies actually sit primarily around the 150-300 nit APL range; the obscene values are only for specular highlights.
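
For anyone checking the arithmetic: a quick sketch (Python; the conversion is the standard 1 fL = 3.426 nits, and Stevens' power law with an exponent of ~0.33 is used as a rough stand-in for perceived brightness):

[code]
# Convert nits to foot-lamberts and estimate perceived brightness
# relative to the 100-nit SDR reference using Stevens' power law.
NITS_PER_FL = 3.426

for nits in (100, 140, 300, 650):
    fl = nits / NITS_PER_FL
    perceived = (nits / 100) ** 0.33
    print(f"{nits:>4} nits = {fl:5.1f} fL, perceived ~{perceived:4.2f}x "
          f"vs 100 nits")
[/code]

140 nits lands right at ~41 fL, matching the 40-50 fL "bright accurate mode" range quoted above, and a 6.5x jump to 650 nits only looks roughly 1.9x brighter, which is the logarithmic-perception point.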
 
I got burn-in on my LG C1.

Use it on my PC for gaming and everything.
And at the top I got the Firefox address bar, and then some, BURNED IN across the entire 55" top...

Ran it on max light (energy saving OFF) and spent a bunch of time on it, and suddenly, while looking up at a crate drop from the sky while playing video games... I noticed some burn-in.
Put on a white page in full screen and bam... there it is. My tabs and address bar clearly visible.

Now I only run it on maximum energy saving and only crank it up to max when watching movies/shows and playing video games.

I also run everything "black" so I don't see the burn-in unless the scenery displayed is bright and the same colour... then at the top it be like "PRONHUB - MILFS IN STOCKINGS" 'nd shit.

Also it came with two single dead pixels in the middle, 7 inches apart.
Didn't bother to swap it in because I had mounted it on a wall and set up an entire rig in front of it.
 
OLED owner here (C2); before that I had a Panasonic plasma, before that a Samsung HDTV, and before that a Sony Trinitron.

Personally, the BIGGEST problem for me with OLED is the stuttering with movies (which, let's be honest, can't be mitigated with "true cinema" mode or even motion smoothing) and the fact that it completely killed 30fps gaming for me - there are so many games still stuck without an FPS patch/boost/remaster, and the sample-and-hold and near-instant response time of these things make 30fps unplayable; no amount of in-game motion blur can fix that (sorry Rofif).

Back in 2022, when I was going to get my new set, I initially opted for a Philips PUS model with an LED VA screen and... things hadn't evolved that much from the LCDs of yore, what with the banding and vignetting, so I said "fuck this, let's see what the OLED fuss is all about" and went with the C2.

It has served me well for over 2 years now, can't really complain. Still, coming from a plasma which A) still had very good blacks compared to LCD and B) could actually handle 30fps content/games really well, the C2 didn't blow me away...
Also, if I had experienced the 30fps stutter with games firsthand I'd have opted for a MiniLED one - it's THAT bad for me, to the point that my next TV isn't going to be an OLED.
Yes, consoles have been targeting "performance/60fps" modes for a good 6 years now, but... there's still a trove of amazing games that are stuck at 30fps which I wouldn't be able to play on an OLED.

PS: I don't hate the tech, but the fandom and circlejerk are as strong as the FromSoftware one; I cringe every time I see a new post on Reddit mentioning "I've finally seen the light"/"I'm blown away!"/"Can't go back to non-OLED" etc etc

Cutting-edge mini-LED panels also show stutter at 30fps; the better the pixel response, the more stuttering you will see with low-fps content.

The fast pixel response time is why, without help, OLEDs don't fare well with under-60Hz content, i.e. 30fps games and 24Hz movies.

LCDs handle 30fps and 24Hz better, but are not as good with faster-moving scenarios like high-fps gaming, due to their slower response times.

I personally have no issues gaming on both.

LCDs are better because they smear the image in motion.

The fuck are you guys doing, 100% maxed-out HDR Vivid mode running the same static HUD/content 90% of the time? I got a 48" C1 which has been a full-time, desktop-only PC monitor (no gaming, movies, or TV) running 8-12 hours every day for over 4 years with no visible burn-in or anomalies on any full-field RGBCMY pattern or 10-point gray percentage pattern. I simply turned on Windows dark mode and auto-hide the taskbar. I suppose the biggest deterrent is I only run it in SDR at ~140 nits, which is the full-field white peak of the panel (to avoid ABL fluctuations). It does however have a bunch of dead sub-pixels along the edges, which is apparently a pretty common failure with WOLEDs, but being 4K sub-pixels they're too small to be visible in actual use. Still, I do plan on getting it replaced under the 5-year extended warranty though.

I don't have an hours counter on my TV (only weekly usage), but for me it's 6h MINIMUM (sometimes up to 13h) of daily usage for 3 years, mainly as a PC monitor.

B7 I used as a computer monitor.

This wasn't exactly a great generation of OLED to use as a PC monitor; LG was still figuring out a lot of the protective stuff at the time. They introduced dedicated monitors only in the last few years; before that, people were using OLEDs as PC monitors at their own risk (I do that as well).

He wouldn't want to do that, because if he did he would get burn-in lmao 🤣

Why would he want to play D4? And for 1000h? Masochist.

And if you want to play only one game or watch one TV channel, you don't buy an OLED; this should be obvious to people who know some stuff about tech.

See guys? Force your HDR display to be an SDR display by curtailing its advertised capabilities down to 140 nits, and everything will be fine! If you insist on displaying HDR content on your expensive HDR display as advertised by the manufacturer, make sure you adjust your behavior to what the display needs, or it's your fault.

Most content is SDR; you don't want to have the desktop set to HDR constantly, because it looks like shit and is incorrect. I only use HDR for games and movies that have HDR support.
 