
QD-OLED gets two big fixes (Magenta tint + subpixel layout)

YCoCg

Member


The two big things are:
- The magenta/pink tint that QD-OLEDs show when light is cast on the screen appears to be eliminated
- The subpixel layout has been changed to a standard RGB arrangement, so text fringing is eliminated

The subpixel thing is huge as that's one of the few advantages LCD panels still had, so for monitors it's great for productivity and using Windows compared to past-generation QD-OLEDs.
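A minimal sketch of why the layout matters for text (my own illustration with hypothetical numbers, not anything from the panel announcement): ClearType-style subpixel rendering samples glyph coverage at 3x horizontal resolution and maps the samples onto each pixel's R, G and B channels, assuming those subpixels sit side by side in that order. On the old triangular QD-OLED layout they don't, so the deliberately unequal channel values show up as coloured fringes.

```python
# Sketch (my own illustration): fold 3 horizontal coverage samples per pixel
# into R, G, B channels, the classic RGB-stripe assumption behind ClearType.
import numpy as np

width_px = 8
coverage_3x = np.zeros(width_px * 3)  # 3 horizontal coverage samples per pixel
coverage_3x[:13] = 1.0                # a bright shape whose edge falls mid-pixel

# Fold each triple of samples into one pixel: sample 0 -> R, 1 -> G, 2 -> B
rgb = coverage_3x.reshape(width_px, 3)

for x, (r, g, b) in enumerate(rgb):
    print(f"pixel {x}: R={r:.0f} G={g:.0f} B={b:.0f}")
# The edge pixel comes out R=1 G=0 B=0: invisible if the physical layout really
# is an RGB stripe, a coloured fringe if it isn't.
```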
 
I'm happy to hear QD-OLED is still being pushed. Hopefully Sony puts out a new one this year. The battle has gone from LED vs OLED to RGB LED vs QD-OLED, right?
 
Nice. I'll replace my ASUS ROG Swift OLED PG32UCDM with the PG32UCDM Gen 3 (fifth gen panel) as I'm not that into widescreen. I think it's coming in April?
 
OLED technology is still king. I won't go back to LED anytime soon. This year most of the main brands will push RGB Mini LED, but that still suffers from LED shortcomings in terms of blooming and viewing angles. The clarity and contrast of an OLED panel put it on another level.
 
Price?

Might be my only upgrade this year, as the prices of everything RAM-related are ridiculous

Waiting to see the price of Samsung's glasses-free 3D monitor too, been wanting one for years but Acer's SpatialLabs is so expensive
 
How is the burn-in? I read they burn in way more easily than regular OLED
Yup, my uncle who works at Nintendo told me OLED still burns in.

Some time ago my dad bought an OLED. It fell off the wall, walked to his fridge, and drank all his beer. Fucking shot his dog and took a big shit on his couch, never to be seen again.

I tell you, OLED is the fucking worst, and I'm not even talking about that dreaded burn-in.
 
Last edited:
How is the burn-in? I read they burn in way more easily than regular OLED
My MSI QD-OLED couldn't burn in unless you literally turned off the features that prevent it, which are on by default. If someone did that, then the burn-in is due to their stupidity, not the monitor technology.

Admittedly, it can be a bit aggravating when it does the panel refresh thing if you're in the middle of something, as it takes about five minutes and you can't use the screen during it. But if someone turns it off and then complains about burn-in, well... that's kinda like grabbing pots and pans out of ovens with your bare hands and complaining about burns. There are rags and safety instruments everywhere, it's a kitchen, use them. In the case of a monitor, QD-OLED or OLED, use the built-in features that prevent burn-in; they basically make it impossible to burn in by default as they shift things around quite a bit, so if it burns in on any remotely modern display of those types, it's user error.

Speaking as someone with multiple OLEDs and a QD-OLED, burn-in is a non-issue for people with triple-digit IQs.
 
Yup, my uncle who works at Nintendo told me OLED still burns in.

Some time ago my dad bought an OLED. It fell off the wall, walked to his fridge, and drank all his beer. Fucking shot his dog and took a big shit on his couch, never to be seen again.

I tell you, OLED is the fucking worst, and I'm not even talking about that dreaded burn-in.

How is that even a reply to a normal question? Fuck off with that passive aggressive shit. What a fucking edgelord.
 
I run the panel protect before every session and in the middle of long sessions. For heavy use, I keep the taskbar hidden and put all the settings to "long" so it can do what it does properly. This is the absolute best display I have ever owned and I would never go back. I even got an extended warranty on it.
120 FPS to 240 FPS with deep blacks and bright whites and color blending like I have never seen before. A good example is Path of Exile 2 in HDR... that login button looks sick, LOL.
 
Last edited:
How is that even a reply to a normal question? Fuck off with that passive aggressive shit. What a fucking edgelord.
Dude he was joking man, chill bro. He was literally just trying to have a good time, what are you so mad about?


 
I hate the vertical banding/lines on these fucking OLEDs. Since I did that stupid test on the new TV, I notice them from time to time.

QD-OLED monitors are way better than TVs in terms of uniformity. From my experience, QD-OLED TVs are sadly often awful, especially the newest 4th-gen panels. Vertical banding ranges from hard to spot to noticeable even on brighter greys like the Android settings screen; there's a big spread in quality between units.

 
Last edited:
LG's new 5K2K 39" tandem OLED is better in terms of burn-in, sustained HDR, and screen coating durability.

I don't understand why no one else wants to pick up the LG panel? Everyone is still on low-res 1440p QD-OLED, meh
 
Last edited:
LG's new 5K2K 39" tandem OLED is better in terms of burn-in, sustained HDR, and screen coating durability.

I don't understand why no one else wants to pick up the LG panel? Everyone is still on low-res 1440p QD-OLED, meh

I don't know how good or bad WOLED monitors are, but in the TV realm QD-OLED still has better near-black performance compared to tandem WOLED: cleaner (less dithering noise), smoother gradation, etc.

It seems to me like most PC gamers are used to playing games with the lights on, probably because before OLED they had LCD monitors with horrible contrast and glow, so these things aren't as important to them as they are to movie enthusiasts.
 
Last edited:
Yeah, the vertical banding is what needs to be addressed with all OLED technology. WOLED and QD OLED banding is atrocious and shouldn't be a thing for something so expensive.
 
I think the banding/DSE of OLED is because of the technology, I guess?

You are depositing organic material as millions of tiny pixels; there is no way to get quality control right for consumer devices without hiking the SRP.
 
I think the banding/DSE of OLED is because of the technology, I guess?

You are depositing organic material as millions of tiny pixels; there is no way to get quality control right for consumer devices without hiking the SRP.

LG has changed something with tandem WOLED and the results are at least less random. It still varies between units, of course, but tandem WOLED TVs seem cleaner on average compared to 4th-gen QD-OLED TVs.
 
Last edited:
Have they fixed how the panel appears gray in a lit room when light hits it? That's what has kept me away from QD-OLEDs; it completely ruins the point of OLED when you're not in a light-controlled room.
 
LG's new 5K2K 39" tandem OLED is better in terms of burn-in, sustained HDR, and screen coating durability.

I don't understand why no one else wants to pick up the LG panel? Everyone is still on low-res 1440p QD-OLED, meh


I would be picking up the 39" LG if only it were 240Hz and also came with a built-in KVM switch. Their 52" 240Hz is way too big for me. Maybe they'll do a refreshed 45" with the new panel and finally 250Hz. I honestly don't care much about the dual mode; 1080p is going to be horrible at those sizes.
 
Still no fix for the VRR flickering. Still pass.
I was worried about VRR flicker, so I bought a model with a decent VRR score according to the RTINGS review (they always measure VRR flicker). I must say that VRR flicker on my Gigabyte 4K 240Hz QD-OLED monitor is almost non-existent. As of now I have played over a hundred games and have only seen it in two games (Cyberpunk and Alan Wake 2), and what's more, only in a very specific scenario when I used FG. For some strange reason, a low base framerate of 35 fps (70 fps with FG) reveals VRR flicker. I typically have 80-100 fps :P at 4K, and 110-170 fps at 1440p, so I don't see that flicker in Alan Wake 2 during normal gameplay. Also, if I don't use FG, I can play at 30–60 fps without any flicker.

I haven't used other QD-OLED monitors, so I don't know how noticeable the issue is with different brands. However, on my Gigabyte OLED, I can literally forget about VRR flicker.
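For anyone wondering what that 35/70 split means in refresh terms, here's a quick back-of-the-envelope sketch (my own arithmetic with illustrative numbers, nothing measured on the monitor): with 2x frame generation the panel refreshes at the output rate, so the base frame time and the actual refresh interval differ by a factor of two.

```python
# Quick arithmetic sketch (my own illustrative numbers, not measurements):
# with 2x frame generation the display refreshes at the *output* framerate,
# so a base framerate hovering around 35 fps drives the panel at ~70 Hz.
def frame_time_ms(fps: float) -> float:
    """Frame/refresh interval in milliseconds for a given framerate."""
    return 1000.0 / fps

for base_fps in (35, 48, 60, 80):
    out_fps = base_fps * 2  # 2x FG
    print(f"base {base_fps:>2} fps ({frame_time_ms(base_fps):5.1f} ms) -> "
          f"output {out_fps:>3} fps ({frame_time_ms(out_fps):5.1f} ms refresh interval)")
```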

Have they fixed how the panel appears gray in a lit room when light hits it? That's what has kept me away from QD-OLEDs; it completely ruins the point of OLED when you're not in a light-controlled room.
IMO the issue of purple tint and raised blacks is overblown. Listen, I have to shine a very strong light directly on the panel (or use the flash on my camera) to see the purple tint and raised blacks. In this video, Vincent placed a strong softbox next to the panel to demonstrate raised blacks, but that's not how normal people (at least not me) use their QD-OLEDs.

[image: sjAftLjiMWQSF0m9.jpg]


Phone cameras also exaggerate that issue because of high ISO.

[image: KD4K5vru8Y5CqXZh.png]


Now I will show what the blacks on my QD-OLED look like in my own photo, taken with a mirrorless camera at low ISO.

[image: DSCF0048-Poprawione-Szum.jpg]


This photo perfectly captures what my eyes see in real life. Despite having 250W of light in the room, the blacks remain very good. On a calibrated screen, you can see a slight difference between the panel and the perfectly black frame in this photo, but there is definitely no purple tint like in Vincent's video. Bear in mind that I don't usually use my QD-OLED with this many lights on in the room; I normally only turn on one light, so in practice I have perfect blacks 70% of the time. The only time I don't have perfect blacks is on a sunny day, but that doesn't bother me because the blacks and contrast are still very good even in such strong ambient light (blacks and colours looked more washed out during the day on my previous nano-IPS LCD with a matte finish). The only displays that look unwatchable to me in a well-lit room are old CRTs. Their glass literally looks grey in the presence of even moderate ambient light, and the loss of contrast is extremely noticeable. That's not the case with my QD-OLED.

[image: DSCF0126.jpg]



The only real issue I've noticed with my QD-OLED is the ABL on 100% windows. However, if I play at 1800p (roughly a 27-inch monitor equivalent), the ABL isn't nearly as aggressive, the picture definitely looks better, and I don't see any dimming in HDR1000 content. If QD-OLED monitors could be made brighter, I would have nothing to complain about.
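Rough numbers behind that "27-inch equivalent" remark, in case anyone wants them (my own arithmetic, assuming 1800p here means a 3200x1800 image shown 1:1-centered on the 32" 4K panel rather than scaled to full screen):

```python
# Sketch of the "27-inch equivalent" arithmetic (my own numbers; the
# 1:1-centered assumption is mine, not stated in the post).
native_diag_in, native_w, native_h = 32.0, 3840, 2160
win_w, win_h = 3200, 1800

scale = ((win_w**2 + win_h**2) ** 0.5) / ((native_w**2 + native_h**2) ** 0.5)
used_diag_in = native_diag_in * scale
lit_fraction = (win_w * win_h) / (native_w * native_h)

print(f"used diagonal ≈ {used_diag_in:.1f} in")    # ≈ 26.7 in
print(f"lit area ≈ {lit_fraction:.0%} of panel")   # ≈ 69%, so ABL has less lit area to limit
```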
 
Last edited:
The only real issue I've noticed with my QD-OLED is the ABL on 100% windows. However, if I play at 1800p (roughly a 27-inch monitor equivalent), the ABL isn't nearly as aggressive, the picture definitely looks better, and I don't see any dimming in HDR1000 content. If QD-OLED monitors could be made brighter, I would have nothing to complain about.
ABL sucks on every OLED, both WOLED and QD-OLED, big and small, IMO.
This is why I prefer much brighter LED for my larger screens and a smaller QD-OLED for most gaming plus PC use.
 
Idk, it still looks like LCD to me next to a WOLED.

This photo comparison is misleading, and I explained why in my previous comment. In a typical ambient light scenario, QD-OLED achieves almost true blacks (as shown in my photo), and with only a small amount of ambient light you get true blacks, plus better shadow rendering compared to WOLED because the white subpixel creates noise.
 
ABL sucks on every OLED, both WOLED and QD-OLED, big and small, IMO.
This is why I prefer much brighter LED for my larger screens and a smaller QD-OLED for most gaming plus PC use.
QD-OLED TVs definitely have a huge edge when it comes to brightness. QD-OLED monitors aren't nearly as bright.


[image: HHd4sUcuj9lfrjUh.jpg]
 
Last edited:
I was worried about VRR flicker, so I bought a model with a decent VRR score according to the RTINGS review (they always measure VRR flicker). I must say that VRR flicker on my Gigabyte 4K 240Hz QD-OLED monitor is almost non-existent.

What's your GPU? VRR flicker might be fine if you've got a high-end PC that can maintain consistent frame times/rates, but most don't, and most games still have issues with frame times. Though considering the hardware prices, we might see them better optimized.
 
What's your GPU? VRR flicker might be fine if you've got a high-end PC that can maintain consistent frame times/rates, but most don't, and most games still have issues with frame times. Though considering the hardware prices, we might see them better optimized.
I have the RTX 4080 Super and with DLSS my games usually run above 120 fps, but I don't think the GPU matters, because even when I test the 30–60 fps range in some older games, there's no VRR flicker. Currently I'm playing the classic Splinter Cell 1 with an enhanced patch. This game is very dark and runs at 60 fps, yet there's ZERO VRR flicker.
 
How is the burn-in? I read they burn in way more easily than regular OLED
Burn-in hasn't been an issue for good OLEDs for years now, I'm pretty sure. There are plenty of ways to minimize it, like pixel shifting, image cleaning, etc. At least on my LG OLED monitor, I auto-hide the taskbar and desktop icons and keep a black background (this is all probably overkill), but it's a few years old and there's zero sign of burn-in.
 
Last edited:
Not possible. Plasma can't go higher than 1080p, draws way too much power to meet current energy consumption standards (yeah yeah I know, but blame the EU), has much more burn-in than OLED, and weighs like 200 lbs for a large-screen panel.

My biggest problem with plasma was the yellowish phosphor trailing. I was particularly susceptible to it.
 
I have the RTX 4080 Super and with DLSS my games usually run above 120 fps, but I don't think the GPU matters, because even when I test the 30–60 fps range in some older games, there's no VRR flicker. Currently I'm playing the classic Splinter Cell 1 with an enhanced patch. This game is very dark and runs at 60 fps, yet there's ZERO VRR flicker.

Low framerates don't matter, of course; the fluctuations do though, similar to when it's loading or streaming assets in the background. Not sure if having a fixed framerate works; at least based off RTINGS, they said it barely matters. I've seen some cases where people had the worst of it and some who might not notice it at all. I'm personally sensitive to flicker, which gives me headaches.
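To put a rough number on "fluctuations, not low framerates" (my own illustrative figures, not anything from RTINGS): with VRR the refresh interval follows each frame time, so one loading hitch swings the instantaneous refresh rate far more than simply running at a low but steady framerate, and those swings are what the gamma/flicker behaviour tracks.

```python
# Illustrative sketch (hypothetical frame times, not measured data): one
# asset-load hitch in an otherwise ~120 fps run versus a locked 60 fps.
frame_times_ms = [8.3, 8.3, 8.3, 40.0, 8.3, 8.3]  # hitch on the 4th frame

rates_hz = [1000.0 / t for t in frame_times_ms]
print("instantaneous refresh:", ", ".join(f"{r:.0f} Hz" for r in rates_hz))
print(f"swing within a few frames: {max(rates_hz) - min(rates_hz):.0f} Hz")

locked_60_ms = 1000.0 / 60.0
print(f"locked 60 fps: every interval is {locked_60_ms:.1f} ms, zero swing")
```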
 
Not possible. Plasma can't go higher than 1080p, draws way too much power to meet current energy consumption standards (yeah yeah I know, but blame the EU), has much more burn-in than OLED, and weighs like 200 lbs for a large-screen panel.
Late Panasonic plasma TVs were extremely durable. Even older models weren't that susceptible to burn-in, based on my experience. My parents bought a 1024x768 42X10 plasma TV and it has slight burn-in, but my mother watched the same TV news station (the worst stress scenario) for 2–4 hours every day for about 16 years. The station logo is faint, and I only notice it when I look for it.

This is what the Samsung S95B looks like after undergoing a similar stress test (RTINGS). After three months, the burn-in is far more severe than on my parents' plasma TV, which worked for 16 years.


[image: d7VK70SCKmJAxVem.jpg]



My own Panasonic GT60 plasma has zero burn-in, and I mainly used it for gaming. If I leave a static image on the screen for a few hours, a temporary ghost appears, but it quickly disappears once I change the content, even for just a minute. That's what makes it resistant to permanent burn-in: ghosts quickly go away if you just change the content.

Based on my experience of getting severe burn-in on two OLED phones, burn-in on OLED doesn't behave the same way. Even if you only display the same static content for five minutes every day, burn-in will accumulate anyway. Unlike with plasma, switching content seems to have little effect on OLEDs. My current QD-OLED will probably burn in after 2–3 years, but I've accepted that risk. When the burn-in becomes severe, I will simply buy a new monitor. It seems that upcoming OLEDs will be much brighter, so I don't plan to keep my current OLED for very long anyway.
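Just to put rough numbers on why this comparison is so lopsided (my own back-of-the-envelope figures: the hours per day come from my post above, and the three-month number is only an upper bound, not RTINGS' actual test schedule):

```python
# Back-of-the-envelope comparison (assumptions noted above, not measured data).
plasma_logo_hours = 3 * 365 * 16      # ~3 h/day of the same news logo for ~16 years
stress_test_ceiling = 24 * 30 * 3     # absolute maximum hours in a 3-month test

print(f"plasma: roughly {plasma_logo_hours:,} h of static logo -> faint burn-in")
print(f"3-month stress test: at most {stress_test_ceiling:,} h -> far worse burn-in")
```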
 
Low framerates dont matter ofc, the fluctuations do tho, similar to when it's loading or loading assets in the background. Not sure if having a fixed framerate works, at least based off Rting they said it barely matters. I've seen some cases where ppl had the worst of it and some might not notice it at all. I am personally sensitive to flicker, which gives me headaches.
When I play old PC games (or emulators), they usually run at a locked 60 fps, so there are no significant fluctuations in frame time. In modern games, however, my frame rate fluctuates a lot, but I still don't see VRR flicker unless I turn on Frame Generation at a low base framerate. If I have at least 80 fps (40 fps base), VRR flicker is not visible even with FG, and I usually have 60-120 fps when I turn on FG.

When the VRR flicker occurs, it's very noticeable. If I saw this flicker in every game, I would be annoyed and would prefer to turn off VRR.

By the way... I just remembered that a few months ago I tested different panel luminance settings and found that I could trigger VRR flicker in more games if the panel luminance was set too low. At the default luminance setting my monitor does 250 nits (SDR), but I prefer using 400 nits SDR in a well-lit room. There's also a peak 1000 setting, but it's too bright for SDR content. My OLED has brighter SDR compared to a typical OLED monitor (250 nits), so maybe that's why I don't see the VRR flicker. According to an RTINGS test, my monitor has almost no mid- and light-grey flicker; only dark grey is affected.

I can only speak for myself, but with my current settings, VRR flicker is essentially non-existent on my OLED monitor. I don't see any flicker from VRR even during loading screens, which is a problem other OLED owners usually mention.
 
Last edited:
That's great to hear, especially the text fringing being gone.

The big thing I'm saving for at the end of the year is an ultrawide 34-39 inch OLED as my main monitor. For any game that doesn't have 21:9 support, I'll just go back to my LG C3.
 