Over-exposed photographs or photographs taken at an angle will always reveal the underlying LED structure. That's not representative of what you see in person.
Blooming happens, but it doesn't look like that, and it's less common than OLED/Plasma TV owners would have you believe.
CRTs are still praised for their black level, yet they have far lower ANSI contrast and more severe blooming than any of the good FALD LCD displays.
Well, on AVS, I and many other posters have posted overexposed 5% gray slides. I can post one here as well. It cuts both ways, but my point is that eventually, what you see in an overexposed shot you WILL see in content.
No, you will typically not see most of that. In dim, low-APL scenes, FALD blooming can be visible.
When you have a high contrast image with bright objects on-screen, the brightness tends to mask the blooming since your pupil contracts.
It doesn't mask everything, but a lot more than an over-exposed photograph leads people to believe.
Hold onto that one as long as you can. I'm still in the camp that Plasmas have the highest quality picture & motion handling ever created. It's a shame they got such a bad rep.
CRTs have the best motion handling of any display, followed by LCDs with backlight strobing/scanning, Plasma TVs, OLEDs with black frame insertion, high refresh rate sample & hold LCD, sample-and-hold OLED, then sample-and-hold LCD.
And if you are sensitive to motion artifacts on plasma or DLP displays, I would arguably place them last, despite their better motion clarity compared to sample & hold displays.
OLED has the potential to be best, but only Sony have made any attempt at getting there.
Wow, this post is the worst of the worst. That's what you get for trying to warn people about an objectively immature (in the reliability department) technology: you get called names and painted as someone trying to justify a purchase.
The situation is actually the opposite of what you paint.
On the internet you can't talk about the (once again, objective) problems of OLED without going through a sea of people who either downplay or outright deny them... This is Plasma vs LCD all over again, only this time the bad guy is LCD.
And all this because some kind soul would like to see, when an OLED set is recommended, a warning that it is a hundredfold more susceptible to image retention and burn-in than other sets, and that this should be taken into account when choosing a TV.
The AV community on the net is incredibly toxic and can't accept that people might have different needs or sensibilities than what has been decided to be the "absolute best".
Great post.
It really seems like OLEDs are replacing Plasma TVs in terms of generating toxic responses to any sort of criticism online.
Blaming the owner for simply using their TV, or claiming that affected displays must be defective, is a terrible position to take.
Unless it's actual intentional abuse, like intentionally keeping a static image on the screen to burn it, or keeping the display on 24/7, don't blame the owner.
And let's not forget that when plasma was at its peak, LCD customers were complaining about plasma TVs flickering as well as the dimmer image you mention, both aspects that are inherently tied to the improved motion resolution.
My issue with Plasma TVs was not that they flickered, but how they flickered.
Plasma TVs cannot vary the brightness of their pixels; they can only switch them on or off.
So to create a full-color image with lots of gradation, they have to pulse the pixels for varying durations.
This is why you might have seen older plasmas being advertised as "600Hz" displays. To create a 60Hz image, they would actually pulse the subpixels on and off 10x for every frame, varying the duration of each pulse to affect the perceived brightness.
This is what it looks like when measured:
Note: one frame at 60Hz = 16.67ms, so that is a bright frame followed by a black frame, and then another bright frame.
In this example, the TV is actually only switched on about 30% of the entire frame's duration - or about 5ms total.
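To make that arithmetic concrete, here's a minimal sketch of a binary-weighted subfield drive. This is a simplified illustration, not how any specific plasma actually weighted its subfields — real sets used around 10 subfields with non-binary weightings (and tricks like error diffusion) to fight false contouring:

```python
# Hypothetical binary-weighted subfield drive, for illustration only.
# Real plasma drive schemes used ~10 non-binary-weighted subfields
# (10 pulses x 60 frames/s is where the "600Hz" marketing came from).
WEIGHTS = [1, 2, 4, 8, 16, 32, 64, 128]  # 8 subfields -> 256 gray levels

def subfields_for(level):
    """Which subfields fire to display a gray level from 0-255."""
    return [bool(level & w) for w in WEIGHTS]

def lit_fraction(level):
    """Fraction of the frame the pixel actually spends emitting light."""
    return sum(w for w in WEIGHTS if level & w) / sum(WEIGHTS)

# Mid-gray lights the pixel for only about half the frame; within each
# subfield the pixel itself is always either fully on or fully off.
print(lit_fraction(128))  # ~0.5
```

The point is that perceived brightness comes entirely from how long the pixel is lit per frame, which is why a dimmer real-world image spends even less of the frame emitting light.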
Since the duration a frame is held on-screen affects how much motion blur we perceive, that's why Plasma TVs have less apparent motion blur than many other displays.
However you might notice that the response time of all three are very different.
Blue switches on/off almost instantly, while red/green switch on slower (diagonal line going upwards) and take significantly longer to decay. (diagonal lines going down to 0)
This can result in the image appearing to separate into separate colors, you might catch flashes of colors out the corner of your eye, or you might see blue/green trails on the leading/trailing edges of moving objects.
Here's an example of that taken by moving the camera while the TV is static - which is what you might see if you looked from one side of the screen to the other with your eye.
https://tweakers.net/reviews/3431/6/panasonic-zt60-afscheid-van-een-iconische-plasmaserie-beeldeigenschappen.html said:
So while Plasma TVs had less motion blur, they had the potential for severe artifacting in motion.
The way that Plasma TVs pulsed the image many times a second, and the motion artifacts that resulted from it, meant that I would get migraines when watching them.
I really tried, but after owning a couple of Panasonic plasmas and three Kuros, I had to accept that they were not for me.
CRTs on the other hand
can vary the intensity of the beam, which allows them to adjust the brightness without having to pulse the pixels on/off like a Plasma TV did.
They would only flash the image once per frame, instead of 10x (or more) per frame.
Even though CRTs maybe held the image on-screen for 1ms or so, causing flicker to be more noticeable than Plasma TVs, it didn't trigger migraines for me.
I have similar issues with LCD TVs which use PWM to control the backlight intensity, or even LED room lighting.
I recently spent a good bit of money on Philips Hue lights for one room (12 bulbs) and had to return them all because I'd get migraines due to them flickering.
I always took issue with people saying that Plasmas had a very "CRT-like" image.
IIRC, I think Vincent from HDTVTest said that if you max out BFI on the Sonys you can get 1080 lines of motion resolution.
I would be surprised if you can achieve that with BFI alone.
As I understood it, the OLED is literally drawing black frames. Since it's a 120Hz panel, that means 50% image persistence - or 8.33ms.
You would have a very clean image in motion compared to a Plasma TV, but still more motion blur.
Now, if instead of drawing black frames, the TV simply switched the picture on/off like LCDs do with backlight scanning/strobing, the image persistence could be decoupled from the refresh rate/framerate.
With OLED you could also simulate CRT scanning by only illuminating a certain number of lines at once too.
But most TV manufacturers don't want their displays to flicker any more.