I have often read about how 30 fps on OLEDs appears choppy or stuttery due to the instant pixel response times compared to LCD screens (which, even the best of them, smear without BFI turned on). I don't believe people are imagining this, but there is one thing I don't understand. Let me explain.

I am an old-ass gen-Xer and used CRTs until I was around 35. I never noticed 30 fps games being stuttery on CRTs, which to this day are unmatched in their response times... they have absolutely perfect motion resolution. And it's not like I was some Madden-obsessed twit who couldn't tell the difference between 30 fps and 60 fps, believe you me. The arcade and Saturn versions of Virtua Fighter 2 and the arcade version of Daytona made me a HUGE advocate of 60 fps before some of y'all were even born! Still, 30 fps was a perfectly good baseline on CRT, and when I switched over to smeary LCDs, I NEVER thought 30 fps looked better on them than it had on CRTs.

So why does 30 fps on OLEDs supposedly look so horrid (due to the instant response times) while 30 fps on CRTs looks just fine? The only thing I can think of is the vblank CRTs have. If that's the cause, then well-done BFI should theoretically eliminate OLED 30 fps stutter, shouldn't it?

BTW, I have never played on an OLED TV; I am just going by what people say. However, I did use to have an OLED Vita, and I never noticed 30 fps looking extra choppy on that. Anyway, can anyone explain this (to me) contradiction regarding 30 fps and instant pixel response times? Especially other old-timers who remember the CRT days well!
Edit: could it be that the people complaining about 30 fps on OLEDs are just really young and consider LCD pixel blur to be the ultimate in motion resolution, due to having grown up with it?