
NeoGAF OLED owners thread

The A95L is the best OLED that currently exists, if you've got the money for it.

I also have an X900H and I really want an OLED, but I'm not sure how much of an improvement it will even be right now. The X900H still produces stunning images with deep blacks.
LED FALD gets blown away by OLED. No more glowing halos. When I replaced my 900E with an OLED, I figured I would keep it as a basement TV. After a while, I couldn't take it anymore; I genuinely enjoyed games on it less after experiencing OLED. So I bought a discounted LG C1 strictly to replace that basement TV.

Black Friday... just do it.
 
I just purchased my 77” A95L, so fellas, are there any settings I should use? For gaming mainly; I play mostly sports games.

As long as the protection stuff is on, you should be fine with sports games. I've been playing FIFA almost daily for two years on an OLED and have had no image retention. I don't play in HDR, though, as I bet that would sear things like the radar into the screen. FIFA looks weird in HDR for me anyway.
 

nemiroff

Gold Member
Ok guys, what do you say? My monitor just got official support for Dolby Vision (the monitor is an Asus PG32UCDM 32" UHD 240Hz QD-OLED, on a Windows 11 PC).

So, the question is: any reason why I shouldn't use Dolby Vision over HDR10? My best guess is that if content doesn't carry DV metadata, it will just fall back to a default DV or HDR10 mode, or?
 

TheShocker

Member
Picking up the Alienware AW3225QF, switching from a 42in C2. Pretty excited about moving to a slightly smaller screen. I love my C2, but I sit too close with my current setup.
 

Diddy X

Member
Ok guys, what do you say? My monitor just got official support for Dolby Vision (the monitor is an Asus PG32UCDM 32" UHD 240Hz QD-OLED, on a Windows 11 PC).

So, the question is: any reason why I shouldn't use Dolby Vision over HDR10? My best guess is that if content doesn't carry DV metadata, it will just fall back to a default DV or HDR10 mode, or?

My TV automatically chooses DV when available since it's superior; I don't know of any reason to pick HDR10 over it.
 

Stafford

Member
Well I'll be damned. It really seems the culprit behind the Sony A95K completely locking up/freezing, or becoming extremely slow even right after a cold boot, was a... USB stick.

Ever since I removed that thing, the TV has been smooth sailing. Hell, I'm not even using Apps Only mode either, but I'm knocking on wood. I've read before that USB sticks can cause issues if you don't use Apps Only mode.

Very happy now; nothing wrong with the TV after all. Well, except that it doesn't play nice with USB sticks, lmao.
 

Bojji

Member
Input lag. I don't know the technical reason for it, but DV gaming has more lag than HDR10. Xbox even warns you about it.

It's probably mostly because most TVs don't have a Dolby Vision game mode (or support for 120Hz with DV). LG TVs have it for sure (maybe the newest Sonys too).

Only one game supports DV natively on Xbox: Halo Infinite. The rest just convert HDR10 to DV, which is pretty much pointless.

Right now, HDR10 + HGIG is the best combo for games. Maybe in the future (once Sony starts to support it) DV will become the standard.
 

amigastar

Member
Just a question: is there an OLED monitor for PC at, let's say, 24 inches for less than 250 bucks? I've searched but didn't find anything.
 

rajkosto

Neo Member
Just a question: is there an OLED monitor for PC at, let's say, 24 inches for less than 250 bucks? I've searched but didn't find anything.
Why would it ever be that cheap? The worst possible 2023 WOLED panel (27" 1440p 240Hz), which you should never buy, can sometimes be had for $500. And you shouldn't buy it, because it's very dim (with a vignette effect on top), its RWBG subpixel layout makes for awful text rendering, it has obvious vertical-banding gray-uniformity problems, and it even manages to show visible, distracting anti-ghosting overshoot on some gray-shade transitions in dark scenes, like around flashlight halos (or in lots of places in Alan Wake 2).
 

amigastar

Member
Why would it ever be that cheap? The worst possible 2023 WOLED panel (27" 1440p 240Hz), which you should never buy, can sometimes be had for $500. And you shouldn't buy it, because it's very dim (with a vignette effect on top), its RWBG subpixel layout makes for awful text rendering, it has obvious vertical-banding gray-uniformity problems, and it even manages to show visible, distracting anti-ghosting overshoot on some gray-shade transitions in dark scenes, like around flashlight halos (or in lots of places in Alan Wake 2).
Well, I thought since a 50-inch TV costs 600 bucks, I would get a 24-inch for 200-250 bucks.
 

dotnotbot

Member
Why would it ever be that cheap? The worst possible 2023 WOLED panel (27" 1440p 240Hz), which you should never buy, can sometimes be had for $500. And you shouldn't buy it, because it's very dim (with a vignette effect on top), its RWBG subpixel layout makes for awful text rendering, it has obvious vertical-banding gray-uniformity problems, and it even manages to show visible, distracting anti-ghosting overshoot on some gray-shade transitions in dark scenes, like around flashlight halos (or in lots of places in Alan Wake 2).

That "anti-ghosting" effect is white-subpixel overshoot, I believe, and every WOLED suffers from it, some maybe less than others. If it's less visible, it's usually because stronger dithering is being applied, so the trade-off is stronger grain in dark scenes; not a big issue for TVs, but it could be annoying up close.
 

rajkosto

Neo Member
I think it was significantly reduced on the 2024 WOLED panels, but those are expensive.
The 2023 27" WOLED absolutely ruined the "Champion of Light/Herald of Darkness" level in Alan Wake 2 for me, because the overshoot was everywhere as the headlights moved through the dark, and I will never forgive it for that.
 
I bought the new Samsung G8 OLED monitor. 240Hz, 4K, same specs as the MSI, Asus, etc... I love it.

The reviews have two major complaints: the TV-like interface and the anti-glare coating. The interface is fine, though it's bloated for a PC monitor if you don't want to use the smart features; once it's set to your PC port, it just automatically goes right to the PC input when you turn it on. The coating isn't a typical matte coating; it looks comparable to my LG C1 in terms of brightness.

One issue these monitors reportedly have that I haven't run into, which is a big plus for me, is VRR performance. With G-Sync on, I have yet to experience any VRR flicker, which other models in this class have issues with according to reports. G-Sync is a must-have feature for me, so this is a huge plus that seems to set this model apart from its direct competitors.

I would personally recommend this monitor, and I was lucky enough to get it for $1,240 Canadian on sale at Best Buy.
 

Tajaz2426

Psychology PhD from Wikipedia University
I have two of the Odyssey G95SC monitor/TV things that are really wide; I think they are OLED. They're on sale for a grand right now if anyone is interested.

I like them and haven't had any crazy issues, but I don't game every day, or even every month or two. They seem to work fine; I believe they do either 120 or 240Hz, and the picture looks really good while gaming.
 

FeastYoEyes

Member
48" LG B4 on the way. Hoping to be blown away as I hear a lot of others have been. Reviews say it's in line with a B2 and almost as good as a C3?
 
48" LG B4 on the way. Hoping to be blown away as I hear a lot of others have been. Reviews say it's in line with a B2 and almost as good as a C3?
I personally don't like that the B4's HDR peak brightness is a bit lower than what games currently expect (800 nits). Any reason you didn't just get a discounted C3?

I just checked the US B4 price (can't find the 48" in Canada), and Best Buy has it for half MSRP for some reason? Is that why you nabbed it? Good price for sure; that's roughly what I paid for my 48" C1 in 2022.
 

FeastYoEyes

Member
I personally don't like that the B4's HDR peak brightness is a bit lower than what games currently expect (800 nits). Any reason you didn't just get a discounted C3?

I just checked the US B4 price (can't find the 48" in Canada), and Best Buy has it for half MSRP for some reason? Is that why you nabbed it? Good price for sure; that's roughly what I paid for my 48" C1 in 2022.
Half MSRP is exactly why I nabbed it, lol. Fingers crossed it'll still perform; reviews seemed favorable!
 

nemiroff

Gold Member
I bought the new Samsung G8 OLED monitor. 240Hz, 4K, same specs as the MSI, Asus, etc... I love it.

The reviews have two major complaints: the TV-like interface and the anti-glare coating. The interface is fine, though it's bloated for a PC monitor if you don't want to use the smart features; once it's set to your PC port, it just automatically goes right to the PC input when you turn it on. The coating isn't a typical matte coating; it looks comparable to my LG C1 in terms of brightness.

One issue these monitors reportedly have that I haven't run into, which is a big plus for me, is VRR performance. With G-Sync on, I have yet to experience any VRR flicker, which other models in this class have issues with according to reports. G-Sync is a must-have feature for me, so this is a huge plus that seems to set this model apart from its direct competitors.

I would personally recommend this monitor, and I was lucky enough to get it for $1,240 Canadian on sale at Best Buy.
Funnily enough, my old Samsung QLED monitor had VRR flicker in menus and such. But yeah, my new Asus QD-OLED has none of that. Good times.

Anyway, a tip: Afterburner can cause VRR flickering.
 
Replacing my LG C8 today with a C4. Anyone want to share their settings and other advice on setting it up compared to the C8?
Set it to Game mode, enable HGIG tone mapping, and calibrate HDR on your console. Done.

Previous models had SDR that was far too bright, so you can adjust the OLED pixel brightness to your taste for SDR games.
 

Methos#1975

Member
So my new TV is DOA, I think. Turn it on and the screen lights up with a few green lines, then it shuts off with an orange light blinking underneath. Going to assume I got a dud.
 

LectureMaster

Gold Member
Replacing my LG C8 today with a C4. Anyone want to share their settings and other advice on setting it up compared to the C8?
Stole this from ResetEra; I found these settings work great on my G4. The thread is very dedicated.
[image: recommended picture settings]


 

Killer8

Member
Recently I connected my PS3 up to my 55" LG C1 and was pleased with the result. I was expecting it to look hideous but the image was actually quite good. Softer sure, and often aliased as PS3 games are, but it wasn't the disaster I was expecting based on my experiences running the PS3 on a 22" LCD monitor for years. It's not going to make 720p (or less) suddenly look like full HD, but the perfect colors and contrast levels that OLED provides sort of distracts from the lacking image quality. It feels akin to playing a PSP game on the PS Vita's OLED display.

That being said, there are interesting things that can be done with the sharpening settings on these LG sets. I was reminded of this My Life in Gaming video:



This information led me to experiment with sharpening settings over the last couple of days. Now, there seems to be a lot of conflicting information online about whether 0 or 10 is the neutral value on LG TVs. Some say that 10 is in fact the 1:1 pixel-perfect value, and that 0 would actually blur the image. I can't find anything truly definitive, so I had taken to leaving it at 10 for the last couple of years.

When I compare 0 to 10 on a PS3 feed at 720p, there is a tangible difference - so much so that to my eyes it pushes the image from being just unacceptably blurry to being perfectly adequate. Pushing sharpening higher, up to 20 and above, of course makes it sharper still and claws back even more texture detail. But in my opinion you get diminishing returns here as it makes the sub-pixel ringing a bit too much (since sharpening will also amplify any flaws in the image). 15-20 can work quite well in games with already very good image quality, like the God of War HD Collection. I also experimented with going all the way up to 25-30 in very soft, sub-HD games like Resistance 3. This ended up being a net good result as the benefits of the overall image sharpening more than outweighed the negatives.

The LG Super Resolution setting did not appear to do much of anything. I needed sharpening set to an insane 50, and to be standing about a foot from the screen, to see any benefit on edges, and even then it was like playing spot the difference. It's not worth any potential input lag it might add.

Where the LG sharpening really appears to come into its own is modern console games which use temporal anti-aliasing or some form of image reconstruction method like FSR. The advantage of these techniques is that they are very effective at eliminating shimmering and aliasing but often leave the image looking softer. In other words, they become an ideal use case for sharpening.

I tested out Rise of the Ronin, a game with very lacking image quality in its 60fps performance mode. It renders at a dynamic 1152p, which often dips as low as 936p according to Digital Foundry, and is then scaled up to 4K with FSR2. I bumped sharpening from my default value of 10 up to 20, and the result was transformative. Enough sharpness is added that it could pass for a significantly higher resolution. In fact, it added so much texture detail, and refined those smudges the game calls distant trees, that it actually made the game look... good. This is a game people claim looks so bad it could pass for a PS3 game, and yet I'm now stopping to admire the visuals.

I need to do more testing but the fact that TV sharpening is actually decent now opens up a lot of possibilities. I'd say the old adage of "always leave sharpening on 0" is eroding in the face of much better technology being deployed as well as how that interacts with anti-aliasing in today's games. This ain't your daddy's TV sharpening any more.
 

Killer8

Member
To follow on from my previous post, I was curious to figure out what this sharpening setting on the LG C1 is actually doing. It appears that when going from 0 to 10, it's adding a sort of anti-aliasing like edge enhancement. Any additional sharpening above that then adds on the traditional kind of image sharpening you'd expect.

I am guessing that 0 is, in fact, the neutral image without any sort of processing, as you can clearly make out the 2xMSAA pattern in God of War HD Collection here. Image on the left is sharpening set to 0, right is set to 10:

(any differences in brightness / color are because I had to use Photoshop to adjust the image brightness afterwards, since these are camera shots of the screen - otherwise you can't capture what processing the TV is doing)

[screenshots: sharpening at 0 vs 10, plus an animated comparison]


I think this anti-aliasing effect is most visible at the rock curve near the bottom of the image. It looks a lot like when you add SMAA onto an edge with MSAA already applied.

There is some speculation online that leaving sharpening on 10 doesn't affect 4K content at all, which makes it all the better to just leave it at that value to enhance any sub-4K content. Some purists might argue that leaving it on 0 is more accurate, which is technically true, but I think 10 is a clear winner. Like the My Life in Gaming video said, people spend $100 on mClassic cables to do what their TV already appears to do just as well.
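As an aside, the plain sharpening component (what the slider adds above 10) behaves a lot like textbook unsharp masking. Below is a minimal pure-Python sketch of that general technique; to be clear, this is an illustration of classic unsharp masking, not LG's actual processing, and the function name is just made up for the example:

```python
# Textbook unsharp masking on a 1-D luminance profile: sharpen by adding
# back the difference between the signal and a blurred copy of itself.
# This is why cranking sharpening amplifies edges and eventually adds
# visible ringing (halos) around them.

def unsharp_mask(signal, strength, radius=1):
    """out[i] = in[i] + strength * (in[i] - blur(in)[i]), with a small box blur."""
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
        blurred = sum(signal[lo:hi]) / (hi - lo)  # local box-blur average
        out.append(signal[i] + strength * (signal[i] - blurred))
    return out

edge = [0.0] * 5 + [1.0] * 5                 # a hard dark-to-bright edge
print(unsharp_mask(edge, strength=0.0))      # strength 0: signal untouched
print(unsharp_mask(edge, strength=1.0))      # undershoot (< 0) before the edge,
                                             # overshoot (> 1) after it
```

The negative dip before the edge and the above-1.0 spike after it are exactly the kind of sub-pixel ringing described above; assuming the TV's sharpening resembles this classic filter, that's what starts to become objectionable past ~20 on the slider.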
 

Bojji

Member
To follow on from my previous post, I was curious to figure out what this sharpening setting on the LG C1 is actually doing. It appears that when going from 0 to 10, it's adding a sort of anti-aliasing like edge enhancement. Any additional sharpening above that then adds on the traditional kind of image sharpening you'd expect.

I am guessing that 0 is, in fact, the neutral image without any sort of processing, as you can clearly make out the 2xMSAA pattern in God of War HD Collection here. Image on the left is sharpening set to 0, right is set to 10:

(any differences in brightness / color are because I had to use Photoshop to adjust the image brightness afterwards, since these are camera shots of the screen - otherwise you can't capture what processing the TV is doing)

[screenshots: sharpening at 0 vs 10, plus an animated comparison]


I think this anti-aliasing effect is most visible at the rock curve near the bottom of the image. It looks a lot like when you add SMAA onto an edge with MSAA already applied.

There is some speculation online that leaving sharpening on 10 doesn't affect 4K content at all, which makes it all the better to just leave it at that value to enhance any sub-4K content. Some purists might argue that leaving it on 0 is more accurate, which is technically true, but I think 10 is a clear winner. Like the My Life in Gaming video said, people spend $100 on mClassic cables to do what their TV already appears to do just as well.

Won't the TV treat ALL content as 4K if the console is set to 4K in the menu? You would have to switch to 1080p to see results like that.

I have it at zero, but this AA effect at 10 might be useful sometimes.
 

Killer8

Member
Won't the TV treat ALL content as 4K if the console is set to 4K in the menu? You would have to switch to 1080p to see results like that.

I have it at zero, but this AA effect at 10 might be useful sometimes.

It will, I should've mentioned that this was PS3 running at 720p! This was set in the PS3 menu to avoid any unnecessary scaling on the PS3's part from 720p to 1080p (GOW Collection was a native 720p game). I will check later to see if it's working with native 1080p content too.

Nice thing about 4K is you get perfect integer scaling from not only 1080p (2x) but also 720p (3x).

I think it also gives some credence to the recommendation people have online of setting the Switch to 720p in the settings and letting the TV scale it.
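The integer-scaling arithmetic above is easy to sanity-check. A small sketch (generic math, not tied to any particular TV or console; the helper name is invented for the example):

```python
# A source resolution integer-scales cleanly only when both axes divide the
# panel resolution by the same whole number.

def integer_scale(src, dst):
    """Return the integer scale factor from src to dst, or None if not exact."""
    (sw, sh), (dw, dh) = src, dst
    if dw % sw == 0 and dh % sh == 0 and dw // sw == dh // sh:
        return dw // sw
    return None

UHD = (3840, 2160)
print(integer_scale((1920, 1080), UHD))  # 2 -> each 1080p pixel becomes a 2x2 block
print(integer_scale((1280, 720), UHD))   # 3 -> each 720p pixel becomes a 3x3 block
print(integer_scale((1152, 648), UHD))   # None -> dynamic-res output, no clean fit
```

This is also why 1440p is awkward on a 4K panel: 3840 / 2560 = 1.5, so the scaler has to interpolate rather than duplicate pixels.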
 

Bojji

Member
It will, I should've mentioned that this was PS3 running at 720p! This was set in the PS3 menu to avoid any unnecessary scaling on the PS3's part from 720p to 1080p (GOW Collection was a native 720p game). I will check later to see if it's working with native 1080p content too.

Nice thing about 4K is you get perfect integer scaling from not only 1080p (2x) but also 720p (3x).

I think it also gives some credence to the recommendation people have online of setting the Switch to 720p in the settings and letting the TV scale it.

I have tried a PS3 on my 4K LG and was surprised how good it looked, way better than I expected, and I have left sharpening at 10 (later set to 20).

 

Killer8

Member
I have tried a PS3 on my 4K LG and was surprised how good it looked, way better than I expected, and I have left sharpening at 10 (later set to 20).


Yeah it is very surprising how good it still is. People should be glad to hear that since there are still plenty of exclusive games stuck on the platform. Not to mention that with custom firmware you can make your own PS1 pkg files so you can have a massive library of games on the hard drive.

I noticed that when running PS1 games on a PS3, it's best to leave the console at 720p, as that integer-scales PS1 games 3x from their native 240p, and the TV then integer-scales the 720p output 3x to 4K. It looks very sharp.
 

Killer8

Member
Messed around with sharpening some more and did some experimenting. I can confirm that sharpening at 10 is absolutely a form of edge anti-aliasing. Below is Echochrome on PS3, a 1080p native title with a simplistic black and white art style which really shows off what the sharpening is doing. Sharpening on 0 vs 10:

[screenshots: sharpening at 0 vs 10]


Another comparison of Ridge Racer 7 which is also native 1080p with no AA:

[screenshots: sharpening at 0 vs 10]


It doesn't appear to hit every edge but then again post-AA like SMAA/FXAA often doesn't either. You can see a real benefit when playing RR7 as it smooths out enough of the obvious jagged edges.

I tried this with an Xbox 360 and, to my eyes, everything looked worse than leaving it at 1080p.

The 360 had a decent hardware scaler, which might account for it looking better.

The LG sharpening's edge-AA needs to do some upscaling to kick in. I did some more tests to confirm this with the Uncharted Collection which is a 1080p native PS4 title. I played it on a PS5 and did three tests:

The PS5 outputting 1080p with no sharpening applied (ie. the TV doing the upscale to 4K):

[screenshot]


The PS5 hardware doing the upscale to 4K, with sharpening set to 10:

[screenshot]


The PS5 outputting 1080p, but this time the TV is doing the upscale AND is adding the sharpening edge-AA:

[screenshot]


Animated to see the difference (order is same as above):

[animated comparison]


I would say that in terms of image quality, PS5 upscaling to 4K is superior to just 1080p with no sharpening. But when you add in sharpening at 1080p, that becomes the clear winner in terms of cleaning up the edges (just look at the cliff edge, snowy peak and the tree).

I think it also confirms that the sharpening's edge-AA only really applies to sub-4K images, i.e. when the TV needs to do some upscaling, as it didn't appear to do anything to the raw 4K output from the PS5.
 

Stafford

Member
So my Sony A95K is doing some unusual stuff now, and earlier this week as well.

It's turned off (sleep mode) and I see a red light blinking four times every 4 seconds. It's on the bottom of the TV in the middle, the power led. It will turn on fine, but why is this happening?
 

Neofire

Member
So my Sony A95K is doing some unusual stuff now, and earlier this week as well.

It's turned off (sleep mode) and I see a red light blinking four times every 4 seconds. It's on the bottom of the TV in the middle, the power led. It will turn on fine, but why is this happening?
Mine has been doing weird crap like losing sound and lagging after the recent update
 

Stafford

Member
Mine has been doing weird crap like losing sound and lagging after the recent update

Oh, that sucks. I have lagging sound sometimes too, but that's via the soundbar. When I unplug the soundbar and plug it back in, it works fine again. But to be sure, I turn the TV off as well, so it could be the TV's doing too.
 

Neofire

Member
Oh, that sucks. I have lagging sound sometimes too, but that's via the soundbar. When I unplug the soundbar and plug it back in, it works fine again. But to be sure, I turn the TV off as well, so it could be the TV's doing too.
Same here on the sound, and my soundbar is an HT-S2000.
 

Stafford

Member
Same here on the sound, and my soundbar is an HT-S2000.

Strange, isn't it? But at least it can be fixed. I have a Harman Kardon Citation Multibeam 1100 myself.

It seems that when the red LED happens, the TV has fully shut down, because just now I powered it on and it was a cold boot. Odd. I'm going to ask on AVS as well.
 

ap_puff

Member
So my Sony A95K is doing some unusual stuff now, and earlier this week as well.

It's turned off (sleep mode) and I see a red light blinking four times every 4 seconds. It's on the bottom of the TV in the middle, the power led. It will turn on fine, but why is this happening?
Pretty sure a blinking red LED is an error code; better check the manual.
 
Has anyone with a 144Hz LG OLED noticed that when PC gaming, some games need to be in windowed fullscreen to get the 144Hz refresh rate? On my LG G4, Doom Eternal is fine in exclusive fullscreen, for example, while others need windowed fullscreen. It's like the TV's HDMI tells the PC it's a native 120Hz display, with a toggle to 144 that doesn't change the HDMI EDID information being sent. It doesn't matter much, but it's something to be aware of. Curious whether it's my setup or if more people are experiencing this.
 

panda-zebra

Member
Been putting off buying a TV in recent years because the specs for the current generation and onwards didn't seem fully locked down. Popping into threads, I'd just end up seeing too many potential negatives or confusing caveats; watching reviews, I'd be too easily swayed one way or the other. Not wanting to make a mistake resulted in choice paralysis and just putting it off. My brother gave me a half-decent spare Samsung when he moved house, which again helped put off a decision for a while, but that went boom and left me back on an old LCD bought for the PS4 Pro. I obviously needed something with more features and better quality for the new console, so I ordered an A95L. Couldn't find much to be concerned about, so hopefully that was a good move.
 

Little Mac

Gold Member
GAF bros... what would be the bigger and more noticeable graphical leap in presentation?

Playing a PS5 Slim on a 65in LG C3 OLED, compared to a Hisense 55in U7K (LED, 144Hz, VRR, HDR)?

or

Playing a PS5 Pro on the aforementioned Hisense U7K?

Saving up for either a Pro or a newer TV and was wondering which would provide the bigger "wow".
 