
Sony’s 2021 TV lineup runs Google TV and fully embraces HDMI 2.1

I just like Caleb because he does the nerd stuff in the background and then just tells you what you need to know. I think a big reason why he has reviews out earlier/first is that Digital Trends is a large company, so they get sent the sets directly, before general release.

I'm pretty sure the reason you're seeing such a difference is in large part the 50% more zones, but also that the native contrast of the Q900T is ~1600:1, whereas the QN90A's is ~3500:1. It would be the same if you had an LG IPS LCD (~1500:1) next to any high-contrast VA LCD. When you turn the ISO up on the camera that much to capture the blooming, it's going to look way worse on the Q900T because of this; if you turned the exposure up even more, I'd guess you'd see the QN90A doing something similar, just with the zones tracing the box with a slightly thinner border than on the Q900T, because they're smaller.
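
(To put rough numbers on that: black level is roughly the panel's luminance divided by its native contrast, so at the same brightness the Q900T's black floor sits about twice as high as the QN90A's before local dimming even does anything. A quick sketch in Python, with a made-up 300-nit figure just for illustration:)

```python
# Rough black-floor arithmetic: black level ~= panel luminance / native contrast.
# The 300-nit figure is illustrative, not a measured value for either set.
panel_nits = 300

for name, contrast in [("Q900T", 1600), ("QN90A", 3500), ("LG IPS", 1500)]:
    print(f"{name}: {panel_nits / contrast:.2f} nits black floor")

# Q900T ~0.19 nits vs QN90A ~0.09 nits; that raised floor is exactly what a
# high-ISO camera exaggerates into visible blooming around the lit zones.
```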

I still think it would be better if they were similar-contrast panels, but it's definitely going to be better on the QN90A just from having that many more zones; the difference would not be anywhere near that stark with your eyes. The QN90A is also like 50% brighter in both SDR and HDR, so maybe that skews the test somehow when you make them look the same on camera.
Obviously there's going to be less blooming in real life, but the test was meant to show the zones and how fast they keep up, and Fomo says the QN90A's zones are much faster to react. That appears to be the case just looking at the video, in addition to the lower blooming. I'm just going off what he's saying and what I can see from a compressed YouTube video; he seems to be right, but I'm not 100% sure, still too early. Watching it on my OLED helps a bit because of the deep blacks; I can more easily see the differences on camera vs. if I was watching on an LCD. But yeah, going to wait and see more on this one.

And definitely, I wish he'd compared the Q90T with the QN90A instead of using the Q900TS; I'm not sure why that comparison is being made. Maybe because he doesn't have a Q90T.
 

Kuranghi

Gold Member

Ah okay, you meant just the zone-dimming lag; I misunderstood, sorry. I think you'd see a similar lag of the backlight on the QN90A if you turned the ISO up enough to capture the blooming.

The 2020 Q90T would probably be even worse because it only had 120 zones (in the 55" and 65"; I can't find the 75" number, but I'd imagine it's the same), although the native contrast was higher at 4000:1. That probably can't make up for the difference in zones, though.

edit - I just realised you replied before I wrote the Q90R part, updated my post.
 
I would have liked to own a Q9FN for at least a while, as that seems to be the best QLED prior to mini-LED. Everyone always talks about the "pop" it had. So that vs the new QN90A would be the most interesting comparison to me. Before Sammy patched the FN and nerfed the picture quality 😡

Also, I have to wonder if the Q9FN was also gimped with its local dimming in game mode, or if that just started with the Q90R?
 
Yo, Fomo's review of the QN90A was pretty enticing. The dimming algorithm is waaaay better than Samsung's previous efforts; just by eye it may be on par with Sony's, or at least close enough. Of course this needs to be put under more scrutiny, but give it a watch.

He didn't test input lag or screen uniformity, though. Also, we have to see what the dimming is like in game mode, so Sony's algorithm may still be king for that use case.

Early days, but it sounds surprisingly good.

I have the 65" QN90A. It is amazing, but it has a few minor issues that hopefully will get sorted out with future updates.
 

kyliethicc

Member
RTINGS has their X90J unit in: https://www.rtings.com/tv/suggestions

[image: W9b6UAf.png]


So we'll know a whole lot more pretty soon.
The review I'm waiting for.

$1500 for 55" Sony X90J LED
vs
$1800 for 55" LG C1 OLED

Those prices are close, so the X90J better be very good. Cause the C1 is.
 

Kuranghi

Gold Member

Are those really the RRPs? Imo you'd be making a bad choice paying $1500 for the 55" X90J when you can go up to a same-size OLED for a few hundred more, unless you'll always watch with light streaming into the room, or with an open window shining light directly on the screen.

The C1 will drop to ~$1400 by the end of its life, but the X90J will be <$1000 by then. If you have to buy right now and don't have the issue listed above, then definitely get the OLED.

You don't need to wait for reviews: the C1 will have nearly the same peak brightness as the X90J, can do per-pixel dimming (i.e. 8.3 million pixels vs 32 zones), and has perfect black levels when needed. Buy it somewhere with a burn-in warranty and you don't ever have to worry about that either.
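
(For a sense of scale, assuming a standard 4K panel: 3840 x 2160 is about 8.3 million self-emissive pixels, so each of the X90J's 32 zones is averaging its light over roughly a quarter of a million pixels:)

```python
# Dimming granularity on a 4K panel: per-pixel OLED vs a 32-zone FALD backlight.
pixels = 3840 * 2160   # 8,294,400 individually dimmable "zones" on an OLED
zones = 32             # reported FALD zone count for the X90J
print(pixels, pixels // zones)   # 8294400 259200 -> ~259k pixels per LCD zone
```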

The XR processing in the X90J looks to be very, very good, but in my experience it won't make up for the differences in the tech I listed above, for most people. If I showed you them side by side with different content, you'd pick the OLED 19 times out of 20. The only exceptions are scenes with large areas of white/light colours, which will have much lower overall brightness than on the LCD, but even then you won't perceive that when you watch the OLED on its own, because your eyes will have adjusted to the light output from the panel.
 

dolabla

Member
The 50" X90J seems to be a hard get right now. Amazon isn't getting another stock until May 16th (it says, but it keeps changing) and at Best Buy when you add it to the cart, it pushes it all the way back to middle June as the delivery date. Hopefully, they get them sooner than that 🙏. I want a new tv, dang it!
 

ManaByte

Gold Member
The 50" X90J seems to be a hard get right now. Amazon isn't getting another stock until May 16th (it says, but it keeps changing) and at Best Buy when you add it to the cart, it pushes it all the way back to middle June as the delivery date. Hopefully, they get them sooner than that 🙏. I want a new tv, dang it!

There were only 1-2 55"s at my local Best Buys.
 

dolabla

Member
Same here for the 55". I checked a few days ago and it looked like they had gotten 2 in, and now it looks like they only have 1 left. No 65" though; Amazon has the 65" in stock. I want to get mine at Best Buy due to the 45-day return policy, just in case it happens to drop in price and they can refund me the difference.
 

Bo_Hazem

Banned

We're not getting any until like 4-5 months from now, so I'm waiting for a nice price cut later this year.
 

Bo_Hazem

Banned
Looking forward to setting up this set. Going from an edge-lit LED to full array is going to be the best upgrade for me, even more than the 120Hz and Dolby Vision.

Sweet! What was your previous TV? My current 2016 Sony 4K HDR X70D is edge-lit; I'm expecting a serious jump going to the X90J.
 

MurfHey

Member
You're one step above me.

I'm currently using a 43" edge-lit Sony X800D. The X800D was such an amazing budget TV. It had no light bleed to speak of, and it had a VA panel.
I'm using the same TV (43" X800D). It's starting to age a little now (a few little black dots on my screen). Really thinking about getting the X900H, but at the same time I'm waiting for the VRR and HDMI 2.1 updates, plus I wanted to see if it gets the new Google TV OS.
 

ManaByte

Gold Member

I don't believe those updates are coming to the X900H, which is why I waited for the X90J.
 

Bo_Hazem

Banned

It got some minor update that improved it, but there's a reason they're using a new, expensive processor that's found all the way from the Z9J down to the X90J. I think this new processor has enough headroom to drive true mini-LED instead of working in clusters like some fake mini-LED TVs.
 

kyliethicc

Member
I agree, if it's a purchasing choice between say $1500 for the C1 and $1200 for the X90J... I'll just get the LG.
But if it's a choice between say $1500 for the C1 and $1000 for the X90J... then idk. That's an extra 50% for the LG.

(I assume you meant to write >1000 btw, meaning 1000+, not less than 1000. The lowest I suspect the X90J will go is maybe $1000.)

My issues with the LG CX / C1 OLED are:
- risk of burn-in
- the physical design (stand, port layout, etc)
- no support for DTS audio
- from watching videos, I prefer the new (Sony) Google TV OS

Of course the Sony X90J has:
- worse contrast
- worse motion response
- only 2 HDMI 2.1 ports
- no VRR (yet)
- probably only similar peak brightness for HDR

Burn-in for me is a legit concern, because I'm "lazy" with TVs. I value being able to fall asleep with the TV on, or just leaving it on with my PlayStation on, even just on the home screen, for like an hour while I go do something else. I have no interest in babysitting or managing a display. I get that OLEDs have a better picture, that's just a fact, but will it last me 10+ years? I don't want to have to change my behavior to accommodate a TV, and it needs to be just as good after 10 years of almost daily use.
 

Kuranghi

Gold Member

Nah, I meant to write sub-$1000; $1000 will be the max resting price imo, more likely $900, maybe even $850. $900 was the resting price of the 55" XF90, and that was 2-3 years ago, so I'd expect the X90J to rest lower since it has the same, if not lesser, tech in it (32 backlight zones in the X90J vs 48 in the XF90). It does have HDMI 2.1 over the XF90, though, so I'm not sure how much that affects the manufacturing cost. They started using one MediaTek chipset for the OS and image processing from 2020 on; I'm not sure if that's the same in the 2021 sets with the XR chip, but that will lower costs too. So unless Covid has really effed prices, I wouldn't expect the 90-series LCD to rest above what it has in previous years, because even at the same price as previous years, I'd wager they're making a higher margin on it, as the tech advances have mostly been in software rather than hardware.

If you don't want to pay for/can't get an extended warranty that covers burn-in on the OLED, then just buy an LCD. I don't think the chance of burn-in is high through 99% of normal use, but it definitely could happen if you repeat the same content day after day or use it as a desktop monitor without precautions. I don't think OLED PQ in general will degrade massively over 10 years.

Here's the thing though: you just set the PS to auto turn off after X minutes of inactivity, or better yet set the TV to auto-off after X hours of no input. Problem solved.

Google TV is definitely better than webOS in my experience, and the new webOS UI is just trying to copy Android/Google TV, but they did it poorly: the top 1/3 of the screen is permanently covered by three info boxes you may never even use (weather information is one of them, as an example).

You mentioned DTS on the LG; that's a concern if you value high-quality audio and like DTS, but it might not even apply to you. What is your sound setup, and how many HDMI inputs does it have? The only-2x-HDMI-2.1-ports limitation may also not be an issue depending on your sound setup, i.e. if you're putting everything through an HDMI 2.1 receiver/soundbar, then you don't need to worry about most of the problems.

You'll always miss out on DTS/DTS-HD from the internal apps, because they have to use ARC for external audio. But then Netflix and Prime use "Dolby Atmos" (not true Atmos; it's in the form of DD+) anyway, so it doesn't matter, and if you used Netflix/Prime through a player connected directly to the sound system, i.e. dropping off the audio before it even gets to the TV, you won't have that issue anyway.
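
(If it helps, here's the rule of thumb as a toy Python model; the format names and fallback rules are simplified assumptions for illustration, not any specific TV's actual behaviour: plain ARC only fits compressed bitstreams, eARC adds the lossless ones, a direct source-to-receiver connection bypasses the TV entirely, and a set without a DTS licence drops DTS altogether:)

```python
# Toy model of which audio formats survive each route to a 5.1 receiver.
# Purely illustrative; real negotiation is more involved than this.
ARC_FORMATS = {"PCM 2.0", "Dolby Digital", "DD+", "DTS"}     # compressed only
EARC_FORMATS = ARC_FORMATS | {"TrueHD", "DTS-HD MA"}         # eARC adds lossless
FALLBACK = {"DTS-HD MA": "DTS", "TrueHD": "DD+"}             # lossy core/transcode

def delivered(fmt, route, tv_decodes_dts=True):
    if route == "direct":              # player -> receiver; TV never touches audio
        return fmt
    if fmt.startswith("DTS") and not tv_decodes_dts:
        return "PCM 2.0"               # e.g. sets that dropped the DTS licence
    allowed = EARC_FORMATS if route == "eARC" else ARC_FORMATS
    return fmt if fmt in allowed else FALLBACK.get(fmt, "PCM 2.0")

print(delivered("DTS-HD MA", "eARC"))                        # DTS-HD MA
print(delivered("DTS-HD MA", "ARC"))                         # DTS (core only)
print(delivered("DTS-HD MA", "ARC", tv_decodes_dts=False))   # PCM 2.0
```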

The main question is: do you need to buy it right now/in the next few months, or can you wait till the price drops, i.e. Black Friday or the Boxing Day/January sales? If you wait, the LCD will be ~50% cheaper; if you don't, the OLED will be ~20% more expensive. In both cases, imo, it's still totally worth the extra money for the PQ increase, but maybe you won't see it that way when staring down the barrel of a $500 saving.

(I put the main stuff in bold if you get bored reading that)

edit - oh yeah, and the ports being on the left side (when looking at the rear) on LGs is a pita; can't do anything about that unfortunately. In my setup that would be extremely annoying, since the TV is angled to the left all the time.
 

kyliethicc

Member
The burn-in thing may just be me being paranoid, but I can't get over that nagging concern about it. I'm sure it'd be fine for like 5 years.

I have a 5.1 setup. It annoys me that LG and Samsung have dropped support for DTS-HD MA, because it's on some UHD BDs I own.

The X900H is the model the X90J replaced. The X900H began at $1200 for the 55" and only ever dropped to $1000 on sales.
So I can't see the 55" X90J ever hitting below $1200-$1000, because they raised the launch price by $300.
The LG C1 will probably end up around $1400-$1300 on Black Friday sales etc, so it's gonna be a tough choice if they're close.

Actually, I prefer all the ports being on the side, like on the Sony X90J. My issue with the LG CX/C1 is they put some ports on the back and some on the side. Why split up the ports randomly? Messy. And the LG uses an annoying 3-piece stand that has to be screwed together. And it's ugly IMO. I prefer the X90J's 2 feet that just slot into the TV without any screws. And the X90J is pure black, not dark silver.

I've also had excellent experiences buying Sony TVs, speakers, receivers, Blu-ray players, headphones, PlayStations, etc. I trust the brand for quality electronics. I know plenty of people who have had issues with LG TVs in the past, tho.
 

Kuranghi

Gold Member

So since you didn't mention it, I take it your 5.1 setup doesn't allow you to put devices through it (and isn't HDMI 2.1 compatible, in the case of the next-gen consoles)? If that's the case, then yeah, you'll miss out on DTS when using the internal apps, or from devices connected directly to the TV and using the e/ARC port to deliver audio to the sound system. If you can connect devices through the 5.1 system, like the PS5/XSX, and then use the apps within those devices, you'll get DTS/DTS-HD/DTS:X fine, as long as your 5.1 system supports those codecs.

I know the X900H was the 2020 set, but that's an anomaly; they probably lowered the initial RRP because they knew everyone was going to buy new TVs around then (lockdown happened just before release) to get more sales, rather than using the usual RRP of their 55" 90-series LCDs from the previous 3 years, which was $1500/$1600. It's $999 on Best Buy US right now, and I guarantee it will drop at least $100, maybe more, when the X90J is fully available at the end of May.

I don't think the C1 will go below $1400 in 55", because they have the new A1 now, so they can bump the price a bit. They wouldn't want the A1, B1 and C1 to have that small a price gap between them imo; they'll want at least $200 between them.

I forgot some of the ports face backwards on LGs. I was thinking of Samsungs having them on the opposite side from Sonys, but still side-facing in both cases. Then I remembered it's not the side, it's random backwards-facing HDMI ports, as you say.

Can't do much about the feet really; any 3rd-party solution will need to screw into the VESA holes. One idea to replicate the Sony feet somewhat is a product like this:


They screw into the VESA mounting holes on the back of the set as well, though, and I don't know what the quality of these products is because I've never tried them.

About build quality: my personal experience when I worked for Sony was that I saw many more returns for LGs than Sonys. I dealt with the Sony returns myself, of course, but I could see the LG returns on the system as well. They had 3 separate 77" C8s that developed screen issues, like black lines on the screen or generally corrupted images, but that could partly be down to shoddy handling by staff, so take it with a grain of salt. It could also have just been a bad firmware that never got updated in store.

I think you'll find most of the LG TVs with build/HW issues are the bottom-tier entry-level models, like the 5-, 6- and 7-series edge-lit models, i.e. cheapo sets that go *donk donk* when you knock on them, so you should be safe with OLEDs from LG.

BTW, why wouldn't you go for the best of both worlds and get a Sony OLED? I take it it's because the only Sony OLED with VRR is the A90J and it's too expensive? I get that, it's really pricey.
 

kyliethicc

Member
Yeah, sorry, my receiver is HDMI 2.0. I know I can run stuff directly into it to work around the limits of the LG's eARC port, but it's just annoying that LG and Samsung are being cheap and cost-cutting by refusing to license DTS. I plan to run my PS5 directly into the TV. My UHD BD player runs to my receiver, along with an Apple TV box. It's just that loss of flexibility that makes me go "fuck off" lol. Same with Samsung refusing to support Dolby Vision... rather annoying. There's a simplicity and cleanness to running all inputs into the TV and then just one cable to the receiver. Idk, ultimately it's not a deal breaker.

And yeah, the A90J seems amazing... but it's like $3000 lol. Can't spend that on a 55" with a chance of burn-in. Not that rich.

The X90J vs C1 comparison for price vs performance will be very interesting to watch.
 

Kuranghi

Gold Member

Yeah, I get it, convenience is king man. I only have two inputs on my Sony ZF9 soundbar, and the PC has to go through it for uncompressed audio since I have no eARC, so I only have one input left for the PS5 (used to be the PS4 Pro) AND the Switch.

Why would the Switch need to go through the soundbar, you say? Because Ninty are complete muppets and won't allow 5.1 audio through ARC; you have to go directly into the sound system or else it will just be stereo. So I had to buy an HDMI switch to get everything through that remaining port, the upside being I was able to just plug all the other (mostly 1080p) devices into the HDMI switch, to avoid having anything running up to the TV except the HDMI ARC output from the soundbar.

So if you have more devices than inputs on the sound system and you have a Nintendo Switch, you'll need an HDMI switch regardless to get 5.1, even if you go for the Sony.

If I still worked for Sony and was in the US, I'd buy you one (as a gift ofc; it's up to you if you gave me a "gift" in return that was the exact same price as the TV lol) from my staff store. I got 40% off on day one! Got "myself" a 65" AF9 for £1000 off RRP before it even officially released, and it arrived before I even had one in my shop lol.

"Myself" recently told me his girlfriend threw a glass at it and destroyed it, I told him to throw his girlfriend out the house lol. Thats heartbreaking as the AF9 was amazing, had more metal in it than AG9 so felt much better and had better Acoustic Surface implementation, 3 actuators vs 2 behind the screen and much better bass imo from 2 bigger subs on the sides rather than 2 subs on the back with AG9.

If you answer these questions, I'll tell you whether you should buy an LCD or an OLED, ignoring price and burn-in considerations; we'll assume you set up the OLED to turn off after a period of inactivity:

* What kind of games do you play, mostly SP or competitive?
* Will it be your main viewer for movies/TV as well?
* Is sub-4K gaming (i.e. Switch, PS3, etc) a big consideration, or will it almost always be the PS5 you're playing?
* Will you play during the day a lot, with curtains open and light coming in, or even shining on the screen? Or mostly at night/in a way where you can light-control the room?
* When you watch movies/TV on your current TV, do you have motion interpolation on, especially for 24hz content? i.e. Blu-rays and prestige TV on Netflix/Prime/etc
* Will you use the TV's internal apps a lot/daily, or just the apps in the Apple TV? (It has everything most people need, doesn't it? I haven't used one in years)

There are more things I should ask you, but that's a good start at least; I'll probably remember more as we go back and forth.
 

kyliethicc

Member
Oh it's ok, you don't need to do that, but I appreciate it.

Currently, I just want to see what experts like RTINGS.com say about the X90J.
 
"Myself" recently told me his girlfriend threw a glass at it and destroyed it, I told him to throw his girlfriend out the house lol. Thats heartbreaking as the AF9 was amazing, had more metal in it than AG9 so felt much better and had better Acoustic Surface implementation, 3 actuators vs 2 behind the screen and much better bass imo from 2 bigger subs on the sides rather than 2 subs on the back with AG9.
[image: giphy.gif]

Yep lol.
 

ManaByte

Gold Member
Well, it took me all day (including a 3-hour round trip to find a Dolby Vision compatible receiver to replace my old one), but I got my X90J set up and configured. It's glorious. Watching Return of the King in Dolby Vision/Atmos right now.

The brightness compared to an edge-lit set is amazing. I have the PS5 going through the receiver (so only 4K 60 there), but I have the XSX connected directly to the TV for 4K 120. XSX audio is handled with eARC on HDMI 3 going to the receiver.
 

Melchiah

Member

I'll answer those as well...

* SP.
* Yes.
* No, I'll mostly play new games, but I might revisit some PS4 classics or older remasters occasionally.
* Yes, I mostly play in the late afternoons and early evenings, with the curtains open. Movie watching also usually happens before 21:00.
* Dunno about that, as I've been playing on my 2009 Bravia 40W5500 so far.
* Yes, I'll probably be using the internal apps over the ones in the PS4/5, once I buy a new TV.

On a further note, I've put thousands of hours into a game that looks like this:
[image: sWVkcAY.png]

And to my understanding, burn-in is cumulative, meaning it won't matter whether I play the game for hours in a row or in shorter sessions spread across a longer period of time; in this case, from February 2017 to today and onwards.
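
(That's the arithmetic as I understand it; in a simple cumulative-wear model only the total hours on the static HUD matter, not how they're split into sessions. A toy sketch with made-up numbers:)

```python
# Toy cumulative-wear model: burn-in risk tracks total hours of static content,
# regardless of how those hours are split up. Numbers are made up.
def static_hours(sessions_per_week, hours_per_session, weeks):
    return sessions_per_week * hours_per_session * weeks

marathon = static_hours(1, 14, 200)   # one 14-hour session a week
casual = static_hours(7, 2, 200)      # two hours every day
print(marathon, casual)               # 2800 2800 -> same accumulated wear
```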
 

Bojanglez

The Amiga Brotherhood
If you have an HDMI 2.0 receiver and are happy with it, one option (if you can get hold of one) is to try to get a SHARC eARC. It takes the eARC output and passes it through to a standard ARC receiver: https://www.thenaudio.com/product/sharc-earc-audio-converter/ (available on Amazon US sometimes too).

I'm debating going down this route myself, as I have a perfectly good HDMI 2.0 (5.1-speaker) receiver and I'm reluctant to upgrade until receivers with all HDMI 2.1 ports are on the market at a reasonable price.
 

Bojanglez

The Amiga Brotherhood
Awesome, you'll have to share some pics or vid clips.

What receiver did you go for?
 

Ulysses 31

Member
Really depends on how you use the OLED.

If you vary your content the newer OLEDs should easily last years.

A case of burn-in has now been reported on a C9 after less than 2000 hours of use.

 

ManaByte

Gold Member

We have a small room with just smaller satellite speakers, so we don't need something really big and expensive. I just needed Dolby Vision compatibility, so I got a Sony 790 to replace the Sony 770 I had, and everything works perfectly. It's an Atmos receiver, so with that and the TV being Dolby Vision, we basically have a Dolby Cinema home theater setup now. I've watched a little of Return of the King, the last part of Endgame, and Sonic (the last movie I saw in theaters, in a Dolby Cinema) so far. Very happy with it all.
 

Kuranghi

Gold Member

Well, with these answers I'd say LCD so far. A few more questions, and I'll clarify the motion interpolation one:

* Does that game run at 30 or 60 fps?
* Do you like camera motion blur in games (assuming it's done right, not too OTT or badly implemented)? If you have the choice, do you turn it on or off? Say as well if it depends on the framerate.
* Have you gamed/do you currently game on monitors? If yes, tell me the monitor model, and what do you think of the motion there compared to the Sony TV?

On a monitor, usually the individual frames will be sharp and clear, but this can lead to a sort of perceived stutter, whereas on the TV the frames might not be as clear individually, but they blend together more, so it appears smoother in motion. That's what motion interpolation does on the TV: if there are only 24 frames per second (like from a Blu-ray), when it's set to Low or High it adds fake frames in between the real ones to give the perception of a higher framerate and smoother motion.
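
(A crude way to picture it; this is just a naive linear blend in Python, not the motion-vector estimation a real TV does, but it shows where the fake in-between frames come from:)

```python
import numpy as np

# Naive frame interpolation: synthesise in-between frames as weighted averages
# of each real pair. Real TVs estimate motion vectors instead of cross-fading,
# which is why they can tear/artifact on difficult shots.
def interpolate(frames, n_between=1):
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        out.append(a)
        for i in range(1, n_between + 1):
            t = i / (n_between + 1)
            out.append(((1 - t) * a + t * b).astype(a.dtype))
    out.append(frames[-1])
    return out

# one second of 24 fps "video" -> ~48 fps with one fake frame per gap
clip = [np.random.randint(0, 256, (270, 480, 3), dtype=np.uint8) for _ in range(24)]
print(len(clip), "->", len(interpolate(clip)))   # 24 -> 47
```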

About the motion interpolation thing: I checked reviews of your model (seems like a really nice TV; 2009 was apparently the year they really got their act together after years of bad sets). If you check the TV settings and look for "MotionFlow", it will be set to either Off, Low or High; let me know what it's set to, please.

High adds more fake frames than Low, but that has a tendency to introduce tearing and other artifacts in the image, because it's guessing a lot. If you dislike the smooth-motion effect (it's called the Soap Opera Effect) and/or the artifacts added by MotionFlow, but also dislike the stutteriness of 24fps without it on, then OLED may not be for you. OLED motion is very, very clear, and while many people love that for >=60fps games, ime most people dislike that effect on 24hz movies and 30hz games and so want to add motion interpolation to smooth it out, which adds the SOE/artifacts, which might be unwanted, or just can't be enabled in Game Mode in the first place anyway.
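
(The stutter half of that is easy to quantify: on a sample-and-hold display like an OLED, each frame sits on screen, pin-sharp, for the whole frame period:)

```python
# Frame hold times on a sample-and-hold display; long, sharp holds at low
# framerates are what reads as judder/stutter to the eye.
for fps in (24, 30, 60, 120):
    print(f"{fps} fps -> each frame held {1000 / fps:.1f} ms")
# 24 fps holds each frame for ~41.7 ms; interpolation shortens the effective
# hold by slotting synthesised frames into those gaps.
```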

If the picture mode of the HDMI port your console/PC is connected to is Game mode, the MotionFlow option might be greyed out/not there anyway, so no need to worry then; that means you've not been using motion interpolation on your gaming input. But also check the other devices connected by HDMI, since each picture mode is custom per input, so maybe a Blu-ray player or cable/TV tuner box has it set to Low or High.

If you have it set to Low or High, then turn it off and watch as many of the following as you have access to/apply to you: a Blu-ray movie (or just video that's at 24fps), 30fps and 60fps YouTube videos, the internal Netflix/Prime/etc apps in the TV (internal apps may have their own picture mode, so go to settings once you've loaded the app, otherwise you might just be changing settings for the HDMI input you're currently on), sports on cable TV, and some regular broadcast TV.

If you don't notice a big difference, try switching it from Off to Low to High while watching the content, if you can; but if you can't, don't stress out if you feel like there isn't much difference.

The biggest reason for you not to get an OLED, imo, is that you said you currently play that game with a fixed HUD for thousands of hours, but depending on how you answer the motion interpolation and motion blur questions, maybe OLED will give you the type of motion you've been looking for all this time.

If all the games you play are at >=60fps, then the motion smoothness part doesn't matter as much for games, and motion will most likely look better to you on the OLED, but it's still a factor for movies/TV/broadcast/internal apps like Netflix/Prime.

Sorry if that's a bit much, I got carried away; let me know if anything is confusing.
 

Bo_Hazem

Banned

Don't forget to use your 1-year free sub to Bravia Core + 5 redeemable movies. Try it out!
 

Melchiah

Member
* Depends on the game, but all the PS5 games I've played so far have run at 60fps, and the same goes for the PS4 game in the screenshot. I'm not that sensitive about framerate though; Bloodborne's issues never bothered me, for example.
* Yes. I guess something like Driveclub would apply (I still play it from time to time).
* No. I've got a laptop, but I don't play on it.

Yeah, the TV has served me well, and it reviewed pretty well at the time. Interestingly, a burn-in appeared a year or two ago, so LCDs can apparently get it as well in extreme circumstances (I tend to keep the TV on all day). That's one of the reasons why I'm very sceptical about OLEDs, since I'll be using the next TV for another decade, or possibly until microLEDs become affordable. I wager the burn-in is from a static TV or Netflix image, from before they implemented an automatic screensaver function. Sadly, the PS4 and my TV don't have one, but luckily the PS5 does, and hopefully newer TVs as well.

I checked, and I have MotionFlow set to Off. Its side effects don't really sound appealing to me, and that's probably why I set it off originally. OLED's issues with 24hz movies and 30hz games sound unappealing as well, along with the issues that the compensating motion interpolation would bring. Especially since I always choose quality (30fps) over performance (60fps) in games, if given the choice; Control is the only exception, as the way it handled lighting in the quality mode made my eyes ache.

I'll check out how content feels with MotionFlow on.

Thank you for the detailed reply.
 

Kuranghi

Gold Member

Okay, I'm thinking LCD is best then, for these reasons:

* You like smooth motion/motion blur but don't want to use motion interpolation due to the artifacts
* You've had MotionFlow off all this time
* You prefer 30fps if it gets you better visuals in games (I totally get Control though; the lag on the reflections and lighting is bloody awful, and the stutter looks horrible without the in-game motion blur on, even on my LCD, which has amazing motion)
* You have legitimate concerns about burn-in

This video is great to explain Motion Interpolation/SOE:




Note though that in this video they use a 2016 KS8000 for the tests. The reason there are so many artifacts in the difficult scenes, like Bond on the bike going across the rooftops (his head is almost constantly torn), is that Samsung's motion interpolation system/technique has the worst performance of the four big brands (Sony, LG, Samsung and Panasonic). LG is slightly better but still has big problems, while Panasonic is 2nd best after Sony, who is far ahead in this area.

I'm betting any recent mid- to high-end Sony model wouldn't tear his head on those shots, but you'll almost always incur the SOE regardless of whether it's Sony, LG, etc.

So if you do end up looking at OLED and plan to use motion interpolation on it for 24/25/30hz content, then definitely go for a Sony OLED, because otherwise you have to choose between perceived stutter and artifacts. Whereas on an LCD, even with all the motion stuff off, the motion will have much less perceived stutter. When you check out 24hz movies on your current Sony with MotionFlow OFF, if that looks nice and cinematic/smooth enough to you, then I doubt you'd need motion interpolation at all on a newer LCD.

Let me know about your experience with MotionFlow before the final decision, but that's what I think anyway. Most people will agree the super-fast response time/clear motion of OLEDs is a benefit for >=60fps games/video, but when they see 24/25/30 they often want to turn on motion interpolation, bringing those issues, and at that point you're snookered if you don't like its downsides, i.e. artifacts and the soap opera effect.

The best way to see what you think of low-fps motion on OLED without motion interpolation is to go into an AV shop and ask them to show you input from a Blu-ray player playing a disc, with all the motion interpolation settings turned off beforehand.

If they don't know what they're doing, then just get the remote from them and tell us which brand/model it is, and we'll tell you what to disable.
 

dolabla

Member
Amazon got some 50" X90Js in stock. I got tired of waiting on Best Buy, so I just ordered from them instead. Yeah, I'm being impatient :messenger_grinning_smiling:. I don't think the local store was getting any of that size (they never got the 49" X950H last year), so I probably was going to have to have it shipped anyway. And it would have been nice to get that 45-day return just in case of a price drop, but screw it, I want a TV!

Just hope it doesn't arrive with a big hole in the box going right into the screen like the CX did last year 😡
 

dolabla

Member
Watching Moana right now in Dolby Vision on the X90J.

OH

MY

GOD
Mana, do you have an OSSC? If so, I was wondering how compatibility is on this set. SNES and NES seem to give the most problems on TVs due to jittery sync.

I'm not too worried about it tho, as the RetroTINK 5X Pro is about to come out (it's said to fix the issues the OSSC had), but I was just wondering how it performs on this set.
 

ManaByte

Gold Member
No I don't.

Here are a couple of Moana shots:
[image: eYqN1gc.jpg]

[image: C16pLEe.jpg]
 

ManaByte

Gold Member
Did you change the picture mode from when you set it up, and if so, which did you go for? Is this a Blu-ray or is it streaming?

Disney doesn't (usually) include Dolby Vision on their discs; they only put Dolby Vision on their digital copies or Disney+. When presented with a choice, the iTunes digital copy streams at a higher bitrate than D+ (not by much, D+ is very close), so this is the digital copy streaming on an Apple TV 4K set to Dolby Vision.

The X90J will automatically switch to Dolby Vision mode when it detects it.
 

DeepEnigma

Gold Member
It's what I did for the X90H. Got it last September for $840 and some change.

Best damned TV for the price, hands down. Wait for the price drops; you won't be disappointed.
This set is going to be a massive upgrade all around. Wait until you see the picture, the color accuracy, and those inky blacks for an LED.

I went from another Sony 4K 120Hz edge-lit set, and while the picture clarity was comparable, the 10-bit panel's colors and blacks were a major upgrade. The speed of the OS was a major upgrade as well.
 