
Sony’s 2021 TV lineup runs Google TV and fully embraces HDMI 2.1

dotnotbot

Member
It's great to hear from VT that he thinks XR is an upgrade over X1U in terms of upscaling, smooth gradation/Super Bit Mapping, and tone mapping, but I'm not a fan of the forced always-on motion settings. Even if he said he didn't notice artifacts in real content, I know he won't have tested many games, so I wonder how that fares with complex and extremely unpredictable game imagery.

He said that turning on Game or Graphics mode disables it.
 
It's great to hear from VT that he thinks XR is an upgrade over X1U in terms of upscaling, smooth gradation/Super Bit Mapping, and tone mapping, but I'm not a fan of the forced always-on motion settings. Even if he said he didn't notice artifacts in real content, I know he won't have tested many games, so I wonder how that fares with complex and extremely unpredictable game imagery.

I suppose it's par for the course, with OLEDs seemingly becoming the standard for quite a while going forward (based on the mini-LED panel price predictions). OLED motion is fantastically clear, but that's brought new problems for low-framerate content, and they want to strike a balance between clarity and perceived stutter.

I just don't like forced settings; if it does cause problems, you can't turn it off. Always better to have the choice, IMO. Unfortunately that goes against the majority opinion, who will take a minor cost to simplify things a lot, which I get too; there are many areas of my life where I just want it to work and don't care about the minutiae.

Now I want to see the XR processors across the range compared to see if there is a difference between them, though maybe don't compare XR OLEDs to XR LCDs, since I think the LCDs won't have the forced motion stuff, as LCDs don't suffer from perceived stutter nearly as much.
I agree that the interpolation should be off completely when not selected, but it is certainly off completely in game mode, as that would definitely add lag if it were on.

Man, A90J seems to have a couple issues compared to last year. Hope the panel grid stuff is sorted out.
 

ManaByte

Gold Member
He said that turning on Game or Graphics mode disables it.

And the TV is smart enough to do that automatically. I have the PS5 and an Apple TV 4K hooked up through the receiver. The TV will switch to Dolby Vision mode when on the Apple TV and then switch to Game mode automatically when you switch to the PS5's input.
 
I agree that the interpolation should be off completely when not selected, but it is certainly off completely in game mode, as that would definitely add lag if it were on.

Man, A90J seems to have a couple issues compared to last year. Hope the panel grid stuff is sorted out.

Unfortunately, the panel grid is an issue with LGD. Many people with the CX were complaining about it last year. There are thoughts that it might be a result of LGD’s new manufacturing methodology that was designed to reduce waste and cut costs.
 
Unfortunately, the panel grid is an issue with LGD. Many people with the CX were complaining about it last year. There are thoughts that it might be a result of LGD’s new manufacturing methodology that was designed to reduce waste and cut costs.
Right, but you'd have thought with Sony's binning that it wouldn't be such an issue. First time I'm hearing about this though. On Fomo's video with John from Value Electronics, he said his A90J samples were the cleanest he'd seen from Sony, so I'm kinda surprised at this.
 
Right, but you'd have thought with Sony's binning that it wouldn't be such an issue. First time I'm hearing about this though. On Fomo's video with John from Value Electronics, he said his A90J samples were the cleanest he'd seen from Sony, so I'm kinda surprised at this.

It has been talked about over at AVS for at least a month. What has been noticed is that the uniformity of the 55-inch TVs has been better than that of the 65-inch TVs.
 

Kuranghi

Gold Member
He said that turning on Game or Graphics mode disables it.
I agree that the interpolation should be off completely when not selected, but it is certainly off completely in game mode, as that would definitely add lag if it were on.

Man, A90J seems to have a couple issues compared to last year. Hope the panel grid stuff is sorted out.

Ah yeah ofc, ta, he said it 1 second after as well, brain failure. No worries for gaming then, except for maddos who play outside game mode.

I don't know if it's a HW issue that's impossible to solve without changing to a new process, but I do hope they find a solution to how OLED handles near black; if it handled that as well as a good FALD LCD, combined with how OLED works, it would be amazing. At that point you just have to get around ABL limitations, keep increasing peak brightness, and develop better BFI for OLED that further decreases stutter for movies/low-fps content.

There will probably never be one perfect display tech; this will continue for decades lol, there will always be something to complain about :messenger_tears_of_joy:

And the TV is smart enough to do that automatically. I have the PS5 and an Apple TV 4K hooked up through the receiver. The TV will switch to Dolby Vision mode when on the Apple TV and then switch to Game mode automatically when you switch to the PS5's input.

Does it change out of game mode and back to your Cinema/Custom/media watching picture mode (whatever PM it was set to before game mode was auto turned on) when you use the media apps/BD player? I don't use those so PS5 is always set to game mode but I've wondered how that worked.

Just in case someone wanted motion interpolation on for BDs/media apps; also, having it in Cinema/Custom will probably give you better image quality than Game mode. I presume it's just reading the device information and setting the mode accordingly, so it can't tell what the actual content being output from the device is.
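Something like this is roughly what I imagine it's doing under the hood. Just a toy sketch in Python; the flag and device names here are mine, not anything from Sony's firmware:

```python
# Hypothetical auto picture mode logic. Assumes the TV can see an ALLM-style
# low-latency flag, a Dolby Vision flag and the source device's reported name
# over HDMI; all of these identifiers are illustrative, not Sony's actual API.

def pick_picture_mode(low_latency: bool, dolby_vision: bool, device_name: str) -> str:
    """Pick a picture preset from what the source device signals, not from the content."""
    if low_latency or "PlayStation" in device_name:
        return "Game"                    # prioritise input lag for consoles
    if dolby_vision:
        return "Dolby Vision Bright"     # DV metadata present in the signal
    return "Custom"                      # fall back to the usual movie preset

# e.g. an Apple TV 4K playing a DV title vs. a PS5 booting a game
print(pick_picture_mode(False, True, "Apple TV 4K"))    # -> Dolby Vision Bright
print(pick_picture_mode(True, False, "PlayStation 5"))  # -> Game
```

Which is why it can only go by what the box says about itself, not by what the box is actually playing.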
 

Kuranghi

Gold Member
Right, but you'd have thought with Sony's binning that it wouldn't be such an issue. First time i'm hearing about this though. On Fomo's video with John from value electronics, he said his A90J samples were the cleanest he'd seen from Sony so i'm kinda surprised at this.

I think even if they are still strict with what panels they use for their sets, ultimately Sony are still at the mercy of what LGD choose to give them, or give them access to within the panel. Like when LG had BFI in their OLEDs but Sony couldn't add it until the following year, saying they didn't have control of some part of the panel, meaning they couldn't implement their own version of BFI the same year as LG.

Even though they'd had the best BFI by far since 2018. I don't know all the details though, so maybe it was just a challenge to adapt their SW technique to OLED tech and it took a year longer than they thought it would.
 
Ah yeah ofc, ta, he said it 1 second after as well, brain failure. No worries for gaming then, except for maddos who play outside game mode.

I don't know if it's a HW issue that's impossible to solve without changing to a new process, but I do hope they find a solution to how OLED handles near black; if it handled that as well as a good FALD LCD, combined with how OLED works, it would be amazing. At that point you just have to get around ABL limitations, keep increasing peak brightness, and develop better BFI for OLED that further decreases stutter for movies/low-fps content.

There will probably never be one perfect display tech; this will continue for decades lol, there will always be something to complain about :messenger_tears_of_joy:



Does it change out of game mode and back to your Cinema/Custom/media watching picture mode (whatever PM it was set to before game mode was auto turned on) when you use the media apps/BD player? I don't use those so PS5 is always set to game mode but I've wondered how that worked.

Just in case someone wanted motion interpolation on for BDs/media apps; also, having it in Cinema/Custom will probably give you better image quality than Game mode. I presume it's just reading the device information and setting the mode accordingly, so it can't tell what the actual content being output from the device is.
I think OLED will stick around for maybe even another decade at this point; MicroLED is just being a headache lol. In that time we are going to see better and better OLEDs, and I do think near black can be solved eventually. They can still use better organic material to reach higher peak brightness than the new Evo panel, and as we know the Sony doesn't even use Evo, and there's no heatsink on the LG sets.

Unless Samsung QNED is amazing enough for the industry to shift towards it, I think OLED will be king for a long while yet.
 

SegaShack

Member
Just got our first 4K TV: an 85-inch 950CH. I'm amazed at how little I had to tweak the settings to calibrate it.

Even just watching from a standard Sony Blu-ray player and playing games on a base-model PS4 looks amazing. I'm sure 4K content will look even better.
 

Kuranghi

Gold Member
Just got our first 4K TV: an 85-inch 950CH. I'm amazed at how little I had to tweak the settings to calibrate it.

Even just watching from a standard Sony Blu-ray player and playing games on a base-model PS4 looks amazing. I'm sure 4K content will look even better.

Play this in the internal YouTube app of your new monster telly:




For an "oh my effing god" effect, play it on a PC connected to the TV in a browser that supports 8K and see how downsampled 8K looks. Spoiler: it's completely awesome even when you don't have an 8K panel, though this is a special case since he filmed the video at 12K and finished it at 8K.
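If you're wondering why above-native resolution still helps: each 4K pixel ends up as the average of a 2x2 block of 8K pixels, which cleans up noise and aliasing. Here's a quick numpy sketch of that idea; a plain box filter, not whatever YouTube or your GPU actually uses:

```python
import numpy as np

def downsample_2x(frame: np.ndarray) -> np.ndarray:
    """Average non-overlapping 2x2 pixel blocks; 8K -> 4K UHD is exactly a 2x reduction."""
    h, w = frame.shape[0] // 2 * 2, frame.shape[1] // 2 * 2
    return frame[:h, :w].reshape(h // 2, 2, w // 2, 2, -1).mean(axis=(1, 3))

frame_8k = np.random.rand(4320, 7680, 3)   # stand-in for a 7680x4320 RGB frame
print(downsample_2x(frame_8k).shape)       # (2160, 3840, 3), i.e. a 4K UHD frame
```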
 

S0ULZB0URNE

Member
Lol what? Source your claims

You are claiming Sony are lying.
I suggest a read through here.


XR is new but the SOC is shared.

If you have proof that the 2021 TVs are using a different SOC, post it.


 

S0ULZB0URNE

Member
Don't bother; dude has ignored my posts on input lag and processing settings. That's why he keeps repeating "it's for movies" lol. He thinks smooth gradation adds lag; tempted to put him on ignore to be honest.
Ignored?
I am waiting on proof that the added processing doesn't add input lag.

I could be correct; just prove it and we can move along.
 
Ignored?
I am waiting on proof that the added processing doesn't add input lag.

I could be correct; just prove it and we can move along.
I would never waste my time showing you, someone who just posts here to show everyone how right you are. Buy your own lag tester before talking about things you know nothing about. "I could be correct", see: you're making definite statements without even being sure.

Why didn't you politely ask for proof before instead of sheepishly ignoring my last whole post to you?
 

kyliethicc

Member
I suggest a read through here.


XR is new but the SOC is shared.

If you have proof that the 2021 TVs are using a different SOC, post it.


You still have no proof of your claim that the XR is the same as the X1.

Sony have stated the XR is new for 2021. You claim they're lying... and your evidence is an owner's manual of the X900H? lol
 

S0ULZB0URNE

Member
I would never waste my time showing you, someone who just posts here to show everyone how right you are. Buy your own lag tester before talking about things you know nothing about. "I could be correct", see: you're making definite statements without even being sure.

Why didn't you politely ask for proof before instead of sheepishly ignoring my last whole post to you?
So you are the one who continues to make claims and even claimed to post proof but still haven't?
Gotcha

From my experience and research I am sure.
But nobody is perfect, right?
It takes a bigger man to be open to correction, no?
 

S0ULZB0URNE

Member
You still have no proof of your claim that the XR is the same as the X1.

Sony have stated the XR is new for 2021. You claim they're lying... and your evidence is an owner's manual of the X900H? lol
XR is an additional chip.
The SOC is the same.
Proof has been posted.

 
So you are the one who continues to make claims and even claimed to post proof but still haven't?
Gotcha

From my experience and research I am sure.
But nobody is perfect, right?
It takes a bigger man to be open to correction, no?
And you ignore the question. Buy a lag tester or continue to spout ignorance. I never said I posted proof, just my findings. Which you ignored like a pussy willow.

Were someone else to ask me out of curiosity, I might show them. But so far no one has asked, so.
 

dotnotbot

Member
I don't know what you guys are arguing about. It's well known that the X90J, X95J, A80J, A90J etc. have the same MediaTek SOC as the XH90, but the difference is the new XR companion chip that replaces the old X1 Ultimate (the XH90 didn't have an additional chip at all).

In the past, some Sony TVs had different input lag numbers even though they shared the same SOC and companion chip (the A9G and A8H, for example), so input lag numbers aren't a good indicator.
 
I don't know what you guys are arguing about. It's well known that the X90J, X95J, A80J, A90J etc. have the same MediaTek SOC as the XH90, but the difference is the new XR companion chip that replaces the old X1 Ultimate (the XH90 didn't have an additional chip at all).

In the past, some Sony TVs had different input lag numbers even though they shared the same SOC and companion chip (the A9G and A8H, for example), so input lag numbers aren't a good indicator.
The XH90 did have a picture processor; it's kinda important to have, you know! Dubbed X1, but not to be confused with the X1 in the X900E from 2017, as the one in the X900H is not as good.

Souls Dork is saying that if the X90J gets a VRR update, then the X900H will get it as well, but the updates are rolled out on a set-by-set basis, so that's not necessarily true. I'm not really interested in whether it has the same HDMI chipset or not.

I will believe any Sony TV getting a VRR update when I see it; best not to trust these companies on stuff like this.
 

S0ULZB0URNE

Member
The XH90 did have a picture processor; it's kinda important to have, you know! Dubbed X1, but not to be confused with the X1 in the X900E from 2017, as the one in the X900H is not as good.

Souls Dork is saying that if the X90J gets a VRR update, then the X900H will get it as well, but the updates are rolled out on a set-by-set basis, so that's not necessarily true. I'm not really interested in whether it has the same HDMI chipset or not.

I will believe any Sony TV getting a VRR update when I see it; best not to trust these companies on stuff like this.
Wtf are you saying now?
 
The XH90 did have a picture processor; it's kinda important to have, you know! Dubbed X1, but not to be confused with the X1 in the X900E from 2017, as the one in the X900H is not as good.

Souls Dork is saying that if the X90J gets a VRR update, then the X900H will get it as well, but the updates are rolled out on a set-by-set basis, so that's not necessarily true. I'm not really interested in whether it has the same HDMI chipset or not.

I will believe any Sony TV getting a VRR update when I see it; best not to trust these companies on stuff like this.

I wouldn’t be surprised if they get VRR no later than when the PS5 gets VRR. It would be a really bad look for them if they didn’t since they said they were PS5 ready.
 
The video we need right now:


NGL I could use a bit of a hug :messenger_grinning_sweat:
I wouldn’t be surprised if they get VRR no later than when the PS5 gets VRR. It would be a really bad look for them if they didn’t since they said they were PS5 ready.
That's a good point; if they ever do update these TVs it'll be for PS5 usage; they don't care if you can use VRR with your Xbox lol.

But I really expect Sony to do people dirty on this one, and the 900H got off to a sketchy start. Happy to be proven wrong though.
 

dotnotbot

Member
The XH90 did have a picture processor; it's kinda important to have, you know! Dubbed X1, but not to be confused with the X1 in the X900E from 2017, as the one in the X900H is not as good.

Of course, but it's just the SOC, no companion chip. I'm saying that based on Vincent's review; he says this around the 7:43 mark:

Because video processing duties are carried out entirely on the MT5895 SOC instead of together with Sony's own companion chip, there's no smooth gradation de-contouring filter available in the picture menu.



It also lacks Reality Creation and has poor near-black gradation compared to their other TVs. It seems like Sony couldn't fit all the video processing on the SOC itself and some corner-cutting was required. Or maybe they could, but the algorithms are written with a different chip in mind and porting them would be too troublesome.
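For anyone wondering what smooth gradation actually does: it's essentially a debanding filter that smooths the near-flat areas where banding shows up and leaves detailed areas alone. A toy version of the idea (my own sketch, definitely not Sony's actual algorithm):

```python
import numpy as np
from scipy import ndimage

def deband(luma: np.ndarray, flat_threshold: float = 2.0, blur_sigma: float = 3.0) -> np.ndarray:
    """luma: 2D array of 0-255 values. Smooth only the near-flat (banded) regions."""
    blurred = ndimage.gaussian_filter(luma, sigma=blur_sigma)
    gradient = ndimage.gaussian_gradient_magnitude(luma, sigma=1.0)
    flat = gradient < flat_threshold      # where the image is basically a smooth ramp
    out = luma.copy()
    out[flat] = blurred[flat]             # replace banded areas with the smoothed version
    return out
```

Running something like that on every frame isn't free, which would fit the idea that the SOC alone didn't have the headroom for it.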

The theory is that Sony "desperately" needed an HDMI 2.1 model in their 2020 line-up. XR still wasn't ready and the X1 Ultimate isn't designed to work with the new SOC, so they created the XH90 as a transition model, scrapping some things that would later be carried out by the new XR companion chip.

Souls Dork is saying that if the X90J gets a VRR update, then the X900H will get it as well, but the updates are rolled out on a set-by-set basis, so that's not necessarily true. I'm not really interested in whether it has the same HDMI chipset or not.

I will believe any Sony TV getting a VRR update when I see it; best not to trust these companies on stuff like this.

I agree about that, but it's possible that the updates will not be far off in time between each model. We'll see.
 

Kuranghi

Gold Member
It's annoying to have all these caveats since the 2020 models came out, not just for Sony but for other brands too:

* New tech brought new problems
* New processing features but now missing old processing features
* Zone count downgrades but algorithm upgrades

My boss used to see the new models as early as the fall of the previous year, so for 2020 at least I don't think it's Covid-related; those sets probably entered manufacturing around the time of the first Covid case, so the design was complete way before that.

It's fantastic if you are coming from older/lower-end HW, but if you follow this stuff, 2016-2018 was just upgrade after upgrade with no downsides, aside from some X-Wide Angle models, though that could be argued to be an upgrade by some.
 
Of course, but it's just the SOC, no companion chip. I'm saying that based on Vincent's review; he says this around the 7:43 mark:





It also lacks Reality Creation and has poor near-black gradation compared to their other TVs. It seems like Sony couldn't fit all the video processing on the SOC itself and some corner-cutting was required. Or maybe they could, but the algorithms are written with a different chip in mind and porting them would be too troublesome.

The theory is that Sony "desperately" needed an HDMI 2.1 model in their 2020 line-up. XR still wasn't ready and the X1 Ultimate isn't designed to work with the new SOC, so they created the XH90 as a transition model, scrapping some things that would later be carried out by the new XR companion chip.



I agree about that, but it's possible that the updates will not be far off in time between each model. We'll see.

Oh, I thought it had an extra chip and it was just crap. Makes sense I guess, have to look into that a bit.

Yeah, definitely wouldn't want to downgrade to that from my 900E, but at least the J series has a picture processor again. And yep, Sony just had to rush out a 2.1 screen for the PS5 launch.

Waiting for a mini-LED set from Sony to upgrade the ol' 900E upstairs.
 

dolabla

Member
Well, looky here. The X90J has officially dropped in price at Amazon and Best Buy overnight. The 50" is now $1,098, the 55" $1,298, and the 65" $1,598. ManaByte better go get your $200 price adjustment at Best Buy.



I wasn't able to cancel my order, unfortunately. And they wouldn't adjust the price, which I knew they'd stopped doing a few years ago, but I asked anyway. But I got it settled after talking to Amazon.
 

kyliethicc

Member
Well, looky here. The X90J has officially dropped in price at Amazon and Best Buy overnight. The 50" is now $1,098, the 55" $1,298, and the 65" $1,598. ManaByte better go get your $200 price adjustment at Best Buy.



I wasn't able to cancel my order, unfortunately. And they wouldn't adjust the price, which I knew they'd stopped doing a few years ago, but I asked anyway. But I got it settled after talking to Amazon.
Considering the X900H 55" can be had for $1000, the most I'd pay for the X90J 55" is $1200 (assuming it's an improvement).

Will be interesting to see how low it goes in November.
 
Well, looky here. The X90J has officially dropped in price at Amazon and Best Buy overnight. The 50" is now $1,098, the 55" $1,298, and the 65" $1,598. ManaByte better go get your $200 price adjustment at Best Buy.



I wasn't able to cancel my order, unfortunately. And they wouldn't adjust the price, which I knew they'd stopped doing a few years ago, but I asked anyway. But I got it settled after talking to Amazon.

I’m glad I somehow got $300 off my A90J. I would have been a little bummed paying full price at launch to have it drop in a month.
 

Kuranghi

Gold Member
The 75" X900H is back at November pricing (which I got) at $1,599, or even a few dollars cheaper at Costco, just FYI. Insane amount of TV for the money.

Fuck me, I used to have to sell the 75" edge-lit 85-series Sonys for ~£2,199 up to BF (£1,899 on BF), and the 75" 90-series was like £2,499 in August and £2,299 on BF. It was almost dollar-for-pound, so just convert that to dollars.

You lucky bastards haha.
 

Esppiral

Member
Still waiting for an update for my XH90; if they don't plan on adding VRR, at least disable that fucking annoying power-saving warning that pops up every time I change anything.
 

Kuranghi

Gold Member
Still waiting for an update for my XH90; if they don't plan on adding VRR, at least disable that fucking annoying power-saving warning that pops up every time I change anything.

Yeah, jfc, if they had patched that into my 2016 Sony I would have murdered someone; I change the brightness about 19 times a day lol.
 

Kuranghi

Gold Member
AFAIK it's only present on EU/UK firmware; Sony claims it's because of some EU law.

Yeah, probably; Sony are a bunch of rule-following nerds. I think the XF90 feet were a response to an upcoming (possibly also UK/EU-only) law that said TVs had to be able to be pulled/tilted forward a certain number of degrees and still fall backwards instead of forwards, to protect kids from pulling them down on themselves.

Weird "duck feet"; the idea behind them, they told me, was stability, and also they are supposedly set at the maximum preferred off-axis angle someone should sit at to still get a good picture, which makes the foot look its thinnest, so I guess that means a bit further left than where the camera is in this shot:




I was like "What about the other foot?" They did make them metal and half as thin on the XG90 and some other models, which was nice.
 