
Sony’s 2021 TV lineup runs Google TV and fully embraces HDMI 2.1

DeepEnigma

Gold Member
I calibrated using HDTV Test settings from Vincent. I like my TV, I just hope they do update it with Google TV, improve 120Hz, and add VRR. If they come through with those promises, the 900H was an absolute steal.
I like Vincent, but I wouldn't use his settings. He likes a lighter (less inky blacks) picture because of "source tools."

This guy's settings are the bee's knees, IMO:


Keeping it simple, and it looks amazing across all modes, especially gaming, with lower bloom.
 
Last edited:

S0ULZB0URNE

Member
Maybe, depends on how accurate that info is, but I'm not necessarily doubting it. Trying to think why it'd be that way though.

It's weird because the X90J shares the same processor as the A90J, according to Sony (which may explain the extra cost compared to the 55-inch 950G/H), and processing is the main area where the 900H suffered compared to older sets. It *really* shouldn't be worse lol. I saw your brightness slide as well... which doesn't really explain the blooming pic you posted, so I'm not sure. Perhaps the local dimming settings are different?

It'll be interesting to see when more people get their hands on it.

The 32 zones bit I definitely believe lol.
The chip they share is the XR chip. It's mainly for movies and adds extra processing which isn't good for gaming.

The X90H/J share the same SOC.

I think Sony cut corners which explains the lower performance than the H.
Also the OS is Google TV instead of Android TV.
Not sure if this is an issue either.
 
The chip they share is the XR chip. It's mainly for movies and adds extra processing which isn't good for gaming.

The X90H/J share the same SOC.

I think Sony cut corners which explains the lower performance than the H.
Also the OS is Google TV instead of Android TV.
Not sure if this is an issue either.
You may not remember, but you told me once that smooth gradation and other processing in the picture menus add lag on Sony sets. They actually don’t, with the exception of bfi on oled sets. Tested by myself of course.

If Bravia displays allowed motion interpolation for games, that would no doubt add lag as well but of course you can’t use it in game mode. Maybe on future Sony sets.

By virtue of having the XR processor the new set should be better. Now it might not be, but it’s hard to imagine so we’ll have to see.

Until I can test the XR sets myself I'm going to assume you can leave a multitude of settings on in game mode without a lag hit; it's hard to believe Sony would move backwards on these features.
 
Last edited:

S0ULZB0URNE

Member
You may not remember, but you told me once that smooth gradation and other processing in the picture menus add lag on Sony sets. They actually don’t, with the exception of bfi on oled sets. Tested by myself of course.

If Bravia displays allowed motion interpolation for games, that would no doubt add lag as well but of course you can’t use it in game mode. Maybe on future Sony sets.

By virtue of having the XR processor the new set should be better. Now it might not be, but it’s hard to imagine so we’ll have to see.

Until I can test the XR sets myself I'm going to assume you can leave a multitude of settings on in game mode without a lag hit; it's hard to believe Sony would move backwards on these features.
They do add input lag.
Not saying it's a whole lot, but it's there, and those settings are off by default in Game mode for a reason.

Smooth Gradation is great for lower-quality sources but should not be used for gaming, IMO.
 
They do add input lag.
Not saying it's a whole lot, but it's there, and those settings are off by default in Game mode for a reason.

Smooth Gradation is great for lower-quality sources but should not be used for gaming, IMO.
My man, I have tested settings myself with a Leo Bodnar 4K lag tester, on both a Bravia 4K LCD and Sony OLEDs.

I bought it because there's this notion that they add lag, but I never felt a difference toggling settings on and off, so I bought one for academic purposes. Indeed, I noticed lag when turning on BFI on the OLED prior to testing with the device, and that was the only thing I ever felt a difference on. My perceptions, as it turned out, were right.

I still leave BFI on on the A8H though; it's still only 26 ms of lag and looks so much smoother with BFI.

Edit: I've found that smooth gradation should be left on low for modern systems, and off if you've got a retro console hooked up. Reality Creation, as Sony calls it, is really effective sharpening that really doesn't even add noise at the 40 out of 100 value on my Wii/GameCube output settings. Really makes a world of difference, and again, no lag! It should not be anywhere near that high for modern consoles though.

Funny because I associate sharpening with the gobshite on PC designed to crawl back detail obliterated by the use of TAA, but Sony's method is quite good.
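If anyone wants to put that 26 ms figure into frames, here's a rough Python sketch; it's just plain frame-time arithmetic, nothing specific to the Bodnar tester or any Sony menu:

```python
# Rough sketch: convert a measured display lag in milliseconds into frames of
# delay at a given refresh rate. The 26 ms figure is the A8H-with-BFI number
# quoted above; the frame times assume ideal 60 Hz / 120 Hz output.

def lag_in_frames(lag_ms: float, refresh_hz: float) -> float:
    """Return how many frames of delay a given lag represents."""
    frame_time_ms = 1000.0 / refresh_hz
    return lag_ms / frame_time_ms

if __name__ == "__main__":
    for hz in (60, 120):
        print(f"26 ms at {hz} Hz ≈ {lag_in_frames(26, hz):.2f} frames")
    # 26 ms at 60 Hz ≈ 1.56 frames
    # 26 ms at 120 Hz ≈ 3.12 frames
```

So roughly a frame and a half at 60 Hz, which is why it's easy to live with.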
 
Last edited:

Kerotan

Member
So the X90J looks like 1st of May and €1500 in Europe. My friend will be happy as he's saved up €2000 for this. I'm just trying to convince him to read the reviews before buying, just in case there are any big issues.
 

Bo_Hazem

Banned
So the X90J looks like 1st of May and €1500 in Europe. My friend will be happy as he's saved up €2000 for this. I'm just trying to convince him to read the reviews before buying, just in case there are any big issues.

Don't be too happy. That €1500 seems to be for the 50", not even the 55". Better to wait for a price cut.
 

Kerotan

Member
Don't be too happy. That €1500 seems to be for the 50", not even the 55". Better to wait for a price cut.
He's not waiting for a price cut. He's been waiting 3 years for this and has more money than he knows what to do with, thanks to COVID not allowing him to spend on the things he enjoys.
 

Bo_Hazem

Banned
So no VRR? Not ready for the displays and the PS5? If it was, that would be something very specific to point out in addition to ALLM.

So far a version of VRR seems to only work for the PS5 mated with the X900H or other HDMI 2.1 TVs, but a true certified VRR isn't implemented yet on either the console or their TVs.
 

Jigga117

Member
So far a version of VRR seems to only work for the PS5 mated with the X900H or other HDMI 2.1 TVs, but a true certified VRR isn't implemented yet on either the console or their TVs.
I can’t understand how that works when VRR isn’t on the PS5 and both devices need VRR to work. That is new. Do you have anything that pointed to that I can check out?
 

Bo_Hazem

Banned
I can’t understand how that works when VRR isn’t on the PS5 and both devices need VRR to work. That is new. Do you have anything that pointed to that I can check out?

Here is more information:





VRR is part of the HDMI Forum and AMD trademarks, so there are certain requirements you need to meet and get officially approved for before claiming it, or you'll face lawsuits. But I'm not worried about the new lineup, as they put a high-end processor in something as low as the X90J for that reason: first to make sure that it's working properly, and to avoid problems like no Dolby Vision at 4K@120Hz or the blur problems that occurred on the previous X900H. Still, it's safer to wait for reviews and tests before throwing money at it if you care much about VRR and perfect 4K@120Hz, along with Dolby Vision at 4K@120Hz if that will be supported by Xbox.
 
Last edited:

Bo_Hazem

Banned
Couldn't find the piece about the hidden VRR-like version that only works with the PS5 mated with the X900H, but it's been around and semi-confirmed by some insiders:

Comment:

SeniorFallRisk

On the other hand, there's now a rumor/confirmation that the PS5 and X900H have VRR working together that's invisible to everyone but the two pieces of hardware?
Did anyone else catch that? He references it here, but then in the 3-man stream they blatantly said/agreed there's some special sauce going on that's invisible to everyone between the PS5 and X900H, since FreeSync/forum VRR isn't enabled on both yet.


 
Last edited:

Fake

Member
I got lost many times trying to calibrate other brands such as LG/Samsung, but if I remember right the Bravia factory calibration is by far the closest to the ideal calibration. I guess Sony uses the Sony camera division to help the Bravia TV division in that department.

I really can't wait to go back to using a Bravia again.
 

Bo_Hazem

Banned
I got lost many times trying to calibrate other brands such as LG/Samsung, but if I remember right the Bravia factory calibration is by far the closest to the ideal calibration. I guess Sony uses the Sony camera division to help the Bravia TV division in that department.

I really can't wait to go back to using a Bravia again.

More like they have their reference monitors that other brands would use to calibrate their own TVs (except for Panasonic, as it has its own reference monitor).

[Image: bvm-x300_gamma.jpg]


 

Kuranghi

Gold Member
I agree. At those prices I’d rather consider the LG CX as they move to this year’s models or a G1 near BF.

I will say that I'm enjoying the A90J. The picture is great. I really enjoy seeing texture details on clothing and sets. I even appreciate the texture differences on clothing in horribly animated shows like Mickey Mouse Clubhouse when the kids are watching.

I wish broadcast tv was crisper, but at least the motion is good. The thing that probably irritates me the most on broadcast tv is the slight bit of artifacts that I see around the outlines of people/objects. Maybe I need to update settings. I’m just using the default cinema settings right now.

I do notice slight stuttering on pan shots here and there. I haven’t had a chance to game on it yet, I’ve been holding our newborn in the evenings to give my wife a little break.

Try turning smoothness down a notch, or even to minimum. Also, do you know if the signal coming from the cable box is interlaced or progressive?

I think it may be film mode that needs to be adjusted, but I'm not sure of the setting it should be on as I have no firsthand knowledge of the new sets. Vincent should have his settings video out really soon though, so maybe have a look at that, or if someone else has already done one then check theirs out.
 
I can’t understand how that works when VRR isn’t on the PS5 and both devices need VRR to work. That is new. Do you have anything that pointed to that I can check out?
I have the X900H and the PS5 and I can confirm that at least some kind of "ALLM" is implemented with this pair.
 
Last edited:

Allandor

Member
So still no VRR until the end of the year for the XH90 or X90J :( and this time it's Sony's own video that says "by Winter 2021".
 
Last edited:

Dibils2k

Member
I recently got the 65" XH950. I was gonna wait for the 2021 TVs but decided I have no need for 2.1; consoles have to sacrifice way too much, and I would never pick a 120fps mode over 60fps with much better visuals.
 

bargeparty

Member
My man, I have tested settings myself with a Leo Bodnar 4K lag tester, on both a Bravia 4K LCD and Sony OLEDs.

I bought it because there's this notion that they add lag, but I never felt a difference toggling settings on and off, so I bought one for academic purposes. Indeed, I noticed lag when turning on BFI on the OLED prior to testing with the device, and that was the only thing I ever felt a difference on. My perceptions, as it turned out, were right.

I still leave BFI on on the A8H though; it's still only 26 ms of lag and looks so much smoother with BFI.

Edit: I've found that smooth gradation should be left on low for modern systems, and off if you've got a retro console hooked up. Reality Creation, as Sony calls it, is really effective sharpening that really doesn't even add noise at the 40 out of 100 value on my Wii/GameCube output settings. Really makes a world of difference, and again, no lag! It should not be anywhere near that high for modern consoles though.

Funny because I associate sharpening with the gobshite on PC designed to crawl back detail obliterated by the use of TAA, but Sony's method is quite good.

What picture mode and settings do you use with the A8H for "modern" gaming?


Apologies you listed it somewhere else I haven't gone back through the thread.
 
What picture mode and settings do you use with the A8H for "modern" gaming?


Apologies you listed it somewhere else I haven't gone back through the thread.
These settings apply to both my sets: I just leave Reality Creation on the default setting of 10 out of 100 on the PS4 Pro, Smooth Gradation on low, and turn off extended dynamic range. Local dimming is on high for my X900E LCD. Under motion settings, under clearness I use a value of 2, which is the most effective 120Hz BFI, as 3 gives you 60Hz BFI with noticeable flicker. There's talk of image duplication when the BFI doesn't match the output fps, but in truth, due to the sample-and-hold nature of the display and the low-fps blur that occurs anyway, any duplication just blends into whatever blur is already present. At least on my sets. Some Sony sets display BFI a bit differently.

For Xbox 360, Wii U and Switch I typically have Reality Creation on a higher setting of 20 because the games are typically less than 1080p. The rest of the settings are the same. Games like Yoshi's Crafted World could really use some sharpening.

Btw I have the classic systems hooked up to my smaller X900E LCD, not the OLED, because I don't want 4:3 black-bar burn-in lol.
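For quick reference, here's the same cheat sheet written out as a little Python snippet. The keys and values just mirror what I described above; there's no real Sony API behind this, it's only a plain data dump of my menu settings:

```python
# Hypothetical cheat sheet of the menu settings described above, per source.
# This is just plain data for reference; there is no actual Sony API behind it.

SETTINGS = {
    "PS4 Pro": {
        "reality_creation": 10,       # default value
        "smooth_gradation": "low",
        "extended_dynamic_range": "off",
        "local_dimming": "high",      # on the X900E LCD
        "motion_clearness": 2,        # 120 Hz BFI; 3 = 60 Hz BFI with visible flicker
    },
    "Xbox 360 / Wii U / Switch": {
        "reality_creation": 20,       # higher because the games are sub-1080p
        "smooth_gradation": "low",
        "extended_dynamic_range": "off",
        "local_dimming": "high",
        "motion_clearness": 2,
    },
    "Wii / GameCube": {
        "reality_creation": 40,       # aggressive sharpening, still no visible added noise
        "smooth_gradation": "off",    # off for retro sources
    },
}

if __name__ == "__main__":
    for source, values in SETTINGS.items():
        print(source)
        for setting, value in values.items():
            print(f"  {setting}: {value}")
```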
 
Last edited:

Guy Legend

Member
So what's the new Sony entry-level TV? I saw a YouTube video that featured an X75 as part of the new lineup, but online I see Sony TV listings starting with the X80J.
 

Bo_Hazem

Banned
So what's the new Sony entry-level TV? I saw a YouTube video that featured an X75 as part of the new lineup, but online I see Sony TV listings starting with the X80J.

I would suggest the X85J as it has 2x HDMI 2.1 ports, but it's edge-lit and will likely inherit the same X1-processor defects as the X900H, which is not bad at all. If you're not going for a small 43" one, then I would suggest you grab the current X900H instead, which should still be superior with its FALD and impressive blacks. It's still listed at $999, but I believe you can get the 55" for a cheaper price in places like Costco if you are in the US:

 

dolabla

Member
That's actually not very far from 48", so I can see that size being popular among PC gamers. I'll stick with 55" because it's very close to A0 prints (57"), so it would help me understand the final print when I get the Canon 44" printer (nearly as wide as a 55" TV).
Yeah, I'm limited in space so it'll be perfect for me. It's a little expensive for the size, but hopefully it sees a price drop within 45 days (my Best Buy return window) and I can get refunded the difference. I'm just ready to get a new TV dang it :messenger_beaming:
 

Bo_Hazem

Banned
Yeah, I'm limited in space so it'll be perfect for me. It's a little expensive for the size, but hopefully it sees a price drop within 45 days (my Best Buy return window) and I can get refunded the difference. I'm just ready to get a new TV dang it :messenger_beaming:

The X900H was advertised at $1400 for the 55"; it dropped to $999 basically on day one of release. But don't forget that you're getting the best processor on the market, the one featured in Sony's most expensive 8K TV. It's more like overkill for the X90J, but I think they're trying to drive its cost down by offering it across the board, and it has enough processing power to handle HDMI 2.1 and a lot more, like miniLED in the future.
 
Last edited:

Kuranghi

Gold Member
Yeah launch prices are massively inflated usually but these really are nonsense. The X900H dropped to 999 on day one because the 55" X900F settled at 999 in May 2019, then they kept selling it with the 2019 models and again with the May 2020 models but only for a few months before it went away that time.

So since the X900H/X90J is a downgrade of the X900F in many ways, it probably means it could settle at like 799, so definitely wait till Black Friday if you can. Although the X90J has the new processor, which is supposed to be great, so that's a boon, though I'm still waiting for comparisons with the old chips to be sure.

edit - I just realised the XR processor and X-Motion Clarity XR (whatever that is) are probably a big enough thing to warrant a higher price, so maybe 899 is more realistic for what it will settle at from a marketing perspective, but cost-to-produce-wise I think it's worth 799/849 personally.

Are there any pre-release brightness values (from Chinese sites and the like) for the X90J yet?
 
Last edited:
Yeah launch prices are massively inflated usually but these really are nonsense. The X900H dropped to 999 on day one because the 55" X900F settled at 999 in May 2019, then they kept selling it with the 2019 models and again with the May 2020 models but only for a few months before it went away that time.

So since the X900H/X90J is a downgrade of the X900F in many ways, it probably means it could settle at like 799, so definitely wait till Black Friday if you can. Although the X90J has the new processor, which is supposed to be great, so that's a boon, though I'm still waiting for comparisons with the old chips to be sure.
Yeah, if I had a choice of X900F vs. X900H I would probably buy the former lol. Input lag would still be good for 4K sources.

Curious to see how the XR processor does on Sony's lowest-end FALD VA set this year.
 

Kuranghi

Gold Member
Yeah, if I had a choice of X900F vs. X900H I would probably buy the former lol. Input lag would still be good for 4K sources.

Curious to see how the XR processor does on Sony's lowest-end FALD VA set this year.

The main reasons for me are:

* Quite a bit better processing: the X900H's processing is not as good as the X900E's, let alone the X900F's
* 48 vs 32 dimming zones; even the X900E had 35 zones (7x5)
* 500 nits (X900H) vs 900 nits (X900F); the X900E is 500 nits also

I can't wait to see VT's video on XR, yah.
 

HTK

Banned
I have a Sony 65" x900H that I got a month ago, the plan was to take it back once the X90J drops. I'm looking forward to the reviews to see if its worth trading in my 65" X900H for a 55" X90J.
 

Kuranghi

Gold Member
I have a Sony 65" x900H that I got a month ago, the plan was to take it back once the X90J drops. I'm looking forward to the reviews to see if its worth trading in my 65" X900H for a 55" X90J.

I think if the X90J is the same as the X900H panel-wise and brightness-wise, then the main difference will be the better image processing and updated BFI tech. You have X-Motion Clarity BFI in your X900H and it's awesome, so I have high hopes for the updated version. Sony's processors go like this in terms of quality:

* X1 2020 (That's in the X900H)
* X1 (from before 2020)
* X1 Extreme
* X1 Ultimate
* XR (Potentially)

Since the new XR processor is the only chip across all models now, I would assume it's an upgrade on the X1 Ultimate, and if it is you can see how that would be a jump of many generations in terms of image presentation. It could be a massive step up, so definitely worth considering.
 

Bo_Hazem

Banned
The main reasons for me are:

* Quite a bit better processing: the X900H's processing is not as good as the X900E's, let alone the X900F's
* 48 vs 32 dimming zones; even the X900E had 35 zones (7x5)
* 500 nits (X900H) vs 900 nits (X900F); the X900E is 500 nits also

I can't wait to see VT's video on XR, yah.

The X900H is actually 700+ nits though.
 

HTK

Banned
I actually prefer the correct calibration of brightness at "3". Obviously with HDR it's at MAX, and it's fine. TIME STAMPED BELOW.

 
Last edited:

Kuranghi

Gold Member
The X900H is actually 700+ nits though.

I'm almost always talking about the "Real Scene" values, the top figure in the rtings brightness section. Rtings only recorded a value of 725 nits for the 25% window on the X900H; the 10% figure is 450 and Real Scene is 500. Generally, when you see leaked brightness readings from Chinese sites and the like, they will be quoting the 10% figure.

The 25% window is important too for large bright areas like an interior shot of a big bright window, but 2/10% and Real Scene is better for knowing how good general HDR impact will be.

2/10% is for the brightness of small highlights, like the sun or headlights, since those things will generally not take up a large area of the screen, certainly nowhere near 25%; even a 10% window will be bigger than them most of the time, so 2% is best for measuring that.

Real Scene is a way to see how bright 2, 5 and 10% windows will be in actual mixed content and not just a white box over a black screen. This is the video Rtings use (it's the SDR version, but it's the same video) for the Real Scene test:




If you have a look at the shot (these are rough percentages ofc, but you get the idea), the striplights at the top act as the 2% window, the smaller white square on the left is the 5% window, and the white square on the right is the 10% window. They made this test so you can see how bright highlights of those sizes will really be, as opposed to just what you read from a test pattern.

Here is the comparison for X900F vs. X900H I checked.


I usually look at the Real Scene + 2% + 10% + 25% values and then compare and contrast them. The Real Scene and 10% should be similar unless they are fucking around with the test patterns*, and hopefully the 2% should be the same or just a bit under those first two values, though sometimes it's much less if the TV has large dimming zones. The 25% and 50% figures are good for seeing how much brighter very bright scenes would be on an LCD vs. OLED, like the Matrix scene in the white construct area with Morpheus. Maybe also for really bright games, like the Mount Volbono level in Mario Odyssey.

*Occasionally the 10% figure is much more than the Real Scene figure, but that's usually due to a power limit, i.e. when the rest of the screen is pure black it has the power available to maximise the light output of the 10% window, but when the rest of the screen has content on it, it doesn't have the power left over to keep 10% of the screen at the same brightness. It only really applies to super-high-end LCD sets though.

Sorry for the lecture if you already knew all that, but while explaining why you shouldn't say the X900H is 700 nits, I thought I'd also write out a guide to help people understand the rtings brightness measurements a bit more.
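To make the window sizes a bit more concrete, here's a rough Python sketch of how many pixels a 2/10/25/50% window covers on a 4K panel, plus the X900H figures I quoted above. The nit values are just the ones from this post, not a live pull from rtings:

```python
# Back-of-the-envelope sketch of the rtings-style test window sizes on a 4K
# panel, plus the X900H peak-brightness figures quoted above. The nit values
# are just the ones mentioned in this post, not a live pull from rtings.

PANEL_W, PANEL_H = 3840, 2160
TOTAL_PIXELS = PANEL_W * PANEL_H  # 8,294,400 pixels

def window_pixels(percent: float) -> int:
    """How many pixels an N% window covers on a 4K panel."""
    return round(TOTAL_PIXELS * percent / 100)

if __name__ == "__main__":
    for pct in (2, 10, 25, 50):
        px = window_pixels(pct)
        side = px ** 0.5
        print(f"{pct:>2}% window ≈ {px:,} px (roughly a {side:.0f} x {side:.0f} square)")

    # X900H HDR brightness figures mentioned above, in nits
    x900h = {"Real Scene": 500, "10% window": 450, "25% window": 725}
    for label, nits in x900h.items():
        print(f"X900H {label}: {nits} nits")
```

Even a 2% window is a roughly 400 x 400 pixel square, which is why it's the better stand-in for things like headlights or the sun.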
 
Last edited:

Bo_Hazem

Banned
I'm almost always talking about the "Real Scene" values, the top figure in the rtings brightness section. Rtings only recorded a value of 725 nits for the 25% window on the X900H; the 10% figure is 450 and Real Scene is 500. Generally, when you see leaked brightness readings from Chinese sites and the like, they will be quoting the 10% figure.

The 25% window is important too for large bright areas like an interior shot of a big bright window, but 2/10% and Real Scene is better for knowing how good general HDR impact will be.

2/10% is for the brightness of small highlights, like the sun or headlights, since those things will generally not take up a large area of the screen, certainly nowhere near 25%; even a 10% window will be bigger than them most of the time, so 2% is best for measuring that.

Real Scene is a way to see how bright 2, 5 and 10% windows will be in actual mixed content and not just a white box over a black screen. This is the video Rtings use (it's the SDR version, but it's the same video) for the Real Scene test:




If you have a look at the shot (these are rough percentages ofc, but you get the idea), the striplights at the top act as the 2% window, the smaller white square on the left is the 5% window, and the white square on the right is the 10% window. They made this test so you can see how bright highlights of those sizes will really be, as opposed to just what you read from a test pattern.

Here is the comparison for X900F vs. X900H I checked.


I usually look at the Real Scene + 2% + 10% + 25% values and then compare and contrast them. The Real Scene and 10% should be similar unless they are fucking around with the test patterns*, and hopefully the 2% should be the same or just a bit under those first two values, though sometimes it's much less if the TV has large dimming zones. The 25% and 50% figures are good for seeing how much brighter very bright scenes would be on an LCD vs. OLED, like the Matrix scene in the white construct area with Morpheus. Maybe also for really bright games, like the Mount Volbono level in Mario Odyssey.

*Occasionally the 10% figure is much more than the Real Scene figure, but that's usually due to a power limit, i.e. when the rest of the screen is pure black it has the power available to maximise the light output of the 10% window, but when the rest of the screen has content on it, it doesn't have the power left over to keep 10% of the screen at the same brightness. It only really applies to super-high-end LCD sets though.

Sorry for the lecture if you already knew all that, but while explaining why you shouldn't say the X900H is 700 nits, I thought I'd also write out a guide to help people understand the rtings brightness measurements a bit more.


Very detailed from an experienced TV man! Much appreciated. 🙌
 

Kuranghi

Gold Member
I actually prefer the correct calibration of brightness at "3". Obviously with HDR it's at MAX, and it's fine. TIME STAMPED BELOW.



This first part here I'm talking about the gamma stuff that comes after the part you linked, I talk about the backlight setting below that:

I appreciate this guy's videos and thanks for posting them, but the whole "0 gamma is correct" thing isn't really a thing. I say just set it to 0 or -2 depending on what you think looks good, with the other settings adjusted to account for that.

Here is an article where they talk to calibration experts, and it seems to be personal preference which gamma they calibrate to; you just have to pick one and then base the calibration around it. 2.2 (0 on Sony TVs, as seen in the video) is an industry-standard gamma, but VT and Stacy Spears say 2.4.

I just know people often prefer a bigger difference between light and dark, and 2.4 (-2) will get you that; 2.2 (0) makes it too flat for me. Though I'm watching in a pitch-black room, so if your room is always well lit then definitely increase the gamma.

To your comment: some people just love low backlight settings, my friend really does and I get it. My TV's calibrated SDR day and night picture modes are at 41 and 23 out of 50 backlight respectively, so 3 out of 50 seems super low to me. Mine is 2.4 and 2.5 gamma for day and night though, so maybe that's why the backlight is boosted, or it could just be something unique to my set.
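To make the 2.2 vs 2.4 difference concrete, here's a quick Python sketch of the standard power-law relation (relative luminance = signal^gamma), which is all "gamma" means here. It's the idealised curve only; real panels deviate a little:

```python
# Quick sketch of why gamma 2.4 looks "deeper" than 2.2: the same signal level
# maps to a lower relative luminance, so shadows and midtones sit darker.
# Idealised power-law model only; real panels deviate a little from this.

def relative_luminance(signal: float, gamma: float) -> float:
    """Map a normalised signal level (0-1) to relative luminance (0-1)."""
    return signal ** gamma

if __name__ == "__main__":
    for level in (0.25, 0.50, 0.75):
        lum_22 = relative_luminance(level, 2.2)
        lum_24 = relative_luminance(level, 2.4)
        print(f"signal {level:.2f}: gamma 2.2 -> {lum_22:.3f}, gamma 2.4 -> {lum_24:.3f}")
    # signal 0.25: gamma 2.2 -> 0.047, gamma 2.4 -> 0.036
    # signal 0.50: gamma 2.2 -> 0.218, gamma 2.4 -> 0.189
    # signal 0.75: gamma 2.2 -> 0.531, gamma 2.4 -> 0.501
```

You can see the gap is biggest in the shadows and midtones, which is exactly where 2.4 gives you that extra "pop" between light and dark.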
 
Last edited:

Kuranghi

Gold Member
Very detailed from an experienced TV man! Much appreciated. 🙌

I appreciate the praise my friend, but I would say to anyone who reads my stuff that I'm not an expert in any way; I just parsed all this info from researching on the internet, speaking to experts, and my old TV job's resources, like the head office engineers and internal documents.

So definitely do your own reading, because I might be (read: probably am) wrong/confused on some stuff and I'd relish being corrected. The brightness stuff I'm pretty confident about, as that's kind of relative anyway, so even if I'm misunderstanding some more complicated facet of it, the gist of what I'm saying is still relevant for comparing sets.
 
Last edited:

99Luffy

Banned
Is edge-lit a deal breaker for monitor use? I'm waiting on reviews for the 43" X85J, but I'm thinking next year will be a better time to upgrade.
 

iamvin22

Industry Verified
I like Vincent, but I wouldn't use his settings. He likes a lighter (less inky blacks) picture because of "source tools."

This guy's settings are the bee's knees, IMO:


Keeping it simple, and it looks amazing across all modes, especially gaming, with lower bloom.

how dare you talk down on Vincent lol. I'll take a look at your vid.
 

DeepEnigma

Gold Member
how dare you talk down on Vincent lol. I'll take a look at your vid.
Hahaha, I like his informative videos, just not his picture settings, since what works on an OLED with his source material doesn't always translate well to a backlit LED. It may on microLED in the future.

This TV is so good out of the box, you only need to tweak a couple of things. Let me know what you think after.
 

SegaShack

Member
I have always been a Sony TV guy. Currently I have a projector for my main theater room and a 55" Bravia 3D in our master bedroom.

We were looking to get a 4K TV for our living room. Most of what I watch are Blu-rays or DVDs, sometimes on disc, but mostly ones I ripped in full quality, put on our Plex Media Server, and watch through Apple TV.

I have no interest in rebuying 4k movies I already own on DVD/Bluray.

Does 480p/1080p content still look good on a 4K Sony set? This would be a deal-breaker for me when getting a 4K TV.
 