
Sony’s 2021 TV lineup runs Google TV and fully embraces HDMI 2.1

Kuranghi

Gold Member
I know people like me, with the whole "HDR is better on these specific LCDs" thing, sound mad to the OLED crowd, because how can the picture look better than on your OLED? The OLED picture is almost perfection in isolation, but all this bloviating about bright HDR scenes is summed up by VT in the video I posted above - ZD9 vs 2017 OLED - watch that for his opinion on why the ZD9 is better than an OLED for HDR.

I wish I could get you guys round (we'd have to take turns sitting in the chair facing the screen because the viewing angle is so shit haha) and show you what I see when I watch 4000/10000 nit mastered content on it, Dolby Vision and the like. There have been only a few situations out of 1000s where I found blooming/the black level/other FALD LCD problems distracted me from the content; most of the time I'm just marvelling at the picture. I wouldn't trade all those good experiences to solve the few problems if it meant all the content wasn't as dynamic as it is now.

As an example, Ozark is hilariously bright in parts; they definitely never tested this show on FALD LCDs like the ones I listed. The cinematography of the show likes to do this "window/sky is uber bright behind the actors while they're in shadow" effect a lot, and it's visually so stunning when the window actually makes it hard to see the actors properly; it's the way (going by the master they made) they intended you to see it imo.

Funny side-note - I'm part of a ZD9 Facebook group called The Grand Order of The Z (recently changed from ZD9 to Z to encompass the new shit) where people come for help with issues with the set, but mostly it's used to willy wave about how good HDR looks on it :messenger_tears_of_joy: .
 
I know people like me, with the whole "HDR is better on these specific LCDs" thing, sound mad to the OLED crowd, because how can the picture look better than on your OLED? The OLED picture is almost perfection in isolation, but all this bloviating about bright HDR scenes is summed up by VT in the video I posted above - ZD9 vs 2017 OLED - watch that for his opinion on why the ZD9 is better than an OLED for HDR.

I wish I could get you guys round (we'd have to take turns sitting in the chair facing the screen because the viewing angle is so shit haha) and show you what I see when I watch 4000/10000 nit mastered content on it, Dolby Vision and the like. There have been only a few situations out of 1000s where I found blooming/the black level/other FALD LCD problems distracted me from the content; most of the time I'm just marvelling at the picture. I wouldn't trade all those good experiences to solve the few problems if it meant all the content wasn't as dynamic as it is now.

As an example, Ozark is hilariously bright in parts; they definitely never tested this show on FALD LCDs like the ones I listed. The cinematography of the show likes to do this "window/sky is uber bright behind the actors while they're in shadow" effect a lot, and it's visually so stunning when the window actually makes it hard to see the actors properly; it's the way (going by the master they made) they intended you to see it imo.

Funny side-note - I'm part of a ZD9 Facebook group called The Grand Order of The Z (recently changed from ZD9 to Z to encompass the new shit) where people come for help with issues with the set, but mostly it's used to willy wave about how good HDR looks on it :messenger_tears_of_joy: .
The Grand Order of The Z is by far the nerdiest thing I've ever read from you xD

For HDR and motion smoothness in movies without interpolation, that set is the king, no doubt. I wonder how the A90J fares against it in terms of HDR impact?

But in all reality I would never go from my OLED to a ZD9 because of the former's uniformity, perfect contrast with no bloom, response time and viewing angle. Not to mention input lag. I wish the Z9F had retained the Z9D's zones and BMD (with no viewing-angle film; I want contrast, damn it!) and just got the input lag and response time down. As a true swan song for 4K LCD.
 

Kuranghi

Gold Member
The Grand Order of The Z is by far the nerdiest thing I've ever read from you xD

For HDR and motion smoothness in movies without interpolation, that set is the king, no doubt. I wonder how the A90J fares against it in terms of HDR impact?

But in all reality I would never go from my OLED to a ZD9 because of the former's uniformity, perfect contrast with no bloom, response time and viewing angle. Not to mention input lag. I wish the Z9F had retained the Z9D's zones and BMD and just got the input lag down. As a true swan song for 4K LCD.

I am a bit of a nerd, but I mostly joined that group because conceptually it's funny to me that (obviously mostly) men would make a group to go "Hey, that thing we all like is STILL great" lol. Sharing the joy(z).

Indeed ha, ideally you own both a ZD9 and an OLED, but if you can only have one, then know you might not want to go back to the other afterwards :messenger_tears_of_joy:
 

dolabla

Member
I think it's fine to use OLED for gaming/desktop use but you have to take some precautions, like hiding static elements and having a slideshow of wallpapers that's randomly ordered and has a LOT of images to go through, like 100s at least. I can't speak to burning in black bars from devices like the OSSC and Framemeister though, unfortunately.
Just for clarification, when I say black bars on the side I just mean 4:3 content (or if I scale using 5x on the OSSC it's a little wider than 4:3 so not as much black bars). Best Buy does have the 5 year warranty that covers burn in. I got it last year when I got the CX (they refunded me for it of course). It still makes me worry though, lol.

Say if you're mixing up games that have static elements, will that be okay? I do like to go from one game to another when I play my older consoles.
 

Kuranghi

Gold Member
Just for clarification, when I say black bars on the side I just mean 4:3 content (or if I scale using 5x on the OSSC it's a little wider than 4:3 so not as much black bars). Best Buy does have the 5 year warranty that covers burn in. I got it last year when I got the CX (they refunded me for it of course). It still makes me worry though, lol.

Say if you're mixing up games that have static elements, will that be okay? I do like to go from one game to another when I play my older consoles.

I said the black bar burn-in thing because I saw someone else mention it when talking about a Framemeister. It doesn't make sense to me because the pixels should be off, but it might be that the bars aren't truly black and that's why it burns in, not sure. They could've also been an idiot who doesn't know what they're talking about lol.

I think a big part of the problem lies with playing the same game every day for months, even if it's just for a few hours; the TV will start to remember the HUD if it's static and too bright. To be fair though, if they say they're going to replace it if it happens then I say just don't worry about it, it's not like you'd lose much data in the TV with a replacement, is there?
 

dolabla

Member
I said the black bar burn-in thing because I saw someone else mention it when talking about a Framemeister. It doesn't make sense to me because the pixels should be off, but it might be that the bars aren't truly black and that's why it burns in, not sure. They could've also been an idiot who doesn't know what they're talking about lol.

I think a big part of the problem lies with playing the same game every day for months, even if it's just for a few hours; the TV will start to remember the HUD if it's static and too bright. To be fair though, if they say they're going to replace it if it happens then I say just don't worry about it, it's not like you'd lose much data in the TV with a replacement, is there?
Yeah, it's black on the Framemeister as far as I can tell, lol. Pixels should be off. Uneven wear of the screen due to 4:3 content where you have the middle of the screen on/active, while the sides are off, is what I was concerned about. But I guess movies have the black bars on the top and bottom.

I definitely don't play the same game for months on end (for all consoles). When I play, it's usually to beat a game and then I shelve it and move on to the next one. The only online games I really play are F13th and COD (Hardcore modes that don't have HUDs), but it's maybe 10 hours a week and sometimes none. Mix that in with single player games/TV/movie/YouTube watching. My backlog has built up while waiting for a new TV, so I'm very ready to get one :messenger_grinning_smiling:.

Yeah, the Best Buy warranty covers burn-in for 5 years so maybe I shouldn't worry too much. I guess MicroLED (or some other tech) may even be a thing by then.
 
I am a bit of a nerd, but I mostly joined that group because conceptually it's funny to me that (obviously mostly) men would make a group to go "Hey, that thing we all like is STILL great" lol. Sharing the joy(z).

Indeed ha, ideally you own both a ZD9 and an OLED, but if you can only have one, then know you won't want the other afterwards
It's a very nice TV my friend, enjoy it until it stops working! Maybe pick up an HDMI 2.1 Sony OLED down the line when we hit peak OLED, but the Z9D will be an awesome movie set for years and years.
 

Bojanglez

The Amiga Brotherhood
So over on AVSForum, someone posted that a firmware update has now been released for the X90J that fixes the 'judder' and also improves brightness and (maybe) input lag. Let's wait and see if this is true.

From the YT comments:
87MUi8F.png


Update:
Another dude (this time on Reddit) was initially having stuttering issues (see )

He now says the latest firmware has fixed his issues...
GI4vEAR.png
 

dolabla

Member
So over on AVSForum, someone posted that a firmware update has now been released for the X90J that fixes the 'judder' and also improves brightness and (maybe) input lag. Let's wait and see if this is true.

From the YT comments:
87MUi8F.png


Update:
Another dude (this time on Reddit) was initially having stuttering issues (see )

He now says the latest firmware has fixed his issues...
GI4vEAR.png

That is great news. Kind of crazy to think Sony released it with a pretty big issue like this. As long as it has been fixed I'll stay on the X90J train. I wonder if it's an XR processor thing that also affected the A90J?
 

Kuranghi

Gold Member
GI4vEAR.png


If I read another person confusing response time with input lag 😬

"I was doing 1ms before..."

No, you weren't, pal. Even if you're playing at native res @ the max refresh rate of the panel, your input lag would be around 1.7ms on the best monitor rtings ever tested; if you're at native res @ 60Hz then input lag is a minimum of 8ms for the best rated monitor on the rtings table. HDR + BFI adds more input lag.

I know 1.7-8ms is still really low, but it seems like it's entering placebo territory if the spread is that large, i.e. they read 1ms (response time) on the box and buy it, and then say "I was gaming on a 1ms input lag monitor until I got this 4K TV with ~8ms input lag (@4K120), so it's a big step down for me"... except the input lag was probably as high, if not higher at times, on their monitor and they didn't think "oh this feels terrible", or I presume they didn't anyway. Saying "coming from 1ms" sounds to me like they thought it was 1ms in all situations.
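If it helps, here's a rough back-of-the-envelope sketch in Python (my own numbers and simplifications, not rtings' methodology) of why the refresh period alone puts a floor under input lag:

# Back-of-the-envelope: why "1 ms" on the box is response time, not input lag.
# The refresh period alone puts a floor under how quickly your input can appear
# (assuming lag is measured to mid-screen, roughly half a refresh of scan-out).

def refresh_period_ms(hz: float) -> float:
    """Time for one full refresh at a given rate, in milliseconds."""
    return 1000.0 / hz

for hz in (60, 120, 144, 240):
    period = refresh_period_ms(hz)
    mid_screen = period / 2  # scan-out reaches the middle of the screen roughly halfway through
    print(f"{hz:>3} Hz: refresh {period:5.2f} ms, ~{mid_screen:5.2f} ms just to scan to mid-screen")

# 60 Hz: refresh 16.67 ms, ~8.33 ms just to scan to mid-screen
# Even a "perfect" 60 Hz display can't show your input in 1 ms; the 1 ms on the
# box is pixel response time, which is a different measurement entirely.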

Do all these super high refresh rate monitors have some form of VRR now? That would make running at native refresh rate all the time more realistic, because no tearing or heavy framerate drops from late v-sync'd frames, but I get the feeling if you don't have that you'll be locking to lower refresh rates, incurring even more lag as a result.


From the page above:
Note: Do not confuse the input lag time with the response time. The response time is the time it takes a pixel to shift from one color to another, which is often significantly shorter than the input lag time.

Awwwwww I'm just grumpy, sorry lol. I appreciate the guy taking the time:

 

Bo_Hazem

Banned
If I read another person confusing response time with input lag 😬

"I was doing 1ms before..."

No, you weren't, pal. Even if you're playing at native res @ the max refresh rate of the panel, your input lag would be around 1.7ms on the best monitor rtings ever tested; if you're at native res @ 60Hz then input lag is a minimum of 8ms for the best rated monitor on the rtings table. HDR + BFI adds more input lag.

I know 1.7-8ms is still really low, but it seems like it's entering placebo territory if the spread is that large, i.e. they read 1ms (response time) on the box and buy it, and then say "I was gaming on a 1ms input lag monitor until I got this 4K TV with ~8ms input lag (@4K120), so it's a big step down for me"... except the input lag was probably as high, if not higher at times, on their monitor and they didn't think "oh this feels terrible", or I presume they didn't anyway. Saying "coming from 1ms" sounds to me like they thought it was 1ms in all situations.

Do all these super high refresh rate monitors have some form of VRR now? That would make running at native refresh rate all the time more realistic, because no tearing or heavy framerate drops from late v-sync'd frames, but I get the feeling if you don't have that you'll be locking to lower refresh rates, incurring even more lag as a result.


From the page above:


Awwwwww I'm just grumpy, sorry lol.

Thank you, mate. I really appreciate the conversation around here and keeping an eye on everything you guys with much more experience say.

Aiming for 55" X90J here, or 65" X95J.
 

Kuranghi

Gold Member
I'm glad the motion judder was a bug and it's seemingly fixed. Weird that they increased light output with the firmware update as well, never seen that before, but not a bad thing I'm sure.

The input lag being much higher is strange to me, unless the image processing is a massive step up from what came before and that justifies it.

edit - apparently input lag is not that much higher, see below.
 
I'm glad the motion judder was a bug and it's seemingly fixed. Weird that they increased light output with the firmware update as well, never seen that before, but not a bad thing I'm sure.

The input lag being much higher is strange to me, unless the image processing is a massive step up from what came before and that justifies it.
Didn’t that Chinese slide we saw earlier say 17ms? Hardly significant if true. Unless I missed something
 
At this point I'm starting to wonder if Sony will even bother with mini LED next year, due to rising LCD prices. What say you Kuranghi my man?

If mini LED from Sony would reach OLED prices, will they even bother?
 

Kuranghi

Gold Member
At this point I'm starting to wonder if Sony will even bother with mini LED next year, due to rising LCD prices. What say you Kuranghi my man?

If mini LED from Sony would reach OLED prices, will they even bother?

I also say no to miniLED next year mostly because they don't seem to want to make super high end 4K LCDs anymore, I'd guess they don't want to have a 4K set in the range that costs the same/more than the OLEDs. The ZD9 was probably the equivalent of MiniLED prices (in 2016) and they downgraded that so much for the 75" XE94/X940E in 2017, from ~858 zones to 256 so they were already starting to abandon super high end back then.

When I worked for them, the sales upgrade path was confusing when the ZD9/XE94/ZF9 was in the mix (so this is 2016-2018): they wanted us to sell up from edge-lit to FALD to OLED, but the 65" ZF9 was priced in between the two OLED models, the AF8 & AF9, or sometimes even the same as the AF9!

Comparing it to the OLEDs in-store was confusing for customers. I would tell them OLED is the best and show them amazing imagery, but then they'd see the ZF9 next to it (because that's how it was placed in-store: a 65" ZF9, an 85" mid-end LCD and a 65" AF9 OLED side by side) and think wow, that looks even better than the OLED, because the elevated black level/blooming of LCD wasn't nearly as visible in-store with all the lights and Vivid mode on the TVs (Vivid mode was law in the shop, you could change it while you were demoing, but if you left it off Vivid for more than 5 minutes they would bollock you, thank god for Pic Reset), and all they'd see was the difference in brightness output, which would make the ZF9 look better in many scenes.

The black level would look the exact same on both sets if it was pure black. A clip like this, for example, would confuse the shit out of them after what I had said about OLED, because most shots looked as good/better on the LCD on the shop floor:




Funny aside: I'd always point out the Japanese woman's moustache to show how good Sony image presentation was, "Look at the detail in her moustache!" :messenger_tears_of_joy: You have to talk shite in a job like that, otherwise you go insane/sound really depressed playing the same demo 9000 times in a row.

Obviously the OLED should still give better image quality than an LCD, but the shots/scenes that showed that the most didn't look good under all those lights (too dull on the OLED compared to an 1800 nit LCD), and the store conditions would hide any blooming/LCD issues.

I still want to see comparisons between the Samsung 2021 4K/8K miniLEDs and the Sony 2021 4K/8K LCDs (ZJ9 could be miniLED maybe?*), because if it doesn't smash the 4K sets then I don't think that's "real miniLED", whatever that means lol. I just mean they should be smashing the fuck out of a 32/48 zone LCD with the number of zones they tout.

*Just read the released info on it, they don't even mention Backlight Master Drive/a successor technology, so prob not...

edit - Maybe (giant maybe) they would do one miniLED model at ludicrous prices, but push it to gamers rather than movie watchers and see how that does, attack QLED with it. VT reported Samsung are finally going to buy LG's TV-size OLED panels though, because the QD OLEDs they put trillions into haven't had a manufacturing/materials breakthrough that would allow them to sell to consumers yet.

We'll also have to see what the TCL OD Zero miniLED set that Dave_at_Home mentioned is like.
 
Kuranghi Hmm yes. A Z9K with mini LED at high-dollar prices, I can see that being possible. But yeah, an X900K with mini LED seems unlikely next year imo.

And I know bro, explaining TVs to people on the show floor when what you’re saying doesn’t mesh with what they’re seeing is such a pain in the ass lol.
 

Bojanglez

The Amiga Brotherhood
The FIFA guy mentioned above has now posted a video of gaming mode on the X90J using the new firmware. It includes 4K/120 Fortnite gameplay. From a quick look it seems judder free...

 

Kuranghi

Gold Member
The FIFA guy mentioned above has now posted a video of gaming mode on the X90J using the new firmware. It includes 4K/120 Fortnite gameplay. From a quick look it seems judder free...



There are some weird jumps from time to time, but I think it's caused by the in-camera image stabilisation and/or image processing. I saw it during the FIFA part a couple of times, but it's most noticeable here:




Motion seems smooth though so that's good. The doubling of the image and smearing is also caused by the camera, in case anyone is wondering about that.
 

Kerotan

Member
My friend was asking a few silly questions about the X90J. Just to confirm, that's the set you want for gaming and not the X95J? What's the reason for this?

Anyway, he's asking me if the X90J is a 50Hz TV? And then if it will do 60fps or 120fps? I thought it obviously does, so I said it does. But where's he getting this 50Hz thing from?
 

Kuranghi

Gold Member
My friend was asking a few silly questions about the X90J. Just to confirm, that's the set you want for gaming and not the X95J? What's the reason for this?

Anyway, he's asking me if the X90J is a 50Hz TV? And then if it will do 60fps or 120fps? I thought it obviously does, so I said it does. But where's he getting this 50Hz thing from?

First part: I think he means the 2020 sets, the X900H and X950H. Only the X900H had HDMI 2.1, so it was better for gaming because it allowed 4K@120Hz, whereas the X950H only has HDMI 2.0, so while it can display 120Hz (@1080p) it can only do 60Hz at 4K. This year the 2021 models, the X90J and X95J, both have 2.1, so no worries there; they can both display 4K@120Hz.

2nd part: The X900H, X950H, X90J AND X95J are all 120Hz panels, so they can all display 120Hz, but only the X900H, X90J and X95J can display 4K@120Hz due to having the bandwidth from HDMI 2.1; the X950H only has 2.0, as I said above, so it can only do 1080p@120Hz. The 50Hz thing maybe comes from the UK, where broadcast TV is 25 or 50Hz, but not to worry, all the 120Hz panels can display 50Hz no bother.
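If anyone wants the rough maths behind the HDMI 2.1 requirement, here's a quick sketch (active pixels only, ignoring blanking and link overhead, and assuming 10-bit RGB, so treat the numbers as ballpark):

# Rough uncompressed video bandwidth, active pixels only (no blanking or link
# overhead), assuming 10-bit RGB / 4:4:4 (30 bits per pixel) - ballpark figures.

def pixel_rate_gbps(width: int, height: int, fps: int, bits_per_pixel: int = 30) -> float:
    """Raw active-pixel data rate in Gbit/s."""
    return width * height * fps * bits_per_pixel / 1e9

modes = {
    "1080p120": (1920, 1080, 120),
    "4K60": (3840, 2160, 60),
    "4K120": (3840, 2160, 120),
}
for label, (w, h, fps) in modes.items():
    print(f"{label:>8}: ~{pixel_rate_gbps(w, h, fps):4.1f} Gbps of pixel data")

# 1080p120: ~7.5 Gbps  -> fits easily in HDMI 2.0 (18 Gbps link)
# 4K60:    ~14.9 Gbps  -> only just squeezes into HDMI 2.0, often with chroma subsampling
# 4K120:   ~29.9 Gbps  -> needs HDMI 2.1 (up to 48 Gbps), hence the 2.1 requirement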

Make sense?
 

Bo_Hazem

Banned
My friend was asking a few silly questions about the X90J. Just to confirm, that's the set you want for gaming and not the X95J? What's the reason for this?

Anyway, he's asking me if the X90J is a 50Hz TV? And then if it will do 60fps or 120fps? I thought it obviously does, so I said it does. But where's he getting this 50Hz thing from?

I think your friend makes content in PAL, hence 25Hz, 50Hz, 100Hz instead of 24Hz, 60Hz, 120Hz on NTSC. My Sony a7S III camera was set to PAL and I had to change the settings to NTSC to have 24/60/120 instead of 25/50/100. Tell him that his 50fps content will look perfectly fine, as I view some content in 50fps from YouTube and nothing looks strange.
 

dolabla

Member
Looks like it was fixed along with input lag. The guy on AVS re-bought the TV and did a pre-update/post-update comparison. He posted this video and said this:



Pre-update: https://www.avsforum.com/threads/20...hread-faq-no-price-talk.3190601/post-60658725

Out of the box settings before applying the update, the new/2nd x90j exactly matches my findings with the first one.
Extreme judder / frame skips in games in game mode
High input lag
Green push in Dolby Vision OOTB settings

Applying new update now and retesting.

Also switched to a different lens and settings on the camera to try and better show the difference with the C1. It still makes the C1 seem more blue than it is, even if I apply the C1 lower blue light setting, but it is not as bad as before.

Post update: https://www.avsforum.com/threads/20...hread-faq-no-price-talk.3190601/post-60658821

The update fixed the two main issues with the x90j.
Judder / frame skipping in games is fixed.
Input lag is better, essentially the same as the C1.
Dolby Vision is not changed.

Now I need to go through more thoroughly to make sure nothing has been broken in the update, but so far it fixed the main issues gamers would have.
Full video coming soon.

About the green push he's seeing and other things he will be looking for: https://www.avsforum.com/threads/20...hread-faq-no-price-talk.3190601/post-60659140

Watching other people with the A90J, it also has the green tint. If I had to guess, Sony changed the color science this year to try and make bright highlights look brighter by pushing more green. I'm sure it can be calibrated out, but 99% of buyers aren't going to Calman-calibrate their display.

I will probably only do one more video as a final wrap up for the x90j before it goes back again. I'll run through some slides to check for uniformity issues, compare the Dolby Vision to the x900h, and recheck 120hz performance to make sure nothing changed with that with the firmware update.
 

Kerotan

Member
I think your friend makes content in PAL, hence 25Hz, 50Hz, 100Hz instead of 24Hz, 60Hz, 120Hz on NTSC. My Sony a7S III camera was set to PAL and I had to change the settings to NTSC to have 24/60/120 instead of 25/50/100. Tell him that his 50fps content will look perfectly fine, as I view some content in 50fps from YouTube and nothing looks strange.
Thanks for the 2 replies. He's in Ireland so would be under PAL. I think he just panicked, thinking he wouldn't be able to get 120fps. He keeps a TV for a long-ass time, so this set will be used for a PS5 Pro and a PS6, so even if there's not much 120fps content now it will be important to be future-proofed.
 
Yo, Fomo's review of the Q90A was pretty enticing. The dimming algorithm is waaaay better than Samsung's previous efforts; just by eye it may be on par with Sony's, or at least close enough. Of course this needs to be put under more scrutiny, but give it a watch.

He didn’t test input lag, nor screen uniformity though. Also, we have to see what the dimming is like in game mode, so Sony’s algorithm may still be king for that use case.

Early days, but it sounds surprisingly good.
 

Kuranghi

Gold Member
Watching the TCL C82 preview on HDTVTest and I had to pause, it looks like I caught VT and MM right after they blazed a fat one:

mqdRn2C.png


Seriously though, I bet Marek is a massive stoner. VT is a good boy so he doesn't ride the space trumpet, that's why you can see he's literally green and about to whitey. Marek is in a sauna and doesn't GAF.

The TV sounds good considering how I imagine it will be priced, 120 zone FALD (in 55"), 1000 nits, 2.1.
 
Watching the TCL C82 preview on HDTVTest and I had to pause, it looks like I caught VT and MM right after they blazed a fat one:

mqdRn2C.png



Seriously though I bet Marek is a massive stoner.

The TV sounds good considering how I imagine it will be priced, 120 zone FALD (in 55"), 1000 nits, 2.1.
Marek also sounded like he recorded in his bathroom on speaker phone xD
 

dotnotbot

Member
There are reports of some weird vignetting going on in game mode on A90J, so possibly also affecting A80J: https://www.avsforum.com/threads/20...rs-thread-no-price-talk.3184217/post-60662454
Sony's support says it's normal, although everything that support says should be taken with a huge grain of salt. Also, some reviewers and users are reporting more aggressive dimming than LG in some games with static elements, like NBA.
I hate this kind of stuff, feels like your TV is constrained by stupid firmware and there is no way around it.
 

S0ULZB0URNE

Member
Yo, Fomo's review of the Q90A was pretty enticing. The dimming algorithm is waaaay better than Samsung's previous efforts; just by eye it may be on par with Sony's, or at least close enough. Of course this needs to be put under more scrutiny, but give it a watch.

He didn’t test input lag, nor screen uniformity though. Also, we have to see what the dimming is like in game mode, so Sony’s algorithm may still be king for that use case.

Early days, but it sounds surprisingly good.
Input lag is amazing

I don't do Samsung/Korean made but that is a nice tv.
 

dotnotbot

Member
Input lag is amazing

I don't do Samsung/Korean made but that is a nice tv.

Low input lag is also a downside in some sense, 'cause local dimming is gimped in game mode, which is noticeable if you like to play in a dark room:
Blooming is a bit more aggressive than outside of Game Mode, and it could be more noticeable, but it's still good overall. There's less black crush, but that's because the whole screen is just a bit over-brightened. Fast objects move between zones just a bit slower than outside of Game Mode, which could be more distracting. We experienced a strange issue where text in the center of the screen causes the entire top half of the screen to turn on all of the dimming zones. If you notice the same thing, let us know.

Samsung should let you choose between a bit higher lag with proper dimming and this gimped LD behaviour. Not everyone is an e-sports try-hard; some of us want better visual quality.
 
Low input lag is also a downside in some sense, 'cause local dimming is gimped in game mode, which is noticeable if you like to play in a dark room:
Blooming is a bit more aggressive than outside of Game Mode, and it could be more noticeable, but it's still good overall. There's less black crush, but that's because the whole screen is just a bit over-brightened. Fast objects move between zones just a bit slower than outside of Game Mode, which could be more distracting. We experienced a strange issue where text in the center of the screen causes the entire top half of the screen to turn on all of the dimming zones. If you notice the same thing, let us know.

Samsung should let you choose between a bit higher lag with proper dimming and this gimped LD behaviour. Not everyone is an e-sports try-hard; some of us want better visual quality.
Ah, was wondering about that. Can't agree more with Samsung offering that choice; it's not like the Sony 950G wasn't very responsive despite non-gimped local dimming.

I'm quite excited for Sony using this tech eventually. But yes S0ULZB0URNE, 10ms is incredible. Also, the response time is amazing for an LCD as well.
 

Bojanglez

The Amiga Brotherhood
Android TV sucks, it's smooth when you first buy the TV then it slows down hard
This is quite a sweeping statement. I think it will vary from manufacturer to manufacturer and device to device. The platform itself is fine on adequate hardware. I use it as my main TV OS (via Nvidia Shield) because my Samsung KS700's Tizen-based OS was becoming so laggy, ad-filled and unsupported by some key apps that I have given up on it (as a Smart TV).

As with most tech products nowadays, I do believe that as a product gets older manufacturers are less likely to optimise the experience for older devices. The skeptic in me may also believe that there is actually a form of planned obsolescence going on, but this is not an Android TV issue.

I say this regularly. I kind of wish there was a manufacturer that built what is essentially a living room monitor and focused on providing top quality core features (lots of HDMI 2.1 ports with all supported features, great picture quality and tuning options), and maybe teamed up with Amazon, Google, Apple etc. to have some kind of certification for external streaming boxes, and made sure they play nicely and things like Dolby Vision are balanced correctly when passed through... and of course this mythical living room screen would need a sweet universal remote with buttons to directly access an HDMI input (maybe with little LED buttons or a screen that could display a custom name for the input).
 

Kuranghi

Gold Member
Yo, Fomo's review of the Q90A was pretty enticing. The dimming algorithm is waaaay better than Samsung's previous efforts; just by eye it may be on par with Sony's, or at least close enough. Of course this needs to be put under more scrutiny, but give it a watch.

He didn’t test input lag, nor screen uniformity though. Also, we have to see what the dimming is like in game mode, so Sony’s algorithm may still be king for that use case.

Early days, but it sounds surprisingly good.

I'm more inclined to trust Caleb from Digital Trends to be honest, and he said the opposite, that the backlight system is "slow". He wasn't talking about game mode afaik, so maybe it's faster there; if I had to guess I'd say it'd be slower though.




I'm guessing maybe FOMO was talking about it not blooming/clouding when in game mode (like previous years) rather than the speed of the zone transitions.
 

Bojanglez

The Amiga Brotherhood
Watching the TCL C82 preview on HDTVTest and I had to pause, it looks like I caught VT and MM right after they blazed a fat one:

mqdRn2C.png


Seriously though, I bet Marek is a massive stoner. VT is a good boy so he doesn't ride the space trumpet, that's why you can see he's literally green and about to whitey. Marek is in a sauna and doesn't GAF.

The TV sounds good considering how I imagine it will be priced, 120 zone FALD (in 55"), 1000 nits, 2.1.
Just caught up on this. It was quite insightful and nice to have an executive so open and transparent with what they are doing. I had always discounted TCL as a cheap (and therefore rubbish) brand, but over the last month or so as I've been looking at new TVs, a lot of people have said how good their models were (at least in the US) last year. I guess they are kind of where LG were maybe 15 years ago, a challenger brand that began to win people over with good performing products at cost effective pricing.

So it seems like the TCL C82 has a few things in common with the new Sony range: based on Google TV, a MediaTek SoC, and 2x 48Gbps HDMI 2.1 ports. Do we expect it will come in cheaper than the X90J? Will be interesting to see them go head to head.

I don't like the design though, the integrated speaker at the bottom makes it look a bit crap.
 
I'm more inclined to trust Caleb from Digital Trends to be honest, and he said the opposite, that the backlight system is "slow". He wasn't talking about game mode afaik, so maybe it's faster there; if I had to guess I'd say it'd be slower though.




I'm guessing maybe FOMO was talking about it not blooming/clouding when in game mode (like previous years) rather than the speed of the zone transitions.

Fomo's tests weren't in game mode.

I do take fomo with a grain of salt, but what he was saying and what I was seeing matched, at least in the video.

Wait for our boy Vinny I guess xD
 

Kuranghi

Gold Member
Fomo's tests weren't in game mode.

I do take fomo with a grain of salt, but what he was saying and what I was seeing matched, at least in the video.

Wait for our boy Vinny I guess xD

Ah right, who knows then, did you watch the video I linked? He doesn't mention game mode; he says (paraphrasing) "sometimes there is a lag for really bright objects, around a second, but it's not too big of a deal in real content, just something to be aware of".

I will wait on more impressions, but that sounds bad for games to me, because you move the camera way faster in a game than you'd ever see in a film (they might occasionally move that fast in a film, but they'd usually add post-processing blur to it or have the camera set up in such a way that it adds tons of blur anyway), and also when there are really bright objects you can usually still move the camera in a game, whereas in a movie that would normally be a tripod-locked shot or have little movement in the frame, because it would look bad at 24Hz.
 
Ah right, who knows then, did you watch the video I linked? He doesn't mention game mode; he says (paraphrasing) "sometimes there is a lag for really bright objects, around a second, but it's not too big of a deal in real content, just something to be aware of".

I will wait on more impressions, but that sounds bad for games to me, because you move the camera way faster in a game than you'd ever see in a film (they might occasionally move that fast in a film, but they'd usually add post-processing blur to it or have the camera set up in such a way that it adds tons of blur anyway), and also when there are really bright objects you can usually still move the camera in a game, whereas in a movie that would normally be a tripod-locked shot or have little movement in the frame, because it would look bad at 24Hz.
Not yet gotta watch later
 
Ah right, who knows then, did you watch the video I linked? He doesn't mention game mode; he says (paraphrasing) "sometimes there is a lag for really bright objects, around a second, but it's not too big of a deal in real content, just something to be aware of".

I will wait on more impressions, but that sounds bad for games to me, because you move the camera way faster in a game than you'd ever see in a film (they might occasionally move that fast in a film, but they'd usually add post-processing blur to it or have the camera set up in such a way that it adds tons of blur anyway), and also when there are really bright objects you can usually still move the camera in a game, whereas in a movie that would normally be a tripod-locked shot or have little movement in the frame, because it would look bad at 24Hz.
Oh that guy. Honestly he’s like the most unscientific (for lack of a better word) reviewer that I see occasionally because sometimes he’s the first to get a review out. Not saying he’s lying, but yeah I trust fomo more than him tbh. Even though he doesn’t calibrate...



It’s at the 8:54 mark, see what you think about it. Not 100% believing it until Vincent though.

I agree with you that the lower dimming performance in game mode is a deal breaker, but that's how they achieve that super low lag, so some would be okay with it.
 
Just thinking today, I've been wanting to see a Sony OLED with an input lag of 15ms or lower with 120Hz BFI engaged, and have been wondering how they could lower the BFI lag hit... Not sure why I never thought of it before, but the reason the BFI costs ~8ms is that the frametime for a 120Hz BFI signal is ~8ms, which they're overlaying on the picture; so basically there's no way to reduce that without a backlight, which OLED doesn't have.

Basically OLED has to hit 7ms input lag or lower to be under one frame of lag with BFI engaged... it'd be cool if Sony could hit that number by the time I need to upgrade lol. 7ms will be really hard for Sony I think, considering Samsung isn't even there yet.
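Quick sanity check on that figure, just frame-time arithmetic on my part, assuming the BFI penalty is roughly one 120Hz refresh and the target is staying under one 60Hz frame:

# If 120 Hz BFI adds roughly one 120 Hz refresh of lag, how low does the base
# input lag need to be for the total to stay under a single 60 Hz frame?

bfi_penalty_ms = 1000 / 120    # ~8.3 ms: the slot occupied by the inserted black frame
one_60hz_frame_ms = 1000 / 60  # ~16.7 ms: one frame of lag at 60 Hz

base_lag_budget_ms = one_60hz_frame_ms - bfi_penalty_ms
print(f"BFI penalty:     {bfi_penalty_ms:.1f} ms")
print(f"One 60 Hz frame: {one_60hz_frame_ms:.1f} ms")
print(f"Base lag budget: {base_lag_budget_ms:.1f} ms")  # ~8.3 ms, so ~7 ms base lag leaves a little headroom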
 

Kuranghi

Gold Member
Oh that guy. Honestly he’s like the most unscientific (for lack of a better word) reviewer that I see occasionally because sometimes he’s the first to get a review out. Not saying he’s lying, but yeah I trust fomo more than him tbh. Even though he doesn’t calibrate...



It’s at the 8:54 mark, see what you think about it. Not 100% believing it until Vincent though.

I agree with you that the lower dimming performance in game mode is a deal breaker, but that's how they achieve that super low lag, so some would be okay with it.


I just like Caleb because he does the nerd stuff in the background and then just tells you what you need to know. I think a big reason why he has reviews out earlier/first is because Digital Trends is a large company, so they get sent the sets directly and before general release.

I'm pretty sure the reason you're seeing such a difference is in large part due to the 50% more zones, but also because the native contrast of the Q900T is ~1600:1, whereas the QN90A is ~3500:1; it would be the same if you had an ~1500:1 LG IPS LCD next to any high-contrast VA LCD. When you turn the ISO up on the camera that much to capture the blooming, it's going to be way worse on the Q900T because of this. If you turned the exposure up even more, I'd guess you'd see the QN90A doing something similar, just with the zones tracing the box with slightly less of a border than the Q900T, because they're smaller.


I still think it would be better if they were similar-contrast panels, but it's definitely going to be better on the QN90A just due to having that many more zones; the difference would not be anywhere near that stark with your eyes. The QN90A is like 50% brighter in both SDR and HDR, so maybe that skews the test somehow when you make them look the same on camera. It would be better to compare it to the 2019 Q90R or 2018 Q9FN, which have native contrast ratios of 3200:1 and 6000:1 respectively, but still have 480 zones.
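To put rough numbers on the native contrast point (purely illustrative, using the approximate contrast figures above and an assumed 500 nit zone drive for a small highlight):

# Purely illustrative: how much light "leaks" as blooming inside a zone that has
# been driven up to show a small highlight, given the panel's native contrast.
# The 500 nit zone level is an assumption; the contrast ratios are the rough
# figures quoted above.

def bloom_nits(zone_luminance_nits: float, native_contrast: float) -> float:
    """Approximate black-level luminance inside a lit zone."""
    return zone_luminance_nits / native_contrast

highlight_zone_nits = 500.0  # assumed drive level for a small bright object
panels = {"Q900T (~1600:1)": 1600, "QN90A (~3500:1)": 3500}
for name, contrast in panels.items():
    print(f"{name}: ~{bloom_nits(highlight_zone_nits, contrast):.2f} nits around the highlight")

# ~0.31 nits vs ~0.14 nits: a 2x+ difference that a cranked-up camera exposure
# will exaggerate, on top of whatever the extra zone count buys you.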

The algorithm does seem improved from that video though, looks amazing. I just have reservations about the honesty of some of the shots where there is no blooming on the QN90A at all but it's crazy bad on the Q900T; not sure how that makes sense if the zones are "only" 40% smaller, there would still be a big gap between each one, much larger than those stars are, so I would say it's mostly the camera exposure. I'm making tons of assumptions though, 10 levels deep inception ass-umptions lol, so I should stop, because I've never even seen a miniLED TV in person so that could just be the difference it makes over a standard FALD backlight.

It would be awesome if the algorithm improvements + miniLED made this much of a difference though, I'm pretty excited about miniLED if that's the case!
 