
So my 15-year-old Pioneer Kuro is dying - OLED, QD-OLED, QLED..

saintjules

Gold Member
OLED. As long as you vary the content on your TV, you should never experience burn-in during its life.

I recently upgraded to a 77" B3 OLED and moved the C9 to the bedroom. The C9 has had no issues since I bought it back in 2020.
 

Bernoulli

M2 slut
It has a low contrast ratio for a VA panel, plus fuck that blooming!



Mini LEDs have some advantages but I can't stand some aspects of them.

It shows how Sony, with only 480 local dimming zones, is destroying TVs with 1344 zones,
all thanks to Sony's superior algorithm. Next year's version will be even closer to OLED, without all the bad things about it like burn-in
 

Bojji

Gold Member
It shows how Sony, with only 480 local dimming zones, is destroying TVs with 1344 zones,
all thanks to Sony's superior algorithm. Next year's version will be even closer to OLED, without all the bad things about it like burn-in

But you can clearly see those dimming zones in tests or with subtitles. Connected to a PC, the mouse on a dark background will look like Christmas lights.

This TV is super expensive, so I don't get why it has such a low number of dimming zones.
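
For anyone wondering why a handful of zones plus a smart algorithm can beat a big zone count, here's a toy sketch of zone-based local dimming in Python. Everything in it is an assumption for illustration (the 6x80 grid, the 1000-nit subtitle, the ~5000:1 VA contrast figure); it is not Sony's or anyone else's actual algorithm:

```python
import numpy as np

def backlight_zones(frame, grid=(6, 80)):
    """Toy local dimming: split a luminance frame (in nits) into a grid
    of zones and drive each zone's LED by the brightest pixel it covers."""
    h, w = frame.shape
    zh, zw = h // grid[0], w // grid[1]
    levels = np.zeros(grid)
    for r in range(grid[0]):
        for c in range(grid[1]):
            levels[r, c] = frame[r*zh:(r+1)*zh, c*zw:(c+1)*zw].max()
    return levels

# A pure black frame with one white subtitle line near the bottom:
frame = np.zeros((1080, 1920))
frame[980:1020, 600:1320] = 1000.0       # assumed 1000-nit subtitle

levels = backlight_zones(frame)          # 6 x 80 = 480 zones, as above
# Every zone the subtitle touches lights up fully, and the LCD in front
# can only block roughly 1/5000th of the backlight (assumed VA native
# contrast), so the "black" pixels around the text glow instead of
# staying at zero; that glow is the blooming around subtitles:
halo = levels / 5000.0
print(f"zones lit: {(levels > 0).sum()}, halo: {halo.max():.2f} nits")
```

More zones shrink the glowing area, while a smarter algorithm can instead dim the subtitle a touch to protect the blacks; that tradeoff is presumably what people mean when they credit Sony's processing.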
 
Last edited:

Buggy Loop

Member
Got it this morning and went to work

Now it’s up and running

First time seeing those HDR demos while still dealing with the aftermath of a cold (watery eyes).

 

Mister Wolf

Gold Member
Got it this morning and went to work

Now it’s up and running

First time seeing those HDR demos while still dealing with the aftermath of a cold (watery eyes).


Try Ratchet and Clank on it. Don't forget to adjust your HDR settings on the PS5 for the new TV.
 
Last edited:

S0ULZB0URNE

Member
Fuck it,

Just bought 65" X93L

Done

Lets Go Sport GIF by ALL ELITE WRESTLING


Hopefully a companion that will last as long as my Kuro bro.

It was mighty fucking impressive in store next to OLED, so I don't see any regrets on the horizon. I trust the Sony brand too, their processing and long history of quality TVs. I'm a bit doubtful about Hisense; they might have no problems, but they have a bit more to prove on long-term quality.

I thought about this, and I really didn't want to nag the family about the content they watch to avoid burn-in, etc. I don't think it will necessarily happen; I just don't want the thought to occur.

To be delivered this Wednesday. OMG.

I think the first movie I'll watch is Blade Runner 2049
Guess you needed it now?
Hope you got a good deal....
Congratulations and enjoy.
Much tweaking you will need to do.
But you can clearly see those dimming zones in tests or with subtitles. Connected to a PC, the mouse on a dark background will look like Christmas lights.

This TV is super expensive, so I don't get why it has such a low number of dimming zones.
Not with Sony's algorithm.
One of my Sonys only has 32 zones and it's not that bad.
 

Buggy Loop

Member
Just followed a bunch of Bravia XR setting guides for SDR, HDR & Dolby Vision, and afterwards for the Apple TV 4K settings.

The TV looks completely different from the first hour I spent with it. I'm watching the jet sequences in Top Gun: Maverick and it blows my fucking mind.

I honestly thought my plasma was holding up; I just didn't realize how much 4K + HDR/Dolby Vision on a good TV just blows it the fuck away. I'm jumping between all the movies I have to test it and EVERYTHING looks amazing.

I'm legit gonna have trouble going back to a non-HDR PC monitor now, I think. The shadow details and the peak brightness in some scenes.

Edit - Dunkirk! DUNKIRK!! Holy crap, this movie was made for brightness 🔆
 
Last edited:

Spukc

always chasing the next thrill
Pouring Austin Powers GIF


That Kuro was amazing. I can't believe it even lasted me that long. The image quality was so good that I never felt the urge to upgrade when going to stores. On top of that, it served as a heater for Canadian winters :messenger_tears_of_joy:

Now the panel randomly doesn't turn on and shows the 8 blinking lights of death. It sometimes starts, but it's getting really annoying when it doesn't want to, and it really sucks for the kids, especially with holiday movie season. It's probably fixable by toying around with the PCBs inside, but putting cash into a 15-year-old display doesn't make much sense.

So.. for a new TV
  • No intention of having the new TV as "the" home cinema. I'm renovating my basement for a movie theater, so an 85~98" panel will likely go down there; in the meantime
    • I think 65" is good, and it will serve as a secondary TV eventually.
  • I don't have a PS5 or an Xbox, and no intention of getting future consoles either. I have a Nintendo Switch I barely play anymore, but more than likely my kids will eventually play on it more, and when they're a bit older I'll play with them.
    • So, good upscaling: I doubt even the Switch 2 will be 4K, so upscaling that doesn't suck is high on the list. On top of TV channels barely moving out of the HD age.
  • A bit worried about burn-in, but technically we don't watch the troublesome things like news feeds 8 hours a day or anything. Kids playing games and leaving the TV ON for a while might be a problem.
  • I hear there's no TV nowadays that will match the motion resolution of plasma, so that sucks a bit. But, unless someone corrects me, there's not a single tech in modern TVs that will match it, correct?
  • I don't particularly aim for the BEST picture quality with perfect blacks, infinite contrast, etc. I love the Kuro's image quality and even that was not perfect. My wife will never care for it. I'll put more cash into the future home theater setup; again, this will likely become the secondary TV.
  • Right now it would be in a bright room with big windows

I kind of nailed it down to
  • Cheap LCD ~$1500 Canadian, hoping for a $999 sale soon? Don't think I can wait..
    • TCL QM8 / Hisense U8K: these are apparently trading blows, with Hisense having more software problems it seems, so I would aim for the TCL, but how long do they last..
    • Sony X90L: it's just full-array, but it apparently punches way above its weight thanks to Sony's processing. Only a few scenes can be problematic with it, but the image would also be more accurate, as the mini-LED displays will typically just black-crush details, such as stars disappearing. Sony has the best upscaling from what I read?
  • Top range LCD ~$2100
    • Samsung QN90C
  • Mid-range OLED ~$2300 CAD
    • LG C3 / Sony A80L / Samsung S90C, with Samsung not having any Dolby Vision; then again, I'm not even sure I'll take advantage of it..
Is OLED worth the extra ~$800 over a cheap LCD for a secondary TV?

Mind you, I already have an Apple TV 4K, so forget about the trash "smart" OS of the TVs; I could live with them being dumb as a brick for all I care.
Yes
 

S0ULZB0URNE

Member
Just followed a bunch of Bravia XR setting guides for SDR, HDR & Dolby Vision, and afterwards for the Apple TV 4K settings.

The TV looks completely different from the first hour I spent with it. I'm watching the jet sequences in Top Gun: Maverick and it blows my fucking mind.

I honestly thought my plasma was holding up; I just didn't realize how much 4K + HDR/Dolby Vision on a good TV just blows it the fuck away. I'm jumping between all the movies I have to test it and EVERYTHING looks amazing.

I'm legit gonna have trouble going back to a non-HDR PC monitor now, I think. The shadow details and the peak brightness in some scenes.
My gear, including my 4090 build, is hooked up to my Sonys.
Some good mini LED sets should be shown at CES if you'd rather play on a monitor.
 

Celcius

°Temp. member
Just followed a bunch of Bravia XR setting guides for SDR, HDR & Dolby Vision, and afterwards for the Apple TV 4K settings.

The TV looks completely different from the first hour I spent with it. I'm watching the jet sequences in Top Gun: Maverick and it blows my fucking mind.

I honestly thought my plasma was holding up; I just didn't realize how much 4K + HDR/Dolby Vision on a good TV just blows it the fuck away. I'm jumping between all the movies I have to test it and EVERYTHING looks amazing.

I'm legit gonna have trouble going back to a non-HDR PC monitor now, I think. The shadow details and the peak brightness in some scenes.
Can you post a picture of the TV running something just so that we can kinda get an idea of what it looks like?
 

Buggy Loop

Member
Can you post a picture of the TV running something just so that we can kinda get an idea of what it looks like?

I just tried and everything looks like shit with my phone lol

Honestly, the YouTubers who make TV comparisons, I don't know how much they splurge on camera hardware to make it look great, but their footage is way better than whatever I would get.
 

Bojji

Gold Member
Just followed a bunch of Bravia XR setting guides for SDR, HDR & Dolby Vision, and afterwards for the Apple TV 4K settings.

The TV looks completely different from the first hour I spent with it. I'm watching the jet sequences in Top Gun: Maverick and it blows my fucking mind.

I honestly thought my plasma was holding up; I just didn't realize how much 4K + HDR/Dolby Vision on a good TV just blows it the fuck away. I'm jumping between all the movies I have to test it and EVERYTHING looks amazing.

I'm legit gonna have trouble going back to a non-HDR PC monitor now, I think. The shadow details and the peak brightness in some scenes.

Edit - Dunkirk! DUNKIRK!! Holy crap, this movie was made for brightness 🔆

Yeah, most monitors are pieces of shit compared to good TVs (not even the highest end). I literally can't go back to monitors without HDR and with all the VA/IPS flaws. And there's no reason to: I have my PC hooked up only to my LG TV; input lag is super low and 120Hz is more than enough for my needs. Not to mention the picture quality is amazing.
 

Shake Your Rump

Gold Member
I also use my OLED TV for PC gaming (mainly the C1, but the A95K when playing couch co-op) and it makes my desktop monitors look like trash.
 
Last edited:

Buggy Loop

Member
Is there a way to buy some amazing demo on iTunes? Like, the YouTube demos are pretty fly, but you can clearly see the compression. How do they set up demos at stores? Surely not via physical media, right?

Are there apps? Or iTunes movies?
 

S0ULZB0URNE

Member
Is there a way to buy some amazing demo on iTunes? Like, the YouTube demos are pretty fly, but you can clearly see the compression. How do they set up demos at stores? Surely not via physical media, right?

Are there apps? Or iTunes movies?
Less compressed?

Altered Carbon season 1 on Netflix's higher-tier plan looks pretty good in Dolby Vision.
Netflix has a few lookers.
I recommend using an Apple TV 4K (late 2022 model) to stream.
 

coffinbirth

Member
RIGHT NOW Dolby Vision is useless for gaming because almost no games support it; the Xbox setting for DV just converts the HDR10 signal. Once developers start to use DV, it will be a better format for games for sure.

HDR10 peak brightness is 4000 nits, so brightness isn't the problem for games.

Right now HDR10 (console, PC) and HGIG (TV) is the perfect combo.



Even the best QLED with thousands of dimming zones will be worse than an OLED, where every pixel is a light source. You are wrong.
You are correct that HDR10 games are converted (and worse for it) when DV is on, but there are quite a few Dolby Vision Enhanced games already on the market: Gears, Halo, CoD, NBA 2K...
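
Since the HDR10/HGIG combo keeps coming up, here's a minimal Python sketch of the two pieces behind it. The PQ constants are the published SMPTE ST 2084 values that HDR10 uses to encode absolute luminance up to 10,000 nits; the 800-nit TV peak is a made-up example, and the clip is a crude stand-in for a game's own roll-off, not any console's actual code:

```python
# SMPTE ST 2084 (PQ) constants, as published in the standard:
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Linear luminance in nits -> PQ code value in [0, 1]."""
    y = max(nits, 0.0) / 10_000.0
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

def hgig_game_output(scene_nits: float, tv_peak_nits: float = 800.0) -> float:
    """HGIG-style behaviour: the game maps its output to the TV's
    calibrated peak (crudely clipped here), so the TV can display the
    signal as-is instead of guessing and tone mapping a second time."""
    return pq_encode(min(scene_nits, tv_peak_nits))

print(round(pq_encode(100), 3))          # ~0.508 -> SDR reference white
print(round(hgig_game_output(4000), 3))  # a 4000-nit highlight, held to 800
```

The point of HGIG is exactly that single tone map: the source knows the display's limits, so the TV doesn't stack its own curve on top.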
 

anothertech

Member
Is there a way to buy some amazing demo on iTunes? Like, the YouTube demos are pretty fly, but you can clearly see the compression. How do they set up demos at stores? Surely not via physical media, right?

Are there apps? Or iTunes movies?
Check out the Roman De Giuli 4K and 8K HDR demos on YouTube. It's insane. There are some hour-long versions out there too.
 

hussar16

Member
Go for a mini LED, but with an ADS/IPS panel only. It's got better motion resolution and more plasma-like natural colors than regular VA panels. OLED is nice, but it never gets bright enough and has a fake color tone to it. I had an ADS mini LED and it looked amazing; however, it was a Samsung, so the motion had a soap opera effect even with the settings off. Currently the only one that has an ADS panel and is big is the 75-inch Hisense U8K, with no soap opera effect like the Samsung.
 
Last edited:

hussar16

Member
If you go LED with local dimming, definitely pay a little more for the wide-angle color (Sony X93L/X95L) rather than the standard VA panel (Sony X90L). You lose a little contrast, but it's worth it to be able to sit off-angle without washed-out colors. I would not consider any non-Sony LED, as Sony has the best local dimming algorithm, especially in game mode.
All the Sony mini LEDs are VA panels.
 
I may have shat on OLED TVs (I have a C2) for the obvious reasons (judder/stutter with sub-60fps content, lower brightness compared to LEDs, etc.) but... the prices for MiniLED TVs are just friggin' stupid for fook's sake, especially for something like the Sony X95L - it's almost 1000 Eurodollars more than a same-size OLED (C3)! I know there's the infamous Sony tax® involved, but G'damn...

Last year I wanted to get a MiniLED myself (Philips, since I love Ambilight), but the prices were - once again - a complete joke; it was more expensive than the tried and tested LG OLEDs, so I obviously went with the latter.

Good or not, at the end of the day it's still LED technology with everything that entails (corner vignetting/vertical banding, DSE, blooming, etc.), and I really can't understand the prices they're asking for.
 
Last edited:

Bojji

Gold Member
Not with Sony's algorithm.
One of my Sonys only has 32 zones and it's not that bad.

It depends on how sensitive you are to this. On this Sony TV, in 21:9 movies or in darker scenes, subtitles will glow on the dark background. Most movies are 21:9 and I have to use subtitles, so this thing is a massive minus for me.

I may have shat on OLED TVs (I have a C2) for the obvious reasons (judder/stutter with sub-60fps content, lower brightness compared to LEDs, etc.) but... the prices for MiniLED TVs are just friggin' stupid for fook's sake, especially for something like the Sony X95L - it's almost 1000 Eurodollars more than a same-size OLED (C3)! I know there's the infamous Sony tax® involved, but G'damn...

Last year I wanted to get a MiniLED myself (Philips, since I love Ambilight), but the prices were - once again - a complete joke; it was more expensive than the tried and tested LG OLEDs, so I obviously went with the latter.

Good or not, at the end of the day it's still LED technology with everything that entails (corner vignetting/vertical banding, DSE, blooming, etc.), and I really can't understand the prices they're asking for.

For Mini LEDs you're probably paying for brightness. I can't comprehend how these TVs are more expensive than OLEDs when their technology is inferior in many metrics (aside from brightness and low-fps content handling, of course).
 

S0ULZB0URNE

Member
It depends on how sensitive you are to this. On this Sony TV, in 21:9 movies or in darker scenes, subtitles will glow on the dark background. Most movies are 21:9 and I have to use subtitles, so this thing is a massive minus for me.



For Mini LEDs you're probably paying for brightness. I can't comprehend how these TVs are more expensive than OLEDs when their technology is inferior in many metrics (aside from brightness and low-fps content handling, of course).
No light bleeding on the black bars with my 858-zone and 32-zone Sony TVs.

Brightness (and no burn-in) is the biggest strength of LED tech.
They go brighter and hold brightness much better.

The HDR market is aiming to go brighter, and only one display tech lags behind in this area, and that's even with improvements like heat sinks etc.
 

Bojji

Gold Member
No light bleeding on the black bars with my 858-zone and 32-zone Sony TVs.

Brightness (and no burn-in) is the biggest strength of LED tech.
They go brighter and hold brightness much better.

The HDR market is aiming to go brighter, and only one display tech lags behind in this area, and that's even with improvements like heat sinks etc.

?

[image: Sony vs. Samsung local dimming comparison on a dark scene]


Samsung has over 3x more dimming zones.

But when will this "nit race" stop? Movies are often mastered for 4000 nits, but do we really need all this shooting into our eyes?
 

S0ULZB0URNE

Member
?

[image: Sony vs. Samsung local dimming comparison on a dark scene]


Samsung has over 3x more dimming zones.

But when will this "nit race" stop? Movies are often mastered for 4000 nits, but do we really need all this shooting into our eyes?
The nit race isn't stopping.

1. I said black bars.
2. Cameras and videos overexpose such things.
3. When looking head-on I don't see blooming on my 858-zone ZD9.
4. I do sometimes on my 32-zone set.
5. I am not saying blooming doesn't exist; all display tech has strengths and weaknesses.
6. VA panels are overall the best for MY usage, with IPS being the worst.
 

Buggy Loop

Member
2. Cameras and videos overexpose such things.

This

There's no such thing as a TV → camera → whatever-you're-looking-at-right-now chain doing justice to the image output. You really have to listen to people experiencing them live.

I see the blooming on my X93L, only on subtitles honestly. The black bars are pure raw black.

The question is what percentage of the content you watch will be pure raw fucking black with bright white subtitles. For me it's quite low by my estimates. So can I live with blooming on ~3% (if even that) of the content I watch, while the remaining 97% enjoys much brighter images?

WRGB OLED and older panels introduce colour banding, tinting and uniformity issues. QD-OLEDs introduce grey blacks due to the lack of a polarizer (dropped to get higher brightness) and are more prone to burn-in than WOLED....

So nothing's perfect as of now. Pick a poison.
 
Last edited:

Buggy Loop

Member
Guess you needed it now?

I probably didn't stress enough in the original post how things look for me in the coming weeks/months.

My wife is 3 weeks from her due date for our 3rd baby. She's 100% at home.

I don't want her to go through this shit where you want to watch TV and it doesn't work. And especially not when the baby is there and you sort of couch-potato for a couple of months. Waiting for a better deal or a better model was not worth the tradeoff of her being stuck with a dead TV. I hope that clears up the urgency I had in purchasing a new one.
 

S0ULZB0URNE

Member
I probably didn't stress enough in the original post how things look for me in the coming weeks/months.

My wife is 3 weeks from her due date for our 3rd baby. She's 100% at home.

I don't want her to go through this shit where you want to watch TV and it doesn't work. And especially not when the baby is there and you sort of couch-potato for a couple of months. Waiting for a better deal or a better model was not worth the tradeoff of her being stuck with a dead TV. I hope that clears up the urgency I had in purchasing a new one.
👍 and congratulations
 

Deerock71

Member
Pouring Austin Powers GIF


That Kuro was amazing. I can't believe it even lasted me that long. The image quality was so good that I never felt the urge to upgrade when going to stores. On top of that, it served as a heater for Canadian winters :messenger_tears_of_joy:
As a Montanan, I can endorse this. I have radiant heating in my house, and my kids in their rooms, playing their PS5s and Series X on 60" TVs, never need to worry about cranking up the baseboard heaters in the wintertime. 😆
 

Meicyn

Gold Member
?

[image: Sony vs. Samsung local dimming comparison on a dark scene]


Samsung has over 3x more dimming zones.

But when will this "nit race" stop? Movies are often mastered for 4000 nits, but do we really need all this shooting into our eyes?
?

Why would you post this? The Samsung is overly aggressive with the dimming, based on the screenshot you provided. There is a tremendous amount of detail lost; do you not see it? 3x more dimming zones doesn't mean much when the algorithm driving those zones cuts out that much detail. Sure, the Sony has blooming, but at least I can see the tree on the left.

Also, a nit race? Since when is reproducing the intended image a bad thing? Why is accurate reproduction of contrast something to be valued, but accurate reproduction of brightness something to be questioned? You are aware that content is mastered at 4000 nits, so achieving 4000 nits is an important capability for a TV, no? More range is absolutely worth achieving.
 

Shake Your Rump

Gold Member
But when will this "nit race" stop? Movies are often mastered for 4000 nits, but do we really need all this shooting into our eyes?
Yes, we need more peak brightness for a more accurate picture. Just as we didn't stop at black-and-white film, or 256-colour GIFs, we will continue to make our display technology more accurate. If, in real life, a bonfire is 100x brighter than the surrounding area, we want it to be the same on our screens.
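
Back-of-the-envelope numbers for that bonfire example (purely illustrative values):

```python
scene, ratio = 40.0, 100.0     # dim surroundings, fire 100x brighter
fire = scene * ratio           # 4000 nits in the real scene
sdr_fire = min(fire, 100.0)    # SDR clips near ~100-nit reference white
print(fire, sdr_fire / scene)  # 4000.0 -> the 100:1 ratio collapses to 2.5:1
```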
 

Bojji

Gold Member
?

Why would you post this? The Samsung is overly aggressive with the dimming, based on the screenshot you provided. There is a tremendous amount of detail lost; do you not see it? 3x more dimming zones doesn't mean much when the algorithm driving those zones cuts out that much detail. Sure, the Sony has blooming, but at least I can see the tree on the left.

Also, a nit race? Since when is reproducing the intended image a bad thing? Why is accurate reproduction of contrast something to be valued, but accurate reproduction of brightness something to be questioned? You are aware that content is mastered at 4000 nits, so achieving 4000 nits is an important capability for a TV, no? More range is absolutely worth achieving.

Yes, we need more peak brightness for a more accurate picture. Just as we didn't stop at black-and-white film, or 256-colour GIFs, we will continue to make our display technology more accurate. If, in real life, a bonfire is 100x brighter than the surrounding area, we want it to be the same on our screens.

I'm all for an accurate picture, but there is a line where more brightness would do more harm than good (IMO). For example, do you think watching car headlights in a movie attacking your eyes at 4000 nits will be comfortable?

What is funny to me is that movies are mostly aimed at cinema screens, and the brightness of cinema screens is laughable compared to modern TVs. There is also this interesting quote:

Zell also touched on the issue of how our eyes actually respond to light and why more nits doesn't necessarily equate to a better experience. "When we watch a movie at 48 nits or 100 nits," he said, "our irises stay open. When we're looking at 150 nits, they start to close down and by 300 nits, they are really small." A screen at full white at 300 nits makes viewers' "heads jerk backwards, it makes them cover their eyes with their hands."

And while there would be very few situations of pure white being projected on a screen for any significant length of time, he did suggest that if movies are to be shown at levels beyond 300 nits, editors and colorists will need to understand the fatigue and headache that can result when viewers' irises are made to frequently open up and close down, and to factor this into their work.

 

Meicyn

Gold Member
I'm all for an accurate picture, but there is a line where more brightness would do more harm than good (IMO). For example, do you think watching car headlights in a movie attacking your eyes at 4000 nits will be comfortable?

What is funny to me is that movies are mostly aimed at cinema screens, and the brightness of cinema screens is laughable compared to modern TVs. There is also this interesting quote:



The context behind that article is a cinema environment, i.e. a dark room. 300 nits is a lot when you’re in a very dark room. In a bright room, 300 nits is nothing.
 

Shake Your Rump

Gold Member
I'm all for an accurate picture, but there is a line where more brightness would do more harm than good (IMO). For example, do you think watching car headlights in a movie attacking your eyes at 4000 nits will be comfortable?
Again, this would be like asking "do we really need 16 million colours in our photos?", arguing against it because too many colours may be uncomfortable to view. There is no upper limit where more dynamic range would be harmful, or even undesirable.

The peak brightness of any display involves a small percentage of the screen. Most of the headroom above 1000 nits would be used only for extreme highlights. Your 4000-nit headlight example would never actually occur.
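
A quick numerical sketch of that headroom point. This hypothetical roll-off is loosely inspired by BT.2390-style knees but heavily simplified; the 1000-nit panel peak and the 75% knee are made-up example values, not any TV's actual curve:

```python
def rolloff(nits: float, display_peak: float = 1000.0,
            knee: float = 0.75) -> float:
    """Pass everything below the knee through 1:1, then softly
    compress the remaining highlights toward the panel's peak."""
    start = knee * display_peak        # roll-off begins at 750 nits here
    if nits <= start:
        return nits
    span = display_peak - start        # headroom left above the knee
    over = nits - start
    return start + span * over / (over + span)

for n in (200, 750, 1000, 4000):
    print(n, "->", round(rolloff(n)))  # 200, 750, 875, 982
```

Everything up to 750 nits passes through untouched; only the extreme highlights get squeezed into the last 250 nits, which is why the headroom mostly matters for small, bright details.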
 

Bojji

Gold Member
The context behind that article is a cinema environment, i.e. a dark room. 300 nits is a lot when you’re in a very dark room. In a bright room, 300 nits is nothing.

Of course. But why do we need TVs reaching 4000 (or 10,000!) nits when cinema screens can't even reach a fraction of that and can't show all those "hidden details"? The cinema screen is what movie directors are aiming for.

This brightness race is driven mostly by TV manufacturers right now; people watched their TVs in bright rooms for decades, but now everything below 2000 nits is "too dim" (hint: CRTs, early LCDs and plasma TVs weren't above 300 nits).

Again, this would be like asking "do we really need 16 million colours in our photos?", arguing against it because too many colours may be uncomfortable to view. There is no upper limit where more dynamic range would be harmful, or even undesirable.

The peak brightness of any display involves a small percentage of the screen. Most of the headroom above 1000 nits would be used only for extreme highlights. Your 4000-nit headlight example would never actually occur.

So why have 4000 nits if you only use them on (for example) 2% of the screen? I get what you're saying about the number of colors etc., but there are diminishing returns for everything. Just like 8K resolution is a waste.
 
Last edited:

Mister Wolf

Gold Member
Again, this would be like asking "do we really need 16 million colours in our photos?", arguing against it because too many colours may be uncomfortable to view. There is no upper limit where more dynamic range would be harmful, or even undesirable.

The peak brightness of any display involves a small percentage of the screen. Most of the headroom above 1000 nits would be used only for extreme highlights. Your 4000-nit headlight example would never actually occur.

The idea, pushed by OLED proponents, that "infinite contrast" makes up for low peak brightness is one of the biggest scams sold to TV consumers these past years. Any side-by-side in a store using high-nit content, like Buggy Loop did, destroys that notion. Even SDR content tone-mapped to a higher-nit-range TV is more pleasing to the eye.
 
Last edited:

Meicyn

Gold Member
I get what you're saying about the number of colors etc., but there are diminishing returns for everything. Just like 8K resolution is a waste.
Ah yes. Diminishing returns. Except for OLED, where infinite contrast matters! And the immediate pixel response is superior. Except when the content is sub 40 fps and results in distracting judder. Console devs, adjust please! Ignore the bright HUD in your modern games causing the ABL to aggressively darken everything else on the screen to avoid screen retention issues. OLED is superior, you don’t really need more than 300 nits. Go read this article about 300 nits being enough.

Jesus Christ, OLED purists are insufferable. I’m out.
 

Bojji

Gold Member
Ah yes. Diminishing returns. Except for OLED, where infinite contrast matters! And the immediate pixel response is superior. Except when the content is sub 40 fps and results in distracting judder. Console devs, adjust please! Ignore the bright HUD in your modern games causing the ABL to aggressively darken everything else on the screen to avoid screen retention issues. OLED is superior, you don’t really need more than 300 nits. Go read this article about 300 nits being enough.

Jesus Christ, OLED purists are insufferable. I’m out.

I'm aware of OLED's problems and the things it does worse than other tech, thank you.

And I wasn't talking about all this with OLEDs in mind at all, just in general.

This is where the brightness race is going:

To begin, let’s please get one common, deep-rooted misunderstanding out of the way: that HDR is about overall brighter pictures. That's just wrong. It's a wider range of luminance, so deeper blacks and brighter highlights, especially specular highlights – light reflecting off shiny objects.

Another question is where to draw the line? We may think of 1,000 nits as bright now but in a few years 10,000 nits may be achievable. Will we look at 1,000-nit movies then as 'fake' HDR? Let’s hope we won’t. The only line that can be drawn objectively is SDR’s 100 nits. Anything above that in principle is HDR.

 

Buggy Loop

Member
So either my HDMI cable or the Apple TV 4K I had (first gen) is not cutting it.

Dolby Vision is detected and enabled, but I noticed in Dune that in every dark scene, a subtitle basically makes the screen brighten up rapidly at the same time as the subtitle appears; not a lot brighter, but the kind of flicker that is noticeable, and annoying.

Went into the native Sony OS and opened the Apple TV app to access the movie: same problematic scenes, and everything is good.

Just bought AirPods Max today too, to enjoy Spatial Audio while the family sleeps. If it wasn't for that, I would just say fuck the Apple TV and switch to the Sony OS, but then I might as well return the headphones, as Spatial Audio and transparency don't seem supported.

Apple TV 3rd gen then? I was kind of hoping to wait; the 4th gen is around the corner per rumours..
 
Last edited:

Buggy Loop

Member
Went to Costco today for food during the holidays and picked up the Apple TV 3rd gen on the way, along with HDMI 2.1 cables.

So the problem I had previously is gone with the 3rd gen. Also, YouTube now actually outputs HDR, while on the 1st gen it was not. (Had to go into the Sony OS apps to see.)

Gonna try to sell this 1st gen paperweight now..

Watched Joker tonight with Dolby Vision and AirPods Max Spatial Audio.

You Are Amazing Love It GIF by Late Night with Seth Meyers


The scene where he's about to kill his mother with a pillow, the sun shining into the room in HDR... I'm basically rediscovering all the movies I've seen, and it feels completely new. The colors and contrasts, the bright lights. So fucking good.
 