The Sony X95L is the best QLED; Sony's algorithm is the best on the market.

It has a low contrast ratio for a VA panel, plus fuck that blooming!
Mini LEDs have some advantages but I can't stand some aspects of them.
It shows how Sony, with only 480 local dimming zones, is destroying TVs with 1,344 zones.
All thanks to Sony's superior algorithm. Next year's version will be even closer to OLED, without all the bad things about it like burn-in.
Got it this morning and went to work.
Now it's up and running.
First time seeing those HDR demos while still dealing with the aftermath of a cold (watery eyes).
Guess you needed it now?

Fuck it,
Just bought 65" X93L
Done
Hopefully a companion that will last as long as my Kuro bro.
Was mighty fucking impressive in store next to OLED, so I don't see any regrets on the horizon. I trust the Sony brand too, their processing and long history of quality TVs. I'm a bit doubtful about Hisense; they might have no problems, but they have a bit more to prove on long-term quality.
I thought about this and I really didn't want to nag the family about the content they watch to avoid burn-in, etc. I don't think it will necessarily happen; I just don't want the thought to occur.
To be delivered this Wednesday. OMG.
I think the first movie I'll watch is Blade Runner 2049
Not with Sony's algorithm.

But you can clearly see those dimming zones in tests or with subtitles. Connected to a PC, the mouse cursor on a dark background will look like Christmas lights.
This TV is super expensive, so I don't get why it has such a low number of dimming zones.
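For a sense of scale, here's some napkin math (assuming a 4K panel and an evenly spread grid of zones, which real sets only approximate):

```python
# Napkin math: how big is one local dimming zone on a 4K panel?
# Assumes zones are spread evenly across the screen, which real TVs
# only approximate (actual zone layouts are not published).

PANEL_W, PANEL_H = 3840, 2160  # 4K UHD

for zones in (32, 480, 1344):
    pixels_per_zone = PANEL_W * PANEL_H / zones
    side = pixels_per_zone ** 0.5  # treat each zone as a square patch
    print(f"{zones:>4} zones: ~{pixels_per_zone:,.0f} px/zone, "
          f"roughly {side:.0f} x {side:.0f} px each")

#   32 zones: ~259,200 px/zone, roughly 509 x 509 px each
#  480 zones: ~17,280 px/zone, roughly 131 x 131 px each
# 1344 zones: ~6,171 px/zone, roughly 79 x 79 px each
```

A mouse cursor is maybe 30 pixels tall, so even at 480 zones it lights up a patch several times its size: exactly the Christmas-lights effect described above.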
Yes
That Kuro was amazing. I can't believe it lasted me even that long. The image quality was so good that I never felt the urge to upgrade when going to stores. On top of that, it served as a heater for Canadian winters.
Now the panel randomly doesn't turn on and has the 8 blinking lights of death. It sometimes starts, but it's getting really annoying when it doesn't want to, and it really sucks for the kids, especially with holiday movie season. Probably fixable by toying around with the PCBs inside, but putting cash into a 15-year-old display doesn't make much sense.
So... for a new TV:
- No intention of having the new TV as "the" home cinema. I'm renovating my basement for a movie theater, so likely an 85-98" panel will go down there; in the meantime...
- I think 65" is good, and it will serve as a secondary TV eventually.
- I don't have a PS5 or an Xbox, and no intention of buying future consoles either. I have a Nintendo Switch I barely play anymore, but more than likely my kids will eventually play on it more, and when they're a bit older I'll play with them.
- So good upscaling: I doubt even the Switch 2 will be 4K, so upscaling that doesn't suck is high on the list. On top of TV channels barely moving out of the HD age.
- A bit worried about burn-in, but technically we don't watch the troublesome things like news feeds 8 hours a day. Kids playing games and leaving the TV on for a while might be a problem, though.
- I hear there's no TV nowadays that will match the motion resolution of plasma, so that sucks a bit. But unless someone corrects me, there's not a single tech in modern TVs that will match it, correct? (Some napkin math on this after the list.)
- I don't particularly aim for the BEST picture quality with perfect blacks, infinite contrast, etc. I love the Kuro image quality and that was not perfect. My wife will never care for it. I'll put more cash into the future home theater setup; again, this will likely become the secondary TV.
- Right now it would be in a bright room with big windows.
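On the motion resolution point above: the usual explanation is persistence. A sample-and-hold LCD/OLED keeps each frame lit for the full frame time, while plasma fired short light pulses, and perceived blur scales with how long the frame stays lit while your eye tracks motion. A rough sketch (the speed and persistence numbers are illustrative guesses, not measurements):

```python
# Napkin math for sample-and-hold motion blur: while your eye tracks a moving
# object, it smears across the retina for as long as each frame stays lit.
# The speed and persistence times below are illustrative guesses.

def blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    """Approximate perceived blur width, in pixels, for a tracked object."""
    return speed_px_per_s * persistence_ms / 1000

PAN = 960  # a moderate camera pan, in pixels per second

print(blur_px(PAN, 16.7))  # 60 Hz sample-and-hold LCD/OLED: ~16 px of smear
print(blur_px(PAN, 2.0))   # short plasma-style light pulses: ~2 px
print(blur_px(PAN, 1.0))   # aggressive backlight strobing / BFI: ~1 px
```

Which is also why black frame insertion / backlight strobing can get a modern set close to plasma-like motion, at the cost of brightness and some flicker.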
I kind of nailed it down to: is OLED worth the extra ~$800 over a cheap LCD for a secondary TV?
- Cheap LCD, ~$1,500 Canadian, hoping for a $999 sale soon? Don't think I can wait...
  - TCL QM8 / Hisense U8K: these are apparently trading blows, with Hisense seemingly having more software problems, so I would aim for the TCL. But how long do they last?
  - Sony X90L: it's just full array, but it apparently punches way above its weight with Sony's processing. Only a few scenes can be problematic with it, but the image would also be more accurate, as the Mini LED displays will typically just crush black detail, such as stars disappearing. Sony has the best upscaling, from what I read?
- Top-range LCD, ~$2,100
  - Samsung QN90C
- Mid-range OLED, ~$2,300 CAD
  - LG C3 / Sony A80L / Samsung S90C, with the Samsung not having any Dolby Vision. Again, not even sure I'll take advantage of it...
Mind you, I already have an Apple TV 4K, so forget about the trash "smart" OS on these TVs; they could be dumb as a brick for all I care.
My gear, including my 4090 build, is hooked up to my Sonys.

Just followed a bunch of Bravia XR settings guides for SDR, HDR, and Dolby Vision, and afterwards the Apple TV 4K settings.
The TV looks completely different than in the first hour I spent with it. I'm watching the jet sequences in Top Gun: Maverick and it blows my fucking mind.
I honestly thought my plasma was holding up; I just didn't realize how much 4K + HDR/Dolby Vision on a good TV blows it the fuck away. I'm jumping between all the movies I have to test it and EVERYTHING looks amazing.
I'm legit gonna have trouble going back to a non-HDR PC monitor now, I think. The shadow details, and the peak brightness in some scenes...
Can you post a picture of the TV running something just so that we can kinda get an idea of what it looks like?
Edit - Dunkirk! DUNKIRK!! Holy crap, this movie was made for brightness.
Is there a way to buy some amazing demo on iTunes? The YouTube demos are pretty fly, but you can clearly see the compression algorithm at work. How do they set up demos at stores? Surely not via physical media, right?
Are there apps? Or iTunes movies?

Less compressed?
RIGHT NOW Dolby Vision is useless for gaming because almost no games support it; the Xbox setting for DV just converts the HDR10 signal. Once developers start to use DV, it will be the better format for games, for sure.
HDR10 peak brightness is 4,000 nits, so brightness isn't the problem for games.
Right now HDR10 (console, PC) plus HGIG (TV) is the perfect combo.

You are correct in that HDR10 games are converted (and worse for it) when DV is on, but there are quite a few Dolby Vision Enhanced games already on the market: Gears, Halo, CoD, NBA 2K...
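Rough idea of why the HDR10 + HGIG combo quoted above works: with HGIG the TV stops applying its own tone mapping, so the game compresses its output to the display's reported peak by itself and nothing gets tone mapped twice. A toy sketch of that game-side roll-off (the curve and numbers here are made up for illustration; real engines each ship their own tone mappers):

```python
# Toy sketch of the game-side tone mapping HGIG implies: the TV reports its
# peak and applies no tone mapping of its own, so the game compresses its
# output into that range. This roll-off curve is made up for illustration.

def tone_map(nits: float, display_peak: float, knee: float = 0.75) -> float:
    """Map scene luminance (nits) to display luminance (nits)."""
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits  # pass through untouched: no double tone mapping
    # Smoothly compress everything above the knee into the remaining
    # headroom, approaching display_peak instead of hard clipping.
    overshoot = nits - knee_nits
    headroom = display_peak - knee_nits
    return knee_nits + headroom * (1 - 1 / (1 + overshoot / headroom))

# A 4000-nit mastered highlight on a TV that reports ~700 nits via HGIG:
print(tone_map(4000, 700))  # ~692, near peak, highlight detail survives
print(tone_map(300, 700))   # 300.0, midtones pass through unchanged
```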
Even the best QLED with thousands of dimming zones will be worse than OLED where every pixel is a light source. You are wrong.
Check out the Roman de Giuli 4K and 8K HDR demos on YouTube. It's insane. There are hour-long versions out there too.
If you go LED with local dimming, definitely pay a little more for the wide-angle color (Sony X93L/X95L) rather than the standard VA panel (Sony X90L). You lose a little in contrast, but it's worth it to be able to sit off-angle without washed-out colors. I would not consider any non-Sony LED, as Sony has the best local dimming algorithm, especially in game mode.

All the Sony Mini LEDs are VA panels.
Not with Sony's algorithm.
One of my Sonys has only 32 zones and it's not that bad.
I may have shat on OLED TVs (I have a C2) for the obvious reasons (judder/stutter with sub-60fps content, lower brightness compared to LEDs, etc.), but... the prices for Mini LED TVs are just friggin' stupid, for fook's sake, especially for something like the Sony X95L; it's almost 1,000 Eurodollars more than a same-size OLED (C3)! I know there's the infamous Sony tax® involved, but G'damn...
Last year I wanted to get a Mini LED myself (Philips, since I love Ambilight), but the prices were, once again, a complete joke; it was more expensive than the tried and tested LG OLEDs, so I obviously went with the latter.
Good or not, at the end of the day it's still LED technology with everything that entails (corner vignetting, vertical banding, DSE, blooming, etc.). I really can't understand the prices they're asking.
It depends on how sensitive you are to this. On this Sony TV, in 21:9 movies or in darker scenes, subtitles will glow on a dark background. Most movies are 21:9 and I have to use subtitles, so this is a massive minus for me.
For Mini LEDs you probably pay for brightness. I can't comprehend how these TVs are more expensive than OLEDs when the technology is inferior in many metrics (aside from brightness and low-fps content handling, of course).

No light bleeding on the black bars with my 858-zone and 32-zone Sony TVs.
Brightness (and no burn-in) is the biggest strength of LED tech.
They go brighter and hold brightness much better.
The HDR market is aiming to go brighter, and only one display tech lags behind in this area, and that's even with improvements like heat sinks etc.
The nit race isn't stopping.
Samsung has over 3x more dimming zones.
But when will this "nit race" stop? Movies are often mastered for 4,000 nits, but do we really need all of that shooting into our eyes?
2. Cameras and videos overexpose such things.
Guess you needed it now? And congratulations.

I probably didn't stress enough in the original post how things are looking for me in the coming weeks/months.
My wife is 3 weeks from the due date for our 3rd baby. She's 100% at home.
I don't want her to go through this shit where you want to watch TV and it doesn't work, and especially not when the baby is here and you're sort of a couch potato for a couple of months. Waiting for a better deal or a better model wasn't worth the tradeoff of her being stuck with a dead TV. I hope that clears up the urgency I had in purchasing a new one.
That Kuro was amazing. I can't believe it lasted me even that long. The image quality was so good that I never felt the urge to upgrade when going to stores. On top of that, it served as a heater for Canadian winters.

As a Montanan, I can endorse this. I have radiant heating in my house, and my kids in their rooms playing their 60" TVs and PS5s and Series X never need to worry about cranking up the baseboard heaters in the wintertime.
All the Sony Mini LEDs are VA panels.
??
Samsung has over 3x more dimming zones.
But when will this "nit race" stop? Movies are often mastered for 4,000 nits, but do we really need all of that shooting into our eyes?
Yes, we need more peak brightness for a more accurate picture. Just as we didn't stop at black-and-white film, or 256-colour GIFs, we will continue to make our display technology more accurate. If, in real life, a bonfire is 100x brighter than the surrounding area, we want it to be the same on our screens.
Why would you post this? The Samsung is overly aggressive with the dimming based on the screenshot you provided. There is a tremendous amount of detail lost, do you not see it? 3x more dimming zones doesn't mean much when the algorithm driving those zones cuts out that much detail. Sure, the Sony has blooming, but at least I can see the tree on the left.
Also, nit race? Since when is reproducing the intended image a bad thing? Why is accurate reproduction of contrast something to be valued, but accurate reproduction of brightness something to be questioned? You are aware that content is mastered at 4,000 nits, so achieving 4,000 nits is an important capability of a TV, no? More range is absolutely worth achieving.
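To make that trade-off concrete, here's a toy model; neither Sony's nor Samsung's actual algorithm is public, so this only shows the principle: every zone has to pick a single backlight level for thousands of pixels, and the statistic it's picked by decides whether you get blooming or black crush.

```python
# Toy model of the blooming vs. black crush trade-off. Neither Sony's nor
# Samsung's real algorithm is public; this only shows the principle that one
# backlight value has to serve thousands of pixels per zone.

import numpy as np

rng = np.random.default_rng(0)

# One 100x100 px zone of a starfield: black background plus faint stars.
zone = np.zeros((100, 100))
stars = rng.integers(0, 100, size=(20, 2))
zone[stars[:, 0], stars[:, 1]] = 0.05  # stars at 5% luminance

candidates = {
    "zone max (conservative)": zone.max(),                    # keeps stars
    "99th percentile (aggressive)": np.percentile(zone, 99),  # 0.0 here
}

for name, backlight in candidates.items():
    # A pixel can't be brighter than its zone's backlight allows.
    displayed = np.minimum(zone, backlight)
    print(f"{name}: backlight={backlight:.3f}, "
          f"visible star pixels={np.count_nonzero(displayed)}")
```

Drive the zone by its max and the stars survive but the whole zone glows (blooming, visible tree); drive it by a percentile and the zone goes dark and the stars vanish (black crush, lost detail).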
I'm all for an accurate picture, but there is a line where more brightness would do more harm than good (IMO). For example, do you think watching car headlights in a movie attacking your eyes at 4,000 nits will be comfortable?
What is funny to me is that movies are mostly aimed at cinema screens, and the brightness of cinema screens is laughable compared to modern TVs. There is also this interesting quote:

Zell also touched on the issue of how our eyes actually respond to light and why more nits doesn't necessarily equate to a better experience. "When we watch a movie at 48 nits or 100 nits," he said, "our irises stay open. When we're looking at 150 nits, they start to close down and by 300 nits, they are really small." A screen at full white at 300 nits makes viewers' "heads jerk backwards, it makes them cover their eyes with their hands."
And while there would be very few situations of pure white being projected on a screen for any significant length of time, he did suggest that if movies are to be shown at levels beyond 300 nits, editors and colorists will need to understand the fatigue and headache that can result when viewers' irises are made to frequently open up and close down, and to factor this into their work.

HDR in Cinema: Technology Projections and Predictions - NAB Amplify (amplify.nabshow.com)

The context behind that article is a cinema environment, i.e. a dark room. 300 nits is a lot when you're in a very dark room. In a bright room, 300 nits is nothing.

Again, this would be like asking "do we really need 16 million colours in our photos", arguing against it because too many colours may be uncomfortable to view. There is no upper limit where dynamic range would be harmful, or even undesirable.
The peak brightness of any display involves a small percentage of the screen. Most of the headroom above 1,000 nits would be used only for extreme highlights. Your 4,000-nit headlight example would never actually occur.
I get what you are saying about the number of colors etc., but there are diminishing returns for everything. Just like 8K resolution is a waste.
Ah yes. Diminishing returns. Except for OLED, where infinite contrast matters! And the immediate pixel response is superior. Except when the content is sub-40fps and results in distracting judder. Console devs, adjust please! Ignore the bright HUD in your modern games causing the ABL to aggressively darken everything else on the screen to avoid image retention issues. OLED is superior, you don't really need more than 300 nits. Go read this article about 300 nits being enough.
Jesus Christ, OLED purists are insufferable. I’m out.
To begin, let’s please get one common, deep-rooted misunderstanding out of the way: that HDR is about overall brighter pictures. That's just wrong. It's a wider range of luminance, so deeper blacks and brighter highlights, especially specular highlights – light reflecting off shiny objects.
Another question is where to draw the line. We may think of 1,000 nits as bright now, but in a few years 10,000 nits may be achievable. Will we look at 1,000-nit movies then as 'fake' HDR? Let's hope we won't. The only line that can be drawn objectively is SDR's 100 nits. Anything above that in principle is HDR.
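That 100-nit line isn't arbitrary; it falls straight out of the PQ curve HDR10 uses. Quick sketch with the SMPTE ST 2084 constants (the constants are from the published standard; the script itself is just for illustration):

```python
# SMPTE ST 2084 (PQ) inverse EOTF: where do familiar nit levels land in the
# HDR10 signal range? Constants are from the published standard.

m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32     # 18.8515625
c3 = 2392 / 4096 * 32     # 18.6875

def pq_encode(nits: float) -> float:
    """Luminance in nits -> normalized PQ code value in [0, 1]."""
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (100, 1000, 4000, 10000):
    print(f"{nits:>5} nits -> PQ code value {pq_encode(nits):.3f}")

#   100 nits -> 0.508
#  1000 nits -> 0.752
#  4000 nits -> 0.903
# 10000 nits -> 1.000
```

So SDR reference white already sits around the middle of the signal, and the whole 1,000-to-10,000-nit stretch everyone argues about lives in the top quarter of the range: headroom for highlights, not full-screen brightness.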
Treat yourself and buy the X93L. You deserve it. That G3 is dim. Auto Brightness Limiter ruining things like usual.

Can you not disable it?
No pixel orbiter?

And risk damaging the OLED now that it has no safeguard.