Halo Infinite Dolby Vision Implementation on Xbox Series X is Currently Broken. Play in HDR with These Best Settings.




From HDTVTest:

We compared Halo Infinite Dolby Vision vs HDR using two Xbox Series X consoles, as well as two LG C1 OLED TVs, and found the Dolby Vision presentation to be flatter, dimmer & more washed out than HDR10.

We also used a Canon reference monitor to measure the Halo Infinite Dolby Vision output from the Xbox Series X when connected to LG OLEDs, and found that peak nits in the game were mostly capped to 300-400 nits.

When connected to a Sony A90J OLED, Dolby Vision output from Halo Infinite went up to 500 nits, but it's still dimmer than what's possible in HDR10, therefore we advise using HDR10 with the appropriate in-game [HDR] setting for the best picture quality.
 
HDR is a real mess, sadly. I was hoping Dolby Vision was going to at least be one dependable bullet point on the box, given that it's proprietary, while every other HDR solution shares general commonalities but can end up being drastically different from game to game and from TV to TV... but Dolby Vision has not been the HDR to save HDR either.
 
Once again, HDR10 is all that is needed for gaming.

Dolby Vision will never work for games. Xbox's brand new flagship game from their flagship developer on their new flagship console can't even get their own new exclusive Dolby Vision for gaming feature to work correctly.
 
Every game I've tried in Dolby Vision has looked awful. Movies play fine in DV, though.

DV for movies is great because they fine-tune the HDR on a scene-by-scene basis.

Games are dynamic, so outside of cutscenes I'd imagine that would be an impossible task. Because of that I really fail to see how games would benefit from DV over standard HDR10, although I'm open to being convinced otherwise.
 
Seems excessively negative to say it will never work, and this is coming from someone who doesn't even like Dolby Vision...
 
Results are going to vary until there is a standard for HDR.

Dolby Vision seems like the way forward compared to HDR10+, but most of the time I leave it off.
 
HDR in general has been a huge letdown for me for both gaming and video content. I've got an LG OLED (C9), but, depending on content, it's far too inconsistent. At best, it might improve some highlight detail a little. At worst, it made the picture look significantly less vibrant. I think I'm at the point where I would rather not deal with it.
 
Not only that, but enabling DV limits the HDMI bandwidth to below 40Gbps (on LG TVs); as a result, the image is more prone to banding. Not worth it.
 
Not only that, but enabling DV limits the HDMI bandwidth to below 40Gbps (on LG TVs); as a result, the image is more prone to banding. Not worth it.
The extra 8Gbps is only needed for 12-bit; 10-bit is still enough to avoid banding, so 40Gbps is enough. 12-bit isn't supported on TVs anyway, as they have 10-bit panels.
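For a rough sense of why 40Gbps is the cut-off, here's a back-of-the-envelope calculation (a sketch only: it ignores HDMI 2.1 FRL line-coding overhead, audio and other protocol details, and just uses the standard CTA-861 4K120 timing):

# Rough HDMI bandwidth estimate for 4K120 RGB at different bit depths.
# Total 4K120 timing is 4400 x 2250 pixels including blanking,
# giving a ~1188 MHz pixel clock. FRL encoding overhead is ignored.
PIXEL_CLOCK_HZ = 4400 * 2250 * 120

for bit_depth in (8, 10, 12):
    bits_per_pixel = 3 * bit_depth  # RGB: three components per pixel
    gbps = PIXEL_CLOCK_HZ * bits_per_pixel / 1e9
    print(f"{bit_depth}-bit RGB 4K120: ~{gbps:.1f} Gbps")

# Prints roughly 28.5, 35.6 and 42.8 Gbps: 10-bit fits inside a 40Gbps
# link, while 12-bit needs the full 48Gbps.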
 
Dolby Vision broken, VRR broken... in THE 1st party release of the year (or since the console was released, really). Not a great look technically. Was the game still rushed out even though it was delayed by a year, or are 343 just not very good?
 
So for Dolby Vision for movies I need to buy a dedicated player, right? Like the Panasonic UB820?

I feel these consoles are pure shit for movies.

I'll switch it off for games on the Xbox.
 
Dolby Vision is simply layers of metadata on top of what is essentially an HDR10 image (which is either a Rec. 2020 or P3 image with a PQ gamma curve). It's not the technology that's at fault, but the implementation - and not necessarily the developers' fault, but perhaps Dolby's.
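To put a number on the "PQ gamma curve" part: both HDR10 and Dolby Vision encode the base image against the SMPTE ST 2084 (PQ) transfer function. Here's a minimal Python sketch of that curve (just the standard EOTF, nothing to do with the DV metadata layer itself):

# SMPTE ST 2084 (PQ): maps a normalised signal value in [0, 1]
# to absolute luminance in cd/m^2 (nits), peaking at 10,000 nits.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """PQ signal in [0, 1] -> luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def pq_inverse_eotf(nits: float) -> float:
    """Luminance in nits -> PQ signal in [0, 1]."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# 400 nits sits at only ~0.65 on the curve, which is why a 300-400 nit
# cap (as measured above) wastes so much of the format's headroom.
print(round(pq_inverse_eotf(400), 3))  # ~0.652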
 
So for Dolby Vision for movies I need to buy a dedicated player, right? Like the Panasonic UB820?

I feel these consoles are pure shit for movies.

I'll switch it off for games on the Xbox.

If you want physical 4K Blu-rays, then yes.

For digital, either an Apple TV 4K or, if you want the best of the best, a Kaleidescape (but you're well into the realm of diminishing returns with the latter, and unless you have a really high-end multi-speaker setup paired with a 4K projector you'll be hard pressed to notice a difference).

Dolby Vision is simply layers of metadata on top of what is essentially an HDR10 image (which is either a Rec. 2020 or P3 image with a PQ gamma curve). It's not the technology that's at fault, but the implementation - and not necessarily the developers' fault, but perhaps Dolby's.

If you get a bad DV picture then the burden lies squarely on the developer/producer of the content. It requires a lot more TLC than any standard method of HDR.
 
If you get a bad DV picture then the burden lies squarely on the developer/producer of the content. It requires a lot more TLC than any standard method of HDR.

Dolby Vision for film and TV doesn't require a lot of TLC, just changes to workflow. I wouldn't know about the DV implementation in gaming, as it's a proprietary SDK/Unreal plugin - so it would be supposition to say it's the developers' fault.
 
I disable both HDR and DV...

F**king had to tweak the image in every game...

I am just happy with OLED colors and regular 4K.
 
Dolby Vision for film and TV doesn't require a lot of TLC, just changes to workflow. I wouldn't know about the DV implementation in gaming, as it's a proprietary SDK/Unreal plugin - so it would be supposition to say it's the developers' fault.

The whole format requires fine-tuning of HDR on a scene-by-scene basis. For movies, they run it through an analyser tool which makes automatic adjustments scene by scene, and then they go in and make fine-tuning adjustments to each scene after that to make sure everything appears as it should. The fine-tuning for each scene is the TLC I'm referring to, since relying only on the tool can result in unexpected and adverse results (a rough sketch of the scene-by-scene idea follows below).

Because games are so dynamic I don't know how they would even begin with that process unless the game is completely linear with static lighting.
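Purely to illustrate what "scene-by-scene analysis" means in practice, here's a toy Python sketch. It is not Dolby's analyser or its metadata format (those are proprietary); it just shows the general idea of measuring per-scene luminance statistics that a display could use to tone-map each scene individually:

# Toy scene-by-scene HDR analysis: for each scene, gather luminance
# statistics. Real DV analysis/trim metadata is proprietary and far
# more involved; this only mimics the general concept.
import numpy as np

def analyse_scenes(frames_nits, scene_starts):
    """frames_nits: list of 2D arrays of per-pixel luminance in nits.
    scene_starts: frame indices where each scene begins (e.g. from cut detection)."""
    results = []
    boundaries = list(scene_starts) + [len(frames_nits)]
    for start, end in zip(boundaries[:-1], boundaries[1:]):
        scene = np.stack(frames_nits[start:end])
        results.append({
            "first_frame": start,
            "min_nits": float(scene.min()),
            "avg_nits": float(scene.mean()),
            "max_nits": float(scene.max()),
        })
    return results

# Toy usage: three flat "frames" with a cut after the first one.
frames = [np.full((4, 4), 120.0), np.full((4, 4), 950.0), np.full((4, 4), 880.0)]
print(analyse_scenes(frames, scene_starts=[0, 1]))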
 
The whole format requires fine-tuning of HDR on a scene-by-scene basis. For movies, they run it through an analyser tool which makes automatic adjustments scene by scene, and then they go in and make fine-tuning adjustments to each scene after that to make sure everything appears as it should. The fine-tuning for each scene is the TLC I'm referring to, since relying only on the tool can result in unexpected and adverse results.

Because games are so dynamic I don't know how they would even begin with that process unless the game is completely linear with static lighting.

I work in post production. Grading for DV like you say requires a 'trim pass' - but for the most part the analysis is spot on and doesn't usually require much work. If it takes 16 hours to do an HDR grade (the same for DV/HDR10) - the trim pass can usually be boshed out in a couple of hours. Some colourists do not adjust anything (the analyser is actually pretty great).

I agree, I don't understand how DV works for dynamic, non-cut-based content either - which makes me think that it's a bodge, and probably quite difficult for developers (which is why we shouldn't blame them straight away).
 
I work in post production. Grading for DV like you say requires a 'trim pass' - but for the most part the analysis is spot on and doesn't usually require much work. If it takes 16 hours to do an HDR grade (the same for DV/HDR10) - the trim pass can usually be boshed out in a couple of hours. Some colourists do not adjust anything (the analyser is actually pretty great).

I agree, I don't understand how DV works for dynamic, non-cut-based content either - which makes me think that it's a bodge, and probably quite difficult for developers (which is why we shouldn't blame them straight away).

I guess I'm just looking at it from the perspective of "if we have a lot of high-quality DV video content then blaming the format doesn't make sense". Clearly it is possible to get high-quality and consistent results using DV, just not for games. But ultimately I think the conclusion is that in its current state it's probably not a good fit for games, and developers might be better off avoiding it until something changes with how they are able to implement it.
 
The extra 8Gbps is only needed for 12-bit; 10-bit is still enough to avoid banding, so 40Gbps is enough. 12-bit isn't supported on TVs anyway, as they have 10-bit panels.
I'm not talking about the HDMI 2.1 chipset at 48Gbps. The DV mode on the XSX limits the bandwidth to below 40Gbps; it was in one of the HDTVTest videos comparing it against HGiG.
 
I have Dolby Vision for gaming unchecked; my TV doesn't support game mode for it, so I am playing with HDR. Anyway, I tried it and couldn't really tell a difference between Dolby Vision and HDR.
 