
Underrated feature of Samsung displays: proprietary "HDR10+ Gaming"

YeulEmeralda

Linux User
I have a Samsung LED TV that can output 1500 nits so I can definitely see the difference.

Unfortunately I mostly play old games, low-budget games, or weebshit. So HDR hasn't been that important for me.
 
So what's the difference between HDR and HDR10+ or whatever this is?
On my LG C1 I feel like Dolby Vision games are a bit darker than regular HDR.
 

JohnnyFootball

GerAlt-Right. Ciriously.
So what's the difference between HDR and HDR10+ or whatever this is?
On my LG C1 I feel like Dolby Vision games are a bit darker than regular HDR.
HDR10 is static metadata, HDR10+ is 10-bit dynamic metadata, and Dolby Vision is 12-bit dynamic metadata.

"Dynamic HDR technology means applying metadata to each scene, and then delivering a more optimized picture quality compared to that of static HDR technology."

Again, dynamic metadata's benefits to gaming are minimal when you can make so many adjustments in the game itself. It's far, far more useful in movies, where you want different tone mapping for different scenes. Admittedly, most people will tell you regular HDR is just fine. But for a movie enthusiast like me, I prefer Dolby Vision.
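
For anyone wondering what "per scene" actually buys you, here's a rough back-of-the-napkin sketch of the idea (illustrative Python, not any real display or game API):

```python
# Rough sketch of static vs. dynamic HDR metadata (illustrative only,
# not any real display or game API).

def tone_map(pixel_nits, scene_peak_nits, display_peak_nits):
    """Naive roll-off: scale the declared peak into what the display can show."""
    if scene_peak_nits <= display_peak_nits:
        return pixel_nits  # display can show it as authored
    return pixel_nits * (display_peak_nits / scene_peak_nits)

# HDR10 (static): one MaxCLL value describes the entire movie/game,
# so a dim scene gets compressed by the same curve as the brightest one.
STATIC_MAXCLL = 4000  # nits, mastering peak for the whole title
display_peak = 800    # nits, e.g. a mid-range OLED

scenes = [("dark cave", 120), ("sunset", 1500), ("explosion", 4000)]

for name, scene_peak in scenes:
    static = tone_map(scene_peak, STATIC_MAXCLL, display_peak)
    # HDR10+ / Dolby Vision (dynamic): metadata carries each scene's own peak,
    # so the dark cave isn't dimmed just because the explosion exists.
    dynamic = tone_map(scene_peak, scene_peak, display_peak)
    print(f"{name}: static -> {static:.0f} nits, dynamic -> {dynamic:.0f} nits")
```

The dark cave is the telling case: with a single static curve it gets pulled down along with everything else, while per-scene metadata leaves it untouched.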
 

DJ12

Member
HDR10+ is supported on Samsung TVs too, but I can count on one hand the number of times I've encountered content that supported it
Amazon prime stuff is HDR10+ enabled, or at least it used to be.
HDR10+ is a fucking waste.

Dolby Vision took over and owns dynamic metadata. Dynamic metadata IMO is less important for games than it is for movies.
HDR10+ is free for anyone to use, but Dolby Vision has the hearts and minds. Samsung should just bite the bullet for their consumers' sake.

Bet there's little difference between them side by side, if any, but with the lack of HDR10+ encoded movies the difference between plain HDR10 and Dolby Vision will be pretty obvious.
 
HDR10 is static metadata, HDR10+ is 10-bit dynamic metadata, and Dolby Vision is 12-bit dynamic metadata.

"Dynamic HDR technology means applying metadata to each scene, and then delivering a more optimized picture quality compared to that of static HDR technology."

Again, dynamic metadata's benefits to gaming are minimal when you can make so many adjustments in the game itself. It's far, far more useful in movies, where you want different tone mapping for different scenes. Admittedly, most people will tell you regular HDR is just fine. But for a movie enthusiast like me, I prefer Dolby Vision.

I don't like to tinker with settings. I'd rather have it automatic, where I'm immediately playing the game how the developers intended. I get that with HDR10+. I just tested Cyberpunk 2077, toggling between regular HDR and HDR10+, and the latter is so, so much better visually. The colors pop so much more; it's a crazy difference.
 

hinch7

Member
Good video I found explaining exactly what HDR10+ Gaming is and how it works.


Pretty good explanation of what it does. At the moment it's a wild west on PC, with way too many variables and factors, from hardware and displays to individual games. We need a default standard for HDR gaming on PC, for ease of use if anything.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Amazon prime stuff is HDR10+ enabled, or at least it used to be.

HDR10+ is free for anyone to use, but Dolby Vision has the hearts and minds. Samsung should just bite the bullet for their consumers' sake.

Bet there's little difference between them side by side, if any, but with the lack of HDR10+ encoded movies the difference between plain HDR10 and Dolby Vision will be pretty obvious.
That’s the point! Almost nobody uses it! There’s less motivation to use it when only Samsung officially supports it!

It is puzzling why so few TVs implement it, since it is free.

I don’t think Cyberpunk uses true HDR10+ since the feature is available on my non-HDR10+ Alienware. I think it just changes the tone mapping.
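
Pure speculation on my part, but the difference I mean would look roughly like this (illustrative Python, not CDPR's actual code or any real API):

```python
# Speculative illustration only, not CDPR's actual code or any real API.

def game_side_tonemap(frame_nits, target_peak_nits):
    """'Fake' HDR10+: the game remaps its own output toward a chosen peak and
    sends an ordinary HDR10 signal, so any HDR display can show the result."""
    return [min(v, target_peak_nits) for v in frame_nits]

def true_hdr10_plus(frame_nits):
    """Real HDR10+: the frame goes out largely untouched, plus per-scene
    metadata, and the display does the scene-by-scene tone mapping, which is
    why the option should only appear on an HDR10+-capable panel."""
    metadata = {"scene_peak_nits": max(frame_nits)}
    return frame_nits, metadata

frame = [50, 400, 1200, 2500]          # pixel luminance values in nits
print(game_side_tonemap(frame, 800))   # [50, 400, 800, 800]
print(true_hdr10_plus(frame))          # ([50, 400, 1200, 2500], {'scene_peak_nits': 2500})
```

If the toggle works on a display that never advertises HDR10+ support, the first path is the more likely explanation.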
 

Bojji

Member
Pretty good explanation of what it does. At the moment it's a wild west on PC, with way too many variables and factors, from hardware and displays to individual games. We need a default standard for HDR gaming on PC, for ease of use if anything.

HDR10 is the standard on PC, same as on consoles. Every game with HDR support uses HDR10; Auto HDR and RTX HDR are HDR10 too.

Very few games offer something different (like HDR10+ or DV), and only one game on Xbox has native Dolby Vision support (Halo Infinite).

HDR10 combined with HGIG is not bad at all; you can set the maximum brightness and get correct tone mapping for your display.
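
Rough idea of why that works, as a sketch (illustrative Python only, not a real API):

```python
# With HGIG the TV disables its own tone mapping and trusts the game to map
# everything to the peak brightness you enter in its HDR calibration screen.

def hgig_tonemap(pixel_nits, game_peak_setting):
    """Game-side clamp to the display peak set during in-game calibration."""
    return min(pixel_nits, game_peak_setting)

# Set 800 nits in-game for an 800-nit display: nothing brighter is ever sent,
# so the TV never has to guess how to compress highlights.
print(hgig_tonemap(1500, 800))  # 800
print(hgig_tonemap(400, 800))   # 400 (shown exactly as authored)
```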

Movies actually benefit much more from techniques like DV or HDR10+ with dynamic metadata.
 