
Is it just me, or does HDR on most games look bad?

Killjoy-NL

Gold Member
PS5 can output HDR on full.

TVs though, while they can receive full in both HDR and SDR, are made with limited in mind and work better with limited.

Full RGB is less accurate, although it's hard to notice unless you know what a calibrated display looks like.
The way I understand it is that most TVs use limited RGB range for HDR, whereas monitors use full range.

Also, looking it up, it seems PS5 4K/120Hz is limited to 8-bit limited RGB.
 

Calverz

Member
It’s a bit random.
If you start the game in HDR right after a PC restart, then it works.

Nvidia broke HDR months ago. It all used to work properly before.
Yeah, I briefly saw it working once and it looked great. I think I then changed a performance-related setting, the screen went black, and it came back washed out. I never could get it back, and I think Capcom have basically abandoned the game now.
 

Bojji

Member
The way I understand it is that most TVs use limited RGB range for HDR, whereas monitors use full range.

Also, looking it up, it seems PS5 4K/120Hz is limited to 8-bit limited RGB.

It's 12-bit actually, but with 4:2:2 chroma instead of 4:4:4.

This appears to be a known issue of HDMI sink devices misreporting the YCC 4:2:2 bit depth. The problem is not exclusive to this monitor or the TV in the HDTVTest video. As you can see at the 6:00 mark of that video, the PS5 is in fact outputting YCC 4:2:2 12-bit, but the TV info shows it as 8-bit.
 

rofif

Can’t Git Gud
Yeah, I briefly saw it working once and it looked great. I think I then changed a performance-related setting, the screen went black, and it came back washed out. I never could get it back, and I think Capcom have basically abandoned the game now.
It's absolutely not Capcom's fault.
Everything HDR on Nvidia/Windows has been like that for me for the past year.
Even for HDR movies, I have to restart my PC and start this shit right after restarting.
 
Once again I extend an invite to anyone wanting to come over to old young Kuranghi's house and see proper HDR on a bright, high-zone-count FALD LCD (which is the bare minimum for superb HDR, though most people should just buy an OLED), because games like Death Stranding look so, so much better in HDR.

Come on over lads/ladettes *pats space next to me on the couch with sinister look on face*

Simply travel to Scotland, UK, to either Glasgow or Edinburgh, and stand in the town centre holding a sign saying Kuranghi. I will approach you with a balaclava on, blindfold you, bundle you into a cab and drive to mine (for doxxing safety).

When the blindfold is removed I'll have a beautiful game running in HDR, plus snacks and a beverage waiting, which I'll feed to you slowly throughout the evening. I'll be fully clothed, don't worry about that.

HDR discussion aside, posts like these are the reason why I like visiting this forum!

God bless ya Kur, you brought a smile to this old man's face!

Cheers and happy new year everybody!
 

Buggy Loop

Member
Can't speak for games yet, but since I got my Sony X93L not long ago, HDR movies have blown my mind. I can't imagine games would fuck it up so much that it's drastically different from movies? Implementation is key of course, but there must be some very good ones?
 

Kuranghi

Gold Member
my TV doesn't even have a 10-bit panel, I assume HDR is pointless on it.

edit: actually, is it really pointless? I assume HDR isn't just about color range.

A true 10-bit panel is the least important factor in HDR; as long as the panel has Frame Rate Control (FRC) it will be very close or identical to a true 10-bit panel. The most important factors (in roughly this order) are local dimming ability, brightness output, panel type (for LCDs mostly, i.e. if it's not OLED) and colour volume.
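For the curious, here's a minimal sketch of the idea behind FRC, assuming a simple per-frame dither (real panels use fancier spatio-temporal patterns):

```python
def frc_frames(value_10bit: int, cycle: int = 4) -> list[int]:
    """Approximate one 10-bit level on an 8-bit panel by flickering
    between the two nearest 8-bit levels over a short frame cycle."""
    base, frac = divmod(value_10bit, 4)  # 10-bit has 4x the levels of 8-bit
    # 'frac' out of every 4 frames show the brighter level, so the eye
    # averages the two into an in-between shade.
    return [min(base + 1, 255) if f < frac else base for f in range(cycle)]

print(frc_frames(513))  # [129, 128, 128, 128] -> perceived as ~128.25
```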

What exact model TV do you have?
 

Kuranghi

Gold Member
PS5 can output HDR on full.

TVs though, while they can receive full in both HDR and SDR, are made with limited in mind and work better with limited.

Full RGB is less accurate, although it's hard to notice unless you know what a calibrated display looks like.

In what manner is 4:4:4 chroma/RGB less accurate than 4:2:2 or lower? Chroma subsampling is a form of compression; I don't see how it could be less accurate.

HDR discussion aside, posts like these are the reason why I like visiting this forum!

God bless ya Kur, you brought a smile to this old man's face!

Cheers and happy new year everybody!

I do love to make people smile/laugh, good stuff.

Happy new year to you too sir!


I'd be happy to show you my glorious willy Tommi, it really is a thing of beauty, but the thing is I'm as straight as an arrow pal, so you wouldn't be able to have your way with it, so that's gonna be torture isn't it? Some things just aren't meant to be pal x
 

Kuranghi

Gold Member
So that means I can set both TV and PS5 to Full?
That should be better than Limited as long as they match, right?

What exact TV model do you have?

Usually you can just set both to Auto and it's fine, but some TVs don't switch correctly, and in that case it's better to just set both to Limited, because depending on the bandwidth of the HDMI ports (if they aren't 32Gbps+), the PS5 will output Full in SDR and Limited in HDR, so the TV needs to switch.

Setting it to Limited does sometimes lose you dynamic range/top-end brightness, so Full is preferable, but if you force both to Full it might stop you from outputting 4K + 60Hz + HDR together, since the bandwidth would exceed 18Gbps; because you've forced Full, it will reduce some other aspect of the output to keep it at Full.
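To put rough numbers on that, here's a back-of-the-envelope bandwidth calculator; the flat 20% blanking overhead is an assumption (real HDMI timings vary), so treat the figures as ballpark only:

```python
def video_bandwidth_gbps(width, height, refresh_hz, bits_per_component,
                         chroma="444", blanking_overhead=0.20):
    """Rough uncompressed video bandwidth estimate in Gbps."""
    bits_per_pixel = {"444": 3.0, "422": 2.0, "420": 1.5}[chroma] * bits_per_component
    pixels_per_sec = width * height * refresh_hz * (1 + blanking_overhead)
    return pixels_per_sec * bits_per_pixel / 1e9

print(round(video_bandwidth_gbps(3840, 2160, 60, 8), 1))           # ~14.3, near HDMI 2.0's 18Gbps
print(round(video_bandwidth_gbps(3840, 2160, 120, 10), 1))         # ~35.8, over the PS5's 32Gbps
print(round(video_bandwidth_gbps(3840, 2160, 120, 10, "422"), 1))  # ~23.9, fits after dropping to 4:2:2
```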
 

Bojji

Member
So that means I can set both TV and PS5 to Full?
That should be better than Limited as long as they match, right?

In theory both limited and full RGB should produce the same results when set up correctly. You can check some test patterns to see which setting is correct; with the wrong setting you will see crushed blacks:

[Image: rgb-test.jpg test pattern]


In the case of the PS5 this matters for the 60Hz signal, which outputs in RGB. The 120Hz signal uses YCbCr 4:2:2, which is limited-range only.
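If you don't have a test image handy, a throwaway script like this can generate a near-black strip to check for crush (assumes NumPy and Pillow; the patch levels chosen are arbitrary):

```python
import numpy as np
from PIL import Image

def black_level_test_card(levels=range(12, 28, 2), patch=120):
    """Strip of near-black patches straddling limited-range black (16).
    If every patch at or below ~20 merges into one black mass,
    blacks are being crushed somewhere in the chain."""
    strip = np.hstack([np.full((patch, patch), v, dtype=np.uint8) for v in levels])
    return Image.fromarray(strip, mode="L")

black_level_test_card().save("black-crush-check.png")
```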

In what manner is 4:4:4 chroma/RGB less accurate than 4:2:2 or lower? Chroma subsampling is a form of compression; I don't see how it could be less accurate.

RGB/4:4:4 is the best signal available. I don't think full/limited RGB has anything to do with it.
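And for reference, this is essentially all 4:2:2 does; a rough sketch assuming an H x W x 3 YCbCr array with an even width:

```python
import numpy as np

def subsample_422(ycbcr: np.ndarray) -> np.ndarray:
    """Simulate 4:2:2: luma (Y) keeps full resolution, while each
    horizontal pair of pixels shares one averaged Cb/Cr sample,
    halving chroma detail (why fine coloured text looks fringed)."""
    out = ycbcr.astype(np.float32)
    for c in (1, 2):  # Cb and Cr planes only; plane 0 (Y) is untouched
        avg = (out[:, 0::2, c] + out[:, 1::2, c]) / 2
        out[:, 0::2, c] = avg
        out[:, 1::2, c] = avg
    return np.round(out).astype(np.uint8)
```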
 

Killjoy-NL

Gold Member
In theory both limited and full RGB should produce the same results when set up correctly. You can check some test patterns to see which setting is correct; with the wrong setting you will see crushed blacks:

[Image: rgb-test.jpg test pattern]


In the case of the PS5 this matters for the 60Hz signal, which outputs in RGB. The 120Hz signal uses YCbCr 4:2:2, which is limited-range only.



RGB/4:4:4 is the best signal available. I don't think full/limited RGB has anything to do with it.
What exact TV model do you have?

Usually you can just set both to Auto and it's fine, but some TVs don't switch correctly, and in that case it's better to just set both to Limited, because depending on the bandwidth of the HDMI ports (if they aren't 32Gbps+), the PS5 will output Full in SDR and Limited in HDR, so the TV needs to switch.

Setting it to Limited does sometimes lose you dynamic range/top-end brightness, so Full is preferable, but if you force both to Full it might stop you from outputting 4K + 60Hz + HDR together, since the bandwidth would exceed 18Gbps; because you've forced Full, it will reduce some other aspect of the output to keep it at Full.
I have an A90J, which supports 120Hz, so I guess the best option would be to keep it set to Limited then.

Don't care too much about TV and such, as I haven't watched TV for years and I rarely watch older movies.
 

Kuranghi

Gold Member
I have an A90J, which supports 120Hz, so I guess the best option would be to keep it set to Limited then.

Don't care too much about TV and such, as I haven't watched TV for years and I rarely watch older movies.

Your TV has 2x HDMI 2.1 ports (they should be numbers 3 and 4 on all variants of that model, but whichever ports say 4K@120Hz anyway), so just make sure the PS5 is connected to one of those. Your TV will have it set to Auto by default; set it to Auto on the PS5 too and it will output Full when it can. The only time it will change to Limited is when the PS5/a game tries to output 4K + 120Hz + HDR together, as that would exceed 32Gbps of bandwidth, so it drops the signal to a Limited/4:2:2 chroma output.

TL;DR: You're already good to go unless you manually changed the TV and/or PS5 settings in the past, as they should both be set to Auto by default and adjust as the signal changes.
 

Whitecrow

Banned
In what manner is 4:4:4 chroma/RGB less accurate than 4:2:2 or lower? Chroma subsampling is a form of compression; I don't see how it could be less accurate.



I do love to make people smile/laugh, good stuff.

Happy new year to you too sir!



I'd be happy to show you my glorious willy Tommi, it really is a thing of beauty, but the thing is I'm as straight as an arrow pal, so you wouldn't be able to have your way with it, so that's gonna be torture isn't it? Some things just aren't meant to be pal x
RGB range and chroma subsampling are different things.

RGB range maps a colour code to a given brightness on the display.
In limited RGB, black is 16,16,16 and white is 235,235,235.
In full RGB, those values are 0 and 255.

You can have either range with any chroma subsampling you want.

They don't produce the exact same result. Full RGB has a lot more colours, for obvious reasons, and games are made on PC monitors, which are full RGB.

If you convert to limited, you get a close equivalent, basically a higher-contrast image, but not one technically accurate to the source.
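As a toy illustration of that conversion (a minimal sketch; real video pipelines are more careful about rounding and out-of-range values):

```python
import numpy as np

def full_to_limited(rgb_full: np.ndarray) -> np.ndarray:
    """Map full-range RGB (black=0, white=255) to limited range
    (black=16, white=235): 256 input levels squeezed into 220."""
    return np.round(16 + rgb_full.astype(np.float32) * 219 / 255).astype(np.uint8)

def limited_to_full(rgb_limited: np.ndarray) -> np.ndarray:
    """Inverse mapping; values outside 16-235 are clipped first."""
    clipped = np.clip(rgb_limited.astype(np.float32), 16, 235)
    return np.round((clipped - 16) * 255 / 219).astype(np.uint8)

levels = np.arange(256)
print(len(np.unique(full_to_limited(levels))))  # 220 -> some shades collapse
```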

And again, PC mode on LG TVs allows for full 4:4:4 chroma processing, but the gamma curve (i.e. the luminance tracking) is very bad.

I hate playing on a TV because I can't fucking decide between accuracy (console mode) and PC mode.
 

Killjoy-NL

Gold Member
Your TV has 2x HDMI 2.1 ports (they should be numbers 3 and 4 on all variants of that model, but whichever ports say 4K@120Hz anyway), so just make sure the PS5 is connected to one of those. Your TV will have it set to Auto by default; set it to Auto on the PS5 too and it will output Full when it can. The only time it will change to Limited is when the PS5/a game tries to output 4K + 120Hz + HDR together, as that would exceed 32Gbps of bandwidth, so it drops the signal to a Limited/4:2:2 chroma output.

TL;DR: You're already good to go unless you manually changed the TV and/or PS5 settings in the past, as they should both be set to Auto by default and adjust as the signal changes.
Yeah, I've manually set it to Limited on both my TV and PS5, figuring it would be best.
Might change it back to Auto after all then.
 

Kuranghi

Gold Member
RGB range and chroma subsampling are different things.

RGB range maps a colour code to a given brightness on the display.
In limited RGB, black is 16,16,16 and white is 235,235,235.
In full RGB, those values are 0 and 255.

You can have either range with any chroma subsampling you want.

They don't produce the exact same result. Full RGB has a lot more colours, for obvious reasons, and games are made on PC monitors, which are full RGB.

If you convert to limited, you get a close equivalent, basically a higher-contrast image, but not one technically accurate to the source.

I understand they are different things, but on a console/PS5 they go hand in hand, don't they? You can't output 4:4:4/RGB over a limited signal, I don't think. If I change HDMI Level to Limited on any of my displays it switches the output to 4:2:2 as well. Maybe I'm missing something here.
 

spons

Gold Member
A true 10-bit panel is the least important factor in HDR; as long as the panel has Frame Rate Control (FRC) it will be very close or identical to a true 10-bit panel. The most important factors (in roughly this order) are local dimming ability, brightness output, panel type (for LCDs mostly, i.e. if it's not OLED) and colour volume.

What exact model TV do you have?
It's a 400-buck Philips TV (PUS8108); it outputs video and that's about it. Joking aside, it does seem to have FRC and a direct LED backlight. I was just wondering whether tech like HDR10+ and Dolby Vision has any use at all on a TV like this.
 

Akuji

Member
People seem to have trouble understanding what HDR is. Essentially it's a higher-precision mode for colours. SDR (normal) is 8-bit per colour channel, i.e. 256 steps per channel, as you may have seen in game creation software; HDR is 10 or even 12 bits, meaning 1,024 steps for 10-bit and 4,096 steps for 12-bit. If the picture looks worse, it's because of artefacts and errors your display introduces because it's not up to the standard, like how playing 1440p on a 1080p display is less sharp than a native 1080p render.
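A quick sanity check on those step counts (an n-bit channel gives 2^n levels):

```python
for bits in (8, 10, 12):
    print(f"{bits}-bit: {2 ** bits} steps per channel (codes 0-{2 ** bits - 1})")
# 8-bit: 256 steps per channel (codes 0-255)
# 10-bit: 1024 steps per channel (codes 0-1023)
# 12-bit: 4096 steps per channel (codes 0-4095)
```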
 

ShakenG

Member
The number one mistake people make with HDR is setting RGB range to Full when they should be using Limited on an HDTV with a console.

OP, I use a Sony X950H which is a step up from your TV, but they are similar. Make sure the RGB range in your PS5 settings is set to Limited. Same thing on the TV.

On the PS5's HDR calibration screen, ignore the on-screen instructions. Using the D-pad, press down until you reach the lowest setting, then press up 15 times. This basically sets a cap of 1000 nits, which is the limit of your TV without engaging its dynamic tone mapping. Press X to move to the next screen and do the same thing, 15 clicks, then press X. On the last screen, again go to the lowest setting but do not go up at all, just press X. Done.

Use some of the settings here as a starting point.

If this TV is anything like the X90H it should be on Full. Even the Auto setting will put it on Full.
 

Kuranghi

Gold Member
It's a 400-buck Philips TV (PUS8108); it outputs video and that's about it. Joking aside, it does seem to have FRC and a direct LED backlight. I was just wondering whether tech like HDR10+ and Dolby Vision has any use at all on a TV like this.

Not completely pointless, but not doing much either: you have a direct LED backlight but no local dimming, and HDR really needs that to work properly. If it's the 50" or 70" version of that model then HDR might look a bit better than SDR output would, but dark scenes will still look blown out.

If you have any of the other sizes of that model I'd disable HDR entirely, as SDR will probably look better overall. That's because they most likely use an ADS-IPS panel, which has low native contrast, so shadows/blacks will be very washed out in HDR due to the backlight being forced to max.
 

Whitecrow

Banned
I understand they are different things, but on a console/PS5 they go hand in hand, don't they? You can't output 4:4:4/RGB over a limited signal, I don't think. If I change HDMI Level to Limited on any of my displays it switches the output to 4:2:2 as well. Maybe I'm missing something here.
You definitely can.
In fact, PCs can output YCbCr 4:4:4, which is another way of encoding a limited-range signal.

Another thing is how displays process the source. LG OLEDs, for example, unless in PC mode, process the incoming signal as YCbCr 4:2:2, no matter what you throw at them.

Another issue might be the bandwidth and HDMI version. If there's not enough bandwidth, the console might compress the signal and use chroma subsampling to fit the resolution and fps requirements.

For example, with the Uncharted remasters: if you don't enable 120Hz on the console, it can output 4K full RGB at 60Hz.
But if you enable 120Hz, it switches to YCbCr 4:2:2.
 
Wow! Great thread with some great replies. I think I actually did manage to improve my HDR, because my PS5 HDR settings were off.

Turns out there's a world of difference between "15 clicks" and "20 clicks" on the HDR adjustment screens. Wild how vague the setting itself is.
 

Meicyn

Gold Member
Wow! Great thread with some great replies. I think I actually did manage to improve my HDR, because my PS5 HDR settings were off.

Turns out there's a world of difference between "15 clicks" and "20 clicks" on the HDR adjustment screens. Wild how vague the setting itself is.
What's really interesting is that Hogwarts Legacy is actually really useful for HDR calibration on the PS5, especially if you want hard numbers and want to play around with settings. The in-game default is based on the PS5 calibration you set, which lets you modify the game's HDR if desired without affecting the system-level settings. For example, it'll tell you that it's at 985 nits or whatever if you have it calibrated to 15 clicks.
 
On my Gigabyte M27QP (400 nits) PC monitor, HDR content looks very bad; however, on my smartphone, an S21 Ultra (up to 1500 nits), HDR gameplay videos on YouTube look amazing, much better than SDR for sure. I watched Alan Wake 2 gameplay in HDR as well and the picture quality was better than the SDR version: the contrast was no longer so washed out, there was more shadow detail, and the highlights (from the sun, for example) looked extremely realistic.

I'm planning to replace my PC monitor again in the near future, because HDR makes too big a difference. I saw only a few gameplay videos with visibly raised black levels in HDR (the Resident Evil 2 remake), but I think that should be fairly easy to fix.
 

Woggleman

Member
Cyberpunk used to have bad HDR, but they really improved it with the latest update. RDR2 still has terrible HDR, which makes no sense since they got it right with the latest GTA V, and I expect the same from GTA VI.

If done right it can look great.
 
On my Gigabyte M27QP (400 nits) PC monitor, HDR content looks very bad; however, on my smartphone, an S21 Ultra (up to 1500 nits), HDR gameplay videos on YouTube look amazing, much better than SDR for sure. I watched Alan Wake 2 gameplay in HDR as well and the picture quality was better than the SDR version: the contrast was no longer so washed out, there was more shadow detail, and the highlights (from the sun, for example) looked extremely realistic.

I'm planning to replace my PC monitor again in the near future, because HDR makes too big a difference. I saw only a few gameplay videos with visibly raised black levels in HDR (the Resident Evil 2 remake), but I think that should be fairly easy to fix.
If you are looking for a PC monitor with HDR, you need one that meets the HDR 1000 specification. "HDR 400" and "HDR 600" are joke specs which are basically the same as SDR.

On the TV side it's easier: these days almost all TVs of a high enough premium tier meet the 1000-nit minimum needed for good HDR, and both premium OLEDs and LCDs meet this criterion.

This thread smells of non-OLED owners.

HDR is amazing.
Imagine not knowing Mini-LED LCD TVs are out there blasting 2000+ nits in HDR nowadays. Mini-LED LCD TVs have vastly superior brightness for HDR and also can't burn in.
 
If you are looking for a PC monitor with HDR, you need one that meets the HDR 1000 specification. "HDR 400" and "HDR 600" are joke specs which are basically the same as SDR.

On the TV side it's easier: these days almost all TVs of a high enough premium tier meet the 1000-nit minimum needed for good HDR, and both premium OLEDs and LCDs meet this criterion.


Imagine not knowing Mini-LED LCD TVs are out there blasting 2000+ nits in HDR nowadays. Mini-LED LCD TVs have vastly superior brightness for HDR and also can't burn in.
Yes, my HDR monitor is a joke :p. However, my sister bought some cheap Sony TV with around 400 nits as well, and even on her TV HDR looks clearly better than SDR (not as much as on my S21 Ultra, but still). The contrast is more striking, I can see more detail in the shadows and a greater variety of colours. It seems that good dynamic tone mapping is the key when viewing HDR content on sub-1000-nit displays.
 

Killjoy-NL

Gold Member
Yes, my HDR monitor is a joke :p. However, my sister bought some cheap Sony TV with around 400 nits as well, and even on her TV HDR looks clearly better than SDR (not as much as on my S21 Ultra, but still). The contrast is more striking, I can see more detail in the shadows and a greater variety of colours. It seems that good dynamic tone mapping is the key when viewing HDR content on sub-1000-nit displays.
Apart from better contrast and detail in shadows, fog for example also seems to have much more volume.
There is quite a difference between SDR and HDR in a game like Horizon.

Also, I feel like games get more visual depth due to the added detail.
 

zeroluck

Member
10-bit SDR can look quite a bit better than 10-bit HDR between 1-100 nits; HDR is really made for high average brightness, which most OLEDs can't display.
 

Bojji

Member
10-bit SDR can look quite a bit better than 10-bit HDR between 1-100 nits; HDR is really made for high average brightness, which most OLEDs can't display.

Average? HDR is made for highlights, not for a full-screen 1000-nit torch.

You can have 0-nit blacks and a few hundred nits of highlights in the same frame, and that's the beauty of HDR; most OLEDs (plus some mini-LEDs) can do that.
 

zeroluck

Member
Average? HDR is made for highlights, not for a full-screen 1000-nit torch.

You can have 0-nit blacks and a few hundred nits of highlights in the same frame, and that's the beauty of HDR; most OLEDs (plus some mini-LEDs) can do that.
At the expense of less precision between 1-100 nits compared to SDR. Most people are attracted to those high-contrast areas in HDR, but there are downsides.
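You can actually put numbers on that trade-off with the PQ curve. The constants below are the published SMPTE ST 2084 ones; normalising to full-range 10-bit codes is a simplification (real video signals use limited-range code values):

```python
# SMPTE ST 2084 (PQ) inverse EOTF constants
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Luminance in nits -> normalised PQ signal value in [0, 1]."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

print(round(pq_encode(100) * 1023))  # ~520
# About half of all 10-bit PQ codes cover 0-100 nits, whereas 10-bit SDR
# spends all 1024 codes on that same range -- hence the precision gap.
```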
 

This old thread may be relevant here. It sounds like the set is auto-adjusting, causing a bit of an issue for some calibrations.

TL;DR: manually set 15 clicks/15 clicks on the HDR adjustment menu, according to Vincent.

For Cyberpunk at least, play around with the peak brightness setting in the in-game HDR adjustment menu.
The 15/15/0 clicks advice is not true as a blanket statement, but it is the standard for displays that tone map to 1000 nits. The right setting really depends on the capabilities of your display and how it tone maps or hard clips (e.g., HGIG).

If you don't have a QD-OLED, OLED, or top-end FALD LED from the last 4-5 years, then HDR is not going to look better than SDR. Cheap TVs with HDR usually only hit ~500 nits and either tone map the signal to fit into that window or hard clip. In many cases on low-end displays, content peak highlights are mapped to the display's peak capability and rolled down, bringing the brightness of the entire scene down with them.

Many HDR games are also mastered like movies, where the majority of the scene is 100-200 nits. If you don't have a capable display and want maximum brightness, you're better off just playing in SDR.
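A toy version of that roll-off behaviour, purely illustrative (this is not HGIG, which by design skips display-side tone mapping, nor any vendor's actual curve; the knee point is an arbitrary assumption):

```python
def rolloff_tonemap(nits: float, display_peak: float = 500.0,
                    knee: float = 0.75) -> float:
    """Pass luminance through unchanged below the knee, then compress
    everything above it so bright mastered content lands under the
    display's peak instead of hard clipping."""
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits
    overshoot = nits - knee_nits
    headroom = display_peak - knee_nits
    # Asymptotically approach the display peak, never exceeding it.
    return knee_nits + headroom * overshoot / (overshoot + headroom)

for n in (100, 375, 1000, 4000):
    print(n, "->", round(rolloff_tonemap(n)))  # 100, 375, 479, 496
```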

This is the thread people should be checking out for PS5 settings: https://www.reddit.com/r/PS5/s/BRHAXnP6RH
 

SHA

Member
HDR isn't cheap and it's still incomplete. The short answer: it's not worth all the propaganda that comes with it. You're on the "favouring vibrant colours" side, not the realistic, down-to-earth side, and the vibrant colours aren't real. It's your problem that you got used to them; deal with it or stick with conventional technology, because in tech's language it's always your fault.
 

King Dazzar

Member
HDR should look stunning and far better than SDR. I love it. But unfortunately it is subject to the content, the playback device and the TV, and to how you have all three set up.

The X900H isn't Sony's finest hour when it comes to TVs, but you should be able to get some fairly decent HDR from it. Keep it simple to start with. Stick with a console, like you're doing, with a PS5 for gaming, but choose a game which is straightforward to set up correctly. Something like Days Gone or The Last of Us remake would be good, as they don't use the PS5's system-level calibration and simply rely on your TV's tone mapping. That lets you check that you have the TV set correctly without the game and console causing you any issues. Just follow something like RTINGS' recommended settings to get pointed in the right direction and take it from there.

Good luck.
 

dcx4610

Member
Not to be mean, but if it looks worse, it's your TV or you not understanding what HDR is. HDR should always look better unless it is botched.

The idea is that you are getting more richness in your colour and your contrast/blacks, and certain portions of the screen can be lighter or darker depending on what is being displayed. To get proper HDR, you need an LED TV with full-array local dimming, or an OLED. Most people have software dimming or edge-lit dimming; that's just not going to be very impressive and isn't proper HDR.

If you do have one of those and still aren't impressed, I think it's just a matter of not understanding what is going on. If you are playing a scene at night, for instance, it's going to be way darker than SDR because it's trying to look like real life. If you go outside at night, the only light you are going to see comes from the moon or street lights. That's what HDR can replicate. In SDR you are getting a flat light level, and while you might be able to see more, you won't have the richness, deep blacks and stark contrast of HDR.
 

R6Rider

Gold Member
Always looks washed out on PS5 for me. Looks great on YouTube. Don't remember trying it on the Series X yet. I did tons of testing with it on PS5 and it never looked good.

The TV still has amazing black levels.

Lots of people don't even understand HDR. So many think HDR itself gives the deeper blacks and the more saturated colours.
 

Hoddi

Member
Yeah, I've manually set it to Limited on both my TV and PS5, figuring it would be best.
Might change it back to Auto after all then.
I think you guys are conflating these terms a bit. The difference between full and limited range RGB isn't about bandwidth limits or 8/10-bit processing, which are wholly separate from it. The reason this option exists is that TVs and monitors treat the black floor differently: monitors consider absolute black to be 0, while TVs consider it to be 16. What this essentially means is that if you output the full 0-255 range to a TV you will get black crush and, conversely, if you output the limited 16-235 range to a monitor you will get a washed-out image.

It seems counterintuitive, but if your console is outputting full range RGB then you'll want to set your TV to limited range RGB. Likewise, if your console is outputting limited range RGB then you want to set the TV to full range RGB. This is the way to get accurate SDR black level tracking on TVs without black crush or image washout, because these settings cancel one another out.
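A tiny numerical sketch of the two failure modes, levels only (real displays also apply gamma, which is ignored here):

```python
import numpy as np

def render(codes: np.ndarray, display_expects: str) -> np.ndarray:
    """Map 8-bit code values to relative light output (0..1) under the
    display's assumed signal range."""
    if display_expects == "full":
        return codes / 255.0                              # black at 0
    return np.clip((codes - 16) / (235 - 16), 0.0, 1.0)   # black at 16

codes = np.array([0, 8, 16, 128, 235, 255])
print(render(codes, "limited"))  # 0, 8 and 16 all crush to 0.0 -> black crush
print(render(codes, "full"))     # 16 renders at ~0.06 -> raised, greyish blacks
```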
 

Famipan

Member
RDR2 and Plague Tale 1 have very bad HDR implementations.
TLOU2 and Days Gone look amazing on an LG C1 with settings according to HDTVTest.
 