
God of War 2018 PS5 patch incoming

But doesn't HDR10 require 10-bit color? 8-bit is SDR, right?

Not really. The colour space is still BT.2020 instead of Rec.709 (the SDR space), so you get the "extra colours" (if they are even used), but technically the banding will be worse.

You get worse gradients from an 8-bit signal/panel than from a 10-bit one, but if you dither the 8-bit signal you get very comparable and often better results, because certain parts of the game's pipeline might have lower bit precision anyway (like a vignette drawn over the graphics). Sending out true 10-bit will show up that banding more, where 8-bit + dithering would mask it.
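To picture the dithering point, here's a tiny Python sketch (a toy example of noise dithering, not any game's actual pipeline):

```python
# Toy sketch of "8-bit + dithering": adding sub-step noise before quantising
# a smooth gradient turns hard banding steps into grain the eye averages out.
import numpy as np

gradient = np.linspace(0.0, 1.0, 3840)          # one smooth 0..1 scanline
banded = np.round(gradient * 255) / 255         # straight 8-bit quantisation
rng = np.random.default_rng(0)
noise = rng.uniform(-0.5, 0.5, gradient.size)   # +/- half a quantisation step
dithered = np.round(gradient * 255 + noise) / 255

print(banded[990:995])    # a run of identical values: a visible flat band
print(dithered[990:995])  # a mix of adjacent levels that averages out correctly
```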

Many (most) 4K TVs that are marketed as HDR TVs don't have true 10-bit panels; they have 8-bit panels and use something called FRC (Frame Rate Control) to rapidly flash different colour values to approximate the in-between ones they can't show, so the gradient appears smooth to our eyes.

The 8-bit panel can't show as many "shades" of a colour as a 10-bit panel, so it flashes the in-between colour values across alternating frames, giving the appearance of a 10-bit panel.
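If it helps, here's a rough illustration of the FRC idea in Python (a toy model only; real TV firmware is more sophisticated):

```python
# Toy FRC / temporal dithering: approximate a 10-bit value on an 8-bit panel
# by alternating between the two nearest 8-bit values over successive frames.

def frc_frame_value(target_10bit: int, frame_index: int) -> int:
    """Return the 8-bit value to flash on this frame for a 10-bit target."""
    base, frac = divmod(target_10bit, 4)  # 4 ten-bit steps per eight-bit step
    # Show the higher value on `frac` out of every 4 frames.
    return min(base + (1 if frame_index % 4 < frac else 0), 255)

# A 10-bit value of 514 sits between 8-bit levels 128 and 129:
frames = [frc_frame_value(514, i) for i in range(8)]
print(frames)                         # [129, 129, 128, 128, 129, 129, 128, 128]
print(sum(frames) / len(frames) * 4)  # 514.0 - the time-average hits the target
```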


edit - found this as well; the "best solution" answer starts out really technical, but the 2nd or 3rd paragraph explains it in a simple way: https://forums.tomshardware.com/threads/true-10bit-vs-8bit-frc.2937222/

If you want to know whether your TV has a true 10-bit panel, put your model number into the search here and do a Ctrl+F for "panel bit depth":


This is my understanding of it anyway, others please feel free to correct me if you have more knowledge on the subject.
 
Yeah, I guess it's like, for a 60 fps game, 30 of the frames render around 50% of the pixels of 2160p (or sometimes arranged differently, like half the horizontal pixels but all the vertical ones, so 1920x2160, or different schemes like that), and then the other 30 frames render the other pixels, so combining them looks like 2160p to the player's eye on screen. I think that's how I read it works.

But yeah, there can be motion artifacts in fast-moving scenes. Not perfect, but a solid performance saver.
Yeah, it's pretty much half the resolution, but done in a way that masks it pretty well. I agree it's a big performance saver, and when done right it looks really, really good. They made the right decision building the Pro around this technique, and I expected them to continue with it for the PS5... not sure if that's the path they're taking, though.
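For anyone who wants to see the idea in code, here's a minimal sketch of the checkerboard pattern (simplified; real implementations reconstruct the missing pixels with motion vectors and previous-frame data rather than a plain merge):

```python
# Minimal checkerboard sketch: each frame renders half the pixels in a
# checker pattern, and two consecutive frames together cover the full grid.
import numpy as np

H, W = 4, 8  # stand-in for 2160 x 3840

def checker_mask(parity: int) -> np.ndarray:
    """Boolean mask selecting half the pixels in a checkerboard pattern."""
    ys, xs = np.mgrid[0:H, 0:W]
    return (ys + xs) % 2 == parity

frame_a = np.where(checker_mask(0), 1.0, np.nan)  # pixels rendered this frame
frame_b = np.where(checker_mask(1), 2.0, np.nan)  # pixels rendered next frame

# Naive merge: every pixel is covered once the two frames are combined,
# which is why a static scene can look like full 2160p.
merged = np.where(np.isnan(frame_a), frame_b, frame_a)
print(np.isnan(merged).any())  # False: full coverage from two half-res frames
```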
 
lol. Maybe.

Just made this gif comparing the 30 fps PS4 version of Spider-Man with the 60 fps Pro version of Infamous. Looks like a massive difference.


I could be wrong, but I think maybe you just like the color palette more; Spider-Man is brighter. I don't think there's much of a difference at all in technical terms, the biggest thing being that Spider-Man is larger overall. Probably a bit better image quality on Spider-Man too; Insomniac's temporal injection was great last gen. But Infamous has really good image quality, honestly.
 
I thought 4K checkerboard was a sort of "upscale" way to run 4K, but actually at a lower resolution.

If they put 2160p, that is true 4K, so it doesn't make sense to me.
That means they are using checkerboarding to render to full 4K (full height, anyway) and then not doing another upscale. Checkerboarding, or other reconstruction methods that use local data and temporal information (previous frame data), is a lot more effective than standard upscaling.

I guess checkerboarded 1440p would be half the horizontal resolution at full height, then upscaled (through whichever technique they chose) to 4K.
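Quick pixel math for that guess (assuming exactly half the horizontal samples per frame):

```python
# Pixel counts for hypothetical checkerboarded 1440p upscaled to 4K.
per_frame = 1280 * 1440                     # half-horizontal 1440p samples
reconstructed = 2560 * 1440                 # two frames' worth combined
target_4k = 3840 * 2160
print(per_frame, reconstructed, target_4k)  # 1843200 3686400 8294400
print(target_4k / reconstructed)            # 2.25x left for the final upscale
```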
 
I have too many good games in my backlog, I'm never catching up.

 
Sweet. Just started playing the unpatched disc version 2 days ago. Looks glorious, and I forgot how great this game was.
 
Awesome! I was slow to get rolling on this and am curious how far along I am.
I just unlocked the gate to Jotunheim after flipping Tyr's chamber... how far am I?
 
What would they do for Wipeout: Omega Collection? I thought it was native 4K and locked 60 anyway; I guess they could downsample from above 4K? Maybe LOD tweaks. I haven't played it for an extended time in ages, though, so maybe there are some glaring issues I've forgotten about. Come to think of it, it probably does drop frames during explosions of alpha effects, since everything does.

I would love to see HDR added, but it sounds like too much extra work when the studio doesn't exist anymore, sadly.
In a native PS5 version they could add native 4K, 120fps, HDR, super short loading times, 3D audio, DualSense features.
 
That'll make a lot of people go back and play it, and then you have Valhalla, which made a lot of people look at Vikings in general differently.
 
Any idea how 3D audio works in God of War on PS5?
Will it just be stereo with headphones connected to the controller, or will it use the "3D audio" from the PS5 settings, since it had 3D audio on PS4?
 
I hope it's dynamic 4K. Days Gone went from CB to dynamic and it looks better.

Any idea how 3D audio works in God of War on PS5?
Will it just be stereo with headphones connected to the controller, or will it use the "3D audio" from the PS5 settings, since it had 3D audio on PS4?

It's the latter. Days Gone and Horizon have this as well and sound amazing through the Pulse. Pretty much every PS exclusive does.
 
Not really. The colour space is still BT.2020 instead of Rec.709 (the SDR space), so you get the "extra colours" (if they are even used), but technically the banding will be worse.

You get worse gradients from an 8-bit signal/panel than from a 10-bit one, but if you dither the 8-bit signal you get very comparable and often better results, because certain parts of the game's pipeline might have lower bit precision anyway (like a vignette drawn over the graphics). Sending out true 10-bit will show up that banding more, where 8-bit + dithering would mask it.

Many (most) 4K TVs that are marketed as HDR TVs don't have true 10-bit panels; they have 8-bit panels and use something called FRC (Frame Rate Control) to rapidly flash different colour values to approximate the in-between ones they can't show, so the gradient appears smooth to our eyes.

The 8-bit panel can't show as many "shades" of a colour as a 10-bit panel, so it flashes the in-between colour values across alternating frames, giving the appearance of a 10-bit panel.

edit - found this as well; the "best solution" answer starts out really technical, but the 2nd or 3rd paragraph explains it in a simple way: https://forums.tomshardware.com/threads/true-10bit-vs-8bit-frc.2937222/

If you want to know whether your TV has a true 10-bit panel, put your model number into the search here and do a Ctrl+F for "panel bit depth":

This is my understanding of it anyway, others please feel free to correct me if you have more knowledge on the subject.
Ok, thanks. I honestly just assumed the HDR10 standard required 10-bit color due to the name.

I also kinda assume Sony knows what they're doing by choosing to output 4K60 HDR 10-bit YUV 4:2:2 instead of 4K60 HDR 8-bit RGB 4:4:4 when using HDMI 2.0 out. There's gotta be some reason why.
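One likely reason is plain bandwidth. Back-of-envelope numbers (assuming the standard 4K60 timing of 4400 x 2250 pixels including blanking and HDMI 2.0's roughly 14.4 Gbps of usable data rate; note HDMI actually packs 4:2:2 into a fixed-size container, so treat these raw-bit figures as a rough comparison only):

```python
# Rough HDMI 2.0 bandwidth check for the output modes mentioned above.
pixel_clock = 4400 * 2250 * 60        # 594 MHz pixel clock for 4K60

def gbps(bits_per_pixel: int) -> float:
    return pixel_clock * bits_per_pixel / 1e9

print(gbps(24))  # 8-bit RGB 4:4:4    -> ~14.26 Gbps: just fits
print(gbps(30))  # 10-bit RGB 4:4:4   -> ~17.82 Gbps: too much for HDMI 2.0
print(gbps(20))  # 10-bit YCbCr 4:2:2 -> ~11.88 Gbps: fits with headroom
```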
 
Fucking DOPE. I tweeted Herman Hulst about this a few months ago... must be why he did it (I also asked for a Bloodborne 60FPS patch, you're welcome, guys).

 
Best news I've heard all week.

Honestly, I'd love to see the same treatment for Rockstar titles, which I haven't seen spoken of in here yet.
 
Ok, thanks. I honestly just assumed the HDR10 standard required 10-bit color due to the name.

I also kinda assume Sony knows what they're doing by choosing to output 4K60 HDR 10-bit YUV 4:2:2 instead of 4K60 HDR 8-bit RGB 4:4:4 when using HDMI 2.0 out. There's gotta be some reason why.
I miss your old avatar smh...
 
In a native PS5 version they could add native 4K, 120fps, HDR, super short loading times, 3D audio, DualSense features.

Oh ofc, I'm being smallbrained. Imagine having the Plasma Cannon on a two stage trigger! *Whiirrrrrrr-Boof!*

I thought it was already native 4K, so I watched this and learned that it IS native 4K when motion blur is off. It's dynamic, though, and could drop below 4K, but Tom from DF couldn't find a single drop in res after looking around the game for a day. So definitely locking that down would be great, and anyway, maybe people played on base PS4 and never saw it on Pro.

Shorter loading times would be great, especially for time trials, and 3D audio is always welcome if it's an improvement over the surround sound in the original release.
 
Fuck!!!
I played Jedi: Fallen Order on PS5 and finished it. One week later, next-gen update. Then I decided to finally play God of War and finished it. A week passed. Update! What the hell?!
What game should I finish next?
 

TLOU II is not far behind either, though I suspect they are holding the patch back till Factions II is ready. The motion matching used for the game's animations will probably need touching up, as they were designed to run at 30fps and may well look janky at a higher frame rate. Like the ravens around the towers in AC Valhalla when played at 60fps: the animation plays at 30fps and looks really odd compared with everything else in the game.
 
Ok, thanks. I honestly just assumed the HDR10 standard required 10-bit color due to the name.

I also kinda assume Sony knows what they're doing by choosing to output 4K60 HDR 10-bit YUV 4:2:2 instead of 4K60 HDR 8-bit RGB 4:4:4 when using HDMI 2.0 out. There's gotta be some reason why.

Definitely. Maybe it's because dithering takes processing power, and 4:4:4 vs 4:2:2 makes a big difference to how tiny text looks (like you'd see on a PC), but you don't see that on a console, so it's not nearly as much of a problem. If you turn on HDR on the PS5 and go to the browser you can probably see the text difference, but only there. There will be a coloured fringe around parts of the text, like the top line here; apologies if you knew this:

4:4:4 vs 4:2:2

[Images: text chroma comparison, 4K 60Hz PC mode (4:4:4) vs 4K 30Hz (4:2:2)]
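Here's a toy illustration of why 4:2:2 smears fine text (simplified; real converters filter the chroma more carefully):

```python
# Toy 4:2:2 sketch: chroma is stored at half the horizontal resolution,
# so one-pixel-wide colour detail (like thin text strokes) smears out.
import numpy as np

chroma = np.array([0, 255, 0, 255, 0, 255], dtype=float)  # 1-px colour stripes
pairs = chroma.reshape(-1, 2).mean(axis=1)  # keep one chroma sample per 2 px
restored = np.repeat(pairs, 2)              # stretch back to full width
print(restored)  # [127.5 127.5 ...] - the stripes collapse into a grey fringe
```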


The main commenter here talks about it and has a bit about consoles where he says you need a 10-bit signal because "consoles don't do dithering":



I also just realised that when you play games in HDR on PC it has to be exclusive fullscreen, so the game is setting its own output in that regard; it may well be 10-bit 4:2:2 anyway. I'm not sure how to tell.

I had read all the discussions from the first few pages of the Google search for "444 8-bit with dithering vs 422 10-bit" a year or more ago and decided it wasn't worth thinking about. But then Vincent Teoh's (HDTVtest) video on Spider-Man: MM, talking about the HDR "brightness range" (not sure what to call it) being different on HDMI 2.0 vs 2.1 (or I guess you can get it on 2.0 if you lower the res to 1080p so you have the bandwidth for full RGB), made me think about this stuff again, since I don't have control of the output on console. Here's the bit in the video I mean:




I'm on a TV that can display around 1600 nits, so 1100 vs 800 makes a big difference for me, even if you disregard the dynamic range compression/changes. So that's the main reason I would like to get RGB, by sacrificing 10-bit output.
 
Nice to see. Wish they'd update more of their old games for free. Best way to go, really.
 
Just installed it and was in the first 30 minutes of a new playthrough. This news couldn't have come at a better time. Well, I guess 30 minutes earlier, but whatever.
 