Digital Foundry: The Last of Us Part 2 PC Review - We're Disappointed - Analysis + Optimised Settings

No, you've heard multiple angles on Gaf, including people who said it was a hit and people who said it was a flop. Not that it changes the reality that it was a top seller on Steam.

Oh, I did. Why aren't you addressing the post? Please enlighten us on what you meant.

Alan Wake and BMW 2 aren't a graphical showcase. Now this is just trolling.


Since you won't do it this is what I actually said regarding the 5090

Where are all of these PC game showcases that make the most out of a 5090... they don't exist.

I said the 5090 didn't have games that made the most of it. Tell me how that's a slight on a video card that just came out a couple months ago.

You're a warrior, so anytime anyone says anything that you don't like, you think they're trying to be a warrior too.

The Witcher 4 will absolutely be a 5090 showcase, which is precisely why they used a 5090 in the trailer.

Even if we have different definitions of showcases, you saying that I was trashing the 5090 is straight up warring/trolling.
 
What the fuck does "make the most out of it" even mean?
 
Two more pictures without a single NPC in view... One of the pictures is 50% sky... jesus... and yes, they look muddy as hell.




I'm kinda glad you don't make threads anymore, fam.
 
that's sadly true, but I think having a crazy high end CPU helps things a bit
Don't get me wrong, I agree with you. Hogwarts Legacy with ray tracing looks great, but if it stutters even on a 7800X3D, there's no point talking about it. It may look great, but if you can't play it, it doesn't matter how it looks.
 
Played the first hour of the game. It's running fine for me with the latest drivers.

5800x3d
3080 10G
1440p with DLAA on and DLSS off
 
A few more 4K/DLAA Transformer model/Max Settings shots to share:
54427915927_77448699f4_o.png
I happened to have some PS5 (base) screenshots of the same scene as yours, so I'll post them here for comparison.

PS5 Performance Mode (1440p/60fps) *Compressed due to data size issues
zEFRblZA_o.png


4k/DLAA Transformer model/max settings
54428648961_f44c24ef06_o.png


PS5 Performance Mode (1440p/60fps)
wF7EpYs5_o.png


I haven't tried the PS5 Pro version yet.
 


Looks like it's running well here, so I don't know what Alex is crying about.

He was talking relative to the PS5. The 4080 is 2.5x faster than the PS5's GPU, yet here it isn't even running the game at twice the fps, so it's massively underperforming relative to what it can do. The regular PS5 runs this game at 4K40 (albeit at lower settings); here, the 4080 does 4K 60-70 when in other games it would be doing close to 100fps.

There's PC overhead to factor in, but it seems insanely high in this game. I can't think of another game whose PC performance drops as much as this one's. I'd need to do a like-for-like with matched settings and scenes to see the exact performance differential.
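For what it's worth, the argument above is simple arithmetic. A quick sketch, using the numbers quoted in the post (which are forum claims, not measured benchmarks):

```python
# Back-of-envelope check of the scaling claim above.
# All figures come from the post, not from benchmarks:
# PS5 ~4K40 (at lower settings), RTX 4080 claimed ~2.5x the PS5 GPU,
# observed 4K 60-70fps on the 4080.
ps5_fps = 40
gpu_ratio = 2.5                   # claimed 4080 vs PS5 GPU
expected = ps5_fps * gpu_ratio    # naive linear scaling -> 100 fps
observed = 65                     # midpoint of the reported 60-70
efficiency = observed / expected
print(f"expected ~{expected:.0f} fps, observed ~{observed} fps "
      f"({efficiency:.0%} of linear scaling)")
```

Even granting some API overhead on PC, roughly 65% of linear scaling is the gap being complained about here.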
 

Thanks! Definitely noticeably better on PC but PS5 still looks fantastic of course.
 
The game runs at 1080p30 on the PS4...
Yeah, so it doesn't require powerful hardware to run. 1080p30 is quite respectable, given that 1080p30 is exactly what the PS4 targeted.

You guys seem to be forgetting that this game is running on a Radeon HD 7850-class GPU, an anemic 1.6GHz tablet CPU, and a 5400 RPM HDD on a USB 2.0 interface. Even by 2012 PC standards, that wasn't exactly impressive. Here we are, over 11 years after the PS4's release, and an almost PS5-tier GPU can't run it at 1080p60. Yet Alex is the whiner. The fuck?
 

Stop it dude...

1080p30 is respectable now? But you're complaining about how this game performs on PC? Which is it?
 
Stop it dude...

1080p30 is respectable now?
It's respectable for a PS4, yes.
But you're complaining about how this game performs on PC? Which is it?
I'm complaining that a GPU that's 3.5x as powerful as a PS4 GPU can barely run the game at twice the performance. I'm not expecting it to scale linearly as it's a console game first, but this kind of performance delta hasn't been seen since like the 90s.
 

And it's been explained to you that Naughty Dog was able to get WAY more out of a PS4 by writing to metal and reducing overhead.

The hate boner here for the game because it doesn't sing on specific hardware is pretty telling.
 
And it's been explained to you that Naughty Dog was able to get WAY more out of a PS4 by writing to metal and reducing overhead.
No, it actually hasn't. We know perfectly well that a console should have a significant advantage, but this kind of performance differential is unheard of, even in other Sony exclusives. Coding to the metal certainly isn't the only explanation when it also applies to games like GoW, where we don't see such a disparity.
The hate boner here for the game because it doesn't sing on specific hardware is pretty telling.
Not hating on the game. I'm saying the performance is way way lower than what you'd expect on a PC.

It'd run a lot better if it was exactly the same as the PS4 version.
Except it wouldn't. Medium settings actually look worse than the PS4 version. Still runs badly on a 3060 relatively speaking.
 
On my 57" Samsung it's running at about 60fps with DLSS in Performance mode, with some hitches. I turned on Frame Gen and it's around 90-100 with a 4090. I've had one crash so far, during a cutscene. I did turn off the cinematic cutscenes so the game would fill the whole 32:9 screen; it does mention there will be some graphical issues with that, and I've seen a couple. The binoculars don't work well at 32:9 either.
 

I'm playing through the game and I've been very impressed with it on my PC, but you aren't wrong here either.


It's not the disaster that TLOU Part I was at launch on PC, but it should be performing better.


If nothing else, though, at least there aren't any major bugs, frametime issues, or stutter issues. And if you have modern mid-tier hardware you're still going to have a good experience.


But with that said, one would definitely have expected better performance on lower-end hardware.
 
This doesn't make sense. Are you running the game at 16K or something? A 4090 shouldn't be getting just 60fps at 4K DLSS Performance; it should easily clear 60fps at 4K DLAA.
 


Not all games are the same. TLOUP2 was one of the last major PS4 games that Sony made that wasn't a cross gen game. It can't be compared to God of War at all.

TLOUP2 was a feat of magic running on the PS4.

There isn't much value in going back and forth with you, but by all accounts this seems like a competent port that people are more than happy with. So I'll leave it at that.
 
Competent as in it doesn't crash and isn't full of bugs. Luckily, an RTX 3060 is a fairly pedestrian GPU these days, and most people have a GPU around that level or higher.

PS4 version never ran at 60FPS though so not a like for like comparison.
The point isn't a like for like because the 3060 and PS4 GPU aren't alike. One is far, far more powerful.

Anyway, I said my piece. Not the worst port ever, but kind of underwhelming.
 
None of that takes away from it still being one of the best-looking games today, which you'd expect to mean powerful hardware is required to run it.

If anything, that just shows the difference coding to the metal makes, and this is not a first. Both TLOU1 and GoW2, releasing at the end of their generations, also showed this.
 
If "coding to the metal" is a thing, it should be visible across all PS5-to-PC ports, not just some, at least among first-party offerings. Otherwise, the implication is that only a select few first-party titles (mainly TLOU) are "coded to the metal".

In other words, those first-party PS5 ports that do scale broadly as expected with PC hardware (the GoWs, the Horizons, GoT, etc.): are they not "coded to the metal"?

My view is that the TLOU II port being an Iron Galaxy production (and given their track record), with Nixxes only coming in later, suggests this is just an undercooked port. In the hands of a competent porting house, and given enough time and budget, this title could have been released in a better state.
 
This isn't coding to the metal. This is "we don't give a fuck about 2017 GPUs." You seriously think they discovered some secret that allowed them to make the PS4 perform 50% better after 7 years? Or is it more likely the GPUs contemporary to the PS4 are now 8-10 years old and most people don't use them anymore, so why bother with them?

Rift Apart, a native PS5 game shows nowhere near such a difference. Where is the coding to the metal there?
 
Those games far exceed anything else on their system and are delivered at the end of the gen once devs understand the hardware more.
That's why you don't see it with every game.

If it were that easy every game would be up to those extremely high standards.
 
They don't far exceed HFW, which, again, shows no such difference.

HZD's Original (PS4 settings) preset on PC is equivalent to Medium. Here, Forbidden West gets 80fps at 1080p Medium, 2.66x the PS4's performance, and that's assuming the PS4 version of HFW still runs the equivalent of Medium; if not, it might be closer to 3x. Again, the PS4 still does better than PC, relatively speaking, but not to the extent we see in TLOU II.



If I saw the 3060 running TLOUII at 80fps using PS4 settings, I wouldn't say a thing even though it should run it at 100fps+ based on specs.
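To put the comparison above into numbers (all figures are the ones quoted in this thread, not independent benchmarks; the 3.5x power ratio is the 3060-vs-PS4 figure used earlier):

```python
# Rough comparison of the scaling factors quoted in this thread.
# All numbers are forum claims, not measured benchmarks.
ports = {
    # title: (claimed GPU power ratio vs PS4, claimed fps ratio vs PS4)
    "Horizon Forbidden West": (3.5, 80 / 30),  # 3060 at 1080p Medium vs PS4 30fps
    "TLOU Part II": (3.5, 2.0),                # 3060 barely doubles the PS4
}
for title, (power, fps) in ports.items():
    print(f"{title}: {fps / power:.0%} of linear GPU scaling")
```

On these numbers, HFW retains roughly three quarters of linear scaling while TLOU II retains well under two thirds, which is the disparity the post is pointing at.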
 
This is where we disagree. I don't think they have placed a big enough bet. Like you said, the PC market is the only relatively unexplored space for them to expand. And they have been playing too safe. Release a GT or a slew of massively successful PS2/PS3/PS4 games and see how it goes. If that doesn't make a dent, then fall back to playing safe. Try a simultaneous release like Helldivers from one of their big studios and see how it goes. If it blows up, place more bets of that type. Otherwise, fall back.

What you call logical, I'm going to call safe. And just to annoy some people here, I'll call it borderline lazy :p

It's easy to dismiss me as "Oh what do you know about business that Sony doesn't", but what is a gaming forum if not to exchange ideas, even bad ones from fools like me?
I think you're looking at it incorrectly.

Your entire premise is based on the need for Sony to 'play it safe'.
But Sony isn't dying; they have the console market on lock. They own ~50% of the console market.
It isn't like MS where they are desperately trying to survive.

Which brings us to another point: Xbox dying.
This alone will increase PlayStation's market share, which will reduce the need for PlayStation to attract consumers from outside the console market, like PC.
At least short-term.

But my intention is not to write an entire essay or to lecture you.
The point is that you set up a target yourself and apply it to Sony as if it's theirs.

Edit:

Btw, everything I said is basically a summary of what Sony has been saying about their PC (and GaaS) strategy since Shawn Layden started talking about it back in 2019.
So it's not even me sharing my 'knowledge'; it's just me pointing out Sony's strategy as they've explained it themselves over the past 5-6 years.
 
Alex is a liar and full of crap. Last of Us Part II Remastered / RTX 5090 / 9800X3D / 4K native DLAA, max settings: 140-160fps (not running at 80fps at 4K DLSS Performance). DF is getting more and more useless for tech reviews of graphics with each new release.

 
I agree, the game looks incredible even without ray tracing,
but you have to see how much ray tracing and ray reconstruction add to the game. It is really insane. Sadly, our 3070s and 3080s just can't do it with transformer DLSS in this game: a 30% performance cost. That is why seeing the 5090 get 250+ fps at 4K DLSS with all those settings enabled, at only 60ms latency (nothing with a gamepad, really), is a TRUE SHOWCASE for me.
You get incredible motion clarity without relying on motion blur, highly playable latency, and incredible visuals.
If that is not a showcase, I don't know what is.

Zathalus this is hopefully what you meant? Back me up here.
It looks a bit soft on the PS5 Pro, maybe that is where the muddy comment comes from? I mentioned it previously, but when I booted up the game again with DLSS 4 and the new graphical options I was floored by how detailed the game actually was, Textures really popped, it seems PSSR did the game no favours on the Pro. That being said, I still thought it was quite the pretty game on the Pro.
 
If anything that just shows the difference coding to the metal makes, and this is not a first. Both TLOU1 and GoW2 releasing at the end of their generations also showed this.
It's not coding to the metal. The game isn't written in assembly. Sony uses a low-level API, but it is still an API. DX12 and Vulkan are both low-level APIs as well, just with slightly more abstraction due to the nature of PCs. So there is indeed a performance benefit to using the PS API, but you'd generally see it in things like draw calls or better memory management, not the massive gap in performance you see here. No other game, even other PS releases, shows a gap like this.

It should be obvious, but you won't get +100% performance from a mere API. You would get such a difference if your engine is highly optimised around a specific configuration, say a unified memory architecture, and you don't go through the effort of retooling it for PCs and the split memory architecture they employ. Or if you heavily abstract away something that targeted a specific configuration on the PS4, because you don't want to spend the time and money redoing an entire section of the engine to work nicely with PC GPUs.
 
You're overlooking that there is a massive gap, even against games on the same platform.
 
There is oftentimes a gap, but when a game engine has been designed to be flexible enough for both platforms, it performs close enough on both. The reverse is true as well: games that are not designed around consoles' limited memory and asset decompression suffer in performance relative to PC. You usually only see that with AA games or late ports, as optimising for console with a AAA budget is usually easy enough.
 
I don't think there's really any excuse to be made for this. This isn't a PS3 game; it's a PS5 game, and the PS5 is x86, just a cheaper APU by AMD. It's just a PC.

How can it run well on a small console PC but not on a fully high-end PC?

Something doesn't add up.
 
The missing blood/gore effects are really disappointing. I definitely won't buy this shit.



I think Sony has been overworking Nixxes lately, or trying to get the original devs to do more of the porting work and only letting Nixxes finish.

There are also, to this day, shaders missing from Uncharted 4 on PC. That wasn't Nixxes, though, so it seems to be a recurring issue that not everything makes it over from the PlayStation codebase.
 