Digital Foundry: Metal Gear Solid Delta PC - The Best Way To Play - DF Tech Review + Optimised Settings

lol I don't know what you've been reading, but performance really isn't as bad as you think it is, unless you want to play at 4K or something.

On my 3700X / 32GB RAM / RTX 3070 combo, the game runs at 50-60fps at 1080p Ultra (except shadows on High) with DLSS on Quality.
In fact, if I change my desktop to 50Hz, the game is pretty much locked at 50fps, and I'm about 4 hours in.

By the way, it's the shadow setting that is the big performance hog here, NOT the global illumination setting, which controls all the RT stuff. I'm not certain why the shadow setting is so demanding, but put that on High and everything else on Ultra and you will be more than fine.
I mean, I could play it on my 1440p monitor, but I want to play it on my big-screen 4K TV as well.
 
I tried out enabling Smooth Motion from the Nvidia app and I can't recommend it. While it did bump my FPS counter up to 90-100 fps at 4K Performance mode with Ultra settings, it actually appeared to increase stuttering, and there were some really nasty artifacts whenever I turned the camera.

Aside from forcing DLSS 4 and driver vsync, I think I'll just leave this one alone and accept the 60 fps.
 
It isn't a great-looking game, to be fair, especially taking into account how small the areas are.

They could have done a lot better, although it's probably a good job they didn't, given how demanding this version already is.
It sounds like games aren't exactly forward compatible, which means you can't build the same universe on newer hardware.
 
You posted in the wrong thread, here


Episode 7 Wow GIF by Wrexham AFC
 
Sony allowed this game to perform like this on Pro. They could have rejected it and forced Konami to fix it.

VRR makes the game playable on Xbox.
You're actually the one being irrational here. Sony could've stepped in at any time to look out for their loyal customers who spent $800 on their "Pro" console. It's THEIR PLATFORM, their store; they're supposed to have a quality control process.

MGS Delta was a significant part of the Pro's advertising too, when you think about it. Even though it was still a year out at the Pro's launch, it was one of the first games shown as "Pro Enhanced" on the Sony store. Sony and Konami had obviously communicated with each other. None of this happens in a vacuum.
 
On my 3060 / 5700X, I had to stick with the AMD frame gen plugin. The game can hit 40fps, so this helps mask those jarring, uncomfortable segments somewhat. Haven't noticed any real input latency yet. I do have Reflex and Ultra Low Latency forced anyway. What stopped my inconsistent frame pacing with FG was setting my Sony TV to 100Hz instead of 120.

DLSS is on Quality. So around 820p internal, I believe. With the new Transformer model.

Looks really good, to be honest. My TV is scaling the 1080p output image to 4K, as this looks better than using GPU scaling via control panel/Profile Inspector.

4K output with a custom DLSS to the same 820p causes my fan to go nuts, and probably hits the bandwidth ceiling as it performs worse.
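For anyone wanting to sanity-check internal resolutions, here's a rough sketch of the math, assuming the standard DLSS per-axis scale factors (Quality 66.7%, Balanced 58%, Performance 50%); a custom ratio, like whatever is giving 820p here, will obviously differ:

# Rough internal-resolution helper for DLSS presets (stock per-axis scale factors assumed).
SCALE = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 0.333}

def internal_res(out_w, out_h, preset):
    s = SCALE[preset]
    return round(out_w * s), round(out_h * s)

print(internal_res(1920, 1080, "Quality"))      # (1281, 720)  -> 720p internal at 1080p output
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080) -> 1080p internal at 4K output

By that table, stock Quality at 1080p output would be 720p internal, so the 820p figure implies a custom ratio of roughly 0.76 rather than the stock preset.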

VRR is thankfully working as it should. But, being a Sony TV, it doesn't always play nice with Nvidia cards. Every few days I need to restart the TV for it to actually bloody work correctly.
 
His own quote calls the Xbox version the only OK console version; the lad self-owned himself and didn't even realize it 🤭




It shouldn't be true at all.

Pro should never be worse than base.
To be fair, the Pro isn't worse. Quality mode runs at quite a bit higher fps than quality mode on the base PS5. The problem is that it just offers a single mode on the platform, instead of cutting back some settings to achieve a more stable fps (if a graphics setting is the cause). Some developers smartly cut out meaningless graphical details to achieve 60 fps on the Pro with the base PS5's quality settings (see Kingdom Come: Deliverance 2).
 
So you need a $300 CPU and a $1000 GPU? That's a no from me.
A 9060 XT will run this great, as will a 5060 Ti. The 9060 XT can be found for $250, and the 16GB version is $350. You also don't need a $300 CPU; you clearly don't know what you're talking about. The 3600 is 6 years old, and even when Zen 2 was new in 2019 it wasn't a great gaming CPU. Alder Lake shreds Zen 2, and those are old CPUs from 4 years ago that can be found for dirt-cheap prices (the 12400F is $114), and there are much cheaper versions of Alder Lake.

PC hardware discussion requires knowledge of PC hardware; without it you're just making stuff up. The market has hundreds of CPUs and GPUs to choose from, and you think some weak-sauce 6-year-old CPU that got bodied by CPUs years older than it at launch was the standard of PC gaming? The reason DF uses it is that the consoles use AMD Zen 2 CPUs, and the 3600, while a bit faster, is closer to their performance than other *common* Zen 2 desktop CPUs.
 
I have a 3600 lol.
It still has drops on a 4070 while using DLSS at only 60fps; that's not good imo, not when it looks the way it does.
I was watching the video and it showed a 5080; that's why I mentioned the $1000 card.
You have a 3600? Alright, well, whenever you do upgrade you're gonna see a significant difference.
 
It's a shame how many people were fooled into buying Zen 2 CPUs like the 3600 by retarded youtubers who look at all the wrong things when reviewing CPUs. I remember being in the market around that time, and everyone was shitting on Intel's 11th-gen CPUs for being power hungry and running hot, while praising the Zen 2 CPUs for running cooler and being less power hungry. They were actually more expensive too. I only ended up going with the Intel i7-11700K because it was $50 cheaper, and $100 cheaper with the mobo. And yet it's held up far better than these trashy Zen 2 CPUs.

It started around Starfield, when people were struggling to hit 60 fps in cities while I'm like, nah, I'm good. Then path tracing dropped in Cyberpunk soon after and people were shocked to see my 3080 run it at 60 fps.

I have had zero issues running CPU-heavy games like Gotham Knights, Starfield, and Space Marine 2, and even the ones that perform poorly in cities, like Jedi: Survivor and Dragon's Dogma 2, ran at a perfect 60 fps with all RT effects enabled during gameplay and in the open world.

I mean, look at how the 8700K is outperforming 5000-series CPUs in Starfield. This was 2 years ago. And people got upset at Todd Howard for telling them to upgrade their CPUs.

HSYVNTi.png
 
A 9060 XT will run this great, as will a 5060 Ti. The 9060 XT can be found for $250, and the 16GB version is $350. You also don't need a $300 CPU; you clearly don't know what you're talking about. The 3600 is 6 years old, and even when Zen 2 was new in 2019 it wasn't a great gaming CPU. Alder Lake shreds Zen 2, and those are old CPUs from 4 years ago that can be found for dirt-cheap prices (the 12400F is $114), and there are much cheaper versions of Alder Lake.

PC hardware discussion requires knowledge of PC hardware; without it you're just making stuff up. The market has hundreds of CPUs and GPUs to choose from, and you think some weak-sauce 6-year-old CPU that got bodied by CPUs years older than it at launch was the standard of PC gaming? The reason DF uses it is that the consoles use AMD Zen 2 CPUs, and the 3600, while a bit faster, is closer to their performance than other *common* Zen 2 desktop CPUs.
I bought the 3600 back in 2020; it's been mostly fine. A 5700X3D would cost as much as a 9600X, and if I got the 9600X I would need to get new RAM/mobo and possibly a PSU. My next build will be with a new mobo and so on, but not yet.
I have a 9060XT 16GB.
 
It's a shame how many people were fooled into buying Zen 2 CPUs like the 3600 by retarded youtubers who look at all the wrong things when reviewing CPUs. I remember being in the market around that time, and everyone was shitting on Intel's 11th-gen CPUs for being power hungry and running hot, while praising the Zen 2 CPUs for running cooler and being less power hungry. They were actually more expensive too. I only ended up going with the Intel i7-11700K because it was $50 cheaper, and $100 cheaper with the mobo. And yet it's held up far better than these trashy Zen 2 CPUs.

It started around Starfield, when people were struggling to hit 60 fps in cities while I'm like, nah, I'm good. Then path tracing dropped in Cyberpunk soon after and people were shocked to see my 3080 run it at 60 fps.

I have had zero issues running CPU-heavy games like Gotham Knights, Starfield, and Space Marine 2, and even the ones that perform poorly in cities, like Jedi: Survivor and Dragon's Dogma 2, ran at a perfect 60 fps with all RT effects enabled during gameplay and in the open world.

I mean, look at how the 8700K is outperforming 5000-series CPUs in Starfield. This was 2 years ago. And people got upset at Todd Howard for telling them to upgrade their CPUs.

HSYVNTi.png

I agree that the 3600 is weak (I had it in 2019), but Starfield performs unlike 90% of other games. It loves Intel CPUs.

SEAzKt6T1tP76ygG.jpg


The 5xxx series from AMD is fine: good memory latency, unified 8-core CCXs. The 3xxx series is fucked thanks to split L3 pools and 4-core CCXs.
 
I agree that the 3600 is weak (I had it in 2019), but Starfield performs unlike 90% of other games. It loves Intel CPUs.

SEAzKt6T1tP76ygG.jpg


The 5xxx series from AMD is fine: good memory latency, unified 8-core CCXs. The 3xxx series is fucked thanks to split L3 pools and 4-core CCXs.

He's absolutely delusional with that post.

And AM4 is the best CPU socket that's ever existed.
 
I agree that the 3600 is weak (I had it in 2019), but Starfield performs unlike 90% of other games. It loves Intel CPUs.

SEAzKt6T1tP76ygG.jpg


The 5xxx series from AMD is fine: good memory latency, unified 8-core CCXs. The 3xxx series is fucked thanks to split L3 pools and 4-core CCXs.
Notice those are last-gen games.

Zen 2 CPUs have also held back every single RT game, or any CPU-heavy game in general.

Starfield ran fine on the 7000 series thanks to the higher clocks and other Zen 4 enhancements. Zen 2 just sucked dick for next-gen games. It's why the consoles struggle with UE5, which is reliant on single-threaded performance, and those low clocks don't help. Meanwhile my i7-11700K goes up to 5.1 GHz when needed. But it consumes 150 watts, which to youtubers was a cardinal sin.

moz1aMb.png
 
Notice those are last-gen games.

Zen 2 CPUs have also held back every single RT game, or any CPU-heavy game in general.

Starfield ran fine on the 7000 series thanks to the higher clocks and other Zen 4 enhancements. Zen 2 just sucked dick for next-gen games.

moz1aMb.png
Starfield looks and runs like ass.
 
Starfield looks and runs like ass.
Nope, it looked stunning and ran fine on modern CPUs, as you can see from those Zen 4 and 11th-gen Intel benchmarks. It's not Bethesda's fault that AMD was two gens behind Intel. Upgrade your CPU and run it at max settings. Again, it's not the only game that had issues: Gotham Knights, Space Marine 2, Cyberpunk RT, the Spider-Man PC ports, and any CPU-heavy game had issues with Zen 2 CPUs.

The procedurally generated planets looked like garbage, yes, but you don't really explore them in this game. The real game is set mostly indoors, and those hand-crafted levels are beautiful, with great lighting and exceptional asset quality.

bnE0shL.gif


qt0KJyN.gif


oTOj0r3.gif
 
Nope, it looked stunning and ran fine on modern CPUs, as you can see from those Zen 4 and 11th-gen Intel benchmarks. It's not Bethesda's fault that AMD was two gens behind Intel. Upgrade your CPU and run it at max settings. Again, it's not the only game that had issues: Gotham Knights, Space Marine 2, Cyberpunk RT, the Spider-Man PC ports, and any CPU-heavy game had issues with Zen 2 CPUs.

The procedurally generated planets looked like garbage, yes, but you don't really explore them in this game. The real game is set mostly indoors, and those hand-crafted levels are beautiful, with great lighting and exceptional asset quality.

bnE0shL.gif


qt0KJyN.gif


oTOj0r3.gif
7800X3D was a $450 CPU and all it could do in that bench was 90fps? Nahhh.
There's nothing special or next gen about Starfield. The only thing next gen about it is it didn't launch on last gen Xbox.
 
7800X3D was a $450 CPU and all it could do in that bench was 90fps? Nahhh.
There's nothing special or next gen about Starfield. The only thing next gen about it is it didn't launch on last gen Xbox.

You could argue it would be badly suited to last gen, given it has so much loading that it would be insanely annoying to play on a last-gen console.

Imagine if every loading transition took 20 seconds 😭
 
Notice those are last-gen games.

Zen 2 CPUs have also held back every single RT game, or any CPU-heavy game in general.

Starfield ran fine on the 7000 series thanks to the higher clocks and other Zen 4 enhancements. Zen 2 just sucked dick for next-gen games. It's why the consoles struggle with UE5, which is reliant on single-threaded performance, and those low clocks don't help. Meanwhile my i7-11700K goes up to 5.1 GHz when needed. But it consumes 150 watts, which to youtubers was a cardinal sin.

moz1aMb.png

Modern game:

z9kL6Zo2UGiAeMCm.jpg


11900k = 5700X

Nothing changed since launch. There are always outlier games for AMD and Intel CPUs that perform outside of the norm (like Starfield).

On consoles, smart developers can avoid high latency between CCXs; they can code games to use specific threads. That's one of the reasons the PS5/XSX sometimes outperforms the 3600. On PC with Zen 1/2 you are fucked.
 
7800X3D was a $450 CPU and all it could do in that bench was 90fps? Nahhh.
There's nothing special or next gen about Starfield. The only thing next gen about it is it didn't launch on last gen Xbox.
It's literally running a real-time GI solution like Lumen, something virtually no Sony first-party game has done all gen. These things cost resources, both GPU and CPU.
 
Modern game:

z9kL6Zo2UGiAeMCm.jpg


11900k = 5700X

Nothing changed since launch. There are always outlier games for AMD and Intel CPUs that perform outside of the norm (like Starfield).

On consoles, smart developers can avoid high latency between CCXs; they can code games to use specific threads. That's one of the reasons the PS5/XSX sometimes outperforms the 3600. On PC with Zen 1/2 you are fucked.
Not really that impressive. The 3700X was equivalent to my i7-11700K, selling for a $50 premium like I mentioned thanks to the good reviews, and it is well behind even the 10700K in this game, let alone the 11700K.

And BF6 isn't even pushing any RT or next-gen features like Lumen or Nanite. It's CPU-heavy, but not as heavy as the other games I've mentioned. Turn on RT and watch the Zen 2 performance tank.

On consoles, Cerny could've increased the clocks on the Pro, and I am 100% sure you wouldn't be seeing these massive drops to 30 fps. I literally went around blowing up barrels in my game on both my 3080 and 5080 and saw no drops, sudden or sustained, and virtually no spikes in GPU usage either. I don't blame him for going with Zen 2 in 2020, but in 2024 there were much better options available. He could've gone with Zen 4 or higher clocks.

On PCs, I'm simply asking people to upgrade their 5-year-old CPUs instead of stubbornly handicapping their modern GPUs with shitty CPUs.
 
And yet it looks like that.
Looks beautiful, but yes, it could always look like this:

Gufwy8IWMAARW5U


u2x1xBL.jpeg


I am glad some devs spent GPU and CPU cycles on rendering next gen assets and lighting. If you want rocks that look like this, you will have to invest in a better GPU and CPU.

HMMadcKoLBPQg7f1.jpeg
 
Looks beautiful, but yes, it could always look like this:

Gufwy8IWMAARW5U


u2x1xBL.jpeg


I am glad some devs spent GPU and CPU cycles on rendering next gen assets and lighting. If you want rocks that look like this, you will have to invest in a better GPU and CPU.

HMMadcKoLBPQg7f1.jpeg

Decima needs RTGI. Shocking they didn't make that upgrade for Forbidden West.
 
Not really that impressive. The 3700X was equivalent to my i7-11700K, selling for a $50 premium like I mentioned thanks to the good reviews, and it is well behind even the 10700K in this game, let alone the 11700K.

And BF6 isn't even pushing any RT or next-gen features like Lumen or Nanite. It's CPU-heavy, but not as heavy as the other games I've mentioned. Turn on RT and watch the Zen 2 performance tank.

On consoles, Cerny could've increased the clocks on the Pro, and I am 100% sure you wouldn't be seeing these massive drops to 30 fps. I literally went around blowing up barrels in my game on both my 3080 and 5080 and saw no drops, sudden or sustained, and virtually no spikes in GPU usage either. I don't blame him for going with Zen 2 in 2020, but in 2024 there were much better options available. He could've gone with Zen 4 or higher clocks.

On PCs, I'm simply asking people to upgrade their 5-year-old CPUs instead of stubbornly handicapping their modern GPUs with shitty CPUs.

Yeah, Zen 2 is cooked. It was a bad architecture from the start, worse than Intel's 10th and 11th series. People sticking with it instead of getting the 5xxx or 5xxx3D series (or just upgrading platforms) are really stubborn...

I kept posting graphs because you posted that Starfield chart where even Zen 3 takes a beating. The Starfield engine is just some ancient tech with modern features on top; it's fucked and doesn't represent performance in modern games.
 
Sony allowed this game to perform like this on Pro. They could have rejected it and forced Konami to fix it.

Sony could've stepped in at any time to look out for their loyal customers who spent $800 on their "Pro" console. It's THEIR PLATFORM, their store; they're supposed to have a quality control process.
You are both saying the same thing lmao.

Cerny needs to be involved. It's his decision to go with a custom AI upscaling solution instead of FSR4. FSR4 actually looks better than DLSS 3 nowadays. Even at 720p internal it would've looked great if it actually fucking worked as advertised. And MGS3 isn't the only game; this has been happening all year, since launch in November of last year. His current guidance should be to not use PSSR with any kind of real-time GI solution, since PSSR can't handle denoising at the moment. PSSR works beautifully in other games with baked lighting, especially if the resolution is 1440p. But anything below 1080p and anything with dynamic lighting elements should be a big no-no, and that needs to come from Sony.

I will take some screenshots of the game running at 720p internal resolution (1440p DLSS Performance), and I can promise you it will look better than the IQ Cerny's solution is putting out even at its higher resolutions, which can top out at 1152p.
 
Decima needs RTGI. Shocking they didn't make that upgrade for Forbidden West.
It's an engine built primarily for console. I doubt there is any room for an RTGI solution on their engine that can still look good on base PS5. And without it working on the primary install base, there's no point investing in it. I'd be absolutely SHOCKED if the next game doesn't have it, assuming it's on the PS6. Even if it's cross gen, they'd probably implement it for next gen and fall back to baked lighting for PS5/Pro. And the game following that should be rid of baked lighting altogether
 
Decima needs RTGI. Shocking they didn't make that upgrade for Forbidden West.
They also need better geometry, better materials, and better asset quality in general. People just don't realize how detailed Starfield and Nanite assets can be.

Having RTGI would help with some indirect lighting and proper shadowing and AO, but it would still feel half-baked, like Doom and Indy do in their non-path-traced modes even on PC. You need both Lumen and Nanite, RTGI and higher-fidelity assets, for that next-gen look.
 
It's an engine built primarily for console. I doubt there is any room for an RTGI solution on their engine that can still look good on base PS5. And without it working on the primary install base, there's no point investing in it. I'd be absolutely SHOCKED if the next game doesn't have it, assuming it's on the PS6. Even if it's cross gen, they'd probably implement it for next gen and fall back to baked lighting for PS5/Pro. And the game following that should be rid of baked lighting altogether
Avatar has RTGI, reflections, and shadows on consoles. So does AC Shadows. Both have high-resolution 30 fps modes and somewhat decent-resolution 40 fps modes. It's the 60 fps modes where these games have to drop to 720p and still fail to hold a consistent 60 fps.

It's the same issue with Frostbite, Northlight, and other engines like RE Engine and Square's in-house engine for FF16. Those games aren't even using RTGI. Next-gen graphics are just expensive. And Decima is pushing last-gen graphics: the best of last gen, yes, and they can still look great like RDR2 and TLOU2 still do at times, but last gen nonetheless.

I think forcing 60 fps on these 5-year-old $399-499 consoles was simply a bad idea. It should've been left to PCs or Pro consoles, but Cerny fucked that up too.
 
Avatar has RTGI, reflections, and shadows on consoles. So does AC Shadows. Both have high-resolution 30 fps modes and somewhat decent-resolution 40 fps modes. It's the 60 fps modes where these games have to drop to 720p and still fail to hold a consistent 60 fps.

It's the same issue with Frostbite, Northlight, and other engines like RE Engine and Square's in-house engine for FF16. Those games aren't even using RTGI. Next-gen graphics are just expensive. And Decima is pushing last-gen graphics: the best of last gen, yes, and they can still look great like RDR2 and TLOU2 still do at times, but last gen nonetheless.
That's what I'm getting at as well, just from a different angle. Pretty much all Playstation Studios try to maintain a consistent look across all modes. And if there is a performance mode and a quality mode, they can't look a generation apart. That's what happened with AC shadows. And the resolution drops way too low for Avatar at 60. Unlike PS Studio games, those games were not really built with base consoles as target hardware. So it makes sense for them to invest in tech that goes above and beyond.

I think forcing 60 fps on these 5-year-old $399-499 consoles was simply a bad idea. It should've been left to PCs or Pro consoles, but Cerny fucked that up too.

If their metrics say a majority play on 60 fps, then that's the mode that gets development priority. Didn't Cerny say it was something like 60 or 70% preferred 60 fps? But I agree with you, these have become unrealistic goals and should not be forced, if the goal was to do anything on the RT front.

The base console was unfortunately designed at a time when AMD/RT hardware was still well behind the curve. And the pro console is stuck behind cost considerations and the inability to make actual architectural changes for a mid-gen refresh that removes bottlenecks. That system is unbalanced as fuck. Both have turned out to be quite disappointing as far as RT output is concerned.

Sad as that makes me, it gives something to look forward to for next generation. Actual ray tracing that makes a real difference. There are AMD patents on hardware accelerated nanite as well. We may even get path tracing for a 30 fps quality mode a few years in.

I'm just finishing up the main story of HFW and that large machine battle scene was absolutely spectacular on the pro. Whatever adjustments they've made to the lighting and shadows, along with their upscaling... No pop-in or aliasing whatsoever. I kept wondering if it was pre-rendered CG. First time that I was genuinely awestruck. I can only imagine what it would be for a game to look like that the entire time, during gameplay, at any angle. Next generation could easily deliver that.

All this is assuming Sony wants to catch up to cutting edge tech. Not seeing a reason why they wouldn't.
 
That's what I'm getting at as well, just from a different angle. Pretty much all Playstation Studios try to maintain a consistent look across all modes. And if there is a performance mode and a quality mode, they can't look a generation apart. That's what happened with AC shadows. And the resolution drops way too low for Avatar at 60. Unlike PS Studio games, those games were not really built with base consoles as target hardware. So it makes sense for them to invest in tech that goes above and beyond.



If their metrics say a majority play on 60 fps, then that's the mode that gets development priority. Didn't Cerny say it was something like 60 or 70% preferred 60 fps? But I agree with you, these have become unrealistic goals and should not be forced, if the goal was to do anything on the RT front.

The base console was unfortunately designed at a time when AMD/RT hardware was still well behind the curve. And the pro console is stuck behind cost considerations and the inability to make actual architectural changes for a mid-gen refresh that removes bottlenecks. That system is unbalanced as fuck. Both have turned out to be quite disappointing as far as RT output is concerned.

Sad as that makes me, it gives something to look forward to for next generation. Actual ray tracing that makes a real difference. There are AMD patents on hardware accelerated nanite as well. We may even get path tracing for a 30 fps quality mode a few years in.

I'm just finishing up the main story of HFW and that large machine battle scene was absolutely spectacular on the pro. Whatever adjustments they've made to the lighting and shadows, along with their upscaling... No pop-in or aliasing whatsoever. I kept wondering if it was pre-rendered CG. First time that I was genuinely awestruck. I can only imagine what it would be for a game to look like that the entire time, during gameplay, at any angle. Next generation could easily deliver that.

All this is assuming Sony wants to catch up to cutting edge tech. Not seeing a reason why they wouldn't.
That large machine battle scene pissed me the fuck off. They famously said that the PS4 didn't hold them back, and when they finally released the game, they made the best scene a fucking non-playable cutscene. Like, isn't the point of video games PLAYING these epic battles? Instead, you just watch machines duking it out while Aloy runs past everyone, and you never get control. The game then ends on two underwhelming 1v1 fights like a PS2-era game. Yay? Seriously, if I was the mother of the HFW devs, I would ground them for weeks for making that sequence a cutscene.

I think people look at data and get things all wrong. Cerny is reading that 75% figure and misunderstanding it, because here is another data point: 100% of gamers last gen played at 30 fps with no issues. Besides, virtually every single game for the first 2 years of this gen was a last-gen game that ran just fine at high resolutions and 60 fps. Of course people played those games at 60 fps. But will people buy a next-gen console for $500, pay a $10 surcharge for a $70 game, and then play a mode without RTGI? How could he have a data point on those games two years ago when he was designing the Pro? They didn't exist.

I actually think both consoles are fine 30 fps machines. Nanite, Lumen, and RTGI all work fine in games and run at 1440p internal resolutions, which look almost like native 4K even with FSR, TSR, or checkerboarding. In fact, some UE5 games actually go over 1440p to 1512p, and Avatar can go up to 1800p before being reconstructed to 4K in the 30 fps modes. They are basically offering 2x the resolution of 1080p PS4 games, which is what the PS4 did coming from the PS3's 720p. And they are offering RT and next-gen features like Nanite. These consoles are great machines for the price; the Pro not so much, but that's on Cerny.
 
That large machine battle scene pissed me the fuck off. They famously said that the PS4 didn't hold them back, and when they finally released the game, they made the best scene a fucking non-playable cutscene. Like, isn't the point of video games PLAYING these epic battles? Instead, you just watch machines duking it out while Aloy runs past everyone, and you never get control. The game then ends on two underwhelming 1v1 fights like a PS2-era game. Yay? Seriously, if I was the mother of the HFW devs, I would ground them for weeks for making that sequence a cutscene.
LMAO! I feel ya. Same with the flying. Such a wasted opportunity. The traversal speed was atrocious. The cross gen shackles were so obvious.

Speaking of wasted opportunities, the other WTF moment was when they didn't even show Ted Faro. That whole mission was giving me blue balls with the continuous teasing. And ended up being.... nothing? I thought we were going full resident evil as a surprise mission.

Agree with the rest. There is certainly a misalignment in priorities between the devs and what graphics focused gamers want. How many of us are around for them to care though? Monster Hunter Wilds has given me a whole different perspective on this. Most people don't give a shit or even have an eye for it. They look at a puddle with SSR and go "Wow, look at all the ray tracing!"

I'm hoping this won't even be a conversation next gen and everyone can choose the mode that pleases them, because the hardware wouldn't pose a significant design constraint.
 
LMAO! I feel ya. Same with the flying. Such a wasted opportunity. The traversal speed was atrocious. The cross gen shackles were so obvious.

Speaking of wasted opportunities, the other WTF moment was when they didn't even show Ted Faro. That whole mission was giving me blue balls with the continuous teasing. And ended up being.... nothing? I thought we were going full resident evil as a surprise mission.

Agree with the rest. There is certainly a misalignment in priorities between the devs and what graphics focused gamers want. How many of us are around for them to care though? Monster Hunter Wilds has given me a whole different perspective on this. Most people don't give a shit or even have an eye for it. They look at a puddle with SSR and go "Wow, look at all the ray tracing!"

I'm hoping this won't even be a conversation next gen and everyone can choose the mode that pleases them, because the hardware wouldn't pose a significant design constraint.
I liked the idea of a beastly Ted Faro, but I actually think it would've been great to have him as a real character in this world who was part of the story for more than just a single mission. Have him see what this world has become, let him fuck things up some more, and then do something a bit more cathartic with him, like having him stand trial before a tribal jury and be executed in a medieval way instead of getting killed off-screen. The writing was so strong in the first game. They did such a great job creating a hateable character and even came up with a way to bring him back, only to fumble it in the end. So disappointing.

I think we will see the same issues next gen, especially if devs embrace path tracing. You are not getting path tracing and 60 fps on these 30-40 TFLOPS next-gen consoles. Best-case scenario, you get current-gen graphics at 60 fps, which means another generation of cross-gen games.
 
I liked the idea of a beastly Ted Faro, but I actually think it would've been great to have him as a real character in this world who was part of the story for more than just a single mission. Have him see what this world has become, let him fuck things up some more, and then do something a bit more cathartic with him, like having him stand trial before a tribal jury and be executed in a medieval way instead of getting killed off-screen. The writing was so strong in the first game. They did such a great job creating a hateable character and even came up with a way to bring him back, only to fumble it in the end. So disappointing.
Agreed. I still like the idea of continuing the story with Nemesis, though (I think I'm at the last boss fight, so not sure what happens after that). It can really go places, if only they hire better writers.

I think we will see the same issues next gen, especially if devs embrace path tracing. You are not getting path tracing and 60 fps on these 30-40 TFLOPS next-gen consoles. Best-case scenario, you get current-gen graphics at 60 fps, which means another generation of cross-gen games.

Once RTGI becomes standard across the board, I don't see why they would need path tracing at 60. They can keep path tracing to 30 and have a bunch of ray tracing at 60. And attempt path tracing at 60 for the inevitable pro console. The differences would be subtle enough and the implementation would not require any disruptive changes to development workflow. Once you get rid of baked lighting and lightmaps, the choice of ray tracing or path tracing would not require major design considerations as the assets and textures are the same. At least until they start doing true path tracing, which current implementations are a far cry from. That may not even happen next generation.
 
Nope, it looked stunning and ran fine on modern CPUs, as you can see from those Zen 4 and 11th-gen Intel benchmarks. It's not Bethesda's fault that AMD was two gens behind Intel. Upgrade your CPU and run it at max settings. Again, it's not the only game that had issues: Gotham Knights, Space Marine 2, Cyberpunk RT, the Spider-Man PC ports, and any CPU-heavy game had issues with Zen 2 CPUs.

The procedurally generated planets looked like garbage, yes, but you don't really explore them in this game. The real game is set mostly indoors, and those hand-crafted levels are beautiful, with great lighting and exceptional asset quality.

bnE0shL.gif


qt0KJyN.gif


oTOj0r3.gif
I haven't played Starfield yet, but the geometry and lighting in these GIFs look impressive for a big open world game.

Modern game:

z9kL6Zo2UGiAeMCm.jpg


11900k = 5700X

Nothing changed since launch. There are always outlier games for AMD and Intel CPUs that perform outside of the norm (like Starfield).

On consoles, smart developers can avoid high latency between CCXs; they can code games to use specific threads. That's one of the reasons the PS5/XSX sometimes outperforms the 3600. On PC with Zen 1/2 you are fucked.
I wonder how GAMEGPU tests so many CPUs for every game.
 
I haven't played Starfield yet, but the geometry and lighting in these screenshots look impressive.


I wonder how GAMEGPU tests so many CPUs for every game.

I wonder that myself, but sadly there is no one else that does it. We only get a few games selected for CPU benchmarks.

Their results (at least on the GPU side) usually aren't far off from other outlets when they test the same games (very few games get benchmarked).
 
I have a 5600X and a 3080, and when I try to run the game on my 4K TV with Balanced DLSS, it feels a bit sluggish. Should I change to Performance instead, or is there some setting I can tweak?
 
I have a 5600X and a 3080, and when I try to run the game on my 4K TV with Balanced DLSS, it feels a bit sluggish. Should I change to Performance instead, or is there some setting I can tweak?

4K with DLSS Performance should give you 50-60fps. Balanced is too much for a 3080.

Of course, I'm talking about High settings.
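For what it's worth, the rough pixel math (assuming the stock DLSS factors of 58% for Balanced and 50% for Performance): 4K Balanced renders around 2227x1253, roughly 2.8 million pixels per frame, while 4K Performance renders at 1920x1080, roughly 2.1 million. That's about a quarter fewer pixels to shade, which is usually enough to take "a bit sluggish" back to a steady 50-60fps range on a card in this class.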
 