
[Digital Foundry] Starfield PC's New Patch: Massive CPU/GPU Perf Boosts, Official DLSS Support

Draugoth

Gold Member


Starfield launched in a baffling state on PC, with no support for DLSS or frame generation and highly questionable performance on Nvidia graphics hardware. A new patch is currently in beta, and it's fair to say that it's transformative. The new HDR features disappoint, but with official DLSS and frame-gen support and genuinely impressive boosts to CPU and GPU performance, this patch moves Starfield one step closer to delivering everything it should on PC.
00:00:00 Introduction
00:00:31 User Experience, DLSS, HDR
00:05:04 CPU Performance Improvements
00:08:35 GPU Performance Improvements
00:11:36 Conclusion
 

SmokedMeat

Gamer™


We did optimize. It’s a next gen game.

Riiiiight.
 

gothmog

Gold Member
Glad they are fixing those issues. I'm personally waiting on a larger content fix but I'm glad I can switch back to my PC from my Series X.
 

SlimySnake

Flashless at the Golden Globes
That can't be! Toddler said it was optimized on live TV and that us plebs needed to upgrade our systems!
Well, he was half right. You do need to upgrade your 4.5 TFLOP 2016 GPUs with shitty last-gen CPUs. The gains after the patch are still around 20%, not 200%. Those 1060s and 580s are still not going to run this game well after the performance upgrades.

I don't know when this happened, but PC gamers went from upgrading their PCs every 2 years to complaining about having to upgrade them every 8 years. The 970 and 1060 were roughly 2x-3x more powerful than the X1 and PS4. Now we have people buying 3060s and 3060 Tis, which are barely as powerful as the PS5 and XSX, and expecting the same 2x-3x more performance. Just not going to happen.

I ran this game at 4K DLSS Quality at 60 fps on my 3080, 2x that of the XSX with way higher settings, at launch. Just like I would expect a 3080 to perform.
 

SlimySnake

Flashless at the Golden Globes
Looking at the improvements, it is crazy how much more they are able to multithread the CPU. This has to be the heaviest CPU-bound game I've ever seen. I don't think even Cyberpunk goes over 50%, with most last-gen games staying around 20-30% max.

One has to wonder just what the hell these CPU threads are doing at all times. My CPU wattage was already in the 100s; I'm scared to think how high it will go now.


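If you want to see for yourself how evenly the load spreads, a quick way is to log per-core utilization while the game runs. A minimal sketch using Python's psutil (the one-second sampling interval and formatting are arbitrary choices for illustration, not anything Bethesda ships):

```python
# Minimal per-logical-core utilization logger; run it alongside the game
# to see how many hardware threads the game actually keeps busy.
# Requires: pip install psutil
import psutil

try:
    while True:
        # Blocks for one second, then returns each logical core's busy %.
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        avg = sum(per_core) / len(per_core)
        print(f"avg {avg:5.1f}% | " + " ".join(f"{p:4.0f}" for p in per_core))
except KeyboardInterrupt:
    pass
```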
 

geary

Member
Crunching on those persistent sandwiches...
Looking at the improvements, it is crazy how much more they are able to multithread the CPU. This has to be the heaviest CPU-bound game I've ever seen. I don't think even Cyberpunk goes over 50%, with most last-gen games staying around 20-30% max.

One has to wonder just what the hell these CPU threads are doing at all times. My CPU wattage was already in the 100s; I'm scared to think how high it will go now.
 

Dane

Member
Looking at the improvements, it is crazy how much more they are able to multithread the CPU. This has to be the heaviest CPU-bound game I've ever seen. I don't think even Cyberpunk goes over 50%, with most last-gen games staying around 20-30% max.

One has to wonder just what the hell these CPU threads are doing at all times. My CPU wattage was already in the 100s; I'm scared to think how high it will go now.


It's ironic that Bethesda, of all studios, is one of the few that actually cared about taking advantage of modern CPU designs. Starfield and Cyberpunk are among the few games where having more cores in the same CPU generation actually increases performance, because they use all the threads available, while everyone else is stuck at 4/8 or 6/12 at best.
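The pattern being described is just a job system sized to the machine instead of to a fixed core count. A toy sketch of the idea in plain Python, with a made-up CPU-bound task that stands in for real engine work (nothing here is actual Starfield or Cyberpunk code):

```python
# Toy illustration of scaling with core count: split CPU-bound work across
# however many logical cores the machine reports, rather than a fixed 4 or 6.
import os
import time
from concurrent.futures import ProcessPoolExecutor

def crunch(n: int) -> int:
    # Placeholder CPU-bound job (think physics, pathfinding, object tracking).
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [2_000_000] * 64
    workers = os.cpu_count() or 4   # more cores -> more jobs in flight at once
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(crunch, jobs))
    print(f"{workers} workers finished {len(jobs)} jobs "
          f"in {time.perf_counter() - start:.2f}s")
```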
 

SlimySnake

Flashless at the Golden Globes
Crunching on those persistent sandwiches...
That's what I figured, but shouldn't that happen on load? The game loads in every area, and that's when it should load in those persistent objects.

Walking around those levels should only involve tracking NPCs who are also walking around. I guess shops and other quests too, but constantly? I see some threads hitting 80% on a 16-thread CPU. What exactly is it calculating when you are standing still?

I wonder if their realtime GI solution has a huge hit on the CPU, since time of day is the one thing that changes constantly.
 
Everyone in New Atlantis is fucking ugly. Other settlements are fine.
For the NPCs you can talk to, yeah. The filler NPCs look bad everywhere due to the randomization process, I think. It feels like there are only a few shades of white colors for NPCs, while there are a ton of shades for black and brown, and then it generates everything equally. Mix that with the randomization of facial and body features, and you get some weird combinations and a large amount of black and brown people with white haircuts and facial features. You also seem to get a higher percentage of old people with young haircuts and facial/body features. It's a bizarre look that doesn't quite mesh with the sleek art style.
 

rofif

Can’t Git Gud
Everyone in New Atlantis is fucking ugly. Other settlements are fine.

[screenshots of New Atlantis crowd NPCs]


And that's just the crowd NPCs. The character models of quest NPCs you can interact with, of which there are well over a hundred, are really detailed.

[screenshots of quest NPC character models]
Maybe not a fair comparison, but here is a random Forspoken NPC and then quest givers.
Starfield quest givers look fantastic, even if they lack something.

[screenshot of a random Forspoken NPC, followed by screenshots of Starfield quest givers]
 

Quezacolt

Member
I don't know when this happened, but PC gamers went from upgrading their PCs every 2 years to complaining about having to upgrade them every 8 years. The 970 and 1060 were roughly 2x-3x more powerful than the X1 and PS4. Now we have people buying 3060s and 3060 Tis, which are barely as powerful as the PS5 and XSX, and expecting the same 2x-3x more performance. Just not going to happen.
Maybe that happened after mid-range GPUs started to cost almost as much as top-end GPUs did in the past.
 

fatmarco

Member
People need to understand there are clearly two tiers of NPCs (three if you count actual NPC "characters", aka people you can talk to).

There are the ones that randomly speak expository dialogue around the world, and they tend to have the high-quality animated faces; then there are the filler NPCs who don't have unique dialogue and just roam about. People selectively used the latter and their weird eye tracking to misrepresent the quality of Starfield NPCs.

For the most part the higher-quality animated NPCs look good; they're just reused far too often / there aren't enough of them. The lower-quality NPCs suffer from their eyes being too open by default, similar to the launch version of Mass Effect: Andromeda, and simply need to be patched to have their eyelids narrowed.
 

SlimySnake

Flashless at the Golden Globes
Maybe that happened after mid-range GPUs started to cost almost as much as top-end GPUs did in the past.
I don't know how true that is. When Starfield came out, the 7800 XT was $499 and came with a copy of the game. It easily ran Starfield at double the XSX framerate with way higher settings. That's around $150 more than the 970 launched at 9 years ago, but still nowhere even close to the Titan cards from that era.

The 4070 launched at $599, almost double what the 970 launched at, but it's already down to $520. That's another card that will run this game at 2x the framerate of the XSX.
 

Punished Miku

Human Rights Subscription Service
People need to understand there are clearly two tiers of NPCs (three if you count actual NPC "characters", aka people you can talk to).
They know, dude. You literally have to go out of your way to zoom in on random NPCs you can't even interact with to get the meme shots, and then you have to close your eyes or lie any time you actually speak to anyone and it looks really good.
 

analog_future

Resident Crybaby


We did optimize. It’s a next gen game.

Riiiiight.

Cherry-pick screenshots?

Just load into an inhabited planet and walk around. Ugliest NPCs I’ve seen for a last gen game, let alone a “next gen” game.

Dude....you didn't play the game..
Right ?
This is all Aurora consumers😂

NPCs look great more often than not.

Some shots from my playthrough, all random NPCs:

[three screenshots of random NPCs from the poster's playthrough]

Everyone in New Atlantis is fucking ugly. Other settlements are fine.

[screenshots of New Atlantis crowd NPCs]


And that's just the crowd NPCs. The character models of quest NPCs you can interact with, of which there are well over a hundred, are really detailed.

[screenshots of quest NPC character models]


Gotta love these dumbshits trolling and then disappearing from the thread when they're called out on their BS.
 

NeonDelta

Member
Whenever I select a DLSS type, it changes to another one. E.g. I set it to Quality, go into the game, come back to settings, and it's changed to DLAA. If I set it to Ultra Performance, it changes to Performance or Balanced.
 

Gaiff

SBI’s Resident Gaslighter
I don't know how true that is. When Starfield came out, the 7800 XT was $499 and came with a copy of the game. It easily ran Starfield at double the XSX framerate with way higher settings. That's around $150 more than the 970 launched at 9 years ago, but still nowhere even close to the Titan cards from that era.

The 4070 launched at $599, almost double what the 970 launched at, but it's already down to $520. That's another card that will run this game at 2x the framerate of the XSX.
If the fps is capped, sure. The 4070 is on par with the 3080 and the 3080 is around 60-70% faster than the 2080. You generally won't get all that close to double the frame rate of the consoles even with a 4070 unless you use a lot of RT. You need something on the level of a 3090 or sometimes even 3090 Ti to double the performance of a 2070S/consoles. This means a 4070 Ti-ish tier GPU which is $800.
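A rough back-of-the-envelope version of that claim, with ballpark ratios assumed purely for illustration (the only figure taken from the post is the 60-70% 3080-over-2080 gap; the ~10% 2080-over-2070S gap is an assumption):

```python
# Ballpark arithmetic only; all ratios are assumptions for illustration.
console  = 1.00              # treat a 2070S-class console GPU as the baseline
rtx_2080 = 1.10 * console    # assume a 2080 is roughly ~10% ahead of a 2070S
rtx_3080 = 1.65 * rtx_2080   # "around 60-70% faster than the 2080"
rtx_4070 = rtx_3080          # "on par with the 3080"

print(f"4070 vs console: ~{rtx_4070 / console:.2f}x")  # ~1.8x, short of 2x
```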
 

SlimySnake

Flashless at the Golden Globes
If the fps is capped, sure. The 4070 is on par with the 3080 and the 3080 is around 60-70% faster than the 2080. You generally won't get all that close to double the frame rate of the consoles even with a 4070 unless you use a lot of RT. You need something on the level of a 3090 or sometimes even 3090 Ti to double the performance of a 2070S/consoles. This means a 4070 Ti-ish tier GPU which is $800.
Maybe in benchmarks, but in my experience I have been able to double the performance of my 2080 in most games.

Again, I was able to run the game at 60 fps at high settings at 4K DLSS Quality when the Xbox was using a mixture of medium to low settings. And this was when the game was performing awfully on Nvidia cards.

You just needed to pair your GPU with a good CPU.
 