
Starfield PC Performance Thread

SlimySnake

Flashless at the Golden Globes
From the performance analyses I'm reading everywhere, no, it really doesn't, unless you have AMD-centric hardware.

The fact that I'm fucking scared to try this thing with a 4080/13600K/32GB DDR5 6000MHz under my ass should speak volumes.
Jedi Survivor was running fine on my rig, not perfect but better than what I'm reading about Starfield, but I didn't try TLOU1, just the PS5 version.

I didn't have many problems with any of the abominations this year, maybe I'm lucky, but from the sound of it this is gonna be the first game since I upgraded my PC last year that I can't play even remotely close to 4K60, or with DLSS that looks almost as good as native...
(Vaseline FSR that kills the IQ is not 4K or remotely comparable to DLSS)

The game lacks even basic options like a FOV slider, framerate limiter, a fullscreen option and a brightness slider, while having super washed-out colors indoors. I don't wanna deal with Nvidia settings, ini files, mods and ReShade just to play a fucking game; I didn't have to do any of that with the "abominations"...

I hope I'm wrong and get lucky with this one as well.
Your 4080 should run this thing fine, don't worry.

The game might not have many of the normal options like HDR, FOV and framerate toggles, but the game itself is fine.

P.S. The space battles are a locked 60 fps for me; it's the outdoors where the game struggles to hit a locked 60 fps. FSR is not bad in this game, I didn't see any shimmering or ghosting. It's no RE4 on PS5.
 

SF Kosmo

Al Jazeera Special Reporter
So this is really interesting. For as much as the mid-tier/older-gen AMD processors seem to be punching above their weight, that flips when we get to the current gen, where Intel is bodying AMD chips that would normally perform equal to or better than Intel. We even see Intel chips in the new gen performing out of scale compared to last-gen chips (a 13700K is normally about 5-10% faster than a 12900K, but here we see a good 28% uplift).

So we really might be seeing very strong scaling with memory bandwidth. That's unusual.

I'm running an i7-13700K with DDR5-6400, and it seems to be running very well for the most part, so that squares with my experience.
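For reference, the uplift figures in the post above are just percentage ratios; a quick sketch (the fps numbers below are hypothetical placeholders chosen to reproduce the ~28% gap described, not measurements from this thread):

```python
def uplift_pct(new_fps: float, old_fps: float) -> float:
    """Percentage uplift of new_fps over old_fps."""
    return (new_fps / old_fps - 1.0) * 100.0

# Hypothetical numbers: if a 12900K did 78 fps and a 13700K did 100 fps,
# that's the ~28% gap described above, versus the usual 5-10%.
print(round(uplift_pct(100.0, 78.0), 1))   # -> 28.2
print(round(uplift_pct(105.0, 100.0), 1))  # -> 5.0 (typical generational gap)
```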
 

draliko

Member
played a bit: 5600X and 7900 XT at 1440p ultra. Performance is good, not great; I can't really see a difference between high and ultra (not a fan of zooming in and hunting for slightly worse shadow edges). Anyway, a good ReShade really helps. Gonna wait for optimized settings and a couple of patches; with FSR3 this game will probably shine, let's hope AMD delivers this time...
 

Del_X

Member
12900KS
4090
64GB 5200MHz
Gen4 NVMe

Default ultra settings with 75% res scaling and FSR2
75-85fps in New Atlantis
100+ fps in a lot of interiors
 

GymWolf

Gold Member
Your 4080 should run this thing fine, don't worry.

The game might not have many of the normal options like HDR, FOV and framerate toggles, but the game itself is fine.

P.S. The space battles are a locked 60 fps for me; it's the outdoors where the game struggles to hit a locked 60 fps. FSR is not bad in this game, I didn't see any shimmering or ghosting. It's no RE4 on PS5.
The game looking like a washed-out bitch unless you use external apps to fix the colors is not... ok.

I don't know how to use this stuff, and I sure as hell am not gonna learn today, for Starfield of all games.

Also, define "fine", because 4K FSR with a shit "solid" 60 fps is not remotely fine for how the game looks.

I defended the game when it was right to do so, but I'm not gonna defend this shitshow of a port.
 

SolidQ

Member
Saw a comparison of native vs FSR on gamegpu. Native looks blurry; FSR looks clear and crisp.
 

GymWolf

Gold Member
So this is really interesting. For as much as the mid-tier/older-gen AMD processors seem to be punching above their weight, that flips when we get to the current gen, where Intel is bodying AMD chips that would normally perform equal to or better than Intel. We even see Intel chips in the new gen performing out of scale compared to last-gen chips (a 13700K is normally about 5-10% faster than a 12900K, but here we see a good 28% uplift).

So we really might be seeing very strong scaling with memory bandwidth. That's unusual.

I'm running an i7-13700K with DDR5-6400, and it seems to be running very well for the most part, so that squares with my experience.
Jesus, I hope you're right.

Tonight can't come soon enough.
 

SlimySnake

Flashless at the Golden Globes
The game looking like a washed-out bitch unless you use external apps to fix the colors is not... ok.

I don't know how to use this stuff, and I sure as hell am not gonna learn today, for Starfield of all games.

Also, define "fine", because 4K FSR with a shit "solid" 60 fps is not remotely fine for how the game looks.

I defended the game when it was right to do so, but I'm not gonna defend this shitshow of a port.
At least play the game. I didn't even know about the game looking washed out until I saw people complaining here; I played with HDR both on and off. I am calling out the downgrade to the geometry and terrain detail, so I have no issue with calling it like it is.

But a lot of this stuff is overblown. This is not a shitshow; the game looks great and performs like I would expect it to for the level of fidelity they are going for.
 

GymWolf

Gold Member
At least play the game. I didn't even know about the game looking washed out until I saw people complaining here; I played with HDR both on and off. I am calling out the downgrade to the geometry and terrain detail, so I have no issue with calling it like it is.

But a lot of this stuff is overblown. This is not a shitshow; the game looks great and performs like I would expect it to for the level of fidelity they are going for.
If the colors are washed out, I'm just gonna switch to limited black levels and colors on my OLED and in the Nvidia panel; that usually makes the image way more contrasted, but with some black detail crush. Still better than everything being over-bright...
 

RoboFu

One of the green rats
I've seen only a very few performance issues so far, and only in New Atlantis.

- 1440p
- Ryzen 7700X
- Radeon 7900 XT
- 16GB RAM
- Installed on a second M.2 PCIe 4 drive
- FSR turned off
- film grain at 0
- everything else on ultra

FSR has an issue with small, very high-contrast items during quick movements, so off it goes.

Also, no real fullscreen is a bummer, but that may just be the Xbox app version?
 
5600X, 7900 XTX, 75-100+ fps. I guess it's nice, but for some reason FSR doesn't affect performance at all for me, no matter the scaling %.
 

geary

Member
Awful optimization. With a 3080 paired with a 5800X and 32 GB, I get 45-50 FPS in New Atlantis at 1440p, maxed out, with FSR at 75% internal resolution.
My GPU is at 98% and CPU at 55%... In CP2077 with max RTX I had better performance, and New Atlantis doesn't hold a candle to Night City.
 

Matt_Fox

Member
Film Grain setting, any recommendations?

The default seems quite heavy and contributes somewhat to the 'washed out' aesthetic...
 

GymWolf

Gold Member
Worthless, literally nobody is running those processors with those RAM kits.
I have DDR5 6000MHz; many people who upgraded their PC in the last year bought DDR5.

But maybe you were talking about something else...
 

Rossco EZ

Member
was gonna upgrade to the premium edition today, but I think I'll hold off for now. Too many people with better rigs than mine saying they're only getting 40-50fps.
 

SlimySnake

Flashless at the Golden Globes
Awful optimization. With a 3080 paired with a 5800X and 32 GB, I get 45-50 FPS in New Atlantis at 1440p, maxed out, with FSR at 75% internal resolution.
My GPU is at 98% and CPU at 55%... In CP2077 with max RTX I had better performance, and New Atlantis doesn't hold a candle to Night City.
Switch to high settings and see if it makes a difference.

CP2077 at psycho settings maxed out, at 4K DLSS performance, was around 45-50 fps for me on my 3080 10GB, and around 3 fps more on my 12GB model. Maxing out these games is for people with 4090s; the 3080 is basically a mid-range card now.
 

yamaci17

Member
I've seen only a very few performance issues so far, and only in New Atlantis.

- 1440p
- Ryzen 7700X
- Radeon 7900 XT
- 16GB RAM
- Installed on a second M.2 PCIe 4 drive
- FSR turned off
- film grain at 0
- everything else on ultra

FSR has an issue with small, very high-contrast items during quick movements, so off it goes.

Also, no real fullscreen is a bummer, but that may just be the Xbox app version?

There really is no "exclusive" fullscreen with DX12. Even when a game claims to have it (Ratchet & Clank, Spider-Man), it's fake; all it does is temporarily set your desktop resolution.
 

Haint

Member
I have DDR5 6000MHz; many people who upgraded their PC in the last year bought DDR5.

But maybe you were talking about something else...

I'm referring to the fact that literally nobody is running an X or X3D with 5200MHz RAM, as Ryzen performance is hugely dependent on reaching 6000MHz for the 1:1 Infinity Fabric ratio. Even AMD's dirt-cheap fire-sale bundles with CPUs, mobos, and RAM for like $400 included 32GB of 6000MHz. AMD has never shipped product for review, or sanctioned bundles, with anything lower than 6000, because it's like putting square tires on a car. Similarly, but only slightly less stupid, is that very few people are running K processors with 5600MHz.
 

XesqueVara

Member
Looks like the game is more bandwidth-bound than fucking Spider-Man, holy fuck. It really wants that RAM speed and bandwidth, and Raptor Lake is flying because of that.
 

HeisenbergFX4

Gold Member
I'm referring to the fact that literally nobody is running an X or X3D with 5200MHz RAM, as Ryzen performance is hugely dependent on reaching 6000MHz for the 1:1 Infinity Fabric ratio. Even AMD's dirt-cheap fire-sale bundles with CPUs, mobos, and RAM for like $400 included 32GB of 6000MHz. Similarly, but only slightly less stupid, is that very few people are running K processors with 5600MHz.
And I bet the vast majority of people buying prebuilts have no idea about this and get stuck with some of the cheapest RAM the builder can put in.

In fact, the RAM that was stuck in the prebuilt I bought for my wife's 7950X was 32GB of 4800 DDR5, and my Corsair Vengeance 7900X 4090 PC had 64GB of 5600 DDR5 before I upgraded them both.
 

Haint

Member
And I bet the vast majority of people buying prebuilts have no idea about this and get stuck with some of the cheapest RAM the builder can put in.

In fact, the RAM that was stuck in the prebuilt I bought for my wife's 7950X was 32GB of 4800 DDR5, and my Corsair Vengeance 7900X 4090 PC had 64GB of 5600 DDR5 before I upgraded them both.

People buying bargain-basement generic prebuilts out of Best Buy's Sunday sale, which don't even list RAM speed, aren't looking at benchmarks on a German PC nerd site.
 

HeisenbergFX4

Gold Member
People buying bargain-basement generic prebuilts out of Best Buy's Sunday sale, which don't even list RAM speed, aren't looking at benchmarks on a German PC nerd site.
Actually, BB does normally list RAM speeds on their higher-end builds, but that doesn't address "I'm referring to the fact that literally nobody is running an X or X3D with 5200MHz RAM" as you said. I'm just saying yes, it does happen, and likely a lot, because most people buying gaming PCs do not know these details.

In fact, I know a lot of people who walk into places like a Best Buy, get their gaming PC, and never do anything to maximize their machine.
 

Haint

Member
Actually, BB does normally list RAM speeds on their higher-end builds, but that doesn't address "I'm referring to the fact that literally nobody is running an X or X3D with 5200MHz RAM" as you said. I'm just saying yes, it does happen, and likely a lot, because most people buying gaming PCs do not know these details.

In fact, I know a lot of people who walk into places like a Best Buy, get their gaming PC, and never do anything to maximize their machine.
I'm actually surprised any halfway competent company would sell such systems (particularly high-end ones), as I'd have guessed AMD had strong guidelines, if not some kind of certification process (especially for Xs and X3Ds), that barred integrators from handicapping their brand and image to save $5. Though I suppose they don't have any leverage given the competition from Intel.
 

HeisenbergFX4

Gold Member
I'm actually surprised any halfway competent company would sell such systems (particularly high-end ones), as I'd have guessed AMD had strong guidelines, if not some kind of certification process, that disallowed handicapping their brand and image to save $5.
Just browsing through some of the higher-end gaming PCs on Best Buy's site: even on $5000 rigs they're paired with RAM as slow as 4200, and I bet many people buying those have no clue.
 

LostDonkey

Member
6700K + 1070, 16GB DDR4 3000

Game installed on a 2TB Crucial P3 NVMe

Ultra 1440p runs about as well as the Series X version, with a few more dips here and there. Sure, I can tweak some settings and smooth it out; it definitely looks cleaner than the console version.
 

Haint

Member
Just browsing through some of the higher-end gaming PCs on Best Buy's site: even on $5000 rigs they're paired with RAM as slow as 4200, and I bet many people buying those have no clue.
It looks like none of the mainstream sub-$2K models bother to list it at all.
 

T4keD0wN

Member
I'm referring to the fact that literally nobody is running an X or X3D with 5200MHz RAM, as Ryzen performance is hugely dependent on reaching 6000MHz for the 1:1 Infinity Fabric ratio. Even AMD's dirt-cheap fire-sale bundles with CPUs, mobos, and RAM for like $400 included 32GB of 6000MHz. AMD has never shipped product for review, or sanctioned bundles, with anything lower than 6000, because it's like putting square tires on a car. Similarly, but only slightly less stupid, is that very few people are running K processors with 5600MHz.
I find it hard to believe that most people have XMP enabled. The average consumer is likely running their system on stock BIOS settings, and plenty of people probably don't even have a clue which slots to put the memory in to get dual channel working, how to multiply by 2, or which frequency to pick. Prebuilt companies have no incentive to enable it for them.
 

winjer

Gold Member
I'm referring to the fact that literally nobody is running an X or X3D with 5200MHz RAM, as Ryzen performance is hugely dependent on reaching 6000MHz for the 1:1 Infinity Fabric ratio. Even AMD's dirt-cheap fire-sale bundles with CPUs, mobos, and RAM for like $400 included 32GB of 6000MHz. AMD has never shipped product for review, or sanctioned bundles, with anything lower than 6000, because it's like putting square tires on a car. Similarly, but only slightly less stupid, is that very few people are running K processors with 5600MHz.

There is not much reason to run 6000 MT/s memory on a single-CCX Zen 4 CPU like the 7800X3D, 7600 or 7700.
For starters, unlike previous Zen generations, the latency penalty for not running 1:1 is very small, thanks to much improved caching between the IF and memory.
Then there is the fact that single-CCX Zen 4 CPUs are always limited by the IF bandwidth.
This is because there is only one link of 32B per clock, per CCX.
So even if someone clocks the memory at 6000 with the IF at 2000, the system bandwidth will only be 64GB/s, despite the memory being capable of a theoretical maximum of 96GB/s.
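The arithmetic in that post can be checked directly (assuming dual-channel DDR5 at 8 bytes per channel per transfer, and a single 32-byte-per-clock IF link per CCX, as described above):

```python
def ddr5_bandwidth_gbs(mt_per_s: int, channels: int = 2, bytes_per_channel: int = 8) -> float:
    """Theoretical peak DRAM bandwidth in GB/s for a dual-channel DDR5 setup."""
    return mt_per_s * channels * bytes_per_channel / 1000

def if_bandwidth_gbs(fclk_mhz: int, bytes_per_clock: int = 32) -> float:
    """Infinity Fabric bandwidth in GB/s for a single CCX with one IF link."""
    return fclk_mhz * bytes_per_clock / 1000

print(ddr5_bandwidth_gbs(6000))  # -> 96.0 (theoretical DRAM peak)
print(if_bandwidth_gbs(2000))    # -> 64.0 (what one CCX can actually pull)
```

So a single-CCX part tops out at the IF's 64GB/s no matter how fast the DIMMs are, which is the point being made.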
 

Agent_4Seven

Tears of Nintendo
So, finally got my hands on the game.

On my setup (8700K @ 4.8GHz / 32GB 3600MHz / 3080 Ti) I'm getting a rock-solid 30 FPS at native 4K with maxed-out settings, with every DRS and image reconstruction (FSR) option turned off, as well as motion blur. VRAM is around 6.5GB allocated but not used. I haven't yet gotten to the first major city, though, so I can't say anything about performance there yet.
 

draliko

Member
I got motion sickness after a two-hour session; what FOV are you guys using? Honestly, it's the first time a non-VR game has given me motion sickness. Gonna try disabling motion blur and maybe setting FOV to 90, think that should be enough?
 

HeisenbergFX4

Gold Member
I got motion sickness after a two-hour session; what FOV are you guys using? Honestly, it's the first time a non-VR game has given me motion sickness. Gonna try disabling motion blur and maybe setting FOV to 90, think that should be enough?
I got a bit motion sick last night as well, with motion blur off, and there is no FOV slider yet on PC, though there is a workaround.
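The workaround being referred to is presumably the usual Bethesda ini route; a commonly shared community snippet (key names and values are from user reports, not official documentation, so treat them as unverified) goes in StarfieldCustom.ini under Documents\My Games\Starfield:

```ini
; Community-reported FOV override -- unofficial, may change with patches
[Camera]
fFPWorldFOV=90.0
fTPWorldFOV=90.0
```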

 

Stuart360

Member
I am avoiding the forum for a week or two, but I just came on to say: make sure you install the new Nvidia drivers if you haven't. I got upwards of 10fps better performance in the same area in my testing (1080 Ti).
Right, back to the game lol.
 

BlueLyria

Member
I'm going to try reinstalling the AMD GPU drivers and unplugging my main monitor later today. Has anyone found a workaround for the crashes that isn't "turn off FSR lmao"?
 
Someone let me know what a good cheaper PC build would be to get a locked 1440p, or even 4K, 60fps going. That's all I need.
 