
Starfield PC Performance Thread

SlimySnake

Flashless at the Golden Globes
While I think Starfield is a very good-looking game, there are better-looking (or comparable) games that perform way better than Starfield while having the disadvantage of being fully open world rather than instanced. The quality of visuals and performance varies pretty wildly (in my observation, very dependent on the lighting), but the game is very clearly poorly GPU-optimized: I get 99% GPU usage while the card is pulling way less power (~30% less) than in most other games under full load. I had a 3070 Ti drawing 126W on one planet while sitting at 99% usage, when it would usually be above 200W.

This is very odd behavior and indeed representative of poor engine optimization. I will check it out right now, but my 3080 only drops to around 280 watts at times, in line with the GPU utilization when it dips to 85% in indoor areas where I max out my 60 fps cap. Haven't paid much attention to it outdoors, but I just checked a couple of my recordings and it's over 300 watts in both New Atlantis and an indoor cafeteria.
Curiously, my performance is consistently at its worst when flying a spaceship, which is where I would expect it to run way better than in those (imo impressive) cities, but it's either the same or even worse for me.
It's bizarre that even in relatively empty locations (space, lol), when there is nearly nothing going on on screen besides the ship, it still runs poorly. I am on Alder Lake, so I am pretty much GPU-bound 100% of the time, even in cities, but this is far from well GPU-optimized.

(I have no complaints on the CPU side of things, which also seems abnormal, but I think those complaints are at least somewhat justified.)
That's definitely very abnormal. The VRAM usage is only 2.9 GB, which is well below last-gen requirements, and space is one area where I'm getting a locked 60 fps with a lot of GPU headroom. It's not CPU-bound either, so I wonder if it's a driver issue.
 

raduque

Member
I left most of the settings on default, which defaulted to High. I turned off motion blur and most of the film grain, set the resolution scale to 70 and the sharpness to 75, and I'm getting lows of about 40 and highs of 75 (my refresh rate).

CPU (Ryzen 5 3600) is pretty much always around 80-100% and the GPU (RTX 2080) 50-100%, with about 6.7-7 GB of VRAM used.
 

Pagusas

Elden Member
Feels like I'm the only one having an extremely stable, smooth experience. Playing on an 83" OLED, running at 4K, capped at 60fps, barely a stutter anywhere. Also running ReShade on top of it to fix the black levels being raised by those overly strong LUTs, plus a debanding filter.

I'm running:

4090 FE (watercooled with a slight OC, curve-optimized, averaging +255/+500 core/mem)
7950X (watercooled, heavily OC'd, optimized for heavy low-core-count workloads; found it best for gaming, with a 95A switch from PBO to all-core OC for large multicore workloads; behaves amazingly well now that Asus/AMD have their BIOS worked out)
64 GB DDR5 (running at 6400 MHz)

The game's played smoother than any AAA release in recent memory.
 

SlimySnake

Flashless at the Golden Globes
I'm struggling, and I know my specs are on the outdated side, but the slowdown is immense in this game. Honestly, I think I'm done with it until it gets patched or the GOTY edition comes, or better yet, until I upgrade my system. Ugh.

Ryzen 7 3700x
16 GB DDR4
RTX 2060
Yeah, those are last-gen specs, unfortunately. The new 7800 XT just launched for $500, and it performs like a 6800 XT in every game. A 6800 XT gives you 78 fps on ultra with a 13900K in this game, but that CPU is overkill. Pair it with a 7700X and you should have a 3x upgrade over your 2060 for only $800.
 

DeaDPo0L84

Member
@Low Moral Fibre

Good news! PureDark has updated the mod a few times to address some of the issues, including the skybox issue and crashing. I can confirm that the flickering sky/star cluster issue I was having no longer happens. I also didn't have any crashes tonight, even without using the 'Disable in Menu' option. Couldn't be happier. This dude is really going the extra mile to support his mod.

Patch notes:
Like many others, I tried the free version available on Nexus and had quite a few issues. Reverted to PureDark's beta 2 hotfix 1 and haven't had a single problem. Hopefully things get ironed out on both sides, though.
 

Mister Wolf

Member
Guys, I watched a tutorial on how to install the DLSS2 mod and followed it correctly, but to bring up the menu to turn DLSS on, it says to hit the End key. My keyboard doesn't have one. Is there another way to bring it up?

PureDark mod, btw.

What keyboard are you using? I didn't think I had an END or HOME key on this Logitech K400 keyboard I'm using, but it turns out I did all this time. You should try a search engine for your particular board and that key.
 

winjer

Gold Member
Didn't you say in your original post that the game was performing well indoors, and even outdoors, until you got to New Atlantis, where it started dropping below 40 fps? So what do you think is different? It's obviously the NPCs and whatever else their engine needs to keep track of in big cities: vendors, quests, whatever. That's why you see GPU usage drop to 38%, when in the open world it probably sits around 99%. In the cities there is more CPU work the engine needs to do.

If anything, I am pissed that they downgraded the game's outdoor areas so heavily. The comparisons I've seen against the 2022 demo show a clear downgrade in terrain geometry, textures, and draw distance. Lighting is more or less the same, but I am the complete opposite of you in that I want them to push the visuals more, even if it comes at the expense of my 3080 not being able to run it at 4K DLSS 60 fps. We shouldn't be chastising devs for trying to push visual fidelity.

P.S. I recently went back to play The Witcher 3 on PC, and that game's remaster has several settings related to NPCs. My CPU simply can't handle it: it goes from 60-80 fps in the wilderness to 30-40 fps in cities. I checked the settings, and they really upped the NPC count in the remaster. Lowering the setting reduces the CPU load, which lets my framerate go up. The same thing is happening here. They could probably add or remove more NPCs than the crowd density setting does, because in The Witcher going from Ultra+ to Ultra or High is a massive 50% reduction in NPC count, while here it hardly feels different.

You do realize that NPCs require geometry and pixels to be rendered, and those require draw calls.
Calculating their positions and paths is easy stuff compared to rendering their geometry and pixels.
And if the engine isn't making proper use of instancing, that means a ton of memory accesses, and on PC those go over the PCIe bus. So that could be another issue with the game.
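To make the draw-call point concrete, here's a minimal sketch. The draw functions are hypothetical stand-ins (think glDrawElements vs. glDrawElementsInstanced), not any real engine's API; the point is that instancing turns one submission per NPC into a single submission for the whole crowd:

```cpp
#include <cstdio>
#include <vector>

struct Mat4 { float m[16]; };  // one transform per NPC

static int drawCalls = 0;

// Hypothetical stand-ins for a real graphics API
// (think glDrawElements vs. glDrawElementsInstanced).
void drawMesh(const Mat4&)                       { ++drawCalls; }
void drawMeshInstanced(const std::vector<Mat4>&) { ++drawCalls; }

int main() {
    std::vector<Mat4> npcs(500);  // 500 NPCs sharing the same mesh

    // Naive path: one draw call (plus state/uniform churn) per NPC.
    for (const Mat4& t : npcs) drawMesh(t);
    std::printf("per-object: %d draw calls\n", drawCalls);  // 500

    // Instanced path: upload all transforms once, submit a single call.
    drawCalls = 0;
    drawMeshInstanced(npcs);
    std::printf("instanced:  %d draw call\n", drawCalls);   // 1
    return 0;
}
```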
But even outside of New Atlantis, I got a few scenes where I was CPU bottlenecked. Just not as badly.

There are plenty of games that look much better than Starfield and still run much better than this.
The reason Bethesda had to cut back the graphics is that their engine can't handle much more detail.
But that has always been the issue with Bethesda's engines: they have always been very poorly coded, and the result has always been bad performance and bugs.

In The Witcher 3, in the square of Novigrad, where DF sometimes tests for CPU bottlenecks, I got around 90-100 fps with the 5800X3D, and I don't think the 6800 XT was at 100% utilization.
And the NPCs in The Witcher 3 look way better than the ones in Starfield.
 

M1987

Member
What resolution scale are people rendering at for 1440p? Just setting up the game now, and I have only ever selected Quality etc. before.
 

JRW

Member
13600K / 32GB / EVGA 3060 Ti / 1080p 144Hz. Starfield's Nvidia optimization needs some help. I'm using Hardware Unboxed's optimized settings and still see 45fps at its lowest (outdoors in New Atlantis) and 60-100fps indoors. No upscaling being used (native 1080p).
 

King Dazzar

Member
I've been out of the PC gaming arena for a while, but I used to customise and build all the time. I'm about to upgrade my media PC, which is a now-archaic 7700K platform. Originally I was going to get a 5000-series AMD, but I'd like to keep some future flexibility in mind and try not to go too stupid on cost. Looking at this game's benchmarks, would I be wrong to think the i5 13600K would be the sweet spot? I can get the CPU for around £300. Then, if I felt the need to explore PC gaming again, I could just drop in a 4080...
 

Deleted member 1159

Unconfirmed Member
I've been out of the PC gaming arena for a while, but I used to customise and build all the time. I'm about to upgrade my media PC, which is a now-archaic 7700K platform. Originally I was going to get a 5000-series AMD, but I'd like to keep some future flexibility in mind and try not to go too stupid on cost. Looking at this game's benchmarks, would I be wrong to think the i5 13600K would be the sweet spot? I can get the CPU for around £300. Then, if I felt the need to explore PC gaming again, I could just drop in a 4080...
I opted for the 13700K, but it doesn't seem like much more bang for the buck, tbh. That said, I've played a couple of hours on my 7700K (my new parts get here Friday) and the game runs quite well. The difference between FPS inside and outside is fairly jarring, though. From what I've seen, the 13700K should roughly double the CPU-bound performance at least and smooth out that gap.
 

Puscifer

Member
Anyone here with a 7600X, 4080, and 4K? I'm still unpacking my new house and haven't set my PC and desk up yet. Can anyone comment on performance?
 
7800X3D
3090
Ultrawide monitor, 1440p

Have everything on ultra. Set resolution scale to 100. Turned off VRS, FSR, and dynamic resolution. Getting like 50 fps in the opening mining level. The game feels silky smooth with the raw mouse input. I like it.
 
This game loves the CPU. My 7950X3D is at about 40% usage across 10 cores all the time, and it will occasionally use some of the other 6 cores. That's the first time I've seen a game use more than 8 cores with this CPU.

Wow, so is this one of the few games that actually benefits from the increased core count of the 7950X3D over the 7800X3D?

I was told by everyone that the 7800X3D would be better for gaming lol.
 

Bojji

Member
Wow, so is this one of the few games that actually benefits from the increased core count of the 7950X3D over the 7800X3D?

I was told by everyone that the 7800X3D would be better for gaming lol.

The 7800X3D runs better.

The colour looks bland

There are mods, but I use Nvidia Freestyle (Alt+F3) and enhance contrast and black level; the game looks shit tons better.
 
Blame it on AMD. I have been saying this for a while: the AMD Zen 2 and Zen 3 CPUs were only beating those Intel CPUs in last-gen games. As soon as we started seeing current-gen games come out favoring higher-clocked CPUs, these AMD CPUs with their low wattage, some with 65-watt caps, struggled to keep up.

That 11700K gave me a lot of headaches early on. It ran too hot. It was consuming something crazy like 128 watts in Cyberpunk at lower resolutions. All the YouTubers were saying it's trash. Microcenter was selling it at a $50-100 discount compared to the AMD equivalent, which I think was the 3700X, iirc. But time has proven everyone wrong: it doesn't matter how hot something runs or how much wattage it draws, performance is king.


Well, we don't know what's under the hood, so I can only go by what I see, and what I see is better multithreading than in just about any other game on the market.

The game is only CPU-bound in big cities with lots of NPCs. The only comparable game is Mass Effect: Andromeda, and it doesn't have cities nearly as big or with as many NPCs. Something like GTA, Spider-Man, or Cyberpunk doesn't count, because those are basically open-world games; this is a different kind of city.
LMAO ok, imagine trying to claim the Creation Engine, a PS360-era engine, produces a more modern game than Cyberpunk.
 

Ovech-King

Member
To compare performance before and after installing the "Replacing FSR2 with DLSS-G" mod (for frame generation) from Nexus on an RTX 4080 laptop:

Before:

3840 x 2160, medium preset, resolution scale 62%, many random dips into the 40s in New Atlantis

After:

3840 x 2160, high preset, resolution scale 75%, always staying at or above 60 fps in New Atlantis.

As with most frame-generation-compatible titles, though, you have to enforce V-sync in the Nvidia Control Panel for Starfield. Otherwise, even though the game says it's active, there was screen tearing; with V-sync forced, it's gone. Highly recommended. Plus, he released an update this morning which seems to have resolved the crashes.
 

SlimySnake

Flashless at the Golden Globes
The guy who made the DLSS Bridge mod (which gives better performance than the PureDark one, according to some) has published the free version of his DLSS 3 mod. I can't try it right now:

Well, this DLSS mod crashed on the first load screen, so I'm done. It did give me an error on load saying I have to enable Windows hardware acceleration or something, but I'm not taking any more chances.

I do have the PureDark mod that some disgruntled dude posted on Nexus Mods before it was taken down, but since there is no readme, I don't know how to install it. Anyone have that?
 

Ovech-King

Member
Well, this DLSS mod crashed on the first load screen, so I'm done. It did give me an error on load saying I have to enable Windows hardware acceleration or something, but I'm not taking any more chances.

I do have the PureDark mod that some disgruntled dude posted on Nexus Mods before it was taken down, but since there is no readme, I don't know how to install it. Anyone have that?
You should probably give it another go with version 1.03 from last night. There's some good feedback on the page from people who tried this one. It's the first one I installed myself, and I just played around 2 hours like nothing had happened to any game file, besides it skyrocketing my fps because I have a 4000 series card.
 

SlimySnake

Flashless at the Golden Globes
You should probably give it another go with version 1.03 from last night. There's some good feedback on the page from people who tried this one. It's the first one I installed myself, and I just played around 2 hours like nothing had happened to any game file, besides it skyrocketing my fps because I have a 4000 series card.
Yeah, I downloaded the latest version. This time I enabled GPU acceleration, but that crashed not just the game but Windows.

Going to stick with FSR. The performance uplift from DLSS was maybe 2-3 fps, and I couldn't even tell if it looked better. Tested in Akila City.
 
Just imagine, for a second, if there was a conspiracy among the big developers/publishers and the hardware manufacturers to effectively force millions of people to upgrade their GPUs (in particular).

Let's play make-believe and pretend almost every AAA title released this year has been increasingly difficult to run unless you own the best offerings from Nvidia and AMD.
What if (arguably) the most anticipated game in years then released in a shambolic state and seemed to run terribly on anything but the most expensive GPU (and/or CPU)?
I wonder how many people might upgrade just to play that game.

Not many, you say?

Hmmmm... but what about all the other games they're struggling to play?

What about future releases?

Now, to be clear, we all know technology moves at a fast pace and that hardware and software manufacturers like to push the envelope... but that's not the issue.
The issue is games that simply shouldn't need $1000 of hardware to run, games that don't look anything particularly special and aren't doing anything especially new or exciting, being impossible to run at playable frame rates with anything approaching acceptable visual quality.
Poor optimisation, lazy development practices, tired game engines, etc. seem to be giving us games that shouldn't be so demanding... but are.

Brute-forcing 60fps on a 4090 while everyone else struggles to hit 40 is clearly not something consumers are going to be happy with going forward.
What happens when the next big AAA title runs just as poorly and demands just as powerful a system to reach an acceptable FPS?
This matters to a lot of people; remember, the majority (the vast majority, even) of Steam users have a 2060 or worse.

Now, perhaps, you might say it's time they upgraded, that the PC market needs to be able to offer that bleeding-edge visual and performance quality, and that they are holding it back.
The thing is, are they? Or are developers/publishers simply making it so they HAVE to upgrade, when in reality it wouldn't be necessary if the games were not so badly made and poorly optimised?

Ask yourself: how long would you suffer 27fps and 50% render resolution, on game after game after game, before you decided you'd had enough and upgraded?

Remember, it's not about technology moving on and games becoming more demanding per se; it's about a cynical ploy to deliberately force players to upgrade by making games more and more demanding to run when they don't need to be. That's my point.

So, circling back: if those big publishers and hardware manufacturers did work together to force millions of people to upgrade, how would that look any different from what we are seeing now?
 

bbeach123

Member
I just realized the default anisotropic filtering is 4x, and the in-game graphics settings don't have an option for AF. Why, Bethesda? It's been 2000 years!
 

Xcell Miguel

Gold Member
Yeah, I downloaded the latest version. This time I enabled GPU acceleration, but that crashed not just the game but Windows.

Going to stick with FSR. The performance uplift from DLSS was maybe 2-3 fps, and I couldn't even tell if it looked better. Tested in Akila City.
If you just want DLSS and not DLSS3 Frame Generation (if you have an RTX 4000 series card), try this one. It's stable, no crashes in hours, and faster than PureDark's free DLSS2 version:


The DLSS3 FG version from my other post is still being worked on to avoid the crashes.
 

winjer

Gold Member
Bethesda can't even implement a proper mipmap bias.
Seriously, the level of incompetence...

Seems like Bethesda forgot to implement a negative LOD bias with upscalers. The problem exists with the DLSS mod as well as native FSR2, and it ruins image quality. Here's 100% res scale vs 50% res scale:

[screenshots: 100% vs 50% resolution scale texture comparison]


Here are the recommended mip bias values from AMD for FSR2 (DLSS uses the same):

[table: AMD's recommended per-preset mip bias values for FSR2]


This can be fixed with Nvidia Inspector (I don't think it can be for AMD cards, unfortunately):

[screenshot: Nvidia Inspector driver-level LOD bias override]


You have to stick close to the values from the preset or calculate them yourself.
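If you'd rather calculate them yourself, here's a minimal sketch (assuming the formula from AMD's FSR2 documentation: mipBias = log2(renderResolution / displayResolution) - 1.0), which reproduces the per-preset values in that table to within rounding:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // AMD's documented FSR2 formula (DLSS uses the same values):
    //   mipBias = log2(renderResolution / displayResolution) - 1.0
    // Expressed below via each preset's upscale ratio (render = display / ratio).
    struct Preset { const char* name; double ratio; };
    const Preset presets[] = {
        {"Quality",           1.5},
        {"Balanced",          1.7},
        {"Performance",       2.0},
        {"Ultra Performance", 3.0},
    };
    for (const Preset& p : presets) {
        double bias = std::log2(1.0 / p.ratio) - 1.0;
        std::printf("%-17s -> %+.3f\n", p.name, bias);
    }
    // Prints roughly -1.585, -1.766, -2.000, -2.585, matching AMD's
    // table (-1.58 / -1.76 / -2.0 / -2.58) to within rounding.
    return 0;
}
```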
 

T4keD0wN

Member
The stutter every time I unequip the scanner is driving me crazy.
Try having only common-tier weapons equipped and in the favourites bar. As crazy as that sounds, it worked for me; I was getting a stutter every single time I changed weapons or switched to the scanner.
 

GymWolf

Member
Try having only common-tier weapons equipped and in the favourites bar. As crazy as that sounds, it worked for me; I was getting a stutter every single time I changed weapons or switched to the scanner.
Thanks for the advice, but the majority of my weapons are blue or gold; I'm not gonna fuck up my inventory for that.

The solutions to bugs in this game get more hilarious by the day.

In a week you are gonna have people busting a nut while someone cuts their nipples with a piece of paper so they don't lose a save file...
 

SlimySnake

Flashless at the Golden Globes
Audience Q: Why did you not optimize Starfield for PC?
Todd Howard: We did... you might need to upgrade your PC.


He's right. This is a 1440p 30 fps next-gen-only game on a 12-teraflop console; you are not running this on the 4-6 teraflop x60-series cards with shit AMD CPUs that top the Steam hardware surveys.

A relatively cheap mid-range upgrade like the 7800 XT and a $300 13600K should have no problem running this game at double the XSX framerate.
 

The Cockatrice

I'm retarded?
Only the neutral LUTs, but everyone has that installed; no other mods, no DLSS or frame gen or UI improvements or anything.

Yeah, I have the SGT LUT at 50% strength. I'm assuming you installed the game on an SSD, right? Maybe try the DLSS mod from Luke. Also disable motion blur, it's buggy-ish.
 

GymWolf

Member
Yeah, I have the SGT LUT at 50% strength. I'm assuming you installed the game on an SSD, right? Maybe try the DLSS mod from Luke.
I don't like using mods to fix games.

The LUTs thing was absolutely necessary because the game was unplayable.

The stutter doesn't always happen, so I can kinda live with it... (very annoying though)
 

Kacho

Gold Member
He's right. This is a 1440p 30 fps next-gen-only game on a 12-teraflop console; you are not running this on the 4-6 teraflop x60-series cards with shit AMD CPUs that top the Steam hardware surveys.

A relatively cheap mid-range upgrade like the 7800 XT and a $300 13600K should have no problem running this game at double the XSX framerate.
I'm running a top-end machine, and all I had to do was drop the shadows to medium. That fixed virtually all my performance problems.
 

yamaci17

Member
He's right. This is a 1440p 30 fps next-gen-only game on a 12-teraflop console; you are not running this on the 4-6 teraflop x60-series cards with shit AMD CPUs that top the Steam hardware surveys.

A relatively cheap mid-range upgrade like the 7800 XT and a $300 13600K should have no problem running this game at double the XSX framerate.
I agree, people need to beef up their rigs. The 3700X is a 30 fps CPU from now on. That won't change unless devs change their stance on consoles, and they have no reason to.

The game made massive sales on Xbox, so the MAJORITY of people had no trouble with 30 fps. In that case, the 3700X will forever be a 30 FPS CPU going forward (and it might not even hit the same stable 30 fps as the Xbox does, as PC will always have worse CPU-bound optimization).

Unless people mass-boycott 30 FPS on consoles, it will be pretty rough for folks who refuse to upgrade but enjoyed getting 100+ fps on ports that were designed to hit a 40+ fps CPU-bound average on 1.6 GHz Jaguar cores (in order to hit a super-locked 30 fps with tight 1% lows).
 
I have an i5-13600KF, 32 GB of DDR4 3600 RAM, an RTX 4080 Founders Edition, and Windows 11 Pro 22H2, with the game installed on my fastest NVMe M.2 SSD, a 2TB Samsung 980 Pro. I am playing at 1440p, maxed-out settings, on my 165 Hz G-SYNC monitor with a 60 fps cap set using RTSS. I have played 43 hours of the game so far and have mostly enjoyed what feels like solid 60 fps performance. I say 'feels' because I intentionally turned off the MSI Afterburner overlay to focus on how the game plays and feels instead of being distracted by a number going down and up. Normally, when a game drops below 60 fps, even with G-SYNC, I can immediately tell, as the game starts to feel noticeably laggy. This has only happened twice in my entire playthrough so far, and it was very brief (literally a split second). I have had no stuttering and no crashes.

While I wouldn't go so far as to say Starfield is a graphical showcase, the game can look decent at times, but there are many instances where the visuals look downright dated, whether due to poor lighting (the game's default washed-out look certainly doesn't help) that makes character faces look like plastic dolls at times, instances of bad pop-in, or the odd low-res texture. The game definitely looks better than anything else Bethesda has ever done, but it still looks at least 10 years out of date.

Despite dated aspects of the design, including lots of loading screens and a lack of seamless planet/space exploration, I have been thoroughly enjoying my time, 43 hours so far, with the game. It feels very Bethesda, pretty much like a cross between their own Fallout 4 and Obsidian's superb The Outer Worlds. It is fun to play, and that's what matters. However, as an RPG, I feel it positively pales in comparison to the vastly superior Baldur's Gate 3, which has better visuals, AI, writing, dialogue, and story. Bland characters have always been a weakness of Bethesda's games, not being particularly memorable or well written, and this is certainly true of Starfield, although it is at least a small improvement over Fallout 4 and Skyrim. I don't feel as connected to my companions in Starfield the way I did in Baldur's Gate 3, because the writing is just not that great and you don't get the same level of engagement with them as you do in Larian's game. The voice acting is good in theory, but everyone seems to talk the same. I don't mean the accents, just that the manner in which they speak sounds like they're reading off an autocue or something. There's no real emotion to anything anyone says; it's all very videogame-y, if you like.
 

analog_future

Resident Crybaby
Just imagine, for a second, if there was a conspiracy among the big developers/publishers and the hardware manufacturers to effectively force millions of people to upgrade their GPUs (in particular).

(snip)

So, circling back: if those big publishers and hardware manufacturers did work together to force millions of people to upgrade, how would that look any different from what we are seeing now?



(tinfoil hat GIF)
 

SF Kosmo

Al Jazeera Special Reporter
Feels like I'm the only one having an extremely stable, smooth experience. Playing on an 83" OLED, running at 4K, capped at 60fps, barely a stutter anywhere. Also running ReShade on top of it to fix the black levels being raised by those overly strong LUTs, plus a debanding filter.

I'm running:

4090 FE (watercooled with a slight OC, curve-optimized, averaging +255/+500 core/mem)
7950X (watercooled, heavily OC'd, optimized for heavy low-core-count workloads; found it best for gaming, with a 95A switch from PBO to all-core OC for large multicore workloads; behaves amazingly well now that Asus/AMD have their BIOS worked out)
64 GB DDR5 (running at 6400 MHz)

The game's played smoother than any AAA release in recent memory.
I'm on a 4070 Ti and an i7-13700K, and it's extremely smooth for me. Using the DLSS mod at 66% with minimal loss of quality. No frame gen or DLSS3, and I'm still getting 80-120 in most parts. Occasional dips into, like... the high 60s, but the point is it's very smooth.

I don't think it's people on current-gen hardware who are complaining. If you have a 4-5-year-old computer, this won't run very well, at least not without big compromises.
 
So far it's been fine for me on my 3600/3060 Ti. I've switched back and forth between my 1440p monitor and my C2, and it's been fine on both. I think G-SYNC is definitely helping me in this case on both displays.

When I started, I was on my C2 and it defaulted to medium settings, 4K with FSR Performance. I lowered a couple of things to Low that I saw recommended in a video, and I think the game still looks fine. I'll probably install the DLSS mod today.
 