Fallout 4 PC Performance Thread

I just built a new PC yesterday, and to my delight the Fallout 4 launcher decided to pick Ultra for me.

i5-4690K
Gigabyte GTX 970
16GB DDR3 HyperX
Samsung 850 EVO 250GB SSD

I got out of the vault and everything was butter-smooth.
 
Wait. SLI isn't even officially supported, last I heard?


...is it?

You can force the bits w/ the Fallout 3 config in Nvidia Inspector + AR2. Not sure if that's what's causing the crashes though. Either way, the game is totally unplayable for me.
 
So Fallout 4 is the first game that's pushed my 4GB 770 past 2GB of memory usage in-game. I consistently see over 2500MB and have seen as high as 2800MB, on a 1920x1200 screen btw. So glad I ignored the consensus at the time and got the 4GB instead of the 2GB. I can only imagine it'll get more resource-hungry once mods start rolling out.

This and Dark Souls III will probably be the last games my 770 sees. She's been a champ...
 
I just built a new PC yesterday, and to my delight the Fallout 4 launcher decided to pick Ultra for me.

i5-4690K
Gigabyte GTX 970
16GB DDR3 HyperX
Samsung 850 EVO 250GB SSD

I got out of the vault and everything was butter-smooth.

Nice. Don't get discouraged if frames tank when you get to the Boston area. Everyone is having that problem for now.
 
Anybody running SLI 980s/970s/Titans using custom SLI bits?

Just curious what your performance is like. I have SLI 980s, so I'm just looking to get a general idea of how much fps I can expect (using a 1440p, 144Hz monitor).
 
Anybody running SLI 980s/970s/Titans using custom SLI bits?

Just curious what your performance is like. I have SLI 980s, so I'm just looking to get a general idea of how much fps I can expect (using a 1440p, 144Hz monitor).

Multiple crashes, totally unstable. But then, I'm running three 980 Tis @ 4K.
 
You can force the bits w/ the Fallout 3 config in Nvidia Inspector + AR2. Not sure if that's what's causing the crashes though. Either way, the game is totally unplayable for me.
Have you tried the 0x080000F5 (Killing Field...) bits? They work well for me.

Anybody running SLI 980s/970s/Titans using custom SLI bits?

Just curious what your performance is like. I have SLI 980s, so I'm just looking to get a general idea of how much fps I can expect (using a 1440p, 144Hz monitor).
I run SLI Titan Xs @ 4K with G-Sync. Aside from lowering god rays and shadows, everything else is on max and I'm getting mostly 60fps. I do get drops, but with G-Sync, and being too busy playing, I don't notice them.
 
Which GPU, bredda? I'm on a 7950, so I'm eager to know what sort of boost old GPUs will get. I'm out of the country right now and can't see for myself until Saturday.
I have a 7870 4GB. The game runs super smoothly, solid 60, even in cities now on ultra with shadow distance turned down. It's a miracle.
 
Multiple crashes, totally unstable. But then, I'm running three 980 Tis @ 4K.

That's shitty. Has there been any word on official SLI support? Seems kind of ridiculous if that isn't on their radar.

I run SLI Titan Xs @ 4K with G-Sync. Aside from lowering god rays and shadows, everything else is on max and I'm getting mostly 60fps. I do get drops, but with G-Sync, and being too busy playing, I don't notice them.

Mostly 60 fps at 4K sounds good; sounds like I should be able to get 75+ fps pretty easily at 1440p.
 
I'm hating myself for thinking this, but I'm looking at 6700Ks, and I'm reading more and more that Fallout 4 responds well to i7s with HT and faster RAM.

Both Digital Foundry and Gamers Nexus are showing this.

I'd link, but I'm on mobile.

A 600+ upgrade for a 10-15 fps gain in a Bethesda game?
i7 6700K
 
Yeah, I had noticed some talk about vault doors bugging out, so I made a save beforehand just in case. I'll have to try that pre-rendered frames thing. I remember testing it out in GTA V, and the input lag was pretty bad.

Right, increasing the max pre-rendered frames can also increase input latency (by very little, mind you). But it will vary from game to game, depending on what a given game's default pre-rendered frames setting is in the first place, whether the game is more CPU- or GPU-bound, and what your target framerate is (30 fps is obviously going to have double the frametime/input latency of 60 fps, and so on).

The setting is also dependent on the power of your setup, and how quickly your specific CPU can prepare and send the processed frames to your specific GPU; a setting of "1" on a certain game may perform far better on a 980 Ti GPU/i7 CPU combo when compared to a 680 GPU/i3 CPU combo, for instance.

However, as its name suggests, it only raises or lowers the "max" number of pre-rendered frames, so if you have it on "3," the CPU will prepare up to 3 frames in advance. That doesn't mean it will at all times; it could be 1, then 2, then 3, then 2, and so on.

TL;DR: Try "3" and see if it helps ;p
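For a rough sense of the worst-case numbers behind that advice, the added latency scales with frame-time. A quick back-of-envelope sketch (the function name is my own, not from any driver tool):

```python
def prerender_latency_ms(max_prerendered_frames, fps):
    """Worst-case extra input latency from the pre-rendered frame queue.

    Each queued frame can add up to one frame-time of latency, so the
    ceiling is max_prerendered_frames * (1000 / fps) milliseconds.
    """
    frame_time_ms = 1000.0 / fps
    return max_prerendered_frames * frame_time_ms

# 30 fps has double the frame-time of 60 fps, as the post above says:
print(prerender_latency_ms(3, 60))  # 50.0  -> up to 50 ms at 60 fps
print(prerender_latency_ms(3, 30))  # 100.0 -> up to 100 ms at 30 fps
```

That ceiling is why a setting of "3" feels fine at high framerates but can feel mushy when the game dips.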
 
I'm hating myself for thinking this, but I'm looking at 6700Ks, and I'm reading more and more that Fallout 4 responds well to i7s with HT and faster RAM.

Both Digital Foundry and Gamers Nexus are showing this.

I'd link, but I'm on mobile.

A 600+ upgrade for a 10-15 fps gain in a Bethesda game?
i7 6700K

I'm going to be upgrading my CPU soon too I think.
I'm still using the first generation Core i7, the Nehalem.

I definitely think an i7 is best to go for if you can afford it over an i5. There are games that make use of the extra threads and there probably will be even more in the near future.
 
Have you tried the 0x080000F5 (Killing Field...) bits? They work well for me.


I run SLI Titan Xs @ 4K with G-Sync. Aside from lowering god rays and shadows, everything else is on max and I'm getting mostly 60fps. I do get drops, but with G-Sync, and being too busy playing, I don't notice them.

Are you sure that's right? Those are DX10 SLI bits, and the Fallout 3 bits are DX9, I think.
 
I'm going to be upgrading my CPU soon too I think.
I'm still using the first generation Core i7, the Nehalem.

I definitely think an i7 is best to go for if you can afford it over an i5. There are games that make use of the extra threads and there probably will be even more in the near future.

On second thought, I might try to find an i7 4790K; that way I can keep my old mobo/RAM.
 
Shame about the RAM bottlenecking issue. Are there any other games that get bottlenecked by 1600MHz DDR3? With an R9 390 with 8GB of VRAM, I never thought I'd be bottlenecked by any kind of memory at 1080p. My mobo doesn't support OCing my CPU or running my RAM past 1600, so I guess I'm stuck with shit performance.

I have no idea how anything works, but I saw someone asking why they didn't just use more vram for the game instead of relying on the system ram so much. Seems like a sensible question, but I have no idea how games work.
 
Shame about the RAM bottlenecking issue. Are there any other games that get bottlenecked by 1600MHz DDR3? With an R9 390 with 8GB of VRAM, I never thought I'd be bottlenecked by any kind of memory at 1080p. My mobo doesn't support OCing my CPU or running my RAM past 1600, so I guess I'm stuck with shit performance.

I have no idea how anything works, but I saw someone asking why they didn't just use more vram for the game instead of relying on the system ram so much. Seems like a sensible question, but I have no idea how games work.

I'm out of my league here, but I believe it has to do with the consoles using unified memory? That, combined with Gamebryo being a CPU-heavy engine, causes these non-GPU bottlenecks to appear.

I'm speaking out of my ass though.
 
I'm out of my league here, but I believe it has to do with the consoles using unified memory? That, combined with Gamebryo being a CPU-heavy engine, causes these non-GPU bottlenecks to appear.

I'm speaking out of my ass though.

I'd believe that. Doesn't excuse a shit port job though. Just because a PS4 has unified memory doesn't mean you force PC hardware to do the same thing. Things like this don't give me hope that a patch will even address the garbage performance in a meaningful way. Looking at a bunch of benchmarks for other games it seems like a very rare case where just upping the RAM clock gives such a performance boost.
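For context on why RAM clock can matter at all here: peak theoretical bandwidth is just transfer rate times bus width times channel count. A quick sketch (my own helper, not from any benchmark suite):

```python
def peak_bandwidth_gbs(mt_per_s, channels=2, bus_width_bytes=8):
    """Peak theoretical DDR bandwidth in GB/s.

    mt_per_s: megatransfers per second (e.g. 1600 for DDR3-1600).
    Each transfer moves bus_width_bytes (8 bytes on a 64-bit channel).
    """
    return mt_per_s * 1e6 * bus_width_bytes * channels / 1e9

print(peak_bandwidth_gbs(1600))  # 25.6 GB/s, dual-channel DDR3-1600
print(peak_bandwidth_gbs(2400))  # 38.4 GB/s, dual-channel DDR3-2400
```

So moving from 1600 to 2400 is a 50% bandwidth bump; most games never come close to saturating the lower figure, which is why this engine's sensitivity to it stands out.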
 
I'd believe that. Doesn't excuse a shit port job though. Just because a PS4 has unified memory doesn't mean you force PC hardware to do the same thing. Things like this don't give me hope that a patch will even address the garbage performance in a meaningful way. Looking at a bunch of benchmarks for other games it seems like a very rare case where just upping the RAM clock gives such a performance boost.

They put out some fairly hefty optimisations for Skyrim, so it's certainly a possibility.
 
Wait, what? How was the performance originally before the new driver?
And now I can't wait to get home and test it out myself.
For me, it dropped below 30 frames in cities even on medium; walking around them was awful. Now I get 60 on ultra with shadow distance and quality turned down to medium and high, respectively. It really has helped.
 
Surely that's just a placebo effect?

Well, if it's supposed to let the game use all the cores on a CPU, I guess I can understand why people would think it matters, but the game was already using all of my cores on my i5, albeit poorly.

This happens every few months with a PC release.

"Try this one crazy trick Bethesda doesn't wan't you to know about to eliminate stutters and framerate drops. You won't believe your eyes!"
 
I have a 7870 4GB. Game runs super smoothly, solid 60, even in cities now on ultra with shadow distance turned down. It's a miracle.

I find that quite bizarre, but hey, if it works, it works! Presumably you have god rays on Low or something, at least?

Getting the same performance on a 970 as you are with a 7870 is quite disheartening. :(

Keep playing for a while and let us know if you sustain that 60.
 
Guys, I am going to buy Fallout 4, but first I want to make sure my PC can run it. I used the Can You Run It website and got the following: my video card is a GeForce GTX 770, while the recommended was an NVIDIA GTX 780 3GB/AMD Radeon R9 290X 4GB or equivalent.

My CPU is an Intel Core i7-4770K @ 3.50GHz, and the recommended was an Intel Core i7 4790 3.6 GHz/AMD FX-9590 4.7 GHz or equivalent.
 
Guys, I am going to buy Fallout 4, but first I want to make sure my PC can run it. I used the Can You Run It website and got the following: my video card is a GeForce GTX 770, while the recommended was an NVIDIA GTX 780 3GB/AMD Radeon R9 290X 4GB or equivalent.

My CPU is an Intel Core i7-4770K @ 3.50GHz, and the recommended was an Intel Core i7 4790 3.6 GHz/AMD FX-9590 4.7 GHz or equivalent.
Is it a 2GB card? What screen resolution do you run?

I have a 2GB GTX 770 and an i5-3570k @ 4.2GHz, runs fine at 1680x1050. Just make sure you turn shadow distance and godrays down. Worst case, you can try a Steam refund too. :)
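For anyone hunting for the same wins, the shadow distance tweak mentioned above lives in Fallout4Prefs.ini. The exact values below are community-reported examples, not official presets, so treat them as a starting point:

```ini
; Fallout4Prefs.ini, [Display] section
; Ultra reportedly ships with values around 20000; the commonly shared
; "medium" value is 3000, which mainly shortens distant shadow draw.
fShadowDistance=3000.0
fDirShadowDistance=3000.0
```

God rays are exposed in the launcher's options, so no ini edit is needed for those.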
 
I'm hating myself for thinking this, but I'm looking at 6700Ks, and I'm reading more and more that Fallout 4 responds well to i7s with HT and faster RAM.

Both Digital Foundry and Gamers Nexus are showing this.

I'd link, but I'm on mobile.

A 600+ upgrade for a 10-15 fps gain in a Bethesda game?
i7 6700K

It's a great CPU, but I wouldn't worry about it just for Fallout 4. You'll still get framerate drops in the cities and random drops in other spots in any case (I do with my 6700K @ 4.7GHz, 16GB 2800MHz DDR4, yada yada). The game basically needs patching and video card driver updates.
 
It's a great CPU, but I wouldn't worry about it just for Fallout 4. You'll still get framerate drops in the cities and random drops in other spots in any case (I do with my 6700K @ 4.7GHz, 16GB 2800MHz DDR4, yada yada). The game basically needs patching and video card driver updates.

Alright, cool, thanks. You're right, I need to wait for more optimizations. I was on the verge of upgrading, but I'll hold off.

*throws wallet into a well*
 
I loaded this up the other day through Steam sharing with my brother, and man... I really do not get the GPU utilization in a number of scenes in the wasteland.

Right after you get out of the vault, you look forward and are greeted with 60fps @ 1620p.

Turn around and look at the low-lying lands towards the mountains, and it goes down to 35.

Mind you, it all looks similar in terms of geometric complexity.
 
I'm hating myself for thinking this, but I'm looking at 6700Ks, and I'm reading more and more that Fallout 4 responds well to i7s with HT and faster RAM.

Both Digital Foundry and Gamers Nexus are showing this.

I'd link, but I'm on mobile.

A 600+ upgrade for a 10-15 fps gain in a Bethesda game?
i7 6700K
I did it... and I don't regret it. I went with an i7 4790K, a new Z97 mobo and 16GB of 2400MHz DDR3 RAM. The game runs locked at 60fps now on ultra, even near the Corvega plant, in Diamond City, Lexington, you name it.
 
The game runs locked at 60fps now on ultra, even near the Corvega plant, in Diamond City, Lexington, you name it.

(reaction gif)
 
I just built a new PC yesterday, and to my delight the Fallout 4 launcher decided to pick Ultra for me.

i5-4690K
Gigabyte GTX 970
16GB DDR3 HyperX
Samsung 850 EVO 250GB SSD

I got out of the vault and everything was butter-smooth.

You're going to be in the 30s when you get to built-up areas. I just locked my fps to 30; it was shit at first, but I'm used to it now.
 
Guys, I am going to buy Fallout 4, but first I want to make sure my PC can run it. I used the Can You Run It website and got the following: my video card is a GeForce GTX 770, while the recommended was an NVIDIA GTX 780 3GB/AMD Radeon R9 290X 4GB or equivalent.

My CPU is an Intel Core i7-4770K @ 3.50GHz, and the recommended was an Intel Core i7 4790 3.6 GHz/AMD FX-9590 4.7 GHz or equivalent.

I have a 770 and a weaker CPU than yours, and I'm happy with my Fallout 4 performance.

http://www.neogaf.com/forum/showpost.php?p=184925858&postcount=1722

Those are my settings if you wanna tinker with them. I get ~60fps until I hit places like Corvega, and then dips happen. I could sacrifice some image quality for fps, but I don't think it's worth it, because I'd be sacrificing settings for what is essentially a small portion of the game.

Attached a screenshot of what you can expect.

 
Does anyone know how to change/disable the weather? For the past several days my game has been stuck in that blown-out nuclear radiation look. There's no color besides shades of green; it basically looks like permanent night vision. It doesn't matter if I'm inside or outside, it's always that blown-out light green shade. It's burning my retinas and I miss seeing color and shadows. I don't know if it's a bug or what. Before this happened the weather would cycle, but now, no matter where I go: weather, weather never changes. HELP!
 
Haven't been keeping up with this thread, so I apologise if this has been answered, but has anyone figured out how to stop the game from going into slow motion when the framerate drops below 45fps?
 
Haven't been keeping up with this thread, so I apologise if this has been answered, but has anyone figured out how to stop the game from going into slow motion when the framerate drops below 45fps?

Personally, my game speed isn't affected at all when the framerate dips.
Are you using any ini tweaks?
 