Digital Foundry tests "Ultimate Engine Tweaks" Unreal Engine INI file "mods" that supposedly improve performance. Results: "This doesn't do anything"

Can't watch it since I'm at work, but does he prove it with some video comparisons/tech data or are we supposed to take his word for it?

Not a PC gamer, so I can't speak as to whether or not those things are useful, just curious about the process.
 
It's a small workaround; it helps occasionally. Only a fool would think it's a miraculous solution.
For example, I tried one with Silent Hill 2 and got less stuttering but more pop-in.
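To give an idea of what these packs contain: it's usually a handful of lines like these in Engine.ini, under [SystemSettings]. The cvar names below are stock UE ones (they also show up later in this thread); the values are purely illustrative, and they are exactly the kind of thing that trades stutter for pop-in:

[SystemSettings]
; smaller texture pool (in MB) = less VRAM pressure and fewer allocation spikes, but mips get evicted sooner, so more pop-in
r.Streaming.PoolSize=2048
; bias streamed textures one mip level down = less streaming work, but blurrier textures
r.Streaming.MipBias=1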
 
Can't watch it since I'm at work, but does he prove it with some video comparisons/tech data or are we supposed to take his word for it?

Not a PC gamer, so I can't speak as to whether or not those things are useful, just curious about the process.
It's hard data.
 
I bet it wasn't always this way for you.
30 would have been fine at one point, then 60 and now 90.

I was "fine" with the 30 on console because I didnt have a choice, but even during OGX generation my PC was actually running at 75Hz so i could tell the games were running slower. (Even if alot of OGX games were actually 60 too).

On Dreamcast I could even tell the difference between the 50Hz and 60Hz modes.




I'm not, like, hyper-averse to 30fps; I replay a lot of older titles on console, so there's that.
But on PC... never!
 
Or maybe you were told 30 is bad and your brain adapted. A lot of the people saying this on GAF have said 30fps gets worse over time for those who play at 60fps a lot.

Some now even say 60FPS is unplayable.

It's an interesting case, but something that is said to start out fine and get worse over time lends credence to that. It is incredibly unlikely your eyesight got better over time.
I've got 20/20, same vision I had 17 years ago when I joined the Army. I'm partially color blind, but my actual vision is great.

30 fps just isn't pleasant to look at for me, never was, never will be. It's the entire reason why I switched to PC as my primary platform so long ago, but I'm pleased to see how everyone's getting tons of high fps games on console now as well.

Connecting the perceptible motion clarity of content to one's vision doesn't seem terribly accurate to me.
 
Finally, I always feel gaslit by these types of mods. Like, really, a game has been out for months, but the devs haven't figured out a way to edit the ini file to fix the stuttering?
 
I've tried almost all the popular mods for Jedi Survivor and Hogwarts Legacy.
None of them worked.
Some of them made it worse.
Some of them seemed to work somewhat, but ended up with potato textures loading all around.

So no, I have no trust in these mods anymore.
 
The creator of the .ini tweak is getting cooked on ERA.
Got a strong feeling he is going to be a member of GAF soon.
 
I took a quick look at this ini file.
And it's a mess. Some tweaks are real and do things.
But then there are lines that are from UE4 and have been deprecated in UE5. And then there are lines that don't even exist in the UE5 cvar list.
I also noticed that some lines are not properly configured. The guy probably doesn't know what they do.

But the assessment from Alex that they do nothing is also wrong.
The reality is that some do and can provide improvements.

Another issue to consider is that some games block tweaks to the ini files. And some even block any tweaks at all.
So if someone tests some cvars in a game that has the ini blocked, of course it's not going to work.
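For anyone who wants to test individual cvars themselves instead of downloading a pack: in most UE5 games the user-side config is Engine.ini under the game's Saved folder, and cvars go in a [SystemSettings] section. The path below is the usual location (UE4 titles use WindowsNoEditor instead of Windows, and the folder is the internal project name, not necessarily the store name):

%LOCALAPPDATA%\<ProjectName>\Saved\Config\Windows\Engine.ini

[SystemSettings]
r.OneFrameThreadLag=0

If a game reads its config from somewhere else, or rewrites the file on launch, that's the "blocked" case I mean.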
 
Like many people, I wanted to believe in the past that these tweak "mods" work, but nearly all of them are a sham. Even the ReShade ones are just someone else's ReShade presets; you can do whatever you want with your own preferences in actual ReShade.

That being said, some minor tweaks in .ini files do work. For example, in some Assassin's Creed games you can unlock the cloth physics from its 30 fps cap, etc.

But I think everybody wants to believe in miracles with the .ini files, and that somehow they will fix all the problems 🤷‍♂️
 
But the assessment from Alex that they do nothing is also wrong.
The reality is that some do and can provide improvements.

Note: he is assessing Oblivion stuttering and performance specifically.
If they reduce settings, obviously there will be average fps gains, but with a lot of them you can just use the settings menu to play on low... beyond that, none of them do anything to actually address Oblivion's issues.

This is what the test was about. Changing settings to low or beyond low is NOT what these tweaks are claiming; they claim to improve stuttering/streaming/performance without greatly impacting visual quality.

So if you have a tweak that actually addresses what we are talking about, please link it.
 
Anecdotally, it seems it is the opposite: when more frames are added, it worsens your vision, making it difficult for your brain to process fewer frames.
That isn't worsening your vision. When you are accustomed to higher frame rates, your visual cortex gets used to not having to filter so much of what you are seeing. When you suddenly go back to low frame rates, you are seeing the image as it really is (or closer to what it actually is), and then it takes time for your visual cortex to start filtering out the judder once again.
 
They don't make them like they used to... I remember having a PC without a dGPU around 2007. I had to mod RE4 to bypass the vertex shader check, and then the game miraculously ran, and ran great. Not even my newer PC later ran it as well, even though that one didn't require the mod... I'm feeling nostalgic 🥲
 
Note: he is assessing Oblivion stuttering and performance specifically.
If they reduce settings, obviously there will be average fps gains, but with a lot of them you can just use the settings menu to play on low... beyond that, none of them do anything to actually address Oblivion's issues.

This is what the test was about. Changing settings to low or beyond low is NOT what these tweaks are claiming; they claim to improve stuttering/streaming/performance without greatly impacting visual quality.

So if you have a tweak that actually addresses what we are talking about, please link it.

Here is an example of a tweak I usually use to reduce input latency. And it's very noticeable.

r.OneFrameThreadLag=0

It's not in the engine.ini that guy made. And I bet Alex never tested it.

For streaming data, I got good results in Lords of the Fallen with these tweaks.
Mind you, it uses a lot of VRAM. But I noticed a good improvement.

r.Streaming.LimitPoolSizeToVRAM=1
r.Streaming.PoolSize=5000
r.Streaming.MipBias=0
r.Streaming.Boost=1.5
r.Streaming.FullyLoadUsedTextures=1
r.Streaming.HLODStrategy=0
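In case it saves someone a headache: these go into Engine.ini under [SystemSettings], and PoolSize is in megabytes, so this assumes a card with well over 8GB:

[SystemSettings]
; ~5GB of VRAM reserved just for the texture streaming pool (value is in MB)
r.Streaming.PoolSize=5000
; but never let the pool exceed what the GPU actually has
r.Streaming.LimitPoolSizeToVRAM=1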
 
Anecdotally, it seems it is the opposite: when more frames are added, it worsens your vision, making it difficult for your brain to process fewer frames.
I always preferred the feel and look of games that ran at 60fps. I can also play a 30fps game just fine; I just find a 60fps game better on every metric.

You're claiming that because I have tasted a Michelin chef's meal, I can no longer enjoy a cheap McDonald's burger and my taste buds are worse. No, I can enjoy the McDonald's burger just fine, but I vastly prefer the Michelin-class meal. The person who thinks McDonald's and the best burgers in the world taste the same obviously has worse taste buds.
 
Here is an example of a tweak I usually use to reduce input latency. And it's very noticeable.

r.OneFrameThreadLag=0

It's not in the engine.ini that guy made. And I bet Alex never tested it.

For streaming data, I got good results in Lords of the Fallen with these tweaks.
Mind you, it uses a lot of VRAM. But I noticed a good improvement.

r.Streaming.LimitPoolSizeToVRAM=1
r.Streaming.PoolSize=5000
r.Streaming.MipBias=0
r.Streaming.Boost=1.5
r.Streaming.FullyLoadUsedTextures=1
r.Streaming.HLODStrategy=0

They don't help with any of the problems Oblivion has.
 
Did you try it?
What results did you get? Do you have enough VRAM?

16GB of VRAM.
RTX4080.

My GPU isn't the problem.
I could even play the game at low settings in 1080p and I'd still get almost 100% replicable hitches.
None of the tweaks make the stutters go away or even make them feel any better.

Literally not gonna bother till they patch it, but I've got Expedition 33 to go through, so I ain't exactly pissed.
 
16GB of VRAM.
RTX4080.

My GPU isn't the problem.
I could even play the game at low settings in 1080p and I'd still get almost 100% replicable hitches.
None of the tweaks make the stutters go away or even make them feel any better.

Literally not gonna bother till they patch it, but I've got Expedition 33 to go through, so I ain't exactly pissed.

I'm starting to think that game might have the ini files blocked, so it just ignores any tweak.
 
I'm starting to think that game might have the ini files blocked, so it just ignores any tweak.

The inis aren't locked, cause you can get below-low and other settings to hook.
It's just that none of them make the stuttering go away.
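By the way, a dead-simple way for anyone to prove the inis hook (r.ScreenPercentage is a stock UE cvar; 25 is just a deliberately ugly test value):

[SystemSettings]
r.ScreenPercentage=25

If the game boots up looking like a blurry mess, the file is definitely being read; the stutter tweaks just aren't doing anything.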

If fixing the stuttering were as simple as a few ini tweaks, do you really think the stutter struggle would even be a thing?

The problem is deeper than a few ini tweaks.
 
The inis aren't locked, cause you can get below-low and other settings to hook.
It's just that none of them make the stuttering go away.

If fixing the stuttering were as simple as a few ini tweaks, do you really think the stutter struggle would even be a thing?

The problem is deeper than a few ini tweaks.

I have already managed to get several games to run very well with some of these tweaks, both in UE4 and UE5.
 
They almost never do anything. I tried every variation of the Silent Hill 2 tweaks to solve the UE5 stutter, and my performance got worse when using them.

The solution I found in the end was

playing it on PS5.
 
I was one of those "it runs fine on my computer" people. I wasn't able to see the stutters till I started watching Digital Foundry. And thank God, because now I can't stand playing the game anymore.
 
I have already managed to get several games to run very well with some of these tweaks, both in UE4 and UE5.

Games that had the stutter struggle, or games that had other issues that could be solved by changing settings?

Cuz we might be talking about two completely different things.

Sure, your steak came out great when you changed the oven temperature, but I'm doing a stir-fry, so your solution for getting good beef doesn't actually help anyone doing a stir-fry.
 
Although the most acute judges of the witches and even the witches themselves, were convinced of the guilt of witchery, the guilt nevertheless was non-existent. It is thus with all guilt. - Friedrich Nietzsche

To expound: with .ini file tweaks, you were never fixing a universal PC gaming programming issue that would require recoding the entire game to actually fix. Yet many thought they had already done this impossible thing and had it installed on their computer. That is how unreliable all eyewitness testimony is. Do you think you could tell with 100% certainty whether there was a difference or not, without someone to confirm it for you?
 
Games that had the stutter struggle, or games that had other issues that could be solved by changing settings?

Cuz we might be talking about two completely different things.

Sure, your steak came out great when you changed the oven temperature, but I'm doing a stir-fry, so your solution for getting good beef doesn't actually help anyone doing a stir-fry.

There are two sources of stutters. One is shader compilation, and in this case there is no solution; we just have to let the game compile shaders while playing.
The other is asset streaming, and with tweaks to cache more data in VRAM it can be mitigated.

It also helps to have the system clocks and ticks properly configured.
UE4 and UE5 games also run better with Fullscreen Optimizations disabled on the exe, and with the High DPI scaling override set to Application.
And disable Control Flow Guard for the game's exe, the one in the Win64 folder.
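For reference, one way to do the CFG part without clicking through Windows Security is from an elevated PowerShell (the exe name below is a placeholder; use the actual *-Win64-Shipping.exe from the game's Win64 folder):

# disable Control Flow Guard for this exe only (run as admin)
Set-ProcessMitigation -Name "Game-Win64-Shipping.exe" -Disable CFG

Running Get-ProcessMitigation -Name with the same exe name shows whether the override took.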

 
There are two sources of stutters. One is shader compilation, and in this case there is no solution; we just have to let the game compile shaders while playing.
The other is asset streaming, and with tweaks to cache more data in VRAM it can be mitigated.

It also helps to have the system clocks and ticks properly configured.
UE4 and UE5 games also run better with Fullscreen Optimizations disabled on the exe, and with the High DPI scaling override set to Application.
And disable Control Flow Guard for the game's exe, the one in the Win64 folder.

Sounds like fun.

How could they fuck this engine up so badly that it won't run properly on a PC? That is crazy to me. They had one job.
 
I took a quick look at this ini file.
And it's a mess. Some tweaks are real and do things.
But then there are lines that are from UE4 and have been deprecated in UE5. And then there are lines that don't even exist in the UE5 cvar list.
I also noticed that some lines are not properly configured. The guy probably doesn't know what they do.

But the assessment from Alex that they do nothing is also wrong.
The reality is that some do and can provide improvements.

Another issue to consider is that some games block tweaks to the ini files. And some even block any tweaks at all.
So if someone tests some cvars in a game that has the ini blocked, of course it's not going to work.
Yep. Hogwarts was an unplayable mess until I tried some of these tweaks.
 
There are two sources of stutters. One is shader compilation, and in this case there is no solution; we just have to let the game compile shaders while playing.
The other is asset streaming, and with tweaks to cache more data in VRAM it can be mitigated.
There are many, many more sources of stutters.

I'd argue that most of what people call "traversal stutters" are in fact engine data management stutters (object initialization, effects permutations leading to sudden data transfer spikes, CPU intervention in the rendering process, a whole bunch of other things happening in the engine), which aren't even linked to traversal per se so much as to gameplay in general. And you definitely can't avoid them by caching more into VRAM, since they aren't even happening on the GPU and mostly relate to how fast the CPU is at both data processing and data movement (so RAM and bus speeds and latencies).

Also consider that caching more means you still have streaming; you're just streaming bigger chunks of the game data. So if there are hitches related to streaming, you'd basically have to "cache" all of the game's data, which would effectively disable streaming, and that isn't practical even on PCs with top-end h/w. So even in that case, caching more data likely won't do much unless it's also coupled with a different streaming strategy, in which case why not just use that in the first place?
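If you want to check whether the texture pool is even the bottleneck before throwing cvars at it, UE ships stat counters for this. In games where the console is reachable (often only via an unlocker mod), these are stock commands:

stat unit
stat streaming

stat unit breaks frame time into game/render/GPU thread times, and stat streaming shows how much pool the streamer wants versus has; a hitch that lands on the game thread while the pool isn't full isn't a streaming-pool problem.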
 
I took a quick look at this ini file.
And it's a mess. Some tweaks are real and do things.
But then there are lines that are from UE4 and have been deprecated in UE5. And then there are lines that don't even exist in the UE5 cvar list.
I also noticed that some lines are not properly configured. The guy probably doesn't know what they do.

But the assessment from Alex that they do nothing is also wrong.
The reality is that some do and can provide improvements.

Another issue to consider is that some games block tweaks to the ini files. And some even block any tweaks at all.
So if someone tests some cvars in a game that has the ini blocked, of course it's not going to work.

I have never seen any .ini tweak that fixes I/O- or shader-related stutters, and that is what you see in Oblivion.
The only thing that could help is using ini tweaks to turn off effects that cause framerate issues. But that's rare in UE4/5 games, as many devs just use the default engine settings in the menus anyway.
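For what it's worth, the kind of tweak that does demonstrably "work" is disabling whole effects. These are stock UE cvars; whether they help at all depends entirely on what is actually costing frames in a given game:

[SystemSettings]
; turn off motion blur, lens flares and volumetric fog entirely
r.MotionBlurQuality=0
r.LensFlareQuality=0
r.VolumetricFog=0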
 
Absolutely! If you can't tell the difference between 30FPS / 60FPS and 120FPS, then it's bad times for you :(

If you honestly can't tell the difference, then leave those of us who clearly can alone lol. Please don't project on us :D
I remember Costco having a great sale on a 144Hz monitor, and I convinced my uncle to get it in 2017. Years of shit talk went out the window when he started talking about how much better just browsing the web was with a higher refresh rate. People are wild to say a lower frame rate is "better".
 
I remember Costco having a great sale on a 144Hz monitor, and I convinced my uncle to get it in 2017. Years of shit talk went out the window when he started talking about how much better just browsing the web was with a higher refresh rate. People are wild to say a lower frame rate is "better".

Yup. Just move a mouse at a steady pace at 30Hz vs 144Hz and it's a huuuuge difference. Never mind games.
 
Not near my computer until this evening, but when I last played the Oblivion remaster, it looked like it was only running on a couple of cores.

Is anyone able to check and confirm?

If that's the case, then I doubt we'll ever see a performance patch, as it won't be easy to fix the game engine.
 
There are many, many more sources of stutters.

I'd argue that most of what people call "traversal stutters" are in fact engine data management stutters (object initialization, effects permutations leading to sudden data transfer spikes, CPU intervention in the rendering process, a whole bunch of other things happening in the engine), which aren't even linked to traversal per se so much as to gameplay in general. And you definitely can't avoid them by caching more into VRAM, since they aren't even happening on the GPU and mostly relate to how fast the CPU is at both data processing and data movement (so RAM and bus speeds and latencies).

Also consider that caching more means you still have streaming; you're just streaming bigger chunks of the game data. So if there are hitches related to streaming, you'd basically have to "cache" all of the game's data, which would effectively disable streaming, and that isn't practical even on PCs with top-end h/w. So even in that case, caching more data likely won't do much unless it's also coupled with a different streaming strategy, in which case why not just use that in the first place?
Dead Space is a classic example of this. I captured the below a couple of years back at one of the squares in the game that trigger loading. As I'm monitoring the SSD as well, you can see that when you hit that invisible load section there's a big spike in SSD reads and CPU, and then the associated frame time spike.

I have thought about revisiting this and loading the game onto a RAM disk, as I have 96GB, to see if this improves things.

 
Dead Space is a classic example of this. I captured the below a couple of years back at one of the squares in the game that trigger loading. As I'm monitoring the SSD as well, you can see that when you hit that invisible load section there's a big spike in SSD reads and CPU, and then the associated frame time spike.

I have thought about revisiting this and loading the game onto a RAM disk, as I have 96GB, to see if this improves things.


Dead Space is doing data streaming wrong, as it does it in big chunks at specific points instead of doing it constantly in the background. The issue with that is the data movement from storage to RAM to VRAM, which floods the buses and copy engines, which leads to a hitch.

You could try running it off a RAM disk, but with this issue it's bound to produce even bigger hitches: the speed of data transfer off a RAM disk will be even higher, meaning the impact on bus usage will be even more severe, and the resulting starvation of the GPU can be even worse.

With such a problem, a solution can in fact be the opposite, i.e. using slower storage for the game to limit the impact these chunks have on the buses and copy engines when loaded. There are some games where this in fact helps reduce the hitching.

All in all, this is just a badly designed streaming system which doesn't take into account the fact that other devices on the same bus also need some bandwidth to work when streaming happens.
 
I was impressed with UE3 back in '09; UE4 was mostly forgettable; with UE5 I thought you could kinda see real advancements.
 