Digital Foundry tests "Ultimate Engine Tweaks" Unreal Engine INI file "mods" that supposedly improve performance. Results: "This doesn't do anything"

P4OLO, the mod creator, is also a member of RESETERA

THE MOD



Alex's findings? "Yes, these types of "mods" are completely bullshit."

Alex Battaglia said:
I'm gonna say - this mod, and this style of mods - I've always thought they did nothing, and I didn't want to test them because I couldn't imagine them doing anything... But after testing it A/B - and I did multiple runs, by the way - it runs the same whether you install this Engine.ini mod or not.

And I think we should all talk about why we think people think these do things. Like, why does this mod have 300,000 unique downloads?

 
Of course they don't work.

The only time ini tweaks can fix actual issues is when something like Silent Hill 2 or Jedi: Survivor happens, where delta time is calculated wrongly, leading to animation stutters. By locking the engine tick rate to a set value, you essentially circumvent the wrongly calculated delta time, fixing the animation stutter.

So in order to fix technical issues in a UE4/UE5 game, you need to actually know what causes the issue you want to fix. Decompressing files that were designed for single-core CPUs won't fix anything, as a modern CPU won't even register a light workload like that; neither will slightly changing graphics settings.
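For reference, the tick-rate workaround described above is usually done through Engine.ini. A minimal sketch, assuming a game that honors the stock UE config keys (the 60fps values here are illustrative, and heavily customized games may ignore these settings entirely):

```ini
; Engine.ini - pin the smoothed frame rate so delta time stays near a
; fixed value, masking a miscalculated delta time (the SH2/Survivor case).
[/Script/Engine.Engine]
bSmoothFrameRate=true
MinSmoothedFrameRate=60
MaxSmoothedFrameRate=60

[SystemSettings]
; Alternative hard cap via the standard UE console variable.
t.MaxFPS=60
```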
 
Haven't people posted screenshots in the main thread for Clair Obscur where they clearly removed DoF and things like that?
 
I used the TweakGuides ini edits in 2006 to remove grass and other stuff. It took me hours to get Oblivion to run at like 25 fps on my Pentium 4 and Radeon 9600. Now people claim that ini edits uploaded 30 minutes after launch actually do something.

Today I learned that the TweakGuides dude was diagnosed with Parkinson's at 44 and retired... I think I also used his Bioshock settings. He also wrote guides for Nvidia's website.
 
I used the TweakGuides ini edits in 2006 to remove grass and other stuff. It took me hours to get Oblivion to run at like 25 fps on my Pentium 4 and Radeon 9600. Now people claim that ini edits uploaded 30 minutes after launch actually do something.

Today I learned that the TweakGuides dude was diagnosed with Parkinson's at 44 and retired... I think I also used his Bioshock settings. He also wrote guides for Nvidia's website.
Jesus dude....fuckin depressing man.
 
I used the TweakGuides ini edits in 2006 to remove grass and other stuff. It took me hours to get Oblivion to run at like 25 fps on my Pentium 4 and Radeon 9600. Now people claim that ini edits uploaded 30 minutes after launch actually do something.

Today I learned that the TweakGuides dude was diagnosed with Parkinson's at 44 and retired... I think I also used his Bioshock settings. He also wrote guides for Nvidia's website.
Spent an obscene amount of time with Koroush's guides for various games during the 2000's. Such a shame what happened to him.
 
I used the TweakGuides ini edits in 2006 to remove grass and other stuff. It took me hours to get Oblivion to run at like 25 fps on my Pentium 4 and Radeon 9600. Now people claim that ini edits uploaded 30 minutes after launch actually do something.

Today I learned that the TweakGuides dude was diagnosed with Parkinson's at 44 and retired... I think I also used his Bioshock settings. He also wrote guides for Nvidia's website.

Damn, that is a hell of a well written and super detailed guide. I hope he was able to spend time with his family and do things he wanted while he had full mobility.
 
If the mod author had proposed an Engine.ini mod that focused solely on addressing input lag, visual clarity, and similar issues, we wouldn't be having this conversation. All of those improvements can be achieved through the Engine.ini framework and are entirely legitimate. However, calling it an "Anti-Stutters" mod is a blatant lie. Sure, you can reduce the severity of stuttering, but eliminating it entirely? No. That's a job for the developers.
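For what it's worth, the legitimate category of tweak described above does exist: UE exposes real console variables for this kind of thing via Engine.ini. A sketch of visual-clarity edits that actually take effect in many (not all) UE titles - the cvar names are standard UE, but the values are illustrative:

```ini
[SystemSettings]
; Standard UE scalability cvars; honored by many, not all, UE games.
r.DepthOfFieldQuality=0   ; disable depth of field
r.MotionBlurQuality=0     ; disable motion blur
r.Tonemapper.Sharpen=0.8  ; mild post-process sharpening
```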
 
Placebo is really strong with people.
I notice it all the time in Steam forums when people say to force anti-aliasing in the Nvidia control panel to improve graphics (or sometimes, somehow, improve performance?), when forced MSAA through the Nvidia control panel stopped working as of DX11.
 
Placebo is really strong with people.
I notice it all the time in Steam forums when people say to force anti-aliasing in the Nvidia control panel to improve graphics (or sometimes, somehow, improve performance?), when forced MSAA through the Nvidia control panel stopped working as of DX11.

I always laugh when people suggest forcing AA in the Nvidia control panel for modern games. The ignorance, man...

As for these anti-stutter mod fixes, I'm not surprised. You can remove certain effects like DOF, or add better-looking shadows and such, but removing stuttering always sounded weird to me and always seemed like a placebo.
 
This is a joke post right? I can't tell anymore.
Yes about that...

I'm convinced that people not liking 30FPS is mostly down to bad eyesight. Remember, GTA got big at less than 30FPS.

I blame the focus on frame rates and ray tracing.
If we could do 30FPS without ray tracing, graphics would be massively improved - and I mean no option above 30FPS and no ray tracing option at all.

That's also what locked 30FPS looks like to me.
I don't get why some people don't see it as smooth.
 
Yeah, those "ultimate" .ini tweaks that come out a couple of hours after a UE5 game launches are complete placebo/snake oil. It's not like the engine's devs forgot a magic cvar that fixes traversal and shader compilation stuttering. Those are deep-seated issues with the engine that can't be fixed by users.

Also, some UE5 games are heavily customized and might not even use the standard UE5 variables.

Maybe things like setting the streaming pool size can mitigate some issues on low-end configs, but it doesn't fix anything.
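As a concrete example of that one plausible exception: the texture streaming pool is a real, documented UE console variable (value in MB). Whether raising it helps depends entirely on the game and available VRAM - and it does nothing for shader compilation or traversal stutter:

```ini
[SystemSettings]
; Raise the texture streaming pool above the common 1000 MB default.
; May reduce texture pop-in/streaming hitches on some configs;
; irrelevant to shader-compilation or traversal stutter.
r.Streaming.PoolSize=3000
```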
 
Not being able to tell how much smoother 60fps is, is more a sign of bad eyesight than anything else. Although I'm not sure how anyone could fail to immediately feel the difference in latency and response times.
Or maybe you were told 30 is bad and your brain adapts. A lot of the people reporting this on GAF have said 30FPS gets worse over time for those who use 60FPS a lot.

Some now even say 60FPS is unplayable.

It's an interesting case, but something that is said to start out fine and get worse over time points to that. It's incredibly unlikely your eyesight got better over time.
 
Absolute state of Unreal Engine on PC.

The fact that we're even discussing this is hilarious in a bad way. The only way to fix stutter in UE is usually to buy more VRAM.
 
Absolute state of Unreal Engine on PC.

The fact that we're even discussing this is hilarious in a bad way. The only way to fix stutter in UE is usually to buy more VRAM.

Yep. If the engine weren't such a piece of crap, we wouldn't have these grifters making money off victims.

Disgusting engine, seriously.
 
Not being able to tell how much smoother 60fps is, is more a sign of bad eyesight than anything else. Although I'm not sure how anyone could fail to immediately feel the difference in latency and response times.
More like bad "brainsight" - just like people who can't see 3D properly (stereo blindness): their eyes are fine, their brain isn't.
 
Yep. If the engine was not a piece of crap, we wouldn't have these grifters making money out of victims

Disgusting engine seriously
In the end, even with the Koboh FPS drops, it was way easier for me to finish Jedi: Survivor on PS5 (and now re-run it on the Pro) than to wait for PC patches.
 
Everyone who isn't a moron knows this. The mod creator doesn't provide benchmarks or describe what each line does. There are also changes that apply to DX11, which the game doesn't use.
 
Or maybe you were told 30 is bad and your brain adapts. A lot of the people reporting this on GAF have said 30FPS gets worse over time for those who use 60FPS a lot.

Some now even say 60FPS is unplayable.

It's an interesting case, but something that is said to start out fine and get worse over time points to that. It's incredibly unlikely your eyesight got better over time.
I've known the difference between 30fps and 60fps since my early teens, when I had 20/20 vision. It's also super easy to tell the difference in a test where you aren't told which is 30fps and which is 60fps. Not being able to tell the difference between 30fps and 60fps is a sign of perception issues or slower visual processing. Obviously.
 
I've known the difference between 30fps and 60fps since my early teens, when I had 20/20 vision. It's also super easy to tell the difference in a test where you aren't told which is 30fps and which is 60fps. Not being able to tell the difference between 30fps and 60fps is a sign of perception issues or slower visual processing. Obviously.
Does this also apply for 60FPS and 120FPS?
 
I remember tweaking these to fix Tales of Arise's draw distance on PC, and they 100% worked. They got reverted in a patch, but I was able to complete the game with settings the in-game menu did not allow. This was UE4.
 
Does this also apply for 60FPS and 120FPS?
Of course, but the higher you go, the less noticeable it becomes for most people. The difference between 240fps and 480fps is a whole extra 240fps, but it has far less impact than the jump from 30 to 60. As you go higher, the perceived differences lessen, as your eyes and brain simply can't register such tiny differences between frames. Everyone has a limit where adding more frames makes little difference to them.
 
30fps with no motion blur sucks. Motion blur + my LG C4 = super blurry in motion.

It's not unplayable, but it's a bad experience for me.
That's fair. And it certainly sucks

Even going back from 120fps to 60fps sucks a bit

But unplayable? Nah.
 
Does this also apply for 60FPS and 120FPS?

The difference between 33ms and 16ms is very noticeable. That's 30 vs 60fps.
The difference between 16ms and 8ms is still easily perceivable. That's 60 vs 120fps.
The difference between 8ms and 7ms is barely noticeable. That's 120fps vs 144fps.
The difference between 7ms and 4ms is only noticeable if you are a twitch gamer. That's 144 vs 240fps.

The higher you go in FPS, the smaller, and thus less noticeable, the differences in frametime get.

For me, anything above 90fps, I can't really tell the difference.
With small fluctuations from 90 to 120, I honestly wouldn't be able to tell there was a drop.
Playing on a 240Hz panel, what actually trips me out is the mouse.

Think of it like this (ceteris paribus):
I tell you to go for a 10-mile run - you will feel it.
I tell you to halve it - you feel the difference.
I tell you to keep halving it - you keep feeling the shorter distances, naturally.
At some point the distance between two halvings is so small you can't really feel the difference.
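The arithmetic behind those numbers is just frametime = 1000 / fps, which is why the savings shrink at every doubling. A quick sketch in plain Python (nothing game-specific assumed):

```python
# Frame time in milliseconds at a given frame rate.
def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

# Each doubling of fps halves the frametime, so the absolute
# savings per frame shrink as the frame rate climbs.
for lo, hi in [(30, 60), (60, 120), (120, 240), (240, 480)]:
    saved = frametime_ms(lo) - frametime_ms(hi)
    print(f"{lo} -> {hi} fps saves {saved:.1f} ms per frame")
# 30 -> 60 saves 16.7 ms, but 240 -> 480 saves only 2.1 ms
```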
 
I'm very familiar with these "download this dll and install it while dancing naked under the moonlight" solutions. Always fun.
 
Someone should ask DF if they can open a terminal, type either "sudo rm -rf /*" or "format C:", and report back what happens.
 
It's 2025 and some people here still think there's no difference between 30 and 60 to 120, lmao. It's not that they don't notice - they don't want to notice, because they either play only on consoles at 30fps or don't have hardware capable of handling anything more. Even if your vision is impaired or your brain is slow, it's impossible not to notice the huge difference.
 
Guys I opened cyberpunk2077.ini and added SETFRAMERATE=30000 and now the game runs at 30,000 frames per second.

It's literally that easy.
 
I mainly used that mod to get the brightness to a normal level. Maybe it's a placebo effect too, but it seemed to work.
 
Of course, but the higher you go, the less noticeable it becomes for most people. The difference between 240fps and 480fps is a whole extra 240fps, but it has far less impact than the jump from 30 to 60. As you go higher, the perceived differences lessen, as your eyes and brain simply can't register such tiny differences between frames. Everyone has a limit where adding more frames makes little difference to them.
Anecdotally, it seems to be the opposite: adding more frames "worsens" your vision, making it harder for your brain to process fewer frames.
 
The difference between 33ms and 16ms is very noticeable. That's 30 vs 60fps.
The difference between 16ms and 8ms is still easily perceivable. That's 60 vs 120fps.
The difference between 8ms and 7ms is barely noticeable. That's 120fps vs 144fps.
The difference between 7ms and 4ms is only noticeable if you are a twitch gamer. That's 144 vs 240fps.

The higher you go in FPS, the smaller, and thus less noticeable, the differences in frametime get.

For me, anything above 90fps, I can't really tell the difference.
With small fluctuations from 90 to 120, I honestly wouldn't be able to tell there was a drop.
Playing on a 240Hz panel, what actually trips me out is the mouse.

Think of it like this (ceteris paribus):
I tell you to go for a 10-mile run - you will feel it.
I tell you to halve it - you feel the difference.
I tell you to keep halving it - you keep feeling the shorter distances, naturally.
At some point the distance between two halvings is so small you can't really feel the difference.
I bet it wasn't always this way for you.
30 would have been fine at one point, then 60 and now 90.
 