It's hard data.
Can't watch it since I'm at work, but does he prove it with some video comparisons/tech data, or are we supposed to take his word for it?
Not a PC gamer, so I can't speak as to whether or not those things are useful, just curious about the process.
I bet it wasn't always this way for you.
30 would have been fine at one point, then 60 and now 90.
I've got 20/20, same vision I had 17 years ago when I joined the Army. I'm partially color blind, but my actual vision is great.
Or maybe you were told 30 is bad and your brain adapts. A lot of the cases on GAF have said 30FPS gets worse over time for those that use 60FPS a lot.
Some now even say 60FPS is unplayable.
It's an interesting case, but something that is said to start out fine and get worse over time lends credence to that. It is incredibly unlikely your eyesight got better over time.
But the assessment from Alex that they do nothing is also wrong.
The reality is that some do and can provide improvements.
That isn't worsening your vision. When you are accustomed to higher frame rates, your visual cortex gets used to not having to filter so much of what you are seeing. When you suddenly go back to low frame rates, you are seeing the image as it really is (or closer to what it actually is), and then it takes time for your visual cortex to start filtering out the judder once again.
Anecdotally, it seems it is the opposite: as more frames are added, it worsens your vision, making it difficult for your brain to process fewer frames.
Note. He is assessing Oblivion stuttering and performance specifically.
If they reduce settings, obviously there will be average FPS gains, but with a lot of them you can just use the settings menu to play on low. Beyond that, none of them do anything to actually address Oblivion's issues.
That is what the test was about. Changing settings to low or below low is NOT what these tweaks claim to do; they claim to improve stuttering/streaming/performance without greatly impacting visual quality.
So if you have a tweak that actually addresses what we are talking about, please link it.
I always preferred the feel and look of games that ran at 60fps. I can also play a 30fps game just fine, I just find a 60fps game better on every metric.
Anecdotally, it seems it is the opposite: as more frames are added, it worsens your vision, making it difficult for your brain to process fewer frames.
Here is an example of a tweak I usually use to reduce input latency. And it's very noticeable.
r.OneFrameThreadLag=0
It's not in the engine.ini that guy made. And I bet Alex never tested it.
For streaming data, I got good results with Lords of the Fallen with these tweaks.
Mind you, it uses a lot of VRAM. But I noticed a good improvement.
r.Streaming.LimitPoolSizeToVRAM=1
r.Streaming.PoolSize=5000
r.Streaming.MipBias=0
r.Streaming.Boost=1.5
r.Streaming.FullyLoadUsedTextures=1
r.Streaming.HLODStrategy=0
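In case anyone wants to try these: they would normally go under [SystemSettings] in the game's Engine.ini override. The lines below are just a sketch of that placement; the folder name under %LOCALAPPDATA% varies per game (WindowsNoEditor for most UE4 titles), so treat the path as a placeholder.

; Engine.ini -- typically %LOCALAPPDATA%\<GameFolder>\Saved\Config\Windows\Engine.ini (folder name varies per game)
[SystemSettings]
r.OneFrameThreadLag=0
r.Streaming.LimitPoolSizeToVRAM=1
; PoolSize is in MB, so 5000 reserves roughly 5 GB of VRAM for the texture streaming pool
r.Streaming.PoolSize=5000
r.Streaming.MipBias=0
r.Streaming.Boost=1.5
r.Streaming.FullyLoadUsedTextures=1
r.Streaming.HLODStrategy=0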
They don't help with any of the problems Oblivion has.
Did you try it?
What results did you get? Do you have enough vram?
16GB of VRAM.
RTX4080.
My GPU isn't the problem.
I could even play the game on low at 1080p and I'll still have hitches that are almost 100% reproducible.
None of the tweaks make the stutters go away or even make them feel any better.
Literally not gonna bother till they patch it but I've got Expedition 33 to go through so I ain't exactly pissed.
I'm starting to think that game might have the ini files blocked, so it just ignores any tweak.
The inis aren't locked, because you can get below-low and other settings to take effect.
Just that none of them make the stuttering go away.
If fixing the stuttering were as simple as a few ini tweaks, do you really think the stutter struggle would even be a thing?
The problem is deeper than a few ini tweaks.
I have already managed to get several games to run very well with some of these tweaks, both in UE4 and UE5.
Games that had stutter struggle or had other issues that could be solved by changing settings?
Cuz we might be talking about two completely different things.
Sure your steak came out great when you changed the oven temperature, but I'm doing a stir fry, so your solution to getting good beef doesn't actually help anyone doing a stir fry.
Sounds like fun.
There are 2 sources of stutters. Shader compilation, and in this case, there is no solution; we just have to let the game compile shaders while playing.
The other is from asset streaming. And with tweaks to cache more data into VRAM, it can be mitigated.
It also helps to have the system's clocks and timer ticks configured properly.
UE4 and UE5 games also run better with Fullscreen Optimizations disabled on the exe, and with the high DPI scaling override set to Application.
And disable Control Flow Guard for the game's exe, the one in the Win64 folder.
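If you would rather script those per-exe toggles than click through the exe's Properties > Compatibility tab, a .reg sketch like the one below should cover the first two; the path is only a placeholder, and Control Flow Guard would still be flipped per program under Windows Security's Exploit protection settings (or with PowerShell's Set-ProcessMitigation -Name <exe> -Disable CFG).

Windows Registry Editor Version 5.00

; DISABLEDXMAXIMIZEDWINDOWEDMODE = "Disable fullscreen optimizations", HIGHDPIAWARE = DPI scaling override set to Application
[HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers]
"C:\\Games\\SomeUEGame\\Binaries\\Win64\\SomeUEGame-Win64-Shipping.exe"="~ DISABLEDXMAXIMIZEDWINDOWEDMODE HIGHDPIAWARE"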
Pre-compiling is not the solution how?
There are 2 sources of stutters. Shader compilation, and in this case, there is no solution; we just have to let the game compile shaders while playing.
He probably switched genders when he developed it.
P4OLO, the mod creator, is also a member of RESETERA
Yep. Hogwarts was an unplayable mess until I tried some of these tweaks.
I took a quick look at this ini file.
And it's a mess. Some tweaks are real and do things.
But then there are lines that are from UE4 and have been deprecated in UE5. And then there are lines that don't even exist in the UE5 cvar list.
I also noticed that some lines are not properly configured. Probably the guy doesn't know what they do.
But the assessment from Alex that they do nothing is also wrong.
The reality is that some do and can provide improvements.
Another issue to consider is that some games block tweaks to the ini files. And some even block any tweaks at all.
So if someone tests some cvars in a game that has the ini blocked, of course it's not going to work.
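For what it's worth, a quick way to check whether a game is even reading the override ini: drop in a cvar with an unmissable visual effect as a canary and see if anything changes. r.ScreenPercentage below is purely a test value, not a recommendation.

[SystemSettings]
; canary only: if the image doesn't become noticeably soft, the game is ignoring the ini
r.ScreenPercentage=50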
There are many, many more sources of stutters.
There are 2 sources of stutters. Shader compilation, and in this case, there is no solution; we just have to let the game compile shaders while playing.
The other is from asset streaming. And with tweaks to cache more data into VRAM, it can be mitigated.
I remember Costco having a great sale on a monitor that was 144hz and I convinced my uncle to get it in 2017, and years of shit talk went out the window when he even talked about how much better just browsing the web was with a higher refresh rate. People are wild to say a lesser frame rate is "better"
Absolutely! If you can't tell the difference between 30FPS / 60FPS and 120FPS, then it's bad times for you.
If you honestly can't tell the difference, then leave those of us that clearly can alone lol. Please don't project on us
I'd argue that most of what people call "traversal stutters" are in fact engine data-management stutters (object initialization, effect permutations leading to sudden data-transfer spikes, CPU intervention in the rendering process, and a whole bunch of other things happening in the engine). They aren't even linked to traversal per se so much as to gameplay in general, and you definitely can't avoid them by caching more into VRAM, since they aren't even happening on the GPU; they mostly relate to how fast the CPU is at both data processing and data movement (so RAM and bus speeds and latencies).
Also consider that caching more means you still have streaming; you're just streaming bigger chunks of the game data. So if there are hitches related to streaming, you'd basically have to "cache" all of the game's data, which would effectively disable streaming, and that isn't practical even on PCs with top-end hardware. So even in that case, caching more data likely won't do much unless it's also coupled with a different streaming strategy, in which case why not just use that in the first place?
Dead Space is a classic example of this. I captured the below a couple of years back at one of the squares in the game that triggers loading. As I'm monitoring the SSD as well, you can see that when you hit that invisible load section there's a big spike in SSD reads and CPU, and then the associated frame-time spike.
Have thought about revisiting this and loading the game onto a RAM disk, as I have 96GB, to see if this improves things.