
DF: Dormant console commands in the PC build remove CPU burden

Buggy Loop

Gold Member


While looking into denoising with DLSS 4.5 on Unreal Engine 5 games, Alex made an intriguing discovery: Star Wars Jedi Survivor has custom console commands that ease the CPU burden and vastly reduce the camera and animation errors still seen in the game to this day. Alex believes these may be CPU optimisations for consoles that are dormant on the PC build - but can be used. So is this a complete "fix" for this most troublesome of PC ports? As you might imagine, the reality is a little more nuanced.



When script kiddies improve a game from an EA-backed studio. Fucking shame, Respawn. What a mess the Jedi games have been on PC.
 
I don't know why they rushed to release this game so fast, maybe it was pressure from EA but it's not like gamers were demanding a sequel right this second. They could've taken more time to launch it in a much better state.
 
This game is a technical disaster, including on console, but there is likely a reason why this stuff is turned off on PC: it wasn't tested and could cause problems.
 
This game is a technical disaster, including on console, but there is likely a reason why this stuff is turned off on PC: it wasn't tested and could cause problems.
It raises latency by ~20 ms and performs worse on GPUs with less than 8 GB of VRAM (which most gamers have). It's off for a reason. And even with the better fps, the fix still has the issue, it's just diminished.

The actual issue is the Ryzen 3600. It's a very, very low performance CPU. Its single-threaded performance is substantially lower (almost half) than, say, an A16 Bionic's, and its multi-threaded performance is slightly lower too. Expecting 120 Hz out of it is wrong.

Enabling Nvidia's GSP in the registry should theoretically offload some work from the CPU and help fix it too.

TLDR: It's a hardware issue.
 
Console optimization exists. Imagine what they need to do for Switch.

Not sure why it's news that PC gets the brute-force method while consoles get legit optimization.
 
Legit optimization in 2026:
Toggle option from 0 to 1

The command doesn't blanket-make things better.
In fact it's so specific that it makes sense the command is off by default.
If you have a fast CPU, it's practically pointless to set = 1.
If you have a slow GPU, it's detrimental to your experience to set = 1.

For the command to "work" well for you, you need a slowish CPU paired with a fast GPU; then the benefits show a lot more.
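The CPU-vs-GPU-bound reasoning above can be sketched with a toy model (the numbers are illustrative only, not measurements from the game, and `frame_ms` is a made-up simplification): since CPU and GPU work largely overlap, frame time is roughly the longer of the two stages, so shaving CPU cost only buys fps when the CPU is the longer stage.

```python
# Toy frame-time model: CPU and GPU stages overlap, so the frame is
# gated by whichever stage is slower. Numbers are illustrative only.

def frame_ms(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate frame time as the slower of the two pipeline stages."""
    return max(cpu_ms, gpu_ms)

# Slowish CPU + fast GPU (the console-like case): CPU savings are real fps.
slow_cpu_before = frame_ms(cpu_ms=16.0, gpu_ms=8.0)  # CPU-bound at 16 ms
slow_cpu_after  = frame_ms(cpu_ms=11.0, gpu_ms=8.0)  # savings show: 11 ms

# Fast CPU: already GPU-bound, so the same CPU savings change nothing.
fast_cpu_before = frame_ms(cpu_ms=6.0, gpu_ms=8.0)   # 8 ms
fast_cpu_after  = frame_ms(cpu_ms=4.0, gpu_ms=8.0)   # still 8 ms

print(slow_cpu_before, slow_cpu_after)  # 16.0 11.0
print(fast_cpu_before, fast_cpu_after)  # 8.0 8.0
```

By the same logic, if the GPU is already the slow stage, any extra GPU-side cost the command adds makes things strictly worse, which matches the "slow GPU, leave it off" advice.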

Unreal 5 is such a retarded engine

This isn't running on Unreal Engine 5.
 
It raises latency by ~20 ms and performs worse on GPUs with less than 8 GB of VRAM (which most gamers have). It's off for a reason. And even with the better fps, the fix still has the issue, it's just diminished.

The actual issue is the Ryzen 3600. It's a very, very low performance CPU. Its single-threaded performance is substantially lower (almost half) than, say, an A16 Bionic's, and its multi-threaded performance is slightly lower too. Expecting 120 Hz out of it is wrong.

Enabling Nvidia's GSP in the registry should theoretically offload some work from the CPU and help fix it too.

TLDR: It's a hardware issue.

It's not a hardware issue.

The issue at hand is that even if you lock the framerate to 60fps, and you never drop below it, the game still stutters.

It doesn't stutter due to frametime jumps, but because the game calculates its delta time incorrectly, leading to animation and motion stutters of your character and the camera.

Silent Hill 2 has the exact same issue. It's an issue with Unreal Engine 4/5.

Enabling this smoothing seems to interpolate between animation steps and movement steps slightly better, therefore removing this stutter issue.
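A minimal sketch of the mechanism described here (this is not Respawn's or Epic's actual code; the moving-average smoother and all numbers are made up for illustration): even with frames presented on a perfect 60 Hz cadence, feeding animation a jittery *measured* delta time produces uneven per-frame movement steps, while smoothing the delta evens them out.

```python
# Stutter-at-locked-60fps sketch. Frames are presented at a fixed cadence,
# but the measured CPU-side delta time jitters around 16.7 ms; animation
# advanced by that raw delta takes uneven steps and looks stuttery.

def raw_steps(measured_dts, speed=1.0):
    """Advance motion by the jittery measured delta each frame."""
    return [speed * dt for dt in measured_dts]

def smoothed_steps(measured_dts, speed=1.0, window=8):
    """Advance motion by a moving average of recent deltas instead."""
    steps, history = [], []
    for dt in measured_dts:
        history.append(dt)
        history = history[-window:]
        steps.append(speed * sum(history) / len(history))
    return steps

# Illustrative jittery measurements: 14 ms, 19 ms, 15 ms, 20 ms, repeating.
jittery = [0.014, 0.019, 0.015, 0.020] * 10

def spread(steps):
    """Max-minus-min step size: a crude 'visible stutter' metric."""
    return max(steps) - min(steps)

print(spread(raw_steps(jittery)))       # large spread: uneven motion
print(spread(smoothed_steps(jittery)))  # smaller spread: motion looks even
```

The trade-off is that any smoothing or interpolation of this kind decouples what you see from the freshest input and timing, which fits the added latency mentioned elsewhere in the thread.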
 
Things are often disabled by devs for a reason. It may work with that computer in this spot of the game, but may crash or cause worse performance issues in other areas of the game or with other HW configurations. I'd use it carefully.
 
The command doesn't blanket-make things better.
In fact it's so specific that it makes sense the command is off by default.
If you have a fast CPU, it's practically pointless to set = 1.
If you have a slow GPU, it's detrimental to your experience to set = 1.

For the command to "work" well for you, you need a slowish CPU paired with a fast GPU; then the benefits show a lot more.



This isn't running on Unreal Engine 5.
I was just mocking the idea that consoles had gotten legit optimization because they set a value to 1 in an ini file on the consoles - when console optimization used to mean writing custom assembly code.
 
If this is true you can't diss on Game Freak anymore, they are all Game Freak now.

Also did anyone play this game? The WHOLE game was clearly rushed as fuck and I'm a little sick of Respawn's shit. Bragging about being fast is one thing. Cutting corners is entirely different. They are so fast and you can't tell it in the GAAS but this game sure suffered because of it. By the end it was a chore. Could have been so much better if given a proper development cycle.
 
I was just mocking the idea that consoles had gotten legit optimization because they set a value to 1 in an ini file on the consoles - when console optimization used to mean writing custom assembly code.

This is a custom variable injected into Unreal Engine that specifically eases the burden on slower CPUs like the R5 3600, which is pretty much exactly what's in the consoles and pretty much allows them to even have a 60fps mode.

They literally did write custom code, not just this one but many other "optimizations" for the consoles soooooooooo...............................
 
If this is true you can't diss on Game Freak anymore, they are all Game Freak now.

Also did anyone play this game? The WHOLE game was clearly rushed as fuck and I'm a little sick of Respawn's shit. Bragging about being fast is one thing. Cutting corners is entirely different. They are so fast and you can't tell it in the GAAS but this game sure suffered because of it. By the end it was a chore. Could have been so much better if given a proper development cycle.

If what is true?
That Respawn made console specific optimization to the engine that when enabled on PCs with CPUs similar to the console ones helps the game?
 
Things are often disabled by devs for a reason. It may work with that computer in this spot of the game, but may crash or cause worse performance issues in other areas of the game or with other HW configurations. I'd use it carefully.

This port is absolute crap; I doubt anything in the PC version was tested properly or that much thought went into decisions like that.

On the latest patch, the game is also still buggy as fuck.
 
Game still ran like shit on consoles for ages as well, not even holding 30fps at launch.

I still remember the director a few weeks before launch when the retail copies were going gold, saying the game was in a great state, and not to worry as they were taking pre-orders.

Just straight up fucking lying, it was an unfinished game that should've been delayed, and they were bragging they got it done in 3 years. Literally had some of the same problems as the previous entry too, but worse.
 
This command is not a standard cvar for UE4. So the studio had to create this command and its function, integrate it into UE4, and then somehow forgot to enable it on PC.
Now that is a nice level of incompetence.
 
This command is not a standard cvar for UE4. So the studio had to create this command and its function, integrate it into UE4, and then somehow forgot to enable it on PC.
Now that is a nice level of incompetence.
Have you considered that perhaps they didn't enable it because of the 20ms latency increase and worse performance on mainstream GPUs?

Alex made stuttering his whole personality now.

Respawn made the right choice. Real gamers predominantly use GPUs with 8 GB of VRAM or less. An option that tanks performance on them is incomprehensible.

You're not supposed to be CPU limited while gaming. The Ryzen 5 3600 is a very low performance CPU. That's the actual lesson from this.
 
You're not supposed to be CPU limited while gaming. The Ryzen 5 3600 is a very low performance CPU. That's the actual lesson from this.
Alex continuing to use the Ryzen 5 3600 for all his testing is pretty silly. It still appears in almost every DF PC review. As with any previous console gen, we've seen console performance generally improve relative to a baseline "equivalent" PC. Doing all this testing with a low-end 2019 CPU as your baseline doesn't make much sense in 2026 and probably gives DF viewers wrong ideas about what they should expect from their cheap PC hardware relative to consoles five years in.
 
This game is very high on my performance shit list. I refunded and might play it in 5 years when the hardware required to brute force this game is invented. I'll hook this future rig up to the Large Hadron Collider as we are gonna need some juice to get a stable 120 fps.
 
I hate how so many companies release games that are gimped, their experience and entertainment value handicapped just to please stockholders by hitting the release date, and they get no penalty for it (outside of extreme cases). They need to be penalised or they'll simply keep pushing their luck.

They don't even bother with the old copy-paste apologies anymore; in fact, we get blamed for our hardware instead (Borderlands 4), with the message that the game is polished in their testing, so maybe upgrade your PC or update drivers.
 
Have you considered that perhaps they didn't enable it because of the 20ms latency increase and worse performance on mainstream GPUs?

Alex made stuttering his whole personality now.

Respawn made the right choice. Real gamers predominantly use GPUs with 8 GB of VRAM or less. An option that tanks performance on them is incomprehensible.

You're not supposed to be CPU limited while gaming. The Ryzen 5 3600 is a very low performance CPU. That's the actual lesson from this.
What stopped them from making it an exposed settings toggle with help text suggesting which CPUs should turn it on and which should turn it off? Especially after the game was massively shit on for performance issues.
 
I really liked the first one, but I put off buying this one until I heard that the performance issues were fixed. Might look it up again to see if I can find it on the cheap.

On topic, it's shameful that this kind of stuff makes it past the optimization phase (I guess the joke is - What optimization phase?).
 
Have you considered that perhaps they didn't enable it because of the 20ms latency increase and worse performance on mainstream GPUs?

Alex made stuttering his whole personality now.

Respawn made the right choice. Real gamers predominantly use GPUs with 8 GB of VRAM or less. An option that tanks performance on them is incomprehensible.

You're not supposed to be CPU limited while gaming. The Ryzen 5 3600 is a very low performance CPU. That's the actual lesson from this.

That number is only valid for the 3600. On newer CPUs, the difference is very small.
This means it would make the game much smoother, without significant drawbacks.
A simple option in the menu for people to choose would be the thing to do.
 
Have you considered that perhaps they didn't enable it because of the 20ms latency increase and worse performance on mainstream GPUs?

Alex made stuttering his whole personality now.

Respawn made the right choice. Real gamers predominantly use GPUs with 8 GB of VRAM or less. An option that tanks performance on them is incomprehensible.

You're not supposed to be CPU limited while gaming. The Ryzen 5 3600 is a very low performance CPU. That's the actual lesson from this.

In this game you are fucking CPU limited on 9800X3D, what are you talking about?

When this game launched, it collapsed performance to ~40FPS on one of the fastest CPUs at the time. Why are you defending the developers of a game that is in the top 5 worst ports of the decade?

 