
DF: Dormant console commands on PC remove CPU burden

Buggy Loop

Gold Member


While looking into denoising with DLSS 4.5 on Unreal Engine 5 games, Alex made an intriguing discovery: Star Wars Jedi Survivor has custom console commands that ease the CPU burden and vastly reduce the camera and animation errors still seen in the game to this day. Alex believes these may be CPU optimisations for consoles that are dormant on the PC build - but can be used. So is this a complete "fix" for this most troublesome of PC ports? As you might imagine, the reality is a little more nuanced.



When script kiddies improve a game from an EA-backed studio. Fucking shame, Respawn. What a mess the Jedi games have been on PC.
 
I don't know why they rushed to release this game so fast, maybe it was pressure from EA but it's not like gamers were demanding a sequel right this second. They could've taken more time to launch it in a much better state.
 
This game is a technical disaster, including on console, but there is likely a reason why this stuff is turned off on PC: it wasn't tested and could cause problems.
 
This game is a technical disaster, including on console, but there is likely a reason why this stuff is turned off on PC: it wasn't tested and could cause problems.
It raises latency by 20ms and performs worse on GPUs with 8GB of VRAM or less (which most gamers have). It's off for a reason. And even with the better fps, the fix still has the issue, it's just diminished.

The actual issue is the Ryzen 3600. It's a very, very low-performance CPU. Its ST is substantially lower (almost half) vs, say, an A16 Bionic, and its MT is slightly lower too. Expecting 120 Hz out of it is wrong.

Enabling Nvidia's GSP in the registry should theoretically offload some work from the CPU and help fix it too.

TLDR: It's a hardware issue.
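For anyone curious what "enabling GSP" actually means: on the Linux driver the GPU System Processor offload is controlled by the documented module parameter NVreg_EnableGpuFirmware; the Windows toggle the post refers to is a registry value under the NVIDIA driver's service key, whose exact name I won't assert here. A minimal, illustrative sketch for the Linux case only:

    # /etc/modprobe.d/nvidia-gsp.conf  (Linux illustration of the same idea)
    # Runs GPU management on the card's GSP firmware instead of the host CPU.
    options nvidia NVreg_EnableGpuFirmware=1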
 
Console optimization exists. Imagine what they need to do for Switch.

Not sure why it's news that PC gets the brute-force method while consoles get legit optimization.
 
Legit optimization in 2026:
Toggle option from 0 to 1

The command doesn't blanket-make things better.
In fact it's so specific that it makes sense the command is off by default.
If you have a fast CPU it's practically pointless to set = 1.
If you have a slow GPU it's detrimental to your experience to set = 1.

For the command to "work" well for you, you need a slowish CPU paired with a fast GPU; then the benefits show a lot more.
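For anyone who wants to experiment, UE games normally pick up cvars like this from Engine.ini under [SystemSettings]. A minimal sketch, with a placeholder name since the exact command comes from DF's video and isn't quoted in this thread:

    ; <GameConfigDir>\Saved\Config\WindowsNoEditor\Engine.ini  (path varies per game)
    [SystemSettings]
    ; hypothetical placeholder -- substitute the actual cvar name from the video
    r.Example.ConsoleSmoothing=1

Setting it back to 0 (or deleting the line) restores the shipped PC behaviour.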

Unreal 5 is such a retarded engine

This isn't running on Unreal Engine 5.
 
It raises latency by 20ms and performs worse on GPUs with 8GB of VRAM or less (which most gamers have). It's off for a reason. And even with the better fps, the fix still has the issue, it's just diminished.

The actual issue is the Ryzen 3600. It's a very, very low-performance CPU. Its ST is substantially lower (almost half) vs, say, an A16 Bionic, and its MT is slightly lower too. Expecting 120 Hz out of it is wrong.

Enabling Nvidia's GSP in the registry should theoretically offload some work from the CPU and help fix it too.

TLDR: It's a hardware issue.

it's not a hardware issue.

the issue at hand is that even if you lock the framerate to 60fps, and you never drop below it, the game still stutters.

it doesn't stutter due to frametime jumps, but because the game calculates its delta time incorrectly, leading to animation and motion stutters of your character and the camera.

Silent Hill 2 has the exact same issue. it's an issue with Unreal Engine 4/5.

enabling this smoothing seems to interpolate in-between animation steps and movement steps slightly better, therefore removing this stutter issue
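To make the "interpolate in-between steps" point concrete, here is a minimal sketch of frame-time smoothing in the spirit of what's being described, assuming a simple rolling average; it is not Respawn's or Epic's actual code:

    // Illustrative only: smooths the raw per-frame delta before it drives
    // animation and camera updates, so a 16.6ms/16.8ms/16.4ms jitter does not
    // turn into visible motion stutter even at a locked 60 fps.
    #include <array>
    #include <cstddef>

    class DeltaTimeSmoother {
    public:
        float Smooth(float RawDeltaSeconds) {
            History[Index] = RawDeltaSeconds;
            Index = (Index + 1) % History.size();
            if (Count < History.size()) ++Count;

            float Sum = 0.0f;
            for (std::size_t i = 0; i < Count; ++i) Sum += History[i];
            return Sum / static_cast<float>(Count);
        }

    private:
        std::array<float, 8> History{};  // last 8 frame times
        std::size_t Index = 0;
        std::size_t Count = 0;
    };

    // Per frame: const float Dt = Smoother.Smooth(RawDt);
    //            UpdateAnimation(Dt); UpdateCamera(Dt);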
 
Things are often disabled by devs for a reason. It may work with that computer in this spot of the game, but may crash or cause worse performance issues in other areas of the game or with other HW configurations. I'd use it carefully.
 
The command doesn't blanket-make things better.
In fact it's so specific that it makes sense the command is off by default.
If you have a fast CPU it's practically pointless to set = 1.
If you have a slow GPU it's detrimental to your experience to set = 1.

For the command to "work" well for you, you need a slowish CPU paired with a fast GPU; then the benefits show a lot more.



This isn't running on Unreal Engine 5.
I was just mocking the idea that consoles had gotten legit optimization because they set a value to 1 in an ini file on the consoles - when console optimization used to mean writing custom assembly code.
 
If this is true you can't diss on Game Freak anymore, they are all Game Freak now.

Also, did anyone play this game? The WHOLE game was clearly rushed as fuck and I'm a little sick of Respawn's shit. Bragging about being fast is one thing. Cutting corners is entirely different. They are so fast, and you can't tell in the GaaS stuff, but this game sure suffered because of it. By the end it was a chore. Could have been so much better if given a proper development cycle.
 
I was just mocking the idea that consoles had gotten legit optimization because they set a value to 1 in an ini file on the consoles - when console optimization used to mean writing custom assembly code.

This is a custom variable injected into Unreal Engine that specifically eases the burden on slower CPUs like the R5 3600, which is pretty much exactly what's in the consoles and pretty much allows them to even have a 60fps mode.

They literally did write custom code, not just this one but many other "optimizations" for the consoles, soooooooooo...............................
 
If this is true you can't diss on Game Freak anymore, they are all Game Freak now.

Also, did anyone play this game? The WHOLE game was clearly rushed as fuck and I'm a little sick of Respawn's shit. Bragging about being fast is one thing. Cutting corners is entirely different. They are so fast, and you can't tell in the GaaS stuff, but this game sure suffered because of it. By the end it was a chore. Could have been so much better if given a proper development cycle.

If what is true?
That Respawn made console-specific optimizations to the engine that, when enabled on PCs with CPUs similar to the console ones, help the game?
 
Things are often disabled by devs for a reason. It may work with that computer in this spot of the game, but may crash or cause worse performance issues in other areas of the game or with other HW configurations. I'd use it carefully.

This port is absolute crap. I doubt anything in the PC version was tested properly or that any big thinking was behind decisions like that.

On the latest patch the game is also still buggy as fuck.
 
Game still ran like shit on consoles for ages as well, not even holding 30fps at launch.

I still remember the director, a few weeks before launch when the retail copies were going gold, saying the game was in a great state and not to worry as they were taking pre-orders.

Just straight up fucking lying, it was an unfinished game that should've been delayed, and they were bragging they got it done in 3 years. Literally had some of the same problems as the previous entry too, but worse.
 
This command is not a standard cvar for UE4. So the studio had to create this command and its function, integrate it into UE4, and then somehow forgot to enable it on PC.
Now that is a nice level of incompetence.
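For context on what "create this command and its function" involves: registering a custom console variable in UE4/5 is a few lines of C++, and the real work is the code path it gates. A minimal sketch with hypothetical names (the actual cvar and the code it toggles are not public):

    // Sketch only -- names are hypothetical, not Respawn's actual code.
    #include "HAL/IConsoleManager.h"

    static TAutoConsoleVariable<int32> CVarConsoleSmoothing(
        TEXT("r.Game.ConsoleSmoothing"),   // hypothetical cvar name
        0,                                 // default 0 = the shipped PC behaviour
        TEXT("Enable the console-style CPU/animation smoothing path."),
        ECVF_Default);

    static void GameTickHelper(float DeltaSeconds)   // hypothetical helper
    {
        if (CVarConsoleSmoothing.GetValueOnGameThread() != 0)
        {
            // console-optimised update path
        }
        else
        {
            // default PC update path
        }
    }

The thread's point is that the path exists in the shipped build; the default was simply left at 0 on PC.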
 
This command is not a standard cvar for UE4. So the studio had to create this command and its function, integrate it into UE4, and then somehow forgot to enable it on PC.
Now that is a nice level of incompetence.
Have you considered that perhaps they didn't enable it because of the 20ms latency increase and worse performance on mainstream GPUs?

Alex made stuttering his whole personality now.

Respawn made the right choice. Real gamers predominantly use GPUs with 8GB or less. Enabling an option that tanks performance on them would be incomprehensible.

You're not supposed to be CPU limited while gaming. Ryzen 5 3600 is a very low performance CPU. That's the actual lesson from this.
 
You're not supposed to be CPU limited while gaming. Ryzen 5 3600 is a very low performance CPU. That's the actual lesson from this.
Alex continuing to use the Ryzen 5 3600 for all his testing is pretty silly. It still appears in almost every DF PC review. As with any previous console gen, we've seen console performance generally improve relative to a baseline "equivalent" PC. Doing all this testing with a low-end 2019 CPU as your baseline doesn't make much sense in 2026 and probably gives DF viewers wrong ideas about what they should expect from their cheap PC hardware relative to consoles five years in.
 
This game is very high on my performance shit list. I refunded and might play it in 5 years when the hardware required to brute force this game is invented. I'll hook this future rig up to the Large Hadron Collider as we are gonna need some juice to get a stable 120 fps.
 
I hate how so many companies release games that are gimped and have their experience/entertainment value handicapped just so they can tell shareholders they hit the release date, and they get no penalty for it (outside of extreme cases). They need to be penalised or they'll simply keep pushing their luck.

They don't even bother with the old copy-paste apologies anymore; in fact, we get blamed for our hardware instead (Borderlands 4), with the message that the game is polished in their testing, so maybe upgrade your PC or update your drivers.
 
Have you considered that perhaps they didn't enable it because of the 20ms latency increase and worse performance on mainstream GPUs?

Alex made stuttering his whole personality now.

Respawn made the right choice. Real gamers predominantly use GPUs with 8GB or less. Enabling an option that tanks performance on them would be incomprehensible.

You're not supposed to be CPU limited while gaming. Ryzen 5 3600 is a very low performance CPU. That's the actual lesson from this.
What stopped them from making it an exposed settings toggle with help text suggesting which CPUs should turn it on and which should turn it off? Especially after the game was massively shit on for performance issues.
 
I really liked the first one, but I put off buying this one until I heard that the performance issues were fixed. Might look it up again to see if I can find it on the cheap.

On topic, it's shameful that this kind of stuff makes it past the optimization phase (I guess the joke is - What optimization phase?).
 
Have you considered that perhaps they didn't enable it because of the 20ms latency increase and worse performance on mainstream GPUs?

Alex made stuttering his whole personality now.

Respawn made the right choice. Real gamers predominantly use GPUs with 8GB or less. Enabling an option that tanks performance on them would be incomprehensible.

You're not supposed to be CPU limited while gaming. Ryzen 5 3600 is a very low performance CPU. That's the actual lesson from this.

That number is only valid for the 3600. On newer CPUs, the difference is very small.
This means it would make the game much smoother, without significant drawbacks.
A simple option in the menu for people to choose would be the thing to do.
 
Have you considered that perhaps they didn't enable it because of the 20ms latency increase and worse performance on mainstream GPUs?

Alex made stuttering his whole personality now.

Respawn made the right choice. Real gamers predominantly use GPUs with 8GB or less. Enabling an option that tanks performance on them would be incomprehensible.

You're not supposed to be CPU limited while gaming. Ryzen 5 3600 is a very low performance CPU. That's the actual lesson from this.

In this game you are fucking CPU limited on a 9800X3D, what are you talking about?

When this game launched it collapsed performance to ~40FPS on one of the fastest CPUs at the time. Why are you defending the developers of a game that is in the top 5 worst ports of the decade?

 
When this game launched it collapsed performance to ~40FPS on one of the fastest CPUs at the time. Why are you defending the developers of a game that is in the top 5 worst ports of the decade?
When it launched. Not now.

And the 12900K is not that fast. We're talking sub A16 Bionic ST here. The game has 1-2 primary render threads it's bound by. The other threads are just there to offload some work from the render thread.

In this game you are fucking CPU limited on a 9800X3D, what are you talking about?
And? That's normal, the 5090 is a monster. The 9800X3D is just alright. And we're talking like 160-240 CPU FPS here lol. There is nothing inherently wrong with a monster GPU being bottlenecked by it. Nvidia has the GSP disabled, so the CPU has to work extra with the 5090.

Why are you defending the developers of a game that is in the top 5 worst ports of the decade?
This isn't tribalism. And besides it's not that bad LMAO. Off the top of my head the following were like way worse

Borderlands 4
The Last of Us
Forspoken
Starfield
Etc

That number is only valid for the 3600. On newer CPUs, the difference is very small.
The 8GB issues make this option completely useless on PC.
 
When it launched. Not now.

And the 12900K is not that fast. We're talking sub A16 Bionic ST here. The game has 1-2 primary render threads it's bound by. The other threads are just there to offload some work from the render thread.


And? That's normal, the 5090 is a monster. The 9800X3D is just alright. And we're talking like 160-240 CPU FPS here lol. There is nothing inherently wrong with a monster GPU being bottlenecked by it. Nvidia has the GSP disabled, so the CPU has to work extra with the 5090.


This isn't tribalism. And besides it's not that bad LMAO. Off the top of my head the following were like way worse

Borderlands 4
The Last of Us
Forspoken
Starfield
Etc


The 8GB issues make this option completely useless on PC.

This is the game at launch on the fastest CPU on the planet at the time (7800X3D):



[screenshot: launch performance]


56FPS CPU limited, and this isn't even a CPU-heavy area...

Several patches later:

[screenshot: performance after patches]




They improved it a bit in 10 patches since launch but it's still GARBAGE.
 
I remember watching a DF video in which they said that the recent patch made the game run well without RT, even on the Koboh planet. What has changed since then that people still consider this game to be unplayable?
 
I remember watching a DF video in which they said that the recent patch made the game run well without RT, even on the Koboh planet. What has changed since then that people still consider this game to be unplayable?

Animations were always broken. This command in their latest video fixes that to some extent.

Without RT, the framerate is in OK territory, but the game still has problems with frame-time spikes. With RT... it's still garbage tier.
 
This is the game at launch on the fastest CPU on the planet at the time (7800X3D):
The 7800X3D is not that fast. It has A15 Bionic-level ST, and that chip came out 18 months before it. Heck, even the 7600X runs almost 1 GHz faster.

The game has two very heavy render threads it's bound by, and then multiple also-heavy game threads. Look at that utilization. The CPU literally has nothing left to give. It's near maxed out.

It's not a "developers didn't optimize their game for PC issue", it it's more a failure of their rendering strategy.

(Nvidia disabling the hardware scheduler on 5090 and offloading to the CPU doesn't help)

Nevertheless, the developers' mistake was trying to heavily use the CPU in the first place. They should have treated it as an auxiliary processor that exists only to service the GPU. Nothing more.
 
When it launched. Not now.

And the 12900K is not that fast. We're talking sub A16 Bionic ST here. The game has 1-2 primary render threads it's bound by. The other threads are just there to offload some work from the render thread.

What kind of nonsense is this?
Of course the 12900K is still a very fast CPU for gaming. It's not the fastest, but it provides great performance even in modern games.

And? That's normal, the 5090 is a monster. The 9800X3D is just alright. And we're talking like 160-240 CPU FPS here lol. There is nothing inherently wrong with a monster GPU being bottlenecked by it. Nvidia has the GSP disabled, so the CPU has to work extra with the 5090.

The 9800X3D is, by far, the fastest gaming CPU on the market. Nothing comes close to it.

The 8GB issues make this option completely useless on PC.

Plenty of people have GPUs with more than 8GB.
I have a 16GB GPU. Even my previous GPU had 16GB.

The 7800X3D is not that fast. It has A15 Bionic-level ST, and that chip came out 18 months before it. Heck, even the 7600X runs almost 1 GHz faster.

The game has two very heavy render threads it's bound by. So the cache-merchant CPU, with cores designed for servers and not clients, sniffed glue. Look at that utilization. The CPU literally has nothing left to give. It's near maxed out.

It's not a "developers didn't optimize their game for PC issue", it it's more a failure of their rendering strategy.

(Nvidia disabling the hardware scheduler on 5090 and offloading to the CPU doesn't help)

Nevertheless, the developers' mistake was trying to heavily use the CPU in the first place. They should have treated it as an auxiliary processor that exists only to service the GPU. Nothing more.

The 7800X3D is the second fastest CPU in gaming. It only loses to the 9800X3D. And the 7600X loses to both by a big margin.
 
What kind of nonsense is this?
Of course the 12900K is still a very fast CPU for gaming. It's not the fastest, but it provides great performance even in modern games.
It is, but that's more because games are overwhelmingly a GPU affair. I have already provided my basis for the statement.

The 9800X3D's ST is around 20% better than the 12900K's. And with the 9800X3D there is no chance that important work gets sent to an anemic E-core.

The 9800X3D is, by far, the fastest gaming CPU on the market. Nothing comes close to it.
I didn't make a statement to the contrary.

Nevertheless, compared with the 5090, it's very unimpressive. The 5090 being bottlenecked by it in a game that heavily uses the CPU is not notable at all by itself.

Plenty of people have GPUs with more than 8GB.
I have a 16GB GPU. Even my previous GPU had 16GB.
Around half of PC gamers have 8GB or less. It was more back then.

The 7800X3D is the second fastest CPU in gaming. It only loses to the 9800X3D. And the 7600X loses to both by a big margin.
The 7800X3D is tied with Raptor Lake.
 