
Star Wars Jedi: Survivor on PC is still the worst triple-A PC port of 2023

Draugoth

Gold Member
We recently had a new update for Star Wars Jedi: Survivor, which brought major improvements to the performance mode on consoles.

Analyzing the new version of the game, Digital Foundry states that Star Wars Jedi: Survivor remains the "worst AAA PC port of 2023", even with the update bringing some improvements.



Among the new features, NVIDIA's DLSS technology has finally been added to the game. Even so, performance is still not good enough, and many of the port's issues have not been fixed. One of the game's biggest problems at launch remains present.

Every time the game is opened there is a long shader pre-compilation process, which the channel considers unnecessary and probably a bug. It runs on every launch even though it doesn't seem to do anything after the first one - unnecessary user friction that feels a lot like a bug, still present five months after the game's release. The strange thing is that if the game actually is pre-caching shaders, it's not doing a particularly good job: Star Wars Jedi: Survivor still has plenty of shader compilation glitches during gameplay.
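To illustrate why per-launch recompilation looks like a caching failure: a persistent shader cache is normally keyed by a hash of the shader plus driver/GPU identifiers, so a second launch is a pure cache hit and skips compilation entirely. This is a minimal, hypothetical Python sketch - the cache path, key fields, and `compile_fn` are illustrative assumptions, not the game's actual pipeline:

```python
import hashlib
import os

# Hypothetical on-disk shader cache location (assumption for illustration)
CACHE_DIR = "shader_cache"

def cache_key(shader_source: str, driver_version: str, gpu_id: str) -> str:
    # Key the compiled blob by shader + driver + GPU, so a driver update
    # or hardware change invalidates the cache, but a plain relaunch doesn't.
    blob = (shader_source + driver_version + gpu_id).encode()
    return hashlib.sha256(blob).hexdigest()

def get_or_compile(shader_source, driver_version, gpu_id, compile_fn):
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, cache_key(shader_source, driver_version, gpu_id))
    if os.path.exists(path):
        # Cache hit: reuse the stored blob, no recompilation on this launch.
        with open(path, "rb") as f:
            return f.read()
    compiled = compile_fn(shader_source)  # cache miss: compile exactly once
    with open(path, "wb") as f:
        f.write(compiled)
    return compiled
```

Under this scheme a game that compiles on every launch is either not persisting the cache, or invalidating the key on every run - which is the behavior Digital Foundry is flagging.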

For those who depend on TAA or FSR 2, unfortunately the image quality continues to be well below standard. Luckily, RTX owners can now use DLSS, which "does wonders for the game's image quality."

Despite this, the performance remains disastrous. Stutters continue to occur during exploration regardless of your hardware, caused by the game's poor CPU utilization. On simpler graphics settings there was a small improvement, but on higher settings with ray tracing enabled it remains as bad as at launch.

The bottom line is that even on my high-end gaming system, you are CPU-limited at reasonable graphics settings and especially with ray tracing. On an older mid-range processor, the game is still a disaster. Technically, the game is better, but that doesn't matter when the main problems since launch haven't been resolved.
 
Last edited:

adamsapple

Or is it just one of Phil's balls in my throat?
The console version has been improved tremendously while the PC version ...


The bottom line is that even on my high-end gaming system, you are CPU-limited at reasonable graphics settings and especially with ray tracing. On an older mid-range processor, the game is still a disaster. Technically, the game is better, but it barely matters when the core problems present since launch have not been addressed.


 

rofif

Can’t Git Gud
ooof.
Was Dead Space patched?
I tried the demo on Steam, and when you get back to the ship and it explodes, I had a stutter so long that the whole cutscene played out in the background, and as a necromorph attacked me after it, it was still chugging along at 3 fps and it killed me.

I said "YEP", deleted the demo and bought it for PS5 lol.
 

Skifi28

Member
ooof.
Was Dead Space patched?
I tried the demo on Steam, and when you get back to the ship and it explodes, I had a stutter so long that the whole cutscene played out in the background, and as a necromorph attacked me after it, it was still chugging along at 3 fps and it killed me.

I said "YEP", deleted the demo and bought it for PS5 lol.
Played it recently when there was a trial and it was a complete stutterfest every few seconds despite getting 100+ frames. Completely unplayable; I don't think that's ever getting fixed.
 
Last edited:

rofif

Can’t Git Gud
Played it recently when there was a trial and it was a complete stutterfest every few seconds despite getting 100+ frames. Completely unplayable; I don't think that's ever getting fixed.
yeah, that's the trial I played.
I even got some screenshots of the frametimes because I was flabbergasted at how bad it gets lol.

L7WGRmT.jpg

O8ydrGc.jpg
 

yamaci17

Member
yeah, that's the trial I played.
I even got some screenshots of the frametimes because I was flabbergasted at how bad it gets lol.

L7WGRmT.jpg

O8ydrGc.jpg
you're running out of VRAM

either play the game at 1440p, which that 10 GB is suited for,
or back off the ray tracing or texture settings
also reduce background programs that could be using VRAM

regardless, I played the Dead Space remake on my 3070 and never experienced what you experienced. granted, I disabled ray tracing, which is a given.
 

rofif

Can’t Git Gud
you're running out of VRAM

either play the game at 1440p, which that 10 GB is suited for,
or back off the ray tracing or texture settings
also reduce background programs that could be using VRAM

regardless, I played the Dead Space remake on my 3070 and never experienced what you experienced. granted, I disabled ray tracing, which is a given.
No background stuff running.
And I am good on VRAM until about 9700 MB. That's the point RE4 crashes lol.
And I think I was running upscaling, 1440p internally here. Maxed out, I think.
The demo was timed, so I only checked a few settings
 
Last edited:

yamaci17

Member
No background stuff running.
And I am good on VRAM until about 9700 MB. That's the point RE4 crashes lol.
And I think I was running upscaling, 1440p internally here. Maxed out, I think.
The demo was timed, so I only checked a few settings
upscaling doesn't matter in the scope of VRAM usage. you're not really running the game at 1440p. if you did, you would understand from the image quality

if you output to 4K, you will have near-4K VRAM consumption. that is why upscaling works that well anyway. it takes its power from running native 4K buffers and hints (+ 4K textures, LODs and models). this is why even 4K DLSS Performance will often have better image quality than running a game directly at 1440p. try it out for yourself if you like. (4K DLSS Performance will also have an equal if not greater VRAM footprint than native 1440p, mind you)

4K is definitely possible on 10 gigs. you definitely need to tone down settings (high). 8-10 GB is a death sentence with ultra in pretty much any game past 2021, aside from rare examples. it is just how it is. i played the damn game at 4K DLSS Quality too, but with no ray tracing + the high preset. looked good enough.

ray tracing definitely tanked the performance. exact same story with Callisto Protocol (played at 4K FSR Quality, not a single fps drop at the high preset. with ray tracing on, even if i put the rest of the game to low, the game tanks massively to single-digit frames).

I know it feels like a kick in the gut (considering the 3080 was a flagship ray tracing GPU) but ray tracing in certain titles is not meant to fit these VRAM budgets. you will have a better chance of staying in budget using path tracing in Cyberpunk than with the mild RTAO effects Dead Space and Callisto have. don't worry about it. it is how AMD wanted those games to operate. not much to do about it.
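The output-resolution point above can be sketched with back-of-envelope numbers. The buffer counts and 4-bytes-per-pixel format below are assumptions for illustration, not any game's actual layout; the point is only that 4K-sized targets dwarf 1440p-sized ones, since a 4K frame has 2.25x the pixels of 1440p:

```python
def buffer_mb(width, height, bytes_per_pixel=4):
    """Size of one render target in MiB at the given resolution."""
    return width * height * bytes_per_pixel / (1024 ** 2)

# With temporal upscaling, some targets live at the internal render
# resolution, but the output/history/UI targets (and mip/LOD selection)
# are sized for the output resolution.
internal = 5 * buffer_mb(2560, 1440)  # assumed 5 G-buffer targets at 1440p
output = 3 * buffer_mb(3840, 2160)    # assumed 3 output-res targets at 4K

print(f"internal-res targets: {internal:.0f} MB")  # ~70 MB
print(f"output-res targets:   {output:.0f} MB")    # ~95 MB
```

Even with fewer buffers, the 4K-sized targets outweigh the internal-resolution ones in this toy layout, which is why outputting to 4K via an upscaler sits closer to native-4K VRAM use than to native 1440p.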
 

rofif

Can’t Git Gud
upscaling doesn't matter in the scope of VRAM usage. you're not really running the game at 1440p. if you did, you would understand from the image quality

if you output to 4K, you will have near-4K VRAM consumption. that is why upscaling works that well anyway. it takes its power from running native 4K buffers and hints (+ 4K textures, LODs and models). this is why even 4K DLSS Performance will often have better image quality than running a game directly at 1440p. try it out for yourself if you like. (4K DLSS Performance will also have an equal if not greater VRAM footprint than native 1440p, mind you)

4K is definitely possible on 10 gigs. you definitely need to tone down settings (high). 8-10 GB is a death sentence with ultra in pretty much any game past 2021, aside from rare examples. it is just how it is. i played the damn game at 4K DLSS Quality too, but with no ray tracing + the high preset. looked good enough.

ray tracing definitely tanked the performance. exact same story with Callisto Protocol (played at 4K FSR Quality, not a single fps drop at the high preset. with ray tracing on, even if i put the rest of the game to low, the game tanks massively to single-digit frames).

I know it feels like a kick in the gut (considering the 3080 was a flagship ray tracing GPU) but ray tracing in certain titles is not meant to fit these VRAM budgets. you will have a better chance of staying in budget using path tracing in Cyberpunk than with the mild RTAO effects Dead Space and Callisto have. don't worry about it. it is how AMD wanted those games to operate. not much to do about it.
Of course upscaling does matter when it comes to VRAM lol. It might not be exactly 1440p VRAM usage, but it's closer to that than 4K.

The 3080 is a piece of trash with that 10 GB of VRAM, but it's still the devs' fault for not making their games better. Games should automatically swap assets and not just crash or stutter like mad.
Besides - I still think it was traversal or shader stutter rather than a VRAM limit.
I am looking at a 4080... I could get one, but I am not playing on PC too much
 

Arsic

Loves his juicy stink trail scent
Of course upscaling does matter when it comes to VRAM lol. It might not be exactly 1440p VRAM usage, but it's closer to that than 4K.

The 3080 is a piece of trash with that 10 GB of VRAM, but it's still the devs' fault for not making their games better. Games should automatically swap assets and not just crash or stutter like mad.
Besides - I still think it was traversal or shader stutter rather than a VRAM limit.
I am looking at a 4080... I could get one, but I am not playing on PC too much

Given how the last few years have been, I expected a 5000 series would exist by now, and I was holding off to get a 5090…. Now I'm tempted to get a 4090, but knowing my luck I'll bone myself like I did when I got a 3080.

I need to be patient until a 5090 is announced.
 

yamaci17

Member
Given how the last few years have been, I expected a 5000 series would exist by now, and I was holding off to get a 5090…. Now I'm tempted to get a 4090, but knowing my luck I'll bone myself like I did when I got a 3080.

I need to be patient until a 5090 is announced.

wait for the inevitable 16 GB 70-series card. that will be the jackpot.

stay away from the 4000s. just wait. patiently.

even if it takes until a 6070, wait for the 16 GB 70-class card. that will be when it's safe to get a card that will last at least 5 years with no problems

the 4080 is cool but super expensive. at some point nvidia will have to give in a bit
 

rofif

Can’t Git Gud
How the last few years had been I expected a 5000 series would exist by now and I was holding off to get a 5090…. Now I’m tempted to get a 4090 but knowing my luck I’ll bone myself like I did when I got a 3080.

I need to be patient until a 5090 is announced.
We need to be strong. I will wait too...
I am still on AM4, so I have to get a 5800X3D if I don't want to change the whole platform.
And a 4090/5090 is out of the question. I have a Corsair RM750x, so I have to stay within the 350-watt GPU range
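That power-budget reasoning can be sketched numerically. The headroom fraction, CPU draw, and system draw below are rule-of-thumb assumptions for illustration, not measured or specified values:

```python
def gpu_budget(psu_watts, cpu_watts, system_watts=100, headroom=0.20):
    """Sustained GPU power budget left after other draw and a safety margin.

    headroom reserves a fraction of the PSU's rating for transient
    spikes; system_watts covers motherboard, drives, fans, etc.
    All defaults are rough rule-of-thumb assumptions.
    """
    usable = psu_watts * (1 - headroom)  # keep ~20% margin for spikes
    return usable - cpu_watts - system_watts

# Assumed ~120 W sustained draw for a 5800X3D-class CPU
budget = gpu_budget(psu_watts=750, cpu_watts=120)
print(f"GPU budget: {budget:.0f} W")  # prints "GPU budget: 380 W"
```

With these assumed numbers, a 750 W unit leaves roughly 380 W of sustained GPU budget, which lines up with the ~350 W ceiling mentioned above once extra margin for GPU power spikes is factored in.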
 


Bojji

Member
haha lol it's exactly that spot!!!
But 9200 should be enough, so 100% there is a leak

Yeah, this port is a mess; I had the same issues with a 3060 Ti. With a 4070 everything is smooth, so this game needs 12 GB at least.

I also had a 6800 and of course there were no VRAM problems, but the game was constantly stuttering. Even with the shader precompile, every time I loaded a save it stuttered and CPU utilization was super high; the game was compiling shaders all the time. I don't know if this was a game problem or AMD drivers, but it works much better on Nvidia in this respect.

EA doesn't fix their games.
 
Last edited:

Haint

Member
Tiny Tina's Wonderlands also compiles shaders every time you launch it, so I'm not so sure it's a bug.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I played the game using the DLSS mod back in June and it ran reasonably well on my 4090 / 7800X3D (yeah, I know, everyone has that setup!). I'd be curious if it runs even better now.

Still, the town on Koboh had awful stutters.
 

yamaci17

Member
We need to be strong. I will wait too...
I am still on AM4, so I have to get a 5800X3D if I don't want to change the whole platform.
And a 4090/5090 is out of the question. I have a Corsair RM750x, so I have to stay within the 350-watt GPU range
also, dont get an amd cpu

get an intel cpu

amd cpus are always a gimmick. just yesterday i heard the ryzen 7600 magically gets 2x the fps of a 5800X3D in a switch emulator. it makes no sense. i dont care about the technicalities. every new ryzen gen, same story. it's become a boring story

pay up and get an intel cpu.

even an i5 8400 from 2017 provides more stable frametimes than a 5600X in both star wars survivor and hogwarts legacy. people think these games are broken in terms of cpu, but the actual problem is that the majority of people are now on amd platforms. notice very few intel cpu owners actually whine.

even the amd sponsored starfield runs better on Intel CPUs.
 
Last edited:

yamaci17

Member
Tiny Tina's Wonderlands also compiles shaders every time you launch it, so I'm not so sure it's a bug.
it's not a bug. it's also not shader compilation

it probably keeps shaders in some kind of different format and re-processes them from their compressed/unprocessed format at every launch

hogwarts legacy, same story.

df is out of touch in certain cases. if it happens in 3 separate UE titles, it's not a bug, it's a feature.
 
Last edited:

winjer

Gold Member
upscaling doesn't matter in the scope of VRAM usage. you're not really running the game at 1440p. if you did, you would understand from the image quality

if you output to 4K, you will have near-4K VRAM consumption. that is why upscaling works that well anyway. it takes its power from running native 4K buffers and hints (+ 4K textures, LODs and models). this is why even 4K DLSS Performance will often have better image quality than running a game directly at 1440p. try it out for yourself if you like. (4K DLSS Performance will also have an equal if not greater VRAM footprint than native 1440p, mind you)

4K is definitely possible on 10 gigs. you definitely need to tone down settings (high). 8-10 GB is a death sentence with ultra in pretty much any game past 2021, aside from rare examples. it is just how it is. i played the damn game at 4K DLSS Quality too, but with no ray tracing + the high preset. looked good enough.

ray tracing definitely tanked the performance. exact same story with Callisto Protocol (played at 4K FSR Quality, not a single fps drop at the high preset. with ray tracing on, even if i put the rest of the game to low, the game tanks massively to single-digit frames).

I know it feels like a kick in the gut (considering the 3080 was a flagship ray tracing GPU) but ray tracing in certain titles is not meant to fit these VRAM budgets. you will have a better chance of staying in budget using path tracing in Cyberpunk than with the mild RTAO effects Dead Space and Callisto have. don't worry about it. it is how AMD wanted those games to operate. not much to do about it.

Yes, upscaling reduces VRAM usage.
Even at 4K, using a temporal upscaler can reduce VRAM usage by 1 to 2 GB, depending on the quality setting.
LODs don't have as much impact as you think.
 

winjer

Gold Member
Tiny Tina's Wonderlands also compiles shaders every time you launch it, so I'm not so sure it's a bug.

I'm playing the game and I have checked.
With the startup movies disabled, the game goes straight into the menu as soon as it finishes loading.
CPU usage at startup is around 30% on my 5800X3D, and loading a level produces the same CPU usage.
So this is just the game processing textures, passing data around, etc.
 