
Star Wars Jedi Survivor Performance Review - PS5 vs PC vs Xbox Series X|S (NXGamer)

Kataploom

Gold Member
Yeah, in this case even the console version is terrible, but that's very rare for AAA titles. Usually, even if a game looks bad on console, the performance is at least good enough, and most games don't suffer from stuttering.
As I and others have said multiple times here, if we downgrade graphical settings to console settings, those unoptimized ports at launch can run flawlessly, or at least no worse than on consoles, and even at 60 fps... The problem mostly comes when people want to go way above console settings, like those wanting to run the game maxed out with RT at high framerates at 1440p or 4K... Most unhappy users tend to be the higher-end ones and those with too little VRAM.

Do you think people with mid-to-high-end GPUs would be complaining if they were running the game at 1440p on Ultra Performance?

There are some outliers like TLOU Part I, where performance wasn't the problem so much as every configuration below High or Epic being broken, plus memory leaks, bugs, constant crashes, etc.
 
My bet is they tried to go crazy with UE4 without devs skilled enough to optimize the game, and there you have your "early access" game. What a fucking mess :messenger_poop:
 

Darsxx82

Member
The same bugaga who "casually" counted 1440p in RE4 on PS5?
I don't remember if it was Alex who said that, but the figure was correct at the time and there was a justified reason for it: the PS5 launch version had a bug that caused the vertical resolution to drop (yielding roughly 1440p's pixel count) when post-processing effects like chromatic aberration and blur were enabled.

You can have all the prejudices you want about Alex, but I don't think there is anyone in DF who is more reliable when it comes to identifying resolutions and rendering methods.

But OK, Bugaga and such, you know.😏
 

rofif

Can’t Git Gud
It doesn't have RT GI, and yes, FSR 2.1 is SHIT, because these developers use it to upscale from such low native resolutions. Every game on PS5 that I've played with FSR has been pretty awful: Dying Light 2, Dead Space, Cyberpunk all have really low-res-looking performance modes.

I swear FSR made Cyberpunk look worse, but for some strange reason people decided it looks better than before it was added. It doesn't! Look at the artifacting it created on roads, for example.

People jumped on the bandwagon because it's new tech and we were desperate for a "DLSS-like" solution on console. If the base resolution is high, like 1440p and above, FSR can work, but at 1080p or below it looks terrible!
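
To make the base-resolution point concrete, here's a quick sketch of FSR 2's internal render resolutions, using the per-axis scale factors AMD documents for each quality mode (the code itself is just an illustration):

    # FSR 2 per-axis scale factors, per AMD's documentation
    MODES = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

    def internal_res(out_w, out_h, mode):
        s = MODES[mode]
        return round(out_w / s), round(out_h / s)

    print(internal_res(3840, 2160, "Performance"))  # (1920, 1080): 4K output keeps a solid base
    print(internal_res(2560, 1440, "Performance"))  # (1280, 720): 1440p output starves the upscaler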

One note to add, though: it's on developers to know that FSR doesn't work well from low native resolutions. It's their fucking job to know this, but they keep using it anyway. Much the same way, having RT reflections and RT AO in Jedi Survivor's performance mode was a bad idea; it seems having the latest buzzword technology is all that matters.

If they had ALSO included a mode without RT, IT WOULD'VE SOLVED ALL OF THIS GAME'S PROBLEMS in performance mode! Better framerate and a higher base resolution, without the need for FSR.

Incompetence plus rushed launches = a string of awful "next-gen only" games.
FSR2 is trash. Utter garbage if not used at 4K with the highest quality preset.
I much prefer lower-res TAA games with solid image quality. Hell, 1080p Uncharted 4 looks way more stable than this FSR crap, let alone at 1440p and 4K. Naughty Dog are gods of stable image quality.
Or Ratchet. Amazing IQ even at low res.

FSR is a cheap, trashy cop-out that looks worse than the 720p garbage these terrible devs scale from.

I don't like the rhetoric that the hardware is underpowered. It's not. UE4 is bad, and devs need to figure stuff out beyond last-gen games. Hell, there are last-gen games that look and run better on PS4 than this.
 

rofif

Can’t Git Gud
I'm sorry, but Michael is an embarrassment.

He isn't a real tech guy, and I think his embarrassing video and incorrect resolutions show how clueless he is.
What the fuck does it take to qualify as a tech guy, then? This is ass talk. It's not rocket science; it's all pretty simple stuff.
 

rofif

Can’t Git Gud
600p is giving me Series S flashbacks. Absolutely insane stuff. Surely the devs should've stopped at having to drop the resolution to 600p and said: hey, maybe this console can't handle RT. Just maybe.

Even the resolution mode drops all the way down to what, 872p? WTF are we doing here? Tom had an RTGI comparison showing the difference, and I honestly couldn't tell after staring at it for 10 full seconds. RT reflections are obvious, but I'm playing on PC and they aren't needed, tbh. The difference is so minimal that I'm baffled by their decision to make this an RT game with RT GI, shadows, and reflections.

Embarrassing stuff from Respawn. The game should not be dropping to 17 fps. My friend who is playing on PS5 in resolution mode told me it was dropping below 20. When I posted that here, a couple of guys told me I was wrong. Well, this proves my friend right.

BTW, on PC those open-world areas drop frames like a motherfucker. I hate everything about that open world. The game is fine in dungeons and smaller caves, but that fucking open world chugs and stutters so much it gives me motion sickness. I just got my chocobo and it's a fucking disaster traversing that world, even at a locked 40 fps. There is just something really off about this game.

P.S. If my 3080 can do native 4K at 35 fps in the open world at max settings with no RT, the PS5 should be able to hit 1440p at 30 fps without needing any kind of upscaling. They should turn off RT and get rid of FSR for the resolution mode.
The game should clearly ship without RT and be better optimised.
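
For what it's worth, a crude back-of-envelope check on that claim (my own arithmetic; it assumes framerate scales linearly with pixel count and raw FP32 throughput, which real games only roughly do):

    # Pixels per second each machine is being asked to shade
    px_3080 = 3840 * 2160 * 35    # native 4K at 35 fps    -> ~290M px/s
    px_ps5  = 2560 * 1440 * 30    # native 1440p at 30 fps -> ~111M px/s
    print(px_3080 / px_ps5)       # ~2.6x

    # Public FP32 figures: RTX 3080 ~29.8 TFLOPS vs PS5 ~10.3 TFLOPS
    print(29.8 / 10.3)            # ~2.9x, so the expectation is in the right ballpark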
 

Kataploom

Gold Member
Because it's entirely dependent on where you test the game. The second planet apparently performs much, much worse than the first because it's a large open world. At 4K, performance absolutely tanks there, yet Techpowerup's graphs still show good performance at 1080p and 1440p. At 4K it suffers immensely, and there is likely a VRAM bottleneck on top of the other bottlenecks there.

I already explained to you why just reading the VRAM usage is wrong. Test it yourself: take an 8GB card and a 16GB card and you'll get different VRAM readings. My guess is they got those numbers from something like a 4090 or 7900 XT, not a 3070.

Here, for instance, the 6800 XT consistently reports 500MB more than the 3080.

[Image: RTX_3080_vs_6800_VRAM.PNG]


And Techpowerup are the same fools who were pairing an RTX 4090 with a mid-tier Zen 3 CPU and getting results 10-20% lower than everyone else's. Hardware Unboxed also haven't noticed a VRAM bottleneck.
AMD cards always use a little more VRAM than Nvidia, like Nvidia using 6GB in a game while AMD uses 6.4 or 6.7GB or so... Still, the VRAM figures they got are too low even compared to AMD's readings.
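
For what it's worth, the "usage" most overlays show is allocated VRAM, not what the game actually touches each frame, which is exactly why two cards give different readings for the same scene. A minimal sketch of that query with pynvml (assumes an Nvidia card and the nvidia-ml-py package; my illustration, not what Techpowerup runs):

    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # reports *allocated* framebuffer memory
    print(f"used {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
    pynvml.nvmlShutdown()

    # A game that sizes its texture pools to available VRAM will "use" more
    # on a 16GB card than on an 8GB card in the exact same scene.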
 

rofif

Can’t Git Gud
There's never a case where you have a 3080 rig and would pick the console version, unless there were launch problems that don't appear on consoles, but that's typically just temporary.

I think people are pretending to have that rig and then saying "I'll go console" for console warring.
Hello. It's me. A person with a 3080 who often chooses the PS5 version.
And I am all the better for it. No stress, just play. No stutter, no crashes, and no VRAM problems. Sure, lower fps most likely, but playing on a console I barely care. I finish games and still see people analysing messy PC ports for weeks.
I am allergic to settings; I can't be fucking bothered.
 

Buggy Loop

Member
Hello. It's me. A person with a 3080 who often chooses the PS5 version.
And I am all the better for it. No stress, just play. No stutter, no crashes, and no VRAM problems. Sure, lower fps most likely, but playing on a console I barely care. I finish games and still see people analysing messy PC ports for weeks.
I am allergic to settings; I can't be fucking bothered.

Yeah, I don't get it, sorry.

Even if I wait a year, I'm not double-dipping later for the inevitable superior version. FF7 Remake comes to mind. The only people who jump are the ones with serious day-one-launch FOMO, which I also don't get.

My library stays with me too, and it scales with future tech, be it resolution, framerates, or modders adding more content. God knows what happens on consoles every gen; it's a dice roll. You'll pay for a PS6 patch to get real 4K at 60 fps this time?



But good for you!
 

01011001

Banned
FSR2 is trash. Utter garbage if not used at 4K with the highest quality preset.
I much prefer lower-res TAA games with solid image quality. Hell, 1080p Uncharted 4 looks way more stable than this FSR crap, let alone at 1440p and 4K. Naughty Dog are gods of stable image quality.
Or Ratchet. Amazing IQ even at low res.

FSR is a cheap, trashy cop-out that looks worse than the 720p garbage these terrible devs scale from.

I don't like the rhetoric that the hardware is underpowered. It's not. UE4 is bad, and devs need to figure stuff out beyond last-gen games. Hell, there are last-gen games that look and run better on PS4 than this.

They absolutely should turn off FSR2 in this game.
They should run performance mode with dynamic resolution and TAAU.
Removing FSR2 would also remove some GPU strain and should result in a higher dynamic resolution.

Oh, how I miss the days of Frostbite's amazing checkerboard rendering, where even the Xbox One versions of games looked really good even though they were checkerboarded 1080p.

When an upsampling method makes you miss the pixel-comb artifacts of CBR and its soft image, you know it's dogshit.
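
If you wanted to try that swap yourself on PC, something like this in Engine.ini should approximate it, with a fixed 75% screen percentage standing in for a dynamic one. The FSR2 toggle is the plugin's assumed cvar (consistent with the prefix in winjer's changelog below), and none of these are verified to work in Jedi Survivor specifically:

    [SystemSettings]
    ; assumed FSR2 plugin toggle, not verified for this game
    r.FidelityFX.FSR2.Enabled=0
    ; stock UE4 temporal upsampling (TAAU)
    r.TemporalAA.Upsampling=1
    ; render at 75% of output resolution and let TAAU reconstruct
    r.ScreenPercentage=75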
 

winjer

Gold Member
So I did a bit of digging into the FSR2 implementation in this game, and I found this.
The FSR 2.1 SDK for UE added the following. Take note: there are two new console variables listed here.

2.1
• Update the FSR2 code to version 2.1.
• Fixed incorrect rendering & crashes with Split-Screen.
• Resolved a crash when changing Scalability Level in the Editor.
• Resolved a crash when enabling visualisation modes that disable upsampling.
• Resolved a crash when using Shader Model 6 in Unreal Engine 5.
• Resolved a crash opening the Unreal Editor when using the Vulkan RHI.
• Fixed sampling from reduced resolution ScreenSpace Reflections & Separate Translucency.
• Fixed re-enabling World-Position-Offset console variables when toggling FSR2 on & off.
• Added an optional de-dither pass to improve FSR2’s handling of dithered materials, especially Hair.
• Disabled FP16 in the RCAS pass to prevent incorrect rendering.
• Added an option to treat a shading model as reactive & use either CustomData0.x or the value of ‘r.FidelityFX.FSR2.ReactiveMaskForceReactiveMaterialValue’ as the reactive mask value.
• Added an option ‘r.FidelityFX.FSR2.ForceLandscapeHISMMobility’ to force the mobility of Hierarchical Instanced Static Mesh components attached to Landscapes to Stationary so they render motion vectors.
• Added an Engine patch to improve rendering of ‘Static’ objects that use a material with World-Position-Offset.
• Added an Engine patch which adds the Lit-Reactive ShadingModel that can be used to pass a reactivity value to write into the Reactive Mask for animated materials such as video screens.
• Remove an unnecessary RHI command flush that reduced CPU performance.

I tried using these in UUU to see if they would be recognized, and they aren't.
So my guess is that this game is still using FSR 2.0. Not FSR 2.2, and not even FSR 2.1.
Dead Island 2 released earlier and already shipped with a good implementation of FSR 2.2. But the folks at Respawn didn't even bother updating it.
So it's not just that this game has a bad FSR2 implementation; it's also using the oldest, worst version.
If it were using FSR 2.2, it would look much better.
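
To reproduce that check yourself: these are the two FSR 2.1 console variables from the changelog above, exactly as AMD lists them (the values here are just placeholders). Typed into UUU's console, a build on FSR 2.1 or newer should recognize them, while an FSR 2.0 build won't:

    r.FidelityFX.FSR2.ReactiveMaskForceReactiveMaterialValue 0.5
    r.FidelityFX.FSR2.ForceLandscapeHISMMobility 1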
 

Gamezone

Gold Member
There's no real difference between Windows 10 and Windows 11. If anything, Windows 10 actually gives you a couple of extra frames on the "wrong" OS.

 

MikeM

Gold Member
I'm sorry, but Michael is an embarrassment.

He isn't a real tech guy, and I think his embarrassing video and incorrect resolutions show how clueless he is.
Not a tech guy? Doesn't he fix tube TVs for funzies? I trust him more than anyone else on GAF.
There's never a case where you have a 3080 rig and would pick the console version, unless there were launch problems that don't appear on consoles, but that's typically just temporary.

I think people are pretending to have that rig and then saying "I'll go console" for console warring.
I have a 7900 XT PC and a PS5. I bought Dead Space Remake and Crisis Core FFVII on PS5. Why? Because I felt like it.

Gamers play wherever. You don't need a PC to have a good experience. But I will say that some games, like Cyberpunk 2077, are just a flat-out better experience on PC.
 

ToTTenTranz

Banned
Wtf is this?
The game has a CPU performance problem.
It's apparently trying to compile shaders on the fly during gameplay, which stalls the GPU while it waits for the CPU cores to get back to the game.

With this, the dynamic resolution system sees frametimes getting longer and longer and responds by lowering the native render resolution, even though the stall has nothing to do with the rendering workload.
Dynamic resolution only makes sense in GPU-limited scenarios, which this is not.

This might be something that gets fixed rather quickly.
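
A toy sketch of why that happens (my own illustration, not Respawn's code; the controller and the numbers are made up). A naive frametime-driven DRS controller can't tell a CPU-side shader-compile stall from GPU overload:

    TARGET_MS = 16.7  # 60 fps frame budget

    def update_scale(scale, frame_ms):
        """Naive controller: treats all frametime as GPU work."""
        error = frame_ms / TARGET_MS
        scale /= error ** 0.5          # halve pixel count when frametime doubles
        return min(1.0, max(0.5, scale))

    scale = 1.0
    # GPU work is a steady 12 ms, but mid-run a shader compile stalls frames to 42 ms:
    for frame_ms in [12, 12, 42, 42, 42, 12]:
        scale = update_scale(scale, frame_ms)
        print(f"frame {frame_ms} ms -> resolution scale {scale:.2f}")
    # The controller slams into its 0.5 floor even though lowering resolution
    # cannot shorten a CPU-side stall, so the image gets blurrier for nothing.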
 

Gaiff

SBI’s Resident Gaslighter
I'm sorry, but Michael is an embarrassment.

He isn't a real tech guy, and I think his embarrassing video and incorrect resolutions show how clueless he is.
Nah, he's a software engineer with over 20 years of experience. He definitely knows technology, but his field of expertise isn't video games, so he doesn't know much more than other gaming tech enthusiasts.
 

DeepEnigma

Gold Member
Is Ratchet and Clank not next-gen, since it uses Insomniac's engine, the same as Spider-Man 1 and the Ratchet remake?

I guess Spider-Man 2 isn't next-gen either, as it uses Spider-Man 1's engine......

🤡
EA literally put out apologies and "promises" before the game even launched. 🤷‍♀️


And those games will perform far better than this unoptimized trash pit of a mess. Keep going with your fanatical insecurities, though, zeroing in on some of the most optimized games and developers in the business.
 

DenchDeckard

Moderated wildly
EA literally put out apologies and "promises" before the game even launched. 🤷‍♀️


And those games will perform far better than this unoptimized trash pit of a mess. Keep going with your fanatical insecurities, though, zeroing in on some of the most optimized games and developers in the business.

You just said it can't be next-gen because it's Unreal Engine 4......

What are you trying to say?
 
Who the f*ck keeps greenlighting the use of RT on consoles?? And now they bake it into performance mode without a toggle? Have these guys really just gone and lost their minds? Ban RT on consoles already, for crying out loud.
 

01011001

Banned
Who the f*ck keeps greenlighting the use of RT on consoles?? And now they bake it into performance mode without a toggle? Have these guys really just gone and lost their minds? Ban RT on consoles already, for crying out loud.

it works just fine if the devs know what they're doing
 

Kataploom

Gold Member
Jesus Christ.
900p/30fps with drops, reduced settings, no RT.

I've tried to defend the Series S in the past, but fuck's sake.
Well, the game is a mess everywhere, and the XSS version is actually remarkably stable performance-wise. Not the version I would play, but definitely one that's more than good enough for most people.
 

ReBurn

Gold Member
Jesus Christ.
900p/30fps with drops, reduced settings, no RT.

I've tried to defend the Series S in the past, but fuck's sake.
None of these consoles are putting in the work for native resolution. It's tough to single out the Series S here when they're all pretty much sub-900p and the bigger consoles struggle in performance mode. 30fps seems to be the way to go on PS5 and XSX too, at least for now.
 
Hello. It's me. A person with a 3080 who often chooses the PS5 version.
And I am all the better for it. No stress, just play. No stutter, no crashes, and no VRAM problems. Sure, lower fps most likely, but playing on a console I barely care. I finish games and still see people analysing messy PC ports for weeks.
I am allergic to settings; I can't be fucking bothered.

If that's the case, you should've gone with the PS5 from the get-go. Though I guess you have the best of both worlds. But... you said you barely care, so you could've saved big money.
 

rofif

Can’t Git Gud
If that's the case, you should've gone with the PS5 from the get-go. Though I guess you have the best of both worlds. But... you said you barely care, so you could've saved big money.
If I had known, I would have gotten a 6950 XT instead of the 3080. That thing gets dragged down by its 10GB, and I have no intention of upgrading my PC again this gen.
 

SlimySnake

Flashless at the Golden Globes
I don't understand why the Series X performs worse.

Isn't the CPU 300MHz faster? They seem to be otherwise identical on paper.
No, it's only 100MHz faster.

It's 300MHz faster only with SMT disabled, which means just 8 of the 16 threads are active. No dev was going to do that; it was just for PR.
 

nordique

Member
With the improved performance, I ended up going to GameStop on May the Fourth to get the Series X version, but they had none in stock, soooo I just picked up the PS5 version.

No, it's only 100MHz faster.

It's 300MHz faster only with SMT disabled, which means just 8 of the 16 threads are active. No dev was going to do that; it was just for PR.

Oh! Thank you for explaining. I did not know that.

Is that why most Series X games seem to suffer relative to the PS5 (the 100MHz and roughly 2 extra TF of power not being enough to make up the difference)? I don't actually know. I'm not sure why the supposedly "superior on paper" Series X always seems to have performance issues on what is, more or less, very similar hardware. Is it not the primary development platform? It was mentioned that the PS5 is just better designed, but both use AMD CPUs and GPUs of the same generation; they shouldn't have this much discrepancy. If anything, I expected them to be basically identical in games, with a slight edge to the Series X.
 
Man, performance with an RTX 4090/7800X3D is awfully bad. And it's not like the game is pushing my PC to its limit; it's doing nothing with it. Both the GPU and the CPU just sit there, barely being used. It's a shame, because the game is beautiful and the gameplay seems better than in the first game.

Well, I had to see it with my own eyes. I guess I'll just ask for a refund and come back when it's, hopefully, fixed.
 

hollams

Gold Member
Well, I joined the Patreon so I could get the DLSS mod, and so far it's working pretty well. When I first tried it, I got a crash to desktop and thought, oh well, but then I remembered I had ray tracing on and I was in Jedha, where you have to turn RT off. After I turned RT off, no problems: the game runs smoothly with HDR. I really haven't noticed any artifacting, and I'm playing with FSR off.
 

john2gr

Member
Man, performance with an RTX 4090/7800X3D is awfully bad. And it's not like the game is pushing my PC to its limit; it's doing nothing with it. Both the GPU and the CPU just sit there, barely being used. It's a shame, because the game is beautiful and the gameplay seems better than in the first game.

Well, I had to see it with my own eyes. I guess I'll just ask for a refund and come back when it's, hopefully, fixed.

I find it hard to believe you have an RTX 4090/7800X3D and get awful performance. With an RTX 4090/7950X3D (running only on CCD0, which means it performs exactly like the 7800X3D), I get mostly constant 100% GPU usage at 4K/RT/FSR 2.0 Quality, with framerates ranging between 80 and 120fps in Koboh, Coruscant, and Jedha. Locking it at 60fps, I get a constant 60fps with dips to 58fps due to the traversal stutters. The only downside we can agree on is the traversal stutter. Other than that, a high-end PC has no trouble at all running it at more than 80fps. Is it unoptimized? Absolutely; it should be running way better. The same applies to the console versions, though, which have major image-quality and performance issues. Does it run awfully on a high-end PC? Well, if you're going to call a constant 80fps at 4K/ray tracing/max settings/FSR 2.0 Quality awful, then I guess it does.

PS: Without FSR 2.0, your RTX 4090 will be at 100% usage at all times. So saying "the GPU just sits there doing nothing" is about as inaccurate as it gets.
 

sinnergy

Member
No, it's only 100MHz faster.

It's 300MHz faster only with SMT disabled, which means just 8 of the 16 threads are active. No dev was going to do that; it was just for PR.
Seeing that these games only use two cores for most of the work... I wonder if running with SMT off, to get the benefit of the extra 300MHz on 8 threads, would have turned out better 🤣
 