
Star Wars Jedi Survivor PC Performance Discussion OT: May The Port Be With You

SlimySnake

Flashless at the Golden Globes
Why can't techtubers resist making broad claims based on shit ports? The dev literally said "weeks of patches" are coming, but sure, let's go into Nvidia driver overhead speculation 🤡
Some games just favor AMD GPUs, like AC Valhalla. This could be one of those games.

Anyway, new patch out today. Fixes non-RT performance.

 

Trunx81

Member
Just saw the performance on the Steam Deck: 17-20 fps if you're lucky, while it runs at 30 fps on the Series S. Bummer :(
 

winjer

Gold Member
After the patch there seems to be a bit less traversal stuttering, and what remains seems less pronounced.
Frame rate seems about 5 fps higher, depending on the area.
The game has some variability, so it's hard to pinpoint things.
It would be nice if it had a benchmark mode.

FSR is still only the 2.0 version.
 

drotahorror

Member
This is an old video from before the latest patch, and I don't know if there have been any other patches since. I skipped this game because everyone was knocking the performance.

Timestamp 4:00 if the link doesn't work.


So I might be able to get around 60 fps with a 3060 Ti at Ultra settings (no RT and no DSR)? I can definitely deal with that...

I know there were genuine issues at launch (like crashing and not even being able to open the game), but it seems like people just don't understand their own PCs, which is not surprising.

Are people trying to run this game maxed out with RT on a 2060? A 1650? A 3060 Ti? They are ignorant.

For people with 4080s and 4090s and newer CPUs who can't break 40 fps, I get that that's a problem.
 

rnlval

Member


What a fucking clown 🤡

Always jumping into the worst AMD-sponsored ports to come back with the VRAM narrative.

They have AMD's dick way too deep

8 GB of VRAM is a problem for the RTX 3070 and RTX 3070 Ti in recent games.

My HTPC has an RTX 3070 Ti (22+ TFLOPS, 8 GB VRAM), a Ryzen 7 5800X, 32 GB of DDR4-3600, and an ASUS ROG Strix X570 Gaming-E.

 

SlimySnake

Flashless at the Golden Globes
Performance is definitely better in non-RT modes. I am seeing way better GPU utilization; I actually saw it hit 99% at native 4K in the first open world.

But High settings at native 4K don't look like native 4K. There is definitely something funky going on. 4K Epic with FSR 2.0 looks better. Go figure.
 

sertopico

Member
Performance is definitely better in non-RT modes. I am seeing way better GPU utilization; I actually saw it hit 99% at native 4K in the first open world.

But High settings at native 4K don't look like native 4K. There is definitely something funky going on. 4K Epic with FSR 2.0 looks better. Go figure.
It might be the TAA quality setting, which is also an upscaler.
 

bbeach123

Member
Performance is definitely better in non-RT modes. I am seeing way better GPU utilization; I actually saw it hit 99% at native 4K in the first open world.

But High settings at native 4K don't look like native 4K. There is definitely something funky going on. 4K Epic with FSR 2.0 looks better. Go figure.
There's a hidden setting that scales resolution with the graphics preset: Epic is 100% resolution scale, Low is 50%. So yeah, Epic + FSR 2.0 did look better than High, lol.
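If that's accurate, the behaviour would look something like this. A minimal sketch (the 100%/50% endpoints come from the post; the Medium/High values are assumed) of a preset that silently couples an internal resolution scale to the quality level:

```cpp
// Hypothetical sketch, not the game's actual code: a quality preset that
// silently sets an internal resolution scale, which would explain why
// "Epic + FSR 2.0" can out-resolve a supposedly native "High" preset.
#include <cstdio>

double presetResolutionScale(int preset) {
    switch (preset) {
        case 0:  return 0.50; // Low  (50%, per the post above)
        case 1:  return 0.65; // Medium (assumed value)
        case 2:  return 0.80; // High   (assumed value)
        default: return 1.00; // Epic (100%, per the post above)
    }
}

int main() {
    const int outW = 3840, outH = 2160; // "native 4K" output target
    for (int p = 0; p <= 3; ++p) {
        const double s = presetResolutionScale(p);
        std::printf("preset %d renders %dx%d internally, upscaled to 4K\n",
                    p, int(outW * s), int(outH * s));
    }
}
```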
 

Kataploom

Gold Member
There's a hidden setting that scales resolution with the graphics preset: Epic is 100% resolution scale, Low is 50%. So yeah, Epic + FSR 2.0 did look better than High, lol.
Oh, that's a perfectly designed user experience, definitely not a fundamentally flawed design at all. Sure... /s
 
I believe performance has gone way up for me. I've gone from 40 fps at Medium 1080p to a mostly stable 60 fps at High 1080p, using around 92% of my VRAM (10400F, 3070, 32 GB RAM).
 

winjer

Gold Member
I saw a benchmark that showed Jeddah got the biggest boost. Koboh is mostly the same, with maybe a 5-10% boost in performance.

I can confirm the 5-10% boost in Koboh.
I only started Jeddah after the patch, so I can't say. But it is running well, and it is almost as open and big as the Koboh village area, though with fewer buildings and NPCs.
Not bad for a patch made in a few days. Let's hope EA and Respawn continue making improvements.
 

yamaci17

Member
I saw a benchmark that showed Jeddah got the biggest boost. Koboh is mostly the same, with maybe a 5-10% boost in performance.
4K FSR Performance + ray tracing is still unusable even then. I tried locking to 30 but no go; there are abrupt frame drops (the game is constantly hitting PCIe).

Even 1080p FSR Performance + Medium settings + ray tracing fills the 7.2 GB buffer (the game will not use more than that).

They need to fix the non-ray-traced lighting ASAP. I can play 4K FSR Balanced at a 40 fps lock without problems, but ray tracing simply cannot fit into this tiny buffer... damn you, Jensen. Damn you. The card is capable, too.

I don't want ray tracing anyway, but the lighting is simply broken without it in many instances. SSR is completely whack without it.
 

SlimySnake

Flashless at the Golden Globes
There's a hidden setting that scales resolution with the graphics preset: Epic is 100% resolution scale, Low is 50%. So yeah, Epic + FSR 2.0 did look better than High, lol.
Yeah, I think someone told me that earlier. The problem is that if I set it to Epic and then turn down individual settings, the preset changes to Custom and I get the same shit IQ. I basically have no choice but to turn on FSR.

FSR + RT at 4K looks really good, but there are hitches, which I'm trying to avoid. I've already set the fps cap to 40, so I'm willing to compromise, but the RT mode has too many stutters even with the GPU sitting around 70% utilization and VRAM hovering around 7500 MB.

DF and other PC sites have dropped the ball with this game. This is the kind of stuff I rely on them for. I bet Respawn isn't even aware of this issue because no one on YouTube covered it. Useless.
 

SlimySnake

Flashless at the Golden Globes
Enabling ReBAR in NV Inspector helps performance overall, so I would recommend it.
I did this yesterday and didn't notice any improvements in Koboh.

Anyone else seeing a lot of LOD pop-in after fast traveling in Koboh since updating to the latest Nvidia drivers? I never had much LOD pop-in while traversing or fast traveling before the game-ready drivers that were released yesterday.
 

SlimySnake

Flashless at the Golden Globes

The 1440p and 4K results on the 3070 and other lower-end cards are near identical, though. Only the most expensive cards benefit from this patch at higher resolutions, due to better GPU utilization.

So 3080 owners like me playing at 1440p 60 fps will still see drops, and 4K 40 fps on Epic seems to be out of the picture; it tops out at 35 fps where it's GPU-bound. What's odd is that I was getting very similar performance with RT on, around 30-33 fps at native 4K Epic settings. Ray tracing is definitely not very GPU-demanding in this game.
 

yamaci17

Member
I did this yesterday and didn't notice any improvements in Koboh.

Anyone else seeing a lot of LOD pop-in after fast traveling in Koboh since updating to the latest Nvidia drivers? I never had much LOD pop-in while traversing or fast traveling before the game-ready drivers that were released yesterday.
Don't enable ReBAR; it will increase VRAM consumption immensely (from 500 MB to 1.5 GB) and cause more frame drops. If anything, you will run into trouble faster than you would otherwise. The increase will be hidden, because the game will start loading and offloading data more quickly, so you will appear to have the same VRAM usage but with more frame drops; VRAM management and VRAM metrics are all borked on modern systems.

I've solidified my theory: the game thinks only 90% of your VRAM is available. How did I prove this? If you run programs that use 600 MB of VRAM, the game reduces its own VRAM usage so that total system usage stays around 7.2 GB. No matter what you open in the background, the game will force the system to use 7.2 GB of VRAM at maximum.

Similar to Spider-Man and Hogwarts Legacy, this is another game that is wonky with VRAM management. There's a BIG miscommunication between Windows devs, Nvidia devs, and game devs. I'm sure of it.

No Chrome: 6.2 GB in game + 7.2 GB total VRAM usage

Chrome: 5.6 GB in game + 7.2 GB total

Whatever you have in the background, that app will not use the free VRAM you have. Instead, the game jankily reduces its own VRAM consumption (and creates performance problems while doing so).

This is unoptimized bullshit behaviour at its peak. If NVIDIA is adamant on releasing 8 GB midrange cards, they have to address this bullshit.

Do you see the problem here? Even if we assume the game consciously leaves 10% of VRAM free, the programs that use extra VRAM will still cause the game to swap data anyway, so what's even the point? No matter what you do, you won't be able to push past 7.2 GB of VRAM while running this game. You can run all the software in the world; you won't be able to. (And the same applies to 10 GB cards, with different values: around 9 GB allocation and 8 GB usage.)

Why do I bitch about this? Because even if I accepted that the game leaves VRAM free for your background applications, those background applications don't use that free VRAM anyway; the game simply feels threatened and reduces its own VRAM usage when something else uses VRAM. And all of this happens when VRAM isn't even maxed out at 7.2 GB.

The Last of Us was not like this.

Windows and games and Unreal Engine and devs are too dumb. This will make me leave PC gaming. You can't brute-force your way out of stupidity.

With such bullshit, 12 GB is obsolete and a POS. Frame gen, fancy ray tracing... UE games are too dumb. With such dumb bullshittery, a 12 GB card will barely have enough VRAM to run next-gen console games without any of these gimmicks. Nvidia needs to address this ASAP.
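To make the claimed behaviour concrete, here is a minimal sketch under the poster's assumption: the game treats ~90% of total VRAM as a hard system-wide ceiling and shrinks its own allocation as background usage grows. The 1.0/1.6 GiB background figures are inferred from the screenshot numbers, not measured.

```cpp
// Minimal sketch of the budgeting behaviour described above, assuming the
// game targets 90% of TOTAL VRAM as a system-wide ceiling and yields its
// own allocation to background apps. Illustrative, not the game's code.
#include <algorithm>
#include <cstdio>

double gameBudgetGiB(double totalVramGiB, double backgroundGiB) {
    const double ceiling = 0.90 * totalVramGiB;    // 7.2 GiB on an 8 GiB card
    return std::max(0.0, ceiling - backgroundGiB); // what's left for the game
}

int main() {
    // Reproduces the screenshot numbers on an 8 GiB card:
    std::printf("no Chrome: %.1f GiB for the game\n", gameBudgetGiB(8.0, 1.0)); // 6.2
    std::printf("Chrome:    %.1f GiB for the game\n", gameBudgetGiB(8.0, 1.6)); // 5.6
}
```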
 

SlimySnake

Flashless at the Golden Globes
Don't enable ReBAR; it will increase VRAM consumption immensely (from 500 MB to 1.5 GB) and cause more frame drops. If anything, you will run into trouble faster than you would otherwise...
DWM just pisses me off. The EA app using so much VRAM is also insane. I saw it at 600 MB before I launched the game, which dropped it to 100 MB, which is still pretty significant. DWM, though, remains high.
 

yamaci17

Member
DWM just pisses me off. The EA app using so much VRAM is also insane. I saw it at 600 MB before I launched the game, which dropped it to 100 MB, which is still pretty significant. DWM, though, remains high.
The problem is not even DWM. The problem is that the game chops off its VRAM budget based on what DWM or any other app uses, all while free VRAM still exists and sits there, never to be used by the system or the game. The second something else uses VRAM, the game reduces its own VRAM usage, despite free VRAM being available to every app present.

You cannot even maximize VRAM alongside the game, even if you wanted to.

Forspoken was similar: hard-coded to think your system had 7.2 GB of VRAM. The problem is, even if you open apps on top of that, the reduction comes out of the 7.2 GB budget, not the total 8 GB budget. By this logic, a 12 GB GPU in this game only has 10.8 GB of meaningful VRAM. Out of that 10.8 GB, 700-800 MB will go to DWM and 300-400 MB to various programs. With such super-dumb logic, a 12 GB card practically has to make do with a 9.6-9.8 GB budget for the whole game. Now imagine trying to fit ray tracing, DLSS, and frame generation into that budget. What does NVIDIA really plan here? This really needs addressing. Not only are 8 and 10 GB users hugely affected by it; 12 GB users will be too.

It is even dumb on 16 GB, as it practically caps you at a 14.4 GB budget.

Notice how the game actively and dumbly challenges itself to reduce total VRAM usage to the 14.7 GB range.

Notice how, without ray tracing, VRAM usage is 7.2 GB total + 5.8 GB in game, and with ray tracing (later on)... it is still 7.2 GB total and 5.8 GB in game. Most users will think, "ooh, the game uses 5.8 GB even with ray tracing on, and I have 0.8 GB free on top!" while having stutters. So naturally they will look for the root of the problem elsewhere.

The reason his total and in-game usage land at 5.8 GB is that he has 400-500 MB worth of background VRAM usage.

You literally can't force the game to use more than 6.4 GB on an 8 GB system. Even with 4K ray tracing maxed out, it will peak at 6.4 GB. That's a joke. Same for 10 GB: you won't go past 8 GB.
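Running the same assumed 90%-of-total-VRAM rule across card sizes, with the overhead figures from the post (~750 MB for DWM, ~350 MB for other apps), reproduces the budgets quoted above:

```cpp
// Extends the assumed 90%-of-total-VRAM rule to larger cards, using the
// overhead estimates from the post. Illustrative arithmetic only.
#include <cstdio>

int main() {
    const double overheadGiB = 0.75 + 0.35; // DWM + miscellaneous apps
    for (double total : {8.0, 10.0, 12.0, 16.0}) {
        const double ceiling = 0.90 * total;
        std::printf("%5.1f GiB card: %4.1f GiB ceiling, ~%4.1f GiB for the game\n",
                    total, ceiling, ceiling - overheadGiB);
        // 12 GiB -> 10.8 ceiling -> ~9.7 for the game, matching the post.
    }
}
```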
 

Bojji

Gold Member
The 1440p and 4K results on the 3070 and other lower-end cards are near identical, though. Only the most expensive cards benefit from this patch at higher resolutions, due to better GPU utilization.

So 3080 owners like me playing at 1440p 60 fps will still see drops, and 4K 40 fps on Epic seems to be out of the picture; it tops out at 35 fps where it's GPU-bound. What's odd is that I was getting very similar performance with RT on, around 30-33 fps at native 4K Epic settings. Ray tracing is definitely not very GPU-demanding in this game.

They didn't test the game in Koboh city, so this chart is nearly useless. I think there is some improvement in this area after the patch, but it's not big.

Fuck me, I have to try it!

 

Bojji

Gold Member
So I tried the DLSS 3 frame-gen mod, and I can say that it improves the game; frame stutters are smoothed out with it. I will have to test it further.
 

RoboFu

One of the green rats
Whoever did the port had no clue how PCs work. The amount of locking/unlocking and kernel calls makes it seem like an intern did the port.






I mean, it's easy to point out issues; I'm sure the devs know all of them. What this doesn't show is why they had to do these things. It's never just that easy to fix issues.
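For anyone wondering what "too much locking" looks like in practice, here is a generic illustration (not the game's code): taking a contended lock per work item versus batching under one lock. A contended mutex acquisition can escalate to a kernel wait, so doing it thousands of times per frame shows up exactly like the captures above.

```cpp
// Generic illustration, not from the game: per-item locking versus batching.
#include <mutex>
#include <vector>

std::mutex gQueueMutex;
std::vector<int> gQueue;

// Anti-pattern: one lock acquisition (and potential kernel wait) per item.
void pushEachLocked(const std::vector<int>& items) {
    for (int v : items) {
        std::lock_guard<std::mutex> lock(gQueueMutex);
        gQueue.push_back(v);
    }
}

// Cheaper: take the lock once and move the whole batch.
void pushBatch(const std::vector<int>& items) {
    std::lock_guard<std::mutex> lock(gQueueMutex);
    gQueue.insert(gQueue.end(), items.begin(), items.end());
}
```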
 

simpatico

Member
Me scrolling this thread: Maybe I should buy this and play this weekend
I believe performance has gone way up for me. I've gone from 40 fps at Medium 1080p to a mostly stable 60 fps at High 1080p, using around 92% of my VRAM (10400F, 3070, 32 GB RAM).

Seeing this post as a GTX 1080 owner... lol. Skipping this one. This is absurd.
 

Kataploom

Gold Member
I mean, it's easy to point out issues; I'm sure the devs know all of them. What this doesn't show is why they had to do these things. It's never just that easy to fix issues.
The sole reason I can think of is a badly supervised or unsupervised junior dev not knowing how to handle these situations with proper asynchrony. These are the things you do when you don't know better programming practices, but they are supposed to be caught in code review, where the senior in charge can point out what's going on and why the code shouldn't pass into the next branch. They should have asked the junior to redo it following their advice, or directly told them what to use instead; that's how juniors grow and how these mistakes stay out of the production branch. But it seems like the senior in charge just let it pass anyway... Damn...
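As a generic example of the kind of fix that review should have asked for (illustrative only, not the game's code): move blocking work off the frame loop and consume it via a future once it's ready.

```cpp
// Illustrative only: keeping a blocking load off the frame loop with
// std::async, instead of stalling the game thread on I/O.
#include <chrono>
#include <future>
#include <thread>

int loadAssetBlocking() {
    // Stand-in for slow disk or network I/O.
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
    return 42;
}

int main() {
    // Bad: calling loadAssetBlocking() directly would stall a frame by 50 ms.
    // Better: start it on a worker thread and poll once per frame.
    std::future<int> pending = std::async(std::launch::async, loadAssetBlocking);
    while (pending.wait_for(std::chrono::seconds(0)) != std::future_status::ready) {
        // ...render this frame without the asset...
    }
    const int asset = pending.get();
    (void)asset; // use the asset from here on
}
```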
 