
Star Wars Jedi Survivor PC Performance Discussion OT: May The Port Be With You

M1987

Member
Struggling to hit 40fps at 1440p on a 4090 😂 How the fuck is anyone with a GPU below a 4090 supposed to play it? 720p with FSR/DLSS Ultra Performance?
 

SlimySnake

Flashless at the Golden Globes
It takes too long for devs to optimize for PC, so they rely on patches instead of delaying the game.

If they don't patch this on day one, it deserves every bit of the review bombing. This is not acceptable.

Where is the goddamn integrity, PC devs? Do you enjoy releasing stuttering shit we have to play on low?
You are expecting integrity from devs who are trying to pass off PS4-quality visuals as next-gen games just so they can charge $70 for it?
 

Elog

Member
What is the latency of I/O access on a PS5 vs a PC? And how does it impact this specific game?
Sorry in advance for a lengthy response.

The trend you are all seeing, of more and more games having stuttering issues and frame drops on PC that you do not see on consoles, comes down primarily to the following.

In a CGI movie, a single character model can have up to 100 textures. High-resolution textures are roughly 50-100 MB each uncompressed, so a single character model carries 5-10 GB of texture data under optimal conditions. In games it is obviously less. However, in the past there was a hard cap on how many textures a game could use and at what resolution. Since consoles had mechanical drives (as did most PCs), textures were mainly loaded when a 'level' loaded; nothing was streamed on the fly. On PC this resulted in a texture budget of around 4 GB (based on the median graphics card in use).
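To put rough numbers on that, a back-of-envelope sketch (the figures are the ones quoted above, not measurements):

```cpp
#include <cstdio>

int main() {
    // Figures from the paragraph above: ~100 textures per CGI-grade
    // character model, ~50-100 MB per uncompressed high-res texture.
    const double textures = 100.0;
    const double minMB = 50.0, maxMB = 100.0;

    // Total uncompressed texture data for one such model: ~5-10 GB.
    std::printf("Per-model texture data: %.1f - %.1f GB\n",
                textures * minMB / 1024.0, textures * maxMB / 1024.0);

    // Compare the ~4 GB budget of the median last-gen graphics card:
    // one CGI-grade model alone would blow it, hence the old hard caps
    // on texture count and resolution.
    return 0;
}
```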

With the new generation of consoles having SSDs, textures can be loaded on the fly, so most games have started doing that, and as a consequence the number of textures in use has increased dramatically, as has their resolution.

On a console this is easy, since RAM and VRAM are one pool and the GPU can pull data straight from the SSD without checks and balances. On a PC the GPU cannot access system RAM or the SSD directly, since that would be a security risk. It has to go through the kernel (with the associated driver overhead) and ask for a file to be fetched from RAM or the SSD into VRAM before it can be accessed. That introduces a lot of latency, and fairly dramatic frame drops when a texture is not where it needs to be to build a frame. DirectStorage tries to make this path faster and smoother, but it is still a far cry from what the XSX and PS5 achieve, due to the limitations of the PC architecture.
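For the curious, this is roughly what that request path looks like through DirectStorage - a minimal, untested C++ sketch modeled on Microsoft's public samples, where the device, buffer, fence, and file name are all placeholders:

```cpp
#include <dstorage.h>    // DirectStorage (Microsoft.Direct3D.DirectStorage)
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Enqueue a read straight from a pack file into a GPU buffer. The
// device, uploadBuffer, fence, and "textures.pak" are placeholders.
void EnqueueTextureRead(ID3D12Device* device, ID3D12Resource* uploadBuffer,
                        ID3D12Fence* fence, UINT32 byteCount, UINT64 fenceValue)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Device     = device;
    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"textures.pak", IID_PPV_ARGS(&file));

    // Describe a read from the file into a GPU buffer.
    DSTORAGE_REQUEST request{};
    request.Options.SourceType      = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Source.File.Source          = file.Get();
    request.Source.File.Offset          = 0;
    request.Source.File.Size            = byteCount;
    request.Destination.Buffer.Resource = uploadBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = byteCount;

    queue->EnqueueRequest(&request);
    queue->EnqueueSignal(fence, fenceValue);  // fires when the read lands
    queue->Submit();
}
```

Even then, the request still goes through the OS, filesystem, and driver before the data lands in VRAM, which is the overhead described above; the consoles simply skip most of those hops.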

This tweet still holds, despite a lot of PC lovers fighting its implications (and I love my PC - custom loop, 4090, etc.):

 

Gaiff

SBI’s Resident Gaslighter
Sorry in advance for a lengthy response.

The trend you are all seeing, of more and more games having stuttering issues and frame drops on PC that you do not see on consoles, comes down primarily to the following.
No, this trend is chiefly an issue with DX12 and UE4, which is ubiquitous. It started long before these consoles were even out. The stuttering is mainly due to shader compilation.
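For anyone unfamiliar with what "shader compilation stutter" actually is: in D3D12 the driver compiles shaders to GPU machine code when a pipeline state object (PSO) is created, and if a game creates PSOs lazily at first draw, that cost lands mid-frame. A hedged sketch of the standard mitigation - building PSOs up front on worker threads during a load screen - where `psoDescs` is a hypothetical list of descriptors gathered ahead of time:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <future>
#include <vector>
using Microsoft::WRL::ComPtr;

// CreateGraphicsPipelineState() is where the driver's shader compiler
// runs; it can take tens of milliseconds or more per PSO. Done at first
// draw, that is a visible hitch. ID3D12Device is free-threaded, so the
// usual fix is to compile every PSO a level needs on worker threads
// while the load screen is up. `psoDescs` is a placeholder here.
std::vector<ComPtr<ID3D12PipelineState>> PrecompilePSOs(
    ID3D12Device* device,
    const std::vector<D3D12_GRAPHICS_PIPELINE_STATE_DESC>& psoDescs)
{
    std::vector<std::future<ComPtr<ID3D12PipelineState>>> jobs;
    for (const auto& desc : psoDescs) {
        jobs.push_back(std::async(std::launch::async, [device, &desc] {
            ComPtr<ID3D12PipelineState> pso;
            device->CreateGraphicsPipelineState(&desc, IID_PPV_ARGS(&pso));
            return pso;
        }));
    }
    std::vector<ComPtr<ID3D12PipelineState>> psos;
    for (auto& job : jobs) psos.push_back(job.get());  // wait on load screen
    return psos;
}
```

UE4 has a PSO caching system for exactly this; the stutter shows up when games ship without a warmed cache.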
 

T4keD0wN

Member
It takes too long for devs to optimize for PC, so they rely on patches instead of delaying the game.

If they don't patch this on day one, it deserves every bit of the review bombing. This is not acceptable.

Where is the goddamn integrity, PC devs? Do you enjoy releasing stuttering shit we have to play on low?
If only playing on low actually helped. I've tried turning everything down to low in Fallen Order, playing it at the lowest resolution the game allowed and limiting it to 30fps, and it had zero effect on stuttering. There's no way to get around the software being poor.
 

01011001

Banned
No, this trend is chiefly an issue with DX12 and UE4, which is ubiquitous. It started long before these consoles were even out. The stuttering is mainly due to shader compilation.

it's not even just DX12.
UE4 games have stutter issues on DX11 as well.

on PC the only way to play Fortnite relatively stutter-free, for example, is to use the "Performance" renderer, which uses DX11 and the mobile version's assets.

if you just set it to DX11 it still has shader stutters.
and other games in the past also had stutters in DX11 mode, like Final Fantasy VII Remake... it had fewer stutters in DX11, but they were still there.
 

SlimySnake

Flashless at the Golden Globes
Sorry in advance for a lengthy response.

The trend you are all seeing, of more and more games having stuttering issues and frame drops on PC that you do not see on consoles, comes down primarily to the following.

In a CGI movie, a single character model can have up to 100 textures. High-resolution textures are roughly 50-100 MB each uncompressed, so a single character model carries 5-10 GB of texture data under optimal conditions. In games it is obviously less. However, in the past there was a hard cap on how many textures a game could use and at what resolution. Since consoles had mechanical drives (as did most PCs), textures were mainly loaded when a 'level' loaded; nothing was streamed on the fly. On PC this resulted in a texture budget of around 4 GB (based on the median graphics card in use).

With the new generation of consoles having SSDs, textures can be loaded on the fly, so most games have started doing that, and as a consequence the number of textures in use has increased dramatically, as has their resolution.

On a console this is easy, since RAM and VRAM are one pool and the GPU can pull data straight from the SSD without checks and balances. On a PC the GPU cannot access system RAM or the SSD directly, since that would be a security risk. It has to go through the kernel (with the associated driver overhead) and ask for a file to be fetched from RAM or the SSD into VRAM before it can be accessed. That introduces a lot of latency, and fairly dramatic frame drops when a texture is not where it needs to be to build a frame. DirectStorage tries to make this path faster and smoother, but it is still a far cry from what the XSX and PS5 achieve, due to the limitations of the PC architecture.

This tweet still holds, despite a lot of PC lovers fighting its implications (and I love my PC - custom loop, 4090, etc.):


This game has very poor texture streaming on the PS5, so it isn't using the PS5 I/O or Cerny's SSD.

Here you can see it take 15 seconds to load textures. The rocks in the distance literally look like PS3- or PS2-quality rocks until they spawn in after 15 seconds.



This isn't like Hogwarts, RE4, or TLOU. This game is poorly optimized even on PS5.
 

Elog

Member
This game has very poor texture streaming on the PS5, so it isn't using the PS5 I/O or Cerny's SSD.

Here you can see it take 15 seconds to load textures. The rocks in the distance literally look like PS3- or PS2-quality rocks until they spawn in after 15 seconds.



This isn't like Hogwarts, RE4, or TLOU. This game is poorly optimized even on PS5.

I do not doubt that - and my argument really had very little to do with the PS5 and whether they utilized the PS5-specific I/O solutions or not.

My point, though, concerns games having comparatively larger issues on PC than on consoles lately (stuttering and frame drops). People really underestimate the challenge of improving graphics by using larger amounts of high-resolution textures - the PC environment has a real disadvantage here that is not easily remedied (compared to the XSX and PS5).

I think this will continue to be challenging for PCs this generation.
 
It's okay, you have the best-looking game in the industry, Elden Ring, to play on PC instead.

 

01011001

Banned
This situation is becoming untenable now. I have a 4090 and I'm having to refuse to buy new games that are supposed to stretch its legs.

Where and how does this end?

it might end if game reviewers stopped having such low standards and stopped crawling so far up the rectums of publishers that they start seeing the light on the other end again.

a game like this on PC deserves nothing more than a 5/10 at best.
 

Bragr

Banned
it might end if game reviewers stopped having such low standards and stopped crawling so far up the rectums of publishers that they start seeing the light on the other end again.

a game like this on PC deserves nothing more than a 5/10 at best.
Yeah, something is wrong with the process: they review the console version, and that's the score people see.
 

01011001

Banned
Yeah, something is wrong with the process: they review the console version, and that's the score people see.

they should demand review codes for all versions, and refuse to review if they don't get all of them to test.

but they won't do that, because let's be real, they don't care. they want the clicks, and only the clicks.
 

GHG

Gold Member
Yeah, something is wrong with the process: they review the console version, and that's the score people see.

But apparently even the console versions can be utter shit from a performance standpoint as well.

It's just that these hack journalists are willing to toe the party line and take the developer's or publisher's word for it when they say "we pinky promise everything will be 100% fixed in the day one patch". And then it isn't, but by that point it doesn't matter and the damage is done, because everyone pre-purchased based on a bunch of phoney reviews.
 

SlimySnake

Flashless at the Golden Globes
This situation is becoming untenable now. I have a 4090 and I'm having to refuse to buy new games that are supposed to stretch its legs.

Where and how does this end?
It won't end until these lazy devs are called out and put on full blast by the media. Things have gotten this bad because of the journalists' complacency.

Games like Gotham Knights, Sackboy, Callisto, Hogwarts and TLOU have all shipped in unplayable states. The PC gaming media has no problem trashing Nvidia and AMD GPUs, but when it comes to games, they don't want to pick up the phone and ask someone from these studios why something as simple as shader compilation wasn't done at the start of the game. Even TLOU, which received bad reviews, didn't result in ANY articles where people got Neil Druckmann on to answer just why the fuck his studio shipped a game in such a poor state. He's the co-president. Call him until he picks up the phone. Keep making videos calling him out, telling viewers that he's hiding and refusing to take their calls.

Journalists treat devs like a protected species. They all have YouTube channels with millions of subscribers who are responsible for their salaries, and yet their obligation is towards the developer. Why? To get review codes that they are no longer sent? DF spent years sucking up to Sony only for Sony to deny them a review code for TLOU. Respawn sent everyone review codes except Digital Foundry. Some loyalty.

I don't expect IGN, GameSpot, Easy Allies, and Rock Paper Shotgun to stand up for PC ports, but Gamers Nexus, JayzTwoCents, Linus and all these YouTubers with millions of subscribers covering PC hardware have dropped the ball big time. People buy hardware to play software, and all their hardware reviews mean nothing if the software ships in unplayable states and takes months to fix. I played TLOU last night and had the same issues I had at launch. A month after launch.

The entire industry is rotten. Everyone wants to be friends with these hacks who cheat their customers, but no one wants to stick up for the subscribers whose views are literally paying for the channel.
 

Spyxos

Member
This game looks exactly like its predecessor, so these hardware requirements are quite surprising. Either the game was shat onto the market completely unfinished, or it's down to Denuvo. Or a mixture of both.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Yet another AMD-sponsored game with VRAM issues. I'm so surprised...
 

GHG

Gold Member
I don't expect IGN, GameSpot, Easy Allies, and Rock Paper Shotgun to stand up for PC ports, but Gamers Nexus, JayzTwoCents, Linus and all these YouTubers with millions of subscribers covering PC hardware have dropped the ball big time. People buy hardware to play software, and all their hardware reviews mean nothing if the software ships in unplayable states and takes months to fix. I played TLOU last night and had the same issues I had at launch. A month after launch.

The problem is that those people aren't given pre-release access to PC games the way the obedient hack "gaming journalists" are.

So they aren't even being given the opportunity to do their jobs in that sense.
 

01011001

Banned
The problem is that those people aren't given pre-release access to PC games the way the obedient hack "gaming journalists" are.

So they aren't even being given the opportunity to do their jobs in that sense.

it's telling that they didn't even give Digital Foundry or NX Gamer, who are relatively tame even when a port sucks, any review code for this game.

these devs are scared of even the mild pushback those channels give them.
imagine them getting a review code request for a game like this from Gamers Nexus. they would shit their pants and never let them near a review copy, because Gamers Nexus aren't afraid to call something they review trash and completely dismantle the product's and/or the company's reputation.
 

SlimySnake

Flashless at the Golden Globes
I do not doubt that - and my argument really had very little to do with the PS5 and whether they utilized the PS5-specific I/O solutions or not.

My point, though, concerns games having comparatively larger issues on PC than on consoles lately (stuttering and frame drops). People really underestimate the challenge of improving graphics by using larger amounts of high-resolution textures - the PC environment has a real disadvantage here that is not easily remedied (compared to the XSX and PS5).

I think this will continue to be challenging for PCs this generation.
I've been gaming on PC since 2003. I get it. My PCs would always last until the next-gen consoles came out and VRAM became the bottleneck, forcing me to upgrade. My memory is hazy for my 2003 PC, but I distinctly remember the day my GTX 570 defaulted all settings to very low and refused to load textures in CoD: Advanced Warfare. A GPU just as powerful as the PS4.

I completely understand the VRAM side of things. I argued alongside you in other threads just last year. What's going on recently is more than just VRAM related. TLOU's medium textures are not even PS3 quality and they take up 10GB? The PS3 had 256MB of VRAM.

PCs also have the added benefit of system RAM. That has been the standard for the last three gens. I understand if the PS5 and XSX have this inherent advantage, but I have a system with almost 2x the VRAM bandwidth, 50% faster CPU clocks, DDR5 RAM, and an SSD that's 30% faster than the PS5's. That should be enough to compensate for whatever secret sauce Cerny and Jason Ronald put into the XSX and PS5. The fact that these issues get resolved with patches shows it's not hardware related, it's just time related. They are willing to ship these games knowing full well they stink.

Any game with shader stutters is proof of that. Run Gotham Knights, Sackboy, or Elden Ring for the first time and it will stutter like crazy. Unplayable. There is no way they don't catch that. TLOU took 40 minutes to 2 hours to build shaders. ND said they were investigating why that could be. BULLSHIT. They knew. This happens with every single CPU. AMD, Intel, 8-core, 16-core. 4.0GHz or 5.0GHz. Doesn't matter. And they pretended it was an isolated issue. GTFO.
 

01011001

Banned
I've been gaming on PC since 2003. I get it. My PCs would always last until the next-gen consoles came out and VRAM became the bottleneck, forcing me to upgrade. My memory is hazy for my 2003 PC, but I distinctly remember the day my GTX 570 defaulted all settings to very low and refused to load textures in CoD: Advanced Warfare. A GPU just as powerful as the PS4.

I completely understand the VRAM side of things. I argued alongside you in other threads just last year. What's going on recently is more than just VRAM related. TLOU's medium textures are not even PS3 quality and they take up 10GB? The PS3 had 256MB of VRAM.

PCs also have the added benefit of system RAM. That has been the standard for the last three gens. I understand if the PS5 and XSX have this inherent advantage, but I have a system with almost 2x the VRAM bandwidth, 50% faster CPU clocks, DDR5 RAM, and an SSD that's 30% faster than the PS5's. That should be enough to compensate for whatever secret sauce Cerny and Jason Ronald put into the XSX and PS5. The fact that these issues get resolved with patches shows it's not hardware related, it's just time related. They are willing to ship these games knowing full well they stink.

Any game with shader stutters is proof of that. Run Gotham Knights, Sackboy, or Elden Ring for the first time and it will stutter like crazy. Unplayable. There is no way they don't catch that. TLOU took 40 minutes to 2 hours to build shaders. ND said they were investigating why that could be. BULLSHIT. They knew. This happens with every single CPU. AMD, Intel, 8-core, 16-core. 4.0GHz or 5.0GHz. Doesn't matter. And they pretended it was an isolated issue. GTFO.

exactly! no one is expecting to run ULTRA SUPER EXTREME textures and have sub-1-second load times in every game.
what people expect is 8GB of VRAM usage actually looking like 8GB of VRAM usage...

they expect to start playing a game and get the performance their GPU shaders and CPU cores can actually deliver, not to wait for 2 hours, or play for hours and hours until the shaders are compiled and the game finally runs the way it should have from the start.

and they expect loading to actually use THE FULL FUCKING CPU, and not just 1 thread of 1 core, to decompress and load files into memory.
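to illustrate that last point, a rough sketch of fanning decompression out across cores - Chunk and DecompressChunk are hypothetical stand-ins for whatever codec an engine actually uses (zlib, Oodle, etc.):

```cpp
#include <cstdint>
#include <future>
#include <vector>

// hypothetical stand-ins; a real engine would call its codec here.
struct Chunk { std::vector<uint8_t> compressed; };
std::vector<uint8_t> DecompressChunk(const Chunk& c) { return c.compressed; }

// decompress independent chunks in parallel instead of on one thread.
// with a CPU-bound codec and N cores this approaches an N-fold speedup,
// which is exactly the gap single-threaded loaders leave on the table.
std::vector<std::vector<uint8_t>> DecompressAll(const std::vector<Chunk>& chunks)
{
    std::vector<std::future<std::vector<uint8_t>>> jobs;
    jobs.reserve(chunks.size());
    for (const auto& c : chunks)
        jobs.push_back(std::async(std::launch::async,
                                  [&c] { return DecompressChunk(c); }));

    std::vector<std::vector<uint8_t>> out;
    out.reserve(jobs.size());
    for (auto& j : jobs) out.push_back(j.get());
    return out;
}
```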
 

SlimySnake

Flashless at the Golden Globes
The problem is that those people don't get given pre-release access to PC games in the same way that the obedient hack "gaming journalists" do.

So they aren't even being given the opportunity to do their jobs in that sense.
As representatives of PC gaming, I think they need to make a big stink about that.

If they still don't get review codes, then put the devs on full blast and harass them for comment until they go on record. Linus has 20 million subscribers across his many channels; EA and Sony won't be able to ignore him for long.
 

jshackles

Gentlemen, we can rebuild it. We have the capability to make the world's first enhanced store. Steam will be that store. Better than it was before.
Is the first one any good or can I skip it?
The first game is great - and by now it's been heavily optimized on PC. If you haven't already played it, I'd recommend picking it up first. By the time you're done with it, this new game will (hopefully) be in a better technical state, and potentially discounted, depending on how long the first one takes you.
 

sendit

Member
This is going to get worse as the generation continues if technologies like DirectStorage and RTX IO don't mature. Where is VFXVeteran to tell everyone that the solution on PC is to brute-force more system RAM?
 

SlimySnake

Flashless at the Golden Globes
I have a question for PC gamers. How fast is DDR5 3600 RAM in terms of GB/s? The PS5 can pull 5.5 GB/s from the SSD into its VRAM. The XSX is capped at 2.4 GB/s, but they used DDR3 RAM in conjunction with ESRAM just last gen. Surely DDR4 and DDR5 RAM can pull data from Gen 4 7 GB/s SSDs and then push it to VRAM at a much faster rate than the PS5 I/O can.

Hogwarts was taking up to 25GB of my system RAM. Another 9GB of VRAM. That's half the fucking game. Just how slow is this DDR5 that it can't do what the PS5 I/O is doing?
 

Zug

Member
They didn't even try at this point; they just compiled the exe for Windows without any optimization the day before release.
I'll buy it in a couple of years for 10 bucks if they actually decide to release proper software.
 

xrnzaaas

Member
This is definitely disappointing; I don't remember the first game having big technical issues at launch... and it's not only a Denuvo issue (although the DRM probably contributes to the poor performance of the PC version).

From the perspective of someone who's not heavy into tech stuff, I don't understand the enormous bump in system requirements vs. the first game. It doesn't look that much better than Fallen Order.
 

fermcr

Member
Guess I'll just wait a few months before purchasing this game. No rush!
Get it on sale.
 

Zathalus

Member
I have a question for PC gamers. How fast is DDR5 3600 RAM in terms of GB/s? The PS5 can pull 5.5 GB/s from the SSD into its VRAM. The XSX is capped at 2.4 GB/s, but they used DDR3 RAM in conjunction with ESRAM just last gen. Surely DDR4 and DDR5 RAM can pull data from Gen 4 7 GB/s SSDs and then push it to VRAM at a much faster rate than the PS5 I/O can.

Hogwarts was taking up to 25GB of my system RAM. Another 9GB of VRAM. That's half the fucking game. Just how slow is this DDR5 that it can't do what the PS5 I/O is doing?
3600 DDR4 is 57.6 GB/s. 6000 DDR5 would be 96 GB/s. No consumer SSD (even with compression) comes close to that.
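For anyone who wants to check those numbers: DDR moves 8 bytes per 64-bit channel per transfer, so peak bandwidth is transfer rate x 8 x channel count. A quick sketch (dual channel assumed, as on a typical desktop board):

```cpp
#include <cstdio>

int main() {
    // peak DRAM bandwidth = transfers/s * 8 bytes per 64-bit channel * channels
    const double bytesPerTransfer = 8.0;
    const double channels = 2.0;  // dual channel assumed

    const double ddr4_3600 = 3600e6 * bytesPerTransfer * channels;
    const double ddr5_6000 = 6000e6 * bytesPerTransfer * channels;

    std::printf("DDR4-3600 dual channel: %.1f GB/s\n", ddr4_3600 / 1e9);  // 57.6
    std::printf("DDR5-6000 dual channel: %.1f GB/s\n", ddr5_6000 / 1e9);  // 96.0
    return 0;
}
```

So the RAM itself is nowhere near the bottleneck; the question above really comes down to the SSD-to-RAM-to-VRAM copy path and the per-request overhead in front of it.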
 