
Marvel's Spider-Man: Miles Morales for PC | Review Thread

ChiefDada

Gold Member
You claimed the PS5 has RT shadows? Because the PC version does.

Lol, my point still stands. A last-gen game is maxing out the VRAM limits of high-end PC cards. RT shadows look great on PC, but they are not the main reason performance is relatively low considering GPU compute specs. Even at the 5min mark of the video, where he is playing at 1440p with High RT, he is averaging in the 70s (occasionally dipping to 60 and below) with 8.5 GB VRAM usage, whereas PS5 1440p Performance RT, WITH the highest-quality textures, remains comfortably in the 80-90fps range.

All I'm saying is PC cards need to address the issue of memory management and asset decompression to perform well this gen. Otherwise that compute power goes to waste. It's obvious the PS5 is streaming data in and out and putting hardware decompression to work. The issue is much more pronounced in MM compared to SM Remastered, as I predicted.

yamaci17 thoughts?
 

Gaiff

SBI’s Resident Gaslighter
Lol, my point still stands. A last-gen game is maxing out the VRAM limits of high-end PC cards. RT shadows look great on PC, but they are not the main reason performance is relatively low considering GPU compute specs. Even at the 5min mark of the video, where he is playing at 1440p with High RT, he is averaging in the 70s (occasionally dipping to 60 and below) with 8.5 GB VRAM usage, whereas PS5 1440p Performance RT, WITH the highest-quality textures, remains comfortably in the 80-90fps range.
The PS5 doesn't have RT shadows. It would tank its fps if it did. At the 5-minute mark with High RT, that preset also includes RT shadows, so I have no idea why you bring up the PS5's performance. You'd also need to match the scenes 1-for-1. You can't just claim the PS5 gets around this performance and call it a day. Who's to say it doesn't tank to 55fps in that scene?
All I'm saying is PC cards need to address the issue of memory management and asset decompression to perform well this gen. Otherwise that compute power goes to waste. It's obvious the PS5 is streaming data in and out and putting hardware decompression to work. The issue is much more pronounced in MM compared to SM Remastered, as I predicted.

yamaci17 thoughts?
A lot of claims with nothing to back them up. The PS5's asset decompression sure as shit didn't help in A Plague Tale: Requiem, where it's being outdone by a 2070S, or in Gotham Knights, where it runs like shit. This is a game tailor-made for the PS5. It takes full advantage of its strengths. PC doesn't need to address a thing because 99% of games will be multiplatform anyway.

You should probably also wait for a proper benchmarking review before declaring anything. So far, you're simply claiming "I said this and that" and I have no idea where your proof is.
 

ChiefDada

Gold Member
The PS5 doesn't have RT shadows. It would tank its fps if it did. At the 5-minute mark with High RT, that preset also includes RT shadows, so I have no idea why you bring up the PS5's performance. You'd also need to match the scenes 1-for-1. You can't just claim the PS5 gets around this performance and call it a day. Who's to say it doesn't tank to 55fps in that scene?

I just explained to you why RT shadows are not the cause of the performance gap. For starters, there's no specific memory hit from shadows since RT reflections are already enabled with the BVH in memory; it's just additional compute. Granted, I don't know the performance impact specifically, but we're talking about RT shadows, which are notorious for being relatively cheap compared to other RT implementations.

I can make claims about PS5 Performance RT because I have a PS5 and have played the game countless times. It stays comfortably in the 80-90fps range.

A lot of claims with nothing to back them up. The PS5's asset decompression sure as shit didn't help in A Plague Tale: Requiem, where it's being outdone by a 2070S, or in Gotham Knights, where it runs like shit.

Why are you angry? APTR didn't need asset decompression because it isn't a memory-intensive game. Not something to brag about, depending on your perspective.

This is a game tailor-made for the PS5. It takes full advantage of its strengths. PC doesn't need to address a thing because 99% of games will be multiplatform anyway.

The first couple of sentences I agree with; SM MM leverages PS5 streaming and I/O. But PC does need to address it, because the Xbox Series consoles follow the same memory philosophy as the PS5.
 

Gaiff

SBI’s Resident Gaslighter
I just explained to you why RT shadows are not the cause of the performance gap. For starters, there's no specific memory hit from shadows since RT reflections are already enabled with the BVH in memory; it's just additional compute. Granted, I don't know the performance impact specifically, but we're talking about RT shadows, which are notorious for being relatively cheap compared to other RT implementations.
RT Shadows can drop performance by an enormous amount depending on the scenes. In fact, shadows probably take up more performance than reflections because there are shadows in every scene whereas it isn't the case for reflections (or at least it's easier to pick and choose).

This is the performance impact you have going from Ultra Shadows to RT Ultra Shadows in Shadow of the Tomb Raider.
[Benchmark charts: Shadow of the Tomb Raider, Ultra Shadows vs RT Ultra Shadows]
The 3080 is a whopping 43% faster with Ultra Shadows vs RT Ultra Shadows. The 2080 Ti is 49% faster. I'm not sure where you got the idea that RT shadows have a small impact on performance. Depending on the game, their impact varies from significant to crippling. It's never negligible.
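For anyone who wants to check the math, the "X% faster" figure is just a ratio of the two framerates. Quick Python sketch; the fps values here are placeholders for illustration, not the chart's actual numbers:

```python
# How "X% faster" falls out of two fps readings.
def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster fps_a is than fps_b, in percent."""
    return (fps_a / fps_b - 1.0) * 100.0

print(f"{percent_faster(100, 70):.0f}% faster")  # 100 fps vs 70 fps -> 43% faster
```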

I can make claims about PS5 Performance RT because I have a PS5 and have played the game countless times. It stays comfortably in the 80-90fps range.
No, you absolutely can't, unless you've got a built-in fps counter in your brain, which I'm 99% certain you don't (and if you do, please be the plug and let me know where I can get one). Furthermore, it would need to be in the same scene.
Why are you angry? APTR didn't need asset decompression because it isn't a memory-intensive game. Not something to brag about, depending on your perspective.
You make sweeping claims based on your misunderstanding of a single scene in a benchmark then go on to say a lot of things which you evidently have very little understanding of.
The first couple of sentences I agree with; SM MM leverages PS5 streaming and I/O. But PC does need to address it, because the Xbox Series consoles follow the same memory philosophy as the PS5.
Assuming that's even the answer, and you can't prove it is. What we do know is that Spider-Man on PC is extremely CPU-intensive and likely still has issues to address (and they might never be addressed), because many older CPUs actually run faster with HT/SMT off, so clearly, something is off with the usage/scheduling in this game. Hell, look at the guy's GPU usage in the video you posted; it sometimes drops to the 80s, implying the GPU isn't even being fed.
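For what it's worth, the rule of thumb I'm applying there (my own heuristic, not any official methodology): if fps is uncapped and GPU usage sits well under ~95%, the GPU is being starved. In sketch form:

```python
# Rough bottleneck heuristic: near-full GPU usage -> the GPU is the limit;
# big usage dips with uncapped fps -> the CPU/engine can't feed it.
def likely_bottleneck(gpu_usage_pct: float) -> str:
    return "GPU-bound" if gpu_usage_pct >= 95.0 else "CPU/engine-bound (GPU starved)"

print(likely_bottleneck(82.0))  # usage dipping into the 80s -> CPU/engine-bound
```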

Adding RT shadows puts further strain on the CPU. There could be a million reasons for this game's questionable performance on PC, and there actually are many. It isn't the single factor of asset decompression and memory management that you seem to cherish.

NxGamer and DF will have a video ready soon. Wait for them. They should at least provide further insight into the performance issues, then you can go wild with your claims with supporting evidence.
 
Last edited:

ChiefDada

Gold Member
RT Shadows can drop performance by an enormous amount depending on the scenes. In fact, shadows probably take up more performance than reflections because there are shadows in every scene whereas it isn't the case for reflections (or at least it's easier to pick and choose).

This is the performance impact you have going from Ultra Shadows to RT Ultra Shadows in Shadow of the Tomb Raider.
The 3080 is a whopping 43% faster with Ultra Shadows vs RT Ultra Shadows. The 2080 Ti is 49% faster. I'm not sure where you got the idea that RT shadows have a small impact on performance. Depending on the game, their impact varies from significant to crippling. It's never negligible.

Apples to oranges, buddy. The graph you're referencing shows the performance change from no RT to ultra RT shadows, which means CPU, VRAM, memory bandwidth, and GPU compute all take a hit with the preset on. With Miles Morales, the differential is much smaller because RT reflections are already enabled.

No, you absolutely can't, unless you've got a built-in fps counter in your brain, which I'm 99% certain you don't (and if you do, please be the plug and let me know where I can get one). Furthermore, it would need to be in the same scene.

Lol, you're only 99% certain I don't have a frame counter in my brain? Ok, how certain are you that I don't have one of those fancy LG Oled TV contraptions with a built in VRR do-hickey that......... COUNTS FPS???!!! *GASP*

You make sweeping claims based on your misunderstanding of a single scene in a benchmark then go on to say a lot of things which you evidently have very little understanding of.

LOL

Assuming that's even the answer, and you can't prove it is. What we do know is that Spider-Man on PC is extremely CPU-intensive and likely still has issues to address (and they might never be addressed), because many older CPUs actually run faster with HT/SMT off, so clearly, something is off with the usage/scheduling in this game. Hell, look at the guy's GPU usage in the video you posted; it sometimes drops to the 80s, implying the GPU isn't even being fed.

All I'm saying is PC cards need to address the issue of memory management and asset decompression to perform well this gen. Otherwise that compute power goes to waste.

[Doc Rivers reaction GIF]



NxGamer and DF will have a video ready soon. Wait for them. They should at least provide further insight into the performance issues, then you can go wild with your claims with supporting evidence.

Nothing wrong with seeking additional insight from the professionals, but remember that should NEVER preclude you from using your brain and reasonable logic as well. Goodnight buddy.
 

Gaiff

SBI’s Resident Gaslighter
Apples to oranges, buddy. The graph you're referencing shows the performance change from no RT to ultra RT shadows, which means CPU, VRAM, memory bandwidth, and GPU compute all take a hit with the preset on. With Miles Morales, the differential is much smaller because RT reflections are already enabled.
The graph measures solely the impact of RT shadows, something you claim has little impact, which is obviously bogus. And even when compared to reflections/GI, it's inaccurate to say that shadows are less costly; it varies widely.
Lol, you're only 99% certain I don't have a frame counter in my brain? Ok, how certain are you that I don't have one of those fancy LG Oled TV contraptions with a built in VRR do-hickey that......... COUNTS FPS???!!! *GASP*
And you plug that into your PS5 and spend your time measuring the fps? Not that it matters, because once again, for your comparisons to be even remotely accurate, you'd need to measure the same scenes on the PS5 with RT shadows on, something that you cannot do, so that argument is moot.
That's not an argument. You showed no understanding of basic things. Your dismissing the impact of RT shadows on performance was proof enough.
Nothing wrong with seeking additional insight from the professionals, but remember that should NEVER preclude you from using your brain and reasonable logic as well. Goodnight buddy.
Except you're not "using your brain". You're making claims with zero evidence. Show me the I/O throughput numbers compared to the PS5. Show me the memory bandwidth constraints. You're making guesses, nothing more. If you at least had some metrics to go by, we could start a discussion, but you've got literally nothing.
 
The PS5 doesn't have RT shadows. It would tank its fps if it did. At the 5-minute mark with High RT, that preset also includes RT shadows, so I have no idea why you bring up the PS5's performance. You'd also need to match the scenes 1-for-1. You can't just claim the PS5 gets around this performance and call it a day. Who's to say it doesn't tank to 55fps in that scene?

A lot of claims with nothing to back them up. The PS5's asset decompression sure as shit didn't help in A Plague Tale: Requiem, where it's being outdone by a 2070S, or in Gotham Knights, where it runs like shit. This is a game tailor-made for the PS5. It takes full advantage of its strengths. PC doesn't need to address a thing because 99% of games will be multiplatform anyway.

You should probably also wait for a proper benchmarking review before declaring anything. So far, you're simply claiming "I said this and that" and I have no idea where your proof is.
About Plague Tale: I'm so happy Alex finally used a matching CPU for his bench against the console. If he had used a ridiculous 12900K like he always did before, the PS5 would have come off performing like a 2060, so I respect the results in the end. I'm interested in how far the Series X could go if it was fully uncapped.
 
A 3080 10GB running Miles Morales at native 4K/High settings. Similar performance to the PS5, with FPS in the 40s. VRAM maxed out with lower texture quality than the PS5. Wow, called it. DirectStorage GPU decompression is a must for current gen.


It would actually be even worse if they were using the 3600G/3700X (PS5-equivalent CPU) instead of the 5900X.
 

ACESHIGH

Banned
Lol, my point still stands. A last-gen game is maxing out the VRAM limits of high-end PC cards. RT shadows look great on PC, but they are not the main reason performance is relatively low considering GPU compute specs. Even at the 5min mark of the video, where he is playing at 1440p with High RT, he is averaging in the 70s (occasionally dipping to 60 and below) with 8.5 GB VRAM usage, whereas PS5 1440p Performance RT, WITH the highest-quality textures, remains comfortably in the 80-90fps range.

All I'm saying is PC cards need to address the issue of memory management and asset decompression to perform well this gen. Otherwise that compute power goes to waste. It's obvious the PS5 is streaming data in and out and putting hardware decompression to work. The issue is much more pronounced in MM compared to SM Remastered, as I predicted.

yamaci17 thoughts?


Nah... Nixxes needs to get their coding game together. I know pot is legal over there but they should not have it while working on these ports.
 

yamaci17

Member
Lol, my point still stands. A last-gen game is maxing out the VRAM limits of high-end PC cards. RT shadows look great on PC, but they are not the main reason performance is relatively low considering GPU compute specs. Even at the 5min mark of the video, where he is playing at 1440p with High RT, he is averaging in the 70s (occasionally dipping to 60 and below) with 8.5 GB VRAM usage, whereas PS5 1440p Performance RT, WITH the highest-quality textures, remains comfortably in the 80-90fps range.

All I'm saying is PC cards need to address the issue of memory management and asset decompression to perform well this gen. Otherwise that compute power goes to waste. It's obvious the PS5 is streaming data in and out and putting hardware decompression to work. The issue is much more pronounced in MM compared to SM Remastered, as I predicted.

yamaci17 thoughts?
No, PC cards just need more VRAM to match the PS5, that's about it.
As I said, you're really overcomplicating the issue. This is like asking devs to find special ways to work around the issues that 2 GB GTX 770s and similar cards experienced later into the generation. The solution was to play with frame drops, or super garbage textures, or sometimes something in between. That's about it.

Imagine being entitled to ultra textures in RDR2 with a 2 or 3 GB card (it needs 4 GB of VRAM), then asking Rockstar to implement memory compression for the 2 GB audience.

Nope, you're going to take lower-quality textures and be happy about it. If not, you're free to upgrade to a card with more VRAM. That's how things work on PC: you're responsible for your own purchases and decisions, and no one is entitled to anything.

A GTX 780 only gave 2-3 years of meaningful performance; no one was entitled to get 6+ years out of that once-flagship GPU.
A GTX 980 has practically lasted the entire generation, now in its 8th year, and still runs a lot of games decently. Again, it's just luck of the draw.
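To put rough numbers on why texture presets exist at all, here's some toy mip-chain math (assuming BC7-style block compression at 1 byte/texel and a full mip chain adding ~33%; real engines vary):

```python
# Toy texture-memory math: base mip level plus a full mip chain (~ +1/3), in MiB.
def texture_mib(res: int, bytes_per_texel: float = 1.0) -> float:
    return res * res * bytes_per_texel * (4.0 / 3.0) / 2**20

for res in (1024, 2048, 4096):
    print(f"{res}x{res}: ~{texture_mib(res):.0f} MiB")
# 1024 -> ~1 MiB, 2048 -> ~5 MiB, 4096 -> ~21 MiB: a few thousand unique
# "ultra" textures can't all sit resident in an 8 GB pool, hence presets.
```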
 
Last edited:

yamaci17

Member
I just explained to you why RT shadows are not the cause of the performance gap. For starters, there's no specific memory hit from shadows since RT reflections are already enabled with the BVH in memory; it's just additional compute. Granted, I don't know the performance impact specifically, but we're talking about RT shadows, which are notorious for being relatively cheap compared to other RT implementations.
This is plain wrong. Ray traced shadows CAN be costly, and they also have an additional VRAM cost. If you don't have an RTX card and have not played around with such settings before, please stay out of this discussion.


At 1080p;
No ray tracing: 97 FPS, 4.6 GB VRAM
Ray traced reflections: 81 FPS (a ~16% performance hit), 5.3 GB VRAM (a whopping 700 MB increase)
Reflections + shadows: 64 FPS (another ~21% performance hit on a 3070), 5.8 GB VRAM (another whopping 500 MB increase)
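The math, for anyone who wants to verify the per-step hits from those fps/VRAM numbers:

```python
# Deriving the per-step fps hit and VRAM delta from the measurements above.
steps = [("no RT", 97, 4.6), ("+reflections", 81, 5.3), ("+shadows", 64, 5.8)]
for (_, f0, v0), (name, f1, v1) in zip(steps, steps[1:]):
    print(f"{name}: {(1 - f1 / f0) * 100:.0f}% fps hit, +{(v1 - v0) * 1024:.0f} MB VRAM")
# +reflections: 16% fps hit, +717 MB VRAM
# +shadows: 21% fps hit, +512 MB VRAM
```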


And this is just at 1080p; the 1440p and 4K costs will be even higher in terms of VRAM. It is really going to extremes to claim that RT shadows somehow have no performance or VRAM impact and that it is justifiable to compare this with a PS5. Bravo.

Not accepting lower texture presets as a solution is YOUR problem, not the developers'. TEXTURE settings are there to accommodate lower VRAM budgets. End of DISCUSSION. A solution is there, but you ask for a different one.
You don't even own a PC GPU, yet you somehow feel entitled on behalf of PC gamers, when no one feels entitled to ultra textures on such 8 GB GPUs. Even I do not feel entitled. Everyone knew what they were getting into when they cashed in for an 8 GB GPU. You won't get 4K/ultra textures on such cards. Maybe at 1440p, but that will also end up problematic someday.
Not that High textures look bad. It is a solution. It can be used. It SHOULD be used. No one is entitled to anything.

Having 12 GB of VRAM + matching PS5 settings is enough to fully match console performance, without decompression or anything. 10 GB of VRAM is not going to be enough in all cases, since you only have 8.5-9.3 GB of it usable most of the time due to background work.
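As a back-of-envelope check (the ~1 GB background reservation is my estimate from monitoring, not any spec):

```python
# Toy VRAM headroom check: Windows + background apps hold a slice of VRAM
# before the game even starts (~1 GB assumed here).
GAME_BUDGET_GB = 10.0  # what 4K/ultra textures want, per the argument above

def usable_gb(card_gb: float, background_gb: float = 1.0) -> float:
    return card_gb - background_gb

for card in (8, 10, 12):
    verdict = "fits" if usable_gb(card) >= GAME_BUDGET_GB else "spills to RAM"
    print(f"{card} GB card -> ~{usable_gb(card):.0f} GB usable: {verdict}")
```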
 
Last edited:

yamaci17

Member
It would actually be even worse if they were using the 3600G/3700X (PS5-equivalent CPU) instead of the 5900X.

No, it wouldn't be worse, because as I've said countless times, a 3700X can practically render 60+ FPS almost all the time in Spider-Man with ray tracing enabled. The video in question is fully GPU-bound at around 40-50 FPS instead.
 

ChiefDada

Gold Member
This is plain wrong. Ray traced shadows CAN be costly, and they also have an additional VRAM cost. If you don't have an RTX card and have not played around with such settings before, please stay out of this discussion.

[Imgsli comparison]
At 1080p;
No ray tracing: 97 FPS, 4.6 GB VRAM
Ray traced reflections: 81 FPS (a ~16% performance hit), 5.3 GB VRAM (a whopping 700 MB increase)
Reflections + shadows: 64 FPS (another ~21% performance hit on a 3070), 5.8 GB VRAM (another whopping 500 MB increase)

Why are you showing a menu screen? Show me gameplay. Don't be stubborn; we have the proof right here: he isolated RT reflections by turning off shadows only, and performance remained in the gutter with no change in VRAM usage.

 

rofif

Can’t Git Gud
No, PC cards just need more VRAM to match the PS5, that's about it.
As I said, you're really overcomplicating the issue. This is like asking devs to find special ways to work around the issues that 2 GB GTX 770s and similar cards experienced later into the generation. The solution was to play with frame drops, or super garbage textures, or sometimes something in between. That's about it.

Imagine being entitled to ultra textures in RDR2 with a 2 or 3 GB card (it needs 4 GB of VRAM), then asking Rockstar to implement memory compression for the 2 GB audience.

Nope, you're going to take lower-quality textures and be happy about it. If not, you're free to upgrade to a card with more VRAM. That's how things work on PC: you're responsible for your own purchases and decisions, and no one is entitled to anything.

A GTX 780 only gave 2-3 years of meaningful performance; no one was entitled to get 6+ years out of that once-flagship GPU.
A GTX 980 has practically lasted the entire generation, now in its 8th year, and still runs a lot of games decently. Again, it's just luck of the draw.
Yeah, agree on VRAM.
The only game that has problems, loses performance over time, and crashes for me is Resident Evil 2 running maxed at 4K with RT. The game is very VRAM-heavy.
 

yamaci17

Member
Why are you showing a menu screen? Show me gameplay. Don't be stubborn; we have the proof right here: he isolated RT reflections by turning off shadows only, and performance remained in the gutter with no change in VRAM usage.


If I show you my own gameplay proof, will you shut up and go away? You and your entourage of PC doomsayers are the ones who are stubborn and fail to understand.

That video explains nothing. It is quite possible the engine did not decide to dump the unnecessary VRAM load. Or the VRAM change is simply lost in transition because the game could've decided to stream in more textures when it found space in VRAM. You've got to go from no ray tracing, to ray traced reflections, and then to reflections + shadows. I can do that in a video too. Then you will find another aspect to nitpick. It's up to you.
 
Last edited:

ChiefDada

Gold Member
If I show you my own gameplay proof, will you shut up and go away?

Lol, depends on what you show. Reminder you can also choose to simply leave the conversation and not be sour about the situation. It's just video games.

Yeah, agree on VRAM.
The only game that has problems, loses performance over time, and crashes for me is Resident Evil 2 running maxed at 4K with RT. The game is very VRAM-heavy.

Even without VRAM hitting max, we see the PC cards perform below the console. It's heavy on the CPU because of decompression, and it happens to be much heavier on MM, as I suspected it would be. PC folks pitch a fit about it when in reality they should be forcing developers, vendors, and Microsoft to sort out DirectStorage.
 

rofif

Can’t Git Gud
Lol, depends on what you show. Reminder you can also choose to simply leave the conversation and not be sour about the situation. It's just video games.



Even without VRAM hitting max, we see the PC cards perform below the console. It's heavy on the CPU because of decompression, and it happens to be much heavier on MM, as I suspected it would be. PC folks pitch a fit about it when in reality they should be forcing developers, vendors, and Microsoft to sort out DirectStorage.
We've got very powerful CPUs, so at least they are being used... somehow.
 

yamaci17

Member
Lol, depends on what you show. Reminder you can also choose to simply leave the conversation and not be sour about the situation. It's just video games.



Even without VRAM hitting max, we see the PC cards perform below the console. It's heavy on the CPU because of decompression, and it happens to be much heavier on MM, as I suspected it would be. PC folks pitch a fit about it when in reality they should be forcing developers, vendors, and Microsoft to sort out DirectStorage.
"Lol, depends on what you show"

Nope. You state the conditions first, so we can understand your sophisticated view on the matter. Otherwise it will be a cat-and-mouse game here with no end in sight.
 

ChiefDada

Gold Member
We've got very powerful CPUs, so at least they are being used... somehow.

Yes, both the CPU and GPU are much more powerful. And of course there's more memory. But the CPU should be doing CPU-friendly tasks, not decompression. It's a severe inefficiency that can be substantially mitigated by DirectStorage if the PC crowd puts certain companies' feet to the fire.
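To illustrate the kind of cost I mean, here's a toy streaming budget; both throughput figures are assumptions for illustration, not measurements:

```python
# Toy model: CPU cores eaten by asset decompression during fast traversal.
stream_gbps = 1.0        # assumed compressed data streamed per second while swinging
core_inflate_gbps = 0.5  # assumed single-core CPU decompression throughput

cores_busy = stream_gbps / core_inflate_gbps
print(f"~{cores_busy:.1f} CPU cores spent just inflating assets")  # ~2.0
# GPU decompression (e.g. DirectStorage 1.1's GDeflate path) moves that
# work off the CPU entirely.
```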

"Lol, depends on what you show"

Nope. You state the conditions first, so we can understand your sophisticated view on the matter. Otherwise it will be a cat-and-mouse game here with no end in sight.

I only said that because you started off terribly, with screenshots of performance measurements/comparisons from a menu screen. You should know better; that's not going to fly in any legitimate tech discussion. There's no risk of a cat-and-mouse game; we can be done here. You should never force yourself into any dialogue you deem unproductive. That's torture. Just walk away; it's your prerogative.
 

yamaci17

Member
Yes, both the CPU and GPU are much more powerful. And of course there's more memory. But the CPU should be doing CPU-friendly tasks, not decompression. It's a severe inefficiency that can be substantially mitigated by DirectStorage if the PC crowd puts certain companies' feet to the fire.



I only said that because you started off terribly, with screenshots of performance measurements/comparisons from a menu screen. You should know better; that's not going to fly in any legitimate tech discussion. There's no risk of a cat-and-mouse game; we can be done here. You should never force yourself into any dialogue you deem unproductive. That's torture. Just walk away; it's your prerogative.


VRAM consumption increases by 500 MB (more as I traversed).
Performance takes an 18-22% drop.

Now you can search for your next goalpost. Good luck.

If RT shadows increase VRAM consumption in the menu screen, there's no logic in denying they would also increase VRAM consumption whenever ray traced shadows are being cast. You're simply nitpicking. This is not even a legitimate tech discussion, since you don't even know what kind of cost/memory footprint ray traced shadows have. I've been playing countless games with ray traced shadows and other mixed ray traced effects for 2 years now; at least I actually have the experience and knowledge of how they impact the resources I have, and I can speak from experience. I cannot say the same for you, considering you nitpicked a video segment where the GPU's VRAM buffer is under full pressure and any change in graphical settings will not properly represent the VRAM impact it might have.

How can this discussion not go sour when you have a clear agenda on this specific topic and retell and recount misinformed opinions over and over and over again? Sorry, but this is a discussion forum, and if you have a misinformed opinion, you're bound to be corrected. I'm not even discussing anything with you. I'm merely stating facts with factual proof and correcting your misinformed opinions. Nothing more, nothing less.

You're overcomplicating things and deducing wrong conclusions. That is ALL. The game is not using any super sauce magic on PS5. It literally runs how it should when you have enough VRAM budget (10 GB, and you need a 12 GB card to make sure the game can allocate an uninterrupted 10 GB to itself; the 10 GB GPUs themselves do not count).

The game is literally designed with a 10 GB VRAM budget in mind, on PS5. If you have that, you won't have problems. The CPU requirements could be higher. I believe you're one of those people who think the PS5's SSD somehow acts as a VRAM buffer. Dude, you're not comprehending this clearly. No SSD can ever be a substitute for VRAM. You can stream textures, but that will only lead to slower texture loading on PC; that is quite literally what the benefit of the PS5's streaming system is. It will load textures in mere seconds, while on PC it could take 3-10 seconds depending on the region. THAT'S it. The only thing DirectStorage would help with is THAT, not the performance itself.

performance itself is related to how low your VRAM is.
[Benchmark chart: 4K and 4K + DLSS results for the 3060, 3070, 3080, and 3090 Ti]
Take a GOOD look at this chart. At 4K, the 3070 is only 28% faster than the 3060 DUE TO VRAM PRESSURE creeping in.

Or look at the RTX 3080: it is 40% slower than the 3090 Ti at NATIVE 4K. DLSS reduces the VRAM pressure, and all of a sudden it is only 8% slower.

The RTX 3080 lost nearly 30% of its performance due to VRAM pressure, and the video you've provided is full OF it; it is not representative of actual 3080 performance. Neither DirectStorage nor a PS5-like sophisticated STREAMING system on PC would solve this. THE GAME still demands a clear, uninterrupted 10 GB budget for 4K/ultra textures. What part of this do you not understand? Once you provide that, performance becomes supreme, and it is where you'd expect it from these GPUs.

The game already uses RAM as a substitute for VRAM. This is why the performance is slower. Somehow you believe that using the SSD as a substitute for VRAM (tens of times slower than RAM) would solve the issue or be a remedy for it. IT would not. IT WOULD even be worse. You either have enough budget or you don't. You either take the perf hit or you play around it.

The game is not reducing its memory footprint requirements on PS5 by utilizing the SSD. It merely utilizes the SSD so that textures can be loaded instantly and seamlessly. This is what PC ports have been lacking: it really takes some serious time for some textures to load, both in cutscenes and in in-game traversal. This is a solution most devs come up with, and we call it texture streaming. The PS5 is simply super fast at it. The PS5 is not using the SSD to reduce VRAM requirements; the VRAM requirements are still there.

The solution is to reduce VRAM pressure if you're VRAM-bound. DLSS helps with that. Even ray traced shadows have their own VRAM cost (as evidenced above). You can reduce textures as a last resort; if you have to, you gotta. THERE's no other way.
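If it helps, here's the shape of the VRAM-pressure effect as a toy model; the per-GB spill penalty is an invented number, only the direction matters:

```python
# Toy model: once the working set spills past usable VRAM, the overflow is
# served from system RAM over PCIe, and each spilled GB costs performance.
def effective_fps(base_fps: float, working_set_gb: float, vram_gb: float,
                  penalty_per_gb: float = 0.15) -> float:
    spill = max(0.0, working_set_gb - vram_gb)
    return base_fps * (1.0 - penalty_per_gb) ** spill

print(effective_fps(80, working_set_gb=11.0, vram_gb=10))  # 10 GB card: spills, slows
print(effective_fps(80, working_set_gb=11.0, vram_gb=12))  # 12 GB card: full speed
# DLSS shrinks the working set itself (smaller internal render targets),
# which is why the 3080's gap to the 3090 Ti collapses once it's enabled.
```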




I won't reply any further. You can do whatever you want with this information.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
Alex Battaglia is reporting that the PS5 version uses setting(s) below Very Low (I assume only in Miles Morales).

Also, I noticed that when I play around with the settings, sometimes the performance drops massively and I have to reboot for it to get back to normal.
 
Lol, my point still stands. A last-gen game is maxing out the VRAM limits of high-end PC cards. RT shadows look great on PC, but they are not the main reason performance is relatively low considering GPU compute specs. Even at the 5min mark of the video, where he is playing at 1440p with High RT, he is averaging in the 70s (occasionally dipping to 60 and below) with 8.5 GB VRAM usage, whereas PS5 1440p Performance RT, WITH the highest-quality textures, remains comfortably in the 80-90fps range.

All I'm saying is PC cards need to address the issue of memory management and asset decompression to perform well this gen. Otherwise that compute power goes to waste. It's obvious the PS5 is streaming data in and out and putting hardware decompression to work. The issue is much more pronounced in MM compared to SM Remastered, as I predicted.

yamaci17 thoughts?
Performance RT mode on PS5 spends the majority of its time at an internal resolution of 1080p, not 1440p. It also has far lower geometry density/LoD compared to PC High settings and the PS5 Quality mode, as well as lower RT settings.

And the 3080 had notoriously low VRAM even by 2020 standards; my 2080 Ti from 2018 had more than that. It was the source of many arguments about whether it had enough VRAM to last through this gen, at least for 4K gaming with high settings.

Regardless, there are better-looking open-world games on PC with higher-resolution textures, so this isn't an issue with the tech, but with the engine design/port. Let's not forget that Miles runs on a Jaguar CPU and an HDD with low memory bandwidth on a PS4 Pro and has no problem pulling those textures through, even if some of them are slightly lower in resolution compared to PS5.
Alex Battaglia is reporting that the PS5 version uses setting(s) below Very Low (I assume only in Miles Morales).

Also, I noticed that when I play around with the settings, sometimes the performance drops massively and I have to reboot for it to get back to normal.
The latter was an issue with Spider-Man too - if you change any settings, restart the game, otherwise performance is degraded.
 
Last edited:

ChiefDada

Gold Member
yamaci17 if you're going to edit your comments with retorts 30 minutes after the original, at least have the decency to tag me.

But yeah, I see why you are approaching this in such a weirdly hostile manner:

I believe you're one of those people who think the PS5's SSD somehow acts as a VRAM buffer. Dude, you're not comprehending this clearly. No SSD can ever be a substitute for VRAM. You can stream textures, but that will only lead to slower texture loading on PC.

... and you are absolutely wrong; I do not believe any of this. You're the one bringing up the SSD, and I'm not even sure why in this particular case. Did you not see where I said asset decompression is likely a culprit here even when within the VRAM budget?

The solution is to reduce VRAM pressure if you're VRAM-bound.

Do you not realize what you typed? "The solution is to reduce textures, native resolution, RT, aka the settings that affect fidelity/graphical quality, if you are VRAM-bound."

[Confused Little Girl GIF]



Unfortunately, I had forgotten that NXG tried to get through to you on this exact issue, but you were hard-headed then as well:

I get frustrated when people attack facts with no logic. Your argument is, "Well, if the GPU had more VRAM than it does, it would be performing better!"
Well yeah, of course. This argument (which it clearly is not) is like saying my Fiesta would be able to beat a Porsche if it had a Ferrari engine. I see this very pigeonholed logic a great deal in comments, and it misses the point of these tests and how tests should be.
 

yamaci17

Member
Yeah, not being able to give a solid answer back and moving the goalposts. Just as I expected. I'm done answering you; clearly it is going nowhere, with your bias and agenda about memory compression on PS5 being the key to its good performance, and then claiming it is not that important or whatever. You're not even consistent with your own arguments, lmao.

You claimed ray traced shadows have no memory cost, then, when proven wrong, you dodged the fact. This alone shows malicious intent. It's really interesting how much power the PS5 has over some people like you, that it can turn you into literal dodge masters.
 
Last edited:

ChiefDada

Gold Member
Performance RT mode on PS5 spends the majority of its time at an internal resolution of 1080p, not 1440p. It also has far lower geometry density/LoD compared to PC High settings and the PS5 Quality mode, as well as lower RT settings.

If you could provide support that Performance RT spends most of its time at 1080p, I would appreciate it. My recollection is that it spends the majority of its time at 1440p. In regards to RT settings, I'm going off of the DF video for SM Remastered.

[DF settings comparison chart for SM Remastered]


And the 3080 had notoriously low VRAM even by 2020 standards; my 2080 Ti from 2018 had more than that. It was the source of many arguments about whether it had enough VRAM to last through this gen, at least for 4K gaming with high settings.

Even without VRAM saturation, there is the issue of CPU resources being used for asset decompression, which is much more pronounced in Miles Morales because the asset quality is even higher. My point is that this is a PS4 game at heart that is causing performance issues due to a memory bottleneck, whether it be VRAM or data movement. How do you suppose a 30-series GPU will run Spider-Man 2, which will inevitably have even larger assets, crazier set pieces, the character swinging through the city twice as fast, etc.? I'm not here to shit on PC, just explaining why it's important to consider before a purchase. There are countless 3070/3080 owners who assumed their purchase would get them through the entire generation, as you're saying here.

Regardless, there are better-looking open-world games on PC with higher-resolution textures, so this isn't an issue with the tech, but with the engine design/port. Let's not forget that Miles runs on a Jaguar CPU and an HDD with low memory bandwidth on a PS4 Pro and has no problem pulling those textures through, even if some of them are slightly lower in resolution compared to PS5.

Texture and geometry quality is much higher in the PS5 version of MM.
 

SmokedMeat

Gamer™
How do you suppose a 30-series GPU will run Spider-Man 2, which will inevitably have even larger assets, crazier set pieces, the character swinging through the city twice as fast, etc.? I'm not here to shit on PC, just explaining why it's important to consider before a purchase. There are countless 3070/3080 owners who assumed their purchase would get them through the entire generation, as you're saying here.

[Star Trek Bullshit GIF]


If a PS5 can run Spider-Man 2, then a 30-series GPU certainly will as well.

It’s nice that PS5 fans are concerned for us PC players. We’re fine though.
 

Synless

Gold Member
Interesting that on PS5, Miles Morales was, in multiple reviews across the board, considered inferior to Spider-Man Remastered with the exception of graphics and some tighter mechanics, but somehow with the PC release this changed and it scored higher than all versions of either game…

What changed?
 
Last edited:

ChiefDada

Gold Member
Yeah, not being able to give a solid answer back and moving the goalposts. Just as I expected. I'm done answering you; clearly it is going nowhere, with your bias and agenda about memory compression on PS5 being the key to its good performance, and then claiming it is not that important or whatever. You're not even consistent with your own arguments, lmao.

Not only is asset compression the key to the PS5 running at its specific settings, it is THE ONLY WAY it could, because of limited RAM. Where did I say it wasn't important? You're getting even stranger by the minute.

You claimed ray traced shadows have no memory cost, then, when proven wrong, you dodged the fact. This alone shows malicious intent.

[Return Of The Jedi GIF]


It's really interesting how much power the PS5 has over some people like you,

[Power remake GIF]


that it can turn you into literal dodge masters.


[Star Wars jump GIF]
 

SmokedMeat

Gamer™
Interesting that on PS5, Miles Morales was, in multiple reviews across the board, considered inferior to Spider-Man Remastered with the exception of graphics and some tighter mechanics, but somehow with the PC release this changed and it scored higher than all versions of either game…

What changed?

Different reviewers?

I've only played Spider-Man on PS4, but that game has pretty bad side missions and bloat that I wasn't big on.

If Miles trims the fat and focuses more on being a straight Spider-Man game, then I can see myself enjoying it more.
 

Synless

Gold Member
Different reviewers?

I've only played Spider-Man on PS4, but that game has pretty bad side missions and bloat that I wasn't big on.

If Miles trims the fat and focuses more on being a straight Spider-Man game, then I can see myself enjoying it more.
It trimmed the fat, but all the fat was optional in Spider-Man Remastered... Miles Morales had a weaker story, was shorter all around, and had shit side missions like its full-fledged counterpart.
 

Alexios

Cores, shaders and BIOS oh my!
Seems nicely optimized: easy 30+ fps (mostly 40+, often hovering at 60, so I'm assuming if I lock to 30 it'll stay there and not suffer later) on an aging 3770K + GTX 1080 @ 1440p, with FSR set to Quality and otherwise the highest custom settings. Running off an HDD, mind; the first load is fast, no issues anywhere.
 
Last edited:

rofif

Can’t Git Gud


DF video.
Many, many redundant settings. Sure, you can change weather to low, medium, high, or very high... but why? There is no performance impact.
Imo PC games should not do that. Don't offer lower graphics options if they don't bring a performance improvement.
 


DF video.
Many, many redundant settings. Sure, you can change weather to low, medium, high, or very high... but why? There is no performance impact.
Imo PC games should not do that. Don't offer lower graphics options if they don't bring a performance improvement.

He nailed it. There is no magic being done in the PS5 Performance mode, as crowd and traffic density is set to the lowest possible to maintain FPS.

So in general, the AMD 3600 CPU matches the PS5's CPU in performance mode in this game.
 

Buggy Loop

Member


DF video.
Many, many redundant settings. Sure, you can change weather to low, medium, high, or very high... but why? There is no performance impact.
Imo PC games should not do that. Don't offer lower graphics options if they don't bring a performance improvement.


This. I never understood a setting where going from ultra to low is a 1% performance hit. Red Dead Redemption 2 is full of those settings. It probably eats up GBs of storage to store all these different assets, for no gain in performance.
 

rofif

Can’t Git Gud
He nailed it. There is no magic being done in the PS5 Performance mode, as crowd and traffic density is set to the lowest possible to maintain FPS.

So in general, the AMD 3600 CPU matches the PS5's CPU in performance mode in this game.
Yeah, makes sense; the consoles use a slightly worse 3700X.
Disappointing there is not much coding-to-the-metal done on PS5.
 
Lol, depends on what you show. Reminder you can also choose to simply leave the conversation and not be sour about the situation. It's just video games.



Even without VRAM hitting max, we see the PC cards perform below the console. It's heavy on the CPU because of decompression, and it happens to be much heavier on MM, as I suspected it would be. PC folks pitch a fit about it when in reality they should be forcing developers, vendors, and Microsoft to sort out DirectStorage.

It's amazing that no matter how many times it's been debunked, people continue to push the super-fast SSD narrative. It's pathetic.

 
Last edited:

SmokedMeat

Gamer™


DF video.
Many, many redundant settings. Sure, you can change weather to low, medium, high, or very high... but why? There is no performance impact.
Imo PC games should not do that. Don't offer lower graphics options if they don't bring a performance improvement.


Looks like a decent port, and I’m glad to see my current CPU doing well. I’m expecting a nice boost though as I plan on upgrading to a new CPU/DDR5 this Christmas.
 
Last edited:

ChiefDada

Gold Member


DF video.
Many, many redundant settings. Sure, you can change weather to low, medium, high, or very high... but why? There is no performance impact.
Imo PC games should not do that. Don't offer lower graphics options if they don't bring a performance improvement.


This video has got to be a joke. A 2070 Super running 1080p at 60-70fps with optimized settings that are overall lower than the PS5's (including lower RT object range) = console parity???

[Settings comparison screenshots]



And then the icing on the cake is him saying textures failing to stream in is just a tiny matter, lol, squeezing that "insignificant" bit in at the end. So if you choose the very high texture preset and the textures don't actually stream in, are you actually running with very high textures? Or did you just click your mouse?
 

yamaci17

Member
- Compares the PS5's performance mode to the 3080 with super omega maxed settings with ray traced shadows on top and calls it fair; claims ray traced shadows have no performance impact anyway, so thinks it is okay to compare the one with the other with shadows enabled; assumes ultra maxed settings do not create an unfair performance disadvantage for the 3080 (completely disregards how changing settings on the fly is not always reflected in the performance or the performance metrics in general; claims a random video found on the net is supreme proof that the 3080 sucks)

- Moans about one or two settings being lower on optimized settings compared to the PS5's performance mode and how it is insanely unfair to compare the PS5 to a 2070 Super with such settings (time and time again: RT object range heavily affects the CPU, not the GPU. Ray traced performance between 6 and 8 is practically the same on similar GPUs; most of the work is already being done by the GPU, and you only need a beefier CPU for more objects to be reflected/drawn)

- Logic = left the chat
 
Last edited:

ChiefDada

Gold Member
- Compares the PS5's performance mode to the 3080 with super omega maxed settings with ray traced shadows on top and calls it fair; claims ray traced shadows have no performance impact anyway, so thinks it is okay to compare the one with the other with shadows enabled; assumes ultra maxed settings do not create an unfair performance disadvantage for the 3080 (completely disregards how changing settings on the fly is not always reflected in the performance or the performance metrics in general; claims a random video found on the net is supreme proof that the 3080 sucks)

I know this is the type of discussion you want to have and you want me to be this person that hates PC. Sorry to disappoint you.

- Moans about one or two settings being lower on optimized settings compared to the PS5's performance mode and how it is insanely unfair to compare the PS5 to a 2070 Super with such settings (time and time again: RT object range heavily affects the CPU, not the GPU. Ray traced performance between 6 and 8 is practically the same on similar GPUs; most of the work is already being done by the GPU, and you only need a beefier CPU for more objects to be reflected/drawn)

- Logic = left the chat

Lol, you might want to take your time and re-read what I said, because you literally have it backwards. Go ahead, take another whack at it.
 
No, it wouldn't be worse, because as I've said countless times, a 3700X can practically render 60+ FPS almost all the time in Spider-Man with ray tracing enabled. The video in question is fully GPU-bound at around 40-50 FPS instead.
Oh hey, what's up dude, hope you've been well; we haven't talked in a bit. I was saying that, especially because of how RT works in this game, having a much better CPU is helping to some degree even at 4K. It wouldn't shock me if it loses 5 frames on average going down to a PS5-equivalent CPU.
 


VRAM consumption increases by 500 MB (more as I traversed).
Performance takes an 18-22% drop.

Now you can search for your next goalpost. Good luck.

If RT shadows increase VRAM consumption in the menu screen, there's no logic in denying they would also increase VRAM consumption whenever ray traced shadows are being cast. You're simply nitpicking. This is not even a legitimate tech discussion, since you don't even know what kind of cost/memory footprint ray traced shadows have. I've been playing countless games with ray traced shadows and other mixed ray traced effects for 2 years now; at least I actually have the experience and knowledge of how they impact the resources I have, and I can speak from experience. I cannot say the same for you, considering you nitpicked a video segment where the GPU's VRAM buffer is under full pressure and any change in graphical settings will not properly represent the VRAM impact it might have.

How can this discussion not go sour when you have a clear agenda on this specific topic and retell and recount misinformed opinions over and over and over again? Sorry, but this is a discussion forum, and if you have a misinformed opinion, you're bound to be corrected. I'm not even discussing anything with you. I'm merely stating facts with factual proof and correcting your misinformed opinions. Nothing more, nothing less.

You're overcomplicating things and deducing wrong conclusions. That is ALL. The game is not using any super sauce magic on PS5. It literally runs how it should when you have enough VRAM budget (10 GB, and you need a 12 GB card to make sure the game can allocate an uninterrupted 10 GB to itself; the 10 GB GPUs themselves do not count).

The game is literally designed with a 10 GB VRAM budget in mind, on PS5. If you have that, you won't have problems. The CPU requirements could be higher. I believe you're one of those people who think the PS5's SSD somehow acts as a VRAM buffer. Dude, you're not comprehending this clearly. No SSD can ever be a substitute for VRAM. You can stream textures, but that will only lead to slower texture loading on PC; that is quite literally what the benefit of the PS5's streaming system is. It will load textures in mere seconds, while on PC it could take 3-10 seconds depending on the region. THAT'S it. The only thing DirectStorage would help with is THAT, not the performance itself.

performance itself is related to how low your VRAM is.
[Benchmark chart: 4K and 4K + DLSS results for the 3060, 3070, 3080, and 3090 Ti]
Take a GOOD look at this chart. At 4K, the 3070 is only 28% faster than the 3060 DUE TO VRAM PRESSURE creeping in.

Or look at the RTX 3080: it is 40% slower than the 3090 Ti at NATIVE 4K. DLSS reduces the VRAM pressure, and all of a sudden it is only 8% slower.

The RTX 3080 lost nearly 30% of its performance due to VRAM pressure, and the video you've provided is full OF it; it is not representative of actual 3080 performance. Neither DirectStorage nor a PS5-like sophisticated STREAMING system on PC would solve this. THE GAME still demands a clear, uninterrupted 10 GB budget for 4K/ultra textures. What part of this do you not understand? Once you provide that, performance becomes supreme, and it is where you'd expect it from these GPUs.

The game already uses RAM as a substitute for VRAM. This is why the performance is slower. Somehow you believe that using the SSD as a substitute for VRAM (tens of times slower than RAM) would solve the issue or be a remedy for it. IT would not. IT WOULD even be worse. You either have enough budget or you don't. You either take the perf hit or you play around it.

The game is not reducing its memory footprint requirements on PS5 by utilizing the SSD. It merely utilizes the SSD so that textures can be loaded instantly and seamlessly. This is what PC ports have been lacking: it really takes some serious time for some textures to load, both in cutscenes and in in-game traversal. This is a solution most devs come up with, and we call it texture streaming. The PS5 is simply super fast at it. The PS5 is not using the SSD to reduce VRAM requirements; the VRAM requirements are still there.

The solution is to reduce VRAM pressure if you're VRAM-bound. DLSS helps with that. Even ray traced shadows have their own VRAM cost (as evidenced above). You can reduce textures as a last resort; if you have to, you gotta. THERE's no other way.




I won't reply any further. You can do whatever you want with this information.

Can it not be possible that the game is BOTH VRAM-heavy and also heavier on PC by default, with the two issues compounding each other? Because it seems like you both are taking an either/or approach.
 
Alex Battaglia is reporting that the PS5 version uses setting(s) below Very Low (I assume only in Miles Morales).

Also, I noticed that when I play around with the settings, sometimes the performance drops massively and I have to reboot for it to get back to normal.
I would doubt any settings are below Very Low, especially considering there are PS4 versions.
 
Performance RT mode on PS5 spends the majority of its time at an internal resolution of 1080p, not 1440p. It also has far lower geometry density/LoD compared to PC High settings and the PS5 Quality mode, as well as lower RT settings.

And the 3080 had notoriously low VRAM even by 2020 standards; my 2080 Ti from 2018 had more than that. It was the source of many arguments about whether it had enough VRAM to last through this gen, at least for 4K gaming with high settings.

Regardless, there are better-looking open-world games on PC with higher-resolution textures, so this isn't an issue with the tech, but with the engine design/port. Let's not forget that Miles runs on a Jaguar CPU and an HDD with low memory bandwidth on a PS4 Pro and has no problem pulling those textures through, even if some of them are slightly lower in resolution compared to PS5.

The latter was an issue with Spider-Man too - if you change any settings, restart the game, otherwise performance is degraded.

Can I guess that Alex didn't unlock the framerate on the PS5 when doing the tests?
 
If you could provide support that Performance RT spends most of its time at 1080p, I would appreciate it. My recollection is that it spends the majority of its time at 1440p. In regards to RT settings, I'm going off of the DF video for SM Remastered.

[DF settings comparison chart for SM Remastered]




Even without VRAM saturation, there is the issue of CPU resources being used for asset decompression, which is much more pronounced in Miles Morales because the asset quality is even higher. My point is that this is a PS4 game at heart that is causing performance issues due to a memory bottleneck, whether it be VRAM or data movement. How do you suppose a 30-series GPU will run Spider-Man 2, which will inevitably have even larger assets, crazier set pieces, the character swinging through the city twice as fast, etc.? I'm not here to shit on PC, just explaining why it's important to consider before a purchase. There are countless 3070/3080 owners who assumed their purchase would get them through the entire generation, as you're saying here.



Texture and geometry quality is much higher in the PS5 version of MM.
NX Gamer's settings list for Spider-Man is more accurate than Alex's, because Alex himself could not figure out what 4 of the settings did, while NX Gamer went to the developers to confirm how the settings worked. Don't know why people keep referring to DF's settings list when it's both older and straight-up inaccurate compared to NX Gamer's.
 


DF video.
Many, many redundant settings. Sure, you can change weather to low, medium, high, or very high... but why? There is no performance impact.
Imo PC games should not do that. Don't offer lower graphics options if they don't bring a performance improvement.

Did Alex do the thing where he has a settings list but then couldn't explain what 4 of the settings did, like in Spider-Man Remastered?
 