
Can Ratchet and Clank: Rift Apart on PC match - and exceed - the PS5 experience?

Buggy Loop

Member
So to summarize, the only real issue is the BVH calculations, since nobody else reported the same issues DF did, but rather the opposite with textures?

Seems like it

So probably a patch or two and it'll be fixed

Might be a reason why AMD ray tracing crapped out so close to launch; it seems broken.
 

LiquidMetal14

hide your water-based mammals
The game is a visual joy to behold. And great fun. My kids love it. I have like 9 games on their account due to not letting them have access to violent stuff. This is one of them lol

[Rift Apart screenshots]
 

twilo99

Gold Member
No, please, not this.

The problem with PC isn't the hardware. It's that no one can be bothered to properly utilize it, because since time immemorial the answer has been "get better hardware".

You see hardware that should dust the PS5 lagging behind it. Your first thought shouldn't be about new hardware to compensate; it should be about how to actually utilize the power of what we have. Now there are major bottlenecks, and just getting faster components isn't going to cut it. We need to come up with solutions that take advantage of those expensive components.

Hell, isn't it crazy that we've had SSDs for over a decade and it's just now that even a basic-bitch 500MB/s SATA drive is truly being taxed? Awesome that we have those 10GB/s drives geared toward gaming, but for what exactly? To be constrained by the ancient software stacks powering these archaic I/O solutions?

DirectStorage is a step in the right direction. They need to do more to leverage the power of the hardware.

Yep, imagine a game optimized for a 4090 build as thoroughly as this game was optimized for the PS5, then try to run that on the PS5 and see what happens.
 

Kakax11

Banned
What kind of setup is involved? I have 2 overlays but am unfamiliar with how yours work together. I use fpsmonitor for my needs.

It used to require an MSI Afterburner + RTSS setup, but now with the addition of CapFrameX you only need to figure out CapFrameX, plus the font style, size, and alignment in RTSS.

[overlay screenshot]



if you want to go very deep into the overlay

 

ChiefDada

Gold Member
No, he didn't. He said they will LIKELY see a difference, because he noticed the saturation difference between Gen2 and Gen3 for the same game. He then compared the limited difference of MM between Gen2 and Gen3.

Update - Definitely not VRAM bound. I hope Alex has a discussion with Nixxes similar to their talks over Spider-Man so we can learn how memory management works for the PC port. Overall though, it looks to be a pretty good experience on PC.

 

Buggy Loop

Member
Update - Definitely not VRAM bound. I hope Alex has a discussion with Nixxes similar to their talks over Spider-Man so we can learn how memory management works for the PC port. Overall though, it looks to be a pretty good experience on PC.



Still, what the hell, it's bandwidth hungry

There was talk on Reddit that the game has a memory leak? Dropping textures to medium and back to very high would restore performance. Maybe a patch already fixed this; I was reading that last night.
 

Chronicle

Member
1. Math says yes.
2. Everyone will say it will be amazing.
3. It runs like trash and everyone is upset.

Brought to you by historical data
 

64bitmodels

Reverse groomer.
No, please, not this.

The problem with PC isn't the hardware. It's that no one can be bothered to properly utilize it, because since time immemorial the answer has been "get better hardware".

You see hardware that should dust the PS5 lagging behind it. Your first thought shouldn't be about new hardware to compensate; it should be about how to actually utilize the power of what we have. Now there are major bottlenecks, and just getting faster components isn't going to cut it. We need to come up with solutions that take advantage of those expensive components.

Hell, isn't it crazy that we've had SSDs for over a decade and it's just now that even a basic-bitch 500MB/s SATA drive is truly being taxed? Awesome that we have those 10GB/s drives geared toward gaming, but for what exactly? To be constrained by the ancient software stacks powering these archaic I/O solutions?

DirectStorage is a step in the right direction. They need to do more to leverage the power of the hardware.
Why Don't We Have Both GIF


Mobos and CPUs with new I/O solutions inspired by the PS5 and better software stacks to take advantage of the new hardware. We could go a long way with this stuff.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter


Towards the end he says the PlayStation 5 is running the game at approximately the level of an RTX 2080 Ti. Impressive for the price of the console.

Too bad he doesn't have the data to back this up and simply scales off his shitty RTX 2070 OC that's like 50% slower than a 2080 Ti.

He could be correct for all we know but testing is damn near impossible because settings cannot be matched and DRS throws a wrench into the comparisons.
 
Last edited:

Zuzu

Member
Too bad he doesn't have the data to back this up and simply scales off his shitty RTX 2070 OC that's like 50% slower than a 2080 Ti.

He could be correct for all we know but testing is damn near impossible because settings cannot be matched and DRS throws a wrench into the comparisons.
Yeah, it would be good if he had benchmarked a 2080 Ti
 

Dice

Pokémon Parentage Conspiracy Theorist
Of course. Windows on NVMe SSDs loads in a second. It's amazing.

I've moved Windows installs from HDD to SSD before. It's really not that bad. Just look up some videos on YouTube. The hardest part will be going into the BIOS to change the boot order to the SSD. The Windows installer takes care of everything else.
It is actually an ordeal I am going through right now because I had Optane memory, so I have to disable that while retaining user data, which basically means reconfiguring the whole 2TB. The BIOS is doing that on a frozen screen (but the HDD is crunching away, so I presume it is doing the work and will unfreeze).

If all goes to plan I will not lose all my data (which would happen if I didn't disable Optane first) and will be able to switch the memory out for a Windows move.
 

Guilty_AI

Gold Member
Yeah, it would be good if he had benchmarked a 2080 Ti
AND started making his tests with a halfway decent, up-to-date CPU instead of that 5-year-old Ryzen 2700 of his. That is the sole reason why I don't consider his benchmarks a good reference.
 
Last edited:

theclaw135

Banned
To say that their I/O tech was needed for the game to come out was pretty disingenuous, but to say that it runs fine on an HDD is also false. It may run, but in no way would anyone say it's a great experience and a good time. We could argue semantics about what "fine" means, and I'd say Sony/Insomniac kinda got caught in a lie to a small degree, but in no way would playing that on an HDD be a good time compared to the intended experience.

Either way, watching some of you defend or attack either side of this argument has been fascinating. PC gamers were giving plaudits to SSDs and their benefits well before they entered the mainline console discussion, and people loved to dismiss the impact then. It's funny to see it all come full circle.

People are forgetting that Sony put SSDs into 100 percent of PS5s ever sold.
Developers of PS5 exclusives don't even have to consider the possibility that the game will ever be run from an HDD.
 

Hoddi

Member
Thrashing the PCI-e bus? Why would ray tracing traverse the PCI-e bus? That sounds like a pretty big bottleneck. Do you mean RT would slow down the decompression?
That's not really what I meant. What I meant was that enabling RT adds to VRAM requirements, which can push you over your VRAM capacity. If you have an 8GB GPU and your game needs 7GB, you're probably going to be fine. But if you then enable RT and it adds, say, 1.5GB, you are now 0.5GB over budget. That remaining 0.5GB gets stored in system RAM under 'Shared GPU memory' and thrashes back and forth across the PCIe bus every frame. It's a fair bit more complex in reality, but that's the gist of it.
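As a toy illustration of that budget math (all numbers made up for the example, not measured from the game):

```python
# Toy sketch of the VRAM-overflow idea above. All numbers are
# illustrative assumptions, not measurements from Rift Apart.
VRAM_CAPACITY_GB = 8.0   # e.g. an 8GB card like a 3070
GAME_BASELINE_GB = 7.0   # hypothetical working set without RT
RT_OVERHEAD_GB = 1.5     # hypothetical extra for BVH/RT structures

overflow = max(0.0, GAME_BASELINE_GB + RT_OVERHEAD_GB - VRAM_CAPACITY_GB)
print(f"Over budget by {overflow:.1f} GB")  # -> 0.5 GB

# That overflow lands in 'Shared GPU memory' (system RAM). Worst case,
# if all of it were touched every frame at 60 fps:
print(f"~{overflow * 60:.0f} GB/s of PCIe traffic vs ~16 GB/s on PCIe 3.0 x16")
```

Even that naive worst case lands near double what the bus can move in a second, which is why a small overflow hurts so much.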

This has little to do with DirectStorage decompression, which only uses a trivial amount of PCIe bandwidth by comparison. PCIe 3.0 x16 has a total bandwidth of ~16GB/s in each direction, and it's fairly common for well-performing games to transfer 3-4GB per second. R&C is only adding around a gigabyte on top of that at most, and much less on average.

How did you perform a read test on PS5? Where does that number come from?
This was already answered by hlm666 (thanks for that), but here's how it looks if you're curious. I just plug my PS5 NVMe drive into a USB adapter and connect it to my PC.

Here's my (then new) drive before running the test, and here it is after. The differential in Total Host Reads basically tells you all you need to know. And now that we can compare with a PC release, we know these numbers are accurate.
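If you want to reproduce this without a GUI tool, here's a rough sketch of the same measurement using smartmontools (a substitute for whatever SMART reader you prefer; the /dev/sda device path is an assumption for the USB adapter):

```python
# Read the NVMe "Data Units Read" counter via smartctl. Run this once
# before and once after the PS5 play session (the drive goes back into
# the console in between); the difference is your total host reads.
import re
import subprocess

def data_units_read(device: str) -> int:
    out = subprocess.run(["smartctl", "-a", device],
                         capture_output=True, text=True).stdout
    # NVMe SMART line looks like: "Data Units Read: 1,234,567 [632 GB]"
    m = re.search(r"Data Units Read:\s*([\d,]+)", out)
    return int(m.group(1).replace(",", ""))

# Per the NVMe spec, one data unit is 1000 * 512 bytes = 512,000 bytes.
units = data_units_read("/dev/sda")
print(f"Host reads so far: {units * 512_000 / 1e9:.1f} GB")
```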

Are you running 32GB of system RAM?
Yes, in a 9900K + 2080 Ti system. What did you have in mind?

Using NVMe, it seems it doesn't use much of it


[screenshot]
I've seen some confusion online about this, but the reason is that BypassIO makes your drives invisible to monitoring while DirectStorage is in use. Disk counters still work perfectly fine on anything that doesn't use DirectStorage, but it's currently not possible to measure DS performance on drives that use BypassIO.

There are ways to disable that in the Windows registry, but it's up to you whether you want to. I still haven't found the slightest performance benefit from BypassIO, but that could change at some point.
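If you want to check a volume's BypassIO state before touching the registry, newer Windows builds expose it through fsutil; a minimal sketch (assuming Windows 11 and an elevated prompt):

```python
# Query BypassIO status for a volume via Windows' fsutil.
# 'fsutil bypassIo state' is the documented query; run elevated.
import subprocess

result = subprocess.run(["fsutil", "bypassIo", "state", "C:\\"],
                        capture_output=True, text=True)
print(result.stdout or result.stderr)
```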
 

Md Ray

Member


How much lower? Would love to see a test.

I doubt this; those cards shouldn't reach the maximum bandwidth of PCIe 3 x16.
I could be wrong, I'll admit, as I haven't checked max memory bandwidth on PCIe 3 x16 vs. the different NVIDIA 30-series cards; I lost interest once you couldn't get one due to miners.

Just checked this, and it's the 3080 that is tested (capable of much higher bandwidth than a 3070): https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-pci-express-scaling/25.html

I call BS on that tweet.
Still call BS on that tweet?
To illustrate his point that there's a massive difference between a 3070 running on PCIe 4 vs. 3 (there isn't), he's actually showing the difference between a GPU on PCIe 2 x16 (equivalent to PCIe 3 x8) and PCIe 3 x16.

He makes no sense.
??
I'm pretty sure the guy is just wrong and running out of VRAM from having RT enabled. The smoking gun is that he's using Medium settings, which barely stream any data from disk.

I already ran some tests comparing disk reads during the intro sequence and here's how they stack up.

PS5: ~16GB
PC Very High: ~19GB
PC High: ~5GB
PC Medium: ~1.8GB

I'm running the game on Very High settings without RT, and it barely makes a blip on the bus load. He's running on Medium settings with RT, and it's completely thrashing his bus. Both of us are on PCIe 3.0, so it's clearly not DirectStorage when my game is pushing 10x more disk data through.
Quick question - were you testing with AMD or NVIDIA GPU?
Look at his power usage. The 3070 is a 220W card, but it's only drawing 120-150W in his screenshots. That strongly suggests it's underutilized.

'GPU utilization' is just a measure of work scheduled by Windows and doesn't really say anything about how busy the physical chip is. Encoding a video with NVENC will show the GPU being fully utilized, for example, but it only uses a tiny part of the chip. For all we know, it's just Windows issuing copy commands because he's running out of VRAM.
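If you want to watch for that pattern yourself, here's a rough sketch (my own illustration, assuming an NVIDIA card and nvidia-smi's documented query-gpu fields):

```python
# Log board power next to 'GPU utilization' to spot the mismatch
# described above: high utilization with low power draw suggests the
# chip is stalled (e.g. waiting on PCIe copies), not actually busy.
import subprocess
import time

QUERY = "power.draw,utilization.gpu,memory.used"

for _ in range(10):  # sample once per second for ~10 seconds
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True).stdout.strip()
    power_w, util, mem_mib = (v.strip() for v in out.split(","))
    print(f"{power_w} W | {util}% util | {mem_mib} MiB VRAM")
    time.sleep(1)
```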

Hard to say obviously. But I really doubt it's anything else since my own PCIe load is nothing remotely like his in this game.

This is an awesomely fascinating port. I love seeing all this PCIe 2v3v4. Finally a game which can flex and use these new advancements.

As some noted, we're going to need more custom HW to compensate. Something like the custom stuff on PS5.

This is the type of advancement I've been waiting for since all the talk about this started years ago.

Folks are sleeping on the importance of this release due to this new implementation.

Thrashing the PCI-e bus? Why would ray tracing traverse the PCI-e bus? That sounds like a pretty big bottleneck. Do you mean RT would slow down the decompression?

Doubt it makes that much of a difference.

Update - Definitely not VRAM bound. I hope Alex has a discussion with Nixxes similar to their talks over Spider-Man so we can learn how memory management works for the PC port. Overall though, it looks to be a pretty good experience on PC.



Still, what the hell, it's bandwidth hungry

There was talk on Reddit that the game has a memory leak? Dropping textures to medium and back to very high would restore performance. Maybe a patch already fixed this; I was reading that last night.
It is PCIe bandwidth heavy, confirmed! The GPU under Gen4 mode gets a big boost over Gen3 in this game.
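For context on the Gen3 vs. Gen4 numbers, the theoretical link bandwidths fall straight out of the standard signaling rates; this little script is just my illustration:

```python
# Theoretical per-direction PCIe bandwidth from standard signaling
# rates. Gen2 uses 8b/10b encoding; Gen3/Gen4 use 128b/130b.
RATES_GT = {2: 5.0, 3: 8.0, 4: 16.0}            # gigatransfers/s per lane
ENCODING = {2: 8 / 10, 3: 128 / 130, 4: 128 / 130}

def gb_per_s(gen: int, lanes: int) -> float:
    return RATES_GT[gen] * ENCODING[gen] / 8 * lanes  # bits -> bytes

for gen, lanes in [(2, 16), (3, 8), (3, 16), (4, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{gb_per_s(gen, lanes):.1f} GB/s")
# Gen2 x16 (~8 GB/s) equals Gen3 x8, and Gen4 x16 (~31.5 GB/s)
# doubles Gen3 x16 (~15.8 GB/s).
```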

 

lmimmfn

Member
Still call BS on that tweet?

??

Quick question - were you testing with AMD or NVIDIA GPU?

It is PCIe bandwidth heavy, confirmed! The GPU under Gen4 mode gets a big boost over Gen3 in this game.

They probably have Resizable BAR enabled on PCIe 4.
 
Review scores don't matter, but just remember that these bastards gave the game 2/10 when the PS5 launched:

 

Fess

Member
AND started making his tests with a halfway decent, up-to-date CPU instead of that 5-year-old Ryzen 2700 of his. That is the sole reason why I don't consider his benchmarks a good reference.
I don't think he's aiming for the high-end PC crowd; the way his comparisons go, it seems like he's aiming for console gamers considering getting into PC gaming. It's not a bad idea, but maybe not the best for PC gamers on core gamer boards.
 

yamaci17

Member
It increases the size of the addressable GPU VRAM available to the CPU and can improve framerates, especially for games with high memory throughput.
It would be interesting to find out if that is the case here, as that lower-end GPU shouldn't saturate the PCIe 3 bus.
The game, like Spider-Man, uses enormous amounts of shared VRAM as a substitute for the missing VRAM. This is why it is saturating the PCIe 3 bus. Resizable BAR or anything else wouldn't really help.
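If you want to sanity-check whether Resizable BAR is even active on your card, here's a minimal sketch (my illustration, assuming an NVIDIA GPU with nvidia-smi on the PATH; BAR1 size is the tell):

```python
# Infer Resizable BAR status by comparing BAR1 size to total VRAM
# (both reported by nvidia-smi -q). Without ReBAR, BAR1 is typically
# only 256 MiB; with ReBAR it roughly matches the full VRAM size.
import re
import subprocess

out = subprocess.run(["nvidia-smi", "-q", "-d", "MEMORY"],
                     capture_output=True, text=True, check=True).stdout

fb = re.search(r"FB Memory Usage[\s\S]*?Total\s*:\s*(\d+)\s*MiB", out)
bar1 = re.search(r"BAR1 Memory Usage[\s\S]*?Total\s*:\s*(\d+)\s*MiB", out)
print(f"VRAM: {fb.group(1)} MiB, BAR1: {bar1.group(1)} MiB")
```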
 

hlm666

Member
I don't think he's aiming for the high-end PC crowd; the way his comparisons go, it seems like he's aiming for console gamers considering getting into PC gaming. It's not a bad idea, but maybe not the best for PC gamers on core gamer boards.
If that's true, he should probably be using a CPU and GPU that are in production, or at least can still be bought new.
 

Guilty_AI

Gold Member
I don't think he's aiming for the high-end PC crowd; the way his comparisons go, it seems like he's aiming for console gamers considering getting into PC gaming. It's not a bad idea, but maybe not the best for PC gamers on core gamer boards.
That makes it even worse, because he is using old hardware that is often more expensive than newer and better hardware, if it's available at all.

If that was really his objective, he should've been using something along the lines of an RTX 3060 and a Ryzen 3600. Those are the basic specs for an average good gaming PC. If he was aiming for low specs, he should've been using an RTX 2060 at most alongside an i3. Those are better (apart from the RTX 2060), cheaper, and more available than the specs he's using, and have been for some time.
 
Last edited: