
Can Ratchet and Clank: Rift Apart on PC match - and exceed - the PS5 experience?

shamoomoo

Member
Console gamers thinking that the SSD is the newest cutting-edge technology is the funniest shit ever. In reality it's been the standard for so long that it became dirt cheap to include in entry-level gaming devices like the PS5 or Series.
How? If I'm not mistaken, people could put SSDs in the PS4, but it wouldn't have a noticeable effect on loading.
 

yamaci17

Member
Performs great for an x50-series card, why are people complaining?
The 4060 is still 20% faster than the 3060, chip for chip.

Its VRAM is causing 50% to 150% average performance drops (hence why NX Gamer's 8 GB videos are pointless, as evidenced by the 4060 being much slower than the 3060 once its VRAM is full).
 

RagnarokIV

Battlebus imprisoning me \m/ >.< \m/
How? If I'm not mistaken, people could put SSDs in the PS4, but it wouldn't have a noticeable effect on loading.

Or

 

SomeGit

Member
The 4060 is still 20% faster than the 3060, chip for chip.

Its VRAM is causing 50% to 150% average performance drops (hence why NX Gamer's 8 GB videos are pointless, as evidenced by the 4060 being much slower than the 3060 once its VRAM is full).

20% is a bad gen-on-gen upgrade; combined with the VRAM issues, the card is just embarrassing.
 

yamaci17

Member
20% is a bad gen-on-gen upgrade; combined with the VRAM issues, the card is just embarrassing.
On top of being bad, it will be unable to utilize frame gen in most cases, as frame gen requires an additional 1-1.5 GB of VRAM that a dev has no obligation to account for when designing games for consoles and the like (and for people who don't have frame gen).
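
To put rough numbers on that headroom problem, here's a minimal back-of-the-envelope sketch; the OS reservation and game budget figures are assumptions for illustration, only the 1-1.5 GB frame gen cost comes from the point above.

```cpp
#include <cstdio>

// Back-of-the-envelope VRAM headroom for an 8 GB card with frame gen enabled.
// The OS reservation and game budget are assumed values, not measurements.
int main() {
    const double card_vram_gb     = 8.0;  // e.g. RTX 4060 / 4060 Ti 8 GB
    const double os_overhead_gb   = 0.5;  // desktop/compositor reservation (assumed)
    const double game_budget_gb   = 7.0;  // what a console-targeted game might expect (assumed)
    const double framegen_cost_gb = 1.25; // middle of the ~1-1.5 GB range cited above

    const double headroom_without = card_vram_gb - os_overhead_gb - game_budget_gb;
    const double headroom_with    = headroom_without - framegen_cost_gb;

    std::printf("Headroom without frame gen: %+.2f GB\n", headroom_without); // +0.50 GB
    std::printf("Headroom with frame gen:    %+.2f GB\n", headroom_with);    // -0.75 GB
    // Negative headroom means data spills into system RAM over PCIe,
    // which is exactly where the x8 link below starts to hurt.
    return 0;
}
```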

Besides that, funnily, having a card with a PCIe 4.0 x16 interface and 32 GB/s of bandwidth lets other 8 GB cards claw back a bit of performance when the game heavily taps into system RAM. But surprise surprise, both the 4060 and 4060 Ti are PCIe 4.0 x8, in other words capped at a maximum of 16 GB/s of bandwidth, which is further funny.
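
For anyone who wants the bandwidth math spelled out, a quick sketch (per-lane figures are rounded and ignore encoding overhead; real-world throughput is lower):

```cpp
#include <cstdio>

// Approximate peak PCIe throughput: ~2 GB/s per lane at gen 4,
// halving for each older generation.
double pcie_peak_gbs(int gen, int lanes) {
    const double per_lane_gen4 = 2.0;                 // GB/s per lane (rounded)
    return per_lane_gen4 * lanes / (1 << (4 - gen));  // halve per generation below 4
}

int main() {
    std::printf("PCIe 4.0 x16: ~%.0f GB/s\n", pcie_peak_gbs(4, 16)); // ~32 (full-width cards)
    std::printf("PCIe 4.0 x8 : ~%.0f GB/s\n", pcie_peak_gbs(4, 8));  // ~16 (4060 / 4060 Ti)
    std::printf("PCIe 3.0 x8 : ~%.0f GB/s\n", pcie_peak_gbs(3, 8));  // ~8 on an older board
    return 0;
}
```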
 

SomeGit

Member
On top of being bad, it will be unable to utilize frame gen in most cases, as frame gen requires an additional 1-1.5 GB of VRAM that a dev has no obligation to account for when designing games for consoles and the like (and for people who don't have frame gen).

Besides that, funnily, having a card with a PCIe 4.0 x16 interface and 32 GB/s of bandwidth lets other 8 GB cards claw back a bit of performance when the game heavily taps into system RAM. But surprise surprise, both the 4060 and 4060 Ti are PCIe 4.0 x8, in other words capped at a maximum of 16 GB/s of bandwidth, which is further funny.

If a poor soul uses this card on a PCIe 3.0 platform, like the R5 3600 or a 5000-series APU, this is likely even worse. I didn't know it was an x8 card, wth.
 

Bojji

Gold Member


Yeah, it works better without DS... looks like this tech is only good for slower drives? I don't know.

They should implement a CPU-only DS option like in Forspoken; it worked great in that game. Putting GPU resources toward that may not be the best idea (1% lows!).
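
For what it's worth, DirectStorage 1.1 already exposes an app-side switch for this. A minimal sketch of how a game could wire up such a toggle, assuming the public DSTORAGE_CONFIGURATION route; whether this port does it that way internally is unknown:

```cpp
// Sketch of a "CPU-only decompression" toggle via the public DirectStorage
// 1.1 configuration API. The configuration must be set before the factory
// is first created.
#include <dstorage.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

HRESULT InitDirectStorage(bool cpuOnlyDecompression, ComPtr<IDStorageFactory>& factory)
{
    DSTORAGE_CONFIGURATION config{};
    // TRUE keeps GDeflate decompression on the built-in CPU worker pool
    // instead of dispatching it to the GPU.
    config.DisableGpuDecompression = cpuOnlyDecompression ? TRUE : FALSE;

    HRESULT hr = DStorageSetConfiguration(&config);
    if (FAILED(hr))
        return hr;

    // The factory created afterwards picks up the configuration above.
    return DStorageGetFactory(IID_PPV_ARGS(&factory));
}
```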

On top of being bad, it will be unable to utilize frame gen in most cases, as frame gen requires an additional 1-1.5 GB of VRAM that a dev has no obligation to account for when designing games for consoles and the like (and for people who don't have frame gen).

Besides that, funnily, having a card with a PCIe 4.0 x16 interface and 32 GB/s of bandwidth lets other 8 GB cards claw back a bit of performance when the game heavily taps into system RAM. But surprise surprise, both the 4060 and 4060 Ti are PCIe 4.0 x8, in other words capped at a maximum of 16 GB/s of bandwidth, which is further funny.

If a poor soul uses this card on a PCIe 3.0 platform, like the R5 3600 or a 5000-series APU, this is likely even worse. I didn't know it was an x8 card, wth.

The 4060 series of GPUs is one big joke from Nvidia; the xx60 line has never had its bus and memory cut down this much. The 1060 had a 192-bit bus and A LOT of memory (for the time), the 2060 had the same bus but its memory was on the edge in 2018, and the 3060 had more memory than the xx80 series (LOL) and, again, a 192-bit bus. None of the previous xx60 cards had a cut-down PCIe bus; Nvidia are just greedy bastards.

Based on raw specs, the 4070 should be the xx60 (maybe Ti?) card this gen.
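
To put numbers on the bus-width cut mentioned above, a quick sketch of the peak memory bandwidth math, using the commonly listed specs (treat them as approximate):

```cpp
#include <cstdio>

// Peak memory bandwidth = (bus width in bits / 8 bits per byte) * data rate.
// Specs below are the commonly listed figures and may differ by SKU.
double mem_bandwidth_gbs(int bus_bits, double data_rate_gtps) {
    return bus_bits / 8.0 * data_rate_gtps;
}

int main() {
    std::printf("GTX 1060 (192-bit,  8 GT/s GDDR5): ~%.0f GB/s\n", mem_bandwidth_gbs(192, 8.0));  // ~192
    std::printf("RTX 3060 (192-bit, 15 GT/s GDDR6): ~%.0f GB/s\n", mem_bandwidth_gbs(192, 15.0)); // ~360
    std::printf("RTX 4060 (128-bit, 17 GT/s GDDR6): ~%.0f GB/s\n", mem_bandwidth_gbs(128, 17.0)); // ~272
    return 0;
}
```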
 

Hoddi

Member
Yeah, it works better without DS... looks like this tech is only good for slower drives? I don't know.
Likely the opposite. If the CPU is capable of matching the GPU, then the load isn't intensive enough for it to make a difference.

As it is, it just saves a bit of CPU performance. It's only about half a core, but it could mean the difference between playable and unplayable on quad-core CPUs.

 

Md Ray

Member
On top of being bad, it will be unable to utilize frame gen in most cases, as frame gen requires an additional 1-1.5 GB of VRAM that a dev has no obligation to account for when designing games for consoles and the like (and for people who don't have frame gen).

Besides that, funnily, having a card with a PCIe 4.0 x16 interface and 32 GB/s of bandwidth lets other 8 GB cards claw back a bit of performance when the game heavily taps into system RAM. But surprise surprise, both the 4060 and 4060 Ti are PCIe 4.0 x8, in other words capped at a maximum of 16 GB/s of bandwidth, which is further funny.
Shitty NVIDIA.
 