It's curious that Red Dead Redemption's PC requirements are higher than RDR2's

LectureMaster

RDR2 SPEC:

[image: RDR2 recommended specs]


RDR1 SPEC:

[image: RDR1 recommended specs]


But this is not as informative as it could be, since we don't know what performance the recommended specs are targeting.

Gotta summon [avatar] for the ultimate truth.
 
And not just a bit higher, it's much higher. RDR2's recommended GPUs are closest to the last-gen Pro consoles. RDR1 recommended GPUs fall a bit below the current-gen consoles.
 
'Tis a mighty Red (Dead) Flag on port quality. Hope this isn't just cobbled together to rely on brute force, without any attention given to actual optimisation for the PC platform.
 
It's because the GTX 900 series is now the lowest card to still be getting driver updates. The GTX 700 series stopped getting driver updates in July this year.
 
Probably less talented devs who can't optimize for shit. A fresh junior team was probably given this project.
 
Probably less talented devs who can't optimize for shit. A fresh junior team was probably given this project.
It's a small studio, but they got the game running on a Switch at a locked framerate, at 4K30 on PS4, and with a 60fps lock on PS5 using FSR Native AA, which is clean AF.
 
Those high requirements might indicate that the game is running on top of an emulation layer. Not a native Windows PC application.
 
Let's continue piling up an infinite stack of middleware to run the same shit as before. Diminishing returns.
 
Wouldn't surprise me if they are running the PS3 or Xbox 360 version in an emulator and then selling it to us as the "PC version".
 
Denuvo. You can already play this on PC at 4k 60fps+ using Ryujinx fairly easily, so it'll be funny af if the official release runs worse
 
Why do people expect recommended specs not to go up with time? When testing games, the QA department has hardware of a certain age to test with, and that's what they use. Same goes for the minimum: why should they maintain ancient tech to validate it? It's a wonder they could even find a system with a 4000-series chip and a mid-range GPU from 2014 to test on.
 
Why do people expect recommended specs not to go up with time? When testing games, the QA department has hardware of a certain age to test with, and that's what they use. Same goes for the minimum: why should they maintain ancient tech to validate it? It's a wonder they could even find a system with a 4000-series chip and a mid-range GPU from 2014 to test on.

What are you waffling on about?

This was from a set of games Rockstar released just last year:

[image: specs for Rockstar games released last year]


And if you want to be pedantic, since Double Eleven are handling this port, this is from the Harry Potter Collection which just released:

[image: Harry Potter Collection specs]
 
If Rockstar is charging $50-60 for an emulator, people are going to be disappointed.
Emulating the Switch version doesn't even need such hardware. I did it on an 11-year-old i5 4670 with an 8-year-old card (GTX 1060), and it looks just as good as the original, if not better.
 
Optimization - schmoptimization

Just buy a new pc!

And for the folks who can't get a new PC, we have a product as well: it's called the Xbox 360. You can play RDR there.
 
Lol, a 2070 for a game that runs on PS3, X360 and Switch, what a disgrace is that?
Maybe the "recommended" experience is 8K at 144fps 😅.

It all depends what the recommended performance means at the end of the day. The recommended experience varies wildly from game to game even from within the same publishers or developers.
 
Those high requirements might indicate that the game is running on top of an emulation layer. Not a native Windows PC application.
The CPU requirements should be much higher in that case. The GPU ones, not quite as high. How does it run with RPCS3 or that 360 emulator?
 
The CPU requirements should be much higher in that case. The GPU ones, not quite as high. How does it run with RPCS3 or that 360 emulator?

The CPU requirements are much higher than any CPU from the X360/PS3 era.
For the GPU, who knows what the hell is going on there.

If it's not emulated, then this might be a very poorly optimized port. GTA 4 levels of bad.
 
The CPU requirements are much higher than any CPU from the X360/PS3 era.
For the GPU, who knows what the hell is going on there.

If it's not emulated, then this might be a very poorly optimized port. GTA 4 levels of bad.
Are there instruction set differences in processors before the 4670? It could be as simple as the engine they're using not supporting certain features of older chips. Or maybe that's as far back as they had, or were willing to go, for testing purposes.
 
Hopefully we'll see some YouTube videos of people playing on out-of-spec setups, so we can see how it runs.

Either they're brute forcing with the higher specs or just making shit up.

Recommending a 20-series card makes sense, as that was the first series with DLSS.
 
The CPU requirements are much higher than any CPU from the X360/PS3 era.
For the GPU, who knows what the hell is going on there.

If it's not emulated, then this might be a very poorly optimized port. GTA 4 levels of bad.
Could be just some shit they made up too, because nobody uses an Intel Core 2 Duo from the PS360 era anymore.

I guess we'll see if there are visual improvements and what the resolution target is. If it's 4K60 with better visuals, that's fine. If it's 1080p30 with the same graphics? Yeah…
 
Maybe recommended is for 4K?

Or maybe it's just a quick shitport from the PS4 version.
 
Those high requirements might indicate that the game is running on top of an emulation layer. Not a native Windows PC application.
From the other thread, "full mouse and keyboard functionality" indicates that it is not emulated, but of course nothing is certain.
 