> So ALL OTHER games are unoptimized on PS5, or TLoU is not optimized on PC? What is more likely?

I believe both statements are true.
> Games they tested represent the theoretical power of the PS5 GPU vs. PC GPUs quite well. So PS5 = 6700 (or 2070S when not VRAM limited).
>
> How much better can a console perform vs. a PC? The API difference shouldn't be more than a few %.

There's more than just API differences. Look at how good Sony games look and how well they run. Now compare them to the average AAA title and you'll see a huge gap.
I don’t think most games are all that well-optimized for PC either.
> Yeah, the game has nice quality textures for sure. But at launch it required like 12GB+ at 1080p, and this game started the whole debate on the VRAM requirements of new games vs. what NV offers.
>
> This is how the game performs vs. the PS5:
>
> While in the other games they tested, the 4070S is ~2x the performance of the PS5.
>
> So ALL OTHER games are unoptimized on PS5, or TLoU is not optimized on PC? What is more likely?

We're back to this? Do you even have anything concrete other than this screenshot you must have posted four times now? It's more likely that ND utilized the PS5 better than most to achieve native 4K at a higher framerate than games like A Plague Tale and a lot of other games, and still ended up with a great-looking game, yes. Remember, A Plague Tale is 1440p there on both the PS5 and that 4070 Super test. Look at what happens when you enable DLSS Quality to drive the native res of the TLOU remake to 1440p too: better performance on the 4070 Super than A Plague Tale with DLSS Quality at 4K.
> There is no technical reason why TLoU performs only 34% better on a GPU much better than what is inside the PS5 (in every category) while other games are 100% better. So if there is no technical reason, what remains? Poor code.

That's irrelevant and repetitive though. What we're discussing is the idea that there isn't a technical reason for a given level of performance on a specific system. It's unlikely that the game simply decides to run at 34fps for inexplicable, non-technical reasons on lower-end hardware like the 6600 XT at max settings. Just as Wukong runs at 14fps on a 6600 XT at max, there are actual technical reasons.

Or get this: good PS5 code from one of PlayStation's best studios, using an API and hardware they know inside out and had a hand in developing. Unfathomable, I know. Only your reason is correct. Even when I tell you that A Plague Tale runs at a lower fps than TLOU at the same res on a 4070 Super with PS5 settings, your idea of whether a game is "optimised" remains PC-to-PS5 relative performance. Meaning nobody is allowed to push console hardware above the average for fear of "poor code".
> In Cyberpunk RT the 6700 underperforms, but it's OK in performance mode. Your argument here is that all those games above are shit on PS5, and that's why they are close to the 6700, while TLoU, the pinnacle of Sony engineering, is the only optimized game and that's why it's so much better on PS5?

That's not what I said at all. There is a difference between calling those games shit and saying TLOU is good. So why does Cyberpunk underperform on the 6700? Shit code? Or a technical reason?
> PS5 is a 6700 in APU form and without Infinity Cache; there is no reason for it to outperform a PC GPU with almost the same specs:
>
> PS5 is 30% faster here^ (it can go up to over 40%).
>
> TLoU is not the first (and certainly not the last) underperforming game on PC, but PS fans sure love to use it as an example of the mighty power of their machine.

So what are you suggesting is happening, exactly? They detect you're not running a PS5 and gimp your shit? Let's just ignore the memory bandwidth difference there. Whoops, did your argument fall apart with what has already been pointed out as the probable bottleneck several times?
> Yet you want to believe the PS5 GPU is a 6700 XT / 2080S / 3060 Ti class GPU.

I've never seen anyone else say that except you.
> I just bought a PS5 Pro about two weeks ago at Target. It's pretty much nothing as far as better graphics; I don't see any difference at all. The only thing I really noticed was a better frame rate in Call of Duty games, but that's really it. I got more kills, that's for sure. Now, I have a Lenovo Legion gaming PC with an Nvidia 4070 Super 12GB GDDR6 GPU, 32GB RAM, 1TB SSD, 1TB HDD, and an Intel Core i7 14400. Like all the pictures in the posts here have proved already, the 4070 and 4070 Super are way better than the PS5 Pro. The 4070 Super has 33.5 teraflops; the PS5 Pro has 16.7 teraflops. I have got it beat. So like I said, I own both. I notice the better graphical detail on PC, especially in older games like Resident Evil 5 on Steam. I was like, wow dude, I can see all the details in them, and I got it up to my 165Hz 1440p monitor. The graphics and the frame rate are way, way better than the PS5 Pro. I ain't never going back to consoles again. Sure, I'll keep what I have now, and I'll still play PS3, PS4, and PS5 games sometimes; I own about 300 games across them, way more than on Steam. Overall though, because performance is way, way better on PC, why continue to buy consoles? Exactly. Don't buy consoles. Once you go PC, you stay PC, because you realize, oh yeah, the frame rate is higher and better, and the graphics are better too.

Wait, you're telling me a computer with $1400+ specs using a $600 card has better optimization and graphics vs. a $700 console (and cheaper consoles by proxy)?
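Quick sanity check on those teraflop numbers: peak FP32 is just 2 FLOPs (one FMA) per shader ALU per clock. The figures below are from public spec sheets, and the big caveat is that Ada's dual-issue FP32 inflates the ALU count relative to RDNA, so paper TFLOPS don't map 1:1 to fps:

```python
# Peak FP32 throughput: 2 FLOPs (one fused multiply-add) per ALU per clock.
def peak_tflops(shader_alus: int, boost_clock_ghz: float) -> float:
    return 2 * shader_alus * boost_clock_ghz / 1000.0

# Public spec-sheet figures (nominal boost clocks; real clocks vary).
gpus = {
    "RTX 4070 Super": (7168, 2.475),  # Ada dual-issue FP32 doubles the ALU count
    "PS5 Pro":        (3840, 2.17),   # 60 CUs x 64 ALUs; Sony quotes 16.7 TFLOPS
    "PS5":            (2304, 2.23),   # 36 CUs; Sony quotes 10.28 TFLOPS
}

for name, (alus, clock) in gpus.items():
    print(f"{name:>15}: {peak_tflops(alus, clock):5.1f} TFLOPS")
```

That's also why a 2x paper-TFLOPS gap doesn't show up as 2x fps in any of the comparisons in this thread.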
> Wait, you're telling me a computer with $1400+ specs using a $600 card has better optimization and graphics vs. a $700 console (and cheaper consoles by proxy)?

Um, spend more = get more better processing power and graphical power and frame rate power. I am legit shooketh.
> Um, spend more = get more better processing power and graphical power and frame rate power.

Yeah, that's pretty much my point. Not that I disagree about sticking with PC if you already have one. But the whole point of consoles is getting a good entry-level system for an affordable price.
> PS5 GPU doesn't have access to the whole memory bandwidth; it's shared with the CPU. The 6700 also has Infinity Cache to boost that. Nothing like that on the console.

I'm guessing the RT on consoles is just drastically toned down, but it's more complex on PCs since RTX and Arc cards can handle it. If the same code paths were used on PC, Radeon cards would handle it better, but the image quality would be much worse.
The 6700 also has a higher clock.
Cyberpunk could be like that thanks to AMD drivers or the PS API for RT being more efficient. The Last of Us is not doing any RT.
You have many games vs. one, yet in your opinion this one game's results represent the power difference between PS5 and PC.
> I got more kills, that's for sure.
> PS5 GPU doesn't have access to the whole memory bandwidth; it's shared with the CPU. The 6700 also has Infinity Cache to boost that. Nothing like that on the console.

Why didn't you use the reply function? It's CPU access to GPU memory that is the bottleneck on the 6700, limited by the 40% lower memory bandwidth.

Remember the PS3, where games like Skyrim were such "awesomely optimised" games, and the Cell's access to RSX VRAM was slower and a headache because it was a split pool? When that was slower, it was the console's fault, of course. Now it's the software's. Fast shared pools of RAM are bad now! Who'd have thought?

The fact that the fast GDDR6 is shared with the CPU isn't a negative; it's a positive for optimisation. Amounts, sure, you can argue about; bandwidth, not really, if you need CPU access to VRAM. The bandwidth is better, not worse.

Anyway, isn't this kind of contradictory? I thought you were saying it is only optimised for the 6700 and 6700 XT earlier. Now you're saying it's not optimised for the 6700 either? Make up your mind, son.

> The 6700 also has a higher clock.

Sure it does, but this doesn't offset the 40% lower memory bandwidth.

> Cyberpunk could be like that thanks to AMD drivers or the PS API for RT being more efficient. The Last of Us is not doing any RT.

So you think efficiency in APIs is only related to RT? You think the people who helped create the PS APIs cannot improve other aspects of them and take advantage of that in their game?
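On that 40% number: with public spec-sheet figures (PS5: 256-bit GDDR6 at 14 Gbps; RX 6700: 160-bit at 16 Gbps, plus 80 MB of Infinity Cache), the arithmetic works out like this, and whether you call it "40%" depends on which way you divide:

```python
# Bandwidth in GB/s = bus width (bits) / 8 * data rate (Gbps).
ps5_bw    = 256 / 8 * 14.0  # 448 GB/s, shared between CPU and GPU
rx6700_bw = 160 / 8 * 16.0  # 320 GB/s, plus 80 MB Infinity Cache on top

print(f"PS5:     {ps5_bw:.0f} GB/s")
print(f"RX 6700: {rx6700_bw:.0f} GB/s")
print(f"PS5 advantage:   +{ps5_bw / rx6700_bw - 1:.0%}")  # +40%
print(f"RX 6700 deficit: -{1 - rx6700_bw / ps5_bw:.0%}")  # -29%
```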
> I've never seen anyone else say that except you.

2070S + 30-45% is basically a 2080 Ti. I'm still using that GPU, and from what videos I've seen of the PS5 Pro, the performance seems very similar to mine. And there's a reason I'm still using it: it's a fantastic GPU.
> 2070S + 30-45% is basically a 2080 Ti. I'm still using that GPU, and from what videos I've seen of the PS5 Pro, the performance seems very similar to mine.

As Boji said, the PS5 and Xbox Series X have the same performance as the RTX 2070S (RX 6700).

Some highly optimized games, such as FH5 (XSX), Gears of War 5 (XSX), Gotham Knights (PS5 & XSX), Genshin Impact (PS5), Death Stranding (PS5) and COD: Cold War (PS5), are equivalent to the RTX 2080.

With the latest UE5 (Fortnite Chapter 6), the PS5 Pro has a 56% higher resolution than the base PS5 and adds HWRT (High to Epic).
I don't like comparing different architectures/vendors, though. Console-centric engines will obviously gravitate towards AMD GPUs, while UE has historically performed better on Nvidia hardware. People are chasing their tails trying to make this about console vs. PC when they should be talking about AMD vs. Nvidia GPUs.
> With the latest UE5 (Fortnite Chapter 6), the PS5 Pro has a 56% higher resolution than the base PS5 and adds HWRT (High to Epic). It's limited by the Zen 2 CPU and a 60fps cap (V-Sync), making it hard to measure the Pro's pure GPU performance. We'll have to wait for a larger sample of Pro games.

That's definitely true. I'm just basing this on Sony's document calling it 45% faster. There's bound to be some areas where it's more than 45% faster, just like there are areas where it's less than 45%. Some games might allow higher boost clocks on the PS5 Pro than the PS5, for example.
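One way to square the 56% with Sony's 45%: if GPU-bound performance scales roughly linearly with pixel count (a simplification I'm assuming here, and taking "56% higher resolution" to mean pixel count), then X% more throughput buys about X% more pixels at the same framerate:

```python
import math

sony_uplift     = 1.45  # Sony's quoted 45% rasterization uplift
fortnite_pixels = 1.56  # observed: 56% more pixels on PS5 Pro

# Linear pixel scaling: 45% more throughput ~= 45% more pixels at equal fps.
axis_scale = math.sqrt(sony_uplift)
print(f"Per-axis scale from a 45% uplift: {axis_scale:.2f}x")      # ~1.20x
print(f"e.g. 1440p -> ~{round(1440 * axis_scale)}p at equal fps")  # ~1734p
print(f"Implied uplift from 56% more pixels: {fortnite_pixels - 1:.0%}")
```

Landing above 45% is plausible; per-title clocks, PSSR, and scaling that isn't perfectly linear all move the number around.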
Mileage will vary for sure, and as I've said before, these are extremely EARLY days. Most of the games have had minimal effort put into PRO patches. It's also very difficult to get absolutely matched settings across PRO and PC, because many titles have custom settings on PRO (e.g. Alan Wake 2). That said, there are more than a few games where you can clearly see that the general level of performance in the new PRO modes is in the ballpark of an RTX 4070 or higher. And yes, in some cases you may say the settings aren't an exact match, or the test scenes aren't exact matches, etc. But having tried many of these titles on both the PRO and PC myself, I would argue that any differences are so slight they'd hardly be noticeable by eye to the vast majority of gamers, and the story across the broad game experience is close enough.
So without further ado (I tried to avoid DRS and upscaling in these comparisons where possible, to reduce variability); a quick numeric recap follows the list:
GOD OF WAR RAGNAROK
[PS5 PRO] Quality Mode (Native 4K TAA) - 45-60fps (avg low 50s)
[PC] Native 4K TAA - RTX 4070 avg ~45fps - 7900 GRE/4070 Super level needed to match general PRO performance
THE LAST OF US PT 1
[PS5 PRO] 4K Fidelity Mode runs at 50-60fps with avg in the mid 50s
[PC] Native 4K on RTX 4070 = avg FPS is mid 30s. Even a 4080 can't hit 60fps avg at 4K. PRO definitely superior here
RATCHET AND CLANK RIFT APART
[PS5 PRO] 4K Fidelity Mode with All Ray Tracing options enabled: Framerate is ~50-60fps with an avg in the mid 50s
[PC] Native 4K with Ray Tracing: RTX 4070 is under 60fps on average (low 50s). Pretty close to the PRO experience
CALL OF DUTY BLACK OPS 6
[PS5 PRO] 60hz Quality mode (w/PSSR) runs at ~60-100fps with average in the mid-upper 70s (Campaign and Zombies)
[PC] RTX 4070 Extreme + DLSS Quality drops below 60fps quite often in the Campaign (one example below). Lowering settings a little bit will give a closer match in quality and performance to the PS5 PRO
RESIDENT EVIL VILLAGE
[PS5 PRO] 4K Mode with Ray Tracing Enabled runs at ~60-90fps with average in the 70s
[PC] Native 4K with RT: RTX 4070 averages ~75fps. Right in the ballpark of the PS5 PRO experience
Hint: I've got a feeling TLOU Pt 2 and Spider-Man 2 will be good candidates for this list in a few months.
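And the promised recap, purely as illustration: the averages quoted above dropped into a few lines of Python (midpoints of the stated ranges; BO6 omitted because "drops below 60" isn't an average), to make the PRO-to-4070 ratios easy to see:

```python
# Quoted averages from the list above: (PS5 PRO avg fps, RTX 4070 avg fps).
results = {
    "God of War Ragnarok (4K TAA)":       (52, 45),
    "The Last of Us Pt 1 (4K)":           (55, 35),
    "Ratchet & Clank Rift Apart (4K RT)": (55, 52),
    "Resident Evil Village (4K RT)":      (75, 75),
}

for game, (pro, rtx) in results.items():
    print(f"{game:<38} PRO/4070 = {pro / rtx:.2f}x")
```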
> I bought a 4070 build a few days after the PS5 Pro release.
>
> So far I've been blown away. Currently playing Cyberpunk with almost every setting at its highest, with DLSS Quality or Balanced and frame gen. Playing above 60 FPS at 4K easily.
>
> Also currently playing RDR2 near maxed out (a few settings like tree tessellation off), also at 4K (native). Getting 55-65fps.
>
> I don't think the PS5 Pro would be giving me anywhere near this level of fidelity, and that's before factoring in how reliant you are on the dev when on console.
>
> I've been very, very happy with my purchase. DLSS is such a cheat code.

I'm surprised myself how good the RTX 4070 is at 2560×1080. It can max out any game, including path tracing in CP and Alan Wake 2, with DLSS set to Quality. I didn't expect a mid-range card to be able to max out path tracing in games.
> Nah, it's not at all. Still, IMO, the best option for console bros for getting through the rest of the gen.
>
> If I were them though, I would be lighting up some candles and praying that devs get their shit together when it comes to Pro patches. Some good ones out there, but also some stinkers that make it look worse than it really is.

If I were them, I'd invest in a PC to avoid the console all-digital future hellscape.
> If I were them, I'd invest in a PC to avoid the console all-digital future hellscape.

You would advise getting an all-digital ecosystem to avoid an all-digital ecosystem?
> You would advise getting an all-digital ecosystem to avoid an all-digital ecosystem?

Open vs. closed. On PC you have multiple options, while a console leaves you at the tender mercies of the console owner.
> DLSS frame generation surprised me too. For single player at least, I don't feel lag; I was expecting the input lag to be noticeable, but it's not. More noticeable are the image glitches.

Reflex makes the added latency negligible. The fearmongering around frame generation's input lag is utter bullshit. Unless you play stuff like CS or COD, where you don't need it for high fps anyway, there's almost no reason not to toggle it in games where it's available.
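To put rough numbers on that (a simplified model I'm assuming, not a measured figure): interpolation-style frame generation has to hold back about one rendered frame, so the added delay is on the order of one base frame time, and it shrinks as base fps rises, which is why it matters least in the high-fps shooters that don't need it:

```python
# Simplified model: interpolation holds back ~1 rendered frame, so added
# latency is roughly one base frame time (generation cost and Reflex's
# render-queue savings are ignored here).
for base_fps in (30, 60, 120):
    frame_time_ms = 1000 / base_fps
    print(f"{base_fps:>3} fps base -> ~{frame_time_ms:.1f} ms held-back frame")
```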
> You would advise getting an all-digital ecosystem to avoid an all-digital ecosystem?

Yes. The difference is that PC is an open platform where anybody can sell games, which means there's a lot of competition. Now you'll probably say Steam bodies everyone, but this isn't exactly the case. Look at the price hierarchy of digital games, from most expensive to biggest discounts: PlayStation Network/Xbox/Switch eShop >> Steam/MS Store/Uplay/Origin > Epic > licensed resellers like GMG > key sites like Kinguin > Itch.io.