
Is the PS5 Pro on par with an RTX 4070 like Digital Foundry claimed before launch?

It has been a blast the gazillion other times PC and PS5 Pro have been compared so...

It's so tiring and boring. Who is it who wants these comparisons? If you care about the best performance and best quality get a PC. If you don't care about quality/performance, want an easier life and don't like tweaking settings get a PS5. I guess for me the pro seems odd because it sits in the middle. I'm honestly not sure who is the person who wants to spend that much over the base PS5 and still have an inferior experience to PC...I'm guessing people who are complete luddites but have decent disposable income and are really invested in the playstation ecosystem.
 
Last edited:

kevboard

Member
So basically you are telling me this is not optimized for Pro.

the pro version of AW2 clearly got a lot of care.

they changed how the BVH works by hand and at a granular level.

adjusting the distance of the BVH by hand to make sure the most important distant details like mountains are still captured, while less important detail is reined in.

they also adjusted the tick rate of different types of dynamic objects that are in the BVH to save performance where it doesn't hurt, while keeping important things like player characters and important NPCs updating every frame.

all of this doesn't guarantee that they got every last bit of performance out of the system, but it absolutely shows that the port was handled with care.
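
to give a rough idea of what that per-type tick rate scheduling could look like, here's a minimal C++ sketch with made-up categories and intervals (not Remedy's actual code, just the concept of refitting cheap stuff less often while player characters and key NPCs stay per-frame):

```cpp
// Hypothetical sketch of per-category BVH refit scheduling. All types,
// names and intervals are invented for illustration.
#include <cstdint>
#include <cstdio>
#include <vector>

enum class Category { PlayerCharacter, ImportantNPC, Debris, DistantProp };

// How often each category's BVH entry is refit, in frames (1 = every frame).
static int tickInterval(Category c) {
    switch (c) {
        case Category::PlayerCharacter:
        case Category::ImportantNPC:   return 1;  // always up to date
        case Category::Debris:         return 4;  // refit every 4th frame
        case Category::DistantProp:    return 8;  // rarely matters for RT
    }
    return 1;
}

struct DynamicObject {
    Category category;
    const char* name;
};

int main() {
    const std::vector<DynamicObject> objects = {
        {Category::PlayerCharacter, "player"},
        {Category::ImportantNPC,    "key_npc"},
        {Category::Debris,          "paper_cup"},
        {Category::DistantProp,     "distant_flag"},
    };

    for (std::uint64_t frame = 0; frame < 8; ++frame) {
        for (const auto& obj : objects) {
            if (frame % tickInterval(obj.category) == 0) {
                std::printf("frame %llu: refit BVH node for %s\n",
                            static_cast<unsigned long long>(frame), obj.name);
            }
        }
    }
    return 0;
}
```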

again, something that can clearly not be said about Callisto Protocol on PC, where many glaring issues still remain to this day, and were catastrophic at launch.
 
Last edited:
The problem is this game has absolutely no granularity whatsoever and you cannot even change the settings in real time. You have to exit to the main menu.

Lighting, for instance, can only be toggled between Low and Standard and has a huge performance impact in some scenes while not so much in others. The difference can vary from less than 10% to over 25%. Consoles could easily be using a custom value between the two and only look a bit worse in some scenes, and you'd think they're using the same setting.

Another one is volumetrics. If you compare still shots, there really isn't much of a difference. You'd have to find the same place and compare side by side. However, the impact between Medium and High can be as much as 17%.
jrRyvrX.png
Once again, consoles could be using a custom value or even mix them up depending on the source. It could be large volumetrics use a lower value to save performance.

Another one is particles that can look very similar between High and Medium, but Medium runs 5-6% faster.

wwJwhkZ.png

Then there's the ray tracing with RT shadows sometimes barely showing a difference but tanking the fps by 30%.
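
To illustrate what I mean by consoles using an in-between value, here's a purely hypothetical sketch; every name and number is invented, nothing here is pulled from the game:

```cpp
// Hypothetical illustration: PC exposes two discrete presets per setting,
// while a console build could ship any value in between, and even vary it
// per source (e.g. large volumetrics get the cheaper value).
#include <cstdio>

enum class VolumetricSource { Small, Large };

struct Profile {
    float lighting;     // PC menu: 0.5 (Low) or 1.0 (Standard)
    float volumetrics;  // PC menu: 0.6 (Medium) or 1.0 (High)
};

const Profile pcLow         {0.5f, 0.6f};
const Profile pcStandard    {1.0f, 1.0f};
const Profile consoleCustom {0.8f, 0.85f};  // not selectable on PC

// Example per-source rule: big, expensive volumes drop to the cheaper value.
static float volumetricQuality(const Profile& p, VolumetricSource src) {
    return (src == VolumetricSource::Large) ? pcLow.volumetrics : p.volumetrics;
}

int main() {
    std::printf("lighting: PC %.2f or %.2f, console %.2f\n",
                pcLow.lighting, pcStandard.lighting, consoleCustom.lighting);
    std::printf("volumetrics: small %.2f, large %.2f\n",
                volumetricQuality(consoleCustom, VolumetricSource::Small),
                volumetricQuality(consoleCustom, VolumetricSource::Large));
    return 0;
}
```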


Surely, you have noticed this:

GoAipIt.png


The game is completely single-threaded and this shows a severe CPU bottleneck. The 4070 is asleep at the wheel here drawing 116W.

Also, in this very video, the average for 1440p FSR Q is 64fps, but the average for 4K FSR Q is 80fps.
4g1fgHx.png
g0xP3cB.png

So going from basically 1440p to 4K increased the fps by a whopping 25%. Of course, they tested different areas, but this harks back to the point I made several posts ago that you cannot just use random benchmarks. You have to do like-for-like because as you can see here, the numbers can vary enormously. You'd think this game is more demanding at 1440p than at 4K with FSR Q using that video, but we all know this isn't the case.
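
A toy example of why that kind of comparison misleads (all numbers made up, just mirroring the shape of those averages):

```cpp
// Made-up numbers showing why averages captured in different areas can't be
// compared directly: the cross-scene ratio says "+25%", while a like-for-like
// comparison of the same area can say the opposite.
#include <cstdio>

int main() {
    // Averages from two different areas of the game (hypothetical).
    const double fps1440pHeavyArea = 64.0;  // 1440p FSR Q, demanding area
    const double fps2160pLightArea = 80.0;  // 4K FSR Q, lighter area

    // Naive cross-scene comparison: "4K is 25% faster than 1440p" (nonsense).
    std::printf("cross-scene: %+.0f%%\n",
                (fps2160pLightArea / fps1440pHeavyArea - 1.0) * 100.0);

    // Like-for-like: same area at both resolutions (again, hypothetical).
    const double fps1440pSameArea = 80.0;
    const double fps2160pSameArea = 60.0;
    std::printf("like-for-like: %+.0f%%\n",
                (fps2160pSameArea / fps1440pSameArea - 1.0) * 100.0);
    return 0;
}
```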
Consoles probably use more optimised settings like you say, and that's why the slower PS5Pro GPU (RX6800 equivalent) can run this game so well, but it doesn't matter if even enthusiasts like me can't tell the difference between the PS5Pro and PC version running at maxed out settings.

I know this game has CPU optimization issues on PC, at least with RT turned on. My 7800X3D usually gets well over 100fps with RT, but I did see a couple of dips to around 52fps on rare occasions.

The RT shadows in Callisto Protocol look better than the raster ones, and I definitely noticed them during my walkthrough. RT character shadows have that realistic soft look, but they can also look perfectly sharp when the light source is strong.

Performance in this game fluctuates a lot. At 4K native I can get 80fps in one place and 40fps in another (ignore the IPS glow in my video, that's a result of the high ISO and wide-angle lens; my eyes can't see it in real life, at least not during the day).


 
Last edited:

MikeM

Member
The real question is: is the PS5 Pro better than the new Intel Arc B580?

If you are building a PS5 Pro "PC" the intel GPU is very tempting at that price point.
Intel has a spotty history with drivers. Reliability is something to consider in this argument.
 

Gaiff

SBI’s Resident Gaslighter
Consoles probably use more optimised settings like you say, and that's why the slower PS5Pro GPU (RX6800 equivalent) can run this game so well, but it doesn't matter if even enthusiasts like me can't tell the difference between the PS5Pro and PC version running at maxed out settings.
That's a different conversation and more subjective. If the topic is that it can give you a similar experience to a high-end PC, then I think probably, but the game is still broken on PC, so it's not like the bar is very high there. The Pro seems much more stable, as it's all but impossible to maintain 60fps on a PC. You still get massive traversal and shader comp stutters or random fps lurches.
 

Three

Gold Member
I'm specifically talking about something like The Last of Us Part 1, a clear outlier in performance when DF tested many games between PS5 and GPUs.

This game is just made by idiots on PC; if you put the "PS5 GPU" (6700) into a PC it performs MUCH better, 20% faster than the 6800

MrhsVIe.jpeg




How can anyone explain this shit? Normally 6800 is ~44% faster than 6700.

You explain it as "better optimised for a specific machine" and not "unoptimised for PC" considering you're using a PC to show this anyway.

It's not that the game is unoptimised or "made by idiots".
 
Last edited:

Bojji

Member
What CPU was used on the PC? A 3600 or a CPU similar to the PS5?

It’s funny that some people think their 4070 is good because of frame gen

f9FW2.gif


You explain it as "better optimised for a specific machine" and not "unoptimised for PC" considering you're using a PC to show this anyway.

It's not that the game is unoptimised or "made by idiots".

It's unoptimized for all GPUs except 6700 and 6700XT

It's one of the dumbest things I have ever seen on PC.
 

Hoddi

Member
This is true. But in the case of Callisto Protocol he might be right that it's the extra care and optimization on Pro, combined with DRS, more than settings, as they actually exceed PC in some ways:

4wKQ71p.gif


Pro:
za9H8fQ.png


PC:
yGr7D8C.png


And RT, at least in the only comparison we have, is a match for the highest PC settings.

PS5 Pro Performance Mode:
The-Callisto-Protocol-20241120013653.png


RTX 3060ti Native 4K RT Reflections High:
p7zxykp5.png


4K FSR2 Performance Mode (1080p internal) RT Reflections High:
tzhchmm3.png



4K TAAU 50% (1080p internal) RT Reflections High:
3nhkyrts.png


If it offers a rocksolid 60fps with no traversal stutter and has this IQ in its Performance Mode:

The-Callisto-Protocol-20241120030759.png


The-Callisto-Protocol-20241120030725.png


The-Callisto-Protocol-20241120013355.png


The-Callisto-Protocol-20241120013624.png


The-Callisto-Protocol-20241120013713.png


The-Callisto-Protocol-20241120013840.png


The-Callisto-Protocol-20241120013949.png


The-Callisto-Protocol-20241120014156.png


The-Callisto-Protocol-20241120020816.png


It's fair to say it offers a better experience on console even if the RT shadows (which were entirely absent in the base PS5 60fps mode) don't match PC resolution. Especially if you also enjoy gimmicks like DualSense features that actually do have an immersion impact in this kind of game.


If we want to go that route, there are games better on Pro than even a 4090. But that has very little to do with hardware.
In case you missed it, the PS5 Pro patch was 25GB because it upgraded many of the game's textures. Those same textures are also used on the base PS5, which now looks better than it did at launch (and better than Xbox/PC, which didn't receive those upgrades).

You can see the difference in these videos.



 

kevboard

Member
In case you missed it, the PS5 Pro patch was 25GB because it upgraded many of the game's textures. Those same textures are also used on the base PS5, which now looks better than it did at launch (and better than Xbox/PC, which didn't receive those upgrades).

You can see the difference in these videos.





no one should support this game or dev team btw...

like, even ignoring for a moment that the game is utter trash, I have never seen a dev team so clearly neglect certain ports they released to this degree. like, wtf?

the only even remotely finished version on launch was the PS5 version, and the others are in varying states of shit to this day.
 
Last edited:
I bought a 4070 build a few days after the PS5 Pro release.

So far I've been blown away. Currently playing Cyberpunk with almost every setting at its highest, with DLSS Quality and frame gen. Playing at above 60fps at 4K easily.

Also currently playing RDR2 at near maxed out (a few settings like tree tessellation off), also at 4k (native). Getting 55-65fps.

I don't think the ps5 pro would be giving me anywhere near this level of fidelity, and that's before factoring in how reliant you are on the dev when on console.

I've been very, very happy with my purchase. DLSS is such a cheat code.

Frame gen is ass; you're playing at 30/33fps and have the input latency of 30/33fps. Cyberpunk on a 4090 at 4K with path tracing and no frame gen is truly glorious!
 

Clear

CliffyB's Cock Holster
That's a different conversation and more subjective. If the topic is that it can give you a similar experience to a high-end PC, then I think probably, but the game is still broken on PC, so it's not like the bar is very high there. The Pro seems much more stable, as it's all but impossible to maintain 60fps on a PC. You still get massive traversal and shader comp stutters or random fps lurches.

Pro should not be expected to perform as well as a high-end PC! The entire unit is around the price of an upper-mid GPU!

Take your flowers if you demand them PC fans, but we all know how much you're paying for that privilege!

Also, enjoy nit-picking on PSSR while you can, because let's face it, AI-based upsampling improves over time with training, so it's practically guaranteed to get better in its current form, and no doubt will be even better by the time PS6 is launched with even more modern hardware.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
Pro should not be expected to perform as well as a high-end PC!
In this game it does provide a similar or better experience, which is why we're talking about it. Sure, there's blatant favoritism on the part of Striking Distance, but that's part of the point.
Take your flowers if you demand them PC fans, but we all know how much you're paying for that privilege!

Also, enjoy nit-picking on PSSR while you can, because let's face it, AI-based upsampling improves over time with training, so it's practically guaranteed to get better in its current form, and no doubt will be even better by the time PS6 is launched with even more modern hardware.
No idea what you're on about.
 
Last edited:

Gamer79

Predicts the worst decade for Sony starting 2022
Pro should not be expected to perform as well as a high-end PC! The entire unit is around the price of an upper-mid GPU!

Take your flowers if you demand them PC fans, but we all know how much you're paying for that privilege!

Also, enjoy nit-picking on PSSR while you can, because let's face it, AI-based upsampling improves over time with training, so it's practically guaranteed to get better in its current form, and no doubt will be even better by the time PS6 is launched with even more modern hardware.
My PC did not cost me much more than the Pro and wipes its ass with the Pro

-Case = $60
-750s modular PSU (gold rated) $75
-CPU (I5-13600kf) $175
-CPU Cooler ($25)
-2tb nvme ($100)
-GPU 4070 ($450)
-Ram 32gb DDR 5 ($99)
-Micro atx mobo - ($150)

(about $1150)

Minus 6 years of PS+ and the pc becomes cheaper and cheaper.
 

hinch7

Member
Frame gen is ass; you're playing at 30/33fps and have the input latency of 30/33fps. Cyberpunk on a 4090 at 4K with path tracing and no frame gen is truly glorious!
Do you have an Nvidia GPU? Path tracing and full RT are something to be experienced, and they're the future of game rendering outside of AI. Granted, it's way too heavy for the current generation of hardware and thus still too early for the masses (Blackwell may move the needle on that), but the tech is amazing. Frame generation included, once you hit certain thresholds in performance.
 
Last edited:

Three

Gold Member
It's unoptimized for all GPUs except 6700 and 6700XT

It's one of the dumbest things I have ever seen on PC.
what's dumb about it exactly?

from what I can gather the only thing that you seem to think is 'dumb' is the fact that on the PC the 6800 is meant to be 44% faster than the 6700 but it's not.
The same arguments can be made about certain games being "unoptimised" on a Pro vs a regular PS5. Remember how much faster it is meant to be, yet how much smaller the gap is in some games in practice?

not sure how true that video even is, considering the 6700 he is using is overclocked with SAM and clocked higher than the 6950, so a number of engines that favour fast with SAM rather than wide would benefit, making the gap smaller than any theoretical gap. Isn't this something we see on PS5 vs XSX too in a lot of engines?

it's 99% GPU utilisation on both the 6700 and 6950 too, so he's complaining about lower power draw to say it's unoptimised?


Still don't see what's dumb about it. The game doesn't perform or look bad on PC for you to call it unoptimised overall even if the 6700 might be punching above its weight overclocked with SAM due to better optimisation on hardware more similar to PS5. It just shows good PS5 optimisation.

Nobody goes around calling A Plague Tale unoptimised and one of the dumbest things they have ever seen, and objectively that performs worse. I mean, what is the issue exactly?

performance-2560-1440.png



performance-2560-1440.png


So I don't get it: are you calling it ugly, are you calling it poor performing, or are you just upset it performs above what is expected on PS5 or "PS5-like" hardware?
 
Last edited:

scydrex

Member
My PC did not cost me much more than the Pro and wipes its ass with the Pro

-Case = $60
-750s modular PSU (gold rated) $75
-CPU (I5-13600kf) $175
-CPU Cooler ($25)
-2tb nvme ($100)
-GPU 4070 ($450)
-Ram 32gb DDR 5 ($99)
-Micro atx mobo - ($150)

(about $1150)

Minus 6 years of PS+ and the pc becomes cheaper and cheaper.
That's $450 more than the Pro, so about 64% more than the price of the Pro. The least it can do is wipe its ass with the Pro. Let me know when a $700 PC is better than the Pro; then I will come back to PC gaming. Also, I have more than 6 years without Plus.
 
Last edited:
Do you have an Nvidia GPU? Path tracing and full RT are something to be experienced, and they're the future of game rendering outside of AI. Granted, it's way too heavy for the current generation of hardware and thus still too early for the masses (Blackwell may move the needle on that), but the tech is amazing. Frame generation included, once you hit certain thresholds in performance.

I've got a 4090, framegen is horrible, thankfully i don't have to use it.
 

MarkMe2525

Banned
It’s funny that some people think their 4070 is good because of frame gen
What does that even mean? That is quite analogous to "it's funny that some people think their PS4 Pro is good because of checkerboarding". In the end, it's about the results, and if said results are "good", then the product or process that made it that way is "good".

Not bullshit but ok buddy.

Rh6TIbL.png
You can own a product and still have a bad take on it; that is what he means when he says bullshit.
 
Last edited:

MarkMe2525

Banned
Bullshit that frame generation is horrible.
but... but... if I take a 1000fps video of the frame gen and pause it at the right time, I can see that the generated frames aren't as good.

framegen is ass, you probably love it on your $900 4070, but I don't need it on my 4090, thanks.
At least you got your username right. Introspection is an important skill that not many have.
 
Last edited:
but... but... if I take a 1000fps video of the frame gen and pause it at the right time, I can see that the generated frames aren't as good.

If I want 30fps input latency I'll just play at 30; I play at higher FPS because I want the better responsiveness of the game.
 

Bojji

Member
what's dumb about it exactly?

from what I can gather the only thing that you seem to think is 'dumb' is the fact that on the PC the 6800 is meant to be 44% faster than the 6700 but it's not.
The same arguments can be made about certain games being "unoptimised" on a Pro vs a regular PS5. Remember how much faster it is meant to be, yet how much smaller the gap is in some games in practice?

not sure how true that video even is, considering the 6700 he is using is overclocked with SAM and clocked higher than the 6950, so a number of engines that favour fast with SAM rather than wide would benefit, making the gap smaller than any theoretical gap. Isn't this something we see on PS5 vs XSX too in a lot of engines?

it's 99% GPU utilisation on both the 6700 and 6950 too, so he's complaining about lower power draw to say it's unoptimised?


Still don't see what's dumb about it. The game doesn't perform or look bad on PC for you to call it unoptimised overall even if the 6700 might be punching above its weight overclocked with SAM due to better optimisation on hardware more similar to PS5. It just shows good PS5 optimisation.

Nobody goes around calling A Plague Tale unoptimised and one of the dumbest things they have ever seen, and objectively that performs worse. I mean, what is the issue exactly?

performance-2560-1440.png



performance-2560-1440.png


So I don't get it: are you calling it ugly, are you calling it poor performing, or are you just upset it performs above what is expected on PS5 or "PS5-like" hardware?

6800 is 44% better than 6700 on average, across many games. In no game should the 6700 perform better than a much bigger GPU - it's like saying that normal PS5 could outperform PS5 pro on the same settings.

The game had idiotic performance; it also launched in a terrible state and needed months to be fixed to some decent level. Still, GPU performance is lower than it should be.

That's why I used "/" in my first post: some games are amazingly optimized for PS5 or amazingly unoptimized on PC.

For The Last of Us, no one will convince me that this game is optimized on PC. It might be well optimized on PS5, but it's also very underperforming on PC.
 
Last edited:
It does make sense: you get smoother panning and better motion clarity from higher fps using frame gen, but it's not actually running the game at the higher FPS; they're fake frames and your input latency stays at whatever the native FPS is. I don't want that; I don't like the feel of input delay. 30fps is jarring, and even 60 is a bit of a chore these days.
 

MarkMe2525

Banned
It does make sense: you get smoother panning and better motion clarity from higher fps using frame gen, but it's not actually running the game at the higher FPS; they're fake frames and your input latency stays at whatever the native FPS is. I don't want that; I don't like the feel of input delay. 30fps is jarring, and even 60 is a bit of a chore these days.
Do you even know how to use frame gen? You aren't supposed to be using it to boost 30fps to 60fps. 60fps to 120fps, maybe. Where it shines is 90fps to 120fps. 100fps to 144fps, etc. The idea is to push the high end even higher, not take a shit source and boost it.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
It does make sense: you get smoother panning and better motion clarity from higher fps using frame gen, but it's not actually running the game at the higher FPS; they're fake frames and your input latency stays at whatever the native FPS is. I don't want that; I don't like the feel of input delay. 30fps is jarring, and even 60 is a bit of a chore these days.
That's not how it works. For one, the higher the base frame rate, the lower your input latency. If frame gen takes you from 90 to 140fps, you won't be feeling the input latency of 30fps.

For two, every game that has frame generation also has Reflex/Reflex+Boost, which dramatically lowers input latency to BELOW native. 40fps+Reflex >>> 40fps without Reflex. As a result, you can toggle on frame generation and still have significantly lower input latency than the base frame rate would have without Reflex. You don't just blindly turn shit on and off with NVIDIA. They have a suite of different software features that are meant to interact with and uplift one another. If you use Reflex+DLSS+Frame Gen, you're going to get 4K-like quality, a much higher fps thanks to frame gen, and latency even lower than if you weren't using Reflex. Furthermore, unless frame gen only takes you to like 40fps from a 20-30 base (at which point you should drop your settings), there's very little reason not to use it, unless you're playing a competitive game, and those are so easy to run that you're getting 300fps, so why would you use frame generation anyway?
 

hinch7

Member
I've got a 4090, framegen is horrible, thankfully i don't have to use it.
That's why I say when you hit a certain threshold. With 40FPS (70-80 with FG), it's bordering on acceptable as far as input lag goes with Reflex when using a controller - which, funnily enough, is still faster than consoles at 60FPS, at well under 100ms. With a 50-60FPS base and then FG added, it's starting to become a fairly decent experience. Anything higher is nice.

With Lossless Scaling I can set FGx3 from 80FPS to 240 and it looks and runs so smooth on my monitor when using a controller. With good responsiveness and ridiculously smooth motion. Granted, UI and static things like crosshairs can ghost a bit and there's still a lot of room for improvement. And that will get better over time as more devs use it and Nvidia+AMD work on it.
 
Last edited:

kevboard

Member
If I want 30fps input latency I'll just play at 30; I play at higher FPS because I want the better responsiveness of the game.

well, there is no such thing as "30fps input lag"
every game has different amounts of input lag. every engine is different, and even within the same engine input lag differs greatly.
Halo 3 has lower input lag at 30fps than God of War 3 at 60fps.

on a mouse, if your input lag is below 70ms your game will feel fine. if it's below 50ms it's near perfect.
thanks to Nvidia Reflex, even with frame gen you can often get sub 70ms latency even at relatively low frame gen fps.

Cyberpunk 2077:
1P8cUR7.png

notice how the native framerate of Frame Gen here would be around 78fps, and of course the card buffers 3 whole frames, yet the input lag only went up by 10ms, and is far below what would cause any issues.


here a heavier scene also from CP2077
WFSOfc0.png

110fps with frame gen has 49ms of lag, so with a native framerate of 55fps.

most games that do not support Nvidia reflex would probably have higher input lag than the frame gen numbers here, while running without frame gen.
for example Call of Duty Warzone on console at a native 120fps has around 47ms of input lag, and probably a similar amount on PC as well if reflex is disabled (not even sure if it supports reflex, but I assume it does)

and anything below 50ms is absolutely top tier still. going lower than that only becomes important in competitive esports shooters.
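
here's a back-of-the-envelope version of that arithmetic: with 2x frame gen the simulated rate is half the displayed rate, and latency mostly scales with the simulated frame time plus some fixed overhead. the frames-in-flight and overhead constants below are guesses for illustration, not measurements, but they land in the same ballpark as the ~49ms screenshot:

```cpp
// Rough latency estimate for 2x frame generation. The constants are
// illustrative assumptions, not measured values.
#include <cstdio>

// Displayed fps with 2x frame gen -> underlying simulated ("native") fps.
static double simulatedFps(double displayedFps) { return displayedFps / 2.0; }

// Crude model: a couple of simulated frames in flight (input sampling,
// simulation, render/present) plus a fixed display/OS cost.
static double estimatedLatencyMs(double simFps,
                                 double framesInFlight = 2.5,
                                 double fixedOverheadMs = 8.0) {
    const double frameTimeMs = 1000.0 / simFps;
    return framesInFlight * frameTimeMs + fixedOverheadMs;
}

int main() {
    const double displayed = 110.0;              // fps shown with frame gen
    const double sim = simulatedFps(displayed);  // ~55 fps actually simulated
    std::printf("displayed %.0f fps -> simulated %.0f fps -> ~%.0f ms latency\n",
                displayed, sim, estimatedLatencyMs(sim));
    return 0;
}
```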
 
Last edited:

Three

Gold Member
6800 is 44% better than 6700 on average, across many games. In no game should the 6700 perform better than a much bigger GPU
so you're just upset that the PS5, 6800 and 6800XT appear to perform well relatively?

It isn't even necessarily true. Are you perhaps looking at the 1% lows, or at the high overclocks on the 6700 he is using with SAM?

I assume you're using the TechPowerUp relative performance figure for that 44%. Yet when you look at the actual TechPowerUp benchmarks, how much of a gap do you see between the 6700XT and 6800XT at 1080p? (1080p is also something that benefits fast vs wide in a lot of engines, so he can hammer IO, which I assume is why he used it.)

You said it's only well optimised on the 6700 and 6700XT because the relative improvement on the tier-up card isn't there? Well...

TLOU part1 remake
6700XT average fps = 61
6800XT average fps = 90

% performance increase = (90-61)/61 = 48% increase in performance

A Plague Tale
6700XT average fps = 70
6800XT average fps = 102

% performance increase = (102-70)/70 = 46% increase in performance

that's lower. Explain that then: if the game is "only optimised for the 6700 XT" due to relative performance between the cards, why is the performance increase objectively lower in A Plague Tale?

with a relatively high OC and SAM a lot of games would go blow for blow with slightly more powerful cards under specific conditions, especially since it's not just the GPU benefitting from increased clocks.


- it's like saying that normal PS5 could outperform PS5 pro on the same settings.
yes, if you could overclock the base PS5 and set conditions that don't benefit the wider GPU vs the faster one, you could, but that's not happening unless you've found a way to overclock the PS5. Even then it would only be in very specific things that it would go blow for blow with the bigger GPU. You've already seen examples of fast vs wide (bigger) with PS5 vs XSX anyway, with the smaller, faster one performing better.

For The Last of Us, no one will convince me that this game is optimized on PC. It might be well optimized on PS5, but it's also very underperforming on PC.
Only based on the fact that it runs well on a PS5, 6700 and 6700XT I'm sure, irrationally.
 
Last edited:
I've got a 4090, framegen is horrible, thankfully i don't have to use it.
I don't have to use DLSS FG, because I can get well over 60fps at 1440p even without FG (at 1440p my 4080S gets higher fps than an RTX 4090 at 4K), but I see no point in not using it, since this technology works so well. I get improved image quality (on a sample-and-hold display, higher fps drastically improves motion quality) and in my experience DLSS FG also improves aiming.

I tried playing Cyberpunk at 1440p native without upscaling, with path tracing, and I had like 40fps. I could play like this on my VRR display, but my aiming wasn't as precise and smooth as I would like. As soon as I turned on DLSS FG my aiming improved dramatically because my eyes could track moving objects way more easily.

I played games like Quake and Unreal Tournament for over two decades, and not many people can feel increased input latency as easily as I can. I remember when I bought an Xbox One X back in 2017 and people on this forum attacked me when I said this game was unplayable because of terrible input lag. Later on, youtubers (Digital Foundry or NXGamer, I don't remember now) tested input latency in this game and it was twice as high compared to Gears of War Ultimate. I didn't need equipment to notice input lag problems, dude; that's how sensitive to input lag I am. I can even tell the difference between 170fps with vsync and 170fps without vsync on my monitor. With vsync there's a feeling of weight during mouse movement even at such high fps.

With DLSS FG, mouse movement isn't affected nearly as much. Sure, I can still feel the difference at a base 40fps in Cyberpunk when I turn on FG, but that difference is very subtle and definitely not terrible like you said, and because FG generates so many frames it actually improves my aiming, so there's no reason to play without it even at such low fps. Most of the time, however, I use FG when my average framerate is higher than 60fps. Many games can run at 100fps on average but can dip below 60fps from time to time, and it's better to use FG than to lower graphics settings. Thanks to DLSS FG I no longer notice sub-60fps dips.

What's interesting is that in some games, such as Black Myth: Wukong, I actually get lower input latency with DLSS FG (48ms with FG vs. 60ms without FG) because the Nvidia drivers enable Reflex, and you can't use this feature separately in that game.

In Cyberpunk with Psycho RT at 1440p DLSS Quality and FG, I get 27ms input latency, as you can see in the corner of my monitor.

20241123-182840.jpg


With PT instead of standard RT I get 48ms

20241123-182801.jpg


30fps games have around 120ms input latency, dude. I would need to turn on vsync to make DLSS FG input latency that terrible (for some strange reason DLSS FG doesn't like vsync; you get a 100ms input lag penalty). You said DLSS FG makes games feel like 30fps, and that's simply not true, as my photos prove.
 
Last edited:
Callisto in 8K looks like something I'd call PS5.5. It definitely is a huge leap over the base PS5. Not sure how Callisto looks on a 4070; can it do 8K?
 

DenchDeckard

Moderated wildly
If there is an award for "L" of the year for video game releases, I think it goes to the Pro.

It's not performing how we expected, and it's super expensive. Doesn't come with a drive stand (beaten to death, I know).

It just seems so poorly designed I'm completely confused about it.
 

FireFly

Member
I'm still waiting for the roughly double PS5 performance that MLiD was claiming, used by posters here to "prove" Sony was lying to their own users about the Pro's performance advantage.
 
If there is an award for "L" of the year for video game releases, I think it goes to the Pro.

It's not performing how we expected, and it's super expensive. Doesn't come with a drive stand (beaten to death, I know).

It just seems so poorly designed I'm completely confused about it.

I say this as someone that definitely looks forward to playing AW2 and SH2, but depending on the games you've played, the Pro is either doing phenomenal or meh. I've played TLOU2, Spider-Man: Miles Morales, Callisto Protocol, Dying Light 2, and Ratchet and Clank, and they all look amazing, clearly above PS5 amateur... so 🤷🏾‍♂️
 
Last edited:
No. In pure rasterization it's quite comparable to the 4070, but when ray tracing is used, or basically whenever Unreal is being used, it's below the 4060.
 

SweetTooth

Gold Member
Looking at the CP patch... it's on par with a 4090 🤣

It really depends on how devs are utilizing the power; some will make an effort, others will slap out a rush job. In the end it's the most powerful console ever created and buyers will benefit from this fact.
 