
Is the PS5 Pro on par with an RTX 4070, like Digital Foundry claimed before launch?

Clear

CliffyB's Cock Holster
My PC did not cost me much more than the Pro and wipes its ass with the Pro

-Case ($60)
-750W modular PSU, gold rated ($75)
-CPU, i5-13600KF ($175)
-CPU cooler ($25)
-2TB NVMe ($100)
-GPU, RTX 4070 ($450)
-RAM, 32GB DDR5 ($99)
-Micro-ATX mobo ($150)

(about $1150)

Minus 6 years of PS+ and the PC becomes cheaper and cheaper.

So what you're saying is, I'm right to point out that you are paying more to get more. You're just choosing to ignore that very significant price bump over the Pro, and of course the fairly massive one compared to a regular PS5 which offers the exact same gaming experiences at a modestly lower level of fidelity.

Also, the whole "savings over time" argument only works if you spend less on software over time, which is in no way a certainty. In fact it strikes me as rather counter-intuitive: if you are willing to stump up more up-front, why would you scrimp and save on smaller purchases over time? It's not like there aren't subscription services on PC too!
 

GymWolf

Member
PSSR doesn't seem to be as good (or at least as consistent) as DLSS, and it doesn't have frame gen, so... no?!
 
Last edited:
  • Like
Reactions: KU_

Senua

Member
Looking at the CP patch... it's on par with a 4090 🤣
 

vkbest

Member
The Pro version of AW2 clearly got a lot of care.

They changed how the BVH works by hand and at a granular level, adjusting the distance of the BVH to make sure the most important distant detail, like mountains, is still captured, while less important detail is reined in.

They also adjusted the tick rate of the different types of dynamic objects in the BVH to save performance where it doesn't hurt, while keeping important things like player characters and key NPCs updating every frame.
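A minimal sketch of what that tick-rate tiering could look like; the tier names and cadences here are illustrative assumptions, not Remedy's actual values:

```python
# Illustrative sketch only (not Remedy's code): dynamic objects get tiered
# BVH refit rates, so the expensive per-frame work goes where it's visible.
from enum import IntEnum

class UpdateTier(IntEnum):
    HERO = 1      # player characters and key NPCs: refit every frame
    AMBIENT = 4   # minor dynamic props: refit every 4th frame
    DISTANT = 8   # far scenery kept in the BVH: refit every 8th frame

def should_refit(tier: UpdateTier, frame: int) -> bool:
    """Refit this object's BVH entry only on its tier's cadence."""
    return frame % int(tier) == 0

# Over 8 frames, HERO refits 8 times, AMBIENT twice, DISTANT once.
for frame in range(8):
    print(frame, [t.name for t in UpdateTier if should_refit(t, frame)])
```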

All of this doesn't guarantee that they got every last bit of performance out of the system, but it absolutely shows that the port was handled with care.

Again, something that clearly cannot be said about Callisto Protocol on PC, where many glaring issues remain to this day and were catastrophic at launch.
It's not using mesh shaders even though, according to leaks, we know the Pro can do it.
 

Gaiff

SBI’s Resident Gaslighter
Closer to the 3070


And he gets it completely wrong. He uses the internal render resolution on PC without upscaling. That’s significantly less demanding than upscaling.

1440p upscaled to 4K will have lower fps than native 1440p. No wonder he arrived at the conclusion that it's comparable to a 3070.

Edit: Never mind. He uses a 4K output with DLSS. Still, too many leaps and not enough rigor in his tests. It’s all over the place.
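A rough back-of-the-envelope (assumed, made-up frame times) of why upscaled output can't be cheaper than the bare internal resolution:

```python
# Assumed numbers purely for illustration: the upscaled path pays for the
# full 1440p render PLUS the upscaler pass, so it can't beat native 1440p.
render_1440p_ms = 10.0   # hypothetical internal 1440p render cost
upscaler_ms = 1.5        # hypothetical DLSS/PSSR pass cost at 4K output

print(f"native 1440p:         {1000 / render_1440p_ms:.0f} fps")                  # 100 fps
print(f"1440p upscaled to 4K: {1000 / (render_1440p_ms + upscaler_ms):.0f} fps")  # ~87 fps
```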
 
Last edited:

PeteBull

Member
Before the PS5 Pro there were some crazy claims being thrown about. So did this statement hold true, or is Richard from Digital Foundry caught talking out of his ass?

It can be, in a best-case scenario, which ofc sometimes happens, but it's "up to" that level of power, not a baseline.
 

DForce

NaughtyDog Defense Force
Closer to the 3070



He's using an unpatched Callisto Protocol and Elden Ring in his tests.

He does compare it to the PATCHED version of Callisto Protocol right after, but he doesn't mention that it's probably using the older SDK version for RT, which can be seen in the article he showed in his video.

He's also getting dips in the low 50s during gameplay on the 3070.

He needs to leave it to DF.
 

Bojji

Member
So you're just upset that the PS5, 6800 and 6800 XT appear to perform well relatively?

It isn't even necessarily true. Are you perhaps looking at the 1% lows, or at the high overclock on the 6700 he is using with SAM?

I assume you're using the TechPowerUp relative-performance figure for that 44%. Yet when you look at the actual TechPowerUp benchmarks, how much of a gap do you see between the 6700 XT and 6800 XT at 1080p? (1080p also favours fast-vs-wide in a lot of engines so he can hammer IO, which I assume is why he used it.)

You said it's only well optimised on the 6700 and 6700 XT because the relative improvement on the tier-up card isn't there? Well...

TLOU Part 1 Remake
6700 XT average fps = 61
6800 XT average fps = 90

% performance increase = (90 - 61) / 61 = 48% increase in performance

A Plague Tale
6700 XT average fps = 70
6800 XT average fps = 102

% performance increase = (102 - 70) / 70 = 46% increase in performance

That's lower. Explain it, then: if the game is "only optimised for the 6700 XT" due to relative performance between the cards, why is the performance increase objectively lower in A Plague Tale?
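Checking that arithmetic (a trivial sketch, figures taken from the averages above):

```python
# Recomputing the two uplift figures quoted above.
def uplift_pct(slower_fps: float, faster_fps: float) -> float:
    return (faster_fps - slower_fps) / slower_fps * 100

print(f"TLOU Part I:   {uplift_pct(61, 90):.0f}%")   # 48%
print(f"A Plague Tale: {uplift_pct(70, 102):.0f}%")  # 46%
```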

With a relatively high OC and SAM, a lot of games would go blow for blow with slightly more powerful cards under specific conditions, especially since it's not just the GPU benefitting from increased clocks.



Yes, if you could overclock the base PS5 and set conditions that don't benefit the wider GPU vs the faster one, you could, but that's not happening unless you've found a way to overclock the PS5. Even then it would only go blow for blow with the bigger GPU in very specific things. You've already seen examples of fast vs wide (bigger) with PS5 vs XSX anyway, with the smaller, faster one performing better.


Only based on the fact that it runs well on a PS5, 6700 and 6700 XT, I'm sure, irrationally.

Most reviewers don't use Resizable BAR (so SAM) when benchmarking. Ancient Gameplays' tests are very good overall.

The 6800 is better in every way; NO GAME should perform better on a 6700. It's impossible unless the code is fucked up.

What TLoU was doing is limiting the performance of ALL GPUs except the 6700/6700 XT. This is the same GPU the PS5 is using (both the 6700 and the PS5 are cut-down versions of the 40CU die).

Performance of this game is just idiotic. It's also an outlier in DF's PS5 vs. the world tests:

[DF benchmark charts: PS5 vs. PC GPUs across several games]


2x performance ^

vs. The Last of Us:

[TLOU benchmark charts]


A 100% uplift in many games vs. 36% in one game. And this begs the question: is it ultra-amazing optimization on PS5 or just a shit port on PC?

Callisto in 8K looks like something I'd call PS5.5. It definitely is a huge leap over the base PS5. Not sure how Callisto looks on a 4070; can it do 8K?

CP is native 4K in its 8K mode on PS5 Pro. It's probably not using maximum settings either (this PC chart is on max):

[Callisto Protocol PC benchmark chart, max settings]


No. In pure rasterization it's quite comparable to the 4070, but with ray tracing, or basically whenever Unreal is being used, it's below the 4060.

It's worse than a 6800 in raster, and a 4070 is usually faster than that.
 

SpokkX

Member
It surpasses the 4070 in some games, in others it doesn't.
about that... which games are those?

The 4070 is around 30 TFLOPS vs 18 TFLOPS in the PS5 Pro.
The 4070 has DLSS, an extremely efficient and fast upscaler, while the PS5 Pro has a VERY demanding upscaler that compares more to FSR in quality.

They are MILES apart imo...

If there is a game where the PS5 Pro outperforms a 4070 with the same settings and resolution/upscaling quality, I would like to see an example.
 

lh032

I cry about Xbox and hate PlayStation.
My PC did not cost me much more than the Pro and wipes its ass with the Pro

-Case ($60)
-750W modular PSU, gold rated ($75)
-CPU, i5-13600KF ($175)
-CPU cooler ($25)
-2TB NVMe ($100)
-GPU, RTX 4070 ($450)
-RAM, 32GB DDR5 ($99)
-Micro-ATX mobo ($150)

(about $1150)

Minus 6 years of PS+ and the PC becomes cheaper and cheaper.
The price above is not realistic imo; seems you are trying too hard.
It does not make sense in my country, and I'm sure it does not apply to most people out there either.
 
Last edited:

Bojji

Member
Ratchet & Clank is the only one I can think of. Callisto Protocol maybe as well, but that was poorly made for PC, specifically on NVIDIA.

Maybe GOW Ragnarok, for sure The Last of Us.

Those games are mediocre ports on PC (TLoU was terrible; it's merely mediocre after tons of patches).

Ratchet performs poorly if you don't have 16GB of VRAM. I tested it myself when I had a 4070 Ti, and now, over a year later, on a 4070 Ti Super it runs MUCH better, 30% or even more, while the difference between the 4070 Ti and 4070 Ti Super is ~10% on average.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
about that... which games are those?
A few PS5 exclusives. The 4070S is only 36% faster than the regular PS5. The Pro is 29% faster. They're almost equal there.

GOWR is another candidate. The 4070 is only around 20% faster than the regular PS5. The Pro should come close to a 4070S there as well.
 

daninthemix

Member
So the upshot is, the Pro is probably losing to the middle-tier card from Nvidia's last gen that is about to be replaced?
 

Three

Gold Member
Most reviewers don't use resizable bar (so SAM) when benchmarking. Ancient Gameplays tests are very good overall.
The reviewer you yourself linked did use SAM and an overclock, setting the frequency on the 6700 way above even a 6950's. So what are you getting at? You yourself linked to that dumb 1080p overclocked SAM benchmark between a 6700 and 6800.
6800 is better in every way, NO GAME should perform better on 6700 - it's impossible unless code is fucked up.
No game generally does.

A game on a system where you set the resolution to 1080p and end up approaching some other bottleneck would show a reduced delta for the thing that is no longer the bottleneck (especially bottlenecks overcome with direct CPU access to GPU memory at higher clocks). You can increase GPU size and you'd still have the same bottleneck. That's how bottlenecks work.

A lot of games behave exactly the same, especially when you start overclocking the lower card higher than the other cards. The clue should have been SAM giving the CPU access to GPU memory: the bottleneck is no longer GPU size, it's IO with the CPU.
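A toy model of that bottleneck argument, with made-up numbers purely to show the shape of it:

```python
# Effective fps is capped by the slowest stage; once a CPU/IO limit binds,
# extra GPU width stops showing up in the averages.
def effective_fps(gpu_limit_fps: float, cpu_io_limit_fps: float) -> float:
    return min(gpu_limit_fps, cpu_io_limit_fps)

CPU_IO_CAP = 90.0  # hypothetical non-GPU cap at 1080p on this setup

print(effective_fps(85.0, CPU_IO_CAP))   # overclocked 6700: 85 (still GPU-bound)
print(effective_fps(120.0, CPU_IO_CAP))  # 6800: capped at 90, the delta collapses
```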

I even showed you that the performance jump from the 6700 XT to the 6800 XT is bigger in TLOU than in A Plague Tale, despite this, even at that res on a non-bottlenecked system. You chose to ignore this completely. I gave you an example of a higher-clocked 36CU GPU console performing better than a 56CU GPU console in games that aren't TLOU. Again you ignored it. I gave you the fact that the PS5 Pro is a 60CU GPU vs the PS5's much smaller 36CU, yet the delta is not what you would expect in some games. I'm sure you will refrain from calling games dumb and idiotic and blaming the hardware, and begin to understand that other bottlenecks exist beyond GPU size. You're tiresome.
What TLoU was doing is limiting the performance of ALL GPUs except the 6700/6700 XT. This is the same GPU the PS5 is using (both the 6700 and the PS5 are cut-down versions of the 40CU die).

Performance of this game is just idiotic. It's also an outlier in DF's PS5 vs. the world tests:

[DF benchmark charts: PS5 vs. PC GPUs across several games]


2x performance ^

vs. The Last of Us:

[TLOU benchmark charts]


A 100% uplift in many games vs. 36% in one game. And this begs the question: is it ultra-amazing optimization on PS5 or just a shit port on PC?
Yes, relative to a PS5 it's a 36% uplift because it's a well-optimised game on PS5 hardware, but you were making silly claims like how it's "only optimised for a 6700XT", claiming that the delta on a 6700 XT vs 6800 XT is nonexistent using a dumb benchmark you didn't understand. That's simply bullshit. I showed you the delta is larger than in games like A Plague Tale. You're seemingly just upset that the game is well optimised on PS5, so the gap between it and PC is smaller than you'd hoped. The game looks fantastic and runs very well, even on PC, with deltas that are typical.
 
Last edited:

STARSBarry

Gold Member
It looks more like a standard PS5 with a shitty FSR upscaler

For the love of God, can we just get this basic name right? It's called PSSR, pronounced Pisser.

Like, why can't people get used to saying "wow, PlayStation's shitty Pisser makes my games look like crap"?

As someone who owns a PS5 Pro, it's really taking the pisser.
 

Seider

Member
Does a gaming PC with an RTX 4070 cost $700 like the PS5 Pro? I think not.

Why are we comparing these systems then?
 

Bojji

Member
The reviewer you yourself linked did use SAM and an overclock, setting the frequency on the 6700 way above even a 6950's. So what are you getting at? You yourself linked to that dumb 1080p overclocked SAM benchmark between a 6700 and 6800.

No game generally does.

A game on a system where you set the resolution to 1080p and end up approaching some other bottleneck would show a reduced delta for the thing that is no longer the bottleneck (especially bottlenecks overcome with direct CPU access to GPU memory at higher clocks). You can increase GPU size and you'd still have the same bottleneck. That's how bottlenecks work.

A lot of games behave exactly the same, especially when you start overclocking the already higher-clocked lower card. The clue should have been SAM giving the CPU access to GPU memory: the bottleneck is no longer GPU size, it's IO with the CPU.

I even showed you that the performance jump from the 6700 XT to the 6800 XT is bigger in TLOU than in A Plague Tale, despite this, even at that res. You chose to ignore this completely. I gave you an example of a higher-clocked 36CU GPU console performing better than a 56CU GPU console in games that aren't TLOU. Again you ignored it. I gave you the fact that the PS5 Pro is a 60CU GPU vs the PS5's much smaller 36CU, yet the delta is not what you would expect in some games. I'm sure you will refrain from calling games dumb and idiotic and blaming the hardware, and begin to understand that other bottlenecks exist beyond GPU size. You're tiresome.

Yes, relative to a PS5 it's a 36% uplift because it's a well-optimised game on PS5 hardware, but you were making silly claims like how it's "only optimised for a 6700XT", claiming that the delta on a 6700 XT vs 6800 XT is nonexistent using a dumb benchmark you didn't understand. That's simply bullshit. I showed you the delta is larger than in games like A Plague Tale. You're seemingly just upset that the game is well optimised on PS5, so the gap between it and PC is smaller than you'd hoped. The game looks fantastic and runs very well, even on PC.


This is the 6800 vs. the faster 6700 XT:

[6800 vs. overclocked 6700 XT benchmark chart]

What is more likely: that MANY games are unoptimized on PS5, so the console and PC GPUs perform according to their specs,

OR

that The Last of Us uses magic and allows PS5 hardware to reach the performance of a GPU 40% faster than what is inside the console?
 

Three

Gold Member
This is the 6800 vs. the faster 6700 XT:

[6800 vs. overclocked 6700 XT benchmark chart]

What is more likely: that MANY games are unoptimized on PS5, so the console and PC GPUs perform according to their specs,

OR

that The Last of Us uses magic and allows PS5 hardware to reach the performance of a GPU 40% faster than what is inside the console?
So again, completely oblivious to what I said about your dumb benchmark and the fact that TLOU actually does perform better on higher-end cards when not bottlenecked by something else. Completely oblivious to the points made about PS5 vs XSX GPU sizes not resulting in the deltas you'd expect, and PS5 vs PS5 Pro GPU sizes not resulting in the deltas you'd expect. There is no "magic" required. You asked somebody to explain a benchmark you posted. I explained it. Yet you weren't after an explanation; you were after "PS software bad, PlayStation hardware bad". Cool, you're tiresome, but keep on trucking.
 
Last edited:

Bojji

Member
So again, completely oblivious to what I said about your dumb benchmark and the fact that TLOU actually does perform better on higher-end cards when not bottlenecked by something else. Completely oblivious to the points made about PS5 vs XSX GPU sizes not resulting in the deltas you'd expect, and PS5 vs PS5 Pro GPU sizes not resulting in the deltas you'd expect. There is no "magic" required. You asked somebody to explain a benchmark you posted. I explained it. Yet you weren't after an explanation; you were after "PS software bad, PlayStation hardware bad". Cool, you're tiresome, but keep on trucking.

Fuck me.

Explain how a 36CU GPU can outperform a 60CU GPU. The only thing the 6700 has over the 6800 is a ~15% higher clock.
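For reference, the on-paper throughput gap (RDNA2 runs 64 shaders per CU at 2 FP32 ops per clock; clocks are approximate public boost figures):

```python
# On-paper FP32 throughput from CU count and clock. Approximate specs only.
def tflops(cus: int, boost_ghz: float) -> float:
    return cus * 64 * boost_ghz * 2 / 1000

print(f"RX 6700 (36 CU @ ~2.45 GHz): {tflops(36, 2.45):.1f} TF")  # ~11.3 TF
print(f"RX 6800 (60 CU @ ~2.10 GHz): {tflops(60, 2.10):.1f} TF")  # ~16.1 TF, ~43% more
```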

You can believe that ND pulled some magic moves and the PS5 is performing 40% better than it should, but what is far more likely is that they dropped a shit port on PC.

[TLOU benchmark chart]


Every GPU except the 6700/XT is underutilized in this game. They don't even draw the same amount of power (W) as in other games.

[GPU power draw comparison charts]
 
Last edited:

twilo99

Member
No. In pure rasterization it's quite comparable to the 4070, but with ray tracing, or basically whenever Unreal is being used, it's below the 4060.

I think if you pair a 4070 with the archaic CPU from the PS5 “pro” you might get very similar performance, but who in their right mind would build such a crippled PC in 2025??
 

JackMcGunns

Member
Not in a normal scenario where you would pair it with a nice CPU, but maybe if you bottlenecked the crap out of it with a crappy CPU; even then it might be a stretch.
 
I find it amusing that some people thought Sony would instantly succeed where AMD failed, at least with upscaling.

Frame gen should also be factored in. Fluid Motion Frames has been incredibly useful on the occasions when I've been forced to turn it on.
 

Three

Gold Member
Fuck me.

Explain how a 36CU GPU can outperform a 60CU GPU. The only thing the 6700 has over the 6800 is a ~15% higher clock.
You can believe that ND pulled some magic moves and the PS5 is performing 40% better than it should, but what is far more likely is that they dropped a shit port on PC.
[TLOU benchmark chart]


Every GPU except the 6700/XT is underutilized in this game. They don't even draw the same amount of power (W) as in other games.

[GPU power draw comparison charts]
I've explained it already. Not sure if you're blind or being obtuse. He did a benchmark at 1080p, overclocked the 6700 to 2800MHz (default is 1900MHz, I believe?) and enabled SAM, so CPU tasks that require copying things between RAM pools are sped up, benefitting from high clocks and fewer memory misses. He's created a specific scenario and introduced an fps bottleneck that isn't GPU size anymore.

The TechPowerUp 44% figure isn't for a GPU that's been overclocked to 2800MHz with better timings either, so stop playing dumb, and it isn't one where you've introduced another bottleneck. What you are claiming is dumb. It's like saying "X publisher only optimised for Y CPU", then running a game at 8K, showing only marginal differences between CPU tiers, and believing this is a logical take. When you look at actual TechPowerUp TLOU benchmarks between the 6700 XT and 6800 XT on non-bottlenecked setups, you see a 46% delta between these two cards. Bigger than in games like A Plague Tale.
He's intentionally set up a condition where the bigger GPU doesn't help but his high overclock does, for IO. He's essentially recreated a PS5 vs XSX scenario with these high clocks; that is a 36CU GPU vs a 56CU GPU, yet games don't show that delta. It often goes in the PS5's favour, even. Every dev is dumb and idiotic then? Including those who make games for 36CU vs 60CU (or whatever it is on the PS5 Pro) without showing these types of deltas?
You ignore these points and rant on with your nonsense over and over, completely ignoring any discussion after asking for an explanation.
 

Three

Gold Member
Three, Bojji, what the fuck are you guys talking about?
He seems to think TLOU is "only optimised for a 6700 and 6700 XT" and used a flawed benchmark to push this idiotic point, asking for an explanation. I showed it doesn't scale atypically when it's not intentionally placed in a scenario bottlenecked by other parts of the system. He proceeds to ignore it and keeps trying to hammer home "PS hardware/software bad" and "magic". And round and round we go.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
Bojji, is there a reason you're using a single benchmark to prove that the 6700 outperforms the 6800 XT in TLOU Part I when every other reputable source on the internet clearly has the 6800 XT being much faster? What point are you trying to make? That with an OC, a CPU/memory bottleneck, and SAM, a 6700 is faster than a 6800 XT in this game?
 
Last edited:

SHA

Member
Exactly.

When the console doesn't even reach a 40% improvement over base, absolutely not, even though there are some exceptions exceeding what was promised.

[performance comparison chart]


The thing is, with a 4070 you would never risk having developers force a broken DLSS implementation on you.
Capcom prefers PlayStation; the Xbox isn't weaker. I actually hate the new Monster Hunter graphics; RE9 shouldn't follow MH's path or they'll fall really hard.
 
Last edited:

Bojji

Member
I've explained it already. Not sure if you're blind or being obtuse. He did a benchmark at 1080p, overclocked the 6700 to 2800MHz (default is 1900MHz, I believe?) and enabled SAM, so CPU tasks that require copying things between RAM pools are sped up, benefitting from high clocks and fewer memory misses. He's created a specific scenario and introduced an fps bottleneck that isn't GPU size anymore.

The TechPowerUp 44% figure isn't for a GPU that's been overclocked to 2800MHz with better timings either, so stop playing dumb, and it isn't one where you've introduced another bottleneck. What you are claiming is dumb. It's like saying "X publisher only optimised for Y CPU", then running a game at 8K, showing only marginal differences between CPU tiers, and believing this is a logical take. When you look at actual TechPowerUp TLOU benchmarks between the 6700 XT and 6800 XT on non-bottlenecked setups, you see a 46% delta between these two cards. Bigger than in games like A Plague Tale.
He's intentionally set up a condition where the bigger GPU doesn't help but his high overclock does, for IO. He's essentially recreated a PS5 vs XSX scenario with these high clocks; that is a 36CU GPU vs a 56CU GPU, yet games don't show that delta. It often goes in the PS5's favour, even. Every dev is dumb and idiotic then? Including those who make games for 36CU vs 60CU (or whatever it is on the PS5 Pro) without showing these types of deltas?
You ignore these points and rant on with your nonsense over and over, completely ignoring any discussion after asking for an explanation.

Overclock alone is this much difference, and all GPUs are OCed:

[overclock scaling benchmark chart]


You really think that this is comparable in any shape or form to Xbox vs. PS5?

Bojji, is there a reason you're using a single benchmark to prove that the 6700 outperforms the 6800 XT in TLOU Part I when every other reputable source on the internet clearly has the 6800 XT being much faster? What point are you trying to make? That with an OC, a CPU/memory bottleneck, and SAM, a 6700 is faster than a 6800 XT in this game?

Most reviewers don't enable ReBAR (or at least they didn't when they tested this game). That massive difference is only seen with it. Have you watched the video?

The bench is CPU-limited on the 6950; that's why I'm focusing on the 6800 vs. 6700.

I think we can safely say at this stage it outperforms the 4070, but has PSSR issues which are giving false comparisons.

Clearly better than the 4070 in one of the heaviest games around:

[Alan Wake 2 PS5 Pro tech review comparison: Pro vs PS5, PSSR vs DLSS, Pro RT vs PC]
 
Last edited:

cormack12

Gold Member
Overclock alone is this much difference, and all GPUs are OCed:

[overclock scaling benchmark chart]

You really think that this is comparable in any shape or form to Xbox vs. PS5?

Most reviewers don't enable ReBAR (or at least they didn't when they tested this game). That massive difference is only seen with it. Have you watched the video?

The bench is CPU-limited on the 6950; that's why I'm focusing on the 6800 vs. 6700.

Clearly better than the 4070 in one of the heaviest games around:

[Alan Wake 2 PS5 Pro tech review comparison: Pro vs PS5, PSSR vs DLSS, Pro RT vs PC]
What CPU is it paired with?
 

Three

Gold Member
Overclock alone is this much difference, and all GPUs are OCed:

[overclock scaling benchmark chart]
What do you think "@ OC + SAM" means on the other one? The one that you keep saying performs better. The others don't even show increased performance; it's a marginal decrease vs the 6800 because he's at a low res, more bottlenecked by his CPU/memory at 89 fps.
 
Last edited:

Bojji

Member
What CPU is it paired with?

Doesn't matter; the game is purely GPU-limited here. AW2 is also not very CPU-heavy.

What do you think "@ OC + SAM" means on the other one? The one that you keep saying performs better. The others don't even show increased performance; it's a marginal decrease vs the 6800 because he's at a low res, more bottlenecked by his CPU/memory at 89 fps.

OC is OC alone; OC + SAM is the same OC with Smart Access Memory enabled. It makes a big difference in some games. If he were CPU-limited, both the 6800 and 6700 would have the same performance.
 
about that... which games are those?

The 4070 is around 30 TFLOPS vs 18 TFLOPS in the PS5 Pro.
The 4070 has DLSS, an extremely efficient and fast upscaler, while the PS5 Pro has a VERY demanding upscaler that compares more to FSR in quality.

They are MILES apart imo...

If there is a game where the PS5 Pro outperforms a 4070 with the same settings and resolution/upscaling quality, I would like to see an example.
Relative FLOPS performance can only be compared within the same architecture. For example, the RTX 3070 has 20TF, while the 2080 Ti has 13TF. You might think the RTX 3070 should destroy the RTX 2080 Ti based on the FLOPS metric, but in reality both cards offer the same level of performance.

If you want to know how fast the PS5 Pro GPU really is compared to other PC GPUs, just read the RX 6800 reviews, as the PS5 Pro GPU is very similar when it comes to specs.
The PS5 Pro should be just as fast in raster, and probably even faster in RT, as Mark Cerny mentioned that they improved RT performance on the PS5 Pro.
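To make the FLOPS point concrete, a quick check with approximate public spec-sheet figures (shader counts and boost clocks rounded):

```python
# TFLOPS is just shaders x clock x 2 ops; it only compares fairly within
# one architecture, as the real-world results above show.
def tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * boost_ghz * 2 / 1000

print(f"RTX 3070:    {tflops(5888, 1.73):.1f} TF")  # ~20 TF
print(f"RTX 2080 Ti: {tflops(4352, 1.55):.1f} TF")  # ~13 TF, yet similar real-world speed
```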

BTW, I just watched this video and I cannot stop laughing ;P

 
Last edited:

twilo99

Member
Intel will be interesting in the low/mid tier with the Battlemage launch.

It will make building a PS5 Pro PC even easier... as long as they get their drivers sorted, of course.

I've been saying since launch that the 7700 XT/7800 XT GPUs are the closest to what the PS5 Pro can offer, but no one likes that take.
 

Gamer79

Predicts the worst decade for Sony starting 2022
I find it amusing that some people thought Sony would instantly succeed where AMD failed, at least with upscaling.

Frame gen should also be factored in. Fluid Motion Frames has been incredibly useful on the occasions when I've been forced to turn it on.
If you throw in Frame Generation with DLSS 3, it makes the PS5 Pro look like the red-headed stepchild.

On my 4070 with DLSS 3 and frame gen, I get over 120 fps at 1440p with ultra settings.
 

cormack12

Gold Member
Doesn't matter; the game is purely GPU-limited here. AW2 is also not very CPU-heavy.

OC is OC alone; OC + SAM is the same OC with Smart Access Memory enabled. It makes a big difference in some games. If he were CPU-limited, both the 6800 and 6700 would have the same performance.
I'm sure pairing it with a 5800X3D will drop frames, to be fair.
 