
PS5 Pro Specs Leak Are Real, Releasing Holiday 2024 (Insider Gaming)

Gaiff

SBI’s Resident Gaslighter
Bringing up the CPU as a bottleneck when the 4090 vs. the 7900 XTX goes from 14% faster at 1080p non-RT to 35% faster at 1080p RT, or when the 4070 Ti vs. the 7900 XTX goes from 10% slower at 1080p non-RT to 2% faster at 1080p RT, all while the CPU is being controlled for as a constant, is just poor logic and ignorance. Sorry, gotta call it what it is.
Not how it works. Just because the difference increases from non-RT to RT doesn't mean you're no longer CPU-limited.

I mean, look at this:

[Chart: relative RT performance, 1920×1080]


[Chart: relative RT performance, 3840×2160]


We go from 38% faster to 55% faster. The 4090 is still CPU-limited at 1080p with RT, and even more so without it. Once again, the 1080p benches feature a myriad of tests where RT barely has an effect on performance because its usage is minimal, and even with it, you are largely CPU-limited, which is why you see only 35% at 1080p but over 50% at 4K. I told you to use the path-tracing tests because the GPU load is so damn heavy that even at 1080p, a halfway decent CPU won't be the limitation. You can choose to ignore them, but they're far more representative of pure ray tracing performance than tests that include games with only RT shadows.
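To put the heuristic in plain terms, here's a minimal sketch; the 38%/55% leads are the figures from the charts above, and the rule itself is just the inference being made here, not a formal test:

```python
# Minimal sketch: if the faster card's lead widens as resolution rises,
# its 1080p result was capped by something other than the GPU (usually
# the CPU). The 38%/55% leads are from the summary charts above.
leads = {"1080p RT": 0.38, "4K RT": 0.55}

if leads["4K RT"] > leads["1080p RT"]:
    gap = leads["4K RT"] - leads["1080p RT"]
    print(f"Lead widens by {gap:.0%} at 4K -> the 1080p run was partly CPU-capped")
```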

Look at Far Cry 6 as an example. At 1080p, the ray tracing impact really isn't that big, with a 21% performance loss.

[Chart: Far Cry 6 RT, 1920×1080]


But then compare that to F1 2022.

[Chart: F1 2022 RT, 1920×1080]


60% performance loss. The distribution is very wide, so taking the average without looking at the individual games doesn't actually show the real picture, even at 1080p, because those tests tend to include 4-5 games and one outlier can throw all the results out of whack.
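To illustrate with numbers (Far Cry 6's 21% and F1 22's 60% are from the charts above; the other three games are hypothetical placeholders):

```python
# Toy example of how one heavy-RT outlier skews a small-sample average.
# Far Cry 6 and F1 22 losses are from the charts above; Games B-D are
# hypothetical placeholders for light-RT titles.
from statistics import mean, median

rt_perf_loss = {
    "Far Cry 6": 0.21,
    "Game B": 0.18,
    "Game C": 0.24,
    "Game D": 0.20,
    "F1 22": 0.60,
}

print(f"mean:   {mean(rt_perf_loss.values()):.0%}")    # ~29%, dragged up by F1 22
print(f"median: {median(rt_perf_loss.values()):.0%}")  # 21%, the typical case
```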

I said RDNA RT does not scale 1:1 with CUs.

Which is fine, but your conclusion that the scaling is 35% performance for 60% more CUs fails to acknowledge the 53% at 4K. Clearly, there isn't a single data point there. There are multiple, and you can't just run with one and ignore the others, pretending that CPU limits don't matter but then turning around and saying that the memory config is the problem at 4K.

I'm simply saying that it's true that the scaling isn't 1:1, but this also depends on the scenario. There isn't a one-size-fits-all answer, as demonstrated by the different results we get. For the record, I don't think this argument is particularly useful when it comes to RDNA4 and how the PS5 Pro will perform in ray tracing. Way too many unknowns, and using a bunch of PC benches with bad implementations and APIs won't even begin to tell us the real answer.
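For what it's worth, the per-CU scaling math being argued over looks like this; the 35%/53% uplifts and the 60% CU increase are the figures from this exchange, and equal clocks are assumed, which is a simplification:

```python
# Back-of-the-envelope per-CU scaling efficiency, assuming equal clocks.
cu_increase = 0.60  # 60% more CUs, per the exchange above

for label, uplift in [("1080p RT", 0.35), ("4K RT", 0.53)]:
    efficiency = (1 + uplift) / (1 + cu_increase)
    print(f"{label}: {efficiency:.0%} of linear CU scaling")
# 1080p RT: 84% of linear scaling (depressed by CPU limits)
# 4K RT: 96% of linear scaling (closer to the GPU's real behavior)
```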
 

Zathalus

Member
And the 40 cards performing significantly above the XTX are immune to these bottlenecks because??? Clearly it is GPU-limited.

Or you can just look at the 4K data, where the 7900 XTX is 50% faster. Also, you are looking at a relative performance chart that is a summary of results across a number of games; CPU limits are possible because some games would be CPU-limited and others not, skewing the results. Looking at the individual games, Plague Tale and Doom Eternal are probably the main culprits, but you can see the scaling problem across a number of games and GPUs.
 

Gaiff

SBI’s Resident Gaslighter
Or you can just look at the 4K data, where the 7900 XTX is 50% faster. Also, you are looking at a relative performance chart that is a summary of results across a number of games; CPU limits are possible because some games would be CPU-limited and others not, skewing the results. Looking at the individual games, Plague Tale and Doom Eternal are probably the main culprits, but you can see the scaling problem across a number of games and GPUs.
[GIF: The Office, "Thank you"]
 

ChiefDada

Gold Member
Not how it works. Just because the difference increases from non-RT to RT doesn't mean you're no longer CPU-limited.

Sigh... you know I already addressed this.

Of course there is no perfect analysis. Nothing we can do about that reality. But again, based on the debate we're having here, it's far better to use that 1080p RT test than any other, particularly when we are homing in on relative RT performance between AMD GPUs; the CPU bottleneck becomes significantly less of a factor.

Why don't you educate us all on how a 4090 is also technically CPU-limited when it runs Cyberpunk PT... because that is the level of silliness you're introducing when reaching so far out of context.

Or you can just look at the 4K data where the 7900XTX is 50% faster.

Yes, Zathalus. Let's compare 4K Ultra settings with RT across a wide range of memory setups in order to get a sense of who has better RT compute performance. 8GB vs 24GB??? Pffft, who gives a shit.


Funny side note: I love how the same people I'm arguing with will be the first ones to defend the 3070 vs PS5 "RT performance" at high res/textures with the memory qualifier/exception. Now, all of a sudden, they don't understand the impact of memory when we're trying to isolate per-core RT performance as best we can.
 

Gaiff

SBI’s Resident Gaslighter
Sigh... you know I already addressed this.
You didn't, and you continue to misrepresent the data.
Why don't you educate us all on how a 4090 is also technically CPU-limited when it runs Cyberpunk PT... because that is the level of silliness you're introducing when reaching so far out of context.
I mean, it isn't?

[Chart: Cyberpunk 2077 RT, 1920×1080]

[Chart: Cyberpunk 2077 RT, 3840×2160]


The 4090 is 95% faster at 1080p and 99% faster at 4K. Given the insanely low frame rate of the 7900 XTX, this is within the margin of error. One more fps would shrink the difference to just 90%.
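To show why one frame matters so much here, a quick sketch; the 38 fps base is inferred from the 95%/90% figures above, not a measured number:

```python
# Sensitivity of the relative gap to a single fps at low frame rates.
# The 38 fps base is inferred from the 95%/90% figures, not measured.
def pct_faster(fast_fps: float, slow_fps: float) -> float:
    return fast_fps / slow_fps - 1

xtx_fps = 38.0                  # assumed 7900 XTX result
rtx_4090_fps = xtx_fps * 1.95   # 95% faster, per the 1080p chart

print(f"{pct_faster(rtx_4090_fps, xtx_fps):.0%}")      # 95%
print(f"{pct_faster(rtx_4090_fps, xtx_fps + 1):.0%}")  # 90% with one more fps
```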

With path tracing now.

[Chart: path tracing performance, 1920×1080]

[Chart: path tracing performance, 3840×2160]


2.1x at 1080p vs 2.17x at 4K; again, within the margin of error. Evidently, the CPU is a non-factor here and everyone is heavily GPU-limited. In other games, however, the difference sometimes goes from 20% in favor of the 4090 at 1080p to 50% at 4K when using RT.
 

Ashamam

Member
The sooner there is an announcement with some real in-game footage, the better.

The comparisons in this thread between off-the-shelf components and a somewhat obscure assembly of custom silicon are a waste of space at this point. The proof will be in the actual performance. I appreciate what people are attempting to show, but actual results are going to be driven by the custom design (see https://chipsandcheese.com/2024/03/20/the-nerfed-fpu-in-ps5s-zen-2-cores/), not BYO parts assembly.
 

Gaiff

SBI’s Resident Gaslighter
The CPU-limited scenarios are rare and tend to be due to poor optimizations.
They aren't at 1080p, and I'd appreciate console players who don't know what they're talking about doing their research before spouting nonsense. Those aren't path-tracing benchmarks.
It's my own damn fault for not paying attention. My bad. Goodnight.
Yeah, so no arguments there. Thank you. It's always the same damn thing when we talk. You're wrong and are told so repeatedly by multiple people who know what they're talking about, then you start being condescending despite us clearly explaining to you why your arguments are incorrect.
 

James Sawyer Ford

Gold Member
They aren't at 1080p, and I'd appreciate console players who don't know what they're talking about doing their research before spouting nonsense. Those aren't path-tracing benchmarks.

They aren't at 1080p. What isn't? CPU-limited scenarios aren't rare at 1080p? I wish PC players would stop assuming hardware is the fix for software problems, and that just because a benchmark shows a CPU limitation, it's obviously the root of the problem. Thanks
 

bitbydeath

Member
The sooner there is an announcement with some real in-game footage, the better.

The comparisons in this thread between off-the-shelf components and a somewhat obscure assembly of custom silicon are a waste of space at this point. The proof will be in the actual performance. I appreciate what people are attempting to show, but actual results are going to be driven by the custom design (see https://chipsandcheese.com/2024/03/20/the-nerfed-fpu-in-ps5s-zen-2-cores/), not BYO parts assembly.
Their theories all fall apart when you mention 8K, because the spec they think it has doesn't support it.

That alone shows we don't know what the PS5 Pro is capable of; everyone is just randomly guessing.
 

Gaiff

SBI’s Resident Gaslighter
They aren't at 1080p. What isn't? CPU-limited scenarios aren't rare at 1080p?
CPU-limited scenarios aren't rare at 1080p on top-tier cards because the vast majority of ray-traced games aren't Black Myth Wukong or Cyberpunk. Trash RT like Far Cry 6 or RE4 is the norm. A 4090 (and even a 7900 XTX) will be CPU-limited in many games.

I wish PC players would stop assuming hardware is the fix for software problems. Thanks
Yeah, and where is your proof of poor optimization? You say it, yet have nothing to back it up. Is it because you say so?
 

James Sawyer Ford

Gold Member
CPU-limited scenarios aren't rare at 1080p on top-tier cards because the vast majority of ray-traced games aren't Black Myth Wukong or Cyberpunk. Trash RT like Far Cry 6 or RE4 is the norm. A 4090 (and even a 7900 XTX) will be CPU-limited in many games.

Well, I'm not really arguing about ray tracing. Maybe there are certain CPU limitations that exist if you want all the bells and whistles on that front, but I don't think developers should go in with the idea that they need all that; they should aim for higher framerates instead.
 

James Sawyer Ford

Gold Member
$500-$600.

$600 seems like a lock. I can't see them going $499 due to the PS5 not getting a price cut, and $699 seems a bit too pricey for any sort of decent adoption.

The bulk of the costs in the PS5 Pro should also be fairly close to the base PS5, with the exception of the GPU, really. Sure, you need to beef up certain things to account for more power, but I don't think that's going to have too large of an impact.
 

ChiefDada

Gold Member
Yeah, so no arguments there. Thank you. It's always the same damn thing when we talk. You're wrong and are told so repeatedly by multiple people who know what they're talking about, then you start being condescending despite us clearly explaining to you why your arguments are incorrect.

Calm down, man, I'm just joking. I honestly believe you aren't familiar with how this tangential conversation on RT performance started. I will refrain from directly quoting Loxus, as he should reap the rewards of his smart decision to leave this mess of a debate, but comparing Sony's real-world "2x-4x" RT speed-up to a path-tracing benchmark that intentionally aims to control for all non-RT aspects isn't good methodology at all, imo. If you agree, great. If you disagree, that's equally fine too.
 
Why do you assume the price would be high?

XBSX launched with far more advanced tech compared to the XB1 but still cost the same.

The price would be the same whether it launched tomorrow or in 2030.
The Xbox One had Kinect in it; otherwise it would have been $399 at launch. They clearly don't have a huge audience waiting to pay them $499, so why go higher? If it's more than that, it'll again only be hardcore Xbox fans, a relatively small group, plus people with a lot of disposable income. They won't grow their audience, they'll shrink it; of course, porting games to PS5 will help make up for some of that.

It sounds like this isn't going to be a mass market machine anyway.
 

Zathalus

Member
Yes, Zathalus. Let's compare 4K Ultra settings with RT across a wide range of memory setups in order to get a sense of who has better RT compute performance. 8GB vs 24GB??? Pffft, who gives a shit.
Are you not comparing the 7800 XT and 7900 XTX? So 16GB vs 24GB; neither of them would have memory bottlenecks at 4K.
 

Panajev2001a

GAF's Pleasant Genius
PC gamers with Intel i9s and Ryzen X3D CPUs looking on at PS5 Pro owners thinking its Zen 2 CPU will solve the Unreal Engine 5 problem on consoles...


In a lot of cases, some of these problems are lessened on consoles: shader stutter is mostly a PC-only problem, and CPU-wise, general overhead on consoles is lower (especially on PS5). Still, not enough to brute-force things through.
 

winjer

Gold Member
PC gamers with Intel i9s and Ryzen X3D CPUs looking on at PS5 Pro owners thinking its Zen 2 CPU will solve the Unreal Engine 5 problem on consoles...



Though that is true, consoles have a few things going for them.
For one, they have lower-level APIs. They also don't have all the bloatware and spyware from Windows running in the background.
And because they have a unified memory pool, they don't have to shuffle data across the PCIe bus between the CPU and GPU.
A console like the PS5 has dedicated hardware for decompressing game data. PC only has 2 games using DirectStorage; most games just use the CPU to decompress data (see the sketch below).
And then there is the problem of shader compilation, which causes so many stutters in so many games on PC.
Finally, there are the bugs in Windows that cause performance drops in CPUs, network transfers, SSD speeds, boot times, etc.
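On the decompression point, a toy sketch of what "just use the CPU" costs; this is generic zlib in Python on synthetic data, not any game's actual pipeline, and the throughput you see will vary a lot with the data:

```python
# Toy illustration: software decompression pins a CPU core for time that
# dedicated hardware (like the PS5's decompression block) would offload.
# Generic zlib on synthetic data, not a real game asset pipeline.
import os
import time
import zlib

raw = os.urandom(1024 * 1024) * 256   # ~256 MB of synthetic data
blob = zlib.compress(raw, level=6)

start = time.perf_counter()
zlib.decompress(blob)
elapsed = time.perf_counter() - start
print(f"{len(raw) / 2**20:.0f} MB decompressed in {elapsed:.2f}s on one core")
```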
 
I'm curious what you guys think of the 300 TOPS figure for the Pro's ML capabilities? I confess I'm a little ignorant in this regard, but just taking a look at other CPUs, GPUs, and NPUs, 300 TOPS seems like a big number, relatively speaking.
 
I wouldn't believe any of those claims on those AMD slides. That's 100% PR material.

Cerny's scriptures, on the other hand, can be taken literally! :messenger_grinning_smiling: More seriously, those are for developers and can be believed.
 

No_Cartridge

Neo Member
I'm curious what you guys think of the 300 TOPS figure for the Pro's ML capabilities? I confess I'm a little ignorant in this regard but just taking a look at other CPU's, GPU's and NPU's, 300 TOPS seems like a big number relatively speaking.
The 4080 has 836 TOPS and the 4090 over 1,300 TOPS. 300 is a decent starting point and will no doubt do the job, but it's far from big numbers when compared to Nvidia cards.
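For a rough sense of scale (the 836 and 1,300 figures are the ones quoted above; published TOPS numbers vary with precision and sparsity assumptions, so treat these as ballpark):

```python
# Rough ratios against the Pro's reported 300 TOPS. Published TOPS
# figures vary with precision/sparsity assumptions; ballpark only.
ps5_pro_tops = 300

for card, tops in [("RTX 4080", 836), ("RTX 4090", 1300)]:
    print(f"{card}: {tops / ps5_pro_tops:.1f}x the Pro's 300 TOPS")
# RTX 4080: 2.8x
# RTX 4090: 4.3x
```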
 

Seomel

Member
The PS5 Pro's appeal would be bigger if devs offered more performance options in games, e.g., an option to unlock 30fps quality modes. Nioh 1-2 were amazing on PS5 because you could just play at the highest res at 60fps and it ran like a dream.
 