
PS5 Pro Specs Leak are Real, Releasing Holiday 2024(Insider Gaming)

Loxus

Member
Yeah, and I highly doubt that's going to happen. It may be a "wish," but anything they put out that's a "generational" leap would be priced so high that only the most hardcore Xbox fans, or some people here with spare cash :), would buy it.
Why do you assume the price would be high?

The XBSX launched with far more advanced tech compared to the XB1 but still cost the same.

The price would be the same if it launched tomorrow or 2030.
 
Why do you assume the price would be high?

MS and Sony are getting out of the big subsidy game. It's going to be like Nintendo, where the consoles are sold at cost at a bare minimum.

This includes the PS5 Pro.

If MS is going to allow Steam, etc. on the console Day 1... I almost wonder if they might even attempt to make a profit.
 

Little Mac

Gold Member
Have faith, my brethren. Astro Bot is the one true GOTY and he will bring upon us the scriptures of the Pro's arrival!!!

Church Saturday GIF
 

Loxus

Member
So let's see...

36 RT units = 1 RT performance index

60 RT units = 1.6 RT index

120 RT units = 3.3 RT index

The PS5 Pro is said to have 2-4x the RT performance.

How exactly does doubling RT cores per CU not explain this?
It is already said that BVH8 doubles RT performance per RT unit.



Then RDNA3 already gives a 50% RT improvement per CU vs RDNA2 and we have to take into account that the PS5 Pro has more CUs.


PS5 Pro having double the RT units would yield a 4-8× improvement, not 2-4×.
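To make that compounding explicit, here's a rough back-of-the-envelope sketch in Python. The CU counts and the per-unit factors are assumptions taken from the leak talk above, not confirmed specs:

```python
# Back-of-the-envelope look at how the claimed factors compound.
# Every number below is an assumption taken from the leak discussion above.

ps5_cus, pro_cus = 36, 60
cu_scaling = pro_cus / ps5_cus        # ~1.67x from CU count alone
bvh8_factor = 2.0                     # claimed doubling per RT unit from BVH8
rt_units_doubled = 2.0                # hypothetical "double the RT units per CU"

print(f"CUs only:                 {cu_scaling:.2f}x")                                    # ~1.67x
print(f"CUs + BVH8:               {cu_scaling * bvh8_factor:.2f}x")                      # ~3.33x, inside 2-4x
print(f"CUs + BVH8 + 2x RT units: {cu_scaling * bvh8_factor * rt_units_doubled:.2f}x")   # ~6.67x, i.e. 4-8x territory
```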


This is what I dislike about you guys.
You all don't pay attention.
 

Loxus

Member
WTF am I reading? What treachery is this? Of all the posters here, I did not expect you to be an unbeliever.


No, thanks. We don't need you. You're out of the PS5 Pro club. Don't let us catch you trying to buy one when it comes out.
I like how everyone is skipping over that part of my post.

PS3: 2006 --3 years--> PS3 Slim: 2009
--7 years--
PS4: 2013 --3 years--> PS4 Slim: 2016
--7 years--
PS5: 2020 --3 years--> PS5 Slim: 2023
--7 years--
PS6: 2027

Following this makes the PS5 Pro feel like a 50/50 bet. The PS5 Pro should have released last year to follow Sony's console models.
 

Pedro Motta

Member
I like how everyone is skipping over that part of my post.

PS3: 2006 --3 years--> PS3 Slim: 2009
--7 years--
PS4: 2013 --3 years--> PS4 Slim: 2016
--7 years--
PS5: 2020 --3 years--> PS5 Slim: 2023
--7 years--
PS6: 2027

Following this makes the PS5 Pro feel like a 50/50 bet. The PS5 Pro should have released last year to follow Sony's console models.
This gen will be longer.
 

onQ123

Member
I like how everyone is skipping over that part of my post.

PS3: 2006 --3 years--> PS3 Slim: 2009
--7 years--
PS4: 2013 --3 years--> PS4 Slim: 2016
--7 years--
PS5: 2020 --3 years--> PS5 Slim: 2023
--7 years--
PS6: 2027

Following this makes the PS5 Pro feel like a 50/50 bet. The PS5 Pro should have released last year to follow Sony's console models.
What model though? There has only been one PlayStation Pro model & this time around we were dealing with a global pandemic & chip shortages.
 

Loxus

Member
What model though? There has only been one PlayStation Pro model & this time around we were dealing with a global pandemic & chip shortages.
It's a clear-as-day pattern that would put the PS6 releasing in 2027.

That pattern would have put the PS5 Pro releasing the same year as the PS5 Slim.

With that not happening, and following that clear-as-day pattern, it's normal for someone to think maybe Sony decided to skip the PS5 Pro.

That's all I was pointing out, but if what I said hurt feelings, I'm truly sorry.
 

Omnipunctual Godot

Gold Member
I like how everyone is skipping over that part of my post.

PS3: 2006 --3 years--> PS3 Slim: 2009
--7 years--
PS4: 2013 --3 years--> PS4 Slim: 2016
--7 years--
PS5: 2020 --3 years--> PS5 Slim: 2023
--7 years--
PS6: 2027

Following this makes the PS5 Pro feel like a 50/50 bet. The PS5 Pro should have released last year to follow Sony's console models.
You're not taking into account the massive supply chain issues caused by covid. This wasn't an ordinary gen. It's also why Sony released next to nothing this year in terms of software.
 

ChiefDada

Gold Member
Then RDNA3 already gives a 50% RT improvement per CU vs RDNA2 and we have to take into account that the PS5 Pro has more CUs.



... And it's more like a 25% uplift per CU from RDNA2 to RDNA3, as you can see with the 6800 vs 7800 XT.



Gotta say, knowing how empirically minded you've always been in the past, I'm disappointed in you for conflating spec sheet with real world performance.
 

Loxus

Member


... And it's more like a 25% uplift per CU from RDNA2 to RDNA3, as you can see with the 6800 vs 7800 XT.



Gotta say, knowing how empirically minded you've always been in the past, I'm disappointed in you for conflating spec sheet with real world performance.
I like how you intentionally skip over the BVH8 part.
 

onQ123

Member
It's a clear-as-day pattern that would put the PS6 releasing in 2027.

That pattern would have put the PS5 Pro releasing the same year as the PS5 Slim.

With that not happening, and following that clear-as-day pattern, it's normal for someone to think maybe Sony decided to skip the PS5 Pro.

That's all I was pointing out, but if what I said hurt feelings, I'm truly sorry.
Hurt Feelings? Lol how?

It's good to gather data from the past but don't ignore changes in that data.

We could actually end up with a PS5 Pro this year & a Super Pro in 2027
 
Regardless of PS5 Pro, Sony has to have a significant event in September for games

I'm fairly confident that a Showcase is coming soon
This year Sony has Astro Bot and nothing else. Death Stranding 2 is scheduled to release in 2025. There is also that unknown Bluepoint game. The rest will be at the same level (or similar) as Concord. GTA6 is not exclusive to PS5, BTW, and will also release on Xbox at the same framerate. Yeah, the good old PlayStation era is ending soon.
 

Loxus

Member
Hurt Feelings? Lol how?

It's good to gather data from the past but don't ignore changes in that data.

We could actually end up with a PS5 Pro this year & a Super Pro in 2027
One thing about me is I always keep an open mind about multiple possibilities.

That way I wouldn't be disappointed.
My mind is open to the PS5 Pro releasing this year, next year or even being canceled.

I share these possibilities to open others' minds to the different possibilities based on various information, but it's obvious that I can hurt someone who wants the Pro or someone who doesn't think a Pro is needed.

You see the emotions in their post. So I was just apologizing for hurting feelings.
 

jose4gg

Member
Reconstruction as we know it today... does go a long way back. It's been an evolution. And yes, you can say it started with Sony and checkerboard rendering, with software-based reconstruction spearheaded by Guerrilla Games and, funnily enough... Ubisoft. But yes, Sony was the first to actually put in dedicated hardware to accelerate it, in the PS4 Pro.

Having said all that, as it stands today... ML-based reconstruction (DLSS, XeSS and soon PSSR) is now demystified. It's basically a matrix operation run on matrix hardware. That's it. It's why Intel could spit out a reconstruction solution that rivaled DLSS and is better than anything FSR can manage... on their first attempt. Everything that makes DLSS great can be done by anyone as long as they have hardware with ML units.

Once the hardware is there, it's a simple case of training their algorithm. And the crazy thing is that you can even use Nvidia hardware to train an algorithm that would run on an AMD-based GPU. The even crazier thing is that said algorithm could have been in training for the last 4 years, even when the people training it did not yet have the hardware to actually run it. It's just matrix code. No special or secret sauce there. At the end of the day, you are or will only be limited by how many TOPS (the AI buzzword to replace the raster buzzword of TF) your processor can handle. And even that has become standardized around how much you actually need to do reconstruction and, say, frame gen in under 2ms.
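As a toy illustration of that budget math (every number here is hypothetical, not a real spec): pick a TOPS rating and a 2 ms reconstruction window, and you can see roughly how many operations one upscaling pass gets to spend.

```python
# Toy budget math for an ML reconstruction pass.
# The TOPS rating and the 2 ms window are illustrative assumptions, not real specs.

tops = 300e12            # hypothetical accelerator throughput: 300 TOPS (ops/second)
frame_budget_s = 0.002   # ~2 ms reserved for reconstruction per frame

ops_per_pass = tops * frame_budget_s     # ops available for one upscaling pass
pixels_4k = 3840 * 2160

print(f"Ops available per pass:  {ops_per_pass:.2e}")
print(f"Ops per output 4K pixel: {ops_per_pass / pixels_4k:.0f}")
# Whether that's enough depends entirely on how heavy the network is;
# the point is only that the budget scales linearly with TOPS.
```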

But we are not even talking about the most important thing here... that is AMD. Make no mistake, the reason Nvidia has looked a world apart is not because they are sitting on some sort of mystical tech... it's all down to AMD's stubbornness (or stupidity, take your pick) in not putting ML-based hardware in their GPUs and not releasing an ML-based reconstruction solution. You have to understand that what AMD has managed thus far with FSR is actually harder to pull off on a purely software basis. That's because their algorithm is man-made.

Once the PS5 Pro hits, all that changes. Every AMD GPU after that would come with ML-based hardware that matches or exceeds whatever is in the PS5 Pro. AMD will, at that point, no doubt have an ML-based reconstruction solution to match or exceed whatever PSSR is... they'll probably call it FXR. And even if MS doesn't build out their own, nothing stops them from just using AMD's... especially if their next GPU is also going to be from AMD. Hell, MS could even have an ML-based reconstruction algorithm trained and ready to go right now. Because remember, it's now about those TOPS... you can literally have 3 different ML-based reconstruction solutions running on the same hardware. Or a situation where an Nvidia GPU (with more TOPS) can run PSSR better than a PS5 Pro can run it.

But where this gets interesting is if MS really goes the route of not operating within the pricing confines of your typical console, and instead makes a premium "console" that retails for $800+... nothing stops them from putting an Nvidia GPU in there. This is something that MS can and probably should do, because they have to have figured out by now that they will never beat Sony with a box at the exact same price. They might as well double down on the power and premium feel of their console; it would sell as a niche product, but it would always be known as the best console gaming hardware or experience.

The point is, within a 3 year window, everyone is gonna have ML-based hardware, and everyone is going to be able to do it right.

Apologies for the wall of text.

The only thing I disagree with is the premium box idea: selling consoles at a lower price than what it costs to build a PC only makes sense when selling in high quantities. So yeah, they can create an 800-dollar console that is "premium," but... if it's selling at a loss, it will be a bigger loss than what they are taking now, even with the price increase. So an expensive console that only sells to a niche doesn't really make sense.
 

Fafalada

Fafracer forever
Reconstruction as we know it today... does go a long way back.
Longer than most people realize - some TV makers had motion-vector based super-resolution in realtime back in the late 00s. Though I don't think it ever made it into mainstream devices.
The amusing thing is that many of these algorithms could even run on a PS3 though - with a time-machine, that 1080p future Cell promised could actually be real. 🤷‍♀️

And plain TAA origins date even farther back - hell, the first console game to feature TAA was actually GT Portable on the PSP.


The point is, within a 3 year window, everyone is gonna have ML-based hardware, and everyone is going to be able to do it right.
And maybe then we'll finally start seeing AI accelerators in games used for purposes beyond pure cosmetics...
 

ChiefDada

Gold Member
I like how you intentionally skip over the BVH8 part.

Not really, I basically addressed it here:

I'm disappointed in you for conflating spec sheet with real world performance.

It goes back to you interpreting specs in a vacuum. The BVH8 doubles the THEORETICAL THROUGHPUT of RT intersect/traversal calculations, but there are other aspects of the hardware that play a part in completing the end-to-end process, such as the CPU and memory/bandwidth setup. And we know these other aspects did not receive the same degree of uplift. Contrary to what Alex from DF (and seemingly you too) believe, RDNA RT does not scale perfectly linearly with CUs. Look at the graph again; the 7900 XTX has 60% more CUs than the 7800 XT and yet we only see a 35% performance delta. Why do you think that is?
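Put as a quick sanity check (the CU counts are AMD's published specs; the 35% delta is the figure cited above):

```python
# Quick scaling-efficiency check using the figures cited above.
# 7900 XTX: 96 CUs, 7800 XT: 60 CUs; the ~35% 1080p RT delta is the number quoted here.

xtx_cus, xt_cus = 96, 60
cu_ratio = xtx_cus / xt_cus                      # 1.60x more CUs
perf_ratio = 1.35                                # observed RT delta cited above

efficiency = (perf_ratio - 1) / (cu_ratio - 1)   # share of the extra CUs showing up as extra perf
print(f"CU ratio:           {cu_ratio:.2f}x")
print(f"Observed RT ratio:  {perf_ratio:.2f}x")
print(f"Scaling efficiency: {efficiency:.0%}")   # ~58%, i.e. far from linear with CU count
```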
 

THE:MILKMAN

Member

I just can't believe, after Shawn Layden's DICE talk recalling their "Icarus moment" and PlayStation's "stark moment of hubris" with the $599 price being the worst thing of all, that they'd go at or above that price point again.

While I get it was a long time (and a lot of inflation) ago, they know full well it will be called out....

*crosses fingers for Pro to be the PS5 launch price of £449 max inc disc drive*
 

Gaiff

SBI’s Resident Gaslighter
RDNA RT does not scale perfectly linearly with CUs. Look at the graph again; the 7900 XTX has 60% more CUs than the 7800 XT and yet we only see a 35% performance delta. Why do you think that is?
The 7900 XTX is around 50% faster than the 7800 XT. It's slightly faster than the 4080. No idea where you got 35% from.
 

Gaiff

SBI’s Resident Gaslighter
And the 40-series cards performing significantly above the XTX are immune to these bottlenecks because??? Clearly it is GPU limited.





The graph... Gaiff
Those are benchmarks that flip on the ray tracing setting in games. Since most games are still primarily using rasterization, you'll still run into a CPU bottleneck even at 1080p using ray tracing data (because they'll also include a bunch of games with very light RT).

[Chart: relative RT performance at 3840×2160]


Clearly, something is holding back the 7900 XTX. It wouldn't make sense for it to be a mere 35% faster in RT but 50% in rasterization. Their raster and RT numbers align almost perfectly.
 

Xyphie

Member
CPU boundness is a spectrum, not a binary. That's why better GPUs can still perform better while still being CPU bound.
 

ChiefDada

Gold Member
Those are benchmarks that flip on the ray tracing setting in games. Since most games are still primarily using rasterization, you'll still run into a CPU bottleneck even at 1080p using ray tracing data (because they'll also include a bunch of games with very light RT).

[Chart: relative RT performance at 3840×2160]


Clearly, something is holding back the 7900 XTX. It wouldn't make sense for it to be a mere 35% faster in RT but 50% in rasterization. Their raster and RT numbers align almost perfectly.

Yes, but this is exactly my point: to isolate RT performance as it relates to CUs. You might want to circle back to the genesis of my side discussion with Loxus. Using something like the 4K analysis would make rasterization and the memory architecture more of the limiting factor, and we are debating real-world RT performance, specifically.
 

Loxus

Member
Not really, I basically addressed it here:



It goes back to you interpreting specs in a vacuum. The BVH8 doubles the THEORETICAL THROUGHPUT of RT intersect/traversal calculations, but there are other aspects of the hardware that play a part in completing the end-to-end process, such as the CPU and memory/bandwidth setup. And we know these other aspects did not receive the same degree of uplift. Contrary to what Alex from DF (and seemingly you too) believe, RDNA RT does not scale perfectly linearly with CUs. Look at the graph again; the 7900 XTX has 60% more CUs than the 7800 XT and yet we only see a 35% performance delta. Why do you think that is?
This test truly gauges RT performance.
Radeon RX 7800 XT reference review


Assuming the PS5 RT performance = 6700XT = 13fps.


It's stated the PS5 Pro RT is 2-3× better than PS5.
13fps × 2 = 26fps = 6800XT/3060TI
13fps × 3 = 39fps = 4060TI
This increase in performance is what one would expect from a mid gen refresh.

This raw RT improvement reflects the various RT improvements including BVH8.

It doesn't reflect these improvements plus doubling the RT units.

39fps × 2 = 78fps, which would put it between the 4070 TI and 4080, like the MLiD leak would suggest.

I don't even know what the issue is.
Matching the 4060TI in RT is crazy good and this improvement isn't even using PSSR.
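For what it's worth, here's the same extrapolation laid out as a tiny script; the 13 fps baseline (PS5 ≈ 6700 XT in this path-traced test) and the 2-3x multipliers are the assumptions stated above:

```python
# The extrapolation above, spelled out. The 13 fps baseline and the 2-3x uplift
# figures are assumptions from the leak talk, not measured PS5 Pro numbers.

ps5_baseline_fps = 13.0

for uplift in (2, 3):
    print(f"{uplift}x RT uplift -> ~{ps5_baseline_fps * uplift:.0f} fps")
# 2x -> ~26 fps (6800 XT / 3060 Ti territory), 3x -> ~39 fps (4060 Ti territory).
# Doubling the RT units on top of the 3x case would imply ~78 fps (4070 Ti to 4080 range),
# i.e. the 4-8x scenario rather than the 2-3x one being claimed.
```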
 

Gaiff

SBI’s Resident Gaslighter
Yes, but this is exactly my point: to isolate RT performance as it relates to CUs. You might want to circle back to the genesis of my side discussion with Loxus. Using something like the 4K analysis would make rasterization and the memory architecture more of the limiting factor, and we are debating real-world RT performance, specifically.
Not sure how you're isolating the RT performance of the CUs when the difference keeps ballooning with higher resolutions? There are full ray tracing data benchmarks out there. Just use those instead.
 

ChiefDada

Gold Member
Not sure how you're isolating the RT performance of the CUs when the difference keeps ballooning with higher resolutions?

Because, again, I would argue that memory setup becomes the main bottleneck and thus taints the comparison.

There are full ray tracing data benchmarks out there. Just use those instead.

Also not applicable, because it's ultra theoretical, while the Sony leaks are referencing real-world performance.

Like it or not, we are in a hybrid RT generation. It's all about balance. The 1080p RT analysis, while not perfect, is the most sensible to use.
 

ChiefDada

Gold Member
This test truly gauges RT performance.
Radeon RX 7800 XT reference review


Assuming the PS5 RT performance = 6700XT = 13fps.


It's stated the PS5 Pro RT is 2-3× better than PS5.
13fps × 2 = 26fps = 6800XT/3060TI
13fps × 3 = 39fps = 4060TI
This increase in performance is what one would expect from a mid gen refresh.

This raw RT improvement reflects the various RT improvements including BVH8.

It doesn't reflect these improvements plus doubling the RT units.

39fps × 2 = 78fps, which would put it between the 4070 TI and 4080, like the MLiD leak would suggest.

I don't even know what the issue is.
Matching the 4060TI in RT is crazy good and this improvement isn't even using PSSR.

Read my prior post. If you're going to use awful methodology and compare:

1. the RT performance of a console that has actually been achieved in real games, while also having to contend with the other bottlenecks of the system

vs

2. a pure PT benchmark that purposefully controls for any external system bottlenecks, thus measuring theoretical throughput,

then you have to use the max real-world figure (the 4x number) that Sony has provided as the worst-case scenario for the PS5 Pro.

I hope you understand where I'm coming from.
 

Mr.Phoenix

Member
This test truly gauges RT performance.
Radeon RX 7800 XT reference review


Assuming the PS5 RT performance = 6700XT = 13fps.


It's stated the PS5 Pro RT is 2-3× better than PS5.
13fps × 2 = 26fps = 6800XT/3060TI
13fps × 3 = 39fps = 4060TI
This increase in performance is what one would expect from a mid gen refresh.

This raw RT improvement reflects the various RT improvements including BVH8.

It doesn't reflect these improvements plus doubling the RT units.

39fps × 2 = 78fps, which would put it between the 4070 TI and 4080, like the MLiD leak would suggest.

I don't even know what the issue is.
Matching the 4060TI in RT is crazy good and this improvement isn't even using PSSR.
The only conclusion I can arrive at is that someone has hijacked your account.

You are taking a lot of different "true" things, but painting a picture that is untrue. So much so that I don't even know where to begin.

E.g. you talk about a 50% uplift of RDNA3 over RDNA2... which we can all clearly see is not even the case from benchmarks.

Then you are talking about BVH8 as if BVH acceleration is not just one part of the RT pipeline.

But the worst part of all this... is that with all your technical knowledge, you are somehow ignoring the fact that there are so many other things that go into a GPU that could affect its overall performance. E.g... how much cache is feeding the CUs? How bad is bandwidth contention in the PS5 Pro? How accurate are the leaks we even have? Like, are they old? New? What?
 

Loxus

Member
The only conclusion I can arrive at is that someone has hijacked your account.

You are taking a lot of different "true" things, but painting a picture that is untrue. So much so that I don't even know where to begin.

E.g. you talk about a 50% uplift of RDNA3 over RDNA2... which we can all clearly see is not even the case from benchmarks.

Then you are talking about BVH8 as if BVH acceleration is not just one part of the RT pipeline.

But the worst part of all this... is that with all your technical knowledge, you are somehow ignoring the fact that there are so many other things that go into a GPU that could affect its overall performance. E.g... how much cache is feeding the CUs? How bad is bandwidth contention in the PS5 Pro? How accurate are the leaks we even have? Like, are they old? New? What?
So what is both the PS5 and PS5 Pro RT performance equivalent to?
 

Gaiff

SBI’s Resident Gaslighter
Because, again, I would argue that memory setup becomes the main bottleneck and thus taints the comparison.
And CPU problems taint the comparison as well. Besides, it’s not like you can isolate the CUs because the memory setup also factors in ray tracing. There is no way to completely isolate any one component no matter what you try to do. The rest of the GPU doesn’t stay idle during RT workloads.
 

ChiefDada

Gold Member
And CPU problems taint the comparison as well. Besides, it’s not like you can isolate the CUs because the memory setup also factors in ray tracing. There is no way to completely isolate any one component no matter what you try to do. The rest of the GPU doesn’t stay idle during RT workloads.

Of course there is no perfect analysis. Nothing we can do about that reality. But again, based on the debate we're having here, it's far better to use that 1080p RT test than any other, particularly when we are homing in on relative RT performance between AMD GPUs; the CPU bottleneck becomes significantly less of a factor. The XTX position proves as much, with its performance being neck and neck with the 4070 Ti. Can't be the CPU, it's tied there obviously; can't be VRAM, the XTX has double the capacity; can't be memory bandwidth, the XTX is comfortably ahead there as well. By deductive reasoning, we can point to/isolate the RT architecture. By increasing resolution, you're getting further away from meaningful insight.
 

Gaiff

SBI’s Resident Gaslighter
Of course there is no perfect analysis. Nothing we can do about that reality. But again, based on the debate we're having here, it's far better to use that 1080p RT test than any other, particularly when we are homing in on relative RT performance between AMD GPUs; the CPU bottleneck becomes significantly less of a factor. The XTX position proves as much, with its performance being neck and neck with the 4070 Ti. Can't be the CPU, it's tied there obviously; can't be VRAM, the XTX has double the capacity; can't be memory bandwidth, the XTX is comfortably ahead there as well. By deductive reasoning, we can point to/isolate the RT architecture. By increasing resolution, you're getting further away from meaningful insight.
The 7900 XTX is 51% faster than the 7800 XT at 4K, 45% at 1440p, and 38% at 1080p. You're postulating that the difference in ray tracing is greater than in rasterization, and this makes no sense. Furthermore, the CPU can also be massively impacted by RT, introducing yet another bottleneck into the equation.

There is no correct resolution to prove your point. It will depend on the resolution used. The problem with your data is that something outside of the GPU (the CPU) is also skewing the results. At least at 4K, we know for a fact that the only thing holding the GPUs back is the GPUs themselves.

The CU count seems to scale the exact same whether with or without RT. Stating that the 7900 XTX has 60% more CUs but 35% more performance ignores the 1440p and 4K data where the gap is bigger. There's really no wrong or right here, because if your objective is to determine the performance of the Pro based on its CU count, then it too will be impacted by the resolution.

I’m not quite sure why we must absolutely use 1080p rather than just acknowledge that the performance delta will increase as does the resolution.
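To see that point numerically (CU counts are AMD's published specs; the percentages are the deltas quoted in this post):

```python
# Perf delta vs. CU delta for the 7900 XTX over the 7800 XT at each resolution.
# CU counts are published specs; the RT deltas are the figures quoted in this post.

cu_ratio = 96 / 60                        # 1.60x more CUs
rt_deltas = {"1080p": 1.38, "1440p": 1.45, "4K": 1.51}

for res, perf in rt_deltas.items():
    share = (perf - 1) / (cu_ratio - 1)   # fraction of the CU advantage actually realized
    print(f"{res}: {perf:.2f}x perf for {cu_ratio:.2f}x CUs -> {share:.0%} of the CU advantage")
# The gap widens with resolution, so how "linear" CU scaling looks depends heavily
# on which resolution you pick, which is the point being argued here.
```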
 

ChiefDada

Gold Member
The 7900 XTX is 51% faster than the 7800 XT at 4K, 45% at 1440p, and 38% at 1080p. You're postulating that the difference in ray tracing is greater than in rasterization, and this makes no sense.

I most definitely did not. Check the tape:

Contrary to what Alex from DF (and seemingly you too) believe, RDNA RT does not scale perfectly linearly with CUs.

I said RDNA RT does not scale 1:1 with CUs.

Furthermore, the CPU can also be massively impacted by RT, introducing yet another bottleneck into the equation.

Bringing up the CPU as a bottleneck when the 4090 is performing 35% faster than the 7900 XTX at 1080p RT compared to 14% faster at 1080p non-RT, or when the 4070 Ti goes from 2% faster than the 7900 XTX at 1080p RT to 10% slower at 1080p non-RT, all while the CPU is being controlled for as a constant, is just poor logic and ignorance. Sorry, gotta call it what it is.



 