
PS5 Pro Specs Leak Is Real, Releasing Holiday 2024 (Insider Gaming)

lh032

I cry about Xbox and hate PlayStation.
PS5 - $499 +
PS5 Pro - $599 +
PS+ (8 years x $80) - $640 =
-------------------------------
$1,738+ for consoles this gen to keep up with midrange PCs
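For what it's worth, the same tally in a couple of lines, using the post's own figures (the prices are the poster's assumptions, not official):

```python
# Rough tally from the breakdown above; prices are the poster's assumptions.
ps5 = 499
ps5_pro = 599
ps_plus = 8 * 80                # eight years of PS+ at $80/year
print(ps5 + ps5_pro + ps_plus)  # 1738
```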

I dunno boys... PC gamers can also sell their old parts to reduce the upfront cost of upgrading.

Used to be that you spent $350 on a console, it lasted you seven years, and the thing was $99 by the end. Game devs used to get better with the hardware and visuals drastically improved over the gen too. Now they've got you spending $500+ every 3-4 years.
we are not going to get a gaming PC
 

bitbydeath

Member
Exactly...

I know I'm in the minority, but I truly wouldn't mind a $1,000 premium console that would get me through the whole gen... better than this mid-gen refresh bullshit.
People are underestimating what this bump means.
Remember we did get a slight glimpse of gaming in 8K once.

[Images: ArtStation environment renders by Wiktor Ohman, Bastian Hoppe, and Mike Kime]
 

Jigsaah

Member
That RAM is really slow...but no, your rig isn't only 25% faster than the Pro. It will probably be much closer to twice as powerful but that RAM can seriously hold you back. I assume the CPU isn't too recent either.
Nah it's 5000 series. I want to upgrade to the 7800x3d and get an AM5 motherboard and obviously DDR5 RAM. I'm confused here, you're saying even with this dated set up (sans GPU) my rig is twice as powerful? So I'm basing this off OP. Can you or someone clarify...is it 33 TFLOPS or 67 TFLOPS?
 

HeisenbergFX4

Gold Member
Nah it's 5000 series. I want to upgrade to the 7800x3d and get an AM5 motherboard and obviously DDR5 RAM. I'm confused here, you're saying even with this dated set up (sans GPU) my rig is twice as powerful? So I'm basing this off OP. Can you or someone clarify...is it 33 TFLOPS or 67 TFLOPS?
From what limited knowledge I have, actually comparing to the PS5 it's really half the 33 TF number.

Besides, too many people are missing the main thing in all this, even if it were 100 TFs:

  • Rendering 45% faster than PS5
 

Bojji

Member
Nah it's 5000 series. I want to upgrade to the 7800x3d and get an AM5 motherboard and obviously DDR5 RAM. I'm confused here, you're saying even with this dated set up (sans GPU) my rig is twice as powerful? So I'm basing this off OP. Can you or someone clarify...is it 33 TFLOPS or 67 TFLOPS?

It's 33.5 RDNA3 dual issue teraflops.

This GPU is comparable to 16.75TF RDNA2 GPU (PS5 is around 10).
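A minimal sketch of where the two numbers come from, using the standard TFLOPS formula. The 60 CU / ~2.18 GHz figures are taken from the leak coverage and are assumptions here, not confirmed specs; the base PS5 comparison uses its known 36 CUs at 2.23 GHz:

```python
# Where "33.5 TF" vs "16.75 TF" comes from. CU count and clock for the Pro
# (60 CUs at roughly 2.18 GHz) are leaked/assumed figures, not official specs.

def tflops(cus: int, clock_ghz: float, dual_issue: bool = False) -> float:
    shaders = cus * 64          # 64 shader ALUs per CU
    flops_per_clock = 2         # one fused multiply-add = 2 FLOPs
    if dual_issue:
        flops_per_clock *= 2    # RDNA3 can dual-issue FP32 per clock
    return shaders * flops_per_clock * clock_ghz / 1000.0

print(tflops(60, 2.18, dual_issue=True))   # ~33.5 "RDNA3 dual issue" TF
print(tflops(60, 2.18, dual_issue=False))  # ~16.75 TF, the like-for-like number
print(tflops(36, 2.23, dual_issue=False))  # ~10.3 TF, roughly the base PS5
```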
 

SlimySnake

Flashless at the Golden Globes
People that are concerned about the lack of CPU bump don't seem to have a good grasp of the reality of this refresh and the reality of the development landscape.

No, most games are *NOT* CPU limited to any great degree. CPUs can help with a lot of things, but in many ways devs are nowhere near ambitious enough to truly take advantage of them in any meaningful way. Most game logic, physics, etc. is not radically different from when developers were more constrained by Jaguar cores.

And if the PS5 Pro did introduce a modern CPU with much higher clocks - who would develop for it? You can't just magically write better AI code, or physics code, which would fundamentally change the way the game plays versus the mass market base PS5 audience. It does not make any sense to increase the CPU. We already get 60 fps games in the *VAST* majority of titles. By default, these games will get an instant bump to the 60 fps modes thanks to the higher raster, better RT, and superior AI methods.

We are only going to get a dramatic increase to the CPU when it's a new generation and developers slowly start taking advantage of it - but even when the PS6 launches, most devs will still be limited by 2020 console CPUs because we're back to the whole "cross gen" thing. It won't be until mid-PS6 generation that CPUs *MAY* start being taken advantage of, and even then it seems like the entire industry is having a brain drain moment. Maybe advancements in AI will help developers create more dynamic interactions that truly take advantage of additional CPU might.
I don't think you know anything about the state of RT gaming on PC if you think games are not CPU limited. They are, especially in RT games. It's the reason why so many games ditch RT in performance modes on consoles. Even if the GPU can handle it, the CPU can't.

Now, the latest and greatest CPUs from Intel and AMD routinely hit 4.5-5.0 GHz, which is why CPU limitations might not affect people rocking expensive RT GPUs. But people have done testing on these Zen 2 CPUs and they are trash. Utter garbage. And even those Zen 2 desktop CPUs are better than the trash Cerny stuck inside the PS5. The only hope was that an increase in clock speed to 4.5 GHz would help it overcome those bottlenecks much like the Zen 4 CPUs, but 3.8 GHz just won't do.

Go ahead and look up how Spider-Man, Star Wars, Hogwarts, Forspoken, Callisto, and virtually every single RT game is CPU bottlenecked on PC. You can plug a 3080 or a 4090 in there, and unless you use a faster CPU you are not going to fully utilize the GPU. It's just not going to happen. Just watch the video I linked earlier.

I mean, in one week you are going to get Dragon's Dogma 2 on consoles, and it's likely going to be 30 fps because it's CPU bound due to RT effects. And it's not just RT causing issues. Games like FF16, Skull and Bones, Guardians of the Galaxy and Alan Wake 2 are dropping to 720p or struggling to hold 60 fps at 1080p, and they don't even have RT. A big CPU upgrade was needed here and they blew it.
 
Go ahead and look up how Spider-Man, Star Wars, Hogwarts, Forspoken, Callisto, and virtually every single RT game is CPU bottlenecked

I can look at Spider-Man on PS5 with RT and see that it’s doing something most would consider impossible on consoles

I am not suggesting RT perf wouldn't get better with more competent CPUs, but presumably that's the exact reason why Cerny is going with dedicated increases to RT hardware with this refresh; that matters a lot more to RT than increasing the CPU ever would.

He clearly knows a lot more than you or I about building a well balanced affordable machine.
 

bitbydeath

Member
Not at all, but we've had this discussion many times already.

6800 is 16.17TF
7700XT is 35.17TF

Real world performance is:

[Image: real-world benchmark comparison of the 6800 and 7700 XT]
I provided proof; your feeling that it's something else is just wrong. Your assumption would mean RDNA3 is a downgrade from RDNA2, and that should have been a red flag.
 

SlimySnake

Flashless at the Golden Globes
I can look at Spider-Man on PS5 with RT and see that it’s doing something most would consider impossible on consoles

I am not suggesting RT perf wouldn't get better with more competent CPUs, but presumably that's the exact reason why Cerny is going with dedicated increases to RT hardware with this refresh; that matters a lot more to RT than increasing the CPU ever would.

He clearly knows a lot more than you or I about building a well balanced affordable machine.
Spider-Man is just doing RT reflections and already drops below 1080p. Those other games are doing a lot more RT effects, which is why they cause a bottleneck on the CPU. It makes zero sense to increase RT performance by 2-4x and then cheap out on the CPU when the CPU is the major bottleneck.

And here is Spider-Man 1 on PC with ray tracing. Look at how the Zen 2 CPU effectively turns the 3090 into a little bitch. If Cerny's goal was to get his GPU up to par with a 3080, then he's just handicapped his own GPU with the smallest of CPU upgrades.

[Image: Spider-Man PC benchmark showing a Zen 2 CPU holding back an RTX 3090 with ray tracing enabled]


And no, the same Cerny guy bottlenecked the PS4 Pro with a tiny memory bandwidth upgrade. He did well with the PS5, but his rival Jason Ronald and all the geniuses over at Xbox who know better than me or you produced a console that loses half its face-offs against the PS5, because those geniuses decided to clock the GPU at RDNA1 clocks while trying to hit that magical 12 TFLOPs number. You put too much faith in these people. We just had Sony's own CEO tank the company's stock, losing them billions in a week and forcing them to shed people from their most successful business, because the idiot CEO threw them under the bus.

If these geniuses knew what they were doing, it wouldn't have taken them 6 years to bring RT up to par with Nvidia.
 

Brigandier

Member
What's to say more CPU cache was not added?

Even if it wasn't, though, I just don't think Sony felt they had a CPU bottleneck problem and decided to do nothing about it. Maybe making the GPU better, especially with regard to RT, and having dedicated AI hardware for upscaling frees up the CPU some.

Will the increase in RAM bandwidth mean anything?

Also wasn't Cerny supposed to be working on some kind of RT hardware/software? I'm very interested in seeing what PSSR can do!!

This sounds like a PS5.1

similar power to the original model but with better performance with RT and ML

Nonsense.
 

ChiefDada

Gold Member
This GPU is comparable to 16.75TF RDNA2 GPU (PS5 is around 10).

It really isn't comparable though, because this isn't RDNA3 on PC; the closed development environment is very different and more advantageous for dual-issue programming.

Would be pretty cool if more rudimentary games that are already rendering at or near 4K could upscale to 8K internally and then downsample back to 4K for really good AA.

God of War Ragnarok would be an excellent candidate for this. IQ is already super crisp but some PSSR AA would be next level.
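For anyone unfamiliar with the idea, this is just supersampling: render (or upscale) above the output resolution, then average blocks of pixels back down, which smooths edges. A minimal sketch of the downsampling step with a plain 2x2 box filter; this is only an illustration of the general idea, not how PSSR itself works:

```python
import numpy as np

def downsample_2x(image: np.ndarray) -> np.ndarray:
    """Average each 2x2 pixel block into one output pixel (box filter).

    `image` is an (H, W, C) float array with even H and W, e.g. an
    8K (4320x7680) frame reduced to 4K (2160x3840).
    """
    h, w, c = image.shape
    return image.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# Toy example: a hard vertical edge becomes a smoothed (anti-aliased) edge.
frame = np.zeros((4, 4, 3), dtype=np.float32)
frame[:, 1:] = 1.0                   # leftmost column dark, the rest bright
print(downsample_2x(frame)[..., 0])  # the edge block averages out to 0.5
```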
 

ChiefDada

Gold Member
And here is Spider-Man 1 on PC with ray tracing. Look at how the Zen 2 CPU effectively turns the 3090 into a little bitch. If Cerny's goal was to get his GPU up to par with a 3080, then he's just handicapped his own GPU with the smallest of CPU upgrades.

Yeah... because of asset decompression. You know, that thing Cerny specifically ensured the PS5 CPU wouldn't have to strain over, thanks to its super duper ASIC decompressor and I/O stack.
 

SlimySnake

Flashless at the Golden Globes
From what limited knowledge I have, actually comparing to the PS5 it's really half the 33 TF number.

Besides, too many people are missing the main thing in all this, even if it were 100 TFs:

  • Rendering 45% faster than PS5
Yeah, it's bizarre to see people ignoring that figure. We don't need TFLOPs or IPC gains or secret sauce or console optimization nonsense, because we already have THE performance number direct from Sony themselves.

Is there a chance that the CPU got an upgrade from Zen 2 to Zen 4? This 10% CPU clock increase might not be so bad when paired with the roughly 30% IPC gain AMD achieved going from Zen 2 to Zen 4. Why are we assuming that Sony is sticking with Zen 2 if they have no problem upgrading the GPU to RDNA 4?
 

ChiefDada

Gold Member
It's not just the CPU decompression. It's mainly RT.

[Image: Spider-Man benchmark comparing RT off vs. RT on]

Nah, it's mainly decompression, which is why the 2070S and 3600 DF used in their performance analysis fell way below the PS5. Nixxes confirmed all of this with Alex. You can see below the 3600 is comfortably above 60fps with high RT in CPU-limited testing.

[Image: Ryzen 5 3600 CPU-limited test with high RT, holding above 60fps]
 

SlimySnake

Flashless at the Golden Globes
Nah, it's mainly decompression, which is why the 2070S and 3600 DF used in their performance analysis fell way below the PS5. Nixxes confirmed all of this with Alex. You can see below the 3600 is comfortably above 60fps with high RT in CPU-limited testing.

[Image: Ryzen 5 3600 CPU-limited test with high RT, holding above 60fps]
Dude, did you not see my screenshot? It literally shows RT off vs. RT on.

It doesn't matter if the 3600 can do 60 fps on a last-gen game with just one RT effect; the point is that even that game is holding back a powerful GPU like the 3090. If Cerny's intention is to get RT performance up to speed with the 3080 or 4070, then he needed a bigger CPU upgrade.

Other games with more RT effects than just reflections completely crush the 3600 and bottleneck any GPU you attach to it, and the 3600 itself is way better than the PS5's Zen 2.

I play PC games and I know exactly how heavy these games are on the CPU, especially CPUs with lower clocks like these Zen 2 chips. I've gone through this with almost a dozen games in the last couple of years alone. RT is expensive on the CPU. It's not rocket science.
 
Those other games are doing a lot more RT effects, which is why they cause a bottleneck on the CPU. It makes zero sense to increase RT performance by 2-4x and then cheap out on the CPU when the CPU is the major bottleneck.

Seriously, you can't be this dense. Those other games likely exceed the RT capabilities of the GPU, which is then causing the issue you are seeing.

You are making so many reactionary, hyperbolic, ridiculous comments. Just STOP. We know the RT performance increase - it's 2-4X per the documentation. According to you, because the CPU isn't upgraded, the RT performance gain is 0%.

WRONG.

And no, the same Cerny guy bottlenecked the PS4 Pro with a tiny memory bandwidth upgrade. He did well with the PS5, but his rival Jason Ronald and all the geniuses over at Xbox who know better than me or you produced a console that loses half its face-offs against the PS5, because those geniuses decided to clock the GPU at RDNA1 clocks while trying to hit that magical 12 TFLOPs number. You put too much faith in these people.

Mark Cerny didn't "BOTTLENECK" the Pro. He was given a budgetary constraint to work with and developed the system around it. That's always the case. I also have never put Jason Ronald on a pedestal for great hardware. Microsoft has never made great hardware; they are not a hardware company. But Sony is, and Mark Cerny has a proven track record going back to the 80s.

It's laughable that you are even questioning him on any of this.
 

FalconPunch

Gold Member
Personally, I can't wait to replace my 2 PS5s with 2 Pros. I'll be getting it on day one and I wish they'd let us preorder already. The PS5 is already struggling and we need more power. And to the PC marketers in this thread: we don't care. Some of us already have PCs and still want a Pro. I have a 4090 + 7800X3D system with a QD-OLED monitor and it mostly gathers dust. After working on a PC all day, very few want to sit at the PC and game.

PC would be more interesting if it had unique games that really took advantage of the hardware. Unfortunately, to date, there are only 2 games since Turing released which actually make playing on PC worth it: Cyberpunk and AW2. I already beat Cyberpunk on PC, and AW2 is boring. The rest of the games are just console games at higher resolutions and frame rates, which is just boring. Even the games that have RT bolted on are not transformative at all, and most of those games have RT on console for those who want it.
 

ChiefDada

Gold Member
Dude, did you not see my screenshot? It literally shows RT off vs. RT on.

Um yeah and most of that fps clawback going from on to off would be on the GPU side.

It doesn't matter if the 3600 can do 60 fps on a last-gen game with just one RT effect; the point is that even that game is holding back a powerful GPU like the 3090. If Cerny's intention is to get RT performance up to speed with the 3080 or 4070, then he needed a bigger CPU upgrade.

Ok Slimy. You know best.
 

john2gr

Member
People will get REALLY disappointed when they find out they won't get that 2-4X performance boost in Ray Tracing on the most demanding current-gen games (news flash: RT also increases CPU requirements).

This is EXACTLY like the 8K logo Sony added to PS5 (which promised 8K capabilities... yeah, when the console can barely render demanding games at 900p), and how "game-changing" the PS5 SSD would be as it would "completely change the way devs develop their games, opening entirely new experiences that could have never been achieved". It's funny watching people falling into the same hype cycle over and over and over and over. Some people never learn.
 

ChiefDada

Gold Member
People will get REALLY disappointed when they find out they won't get that 2-4X performance boost in Ray Tracing on the most demanding current-gen games (news flash: RT also increases CPU requirements).

Hi. Can you name 3 RT games that are currently CPU limited on PS5/XSX?


Or 2?

Or 1?
 

Bojji

Member
Um yeah and most of that fps clawback going from on to off would be on the GPU side.

This scene is a literal CPU benchmark; the GPU is underutilized (less than 100%). The GPU could render more frames, but the CPU can't keep up with RT.

Digital Foundry has made many videos on the CPU cost of ray tracing.



[Image: Digital Foundry comparison showing the CPU cost of ray tracing, with roughly half the frame rate lost]


HALF of the performance is lost here, and that's entirely on the CPU side.
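To illustrate what "the GPU is underutilized but fps halves" means: in a simple model the frame rate is set by whichever of the CPU or GPU takes longer per frame. A toy sketch, with millisecond figures made up purely for illustration:

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    # Each frame has to be both prepared by the CPU and rendered by the GPU,
    # so the slower of the two dictates the frame rate.
    return 1000.0 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=8.0, gpu_ms=10.0))   # RT off: GPU-bound, 100 fps
print(fps(cpu_ms=16.0, gpu_ms=11.0))  # RT on: CPU cost balloons (BVH updates etc.),
                                      # the GPU sits idle and fps drops to 62.5
```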
 

Bojji

Member
How is 33.3 TFLOPs possible? That has to be a pumped-up number using ray tracing to hit that. What is the actual GPU compute? 18 TFLOPs?

16.75, but IF developers start to use this dual-issue stuff it can potentially come closer to that 33.5 number. This reminds me of the 8TF fp16 figure for the PS4 Pro; fanboys were saying it was stronger than the Xbox One X, hahaha.
 
16.75, but IF developers start to use this dual-issue stuff it can potentially come closer to that 33.5 number. This reminds me of the 8TF fp16 figure for the PS4 Pro; fanboys were saying it was stronger than the Xbox One X, hahaha.

A bit lower than what I was expecting, but 2 - 3x better ray tracing performance should be nice for the games that utilize it.

Will this be enough to take unlocked FPS modes that hover in the 40 - 50fps range to 60fps?
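Back-of-envelope only, using the "Rendering 45% faster than PS5" figure quoted earlier in the thread and assuming those modes are purely GPU-bound (they often aren't), the 45% uplift mapped straight onto frame rate would look like this:

```python
# Assumes the quoted 45% rendering uplift translates directly into frame rate.
for base_fps in (40, 45, 50):
    print(f"{base_fps} fps -> {base_fps * 1.45:.1f} fps")
# 40 -> 58.0, 45 -> 65.2, 50 -> 72.5: most 40-50 fps modes would land around 60.
```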
 

StreetsofBeige

Gold Member
This scene is a literal CPU benchmark; the GPU is underutilized (less than 100%). The GPU could render more frames, but the CPU can't keep up with RT.

Digital Foundry has made many videos on the CPU cost of ray tracing.



[Image: Digital Foundry comparison showing the CPU cost of ray tracing, with roughly half the frame rate lost]


HALF of the performance is lost here, and that's entirely on the CPU side.

Maybe I'm blind as hell, but in the pic I don't see any visual improvement at half the frames.
 