
PS5 Pro Specs Leak Is Real, Releasing Holiday 2024 (Insider Gaming)

Skifi28

Member
That AMD link doesn't go into detail about how they achieved their resultant file sizes other than mentioning neural texture block compression. It's most likely what the name implies: using machine learning and neural networks to upsample low-resolution textures in real time.
I could have sworn I watched a video where a dev was explaining this, but I can't find anything. Maybe it was just a dream.
 

yurinka

Member
Maybe because they are releasing a mid-gen refresh a full year later than last time?
The mid-gen refresh is just a small extra for them; the Pro was only a small part of PS4 sales. I don't think it will impact the PS6.

The delay may have been related to the component shortage issues or something like that, something understandable for a product that was perhaps originally supposed to release last year. But that wouldn't apply to something releasing in late 2027 that isn't in production or even designed yet.
 

Fafalada

Fafracer forever
For ease of understanding lets say something that can render this game 1080p30 has a performance index of 1.0x.
Standard PS5
Performance mode: 1080p60 => performance index of 2.0x
Fidelity mode: 1800p30 => performance index of 1.67x
1800p30 is ~2.78x the pixels of 1080p30 (I assume you're using pixels as the metric; if it's something else, it's not clear what you're measuring).

Anyway:
Going from 1080p60 to 1440p60 (with fidelity-mode settings) is pretty much the same type of increase PS4 Pro titles made at launch (Knack, Infamous, Uncharted 4), and it was also the most common type of increase on PS4 Pro in general (1440p was a very common pick by 3rd parties as well).

This increase is also ~1.78x the number of pixels, plus increases to rendering fidelity - so pixel throughput is scaling above the '45%' expectation (that doesn't mean overall performance is, but people like to use pixels as shorthand for performance).
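As a sanity check on the pixel figures above, here's the arithmetic spelled out; a quick sketch using the standard 16:9 pixel counts, not anything from the post itself:

```python
# Pixel-count ratios behind the performance-index figures above.
def pixels(width: int, height: int) -> int:
    return width * height

base = pixels(1920, 1080)                    # 1080p reference
print(round(pixels(3200, 1800) / base, 2))   # 1800p vs 1080p -> 2.78
print(round(pixels(2560, 1440) / base, 2))   # 1440p vs 1080p -> 1.78
```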

But I think the more noteworthy part of that example is that they replace the 1800p fidelity mode with 1440p + PSSR - indicating confidence that it matches or beats the quality of 1800p with a more standard upscale, which is similar to how the likes of DLSS perform. As an indicator of PSSR quality, it's a pretty positive one.
 
Last edited:

ChiefDada

Gold Member
Gaiff Gaiff c'mon brother, don't gaslight me here. There absolutely has been a paradigm shift in console memory management. Some PC gamers, including Alex from DF, scoffed and assumed they'd be just fine because of the system memory pool on PC. And maybe you don't recognize it because most of the games making the most of it aren't out on PC yet. GoW Ragnarok is streaming massive textures, and I guarantee you plenty of PC gamers will claim "bad port" later this year. Demon's Souls is loading 3GB+ of textures on the fly as you turn corners, per the developer. SM2 is now loading in animations on the fly in addition to textures during gameplay.

Look at the responses to a thread where the OP asked if a 3070 would suffice to play next-gen games at 1440p/ultra settings/144fps. The delusion was ultimately because they didn't understand or believe in the console I/O revelation, and by extension focused entirely on compute and paid no mind to memory concerns; just look how many people justified their answer by bringing up DLSS. I think all quoted below are banned, and there are similar posts in the thread from people who aren't, but I'm not seeking to call anyone out here. Just showing you the mindset of those who didn't foresee memory as an issue, and there were many.


Yes. Easily. You will want a card that has access to DLSS as its becoming more and more prevalent with even developers of smaller titles implementing it and older games like RDR2 opting to patch it in.

You probably won't hit 144fps in the most demanding games but 60fps should be attainable everywhere with DLSS and no RT.

My guess is mostly.

The 2070 could pretty much run every last gen game at 144hz @ 1440p. I imagine the 3070 should be enough for most, with the occasional poorly optimised game running circa 100fps.

Enough? At those specs it's almost overkill.

Yes, I would trade the 6900xt for the 3070. But only because of the RT. DLSS is icing on the cake. But if you don't care about the RT, then I'd keep the 6900xt.

At 1080p with DLSS hah easy peazy whole next gen, even at WQHD you'll be fine, but it also depends if you have top CPU.

Yes without ray tracing you might even get 4k 60 with few settings tweaked


Although not sure if you can hit 140fps. But 70-80 should be doable at ultra settings most of the time
 

Gaiff

SBI’s Resident Gaslighter
Gaiff Gaiff c'mon brother, don't gaslight me here. There absolutely has been a paradigm shift in console memory management. Some PC gamers, including Alex from DF, scoffed and assumed they'd be just fine because of the system memory pool on PC. And maybe you don't recognize it because most of the games making the most of it aren't out on PC yet. GoW Ragnarok is streaming massive textures, and I guarantee you plenty of PC gamers will claim "bad port" later this year. Demon's Souls is loading 3GB+ of textures on the fly as you turn corners, per the developer. SM2 is now loading in animations on the fly in addition to textures during gameplay.
A paradigm shift is 3 games? Including one that's out on PS4 and running just fine? We must have very different definitions of what a paradigm and a shift are. Besides that, Rift Apart runs just fine on a run-of-the-mill SSD, and it was supposedly a showcase. Ragnarok will run alright too. The real question mark is Spider-Man 2.
Look at the responses to a thread where the OP asked if a 3070 would suffice to play next-gen games at 1440p/ultra settings/144fps. The delusion was ultimately because they didn't understand or believe in the console I/O revelation, and by extension focused entirely on compute and paid no mind to memory concerns; just look how many people justified their answer by bringing up DLSS. I think all quoted below are banned, and there are similar posts in the thread from people who aren't, but I'm not seeking to call anyone out here. Just showing you the mindset of those who didn't foresee memory as an issue, and there were many.
The delusion stems from utter stupidity and ignorance, because a 3070 was never enough to play games at 1440p/ultra/144fps at any point in time unless you stuck to stuff like CSGO or LoL. I have no idea who thought they could get that sort of performance out of a mid-tier card, but they were foolish lol.
 
Last edited:

King Dazzar

Member
Do we know if the PS5 Pro would theoretically improve DRS and frame rate stability on PS4 BC games? Or are the PS4 BC games already running at their maximum possible on base PS5?
 

Mr.Phoenix

Member
Would be super cool if Spider-Man 2 and Rift Apart were both playable in their native fidelity mode at 60fps on PS5 Pro with a few more added bells and whistles

For these two games, easily.
Even if that were going to be the case, I think the sooner we all accept that the industry at large is shifting away from native resolutions... the better. And not only am I fully on board with that, you honestly can't blame them.

The amount of power left on the table chasing native 4K when you could render 1440p and AI-reconstruct to 4K is just ridiculous, especially when you consider that there isn't more than, like, a 5% fidelity difference between the two outputs (if it's even quantifiable).

It just doesn't make any kind of sense to lose 70%+ of performance to gain 5% of fidelity.
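The raw pixel math behind that trade-off is easy to check; the 70%/5% figures are the poster's own estimates, and this sketch only shows the shading-work ratio:

```python
# Shading work for native 4K vs a 1440p internal render that gets AI-upscaled.
native_4k = 3840 * 2160   # 8,294,400 pixels
internal  = 2560 * 1440   # 3,686,400 pixels
print(native_4k / internal)   # -> 2.25, i.e. 2.25x the pixels to shade natively
```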
Do we know if the PS5 Pro would theoretically improve DRS and frame rate stability on PS4 BC games? Or are the PS4 BC games already running at their maximum possible on base PS5?
I believe the PS5 already runs PS4 games at their theoretical max. Anything more than what we got would require patches to be made to said PS4 games.
 

nial

Gold Member
Typically, this is fine because by now, in say the PS2 gen, we would be talking about new consoles coming out next year (in the 5th year) or at worst the year after that. But when we have generations lasting as long as 7-8 years
Eh, I would say long before that; the PS2 had almost a full 7-year generation as well (March 2000 - November 2006, just 4 months short).
 
Last edited:

lh032

I cry about Xbox and hate PlayStation.
Do we know if the PS5 Pro would theoretically improve DRS and frame rate stability on PS4 BC games? Or are the PS4 BC games already running at their maximum possible on base PS5?
I think it will improve things if a PS4 BC game still doesn't "reach" maximum frame rate stability.

For example, if said PS4 game runs at 58-60fps, it would probably run at a locked 60fps on PS5 Pro.
If said PS4 game already runs locked at 60fps at "maximum" resolution, it will run the same on PS5 Pro.

Otherwise, an "enhanced" patch will be required from the devs.

Just my assumption.
 
Last edited:

Arioco

Member
Do we know if the PS5 Pro would theoretically improve DRS and frame rate stability on PS4 BC games? Or are the PS4 BC games already running at their maximum possible on base PS5?


It depends on the game. Some games running with unlocked frame rates or dynamic resolution on PS4/PS4 Pro do indeed run at max res and 60fps on PS5 (like Elden Ring and many others). Some others don't. Assassin's Creed Unity, for instance, can't maintain 60fps on PS5 despite being just 900p; it seems to be CPU-bound even on PS5, despite the fact that PS5's CPU should be able to run the game at over 100fps. So it looks like PS5 has different profiles for different games: some can take full advantage of PS5's power, but others can't handle such a huge boost in clock speed, so in those cases Sony decided to downclock the CPU, the GPU, or both to ensure backwards compatibility.


This could also be the case when PS5 Pro launches.

To be honest, I don't know whether Sony has made any improvements to BC on PS5 since it launched, but this was the situation three years ago.
 

ChiefDada

Gold Member
A paradigm shift is 3 games? Including one that's out on PS4 and running just fine? We must have very different definitions of what a paradigm and a shift are. Besides that, Rift Apart runs just fine on a run-of-the-mill SSD, and it was supposedly a showcase. Ragnarok will run alright too. The real question mark is Spider-Man 2.

The paradigm shift is in how games are made. We will be seeing more i/o dependent games, not less. Few games had raytracing when Nvidia introduced Turing but those few games still represented a paradigm shift.

But sorry, I'm beginning to derail the thread. Back on topic - I truly can't wait to see what the hardware can do. My guess is they will prioritize showtime for all games that adopt the new graphics mode at the showcase. So most games will be shown at 60fps with visuals the majority associate with a 30fps cap. I think this will be the ultimate seller of the Pro.

Undoubtedly GTA VI will make an appearance, and as I've stated before, Sony really can't lose here: either GTA VI runs at 60fps on the Pro and shocks the vast majority of gamers and tech experts who anticipated a CPU-limited 30fps, or GTA VI runs at 30fps on the Pro and instantly becomes the best-looking game ever made, contested by none and clearly distinguishable from even the reveal trailer given the GPU headroom/RT performance the PS5 Pro has over the base PS5.
 

Gaiff

SBI’s Resident Gaslighter
The paradigm shift is in how games are made. We will be seeing more i/o dependent games, not less. Few games had raytracing when Nvidia introduced Turing but those few games still represented a paradigm shift.
I dunno bro. Doesn't seem like much has changed. I'm quite underwhelmed by everything so far I'd even say. I was expecting Spider-Man 2 to look way better.

Now our only hope is for Naughty Dog to really blow us away. At least as far as first-party is concerned. GTA VI will probably do that, but the game has an unlimited budget so meh.
 

Boss Mog

Member
Gaiff Gaiff c'mon brother, don't gaslight me here. There absolutely has been a paradigm shift in console memory management. Some PC gamers, including Alex from DF, scoffed and assumed they'd be just fine because of the system memory pool on PC. And maybe you don't recognize it because most of the games making the most of it aren't out on PC yet. GoW Ragnarok is streaming massive textures, and I guarantee you plenty of PC gamers will claim "bad port" later this year. Demon's Souls is loading 3GB+ of textures on the fly as you turn corners, per the developer. SM2 is now loading in animations on the fly in addition to textures during gameplay.

Look at the responses to a thread where the OP asked if a 3070 would suffice to play next-gen games at 1440p/ultra settings/144fps. The delusion was ultimately because they didn't understand or believe in the console I/O revelation, and by extension focused entirely on compute and paid no mind to memory concerns; just look how many people justified their answer by bringing up DLSS. I think all quoted below are banned, and there are similar posts in the thread from people who aren't, but I'm not seeking to call anyone out here. Just showing you the mindset of those who didn't foresee memory as an issue, and there were many.
I'm not banned, chief. Not sure why you quoted me nor what nonsense you're spouting, but I stand by what I said then, and I've been proven correct. The 3070 does not run a lot of current games at 144fps in 1440p with ultra settings. In fact, it doesn't even come close; some games run sub-60fps. But like I said, with AI upscaling/frame-gen (DLSS/FSR) and no RT you should be able to attain at least 60fps in all current games.
 

IDWhite

Member
The paradigm shift is in how games are made. We will be seeing more i/o dependent games, not less. Few games had raytracing when Nvidia introduced Turing but those few games still represented a paradigm shift.

But sorry I'm beginning to derail the thread. Back on topic - I truly can't wait to see what the hw can do. My guess is they will prioritize showtime for all games that adopt the new graphics mode at the showcase. So most games will be shown at 60fps with visuals majority associate with 30fps cap. I think this will be the ultimate seller of the Pro.

Undoubtedly GTA VI will make an appearance and as I've stated before, Sony really can't lose here: Either GTA VI runs at 60fps on the Pro and shocks the vast majority of gamers and tech experts who anticipated CPU limited 30fps, or GTA VI runs at 30fps on the Pro and instantly becomes the best looking game ever made; contested by none and significantly distinguishable from even the reveal trailer with the amount of GPU headroom/RT performance PS5 Pro has over the base PS5.

It's not that games are more dependent on I/O now, or that they would be in the future; they have always depended on a system that moves data from slower to faster memory. The thing is that we continuously need more memory to store assets with higher fidelity, and that translates into a need for more bandwidth to move that data. Actually, you can play all games on SATA SSDs without issues, and a lot of them even on HDDs.

The real paradigm shift Mark Cerny refers to is that on PS5, thanks to its architecture, you can actually stream data on camera moves: when you are turning, the engine can selectively evict from memory assets that are out of view and dedicate that freed memory to what you are about to see. This depends not only on bandwidth but also on latency.

Actually, that paradigm shift hasn't materialized yet. At least not to the level expected by the people who built the system and those who know what it is capable of.
 
Last edited:

King Dazzar

Member
Arioco Arioco lh032 lh032 Imtjnotu Imtjnotu thanks guys. Yeah, I was more wondering if PS4 BC was already limited by software-emulation-type constraints, i.e. extra power may not bring any benefits at all if the PS4 BC "container" on PS5 is already emulating to the maximum it can. If that makes better sense. And I'm with Mr.Phoenix Mr.Phoenix on this. But there's no doubt someone out there with deeper technical knowledge may have been able to confirm more.
 
Last edited:

Perrott

Member
The paradigm shift is in how games are made. We will be seeing more i/o dependent games, not less. Few games had raytracing when Nvidia introduced Turing but those few games still represented a paradigm shift.

But sorry, I'm beginning to derail the thread. Back on topic - I truly can't wait to see what the hardware can do. My guess is they will prioritize showtime for all games that adopt the new graphics mode at the showcase. So most games will be shown at 60fps with visuals the majority associate with a 30fps cap. I think this will be the ultimate seller of the Pro.

Undoubtedly GTA VI will make an appearance, and as I've stated before, Sony really can't lose here: either GTA VI runs at 60fps on the Pro and shocks the vast majority of gamers and tech experts who anticipated a CPU-limited 30fps, or GTA VI runs at 30fps on the Pro and instantly becomes the best-looking game ever made, contested by none and clearly distinguishable from even the reveal trailer given the GPU headroom/RT performance the PS5 Pro has over the base PS5.
Even though the next-gen ports of GTA V were unveiled and shown at Sony conferences/showcases, I wouldn't dare say that GTA VI is undoubtedly going to show up at the next PlayStation Showcase.

It's actually been 15 years since Rockstar last threw a bone to these platform holders' events with any sort of announcement or update regarding an upcoming original title - and that was at Sony's E3 2009, with Agent, of all things.

That said, Sony would be stupid not to try to leverage their long-standing relationship with Rockstar to sell mainstream audiences the idea that the best place to play GTA VI is going to be the PS5 Pro.
 
Last edited:

ChiefDada

Gold Member
I'm not banned, chief. Not sure why you quoted me nor what nonsense you're spouting, but I stand by what I said then, and I've been proven correct. The 3070 does not run a lot of current games at 144fps in 1440p with ultra settings. In fact, it doesn't even come close; some games run sub-60fps. But like I said, with AI upscaling/frame-gen (DLSS/FSR) and no RT you should be able to attain at least 60fps in all current games.

Hey man, I didn't quote you to shame anyone, just revisiting expectations. But on the bolded: there are a great many current-gen games that can't maintain 60fps at 1440p ultra, DLSS or not. And to keep the history straight, you mentioned nothing about frame-gen in your original post, because of course frame-gen didn't exist then.
 

David B

An Idiot
So I posted in a group on Facebook. It's a PS5 group, so you would think saying something positive about the PS5 would go over well, right? Wrong! I said 33 teraflops has been the big rumor for a while now, and that will be better than most people's PCs. They all went hahahahaha at me. Like, dudes, seriously, most people don't have a dedicated GPU in their PC, and 33 teraflops is like a 3070 or better even. I don't get how whenever I post it's a joke. Yet this isn't a joke. They even said the RAM is gonna be the same, and the CPU also; it's still gonna be a slow 3rd-gen AMD CPU, hahahahaha. Like, OK, so? The RAM is DDR5 or better already, so yeah, it's all fine and gonna be way better.
 

winjer

Gold Member
So I posted in a group on Facebook. It's a PS5 group, so you would think saying something positive about the PS5 would go over well, right? Wrong! I said 33 teraflops has been the big rumor for a while now, and that will be better than most people's PCs. They all went hahahahaha at me. Like, dudes, seriously, most people don't have a dedicated GPU in their PC, and 33 teraflops is like a 3070 or better even. I don't get how whenever I post it's a joke. Yet this isn't a joke. They even said the RAM is gonna be the same, and the CPU also; it's still gonna be a slow 3rd-gen AMD CPU, hahahahaha. Like, OK, so? The RAM is DDR5 or better already, so yeah, it's all fine and gonna be way better.

But it's not really 33 TFLOPs. It's 16.5 TFLOPs + dual-issue (VOPD).
It will be better than just 16.5 TFLOPs, but it won't be anywhere close to the performance of 33 TFLOPs.
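For what it's worth, here's where the two numbers come from; the CU count and clock are assumptions based on the leaked specs, not confirmed figures:

```python
# Back-of-envelope TFLOPs for the rumored PS5 Pro GPU (all figures assumed).
cus            = 60     # active compute units (assumption from the leak)
clock_ghz      = 2.18   # GPU clock in GHz (assumption)
fp32_lanes     = 64     # FP32 lanes per CU
flops_per_fma  = 2      # one fused multiply-add counts as 2 FLOPs

single_issue_tf = cus * clock_ghz * fp32_lanes * flops_per_fma / 1000
print(round(single_issue_tf, 1))      # ~16.7 TFLOPs single-issue
print(round(single_issue_tf * 2, 1))  # ~33.5 TFLOPs if every op dual-issues (best case)
```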
 

HeisenbergFX4

Gold Member
But it's not really 33 TFLOPs. It's 16.5 TFLOPs + dual-issue (VOPD).
It will be better than just 16.5 TFLOPs, but it won't be anywhere close to the performance of 33 TFLOPs.
Matthew Bingo GIF
 

David B

An Idiot
But it's not really 33 TFLOPs. It's 16.5 TFLOPs + dual-issue (VOPD).
It will be better than just 16.5 TFLOPs, but it won't be anywhere close to the performance of 33 TFLOPs.
Is this another rumor-filled thing? I'm going based off that YouTube guy who said 33 teraflops. Now it's 16.5. Well, we still won't really know until Sony comes out with a PS5 Pro commercial.
 

HeisenbergFX4

Gold Member
Their "45% faster" performance figure shows that it's actually performing worse than 16.5 TFLOPs would suggest; if performance scaled 1:1 with TFLOPs, it would be 60% faster.
People focused way too much on TFs. I'm not sure how long I have warned people not to look at that number, that they would be disappointed by the actual figure, and to wait to see this thing actually perform.

This 45% figure is way more important, and it's Sony's own best-case scenario.

I know I have said 15-17-ish TF range for well over a year now, but that was before I had heard anything about PSSR.
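The 60% figure quoted above is just TFLOP arithmetic against the base PS5 (10.28 TFLOPs assumed for the base console):

```python
# Ideal vs claimed uplift for the rumored PS5 Pro.
base_tf = 10.28   # base PS5 FP32 TFLOPs (assumed)
pro_tf  = 16.5    # rumored Pro single-issue TFLOPs

ideal = pro_tf / base_tf - 1
print(f"{ideal:.1%}")   # -> 60.5% if performance scaled 1:1 with TFLOPs
# Sony's claimed figure is 45%, so real-world scaling sits below 1:1.
```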
 

Imtjnotu

Member
We just have to see what PSSR brings to the table before we start talking about performance, people.

If it does have frame gen, we won't be seeing many 30fps titles anymore
 
I personally wouldn't read too much into the 33 TF figure; that comes from the GPU's dual-issue compute capability. AMD's own drivers struggle to take advantage of it, so I'm not sure what the case will be for Sony. Studios can hand-write assembly to take advantage of the feature, but we all know how active devs are these days when it comes to advanced features.

Regardless, I think the dual-issue compute capability won't mean much; even if devs do decide to take advantage of it, there's a good chance they'll run into memory bottlenecks before it delivers any meaningful advantage. Just a guess on my part, but we'll see how things play out.
 

MarkMe2525

Gold Member
So because there is no material proof and no official announcement, you doubt it exists? As opposed to looking at precedent and at least acknowledging that Sony makes such announcements at most 2 months before an official release. They did it with the PS3 Slim... twice, the PS4 Slim, the PS4 Pro, etc. Why would you expect them to break that trend now and announce a PS5 Pro potentially 6 months out, and take the lack of an announcement to mean it's not coming?
I'm not agreeing with the other poster, but I wanted to point out the flaw in your proposed logic, and why holding onto a high expectation of evidence is a rational way to form beliefs. Not because I disagree with either side of the debate, but because I find the topic of belief interesting.

If you care about believing "true" things, then being leery of confirmation by precedent is a rational position to hold. An example: if I flip a coin 4 times and every flip results in the coin landing on heads, does that precedent change the probability of the coin landing on tails on a fifth flip? It doesn't. Now, in this case we are not dealing with random variables, but even in cases where precedent was set by an actor with agency, we are dealing with incomplete information. At best, all we can really assign to the likelihood of the rumors being true is a probability. Even in cases where we assign a high probability of something being true, that in itself is not a good reason to "believe" it's true. The most rational position to take, when presented with incomplete information, is to withhold "belief" until the "truth" of the matter can be demonstrated.

I could go on for hours, but I'll stop in case you have a rebuttal.
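The coin-flip point is just independence of trials; a minimal sketch:

```python
from fractions import Fraction

# A fair coin's fifth flip doesn't care about the first four:
# each flip is an independent event with probability 1/2.
p_tails_fifth = Fraction(1, 2)           # unchanged by the four heads before it
p_full_sequence = Fraction(1, 2) ** 5    # HHHHT as one specific sequence
print(p_tails_fifth)    # 1/2
print(p_full_sequence)  # 1/32
```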
 

PaintTinJr

Member
The only thing I could find is this PDF presentation, which was used at GDC. I'm guessing there's a full video presentation out there locked behind the GDC paywall.

IMHO this is a really interesting ML upscaling presentation, but it also solidifies, IMO, the problems we've seen with the PS5 underwhelming in its use of its hardware compared to previous gens like PS3. From reading the paper, it feels like the whole motivation was to ship a PS4 game with 2048x2048 BC7-compressed textures and then just use additional compute on PS5 to ML-upscale them to 4096x4096 to make it a bit more PS5-like. Only 7GB of data transmission is saved over 10 minutes of gameplay, which is frankly nothing compared to the I/O complex's throughput over those 10 minutes; even accounting for loading times, shipping 8192x8192 BC7 textures for normal maps would have been superior in every way from a visual perspective.

Firsthand, eyes-on: the normal mapping - and texturing in general - in GoW Ragnarok on PS5 did look amazingly clean and sharp IMO from seeing it in action at a friend's on his OLED, but it also had a look of repeating high frequencies, as I recall. Which, from reading the paper, makes perfect sense, because upscaling from a lossy format still can't transcend the problems of the BC7 data at source compared to original higher-resolution BC7 textures (or raw/lossless).

I was also pretty underwhelmed by the stated PSNR level, the comparison visuals versus bilinear, and the performance measurements, both before and after optimisation on PS5 - as though the strategy of using a single texture package across PS4 and PS5 was the wrong decision from the outset. IMO, for a strategy that ships smaller textures and upscales with AI, regression algorithms tailored to the source data - rather than built on top of compromised lossy BC (block compression) formats - would perform better and provide superior PSNR, because they would take source to end result instead of quantizing both the lossy compression and the upscaling through the precision limits of BC7. But I guess the interesting part is that, if this was the origin of a PSSR algorithm, the presentation is quite critical of the things that didn't work - to the point of only using it for normal maps and colour-texture fallback in the shipped product - so they certainly had their sights set on much better PSNR and weren't drinking their own Kool-Aid.
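For reference, PSNR (the quality metric the presentation reports) is straightforward to compute; a generic sketch for 8-bit texture data, not the paper's actual code:

```python
import numpy as np

def psnr(original: np.ndarray, reconstructed: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two 8-bit images."""
    diff = original.astype(np.float64) - reconstructed.astype(np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * np.log10(peak ** 2 / mse)

# Toy example: a random texture vs a slightly perturbed reconstruction.
rng = np.random.default_rng(0)
tex = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
noise = rng.integers(-2, 3, size=tex.shape)
noisy = np.clip(tex.astype(int) + noise, 0, 255).astype(np.uint8)
print(f"{psnr(tex, noisy):.1f} dB")  # around 45 dB for this tiny noise level
```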
 
Last edited:

PaintTinJr

Member
But it's not really 33 TFLOPs. It's 16.5 TFLOPs + dual-issue (VOPD).
It will be better than just 16.5 TFLOPs, but it won't be anywhere close to the performance of 33 TFLOPs.
I guess it depends on what the input resolution for pristine 4K PSSR is. If the algorithm lets them drop a resolution tier as the source input for PSSR, then a PS5 Pro only having to render natively at 1080p could do effects that transcend its TF/s number when reconstructed/upscaled with PSSR, because in the current PC DLSS space 1440p is required as the source for quality 4K, and the performance cost of going from native 1080p to 1440p is even more significant when you add RT to that native image.
 
Last edited:

winjer

Gold Member
I guess it depends on what the input resolution for pristine 4K PSSR is. If the algorithm lets them drop a resolution tier as the source input for PSSR, then a PS5 Pro only having to render natively at 1080p could do effects that transcend its TF/s number when reconstructed/upscaled with PSSR, because in the current PC DLSS space 1440p is required as the source for quality 4K, and the performance cost of going from native 1080p to 1440p is even more significant when you add RT to that native image.

That's still the same amount of compute the hardware can do.
It just improves performance by not having to render as many fragment shaders.
 

PaintTinJr

Member
That's still the same amount of compute the hardware can do.
It just improves performance by not having to render as many fragment shaders.
It's also at least a third less resource contention in standard forward-rendered raster games like CS, more like 75% less in the basic RT mode of games like Spider-Man 2, even more with Spider-Man 2's multi-bounce mode, and more still with Pro-level RT. So a native 1080p image with a better ML upscaler to 4K would massively distort perception of the hardware's TF/s, IMHO.
 

Valonquar

Member
How long was the time span between the PS4 Pro's official announcement and its release? Feels like Sony should be announcing this pretty soon for real if it's going to be a winter release.

***Edit: PlayStation 4 Pro was announced on September 7, 2016 and released on November 10, 2016. I guess we'll know for sure if it's going to happen in a few months.
 
Last edited:

THE:MILKMAN

Member
How long was the time span between the PS4 Pro's official announcement and its release? Feels like Sony should be announcing this pretty soon for real if it's going to be a winter release.

***Edit: PlayStation 4 Pro was announced on September 7, 2016 and released on November 10, 2016. I guess we'll know for sure if it's going to happen in a few months.

2.5 months. Early Sept reveal, mid/late Nov release 2016.
 

Mr.Phoenix

Member
I'm not agreeing with the other poster, but I wanted to point out the flaw in your proposed logic, and why holding onto a high expectation of evidence is a rational way to form beliefs. Not because I disagree with either side of the debate, but because I find the topic of belief interesting.

If you care about believing "true" things, then being leery of confirmation by precedent is a rational position to hold. An example: if I flip a coin 4 times and every flip results in the coin landing on heads, does that precedent change the probability of the coin landing on tails on a fifth flip? It doesn't. Now, in this case we are not dealing with random variables, but even in cases where precedent was set by an actor with agency, we are dealing with incomplete information. At best, all we can really assign to the likelihood of the rumors being true is a probability. Even in cases where we assign a high probability of something being true, that in itself is not a good reason to "believe" it's true. The most rational position to take, when presented with incomplete information, is to withhold "belief" until the "truth" of the matter can be demonstrated.

I could go on for hours, but I'll stop in case you have a rebuttal.
No rebuttals here... I am in agreement. The issue with what he said, though, is that "he believes" it's not coming. Because, as you said, there is a lack of information.

I, on the other hand, am of the impression that the same way a lack of information doesn't confirm something, it doesn't deny it either. And in such a case, if one must err, you err on the side of caution - in this case, caution informed by precedent.

E.g. if history has shown that with things like these - be it pro consoles or refresh consoles - first you have rumors and leaks, then an official announcement around 2 months before release, and such new hardware tends to release in November... it would be flat-out ignorant not to at least wait until September (two months before said November) before assuming the thing won't happen, when everything else that usually leads up to it already has.
 
Last edited:

Audiophile

Member
I can't find the chart, but I recall one (I think on B3D) showing operations per cycle for Dual-Issue on RDNA3 vs Single-Issue on RDNA2. I'm pretty sure the areas of advantage were FP32 Adds to the tune of +~80%, FP32 Multiply/Adds to the tune of +~20% and some mixed FP/INT operations to the tune of +~10%. With everything else being neck and neck or worse.

I have no idea what proportion of operations in your average game engine are applicable to these. But if someone knew I guess you could roughly work out what the real-world performance gains of Dual-Issue would be if it was optimised for.
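To ballpark it, you can weight the per-class gains by an assumed operation mix; everything below is illustrative (the mix is made up, the gains are the ones recalled from the chart):

```python
# Rough weighted dual-issue speedup over single-issue, given per-op-class
# gains (from the recalled B3D chart) and a hypothetical instruction mix.
gains = {"fp32_add": 1.8, "fp32_fma": 1.2, "mixed_fp_int": 1.1, "other": 1.0}
mix   = {"fp32_add": 0.15, "fp32_fma": 0.50, "mixed_fp_int": 0.15, "other": 0.20}

# Total time is the sum of per-class times, so the overall speedup is a
# weighted harmonic mean of the per-class speedups.
speedup = 1 / sum(mix[op] / gains[op] for op in mix)
print(f"{speedup:.2f}x")  # -> 1.20x for this particular (made-up) mix
```

Swap in a measured instruction mix and the same formula gives a workload-specific estimate.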

Hopefully with it being a single fixed spec platform, they'd be able to utilise it more in time.

As it stands though I think it's still fair to consider the Single-Issue FP32 performance as the more reasonable point of reference.

I'd love it if Cerny talks about this in a presentation, to see if they have any plans to properly utilise Dual Issue. It seems like a good avenue to go down given the limited power budget of consoles and the need for efficiency wherever possible.
 
Last edited:

MarkMe2525

Gold Member
No rebuttals here... I am in agreement. The issue with what he said, though, is that "he believes" it's not coming. Because, as you said, there is a lack of information.

I, on the other hand, am of the impression that the same way a lack of information doesn't confirm something, it doesn't deny it either. And in such a case, if one must err, then you err on the side of caution; in this case, that would be caution informed by precedent.

E.g. if history has shown that with things like these, be it pro consoles or refresh consoles, first you get rumors and leaks, then an official announcement around 2 months before the thing is released, and such new hardware tends to release in November... it would be flat-out ignorant not to at least wait until September (two months before said November) before assuming the thing won't happen, when everything else that usually leads up to it already has.
Very true, a lack of evidence neither confirms nor refutes a proposition. All a lack of evidence provides is a reason to be skeptical of a positive assertion. Now, here is where I may get myself in trouble: I assume the user's assertion that they "do not believe" the PS5 Pro rumors is an assertion that they are not convinced of the proposition. We often use the word "belief" in colloquial terms, and the user's mention of a lack of "material evidence" suggests they meant to refute the positive assertion rather than make a positive assertion themselves. I acknowledge my assessment of the other user's intent hinges on my presumption that they are using rational means to reach their conclusion, and presuming can often lead us to inaccurate conclusions. Regardless, I appreciate you humoring me.
 
Last edited:

bitbydeath

Gold Member
But it's not really 33 TFLOPS. It's 16.5 TFLOPS + VOPD dual-issue.
It will be better than just 16.5 TFLOPS, but it won't be anywhere close to the performance of a true 33 TFLOPS.
I’m thinking it will use the 33TF to reach 8K, since the 33TF is available but takes more work to get at. The 16.5TF will be for the standard 4K mode.
 

bitbydeath

Gold Member
That is not how VOPD works.
I think you’re getting mixed up with the 67TF figure.

Let’s look at what we know:
They’re aiming for-
4K 60+
8K 30+

We’ve already seen that 4K 60+ uses PSSR, so unless it is some miracle software, PSSR alone won’t carry it to 8K 30+.

We also know 8K 30+ is coming in a further software update after launch.

Now either you’re dismissing that part of the rumour, in which case you may as well dismiss all of it, or you haven’t thought through how 8K 30+ is going to work.
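For reference, the raw pixel math behind those two targets (ignoring upscaling and everything else that makes real rendering cost non-linear), taking 4K and 8K as the usual UHD resolutions:

```python
# Pixel counts for the two rumored targets (UHD conventions).
px_4k = 3840 * 2160   # 8,294,400 pixels per frame
px_8k = 7680 * 4320   # 33,177,600 pixels per frame

frame_ratio = px_8k // px_4k            # an 8K frame is 4x a 4K frame
rate_ratio = (px_8k * 30) / (px_4k * 60)  # 8K30 vs 4K60 in pixels per second

print(frame_ratio)  # 4
print(rate_ratio)   # 2.0 -- 8K30 pushes twice the pixels/sec of 4K60
```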
 

winjer

Gold Member
I think you’re getting mixed up with the 67TF figure.

Let’s look at what we know:
They’re aiming for-
4K 60+
8K 30+

We’ve already seen that 4K 60+ uses PSSR, so unless it is some miracle software, PSSR alone won’t carry it to 8K 30+.

We also know 8K 30+ is coming in a further software update after launch.

Now either you’re dismissing that part of the rumour, in which case you may as well dismiss all of it, or you haven’t thought through how 8K 30+ is going to work.

The 67 TFLOP figure is from FP16 precision.
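For what it's worth, the usual marketing math connects the figures floating around this thread, assuming dual-issue FP32 and packed FP16 each double the peak rate. The base number here is a guess chosen so the commonly quoted ~33.5/67 figures fall out; the thread rounds it to 16.5.

```python
# Peak-rate ladder under the "each step doubles" assumption.
fp32_single = 16.75               # assumed single-issue FP32 TFLOPS
fp32_dual = fp32_single * 2       # VOPD dual-issue peak FP32
fp16_packed = fp32_dual * 2       # packed FP16 runs at 2x the FP32 rate

print(fp32_dual, fp16_packed)     # 33.5 67.0
```

These are theoretical peaks, not sustained throughput; as discussed above, real code rarely keeps the dual-issue path full.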
 

PaintTinJr

Member
The 67 TFLOP figure is from FP16 precision.
Assuming PSSR has some commonality with the ML BC7 texture upscaling used in GoW Ragnarok on PS5, some of that 67 TF/s would definitely get utilised with minimal wastage. The PS5 optimisations in that presentation show them eventually dropping down to FP16 with no discernible loss of precision over FP32, and they had already mapped the algorithm at the wave/CU level, so they would probably be able to handle more blocks per cycle on dual issue to use that extra capacity, if I've followed their optimisations correctly.

edit:
It is interesting that they also point out that the optimised algorithm would probably perform badly on other hardware unaltered.
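A quick NumPy sketch of why the FP32-to-FP16 drop can be visually lossless for texture work: for channel values normalised to [0, 1], FP16 rounding error sits far below the 1/255 quantisation step of an 8-bit texture channel. (This only illustrates the precision argument, it is not the GoW implementation.)

```python
import numpy as np

# Random normalised channel values standing in for texture data.
rng = np.random.default_rng(0)
texels = rng.random(100_000, dtype=np.float32)

# Round-trip through half precision and measure the worst-case error.
roundtrip = texels.astype(np.float16).astype(np.float32)
max_err = float(np.abs(roundtrip - texels).max())

# FP16 keeps ~11 significant bits, so the error is well under 1/255.
print(max_err < 1 / 255)  # True
```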
 
Last edited:

winjer

Gold Member
Assuming PSSR has some commonality with the ML BC7 texture upscaling used in GoW Ragnarok on PS5, some of that 67 TF/s would definitely get utilised with minimal wastage. The PS5 optimisations in that presentation show them eventually dropping down to FP16 with no discernible loss of precision over FP32, and they had already mapped the algorithm at the wave/CU level, so they would probably be able to handle more blocks per cycle on dual issue to use that extra capacity, if I've followed their optimisations correctly.

edit:
It is interesting that they also point out that the optimised algorithm would probably perform badly on other hardware unaltered.

For ML tasks, including upscalers, it's more common to use INT8 and INT4. And then there might be sparsity support in RDNA4.
So for ML, the number of TOPS will be much higher.
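The usual rule-of-thumb ladder for those rates, assuming each halving of precision doubles peak throughput and structured sparsity doubles it again (true of several recent GPU architectures, but not confirmed for this chip):

```python
# Rule-of-thumb peak-rate ladder for ML precisions.
fp16_tflops = 67.0                 # FP16 figure quoted earlier in the thread
int8_tops = fp16_tflops * 2        # half the width, twice the rate (dense)
int4_tops = int8_tops * 2          # halved again (dense)
int8_sparse_tops = int8_tops * 2   # 2x from structured sparsity, if supported

print(int8_tops, int4_tops, int8_sparse_tops)  # 134.0 268.0 268.0
```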
 