Next-Gen PS5 & XSX |OT| Console tEch threaD

Sorry, but that's one of the stupidest comments I've read here in a long time. Calling a game "the worst thing that happened" just because of the graphics is really weak. If you don't like it: all right. But what about games like Shovel Knight or The Binding of Isaac? No graphical masterpieces, but very good games. So what's wrong with that? Did Minecraft somehow influence the graphics of the other current-gen titles?

If you were complaining about the technically really creepy Java version on PC, I'd go along with it. But this?
I agree with you. Bo_Hazem, please tell me you don't evaluate the game by its graphics.
 

Man, if I see kids playing Minecraft in class I confiscate their cellphones (well, they're not allowed to have their cellphones in class anyway), and give them a pass if it's another game :lollipop_tears_of_joy:

Minecraft: graphics, gameplay, everything is just painful to look at. It might be the best tool for mind-torturing a suspect into confessing without physical harm: just run that crap on a loop.

But hey, it's me, and I think many share that opinion. If you like it, well, happy for you. :lollipop_raising_hand:
 
Man, it's not just the graphics, it's everything about it. I've watched it several times with kids and tried to convince myself to like it, but couldn't. It's so bad that if I watch it I feel like crying from boredom.

My opinion is that it's the worst game I've ever seen in my life; I would take Life of Black Tiger any day. :lollipop_tears_of_joy:

 

Or you just accept that other people like it. After all, it's the best-selling game ever.

Incidentally, I'm no fan of Minecraft. I don't play Fortnite either. But don't shoot these games down right away and come up with confused reasons why they're bad. A certain objectivity would be appropriate here.
 

I'm pretty objective here; if someone likes it, that doesn't mean I should. I personally find it insultingly bad, and even if the devs paid me $500 to play and complete its story (if it has any), I wouldn't take it.

It's a game, take it easy there. 🤷‍♂️ I don't want shit examples of ray tracing, or even Quake for that matter; they reflect nothing about real games.

By the way, it amuses me that "graphics don't matter," yet then we talk about 10-12TF and other things. On top of all that, some are still arguing that 4K is overrated.

Hint: Graphics are a MASSIVE part of game immersion.
 

I found your description of Minecraft as "infant-level" hilarious. However, while I've personally never played Minecraft, I see the appeal: being able to create your own environments, even at the scale of planets, must be fun.

As for the graphics, well, I'd imagine it would be difficult and perhaps game-breaking to let players build environments with high-quality graphics; as players made their environments larger, they would hit the point where their GPUs and RAM can't maintain performance far sooner than they do now.
 

I know that appeal; that's why I loved the LBP series and Dreams, for that matter. No need to regress generations to get creativity: massive worlds are made in Dreams with amazing detail.
 
You don't understand it.

It's the way you say it. There is a difference between "I don't like the game" and "It's the worst thing that could have happened to the current generation".

I accept your opinion. But at least stand by it when you make a mistake. Not everything can be solved with memes and smileys.
 

I'm not sure why you're upset. I find it the worst game that ever happened; does that change your perspective on it? No.

Try to take it a bit easy, mate. :messenger_fearful: Even most indie games, the 8-bit ones included, are way, way more enjoyable.

BTW, I find GTA5 to be a crap game as well, I hope that doesn't offend anyone.:messenger_expressionless:
 
Considering that a somewhat working (and GIGANTIC) calculator simulating transistors was built in Minecraft, it's anything but infant-level.
Minecraft is the perfect example of easy to play, difficult to master (aside from the fact that some Minecraft versions completely lacked tutorials and let you discover things on your own). I can set DMC to easy and win the game with one move, but that's only the start.
Dreams is kind of a bad example, because it is not a sandbox game; it's basically an engine. If you want to build something graphically complex as of right now, you need to sacrifice gameplay and logic, and the results in terms of scope would not match what Minecraft can achieve in sheer size, even though Dreams is built for current-gen hardware.
Same for LBP, which also has a fixed 2D camera and doesn't need to account for 3D movement at all, nor does it have collateral systems (3D terrain, weather, time, animals, enemies, structures, terraforming) while creating. It's just creating from a blank level.
As for the gameplay, you may not be able to bear the combat, but it was pretty much the standard for sandbox/survival games at the time, with the right balance of building integrated into an actual gameplay loop, and a bold example for the industry of how a game that tells you nothing about how it works can still be incredibly profitable.
 
I think launch-day versions of both of these consoles will be fine, but also a shoutout to Mystic. He runs one of, if not the, best PlayStation channels on YouTube. Good content, and he isn't a console warrior.



Man, as good as Beyond Good & Evil 2 looks, it is so ambitious that I doubt it'll ever get released. And if it does, I think it'll end up not delivering on the promise they're pitching. Obviously I hope I'm wrong, though.
Yeah, I wanted to shout out Mystic, but I didn't want to sound like I was plugging his YouTube channel; me being weird, I know.

But yeah, his channel is awesome, very PlayStation-centric content and none of the fanboy nonsense you get from MBG and Foxy and the Xbox idiots. Very refreshing. I'd recommend his video on the PS4, titled "How Sony became King of the consoles again", a really in-depth video on how the PS4 came about: it was born from the failures of the PS3 and tailored for developers, the latter being an important factor for the PS5 as well.
 
No, GTA V is crap and no one should be allowed to play it, that's for sure lol
 
PC games will never achieve:
  • 1-second boot time
  • multiple games suspended, with resume in less than a second
  • no loading screens
  • jumping straight to a section of a game or an online lobby in a second


That immediacy cannot be duplicated on a PC even if its SSD reaches 10GB/s, because of the inherent bottlenecks in the PC architecture.

A PC can, though, chase the PS5 when it comes to streaming high-resolution textures and detail by compensating with an excessive amount of RAM used as a large cache.
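To put rough numbers on that caching argument, the cost of the RAM-cache approach is just size divided by throughput. The figures below are illustrative assumptions, not spec-sheet numbers:

Code:
# Back-of-envelope: seconds to prefill a RAM cache at a given drive speed.
# Drive speeds are illustrative (2.4 = XSX raw, 5.5 = PS5 raw, 10.0 = fast NVMe).
def fill_time_seconds(cache_gb, throughput_gb_s):
    return cache_gb / throughput_gb_s

for cache_gb in (8, 16, 32):
    for drive_gb_s in (2.4, 5.5, 10.0):
        print(f"{cache_gb}GB cache @ {drive_gb_s}GB/s -> "
              f"{fill_time_seconds(cache_gb, drive_gb_s):.1f}s")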

I absolutely love how you left out Xbox Series X on this. Cheers darling

mCp2jRg.jpg
 
Mike Rayner from The Coalition said:
With the Xbox Series X, out of the gate, we reduced our load times by more than 4x without any code changes. With the new DirectStorage APIs and new hardware decompression, we can further improve I/O performance and reduce CPU overhead, both of which are essential to achieve fast loading. As we look to the future, the Xbox Series X's Sampler Feedback for Streaming (SFS) is a game-changer for how we think about world streaming and visual level of detail. We will be exploring how we can use it in future titles to both increase the texture detail in our game beyond what we can fit into memory, as well as reduce load times further by increasing on-demand loading to just before we need it, instead of pre-loading everything up-front as we would with a more traditional 'level loading' approach.

Kevin Floyer-Lea from Rebellion said:
Hardware-accelerated ray tracing is also a very welcome feature. Not only does it allow us to take players to environments that are that much more realistic and dynamic, but ray tracing hardware can do more than just rendering! For example, we can do incredible things with the acoustics to enhance realism. Modeling realistic sound occlusion for AI's hearing in real-time, for example, is an incredibly useful feature for creating stealth games.

The SSD storage speed then lets us take those beautiful realistic environments and make them load in a flash. And we're talking about bigger environments than the ones we could create on Xbox One.

Finding new things to stream is an important part of this generation, and animation streaming is a game-changer for motion capture. Now we can support detailed motion capture on a much wider scale, like non-player characters simply doing their thing in the background. Instead of all enemy NPCs moving in an identical way, for example, the SSD storage speed means we can offer many unique motion-captured animations – and given we own Audiomotion, Europe's leading motion capture studio, it's something we'll be very keen to do.

Alexandre Sabourin from Snowed In Studios said:
Raytracing hardware allows us to do some interesting things, but usually, production raytracers can take quite some time to converge to a reasonable image. The limit on samples per pixel means that there has been an interesting race to determine who can write an efficient denoiser. I'm interested to see what this new tech will lead to in terms of new research being done. Denoisers are one interesting avenue, but maybe it will allow developers to write their own light mappers, or pre-render cinematics more effectively.
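To make the Sampler Feedback Streaming idea above concrete, here's a minimal sketch of feedback-driven mip streaming. The structure and names are my own illustration of the concept, not the DirectStorage/SFS API:

Code:
# Hypothetical sketch of feedback-driven texture streaming (the SFS idea):
# render with whatever mips are resident, record which mips the sampler
# actually wanted, then load only the missing ones just before they're needed.
class StreamedTexture:
    def __init__(self, mip_count):
        self.resident = {mip_count - 1}   # start with only the smallest mip
        self.requested = set()            # "sampler feedback" written while rendering

    def sample(self, desired_mip):
        self.requested.add(desired_mip)   # record what rendering really asked for
        # shade with the resident mip closest to the one we wanted
        return min(self.resident, key=lambda m: abs(m - desired_mip))

    def stream_missing(self, load_mip):
        for mip in sorted(self.requested - self.resident):
            load_mip(mip)                 # e.g. read + hardware-decompress from SSD
            self.resident.add(mip)
        self.requested.clear()

Compared with pre-loading every mip of every texture up front, only the detail actually sampled on screen ever occupies memory, which is exactly the "more texture detail than fits into memory" claim.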
 
Finally, someone sees that GTA5 is crap (imo ofc) 😌😌
 
This thread name must be changed to "Cerny fetish"

PlayStation 4 Pro CAN do 8.4 TFLOPS with FP16 Precision according to Mark Cerny Honey. Lets create a thread about it!!! mmmm sooo gigi goooddeee!!!! Wanna hear all those theoretical possibilities oooowww!!!

BS0Oqxa.jpg
 
Had a quick read; didn't expect so much about the SSD.

As a non-developer with a casual interest in the tech inside consoles, I'd say the XSX's GPU is its best feature.
Man, just read; there have been many pages of discussion on this, and if you think it's that simple, well, it's not worth trying to explain it to you.
 
It will be combined with GI and shadows, because you are already tracing the paths of the lower-order lighting effects, so it would make little sense to do reflections without them. Reflections of objects missing shadows would look very incoherent, and using shadow mapping with RT reflections probably isn't possible, because shadow mapping is usually hand-tuned from knowing the circumstances in which the viewer can see the shadows. Reflections would be a new problem for that.

edit: 3D audio is slightly different, because the longer wavelengths allow the wave to pass through materials in an attenuated way. But Cerny pegged that as the cheap RT effect at less than 1 million rays/s, so it sounds like a freebie next to the hundreds of millions.
I think he was only talking about reflections, but it shouldn't be a problem to use things like audio at the same time, as that requires far fewer rays. Maybe you could even use GI, which looks like it's not as expensive as I thought, but in the end I'm just guessing.
Considering RT GI is more intensive work than shadows or reflections, I'm assuming Cerny meant a more traditional node-based GI combined with RT for artifact correction, just like the method Control is using. Regarding what he said about RT reflections, I'm assuming he meant that reflections would be the only RT effect in use, with nothing else running at the same time except maybe RT audio, because RT audio is nothing for an RT-capable GPU. Everything he said sounded like very standard fare for real-time RT; that's why he ran through it really quickly and moved on in like 60 seconds.
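For anyone wondering what "node-based GI combined with RT" looks like in practice, here's a minimal sketch of the probe-grid idea; it's my own illustration of the general technique, not Remedy's or Sony's actual implementation. A sparse grid of probes caches irradiance, a small ray budget refreshes the probes each frame, and surfaces just interpolate from the grid:

Code:
# Minimal probe-grid GI sketch: rays only refresh probes (cheap),
# while shading interpolates cached irradiance instead of tracing.
class ProbeGridGI:
    def __init__(self, spacing):
        self.spacing = spacing
        self.irradiance = {}                  # (i, j, k) -> RGB irradiance

    def update_probe(self, cell, trace_ray, rays_per_probe=16):
        # a handful of rays per probe per frame, amortized across the grid
        samples = [trace_ray(cell) for _ in range(rays_per_probe)]
        avg = tuple(sum(ch) / rays_per_probe for ch in zip(*samples))
        old = self.irradiance.get(cell, avg)
        # temporal blending hides the tiny ray budget (the "hybrid" part)
        self.irradiance[cell] = tuple(0.9 * o + 0.1 * a for o, a in zip(old, avg))

    def shade(self, position):
        # nearest-probe lookup; a real renderer blends 8 probes trilinearly
        cell = tuple(int(p // self.spacing) for p in position)
        return self.irradiance.get(cell, (0.0, 0.0, 0.0))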

Isn't that technically full ray-tracing then? Since it includes audio, GI, shadows and reflections, his wording kinda confused me when he said "how far can we go?" and then talked about running ray-traced reflections without mentioning full ray-tracing as if that's a completely different thing.
Full ray tracing requires everything to be done by ray tracing. Neither console can do that unless we are talking about a really old game (Quake 2 RTX) or an indie-looking game (Minecraft). Full path tracing is the future, but it will probably remain the future for a good decade or so.
 
I think chances are high that the PS5 retail design will resemble the devkit leak
ps5_devkits_teaser.jpg

For reference past devkit designs
sR2L3Sc.png

Notice how the only past devkit design (PS2) that isn't just a generic rectangular box ended up being pretty close to the retail version? Fingers crossed that the PS5 retail design resembles the devkit.
 
We're definitely going to see the V-shaped air intake; judging by the controller, I think the console itself will look very cool 👌
 
Although, looking at what you've just written, I have to say you are pretty wrong. Obviously the power difference between the PS5 and XSX isn't that big, and no one should care, but Lockhart isn't holding back the next generation. If it's purely there to output 1080p, there isn't really an issue.

What about Minecraft? It's 1080p on Series X; what will it be on Lockhart?
 
Yeah, makes sense. I only said that because of the image he presented:
dUCOnOc.jpg

I'm not sure whether they were referring to a process similar to the one NVIDIA explains here:
https://developer.nvidia.com/rtxgi
 
Unless you are a developer on both systems and breaking an NDA (which I doubt you are), you can't make that claim. Your claim may prove to be true, although even if Sony's first-party devs have to render at 720p30, I doubt the next gen will finish without at least some AAA built-for-RT games on PS5. GT7 will almost certainly be full RT at some resolution, going by the history of how Sony Japan has used a GT tech prototype, with the tech then sized for the next PlayStation.
 

I can see Gran Turismo doing some great ray tracing; it's just a matter of smart coding, not throwing every ray-tracing effect in there when some of them add nothing to the final image.

I'm still wondering how they managed 8K@120fps with ray tracing and HDR here:




Could we expect a PS5 Pro to reach near that level? And why were GT devs also talking about 4K@240fps on PS5:

 
Anyone can watch the podcast. If you're trying to play the smart guy, why didn't you mention the part where they said "Sony is going to release their games on PC, hence PC will hold back their games", etc.? Next time mention it so everyone knows.

Dude, it was a 3-hour podcast; I'm not rewatching that for a throwaway quote :pie_roffles:
 
You mentioned that those percentages shown on-screen aren't an indicator of utilization but "busy" numbers.
My question is: how do you quantify how busy a CPU/GPU is as a percentage?

Apologies for using them interchangeably. Your example of a 1:4 compression ratio would translate to 9.8GB/s output from the decompression block.
That 9.8GB/s rate surpasses what I believe to be the decompression block's max throughput (~6.5GB/s).

Sony did customize their SSD & I/O extensively, and they need a more powerful decompression block to handle the 5.5GB/s SSD input and decompress a bigger pool of data. Not sure if these things scale linearly?

That's why I asked earlier if the 4.8GB/s figure could be interpreted as BCPack only, but you made a good point against it.
However, considering textures are the majority of streamed data, couldn't they theoretically reach the 4.8GB/s figure using both zlib and BCPack, as in the quote below?

This checks out for the 4.8GB/s average; I'm just not convinced the decompression block can handle 7GB/s+, because they were specifically talking about the hardware decompression block's capabilities (not unlike Sony) and didn't say anything that could be interpreted as an average. 4.8GB/s wasn't even brought up. For this question I'd use Occam's razor.
I also don't think it's a coincidence that both gave a "typical" compression output and a max throughput for the decompression block. But this is just context for my current stance.

I know; it wasn't meant to correct you, just to add further clarification so others don't get confused.
Yes, obviously the input is the bus bandwidth, I was talking about the output.

I'm not a developer for either platform, but he did talk about GI as if it takes an order of magnitude fewer rays than RT shadows, while RT GI actually takes a lot more rays than shadows and is one of the heaviest RT effects. There are a lot of methods of assisting node-based GI with RT, like the one Metroiddarks pointed out (which now exists in Unity for all developers) or what Remedy is using in Control. Maybe Cerny should have called the GI in the slide "hybrid GI" to make it clearer.
 
I like the concept of Dreams, but... 99% of user-created games are hot garbage, demos, or memes. I'd rather play Minecraft than amateur hour.
 

To be honest, that's the nature of the game, which implies using blocks and thereby gives unlimited possibilities for longevity. Besides, if I remember right, is Minecraft a real 3D game, or some sort of game that gives you the illusion of being 3D?
 
Yes, I'm aware; that's why I said the theoretical 9.8GB/s figure surpasses the decompression block's output limit (~6.5GB/s). But that shouldn't prevent it from reaching the 4.8GB/s average, considering what MS said about most streamed data being textures?
 

This is exactly what he said:
"I'm thinking it'll take less than a million rays a second to have a big impact on audio. That should be enough for audio occlusion and some reverb calculations."
"With a bit more of the GPU invested in ray tracing it should be possible to do some very nice global illumination."
"Having said that, adding ray traced shadows and reflections to a traditional graphics engine could easily take hundreds of millions of rays a second."
"And full ray tracing could take billions."

But it seems like you are splitting hairs, and possibly trying to undercut what he said, as if the PS5 couldn't do full RT but the XSX could, and then someone tries to level things by saying, well, it isn't doing 100(?) rays per pixel like a movie, so it can't really ray trace. The light-transport choices in the context of RT might end up being a bit mix-and-match, but it will still be better overall IQ than we currently have, which may be more elaborate fake GI techniques coupled with inferior or missing lighting calculations for other effects. Exhaustive RT GI surely isn't within the first order of business for next-gen, is it?
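Cerny's buckets, quoted above, line up with simple napkin math: one ray per pixel at 4K60 is already about half a billion rays per second, and path-tracing-style sample counts multiply that into the billions. The rays-per-pixel counts below are my own illustrative assumptions:

Code:
# Sanity-checking Cerny's quoted ray budgets (illustrative numbers)
pixels_4k = 3840 * 2160        # ~8.3M pixels
fps = 60

print(pixels_4k * fps)         # ~498M rays/s at 1 ray/pixel/frame:
                               # "hundreds of millions" for shadows + reflections

print(pixels_4k * fps * 8)     # ~4B rays/s at ~8 rays/pixel (bounces/samples):
                               # "billions" for full ray tracing

# Audio occlusion/reverb needs only thousands of rays per update across
# a few dozen sources, easily under Cerny's "less than a million" a second.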
 
Last edited:

Let's pray PSVR2 comes equipped with eye tracking, enabling foveated rendering as the basis for VR game rendering: 120 frames per eye while still maintaining high-quality rendering.
PSVR2 is the thing I'm most excited for in the next generation. And I really hope games on PSVR2 will have at least the same graphical quality as the 30fps TV-screen ones. Do you think we can expect such visual quality from PSVR2 games?
 
Oh man, you are committing a crime right now, GTA is the GOAT!! But I understand where you're coming from, I have talked to a lot of people who do not like GTA at all, but man the next-generation GTA is gonna be something I don't think we are ready for.
 
Had a very quick read; didn't expect to see the SSD mentioned as much as it was.

Maybe the SSD hype was real all along...
People have been talking about the PS5 SSD since the first Wired article in April 2019, but now that a bunch of devs talk about the XSX SSD, suddenly everyone gives a damn about it being a game changer? Jesus!!
 
Sorry, I quoted you by mistake, it was a reply to Fafalada :/

Let me redo your quote:
Sony did customize their SSD & I/O extensively, and they need a more powerful decompression block to handle the 5.5GB/s SSD input and decompress a bigger pool of data. Not sure if these things scale linearly?
Yes, but the two blocks use different decompression methods. For instance, Oodle's Kraken decompresses more than 2x faster than Zlib on the same CPU. So the question is how fast Kraken decompresses vs. BCPack on the same CPU.

That's why I asked earlier if the 4.8GB/s figure could be interpreted as BCPack only, but you made a good point against it.
However, considering textures are the majority of streamed data, couldn't they theoretically reach the 4.8GB/s figure using both zlib and BCPack, as in the quote below?

This checks out for the 4.8GB/s average; I'm just not convinced the decompression block can handle 7GB/s+, because they were specifically talking about the hardware decompression block's capabilities (not unlike Sony) and didn't say anything that could be interpreted as an average. 4.8GB/s wasn't even brought up. For this question I'd use Occam's razor.
I also don't think it's a coincidence that both gave a "typical" compression output and a max throughput for the decompression block. But this is just context for my current stance.
Yes, 4.8GB/s is probably the weighted average (weighted because the Zlib data and the BCPack data aren't the same size) of the two. So BCPack is providing faster-than-4.8GB/s throughput coming out of its part of the block, while Zlib is providing slower-than-4.8GB/s throughput, and they average out at 4.8GB/s. It doesn't really matter, because the end result is ~4.8GB/s while the PS5's end result is ~8.5GB/s, so the gap is big no matter how you look at it. MS has only one way of narrowing the gap this late: replace the four 1GB GDDR6 memory chips with 2GB chips and hope the extra 4GB helps make up for the slower SSD speed.
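That weighted-average reading is easy to sanity-check. The 2.4GB/s raw figure is Microsoft's published number; the stream mix and per-codec ratios below are purely my assumptions for illustration:

Code:
# Napkin math: XSX "typical" output as a weighted average of two codecs.
RAW = 2.4   # GB/s of compressed data off the SSD (published figure)

def effective_output(texture_share, bcpack_ratio, zlib_ratio):
    # texture_share of the stream is BCPack textures, the rest is zlib data
    return RAW * (texture_share * bcpack_ratio +
                  (1 - texture_share) * zlib_ratio)

# e.g. 70% textures at 2.3:1 and 30% other data at 1.5:1 (assumed ratios)
print(effective_output(0.70, 2.3, 1.5))   # ≈ 4.9 GB/s, near the 4.8 figure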

I'm not sure what you mean by that. I'm not trying to disrespect the PS5 or anything; IMO it will have the exact same RT performance as the XSX minus like 15%, for the reason I've talked about in previous posts. Whatever the PS5 can do, the XSX can do and vice versa, plus or minus 15% performance, which will probably be mitigated by resolution. All I'm saying is that Cerny said a bit more than 1 million rays a second will allow for good-looking RT GI, which is true because RT-assisted GI is very good looking and requires very few rays (a few million rays a second are nothing for an RTX 2060, for instance). Cerny's slide and words are a bit misleading, because with that ray budget you can't RT GI Nathan Drake's forehead.
 
Hey BGs, how are ya, man?

I have a question, if you can answer it: since you're working on a VR game for next-generation consoles, is your game gonna have some form of RT while in VR?

Hmm... I think it's a very interesting question, but if you can't respond, then leave my comment and give a TRIGGERED reaction to it.
 
You still sort of are (IMHO) if you read the bolded parts below, because you are saying what he has said is false, despite him making a proper argument to show he's correct.

"In general, I like running the GPU at a higher frequency."
"Let me show you why."
"Here's two possible configurations for a GPU roughly of the level of the Playstation 4 Pro".
36 CU @ 1GHz VS 48 CU @ 0.75GHz
"This is a thought experiment. Don't take these configurations too seriously."
"If you just calculate Teraflops you get the same number, but actually the performance is noticeably different because teraflops is defined as the computational capability of the vector ALU."
"That's just one part of the GPU, there are a lot of other units and those other units all run faster when the GPU frequency is higher. At 33% higher frequency rasterization goes 33% faster."
"Processing the command buffer goes that much faster."
"The L2 and other caches have that much higher bandwidth, and so on."


And according to that Nvidia RTXGI video, the technique solves the Nathan Drake forehead issue (AFAIK), because it handles the situation of the artist needing to manually place probes and still getting it wrong because of moving models. If that isn't the case, then I'm sure it won't be too distracting amongst far better-looking visuals than this gen. The RTXGI technique also quoted a figure of about 2.4M rays/s at full stress for the GI (at 1080p, using 16,384 probes and 144 rays per probe).
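Both sets of numbers in this exchange check out on a napkin. Cerny's two configurations give identical FLOPS (assuming standard AMD-style CUs at 64 lanes and 2 ops per clock), and the RTXGI figure is just probes times rays:

Code:
# Cerny's thought experiment: identical TFLOPS at different clocks.
def tflops(cus, ghz):
    # 64 ALU lanes per CU, 2 ops/clock (fused multiply-add)
    return cus * 64 * 2 * ghz / 1000

print(tflops(36, 1.00))    # 4.608 TF
print(tflops(48, 0.75))    # 4.608 TF -> same FLOPS, but rasterization,
                           # command processing and cache bandwidth all
                           # scale with the 33% higher clock, not CU count

# RTXGI ray count quoted above: probes * rays per probe per update
print(16384 * 144)         # 2,359,296 -> the "about 2.4M rays" figure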
 
Yup, everything you've said perfectly supports my post regarding RT. Regarding Cerny's quote, it's true if the machines are 100% the same and have the same TF number. The XSX and PS5 aren't the same machines: they don't have the same TF performance, they don't have the same memory bandwidth, and they don't have the same amount of cache. Cerny was trying to convey to the layman that a higher-frequency machine can punch above its weight, which is true, but it still depends on what machine it is facing.
 
Yup, Mystic is a cool cat (his cat is cool too).
 
You could probably have said that more simply: they don't have the same bottlenecks. When we consider the pitch Cerny made for the PS5, every aspect of the design was about removing bottlenecks; compared to the mess of the XSX memory configuration, in the context of it supposedly being an HSA design, surely the marginal TF difference is, in your opinion, more than made up for by the higher clock and the absence of bottlenecks, no?

edit: Can you elaborate on "and they don't have the same amount of cache"? I assume you are referring to the GPU L2 cache sizes, rather than 52/2 * unit L2 module size vs 36/2 * unit L2 module size?
 
So far it's 11 seconds, only 8 seconds faster than SATA 3 on a potato PC:



AMD FX-6300 6-core CPU, 16GB of DDR3 RAM, GeForce GTX 680 2GB video card, with 3 hard drives and 2 250GB SATA 3 SSDs. Soon I will be adding an M.2 PCIe drive through a PCIe x4 slot adapter.


Uh oh. That doesn't look good. It looks like it still has a lot of the bottlenecks present in the PC architecture.

If it'll take a second for PS5 to boot a game as per Cerny, how long will it take for xsex? My guess is 10 seconds. Still faster than PC though.

 