
Next-Gen PS5 & XSX |OT| Console tEch threaD


assurdum

Banned
My bet is this is no longer true after driver update fixes.
My bet, rather, is that you, like many others, haven't yet understood the difference between coding in a virtualized environment and low-level API access. Sony and MS have opposite philosophies in how their libraries are used.
 
Last edited:

onesvenus

Member
The best I can offer is a reminder of the widely read article at the start of the PS4/XB1 gen - which I'm sure most people remember, hopefully including yourself - when Ubisoft said they developed both versions of AC to be equal in frame rate/resolution/fidelity despite the major hardware differences between the two platforms; essentially confirming a parity clause was in play, without actually saying it was contractually obligated.
I can't remember that, but thank you.
 

Krisprolls

Banned
I don't know if such a parity clause exists, though. If a game can load in 2 seconds on the PS5 and in 10 seconds on the XSX (just an example), I don't see why it's a big deal for developers to do that. They already make decisions that cause visual differences between the two (Hitman 3). I don't see why they can't do the same where the I/O is concerned.

But you're right that 1st party will be the ones to truly show it off. Ratchet should be an interesting one to look at, since it's supposed to show off what the I/O can do.

You're wrong, it's a big deal.

You have to build your whole asset loading and streaming around it to benefit from the PS5's superior I/O system. You have to load huge chunks of data at once. It's not a given at all, and that's why nobody does it in older titles.

You'll see extremely fast loading in 1st party titles, like Spider-Man, which takes about 2 seconds from the menu.
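A toy sketch of what "loading huge chunks at once" means in practice (the file name and sizes here are invented, not any engine's real code): streaming built around one large sequential read, the access pattern fast NVMe I/O rewards, instead of thousands of tiny scattered reads.

```cpp
#include <cstdio>
#include <vector>

int main() {
    const char* path = "level_chunk.bin";  // hypothetical packed asset blob
    std::FILE* f = std::fopen(path, "rb");
    if (!f) { std::perror("fopen"); return 1; }

    std::vector<unsigned char> blob(256u << 20);  // one 256 MB chunk at once
    const size_t got = std::fread(blob.data(), 1, blob.size(), f);
    std::fclose(f);
    std::printf("streamed %zu bytes in a single request\n", got);
    // Reading 256 MB as 4 KB requests would mean ~65,000 round trips;
    // per-request overhead, not raw drive speed, becomes the bottleneck.
    return 0;
}
```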
 

martino

Member
My bet, rather, is that you, like many others, haven't yet understood the difference between coding in a virtualized environment and low-level API access. Sony and MS have opposite philosophies in how their libraries are used.
Why bring MS and Sony into it? Breathe, not everything is meant to hurt your favorite brand...
He deduced there is no architectural change from a driver point of view; maybe things on the mesh shader front were not final.
 

assurdum

Banned
Why bring MS and Sony into it? Breathe, not everything is meant to hurt your favorite brand...
He deduced there is no architectural change from a driver point of view; maybe things on the mesh shader front were not final.
People like you are always hilarious. "PS5 doesn't have mesh shaders, no no no no, it's not the same thing as RDNA 2 mesh shaders, no no no, look at the benchmark." "You see, PS5 has a GE with its own version of VRS and mesh shaders, the opposite philosophy to MS, that's the reason." "Why, you Sony fanboy, do you want to talk just about Sony vs MS?" o_O Maybe because everyone keeps looking at MS vs Sony every time we talk about mesh shaders?
 
Last edited:

martino

Member
People like you are always hilarious. "PS5 doesn't have mesh shaders, no no no no, it's not the same thing as RDNA 2 mesh shaders, no no no, look at the benchmark." "You see, PS5 has a GE with its own version of VRS and mesh shaders, the opposite philosophy to MS." "Why, you Sony fanboy, do you talk just about Sony vs MS?"
So a bad attempt at an argument from authority didn't work... now it's plain ad hominem...
It's a pattern with you when it comes to discussing technical stuff.
What I said has nothing to do with consoles and is about AMD's use of mesh shaders on PC. Sure, it's a bit off topic, but I didn't start it.
Can I still react and speculate on it without seeing a "pretend to know it all" fanboy throw a tantrum?
 
Last edited:

Lysandros

Member
That is not what he is saying. He is saying that the primitive shader function is the same for both shaders. What the mesh hardware piece is doing is determining which pieces of the geometry that primitive shader should be applied to, and how much (i.e. culling and variable rate shading, respectively). RDNA1 does not have that sorting function.

If you look at a simplistic (please observe) outline of the hardware flow comparison for the shader function between RDNA1, RDNA2 and the Sony solution it looks roughly as follows:

RDNA1: Geometry generated by the GE -> 100% of geometry data pushed to the shader arrays/CUs -> primitive shader hardware unit works on close to 100% of geometry data -> output
RDNA2: Geometry generated by the GE -> 100% of geometry data pushed to the shader arrays/CUs -> Mesh/VRS hardware block works through geometry to cull/prioritize -> primitive shader hardware unit works on just 10-20% of geometry data -> output
PS5: Geometry generated by the GE -> culling and prioritization of geometry happens at GE level -> 10-20% of geometry data pushed to the shader arrays/CUs -> primitive shader hardware unit works on just 10-20% of geometry data -> output

On paper the PS5 solution is better BUT comes with a big drawback - it requires reengineering of graphical engines to be used. The RDNA2 (and Nvidia solution that is similar) will be able to be used with current graphical engines without significant recoding.

Why is the PS5 solution better on paper (please observe again - paper is one thing and reality is another - we do not know the real world numbers yet)? Two main things: Doing culling and prioritization on the GE level makes it easier for the programmer to control exactly what is happening with APIs. Secondly, since less data is pushed to the arrays in the first place your CUs are used much more efficiently throughout (this is what Matt H meant with his tweet that is above in this chain) including a significant increase in practical shader array and CU cache sizes (less data -> practical increase in cache size and efficiency).
Thanks for explaining the differences in the process/implementations this clearly, very informative post.
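As a rough illustration of those three flows, here's a toy CPU-side sketch; every name in it is hypothetical, and it is not real driver or console code:

```cpp
#include <vector>

struct Triangle { int v0, v1, v2; };            // stand-in for real geometry

bool culled(const Triangle&) { return false; }  // placeholder visibility test
void runPrimitiveShader(const Triangle&) {}     // placeholder shader stage

// RDNA1: all geometry reaches the shader arrays; the primitive shader
// stage touches ~100% of it.
void drawRdna1(const std::vector<Triangle>& scene) {
    for (const Triangle& t : scene)
        runPrimitiveShader(t);
}

// RDNA2: geometry still reaches the shader arrays, but a culling block
// ahead of the primitive shader discards most of it there.
void drawRdna2(const std::vector<Triangle>& scene) {
    for (const Triangle& t : scene)
        if (!culled(t))                 // mesh/VRS block culls first
            runPrimitiveShader(t);      // only ~10-20% of the data survives
}

// PS5 (as described above): culling happens at the Geometry Engine, so
// only the surviving ~10-20% is ever pushed to the shader arrays/CUs.
void drawPs5(const std::vector<Triangle>& scene) {
    std::vector<Triangle> survivors;    // GE-level cull before dispatch
    for (const Triangle& t : scene)
        if (!culled(t))
            survivors.push_back(t);
    for (const Triangle& t : survivors)
        runPrimitiveShader(t);
}
```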
 

assurdum

Banned
So a bad attempt at an argument from authority didn't work... now it's plain ad hominem...
It's a pattern with you when it comes to discussing technical stuff.
What I said has nothing to do with consoles and is about AMD's use of mesh shaders on PC. Sure, it's a bit off topic, but I didn't start it.
Can I still react and speculate on it without seeing a "pretend to know it all" fanboy throw a tantrum?
My bet is this is no longer true after driver update fixes.
He was corrected by LeviathanGamer2 on the mesh shader stuff.
So what is no longer the case? That post talks about the PS5's mesh shaders, but you said PS5 vs Series X isn't part of the equation in your conversation. What exactly was the point of your post then?
 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
You're all looking for mesh shaders on PS5...
You were all looking at them in the UE5 demo, because there were meshes (at full res) that were shaded.
Epic said the hardware feature they used was primitive shaders.

The rest was done by the engine in software.


We can't beat hardware rasterisers in all cases though so we'll use hardware when we've determined it's the faster path. On PlayStation 5 we use primitive shaders for that path which is considerably faster than using the old pipeline we had before with vertex shaders

But again, everyone seems to be missing the point: believing that the PS5 doesn't have AMD's SPECIFIC mesh shader hardware doesn't mean it lacks its own custom hardware that does something very similar.
 
Last edited:

assurdum

Banned
Possibly; they're deductions from an early driver.
Possibly what? I explained the reasoning behind Sony's choice, but you accuse me of being a fanboy because I compared it to the Series X. What exactly does fanboyism have to do with my example?
 
Last edited:

martino

Member
I think you need to accept that the actual shader hardware block is the same - it is where and how that is applied that differs (and that makes a huge difference ofc).
I'm fine being wrong.
I just hope someone will look at it to be sure; it's better than not.
 

Great Hair

Banned
Turns out WiLD is still a thing.

Hiring ongoing.


"Jim, that game is dead!"
 

LiquidRex

Member
If you're gullible enough to misinterpret "bandwidth multipliers" marketing speak, the RAM magically grows from 10 to 35GB.

The idea is that portions of textures are streamed selectively, and that saves memory bandwidth rather than creating it.

It does not, but it only loads the parts of a texture which are visible. It's more like it loads half the data it normally would, because a texture is traditionally loaded for the whole surface even if part of it faces the other side...
Yep... just a tweet I saw this morning claiming that on XSS, SFS results in an effective 20GB, eclipsing the PS5's RAM.
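To put rough numbers on why a "multiplier" is not extra RAM, a back-of-envelope sketch with invented figures (not measured data from any game or console):

```cpp
#include <cstdio>

int main() {
    const double fullMipChainMB = 64.0;                  // hypothetical texture + mips
    const double residentMB     = 0.4 * fullMipChainMB;  // assume only 40% of tiles
                                                         // are ever actually sampled
    std::printf("full chain: %.0f MB, resident: %.0f MB, multiplier: %.1fx\n",
                fullMipChainMB, residentMB, fullMipChainMB / residentMB);
    // A "2.5x multiplier" means you avoided loading 60% of the texture,
    // which frees RAM and bandwidth for other data. The physical RAM in
    // the console stays exactly the same size.
    return 0;
}
```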
 

Garani

Member
I am late to the party, so please bear with me.
What I find interesting is the broader picture of what is happening with graphics and its implications for hardware.

Currently - and over the last decade+ - games have run as single-threaded executables with all graphical calculations off-loaded to the GPU. In other words, once the graphical assets have been loaded into VRAM, a game performs according to how your single-core/single-thread CPU and your GPU perform, as two separate islands. The rest of your computer can frankly be fairly crap and does very little to help you with gaming.

What we are seeing now is that more and more tasks in the graphical engines are performed by the CPU and not only the GPU. UE5 especially seems to utilise the CPU almost as much as the GPU to do its work. I think we underestimate the implications of this.

The loop CPU -> CPU cache -> RAM -> VRAM -> GPU cache -> GPU -> RAM -> CPU cache -> CPU etc. becomes really important. Synchronization between your components, in terms of frequencies and scheduling, becomes really important to avoid bottlenecks that result in stutters etc. In other words: I/O. It will be very interesting to see how this plays out with the evolution of graphical engines and how we build our computers. Another way to state this is that Cerny seems to be really on top of what will matter across this generation - I continue to find real golden nuggets in his talk 'The Road to PS5' - so cool to listen to someone who has such inside knowledge about what is about to happen and has tried to design a system for the task.

I am still amazed by the fact that Cerny explained all this almost a year ago and we are still going around the issue. It's clear that Sony had one strategy and MS another. The end result is that, at the moment, they are pretty much on par, with a bit of an advantage for Sony.

😁Control photomode says ‘No’

Please, you can't be serious. Photo mode has nothing to do with gameplay. Trying to get a "win" thanks to a still image that has no need to deal with the rest of the game's cycles is quite petty.

Whenever we see the XSX fall behind the PS5, it's fair to question why, because it's marketed as the superior system. Control's photo mode proves that it isn't because of the GPU. The only things left are the I/O, RAM, APIs and CPU. On those I've read many theories on why the XSX's performance can fall behind the PS5's at certain moments.

I respectfully disagree. At the moment, during gameplay, Sony has better GPU output. The real big difference between the systems is the I/O department. It's not the SSD in itself that makes things different, but the fact that the whole process is totally offloaded. And Cerny told us all about it in detail.
 

IntentionalPun

Ask me about my wife's perfect butthole
If you're gullible enough to misinterpret "bandwidth multipliers" marketing speak, the RAM magically grows from 10 to 35GB.

The idea is that portions of textures are streamed selectively, and that saves memory bandwidth rather than creating it.
Gotta be only for static objects in the distance right?

Doesn't seem like it would be efficient to do this for anything the player could easily move to the other side of.

I wish we'd see real data on this stuff; like show a video of a game w/ a counter on screen of the amount of texture data not loaded.
 

BigLee74

Member
Please, you can't be serious. Photo mode has nothing to do with gameplay. Trying to get a "win" thanks to a still image that has no need to deal with the rest of the game's cycles is quite petty.

Context, please.

My comment was with regard to a poster claiming the GPU was performing better on the PS5 as if it were a matter of fact. I have merely pointed out that a pure GPU benchmark (admittedly for this one game) proved that not to be the case. I'll bend over backwards a bit (given the thread I am on, where I can predict exactly who will respond with laughter or triggered emojis) and say that at best everything is very inconclusive at the moment.

I never once said anything about gameplay, or consoles as an overall package, so don’t go projecting!
 

IntentionalPun

Ask me about my wife's perfect butthole
Context, please.

My comment was with regard to a poster claiming the GPU was performing better on the PS5 as if it were a matter of fact. I have merely pointed out that a pure GPU benchmark (admittedly for this one game) proved that not to be the case. I'll bend over backwards a bit (given the thread I am on, where I can predict exactly who will respond with laughter or triggered emojis) and say that at best everything is very inconclusive at the moment.

I never once said anything about gameplay, or consoles as an overall package, so don’t go projecting!
The question becomes: is the overall performance lower because of a bottleneck that is unsolvable, or is it a software issue that can be solved?

If I was the Xbox team I'd be profiling the shit out of Control to figure out what is happening.
 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
I'll believe it when I see it with Xbox; and maybe I'll play some 3rd parties there instead of on my PS5.

But it's likely we'll just be seeing locked-60 games with very similar settings anyway. PS5 has my early-gen momentum; my XSX sits in my bedroom and I've only used it to watch midget porn.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
I think it will be really interesting if the XSX ends up with a massive advantage not because of its extra TFLOPs but because of its mesh shader implementation. But would people really care 3-4 years down the line? UE5 hasn't even shipped yet, and games being developed right now probably aren't using mesh shaders, so it's not like BF6, the next COD, the next AC and the next GTA will be using them. I hope I'm wrong, because I would love to see this stuff sooner rather than later.

Not sure if it will convince me to pick up an XSX, since my RTX 2080 will likely offer better performance, but I think it will give a big boost to the XSX even if the difference is 2x the framerate or 2x the resolution. But if it comes after, say, 2023, then Sony can just have their PS5 Pro with a mesh shader implementation.
 

IntentionalPun

Ask me about my wife's perfect butthole
I think it will be really interesting if the XSX ends up with a massive advantage not because of its extra TFLOPs but because of its mesh shader implementation. But would people really care 3-4 years down the line? UE5 hasn't even shipped yet, and games being developed right now probably aren't using mesh shaders, so it's not like BF6, the next COD, the next AC and the next GTA will be using them. I hope I'm wrong, because I would love to see this stuff sooner rather than later.

Not sure if it will convince me to pick up an XSX, since my RTX 2080 will likely offer better performance, but I think it will give a big boost to the XSX even if the difference is 2x the framerate or 2x the resolution. But if it comes after, say, 2023, then Sony can just have their PS5 Pro with a mesh shader implementation.
Mesh shaders have been in the GDK since last June though. (stop reading anything Riky says lol)

Not sure if any shipped games use them yet; but I don't know why some upcoming DX12 games wouldn't use them.

But PS5 has customizations that might outdo AMD's mesh shader implementation.. so it's all potentially a moot point anyways.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Mesh shaders have been in the GDK since last June though.

Not sure if any shipped games use them yet; but I don't know why some upcoming DX12 games wouldn't use them.
BF6, the next COD and the next AC began development at least a year before last June. I am assuming that entire engines have to be retooled to take advantage of these new mesh shaders, since it's a completely new way of rendering objects.
 

IntentionalPun

Ask me about my wife's perfect butthole
BF6, the next COD and the next AC began development at least a year before last June. I am assuming that entire engines have to be retooled to take advantage of these new mesh shaders, since it's a completely new way of rendering objects.
It's a pretty simple API, and IIRC versions of it have existed since 2019.

You add a call to check for support; then run your regular pipeline if the hardware doesn't support it.

I'm sure engine developers have been working on those alternate pipelines since 2019. Will they make it into 2021 games? I don't know, but I would guess at least someone will.

The June 2020 release was just the non-beta version specifically for the "GDK"; DX12 itself has had this in preview for a while (as has the GDK). It's all the same calls, all the same open-source code.
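For reference, that support check really is only a few lines against the public D3D12 API; a minimal sketch (the surrounding engine plumbing is assumed):

```cpp
#include <windows.h>
#include <d3d12.h>

// Query D3D12_FEATURE_D3D12_OPTIONS7 and fall back to the classic
// vertex pipeline when mesh shaders are unsupported.
bool SupportsMeshShaders(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &options7, sizeof(options7))))
        return false;  // runtime/driver predates the feature entirely
    return options7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
}

// An engine would pick a pipeline once at startup: a mesh-shader PSO drawn
// via ID3D12GraphicsCommandList6::DispatchMesh() when supported, or the
// usual vertex/index PSO drawn via DrawIndexedInstanced() when not.
```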
 
Last edited:

LiquidRex

Member
Mesh shaders have been in the GDK since last June though. (stop reading anything Riky says lol)

Not sure if any shipped games use them yet; but I don't know why some upcoming DX12 games wouldn't use them.

But PS5 has customizations that might outdo AMD's mesh shader implementation.. so it's all potentially a moot point anyways.
Can we not use UE5 as showcased on the PS5 as a prototype benchmark of what is possible with the console's custom primitive shader implementation?
 
I think it will be really interesting if the XSX ends up with a massive advantage not because of its extra TFLOPs but because of its mesh shader implementation. But would people really care 3-4 years down the line? UE5 hasn't even shipped yet, and games being developed right now probably aren't using mesh shaders, so it's not like BF6, the next COD, the next AC and the next GTA will be using them. I hope I'm wrong, because I would love to see this stuff sooner rather than later.

Not sure if it will convince me to pick up an XSX, since my RTX 2080 will likely offer better performance, but I think it will give a big boost to the XSX even if the difference is 2x the framerate or 2x the resolution. But if it comes after, say, 2023, then Sony can just have their PS5 Pro with a mesh shader implementation.

I don't know, but it reminds me of something Mr. X would say. I've honestly never seen something like that happen, but who knows at this point?

However, I'm pretty sure Sony has knowledge of mesh shaders, and if they did provide such a massive performance boost, Sony wouldn't leave something like that out.

It's very possible they have something similar because they did mention they have new culling capabilities in their GE.
 

IntentionalPun

Ask me about my wife's perfect butthole
Can we not use UE5 as showcased on the PS5 as a prototype benchmark of what is possible with the console's custom primitive shader implementation?
Epic said everything they do is in software other than using PS5's primitive shaders; I don't believe those are customized... Sony hasn't said... but they have other customizations believed to be related to mesh shading, and Epic did not claim to be using any of that.

But it's all vague at this point anyways.
 

Riky

$MSFT
Mesh shaders have been in the GDK since last June though. (stop reading anything Riky says lol)

Not sure if any shipped games use them yet; but I don't know why some upcoming DX12 games wouldn't use them.

But PS5 has customizations that might outdo AMD's mesh shader implementation.. so it's all potentially a moot point anyways.

I think the games released so far might have been started a bit before last June😂

I told people to hold on with the victory parade, we'll see what comes next.
 

PaintTinJr

Member
Epic said everything they do is in software other than using PS5's primitive shaders; I don't believe those are customized... Sony hasn't said... but they have other customizations believed to be related to mesh shading, and Epic did not claim to be using any of that.

But it's all vague at this point anyways.
The Nvidia showcase for mesh shaders, IIRC, was an asteroid field, and AFAIK the bulk of the mesh-shading efficiency is for the mid-to-background geometry, not the foreground. UE5 shows detail beyond polygons (4 per pixel on average IIRC), making aliasing a non-issue for the scene - as opposed to the foreground moving assets - which apparently stressed the system no more than Fortnite's 1080p60 workload does on PS4.

So my question would be: how would mesh shaders make any real difference in that typical UE5 setup of such a background plus a 10TF conventionally rendered foreground - assuming they weren't already on show under another name for the moving geometry assets?
 

IntentionalPun

Ask me about my wife's perfect butthole
The Nvidia showcase for mesh shaders, IIRC, was an asteroid field, and AFAIK the bulk of the mesh-shading efficiency is for the mid-to-background geometry, not the foreground. UE5 shows detail beyond polygons (4 per pixel on average IIRC), making aliasing a non-issue for the scene - as opposed to the foreground moving assets - which apparently stressed the system no more than Fortnite's 1080p60 workload does on PS4.

So my question would be: how would mesh shaders make any real difference in that typical UE5 setup of such a background plus a 10TF conventionally rendered foreground - assuming they weren't already on show under another name for the moving geometry assets?
I don't know; Epic only talked about running UE5 on PS5 and said the only hardware-specific rendering feature they used was primitive shaders.

Would mesh shaders improve UE5's performance? You'd have to ask Epic. I don't know what aliasing has to do with this, though.
 

SlimySnake

Flashless at the Golden Globes
It's a pretty simple API, and IIRC versions of it have existed since 2019.

You add a call to check for support; then run your regular pipeline if the hardware doesn't support it.

I'm sure engine developers have been working on those alternate pipelines since 2019. Will they make it into 2021 games? I don't know, but I would guess at least someone will.

The June 2020 release was just the non-beta version specifically for the "GDK"; DX12 itself has had this in preview for a while (as has the GDK). It's all the same calls, all the same open-source code.
So we could potentially have older games support this? I wonder if CD Projekt can use mesh shaders to improve performance on PC and next-gen consoles. Even if the gains aren't 500%, just a 100% gain means doubling the framerate.
 

PaintTinJr

Member
I don't know; Epic only talked about running UE5 on PS5 and said the only hardware-specific rendering feature they used was primitive shaders.

Would mesh shaders improve UE5's performance? You'd have to ask Epic. I don't know what aliasing has to do with this, though.
The thing is, primitive shaders will be mentioned interchangeably with mesh shaders, because Sony's named GE feature will still be under NDA, and for anyone listening, the interchangeable use of the terms makes little difference in the context of the UE5 talk.

The reason I mentioned aliasing is that there is a direct correlation between geometry size and aliasing: if triangles are sub-pixel in Nanite, then mesh shading can't make any positive difference to image quality; if anything, mesh shaders probably still output geometry that is 1-3 pixels in size at its smallest, which can produce aliased edges.
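For a sense of scale, a quick calculation with assumed numbers (not Epic's actual triangle counts):

```cpp
#include <cstdio>

int main() {
    const double pixels    = 2560.0 * 1440.0;  // ~3.7M pixels at 1440p
    const double triangles = 15e6;             // assumed on-screen triangle count
    std::printf("average triangle covers %.2f px\n", pixels / triangles);
    // ~0.25 px per triangle: edges resolve below pixel size, so hard
    // polygon edges vanish; triangles of 1-3 px are where stair-step
    // edge aliasing shows up again.
    return 0;
}
```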
 
  • Like
Reactions: Rea

LiquidRex

Member
Found Cerny talking about Primitive Shaders. It's basically exactly what mesh shaders are supposed to do with LODs.

Timestamped: [embedded video]

This sounds exactly like what Nvidia demo'd here in their mesh shaders demo. [embedded video]
They are the same, then. But isn't the GE a separate block of hardware? It can run automatically, but to really take advantage of what it can do, you need to invest time coding for it.
 

SlimySnake

Flashless at the Golden Globes

Trump's fault. 😜
Yesterday Biden signed an executive order aiming to fix the shortage by clearing up some supply bottlenecks. Whatever that means.

Hopefully it was due to Trump's trade war and we can get this shit sorted out in the next few months. The last thing I want is publishers freaking out and making more games cross-gen. I really don't want to play cross-gen games in 2024.
 

Garani

Member
Context, please.

My comment was with regard to a poster claiming the GPU was performing better on the PS5 as if it were a matter of fact. I have merely pointed out that a pure GPU benchmark (admittedly for this one game) proved that not to be the case. I'll bend over backwards a bit (given the thread I am on, where I can predict exactly who will respond with laughter or triggered emojis) and say that at best everything is very inconclusive at the moment.

I never once said anything about gameplay, or consoles as an overall package, so don’t go projecting!

What is photomode supposed to test? I could decide to turn everything up to 1000x and get 1fps. Great image quality for a picture, but that's it. Using photomode is just useless in the context of GPU and CPU benchmarking.

In-game benchmarking has shown the PS5's GPU consistently on par with the XSX's, if not superior in certain aspects. You see, you play games; pictures are there just to share a nice moment, and photo-mode fps are not indicative of GPU performance.


He still hasn't learned how to properly open the PS5 panels. Now this is starting to become annoying.
 
That is not what he is saying. He is saying that the primitive shader function is the same for both shaders. What the mesh hardware piece is doing is determining which pieces of the geometry that primitive shader should be applied to, and how much (i.e. culling and variable rate shading, respectively). RDNA1 does not have that sorting function.

If you look at a simplistic (please observe) outline of the hardware flow comparison for the shader function between RDNA1, RDNA2 and the Sony solution it looks roughly as follows:

RDNA1: Geometry generated by the GE -> 100% of geometry data pushed to the shader arrays/CUs -> primitive shader hardware unit works on close to 100% of geometry data -> output
RDNA2: Geometry generated by the GE -> 100% of geometry data pushed to the shader arrays/CUs -> Mesh/VRS hardware block works through geometry to cull/prioritize -> primitive shader hardware unit works on just 10-20% of geometry data -> output
PS5: Geometry generated by the GE -> culling and prioritization of geometry happens at GE level -> 10-20% of geometry data pushed to the shader arrays/CUs -> primitive shader hardware unit works on just 10-20% of geometry data -> output

On paper the PS5 solution is better BUT comes with a big drawback - it requires reengineering of graphical engines to be used. The RDNA2 (and Nvidia solution that is similar) will be able to be used with current graphical engines without significant recoding.

Why is the PS5 solution better on paper (please observe again - paper is one thing and reality is another - we do not know the real world numbers yet)? Two main things: Doing culling and prioritization on the GE level makes it easier for the programmer to control exactly what is happening with APIs. Secondly, since less data is pushed to the arrays in the first place your CUs are used much more efficiently throughout (this is what Matt H meant with his tweet that is above in this chain) including a significant increase in practical shader array and CU cache sizes (less data -> practical increase in cache size and efficiency).

Do you think there is some dedicated separate block for handling geometry?

Maybe that is not what is happening with Primitive/Mesh Shaders.

As LeviathanGamer2 saw in the driver source code, mesh shaders are a software upgrade. The Geometry Engine would be the same thing. AMD would have to implement mesh shaders for RDNA 1 themselves, but that would be independent of DirectX 12.
 

LiquidRex

Member
Yesterday Biden signed an executive order aiming to fix the shortage by clearing up some supply bottlenecks. Whatever that means.

Hopefully it was due to Trump's trade war and we can get this shit sorted out in the next few months. The last thing I want is publishers freaking out and making more games cross-gen. I really don't want to play cross-gen games in 2024.
More cross-gen titles have been my concern for Gen 9 due to the chip shortages... I do wonder if this will put things behind 12 to 18 months with regards to game development for Gen 9 consoles.
 