Shmunter
Member
Sort of like PS5 has no raytracing etc?

Wrong answer, try again. We've known the PS5 specs for six months now and what it's capable of. RGT hasn't been right about anything.
No. He's been saying this publicly as well, I think.

It's a private conversation. He's got a reasonable expectation that it will remain that way.
Ok, so what makes Sony's geometry engine better than Microsoft's? What customisations did they make and what improvements do they give? What customisations have Microsoft made, if any? What benefits do they bring?
Customised doesn't mean better (or worse), it just means different. This whole "AMD will use Sony tech in RDNA3" thing is all just fanboy garbage at this point.
Wow, did Mark Cerny tell you all that?
Why not ask your friend Mark to give a full Q&A with DF, so he can show off all his next-gen secret sauces? I figure it would make good official PR. Why has he been hesitating so long?
I would be as receptive as I was in 2013 when it was Microsoft shills spinning their secret sauce bullshit.
Funny how no one has an answer for the smallest space a PS5 could physically be placed in, because answering would be admitting they made a console 50% larger than their competition.
Bullshit artists being bullshit artists.
That's very interesting. Thanks for sharing. The part where you said the PS5's GE is built to handle geometry with small, pixel-sized vertices reminds me of the Nanite tech we saw in the UE5 demo running on PS5, which is essentially a pixel-sized polygon renderer. It also reminds me of Mark Cerny's 'Road to PS5' talk.

What Sony has claimed publicly are three things (but there might be more, of course):
1) The PS5 GE performs culling of geometry that is not visible (e.g. obscured behind another object on screen). In current GPUs this is done downstream of the GE by the mesh shaders. The advantage of front-loading it is that no other part of the rendering pipeline needs to perform a single calculation on that geometry, since it has already been taken out.
2) The PS5 GE assigns priorities to the remaining geometry depending on how important it is in the final rasterised frame. In current GPUs this is done downstream of the GE through VRS. The advantage of front-loading it is that downstream parts of the pipeline do not need to do any geometry calculations - the priorities are already there and can be applied off the bat. This also means that all functions can access those priorities, such as RT and texture handling.
3) The PS5 GE is built to handle geometry with very small, almost pixel-sized, primitives. In a 'normal' rendering pipeline your primitives cannot be too small, or the functions that access the geometry, such as mesh shaders and VRS, become dramatically slower. Sony has tried to address this (partly through 1) above).
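Since I like to make these things concrete: the front-loading idea in 1) and 2) can be sketched as a toy pre-pass. Everything below is illustrative pseudologic with invented thresholds, not how any real geometry engine works:

```python
def signed_area_2x(a, b, c):
    """Twice the signed screen-space area of triangle abc (2D tuples)."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])

def geometry_pre_pass(triangles, screen_w, screen_h):
    """Cull invisible triangles and tag survivors with a shading
    priority *before* any downstream stage sees them, so no later
    pipeline step spends a single calculation on culled geometry."""
    survivors = []
    for tri in triangles:
        area2 = signed_area_2x(*tri)
        if area2 <= 0:
            continue  # back-facing or degenerate: removed up front
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        if max(xs) < 0 or min(xs) > screen_w or max(ys) < 0 or min(ys) > screen_h:
            continue  # entirely off-screen: removed up front
        # Near pixel-sized triangles get a coarse shading priority
        # (VRS-style); larger ones keep full rate. Threshold invented.
        priority = "full" if area2 / 2.0 >= 4.0 else "coarse"
        survivors.append((tri, priority))
    return survivors
```

The point of the sketch is only the ordering: the cull and the priority tag happen once, up front, instead of every downstream stage re-deriving them.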
You are right. Please note that the text I reacted to was your claim that the GE is the same - it is not. Does it perform better? We do not know. The early indicators are good, however, and the dev buzz is very positive.

I believe there might be challenges with some multi-plats in the short run, since they are built around the current way of rendering. That might mean worse performance on the PS5 - we will see.
Why so aggressive?
Because it is a complicated topic and the majority of people just see 12 and 10 TFLOPs and cannot think beyond that. Better to let games do the talking and then bring up the reason 'why'. So I expect some disclosures around how this works after games have been tested and shown on both the XSX and PS5.
As a side note - I also expect some PS5 cache information to be released. I believe we are in for a small treat there as well.
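For reference, the 12 and 10 headline numbers are just simple arithmetic over the public specs (36 CUs at 2.23 GHz vs 52 CUs at 1.825 GHz, 64 FP32 lanes per CU, 2 ops per lane per clock):

```python
# FP32 throughput = CUs x 64 lanes x 2 ops (FMA) per lane per clock x clock.
def tflops(cus, clock_ghz, lanes=64, ops_per_clock=2):
    return cus * lanes * ops_per_clock * clock_ghz / 1000.0

ps5 = tflops(36, 2.23)    # ~10.28 TFLOPs
xsx = tflops(52, 1.825)   # ~12.15 TFLOPs
```

Which is exactly why the raw figure says nothing about culling efficiency, cache behaviour, or how often those lanes actually do useful work.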
Also, it's easier to fully use 36 CUs in parallel than it is to use 48 CUs. When triangles are small, it's much harder to fill all those CUs with useful work.
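A toy way to see the point above: if the rasterizer can only emit a few triangles per clock, tiny triangles starve the shader array, and the wider the array the more it idles. All constants below are invented for illustration only:

```python
RASTER_TRIS_PER_CLK = 4   # triangles the rasterizer scans out per clock
PIXELS_PER_CU_CLK = 1     # pixels one CU can shade per clock

def shader_utilization(num_cus, pixels_per_triangle):
    """Fraction of the shader array kept busy when the rasterizer
    is the bottleneck (toy steady-state model)."""
    pixels_fed = RASTER_TRIS_PER_CLK * pixels_per_triangle
    capacity = num_cus * PIXELS_PER_CU_CLK
    return min(1.0, pixels_fed / capacity)

# Big triangles saturate both widths; pixel-sized triangles feed only
# 8 pixels/clock, so the 48-CU array idles more than the 36-CU one.
big = (shader_utilization(36, 100), shader_utilization(48, 100))  # (1.0, 1.0)
small = (shader_utilization(36, 2), shader_utilization(48, 2))    # (~0.22, ~0.17)
```

Real GPUs hide this with buffering, multiple rasterizers and quad packing, but the qualitative trend (small triangles hurt wide GPUs more) is the argument being made.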
In fact you don't do it overnight, it took a year. And no, I think no hardware engineer would design a GPU with only 36 CUs clocked at an exaggerated level in 2020. In fact no one else does, especially if the cost of the project doesn't deliver obvious economic savings. I think that, from all the evidence I have brought you, it is more logical to think like me than like you, so it is better to change the subject, because you will not convince me that Cerny deliberately designed a console with only 36 CUs while not even using the full set of technologies present in the new AMD architecture. I have been following the console market since its first dawn, and I can say with relative confidence that if MS had been in Sony's position (with 36 CUs clocked that high, a huge console, and doubts about the architecture used) we would not even be having these discussions, because everything would be very evident and clear. Stop chasing the possible "different approaches" or "secret sauces".

Yes, yes it is. Your take is absolutely nonsensical. Everything in the console was built around it having very high frequencies from the get-go: the liquid metal addition, the giant heatsink, the tweaks at the architectural level to process data at top speed... You do not do that overnight, buddy. This has taken them years and years of prototyping and testing.
Again - the geometry engine is already in RDNA2. The geometry engine is not some Sony-specific thing; the Xbox has it too, lol. People are now saying Sony have customised theirs, but without any actual evidence, proof or details, and if they have, then the possibility also exists that MS have customised theirs too.

Could be; I see nothing shocking about it. The geometry engine seems a very smart tech, even better than VRS, and RDNA3 could use it. Not sure where all this surprise is coming from.
"the outlier" it is not the xsx with a relatively lower "fixed" clock than it would be on a desktop gpu individually cooled by 3 fans. the new RX 6800 has a game clock of 1815 GHz and 2105 Ghz in boost the XSX has 52 cu (less than the 60 cus in brand new RX 6800 rdna2) and is clocked at 1825 GHz ....... the Ps5 it's only (this should be remembered more often) 36 cu's ... clocked 2233 GHz (remain to see how much it will keep them in the real world) pretty much as high as the RX 6900 XT (80 cu's!!!) which has a base game clock of 2015 GHz and only in boost reaches 2250 !!!! So don't talk to me about "outlier" I repeat the logic pushes everyone to think that the ps5 had to come out 1 year earlier and has been modified where possible to remain competitive. this is what I thinkYou do realise that XSX is the outlier having low clocks compared to PS5 and the recently announced big navi cards right?
"the outlier" it is not the xsx with a relatively lower "fixed" clock than it would be on a desktop gpu individually cooled by 3 fans. the new RX 6800 has a game clock of 1815 GHz and 2105 Ghz in boost the XSX has 52 cu (less than the 60 cus in brand new RX 6800 rdna2) and is clocked at 1825 GHz ....... the Ps5 it's only (this should be remembered more often) 36 cu's ... clocked 2233 GHz (remain to see how much it will keep them in the real world) pretty much as good as the RX 6900 XT which has a base game clock of 2015 GHz and only in boost reaches 2250 !!!! So don't talk to me about "outlier" I repeat the logic pushes everyone to think that the ps5 had to come out 1 year earlier and has been modified where possible to remain competitive. this is what I think
Additionally, isn't the PS5's GE fully programmable, compared to fixed-function GEs, as hinted by Matt Hargett on the RedGamingTech podcast?

All GPUs have a piece of hardware dealing with geometry; AMD are calling this piece the geometry engine. Sony has redesigned the entire rendering pipeline around a heavily customised GE in the PS5. It is also rumoured to be part of the future RDNA roadmap. It is unclear to me why you write stuff like the above unless your intention is simply to troll.
A fixed clock is easier in the console world... The PS5 is the first "outlier" console with a variable clock (and this brings us back to why, for the first time, they had to introduce a variable clock), probably to gain some TF over the original project.

You should be comparing boost clocks against the fixed console clock, as that is the max range PC cards can run at.
You'd have a point if the 6800 boost clock was 1825 but that's not the case.
Ok

Everything points in the other direction. The frequencies are in line with RDNA2; they redesigned the rendering pipeline around the new GE, and the cooling solution is there because there is not one but several ICs in the package, so additional cooling is required - a System in Package design. All of these things take years to develop from prototype; it was the plan all along. And they wanted the high frequency because, when you do a lot of post-processing, time is as important a component as computational power - and time is measured in frequency.
I know I might have to back-track on this one, but after all the reading I have done, I am certain that Sony is using a stacked chip, and that in that vertical stack there is at least a significant cache for the GPU that we currently do not know about. The Sony cooling patent has the following main-illustration language:
"In the example illustrated in FIG. 1, the integrated circuit apparatus 5 has two IC chips 5 c and 5 d that are vertically stacked one on top of the other."
And as everyone who has written patents knows, you start with broad claims but ultimately exemplify with your actual solution, so that you know your actual solution is covered by the patent as a minimum if there is any challenge or opposition. The patent's main illustration uses a vertically stacked chip with two IC layers.
Well, we can all argue back and forth til the cows come home. But not long to go til we find out.

Ok, so Cerny, the same Cerny that we all consider brilliant, did all this crazy work, to end up with what? A solution that costs the same (if not more), a physically huge console that is harder to cool and, more importantly, performs worse. Sorry, no, I don't believe it.
Sure, like everyone else I can't wait to see the DF comparisons... but wait for what? To find out what? Whether the PS5 is more powerful than what the hardware shows? I don't think magic can transform what's inside the PS5, guys. We can only wait to find out whether the differences are simply those of the hardware (around 20%), or whether the variable clock isn't all that and the gap increases... or whether the customisations made by Sony do not correspond exactly to those of RDNA2. This is what I personally expect.

Well, we can all argue back and forth til the cows come home. But not long to go til we find out.
What's the point? He and Lisa Su said it's custom RDNA2, but people still say it's RDNA 1.

Why can't Cerny or someone just come out and say it?
an innovation that essentially gives the system on chip a set power budget based on the thermal dissipation of the cooling assembly.
"The behaviour of all PS5s is the same," says Cerny. "If you play the same game and go to the same location in the game, it doesn't matter which custom chip you have and what its transistors are like. It doesn't matter if you put it in your stereo cabinet or your refrigerator, your PS5 will get the same frequencies for CPU and GPU as any other PS5."
Cerny mentioned that they could run the GPU way over 2230 MHz, but they had to cap it so that the on-chip logic operated properly. The cooling solution of the PS5 was designed after the GPU/CPU frequencies and power levels were set; it was designed specifically for that power level. And they spent over two years preparing the adoption of liquid metal as their thermal interface material.

Ok, so Cerny, the same Cerny that we all consider brilliant, did all this crazy work, to end up with what? A solution that costs the same (if not more), a physically huge console that is harder to cool and, more importantly, performs worse. Sorry, no, I don't believe it.
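The deterministic behaviour Cerny describes can be sketched as a toy governor: the clock is derived from the workload's measured activity against a fixed power budget, never from the individual chip's temperature, so every console lands on the same frequency for the same scene. The budget, the constant `k` and the cubic power model below are all my own invented illustration:

```python
F_MAX_GHZ = 2.23        # frequency cap from the public figure
POWER_BUDGET_W = 180.0  # fixed SoC power budget (invented number)

def gpu_frequency(activity, k=20.0):
    """activity: workload activity factor in (0, 1] taken from
    performance counters, identical on every unit for the same scene.
    Toy model: dynamic power ~ k * activity * f^3, so pick the highest
    frequency that fits the budget, capped at F_MAX_GHZ."""
    f = (POWER_BUDGET_W / (k * activity)) ** (1.0 / 3.0)
    return min(F_MAX_GHZ, round(f, 3))
```

Light and moderate loads sit at the cap; only a power-hungry scene pulls the clock down, and it pulls it down identically on every unit, which is exactly the "stereo cabinet or refrigerator" point.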
Well, we'll see.
I did. How many video cards have they presented with fewer than 60 CUs and a game clock above 2230 MHz? ...Zero.

A wild developer appears! Stopped reading after that gem of a line. You did not check the latest AMD GPU line, did you?
You should be comparing boost clocks against the fixed console clock as that is the max range each can run at.
You'd have a point if the 6800 boost clock was 1825 but that's not the case here.
XSX has more in common with the 5000 series in that regard.
Exactly. If Sony wanted a powerful and less complicated console, it had other, much simpler and more profitable options than a console that is huge to build (and, given its unpleasant size, to look at)... because such a high clock prompted the designers to use problematic liquid metal (anyone who builds PCs knows the risks of that type of heat conductor) and a huge, very expensive heatsink, and to invent a whole new way to hold that clock as much as possible! All just because you had 36 CUs, a CU count far from today's standard and much, much closer to the previous AMD architecture.
I did. How many video cards have they presented with fewer than 60 CUs and a game clock above 2230 MHz? ...Zero.
The presented top flagship dies (only one PRESENTED so far) all have 80 CUs; the differentiator is that some have CUs and some have a whole SE disabled.
Only the reference cards have been presented by AMD, at conservative clocks; here is the Strix AIB.
Of course you realise bigger dies are clocked lower than smaller dies, but it won't be by much.
The 40 CU part is rumoured at 2.4 GHz; it's not yet available for sale, but that does not mean it does not exist, and it will be announced soon enough.
I am sure that with a gallium eutectic alloy, AIB options will go higher; peaks of 2.8 GHz have been reported.
Some of the world's best engineers, spending millions on R&D for years on end, built the PS5.
Again... sorry, but I can't believe it at all. I'm very sorry to think differently from you.
The thing is... the PS5 was rumoured to be a 2019/2020 console, or I don't know what, but everything points to them having had a design review. Everything. How can you not see it?
But you've seen right through them...good job!
1) You're talking about boost, not game mode.
2) GPUs in a console are not dGPUs and are cooled very, very, very differently.
3) Substantial clock fluctuations, like those between game and boost mode in the PC world, would create enormous problems for developers in the console world.
None of the new RDNA2-architecture GPUs have fewer than 60 CUs; the only outlier is the XSX GPU, with 52 clocked basically the same as the RX 6800 in game mode.
Yikes. Are you serious? Do you also look at the FX 8350, which is an octa-core, and then look at the Ryzen 7 3700X, which is also an octa-core, and go "eh... 8 corezz same numerzz much mucchh clozerr...."?

All just because you had 36 CUs, a CU count far from today's standard and much, much closer to the previous AMD architecture.
Nothing points to that. NOTHING.
So are you trying to say there won't be any RDNA 2 GPUs with fewer than 60 CUs? 40, 36 CUs and even lower?
1. The game clock for the 40 CU part will also be > 2.2 GHz.
2. No, they are cooled with fans and thermal conduction; dedicating two fans to a card does not make it different, just more dedicated.
3. Hahah, that's funny. I am sure Sony are really struggling. What a load of tripe.
MS clearly went with a solution that did not involve the fast-gated frequency control and optimisations needed to get high clocks. It's clear; if you understood any of the points below we could discuss it, but you clearly don't.
Note they are written in a way that obscures things, so as not to upset some partners (for now). The white paper will show the fast CUs are designed with per-WGP clock gating, for starters.
Do you know what the other two points mean?
I doubt anyone competent would say that mesh shaders, which are part of a new render pipeline, are downstream of anything.
I minored in psychology. Some of you are in the grief and anger stages, and you are in free-style trolling mode.
I understand 28 October was supposed to be the day of crow eating, when the world was supposed to see super-high game clocks and special sauces. Too bad the event belonged to AMD and MS DX12U. Too bad we saw the best 7nm RDNA2 dies run only slightly above 2 GHz at 300 W TGP.
You did not get the beautiful clocks, so now you are resorting to FUD that the Series X clocks are too slow and therefore not RDNA2.
Never mind that it's been pointed out repeatedly: go look at the 6800, at 250 W.
I'm sure you know you are trolling and egging people on at this point. Grow up and stop polluting the forums.
I have a feeling you are from REE and trying to spoil NeoGAF. Go back to your socially, politically 'perfect' place and be 'normal.'
And the XSX GPU is below the RDNA 2 cards (and PS5).
I think the design of the PS5 clearly demonstrates a reworking of the initial project. As I have already said, nobody in their right mind would approve a console of that size, with that CU/clock setup, at that price... unless forced to for some obscure reason, especially if, on paper, it is not even at the top of the performance charts compared to the competition.

Nothing points to that. NOTHING.
From reading your posts, are you telling me that because Sony went with a 36 CU design, they were somehow caught off guard by the Series X? And that in response they delayed the console, decided to boost the clock frequency to the max, and hoped for the best? Before I go any further I just want to get this straight: is that how you view this?
dGPU? Yes, absolutely... you will not get a 36 CU dGPU, lol.

So are you trying to say there won't be any RDNA 2 GPUs with fewer than 60 CUs? 40, 36 CUs and even lower?
First and foremost, you have not explained how the PS5 demonstrates a reworking.