Like someone said earlier, Google Trends is not so good, since changing one letter/number shows different results.
As for people seriously trying to say the PS5 has somehow horrible or inefficient GPU performance simply because of 18% less Tflops?
See what I did here?
Also
Microsoft themselves seem to disagree with you here:
MS touting their slow-ass SSD as their most innovative feature!
"The CPU is the brain of our new console and the GPU is the heart, but the Xbox Velocity Architecture is the soul," said Andrew Goossen, Technical Fellow on Xbox Series X at Microsoft. "The Xbox Velocity Architecture is about so much more than fast last times. It's one of the most innovative parts of our new console. It's about revolutionizing how games can create vastly bigger, more compelling worlds."
I didn't quite understand your math. The power advantage is MINOR; consoles releasing in the same year have never been this close in power.
PS2 vs Xbox: more than 100% GPU power advantage to Xbox, not counting the RAM and CPU advantages (Xbox released a year later).
PS3 vs 360: different architectures, with advantages for each console.
PS4 vs Xbox One: ~40% GPU advantage to PS4, with more than double the RAM bandwidth.
One X vs Pro: ~40% GPU advantage to the X, with slightly faster bandwidth.
And now:
PS5 vs Series X: 18% GPU advantage to the X, which has a vastly slower SSD and worse I/O, compression, and audio.
Sony had to make those "minor" sacrifices on the GPU simply to improve their silicon in other areas while keeping the price reasonable. I think they made the right call here; the PS5 is one of the most elegantly designed consoles ever created, and we are getting:
- An innovative controller with the most features
- An innovative audio experience for everyone (never done before)
- A blazing-fast SSD that the competitor is dreaming of
- A streamlined gaming experience, with almost all bottlenecks in the system removed
An 18% GPU advantage won't result in any noticeable visual improvement: maybe a frame or two more than PS5, or a few lines of resolution that you need a 300x DF zoom to notice.
hmmm...
Well, a lot of older GPUs support DX12. The mono driver and DX12 were where they wanted to head, but XBO was just so delayed, everything was just so delayed. You can really see how MS turned it around with this coming generation: APIs, feature set, hardware, targets being hit, marketing that makes sense, services moving forward, all sorts of compatibility, all of it coming together in a very strong cadence.
In this case, on the topic of XBO hardware feature support, XBO only supported features up to feature level 12_0. There were some Xbox-only features it supported, in particular some additional microcode around the ExecuteIndirect function. But outside of that, we didn't see any specific hardware support beyond what GCN already supported, in either tiers or feature levels.
A _sharp_ contrast to what we have coming.
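(For context on "tiers and feature levels": on PC this is literally something an app queries from the D3D12 runtime. A minimal sketch, assuming an existing ID3D12Device* named device; nothing here is console-specific.)
```cpp
#include <d3d12.h>

// Ask the runtime for the highest supported feature level and a few option
// tiers. GCN-era parts (like the XBO GPU) typically top out at 12_0.
void QueryFeatureSupport(ID3D12Device* device)
{
    D3D_FEATURE_LEVEL levels[] = { D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_0,
                                   D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0 };

    D3D12_FEATURE_DATA_FEATURE_LEVELS flInfo = {};
    flInfo.NumFeatureLevels        = static_cast<UINT>(sizeof(levels) / sizeof(levels[0]));
    flInfo.pFeatureLevelsRequested = levels;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &flInfo, sizeof(flInfo));
    // flInfo.MaxSupportedFeatureLevel now holds the highest supported level.

    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));
    // opts.ResourceBindingTier, opts.TiledResourcesTier, etc. are the "tiers"
    // being talked about above.
}
```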
I expect XSX to continue its development on the command processor, incorporating the additional work they've been doing there since XBO, X1X, and now XSX. They have a different tier version of VRS that is not covered by DX12U (if our patent understanding is correct); both the AMD and Nvidia implementations differ in what is offered in this case. Not sure if MS has more up its sleeve. But as a baseline, yes, all 3 should have VRS.
The DF article mentioned something about " the Series X GPU allows for work to be shared between shaders without involvement from the CPU, saving a large amount of work for the Zen 2 cores, with data remaining on the GPU". Definitely sounds like more development on the command processor being able to generate work without the cpu.
It would be a major win for this generation if next-generation development pushed developers to go this route. Having ExecuteIndirect with state changes is the only forward-looking feature they have in the console, and if programming moves in this direction it could prolong the lifespan of these consoles, as Jaguar is significantly weaker than Zen 2. But if you're reducing the amount of work it needs to do, Jaguar might just be enough to keep up.
Traditionally, I believe ExecuteIndirect can be used to generate work for itself. In the demo Max presented, the GPU used ExecuteIndirect to send the near-finished results over to the iGPU to finish the buffer off and send it out for processing without CPU intervention.
Lol. I'm just curious to see the process carry out as expected. We've had a lot of dud features happen over time. I do recall reading about some developers taking it on; it's one step closer to complete separation of CPU and GPU activities. None of it really means anything for me; I'm just interested in seeing this programming paradigm play out. XSX might just be the closest they'll get to that, with the GPU being able to call textures directly from the SSD. It may need very little CPU intervention. It's the sort of thing where, if we are successful here, you can move forward further on mGPU as well.
Ideally we'll see much higher saturation of the CUs because you're not waiting on the CPU for the next set of instructions. You just keep mowing through everything without CPU intervention.
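(Rough idea of what "the GPU generating its own work" looks like through today's PC API. A minimal sketch, assuming an argument buffer and count buffer that a culling compute shader has already filled; buffer creation, barriers, and error handling are omitted, and none of this claims to match whatever extra microcode the consoles layer on top.)
```cpp
#include <windows.h>
#include <d3d12.h>

// Created once at init time and kept alive for as long as it is used.
ID3D12CommandSignature* CreateDrawOnlySignature(ID3D12Device* device)
{
    // Each record in the argument buffer is one plain D3D12_DRAW_ARGUMENTS.
    D3D12_INDIRECT_ARGUMENT_DESC arg = {};
    arg.Type = D3D12_INDIRECT_ARGUMENT_TYPE_DRAW;

    D3D12_COMMAND_SIGNATURE_DESC sigDesc = {};
    sigDesc.ByteStride       = sizeof(D3D12_DRAW_ARGUMENTS);
    sigDesc.NumArgumentDescs = 1;
    sigDesc.pArgumentDescs   = &arg;

    ID3D12CommandSignature* cmdSig = nullptr;
    device->CreateCommandSignature(&sigDesc, nullptr, IID_PPV_ARGS(&cmdSig));
    return cmdSig;
}

// Recorded each frame: one CPU call covers up to maxDraws draws. The GPU reads
// the real draw count and every draw's parameters from buffers that a culling
// compute pass wrote earlier, so the CPU never touches the per-draw loop.
void RecordGpuDrivenDraws(ID3D12GraphicsCommandList* cmdList,
                          ID3D12CommandSignature* cmdSig,
                          ID3D12Resource* argBuffer,   // GPU-written draw arguments
                          ID3D12Resource* countBuffer, // GPU-written draw count
                          UINT maxDraws)
{
    cmdList->ExecuteIndirect(cmdSig, maxDraws, argBuffer, 0, countBuffer, 0);
}
```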
If that was true then Sony wouldn't ever have to make any effort to win a generation. They would just show up and run out of consoles to sell straight away just off the power of their brand name. We all know that's not what happens in reality. They trailed the Xbox 360 for almost the entire previous generation. People actually educate themselves about which console performs better, even if they don't understand technical specs. Their techie friends tell them. The staff at GameStop tells them. The advert on MTV saying "the world's most powerful console" tells them (yes, that was the PS4 advert). Sorry to say, but the notion that people ask "which console is Sony's?" and just buy that one is fanboy delirium.
Out of the 3 million plus people who liked the PS5 controller picture shown last week, how many of them do you think hang out here, on REE, or on any other gaming board? I'd say 2% (being generous). The other 98% don't care about TFLOPS or SSDs. PS has the brand and will have the worldwide demand. Can't see MS doing anything to dent that, even with their console capable of pushing more TF. The vast majority don't really care.
Because in one case you are comparing GPUs that are both less than 2 TF, and in the other, GPUs that are both at least 10 TF.
How does a .4 TFLOP difference result in a 900p to 1080p disparity, but a 1.9 TFLOP difference will hardly be noticeable?
Dude, sometimes I really think that you're not the same person lol. Calm down, Sony won't react because you said so; with all due respect, you're not smarter than the people who work in marketing, they know what they're doing. Sony doesn't give a flying fuck if some people spread FUD about them; they know very well what a damn controller reveal did to the public, and that reveal overshadowed Inside Xbox as well lol.
Doesn't matter if it's minor, it's an advantage. Sony will lose every DF faceoff right at the start of the gen, leaving the masses to buy the most powerful console. If not the masses, then the Xbox 360 owners who switched to PS4 last gen. If every multiplat game runs best on the Xbox Series X, why would you want the PS5? Sony better have some amazing exclusives ready at launch, and not just their action-adventure titles. They need first-person shooters and WRPGs.
I am not doubting that the PS5 is a powerful console; I'm saying that by focusing on features instead of raw power, they have put themselves in a corner, where the only way out is great marketing to show off these features, and they have done a miserable job of showcasing them so far.
You make it look like there's this huge performance gap between the two consoles, when the delta is less than half of the current generation's. Just chill out, this is not going to be another 'resolutiongate'; I believe the DF comparisons will be much more boring this time.
Doesn't matter if it's minor, it's an advantage. Sony will lose every DF faceoff right at the start of the gen, leaving the masses to buy the most powerful console. If not the masses, then the Xbox 360 owners who switched to PS4 last gen. If every multiplat game runs best on the Xbox Series X, why would you want the PS5? Sony better have some amazing exclusives ready at launch, and not just their action-adventure titles. They need first-person shooters and WRPGs.
I am not doubting that the PS5 is a powerful console; I'm saying that by focusing on features instead of raw power, they have put themselves in a corner, where the only way out is great marketing to show off these features, and they have done a miserable job of showcasing them so far.
Check out the RE3 remake DF video: the performance delta between the base consoles with a 0.5 TFLOPS difference, and then the performance delta between the Pro and the X with a 1.8 TFLOPS difference between them.
I didn't quite understand your math.
XB1 was 1.3, then 1.4 TFLOPS vs Sony's 1.84 TFLOPS. 40%
XB1X is 6 TFLOPS vs the PS4 Pro's 4.2 TFLOPS. 40%
XSX is 12.1 TFLOPS vs. 10.2 TFLOPS. 18%?
Is that right? Seems off.
How does a .4 TFLOP difference result in a 900p to 1080p disparity, but a 1.9 TFLOP difference will hardly be noticeable?
Plus, we still don't know what each machine is packing. How can you make claims like this when there are still unknowns for each console?
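(A quick sanity check on those percentages, using the commonly cited TFLOPS figures from this thread; the point is that the relative gap, not the absolute TFLOP delta, is what shows up as resolution differences, and that the same gap reads as +18% or -15% depending on which console you treat as the baseline.)
```cpp
#include <cstdio>

int main() {
    struct Pair { const char* label; double big, small; };
    const Pair gens[] = {
        { "PS4 vs XB1",       1.84,  1.31 },   // launch consoles
        { "One X vs PS4 Pro", 6.00,  4.20 },   // mid-gen refresh
        { "XSX vs PS5",      12.15, 10.28 },   // commonly cited figures
    };
    for (const Pair& p : gens) {
        const double advantage = (p.big / p.small - 1.0) * 100.0; // stronger vs weaker
        const double deficit   = (1.0 - p.small / p.big) * 100.0; // weaker vs stronger
        std::printf("%-18s +%3.0f%% advantage / -%3.0f%% deficit\n",
                    p.label, advantage, deficit);
    }
    return 0;
}
```
That prints roughly +40% / +43% / +18% one way and -29% / -30% / -15% the other, which is where both the "18%" and "~15%" figures floating around this thread come from.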
The differences will be noticeable in different ways this time. Such as ray tracing, probably.
You make it look like there's this huge performance gap between the two consoles, when the delta is less than half of the current generation's. Just chill out, this is not going to be another 'resolutiongate'; I believe the DF comparisons will be much more boring this time.
You are missing what makes TE an exciting prospect; I'll summarize in three points.
The only thing I'm sure of is that lots of you are going to be deluded by the Tempest Engine. It's not the 2nd coming of Jesus, it's just a positional audio technology.
I don't think either Sony or MS are modifying basic RDNA2 features (RT, TSS, Mesh Shaders, etc.) that are built into the design; AMD's the expert, already doing the best possible implementation of the feature. It hasn't been the case this gen that consoles and PC had different designs of the same features.
You and Trueblakjedi could both be right on this, FWIW. Both systems are using default templates of RDNA2 for RT that will exist at silicon level for all RDNA2 products, and both systems might have also made modifications to the CUs (since RT in RDNA2 is tied to the CUs), via alterations or other affixed silicon, that are more custom for their specific attempts.
Custom unique features that aren't widely useful outside of the specific console design, like cache scrubbers or pipeline modifications/optimizations to minimize bottlenecks and maximize performance.
Both Sony and MS have already alluded to features in their systems that are custom to their platforms and may or may not see implementation on the PC side depending on how things shake out.
LOL!
Dude, sometimes I really think that you're not the same person lol. Calm down, Sony won't react because you said so; with all due respect, you're not smarter than the people who work in marketing, they know what they're doing. Sony doesn't give a flying fuck if some people spread FUD about them; they know very well what a damn controller reveal did to the public, and that reveal overshadowed Inside Xbox as well lol.
Google Trends is really bad. For instance, you can distinguish between the brand and just the search term, which gives you different results...
Like for PlayStation (brand) and PlayStation (search term). At least the trend is the same:
The best example is when using PlayStation 4 (game console) and PlayStation 4 (search term). Two completely different results.
Well well well... looks like the GitHub folders about the next-gen consoles weren't hosted by AMD after all. They were hacked or leaked from a worker at AMD, and the ones who hosted those folders on GitHub were a bunch of Xbox fanboys like Hmqgg and the others, which is why we have that mysterious Reddit leak from January 2019 that clearly stated Lockhart being a 4+ TF console, PS5 being 8+ TF (it was Ariel at that time, with 36 CUs at 1800 MHz), and finally, Xbox Series X being a 12 TF console.
Hmqgg even admitted it on Twitter and then made his Twitter account private. That Xbox-fanboy-infested Discord named "Xbox Era Discord" has guys like Spechal Ed, TimDog, ProElite, Colbert, Klobrille, Colteastwood, and even Digital Foundry's own Alex Battaglia in there. No wonder they were acting like they're insiders or "in the know" when Mark Cerny first revealed some of the specs inside the PS5 back on April 16th, 2019; they were saying that Xbox Scarlett was gonna be "more powerful and more advanced" (at that time, the GitHub leak posted by Hmqgg was stating that PS5 wouldn't have VRS, no HW RT, an 8 TF GPU codenamed Ariel, RDNA 1, Navi 10 Lite)... So... we can conclude that it was some Watch Dogs hacking-level territory LMAO!! And it even disproves what Klobrille said at that time, that PS5 is weaker and less advanced. Weaker? Yes, slightly weaker GPU-wise, but it is as advanced as XSX, since both are RDNA 2 GPUs.
The GitHub leak is 50/50 true, so it wasn't the full story some Xbox fanboys wanted it to be; the difference between XSX and PS5 is roughly 18% in GPU power, which is really almost nothing, while PS5 has double the SSD speed.
We can conclude that:
—— GitHub folders were NOT hosted by AMD.
—— It was hosted by Hmqgg, who is part of that Xbox Era Discord, and I don't know how he got it. Maybe he hacked it from AMD servers? From a worker at AMD? Interrogation? Who knows.
—— Many Xbox fanboys were wrong.
—— Xbox Era Discord is disgusting and full of disgusting fanboys that spread FUD about PS5 being 8 TF which is wrong.
—— GitHub leak was 50% wrong, 50% right.
—— Thanks to SlimySnake and @MonkeyPunch for making this very clear to us here; now we understand everything.
Xbox fanboys are very sneaky after all, but Phil Spencer will be giving them Game Pass small-ass games in the end for all those 12 TFs!!
Indeed, different ways; we still don't know what practical differences the PS5 SSD and all that I/O effort will make in games compared to the XSX solution.
The differences will be noticeable in different ways this time. Such as ray tracing, probably.
Wait a second, you already answered yourself: when you compare things like GPUs, you need to compare their performance in percentages, not in flops.
I didn't quite understand your math.
XB1 was 1.3, then 1.4 TFLOPS vs Sony's 1.84 TFLOPS. 40%
XB1X is 6 TFLOPS vs the PS4 Pro's 4.2 TFLOPS. 40%
XSX is 12.1 TFLOPS vs. 10.2 TFLOPS. 18%?
Is that right? Seems off.
How does a .4 TFLOP difference result in a 900p to 1080p disparity, but a 1.9 TFLOP difference will hardly be noticeable?
Plus, we still don't know what each machine is packing. How can you make claims like this when there are still unknowns for each console?
I know where you are coming from, but until June we can't judge Sony yet. This pandemic ruined a lot of things; as soon as Sony shows their games, believe me, you will forget about all these things that concern you.
LOL! I am sure the MS marketing and product execs who came up with the Xbox One, and Sony's PS3 execs, are also smarter than me, but smart people make mistakes. I think we all agree that Cerny's presentation being the first true PS5 reveal was a terrible idea by some very smart people.
Nintendo would be Ermac.
Does this make Nintendo Sektor? :-\
You cannot patent a name, and I find nothing that points to MS having a registered trademark on VRS. There is no (r) symbol anywhere I can see, and NVidia is using it actively for marketing purposes.
4: PS5 will have a version of VRS, but it won't be called VRS, as that particular implementation and name is patented by Microsoft. Nothing prevents Sony from having their own abstraction and API stack for implementation of their own version of VRS, however, and I'm certain they have just that.
The GitHub leak did not say anything about PS5 not having hardware RT.
The info was also quite different between the Xbox Series X and PS5 chips. You could not look at the Xbox chip and read "has X", then look at the PS5 chip, see that it doesn't say it "has X", and conclude that it does not have X.
PS5 and Xbox Series X GPU specs leak: how powerful is next-gen? (www.eurogamer.net)
"In reaction to the leak, there has been an argument suggesting that the PS5 specs are invalid because there is no mention of hardware accelerated ray tracing, whereas Arden has it confirmed (along with VRS - variable rate shading). However, the documentation for both processors is very different and can't be directly compared. A lot of AMD's validation testing for the PS5 'Oberon' processor is in the leak, whereas the Series X data is best described as somewhat patchy by comparison. If the PS5 specs are to be taken with a pinch of salt, have an armful of the stuff ready when looking at the table directly above with mooted Series X specs. "
In the end we saw that the information was pretty much spot on, except for final clocks, which is pretty obvious (these things get decided pretty late and the leak was pretty old). Both got around 10% increased clocks.
Again, don't discredit the GitHub leak because some people speculated a little bit too much about what the leak DID NOT SAY.
But it did state Navi 10 Lite, which is not RDNA 2, so... it is 50/50 wrong. It got leaked after all, and Hmqgg is the one who hosted that GitHub page, not AMD.
If that was true then Sony wouldn't ever have to make any effort to win a generation. They would just show up and run out of consoles to sell straight away just off the power of their brand name. We all know that's not what happens in reality. They trailed the Xbox 360 for almost the entire previous generation. People actually educate themselves about which console performs better, even if they don't understand technical specs. Their techie friends tell them. The staff at GameStop tells them. The advert on MTV saying "the world's most powerful console" tells them (yes, that was the PS4 advert). Sorry to say, but the notion that people ask "which console is Sony's?" and just buy that one is fanboy delirium.
Seriously!?
Talk about missing entirely what I was doing. And btw, you do realize that the State of Decay you saw running on Series X isn't actually designed with any Xbox Series X features in mind, right?
As for people seriously trying to say the Xbox Series X has somehow horrible or inefficient I/O performance when we saw the damn thing quickly switch between 5-6 different games with roughly 8 seconds tops in between each, yea.. I don't know about that one. None of those games were actually designed for the Series X's capabilities also. And of course the PS5 SSD sounds very impressive, but I really don't believe outside of initial loads people are going to be seeing the SSD advantages they are hoping to see in game comparisons. And why is that? It's not that Sony's first parties won't do unbelievable things, they will. The problem? People are MIGHTILY overestimating what an SSD can bring to the experience and greatly downplaying the role that the CPU and GPU play. Make no mistake, the CPUs are the biggest game changers coming into this generation, not the SSD. Then when you factor in the significant efficiency improvements of the GPU architectures this time around, along with the features already confirmed for the Series X, I just don't see any realistic scenario where any other announced console is somehow going to run circles around this thing simply because of a much faster SSD.
But the best part is we will have the chance to find out, won't we?
It was really a great presentation.
Mark Cerny is so hard to dislike. Even after the Road to PS5 talk, which annoyed many people, the comment section was just filled with praise about how Cerny almost flawlessly delivered the presentation lol
It is weird, these quotes, because nothing they say is hardware related.
Some interesting discussion from Beyond3D on certain GPU features, in the XSX thread.
Highlighted the particularly interesting parts. Hope these dudes don't mind me copy-pasting their quotes. These kinds of advancements would be great for next-gen, and I've always wondered how far you could customize GPUs to essentially run their own code without needing the CPU to issue the commands to them.
Remember, these are just fellow people speculating on next-gen features like the rest of us, but this particular topic caught my attention.
Yeah, Sony marketing has been a shit show for two years now. I have two theories.
And this cheapness extends to what games they will have coming. All the big ones will be cross-gen, and the next-gen exclusives will be like PS4's in 2013. So we will see the equivalent of Knack, Killzone, and Resogun.
I think option two is more likely, as I see no evidence of Sony sinking big money into PlayStation to wow us beyond making more graphical AAA third-person action-adventure showcase games.
Yeah, as much as I want to believe, I think option 2 is far more likely. None of us here thought Sony would settle for 36 CUs to save a few dollars, and yet here we are. The fact that they downgraded RAM bandwidth from 560 or whatever was in the GitHub and Flute leaks to 448 GB/s is even more alarming. This is clearly not the same company that we saw back at E3 2013. Cancelling E3 the year of launch is mind-blowing considering how much great press they got at E3 2013, 2015 and 2016.
I've brought this up before, but all those 20-30 year vets leaving in a span of a couple of years was always a sign of trouble. Andrew House launches Sony's biggest product in years to massive success and is then let go? Then his replacement is replaced? Then Shawn Layden is given an unceremonious exit immediately afterwards? And then Shu is replaced after his studios had arguably the best generation ever for a first-party console manufacturer. I mean, he produced three overall GOTY winners in U4, GoW and DS, the last of which was not first party, but I highly doubt Shu wasn't heavily involved in Kojima's recruitment and the sharing of the Decima engine.
These guys have been around since the PS1 launch. All gone. Now we are stuck with logo enthusiast Jim Ryan, who is more prone to gaffes than Don Mattrick and Joe Biden combined. They should've brought back Andrew House or Jack Tretton after Kodera didn't work out. I have no idea what's going on at Sony.
Yep, Sony probably already has a similar extension in GNM for the PS5.
It is weird, these quotes, because nothing they say is hardware related.
It is just like PS5's API vs the DX12 API.
Just different software logic (with PS5's API being closer to the metal due to targeting one single platform) to the same result.
AMD's command processor is programmable... MS did that with XB1, Sony did that with PS4... both will do it with the new Xbox and PS5... it is just how the AMD GPU command processor works.
And that other part about VRS being different on Xbox is because MS has patented a software logic to do VRS... it still needs the AMD and Nvidia hardware, but the logic is exclusive to MS's DX12.
Sony will probably have their own logic for VRS that will use the AMD hardware too.
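(For anyone wondering what "the VRS logic lives in the API" actually means in practice, here's a minimal DX12-side sketch. It assumes an existing device, a command list, and, for Tier 2, a screen-space rate image; whatever Sony exposes in GNM would be its own, differently named interface over the same AMD hardware.)
```cpp
#include <d3d12.h>

void ApplyVariableRateShading(ID3D12Device* device,
                              ID3D12GraphicsCommandList5* cmdList,
                              ID3D12Resource* shadingRateImage)
{
    // 1) Ask the driver which VRS tier the hardware exposes.
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &opts6, sizeof(opts6));

    if (opts6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_1)
    {
        // 2) Tier 1: a per-draw base rate, here shading once per 2x2 pixel block.
        const D3D12_SHADING_RATE_COMBINER combiners[2] = {
            D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,   // combine with per-primitive rate
            D3D12_SHADING_RATE_COMBINER_PASSTHROUGH }; // combine with rate image
        cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, combiners);
    }

    if (opts6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2 && shadingRateImage)
    {
        // 3) Tier 2: bind a screen-space shading-rate image so the rate can vary
        //    across the frame (e.g. coarser shading in motion-blurred regions).
        cmdList->RSSetShadingRateImage(shadingRateImage);
    }
}
```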
You cannot patent a name, and I find nothing that points to MS having a registered trademark on VRS. There is no (r) symbol anywhere I can see, and NVidia is using it actively for marketing purposes.
I don't think either Sony or MS are modifying basic RDNA2 features (RT, TSS, Mesh Shaders, etc.) that are built into the design; AMD's the expert, already doing the best possible implementation of the feature. It hasn't been the case this gen that consoles and PC had different designs of the same features.
Custom unique features that aren't widely useful outside of the specific console design, like cache scrubbers or pipeline modifications/optimizations to minimize bottlenecks and maximize performance.
Those are the types of customizations being alluded to, not unique implementations of core RDNA2 features that were part of the design since its inception.
The WH-1000XM4 should be good. They will have aptX HD, and might even have aptX Adaptive, which is great for latency (apart from the obvious LDAC, which is Sony's proprietary codec and is actually the one with potentially the better quality), so you would be good with those.
All of this is true, but there's nothing in the quotes I posted which contradicts anything you've just mentioned here, either. Both can be true. We don't even know the full extent of RDNA2's feature set in terms of what are considered the common features.
They do allude to a possible hardware-related feature, however. An AMD engineer who worked on the XSX side of things had a post on LinkedIn a long while back where they talked about that system's APU and mentioned ARM-related work on the APU. We all know AMD CPUs have an ARM core on them for security-related functions, so it makes no sense to mention that if such was the only extent of the ARM implementation.
I'm just leaving options open here to see what customizations may have been made, given info already mentioned and patent info that has been found.
It's an implementation of the technique that can be patented.
Sony have already hinted they've done exactly this with the Geometry Engine, which is essentially a rebranding of the Primitive Shaders in RDNA2, with some customizations. If either company sees basic features that can be enhanced to benefit specific use-cases of their consoles, there's no reason for them to leave them at the RDNA2 default. It's part of the customization process both systems are using for their APUs anyway.
To the other things you mentioned, yes, those are part of the process behind customizations as well. It's part of the reason I posted the Beyond3D posts here too, because those speculate on possible hardware customizations to complement API stacks.
I don't think Lockhart, if it exists, will have a lower-specced CPU; the idea is to match XSX at 1080p (graphics parity).
I was thinking that, as the XSX will almost force all of its first titles to be 60 fps, its budget console could use a slower CPU and have games running at 30 fps. I don't know how this is going to work with third parties that want their titles at 30 fps, like Ubisoft.
On the other hand, SIE will prioritize graphics, so even if there is a minor difference in GPU power, a game running on a PS5 at 30 fps could look better than one on XSX at 60 fps.
Is the geometry processor = the geometry engine?
Unless AMD changed a lot with RDNA 2, that is how it is in RDNA:
"Inside the graphics core of the die, in the center is the command processor which interfaces with the software over the PCI Express interface. Next to the command processor is the geometry processor which assembles the primitives and vertices. It is also responsible for distributing the work among the two shader engines. The two shader engines house all the programmable compute logic. Within each shader engine, there are two shader arrays. Inside each array is the primitive unit, a rasterizer, a group of four render backends (RBs), a shared L1 cache, and the new compute units."
There is no way I can see RDNA 2 not having a Geometry Engine.
It doesn't make sense, to be fair.
You do realize this is an oxymoron, right?
Nobody said the PS5 was horrible and inefficient though, only that it will clearly be outperformed by the Xbox Series X based on the information we possess.
What is "taxing", however? Audio tasks, even 3D audio ones, don't require very much processing power relative to 3D graphics tasks. Some people are floating the idea that up to 20% system resources (GPU, bandwidth, both, etc) would need to be used for audio, which is a ridiculous amount. It could be 10%, it could be 5%, it could be 1%, but I wouldn't say any of those (particularly the last two) are taxing.
And that is assuming everything regarding the audio setup has been divulged, which likely hasn't and probably never will. These companies just like telling us enough specs but never 100% of everything, that's for us to figure out years down the line, as it happens every time.
They didn't though, he just said "PS5 has a new unit called the Geometry Engine". A type of Geometry Engine already exists on RDNA, even if it isn't as capable as what would be the RDNA2 version. What makes you say it's a rebranding?
Sony have already hinted they've done exactly this with the Geometry Engine, which is essentially a rebranding of the Primitive Shaders in RDNA2, with some customizations.
That's my point though: AMD has already implemented core features the best possible way. If there are specific use-case modifications, they will be in the surrounding logic, to eliminate possible bottlenecks, or through unique features specific to the design. There isn't a precedent for altering core features in consoles only. Their collaboration with Sony in designing the PS4 chip, for example, saw them improve a core feature (ACEs), and this improvement was implemented on discrete cards as well.
If either company sees basic features that can be enhanced to benefit specific use-cases of their consoles, there's no reason for them to leave them at the RDNA2 default. It's a part of the customization process both systems are using for their APUs anyway.
So 20 GB/s is not taxing? Even if it's 5%, you know the gap is only 18.19% (10.28 vs 12.15 TF), and 5-10% could close the gap with that offloaded to the Tempest? Of course, don't bring up the XSX chip, as it's only meant to offload the CPU to free up the starving 10 GB of RAM.
AMD's TrueAudio eats up to 4 CUs out of the GPU; that's 10% of a 40 CU GPU:
I think people are underestimating this stuff. Of course, if you want to enjoy current-gen audio then that's probably 1%, but XSX would have a massive disadvantage if they planned for that, and they won't, so it'll be GPU-taxing, but not on PS5.
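(Taking the post's own figures at face value, here's the back-of-the-envelope version of that argument; purely illustrative arithmetic, not a claim about either console's actual audio budget.)
```cpp
#include <cstdio>

int main() {
    const double xsxTf = 12.15, ps5Tf = 10.28; // commonly cited figures
    const double audioShare = 4.0 / 40.0;      // TrueAudio-style: 4 of 40 CUs = 10%

    // If that 10% had to come out of the XSX GPU while the PS5 ran audio on a
    // separate unit, the effective compute gap would shrink accordingly.
    const double xsxEffective = xsxTf * (1.0 - audioShare);

    std::printf("Nominal gap:                    +%.1f%%\n",
                (xsxTf / ps5Tf - 1.0) * 100.0);
    std::printf("With %.0f%% of the GPU on audio: %.2f vs %.2f TF (+%.1f%%)\n",
                audioShare * 100.0, xsxEffective, ps5Tf,
                (xsxEffective / ps5Tf - 1.0) * 100.0);
    return 0;
}
```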
Yep, an additional ARM core or custom units inside the APU to do custom Xbox things is believable.
It is just that a "custom command processor", in the sense of a hardware change, is really hard to believe... these AMD GPU units won't be changed in any APU, imo. AMD took years to make it work the way it does today; an external partner isn't going to come in and ask to change it.
The base of both Zen 2 and RDNA 2 is unchanged in both consoles... MS and Sony added custom silicon units for whatever they found better for their needs.
How many units? Well, we don't know, except the ones Sony mentioned in the presentation.
This is wishful thinking at best.
RDNA2 cannot create work for itself, AFAIK. There are some bandages here and there for ray tracing (using work creation from the CPU would be too slow).
But not for arbitrary shaders/states.
Generic GPU work creation would benefit PC the most, because on PC the CPU->GPU sync path is the longest and most cumbersome.
For console it's not that bad, but obviously it would benefit consoles too.
It wasn't right, as it was Navi 10 Lite.
No.
That was from the Gonzalo leak back in April 2019 (from 3D Marks result database)
The Github leak pretty much killed the Gonzalo leak.
So - the GitHub leak was right 100% (but not on final clocks, since MS/Sony/AMD themselves did not know where they would end up at the time).
They didn't though, he just said "PS5 has a new unit called the Geometry Engine". A type of Geometry Engine already exists on RDNA, even if it isn't as capable as what would be the RDNA2 version. What makes you say it's a rebranding?
That's my point though: AMD has already implemented core features the best possible way. If there are specific use-case modifications, they will be in the surrounding logic, to eliminate possible bottlenecks, or through unique features specific to the design. There isn't a precedent for altering core features in consoles only. Their collaboration with Sony in designing the PS4 chip, for example, saw them improve a core feature (ACEs), and this improvement was implemented on discrete cards as well.
In short: I don't think it's likely for core features to be improved in a way that'll only be useful to the specific console design.
Where are you getting 20% from? That's the whole reason I said it was questionable; an engineering member from the Xbox team themselves laughed off that figure and said it won't be anywhere near it. I think we can take them at their word on something like that, because when you think it through, it's a ridiculous amount.
Also, you're using AMD's graphic when this has probably been customized on XSX, particularly because of things like the dedicated audio chip. We already know the consoles are not using 100% the same RDNA2 feature set as the PC side, and that includes at the silicon level.
All in all I think you're overshooting on the 20% figure by a good deal.