Riky
$MSFT
Please tell me this is some sort of parody channel.......it has to be surely
The Xbox guy at Hot Chips said that's the biggest misconception regarding DX. When compiling to Xbox, all that abstraction layer is removed, so all those "ifs", as you say, shouldn't be there.
> 1CCX vs 2CCX is the only thing that I can think of. Without a die shot we really can't confirm if it's true or not.
I never use photo mode so I might be a bit fuzzy on the specifics of how it works, but IIRC the streaming of model (LOD) and mipmap textures, updating of shadow maps, etc. - and probably BVH streaming - effectively stops in photo mode, in the context of the gameplay camera movements at 16.6 ms.
With the XSX having one BVH or texture unit per CU - unlike AMD's RDNA2 cards, which can do both operations simultaneously - in photo mode you would expect the engine to recover those units and cache resources normally used for preparing to render areas on the edge of the view frustum. Effectively the engine goes from being cache/unit bottlenecked by time and motion (in game) to being just time limited, so less stressed, and because of the lack of motion the FPS isn't a real measure, because some of the effects will be accumulated over many frames - either just the data in caches or the actual shader results - likely giving big benefits to things like VRS too, which should be amazing when the camera position is fixed.
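As a loose illustration of the "accumulated over many frames" point above, here is a minimal sketch (hypothetical names, nothing engine-specific): once the view stops moving, each new frame can be blended into a running average, so expensive effects converge over several frames instead of having to fit into a single 16.6 ms budget.

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

// Running average of shading results while the camera stays still:
// each new frame nudges the history toward itself with weight 1/N,
// so results converge over time instead of being redone every frame.
struct Accumulator {
    std::vector<float> history;  // one value per pixel (grayscale for brevity)
    int frames = 0;

    void reset() { frames = 0; }  // call whenever the camera moves

    void add(const std::vector<float>& frame) {
        if (frames == 0) { history = frame; frames = 1; return; }
        const float w = 1.0f / static_cast<float>(frames + 1);
        for (std::size_t i = 0; i < history.size(); ++i)
            history[i] += (frame[i] - history[i]) * w;
        ++frames;
    }
};

int main() {
    Accumulator acc;
    // Two noisy "frames" of the same static view; the average converges.
    acc.add({0.9f, 0.1f, 0.5f});
    acc.add({0.7f, 0.3f, 0.5f});
    std::printf("pixel 0 after 2 frames: %.2f\n", acc.history[0]);  // prints 0.80
}
```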
> It's deeper than that. It's just really hard for them not to have the most powerful console by a long shot. It's not enough for it to be on paper and theoretical. So this "benchmark", even though it's technically still on paper and still theoretical, is "proof".
The thing is, the XSX isn't objectively 'more powerful' based on theoretical "on paper" specs, unless the spec is conveniently constrained to just one or two GPU metrics favoring it.
I don’t think anyone really doubted the advantage. The systems being this close is an absolutely crushing outcome.
> In theory the XSX should outperform the PS5 version of games 100% of the time. When it doesn't, it raises questions.
No, not at all.
Which is why eliminating that misconception about DX makes everything even stranger. If it isn't the software, then it must be the hardware.
> The thing is, the XSX isn't objectively 'more powerful' based on theoretical "on paper" specs, unless the spec is conveniently constrained to just one or two GPU metrics favoring it.
One or two, uh?
> https://www.solidstatelogic.com/products/t25
> Is there a link between this technology and the Tempest Engine in the PS5?
Highly unlikely, going by Solid State Logic's history of being against digital mastering (before 2000), which led to a spin-off company called Oxford Digital, comprised of ex-SSL staff who believed digital was the future.
> Honestly I think that's why there's so much debate over the XSX's power. People expect it to be superior and when it isn't people question it.
Only if you look at things with a simple mindset of TF > (or select biases).
> One or two, uh?
Max theoretical floating point throughput and texel fill rate.
Which a certain engineer and many, many developers said not to rely on going forward. But we've been through this countless times over the past year; why not continue to have amnesia about it.
> For the GPU?
I just find it strange that now the only thing going for it is the CUs, I don't know. Maybe PC gamers should start buying weaker GPUs and overclocking them. I know that there are advantages on both consoles, it just comes across as weird when people refer to that advantage as only one or two.
I thought the main advantage there was the additional CUs. It could very well have more cache per CU but we don't have that information for the PS5. I know that RT scales with CUs so there's definitely an advantage there.
> I guess in situations where a game needs those extra CUs it will have an advantage. But if more pixel fillrate is required then the PS5 should perform better.
And there are a myriad of other factors like memory, CPU, and I/O latency, etc.
What I don't understand is what pixel fillrate means.
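For what it's worth, pixel fillrate is usually estimated as ROPs multiplied by GPU clock, i.e. how many pixels the render back-end can write per second. A rough sketch using the commonly cited figures (assuming 64 ROPs on both GPUs, which is the widely reported number, so treat the exact values as approximations):

```cpp
#include <cstdio>

int main() {
    // Pixel fillrate ~= ROPs * clock (pixels written per second).
    // Figures below are the widely reported ones; treat them as assumptions.
    const double ps5_rops = 64, ps5_clock_ghz = 2.23;   // PS5: up to 2.23 GHz
    const double xsx_rops = 64, xsx_clock_ghz = 1.825;  // XSX: fixed 1.825 GHz

    std::printf("PS5 : %.1f Gpixels/s\n", ps5_rops * ps5_clock_ghz);
    std::printf("XSX : %.1f Gpixels/s\n", xsx_rops * xsx_clock_ghz);
    // Roughly 142.7 vs 116.8: the same ROP count at a higher clock is why
    // the PS5 can come out ahead on this particular metric despite fewer CUs.
}
```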
> Well we are just looking at the GPUs. And the main difference between the two is the clock speed and the number of CUs. The PS5 has higher clocks while the XSX has more CUs.
I'm not trying to say anything. On PC, when I buy a 1080 it is actually more powerful than a 1070, for example. It seems that's not the case with consoles, because of the clock speed. I thought it was supposed to have more bandwidth, etc.
We then have some unknowns for the PS5's GPU, like the amount of cache.
No idea what you're trying to say, since both GPUs have RDNA2 CUs, so essentially they are the same architecture.
> The thing is, the XSX isn't objectively 'more powerful' based on theoretical "on paper" specs, unless the spec is conveniently constrained to just one or two GPU metrics favoring it.
I meant that it HAS to be more powerful by a long shot to them. They're not happy if one component is more powerful, leading to winning a head-to-head here or there, with the systems being really close. This benchmark just proves what we already knew and does nothing but put a face on one area where the X has an advantage. If, for example, scrolling through the menu faster (just for the sake of argument) proved that the throughput on the PS5 was faster and someone made a video analysis of it, it would be equally ridiculous. We already know it to be true. We don't need to prove it. It's part of the reason that it performs the way it does overall. So is the GPU on the X.
Again, I also find it strange that an advantage that was undisputed some months ago now isn't. Like the SSD in the PS5 being superior, that is undisputed. I'm not a hardware engineer, so what do I know... I was just trying to figure it out.
> It's not undisputed when you have actual proof with these comparisons. As we can see from all of them, the differences on paper don't necessarily translate to actual gameplay.
Thanks for the explanation.
Also funny that you mentioned CUs, because Nvidia GPUs tend to have a lot more than AMD ones. Yet AMD GPUs with much lower TF and CU counts perform similarly to them, unless you use RT of course. I think there's more to a GPU's performance than just CU count. Other things matter, like the number of shader arrays, the size of the cache and the clock speed.
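To put numbers on the clock-versus-CU point, the often-quoted TFLOPS figure is just CUs x 64 shader lanes x 2 ops per clock x clock speed. A quick sketch using the public specs (theoretical peak only; it says nothing about real-game performance):

```cpp
#include <cstdio>

// Theoretical FP32 throughput: CUs * 64 shader lanes * 2 ops (FMA) * clock.
static double tflops(int cus, double clock_ghz) {
    return cus * 64 * 2 * clock_ghz / 1000.0;  // Gops -> TFLOPS
}

int main() {
    std::printf("PS5 (36 CUs @ 2.23 GHz):  %.2f TF\n", tflops(36, 2.23));
    std::printf("XSX (52 CUs @ 1.825 GHz): %.2f TF\n", tflops(52, 1.825));
    // ~10.3 vs ~12.2: more CUs at a lower clock still wins on this metric,
    // but the gap is smaller than the raw CU counts (36 vs 52) suggest,
    // and it says nothing about cache, fillrate or front-end throughput.
}
```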
Well then, maybe you should apply to MS engineering since you know oh so much about hardware design.
Also, smashing? Really?
> Take into account I'm not an expert, so take what I say with a grain of salt. But it's true that CUs are not the only thing that can influence a GPU's performance.
You know, if they spent as much energy on deep diving alternative tests as to why the PS5 is performing the way it is, instead of cringy photo mode tests to support a manufactured narrative, they would probably get closer to the reasons and become enlightened themselves.
> You know, if they spent as much energy on deep diving alternative tests as to why the PS5 is performing the way it is, instead of cringy photo mode tests to support a manufactured narrative, they would probably get closer to the reasons and become enlightened themselves.
Maybe they think it's... uninteresting.
But something tells me that's not paying the bills.
> Maybe they think it's... uninteresting.
Oh, kinda like the lack of performance tests for the Halo Infinite gameplay reveal. Instead they put out a goofy deflection fluff piece about lIgHtInG.
> The Xbox guy at Hot Chips said that's the biggest misconception regarding DX. When compiling to Xbox, all that abstraction layer is removed, so all those "ifs", as you say, shouldn't be there.
Yes, as you said. The abstraction layer gets totally compiled out; the output native code is 'optimised' for the specific machine it's been built for.
> Oh, kinda like the lack of performance tests for the Halo Infinite gameplay reveal. Instead they put out a goofy deflection fluff piece about lIgHtInG.
cMoN iTs JuSt An EaRlY bUiLd MoNtHs AwAy
That leash seems to have a certain length.
> To be honest, I don't see your point.
You're talking about bandwidth usage but I'm talking about bandwidth speed. The unified architecture, from what I have understood, needs to run at the same speed because there is a unified bus for both processors, so the GPU and the CPU can't run at different speeds. It's more or less the same argument as with the CUs and the lowest common denominator for parallelism.
In your first message, you wrote: "the GPU has to wait while the CPU uses it, and vice versa, if they have to work separately". I simply said that's exactly the same on PS5 and XSX, because of interleaved data: the CPU, GPU and other parts of the system are managed in a queue with priority orders to access the memory.
"An unified environment can't use two separate speeds at the same time and can't have a minimal impact such approach. There will be surely a conflict when CPU and GPU are involved in the bandwidth in the same instant". Same, I don't see your point here, why separate speed ?? why would there be a conflict because of data read with different bandwidth?
FF16 is looking super neat and awesome
Also:
Too many awesome playstation NEWS I CANTTTT
> Oh, kinda like the lack of performance tests for the Halo Infinite gameplay reveal. Instead they put out a goofy deflection fluff piece about lIgHtInG.
I loved that educational video about lighting techniques where Alex showed how "the games you love" have so many limitations and imperfections (with hand-picked footage of Death Stranding and Horizon).
Maybe because the demo was on PC, so it wouldn't really tell you anything.
OH I SEE, FAKE RT, how Sony even recover?!11! Good thing they had Remedy on payroll!!!
> Maybe because the demo was on PC, so it wouldn't really tell you anything.
You do realize the game is releasing on PC too? And it's funny how Xbox refuses to show any of their games running on actual Xboxes.
> Why would I waste my time?
True, you wouldn't make it there, so why waste time on the CV.
> Control is an interesting case when it tries to do everything, but do we really need all the ray tracing now? If this is only software-based GI, only RT reflections are needed for full realism:
Demon's Souls is also using real-time GI and it is by far the best-looking game out so far. It's also 60 fps at 1440p. You don't need ray tracing.
Control isn't using the Geometry Engine or many other performance-efficient tools, and you can't expect much from a small studio. Insomniac managed 1080p-1600p with RT reflections at 60fps, many particles going around, and still with great city density. This is a muddy early phase of this great generation; Death Stranding would've been shockingly realistic if made around the PS5 from the ground up.
> Yes, as you said. The abstraction layer gets totally compiled out; the output native code is 'optimised' for the specific machine it's been built for.
That is not how it works on PC.
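For anyone following the compile-time versus runtime argument, here is a tiny hedged sketch (hypothetical macro and capability names, not real GDK or D3D code): on a fixed console target the capability can be a compile-time constant and the branch folds away, while a PC build has to keep the runtime query because the GPU is unknown until the game runs.

```cpp
#include <cstdio>

// Hypothetical sketch: the point is where the capability check is resolved,
// not the specific feature or macro names, which are made up here.
#if defined(TARGET_XBOX_SERIES_X)
// Fixed console hardware: the capability is a compile-time constant,
// so the "if" below can be folded away entirely by the compiler.
constexpr bool kSupportsMeshShaders = true;
#else
// PC build: the GPU isn't known until runtime, so a driver query (stubbed
// here) has to stay in the shipped code and the branch survives.
static bool QueryMeshShaderSupport() { return false; }
const bool kSupportsMeshShaders = QueryMeshShaderSupport();
#endif

int main() {
    if (kSupportsMeshShaders) {
        std::puts("mesh shader path");
    } else {
        std::puts("fallback vertex/geometry path");
    }
}
```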
> That's Remedy fuckery going on; see the same red gate reflection on the left as well. They probably went with that design then scrapped it. Seems like the game is still reading that for the reflections.
Dude, the picture posted is deceiving. You don't see the same object reflected on the ground in the middle of the screen because there is an overlay drawn above it.
> It's deeper than that. It's just really hard for them not to have the most powerful console by a long shot. It's not enough for it to be on paper and theoretical. So this "benchmark", even though it's technically still on paper and still theoretical, is "proof".
How two systems priced exactly the same and having virtually the same performance in games could be a crushing outcome for either manufacturer really is a tale for the fanboy campfires. I could understand your argument if the XSX was priced higher, but it's actually the one lowering the bill the most for gamers thanks to GP, so I fail to see anything other than a warrior's attempt to claim victory out of a draw. I'll personally wait about a year for these systems to reveal what they are really capable of.
> How two systems priced exactly the same and having virtually the same performance in games could be a crushing outcome for either manufacturer really is a tale for the fanboy campfires. ...
You can buy a PS5 for $400.
> How two systems priced exactly the same and having virtually the same performance in games could be a crushing outcome for either manufacturer really is a tale for the fanboy campfires. ...
WRONG. All you need to do is go back to just before launch. It has nothing to do with pricing. The X was supposed to be THE place for multiplats and the PS5 was going to be a nice little exclusives machine. Now there's barely any difference in head-to-heads and the PS5 still has all those exclusives. That's a HUGE fail for the Series X. You can certainly make an argument that, besides them giving away Game Pass for $1 with Gold, there is hardly a reason to own an X today or for the foreseeable future. When that loophole closes it's back to full price, and $500 for 3 years vs $180 for Ultimate. Especially with what Sony has been putting out on PS Plus.
> Game Pass is kind of a flawed argument though because bro every game is released in it. Realistically Microsoft has the cheapest next-gen system with the XSS and Sony has the one with the most value with the PS5 DE.
That's true if you only consider the hardware, but unfortunately those things need games, and just the price of Godfall's digital version at launch can secure over a year's worth of Game Pass Ultimate using all the deals available out there, so... it ain't that simple.
> WRONG. All you need to do is go back to just before launch. ...
Ok.