Tests on RDNA1 show AMD scales badly to clock but well to CU. Extrapolated to estimate XsX/PS5 differences
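For context, the paper-spec gap the extrapolation starts from comes out of the CU × clock product. A quick sketch using the announced console figures (RDNA FP32 rate = CUs × 64 lanes × 2 ops per cycle):

```python
# FP32 throughput sketch: TFLOPS = CUs * 64 shader lanes * 2 ops/cycle * GHz / 1000
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

xsx = tflops(52, 1.825)  # Xbox Series X: 52 CUs at a fixed 1.825 GHz
ps5 = tflops(36, 2.23)   # PS5: 36 CUs at up to 2.23 GHz (variable clock)
print(f"XSX {xsx:.2f} TF vs PS5 {ps5:.2f} TF ({(xsx / ps5 - 1) * 100:.0f}% gap)")
```

Whether that paper gap survives in practice is exactly what the clock-vs-CU scaling question is about.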

I guess that's one way of looking at it. The other is that Uncharted and GoW have had almost as many games as Halo, in less time.

Let's look at this year.
March had Ori (Sequel to a new IP this gen)
Bless Unleashed (MMO based off an old IP)
PSO2 (Old IP that people wanted to play)
Bleeding Edge (New IP)

How is that Halo/Gears/Forza?

I'm not even going to talk about what's coming for the rest of the year.
Cool man, looking at that list makes me wanna go grab an Xbox One.

LMFAO
 
Agreed, and the push for 70% higher resolution (stretching their GPU advantage thinner than it should be, also trying to establish the narrative of Xbox always punching above its spec sheet), as he was saying, to show off, has also bitten them in the arse, with games dropping more frames more often, even hyper-optimised games like DOOM Eternal.
Doom doesn't drop more frames, it appears to operate identically in normal gameplay at much higher resolution and is even more stable during the stress scene.



 
Actually, scaling with clocks can even be better than 100%, if the rasterizer/geometry engine is your bottleneck.
Also note that in the test below the card is likely TDP limited.
Anyhow, guru3d tests have shown:

RAM (MHz)        Clock (MHz)     Perf
12000            1600            100%
14000 (+16.7%)   1700 (+6.2%)    107%
14800 (+23.3%)   1775 (+10.9%)   111%

So perf gains are higher than clock gains; unfortunately they didn't test it without the memory OC.
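Reading the table back, a quick check (the relative-perf numbers are taken straight from the rows above) confirms that performance rose faster than the core clock, though the memory OC muddies how much to credit to the clock alone:

```python
# (core clock MHz, relative performance) rows from the guru3d table above
rows = [(1600, 1.00), (1700, 1.07), (1775, 1.11)]
base_clock, base_perf = rows[0]
for clock, perf in rows[1:]:
    clock_gain = clock / base_clock - 1  # core clock increase
    perf_gain = perf / base_perf - 1     # measured performance increase
    print(f"{clock} MHz: clock +{clock_gain:.1%}, perf +{perf_gain:.1%}")
```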


Well, note that there is no need for consoles to keep writing to that SSD. It's about reading from it a lot.
With studios claiming they are already going full raytracing, no more rasterizer - I don't see the relevance of your post. Raster graphics are dead. Everything will use some form of raytracing. So raytracing performance is the important thing, and that seems to scale very well with CU. Shows you how late Sony must've made that decision to delay PS5 a year to add RT, they didn't even have the time to upgrade to more CU for better RT performance. It's just a console that was hotfixed to at least do it, no matter how bad the implementation will be. Just somehow get to 3.5 GHz CPU, just somehow get to 10.28 TF, just somehow get RT running. And the whole system suffers from it by being expensive and unbalanced and a mess to develop for because you never know if the CPU or the GPU throttle, so you're trying to create for when they are not because it may break otherwise. I'm sure all that talk about how easy PS5 is to develop for comes from the pre-boost clocks era. Boost speeds are a joke.
 
I already watched that, he suspects there's something wrong with his HDD. Those hitches don't take place in any other analysis.

You could try again and try to actually be honest about it.

It's never happened that people argue with TLW as he gloats over a slightly higher framerate, less tearing or a stabler frame time, and people jump on the "but but the higher resolution (makes up for it)"... you lecturing about intellectual honesty is rich ;).
 
Multiplatform titles will look near identical

PS5 will have the best 1st party visuals since they have Naughty Dog, Insomniac Games, Sucker Punch, Santa Monica, Bend and Guerrilla Games

Kojima is probably making a PS5 game right now too

Best visuals are a matter of money and powerful hardware.

XsX has the most powerful hardware, they just need to throw the money and give developers time.

The first Xbox came out of nowhere with Halo, which at its time was leagues ahead of any other game. The same happened with Gears of War.

Now is their time to keep doing what they did with their first two generations of consoles.
 
I already watched that, he suspects there's something wrong with his HDD. Those hitches don't take place in any other analysis.

You could try again and try to actually be honest about it.

Wut? Are you saying that an HDD can actually affect the performance of a game? Hmmm, aren't you the one downplaying PS5's beast SSD? LMAO 🤣
 
With studios claiming they are already going full raytracing, no more rasterizer - I don't see the relevance of your post. Raster graphics are dead. Everything will use some form of raytracing. So raytracing performance is the important thing, and that seems to scale very well with CU. Shows you how late Sony must've made that decision to delay PS5 a year to add RT, they didn't even have the time to upgrade to more CU for better RT performance. It's just a console that was hotfixed to at least do it, no matter how bad the implementation will be. Just somehow get to 3.5 GHz CPU, just somehow get to 10.28 TF, just somehow get RT running. And the whole system suffers from it by being expensive and unbalanced and a mess to develop for because you never know if the CPU or the GPU throttle, so you're trying to create for when they are not because it may break otherwise. I'm sure all that talk about how easy PS5 is to develop for comes from the pre-boost clocks era. Boost speeds are a joke.

Do you even take a pause to breathe in there?

Seriously though, both companies are using the same RDNA2 architecture and RT solution, and the 14% delta for FLOPS is the same regardless of ray tracing. Considering that on an RTX 2080 Ti Quake II is the top end of fully ray-traced visuals, and even that one uses reconstruction and ML to denoise the image (not enough rays), you must indeed be from the year 2040 ;).
 
With studios claiming they are already going full raytracing, no more rasterizer - I don't see the relevance of your post. Raster graphics are dead. Everything will use some form of raytracing. So raytracing performance is the important thing, and that seems to scale very well with CU. Shows you how late Sony must've made that decision to delay PS5 a year to add RT, they didn't even have the time to upgrade to more CU for better RT performance. It's just a console that was hotfixed to at least do it, no matter how bad the implementation will be. Just somehow get to 3.5 GHz CPU, just somehow get to 10.28 TF, just somehow get RT running. And the whole system suffers from it by being expensive and unbalanced and a mess to develop for because you never know if the CPU or the GPU throttle, so you're trying to create for when they are not because it may break otherwise. I'm sure all that talk about how easy PS5 is to develop for comes from the pre-boost clocks era. Boost speeds are a joke.

Apart from a few things in bold, there's so much baseless speculation and obviously incorrect information.
 
Of course it didn't scale well in their tests; RDNA1 can barely cross 2 GHz.

BTW, an RX 5700 with an OC reaches/beats in performance the RX 5700 XT, which has more CUs.

"HOW FAST IS AN UNLOCKED AMD RX 5700?
With the unlocked overclock in place, and running perfectly stable, we were able to see gaming performance on par with the RX 5700 XT. And in some cases it was actually slightly outperforming what we'd seen in Navi benchmarks."
 
How it's gonna be:

- PS5 runs at 1800p what runs at 2100p on Xbox
- framerate the same
- loading/pop-in/LoD/object density and OS snappiness 2.5x faster on PS5

No, you are scaling the 16% wrong, just on one axis; resolution is an area...

It's 2000p vs 2160p with no dynamic resolution, if that fits your narrative?
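The area point can be made concrete. Spreading a compute gap across both axes means the per-axis resolution gap is roughly the square root of it; using the paper TFLOPS figures as a stand-in for the gap:

```python
import math

xsx_tf, ps5_tf = 12.15, 10.28       # paper compute figures
area_ratio = xsx_tf / ps5_tf        # pixel count scales with compute (~1.18x)
axis_ratio = math.sqrt(area_ratio)  # per-axis scale (~1.09x)
print(f"2160p on XSX ~ {2160 / axis_ratio:.0f}p on PS5 at matched per-pixel load")
```

That lands near the ~2000p figure, versus the ~1800p you would get by wrongly applying the full gap to one axis.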
 
30% difference?

Lol. This is not RDNA1. It's RDNA 2.

There's no doubt it'll be stronger, but that's just silly. With RDNA1 it wouldn't even be possible to run at 2.23 GHz in a closed box, for starters.

Remember what Cerny said about CUs in RDNA 2. The difference could be more significant under RDNA 2.
 
LOL, try reading a GPU specialist in optimising GPU throughput from another site and learn something. Something like 2160p vs 2000p, as speed and other things make a difference, or 90% vs 80% at 4K60, more likely with variable resolution scaling.

Both are bandwidth-per-CU limited; try dividing bandwidth by the CUs it has to feed and come back!

Or are we now believing GamingBolt lol
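The division the post is pointing at is simple enough to write down; the figures are the announced peak numbers (XSX's 560 GB/s applies only to its 10 GB fast pool):

```python
# Peak memory bandwidth divided across compute units
xsx_per_cu = 560 / 52  # XSX: 560 GB/s (10 GB fast pool) / 52 CUs
ps5_per_cu = 448 / 36  # PS5: 448 GB/s (unified pool) / 36 CUs
print(f"XSX: {xsx_per_cu:.1f} GB/s per CU, PS5: {ps5_per_cu:.1f} GB/s per CU")
```

On that crude measure each PS5 CU has more bandwidth behind it, though real contention depends on the CPU's share and on access patterns.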


[screenshot]

[screenshot]
Can you post a link?
 
These shills have no shame; the amount of delusion it takes to write this kind of tripe lol....
Who wrote that thing? Thelastword?
That is so wrong on so many points I wouldn't know where to start. The whole article is written on the premise that, as the PS5 has a dedicated audio chip and the XSX doesn't, 20% of the XSX GPU is going to be taken up by audio, which will reduce the XSX's power to 9.x TFLOPS, less than the PS5 @ 10.2.

Yet even a moron knows the XSX also has a dedicated audio chip.
From an interview with Ninja Theory developer.

Xbox Series X will have a dedicated audio chip. "It's extremely exciting," senior sound designer Daniele Galante said of the new console. "We're going to have a dedicated chip to work with audio, which means we finally won't have to fight with programmers and artists for memory and CPU power."
 
Listen to the video. Xbox Velocity Architecture will allow 100GB to be available instantly from the SSD. Plenty of room for devs. Sony's SSD is faster but it lacks in other areas. This is all just theory right now. Show me a game that can be made on one SSD and not the other.
What the fuck does "instantly available' even mean in this context?

That the Project Velocity architecture has infinite bandwidth? I mean if that were the case, why even bother using RAM? The magic velocity architecture gives you infinite bandwidth and 1TB volume.

The fundamental fact of the matter is, the SSD in the Series X has a raw bandwidth of 2.4 GB/s and a compressed bandwidth of 4.8 GB/s.

This is quite literally 2 orders of magnitude slower than their GDDR6 RAM, which is 336GB/s at its slowest and 560GB/s at its fastest.

Doesn't matter how much of the SSD Microsoft allocates as a VRAM cache; it's still going to be 200 times slower than the actual RAM.
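The orders-of-magnitude claim checks out on the announced figures:

```python
# Announced Series X figures: SSD vs GDDR6 bandwidth
ssd_raw, ssd_comp = 2.4, 4.8       # GB/s, raw and typical compressed
ram_slow, ram_fast = 336.0, 560.0  # GB/s, 6 GB pool and 10 GB fast pool
print(f"Fast RAM is {ram_fast / ssd_raw:.0f}x the raw SSD rate, "
      f"{ram_fast / ssd_comp:.0f}x the compressed rate")
```

So even against the compressed rate, the RAM is over a hundred times faster; the SSD's job is capacity and streaming, not substituting for RAM.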
 
What the fuck does "instantly available' even mean in this context?

That the Project Velocity architecture has infinite bandwidth? I mean if that were the case, why even bother using RAM? The magic velocity architecture gives you infinite bandwidth and 1TB volume.

The fundamental fact of the matter is, the SSD in the Series X has a raw bandwidth of 2.4 GB/s and a compressed bandwidth of 4.8 GB/s.

This is quite literally 2 orders of magnitude slower than their GDDR6 RAM, which is 336GB/s at its slowest and 560GB/s at its fastest.

Doesn't matter how much of the SSD Microsoft allocates as a VRAM cache; it's still going to be 200 times slower than the actual RAM.

You reminded me about this lol... I completely agree nothing is "instantaneous".


[image]


I imagine both systems will work similarly to "gigasampler". So long as the streaming source can keep up with your throughput needs, you hide the latency by using RAM for the first part of every texture (or sample). Rather than a traditional cache, where you put a whole file into RAM: if the disk latency is 1 ns, you load the first 1 ns worth (however much the GPU can process in 1 ns) of every texture into RAM; then, by the time that has been used, your input buffer is filling up from the SSD.

You're probably thinking: but memory is 500 GB/s and the SSD is 5 GB/s, and each pixel shader will perform many reads and writes, all of which use up memory bandwidth. But you usually only need to read the texture once.

There are two important values here: the latency as well as the SSD throughput.

If the latency is lower, you can have more textures precached (because you don't need to cache as much for each one).

If the bandwidth is higher, then you can stream higher quality textures.

It will be interesting to see how this all pans out!
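The head-of-texture idea above can be put into numbers. A minimal sketch, with illustrative latency and consumption figures rather than real console specs:

```python
# Latency hiding for streamed textures: keep only the first chunk of each
# texture resident, sized to cover one SSD access latency at the GPU's
# consumption rate. All numbers are illustrative, not console specs.
def resident_head_bytes(consume_gb_per_s, ssd_latency_us):
    """Bytes the GPU consumes while one SSD read is still in flight."""
    return consume_gb_per_s * 1e9 * ssd_latency_us / 1e6

head = resident_head_bytes(consume_gb_per_s=5.0, ssd_latency_us=100)
print(f"~{head / 1024:.0f} KiB resident per streamed texture")
```

Lower latency shrinks the resident head per texture (so more textures fit precached), while higher throughput raises the quality you can stream, matching the two values the post calls out.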
 
Best visuals are a matter of money and powerful hardware.

XsX has the most powerful hardware, they just need to throw the money and give developers time.

The first Xbox came out of nowhere with Halo, which at its time was leagues ahead of any other game. The same happened with Gears of War.

Now is their time to keep doing what they did with their first two generations of consoles.
You also need good developers and Sony has the best
 
Well, a lot of devs from Sony studios already left and went to MS studios. MS now has like 15 studios.
We will see how it will be next gen. Things change, you know?
Not a lot. If I had to guess, 7-10, but Sony has thousands of developers and they also get talent from other studios too.

We just never hear about it because only Xbox fans brag about getting staff from big name studios

I bet most people didn't know this guy from Rockstar is at PlayStation

[image]
 
Also comparison of clock scaling between different architectures is pointless. Just as pointless as comparing TFLOPs between different architectures.
 
What?
Are you from year 2040, son?

In an interview centred on 4A Games' excellent work in bringing Metro Redux to Switch, we asked CTO Oles Shishkovstov for his reaction to publicly revealed aspects of next generation console hardware. "We are fully into ray tracing, dropping old-school codepath/techniques completely," he told us.

 
If the difference is 20-30% in the real world rather than 7%... that's going to be VERY noticeable.

If the PS5 can do everything the One X can do, just at 3200x1800 instead of 4K, I don't think it is going to be very noticeable.

Most One X games run at 1800p or lower and I don't see complaints.
 
Well, they showed it and it looks mediocre, so I am not sure where you are getting this information. Cutscenes obviously don't count.
Naughty Dog brings top-tier visuals every time they release a game. This is from Uncharted 4, done on 1.84 TFLOPS btw

[Uncharted 4 screenshots]


Never doubt ND :)
 
The SSD in the PS5 is 125% faster than XSX's off-the-shelf solution.
They're not even the same SSD generation tech-wise; compatible drives for PS5 aren't even out yet.

Stop it. I'll give you a wet kiss and tell you how pretty you are if you don't stop it.

So I can buy an XSX SSD right now? From any 3rd-party?

PS5's internal SSD is replaceable with approved 3rd-party drives. By that notion it's also off-the-shelf. The flash NAND memory controller is what's actually custom to it, not the SSD itself.
 
So I can buy an XSX SSD right now? From any 3rd-party?

PS5's internal SSD is replaceable with approved 3rd-party drives. By that notion it's also off-the-shelf. The flash NAND memory controller is what's actually custom to it, not the SSD itself.
Are there any approved 3rd-party drives for PS5 right now?
 
Are there any approved 3rd-party drives for PS5 right now?

Same difference. You've got someone throwing around one approach being off-the-shelf when they don't even understand what the phrase means, and I asked a question to see how what they said held up. It doesn't.

Of course you can't buy a 3rd-party drive for PS5 right now, but the fact some people don't understand that the customization is on the memory controller, not the drive itself, and they only want to insinuate one system going with an off-the-shelf approach is kind of suspect.

It's the same as people saying we don't know everything on PS5 yet. That's true, but there are still XSX technical details we don't know about yet either. But some people operate on the assumption that XSX has no further technical details to be revealed. It probably has more to do with the fact Sony's only just recently started talking PS5 specs in detail, so it feels like MS has exhausted through a lot of theirs by comparison since they started talking earlier in that level of detail.
 
In an interview centred on 4A Games' excellent work in bringing Metro Redux to Switch, we asked CTO Oles Shishkovstov for his reaction to publicly revealed aspects of next generation console hardware. "We are fully into ray tracing, dropping old-school codepath/techniques completely," he told us.


Guys,

As someone from fucking 2020, I gotta tell you that full-frame ray tracing needs 100 times more rays processed per second and, as a side effect, about as large a RAM throughput boost.
Unless there is a major breakthrough from nowhere, we are not getting close to it even 10 years from now.
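A back-of-envelope budget shows why a figure of that order is plausible. The samples-per-pixel and rays-per-sample counts below are illustrative assumptions for a clean, undenoised image, not anything from the thread:

```python
# Rough ray budget for fully path-traced 4K60 vs ~2020-class RT hardware.
# spp and rays_per_sample are illustrative assumptions, not measured values.
pixels = 3840 * 2160
fps = 60
spp = 256                # samples per pixel for a clean image without denoising
rays_per_sample = 8      # path segments plus shadow rays per sample
rays_needed = pixels * fps * spp * rays_per_sample  # rays per second
hw_budget = 10e9         # ~10 gigarays/s, RTX 2080 Ti-class marketing figure
print(f"~{rays_needed / 1e12:.1f} Trays/s needed, ~{rays_needed / hw_budget:.0f}x the budget")
```

Denoised 1-2 spp rendering (as in Quake II RTX) is exactly the shortcut used to dodge that gap today.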
 
I love watching some of these threads lol especially the like button, the hype men(and wom(a)n) for the XSX are out in full force in this one. 😂
 
I was replying to the guy that said the gap will be big if Hellblade 2 is anything to go by

I know we haven't seen any games from Sony; if I were them I would've shown a few trailers already, because the moment they do, a lot of this concern will die off.

It's not just games. Sony hasn't actually shown anything. These high clocks won't work without proper cooling, and to the extent they have refused to even share the concept of their cooling plan, we have zero evidence those high clocks will hold. Cerny not only failed to deliver a complete physical console to DF to review; he failed to even deliver a complete blueprint. This is smoke and mirrors right now. Marketing and nothing more.
 
It's not just games. Sony hasn't actually shown anything. These high clocks won't work without proper cooling, and to the extent they have refused to even share the concept of their cooling plan, we have zero evidence those high clocks will hold. Cerny not only failed to deliver a complete physical console to DF to review; he failed to even deliver a complete blueprint. This is smoke and mirrors right now. Marketing and nothing more.
Because they're obviously not ready to fully reveal the console and there could be different reasons for that. At least we know it'll be powerful and PlayStation brings the games every generation :)
 
Wut? Are you saying that an HDD can actually affect the performance of a game? Hmmm, aren't you the one downplaying PS5's beast SSD? LMAO 🤣
The HDD isn't affecting the performance of the game, at least not how you're trying to make it appear. The I/O in the Xbox One X is broader than in the PlayStation 4 Pro, and neither has a problem streaming in game content from the memory interface or their HDDs, as is clearly apparent from other analyses of this game.

John's drive appears to be failing, his specific drive; the One X itself as a console is not at fault. Is this the best BS you guys can muster?
 