Next-Gen PS5 & XSX |OT| Console tEch threaD

When you are moving at very high speed, revealing more geometry onto the screen is way more demanding than a cut to a new frame; there's no way around this, my guy.

There were many other cuts that were more demanding, like the cut to the 500 statues, and they didn't have any pop-in at all; surely those are more demanding than a cut to a wall.
If you have technical knowledge and think flying is more demanding than a camera cut in a system streaming to the next frame, you should think about it for a few minutes.
 
Firstly, please laugh a little, I really wasn't being terribly serious. This is the speculation and leaks thread - anyone who claims to know the objective truth here is full of shit - and that includes me :)
Secondly, I actually said above I think PS5's hardware approach to I/O is going to come out ahead - software can only do so much if the hardware isn't there. However, I do think the gap in I/O performance is smaller than it appears on paper.

As for the rest of the machine, I think Microsoft have knocked it out of the park, and I think it's going to show - but it all comes down to price. If the Series X is $599 and the PS5 is $399, I'd say Sony built the superior machine. We're all "wait and see" for that aspect.

Interestingly, you keep mentioning that Microsoft has the SW advantage, ignoring that Sony has the HW advantage and has been designing consoles for longer.
 
I'm not sure what I'm responding to, frankly? I mean, DX12U is a multi-platform API. If it was fully customised for Series X... then it wouldn't be multi-platform any more. But, that's kind of the point. The name "Xbox" is a contraction of "DirectX Box", which was the informal name used to describe the console while it was in development. Xboxes are literally built to run DirectX. This isn't unique to this generation - the Xbox 360, for example, used DirectX 9.0c, though I can't recall what version of DirectX the OG used. It's pretty common knowledge that the Xbox One has actually used different versions of DirectX, with DX12 being the most recent. So... yeah?
There is a difference between the previous consoles and DirectX.

The original Xbox, 360 and Xbox One all use a customized DirectX API specific to those consoles... there are functions and features that don't work on PC.

That changes with DirectX 12 Ultimate... like the name says, the API is the same no matter the platform... all features and functions are common between PC, Xbox One, Series X, etc... every platform that has that API.

That is the sole reason MS released DirectX 12 Ultimate instead of just adding a new tier to DirectX 12.
 

It should be noted that Tim Sweeney has discussed the PlayStation 5 since that has been the company's primary focus, as Epic has been working for 'quite a long time' alongside Sony on that Unreal Engine 5 demo. However, the Xbox Series X also features its own Velocity Architecture which is designed to 'radically improve asset streaming' and to 'effectively multiply available memory' thanks to a custom NVMe SSD (rated at 2.4 GB/s with raw data), a dedicated hardware decompression block, the new Sampler Feedback Streaming technology and last but not least, the new DirectStorage API.

The latter component is particularly relevant because Microsoft already confirmed plans to bring it to Windows PC, too. According to Microsoft, DirectStorage can significantly reduce CPU overhead for I/O operations (such as those happening in the background to load the next parts of the world) from several cores to a small fraction of a single core. Needless to say, this could severely diminish the PC I/O issues mentioned above by Tim Sweeney.
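To illustrate the kind of per-request CPU cost DirectStorage is meant to cut down, here is a purely conceptual C++ sketch (this is not the DirectStorage API; every name here is invented): gameplay code just pushes asset-read requests onto a queue and a single worker drains them, so the cost on the game thread is a queue push rather than a blocking read with all the associated kernel round-trips.

```cpp
// Conceptual sketch only -- NOT the DirectStorage API. All names are made up.
// Idea: gameplay code enqueues many small asset reads; one worker thread
// drains the queue, so per-request CPU involvement on game threads stays tiny.
#include <condition_variable>
#include <fstream>
#include <functional>
#include <iterator>
#include <mutex>
#include <queue>
#include <string>
#include <thread>
#include <vector>

struct ReadRequest {
    std::string path;                                   // file to read
    std::function<void(std::vector<char>)> onComplete;  // called off the game thread
};

class StreamingQueue {
public:
    StreamingQueue() : worker_(&StreamingQueue::run, this) {}
    ~StreamingQueue() {
        { std::lock_guard<std::mutex> lk(m_); done_ = true; }
        cv_.notify_one();
        worker_.join();
    }
    // Cheap for the caller: just push a request and return.
    void enqueue(ReadRequest req) {
        { std::lock_guard<std::mutex> lk(m_); q_.push(std::move(req)); }
        cv_.notify_one();
    }
private:
    void run() {
        for (;;) {
            ReadRequest req;
            {
                std::unique_lock<std::mutex> lk(m_);
                cv_.wait(lk, [&] { return done_ || !q_.empty(); });
                if (done_ && q_.empty()) return;
                req = std::move(q_.front());
                q_.pop();
            }
            // Synchronous read on the worker; real systems would batch and
            // hand compressed blocks to a hardware decompressor.
            std::ifstream f(req.path, std::ios::binary);
            std::vector<char> data((std::istreambuf_iterator<char>(f)),
                                   std::istreambuf_iterator<char>());
            req.onComplete(std::move(data));
        }
    }
    std::mutex m_;
    std::condition_variable cv_;
    std::queue<ReadRequest> q_;
    bool done_ = false;
    std::thread worker_;
};
```

A real implementation would also batch submissions to the drive and decompress in hardware; the sketch only shows why the per-request CPU involvement can shrink to a fraction of a core.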


Let them Dream?
Leave it to wccftech to take an interview where someone talks about PS5 and turn it into an Xbox ad. Shilling has never been this obvious.
 
If you have technical knowledge and think flying is more demanding than a camera cut in a system streaming to the next frame, you should think about it for a few minutes.
OK, let's forget about the fly-through thing (even though it's a strong example, this evidence apparently isn't enough of a confirmation for you).

There were some other camera cuts, like the one that revealed the 500 statues; what's up with that? Those are surely much more demanding on the system than a cut to a wall, yet they didn't have any "pop-in". Come on man, you can't get out of this. 😂🤣🤣🤣

I'm talking about this one. 🤣🤣🤣🤣


[image: nREGS6A.jpg]
 
Why not?
Seems like most companies like to take advantage of gullible people.
I mean you can see it everywhere all day.

That's actually a real problem in modern society.
People don't even try to understand what some company tells them, they just go "yay, sounds good, take my money".
Then of course they start to fight for their product, which they paid for.

Been there, done that in my youth.
Quite glad I'm not so gullible and invested anymore.
Being skeptical about stuff is a healthy attitude, as long as you don't overdo it like the "Flat-Earther", "Corona hoax" or "Deep State / QAnon" kind of people.
Sorry for the off-topic, but please don't bring conspiracy theories into this thread. Some theories are indeed interesting and maybe even true, but here we are talking about next-gen consoles and you are provoking people into talking about them.
 
If you have technical knowledge and think flying is more demanding than a camera cut in a system streaming to the next frame, you should think about it for a few minutes.
You can also afford to stream in assets with way less detail if the camera is flying past them at high speed. Most people won't notice, but the large room full of statues is where the engine really stressed the system IO.
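To make that concrete, a streaming heuristic along these lines could look roughly like this (illustrative C++ only; the thresholds and names are invented and have nothing to do with the actual UE5 demo):

```cpp
// Illustrative heuristic only: drop requested detail as the camera speeds up,
// since fast motion (plus motion blur) hides the missing detail anyway.
#include <algorithm>

// Returns extra mip/LOD levels to drop based on camera speed in metres/second.
// Thresholds are invented for illustration.
int lodBiasForCameraSpeed(float metresPerSecond) {
    if (metresPerSecond < 10.0f)  return 0;  // walking pace: full detail
    if (metresPerSecond < 50.0f)  return 1;  // fast traversal: one level lower
    if (metresPerSecond < 200.0f) return 2;  // fly-through: noticeably coarser
    return 3;                                // extreme speed: coarsest tier
}

int requestedLod(int baseLodFromDistance, float cameraSpeed) {
    constexpr int kCoarsestLod = 7;  // assume an 8-level detail chain
    return std::min(kCoarsestLod,
                    baseLodFromDistance + lodBiasForCameraSpeed(cameraSpeed));
}
```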
 
OK, let's forget about the fly-through thing (even though it's a strong example, this evidence apparently isn't enough of a confirmation for you).

There were some other camera cuts, like the one that revealed the 500 statues; what's up with that? Those are surely much more demanding on the system than a cut to a wall, yet they didn't have any "pop-in". Come on man, you can't get out of this. 😂🤣🤣🤣

I'm talking about this one. 🤣🤣🤣🤣


[image: nREGS6A.jpg]
There is pop-in there, but it is minimal and barely noticeable. You have to really zoom in to notice anything at all.
 
If you have technical knowledge and think flying is more demanding than a camera cut in a system streaming to the next frame, you should think about it for a few minutes.

How great is it that, finally, gameplay-to-realtime-cutscene transitions will not only always be seamless, but cuts between scenes in completely different locations will be as seamless as in a movie.

It will be great to actually never have a camera cut in God of War too.
 
There isn't a more demanding scene; this is right after a camera cut. A 100% fresh frame is the hardest work the SSD can face in a system that streams for the next frame. It's easier to stream something flying past at a million km/h than a camera cut in a next-frame streaming system.
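For a rough sense of the budget involved (my own back-of-envelope arithmetic using the publicly quoted 5.5 GB/s raw figure for the PS5's SSD, not numbers from the demo): a fly-through spreads its loads across many frames, while a hard cut has to fit everything new into one or two of them.

```cpp
// Back-of-envelope only: how much raw data fits in one frame at a given SSD rate.
#include <cstdio>

int main() {
    const double ssdRawGBps   = 5.5;         // publicly quoted PS5 raw figure
    const double frameSeconds = 1.0 / 30.0;  // 30 fps target
    const double mbPerFrame   = ssdRawGBps * 1024.0 * frameSeconds;
    std::printf("~%.0f MB of raw data per 33 ms frame\n", mbPerFrame);  // ~188 MB
    return 0;
}
```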

And you too are missing the point, which is funny. What do you think is my point? That the PS5 SSD is bad? Did my whole post go over your head?

You should know it by now: anything, literally anything, that you say, not even negative, but in any way questioning the brilliance of Mr. Cerny or Mr. Sweeney, and you will burn 🔥
 
Sony just revealed the TLOU 2 PS4 Pro limited edition and it looks so good! ❤❤❤❤

Probably one of the best limited edition PS4s ever IMO, I cannot freakin' wait for The Last of Us Part 2 and to experience this masterpiece!

I just want Microsoft to release games like TLOU 2 and GoW, masterpieces that will be remembered for years to come.
 
I rely on technical knowledge of software, having been a developer for many years, to write about this. No, I have no particular feelings about this, except when borderline lies are being used to convince less technical people, like yourself, about some technology.

I agree, but those lies are not MS's.
They are not responsible for the worst people on the internet.
The same applies to Sony (e.g. RDNA 3).
 

It should be noted that Tim Sweeney has discussed the PlayStation 5 since that has been the company's primary focus, as Epic has been working for 'quite a long time' alongside Sony on that Unreal Engine 5 demo. However, the Xbox Series X also features its own Velocity Architecture which is designed to 'radically improve asset streaming' and to 'effectively multiply available memory' thanks to a custom NVMe SSD (rated at 2.4 GB/s with raw data), a dedicated hardware decompression block, the new Sampler Feedback Streaming technology and last but not least, the new DirectStorage API.

The latter component is particularly relevant because Microsoft already confirmed plans to bring it to Windows PC, too. According to Microsoft, DirectStorage can significantly reduce CPU overhead for I/O operations (such as those happening in the background to load the next parts of the world) from several cores to a small fraction of a single core. Needless to say, this could severely diminish the PC I/O issues mentioned above by Tim Sweeney.


Let them Dream?

Someone else here explained this quite well: MS and Sony are approaching the same issue (I/O bottlenecks) a bit differently, one with a hardware/software mix more focused on software (MS), and the other with a hardware/software mix more focused on hardware (Sony). We need to wait and see which one works out better; it might well end up with the difference in real-world scenarios being negligible.
 
I agree, but those lies are not MS's.
They are not responsible for the worst people on the internet.
The same applies to Sony (e.g. RDNA 3).
Eh? Sony have never said they are using specific features from RDNA 3, what are you talking about? Why are you mixing "worst people on the internet" into this?
 
I agree, but those lies are not MS's.
They are not responsible for the worst people on the internet.
The same applies to Sony (e.g. RDNA 3).
Are you calling this guy a liar?

At 11:45.

The irony is that he is an outspoken Xbox fan. The harassment he went through because of that video is ridiculous.
 
Someone else here explained this quite well: MS and Sony are approaching the same issue (I/O bottlenecks) a bit differently, one with a hardware/software mix more focused on software (MS), and the other with a hardware/software mix more focused on hardware (Sony). We need to wait and see which one works out better; it might well end up with the difference in real-world scenarios being negligible.


Software is essentially limited by what the hardware can do. Basically if an SSD is physically designed to be really fast the software can take advantage of that speed.

Also, software can always be added or improved later on, so a lot of the software features that Sony has in their I/O can get better. The physical hardware can't change, though.
 
Sorry, it's the other way around: you are the one who needs to prove that high-quality lossy compression like BC7 will cause a real problem with micropolygons. Considering BC is one of the most popular methods for compressing textures in games, and one of the largest console makers decided to have a hardware unit dedicated just to it, I'm pretty sure the burden of proof is on you.


It's easy, there you go.
Before - when not enough polygons have streamed in:
[image: o7y4BgC.png]



After - when enough polygons have streamed in:
[image: ndDIZgp.png]


Took me 30 seconds; I took these screenshots myself.

If you think that's streaming encoding, you don't know what streaming encoding artifacts look like. Saying I'm "splitting hairs" on the PS5 is losing touch with the actual argument we are having. My point was never against the PS5; it was actually the other way around: this system streams in polygons and textures for the next frame, and if it fails, all you see is a lower-quality asset for a fraction of a second, so all of these problems will be worse on the XSX. But for some reason, you've decided to go with a strawman, that somehow I'm attacking the PS5 for having some form of extremely gentle pop-in during 0.01% of the demo. That is not the point; the point was to show you what happens when the SSD can't keep up (which happened in the images I posted because of a camera cut).

I agree with you on the streaming speed; the SSD is just about fast enough, and there's no way the SSD is sitting idle or barely being used in this demo. This was pointed out in a Naughty Dog dev's tweet a few days ago.

Also, with the mesh data split into lots of tiles, the debate is about lossless compression of these smaller items, and that data cannot be compressed lossily, as vertices are not textures......

Maybe that's why Sony chose Kraken: it's not just about textures, it's about fast compression and decompression, and about being lossless on things other than large textures -

Looking forward to Epic's detailed breakdown. I am sure there will be surprises, and it makes you wonder how long Cerny has really been aware of this type of application in the PS5 design....mmm
 
Someone else here explained this quite well: MS and Sony are approaching the same issue (I/O bottlenecks) a bit differently, one with a hardware/software mix more focused on software (MS), and the other with a hardware/software mix more focused on hardware (Sony). We need to wait and see which one works out better; it might well end up with the difference in real-world scenarios being negligible.
Maybe. But I'm thinking MS is PR'ing this to the max, trying to deflate the I/O strides made by Sony.

I very much suspect MS of omitting a lot of information about their miraculous software triumph. The figures quoted could be isolated to a very specific example and nowhere near it in the real world.

As much as I'm an optimist, I'm also a skeptic of miracle cures. It's just too good to be true, with nothing technical backing it bar shallow PR lines being parroted by the media and shills alike. Show me the money!
 
Maybe that's why Sony chose Kraken: it's not just about textures, it's about fast compression and decompression, and about being lossless on things other than large textures -

What's nice about Kraken is that it decompresses everything, while on the XSX developers have to use BCPack in combination with zlib.

Maybe this helps make development a little easier, since they only have to worry about one decompression path instead of two.
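As a purely illustrative sketch of why a single path is simpler (invented names, not the real Kraken, BCPack or zlib interfaces): with two codecs every asset has to be tagged with which decoder it needs and both paths have to be kept working, while with one codec everything takes the same route.

```cpp
// Illustrative only -- invented names, NOT the real Kraken/BCPack/zlib APIs.
#include <cstdint>
#include <stdexcept>
#include <vector>

using Blob = std::vector<std::uint8_t>;

// Stand-ins for real decoders (here they just pass the data through).
Blob decodeGeneral(const Blob& in) { return in; }  // e.g. a general-purpose codec
Blob decodeTexture(const Blob& in) { return in; }  // e.g. a texture-specific codec

// Two-codec pipeline: every asset carries a tag, and both paths must be maintained.
enum class Codec { General, Texture };

Blob decompressTwoPath(Codec codec, const Blob& in) {
    switch (codec) {
        case Codec::General: return decodeGeneral(in);
        case Codec::Texture: return decodeTexture(in);
    }
    throw std::logic_error("unknown codec");
}

// Single-codec pipeline: no tagging, one path for everything.
Blob decompressOnePath(const Blob& in) {
    return decodeGeneral(in);
}
```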
 
If PS5 I/O wasn't a big deal, you wouldn't have devs raving about it, the UE5 demo debuting on PS5 instead of PC, and a guy like Matt, who has access to both devkits, describing the difference between the I/O in the PS5 vs the XSX as "significant".

MS's software solutions will work about as well as DirectX worked this gen at bridging the gap between the PS4 GPU and the Xbox One GPU. It is what it is.

Those who don't want to accept it are suffering from FOMO. The good news is, multiplatform development will not only have to contend with PS5 and XSX, but also PCs with basic SSDs, so nothing has changed for the Xbox gamer, only for the PlayStation gamer.
 
someone else here explained this quite good, with ms and Sony approaching the same issue (IO bottlenecks) a bit differently, one with hardware/software mix more focused on software (ms), and the other with hardware/software mix more focused on hardware (Sony). We need to wait and see which one works out better, might as well end with the difference in real world scenarios being negligible.
Although this is true, I think MS has more hardware to alleviate the bottlenecks than people realize. For some reason, people keep dismissing the 2x - 3x effective bandwidth and RAM multiplier with SFS, which uses a hardware filter, rather than software. I haven't seen any legitimate reason as to why this is dismissed.
 
Are you calling this video a fact?
His video is just a video. He is talking about a rumor for which a precedent exists, a rumor which some very sick people (as you point out) twisted into a lie and used to attack others, starting with him.
 
OK, let's forget about the fly-through thing (even though it's a strong example, this evidence apparently isn't enough of a confirmation for you).

There were some other camera cuts, like the one that revealed the 500 statues; what's up with that? Those are surely much more demanding on the system than a cut to a wall, yet they didn't have any "pop-in". Come on man, you can't get out of this. 😂🤣🤣🤣

I'm talking about this one. 🤣🤣🤣🤣


[image: nREGS6A.jpg]

I'm assuming there is pop-in at every camera cut. I'll check when I'm near a computer. In theory every scene is made out of ~20M polygons, so it doesn't matter what you are looking at. Actually, the statues might even be an easier task for the system than that wall, because it's the same asset repeating over and over again, so the system needs to pull most of the data from the same asset.
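To put the "same asset repeating" point in concrete terms (an illustrative sketch only, not how Nanite or the PS5 streamer actually works): if geometry is cached by asset ID, 500 instances of one statue cost a single load plus 500 cheap references, whereas 500 unique meshes would each need their own read.

```cpp
// Illustrative only: a streaming cache keyed by asset ID, so repeated instances
// of the same mesh (e.g. 500 copies of one statue) trigger a single load.
#include <cstdint>
#include <memory>
#include <string>
#include <unordered_map>
#include <vector>

struct MeshData {
    std::vector<float> vertices;  // placeholder payload
};

class MeshCache {
public:
    // Returns a shared handle; loads from "disk" only on the first request.
    std::shared_ptr<const MeshData> acquire(const std::string& assetId) {
        auto it = cache_.find(assetId);
        if (it != cache_.end()) return it->second;  // already resident: no I/O
        auto mesh = loadFromDisk(assetId);          // one real load per unique asset
        cache_.emplace(assetId, mesh);
        return mesh;
    }
    std::size_t uniqueLoads() const { return cache_.size(); }
private:
    static std::shared_ptr<const MeshData> loadFromDisk(const std::string&) {
        return std::make_shared<MeshData>();        // stand-in for a real read
    }
    std::unordered_map<std::string, std::shared_ptr<const MeshData>> cache_;
};

// After acquiring "statue" 500 times, cache.uniqueLoads() is still 1.
```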

How great is it that, finally, gameplay-to-realtime-cutscene transitions will not only always be seamless, but cuts between scenes in completely different locations will be as seamless as in a movie.

It will be great to actually never have a camera cut in God of War too.
I'm sure SM would have killed to have the PS5's SSD for GOW!
 
when sony lost the TF fight, you knew Sony fanboys would come back hard

There was never a "fight", just technical and scientific data. Teraflops lack explanatory scope and power, especially after the UE demo. Did you view the UE demo? What do you think? The narrative now seems to be focused on the "faster SSD". Ironically... it seems as though Microsoft has "lost the TF fight" lol.

[image: giphy.gif]


Remember what you said.

My stomach hurts from laughing at this :messenger_tears_of_joy:!
 
His video is just a video. He is talking about a rumor for which a precedent exists, a rumor which some very sick people (as you point out) twisted into a lie and used to attack others, starting with him.

If the precedent is packed math... it's only half true to call that a precedent.
I'm tempted to say SFS is to software what packed math is to hardware:
something that has existed for a long time that they decided to push at some point.
 
A few thoughts on the design of the final PS5: based on all the information available to date, I strongly believe we will see a black and white PS5 with a V-shape that looks very similar to the devkit.

Reasons:
1. The color of the DualSense. In all previous PlayStation consoles, the console and controller had the same colors (except for special editions). A PS5 only in white or black would look strange with this DualSense.

2. The cooling solution. It was often said that Sony opted for an interesting and better cooling solution. Since the heat output of the devkit is said to have improved, I assume this was addressed through the devkit revisions. I therefore consider it very likely that the final console will end up looking similar, to keep the shape optimized for cooling.
 
Although this is true, I think MS has more hardware to alleviate the bottlenecks than people realize. For some reason, people keep dismissing the 2x - 3x effective bandwidth and RAM multiplier with SFS, which uses a hardware filter, rather than software. I haven't seen any legitimate reason as to why this is dismissed.

As I understand it the texture filter is a standard GPU hardware feature that Microsoft have then tweaked/customised for Series X. I'm sure they do that for many parts where it makes sense. I won't dismiss it as nothing (maybe Microsoft do have some additional HW units they're holding back) but I don't believe it is some magic bean, either.
 
MS offers 100 GB of instant data - they just mean bypassing the file system and addressing SSD storage directly, as if it were a slow portion of RAM.

Sony offers 825 GB of instant data using MS's definition - at least according to what has been released so far.

The 100 GB on the XSX is a limitation compared to the PS5. Still, 100 GB is a fair chunk of memory, and MS have worked on things like compression, so I don't think it will actually impact anything.

EDIT - Also worth mentioning: both the XSX and PS5 have a flash file system, so data can be looked up the same way one would with an HDD. The direct addressing described above is just faster in some scenarios.
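A rough PC-side analogy for "address SSD storage directly like a slow portion of RAM" (just an analogy using POSIX memory mapping, not how either console's hardware actually works): map a file and read it through a pointer, letting the OS page data in from storage on demand instead of issuing explicit file reads.

```cpp
// Rough analogy only: memory-map a file so its contents can be read through a
// pointer, with pages faulted in from storage on demand (POSIX APIs).
#include <cstdio>
#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int main() {
    const char* path = "assets.bin";              // hypothetical, non-empty asset archive
    int fd = open(path, O_RDONLY);
    if (fd < 0) { std::perror("open"); return 1; }

    struct stat st{};
    if (fstat(fd, &st) != 0) { std::perror("fstat"); close(fd); return 1; }

    // The whole file becomes addressable like (slow, read-only) memory.
    void* base = mmap(nullptr, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (base == MAP_FAILED) { std::perror("mmap"); close(fd); return 1; }

    const unsigned char* bytes = static_cast<const unsigned char*>(base);
    std::printf("first byte: %u\n", bytes[0]);    // touching it pages data in

    munmap(base, st.st_size);
    close(fd);
    return 0;
}
```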


So how exactly does this work compared to the PS5 I/O?

We know they have specific hardware equivalent to some Zen 2 CPU cores to handle I/O things like coherency, etc.

DirectStorage is described as reducing CPU overhead for I/O operations.

So if Sony needs a few Zen 2 CPU cores' worth of processing for their I/O, excluding Kraken, does the Xbox method hit the CPU hard?

If PS5 I/O wasn't a big deal, you wouldn't have devs raving about it, the UE5 demo debuting on PS5 instead of PC, and a guy like Matt, who has access to both devkits, describing the difference between the I/O in the PS5 vs the XSX as "significant".

MS's software solutions will work about as well as DirectX worked this gen at bridging the gap between the PS4 GPU and the Xbox One GPU. It is what it is.

Those who don't want to accept it are suffering from FOMO. The good news is, multiplatform development will not only have to contend with PS5 and XSX, but also PCs with basic SSDs, so nothing has changed for the Xbox gamer, only for the PlayStation gamer.

This. So all this marketing from MS seems to me to be just muddying the waters. But if so, once again, they are setting up believers for a fall...
 
Although this is true, I think MS has more hardware to alleviate the bottlenecks than people realize. For some reason, people keep dismissing the 2x - 3x effective bandwidth and RAM multiplier with SFS, which uses a hardware filter, rather than software. I haven't seen any legitimate reason as to why this is dismissed.

Because you're parroting a marketing bullet point, and even worse, you're expanding it beyond its use. It's applied to textures; therefore it's not an effective overall 2x or 3x multiplier on bandwidth and RAM, and you keep spreading BS. And MS is so vague about this that it's not even possible to understand just how different it is from partially resident textures, which have been hardware-supported since GCN.


"Use of sampler feedback with streaming is sometimes abbreviated as SFS. It is also sometimes called sparse feedback textures, or SFT, or PRT+, which stands for "partially resident textures".
 
If the precedent is packed math... it's only half true to call that a precedent.
I'm tempted to say SFS is to software what packed math is to hardware:
something that has existed for a long time that they decided to push at some point.
And I would say sure, why not? I think some of the ideas in the XSX I/O system are going to make it to PCs soon as well. I hope so, in fact!

Why is it so incredible that a hardware feature that does not make sense to adopt in desktop cards right now (not talking about packed math, which, by the way, is the GPU analogue of vectorization on CPUs, so yes, ancient) eventually makes its way there once AMD decides to make it part of their roadmap?

So, you agree how utterly stupid it is to call people delusional for musing about that possibility (something that would happen years down the line anyway). Even people openly speculating about that long before RTG mentioned the idea from his conversation got attacked for it in this forum.
 
I'm assuming there is pop-in at every camera cut. I'll check when I'm near a computer. In theory every scene is made out of ~20M polygons, so it doesn't matter what you are looking at. Actually, the statues might even be an easier task for the system than that wall, because it's the same asset repeating over and over again, so the system needs to pull most of the data from the same asset.


I'm sure SM would have killed to have the PS5's SSD for GOW!
Let me know when you see pop-in in those other screens; I'm sure there wasn't any, as those were more demanding. 😀
 
Because you're parroting a marketing bullet point, and even worse, you're expanding it beyond its use. It's applied to textures; therefore it's not an effective overall 2x or 3x multiplier on bandwidth and RAM, and you keep spreading BS. And MS is so vague about this that it's not even possible to understand just how different it is from partially resident textures, which have been hardware-supported since GCN.
"Parroting"? If you're trying to have a conversation, you can start by not degrading others. I don't parrot anything. I have read and understood what it actually does.

Microsoft is "vague"? It was explained quite clearly, actually:


As textures have ballooned in size to match 4K displays, efficiency in memory utilisation has got progressively worse - something Microsoft was able to confirm by building in special monitoring hardware into Xbox One X's Scorpio Engine SoC. "From this, we found a game typically accessed at best only one-half to one-third of their allocated pages over long windows of time," says Goossen. "So if a game never had to load pages that are ultimately never actually used, that means a 2-3x multiplier on the effective amount of physical memory, and a 2-3x multiplier on our effective IO performance."

A technique called Sampler Feedback Streaming - SFS - was built to more closely marry the memory demands of the GPU, intelligently loading in the texture mip data that's actually required with the guarantee of a lower quality mip available if the higher quality version isn't readily available, stopping GPU stalls and frame-time spikes. Bespoke hardware within the GPU is available to smooth the transition between mips, on the off-chance that the higher quality texture arrives a frame or two later. Microsoft considers these aspects of the Velocity Architecture to be a genuine game-changer, adding a multiplier to how physical memory is utilised.



Textures are the largest assets that require streaming, so the 2x - 3x is not unrealistic at all, especially considering that one-half to one-third is, at best, the fraction of allocated pages actually being used when things are done the old way.

Unable to understand how different it is from partially resident textures? Microsoft is arguing a 2x - 3x multiplier compared to the Xbox One X. If you argue that PRT was already used in older consoles, things are clear as day.
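For what it's worth, the arithmetic behind that multiplier is easy to sanity-check with a toy model (illustrative C++ only, not the actual SFS hardware path): if feedback shows only a third of the resident texture pages were ever sampled, loading only the sampled pages stretches the same RAM and I/O roughly 3x further.

```cpp
// Toy model only: estimate the effective memory/IO multiplier from sampler
// feedback. "pageWasSampled" marks which resident texture pages were actually sampled.
#include <cstdio>
#include <vector>

double effectiveMultiplier(const std::vector<bool>& pageWasSampled) {
    std::size_t sampled = 0;
    for (bool s : pageWasSampled) sampled += s ? 1 : 0;
    if (sampled == 0) return 1.0;  // nothing sampled: avoid divide-by-zero
    // Pages you would have loaded anyway / pages you actually needed.
    return static_cast<double>(pageWasSampled.size()) / static_cast<double>(sampled);
}

int main() {
    // Example matching the quote: only one-third of allocated pages ever accessed.
    std::vector<bool> feedback(900, false);
    for (std::size_t i = 0; i < 300; ++i) feedback[i] = true;
    std::printf("effective multiplier: ~%.1fx\n", effectiveMultiplier(feedback));  // ~3.0x
    return 0;
}
```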
 
"Parroting"? If you're trying to have a conversation, you can start by not degrading others. I don't parrot anything. I have read and understood what it actually does.

Unable to understand how different it is from partially resident textures? Microsoft is arguing a 2x - 3x multiplier compared to the Xbox One X. If you argue that PRT was already used in older consoles, things are clear as day.


Terminology

Use of sampler feedback with streaming is sometimes abbreviated as SFS. It is also sometimes called sparse feedback textures, or SFT, or PRT+, which stands for "partially resident textures".
 
Sorry for the off-topic, but please don't bring conspiracy theories into this thread. Some theories are indeed interesting and maybe even true, but here we are talking about next-gen consoles and you are provoking people into talking about them.

Nah, I'd say don't worry about people being roped into conspiracy theories in general in this thread. We already have enough of those regarding the two consoles anyway.

So I guess we're immune by now.
 
Nobody (particularly on the Sony fanboy side) said that the XSX is not powerful or, worse, that the PS5 is stronger than the XSX.
But it's a fact that MS fanboys are not happy with the XSX specs, and I don't understand why. It's weird to see those MS fanboys trying to make out the PS5 to be only as powerful as a PS1.
Exactly, certain people are trying to put out dreck that makes out the PS5 to be nothing more than a PS4 Pro PRO, if you will.
 