
Next-Gen PS5 & XSX |OT| Console Tech Thread

Status
Not open for further replies.

DeepEnigma

Gold Member
What they have shown so far looks nice, but not eyeball-melting.



Definitely looks better than Ratchet on PS4.



But I'm not seeing a generational leap. Maybe if they had changed the art style to something more realistic. It looks busier and there is more destruction, but overall I'm not blown away by the graphics.
 
Returnal is really something special. Yeah, it might have some progression issues, but this is what Cerny must have had in mind when he designed the 3D audio and the DualSense features. It feels immersive in ways other shooters simply aren't. The game might not be a looker, but it's atmospheric as fuck. I was watching Ratchet's State of Play today to try and imagine the DualSense and 3D audio in that game, and it felt so empty in comparison. Atmosphere is so important. This is far more impressive overall than Demon's Souls, Miles and Astrobot. I just can't get over how good the sound is in this game. Everything from the sci-fi soundtrack to the enemy screams and explosions, it's audio porn. Play with headphones on, please.

I'd highly recommend it as a technical showcase even if you aren't into roguelikes or brutal difficulty spikes. This is Cerny's vision brought to life.

Thanks Slimy, you just sold me on the game.
 
Regarding DF's analysis of Ratchet and Clank, it appears that Insomniac made a lot of smart decisions to save on performance, like rendering the RT reflections at a lower quality than the actual assets. I'm pretty certain the final product will perform extremely well due to all these decisions.
 

Rea

Member
So RT is behind that?
I meant it is something like Demon's Souls GI.
But nowadays there's something like Lumen, which is also a kind of ray-traced GI but computationally less expensive than Nvidia's hardware RT, so it can run without RT hardware. My personal guess is that R&C might be using GI like Lumen and using the RT hardware for RT reflections. *could be wrong*
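For context on why Lumen-style GI doesn't need RT hardware: its "software" ray tracing marches rays against signed distance fields rather than triangle BVHs. A toy sphere-tracing sketch of that idea (a single unit-sphere scene; nothing here is Epic's actual code):

```python
# Toy sphere tracing against a signed distance field (SDF) -- the kind of
# "software" ray tracing Lumen-style GI uses instead of triangle BVHs.
# The scene (one unit sphere at the origin) is purely illustrative.
import math

def scene_sdf(p):
    """Distance from point p to the nearest surface (a unit sphere)."""
    x, y, z = p
    return math.sqrt(x * x + y * y + z * z) - 1.0

def sphere_trace(origin, direction, max_steps=64, eps=1e-4):
    """March along the ray, stepping by the SDF value each time."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = scene_sdf(p)
        if d < eps:
            return t       # hit: distance along the ray
        t += d             # safe step: the SDF guarantees no surface is closer
    return None            # miss

# Ray from (0, 0, -3) straight at the sphere hits at t = 2
print(sphere_trace((0.0, 0.0, -3.0), (0.0, 0.0, 1.0)))  # -> 2.0
```

Because each step is bounded by the distance field, no triangle intersection hardware is needed, which is why this path runs on plain compute.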
 
I meant it is something like Demon's Souls GI.
But nowadays there's something like Lumen, which is also a kind of ray-traced GI but computationally less expensive than Nvidia's hardware RT, so it can run without RT hardware. My personal guess is that R&C might be using GI like Lumen and using the RT hardware for RT reflections. *could be wrong*

Interesting. Well, hopefully the devs do a deep dive on the game. There's a lot of interesting technology that they are using in it.
 

onesvenus

Member
The light diffusion in the smoke is pretty neat. I wonder how they did that?
It'd make a lot of sense for it to be encoded in the effect itself instead of doing the expensive computation each time.
It's some kind of real-time GI. Their custom GI has also improved compared to last generation.
Do you have any evidence of that? There are multiple games with those kinds of effects being encoded as is, without any RT involved.
I meant it is something like Demon's Souls GI.
But nowadays there's something like Lumen, which is also a kind of ray-traced GI but computationally less expensive than Nvidia's hardware RT, so it can run without RT hardware. My personal guess is that R&C might be using GI like Lumen and using the RT hardware for RT reflections. *could be wrong*
Without seeing more of them in different scenarios, I'd say they are not doing any raymarching through the cloud volume. It's really expensive and wouldn't bring anything to the table for an effect that lasts a couple of seconds.
In fact, taking into account the I/O speed of the PS5, I'd guess they are using Houdini simulations instead. They don't have the problem of taking up a big amount of RAM anymore.
 

Rea

Member
It'd make a lot of sense for it to be encoded in the effect itself instead of doing the expensive computation each time.

Do you have any evidence of that? There are multiple games with those kinds of effects being encoded as is, without any RT involved.

Without seeing more of them in different scenarios, I'd say they are not doing any raymarching through the cloud volume. It's really expensive and wouldn't bring anything to the table for an effect that lasts a couple of seconds.
In fact, taking into account the I/O speed of the PS5, I'd guess they are using Houdini simulations instead. They don't have the problem of taking up a big amount of RAM anymore.
Everything i said was speculation of course. Anything is possible until dev confirms it.
 

Rea

Member
I'd guess they are using Houdini simulations instead. They don't have the problem of taking up a big amount of RAM anymore.
Notice that the whole environment geometry is lit up whenever the explosion occurs, like a light source. Can a Houdini simulation do that? I don't know.

 

Pedro Motta

Member
Notice that the whole environment geometry is lit up whenever the explosion occurs, like a light source. Can a Houdini simulation do that? I don't know.

The explosion is generated in Houdini, but it's rendered to an animated texture atlas, so it isn't emitting light; the light is added afterwards when composing the asset. The improvement in sprite quality is notable.
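An animated texture atlas (flipbook) like the one described is played back by remapping each quad's UVs to the grid cell for the current frame. A minimal sketch of that remapping (the 8x8 grid and 30 fps playback rate are assumptions for illustration, not Insomniac's actual setup):

```python
# Sketch: play back a pre-baked explosion stored as an NxN texture atlas
# (flipbook). Grid size and frame rate are illustrative assumptions.

ATLAS_GRID = 8      # assumed 8x8 grid -> 64 baked frames
FRAME_RATE = 30.0   # assumed playback rate, frames per second

def flipbook_uv(u: float, v: float, time_s: float):
    """Map a quad's local UV (0..1) to the atlas cell for this time."""
    total_frames = ATLAS_GRID * ATLAS_GRID
    frame = int(time_s * FRAME_RATE) % total_frames
    col = frame % ATLAS_GRID
    row = frame // ATLAS_GRID
    cell = 1.0 / ATLAS_GRID  # width/height of one cell in UV space
    return (col * cell + u * cell, row * cell + v * cell)

# Frame 0 samples the first cell of the atlas
print(flipbook_uv(0.0, 0.0, 0.0))   # -> (0.0, 0.0)
# Half a second in at 30 fps we're on frame 15: column 7, row 1
print(flipbook_uv(0.0, 0.0, 0.5))   # -> (0.875, 0.125)
```

All the expensive simulation work happens offline in Houdini; at runtime this is just a texture fetch, which is why it's so cheap.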
 

onesvenus

Member
Notice that the whole environment geometry is lit up whenever the explosion occurs, like a light source. Can a Houdini simulation do that? I don't know.

That can be achieved with a simple point light enabled during the explosion, as has been done for a long time.
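The cheap trick described here is just a transient point light: spawn it with the effect, fade its intensity over the effect's lifetime, and light nearby geometry with ordinary inverse-square falloff. A hedged sketch (lifetime, peak intensity and the linear fade are all made-up values):

```python
# Sketch: transient point light spawned alongside an explosion effect.
# Intensity fades linearly over the lifetime; surfaces receive standard
# inverse-square attenuation. All constants are illustrative.

LIFETIME_S = 1.5        # assumed effect duration in seconds
PEAK_INTENSITY = 100.0  # assumed peak light intensity (arbitrary units)

def flash_intensity(age_s: float) -> float:
    """Light intensity at a given age of the explosion effect."""
    if age_s < 0.0 or age_s >= LIFETIME_S:
        return 0.0  # light is disabled outside the effect's lifetime
    return PEAK_INTENSITY * (1.0 - age_s / LIFETIME_S)

def received_light(age_s: float, distance_m: float) -> float:
    """Irradiance at a surface point, inverse-square falloff."""
    return flash_intensity(age_s) / max(distance_m * distance_m, 1e-4)

print(received_light(0.0, 2.0))   # full flash at 2 m -> 25.0
print(received_light(1.5, 2.0))   # effect over -> 0.0
```

Since the sprite itself doesn't emit light, this one dynamic light sells the whole "explosion illuminates the environment" look for almost nothing.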
 
This is an example of the type of post that will get you permanently removed from the thread.
When is Xbox expected to have a truly next gen game?
 

ZehDon

Member
End of this year I reckon.

Either Halo or Starfield OR both
While Starfield could be this gen's TESIV: Oblivion in terms of "next gen", I really doubt that Halo Infinite delivers on "next gen". It's a cross-gen game; the delay might help it not look like a bad Xbone game that's been up-scaled, but it can't make it look like a next-gen only one. I suspect Bethesda will be delivering that "next gen" moment long before XGS does.
 

reksveks

Member
While Starfield could be this gen's TESIV: Oblivion in terms of "next gen", I really doubt that Halo Infinite delivers on "next gen". It's a cross-gen game; the delay might help it not look like a bad Xbone game that's been up-scaled, but it can't make it look like a next-gen only one. I suspect Bethesda will be delivering that "next gen" moment long before XGS does.
And possibly Flight Sim, if that does come out this year.
 
This reads like an Xbox PR piece, which it might very well be.

You're forgetting that RGT is a PC-centric channel and Paul himself is primarily a PC gamer.

The great thing about when a new Xbox console launches is that it brings a ton of innovations with it which also come to the PC gaming space, most notably DX12 Ultimate and its software feature set. So it makes sense for him to be excited about it and do extensive videos on the subject.

I personally know a number of friends who are PC gamers and are excited for things like DirectStorage, which I think is a cool and useful feature. It brings large optimisations to the way PC hardware handles things like I/O requests, data streaming and loading, and it reconfigures the data streaming pipeline for PCs to allow for more efficiency, since PCs don't have dedicated I/O hardware and usually rely on the CPU and GPU for data decompression. Things like these will become critical, especially because of the massive increase in data (thanks to the visual fidelity) that next-generation games will bring, and the significantly growing adoption of SSDs amongst PC gamers, especially now that the consoles support them.

Here's a great read on the subject. I recommend PlayStation gamers (like myself) also give this a read, as it is very insightful into how SSDs will impact next-generation gaming moving forward.

 

Md Ray

Member
You're forgetting that RGT is a PC-centric channel and Paul himself is primarily a PC gamer.

The great thing about when a new Xbox console launches is that it brings a ton of innovations with it which also come to the PC gaming space, most notably DX12 Ultimate and its software feature set. So it makes sense for him to be excited about it and do extensive videos on the subject.

I personally know a number of friends who are PC gamers and are excited for things like DirectStorage, which I think is a cool and useful feature. It brings large optimisations to the way PC hardware handles things like I/O requests, data streaming and loading, and it reconfigures the data streaming pipeline for PCs to allow for more efficiency, since PCs don't have dedicated I/O hardware and usually rely on the CPU and GPU for data decompression. Things like these will become critical, especially because of the massive increase in data (thanks to the visual fidelity) that next-generation games will bring, and the significantly growing adoption of SSDs amongst PC gamers, especially now that the consoles support them.

Here's a great read on the subject. I recommend PlayStation gamers (like myself) also give this a read, as it is very insightful into how SSDs will impact next-generation gaming moving forward.

A dedicated decompression unit will likely be built into the GPU for PC, imo.
I elaborated more on this PC vs console I/O topic in another thread. Read it if you're interested or ignore it.

When we talk about PS5 I/O, we aren't talking about the SSD or the speed (5.5 GB/s) of the SSD. That's not a problem for PC. On the hardware side, PC currently lacks the dedicated decompression unit that consoles (PS5/XSX|S) have. On the software side, its existing file I/O protocols are ancient. DirectStorage is being pushed to overcome this, at least on the software side of things.

Decompression at Gen4 SSD speeds can take up to 24 cores of a conventional CPU (Threadripper) according to NVIDIA, or 11 to 13 Zen 2 cores according to Sony and MS, respectively. With just 8 cores inside the consoles, you can see they can't just rely on the main CPU for decompression of the game files. This is why console manufacturers built a custom decompression block into the I/O unit of the main SoC that offloads those tasks from the main CPU, so the CPU can focus on what it's meant to do, like processing game physics, instructions, preparing draw calls, etc., without having to worry about decompression overhead that much.
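The core counts above can be sanity-checked with simple throughput arithmetic. Taking NVIDIA's figure at face value (~24 conventional cores for a ~7 GB/s Gen4 SSD), each core manages roughly 0.3 GB/s of decompression, so a console-class stream eats far more than 8 cores. A back-of-the-envelope sketch (the per-core rate is derived from the quoted figures, not a benchmark):

```python
# Back-of-the-envelope: how many CPU cores does streaming decompression eat?
# Derived from the figures quoted above (NVIDIA: ~24 cores for a Gen4 SSD);
# the per-core rate is an assumption, not a measured number.

GEN4_SSD_GBPS = 7.0   # typical Gen4 NVMe raw read speed
CORES_PER_GEN4 = 24   # NVIDIA's quoted conventional-CPU figure
PER_CORE_GBPS = GEN4_SSD_GBPS / CORES_PER_GEN4   # ~0.29 GB/s per core

def cores_needed(stream_gbps: float) -> float:
    """Cores needed to keep up with a given compressed stream rate."""
    return stream_gbps / PER_CORE_GBPS

# PS5's 5.5 GB/s raw stream would need ~19 such cores; an 8-core console
# CPU clearly can't spare that, hence the dedicated decompression block.
print(round(cores_needed(5.5), 1))  # -> 18.9
```

Even allowing for Zen 2 cores being faster per core than this average, the conclusion holds: a general-purpose CPU can't absorb the stream and still run the game.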

And this is where it's a problem for PC. DirectStorage kinda solves the problem on a software level, not on a hardware level. Gaming PCs with a 6C/12T CPU like the Ryzen 3600 or 5600X, heck even 8 cores, are quickly going to get saturated when decompression from a Gen4 SSD is taking place, leaving nothing for the games. They just aren't gonna cut it when games demand data to be decompressed and loaded from the NVMe to VRAM as soon as possible.

To solve this hardware problem, NVIDIA is proposing a solution called "RTX IO" where, instead of using the CPU cores, with DirectStorage the game will use the GPU's CUDA cores, the SMs, for decompressing the data. This is still going to impact gaming perf, IMO, and NVIDIA won't directly tell you or talk about this, at least not now, because we'll likely see a dedicated decompression unit similar to the consoles' in next-gen PC GPUs that will offload this task from the SMs. I mean, this is not some wild concept. Remember this chart?

They'll have something like this up for the I/O decompression talk when selling their next-gen GPUs to you, showing how having dedicated decompression/I/O cores in the GPU speeds up gaming perf instead of relying on the SMs or the CPU.

This means gamers with Turing and Ampere GPUs are likely screwed if/when that happens, just like GTX users are screwed without RT and AI cores. Think about it: even the console manufacturers could have gone the same route as NVIDIA and used their GPUs for decompression work, but they didn't. They felt the need to add a dedicated block so that it didn't impact gaming perf.

Now, do you understand what we mean when we talk about PS5/XSX I/O unit? Even XSX and XSS have some form of hardware decompression unit in the SoC.
 
I elaborated more on this PC vs console I/O topic in another thread. Read it if you're interested or ignore it.
Funnily enough, in the Microsoft Stack talk, Andrew Yeung (DirectStorage engineer) alluded to future discrete GPUs incorporating dedicated hardware for I/O, and that there will be an "exciting evolution to DirectStorage".
 
Is it in this vid?


I knew it. Wish they'd incorporated this already in the Ampere or RDNA 2 line of GPUs. :messenger_expressionless:

Yes that's the video.

It's heavily rumoured now that RDNA 3 will follow a chiplet design allowing it to have more CUs on the die; maybe AMD will dedicate some of those cores to I/O data decompression.

Likewise, it's heavily rumoured that Nvidia's Hopper architecture (50 series), which will succeed the upcoming Lovelace (40 series), will also follow a chiplet design, and maybe they'll go in the same direction.

This is all just my speculation of course.
 
Yes that's the video.

It's heavily rumoured now that RDNA 3 will follow a chiplet design allowing it to have more CUs on the die; maybe AMD will dedicate some of those cores to I/O data decompression.

Likewise, it's heavily rumoured that Nvidia's Hopper architecture (50 series), which will succeed the upcoming Lovelace (40 series), will also follow a chiplet design, and maybe they'll go in the same direction.

This is all just my speculation of course.

I wonder if PCs will ever adopt the approach of having dedicated decompression hardware instead of using the GPU/CPU to do it. I can imagine that for highly compressed assets you would need a more powerful decompressor like Sony has.
 
I wonder if PCs will ever adopt the approach of having dedicated decompression hardware instead of using the GPU/CPU to do it. I can imagine that for highly compressed assets you would need a more powerful decompressor like Sony has.
I think it would depend on the performance cost and the amount of data being decompressed. Although it's very likely they'll incorporate dedicated hardware.

Nvidia suspiciously didn't go into much detail about the GPU performance cost of data decompression. I imagine triple-A next-gen games will exceed 100 GB in file size, and that's a minimum. Although PlayStation's Kraken has been doing an excellent job when it comes to file compression and game sizes, so it's anyone's guess as to how much data will be thrown around.

In the DirectStorage talk, Andrew did mention that GPUs handle data decompression much better than the CPU, so we'll see.
 

kyliethicc

Member
Starfield might be crossgen too
Yup. Flight Sim even got a PEGI rating for Xbox One.

But Deathloop & Ghostwire Tokyo are next gen only tho, so Starfield may be as well.
(And there's a small chance that Starfield even comes out on PS5, but that's unlikely.)
Deathloop won't be on Xbox until Sept 2022 tho. And Ghostwire might not even ship on PS5 til 2022, and Xbox in 2023.

If there's a Forza Horizon 5 this year, it's gonna be an Xbox One game too, just like Halo.
And Psychonauts 2 is coming out on PS4.

I think Forza Motorsport could be the first XBS exclusive, probably in 2022, like GT7. Starfield is the other possibility.
Long time to wait for the 12 floppies to be shown off.
 

Thirty7ven

Banned
I elaborated more on this PC vs console I/O topic in another thread. Read it if you're interested or ignore it.

Exactly, and it's par for the course really. Yesterday's tech is optimized for yesterday's needs. If you need a bunch of CPU/GPU cores busy with decompression for streaming at runtime, it becomes a problem. What we are seeing is a simple evolution, like we've seen before, and I/O systems like the ones on consoles will become commonplace once manufacturers figure out who's who. I think the answer is clear though: both AMD and Nvidia will start shipping GPUs with integrated solutions.

It's probable that by 2024, a lot of PC gamers will have to upgrade their hardware simply because of that.
 
That's the thing, I was also seeing GEs (essentially) as being responsible for hardware culling, and since PS5's GE is 22% faster and further customized to be fully programmable to do culling earlier in the pipeline, I was considering PS5 to be convincingly more capable at it.

Your counter-argument of using CUs/compute resources for culling seems a bit counterintuitive to me, but I lack knowledge about the matter; I need to do some research about it. Thanks.
Primitive shaders are fully programmable and driver-compiled; they execute a compute shader thread group with access to group shared memory. Triangle culling is done by the shaders on the compute unit, completely bypassing the fixed-function primitive unit. This is not only much faster than fixed function but happens much earlier.

Primitive shaders reduce the geometry pipeline to just 3 stages (the world space pipeline). Triangle culling is done via the shaders before the data reaches this pipeline; the shaders operate at the GPU's clock speed (2.23 GHz), for a triangle culling rate of 17.84 billion per second.
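The quoted culling rate lines up with the clock speed: 17.84 billion triangles per second at 2.23 GHz works out to 8 triangles culled per clock. A quick arithmetic check (the 8-per-clock figure is implied by the two numbers above, not an official AMD spec):

```python
# Quick check: does the quoted triangle-culling rate line up with the clock?
# 8 triangles/clock is implied by the two numbers, not an official spec.

GPU_CLOCK_HZ = 2.23e9           # PS5 GPU peak clock
CULL_RATE_TRIS_PER_S = 17.84e9  # culling rate quoted in the post

tris_per_clock = CULL_RATE_TRIS_PER_S / GPU_CLOCK_HZ
print(tris_per_clock)  # -> 8.0
```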

Using primitive shaders does away with the crossbar, which is used to send data across shader arrays; instead it uses the local data store, which is accessible by the whole GPU and freely accessible by the rasterizer. This eliminates the need for the buffer space which the crossbar used as cache.

The UE5 demo used primitive shaders. That demo put billions of triangles on screen, most the size of a pixel; it used 768 MB of RAM and ran at a 4.5 ms frametime.

Daniel Wright, Tech Director of Graphics at Epic:

" On the PS5 we used primitive shaders for that path which are considerabley faster than using the old pipeline we had before with vertex shaders"
 