
Next-Gen PS5 & XSX |OT| Console tEch threaD

Not what I meant. When people talk about smaller install sizes on PS5, they treat compression methods like Oodle as the one and only reason, without considering that the PS5's SSD speed is enough to remove duplicate assets.

This may partly explain why XS games aren't as far behind in loading speeds: they might still have more duplicates to rely on than PS5.
You are wrong. Duplication can be removed even on a SATA SSD, because the seek times and mechanical nature of an HDD are the reason assets get duplicated, not the bandwidth.

In my opinion, the reason we see these deltas in size is that compression is better on the PS5 side, or for some reason it is not applied to the same extent on the XS consoles. The fact that PlayStation lets devs use Oodle compression for free is probably making more of them use it in the PS5 version.

Regarding loading times, right now the difference comes down only to the delta in bandwidth. Even some cross-gen games with poorly optimized loading are close, probably because they are still using the CPU for that work. In the future, when we see games designed around the SSD, we can start talking about the other components: the decompressor, APIs, etc.

Only a frame rate increase, no res enhancements?

Bro the game is free
 
Last edited:

ethomaz

Banned
You are wrong. Duplication can be removed even on a SATA SSD, because the seek times and mechanical nature of an HDD are the reason assets get duplicated, not the bandwidth.

In my opinion, the reason we see these deltas in size is that compression is better on the PS5 side, or for some reason it is not applied to the same extent on the XS consoles. The fact that PlayStation lets devs use Oodle compression for free is probably making more of them use it in the PS5 version.

Regarding loading times, right now the difference comes down only to the delta in bandwidth. Even some cross-gen games with poorly optimized loading are close, probably because they are still using the CPU for that work. In the future, when we see games designed around the SSD, we can start talking about the other components: the decompressor, APIs, etc.
You are right, but you need to think about something else.

Even though some gamers have SSDs, you can't avoid duplicating data on PC, because there will be gamers with HDDs and the dev can't make two separate versions for SSD and HDD... so they ship the one with duplicated assets, which lets both types of players run the game.
The same can be said about decompression... PCs don't have dedicated hardware units for it, so the PC version has to ship without those heavily compressed assets.

Now compare the size of the versions:

PC, Series, Xbox, PS4, etc: XX GB
PS5: 40% of XX GB

You see where the removal of duplication and the use of heavy compression is happening? PS5.
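To make that comparison concrete, here's a toy sketch of the install-size arithmetic — every number below is hypothetical, made up for illustration, not taken from any real game:

```python
# Toy install-size arithmetic; all figures are hypothetical.
unique_assets_gb = 50.0
duplication_overhead = 0.5   # assume HDD builds duplicate ~50% of assets for seek locality
compression_ratio = 0.55     # assume heavier compression shrinks assets to 55% of raw size

hdd_build_gb = unique_assets_gb * (1 + duplication_overhead)  # 75.0 GB shipped
ps5_build_gb = unique_assets_gb * compression_ratio           # 27.5 GB shipped

print(f"{hdd_build_gb:.1f} GB vs {ps5_build_gb:.1f} GB "
      f"({ps5_build_gb / hdd_build_gb:.0%} of the duplicated build)")
```

With those made-up inputs, dropping duplicates and compressing harder lands the build at well under half the HDD-friendly size, which is the shape of the delta being discussed.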

That is partly explained by the nature of the GDK, whose goal is to let devs build the same version across all Xbox devices and PC... so if one or more of those targets can't drop the duplication or handle the heavy decompression, the version that ships keeps the duplication and uses little to no compression.

That is the whole difference between multiplatform tools and hardware-specific tools.

With multiplatform tools, the dev has to go out of their way to make changes for specific hardware... most devs won't take, or even have, that time... most devs will stick with what is common to all platforms.

That changes with hardware-specific tools... you already have to do the port, so why not use the features the tools make available?
 
Last edited:
EDIT: Another update on CDPR future plans....





Funnily enough, CDPR released a 2021 roadmap that includes the next-gen Witcher 3 patch, but with no exact date, which makes releasing a roadmap pointless in the first place lol



They need to stop talking about what they "plan to do" and start taking more action ....

We are months out from Cyberpunk 2077's launch, and we still have no release date for the game's next-gen update or for Witcher 3's next-gen update.

How many different "roadmaps" are they going to release at this point?
 
Last edited:

StreetsofBeige

Gold Member
It was confirmed a while back that a Witcher game was in the works. Here's a quote from CDPR's CEO.

"The first three 'Witchers' were by definition a trilogy, so we simply could not name the next game 'The Witcher 4'. This does not mean, of course, that we will leave the world of The Witcher. The Witcher is one of two franchises on which to build the future activities of the company," he adds. "Today, unfortunately, I cannot reveal anything more."

Here's the link


Also, given the immense popularity of the Netflix series, CDPR would most likely do their best to bank on that success and release a Witcher game.

EDIT: Another update on CDPR future plans....





Funnily enough, CDPR released a 2021 roadmap that includes the next-gen Witcher 3 patch, but with no exact date, which makes releasing a roadmap pointless in the first place lol


Ah crap. So no more YT videos 8 years from release?
 

GAF machine

Member
Since Nvidia's DLSS is a big topic these days (especially in regard to its use in a possible Switch Pro), I was just reading up on it a bit, which led me to a Wikipedia article about AI accelerators. Interestingly, the article mentions the Cell processor as having features significantly overlapping with AI accelerators. Which leads me to my question: would the PS3's Cell processor theoretically have been able to do something similar to DLSS, at least to some extent? Or maybe a hypothetical Cell successor, if Sony had stuck with the architecture after the PS3?

To answer both your questions: Yes, because there's an optimized library for computing GEMMs on SPEs somewhere in Sony HQ collecting dust.

Before I get to GEMMs on SPEs, per Nvidia:

"GEMMs (General Matrix Multiplications) are a fundamental building block for many operations in neural networks"... "GEMM is defined as the operation C=αAB+βC, with A and B as matrix inputs, α and β as scalar inputs, and C as a pre-existing matrix which is overwritten by the output."... "Following the convention of various linear algebra libraries (such as BLAS), we will say that matrix A is an M x K matrix, meaning that it has M rows and K columns. Similarly, B and C will be assumed to be K x N and M x N matrices, respectively." -- Nvidia

Here are illustrations of what Nvidia described...
tensor-core-acceleration-which-operations-l.jpg


background-tc-accelerated-gemm-output-matrix-l.jpg
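As a concrete illustration of the GEMM operation Nvidia defines above, here's a minimal NumPy sketch (the matrix sizes and scalars are arbitrary):

```python
import numpy as np

# GEMM per the BLAS convention: C = alpha*A@B + beta*C,
# with A an M x K matrix, B K x N, and C M x N (overwritten by the output).
def gemm(alpha, A, B, beta, C):
    return alpha * (A @ B) + beta * C

M, K, N = 4, 3, 5
rng = np.random.default_rng(0)
A = rng.standard_normal((M, K)).astype(np.float32)
B = rng.standard_normal((K, N)).astype(np.float32)
C = rng.standard_normal((M, N)).astype(np.float32)

out = gemm(2.0, A, B, 0.5, C)
print(out.shape)  # (4, 5)
```

An optimized BLAS SGEMM (or a Tensor core) computes exactly this operation; the engineering is all in how the multiply is tiled and scheduled, not in what is computed.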

To Nvidia's point about following the BLAS convention for GEMMs, the CELL SDK has a BLAS library with optimized SPE GEMM routines (BLAS Level 3 SGEMM and DGEMM). These routines put SPEs on a level with Tensor cores in terms of having an optimized procedure for matrix multiplication...

multicore-matrix-multiplication-library-2-l.jpg



problem-decomposition-1-l.jpg

A possibly interesting fact: Mercury Computer Systems (once makers of the CELL-based PowerBlock 200 rugged computer for "applications in network-centric warfare") added SPE GEMM support to its MultiCore library, which the company claimed "demonstrated superb performance".

As you may know, training a neural network can be done with FP32 matrix multiplication or with a mix of FP16/FP32 matrix multiplication. The existing SPE supports native FP32 math but lacks native FP16 math (it only supports INT8, INT16, FP32, and FP64). Despite this, SPEs can still compute FP16 values. For example...

- DICE's SPU tile-based shader in Battlefield 3 converted the lighting output of pixels shaded at FP32 precision into FP16 precision in order to "pack" twice the shaded pixels inside SPE local stores for DMA out to RSX graphics memory (better known as 'Rapid Packed Math')
- ex-Ubisofter Sebastian Aaltonen said of PS3:

"Console developers have a lot of FP16 pixel shader experience because of PS3. Basically all PS3 pixel shader code was running on FP16." -- sebbbi

but the absence of native FP16 math meant that programmers had to code support for the format from scratch. If a programmer were willing to code FP16 format and FP16 GEMM support for PS3's CELL, it's highly likely they could have built a working SPU-based version of DLSS for it. Had SIE stuck with CELL for PS4, native FP16 math would likely have been added to the list of "CELL v2" operations at the request of Mike Acton, as he was well known to PS4 platform architecture manager Alex Rosenberg (timestamped).
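The storage math behind DICE's FP32-to-FP16 packing trick can be sketched in a few lines of NumPy — this only illustrates the space saving and rounding cost, it is not SPU code:

```python
import numpy as np

# Shade at FP32, then store the results as FP16 so twice as many pixels
# fit in the same local-store budget before being DMA-ed out.
pixels_fp32 = np.linspace(0.0, 1.0, 1024, dtype=np.float32)  # stand-in lighting output
packed_fp16 = pixels_fp32.astype(np.float16)

print(pixels_fp32.nbytes)  # 4096 bytes
print(packed_fp16.nbytes)  # 2048 bytes: half the space for the same pixel count

# The conversion is lossy, but the rounding error is tiny for lighting values in [0, 1]:
max_err = np.max(np.abs(packed_fp16.astype(np.float32) - pixels_fp32))
print(max_err < 1e-3)  # True
```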

Naturally, the existing CELL BLAS GEMM routines would have been updated to reflect the addition of native FP16 support too, as currently they only cover SPE Single GEMM (SGEMM, FP32) and SPE Double GEMM (DGEMM, FP64). With these tweaks to CELL's SPE operations and BLAS GEMM routines, SIE would have had the means to construct an SPU-based version of DLSS for "CELL v2" with few if any compromises in function. How it would have performed is anyone's guess, but there may be clues that hint at it. The 640 Tensor cores of Nvidia's V100, operating on 4 x 4 matrices, each perform 64 FMAs per cycle, for a total of 40,960 FMA operations per cycle in mixed precision (FP16/FP32).

Incredibly (if my comprehension hasn't failed me), a university study revealed that in one example a single SPE operating on 64 x 64 matrices in FP32 (SGEMM) performed 65,536 FMA operations while reaching 99.8% of its peak single-precision throughput (25.55 GFLOPS out of 25.6 GFLOPS). That's 1.6x the FMA count of the V100's 640 Tensor cores, on matrices 16x larger in each dimension. This counter-intuitive level of performance and efficiency was attributed to "very aggressive" loop unrolling that involved stuffing 4,096 FMA operations into the body of a loop that repeats 16 times.
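The unrolling idea can be shown with a toy sketch — Python won't see the SPE's latency-hiding benefit, but the transformation itself (several independent FMAs per loop iteration instead of one) looks like this; the unroll factor of 4 here merely stands in for the study's 4,096-FMA loop body:

```python
# Toy illustration of manual loop unrolling for a multiply-accumulate loop.
def fma_naive(a, b, acc):
    for i in range(len(a)):
        acc += a[i] * b[i]
    return acc

def fma_unrolled(a, b, acc):
    # Four multiply-adds per iteration; on an in-order SIMD core like the SPE
    # this hides instruction latency and amortizes the loop-branch overhead.
    i, n = 0, len(a) - (len(a) % 4)
    while i < n:
        acc += a[i] * b[i]
        acc += a[i + 1] * b[i + 1]
        acc += a[i + 2] * b[i + 2]
        acc += a[i + 3] * b[i + 3]
        i += 4
    for j in range(n, len(a)):  # remainder elements
        acc += a[j] * b[j]
    return acc

a = [1.0, 2.0, 3.0, 4.0, 5.0]
b = [2.0, 2.0, 2.0, 2.0, 2.0]
print(fma_naive(a, b, 0.0), fma_unrolled(a, b, 0.0))  # 30.0 30.0
```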

Other reasons for the SPE's high efficiency center on its dataflow design, which requires software-managed DMA into and out of its local stores (SRAMs) -- both traits of IBM's yet-to-be-finalized Digital AI Core accelerator:

"At VLSI, IBM presented a research paper on a programmable accelerator for training neural networks and performing inference using a trained network, with authors from both IBM Research and the IBM Systems group....The processor is a dataflow-based design, which naturally maps to neural networks."...

"Last, the processor relies on software-managed data (and instruction) movement in explicitly addressed SRAMs, rather than hardware-managed caches. This approach is similar to the Cell processor and offers superior flexibility and power-efficiency (compared to caches) at the cost of significant programmer and tool chain complexity. While not every machine learning processor will share all these attributes, it certainly illustrates a different approach from any of the incumbents – and more consistent with the architectures chosen by start-ups such as Graphcore or Wave that solely focus on machine learning and neural networks." -- real world technologies
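Software-managed data movement, as opposed to hardware caches, can be sketched like this — a hypothetical scratchpad-style loop in NumPy where the program itself decides which tile is staged in, processed, and written back, roughly what SPE code does with explicit DMA:

```python
import numpy as np

# Hypothetical scratchpad model: a small "local store" that the program
# (not a cache) fills and drains explicitly, one tile at a time.
LOCAL_STORE = 64  # pretend scratchpad capacity, in elements
data = np.arange(256, dtype=np.float32)
result = np.empty_like(data)

for base in range(0, data.size, LOCAL_STORE):
    tile = data[base:base + LOCAL_STORE].copy()  # "DMA in" to the local store
    tile *= 2.0                                  # compute entirely on the scratchpad
    result[base:base + LOCAL_STORE] = tile       # "DMA out" to main memory
```

The flexibility and power-efficiency the article mentions come from this explicit control (no tag lookups, no miss handling); the cost is that the programmer or toolchain must schedule every transfer.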

That the makers of today's high-performance dedicated AI accelerators are incorporating foundational aspects of CELL's architectural design (a design that went under-appreciated) into their own designs is a testament to just how forward-looking the chip really was.

What Ken Kutaragi oversaw the creation of was a scalable general-purpose CPU with SPEs (CPUs) that rivaled the performance of GPU compute units, all wrapped up in an efficient architecture well suited to the data-transfer needs of neural networks. Computing a version of DLSS (and all the rest) at satisfactory speed wouldn't have been an issue for a robust CELL successor...
sampling-of-networks-trained-in-mixed-precision-l.jpg


ibm-research-32-l.jpg


ibm-research-33-l.jpg
 
Last edited:
Terrible example with the RTX 3090; it's clearly a cash grab by Nvidia and not even close to being a great card for the price asked. And this is coming from someone who paid over market price for a 3080 late last year.

I honestly disagree with the rationale behind the XSS, even as you put it. It costs $100 less than a fully capable PS5, but in exchange for those $100 it will gimp an entire generation (7 or so years) of games if developers really bend over backwards to support it. And the argument you're putting forth can easily be turned into a slippery slope: certainly in some markets even $300 is too much money for a console, so maybe we should have one that's $200 and further cut down. Maybe $100? Where do we stop?

A baseline is set by the first parties every generation; they are supposed to put forward their best attempt at driving technology and gaming as a medium forward while staying within a mass-market price range. Now MS is trying to have their cake and eat it too. Remember how they even wanted their "next gen" games to run on XB1? The only 2022/2023 game that should be able to run on a base XB1 or PS4 is an indie game, not a AAA one. And the XSS is barely more capable than a last-gen machine.
Again, unless you can point to features the XSS actually lacks, it isn't gimping anything. Your slippery-slope argument is equally silly, because the XSS has the same feature set as the XSX, so the only thing compromised is graphical horsepower. Which is by design.

If you feel that the PS5, because it has no low-end option, will set the standard, then I fully expect all PS5 games to blow away Xbox games. That already isn't true, but who knows what's in store for the future. I have a feeling both consoles will have great-looking titles. Gears 5 looks just as good as Returnal, and that is current gen vs last.

You are wrong about the XSS being barely more capable than the X1. On the CPU alone there is a generational difference. You might not know this, but its CPU has the same performance as the mighty PS5's; again, it's only missing the graphical horsepower. There is no evidence the XSS will hold anything back, but it certainly won't win any graphics performance awards.

You are free, like anyone else including me, not to buy the XSS, but its existence is good for the industry, and as a PS5 owner it does not affect you at all. God of War, Horizon, and The Last of Us will still be nice games and won't have any XSS cooties on them, so you can rest easy.

But Nintendo righted the ship with the Switch. Which launched about 4 years after the XBO and passed it in hardware sales. MS tried to right the ship and threw everything they could think of out there and nothing really helped.

Using Nintendo as an example but leaving this out is wild.
And MS righted the ship with the XSX|S. The Xbox launched at the same time as the GameCube, and they outsold it. MS added more studios, beefed up their backward compatibility, and added tons of games to Game Pass. Giving Nintendo a mulligan while holding MS's feet to the fire is unfair, to say the least. At least MS honored their current customers and continues to support the X1 to this day; Nintendo abandoned Wii U customers, making them repurchase titles a second time on Switch. That is not a practice we should cheer, but no one had a problem with it. It's just as sad as making fun of MS for supporting backward compatibility while honoring Sony for 'believing in generations'. It is what it is.

Wtf... you're the one making all the predictions, my man. I haven't made any :messenger_tears_of_joy: . I'm just making fun of all your stupid ones, and yet you're still saying stuff will never happen. It's insanity.

X1 beat Wii U? That is very, very impressive.

I hope you celebrated hard that day.
You have to be joking, man. You, the guy who 'predicted' Kinect's removal in 2021, talking about things that are stupid? You brought up how dominant the PS4 and PS2 were, but I didn't see you mention the PS3. Why is that? Didn't it dominate the Xbox, especially in the US?

I'll celebrate hard the day you don't make a disingenuous argument about the Xbox platform. I am still laughing at your prediction that the XSS will not see performance improvements like every other console in history, including the mighty PS3, which you seem to have forgotten. This will be a very interesting generation indeed. Looking forward to seeing how things play out. 😊
 
Last edited:

dcmk7

Banned
You have to be joking, man. You, the guy who 'predicted' Kinect's removal in 2021, talking about things that are stupid? You brought up how dominant the PS4 and PS2 were, but I didn't see you mention the PS3. Why is that? Didn't it dominate the Xbox, especially in the US?

I'll celebrate hard the day you don't make a disingenuous argument about the Xbox platform. I am still laughing at your prediction that the XSS will not see performance improvements like every other console in history, including the mighty PS3, which you seem to have forgotten. This will be a very interesting generation indeed. Looking forward to seeing how things play out. 😊

I'm not even messing around, but are you actually ok? Like, genuinely. None of that is coherent.

Show me where I have made a prediction?


You have to be joking, man. You, the guy who 'predicted' Kinect's removal in 2021, talking about things that are stupid? You brought up how dominant the PS4 and PS2 were, but I didn't see you mention the PS3. Why is that? Didn't it dominate the Xbox, especially in the US?

I don't understand this. Predicted Kinect removal in 2021??

I have mentioned the PS4 and PS2, yes, and? I also mentioned the X1 and X360... what is the problem? I have only mentioned those consoles here, in this paragraph:
X1 did fine? PS4 had their most dominant era since PS2. It did absolutely nothing to build upon the relative success of Xbox 360
From the above, why do I need to mention the PS3? You're making out that I'm hiding something... what is the insinuation?


I am still laughing at your prediction that the XSS will not see performance improvements like every other console in history, including the mighty PS3, which you seem to have forgotten.
Can you show me where I have made a prediction?

Because there hasn't been any.

I've told you many times that developers have said these issues are going to get worse. Be angry at them.

But you still think it's my 'prediction' when I'm paraphrasing them (with links to interviews).

But you don't seem to be following the discussion at all.

I'm beginning to feel bad, as I'm starting to suspect something here... that you might not be ok.
 
Last edited:
You are right, but you need to think about something else.

Even though some gamers have SSDs, you can't avoid duplicating data on PC, because there will be gamers with HDDs and the dev can't make two separate versions for SSD and HDD... so they ship the one with duplicated assets, which lets both types of players run the game.
The same can be said about decompression... PCs don't have dedicated hardware units for it, so the PC version has to ship without those heavily compressed assets.

Now compare the size of the versions:

PC, Series, Xbox, PS4, etc: XX GB
PS5: 40% of XX GB

You see where the removal of duplication and the use of heavy compression is happening? PS5.

That is partly explained by the nature of the GDK, whose goal is to let devs build the same version across all Xbox devices and PC... so if one or more of those targets can't drop the duplication or handle the heavy decompression, the version that ships keeps the duplication and uses little to no compression.

That is the whole difference between multiplatform tools and hardware-specific tools.

With multiplatform tools, the dev has to go out of their way to make changes for specific hardware... most devs won't take, or even have, that time... most devs will stick with what is common to all platforms.

That changes with hardware-specific tools... you already have to do the port, so why not use the features the tools make available?
Guys, you know I love consoles, but you are taking this too far.

With all this cross-gen it doesn't make sense to have a PC version that requires an SSD in order to avoid duplication, but that's not because PC lacks those dedicated decompression units; it's more because these are already cross-gen games. Also, regarding having two versions for SSD and HDD: yes, you can do it, but you need to think about whether it's worth it, and as we can see from one of the most played games, Warzone, it's clear that space is not as big a problem on PC as it is on consoles.

Another thing is whether the game is going to take advantage of running on an SSD, like Star Citizen. But I can assure you that by the end of this year you will see a couple of PC games (at least announced) which drop duplication by requiring an SSD.

Regarding decompression, it's a slightly more complex topic, as PC has its advantages: it improves over time, and there are already CPUs on the market with many times more power, so it's probably not far-fetched to see a PC brute-forcing it. If I remember correctly, Oodle texture compression can even be decompressed on the GPU, so basically PC can use "brute force".

I mean, even today it's not unusual to find games built in UE4 or Unity using BC compression; how aggressive this will get is another topic that depends on how the PC market evolves.

And choosing not to ship duplicated assets is not so hard; the "problem" is determining how many and which ones, in case you do want to use them.

As for why the XS consoles didn't do it, in my personal opinion it's related to shipping one version that can run on both last-gen and XS consoles. Does this mean their sizes will shrink in the future? Yes. Will they shrink as much as on PS5? Probably not.

Be patient, guys; you want to see all the improvements in the first 6 months of this new gen.
 

ethomaz

Banned
Guys, you know I love consoles, but you are taking this too far.

With all this cross-gen it doesn't make sense to have a PC version that requires an SSD in order to avoid duplication, but that's not because PC lacks those dedicated decompression units; it's more because these are already cross-gen games. Also, regarding having two versions for SSD and HDD: yes, you can do it, but you need to think about whether it's worth it, and as we can see from one of the most played games, Warzone, it's clear that space is not as big a problem on PC as it is on consoles.

Another thing is whether the game is going to take advantage of running on an SSD, like Star Citizen. But I can assure you that by the end of this year you will see a couple of PC games (at least announced) which drop duplication by requiring an SSD.

Regarding decompression, it's a slightly more complex topic, as PC has its advantages: it improves over time, and there are already CPUs on the market with many times more power, so it's probably not far-fetched to see a PC brute-forcing it. If I remember correctly, Oodle texture compression can even be decompressed on the GPU, so basically PC can use "brute force".

I mean, even today it's not unusual to find games built in UE4 or Unity using BC compression; how aggressive this will get is another topic that depends on how the PC market evolves.

And choosing not to ship duplicated assets is not so hard; the "problem" is determining how many and which ones, in case you do want to use them.

As for why the XS consoles didn't do it, in my personal opinion it's related to shipping one version that can run on both last-gen and XS consoles. Does this mean their sizes will shrink in the future? Yes. Will they shrink as much as on PS5? Probably not.

Be patient, guys; you want to see all the improvements in the first 6 months of this new gen.
There is a difference between what you can do and what you actually do... you can do everything on PC with brute force, but it won't run on 99% of gamers' setups.

Decompression is a good example... you can brute-force it on the CPU or GPU, but for that you need to ship the asset already compressed... which means anyone who can't brute-force it can't decompress it.

So a dev on PC always needs to look at the minimum requirements, and heavy decompression is not something they can ship to PC gamers.

That changes when you have a hardware-specific port.

You can be sure no game on PC will ship with assets compressed the way you find them on PS5... that won't happen even in the near future, because most gamers won't have the hardware to decompress them at render time.

That said, PC games need a big shake-up in minimum specs and should stop following what console hardware dictates.
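The brute-force path being discussed can be sketched with a general-purpose codec — zlib here merely stands in for Oodle/Kraken (which are proprietary), and the "asset" is a synthetic, highly compressible stand-in:

```python
import zlib
import numpy as np

# CPU "brute force" decompression: the codec trades CPU time at load
# for smaller shipped assets on disk.
asset = np.zeros(1_000_000, dtype=np.uint8)
asset[::64] = 255                       # sparse detail in a mostly-empty asset

shipped = zlib.compress(asset.tobytes(), level=9)   # what goes on disk
restored = zlib.decompress(shipped)                 # CPU cost paid at load time

print(len(shipped) < len(restored) // 10)  # True: large on-disk saving
print(restored == asset.tobytes())         # True: lossless round trip
```

The catch the post describes is exactly this `decompress` call: a console with a hardware decompressor gets it for free at full SSD bandwidth, while a minimum-spec PC CPU has to pay for it in cycles.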
 
Last edited:

assurdum

Banned
To answer both your questions: Yes, because there's an optimized library for computing GEMMs on SPEs somewhere collecting dust at Sony HQ.

Per Nvidia:



Here are illustrations of what Nvidia described...
tensor-core-acceleration-which-operations-l.jpg


background-tc-accelerated-gemm-output-matrix-l.jpg

To Nvidia's point about following the conventions of BLAS, the CELL SDK has a BLAS library with optimized SPE GEMM routines (BLAS Level III SGEMM and DGEMM). Mercury Computer Systems (once makers of the CELL-based PowerBlock 200 rugged computer for "applications in network-centric warfare") used these routines to add SPE GEMM support to its MultiCore library, which the company claimed demonstrated "superb" performance...

multicore-matrix-multiplication-library-2-l.jpg



problem-decomposition-1-l.jpg

As you may know, training a neural network can be done with FP32 matrix multiplication or with a mix of FP16/FP32 matrix multiplication. The existing SPE has support for native FP32 math, but lacks native support for FP16 math (only supports INT8, INT16, FP32, FP64). As is, SPEs can still compute FP16 values...

- DICE's SPU tile-based shader converted the lighting output of pixels shaded at FP32 precision into FP16 precision in order to "pack" twice the shaded pixels inside SPE local stores for DMA out to RSX graphics memory (better known as 'Rapid Packed Math')
- ex-Ubisofter Sebastian Aaltonen said of PS3:



but it requires programmers to code support for the format from scratch. Had SIE stuck with CELL for PS4, native FP16 math would likely have been added to the list of "CELL v2" operations at the request of Mike Acton, as he was well known to PS4 platform architecture manager Alex Rosenberg (timestamped).

Naturally, the existing CELL BLAS GEMM routines would have been updated to reflect the addition of native FP16 support too, as currently they only cover SPE SGEMM (FP32) and SPE Double GEMM (DGEMM, FP64). With these tweaks to CELL's SPE operations and BLAS GEMM routines, SIE would have had the means to build an SPU-based version of DLSS. How it would have performed is anyone's guess, but there may be clues that hint at it. The 640 Tensor cores of Nvidia's V100, operating on 4 x 4 matrices, each perform 64 FMAs per cycle, for a total of 40,960 FMA operations per cycle in mixed precision (FP16/FP32).

Incredibly (if my comprehension hasn't failed me), a university study revealed that in one example a single SPE operating on 64 x 64 matrices in FP32 (SGEMM) performed 65,536 FMA operations while reaching 99.8% of its peak single-precision throughput (25.55 GFLOPS out of 25.6 GFLOPS). That's 1.6x the FMA count of the V100's Tensor cores, on matrices 16x larger in each dimension. This counter-intuitive level of performance and efficiency was attributed to "very aggressive" loop unrolling that involved putting 4,096 FMA operations into the body of a loop that repeats 16 times.

Other reasons for the SPE's high efficiency center on its being a dataflow design that requires software-managed DMA into and out of its local stores (SRAMs) -- both traits of IBM's yet-to-be-finalized Digital AI Core accelerator:



That the makers of today's high-performance dedicated AI accelerators are incorporating foundational aspects of CELL's architectural design (a design that went unappreciated) into their own designs is a testament to just how forward-looking the chip really was.

What Ken Kutaragi oversaw the creation of was a scalable general-purpose CPU with SPEs (CPUs) that rivaled the performance of GPU compute units, all wrapped up in an architecture well suited to the data-transfer needs of neural networks at near-peak efficiency. Computing DLSS and all the rest at satisfactory speed wouldn't have been an issue...

sampling-of-networks-trained-in-mixed-precision-l.jpg


ibm-research-32-l.jpg


ibm-research-33-l.jpg
Man, Kutaragi was always on point, even when he seemed to have screwed everything up with the PS3...
 
There is a difference between what you can do and what you actually do... you can do everything on PC with brute force, but it won't run on 99% of gamers' setups.

Decompression is a good example... you can brute-force it on the CPU or GPU, but for that you need to ship the asset already compressed... which means anyone who can't brute-force it can't decompress it.

So a dev on PC always needs to look at the minimum requirements, and heavy decompression is not something they can ship to PC gamers.

That changes when you have a hardware-specific port.

You can be sure no game on PC will ship with assets compressed the way you find them on PS5... that won't happen even in the near future, because most gamers won't have the hardware to decompress them at render time.

That said, PC games need a big shake-up in minimum specs and should stop following what console hardware dictates.
Ok, I only partially agree with you. Of course you won't use 99% of your CPU for this job, but as I told you before, your GPU can do it, at least for Oodle compression, and that doesn't necessarily mean using 99% of a GTX 1060.

First, I'm not saying PC games will have the same level of compression as PS5 this year, but that doesn't mean it won't happen in 4-5 years. Also, if this feature becomes as essential as we think, you can be sure the next gen of GPUs, or even CPUs, will have a way to solve this to some degree.

In the same way I always say the optimization you find in console games is not feasible on PC, it's also true that you can't say when PC hardware will reach its limits. But when we're talking only about compression, if I had to make a game for PC I would worry less than on consoles about the space available to future users, so I'd rather spend that decompression power on other things.

This new gen is more similar to the Xbox 360/PS3 gen than to the PS4/Xbox One gen, because the PC requirements are increasing by a decent amount in many areas. It's just that PC gamers who started around 2012 aren't used to this. Right now the hardware situation on PC is horrible, so I don't know whether most PC gamers will have RT/mesh shaders even by the end of 2022.

So PC gaming has bigger problems than file sizes. I'm also a PC gamer, but unlike some in the community I understand that the majority of people don't have a 5950X with a 3090 and 32 GB of RAM, and I'm old enough to know that sometimes games ramp up their requirements a lot.
 
Last edited:

ethomaz

Banned
Ok, I only partially agree with you. Of course you won't use 99% of your CPU for this job, but as I told you before, your GPU can do it, at least for Oodle compression, and that doesn't necessarily mean using 99% of a GTX 1060.

First, I'm not saying PC games will have the same level of compression as PS5 this year, but that doesn't mean it won't happen in 4-5 years. Also, if this feature becomes as essential as we think, you can be sure the next gen of GPUs, or even CPUs, will have a way to solve this to some degree.

In the same way I always say the optimization you find in console games is not feasible on PC, it's also true that you can't say when PC hardware will reach its limits. But when we're talking only about compression, if I had to make a game for PC I would worry less than on consoles about the space available to future users, so I'd rather spend that decompression power on other things.

This new gen is more similar to the Xbox 360/PS3 gen than to the PS4/Xbox One gen, because the PC requirements are increasing by a decent amount in many areas. It's just that PC gamers who started around 2012 aren't used to this. Right now the hardware situation on PC is horrible, so I don't know whether most PC gamers will have RT/mesh shaders even by the end of 2022.

So PC gaming has bigger problems than file sizes. I'm also a PC gamer, but unlike some in the community I understand that the majority of people don't have a 5950X with a 3090 and 32 GB of RAM, and I'm old enough to know that sometimes games ramp up their requirements a lot.
Just to add a bit more... there are plans, or at least talks, among manufacturers to include dedicated decompression units in future processors or chipsets... it is not clear where they would go.

But it is like the FPU, which used to be a separate dedicated chip outside the CPU that only some systems had; after the Pentium, I believe, it was included in all CPUs... it will take some years for dedicated decompression units to make it into new chips.
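Until such units ship, that inflate step runs in software at load time. A minimal Python sketch of the software path, using the standard-library zlib as a stand-in for Oodle-style codecs (the payload and the ratio it achieves are illustrative, not representative of real game data):

```python
import zlib

# Hypothetical asset payload: repetitive data that compresses well,
# standing in for the texture/geometry streams discussed above.
raw = b"normal-map-block-" * 4096

# Compressed once at build time...
packed = zlib.compress(raw, 9)

# ...but every load pays the CPU cost of inflating it again.
# This is the work a dedicated decompression unit would offload.
unpacked = zlib.decompress(packed)

assert unpacked == raw
print(f"raw: {len(raw)} bytes, packed: {len(packed)} bytes "
      f"({len(packed) / len(raw):.1%} of original)")
```

The same shape holds for any codec: the compression ratio decides install size, and the inflate cost decides how much load-time work lands on the CPU, GPU, or a dedicated unit.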
 
Last edited:

Elysion

Banned
To answer both your questions: Yes, because there's an optimized library for computing GEMMs on SPEs somewhere collecting dust at Sony HQ.

Wow, thanks for the detailed reply! It really seems like the Cell was ahead of its time. Considering the rising importance of AI accelerators these days, this might be one of the biggest missed opportunities in Sony's history. If they had stuck with Cell, and kept working on and improving the architecture, they might've been far ahead of most other hardware companies in this field today thanks to their early start, and could be dominating the market for AI accelerators the same way they dominate the image sensor market. There would be Cell processors everywhere: in cars, TVs, mobile hardware, data centers etc (which is what Kutaragi's ultimate plan was, if I remember correctly). Not to mention what it would've meant for the capabilities of the PS4. Imagine if the PS4 was able to render games internally at only 540p or so and use some DLSS equivalent to upscale all the way to 1080p (or 1080p to 4K for the Pro), using the freed-up hardware power for much better lighting, effects, shaders etc. The games would've looked mindblowing.

An even better question is if something similar would’ve been true for the PS3, if developers could’ve used DLSS back then without sacrificing too much in other aspects (since many games used the SPEs for all kinds of stuff). It would be fascinating to know if someone with access to an AI upscaling algorithm and an old PS3 devkit could implement some kind of DLSS equivalent. Might be a project for the homebrew community?
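For context, the GEMM operation such an SPE library would implement is easy to state; here is a deliberately naive pure-Python version (real libraries tile, vectorize, and parallelize this heavily, and the matrices here are toy examples). Neural-network upscalers like DLSS spend most of their time in exactly this kernel:

```python
def gemm(alpha, A, B, beta, C):
    """General matrix multiply: C <- alpha * (A @ B) + beta * C.
    Plain triple loop; an SPE or GPU library would block and vectorize this."""
    n, k = len(A), len(B)
    m = len(B[0])
    for i in range(n):
        for j in range(m):
            acc = 0.0
            for p in range(k):
                acc += A[i][p] * B[p][j]
            C[i][j] = alpha * acc + beta * C[i][j]
    return C

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[5.0, 6.0], [7.0, 8.0]]
C = [[0.0, 0.0], [0.0, 0.0]]
gemm(1.0, A, B, 0.0, C)  # with alpha=1, beta=0 this is a plain A @ B
```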
 
Last edited:
CD Projekt is going to release a next-gen patch for PS5 and Xbox Series X

 
What they have shown so far looks nice, but not eyeball-melting.

61MxJCG.gif


Definitely looks better than Ratchet on PS4.

9ct1HgY.gif


But I'm not seeing a generational leap. Maybe if they had changed the art style to something more realistic. It looks busier and there is more destruction, but overall I'm not blown away by the graphics.
OK, but in my opinion it looks incredible for a game that will be released less than a year after the console's launch.

But Snake, correct me if I am wrong, but I am sure I am not the first person to tell you that you have unrealistic expectations.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Early last year build?
Looked pretty polished to me. For some reason they are going with native 4K, or close to native 4K with temporal injection. That's leaving a lot of GPU power on the table.

It will let them do 1440p 60 fps though, so that's going to be the default mode to play. I was hoping for 1440p 30 fps.
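A quick back-of-the-envelope comparison of those two modes, counting raw pixels per second only (shading cost does not scale perfectly linearly with resolution, so treat this as rough):

```python
# Rough pixels-per-second budget for the modes discussed above.
modes = {
    "native 4K @ 30": (3840, 2160, 30),
    "1440p @ 60":     (2560, 1440, 60),
}
rates = {name: w * h * fps for name, (w, h, fps) in modes.items()}
for name, rate in rates.items():
    print(f"{name}: {rate / 1e6:.0f} Mpix/s")
```

By pure pixel count, 1440p at 60 fps actually pushes slightly fewer pixels per second than native 4K at 30, which is why a 60 fps mode at that resolution is plausible on the same GPU budget.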
 

SlimySnake

Flashless at the Golden Globes
OK, but in my opinion it looks incredible for a game that will be released less than a year after the console's launch.

But Snake, correct me if I am wrong, but I am sure I am not the first person to tell you that you have unrealistic expectations.
Maybe. But the Insomniac dev is the guy telling everyone that their eyes will bleed, their minds will be blown, their pussies will get wet. It's not me who is setting unrealistic expectations here.
 
Maybe. But the Insomniac dev is the guy telling everyone that their eyes will bleed, their minds will be blown, their pussies will get wet. It's not me who is setting unrealistic expectations here.
Setting aside your opinion of this game, this is not the first time you have said you are disappointed with a game's graphics.

And if people like me agree that this game looks gorgeous, then he is not lying or anything. Look, if for you it is not enough and you think you have good arguments to say so, OK then.

If you expect to see a jump like the one from PS2/GameCube/Xbox to Xbox 360/PS3, that will not happen.
 
Last edited:

sncvsrtoip

Member
Maybe. But the Insomniac dev is the guy telling everyone that their eyes will bleed, their minds will be blown, their pussies will get wet. It's not me who is setting unrealistic expectations here.
I'm not blown away either, but how much better can this type of graphics look? For sure it looks great and better than PS4 Ratchet.
 
Last edited:
I'm not blown away either, but how much better can graphics of this type look? For sure it looks great and better than PS4 Ratchet.
Not only that: when someone says this doesn't look great, it means that person is thinking of a game which looks better, or maybe is only imagining CGI, or simply doesn't like this kind of graphics, which is understandable as a matter of personal taste.
 

jroc74

Phone reception is more important to me than human rights
You are right, but you need to think about something else.

Even with gamers who have SSDs, you can't avoid the duplication of data on PC, because there will be gamers with HDDs and the dev can't make two separate versions for SSD and HDD... so they ship the one with duplicated assets, which allows both types of players to have the game.
The same can be said about decompression... PCs don't have hardware units for that, so the PC game has to be shipped without these heavily compressed assets.

Now compare the size of the versions:

PC, Series, Xbox, PS4, etc.: XX GB
PS5: 40% of XX GB

You see where the removal of the duplication and the use of heavy compression is happening? PS5.

That is partly explained by the nature of the GDK, whose goal is to let devs build the same version across all Xbox devices and PC... so if one or more pieces of hardware can't drop the duplication and can't do the heavy decompression, then those versions ship with the duplication and little to no compression.

That is the whole difference between multiplatform tools and hardware-specific tools.

With multiplatform tools, the dev needs to opt into specific changes for specific hardware... most devs won't take, or even have, that time... most devs will choose what is common to all platforms.

That changes with hardware-specific tools... you are already having to do the port, so why not use the tools' available features?
This makes a lot of sense.
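The "40% of XX GB" comparison above can be expressed as a toy model: a duplication factor for seek-friendly HDD layouts times a compression ratio. All numbers below are made up for illustration, not measured from any real game:

```python
# Toy install-size model: unique assets, duplicated for seek-friendly
# HDD layouts, then compressed. All factors are illustrative.
def install_size(unique_gb, duplication_factor, compression_ratio):
    return unique_gb * duplication_factor * compression_ratio

base = 50.0                               # unique asset data, GB (made up)
hdd_build = install_size(base, 1.3, 1.0)  # duplicated, uncompressed
ps5_build = install_size(base, 1.0, 0.6)  # deduplicated, Oodle-style ratio
print(f"HDD-style build: {hdd_build:.0f} GB, "
      f"PS5-style build: {ps5_build:.0f} GB "
      f"({ps5_build / hdd_build:.0%} of the HDD-style size)")
```

With these made-up factors the deduplicated, compressed build lands in the same rough ballpark as the "40% of XX GB" deltas the post describes; the point is that the two savings multiply.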

Again, unless you can point to features the XSS actually lacks, it isn't gimping anything. Your slippery-slope argument is equally silly, because the XSS has the same feature set as the XSX, so the only thing compromised is graphical features. Which is by design.

If you feel that the PS5, because it has no low-end option, will set the standard, then I fully expect all PS5 games to blow away Xbox games. That already isn't true, but who knows what's in store for the future. I have a feeling that both consoles will have great-looking titles. Gears 5 looks just as good as Returnal, and that is current gen vs last.

You are wrong about the XSS being barely more capable than the X1. On the CPU alone there is a generational difference. You might not know this, but its CPU has the same performance as the mighty PS5's, only again missing the graphical horsepower. There is no evidence the XSS will hold anything back, but it certainly won't win any graphics performance awards.

You are free, like anyone else including me, to not buy the XSS, but its existence is good for the industry, and as a PS5 owner it does not affect you at all. God of War, Horizon, and The Last of Us will still be nice games and won't have any XSS cooties on them, so you can rest easy.


And MS righted the ship with the XSX|S. The Xbox launched at the same time as the GameCube and outsold it. MS added more studios, beefed up their backward compatibility, and added tons of games to Game Pass. Giving Nintendo a mulligan while holding MS's feet to the fire is unfair, to say the least. At least MS honored their current customers and continues to support the X1 to this day; Nintendo abandoned Wii U customers, making them repurchase titles a second time on Switch. This is not a practice we should cheer, but no one had a problem with that. This is just as sad as making fun of MS for supporting back compatibility while honoring Sony for 'believing in generations'. It is what it is.


You have to be joking, man. You, the guy who 'predicted' Kinect's removal in 2021, talking about things that are stupid? You brought up how dominant the PS4 and PS2 were, but I didn't see you mention the PS3. Why is that? Didn't it dominate the Xbox, especially in the US?

I'll celebrate hard the day you don't make a disingenuous argument about the Xbox platform. I am still laughing at your prediction that the XSS will not have performance improvements like every other console in history, including the mighty PS3, which you seem to have forgotten. This will be a very interesting generation indeed. Looking forward to seeing how things play out. 😊
....Nintendo righted the ship....in the same console generation.....the exact same generation the Wii U launched in, which you felt the need to bring up only to say the XBO outsold it while conveniently leaving out the Switch.

You are talking about MS righting the ship with their next-gen consoles.

?

So....if that's the case....you must think the One X was a failed experiment.

Sorry, but you are kind of all over the place. Talking about holding MS's feet to the fire while at the same time dismissing the hell of a comeback Nintendo made is beyond wild.

You are talking about honoring customers, but at the end of the day the main goal for any of these companies is making money. MS chose one way, Nintendo chose another.

MS is honoring customers with XBO consoles....and yet the XBO consoles are all discontinued.

Stop thinking these companies do things out of the goodness of their hearts. MS and Nintendo made the choices that are best for their companies. Honestly, way to bring in a lot of unrelated stuff. This entire thing was about how you felt the need to dismiss the Switch by saying the XBO outsold the Wii U and leaving it at that.
 
Last edited:

Imtjnotu

Member
Looked pretty polished to me. For some reason they are going with native 4K, or close to native 4K with temporal injection. That's leaving a lot of GPU power on the table.

It will let them do 1440p 60 fps though, so that's going to be the default mode to play. I was hoping for 1440p 30 fps.
Who cares what last year's early build looks like?
 

Nikana

Go Go Neo Rangers!
What they have shown so far looks nice, but not eyeball-melting.

61MxJCG.gif


Definitely looks better than Ratchet on PS4.

9ct1HgY.gif


But I'm not seeing a generational leap. Maybe if they had changed the art style to something more realistic. It looks busier and there is more destruction, but overall I'm not blown away by the graphics.

Gonna have to disagree. From an IQ standpoint it's not a night-and-day difference, but the complexity of everything on screen is vastly superior in Rift Apart.
 
They need to stop talking about what they "plan to do" and start taking more action ....

We are months out from Cyberpunk 2077's launch and we still have no release date on the next gen update for the game or Witcher 3's next gen update.

How many different "roadmaps" are they going to release at this point?
An under-pressure studio that failed to hit critical release deadlines puts itself under more pressure with yet more roadmaps and critical deadlines.

But this is CDPR so it checks out.
 
Last edited:

dr guildo

Member
What they have shown so far looks nice, but not eyeball-melting.

61MxJCG.gif


Definitely looks better than Ratchet on PS4.

9ct1HgY.gif


But I'm not seeing a generational leap. Maybe if they had changed the art style to something more realistic. It looks busier and there is more destruction, but overall I'm not blown away by the graphics.
Photomode will tell us that...
 

ethomaz

Banned
What they have shown so far looks nice, but not eyeball-melting.

61MxJCG.gif


Definitely looks better than Ratchet on PS4.

9ct1HgY.gif


But I'm not seeing a generational leap. Maybe if they had changed the art style to something more realistic. It looks busier and there is more destruction, but overall I'm not blown away by the graphics.
These two gifs aren't even in the same category to me.
They are night-and-day different in terms of graphics.

And it is not just the amount of particle effects being used... the characters are drastically different in terms of detail and animation.
It doesn't even feel like the same game or the same character.
 
Last edited: