Graphical Fidelity I Expect This Gen

Why don't we think they'll do what third-party studios are currently doing when porting their games to Switch 2, like Resident Evil Requiem or Cronos, and there are even rumors about GTA 6 coming to it? A PS6 portable with a far better GPU than the Switch 2, upscaled 2K resolution, and an RDNA 5 based GPU would be architecturally far ahead even if its raw performance is ~50% of a PS5. In the end, what I want to say is that a portable PS6 won't cause a problem, considering it won't be that hard to run PS6 games at a far lower resolution and without some techniques like path tracing.
I love your optimism, but they are still forced to develop with a tablet CPU in mind.
 
PS6 will have tech (Neural Arrays) similar to Nvidia's high-end datacenter GPUs, not regular gaming RTX GPUs
"Neural arrays" is basically "tensor cores". Present in Nvidia hardware since Turing, with some rendering-oriented optimizations in Blackwell.

and a new hardware compression tech not yet present on Nvidia GPUs
We don't know what is and isn't present inside Nvidia GPUs. This is a completely internal thing which they may have had for years as well.

From what AMD and Sony said yesterday pretty much everything is already in GeForce. They are catching up though which is a good thing.
 
"Almost certainly" based on what?
The considerable reduction in the improvements smaller nodes are offering, as well as the increasing cost of producing these chips. They cannot continue raising prices and TDPs indefinitely if they expect these GPUs to actually sell.

Hardware compression was shown in the video in my previous post, working on a 5090. And I'm not sure bandwidth would be a problem on PC.
My takeaway was that this is similar to delta color compression but applied to the entire GPU, as opposed to just the graphics queue. So work done in the compute queue, for example, will now benefit. We have no idea whether this already exists in Nvidia GPUs, since they stopped giving useful information about how their hardware actually works almost a decade ago.
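For reference, delta color compression works by storing one anchor value per tile plus small per-pixel deltas, so smooth regions need far fewer bits than raw data. A minimal Python sketch of the idea (a toy tile format for illustration only, not any vendor's actual scheme):

```python
def compress_tile(tile):
    """Delta-encode a tile of 8-bit values against its first pixel.

    Returns (anchor, deltas, bits_per_delta). When neighboring pixels
    are similar, the deltas fit in far fewer bits than raw 8bpp.
    """
    anchor = tile[0]
    deltas = [p - anchor for p in tile]
    max_mag = max(abs(d) for d in deltas)
    # Bits needed to store a signed delta of this magnitude.
    bits = max(1, max_mag.bit_length() + 1)
    return anchor, deltas, bits

def decompress_tile(anchor, deltas):
    # Lossless round trip: anchor plus per-pixel delta.
    return [anchor + d for d in deltas]

# A smooth 4-pixel strip: deltas fit in 3 bits instead of 8.
tile = [200, 201, 203, 202]
anchor, deltas, bits = compress_tile(tile)
assert decompress_tile(anchor, deltas) == tile
print(f"raw: {8 * len(tile)} bits, compressed: ~{8 + bits * len(tile)} bits")
```

The bandwidth saving comes from the memory controller reading and writing the smaller encoded form; the "applied everywhere vs. graphics queue only" distinction discussed above is about which hardware paths get that transparent encode/decode.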
 
"Neural arrays" is basically "tensor cores". Present in Nvidia hardware since Turing, with some rendering-oriented optimizations in Blackwell.


We don't know what is and isn't present inside Nvidia GPUs. This is a completely internal thing which they may have had for years as well.

From what AMD and Sony said yesterday pretty much everything is already in GeForce. They are catching up though which is a good thing.
KeplerL2 comments that the PS6 Neural Array tech is not present on Nvidia gaming GPUs 👇
[screenshot: srqVN3EGqPeLSVgL.jpg]
 
I posted his comments above
I saw this, but it doesn't mean it would be a game changer vs Nvidia (though it absolutely is vs the PS5). One of the selling points of the PS5 was its fast SSD and Kraken, and five years later Ghost of Yotei is the most graphically advanced PS5 game..

The considerable reduction in improvements smaller nodes are offering as well as the increasing cost to produce these chips.
What you're talking about is more about rasterization. Rasterization as a driver of progress in graphics is practically dead. More specialized cores will provide a significant increase in path tracing and ml features, that's why Cerny talked exclusively about them in the video.
 
I saw this, but it doesn't mean it would be a game changer vs Nvidia (though it absolutely is vs the PS5). One of the selling points of the PS5 was its fast SSD and Kraken, and five years later Ghost of Yotei is the most graphically advanced PS5 game..


What you're talking about is more about rasterization. Rasterization as a driver of progress in graphics is practically dead. More specialized cores will provide a significant increase in path tracing and ml features, that's why Cerny talked exclusively about them in the video.
My main point above is not that the PS6 will challenge Nvidia or PC; I'm saying the PS6 will be a great upgrade, with far better investment in new tech than the PS5 had at the time of its release
 
What you're talking about is more about rasterization. Rasterization as a driver of progress in graphics is practically dead. More specialized cores will provide a significant increase in path tracing and ml features, that's why Cerny talked exclusively about them in the video.
It applies to everything. The issue you're referring to WRT rasterization is the inability to efficiently calculate accurate light transport due to the limits of how rasterization functions. Performance falls off a cliff and actually becomes slower than RT. That is a completely separate issue from hardware performance scaling. Just look at Nvidia's AI chips: the architecture stays very similar, and they rely on bigger chips, more power, and lower-precision math to keep offering meaningful performance improvements. They already have cores for basic math, special-function math, textures, geometry, RT, matrix math, etc. What more can they move to fixed function without severely limiting programmability?
 
Not enough love for this game here ;d It can be very underwhelming with the geometry of assets like rocks, etc., but when it all works it looks stunning. Also one of the best PSSR implementations I've seen, plus nice HDR (while most UE5 games have it broken for some reason).

The HDR in this game is truly beautiful.
It's so much fun to gallop around on horseback.
White rocks with chamfered edges are often seen in scenic spots in Japan. In Yotei, the rocks can look unbalanced when placed among the highly realistic vegetation.

Photos of scenic spots in Japan:
[photo: rlnZljDf_o.jpg]

Ghost of Yotei (PS5 Pro, RT Pro mode), mountain and forest scenery:
[screenshot: p6G3OCQy_o.jpg]
 
No, it's AMD's version of Hopper's Thread Block Clusters/Distributed Shared Memory
It has that too, but "Neural arrays" really sounds just like "tensor cores", i.e. a dedicated array of MM ALUs (as opposed to an array of MM ALUs inside the SIMDs in RDNA4). How it operates with caches, LDS (i.e. "Thread Cluster/Distributed Shared Memory"), and memory is secondary to what it is.

Which is interesting because while NVIDIA has had it for a while they never brought it to gaming GPUs.
Well it likely has no benefits for gaming code which is why it's DC exclusive.
 
My main point above is not that the PS6 will challenge Nvidia or PC; I'm saying the PS6 will be a great upgrade, with far better investment in new tech than the PS5 had at the time of its release
I'm not really talking about the challenge either; I'm saying that we need an even more powerful PS6 than we're currently discussing.

The issue you're referring to WRT rasterization is the inability to efficiently calculate accurate light transport due to the limits of how rasterization functions.
No, I'm saying that the main increase will come from the number of specialized cores, and I don't see a problem with adding them to future cards. I'm not too concerned about stagnation in rasterization; in a few years, half or more of AAA games will be based on ray tracing. I think the PS6 will have this from the start.
 
It has that too, but "Neural arrays" really sounds just like "tensor cores", i.e. a dedicated array of MM ALUs (as opposed to an array of MM ALUs inside the SIMDs in RDNA4). How it operates with caches, LDS (i.e. "Thread Cluster/Distributed Shared Memory"), and memory is secondary to what it is.


Well it likely has no benefits for gaming code which is why it's DC exclusive.
AMD has had matrix cores for ages, dude; this specific feature is sharing LDS across WGPs to process larger workgroup clusters.
 
"Matrix cores" are not tensor cores. What AMD had, and has now, are matrix-multiplication ALUs inside the main SIMD processors. "Neural arrays" sounds a lot like tensor cores, and as I've said
It's literally the same thing. The fact that AMD chooses not to have dedicated execution ports on their gaming uarchs is just area optimization.
This isn't "arrays", it's memory hierarchy.
It's an array of matrix cores working together; what's so hard to understand?
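Whatever the naming, the practical benefit of matrix units cooperating as an array is operand reuse: a tile of one matrix can be fetched into shared memory once and consumed by every unit, instead of once per unit. A toy Python model of that fetch saving (the unit count and fetch accounting here are illustrative assumptions, not any vendor's real design):

```python
def matmul_with_tile_sharing(A, B, units=2):
    """Toy model: 'units' matrix units each compute a block of rows of
    A @ B. Each column of B is fetched once into a shared buffer and
    reused by every unit, modeling an array of matrix cores sharing LDS.

    Returns (result, shared_fetches, independent_fetches), comparing
    one-fetch-per-tile sharing against each unit fetching on its own.
    """
    n = len(A)
    result = [[0] * n for _ in range(n)]
    for j in range(n):
        # One shared fetch of B's column j, visible to all units.
        col = [B[k][j] for k in range(n)]
        for u in range(units):
            # Each unit handles its own block of rows.
            for i in range(u * n // units, (u + 1) * n // units):
                result[i][j] = sum(A[i][k] * col[k] for k in range(n))
    shared_fetches = n            # fetched once, reused by all units
    independent_fetches = n * units  # if every unit fetched for itself
    return result, shared_fetches, independent_fetches

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C, shared, indep = matmul_with_tile_sharing(A, B, units=2)
assert C == [[19, 22], [43, 50]]
print(f"B fetches: {shared} shared vs {indep} independent")
```

The gap widens with the number of cooperating units, which is why sharing operand data across a cluster matters more as the array gets wider.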
 
I'm not really talking about the challenge either; I'm saying that we need an even more powerful PS6 than we're currently discussing.

Depends on what the design objective is. From the statements made by Cerny and AMD, it's clear they're trying to develop a console that can handle path tracing at reasonably good image quality, and likely 60 FPS.

Judging by the feature set, that's where they're headed. Could it be more? Sure, but that's always been an argument with consoles; this does seem to be enough.
 
Around 6x

Omg, this huge difference will affect the quality of games released on both systems. GymWolf, you were right; I thought the difference would be 3-4x overall.

Looks like my crude math was on point a few months ago:


You see, Topher? I'm learning when it comes to quoting myself from the past. Still not on onQ123's level...
 
Looks like my crude math was on point a few months ago:


You see, Topher? I'm learning when it comes to quoting myself from the past. Still not on onQ123's level...
Half a PS5 while in portable mode is insane, bearing in mind the Switch 2 in portable mode is half a ham sandwich. I wonder if those metrics are for some kind of docked mode. And at 1/3 of the PS5's bandwidth, that thing isn't going to run anything but pared-back PS5 games; no chance in hell it can run pared-back PS6 games.
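For rough context on the bandwidth point: the PS5's memory system delivers 448 GB/s, so a third of that is roughly 149 GB/s. A quick back-of-the-envelope in Python (the 1/3 figure is the leak cited above, not a confirmed spec, and the per-frame framing is just illustrative):

```python
PS5_BW_GBPS = 448.0            # PS5's known memory bandwidth
portable_bw = PS5_BW_GBPS / 3  # the "1/3 of PS5 bandwidth" claim above

# Illustrative: bandwidth available per frame at a 60 FPS target.
per_frame_gb = portable_bw / 60
print(f"portable: ~{portable_bw:.0f} GB/s, ~{per_frame_gb:.2f} GB per 60 FPS frame")
```

Whether that is enough depends heavily on resolution and on how effective the new hardware compression turns out to be, since compression effectively multiplies usable bandwidth.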
 
I can't believe they're going to release a weak handheld alongside the console next gen (weak compared to the console, that is). This sounds worse than the Series S. I hope it doesn't hold back game developers' ambitions 😟.
 
Wait a second, we're not going to see a PS6 for at least another 3-4 years, and we can't even hope for 5080-level power?
Holiday 2027 launch, yet it will be weaker than a 5080, which means substantially weaker than the RTX 4090, which launched in 2022 and will by then be a 5-year-old GPU. The writing is on the wall if the PS6 only draws 160 W like the leaks say.

Of course it will still be substantially stronger in raster (and even more so in RT/AI upscaling) than the pathetically weak PS5 Pro, but by then we will already have 60-series cards from Nvidia, so very likely even a midrange RTX 6070 (likely $800-1k street price) will be stronger than a brand-new PS6 in every possible aspect.

It makes total sense, especially if we assume the smaller/weaker PS6 is made to cost at most $499-599 (with Xbox raising the white flag, Sony doesn't even really need to try that hard, unfortunately).
 
Holiday 2027 launch, yet it will be weaker than a 5080, which means substantially weaker than the RTX 4090, which launched in 2022 and will by then be a 5-year-old GPU. The writing is on the wall if the PS6 only draws 160 W like the leaks say.

Of course it will still be substantially stronger in raster (and even more so in RT/AI upscaling) than the pathetically weak PS5 Pro, but by then we will already have 60-series cards from Nvidia, so very likely even a midrange RTX 6070 (likely $800-1k street price) will be stronger than a brand-new PS6 in every possible aspect.

It makes total sense, especially if we assume the smaller/weaker PS6 is made to cost at most $499-599 (with Xbox raising the white flag, Sony doesn't even really need to try that hard, unfortunately).
From my perspective, the two most important things on the PS6 are ray tracing performance, which is rumored to be on par with the RTX 5090 (not the 5080), and advanced ML tech, which is completely new on AMD GPUs. I know raster performance and raw teraflops are important, but if the PS6 has dedicated cores for ray/path tracing comparable in performance to the RTX 5090's RT cores, that's completely fine by me considering its suggested $500-600 price. The only downside I see in those leaks is the PS6 portable.
 
I like how we are discussing features like neural rendering and path tracing when Sony's own first-party studios couldn't be bothered to use the existing mesh/primitive shaders, ray tracing, and other hardware features on the PS5 in its first five years.

I have been resigned to a smaller upgrade this time around because costs have risen, and the profit-first mentality of Sony, Nintendo, and Microsoft means more expensive consoles with weaker specs. But if you look at it like a mid-gen upgrade, I think it will be fine. Most devs will be adding path tracing to their games releasing in 2027 and beyond, and it would be good to have a console that can run them at decent resolutions.

I am OK with it, just not expecting big generational leaps anymore. Let's just get UE5 working well, and maybe if we are lucky we will get Marvel 1943 visuals in every game next gen.
 
I am OK with it, just not expecting big generational leaps anymore. Let's just get UE5 working well, and maybe if we are lucky we will get Marvel 1943 visuals in every game next gen.
I really wanna see GTA 6 on PS5. We will be able to get a similar level of fidelity from proper next-gen games (obviously at a way smaller scale, since GTA 6 will be an open-world juggernaut with crazy attention to detail thanks to its huge budget and dev time), so maybe on PS6 / mid-spec PCs around 2030, or 2032 at the latest.
 
I like how we are discussing features like neural rendering and path tracing when Sony's own first-party studios couldn't be bothered to use the existing mesh/primitive shaders, ray tracing, and other hardware features on the PS5 in its first five years.

I have been resigned to a smaller upgrade this time around because costs have risen, and the profit-first mentality of Sony, Nintendo, and Microsoft means more expensive consoles with weaker specs. But if you look at it like a mid-gen upgrade, I think it will be fine. Most devs will be adding path tracing to their games releasing in 2027 and beyond, and it would be good to have a console that can run them at decent resolutions.

I am OK with it, just not expecting big generational leaps anymore. Let's just get UE5 working well, and maybe if we are lucky we will get Marvel 1943 visuals in every game next gen.
I'm not okay with it, I want generational leaps...

 