Graphical Fidelity I Expect This Gen

We are on a graphics thread, so it's logical that we only talk about graphics here.

Yes, and that's what should be discussed. But that doesn't stop us from questioning certain senseless comments.
Outlandish comparisons like "this looks like a PS3" sound more like a bunch of people spouting random nonsense. Or thinking that graphics are all about super-fidelity, when they're actually about countless other factors.
 
Explain to us what FLOPS are and how they influence the look of a 3D game.
Learn it yourself if you want to partake in graphics discussions. Maybe you are 12 and this is your first time on the internet, and that's ok, but I'm not going to waste my time talking about something that is literally the basic foundation of what we discuss here.
 
Learn it yourself if you want to partake in graphics discussions. Maybe you are 12 and this is your first time on the internet, and that's ok, but I'm not going to waste my time talking about something that is literally the basic foundation of what we discuss here.
Just let AI answer him; here's a quick and detailed answer:
(Teraflops, which stands for "trillion floating-point operations per second," is a measure of a computer's processing power, particularly its ability to perform complex mathematical calculations. In the context of graphics, more teraflops generally translate to better performance and more visually impressive graphics in games. This is because graphics rendering relies heavily on floating-point operations to calculate lighting, textures, and other visual elements.
Here's a more detailed breakdown:
How Teraflops affect graphics:
Enhanced Visuals:
More teraflops allow for more complex and detailed scenes to be rendered, leading to higher resolutions, more realistic lighting effects, and more detailed textures.
Smoother Gameplay:
A higher teraflop count enables the GPU to render frames faster, resulting in smoother gameplay with higher frame rates and less stuttering.
Complex Simulations:
Teraflops are also crucial for physics simulations within games, such as realistic particle effects, cloth physics, and more.
Ray Tracing:
Ray tracing, a technique that simulates the behavior of light to create incredibly realistic visuals, requires significant computational power. Higher teraflop counts are essential for enabling ray tracing in games.)
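To make the number concrete, here's a minimal sketch of where a headline TFLOPS figure comes from. The standard back-of-envelope formula is shader cores x clock x 2 FLOPs per cycle (one fused multiply-add); the core count and clock below are the PS5's publicly stated specs.

```python
# Rough sketch: theoretical peak FP32 TFLOPS for a GPU.
# Formula: shader cores x clock (GHz) x 2 ops/cycle (one FMA) / 1000.

def peak_tflops(shader_cores: int, clock_ghz: float, ops_per_cycle: int = 2) -> float:
    """Theoretical peak FP32 throughput in teraflops."""
    return shader_cores * clock_ghz * ops_per_cycle / 1000.0

# PS5 GPU: 2304 shader cores at up to 2.23 GHz
print(round(peak_tflops(2304, 2.23), 2))  # ~10.28 TFLOPS
```

Note this is a theoretical peak; real games never sustain it, which is part of why raw TFLOPS comparisons only go so far.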
 
The PS5 generation feels like a wet fart.
Why do people keep saying this when we have seen graphical stunners released this gen from 2023 onwards?

The problem is Sony's studios, who have shown time and time again with GOW, GT7, Spider-Man 2, Astro Bot, DS2, and now GoY that they are ok with making cross-gen-looking trash.

The rest of the industry is literally a generation ahead of them and has been for a while. HFW literally came out 3.5 years ago, and that was the last time Sony was on top.
 
Yes, and that's what should be discussed. But that doesn't stop us from questioning certain senseless comments.
Outlandish comparisons like "this looks like a PS3" sound more like a bunch of people spouting random nonsense. Or thinking that graphics are all about super-fidelity, when they're actually about countless other factors.
I know comparing it to PS3 is a bit much, but those lazy developers have disappointed many here in terms of graphics.
 
Remember when we had whole trilogies on Xbox 360? Three Gears games, three Mass Effects, three Uncharteds, etc. No one complained about them looking very similar back then. This is just nonsense.
My guy, this is a bizarre comment and a major brainfart. You're losing the plot.
 
And obviously, Mafia looks stunning at times, but that particular screenshot is awful. The whole point of lumen is to ensure that graphics never look that bad under any lighting condition.
That'd be the point of proper ray-traced lighting/GI, which is what devs use when they care about quality. Lumen mainly exists to make development easy and fast, at the expense of quality and ambition.

As it stands, using Lumen (both SW and HW) is a sign to me that the game will have low-quality effects and cut corners. So far only the Wukong devs did a decent job with it, and even there it looks bad compared to full RT, which is kind of obvious, but still.
 
Learn it yourself if you want to partake in graphics discussions. Maybe you are 12 and this is your first time on the internet, and thats ok, but im not going to waste my time talking about something that is literally the basic foundation of what we discuss here.

¯\_(ツ)_/¯

If it's so basic, why did you avoid the question and not answer it?
 
Well, we are not talking about the average PS3 game, which, yes, looks like shit in 2025. Top-tier games like Uncharted, Killzone, Halo 4, Beyond, GOW3, and GTA5? That's where the comparisons become a bit more accurate.

And yes, YouTube and GIFs help those games as well. But some of these games coming out don't even look as good as some PS4 games, so people will naturally think PS3.

And obviously, Mafia looks stunning at times, but that particular screenshot is awful. The whole point of lumen is to ensure that graphics never look that bad under any lighting condition.
Also, it's not like the statements are necessarily inaccurate. These were also some of the last games to release on the system, the games most people likely even remember from the PS3 at this point.
 
That'd be the point of proper ray-traced lighting/GI, which is what devs use when they care about quality. Lumen mainly exists to make development easy and fast, at the expense of quality and ambition.

As it stands, using Lumen (both SW and HW) is a sign to me that the game will have low-quality effects and cut corners. So far only the Wukong devs did a decent job with it, and even there it looks bad compared to full RT, which is kind of obvious, but still.
I don't know, man. I have been very happy with both software and hardware Lumen's indirect lighting. Mafia just seems like an outlier.

I love the way Lumen handles sunlight peeking in and still lighting walls, and provides great AO coverage, properly shadowing areas that should not be lit by an outside light source. You don't see that in other last-gen games like TLOU2, which still looks jaw-dropping outdoors in direct light but then begins to fall apart with indirect lighting. I am seeing the same thing in Mafia, which is super weird since it's supposed to be using Lumen.

mykOlKu.gif


NPsl3Hf.gif




EBLoO62.jpeg


D0pZnPY.jpeg
 
¯\_(ツ)_/¯

If it's so basic, why did you avoid the question and not answer it?
Here's a quick AI answer to your question, if you asked because you wanted to learn something:
(Teraflops, which stands for "trillion floating-point operations per second," is a measure of a computer's processing power, particularly its ability to perform complex mathematical calculations. In the context of graphics, more teraflops generally translate to better performance and more visually impressive graphics in games. This is because graphics rendering relies heavily on floating-point operations to calculate lighting, textures, and other visual elements.

Here's a more detailed breakdown:

How Teraflops affect graphics:

Enhanced Visuals:

More teraflops allow for more complex and detailed scenes to be rendered, leading to higher resolutions, more realistic lighting effects, and more detailed textures.

Smoother Gameplay:

A higher teraflop count enables the GPU to render frames faster, resulting in smoother gameplay with higher frame rates and less stuttering.

Complex Simulations:

Teraflops are also crucial for physics simulations within games, such as realistic particle effects, cloth physics, and more.

Ray Tracing:

Ray tracing, a technique that simulates the behavior of light to create incredibly realistic visuals, requires significant computational power. Higher teraflop counts are essential for enabling ray tracing in games.)
 
Here's a quick AI answer to your question, if you asked because you wanted to learn something:
(Teraflops, which stands for "trillion floating-point operations per second," is a measure of a computer's processing power, particularly its ability to perform complex mathematical calculations. In the context of graphics, more teraflops generally translate to better performance and more visually impressive graphics in games. This is because graphics rendering relies heavily on floating-point operations to calculate lighting, textures, and other visual elements.

Here's a more detailed breakdown:

How Teraflops affect graphics:

Enhanced Visuals:

More teraflops allow for more complex and detailed scenes to be rendered, leading to higher resolutions, more realistic lighting effects, and more detailed textures.

Smoother Gameplay:

A higher teraflop count enables the GPU to render frames faster, resulting in smoother gameplay with higher frame rates and less stuttering.

Complex Simulations:

Teraflops are also crucial for physics simulations within games, such as realistic particle effects, cloth physics, and more.

Ray Tracing:

Ray tracing, a technique that simulates the behavior of light to create incredibly realistic visuals, requires significant computational power. Higher teraflop counts are essential for enabling ray tracing in games.)
lol Stop wasting your time with this fool.

This is basic knowledge. Anyone telling you that graphics processing power does not correlate with performance is being disingenuous. We literally have 9 generations of evidence and this guy is playing the fool. Zero point giving them air.
 
Or thinking that graphics are all about super-fidelity, when they're actually about countless other factors.
What constitutes next-gen graphics is primarily fidelity; everything else is secondary. You can still make games that are visually appealing with a good art style, but that doesn't mean such games will be regarded as "next gen" if they're still based on old tech. If you're looking for a thread that isn't heavily modern-tech focused, then this thread isn't for you.
 
What constitutes next-gen graphics is primarily fidelity; everything else is secondary. You can still make games that are visually appealing with a good art style, but that doesn't mean such games will be regarded as "next gen" if they're based on old tech. If you're looking for a thread that isn't heavily modern-tech focused, then this thread isn't for you.

Based on what? Where does this definition come from?
Graphics, in games, mean the visual elements displayed on a screen, including everything from character models and environments to textures, lighting, and special effects, etc.
Fidelity doesn't define this; it's just an element like any other. It's not the primary one, because that depends on the objective.

lol Stop wasting your time with this fool.

This is basic knowledge. Anyone telling you that graphics processing power does not correlate with performance is being disingenuous. We literally have 9 generations of evidence and this guy is playing the fool. Zero point giving them air.

Just because I asked a question? LOLOLOLOLOLOL

Imagine how anxious you both would get if you were questioned about every little thing.
 
Based on what? Where does this definition come from?
Graphics, in games, mean the visual elements displayed on a screen, including everything from character models and environments to textures, lighting, and special effects, etc.
Fidelity doesn't define this; it's just an element like any other. It's not the primary one, because that depends on the objective.
You're just here to argue, not to add to the conversation. Maybe you should make a thread of your own and talk about it there. Or you could stay here and get obliterated in replies.
 
Sony is clearly limiting budgets here; they lost so much on Concord and other GaaS projects that they don't want to waste money on tech development.
That would explain why there has been zero investment in RTGI, mesh shader utilization, CPU-heavy physics effects, destruction, and simulation. They simply don't have enough graphics engineers to implement these features, even if the designers and artists might want them.

What it does not explain is why the artists did not bother redoing assets and textures for a completely new sequel on a completely new system. Despite being given five years by Sony. Despite being given a mandate to make a next-gen-only game, something Horizon FW was not afforded. And yet GG's artists did go and update all the trees, all the foliage, all those shrubs, and the ground-level textures. They realized that their indirect lighting solution was absolute garbage in the first game and went out of their way to improve it for the sequel, despite being forced to stick with a baked lighting solution. It was not perfect, and three years on other games have had better indirect lighting solutions, but it was a massive improvement.

DF outlines the improvements made to this in their HFW review. The awful orange lighting from the first game was completely removed. They added more probes, better AO and shadow coverage, and simulated bounce lighting that made indirect lighting look great in some areas. Sucker Punch has not done any work on this. Like, zero. It's like they fired all their programmers and artists after the first game.

Timestamped:


YD6E3yzNn1hOpiF7.jpg
 
Based on what? Where does this definition come from?
Graphics, in games, mean the visual elements displayed on a screen, including everything from character models and environments to textures, lighting, and special effects, etc.
Fidelity doesn't define this; it's just an element like any other. It's not the primary one, because that depends on the objective.



Just because I asked a question? LOLOLOLOLOLOL

Imagine how anxious you both would get if you were questioned about every little thing.
Stop the trash that you post here. I gave you a detailed answer; I am not anxious. But if you don't understand, then go learn, instead of coming here with the bullshit knowledge you have and talking bullshit.
 
I don't know, man. I have been very happy with both software and hardware Lumen's indirect lighting. Mafia just seems like an outlier.

I love the way Lumen handles sunlight peeking in and still lighting walls, and provides great AO coverage, properly shadowing areas that should not be lit by an outside light source. You don't see that in other last-gen games like TLOU2, which still looks jaw-dropping outdoors in direct light but then begins to fall apart with indirect lighting. I am seeing the same thing in Mafia, which is super weird since it's supposed to be using Lumen.

mykOlKu.gif


NPsl3Hf.gif




EBLoO62.jpeg


D0pZnPY.jpeg
Lumen is certainly better compared to the completely static, traditional games of the past; SH2 is one of the better examples I've seen, though I still wish they'd pushed it a bit more. The lighting and AO were my only problems with TLOU2, so I agree with you there.

But no matter the UE5 game, they always ship with subtle Lumen settings instead of truly pushing it, presumably to stay light on performance: in older UE5 versions it used to be far too heavy for the visuals you were getting, so Lumen felt kneecapped.

Games on proprietary engines like SW Outlaws, Avatar, AC Shadows, Alan Wake 2, Metro Exodus Enhanced, and Cyberpunk have far better effects compared to the best UE5 games I've played (and I play a lot of those), and I mean with standard RT. It's because they have to develop their own tech; it's a calculated decision for those games and part of their core design, so it's always used with actual intention.

I still think we haven't truly seen anyone push Lumen to its true potential, and that it's mostly used as a crutch, which is not purely a bad thing, but I hope we will see much better showcases soon, now that UE5 has improved its performance. I love UE5, but Lumen has so far been a bit underwhelming.
 
One area of fidelity where Yotei might end up looking really good is the quality of the vegetation billboard ("cardboard") assets in the far distance.
AC:S often looked atrocious in this regard.

iIGtPNKwYPyNEUgN.jpg

7bStR3vumkd7gw0A.jpg
That is the one and only thing that looks greatly improved in Yotei. Why couldn't they improve the textures, lighting, and rock formations though, is what I want to know. Just having draw distances (distant detail, rather) look great isn't enough to make this look like a current-gen title.
 
You're just here to argue, not to add to the conversation. Maybe you should make a thread of your own and talk about it there. Or you could stay here and get obliterated in replies.

It's 679 pages of far more talk than anything actually added. I'll stay here if I want; you can just not answer me. "Obliterated" seems like too strong a word to me, because what I'm seeing is a lot of people being defensive and running away from me.

Stop the trash that you post here. I gave you a detailed answer; I am not anxious. But if you don't understand, then go learn, instead of coming here with the bullshit knowledge you have and talking bullshit.

I just asked a question, but you went on the defensive with ad hominem arguments. You're extremely anxious about a simple question.
 
It's 679 pages of far more talk than anything actually added. I'll stay here if I want; you can just not answer me. "Obliterated" seems like too strong a word to me, because what I'm seeing is a lot of people being defensive and running away from me.
People are running away from the pointless arguments you're trying to create. You came here and started picking fights and wanted to argue. You talk about being defensive, and yet you immediately started defending bad graphics by giving an excuse about graphics being this and that.
 
It's 679 pages of far more talk than anything actually added. I'll stay here if I want; you can just not answer me. "Obliterated" seems like too strong a word to me, because what I'm seeing is a lot of people being defensive and running away from me.



I just asked a question, but you went on the defensive with ad hominem arguments. You're extremely anxious about a simple question.
I answered your question with a detailed answer above, and you didn't address it. Also, if you don't understand or don't believe that answer, here is Epic founder Tim Sweeney talking about how we need more teraflops to achieve photorealistic graphics.
If you doubt him or don't understand the answer I gave you above, then you're just bullshitting, lol.
 
It's 679 pages of far more talk than anything actually added. I'll stay here if I want; you can just not answer me. "Obliterated" seems like too strong a word to me, because what I'm seeing is a lot of people being defensive and running away from me.



I just asked a question, but you went on the defensive with ad hominem arguments. You're extremely anxious about a simple question.
Dude, you don't even know that better GPUs equal better graphics. How can you even come into a graphics thread and admit that? Let alone ask others to explain these basic things to you? Why would we waste our time trying to educate you when you can simply do that yourself?

This is like if I went up to a girl I liked and asked her what sex is. "Hi, I would like to have sex with you, but could you please tell me how babies are made?"
 
I answered your question with a detailed answer above, and you didn't address it. Also, if you don't understand or don't believe that answer, here is Epic founder Tim Sweeney talking about how we need more teraflops to achieve photorealistic graphics.
If you doubt him or don't understand the answer I gave you above, then you're just bullshitting, lol.
I remember this. Sweeney said you need 40 teraflops, which is 4x the base PS5. I think he said this long ago, so I'm not sure how well it holds up today. I think the PS6 should be 4x over the base PS5, but I really doubt the PS6 will achieve photorealism. I think we'll not only need more computational power than that, but we'll also need hardware that can do full-scale path tracing. I guess the PS6 could come close to photorealism (or something that looks like it), but without path tracing we won't be there all the way.
 
I remember this. Sweeney said you need 40 teraflops, which is 4x the base PS5. I think he said this long ago, so I'm not sure how well it holds up today. I think the PS6 should be 4x over the base PS5, but I really doubt the PS6 will achieve photorealism. I think we'll not only need more computational power than that, but we'll also need hardware that can do full-scale path tracing. I guess the PS6 could come close to photorealism (or something that looks like it), but without path tracing there will always be something lacking.
UE5 did great things this gen; hopefully UE6 gives us what Sweeney promised on the PS6, which may have even more than 40 teraflops.
 
I remember this. Sweeney said you need 40 teraflops, which is 4x the base PS5. I think he said this long ago, so I'm not sure how well it holds up today. I think the PS6 should be 4x over the base PS5, but I really doubt the PS6 will achieve photorealism. I think we'll not only need more computational power than that, but we'll also need hardware that can do full-scale path tracing. I guess the PS6 could come close to photorealism (or something that looks like it), but without path tracing we won't be there all the way.
I'd say he's not going to end up wrong at all. Four times a PS5 is what we currently have on PC, and what it can do aided by AI upscaling is utterly astonishing, despite not being specifically targeted during development. If a 4090 can do PT at a significantly higher resolution than most 60 FPS console games, it's more than doable.
 
People are running away from the pointless arguments you're trying to create. You came here and started picking fights and wanted to argue. You talk about being defensive, and yet you immediately started defending bad graphics by giving an excuse about graphics being this and that.

I wasn't defending anything; I think Ghost of Yotei is pretty standard. I even explained why, and what would lead the studio to play it safe.
The bottom line is that when people exaggerate while criticizing the game's graphics, any logical, non-emotional argument will make some of them hysterical.

I answered your question with a detailed answer above, and you didn't address it. Also, if you don't understand or don't believe that answer, here is Epic founder Tim Sweeney talking about how we need more teraflops to achieve photorealistic graphics.
If you doubt him or don't understand the answer I gave you above, then you're just bullshitting, lol.

Tim Sweeney was just being didactic; after all, he was giving an interview to a gaming website, where the vast majority of readers are casual like you, so he kept it generic to make it easier to understand.

A teraflop is just a unit of measurement for floating-point calculations; it doesn't mean anything on its own. It just means the chip can approximate real numbers to perform operations (approximate, because floating-point hardware represents real numbers with finite precision).

AMD GPUs have often posted higher raw floating-point throughput than Nvidia's, but that doesn't mean they're better or produce more realistic graphics. It just means those GPUs have arithmetic units that crunch a lot of numbers.
Many things determine what brings better graphics: a graphics processor with new instructions (exposed through APIs like DirectX), faster shader units, dedicated co-processors (like matrix and ray-tracing units), and higher clocks.

So, looking solely at teraflops as a way to compare performance has long since ceased to make sense. GPUs have evolved significantly and take on many more tasks, with more co-processors than their predecessors, meaning the raw teraflop number doesn't always reflect reality.
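To illustrate that point with a toy sketch: the efficiency factors below are made-up placeholders, not measured values, purely to show why a lower-TFLOPS part on a better architecture can keep up with a higher-TFLOPS one.

```python
# Illustrative only: why raw TFLOPS is a poor cross-architecture metric.
# The efficiency factors are hypothetical, not benchmarks.

def effective_perf(tflops: float, efficiency: float) -> float:
    """Relative performance: raw throughput scaled by an architecture factor."""
    return tflops * efficiency

gpu_a = effective_perf(12.0, 1.00)  # newer architecture, baseline efficiency
gpu_b = effective_perf(16.0, 0.70)  # more raw TFLOPS, less work done per flop
print(gpu_a > gpu_b)  # the 12 TF part can beat the 16 TF part
```

Within one architecture the efficiency factor roughly cancels out, which is why same-family TFLOPS comparisons (12 TF vs 16 TF Nvidia cards, or PS4 vs PS5) are still meaningful.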
 
That is the one and only thing that looks greatly improved in Yotei....why couldn't they improve the textures, lighting, and Rockstar formations though is what I want to know....just having draw distances look great (distant detail rather) isn't enough to make this look like a current gen title.
That's something journalists, especially tech journos like DF, could get answered by the devs themselves, but they would rather post videos jerking off to Death Stranding 2 and TLOU1 for 50 minutes instead of reviewing them with a critical eye.
I remember this. Sweeney said you need 40 teraflops, which is 4x the base PS5. I think he said this long ago, so I'm not sure how well it holds up today. I think the PS6 should be 4x over the base PS5, but I really doubt the PS6 will achieve photorealism. I think we'll not only need more computational power than that, but we'll also need hardware that can do full-scale path tracing. I guess the PS6 could come close to photorealism (or something that looks like it), but without path tracing we won't be there all the way.
Remember, he did this with a 10 tflops GPU.

EG6Cscb.gif
UhXTcwS.gif


When I first played it, I legit thought it was FMV until the chase sequence ended and they put me in the open world. 4x more power, and I can definitely see them getting there.

That said, devs like Kojima are targeting native 4K 30 fps instead of being smart about it like the Matrix demo team was (cutscenes were 24 fps instead of 30, saving them 20% performance, plus black bars that gave them another 40% of the GPU to push toward better fidelity) while targeting somewhere between 1296p and 1440p at 30 fps, reconstructed to 4K. So if devs also target native 4K 30 fps on a 40 TFLOPS PS6, then we won't come close to photorealism until the PS7.
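The frame-budget arithmetic being described can be sketched like this; the resolution and frame-rate figures are the poster's claims about the Matrix demo (with an approximate 2.39:1 letterbox height inside a 16:9 frame), not measured data.

```python
# Sketch of the frame-budget arithmetic in the post above.
# Figures are the poster's claims, not measurements.

def pixel_budget(width: int, height: int, fps: int) -> int:
    """Pixels shaded per second at a given resolution and frame rate."""
    return width * height * fps

native_4k_30 = pixel_budget(3840, 2160, 30)
# Cinematic letterbox (approx. 2.39:1 inside a 16:9 frame) at 24 fps:
letterboxed_24 = pixel_budget(3840, 1607, 24)
print(f"{1 - letterboxed_24 / native_4k_30:.0%} fewer pixels per second")
```

The combined saving works out to roughly 40% of the per-second pixel budget, which lines up with the GPU headroom the post attributes to the 24 fps plus black-bars trick.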

Thankfully, Rockstar seems to be targeting 1440p 30 fps using FSR upscaling on the base PS5, so hopefully with their next game we will get something truly photorealistic.
 
Ghost Of Tsushima looks like an early generation PS4 game, Ghost Of Yotei looks like a late generation PS4 game.

I can't wait for SP to make a mid generation PS5 game on PS6..!
 
I'd say he's not going to end up wrong at all. Four times a PS5 is what we currently have on PC, and what it can do aided by AI upscaling is utterly astonishing, despite not being specifically targeted during development. If a 4090 can do PT at a significantly higher resolution than most 60 FPS console games, it's more than doable.
A 4090 runs path-traced Cyberpunk (a last-gen game) at 20 fps at native resolution; it will not run PS6 games at anything but single-digit frame rates when it comes to path tracing. Even Nvidia said they don't expect hardware to be able to play path-traced games until 2034. I think the best the PS6 will do is support wide-scale RTGI with multiple bounces, not path tracing.
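As a quick sanity check on that claim, frame-time arithmetic shows the gap: going from 20 fps to 60 fps needs roughly 3x the path-tracing throughput, before any upscaling or frame generation enters the picture.

```python
# Frame-time arithmetic for the 20 fps native path-tracing claim above.

def frame_time_ms(fps: float) -> float:
    """Milliseconds of GPU time available per frame at a given frame rate."""
    return 1000.0 / fps

current = frame_time_ms(20)  # 50.0 ms per frame
target = frame_time_ms(60)   # ~16.7 ms per frame
print(f"need roughly {current / target:.0f}x the performance")
```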
 
A 4090 runs path-traced Cyberpunk (a last-gen game) at 20 fps at native resolution; it will not run PS6 games at anything but single-digit frame rates when it comes to path tracing. Even Nvidia said they don't expect hardware to be able to play path-traced games until 2034. I think the best the PS6 will do is support wide-scale RTGI with multiple bounces, not path tracing.
No one cares about native resolution; that's the thing. With FSR4 (5?) and sufficiently developed frame generation, the sky is the limit.
 
No one cares about native resolution; that's the thing. With FSR4 (5?) and sufficiently developed frame generation, the sky is the limit.
Even a $2000 Nvidia GPU can barely make last-gen Cyberpunk a good experience with path tracing. Remember, Cyberpunk doesn't even do full-scale path tracing; it still uses traditional methods in some areas. Path tracing is just not happening next gen, unless Sony makes a $2000 console.
 
Even a $2000 Nvidia GPU can barely make last-gen Cyberpunk a good experience with path tracing. Remember, Cyberpunk doesn't even do full-scale path tracing; it still uses traditional methods in some areas. Path tracing is just not happening next gen, unless Sony makes a $2000 console.
It's a pretty damn good experience, though? The IQ you get with DLSS utterly destroys the image quality TAA offers on consoles, while also managing to hit 60 fps nearly all the time, without frame gen even involved. The price of the GPU hardly matters when a $600 9070 XT already often beats a 4080 in raster, a full architecture behind whatever the PS6 will be on.
 
I wasn't defending anything; I think Ghost of Yotei is pretty standard. I even explained why, and what would lead the studio to play it safe.
The bottom line is that when people exaggerate while criticizing the game's graphics, any logical, non-emotional argument will make some of them hysterical.



Tim Sweeney was just being didactic; after all, he was giving an interview to a gaming website, where the vast majority of readers are casual like you, so he kept it generic to make it easier to understand.

A teraflop is just a unit of measurement for floating-point calculations; it doesn't mean anything on its own. It just means the chip can approximate real numbers to perform operations (approximate, because floating-point hardware represents real numbers with finite precision).

AMD GPUs have often posted higher raw floating-point throughput than Nvidia's, but that doesn't mean they're better or produce more realistic graphics. It just means those GPUs have arithmetic units that crunch a lot of numbers.
Many things determine what brings better graphics: a graphics processor with new instructions (exposed through APIs like DirectX), faster shader units, dedicated co-processors (like matrix and ray-tracing units), and higher clocks.

So, looking solely at teraflops as a way to compare performance has long since ceased to make sense. GPUs have evolved significantly and take on many more tasks, with more co-processors than their predecessors, meaning the raw teraflop number doesn't always reflect reality.
Some of what you say is true about other factors affecting graphics, like when comparing Nvidia and AMD, but teraflops are still the measurement of raw graphics power: increase them and you get better graphics. I'm mainly talking here about GPUs from the same company with the same architecture. Take Nvidia: with two GPUs on the same architecture, one at 12 TF and the other at 16, clearly the higher one will have better performance and graphics. Your point about comparing Nvidia and AMD isn't bad, considering those architectures have different efficiencies and optimizations for various tasks. But here we compare the PS5 and PS4, which are both based on AMD tech, and a PS5 flop is even more efficient than a PS4 flop given the newer architecture. So the Pro has 10 times the power and efficiency of the PS4, and those lazy developers don't use it, while other developers, like Rockstar, some Ubisoft studios, and others using UE5, take advantage of that extra power.
 
4090 runs path traced Cyberpunk (last gen game) at 20fps in native resolution, it will not run PS6 games at anything except single frames when it comes to path tracing. Even Nvidia said they don't expect hardware to be able to play path traced games until 2034. I think best PS6 will do is support widescale RTGI with multiple bounce, not path tracing.
Cyberpunk's RT in general is extremely poorly optimized. I remember even the standard ray tracing running at 6 fps on my 2080. It needs DLSS for some reason.

That's not the case for other RTGI solutions like Anvil's or Snowdrop's. Hardware Lumen is also way more efficient than Cyberpunk's RTGI. Metro Exodus's and id Tech's RTGI do not require any upscalers on consoles.

My guess is that CD Projekt took some money from Nvidia and just hacked in the RT implementation, like they did with practically every single gameplay system in the OG version of Cyberpunk. That's why it still can't run well even on the best AMD GPUs, which have no problems running other games with RTGI.

Path tracing implemented natively into UE5 or UE6 will be far more performant. I still remember how experts like Alex looked at how poorly Cyberpunk performed on AMD's first-gen RT GPUs and said RTGI would never come to consoles. Now it's pretty much standard, and we trash any dev who chooses to forgo it.
 
It's a pretty damn good experience, though? The IQ you get with DLSS utterly destroys the image quality that TAA offers on consoles, while also managing to hit 60 fps nearly all the time, without even involving frame gen. The price of the GPU matters little when your $600 9070 XT already often beats a 4080 in raster, a full architecture behind whatever the PS6 will be on.
Rasterisation is easy; PT is far more expensive. Nvidia is far ahead of AMD in this regard, and even they don't expect PT to become mainstream until the middle of the next decade. You're expecting too much from a $500 console. I think the logical step for the PS6 will be support for RTGI and a good level of ray tracing performance for reflections and shadows.
 
Cyberpunk RT in general is extremely poorly optimized. I remember even the standard ray tracing running at 6 fps on my 2080. They need DLSS for it for some reason.

That's not the case for other RTGI solutions like Anvil's or Snowdrop's RTGI. Hardware Lumen is also way more efficient than Cyberpunk's RTGI. Metro Exodus and id Tech's RTGI don't require any upscalers on consoles.

My guess is that CD Projekt took some money from Nvidia and just hacked in the RT implementation, like they did with practically every single gameplay system in the OG version of Cyberpunk. That's why it still can't run well even on the best AMD GPUs, which have no problems running other games with RTGI.

Path tracing implemented natively into UE5 or UE6 will be far more performant. I still remember how experts like Alex looked at how poorly Cyberpunk performed on AMD's first-gen RT GPUs and said RTGI would never come to consoles. Now it's pretty much standard and we trash any dev who chooses to forgo it.
Path tracing in UE5 is actually very expensive and not great when it comes to performance. I doubt Epic will be pushing path tracing because it's too early, they will likely refine Lumen and improve on it for PS6 and next gen.
 
Most of them will be replaced by AI by the turn of the decade, so they'd better learn a new trade.
Wasn't there an AI-upscaled photo posted here of the lead taking a bath? Or was it the other thread? It cleaned up the shadows and textures to make the image look so much better.

If artists can't take a cinematic shot and make it look better than that in five years, then they deserve to be replaced by AI. Fuck these lazy cunts.

Fake Edit: just saw someone say that they spent a full year taking pictures in Japan lmao. As if Sony would expense a 200-person team to go vacation in Japan for an entire year. I bet they were there for a week max, and maybe 10 of them went. These trips cost tens of thousands of dollars per person for a five-day business trip. No one is sending anyone to Japan for a full year.
 
Rasterisation is easy; PT is far more expensive. Nvidia is far ahead of AMD in this regard, and even they don't expect PT to become mainstream until the middle of the next decade. You're expecting too much from a $500 console. I think the logical step for the PS6 will be support for RTGI and a good level of ray tracing performance for reflections and shadows.
I think Mark Cerny will go all in on path tracing and AI reconstruction. He's someone who learns from his mistakes and won't neglect the PS6's path tracing capacity like he did with the PS5's ray tracing.
 
Rasterisation is easy; PT is far more expensive. Nvidia is far ahead of AMD in this regard, and even they don't expect PT to become mainstream until the middle of the next decade. You're expecting too much from a $500 console. I think the logical step for the PS6 will be support for RTGI and a good level of ray tracing performance for reflections and shadows.
All the opposite, frankly; RDNA4 is already almost there. Now that Cerny is also on board, I wouldn't expect UDNA to be any less performant at ray tracing than even Blackwell, and RTGI is already being done this generation, for what it's worth. Project Amethyst (aka PS6 R&D) is a good measure of where all of this is going.
 
I think on PS6 we'll see PT in some of the smaller, much more rudimentary titles with simple visuals, and perhaps some path-traced remasters of PS4/PS5 stuff. Likely 1080p with ML scaling to 1440p-4K plus Ray Regeneration.

Whereas most flagship titles will be using comprehensive RTGI + RT reflections.

I also expect them to go extremely aggressive on ML. Something like 1200 TOPS plus much more refined models, with the ability to do high-quality 2-4x upscaling, 2-3x frame gen, ray regeneration, and neural texture decompression.
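To put rough numbers on why upscaling is such a big lever (a toy sketch; the resolutions and scale factors are my assumptions, not anything confirmed about PS6): rendering internally at 1080p and ML-upscaling to 4K means the GPU only shades a quarter of the output pixels, which is where most of the "free" performance comes from.

```python
# Toy arithmetic for an ML upscaling budget (all figures are assumptions, not specs).
def shaded_pixel_fraction(render_res, output_res):
    """Fraction of output pixels actually shaded when upscaling render_res -> output_res."""
    rw, rh = render_res
    ow, oh = output_res
    return (rw * rh) / (ow * oh)

# 1080p internal render presented at 4K: only 25% of the pixels are shaded.
print(shaded_pixel_fraction((1920, 1080), (3840, 2160)))  # 0.25
```

Frame gen stacks on top of that multiplicatively, which is why a console could plausibly present "4K 120" while shading far fewer pixels per second than those numbers suggest.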
 
All the opposite, frankly; RDNA4 is already almost there. Now that Cerny is also on board, I wouldn't expect UDNA to be any less performant at ray tracing than even Blackwell, and RTGI is already being done this generation, for what it's worth. Project Amethyst (aka PS6 R&D) is a good measure of where all of this is going.
You're already on a hype train, and the train hasn't even been built, lol. OK, if you think you're gonna go from basic-tier RTGI (which, btw, not many games support five years into this gen) to full-scale path tracing on a $500 console (from AMD, no less) in three years' time, then you go ahead and dream, lol.
 
All the opposite, frankly; RDNA4 is already almost there. Now that Cerny is also on board, I wouldn't expect UDNA to be any less performant at ray tracing than even Blackwell, and RTGI is already being done this generation, for what it's worth. Project Amethyst (aka PS6 R&D) is a good measure of where all of this is going.
The question is whether or not these lazy devs will bother adding PT support to their engines when they didn't bother with RT this gen.

Cerny went on and on about IO and the SSD and ray tracing in his Wired articles and the Road to PS5 conference, and his developers looked at him and said suck a dick, we will make the same exact game we made last gen.

Cerny can add all kinds of shit to the GPU, and it will be up to the devs to utilize it. Sony devs will not. Will Anvil and Snowdrop upgrade to path tracing next gen? Will Rockstar? Or CD Projekt? Cyberpunk 2 is in development in Boston under a brand-new team, and it's probably a 2030 title. Do they know what's in the next-gen consoles? I want to believe, but I'm not putting my faith in hacks who don't even bother changing textures.

In the other thread, people were bringing up Zelda TotK as proof that the Zelda team didn't upgrade textures either. Well, they literally had to make their sequel on a handheld with half the power of the Wii U in handheld mode. Sucker Punch has access to 8x more GPU power plus ray tracing and mesh shader support. Not to mention Cerny's super-secret IO, which was meant to load data at up to 22 GB/s with Kraken compression, allowing VRAM to hold more detailed assets than ever before. These lazy cunts used none of that. At least Insomniac used the SSD and IO for some gimmicky levels and faster traversal.
 
You're already on a hype train, and the train hasn't even been built, lol. OK, if you think you're gonna go from basic-tier RTGI (which, btw, not many games support five years into this gen) to full-scale path tracing on a $500 console (from AMD, no less) in three years' time, then you go ahead and dream, lol.
The days of $500 consoles are long gone. With Xbox out of the console race, Sony might even release an $800 console, and everyone will buy it on day one.
 
I think Mark Cerny will go all in on path tracing and AI reconstruction. He's someone who learns from his mistakes and won't neglect the PS6's path tracing capacity like he did with the PS5's ray tracing.
Mark Cerny can't beat physics and his limited budget. You will never hear him say "path tracing" leading up to the PS6 launch, because he knows it ain't happening.
 