Are "Generational" leaps become a thing in the past? AI upscaling and Frame generation is the future

Devs are focusing on the wrong things and are trying to sell games based on how they look. The top 10 charts show that the most played games are mostly older games where graphics are secondary to gameplay.

Games look pretty enough, but mission design and AI haven't progressed at all. Put some power into that.

I don't care if my gf is hot as F when she is dead on the inside.
 
You forgot to mention the big one, which is diminishing returns.

Even if we do figure out a way to keep Moore's law going in some fashion, it will quickly become irrelevant again because you won't be able to see the improvements with your eyes.

I think we need to come to terms with the fact that we need something other than graphics to keep the industry moving forward. The graphics race is quickly coming to an end. We need to accept it and look to other things that can improve the experience.
 


I personally care faaaaaaaar more about good art direction than raw graphics.
 
I think developers need to use better, less resource-intensive engines and start developing their own again. We've seen some amazing detail out of the Source engine, especially in VR. I'm not sure why more devs don't utilize it.
 
How many times are we going to have the same topic on this? It was already made obvious in various threads that efficiency is what's prioritized. If optimization yields diminishing returns for the business relative to the time it takes to get a product out the door, then it will be offloaded to AI accordingly. Likewise, if engineering semiconductors at present comes with increasingly complicated and complex challenges (cost-benefit analysis), then the obstacles are better addressed by leveraging more intelligent design theory, and this is where AI steps in.

It's highly accurate and precise based on the data it is fed, and the whining tends to be about perceptual issues rather than statistical ones. The only people who think otherwise are stuck in primitive, technologically conservative paradigms where they believe a bigger stone club is better.
 
Call me loony, but I think something happened when they introduced ray tracing. Up until that point, better hardware went hand in hand with huge leaps. Suddenly Nvidia came around the corner with this, let's call it, effect that demanded an INSANE amount of hardware for no obvious reason, and ever since, hardware demands have gone through the roof without the games even looking that much better... and every time you raise an eyebrow it's like... but... but... but... RAY TRACING!!!!

Biggest fluke in gaming technology! There, I said it... Come at me!
Yes, you're kinda close. What happened is that Moore's law and Dennard scaling died, so it became unsustainable to shrink transistors and achieve the same power and cost reductions we used to get. Now every next-generation node is drastically more expensive than the last. Take a look at this image showing TSMC's wafer prices:

[chart: TSMC wafer prices by process node]


We were seeing the effects of this before, but it really started to become unsustainable at 7nm, which is what the PS5, Xbox Series, and AMD RDNA 1 and 2 use. What's happening now is that each node is becoming more expensive to make chips on, so technology is getting more expensive instead of cheaper like it used to. This is why the PS5 has not dropped in price when, at this point last gen, we already had the PS4 Slim. 2nm is expected to cost $30k+ per wafer, which is pure insanity. GPUs and consoles made on 2nm will be faster, but they will be very expensive.
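
To put those wafer prices in perspective, here's a rough back-of-envelope sketch of what a single working chip ends up costing. The ~300 mm² die size, 80% yield, and the N5 price are my own assumptions; the N7 and N2 prices are the figures mentioned in this thread.

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Standard approximation that accounts for edge loss on a round wafer."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_price_usd: float, die_area_mm2: float, yield_rate: float) -> float:
    """Wafer price spread over the dies that actually work."""
    return wafer_price_usd / (dies_per_wafer(die_area_mm2) * yield_rate)

# Hypothetical ~300 mm^2 console-class SoC at 80% yield; wafer prices are the
# thread's N7/N2 figures plus an assumed mid-point for N5.
for node, wafer_price in [("N7", 9_000), ("N5", 17_000), ("N2", 30_000)]:
    print(f"{node}: ~${cost_per_good_die(wafer_price, 300, 0.8):.0f} per good die")
```

Even with these made-up inputs, the silicon cost of a same-sized chip roughly triples from N7 to N2, which is the whole problem in a nutshell.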
 
For rasterization, yes. I think it's going to be all ray tracing leaps and AI for the rest of the life of silicon-based transistors. We could live to see a change past that, but it's too early to tell.
Whatever the next paradigm beyond silicon transistors is, it will probably be designed by AI, too.
 
Yes, you're kinda close. What happened is that Moore's law and Dennard scaling died, so it became unsustainable to shrink transistors and achieve the same power and cost reductions we used to get. Now every next-generation node is drastically more expensive than the last. Take a look at this image showing TSMC's wafer prices:

[chart: TSMC wafer prices by process node]


We were seeing the effects of this before, but it really started to become unsustainable at 7nm, which is what the PS5, Xbox Series, and AMD RDNA 1 and 2 use. What's happening now is that each node is becoming more expensive to make chips on, so technology is getting more expensive instead of cheaper like it used to. This is why the PS5 has not dropped in price when, at this point last gen, we already had the PS4 Slim. 2nm is expected to cost $30k+ per wafer, which is pure insanity. GPUs and consoles made on 2nm will be faster, but they will be very expensive.
That doesn't explain, though, why the hardware is so much better today than, say, 10 years ago, yet the quality difference is anything but obvious.
 
Nah, generational leaps will continue. New generational leaps will be needed to make major improvements in RT/RTGI and in real-time AI applied to games, not only for upscaling or extra frame generation, which will also improve with generational leaps.

Stuff like tweaking/improving textures, models, animations, lighting, shadows, reflections, NPC conversations, etc. in real time via AI will also be introduced and improved over time across different generational leaps.
 
I want better animation, better faces, better motion capture, better acting, better stories, better physics... 🙄 That stuff takes time and talent.
 
We've definitely hit diminishing returns on rasterization improvements, and on how much more detail the average person's eyes will even notice.

I don't think generational leaps have to go away, but it's going to be some other kind of technology that improves fidelity in areas that aren't just higher res textures, or maybe stuff in caustics/fluid simulation.

Either way, the two benefits are that it forces more inventive presentation than just banking on muh realism, and that the diminishing returns benefit devices like the Switch 2 and the inevitable next Steam Deck, which will have PS4-to-PS4 Pro tier visuals on the go, enough for most people.
 
I think developers need to use better, less resource-intensive engines and start developing their own again. We've seen some amazing detail out of the Source engine, especially in VR. I'm not sure why more devs don't utilize it.
I agree with that, but it's about money. Developing a custom game engine costs tens of millions of dollars. They find it cheaper to just license something like Unreal 4 or 5.
 
I agree with that, but it's about money. Developing a custom game engine costs tens of millions of dollars. They find it cheaper to just license something like Unreal 4 or 5.
Which is why there should be more ambitious indie devs. There are some, but there need to be more of them. The industry was founded on innovation and optimization, and over the years greed has gotten in the way. It's sad. Unity and UE need to be used less, or at least used properly, which is also an issue.
 
Let's wait and see what the Switch 2 can do. If AI is a magic fix for that, then I guess so, and a whole heap of other companies will start making cheaper handhelds that can run the newer games, and generations go away until some other technological leap happens.
It's not even looking like the Switch 2 will have DLSS...
 
You forgot to mention the big one, which is diminishing returns.

Even if we do figure out a way to keep Moore's law going in some fashion, it will quickly become irrelevant again because you won't be able to see the improvements with your eyes.

I think we need to come to terms with the fact that we need something other than graphics to keep the industry moving forward. The graphics race is quickly coming to an end. We need to accept it and look to other things that can improve the experience.
Until full RT is available on consoles, I'd never presume that "the graphics race is coming to an end"... we have a ways to go.
 
Compare the hardware specs of the Original Xbox (2001) and the Xbox 360 (2005).
You see a big jump in processing power in just four years.

When Dennard scaling died and Moore's law slowed down, leaps like that in four years couldn't happen again.

[table: Original Xbox hardware specs]

[table: Xbox 360 hardware specs]

[chart: 40 years of processing power]
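
For a rough sense of scale, here's a minimal sketch of what the doubling cadence implies over a four-year console gap. The classic "double every ~2 years" rate is the textbook Moore's law figure; the slower 3.5-year doubling period below is just an illustrative assumption for the post-Dennard era, not a measured number.

```python
# Illustrative only: transistor-budget growth over a console gap under two cadences.
def scaling_factor(years: float, doubling_period_years: float) -> float:
    """How much the transistor budget grows if it doubles every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

print(scaling_factor(4, 2.0))  # ~4.0x over 4 years at the classic Moore's law pace
print(scaling_factor(4, 3.5))  # ~2.2x at an assumed slower, post-Dennard pace
```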
 
Switch 1 to Switch 2 is a pretty big leap.

Going from, say, The Witcher 3 on Switch 1 to Cyberpunk 2077 on Switch 2 is a night-and-day difference.
 
No, but they're not important or even appealing to most gamers; we already reached a point of "good enough" even with Switch 1 first-party games.

Now we have to consider young people growing up with anime being so popular that they mostly don't care about more realistic games, and they've also been playing on phones since they were children (eww, don't give phones to children), so their focus on visuals is mostly art direction.

Also, diminishing returns.

So no, generational leaps aren't as important as they were before. As long as games have good art and gameplay, that's all most people care about; only those who always liked more realistic graphics are the ones complaining about it these days.
 
The PS4 to PS5 leap was good, just not enough devs taking full advantage of the latest and greatest tech. It takes a while for things to catch up.
The PS5 to PS6 leap will be nice too.
 
I don't really care; the only thing a game needs to deliver is art direction and fun gameplay.

I feel sorry for graphics whores (not really); this is the worst timeline for them.
 
Yes, but it's complicated.

The hardware is getting more powerful still, but the primary issue is the very limited number of development teams that have the talent and budget to push these machines to their limits.

Every generation there is a smaller and smaller number of games that can only run on the newest hardware, and it's because of the limiting factor of development budgets. ONLY the first-party studios and, like, Rockstar really have the budgets to max these things out nowadays.

Remedy/CDPR would be the rare 3rd party exceptions.
 
Resolution as a way to set generations apart (HD -> Full HD -> QHD -> 4K -> 8K) went the way of "bit-ness" in earlier generations; that is, it has become a meaningless marketing term as technology advanced, because it overshot consumer demand (Sony has discontinued the production of 8K TV sets due to low demand) and because of the emergence of new tech that "tricks" its way to the same result (think of DLSS in the Nintendo Switch 2).
So to continue with the generational leaps, the introduction of a new paradigm shift (or multiple shifts) is mandatory, one that is meaningful for consumers and that somehow helps the game production side, which is also constrained at the moment, by making game production cheaper/quicker.
 
I don't really care; the only thing a game needs to deliver is art direction and fun gameplay.

I feel sorry for graphics whores (not really); this is the worst timeline for them.
Don't know what you're playing, but as a graphics whore who has moved to PC, I think this timeline is great. There are almost no console exclusives at all, and games are starting to use PC-centric features like path tracing. I'm playing the same games with good art direction and fun gameplay, using the same controller, but with better graphics and framerates than before.
Currently playing No Rest For The Wicked, which fits your description of good art direction and fun gameplay.
 
If I compare any well-known PS2 game to Cyberpunk, the fairest answer would be: every pixel/object on the screen while playing Tekken 5 matters, but the whole experience of playing Cyberpunk could be summed up as a plane ride, looking at scenery from a window's POV.
 
Just consider that in 2020, when this generation released, a 300mm wafer in N7 cost around $9,000 US.
A 300mm wafer in N2 costs around $30,000 US, and a wafer in N1.6 is rumored to go up to $45,000 US.
So to get a chip of the same size, with more transistors, more compute units, more RT units, more AI units, more CPU cores, and more cache, as we usually expect from a next generation of consoles, each chip will cost a lot more.
This is not going to be possible without a price increase for the console, or Sony picking a smaller, cheaper chip that doesn't make a big generational leap.
Our only hope is that the AI boom crashes in the meantime, reducing the demand for wafers and forcing TSMC to lower prices.
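
A minimal sketch of the "smaller, cheaper chip" option, flipping the question around: how much die area a fixed silicon budget buys as wafer prices rise. The $60-per-chip budget, 85% wafer-area utilization, and 80% yield are assumptions; the wafer prices are the figures above.

```python
# Illustrative only: die area a fixed silicon budget buys as wafer prices rise.
import math

WAFER_AREA_MM2 = math.pi * 150 ** 2  # usable area of a 300 mm wafer, pre-utilization

def affordable_die_area(budget_usd: float, wafer_price_usd: float,
                        utilization: float = 0.85, yield_rate: float = 0.8) -> float:
    """Die area (mm^2) you can buy for `budget_usd`, given the cost per usable mm^2."""
    cost_per_mm2 = wafer_price_usd / (WAFER_AREA_MM2 * utilization * yield_rate)
    return budget_usd / cost_per_mm2

print(affordable_die_area(60, 9_000))   # ~320 mm^2 of silicon for $60 on N7 in 2020
print(affordable_die_area(60, 30_000))  # ~96 mm^2 for the same $60 on N2
```

Under those made-up numbers, holding the silicon budget flat means a chip roughly a third the size, which is exactly the "smaller chip, smaller leap" trade-off.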
 
The only innovations that are able to shock us are physics and AI.

Because we have already experienced the others (high resolution, high fps, PT).
 