
Hot Take: Graphics have completely stagnated since 2019

Graphics peaked with The Order 1886

[image: 286123-2051289-718917_20130820_004.jpg]

"graphics peaked with Leonardo Da Vinci"

[image: HathYLd.jpg]
 

Jesb

Member
The next-gen feature for this gen has been GeForce Now. I recently joined the Ultimate tier and I'm loving it. It's not without some issues, but if the library keeps growing, there's no need for anything else for me.
 

Robb

Gold Member
Hard to tell. To me it feels more like a money issue - but I'm no tech expert by any means. It just seems like it's too expensive to make cutting-edge stuff now, so devs don't.

We’ve already seen stuff like the Matrix UE5 Demo being playable. That stuff looked incredible.

So it's obviously possible to push things way further. But instead we're getting poor-looking GaaS titles designed only to squeeze as much money out of whales as possible.
 

Jakk

Member
Eh, nah. You'll get plenty of graphical advancements on consoles. No matter what folks tell themselves, the best-looking games this gen could not have been done on last-gen hardware without significant compromise.

And even with $800+ graphics cards, you aren't seeing the kind of advancement, relative to that amount, that you think you are. It can be done on consoles; you'll just pay a performance price for it. It's a resource-allocation choice, not a hardware limitation.

No game is taking advantage of even my 3090 Ti, let alone a 4090 and better. It's not happening. I want people to understand this. Those cards are capable of far, far, far more than you've been getting.

What they do allow for, however, is better framerates, and that's the biggest advantage. But as long as consoles have the presence, share, and mass pricing accessibility that they do, your shiny PC graphics card will never, ever have its parts truly exploited and put to use.

Some examples of what could/would happen if PC devs only took advantage of the latest and greatest cards:

Rendered on a TITAN RTX in under four minutes in UE4 back in 2020:

[attached renders: Game-FINAL.png, Laundry-001.png, Laundry-002.png, Laundry-003.png]


Fully path-traced, 8K normal maps on nearly all surfaces, high-quality fog, many dust particles (thanks to instancing). This would be the norm if the PC's best cards were actually taken advantage of! And that's... 2020.
No, that wouldn't be the norm right now; not even a 4090 is that powerful. How is pre-rendered footage relevant when it comes to real-time graphics?
 

HeWhoWalks

Member
No, that wouldn't be the norm right now; not even a 4090 is that powerful. How is pre-rendered footage relevant when it comes to real-time graphics?
Powerful for what? To run that scene? :LOL:

It's an environment taken straight from UE4, with a few of those assets already used in games, and it was an example.

The point is, it doesn't take rocket science to know how much better graphics would be if the high-end cards were actually taken advantage of, which is what the convo was about.
 

Jakk

Member
Powerful for what? To run that scene? :LOL:

It's an environment taken straight from UE4, with a few of those assets already used in games, and it was an example.

The point is, it doesn't take rocket science to know how much better graphics would be if the high-end cards were actually taken advantage of, which is what the convo was about.
I'm not sure where those scenes are from, but if they are pre-rendered and it took 4 minutes to render on a TITAN, then obviously a 4090 isn't powerful enough to run them in real time at playable FPS. You are mixing up real-time graphics with pre-rendered scenes. I mean, sure, I agree graphics would be better if high-end was the baseline, but I wouldn't say there are no games taking advantage of a 4090: Cyberpunk barely hits 20 FPS with path tracing at 4K, and Alan Wake 2 barely stays at 30 FPS. Besides, what many people in this thread have said is true: we are hitting diminishing returns, and there will never again be such a generational jump in graphics as before.
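To put rough numbers on the gap (just napkin math, assuming the ~4 minutes per offline frame quoted in this thread and a 33 ms budget at 30 FPS; these are assumed figures, not benchmarks):

Code:
# Napkin math: offline render time per frame vs. a real-time frame budget.
# Assumed numbers from the thread, not measurements.
offline_seconds_per_frame = 4 * 60       # ~4 minutes per frame on a TITAN RTX (as quoted)
target_fps = 30                          # a modest real-time target
realtime_budget = 1.0 / target_fps       # ~0.033 s per frame
speedup_needed = offline_seconds_per_frame / realtime_budget
print(f"Required speedup for real time: ~{speedup_needed:,.0f}x")   # ~7,200x

Even if it was really 40 seconds per image, that's still a ~1,200x gap, which is why per-frame offline render times say very little about what a 4090 can do at 30+ FPS.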
 

HeWhoWalks

Member
I'm not sure where those scenes are from, but if they are pre-rendered and it took 4 minutes to render on a TITAN, then obviously a 4090 isn't powerful enough to run them in real time at playable FPS. You are mixing up real-time graphics with pre-rendered scenes. I mean, sure, I agree graphics would be better if high-end was the baseline, but I wouldn't say there are no games taking advantage of a 4090: Cyberpunk barely hits 20 FPS with path tracing at 4K, and Alan Wake 2 barely stays at 30 FPS. Besides, what many people in this thread have said is true: we are hitting diminishing returns, and there will never again be such a generational jump in graphics as before.
I do somewhat get where you're coming from. However, diminishing returns, strictly speaking, is about budget versus return. We could have the leaps people are looking for, but we won't, because it isn't financially feasible anymore. Things have stagnated on that basis, not because we are close to any graphical ceiling (which is what folks have gotten wrong). The law describes a shrinking return on investment, not a cap on capability (although graphics will eventually reach that point, there is a very long way to go).

Next, unlike a closed system (console), developers aren't creating their projects with the metal in mind. Yes, there are advantages to owning a powerful rig, but they will never be used to the fullest. What is happening is like RAM — the more of it you have, the more your system will use. The better the hardware, the better the output. Absolutely. Just not to the degree it would be if that were strictly the target.
 

Jakk

Member
I do somewhat get where you're coming from. However, diminishing returns, strictly speaking, is about budget versus return. We could have the leaps people are looking for, but we won't, because it isn't financially feasible anymore. Things have stagnated on that basis, not because we are close to any graphical ceiling (which is what folks have gotten wrong). The law describes a shrinking return on investment, not a cap on capability (although graphics will eventually reach that point, there is a very long way to go).

Next, unlike a closed system (console), developers aren't creating their projects with the metal in mind. Yes, there are advantages to owning a powerful rig, but they will never be used to the fullest. What is happening is like RAM — the more of it you have, the more your system will use. The better the hardware, the better the output. Absolutely. Just not to the degree it would be if that were strictly the target.
I mean, some of those scenes don't look that much different from the best-looking games nowadays. What I'm saying is, if those are pre-rendered scenes and it took 4 minutes to render them on a TITAN, then a 4090 obviously isn't powerful enough to run them in real time. Maybe some of them actually were real-time, I can't tell, as I don't know where you got them from. Diminishing returns here are about graphics.

[image: aFKEttJ.png]


Sure, there are many other ways you can improve graphics other than increasing polygon count, but almost none of them are as visually striking as the increase in polygon count once was. The fact that we hit diminishing returns doesn't necessarily mean we are close to the ceiling; it just means it's getting harder and harder to get closer to it. And I don't know, I would say some games are getting pretty close to photorealistic-ish graphics.
 

HeWhoWalks

Member
I mean, some of those scenes don't look that much different from the best-looking games nowadays. What I'm saying is, if those are pre-rendered scenes and it took 4 minutes to render them on a TITAN, then a 4090 obviously isn't powerful enough to run them in real time. Maybe some of them actually were real-time, I can't tell, as I don't know where you got them from. Diminishing returns here are about graphics.

[image: aFKEttJ.png]


Sure, there are many other ways you can improve graphics other than increasing polygon count, but almost none of them are as visually striking as the increase in polygon count once was. The fact that we hit diminishing returns doesn't necessarily mean we are close to the ceiling; it just means it's getting harder and harder to get closer to it. And I don't know, I would say some games are getting pretty close to photorealistic-ish graphics.
It's just UE4 assets (most of which made it to games). They are just lumped nicely together with the highest-end features, such as path/ray tracing (which, on double-checking, was 40-second renders per image in 2020, not 4 minutes; my error).

In any case, I think we are mostly on common ground regarding the returns. I'm mainly talking to the folks who are focusing only on visuals and saying it is not as simple as that. True photorealistic, in-game graphics are still years away (although Tim Sweeney predicted last year would be the year we'd see fully playable, photoreal games... long before a pandemic poured cold water on things :p). We have some excellent results already, but we are not where we should be, and that is because of external factors (such as the continued fallout from COVID) and internal ones (such as budget). But, yes, we are much closer to the ceiling than we have ever been, even if it's going to be a while.
 

Jakk

Member
It's just UE4 assets (most of which made it to games). They are just lumped nicely together with the highest-end features, such as path/ray tracing (which, on double-checking, was 40-second renders per image in 2020, not 4 minutes; my error).

In any case, I think we are mostly on common ground regarding the returns. I'm mainly talking to the folks who are focusing only on visuals and saying it is not as simple as that. True photorealistic, in-game graphics are still years away (although Tim Sweeney predicted last year would be the year we'd see fully playable, photoreal games... long before a pandemic poured cold water on things :p). We have some excellent results already, but we are not where we should be, and that is because of external factors (such as the continued fallout from COVID) and internal ones (such as budget). But, yes, we are much closer to the ceiling than we have ever been, even if it's going to be a while.
Yeah, I agree with that. Another way to describe diminishing returns is that the same amount of work gives you less noticeable improvement than before. So theoretically, if game dev companies had much bigger budgets, the improvement could be bigger, but unfortunately, with all the things happening in recent years, I would say most companies' budgets are getting tighter.

Also, while generational improvements in GPU gaming performance are still generally high, the same can't be said about CPUs, and stuff like ray tracing/path tracing is quite demanding not only on the GPU but also on the CPU. I think we could easily run into CPU bottlenecks with RT/PT at some point.
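To put a toy number on the "same work, smaller visible gain" point above (illustrative figures only, nothing measured):

Code:
# Toy illustration: the same fixed chunk of extra work (say, +1M triangles of
# detail) buys a smaller and smaller relative gain as the baseline rises.
extra_work = 1_000_000
for baseline in (1_000_000, 5_000_000, 20_000_000, 100_000_000):
    gain_pct = extra_work / baseline * 100
    print(f"baseline {baseline:>11,} tris: +{gain_pct:.0f}% more detail")
# Prints +100%, +20%, +5%, +1%: the work stays constant, the visible payoff shrinks.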
 

raduque

Member
Only people with subpar hardware would say that. There is a HUGE difference between 4K and QHD. No DLSS setting will nullify this difference.
I jumped from a 2070 Super to a 4070 Super. Playing 4K native now, and the difference is night and day.

Don't say it's unnecessary or overrated just because your setup can't keep up.
It's a lie.
No.
 

HeWhoWalks

Member
Only people with subpar hardware would say that. There is a HUGE difference between 4K and QHD. No DLSS setting will nullify this difference.
I jumped from a 2070 Super to a 4070 Super. Playing 4K native now, and the difference is night and day.

Don't say it's unnecessary or overrated just because your setup can't keep up.
It's a lie.
My setup certainly can keep up, and 1440p gets the job done. 4K with my PS5, 1440p on PC with framerates that are beyond acceptable. Not a bad little trade-off.
 
My setup certainly can keep up, and 1440p gets the job done. 4K with my PS5, 1440p on PC with framerates that are beyond acceptable. Not a bad little trade-off.
ps5 upscales most games to 4k, not native.
if you want the unfiltered 4k experience, you need a pc that can do 4k, hooked up to a 4k monitor/tv.

4k120vrr is where it's at.
1440p is squarely for midrange or high refresh/competitive stuff.
1440p is still competent for small screens, but 4k really is a lot better.
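for scale, the raw pixel counts (simple arithmetic, nothing more):

Code:
# Pixels per frame: 4k pushes ~2.25x the pixels of 1440p, ~4x of 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4k": (3840, 2160)}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")
print(f"4k / 1440p = {(3840 * 2160) / (2560 * 1440):.2f}x")   # 2.25x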
 

HeWhoWalks

Member
ps5 upscales most games to 4k, not native.
if you want the unfiltered 4k experience, you need a pc that can do 4k, hooked up to a 4k monitor/tv.

4k120vrr is where it's at.
1440p is squarely for midrange or high refresh/competitive stuff.
1440p is still competent for small screens, but 4k really is a lot better.
I said nothing about the games, I said the PS5 is on a 4K monitor, which results in a fantastic experience. And games like GT7 are in native 4K.

I also didn't say 1440p was better than 4K. I said it gets the job done. And there is nothing mid-range about a 13900K with a 3090 Ti and 96GB of DDR5 at 6800 MT/s. Thus, none of what you said was relevant :p
 

Jakk

Member
I'll let you in on a secret: graphics never actually mattered at all, ever.
Disagree. It depends on the type of game, but if someone is trying to create an atmospheric, immersive, and/or realistic game, graphics do matter very much.
 

raduque

Member
My eyes don't lie. On a 32-inch monitor 3 feet away, YES. The difference is very clear. Get some glasses.
Eyes may not lie, but people certainly do. I do not believe for one second that you can see the individual pixels of a 2560x1440 panel at 3' on a 32" display.

I do wear glasses. I can't see the pixels on a 1080p 32" display from less than 2'.
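For what it's worth, here's the rough pixel-density math, using the common 1-arcminute (20/20 acuity) rule of thumb; it's a crude approximation, not a claim about anyone's actual eyesight:

Code:
import math

# Rough pixel-density math for a 32-inch panel (rule-of-thumb only).
def ppi(w, h, diagonal_in):
    return math.hypot(w, h) / diagonal_in

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
    density = ppi(w, h, 32)
    pitch_in = 1 / density
    # Distance at which one pixel subtends ~1 arcminute (classic 20/20 figure).
    distance_ft = pitch_in / math.tan(math.radians(1 / 60)) / 12
    print(f"{name} on a 32-inch panel: {density:.0f} PPI, ~{distance_ft:.1f} ft to the 1-arcminute limit")

By that rule of thumb, a 32" 1080p panel is roughly 69 PPI (~4.2 ft) and a 32" 1440p panel roughly 92 PPI (~3.1 ft), so at about 3 feet a 32" 1440p screen sits right around the threshold; whether individual pixels actually stand out depends on the content, contrast, and the viewer.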
 