
How many teraflops do we need for excellent graphics?

ZoukGalaxy

Member
Look at the "The Order 1886", one of the FIRST game of the first PS4: that's all what you need with competent and passionate devs

[The Order: 1886 gif]
 

sankt-Antonio

:^)--?-<
Given that people who can actually code are not working in game dev, we have to make do with the "talent" this sector has, so 100 TFLOPS is where even the most incompetent dev should be able to make excellent gfx a reality, even people like FromSoft.
 

LordOcidax

Member
Are you saying consumers will never be satisfied? That makes no sense.
Do you ever feel that the real world doesn't look real enough because it's remained at the same fidelity all your life?
You don't, and that means there's a limit.
No, consumers are always going to want the most powerful thing; look at the hypercar market, but that's a whole other topic… The real issue in the game industry right now is development cost and talent. You can get amazing results with old lighting techniques, but that takes a lot of time and talent. Using RT hardware cuts a lot of development cost due to its real-time nature. Hardware makers are already pushing 8K, and it's never going to end.
 

Clintizzle

Lord of Edge.
Requirements based on game engine:
Decima: 20
Dunia: 20
Unity: 40
CryEngine: 45
Naughty Dog Game Engine: 10
RAGE: 20
IdTech: 5
Any other Zenimax/Bethesda engine: Jesus.
 

hinch7

Member
Teraflops is an awful metric for performance, but I'll bite...

Assuming we have enough memory bandwidth feeding the GPU, enough cache, and decently performing RT cores. And assuming UE5 is THE de facto engine going forward...

40TF non-dual-issue should be good to go, basically around a 4090 in compute. Then we're looking at games with decent hardware-based raytracing without sacrificing image quality for performance (so much). Anything above is gravy.

I'd be massively disappointed if next-generation consoles don't at least reach that as a baseline. Raytracing/pathtracing is the way forward for game development and fidelity.
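For a rough sanity check of those figures, here's a back-of-envelope sketch assuming the usual peak-FP32 formula (ALUs × 2 FLOPs per clock for FMA × clock speed); these are theoretical upper bounds from published specs, not measured performance:

# Peak FP32 throughput: ALUs x 2 FLOPs per clock (FMA) x clock (GHz), in TFLOPS.
def peak_tflops(alus, clock_ghz, flops_per_clock=2):
    return alus * flops_per_clock * clock_ghz / 1000

print(peak_tflops(16384, 2.52))      # RTX 4090: ~82.6 TF (counts Ada's doubled FP32 units)
print(peak_tflops(16384, 2.52) / 2)  # the "non dual issue" reading: ~41 TF
print(peak_tflops(1152, 0.80))       # PS4: ~1.8 TF
print(peak_tflops(2304, 2.23))       # PS5: ~10.3 TF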
 

Geometric-Crusher

"Nintendo games are like indies, and worth at most $19" 🤡
Nintendo after cel shading was invented: "to hell with teraflops, I found the eternal El Dorado, baby"
 

Lysandros

Member
Saying TFLOPS don't matter is the same as saying horsepower in a car engine is not relevant... Of course a powerful engine alone won't take you very far, but if the rest of the car (suspension, weight, transmission, etc.) is well calibrated, the car with the most powerful engine will always have the advantage.

With computer graphics it's exactly the same thing... the most powerful GPU will always win in an identical setup.

Dismissing TFLOPS is such a weird thing to do.
Teraflops ≠ power in itself. It's one metric among many that contribute to it. It's about the whole.
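One way to make that concrete, assuming the standard roofline model (attainable throughput is capped at min(peak compute, memory bandwidth × arithmetic intensity)); the GPU numbers below are purely hypothetical:

# A minimal sketch of why peak TFLOPS alone doesn't decide performance (roofline model).
def attainable_tflops(peak_tflops, bandwidth_gb_s, flops_per_byte):
    # Capped by whichever runs out first: raw compute, or bandwidth x arithmetic intensity.
    return min(peak_tflops, bandwidth_gb_s * flops_per_byte / 1000)

print(attainable_tflops(40, 1000, 10))   # 10.0 -> bandwidth-bound; most of the 40 TF sits idle
print(attainable_tflops(40, 1000, 80))   # 40   -> compute-bound; the TFLOPS figure finally matters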
 

lh032

I cry about Xbox and hate PlayStation.
We already have games with excellent graphics.

The issue is optimization
 

James0007

Neo Member
More TFlops just means “how much of this shat can I get away with brute forcing before I have to (ffs) optimise my code”.
 
Since "excellence" in that regard can mean everything from good style to absolute photorealism that question can't be answered in that form.

I'll go with perfect photorealism, and that is still a long way off. And since TFLOP numbers don't mean squat anymore and continue to lose meaning in the face of ML advancements, I don't think anyone can even give an educated guess right now.
 

Buggy Loop

Gold Member
Look at the "The Order 1886", one of the FIRST game of the first PS4: that's all what you need with competent and passionate devs

[The Order: 1886 gif]

Amazing what they did with ~2TF and 8GB of memory.

Also PS3 with 0.2TF & 256MB

[three PS3 gameplay gifs]


What the fuck happened? Clearly there was room for improvement from PS3 to PS4. But this gen feels like a huge plateau in graphics.
 

HeWhoWalks

Gold Member
Amazing what they did with ~2TF and 8GB of memory.

Also PS3 with 0.2TF & 256MB

pics

What the fuck happened? Clearly there was room for improvement from PS3 to PS4. But this gen feels like a huge plateau in graphics.
"What the fuck happened"?

Um, games look a lot better...


[Alan Wake 2 screenshot]
[screenshot]
[Hellblade 2 screenshot]
[Dead Island 2 screenshot]
 

Loomy

Thinks Microaggressions are Real
People always complain about graphics, but I feel if 4K@60 with full path tracing were the default settings, I would be satisfied as hell.

How many teraflops do we need to reach the bare minimum?
Look at the Steam hardware survey. Most people are playing on 1080p and 1440p displays. The incentive is not fully there yet, so 4K@30 is considered a win by many.
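For a rough sense of the gap, assuming cost scales to first order with shaded pixels per second (which ignores everything resolution-independent):

# Pixel throughput of 4K@60 vs the common 1080p@30 target.
pixels_1080p = 1920 * 1080            # ~2.07 million pixels
pixels_4k    = 3840 * 2160            # ~8.29 million pixels
print((pixels_4k * 60) / (pixels_1080p * 30))   # 8.0 -> 4K@60 needs ~8x the pixel throughput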
 

Buggy Loop

Gold Member
"What the fuck happened"?

Um, games look a lot better...

From PS3 to PS4, yeah, like I said. Still mind-blowing what they achieved with 256MB. If you asked devs nowadays to make an AAA game with that memory budget, they would curl up in the fetal position.

PS4 to PS5? LOL, weakest jump
 

HeWhoWalks

Gold Member
From PS3 to PS4 yeah, like I said.

PS4 to PS5? LOL, weakest jump
If we're just looking at consoles (which is stupid, given the topic), then sure, the jump isn't as big. When looking at the full picture (which is what any sane person would do), the jump is much bigger than you've alluded to.

Besides, I'm not sure why the global pandemic continues to be ignored in all of this. It impacted every angle of life, games included. They are just beginning to get back to some normalcy, but that's also in addition to the rising cost of game development. It all hit at the same time, and yet, we've still got the results that I've shown above. It's not all about PlayStation when it comes to graphics.
 

Buggy Loop

Gold Member
If we're just looking at consoles (which is stupid, given the topic), then sure, the jump isn't as big. When looking at the full picture (which is what any sane person would do), the jump is much bigger than you've alluded to.

You showcase 3 games with super small areas, coated with fog for visual effect; that's your benchmark of improvement?

Still more impressive after all these years, and arguably still the king of graphics:





All can run on 2 TF hardware, while the games you show will eat up anything you throw at them.

Don't even know the last image, Dead Island 2? Runs on PS4 with almost the same visuals. Not even a generational leap; screams cross-gen.

But sure, don't we LOVE games that feel like a return to corridor-horror scale and require a 4090 to even power through, with frame generation and >16GB VRAM and >32GB RAM, and maybe it'll avoid shader stutters but not traversal stutter. What a nice path we're heading into. Let's all coat this with temporal solutions that smear the image quality. Hurrah!

This is the very definition of plateau. We're right on it.
 

HeWhoWalks

Gold Member
You showcase 3 games with super small areas, coated with fog for visual effect; that's your benchmark of improvement?

Still more impressive after all these years, and arguably still the king of graphics:


All can run on 2 TF hardware, while the games you show will eat up anything you throw at them.

Don't even know the last image, Dead Island 2? Runs on PS4 with almost the same visuals. Not even a generational leap; screams cross-gen.

But sure, don't we LOVE games that feel like a return to corridor-horror scale and require a 4090 to even power through, with frame generation and >16GB VRAM and >32GB RAM, and maybe it'll avoid shader stutters but not traversal stutter. What a nice path we're heading into. Let's all coat this with temporal solutions that smear the image quality. Hurrah!
You're actually going to attack linear games when all you showed was that (particularly fuckin' Beyond: Two Souls)? Pure comedy! Also, Dead Island 2 isn't linear. Those games use some form of raytracing/pathtracing and/or are definitely ahead of games from last gen. And no, Red Dead Redemption 2 isn't the "king" of anything, as it's a clear step below stuff like maxed-out Cyberpunk 2077 on PC or Horizon Forbidden West on PS5/Pro. A good deal of games look better than ones from last gen (in some cases, considerably).

As for that last line — you said "what the fuck happened" and I showed you what the fuck happened.
 

Knightime_X

Member
I don't care, as long as whatever graphics they aim for are at LEAST 1080p 60fps with 4x AA.
1440p 60fps with 2x AA would be the sweet spot.
Even if DLSS is required.
 

Buggy Loop

Gold Member
You showed linear games. I showed linear games (outside of Dead Island 2).

You're still stuck on the PS3 games?
You started by interpreting my post all wrong, thinking that I'm saying games have not evolved since then, in the same post where I say it evolved from PS3 to PS4 🤦‍♂️

It's just my personal list of "holy shit, what did they do with 256MB?" Do you understand this?

I'm impressed by these PS3 games, for their time and memory footprint, voilà. ZoukGalaxy's The Order: 1886 already does the talking for "what the fuck happened" since then, because it barely budged.

You can minimize things all you want, but the games I showed either use some form of raytracing/pathtracing or are clear steps ahead of all the games you showed and definitely ahead of games from last gen, whether you like it or not.

Besides, you showed nothing but console games while trying to make this argument. It doesn't get funnier than that.

Cute that you would throw raytracing/pathtracing my way when I'm one of the biggest advocates for it on this forum.

There's still little to no return for the performance cost, and it's not something to build your game around in order to sell, as it's a minority of users who will even enable it. Hardware for path tracing is so niche that it's equivalent to a statistical error.

Lumen/ray/path tracing should mainly be used for highly dynamic games where the scene changes drastically due to construction/demolition or intense changes in scene lighting; otherwise devs should stick to intelligent raster solutions. Silent Hill 2 is the biggest culprit of this, as it's mainly static. It could have been baked! Or do one Lumen pass instead of refreshing it every couple of frames and wasting GPU power needlessly.

That very same Alan Wake 2 developer had an amazing global illumination method in Quantum Break.



Alan Wake 2 still looks good with ray tracing off.

For path tracing, it certainly does not look like it justifies an 82 TFLOP, 24GB GDDR6X 4090 + 32GB RAM with a 7800X3D to get ~32 FPS before you start with upscalers and frame gen. I'm all for a future of path tracing, and for hardware moving in this direction by speeding things up, but we're in the "this is dumb" phase because performance is utter shit.

Forbidden West, although I want nothing to do with that franchise because it's boring, wipes the floor with most ray/path tracing games using pure raster.

As someone said in this thread, it's really about dev talent, way more than technology.

 

Buggy Loop

Gold Member
So, games look better, like I said. Good! Glad we agree.

P
L
A
T
E
A
U

The Last of Us Part 2 looks better than Alan Wake 2

Certainly ain't worth the hardware cost we see. Worst generational leap in all of gaming. A 4090 to run Alan Wake 2 path tracing is a laughable joke of a proposition, even to insert into the discussion of graphical progress. Top-line PC GPUs used to be generational leaps above consoles. Not this...

 

HeWhoWalks

Gold Member
P
L
A
T
E
A
U

Certainly ain't worth the hardware cost we see. Worst generational leap in all of gaming. A 4090 to run Alan Wake 2 path tracing is a laughable joke of a proposition, even to insert into the discussion of graphical progress. Top-line PC GPUs used to be generational leaps above consoles. Not this...


"Last of us Part 2 looks better than Alan wake 2".... oh my.

Oh well, you've got this!
 