
How many teraflops do we need for excellent graphics?

TheGrat1

Member
 

SlimySnake

Flashless at the Golden Globes
People always complain about graphics,
but I feel that if 4K@60 with full path tracing were the default settings,
I would be satisfied as hell.

How many teraflops do we need to reach the bare minimum?
The 4090 is 63 tflops, but it performs more like a 35-40 tflops card because Nvidia inflates its tflops numbers.

Right now, for the games currently on the market, the 4090 seems to be enough for 4K 60 using path tracing, but you will need DLSS.

The problem is that next-gen devs will push more geometry, more effects, better character models, better textures, more NPCs, more everything. So even if by some miracle we get a 4090 in a PS5 next gen, only cross-gen games will run at 4K 60 fps. Next-gen games will still run at 30 fps, and at sub-1080p resolutions in their 60 fps modes.

You can get to 100 tflops and the problem will remain.
 

Geometric-Crusher

"Nintendo games are like indies, and worth at most $19" 🤡
This ran on 1.3 TF in 2013 on the Xbox One





It depends on the dev.
Today we have 12 / 16 / 20 TF...
Some scenes are CGI, dude, but yeah, this game looks fine.
I'm impressed by how the Xbox One managed such good graphics in Star Wars, Gears 5, and Forza Horizon; even though RDR2 was the worst version there, it still impressed me.
 
We're hitting diminishing returns with graphics.

The amount of work required to produce a game with good graphics takes an exponentially increasing amount of man-hours and budget.
 

Mayar

Member
Are we talking about graphics from the point of view of technology, or from the point of view of art style and design? If the second, then teraflops in principle do not matter. Creating a beautiful picture is not just graphics and the power of your PC or console; it takes talent and artistic mastery on the part of the designers, 3D modelers, etc. to put a picture on the screen that captivates the eye.
 

HeWhoWalks

Gold Member
We're hitting diminishing returns with graphics.

The amount of work required to produce a game with good graphics takes an exponentially increasing amount of man-hours and budget.
It's not graphical fidelity that is the issue, as you don't really need a huge budget to produce stellar results. It's that fidelity at higher framerates has become more costly.

Video games on their own are nowhere close to hitting a graphical ceiling, though. Nowhere close. Offline rendering is still miles ahead, for example. But, they've gotten so good that to truly make a big splash, you need the production value to cooperate, so in that regard, I agree.
 

HeWhoWalks

Gold Member
We are talking about 4k@60 + path tracing as default and available settings for most developers.
But you need neither 4K/60fps nor path tracing for "excellent" graphics. If the question were "how many teraflops do you need for advanced techniques to run at acceptable/smooth framerates", that would be another matter... though it would still be a silly question! 😆
 

Parazels

Member
But you need neither 4K/60fps nor path tracing for "excellent" graphics. If the question were "how many teraflops do you need for advanced techniques to run at acceptable/smooth framerates", that would be another matter... though it would still be a silly question! 😆
I wrote initially that, for me, 4K@60 + path tracing are necessary attributes of excellent graphics.
 

HeWhoWalks

Gold Member
I wrote initially that, for me, 4K@60 + path tracing are necessary attributes of excellent graphics.
Fair, but that's you. I've seen excellent graphics all gen, and fewer than five games use path tracing. It's just not a necessity to accomplish that feat.

And yet, that still can't answer the OP's question because, well, it's asking for an absolute answer to a subjective question.
 
There will never be enough, because the graphical bar will be constantly raised along with framerate. 4K at 120 fps could be the standard next gen. You might need 40 TF just to take a current-gen game like FF16 from 1440p 30 to 4K 120 with the exact same visuals. This is why you are seeing a greater push for AI: it is more practical to have a 40 TF GPU perform like a 60 TF GPU than to actually build a 60 TF GPU that would cost $2,000.
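The 1440p/30 to 4K/120 jump above can be sanity-checked with quick pixel-rate arithmetic. This is a rough sketch that assumes GPU cost scales linearly with pixels per second, which real renderers only approximate:

```python
# Naive scaling estimate: how much more GPU grunt does 4K/120 need
# than 1440p/30, if cost scales with pixels rendered per second?
# (A simplifying assumption -- real workloads don't scale perfectly.)

def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

current = pixels_per_second(2560, 1440, 30)   # 1440p @ 30 fps
target  = pixels_per_second(3840, 2160, 120)  # 4K @ 120 fps

scale = target / current
print(f"Naive GPU scaling factor: {scale:.1f}x")  # 9.0x
```

On that naive basis, a ~10 TF console would need roughly 90 TF for the same visuals at 4K/120, which is exactly why upscaling and AI reconstruction look more practical than brute force.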
 

SHA

Member
For the ultra preset, I think 150 would be enough at the moment, but that's a silly target because you can cut corners and be smarter than the devs; in that case, I'd say 70. I'm talking about PC, not futuristic consoles.
 

Hoddi

Member
We're hitting diminishing returns with graphics.

The amount of work required to produce a game with good graphics takes an exponentially increasing amount of man-hours and budget.
That only applies to asset creation. Games don't need full motion capture or massive open worlds to look realistic, but they do need good lighting.

A path tracing mod for Quake 2 was released some years ago, and it completely transforms the game. Modern games could similarly keep the same asset costs as 10 years ago and still look significantly better with more accurate lighting. That is where faster hardware comes in.

 

SlimySnake

Flashless at the Golden Globes
TFLOPs are officially the new bits, I swear. Bandwidth, RAM amount/speed, clock speeds, and transistor count are far more important than TFLOPs. Also, graphics are excellent depending on how a dev makes the game look. To me, art style is everything.
lmao, bandwidth and RAM amount/speed are not FAR MORE important. At best they are equally important, but they still don't render the graphics; they just feed the GPU. Also, clock speeds and transistor/CU counts are literally what tflops are made of.

There has been this insane push to diminish the role tflops play in GPU power, but tflops are still the best single measure of GPU performance. You cannot take a 10 tflops PS5, give it a VRAM and bandwidth boost, and watch it perform like a 4090. I swear I lose brain cells every time I see people downplay tflops. You NEED more GPU, plain and simple. If you didn't, we would still be gaming on 1.3 tflops Xbox Ones, hoping that 16 GB of VRAM and 560 GB/s of bandwidth would make them perform like an XSX.

Just because MS botched the XSX design by not giving its 52 CU GPU enough bandwidth or cache, and stuck with a split RAM architecture, doesn't mean tflops aren't important. Just because Cerny repeated the same mistakes, making a 67% more powerful GPU perform like a 30-45% more powerful one, doesn't mean tflops aren't important. It just means he didn't give the GPU the bandwidth and lanes it needed to reach its theoretical limits. The PS5 performs like a 10 tflops GPU, the X1X like a 6 tflops GPU, the PS4 like a 1.84 tflops GPU, and the Xbox 360 like a 0.25 tflops GPU. The half-baked designs of the PS3, XSX, and PS5 Pro are an indictment of the engineering teams at Sony and MS, not of tflops.
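The "clock speeds and CU counts literally make up tflops" point is just the standard theoretical-throughput formula: FP32 TFLOPS = 2 ops per FMA × shader cores × clock in GHz. A quick sketch using the public PS5 and Xbox Series X specs (36 CUs × 64 shaders at 2.23 GHz, and 52 CUs × 64 shaders at 1.825 GHz, respectively):

```python
# Theoretical FP32 throughput: each shader core can retire one fused
# multiply-add (2 floating-point ops) per clock.

def tflops(shader_cores: int, clock_ghz: float, ops_per_clock: int = 2) -> float:
    return shader_cores * clock_ghz * ops_per_clock / 1000

print(f"PS5: {tflops(36 * 64, 2.23):.2f} TF")    # 10.28 TF
print(f"XSX: {tflops(52 * 64, 1.825):.2f} TF")   # 12.15 TF
```

This is exactly why the formula is a theoretical peak, not a benchmark: bandwidth, cache, and scheduling determine how much of it a real game ever reaches, which is the argument both sides of this exchange are circling.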
 

tkscz

Member
You are right, but what if I want Little Nightmares 3 at 4K@60 + full PT?

What kind of console do I need to get those graphics?
It's all about architecture. If it's AMD-based, AMD needs to get better at its ray/path tracing pipelines and improve FSR.

Nvidia's DLSS and RT cores handle effects like that much better. But Nvidia is apparently a pain to work with and often offers terrible deals to console manufacturers.
 

efyu_lemonardo

May I have a cookie?
You can get to 100 tflops and the problem will remain.
At some point the human eye will not be able to perceive any further increase in fidelity.

You could render a million LOD0 assets to an 8K screen with fully ray-traced lighting, running at 120 fps with complete physics, and most of that information would still go unnoticed.
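The "limits of the eye" point can be made concrete: 20/20 vision resolves roughly 60 pixels per degree, so what matters is a display's angular resolution, not its raw pixel count. A rough sketch (the screen size and viewing distance below are illustrative assumptions, not figures from the thread):

```python
import math

# Angular resolution of a display: horizontal pixels divided by the
# horizontal field of view it subtends at the viewer's eye.
# Rule of thumb: ~60 px/deg is the limit of 20/20 visual acuity.

def pixels_per_degree(h_pixels: int, screen_width_m: float, distance_m: float) -> float:
    fov_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / fov_deg

# Assumed setup: a 65" 16:9 TV (~1.44 m wide) viewed from 2.5 m.
for label, h_pixels in [("4K", 3840), ("8K", 7680)]:
    print(f"{label}: {pixels_per_degree(h_pixels, 1.44, 2.5):.0f} px/deg")
```

Under those assumptions, 4K already lands around twice the ~60 px/deg acuity limit, so the extra density of 8K is largely invisible at couch distance, which is the diminishing-returns argument in a nutshell.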
 

diffusionx

Gold Member
The 4090 is 63 tflops, but it performs more like a 35-40 tflops card because Nvidia inflates its tflops numbers.

Right now, for the games currently on the market, the 4090 seems to be enough for 4K 60 using path tracing, but you will need DLSS.

The problem is that next-gen devs will push more geometry, more effects, better character models, better textures, more NPCs, more everything. So even if by some miracle we get a 4090 in a PS5 next gen, only cross-gen games will run at 4K 60 fps. Next-gen games will still run at 30 fps, and at sub-1080p resolutions in their 60 fps modes.

You can get to 100 tflops and the problem will remain.
Same as it ever was. I'm starting to think people just aren't intelligent enough to grasp this.

This is all relative and based on technological advancement. The base PS5 can run plenty of games at 4K/60fps; they're just PS4 games. PS4 games looked great in 2018, but these days that isn't good enough. This really isn't that complicated, but people don't get it.
 

LordOcidax

Member
This ran on 1.3 TF in 2013 on the Xbox One




It depends on the Dev
Today we have 12 / 16 / 20 TF..
Perfect example. Nowadays the majority of devs just want brute force to make games, but optimization, time, and especially talent cost a lot of money too. The Order: 1886 is another great example.
 

daninthemix

Member
You can never have enough TFLOPS, and that's a commercially provable fact: the 5090 will sell out within hours of launch despite costing upwards of $2K.
 

Little Mac

Gold Member
However many teraflops it takes to make HD2D remakes of Pokémon Red/Blue, FF6, and Castlevania: Symphony of the Night.
 

BlackTron

Member
It's gonna take a lot more for an open world or a realistic racing game than for a fighting game. But we don't want a nuanced answer; we want magic buzzword numbers as some symbol of what we want and stand for, or something.
 

efyu_lemonardo

May I have a cookie?
Same as it ever was. I'm starting to think people just aren't intelligent enough to grasp this.

This is all relative and based on technological advancement. The base PS5 can run plenty of games at 4K/60fps; they're just PS4 games. PS4 games looked great in 2018, but these days that isn't good enough. This really isn't that complicated, but people don't get it.
Are you saying consumers will never be satisfied? That makes no sense.
Do you ever feel that the real world doesn't look real enough because it's remained at the same fidelity all your life?
You don't, and that means there's a limit.
 

diffusionx

Gold Member
Are you saying consumers will never be satisfied? That makes no sense.
Do you ever feel that the real world doesn't look real enough because it's remained at the same fidelity all your life?
You don't, and that means there's a limit.
This is essentially a limit function approaching infinity. Whenever there is some big leap in fidelity, people say it looks "photorealistic", then a few years later they laugh at themselves for saying it. I myself said it with NFL 2K on Dreamcast, lol.

Will games ever look indistinguishable from the real world? No. But even if they did, it wouldn't matter, because most don't even portray the real world. You could make a card with 1,000 teraflops and one with 10,000 would make it look quaint. How, I don't know, just as I couldn't imagine what a PS5 would do when the PS1 came out.
 

efyu_lemonardo

May I have a cookie?
This is essentially a limit function approaching infinity. Whenever there is some big leap in fidelity, people say it looks "photorealistic", then a few years later they laugh at themselves for saying it. I myself said it with NFL 2K on Dreamcast, lol.

Will games ever look indistinguishable from the real world? No. But even if they did, it wouldn't matter, because most don't even portray the real world.
Why wouldn't games ever look indistinguishable from reality?

CGI in cinema has already reached that point in certain areas. Eventually this will be possible to do in real time as well.

Edit: don't think about this from the perspective of ever increasing hardware power, but from the perspective of the limits of human senses.
 

diffusionx

Gold Member
Why wouldn't games ever look indistinguishable from reality?

CGI in cinema has already reached that point in certain areas. Eventually this will be possible to do in real time as well.
No, it hasn't. It looks like CGI. Actually, CGI seems to be getting worse, for non-technical reasons.
 

efyu_lemonardo

May I have a cookie?
No it doesn’t. It looks like CGI. Actually CGI seems to be getting worse for non technical reasons.
There are many CGI shots in film that you are not aware of because they look real, so you assume they were shot practically. This has been the case for a while now.

Edit: Check out this series
 

Mr.Phoenix

Member
Excellent graphics are no longer a teraflops problem. You can have excellent graphics with anything between 20-40 TF.

What we need:

  • more memory bandwidth, to the point where it's never a bottleneck
  • more/better specialized units in the GPU. E.g., enough RT muscle to do the full suite of RT effects (GI, AO, shadows, reflections) for every pixel of a 1440p image in under 2 ms, plus a well-trained AI upscaler that can take that image to 4K without any artifacts in under 2 ms as well.
That's it, that's all that's needed to have the best graphics we can have.
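Those 2 ms budgets can be put in context of a full frame. A simple arithmetic sketch, taking the per-effect budgets from the post above as assumptions:

```python
# Frame-budget arithmetic: at 60 fps every frame must finish in
# 1000/60 ≈ 16.67 ms. The RT and upscaling budgets are the 2 ms
# figures proposed in the post; the remainder is what's left for
# geometry, shading, post-processing, and everything else.

FRAME_MS = 1000 / 60   # ~16.67 ms per frame at 60 fps

rt_ms = 2.0            # full RT suite at 1440p (per the post)
upscale_ms = 2.0       # AI upscale 1440p -> 4K (per the post)

remaining = FRAME_MS - rt_ms - upscale_ms
print(f"Frame: {FRAME_MS:.2f} ms, left after RT + upscale: {remaining:.2f} ms")  # 12.67 ms left
```

In other words, the proposal spends about a quarter of a 60 fps frame on lighting and reconstruction and leaves roughly 12.7 ms for everything else, which is why the RT and upscaling units have to be that fast for the plan to work.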
 