
Will a 4070 Ti be enough for 4-5 years at 1080p/1440p?

DeaDPo0L84

Member
Yes and no. It depends on how flexible you are with settings and whether you need them maxed out. Also, DLSS will only continue to improve, so that will give your GPU longer legs.
 

nashman

Member
Don't listen to the haters, the 4070 Ti is amazing for 1440p; add in DLSS 3.5 and we're feasting. It even beats a 3090 Ti at 1440p in most new games. I'm loving mine but will get a 5080 eventually. The first few UE5 games have used 7-10 GB of VRAM at maxed settings at 1440p, so we're good.
 

diffusionx

Gold Member
The dumbasses saying no are the same people who answered no when the same question was asked back in the 1070/1080 generation, but the facts speak for themselves many years later.




You do not need to put texture quality on max because realistically you will never notice the difference between max and one step lower, yet max will always be there to eat VRAM so people who have the best GPUs on the market can feel good about themselves. Same with shadows, same with certain RT features. You'll be fine for many years playing games at 1440p with almost maxed-out settings, and even ray tracing if you have DLSS or FSR. Buy the GPU you want and stop asking people who only buy the latest and greatest to tell you no. Also stop watching benchmark videos; they specifically test games at max settings with everything cranked up, including settings where you will never notice any visual difference, yet the VRAM consumption is huge. Peace out.

Immortals of Aveum runs at ~56 fps at 1440p, and that is a game that is out now. So if you want to play at 60 fps, you are turning down settings on a game that is out right now, not three or five years from now.


Maybe that game will be the most resource-intensive game to come out over the next five years, so it's the floor. Who knows, but I doubt it.
 

Ev1L AuRoN

Member
It all depends on how you want to experience games. Ray tracing is expensive on VRAM since it needs space for the BVH structure, etc. Today most games have RT as an option, but in 5 years maybe we will start to see more games like Metro Exodus Enhanced Edition that do all their lighting with RT and are incompatible with GPUs that don't support hardware RT. It's likely to be fine, but you never know.
 

Crayon

Member
If you care about high-end ray tracing features, AMD is not really an option. A 4070 Ti is gonna run path-traced Cyberpunk or Portal RTX infinitely better than the XTX.

I can't find anything atm specifically showing Cyberpunk Overdrive, which is a really heavy RT workload. But in general the 4070 Ti and 7900 XTX seem to be neck and neck in RT. This chart shows the XTX leading by about 5%, but that surely varies by test suite.

[Chart: relative RT performance at 2560x1440]
 

Bkdk

Member
I would at least go for a 4080. Games will get a hardware-requirement bump with the mid-generation console upgrades from PlayStation and Xbox.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Seems like a bad idea to hold on to a GPU for 5 years if you want it to perform well. I'm sure you saw this with your 1080 Ti. It was a beast when it launched in 2017, but today it loses to the lowly 4060.

In 5 years a 4070 Ti/7900 XT will lose to a 6060.

I recommend doing more frequent upgrades in the mid-range. If you also sell the GPU you'll end up spending less in the end.
 

Crayon

Member
Seems like a bad idea to hold on to a GPU for 5 years if you want it to perform well. I'm sure you saw this with your 1080 Ti. It was a beast when it launched in 2017, but today it loses to the lowly 4060.

In 5 years a 4070 Ti/7900 XT will lose to a 6060.

I recommend doing more frequent upgrades in the mid-range. If you also sell the GPU you'll end up spending less in the end.

Agree. I buy a good deal on something modern every couple of years, and I can still sell the last one before it's ancient. The idea of spending 3 times as much to future-proof doesn't make sense to me, since you are starting out with a high-end card and then degrading over the next 6 years or whatever, all while drawing more power than you had to. I can spend less money and have something modern that keeps a similar level of performance over the same 6 years. Plus I get to zig and zag depending on what games demand at the time.
 

amigastar

Gold Member
The 4000 series sucks. It's just a software upgrade you're paying for. Just buy a console. Or buy a 3000 series card and wait it out.
Software upgrade? How? It's running on Tensor cores, which are hardware as far as I know. IMO the 4070 is a good card for what it is.
 

DragonNCM

Member
Absolutely yes!!!
I can still play all of today's games with my 1660 Ti 6GB using FSR 2.0 or XeSS.
The 1660 Ti launched on 19 Feb 2019.
 

Housh

Member
Consoles are terrible, and all the hardware manufacturers are focusing on AI upscaling solutions, so my guess is yes.
 

SF Kosmo

Banned
I can't find anything atm specifically showing Cyberpunk Overdrive, which is a really heavy RT workload. But in general the 4070 Ti and 7900 XTX seem to be neck and neck in RT. This chart shows the XTX leading by about 5%, but that surely varies by test suite.

[Chart: relative RT performance at 2560x1440]
The picture starts to look really different in RT-heavy stuff like Cyberpunk Overdrive and Portal RTX, though, and since those titles rely on DLSS to achieve playable framerates and high image quality, you can start multiplying that. It might not be apples to apples, but the fact is a 4070 Ti can play Cyberpunk OD at like 80-90 fps while the Radeon is gonna do 20 fps, and that's what matters if you want to play these games and not just benchmark them.

And like I said, the XTX is priced as a 4080 competitor, not a 4070 Ti.
 
People with a GTX 1080/Ti can play easily today. People with a 5070 Ti+ / 8800 XT+ will play easily for 7-8 years.
A 970, with its suboptimal memory solution, is/was probably also a fine card for a very long time.
But all these cards from those generations, prior to the 2000 series, lasted that long because the adoption of RT is still slow. It's the one new feature that is a struggle. If you don't need RT and the game doesn't force it on you, like games could and did with other new DX features and OS requirements, you are fine with "ancient" cards. You did not really know that when you bought them, though, where the industry was going to go. In hindsight, yeah, good buy, but not really what is implied by the term "future proofing".
If RT had been a bit more performant from the start, and the crypto boom hadn't fucked up prices, people might have gotten more 2080s, and then, or even before the arrival of the 3080, the industry would have flipped the switch one gen earlier and minimum requirements would have excluded the 1080 already.
If someone is happy with whatever they purchased for 5 or even 8 years, great, sure, but you can't really plan that. You can hope for it, but imho not shooting for the sky and instead upgrading in smaller, fairly frequent steps is almost certainly the better way: always staying above the good-enough level, rather than starting great and then slowly drifting further away from the marketing visuals. I never understood why anyone would fantasize about a 5-year timeframe on PC. Get a console if you don't want to regularly upgrade, since even the market leader Nvidia themselves won't know for sure how the market will look that far out. That's like preordering a game 5 years in advance, without knowing anything besides that the company making it has usually delivered previously.
 

SolidQ

Member
I never understood why anyone would fantasize about a 5-year timeframe on PC
Because games are made for consoles, and the PC versions never have different graphics engines. The only generation where PC had different engines was the PS1 era. You just need a minimum of 3-4x more GPU power than the console and you will be fine for the whole console generation. The only question is what resolution you play at.
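Rough back-of-the-envelope math for that multiplier (the TFLOPS figures are approximate and not directly comparable across architectures, so treat this as a sanity check, not a benchmark):

```python
# Crude check of the "3-4x the console's GPU power" rule of thumb.
# FP32 TFLOPS are only a rough proxy for real game performance.
ps5_tflops = 10.3          # approximate PS5 GPU FP32 throughput
rtx_4070_ti_tflops = 40.0  # approximate 4070 Ti FP32 throughput

multiple = rtx_4070_ti_tflops / ps5_tflops
print(f"A 4070 Ti is roughly {multiple:.1f}x a PS5 on paper")  # ~3.9x

# By this rule of thumb it sits at the top of the 3-4x band, which is
# why people expect it to hold up for the rest of the console generation.
```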

If you don't need RT and the game doesn't force it
There's no game that forces RT, and there won't be until the PS6 generation, and maybe even the PS6 won't force it.

You did not really know that when you bought them, though, where the industry was going to go.
The industry moves very, very slowly now; it's not the 2000s, where you had DX7, 8, 9, 10 and 11.
 

Shifty1897

Member
A 4070 Ti will get you to at least 2028 for all games, and to 2030 for most if we have a long cross-gen period like we did with PS4 to PS5.
 

amigastar

Gold Member
A 4070 Ti will get you to at least 2028 for all games, and to 2030 for most if we have a long cross-gen period like we did with PS4 to PS5.
2028 seems about right, not sure about 2030 though.
I have a 4070 and I hope it will hold out for a while, actually.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Hmmm...isn't the 7900XTX a little more expensive than the 4070ti?

At the time of that post... in August 2023.
Both GPUs were around 900 dollars, but the 7900 XTX was trading blows with the 4080 in raster.

Today, honestly, it's kind of a toss-up.
If you play a lot of Unreal Engine 5 games and/or games with RT, the 4070 Ti Super is better.
If you are mostly playing raster games, the 7900 XTX gets the job done.


Realistically, at this juncture, just wait for the next generation unless you are getting a good deal.
 

buenoblue

Member
I picked up a 4070 Ti in the summer for £599 and it's been an amazing upgrade from my 2070 Super. It plays new games really well with DLSS and frame gen, and anything from a couple of years ago or older at 4K max.

I don't think it will last 5 years though. After selling my 2070S for £200, it only cost me £400. I'll probably upgrade in a couple of years to the 60 series.
 

Bojji

Member
I had a 4070 Ti in 2023 for a few months; the raw power of the card is good, but some games were already VRAM-limited at the highest settings...

Sold it and bought a used 6800, then a 3080 Ti, and used that for a few months.

A few months ago I sold the 3080 Ti (basically for the same price I bought it for) and went with a 4070 Ti Super - it's not much more powerful than the regular Ti, but that 16GB makes all the difference in some games.

At console settings a 4070 Ti will be good for the next few years, but if you want more than that, you are already limited.

Edit: I know this thread is old as fuck.
 

Lorianus

Member
Trying to play games on ultra is dumb anyway, when a game often looks practically the same on medium-to-high settings while you suddenly gain 30-40 fps and the VRAM usage drops from 9-10GB to 6-7GB. Even a 3080 with 10GB will last you this generation at 1440p on reasonable settings.
 

Bojji

Member
Trying to play games on ultra is dumb anyway, when a game often looks practically the same on medium-to-high settings while you suddenly gain 30-40 fps and the VRAM usage drops from 9-10GB to 6-7GB. Even a 3080 with 10GB will last you this generation at 1440p on reasonable settings.

Raster games usually look similar between high and ultra, that's true. But path tracing changes the graphics completely, and it requires a shit ton of VRAM to function: CP2077, AW and IJ easily go above 13-14GB of usage with PT (Indiana above 15GB). Only Wukong uses less, but that's mostly thanks to VRAM-friendly UE5; even fucking Portal goes to ~12GB.

There is also texture quality, which contributes a lot to VRAM use; I don't like looking at shit, muddy textures. Some people think RT is a gimmick, but even most of them will admit that textures are very important.
 
It depends on what the next consoles are like, as that will become the baseline for developers. If they have 24GB or more, you won't be able to lower the settings enough. You will definitely need to upgrade by 2028.
 

Bojji

Member
It depends on what the next consoles are like, as that will become the baseline for developers. If they have 24GB or more, you won't be able to lower the settings enough. You will definitely need to upgrade by 2028.

Cross-gen; all games will be on PS5 for the first few years.

I would say 2029-2030.
 
Cross-gen; all games will be on PS5 for the first few years.

I would say 2029-2030.
Yes, but these cross-gen games will likely run at 1080p/30 on the PS5, while the OP is talking about 1080p/1440p and presumably 60 fps. It will be a push on these late-gen games; they will be the equivalent of Cyberpunk on last-gen consoles. You want clear space. The RAM situation on PC GPUs is a potential mess if the next consoles have anything near 32GB of memory, which is unlikely but possible since that's the trend. Developers would be building games around that level of RAM, as the consoles are the target platform, and the majority of GPUs that people have will still be around 16GB by 2029/2030. Most people don't have a 4090.
 

Bojji

Member
Yes, but these cross-gen games will likely run at 1080p/30 on the PS5, while the OP is talking about 1080p/1440p and presumably 60 fps. It will be a push on these late-gen games; they will be the equivalent of Cyberpunk on last-gen consoles. You want clear space. The RAM situation on PC GPUs is a potential mess if the next consoles have anything near 32GB of memory, which is unlikely but possible since that's the trend. Developers would be building games around that level of RAM, as the consoles are the target platform, and the majority of GPUs that people have will still be around 16GB by 2029/2030. Most people don't have a 4090.

I agree.

But some percentage of a console's memory always has to be used as system RAM (plus a few GB for the OS), so I would say that 24GB GPUs will be fully usable next gen even if consoles have 32GB.
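Rough back-of-the-envelope sketch of that split (the reserve and system-RAM numbers are just assumptions, loosely modelled on how current consoles carve up their unified memory):

```python
# Hypothetical estimate of how much of a next-gen console's unified memory
# would actually be available for graphics, vs. a discrete GPU's VRAM.
console_total_gb = 32.0   # assumed next-gen unified memory pool
os_reserve_gb = 4.0       # assumed OS/system reservation
game_system_ram_gb = 6.0  # assumed portion a game uses as "system RAM" (CPU-side data)

graphics_budget_gb = console_total_gb - os_reserve_gb - game_system_ram_gb
print(f"Console graphics budget: ~{graphics_budget_gb:.0f} GB")  # ~22 GB

gpu_vram_gb = 24.0  # a 24GB PC GPU
print("A 24GB GPU covers it:", gpu_vram_gb >= graphics_budget_gb)  # True under these assumptions
```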
 

nkarafo

Member
I play at 1080p with a 3060 12GB, and some games have already been consuming up to 10-11GB for a couple of years now, even with DLSS and some settings toned down.

I fully expect 12GB to become a bottleneck at 1080p this very year.
 