
Will a 4070 Ti be enough for 4-5 years at 1080p/1440p?

RoboFu

One of the green rats
The real answer is… no one knows.
Will there be some major tech breakthrough during that time? 🤷‍♂️ If so, then no, it won't.
 

Bojji

Member
Your 3900x will probably bottleneck the 4070 Ti (in certain titles) compared to something like the 5800X3D, which could be a direct slot-in replacement assuming you have a B450 motherboard or better (after a BIOS update that is).

Get yourself the 4070 Ti and then eventually (after a few years) when you're feeling the heat, drop some money on the 5800X3D (which should be way cheaper by then anyway).

I don't need a high-end CPU because I'm not targeting extremely high fps; 45-60 is enough for me.

You will see situations where the 3900X can't even produce that 45 fps. Real talk, this CPU is not good enough for gaming in 2023; many games are super CPU-limited (especially with RT), and this CPU will bottleneck you like a motherfucker.

Changing to a 5800X3D should be your priority.

As for the GPU, I think the best GPU right now is the 7800 XT; it has the best price/performance ratio of all released hardware so far, I think. 16GB of VRAM will be enough for years.

Personally I can't live without DLSS (and 3.5 looks like a game changer for RT), so I'm sticking with overpriced Nvidia until AMD steps up their game with FSR quality.
 

SF Kosmo

Banned
I have a 1080 Ti atm and I'm planning to upgrade. I would like to buy a 4070 Ti, but I would also like to wait for the 50 series and buy a 5070.

The 1080 Ti lasted a long time, and still plays games at decent fps, so I would like my next GPU to last me just as long.

The only problem I can think of is VRAM; maybe 12GB won't be enough in a couple of years.

So what should I do? Buy a 4070 Ti or wait for a 5070 Ti?

EDIT: I have enough money for a 4080. Is it a wiser choice? The 4090 is out of the question; in my country it's double the price.
I personally think the VRAM concern is overblown when it comes to 12GB cards, and frankly a lot of it is just AMD cope. The games we have that are pushing past 8GB usage are unoptimized console ports, and because consoles are a fixed spec, those are unlikely to get worse this console gen. Maybe if you are looking for something to take you beyond 2027, then 12GB of VRAM will be an issue.

Is a 4080 worth the extra $150-200? If you're playing the longevity game then yes, a 4080 will stay useful for a year or two longer than the 4070Ti. But if you're planning to replace in four years regardless, then the 4070Ti is gonna perform great in that time.
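Back-of-envelope version of that trade-off; the prices and useful lifespans here are just my own assumptions for illustration, not official figures:

```python
# Rough cost-per-year comparison between the two cards.
# Prices and useful lifespans are illustrative assumptions.
def cost_per_year(price_usd: float, useful_years: float) -> float:
    return price_usd / useful_years

rtx_4070_ti = cost_per_year(price_usd=800, useful_years=4)    # ~$200/year
rtx_4080 = cost_per_year(price_usd=1000, useful_years=5.5)    # ~$182/year

print(f"4070 Ti: ~${rtx_4070_ti:.0f}/yr vs 4080: ~${rtx_4080:.0f}/yr")
```

On those assumptions the two roughly wash out per year, which is why it mostly comes down to how long you actually plan to keep the card.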
 

SF Kosmo

Banned
not if they are close in price, and the 7900 XTX is way ahead of the 4070 Ti in everything
The 7900 XTX is priced and positioned as a 4080 competitor, not a 4070 Ti competitor.

And it's really time to move beyond pure benchmarks. The feature gap is widening with these cards. Nvidia is pushing these amazing-looking path traced games like Cyberpunk and Alan Wake II, and AMD can barely run them at a playable framerate. All this DLSS stuff really matters; it's beyond just a performance saver at this point, it's allowing Nvidia to produce transformatively different results and run content that feels a generation beyond what you can do on AMD cards.
 

MikeM

Member
Yes, you'll be 100% fine. The majority of gamers are still playing with 8GB and they aren't complaining. Look at the stats on Steam. Don't let GAF fool you. GAF is the kind of guy that will tell you you need an OLED and 32GB of RAM, or else you aren't a real gamer and games won't work for you. Besides, the future is heading towards AI with DLSS 3.5 and FSR 3, so it's highly unlikely games will ever push VRAM requirements much higher, and those that do are unoptimized turds that you shouldn't touch at launch.
The majority of gamers play at 1080p though; a 4070 Ti isn't that kind of card.
 

hinch7

Member
Next-gen consoles are probably several years away, and there will be a cross-gen transition phase for the first couple of years, so yeah, easily. You might have to lower some settings down the line, but it's nothing to worry about.

Most people don't even have a GPU close to a 3070 power level, never mind a 3080 or 3090 (4070 Ti). And 12GB of VRAM is enough, for now. Unless you do a lot of modding it's fine for 1440p for the foreseeable future.
 

Gaiff

SBI’s Resident Gaslighter
Hello everyone,

I'm also contemplating upgrading from my RTX 2080. My current CPU is a 9700K, so I'm curious whether it might bottleneck a potential RTX 4080 upgrade. My primary focus is on VR and sim racing games at a resolution of 3440x1440@120FPS.
Yes, it will most definitely bottleneck a 4080 at 1080p and 1440p. 4K won't be as bad.
 

buenoblue

Member
Buy a lower end GPU and upgrade more often.

That makes more sense to me.
Yeah, this is what I've been doing. I spent £1000 on 780 Ti SLI in 2013. Then pretty soon VRAM became an issue, so I got a 6GB 1070, then a 2070 Super, and I've just upgraded to a 4070. Every 2-3 years I get a nice upgrade, but it only costs me £200-300 once I sell my old card. Though I do game on PS5 a lot, so I don't need a bleeding-edge PC nowadays.
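Rough numbers on why that works out; the prices and resale values below are assumptions based on what I typically pay, not exact figures:

```python
# Rolling mid-range upgrades (with resale) vs one big card kept ~6 years.
# All figures are illustrative assumptions in GBP.
midrange_price = 600   # assumed price of each new mid-range card
resale_value = 350     # assumed resale of the 2-3 year old card
upgrades = 3           # e.g. three cards across ~6 years

rolling_cost = midrange_price + (upgrades - 1) * (midrange_price - resale_value)
highend_cost = 1200    # assumed one-off high-end card kept the whole 6 years

print(f"Rolling upgrades: ~£{rolling_cost} vs single high-end card: ~£{highend_cost}")
```

Similar total spend either way, but the rolling path keeps you on a fresher feature set.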
 

UltimaKilo

Gold Member
I agree that it's a transition time, so cards are unlikely to last you as long. Why not pick up a mid-range card and wait another 2 years or so?
 

Hoddi

Member
12GB cards are still perfectly usable, but I wouldn't buy a new GPU with that amount in 2023. I still use a 2080 Ti at 3440x1440 and I'm increasingly seeing games hit VRAM limits at this resolution. Rift Apart runs very well with RT on this card, for example, but I need to drop textures to medium for it to fit in video memory. Higher texture settings otherwise cause visible thrashing on the PCIe bus and performance drops in half. Disabling RT then makes it perform normally. There have also been other recent examples, like Diablo 4.

I still have no intention to upgrade any time soon because the card mostly performs well enough. But I wouldn't replace it with a 12GB card if it came to that.
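If you want to check whether a game is actually bumping into the VRAM ceiling rather than guessing from stutter, something like this rough sketch works on Nvidia cards; polling once a second and the simple parsing are just my own choices:

```python
# Poll dedicated VRAM usage once per second via nvidia-smi while a game runs.
# Assumes an Nvidia GPU with the standard driver utilities installed.
import subprocess
import time

while True:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # First line = first GPU; values are reported in MiB.
    used, total = (int(x) for x in out.strip().splitlines()[0].split(", "))
    print(f"VRAM: {used} / {total} MiB ({used / total:.0%})")
    time.sleep(1)
```

If the used figure sits pinned at the card's limit while performance tanks, that's the thrashing I'm describing.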
 

//DEVIL//

Member
12 gigs is not enough, or barely enough, for current games (with ray tracing enabled).

So no. A 4080 is. Or an AMD 7900 XTX.
 

SF Kosmo

Banned
I would not upgrade from a 1080Ti to 4070Ti. Makes no sense imho.
Wait for the Radeon RX 8800XT or Nvidia RTX 5070Ti
What? Why does that make no sense? The 4070 Ti is like double the raster performance and way more than that when you factor in RT and AI features. Why would that not be a sensible upgrade? It would be night and day.

8800XT might not even exist and 5070Ti is probably two years away.

12 gigs is not enough, or barely enough, for current games (with ray tracing enabled).

So no. A 4080 is. Or an AMD 7900 XTX.
If you care about high end ray tracing features AMD is not really an option. A 4070Ti is gonna run path traced Cyberpunk or Portal RTX infinitely better than the XTX.
 

b0uncyfr0

Member
With the current shitty consoles as a baseline? For sure
Huh? The current consoles have about 13-14GB of VRAM usable.
The 4070 Ti has more horsepower, but what's the point if the fps drops down to 10 when it runs out of VRAM and starts swapping with system RAM :pie_diana:

If there's one thing you need more of, especially to keep your card long term, it's VRAM.
Buying a 4070 Ti is dumb, especially at 1440p and 4-5 years. I doubt it'll last 2-3 years.
 

Bojji

Member
Huh? The current consoles have about 13-14GB of VRAM usable.
The 4070 Ti has more horsepower, but what's the point if the fps drops down to 10 when it runs out of VRAM and starts swapping with system RAM :pie_diana:

If there's one thing you need more of, especially to keep your card long term, it's VRAM.
Buying a 4070 Ti is dumb, especially at 1440p and 4-5 years. I doubt it'll last 2-3 years.

Out of that usable RAM pool, not everything is used as VRAM; the CPU needs a few GB too most of the time.

12GB is the lowest amount of GPU memory I would recommend to anybody, but these GPUs should be fine playing PS5 and XSX games (on console settings!). A possible PS5 Pro with a bigger memory pool is another matter, though...
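Rough budget for context; the OS reserve is roughly known, but the CPU-side split below is a ballpark assumption, not a published figure:

```python
# Ballpark split of the consoles' unified memory pool.
total_ram = 16.0      # GB of unified GDDR6 on PS5 / Series X
os_reserved = 2.5     # GB kept back by the system (approximate)
cpu_side_data = 3.5   # GB of game data that isn't graphics (assumed, varies per game)

gpu_usable = total_ram - os_reserved - cpu_side_data
print(f"~{gpu_usable:.1f} GB effectively acting as 'VRAM'")  # ~10 GB
```

Which is why 12GB cards can still match console settings today, just without much headroom.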
 
What? Why does that make no sense? The 4070 Ti is like double the raster performance and way more than that when you factor in RT and AI features. Why would that not be a sensible upgrade? It would be night and day.

8800XT might not even exist and 5070Ti is probably two years away.


If you care about high end ray tracing features AMD is not really an option. A 4070Ti is gonna run path traced Cyberpunk or Portal RTX infinitely better than the XTX.
The RX 8800XT will come out next year.
And yes, the 5070Ti is probably 2 years away, but upgrading from an 11GB 1080Ti to a 4070Ti and downgrading two tiers makes no sense at all.
Sure, if that person only cares about RT and raster performance and doesn't mind upgrading that 4070Ti again in 3-4 years, go for it.
But it's not reasonable.


Get an 8800XT or 5070Ti for a real upgrade that lasts 5-6 years and is easily good for 1440p and 4K.
 

Bojji

Member
The RX 8800XT will come out next year.
And yes, the 5070Ti is probably 2 years away, but upgrading from an 11GB 1080Ti to a 4070Ti and downgrading two tiers makes no sense at all.
Sure, if that person only cares about RT and raster performance and doesn't mind upgrading that 4070Ti again in 3-4 years, go for it.
But it's not reasonable.


Get an 8800XT or 5070Ti for a real upgrade that lasts 5-6 years and is easily good for 1440p and 4K.

What other thing should he care about? Losing pride and honor after downgrading two tiers? Hahaha.

Performance is the most important metric; GPU names and tiers are irrelevant, and the 4070 Ti is much more powerful than the 1080 Ti.

There might not be an 8800XT if AMD really is going mid-tier only with RDNA4.
 
What other thing should he care about? Losing pride and honor after downgrading two tiers? Hahaha.

Performance is the most important metric; GPU names and tiers are irrelevant, and the 4070 Ti is much more powerful than the 1080 Ti.

There might not be an 8800XT if AMD really is going mid-tier only with RDNA4.
That's already taking that into account.
There won't be any 8900XT or 8900XTX, but the 8800XT will be the top-end card, with a mid-tier chip.


The 4070Ti is just 40% (4K) to 70% (1440p) faster after 6 years.
The only real generational leap is in RT games.


When I upgraded from a 470 to a 1070, I got 3x the performance after 6 years.
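Put another way, as an annualized rate (the 1.7x and 3x here are my own rough ratios from above, not benchmark measurements):

```python
# Convert an overall speedup over N years into an equivalent yearly gain.
def annual_gain(total_speedup: float, years: float) -> float:
    return total_speedup ** (1 / years) - 1

print(f"1.7x over 6 years ≈ {annual_gain(1.7, 6):.0%} per year")  # ~9% per year
print(f"3.0x over 6 years ≈ {annual_gain(3.0, 6):.0%} per year")  # ~20% per year
```

That's the gap I mean when I say the raster jump feels small for six years.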
 

Bojji

Member
That's already taking that into account.
There won't be any 8900XT or 8900XTX, but the 8800XT will be the top-end card, with a mid-tier chip.


The 4070Ti is just 40% (4K) to 70% (1440p) faster after 6 years.
The only real generational leap is in RT games.


When I upgraded from a 470 to a 1070, I got 3x the performance after 6 years.

[images]
 

SmokedMeat

Gamer™
Get yourself a 7900XT and use the savings for a 5800X3D.

You’d be wasting the potential of a powerful GPU, by pairing it with a 3900X.

You'll have FSR 3, their frame gen thing (whatever it's called) for all DX11 and DX12 games, and better rasterization than Nvidia GPUs in the same class.
 

diffusionx

Gold Member
I don't know if any of the options on the market will hold up over the next five years, with what these new engines like UE5 are doing, bringing damn 4090s to their knees. I have a 3080 12GB and would love to upgrade it, but the 40-series isn't good enough. Of course Nvidia is stuffing their new cards with all sorts of algorithmic tricks to get more out of them, but that's ultimately because the cards aren't good enough.

I said this in another thread, but it wasn't that long ago that we had to upgrade our GPUs every 2 years, 3 at most, if we wanted to keep up, and, you know, we might be in that world again.
 
I would not upgrade from a 1080Ti to 4070Ti. Makes no sense imho.
Wait for the Radeon RX 8800XT or Nvidia RTX 5070Ti
Waiting for the next best thing almost never makes sense, unless we're talking about something that's almost here and a major improvement, a true leap, like Ryzen, Core 2 Duo, or Radeon 9000 (edit: 9600 Pro and 9800 Pro, not the XT stuff /e), where waiting for their release made sense once any credible info was out. If the 4070 makes no sense now, the 5070 makes no sense when it's released, since the 6070 is around the corner. You can argue, though, whether getting e.g. a 4060 and then a 5060 or 6060 might not be an option: instead of hoping to get through several years with one card, make some smaller hops in between.

Nothing will ever be truly future proof. But aligning your upgrade path with major changes like new consoles and/or the relevant engines should work. So I don't see how anything that is fine today will be garbage soon.
While the AI image stuff might offer some other unexpected jump, major shifts in performance don't seem to be on the horizon, and the tanking performance with RT will be a work in progress for some time, similar to Doom 3's shadows, which were tough at release, while today's hardware and software have no problem at all with that stuff.
 

draliko

Member
Waiting for the next best thing almost never makes sense, unless we're talking about something that's almost here and a major improvement, a true leap, like Ryzen, Core 2 Duo, or Radeon 9000, where waiting for their release made sense once any credible info was out. If the 4070 makes no sense now, the 5070 makes no sense when it's released, since the 6070 is around the corner. You can argue, though, whether getting e.g. a 4060 and then a 5060 or 6060 might not be an option: instead of hoping to get through several years with one card, make some smaller hops in between.

Nothing will ever be truly future proof. But aligning your upgrade path with major changes like new consoles and/or the relevant engines should work. So I don't see how anything that is fine today will be garbage soon.
While the AI image stuff might offer some other unexpected jump, major shifts in performance don't seem to be on the horizon, and the tanking performance with RT will be a work in progress for some time, similar to Doom 3's shadows, which were tough at release, while today's hardware and software have no problem at all with that stuff.
This is the only real answer. You can't really future proof anything; sometimes you get lucky and make a great purchase, sometimes things go way differently from what's planned. The best way would be to select an upgrade path (as suggested) and time it with new console gens or mid-gens; this is probably the best way to maximise your money, but still, as said, nothing is future proof.
 

Gaiff

SBI’s Resident Gaslighter
That's already taking that into account.
There won't be any 8900XT or 8900XTX, but the 8800XT will be the top-end card, with a mid-tier chip.


The 4070Ti is just 40% (4K) to 70% (1440p) faster after 6 years.
The only real generational leap is in RT games.


When I upgraded from a 470 to a 1070, I got 3x the performance after 6 years.
A 10-second Google search will tell you this is completely false.
 
Short answer: yes.

Long answer: if you play the "I'll wait for the next one" waiting game, you're already dead.

Just get the 4070 Ti bruh, and if you're that concerned, get the 4080 since you can afford it.
 

rofif

Can’t Git Gud
Yeah, probably.
But buying a graphics card to last for years never makes sense.
Your choice now will do little to shield you from the future.
Do you think it matters to people whether they got a 2070 or a 2080 five years ago? No.
 

Sanepar

Member
xx60 and xx70 cards for 4K@60 fps and high settings (even with DLSS) aren't worth the money imo: low VRAM and middling performance. They're only worth it if you want to upgrade every 2 years.
 

Magic Carpet

Gold Member
I'm also in that 1080 Ti conundrum. I think it's best to wait it out even longer and see what pops up in the next couple of years. Maybe China will surprise everyone with stolen Nvidia tech at 1/3 the price.
 

lmimmfn

Member
I went from a 1080Ti to a 4070Ti, however I know it won't last anywhere near the 6 years I had my 1080Ti.
While I know the 4070Ti is crippled in terms of VRAM, I can see it lasting 2-3 years before settings need to be turned down. The reasons I chose it over a 7900XT were:
- Power consumption: the 4070Ti is more efficient than the 7900XT, especially at idle, which matters for those of us in Europe where 1kWh costs over 40 cents (rough cost math at the end of this post).
- G-Sync: my monitor is G-Sync only (it doesn't have FreeSync) and there is no way in hell I would give up adaptive sync after using it for years.
- I prefer DLSS over FSR.
Ray tracing I'm not really bothered about, as it's a performance hog.

So for me it boiled down to VRAM vs power consumption/G-Sync/DLSS, and I chose the 4070Ti.

Regardless of what you choose, OP, the performance difference of a 4070Ti or 7900XT vs a 1080Ti is absolutely massive. Playing ultrawide at 3440x1440, Cyberpunk is awesome: from struggling at 47 FPS (not great even with G-Sync) to 80+ FPS at higher settings.

After getting the 4070Ti, though, my aging Intel i7 6800K (also 6 years old) wasn't cutting it anymore. I considered a 5800X3D, but since I'd have to buy a new motherboard along with the CPU, I just chose to go 7800X3D, which should last me 6 years (PCIe 5 for the GPU and NVMe). For the OP, though, a 5800X3D would be a much cheaper solution.
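On the power point above, here's the rough math on what a wattage gap adds up to per year; the wattage delta and daily hours are assumptions, only the ~40 cent/kWh rate is what I actually pay:

```python
# Yearly electricity cost difference between two cards under assumed usage.
watt_gap = 50         # assumed average extra draw of the thirstier card, in watts
hours_per_day = 3     # assumed gaming time per day
eur_per_kwh = 0.40    # roughly what electricity costs here

yearly_extra = watt_gap / 1000 * hours_per_day * 365 * eur_per_kwh
print(f"~€{yearly_extra:.0f} extra per year")  # ~€22 per year
```

Not huge on its own, but the card also sits at idle for far more hours than it spends gaming, so it adds up.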
 

Kacho

Gold Member
Definitely at 1080p. At 1440p you will likely need to start scaling back settings to maintain high framerates but that's fine.

The 2070 was a huge disappointment for me out of the box. I upgraded to a 4070ti earlier this year and I couldn't be happier with it. Should last me much longer than the 2070 did.
 
The dumbasses saying no are the same people who answered no when the same question was probably asked back in the 1070/1080 generation, but the facts speak for themselves many years later.

[image]



You do not need to put texture quality on max, because realistically you will never notice the difference between max and one step lower, yet max will always be there to eat VRAM so people who have the best GPUs on the market can feel good about themselves. Same with shadows, same with certain RT features. You'll be fine for many years playing games at 1440p with almost maxed-out settings, and even ray tracing if you have DLSS or FSR. Buy the GPU you want and stop asking people who only buy the latest and best to tell you no. Also stop watching benchmark videos; they specifically test games on max settings with everything cranked up, settings where you will never notice any visual difference, yet the VRAM consumption is huge. Peace out.
 
Sure, you just keep lowering settings until you don't like what you see anymore, then you upgrade.

My 980 Ti lasted over 5 years and I was using it for 4K30.
I upgraded once I couldn't get around the VRAM limitations anymore (6GB).
 

Crayon

Member
I don't think anyone can really answer that right now. I think the biggest question here is whether you are trying to future-proof or you've got games right now that aren't running as well as you'd like.
 