Digital Foundry - 4070 Ti @ 1440p proportionally equal to 4090 @ 4K

I'd rather get the 7900 XTX for $100 more.

There is no good answer here. The 7900 XT is better overall and more future-proof with its VRAM, but I CAN'T LIVE without DLSS, so it's not an option for me. Plus, Ray Reconstruction is going to be important in the future. AMD cards are good but lack features that are important to many people.

There will always be exclusive settings. See: path tracing. The 4070 Ti is far behind, even at 1440p.

What they meant is that the 4070 Ti has around half the power of the 4090 (just as 1440p is roughly half the pixels of 4K). It also has half the VRAM, half the memory bus and half the price.
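For anyone wanting to sanity-check that "roughly half" figure, here's a quick Python sketch of the raw pixel counts. Pixel count is only a crude proxy for GPU load, so treat the percentages as ballpark, not real scaling.

```python
# Quick sanity check on the "1440p is ~half of 4K" claim using raw pixel counts.
# Pixel count is only a rough proxy for GPU load; real scaling varies per game.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

pixels_4k = resolutions["4K"][0] * resolutions["4K"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px, {pixels / pixels_4k:.0%} of 4K")
# 1440p works out to ~44% of 4K's pixels, so "roughly half" is in the right ballpark.
```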
 
All I want is to stay competitive with the consoles this generation; I don't care if my 4070 won't be cutting edge in the coming years. I want to upgrade my PC when the next PlayStation and Xbox are released.
 
There is no good answer here. The 7900 XT is better overall and more future-proof with its VRAM, but I CAN'T LIVE without DLSS, so it's not an option for me. Plus, Ray Reconstruction is going to be important in the future. AMD cards are good but lack features that are important to many people.



What they meant is that the 4070 Ti has around half the power of the 4090 (just as 1440p is roughly half the pixels of 4K). It also has half the VRAM, half the memory bus and half the price.

I think FSR 2.2 is good enough for most people for image reconstruction, and FSR 3.0 is actually testing ahead of DLSS frame generation, but I understand your stance. DLSS still offers superior IQ compared to FSR 2.2 and ray reconstruction is going to be a big deal moving forward.
 
??

I don't buy a GPU every year. I have a 3080 and I'm not going to buy a 40-series card. My next upgrade will be a 5080, which I simply want to be 20% faster than the flagship 40-series card, just like the 4080 was 20% faster than the 3090 Ti. That should give me roughly a 100-110% performance boost compared to what I have today.
I went from a 1070 to a 2080 Ti to a 3080 12GB to a 4080, which I exchanged for a 4090.

I expected my 3080 12GB to last much longer, but I don't miss it. I had my 4080 for a month and decided that if I was going to spend $1,300 on a GPU, I might as well spend $1,600 and get something considerably faster.
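A back-of-the-envelope check on that 100-110% figure, assuming the 4090 is roughly 70-75% faster than a 3080 at 4K. That multiplier is an assumption for illustration; the real gap varies by game, resolution and RT usage.

```python
# Back-of-the-envelope check on the "~100-110% faster than my 3080" hope.
# ASSUMPTION: the 4090 is roughly 70-75% faster than a 3080 at 4K; the real
# gap varies a lot by game, resolution and RT usage.
gain_3080_to_4090 = (1.70, 1.75)   # assumed multiplier range
gain_4090_to_5080 = 1.20           # the poster's hoped-for uplift

for g in gain_3080_to_4090:
    total = g * gain_4090_to_5080
    print(f"3080 -> 5080: {total:.2f}x ({(total - 1):.0%} faster)")
# With those assumptions, the hoped-for 5080 lands around 2.0-2.1x a 3080,
# i.e. the ~100-110% uplift the post is counting on.
```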
 
Nope. The 4070 Ti is around 60% of the performance of a 4090 at half the price or less.

I got my 4070 Ti for £600, which is still A LOT for a graphics card, and I would never pay close to £1K for one component to play some games, never mind £1.5K.
Still not the value the 3070 was. The 3070 was 67% of the power of a 3090 for a third of the price.

Or the 3080 10GB, which was half the price for 90% of the performance.
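Taking the rough performance percentages from these posts at face value, together with the US launch MSRPs, a small perf-per-dollar comparison looks like this. It's a back-of-the-envelope sketch, not a benchmark.

```python
# Rough value comparison using launch MSRPs and the relative-performance
# figures quoted in the posts above (perf is % of the flagship in that gen).
cards = [
    # (name, perf_vs_flagship, price_usd, flagship_price_usd)
    ("RTX 3070",      0.67, 499, 1499),   # vs RTX 3090
    ("RTX 3080 10GB", 0.90, 699, 1499),   # vs RTX 3090
    ("RTX 4070 Ti",   0.60, 799, 1599),   # vs RTX 4090
]

for name, perf, price, flagship_price in cards:
    # Normalize so the flagship is 1.0x perf-per-dollar at its own price.
    value_ratio = (perf / price) / (1.0 / flagship_price)
    print(f"{name}: {value_ratio:.2f}x the perf-per-dollar of its flagship")
# 3070 ~2.0x, 3080 ~1.9x, 4070 Ti ~1.2x -- which is why Ada looks like worse value.
```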
 
All I want is to stay competitive with the consoles this generation; I don't care if my 4070 won't be cutting edge in the coming years. I want to upgrade my PC when the next PlayStation and Xbox are released.
The 4070 already isn't cutting edge, but it should be significantly faster than either console for the remainder of the generation. It's something like 80% faster in raster and can be over twice the performance in RT workloads. The 12 GB of VRAM should also be enough, though it might not cut it for max settings + RT in the future. Not that you'd have enough GPU grunt for that without DLSS anyway.
 
Still not the value the 3070 was. The 3070 was 67% of the power of a 3090 for a third of the price.

Or the 3080 10GB, which was half the price for 90% of the performance.
Sadly, that's how Nvidia rolls this generation, given there's practically no competition to deal with in terms of overall performance (including RT) and feature sets. And even then, RDNA 3 underperformed and the mid-range came fairly late, due to AMD and Nvidia wanting to clear old stock. I doubt we're going to see a Turing-to-Ampere turnaround, but I think Blackwell should be a big comeback for consumer-grade cards.

Next generation should see massive improvements, given we're getting GDDR7 and, I predict, larger VRAM capacities across the stack, plus a big architectural change which should bring equally large gains in RT. Right now the best-value Nvidia card is probably the 4070 at sub-$500.
 
Still not the value the 3070 was. The 3070 was 67% of the power of a 3090 for a third of the price.

Or the 3080 10GB, which was half the price for 90% of the performance.
Apparently Nvidia listened when they were told it made no sense to buy a 3090 for $1,500 when a 3080 was being sold for $700 (ignoring crypto-era pricing).
Roughly a 10% performance increase for more than double the price. With the 40 series they changed that: there's a much bigger performance gap between the 4080 and the 4090, and they priced the 4080 so high that it incentivizes buying the 4090.

Guess what? For me it worked. I'm a total whore.
 
Apparently Nvidia listened when they were told it made no sense to buy a 3090 for $1,500 when a 3080 was being sold for $700 (ignoring crypto-era pricing).
Roughly a 10% performance increase for more than double the price. With the 40 series they changed that: there's a much bigger performance gap between the 4080 and the 4090, and they priced the 4080 so high that it incentivizes buying the 4090.

Guess what? For me it worked. I'm a total whore.
They half "fixed" that early last year when they launched a 12GB 3080 with no MSRP (but it usually went for 1,200-1,250 from partners).
 
I think FSR 2.2 is good enough for most people for image reconstruction, and FSR 3.0 is actually testing ahead of DLSS frame generation, but I understand your stance. DLSS still offers superior IQ compared to FSR 2.2 and ray reconstruction is going to be a big deal moving forward.
I still maintain that if you showed people a ray-traced image and a rasterized image, very few would be able to pick out which is which unless they knew exactly what to look for.
 
I still maintain that if you showed people a ray-traced image and a rasterized image, very few would be able to pick out which is which unless they knew exactly what to look for.

I agree with you, outside of path tracing in CP2077. That's the first and only implementation of RT that has really wowed me, and it could be a hint of what we'll see in the years to come.
 
I play games on a 4K display, and 1440p DLSS Quality vs 4K DLSS Quality is a massive difference in IQ on a large 65-inch screen.

My next upgrade is going to be a 5080, which had better be 20% faster than the 4090.
These modern games simply refuse to load proper, crisp LODs and models at 1440p, probably to stay within VRAM budgets. 4K-based gaming (upscaled or not!) will be reserved for 16 GB GPUs going forward.

These games simply become something else: crisp and clean in upscaled-4K scenarios, even compared to native 1440p. It seems like devs apply proportional LOD downgrades, and that affects 1440p and 1080p immensely.

The benefit is that 1440p reduces VRAM consumption a lot, which gives some "breathing room" for people like you and me with 8-12 GB GPUs.

You can try native 1440p DLAA/TAA versus 4K DLSS Performance. For example, in The Last of Us Part I, 4K DLSS or even FSR Performance definitely looks better than native 1440p, in my experience. I'd love to hear from actual 4K users too. It just feels to me like the game refuses to load the proper textures and high-quality models the devs intended at native 1440p or 1440p-based upscaling. For me (sadly), a game only looks the way it's intended when the output resolution is 4K; anything else feels like a big compromise to cope with lower-spec memory budgets.

Back then, games wouldn't scale much in VRAM between 1080p/1440p and 4K. Nowadays they do, which makes it perfect for Nvidia to sell 8 GB 1080p cards, 12 GB 1440p cards and 16 GB 4K cards. Back then they wouldn't have been able to get away with it. With how games load garbage, compromised LODs, models and textures at lower resolutions, I feel like this is what they banked on all along.

So the 4070 or 4070 Ti being a "perfect" 1440p card does not interest me. For me, anything to do with 1440p DOESN'T LOOK good enough. If 4K DLSS Performance ends up with better visuals and similar performance to native 1440p, all I see is that 1440p is a convenient excuse for Nvidia to keep producing low-VRAM cards at insane prices (800 bucks for a 12 GB 4070 Ti).

Even worse, these new cards are also crippled in a way that gives them degraded performance at 4K.

I will jump directly to 4K. 1440p doesn't interest me. I don't want to play at a resolution that looks worse than 4K with a 1080p internal shading resolution.

It's clearly not the fault of 1440p itself; it's how devs approach scalability.

If they did 1440p justice, you would see insane VRAM requirements at 1440p too. The fact that we don't is proof that there's a huge compromise going on there.
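A purely hypothetical sketch of what "proportional LOD downgrades" could look like on the engine side. The function names and scaling policy here are invented for illustration and are not taken from any real engine or game.

```python
# Purely illustrative sketch of the "proportional LOD downgrade" idea described
# above -- NOT taken from any real engine. A streamer might scale its texture
# pool and mip bias off the output resolution, which is why 1440p output can
# end up with visibly softer textures than 4K output.
def streaming_budget_mb(output_height: int, budget_at_4k_mb: int = 6000) -> int:
    # Scale the texture pool with pixel count relative to 2160p (hypothetical policy).
    scale = (output_height / 2160) ** 2
    return int(budget_at_4k_mb * scale)

def mip_bias(output_height: int) -> int:
    # Drop one mip level for each halving of output height below 2160p (hypothetical).
    bias = 0
    h = output_height
    while h < 2160:
        bias += 1
        h *= 2
    return bias

for h in (1080, 1440, 2160):
    print(f"{h}p: ~{streaming_budget_mb(h)} MB texture pool, mip bias +{mip_bias(h)}")
# Under a policy like this, 1440p gets roughly 44% of the 4K texture budget,
# which matches the "breathing room" (and the blur) the post is describing.
```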
 
I still maintain that if you showed people a ray-traced image and a rasterized image, very few would be able to pick out which is which unless they knew exactly what to look for.
Depends on the context and what's being displayed. Sometimes the difference is subtle. At other times it's massive.

Can you not tell the difference?

[comparison screenshot: ray traced vs rasterized]
 
I'm really happy with my 7900XT, and with FSR3 finally here, I'm hoping for good things.

Ray Reconstruction does look appealing though, and Nvidia is always a step or two ahead, so I might swing back to Nvidia next round. It'll all depend on the price-to-performance landscape at the time.
 
Sadly, that's how Nvidia rolls this generation, given there's practically no competition to deal with in terms of overall performance (including RT) and feature sets. And even then, RDNA 3 underperformed and the mid-range came fairly late, due to AMD and Nvidia wanting to clear old stock. I doubt we're going to see a Turing-to-Ampere turnaround, but I think Blackwell should be a big comeback for consumer-grade cards.

Next generation should see massive improvements, given we're getting GDDR7 and, I predict, larger VRAM capacities across the stack, plus a big architectural change which should bring equally large gains in RT. Right now the best-value Nvidia card is probably the 4070 at sub-$500.

We are unlikely to see the Ampere to Ada jump, which basically doubled the performance.

But it should still be a sizeable jump; L2 cache increasing again and GDDR7, at least on the 102 chip, should make this thing pretty damn quick.

 
GAF was shitting on the 4070 Ti because they actually believed VRAM would be an issue based on some shitty ports, yet CDPR has proven that even the highest tech on the market doesn't need a lot of VRAM, assuming of course you're playing at 1440p, which is still above the standard: most gamers, going by the Steam survey, are still at 1080p and will most likely stay there for many years.

Yeah, but highlighting the fact that they're shitty ports won't just automatically make the problem go away. There will always be incompetent developers whose spaghetti code we have to brute-force to run well.
 
That's why I stuck with 1440p. Shit starts getting expensive going for 4K.

Next generation (Blackwell) is when people should go all in, going by the leaks.

Looks like the opposite to me.
Blackwell looks like the generation to skip if you have an Ada chip.
Ampere to Ada was a big jump (nearly double the performance), even for people who had high-end Ampere cards.
Blackwell is unlikely to be value for money for most people gen-on-gen.

 
If that's true, you could extrapolate that the 4070 Ti at 4K with DLSS Performance is equal to the 4090 at 4K with DLSS Quality.
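For reference, these are the internal render resolutions that extrapolation leans on, using the standard DLSS preset scale factors (Quality ~0.667x per axis, Balanced ~0.58x, Performance 0.5x). A quick sketch:

```python
# Internal render resolutions behind that extrapolation. DLSS scales each axis:
# Quality ~0.667x, Balanced ~0.58x, Performance 0.5x (standard preset factors).
OUTPUT_4K = (3840, 2160)
presets = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

for name, factor in presets.items():
    w = round(OUTPUT_4K[0] * factor)
    h = round(OUTPUT_4K[1] * factor)
    share = (w * h) / (OUTPUT_4K[0] * OUTPUT_4K[1])
    print(f"4K DLSS {name}: renders ~{w}x{h} ({share:.0%} of native 4K pixels)")
# 4K Performance shades ~1080p worth of pixels and 4K Quality ~1440p worth,
# which is the gap the "4070 Ti Performance vs 4090 Quality" comparison leans on.
```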
 
Looks like the opposite to me.
Blackwell looks like the generation to skip if you have an Ada chip.
Ampere to Ada was a big jump (nearly double the performance), even for people who had high-end Ampere cards.
Blackwell is unlikely to be value for money for most people gen-on-gen.


Depends on how you compare the flagship to mid-range and lower-end cards, and most people aren't in the market for a $1,600 flagship graphics card. Nvidia gimped a lot of GPUs in the stack below the 4090 this generation, and memory bandwidth is going way, way up next generation with the changeover to faster, denser GDDR7 modules, along with wider buses and more cache.

From Tom's Hardware: "Samsung's first 16Gb GDDR7 device features a data transfer rate of 32 GT/s and therefore boasts a bandwidth of 128 GB/s, up significantly from 89.6 GB/s per chip provided by GDDR6X at 22.4 GT/s. To put it into perspective, a 384-bit memory subsystem featuring 32 GT/s GDDR7 chips would provide a whopping 1.536 TB/s of bandwidth, which by far exceeds GeForce RTX 4090's 1.008 TB/s."

If Nvidia bumps the 5070/Ti to 16 GB of GDDR7, that's going to be a massive boost in memory bandwidth over the 4070/Ti. Then there's increased L2 cache and other improvements: next-gen Tensor cores, bumped-up specs and clocks, etc. I can easily see at least another 50% gain over the current gen with Blackwell. The only real issue is going to be price, as 3nm isn't going to come cheap.
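The arithmetic behind those quoted figures is simple: bandwidth (GB/s) = data rate (GT/s) × bus width (bits) / 8. A quick sketch reproducing the numbers (the 21 GT/s GDDR6X line is the 4090's stock memory spec):

```python
# The bandwidth math behind the Tom's Hardware numbers quoted above:
# bandwidth (GB/s) = data rate (GT/s) * bus width (bits) / 8
def bandwidth_gb_s(data_rate_gt_s: float, bus_bits: int) -> float:
    return data_rate_gt_s * bus_bits / 8

print(bandwidth_gb_s(32.0, 32))    # one GDDR7 chip, 32-bit:        128.0 GB/s
print(bandwidth_gb_s(22.4, 32))    # one GDDR6X chip, 32-bit:        89.6 GB/s
print(bandwidth_gb_s(32.0, 384))   # 384-bit GDDR7 card:           1536.0 GB/s
print(bandwidth_gb_s(21.0, 384))   # RTX 4090 (21 GT/s, 384-bit):  1008.0 GB/s
```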
 
Neither is particularly interesting to PC casuals like me. Prices on recent GPUs are insane.
I thought they'd go down after the mining and COVID bullshit, but a 1440p card for 800 dollars? Count me out.
 
Nah, you can get a 4080 on a deal for like $1,050. A 4090 isn't happening below $1,600; that's 50% more.
Where are you finding RTX 4080s for $1,050?
Depends on how you compare the flagship to mid-range and lower-end cards. Nvidia gimped a lot of GPUs in the stack below the 4090 this generation, and memory bandwidth is going way, way up next generation with the changeover to faster, denser GDDR7 modules, along with wider buses and more cache.

From Tom's Hardware: "Samsung's first 16Gb GDDR7 device features a data transfer rate of 32 GT/s and therefore boasts a bandwidth of 128 GB/s, up significantly from 89.6 GB/s per chip provided by GDDR6X at 22.4 GT/s. To put it into perspective, a 384-bit memory subsystem featuring 32 GT/s GDDR7 chips would provide a whopping 1.536 TB/s of bandwidth, which by far exceeds GeForce RTX 4090's 1.008 TB/s."

If Nvidia bumps the 5070/Ti to 16 GB of GDDR7, that's going to be a massive boost in memory bandwidth over the 4070/Ti. Then there's increased L2 cache and other improvements: next-gen Tensor cores, bumped-up specs and clocks, etc. I can easily see at least another 50% gain over the current gen with Blackwell. The only real issue is going to be price, as 3nm isn't going to come cheap.
The mid-range is not going to be GDDR7; you can forget about that outright.
Maybe the 70 Ti if people are lucky, but otherwise Nvidia has no reason to spend on the mid-range when they'll be kicking AMD's ass with ease, and at the high end, just to show off, they might as well have a halo product that's leagues and bounds ahead.
Methinks the next-gen xx70s and xx80s are going to be marginal improvements, unless they cut the xx80 out of GB102.
 
Where are you finding RTX 4080s for $1,050?
Amazon or Best Buy. Amazon usually has some cards for $1,099, but sometimes you can get another $30-50 off. On both Amazon and Best Buy you can get 4080 returns, if you catch them at the right time, for even under $1K with a full warranty, and you can always return them if they're not in good shape.
 
Amazon or Best Buy. Amazon usually has some cards for $1,099, but sometimes you can get another $30-50 off. On both Amazon and Best Buy you can get 4080 returns, if you catch them at the right time, for even under $1K with a full warranty, and you can always return them if they're not in good shape.
The cheapest I've found is $1,100, and those were Zotac and Gigabyte, so I'd rather just flush that money down the toilet.
Zotac is Zotac, and Gigabyte PCBs are frikken cracking and they won't honor the warranty.
I might have to just hunt down a 4070 Ti from Palit and float the generation till after Blackwell.
 
The cheapest I've found is $1,100, and those were Zotac and Gigabyte, so I'd rather just flush that money down the toilet.
Zotac is Zotac, and Gigabyte PCBs are frikken cracking and they won't honor the warranty.
I might have to just hunt down a 4070 Ti from Palit and float the generation till after Blackwell.
Nah, they regularly have PNY or MSI for that; you just have to watch for a sale. They also sometimes have Amazon Warehouse "Like New" or "Very Good" cards from Asus for under $1K or just above.

Best Buy can also be decent for their returned stock.
 
I got my 4070 Ti back in April (upgrading from a 1080 Ti). The thing cost me €1,000, which is insane, and the 12 GB of VRAM sux.

However, gaming at 3440x1440 with G-Sync, it's a great card, mostly running at 90+ FPS or so (my sweet spot with G-Sync). I have zero interest in Cyberpunk Overdrive or whatever it's called; I'll try it for a laugh, but even ray tracing I'm not too interested in beyond trying it out, and I don't care if I have to turn it off for performance.

In hindsight, with the pricing changes, I would have got a 7900 XTX or 7900 XT, but my 21:9 35" monitor is G-Sync only, so I'm stuck between a rock and a hard place (vendor lock-in sux).

The card is great but overpriced and severely lacking in VRAM.

I considered a 4090, but I don't need that performance, and a 4080 for me is a waste, as I may as well get a 4090 (no offence to 4080 owners).
 
Serious question: what resolution should I be using? I currently have my 4070 Ti and dual Asus ProArt 279CVs set to 4K, 10-bit, 60 Hz for video/3D animation work and single-player gaming.

Is the only performance increase FPS for gaming? I'm pretty content with 30-60 FPS generally speaking. But if there are going to be performance improvements for my video and 3D work, then I may have to drop my res.
 
Serious question: what resolution should I be using? I currently have my 4070 Ti and dual Asus ProArt 279CVs set to 4K, 10-bit, 60 Hz for video/3D animation work and single-player gaming.

Is the only performance increase FPS for gaming? I'm pretty content with 30-60 FPS generally speaking. But if there are going to be performance improvements for my video and 3D work, then I may have to drop my res.
Whatever you feel like. If you're content with 30-60 FPS at 4K, who's going to tell you it's not good enough? I prioritize performance above all else, so I'd probably be gaming mostly at 1440p on a 4070 Ti, but if you prefer higher resolutions, there's nothing wrong with that.
 
Serious question: what resolution should I be using? I currently have my 4070 Ti and dual Asus ProArt 279CVs set to 4K, 10-bit, 60 Hz for video/3D animation work and single-player gaming.

Is the only performance increase FPS for gaming? I'm pretty content with 30-60 FPS generally speaking. But if there are going to be performance improvements for my video and 3D work, then I may have to drop my res.

Whatever looks good to your eye.
We aren't you, so we can't tell you what resolution or FPS you should be using.
 
The mid-range is not going to be GDDR7; you can forget about that outright.
Maybe the 70 Ti if people are lucky, but otherwise Nvidia has no reason to spend on the mid-range when they'll be kicking AMD's ass with ease, and at the high end, just to show off, they might as well have a halo product that's leagues and bounds ahead.
Methinks the next-gen xx70s and xx80s are going to be marginal improvements, unless they cut the xx80 out of GB102.
I don't see why not. The cost of GDDR7 modules shouldn't be that far off GDDR6X, and they also need to offer mid-range products that actually make sense in 2025/2026 for consumers to consider upgrading to. They can't do another round of moderate-to-marginal upgrades akin to the 4060/Ti/4070, because they have a hard time selling those as it is.

Also, these cards are still well over a year away, and even further out for the lower-tier stuff.
 
I don't see why not. The cost of GDDR7 modules shouldn't be that far off GDDR6X, and they also need to offer mid-range products that actually make sense in 2025/2026 for consumers to consider upgrading to. They can't do another round of moderate-to-marginal upgrades akin to the 4060/Ti/4070, because they have a hard time selling those as it is.

Also, these cards are still well over a year away, and even further out for the lower-tier stuff.
GDDR6 prices plummeted due to high supply after the pandemic; GDDR7 prices will be much higher than GDDR6.

Micron doesn't even have its GDDR7 parts ready. If Nvidia is already in development, which we believe they are, they would most likely use GDDR6+ on their mid-range cards and Samsung GDDR7 on the high end. Maybe the 5080 and 5090 get GDDR7, but everything below will be GDDR6+ or GDDR6X. Believing Nvidia would goodwill us with GDDR7 across the board, when they'll have little to no competition next generation, is wishful thinking at best.

I'm going to guess they'll up the L2 cache and go back to the more usual memory bus widths with GDDR6X and GDDR6+, which alone would bring a marked improvement to the mid-range. The gen-on-gen increase on the 4070-and-under parts should be pretty big, but don't expect the 5070 to catch the 4090.
 
If it had more than 12 GB of VRAM it would probably be my new card, but I'm going to hold out till next gen unless I see a 7800 or 7900 for the right price.
 
Amazon or Best Buy. Amazon usually has some cards for $1,099, but sometimes you can get another $30-50 off. On both Amazon and Best Buy you can get 4080 returns, if you catch them at the right time, for even under $1K with a full warranty, and you can always return them if they're not in good shape.
Whoah. Meanwhile in Sweden: the cheapest RTX 4080 costs the equivalent of $1500.
 