Kopite7kimi: RTX 5070/5070 Ti specs leak

12GB should have been for the RTX 5060 and 16GB for the 5060 Ti, at $300 and $400. There's no reason for an 8GB card in 2024, even for the lowest tier aimed at 1080p, because even that resolution needs more than 8GB now.

I hope the 5070 launches at a fair price of $450 and the 5070 Ti at $550 at most, but I doubt it.

This might be the year I get a completely new PC if the price is right; I've been using my GTX 1070 for 8 years now. I haven't upgraded because, one, the games I wanted to play ran well on my 1070, and two, the prices were rip-offs due to scalpers and stayed that way even after. But more and more games are out now, or on the way, that my 8GB 1070 can't handle in either power or VRAM, even at 1080p.

I'm in the same boat with my 1070, which has been the value king for nearly a decade, but I finally grabbed a $430 7800 XT and am sitting here debating whether to complete a new build or use a holiday return policy. At this point I'd even consider a discounted PS5 Pro a better value than PC gaming.
 
The PC market is huge, but the high-end PC market is small. If it takes a 4090 to run a game maxed out at native 4K60, then the game engine is too demanding and should be scaled back.
And somehow the cheapest 4090 is way above MSRP at the moment, and the same goes for the best-in-slot CPU, the 9800X3D. That's a gaming CPU, and people buy so many of them that its launch price (which was by no means low) jumped up significantly.
There is a market for high-end PC parts, and it's not small at all; otherwise we would have both the best-in-slot CPU and GPU on shelves at or even below launch MSRP by now, which of course isn't the case.
 
I'm in the same boat with my 1070, which has been the value king for nearly a decade, but I finally grabbed a $430 7800 XT and am sitting here debating whether to complete a new build or use a holiday return policy. At this point I'd even consider a discounted PS5 Pro a better value than PC gaming.
Yeah, the 10 series is the best series ever in price, longevity, and how well it performs in games. It's unfortunate that we won't ever see something as good as that again. These days, anything considered decent, not even high end, still costs an insane amount of money for what it offers. I did think about the PS5 Pro for when it gets discounted, since I could also use it as a UHD Blu-ray player, although I've heard the fat OG PS5 with the built-in drive is better for that. The thing that makes me not want to focus on the Pro is that all the exclusives I want from it, like Rebirth and Stellar Blade, are coming to PC, and by the time the PS5 Pro gets a discount, the PS6 will be right around the corner. Plus, one of my main things on PC is modding and getting mods for games like The Elder Scrolls VI and Fallout 5, among others. If the PS6 has no disc drive version, then I'll still consider getting the PS5 Pro.
 
And somehow the cheapest 4090 is way above MSRP at the moment, and the same goes for the best-in-slot CPU, the 9800X3D. That's a gaming CPU, and people buy so many of them that its launch price (which was by no means low) jumped up significantly.
There is a market for high-end PC parts, and it's not small at all; otherwise we would have both the best-in-slot CPU and GPU on shelves at or even below launch MSRP by now, which of course isn't the case.
They sell out due to the limited number that gets manufactured, and 4090s aren't all sold for gaming purposes.
 
There was a massive clock increase of +1GHz.

I suspect a clock increase will happen (based on power draw), but nothing on this scale.
The 4070 Ti Super is 20% slower than the 4080 in games. It doesn't make sense for the 5070 Ti to be slower than the 4080.

Imo

5070 Ti: ~10% faster than the 4080
5080: ~10% faster than the 4090
5090: ~40% faster than the 4090
 
The 4070 Ti Super is 20% slower than the 4080 in games. It doesn't make sense for the 5070 Ti to be slower than the 4080.

Imo

5070 Ti: ~10% faster than the 4080
5080: ~10% faster than the 4090
5090: ~40% faster than the 4090

There was already an Nvidia generation where the xx80 part was slower than the previous gen's top dog (2080 vs 1080 Ti).

Based on specs, the 4090 should be faster than the 5080.

IMO:

4070S -> 5070 ->/= 4070ti -> 4070tiS -> 5070ti ->/= 4080S -> 5080 -> 4090 -> 5090.
 
Look how hard they're trying to justify it above 🤣

But but but the PS5 has 10GB of RAM for games!! It's ok, I can spend $800 for this gimped Nvidia GPU! Lmao
Yeah, prices are getting insane. I get that prices slowly increase over a long period of time, but this is way beyond that. I don't expect the RTX 5070 to be the exact same price I paid for the GTX 1070 long ago, $300-something, but it should have been $400-something, or $500 at most, while being several times better. The PS5 Pro is a complete rip-off at $700, even if it came with the drive. If the PS5 Pro news had never happened and people asked me what I thought the PS6 would cost, I would have guessed $600 to $650 for the drive version, and that's for the PS6. I would never have expected $700 for a mid-gen upgrade that doesn't even include a drive.

AAA console games upping the price from $60 to $70 would be an example of a reasonable and fair price increase.
 
Look how hard they're trying to justify it above 🤣

But but but the PS5 has 10GB of RAM for games!! It's ok, I can spend $800 for this gimped Nvidia GPU! Lmao
Who here is trying to justify this? I know you have a clown in your avatar, but you don't have to act like one.

Why are you in a PC GPU thread with your low-effort trolling again?
 
I knew it would be a fucking 12GB card.

I already told myself that I wasn't upgrading from my 4070 Super, so ultimately it doesn't matter to me, but I just don't see how you release a $600-minimum GPU with 12GB in 2025.
 
I'm in the same boat with my 1070, which has been the value king for nearly a decade, but I finally grabbed a $430 7800 XT and am sitting here debating whether to complete a new build or use a holiday return policy. At this point I'd even consider a discounted PS5 Pro a better value than PC gaming.
The Pro is not great value, at least relative to the PS5. It's like going from a 3060 12GB to a 4070 and only getting a 30-40% uplift in performance: a 75% increase in price over the OG Digital for a 35-40% increase in performance.
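Spelled out, those figures make the Pro a worse perf-per-dollar buy than the base console. A quick sketch using only the numbers from this post (the $400 Digital Edition launch price vs. $700 for the Pro, and the midpoint of the quoted 35-40% uplift):

```python
# PS5 Digital vs. PS5 Pro value comparison, using the numbers above.
base_price, pro_price = 400, 700   # USD: Digital Edition launch vs. Pro
uplift = 1.375                     # midpoint of the quoted 35-40% perf gain

price_ratio = pro_price / base_price   # 1.75 -> the "75% increase in price"
value_ratio = uplift / price_ratio     # relative performance per dollar

print(f"{price_ratio - 1:.0%} more money for {uplift - 1:.1%} more performance")
print(f"perf-per-dollar vs. the base Digital: {value_ratio:.2f}x")
# -> 75% more money for 37.5% more performance
# -> perf-per-dollar vs. the base Digital: 0.79x
```

So on these assumptions the Pro delivers roughly four fifths of the base console's performance per dollar.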

What CPU are you on?
 
Who here is trying to justify this? I know you have a clown in your avatar, but you don't have to act like one.

Why are you in a PC GPU thread with your low-effort trolling again?

I can be wherever I want; next time you create a thread, tag it "safe space" so nobody hurts your feelings.

Also, don't act all angelic all of a sudden when you've trolled every Pro thread since forever.
 
I can be wherever I want; next time you create a thread, tag it "safe space" so nobody hurts your feelings.
And we can tell you to piss off with your useless trolling.
Also, don't act all angelic all of a sudden when you've trolled every Pro thread since forever.
GTFO with this bullshit. I bought a Pro day 1, whereas all you do is weak trolling. Quote me "trolling" those Pro threads. You won't find those posts because you made that shit up to use as a "no u" defense for being called out.
 
Remember, a couple of gens back, when we realistically expected the "5060" to have the performance of a "4080"?

I do.

Let's hope at least that AMD and Intel spice things up a bit in the mainstream segment.
 
Remember, a couple of gens back, when we realistically expected the "5060" to have the performance of a "4080"?

I do.

Let's hope at least that AMD and Intel spice things up a bit in the mainstream segment.

They can't do performance jumps like that anymore; tech progress is much slower.

The 3xxx series was stuck on a shit Samsung node; that's the main reason the power/efficiency jump was so impressive for 4xxx.
 
https://store.steampowered.com/hwsurvey/videocard/ The 4090 is sitting at 1.03% of users, which isn't a big percentage, but considering Steam had 132M users back in 2021 and grew substantially from there... that's probably around 1.5 to 2 million 4090s on Steam xD
The Steam hardware survey only covers people who have volunteered to participate when picked at random. I assume even those numbers are skewed toward the high end, since people with high-end PCs love to tell everyone about their high-end PC.
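For what it's worth, the arithmetic in the quoted estimate is easy to check. A quick sketch (the 1.03% share and the 132M 2021 baseline are from the quoted post; the larger user count is a guess purely to bracket the upper end):

```python
# Estimate the number of 4090s among Steam users from the survey share.
# 1.03% and the 132M 2021 baseline come from the quoted post; 180M is a
# guessed "grown" user count, only there to illustrate the upper bound.
survey_share = 0.0103

for users in (132_000_000, 180_000_000):
    estimate = users * survey_share
    print(f"{users:,} users -> ~{estimate / 1e6:.2f}M 4090s")
# The 2021 baseline alone already gives ~1.36M cards, so the quoted
# "1.5 to 2M" range implicitly assumes the user base has grown since.
```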
 
A $600 card should have at least 16GB and be stronger than the 4070 Super by a decent amount.
Agreed, but this is modern Nvidia.

I got my 4070S through a mixture of an open-box deal, a Micro Center GPU trade-in, and as a roundabout way to upgrade my girlfriend's dying GPU; otherwise I wouldn't have upgraded at all from my 3060 Ti.

PC component prices are pricing me out of upgrading anything, let alone my GPU.
 
If Indiana Jones is indicative of what's to come, this is not enough VRAM.

Unacceptable in 2025.

I'll buy AMD or Intel. Ngreedia can shove it.
 
I base my GPU purchases on gaming benchmarks rather than specs (which, tbh, I don't really understand).

I was planning on getting a 7900 XT for 1440p gaming, with a price tag of £600. Do these 5070 cards look significantly better?
 
There was already an Nvidia generation where the xx80 part was slower than the previous gen's top dog (2080 vs 1080 Ti).

Based on specs, the 4090 should be faster than the 5080.

IMO:

4070S -> 5070 ->/= 4070ti -> 4070tiS -> 5070ti ->/= 4080S -> 5080 -> 4090 -> 5090.
If that's the case, these new GPUs are DOA.
 
What's so funny about that?
We are barely pushing the envelope at 8GB now. Games are poorly optimized and use tricks to hit visual peaks, but often look like shit in motion. Secondly, not everyone games at 4K, nor do we need to. Lastly, the 5070 is more mid-high tier, whereas the 5080 and 5090 are the high-to-ultra tier where people should be at 4K.
 
Remember, a couple of gens back, when we realistically expected the "5060" to have the performance of a "4080"?

I do.

Let's hope at least that AMD and Intel spice things up a bit in the mainstream segment.
RDNA 5 is the real deal for AMD. It's what the next-gen consoles will use, at least the next Xbox, so all the focus will be there, and it's our last chance to get something good.

Imo the 7900 XTX is a really good GPU; the problem is the price. At $899 it would have had better sales.
 
It's insane how both the Xbox Series X and PS5 from 2020 have 16GB of memory while Nvidia is still pulling this shit in 2025. The base model 5070 should absolutely have had 16GB.
 
RDNA 5 is the real deal for AMD. It's what the next-gen consoles will use, at least the next Xbox, so all the focus will be there, and it's our last chance to get something good.

Imo the 7900 XTX is a really good GPU; the problem is the price. At $899 it would have had better sales.
RDNA 5 doesn't exist anymore; the next jump in architecture is UDNA. But let's not get too ahead of ourselves: every time the hype train comes, they release their GPUs and it's a disappointment.

It should be good, though. The leaked benchmarks from RDNA 4 and the 9070 XT, with its relatively small die, are promising on the RT side.
 
For the complainers: STOP buying Nvidia and buy Intel or AMD. Outside of the high-end 90/80 cards, there just isn't any compelling reason to spend more money for less raster and less longevity due to the lack of RAM.
 
For the complainers: STOP buying Nvidia and buy Intel or AMD. Outside of the high-end 90/80 cards, there just isn't any compelling reason to spend more money for less raster and less longevity due to the lack of RAM.

They want DLSS, Reflex, and Frame Gen.
 
They want DLSS, Reflex, and Frame Gen.

Frame gen is garbage because it adds latency. If you care about a quality gaming experience, input lag is important.

Reflex is nice, but AMD and Intel have equivalents.

DLSS is overrated. Sure, it increases frame rate but it's not free. Most of the time there is artifacting. Nothing beats a native image.
 
12-16GB? Riding the 3090 to infinity and beyond.

(who am I kidding: I'll be on the fence for the 5090, it's been a while since I got myself a nice gift...).
 
Frame gen is garbage because it adds latency. If you care about a quality gaming experience, input lag is important.

Reflex is nice, but AMD and Intel have equivalents.

DLSS is overrated. Sure, it increases frame rate but it's not free. Most of the time there is artifacting. Nothing beats a native image.

You like a native image, yet you want the lowest input lag (and that means the highest frame rate)?

Intel's and AMD's Reflex equivalents are used by almost no games at this point.

DLSS works very well in most games; sometimes DLSS Quality will give you better image quality than native TAA (which is not always good). Frame gen is ok in some games and bad in others...
 
We are barely pushing the envelope at 8GB now. Games are poorly optimized and use tricks to hit visual peaks, but often look like shit in motion. Secondly, not everyone games at 4K, nor do we need to. Lastly, the 5070 is more mid-high tier, whereas the 5080 and 5090 are the high-to-ultra tier where people should be at 4K.

Wrong. 8GB is barely adequate for new games. A 3080 chugs in Indiana Jones because of its limited VRAM, to the point where an A770 beats it. Embarrassing. All the people who called out Nvidia for cheaping out on VRAM were right.

Star Wars, God of War, Doom, Dead Space, Resident Evil, Cyberpunk… and it goes on and on, all examples where you need more than 8GB if you're running more than low-medium textures at 1080p.

Buying a new GPU today with 8-12GB of VRAM is a big mistake. A 3070 I still have struggles heavily with Indiana Jones… I'm forced to run it at 1080p with DLSS Performance to avoid VRAM issues, so I'm basically running the game at 876p…
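For context, DLSS renders internally below the output resolution and upscales. A quick sketch using the commonly cited default per-axis scale factors (rule-of-thumb values, not an official API; games can use custom ratios, so figures like the 876p above won't always match these defaults):

```python
# Commonly cited DLSS per-axis scale factors (rule-of-thumb defaults;
# games can override them, so treat the results as approximate).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width, height, mode):
    """Approximate internal render resolution for a DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

print(internal_resolution(1920, 1080, "Performance"))  # -> (960, 540)
print(internal_resolution(1920, 1080, "Quality"))      # -> (1280, 720)
```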
 
You like a native image, yet you want the lowest input lag (and that means the highest frame rate)?

Intel's and AMD's Reflex equivalents are used by almost no games at this point.

DLSS works very well in most games; sometimes DLSS Quality will give you better image quality than native TAA (which is not always good). Frame gen is ok in some games and bad in others...

DLSS Quality gives the illusion of a better image, if you don't know what you're looking for, because it replaces the usual TAA, which tends to be very blurry, so you get a sharper image and think it looks better.

If you have a high refresh rate display with a fast pixel response time (like a good OLED display), you can immediately pick up motion artifacts from DLSS. There is no such thing as a free lunch.

It's also why most people who play games competitively do not use it. If image clarity is your number one concern, DLSS goes against it.

Nothing beats native.
 
You like a native image, yet you want the lowest input lag (and that means the highest frame rate)?

Intel's and AMD's Reflex equivalents are used by almost no games at this point.

DLSS works very well in most games; sometimes DLSS Quality will give you better image quality than native TAA (which is not always good). Frame gen is ok in some games and bad in others...

And yes, the best gaming experience will always be running a native image at the highest frame rate possible.
 
Wrong. 8GB is barely adequate for new games. A 3080 chugs in Indiana Jones because of its limited VRAM, to the point where an A770 beats it. Embarrassing. All the people who called out Nvidia for cheaping out on VRAM were right.

Star Wars, God of War, Doom, Dead Space, Resident Evil, Cyberpunk… and it goes on and on, all examples where you need more than 8GB if you're running more than low-medium textures at 1080p.

Buying a new GPU today with 8-12GB of VRAM is a big mistake. A 3070 I still have struggles heavily with Indiana Jones… I'm forced to run it at 1080p with DLSS Performance to avoid VRAM issues, so I'm basically running the game at 876p…
Eh, Doom ran just fine on my 1080 without any special features at native 1440p. Also, the 3080 came out quite a while ago; it's no wonder today's games may push it to its limits. Although God of War, Dead Space, and Resident Evil should not have pushed your PC to its limits.
 
DLSS Quality gives the illusion of a better image, if you don't know what you're looking for, because it replaces the usual TAA, which tends to be very blurry, so you get a sharper image and think it looks better.

If you have a high refresh rate display with a fast pixel response time (like a good OLED display), you can immediately pick up motion artifacts from DLSS. There is no such thing as a free lunch.

It's also why most people who play games competitively do not use it. If image clarity is your number one concern, DLSS goes against it.

Nothing beats native.

And yes, the best gaming experience will always be running a native image at the highest frame rate possible.

DLSS is a form of TAA in some sense, but it's superior thanks to the ML element.

To get the best possible frame rate at native resolution you need the most powerful GPU, and right now Nvidia will hold roughly the top three spots (5090, 4090, 5080) when the 5xxx series launches. AMD and Intel have NOTHING to combat that. So whether you like Nvidia's features or not, you pretty much need an Nvidia GPU if you want to play like that.
 
Eh, Doom ran just fine on my 1080 without any special features at native 1440p. Also, the 3080 came out quite a while ago; it's no wonder today's games may push it to its limits. Although God of War, Dead Space, and Resident Evil should not have pushed your PC to its limits.

Yes, but all things being equal (VRAM), a 3080 stomps an A770 in Indiana Jones. The only problem it has is memory limitations, which turn it into a slideshow.


This is abysmal. The game is unplayable at 1080p because the GPU doesn't have enough memory, whereas if it had 12GB or more, the performance would be excellent.

Anybody who bought a 30/40-series card with 8GB will be forced to upgrade if they want to play newer games, purely because they're VRAM limited and nothing else.
 
Yes, but all things being equal (VRAM), a 3080 stomps an A770 in Indiana Jones. The only problem it has is memory limitations, which turn it into a slideshow.


This is abysmal. The game is unplayable at 1080p because the GPU doesn't have enough memory, whereas if it had 12GB or more, the performance would be excellent.

Anybody who bought a 30/40-series card with 8GB will be forced to upgrade if they want to play newer games, purely because they're VRAM limited and nothing else.
There's a 3080 12GB, btw.
 
Yes, but all things being equal (VRAM), a 3080 stomps an A770 in Indiana Jones. The only problem it has is memory limitations, which turn it into a slideshow.


This is abysmal. The game is unplayable at 1080p because the GPU doesn't have enough memory, whereas if it had 12GB or more, the performance would be excellent.

Anybody who bought a 30/40-series card with 8GB will be forced to upgrade if they want to play newer games, purely because they're VRAM limited and nothing else.
Maybe, but as NEW games arrive, new specs are required to reach those top-of-the-line visuals, especially as texture resolution and detail rise. But along with more RAM comes faster speeds: filling and emptying VRAM will be a much faster process with GDDR7.
 
Maybe, but as NEW games arrive, new specs are required to reach those top-of-the-line visuals, especially as texture resolution and detail rise. But along with more RAM comes faster speeds: filling and emptying VRAM will be a much faster process with GDDR7.

This VRAM speed argument, which was also used for the 3080 (with the new and shiny GDDR6X), is only good in theory. In reality, more = better; when a game runs out of VRAM it's dogshit no matter the speed.
 
This VRAM speed argument, which was also used for the 3080 (with the new and shiny GDDR6X), is only good in theory. In reality, more = better; when a game runs out of VRAM it's dogshit no matter the speed.
I agree, but your 3080 is almost two gens behind. Either use the right settings or suffer the consequences of filling your VRAM. My 1080 still handles medium settings well coming into 2025, at 40-144 fps depending on the game.
 
Frame gen is garbage because it adds latency. If you care about a quality gaming experience, input lag is important.

Reflex is nice, but AMD and Intel have equivalents.

DLSS is overrated. Sure, it increases frame rate but it's not free. Most of the time there is artifacting. Nothing beats a native image.

 