
RTX 5070 | Review Thread

dgrdsv

Member
Nope. I'm okay reducing texture quality on my now-ancient 3070.
But on a brand-new 5070? Nah. It should have 16GB of VRAM.
Even the 3070 could play with maximum texture quality up until 2023, so this is even worse than the 3070 at launch.
You don't need to reduce anything on a 12GB card unless you're aiming at 4K+PT+FG, which is likely too heavy for a 5070's performance anyway.
"Should have" is nice. Are you ready to pay +$100 for something which will help you in half a dozen games over the card's lifespan?

The fact that it's actually slower than a 4070 Ti is pretty crazy.
31 Tflops vs 40 Tflops
$550 vs $800 launch price
Why is this "crazy"?
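To put rough numbers on it (a back-of-the-envelope sketch using only the figures quoted above; peak Tflops ignore architectural differences between generations):

```python
# Rough compute-per-dollar from the quoted launch specs and prices.
# Peak theoretical Tflops only; ignores architecture, RT, VRAM, etc.
tflops_5070, price_5070 = 31, 550
tflops_4070ti, price_4070ti = 40, 800

print(f"5070:    {tflops_5070 / price_5070:.4f} Tflops/$")      # ~0.0564
print(f"4070 Ti: {tflops_4070ti / price_4070ti:.4f} Tflops/$")  # ~0.0500
```

At launch prices the 5070 delivers slightly more compute per dollar, which is why "slower than a 4070 Ti" on its own isn't crazy.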
 

yamaci17

Member
You don't need to reduce anything on a 12GB card unless you're aiming at 4K+PT+FG, which is likely too heavy for a 5070's performance anyway.
"Should have" is nice. Are you ready to pay +$100 for something which will help you in half a dozen games over the card's lifespan?
there's literally a comparison above you that shows the 4060 Ti getting 80 FPS at settings that buckle the 5070 down to single digits. Meaning if the 5070 had enough VRAM, it would run the game at the same settings at 140+ FPS.

also no, you still can't max out textures in Indiana Jones without path tracing with 12GB.
 

dgrdsv

Member
there's literally a comparison above you that shows the 4060 Ti getting 80 FPS at settings that buckle the 5070 down to single digits. Meaning if the 5070 had enough VRAM, it would run the game at the same settings at 140+ FPS.
This is a completely flawed comparison because in IJ it is *you* who control how much VRAM the game allocates.
There is no point in using "supreme" level when the quality is the same for everything above "medium" and the only thing which changes is the VRAM allocation.
So basically this is an "issue" which is constructed by hand to show what impact running out of VRAM would produce - but not how you would actually fare on a 12GB card when playing this game in particular.
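To make that concrete, here's a toy sketch of the mechanism (every name and number below is made up for illustration; only the behavior matches what's described above):

```python
# Toy model of a fixed texture-pool setting: the slider controls how much
# VRAM the game allocates for caching textures, not visible quality past
# "medium". Pool sizes and the leftover budget are hypothetical.
POOL_MB = {"medium": 2048, "high": 3072, "ultra": 4096, "supreme": 6144}
VRAM_LEFT_MB = 5000  # hypothetical VRAM left for the pool on a 12GB card

for setting, pool in POOL_MB.items():
    # A bigger pool only caches more textures; the only question is fit.
    verdict = "fits" if pool <= VRAM_LEFT_MB else "overflows -> single-digit fps"
    print(f"{setting:8s} pool={pool}MB  {verdict}")
```

So picking "supreme" on a 12GB card manufactures the overflow; dropping a notch costs nothing visually.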
 

Rudius

Member
What a piece of shit. Same performance as a 4070S for a little discount? Why, Nvidia? Why didn't you just keep producing the 4000 series instead of going forward with this joke of a launch?
A 4070 Super for $399 would be nice, or $449 with GDDR7 memory (don't know if that's possible).
 

Rosoboy19

Member
This is a completely flawed comparison because in IJ it is *you* who control how much VRAM the game allocates.
There is no point in using "supreme" level when the quality is the same for everything above "medium" and the only thing which changes is the VRAM allocation.
So basically this is an "issue" which is constructed by hand to show what impact running out of VRAM would produce - but not how you would actually fare on a 12GB card when playing this game in particular.
This is true when comparing the 5070 to 16GB cards in random YouTube reviews. The bigger issue is that Jensen himself made the direct comparison to the 24GB 4090. This opened the door for everyone to highlight the 5070’s biggest shortcomings in all sorts of unrealistic scenarios because the comparison itself was unrealistic. Foolish.
 

SolidQ

Member
RTX 5050 for $550
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
What's wrong with this one in your view?
The 5070 launches at $550 (market price is TBD of course, but this is true for all GPUs), which puts it at the 4070's price position - or $50 below the 4070 Super's.
In both comparisons it provides a perf/price improvement. Not enough for owners of a 4070 or Super to upgrade - but that was always a rare occurrence, even before we ran out of production process improvements.
For owners of a 3070 or earlier Turing and Pascal GPUs with ~$500 launch prices, it is a solid upgrade option at a similar price.

There are some results where the 5070 barely beats the 4070, which are less than ideal, but this one isn't it.

xx70s should, at best, beat the last-gen range-topper.
At worst they should match the xx80.
Generally, an xx70 should get you performance near the best of the previous generation, give or take.

This card is trading blows with last gen's xx70S.
It doesn't stand a chance against the last-gen xx80.
And it's fighting with a two-generations-old range-topper.


It's utter shite.
And before you come at me saying that xx70s matching range-toppers is rare.

We can go back:

  • 4070S ~ 3080Ti
  • 3070 ~ 2080Ti
  • 2070S ~ 1080Ti
  • 1070 ~ 980Ti
  • 970 ~ 780Ti
  • 770 ~ 680
  • 670 ~ 580
  • 570 ~ 480
  • 470 ~ 280
  • 260 ~ 9800


Yes, I'm cheating by using two Supers in there, but even the base 2070 and 4070 were better than what we are looking at here.
 

poodaddy

Member
Please AMD... please. Just have a successful launch. All you have to do now is not fuck up the quality assurance and inventory, and you've won. Nvidia desperately needs this wake-up call; we need true competition again.
 

dgrdsv

Member
This is true when comparing the 5070 to 16GB cards in random YouTube reviews.
I don't know what that means.

The bigger issue is that Jensen himself made the direct comparison to the 24GB 4090. This opened the door for everyone to highlight the 5070’s biggest shortcomings in all sorts of unrealistic scenarios because the comparison itself was unrealistic. Foolish.
A single comparison when talking about MFG, yes. All other comparisons were in fact with 4070.
I get that YouTubers dramatize everything, because clicks and rage and drama literally mean money for them. But why ordinary people do this I'll never understand.

xx70s should, at best, beat the last-gen range-topper.
At worst they should match the xx80.
Generally, an xx70 should get you performance near the best of the previous generation, give or take.
There is no "xx70" and there is no "should".
Product names are completely random and mean nothing.
The only thing which matters is the performance-per-price comparison.

Yes, I'm cheating by using two Supers in there, but even the base 2070 and 4070 were better than what we are looking at here.
The problem isn't that you're cheating; the problem is that you're giving the card's name a meaning that was never there, and holding expectations based on nothing but meaningless names.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
There is no "xx70" and there is no "should".
Product names are completely random and mean nothing.
The only thing which matters is the performance-per-price comparison.


The problem isn't that you're cheating; the problem is that you're giving the card's name a meaning that was never there, and holding expectations based on nothing but meaningless names.

Even if we use your price/performance metric, it's still a bad card, considering it's nowhere near being a halo product that could justify a bad price/performance ratio.

Note: the chances of finding a 5070 for $550 might as well be unicorn shit.
It's gonna be $700+ on the street, bet.

[Chart: performance per dollar at 2560x1440]




P.S. The names aren't arbitrary; they're used to signify tiers, and they have meaning.
I don't know why people over the last two generations have suddenly started saying Nvidia's card names are arbitrary.
Justifying purchases or what?
Since, what, the 6000 series we've had a sense of order knowing what's what in the stack, so why, since the 4070, have people decided the names don't matter?
 

Rosoboy19

Member
I don't know what that means.


A single comparison when talking about MFG, yes. All other comparisons were in fact with 4070.
I get that YouTubers dramatize everything, because clicks and rage and drama literally mean money for them. But why ordinary people do this I'll never understand.
I was agreeing that, without Jensen's 4090 comment, your “flawed comparison” remark was valid. But he said it. And he got the crowd riled up and made tons of headlines by saying it. Whatever comparisons he made to the 4070 after that didn't matter much. Mission accomplished, marketing achieved.

All that aside, most ordinary people would probably prefer their brand-new $600+ 5070 GPU to have 16GB of VRAM in 2025. Nvidia could certainly afford it.
 
The RTX 5070's performance is lower than expected (even the standard 4070 Ti is faster, let alone the 4070 Ti Super), and with only 12GB of VRAM, I would consider buying the Radeon 9070 even though I prefer Nvidia cards.
 

Xellos

Member
It's hard to shake the feeling that Nvidia could have done better for a similar price, like this is the bare minimum they could get away with. I guess Nvidia wants to steer 4070/3080 owners towards the more expensive cards like the 5070 Ti and 5080.

I'd love for AMD to be a viable alternative, but that's all on FSR4 vs DLSS. AMD will have the performance/dollar advantage at native rendering, but if FSR4 isn't comparable to DLSS4 it's going to be tough to give up DLSS.
 

Buggy Loop

Member
With the pure-RT future games are heading toward, even 16GB is gonna be on the chopping block by 2027.

12GB went from "adequate" to "not enough for RT" in less than 2 years. This is spooky.

Then there are no new cards to buy unless it's a 5090. From either Nvidia or AMD.

I honestly don't think so. Neural texture compression to the rescue. AMD is also in on this and has papers on the subject, but I think this will be a vendor-agnostic DirectX feature soon enough.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Much to my surprise, my local Microcenter (Dallas) will be getting over 70 5070s, with over 25 of them being the Asus TUF, which is selling for $739.


Please don't be a fucking retard and buy this piece of shit GPU when the 9070 XT will likely blow it out of the water.
 

rofif

Can’t Git Gud
You should have posted the Indiana Jones screencap. 13fps due to 12gb VRAM.

[Image: Indiana Jones benchmark screencap showing 13fps]


It's the 8GB 3070 all over again.
We are already at a point where even 12GB is not enough for modern games. AMD should really reconsider cancelling that 32GB 9070 XTX.
5070 12GB vs 4060 Ti 16GB:

[Image: 5070 12GB vs 4060 Ti 16GB benchmark comparison]

another GPU with gimped VRAM

Nonsense. This just exposes those who haven't played with the game settings themselves.
One just needs to play Indiana Jones and toy with the settings for 5 minutes to understand how bugged it is and how bad a job the devs did.
The second the game goes over the VRAM limit, it drops from whatever 100fps you had to 5. I've checked, I've tested.

I can run the game at max settings but with HIGH textures (not the highest). The game overfills my VRAM and drops to 3fps; I just need to go into settings, move the textures or shadows slider one notch down and back up, and it unlocks the fps again and won't drop to a crawl.

Seriously, my VRAM limit is around 9700MB. If I see it hovering around that, the game will drop to 5fps. You just need to touch the settings to unlock it when that happens. And the devs are SO SHITTY that the menu and settings also run at 3fps, so it's actually a pain to get to that setting and change it.

This is a bug and the game's fault. Nothing to do with 12GB. More VRAM just avoids the bug.
I see this repeated everywhere, and it just exposes who has played this game on PC and who hasn't.
People are only repeating BS. More VRAM shouldn't be needed here; it just helps avoid the bug. Their stupid texture cache setting is broken.
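If you want to catch the moment it happens instead of guessing, here's a minimal watcher sketch (assuming the NVIDIA NVML bindings from the `pynvml` package; the 9700MB threshold is just my card's observed limit from above, so set your own):

```python
# Minimal VRAM watcher: polls GPU memory via NVML and warns when usage
# nears the point where the game starts thrashing. A sketch, not a tool;
# the threshold and poll interval are illustrative.
import time
import pynvml

WARN_AT_MB = 9700   # observed limit on my card; adjust for yours
POLL_SECS = 2.0

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_mb = mem.used / 1024**2
        total_mb = mem.total / 1024**2
        if used_mb >= WARN_AT_MB:
            print(f"WARNING: {used_mb:.0f}/{total_mb:.0f} MB used - "
                  f"nudge a texture/shadow slider before the fps tanks")
        time.sleep(POLL_SECS)
finally:
    pynvml.nvmlShutdown()
```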

This actually happened to my 4080 Super as well if I tried to max it in 4K. VRAM went over 16GB and then I got Ocarina of Time framerates.
20+ GB is needed now.
Exactly.
 

64gigabyteram

Reverse groomer.
This is a bug and the game's fault. Nothing to do with 12GB. More VRAM just avoids the bug.
So we need more than 16GB of VRAM for future cards over $300.

You think devs will stop making bugs in video games? We have to account for worse optimization, more complex games, and gaffes like this which fuck up lower-VRAM cards. We need more VRAM.

People said that the AMD 32GB 9070 was unnecessary. They're about to see how that statement stacks up in the coming years.
 

rofif

Can’t Git Gud
So we need more than 16GB of VRAM for future cards over $300.

You think devs will stop making bugs in video games? We have to account for worse optimization, more complex games, and gaffes like this which fuck up lower-VRAM cards. We need more VRAM.

People said that the AMD 32GB 9070 was unnecessary. They're about to see how that statement stacks up in the coming years.
Of course we need more VRAM.
But this benchmark should not be shared, or should at least come with a disclaimer.
It is a confirmed bug, not normal behavior, when you go... a tiny bit over the VRAM limit. The game should manage that itself.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Nonsense. This just exposes those who haven't played with the game settings themselves.
One just needs to play Indiana Jones and toy with the settings for 5 minutes to understand how bugged it is and how bad a job the devs did.
The second the game goes over the VRAM limit, it drops from whatever 100fps you had to 5. I've checked, I've tested.

I can run the game at max settings but with HIGH textures (not the highest). The game overfills my VRAM and drops to 3fps; I just need to go into settings, move the textures or shadows slider one notch down and back up, and it unlocks the fps again and won't drop to a crawl.

Seriously, my VRAM limit is around 9700MB. If I see it hovering around that, the game will drop to 5fps. You just need to touch the settings to unlock it when that happens. And the devs are SO SHITTY that the menu and settings also run at 3fps, so it's actually a pain to get to that setting and change it.

This is a bug and the game's fault. Nothing to do with 12GB. More VRAM just avoids the bug.
I see this repeated everywhere, and it just exposes who has played this game on PC and who hasn't.
People are only repeating BS. More VRAM shouldn't be needed here; it just helps avoid the bug. Their stupid texture cache setting is broken.


Exactly.
Yeah, Indiana Jones is one of the buggiest and most annoying games to play. Constant weird glitches. I was constantly getting save corruptions that were sometimes fixed by a restart, plus weird random framerate drops, and occasionally after a long cutscene the game would randomly get stuck at like 49fps. DLSS and frame gen causing all sorts of issues… just ugh.
 

rofif

Can’t Git Gud
Yeah, Indiana Jones is one of the buggiest and most annoying games to play. Constant weird glitches. I was constantly getting save corruptions that were sometimes fixed by a restart, plus weird random framerate drops, and occasionally after a long cutscene the game would randomly get stuck at like 49fps. DLSS and frame gen causing all sorts of issues… just ugh.
Yep. Aside from the super annoying VRAM -> 5fps thing, I also had more than a few crashes.
Doesn't help that it was one of the few new releases I played on PC... here's to me giving PC another good-faith chance... I ended up toying with the settings about as much as playing the game, but at least I discovered how the VRAM limit works. Stupid seeing YouTubers using the bug in benchmarks.
 

Fess

Member
Of course we need more VRAM.
But this benchmark should not be shared, or should at least come with a disclaimer.
It is a confirmed bug, not normal behavior, when you go... a tiny bit over the VRAM limit. The game should manage that itself.
It's the reality though, and Indy won't be the last one to go OoT when above 16GB. I lowered the settings when playing on my 4080 Super, and I hated doing it because it shouldn't be needed on a $3k PC.
It'll be interesting to see what happens with DOOM; same engine there.
 

mansoor1980

Gold Member
Nonsense. This just exposes those who haven't played with the game settings themselves.
One just needs to play Indiana Jones and toy with the settings for 5 minutes to understand how bugged it is and how bad a job the devs did.
The second the game goes over the VRAM limit, it drops from whatever 100fps you had to 5. I've checked, I've tested.

I can run the game at max settings but with HIGH textures (not the highest). The game overfills my VRAM and drops to 3fps; I just need to go into settings, move the textures or shadows slider one notch down and back up, and it unlocks the fps again and won't drop to a crawl.

Seriously, my VRAM limit is around 9700MB. If I see it hovering around that, the game will drop to 5fps. You just need to touch the settings to unlock it when that happens. And the devs are SO SHITTY that the menu and settings also run at 3fps, so it's actually a pain to get to that setting and change it.

This is a bug and the game's fault. Nothing to do with 12GB. More VRAM just avoids the bug.
I see this repeated everywhere, and it just exposes who has played this game on PC and who hasn't.
People are only repeating BS. More VRAM shouldn't be needed here; it just helps avoid the bug. Their stupid texture cache setting is broken.


Exactly.
Look at you, rofif, defending a gimped PC GPU.

 

JohnnyFootball

GerAlt-Right. Ciriously.
Anyone that buys this over a 9070 XT is a legit fucking retard.

I really hope that, for once, you Nvidia loyalists can tell Jensen that enough is enough.

Seriously, if you're considering a 5070, go get a 9070 XT instead.
 


Blog Reviews

Videos
Charts

At 1440p

  • 5070 is 0.81x the performance of the 5070 Ti
  • 5070 is 0.91x the performance of the 4070 Ti Super
  • 5070 is 0.97x the performance of the 4070 Ti
  • 5070 is 1.05x the performance of the 4070 Super
  • 5070 is 1.22x the performance of the 4070
  • 5070 is 1.55x the performance of the 2080 Ti and 3070
  • 5070 is 1.61x the performance of the 4060 Ti 16GB
  • 5070 is 1.78x the performance of the 3060 Ti
  • 5070 is 2.35x the performance of the 3060
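Folding those 1440p ratios together with launch prices gives a quick perf-per-dollar sanity check (a rough sketch: performance is normalized to the 5070, and the prices are announced US launch MSRPs, not street prices):

```python
# Back-of-the-envelope perf-per-dollar from the 1440p ratios above.
# Performance is relative to the 5070 (= 1.0); prices are announced US
# launch MSRPs (assumptions, not measured street prices).
cards = {
    "5070":          (1.00,     550),
    "5070 Ti":       (1 / 0.81, 749),
    "4070 Ti Super": (1 / 0.91, 799),
    "4070 Super":    (1 / 1.05, 599),
    "4070":          (1 / 1.22, 599),
}

# Rank by relative performance per dollar (scaled per $100 for readability).
for name, (perf, msrp) in sorted(cards.items(),
                                 key=lambda kv: kv[1][0] / kv[1][1],
                                 reverse=True):
    print(f"{name:14s} perf={perf:.2f}x  ${msrp}  perf/$100={100 * perf / msrp:.3f}")
```

At MSRP the 5070 tops this list, which is the perf/price point argued upthread; actual street prices will shuffle it.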

[Chart: RTX 5070 average FPS at 1080p]



[Chart: RTX 5070 average FPS at 1440p]


[Chart: RTX 5070 average FPS at 4K]


Via The Verge + TechPowerUp
 

CrustyBritches

Gold Member
Out of curiosity, I got up early for the 6am PST release time. I had Newegg and Best Buy open and ready to go on 2 monitors and spammed refresh leading up to 6am. Only one card was available, the $650 MSI Trio, and that was for local pickup for about 2 minutes. All other cards were immediately marked ‘Sold Out’.

I believe it was Steve from Gamers Nexus who called this an imaginary card with an imaginary price and performance. The Founders Edition isn't even launching until later this month because, in reality, this card doesn't even exist.
 

YCoCg

Member
Nvidia needs to discontinue this card and push out a 5070 Super FAST, because it's getting stomped on by AMD's 9070. This shit is embarrassing. The 60/70 series is where most gamers buy, and I'm certain most of us don't even want to see what the hell Nvidia will attempt to shit out as the 5060 at this point.
 