
RTX 5080-5090 Reveal Showcase | OT | $1999 of Monopoly | 6:30 PM PT - 2:30 AM GMT

Will you buy the new cards?

  • Yes!

    Votes: 91 27.7%
  • No.

    Votes: 153 46.5%
  • Yes, but I will wait for a discount!

    Votes: 39 11.9%
  • Console is better

    Votes: 46 14.0%

  • Total voters
    329
  • Poll closed.

CuNi

Member
Ngl, this is terrible. It's looking like 30% real gains across the board. This frame-gen tomfoolery is there to obfuscate the numbers. Considering that the notes suggest the games are rendered at 4K max settings, it's pretty bad. Performance gains look to be worse than Ada, maybe barring the low end. Wild.

This may be hyperbolic on my part, but anytime a company's stock appreciates this much and employees can cash out their RSUs, the company starts a slow decline in innovation. I suspected it might happen with Nvidia, and this lacklustre release only confirms it for me. These guys are about to enter a coasting phase.

I feel like Raster gains are going to decline gen over gen but Ray and path-tracing, followed by AI features are where the future of uplifts is.
 

Gaiff

SBI’s Resident Gaslighter
Ngl, this is terrible. It's looking like 30% real gains across the board. This frame-gen tomfoolery is there to obfuscate the numbers. Considering that the notes suggest the games are rendered at 4K max settings, it's pretty bad. Performance gains look to be worse than Ada, maybe barring the low end. Wild.

This may be hyperbolic on my part, but anytime a company's stock appreciates this much and employees can cash out their RSUs, the company starts a slow decline in innovation. I suspected it might happen with Nvidia, and this lacklustre release only confirms it for me. These guys are about to enter a coasting phase.
It’s in the 35-40% range gen-on-gen. It’s standard performance uplift, not terrible, not amazing.
 

lachesis

Member
Well, I was super hyped by DLSS4... but decided to stick with the 4080 Super for a while longer. I'll probably upgrade at the 60 series, and just upgrade my CPU to a 9950X3D for my workstation.
I do need a new budget build for my living room, which will inherit my current 7800X3D. I think I'll just settle for a 5070 if I need to (my TV only does 4K/60Hz anyway).
Heck, if AMD prices the RX 9070 XT way below the RTX 5070 - say, $399 - I may even consider going that route, but of course only with real-life benchmarks showing it's competitive.
 

Radical_3d

Member
I agree but I’d upgrade from an ancient 1060, and I don’t do 4K gaming. Just 1440p. So if the 5070 is 550€ I’m fine with 12 GB. Big if, tho.
Well, fuck this!
 
I feel like Raster gains are going to decline gen over gen but Ray and path-tracing, followed by AI features are where the future of uplifts is.
True, raster gains will decline, but unfortunately RT gains this gen are poor based on Nvidia's chart. It looks awful.

It’s in the 35-40% range gen-on-gen. It’s standard performance uplift, not terrible, not amazing.
I'm not sure it's that at all. Their charts suggest something like 1.3x. It's pretty poor, especially for the 5090, despite jacking up the power budget and the price. Then again, it's on the same node, so it's not surprising, but the gains are weak compared to Ada and Ampere. Let's wait for independent benchmarks; still, this gen is looking like an easy skip.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
True, raster gains will decline, but unfortunately RT gains this gen are poor based on Nvidia's chart. It looks awful.


I'm not sure it's that at all. Their charts suggest something like 1.3x. It's pretty poor, especially for the 5090, despite jacking up the power budget and the price. Then again, it's on the same node, so it's not surprising, but the gains are weak compared to Ada and Ampere.
I pixel peeped. A Plague Tale is 40% faster on the 5090 vs the 4090, 36% on the 5080 vs the 4080, and 41% on the 5070 vs the 4070. For Far Cry 6, it's 34%, 30%, and 35%.
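That kind of pixel peeping is just taking the ratio of two bar lengths off the chart. A minimal sketch of the arithmetic (the pixel values below are hypothetical placeholders, not measurements from Nvidia's actual slides):

```python
# Estimate gen-on-gen uplift from the relative lengths of two chart bars.
# NOTE: the pixel values below are hypothetical placeholders, not real
# measurements from Nvidia's slides.

def uplift_pct(old_px: float, new_px: float) -> float:
    """Percent performance gain implied by two bar lengths."""
    return (new_px / old_px - 1.0) * 100.0

bars = {
    "5090 vs 4090 (A Plague Tale)": (250, 350),  # hypothetical pixels
    "5080 vs 4080 (A Plague Tale)": (250, 340),
}

for label, (old, new) in bars.items():
    print(f"{label}: +{uplift_pct(old, new):.0f}%")
```

Measurement error of a few pixels per bar easily moves the result a couple of points either way, which is why the per-game numbers only roughly agree.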
 
I pixel peeped. A Plague Tale is 40% faster on the 5090 vs the 4090, 36% on the 5080 vs the 4080, and 41% on the 5070 vs the 4070. For Far Cry 6, it's 34%, 30%, and 35%.
Can't wait for benchmarks.... It's going to be interesting for sure and once we shut off DLSS, the truth will come out.
 

Buggy Loop

Member
AMD hasn't had its RDNA4 conference yet, so when it happens, then we can say meh/not meh

They fumbled epically.

Presenting the RDNA 4 slides to the media beforehand, only to go on stage and say nothing. Letting AIBs awkwardly show their cards with static demos at a show with no presentation of the tech.

This says everything I need to know. They still haven't gotten rid of the marketing clowns, clearly.
 
Last edited:

Buggy Loop

Member
Because they're waiting on prices, that's the reason. Let's just wait for the RDNA4 event and see what they show us

I think FSR 4, much like FSR 3, is a concept of a plan, and it's really not ready.

But ok

We’ll inevitably see sometime… in Q1

Doesn't inspire confidence, don't you think? Waiting on price? When AIBs are out on the CES floor with cards? I've never seen anything like this in 30 years of enthusiastically following GPU tech.
 

CuNi

Member
Because they're waiting on prices, that's the reason. Let's just wait for the RDNA4 event and see what they show us

The issue is, if they show the cards after the 5080 and 5090 release, it'll be horrible, because the internet will be flooded with DLSS4 multi-frame-gen videos and it'll skew the comparison.
I'll be honest: while it would be fair to compare both GPUs on raw performance, I play with DLSS on in all titles that support it.
If I look at benchmarks, I look for the DLSS performance, and with more and more features coming to DLSS, this will only get worse, not better.

If I had to choose between a card that natively runs 50 FPS and one that runs 100 FPS, but the 50 FPS card can use DLSS to get 150 FPS and has all the other tech inside it, I'm picking that card!
And let's be honest... AMD isn't going to beat the new GPUs in raw performance either, so that just makes it even worse for them.
 

Radical_3d

Member
If I had to choose between a card that natively runs 50 FPS and one that runs 100 FPS, but the 50 FPS card can use DLSS to get 150 FPS and has all the other tech inside it, I'm picking that card!
And let's be honest... AMD isn't going to beat the new GPUs in raw performance either, so that just makes it even worse for them.
Well, the mid-range gamers would love a 5070 that doesn't cost a kidney and a half. And if it has 16GB of RAM, all the better.
 

hinch7

Member
The issue is, if they show the cards after the 5080 and 5090 release, it'll be horrible, because the internet will be flooded with DLSS4 multi-frame-gen videos and it'll skew the comparison.
I'll be honest: while it would be fair to compare both GPUs on raw performance, I play with DLSS on in all titles that support it.
If I look at benchmarks, I look for the DLSS performance, and with more and more features coming to DLSS, this will only get worse, not better.

If I had to choose between a card that natively runs 50 FPS and one that runs 100 FPS, but the 50 FPS card can use DLSS to get 150 FPS and has all the other tech inside it, I'm picking that card!
And let's be honest... AMD isn't going to beat the new GPUs in raw performance either, so that just makes it even worse for them.
AMD is going to have to price the 9070s way lower than the 5070 for people to care. People are already going along with Nvidia's marketing of 5070 = 4090.

It's going to be a bloodbath either way unless AMD goes hard on prices. And even then...
 
Last edited:

CuNi

Member
Well, the mid-range gamers would love a 5070 that doesn't cost a kidney and a half. And if it has 16GB of RAM, all the better.

The mid-range gamers will most likely gravitate toward a 5070 too. The only way they won't is if the AMD cards are €200 cheaper, which I'm 90% confident they won't be.
Especially for a mid-range gamer, DLSS4 multi-frame gen is a godsend, as they'll be able to choose whether to turn the settings up slightly and still get 60fps, or to push to 120fps at the same settings as before.

AMD is sadly really, really far behind when it comes to tech.
 

SolidQ

Member
I think FSR 4 is much like FSR 3
FSR4 has two kernels: one exclusive to RDNA4, the other for the rest of the cards

Waiting on price?
Because the multi-fake-frames are getting heavy marketing, it's hard to understand. AMD has some sort of multi-fake-frames too, but the real one will come with UDNA, when they get Matrix Cores from CDNA.
I think if the 9070XT is $500 with performance like the 7900XT, it's fine. A lot of people will buy it with 16GB.
 

rofif

Can’t Git Gud
I feel like Raster gains are going to decline gen over gen but Ray and path-tracing, followed by AI features are where the future of uplifts is.
Of course. Raster gains are limited by chip size and efficiency. The cards are so power-hungry for a reason. The advancements are in RT and software.
 

Radical_3d

Member
The mid-range gamers will most likely gravitate toward a 5070 too. The only way they won't is if the AMD cards are €200 cheaper, which I'm 90% confident they won't be.
Especially for a mid-range gamer, DLSS4 multi-frame gen is a godsend, as they'll be able to choose whether to turn the settings up slightly and still get 60fps, or to push to 120fps at the same settings as before.

AMD is sadly really, really far behind when it comes to tech.
Well, I don't care for frame generation, so if the 9070 is on par with a 5070 and costs €200 less, which is still a whopping €450, I'll be there for it.
 

CuNi

Member
Well, I don't care for frame generation, so if the 9070 is on par with a 5070 and costs €200 less, which is still a whopping €450, I'll be there for it.

I can't even tell if I want the 9070 to be €200 less or not.
On the one hand, I'd love it to be cheaper so AMD can fight NVIDIA for some market share.
On the other hand, I don't know how much profit that would leave AMD.
They're doing great CPU-wise currently, but they really need to win back GPU market share now, or at the very latest with the first batch of UDNA cards.
 

Xyphie

Member
Based on the specs, will the 5080 be faster than the 4090 in like-for-like tests? I'm not so sure.

It can probably get close-ish at some resolutions, because the 4090 really has problems scaling outside of 4K path-traced games.

[charts: relative performance at 1440p, with and without RT]


E.g. at 1440p, the 4090 with 128 SMs is only 17-21% faster than the 4080 Super with 80 SMs.

So with the 5080 having 84 SMs (+5%), maybe 100-200 MHz more core clock (~5%), and some architectural improvements, it's not unreasonable to think it can close the gap. But I wouldn't expect it in some 4K game where the 4090 is like 40% faster.
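The back-of-envelope math above can be multiplied out directly. A minimal sketch (the SM counts and ~5% clock bump come from the post; the architectural-gain figure is a pure assumption):

```python
# Back-of-envelope: can the 5080 close the 4090's 1440p lead over the
# 4080 Super? SM counts and the ~5% clock bump are from the post above;
# the architectural gain is a pure assumption.
sm_gain = 84 / 80      # 5080 SMs vs 4080 Super SMs -> 1.05
clock_gain = 1.05      # assumed ~100-200 MHz higher core clock
arch_gain = 1.08       # hypothetical IPC/architecture improvement

est_uplift = sm_gain * clock_gain * arch_gain
print(f"Estimated 5080 over 4080 Super: +{(est_uplift - 1) * 100:.0f}%")
# Compare with the 4090's measured 17-21% lead at 1440p.
```

Multiplying the factors rather than adding them is the right rough model, since SM count, clock, and IPC scale performance roughly independently; with these assumed numbers the estimate lands inside the 4090's 17-21% measured lead.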
 
  • Like
Reactions: N0S

Thebonehead

Gold Member
I'm thinking 20% gains in raster on average, which I'd be happy with on the 5090 vs the 4090.

I'm more interested in seeing what Indiana Jones, Cyberpunk, and AW2 are like with path tracing on and DLSS off, to see how that scales against last gen.
 

nemiroff

Gold Member
I suppose we must get accustomed to speaking in relative terms for real these days...

The claim that 5070 = 4090 should be taken with a grain of salt.

Yes, the reality is that generative performance is becoming more important, while the "lack" of advancements in rasterization is sort of decreasing in importance. But there's an intersection point of usefulness in there somewhere that is kinda hard for us as gamers to predict, especially when aligning it with the adoption rate in games.

I will purchase a 5090 for sure, but I feel the FOMO effect isn't as strong as some of Jensen's charts suggest - at least not yet.
 
Last edited:

CuNi

Member
I suppose we must get accustomed to speaking in relative terms for real these days...

The claim that 5070 = 4090 should be taken with a grain of salt.

Yes, the reality is that generative performance is becoming more important, while the "lack" of advancements in rasterization is sort of decreasing in importance. But there's an intersection point of usefulness in there somewhere that is kinda hard for us as gamers to predict, especially when aligning it with the adoption rate in games.

I will purchase a 5090 for sure, but I feel the FOMO effect isn't as strong as some of Jensen's charts suggest - at least not yet.

If I weren't into gen-AI with things like SD, FLUX, etc., I would've for sure gone with the 5080.
I just want that extra juicy VRAM for image and video generation.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
AMD hasn't had its RDNA4 conference yet, so when it happens, then we can say meh/not meh

Unless the RX 9070 XT is like $350, it's DOA... If it's over $400, save a little longer and get a 5070.
I don't think it has much hope either way, as it competes with the 4070 Ti at best and won't even hold a candle to the 5070.



^Note this is from AMD themselves, so it's not like I'm just being a hater... they think their RX 9070 XT will, at best, match a 4070 Ti.
Even if they come with 16 or 20GB of VRAM, they're literally competing with last-gen Nvidia cards.


Gigabyte is even using their Aorus Elite cooler for it (the Aorus Elite is the cheap Aorus).




Asus isn't even bothering to make a Strix or Astral version... TUF is as good as it gets.






As for MSI... they just said fuck it, we ain't even gonna bother making a cooler for this piece of shit.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Because AMD has much better vendors, like Sapphire, PowerColor, XFX. People don't buy MSI much









You think any of those vendors are better than MSI?
Mate, have you seen the X Trio and Suprim X... nothing from Sapphire, PowerColor, or XFX even comes close.
I'm pretty sure Gigabyte is going to give up on AMD soon too.
ASUS is gonna be the last premium brand working with AMD.
Someone help them.
 

STARSBarry

Gold Member
Bought a 3080 on release and got fucked over by the 10GB of VRAM; never again. 5090 all the way this gen. I already had the money saved for it years ago; I was just waiting for this to do a full new build.
 

Zathalus

Member
What.
Why are people even considering the 90-series scam cards?! These came after the Titans to see if people would buy them.
But also, the 5080 is only 16GB... so the 90 card will probably last longer. But not $1000 longer.
How is it a scam? 32GB of GDDR7, a 512-bit bus, 20k+ cores, TSMC 4nm (which just got a price hike this year), almost 100 billion transistors (over 3x that of a 3080, or 10x a PS5). It's literally two 5080s. Something like this would never be cheap.

Edit: Even just going off the 3080, with inflation that card would be just under $900 today. So since this is basically two 5080s, with inflation this thing would be $1800 minimum. Throw in TSMC price gouging and GDDR7 and you can see how this thing is priced accordingly.
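The inflation argument above is easy to sanity-check. A minimal sketch: the $699 figure is the 3080's 2020 MSRP, but the cumulative inflation factor here is an assumption (~25% for 2020 to 2025), not official CPI data:

```python
# Rough check on the inflation argument above. The 3080's $699 MSRP is
# public; the cumulative 2020->2025 inflation factor is an assumption
# (~25%), not official CPI data.
launch_price_3080 = 699
cumulative_inflation = 1.25  # assumed

adjusted = launch_price_3080 * cumulative_inflation
print(f"3080 MSRP in today's dollars: ~${adjusted:.0f}")

# "It's literally two 5080s" -> double it for a ballpark 5090 floor.
print(f"Doubled: ~${adjusted * 2:.0f}")
```

With that assumed factor, the adjusted price lands just under $900 and the doubled figure in the $1700-1800 range, matching the post's "basically two 5080s" reasoning.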
 
Last edited:

FireFly

Member
Unless the RX 9070 XT is like $350, it's DOA... If it's over $400, save a little longer and get a 5070.
I don't think it has much hope either way, as it competes with the 4070 Ti at best and won't even hold a candle to the 5070.
The 4070 Ti has 25% more CUDA cores than the 5070 and a higher listed boost clock, so it remains to be seen how the 5070 fares overall.
 
AMD is going to have to price the 9070s way lower than the 5070 for people to care. People are already going along with Nvidia's marketing of 5070 = 4090.

It's going to be a bloodbath either way unless AMD goes hard on prices. And even then...
AMD's feature deficit is reaching the point where they could price at 50% of what Nvidia charges and no one would buy. They've been doing nothing for far too long; the world has moved on from raster performance, and AMD has nothing besides raster and a bigger VRAM number.

Also, by setting the 5070 at $550, Nvidia has basically thrown down the gauntlet on mid-range pricing. Meanwhile, Intel is selling video cards with 12GB of VRAM for $250. AMD has literally nowhere left to run in terms of pricing; they're now restricted to between $250 and $550, and that's such a narrow price range that it's essentially unsustainable for their GPU division. I'm not surprised they backed out of talking about their new GPU in their CES keynote; they know they're fucked.
 
Last edited:

Rivdoric

Member
The Fake Frame Dream is getting better and better.

"Oh my, I can finally enjoy an RT game at 240fps with a base framerate of 20 and call it smooth; the future has never been so real!"

RTX 6090: 1 real frame, 99 fake, 100% real price.
 
Last edited:

RespawnX

Member
What.
Why are people even considering the 90-series scam cards?! These came after the Titans to see if people would buy them.
But also, the 5080 is only 16GB... so the 90 card will probably last longer. But not $1000 longer.

The 90s are enthusiast cards. Customers would pay $3,000 if Nvidia demanded it. Half of the buyers don't care about money, and the other half have no other hobby. From an AI perspective, the 90 series is also interesting; 32GB of VRAM is hard to come by outside of a Mac at that price, along with the AI compute.

For me, 16GB of VRAM is too little to dare an upgrade. Maybe a 5070 Ti if it goes on sale; €879 is too much for me considering how narrow the upgrade is. Also, the benchmarks will reveal the true performance ratio. For many, the 4070 Ti Super should be more interesting with the upcoming sales.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Those vendors are very fine.


[image: "This is fine" meme]




The 4070 Ti has 25% more CUDA cores than the 5070 and a higher listed boost clock, so it remains to be seen how the 5070 fares overall.

At worst it's gonna be a 4080-class card; comparing CUDA cores and clocks across generations is basically pointless.
The 9070 XT is a 4070 Ti-class card at best, according to AMD (obviously, we wait for reviews).
It would need to be much cheaper than the 4070 Ti to be worth anything.
I don't see what hope that card has unless it's Intel Arc levels of value, and even then I still don't have much faith in it with the 5070 being $550.
 