
NVIDIA just sacrificed their entire lineup so they could sell you a $2000 GPU

Danny Dudekisser

I paid good money for this Dynex!
Even the $2000 card isn't impressive as an actual raw horsepower upgrade. They're selling software, not hardware. And it's absolutely succeeding - that's the worst part.
 

Jigsaah

Gold Member
And this is why I typically skip generations. 1070 Ti to 3080... then I caved on the 4090 because... it was the fuckin' 4090. And now I'll be skipping the 5090 because the 4090 is still second best and can hit my monitor's refresh rate on pretty much any game I currently play. Here's to 2027 and the 60 series.
 

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead
Buying a 5090 if I can get my hands on it. Maybe even a 5080.

We have this thread every single generational release. You can go look and see, right here on GAF. We see people clutching their 4090s today, but check the old threads and you'll read the same complaints about the 4090 that the 5090 is getting now.
 
Last edited:

Buggy Loop

Member
Not a single game uses the neural rendering

Now does that mean go buy it on promises? No.

But raster-to-raster is a meaningless measure for these. Nvidia / AMD / Intel are all going with neural pipelines. The whole pipeline is being reinvented, from shaders to geometry to lighting.

Time will tell. Throwing it under the bus in a day-0 review, when not a single game supports its features, is too much of a risk of the post aging like milk for me. The last major pipeline rework dates back 24 years. What Blackwell does won't be obvious straight away.
 

Kilau

Gold Member
Whatever you don't buy, Jensen will be forced to eat!

 

welshrat

Member
Honestly hoping now that the upcoming 9070 XT is half decent. I have my doubts about AMD pricing it correctly, but honestly, if it comes in close to $600-650, FSR4 is good, and RT is improved, it will probably get my money.
 

JohnnyFootball

GerAlt-Right. Ciriously.
They will release Super/Ti models in a year with much better price/perf, just like with the 20xx and 40xx series. For now, simply either buy the 5090 or hold out a bit longer if you are a budget gamer like most of us...
The last time they offered a Ti, it didn't offer much better performance, but it had a drastically higher price.

I’m referring to the 3080 Ti.

The 1080 Ti was the fluke.

The Super cards from the 4000 series were better, but not drastically so.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Honestly hoping now that the upcoming 9070 XT is half decent. I have my doubts about AMD pricing it correctly, but honestly, if it comes in close to $600-650, FSR4 is good, and RT is improved, it will probably get my money.
I want to replace the 7800 XT in my living-room SteamOS/Bazzite PC. I'm very much hoping the 9070 XT ends up being a great value.
 

JohnnyFootball

GerAlt-Right. Ciriously.


HUB did a really great deep dive into multi frame generation. Here's their recommendation on when it makes sense to use frame gen:

[Chart: HUB's recommendations for when frame generation makes sense]


So yeah, if you have a 250 Hz+ monitor, and you're running a game at 100+ FPS, and that's still not smooth enough for you, and you're willing to tolerate visual artifacts and higher latency in exchange for a higher output framerate... then multi frame generation is a really killer feature.

Yes, frame gen is a great feature when used properly. It should be avoided when the base framerate is below 60 fps, but it's pretty great when smoothing 90 fps up to 180.
 

Elios83

Member
It's funny how the tune has changed around these cards since the CES presentation.
Anyway, there aren't many options: the 4000 series will be discontinued, and these cards will be the only "affordable" option for most people (the 5060 and 5070 even more so).
Realistically, only a niche of people can actually afford to spend $2000 on a GPU, so it's not so much about convincing more people to buy the 5090 as it is about increasing their margins on the cards most people will buy, by gimping them to the point they're barely an upgrade. The marketing with AI and fake frames will do the rest.

It will be interesting to see how the mid-range cards AMD is preparing compare in actual tests. There's zero chance they pose a threat to Nvidia, but if priced well they could be a really good option for gamers.
 

OmegaSupreme

advanced basic bitch
How many of these ultra high-end cards sell, anyway? Do we ever get those kinds of numbers? Surely they can't be selling more than a few hundred thousand?
 

Jayjayhd34

Gold Member
People seem to be forgetting how much they've invested in AI; they need to claw that money back somehow. Can't imagine making DLSS and frame gen was cheap for them, and then you've got to mention the lack of competition.
 
Last edited:

grvg

Member
hey, basically, im just not gonna buy it (Nvidias entire 50 lineup)

i know …. i know im sorry….

UGH!!!!!!!!!!!!!!!!!

it’s just that im not gonna buy it is alll
 

Celcius

°Temp. member
How many of these ultra high-end cards sell, anyway? Do we ever get those kinds of numbers? Surely they can't be selling more than a few hundred thousand?
I mean, the 3090s and 4090s have been constantly, or near constantly, sold out during their lifespans. At no point were you able to just go to your local Best Buy and casually grab an FE, for example. AIB cards have been tough to get throughout the lifecycle as well.
 
Last edited:

Myths

Member
Not a single game uses the neural rendering

Now does that mean go buy it on promises? No.

But raster-to-raster is a meaningless measure for these. Nvidia / AMD / Intel are all going with neural pipelines. The whole pipeline is being reinvented, from shaders to geometry to lighting.

Time will tell. Throwing it under the bus in a day-0 review, when not a single game supports its features, is too much of a risk of the post aging like milk for me. The last major pipeline rework dates back 24 years. What Blackwell does won't be obvious straight away.
Of course they don’t understand. People don’t even understand what they’re paying for, this is readily apparent with the gaming segment that thinks the cost only reflects the use for just gaming — except that GPUs are now multipurpose and have been for a very long time. It’s a bundled cost.

Jokes on them for not using it outside of gaming. Skill issue.
 

FingerBang

Member
Not a single game uses the neural rendering

Now does that mean go buy it on promises? No.

But raster-to-raster is a meaningless measure for these. Nvidia / AMD / Intel are all going with neural pipelines. The whole pipeline is being reinvented, from shaders to geometry to lighting.

Time will tell. Throwing it under the bus in a day-0 review, when not a single game supports its features, is too much of a risk of the post aging like milk for me. The last major pipeline rework dates back 24 years. What Blackwell does won't be obvious straight away.
Then don't release new cards overpromising if you can't deliver. Remember that time when Nvidia introduced the future with ray tracing? How many high-end, ray-traced games can a 2060 play? How many path-traced games can a 2080 Ti run?

By the time the technology was good enough, the first generation of cards was basically useless. The 5000 series is terrible value compared to the 4000, which was already bad value compared to the 3000. That's the current situation.

It's up to Nvidia to prove the tech is worth it, not for us to justify a single-digit performance uplift for the same price IN MODERN GAMES.
 
Last edited:

GHG

Member
Of course they don’t understand. People don’t even understand what they’re paying for, this is readily apparent with the gaming segment that thinks the cost only reflects the use for just gaming — except that GPUs are now multipurpose and have been for a very long time. It’s a bundled cost.

Jokes on them for not using it outside of gaming. Skill issue.

He's still talking about it being used for gaming, just in the context of some speculative future where the rendering pipeline is turned on its head. By which point there will be new, much more powerful hardware that's better suited anyway.
 

xVodevil

Member
Well, I can live with the "the less I buy, the more I save" approach.
But wouldn't that also mean our 40xx cards hold on longer, VRAM aside, if there are no better cards to develop games for?
 

Haint

Member
How many of these ultra high-end cards sell, anyway? Do we ever get those kinds of numbers? Surely they can't be selling more than a few hundred thousand?

The 4090 alone is around 1.2% of total Steam users, so probably pushing 1.5 million units in gaming use. Many more are hidden away in AI-specific applications, so potentially 2 million or more total. That's over $3.5 billion in revenue, and they more than triple the BoM cost in profit on every sale. A lot of people got a lot of money, and they're happy to spend it.

The old Titan cards sold like absolute dogshit, maybe tens of thousands of units lifetime. The 4090 is what really changed the game. Without bitcoin and COVID, the 3090s would also have sold terribly. The massive performance gaps in the product stack are why people will fall over themselves to spend $2000+.
 
Last edited:

Buggy Loop

Member
Then don't release new cards overpromising if you can't deliver. Remember that time when Nvidia introduced the future with ray tracing? How many high-end, ray-traced games can a 2060 play? How many path-traced games can a 2080 Ti run?

By the time the technology was good enough, the first generation of cards was basically useless. The 5000 series is terrible value compared to the 4000, which was already bad value compared to the 3000. That's the current situation.

It's up to Nvidia to prove the tech is worth it, not for us to justify a single-digit performance uplift for the same price IN MODERN GAMES.

And I'm not trying to convince anyone to buy them, I don't give a shit. I might even skip it myself, even though the 3000 series is due for an upgrade imo, but I'm not camping out or fighting bots online, trust me on that. I said in the other thread that, much like Turing, by the time there are enough games using it, another series will be out.

The closest neural implementation coming is Alan Wake 2's RTX Mega Geometry, which was due either tomorrow or around launch.

But it is what it is: if you don't implement a tech that brings a paradigm shift, at the cost of people being confused when comparing against old methods, then the later series won't be any more justifiable, because no games will have tried it. Had Turing not introduced RT/ML, would we have any RT/PT games nowadays? Probably not. Had Turing not implemented it, would you have jumped on it with Ampere? Then Ampere takes the burden of proving a technology with no games around.
 
Last edited:

GHG

Member
And I'm not trying to convince anyone to buy them, I don't give a shit. I might even skip it myself, even though the 3000 series is due for an upgrade imo, but I'm not camping out or fighting bots online, trust me on that. I said in the other thread that, much like Turing, by the time there are enough games using it, another series will be out.

The closest neural implementation coming is Alan Wake 2's RTX Mega Geometry, which was due either tomorrow or around launch.

But it is what it is: if you don't implement a tech that brings a paradigm shift, at the cost of people being confused when comparing against old methods, then the later series won't be any more justifiable, because no games will have tried it. Had Turing not introduced RT/ML, would we have any RT/PT games nowadays? Probably not. Had Turing not implemented it, would you have jumped on it with Ampere? Then Ampere takes the burden of proving a technology with no games around.

The difference is that, while not spectacular, the 20XX series still offered a much better uplift when it came to raster performance (~30% for both the 2080 and 2080 Ti vs. the 1080 and 1080 Ti, respectively).

Right now, with the 5080, you (and Nvidia) are asking people to buy a ~10% increase based on some speculative future. That's nonsensical.
 

Buggy Loop

Member
The difference is that, while not spectacular, the 20XX series still offered a much better uplift when it came to raster performance (~30% for both the 2080 and 2080 Ti vs. the 1080 and 1080 Ti, respectively).

Right now, with the 5080, you (and Nvidia) are asking people to buy a ~10% increase based on some speculative future. That's nonsensical.

Because Turing's focus on making the pipeline compute-like still helped in raster.

Almost everything new on Blackwell is NPU. What game uses that?

Btw, fuck off with your little "you are asking people to buy", I'm not. See my replies before and in the other thread. Don't start this shit, GHG. I'm not gonna be your little daily gotcha dopamine hit like the junkie you are for that stuff.
 
Last edited:
Reactions: GHG (LOL)

.Xeno

Neo Member
Still stuck on a 3080 10GB; at this point I might just get a 5070 Ti for the 16GB, then wait for a better generational upgrade with the 60 series.
 

GHG

Member
Because Turing's focus on making the pipeline compute-like still helped in raster.

Almost everything new on Blackwell is NPU. What game uses that?

Btw, fuck off with your little "you are asking people to buy", I'm not. See my replies before and in the other thread. Don't start this shit, GHG. I'm not gonna be your little daily gotcha dopamine hit like the junkie you are for that stuff.

Damn, someone's been a bit touchy since Monday's news.

You're sat here attempting to justify this new generation, so yes, it does come across as some kind of sales pitch (whether that's intentional or not, I don't know).

The bottom line is that, from a gaming perspective, this generation is a dud if you currently have access to 40xx series hardware. However, if you also happen to use your GPU for AI tasks, then there are substantial upgrades to be had here.

What you're failing to account for in those fictional scenarios is the fact that we are only now starting to get games released that require graphics cards with RT capability, a whole 6+ years after Turing's release. We also had an immediate pipeline of upcoming RT games at the time of Turing's launch. All of this waffle will be even more hilarious if stuff like Mega Geometry (which is only scheduled to come to a single game at the time of writing) also runs decently well on the higher-end 40XX series cards.

So yeh, I think most people are good to wait and see how this shakes out.
 
Last edited:

LectureMaster

Gold Member
This in fact just made it super easy for me not to upgrade from my 4080.

I will probably use the money to buy a handheld this year and get better overall enjoyment out of it than from a new card.
 

Miyazaki’s Slave

Gold Member
I've seen the DAW quantities for my area (Houston), and there is absolutely a huge shortage of these cards.

Hitting the "physical lottery" (i.e., showing up to a store to get one) in one of the largest metro areas of the country will be 99.999999999% impossible.

Manufactured shortage or actual shortage, it doesn't matter... there are still not that many FE cards to go around.
 

Buggy Loop

Member
Damn, someone's been a bit touchy since Monday's news.

You're sat here attempting to justify this new generation, so yes, it does come across as some kind of sales pitch (whether that's intentional or not, I don't know).

The bottom line is that, from a gaming perspective, this generation is a dud if you currently have access to 40xx series hardware. However, if you also happen to use your GPU for AI tasks, then there are substantial upgrades to be had here.

What you're failing to account for in those fictional scenarios is the fact that we are only now starting to get games released that require graphics cards with RT capability, a whole 6+ years after Turing's release. We also had an immediate pipeline of upcoming RT games at the time of Turing's launch. All of this waffle will be even more hilarious if stuff like Mega Geometry also runs decently well on the higher-end 40XX series cards.

So yeh, I think most people are good to wait and see how this shakes out.

Again, putting words in my mouth:

"you're failing to account for... only now starting to get games release that require RT"

My own quote: "But for those Ada owners, by the time games really flex these technologies, the 6000 series will probably be out. A bit like Turing being the start of RT & ML upscalers, but we didn't see good implementations until years later."

Yup, you nailed it



I'm an electrical engineer who took classes in semiconductors, and I have a family member who worked on a 100-megapixel camera on silicon for Mars robots, so pardon me if my interest is more in what this brings under the hood on that silicon for NPUs than in benchmarking Hitman for the 100th time.

Don't buy it, for all I care, GHG.

I'm not even sure I want to buy it.

I want the whitepapers.

It's new tech. It'll define AMD's and Intel's future. They're all on board with the DirectX HLSL team. It'll affect future consoles. If that isn't interesting to you, then please do go on with your life; just don't fucking paint me as a salesman because I'm interested.
 

Jigsaah

Gold Member
I'm also a bit concerned about new technologies in their first iteration. I don't really do anything with AI currently, so I'm happy to wait until it matures a bit more.
 
5080 > 4080

cheaper
smaller
newer
better option

am I missing something?

The 4080 was way better than the 3090. I personally didn't think the 5080 would match the 4090, considering how good the 4090 was, but it wasn't unreasonable to expect somewhat similar performance.

The 5090 being 30% faster (?) than the 4090 is more than acceptable, but they jacked up the price quite a bit.

I've said it on here before, but the 40 series is actually the best card lineup of this console generation, and people won't admit it.
 
These generational improvements are getting funny at this point:
The 1080 was matched by the 2060 (a 60-class card matching an 80-class card is incomprehensible with the Nvidia of today, but that was just 6 years ago)
[benchmark chart: GTX 1080 vs RTX 2060]

The 2080 Ti was matched by the 3070
[benchmark chart: RTX 2080 Ti vs RTX 3070]

The 3090 Ti was nicely outdone by the 4080
[benchmark chart: RTX 3090 Ti vs RTX 4080]

The 5080 is just sad by comparison
[benchmark chart: RTX 5080 comparison]
Damn, that's fucking horrible. Wow.
 

spons

Gold Member
The most atrocious thing about the 2000 buck GPU is the people posting wanky comments about how they can afford one and can't wait to play on it.
 