
Let's be real about GPU price increases, especially the 5090

SNG32

Member
A small percentage of people into PC gaming buy 5090s. Games never use these types of GPUs as a recommended spec. So what if it becomes out of reach for the average consumer? There is still always mid tier, and developers will now have to optimize better for mid-to-low tier GPUs instead of brute forcing performance.
 
Well yeah, I mean you can play 95% of games at great settings, if not max, on a 4080 / 4090; the jump from the 40 series to the 50 series was not worth it in any way.

A 4090 will be all you need for at least another 3-4 years.
 
There is still always mid tier and developers will now have to optimize better for mid-low tier GPUs instead of brute forcing performance.
No comments on the rest of your post, but this part stood out.

No, they won't. You're right. Most people aren't buying 5090s. The most popular GPU on Steam is the 3060. In fact, the mid-range xx60 in each generation is always in the top 3 spots.

Despite the mid range being the most popular card, developers are not optimizing their games. Nvidia selling fewer 5090s isn't going to change that.

Developers/publishers have conditioned people to accept a very low bar.
 
I mean high end GPUs have always been out of reach for the average consumer. There's a reason high end GPUs account for like 1% of GPUs on Steam.


Even if devs aren't specifically implementing high end features for these cards, you still get resolution/framerate/settings presets inherent to using better hardware.


A game on PS6 running at 1440p/30fps/medium settings will always look way worse than the same game running at 4k/120fps/ultra settings, even if devs aren't using specific RTX features.
 
No comments on the rest of your post, but this part stood out.

No, they won't. You're right. Most people aren't buying 5090s. The most popular GPU on Steam is the 3060. In fact, the mid-range xx60 in each generation is always in the top 3 spots.

Despite the mid range being the most popular card, developers are not optimizing their games. Nvidia selling fewer 5090s isn't going to change that.

Developers/publishers have conditioned people to accept a very low bar.

You're right about the optimization part, because there are some games, such as Borderlands 4, that don't run great on a 5090 for the performance you should be getting.
 
I'm happy I built a PC in the summer of 2023 with an RTX 4090, but my only regret is not getting 128GB of DDR5 RAM; I have 64GB. I do a lot of AI stuff, video production, and 3D. I wanted to build a dedicated workstation, but now I'm stuck due to RAM prices :/

Thankfully though, I can still use my machine locally for AI gen because the 4090 is a beast.
 
A small percentage of people into PC gaming buy 5090s. Games never use these types of GPUs as a recommended spec. So what if it becomes out of reach for the average consumer? There is still always mid tier, and developers will now have to optimize better for mid-to-low tier GPUs instead of brute forcing performance.

Resolution, graphics settings, and frame rate. Those are the three factors to consider.

There will always be people who want to push all three as much as possible. Some of those people have the money to spend on a halo-tier product like the 90 class cards. If you are a person who wants to push one or two of those things, then a mid-tier card is plenty. You're not missing out.

Personally, I don't feel any need to move from QHD to 4K. My eyesight is such that the additional DPI wouldn't give me any significant value. I've also played around with capping my frame rate enough to know that I don't perceive any additional "smoothness" once the frame rate gets beyond about 100fps. I understand that other people are more sensitive and pro-level gamers can tell the difference between 165fps and 240fps; I just can't. So for me, I've got a QHD monitor and I've capped my refresh rate at 90Hz.

A current generation mid-tier card is enough to max out settings in almost every game and hit 90fps at that resolution. I could buy a 5090 (though it would be irresponsible), but it wouldn't do me any good because I'm already hitting the ceiling of what's viable for my eyes & brain. I would like to eventually upgrade to an OLED monitor for the deeper blacks, but it's not worth spending $4,000+ on at this time. I'll wait a few more years until that's down below $2,000.
 
In all my decades of PC gaming, I've never said "damn, I don't know what to do with all this performance."

There's never been a time where I could run all my games at all the settings I wanted with excellent performance.

$2k for a GPU is also retarded
 
If the 5090 is getting a price increase, then probably everything else, including the mid-tier, is too. There's a reason Nvidia is bringing back the freaking 3060.
 
Unfortunately that isn't the case, because if the 5090 rises in price, you'll still have some of the rich folks with more money than brains throwing money at it. So once those pieces of shit at Nvidia see that, they'll try to up the price of all the other models too. This directly affects the poor and middle class trying to buy the 60, 70, and 80 series.

The current prices, even at MSRP, are already pretty bad, and now it's going to get worse. The 5090 would have been under $2,000 in an ideal timeline.
 
You don't need anything more than 9070xt/5070ti tier unless you use your GPU to make money.

Hell, the 9060xt/5060ti tier is best for most people.
I don't make a dime with my GPUs, but I definitely need a 5090 to get all the bells and whistles in the most demanding titles in 4K.
Or a 4090, I guess.

Now, are those bells and whistles something I really "need"? I guess not, but it sure feels and looks amazing to play.

If prices go straight to oblivion, I might sell one of my 5090s, or maybe both, and try to convince myself that what you wrote is 100% true for me too.
 
In all my decades of PC gaming, I've never said "damn, I don't know what to do with all this performance."

There's never been a time where I could run all my games at all the settings I wanted with excellent performance.

Well, da-doi!

Don't go looking for limits and complain when you find them!

Especially when you're leveraging a single component of a complex and highly integrated collection of systems to create gains! Ramping up sliders to the max isn't just creating stress on the GPU, it's pushing the CPU and all the connective components, because at the end of the day you are just creating and manipulating data.
 
Well, da-doi!

Don't go looking for limits and complain when you find them!

Especially when you're leveraging a single component of a complex and highly integrated collection of systems to create gains! Ramping up sliders to the max isn't just creating stress on the GPU, it's pushing the CPU and all the connective components, because at the end of the day you are just creating and manipulating data.
 
I don't make a dime with my GPUs, but I definitely need a 5090 to get all the bells and whistles in the most demanding titles in 4K.
Or a 4090, I guess.

Now, are those bells and whistles something I really "need"? I guess not, but it sure feels and looks amazing to play.

If prices go straight to oblivion, I might sell one of my 5090s, or maybe both, and try to convince myself that what you wrote is 100% true for me too.
That won't be true for you.

More than 90% of people either can't do or don't care about 4K and VR, so they'll be fine with cards from the 5060 Ti 16GB to the 5070 Ti for the future.

But since you must do max at 4K, you're going to want to keep your 5090, especially for future big games like The Witcher 4 and GTA VI maxed at 4K.

5060 Ti - 1080p/1440p, but should stick with 1080p for next gen

5070 Ti - 1440p/4K, but should stick with 1440p for next gen

5090 - 4K/8K, but should stick with 4K for next gen
 
I've always found high-end GPUs retarded, but it's only gotten worse with time. The last time I splurged on a high-end GPU was the GTX 1080 (the GOAT), but that was when high-end GPUs were a third of the price.

The funny thing to me is not only how expensive everything has gotten, but how they have essentially moved everything down a tier: the xx60 series is now the low tier instead of the xx50, making my 4070 Super a mid-tier GPU.

So GPUs have moved down tiers while going up anywhere from $150-600 in price.
 
I had a Titan, I had a 3090, I had a 4090, and now I have a 5090.

I'll get the 6090 when it's released, because I have the disposable income to do so.

However, they have all been completely overkill, and in reality, I could probably still be using a 3090 and running most games at 4k/60 and be perfectly fine.
 
I had a Titan, I had a 3090, I had a 4090, and now I have a 5090.

I'll get the 6090 when it's released, because I have the disposable income to do so.

However, they have all been completely overkill, and in reality, I could probably still be using a 3090 and running most games at 4k/60 and be perfectly fine.
I had a 3090 and a 4090, and have 5090s now, but if the current trend continues I'm pretty sure I'm priced out of the next top card.
Probably the 80 series next time for me (if Nvidia still makes consumer cards by then), or the best AMD offering.
 
5090 and 4090 owner here. I wish there were higher cards I could buy. Price is no object to some of us. However, I do agree games should be developed for mid-tier specs because not being able to get 4K60 on a 5090 in modern games feels very bad, let alone the 4K120 I actually want for my display.
 
I had a Titan, I had a 3090, I had a 4090, and now I have a 5090.

I'll get the 6090 when it's released, because I have the disposable income to do so.

However, they have all been completely overkill, and in reality, I could probably still be using a 3090 and running most games at 4k/60 and be perfectly fine.
To me this would be pointless, because do you even notice big leaps? Like, I'm the type of person to buy a GPU and sit on it until the next console gen starts. I like to see how long a GPU can hold out, and it's more of a big jump when I get one. But different strokes for different folks.
 
That won't be true for you.

More than 90% of people either can't do or don't care about 4K and VR, so they'll be fine with cards from the 5060 Ti 16GB to the 5070 Ti for the future.

But since you must do max at 4K, you're going to want to keep your 5090, especially for future big games like The Witcher 4 and GTA VI maxed at 4K.

5060 Ti - 1080p/1440p, but should stick with 1080p for next gen

5070 Ti - 1440p/4K, but should stick with 1440p for next gen

5090 - 4K/8K, but should stick with 4K for next gen
The 5080 is a 1440p card. It is not a 4K card. At 4K max settings, you will not hit 60fps in a lot of games, especially Unreal Engine games. In path tracing, the 5080 is a 1080p card. As for the 5090, in path tracing it's a 1440p card.

Keep in mind, I define the card by what it will run at native resolution with no DLSS or frame generation.
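For a rough sense of why a card drops a tier like that, the per-frame pixel counts alone tell most of the story (using the common 16:9 modes; actual performance obviously also depends on settings and engine):

```python
# Pixels per frame for the common 16:9 resolutions discussed above.
pixels = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "4K": 3840 * 2160,
}

# 4K pushes 2.25x the pixels of 1440p, and 1440p ~1.78x those of 1080p,
# so a card that just manages 1440p/60 needs roughly double the shading
# throughput to hold 4K/60, all else being equal.
print(pixels["4K"] / pixels["1440p"])     # 2.25
print(pixels["1440p"] / pixels["1080p"])  # ~1.78
```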
 
Recommended Spec usually targets 60 FPS, which is not good enough.

While the 5090 is really powerful, there are still some games that require more than what it has to offer, to achieve a high frame rate, be it because they're demanding and/or unoptimized.

I'm currently primarily gaming at 3840x1600. I'd like to upgrade to a 5k2k monitor when the right one releases, which will be even more demanding.

Having a lot of headroom and being able to DSR, or just not run the card at its limit all the time, is also nice.
 
According to the Steam hardware survey:
5090: 0.36%
5080: 1.07%
4090: 0.80%
4080: 0.70%
4080S: 0.69%

So combined, these top 5 GPUs are about 3.62% of Steam. For perspective, the most common card is the 3060 with 4.07%.

So yes, for all intents and purposes these cards are irrelevant in the PC gaming market.
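A quick sanity check on those figures (shares copied from the survey numbers quoted above):

```python
# High-end shares from the Steam Hardware Survey figures quoted above (%).
high_end = {
    "RTX 5090": 0.36,
    "RTX 5080": 1.07,
    "RTX 4090": 0.80,
    "RTX 4080": 0.70,
    "RTX 4080 SUPER": 0.69,
}

combined = round(sum(high_end.values()), 2)
print(combined)         # 3.62 -- all five high-end cards together
print(combined < 4.07)  # True -- still less than the RTX 3060 alone
```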
 
The 5080 is a 1440p card. It is not a 4K card. At 4K max settings, you will not hit 60fps in a lot of games, especially Unreal Engine games. In path tracing, the 5080 is a 1080p card. As for the 5090, in path tracing it's a 1440p card.

Keep in mind, I define the card by what it will run at native resolution with no DLSS or frame generation.

And that's the issue. I think a 5090 should be able to hit 4K60 easily without DLSS and frame gen, especially if it's going up to $5,000.
 
Well yeah, I mean you can play 95% of games at great settings, if not max, on a 4080 / 4090; the jump from the 40 series to the 50 series was not worth it in any way.

A 4090 will be all you need for at least another 3-4 years.
Shit, maybe longer. You can buy a 1080 Ti, Titan XP, or 2080 Ti for 150-300 bucks and have great experiences, especially if you turn off RT.
 
I've always found high-end GPUs retarded, but it's only gotten worse with time. The last time I splurged on a high-end GPU was the GTX 1080 (the GOAT), but that was when high-end GPUs were a third of the price.

The funny thing to me is not only how expensive everything has gotten, but how they have essentially moved everything down a tier: the xx60 series is now the low tier instead of the xx50, making my 4070 Super a mid-tier GPU.

So GPUs have moved down tiers while going up anywhere from $150-600 in price.
Yep, the massive price increases are already retarded, but they also don't deliver much improvement each gen, on top of cheaping out on VRAM.

The 1000 series was amazing in its price and in the upgrade over the 900 series; that's how things should be.

Their absolute highest, the 1080 Ti, was $699. The 1000 series is so good that the 1070 and 1060 are still used by a good number of folks on Steam.
 
And that's the issue. I think a 5090 should be able to hit 4K60 easily without DLSS and frame gen, especially if it's going up to $5,000.
I had no idea that the 5090 can't even do 60 at native. I would have thought a crazy $2,000 GPU would, and should, do more than that at 4K native. That's even worse than I imagined.

Although, how much better does native look compared to DLSS these days? If it's super close, to the point where you can't tell the difference, then I guess it's fine?
 
The 5080 is a 1440p card. It is not a 4K card. At 4K max settings, you will not hit 60fps in a lot of games, especially Unreal Engine games. In path tracing, the 5080 is a 1080p card. As for the 5090, in path tracing it's a 1440p card.

Keep in mind, I define the card by what it will run at native resolution with no DLSS or frame generation.
And that's the beauty of PC: any card can do whatever you want.

For me, my 9070 XT is a 4K card. It does it well, but I lean into FSR4 and run high settings instead of ultra (except textures) if I need to up the framerate. My target is 60-120fps.
 
I had no idea that the 5090 can't even do 60 at native. I would have thought a crazy $2,000 GPU would, and should, do more than that at 4K native. That's even worse than I imagined.

Although, how much better does native look compared to DLSS these days? If it's super close, to the point where you can't tell the difference, then I guess it's fine?
But I think that's the issue: Nvidia isn't focusing on the raw power of GPUs and has all this upscaling bullshit doing the heavy lifting.
 
There is still always mid tier and developers will now have to optimize better for mid-low tier GPUs instead of brute forcing performance.
They already do that so games can run on consoles.

The days of PC games pushing to the limits are long gone. Games like DOOM 3, Crysis and all that stuff, nowhere to be seen anymore.
 
They already do that so games can run on consoles.

The days of PC games pushing to the limits are long gone. Games like DOOM 3, Crysis and all that stuff, nowhere to be seen anymore.

Tbh games aren't really optimized well on console either; Doom: The Dark Ages can't run a stable 60 in performance mode.
 
Tbh games aren't really optimized well on console either; Doom: The Dark Ages can't run a stable 60 in performance mode.
Yeah, imo games should run much better than they do, especially UE5 games where devs didn't take the time to optimize shit.

It is what it is I guess. Makes me appreciate well optimized games more at least. For example KCD2 ran like a dream for me.
 
They already do that so games can run on consoles.

The days of PC games pushing to the limits are long gone. Games like DOOM 3, Crysis and all that stuff, nowhere to be seen anymore.
The optimization and tech seem so much worse now.

Doom 3, Half-Life 2, and Crysis were amazing for their time and aged decently.

Crysis needed like 1GB of VRAM and 2GB of system RAM.

Games these days obviously look way better than Crysis, especially if you put them side by side to compare, but still definitely not 10x-the-requirements better.
 
The optimization and tech seem so much worse now.

Doom 3, Half-Life 2, and Crysis were amazing for their time and aged decently.

Crysis needed like 1GB of VRAM and 2GB of system RAM.

Games these days obviously look way better than Crysis, especially if you put them side by side to compare, but still definitely not 10x-the-requirements better.
It sounds like this shit started on mobile.
 
The optimization and tech seem so much worse now.

Doom 3, Half-Life 2, and Crysis were amazing for their time and aged decently.

Crysis needed like 1GB of VRAM and 2GB of system RAM.
We've gone from passionate people trying to make the most out of the hardware to suits trying to make the most out of the budget.

No time for optimization, no time for experiments and no time for passion.

Games these days obviously look way better than Crysis, especially if you put them side by side to compare, but still definitely not 10x-the-requirements better.
Some of them look better. But some (for my tastes, lots of them) look worse than Crysis, and I'd put Monster Hunter Wilds as the prime example.

This is a very personal opinion, but for my tastes modern games have a lot of shit on screen which, when paired with low resolutions + reconstruction, makes for some real blurry and ugly graphics. There's also a lot of use of volumetric fog, or as some people call it, the "anti-soul gas". Shit sucks.

It's like the new bloom, and ugly trend that I don't like at all. makes for real muted colors, and it's not even "realistic" nor I see any improvements when it comes to the style and visuals.
 
The days of PC games pushing to the limits are long gone. Games like DOOM 3, Crysis and all that stuff, nowhere to be seen anymore.
It doesn't happen as often, for sure. But it sure is cool when it does!
Games like Cyberpunk, Alan Wake 2, and Black Myth in all their glory are amazing experiences.
And Resident Evil 9 with path tracing will be a very different visual experience than the console versions.
 
It doesn't happen as often, for sure. But it sure is cool when it does!
Games like Cyberpunk, Alan Wake 2, and Black Myth in all their glory are amazing experiences.
You are right. Maybe I was being a bit cynical with the previous post lol. Still wish it happened more often. :goog_relieved:
 
Buying a top-end GPU isn't always for now; it's so you can still play games at their best settings 5 years from now, which is very apt at the moment with the memory situation. GPUs are too expensive now to keep paying for a new one every 18 months or so; even the mid-range cards are a thousand quid and going higher anyway. That's why I got a 5090. In 2014 I paid 400 for a 980, the top card at the time. Prices are crazy now.
 
They already do that so games can run on consoles.

The days of PC games pushing to the limits are long gone. Games like DOOM 3, Crysis and all that stuff, nowhere to be seen anymore.
Is this Jensen's way of quitting the platform cold turkey? People won't remember him, and will easily forget everything he did for the industry.
 
Is this Jensen's way of quitting the platform cold turkey? People won't remember him, and will easily forget everything he did for the industry.
Don't know. He's just some dude running a business, and if AI brings in more cash than videogames, then AI it is. It sucks for us, but it is what it is.

Maybe someone else can take its place in the long term: AMD, Intel, or some Chinese company. Or maybe the AI stuff ends up blowing up and things go back to the old normal. Who knows.
 
5090 and 4090 owner here. I wish there were higher cards I could buy. Price is no object to some of us. However, I do agree games should be developed for mid-tier specs because not being able to get 4K60 on a 5090 in modern games feels very bad, let alone the 4K120 I actually want for my display.
 
We've gone from passionate people trying to make the most out of the hardware to suits trying to make the most out of the budget.

No time for optimization, no time for experiments and no time for passion.


Some of them look better. But some (for my tastes, lots of them) look worse than Crysis, and I'd put Monster Hunter Wilds as the prime example.

This is a very personal opinion, but for my tastes modern games have a lot of shit on screen which, when paired with low resolutions + reconstruction, makes for some real blurry and ugly graphics. There's also a lot of use of volumetric fog, or as some people call it, the "anti-soul gas". Shit sucks.

It's like the new bloom, an ugly trend that I don't like at all. It makes for real muted colors, and it's not even "realistic", nor do I see any improvement in style or visuals.
The PS4 was the first console made with developers in mind; fast forward 10 years and it turns out the devs themselves don't know what they're doing or what they want. In my opinion it's better not just to hold these people accountable; instead, make them learn by doing, like in any known trade, because you can't transfer knowledge without making these pupils real apprentices.
 
And that's the beauty of PC: any card can do whatever you want.

For me, my 9070 XT is a 4K card. It does it well, but I lean into FSR4 and run high settings instead of ultra (except textures) if I need to up the framerate. My target is 60-120fps.
I have a 9070 XT as well in my Bazzite build, and even with high textures, some games can't reach 60fps at 4K. For the games that support FSR4, it's not an issue. However, for the games that don't support FSR4, it's a big problem.
 
To me this would be pointless, because do you even notice big leaps? Like, I'm the type of person to buy a GPU and sit on it until the next console gen starts. I like to see how long a GPU can hold out, and it's more of a big jump when I get one. But different strokes for different folks.
I also like to see big numbers that make me feel a big difference, just like going from the 1070 to a 5070 Ti.

Got a 5070 Ti, 9800X3D, 2TB SSD, and 32GB of RAM, and I won't be getting a new PC until at least the RTX 8000 series, or I'll even try to squeeze through to the 9000 series if all goes well enough.

Buying a top-end GPU isn't always for now; it's so you can still play games at their best settings 5 years from now, which is very apt at the moment with the memory situation. GPUs are too expensive now to keep paying for a new one every 18 months or so; even the mid-range cards are a thousand quid and going higher anyway. That's why I got a 5090. In 2014 I paid 400 for a 980, the top card at the time. Prices are crazy now.
Same. My main goal for my new 5070 Ti is mainly to prepare for GTA VI, The Witcher 4, Elder Scrolls VI, Fallout 5, Half-Life 3, and all the other demanding games at 1440p over the next 5+ years.

And it also lets me play all the games I couldn't play before on my old 1070, at max settings and much higher frame rates at the same time, so this will also make the wait a lot more enjoyable in the meantime.
 