
Let's be real with GPU increases, especially the 5090.

In all my decades of PC gaming, I've never said "damn, I don't know what to do with all this performance."

There's never been a time where I could run all my games at all the settings I wanted with excellent performance.

$2k for a GPU is also absurd.
Ah ... I remember those simpler times. My dad got me and my brother a 486 Intel PC at ... 60 MHz, I think? It could run anything and everything perfectly. That was around ... the early 90s. And he got us a 200 MB HDD and we could install all our games.
 
To me this would be pointless. Do you actually notice big leaps?
Honestly, not really. I purchase because I can, and I want to future-proof as much as possible.

Generational leaps are getting smaller and smaller. In reality, I could probably run every new game on a 3090 at max settings and not really notice a difference between it and running the game on a 5090, unless I looked at a fps counter.
 
I have a 9070 XT as well in my Bazzite build, and even with high textures in some games, it cannot reach 60fps at 4K. For the games that support FSR 4, it's not an issue. However, in the games that don't support FSR 4, it's a big problem.
Which games are you struggling with?
 
I'm happy I built a PC in the summer of 2023 with an RTX 4090, but my only regret is not getting 128GB of DDR5 RAM. I have 64GB. I do a lot of AI stuff, video production, and 3D. I wanted to build a dedicated workstation, but now I'm stuck due to RAM prices :/

Thankfully though, I can still use my machine locally for AI gen, because the 4090 is a beast.
You are actually saving money with that rig if you are doing AI.

Most of us have to fucking rent to get the cards with the VRAM we need. You did well picking it up when you did.
 
Ah ... I remember those simpler times. My dad got me and my brother a 486 Intel PC at ... 60 MHz, I think? It could run anything and everything perfectly. That was around ... the early 90s. And he got us a 200 MB HDD and we could install all our games.
First graphic card for me was the Nvidia GTX 260.

Ran everything well, from Doom 3, Half-Life 2, Crysis, and World of Warcraft, and even ran Resident Evil 6 decently. The only game I had trouble with was TERA Online, which was about the most demanding MMORPG at the time, with much better graphics than WoW, and it only had problems during portions where hundreds of players were ganging up on world event bosses.

Stuff was actually optimized and fairly priced back then, so it was a win-win for everyone, compared to now where it's both overpriced and underpowered for its price. Everything was great all the way through the GTX 1000 series and started going downhill from there.
 
The 5080 is a 1440p card. It is not a 4K card. At 4K max settings, you will not hit 60fps in a lot of games, especially Unreal Engine games. In path tracing, the 5080 is a 1080p card. As for the 5090, in path tracing, it's a 1440p card.

Keep in mind, I define the card by what it will run at native resolution with no DLSS or FG.
Not even the 5090 can achieve 4K at 60 fps in every game. Does that mean the 5090 is a 1440p, or even 1080p, card in ARK: Survival Ascended? Can you see how silly that sounds? If we consider the 5090 to be a 1080p card, then what about the 4060/3070? Are those 120p cards? Dude, can't you see how ridiculous your logic is?

The RTX 5090 is an awesome 4K card. It can run most games at this resolution with ease at over 120fps. Even my 4080S can run most of the games in my library at native 4K and well over 60 fps. In fact, both my 4080S and 5080 were marketed as 4K cards by Nvidia and reviewers (my card averaged 95fps at 4K when I bought it).
Yes, some of the latest games require me to do some tweaking to achieve a high refresh rate on 4K displays, but that doesn't matter, especially with AI technologies. I paid for Tensor cores, and if I didn't plan to use them I would have bought a much cheaper AMD card. Thanks to AI, I can run even UE5 games at 4K with 120-170 fps, and I've proved that many times with my comparisons.

The RTX 5090 averages 150fps at 4K native in most games. I doubt there's a single 5090 owner who considers their card 1440p or 1080p based on the few extreme examples that could be counted on one hand. With DLSS Quality and FG (let alone MFG), it can be considered even an 8K card IMO.

[attached chart: average fps at 3840×2160]
 
It is what it is I guess. Makes me appreciate well optimized games more at least. For example KCD2 ran like a dream for me.

This is exactly why I've always thought The Game Awards should have a "Most Optimized at Launch" category. Considering how much launch performance gets discussed, it's kind of insane the category doesn't exist already.
 
With prices increasing across the board for computer components, the average consumer is just not going to be able to partake in PC gaming. Period. At least when it comes to owning a good-to-great gaming PC. I really don't know what the future holds, it's too hard to tell ATM, but man, I hope something changes, because this hobby is quickly reaching insane cost levels.

I updated my rig in 2022/2023, and I'm beyond glad I did. Because I'm almost positive that the total I spent would double nowadays.

Yeah, IMO games should run much better than they do, especially UE5 games where devs didn't take the time to optimize.

It is what it is I guess. Makes me appreciate well optimized games more at least. For example KCD2 ran like a dream for me.
For real! I'd like to think that's gonna be something we see improved in the near future, especially with GPUs and components getting more and more expensive. Better optimization would certainly be beneficial for all players, especially those without brand new-ish / decent GPUs.

I still can't believe a game like Battlefield 6 can run with everything cranked, is buttery smooth, and my system is silent. Then you have games like Black Ops 7 / Warzone (or almost any UE5 title) that are anything but silent. Too many studios have dragged their feet and relied on things like frame generation and powerful GPUs. There's no excuse for that anymore.
 
Guys the price increases seem to be actually happening?


Not only is Amazon now charging $4199 for the Asus Astral 5090 (shipped and sold by Amazon, not a 3rd party), but you can't even buy one unless you're an Amazon Prime member.

My local microcenter and newegg are both sold out.
We're cooked.
 
Guys the price increases seem to be actually happening?


Not only is Amazon now charging $4199 for the Asus Astral 5090 (shipped and sold by Amazon, not a 3rd party), but you can't even buy one unless you're an Amazon Prime member.

My local microcenter and newegg are both sold out.
We're cooked.

Prices in Europe seem to be 3000-3300€ now.
They started at 2799€ when I checked a few days ago.

People need to decide for themselves if it is worth it, but for me I just divide the cost of the GPU by the number of games (that would genuinely require a graphics card like a 5090) I play during the usual life cycle of a GPU, and then I decide based on that whether I think it is wise.
For me that would mean maybe 1-2 graphically demanding games a year that are made better by such an expensive card, which works out to roughly $500 extra per game.
Personally, it is not worth it to me. I don't want to pay $500 or more per game.
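That back-of-the-envelope math can be sketched in a few lines. The prices, game counts, and upgrade cycle here are illustrative assumptions, not figures from the thread:

```python
# Rough cost-per-game estimate for a flagship GPU, following the
# "divide the card's cost by the demanding games that actually benefit"
# method above. All inputs are illustrative assumptions.
gpu_price = 3000.0            # assumed 5090 street price, in dollars
demanding_games_per_year = 1.5
gpu_cycle_years = 4           # assumed years before the next upgrade

games_in_cycle = demanding_games_per_year * gpu_cycle_years
cost_per_game = gpu_price / games_in_cycle
print(f"~${cost_per_game:.0f} extra per demanding game")  # ~$500
```

Swap in your own numbers; the conclusion flips quickly if you play many demanding games or keep the card longer.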
 
Not even the 5090 can achieve 4K at 60 fps in every game. Does that mean the 5090 is a 1440p, or even 1080p, card in ARK: Survival Ascended? Can you see how silly that sounds? If we consider the 5090 to be a 1080p card, then what about the 4060/3070? Are those 120p cards? Dude, can't you see how ridiculous your logic is?

The RTX 5090 is an awesome 4K card. It can run most games at this resolution with ease at over 120fps. Even my 4080S can run most of the games in my library at native 4K and well over 60 fps. In fact, both my 4080S and 5080 were marketed as 4K cards by Nvidia and reviewers (my card averaged 95fps at 4K when I bought it).
Yes, some of the latest games require me to do some tweaking to achieve a high refresh rate on 4K displays, but that doesn't matter, especially with AI technologies. I paid for Tensor cores, and if I didn't plan to use them I would have bought a much cheaper AMD card. Thanks to AI, I can run even UE5 games at 4K with 120-170 fps, and I've proved that many times with my comparisons.

The RTX 5090 averages 150fps at 4K native in most games. I doubt there's a single 5090 owner who considers their card 1440p or 1080p based on the few extreme examples that could be counted on one hand. With DLSS Quality and FG (let alone MFG), it can be considered even an 8K card IMO.
Firstly, I own a 5090 and consider it a 1440p card. It cannot run path tracing at native 4K. It cannot even run RT and hold native 4K in some games. You don't buy a 5090 to play raster games at 4K; there are already lots of cards that can do that. You buy it to open up the possibilities of super-high-refresh gaming, path tracing, and high-quality ray tracing.

Secondly, read my original post again. I explicitly call out ray tracing and path tracing as the metrics I use in evaluating a card's performance. In fact, I'm sure you didn't comprehend what I wrote, because you referenced DLSS and MFG, features which I explicitly excluded in my original post.
 
Best thing you can do is switch hobbies, buy less powerful hardware, or wait for the bubble to implode.
 
As a guy crazy enough to buy a 3080 Ti for 2200€ in the middle of the crypto boom, I will likely be aiming at the 6080 next (unless it's terrible performance and value like the 5080 was vs. its neighbouring products, aka the 5070 Ti and 5090). I skipped the 4000 and 5000 gens because they weren't enough of an upgrade, except for the 5090, which had a crazy 575W TDP even on non-OC models. That was the reason I didn't want to purchase it, not its price.

TL;DR: give me a proper 24GB 6080 with a $2k price tag but under 400W TDP, and I'm biting for sure (yes, in hindsight I should have bought a 4090, but back then my 3080 Ti didn't feel weak/outdated yet since it wasn't even 1.5 years old; it will definitely feel that way in 2027 when the 6080/90 launches) xD
 
On the Steam hardware survey, the 4060 or lesser cards make up the vast majority. It is only the top 1% rocking 4090/5090 cards.
 
Firstly, I own a 5090 and consider it a 1440p card. It cannot run path tracing at native 4K. It cannot even run RT and hold native 4K in some games. You don't buy a 5090 to play raster games at 4K; there are already lots of cards that can do that. You buy it to open up the possibilities of super-high-refresh gaming, path tracing, and high-quality ray tracing.

Secondly, read my original post again. I explicitly call out ray tracing and path tracing as the metrics I use in evaluating a card's performance. In fact, I'm sure you didn't comprehend what I wrote, because you referenced DLSS and MFG,
Using your own logic, let's ignore how the vast majority of ray tracing games run and focus on one game that doesn't run well. ARK: Survival Ascended runs at 45-60 fps at 1440p with 50% resolution scale. This means the 5090 isn't even a 1440p card. I'm sure even 1080p would dip below 60 fps, given how poorly the game runs. Damn dude, you paid $3000 for a sub-1080p card.

 
Using your own logic, let's ignore how the vast majority of ray tracing games run and focus on one game that doesn't run well. ARK: Survival Ascended runs at 45-60 fps at 1440p with 50% resolution scale. This means the 5090 isn't even a 1440p card. I'm sure even 1080p would dip below 60 fps, given how poorly the game runs. Damn dude, you paid $3000 for a sub-1080p card.
You couldn't have picked a more wrong example if you tried. The video game industry is in the process of a technological transition: the transition from raster to RT. All I'm doing is measuring the cards by the era we're trying to transition into. RT is the future, and the 5090 represents the first card that can deliver that future at native 1440p 60fps. It cannot deliver that future at native 4K at 60fps, and therefore it is not a 4K card for RT. The 4090 was the first card that could deliver that future at native 1080p 60fps. Technologies like DLSS only exist to bridge the gap.

We already mastered the technologies of the past. If I wanted to play a raster game at 4K, I wouldn't need a 5090 for that. The only thing ARK represents is a poorly optimized game with bad developers. It doesn't represent a paradigm shift in thinking, technology, or approach to game development. It only serves as a reference for what not to do.
 
When playing Indiana Jones in 4K with full path tracing and DLSS on Balanced, the game barely hits 60fps, so yeah... path tracing is rough! (No frame gen.)

I'm playing with path tracing on medium and get around 75-100 fps, which is OK. Not a huge difference in visuals.

The Astral 5090 I'm using is at 83% power limit and 950mV @ 2745MHz in GPU Tweak, though. Scared to push it and risk the melting issue, which is the downside with these cards.

Edit: the fps got a lot higher after the first jungle area (just started playing the game).
 
You couldn't have picked a more wrong example if you tried. The video game industry is in the process of a technological transition: the transition from raster to RT. All I'm doing is measuring the cards by the era we're trying to transition into. RT is the future, and the 5090 represents the first card that can deliver that future at native 1440p 60fps. It cannot deliver that future at native 4K at 60fps, and therefore it is not a 4K card for RT. The 4090 was the first card that could deliver that future at native 1080p 60fps. Technologies like DLSS only exist to bridge the gap.

We already mastered the technologies of the past. If I wanted to play a raster game at 4K, I wouldn't need a 5090 for that. The only thing ARK represents is a poorly optimized game with bad developers. It doesn't represent a paradigm shift in thinking, technology, or approach to game development. It only serves as a reference for what not to do.
RT is a broad term. Games use hybrid RT, PT, and even software ray tracing (Lumen, SVOGI, etc.). Many hybrid RT games run extremely well even at 4K native on the RTX 5090.

Metro Exodus EE - 140fps.



Resident Evil 2 Remake - 240fps



Pragmata (demo) 140fps



Robocop (UE5) uses real-time GI (Lumen), VSM, and Nanite. This video was recorded with Shadowplay, resulting in around 10% performance loss, yet there's still around 100 fps at 4K DLAA (and keep in mind DLAA also takes a few fps compared to TAA native).



I could show many more examples, but my point is that games use a variety of RT effects, and on top of that, RT at different scales. Games with hybrid RT on a small scale usually run well at 4K native on the RTX 5090. You would have to play extremely heavy PT games or the most unoptimized UE5 games to push the RTX 5090 to its limits at 4K native.

We are transitioning from raster to RT games, but we are also transitioning from the native resolution era to the AI era. Some tech YouTubers have even concluded that DLSS has made native resolution a thing of the past. Even at 77% scale, DLSS Ultra Quality provides a significant performance boost in RT games while looking indistinguishable from DLAA and far superior to old TAA.

More than a year ago, I made a DLSS comparison in Black Myth: Wukong, and half the NeoGAFers couldn't even decide which shot was DLSS at 50% resolution scale and which was DLAA. If not for Nanite details that scale with internal resolution, not a single person would tell the difference. It's even harder to tell the difference between DLSS at 77% resolution scale and DLAA. Insisting on playing the most demanding games (especially PT) at 4K DLAA is absurd considering the performance cost. The RTX 5090 can run these extremely demanding RT games with 4K image clarity and a high refresh rate thanks to AI technology (Tensor cores and DLSS). I would never buy or recommend this card for a 1440p monitor (let alone a 1080p monitor, especially for ARK: Survival Ascended).
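For context on those percentages: an upscaler's resolution scale applies per axis, so the actual pixel count falls off with the square of the scale. A quick sketch, where the preset names and scale factors are the commonly cited ones and should be treated as assumptions:

```python
def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a per-axis upscaler scale factor."""
    return round(out_w * scale), round(out_h * scale)

# Commonly cited DLSS scale factors (assumed here for illustration).
presets = [("DLAA", 1.0), ("Ultra Quality", 0.77),
           ("Quality", 0.667), ("Performance", 0.5)]

for name, scale in presets:
    w, h = internal_resolution(3840, 2160, scale)
    print(f"{name:>13}: {w}x{h} ({scale * scale:.0%} of the output pixels)")
```

This is why "50% resolution scale" at a 4K output means the GPU is really rendering 1920x1080, i.e. only a quarter of the pixels, before the AI reconstruction step.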
 
I'm not talking about hybrid RT.
So, if you want to talk about PT games specifically, how many do you know? There aren't that many if we don't count mods for older games. Developers usually add PT just to push the limits of GPUs and future-proof their games. Do you expect to max out these PT games on current hardware at 4K native?
 
So, if you want to talk about PT games specifically, how many do you know? There aren't that many if we don't count mods for older games. Developers usually add PT just to push the limits of GPUs and future-proof their games. Do you expect to max out these PT games on current hardware at 4K native?
In about 2-3 generations, we should be able to max these games out at 4K native. That's why, for these workloads, it's a 1440p card. Outside these PT workloads, there's no point in even considering a 5090 other than for work. A 5080 or 5070 Ti will let you play at 4K at a fraction of the cost.

In terms of the number of games, there are a lot of path tracing games where the 5090 drops below 60fps at 4K. There are also many UE5 games and raster games like Microsoft Flight Simulator where the 5090 drops below 60fps at 4K.
 
You don't know what you're talking about and don't understand CPU limits either.
You must be clowning, right? Surely no one can be this obtuse? What do CPU limits have to do with this:

[attached benchmark screenshots]


If you don't have a 5090, please don't talk. You don't know what you're talking about. I use mine every day and can clearly see it dropping below 60fps at 4K in some games without DLSS.
 
You must be clowning, right? Surely no one can be this obtuse? What do CPU limits have to do with this:

[attached benchmark screenshots]


If you don't have a 5090, please don't talk. You don't know what you're talking about. I use mine every day and can clearly see it dropping below 60fps at 4K in some games without DLSS.
Yeah, I don't have a 5090, check #1 spot for reference and beat it: https://www.neogaf.com/threads/2026-official-neogaf-3dmark-port-royal-benchmark-wars.1691881/

Clearly you don't understand CPU limits, because you think Flight Sim is GPU-limited. And this "no DLSS" purism is just insane to me. We are talking about real-time ray tracing at ultra-high resolutions here, buddy.
 
In about 2-3 generations, we should be able to max these games out at 4K native. That's why, for these workloads, it's a 1440p card. Outside these PT workloads, there's no point in even considering a 5090 other than for work. A 5080 or 5070 Ti will let you play at 4K at a fraction of the cost.

In terms of the number of games, there are a lot of path tracing games where the 5090 drops below 60fps at 4K. There are also many UE5 games and raster games like Microsoft Flight Simulator where the 5090 drops below 60fps at 4K.
Flight Simulator is certainly CPU-limited, at least with the settings maxed out. Check out this video. The game often dips below 60 fps at 4K DLAA, but the GPU isn't maxed out.



As for the UE5 games, you mean the ones that are unoptimized. I already showed you that the RTX 5090 can run the best-optimized UE5 games at around 100 fps, even at 4K DLAA. With high or very high settings instead of Epic, most UE5 games can run at 4K DLAA 60fps on that card, as John from DSOG showed. With DLSS and FG, even a 240 Hz monitor can be maxed out. But let's pretend the RTX 5090 is a 1440p or even 1080p card because some games can push it to its limits at 4K with maxed-out settings.
 
*kisses my 4090*


By the time I'm ready to upgrade, when the 7090 comes around, prices will hopefully be more sane. If that's not the case, I'll have gotten so much out of my 4090 that I won't mind paying a small fortune for a 7090.
 
Great, then why are you talking? You jumped in halfway through the discussion, lacking context, and chimed in incorrectly? To what end?
And this "no DLSS" purism is just insane to me. We are talking about real time raytracing at ultra high resolutions here, buddy.
If you think I spent $4000 CAD to use DLSS, I'd say you're the one who has lost the plot. If I wanted to use DLSS in all my games, I'd buy a 5070 Ti and call it a day.
 
As for the UE5 games, you mean the ones that are unoptimized. I already showed you that the RTX 5090 can run the best-optimized UE5 games at around 100 fps, even at 4K DLAA. With high or very high settings instead of Epic, most UE5 games can run at 4K DLAA 60fps on that card, as John from DSOG showed. With DLSS and FG, even a 240 Hz monitor can be maxed out. But let's pretend the RTX 5090 is a 1440p or even 1080p card because some games can push it to its limits at 4K with maxed-out settings.
Which UE5 games are unoptimized? The ones that use Lumen and Nanite? The ones that run an older version of UE5? There are a bunch of them...
 
FalconPunch: "RTX 4090 is a 720p GPU"
I never said that, but sure... Cry all you want about it; the simple truth is that at 4K native with path tracing, heavy RT, and some UE5 games, the 5090 cannot deliver 60fps.

No matter how much you cry about it, the truth will remain unaltered. As it relates to heavy RT and path tracing, the 5090 is not a native 4K card. It's, at best, a 1440p card in those workloads.
 
I never said that, but sure... Cry all you want about it; the simple truth is that at 4K native with path tracing, heavy RT, and some UE5 games, the 5090 cannot deliver 60fps.



No matter how much you cry about it, the truth will remain unaltered. As it relates to heavy RT and path tracing, the 5090 is not a native 4K card. It's, at best, a 1440p card in those workloads.
You are of course free to your own arbitrary standards, but you do realize that the vast majority of people enjoy DLSS a lot?
In terms of edge resolution, it completely destroys native. And it keeps getting better and better; see the release of 4.5 yesterday. I think the 4090 was the first true 4K GPU and the 5090 is the icing on the cake.
 
You are of course free to your own arbitrary standards, but you do realize that the vast majority of people enjoy DLSS a lot?
I do realize that a lot of people enjoy it. However, I prefaced my very first statement regarding the 5090 in this thread by saying the following:
The 5080 is a 1440p card. It is not a 4K card. At 4K max settings, you will not hit 60fps in a lot of games, especially Unreal Engine games. In path tracing, the 5080 is a 1080p card. As for the 5090, in path tracing, it's a 1440p card.

Keep in mind, I define the card by what it will run at native resolution with no DLSS or FG.

In terms of edge resolution, it completely destroys native. And it keeps getting better and better; see the release of 4.5 yesterday. I think the 4090 was the first true 4K GPU and the 5090 is the icing on the cake.
DLSS has a lot of artifacts. In some games it works really well, while in others it doesn't. Different aspects of image quality and graphics in general annoy different people. There's no one-size-fits-all.

As for DLSS 4.5, it's not universally better than DLSS 4 and adds a lot of artificial sharpening. In certain games it's much, much worse, at least from my testing. And if you're on an older GPU, the computational cost is much heavier than DLSS 4's.
 
Which UE5 games are unoptimized? The ones that use Lumen and Nanite? The ones that run an older version of UE5? There are a bunch of them...
For sure: ARK: Survival Ascended, Oblivion, and maybe even Borderlands 4 (although I really like that game). Even my RTX 4080S can run some UE5 games at 4K and still get 60 fps at high settings (some even at maxed-out settings, like Assetto Corsa Rally). The RTX 5090 is twice as fast. John from DSOG tested almost all UE5 games on his site, and the 5090 could deliver 60 fps in the vast majority, at worst with just minor adjustments (very high or high details instead of Epic).
 
For sure: ARK: Survival Ascended, Oblivion, and maybe even Borderlands 4 (although I really like that game). Even my RTX 4080S can run some UE5 games at 4K and still get 60 fps at high settings (some even at maxed-out settings, like Assetto Corsa Rally).
I didn't show any of those games in the screenshots I posted, though. The 4080S is a good card. I used to have one. Went from 4090 -> 4080 Super -> 5090.
The RTX 5090 is twice as fast. John from DSOG tested almost all UE5 games on his site, and the 5090 could deliver 60 fps in the vast majority, at worst with just minor adjustments (very high or high details instead of Epic).
Twice as fast? Last I checked on TechPowerUp it was 83% faster, so things might have changed. Look at how expensive the 5090 is; the idea of dropping settings at such a price premium is a tad ridiculous to me.

That being said, everyone has their preferences.
 
Yeah and I heavily disagree with that statement but you do you.
I mean, you're free to; it's a free world. However, like I've said from the start, I'm judging the GPU on compute, not on its ability to generate pixels using a machine-learning algorithm.

That's all DLSS is: you reduce the workload of the GPU and generate/infer the missing pixels via machine learning. If I wanted to measure that type of workload/performance, I'd just refer to this instead:

[attached benchmark chart]
 