
RTX 5090 Review Thread

Very interesting:

[chart: 5090 performance scaling vs. power limit]



Only losing ~5% performance going from 575 Watts to 450 Watts on a 5090.


This could be a wonderful card for undervolting.
Does this mean I could get away with a quality 850W PSU (which I already have) and an undervolted 9800X3D?
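For a rough sanity check, here's a back-of-envelope sketch in Python. Every wattage below is an assumption, not a measured figure from the reviews, and it ignores transient spikes, which on recent high-end GPUs can briefly exceed the sustained limit:

```python
# Back-of-envelope PSU headroom check. All wattages are assumptions,
# not measurements: a 5090 power-limited from 575 W down to 450 W,
# an undervolted 9800X3D around 120 W under gaming load, and ~100 W
# for motherboard, RAM, SSDs, and fans.
GPU_POWER = 450
CPU_POWER = 120
REST_OF_SYSTEM = 100
PSU_CAPACITY = 850

total_draw = GPU_POWER + CPU_POWER + REST_OF_SYSTEM
headroom = PSU_CAPACITY - total_draw
print(f"Estimated load: {total_draw} W, headroom: {headroom} W "
      f"({total_draw / PSU_CAPACITY:.0%} of PSU capacity)")
```

Around 670 W on an 850 W unit leaves a reasonable margin under these assumptions; the power cap itself can be applied with `nvidia-smi -pl 450` (needs admin rights), though proper undervolting is done on the voltage/frequency curve in a tool like MSI Afterburner.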
 
A high price and a lot of watts for a not-too-big jump in performance, plus a lot of coil whine. No thanks, that's my conclusion. But either way, I was never their target group to begin with.
 

Rickyiez

Member


Optimum with another awesome video.
Power consumption to performance ratio is absurd.

Banger video, love his visual presentation on the charts and the smoke coming out of the GPU itself. You can feel that he put so much effort into this compared to lazy-ass Nexus with their ugly-ass graphs. Optimum's video is lacking RT benchmarks though.
 

peish

Member
As a budget-conscious PCMR :messenger_bicep::messenger_halo:, this deflates my stiffness to pay scalpers for a 5090.

The excess heat and power mean less room for tweaking and overclocking.

I can smell a 5090 Ti at the end of the year on a refined 3nm process.
 

FingerBang

Member
I believe the embargo for the 5080 is actually the 29th, the day before launch.
That's correct. And looking at how accurate the rumors were for the 5090, it's fair to expect an average of 10-15% for the 5080. Definitely not worth upgrading to from a 4080, but a big jump for people with a 3080 or 4070.
Which Youtube face should I click on?
The one that makes you want to punch him less.
 

GHG

Member
This is an Intel style upgrade.

The performance uplift is largely in line with the power consumption, temperature and noise increases.

Disgusting that they are charging more than the predecessor for this.
 

FingerBang

Member
This is an Intel style upgrade.

The performance uplift is largely in line with the power consumption, temperature and noise increases.

Disgusting that they are charging more than the predecessor for this.
It's not disgusting. They are keeping the same margins as before, given how much more expensive the card is to make.
The sad part is that there is no real advancement in IPC or efficiency, so the only way they could offer more performance was to throw more silicon and watts at it.

Could they have reduced their margins? Of course. But why would they do so in the current market? Chips are scarce resources with alternative uses. It's either this or, you know, AMD.
 

RCX

Member
How much of the performance gain is just coming from the ever-increasing power draw?

Frame gen really is a sticking plaster for the lack of genuine progress over the last few years. Moore's Law really is dead.
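You can ballpark that from the board power limits alone. A minimal sketch, using the stock 450 W / 575 W limits and the ~30% average raster uplift quoted in this thread (round numbers, not measurements):

```python
# Rough perf-per-watt comparison: stock board power limits and the
# ~30% average raster uplift cited in this thread (round numbers).
power_4090, power_5090 = 450, 575
perf_4090, perf_5090 = 1.00, 1.30

change = (perf_5090 / power_5090) / (perf_4090 / power_4090) - 1
print(f"Perf/W change, 4090 -> 5090: {change:+.1%}")  # about +1.7%
```

In other words, efficiency is nearly flat gen-on-gen, which is exactly what the "uplift in line with power draw" complaints above are pointing at.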
 

DirtInUrEye

Member
Banger video, love his visual presentation on the charts and the smoke coming out of the GPU itself. You can feel that he put so much effort into this compared to lazy-ass Nexus with their ugly-ass graphs. Optimum's video is lacking RT benchmarks though.

Nexus' chart format makes me bleed from my left ear when I'm deciphering it. I just watch HUB instead these days. I love Steve and his mistakes, bless him.
 
With the 5090 you get a roughly 1:1 price/performance increase. With the Pro you are paying 75% to 2x more (depending on the place) for UP to a 45% performance increase (average ~30%).

The Pro is a shitty deal; the 5090 is a very mediocre deal.
The Pro is way more than 30%. Some games have 60% better resolution. Some big games only get 30% more frames, but that's likely caused by a bandwidth/CPU bottleneck. Games with DRS often get 50% resolution improvements. On average we should get what Sony said: 45%.
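To put numbers on "shitty deal vs. mediocre deal", here is a small sketch. The 25% price jump assumes the $1,599 to $1,999 MSRPs; the Pro premiums and uplifts are the round figures from the exchange above:

```python
# Performance gained per unit of extra money spent.
# >1.0: performance grows faster than price; <1.0: you pay more per frame.
def value_ratio(price_increase: float, perf_increase: float) -> float:
    return (1 + perf_increase) / (1 + price_increase)

print(f"4090 -> 5090 (+25% price, +30% perf): {value_ratio(0.25, 0.30):.2f}")
print(f"PS5 -> Pro (+75% price, +30% perf):   {value_ratio(0.75, 0.30):.2f}")
print(f"PS5 -> Pro (2x price, +45% perf):     {value_ratio(1.00, 0.45):.2f}")
```

That lands at roughly 1.04, 0.74, and 0.73: the 5090 is barely break-even, and the Pro comes out worse under either premium.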
 

Buggy Loop

Gold Member
The card is basically an RTX 4090 Ti with some improvements, on the same process node.

The 4090 took a while to flex its architectural advantages.

It wasn't evident until Cyberpunk 2077's Overdrive mode.

The 5090 is made for neural networks.

Alan Wake 2 with RTX geometry should already give a glimpse. Until path tracing switches to neural radiance cache, and games use neural shaders, mesh shaders, neural skin, neural this, neural that, benchmarks won't show what's under the hood.

Is it time to upgrade? Probably not. Hardly any games besides Alan Wake 2, with its day-0 patch, are using any of the new tech.

Raster is a dead end. The whole rendering pipeline is changing, slowly but surely.
 

Mozart36

Neo Member
Performance is good. Price (in my country) is not. And the coil whine mentioned in a lot of reviews scares me. I'm gonna wait. But good hunting for those who want one on release :messenger_winking:
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
Wow, disappointing. When was the last time a gen-on-gen improvement was so low?

The most surprising thing is that the improvements in raw frame rate (without frame generation) are so modest even though the 5090 has 30% more CUDA cores and 80% more memory bandwidth than the 4090.
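The public spec sheet makes that gap concrete. A quick sketch putting the published figures (21,760 vs 16,384 CUDA cores, ~1,792 vs ~1,008 GB/s) next to the ~30% raster uplift the reviews are reporting:

```python
# 5090 vs 4090 raw spec increases next to the observed raster uplift.
cuda_ratio = 21760 / 16384       # CUDA cores
bw_ratio = 1792 / 1008           # memory bandwidth, GB/s
observed = 1.30                  # ~30% average uplift per the reviews

print(f"CUDA cores: +{cuda_ratio - 1:.0%}")   # +33%
print(f"Bandwidth:  +{bw_ratio - 1:.0%}")     # +78%
print(f"Observed:   +{observed - 1:.0%}")     # below even the core-count gain
```

So the card isn't even scaling linearly with its extra cores, let alone the extra bandwidth.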
 
What a joke. I’m glad I got my 4090 when I did. Will be skipping the 50 series. Hopefully the 60 series or 70 series provide more meaningful gains with slightly less greed from Nvidia (lol, less greed is not happening)
 

Fess

Member
$2000 to upgrade the 2017 TV and get a modern one with VRR, so the sub-60fps dips on a 4080 Super are no issue

vs

$2000 to upgrade the graphics card so it stays above 60fps even at the 1% lows, so using the 2017 TV without VRR is no issue

[Confused Curb Your Enthusiasm GIF]
 
In performance?? For "all" games??
Well, on PS5 Pro we often can't change settings to test purely the GPU, as most games are CPU limited. Some games that are not CPU limited in certain modes (like MH Wilds in graphics mode) do get about 45% more frames. To test only the GPU, games with DRS are a good test, and there we usually get quite a bit more, up to 70%. On PC you can easily use a high-end CPU and test the GPU only, and there 30% is quite bad.
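The DRS test works because at a locked frame rate the GPU renders proportionally more pixels. A sketch with made-up example resolutions (not figures from any specific game):

```python
# Treating a DRS resolution increase as a proxy for GPU throughput.
# Both resolutions are hypothetical examples, not measured values.
base = (1440, 810)    # hypothetical base-console internal resolution
pro = (1800, 1013)    # hypothetical Pro internal resolution, same scene

def pixels(res):
    return res[0] * res[1]

uplift = pixels(pro) / pixels(base) - 1
print(f"Pixel-throughput uplift: +{uplift:.0%}")  # ~+56% in this example
```

One wrinkle: "X% better resolution" sometimes means per-axis and sometimes total pixels, and the two differ a lot, so these comparisons need care.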
 

Bojji

Member
Wow, disappointing. When was the last time a gen-on-gen improvement was so low?

The 980 in 2014: the 780 Ti was at 93% of its performance:

[chart: relative performance, 1920×1080]

The 1080 was the fastest in 2016: the 980 Ti at 73%:

[chart: relative performance, 2560×1440]

The 2080 Ti in 2018: the 1080 Ti at 72%:

[chart: relative performance, 3840×2160]

The 3090 in 2020: the 2080 Ti at 64%:

[chart: relative performance, 3840×2160]

The 5090 in 2025: the 4090 at 74%:

[chart: relative performance, 3840×2160]

The 980 Ti and 1080 Ti launched much later, so on launch day people were getting this ^

3090 Ti to 4090 was one of the biggest jumps of the last decade for sure:

[chart: relative performance, 3840×2160]
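Reading those "older card at X% of the new one" figures the other way, as generational uplift (only the percentages above go in):

```python
# Convert "previous card is X% of the new one" into generational uplift.
gens = {
    "780 Ti -> 980 (2014)":      0.93,
    "980 Ti -> 1080 (2016)":     0.73,
    "1080 Ti -> 2080 Ti (2018)": 0.72,
    "2080 Ti -> 3090 (2020)":    0.64,
    "4090 -> 5090 (2025)":       0.74,
}
for gen, share in gens.items():
    print(f"{gen}: +{1 / share - 1:.0%}")
```

That works out to roughly +8%, +37%, +39%, +56%, and +35%: by launch-day flagship comparisons, the 5090's jump is ordinary, and only the 980 was a smaller step.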
 

StereoVsn

Gold Member
Saw the HUB and GN reviews. TL;DR: this is like a 4090 Ti. Improvements in raster in games are between 20-30 percent.

DLSS and RT at 4K can get a bit higher. FG is a different animal, and I need to watch more in-depth image quality/latency reviews. HUB will have a dedicated one soon.

Man, the 5080 is really going to be a stinker considering the very small hardware bump from the 4080.

I will probably try to get a 5090 FE, fail, and see if I can get a 5070 Ti down the road to upgrade from my 3080 Ti.
 