
RTX 5080 | Review Thread

KungFucius

King Snowflake
Thank God. I can keep my 4090 and relax. This minor upgrade is exactly what I was hoping for, because we don't need to keep upgrading; 4K is basically all we need.

The 5090 is a bad upgrade from the 4090: 25% more expensive for ~40% more performance, arriving 2+ years later. The prices are absurd too. Nvidia is basically fucking over its AIB partners, forcing them to charge 15-25% over MSRP for minor tweaks.
 
Fun meme, but we know for sure the 5090 is solid and will fly off the shelves. Top-end PC buyers aren't price sensitive at all, and they get roughly a 35% increase over the previous best-in-slot card, the 4090, which even in January still started at $2,500 for the basic, worst-cooling models ;)
The 5090 will sell every model even at well over $3k street price, which it is even in the US; outside the US, many models of that card will be close to $4k ;X
Well the 5090 is certainly an improvement over the 4090 in performance, but it's also more expensive and commands a much higher total power draw. I'll be similarly unimpressed if the 6090 increases perf/price/TDP by a combined 30%.

On the whole, the 5000 series is very much a sidegrade from the 4000 series.
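To put the "25% more expensive for ~40% improvement" framing in perspective, here's a quick sketch of the perf-per-dollar math, using only the rough percentages quoted in the posts above (actual prices and uplifts vary by model and game):

```python
# Relative perf-per-dollar of one card vs another, given how much
# faster and how much pricier it is (as fractions, e.g. 0.40 = 40%).
def perf_per_dollar_gain(perf_uplift: float, price_uplift: float) -> float:
    """Return the fractional change in performance per dollar."""
    return (1 + perf_uplift) / (1 + price_uplift) - 1

# 5090 vs 4090 per the figures above: ~40% faster, ~25% pricier.
gain = perf_per_dollar_gain(0.40, 0.25)
print(f"perf/$ change: {gain:+.1%}")  # about +12% per dollar
```

So even taking the optimistic numbers at face value, two-plus years buys roughly 12% more performance per dollar at MSRP, and less at street prices.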
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
A real 5080 should beat the RTX 4090, so we can say the card disappoints, but the RTX 5080 still offers a nice performance boost compared to the RTX 4080 (and especially compared to the 4070 Ti 12GB, which was originally supposed to be the weaker RTX 4080).

My RTX 4080S would never come close to the RTX 4090, even with OC. The RTX 5080 gets close without OC and would probably beat the stock 4090 with a 12% OC.

dragon-age-veilguard-3840-2160.png

That's some serious cherry-picking. I'm guessing something went wrong, maybe they were CPU limited, because their 4090 is underperforming.
Just look at any other review, or even at TechPowerUp's own 1080p and 1440p results; something went wrong at 4K.



Dragon-4K-p.webp
 

PeteBull

Member
Well the 5090 is certainly an improvement over the 4090 in performance, but it's also more expensive and commands a much higher total power draw. I'll be similarly unimpressed if the 6090 increases perf/price/TDP by a combined 30%.

On the whole, the 5000 series is very much a sidegrade from the 4000 series.
The market decides that. The same way it decided two weeks ago that the 4090 was still worth $2,500, not the MSRP Nvidia quoted, it will decide (and kinda already has) that all 5090s start at $3k+ (yes, the limited Founders cards bought straight from Nvidia went for the lower MSRP, but most will be resold for $3k+ right after they're bought ;)
 
That's some serious cherry-picking. I'm guessing something went wrong, maybe they were CPU limited, because their 4090 is underperforming.
Just look at any other review, or even at TechPowerUp's own 1080p and 1440p results; something went wrong at 4K.



Dragon-4K-p.webp
In this benchmark, the results also look different compared to TechSpot's.

da3840.png


And why do the results look so different? You can't expect to see the same or even similar results if TechPowerUp used a different test location. One location can stress memory bandwidth, another the shader or RT cores. The relative difference can also change depending on the resolution.

I have seen more benchmarks where the 5080 is 25% faster than the 4080S.

KHyMTIi.jpeg


Youtuber "Frame chasers" also did interesting comparison between 4080S with max OC and the 5080 Max OC, and blackwell card was 18-25% faster. That's almost like the 4080 vs 4090 (the 4090 was 25% faster on average).

I'm trying to be objective. The 5080 underperforms relative to expectations, but it's still better than my RTX 4080S. It's not a big enough difference to make me want to upgrade (only a 2x faster GPU at the same price would make me consider it), but if I were upgrading from an older GPU (RTX 20 or 30 series), I would definitely want the RTX 5080.
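The comparison in that post can be laid out as a simple index. This is only a sketch: it treats the quoted uplifts as if they shared a baseline, which isn't strictly true (one figure is OC vs OC, the other stock vs stock):

```python
# Rough relative-performance index, using only figures quoted above.
# Treated as sharing one baseline purely for illustration.
BASELINE_4080S = 100
rtx4090 = BASELINE_4080S * 1.25          # "the 4090 was 25% faster on average"
rtx5080_oc = (BASELINE_4080S * 1.18,     # low end of the 18-25% OC-vs-OC range
              BASELINE_4080S * 1.25)     # high end of that range

print(f"4090: {rtx4090:.0f}, OC'd 5080: "
      f"{rtx5080_oc[0]:.0f}-{rtx5080_oc[1]:.0f}")
# At best the OC'd 5080 roughly ties the 4090's average -- nowhere near
# the "2x at the same price" upgrade threshold mentioned in the post.
```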
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Youtuber "Frame chasers" also did interesting comparison between 4080S with max OC and the 5080 Max OC, and blackwell card was 18-25% faster. That's almost like the 4080 vs 4090 (the 4090 was 25% faster on average).

I'm trying to be objective. The 5080 is underperforming from an expectation point of view, but it's still better than my RTX4080S. It's not a big difference that would make me want upgrading (only 2x faster GPU at the same price would make me consider upgrading), but if I would be upgrading from older GPUs (RTX 20 or 30) I would definitely want the RTX5080.

The best he got in-game was 18%.
The 4090 is in these screenshots too.

fDe6q98.png



In COD it was like 12%
TQXSPIn.png






Now, if you've got ~1000 bucks and need to buy a GPU, obviously you should go for the 5080; I'm not denying that.
But you can't deny this thing performs more like an AD102 4080 Ti than a whole new generation of xx80.
 
Last edited:

AFBT88

Member
Hopefully people will vote with their wallets. I won't touch any of these crap cards. It really is a 5070 labeled as a 5080. If you can't make an 80-class card at $1,000, just be man enough to call it a 5070 and give gamers a real 5080 at $1,200 or so. People aren't stupid enough to think it's a 5080 just because you called it a 5080!

We saw what happened when they tried to sell us the 4070 Ti as a 4080! LMAO! I just wish AMD weren't such a joke of a company.
 
Last edited:
Guys, get the MSI Vanguard RTX 5080 OC, then OC it again thanks to the good headroom, and you'll get 4090 performance for $1,000.

Like I said, the 5080 OCs like crazy and can match the stock RTX 4090 in certain games. It's still not very impressive given the expectations, but it's certainly a better card than my RTX 4080S.

Here's another video with OC results.

 
Last edited:
We've never had xx90 cards with such huge silicon differences from their xx80 counterparts before the 4090. Everyone who expected the 5080 to beat the 4090 without at least a full node shrink was nuts. And even WITH a node shrink from TSMC, I'd absolutely not bet on the 6080 beating the 5090, which is literally a monster at the die level.
5080 is garbage, nuff said.
Nq0ZyA9.jpeg
 

StereoVsn

Gold Member
It seems all RTX 5080s OC to 3200MHz (including the FE), not just the more expensive MSI Vanguard. Framechasers was even able to achieve 3250-3300MHz on the FE.
Yeah, I will pick it up if I can get it near MSRP in a month or two once the dust settles. It will still be a sizable upgrade from a 3080 Ti.

Edit: iOS auto correct is horrible. Siri, do better!
 
Last edited:
You can OC a 5080, but the VRAM is still 16GB.
It will make a difference when PS6 ports start coming to PC. My RTX 4080S has 16GB of VRAM and it's more than enough. I play games at 4K and most use between 9-12GB of VRAM. Very few games use more, mainly PT games that require some tweaks anyway, because you aren't going to play PT games at native 4K even on the RTX 5090.
 

Celcius

°Temp. member
It will make a difference when PS6 ports start coming to PC. My RTX 4080S has 16GB of VRAM and it's more than enough. I play games at 4K and most use between 9-12GB of VRAM. Very few games use more, mainly PT games that require some tweaks anyway, because you aren't going to play PT games at native 4K even on the RTX 5090.
You can with DLSS though.
Portal RTX at 4K with DLSS runs on my 3090, and I've seen over 20GB of VRAM usage. Looks great too.
Future games will only use more VRAM
 
Last edited:
You all realise the reason there's so much OC headroom is that the 5080S is gonna come out with those overclocked settings at stock.
The 5080 is the full, uncut GB203 die, so there's no more headroom, and they probably don't want to cut down GB202 for an 80-class GPU.
 
You all realise the reason there's so much OC headroom is that the 5080S is gonna come out with those overclocked settings at stock.
The 5080 is the full, uncut GB203 die, so there's no more headroom, and they probably don't want to cut down GB202 for an 80-class GPU.
It sure looks like an "artificial" placement to make room for an S or Ti. That kind of OC potential hasn't been seen among GPUs for a few gens.
But hey, I just bought a 5080 with a giant cooler, so I'll definitely try to use that potential and see what the silicon RNG gods have bestowed upon me.
 
The 5080 having 50% of the core of the 5090 has nothing to do with Moore’s Law being dead. This isn’t a 5080. It’s a 5070 Ti at best.
It is because making big chips is expensive, and there is a floor to how low they can price things. TSMC is charging an arm and a leg for N7, and they're charging that and a kidney for N4. Samsung 8nm was dirt cheap in comparison, and as a result, for the first time since 2013 (GTX 780), Nvidia made an xx80 card using the biggest chip of the lineup. Because 8nm was so affordable, Nvidia passed the savings down to consumers, selling the cut-down version for $700 as the 3080 and the higher-end version with more VRAM for $1,500 as the 3090.

Contrast this with what they did the gen before, in 2018, when they were using TSMC: the cut-down version was $1,000 (2080 Ti) and the higher-end version was $2,500 (Titan RTX). It was a night-and-day difference. Then they moved to TSMC N4; the gains were big, but the pricing skyrocketed to $1,600 for the cut-down big chip, with no higher-end version in the GeForce lineup.

As you can see, with each node leap things get more and more expensive. Hell, the reason Blackwell isn't on next-gen N3 is that it's so expensive, and the performance increase of N3 over N4 isn't high enough, so they'd rather use old N4 instead. Nvidia will likely always have a behemoth 600mm² chip, but the price of that chip keeps getting bigger and bigger. In 2010, Nvidia sold the fully enabled behemoth chip as the GTX 580 for $500. Last gen, not even the $6,700 RTX 6000 Ada was fully enabled, and the GeForce equivalent was $1,600. The GTX 560 (I loved this card) was a 332mm² chip that launched for $199 in 2011. The 5080 is a 378mm² chip that launched for $1,000 in 2025. Moore's Law and Dennard scaling are dead.
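The price-per-area trend in that post can be made concrete with the two data points it gives. A quick calculation in nominal launch dollars (no inflation adjustment, which would narrow but not erase the gap):

```python
# Launch price per mm² of die area, using only the figures quoted above:
# GTX 560 (2011): 332 mm² for $199; RTX 5080 (2025): 378 mm² for $1000.
cards = {
    "GTX 560 (2011)": (199, 332),
    "RTX 5080 (2025)": (1000, 378),
}
for name, (price_usd, area_mm2) in cards.items():
    print(f"{name}: ${price_usd / area_mm2:.2f} per mm²")

ratio = (1000 / 378) / (199 / 332)
print(f"~{ratio:.1f}x more dollars per mm² of silicon")
```

That's roughly $0.60/mm² then versus $2.65/mm² now, about a 4.4x jump in nominal terms, which is the shrinkflation point the next post makes.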
 
Like I said, the 5080 OCs like crazy and can match the stock RTX 4090 in certain games. It's still not very impressive given the expectations, but it's certainly a better card than my RTX 4080S.

Here's another video with OC results.


It's using more power than the 4090 when overclocked to match the 4090. In terms of stability, expect to cook your card if you do that, as you'll just get degradation over time. Nvidia, which traditionally hasn't left much overclocking headroom in the last two gens, suddenly decided to leave headroom on this card? Don't fall for the bait...
 
Last edited:
The 5080 having 50% of the core of the 5090 has nothing to do with Moore’s Law being dead.

It's like shrinkflation, basically. This doesn't impact the 5090-type parts, since they will just jack up the price as needed... until they can't.

Expect the difference to be even bigger with Rubin.
 

Chiggs

Gold Member
I'll be similarly unimpressed if the 6090 increases perf/price/TDP by a combined 30%.

The 6000 series will be on a different process node, meaning (theoretically) better power and heat, providing yet another reason to just skip the 5000 series if you can.

The 5000 series is basically Nvidia's version of the Core i9 14900K...and that's obviously not a good thing.
 
Last edited:
The 6000 series will be on a different process node, meaning (theoretically) better power and heat, providing yet another reason to just skip the 5000 series if you can.

About that... I figure the power savings will be invested in performance gains. I don't think NV will go above 600W, but maybe I'll end up being wrong on that.
 

Chiggs

Gold Member
About that... I figure the power savings will be invested in performance gains. I don't think NV will go above 600W, but maybe I'll end up being wrong on that.

I hope you're not. I think 600 watts is completely absurd. Anything beyond that is just outrageous.
 
Last edited:
It's using more power than the 4090 when overclocked to match the 4090. In terms of stability, expect to cook your card if you do that, as you'll just get degradation over time. Nvidia, which traditionally hasn't left much overclocking headroom in the last two gens, suddenly decided to leave headroom on this card? Don't fall for the bait...
Nonsense, at least according to the data we have so far.
The models tested needed <10% over standard power to reach a performance level within striking distance of the 4090 and its 450W TDP, and the temps of the parts at that point were actually below those of the FE model....

So unless the cases we've seen were "golden samples" (still totally possible), your opinion is based on thin air.
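The power arithmetic in that rebuttal is easy to check. One number here is an assumption not stated in the thread: the 5080's rated board power, taken as 360 W below.

```python
# Sanity check on the "<10% over standard power" claim above.
# ASSUMPTION: the RTX 5080's rated board power is 360 W (this figure
# is not given in the thread). The 4090's 450 W TDP is quoted above.
TDP_5080_W = 360   # assumed rated board power
TDP_4090_W = 450   # per the post

oc_power_w = TDP_5080_W * 1.10  # "<10% over standard power"
print(f"OC'd 5080: ~{oc_power_w:.0f} W vs 4090: {TDP_4090_W} W")
# ~396 W -- still comfortably under the 4090's rated 450 W,
# which is the rebuttal's point against the "more power" claim.
```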
 
Last edited:

//DEVIL//

Member
You can OC a 5080, but the VRAM is still 16GB.
Which is more than enough for 4K gaming, including all the ray tracing you want. As a matter of fact, if you're going to have DLSS enabled, it will even lower the VRAM requirement. Not sure what the issue is. There are no games now, or in the next two years, that will require anything above 16 gigs for 4K ultra gaming.
 
Last edited:

Dirk Benedict

Gold Member
Between this launch and DeepSeek, I hope Nvidia gets its shit 💩 packed in. I couldn't get the 5090; I think it was a blessing in disguise. I'm wondering if I should wait for new stock or for the Super/Ti variants… Nvidia is dragging the Ti name into the fucking ground. It used to be a great bang-for-your-buck option if you couldn't afford a high-end GPU but still wanted some decent gains from the current gen.

How disappointing. They've even forgotten who fucking propelled them into the limelight: the people who buy their GPUs for gaming.
 
Last edited:
You can with DLSS though.
Portal RTX at 4K with DLSS runs on my 3090, and I've seen over 20GB of VRAM usage. Looks great too.
Future games will only use more VRAM
Portal RTX uses around 10-12GB of VRAM on my RTX 4080. Games sometimes allocate more VRAM than necessary, but that doesn't mean 20GB of VRAM is really needed.

Future games will use over 16GB of VRAM, but those will be PS6 ports. Until then, the RTX 5080 will be fine; at worst, people will need to tweak settings a little in a couple of games. The difference between ultra texture settings and high settings is usually very small, or not noticeable at all.

For example, Indiana Jones with PT at 4K native is VRAM limited on my PC. However, with DLSS and ultra texture settings instead of supreme, the game is no longer VRAM limited, and the textures don't even look any different. I don't feel that VRAM affects my experience in this game.

The VRAM limit only bothers me when it noticeably affects performance or texture quality. For example, when I was playing the TLOU1 remake on my old GTX 1080 8GB, the textures looked extremely blurry on low settings, and the game was also stuttering.
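The allocation-vs-actual-need distinction above is easy to probe yourself: `nvidia-smi` can report memory in CSV form, and a few lines of Python can parse it. The sample line below is made up for illustration, not captured from a real run:

```python
# Parse nvidia-smi's CSV memory report to check real VRAM pressure.
# Capture a line with:
#   nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits
def parse_vram(csv_line: str) -> tuple[int, int, float]:
    """Return (used_mib, total_mib, fraction_used) from one CSV line."""
    used, total = (int(field.strip()) for field in csv_line.split(","))
    return used, total, used / total

# Illustrative sample output (values in MiB), not a real measurement.
used, total, frac = parse_vram("12288, 16384")
print(f"{used}/{total} MiB ({frac:.0%} used)")  # 12288/16384 MiB (75% used)
```

Note this still shows *allocated* memory, not the working set a game strictly needs, which is exactly why headline "usage" numbers overstate requirements.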
 
Last edited:

Celcius

°Temp. member


^^ says Nvidia is actually selling an x070-class card for $1k by giving it an x080 name.

Similar to what they tried to do with the 4080 12GB, except this time there's no better card to compare it against except the very top card.
 
Last edited:

MikeM

Member
Portal RTX uses around 10-12GB of VRAM on my RTX 4080. Games sometimes allocate more VRAM than necessary, but that doesn't mean 20GB of VRAM is really needed.

Future games will use over 16GB of VRAM, but those will be PS6 ports. Until then, the RTX 5080 will be fine; at worst, people will need to tweak settings a little in a couple of games. The difference between ultra texture settings and high settings is usually very small, or not noticeable at all.

For example, Indiana Jones with PT at 4K native is VRAM limited on my PC. However, with DLSS and ultra texture settings instead of supreme, the game is no longer VRAM limited, and the textures don't even look any different. I don't feel that VRAM affects my experience in this game.

The VRAM limit only bothers me when it noticeably affects performance or texture quality. For example, when playing the TLOU1 remake on my old GTX 1080 8GB, the textures looked extremely blurry on low settings.
Scaling will get better given the handhelds that are out and selling well, so 16GB is viable. The issue is that, come PS6, that VRAM won't allow for the high-end experience people want.
 

hinch7

Member


^^ says Nvidia is actually selling an x070-class card for $1k by giving it an x080 name.

Similar to what they tried to do with the 4080 12GB, except this time there's no better card to compare it against except the very top card.

Yeah, this was always the case. The gulf between GB202 and GB203 is insane. And they made people believe that spending $550 on a 60-tier GPU is a good deal (the 5070). Nvidia GPUs are a rip-off these days.

Sadly, there's no competition around, so they'll do what they want. And AMD only cares about margins, so they'll both collude on prices.
 
Last edited:
Yeah, this was always the case. The gulf between GB202 and GB203 is insane. And they made people believe that spending $550 on a 60-tier GPU is a good deal (the 5070). Nvidia GPUs are a rip-off these days.

Sadly, there's no competition around, so they'll do what they want. And AMD only cares about margins, so they'll both collude on prices.

As I've been saying, expect the gap to get even wider.
 

hinch7

Member
As I've been saying, expect the gap to get even wider.
Unless AMD pulls out something special with UDNA and gives us a flagship, something along the lines of the 4090. Sadly, nothing is stopping Nvidia from selling gimped GPUs for sky-high prices. And looking at the 5080's initial sales, that's unlikely to change for the 6000 series.
 

rofif

Can’t Git Gud
There is no Founders Edition here at all. They never even tried to sell it here.
Instead we got a fucking MSI VENTUS garbage card for $1,600. YES. 1,600 fucking dollars.
Nvidia can go fuck themselves.

I was able to get a 3080 FE for $700 on release day. Granted, it was difficult and I had to hunt the Nvidia online store for a few days waiting for a drop, but it did happen.
The FE was never sold here. What a joke.
 
I'm looking to build a PC at some point later in the year. I understand this card is shit in a gen-to-gen comparison, but for someone like myself building for the first time, it's a solid card, right?
 

Topher

Identifies as young
This is going to be the case across the board, ain't it? The 5070 and Ti will be paper launches, and AIBs will have super expensive versions.

I'm looking to build a PC at some point later in the year. I understand this card is shit in a gen-to-gen comparison, but for someone like myself building for the first time, it's a solid card, right?

Yeah, if you can get one then you'll have a damn good card.
 
Been playing around with my 5080, and the OC vids definitely weren't golden samples after all, or I got one as well...... I'm not done testing yet, but at least the OC levels we've seen in the vids (~3200MHz / +375) run rock solid and still very cool.
That thing is redeeming itself at least a tiny bit right now.
Can't remember the last GPU with that much out-of-the-box OC potential without even running noticeably hotter....
This reeks of artificial placement by NVIDIA.... 5080S/Ti incoming.
 
Last edited: