Report: RTX 5090 to include 32GB of VRAM

The closest was the 9800GTX, which even loses sometimes to the 8800 Ultra.
The 8800 Ultra was such a perfect card, right up there with the 1080 Ti as perfect GPUs that lasted insanely long.

I'll probably sell my 4090 and grab a 5090. I have business reasons to have one, plus the prospect of shoving a 5090 into an ITX build like I did with the 4090 just sounds too fun.
 
And, now hear me out: with 600 nuclear watts, it can double as a space heater in the high-fuel-cost winter.


With how much money you could save by turning the furnace off, you'd be a fool not to jump on this!
 
To everyone obsessing over the 600W thing: the 4090 was also rumoured to be 600 watts. And in the end it can be overclocked to that limit, but at 350W you get 90% of the performance it has at 600W, so I wouldn't pay too much attention to that.
 

True.
These numbers are the extreme end of the scale, so 600W is what your system should be able to supply to avoid issues, but the card won't draw that much at stock settings 99% of the time.
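The 350W/90% claim above implies a big efficiency win from power limiting. A quick back-of-the-envelope sketch (the 90%-at-350W figure is the poster's claim, not measured data):

```python
# Rough perf-per-watt comparison for the claimed 4090 behaviour:
# ~90% of max performance at 350 W vs. 100% at the 600 W overclock limit.

def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Relative performance delivered per watt drawn."""
    return relative_perf / watts

limited = perf_per_watt(0.90, 350)   # power-limited to 350 W
maxed = perf_per_watt(1.00, 600)     # overclocked to the 600 W limit

# The power-limited card is ~54% more efficient for a ~10% perf loss.
print(f"efficiency gain from power limiting: {limited / maxed - 1:.0%}")
```

So if those numbers hold, you trade a tenth of the frame rate for roughly half again as much performance per watt.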
 
Do you really need a 5090 for gaming? I mean, even a 4080 is enough for 4K/60fps with frame gen on.

I feel like the 5090 is built for bigger tasks than gaming.
 
Well, the 4090 struggles to hit everything on ultra with ray and path tracing on in 4K in games like Black Myth and even Cyberpunk. And it definitely will with the upcoming Flight Simulator.

So if you need the VERY best fidelity/performance, I suppose you need a 5090 going forward.

But "want" and FOMO are terms more connected to the 5090 than "need", I suppose.
 
Cyberpunk with everything maxed out at 4K isn't a 60fps struggle on my 4090 PC; it's easily above that with frame gen on. Don't use scRGB HDR, though, as that tanks it. I don't know what the struggling part is. Nevertheless, I want that 5090.
 

Frame Gen has made upgrading less important. Especially if you game on PC with a controller 99% of the time like myself.
 
I'm just gonna say this: as long as the performance is there, i.e. a sizeable jump from the already monstrous RTX 4090, it doesn't matter if it's 2 or 3k USD.
 
More interested to see what DLSS 4 offers and if it's exclusive to the 5000 series.

It's interesting because DLSS improvements are starting to hit diminishing returns now, IMO.

Once it looks better than native, what becomes the end goal? Why take it any further?
 
I'd only replace my 4090 if the 5090 was more energy efficient.
I'm very happy with my 4090 performance wise.

Exactly the same for me.
Perf/W is now more important than anything else with a top-of-the-line GPU.
Undervolting is certainly the most exciting thing I set up on all the GPUs I install.
 
3k+€ in Europe for sure.
Meh, it's the end of this "gen", and it's not like most games will need more graphical power anyway.
 
"thanks sony for the ps5 pro price me and my friends will be switching to PC"
 
Sitting the 50 and 60 series out. By then I'll probably have bought a new monitor and will need to upgrade again from the 40 series anyway.
"thanks sony for the ps5 pro price me and my friends will be switching to PC"
Here is your 5060 for only $499. I'm joking. Maybe.
 
Can someone tell me how the 5080 can be 10% faster than the 4090? The 4090 has 60% more cores and is better in every aspect:

[Image: spec comparison chart]

Even less power draw, so a 4GHz clock (lol) is out of the picture.
 
1. The 4090 has Ada Lovelace CUDA cores. The 5080 has Blackwell cores. New architecture = better performance per core.
2. Your chart doesn't mention clock speeds at all.
3. Blackwell will probably have DLSS4 as well.
 

It's DF's chart. Clock speeds are a major factor, but I don't expect a jump from 2.9GHz to 4GHz.

Also, the last time we got a massive IPC gain was a long time ago. The biggest difference between Ampere and Ada was the clock speed bump from ~1.9GHz to ~2.9GHz (and of course better RT hardware).

DLSS4? Maybe, but what will it bring to the table?

I don't expect the 5080 to beat the 4090 in raster AT ALL.
 
I'm betting on $2400 for a reference 5090.

AIB, I imagine, $2700 min.

I hope Micro Center is still going to give $1320 for my 4090 trade-in 😅
 
We don't know which cache system Blackwell uses.

It can beat it at 1080p/1440p, but not in 4K.

I'm still not seeing that; memory speed is not the problem on the 5080 (like it was on the 4070 Ti).

Unless we have a massive IPC increase and/or a big clock speed bump, the 5080 will be slower than the 4090.
 
I think it can get close, but only because the 4090 really has trouble stretching its legs because the GPU is so massive. You really need a 4K pathtraced RT load for that ~40% perf increase over a 4080 to materialize.

I think they'll mostly focus on non-raster improvements with this gen:
  • Improved RT perf seems like a given.
  • Faster Optical Flow Accelerator yet again, bringing it closer to ~2x non-FG FPS. Ada was a ~2x improvement over Ampere and around ~3x Turing. Maybe some sort of double generated frame mode enabled by the higher perf too?
  • nvCOMP Acceleration Block, relieving shader load in DirectStorage games.
 
Can someone tell me how the 5080 can be 10% faster than the 4090? The 4090 has 60% more cores and is better in every aspect. Even less power draw, so a 4GHz clock (lol) is out of the picture.
The thing is, relative to its number of cores, the 4090 is much slower than you'd expect. It seems very inefficient at keeping its cores fed. It has something like 80% more cores than the 4080 and only a 10% deficit in clock speed, and is much beefier in everything else. Maybe they figured out how to keep Blackwell's cores busy? Or the rumored 10% improvement could be bullshit, or true only under very specific circumstances.
 
Unless DLSS4 is Voodoo-inspired black magic, it looks very unlikely that the 5080 will somehow be 10% faster than the 4090. If it somehow is, though, I also don't see it coming in at less than $1200.
 
Also keep in mind that the 5090 has literally double the specs of the 5080...
So if the 5080 is 10% faster than a 4090, then the 5090 is going to be more than double the speed of a 4090 🔥

Time to upgrade my 1000W PSU, I guess.
I have a 1000W PSU but plan to keep it. I plan to turn the card's power limit down to 80% for less heat/noise and minimal performance impact. (I'm currently doing this with my 3090.)
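For anyone wanting to try the same, here's a rough sketch of setting that 80% cap with `nvidia-smi`. The 350 W figure is an assumed default limit for a 3090, not a universal value; query your own card first.

```shell
#!/bin/sh
# Sketch: cap a GPU at 80% of an assumed 350 W default power limit.
# The 350 W default is illustrative; check the real limits with:
#   nvidia-smi -q -d POWER
DEFAULT_W=350
TARGET_W=$((DEFAULT_W * 80 / 100))
echo "target power limit: ${TARGET_W} W"
# Applying it needs root (and resets on reboot unless persisted):
#   sudo nvidia-smi -pl "$TARGET_W"
```

Same idea as sliding the power limit slider in MSI Afterburner on Windows, just scriptable.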
 
I'm still not seeing that; memory speed is not the problem on the 5080 (like it was on the 4070 Ti).

Unless we have a massive IPC increase and/or a big clock speed bump, the 5080 will be slower than the 4090.
There are two ways to look at it: one is that the 4090 has so much more compute hardware; the other is that the 5080 beats the 4080 significantly. Based on the specs given, I could see the 5080 being roughly 20-25% faster than the 4080, given the increase in power, the more efficient node (maybe some clock increase), and the increase in memory bandwidth. If they have some architectural improvements on top, I could see it matching a 4090. 10% faster than a 4090 feels like a big stretch, though.
 
It would need either 32Gb chips on a 256-bit interface or 16Gb chips on a 512-bit interface.

Alternatively, they could go for 30/36GB on a 320/384-bit interface with 24Gb chips.
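The arithmetic behind those combinations: each GDDR chip sits on a 32-bit slice of the memory bus, so capacity is (bus width / 32) chips times the per-chip density. A quick sketch checking the configurations listed above:

```python
# VRAM capacity from bus width and chip density:
# each GDDR chip occupies a 32-bit channel, so
#   capacity_GB = (bus_width_bits / 32) * chip_density_Gbit / 8

def vram_gb(bus_width_bits: int, chip_density_gbit: int) -> float:
    chips = bus_width_bits // 32
    return chips * chip_density_gbit / 8

print(vram_gb(256, 32))  # 32.0 GB -> eight 32Gb chips
print(vram_gb(512, 16))  # 32.0 GB -> sixteen 16Gb chips
print(vram_gb(320, 24))  # 30.0 GB -> ten 24Gb chips
print(vram_gb(384, 24))  # 36.0 GB -> twelve 24Gb chips
```

Either route gets to 32GB; the 320/384-bit options trade bus width against the odd 30/36GB totals.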
 
It isn't that ONLY 1.3% care about upscaling; they just care more about other features like raw performance (which people will always want over fake frames and AI resolution) and pricing.
 