
Nvidia RTX 50 graphics card family TDPs 'leaked' by Seasonic

YeulEmeralda

Linux User
Think of it this way: the 4090 is already 3x faster than the PS5 even in raster (6x+ in RT). Even if the 5090 is "only" 20 to 30% stronger, we've got a behemoth of unimaginable power on our hands here :D
[GIF: Star Wars: Revenge of the Sith "power"]
 

Hudo

Member
So, will the 5090 at least come with 32 GB of VRAM? Otherwise, what's the fucking point of the card?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I meant most game engines are built to accommodate the lowest common denominator, so all you're really getting with these insane GFX cards is ever-increasing frame rates at 4K; I'm not seeing games built from the ground up to take insane advantage of these cards, is what I'm saying.

I take it you missed Path Tracing in Alan Wake?
Or Phantom Liberty.
Or Hardware Lumen in Unreal Engine 5.
Or PathTracer in Unreal Engine 5.
Or Full RT in Unity.
Or every RTX Remix game.

Most if not all engines are forward-looking and will happily throw more features at better hardware.
The gap between Alan Wake, Phantom Liberty, Portal and HW Lumen on PC and the console variants is stark, regardless of what people tell you.

Hell, to get Enemies to run on console it would literally have to look like a different project entirely... and 4090s can't even get close to running maxed-out Enemies at 4K30.
So maybe a 5090 finally gets a good framerate in Enemies. Enemies is built in Unity, an engine that can scale from bottom-of-the-barrel mobile games all the way to pressuring a 4090.
So no, they aren't built for the lowest common denominator; they are all forward-looking and scalable, with upper limits literally unattainable on current hardware even at current resolutions.

 

YeulEmeralda

Linux User
And it doesn't look like they plan on targeting the top end anytime soon.
The xx90s are so far beyond that they are literally halo products.

The real fight is gonna be the midrange, where we have to hope Intel's Battlemage brings the heat so Nvidia starts realizing they need to make compelling midrangers.

AMD really just needs to launch their directly competing cards well under Nvidia's price... the upper echelon they can just leave to Nvidia and call it a day.
Remember the 5700XT... that was AMD's range-topper for that generation... it fought against 2070s for a whole 100 dollars less... they literally pressured Nvidia into making the Super series.

They need to do that again, but this time they have the advantage of already having RT and FSR; it'll make their 8700XT a truly, truly better alternative to the 5070 (Ti). If they can also make their 8900XT a much better deal than a 5080 (which shouldn't be hard based on leaks) then they should be golden. Nvidia can have the crown with the 5090, but at every other price point AMD and Intel should have a much better deal, not a slightly better deal; it should be much better.

100 dollars less for the competitor card and Nvidia will either do price drops or lose that part of the market (assuming consumers are smart).

You're forgetting that people are willing to pay $100/€100 more for RT and DLSS.
 

GymWolf

Gold Member
It will.
16GB VRAM and pretty damn fast, especially if you count DLSS.
You'll easily glide through the generation till the RTX 60s, which will probably come with some new tech making the upgrade worthwhile.
Let's talk again when GTA6 is out on PC; anything less than 4K60 is gonna be far from "gliding" :lollipop_grinning_sweat:
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
You're forgetting that people are willing to pay $100/€100 more for RT and DLSS.

AMD and Intel both have good RT now and have FSR and XeSS.
In the midrange, paying 500 dollars vs 600 dollars for similar performance without RT and only a minor loss with RT should be compelling to users.

The 7800XT already has pretty strong RT.
In light RT workloads it can catch and at times beat the 4070.
In heavy RT workloads it only just misses the mark vs the 4060 Ti.
Now, how many games are gonna throw the full wormage of RT features at you like Cyberpunk does... who knows, but in lighter RT workloads AMD is already right on the cusp of being simply better than Nvidia at midrange price points.

[Charts: Spider-Man Remastered and Cyberpunk 2077 RT benchmarks, 2560x1440]

Let's talk again when GTA6 is out on PC; anything less than 4K60 is gonna be far from "gliding" :lollipop_grinning_sweat:

Bro, your 4080 is already struggling with native 4K60 now.
You really should be using DLSS; those Tensor cores aren't there to massage your back.

[Charts: Horizon Forbidden West, Avatar, Hellblade 2 and Still Wakes the Deep benchmarks]

P.S. I actually don't think GTA6 is going to be that punishing a game for PC gamers to run... unless they have an uber setting for real-time ray-traced global illumination, I think High settings will easily be doable on a 4080.
Whether that's native 4K60 or not, well, that's up to you... I never let my Tensor cores sleep.
 

GymWolf

Gold Member
Bro, your 4080 is already struggling with native 4K60 now.
You really should be using DLSS; those Tensor cores aren't there to massage your back.

P.S. I actually don't think GTA6 is going to be that punishing a game for PC gamers to run... unless they have an uber setting for real-time ray-traced global illumination, I think High settings will easily be doable on a 4080.
Whether that's native 4K60 or not, well, that's up to you... I never let my Tensor cores sleep.
I meant 4K DLSS Quality of course; 4K native is a waste when there's an almost-as-good option.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I meant 4K DLSS Quality of course; 4K native is a waste when there's an almost-as-good option.

Ohh, then you're sweet.
DLSS/TSR to 4K and you are good... even in GTA6.
Which, again, I really don't think is going to punish systems as much as people think it will.
I think we already have games that are much more taxing than GTA6 will be, even if they aren't as dynamic (Alan Wake 2, Cyberpunk, Senua's Saga).
 

Kenpachii

Member
My 4080 laptop (basically a desktop 4070) is doing wonders at 3440x1440 with DLSS and frame gen, but I would like some more performance for path-tracing titles at 3440x1440 and for 4K gaming.

It's all going to come down to performance and VRAM. This laptop is cutting it close with 12GB of VRAM. I would want double the VRAM on a 5080 if possible, but they're probably going to set it at 16GB.

Then I could see myself being interested in a 5090 just for the VRAM, and heavily underclocking it to get it into the 350W range.

We will see what the cards offer though.
 

Gaiff

SBI’s Resident Gaslighter
I meant most game engines are built to accommodate the lowest common denominator, so all you're really getting with these insane GFX cards is ever-increasing frame rates at 4K; I'm not seeing games built from the ground up to take insane advantage of these cards, is what I'm saying.
And you'd be wrong, because most engines have a slew of features meant to take advantage of more advanced and powerful hardware. UE5 scales from the little Switch all the way up to the beefiest PC, with tools such as Nanite and hardware-accelerated ray tracing with Lumen. Look at the famous Matrix demo on UE5. That's what the engine can do, and it is certainly not catering to the lowest common denominator.

So you're wrong on all counts.
 

twilo99

Gold Member
My 4080 laptop (basically a desktop 4070) is doing wonders at 3440x1440 with DLSS and frame gen, but I would like some more performance for path-tracing titles at 3440x1440 and for 4K gaming.

It's all going to come down to performance and VRAM. This laptop is cutting it close with 12GB of VRAM. I would want double the VRAM on a 5080 if possible, but they're probably going to set it at 16GB.

Then I could see myself being interested in a 5090 just for the VRAM, and heavily underclocking it to get it into the 350W range.

We will see what the cards offer though.

Do Nvidia cards usually have a decent margin for a meaningful underclock?
 

hinch7

Member
Do Nvidia cards usually have a decent margin for a meaningful underclock?
They do, at least for Ada Lovelace. My 4070 Ti draws 80-100W less (at 0.925V @ 2715MHz) with a slight negative-curve undervolt in regular gaming, and averages around 170-200W in games. It performs about the same as a stock OC with a slight memory clock bump. I can barely hear the GPU spin up with the massively oversized triple cooler from MSI.
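Side note: if you just want to cap board power from a script (the voltage/frequency-curve undervolt itself has to be done in something like MSI Afterburner), here's a minimal sketch using nvidia-smi's standard power-query and power-limit flags. The 200W example cap is purely illustrative and has to sit inside whatever min/max limits your own card reports:

```python
# Minimal sketch: query and cap board power via nvidia-smi (setting the limit needs admin
# rights; the curve undervolt described above is done in MSI Afterburner, not here).
import subprocess

def power_info(gpu: int = 0) -> str:
    """Return the POWER section of nvidia-smi's query output for the given GPU."""
    result = subprocess.run(
        ["nvidia-smi", "-i", str(gpu), "-q", "-d", "POWER"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

def set_power_limit(watts: int, gpu: int = 0) -> None:
    """Cap board power; the value must sit within the card's reported min/max limits."""
    subprocess.run(["nvidia-smi", "-i", str(gpu), "-pl", str(watts)], check=True)

if __name__ == "__main__":
    print(power_info())
    # set_power_limit(200)  # e.g. ~70% of a ~285W 4070 Ti -- illustrative only, uncomment to apply
```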
 
Hopefully my 4080 is gonna remain decent until the 60 series.
I think requirements will not skyrocket in the near future because AAA games are made with consoles in mind. I expect even the 2080 Ti to run all current-gen games (at above-console settings) by the time the PS6 launches.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Do Nvidia cards usually have a decent margin for a meaningful underclock?

No idea to be honest, kinda wondering about it myself.

Undervolting (not underclocking; the Nvidia drivers will handle that) Ada was well, well worth it... as long as you don't choke the card by going under ~60% power.
So for the 5090, a 70% power target would net you the ~350W you are looking for.
Unless Blackwell is a step backwards, you'd lose 5-10% of your performance depending on the game.
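
Rough math behind that, assuming the ~500W board power from this leak (a rumor, not a confirmed spec) and Ada-like scaling:

```python
# Back-of-envelope sketch: rumored 5090 board power (from the leak) at various power targets.
# Both the 500W figure and the 5-10% perf-loss note are assumptions carried over from Ada.
RUMORED_5090_TDP_W = 500

for target_pct in (100, 80, 70, 60):
    watts = RUMORED_5090_TDP_W * target_pct / 100
    print(f"{target_pct:>3}% power target -> ~{watts:.0f} W")
# 70% -> ~350 W, i.e. the range mentioned above, for an estimated 5-10% performance loss
# if Blackwell behaves like Ada did.
```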


[Charts: Ada power-target scaling results, including der8auer's PUBG power-target test]
 
It will, no worries; we have the same tech on consoles for the next 2 years, so you can wait for a 6080 in 2027 with at least 50% more power than the 4080.

In certain games (like Spider-Man in this video), the RTX 4080 was around 80-95% faster than the RTX 3080, so IMO there's a good chance that the 5080 will be 50% faster than the 4080 (at least in some games). The 6080 will probably be over 2x faster.

 

Sanepar

Member
In certain games (like Spider-Man in this video), the RTX 4080 was around 80-95% faster than the RTX 3080, so IMO there's a good chance that the 5080 will be 50% faster than the 4080 (at least in some games). The 6080 will probably be over 2x faster.


But maybe that happens because of the 10GB on the 3080. In raw power the difference between the 3080 and 4080 is 50%. The 3080's problem is its 10GB of VRAM. I don't believe in a 50% jump again this gen.
 

DonkeyPunchJr

World’s Biggest Weeb
I am very curious what game Nvidia will use to promote the new cards. I’m guessing Wukong.
That’s honestly my biggest question right now. As a 4090 owner I’d love it if Nvidia made me feel like my current GPU is an obsolete piece of e-waste. But right now I don’t even have any games that would noticeably benefit from a more powerful GPU.
 

hinch7

Member
But maybe that happens because of the 10GB on the 3080. In raw power the difference between the 3080 and 4080 is 50%. The 3080's problem is its 10GB of VRAM. I don't believe in a 50% jump again this gen.
Maybe for the 5090 vs. the 4090. Further down the stack, probably not. I can see the 5080 matching the 4090 in raster and maybe beating it in RT.
 

diffusionx

Gold Member
Think of it this way: the 4090 is already 3x faster than the PS5 even in raster (6x+ in RT). Even if the 5090 is "only" 20 to 30% stronger, we've got a behemoth of unimaginable power on our hands here :D
3x the power for >3x the cost just for the GPU (~5x the cost when you factor in the entire PC)
 
But maybe that happens because of the 10GB on the 3080. In raw power the difference between the 3080 and 4080 is 50%. The 3080's problem is its 10GB of VRAM. I don't believe in a 50% jump again this gen.
I hadn't noticed the VRAM usage on the 4080; it's over 12GB in Spider-Man: Miles Morales, so it seems you're right.
 

PeteBull

Member
3x the power for >3x the cost just for the GPU (~5x the cost when you factor in the entire PC)
Indeed, the value proposition isn't good, but that's normal for top-end PC hardware; it's called bleeding edge for a reason. If you're an enthusiast you literally gotta bleed cash out xD
Edit: Same thing goes for CPUs.
For midrange and amazing value you go, for example, for the R7 5700X (https://pcpartpicker.com/product/JmhFf7/amd-ryzen-7-5700x-34-ghz-8-core-processor-100-100000926wof) at $160, but the top of the top, the 7800X3D (https://pcpartpicker.com/product/3hyH99/amd-ryzen-7-7800x3d-42-ghz-8-core-processor-100-100000910wof), is almost $400, and yet the difference in min and avg fps, even in the best-case scenario of a heavy CPU bottleneck, usually isn't even +50% between them.
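Putting rough numbers on that (prices from the listings above; the +50% best-case uplift is my ballpark for a heavy CPU bottleneck, not a measured average):

```python
# Quick perf-per-dollar sanity check; the +50% uplift is the ballpark quoted above, an assumption.
price_5700x_usd = 160
price_7800x3d_usd = 400
best_case_fps_uplift = 1.50  # assumed heavy CPU-bottleneck scenario

cost_ratio = price_7800x3d_usd / price_5700x_usd
print(f"~{cost_ratio:.1f}x the price for at most ~{best_case_fps_uplift:.2f}x the fps")
# -> roughly 2.5x the money for ~1.5x the frames in the best case; far less in GPU-bound games.
```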
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
They releasing this year or no?

I am very curious what game Nvidia will use to promote the new cards. I’m guessing Wukong.

That’s honestly my biggest question right now. As a 4090 owner I’d love it if Nvidia made me feel like my current GPU is an obsolete piece of e-waste. But right now I don’t even have any games that would noticeably benefit from a more powerful GPU.

Heart of Chornobyl is slated for a September release... GTC is between September and November... so if we are lucky they announce them as early as September with an October/November release.
And they'd use Heart of Chornobyl as a benchmark of sorts.

[Images: S.T.A.L.K.E.R. 2: Heart of Chornobyl screenshots]

Otherwise yeah, it's likely going to be Black Myth: Wukong that's gonna be their showcase title this year.




P.S.
I would be really shocked if they manage to get these things out this year.
I've been expecting 2025 this whole time after March GTC... if they do actually manage to get them out this year, more power to them.
 

Hudo

Member
There's not a game in existence that's using 24 GB of VRAM yet. What are you even talking about?
I am thinking about work stuff, where I frequently hit 24 GB of VRAM on my 4090. And before you tell me that I am retarded for using a 4090 for work stuff: Nvidia themselves have marketed it for rendering and deep learning as well. So it's not like I am outside the intended use cases.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I am thinking about work stuff, where I frequently hit 24 GB of VRAM on my 4090. And before you tell me that I am retarded for using a 4090 for work stuff: Nvidia themselves have marketed it for rendering and deep learning as well. So it's not like I am outside the intended use cases.
Don't your apps use shared memory?
Just buy a second 4090.

P.S. Be prepared to be disappointed; it's gonna be 28GB at best.
Start looking at the A series if you need much more VRAM. If it's for work then it's a tax write-off... and if you can't write it off, the extra VRAM of the A series will pay for itself sooner rather than later... if you actually are using much more than 24GB of VRAM.

will still buy if the 5090 has a 512-bit bus.

will downsample my dongus off.

The 5090 will most def not be using a 512-bit bus... no way they use the full wormage of that chip.
It's gonna be 448-bit with 28GB of GDDR7.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Shame they aren't getting announced until CES now. 😞

You think they will use CES instead of GTC?
Or was this confirmed somewhere and I missed it?

They still have GTC (Sept-Nov) to announce them, then they can release and show them off at CES. If they announce at CES, chances are we only see the actual work they can do at GTC in March then.
 

Bry0

Member
RTX 5K is a given. I'm more curious about this "7990 XTX".
It seems totally bizarre for AMD to refresh RDNA 3 with rumors of RDNA 4 swirling, unless it's to have a refreshed chiplet GPU for the enthusiast price bracket.

Probably just rumor-mill guessing from Seasonic? There were rumors a solid year+ ago about a potentially fixed RDNA 3 refresh for desktop being canceled.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Imma be skipping this generation.
With DLSS constantly improving, I don't think games are gonna get that much more demanding.
I'm good.

Wait for the RTX 60s; they will likely have some actual new features with them.



Nvidia haven't even announced when their next GTC event will be... depending on its name you'll have an idea whether they will announce the cards or not.
Too busy counting money to worry about this shit.
Remember, Blackwell is being used both for their AI chips and for the consumer chips.
I don't think they are in any rush to start taking silicon away from Blackwell AI.
They're just gonna get much less optimized. Developers will start using FG/DLSS as a crutch.

Who am I kidding? They already are!
 
Indeed, the value proposition isn't good, but that's normal for top-end PC hardware; it's called bleeding edge for a reason. If you're an enthusiast you literally gotta bleed cash out xD
Edit: Same thing goes for CPUs.
For midrange and amazing value you go, for example, for the R7 5700X (https://pcpartpicker.com/product/JmhFf7/amd-ryzen-7-5700x-34-ghz-8-core-processor-100-100000926wof) at $160, but the top of the top, the 7800X3D (https://pcpartpicker.com/product/3hyH99/amd-ryzen-7-7800x3d-42-ghz-8-core-processor-100-100000910wof), is almost $400, and yet the difference in min and avg fps, even in the best-case scenario of a heavy CPU bottleneck, usually isn't even +50% between them.

Maybe the vast majority of games still run well (60fps+) on the 5700X, but the 7800X3D is more future-proof.
In games optimized to use 8 CPU cores there's quite a big difference between the 7800X3D and the 5700X (78%).

[Chart: Borderlands 3 CPU benchmark, 1280x720]
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Hope it's true. I wonder how powerful a 220W 5070 would be; I wonder if it could get close to a 4080.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
They're just gonna get much less optimized. Developers will start using FG/DLSS as a crutch.

Who am I kidding? They already are!

Is "DLSS is a crutch" the new lazy-devs narrative?

Unreal Engine 5, even the vanilla version, is getting so much more CPU-optimized that even "lazy devs" will get rendering that is multithreaded and heavily parallelized.
Devs who make their own custom engines, like Capcom, are already working on making their engines multithreaded and parallelized.

What makes you think games are gonna get less optimized when it looks like Unreal is gonna be the most-used engine again, and custom engines are coming from guys like Capcom, Remedy and Insomniac (Nixxes putting in work)?
Epic is literally working on making Unreal fuckup-proof, right down to shader compilation being automatic, and basically killing hard traversal stutters (we will likely still get hitches).
 

lmimmfn

Member
Let's set expectations:
1. 50 series cards will be totally overpriced.
2. Path tracing is available in only a few games; it is not performant today without frame gen and still struggles. 50 series cards will provide at best a 20% improvement, which is negligible.
3. Raster performance should be fantastic... but that only matters if AMD actually get off their asses and provide a decent competitor (not going to happen, as they're going midrange only).

So we are stuck with Nvidia, who couldn't give a crap about consumers as they have their datacentre business, which is where they want to focus their wafers.

Intel is improving, making huge strides, but is playing catch-up on drivers.
 

SScorpio

Member
While they could be placeholders, I hope they aren't.

I and some others feel NVIDIA named each 4000-series card one SKU higher than its silicon warranted; the power draws here make me think performance will be back where it belongs.

I.e. the 4060 should have been the 4050, the 4060 Ti the 4060, the 4070 the 4060 Ti, etc.

If that's the case, TDP actually went down when compared against the equivalent higher SKU.
 

PeteBull

Member
Let's set expectations:
1. 50 series cards will be totally overpriced.
2. Path tracing is available in only a few games; it is not performant today without frame gen and still struggles. 50 series cards will provide at best a 20% improvement, which is negligible.
3. Raster performance should be fantastic... but that only matters if AMD actually get off their asses and provide a decent competitor (not going to happen, as they're going midrange only).

So we are stuck with Nvidia, who couldn't give a crap about consumers as they have their datacentre business, which is where they want to focus their wafers.

Intel is improving, making huge strides, but is playing catch-up on drivers.
TL;DR: next gen will be glorious, both for midrange/budget gamers who can go AMD, spend $300-500 and get close to what they'd have to pay $900 for today, and for enthusiasts who can easily pay $2k as long as the performance is there (the €2200 I paid in August 2021 for a 3080 Ti during the crypto boom is €2571 now, exactly $2800, and I've already got cash set aside for the next GPU/top-end PC, just waiting for the right time/launch of new hardware :)
 

DenchDeckard

Moderated wildly
You think they will use CES instead of GTC?
Or was this confirmed somewhere and I missed it?

They still have GTC (Sept-Nov) to announce them, then they can release and show them off at CES. If they announce at CES, chances are we only see the actual work they can do at GTC in March then.

Unfortunately, it's what I've heard. I'm hoping to be wrong.

Normally Nvidia go big, with a roughly September announcement and October launch for the 4080/4090 tier, but I've recently heard they've now delayed the big boys to CES.

Normally they would announce 4060/Ti-tier cards around CES time.

We'll see how the rumours pan out. I'm sure it will start to leak online soon if it's true.

My stepdad is Jensen, so I overheard him on the phone. ;)
 