
Next-Gen NVIDIA GeForce RTX 4090 With Top AD102 GPU Could Be The First Gaming Graphics Card To Break Past 100 TFLOPs

rofif

Can’t Git Gud
I hope Nvidia re-enables the ability for Europe/Poland to buy FE cards directly.
I was able to get a 3080 for 700 two years ago, and even sold the bundled Watch Dogs for 30.
And while it's not the most silent or cool-running version, it is small, very dense, and amazingly cool-looking.
It's the first GPU I ever got that does not sag at all lol.

Getting that GPU was a real hunt. I spent a whole week refreshing, setting up refresh-and-change-tracking apps, scanning the API, and keeping autofill ready.
My favourite part was using Chrome console commands to make the "buy now" cart appear as soon as the API updated. Normally that takes a few minutes, and a normal refresh doesn't make the GUI update.
And with all of this, I managed to be at my PC at the right time when the alarm rang, and within 2 minutes the cards were gone, never to be sold in the EU store again.
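For the curious, the console trick looks something like this; a minimal sketch only, since the real store's endpoint URL, response field, and button selector below are all hypothetical stand-ins:

```ts
// Poll a (hypothetical) stock API from the browser console and reveal the
// "buy now" button the page itself only shows after its own slow refresh.
const API = "https://api.store.example.com/products/rtx-3080-fe"; // hypothetical endpoint
const poll = setInterval(async () => {
  const res = await fetch(API);
  const data = await res.json();
  if (data.inStock) {                       // hypothetical response field
    clearInterval(poll);
    const btn = document.querySelector<HTMLButtonElement>("#buy-now"); // hypothetical selector
    if (btn) {
      btn.style.display = "block";          // un-hide the button without waiting for the GUI
      btn.click();
    }
    alert("In stock!");
  }
}, 5000); // check every 5 seconds
```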
Btw, this GPU is HEAVY. Over 2 kilograms (almost 5 pounds).
I am not selling that lol. Funnily enough, I had to do the same hunt for a PS5 a few months later...

[Photos of the RTX 3080 Founders Edition]

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Any product of theirs on shelves is hilarious.

It was blind luck that I managed to get a 3080 at launch.
Without crypto to drive up demand, Ada will be easy to find. I'm not even rushing for a 4080 this time around; no rush from me until games come out that actually need a 4080-class card to hit 90 fps (UW 1440p).
The main reason Ampere was hard to get hold of was crypto scum buying it up, and scalpers realizing crypto bros were/are willing to pay well above MSRP, so they grabbed as many as they could.
A pandemic forcing internet cafes to close and/or become mining farms didn't help either.


 

DenchDeckard

Moderated wildly
So I guess I can't grab a 4090 for my 12900K build with a 1000 watt gold PSU...?

...that sounds crazy to me lol

4080 it is then I guess.....sadge.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
So I guess I can't grab a 4090 for my 12900K build with a 1000 watt gold PSU...?

...that sounds crazy to me lol

4080 it is then I guess.....sadge.
What are you expecting to do that will max out the wattage on both your 12900K and an RTX 4090...at the same time?
Running Furmark and Prime at the same time is not a realistic workload.
 

DenchDeckard

Moderated wildly
What are you expecting to do that will max out the wattage on both your 12900K and an RTX 4090...at the same time?
Running Furmark and Prime at the same time is not a realistic workload.

Do you think I'll be OK with a 1000W PSU? Or do you think it will need to be something like a 1200 watt PSU?

Just seems crazy to me.

I was wanting to go big this time, but I may just have to hold off and stick with the 80 series.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Do you think I'll be OK with a 1000W PSU? Or do you think it will need to be something like a 1200 watt PSU?

Just seems crazy to me.

I was wanting to go big this time, but I may just have to hold off and stick with the 80 series.
A 4090-based system (whole system) will likely peak somewhere in the 800W range.
A 12900K doesn't eat anywhere near as much power while gaming as fearmongers would have you think.

But let's just say some game comes out that uses all P and E cores and stresses them out the ass.

Some basic, albeit speculative, maths:
12900K - 170W
RTX 4090 - 650W
Other stuff - 80W

Total system completely maxed out: approx 900W.

I'm assuming you have an 80+ rated power supply; I don't really see how or why it wouldn't handle you running Prime and Furmark at the same time.

Note that realistically your 12900K will never actually max out, and you should absolutely undervolt your 4090, so a 1000W 80+ unit should be easy work.
Remember also that the likes of MSI have stupid power limits on their cards that make them eat a shit-ton of wattage for an extra 5 fps.
It just looks good for them to say they have the fastest RTX, yayaya.
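That envelope maths as a runnable sketch (the wattages are the speculative worst-case figures above, not measured numbers):

```ts
// PSU headroom check using the speculative worst-case draws above (watts).
const draws = { cpu12900K: 170, rtx4090: 650, everythingElse: 80 };
const total = Object.values(draws).reduce((sum, w) => sum + w, 0); // 900 W
const psuRating = 1000; // rated continuous DC output of a 1000W unit
const headroom = psuRating - total; // 100 W
console.log(`Peak draw ~${total} W, headroom ${headroom} W (${Math.round((headroom / psuRating) * 100)}%)`);
// Note: the 80+ badge describes AC-side efficiency at the wall, not capacity;
// a 1000 W unit still delivers up to 1000 W of DC to the components.
```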
 

DenchDeckard

Moderated wildly
A 4090-based system (whole system) will likely peak somewhere in the 800W range.
A 12900K doesn't eat anywhere near as much power while gaming as fearmongers would have you think.

But let's just say some game comes out that uses all P and E cores and stresses them out the ass.

Some basic, albeit speculative, maths:
12900K - 170W
RTX 4090 - 650W
Other stuff - 80W

Total system completely maxed out: approx 900W.

I'm assuming you have an 80+ rated power supply; I don't really see how or why it wouldn't handle you running Prime and Furmark at the same time.

Note that realistically your 12900K will never actually max out, and you should absolutely undervolt your 4090, so a 1000W 80+ unit should be easy work.
Remember also that the likes of MSI have stupid power limits on their cards that make them eat a shit-ton of wattage for an extra 5 fps.
It just looks good for them to say they have the fastest RTX, yayaya.

I bought the MSI 3080 Gaming X Trio day one. It's been awesome.

Thanks for the post. Yeah, I have the Corsair RM1000x with braided cables, 80 Plus Gold, so I'm sure I'll be fine. Will keep my eye out to see what reviewers say. I'm sure people will be pairing it up with 12900Ks.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I bought the MSI 3080 Gaming X Trio day one. It's been awesome.

Thanks for the post. Yeah, I have the Corsair RM1000x with braided cables, 80 Plus Gold, so I'm sure I'll be fine. Will keep my eye out to see what reviewers say. I'm sure people will be pairing it up with 12900Ks.
You could test your own MSI card right now by dropping the power limit to, say, 90 or even 80% and seeing that in some cases you actually gain FPS, because Nvidia's boost clocks rely on temperature....and the main thing that drives up temps in like-for-like situations is excess power.
MSI basically brute-force themselves to the top of the charts with heavy, heavy cooling and by well over-volting the chips.
That good cooler plus slightly lower temps will almost certainly lead to a better experience overall.
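Afterburner's slider is the usual way to do that, but for the curious the same cap can also be set from a script via nvidia-smi; a minimal sketch, assuming nvidia-smi is on PATH, the shell has admin/root rights, and a hypothetical 350 W stock limit:

```ts
// Node sketch: read the current GPU power limit and cap it at 90% via nvidia-smi.
import { execSync } from "node:child_process";

const limitW = parseFloat(
  execSync("nvidia-smi --query-gpu=power.limit --format=csv,noheader,nounits").toString()
);
const cappedW = Math.round(limitW * 0.9); // e.g. a hypothetical 350 W card -> 315 W
console.log(`Power limit: ${limitW} W -> ${cappedW} W`);
execSync(`nvidia-smi -pl ${cappedW}`); // resets on reboot; undo with the original value
```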
 

DenchDeckard

Moderated wildly
You could test your own MSI card right now by dropping the power limit to, say, 90 or even 80% and seeing that in some cases you actually gain FPS, because Nvidia's boost clocks rely on temperature....and the main thing that drives up temps in like-for-like situations is excess power.
MSI basically brute-force themselves to the top of the charts with heavy, heavy cooling and by well over-volting the chips.
That good cooler plus slightly lower temps will almost certainly lead to a better experience overall.

Always heard about undervolting but never bothered to try. I guess I just literally fire up MSI Afterburner and drop the power limit to 90 percent?

I'll give it a go.
 

GymWolf

Gold Member
Without crypto to drive up demand, Ada will be easy to find. I'm not even rushing for a 4080 this time around; no rush from me until games come out that actually need a 4080-class card to hit 90 fps (UW 1440p).
The main reason Ampere was hard to get hold of was crypto scum buying it up, and scalpers realizing crypto bros were/are willing to pay well above MSRP, so they grabbed as many as they could.
A pandemic forcing internet cafes to close and/or become mining farms didn't help either.


Don't jinx it dude.
 

TrebleShot

Member
Finally we will be playing RDR2 at 400 fps and Cyberpunk at 4K 60 fps with Ultra RT.
The future is here, and overpriced, and definitely definitely worth it. :messenger_sunglasses:
 

Orta

Banned
I know it's early days and we don't have official specs, but what kind of CPU do you folks think would be needed to nicely complement the 4070?
 

JackSparr0w

Banned
There is not much demand for $500-$2500 GPUs from gamers.
Most buy in the $200-350 range.
The GTX 1600 series and RX 570/580 are the most popular gaming GPUs for a reason.
There is tons of demand, otherwise they wouldn't make them. After Nvidia made the first Titan, which sold beyond their wildest expectations, there was no going back.

If you think Nvidia became filthy rich by selling 1060s, you're incredibly naive. Higher-end GPU models have profit margins that are an order of magnitude larger.
 

GymWolf

Gold Member
There is not much demand for $500-$2500 GPUs from gamers.
Most buy in the $200-350 range.
The GTX 1600 series and RX 570/580 are the most popular gaming GPUs for a reason.
Things can always change, and nobody here has a crystal ball to see the future.

Maybe the crypto shit is gonna return in a big way, who knows.
 
There is tons of demand, otherwise they wouldn't make them. After Nvidia made the first Titan, which sold beyond their wildest expectations, there was no going back.

If you think Nvidia became filthy rich by selling 1060s, you're incredibly naive. Higher-end GPU models have profit margins that are an order of magnitude larger.
They became filthy rich by selling high-margin products: mainly professional GPUs, overpriced gaming GPUs sold in bulk to miners, and milking their ODM partners.

Their first Titan card was a failure. It was based on the Kepler architecture and launched in 2013.

Nvidia didn't start to make a lot of money until 2016/2017.
 

Celcius

°Temp. member
They became filthy rich by selling high-margin products: mainly professional GPUs, overpriced gaming GPUs sold in bulk to miners, and milking their ODM partners.

Their first Titan card was a failure. It was based on the Kepler architecture and launched in 2013.

Nvidia didn't start to make a lot of money until 2016/2017.
Eh, I don't know about the Titan being a failure… it was a halo product that had everyone talking about it and was much more powerful than anything else at the time. Lots of people bought them, and shortly thereafter they came out with the Titan Black and then continued on with the other Titans as well.
 
Eh, I don't know about the Titan being a failure… it was a halo product that had everyone talking about it and was much more powerful than anything else at the time. Lots of people bought them, and shortly thereafter they came out with the Titan Black and then continued on with the other Titans as well.
The point is, $1k+ consumer GPUs didn't sell in any significant numbers before 2018.
They got rid of the Titan naming in their consumer GPUs for a reason. They also recently increased the gap between the "Titan" and the regular enthusiast GPU like the 80 Ti.
The 1080 Ti was better than the Titan X (Pascal), and buyers felt ripped off just half a year later. The same thing happened after the initial Titan / Titan X launches back in 2013/2014.
 

Celcius

°Temp. member
The point is, $1k+ consumer GPUs didn't sell in any significant numbers before 2018.
They got rid of the Titan naming in their consumer GPUs for a reason. They also recently increased the gap between the "Titan" and the regular enthusiast GPU like the 80 Ti.
The 1080 Ti was better than the Titan X (Pascal), and buyers felt ripped off just half a year later. The same thing happened after the initial Titan / Titan X launches back in 2013/2014.
The 3090 and 3090 Ti are Titans without the name.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Finally we will be playing RDR2 at 400 fps and Cyberpunk at 4K 60 fps with Ultra RT.
The future is here, and overpriced, and definitely definitely worth it. :messenger_sunglasses:
It's nice to have visitors from 2030 show up on the forum. Please bring me some of those 8000-series Nvidia GPUs next time you're here.
 

JohnnyFootball

GerAlt-Right. Ciriously.
The point is, $1k+ consumer GPUs didn't sell in any significant numbers before 2018.
They got rid of the Titan naming in their consumer GPUs for a reason. They also recently increased the gap between the "Titan" and the regular enthusiast GPU like the 80 Ti.
The 1080 Ti was better than the Titan X (Pascal), and buyers felt ripped off just half a year later. The same thing happened after the initial Titan / Titan X launches back in 2013/2014.
I still want to know how so many gamers can afford these monsters. Gamers aren't exactly known for raking in the dough.
 

Senua

Member
Finally, I have ordered a Ryzen 5600X and an RTX 3060 Ti 😃
It will be a huge upgrade coming from a Ryzen 1200 and a GTX 970.
I was so tempted to do the same, but I may as well wait for the next-gen 60-class card, as there aren't many games I'm dying to play that my 1060 can't do a decent enough job with atm.
 
I still want to know how so many gamers can afford these monsters.
It depends where you live and what you do for a living. In some countries people earn €400 per month, so from that perspective even $200-300 GPUs are extremely expensive. If, however, you earn €2500 (the average salary in some countries), then even a high-end GPU is affordable. You think $1000 for a GPU is expensive, but there are people who can pay $1000 for a bottle of wine.
 

Rbk_3

Member
A 4090-based system (whole system) will likely peak somewhere in the 800W range.
A 12900K doesn't eat anywhere near as much power while gaming as fearmongers would have you think.

But let's just say some game comes out that uses all P and E cores and stresses them out the ass.

Some basic, albeit speculative, maths:
12900K - 170W
RTX 4090 - 650W
Other stuff - 80W

Total system completely maxed out: approx 900W.

I'm assuming you have an 80+ rated power supply; I don't really see how or why it wouldn't handle you running Prime and Furmark at the same time.

Note that realistically your 12900K will never actually max out, and you should absolutely undervolt your 4090, so a 1000W 80+ unit should be easy work.
Remember also that the likes of MSI have stupid power limits on their cards that make them eat a shit-ton of wattage for an extra 5 fps.
It just looks good for them to say they have the fastest RTX, yayaya.

I have a 12900KS and a 1000W PSU, so I am hoping you're right. I just upgraded from 850W to 1000W last year and I'm kicking myself for not going at least 1200.
 

mitchman

Gold Member
I have a 12900KS and a 1000W PSU, so I am hoping you're right. I just upgraded from 850W to 1000W last year and I'm kicking myself for not going at least 1200.
A 1000W unit might be able to handle the transient power-draw spikes NVIDIA is notorious for on the 4080 etc. See the video posted above.
 

skneogaf

Member
I reckon I'll be fine with my 3090 Ti. I may try to undervolt it soon; it's plenty fast enough, and it's the noise of these GPUs nowadays that's the real problem.
 

OZ9000

Banned
Most normal gamers actually have jobs and save money.
I think these things are easier if you don't have any major financial commitments.

I'll be purchasing a house at the end of the year, so I'll have to cut back on gaming and the urge to buy overpriced PC hardware.

I hope the RTX 4070 won't be absurdly priced. If it sells for the same RRP as the 3070 I'll be happy.
 

DenchDeckard

Moderated wildly
I have a 12900KS and a 1000W PSU, so I am hoping you're right. I just upgraded from 850W to 1000W last year and I'm kicking myself for not going at least 1200.

Sounds just like me lol. I thought 1000 watts would easily be enough...now I'm kicking myself.
 

tusharngf

Member

NVIDIA GeForce RTX 4090 to feature 2520 MHz boost clock, almost 50% higher than RTX 3090​

Kopite7kimi has an update on RTX 4090 specs.



According to the leaker, the NVIDIA RTX 4090 now apparently has final base and boost clocks. The RTX 4090 is said to run a 2235 MHz base and 2520 MHz boost, with a 2750 MHz actual (in-game) clock.

Those numbers are impressive compared to the existing RTX 3090 SKU. The base clock alone is a 60% increase (vs 1395 MHz) and the boost clock is 49% higher (vs 1695 MHz). That's indeed a noticeable upgrade over the Ampere series.



RUMORED NVIDIA GeForce RTX 40 Series Specs
| VideoCardz.com | GeForce RTX 4090 | GeForce RTX 4080 | GeForce RTX 4070 |
| Architecture | Ada (TSMC N4) | Ada (TSMC N4) | Ada (TSMC N4) |
| GPU | AD102-300 | AD103-300 | AD104-275 |
| Board Number | PG139-SKU330 | PG139-SKU360 | PG141-SKU341 |
| SMs | 128 | 80 | 56 |
| CUDA Cores | 16384 | 10240 | 7168 |
| Base Clock | 2235 MHz | TBC | TBC |
| Boost Clock | 2520 MHz | TBC | TBC |
| Memory | 24 GB G6X | 16 GB G6X | 10 GB G6 |
| Memory Bus | 384-bit | 256-bit | 160-bit |
| Memory Speed | 21 Gbps | 21 Gbps | 18 Gbps |
| Bandwidth | 1008 GB/s | 676 GB/s | 360 GB/s |
| TDP | ~450W | ~420W | ~300W |
| Launch Date | September-October 2022 | October-November 2022 | November-December 2022 |
Source: @kopite7kimi
by WhyCry

Source: https://videocardz.com/newz/nvidia-...hz-boost-clock-almost-50-higher-than-rtx-3090
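For context on the thread title: FP32 throughput is CUDA cores × 2 FLOPs per clock × clock speed. A quick check with the rumored numbers, assuming Ada keeps Ampere's 2×FP32 per CUDA core per clock, and noting the full AD102 die is believed to carry 18432 cores versus the 4090's rumored 16384:

```ts
// FP32 throughput estimate: cores * 2 FLOPs per clock * clock (GHz) / 1000 = TFLOPS
const tflops = (cudaCores: number, clockGHz: number) => (cudaCores * 2 * clockGHz) / 1000;

console.log(tflops(16384, 2.52).toFixed(1)); // ~82.6 TFLOPS at the rumored boost clock
console.log(tflops(16384, 2.75).toFixed(1)); // ~90.1 TFLOPS at the rumored "actual" clock
console.log(tflops(18432, 2.75).toFixed(1)); // ~101.4 TFLOPS for a full AD102 at the same clock
```

So on these rumors the 4090 itself falls short of 100 TFLOPS; it's the full AD102 part from the headline that would break it.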
 

DenchDeckard

Moderated wildly

NVIDIA GeForce RTX 4090 to feature 2520 MHz boost clock, almost 50% higher than RTX 3090​

Kopite7kimi has an update on RTX 4090 specs.



According to the leaker, the NVIDIA RTX 4090 now apparently has final base and boost clocks. The RTX 4090 is said to run a 2235 MHz base and 2520 MHz boost, with a 2750 MHz actual (in-game) clock.

Those numbers are impressive compared to the existing RTX 3090 SKU. The base clock alone is a 60% increase (vs 1395 MHz) and the boost clock is 49% higher (vs 1695 MHz). That's indeed a noticeable upgrade over the Ampere series.



RUMORED NVIDIA GeForce RTX 40 Series Specs
| VideoCardz.com | GeForce RTX 4090 | GeForce RTX 4080 | GeForce RTX 4070 |
| Architecture | Ada (TSMC N4) | Ada (TSMC N4) | Ada (TSMC N4) |
| GPU | AD102-300 | AD103-300 | AD104-275 |
| Board Number | PG139-SKU330 | PG139-SKU360 | PG141-SKU341 |
| SMs | 128 | 80 | 56 |
| CUDA Cores | 16384 | 10240 | 7168 |
| Base Clock | 2235 MHz | TBC | TBC |
| Boost Clock | 2520 MHz | TBC | TBC |
| Memory | 24 GB G6X | 16 GB G6X | 10 GB G6 |
| Memory Bus | 384-bit | 256-bit | 160-bit |
| Memory Speed | 21 Gbps | 21 Gbps | 18 Gbps |
| Bandwidth | 1008 GB/s | 676 GB/s | 360 GB/s |
| TDP | ~450W | ~420W | ~300W |
| Launch Date | September-October 2022 | October-November 2022 | November-December 2022 |
Source: @kopite7kimi
by WhyCry

Source: https://videocardz.com/newz/nvidia-...hz-boost-clock-almost-50-higher-than-rtx-3090


The CUDA core count gives the 4090 a huge jump over the 4080. 450 watt TDP, too. I should be OK with my 1000 watt PSU....

...am I actually contemplating getting a 4090 here....
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
The CUDA core count gives the 4090 a huge jump over the 4080. 450 watt TDP, too. I should be OK with my 1000 watt PSU....

...am I actually contemplating getting a 4090 here....
Yup.
The 3080 was a golden goose by being based on the top chip.
In prior gens the xx80, and sometimes even the xx70, were on the second-largest chip, so the jump to the Ti and Titan on the largest chip was mega.
Nvidia is just going back to the old days where the xx80 and the tier above weren't just an overclock apart.

For 3080 owners the real options are a 4090 or skipping a generation.

I'm hoping demand is low for the 4090 and the MSRP isn't 2000 dollars.....I can squeeze my budget to its limits if the thing is reasonably priced "lol".
Otherwise I'll hold out till 1440p60 "max" settings is off the table.
 