
NVIDIA Reportedly Planning GeForce RTX 3080 SUPER & GeForce RTX 3070 SUPER Graphics Cards

Bullet Club

Banned

NVIDIA Reportedly Planning GeForce RTX 3080 SUPER & GeForce RTX 3070 SUPER Graphics Cards​

We have seen a ton of NVIDIA GeForce RTX 30 series graphics card configurations go through the rumor mill only to be canceled soon after. It looks like NVIDIA is still deciding what to do next with its Ampere lineup, and as such, new rumors are emerging from Kopite7kimi (via Videocardz), according to whom NVIDIA is preparing two brand new SKUs which might possibly be the GeForce RTX 3080 SUPER & the GeForce RTX 3070 SUPER.

NVIDIA GeForce RTX 3080 SUPER & GeForce RTX 3070 SUPER Reportedly Under Consideration, Could These Feature 16 GB GDDR6(X) VRAM?​

For starters, the NVIDIA GeForce RTX 3080 SUPER & the GeForce RTX 3070 SUPER are not going to affect the upcoming card lineup, which includes the GeForce RTX 3080 Ti & the GeForce RTX 3060. The GeForce RTX 3070 Ti could be a whole different story since we have heard about this SKU for a while now, but nothing has materialized aside from some rumored specifications. The RTX 3080 Ti and RTX 3070 Ti have also been canceled and relaunched with different specs a couple of times now.

The GeForce RTX 3080 Ti is expected to be introduced earlier than the RTX 3070 Ti, so the status of the latter SKU is not confirmed yet. At the same time, one thing to note about the GeForce RTX 3080 Ti and GeForce RTX 3060 is that both of them feature much higher memory capacities than their predecessors. The RTX 3080 Ti replaces the RTX 2080 Ti with 9 GB more VRAM (11 GB vs 20 GB) and the RTX 3060 replaces the RTX 2060 with twice the memory buffer (6 GB vs 12 GB). The same might just be the case with the SUPER series that NVIDIA has planned.

NVIDIA GeForce RTX 3080 SUPER & RTX 3070 SUPER

We don't have any concrete specifications for these two chips, but going back to earlier rumors, NVIDIA had initially planned higher-VRAM variants of both the RTX 3080 & RTX 3070 graphics cards. The RTX 3080 was expected to feature 12-16 GB of VRAM, while the RTX 3070 had been expected to feature similar amounts. However, that plan was scrapped in favor of NVIDIA going all out with its RTX 3090 design and keeping the rest of the lineup capped at 10 GB of VRAM due to the limited availability of GDDR6(X) modules.



By the time the RTX 3080 Ti launches, the memory supply should be far better. NVIDIA itself has also confirmed that supply would improve by the end of the first quarter of 2021, so that is probably when NVIDIA can devise a better plan to re-release its GeForce RTX lineup in SUPER fashion.

For this purpose, the GeForce RTX 3080 SUPER and GeForce RTX 3070 SUPER could be the first entrants in the lineup. Similar to the previous generation, NVIDIA wouldn't want to touch the high-end models since they will be positioned right against the competition. It's the RTX 3080 and below lineups such as the RTX 3070 and RTX 3060 Ti which would require a SUPER refresh to tackle AMD's Radeon RX 6800 and RX 6700 series.



For that purpose alone, NVIDIA could end up featuring 16 GB of GDDR6(X) memory on both the RTX 3080 SUPER (GDDR6X) & the GeForce RTX 3070 SUPER (GDDR6). As for core configurations, they might also change, with the GeForce RTX 3070 SUPER expected to utilize the GA103S GPU while the GeForce RTX 3080 SUPER would retain the GA102 GPU.

However, one critical thing to consider here would be the prices. Whether NVIDIA offers a price cut on existing variants and ends their production to make room for the new cards, similar to what it did last generation, or keeps the existing cards and offers the SUPER variants at a slight premium remains to be seen. Lastly, don't expect the SUPER cards anytime soon, as the green team isn't done with the original lineup yet. We can probably expect an announcement around Computex or during the summer, which was also the case with the GeForce RTX 20 SUPER line.

NVIDIA GeForce RTX 30 Series 'Ampere' Graphics Card Specifications:​

| Graphics Card Name | RTX 3050 | RTX 3050 Ti | RTX 3060 | RTX 3060 Ti | RTX 3070 | RTX 3070 Ti? | RTX 3080 | RTX 3080 Ti? | RTX 3090 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| GPU Name | Ampere GA107 | Ampere GA106? | Ampere GA106? | Ampere GA104-200 | Ampere GA104-300 | Ampere GA102-150 | Ampere GA102-200 | Ampere GA102-250 | Ampere GA102-300 |
| Process Node | Samsung 8nm | Samsung 8nm | Samsung 8nm | Samsung 8nm | Samsung 8nm | Samsung 8nm | Samsung 8nm | Samsung 8nm | Samsung 8nm |
| Die Size | TBA | TBA | TBA | 395.2mm² | 395.2mm² | 628.4mm² | 628.4mm² | 628.4mm² | 628.4mm² |
| Transistors | TBA | TBA | TBA | 17.4 Billion | 17.4 Billion | 28 Billion | 28 Billion | 28 Billion | 28 Billion |
| CUDA Cores | 2304 | 3584 | 3840 | 4864 | 5888 | 7424 | 8704 | 10496 | 10496 |
| TMUs / ROPs | TBA | TBA | TBA | 152 / 80 | 184 / 96 | 232 / 80 | 272 / 96 | 328 / 112 | 328 / 112 |
| Tensor / RT Cores | TBA | TBA | TBA | 152 / 38 | 184 / 46 | 232 / 58 | 272 / 68 | 328 / 82 | 328 / 82 |
| Base Clock | TBA | TBA | TBA | 1410 MHz | 1500 MHz | TBA | 1440 MHz | TBA | 1400 MHz |
| Boost Clock | TBA | TBA | TBA | 1665 MHz | 1730 MHz | TBA | 1710 MHz | TBA | 1700 MHz |
| FP32 Compute | TBA | TBA | TBA | 16.2 TFLOPs | 20 TFLOPs | TBA | 30 TFLOPs | TBA | 36 TFLOPs |
| RT TFLOPs | TBA | TBA | TBA | 32.4 TFLOPs | 40 TFLOPs | TBA | 58 TFLOPs | TBA | 69 TFLOPs |
| Tensor-TOPs | TBA | TBA | TBA | TBA | 163 TOPs | TBA | 238 TOPs | TBA | 285 TOPs |
| Memory Capacity | 4 GB GDDR6? | 6 GB GDDR6? | 6 GB GDDR6? | 8 GB GDDR6 | 8 GB GDDR6 | 10 GB GDDR6X? | 10 GB GDDR6X | 20 GB GDDR6X | 24 GB GDDR6X |
| Memory Bus | 128-bit | 192-bit? | 192-bit? | 256-bit | 256-bit | 320-bit | 320-bit | 320-bit | 384-bit |
| Memory Speed | TBA | TBA | TBA | 14 Gbps | 14 Gbps | TBA | 19 Gbps | 19 Gbps | 19.5 Gbps |
| Bandwidth | TBA | TBA | TBA | 448 GB/s | 448 GB/s | TBA | 760 GB/s | 760 GB/s | 936 GB/s |
| TGP | 90W? | TBA | TBA | 180W? | 220W | 320W? | 320W | 320W | 350W |
| Price (MSRP / FE) | $149? | $199? | $299? | $399 US? | $499 US | $599 US? | $699 US | $899 US? | $1499 US |
| Launch (Availability) | 2021? | 2021? | 2021? | November 2020? | 29th October | Q4 2020? | 17th September | January 2021? | 24th September |
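For reference, the FP32 and bandwidth figures in the table above fall out of two standard formulas: FP32 TFLOPs = CUDA cores × 2 FLOPs per clock × boost clock, and bandwidth (GB/s) = (memory bus width ÷ 8) × effective data rate. A minimal sketch using the table's own numbers (the helper names are just for illustration; published figures are rounded):

```python
# Sketch: reproduce the FP32 and bandwidth columns from the table above.
# FP32 TFLOPs    = CUDA cores x 2 FLOPs/clock x boost clock
# Bandwidth GB/s = (memory bus width in bits / 8) x effective data rate in Gbps

def fp32_tflops(cuda_cores: int, boost_mhz: int) -> float:
    return cuda_cores * 2 * boost_mhz * 1e6 / 1e12

def bandwidth_gb_s(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps

# (CUDA cores, boost MHz, bus width in bits, memory speed in Gbps) from the table
cards = {
    "RTX 3060 Ti": (4864, 1665, 256, 14.0),
    "RTX 3070":    (5888, 1730, 256, 14.0),
    "RTX 3080":    (8704, 1710, 320, 19.0),
    "RTX 3090":    (10496, 1700, 384, 19.5),
}

for name, (cores, boost, bus, rate) in cards.items():
    print(f"{name}: {fp32_tflops(cores, boost):.1f} TFLOPs, "
          f"{bandwidth_gb_s(bus, rate):.0f} GB/s")
```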

Source: WCCFTech
 

Ribi

Member
It's so strange how the 30 series hasn't really come out yet and it's already being replaced with versions you'd want to buy instead. At this point just sell the Ti and Super versions... It's not like you'd lose money, since the older models were never manufactured to begin with.

What are the chances that Nvidia replaces everything it has on the market with only 30 series cards? (No 20s to be sold at all.)
 

STARSBarry

Gold Member
It's so strange how the 30 series hasn't really come out yet and it's already being replaced with versions you'd want to buy instead. At this point just sell the Ti and Super versions... It's not like you'd lose money, since the older models were never manufactured to begin with.

What are the chances that Nvidia replaces everything it has on the market with only 30 series cards? (No 20s to be sold at all.)
This is because the on-paper launch of AMD's cards had them running scared, especially when their models all came with at most 10GB of VRAM outside of their lol 3090, while the competition was touting 16GB of RAM across their lineup. They were scrambling to deliver more despite the 3000 series being solid cards. Then the actual cards launched and the real-world benchmarks came in; AMD's cards were solid but not as competition-crushing as assumed, so it was back to the drawing board again, this time knowing they need an improvement, especially in memory numbers, but also lowering the scale and cost of the improvement as it was no longer needed.

I suspect the Super will be akin to the small 2080 vs 2080S jump rather than the huge 2070 to 2070S jump... the one to watch is the 3080 Ti: the 3080 and 3090 are so close performance-wise that any meaningful improvement with the 3080 Ti could very realistically leapfrog the 3090, making it utterly pointless given its assumed cost difference.
 

nightmare-slain

Gold Member
So how are they slotting a 3080S and a 3080 Ti between the 3080 and 3090 which are 10-15% apart?
vram.

3080 = 10GB
3090 = 24GB

the 3080 Ti is likely gonna be 16GB. the 3080S could be 20GB. or if the 3080 Ti is 12GB then the 3080S could be 16GB.
 

rofif

Can’t Git Gud
Ass. That's how I feel as a PC gamer.
Hunting for a 3080 FE for a month (I got it a month after release), always having the newest components, and the damn PC is like 3k...
And you are always behind. Always f'd. Glad the PS5 should come this month. I am kinda sick of PC gaming sometimes.
 

diffusionx

Gold Member
Nvidia should be planning to actually ship enough cards instead of releasing new ones they can’t stock.

Ass. That's how I feel as a PC gamer.
Hunting for a 3080 FE for a month (I got it a month after release), always having the newest components, and the damn PC is like 3k...
And you are always behind. Always f'd. Glad the PS5 should come this month. I am kinda sick of PC gaming sometimes.

PS5 is already behind though.
 

BigTnaples

Todd Howard's Secret GAF Account
Didn’t realize the 3080ti was all but confirmed?



Awesome.

I’ll gladly wait for that instead of trying to get a 3080 or 3090 right now.
 

draliko

Member
Every time I itch to get back to PC gaming, I remember the hassle of getting a decent card. I've had more luck securing a PS5, an XSX, and a Quest 2 than a GPU. Seems like I'll remain on my 5700 XT for a long time.
 

Celcius

°Temp. member
Maybe I’ll just keep my 1080 Ti until the RTX 4000 series cards... they’ll probably be out by the time I’m able to get my hands on a card.
 

regawdless

Banned
Curious how much faster the 3080S will be compared to standard at 1440p. The 3090 is disappointing in that regard.
 

greencoder

Member


But I'm glad I'm a patient gamer (and buyer).
 

Md Ray

Member
What a mess. From the get-go, NVIDIA should have given the 3060 Ti and 3070 at least 10-11 GB of VRAM, or 12 GB at most.

And 16 GB minimum in 3080. The 3090 could have remained as is.
 

TheMan

Member
It’s good that tech continues to progress but I’m just fine with my 2070s. There just aren’t enough games out there to justify the cost and frustration of trying to upgrade to something better. Not yet anyway
 

supernova8

Banned
This is because the on-paper launch of AMD's cards had them running scared, especially when their models all came with at most 10GB of VRAM outside of their lol 3090, while the competition was touting 16GB of RAM across their lineup. They were scrambling to deliver more despite the 3000 series being solid cards. Then the actual cards launched and the real-world benchmarks came in; AMD's cards were solid but not as competition-crushing as assumed, so it was back to the drawing board again, this time knowing they need an improvement, especially in memory numbers, but also lowering the scale and cost of the improvement as it was no longer needed.

I suspect the Super will be akin to the small 2080 vs 2080S jump rather than the huge 2070 to 2070S jump... the one to watch is the 3080 Ti: the 3080 and 3090 are so close performance-wise that any meaningful improvement with the 3080 Ti could very realistically leapfrog the 3090, making it utterly pointless given its assumed cost difference.

I'm definitely inclined to agree with you. All you need to do is look at the timing of previous GPU launches:

GTX 1080 launch: May 2016
GTX 1080 Ti launch: March 2017
10 month gap

RTX 2080 launch: September 2018
RTX 2080 Ti launch (equivalent to "RTX 2090"): also September 2018
RTX 2080 Super launch: July 2019
10 month gap

RTX 2070 launch: October 2018
RTX 2070 Super launch: July 2019
9 month gap

RTX 3080 launch: September 2020
RTX 3080 SUPER rumored launch: ??

Given how competitive AMD's cards are (sure, there's no competition in ray tracing, but a DLSS competitor is apparently coming), I wouldn't put it past NVIDIA to release the SUPER cards 6 or 7 months after launch (i.e. April/May 2021). I reckon gone are the days when NVIDIA could milk its architecture for an entire year unopposed. It's great for the consumer. AMD put in a good innings considering they came from the 5700 XT, and I think NVIDIA still has the better product, but once supply issues are out of the way we'll be getting better products faster than ever before. The war is back on and it's glorious!
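For anyone who wants to double-check those gaps, a throwaway sketch using the launch months listed above (each date approximated to the 1st of the listed month):

```python
from datetime import date

# Launch months as listed in the post above.
pairs = [
    ("GTX 1080",  date(2016, 5, 1),  "GTX 1080 Ti",    date(2017, 3, 1)),
    ("RTX 2080",  date(2018, 9, 1),  "RTX 2080 Super", date(2019, 7, 1)),
    ("RTX 2070",  date(2018, 10, 1), "RTX 2070 Super", date(2019, 7, 1)),
]

for base, base_dt, refresh, refresh_dt in pairs:
    # Whole calendar months between the base card and its refresh.
    gap = (refresh_dt.year - base_dt.year) * 12 + (refresh_dt.month - base_dt.month)
    print(f"{base} -> {refresh}: {gap} month gap")
```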
 

Deleted member 17706

Unconfirmed Member
For people complaining about the 10GB in the 3080, which games are being bottlenecked by it that would otherwise be running at 4K60+?
 

nemiroff

Gold Member
For people complaining about the 10GB in the 3080, which games are being bottlenecked by it that would otherwise be running at 4K60+?

None of course. And no use asking here anyway; you're not going to get much else than notions and feelings (speculation is of course fine on a forum, but it's the definitive-authority speak I have a problem with). If we really want to figure this out, what we need to do is ask a bunch of high-profile, actual professional developers how they look at the future of development and resource balance. I mean, even the PS5's and XSX's place in the ecosystem has a say in this. As for my speculation, will there perhaps be outlier developers taking on the challenge to push through the roof so that a handful of people get to use their extra VRAM? Yeah sure, probably. But I suspect it won't be many, and it will probably just end up being something low-effort for the five minutes of fame. And not least, the way things are going with assets streaming through high-bandwidth channels, the need for tons and tons of VRAM kinda seems to be over anyway, no?
 

Mister Wolf

Gold Member
None of course. And no use asking here anyway; you're not going to get much else than notions and feelings (speculation is of course fine on a forum, but it's the definitive-authority speak I have a problem with). If we really want to figure this out, what we need to do is ask a bunch of high-profile, actual professional developers how they look at the future of development and resource balance. I mean, even the PS5's and XSX's place in the ecosystem has a say in this. As for my speculation, will there perhaps be outlier developers taking on the challenge to push through the roof so that a handful of people get to use their extra VRAM? Yeah sure, probably. But I suspect it won't be many, and it will probably just end up being something low-effort for the five minutes of fame. And not least, the way things are going with assets streaming through high-bandwidth channels, the need for tons and tons of VRAM kinda seems to be over anyway, no?

Godfall developers talked that shit about using tons of vram and the 6800XT doesn't even perform better than the 3080 running the game at 4K.
 

Deleted member 17706

Unconfirmed Member
Godfall developers talked that shit about using tons of vram and the 6800XT doesn't even perform better than the 3080 running the game at 4K.

Sounds about right for that game. Their entire goal seemed to be cramming as much next gen visual tech into a single package that they could rush out for the launch of the next-gen consoles. Optimization and making an actual good game were not on their list of priorities.
 

IntentionalPun

Ask me about my wife's perfect butthole
"I know there are some new SKUs recently, but I don't know their names yet, maybe 3070 Super and 3080 Super."

=

NVIDIA Reportedly Planning GeForce RTX 3080 SUPER & GeForce RTX 3070 SUPER Graphics Cards

lol
 

Md Ray

Member
For people complaining about the 10GB in the 3080, which games are being bottlenecked by it that would otherwise be running at 4K60+?
I'm primarily complaining about the 3060 Ti and 3070's 8GB VRAM. I have no problems with 3080. They should have come with at least 11GB VRAM (same as the 2080 Ti), and 3080 could have bumped it up to 16GB as a result of lower-tier cards coming with 11GB. And because 10GB on the 3080 would have looked stupid next to 3070 with 11GB.
 

KevinHelpUs

Neo Member
Considering how much demand outweighs supply, I'm not surprised that they are planning more permutations, but it seems odd that they are announcing (or leaking) more SKUs rather than commenting on getting cards produced for those who want them.
 

Deleted member 17706

Unconfirmed Member
I'm primarily complaining about the 3060 Ti and 3070's 8GB VRAM. I have no problems with 3080. They should have come with at least 11GB VRAM (same as the 2080 Ti), and 3080 could have bumped it up to 16GB as a result of lower-tier cards coming with 11GB. And because 10GB on the 3080 would have looked stupid next to 3070 with 11GB.

3060 Ti and 3070 are 1440p cards for which 8GB is more than fine. I don't think there's a situation where your frame rate is handicapped by 8GB VRAM in which it would otherwise be running over 60 FPS with maxed-out settings at 1440p. Go to 4K and it's a different story, perhaps, but you wouldn't be running maxed out at over 60 FPS in most games anyway, so the VRAM is not the bottleneck. Maybe it's an issue for folks who are fine with 30 FPS and want all of the eye candy turned on?
 
Some people really like fabricating an issue where there isn't one. You'll be more than fine with 10 GB until the 4080 comes out in 18 months or so. Games are being made not to exceed 8GB. Maybe once real next-gen games start coming out in 2 years, then and only then, you might need more than 10 GB. Plus DLSS lowers VRAM usage as well.
 

diffusionx

Gold Member
They really shot themselves in the foot this generation with that 3090.

Zero room to diversify the lineup in between the midrange card and the ultra high-end card.
I kind of disagree. They're charging over 2x the price for 15% higher performance; even if the 3080 Ti comes out with only 10% higher performance for $999, people will buy it for the extra RAM (whether it's needed or not). The 3090 might die, but they just squeezed a lot of money out of people in the meantime, and it's not like these Titan-level cards were huge sellers anyway (the 3090 is probably the best selling).
 

njean777

Member
How are they even doing this when people cannot even grab a 3080? Nvidia is really weird, but I guess it is par for the course.
 

adamosmaki

Member
Well done, Nvidia. Instead of focusing on releasing $200-400 GPUs, make more 70 and 80 series GPUs with nearly the same performance, as if the GPU shortage of those chips isn't already bad enough.
 

Kuranghi

Member
For people complaining about the 10GB in the 3080, which games are being bottlenecked by it that would otherwise be running at 4K60+?

This is what I was going to ask. I played the RE3 remake at 8K with maximum texture settings for one of my last runs (it took ~6GB max at 4K) and it took up ~7.5GB maximum. That game has insanely detailed textures at maximum settings; they still hold up at 8K.

Maybe it's to do with swapping data in and out of VRAM: if you are near the limit of your card (8GB on my GTX 1080 in my case) and it tries to swap a lot of textures, it might cause stutters because it exceeds the VRAM during that time. I don't really know how these things work except on a basic level though.
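If you'd rather measure than guess, here is a minimal sketch of a VRAM logger using NVIDIA's NVML bindings. It assumes the pynvml package is installed and that GPU 0 is the card running the game; note it reports memory allocated on the card, which tends to overstate what a game actually needs.

```python
# Minimal VRAM logger: run alongside a game and watch peak allocation.
# Requires: pip install pynvml (and an NVIDIA GPU with recent drivers).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; adjust if needed

try:
    peak_gb = 0.0
    while True:
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)  # bytes: .total/.used/.free
        used_gb = info.used / 1024**3
        peak_gb = max(peak_gb, used_gb)
        print(f"VRAM used: {used_gb:.2f} GB (peak {peak_gb:.2f} GB)")
        time.sleep(5)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```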
 

Ascend

Member
Apparently the mining shit is taking off again and cards with more than 4GB of VRAM are being targeted.

All 3080s.

[Image: mining rig built with 78 RTX 3080 cards]
nVidia sold a lot of these graphics cards directly to miners, at the cost of availability for gamers.

 

AV

We ain't outta here in ten minutes, we won't need no rocket to fly through space
They really shot themselves in the foot this generation with that 3090.

Zero room to diversify the lineup in between the midrange card and the ultra high-end card.

Kind of. They shot themselves in the foot by pretending the 3090 was a gaming card, but people actually believed them, and it worked. Although they're far easier to buy than the others, because not everyone's that dumb/wealthy.

Now they have to release a card in a few months with the same (probably) performance for less than 2/3 of the price, less than six months after launch, and pretend the old card was always a gaming card.
 

Md Ray

Member
3060 Ti and 3070 are 1440p cards for which 8GB is more than fine.
Lead engine programmer @ id Software had this to say:



This statement from him came two months before the next-gen consoles launched, and he was already calling 8GB "low-end". Another hint as to why I think 8GB was a mistake, especially in the 3070, is the XSX's memory allocation for games: out of 16 GB, 13.5 GB is allocated to games alone. Of that 13.5 GB, 10 GB is the so-called "GPU optimal memory", in other words the video memory that runs at 560 GB/s; the other 3.5 GB is called "standard memory" and runs at slower speeds, acting as the system memory. Based on this, I think the minimum VRAM, at least for a 3070 that's going to run next-gen RT games, should have been 10-11 GB even if it's a 1440p card, as Billy Khan says: "Even at resolutions below 4K, higher VRAM is the way to go".
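To make the arithmetic in that split explicit, a tiny sketch (the 16 / 13.5 / 10 GB figures are from the post above; the 320-bit bus and 14 Gbps GDDR6 used to reproduce the 560 GB/s number are assumptions added here for illustration):

```python
# Series X memory split as described above.
TOTAL_GB = 16.0
GAME_GB = 13.5                           # available to games
GPU_OPTIMAL_GB = 10.0                    # fast "GPU optimal" pool
STANDARD_GB = GAME_GB - GPU_OPTIMAL_GB   # 3.5 GB slower "standard" pool
OS_RESERVED_GB = TOTAL_GB - GAME_GB      # 2.5 GB left for the OS

# Assumed bus configuration used to derive the quoted 560 GB/s figure.
BUS_WIDTH_BITS = 320
DATA_RATE_GBPS = 14.0
fast_pool_bandwidth = BUS_WIDTH_BITS / 8 * DATA_RATE_GBPS  # GB/s

print(f"OS reserve: {OS_RESERVED_GB} GB, game total: {GAME_GB} GB "
      f"({GPU_OPTIMAL_GB} GB fast + {STANDARD_GB} GB standard)")
print(f"Fast pool bandwidth: {fast_pool_bandwidth:.0f} GB/s")
```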

I don't think there's a situation where your frame rate is handicapped by 8GB VRAM in which it would otherwise be running over 60 FPS with maxed-out settings at 1440p. Go to 4K and it's a different story, perhaps, but you wouldn't be running maxed out at over 60 FPS in most games anyway, so the VRAM is not the bottleneck. Maybe it's an issue for folks who are fine with 30 FPS and want all of the eye candy turned on?
Yes, right now in current cross-gen games it's fine at 1440p. What will happen when those full-fledged next-gen RT games start to come out? And since RT is also said to be memory-intensive (heard from the same devs), I think gamers might have to lower texture quality below console-level on those 8GB cards to keep the performance up despite those GPUs being much faster than PS5/XSX GPUs, especially in RT.
 

Ascend

Member
"Allegedly"
nVidia always gets the excuses.


 
FFS, can someone just tell me when I'll be able to buy a 3060ti, or 3070, or 3070ti, or 3070 Super, or whatever the fuck in the $400-600 range I don't even care, without a huge hassle or getting completely ripped off?

What an absolute nightmare it is trying to buy one of this company's products. I know Nvidia don't care because they're the dominant partner in a duopoly when they aren't basically an outright monopoly, but Christ, surely it has to bite them in the ass at some point that every single person who ever tries to buy one of their products has a horrible experience?
 
nVidia always gets the excuses.



I couldn't care less. I'm not a fanboy to any billion dollar corporation. You stated something as fact by linking an article that stated allegedly.

I've seen your posts and I'm aware of your agenda. Again, I don't care.
 

Deleted member 17706

Unconfirmed Member
Lead engine programmer @ id Software had this to say:



This statement from him came two months before the next-gen consoles launched, and he was already calling 8GB "low-end". Another hint as to why I think 8GB was a mistake, especially in the 3070, is the XSX's memory allocation for games: out of 16 GB, 13.5 GB is allocated to games alone. Of that 13.5 GB, 10 GB is the so-called "GPU optimal memory", in other words the video memory that runs at 560 GB/s; the other 3.5 GB is called "standard memory" and runs at slower speeds, acting as the system memory. Based on this, I think the minimum VRAM, at least for a 3070 that's going to run next-gen RT games, should have been 10-11 GB even if it's a 1440p card, as Billy Khan says: "Even at resolutions below 4K, higher VRAM is the way to go".


Yes, right now in current cross-gen games it's fine at 1440p. What will happen when those full-fledged next-gen RT games start to come out? And since RT is also said to be memory-intensive (heard from the same devs), I think gamers might have to lower texture quality below console-level on those 8GB cards to keep the performance up despite those GPUs being much faster than PS5/XSX GPUs, especially in RT.


But he doesn't explain "why" it's the way to go and I doubt he can point to a single game that is being bottlenecked at 1440p by 8GB of VRAM.

Going forward, a 3070 almost certainly isn't going to be enough to last you the entire generation anyway if you want to max out games and get 60+ FPS at 1440p.

If the RTX 4070 (assuming it's more powerful than a 3090) comes with 8GB of VRAM, then I'll start to be concerned.
 