
Nvidia RTX 30XX |OT|

The Founders Edition of the 3080 is up on Best Buy. You can't order it yet, but you can get a notification when it goes on sale.
Buying an FE is not a good idea though. So far, 3rd-party versions have been cooler, quieter, and cheaper at the end of the day. It looks really sexy though...
 

GHG

Member
Man, I have been going through some of the subreddits, Twitter, and gaming forums, and the hype is real lmao.

I don't think I have seen this much hype for a GPU, like, ever before.

If Apple gets a mega upgrade cycle every 3 years, then this is a giga cycle for Nvidia.

Nvidia is very much the Apple of the PC building world. They basically emulate Apple in almost everything they do, from the marketing to their product packaging.

I kind of feel sorry for AMD's GPU division, because they would now need to release a GPU that is orders of magnitude better in order to claw market share back.
 

Rentahamster

Rodent Whores
Buying an FE is not a good idea though. So far, 3rd-party versions have been cooler, quieter, and cheaper at the end of the day. It looks really sexy though...
Maybe. I might be wrong, but I think the FE editions this time around are reference designs + Nvidia putting in more effort to make them better.
 

Rbk_3

Member
Posting it here to get some help...


I posted in the PC parts/build thread



Hey guys, I am in Canada and would like to get an RTX 3080 at launch... in preparation for Cyberpunk 2077, of course. I have never purchased a GPU at launch...

Can anyone advise and answer some of my questions?

1. How do people get those Founders Edition cards in Canada? I am not interested in RGB-nonsense cards.
2. Are they (as in the RTX cards from Nvidia) available in limited numbers?
3. Are they easy to buy, or are nerds hitting F5 with both hands?


Thanks

Buy from Nvidia.com. They'll cost approximately the US price at the exchange rate plus tax, and shipping is $52 CAD.
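As a rough sketch of that landed cost: the $699 USD figure is the 3080 FE's launch MSRP, but the exchange rate and tax rate below are illustrative assumptions only — plug in the day's rate and your own province's tax.

```python
# Rough landed-cost estimate for an FE card ordered from Nvidia.com to Canada.
# Assumptions: $699 USD MSRP (3080 FE), hypothetical FX rate, Ontario HST as an example.
usd_msrp = 699.00
fx_cad_per_usd = 1.32     # hypothetical rate; check the actual rate when ordering
tax_rate = 0.13           # example: Ontario HST; varies by province
shipping_cad = 52.00      # flat shipping, from the post above

total_cad = usd_msrp * fx_cad_per_usd * (1 + tax_rate) + shipping_cad
print(f"~${total_cad:.2f} CAD")  # roughly $1095 with these assumptions
```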
 

nemiroff

Gold Member
Interesting. What would be your choices as alternatives against the FE?

I just wanted to say that you should take everything you read and hear right now with a grain of salt until we have proper reviews. The FE is different this time around; it looks like it's better designed/engineered than previous FEs.
 

saintjules

Gold Member
I just wanted to say that you should take everything you read and hear right now with a grain of salt until we have proper reviews. The FE is different this time around; it looks like it's better designed/engineered than previous FEs.

Of course - appreciate the thought. I'm just asking for their thoughts on what alternatives they would look at. I'm already prepared for the FE.
 

Amey

Member
Pre-order Cancelled

 

Rikkori

Member
Same CPU (9900K). Yup AMD can definitely do this.



Rumours of a 3060 Ti/Super (= 2080 Super?). The 3070 would only be 10-15% faster than it. I think it's clear now the 3070 is not quite at the level of a 2080 Ti.

 

Eliciel

Member
So what is AMD going to do about DLSS 2.0 + RT capabilities? Just lowering the price of their Big Navi and positioning it between the 3070 and 3080 doesn't seem enough to me. What is the "me too" value proposition when it comes to this?
 

Rikkori

Member
So what is AMD going to do about DLSS 2.0 + RT capabilities? Just lowering the price of their Big Navi and positioning it between the 3070 and 3080 doesn't seem enough to me. What is the "me too" value proposition when it comes to this?
Lower price and/or more performance and/or more VRAM, and maybe their very own software tricks? Let's not forget they actually started the current gen with a bang and had a better software stack than Nvidia until the latter got DLSS 2.0 to work. Which, as cool as it is, is still very rare and inferior to native in terms of IQ (even if by a smidge - no pun intended).

Isn't this a super CPU intensive game? Waiting to see benchmarks from games that really push GPUs to the brink.
Yes and no. This is an unbelievably well scaling game and there's 0% chance it's CPU-bottlenecked at 4K crazy w/ 4x MSAA, least of all with a 9900K. Of course that doesn't mean it's representative of the wider AAA space. This is basically a high-tech indie RTS, so..
 
Yes and no. This is an unbelievably well scaling game and there's 0% chance it's CPU-bottlenecked at 4K crazy w/ 4x MSAA, least of all with a 9900K. Of course that doesn't mean it's representative of the wider AAA space. This is basically a high-tech indie RTS, so..

So we have a likely worst-case scenario in which a stock 3080 beats a stock 2080Ti by around 25% and a best case scenario (Doom Eternal) where the advantage goes to around 35%.

I'm ok with that.
 

Rikkori

Member
So we have a likely worst-case scenario in which a stock 3080 beats a stock 2080Ti by around 25% and a best case scenario (Doom Eternal) where the advantage goes to around 35%.

I'm ok with that.
Absolutely. Amusingly for me, it's kinda in line with the charts I was making before the announcement, so any extra over 25% is gravy. Especially at the lower price.

 

CrustyBritches

Gold Member
Article on VideoCardz about new info from Kopite concerning 3060ti/3060 Super.

Summary:
- VCZ expects the 3060 by Nov.
- 2 models based on GA106 and GA104, the latter being "Ti/Super"
- 3060 Ti/Super has 8GB GDDR6 (non-X)
- Kopite datamined it and tweeted about a bunch of models
- VCZ says Super variants weren't expected this year, but next year as an Ampere refresh with higher-clocked GDDR6X
- VCZ says "Ti" models are expected to land right after Big Navi launches.
- 3 models mentioned: 3080 Ti, 3080 20GB, 3070 16GB

Hot dang, I really hope that 3070 16GB will be mine.
 


nemiroff

Gold Member
Maybe. I might be wrong, but I think the FE editions this time around are reference designs + Nvidia putting in more effort to make them better.

This time the FE card is a custom design throughout, and the board is only available to Nvidia themselves. They also have a reference design, but as far as I know, nobody has acquired it from Nvidia yet.
 

ZywyPL

Banned

2.1GHz translates to 44TFlops, yikes!
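For anyone checking the math: peak FP32 throughput is 2 FLOPs per CUDA core per clock, so assuming that figure refers to the 3090's 10496 CUDA cores, a quick sanity check:

```python
# Peak FP32 TFLOPS = 2 FLOPs per CUDA core per clock * core count * clock (GHz) / 1000
cuda_cores = 10496   # RTX 3090 (assumed; the post doesn't name the card)
clock_ghz = 2.1      # the clock quoted above

tflops = 2 * cuda_cores * clock_ghz / 1000
print(f"{tflops:.1f} TFLOPS")  # ≈ 44.1
```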
 

Kenpachii

Member

2.1GHz translates to 44TFlops, yikes!

Yea that thing is going to be a beast.
 

Ascend

Member

Not unexpected. It was the first thing I thought of when I saw the compute power. And I'm a 'casual' miner.
 

Nydus

Member
I don't really see where the 3080 Ti would slip in. Yeah, it will have more VRAM, but the gap between the 3080 and the 3090 doesn't seem big enough to justify another tier. Between the 3070 and 80 we have a gap of around 3,000 CUDA cores plus GDDR6X; the 80 and 90 don't even have 2,000 between them. A 3080 Ti with ~9,500 CUDA cores won't really be much faster than the 3080. If it costs only 799€ it could be fine, but I bet it's 899€, and that's just not a good buy. The 3080 almost seems too good :/
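Those gaps can be checked against Nvidia's published core counts (a quick sketch; the numbers below are the official Ampere launch specs):

```python
# Official Ampere launch CUDA core counts (FE cards)
cuda_cores = {"RTX 3070": 5888, "RTX 3080": 8704, "RTX 3090": 10496}

gap_70_to_80 = cuda_cores["RTX 3080"] - cuda_cores["RTX 3070"]  # 2816 (~3000, as above)
gap_80_to_90 = cuda_cores["RTX 3090"] - cuda_cores["RTX 3080"]  # 1792 (under 2000)
print(gap_70_to_80, gap_80_to_90)  # 2816 1792
```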
 
Ya know, everything was lined up for me to get the 3090, and I'm excited to get a new GPU. But having a baby that literally just turned 8 weeks old, it's really hard for me to justify the 3090. Plus I think it makes my wife feel kinda guilty that I'm choosing the lesser card, which makes me look like super dad putting my priorities in order. Guilt BJ's incoming.

Kind of wish they had made a 3080 Ti with more VRAM and maybe a higher clock/CUDA count. Think what I'll do is buy the 3080 and then sell it whenever they eventually announce the Ti. Figured even with the 10GB of VRAM, just the performance improvement of the 3080 over the 1080 Ti is gonna leave me impressed. I mostly play at 3440x1440 120Hz, so if I can get that 80fps I'll be pretty happy, to be honest. I don't think I could wait another 6 mos. for a Ti, but I'm sure I'll have no issues selling the 3080 when that time comes. Glad I held out though; I'd be so pissed if I was a recent 20xx buyer.

If you bought a 20xx series card when it first came out, you more or less held the performance crown for at least 2 years. Still, the 1080 Ti was the best GPU purchase I've ever made, and I'm so glad I spent the extra dosh at the time for it.
 

Chiggs

Gold Member
I'm concerned about the length of these partner 3090s. I might have to go with the Founders Edition, unfortunately. I basically have 13 inches to play around with. [INSERT JOKE HERE]
 
Question for everybody.

So in looking at new TVs, it seems there are 3 different types of VRR: G-Sync, FreeSync, and VRR over HDMI. Obviously the 3080 will get G-Sync support (and I think that basically means FreeSync support too, right?), but how does VRR over HDMI work for PC?

My current TV is supposed to get an update for VRR at some point, and I was wondering if I could make use of it on my PC. My monitor supports FreeSync, but I run an HDMI cable to my TV as well for certain games.
 

2.1GHz translates to 44TFlops, yikes!


So... Conclusions:

RTX series cards cannot go much higher than 1.7 GHz, while RDNA2 should have no trouble hitting at least 2 GHz, possibly more than 2.2 GHz.

But what about Ascend's 1.7 GHz max boost... :messenger_tears_of_joy:
 

Ascend

Member



But what about Ascend's 1.7 GHz max boost... :messenger_tears_of_joy:
I'll say it again. There's a reason the boost of all three cards (RTX 3070/80/90) is rated to be around the 1.7GHz mark. Additionally, the RTX 3080 and RTX 3090 are both well above 300W, as per nVidia specs. And why wouldn't they clock the RTX 3070 higher if there is so much headroom?

If they can indeed reach 2.1 GHz, well, good on them. Considering the already atrocious supply of these cards, good luck getting such a golden sample. Because that's what it would need to be to be able to reach those clocks without catching fire, if nVidia's own specs are accurate.
 

VFXVeteran

Banned
I'll say it again. There's a reason the boost of all three cards (RTX 3070/80/90) is rated to be around the 1.7GHz mark. Additionally, the RTX 3080 and RTX 3090 are both well above 300W, as per nVidia specs. And why wouldn't they clock the RTX 3070 higher if there is so much headroom?

If they can indeed reach 2.1 GHz, well, good on them. Considering the already atrocious supply of these cards, good luck getting such a golden sample. Because that's what it would need to be to be able to reach those clocks without catching fire, if nVidia's own specs are accurate.

You started out very respectable. I thought to myself, here is a guy that got caught with his pants down and will actually admit he underestimated the silicon in the 3000-series cards despite him rooting for AMD. But nope, he moved the goalposts...
 