
Nvidia RTX 30XX |OT|

Patrick S.

Banned
Cool, some uninformed person from France bought my RTX Titan for more than I paid for it. I feel good.




Why buy prebuilt shit? How many watts does your PSU have? I'm sure there will be some extension cable, something like 2x 6-pin to 12-pin, but you can't run this on a 450W PSU....

Just wait till he claims a refund through PayPal and you lose both the card and the money 0:)
 
Recently I've started doubting whether the 3080 is such a worthwhile upgrade over my 1080 (non-Ti) because of the VRAM amount.

But let's say I buy it: what would be the worst-case scenario in 4 years' time if I'm mostly going to play at 4K 60fps? Having to lower textures from Ultra to V. High? Having to lower rendering quality to 95%?

I'm an enthusiast gamer but I'm not a techhead. Is the 3080 likely to last me 4 years?
 

RespawnX

Member



Seriously? I'm not going to spend a single minute playing with settings, but the 2080 Ti part is obviously a mix of medium and high settings. Additionally, the card is massively overclocked. A 2080 Ti does not reach 180-200% of 2080 performance. Neither in this world nor in hell. Maybe in LN2 land.
 

BluRayHiDef

Banned
Recently I've started doubting whether the 3080 is such a worthwhile upgrade over my 1080 (non-Ti) because of the VRAM amount.

But let's say I buy it: what would be the worst-case scenario in 4 years' time if I'm mostly going to play at 4K 60fps? Having to lower textures from Ultra to V. High? Having to lower rendering quality to 95%?

I'm an enthusiast gamer but I'm not a techhead. Is the 3080 likely to last me 4 years?
Get a 3090 or wait for the rumored 3080 20GB variant.
 

Rikkori

Member
Jesus Christ, only 28 FPS at native 4K in Fortnite on a 3080? Those RT effects must be sucking up a lot. Don't Epic settings at 4K run well above 60 FPS on a 2080 or something?

Not unexpected; UE4 with the full RTX treatment takes a lot of horsepower. Don't forget this classic:
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Pretty impressed to see Nvidia get greater transistor density on Samsung 8N than AMD got on their first two attempts on TSMC 7.

3080/3090 = 44.6 MT/mm^2
3070 = 44.3 MT/mm^2
5700/XT = 41 MT/mm^2
Radeon VII = 39.97 MT/mm^2
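Those density figures are easy to sanity-check from the commonly quoted transistor counts and die sizes. A minimal sketch, assuming the approximate public figures below (exact published counts vary slightly by source):

```python
# Rough sanity check of the density figures above.
# Transistor counts (in millions) and die areas (mm^2) are the
# commonly quoted public numbers; treat them as approximations.
dies = {
    "GA102 (3080/3090)": (28_000, 628.4),
    "GA104 (3070)": (17_400, 392.5),
    "Navi 10 (5700/XT)": (10_300, 251.0),
    "Vega 20 (Radeon VII)": (13_230, 331.0),
}

for name, (mtransistors, area_mm2) in dies.items():
    density = mtransistors / area_mm2  # millions of transistors per mm^2
    print(f"{name}: {density:.1f} MT/mm^2")
```

With these inputs, GA102 works out to about 44.6 MT/mm^2 and GA104 to about 44.3, matching the list above.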
 

DeaDPo0L84

Member
Didn't Jensen mention the FE cards light up? Also, will there be a review roundup before launch on the 17th? I know 4K is getting all the attention, but I'm curious about 1440p/120Hz.
 

INC

Member
Didn't Jensen mention the FE cards light up? Also, will there be a review roundup before launch on the 17th? I know 4K is getting all the attention, but I'm curious about 1440p/120Hz.

This is where my head's at. I couldn't care less about 4K; 1440p and VR are what I wanna see more attention on.
 
So what should we deduce from Ryan Smith's Final Words tweet? Sorry, I am illiterate on the tech behind GPUs.
I think it's interesting that the consoles have "dedicated blocks" for their speedy I/O, and Nvidia has RTX I/O. So... what will Big Navi have? Imagine if it's nothing... I wouldn't put it past AMD to fail to include their console tech on their PC video cards.
 

HeisenbergFX4

Gold Member
Prices :)


 
Prices :)


Nah, that's just Amazon doing its usual early-adopter stuff

https://www.overclockers.co.uk/Zotac-GeForce-RTX-3080-Trinity-10GB-GDDR6X-PCI-Express-Graphics-Card-GX-122-ZT.html
 

longdi

Banned
Damn, I'm still conflicted: 3080 now, or 3080S/Ti nine months later?

The 10GB limitation is pissing me off.

Will the 3080S/Ti have 20GB or 12GB?
Is the 3080S/Ti an upscaled 3080 (hence 20GB) or a downscaled 3090 (hence 12GB)?

If it's the latter, then the 3080 looks good to go.. :messenger_pensive:

Do you believe Nvidia will give us 20GB on a non-Titan-priced GPU?
 
Prices :)


Not likely to be the real final prices. You can preorder if you feel like it, and if the price doesn't adjust before it ships, just cancel. I just threw in a 3090 preorder; the worst thing that happens is I have to cancel it. Not a big deal.
 

longdi

Banned
How much do you really expect the resale value of a 3080 to drop in nine months, realistically? Assuming AMD doesn't bring out a card that competes with it.

I expect the 3080S/Ti to fill the $1099 price gap, so prices aren't a big factor. :messenger_loudly_crying:
I guess it's more the performance: will the S/Ti version give the same ~30% uplift expected of a Ti, especially with bigger VRAM? Next-gen games will get into full swing in late 2021, imo.

Times are... confusing and conflicting.
The 3080 is already using the big die, unlike recent x80 cards.. is there more headroom left? Will VRAM be even more critical?

While I like to buy once and forget for 3 years, you may be right on the resale thing. Ampere could be all about reselling now.
 
Last edited:

Rentahamster

Rodent Whores
I expect the 3080S/Ti to fill the $1099 price gap, so prices aren't a big factor. :messenger_loudly_crying:
I guess it's more the performance: will the S/Ti version give the same ~30% uplift expected of a Ti, especially with bigger VRAM? Next-gen games will get into full swing in late 2021, imo.

Times are... confusing and conflicting.
The 3080 is already using the big die, unlike recent x80 cards.. is there more headroom left? Will VRAM be even more critical?

While I like to buy once and forget for 3 years, you may be right on the resale thing. Ampere could be all about reselling now.
Save yourself the anguish and just buy a 3080 now, then sell it once the 3080 Super leaks start coming out.
 

BluRayHiDef

Banned
Damn, I'm still conflicted: 3080 now, or 3080S/Ti nine months later?

The 10GB limitation is pissing me off.

Will the 3080S/Ti have 20GB or 12GB?
Is the 3080S/Ti an upscaled 3080 (hence 20GB) or a downscaled 3090 (hence 12GB)?

If it's the latter, then the 3080 looks good to go.. :messenger_pensive:

Do you believe Nvidia will give us 20GB on a non-Titan-priced GPU?
Just get a 3090 and call it a day.
 

geordiemp

Member
Pretty impressed to see Nvidia get greater transistor density on Samsung 8N than AMD got on their first two attempts on TSMC 7.

3080/3090 = 44.6 MT/mm^2
3070 = 44.3 MT/mm^2
5700/XT = 41 MT/mm^2
Radeon VII = 39.97 MT/mm^2

All the nodes have a similar pitch of around 30 nm and similar densities, including Intel's.

The secret sauce is the gate technology: at TSMC 7nm the fin is supposedly 6 nm wide, and it's the gate material technology, as well as how it's manufactured, that gives the frequency and power/perf.

How good the Samsung node is will be seen in frequencies and power use.

7nm is just marketing, as is 8nm.
 
Last edited:

Agent_4Seven

Tears of Nintendo
After carefully examining the specs (RAM bandwidth and memory data rates) of the 2080 Super, 2080 Ti, 3080, and even the RTX Titan, I just cannot believe that the 3080 will deliver huge rasterization gains at native 1440p / 4K. And I'm talking about a 60-90 FPS difference (at the very least) for both min and max frame rates in non-Async-Compute / Vulkan games, of which there are very few, and that's assuming you're playing them, planning to play them, or care about them at all. Sure, you can make games load faster via RTX I/O, but that won't give you a massive boost in frame rates. Also, who cares if the 3000 series is 2x better at RT performance when 99.9% of games aren't even using RT or DLSS, and you can't force DLSS to work in unsupported games?

I don't care about 4K performance (I have a 1440p 144Hz monitor), and I don't care about RT until at least 70% of all AAA, AA, and even B-level games are using it. DLSS, on the other hand, is a really cool feature to have, but it also needs to be supported much more widely, or at the very least support needs to be added for older demanding games. I can easily sell my 1080 Ti for $380 right now (I bought it for something like $430+ 2.5 years ago), but without seeing real-world benchmarks, I just don't think the 3080 will be a significant enough upgrade in rasterization performance at native 1440p / 4K in the most demanding games out there: Deus Ex: Mankind Divided, Quantum Break, Horizon: Zero Dawn, Shadow of the Tomb Raider, Metro: Exodus, Detroit: Become Human, and a bunch of other older but still very demanding games, not to mention the ones we haven't even played or seen yet.

NVIDIA is again trying to sell us the promise of a future which, when it comes to RT and DLSS, will be achievable and somewhat utilized not today, not tomorrow, not next month or next year, but maybe 5-6 years from now, during which we'll get at least 2 new generations of GPUs from both sides (or even three if Intel's GPUs turn out good and competitive enough). What they should also focus on, more heavily, is huge gains in rasterization, because the overwhelming majority of games will still be 70% to 100% raster, with lighting, reflections, global illumination, etc. faked. And what's the point of RT and DLSS if your new $700-1500 card can't even run these games at a minimum of 100+ FPS without them, because the games don't support RT, DLSS, or Async Compute / Vulkan?

September 14th can't come soon enough.
 
Last edited:

Rikkori

Member
Agent_4Seven A big reason why they went with 4K so heavily is that when you have so many cores, the only way to get proper utilisation out of them is to increase resolution. That's the easiest and cheapest way to make use of them; anything else would be a massive pain and probably wouldn't get done, for various (game-dev economic) reasons. That's also why Vega 64 was so good at 4K but so far behind at lower resolutions: it had a lot of CUs, which automatically get put to work as you increase resolution but otherwise require much more fine-tuning on the devs' part.
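The utilisation argument above can be eyeballed with some napkin math. A minimal sketch, using published shader counts for these cards purely as an illustration (pixels per shader is only a rough proxy for available parallel work, not a performance model):

```python
# Napkin math for the utilisation argument: more pixels per frame
# means more independent work to spread across the shader cores.
resolutions = {
    "1440p": 2560 * 1440,   # 3,686,400 pixels
    "4K": 3840 * 2160,      # 8,294,400 pixels
}
shaders = {
    "Vega 64": 4096,
    "RTX 3080": 8704,
}

for gpu, cores in shaders.items():
    for res, pixels in resolutions.items():
        print(f"{gpu} @ {res}: {pixels / cores:,.0f} pixels per shader")
```

4K pushes 2.25x the pixels of 1440p, so each shader gets proportionally more independent work without any extra tuning from developers.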

As for rasterisation, they will keep increasing it because they have to: you still need shaders to do a huge amount of work, and for RT as well, because it scales that perf too. So even if they wanted to get rid of them, they can't. I think you will still see a significant boost in older, more traditional games, especially the compute-heavy ones. Any game that did really well on Vega, for example, you can expect Ampere to smash too.

I don't know; if I had a 1080 Ti, I'd have a hard time justifying a 3080 too, simply because the 1080 Ti is still a beast and RTX is still not prevalent. On the other hand, I always upgrade for specific games (otherwise what's the point?), so just as I upgraded to AMD for Deus Ex: MD, I'm looking at doing the same for Cyberpunk 2077. I just know how much I'm gonna love it, and I definitely want to take advantage of RTX in that one. Otherwise? Meh. There will be better models than the 3080, at better prices, within a year.
 