
Nvidia RTX 30XX |OT|

VFXVeteran

Banned
Whelp. The first two next-gen games to come out have taxed the 2080Tis.

I'm talking about FS2020 and Marvel Avengers.

FS2020 is obviously a bandwidth hog, but with the hidden Ultra texture pack in Marvel's Avengers that I just downloaded and played, things aren't looking good for the 2080Ti, as these games are bandwidth hungry. At max settings in MA @ 4K, I'm getting FPS drops into the low 20s during cinematic cutscenes when SSS on the characters takes up a big portion of the scene, as well as in action scenes with several particle effects.

It was probably hidden from players until the 3000-series cards release.

I'm convinced that 2080 boards and below will absolutely require upgrades in order to play these next-gen games at 4K @ ultra settings. I'm not even going to guess at the FPS. I'd imagine that not even a 3090 would be able to play MA or FS2020 at 4K/ultra/60 without DLSS.
 

AGRacing

Member
Well... I just posted my Red Devil 5700 XT for 400 CDN firm.

I will let fate decide.

PS5 can wait until 2021 if the price is right and the exclusives are ready.
 

Mentat02

Banned
I guess the RTX 3000 series will be a first come, first served type of deal. They aren't available for pre-order, and the launch is in a couple of weeks.
 

dcx4610

Member
Two questions: does the 3080 still use the same GPU power cable, or does it require an extra one now? Secondly, do you need a PCIe 4.0 motherboard to take advantage of the full specs, or is it not using PCIe 4.0's bandwidth?
 

CuNi

Member
I really hope there will be benchmarks of the 80 and 90 and how they perform in VR.
The Reverb G2 is going to sport 2160x2160 per eye, just like the G1. I wonder how far you can push the framerates on the 80 and 90 cards respectively.
 

Kazza

Member
Nvidia did a Q&A on Reddit. They answered quite a few questions, but it was the one on the amount of vram that interested me the most:

Why only 10 GB of memory for RTX 3080? How was that determined to be a sufficient number, when it is stagnant from the previous generation?

[Justin Walker] We’re constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of 3080 is to give you great performance at up to 4k resolution with all the settings maxed out at the best possible price.
In order to do this, you need a very powerful GPU with high speed memory and enough memory to meet the needs of the games. A few examples - if you look at Shadow of the Tomb Raider, Assassin’s Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4k with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory.
Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.


 

Chromata

Member
Incredible showing from NVIDIA, I won't be upgrading this cycle but I'll definitely be upgrading next cycle.
 

nkarafo

Member
How late do you think the extra-VRAM cards will be out? I don't want to make the same mistake I did when I got the 2GB 960. And it seems the inevitable 3060 will have 6GB (it's not going to have the same amount as the 3070). But a 12GB 3060 sounds like something that could last through the whole generation if you game at 1080p.
 
The problem with most of these graphics cards is that you will become CPU-bound straight away.

@Orta I have a 7700K and will likely have to upgrade to a 10700 minimum, AMD not being an option here, as I need thread performance over many cores.

(Also, I may have to budget in a new PSU for both. I don't think my 650W Seasonic is enough...)

It's easy to break a CPU-bound scenario, though: just crank the resolution up until the GPU becomes the limiting factor. Since Ampere now supports 8K, you will always be able to find a scenario that makes you GPU-bound instead!
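The trick described here can be sketched as a toy model: frame time is set by whichever processor finishes last, and only the GPU side grows with resolution. All the millisecond figures below are made-up illustrative numbers, not benchmarks of any real game or card.

```python
# Toy model of the CPU-vs-GPU bottleneck.
# CPU cost (game logic, draw submission) is roughly resolution-independent;
# GPU cost is assumed to scale linearly with pixel count.

def frame_time_ms(cpu_ms, gpu_ms_at_1080p, width, height):
    gpu_ms = gpu_ms_at_1080p * (width * height) / (1920 * 1080)
    return max(cpu_ms, gpu_ms)

for name, (w, h) in {"1080p": (1920, 1080),
                     "4K":    (3840, 2160),
                     "8K":    (7680, 4320)}.items():
    t = frame_time_ms(cpu_ms=10.0, gpu_ms_at_1080p=4.0, width=w, height=h)
    bound = "GPU" if t > 10.0 else "CPU"
    print(f"{name}: {t:5.1f} ms/frame, {bound}-bound")
```

At 1080p the hypothetical 10 ms of CPU work is the limit; quadrupling the pixels at 4K pushes the GPU past it, which is exactly why raising resolution "breaks" a CPU-bound scenario.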
 

Bankai

Member
Wow this was SUCH an impressive presentation! DLSS 2(.1) combined with all that horsepower will be amazing!! What a time to be a PC gamer!


Which I am not.
 

Compsiox

Banned
Wow this was SUCH an impressive presentation! DLSS 2(.1) combined with all that horsepower will be amazing!! What a time to be a PC gamer!


Which I am not.
It really is super exciting. The 3090 should be good for the generation.

It also kinda sucks that that can't really be said with full confidence. For all we know, Jensen could wake up one night and decide he wants to release GPUs with another big jump in 2 years. Innovation is exciting but can hurt.
 

DeaDPo0L84

Member
Question for those who might be more knowledgeable. I know we don't have real-world benchmarks to draw a firm conclusion from, but would you say the 3080 will be perfectly fine for 1440p/60fps with max settings and ray tracing? I know that's what they demoed CP2077 with, so I feel pretty confident it'll be just fine for a long while, but I'm curious if others could chime in.
 

Jezbollah

Member
It's easy to break a CPU-bound scenario, though: just crank the resolution up until the GPU becomes the limiting factor. Since Ampere now supports 8K, you will always be able to find a scenario that makes you GPU-bound instead!

Funnily enough, I have just been told that's the way to tune Flight Simulator: crank up the detail so it's your GPU that is the limiting factor rather than your CPU :)
 

psorcerer

Banned


His model is flawed; the 2080->3080 perf upgrade doesn't fit.
I have a different model.
Ampere has a 25% IPC advantage over Turing.
TTF = Turing TF.
Thus 3080 (14.88 TTF) +25% = (2080 (10.4 TTF) + 43%) + 25% = 2080 + 79%, which is exactly what we see in the DF benches.
And 3070 (10.2 TTF) +25% = (2070 (7.88 TTF) + 29%) + 25% = 2070 + 61%.
Bingo.
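The arithmetic in this model is easy to check directly. Note the 25% IPC figure and the "TTF" normalisation are the poster's own assumptions, not confirmed Nvidia numbers:

```python
# Uplift over the Turing predecessor = (TF ratio) x (assumed 1.25 IPC factor).

def uplift_pct(new_ttf, old_ttf, ipc_factor=1.25):
    return ((new_ttf / old_ttf) * ipc_factor - 1) * 100

print(f"3080 vs 2080: +{uplift_pct(14.88, 10.4):.0f}%")  # ~+79%
print(f"3070 vs 2070: +{uplift_pct(10.2, 7.88):.0f}%")   # ~+62%
```

The unrounded 3070 result is +61.8%; rounding the intermediate TF ratio down to +29% first, as the post does, gives the quoted +61%.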
 

Rikkori

Member
So, anyone have any thoughts on the partner cards shown off? The MSI Gaming Trio looks pretty nice. Generally, are the partner cards better for cooling etc. than Founders? Personally I like the look of the Founders, but I'm not an expert by any means. At the moment I think I'm set on the 3080. Will I get bottlenecks running alongside a Ryzen 5 3600 CPU?

I don't trust any of these brands enough to blanket-recommend them. There's always the odd dud, and never from the same one; it's a similar story for other components like motherboards. So I'd say wait for actual reviews for product quality. You never know which one turned out to have a fab problem where they didn't screw things down tight enough, or had a VRAM pad too short, or whatever.

Personally, I am more interested in the warranty length, the customer-service quality, and whether they have an RMA base near me. As I'm in Eastern Europe I tend to go with Gigabyte, as they have an RMA base in Hungary and tend to be overall OK. People in DE & UK have somewhat better options with EVGA & Zotac. If I were in the US I guess I'd just stick to EVGA.

And yeah, your 3600 will absolutely be a bottleneck if you want to go for more than 60-120 fps (depending on the game & API).

When is the 3060 coming out?
Probably November.

How late do you think the extra-VRAM cards will be out? I don't want to make the same mistake I did when I got the 2GB 960. And it seems the inevitable 3060 will have 6GB (it's not going to have the same amount as the 3070). But a 12GB 3060 sounds like something that could last through the whole generation if you game at 1080p.
I think the 3070 has the biggest chance of coming out with 16GB this year; it just requires using the chip with more memory. For the 3060? Probably not this year if it's a November launch. Also, I'm not sure it won't end up as an 8GB card with no double-VRAM offering, instead of the rumoured 6/12.

Question for those who might be more knowledgeable. I know we don't have real-world benchmarks to draw a firm conclusion from, but would you say the 3080 will be perfectly fine for 1440p/60fps with max settings and ray tracing? I know that's what they demoed CP2077 with, so I feel pretty confident it'll be just fine for a long while, but I'm curious if others could chime in.
Perfectly fine? It's goddamn overkill for 1440p/60Hz, is what it is. If that's your target then you'll get many years of use out of it. Certainly all of next gen, considering how much more powerful it is than the next-gen consoles.
 

GymWolf

Gold Member
If an 80-series GPU is not enough to run a game that releases two months after it with all the bells and whistles, they have a huge problem
I've played on PC long enough not to believe the people who think a GPU can endure an entire gen with top-tier settings, and this 3000 series is no different.

It's gonna be fun when, in a year or 2, people are gonna have to make sacrifices to maintain 4K60 with decent settings in heavy/broken games...

Maybe this time it's gonna be different because we have that giant trick called DLSS...
 
I've played on PC long enough not to believe the people who think a GPU can endure an entire gen with top-tier settings, and this 3000 series is no different.

It's gonna be fun when, in a year or 2, people are gonna have to make sacrifices to maintain 4K60 with decent settings in heavy/broken games...

Maybe this time it's gonna be different because we have that giant trick called DLSS...

Of course games in 2-3 years will start taxing the 3080 (hopefully...), but Cyberpunk is a cross-generation game which releases less than two months after the GPU.

Not only that, but it seems Nvidia has chosen it as a showcase of what its new cards can do.

I'd be relatively surprised if the 3070 couldn't run it at 4K 60fps Ultra, and that's before DLSS.
 

Arun1910

Member
Do you think that a 3080 would be enough for Cyberpunk at 4K60 ultra with RTX (or without RTX)?

Even DF is a bit unsure.

Apparently every single showing and preview build that people have got on PC has been locked to 1080p and 30fps.

You would HOPE a 3080 can run it at 4K60, as the card is a beast, but optimisation for the game may be a different story completely.
 

BluRayHiDef

Banned
Even DF is a bit unsure.

Apparently every single showing and preview build that people have got on PC has been locked to 1080p and 30fps.

You would HOPE a 3080 can run it at 4K60, as the card is a beast, but optimisation for the game may be a different story completely.
Hop on the 3090 train, boyos. That's what I intend to do. Choo! Choo!
 

GymWolf

Gold Member
Who cares about ultra settings? High is like 95% as good at about 60-70% of the cost... usually.
Yes, and even if you don't use ultra settings there is still RTX, which is even heavier than that; there is always something heavy that kills performance in GPUs. It's just how PC gaming works.

And if you buy a 700 (but really 800 to 900 with custom boards) dollar GPU, you want to keep all this shit activated while playing.
 

GymWolf

Gold Member
Even DF is a bit unsure.

Apparently every single showing and preview build that people have got on PC has been locked to 1080p and 30fps.

You would HOPE a 3080 can run it at 4K60, as the card is a beast, but optimisation for the game may be a different story completely.
As a PC gamer, I'm gonna laugh my ass off if a 3080 is not enough to max out that game :ROFLMAO:
 
And if you buy a 700 (but really 800 to 900 with custom boards) dollar GPU, you want to keep all this shit activated while playing.
I'd rather go with HIGH settings that look 95% as good as ULTRA and have 50% more frames than ULTRA. Or maybe trade those extra frames for higher resolution so there are fewer jaggies.

Point is that the performance loss in going from HIGH to ULTRA isn't always worth it. And mindlessly jacking things up to ULTRA discards one of the best things about PC: customizing your settings to make better tradeoffs.
 

GymWolf

Gold Member
I'd rather go with HIGH settings that look 95% as good as ULTRA and have 50% more frames than ULTRA. Or maybe trade those extra frames for higher resolution so there are fewer jaggies.

Point is the performance loss in going from HIGH to ULTRA isn't always worth it.
Dude, I know this. Right now I'm playing Remnant: From the Ashes with half of the settings on medium because they look only slightly worse than ultra, but I'm still kinda pissed off that a 10 TF GPU can't handle a AA game with mediocre graphics at 1440p...

Also, it's not always the case; sometimes ultra looks better than high, and like I said you still have RTX, which is heavier than ultra settings, and people want that shit on screen with a 30 TF GPU...
 

Rikkori

Member
Do you think that a 3080 would be enough for Cyberpunk at 4K60 ultra with RTX (or without RTX)?

We don't know, but my gut feeling is - absolutely. Remember, Control has a buttload of RTX effects as well, and that can do 8K 60 with a 3090. No way in heck would a 3080 with DLSS NOT be able to do 4K 60Hz (which would actually be 1080p/1440p as the render resolution). It would make it the most unoptimised game in existence, and I just don't think that's the case based on CDPR's last game & engine, and the fact that I know Nvidia has at least 2-3 engineers working full time on the game to help with RTX.
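For reference, the "actually 1080p/1440p as render resolution" remark maps onto the commonly cited DLSS 2.0 per-axis scale factors (Quality at 2/3, Performance at 1/2; the Balanced ratio below is the approximate published figure). A quick sketch:

```python
# Internal render resolution behind a DLSS-upscaled output frame,
# using the commonly cited DLSS 2.0 per-axis scale ratios.
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 1 / 2}

def render_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = render_resolution(3840, 2160, mode)
    print(f"4K output, {mode}: rendered at {w}x{h}")
```

So a "4K DLSS" frame in Quality mode is really a 2560x1440 render, and Performance mode a 1920x1080 one, before the upscale.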
 

GymWolf

Gold Member
We don't know, but my gut feeling is - absolutely. Remember, Control has a buttload of RTX effects as well, and that can do 8K 60 with a 3090. No way in heck would a 3080 with DLSS NOT be able to do 4K 60Hz (which would actually be 1080p/1440p as the render resolution). It would make it the most unoptimised game in existence, and I just don't think that's the case based on CDPR's last game & engine, and the fact that I know Nvidia has at least 2-3 engineers working full time on the game to help with RTX.
Yeah, but Control is a very small game with 1/50 of the stuff on screen compared to Cyberpunk; it's not like they are similar games in scope...

DLSS really is gonna be our saviour, isn't it?!
 

Rikkori

Member
Yeah, but Control is a very small game with 1/10 of the stuff on screen compared to Cyberpunk; it's not like they are similar games in scope...
True, but what you're thinking of in terms of world life would add further strain to the CPU more than the GPU. Remember, it doesn't matter how big the game is; you're still only loading what you can see, and that's what matters for the GPU. The NPC sub-routines and the like, that's work for the CPU.

So the conclusion remains the same vis-a-vis the 3080 for Cyberpunk.
 

GymWolf

Gold Member
True, but what you're thinking of in terms of world life would add further strain to the CPU more than the GPU. Remember, it doesn't matter how big the game is; you're still only loading what you can see, and that's what matters for the GPU. The NPC sub-routines and the like, that's work for the CPU.

So the conclusion remains the same vis-a-vis the 3080 for Cyberpunk.
Well... a lot of cars on screen means more RTX... more reflective surfaces and lights in a city area mean more RTX... more NPCs on screen need to be rendered by the GPU; it's not only CPU work...
 

Rikkori

Member
Well... a lot of cars on screen means more RTX... more reflective surfaces and lights in a city area mean more RTX... more NPCs on screen need to be rendered by the GPU; it's not only CPU work...

Is that really going to be SO much heavier than all the reflective windows & floors in Control that it overpowers the 3080 at 4K, when it could feasibly do 8K in Control? Or hell, just the fully path-traced Quake II & Minecraft? I just don't believe it. Plus, it's the same number of rays cast, so the scaling isn't bad no matter how many cars etc. you have on screen. I mean, that's the big advantage over rasterisation: it can scale up very cheaply because most of the work is already done, whether you have 1 reflective object or 100.
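The scaling argument here can be made concrete: the number of primary rays cast depends only on resolution and samples per pixel, while the per-ray cost of testing those rays against a BVH acceleration structure grows roughly with the log of the object count. A rough sketch (the cost model is a deliberate simplification, not an engine measurement):

```python
import math

# Rays launched per frame: a function of resolution and samples per
# pixel only; the number of reflective objects never appears.
def rays_per_frame(width, height, samples_per_pixel=1):
    return width * height * samples_per_pixel

# Approximate per-ray BVH traversal cost, in node visits: logarithmic
# in object count, so 100x the objects is nowhere near 100x the work.
def traversal_cost(num_objects):
    return math.ceil(math.log2(max(num_objects, 2)))

print(rays_per_frame(3840, 2160))              # 8294400 rays at 4K, 1 spp
print(traversal_cost(1), traversal_cost(100))  # 1 vs 7 node visits per ray
```

That is the point about 1 reflective object vs. 100: the ray budget stays fixed, and only the (logarithmic) traversal cost per ray creeps up.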
 

GymWolf

Gold Member
Is that really going to be SO much heavier than all the reflective windows & floors in Control that it overpowers the 3080 at 4K, when it could feasibly do 8K in Control? Or hell, just the fully path-traced Quake II & Minecraft? I just don't believe it. Plus, it's the same number of rays cast, so the scaling isn't bad no matter how many cars etc. you have on screen. I mean, that's the big advantage over rasterisation: it can scale up very cheaply because most of the work is already done, whether you have 1 reflective object or 100.
Not an expert on RTX, so I trust you.

I just don't see how a big city with a shitload of stuff that casts shadows, lights and reflections can be as heavy as a small game with some offices...

Just a single car that moves in the area produces dynamic reflections, shadows and lights; imagine an instance with 20 of them plus NPCs and all the reflective stuff in a city area...
 