
NVIDIA Allegedly Begins Testing Its Fastest Next-Gen GPU, The AD102, For GeForce RTX 4090 Graphics Card, Features 24 Gbps GDDR6X Memory

Darius87

Member
Ok. So now game developers create hardware for next-gen? You sound silly - typical hater because innovation isn't happening on your precious box.
why so insecure? my only box is PC. :messenger_grinning_smiling:
A developer cannot make a game without knowing what limits they have. Hardware dictates that. Not software. Developers work WITH the hardware. Not a single developer said they wanted ray-tracing last gen and Nvidia/AMD said "yes sir! Right away sir!".
by your logic devs are already targeting the limits of Nvidia's 3090; the limit you're talking about is only the minimum spec requirement for a game.
And sure.. consoles have enough power from their 2080-like performance to give photorealistic graphics, so no need for updated hardware -- that is, until there are announcements for "mid-gen" refreshes.. And THEN people like you will get excited about new hardware pushing even BETTER photoreal graphics, right?
what photorealistic graphics are you gonna push when the Nvidia 4xxx cards come out? let me answer: none. you're gonna play the same games, which can run on previous Nvidia and AMD cards.
 

winjer

Gold Member
The most popular graphics card on Steam was the GTX 1060.

Nvidia is going to dictate what next gen will look like??😂🤣

With 7.95%.
RTX cards already represent 30% of GPUs on Steam, and these are all more powerful than the 1060.
And mind you, even the lowly RTX 2060 is as powerful in ray tracing as a PS5.
And all this during the biggest GPU crisis the PC market has ever faced, drastically hampering adoption of newer-generation cards.
 

SantaC

Member
This kind of news is what console gamers should be paying attention to. Nvidia is going to dictate what next-gen will look like in the future. 3xxx series just came out and those boards are already powerful. 4xxx series is just going to be ridiculous. They might even have a 5xxx series before the console generation is over.
They just came out? 3000 series is almost two years old.
 

Midn1ght

Member
How many TFLOPs can the 4090 do?




GPU | SMs | CUDA Cores | Top SKU | Memory Bus
AD102 (Ada Lovelace) | 144 | 18432 | RTX 4090? | 384-bit
GA102 (Ampere) | 84 | 10752 | RTX 3090 Ti | 384-bit
SM Increase (% Over Ampere): +71%



68.4 TFLOPS
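For anyone wondering where a figure like that comes from: peak FP32 is just CUDA cores × 2 FLOPs per clock × clock speed. A quick napkin sketch, assuming a roughly 1.86 GHz boost clock for the full AD102 (the clock is a guess, not a confirmed spec):

```python
def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    # FP32 throughput: cores * 2 FLOPs per clock (FMA) * clock in GHz -> TFLOPS
    return cuda_cores * 2 * boost_ghz / 1000

print(fp32_tflops(18432, 1.86))  # ~68.6 TFLOPS for a full AD102 (the 68.4 figure implies ~1.855 GHz)
print(fp32_tflops(10752, 1.86))  # ~40 TFLOPS, roughly a 3090 Ti at its boost clock
```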




69 TFLOPS (OC)
 

Dream-Knife

Banned
I'd like to see a 3060 12G vs 3070 8G benchmark.
Cuz that's what the discussion is.
The chip being weaker means its VRAM is practically irrelevant, because even as the generation goes on it won't be able to keep up with the 3070.



Show me one game where, at like-for-like settings, even at 4K with Ultra textures, the 3060 gets smoother performance than the 3070.
I'd like to see that 3060 play any modern game going forward at 4K, whatever the settings (Ultra textures), considering its CUDA and bandwidth limitations.


Odd that you are preaching about NOT settling, yet settled for a 3060 with its super limited bandwidth and CUDA count; it's barely faster than the card it's replacing, the 2060S.
Didn't you hear? More vram and more cores (regardless of actual performance) is what matters.
The 3XXX aren't that great. They don't guarantee 4K60 in every game.

The 4XXX will only be impressive when every single PS5/Xbox title runs at 4K60 without a single hitch/fps drop.
Consoles often aren't doing real 4k.
 

OZ9000

Banned
Didn't you hear? More vram and more cores (regardless of actual performance) is what matters.

Consoles often aren't doing real 4k.
I don't care what consoles do or don't do.

I want all my games to run at 4k60. Otherwise the 4000 series will be a waste of money.

I am also not a fan of Nvidia's approach of pumping up the wattage for more performance. Whatever happened to efficiency?
 
Didn't you hear? More vram and more cores (regardless of actual performance) is what matters.

I don't think anyone is saying that VRAM matters most or that the RTX 3060 is going to suddenly best the RTX 3070 (though it does have much better .1% lows at 4k with the HD texture package in FC6, looks to be about 29fps vs. 9fps from what you see online). Nor do I see @StateofMajora saying that RTX 3060 is the more performant card. Personally, I just think the RTX 3060 will age a bit better at its performance level, which seems reasonable given historical results and what memory faults typically do to .1% and 1% lows.
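Side note, since ".1% lows" gets thrown around loosely: one common way to compute them is to average the frame time of the worst slice of frames and express it as FPS. A minimal sketch of that definition, assuming a simple percentile-style cut; actual benchmarking tools differ in the details:

```python
def percentile_low_fps(frame_times_ms, fraction):
    """Average frame time of the worst `fraction` of frames, reported as FPS."""
    worst = sorted(frame_times_ms, reverse=True)   # longest frame times first
    n = max(1, int(len(worst) * fraction))          # e.g. worst 1% of frames
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

frames = [16.7] * 990 + [40.0] * 10   # mostly 60 fps with a handful of hitches
print(round(percentile_low_fps(frames, 0.01), 1))    # 1% low  -> 25.0 fps
print(round(percentile_low_fps(frames, 0.001), 1))   # 0.1% low -> 25.0 fps
```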
 

VFXVeteran

Banned
The 3XXX aren't that great. They don't guarantee 4K60 in every game.
There is no hardware that can guarantee 4k60. That's a ludicrous assumption. If you think the 3xxx series aren't that great because they can't guarantee 4k60, then you must think consoles are terrible.

The 4XXX will only be impressive when every single PS5/Xbox title runs at 4K60 without a single hitch/fps drop.
Oh, I get it. The 4xxx needs to somehow "make" the PS5/Xbox titles run at 4K60 before they can be considered good GPUs. Wow. Just when I thought we at least had rational gamers, I am fooled yet again!
 

VFXVeteran

Banned
Didn't you hear? More vram and more cores (regardless of actual performance) is what matters.

Consoles often aren't doing real 4k.
Of course, you know the previous poster's comment made little sense, because he's looking at it through a console warrior's distorted glasses.
 

DenchDeckard

Moderated wildly
Of course, you know the previous poster's comment made little sense, because he's looking at it through a console warrior's distorted glasses.
Lucky to get 1440p and 60 fps on these consoles. But you know, pcs are shit unless they deliver native 4k 60 fps with ultra settings and raytracing....

Meanwhile consoles don't even have anisotropic filtering this gen.....
 

VFXVeteran

Banned
I don't care what consoles do or don't do.

I want all my games to run at 4k60. Otherwise the 4000 series will be a waste of money.

I am also not a fan of Nvidia's approach of pumping up the wattage for more performance. Whatever happened to efficiency?
How can a graphics card guarantee that, when the software can use all kinds of high-end resources to achieve its desired look? So what if a company decides to make a fully path-traced game that can only run at 1080p/30 FPS? Is that the GPU's fault?
 

Dream-Knife

Banned
I don't care what consoles do or don't do.

I want all my games to run at 4k60. Otherwise the 4000 series will be a waste of money.

I am also not a fan of Nvidia's approach of pumping up the wattage for more performance. Whatever happened to efficiency?
So you want something that doesn't exist? You do know PPI exists, along with view distance right? 4k is a huge waste of resources.
I don't think anyone is saying that VRAM matters most or that the RTX 3060 is going to suddenly best the RTX 3070 (though it does have much better .1% lows at 4k with the HD texture package in FC6, looks to be about 29fps vs. 9fps from what you see online). Nor do I see @StateofMajora saying that RTX 3060 is the more performant card. Personally, I just think the RTX 3060 will age a bit better at its performance level, which seems reasonable given historical results and what memory faults typically do to .1% and 1% lows.
You shouldn't be using any of those cards for 4k though. Developers have been pretty bad with optimization. Halo infinite only gets 64 fps at 4k ultra on a 3090, so safe to say really nothing this gen is 4k as long as games come out the way they do. 1440p a 3070 will last longer than a 3060 at higher performance. Either will probably be better than the base consoles.
 
I don't think anyone is saying that VRAM matters most or that the RTX 3060 is going to suddenly best the RTX 3070 (though it does have much better .1% lows at 4k with the HD texture package in FC6, looks to be about 29fps vs. 9fps from what you see online). Nor do I see @StateofMajora saying that RTX 3060 is the more performant card. Personally, I just think the RTX 3060 will age a bit better at its performance level, which seems reasonable given historical results and what memory faults typically do to .1% and 1% lows.
Indeed. Don't know why PC gamers want to beg for less and be apologists for companies like Nvidia when they don't put enough VRAM in these very expensive gpus, but hey. Nobody wants to admit they're getting screwed.

It is undeniable that the 3060ti, 3070 and even the 3080 10gb have an unacceptably low amount of vram, and if I had something like a 1080ti prior to the 3xxx series release, I honestly wouldn't have even thought to get a new card with less vram. And the 3060 is basically a side grade sans dlss and RT, so I wouldn't even have got that card. But it was an upgrade from a 970, so it was HUGE, and it feels good to know it won't be vram starved before it just isn't fast enough to run games anymore.

The 4070 should have at least 12gb vram, and for the 4080 it would be reasonable to expect 16gb there. Or maybe both the 70 and 80 will have 16gb, and 24 on the 4090.
 

FutureMD

Member
Isn't it better to have high-wattage, high-power GPU options for those who want them, alongside lower/medium power ones, rather than just low/medium powered ones? Obviously amazing performance and amazing wattage efficiency is best, but if this is the limit of the architecture, it's kind of like having an SLI GPU at the top end and accepting the ridiculous power requirements that come with it.
 
Isn't it better to have high-wattage, high-power GPU options for those who want them, alongside lower/medium power ones, rather than just low/medium powered ones? Obviously amazing performance and amazing wattage efficiency is best, but if this is the limit of the architecture, it's kind of like having an SLI GPU at the top end and accepting the ridiculous power requirements that come with it.
I am starting to look at it like the 60 and 70 series of nvidia cards are more like yesterday's high end 80 series and the new 80 and 90 series are a new tier(s) entirely... It's definitely true in terms of msrp and power consumption.
 
So you want something that doesn't exist? You do know PPI exists, along with view distance right? 4k is a huge waste of resources.

You shouldn't be using any of those cards for 4k though. Developers have been pretty bad with optimization. Halo infinite only gets 64 fps at 4k ultra on a 3090, so safe to say really nothing this gen is 4k as long as games come out the way they do. 1440p a 3070 will last longer than a 3060 at higher performance. Either will probably be better than the base consoles.

3060 is a 1080p card though, in that space I think it will last longer than 3060ti or 3070 at 1440p. Just because I think that memory usage will increase quite drastically across all resolutions (as it always does when the consoles get sizable increases in memory).

Basically, in the 1080p/light 1440p space you have the 6600xt and the RTX 3060 and in the 1440p/light 4k space you have the 6700xt and the 3060ti/3070. In both those spaces the lower VRAM cards are performing better in the crossgen phase, but it wouldn't surprise me if next-gen only games flip that around and favor the cards with more available memory, with 8GB cards suffering from noticeable hitches etc.

It's something that will be interesting to come back to in a couple years.
 
3060 is a 1080p card though, in that space I think it will last longer than 3060ti or 3070 at 1440p. Just because I think that memory usage will increase quite drastically across all resolutions (as it always does when the consoles get sizable increases in memory).

Basically, in the 1080p/light 1440p space you have the 6600xt and the RTX 3060 and in the 1440p/light 4k space you have the 6700xt and the 3060ti/3070. In both those spaces the lower VRAM cards are performing better in the crossgen phase, but it wouldn't surprise me if next-gen only games flip that around and favor the cards with more available memory, with 8GB cards suffering from noticeable hitches etc.

It's something that will be interesting to come back to in a couple years.
In no way shape or form is the 3060 a 1080p60 card... It really is like a 1080ti plus dlss and rt.

Unless you mean 1080p120, then maybe. But I can run many 4k60 games on the 3060 no problem and if the criteria for being a 4k card is native 4k always with no dlss and ultra settings everywhere, the 3080 falls short as well.

The only game I played at 1080p60 on the 3060 was Quantum Break at max settings, and that cannot be run at a stable 4K60 even on the 3090. Every card out there atm needs dlss on something like cyberpunk.
 
Some PC gamers here need a refresher on just how wasteful ultra settings can be (except a few like textures... if you have the vram ;))



Not to say ultra settings are stupid, but they are if you're sacrificing major fps or resolution for small detail gains.
 

OZ9000

Banned
There is no hardware that can guarantee 4k60. That's a ludicrous assumption. If you think the 3xxx series aren't that great because they can't guarantee 4k60, then you must think consoles are terrible.


Oh, I get it. The 4xxx needs to somehow "make" the PS5/Xbox titles run at 4K60 before they can be considered good GPUs. Wow. Just when I thought we at least had rational gamers, I am fooled yet again!
Lucky to get 1440p and 60 fps on these consoles. But you know, pcs are shit unless they deliver native 4k 60 fps with ultra settings and raytracing....

Meanwhile consoles don't even have anisotropic filtering this gen.....

I am a PC gamer. I am not sure why it is being inferred that I am some console warrior.

I just want a good card which will give me 4K60 - particularly as GPU prices are absurd. GPU prices have increased exponentially, but the value proposition has dropped. During the PS3/360 gen, my 970 gave me 1080p60 for every title. My 2080 was also an excellent buy which provided high framerates during the PS4/XOX gen. My benchmark for a good GPU is based upon how well it performs in comparison to the consoles.

Fortunately, as most console games do not run at native 4K60, I would hope that the 4000 series is sufficient to perform said duties. And as prices are going to be absurd, one would hope the performance is also absurd.
 

OZ9000

Banned
So you want something that doesn't exist? You do know PPI exists, along with view distance right? 4k is a huge waste of resources.

You shouldn't be using any of those cards for 4k though. Developers have been pretty bad with optimization. Halo infinite only gets 64 fps at 4k ultra on a 3090, so safe to say really nothing this gen is 4k as long as games come out the way they do. 1440p a 3070 will last longer than a 3060 at higher performance. Either will probably be better than the base consoles.
DLSS helps avoid the use of native 4K but unfortunately it is not implemented in every title. Either way, 4K resolution makes a significant difference to picture quality, especially when most games have terrible image quality. Halo infinite looks horrendous at low resolutions.
 

//DEVIL//

Member
Hmm, I may just keep my 3090 and skip next gen since my current card is still a beast at 4k and has enough vram to last a long time.

Do you guys plan to get the 4090 at launch or wait for the inevitable 4090 Ti?
4080 day one if possible. Assuming it’s 40% more powerful than 3090ti then yeah why the hell not.

Already sold my 3090 for profit and got a cheap used 3080 below MSRP to hold me over till I get my 4080 (got it for $770 US). And it's only 3 months old, FTW3 lol
 

DenchDeckard

Moderated wildly
The irrational thinking of these warriors is dumbfounding. All in the name of the console!! Hear, hear!!

Yup, I love console gaming and PC gaming but some decisions on console have me scratching my head.

Like you said, the irrational thinking some console fans have towards PC gaming is very funny to see. I now just laugh at a lot of it. Not talking about OZ9000, as they have expressed they are a PC gamer, but I have heard sentiments around here that mirror my comments.
 

FireFly

Member
In no way shape or form is the 3060 a 1080p60 card... It really is like a 1080ti plus dlss and rt.

Unless you mean 1080p120, then maybe. But I can run many 4k60 games on the 3060 no problem and if the criteria for being a 4k card is native 4k always with no dlss and ultra settings everywhere, the 3080 falls short as well.

The only game I played at 1080p60 on the 3060 was Quantum Break at max settings, and that cannot be run at a stable 4K60 even on the 3090. Every card out there atm needs dlss on something like cyberpunk.
The 3060 is slower than a 1080 Ti and is roughly in between a 2070 and a 5700 XT. So we would expect it to be around a PS5 in rasterization performance and slower than an XSX. (Though comparisons with PC aren't always straightforward).

When console games start targeting 1440p, you will have a choice between playing at 1440p with console-like settings, or dropping to 1080p. Right now in Cyberpunk, you won't get a locked 60 at 1440p unless you drop below medium settings.
 

yamaci17

Member
once again... tons of misinformation

- 4 gb vram gpus are still plenty for almost all lastgen games with optimized console settings
- the rx 480 surpassed the 970 because it had a slightly better architecture. the 970 still remained competitive against the 1060/480/580 even after 7 years. some games benefited so much from polaris that there were (some) instances where the rx 480 even outperformed a 1060. it has nothing to do with polaris having more VRAM
- the gtx 970 still provides approximately 2x the performance of a ps4 with equivalent settings in rdr 2 and is able to run ultra textures in that game (to me, the peak of lastgen is rdr 2, and if that GPU is gracefully handling RDR 2, it has passed the longevity test, even with its wonky 3.5 GB fast + 0.5 GB slow vram configuration)

- 10 gb vram should be plenty for 1440p-4K+no ray tracing. consoles will have a graphics memory budget of 8.5-10 GB depending on how complex the game's systems are
-8 gb vram should be plenty for 1080p high + 1440p optimized settings for a long time

12-16 gb vram will be required for nextgen textures+ray tracing. 3080, 3070 and co. won't be able to handle ray tracing+nextgen textures at 1440p/4K in future titles just like consoles. consoles will probably have to abandon ray tracing in favor of nextgen textures. naturally, same will apply to 3080 users (nothing wrong with it. you have to adhere to your budget limits)

metro exodus enhanced edition uses 8.5-9 GB of graphics budget at 1440p-4K (dynamic). even its updated textures are pretty outdated. imagine pushing actual ultra definition high quality textures into that game, VRAM usage would be around 12-13 GB and all of a sudden consoles cannot run it, and all <12 GB RTX GPUs are getting destroyed.

current console and rtx gpu memory budgets constrain the usage of nextgen textures alongside ray tracing. even then, consoles' ray tracing performance is pretty bad and should tell you there's no real future for ray tracing there. it's clear that:

They designed these consoles with no ray tracing in mind. If they wanted to push nextgen textures + ray tracing, they'd have given them a 24-32 GB VRAM budget. They didn't care about ray tracing; if they did, the Series S would not exist. It's a console that can only allocate 4-4.5 GB of VRAM to games. That's a serious implication. That box has to run games all the way to 2028.

You either go full ray tracing in nextgen or it just becomes a "from time to time" fantasy like Metro Exodus. Series S and other consoles having a very limited GPU memory made sure that Ray Tracing will be a fluke on consoles.

Now you can argue RTX 3070 3080 are capable of RT much better than consoles, but yeah, they will also be severely limited by their memory budget. But truth is, nextgen games are going to be so demanding, running RT will be out of question for both consoles AND those GPUs. I get literally 1080p 55-60 FPS in Cyberpunk with ray tracing on my 3070. that's jawdropping. game's base optimization is pretty bad. but nextgen games will actually be that tough to run. not every game will be a walk in the park for systems out there. adding RT on top of an already demanding game is brutal for performance reasons. you can get away with sacrifices to framerate or to resolution with DLSS and stuff, but core problem stays the same.

nextgen ray tracing will be more useful on the rtx 4000 series, with plenty of VRAM but also PLENTY of raster and RT performance. having plenty of vram but lackluster rt performance, like the 3060, is not cutting it either
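To put the 8.5-10 GB claim in back-of-the-envelope terms (the CPU-side split below is a per-game guess, not an official figure; only the total and rough OS reserve are public):

```python
def graphics_budget_gb(usable_gb, cpu_side_gb):
    # Whatever isn't used by CPU-side game data is left over for graphics
    return usable_gb - cpu_side_gb

# PS5 / Series X expose roughly 12.5-13.5 GB of their 16 GB to games;
# the CPU-side share varies a lot by game, so these are illustrative splits.
print(graphics_budget_gb(13.5, 5.0))   # ~8.5 GB for graphics in a heavy game
print(graphics_budget_gb(13.5, 3.5))   # ~10 GB in a lighter one
```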
 

Dream-Knife

Banned
DLSS helps avoid the use of native 4K but unfortunately it is not implemented in every title. Either way, 4K resolution makes a significant difference to picture quality, especially when most games have terrible image quality. Halo infinite looks horrendous at low resolutions.
Halo infinite looks ok at 1440p. Doesn't look good enough for the performance it has.

1440p 144+ looks and feels better IMO than 4k60 on a desktop monitor.

4080 day one if possible. Assuming it’s 40% more powerful than 3090ti then yeah why the hell not.

Already sold my 3090 for profit and got a cheap 3080 below msrp used to hold me till I get my 4080 ( got it for 770$ US) . And it’s only 3 months old ftw3 lol
Why not get a 4090? How did you get the 3080 below MSRP?

nextgen ray tracing will be more useful on the rtx 4000 series, with plenty of VRAM but also PLENTY of raster and RT performance. having plenty of vram but lackluster rt performance, like the 3060, is not cutting it either
4000 likely won't be enough either. We need 2x performance of 30 series minimum to consider next gen with RT. By the time those games start rolling out we'll have the 50 series.
 

//DEVIL//

Member
Halo infinite looks ok at 1440p. Doesn't look good enough for the performance it has.

1440p 144+ looks and feels better IMO than 4k60 on a desktop monitor.


Why not get a 4090? How did you get the 3080 below MSRP?
It really depends on the price of the cards. I don't think the 4080 will be cheap. If anything, at least the same price as the 3080 Ti's MSRP.
 
The 3060 is slower than a 1080 Ti and is roughly in between a 2070 and a 5700 XT. So we would expect it to be around a PS5 in rasterization performance and slower than an XSX. (Though comparisons with PC aren't always straightforward).

When console games start targeting 1440p, you will have a choice between playing at 1440p with console-like settings, or dropping to 1080p. Right now in Cyberpunk, you won't get a locked 60 at 1440p unless you drop below medium settings.
The 3060 is either faster or slower than the 1080ti depending on the game and how well the game is optimized for Pascal, which is on its last legs in terms of driver optimization.

And that is just comparing raster to raster; factor in dlss and rt, as well as 1 extra gig of ram on the 3060, and obviously the 3060 is the better card.

Dlss is a big deal if you hadn't noticed..
 

yamaci17

Member
Halo infinite looks ok at 1440p. Doesn't look good enough for the performance it has.

1440p 144+ looks and feels better IMO than 4k60 on a desktop monitor.


Why not get a 4090? How did you get the 3080 below MSRP?


4000 likely won't be enough either. We need 2x performance of 30 series minimum to consider next gen with RT. By the time those games start rolling out we'll have the 50 series.
halo has a problem where its high quality lods, assets and textures are only loaded when the game is run at 4k. it does not matter if the internal resolution is 1080p or 1440p; the game will only use those high quality lods at 4k:


it is best experienced at 4K (you can still set the internal resolution to 1440p). the game simply refuses to use those high quality assets if your output resolution is anything other than 4K

look at that comparison, both are literally rendered at 1080p (internal render resolution) yet having a baseline of 4K forces the game to use those assets instead.
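In other words, the asset/LOD selection appears to be keyed off the output resolution rather than the internal render resolution. A purely illustrative sketch of that kind of logic, not Halo's actual code, just the behavior being described:

```python
def asset_tier(output_height_px):
    # Illustrative: the high-detail asset tier is streamed only when the OUTPUT is 4K
    return "high" if output_height_px >= 2160 else "standard"

def internal_resolution(out_w, out_h, render_scale):
    # Actual rendering cost follows the internal resolution, not the output
    return int(out_w * render_scale), int(out_h * render_scale)

# 4K output with ~66% render scale: roughly 1440p rendering cost, high asset tier
print(asset_tier(2160), internal_resolution(3840, 2160, 0.66))
# Native 1440p output: cheaper to render, but only the standard tier gets streamed in
print(asset_tier(1440), internal_resolution(2560, 1440, 1.0))
```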
 
Gtx 970 was a great card, I used it for a long time.

Still, that 3.5gb of vram did hamper it, and an 8gb rx 480/580 could, once again, use significantly higher texture settings and not run into stutter in games as soon as the 970 did.

The same will apply to the 3060 vs whatever has less vram. The 3080 10gb - no way would I argue it's a worse card in the long run than the 3060, and it will last longer as long as you reduce textures. STILL, it is an overpriced, bad product, and it still can't use textures as high as a much cheaper card, which is just stupid no matter how much crying the sour-grapes PC crowd does.
 

yamaci17

Member
nope, both can run the exact same ultra textures in rdr 2. there are no additional textures that the rx 400 can run on top of the gtx 970 in rdr 2, ac odyssey, god of war and many more lastgen titles. with optimized settings, there's enough vram headroom to alleviate any potential stutter problem

if you push high fidelity settings that drive higher vram usage (shadows, reflections and so on), performance drops so much on the rx 480 and 970 that both get unplayable framerates, so they have to use selective optimized settings either way.

here's actual proof



rdr 2, at 1080p with One X settings, uses only 3.5 GB of vram out of the total 8 GB available on an rx 580, which is what the rx 580 and gtx 970/1060 target

and hey, look at 1080p ultra settings,



now we are talking. a whopping 4.7 gb usage, innit? that would surely show gtx 970 its day. oh wait a second. 30-35 fps. not good !

this is what being forced into your natural memory usage limits looks like.

you can either use optimized settings and get a pristine 60+ frames with adequate vram usage (3-3.5 GB, like consoles), or you can push and use "unused vram" with high fidelity settings and enjoy 30-35 frames.
 
nope, both can run the exact same ultra textures in rdr 2. there are no additional textures that the rx 400 can run on top of the gtx 970 in rdr 2, ac odyssey, god of war and many more lastgen titles. with optimized settings, there's enough vram headroom to alleviate any potential stutter problem

if you push high fidelity settings that drive higher vram usage (shadows, reflections and so on), performance drops so much on the rx 480 and 970 that both get unplayable framerates, so they have to use selective optimized settings either way.

here's actual proof



rdr 2, at 1080p with One X settings, uses only 3.5 GB of vram out of the total 8 GB available on an rx 580, which is what the rx 580 and gtx 970/1060 target

and hey, look at 1080p ultra settings,



now we are talking. a whopping 4.7 gb usage, innit? that would surely show gtx 970 its day. oh wait a second. 30-35 fps. not good !

this is what being forced into your natural memory usage limits looks like.

you can either use optimized settings and get a pristine 60+ frames with adequate vram usage (3-3.5 GB, like consoles), or you can push and use "unused vram" with high fidelity settings and enjoy 30-35 frames.

Now try doom eternal.

I know full well RDR2 ultra textures are fine on the 970.

Did you not see me posting about how ultra settings can be a waste? You're preaching to the choir.

I have to use optimized settings at 4k on the 3060, and 1440 native sometimes as well.
 

yamaci17

Member
Now try doom eternal.
i don't care about doom eternal, its texture settings are mostly a troll where anything above high is just a toggle for higher texture load distance that most people won't notice. i myself did extensive tests and couldn't notice differences without pixel peeping. i knew you would come up with something like this, but i don't care. you can argue you ran super ultra nightmare textures on the 3060 where the 3070 cannot run them. yet ultra and ultra nightmare textures look identical if you don't pixel peep. it's not an actual texture quality toggle, it's rather a toggle to load textures that aren't even in your view

rdr 2 is one of the graphical peaks of last gen so I take that as a reference point. if rockstar developers wanted, they could provide such useless extra toggle options. i'm glad they didn't.

if 8 out of 10 lastgen games ran perfectly fine with maximum textures on a 970 with optimized settings, then the discussion is moot. there will always be outliers and extremes. it does not justify an 8 gb rx 480 over a 4 gb gtx 970, or an 8 gb rx 5500xt over a 4 gb 1650 super. as a matter of fact, i'd take the 1650 super over the 5500xt any day. my friend uses it with optimized settings and there has not been a single title where he couldn't run ultra textures.

latest game he played was elden ring. ultra textures, medium settings, a normal 90-hour experience on his end with 50-60 frames. what else do you expect from these GPUs? at ultra settings, the card suffocated and rendered 30-35 frames. why would you accept 30-35 frames just to fill up VRAM? you can go and check ultra 1080p 8 gb rx 580 benchmarks.
 

YeulEmeralda

Linux User
Innovation on PC is kinda irrelevant now that AAA PC exclusives are pretty much dead.

Both PC and console are playing the same games only your Nvidia GPU plays them at higher resolution/FPS/settings.
Consoles set the benchmark for every generation.

And that's great because it means my 3060 will last me another 3 years.
 

yamaci17

Member
I only care about the facts that are relevant to the topic we're discussing

texture load distance (which is what doom eternal's texture quality setting stands for above the high setting) is not relevant to the actual texture quality, therefore it holds no value for this discussion

In an actual texture quality discussion, you may find maybe 1 out of 10 last-gen games where 8 gb truly has an advantage over 4 gb at 1080p. it's your fault that you portray the situation as if almost all games run with higher textures on 8 gb GPUs, whereas that situation is rare. If a 4 GB GPU can match the 8 GB one with maximum texture fidelity in 99% of lastgen titles, your point becomes moot

so here are some corrections for your "facts"

"use significantly higher texture settings and not run into stutter in games as soon as it happened to 970.... IN ONLY 1 GAME OUT OF 10 GAMES (best case) "

you can claim that you've gone and modified unreal engine 4 parameters in config files and forced the game to use more vram and got fewer stutters. no one says more vram is bad or anything. you can find uses for it. but it's niche. that's about it. engines are built around the limitations of hardware. given that the series s has 4.5 gb of available vram, even 6 gb of vram won't be dead at 1080p for a long time. you can of course at some point run higher textures and be happy about it. it does not change the fact that gtx 970 and 1650 super users enjoy the games just the same, because most of the time even high and ultra textures (if the said ultra textures are aimed at 4K users) look similar, but that's not even the point of our discussion. you can run 4k textures on a 1080p screen with an 8 GB 580 but it won't do you much good anyway. that's why the series s is given a 10 gb budget and the series x a 16 gb budget, because 4K textures are also useless on a system that targets 1080p (hello rx 580)
 

DukeNukem00

Banned
Innovation on PC is kinda irrelevant now that AAA PC exclusives are pretty much dead.

Both PC and console are playing the same games only your Nvidia GPU plays them at higher resolution/FPS/settings.
Consoles set the benchmark for every generation.

And that's great because it means my 3060 will last me another 3 years.


It's not really dead. Ray tracing you got from PC, and the latest big deal in gaming, battle royale, also came from PC. Just as it always was. Nearly every genre we play today was invented on PC - shooters, RPGs, stealth, MMOs, adventures, strategy games, sim games. PC continues to be the benchmark for everything
 

FireFly

Member
3060 is either faster or slower than 1080ti depending on the game and how well the game is optimized for pascal. Which, is on its last legs in terms of driver optimization.

And that is just comparing raster to raster ; factor in dlss and rt as well as 1 extra gig of ram on 3060 and obviously 3060 is the better card.

Dlss is a big deal if you hadn't noticed..
I am not advocating buying a 1080 Ti! Just pointing out the performance tier where the 3060 sits.

Moreover, the discussion is about memory usage in relation to resolution. Yes, you can use DLSS to play at 1440p on a 3060 with some ray tracing effects. But the same applies just as well to a 3070, where you can expect to see better ray tracing effects. DLSS diminishes the need for more VRAM by pushing down the render resolution, while delivering similar quality to native rendering.
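To put rough numbers on that: DLSS renders internally at a fraction of the output resolution and upscales, so render-target memory scales with the internal resolution rather than the output. A small sketch using the commonly cited per-axis scale factors (approximate, and they can vary by title and preset):

```python
# Commonly cited per-axis DLSS scale factors (approximate)
DLSS_SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5, "ultra_performance": 1 / 3}

def internal_res(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(2560, 1440, "quality"))      # 1440p output renders near 960p internally
print(internal_res(3840, 2160, "performance"))  # 4K output renders at 1080p internally
```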
 