
RTX 50 Series Announcement Set for January 4

Cyberpunkd

Member
I wonder if Nvidia will lower the prices of their cards now that they've seen how cheap Intel is willing to go.
Episode 16 Laughing GIF by One Chicago
 

Bojji

Member

Again, those motherfuckers are leaving the 5070 in a bad position, between the 16GB 5060 Ti and the 16GB 5070 Ti.

The 5070 should be 16GB, that should be their target. 12GB for the 5060/5060 Ti and 20GB for the 5080/5070 Ti.
 
Fuck the planet (and your electricity costs), it's the pixels that count.
I've got solar panels on my house. They're extremely effective in the summer - I generate more electricity via solar than I actually consume. Unfortunately we only have about 4 months of sunlight in the UK and zero energy production for most of the year. When I'm playing games, my electric bills are sky high.

600W alone for a GPU seems like madness. I would not want to be in the same room as a GPU pumping out that much heat either.
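For a rough sense of scale (every number below is an assumption, not an actual tariff or my real play time), a quick back-of-the-envelope sketch:

# Back-of-the-envelope cost of a 600 W GPU on an assumed UK tariff.
# Assumed figures: 0.28 GBP/kWh unit price, 3 hours of gaming per day,
# GPU sitting near its full 600 W limit the whole time.
GPU_WATTS = 600
PRICE_PER_KWH = 0.28      # GBP per kWh (assumed)
HOURS_PER_DAY = 3         # assumed daily play time

kwh_per_day = GPU_WATTS / 1000 * HOURS_PER_DAY        # 1.8 kWh/day
cost_per_month = kwh_per_day * PRICE_PER_KWH * 30     # ~15 GBP/month
heat_btu_per_hour = GPU_WATTS * 3.412                 # ~2050 BTU/h dumped into the room

print(f"{kwh_per_day:.1f} kWh/day, ~£{cost_per_month:.2f}/month for the GPU alone")
print(f"~{heat_btu_per_hour:.0f} BTU/h of heat - basically a small space heater")

Double the hours or the unit price and it scales linearly, which is why winter gaming (no solar) hurts.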
 

kittoo

Cretinously credulous
Don't cheap out on the motherboard. Get one with good VRMs, and read about people having stability issues while you think "huh? runs fine in my setup".

I know. I still plan to get a pretty decent mobo after going through many reviews etc., but I am not sure whether paying even more for PCIe 5.0 is a good idea.
 

Rickyiez

Member
Man you are hugely downplaying the performance differences here

3080 was magnitudes faster than 2080 (40% at 4K) -- 70% over 2080, source: HUB


1080 Ti was magnitudes faster than 980 Ti (46% at 4K) -- 85% over 980 Ti, source: TPU


4090 is 40% faster than 3090 at 4K -- 75% over 3090, source: HUB

All average raster performance at 4K.

This is how I read the performance difference https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/34.html

It's actually the same as how you read it: the 2080 delivers about 60% of the 3080's frame rate (e.g. 60 FPS vs 100 FPS), so you can quote that either as the 2080 being 40% slower or as the 3080 running at ~170% of the 2080, i.e. roughly 70% faster.
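In case the two ways of quoting the same gap look contradictory, here's the baseline arithmetic spelled out (illustrative FPS numbers, not benchmark data):

# Same gap, two baselines (illustrative FPS numbers).
fps_3080 = 100
fps_2080 = 60

slower_pct = (1 - fps_2080 / fps_3080) * 100   # 2080 measured against the 3080 -> 40% slower
faster_pct = (fps_3080 / fps_2080 - 1) * 100   # 3080 measured against the 2080 -> ~67% faster
relative   = fps_3080 / fps_2080 * 100         # 3080 at ~167% relative performance

print(f"2080 is {slower_pct:.0f}% slower; 3080 is {faster_pct:.0f}% faster ({relative:.0f}% relative)")

Same benchmark, two different baselines - which is why a normalized chart (TPU style) and an "X% faster" figure (HUB style) can describe the same run.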



But my counter-argument to the person I quoted remains: the generational performance increases have always been of this magnitude, not just from the 3090 to the 4090.
 


un1que

Member
I'm not as clued up as most of you with PC building, but what's the usual best option with NVIDIA cards? The Founders Edition or another manufacturer? If it's the latter, who is usually best?
 

Buggy Loop

Member


Very interesting details!


Awkward Season 9 GIF by The Office


AI rendering could be a huge game changer depending on what the 1st iteration is like.

Imagine a game with basic assets: you define how the game will play, the building blocks, the physics/animation, etc., and the basic geometry of the level and objects. Then AI puts the coat of paint on it, in whatever style you want. It knows what lighting is supposed to look like, at a fraction of the cost of path tracing.
 

Xdrive05

Member
I wonder if it's going to be a gen-locked feature à la frame gen for the 40 series. Better than 50% chance, probably. They gotta have some shiny reason to sell yet another 60-series card with 8GB VRAM on a 128-bit bus for a probably higher MSRP than the previous one.

Over/under on "Neural Rendering" being marketed with another DLSS prefix (DLSS4?) or some new designation entirely?

I'm thinking "DLNR" if it really has nothing to do with upscaling or frame gen.
 

Bojji

Member
I wonder if it's going to be a gen-locked feature à la frame gen for the 40 series. Better than 50% chance, probably. They gotta have some shiny reason to sell yet another 60-series card with 8GB VRAM on a 128-bit bus for a probably higher MSRP than the previous one.

Over/under on "Neural Rendering" being marketed with another DLSS prefix (DLSS4?) or some new designation entirely?

I'm thinking "DLNR" if it really has nothing to do with upscaling or frame gen.

Surprisingly, Nvidia wasn't locking features that much before. GameWorks stuff worked on everything, there was some Maxwell-exclusive DX12_2 stuff but almost nothing used it, MFAA for Maxwell as well? TXAA for Kepler?

For DLSS they only have frame gen locked to Ada; ray reconstruction is the latest thing ("DLSS 3.5") and it works on everything with the RTX name.

We will see.
 
600 Watts is truly crazy. I cannot imagine how anyone can justify 600 watts for the GPU just to play games.
That's the one thing that may make me not get a 5090: power consumption, and therefore temps/fan noise.
If it's loud and heats up my room... I may have to pass. It'll probably be fine though.

I need the GPU compute for 4K.

Well, that or if it's $2500. That's just wrong.
 

llien

Member
How much of this is just PR bullshit?
Let's go point by point:

  1. "There will be a new DLSS feature that PF will hype that won't run on series below 5000" - no doubt
  2. RT cores that deliver "more realistic" lighting. BS. One can twist it like "but it's faster, so kinda allows for more realistic"
  3. "AI accelerated graphics" - modern "Y2K compatible speakers"
  4. "Improved AI performance" - no sh*t, watson, faster cards do scheisse faster
  5. "Better integration of AI in games" - I'm afraid another try to pull proprietary API trick
  6. "Neural Rendering Capabilities" - perhaps a runtime filter. I'm skeptical.
  7. "AI enhanced power efficiency" - guaranteed lie
  8. "Improved AI upscaling" and DLSS used outside games to upscale thingies. Yeah, why not.
  9. "Generative AI acceleration" - most GPUs can do that, of course new, faster gen will be even faster.
 

Bojji

Member
Let's go point by point:

  1. "There will be a new DLSS feature that PF will hype that won't run on series below 5000" - no doubt
  2. RT cores that deliver "more realistic" lighting. BS. One can twist it like "but it's faster, so kinda allows for more realistic"
  3. "AI accelerated graphics" - modern "Y2K compatible speakers"
  4. "Improved AI performance" - no sh*t, watson, faster cards do scheisse faster
  5. "Better integration of AI in games" - I'm afraid another try to pull proprietary API trick
  6. "Neural Rendering Capabilities" - perhaps a runtime filter. I'm skeptical.
  7. "AI enhanced power efficiency" - guaranteed lie
  8. "Improved AI upscaling" and DLSS used outside games to upscale thingies. Yeah, why not.
  9. "Generative AI acceleration" - most GPUs can do that, of course new, faster gen will be even faster.

I wonder how much of it is just about RTX cards in general.
 

Dorfdad

Gold Member
Most people won't find it worth the price. Monitors and TVs can't even make full use of a 4090 right now.

I'm sure some tech enthusiasts will love the latest and greatest, but it's not going to be a popular choice for the general public. In fact, I think this new card might actually help the 40 series sell better.

This is the first gen I'm sitting on the sidelines at launch.
 
Awkward Season 9 GIF by The Office


AI rendering could be a huge game changer depending on what the 1st iteration is like.

Imagine a game with basic assets: you define how the game will play, the building blocks, the physics/animation, etc., and the basic geometry of the level and objects. Then AI puts the coat of paint on it, in whatever style you want. It knows what lighting is supposed to look like, at a fraction of the cost of path tracing.
I mean, there was an interview with Jensen Huang 5 months ago where he was teasing what the future of DLSS will be; he talked about "generating much better models & textures" than what the game has, using AI of some sort. I think this is its manifestation? Or rather, its first attempt at doing just that with the 50-series graphics cards? No wonder the 5090 has 32GB, as this stuff is pretty expensive on video memory… Maybe it's just a hunk of bologna, we'll see…
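For a rough sense of why that eats video memory (purely illustrative figures, nothing from the interview or Nvidia), plain uncompressed textures already balloon quickly as resolution goes up:

# Illustrative only: raw RGBA8 texture footprints, ignoring block compression.
def rgba8_size_mb(width, height, mip_chain=True):
    base = width * height * 4                      # 4 bytes per texel (RGBA8)
    total = base * 4 / 3 if mip_chain else base    # a full mip chain adds ~33%
    return total / (1024 ** 2)

for res in (1024, 2048, 4096, 8192):
    print(f"{res}x{res}: ~{rgba8_size_mb(res, res):.0f} MB per texture")
# ~5 MB, ~21 MB, ~85 MB, ~341 MB respectively - so anything generating or
# caching higher-quality assets on the fly needs real headroom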
 

Buggy Loop

Member
I mean, there was an interview with Jensen Huang 5 months ago where he was teasing what the future of DLSS will be; he talked about "generating much better models & textures" than what the game has, using AI of some sort. I think this is its manifestation? Or rather, its first attempt at doing just that with the 50-series graphics cards? No wonder the 5090 has 32GB, as this stuff is pretty expensive on video memory… Maybe it's just a hunk of bologna, we'll see…

Yeah, I might be too optimistic about 1st-gen AI rendering...

AI textures could already be something quite nice. Microsoft, among others, apparently had an Xbox studio that was experimenting with it.




Or AI skins on characters, effects, etc.



Maybe they'll implement an accelerator in their RT cores specifically for neural radiance cache path tracing.



But deep down I want to see the inevitable. Maybe it's too soon, but it would be a massive bomb in the rendering world if they have something like this, or an improvement on this concept with fewer hallucinations (the graphics shifting around - these are amateur videos made without top-of-the-line ML).



 

RavenSan

Off-Site Inflammatory Member
If the RX8800XT comes with 16GB VRAM and an (estimated) $500 price point -- that may be enough for me to ditch Nvidia. I can't fathom that a 5080 at twice (or more) the price is going to give me twice the performance.

But I'll ultimately wait for reviews. Personally, taking 25% less performance from a card at half the price is a no-brainer.
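Spelling that trade-off out with placeholder numbers (the price and the performance gap are just the estimates above, not benchmarks):

# Hypothetical perf-per-dollar comparison using the rough estimates above.
cards = {
    "RX8800XT (rumoured)": {"price_usd": 500,  "relative_perf": 75},
    "RTX 5080 (guessed)":  {"price_usd": 1000, "relative_perf": 100},
}

for name, c in cards.items():
    ppd = c["relative_perf"] / c["price_usd"]
    print(f"{name}: {ppd:.3f} perf per dollar")
# At these numbers the cheaper card delivers ~1.5x the performance per dollar,
# which is the no-brainer case described above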
 

CuNi

Member
5090 is 100% going to be 3k€

Nah, I'm guessing 2-2.5k€.
Anything above that would mean the bottom tier also has to creep up by a lot.
With Intel and AMD most likely fighting Nvidia in the low and mid-tier, they can't just leave a 1k€ gap between the 70 and the 80/90 series.
 
If the RX8800XT comes with 16GB VRAM and an (estimated) $500 price point -- that may be enough for me to ditch Nvidia. I can't fathom that a 5080 at twice (or more) the price is going to give me twice the performance.

But I'll ultimately wait for reviews. Personally, taking 25% less performance from a card at half the price is a no-brainer.
If PC were all about price point, Nvidia would not have got a 90% share of the PC market. PC is the one market where price point matters the least.
 

RavenSan

Off-Site Inflammatory Member
If PC were all about price point, Nvidia would not have got a 90% share of the PC market. PC is the one market where price point matters the least.
I'm not sure what your point is? I know that a lot of folks are happy to spend that money on an Nvidia card -- I've got a 3070 now.

I'm just saying that if the cards are even remotely comparable (I'd say within 20% of each other), there's no way I would personally spend twice as much on an Nvidia card. I don't think I speak for every PC gamer -- this is merely my opinion.
 

HeisenbergFX4

Gold Member
Most people won't find it worth the price. Monitors and TVs can't even make full use of a 4090 right now.

I'm sure some tech enthusiasts will love the latest and greatest, but it's not going to be a popular choice for the general public. In fact, I think this new card might actually help the 40 series sell better.

This is the first gen I'm sitting on the sidelines at launch.
100% the 90s are for the junkies
 

Kenpachii

Member
If that 5090 is going to be 2k+ euros, I am out.
The 5080 not having 24GB is kinda shit; might as well opt for a 5070 Ti with 16GB then, at least I won't feel fully scammed.
 

Zug

Member
Nah, I'm guessing 2-2.5k€.
Anything above that would mean the bottom tier also has to creep up by a lot.
With Intel and AMD most likely fighting Nvidia in the low and mid-tier, they can't just leave a 1k€ gap between the 70 and the 80/90 series.
4090 prices still range from 2300 to 2500€ in France atm. I probably even underestimated it; make it 3.5k€ for the premium versions.
 