Poll: Expected Generational Uplift on GPUs?

Expected GPU uplift over the previous generation in raw rasterization? (Choose the closest.)


  • Total voters: 49

iQuasarLV

Member
As the title asks: what uplift in raw rasterization do you, as a PC gamer, expect from a GPU each generation before you conclude it is worth the money? Assume a like-for-like upgrade, 70 series to 70 series, 80 series to 80 series.
 
Pretty much in line with the efficiency gains of the node shrink, so about 15-20% would be my guess. I'm not expecting the architecture improvements to do much.
Of course there will be new AI shenanigans that will maybe come into play a few years later.

Edit: oh, you were talking about the consoles, my bad. There I'd expect at least a factor of 2, and a price at least 200 bucks above the old base.
 
20% is the new normal unless the chips are 1) on a new node, 2) big chips like the 4080 and 4090, and 3) expensive like the 4080 and 5090.

If it meets those three, the gains will be good; if not, they may not be. Even the 5090 couldn't reach 40% gains at 4K ultra because it's missing the first requirement. Next gen the gains will be better (because the 60 series will be on a new manufacturing node), but only for the more expensive cards.
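Read as a checklist, the claim is that all three conditions must hold at once. A minimal sketch of that heuristic (my own naming and encoding, not the poster's):

```python
# Toy encoding of the three-condition rule above (hypothetical helper).
def expect_big_gains(new_node: bool, big_die: bool, expensive: bool) -> bool:
    # Big gen-on-gen gains need a new node, a big chip, and a high price tier.
    return new_node and big_die and expensive

# The 5090 per this post: big and expensive, but on the same node as the 4090.
print(expect_big_gains(new_node=False, big_die=True, expensive=True))  # False
```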
 
I hope the UDNA flagship is 60-70% faster than the 9070 XT.
 
Personally, I wouldn't upgrade until I could get at least a 50-60% uplift.

In reality, if a GPU in the same tier can get around a 30% performance gain or higher, that's pretty good.
 
I expect we will see minimal uplift, if not regression, in pure raster performance as the industry continues its shift toward ray/path tracing and AI upscaling. Chip designers are going to use that extra die-shrink space to pack in other types of cores.
 
We're not gonna see 2x generational performance uplifts again. But I also refuse to upgrade for less than a 2x uplift, considering how much it costs these days.

I upgraded my 2080 Ti to a 5080 at launch and can't say it was worth it at ~2.5x for 1100 USD. Which is pretty crazy given there are almost 7 years between these cards.
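Back-of-envelope on that ~2.5x figure (my arithmetic, not the poster's):

```python
# 2080 Ti -> 3080 -> 4080 -> 5080 is three generational steps over ~7 years.
total_uplift = 2.5
per_gen = total_uplift ** (1 / 3)   # ~1.36, i.e. ~36% per generation
per_year = total_uplift ** (1 / 7)  # ~1.14, i.e. ~14% per year
print(f"~{per_gen:.2f}x per gen, ~{per_year:.2f}x per year")
```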
 
~35%

Gen on gen.

This way, if I skip a generation I'm getting a truly sizeable upgrade buying in the same class.

Also, the xx70 should match the last-gen xx90. How is it possible the 5080 couldn't beat the 4090?
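Compounding that target (my arithmetic): at ~35% per generation, buying every other generation in the same class works out to roughly an 82% jump.

```python
# ~35% gen-on-gen, compounded over a skipped generation.
per_gen = 1.35
skip_one = per_gen ** 2  # ~1.82, i.e. ~82% uplift in the same class
print(f"~{skip_one:.2f}x")
```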
 
You think they're going to put more Tensor and RT cores per SM?
Note that the number of Tensor cores per SM has actually gone down since Turing.

Worse still, if you put more RT cores per SM you're gimping everything else the SM can do, including the Tensor space, since the RT core is a fixed block in the SM.

[Image: SM block diagram]
They are just going to change how the SM is used.

 
That doesn't really work anymore, since each subsequent generation offers the same cost/performance as the last; the price simply skyrockets as the performance improves.
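Taking the flat cost/performance claim literally (illustrative numbers, not real prices):

```python
# If perf-per-dollar is flat, price scales directly with the uplift:
old_price, uplift = 1000, 1.35  # hypothetical last-gen price, +35% perf
new_price = old_price * uplift  # 1350.0 -- same perf/$ as before
print(new_price)
```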
 