
Let's face it: GPU manufacturers have hit the wall and things are only going to get worse from here.

bundylove

Member
Expect a 28TF machine with a focus on RT hardware and more AI

Of course there will also be architecture improvements and more RAM with much higher bandwidth, and a better CPU

Bigger leap than PS5 pro on the whole but the gpu itself will probably be a similar leap
Yeah, that would be really low spec-wise.

Though I think times have changed, and I believe the entry price will be $700. And for that money I would expect 4080 Ti performance.

If I get a 28TF console for $700, I won't buy.
 
Yeah, that would be really low spec-wise.

Though I think times have changed, and I believe the entry price will be $700. And for that money I would expect 4080 Ti performance.

If I get a 28TF console for $700, I won't buy.

It probably will be those specs @ $699, inflation doesn’t go down
 

SHA

Member
AI's jump is still bigger than anything we saw in all those years, and it has no obvious limit. Its significance shouldn't be underestimated.
 

bundylove

Member
It probably will be those specs @ $699, inflation doesn’t go down
Well, then I want to see what devs can show off on that console.
I feel this gen didn't take advantage of the consoles: four years of cross-gen games and lots of remakes and remasters.
Games that would be next-gen either run or look like shit, or are just a buggy mess.

If they can provide a true next-gen, no-cross-gen console for $700 and 28TF, we will see wonders.

If it's more of the same, then I have no hope for the industry.
 

SHA

Member
We have seen huge improvements in GPUs since the early 2000s, almost every year or two, but it seems that many people do not understand what made this possible. It's not that Nvidia and ATI/AMD improved the architecture so much each year as to get a 2x increase in performance. The truth is that this progress was only possible because TSMC was able to offer more transistors at the same price every 1-2 years. Unfortunately, we have hit the node wall, and Nvidia can no longer improve performance without increasing price and power consumption (the RTX 5090 with its absolutely ridiculous 575W TDP). Some people like to think that Nvidia is just greedy, but can we really expect them to cut prices when costs go up? That's not realistic.

So what does this mean for us gamers? Well, we can certainly say goodbye to cheap GPUs and meaningful improvements every 2 years.

980 Ti vs. 1080 Ti: a whopping 92% relative difference, even without DLSS/FG gimmicks. We can forget about such massive gains.


doom_3840_2160.png


The good part is, current GPUs will last longer than ever.


Nor can we expect the PS6 to bring the desired improvements. I doubt AMD will make an APU able to match even the RTX 4090 in the next few years. I was hoping to see PT used on the PS6, but things are not looking good. Perhaps Sony should delay the PS6 until they can offer a meaningful performance/technology boost.

What can we do if things only get worse with our current hobby? Maybe we should get a new hobby and play some chess 🙃😃 🤣?
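For anyone who wants to sanity-check percentages like that 92%, the "relative difference" is just the fps ratio minus one. A quick sketch (the 28 vs 54 fps pair is taken from the Anno numbers posted later in this thread):

```python
# Relative performance difference between two GPUs from average fps.
def rel_diff(old_fps: float, new_fps: float) -> float:
    """Percent speedup of new_fps over old_fps."""
    return (new_fps / old_fps - 1) * 100

# TechPowerUp's Anno numbers quoted later in the thread: 28 fps vs 54 fps
print(round(rel_diff(28, 54)))  # 93
```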
Cerny stated the PS6 isn't about the GPU, and he's not only talking about his own product; all these companies are working together to decide the future of gaming tech.
 

Buggy Loop

Gold Member
I played both the Horizon remaster and Forbidden West. Beautiful scenery and amazing art direction, but the lighting was flat as hell.

That's not a limitation of having no RT. There were solutions to almost everything on the raster side years ago. I'm currently playing KCD, and I don't understand why we even needed to move away from SVOGI & SMAA (fuck TAA)





At least this doesn't need TAA forced to hide all the GI flicker like we see in Lumen.

PT games like Wukong,

Performs like absolute shit

Cyberpunk have way better lighting

Performs good for the scale of things, but again, you're targeting ~4% of hardware owners that can enable it? Forget about it on consoles. I'm all for top of the line graphics, that's the goal of PC, but the BASELINE has barely improved. Do you feel the 2TF → 10 TF console gen? I don't.

also Avatar is on totally different level despite using standard RT.

Eh, haven't played it because it's Ubisoft slop, and I'm sure on PC with max settings it's something else, but the baseline on console shouldn't look this bad



Sony devs made Forbidden West run on 2TF. Avatar is 10TF minimum and beyond.

Like I said, barely improved. Sometimes downgraded even.
 

Puscifer

Member
The people that can't afford it will just return to console. The people with money will keep on trucking.
I can afford it, but I'm not giving entire percentages of my income for a single part. As I said, rounding my income down to $100K, you're talking 1.5% of that for a single component, when for the 2000 series that money built a full 2080 PC. I'm supporting AMD going forward; Nvidia can kick rocks
 

nemiroff

Gold Member
Disregarding notions, what does the actual data indicate? What is the year-over-year percentage increase in processing speed over the past few decades? And how much of this increase is affected by technical factors?

Anyway, no need to worry that much. Realtime AI rendering will revolutionize the future, and in many ways, it has already begun (DLSS f.ex.). Moreover, even though it will take more time, we've seen the first truly positive results from advancements in quantum computing coming in recently.
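One way to answer the "what does the data indicate" question above is to back out a compound annual growth rate from any two data points. A sketch; the 8x-over-10-years endpoint is purely illustrative, not a measured figure:

```python
# Compound annual growth rate of performance between two points in time.
def yoy_growth(perf_start: float, perf_end: float, years: float) -> float:
    """Average year-over-year growth, in percent."""
    return ((perf_end / perf_start) ** (1 / years) - 1) * 100

# Illustrative only: a GPU line that got 8x faster over 10 years
print(round(yoy_growth(1.0, 8.0, 10), 1))  # 23.1
```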
 

Stuart360

Member
That's not a limitation of having no RT. There were solutions to almost everything on the raster side years ago. I'm currently playing KCD, and I don't understand why we even needed to move away from SVOGI & SMAA (fuck TAA)





At least this doesn't need TAA forced to hide all the GI flicker like we see in Lumen.



Performs like absolute shit



Performs good for the scale of things, but again, you're targeting ~4% of hardware owners that can enable it? Forget about it on consoles. I'm all for top of the line graphics, that's the goal of PC, but the BASELINE has barely improved. Do you feel the 2TF → 10 TF console gen? I don't.



Eh, haven't played it because it's Ubisoft slop, and I'm sure on PC with max settings it's something else, but the baseline on console shouldn't look this bad



Sony devs made Forbidden West run on 2TF. Avatar is 10TF minimum and beyond.

Like I said, barely improved. Sometimes downgraded even.

Can't speak for the console versions, but I own both Forbidden West and Avatar, and FW is not even in the same league as Avatar maxed out.
Oh, and lol at that biased comparison vid, which is clearly using a low resolution and low settings for Avatar (you barely see any pop-in on Ultra settings). I love it when you get these clearly biased comparison vids lol.
 
I bet one of the Xbox options lives up to expected numbers but comes at a cost

Xbox said the same thing when they “won” the teraflop wars of 12TF vs 10TF and look how that panned out.

Especially since you’ve been saying Xbox will release first

I’ll believe it when I see it, they will be within spitting distance of each other.
 

kungfuian

Member
Looking at consoles and graphics cards from a traditional perspective, the OP might be right that things have plateaued, but I don't think the hardware and software of the future will follow the traditional path.

So far we've seen dedicated ray tracing hardware emerge over the last 10 years and, more recently, AI software (and, to a very limited extent, hardware) applied to upscaling and frame generation, but that's just the beginning. Aside from further refinement of these approaches, I'd expect hardware to transition even further toward heavy reliance on dedicated AI chiplets or something to that effect. The traditional methods of creating hardware and software will give way to more heavily AI-based hardware and game creation, across all methods of rendering, managing in-game AI systems, animation, character interactions/dialogue, etc.
 

Madflavor

Member
I'm just glad I grew up during the time when we got to see actual graphical leaps between systems. From NES to SNES, then to the PS1 to PS2, etc.

Nowadays it's just "Here's our new console, check out that lighting! 30fps? How about sustained 40fps on Quality Modes!"

The "Next Gen Console" just doesn't hit the same anymore.
 

HeisenbergFX4

Gold Member
Xbox said the same thing when they “won” the teraflop wars of 12TF vs 10TF and look how that panned out.

Especially since you’ve been saying Xbox will release first

I’ll believe it when I see it, they will be within spitting distance of each other.
For sure the power crown did nothing for them
 

Kenpachii

Member
DLSS and frame generation are large performance boosters, so it's not just raster performance that matters anymore. We will see more and more focus on AI techniques to gain performance as GPU progress slows.
 
Well, then I want to see what devs can show off on that console.
I feel this gen didn't take advantage of the consoles: four years of cross-gen games and lots of remakes and remasters.
Games that would be next-gen either run or look like shit, or are just a buggy mess.

If they can provide a true next-gen, no-cross-gen console for $700 and 28TF, we will see wonders.

If it's more of the same, then I have no hope for the industry.

You need to reset expectations.

Cross-gen is a thing. Games are too costly to develop. Cross-gen will only get worse.

Next-gen-only games will be limited to AAA games that are cheap to produce (Ratchet/Returnal) and remasters (Demon's Souls).

PS6 games will be cross-gen ports with things like path tracing and ray tracing enabled, better resolutions/frame rates, and higher texture quality. That's what it will be.
 

Stuart360

Member
You need to reset expectations.

Cross-gen is a thing. Games are too costly to develop. Cross-gen will only get worse.

Next-gen-only games will be limited to AAA games that are cheap to produce (Ratchet/Returnal) and remasters (Demon's Souls).

PS6 games will be cross-gen ports with things like path tracing and ray tracing enabled, better resolutions/frame rates, and higher texture quality. That's what it will be.
I wouldn't be surprised if cross-gen lasts the whole of next gen, at least for some games/publishers.
The next-gen versions will just be higher res/framerate, with a bit of extra detail here and there.
 

bundylove

Member
I bet one of the Xbox options lives up to expected numbers but comes at a cost


Not fair for me to ballpark numbers

Kind of surprised they haven’t leaked, 99% sure people like Tom Henderson are sitting on that info, for months now
Wow, so we actually are close to a PS6 info reveal?

Next gen starts early. Hoping
 
I wouldn't be surprised if cross-gen lasts the whole of next gen, at least for some games/publishers.
The next-gen versions will just be higher res/framerate, with a bit of extra detail here and there.

Yeah, pretty much. I don't see these massive live-service games doing away with last-gen versions for perhaps the entirety of the gen, especially with the Switch 2 coming onto the market, which will not even be up to current-gen standards.

Pushing next-gen tech just isn't happening much anymore and I expect that trend to continue, you still get amazing looking games but not much that can truly push boundaries. Diminishing returns on hardware + scalability + AAA costs are the reason.
 

Firestartah

Member
Continue to play games? Graphics are the last aspect of a game's quality that I care about.
This.

And not only that, but personally I find myself going back to 7th-gen games a lot and replaying the classics; the new sequels to those franchises just don't hit the same (the Mass Effect trilogy, Assassin's Creed's Ezio trilogy and Black Flag, Batman Arkham Asylum/City/Knight/Origins, etc.). You just don't need the latest shit to play those. Until this industry returns to quality, I feel no pressure to upgrade my 2080.
 

Stuart360

Member
Yeah, pretty much. I don't see these massive live-service games doing away with last-gen versions for perhaps the entirety of the gen, especially with the Switch 2 coming onto the market, which will not even be up to current-gen standards.

Pushing next-gen tech just isn't happening much anymore and I expect that trend to continue, you still get amazing looking games but not much that can truly push boundaries. Diminishing returns on hardware + scalability + AAA costs are the reason.
Well, at least graphics didn't stall in the PS2/Xbox gen. Games today look great on the whole, even some AA and indie games.
I can live with how games look today, until some point in the future when new rendering techniques arrive.
 

bundylove

Member
You need to reset expectations.

Cross-gen is a thing. Games are too costly to develop. Cross-gen will only get worse.

Next-gen-only games will be limited to AAA games that are cheap to produce (Ratchet/Returnal) and remasters (Demon's Souls).

PS6 games will be cross-gen ports with things like path tracing and ray tracing enabled, better resolutions/frame rates, and higher texture quality. That's what it will be.
I want to be optimistic. Maybe the next-gen cross-gen games will be miles better, as they will more or less be running on the same architecture, meaning SSD loading, SSD-based level design, Unreal 5 optimizations, etc.
Unlike this gen, where they still make game design choices based on the HDD and Jaguar CPU, etc.
 
Well, at least graphics didn't stall in the PS2/Xbox gen. Games today look great on the whole, even some AA and indie games.
I can live with how games look today, until some point in the future when new rendering techniques arrive.

Yeah I really have no complaints with visuals on a lot of games these days, a lot of them simply look awesome and next gen will look even more awesome. I guess what's disappointing are the rare titles that have severe issues and you're like "there's zero reason for this", and I'm not sure next-gen will even fix that.
 

Hoddi

Member
I can afford it, but I'm not giving entire percentages of my income for a single part. As I said, rounding my income down to $100K, you're talking 1.5% of that for a single component, when for the 2000 series that money built a full 2080 PC. I'm supporting AMD going forward; Nvidia can kick rocks
I'm in the same camp. $2500 is less than my monthly savings but I still have zero interest in supporting this bullshit.

If Intel can release a card with 12GB on a 192-bit bus for $250 then I don't want to hear excuses for nvidia charging twice that for a 128-bit card.
 

Killer8

Member
All of the sperging about the PS5 Pro price starts to look incredibly funny now that we're staring down the barrel of a $1,500 RTX 5080. It was viewed as anomalous pricing when it's really just the precedent now, in a world of inflation, diminishing returns, and market leaders being able to flop their dick on the table.
 

MikeM

Member
We are now in the refinement age. That means getting smarter with resources; performance won't leap gen-on-gen like before. The main gains will be in better RT and better tricks (AI upscaling for resolution, textures, ray reconstruction, etc.).

What really needs to happen, though, is better software. The number of games that stutter or are unable to scale as more cores become available is atrocious.
 
That's not a limitation of having no RT. There were solutions to almost everything on the raster side years ago. I'm currently playing KCD, and I don't understand why we even needed to move away from SVOGI & SMAA (fuck TAA)





At least this doesn't need TAA forced to hide all the GI flicker like we see in Lumen.



Performs like absolute shit



Performs good for the scale of things, but again, you're targeting ~4% of hardware owners that can enable it? Forget about it on consoles. I'm all for top of the line graphics, that's the goal of PC, but the BASELINE has barely improved. Do you feel the 2TF → 10 TF console gen? I don't.



Eh, haven't played it because it's Ubisoft slop, and I'm sure on PC with max settings it's something else, but the baseline on console shouldn't look this bad



Sony devs made Forbidden West run on 2TF. Avatar is 10TF minimum and beyond.

Like I said, barely improved. Sometimes downgraded even.

I don't care how Avatar looks on the PS5/Pro. I know it looks like crap on consoles, but my video was showing the PC version, and the amount of detail in that video absolutely destroys Horizon 1/2. It's literally like comparing two generations of graphics.

As for Black Myth: Wukong, PT in this game is very demanding, but once you use DLSS + FG, performance is very good. I get 114fps with very high settings and max PT. With medium PT I only get 3% less performance compared to Lumen (123fps with PT medium vs 127 with Lumen). They have optimised PT performance so well in this game that it doesn't even make sense to play it without PT on my PC.

If you think over 100fps with PT is absolute shit, what can you say about the PS5 Pro's performance in this game, or in other PS5 Pro games?
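For what it's worth, the quoted 123 vs 127 fps gap does work out to roughly 3%; a quick sketch of the arithmetic:

```python
# How much slower configuration A is than configuration B, in percent.
def pct_slower(fps_a: float, fps_b: float) -> float:
    return (fps_b - fps_a) / fps_b * 100

# PT medium (123 fps) vs Lumen (127 fps), figures quoted above
print(round(pct_slower(123, 127), 1))  # 3.1
```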
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
I wouldn't be surprised if cross-gen lasts the whole of next gen, at least for some games/publishers.
The next-gen versions will just be higher res/framerate, with a bit of extra detail here and there.

If both MS and sony release both nextgen consoles AND lower specced handhelds, everything will be crossgen in essence.
 

Buggy Loop

Gold Member
I don't care how Avatar looks on the PS5/Pro. I know it looks like crap, but my video was showing the PC version, and the amount of detail in that video absolutely destroys Horizon 1/2. It's literally like comparing two generations of graphics.

As for Black Myth: Wukong, PT in this game is very demanding, but once you use DLSS + FG, performance is very good. I get 114fps with very high settings and max PT. With medium PT I only get 3% less performance compared to Lumen (123fps with PT medium vs 127 with Lumen). They have optimised PT performance so well in this game that it doesn't even make sense to play it without PT on my PC.

If you think over 100fps with PT is absolute shit, what can you say about the PS5 Pro's performance in this game, or in other PS5 Pro games?

Do you realize how ridiculous it is to have an Ada graphics card and just say "well, this is fine"?

Look at my post history with path tracing. I've been one of the biggest ambassadors of this tech for years. No, games are not supposed to cater to ~4% of hardware users while leaving consoles with worse-looking games and pixel-soup image quality. Devs have completely lost it. They've hit the wall at 120mph; almost all of them are one AAA flop away from closure. Making even bigger GPUs is not enough. Engines perform like total dogshit.
 

kikkis

Member
Moore's Law is at its end, and while GAAFET transistors will provide a sizeable jump in transistor density, the bigger reason for the stagnation in semiconductor manufacturing (in terms of cost-saving opportunities) is that TSMC now holds a monopoly on bleeding-edge logic semiconductor manufacturing.

Given that GlobalFoundries fell off a cliff and Intel got left in the dust, TSMC has been increasing prices across all major nodes multiple times over the past half-decade. They will of course blame wider socio-economic factors for the increases, but the truth is pure and simple corporate greed, because they now have no competition.

The TSMC monopoly is the real root cause of the slowdown and death of Moore's Law. The silver lining is that, as we reach the end of silicon scaling, there is an opportunity for whole new exotic computing technologies to emerge that could kick-start a new era of technological advancement and an arms race between many different players. The problem is, most of those non-silicon technologies are still decades away from prime time.
I don't think TSMC's margins are super high, considering they get a lot from the CHIPS Act alone. Looking at Wikipedia, they had a 35% margin; you would expect something like 20% for bleeding-edge tech with massive initial investment costs. So really, if a 4060 costs $300 now and a lot of that is Nvidia's cut, maybe $10-20 could be recovered from lower TSMC margins.
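The $10-20 estimate can be sanity-checked with back-of-envelope arithmetic. Everything below is an assumption (the die's share of card cost especially), not actual BOM data:

```python
# Rough sketch: how much a lower foundry margin could shave off a $300 card.
# All figures are assumptions for illustration, not real BOM numbers.
card_price = 300.0
die_cost_share = 0.25          # assumed: GPU die is ~25% of the card's cost
tsmc_margin_now = 0.35         # gross margin cited in the post
tsmc_margin_target = 0.20      # the "fair" margin suggested in the post

die_cost = card_price * die_cost_share
# With gross margin m, selling price = cost / (1 - m)
underlying_cost = die_cost * (1 - tsmc_margin_now)
die_cost_lower = underlying_cost / (1 - tsmc_margin_target)
savings = die_cost - die_cost_lower
print(round(savings, 2))  # ~14 dollars, inside the $10-20 range claimed
```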
 

SlimySnake

Flashless at the Golden Globes
We have seen huge improvements in GPUs since the early 2000s, almost every year or two, but it seems that many people do not understand what made this possible. It's not that Nvidia and ATI/AMD improved the architecture so much each year as to get a 2x increase in performance. The truth is that this progress was only possible because TSMC was able to offer more transistors at the same price every 1-2 years. Unfortunately, we have hit the node wall, and Nvidia can no longer improve performance without increasing price and power consumption (the RTX 5090 with its absolutely ridiculous 575W TDP). Some people like to think that Nvidia is just greedy, but can we really expect them to cut prices when costs go up? That's not realistic.

So what does this mean for us gamers? Well, we can certainly say goodbye to cheap GPUs and meaningful improvements every 2 years.

980 Ti vs. 1080 Ti: a whopping 92% relative difference, even without DLSS/FG gimmicks. We can forget about such massive gains.


doom_3840_2160.png


The good part is, current GPUs will last longer than ever.


Nor can we expect the PS6 to bring the desired improvements. I doubt AMD will make an APU able to match even the RTX 4090 in the next few years. I was hoping to see PT used on the PS6, but things are not looking good. Perhaps Sony should delay the PS6 until they can offer a meaningful performance/technology boost.

What can we do if things only get worse with our current hobby? Maybe we should get a new hobby and play some chess 🙃😃 🤣?
Get the average performance results from TechPowerUp. The 1080 Ti was 67% more powerful on average than the 980 Ti. The 2080 Ti was 35%, and the 3090 was around 50%.

The 4090 is 64% more powerful on average, so you are pretty much there with the 980 Ti to 1080 Ti leap.

Yes, prices have gone up and these things are no longer affordable, but I remember the 2080 Ti launching for $1,200 back in 2018. The 4090 at $1,600 is basically inflation doing its thing, and if the 5090 is $2K then it's pretty much following the same trajectory as the past few gens.

I think the performance uplift will be around 50-60%, and some fancy AI stuff will set it apart from the 4000 series.

AI will compensate for the smaller leaps we are bound to get in the next few years. The 5000 series will be the start of that, with AMD two years behind as usual and then Sony coming in the 3rd or 4th year.
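Chaining these quoted uplifts together gives the cumulative flagship gain across the generations; a quick sketch taking the poster's TechPowerUp figures as-is (they are approximate averages, and the exact numbers are disputed further down the thread):

```python
# Gen-on-gen flagship uplifts quoted in the post above (approximate).
uplifts = {
    "980 Ti -> 1080 Ti": 67,
    "1080 Ti -> 2080 Ti": 35,
    "2080 Ti -> 3090":    50,
    "3090 -> 4090":       64,
}

# Multiply the per-gen speedups to get the cumulative gain
total = 1.0
for pct in uplifts.values():
    total *= 1 + pct / 100
print(round(total, 1))  # roughly 5.5x from the 980 Ti to the 4090
```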
 
Do you realize how ridiculous it is to have an Ada graphics card and just say "well, this is fine"?
The game is very scalable and still looks very good even with lumen.

WcXos5j.jpeg


The argument, however, was about the progress of graphics technology. Modern high-end PCs are already capable of delivering the next level of graphics fidelity compared to the PS4 (2015 graphics). BMW with PT shows that difference, and I would go as far as to say that BMW on PC looks next-gen even compared to PS5 graphics. Of course you need an Ada GPU to play this game with PT, but it can be done on today's PC hardware, and I can assure you the RTX 50 series will run this game even better.

If AMD is not able to achieve at least Ada Lovelace 50TF-level performance in 2028/2029 APUs, IMO Sony should delay the PS6 until they can offer a meaningful performance/technology boost.
 
We have seen huge improvements in GPUs since the early 2000s, almost every year or two, but it seems that many people do not understand what made this possible. It's not that Nvidia and ATI/AMD improved the architecture so much each year as to get a 2x increase in performance. The truth is that this progress was only possible because TSMC was able to offer more transistors at the same price every 1-2 years. Unfortunately, we have hit the node wall, and Nvidia can no longer improve performance without increasing price and power consumption (the RTX 5090 with its absolutely ridiculous 575W TDP). Some people like to think that Nvidia is just greedy, but can we really expect them to cut prices when costs go up? That's not realistic.

So what does this mean for us gamers? Well, we can certainly say goodbye to cheap GPUs and meaningful improvements every 2 years.

980 Ti vs. 1080 Ti: a whopping 92% relative difference, even without DLSS/FG gimmicks. We can forget about such massive gains.


doom_3840_2160.png


The good part is, current GPUs will last longer than ever.


Nor can we expect the PS6 to bring the desired improvements. I doubt AMD will make an APU able to match even the RTX 4090 in the next few years. I was hoping to see PT used on the PS6, but things are not looking good. Perhaps Sony should delay the PS6 until they can offer a meaningful performance/technology boost.

What can we do if things only get worse with our current hobby? Maybe we should get a new hobby and play some chess 🙃😃 🤣?
Could you imagine if the PS5 couldn't even match a Maxwell Titan? (For reference, it matches the Pascal Titan in performance.) That's basically what the 4090 is to the PS6. Sad times.
 
Get the average performance results from TechPowerUp. The 1080 Ti was 67% more powerful on average than the 980 Ti. The 2080 Ti was 35%, and the 3090 was around 50%.

The 4090 is 64% more powerful on average, so you are pretty much there with the 980 Ti to 1080 Ti leap.

Yes, prices have gone up and these things are no longer affordable, but I remember the 2080 Ti launching for $1,200 back in 2018. The 4090 at $1,600 is basically inflation doing its thing, and if the 5090 is $2K then it's pretty much following the same trajectory as the past few gens.

I think the performance uplift will be around 50-60%, and some fancy AI stuff will set it apart from the 4000 series.

AI will compensate for the smaller leaps we are bound to get in the next few years. The 5000 series will be the start of that, with AMD two years behind as usual and then Sony coming in the 3rd or 4th year.
Besides Doom 2016, there were quite a few games in the TechPowerUp review where the 1080 Ti was 90%+ faster than the previous generation.

Anno - 28fps vs 54fps 93% relative difference

Battlefield 1 - 36fps vs 69fps 91% relative difference

COD Infinite Warfare - 44fps vs 84fps 91% relative difference

Fallout 4 - 40fps vs 78fps 95% relative difference

GTA5 - 44fps vs 84fps 91% relative difference

Hitman - 33fps vs 70fps 112% relative difference

Mafia 3 - 21fps vs 42fps 100% relative difference

Rise Of The Tomb Raider - 27 fps vs 52fps 92% relative difference

The Witcher 3 - 31fps vs 59fps 90% relative difference

Total War Warhammer - 22fps vs 42fps 90% relative difference

Watch Dogs 2 - 21fps vs 41fps 95% relative difference

There were also quite a few games with an 80-89% relative difference. I think some games were CPU-limited as well.

perfrel_3840_2160.png


Based on all 25 games tested, the 1080 Ti was 85% faster on average than the 980 Ti, so IDK where you found that 67%.

The RTX 4090 was only 46% faster on average than the 3090 Ti, and I haven't found a single raster game where the RTX 4090 is 90% faster than the previous generation, or even close to 80%. The RTX 4090 can be 2x faster, but only in RT, or if you use frame generation.

average-fps_3840-2160.png


I'm glad you mentioned the 2080 Ti. The RTX 2080 Ti was released in 2018, and it took 7 years before a console (PS5 Pro) could match its performance (the 2080 Ti is still faster, especially in RT, but not by much). How many years do you think it will take AMD to build an APU that can compete with the RTX 4090, let alone the RTX 5090?
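Running the quoted fps pairs back through the relative-difference formula reproduces the per-game list above (give or take a point of rounding in a couple of rows) and its average:

```python
# Per-game fps pairs quoted above (980 Ti vs 1080 Ti at 4K).
games = {
    "Anno": (28, 54),
    "Battlefield 1": (36, 69),
    "COD Infinite Warfare": (44, 84),
    "Fallout 4": (40, 78),
    "GTA5": (44, 84),
    "Hitman": (33, 70),
    "Mafia 3": (21, 42),
    "Rise of the Tomb Raider": (27, 52),
    "The Witcher 3": (31, 59),
    "Total War: Warhammer": (22, 42),
    "Watch Dogs 2": (21, 41),
}

# Relative difference in percent, rounded per game
diffs = {g: round((new / old - 1) * 100) for g, (old, new) in games.items()}
avg = sum(diffs.values()) / len(diffs)
print(diffs["Hitman"], round(avg))  # 112 95
```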
 

SlimySnake

Flashless at the Golden Globes
Besides Doom 2016, there were quite a few games in the TechPowerUp review where the 1080 Ti was 90%+ faster than the previous generation.

Anno - 28fps vs 54fps 93% relative difference

Battlefield 1 - 36fps vs 69fps 91% relative difference

COD Infinite Warfare - 44fps vs 84fps 91% relative difference

Fallout 4 - 40fps vs 78fps 95% relative difference

GTA5 - 44fps vs 84fps 91% relative difference

Hitman - 33fps vs 70fps 112% relative difference

Mafia 3 - 21fps vs 42fps 100% relative difference

Rise Of The Tomb Raider - 27 fps vs 52fps 92% relative difference

The Witcher 3 - 31fps vs 59fps 90% relative difference

Total War Warhammer - 22fps vs 42fps 90% relative difference

Watch Dogs 2 - 21fps vs 41fps 95% relative difference

There were also quite a few games with an 80-89% relative difference. I think some games were CPU-limited as well.

perfrel_3840_2160.png


Based on all 25 games tested, the 1080 Ti was 85% faster on average than the 980 Ti, so IDK where you found that 67%.

The RTX 4090 was only 46% faster on average than the 3090 Ti, and I haven't found a single raster game where the RTX 4090 is 90% faster than the previous generation, or even close to 80%. The RTX 4090 can be 2x faster, but only in RT, or if you use frame generation.

average-fps_3840-2160.png


I'm glad you mentioned the 2080 Ti. The RTX 2080 Ti was released in 2018, and it took 7 years before a console (PS5 Pro) could match its performance (the 2080 Ti is still faster, especially in RT, but not by much). How many years do you think it will take AMD to build an APU that can compete with the RTX 4090, let alone the RTX 5090?
AS6UD3L.jpeg
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Meanwhile literally this generation


doom-eternal-3840-2160.png


We never got to see a full AD102 GeForce, so I'm comparing xx90 to xx90.

Ohh and last generation
doom-eternal-3840-2160.png




Looking at the leaked specs of the 5090... I think those who can afford them are gonna be happy if they're gen-on-gen upgraders.
As for the rest of us peasants, I think we're skipping a generation.


i7oBqGk.png



And yes, that's nearly 2,000GB/s of bandwidth. If ETH mining were still a thing... sweet jesus. It's crazy to me that it's kinda normal to see memory speeds measured in TB/s.




<-----Not an Nvidia fanboy
<-----Totally an Nvidia fanboy
<-----Not an Nvidia fanboy, just need CUDA for work and all CUDA ports get blocked right quick.
<-----Not an Nvidia fanboy
 