AMD CEO Says Radeon RX 9070 XT Has Been A Huge Success, 10X Sales Versus Previous Generation (First-Week Sales) & Confirms Increased Supply

Ritsumei2020

Who did they think I was shilling for?
Trolling both sides doesn't make it ok. I learned that the hard way when I was banned at the og place for it.

It is funny though💁‍♀️

You’re right, but back then there was a period where every other thread was about teraflops.

I wonder what the fad will be next round? I bet it's going to be the number of Zettathrusts per Nanocycle (ZTN)
 

StereoVsn

Gold Member

Those are pretty solid numbers. I hope AMD can keep it up. And I hope at least some of the cards can keep to MSRP.

On the Nvidia side I have been seeing cards pop up more often at Micro Center, but they are all either $200+ above MSRP for the 5070 Ti or $400+ above for the 5080.
 

Fafalada

Fafracer forever
I wonder what the fad will be next round? I bet it's going to be the number of Zettathrusts per Nanocycle (ZTN)
We're fully in the TOPs era now already - and NVidia has already been inflating the number to high heavens beyond what hw actually does - so it clearly matters to someone.
I bet Switch HD will try to claim it has the same TOPs throughput as the PS5 Pro (and maybe claim 2x performance over PS5+XSX combined using frame interpolation on top of it - because as we all know things stack that way - you start with one number and then multiply over and over again).

Anyway - if I was a gambling person I'd predict at some point we'll measure hw in terms of Human-hours 'saved*' by the AI.
And if that sounds dystopian - it's because it's supposed to be - as Dr. Strange said.
[Endgame "The End" GIF]


*Saving will mean everything from hours saved for efficiency, hours saved for future use, and hours saved for a future human to perform when the AI doesn't do it right.
 
If supply was unlimited I'm not sure it makes much sense to buy Nvidia in the mid range. You can easily crank a 9070 XT to surpass a 4080 in many scenarios. For $400 less.
Some raster games run extremely well on the 9070XT (not to mention the 3DMark benchmark :p), so the AMD card can sometimes beat the 4080.

[images: Whux8cK.jpeg, P1T88fZ.png]

In 2025, however, raw raster performance is no longer the most important factor. Modern games use ray tracing. AMD has closed the gap from 35% (7900 XTX) to 20%, but that's still not the same RT performance level.

[image: KY06pVK.jpeg]

When it comes to games where you really need that RT power, it seems AMD needs to make another generational leap to finally catch up with the Ada architecture.

The 9070 XT gets around 50-55 fps in the city and 35-40 fps in the grassy areas; my RTX 4080S gets around 80-90 fps in the city and 56-75 fps in the grassy areas.

[images: 4090-vs-9070.jpg, 9070-grassy-areas.jpg]

RTX4080S: [images: 11.jpg, GTA5-Enhanced-2025-03-12-23-33-46-006.jpg]

In Cyberpunk with standard RT Ultra the difference isn't huge, but it's still noticeable: 9070XT FSR Quality 72fps vs RTX4080S DLSS4 Quality 91fps (26% relative difference). With the old DLSS3 Quality (CNN model) it's 95fps (32% relative difference).


[image: 9070-XT-RT-ultra.jpg]

DLSS Quality (transformer): [image: DLSS-transformer.jpg]

In path tracing, though, the difference is much bigger: 9070XT FSR Performance 52fps vs RTX4080S DLSS4 Performance 92fps (77% relative difference). Even DLSS4 Quality at 62fps is 19% faster despite running at a much higher internal resolution.

[image: 9070-XT-PT.jpg]

DLSS4 Performance: [image: DLSS-P-Path-tracing.jpg]

DLSS4 Quality: [image: PT-DLSSQ.jpg]


In PT games like Quake 2 RTX the RX 9070XT is reasonably fast, but modern PT games have a lot of foliage, and that's where the Ada / Blackwell architecture shows its strength thanks to OMM and SER. AMD needs to finally support the DirectX Raytracing 1.2 features (OMM and shader execution reordering) to close this gap in modern PT games.


[images: Indiana-Jones.jpg, Black-myth-w.jpg, alan-wake-2.jpg]
Up to a 3.3x performance difference with PT, wider support for DLSS4 (all games that support DLSS 2 / 3 can use DLSS4), better image quality in RT games (RR looks very good in Cyberpunk with DLSS Quality and perfect with DLAA), and excellent DLSS FG that doesn't affect aiming even on a mouse (I measured 1-4ms with the latest version in Cyberpunk) compared to laggy FSR FG. Like Mister Wolf is always saying, you get what you pay for. AMD cards are cheaper for a reason.
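
(Side note on how the relative differences quoted above are computed: the faster configuration's fps divided by the slower one's, minus one. A throwaway sketch of my own, using only the numbers from this post:)

```cpp
// Quick sanity check of the "relative difference" figures quoted above.
#include <cstdio>

// Relative advantage of `fast` over `slow`, in percent.
static double rel_diff(double fast, double slow) {
    return (fast / slow - 1.0) * 100.0;
}

int main() {
    std::printf("RT Ultra, DLSS4 Q 91 vs FSR Q 72:     %.0f%%\n", rel_diff(91.0, 72.0)); // ~26%
    std::printf("RT Ultra, DLSS3 CNN Q 95 vs FSR Q 72: %.0f%%\n", rel_diff(95.0, 72.0)); // ~32%
    std::printf("Path tracing, DLSS4 P 92 vs FSR P 52: %.0f%%\n", rel_diff(92.0, 52.0)); // ~77%
    std::printf("Path tracing, DLSS4 Q 62 vs FSR P 52: %.0f%%\n", rel_diff(62.0, 52.0)); // ~19%
    return 0;
}
```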
 

Astray

Member
I was a doubter this gen, but they proved me wrong.

My next GPU will likely be an AMD one because I intend to go with Linux gaming and ditch Windows for good.
 
Some raster games run extremely well on the 9070XT (not to mention the 3DMark benchmark :p), so the AMD card can sometimes beat the 4080. [...] In 2025, however, raw raster performance is no longer the most important factor. [...] Like Mister Wolf is always saying, you get what you pay for. AMD cards are cheaper for a reason.
Raw raster is always the most important thing. That’s what you’re actually paying for, otherwise a company could just sell you the same card over and over with a software update.

“You get what you pay for”, they should put that on Nvidia ads. Either that or “fuck you, pay me”.
 
Raw raster is always the most important thing. That’s what you’re actually paying for, otherwise a company could just sell you the same card over and over with a software update.

“You get what you pay for”, they should put that on Nvidia ads. Either that or “fuck you, pay me”.
Nvidia can increase RT performance on the software side (for example mega geometry), but they usually need to implement new features at the hardware level (for example OMM and shader execution reordering, LSS, subdivision surfaces, enhancements for mega geometry) to boost RT performance. In addition, they also increase the number of RT cores. Shading performance is also important in RT games, because even RT games rely on shaders. In short, the GPU needs to be stronger to deliver better RT performance.
 

JohnnyFootball

GerAlt-Right. Ciriously.
No, it's not. It is also sold out and whatever gets on shelves is being priced up to +50% to MSRP.


It's priced exactly how it should be priced based on its performance and features.
Granted, that's better than how the previous two gens of AMD GPUs were priced. But it's not any sort of disruption.


It is not, it is very close to DLSS3's model E. All the comparisons I've seen showcase that when they are actually done properly instead of comparing FSR4 with whatever DLSS v2 a game had at its launch.
It's not a big deal though as everything starting with DLSS Model C/D and later is very close to each other to the point where the differences don't really matter much.
Why do you people LIE?
 

kevboard

Member
not gonna lie. I think their rebranding actually is a huge part of what makes these so successful.

calling them 9070 instead of 8700 was a genius move. it just looks and sounds better.
and that can be a big deal if you want to resonate with mainstream audiences.
 

64gigabyteram

Reverse groomer.
Some raster games run extremely well on the 9070XT (not to mention the 3DMark benchmark :p), so the AMD card can sometimes beat the 4080. [...] In 2025, however, raw raster performance is no longer the most important factor. [...] Like Mister Wolf is always saying, you get what you pay for. AMD cards are cheaper for a reason.
I'm still convinced the RT thing is more of a software problem than a hardware one, but maybe that's cope.
There's just a part of me that has hope that with some updates to their drivers + AMD-centric game updates AMD will have much better RT.

A 20+fps gain in all RT situations would seal the deal for me.
 

Gaiff

SBI’s Resident Gaslighter
I'm still convinced the RT thing is more of a software problem than a hardware one [...] A 20+fps gain in all RT situations would seal the deal for me.
There most certainly is a software component and devs prioritizing NVIDIA and barely doing anything on AMD, especially in those big RT showcases. Look at Assassin's Creed Shadows. I don't believe the gap in RT is all that big.
 

winjer

Gold Member
Does Intel even have units available? Almost everywhere I look has the B580 out of stock.

Good question. When Battlemage launched at the start of the year, there were shortages, but those should have been sorted by now.
I wonder if Intel just gave up on the DIY market and is now just focused on pre-builts.
 

Gaiff

SBI’s Resident Gaslighter
Good question. When Battlemage launched at the start of the year, there were shortages, but those should have been sorted by now.
I wonder if Intel just gave up on the DIY market and is now just focused on pre-builts.
Did they sort out that CPU issue that caused the 570 and 580 to under-perform with budget CPUs, the ones they're more likely to be paired with? If not, then no reason to buy this card, unless you only need an extremely powerful machine sans the GPU.
 

winjer

Gold Member
Did they sort out that CPU issue that caused the 570 and 580 to under-perform with budget CPUs, the ones they're more likely to be paired with? If not, then no reason to buy this card, unless you only need an extremely powerful machine sans the GPU.

I haven't seen any news about it. I'm sure that HU or GN would have re-tested them, if a driver had fixed those issues.
Maybe it's just how the hardware works and there are no drivers that can fix it......
 

Gaiff

SBI’s Resident Gaslighter
I haven't seen any news about it. I'm sure that HU or GN would have re-tested them, if a driver had fixed those issues.
Maybe it's just how the hardware works and there are no drivers that can fix it......
Lol two blunders in a row. First gen had issues with DX9 games, games people with budget GPUs are more likely to play. This one has issues with budget CPUs.

Well, I hope Intel doesn’t give up. Those GPUs can be decent if those problems are taken care of. I believe at least DX9 games perform much better now.
 

winjer

Gold Member
Lol two blunders in a row. First gen had issues with DX9 games, games people with budget GPUs are more likely to play. This one has issues with budget CPUs.

Well, I hope Intel doesn’t give up. Those GPUs can be decent if those problems are taken care of. I believe at least DX9 games perform much better now.

Alchemist didn't have issues with DX9. They just didn't get the drivers ready in time for that API, so it ran on a translation layer.
But they did manage to fix that with proper driver support.
The issue with Battlemage was that some instructions were missing, or present in low quantity, in the hardware itself.

 
I'm still convinced the RT thing is more of a software problem than a hardware one [...] A 20+fps gain in all RT situations would seal the deal for me.
The 9070XT's performance in PT games tanks the most in locations with dense foliage, and that's exactly where the OMM engine makes a difference. The Ampere architecture has a similar problem.

[image: v5ZpWkc.jpeg]

Ampere does not have OMM and shader reordering, so I think these features are responsible for the huge performance gap in path-traced games.

OMM and shader reordering are now part of DirectX Raytracing 1.2, so I think AMD will want to support these features with their next GPU architecture, and that's when Nvidia's dominance in PT games will end.
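
To illustrate what OMM actually buys on foliage, here's a toy sketch of my own (plain C++, not real DXR code): without micromaps, every candidate hit against an alpha-tested triangle forces an any-hit/alpha-test evaluation that interrupts traversal, whereas with an OMM the micro-triangles pre-classified as fully opaque or fully transparent are resolved during traversal and only the ambiguous ones fall back to the shader.

```cpp
// Toy illustration of why alpha-tested foliage hurts without opacity micromaps (OMM).
#include <cstdio>
#include <vector>

enum class Opacity { Opaque, Transparent, Unknown };  // per (micro-)triangle classification

struct Candidate { int triangle; Opacity omm; };

// Stand-in for an any-hit shader doing a texture alpha test; in a real renderer this is
// a shader invocation that interrupts ray traversal.
static bool alpha_test(int /*triangle*/) { return true; }

static int count_any_hit_calls(const std::vector<Candidate>& hits, bool use_omm) {
    int calls = 0;
    for (const Candidate& c : hits) {
        if (use_omm && c.omm != Opacity::Unknown) continue;  // resolved during traversal
        ++calls;                                             // falls back to the any-hit shader
        (void)alpha_test(c.triangle);
    }
    return calls;
}

int main() {
    // A ray passing through a bush: many leaf triangles, most clearly opaque or clearly
    // transparent once classified into an OMM, a few genuinely ambiguous.
    std::vector<Candidate> bush;
    for (int i = 0; i < 100; ++i)
        bush.push_back({i, i % 10 == 0 ? Opacity::Unknown
                                       : (i % 2 ? Opacity::Opaque : Opacity::Transparent)});

    std::printf("any-hit calls without OMM: %d\n", count_any_hit_calls(bush, false)); // 100
    std::printf("any-hit calls with OMM:    %d\n", count_any_hit_calls(bush, true));  // 10
    return 0;
}
```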
 

Gaiff

SBI’s Resident Gaslighter
The 9070XT's performance in PT games tanks the most in locations with dense foliage, and that's exactly where the OMM engine makes a difference. [...] Ampere does not have OMM and shader reordering, so I think these features are responsible for the huge performance gap in path-traced games. [...] that's when Nvidia's dominance in PT games will end.
Every RTX GPU supports OMM and shader execution reordering. Those predate Lovelace. However, Lovelace and above have hardware acceleration for OMM.
 

64gigabyteram

Reverse groomer.
There most certainly is a software component and devs prioritizing NVIDIA and barely doing anything on AMD, especially in those big RT showcases. Look at Assassin's Creed Shadows. I don't believe the gap in RT is all that big.
I can see it now. Finewine RT

Equal to a 4070 in RT now, jumps to 5070 TI perf by the end of the gen. You can do it AMD
 
Every RTX GPU supports OMM and shader execution reordering. Those predate Lovelace. However, Lovelace and above have hardware acceleration for OMM.
Nvidia's RTX Kit features chart suggests that the RTX 20 and 30 series support the basic capabilities of OMM, while the RTX 40 and RTX 50 have hardware-based OMM. As for SER, the chart suggests only the RTX 40 / 50 support it.

[image: MfahA44.png]

The question is whether software-based OMM can reduce the gap in demanding PT games like Indiana Jones and Black Myth (games with foliage-heavy scenes). Pascal supported ray tracing at the software level, but we all know hardware support made a huge difference, so even if software-based OMM can theoretically work, we don't know whether it's even worth using.
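
At the API level this kind of support question ultimately shows up as a capability query. Below is a minimal sketch of my own (not from the chart) of how an app checks which DXR tier a device exposes today via CheckFeatureSupport; I'd assume the DXR 1.2 additions like OMM and SER will surface as further caps/tiers through the same mechanism once the runtime and drivers ship them.

```cpp
// Minimal sketch (illustration only): querying the DXR tier exposed by the installed
// GPU/driver through D3D12. Link against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter (assumes a D3D12-capable GPU and driver).
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available");
        return 1;
    }

    // The ray tracing tier is reported in the OPTIONS5 feature block.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5)))) {
        switch (opts5.RaytracingTier) {
            case D3D12_RAYTRACING_TIER_NOT_SUPPORTED: std::puts("No DXR support"); break;
            case D3D12_RAYTRACING_TIER_1_0:           std::puts("DXR 1.0");        break;
            case D3D12_RAYTRACING_TIER_1_1:           std::puts("DXR 1.1");        break;
            default:                                  std::puts("DXR tier above 1.1"); break;
        }
    }
    return 0;
}
```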
 

Kataploom

Gold Member
Some raster games run extremely well on the 9070XT (not to mention the 3DMark benchmark :p), so the AMD card can sometimes beat the 4080. [...] In 2025, however, raw raster performance is no longer the most important factor. [...] Like Mister Wolf is always saying, you get what you pay for. AMD cards are cheaper for a reason.
It looks to me like those games rely more on Nvidia's pipelines for RT.
 

JohnnyFootball

GerAlt-Right. Ciriously.
One guy who worked at both NV and AMD said AMD lost the API war.
I hope that with the new Xbox, MS is going to add RDNA5+ features to DirectX.
DX needs a fully new version; MS barely does anything with DX updates.
DirectX needs to be completely ditched by all developers. Use Vulkan instead; it works great on Windows and on Linux. In my experience, games that support both get far more consistent performance with Vulkan, even if the average framerate may not be as high.

Take it from me as someone who has gamed on Linux/SteamOS and on Windows: Vulkan games have FAR less traversal stutter than their DX counterparts.
 

JohnnyFootball

GerAlt-Right. Ciriously.
RT performance is the only area where I was a bit disappointed with the 9070XT as I expected and felt it needed to be a full tier higher to really make purchasing an nvidia GPU look foolish.

Having said that, FSR4 exceeded my expectations and has effectively neutralized nvidia's best advantage. DLSS4 is still better, but no longer worth paying a significant premium for.

Contrary to popular belief, FSR3 (or even FSR2) looked pretty good in most games with quality settings on a 4K display.
 

StereoVsn

Gold Member
RT performance is the only area where I was a bit disappointed with the 9070XT [...] Contrary to popular belief, FSR3 (or even FSR2) looked pretty good in most games with quality settings on a 4K display.
FSR 3.x was indeed decent at 4K Quality, but it started suffering once you went below a 1440p source. It was still a lot better than nothing.
 

Gaiff

SBI’s Resident Gaslighter
Nvidia's RTX Kit features chart suggests that the RTX 20 and 30 series support the basic capabilities of OMM, while the RTX 40 and RTX 50 have hardware-based OMM. As for SER, the chart suggests only the RTX 40 / 50 support it. [...]
I'll need to recheck, but I do remember SER being one of the big features in an NVIDIA driver and it offered small or medium improvements in some games, and I recall the RTX 20 and 30 series being there.

NVIDIA's document also mentions this for supporting SER:

  • A GPU that supports DXR 1.0 or higher
  • A driver that supports SER, R520, and newer
  • HLSL extension headers, which can be found in the latest NVIDIA API
  • Link against nvapi64.lib, included in the packages containing the HLSL headers
  • (Optional) A recent version of DXC (dxcompiler.dll) that supports templates. If you’re compiling shaders from Visual Studio, make sure that your project is configured to use this version of the compiler exec
Which, as far as I'm aware, covers all RTX-capable GPUs. Maybe I'm also remembering incorrectly because your update is quite recent. It says 2025-02-07, so just last month. Well, the new DXR 1.2 will expand support for it anyhow.

winjer Can you confirm? The document I have suggests it is supported, but the other one Hicks provided says the opposite.
 

StereoVsn

Gold Member
Nvidia got every wafer pumping out commercial AI GPU's, the 0.01% that makes it to consumer GPU's are marked up 50-100% by AIB's and stores.
Nvidia sells wafers to AIBs at a terrible price. Basically, they can't make a profit at MSRP. And of course, on top of that, they are using this chance to get extra money by overcharging. So instead of, say, $2,200-2,300 for a 5090, they could charge $3K and stock would still be nowhere in sight.
 

winjer

Gold Member
I'll need to recheck, but I do remember SER being one of the big features in an NVIDIA driver [...] winjer Can you confirm? The document I have suggests it is supported, but the other one Hicks provided says the opposite.

The first time I saw a mention of SER was during Ada's presentation.
I haven't kept up with Nvidia's drivers, so I don't know if they added SER support for older GPUs.
 

Haint

Member
Nvidia sells wafers to AIBs at a terrible price. Basically, they can't make a profit at MSRP. And of course, on top of that, they are using this chance to get extra money by overcharging. So instead of, say, $2,200-2,300 for a 5090, they could charge $3K and stock would still be nowhere in sight.

They sell them completed boards at terrible prices because AIBs don't actually make or design anything; they're just assembly and logistics companies that slap coolers on them. It's not difficult work with high barriers to entry, which is why there are so many of them, and why DIY enthusiasts/overclockers are able to design better coolers in their garages/basements. They make plenty for what amounts to dumb grunt work; there is no reality where they should earn anywhere close to what Nvidia does on a GPU. Asus in particular is very likely earning MORE than Nvidia on every 5080 and 5090 sale. Nvidia are abject fools for writing contracts that allow this, and for continuing to ship them product.
 

demigod

Member
A bit more info about GPU sales at Mindfactory.
Basically, Nvidia and AMD are fighting it out. While Intel has left the fight.


Wtf, how are they getting so many cards? Microcenter doesn't get that many in a week, although they have like 3 stores in Ohio. But still, I don't think they got anywhere close to 400 AMD cards lately. Must be the tariffs.
 

dgrdsv

Member
In 2025, however, raw raster performance is no longer the most important factor.
The fun part is that RDNA4 doesn't do very well in older non-RT titles either. So it's really just some titles where you either do benchmarks w/o RT (so not on maximum quality) or the relatively rare non-RT modern titles.

Why do you people LIE?
I dunno why HUB lies. They used to say that FSR2 was about as good as DLSS3, but now suddenly FSR4 is miles better than FSR2 while being about on par with DLSS3? No idea how they can marry these two statements and hope that no one notices.
Use your own eyes. FSR4 is roughly on par with DLSS3's model E most of the time. DLSS4's model J/K is very obviously better.
It also doesn't help AMD much that FSR4 is heavier to execute on RDNA4 than DLSS4 is on Lovelace/Blackwell.
 

JohnnyFootball

GerAlt-Right. Ciriously.
The fun part is that RDNA4 doesn't do very well in older non-RT titles either. [...] They used to say that FSR2 was about as good as DLSS3, but now suddenly FSR4 is miles better than FSR2 while being about on par with DLSS3? No idea how they can marry these two statements and hope that no one notices. [...]
It's called... drum roll... the technology has improved. The more you know.
 

dgrdsv

Member
It's called... drum roll... the technology has improved. The more you know.
So according to them, FSR2 was about on par with DLSS3, and then FSR4, which is miles better than FSR2, has... improved... to be about on par with DLSS3?
I honestly have no idea why anyone is still watching these clowns. Their reviews have been trash-tier for many years now, to the point where their data almost constantly contradicts the data from every other source.
 