I would like it to be more than that, the 480 didn't change anything for AMD or Nvidia in the long term
True, but at least it was a competitive product that was relevant in the market.
Trolling both sides doesn't make it ok. I learned that the hard way when I was banned at the og place for it.
I never understood if the problem was in reference to Sony or Xbox! Who did they think I was shilling for?
Sony better get some discounts from AMD given the work they put in on this GPU.
Trolling both sides doesn't make it ok. I learned that the hard way when I was banned at the og place for it.
It is funny though!
We're fully in the TOPs era now already - and Nvidia has already been inflating the number to high heavens beyond what the hw actually does - so it clearly matters to someone.
I wonder what will be the fad next round? I bet it's going to be the number of Zettathrusts per Nanocycle (ZTN).
It's too badly priced in comparison to the XT to make much sense. I can only guess it's on the edge of profitability due to yields.
Had the RX 9070 non-XT been $150 less it would sell A LOT. AMD have a winner on the XT but they dropped the ball on the non-XT version.
Some raster games run extremely well on the 9070XT (not to mention 3Dmark benchmark), so the AMD card can sometimes win against the 4080.
If supply was unlimited I'm not sure it makes much sense to buy Nvidia in the mid range. You can easily crank a 9070 XT to surpass a 4080 in many scenarios. For $400 less.
Raw raster is always the most important thing. That's what you're actually paying for, otherwise a company could just sell you the same card over and over with a software update.
Some raster games run extremely well on the 9070XT (not to mention 3Dmark benchmark), so the AMD card can sometimes win against the 4080.
In 2025, however, raw raster performance is no longer the most important factor. Modern games use ray tracing. AMD has closed the gap from 35% (7900XTX) to 20%, but that's still not the same RT performance level.
When it comes to games where you really need that RT power, it seems AMD needs to make another generational leap to finally catch up with the Ada architecture.
9070 XT around 50-55fps in the city and 35-40 in the grassy areas, my RTX4080S around 80-90fps in the city, grassy areas 56-75fps.
RTX4080S
Cyberpunk RT ultra, 9070XT FSR Quality 72fps vs RTX4080S DLSS4 quality 91fps (95 fps when using old CNN model).
Path Tracing - 9070XT FSR Performance 52fps vs RTX4080S DLSS4 Performance 92fps and DLSS Quality 62fps.
In PT games with a lot of foliage that difference is really massive; AMD needs to finally support Microsoft's DirectX Raytracing 1.2 features (OMM and shader execution reordering) to close this gap.
Up to a 3.3x performance difference with PT, better image quality (RR looks very good in Alan Wake 2 and Cyberpunk with DLSS Quality and perfect with DLAA), and DLSS FG gives an 85% fps boost with just extremely minimal latency (a 1-4ms latency difference in the best-case scenario, Cyberpunk, and 10-12ms in the worst, Alan Wake 2) that doesn't affect aiming with a mouse, unlike laggy FSR FG. Like Mister Wolf is always saying, you get what you paid for. AMD cards are cheaper for a reason.
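To put those frame-gen numbers in perspective, here's a quick back-of-the-envelope sketch using the figures quoted in the post above (an ~85% fps boost for ~1-4 ms of extra latency in the best case); the base framerates are just illustrative, not measurements.

```cpp
// Back-of-the-envelope frame-generation math using the numbers quoted above:
// roughly an 85% boost to presented fps for about 1-4 ms of extra latency in
// the best case. The base framerates are illustrative, not measurements.
#include <cstdio>

int main()
{
    const double fgBoost        = 0.85; // +85% presented fps (from the post)
    const double addedLatencyMs = 4.0;  // top of the quoted 1-4 ms "best case" range

    for (double baseFps : {50.0, 72.0, 92.0})
    {
        const double baseFrameMs  = 1000.0 / baseFps;
        const double presentedFps = baseFps * (1.0 + fgBoost);
        std::printf("base %.0f fps (%.1f ms/frame) -> ~%.0f fps presented, "
                    "~%.1f ms extra input latency\n",
                    baseFps, baseFrameMs, presentedFps, addedLatencyMs);
    }
    // Takeaway: motion smoothness nearly doubles while the added latency stays
    // small relative to a single rendered frame at these framerates.
    return 0;
}
```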
Nvidia can increase RT performance on the software side (for example, Mega Geometry), but they usually need to implement new features at the hardware level (for example OMM and shader execution reordering, LSS, subdivision surfaces, enhancements for Mega Geometry) to boost RT performance. In addition, they also increase the number of RT cores. Shading performance is also important in RT games, because even RT games rely on shaders. In short, the GPU needs to be stronger to deliver better RT performance.
Raw raster is always the most important thing. That's what you're actually paying for, otherwise a company could just sell you the same card over and over with a software update.
“You get what you pay for”, they should put that on Nvidia ads. Either that or “fuck you, pay me”.
Why do you people LIE?
No, it's not. It is also sold out, and whatever gets on shelves is being priced at up to +50% over MSRP.
It's priced exactly how it should be priced based on its performance and features.
Granted that's better than how previous two gens of AMD GPUs were priced. But it's not any sort of a disruption.
It is not, it is very close to DLSS3's model E. All the comparisons I've seen showcase that when they are actually done properly instead of comparing FSR4 with whatever DLSS v2 a game had at its launch.
It's not a big deal though as everything starting with DLSS Model C/D and later is very close to each other to the point where the differences don't really matter much.
A bit more info about GPU sales at Mindfactory.
Basically, Nvidia and AMD are fighting it out, while Intel has left the fight.
I'm still convinced the RT thing is more of a software problem than any hardware, but maybe that's cope.
Some raster games run extremely well on the 9070XT (not to mention 3Dmark benchmark), so the AMD card can sometimes win against the 4080.
There most certainly is a software component, and devs prioritizing NVIDIA and barely doing anything on AMD, especially in those big RT showcases. Look at Assassin's Creed Shadows. I don't believe the gap in RT is all that big.
I'm still convinced the RT thing is more of a software problem than any hardware, but maybe that's cope.
There's just a part of me that has hope in AMD that, with some updates to their drivers plus AMD-centric game updates, AMD will have much better RT.
A 20+fps gain in all RT situations would seal the deal for me.
Does Intel even have units available? Almost everywhere I look has the B580 out of stock.
Did they sort out that CPU issue that caused the 570 and 580 to under-perform with budget CPUs, the ones they're more likely to be paired with? If not, then no reason to buy this card, unless you only need an extremely powerful machine sans the GPU.
Good question. When Battlemage launched at the start of the year, there were shortages, but those should have been sorted by now.
I wonder if Intel just gave up on the DIY market and is now just focused on pre-builts.
Did they sort out that CPU issue that caused the 570 and 580 to under-perform with budget CPUs, the ones they're more likely to be paired with? If not, then no reason to buy this card, unless you only need an extremely powerful machine sans the GPU.
Lol two blunders in a row. First gen had issues with DX9 games, games people with budget GPUs are more likely to play. This one has issues with budget CPUs.
I haven't seen any news about it. I'm sure that HU or GN would have re-tested them, if a driver had fixed those issues.
Maybe it's just how the hardware works and there are no drivers that can fix it......
Lol two blunders in a row. First gen had issues with DX9 games, games people with budget GPUs are more likely to play. This one has issues with budget CPUs.
Well, I hope Intel doesn’t give up. Those GPUs can be decent if those problems are taken care of. I believe at least DX9 games perform much better now.
A guy who worked at NV and AMD said AMD lost the API war.
I don't believe the gap in RT is all that big.
The 9070XT's performance in PT games tanks the most in locations with dense foliage, and that's exactly where the OMM engine makes a difference. The Ampere architecture has a similar problem.
I'm still convinced the RT thing is more of a software problem than any hardware, but maybe that's cope.
There's just a part of me that has hope in AMD that, with some updates to their drivers plus AMD-centric game updates, AMD will have much better RT.
A 20+fps gain in all RT situations would seal the deal for me.
Every RTX GPU supports OMM and shader execution reordering. Those predate Lovelace. However, Lovelace and above have hardware acceleration for OMM.
The 9070XT's performance in PT games tanks the most in locations with dense foliage, and that's exactly where the OMM engine makes a difference. The Ampere architecture has a similar problem.
Ampere does not have OMM and shader reordering, so I think these features are responsible for the huge performance gap in path traced games.
OMM and shader reordering are now part of DirectX Raytracing 1.2, so I think AMD will want to support these features with their next GPU architecture, and that's when Nvidia's dominance in PT games will end.
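For what it's worth, here's a minimal sketch (assuming a Windows/D3D12 build environment) of how an app can ask the driver which DXR tier it exposes. The OPTIONS5/RaytracingTier query is the existing, shipping API; the assumption is that DXR 1.2's OMM/SER support will be reported through a newer feature-options struct in the same way, so no 1.2-specific names are used below.

```cpp
// Minimal sketch: ask the D3D12 driver which DXR tier it exposes.
// D3D12_FEATURE_D3D12_OPTIONS5 / RaytracingTier is the existing, shipping API;
// the assumption is that DXR 1.2 features (OMM, SER) will be reported through
// a newer feature-options struct in the same way once it ships.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))))
    {
        switch (opts5.RaytracingTier)
        {
        case D3D12_RAYTRACING_TIER_NOT_SUPPORTED:
            std::printf("No DXR support.\n");
            break;
        case D3D12_RAYTRACING_TIER_1_0:
            std::printf("DXR 1.0 (DispatchRays only).\n");
            break;
        default: // D3D12_RAYTRACING_TIER_1_1 and anything newer
            std::printf("DXR 1.1 or newer (inline ray queries, etc.).\n");
            break;
        }
    }
    return 0;
}
```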
I can see it now. Finewine RT.
There most certainly is a software component and devs prioritizing NVIDIA and barely doing anything on AMD, especially in those big RT showcases. Look at Assassin's Creed Shadows. I don't believe the gap in RT is all that big.
Nvidia's RTX Kit features chart suggests that the RTX 20 and 30 series support the basic capabilities of OMM, while the RTX 40 and RTX 50 have hardware-based OMM. As for SER, this chart suggests only the RTX 40/50 support it.
Every RTX GPU supports OMM and shader execution reordering. Those predate Lovelace. However, Lovelace and above have hardware acceleration for OMM.
It looks to me that those games rely more on Nvidia pipelines for RT.
DirectX needs to be completely ditched by all developers. Use Vulkan instead; it works great on Windows and on Linux. In my experience, games that use both get far more consistent performance with Vulkan, even if the average framerate may not be as high.
A guy who worked at NV and AMD said AMD lost the API war.
I hope that with the new Xbox, MS is going to add RDNA5+ features to DirectX.
DX needs a fully new version; MS barely does anything with DX updates.
FSR 3.x was indeed decent at 4K Quality, but it started suffering once you started going below a 1440p source. It was still a lot better than nothing.
RT performance is the only area where I was a bit disappointed with the 9070XT, as I expected and felt it needed to be a full tier higher to really make purchasing an Nvidia GPU look foolish.
Having said that, FSR4 exceeded my expectations and has effectively neutralized nvidia's best advantage. DLSS4 is still better, but no longer worth paying a significant premium for.
Contrary to popular belief, FSR3 (or even FSR2) looked pretty good in most games with quality settings on a 4K display.
Yep. On 1440p DLSS really flexed its muscles.
FSR 3.x was indeed decent at 4K Quality, but it started suffering once you started going below a 1440p source. It was still a lot better than nothing.
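For reference, a rough sketch of why the source resolution matters so much here: it just computes the internal render resolutions from the commonly documented per-axis scale factors for DLSS 2+/FSR 2+ presets (Quality ~0.667, Balanced ~0.58, Performance ~0.5). Exact factors can vary per title, so treat it as illustrative.

```cpp
// Internal render resolutions implied by the commonly documented per-axis
// scale factors for DLSS 2+/FSR 2+ presets (Quality ~0.667, Balanced ~0.58,
// Performance ~0.5). Real games can deviate, so treat this as illustrative.
#include <cstdio>

struct Mode { const char* name; double scale; };

int main()
{
    const Mode modes[]     = { {"Quality", 0.667}, {"Balanced", 0.58}, {"Performance", 0.5} };
    const int  outputs[][2] = { {3840, 2160}, {2560, 1440}, {1920, 1080} };

    for (const auto& out : outputs)
        for (const auto& m : modes)
            std::printf("%4dx%-4d %-11s -> %4dx%-4d internal\n",
                        out[0], out[1], m.name,
                        int(out[0] * m.scale), int(out[1] * m.scale));
    // e.g. 4K Quality reconstructs from ~2560x1440, while 1080p Quality only
    // has ~1280x720 to work with - which is why every upscaler starts to
    // struggle sooner at lower output resolutions.
    return 0;
}
```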
Just OK perf. Seems like in RDNA5 they're going to add BVH hardware, based on a patent - https://www.freepatentsonline.com/y2025/0104328.html
RT performance is the only area where I was a bit disappointed with the 9070XT.
I'll need to recheck, but I do remember SER being one of the big features in an NVIDIA driver and it offered small or medium improvements in some games, and I recall the RTX 20 and 30 series being there.
Nvidia's RTX Kit features chart suggests that the RTX 20 and 30 series support the basic capabilities of OMM, while the RTX 40 and RTX 50 have hardware-based OMM. As for SER, this chart suggests only the RTX 40/50 support it.
The question is whether software-based OMM can reduce the gap in these demanding PT games like Indiana Jones and Black Myth (games with foliage-heavy scenes). Pascal supported ray tracing at the software level, but we all know that hardware support made a huge difference.
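To make the foliage point concrete, here's a toy sketch of the mechanism (no real API involved): alpha-tested leaves normally force an any-hit opacity check for every candidate intersection, while a precomputed opacity classification - the OMM idea, done here per whole triangle rather than per micro-triangle - lets traversal resolve most candidates without invoking any shader. The 1,000-candidate / 90% numbers are made up purely for illustration.

```cpp
// Toy illustration (no real API) of why opacity micromaps help with foliage:
// alpha-tested triangles normally force an any-hit opacity evaluation for
// every candidate intersection, while a precomputed opacity classification
// (the OMM idea, done here per whole triangle rather than per micro-triangle)
// lets traversal resolve most candidates without running any shader code.
#include <cstdio>
#include <vector>

enum class Opacity { FullyOpaque, FullyTransparent, Unknown };

struct Candidate { bool alphaTested; Opacity precomputed; };

int main()
{
    // Pretend a ray sweeps through a bush: 1000 alpha-tested leaf triangles.
    std::vector<Candidate> hits(1000, Candidate{true, Opacity::Unknown});

    // Assume a precomputed pass classified 90% of them as fully opaque or
    // fully transparent (numbers made up purely for illustration).
    for (size_t i = 0; i < hits.size(); ++i)
        if (i % 10 != 0)
            hits[i].precomputed = (i % 2) ? Opacity::FullyOpaque
                                          : Opacity::FullyTransparent;

    int anyHitWithoutOmm = 0, anyHitWithOmm = 0;
    for (const auto& c : hits)
    {
        if (c.alphaTested)                     ++anyHitWithoutOmm; // shader every time
        if (c.precomputed == Opacity::Unknown) ++anyHitWithOmm;    // only the leftovers
    }
    std::printf("any-hit invocations: %d without OMM vs %d with OMM\n",
                anyHitWithoutOmm, anyHitWithOmm);
    return 0;
}
```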
Nvidia sells wafers to AIBs at a terrible price. Basically they can't make a profit at MSRP. And of course, on top of that, they are using this chance to get extra money by overcharging. So instead of, say, $2200-2300 for a 5090, they could charge $3K and stock is still nowhere in sight.
Nvidia got every wafer pumping out commercial AI GPUs; the 0.01% that makes it to consumer GPUs are marked up 50-100% by AIBs and stores.
I'll need to recheck, but I do remember SER being one of the big features in an NVIDIA driver and it offered small or medium improvements in some games, and I recall the RTX 20 and 30 series being there.
NVIDIA's document also mentions this for supporting SER:
- A GPU that supports DXR 1.0 or higher
- A driver that supports SER (R520 and newer)
- HLSL extension headers, which can be found in the latest NVIDIA API
- Link against nvapi64.lib, included in the packages containing the HLSL headers
- (Optional) A recent version of DXC (dxcompiler.dll) that supports templates. If you're compiling shaders from Visual Studio, make sure that your project is configured to use this version of the compiler exec
Which, as far as I'm aware, are all RTX capable GPUs. Maybe I'm also remembering incorrectly because your update is quite recent. It says 2025-02-07, so just last month. Well, the new DXR 1.2 will expand the support for it anyhow.
winjer Can you confirm? The document I have suggests it is supported, but the other one Hicks provided says the opposite.
The fun part is that RDNA4 doesn't do very well in older non-RT titles either. So it's really just some titles where you either do benchmarks w/o RT (so not on maximum quality) or the relatively rare non-RT modern titles.
In 2025, however, raw raster performance is no longer the most important factor.
I dunno why HUB lies. They used to say that FSR2 is about as good as DLSS3, but now suddenly FSR4 is miles better than FSR2 while being about similar to DLSS3? No idea how they can marry these two statements and hope that no one would notice.
Why do you people LIE?
It’s one retailer.Is it really a matter of AMD and Nvidia are only selling <500 units per week.
Production is that bad?!
It's called.....drum roll....the technology has improved...the more you know.
The fun part is that RDNA4 doesn't do very well in older non-RT titles either. So it's really just some titles where you either do benchmarks w/o RT (so not on maximum quality) or the relatively rare non-RT modern titles.
I dunno why HUB lies. They used to say that FSR2 is about as good as DLSS3 but now suddenly FSR4 is miles better than FSR2 while being about similar to DLSS3? No idea how they can marry these two statements and hope that no one would notice.
Use your own eyes. FSR4 is about similar to DLSS3 model E most of the time. DLSS4 model J/K is very obviously better.
It also doesn't help AMD much that FSR4 is heavier to execute on RDNA4 than DLSS4 is on Lovelace/Blackwell.
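A small sketch of why the execution cost matters: the upscaling pass is a fixed slice of GPU time added to every frame, so a heavier implementation eats a bigger share of the frame budget. The millisecond figures below are made-up placeholders, not measured costs for DLSS4 or FSR4.

```cpp
// Why the upscaler's own execution cost matters: the pass is a fixed slice of
// GPU time added to every frame, so a heavier implementation eats a bigger
// share of the frame budget. The millisecond figures are made-up placeholders,
// not measured costs for DLSS4 or FSR4.
#include <cstdio>

int main()
{
    const double renderMs         = 10.0;         // hypothetical per-frame render time
    const double upscalerCostMs[] = { 1.0, 2.0 };  // "lighter" vs "heavier" pass
    const char*  label[]          = { "lighter upscaler", "heavier upscaler" };

    for (int i = 0; i < 2; ++i)
    {
        const double frameMs = renderMs + upscalerCostMs[i];
        std::printf("%s: %.1f ms render + %.1f ms upscale = %.1f ms -> %.0f fps\n",
                    label[i], renderMs, upscalerCostMs[i], frameMs, 1000.0 / frameMs);
    }
    // With the same render workload, one extra millisecond of upscaler time
    // already costs several fps, and the penalty grows as frame times shrink.
    return 0;
}
```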
So according to them, FSR2 was about on par with DLSS3, and then FSR4, which is miles better than FSR2, has... improved... to be about on par with DLSS3?..
It's called.....drum roll....the technology has improved...the more you know.
They haven't claimed that since doing a proper breakdown video.
I dunno why HUB lies. They used to say that FSR2 is about as good as DLSS3 but now suddenly FSR4 is miles better than FSR2 while being about similar to DLSS3?