
DF: AMD FSR 4 Upscaling Tested vs DLSS 3/4 - A Big Leap Forward - RDNA 4 Delivers!

rm082e

Member
Being in the ballpark of DLSS3 was what they needed to accomplish. Nice to see they've done that. They also seem to have done a great job on price/performance. This is going to make them more competitive at the midrange and lower end once the 9060 is out... if they can get enough supply...

Now we just need a higher-end card. I applaud them for all the work they've done, but the 9070 is not enough of a jump over my 3080 to be practical. I'm still stuck with Nvidia as the only sensible upgrade option, but it's not really sensible at all right now.
 

Three

Member
So you're comparing one screenshot, out of context, to the videos showing it in motion? 🤔
It's the screenshot shared in this post, and the context was exactly that: me asking whether it does a better job of reproducing the ground-truth image:
Seems to be a lot more performant than the Transformer model, at least in Rift Apart.

[image: Rift Apart upscaler comparison screenshot]


Honestly, the performance cost of DLSS4 is unacceptable in this case. It's 19% slower than DLSS3. Sure, you get much better IQ, but you're also knocked down an entire GPU class.

I said that, to me, based on this image, FSR4 appears to be reproducing results similar to DLSS CNN at a lower framerate.
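For scale, here's what that 19% figure does at a few framerates (a quick Python sketch; the 19% is from the comparison above, the baseline framerates are made-up examples, and I'm treating "19% slower" as a 19% lower framerate):

```python
# Quick arithmetic on the "19% slower" figure above.
# The baseline framerates are hypothetical examples.
def slower(fps: float, pct: float) -> float:
    """Framerate after losing pct of the baseline."""
    return fps * (1.0 - pct)

for fps in (60.0, 90.0, 120.0):
    new = slower(fps, 0.19)
    print(f"{fps:.0f} fps -> {new:.1f} fps "
          f"({1000 / fps:.2f} ms -> {1000 / new:.2f} ms per frame)")
```

120 fps dropping to ~97 fps is the kind of gap that reads as being "knocked down an entire GPU class".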
 

Nankatsu

Gold Member
AMD needs to transform this into an even bigger win by enabling FSR4 on RDNA3 GPUs at least.

Even if they don't enable it right away, they should at least tease it.

I don't believe for a single bit that RDNA3 GPUs can't use FSR4.

I understand they want and need people to pull the trigger on the new cards, but still...
 

Imtjnotu

Member
Ah, the base PS5 is not RDNA 2. I forgot this whole fucking drama of RDNA 1.5 or something. Sorry, getting lost in all these little custom chipsets, ffs.

Why did Sony not use RDNA 4's machine learning units?
My guess is that when the Pro was being designed, RDNA4 wasn't ready.

Which is why we get a mixture of RDNA 2 and 3.
 

Buggy Loop

Member
Not sure why the performance tanks like this on Alex's side. It's nowhere near this on a 5080.

It's nowhere near this on any RTX card.

From their own fucking video, the DLSS 4 deep dive:

[image: DLSS performance cost chart from DF's DLSS 4 deep dive video]


Made by Alex.

Did he see those graphs and not dig into what's going on, when he knows perfectly well the performance cost of the transformer model?

[GIF: Confused Mark Wahlberg]
 

JohnnyFootball

GerAlt-Right. Ciriously.
Huge advancement over FSR3, obliterates it. Better than DLSS3 in key areas.

Similar quality to DLSS CNN, but comes with the performance cost of the DLSS transformer model.

A long overdue welcome to the party for AMD when it comes to image reconstruction.

If you were happy with DLSS3 results, then you will be more than happy with this.
Agreed. I've long stated that FSR4 needed to be competitive with DLSS3, and that if it accomplished that, it would be a hit.
DLSS4 is still the best, but it's no longer worth paying a significant premium for.

I'd wager you can run FSR4 in Balanced mode to regain the performance lost relative to FSR3 Quality, with little to no image-quality loss.
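For reference, the arithmetic behind that wager (a minimal Python sketch; I'm assuming FSR4 keeps the per-axis render scales published for FSR 2/3, which match DLSS's defaults):

```python
# Internal render resolution per upscaler mode. The per-axis scale factors
# are the published FSR 2/3 / DLSS defaults; FSR4 keeping them is an assumption.
SCALES = {"Quality": 1 / 1.5, "Balanced": 1 / 1.7, "Performance": 1 / 2.0}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and mode."""
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

for mode, s in SCALES.items():
    w, h = internal_res(3840, 2160, mode)  # 4K output as an example
    print(f"{mode:>11}: {w}x{h} internal (~{s * s:.0%} of output pixels)")
```

Quality to Balanced cuts the shaded pixels from roughly 44% to roughly 35% of the output, which is the kind of headroom the wager rests on.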

Bravo, AMD.
 

kevboard

Member
Ah, the base PS5 is not RDNA 2. I forgot this whole fucking drama of RDNA 1.5 or something. Sorry, getting lost in all these little custom chipsets, ffs.

Why did Sony not use RDNA 4's machine learning units?

Just like RDNA2 wasn't fully developed during the PS5's development, RDNA4 was probably not fully ready during the PS5 Pro's development.
 

yamaci17

Member
DLSS4 works on all cards, and the performance hit seems comparable to FSR4. So I doubt anyone would use FSR4 over DLSS4.
DLSS 4's performance cost is irrelevant.
1080p transformer DLSS Performance looks better than native 1080p TAA and the old DLSS Quality.
I hope FSR 4 is good at 1080p too.
 

Rudius

Member
Good job, AMD. You are still behind, but you are catching up; it's the most we can ask of you. Now catch up in Raytracing and Ray Reconstruction so we can finally have a reason not to choose Nvidia (or at least give them some competition).
If you were after a 5070 card, would you pay 50 dollars more for an extra 4GB of VRAM, 16 in total? If you would, then you need to compare the 5070 with the 9070 XT.
 

kevboard

Member
DLSS 4's performance cost is irrelevant.
1080p transformer DLSS Performance looks better than native 1080p TAA and the old DLSS Quality.
I hope FSR 4 is good at 1080p too.

I mean, it would be relevant if the quality was similar, or if FSR4 offered a dramatic performance advantage on lower-end cards.

But I doubt that would be the case.

DLSS4 also doesn't fully hold up at low resolutions like 1080p, where I could see FSR4 Quality mode probably having a quality advantage over DLSS4 Performance mode.
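To put rough numbers on that comparison (a sketch with the same assumed standard scale factors, Quality ≈ 1/1.5 and Performance = 1/2):

```python
# Internal resolutions behind the 1080p comparison above (assumed scales).
fsr4_quality = (round(1920 / 1.5), round(1080 / 1.5))  # 1280x720
dlss4_perf = (1920 // 2, 1080 // 2)                    # 960x540

ratio = (fsr4_quality[0] * fsr4_quality[1]) / (dlss4_perf[0] * dlss4_perf[1])
print(f"FSR4 Quality @ 1080p:      {fsr4_quality[0]}x{fsr4_quality[1]} internal")
print(f"DLSS4 Performance @ 1080p: {dlss4_perf[0]}x{dlss4_perf[1]} internal")
print(f"Quality mode starts from {ratio:.2f}x as many pixels")  # ~1.78x
```

Starting from ~1.78x the pixels is a big head start at 1080p, even for a weaker reconstruction model.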
 

Buggy Loop

Member
@ 8:33 I see all I need to see for now

The DLSS 4 transformer model is just in a league of its own for now. As Hardware Unboxed has said, it's equivalent to playing at native 4K with ultra-quality texture settings, while the older CNN model is more like playing at 1440p with high texture settings and TAA blur.

For sure AMD is on the right path, and these models will improve. But with a 3080 Ti and DLSS 4 working perfectly, I'm gonna wait and see.
 

YCoCg

Member
To me the biggest benefit of DLSS4 is the zero blur in motion. It's such a gamechanger, especially at higher refresh rates. I assume FSR4 still has the TAA blur in motion, which sucks.
No, the blur has been reduced significantly.

You guys really need to watch the videos before posting 😕
 

Aaron07088

Neo Member
It's a great improvement over FSR 3. Even against the DLSS CNN model, FSR4 is almost on par, maybe even better. But we need to see more low resolutions, like 1440p DLSS Performance or 1080p DLSS Quality, etc. We also need to see frame gen: did AMD improve the frame gen side too? And they still need some tech like DLSS Ray Reconstruction.
 

YCoCg

Member
Compared to FSR 3, yes.

Still haven't seen anyone else do what Hardware Unboxed did and point out that DLSS 4 looks like native, pre-TAA-blur rendering.
The Daniel Owen video comparing Spider-Man 2 shows FSR4 on par with or close to DLSS4, with some instances at 1080p where he pointed out FSR4 actually doing very slightly better at reducing ghosting on the character.
 

yamaci17

Member
Compared to FSR 3, yes.

Still haven't seen anyone else do what Hardware Unboxed did and point out that DLSS 4 looks like native, pre-TAA-blur rendering.
Yup, it needs to get rid of that TAA blur exactly like DLSS 4 can.
DLSS 4 literally made 1440p playable and, most importantly, ENJOYABLE to me.
Before that, to me, 1440p was just a cope for not being able to run games at 4K output due to VRAM or performance reasons. It looked so horribly blurry; TAA, DLSS, DLAA, it didn't matter, it just looked so blurry. To me, DLSS Quality at 1440p was just playing with the same TAA blur at higher performance. In a way that still wasn't that bad, but it did practically nothing about the TAA blur, not that I expected any better. DLSS 4 is just magical; I have no idea how they did it.
 

Buggy Loop

Member
Also, did anyone toggle FSR 4 settings or resolution until it matches DLSS 4 Performance?

I want to know what it costs to match it.

Is it FSR 4 Quality? 8K resolution? There has to be a point where it shows more detail than DLSS 4 Performance.
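One crude way to frame that question (a Python sketch; the scale factors are the standard assumed ones, and raw internal pixel count is only a loose proxy for visible detail, since the reconstruction model matters at least as much):

```python
# Compare internal pixel counts of FSR 4 mode/output combos against
# DLSS 4 Performance at 4K output. Scale factors are assumed defaults.
SCALES = {"Quality": 1 / 1.5, "Balanced": 1 / 1.7, "Performance": 1 / 2.0}
OUTPUTS = {"1440p": (2560, 1440), "4K": (3840, 2160), "8K": (7680, 4320)}

target = (3840 // 2) * (2160 // 2)  # DLSS 4 Performance @ 4K: 1920x1080

for out_name, (w, h) in OUTPUTS.items():
    for mode, s in SCALES.items():
        px = round(w * s) * round(h * s)
        print(f"FSR 4 {mode:>11} @ {out_name:>5}: "
              f"{px / target:.2f}x the internal pixels of the target")
```

By raw pixel count, even FSR 4 Quality at 4K already feeds the model ~1.78x the pixels of DLSS 4 Performance at 4K; whether that translates into more visible detail is exactly the open question.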
 

Buggy Loop

Member
The Daniel Owen video comparing Spider-Man 2 shows FSR4 on par with or close to DLSS4, with some instances at 1080p where he pointed out FSR4 actually doing very slightly better at reducing ghosting on the character.

Sometimes there's very little difference, as seen in Star Wars, but there's a lot more work done by Hardware Unboxed to come to their conclusion than by Daniel, to be honest.

 

64gigabyteram

Reverse groomer.
Now catch up in Raytracing
They have, or are starting to. The 9070 XT hovers in between the 5070 and the Ti for RT, and only massively loses in some heavily Nvidia-optimized games (Black Myth: Wukong, Alan Wake 2). They'll likely come up with a ray reconstruction solution soon enough.

Sounds dumb, but seeing as these are firsts for AMD, I hope they can FineWine their way into even better FSR4 IQ.
 

Pagusas

Elden Member
They have, or are starting to. The 9070 XT hovers in between the 5070 and the Ti for RT, and only massively loses in some heavily Nvidia-optimized games (Black Myth: Wukong, Alan Wake 2). They'll likely come up with a ray reconstruction solution soon enough.

Sounds dumb, but seeing as these are firsts for AMD, I hope they can FineWine their way into even better FSR4 IQ.
Starting to, yes, but they haven't yet. Their issue (beyond not having a ray reconstruction solution yet) is that they have abandoned the high end, where those of us who want real RT performance go. They need to catch up and get back into the fight, not settle for being the Dollar Store/Aldi of GPUs.
 

Pagusas

Elden Member
If you are targeting the high end, you likely have enough money to pay Nvidia prices, so why care what AMD will do?
Because I want competition to drive Nvidia further into innovation. The market, especially the top of the market, gets better when there is serious competition to one-up each other. Nvidia has owned it now for three straight gens without any competition. The 5090 has suffered for this, and the eventual 6090 will surely suffer from it too. They have no incentive to make major gains at the moment.
 