AMD FSR Redstone uses machine learning to achieve parity with Nvidia DLSS

Being annoyed that hardware-based AI upscaling isn't on last-gen AMD cards is no different from being annoyed that RTX wasn't backported to the GTX 1080.
The silicon needed to crunch the numbers simply isn't there.
FSR4 works pretty damn well, but even without it the 9070 is a ridiculously performant card, often trading blows with, and sometimes even beating, the 4080.
 
What's the difference between DLSS and machine learning?
DLSS was the first time we saw machine learning tech used for gaming. It's a temporal anti-aliasing and upscaling technique that uses a Convolutional Neural Network or Transformer Neural Network to infer what the low-resolution image you're seeing should look like at a higher resolution, then displays that higher-resolution result so the GPU no longer has to render it traditionally. For example, 4K DLSS Performance mode is a 1080p image being reconstructed to 4K by DLSS.
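The render-resolution arithmetic behind that example can be sketched quickly. This is a minimal illustration, assuming the commonly cited per-mode scale factors (Quality ≈ 2/3, Balanced ≈ 0.58, Performance = 0.5, Ultra Performance ≈ 1/3), which don't come from this thread:

```python
# Assumed per-axis scale factors for each DLSS quality mode
# (commonly cited values; not confirmed by the thread above).
SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal resolution the GPU renders before the network
    reconstructs the frame up to (out_w, out_h)."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4K output in Performance mode renders internally at 1080p:
print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```

So at 4K Performance the GPU only shades a quarter of the output pixels; the network fills in the rest.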

DLSS first pulled ahead of every other form of upscaling and AA when Nvidia released DLSS 2 five years ago, and it's only gotten better since.
 
This article feels out of place; we already know AMD switched to AI-based upscaling for FSR4, and it's done a great job. Yeah, you can say AMD is behind, but with FSR4 they've gone from being five years behind to one.

And with all the shit Nvidia is pulling, I'm sure most won't mind being a year behind if it avoids all that.
 