AMD FSR Redstone uses machine learning to achieve parity with Nvidia DLSS

So you are just trolling....
Jesus. You really can't resist this stupid member war, right? Can you tell me what concrete tech data is present in Bonji's videos or any other video comparison about this stuff on the net? It's not even easy to spot the resolution with DRS using the available tools. Imagine measuring the quality of the upscaler.
 
You clearly have not seen FSR4... and I can't blame you, because it isn't widespread... but that achievement has already happened.
We disagree here. I have played Shadows, Spider-Man 2, both Horizon games, GoW: Ragnarok, and Enotria. None of these games are on par, imo. There may be titles I haven't played (the Oblivion remaster, for example) that match or are on par, but the ones I have tested at least are not.

Asscreed and Spider-Man 2 have awful motion artifacting. Again, I will leave room for "bro, you don't have your settings tweaked," but if I can start both games, set them to ultra (or that game's equivalent), and they perform great on Nvidia with no tweaking, then I would like the same from AMD.

I am personally heavily invested in AMD (or even Intel, for that matter, with their upcoming releases on the AI side) matching/outpacing team green; that day cannot get here quickly enough.
 
In that video, we have direct comparisons between these upscalers, with zoomed-in shots, slowdowns, and frame-by-frame analysis. It shows every defect each tech has.
But somehow you think this is not enough, while your friends who saw some issues are a reliable source. And then you contradict yourself by stating there is no way to test this and we need some magical specialized AI tool that doesn't exist.
 
No, I told you this is not tech data. You haven't understood anything of what I'm trying to argue. It reminds me of our recent long discussion, where you claimed Cerny admitted the PS5 Pro's memory/bandwidth troubled him, which isn't what he actually argued in that conference. I guess misrepresenting other people's words is a habit.
 
I'm starting to think that the "tech analysis" you want is only things that say the Pro and PSSR are the best things ever.
I'm starting to think the only discussion you want to have is one where you come across as more expert than the other person. I never made a single post claiming the Pro or PSSR is the best at anything, unlike you and many others.
Anyway, I don't know if tools exist that analyze the precision of an AI upscaler; maybe something like that is part of the development kit for such upscalers. I wouldn't know.
 
 
You already had a long discussion with another person about your approach to the PSSR "analysis", eh.

And DLSS won that discussion.

So far I haven't seen anything suggesting PSSR reaches DLSS3 quality: not personally in games, not in YT videos, and not on this forum.

So in your opinion, is PSSR as good as DLSS3?
 
You seem really obsessed with this search for a winner and a loser, eh. Personally, I think PSSR is better in motion in terms of detail clarity, and DLSS3 is better at AA/denoising. But again, that's based on simple eyeballing, not on tool measurements or a mathematical percentage of artifacts or noise across the whole screen and their persistence in ms; I don't know whether it's possible to compute such things via an algorithm.
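For what it's worth, a whole-screen artifact percentage of the kind described above is computable in principle when you have a native-resolution reference frame to compare against. Here's a minimal sketch in Python/NumPy; the threshold value and the synthetic frames are my own illustrative assumptions, not how Digital Foundry or any vendor dev kit actually measures this:

```python
import numpy as np

def psnr(reference: np.ndarray, test: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two same-sized frames."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10((max_val ** 2) / mse)

def artifact_fraction(reference: np.ndarray, test: np.ndarray,
                      threshold: float = 16.0) -> float:
    """Fraction of pixels whose worst-channel error exceeds a visibility threshold.

    The threshold of 16/255 is an arbitrary assumption for illustration.
    """
    err = np.abs(reference.astype(np.float64) - test.astype(np.float64))
    return float(np.mean(err.max(axis=-1) > threshold))

# Synthetic stand-ins for a native-res frame and an upscaled frame with noise.
rng = np.random.default_rng(0)
native = rng.integers(0, 256, size=(1080, 1920, 3)).astype(np.uint8)
upscaled = np.clip(native.astype(np.int16)
                   + rng.integers(-8, 9, size=native.shape), 0, 255).astype(np.uint8)

print(f"PSNR: {psnr(native, upscaled):.2f} dB")
print(f"Pixels above threshold: {artifact_fraction(native, upscaled) * 100:.2f}%")
```

Run per frame over a capture, this would also give the temporal persistence figure (how many consecutive frames a region stays above threshold). In practice reviewers lean on perceptual metrics like SSIM or FLIP rather than raw PSNR, since per-pixel error correlates poorly with how visible an upscaling artifact actually is.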
 
At the end of the day, DLSS4 still beats FSR4, even if the comparison can get close. And RX 9000 RT performance finally matches the RTX 4000 cards. It's great that AMD is reaching feature parity with Nvidia's previous gen. The problem is that the RTX 5000 series is what AMD is competing with, and as for the complaints about price, you can find Nvidia cards closer to MSRP than the AMD cards.

And this is ignoring the fact that RTX 5000 is focusing on new types of rendering to move away from brute-force rasterization. The cards will likely not end up being special, just like the RTX 2000 series was seen as a bust, but it gives both developers and Nvidia experience with where the future is going. I could see AMD's next gen sticking with the old way of doing things while Nvidia ends up running circles around them once again.
This is true. 5070s and 5070 Tis can be had for reasonably close to MSRP.

At $599, the 9070 XT is an easy recommendation over the $750 5070 Ti and an absolute no-brainer over the $850+ models.

My local MC regularly gets 5070 Tis at $825 or less, and I haven't seen a 9070 XT under $750 since launch. At that price point, it's a lot less obvious which to get, but I'd give a slight edge to Nvidia given the better RT performance and broader DLSS compatibility.

However, I have no issue saying that DLSS4 is no longer worth a price premium over FSR4.
 