Digital Foundry: Leaked FSR4 INT8 Test: RDNA 3, RDNA 2, Steam Deck, Asus ROG Ally, Nvidia + Xbox Series X Simulation

He is giving you the answer I gave you. The PS5 does not support DP4a.
What part did you not understand yet?
The same part of my previous post that you still haven't understood and continue to ignore. I already told you why I'm not convinced, but you persist because it runs on PC. I have a different opinion; what else can I say?
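(Quick grounding, since the whole argument turns on it: DP4a is a packed dot-product instruction computing r = c + a0*b0 + a1*b1 + a2*b2 + a3*b3, where the a and b terms are signed 8-bit values packed into 32-bit registers and c is a 32-bit accumulator. The leaked INT8 build of FSR4 is understood to run its inference through this instruction, which is why "does this GPU expose DP4a?" is the question being fought over here.)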
 
Last edited:
Love GAF so much: even when there's good press and news about Xbox, it has to be bad news according to some people. The Series X is so dead that no studio will publish games for it, not even MS.

GAF being GAF.
 
Dude, you've become a joke. Stop arguing about nonsense.
Developers have reported this difference several times: a unified architecture has its advantages, but you have to share the RAM and the bandwidth between CPU and GPU tasks, whereas on PC they are separate and dedicated, so a straight comparison isn't exactly like-for-like. For example, 448 GB/s on a PC GPU is all for the GPU; the CPU has its own RAM and bandwidth. On console, the GPU and CPU have to share both the bandwidth and the RAM pool. What exactly is surprising, unbelievable, or nonsense about that?
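As a rough worked example (illustrative numbers, not measurements): if the CPU side of a console eats, say, 50 GB/s out of a shared 448 GB/s pool, the GPU is left with roughly 448 - 50 = 398 GB/s, and in practice contention between the two clients costs more than the straight subtraction suggests. A PC GPU with 448 GB/s of dedicated VRAM keeps all of it.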
 
Last edited:
It's not good news, it's hopium for the hopeless.
FSR4 will not come to XSX, don't even dream about it.

But it's cool to discuss the what-if, and to give MS some love for forward thinking with their console.

I don't think the Series X will be supported, but at least someone designing it was thinking about this stuff.

This could have been secret sauce if the gen hadn't gone the way it did, with poor development and cross-gen games so long into the gen.
 
The only reason it wouldn't is if AMD forbids it or never officially releases the INT8 version.
There's no technical reason not to use it.
The reasons are not technical but administrative.
To use it, it would have to be integrated into the toolkit, and games would have to be updated for it. It's not exactly an MS initiative, so I doubt any preparations were done in advance.
Given the state of Xbox and how far we are into the generation, spending 0.5-1 years just to roll it out on a dead console makes no sense. If anything, it will be implemented on the next Xbox (in the FP8 version, as it's superior), not the current one.

But it's cool to discuss the what-if, and to give MS some love for forward thinking with their console.
I don't think the Series X will be supported, but at least someone designing it was thinking about this stuff.
This could have been secret sauce if the gen hadn't gone the way it did, with poor development and cross-gen games so long into the gen.
As usual, Xbox fans like to live in delusions. It has nothing to do with forward thinking; it's just pure luck and laziness. Sony cut parts that were unimportant for gaming in 2019, and MS just went with off-the-shelf parts, then got lucky that those particular parts became useful six years later.
If it were forward thinking, Xbox would have its own ML model for upscaling; those 50 TOPS could have been put to use long ago on their own DLSS 2-3 / DLSS-light equivalent. But there is none. So much for MS being an AI company.
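For what it's worth, the ~50 TOPS figure falls straight out of the DP4a math: the Series X GPU does 12.15 TFLOPS FP32, and DP4a retires four INT8 multiply-accumulates where one FP32 FMA would go, so 12.15 x 4 ≈ 48.6 ≈ 49 TOPS INT8, the number MS quoted in its own architecture deep dives (with 97 TOPS for INT4).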
 
Last edited:
The reasons are not technical but administrative.
To use it, it would have to be integrated into the toolkit, and games would have to be updated for it. It's not exactly an MS initiative, so I doubt any preparations were done in advance.
Given the state of Xbox and how far we are into the generation, spending 0.5-1 years just to roll it out on a dead console makes no sense. If anything, it will be implemented on the next Xbox (in the FP8 version, as it's superior), not the current one.


As usual, Xbox fans like to live in delusions. It has nothing to do with forward thinking; it's just pure luck and laziness. Sony cut parts that were unimportant for gaming in 2019, and MS just went with off-the-shelf parts, then got lucky that those particular parts became useful six years later.
If it were forward thinking, Xbox would have its own ML model for upscaling; those 50 TOPS could have been put to use long ago on their own DLSS 2-3 / DLSS-light equivalent. But there is none. So much for MS being an AI company.

I found out yesterday that MS does have its own scaler, and it can work on 50 TOPS.

So basically you are trying to give Sony credit for pulling something that, years later, has been proven to be effective, and praise them for it, while not praising the company that left in something that could be used in the future.

That's some classic fanboy logic right there.

Just look at all the recent comparisons between PS5 and Series X using next-gen engines like Unreal Engine 5 and honestly tell me which console was more forward thinking.
 
I don't think I need to quote some replies in this thread to prove it lol
I don't think anyone has said it's bad news. More that it's irrelevant, because FSR4 will never come to Series X, for different reasons. Also, trying to argue the hardware was more smartly designed when it was a coincidence (I doubt this "full" RDNA 2 thing was done because they knew AMD would have an AI upscaler in the near future) is quite debatable, if not laughable.
 
Last edited:
I found out yesterday that MS does have its own scaler, and it can work on 50 TOPS.

So basically you are trying to give Sony credit for pulling something that, years later, has been proven to be effective, and praise them for it, while not praising the company that left in something that could be used in the future.

That's some classic fanboy logic right there.

Just look at all the recent comparisons between PS5 and Series X using next-gen engines like Unreal Engine 5 and honestly tell me which console was more forward thinking.
Your desire to praise MS with some bullshit "forward thinking blablabla" primes your view of others.
I don't praise anyone. All the decisions were logical at the time for each company's priorities: Sony balanced costs, MS didn't want to go too deep into R&D, and AMD added features for general-purpose GPU computing. Every party had its reasons for choosing a particular path, and each has its pros and cons. And none of them knew what would happen in five years' time; some got luckier than others.
 
Developers have reported this difference several times: a unified architecture has its advantages, but you have to share the RAM and the bandwidth between CPU and GPU tasks, whereas on PC they are separate and dedicated, so a straight comparison isn't exactly like-for-like. For example, 448 GB/s on a PC GPU is all for the GPU; the CPU has its own RAM and bandwidth. On console, the GPU and CPU have to share both the bandwidth and the RAM pool. What exactly is surprising, unbelievable, or nonsense about that?

Look at a 780M SoC using FSR4 INT8.
That's a 4 TFLOP GPU with a memory bandwidth of 88 GB/s, shared by the CPU and GPU.

 
Your desire to praise MS with some bullshit "forward thinking blablabla" primes your view of others.
I don't praise anyone. All the decisions were logical at the time for each company's priorities: Sony balanced costs, MS didn't want to go too deep into R&D, and AMD added features for general-purpose GPU computing. Every party had its reasons for choosing a particular path, and each has its pros and cons. And none of them knew what would happen in five years' time; some got luckier than others.

Can you provide any evidence that MS just lucked out here, or are you just making statements with zero evidence that what you say is true? I could see it being true, but I do not understand WHY MS would have gone on so much about being real RDNA 2 if they didn't decide to KEEP this stuff.

If it was just off-the-shelf parts, why did it not have the RAM on the GPU? This is a custom APU made specifically for the Series X; it is not an off-the-shelf part, and I feel like you are making stuff up to downplay MS and hype up Sony. It's clear what you are doing, tbh.
 
Microsoft was already planning an AI upscaler well before the Series X launch.
By 2018, they were already giving presentations about it, and they later introduced DirectML.
So MS did have plans to use the RDNA 2 DP4a capabilities.
The question is why they never finished and shipped it.
Probably just another one of Microsoft's screw-ups this generation.


Microsoft has actually been working on a 'super-resolution' AI upscaling technique based on DirectML for quite some time. This was first revealed in a GDC 2018 paper aptly titled 'Using AI to accelerate your game', and then showcased in a SIGGRAPH 2018 live talk named 'Deep Learning for Real-Time Rendering: Accelerating GPU Inferencing with DirectML and DirectX 12'.

In both cases, Forza Horizon 3 was used as the test game to compare DirectML super-resolution against regular upscaling through the bilinear filter. Here's a screenshot they had shared at the time to compare the two.

[Screenshot: microsoft-directmlHD-1.jpg]
 
I still wonder why MS never did something with this on Series X a long time ago; they could have had a clear edge over the PS5.
The only time I remember ML being mentioned this gen was for that muscle deformation in Spider-Man?
Looks like Wolverine will be using it in a similar way when it comes out. The only other released game I'm aware of using ML is God of War Ragnarok, where they used it for texture upscaling.
 
The performance hit to XSX from using FSR4 would be significant.
Rich's test had a 10ms overhead to upscale on the 'simulated' console setup. You might as well just run native at that point; it might even perform better.
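To put the 10ms in context: at 60 fps the entire frame budget is 1000 / 60 ≈ 16.7ms, so a 10ms upscale pass leaves roughly 6.7ms for everything else in the frame. Even at 30 fps (≈ 33.3ms) it eats nearly a third of the budget. That's the arithmetic behind 'might as well run native'.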

There's no technical reason not to use it.
That's like saying 'there's no technical reason not to use supersampling all the time': every console can run it, and the quality is dramatically superior to any temporal upscaler.
 
Can you provide any evidence that MS just lucked out here, or are you just making statements with zero evidence that what you say is true? I could see it being true, but I do not understand WHY MS would have gone on so much about being real RDNA 2 if they didn't decide to KEEP this stuff.

If it was just off-the-shelf parts, why did it not have the RAM on the GPU? This is a custom APU made specifically for the Series X; it is not an off-the-shelf part, and I feel like you are making stuff up to downplay MS and hype up Sony. It's clear what you are doing, tbh.
We've seen it all the way back to the X360 era: MS just pays money to get lightly customized parts, and they tend to buy advanced tech, while Sony does heavy engineering to tailor things for themselves, often going down to a very low level.
Sony, together with IBM, designed the PPU, and MS just bought it and put it in a 3-core configuration they thought would be better: a clear difference in resource allocation and priorities.
Even Magnus is mostly a bought-out thing, and some of it is based on what Sony and AMD designed.
 
Well, he shows in the video how it would currently run on PS5 by trying to use it on RDNA 1. It completely breaks; it almost looks like a 90s movie representation of a virus taking over your screen lol.
That's a code/driver issue on the 5700, a bug. If done properly, it would run fine but slowly, making its use in INT8 form pointless. There were results like that with XeSS on some older GPUs already, IIRC.
The question here, though, is who would do the porting/optimization of such an FSR4 FP16 version for PS5. AMD won't; they won't care, they barely care about RDNA 3 at this point. Sony? Doubtful; they are focused on "improving" PSSR and on the PS6 (which are highly likely to end up being related).
So in essence it won't run, because at this point there are no parties interested in making it run. But from a technical perspective, an AI-assisted upscaler probably could have been made for the OG PS5, even if its applicability would be limited to simpler graphical titles (the ones which manage to hit 120 fps or run with supersampling, for example).
 
Can you provide any evidence that MS just lucked out here, or are you just making statements with zero evidence that what you say is true? I could see it being true, but I do not understand WHY MS would have gone on so much about being real RDNA 2 if they didn't decide to KEEP this stuff.
MS is not a hardware company, and history has shown that Sony has always been more involved than MS when it comes to these shared technologies they have both been using since the PS360 generation.
If it was just off-the-shelf parts, why did it not have the RAM on the GPU? This is a custom APU made specifically for the Series X; it is not an off-the-shelf part, and I feel like you are making stuff up to downplay MS and hype up Sony. It's clear what you are doing, tbh.
This is also true to an extent, though. It's wrong for anyone to say the Xbox chips are not custom chips, but at the same time, you have to understand what level of customization goes into the chip designs. E.g., MS's chips are more "off the shelf" than Sony's chips ever are; you will always see things in Sony's iteration of an APU that are just not in any other version of that chip. This happened with the PS4 Pro, where half of the CUs in the GPU were bigger than the other half to support specific CBR hardware. It happened again with the PS5, with that whole storage pipeline and the cache scrubbers.

MS, on the other hand, tends to take a simpler, less invasive approach to customization, like taking the Infinity Cache out of the GPU while keeping everything else the same.

I think it's wrong to say MS "lucked out", but I don't buy into the whole forward-thinking thing either. You can't say something or someone is forward thinking when you do not see the evidence. E.g., MS has been working on AI stuff since 2017. They even previewed it, but why didn't they use it? Why did AMD never try to do any AI stuff on RDNA 2 GPUs that supposedly had the "hardware" to do it? MS had ML upscaling running two years before their console released... and yet it was never even hinted at as part of their architecture suite.

Why is the Xbox-equivalent GPU in this video showing as much as a 10ms hit in these tests? I feel there is more to all this than we currently know.
 
MS is not a hardware company, and history has shown that Sony has always been more involved than MS when it comes to these shared technologies they have both been using since the PS360 generation.

This is also true to an extent, though. It's wrong for anyone to say the Xbox chips are not custom chips, but at the same time, you have to understand what level of customization goes into the chip designs. E.g., MS's chips are more "off the shelf" than Sony's chips ever are; you will always see things in Sony's iteration of an APU that are just not in any other version of that chip. This happened with the PS4 Pro, where half of the CUs in the GPU were bigger than the other half to support specific CBR hardware. It happened again with the PS5, with that whole storage pipeline and the cache scrubbers.

MS, on the other hand, tends to take a simpler, less invasive approach to customization, like taking the Infinity Cache out of the GPU while keeping everything else the same.

I think it's wrong to say MS "lucked out", but I don't buy into the whole forward-thinking thing either. You can't say something or someone is forward thinking when you do not see the evidence. E.g., MS has been working on AI stuff since 2017. They even previewed it, but why didn't they use it? Why did AMD never try to do any AI stuff on RDNA 2 GPUs that supposedly had the "hardware" to do it? MS had ML upscaling running two years before their console released... and yet it was never even hinted at as part of their architecture suite.

Why is the Xbox-equivalent GPU in this video showing as much as a 10ms hit in these tests? I feel there is more to all this than we currently know.
IMO, DF are more passionate than technically expert, if I may say so. They presume things more often than they dig into the "tech reality". I think a lot of people overestimate their words and treat their takes as the holy grail of tech analysis, when that's not always the case. It's not about being disrespectful to them; that's the impression they give me every time they guess at something and end up completely misleading the argument. I have in mind all their strawman chats about the TF monster machine or the variable frequency on PS5.
 
Last edited:
IMO, DF are more passionate than technically expert, if I may say so. They presume things more often than they dig into the "tech reality". I think a lot of people overestimate their words and treat their takes as the holy grail of tech analysis, when that's not always the case. It's not about being disrespectful to them; that's the impression they give me every time they guess at something and end up completely misleading the argument. I have in mind all their strawman chats about the TF monster machine or the variable frequency on PS5.
Agree with you on DF... However, the simple fact of the matter is that if the Xbox consoles could do ML-based reconstruction in any meaningful way, they would have done it already. Especially when you consider that MS had this tech (ML upscaling) in the dev stage as far back as 2017, and that the Xbox Series S would definitely have benefited from something like it.
 
I don't think anyone has said it's bad news. More that it's irrelevant, because FSR4 will never come to Series X, for different reasons. Also, trying to argue the hardware was more smartly designed when it was a coincidence (I doubt this "full" RDNA 2 thing was done because they knew AMD would have an AI upscaler in the near future) is quite debatable, if not laughable.

The Series X will still have legs, probably the same legs as the PS5 Pro, so if MS and AMD agree to implement this in the Xbox SDK, which shouldn't be out of the question if the next Xbox SDK is backwards compatible or simply a newer version of the current one, then it could happen. After all, it's quite clear cross-gen is here to stay, so Series X versions will still be released alongside next-gen Xbox versions, as they already did with Xbox One + Series releases.

It will all depend on the time and effort required, to understand whether it's worth the investment.
 
Last edited:
The Series X will still have legs, probably the same legs as the PS5 Pro, so if MS and AMD agree to implement this in the Xbox SDK, which shouldn't be out of the question if the next Xbox SDK is backwards compatible or simply a newer version of the current one, then it could happen. After all, it's quite clear cross-gen is here to stay, so Series X versions will still be released alongside next-gen Xbox versions, as they already did with Xbox One + Series releases.

It will all depend on the time and effort required, to understand whether it's worth the investment.
I don't know what makes you think that, because looking at MS lately, it seems they've already buried it. And I'm not even that sure it's worth using FSR4 on Series X. PSSR is already quite taxing on the PS5 Pro; I can't imagine FSR4 on Series X reconstructing from 720p or 800p to 4K.
 
Last edited:
People are complaining in some games about PSSR's 2ms overhead compared to TSR, and you think 8-10ms is going to work on XSX?
Most people are convinced about it, but I'm considered a joke of a man because I dare to say I'm doubtful, knowing PC and console aren't exactly like-for-like. I never said I'm an expert, eh, but I doubt most here are either.
 
Last edited:
I wonder if this works on older Nvidia GPUs like Pascal. They support INT8 acceleration.
No reason it wouldn't if it's done through DP4a, similarly to XeSS.

FSR4 on Nvidia doesn't use the tensor cores, right?
INT8 math on RTX cards should be running on tensor cores, but if it's DP4a math then it's not using the full tensor core throughput and is thus running at a fraction of the possible speed.
The FSR4 FP8 version runs through AMD's proprietary API and thus can't run on anything but AMD hardware (similar to DLSS and XeSS XMX).
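To make the DP4a vs. tensor core distinction concrete, here's a minimal CUDA sketch (purely illustrative; it has nothing to do with FSR4's actual code). __dp4a is a single instruction that does a 4-way INT8 dot product with a 32-bit accumulate, and on Nvidia hardware it executes on the ordinary integer ALUs rather than the tensor cores, which is why a DP4a code path can't reach full tensor core INT8 throughput:

```cuda
// Minimal DP4a demo. Compile with: nvcc -arch=sm_61 dp4a_demo.cu
// (DP4a needs compute capability 6.1+, i.e. Pascal GP102/104/106 or newer.)
#include <cstdio>

__global__ void dp4a_demo(const int* a, const int* b, int* out) {
    // __dp4a treats each int as four packed signed 8-bit values and computes
    // a0*b0 + a1*b1 + a2*b2 + a3*b3 + acc in a single instruction, issued on
    // the regular integer ALUs rather than the tensor cores.
    out[0] = __dp4a(a[0], b[0], 0);
}

int main() {
    // Pack {1,2,3,4} and {5,6,7,8} as little-endian int8x4 words.
    int ha = 0x04030201, hb = 0x08070605, hout = 0;
    int *da, *db, *dout;
    cudaMalloc(&da, sizeof(int));
    cudaMalloc(&db, sizeof(int));
    cudaMalloc(&dout, sizeof(int));
    cudaMemcpy(da, &ha, sizeof(int), cudaMemcpyHostToDevice);
    cudaMemcpy(db, &hb, sizeof(int), cudaMemcpyHostToDevice);
    dp4a_demo<<<1, 1>>>(da, db, dout);
    cudaMemcpy(&hout, dout, sizeof(int), cudaMemcpyDeviceToHost);
    printf("dp4a result: %d\n", hout);  // 1*5 + 2*6 + 3*7 + 4*8 = 70
    return 0;
}
```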
 
This was the INT8 and INT4 support that was in RDNA 2 for XSX and XSS.
+ HW VRS tier 2 (not as good as recent software VRS implementations), + mesh shaders (it took until late 2025 before they were used in two (!) games), + sampler feedback (useless tech)
Why is sampler feedback useless? Was it even used?
 
That's why I also stated that, after all, it comes down to understanding whether it's worth the investment or not. But it's clear the system does support it.
The system doesn't support it. MS never made an upscaler, nor did they ask AMD to get FSR4 running on it. This is a DF "simulation" on PC using the leaked AMD source code. As I was saying in the other thread, though, a console is fixed hardware, and the likely reason this has never happened is that on a 50 TOPS machine you're talking about a major performance hit. It would have made PS5 comparisons worse.
 
RDNA 1.5 is back

EDIT: Not interesting for Series X to use it at the cost of losing 10 fps... still better to use FSR3 than this fake-ass FSR4
The ASRock BC-250 reveals the PS5's GPU design, and it fails the PC/XSX RDNA 2 standard: the Linux RDNA 1 driver is used instead of the PC RDNA 2 driver.

RDNA 1.5 (with RDNA 2 RT) is real.

The ASRock BC-250's recycled PS5 APU, with a 24 CU iGPU active, has sent Sony's NDA hiding PS5 APU details into the bin.
 
Last edited:
Series X DID have secret sauce and was like MS's pro console from the get-go. What do you know?

Shame they didn't have any of this ready to actually show us... MS are just great at lining everything up... /s
MS failed to properly deploy ML upscaling even though XSX/XSS's RDNA 2 has DP4a INT8 support. Superior hardware is nothing without good software.

The ML upscaler in AMD's software stack (FSR4) is also late. The FSR4-based next PSSR is also late. There is a pattern: AMD sucks at GPU software compared to Nvidia.
 
Last edited:
The ASRock BC-250 reveals the PS5's GPU design, and it fails the PC/XSX RDNA 2 standard: the Linux RDNA 1 driver is used instead of the PC RDNA 2 driver.

RDNA 1.5 (with RDNA 2 RT) is real.

The ASRock BC-250's recycled PS5 APU, with a 24 CU iGPU active, has sent Sony's NDA hiding PS5 APU details into the bin.
[GIF: "One Two Three", SNL]
 
• Features an AMD BC250 APU, codenamed 'Ariel', a cut-down variant of the APU in the PS5. It integrates 6x Zen 2 cores at up to 3.49GHz (ish), as well as a 24 CU RDNA 2 iGPU (codename 'cyan-skillfish'), as opposed to the 36 available in a standard PS5 Ariel SoC.
"It seems to be basically RDNA1/GFX10 but with added support for image_bvh_intersect_ray ray tracing instructions. LLVM seems to be calling this variant gfx1013.
After this set of patches is applied, this chip is properly detected and is usable on modern kernels. Tested on kernel 6.12.9.
Many games are reported to work. Quake2 RTX using the ray tracing pipeline works (not very fast, but 3-4x faster than RADV_PERFTEST=emulate_rt). A custom compute load that uses ray query a lot also works reliably."

Testing RPCS3 Demon's Souls on an AMD BC-250 with Arch Linux. RPCS3 (the PS3 emulator) performance on the PS5 APU is reasonable.
 
Last edited:
You can bet all your balls that Leadbetter won't miss the opportunity to relaunch the narrative about the better-engineered hardware.
But Xbox has the better console this gen, that's a fact: better VRR, better OS features, AI hardware.

I'm actually expecting XS consoles to be FSR 4 capable.
 
But Xbox has the better console this gen, that's a fact: better VRR, better OS features, AI hardware.

I'm actually expecting XS consoles to be FSR 4 capable.
The Xbox's hardware components cost around $539 if I'm not wrong, where the PS5's should be $469, maybe a bit less; I don't remember precisely. So yes, spending more on hardware components delivers more power, but is a console with 18-22% more raw power better if it cost more? For me, that's debatable.
 
Last edited:
• Features an AMD BC250 APU, codenamed 'Ariel', a cut-down variant of the APU in the PS5. It integrates 6x Zen 2 cores at up to 3.49GHz (ish), as well as a 24 CU RDNA 2 iGPU (codename 'cyan-skillfish'), as opposed to the 36 available in a standard PS5 Ariel SoC.
"It seems to be basically RDNA1/GFX10 but with added support for image_bvh_intersect_ray ray tracing instructions. LLVM seems to be calling this variant gfx1013.
After this set of patches is applied, this chip is properly detected and is usable on modern kernels. Tested on kernel 6.12.9.
Many games are reported to work. Quake2 RTX using the ray tracing pipeline works (not very fast, but 3-4x faster than RADV_PERFTEST=emulate_rt). A custom compute load that uses ray query a lot also works reliably."

Testing RPCS3 Demon's Souls on an AMD BC-250 with Arch Linux. RPCS3 (the PS3 emulator) performance on the PS5 APU is reasonable.

I really don't understand what you're trying to argue, in all honesty.
 