Digital Foundry: Leaked FSR4 INT8 Test: RDNA 3, RDNA 2, Steam Deck, Asus ROG Ally, Nvidia + Xbox Series X Simulation

But Xbox has the better console this gen, that's a fact: better VRR, better OS features, AI hardware.

I'm actually expecting XS consoles to be FSR 4 capable.
The PS5 Pro has 6x better AI hardware than the XSX. Do you think it makes a difference in practical terms?

Good luck implementing FSR4 on a Series S. Even in 30fps games I'm skeptical it would be possible. I have no idea why DF ran this so-called Xbox Series "simulation" and pinned the blame on the base PS5, as if that's what's holding the XSX back from implementing it. A purely academic simulation, I guess, but suggesting the base PS5 is the reason we haven't seen it on XS is stupid. The base PS5 isn't holding the XSX back from an upscaler option, especially when there's a PS5 Pro out there plus PC support. The Xbox Series hardware itself is what's holding FSR4 support back.
They would have been better off doing a PS5 Pro "equivalent" test, because at least that would have been somewhat useful in the real world.
 
All generation long they've blamed PS5 hardware for something 🙄, and now they're blaming the PS5 again with this full RDNA 2 nonsense? I haven't seen the video because I can't stand Leadbetter; I know him well enough to expect him to put out such an inflammatory console-warring theory. I already imagined he'd try yet another nonsense argument to feed the fanboy mindset.
 
FSR 4.0.2 DP4a INT8 works and is useful on the Steam Deck's 8 CU scale RDNA 2 iGPU.

XSX's full RDNA 2 iGPU is meaningless when PS5's exclusive games sold Sony's platform. Microsoft itself is what's holding the XSX back.
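Side note on the DP4a instruction mentioned throughout this thread: it's a packed GPU instruction that multiplies four pairs of signed 8-bit values and accumulates the sum into a 32-bit integer, which is the primitive the leaked INT8 path relies on for hardware like RDNA 2. A scalar Python sketch of what a single issue computes (the function name and values are illustrative, not a real API):

```python
def dp4a(a, b, acc=0):
    """Emulate one DP4a issue: dot product of four signed 8-bit lanes,
    accumulated into a 32-bit integer."""
    assert len(a) == len(b) == 4
    for x, y in zip(a, b):
        assert -128 <= x <= 127 and -128 <= y <= 127  # int8 range check
        acc += x * y
    return acc

# one packed instruction's worth of INT8 math:
print(dp4a([1, -2, 3, 4], [5, 6, -7, 8]))  # 1*5 - 2*6 - 3*7 + 4*8 = 4
```

The point is that four multiply-adds collapse into one instruction, which is why DP4a-capable GPUs can run INT8 inference at a useful rate without dedicated matrix units.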
 
We've seen it all the way back from the X360 era: MS just pays money for lightly customized parts and tends to buy in advanced tech, while Sony does heavy engineering to tailor hardware for itself and often goes very low level.
Sony designed the PPU together with IBM; MS just bought it and put it in a 3-core configuration they thought would be better, a clear difference in resource allocation and priorities.
Even the actual Magnus is mostly bought out, and some things are based on what Sony and AMD designed.
Xbox 360 had proper HSA (Heterogeneous System Architecture)-like features, e.g. CPU pointer exchange with the GPU. Many GCN v1 concepts are already present in ATI Xenos. PS3's Cell lacks that pointer exchange between the CPU/SPEs and the GPU.

Each GCN ACE (asynchronous compute engine) supports up to 8 simultaneous contexts. GCN 1.0 and Bonaire GCN have two ACE units; GCN's Hawaii, Tonga, and the PS4 GPU have 8 ACE units.

Xbox 360's GPU ROPs have Rasterizer Ordered Views-like features, which is exploited by a third-party Xbox 360 emulator via DirectX 12 Feature Level 12_1.

Xbox 360 GPU has hardware tessellation, hence less load on the three PPE CPU cores.

Both the GCN and Xenos GPUs are SIMD-based, unlike PC's Radeon HD VLIW5 and VLIW4 designs.

Xbox 360's GPGPU capability was very advanced compared to PS3's aging G7x-class RSX.
 
This reasoning makes absolutely no sense. They would have sold more consoles and more third-party software if the hardware had been capable of taking advantage of this. Hell, they could take advantage now that the PS5 Pro has better support than the XSX, but they don't. Why do you think that is?

"Tests show that on Steam Deck in Performance mode the gain from FSR 4 INT8 is only about 28% compared to native rendering, lower than FSR 3 or Intel XeSS, which provide 29 to 43% depending on the scene. However, FSR 4 INT8 requires more resources, causing frequent frame-rate drops, especially in scenes with high GPU load."

Even the Steam Deck, with its often sub-60fps and low resolutions, does not benefit much from this. It would not have scaled well on the XSX with its higher framerates, higher resolutions, and low TOPS performance.
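For a concrete sense of what those quoted percentages mean in frames, here's the arithmetic at an assumed native base framerate (the 30 fps base is a hypothetical for illustration; only the gain percentages come from the quote above):

```python
base_fps = 30.0  # hypothetical native framerate; not from the article
gains = {
    "FSR4 INT8": 0.28,            # ~28% gain quoted for Steam Deck Performance mode
    "FSR3/XeSS (low end)": 0.29,  # quoted 29-43% range for the alternatives
    "FSR3/XeSS (high end)": 0.43,
}
for name, gain in gains.items():
    # final framerate after applying the quoted percentage uplift
    print(f"{name}: {base_fps * (1 + gain):.1f} fps")
# FSR4 INT8: 38.4 fps
# FSR3/XeSS (low end): 38.7 fps
# FSR3/XeSS (high end): 42.9 fps
```

In other words, at that base the gap between FSR4 INT8 and the best-case alternative is a few fps, before accounting for the quoted frame-time drops.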
 
This reasoning makes absolutely no sense(3). They would have sold more consoles and more third party software if it was capable of taking advantage of this. Hell they could take advantage when PS5 Pro has better support than XSX now but they don't. Why do you think that is?

"Tests show that on Steam Deck in Performance mode the gain from FSR 4 INT8 is only about 28% compared to native rendering, it is lower than FSR3 or Intel XeSS, which provide from 29 to 43% depending on the scene. However, FSR 4 INT8 requires more resources, causing frequent drops in frame rate, especially in scenes with high GPU-load."(1, 2)

Even steamdeck with its often sub-60fps and low resolutions does not benefit much from this. It would not have scaled well on the XSX with its higher framerates, higher resolutions and low TOPS performance.
1. On image quality, FSR4 INT8 in Performance mode is superior to FSR 3.1.

2. The XSX GPU is 52 CU scale compared to the Steam Deck's 8 CU scale.



The leaked FSR 4.0.2 was designed for DP4a INT8-capable GPUs, but RDNA 3.0/3.5 also has WMMA INT8.

3. Superior hardware can be beaten by inferior hardware with superior exclusive games.
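On the DP4a vs WMMA distinction in the note above: DP4a performs one 4-wide INT8 dot product per instruction, while RDNA 3's WMMA instructions consume whole matrix tiles, amounting to many such dot products per issue. A rough pure-Python sketch of the relationship (loop/chunk shapes are illustrative, with the INT32 accumulation both paths use):

```python
def int8_matmul(A, B):
    """INT8 matrix multiply with INT32 accumulation, written as many
    4-wide dot products: each inner chunk of 4 is one DP4a-style step,
    while a WMMA instruction covers a whole tile of them at once."""
    n, k, m = len(A), len(A[0]), len(B[0])
    assert k % 4 == 0  # keep the 4-wide chunking simple
    C = [[0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            acc = 0
            for p in range(0, k, 4):  # one DP4a-style issue per chunk of 4
                acc += sum(A[i][p + t] * B[p + t][j] for t in range(4))
            C[i][j] = acc
    return C

print(int8_matmul([[1, 2, 3, 4]], [[1], [1], [1], [1]]))  # [[10]]
```

Which is why a WMMA path can run the same upscaler network with far fewer instructions than the DP4a fallback.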
 
1. On image quality, performance mode FSR4 INT8 is superior when compared to FSR 3.1.

2. XSX GPU has 52 CU scale when compared to the Steam Deck's 8 CU scale.


What good is better image quality, though, at a significant cost to framerate? Even your video shows that on a 7900XTX, FSR4 is two tiers slower than FSR3, meaning the gain in image quality at the same framerate is very debatable.
3. Superior hardware can be beaten by inferior hardware with superior exclusive games.
Sure, but that doesn't stop the use of FSR4 on the other "superior machine", especially when Xbox was making exclusive games in the past. Why do you think they (Xbox) didn't do it for their own games? PC and the PS5 Pro (the superior hardware) both have INT8 support, so why would the Xbox be held back by anything other than MS deciding this isn't worth pursuing?
 

But FSR4 INT8, even in Performance mode, looks better than FSR 3.1 in Quality mode. And they have the same performance.
 
1. FSR3.1 quality mode is inferior when compared to FSR4.0.2 INT8 performance mode.

FSR3.1 Quality mode is awful when I run it on my RTX 4090 and RTX 5090. My fastest current AMD GPU is the Radeon 890M (16 CU scale RDNA 3.5).

2. Microsoft is distracted by Windows Copilot and server AI. Microsoft is also maintaining two parallel Windows 11 CPU editions, i.e. AArch64 and x64.

3. Unlike Nvidia's mature DLSS releases, both XSX and PS5 have late-release ML/DL upscalers, and the common factor is AMD.
 
But FSR4 Int8, even in performance mode, looks better than FSR3.1 in Quality mode. And they have the same performance.
Debatable, I'd say, especially given the effort required for it; I hardly see a difference myself. Besides, this is what it scales like on a 7900XTX, being two tiers slower, and it would scale worse on lower-end cards like a 6700XT or Series S. Developers often opt for Performance mode on FSR3 anyway and would rather have the framerate boost for less optimisation work. It has nothing to do with the PS5 being more popular, though. Nothing is holding back the XSX other than decisions made by MS/Xbox, which I don't necessarily disagree with in this case.
 
FSR4's release was relatively recent, with the RX 9070 family. Both XSX and PS5 have late-release ML/DL upscalers, and the common factor is AMD, i.e. the weakness is shared by AMD-based console platforms and AMD PC GPUs.

My Hardware Unboxed link has the RX 6750 XT FSR4 INT8 vs FSR 3.1 benchmark test and comparison.
 
You probably only saw the DF review. But several channels and sites reviewed FSR4 INT8 before them, with different GPUs and games.
One of the most comprehensive was done by ComputerBase. They tested with a 9060XT, a 7800XT and a 6800XT, in more games than DF.

The other thing to consider is that this is just a mod, based on early code, and running on PC.
There is probably quite a bit of room to improve performance and image quality further.
And with consoles having lower-level APIs, there is greater room for optimization.

[ComputerBase benchmark chart]
 
FSR4's release was a relatively recent event with the RX 9070 family. Both XSX and PS5 have late-release ML/DL upscalers, and the common factor is AMD i.e. the weakness is common for AMD-based game console platforms and AMD PC GPUs.

My Hardware Unbox link has the RX 6750 XT FSR4 INT8 vs FSR3.1 benchmark test and comparison.
If the common factor is AMD, then there's even less reason for DF to blame the lack of it on the XSX on PS5 adoption. MS didn't need to be reliant on AMD, just as Sony wasn't with PSSR. The benefit of it on the XSX would be small to nonexistent though, because no developer is going to switch from FSR3 Performance and drop more than 10fps on average. I like to think the reason they didn't build an upscaler with DirectML is that they were competent enough to know it would not be worth the hassle on that hardware.

Do you have some good comparisons of FSR4 Performance vs FSR3 Quality? The Hardware Unboxed video didn't really do a good job of showing this so-called increase in IQ.

Consoles normally opt for FSR3 Performance though, and with that, per your graph, you end up with an 11fps drop when switching to FSR4. I don't think anybody is going to adopt FSR4 on an Xbox Series console. Not even the XSX.
 
You have reviews from Hardware Unboxed, Digital Foundry, ComputerBase, Ancient Gameplays and Daniel Owen that show the differences between FSR 3.1 and FSR4.
And it's clear that FSR4 INT8 looks the best.
If you have any doubts, you can run FSR 3.1 and FSR4 INT8 on any modern GPU, even one from Nvidia.

No. I also agree that no one will use FSR4 on the Series S/X. It's a dead console and not worth the extra effort.
Some devs don't even bother releasing games for it.
But on PC it's good to have, as people with older GPUs that support DP4a can enjoy a quality boost.
 
1. The FSR4-based PSSR update for the PS5 Pro is scheduled for next year, i.e. 2026.

From https://www.tweaktown.com/news/1077...rade-in-2026-not-amd-fsr-4-support/index.html

TL;DR: Sony will launch an upgraded PlayStation Spectral Super Resolution (PSSR) upscaling technology in 2026 for the PS5 Pro, based on AMD's FSR 4 but uniquely optimized for console gaming. Unlike the PC-focused FSR 4, Sony's custom PSSR handles variable rendering resolutions at fixed 4K output and 60FPS display.

The late ML upscaler deployments have a common factor, i.e. AMD.

The PS5 Pro was released on November 7, 2024, after three RTX generations in the normal consumer market, i.e. Turing, Ampere, and Ada Lovelace. Blackwell was released near the PS5 Pro's release window.

Unlike NVIDIA's AI R&D effort, AMD's AI R&D is spread across several architectures rather than focused on a single architecture family, i.e. it's a mess.

The PS5 Pro's AI hardware capability falls between RDNA 3.0 and PC's RDNA 4.0 (GFX1250 and GFX1251). PC servers have the incoming MI400 (GFX1250) RDNA 4.5 with FP4 and FP6.



2. FSR 4.0.2 INT8 on RDNA 2 (RX 6800 XT) Tested.


FSR 3.1.5 compared against FSR 4.0.2 INT8

For Cyberpunk 2077 on the RX 6800 XT (72 CU scale RDNA 2):

Native: 100% (81 fps)
FSR3: 151% (122 fps)
FSR4 (INT8): 141% (114 fps)


RDNA 3.0/3.5 still has WMMA INT8 beyond DP4a INT8.
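As a quick sanity check, the percentage figures above are just each mode's framerate relative to native (numbers copied from the post; trivial illustrative script):

```python
native_fps = 81  # RX 6800 XT, Cyberpunk 2077, native rendering
modes = {"FSR3": 122, "FSR4 (INT8)": 114}
for name, fps in modes.items():
    # relative performance vs. native, rounded to whole percent
    print(f"{name}: {fps / native_fps:.0%}")
# FSR3: 151%
# FSR4 (INT8): 141%
```

So the FSR4 INT8 path on that RDNA 2 card gives up roughly 10 percentage points of uplift versus FSR3 in exchange for the image-quality gain.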
 
Was the PS5 GPU exposed in the GitHub leak?
Yep. The AMD/ASRock BC-250 has a recycled, yield-failed PS5 APU with a 24 CU scale iGPU and six CPU cores active. It shows the PS5 APU is Windows NT and mainline Linux x64 capable and usable.

There is no retail PS5 SKU lower than an 8 CPU core, 36 CU iGPU configuration.

The BC-250 doesn't have Sony's walled-garden DRM enforcement.

AMD's Ryzen 4700S has a recycled yield failed PS5 APU with just the eight CPU cores active.

To keep the PS5's BOM cost as low as possible, the failed yields need to pay for themselves, hence recycling them into other products.



On unit sales, despite the XSX's full support for RDNA 2 features, it still lost to the PS5. Microsoft initially made a big PR noise about the XSX GPU's server-grade features, e.g. DP4a as one of AMD's early AI extensions.

The Radeon RX 6800 series has excellent ROCm-based OpenCL performance on Linux compared against the Ampere RTX 3080's OpenCL (NVIDIA needs its Tensor cores). The PlaidML benchmark was used.
 