Digital Foundry gets called out on an apparent double standard between its coverage of Nvidia vs. AMD frame generation tech

I mean… one of the very first replies to the tweet nails it. It's likely that DF just has a more thorough understanding of what frame generation actually is now. When DLSS3 was first introduced it was essentially just witchcraft. Now that there's been some time to fiddle with it and really understand the nitty gritty and everything it is - when talking about a similar technology they are more nuanced in their verbiage.

I personally think FSR3 looks better. It's less blurry. Sure there are some more artifacts compared to DLSS and I've always been pro-dlss against FSR but I have to admit AMD is really bringing it to the table. Especially since FSR isn't hardware locked.
That has to be BS. Nvidia was the first to implement frame generation in games, but frame gen has been a thing, theoretically, since the 360 era. Also, isn't frame gen a thing in VR?
 
I'm not an expert on these techs, but from what I know, FSR is hardware agnostic because it relies on shader cores, while DLSS uses its own dedicated cores.

It's clear that FSR eats some performance by using cores intended for graphics, while DLSS does not.
DLSS and FSR perform the same on NVIDIA GPUs 99% of the time.
 
Rich: I'm careful with the words used to describe the frame rate increase because, similar to DLSS3, I don't think you can call it extra performance as such.






Where is the double standard here?
He says he doesn't think you can call frame gen from DLSS3 or FSR3 extra performance.
Did you watch the clip in the tweet? The DF crew knew frame gen was a thing before Nvidia and suggested it was a type of performance boost even though they state frame gen can make a game feel less responsive on the DLSS3 clip.
 
 
Did you watch the clip in the tweet? The DF crew knew frame gen was a thing before Nvidia and suggested it was a type of performance boost even though they state frame gen can make a game feel less responsive on the DLSS3 clip.

The clip in the tweet is a cherry-picked, out-of-context collection of clips presented out of chronological order to look bad.
I watched it and know where all the clips are actually from, so I've got the context.

They didn't even have their current capture card when they did the DLSS3 video.

And in their most recent, and presumably most accurate, data they have said outright that they wouldn't call DLSS3 and FSR3 frame gen extra performance as such.

So what's the issue?


If you are gonna hold shit from yesteryear against them when they have updated and amended their opinions, you might as well call out their PS4/XB1-era videos, because it's a fucking double standard.

They said the PS4 was the best way to play Game X, but clearly the Xbox One X version of that game runs and looks better.
Fucking double standards!!!!!!!!
 
Doesn't FSR3 frame generation currently only work on 7000-series GPUs?
AMD Fluid Motion Frames only works on AMD 6000-series GPUs and up.

So isn't it hardware locked?
Source: https://community.amd.com/t5/gaming/first-look-at-amd-fidelityfx-super-resolution-3/ba-p/626581
AMD FSR 3 Supported Products


AMD FSR 3 is an open technology that does not require machine learning (ML) hardware, allowing it to be supported on a broad range of products and platforms, including consoles. When using FSR 3 with super resolution upscaling and frame generation we recommend the following hardware:


Supported and Recommended Hardware for using AMD FSR 3 with Upscaling and Frame Generation

AMD:
Supported: AMD Radeon™ RX 5700 and above
Recommended: AMD Radeon™ RX 6000 Series and above

NVIDIA:
Supported: NVIDIA GeForce RTX™ 20 Series and above
Recommended: NVIDIA GeForce RTX™ 30 Series and above


(Note: we do not suggest using frame generation on products lower than our recommendations above. How frame generation performs will depend on the capabilities of your GPU and on older hardware you may not have an optimal experience and may see little to no improvement in performance.)


As AMD FSR 3 is an open technology, it will work on a broader range of hardware beyond what we recommend above. If just using super resolution upscaling WITHOUT frame generation, we recommend the following hardware:


Supported and Recommended Hardware for using AMD FSR 3 with Upscaling ONLY

AMD:
Supported: AMD Radeon™ RX 590 and above
Recommended: AMD Radeon™ RX 5000 Series and above

NVIDIA:
Supported: NVIDIA GeForce® GTX 10 Series and above
Recommended: NVIDIA GeForce RTX™ 20 Series and above
 
There's visual performance, and there's gameplay or latency performance.

We have all kinds of ways to measure visual performance (resolution, texture detail, fps, model detail/poly count, lighting, etc.) but latency performance measurement is sadly still in its infancy.
It's actually easy to measure changes in latency: shoot a video with a high-framerate camera (equal to or higher than the game's fps) that captures both a person's hand pressing a key or button to perform a game action (like shooting, or moving in a menu) and the TV/monitor running the game (at the same or higher Hz than the framerate the game achieves). Record the same game both with this stuff activated and with it disabled, with a framerate counter shown on the screen.

Then watch these two videos frame by frame to count the number of frames the action took in both cases. Using that number and the framerate counter, you will be able to calculate the milliseconds of latency with simple math (latency in ms = frames of latency * 1000 / framerate).

The videos may or may not show different latency with this activated or disabled.

To be more accurate, especially if the camera isn't as fast as the framerate of the game, don't make just a single video with this activated and a single video with it disabled; make multiple of each and take the average time.
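That arithmetic can be sketched as a quick script. The frame counts and the 240 fps capture rate below are made up for illustration, not measured data:

```python
def latency_ms(frames_of_latency, capture_fps):
    """Convert a frame count from the capture video into milliseconds."""
    return frames_of_latency * 1000 / capture_fps

def average_latency_ms(frame_counts, capture_fps):
    """Average several press-to-photon measurements, as suggested above
    when the camera isn't much faster than the game's framerate."""
    return sum(latency_ms(f, capture_fps) for f in frame_counts) / len(frame_counts)

# Hypothetical counts from stepping through each capture frame by frame:
CAPTURE_FPS = 240
disabled_trials = [12, 13, 11, 12]   # frame gen off
enabled_trials = [14, 15, 13, 14]    # frame gen on

print(f"off: {average_latency_ms(disabled_trials, CAPTURE_FPS):.1f} ms")
print(f"on:  {average_latency_ms(enabled_trials, CAPTURE_FPS):.1f} ms")
```

The averaging step is exactly the multi-video suggestion above: each trial is one recorded press, counted frame by frame.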
 
Videos shot at different times with different understanding of how the tech works, I wouldn't call that any kind of a slam dunk.
 
I mean… one of the very first replies to the tweet nails it. It's likely that DF just has a more thorough understanding of what frame generation actually is now. When DLSS3 was first introduced it was essentially just witchcraft. Now that there's been some time to fiddle with it and really understand the nitty gritty and everything it is - when talking about a similar technology they are more nuanced in their verbiage.

I personally think FSR3 looks better. It's less blurry. Sure there are some more artifacts compared to DLSS and I've always been pro-dlss against FSR but I have to admit AMD is really bringing it to the table. Especially since FSR isn't hardware locked.

Are you saying that coming in late, years later, with the solution, more often than not a Kirkland brand copycat, has less wow factor?

 
While I usually don't share DF's Team Green excitement and really think they should call out Nvidia louder for pricing and gimped offerings, this video is pure manipulation, severely taken out of context.
 
I mean… one of the very first replies to the tweet nails it. It's likely that DF just has a more thorough understanding of what frame generation actually is now. When DLSS3 was first introduced it was essentially just witchcraft. Now that there's been some time to fiddle with it and really understand the nitty gritty and everything it is - when talking about a similar technology they are more nuanced in their verbiage.

I personally think FSR3 looks better. It's less blurry. Sure there are some more artifacts compared to DLSS and I've always been pro-dlss against FSR but I have to admit AMD is really bringing it to the table. Especially since FSR isn't hardware locked.

But we've had an understanding of what frame generation is since day one. People realized it was not 'just witchcraft' because the latency data showed very early on that it remained similar whether running at 60fps native or 120fps* with frame gen and Nvidia Reflex enabled. We've even had laymen on forums understanding that aspect of the tech perfectly, funnily calling it 'fake frames' for ages, because it adds absolutely nothing to real game performance.

If the so-called expert outlet like Digital Foundry did not understand it from day one, that would be more than a little embarrassing. One look at the latency data would've shown exactly what the tech is doing, because DF should know that higher FPS should be intrinsically linked to a lowering of latency. There should not have been a learning curve over time, or an adjustment in their reporting of how frame generation works, because it was readily evident how it worked and just needed a ton of nuance attached to the reporting of it.

The tweet is perfect because if DF are taking issue with calling the tech a 'performance multiplier' now for Richard's given reasons, why were DF comfortable calling it that multiple times in the first videos (which also just happened to literally parrot the Nvidia marketing line)? All roads lead to incompetence here: either incompetence in understanding the tech, incompetence in communicating how things work in their videos, or incompetence in tempering their own hype when describing technology like this as a 'performance multiplier' and a 'game-changer'.

I won't go as far as to accuse them of taking the free luxury cruise tickets from Jensen Huang, but dishonesty is all you're really left with if there wasn't some degree of incompetence involved.
 
The clip in the tweet is a cherry-picked, out-of-context collection of clips presented out of chronological order to look bad.
I watched it and know where all the clips are actually from, so I've got the context.

They didn't even have their current capture card when they did the DLSS3 video.

And in their most recent, and presumably most accurate, data they have said outright that they wouldn't call DLSS3 and FSR3 frame gen extra performance as such.

So what's the issue?


If you are gonna hold shit from yesteryear against them when they have updated and amended their opinions, you might as well call out their PS4/XB1-era videos, because it's a fucking double standard.

They said the PS4 was the best way to play Game X, but clearly the Xbox One X version of that game runs and looks better.
Fucking double standards!!!!!!!!
That DLSS "preview" was essentially a paid advertisement. The DF crew clearly pointed out that frame gen makes game images smoother, but it's not like "real" frames, and their usage of "performance" is wrong, since your mouse input doesn't respond at the suggested/observable frame rate.

And the highlighted part was after FSR3.
 
I won't go as far as to accuse them of taking the free luxury cruise tickets from Jensen Huang, but dishonesty is all you're really left with if there wasn't some degree of incompetence involved.
You wouldn't be that far off. They have plenty of [Sponsored by Nvidia] videos on their channel.
 
I'm with some of the earlier responses: are we supposed to believe somebody just because they made a fancy video on Twitter, slowing it down every time he says the word "performance" for maximum comedic effect?

Digital Foundry is pretty inconsistent with me in terms of some of their views, but I am not going to criticize them for no reason when it comes to stuff like this. And where do you people find this stuff on Twitter? It's as if you search for it just to see if there's a gotcha moment, to try to own one of these outlets and make them look bad. I will say that these places make mistakes here and there, but I'm not necessarily seeing one here.

FSR3 is comparable and needs to make strides, just like DLSS 3 did when it came out. But I think it's clear, because of Nvidia's approach, that they will have the advantage for the foreseeable future. I do love that you can use FSR3 with any GPU though. Hard to top that one.
 
Cherry picking. The FSR video clearly explains that this reasoning applies to both, and the DLSS video explains the latency implications quite explicitly.

This is butthurt stuff because FSR3 is still a little beta. It doesn't even seem like the tech is bad, just not working correctly yet in terms of frame pacing.
 
DF just looks like the marketing arm of certain companies/products at this point: extremely biased toward some. If they did change their stance on this, they should be transparent about it; they should be making a video about how they were wrong about DLSS3 and how they changed their minds, but I ain't seeing it.
 
FSR looks to be lowering the base framerate and then doubling it, whereas DLSS is getting over double the base framerate. I don't see where DF is wrong; DLSS is clearly better. Am I missing something?
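As a back-of-the-envelope illustration of that arithmetic (all numbers below are made up, not measured data from either vendor):

```python
def framegen_output_fps(native_fps, overhead_fraction):
    """Toy model: frame generation first costs some fraction of the base
    framerate (its overhead), then doubles whatever base rate remains."""
    base_fps = native_fps * (1 - overhead_fraction)
    return base_fps * 2

# Hypothetical overheads: a heavier implementation drops 60 fps to a 51 fps
# base before doubling, a lighter one only drops it to 57 fps.
heavy = framegen_output_fps(60, 0.15)   # roughly 102 fps out
light = framegen_output_fps(60, 0.05)   # roughly 114 fps out
print(heavy, light)
```

The point of the toy model is that both techniques can show "double" framerates while delivering different output, depending on how much base framerate the generation step eats.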
 
DF just looks like the marketing arm of certain companies/products at this point: extremely biased toward some. If they did change their stance on this, they should be transparent about it; they should be making a video about how they were wrong about DLSS3 and how they changed their minds, but I ain't seeing it.
They changed their opinion about how best to refer to the extra frames generated by DLSS 3. This was explained in the 4090 review video. They did not change their opinion about how DLSS 3.0 works, or (as far as I am aware) how effective it is. They do not seem to be any less enthusiastic about frame generation than they were before and this also applies to FSR 3.0.
 
They changed their opinion about how best to refer to the extra frames generated by DLSS 3. This was explained in the 4090 review video. They did not change their opinion about how DLSS 3.0 works, or (as far as I am aware) how effective it is. They do not seem to be any less enthusiastic about frame generation than they were before and this also applies to FSR 3.0.
I think a DLSS3 review video would be much more popular than a 4090 review; they are just dishonest.
 
DF just looks like the marketing arm of certain companies/products at this point: extremely biased toward some. If they did change their stance on this, they should be transparent about it; they should be making a video about how they were wrong about DLSS3 and how they changed their minds, but I ain't seeing it.
Really, an entire video just to say, "so we're just gonna say from now on that frame generation isn't exactly a performance increase"? Is that what would have appeased you? Months ago, Rich said the exact same thing for DLSS3, stating that you cannot exactly call it a performance boost. In this video he has the exact same stance for both technologies.

And to top it all off, FSR3 is useless more than 80% of the time and can actually make the game feel more jittery, not smoother, so there's no way you can call it a performance increase overall in its current state. DLSS3 is black and white: the frame rate and smoothness pretty much always increase, no exception. If anything, Rich knocked DLSS3 down a peg to put it on par with FSR3.

The blind DF hatred on this board borders on lunacy at times.
 
I think a DLSS3 review video would be much more popular than a 4090 review; they are just dishonest.
Technically there wasn't a DLSS 3.0 performance review. There was a "first look" video showing DLSS 2.0 and 3.0 running together on the 4090, with no frame rate data and only a couple of benchmarks with frame generation specific performance. Then there was a 4090 review which had actual frame generation benchmarks. The first look video could not be effectively used to evaluate DLSS 3.0 due to the lack of frame rate data and FG specific benchmarks. It was, I agree, a marketing video. When DF provided benchmark data, they then gave the caveat that it could not be considered "real" performance.

There was a separate video made by Alex giving a breakdown of all the ways in which interpolated frames differed from "real" frames. There was also an extensive exploration of latency in different scenarios. I don't see how you could watch these videos and be misinformed about what DLSS 3.0 was doing. Even the "first look" video explained the latency implications of the technology.
 
Really, an entire video just to say, "so we're just gonna say from now on that frame generation isn't exactly a performance increase"? Is that what would have appeased you? Months ago, Rich said the exact same thing for DLSS3, stating that you cannot exactly call it a performance boost. In this video he has the exact same stance for both technologies.

And to top it all off, FSR3 is useless more than 80% of the time and can actually make the game feel more jittery, not smoother, so there's no way you can call it a performance increase overall in its current state. DLSS3 is black and white: the frame rate and smoothness pretty much always increase, no exception. If anything, Rich knocked DLSS3 down a peg to put it on par with FSR3.

The blind DF hatred on this board borders on lunacy at times.

I don't hate DF, I just see it for what it is: marketing. I'm just indifferent to them.

If they really changed their stance, they should have said so in the FSR3 video; they should say that they also don't care anymore about DLSS3, but for some reason they didn't.
 
I don't hate DF, I just see it for what it is: marketing. I'm just indifferent to them.

If they really changed their stance, they should have said so in the FSR3 video; they should say that they also don't care anymore about DLSS3, but for some reason they didn't.
They didn't change their stance. They said it couldn't exactly be called a performance boost, and they maintain that in this video. And what do you mean, "don't care about DLSS3 anymore"? They have a positive opinion of FSR3 and are looking forward to seeing the technology evolve and improve. They very much still care about frame generation.

This is why you shouldn't bother commenting on a stupid Twitter video slapped together in 5 seconds to generate outrage.
 
Hilarious 😅


I was turning Frame Gen on by default, but learning more makes me think it's less useful.

They need fake artificially reduced latency now lmao.


Makes sense why it was paired with Reflex.

A better, AI enhanced Reflex maybe?

*Edit* After reading more from you guys, I just don't know, but, I do want better performance at the end of the day. 120fps low latency ain't enough yet.
 
They said the PS4 was the best way to play Game X, but clearly the Xbox One X version of that game runs and looks better.
Fucking double standards!!!!!!!!
That's the worst analogy I've heard, because that's not an opinion change; that's just technological advancement while maintaining the consistent position that graphics make somewhere the best place to play. An opinion change would be suggesting graphics are very important during the PS4 era, then, when the X1X comes out, suggesting they're not that important and the best place is actually the PS4 Pro. That's an opinion change.
 
That's the worst analogy I've heard, because that's not an opinion change; that's just technological advancement while maintaining the consistent position that graphics make somewhere the best place to play. An opinion change would be suggesting graphics are very important during the PS4 era, then, when the X1X comes out, suggesting they're not that important and the best place is actually the PS4 Pro. That's an opinion change.
And I take it people aren't allowed to have a change of opinion?
Considering it was Alex who harped on in that one video about DLSS3 being extra performance, and Rich who says it's not extra performance as such,
couldn't it just be two differing opinions from different people within DF?
If you watch their DF Directs, they don't always agree on everything.
 
I mean, c'mon... Digital Foundry characterizes themselves as tech experts, but their content is primarily used by fanboys for validation and attack.

I have seen their content over the years. As a curiosity it is interesting and somewhat informative, but pragmatically speaking it is basically useless.

But because they are seen as the voice of authority, people put too much trust in what they say.
 
Doesn't FSR3 frame generation currently only work on 7000-series GPUs?
AMD Fluid Motion Frames only works on AMD 6000-series GPUs and up.

So isn't it hardware locked?

No. FSR3 works on AMD 5000–7000 series GPUs as well as Nvidia RTX 2000–4000 series.

Fluid Motion Frames was initially 7000 series only, but now includes the 6000 series.
 
I don't think DF's biased at all, but it is funny the difference in how they presented FSR3 versus DLSS3.

Opinions can change over time. We're all human.
 
I don't think DF's biased at all, but it is funny the difference in how they presented FSR3 versus DLSS3.

Opinions can change over time. We're all human.
In fairness, DLSS3 worked, and FSR3 seems fundamentally broken, at least on Richard's system.

The issues with VRR might be particular to his setup (or might be easier to detect because of the setup they are using), as I have seen other reviewers say that at least the VRR mode works, but it's hard to blame the guy for not being sunnier about a tech that is busted in every mode for him.
 
Bias is basically impossible to avoid, but I find it odd that people consider DF to have any sort of consistent bias. They do their best to offer well-reasoned critique of graphics technologies. Like, have you guys seen the rest of the media world? DF are fairly even-handed journalists.
 
That Twitter video is nothing but flame bait. The criticism that frame generation is not really additional performance, and has its own overhead, was targeted at both FSR3 and DLSS. Leadbetter noted that both Nvidia and AMD have their anti-latency implementations to address that overhead issue.
 
And I take it people aren't allowed to have a change of opinion?
Considering it was Alex who harped on in that one video about DLSS3 being extra performance, and Rich who says it's not extra performance as such,
couldn't it just be two differing opinions from different people within DF?
If you watch their DF Directs, they don't always agree on everything.
It was Rich calling it extra performance a year ago, not Alex. Though I'm sure Alex likely was very pro DLSS3 frame generation too.
 
In fairness, DLSS3 worked, and FSR3 seems fundamentally broken, at least on Richard's system.

The issues with VRR might be particular to his setup (or might be easier to detect because of the setup they are using), as I have seen other reviewers say that at least the VRR mode works, but it's hard to blame the guy for not being sunnier about a tech that is busted in every mode for him.

People saying that VRR works are most likely wrong or lying; I have not seen any proof of that.

My system also reports that VRR is working when it clearly isn't working correctly.
 