
Can Game Upscalers Look as Good as Native?



A super in-depth look at one particular comparison between all current upscaling methods. I was tempted to play the wedding music throughout this video to show you the torture I endured in the 4 days I spent making it, but I'm a kind person. Everything was tested at presets that attempt to upscale 360p to 720p. The zoomed-in shots therefore represent LESS than a 360p base resolution, since they only show a fraction of the screen; in the close-ups of the dancers, it's more like what 180p upscaled to 360p would look like.

0:00 - Making low res stuff high res
0:16 - 360p scaling VS 720p upscaling
2:40 - XeSS
4:23 - FSR 3
5:29 - FSR 4
8:42 - DLSS 3
9:26 - DLSS 4 and 4.5
11:55 - 720p upscaled VS 4K downsampled
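
The "close-ups are effectively 180p" point from the intro is just a ratio; a trivial sketch of the arithmetic (the helper name is made up for illustration, not from the video):

```python
def effective_base_res(base_height, crop_fraction):
    """If a zoomed crop shows only a fraction of the frame, that crop was
    reconstructed from proportionally fewer source rows. Hypothetical
    helper, just illustrating the ratio described in the post above."""
    return base_height * crop_fraction

# 360p -> 720p upscale, then zoom into half the screen height:
# the close-up is effectively 180p source material.
print(effective_base_res(360, 0.5))  # -> 180.0
```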

The 4k downsample comparison is really impressive.
 
Objectively speaking, the obvious answer is NO.
Upscaling adds noise and makes the native image look worse in order to fill out a larger display.
 
People overrate native.

Native usually means using TAA, which is nasty in its own way.

Playing Days Gone at 4K native right now, and I would kill to have DLSS 4.5 in this one...
 
Maybe not exactly, but it will end up being good enough that it's not noticeable to anyone other than Digital Foundry nerds who count pixels on zoomed, paused frames.
 
I guess native means native without anything else applied to blend pixel edges.

Those AI-based upscalers combine supersampling-style downscaling with upscaling.
 
Yes it can. We've known DLSS Quality can sometimes beat native for a while, and the tech's only gotten much better since then.
 
Of course it can. We've known this for years.
Really depends on context.
If we're talking about modern day native with bad TAA, then I guess yeah, a good upscaler could potentially look better if it does a better job than TAA.
However, if we go back 10 to 15 years, and we're talking about native with MSAA (or even better, SGSSAA), then no, an upscaler wouldn't do any better. The image clarity of SGSSAA basically looked like CGI; it was insane. I miss the era of MSAA; we had such sharp and clean games.
 
Really depends on context.
If we're talking about modern day native with bad TAA, then I guess yeah, a good upscaler could potentially look better if it does a better job than TAA.
However, if we go back 10 to 15 years, and we're talking about native with MSAA (or even better, SGSSAA), then no, an upscaler wouldn't do any better. The image clarity of SGSSAA basically looked like CGI; it was insane. I miss the era of MSAA; we had such sharp and clean games.

Lol so you are saying that MSAA is better than DLSS?

Hahaha 🤣
 
Yes? Not sure what's so funny about that.

Man, DLSS obliterates MSAA; you can't even get rid of specular aliasing with MSAA. Take GTA5 Legacy Edition, for example: it has MSAA, and even at 4K with 8x MSAA the game still has jaggies everywhere. There is no fucking way to remove them, and the performance cost is astronomical. You can't be serious.

Let me just guess: you're a FuckTAA user on Reddit. Man, that subreddit is a fucking cult.
 
Since we live in the era of AI, let's ask AI...
Game upscalers, especially modern AI-powered ones like NVIDIA's DLSS and AMD's FSR, get incredibly close to native resolution quality, often delivering comparable or even better perceived image quality and much higher frame rates, but for purists demanding absolute fidelity, native resolution still holds a slight edge, though the gap is narrowing significantly. While some blur or minor artifacts can occur with upscaling (especially older methods), advanced techniques create such convincing detail that native resolution often feels unnecessary for most gamers seeking performance and quality.
The short answer is - No, but close enough to be invisible to most people's eyes.
 
The problem with this comparison is that the native image is often using some shitty TAA implementation.
So of course a modern AI-based temporal upscaler will look better than a basic, crappy TAA rendered at native resolution.
 
The answer is yes, and anyone saying no is just being pedantic...or loves aliasing.

Even at 4K you'll get aliasing, and most modern AA solutions in game engines suck compared to DLSS at Quality or even Balanced now. Once you get into Performance territory, or pre-FSR4 on AMD, the obvious limitations present themselves.

The only games I've seen come up with a good enough custom solution, layered on top of native, that looks fantastic are the Horizon series. Maybe a few others I'm missing, but they're exceptions.
 
Since we live in the era of AI, let's ask AI...

The short answer is - No, but close enough to be invisible to most people's eyes.

[attached screenshots: DLSS vs native comparisons]
ChatGPT needs some updates to its database, apparently. And these examples are from early versions of DLSS, 2.2 I believe.

Imagine 4.5 now. Anyone denying this evidence is being dishonest.
 
DLAA transformer model at 1080p still looks vastly better than the scaling/reconstruction on console versions that have a higher base internal resolution.

The DLSS part relies on a higher starting point to look decent. Stuff like hair can look a bit shit if the internal resolution is around 720p, even with the Transformer K preset.

I chose a 5060 Ti 16GB over the cheaper and more powerful AMD option because FSR is still massively inferior. It's decent, but when you compare the two, it's very obvious just how much better Nvidia's method is.
 
ChatGPT needs some updates to its database, apparently. And these examples are from early versions of DLSS, 2.2 I believe.
Imagine 4.5 now. Anyone denying this evidence is being dishonest.
This is the normal development of technology. The human senses of sight, hearing, and smell, although miracles of nature, are quite easy to fool, especially if you know what to do, and technology has been doing this its entire life.
We've invented fragrances, artificial smells, and flavors. We've also invented music and sound compression, and much more. The same thing is happening here. It's essentially the same shift that happened with music when MP3s were first introduced and their quality was terrible. Over time, algorithms improved, new compression formats emerged, and so on, and little by little it became such a part of our lives that everyone got accustomed to it and it's no longer even discussed. Has pure, uncompressed sound disappeared? No; enthusiasts who listen to vinyl, CDs, or music in FLAC haven't disappeared. They're a minority, but they're still there. It will be exactly the same here: upscaling will become a permanent part of our lives over time, we'll get used to it and stop paying attention to it, but the enthusiasts won't go away.
 
The answer is yes, and anyone saying no is just being pedantic...or loves aliasing.
SSAA at the same internal resolution will always be superior, as it doesn't introduce artifacts. The cost, though, is astronomical.

No matter how good DLAA gets, it's still an approximation (an educated guess), and an approximation is bound to have errors.
 
SSAA at the same internal resolution will always be superior, as it doesn't introduce artifacts. The cost, though, is astronomical.

No matter how good DLAA gets, it's still an approximation (an educated guess), and an approximation is bound to have errors.

The problem with SSAA is that it doesn't get rid of jaggies, even at high resolutions
 
The problem with SSAA is that it doesn't get rid of jaggies, even at high resolutions
It means the internal resolution is not high enough.
At 1.5x you'll have jaggies, since not every pixel is supersampled; at 2x it's unlikely, though maybe in certain cases I can't think of; at 4x there should be no jaggies, as the averaging is too strong.
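
The averaging being argued about here is just a box-filter resolve; a minimal sketch of ordered-grid SSAA (a toy example, not any game's actual resolve pass):

```python
import numpy as np

def ssaa_downsample(img, factor):
    """Box-filter resolve: average each factor x factor block of the
    supersampled render into one output pixel (ordered-grid SSAA)."""
    h, w = img.shape[:2]
    blocks = img.reshape(h // factor, factor, w // factor, factor, -1)
    return blocks.mean(axis=(1, 3))

# A hard vertical edge rendered at 4x the target resolution: the output
# pixel straddling the edge averages covered and uncovered samples,
# turning a hard jaggy into a partial-coverage grey.
hi = np.zeros((8, 8, 1))
hi[:, 3:] = 1.0                 # edge falls inside the first 4x4 block
lo = ssaa_downsample(hi, 4)
print(lo[..., 0])               # [[0.25 1.  ] [0.25 1.  ]]
```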
 
It means the internal resolution is not high enough.
At 1.5x you'll have jaggies, since not every pixel is supersampled; at 2x it's unlikely, though maybe in certain cases I can't think of; at 4x there should be no jaggies, as the averaging is too strong.

Wrong. Yakuza 6 uses SSAA, for example. Even at 4K with SSAA at max, the game still has jaggies, and the performance drop is absolutely disgusting. SSAA does not resolve specular aliasing because it provides spatial anti-aliasing, not temporal, so you will still see jaggies in motion. DLSS provides spatial plus temporal, and it improves performance on top of that, which means the image quality stays pristine even in motion, at much higher framerates.

This is fake news, DLAA and DLSS are vastly superior to SSAA and MSAA

The only AA that's comparable to DLSS is SGSSAA, and even then, DLSS will have better results in most cases because its detail reconstruction is more effective. On top of that, Nvidia dropped support for SGSSAA, and it's now an ancient technology only available for DX9 games.
 
Wrong. Yakuza 6 uses SSAA, for example. Even at 4K with SSAA at max, the game still has jaggies, and the performance drop is absolutely disgusting. SSAA does not resolve specular aliasing because it provides spatial anti-aliasing, not temporal, so you will still see jaggies in motion. DLSS provides spatial plus temporal, and it improves performance on top of that, which means the image quality stays pristine even in motion, at much higher framerates.

This is fake news, DLAA and DLSS are vastly superior to SSAA and MSAA

The only AA that's comparable to DLSS is SGSSAA, and even then, DLSS will have better results in most cases because its detail reconstruction is more effective. On top of that, Nvidia dropped support for SGSSAA, and it's now an ancient technology only available for DX9 games.
SGSSAA is a simplified (optimized) form of SS.
There is no "temporal aliasing", that's bullshit; even DLSS and TAA use temporal data to resolve spatial aliasing.
SS as a method gives the best picture quality by design; other methods can't touch it. No matter how many video frames you take or how smart the algorithms you apply, a 1080p camera will not replicate a photo from an 8K camera.

DLSS is superior in the sense that 3x-4x SS has a completely inefficient performance-to-quality conversion; it's just too costly. But DLSS will always be inferior in pure quality.
 
DLSS is amazing, but I can tell the difference at 4K even with DLSS 4.5. I don't know what to compare to as "native", since there is presumably always some AA involved, but if we do 8x MSAA at native, I don't think DLSS Quality beats it, at least not in something like Forza Horizon 5 (although I haven't tried 4.5 in that one). If we're comparing DLAA vs DLSS Quality, DLAA is slightly sharper and gives a more stable, clearer image in motion, which is why screenshots don't tell the full story. Having said that, I would happily play any game at Balanced or Performance nowadays. This tech is magic.
 
SGSSAA is a simplified (optimized) form of SS.
There is no "temporal aliasing", that's bullshit; even DLSS and TAA use temporal data to resolve spatial aliasing.
SS as a method gives the best picture quality by design; other methods can't touch it. No matter how many video frames you take or how smart the algorithms you apply, a 1080p camera will not replicate a photo from an 8K camera.

DLSS is superior in the sense that 3x-4x SS has a completely inefficient performance-to-quality conversion; it's just too costly. But DLSS will always be inferior in pure quality.

Pure image quality does not automatically mean a clean image. As I've said before, SSAA does not handle aliasing the way DLSS does; what part of this don't you understand, man?

That's a fundamental difference. SSAA increases resolution, but it does not effectively solve temporal aliasing, shimmering, or crawling edges in motion, issues that are extremely noticeable in real gameplay.

In games, image quality is not judged on still frames, games are not movies. Motion matters. What's the point of "pure raw quality" if the image becomes a jagged, unstable mess once the camera moves? A good solution must be balanced: clean edges, stable motion, and acceptable performance. DLSS achieves that balance far better, especially in modern games with extreme geometric complexity and high detailed assets, where temporal reconstruction is practically required for both visual stability and performance.
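
The temporal-reconstruction idea referenced above, reduced to its core, is a running history blend. A toy sketch (real TAA/DLSS additionally reprojects history with motion vectors and clamps/rejects stale samples; this only shows the accumulation step):

```python
def temporal_accumulate(samples, alpha=0.1):
    """Toy temporal accumulation: exponentially blend each new jittered
    sample into a running history value for one pixel. Real TAA/DLSS
    also reprojects the history with motion vectors and clamps it."""
    history = float(samples[0])
    for s in samples[1:]:
        history = (1.0 - alpha) * history + alpha * float(s)
    return history

# A shimmering edge pixel that flips 0/1 every frame under sub-pixel
# jitter: accumulation settles near the true ~0.5 coverage instead of
# flickering, which is why motion stability improves.
flicker = [i % 2 for i in range(200)]
print(round(temporal_accumulate(flicker), 2))  # close to 0.5
```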

You're also avoiding the performance discussion, which is a crucial part of this topic. Most players cannot afford SSAA or MSAA; the performance cost is enormous, and the return is poor when it comes to anti-aliasing quality. Good AA is essential in games, and SSAA simply isn't an efficient or practical solution anymore.

There's a reason why DLSS is present in almost every modern title, while developers have abandoned SSAA and largely moved away from MSAA. The industry has shifted because these older techniques don't scale well with today's rendering demands.
 
DLSS has been better than native for some time now. I think DLSS 3.5 at DLAA or Quality was already better than native, and now on 4.5 even Performance is pretty much there.

Only an idiot would stick to native, blindly believing it's better. Usually anti-AI Reddit mod types.
 
It looks good, but there are effects that scale with resolution, like reflections, Lumen, RT, fog, etc., so I wouldn't go that low (Performance/Ultra Performance).
 
Pure image quality does not automatically mean a clean image. As I've said before, SSAA does not handle aliasing the way DLSS does; what part of this don't you understand, man?

That's a fundamental difference. SSAA increases resolution, but it does not effectively solve temporal aliasing, shimmering, or crawling edges in motion, issues that are extremely noticeable in real gameplay.

In games, image quality is not judged on still frames, games are not movies. Motion matters. What's the point of "pure raw quality" if the image becomes a jagged, unstable mess once the camera moves? A good solution must be balanced: clean edges, stable motion, and acceptable performance. DLSS achieves that balance far better, especially in modern games with extreme geometric complexity and high detailed assets, where temporal reconstruction is practically required for both visual stability and performance.
There is no temporal aliasing.
There is inconsistency in the aliasing error between frames, but it drops rapidly with supersampling size.
At low sample counts, AA may lack the depth at which anti-aliasing works perfectly and edges bleed out, but at 4x there shouldn't be any such cases anymore; the averaging is too strong (though 4x resolution has an insane cost).

You're also avoiding the performance discussion, which is a crucial part of this topic. Most players cannot afford SSAA or MSAA; the performance cost is enormous, and the return is poor when it comes to anti-aliasing quality. Good AA is essential in games, and SSAA simply isn't an efficient or practical solution anymore.
What did you not understand in this: "DLSS is superior in the sense that 3x-4x SS has a completely inefficient performance-to-quality conversion; it's just too costly"?
The thread title explicitly asks about looks, not efficiency or rationality of usage. SS is not efficient and not rational to use, but it is capable of looking better.

You should really keep your emotions in check. In my original post I pointed out that, strictly speaking, DLSS/DLAA is a tradeoff (performance for a minor drop in quality) and that the summit of quality is SS (not that it's usable in practical scenarios). It's not about real-world scenarios, perception, or usability at all. It's just an assessment of the maximum each tech can achieve.
 
There is no temporal aliasing.
There is inconsistency in the aliasing error between frames, but it drops rapidly with supersampling size.
At low sample counts, AA may lack the depth at which anti-aliasing works perfectly and edges bleed out, but at 4x there shouldn't be any such cases anymore; the averaging is too strong (though 4x resolution has an insane cost).


What did you not understand in this: "DLSS is superior in the sense that 3x-4x SS has a completely inefficient performance-to-quality conversion; it's just too costly"?
The thread title explicitly asks about looks, not efficiency or rationality of usage. SS is not efficient and not rational to use, but it is capable of looking better.

You should really keep your emotions in check. In my original post I pointed out that, strictly speaking, DLSS/DLAA is a tradeoff (performance for a minor drop in quality) and that the summit of quality is SS (not that it's usable in practical scenarios). It's not about real-world scenarios, perception, or usability at all. It's just an assessment of the maximum each tech can achieve.

I don't need to keep my emotions in check; they're fine.

You are lying about SSAA having good anti-aliasing; it's absolutely not true, and I've given you the Yakuza 6 example. SSAA is inefficient, and I say this having a lot of experience with different anti-aliasing techniques, because I can't stand aliasing in games.

I don't want to continue this conversation with you because you are being evasive and dishonest.
 
You are lying about SSAA having good anti-aliasing; it's absolutely not true, and I've given you the Yakuza 6 example. SSAA is inefficient, and I say this having a lot of experience with different anti-aliasing techniques, because I can't stand aliasing in games.
Yakuza has SS at a maximum of 2.0. That's the bare minimum needed to do more than spatial anti-aliasing (1.5 is bullshit, since every other pixel won't be anti-aliased). And of course the aliasing error won't be stable at such numbers, leading to temporal artifacts.
And I'm talking about SS at 4x, which Yakuza obviously doesn't have. But 4x is basically unreachable for games, as it's too costly.

You may have "experience", but you seem to understand nothing about the fundamentals (and I've followed AA techniques for a long time), which leads to narrow vision and biases born from the imperfection of practical implementations.

I don't want to continue this conversation with you because you are being evasive and dishonest.
You just ignore half of what I write and even throw in stupid accusations that are completely untrue, just because reality is not to your liking.
 
In some parts of an image they can, given modern AA vs old MSAA, but I always notice errors in any of the games I use it on. I'd say that in most games, the minor details I notice are worth the trade-off for increased performance.
 
SSAA at the same internal resolution will always be superior, as it doesn't introduce artifacts. The cost, though, is astronomical.

No matter how good DLAA gets, it's still an approximation (an educated guess), and an approximation is bound to have errors.
Sure, there are advantages to SSAA, or to various other supersampling techniques, if performance no longer matters. It just doesn't get added to modern games much, because of performance.

I've never compared the two before, but a tiny few games have used Nvidia's SGSSAA (Sparse Grid Supersample Anti-Aliasing), which I think goes even further because it combines techniques (MSAA plus supersampling that handles transparent objects really well) to get the cleanest image I've ever seen.

The only game where it didn't destroy performance was The Legend of Heroes: Trails Through Daybreak, where Durante's porting house implemented it on the PC version, because the game otherwise doesn't take much to run. Still cut my FPS in half though :LOL:
 
People finally realizing native 4K isn't enough



SSAA at the same internal resolution will always be superior, as it doesn't introduce artifacts.
You're waffle-stomping multiple pixels down into the space of one pixel, which has its own imperfections.
DL-based AA also has tricks SSAA doesn't do, and DL-based AA will likely only get better over time.

The problem with SSAA is that it doesn't get rid of jaggies, even at high resolutions
Nothing gets rid of jaggies, only reduces them.
SSAA can help a ton with jaggies, though, plus it brings other improvements like texture detail.
SSAA basically saved my life during the FXAA/SMAA/TAA days.
 
The problem with this comparison is that the native image is often using some shitty TAA implementation.
So of course a modern AI-based temporal upscaler will look better than a basic, crappy TAA rendered at native resolution.

In the end he compared it to downsampled "native" footage (which would eliminate some of the TAA problems), and that L preset was still pretty much on par (or close to it).

Yeah, many games have really poor TAA implementations; if we change the definition of "native" to DLAA or native AA (FSR4), then I doubt any upscaling could beat that.
 
In the end he compared it to downsampled "native" footage (which would eliminate some of the TAA problems), and that L preset was still pretty much on par (or close to it).

Yeah, many games have really poor TAA implementations; if we change the definition of "native" to DLAA or native AA (FSR4), then I doubt any upscaling could beat that.
He says he's using SMAA in the video
 