Cyberpunk 2077 on Switch 2 is using DLSS (confirmed by Digital Foundry)

Eh...."best handheld" relative to its power level isn't a given and even if it were the case at launch, I seriously doubt it would maintain that crown through the end of this year with the next round coming fairly soon. But "best" in a general sense is obviously a valid personal opinion.
Best is a little subjective. If we're talking in terms of power, a couple of handhelds have it beat already, though their price is significantly higher. But the form factor and the seamless handheld-to-docked transition are a pretty big boon. Idk if anyone else has had the same experience I have with my Steam Deck when docking it, but it is extremely finicky to get it to work correctly, to the point where I've kind of just given up playing it docked.
 
Imagine what a console could do compared to the Switch 2's 10W envelope, and the crazy resolutions 230W could bring

[DF screenshot: Jedi Survivor on base PS5, 648p internal]


648p

Wow much power

Inconsequential post.

You wouldn't want to know what native resolution Half-Life 2 RTX runs at when using DLSS Performance upscaling to 4K on an RTX 5090 (~600W).

It's actually 1080p

It's about how much load the game puts on the system overall; resolution isn't the only definitive metric for measuring actual game demand. And before you say my first comment was being negative towards CP2077's resolution range on Switch 2: I'm being negative because I can imagine how blurry it'll look. That's to be expected, it's a 10W machine (I think when in handheld mode? 720p-360p DLSS 💀). I was only stating my opinion, no need to be defensive, they're just video games. 😭
 
No? So now we're going to say lightweight games that run at 60 FPS consistently are optimized? What do you mean by this post? It's common knowledge that Nintendo's games, in general, are pretty lightweight compared to most other third-party titles available on other platforms, or even on the Switch 2 itself. That also doesn't tell us whether Nintendo's games are optimized or not. Cyberpunk 2077 is heavier than all of Nintendo's efforts; that's why we're not seeing it running anywhere near 60 FPS on Switch 2. That doesn't mean it's not optimized, it means it's so heavy that the Switch 2 couldn't run it at 60 FPS. Pretty simple stuff!
Part of being optimized is having the ability to edit your game and not add shit for the sake of adding it: to say "I need this to run at 60 FPS, here are the cuts needed."

I doubt we can have a serious conversation when you think a game like Cyberpunk is optimized. As if there's no history there. And you also keep saying how blurry CP will look, as if we don't have actual footage of it. How fun it is to be detached from reality.
 
Docked?

720p internal resolution using Ultra Performance DLSS to output 4k would be the lowest I'd expect.
Not even the PS5 runs the game at 4k, and that's way more powerful than a Switch 2 was ever going to be, even taking DLSS into account.
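(Quick sanity check on that 720p figure, using the commonly published DLSS per-axis scale ratios; this is a rough sketch assuming the standard presets, not anything confirmed for the Switch 2 build of the game.)

```python
# Rough sketch, not from any official Switch 2 spec: internal render resolution
# implied by the commonly cited DLSS per-axis scale factors.
DLSS_SCALE = {
    "Quality": 2 / 3,          # ~0.667 per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal (pre-upscale) resolution for a given output resolution and DLSS mode."""
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

print(internal_res(3840, 2160, "Ultra Performance"))  # (1280, 720) -> the 720p figure above
print(internal_res(1920, 1080, "Performance"))        # (960, 540)  -> the 540p mentioned later in the thread
```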
 
Best is a little subjective. If we're talking in terms of power, a couple of handhelds have it beat already, though their price is significantly higher. But the form factor and the seamless handheld-to-docked transition are a pretty big boon. Idk if anyone else has had the same experience I have with my Steam Deck when docking it, but it is extremely finicky to get it to work correctly, to the point where I've kind of just given up playing it docked.
Annnd ports that aren't optimized for them from the ground up like they are on Switch 2.
 
Not even the PS5 runs the game at 4k, and that's way more powerful than a Switch 2 was ever going to be, even taking DLSS into account.

What part about 720p internal resolution did you not understand?

Even the PS5 is running the game at 600p internally sometimes… manage your expectations.

The difference in quality between the PS5 and Switch 2 is VAST. The PS5 is running at significantly higher settings than the Switch 2. The Steam Deck can run Cyberpunk at potato settings higher than 600p.

One thing doesn't have anything to do with the other.
 
Best is a little subjective. If we're talking in term of power a couple handhelds have it beat already - though their price is significantly higher. But the form factor and how seamless the handled to docked is a pretty big boon. Idk if anyone else had the same experience I have with my Steam Deck when docking it but it is extremely finicky to get it to work correctly - to the point where I've kind of just given up playing it docked.

Eh.....I've never had any issues docking handhelds. They've all worked fine for me.
 
What part about 720p internal resolution did you not understand?



The difference in quality between the PS5 and Switch 2 is VAST. The PS5 is running at significantly higher settings than the Switch 2. The Steam Deck can run Cyberpunk at potato settings higher than 600p.

One thing doesn't have anything to do with the other.
[image attachments]
 
Gonna be funny to see the usual suspects who downplay the maxed out PC versions say that there's a generational difference between the SX/PS5 and Switch versions.
 
My post about Switch 2's disappointing power output got nuked :(

I will repeat, then: that meager-ass 7W TDP in handheld mode is a huge bottleneck and will severely limit native resolutions, upscaling techniques and frame rates. It is so disappointing of Nintendo to let such potential go to complete waste. Further salt in the wound is that laughable 19Wh battery. The SD OLED has 50Wh and the Ally X has 80Wh, and the latter isn't even that much larger than the Switch 2. Nintendo could have had docked performance in handheld mode had they opted for a 15W TDP with a 45Wh battery and decent fans. But of course that would mean replacing those gimmicky Joy-Cons with actual controllers, and they would spend $10 more on manufacturing, so no can do from Nintendo.

It doesn't surprise me that CP2077 can go as low as 360p despite a modest 40Hz cap. Anyone with experience with handheld PCs will tell you that playing AAA games at below 15W is not feasible. And while Nvidia may be more efficient than AMD, don't forget this 8nm Ampere chip is years old by now. Not even Nvidia can escape the laws of power consumption. What we see of Switch 2 in docked mode should have been its handheld performance.
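To put those battery figures in perspective, here's a back-of-envelope runtime estimate using the capacities and power levels quoted above. It's a sketch only: it optimistically treats the quoted wattage as total system draw, while in reality the display, RAM and wireless add several watts on top, so actual battery life lands lower.

```python
# Naive runtime estimate: battery capacity (Wh) divided by an assumed system draw (W).
# The draw figures are rough assumptions based on the numbers quoted in the post above.
def runtime_hours(battery_wh: float, draw_w: float) -> float:
    return battery_wh / draw_w

configs = {
    "Switch 2 handheld (19 Wh @ ~10 W)":  (19, 10),
    "Steam Deck OLED (50 Wh @ ~15 W)":    (50, 15),
    "Hypothetical 45 Wh pack @ 15 W TDP": (45, 15),
}
for name, (wh, w) in configs.items():
    print(f"{name}: ~{runtime_hours(wh, w):.1f} h")
# -> ~1.9 h, ~3.3 h and ~3.0 h respectively
```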
 
Yeah, I think a lot of people got sucked into the hype cycle and overestimated the machine big time, basically desperately positioning it almost as a peer to the Series S. It's basically PS4.2-tier in graphics output with a modern feature set, capable of improving upon the PS4 in some areas and in rare cases lagging behind in others. While overall it's superior to the PS4, the PS4 is still its closest benchmark.
 
So the most demanding game seen so far on Switch 2 has the leeway to run DLSS, against all DF's predictions.


Well, the chip is very small and DLSS needs a lot of power (relative to a handheld chip), so expect a dumbed-down version of DLSS.
Also, if 1080p is the target res, what will the internal res be? DLSS from below a 1080p internal res is really not that great to watch.

I would keep my expectations really low.
 
Docked?

720p internal resolution using Ultra Performance DLSS to output 4k would be the lowest I'd expect.
The low number of tensor cores in the Switch 2 means upscaling to 4k with DLSS will have a large impact on performance. Which is why we're going to see upscaling from 720p to 1080p with DLSS a lot more than 720p to 4k. Switch 2 may just not have the AI Processing power for upscaling to 4k with DLSS in most situations, even 1440p. DLSS use on Switch 2 is going to be seen a lot more for upscaling to a 1080p target.

Digital Foundry talked about this a while ago using an RTX 2050 laptop. Upscaling from 720p to 1080p with DLSS was a 12% reduction in performance; upscaling from 720p to 4K was a massive 47% drop.

[DF chart: RTX 2050 frame rates, native 720p vs. DLSS upscaling to 1080p, 1440p and 2160p]


"The DLSS conundrum. All readings here are derived from native 720p, then DLSS upscaling is added to hit 1080p, 1440p and 2160p resolutions. DLSS isn't a 'free lunch' and the Tensor cores in the GPU alone can only do so much. However, if T239 includes T234's Deep Learning Accelerator, that could drastically reduce DLSS's overhead."

The Switch 2 does not have the Deep Learning Accelerator. - "We've since confirmed that Switch 2's T239 chip has no deep learning accelerator - DLSS processes on the GPU tensor cores only"

So the Switch 2 just isn't powerful enough to upscale from 720p to 4k with DLSS without a huge performance drop.
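To make those percentages concrete, here's a trivial sketch of what the DF-measured overheads would mean in frame-rate terms. The 40 fps baseline is purely hypothetical, and the 12%/47% figures come from DF's RTX 2050 laptop test above, not from Switch 2 hardware.

```python
# Frame rate after DLSS overhead, measured against a native-720p baseline
# (as in DF's RTX 2050 test). The baseline value here is a made-up example.
def fps_after_dlss(native_720p_fps: float, perf_drop: float) -> float:
    return native_720p_fps * (1.0 - perf_drop)

base_fps = 40.0  # hypothetical native-720p frame rate
print(f"720p -> 1080p DLSS (12% drop): ~{fps_after_dlss(base_fps, 0.12):.1f} fps")  # ~35.2
print(f"720p -> 4K DLSS    (47% drop): ~{fps_after_dlss(base_fps, 0.47):.1f} fps")  # ~21.2
```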
 
I can't wait until more folks get hands-on with it and can give their direct comparisons vs. the Steam Deck.
So far, it looks like even though the SD can get better picture quality using XeSS vs. FSR2, the picture overall tends to be pixelated and grainy looking.
The Switch videos all have a crisper image quality and look more impressive.
 
As I said in another thread, it's going to be funny seeing the same people who had a problem with sub-1080p now defend upscales from 360p.
You're technically right, but once again, the new DLSS 4 model upscaling from 720p to 4K looks crisper than native 1080p; hell, it produces so few artifacts you could mistake it for native 1440p at times.
Not saying Switch 2 games are gonna look as crisp as native 1440p, obviously, but "upscaling from below 1080p" can mean 20 different internal resolutions multiplied by many different AI upscaling methods, and the resolution you upscale to makes a difference too.

720p to 1080p using FSR 1 will produce pretty terrible results, but 720p upscaled to 4K by DLSS 4 will produce very few artifacts and will be crisper/less blurry on top.
What I'm trying to say is there are many determining factors: not only the native resolution you upscale from, but the upscaling method and the final resolution. Hell, the same native-to-final resolution with the same method can still look "different" in two different games.

My plea here is: let's judge it by the actual final result, not just the native resolution the picture is upscaled from. And, very important, if we don't use video footage but screens, let's make sure those screens aren't "stills" but are taken in motion. AI upscaling usually falls apart in motion, where it shows all kinds of artifacts, so if we're ever comparing it to other upscaling, or even better to native res, we need to do it in motion. Those screen captures have to be of motion, not stills, if you guys know what I mean.

Here is exactly what I'm talking about, in a 4K YouTube video: DLSS 4 Ultra Performance, so upscaling from 720p to a final resolution of 4K, plus every other setting on ultra (including ray tracing, which is far from max, since RT can be set way above that, with Psycho and even path tracing, but that's for the top-of-the-line Nvidia cards only).


Now let's compare it to native 720p:
Ultra settings too (a GTX 1070 Ti, so obviously it can't do any ray tracing, so it's disabled), so the game looks good except for image quality, because that's a blurry mess with pixels the size of a grown man's fist :P

It makes sense though: the first vid is on a 3090, the second is on a 1070 Ti, and the first has all RT on ultra (again, not max, "just" ultra, which is still above current-gen console RT settings; CP2077 didn't get the Pro patch yet, so we're talking base PS5 and Xbox Series S/X only).
That's the difference between those GPUs' average performance: the 3090 is 269% (so +169%) of the 1070 Ti.

Here we've got the current-gen CP2077 patch for PS5/Xbox Series consoles, and that 720p upscaled to 4K with DLSS 4 still looks at least as crisp as native 1440p, if not crisper, somehow. It literally feels like magic; at least in this single example Nvidia/CDPR have, if not gotten rid of, then vastly minimised the artifacts you normally get when upscaling from a res as low as native 720p :)
 
You're technically right, but once again, the new DLSS 4 model upscaling from 720p to 4K looks crisper than native 1080p; hell, it produces so few artifacts you could mistake it for native 1440p at times.
Not saying Switch 2 games are gonna look as crisp as native 1440p, obviously, but "upscaling from below 1080p" can mean 20 different internal resolutions multiplied by many different AI upscaling methods, and the resolution you upscale to makes a difference too.

720p to 1080p using FSR 1 will produce pretty terrible results, but 720p upscaled to 4K by DLSS 4 will produce very few artifacts and will be crisper/less blurry on top.
What I'm trying to say is there are many determining factors: not only the native resolution you upscale from, but the upscaling method and the final resolution. Hell, the same native-to-final resolution with the same method can still look "different" in two different games.

My plea here is: let's judge it by the actual final result, not just the native resolution the picture is upscaled from. And, very important, if we don't use video footage but screens, let's make sure those screens aren't "stills" but are taken in motion. AI upscaling usually falls apart in motion, where it shows all kinds of artifacts, so if we're ever comparing it to other upscaling, or even better to native res, we need to do it in motion. Those screen captures have to be of motion, not stills, if you guys know what I mean.

Here is exactly what I'm talking about, in a 4K YouTube video: DLSS 4 Ultra Performance, so upscaling from 720p to a final resolution of 4K, plus every other setting on ultra (including ray tracing, which is far from max, since RT can be set way above that, with Psycho and even path tracing, but that's for the top-of-the-line Nvidia cards only).


Now let's compare it to native 720p:
Ultra settings too (a GTX 1070 Ti, so obviously it can't do any ray tracing, so it's disabled), so the game looks good except for image quality, because that's a blurry mess with pixels the size of a grown man's fist :P

It makes sense though: the first vid is on a 3090, the second is on a 1070 Ti, and the first has all RT on ultra (again, not max, "just" ultra, which is still above current-gen console RT settings; CP2077 didn't get the Pro patch yet, so we're talking base PS5 and Xbox Series S/X only).
That's the difference between those GPUs' average performance: the 3090 is 269% (so +169%) of the 1070 Ti.

Here we've got the current-gen CP2077 patch for PS5/Xbox Series consoles, and that 720p upscaled to 4K with DLSS 4 still looks at least as crisp as native 1440p, if not crisper, somehow. It literally feels like magic; at least in this single example Nvidia/CDPR have, if not gotten rid of, then vastly minimised the artifacts you normally get when upscaling from a res as low as native 720p :)

Oh, I agree with you that the final result is what matters, but there have been games with 860p upscaled to 4K, and people were making a big deal about the fact that the native res was 860p. It's just funny that we're now down to sub-SD at 360p and nobody is batting an eye.
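Since "sub-SD" keeps coming up, here's a quick pixel-count comparison of the resolutions being argued about in this thread, assuming 16:9 frames for the "p" figures (so the 648p and 860p widths are rounded) and 720x480 for SD:

```python
# Approximate pixel counts for the resolutions debated in this thread.
# 16:9 widths are rounded; SD is shown as 480p (720x480) for reference.
resolutions = {
    "360p (640x360)":     640 * 360,     # 230,400
    "SD 480p (720x480)":  720 * 480,     # 345,600
    "648p (1152x648)":    1152 * 648,    # 746,496
    "860p (~1529x860)":   1529 * 860,    # ~1,314,940
    "1080p (1920x1080)":  1920 * 1080,   # 2,073,600
}
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels")
```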
 
Imagine what a console could do compared to the Switch 2's 10W envelope, and the crazy resolutions 230W could bring

[DF screenshot: Jedi Survivor on base PS5, 648p internal]


648p

Wow much power

You're still posting this disingenuous old Jedi Survivor DF screenshot that you love to keep posting, huh? This was when Jedi Survivor had RT and wasn't yet patched; it runs at a higher res now. Good luck with RT on a damn Switch 2 running anywhere near that res too; have fun with 360p on that.
Old Jedi Survivor on base PS5 with FSR and RT. Good luck with RT on Switch 2. That's the pre-patch resolution he's using; after the patch that removed RT, the internal resolution was 1080p, then upscaled.
 
You're still posting this disingenuous old Jedi Survivor DF screenshot that you love to keep posting, huh? This was when Jedi Survivor had RT and wasn't yet patched; it runs at a higher res now. Good luck with RT on a damn Switch 2 running anywhere near that res too; have fun with 360p on that.

I know you have trouble following a chain of replies in a thread but to refresh your memory

[screenshot of the earlier exchange]


So PS2 resolution was fine because it had RT? :messenger_tears_of_joy:

This game of nonsense/no-context resolution bitching goes both ways. If you want to sit down and talk about how Cyberpunk 2077 with 1080p output in quality mode is fine on a 10W handheld even with DLSS, then we can talk like grown men, unlike the clown that got banned.
 
I know you have trouble following a chain of replies in a thread but to refresh your memory

[screenshot of the earlier exchange]


So PS2 resolution was fine because it had RT? :messenger_tears_of_joy:

This game of nonsense/no-context resolution bitching goes both ways. If you want to sit down and talk about how Cyberpunk 2077 with 1080p output in quality mode is fine on a 10W handheld even with DLSS, then we can talk like grown men, unlike the clown that got banned.
860p isn't PS2 resolution; most PS2 games were 480p. Switch is lower than that at 360p. And yes, because it had more effects: the pre-patch PS5 version of Jedi Survivor had a lower-bound DRS limit of 648p because of RT. So your continuous comparison to the PS5 Pro in these Switch threads is you trolling PS in unrelated Switch threads, as usual.
 
If people care about the resolution, they would choose the quality mode, no?
Yes, as they would on the PS5 equivalent games (for 60fps vs. the 40fps on Switch 2). It's just funny seeing people defend the 360p lower bound now when they were up in arms about 860p or 648p in the performance modes of other games.
I mean, look at this shit. Nobody even mentioned PS here, yet he was clowning as usual:

Who gives a shit if it's wasted on dealing with 540p being upscaled to 1080p?

As opposed to PS5 648p internal resolution with FSR pixel soup?

[Reddit screenshot: "i might realize this too late but 648p on PS5 what the fck"]


A lot better :messenger_tears_of_joy:
Now he's out here defending 360p (with the usual, now-incorrect Jedi Survivor res deflection, of course).
 
Yes, as they would on the PS5 equivalent games. It's just funny seeing people defend the 360p lower bound now when they were up in arms about 860p or 648p in the performance modes of other games.

Since when are people really looking at internal res over AI upscalers? Is this your new battlefront? :messenger_tears_of_joy:

Funny seeing you keep quoting the 360p internal res of the performance mode when it's also confirmed that the quality mode is 1080p in handheld, likely with dynamic DLSS too. I know your job is to keep the defense line while your buddies James and Sawtooth are gone, but it's quite apparent what you're doing.

Not to mention that 540p → 1080p DLSS looks miles better than any FSR pixel soup we saw on base consoles.
 
Since when are people really looking at internal res over AI upscalers? Is this your new battlefront? :messenger_tears_of_joy:
Funny seeing you keep quoting the 360p internal res of the performance mode when it's also confirmed that the quality mode is 1080p in handheld, likely with dynamic DLSS too. I know your job is to keep the defense line while your buddies James and Sawtooth are gone, but it's quite apparent what you're doing.
Not to mention that 540p → 1080p DLSS looks miles better than any FSR pixel soup we saw on base consoles.
You were continuously quoting 648p internal res being upscaled in a 60fps performance mode and laughing. Now you have a problem with people quoting 360p on a 40fps mode? Yeah, I wonder why. That was your battlefront not mine.
 
You were continuously quoting 648p internal res being upscaled in a 60fps performance mode and laughing. Now you have a problem with people quoting 360p on a 40fps mode? Yeah, I wonder why. That was your battlefront not mine.

Continuously quoting? Am I living rent-free in your head, Three? When I bring up this receipt, it's for people like the banned clown who war over a resolution, acting all high and mighty about their platform of choice's resolutions and shitting on others from a great height. I see you've detailed all the reasons why it could not achieve 4K :)

I don't have a problem with people quoting a 720p 40fps mode or a 1080p 30fps mode on a 10W handheld, because here at least we know DLSS delivers with its upscaling, unlike FSR pixel soup.

OK, so not PS2 resolution for 648p; PS3 then 🤷‍♂️
 
Continuously quoting? Am I living rent-free in your head, Three? When I bring up this receipt, it's for people like the banned clown who war over a resolution, acting all high and mighty about their platform of choice's resolutions and shitting on others from a great height. I see you've detailed all the reasons why it could not achieve 4K :)
Yes, continuously quoting. Or did you miss the time you brought it up when coffinbirth (who is neither banned nor a PS fan warring) brought up the fact that the internal res for Switch is likely something low, and you warred with the same shit and outdated war fodder regarding Jedi Survivor? I even quoted it for you above; it's not hard to see.
 
Yes, continuously quoting. Or did you miss the time you brought it up when coffinbirth (who is neither banned nor a PS fan warring) brought up the fact that the internal res for Switch is likely something low, and you warred with the same shit and outdated war fodder regarding Jedi Survivor? I even quoted it for you above; it's not hard to see.

"Continuously quoting," yet you find one time I did.



230W, dude, 230W. And it's still FSR pixel soup. My point still stands. If it were updated to PSSR, it would already be better. But FSR 3 is raw dogshit at those internal resolutions.

We have the footage; it's not an unknown.




Or are you trying to convince people it looks like FSR pixel soup?

[Jedi Survivor screenshot]


You're doing a good job keeping things in order while James & Sawtooth are gone. Keep it up.
 
230W, dude, 230W. And it's still pixel soup. My point still stands. If it were updated to PSSR, it would already be better. But FSR 3 is raw dogshit at those internal resolutions.
I'm not here to discuss wattage or any other nonsense. I'm here telling you that you were continuously quoting internal res when it came to PS, yet you have a problem with people quoting the 360p 40fps mode on Switch for some reason.
 
Ahahaha, in real life or DF clickbait??

[screenshots]





I was wondering why the Steam Deck version looked weird, then I saw it.... FSR 3. Most people playing this game on Steam Deck use XeSS.



From the little we've seen of the Switch 2, Cyberpunk looks better, but that's to be expected. The game is custom-written to run on the Switch 2, while the Steam Deck is just running the PC version. Despite all of that, the differences aren't that large. The screenshots being posted are certainly not representative at all.
 
Yes, as they would on the PS5 equivalent games (for 60fps vs. the 40fps on Switch 2). It's just funny seeing people defend the 360p lower bound now when they were up in arms about 860p or 648p in the performance modes of other games.
I mean, look at this shit. Nobody even mentioned PS here, yet he was clowning as usual:




Now he's out here defending 360p (with the usual, now-incorrect Jedi Survivor res deflection, of course).
In the January thread he was defending using DLSS to upscale low resolutions (540p).

He's doing the same here: saying that since the PS5 upscales from low resolutions with FSR, it's fine for Switch 2 to do that with DLSS.
 
"Continuously quoting," yet you find one time I did.



.
Nice edit, but you're so silly that you don't realise you're the one who used the phrase "keep quoting" regarding me mentioning 360p. I've only ever mentioned it in this thread once, whereas you've mentioned Jedi Survivor's old internal res multiple times in multiple threads.
 