DF Direct Weekly #208: Switch 2 Specs Breakdown, DLSS Info, Switch 2 vs Steam Deck + More!

Jaguar was a shit architecture, so them sticking so hard to this 'it'll run like a PS4' prediction will end up making them look stupid.

There is no point comparing it to old hardware. The only interesting comparisons will be the Series S, Steam Deck and ROG Ally.
 
Kirby dance:


This whole segment with crystal floor:

I played half of the Forgotten Land when it launched, but had to put it down at some point and ended up selling the game. I'm glad they're making a better version for the Switch 2; can't wait to play it.
 
It uses a cut-down T234 chip (this rumor is anything but confirmed at this point)

Processor: Nvidia T234 | Nvidia T239 (Switch 2)
CPU cores: 12x ARM A78AE | 8x ARM A78C
CPU clock speed: 1.43GHz | Up to 1100MHz (estimate)
CUDA cores: 2048 | 1536
Memory interface: 256-bit / LPDDR5 | 128-bit / LPDDR5
Memory bandwidth: 204GB/s | Up to 102GB/s (estimate)
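Quick sanity check on those bandwidth numbers (my own back-of-the-envelope, assuming LPDDR5 at 6400 MT/s, which is not a confirmed spec):

```python
# Theoretical peak bandwidth = bus width (in bytes) x transfer rate.
# Assumption (mine, not confirmed): LPDDR5 running at 6400 MT/s.

def peak_bandwidth_gbs(bus_width_bits: int, transfer_rate_mts: float) -> float:
    return (bus_width_bits / 8) * transfer_rate_mts / 1000

print(peak_bandwidth_gbs(256, 6400))  # T234: 256-bit -> ~204.8 GB/s
print(peak_bandwidth_gbs(128, 6400))  # T239: 128-bit -> ~102.4 GB/s
```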

You are getting ~Steam Deck performance (Handheld)
Until somebody has an actual system and does a teardown, it's all speculation at this point. Of course, nobody outside core hobby gamers cares in the first place, but it will be nice to have some actual confirmed specs.
 
Until somebody has an actual system and does a teardown, it's all speculation at this point. Of course, nobody outside core hobby gamers cares in the first place, but it will be nice to have some actual confirmed specs.
Yup, I agree with you, but I'm 95% sure it won't deviate much from the rumored specs here.
 
I played half of the Forgotten Land when it launched, but had to put it down at some point and ended up selling the game. I'm glad they're making a better version for the Switch 2; can't wait to play it.
Why!? I personally LOOVED the game; I couldn't put it down until I'd 100%'d everything.
 
Jaguar was a shit architecture, so them sticking so hard to this 'it'll run like a PS4' prediction will end up making them look stupid.

There is no point comparing it to old hardware. The only interesting comparisons will be the Series S, Steam Deck and ROG Ally.
Looking at the Cyberpunk footage it's pretty clear this is impressive for a handheld costing £395.
 
We are so used to threads on here complaining about how the Switch had poor performance. We are just not ready for the threads complaining about how good the Switch 2's performance is.

Welcome to NeoGAF, folks.
 
Why!? I personally LOOVED the game; I couldn't put it down until I'd 100%'d everything.

Don't wanna sound too generic, but life, basically. I did that with a lot of games - I've still only done one dungeon in TotK, for example.

Up until I got my Portal last September, I barely had the chance to play games. I should've gone with the OLED model with PowerA pro joycons for comfortable play, but I didn't - I was waiting for the Switch 2.
 
Looking at the Cyberpunk footage it's pretty clear this is impressive for a handheld costing £395.
I owned one of the first Steam Decks and ended up selling it because the battery life killed the fun for me. Even playing something like AC2 it barely managed 90 minutes; on the Switch it was 5 hours. The fan could get noisy too.

The thing that excites me about this is the form factor, it's thinner and lighter than the competition. And the battery life will be much better. When Nintendo said 2-6 hours, I'm pretty sure the game that drained it in 2 hours was Cyberpunk 2077.
 
Nintendo Switch 2 vs PS4 Pro is the most fitting comparison when evaluating two hardware systems with broadly similar performance ceilings, despite differences in architecture and rendering approaches.

This isn't just old vs new, it's muscle vs brains. The PS4 Pro, with its 4.2 TFLOPs GCN GPU and 8GB of GDDR5 RAM, was a beast in its time. It handles high-res textures, heavy post-processing, and complex effects like volumetrics and alpha transparencies with confidence. Native or checkerboarded 4K was its calling card, but all of this runs on a dated Jaguar CPU and an architecture that's now well behind the curve.
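For context, that 4.2 TFLOPS figure falls straight out of the PS4 Pro's shader count and clock; a quick sketch of the arithmetic, using the well-known 36 CUs at 911 MHz:

```python
# FP32 TFLOPS = shader ALUs x 2 ops per clock (fused multiply-add) x clock speed.
cus = 36                # PS4 Pro compute units
shaders = cus * 64      # 64 ALUs per GCN CU -> 2304
clock_ghz = 0.911       # PS4 Pro GPU clock
tflops = shaders * 2 * clock_ghz / 1000
print(f"{tflops:.2f} TFLOPS")  # ~4.20
```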

The Switch 2, meanwhile, takes a completely different route. Running a custom NVIDIA Ampere-based SoC, it's all about efficiency and modern rendering. With DLSS in the mix, the console doesn't need brute force to punch above its weight.

In raw texture quality and bandwidth-heavy scenes, the PS4 Pro still pulls ahead. It's better at pushing high-res assets without compromises. But in lighting and overall visual coherence, especially in motion, the Switch 2 may look more "next-gen," thanks to modern techniques like real-time global illumination and AI-powered reconstruction.
Particle density and effects are another battleground: PS4 Pro delivers more of them, but the Switch 2 can simulate similar results using smarter shaders and DLSS-enhanced tricks, albeit sometimes at the cost of artifacting or edge softness.

Bottom line? The PS4 Pro is still impressive for raw power and static image quality. But the Switch 2 brings modern rendering tools to the fight: DLSS, better CPU efficiency, and scalable performance. For handheld gaming with "current-gen visuals", it's a game-changer. For raw asset fidelity on a 4K screen, the PS4 Pro still flexes harder.

All things considered, I'm pleasantly impressed by the hardware of the Switch 2, just quite a bit disappointed by the astronomical price of some physical games, such as 'Mario Kart World' priced at €89.99, while others like 'Donkey Kong Bananza' are offered at €79.99 (Italy).
 
I owned one of the first Steam Decks and ended up selling it because the battery life killed the fun for me. Even playing something like AC2 it barely managed 90 minutes; on the Switch it was 5 hours. The fan could get noisy too.

The thing that excites me about this is the form factor, it's thinner and lighter than the competition. And the battery life will be much better. When Nintendo said 2-6 hours, I'm pretty sure the game that drained it in 2 hours was Cyberpunk 2077.
Possibly. I like the handheld feature and might use it more than the original due to the 1080p screen. But docked is the way to go for me; my LG C series OLED is waiting. I am hoping for a 4K or at least HD home screen this time round, although the LG does a pretty decent job of upscaling the 720p image. The fan in the dock gets my interest a lot more than anything else - we must be looking at much better clock speeds this time round. And perhaps Nvidia have delivered the goods with their AI DLSS. :)
 
Looking at the Cyberpunk footage it's pretty clear this is impressive for a handheld costing £395.

I don't think it's as impressive as you guys make it out to be when you've basically already gotten this level of handheld performance from a 4-year-old Steam Deck.

Docked? What's the point when your nearest competitor has been giving 5x your performance since 2020?
 
I don't think it's as impressive as you guys make it out to be when you've basically already gotten this level of handheld performance from a 4-year-old Steam Deck.

Docked? What's the point when your nearest competitor has been giving 5x your performance since 2020?
The Steam Deck is hardly a Switch competitor; even they say it isn't. For the vast majority of people that will buy a Switch 2, this will be their very first experience of these games. They are simply not core hobby gamers who spend lots of time playing and discussing video games. They just buy games, play them and enjoy them.
 
Nintendo Switch 2 vs PS4 Pro is the most fitting comparison when evaluating two hardware systems with broadly similar performance ceilings, despite differences in architecture and rendering approaches.

This isn't just old vs new, it's muscle vs brains. The PS4 Pro, with its 4.2 TFLOPs GCN GPU and 8GB of GDDR5 RAM, was a beast in its time. It handles high-res textures, heavy post-processing, and complex effects like volumetrics and alpha transparencies with confidence. Native or checkerboarded 4K was its calling card, but all of this runs on a dated Jaguar CPU and an architecture that's now well behind the curve.

The Switch 2, meanwhile, takes a completely different route. Running a custom NVIDIA Ampere-based SoC, it's all about efficiency and modern rendering. With DLSS in the mix, the console doesn't need brute force to punch above its weight.

In raw texture quality and bandwidth-heavy scenes, the PS4 Pro still pulls ahead. It's better at pushing high-res assets without compromises. But in lighting and overall visual coherence, especially in motion, the Switch 2 may look more "next-gen," thanks to modern techniques like real-time global illumination and AI-powered reconstruction.
Particle density and effects are another battleground: PS4 Pro delivers more of them, but the Switch 2 can simulate similar results using smarter shaders and DLSS-enhanced tricks, albeit sometimes at the cost of artifacting or edge softness.

Bottom line? The PS4 Pro is still impressive for raw power and static image quality. But the Switch 2 brings modern rendering tools to the fight: DLSS, better CPU efficiency, and scalable performance. For handheld gaming with "current-gen visuals", it's a game-changer. For raw asset fidelity on a 4K screen, the PS4 Pro still flexes harder.

All things considered, I'm pleasantly impressed by the hardware of the Switch 2, just quite a bit disappointed by the astronomical price of some physical games, such as 'Mario Kart World' priced at €89.99, while others like 'Donkey Kong Bananza' are offered at €79.99 (Italy).


We started with 'worse than PS4'

Then 'worse than PS4 Pro'

Soon we will get to 'worse than Series S'
 
The Steam Deck is hardly a Switch competitor; even they say it isn't. For the vast majority of people that will buy a Switch 2, this will be their very first experience of these games. They are simply not core hobby gamers who spend lots of time playing and discussing video games. They just buy games, play them and enjoy them.
I'm talking pure technical specs, not sales.

I'm willing to make a prediction here - it'll look noticeably better on Switch 2 than the Steam Deck.
When docked?! Highly probable. Handheld, it's gonna be similar to the Steam Deck, which is the point I'm arguing.

I'm saying you are getting similar (handheld) performance to a Steam Deck that was released 4 years ago, which makes the comments about impressive Switch 2 performance weird.
 
I'm talking pure technical specs, not sales.


When docked?! Highly probable. Handheld, it's gonna be similar to the Steam Deck, which is the point I'm arguing.

I'm saying you are getting similar (handheld) performance to a Steam Deck that was released 4 years ago, which makes the comments about impressive Switch 2 performance weird.
Specs are fun to discuss. I was merely suggesting that outside of the core it's not an important factor.
 
I'm talking pure technical specs, not sales.


When docked?! Highly probable. Handheld, it's gonna be similar to the Steam Deck, which is the point I'm arguing.

I'm saying you are getting similar (handheld) performance to a Steam Deck that was released 4 years ago, which makes the comments about impressive Switch 2 performance weird.
And I'm saying handheld. As impressive as the Steam Deck's APU is, RDNA 2 is limited.

Steam Deck launched 3 years ago. It also has a slightly-better-than-720p screen; the Switch 2's is 1080p.
 
The main problem with the Switch 2 is that it uses a worse node (Samsung 8nm) than both the Steam Deck and PS5, and we have probably mainly seen docked modes of the big games (how is Elden Ring going to run in portable mode?). But it's still a next-gen leap compared to the Switch. People just have to lower their expectations: a portable PS4 / XB1 is objectively a very impressive machine, particularly running Nintendo games. DLSS could be used in docked mode in the future, I reckon.

The bandwidth is the second most disappointing spec of all, IMO. In portable mode that super low bandwidth will prevent it from efficiently using either DLSS or RT. We are talking XB1 bandwidth here, without the super fast 200GB/s ESRAM part! The latest big XB1 open-world games ran horribly compared to their PS4 versions (even with the ESRAM), and there won't be any miracles with that 68GB/s bandwidth in portable mode, not with 8 CPU cores to feed. Right now I am expecting XB1 power in portable mode and PS4+ (DLSS, RT at 30fps?) in docked mode.
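Putting numbers on that "XB1 bandwidth" comparison (a rough sketch; the handheld LPDDR5 transfer rate here is my assumption, not a confirmed spec):

```python
def peak_gbs(bus_bits: int, mts: float) -> float:
    # Theoretical peak bandwidth in GB/s.
    return bus_bits / 8 * mts / 1000

print(peak_gbs(256, 2133))  # Xbox One main RAM: 256-bit DDR3-2133 -> ~68.3 GB/s
print(peak_gbs(128, 4266))  # Switch 2 handheld, if LPDDR5 drops to 4266 MT/s -> ~68.3 GB/s
print(peak_gbs(128, 6400))  # Switch 2 docked, assuming 6400 MT/s -> ~102.4 GB/s
```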
 
I'm perfectly


We started with 'worse than PS4'

Then 'worse than PS4 Pro'

Soon we will get to 'worse than Series S'
The Switch 2 delivers excellent performance, and that's what truly matters. Technical comparisons are ultimately irrelevant, especially when comparing a home console from the previous generation to Nintendo's latest hybrid/portable system. Furthermore, there are areas where the Switch 2 comes out ahead; no one has claimed it is inferior in absolute terms.
 
The main problem with the Switch 2 is that it uses a worse node (Samsung 8nm) than both the Steam Deck and PS5, and we have probably mainly seen docked modes of the big games (how is Elden Ring going to run in portable mode?). But it's still a next-gen leap compared to the Switch. People just have to lower their expectations: a portable PS4 / XB1 is objectively a very impressive machine, particularly running Nintendo games. DLSS could be used in docked mode in the future, I reckon.

The bandwidth is the second most disappointing spec of all, IMO. In portable mode that super low bandwidth will prevent it from efficiently using either DLSS or RT. We are talking XB1 bandwidth here, without the super fast 200GB/s ESRAM part! The latest big XB1 open-world games ran horribly compared to their PS4 versions (even with the ESRAM), and there won't be any miracles with that 68GB/s bandwidth in portable mode, not with 8 CPU cores to feed. Right now I am expecting XB1 power in portable mode and PS4+ (DLSS, RT at 30fps?) in docked mode.
Ampere has substantially better colour compression than GCN, so the figures are not directly comparable. For example the 3060 has 360 GB/s for 13.7 TF, which is 26 GB/s per TF. Then a 1.71 TF Ampere part would need ~44 GB/s by this metric, leaving 24 GB/s for the CPU.
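Spelled out, with the same numbers (the small differences from the ~44/24 above are just rounding):

```python
rtx3060_bw = 360.0        # GB/s of memory bandwidth
rtx3060_tf = 13.7         # peak FP32 TFLOPS
ratio = rtx3060_bw / rtx3060_tf          # ~26.3 GB/s per TF
gpu_need = ratio * 1.71                  # ~45 GB/s for a 1.71 TF Ampere part
cpu_left = 68 - gpu_need                 # ~23 GB/s left from a 68 GB/s handheld pool
print(round(ratio, 1), round(gpu_need, 1), round(cpu_left, 1))
```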
 
We'll see how Cyberpunk 2077 runs on Switch 2,

'cause it runs like shit on PS4.
I think we will have a similar situation to what we had with the Series S vs the Xbox One X: the older console had a terrible CPU but a massive GPU, while the newer one has a smaller/weaker GPU but a much more capable CPU and a much more modern feature set (and an SSD).
The same thing is going to happen with PS4 Pro vs Switch 2 docked, or base PS4 vs Switch 2 in handheld mode.

We forget how terrible those Jaguar cores in the Xbox One/S/X and base PS4/Pro really were.
Remember the Ubisoft slide comparing PS3 CPU power to PS4 CPU power?
 
I think we will have a similar situation to what we had with the Series S vs the Xbox One X: the older console had a terrible CPU but a massive GPU, while the newer one has a smaller/weaker GPU but a much more capable CPU and a much more modern feature set (and an SSD).
The same thing is going to happen with PS4 Pro vs Switch 2 docked, or base PS4 vs Switch 2 in handheld mode.

We forget how terrible those Jaguar cores in the Xbox One/S/X and base PS4/Pro really were.
Remember the Ubisoft slide comparing PS3 CPU power to PS4 CPU power?
People hate on Jaguar, but a few traditionally 30fps franchises went 60fps on PS4/Xbox One: Battlefield, Battlefront, Gears of War, Halo, Uncharted multiplayer and others.
 
People hate on Jaguar, but a few traditionally 30fps franchises went 60fps on PS4/Xbox One: Battlefield, Battlefront, Gears of War, Halo, Uncharted multiplayer and others.
It was simply the best console-form-factor CPU AMD had to offer at the time; things couldn't be helped. Luckily for the industry, AMD stopped slacking CPU-wise. Now we need to worry about GPUs, because modern console CPUs are both very performant and relatively affordable (compared to the GPUs :P).
 
Horizon Zero Physics barely has any interactivity with the objects in its world. It also has many invisible walls within the world. The game logic is also laughably bad. No dynamic weather system.
The fuck are you talking about WRT the weather? Decima most definitely has a dynamic weather system. Volumetric cloud simulation even. And the type of weather that can occur depends on the location/elevation (you won't get sandstorms in the arctic regions, or snow in the desert, etc.)
 
Soon we will get to 'worse than Series S'
With that 100GB/s it'll always be worse than (as in, not really close to) the Series S.
The 7840U, which on paper should run circles around the S at 30 watts, sometimes completely crashes and burns in bandwidth-intensive applications - and yes, that includes software that is up to 20 years old.

But I don't even know if that memory spec is 'real'.
 
People hate on Jaguar, but a few traditionally 30fps franchises went 60fps on PS4/Xbox One: Battlefield, Battlefront, Gears of War, Halo, Uncharted multiplayer and others.
It did have better multithreading, but boy did Halo look bad on the Xbox One.
 
I don't think it's as impressive as you guys make it out to be when you've basically already gotten this level of handheld performance from a 4-year-old Steam Deck.

Docked? What's the point when your nearest competitor has been giving 5x your performance since 2020?

I think the market will decide what's important.
 
BotW uses basic-ass Havok, and that can be done on GPU compute on a PS4. In fact, there was a Havok demonstration on the GPU as part of the PS4 release presentation.
Exactly. Most people think that the PS4 is confined to its 1.6 GHz Jaguars for physics calculations out of ignorance of its architecture. The APU was heavily tailored towards GPGPU via specific hardware customizations (Onion+ bus, volatile bit, 8 ACEs). In a way it was designed to assist the CPU in this area, ideally asynchronously.
 
The main problem with the Switch 2 is that it uses a worse node (Samsung 8nm) than both the Steam Deck and PS5, and we have probably mainly seen docked modes of the big games (how is Elden Ring going to run in portable mode?). But it's still a next-gen leap compared to the Switch. People just have to lower their expectations: a portable PS4 / XB1 is objectively a very impressive machine, particularly running Nintendo games. DLSS could be used in docked mode in the future, I reckon.

The bandwidth is the second most disappointing spec of all, IMO. In portable mode that super low bandwidth will prevent it from efficiently using either DLSS or RT. We are talking XB1 bandwidth here, without the super fast 200GB/s ESRAM part! The latest big XB1 open-world games ran horribly compared to their PS4 versions (even with the ESRAM), and there won't be any miracles with that 68GB/s bandwidth in portable mode, not with 8 CPU cores to feed. Right now I am expecting XB1 power in portable mode and PS4+ (DLSS, RT at 30fps?) in docked mode.
I wholeheartedly agree with your assessment of the Switch 2 being at XB1 level in raw throughput as a handheld. Just to note, XB1's 200 GB/s figure for the 32 MB ESRAM is mostly a meaningless damage-control/marketing trick which combines read and write speeds (the official figure at reveal was 102 GB/s, before a slight upclock). They stated that they reached up to ~140 GB/s in a specific tailored application afterwards, but in actual games I very much doubt it. I think the PS4's 176 GB/s pool offered at least about twice the bandwidth in practice, even with XB1's ESRAM help (the vast majority of its RAM being 68 GB/s).
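Rough math behind that "combines reads and writes" point (a sketch only; the real ESRAM couldn't do a read and a write on literally every cycle, which is part of why the practical numbers were lower):

```python
one_way_gbs = 102.4                         # official one-direction figure at the 800 MHz reveal clock
bytes_per_cycle = one_way_gbs * 1000 / 800  # ~128 bytes moved per cycle in one direction
combined_peak = 2 * one_way_gbs             # ~204.8 GB/s if a read AND a write happened every cycle
print(bytes_per_cycle, combined_peak)
```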
 
At the end of the day, no other company has made a game like Breath of the Wild. No other company makes games that hit like a new Zelda, Mario, Metroid, Mario Kart, Animal Crossing and many more.

Nintendo innovated more with a shitty ancient tablet than other first-party devs did with their churned-out Assassin's Creed clones.

That's why so many people will buy a Switch 2. Their software is just different from anyone else's. Love them or hate them, they are completely original.
 
Exactly. Most people think that the PS4 is confined to its 1.6 GHz Jaguars for physics calculations out of ignorance of its architecture. The APU was heavily tailored towards GPGPU via specific hardware customizations (Onion+ bus, volatile bit, 8 ACEs). In a way it was designed to assist the CPU in this area, ideally asynchronously.

Physics isn't the issue. The Witcher 3 has the most static-ass world imaginable with zero dynamic interactions, yet it's heavily CPU-limited even on the One X's 2.4GHz Jaguar.

Too many NPCs or enemy AIs in one place and a game like that will heavily slow down due to the CPU.

God of War 2018 is CPU-limited in an almost empty and 100% static area, simply because it's large and open, probably due to draw calls for the scenery or something else data-streaming related.
 
I wholeheartedly agree with your assessment of the Switch 2 being at XB1 level in raw throughput as a handheld. Just to note, XB1's 200 GB/s figure for the 32 MB ESRAM is mostly a meaningless damage-control/marketing trick which combines read and write speeds (the official figure at reveal was 102 GB/s, before a slight upclock). They stated that they reached up to ~140 GB/s in a specific tailored application afterwards, but in actual games I very much doubt it. I think the PS4's 176 GB/s pool offered at least about twice the bandwidth in practice, even with XB1's ESRAM help (the vast majority of its RAM being 68 GB/s).
It would've been interesting to see how things would have played out if Sony had been unable to increase the amount of RAM just before launch: 4 gigs of fast RAM vs 8 gigs of slow RAM. As it was, the difference was in line with the 40% better GPU metric, coming down to 900p vs 1080p games at the same quality.
 
At the end of the day, no other company has made a game like Breath of the Wild. No other company makes games that hit like a new Zelda, Mario, Metroid, Mario Kart, Animal Crossing and many more.

Nintendo innovated more with a shitty ancient tablet than other first-party devs did with their churned-out Assassin's Creed clones.

That's why so many people will buy a Switch 2. Their software is just different from anyone else's. Love them or hate them, they are completely original.
True, but the novelty of playing something like Witcher 3 on the go is also a draw. Personally, I'd only be interested in those titles if they hit 60fps; otherwise I might as well wait until I get home and play on PC.
 
This looks to me like the biggest leap power-wise for Nintendo since the transition from N64 to GameCube, which pleases me a lot. They said they're trying to make the Switch 2 future-proof, and this power bump seems to be part of that plan.
 
Ampere has substantially better colour compression than GCN, so the figures are not directly comparable. For example the 3060 has 360 GB/s for 13.7 TF, which is 26 GB/s per TF. Then a 1.71 TF Ampere part would need ~44 GB/s by this metric, leaving 24 GB/s for the CPU.
Doesn't this assessment assume that the RTX 3060 has an 'ideal' bandwidth-to-compute ratio as a starting point? Is that realistic?
Bandwidth 'need' isn't really a 'fixed thing', is it? Shouldn't it vary according to the nature of the workload?
What's the efficiency gain from Ampere delta colour compression in bandwidth consumption - what's the percentage ballpark in practical scenarios?
Isn't the Switch 2 an hUMA-style architecture? What about contention (disproportionate bandwidth consumption with CPU utilisation)?
 
At the end of the day, no other company has made a game like Breath of the Wild. No other company makes games that hit like a new Zelda, Mario, Metroid, Mario Kart, Animal Crossing and many more.

Nintendo innovated more with a shitty ancient tablet than other first-party devs did with their churned-out Assassin's Creed clones.

I foreshadowed this type of post:

Edit: Reality will hit soon, and we will get back to "who buys a Switch to play 3rd party games? You get it for the Nintendo games!!"
 
It would've been interesting to see how things would have played out if Sony had been unable to increase the amount of RAM just before launch: 4 gigs of fast RAM vs 8 gigs of slow RAM. As it was, the difference was in line with the 40% better GPU metric, coming down to 900p vs 1080p games at the same quality.
In general the difference between XB1 and PS4 came down to either 900p vs 1080p (44% more pixels on PS4) or, as often seen, 720p vs 900p (56% more on PS4 in this case), with sometimes higher performance on PS4 in GPU/bandwidth-intensive scenarios on top. There were also cases with some higher settings on PS4. The 40% figure for the GPU is for compute only, which doesn't take things like twice the ROPs (pixel fill rate) and async compute into account. To be fair, I don't think third-party titles pushed the GPGPU side of the PS4 that hard.

If the PS4 had only had 4 GB of RAM, I honestly think it would have been pretty disastrous for Sony, for multiple reasons unrelated to raw power.
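Those percentages check out on raw pixel counts:

```python
def pixels(w: int, h: int) -> int:
    return w * h

print(pixels(1920, 1080) / pixels(1600, 900))   # 1.44   -> 1080p is 44% more pixels than 900p
print(pixels(1600, 900) / pixels(1280, 720))    # 1.5625 -> 900p is ~56% more pixels than 720p
```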
 
I foreshadowed this type of post:

Well done for understanding facts, I guess, bro. :)

People buy Nintendo consoles for their games over their hardware.

Imagine being a company that doesn't compete on a technical level with big consoles like the PS5 / Series X yet downright embarrasses them with ingenuity. Of course fans of the console buy the Switch for the first-party games.
 
Doesn't this assessment assume that the RTX 3060 has an 'ideal' bandwidth-to-compute ratio as a starting point? Is that realistic?
Bandwidth 'need' isn't really a 'fixed thing', is it? Shouldn't it vary according to the nature of the workload?
What's the efficiency gain from Ampere delta colour compression in bandwidth consumption - what's the percentage ballpark in practical scenarios?
Isn't the Switch 2 an hUMA-style architecture? What about contention (disproportionate bandwidth consumption with CPU utilisation)?
A similar ratio is maintained across the entire 3000 series line, so it is not specific to the 3060. And the claim is not that this is "ideal", but merely that it should be sufficient to see performance comparable to other Ampere parts.

I can't find any figures listed for Ampere; however, delta colour compression on Maxwell improved effective bandwidth by 25%+ over Kepler, Pascal offered a further 20% improvement over Maxwell, and Nvidia listed a further 25% improvement for Turing over Pascal.


I can't comment on the contention issue.
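If those per-generation figures roughly compound (my own back-of-the-envelope; nothing here is an official Ampere number):

```python
kepler_to_maxwell = 1.25   # the "25%+" effective-bandwidth gain quoted above
maxwell_to_pascal = 1.20
pascal_to_turing = 1.25
cumulative = kepler_to_maxwell * maxwell_to_pascal * pascal_to_turing
print(f"~{cumulative:.2f}x effective bandwidth from Kepler to Turing, from compression alone")
```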
 
People hate on Jaguar, but a few traditionally 30fps franchises went 60fps on PS4/Xbox One: Battlefield, Battlefront, Gears of War, Halo, Uncharted multiplayer and others.
Indeed. MGS5 ran at a locked 60fps at 1080p on PS4 while it was ~25fps on X360 (and ~20fps on PS3 at sub-720p resolution). That legendary Ubisoft benchmark didn't take into account the fact that the Cell was being used there like a GPGPU. But in traditional CPU tasks? It was trash (worse than the X360 CPU), and Jaguar was like 2.5 times faster.

After a decade of work, developers knew how to squeeze everything out of the Jaguars because they were eventually making games for four Jaguar-based machines. I doubt they'll spend as much time optimizing their games for one ARM-based handheld, notably knowing Nintendo gamers don't care about framerate.
 