Did the Switch 2 graphics capabilities meet your expectation?

Did the Switch 2 graphics capabilities meet your expectation?

  • Over my expectations

  • In line with my expectations

  • Under my expectations


Results are only viewable after voting.


WOW, at double the wattage you crawl above the Steam Deck. An 8.6 TFLOPS APU vs the Steam Deck's 1.6 TFLOPS.

Good job!

much-efficient-so.jpg


Exactly as I said. The ROG Ally X is the embodiment of the wrong direction PC handhelds are taking: trying to brute-force its way with stupidly big APUs that are bottlenecked and inefficient.

It's typical PC-environment efficiency. Imagine a Switch 2 with those specs? Console optimization would drive it even further, but alas, Nintendo opted for parts so cheap and off-the-shelf that they are on par with the Steam Deck.
 
ROG Ally X is not 8.6 TF; I'm not sure why that myth gets repeated. VOPD is useless for gaming, and the advertised GPU clock is almost never reached. At 25 W it is 3.5 TF (2.3 GHz average clock), and at 15 W you can expect around 1.69 TF (1.1 GHz average clock). Memory bandwidth is only 17% higher than the Deck's, but in practice it is around the same because the Ally's 8 CPU cores use more of it. If you install Bazzite and lock to 15 W/17 W, the ROG Ally X also has better battery life than the Deck.

The ROG Ally X is engineered just fine. It's just a larger chip that needs 17 W or more to start seeing meaningful performance gains over the Deck.
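For anyone who wants to sanity-check those figures, here's a rough sketch of the arithmetic. The shader counts are my own assumptions (12 CUs x 64 = 768 shaders for the Ally X's Z1 Extreme, 8 CUs x 64 = 512 for the Deck); the clocks are the averages quoted above, and the ~2.8 GHz is simply the clock implied by the 8.6 TF marketing number.

```python
# Rough sketch of the FLOPS arithmetic above, not a benchmark.
# Assumed shader counts: ROG Ally X (Z1 Extreme) = 12 CUs x 64 = 768 shaders,
# Steam Deck = 8 CUs x 64 = 512 shaders. Two FLOPs per shader per clock (FMA),
# doubled again only if you count RDNA 3's VOPD dual-issue.

def tflops(shaders: int, clock_ghz: float, vopd: bool = False) -> float:
    """Theoretical FP32 TFLOPS = shaders * 2 (FMA) * clock, x2 if VOPD is counted."""
    return shaders * 2 * (2 if vopd else 1) * clock_ghz / 1000

# ~2.8 GHz is the clock implied by the advertised 8.6 TF figure with VOPD counted.
print(f"Ally X marketing figure (VOPD, ~2.8 GHz): {tflops(768, 2.8, vopd=True):.1f} TF")  # ~8.6
print(f"Ally X at 25 W (~2.3 GHz average):        {tflops(768, 2.3):.2f} TF")             # ~3.53
print(f"Ally X at 15 W (~1.1 GHz average):        {tflops(768, 1.1):.2f} TF")             # ~1.69
print(f"Steam Deck (~1.6 GHz):                    {tflops(512, 1.6):.2f} TF")             # ~1.64
```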
 
ROG Ally X is not 8.6 TF; I'm not sure why that myth gets repeated. VOPD is useless for gaming, and the advertised GPU clock is almost never reached. At 25 W it is 3.5 TF (2.3 GHz average clock), and at 15 W you can expect around 1.69 TF (1.1 GHz average clock). Memory bandwidth is only 17% higher than the Deck's, but in practice it is around the same because the Ally's 8 CPU cores use more of it. If you install Bazzite and lock to 15 W/17 W, the ROG Ally X also has better battery life than the Deck.

The ROG Ally X is engineered just fine. It's just a larger chip that needs 17 W or more to start seeing meaningful performance gains over the Deck.
Games are not optimized for the ROG Ally; Switch 2 performance will win.
 
In handheld mode it's already losing if it really is 540p/25 fps, and that's not at ultra settings...

The DF marketing reel pixel count again? Rich said that count came from the shot where a video of Cyberpunk is slapped onto a render of the Switch for marketing purposes. It's meaningless and even he questions it. But here we are: DF being dumb enough to drop resolutions without context has become ammo repeated constantly, while the game is said to target 1080p in handheld mode at launch.
 
I mean, by any common sense you would not pixel-count a render on a CG handheld. WTF is this? Why even do it? What's next, pixel-counting the people holding a handheld in the rooftop commercial?
540p makes some sense, as Zelda TotK is 1440p and Cyberpunk, which is supposed to be much more demanding to run, is 1080p docked.
 
Games are not optimized for the ROG Ally; Switch 2 performance will win.
In handheld mode I think the ROG Ally X at 20/25 W will come out on top, but it is drawing way more power to do so and is almost twice the price, so the Switch 2 performing as well as it does is already a win.
 
I wasn't expecting much, but Nintendo claims that the Switch 2 has 10x the performance of the Switch, so on hearing that the screen was 1080p and the dock supported 4K, I was kind of expecting more.

What irks me the most, though, is Nintendo's stubborn refusal to use DLSS or any kind of anti-aliasing in most of their games, never mind anisotropic texture filtering, especially as I only buy Nintendo consoles for the first-party games. This is a must for playing Switch games on a 4K TV so they don't look a jagged, shimmery mess. I own a PS5 Pro and a PC (oh, and something called an Xbox Series X, what a disappointment that is!), so I have zero interest in playing third-party games on Switch or Switch 2, yet those are the games that are likely to embrace DLSS or use anti-aliasing.

The definitive Nintendo game experience is still on PC in my experience. I have played over 80 hours of Zelda: Breath of the Wild on my PC via the Cemu emulator, with enhancements that make the game look far superior to the Switch 2 version and run at 120 fps. Yet that is the Wii U version (and, yes, I do still own boxed Wii U and GameCube games). Also, Mario Kart 8 on PC looks better than Mario Kart World.

The nail in the coffin was hearing that GameCube games will only run at 1280x960 on Switch 2, despite the fact that the NVIDIA Shield, which uses the same chipset as the original Switch, runs GameCube games at higher resolutions according to Digital Foundry (although that version is only for the Chinese market, I believe). So you would think GameCube games would run at higher resolutions on the more powerful Switch 2, right? 1280x960 might be OK for the 1080p handheld screen, but I'm guessing the games will not look good on a 4K TV.
 
I don't know why people are saying the Switch 2 is a "substantial" upgrade over the Deck when we have already seen that Hogwarts Legacy and SF6 have significant cutbacks. The Deck runs the next-gen code path while the Switch 2 runs the last-gen config.
I played Hogwarts on the Deck from start to finish; it was worse than the Switch 2 footage.
 



As far as I'm concerned, videogame graphics peaked with Soul Calibur on Sega Dreamcast. Everything that has followed since is icing on the cake. Heck, to us kids who grew up in the 1980s, the movie Tron was considered the peak of computer graphics. We couldn't imagine videogames ever looking better than that. We passed that milestone decades ago, hah.

Everything today is amazing and fantastic. Better than I could hope or ask for.
 



As far as I'm concerned, videogame graphics peaked with Soul Calibur on Sega Dreamcast. Everything that has followed since is icing on the cake. Heck, to us kids who grew up in the 1980s, the movie Tron was considered the peak of computer graphics. We couldn't imagine videogames ever looking better than that. We passed that milestone decades ago, hah.

Everything today is amazing and fantastic. Better than I could hope or ask for.

Team forties! This, and modern gaming sucks.
 
VOPD is useless for gaming
VOPD is useless in general, I'd say, but it does provide a theoretical FLOPS peak, which is correct when counted as a doubling of non-VOPD FLOPS.
In any case, math FLOPS of all things aren't the main limiting factor for RDNA 3 performance in a 15 W power envelope.
 
VOPD is useless in general, I'd say, but it does provide a theoretical FLOPS peak, which is correct when counted as a doubling of non-VOPD FLOPS.
Do you happen to know how Nvidia Ampere teraflops are calculated? Are double-rate FP32 calculations included in a similar fashion? I am reading about 'Ampere compute figures being inflated' left and right but couldn't find a definitive answer after doing a bit of research.
 
Do you happen to know how Nvidia Ampere teraflops are calculated? Are double-rate FP32 calculations included in a similar fashion? I am reading about 'Ampere compute figures being inflated' left and right but couldn't find a definitive answer after doing a bit of research.
Ampere really does have double FP32, though accessing it requires sacrificing the use of the integer unit. The 3080 has the same number of SMs as the 2080 Ti, but gets 30% more performance on average. Compare that to RDNA 3, where you get maybe 5% better performance.
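To put rough numbers on that, here's a quick sketch using the public spec-sheet figures for the two cards; the ~1.3x gaming uplift is the average mentioned above, not something you can derive from the specs.

```python
# Paper TFLOPS vs. delivered uplift: Ampere doubled FP32 per SM, but the second
# FP32 path is shared with INT32, so games see far less than the 2x on paper.
# Spec-sheet figures: both cards have 68 SMs; boost clocks in GHz.
cards = {
    "RTX 2080 Ti (Turing)": {"fp32_per_sm": 64,  "sms": 68, "boost_ghz": 1.545},
    "RTX 3080 (Ampere)":    {"fp32_per_sm": 128, "sms": 68, "boost_ghz": 1.710},
}

# Theoretical FP32 TFLOPS = FP32 lanes per SM * SM count * 2 (FMA) * clock.
paper_tf = {name: c["fp32_per_sm"] * c["sms"] * 2 * c["boost_ghz"] / 1000
            for name, c in cards.items()}

for name, tf in paper_tf.items():
    print(f"{name}: {tf:.1f} TF on paper")

ratio = paper_tf["RTX 3080 (Ampere)"] / paper_tf["RTX 2080 Ti (Turing)"]
print(f"Paper ratio: {ratio:.2f}x, observed average gaming uplift: ~1.3x")
```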
 
[Comparison screenshots]


Switch 2 really has a huge upgrade over Steam Deck graphically.
Could you please specify what those settings and the frame rate are on the Deck?
Because it could just be a fanboy putting the lowest settings on the Deck just to make an unfair comparison.

Not saying that's the case here, but honestly it wouldn't surprise me.
And again, the Deck is a 2022 device.

What really annoys me is Nintendo and Nvidia not giving the exact specs.
 
Ampere really does have double FP32, though accessing it requires sacrificing the use of the integer unit. The 3080 has the same number of SMs as the 2080 Ti, but gets 30% more performance on average. Compare that to RDNA 3, where you get maybe 5% better performance.
I see, so T239's stated 3 TF docked figure is with this feature factored in? So in a sense it's 1.5 ('base TF' without double-rate FP32) × 1.3 = 1.95 TF, and about 2 TF is the closer-to-real, less theoretical throughput figure? It is indeed inflated in that sense then, confirming what I have been reading. Another analogy (I know this isn't 1:1) might be the PS4 Pro's RPM feature. The PS4 Pro was theoretically 8.4 TF at half/FP16 precision. We know that RPM isn't useless and can bring a significant uplift in real-world compute throughput (maybe about 20% with good use). We could factor that in with the same mentality, 4.2 × 1.2 = 5 TF, to reach a higher figure for the PS4 Pro, but of course we aren't doing that. All this is to say that it may be fairer to account for this aspect when comparing the Switch 2 to another system like the PS4 in terms of theoretical compute ceilings. It isn't quite 3 > 1.84 really; there are nuances there.
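Writing that back-of-the-envelope reasoning out (the uplift percentages are just the ballpark figures from this discussion, not measured values):

```python
# Illustrative "effective TF" arithmetic: take a base (non-double-rate) figure
# and scale it by a rough real-world uplift factor. Both uplifts are ballpark
# assumptions from the posts above, not measurements.

def effective_tf(base_tf: float, uplift: float) -> float:
    """Scale a base TFLOPS figure by an assumed real-world uplift."""
    return base_tf * (1 + uplift)

# T239 docked: ~1.5 TF base FP32, ~30% typical gain from Ampere's double FP32.
print(f"T239 docked, effective: {effective_tf(1.5, 0.30):.2f} TF")  # ~1.95
# PS4 Pro: 4.2 TF FP32 base, ~20% gain assumed from good FP16/RPM usage.
print(f"PS4 Pro with RPM gains: {effective_tf(4.2, 0.20):.2f} TF")  # ~5.04
```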
 
I see, so T239's stated 3 TF docked figure is with this feature factored in? So in a sense it's 1.5 ('base TF' without double-rate FP32) × 1.3 = 1.95 TF, and about 2 TF is the closer-to-real, less theoretical throughput figure? It is indeed inflated in that sense then, confirming what I have been reading. Another analogy (I know this isn't 1:1) might be the PS4 Pro's RPM feature. The PS4 Pro was theoretically 8.4 TF at half/FP16 precision. We know that RPM isn't useless and can bring a significant uplift in real-world compute throughput (maybe about 20% with good use). We could factor that in with the same mentality, 4.2 × 1.2 = 5 TF, to reach a higher figure for the PS4 Pro, but of course we aren't doing that. All this is to say that it may be fairer to account for this aspect when comparing the Switch 2 to another system like the PS4 in terms of theoretical compute ceilings. It isn't quite 3 > 1.84 really; there are nuances there.
That's true, but doing that calculation gives you the equivalent of a 1.95 TF Turing card, when in reality Turing was slightly faster than RDNA 2 per teraflop, and RDNA 2 is about 1.25x GCN. The multiplier I use is 0.7 to get the roughly equivalently performing RDNA 2 part (the ~33.6 TF 3080 competes with the ~23.8 TF 6900 XT) and 0.875 to get the equivalently performing GCN part (0.7 * 1.25 = 0.875). By this metric, 3.1 Ampere TF would roughly equal 2.17 RDNA 2 TF or 2.7 GCN TF. That is, a bit more than half the power of the Xbox Series S, or in between the PS4 and the PS4 Pro. Other than a few possible outliers, so far this matches what we see in Switch 2 games.
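Here's that conversion written out; the 0.7 and 1.25 factors are the rough equivalences described above, so treat the outputs as ballpark only.

```python
# Rough cross-architecture TFLOPS conversion using the multipliers above.
AMPERE_TO_RDNA2 = 0.7          # ~23.8 TF 6900 XT trades blows with ~33.6 TF 3080
RDNA2_TO_GCN = 1.25            # RDNA 2 taken as ~1.25x GCN per teraflop
AMPERE_TO_GCN = AMPERE_TO_RDNA2 * RDNA2_TO_GCN  # = 0.875

switch2_docked_ampere_tf = 3.1  # paper Ampere figure discussed above
print(f"~{switch2_docked_ampere_tf * AMPERE_TO_RDNA2:.2f} RDNA 2 TF (Series S is 4.0)")
print(f"~{switch2_docked_ampere_tf * AMPERE_TO_GCN:.2f} GCN TF (PS4 is 1.84, PS4 Pro is 4.2)")
```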
 
That's true, but doing that calculation gives you the equivalent of a 1.95 TF Turing card, when in reality Turing was slightly faster than RDNA 2 per teraflop, and RDNA 2 is about 1.25x GCN. The multiplier I use is 0.7 to get the roughly equivalently performing RDNA 2 part (the ~33.6 TF 3080 competes with the ~23.8 TF 6900 XT) and 0.875 to get the equivalently performing GCN part (0.7 * 1.25 = 0.875). By this metric, 3.1 Ampere TF would roughly equal 2.17 RDNA 2 TF or 2.7 GCN TF. That is, a bit more than half the power of the Xbox Series S, or in between the PS4 and the PS4 Pro. Other than a few possible outliers, so far this matches what we see in Switch 2 games.
I see, thanks. To clarify, I was only assessing/comparing theoretical compute ceilings in isolation amongst other metrics/indicators, since it's the most overused one and shows up most often in arguments. Compute isn't the only indicator of game performance, and its efficiency/real-world throughput depends on a lot of factors like bandwidth (be it caches or memory), async, custom bits, APIs, etc., so it's all the more difficult to assess its validity when comparing specific platforms. Then there is the fixed-function/rasterisation part of the picture, of course. I guess we'll see the released games running side by side in due time for a better assessment.
 
I see, thanks. To clarify, I was only assessing/comparing theoretical compute ceilings in isolation amongst other metrics/indicators, since it's the most overused one and shows up most often in arguments. Compute isn't the only indicator of game performance, and its efficiency/real-world throughput depends on a lot of factors like bandwidth (be it caches or memory), async, custom bits, APIs, etc., so it's all the more difficult to assess its validity when comparing specific platforms. Then there is the fixed-function/rasterisation part of the picture, of course. I guess we'll see the released games running side by side in due time for a better assessment.
Well Ampere compute isn't "faked", so in a synthetic benchmark the Switch 2 should be able to hit that 3.1 TF figure.
 
Well Ampere compute isn't "faked", so in a synthetic benchmark the Switch 2 should be able to hit that 3.1 TF figure.
Of course, I never said it was "faked" or that it couldn't be reached in synthetic benchmarks. I guess the PS4 Pro could also hit 8.4 TF in a tailor-made FP16 synthetic benchmark. I was only inquiring about the intricacies.
 
I have tried it today. My answer is yes, fully. The games I tried look good on a TV screen and absolutely gorgeous on the Switch 2 screen. It's miles ahead of the first one (as expected) and I love the form factor. So much power in such a slick device.
 
I didn't expect to see it damn near on par with the PS5. Visually, games like FF7R, Cyberpunk 2077 and SF6 look only slightly worse than their PS5 counterparts, at least from what I can see. That's extremely impressive for what is essentially a handheld.

I expected DLSS and 4K, but I didn't expect VRR and a 120 Hz screen. Nintendo truly went above and beyond to future-proof this thing!
 
As a non-Nintendo fan, I honestly couldn't see the difference, because every Nintendo game looks the same to me as the previous generation, and since I'm not 5 years old it was a major meh. I stopped giving a shit about Nintendo when my balls dropped and my voice changed. They're like the Michael Jackson of video games: never changes, all about the kids, and something just off lying beneath the surface.
 
Huh? Literally no one ever expected that. Hell, no one even expects that on a PS5 Pro.

Seriously, like half of GAF believed it wouldn't even be able to output in 4K, and even the biggest optimists, myself included, didn't think you would have games like Metroid Prime 4 hitting 4K/60.
 
Seriously, like half of GAF believed it wouldn't even be able to output in 4K, and even the biggest optimists, myself included, didn't think you would have games like Metroid Prime 4 hitting 4K/60.

What I really didn't see coming was the 120 Hz screen, never mind supporting 120 fps on a TV. With 120 fps and mouse support they really broadened the scope of games I can seriously play on this. Lots of PC games won't really need a PC for a similar experience.

Don't worry guys, not all PC games. Put the pitchforks away.
 
As a non-Nintendo fan, I honestly couldn't see the difference, because every Nintendo game looks the same to me as the previous generation, and since I'm not 5 years old it was a major meh. I stopped giving a shit about Nintendo when my balls dropped and my voice changed. They're like the Michael Jackson of video games: never changes, all about the kids, and something just off lying beneath the surface.
America What GIF by Nat Geo Wild
 
As a non-Nintendo fan, I honestly couldn't see the difference, because every Nintendo game looks the same to me as the previous generation, and since I'm not 5 years old it was a major meh. I stopped giving a shit about Nintendo when my balls dropped and my voice changed. They're like the Michael Jackson of video games: never changes, all about the kids, and something just off lying beneath the surface.

To be fair, I wouldn't be able to tell the difference between a PS5 and a PS4 title if I had never played the PS4 game either.
 