Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.

Status
Not open for further replies.
You keep saying this... They aren't going to shrink Maxwell; it makes no sense.

Why not? Assuming this SoC has been in the design phase for the past 2 years as Nvidia has said, and seeing as how they clearly have used the Tegra X1 as the starting point for the design (since it's in one of the devkits at least), it would seem clear that the GPU design started from Maxwell, and was likely customized therefrom.

If one of those customizations was a die shrink which offers Nintendo nothing but benefits (cost, efficiency, space), then this would still be a GPU based on Maxwell and on a 16nm process. This chip was likely in the design phase at the same time as the Pascal architecture, so they would likely have taken the base design from Maxwell, as it was readily available at the time in many products.

Maxwell has been on 28nm and 20nm, so what reason is there to doubt it could be on 16nm? Is your issue just the fact that a Maxwell GPU on 16nm might as well be Pascal?
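As a back-of-the-envelope illustration of why a shrink is attractive, here's the idealized area math in Python. This is textbook scaling only; real foundry nodes (TSMC's "16nm" largely reuses 20nm metal pitches, for example) shrink considerably less than this, so treat the numbers as rough upper bounds, not predictions about this chip.

```python
# Idealized die-area scaling between process nodes: area shrinks with the
# square of the feature size. Real nodes don't scale this cleanly, so these
# figures are upper bounds on the shrink, not actual measurements.

def ideal_area_ratio(old_nm, new_nm):
    return (new_nm / old_nm) ** 2

print(f"20nm -> 16nm: {ideal_area_ratio(20, 16):.0%} of the original area")
print(f"28nm -> 16nm: {ideal_area_ratio(28, 16):.0%} of the original area")
```

Even under this optimistic model a 20nm-to-16nm move only buys about a third off the die area, which is why "same Maxwell design, smaller process" is a plausible customization rather than a redesign.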
 
How would you even clip the joycons on this? I mean, I know that devkits tend to be really different from the final product, but they have to be playable. There's no way to actually put joycons on this thing. And the joycons were one of the few things we had no idea how to represent.
It has a Classic Controller port.
 
Stupid question, but if developers are using a Jetson X1 as a devkit, how do they account for new tech that is added to the retail unit (the custom X1)? I mean, let's say Nvidia has added the new ASTC texture compression, which isn't found in the Jetson X1; do developers just code for it and emulate it in software?

How much can Nintendo and Nvidia customize the GPU without making it a hindrance for developers?
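In practice this kind of gap is usually handled with runtime feature detection plus a fallback path. A minimal sketch in Python, assuming a made-up capability query and texture loader (none of these names are from any real Switch or Nvidia SDK):

```python
# Hypothetical sketch of handling a hardware feature (here, ASTC texture
# decode) that the retail unit has but an older devkit lacks. All names
# below are invented for illustration; this is not a real SDK API.

def read_astc(path):
    # Stand-in for loading pre-compressed ASTC texture data from disk.
    return ("astc-bytes", path)

def decode_astc_to_rgba(path):
    # Stand-in for a CPU-side software decoder: slower and heavier on
    # memory, but works on hardware without ASTC support.
    return ("rgba-bytes", path)

def load_texture(path, supported_formats):
    """Upload compressed data if the GPU supports it, else fall back."""
    if "ASTC" in supported_formats:
        # Retail unit: use the hardware decoder directly.
        return {"format": "ASTC", "data": read_astc(path)}
    # Devkit without ASTC: decompress in software and upload raw RGBA.
    return {"format": "RGBA8", "data": decode_astc_to_rgba(path)}

print(load_texture("grass.astc", {"ASTC", "RGBA8"})["format"])
print(load_texture("grass.astc", {"RGBA8"})["format"])
```

The game code targets the feature either way; only the loading path differs, so moving from the interim devkit to final hardware mostly means deleting the fallback.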
 
Stupid question, but if developers are using a Jetson X1 as a devkit, how do they account for new tech that is added to the retail unit (the custom X1)? I mean, let's say Nvidia has added the new ASTC texture compression, which isn't found in the Jetson X1; do developers just code for it and emulate it in software?

How much can Nintendo and Nvidia customize the GPU without making it a hindrance for developers?

Yeah I was wondering the same thing. Light the tech GAF signal.
 
Why not? Assuming this SoC has been in the design phase for the past 2 years as Nvidia has said, and seeing as how they clearly have used the Tegra X1 as the starting point for the design (since it's in one of the devkits at least), it would seem clear that the GPU design started from Maxwell, and was likely customized therefrom.

If one of those customizations was a die shrink which offers Nintendo nothing but benefits (cost, efficiency, space), then this would still be a GPU based on Maxwell and on a 16nm process. This chip was likely in the design phase at the same time as the Pascal architecture, so they would likely have taken the base design from Maxwell, as it was readily available at the time in many products.

Maxwell has been on 28nm and 20nm, so what reason is there to doubt it could be on 16nm? Is your issue just the fact that a Maxwell GPU on 16nm might as well be Pascal?

Yes, it would be Pascal. They wouldn't just shrink Maxwell if they were already planning Pascal; it would make no sense.
 
Stupid question, but if developers are using a Jetson X1 as a devkit, how do they account for new tech that is added to the retail unit (the custom X1)? I mean, let's say Nvidia has added the new ASTC texture compression, which isn't found in the Jetson X1; do developers just code for it and emulate it in software?

How much can Nintendo and Nvidia customize the GPU without making it a hindrance for developers?
That's probably the reason titles released around a console's launch window usually don't push the hardware much. Those games are mostly developed on non-final devkits whose specifications are only an approximation of the final hardware.
 
Stupid question, but if developers are using a Jetson X1 as a devkit, how do they account for new tech that is added to the retail unit (the custom X1)? I mean, let's say Nvidia has added the new ASTC texture compression, which isn't found in the Jetson X1; do developers just code for it and emulate it in software?

How much can Nintendo and Nvidia customize the GPU without making it a hindrance for developers?

Nintendo likely wouldn't add technology that wasn't supported by dev kits, or they'd send replacements with updated tech and new documentation. The weirder scenario would be if the newer chip didn't support technology that the older chip had supported.
 
Stupid question, but if developers are using a Jetson X1 as a devkit, how do they account for new tech that is added to the retail unit (the custom X1)? I mean, let's say Nvidia has added the new ASTC texture compression, which isn't found in the Jetson X1; do developers just code for it and emulate it in software?

How much can Nintendo and Nvidia customize the GPU without making it a hindrance for developers?

I don't know how to answer your broader question, but all we know about devkits being straight Jetson TX1s is that at some point before July, when Eurogamer reported on the hybrid, the devkits were straight TX1s. We don't know how old those devkits are; they could be from as long as a year ago, for all we know.

Yes, it would be Pascal. They wouldn't just shrink Maxwell if they were already planning Pascal; it would make no sense.

What I'm saying is that starting from a TX1 and making the customizations Nintendo is interested in would mean the GPU is Maxwell-based, regardless of the process it's made on. Unless they chose to include every single alteration made between Maxwell and Pascal, which might not have been the case.

It's basically semantics: if the GPU evolved from Maxwell originally, not Pascal, then even if it's remarkably similar to Pascal (which Maxwell is by default), it's still Maxwell-based, not Pascal-based.
 
Stupid question, but if developers are using a Jetson X1 as a devkit, how do they account for new tech that is added to the retail unit (the custom X1)? I mean, let's say Nvidia has added the new ASTC texture compression, which isn't found in the Jetson X1; do developers just code for it and emulate it in software?

How much can Nintendo and Nvidia customize the GPU without making it a hindrance for developers?

Until final hardware arrives, devkits always work with the best approximation they can to give an idea of what will finally be available, so that can mean some features are not fully in place, etc.

As someone else mentioned, that's why things can change drastically in game development on a new system. You are designing around X, then finalized Y comes out, and what is and isn't possible changes.
 
It's a super generic-looking tablet that has literally nothing in common with the Switch. None of the buttons or ports are even remotely the same. And if that were the dev kit, none of the leaks about how the system works and how the joycons connect would have turned out the way they did.

How would a dev even get glamour shots of a dev kit like that? It really just looks like a bad Photoshop.

are you fucking blind?
 
[Image: IMG_0010.jpg]


It honestly doesn't look familiar at all.

This one.
 
People doubting Emily because they don't like what she says, after she's been right about all the Switch stuff, is becoming quite annoying.

I have no idea how anyone can think the devkit photo is fake.

This boggles my mind. This was leaked just hours before the reveal, and it's exactly like the Switch unit but with extra dev kit features, yet people keep saying it's fake...
 
How would you even clip the joycons on this ? I mean, I know that devkits tend to be really different from the final product, but they have to be playable. There's no way to actually put joycons on these thing. And joycons were one of the few things we had no idea how to represent them.

That dev kit rev didn't have the controllers, only the newest ones do.
 
A dev came out and said that the real dev kit didn't have any special ports that the retail units don't have. Seems fake.

Well, that's incorrect. That rev has multiple ports the retail unit doesn't have: it's powered by a Wii U PSU, and it has Wii controller inputs (on that dev kit you used a Wii Pro controller, not a Wii U one).
 
Well, that's incorrect. That rev has multiple ports the retail unit doesn't have: it's powered by a Wii U PSU, and it has Wii controller inputs (on that dev kit you used a Wii Pro controller, not a Wii U one).
What are the BNC or coaxial connectors for?
 
This one, right?



I remember seeing this right before; it has the same look, and even the vents, slots, and buttons match, I think. Wow.

The headphone jack though is now in the middle IIRC.

The only differences are that this unit seems to have a Wii/Wii U-style power port, an Ethernet port, and a Wii/Wii U-style controller port, plus an actual HDMI out.
 
The only controller support we had was a Wii Classic Controller Pro, the wired one. Maybe some devs got the Joy-Con, but that dev kit is relatively old now.
If it's like every recent Nintendo final dev kit, the Switch final dev kit probably looks like a green version of the retail model. I'm not sure, though.
 
Catching up with the thread.

Didn't Laura Kate Dale debunk that dev kit "leak" picture?

All I know is that there is no way that picture was faked. We don't know what it is from, though the smart money does indeed say some sort of dev kit. But this picture was circulating the night before the October reveal and the top of the unit is 100% identical to the top of the Switch seen in the trailer, so it can't possibly be faked (without the faker having seen a real Switch unit anyway).
 
Are we thinking the CPU is going to be drastically different from what was listed at the start of the thread?

Four ARM Cortex-A57 cores
We have no clue about the CPU other than the A57+A53 cores found in the Tegra X1. Better to keep our expectations in check.
 
LOL. Last time I checked, people playing on PS4 aren't getting the "best graphics". In fact, I don't think the term "best graphics" should even be used in any discussion about consoles. Unless some crazy manufacturer releases something with at least a Haswell quad-core i7 with a GTX 1080 or Titan...

"They can get" as in within the limitations they set themselves (budget ease of use, comfort for playing). Basically anyone who wants a console machine and likes "great graphics" won't chose a Switch because it'll likely be quite a bit lesser than the current consoles and A LOT lesser than PS4 Pro and Scorpio.
 