A Nintendo Switch has been taken apart

Standard specs. Not necessarily available to developers. Nintendo themselves make this distinction in their development docs.
 
We haven't seen more recent Wii U shots, but it's very likely that it looks better. The viewing distance seems much improved, performance is way better, and so is the resolution. The only thing of "concern" is some lighting issues, but that might be a general change or a build thing.

Same HDMI output, but more hardware power behind it and same or better resolutions used, so it should.

We already know Mario Kart 8 Deluxe is 1080p @60fps, so...

Awesome! I don't think they've explicitly stated that it was going to be better than the Wii U. Personally, I thought the Wii U had awesome graphics, so to have something that's comparable power-wise and portable is epic.
 
Standard specs. Not necessarily available to developers. Nintendo themselves make this distinction in their development docs.

Ok, I'm not trying to be stubborn, but that isn't the same thing at all. The standard specs for the X1 aren't the specs the July devkits have, sure.

Hanging your hat on a Chinese-to-English translation of a post based on the word "standard"? He isn't a developer either, so what is standard about that clock exactly? I mean, it's nothing like the X1 spec sheet; it's literally a custom chip designed to run at a single clock (the CPU in this instance), so why would there be a different clock for developers?
 
Because it matters if Nintendo treats it as the 3DS successor. That would mean it's going to get an amazing stream of games.

As a home console, it has less chance of getting third parties due to the power gap.

Why would it mean it gets an amazing stream of games?

I look at the 3DS release list and I see hardly anything, with massive gaps in between the scant few things that do release for it.
 
Standard specs. Not necessarily available to developers. Nintendo themselves makes this differentiation in their development docs.

The standard specs are 2 GHz for the CPU and 1 GHz for the GPU, and the actual speed available to devs was TBD in the right column of the leaked document. At this point, why would they test 1.78 GHz and 921 MHz clocks for 8 days? Why is there no mention of 1.02 GHz and 768 MHz in the Foxconn leak? The guy didn't pull those numbers from some document; they were on the screen during the test.

I'm trying to understand, because if they kept A57 cores there's no way that clock is being used.
 
The standard specs are 2 GHz for the CPU and 1 GHz for the GPU, and the actual speed available to devs was TBD in the right column of the leaked document. At this point, why would they test 1.78 GHz and 921 MHz clocks for 8 days? Why is there no mention of 1.02 GHz and 768 MHz in the Foxconn leak? The guy didn't pull those numbers from some document; they were on the screen during the test.

I'm trying to understand, because if they kept A57 cores there's no way that clock is being used.

Actually, A57 is possible at that clock. The battery capacity gives 6.4 W for the whole device, and A57 on Samsung's 14nm process (this is done on a similar TSMC 16nm process) draws ~3.6 W at 1.78 GHz for 4 cores. The best estimate we have for the GPU at 384 MHz is around 0.4 W, so if the rest of the system can sit at 2.5 W or below, it is possible to push that clock on A57.
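
As a rough back-of-the-envelope check of that budget (a minimal sketch only; the battery figure, worst-case battery life and per-component power numbers are the estimates quoted above, not measured Switch figures):

```python
# Rough handheld power budget using the estimates above; none of these
# figures are official measurements.
battery_wh = 4.31 * 3.7          # commonly cited 4310 mAh @ 3.7 V, about 16 Wh
worst_case_hours = 2.5           # low end of Nintendo's quoted battery life
device_budget_w = battery_wh / worst_case_hours   # ~6.4 W for the whole device

cpu_w = 3.6   # 4x A57 @ ~1.78 GHz on a 14/16nm-class node (estimate)
gpu_w = 0.4   # Maxwell GPU @ 384 MHz (estimate)
rest_w = device_budget_w - cpu_w - gpu_w
print(f"Device budget: {device_budget_w:.1f} W, left for screen/RAM/WiFi/etc.: {rest_w:.1f} W")
# ~6.4 W total, ~2.4 W left over, hence the "2.5 W or below" condition above
```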
 
The standard specs are 2 GHz for the CPU and 1 GHz for the GPU, and the actual speed available to devs was TBD in the right column of the leaked document. At this point, why would they test 1.78 GHz and 921 MHz clocks for 8 days? Why is there no mention of 1.02 GHz and 768 MHz in the Foxconn leak? The guy didn't pull those numbers from some document; they were on the screen during the test.

I'm trying to understand, because if they kept A57 cores there's no way that clock is being used.

The updated translation actually never connects the clock speeds (which he calls "standard specs") to the test demo, so we don't know for sure if the test unit was running at those speeds.
 
Actually, A57 is possible at that clock. The battery capacity gives 6.4 W for the whole device, and A57 on Samsung's 14nm process (this is done on a similar TSMC 16nm process) draws ~3.6 W at 1.78 GHz for 4 cores. The best estimate we have for the GPU at 384 MHz is around 0.4 W, so if the rest of the system can sit at 2.5 W or below, it is possible to push that clock on A57.
That's assuming 14nm; we don't know if that's the case. Also, A72 would still be a better choice (die area, heat, power consumption, performance per watt).

The updated translation actually never connects the clock speeds (which he calls "standard specs") to the test demo, so we don't know for sure if the test unit was running at those speeds.

But where did he read those clocks then? I doubt he had documents, and even if that was the case, I doubt that they would show different max theoretical clocks for the chip itself than the ones in the leaked document.
 
Jeez this thing is still an enigma.

At this stage I'm expecting the official teardown to feature Hercule Poirot.

"But what ze Foxconn leaker was not telling you, mademoiselle, was that ze sample testing was performed specifically at release spec!"
 
But where did he read those clocks then? I doubt he had documents, and even if that was the case, I doubt that they would show different max theoretical clocks for the chip itself than the ones in the leaked document.

Well that's kinda the point, we don't really know anymore where he got that info. Originally since it was close to the part where he talks about the 8 day test, it must have been translated in a way that suggested he saw those clocks during that test. Now though, there's absolutely no connection.

He makes other comments about hearing various things from people at the plant, maybe supervisors' offices or something like that, so it could be possible that he saw that on a spec sheet. And that could be the final operating clocks on that sheet or the maximum theoretical clocks. It doesn't make much sense that the max theoretical clocks would be lower than the TX1 max though.

The main problem is that we just really don't know what the context of these clocks is anymore. It definitely seems possible that he saw it on a readout during this demo, but it's hard to be sure of that anymore.

One thought I had is that maybe every unit is tested to make sure they can reach those clocks in the space/thermals allotted once assembled, which is why they need those clocks on a spec sheet? So that they can toss out any units which don't reach the prescribed levels?
 
I seem to remember Switch was widely rumored and expected to be released in Fall 2016, and it was at one point delayed to March 2017 for unknown reasons.
Maybe updating the SoC was one of those reasons and therefore the clocks Eurogamer got were indeed intended to be final, and they are just outdated?
 
I don't understand why people are so sceptical about the Foxconn leak; it got absolutely everything right except some assumptions ("The screen looks good, it must be 1080p") and it got many more details than even Eurogamer did.
Then Eurogamer releases some admittedly old information about clock speeds, in the same context as "October devkits are more powerful than the July ones".
I think it makes a lot of sense that older devkits were using A57s and Maxwell cores at 20nm and the old frequencies, and they just shrank the node and changed the CPU for A72s at the same frequencies, with the same RAM size (4GB) and bandwidth.
And even if the machine has a full core or part of it dedicated to the OS, it still has throughput close to that of the PS4, more single-thread performance, and a decent GPU that can run its games with reduced graphics.


There's the rub though: if he got resolution wrong because he was only looking at a physical device, how would he know core type?


I'm not skeptical that he held the physical device. Just more, how much would a Foxconn assembly worker know about the internals of a device under NDA?


We get iPhone leaks nearly 100% of the time nowadays, but we don't know the uArch details until someone delids one and goes in with a microscope, or writes custom code to measure, and that's well after launch.
 
The standard specs are 2 GHz for the CPU and 1 GHz for the GPU, and the actual speed available to devs was TBD in the right column of the leaked document. At this point, why would they test 1.78 GHz and 921 MHz clocks for 8 days? Why is there no mention of 1.02 GHz and 768 MHz in the Foxconn leak? The guy didn't pull those numbers from some document; they were on the screen during the test.

I'm trying to understand, because if they kept A57 cores there's no way that clock is being used.

It's not said that they tested the clocks for 8 days. Just that they had a Switch running for 8 days.

Also it doesn't say that the clocks were read from a benchmark. Just that these are the standard clocks.


Why is there no mention of 1.02 GHz and 768 MHz in the Foxconn leak? Maybe the guy doesn't have access to this info if these are still the "real" clocks? Maybe the clocks are no longer valid? Who knows.

The new translation really paints a slightly different picture than the first one, with more things being speculation and hearsay, and it makes it sound like the guy really gathered bits and pieces from his own experience with the Switch but also from colleagues in the factory.
 
There's the rub though: if he got resolution wrong because he was only looking at a physical device, how would he know core type?

I'm not skeptical that he held the physical device. Just more, how much would a Foxconn assembly worker know about the internals of a device under NDA?

We get iPhone leaks nearly 100% of the time nowadays, but we don't know the uArch details until someone delids one and goes in with a microscope, or writes custom code to measure, and that's well after launch.

He clarified that he saw the output was 1080p, which is why he assumed that was the screen resolution. The updated translation is here: http://www.neogaf.com/forum/showpost.php?p=229879660&postcount=836

We don't know how he got the clock speed info, or whether or not 16nm is his assumption (which it seems to be), but he does specify later in his post that he looked at the clock speeds again and initially reported them incorrectly (1750MHz->1785MHz, 912MHz->921MHz). So it would seem that he got a look at clock speeds, whether on a readout or spec sheet, but this doesn't tell us if those are the speeds that the retail product will be maxed at.

Why is there no mention of 1.02 GHz and 768 MHz in the Foxconn leak? Maybe the guy doesn't have access to this info if these are still the "real" clocks? Maybe the clocks are no longer valid? Who knows.

The new translation really paints a slightly different picture than the first one, with more things being speculation and hearsay, and it makes it sound like the guy really gathered bits and pieces from his own experience with the Switch but also from colleagues in the factory.

If it's true that 1.02GHz and 768MHz is what's available to developers, then it would make a ton of sense that he didn't see those numbers. The numbers he saw likely only have to do with the maximum clock speeds the unit will be run at, and we don't know for sure that it was run at those speeds for 8 days. So maybe Nintendo wants a large amount of overhead for a possible clock speed increase in the future, and they are throwing out any unit which can't reach 1.78GHz and 921MHz comfortably.
 
So games like Mighty No. 9 and Broforce are also maxing out the hardware, since they also struggle to maintain 60 fps? I didn't realise the PS4 was so weak...

Except it's not, and the PS4 struggling to maintain 60 fps in Snake Pass likely has nothing to do with Sumo maxing out the PS4; it's just them not optimising correctly.
Yes. If it cannot do what a developer is trying to make it do at the speed it's supposed to do, something is being maxed out. That doesn't imply something a thousand times better couldn't be done by people with more talent.
 
Well that's kinda the point, we don't really know anymore where he got that info. Originally since it was close to the part where he talks about the 8 day test, it must have been translated in a way that suggested he saw those clocks during that test. Now though, there's absolutely no connection.

He makes other comments about hearing various things from people at the plant, maybe supervisors' offices or something like that, so it could be possible that he saw that on a spec sheet. And that could be the final operating clocks on that sheet or the maximum theoretical clocks. It doesn't make much sense that the max theoretical clocks would be lower than the TX1 max though.

The main problem is that we just really don't know what the context of these clocks is anymore. It definitely seems possible that he saw it on a readout during this demo, but it's hard to be sure of that anymore.

One thought I had is that maybe every unit is tested to make sure they can reach those clocks in the space/thermals allotted once assembled, which is why they need those clocks on a spec sheet? So that they can toss out any units which don't reach the prescribed levels?

They wouldn't be doing that test during full production, and it wouldn't be done at Foxconn but at TSMC. The truth is, it doesn't matter where he saw the clocks. He reported those clocks, and they can't be maximum theoretical clocks, because the chip is based on the X1 and the base clock didn't change; the maximum clocks the X1 had would still be possible with the Switch SoC. The "standard spec" (which is a translation and thus not an indication of much) is the clocks he saw/heard in relation to the full production unit, and we know it is a full production unit because he indicates they were making 20k of these a day. You can't test every one, so you do random sampling, which likely has temperature readouts and possibly clock readouts so that the testers know the chip is operating normally: not for stress testing a theoretical clock, but for testing the clocks the chip runs at in the wild. (That is the only purpose of random testing: to check the quality of the production. Collecting a large sample lets you be confident that all the units work correctly.)

I seem to remember Switch was widely rumored and expected to be released in Fall 2016, and it was at one point delayed to March 2017 for unknown reasons.
Maybe updating the SoC was one of those reasons and therefore the clocks Eurogamer got were indeed intended to be final, and they are just outdated?

Just speculating, but that could be what the Nintendo president meant when he said it was because they needed to have software available for the Switch. From the reports we heard, Mario was done, and it seems silly to miss the holidays if that was the case. Third parties might have just said "the CPU is too low", so Nintendo had to change the SoC and didn't update the document Eurogamer read about launch clocks. Again, just throwaway speculation here.
 
We don't know how he got the clock speed info, or whether or not 16nm is his assumption (which it seems to be), but he does specify later in his post that he looked at the clock speeds again and initially reported them incorrectly (1750MHz->1785MHz, 912MHz->921MHz). So it would seem that he got a look at clock speeds, whether on a readout or spec sheet, but this doesn't tell us if those are the speeds that the retail product will be maxed at.

That part sounds to me like he got hold of some kind of spec sheet, or there was one visible close to where he works, given the precision of the corrections.
 
There's the rub though: if he got resolution wrong because he was only looking at a physical device, how would he know core type?


I'm not skeptical that he held the physical device. Just more, how much would a Foxconn assembly worker know about the internals of a device under NDA?


We get iPhone leaks nearly 100% of the time nowadays, but we don't know the uArch details until someone delids one and goes in with a microscope, or writes custom code to measure, and that's well after launch.

Well, I always thought the Foxconn worker wouldn't be just anyone, but someone relatively high in the ranks, a manager, whatever, probably somebody in charge of QA, where they test the workings of the internals and therefore see that kind of test. He wouldn't have to know the panel type or resolution, and the translation says that the process node and the console having A73 were educated guesses by the worker based on the clocks displayed in the test.
 
They wouldn't be doing that test during full production, and it wouldn't be done at Foxconn but at TSMC. The truth is, it doesn't matter where he saw the clocks. He reported those clocks, and they can't be maximum theoretical clocks, because the chip is based on the X1 and the base clock didn't change; the maximum clocks the X1 had would still be possible with the Switch SoC. The "standard spec" (which is a translation and thus not an indication of much) is the clocks he saw/heard in relation to the full production unit, and we know it is a full production unit because he indicates they were making 20k of these a day. You can't test every one, so you do random sampling, which likely has temperature readouts and possibly clock readouts so that the testers know the chip is operating normally: not for stress testing a theoretical clock, but for testing the clocks the chip runs at in the wild. (That is the only purpose of random testing: to check the quality of the production. Collecting a large sample lets you be confident that all the units work correctly.)

As I was saying above, it might be the case that the random sampling was indeed where he saw the clock speeds, or potentially he saw the clock speeds on a sheet listing a sort of minimum tolerance threshold (meaning, if a unit cannot reach these speeds without exceeding X degrees C, it's tossed out). But this doesn't mean those speeds will be available to developers.

I would imagine almost all consoles will have a comfortable overhead where the SoC is capable of reaching clocks higher than those that are available for games, just for quality and reliability purposes. If indeed the testing was done for 8 days at those clocks, then it would paint a different picture, but we can't be sure of that based on the new translation.
 
Well, I always thought the Foxconn worker wouldn't be just anyone, but someone relatively high in the ranks, a manager, whatever, probably somebody in charge of QA, where they test the workings of the internals and therefore see that kind of test. He wouldn't have to know the panel type or resolution, and the translation says that the process node and the console having A73 were educated guesses by the worker based on the clocks displayed in the test.

We can completely throw out those clocks being possible at 20nm. I believe the A57 is 8 W by itself, and the GPU would put it close to 10 W, giving the device just over 1 hour of battery life. A72 was around 5 W on 20nm, IIRC? That is still beyond the battery life we get for final hardware.

It was certainly someone who had access to the random sample testing that would be done at this phase, and like I said, temperature readouts are important for such a test, because you need to know if there is a drastic difference in temperatures; if that is there, clocks are usually part of that toolkit as well.

As I was saying above, it might be the case that the random sampling was indeed where he saw the clock speeds, or potentially he saw the clock speeds on a sheet listing a sort of minimum tolerance threshold (meaning, if a unit cannot reach these speeds without exceeding X degrees C, it's tossed out). But this doesn't mean those speeds will be available to developers.

I would imagine almost all consoles will have a comfortable overhead where the SoC is capable of reaching clocks higher than those that are available for games, just for quality and reliability purposes. If indeed the testing was done for 8 days at those clocks, then it would paint a different picture, but we can't be sure of that based on the new translation.

They don't do design or QA testing with random samples; the clocks during those tests are the clocks that would be out in the wild. If the clocks are real, they point to final clocks. That you don't do stress tests on final products during full production is the entire bit of new info we have from someone with knowledge of the process over at Beyond3D, and I think it makes perfect sense too, because you've already done those tests, and probably at TSMC, not Foxconn.

Also, what you seem to be talking about is "binning", which is done before the die even reaches a product, at the fab level, not the production level. How else could they choose to put binned chips into laptops?
 
We can completely throw out those clocks being possible at 20nm. I believe the A57 is 8 W by itself, and the GPU would put it close to 10 W, giving the device just over 1 hour of battery life. A72 was around 5 W on 20nm, IIRC? That is still beyond the battery life we get for final hardware.

It was certainly someone who had access to the random sample testing that would be done at this phase, and like I said, temperature readouts are important for such a test, because you need to know if there is a drastic difference in temperatures; if that is there, clocks are usually part of that toolkit as well.
The way I understood the leak was: the worker performed a visual examination of the console, then tested the SoC at, presumably, the target frequency, and that's it.
Obviously this rules out 20nm, but nobody in their right mind would use 20nm in 2017, especially for a high-volume product that is supposed to sell for years. It makes no sense from a business standpoint.
The whole 20nm thing comes from the rumor about Nvidia having to honor a contract with TSMC for Shield TVs, but that would be in the hundreds of thousands of units at MOST; we are talking 10 million Switch units predicted for year 1 alone. Now people have stuck with it, but 20nm is dead, ffs.
 
They don't do design or QA testing with random samples; the clocks during those tests are the clocks that would be out in the wild. If the clocks are real, they point to final clocks. That you don't do stress tests on final products during full production is the entire bit of new info we have from someone with knowledge of the process over at Beyond3D, and I think it makes perfect sense too, because you've already done those tests, and probably at TSMC, not Foxconn.

Also, what you seem to be talking about is "binning", which is done before the die even reaches a product, at the fab level, not the production level. How else could they choose to put binned chips into laptops?

You're saying Foxconn does not do any sort of quality assurance testing on assembled units? What is the point of the random sample testing then? I assume there's a big difference between TSMC testing the SoCs in isolation versus Foxconn testing them inside the assembled unit, as you have to account for cooling quality when assembled.

And I'm assuming this is the type of QA the random sampling is for, as every unit should fit the minimum thresholds perfectly if they were tested at TSMC, but in the small chance that a unit does not meet the criteria they would need to report that to Nintendo I imagine.

All of this is guesswork in my head based on how I believe Foxconn assembly works, so I'm certainly open to being corrected about it. I just don't see how Foxconn wouldn't be doing their own QA on the fully assembled units which includes ensuring the SoC works as it should.
 
He clarified that he saw the output was 1080p, which is why he assumed that was the screen resolution. The updated translation is here: http://www.neogaf.com/forum/showpost.php?p=229879660&postcount=836

We don't know how he got the clock speed info, or whether or not 16nm is his assumption (which it seems to be), but he does specify later in his post that he looked at the clock speeds again and initially reported them incorrectly (1750MHz->1785MHz, 912MHz->921MHz). So it would seem that he got a look at clock speeds, whether on a readout or spec sheet, but this doesn't tell us if those are the speeds that the retail product will be maxed at.



If it's true that 1.02GHz and 768MHz is what's available to developers, then it would make a ton of sense that he didn't see those numbers. The numbers he saw likely only have to do with the maximum clock speeds the unit will be run at, and we don't know for sure that it was run at those speeds for 8 days. So maybe Nintendo wants a large amount of overhead for a possible clock speed increase in the future, and they are throwing out any unit which can't reach 1.78GHz and 921MHz comfortably.
One thing that makes me believe these numbers have some legitimacy is that there is a relationship between the Foxconn and Eurogamer numbers. The GPU clock frequencies are multiples of 76.8 MHz (the X1's system clock), and the CPU is exactly 75% higher for the Foxconn numbers. That does seem like a crazy coincidence.
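
For what it's worth, the arithmetic behind that relationship is easy to check (a quick sketch using the clock figures from the two leaks; treating 76.8 MHz as the TX1 reference clock is the assumption made above):

```python
# Sanity check of the relationship between the leaked clocks.
base = 76.8                                  # MHz, TX1 reference clock assumed above

foxconn_gpu, eurogamer_gpu = 921.6, 768.0    # MHz (the leak's "921" rounds 921.6)
foxconn_cpu, eurogamer_cpu = 1785.0, 1020.0  # MHz

print(foxconn_gpu / base, eurogamer_gpu / base)  # 12.0, 10.0 -> both clean multiples
print(foxconn_cpu / eurogamer_cpu)               # 1.75 -> exactly 75% higher
```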
 
You're saying Foxconn does not do any sort of quality assurance testing on assembled units? What is the point of the random sample testing then? I assume there's a big difference between TSMC testing the SoCs in isolation versus Foxconn testing them inside the assembled unit, as you have to account for cooling quality when assembled.

And I'm assuming this is the type of QA the random sampling is for, as every unit should fit the minimum thresholds perfectly if they were tested at TSMC, but in the small chance that a unit does not meet the criteria they would need to report that to Nintendo I imagine.

All of this is guesswork in my head based on how I believe Foxconn assembly works, so I'm certainly open to being corrected about it. I just don't see how Foxconn wouldn't be doing their own QA on the fully assembled units which includes ensuring the SoC works as it should.

They do random sampling to check that the device runs at normal parameters. The SoC is tested at TSMC in a process people know as "binning"; from what I understand, this is just to check that the SoC works inside parameters before it is shipped off to be assembled.

The random sampling has nothing to do with outrageously high clock testing; that isn't the job they have to do there. All Foxconn's testing would be for at this point is to make sure the device runs at the final clock it is expected to run at. Now, this could mean that the clock isn't for launch but for later down the road, but this isn't an easy out for Eurogamer's clocks, because they wouldn't be possible on the same node.

Basically, if Foxconn workers are testing those clocks, the device has to be 16nm, and if it's 16nm, Eurogamer's clocks would not drain the battery in 3 hours and couldn't produce the heat we see in those pictures.
 
The whole 20nm thing comes from the rumor about Nvidia having to honor a contract with TSMC for Shield TVs, but that would be in the hundreds of thousands of units at MOST; we are talking 10 million Switch units predicted for year 1 alone. Now people have stuck with it, but 20nm is dead, ffs.

Wasn't this picked up from an earlier NX thread, from a post by Thraktor or similar? At least something was just copied from those threads and turned into news.
 
One thing that makes me believe these numbers have some legitimacy is that there is a relationship between the Foxconn and Eurogamer numbers. The GPU clock frequencies are multiples of 76.8 MHz (the X1's system clock), and the CPU is exactly 75% higher for the Foxconn numbers. That does seem like a crazy coincidence.

I don't have any doubt that the clock numbers are legitimate either, I just don't know if we can reasonably assume these will (even eventually) be the clocks available to developers based on what the leaker has said.

Like I said in the other thread, if the power consumption numbers for a 20nm unit with the EG clocks match up almost exactly with the power consumption numbers for a 16nm unit with the Foxconn clocks, that would be strong evidence in my mind that Nintendo did raise the clock speeds available to developers, as you could come up with a good reason why they did (20nm to 16nm afforded them a lot more performance at the same power/battery life). But we only have rough calculations so far on that, though they are promising.
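
Very roughly, that idea can be sanity-checked like this (a sketch only; the assumption that dynamic power scales about linearly with clock over this range, and the ~50-60% power figure for TSMC 16FF+ versus 20nm at the same clock, are ballpark vendor-style numbers, not anything from the leaks):

```python
# Rough sanity check: 16nm at the Foxconn CPU clock vs 20nm at the Eurogamer clock.
# Assumptions (not from the leaks): power roughly linear in frequency over this range,
# and 16FF+ drawing roughly 50-60% of 20nm power at the same clock.
eg_cpu, foxconn_cpu = 1020.0, 1785.0   # MHz
for node_scaling in (0.5, 0.6):
    relative_power = (foxconn_cpu / eg_cpu) * node_scaling
    print(f"16nm @ 1785 MHz vs 20nm @ 1020 MHz: {relative_power:.2f}x")
# ~0.9x-1.05x, i.e. roughly the same power draw, which is the scenario described above
```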
 
Because that's the only example of a multiplat game that isn't DQ Heroes which we can compare between the two consoles. The fact that it's near PS4 levels (or over 50% of those levels) a month after it was first ported over is incredibly impressive for the Switch. If it's struggling to maintain 60fps on the PS4, then there's obviously some sort of bottleneck, be it GPU or CPU, and it's therefore maxing the hardware in some way.

Are people forgetting Disgaea 5? It was PS4 only and has none of the issues of DQ Heroes
 
One thing that makes me believe these numbers have some legitimacy is that there is a relationship between the Foxconn and Eurogamer numbers. The GPU clock frequencies are multiples of 76.8 MHz (the X1's system clock), and the CPU is exactly 75% higher for the Foxconn numbers. That does seem like a crazy coincidence.

(1750MHz->1785MHz, 912MHz->921MHz)


Did he revise them for the appropriate multiples before or after the Eurogamer leak?
 
Did he revise them for the appropriate multiples before or after the Eurogamer leak?

Weeks before. The entire Foxconn leak was weeks before Eurogamer's, and bonus, it was newer information than Eurogamer's, as Eurogamer's came from a few months prior, with no desire to publish it until the VentureBeat article confirmed the architecture for them.
 
Weeks before. The entire Foxconn leak was weeks before Eurogamer's, and bonus, it was newer information than Eurogamer's, as Eurogamer's came from a few months prior, with no desire to publish it until the VentureBeat article confirmed the architecture for them.

I see, thanks. That would be a pretty big coincidence.

I hope someone like Hector Marcan can settle all this for us at launch, as with the Wii U, for which there were also multiple clock speed rumours floating around before launch.
 
I see, thanks. That would be a pretty big coincidence.

I hope someone like Hector Marcan can settle all this for us at launch, as with the Wii U, for which there were also multiple clock speed rumours floating around before launch.

It would be nice. It's hard to ignore the Foxconn clocks, but there is a chance that they are for an update down the road; the problem there is that the Eurogamer clocks aren't possible with what we know, if that is the case.

Eurogamer's clocks absolutely had to be real though; there's too much of a connection with the Foxconn ones. So it isn't really proving Eurogamer wrong; it's more like their Pokémon rumor, where things were changed after their information came out. They just sat on the clocks for too long here, possibly.
 
Are people forgetting Disgaea 5? It was PS4 only and has none of the issues of DQ Heroes

Actually yes I did forget that! Is there any comparison between versions for that?

The other reason I think Snake Pass is a very good point of comparison is that it's an upcoming title for both platforms, rather than a title which has been out for one console and is being ported to another like DQH and Disgaea.
 
There is no way the Foxconn leak is false. Zero chance. Way too many things someone couldn't know, given how it was discovered in the speculation thread. But one of the major things we learned was about the thermal throttling the X1 does, which lined up with the Eurogamer clocks. So I don't think there is any chance those clocks in the Foxconn leak could be available to devs. But if the chip is on a 16nm process, then those speculated 20% clock increases would be possible, from my understanding. I wouldn't count on it, but it would be possible.
 
There is no way the Foxconn leak is false. Zero chance. Way too many things someone couldn't know, given how it was discovered in the speculation thread. But one of the major things we learned was about the thermal throttling the X1 does, which lined up with the Eurogamer clocks. So I don't think there is any chance those clocks in the Foxconn leak could be available to devs. But if the chip is on a 16nm process, then those speculated 20% clock increases would be possible, from my understanding. I wouldn't count on it, but it would be possible.

There is a distinction to be made between the outward features (that is, things he can physically see) and things like clock speeds, which require a different method of analysis. I think the Foxconn leak is plausible, but we can't directly say that his clock speeds must be true just because he has physical features correct. Still, it is definitely possible he had access to (someone who had access to) these benchmarks and tests he mentioned, so I definitely do not dismiss his clock speeds and stuff. At any rate we will know this within two weeks most likely, so we just need a little patience.
 
There is a distinction to be made between the outward features (that is, things he can physically see) and things like clock speeds, which require a different method of analysis. I think the Foxconn leak is plausible, but we can't directly say that his clock speeds must be true just because he has physical features correct. Still, it is definitely possible he had access to (someone who had access to) these benchmarks and tests he mentioned, so I definitely do not dismiss his clock speeds and stuff. At any rate we will know this within two weeks most likely, so we just need a little patience.

I'm not implying the Foxconn leak is going to be retail specs at all. I'm saying the opposite. I'm just saying those clocks are definitely not made up. It almost certainly is a benchmark and not available to devs. Because of what a gaffer discovered about the Shield TV's thermal throttling in another thread, it would be impossible to get those clocks in a retail unit without a drastically improved cooling method.

I do think speculation about a 20% clock increase is possible if the process is 16nm though. (Possible, I am not betting on it)
 
There is a distinction to be made between the outward features (that is, things he can physically see) and things like clock speeds, which require a different method of analysis. I think the Foxconn leak is plausible, but we can't directly say that his clock speeds must be true just because he has physical features correct. Still, it is definitely possible he had access to (someone who had access to) these benchmarks and tests he mentioned, so I definitely do not dismiss his clock speeds and stuff. At any rate we will know this within two weeks most likely, so we just need a little patience.

Exactly this. No need to claim these are real clocks; we will know soon enough. It's just speculation about a very substantial leak.
 
I was the gaffer that was doing the Shield TV tests hah. You can check back on those findings I made if you check my post history around just before the January 13th Switch presentation.

I managed to go to one of the Switch events to get a feel of the Switch. Using unfortunately unscientific methods (relying on my memory and my hands), it felt around the same temperature in my hands, with about the same heat coming out of the top vent (not blowing air, just felt rising heat), as my Shield TV; this was when I was playing Mario Kart 8 in handheld mode. The Switches had been up and running for at least a good hour from when the event started and were played constantly during that time.

Make of that what you will!
 
I wonder when DF will do a teardown of the Switch. We have a preview embargo lifting tomorrow for the Switch, and a review embargo lifting on March 2nd for games. I wonder where that fits with disassembling the Switch.
 
I was the gaffer that was doing the Shield TV tests hah. You can check back on those findings I made if you check my post history around just before the January 13th Switch presentation.

I managed to go to one of the Switch events to get a feel of the Switch. Using unfortunately unscientific methods (relying on my memory and my hands), it felt around the same temperature in my hands, with about the same heat coming out of the top vent (not blowing air, just felt rising heat), as my Shield TV; this was when I was playing Mario Kart 8 in handheld mode. The Switches had been up and running for at least a good hour from when the event started and were played constantly during that time.

Make of that what you will!

Yeah, from the IR picture we could tell they produce roughly the same heat, so it's likely targeting the same temperatures while running. Doesn't get us much further towards the truth, but it does help with the idea that it is a custom chip based on the X1.

It also tells us that it can't be Eurogamer's clocks on 16nm or Foxconn's clocks on 20nm with the limited space for cooling, if it's still supposed to upclock when docked.

Thanks for your hard work BTW.
 
But the Shield TV runs at higher clocks (than the Switch in portable mode) and with a bigger heatsink, which means that the Switch in portable mode reaches the same temperatures as the Shield TV does, and thus in docked mode the fan will rev up and potentially be louder than the Shield TV. (Which is silent anyway.)
Judging by the size of them both, this seems reasonable, but this only tells us that the energy threshold is about the same, and we already expected that.
 
But the Shield TV runs at higher clocks (than the Switch in portable mode) and with a bigger heatsink, which means that the Switch in portable mode reaches the same temperatures as the Shield TV does, and thus in docked mode the fan will rev up and potentially be louder than the Shield TV. (Which is silent anyway.)
Judging by the size of them both, this seems reasonable, but this only tells us that the energy threshold is about the same, and we already expected that.

And the last sentence you wrote tells us that it can't be Eurogamer's clocks on 16nm, because that draws under 1.5 W for the SoC and couldn't have a heat profile like that with the active cooling. Foxconn's clocks are also pretty much impossible on 20nm, because the battery would drain in just over an hour.
 
I was the gaffer that was doing the Shield TV tests hah. You can check back on those findings I made if you check my post history around just before the January 13th Switch presentation.

I managed to go to one of the Switch events to get a feel of the Switch. Using unfortunately unscientific methods (relying on my memory and my hands), it felt around the same temperature in my hands, with about the same heat coming out of the top vent (not blowing air, just felt rising heat), as my Shield TV; this was when I was playing Mario Kart 8 in handheld mode. The Switches had been up and running for at least a good hour from when the event started and were played constantly during that time.

Make of that what you will!
Right, I forgot your nick; edited my last post to credit you. Thanks again for the testing.

Has there been any confirmation about what the screen is made of? Glass would be preferable
Some people who tried the console said it's glass; there are also tempered glass protectors for it.
 
While I get your point (and nobody in this thread has been arguing for indiscriminate use of fp16 in contemporary shaders - on the contrary)

That was, in fact, the exact point I was replying to from the poster: "100% of pixel work can be done in fp16". Sweeping generalities in these enthusiast discussions are painful to witness. I'm not sure anyone in this thread has even attempted to understand the long pole on performance due to memory latency in the shader microcode, launch rate, etc., none of which will be helped by fp16. But even that's a generalization, because we have that situation already, which is why we sometimes fold passes because we have idle ALU cycles, etc.

what you give as a negative example is perfectly doable in a carefully devised integration test scenario, where a shader running entirely at precision X is taken, and the individual computational statements in it are demoted one by one to precision Y and passed through an automated test harness that mimics the use-cases of the shader and compared for, say, MSE vs reference results. It's not exactly rocket science (yes, I've used such a pipeline for fp64 to fp32).

Sure, you could do this (probably in the RDC DXBC interpreter to start with, although that will expose you to problems where architectural implementations differ from emulated results), but no one in the GPU space is doing so as part of normal development workflow. We have had discussions with various parties about annotations to allow us better expressivity about required ranges of values, but it's a very tricky thing to do, and expensive to do every time you compile, etc. Instead it's the usual: hunt for banding, blow-outs, etc., post-mortem analysis. To re-use your phrase, it's not rocket science, but it is work.
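
To illustrate the kind of demote-and-compare harness described in the quote above (a toy numpy sketch, nowhere near a real shader-microcode pipeline; the shading function and the idea of a pass/fail tolerance are invented for the example):

```python
# Toy version of the precision-demotion test: run a stage in fp32, redo it in
# fp16, and compare MSE against the fp32 reference. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
albedo = rng.random((256, 256, 3), dtype=np.float32)
light = rng.random((256, 256, 3), dtype=np.float32)

def shade(albedo, light, dtype):
    # stand-in for one computational statement whose precision we vary
    a = albedo.astype(dtype)
    l = light.astype(dtype)
    return (a * l + 0.1 * a).astype(np.float32)

reference = shade(albedo, light, np.float32)
demoted = shade(albedo, light, np.float16)
mse = float(np.mean((reference - demoted) ** 2))
print(f"MSE vs fp32 reference: {mse:.3e}")   # compare against a chosen tolerance
```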

Architectural peculiarities are just that - architectural peculiarities.

Sure, but they are the bread and butter of a console GPU programmer.
 
I wonder when DF will do a teardown of the Switch. We have a preview embargo lifting tomorrow for the Switch, and a review embargo lifting on March 2nd for games. I wonder where that fits with disassembling the Switch.
It's really weird that Nintendo didn't make an OS Direct and is leaving it up to the press to show it off. I assume that goes under "preview". I would also assume a teardown goes under "preview", but I have no idea.
 
It's really weird that Nintendo didn't make an OS Direct and is leaving it up to the press to show it off. I assume that goes under "preview". I would also assume a teardown goes under "preview", but I have no idea.
I imagine the preview embargo is pretty strict, to be honest. OS maybe, but possibly not. Probably more hardware-based.
 
The embargo thread says 23rd of Feb is when the embargo lifts on all but game reviews.

DF tear-down in four hours (to midnight)? #wishfulthinking
 