We haven't seen more recent Wii U shots, but it's very likely that it looks better. The viewing distance seems much improved, performance is way better, and so is the resolution; the only thing of "concern" is some lighting issues, but that might be a general change or a build thing.
Same HDMI output, but more hardware power behind it and same or better resolutions used, so it should.
We already know Mario Kart 8 Deluxe is 1080p @60fps, so...
Standard specs. Not necessarily available to developers. Nintendo themselves make this distinction in their development docs.
Because it matters whether Nintendo treats it as the 3DS successor. That would mean it's going to get an amazing stream of games.
As a home console, it has less chance of getting third parties due to the power gap.
The standard specs are 2 GHz for the CPU and 1 GHz for the GPU, and the actual speed available to devs was TBD in the right column of the leaked document. At this point, why would they test 1.78 GHz and 921 MHz clocks for 8 days? Why is there no mention of 1.02 GHz and 768 MHz in the Foxconn leak? The guy didn't pull those numbers from some document; they were on the screen during the test.
I'm trying to understand, because if they kept A57 cores there's no way that clock is being used.
That's assuming 14nm, and we don't know if that's the case. Also, A72 would still be a better choice (space taken on die, heat, power consumption, performance per watt).

Actually, A57 is possible at that clock. The battery capacity gives ~6.4 W for the whole device, and A57 on Samsung's 14nm process (this is done on a similar TSMC 16nm process) draws ~3.6 W at 1.78 GHz across 4 cores. The best estimate we have for the GPU at 384 MHz is around 0.4 W, so if the rest of the system can sit at roughly 2.4 W or below, it is possible to push that clock on A57.
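For reference, the ~6.4 W figure falls out of simple arithmetic. A back-of-the-envelope sketch, assuming the commonly reported ~16 Wh battery (4310 mAh at 3.7 V) and Nintendo's stated ~2.5 h worst-case battery life; the per-component draws are the estimates from the post above, not measurements:

```python
# Back-of-the-envelope power budget for handheld mode.
# Assumptions: ~16 Wh battery, ~2.5 h worst-case battery life;
# component draws are the rough estimates quoted in the post.

battery_wh = 16.0          # 4310 mAh * 3.7 V ~= 15.9 Wh, rounded
min_runtime_h = 2.5        # stated worst-case battery life

device_budget_w = battery_wh / min_runtime_h
print(f"Total device budget: {device_budget_w:.1f} W")   # ~6.4 W

cpu_w = 3.6                # 4x A57 @ 1.78 GHz on 14/16nm (estimate)
gpu_w = 0.4                # GPU @ 384 MHz (estimate)

rest_of_system_w = device_budget_w - cpu_w - gpu_w
print(f"Left for screen, RAM, storage, radios: {rest_of_system_w:.1f} W")  # ~2.4 W
```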
The updated translation actually never connects the clock speeds (which he calls "standard specs") to the test demo, so we don't know for sure if the test unit was running at those speeds.
But where did he read those clocks then? I doubt he had documents, and even if that was the case, I doubt they would show different max theoretical clocks for the chip itself than the ones in the leaked document.
I don't understand why people are so sceptical about the Foxconn leak. It got absolutely everything right except some assumptions ("The screen looks good, it must be 1080p"), and it got many more details than even Eurogamer did.
Then Eurogamer releases some admittedly old information about clock speeds, in the same context as "October devkits are more powerful than the July ones".
I think it makes a lot of sense that older devkits were using A57s and Maxwell cores at 20nm and the old frequencies, and they just shrank the node and changed the CPU to A72s at the same frequencies, with the same RAM size (4GB) and bandwidth.
And even if the machine has a full core or part of it dedicated to the OS, it still has throughput close to that of the PS4, more single-threaded performance, and a decent GPU that can run its games with reduced graphics.
There's the rub though, if he got resolution wrong because he was only looking at a physical device, how would he know core type?
I'm not skeptical that he held the physical device. Just more, how much would a Foxconn assembly worker know about the internals of a device under NDA?
We get iPhone leaks nearly 100% of the time nowadays, but we don't know the uArch details until someone delids one and goes in with a microscope, or writes custom code to measure, and that's well after launch.
Why is there no mention of 1.02 GHz and 768 MHz in the Foxconn leak? Maybe the guy doesn't have access to that info if those are still the "real" clocks? Maybe those clocks are no longer valid? Who knows.
The updated translation really paints a slightly different picture than the first one, with more things being speculation and hearsay, and it makes it sound like the guy gathered bits and pieces from his own experience with the Switch but also from colleagues at the factory.
Yes. If it cannot do what a developer is trying to make it do at the speed it's supposed to do, something is being maxed out. That doesn't imply something a thousand times better couldn't be done by people with more talent.

So games like Mighty No. 9 and Broforce are also maxing out the hardware, since they also struggle to maintain 60 fps? I didn't realise the PS4 was so weak...
Except it's not, and the PS4 struggling to maintain 60 fps in Snake Pass likely has nothing to do with Sumo maxing out the PS4; it's just them not optimising correctly.
Well that's kinda the point, we don't really know anymore where he got that info. Originally since it was close to the part where he talks about the 8 day test, it must have been translated in a way that suggested he saw those clocks during that test. Now though, there's absolutely no connection.
He makes other comments about hearing various things from people at the plant, maybe supervisors' offices or something like that, so it could be possible that he saw that on a spec sheet. And that could be the final operating clocks on that sheet or the maximum theoretical clocks. It doesn't make much sense that the max theoretical clocks would be lower than the TX1 max though.
The main problem is that we just don't know what the context of these clocks is anymore. It definitely seems possible that he saw them on a readout during this demo, but it's hard to be sure of that anymore.
One thought I had is that maybe every unit is tested to make sure they can reach those clocks in the space/thermals allotted once assembled, which is why they need those clocks on a spec sheet? So that they can toss out any units which don't reach the prescribed levels?
I seem to remember Switch was widely rumored and expected to be released in Fall 2016, and it was at one point delayed to March 2017 for unknown reasons.
Maybe updating the SoC was one of those reasons and therefore the clocks Eurogamer got were indeed intended to be final, and they are just outdated?
We don't know how he got the clock speed info, or whether or not 16nm is his assumption (which it seems to be), but he does specify later in his post that he looked at the clock speeds again and initially reported them incorrectly (1750 MHz → 1785 MHz, 912 MHz → 921 MHz). So it would seem that he got a look at the clock speeds, whether on a readout or a spec sheet, but this doesn't tell us if those are the speeds that the retail product will be maxed at.
They wouldn't be doing that test during full production, and it wouldn't be done at Foxconn but at TSMC. The truth is, it doesn't matter where he saw the clocks: he reported those clocks, and they can't be maximum theoretical clocks, because the chip is based on the X1 and the base clock didn't change, so the maximum clocks the X1 had would still be possible on the Switch SoC. The "standard spec" (which is a translation and thus not an indication of much) is the clocks he saw/heard in relation to the full production unit, and we know it was a full production unit because he indicates they were making 20k of these a day. You can't test every one; you do random sampling, which likely has temperature readouts and possibly clock readouts so that the testers know the chip is operating normally. That's not for stress-testing a theoretical clock, it's for testing the clocks the chip runs at in the wild. (That is the only purpose of random testing: checking the quality of the production. Collecting a large sample allows you to have confidence that all the units work correctly.)
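On the sampling point, the statistics behind that confidence are simple. Here's a minimal sketch of the standard zero-failure bound (sometimes called the "rule of three"); the sample sizes are illustrative, nothing here is from the leak:

```python
def max_defect_rate(sample_size: int, confidence: float = 0.95) -> float:
    """Largest defect rate still consistent with zero failures in the sample.

    If every one of n randomly sampled units passes, then with the given
    confidence the true defect rate p satisfies (1 - p)^n >= 1 - confidence,
    i.e. p <= 1 - (1 - confidence)**(1/n).
    """
    return 1.0 - (1.0 - confidence) ** (1.0 / sample_size)

# Illustrative: sampling a few hundred units out of 20k made per day.
for n in (100, 300, 1000):
    print(f"n={n}: defect rate < {max_defect_rate(n):.3%} at 95% confidence")
```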
Well, I always thought the Foxconn worker wouldn't be just anyone, but someone relatively high in the ranks, a manager or whatever, probably somebody in charge of QA, where they test the workings of the internals and therefore see that kind of test. He wouldn't have to know the panel type or resolution, and the translation says that the process node and the console having A73s were educated guesses by the worker based on the clocks displayed during the test.
As I was saying above, it might be the case that the random sampling was indeed where he saw the clock speeds, or potentially he saw them on a sheet listing a sort of minimum tolerance threshold (meaning, if a unit cannot reach these speeds without exceeding X degrees C, it's tossed out). But this doesn't mean those speeds will be available to developers.
I would imagine almost all consoles will have a comfortable overhead where the SoC is capable of reaching clocks higher than those that are available for games, just for quality and reliability purposes. If indeed the testing was done for 8 days at those clocks, then it would paint a different picture, but we can't be sure of that based on the new translation.
The way I understood the leak was: the worker performed a visual examination of the console, then tested the SoC at, presumably, the target frequency, and that's it.

We can completely throw out those clocks being possible at 20nm. I believe the A57 is 8 W by itself at that clock, and the GPU would put it close to 10 W, giving the device just over 1 hour of battery life. A72 was around 5 W on 20nm, IIRC? That still wouldn't match the battery life we get from final hardware.
It was certainly someone who had access to the random sample testing that would be done at this phase, and like I said, temperature readouts are important for such a test, because you need to know if there is a drastic difference in temperatures; where that is checked, clocks are usually part of the toolkit as well.
They don't do design or QA testing with random samples; the clocks during those tests are the clocks that would be out in the wild, so if the clocks are real, they point to final clocks. That you don't do stress tests on final products during full production is the entire bit of new info we have from someone with knowledge of the process over at Beyond3D. I think it makes perfect sense too, because you've already done those tests, and probably at TSMC, not Foxconn.
Also, what you seem to be talking about is "binning", which is done before the die even reaches a product, at the fab level, not the production level. How else could they choose to put binned chips into laptops?
One thing that makes me believe these numbers have some legitimacy is that there is a relationship between Foxconn's and Eurogamer's numbers. The GPU clock frequencies are multiples of 76.8 MHz (the X1's reference clock), and the Foxconn CPU clock is exactly 75% higher than Eurogamer's. That seems like a crazy coincidence.

He clarified that he saw the output was 1080p, which is why he assumed that was the screen resolution. The updated translation is here: http://www.neogaf.com/forum/showpost.php?p=229879660&postcount=836
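The arithmetic is easy to check. A quick sketch, where 76.8 MHz is the X1's reference clock and the other figures are the Foxconn and Eurogamer numbers as reported:

```python
# Cross-checking the Foxconn and Eurogamer numbers against the
# Tegra X1's 76.8 MHz reference clock.

ref_mhz = 76.8

foxconn_gpu = 921.0       # Foxconn leak (revised figure, ~921.6 exactly)
eurogamer_gpu = 768.0     # Eurogamer's docked GPU clock
foxconn_cpu = 1785.0      # Foxconn leak (revised figure)
eurogamer_cpu = 1020.0    # Eurogamer's CPU clock

print(foxconn_gpu / ref_mhz)        # ~12.0  (76.8 * 12 = 921.6)
print(eurogamer_gpu / ref_mhz)      # 10.0   (76.8 * 10 = 768)
print(foxconn_cpu / eurogamer_cpu)  # 1.75   -> exactly 75% higher
```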
If it's true that 1.02 GHz and 768 MHz is what's available to developers, then it would make a ton of sense that he didn't see those numbers. The numbers he saw likely only have to do with the maximum clock speeds the unit will be run at, and we don't know for sure that it was run at those speeds for 8 days. So maybe Nintendo wants a large amount of overhead for a possible clock speed increase in the future, and they're throwing out any unit which can't reach 1.78 GHz and 921 MHz comfortably.
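Purely to illustrate that hypothesis, here's a sketch of what such a screening gate could look like; every threshold and the pass criterion are invented for the example, nothing here is from the leak:

```python
# Hypothetical per-unit QA gate: reject any assembled unit that cannot
# sustain the target clocks without exceeding a thermal limit.
# All thresholds below are made up for illustration.

from dataclasses import dataclass

@dataclass
class StressResult:
    sustained_cpu_mhz: float
    sustained_gpu_mhz: float
    peak_temp_c: float

def passes_qa(r: StressResult,
              cpu_target: float = 1785.0,
              gpu_target: float = 921.0,
              temp_limit_c: float = 80.0) -> bool:
    """Unit passes only if it holds the target clocks under the thermal limit."""
    return (r.sustained_cpu_mhz >= cpu_target
            and r.sustained_gpu_mhz >= gpu_target
            and r.peak_temp_c <= temp_limit_c)

print(passes_qa(StressResult(1785.0, 921.6, 71.0)))  # True  -> ship it
print(passes_qa(StressResult(1785.0, 921.6, 86.0)))  # False -> toss it
```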
You're saying Foxconn does not do any sort of quality assurance testing on assembled units? What is the point of the random sample testing then? I assume there's a big difference between TSMC testing the SoCs in isolation versus Foxconn testing them inside the assembled unit, as you have to account for cooling quality when assembled.
And I'm assuming this is the type of QA the random sampling is for, as every unit should fit the minimum thresholds perfectly if they were tested at TSMC, but in the small chance that a unit does not meet the criteria they would need to report that to Nintendo I imagine.
All of this is guesswork in my head based on how I believe Foxconn assembly works, so I'm certainly open to being corrected about it. I just don't see how Foxconn wouldn't be doing their own QA on the fully assembled units which includes ensuring the SoC works as it should.
The whole 20nm thing comes from the rumor about Nvidia having to honor a contract with TSMC for Shield TVs, but that would be hundreds of thousands of units at MOST, and we're talking 10 million Switch units predicted for year one alone. People have just stuck with it, but 20nm is dead, ffs.
Because that's the only example of a multiplat game, other than DQ Heroes, that we can compare between the two consoles. The fact that it's near PS4 levels (or over 50% of those levels) a month after it was first ported is incredibly impressive for the Switch. If it's struggling to maintain 60fps on the PS4, then there's obviously some sort of bottleneck, be it GPU or CPU, and it's therefore maxing the hardware in some way.
(1750 MHz → 1785 MHz, 912 MHz → 921 MHz)
Did he revise them for the appropriate multiples before or after the Eurogamer leak?
Weeks before. The entire Foxconn leak was weeks before Eurogamer's, and bonus: it was newer information than Eurogamer's, as Eurogamer's came from a few months prior, with no desire to publish until the VentureBeat article confirmed the architecture for them.
I see, thanks. That would be a pretty big coincidence.
I hope someone like Hector Martin (marcan) can finalize all this for us at launch, as with the Wii U, where there were also multiple clock speed rumours floating around before launch.
Are people forgetting Disgaea 5? It was PS4-only and has none of the issues of DQ Heroes.
There is no way the Foxconn leak is false. Zero chance. Way too many things someone couldn't know, given how it was discovered in the speculation thread. But one of the major things we learned was about the thermal throttling the X1 does, which lined up with the Eurogamer clocks. So I don't think there is any chance those clocks in the Foxconn leak could be available to devs. But if the chip is on a 16nm process, then those speculated 20% clock increases would be possible, from my understanding. I wouldn't count on it, but it would be possible.
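For reference, here's what a 20% bump over Eurogamer's reported clocks would come out to; this is pure speculative arithmetic, not a confirmed configuration:

```python
# Speculated ~20% uplift over Eurogamer's reported clocks (speculation only).
eurogamer_mhz = {
    "cpu": 1020.0,
    "gpu_docked": 768.0,
    "gpu_portable": 307.2,
}

for name, mhz in eurogamer_mhz.items():
    print(f"{name}: {mhz} -> {mhz * 1.2:.1f} MHz")
# Note: 768 * 1.2 = 921.6, which happens to match the Foxconn GPU figure.
```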
There is a distinction to be made between the outward features (that is, things he can physically see) and things like clock speeds, which require a different method of analysis. I think the Foxconn leak is plausible, but we can't directly say that his clock speeds must be true just because he has physical features correct. Still, it is definitely possible he had access to (someone who had access to) these benchmarks and tests he mentioned, so I definitely do not dismiss his clock speeds and stuff. At any rate we will know this within two weeks most likely, so we just need a little patience.
I was the gaffer that was doing the Shield TV tests hah. You can check back on those findings I made if you check my post history around just before the January 13th Switch presentation.
I managed to go to one of the Switch events to get a feel for the Switch. Using unfortunately unscientific methods, relying on my memory and my hands, it felt around the same temperature in my hands, with about the same heat coming out of the top vent (not blowing air, just felt rising heat), as my Shield TV; this was while I was playing Mario Kart 8 in handheld mode. The Switches had been up and running at least a good hour and played constantly during that time after the event started.
Make of that what you will!
But Shield TV runs at higher clocks (than Switch in portable mode) and with a bigger heatsink, which means that if Switch in portable mode already reaches the same temperatures as Shield TV does, then in docked mode the fan will rev up and potentially be louder than Shield TV's (which is silent anyway).
Judging by the size of them both, this seems reasonable, but this only tells us that the energy threshold is about the same and we already expected that.
Right, I forgot your nick; edited my last post to credit you. Thanks again for the testing.
Has there been any confirmation about what the screen is made of? Glass would be preferable.

Some people who tried the console said it's glass, and there are also tempered glass protectors for it.
While I get your point (and nobody in this thread has been arguing for indiscriminate use of fp16 in contemporary shaders; on the contrary),
what you give as a negative example is perfectly doable in a carefully devised integration-test scenario, where a shader running entirely at precision X is taken, the individual computational statements in it are demoted one by one to precision Y, and each variant is passed through an automated test harness that mimics the use cases of the shader and is compared for, say, MSE vs reference results. It's not exactly rocket science (yes, I've used such a pipeline for fp64 to fp32).
Architectural peculiarities are just that - architectural peculiarities.
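A minimal numpy sketch of that kind of demote-and-compare harness; the "shader" here is a toy stand-in function and the statement boundaries are arbitrary, since a real pipeline would compile and run actual shader variants, but the demote-one-statement-and-measure-MSE loop is the same idea:

```python
import numpy as np

def shader(x, precisions):
    """Toy stand-in for a shader: each 'statement' outputs at its own precision."""
    a = (x * 1.7).astype(precisions[0])                      # statement 0
    b = np.sin(a.astype(np.float32)).astype(precisions[1])   # statement 1
    c = (b * b + 0.25).astype(precisions[2])                 # statement 2
    return c.astype(np.float32)

# Inputs mimicking the shader's use cases.
x = np.random.rand(100_000).astype(np.float32)
reference = shader(x, [np.float32] * 3)   # full-precision reference result

# Demote statements to fp16 one at a time and compare MSE vs the reference.
for i in range(3):
    precisions = [np.float32] * 3
    precisions[i] = np.float16
    mse = np.mean((shader(x, precisions) - reference) ** 2)
    print(f"statement {i} at fp16: MSE = {mse:.3e}")
```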
I wonder when DF will do a teardown of the Switch. We have a preview embargo lifting tomorrow for the Switch, and a review embargo lifting on March 2nd for games. I wonder where disassembling the Switch fits with that.

It's really weird Nintendo didn't do an OS Direct, instead leaving it up to the press to show it off. I assume that goes under "preview". I would also assume a teardown goes under "preview", but I have no idea.
I imagine the preview embargo is pretty strict, to be honest. OS maybe, but possibly not. Probably more hardware-based.