...
Are you familiar with the term "throw shit at the wall and see what sticks"?
That is exactly what this guy is doing; he is no different from the other "leakers" on Twitter.
I just... don't get how you can find this guy believable, considering his hits were lucky guesses drawn from people's speculation about the NX hybrid from Eurogamer and Emily Rogers.
Are you not seeing the hundreds of other things he got wrong?
Seriously?
Guys, don't forget that the one who leaked it was Nishikawa Zenji, not Zelda Informer.
Indeed. Everything that panned out had already been leaked by Emily Rogers or NateDrake.
...
My issue here is that you're not actually responding to a poster and arguing that their expectation of CPU clock speeds is unrealistic, or providing a counter-argument about what you think is a realistic expectation. Instead, you're claiming that "Everyone is assuming that the CPU cores will run at maximum clock speed" without quoting or citing a single poster assuming as much, let alone providing evidence that every single person in this thread is making that assumption.
Nishikawa Zenji is a Japanese tech writer. He was the first to report, a few years back, that Sony would launch a stronger version of the PS4.
Having re-read it, you're right.
My apologies.
His original leak seemed to have a lot of either mistranslations or misunderstandings of the difference between FP32 and FP16. We also don't know if his report was a "leak" or his own speculation, though I'd guess the latter.
I'm still surprised people are grasping onto this FP16 thing. I'm aware UE4 uses FP16 extensively, but a lot of visual sacrifices were made to get it working. You also have to be very cheeky with your shader algos to make something that looks nice, which warrants more work than many devs are willing to put in.
No problem.

No, I apologise, because I was harsh on you.
It's just that after so many frauds got exposed when the Switch was revealed, I thought people would at least do more to question things when reading claims from a rumour, especially when that rumour comes from somewhere that isn't a reputable media outlet.
That isn't really the case though since a lot of people still follow those "leakers" on Twitter.
I just explained my reasoning for calling attention to the lack of consideration of power consumption in this thread. Earlier, you wrote that "There has been barely any discussion in this thread about CPU clock speeds", which supports the argument I made in the post you just responded to. (You also linked to posts of yours outside this thread. The reason for that still escapes me, since I am responding to the discussion in this thread.) My initial post was a direct response to a specific post citing the performance figures that have been widely accepted as reasonable in this thread (at least over the last 10-15 pages or so in which I participated), which was the entire basis of my argument, as I just explained.
Instead of engaging the argument I just made, you are distracting with a meta discussion about how I am supposedly not engaging other people's arguments. This is strange. Maybe you are reading "everyone" so hyperliterally that you take something personally that was never addressed to you, which would explain why you (quite irrelevantly) quoted posts of yours from another thread. Or maybe you perceive my posts as party pooping, which is not my intention. In any case, I don't think you are producing fair criticism here.
You're blowing that a bit out of proportion. It's not hard work for a lot of devs, because they would have had tons of experience with FP16 on PS3 and mobile. All that knowledge doesn't just go away. And devs are encouraged to use FP16 on PS4 Pro now, so it's going to be very relevant (again). It's not hard for devs to test FP16 to see which shaders and algos work or not; they literally only need to change one line of code to test. And in my research a good amount of shaders can use FP16 with little or no work at all and still produce the same result as FP32.
Any game on Switch that only uses FP32 is a poorly optimised one.
The work to get as many shaders and algos as possible onto FP16 is well worth it, since those then run at double the throughput.
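To make the "one line of code" point concrete, here is a minimal CUDA sketch (my own illustration with made-up kernel names, not actual shader or Switch code) of the same multiply-add in plain FP32 and in packed FP16, which is the path that gives Maxwell-class chips such as Tegra X1 their double-rate FP16 throughput:

```cuda
#include <cuda_fp16.h>

// Plain FP32: one multiply-add per thread per element.
__global__ void madd_fp32(const float* in, float* out, float k, float b, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = in[i] * k + b;
}

// Packed FP16: each __half2 register holds a pair of 16-bit values, and
// __hfma2 performs a fused multiply-add on both lanes in one instruction,
// which is where the "double the performance" headline figure comes from.
// Requires compute capability 5.3+ (Tegra X1 qualifies).
__global__ void madd_fp16(const __half2* in, __half2* out, __half2 k, __half2 b, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = __hfma2(in[i], k, b);  // two multiply-adds at once
}
```

The caveat from earlier in the thread is visible here too: the FP16 path only doubles throughput when the data packs cleanly into pairs and survives the lower precision, which is why some shaders need real restructuring rather than a one-line change.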
You know what? I'm sticking with Zlatan over on Anandtech on the bandwidth. He says it's about 1/3 of Xbox One (with Tegra X1 being about 1/8 of Xbox One). If we add Xbone's minimum 109 GB/s eSRAM + 68 GB/s DDR3, then the Switch bandwidth would work out to about 60 GB/s, i.e. 128-bit LPDDR4-3733. This spec makes more sense to me than Nintendo skimping on RAM bandwidth, of all things.
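Quick sanity check on that, arithmetic mine: (109 + 68) GB/s ÷ 3 ≈ 59 GB/s, and a 128-bit bus at 3733 MT/s moves 16 bytes per transfer, so 3733 × 16 ≈ 59.7 GB/s. The two estimates do line up.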
Do we have a solid idea of the likely performance relative to Wii U?
Is it possible that in handheld mode it'll be worse or similar to Wii U?
Why does that article talk about FP16 Tflops and compare them to FP32 Tflops of the Xbone and PS4? Bunch of garbage.
A 4-watt, 500 GFLOPS FP32 GPU would be incredible if that is what Nintendo manages to score from Nvidia.
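For anyone wondering where the doubling sneaks in (my own back-of-the-envelope, not from the article): a 500 GFLOPS FP32 GPU with double-rate FP16 advertises ~1 TFLOPS in FP16, while Xbox One's ~1.31 TFLOPS and PS4's ~1.84 TFLOPS are FP32 figures, so quoting the FP16 number against those makes the gap look roughly half as big as it really is.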
I agree that my post was a little exaggerated, lol. I could have expanded on my point more. My post was aimed mostly at those who expect a 512/768 GFLOPS device to suddenly act like a 1/1.4 TFLOPS device. FP16 has been a thing on Nvidia hardware since DX9 (many of the benefits were present even then), but for some reason a few people are treating it like the second coming all of a sudden. I also agree with your last statement and would like to add that any game which only uses FP32 is a poorly optimized one, lol.
My UE4 example was mostly referencing FP32 stuff that would not translate to FP16 well (obvious artifacts) without some major changes. Stuff like that is why I personally believe we won't see the NX suddenly have double the performance.
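A tiny, self-contained CUDA example of the precision side of that (my own illustration, not from anyone's leak): FP16 keeps only a 10-bit mantissa, so values drift once they get large, which is exactly the kind of "obvious artifact" that shows up when FP32 world-space or depth math is naively demoted to half precision.

```cuda
#include <cstdio>
#include <cuda_fp16.h>

int main()
{
    // FP16 has a 10-bit mantissa: above 2048 the representable step is 2,
    // above 4096 it is 4, so nearby FP32 values collapse to the same half.
    float vals[] = {1024.5f, 2049.0f, 4097.0f};
    for (float v : vals) {
        __half h = __float2half(v);  // round-to-nearest conversion
        printf("FP32 %.1f -> FP16 %.1f\n", v, __half2float(h));
    }
    return 0;
}
```

Compiled with nvcc, this prints 1024.0, 2048.0 and 4096.0: harmless for a colour value, but a visible crawl or seam if it's a vertex position.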
From Image and Form Games' Twitter: "Can't comment on specs, I'm afraid. But Nintendo are definitely not skimping on power!"
https://twitter.com/ImageForm/status/793482754767945728
Well that's nice to hear. Most likely it means that it performs very well as a portable device and fairly well as a home console, but I am sure some will soon be assuming it means 1.5 TFLOPS in fp32 when docked.
Not exactly a developer who uses any power though, are they?
They could still be familiar with the capabilities of the hardware, lol. It's not like devs who target more modest and manageable goals do so simply because they are ignorant of any other options.
Certainly not, but that doesn't mean they don't have the documentation and the dev kit itself to know how powerful it is.
edit: as you say, though, "not skimping on power" doesn't tell us anything.
Unless I'm missing something, this dude at GameFAQs seems to have accurately leaked loads of Switch stuff back in August, and claims that the SCD is alive and well.
Bullshit, or possibly true?
It will only have 4GB of RAM but "word has it" that with the NX use of cartridges instead of disc media, the RAMifications won't dent performance much at all.
I think someone analyzed the BOTW footage and found that the one clip with the giant Moblin was a constant 15 frames per second, which just implies that the video was slowed down 50% in post, as opposed to frame drops, which fluctuate. Dumb move on marketing's part, but I would take the performance in the trailer with a tremendous grain of salt.

Cheers everyone. That's reassuring.
I was a bit worried by the low framerate of the Zelda footage in the trailer. I know the footage was all added in post but...
If the machine is 2x Wii U in handheld mode then I'll be happy. I'm hoping docked mode is merely the difference between 720p and 1080p, with no difference in framerate. I'd hate for going mobile to mean a lower framerate.
"Not skimping on power" is such a subjective term though, could mean relative to the Wii U or for a portable device etc.
Debating the numbers is fine, but shouldn't we take one thing into consideration: not one negative report has come out yet about real-world performance. I mean, we have heard from multiple "in the know" people here on GAF who said PS4/Xbox One ports are not an issue. So I know we all want to know the numbers, but doesn't this give you guys some sort of hope, rather than "lower than 1 TF docked means Nintendo is doomed"?
Certainly, but regardless, it's the first Nintendo platform in more than a decade that's powerful and modern enough to run pretty much every single important piece of middleware out there. And that's not subjective, even if it doesn't really tell us all that much about raw performance. That said, I'd argue it's also more important than raw performance.
It's November; not long now until we get the full reveal in January, and then two months after that the Switch releases, with people doing a teardown of the Switch and the dock.

It's using an SoC, so a simple teardown won't tell us all that much. It might even use custom memory chips (like the 3DS) or a stacked design, and we won't be able to figure out the memory configuration on our own. Last time, we needed Chipworks to figure out what makes the thing tick, and a former GAF member had to ask them directly to give us access to their die shots.
"The controllers also attach vertically", lol. Only if there were some weird shell that adds a second pair of rails.
That's just Parker. Not a leak as much as already released information from Nvidia.
I will eat some serious crow if there are Denver cores in the Switch.
I think they are going to give us the actual specs this time.