Rumor: Wii U final specs

Optical drives are slow as hell, the numbers do actually tell the story (since it matches up with 5x here), and it gets worse when you factor in seek times. Optical drives average around 200ms, hard drives around 10ms, and SSDs around 0.1ms or less. That's the main reason SSDs feel fast, more so than the insane bandwidth of newer SSDs... and why optical drives feel slow as hell (not that the low bandwidth here will help either).
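To put rough numbers on the seek-time point, here's a minimal sketch. The seek latencies are the ones above; the sequential throughput figures are illustrative assumptions, not measurements:

```python
# Rough model: time per random read = seek latency + size / sequential throughput.
# Seek latencies (200ms optical, 10ms HDD, 0.1ms SSD) are from the post above;
# the throughput figures are illustrative assumptions.

def read_time_ms(seek_ms, throughput_mb_s, read_kb):
    """Milliseconds to service one random read of read_kb kilobytes."""
    transfer_ms = (read_kb / 1024) / throughput_mb_s * 1000
    return seek_ms + transfer_ms

for name, seek_ms, mb_s in [("optical", 200, 22.5), ("HDD", 10, 60), ("SSD", 0.1, 250)]:
    print(f"{name}: {read_time_ms(seek_ms, mb_s, 64):.2f} ms per 64 KB random read")
```

For small reads the seek term swamps the transfer term, which is why the drives "feel" the way they do regardless of headline bandwidth.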

The issue is that this probably will not be the case with the Wii U, just as it wasn't with the 360 (and even the PS3 in some cases), where the DVD drives just perform better than USB. The 360's DVD drive by itself outclasses any USB drive you plug into a 360 on a JTAG unit, on the vast majority of games. That's down to bandwidth/throughput on the metal, regardless of the theoretical speeds of USB drives. We have no way of knowing that dedicated info on the Wii U yet, nor how it will perform on, say, Rayman Legends on disc vs. installed to a USB drive. But if our revelations about Nintendo's choices in hardware and performance targets for the Wii U are any indication, I'm betting on faster disc drive speeds for most games, and possibly the same or marginally slower on USB. Call me a cynic... I'd love to be wrong, but I don't see the end results being worlds better than JTAG units.

And that still doesn't touch the rationale for Nintendo not including these options. I'd rather just spend an extra $50-$70 than be nickel-and-dimed for every component Nintendo chooses to exclude from the package, which costs them pennies vs. what it costs consumers. Including ethernet, optical out, the works.
 
Wouldn't that be up to developers to install the game if their game detects an external drive? I doubt Nintendo will block them, since they want people to use their own drives.

Completely forgot about that. Install to flash storage and just transfer to HDD to store. Should be a basic option for all games released.
 
Completely forgot about that. Install to flash storage and just transfer to HDD to store. Should be a basic option for all games released.

Or just install straight to an external USB HDD/USB flash drive/USB SSD. Again, I think developers will give you the option of installing and loading games from whatever storage option it detects, like on the 360.
 
Why? For what reason? Like was said earlier, USB 2.0 is already faster than the disc drives used in consoles, so...

Not by much. 10-15MB/s more ain't going to make a big difference, especially for large games which need texture streaming. The lack of USB 3.0 is a disappointment; I've elaborated on this many times. Search my posts, because I'm not going to repeat myself for the fifth time.
 
Not by much. 10-15MB/s more ain't going to make a big difference, especially for large games which need texture streaming. The lack of USB 3.0 is a disappointment; I've elaborated on this many times. Search my posts, because I'm not going to repeat myself for the fifth time.

Games made for Wii U will be made around the lowest common denominator, which will be the disc drive. Games running from other media will be a bit of a bonus, and won't matter in multiplayer games where you have to wait for everyone to load anyway.
 
Just about any device connected via USB to a 360 is faster than the DVD drive:

[image: Xbox 360 storage device load-time comparison chart]


http://www.eurogamer.net/articles/xbox-360-storage-update-the-flash-factor-article?page=4

It's only because of MS that you can't connect an SSD to the internal SATA port. The PS3 has that advantage.
http://www.eurogamer.net/videos/ps3-hard-drive-vs-ssd-comparison-video-rage

But the Wii U has no internal connector, so SSDs will be useless. That's why it's a big disappointment that USB 3.0 isn't there; it's just ridiculous. You'll wait 30 seconds minimum for big games to load.

Games made for Wii U will be made around the lowest common denominator, which will be the disc drive. Games running from other media will be a bit of a bonus, and won't matter in multiplayer games where you have to wait for everyone to load anyway.

What are you talking about? I don't see this making any sense, and it pretty much doesn't have anything to do with what I'm saying. I'm not speculating about what kind of games will or won't come; I'm talking about the purely technical aspect, so your opinion here is invalid. And if you want to talk about bottlenecks, which is a better term for this, the biggest and first bottleneck is always RAM, and always will be until the quantum future.

And to say that the Wii U won't have big high-profile games is ridiculous. I'm not sure what your idea of the lowest common denominator is in this case, but it's not the disc drive. It's 25GB; it's not going to limit any game, and it doesn't matter at all how big or fast the disc is. Developers will try to make the best games they can, third parties too; Doom 4, for example, you should expect to ship on 2 Wii U discs. The RAM is 1GB, and that 1GB will be filled all the way up. Of course compromises would have to be made to adapt to transfer speeds, and that's the whole reason USB 2.0 is a disappointment: sooner or later, developers making virtual-textured games shouldn't have to worry about transfer speed.

The X360 has an 8GB DVD, and that didn't stop id Software's Rage from shipping on 3 DVDs, nor did it stop them from totally watering the game down to avoid texture pop-in in the X360 DVD case (and it was really compressed); the installation to HDD made a huge difference.

The Wii U's 22.5 MB/s disc speed is almost 3 times current gen, but this won't be as effective as it sounds. The Wii U has 1024MB of RAM wholly for games, a little more than twice what the PS3/360 offer (512MB total, some of it used for the OS). That means games will have that much more to load, so they'll probably take about the same time to load as last gen, and that's a really big shame. USB 3.0 could have fixed all this.
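Taking that argument at face value, the back-of-envelope maths looks like this. The ~9 MB/s PS3 disc figure is the one implied by "2.5x the PS3" elsewhere in the thread, and the model ignores seeks, decompression and CPU work entirely:

```python
# Seconds to fill the game-visible RAM from disc at peak sequential speed.
# Purely illustrative: ignores seek times, decompression and CPU work.

def fill_seconds(ram_mb, disc_mb_s):
    return ram_mb / disc_mb_s

last_gen = fill_seconds(512, 9.0)    # PS3: 512MB total RAM, ~9 MB/s Blu-ray (assumed)
wii_u    = fill_seconds(1024, 22.5)  # Wii U: 1024MB for games, 22.5 MB/s disc
print(f"last gen: {last_gen:.1f} s, Wii U: {wii_u:.1f} s")
```

On this crude model the two end up in the same ballpark, which is roughly the point being argued: the faster drive mostly gets eaten by the doubled RAM.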
 
Graphics Horse said:
22.5MB/s. About 2.5x the PS3.
Curious about a comparison to the other current consoles, I went searching. Seems X360 and Wii are 12x and 6x DVD respectively, which should work out to about 15.8 and 7.9 MB/s.
Stewox said:
And to say that WiiU won't have big high-profile games is ridiculous; not sure what your idea of the lowest denominator is in this case, because it's not the disc drive
I don't think you're taking his intended meaning. Games will be built around the disc drive: the one, relatively low-speed solution everyone has: the lowest common denominator. If a game has streaming issues through slightly faster USB 2.0, the developer will have made a game unsuitable for the medium most people will be playing it on.
 
Curious about a comparison to the other current consoles, I went searching. Seems X360 and Wii are 12x and 6x DVD respectively, which should work out to about 15.8 and 7.9 MB/s.

DVD read speed is not constant across the whole disc, though; on the inner edge it's only about half as fast as on the outer edge (where your numbers apply).
 
Curious about a comparison to the other current consoles, I went searching. Seems X360 and Wii are 12x and 6x DVD respectively, which should work out to about 15.8 and 7.9 MB/s.

I don't think you're taking his intended meaning. Games will be built around the disc drive: the one, relatively low-speed solution everyone has: the lowest common denominator. If a game has streaming issues through slightly faster USB 2.0, the developer will have made a game unsuitable for the medium most people will be playing it on.


While USB 2.0 isn't exactly blazingly fast, it's still much, much faster than the Wii U's discs (~22MB/s for the discs vs. ~40MB/s for USB). Seek times and the like, where optical disc drives tend to suck, should also be unaffected by USB 2.0; i.e. a standard 2.5-inch 5400rpm external HDD should have seek times in the ballpark of 10 times lower than the optical disc drive.
 
USB2 storage is going to be waaaay more available than external eSATA storage.
You don't get it, it's not an either/or thing. USB 2.0 and eSATA should both have been available. A decent mechanical drive is enough to saturate USB 2.0 throughput, so you are wasting performance.
The NAND flash is easily going to be the best method to run the game. I kind of doubt we will be able to rip our games to a USB external HDD and then transfer 2 or 3 to the console's flash storage, but it would be great to run games right off the flash, or at least on the 32GB base version. Cheaper and faster than last gen, plus we can choose.
At this stage NAND storage is still too expensive in relative terms, and will continue to be for years. We still don't know how open the network will be about allowing any retail game to be downloaded. If that happens, 32GB is nothing.
Or just install straight to an external USB HDD/USB flash drive/USB SSD. Again, I think developers will give you the option of installing and loading games from whatever storage option it detects, like on the 360.
Then, like I said, we (and devs) are missing out because Nintendo decided to save cents per unit and not include a faster interface.
Not by much. 10-15MB/s more ain't going to make a big difference, especially for large games which need texture streaming. The lack of USB 3.0 is a disappointment; I've elaborated on this many times. Search my posts, because I'm not going to repeat myself for the fifth time.
USB 3.0 was just not going to happen, too expensive. eSATA was the way to go; any decent/cheap enclosure has it. And Nintendo could even release an official one and rake in some good money. Hell, the advanced version of the console could have included that instead of some pieces of crap plastic.
While USB 2.0 isn't exactly blazingly fast, it's still much, much faster than the Wii U's discs (~22MB/s for the discs vs. ~40MB/s for USB). Seek times and the like, where optical disc drives tend to suck, should also be unaffected by USB 2.0; i.e. a standard 2.5-inch 5400rpm external HDD should have seek times in the ballpark of 10 times lower than the optical disc drive.
The bold part is not right: real-world USB 2.0 performance doesn't reach over 40 MB/s once you take the control packets as well as the data sent into consideration. Recent mechanical drives are improving platter density and can reach over 50-60 MB/s, better ones go past that, and let's not even talk about an SSD.
 
Frankfurter said:
While USB 2.0 isn't exactly blazingly fast, it's still much, much faster than the Wii U's discs (~22MB/s for the discs vs. ~40MB/s for USB). Seek times and the like, where optical disc drives tend to suck, should also be unaffected by USB 2.0; i.e. a standard 2.5-inch 5400rpm external HDD should have seek times in the ballpark of 10 times lower than the optical disc drive.
Not exactly a real test, but what I did was look at that comparison of X360 Orange Box loading times from various media and multiply the disc times by 0.7 since that seems to be the relationship between X360 and Wii U read speeds. So for instance where it had USB taking 27.5 seconds and DVD taking 44.5 seconds, I figured Wii U disc would do it in 31.2 seconds.
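The scaling estimate described above is just a ratio of read speeds. As a sketch, using the Orange Box load times from the cited Eurogamer comparison:

```python
# Scale a measured X360 DVD load time by the X360:WiiU read-speed ratio
# (15.8 / 22.5 ~= 0.70) to estimate the Wii U disc load time.

X360_DVD_MB_S = 15.8
WIIU_DISC_MB_S = 22.5
scale = X360_DVD_MB_S / WIIU_DISC_MB_S

x360_dvd_load_s = 44.5  # Orange Box from DVD, per the cited comparison
print(f"Estimated Wii U disc load: {x360_dvd_load_s * scale:.1f} s")  # -> 31.2 s
```

Obviously this assumes load time is purely read-speed-bound, which it won't be once seeks and CPU work enter the picture.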
 
The bold part is not right: real-world USB 2.0 performance doesn't reach over 40 MB/s once you take the control packets as well as the data sent into consideration. Recent mechanical drives are improving platter density and can reach over 50-60 MB/s, better ones go past that, and let's not even talk about an SSD.

Err, nope. USB 2.0 real-world performance is indeed in the 40MB/s region. 60MB/s is the theoretical limit; ~40MB/s is what you can actually use. I.e., I was already being generous in the comparison, as I compared real-world USB 2.0 performance vs. the maximum Wii U disc drive performance.

Again: not saying that USB 2.0 is fast, but imo it'll do the trick for the Wii U. There'll be about 5 people on earth who connect an SSD to their Wii U console, so I'd say it's fair of Nintendo to ignore those guys. The usual Wii U owner will either a) not connect any external device, b) connect a random USB stick (which will often fail to even come close to USB 2.0's real-world limit), or c) connect a 2.5-inch 5400rpm external HDD, which will probably be in the 40-60MB/s region in sequential performance.
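For reference, the 60 vs. ~40 MB/s numbers fall out of the USB 2.0 wire rate. The ~2/3 bulk-transfer efficiency used here is an assumed round figure; real overhead varies by host and device:

```python
# USB 2.0 high-speed signalling is 480 Mbit/s; dividing by 8 gives the
# theoretical 60 MB/s ceiling. Protocol overhead on bulk transfers leaves
# roughly two-thirds of that as usable payload throughput (assumed figure).

WIRE_RATE_MBIT_S = 480
theoretical_mb_s = WIRE_RATE_MBIT_S / 8
practical_mb_s = theoretical_mb_s * 2 / 3
print(f"theoretical: {theoretical_mb_s:.0f} MB/s, usable: ~{practical_mb_s:.0f} MB/s")
```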
 
Seems that the HDD bus speed is limited to USB 2.0 bandwidth if you compare the internal HDD vs. USB 2.0 HDD results, but it's the extremely low seek time/latency of flash memory that really helps with loading multiple files.

It shouldn't be, though; isn't the connector for the 360 HDD SATA I or SATA II? (Well, technically eSATA, but it's the same basic thing.)
 
USB 3.0 was just not going to happen, too expensive. eSATA was the way to go; any decent/cheap enclosure has it. And Nintendo could even release an official one and rake in some good money. Hell, the advanced version of the console could have included that instead of some pieces of crap plastic.

I'm for eSATA any day too; I just banked on the idea that USB 3.0 would get a lot cheaper in the coming years.
 
While USB 2.0 isn't exactly blazingly fast, it's still much, much faster than the Wii U's discs (~22MB/s for the discs vs. ~40MB/s for USB). Seek times and the like, where optical disc drives tend to suck, should also be unaffected by USB 2.0; i.e. a standard 2.5-inch 5400rpm external HDD should have seek times in the ballpark of 10 times lower than the optical disc drive.

I've never seen USB working above 30MB/s, though that was a USB flash drive. Of course the implementation in consoles will be better than the chips in USB flash drives, so I hope it will support full speed. Then, if you have a good enough USB-SATA adapter, you should be able to achieve 60MB/s, which would be a freak show in practice; it would really make a difference. Still, most HDDs are faster than 60MB/s.

Just an alert to everyone: DO NOT BUY EXTERNAL HDDs blindly. This may not work, because a lot of them are rubbish, and I don't follow that market, so I don't even know what USB speeds they have. Just ask on tech forums if you don't know how to set it up manually with a normal internal HDD.
 
It's only because of MS that you can't connect an SSD to the internal SATA port. The PS3 has that advantage.
http://www.eurogamer.net/videos/ps3-hard-drive-vs-ssd-comparison-video-rage

But the Wii U has no internal connector, so SSDs will be useless. That's why it's a big disappointment that USB 3.0 isn't there; it's just ridiculous. You'll wait 30 seconds minimum for big games to load.



What are you talking about? I don't see this making any sense, and it pretty much doesn't have anything to do with what I'm saying. I'm not speculating about what kind of games will or won't come; I'm talking about the purely technical aspect, so your opinion here is invalid. And if you want to talk about bottlenecks, which is a better term for this, the biggest and first bottleneck is always RAM, and always will be until the quantum future.

And to say that the Wii U won't have big high-profile games is ridiculous. I'm not sure what your idea of the lowest common denominator is in this case, but it's not the disc drive. It's 25GB; it's not going to limit any game, and it doesn't matter at all how big or fast the disc is. Developers will try to make the best games they can, third parties too; Doom 4, for example, you should expect to ship on 2 Wii U discs. The RAM is 1GB, and that 1GB will be filled all the way up. Of course compromises would have to be made to adapt to transfer speeds, and that's the whole reason USB 2.0 is a disappointment: sooner or later, developers making virtual-textured games shouldn't have to worry about transfer speed.

The X360 has an 8GB DVD, and that didn't stop id Software's Rage from shipping on 3 DVDs, nor did it stop them from totally watering the game down to avoid texture pop-in in the X360 DVD case (and it was really compressed); the installation to HDD made a huge difference.

The Wii U's 22.5 MB/s disc speed is almost 3 times current gen, but this won't be as effective as it sounds. The Wii U has 1024MB of RAM wholly for games, a little more than twice what the PS3/360 offer (512MB total, some of it used for the OS). That means games will have that much more to load, so they'll probably take about the same time to load as last gen, and that's a really big shame. USB 3.0 could have fixed all this.

The HDD in that graph connected via USB 2.0 to the 360 loads those "big games" only a second or two behind the 360's SATA HDD.

I'm not going to go into this too much because refreshment#1 and I had a rather large debate about this very article over PM. USB 3.0 is expensive, and a lot of the time it just doesn't work properly with certain drives: you get locked at 2.0 speeds, and that's just unacceptable when it costs more money and uses more power. HOWEVER, I do think an eSATA port would have been nice, but in no way does this cause any problems for developers, who will already be targeting the disc drive's load capabilities.
 
Quick question: what is the likelihood that Sony or Microsoft are going to use USB 3.0? The chances are they'll want to stick to either eSATA or proprietary HDDs instead of allowing people to attach a 3TB USB 3.0 HDD, just so they can make more money.
 
Interesting tidbits. What is your opinion on the outlook for 2.5D stacking? Is it viable for somebody like Nintendo?

Maybe. But seeing how risk averse they are, I doubt it. They want cheap, reliable chips and they want them in very high volume. Think about how bad the Wii was in terms of power. Can you see the company that brought that to the market, taking a huge yield risk to reduce some latency between chips?
 
Maybe. But seeing how risk averse they are, I doubt it. They want cheap, reliable chips and they want them in very high volume. Think about how bad the Wii was in terms of power. Can you see the company that brought that to the market, taking a huge yield risk to reduce some latency between chips?

Latency is the one thing Nintendo seems to be very particular about, though, and there's good reason to believe they'll put particular emphasis on reducing the latency between the CPU and the eDRAM on the GPU (both for Wii backward compatibility and offloading SIMD calculations to the GPU).
 

Just to add to this: IBM and GloFo work side by side, so if Nintendo had their parts fabbed at GloFo instead of TSMC they could easily work something out. The issue here is that 3D is not needed in a console. Consoles use lagging tech, especially Nintendo's.

3D stacking is important in 2 areas: High Performance Computing and Mobile. It's all about latency, power loss and package density. None of these are really important for a piece of consumer electronics in the current environment.
 
Latency is the one thing Nintendo seems to be very particular about, though, and there's good reason to believe they'll put particular emphasis on reducing the latency between the CPU and the eDRAM on the GPU (both for Wii backward compatibility and offloading SIMD calculations to the GPU).

But they are not going to work out some complicated business arrangement that could end up reducing their yield and increasing their time to market just to achieve the gains offered by advanced packaging schemes. They want off-the-shelf parts using off-the-shelf technology. If they did something crazy they would have an even harder time meeting launch demand, and that means tons of money left on the table. If the technology were mature and they knew what the actual costs were, then maybe. But remember, this system was spec'd out and ready to roll in its final form about a year ago.
 
Latency is the one thing Nintendo seems to be very particular about, though, and there's good reason to believe they'll put particular emphasis on reducing the latency between the CPU and the eDRAM on the GPU (both for Wii backward compatibility and offloading SIMD calculations to the GPU).

This is quite interesting to me. My personal thoughts are that they will stick with a traditional bus between the CPU and GPU (w/ its eDRAM). The latency from CPU to the 24 MB 1t-SRAM main pool in Gamecube and Wii is around 10 ns. This is all they need for BC purposes, and is quite achievable using mature technology.
 
But they are not going to work out some complicated business arrangement that could end up reducing their yield and increasing their time to market just to achieve the gains offered by advanced packaging schemes. They want off-the-shelf parts using off-the-shelf technology. If they did something crazy they would have an even harder time meeting launch demand, and that means tons of money left on the table. If the technology were mature and they knew what the actual costs were, then maybe. But remember, this system was spec'd out and ready to roll in its final form about a year ago.

What if it's already increased the time to market? At E3 last year it was generally accepted that Nintendo were targeting a mid-2012 release for the console, so it's not impossible that the delay to November is due to some kind of manufacturing problems. Some people were speculating that Nintendo are going with a 28nm GPU and the delay is due to the well-publicised yield issues, but in theory a 2.5D packaging system could be another explanation.

Also, regarding time-frames, judging by improvements in dev kits, the first "final" hardware was available around February or March this year.

This is quite interesting to me. My personal thoughts are that they will stick with a traditional bus between the CPU and GPU (w/ its eDRAM). The latency from CPU to the 24 MB 1t-SRAM main pool in Gamecube and Wii is around 10 ns. This is all they need for BC purposes, and is quite achievable using mature technology.

The latency at Wii clocks is around 10ns. In theory they might overclock the interconnect in Wii mode to achieve the necessary latency, but it's more likely that they'll need something that offers sufficiently low latency when clocked down to Wii speeds.
 
The latency at Wii clocks is around 10ns. In theory they might overclock the interconnect in Wii mode to achieve the necessary latency, but it's more likely that they'll need something that offers sufficiently low latency when clocked down to Wii speeds.

Ok...I'm confused. I know a bit about how the different RAM timings and frequencies can affect latency, but how does increasing or decreasing interconnect speed come into play? The faster it is, the higher the latency?
 
Ok...I'm confused. I know a bit about how the different RAM timings and frequencies can affect latency, but how does increasing or decreasing interconnect speed come into play? The faster it is, the higher the latency?

Latency is defined in terms of clock cycles. So, if you had a 10 cycle latency on RAM running at 1GHz, you'll get a total latency of 10ns. If you increase the clock of the same RAM to 2GHz, that latency will reduce to 5ns, as each cycle takes half the time. The reductions in latency time are almost always a matter of a simple increase in clock speed, unless you're talking about fundamentally different kinds of RAM like 1T-SRAM or some novel interconnect. What's more, these increases in clock speeds often require an increase in the latency timings (as measured in clock cycles) to achieve, so you don't get a fully linear benefit. Hence why DDR3-1333 RAM for your PC generally will have lower CAS timings than DDR3-1600.
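The cycles-to-nanoseconds relationship above can be made concrete. The DDR3 timings in the second half are typical illustrative values, not specs for any particular module:

```python
# Absolute latency in ns = latency in cycles / clock frequency in GHz.

def latency_ns(cycles, freq_ghz):
    return cycles / freq_ghz

print(latency_ns(10, 1.0))  # 10 cycles at 1 GHz -> 10.0 ns
print(latency_ns(10, 2.0))  # same cycle count at 2 GHz -> 5.0 ns

# Why the benefit isn't linear: higher clocks usually need more cycles.
# CAS 9 at DDR3-1333 (667 MHz I/O clock) vs CAS 11 at DDR3-1600 (800 MHz),
# illustrative timings only. The absolute latencies end up nearly identical.
print(f"{latency_ns(9, 0.667):.1f} ns vs {latency_ns(11, 0.800):.1f} ns")
```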

I'm far from an expert on these kind of things, but my understanding is that the same kind of thing would be true for the bus between the CPU and GPU package in the Wii U. The bus runs at a certain clock speed and has a certain latency relative to that clock. Similarly there'll be something on the GPU package which regulates access to the eDRAM which incurs its own latency. My guess is that, while at Wii U clock speeds the absolute latency (in ns) of the interconnect will be lower than the Wii, I wouldn't necessarily expect that to be the case when clocked down in Wii mode. My theory was then that they could overclock the necessary components (say to twice the clock speed they were on the Wii) and only send/receive on half of the cycles, to achieve the same bandwidth and latency as the Wii hardware. As I say, though, I'm not an expert, and can't say for sure whether it would be necessary or even possible.
 
This is quite interesting to me. My personal thoughts are that they will stick with a traditional bus between the CPU and GPU (w/ its eDRAM). The latency from CPU to the 24 MB 1t-SRAM main pool in Gamecube and Wii is around 10 ns. This is all they need for BC purposes, and is quite achievable using mature technology.
RAM-to-CPU latency is one thing. Embedded-texture-cache-to-TMU - another. Something has to be able to feed TMUs at 6.2ns straight (flipper's TC latency for random reads - embedded 1Tsram for the win!). Combine this with the consideration that the 32MB edram most likely sits on its own die (or nintendo could be facing you-wish-it-was-a-POWER7-octocore yield conditions from a 32MB + U-GPU monster die), and you'll see how rigid U-GPU/edram wire requirements might be in terms of latency. Alternatively, U-GPU could be hosting a dedicated 1MB pool of edram/sram essentially for Hollywood TC emulation, but that'd be poor transistor budgeting, IMO (apparently, they could just as well have a Hollywood actual in there, which would be PS3-fat-w/BC levels of bad transistor budgeting - even Sony could not swallow that in the end). My expectations are that the 32MB pool will cover all BC needs of all units in the system. It will be that awesome (tm) ; )
 
Quick question: what is the likelihood that Sony or Microsoft are going to use USB 3.0? The chances are they'll want to stick to either eSATA or proprietary HDDs instead of allowing people to attach a 3TB USB 3.0 HDD, just so they can make more money.

expect at least a 3.0 port...
 
Latency is defined in terms of clock cycles. So, if you had a 10 cycle latency on RAM running at 1GHz, you'll get a total latency of 10ns. If you increase the clock of the same RAM to 2GHz, that latency will reduce to 5ns, as each cycle takes half the time. The reductions in latency time are almost always a matter of a simple increase in clock speed, unless you're talking about fundamentally different kinds of RAM like 1T-SRAM or some novel interconnect. What's more, these increases in clock speeds often require an increase in the latency timings (as measured in clock cycles) to achieve, so you don't get a fully linear benefit. Hence why DDR3-1333 RAM for your PC generally will have lower CAS timings than DDR3-1600.

I'm far from an expert on these kind of things, but my understanding is that the same kind of thing would be true for the bus between the CPU and GPU package in the Wii U. The bus runs at a certain clock speed and has a certain latency relative to that clock. Similarly there'll be something on the GPU package which regulates access to the eDRAM which incurs its own latency. My guess is that, while at Wii U clock speeds the absolute latency (in ns) of the interconnect will be lower than the Wii, I wouldn't necessarily expect that to be the case when clocked down in Wii mode. My theory was then that they could overclock the necessary components (say to twice the clock speed they were on the Wii) and only send/receive on half of the cycles, to achieve the same bandwidth and latency as the Wii hardware. As I say, though, I'm not an expert, and can't say for sure whether it would be necessary or even possible.

I pretty much follow you. I suppose a lot rides on how short the latency of IBM's eDRAM truly is. If they follow the Gamecube outline, then the bus from CPU to GPU would run at the speed of the GPU. Actually, breaking down the cycle times, is it correct to say that the under-10 ns latency to main RAM in the Cube is figured by adding the ~6 ns cycle time from CPU to GPU and the ~3 ns cycle time from GPU to main RAM?

RAM-to-CPU latency is one thing. Embedded-texture-cache-to-TMU - another. Something has to be able to feed TMUs at 6.2ns straight (flipper's TC latency for random reads - embedded 1Tsram for the win!). Combine this with the consideration that the 32MB edram most likely sits on its own die (or nintendo could be facing you-wish-it-was-a-POWER7-octocore yield conditions from a 32MB + U-GPU monster die), and you'll see how rigid U-GPU/edram wire requirements might be in terms of latency. Alternatively, U-GPU could be hosting a dedicated 1MB pool of edram/sram essentially for Hollywood TC emulation, but that'd be poor transistor budgeting, IMO (apparently, they could just as well have a Hollywood actual in there, which would be PS3-fat-w/BC levels of bad transistor budgeting - even Sony could not swallow that in the end). My expectations are that the 32MB pool will cover all BC needs of all units in the system. It will be that awesome (tm) ; )

Thanks for the perspective, as always. Reading about the Cube architecture is quite fascinating still to this day. I'd like to hope they could pull off something similarly awesome in the Wii U - only better. I'm willing to admit that the separate-die theory is probably the easiest and most likely, but let me run what I've been imagining by you guys in full and then you can tell me how crazy it is. haha

Basically, I am with bgassassin and others in expecting the GPU to run at a low clock (480 MHz). That would give it a cycle time of ~2 ns. Coincidentally enough, we hear that IBM's eDRAM is capable of latencies below 2 ns. I am not sure how Nintendo/MoSys measured the Cube's 1T-SRAM latency, but at 6.2 ns, it comes to one clock cycle at 162 MHz. Perhaps this is what Nintendo is going for: one clock cycle of sustained latency for eDRAM accesses. Everything about IBM's eDRAM process touts its ability to be placed on-chip, so I really think it would be a waste of that technology if they just placed it on a separate die and called it a day. Would it really take them 2+ years to hook up an RV770 to an eDRAM die via a databus?

Here's the RV770 die:

[image: RV770 die shot]


What I propose is that they have modified the chip in the following ways: first, strip the GDDR5 interface and replace it with a 128-bit DDR3 interface. Next, I believe the orange area is the ROPs and L2 cache. From what I have read, at 480 MHz the GPU might be "slow" enough for a portion of the 32 MB eDRAM pool to serve as L2. So strike that L2 SRAM and replace it with 32 MB of eDRAM; on 32nm, that should only take up 23 mm^2 according to IBM. Also, since we keep going back to the 640 ALU rumor, strike two SIMD cores and their respective texture units. Actually, it seems the RV740 would then be our closest point of reference, and at 40 nm that chip is only 137 mm^2. I believe Flipper was 120 mm^2, so this seems right in line.

Then, there were the talks we had about replacing the general registers and L1 texture cache with eDRAM. That seems like quite an in-depth modification, but you never know...

Thoughts?
 
I was just looking up the transistor density of IBM's eDRAM to look into blu's assertion that the eDRAM would need its own die, and I came across this article on SemiAccurate about the 32nm Power7+. The last paragraph caught my eye as being pertinent to our current conversation:

SemiAccurate said:
OK, so IBM is laying out the law on advanced packaging, and no one else has shown this type of tech, not to mention anything on this scale. Could it get any better? Sure it can. What if I told you that the interposer wasn’t a passive part, but an active one with lots of embedded RAM. Need a few, oh, lets say tens of MB cache with a silly wide interface? See above. Also see your local IBM rep because no one else can do this

Power7+ isn't even out yet, so this is the exact opposite of a mature process, but food for thought nonetheless.

Edit:

I pretty much follow you. I suppose a lot rides on how short the latency of IBM's eDRAM truly is. If they follow the Gamecube outline, then the bus from CPU to GPU would run at the speed of the GPU. Actually, breaking down the cycle times, is it correct to say that the under-10 ns latency to main RAM in the Cube is figured by adding the ~6 ns cycle time from CPU to GPU and the ~3 ns cycle time from GPU to main RAM?

I would say so. As I say, though, not an expert on these things, let alone the Gamecube architecture in particular.
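One way to sanity-check that arithmetic. The mapping is purely my assumption (treating the ~6 ns as one cycle of Flipper's 162 MHz clock and the ~3 ns as one cycle of the 324 MHz 1T-SRAM interface), not a confirmed breakdown of the Gamecube's memory path:

```python
# Hedged sanity check of the Gamecube latency arithmetic above.
# Assumption (mine): ~6 ns is one 162 MHz Flipper cycle and
# ~3 ns is one 324 MHz 1T-SRAM cycle.

def cycle_ns(mhz: float) -> float:
    """One clock period in nanoseconds for a given frequency in MHz."""
    return 1e3 / mhz

cpu_to_gpu = cycle_ns(162.0)   # ~6.17 ns
gpu_to_ram = cycle_ns(324.0)   # ~3.09 ns
total = cpu_to_gpu + gpu_to_ram
print(f"{cpu_to_gpu:.2f} + {gpu_to_ram:.2f} = {total:.2f} ns")
```

Under those assumptions the sum comes out a shade over 9 ns, which would square with the "under 10 ns" figure.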

On 32nm, that should only take up 23 mm^2 according to IBM.

Do you have a source for that? According to this paper (PDF) "Less than 15% of the area of the POWER7 die is consumed by the eDRAM macros comprising the 32-MB L3 cache" which, given a 567mm² die, would amount to <85.05mm² for the eDRAM on a 45nm process (or 180 million transistors, if that's how we're counting). Given my understanding of die-shrinks (which is probably wrong) that would give us 43mm² on a 32nm process, or a bit under.

Edit 2:

Continuing my shoddy die-shrink maths to its (il)logical conclusion, a RV740 + 32MB of eDRAM at 32nm would come to 130.68mm², which seems about the right ballpark.
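Spelling that shoddy maths out explicitly (same ideal (new/old)² area-scaling assumption as above, which real processes don't quite achieve):

```python
# Reproduces the die-shrink arithmetic above. Ideal (to/from)^2 area
# scaling is an assumption; real shrinks come in worse than this.

def shrink(area_mm2: float, from_nm: int, to_nm: int) -> float:
    return area_mm2 * (to_nm / from_nm) ** 2

power7_edram_45nm = 0.15 * 567.0                 # <85.05 mm^2 for 32 MB at 45 nm
edram_32nm = shrink(power7_edram_45nm, 45, 32)   # ~43 mm^2
rv740_32nm = shrink(137.0, 40, 32)               # ~87.68 mm^2
print(f"eDRAM: {edram_32nm:.2f} mm^2, GPU: {rv740_32nm:.2f} mm^2, "
      f"total: {edram_32nm + rv740_32nm:.2f} mm^2")
```

Rounding the eDRAM to 43 mm² before summing gives the 130.68 mm² figure quoted above.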
 
Do you have a source for that? According to this paper (PDF) "Less than 15% of the area of the POWER7 die is consumed by the eDRAM macros comprising the 32-MB L3 cache" which, given a 567mm² die, would amount to <85.05mm² for the eDRAM on a 45nm process (or 180 million transistors, if that's how we're counting). Given my understanding of die-shrinks (which is probably wrong) that would give us 43mm² on a 32nm process, or a bit under.

Edit 2:

Continuing my shoddy die-shrink maths to its (il)logical conclusion, a RV740 + 32MB of eDRAM at 32nm would come to 130.68mm², which seems about the right ballpark.

Actually, embarrassingly enough, I have no primary source for that one - only bgassassin. Hey, I'm not writing a research paper here - it's a friggin' message board! haha. In the search I just ran, I also saw a claim of 1 mm^2 per MB - still decent. But can anyone access [url="http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=5424375&url=http%3A%2F%2Fieeexplore.ieee.org%2Fiel5%2F5419306%2F5424206%2F05424375.pdf%3Farnumber%3D5424375"]this paper[/url]? I could have sworn I read it before, but my college email has now expired and I can't access it. Seems like it might have some relevant information.
 
Just to add to this: IBM and GloFo work side by side, so if Nintendo had their parts fabbed at GloFo instead of TSMC, they could easily work something out. The issue here is that 3D is not needed in a console. Consoles use lagging tech, especially Nintendo's.

3D is important in two areas: High Performance Computing and Mobile. It's all about latency, power loss and package density. None of these are really important for a piece of consumer electronics in the current environment.

Doesn't really explain why the Tezzaron guy makes it sound like his company was involved. Oddly specific, and weird timing for a purely hypothetical statement.

I'd like to add that Nintendo actually doesn't really use off-the-shelf parts. Especially when it comes to memory, they tend to come up with odd, custom solutions. Like the 1T-SRAM die used for Gamecube and Wii (no manufacturer offers those), or the weird FCRAM chip the 3DS uses (that part is custom made for Nintendo as well).

I talked about this with someone on B3D, and we came to the conclusion that stacking might make sense under certain circumstances. Especially if the eDRAM is part of the CPU die but requires a sufficiently fast, low latency interface to the GPU.

There are also rumors that Nintendo actually has pretty massive yield issues, which led to delays and is also the reason retailers only get very few systems at launch. Supposedly.
 
So this was from the Xbox World Rumor thread.

Insider Daily: Nintendo forced MS and Sony to change their specs

Don't put it on your blog please, but you can use main things:
8000+ gnc 2.0 "Venus XTX" based gpu 20mn
8 core Richland x86 cpu With 8000 core series hd apu 28nm
8 or 6 gig of ram
Reports are 4.00+ TFLOP of graphics processing power.

Misterx: ok, by my own words:
- Next xbox x86 officially
- Next xbox will be 4T+ in power. 5T reports are true.
- Next xbox will have 8 to 6 gigs of ram. Sony now target 4gb.
- Sony and MS had phone calls not to put next gen at E3 in order to delay it and make Nin look absolete. after Nintendo changed their 5000 specs to 7000 series specs.
- MS changed 70% of their gpu/cpu design recently after that.
- MS should not delay xbox next
- Next xbox will be a moster for a console. the price should be around $450 but that is not a fact for now.

So this rumor tells us that the Wii U has a 7xxx GPU, and that the Xbox 720 will be a beast for only $450, probably with Kinect 2 as well... Ok then, lol.

What do people think about this?
 
So this was from the Xbox World Rumor thread.



So this rumor tells us that the Wii U has a 7xxx GPU, and that the Xbox 720 will be a beast for only $450, probably with Kinect 2 as well... Ok then, lol.

What do people think about this?

I have absolutely zero inside information and I feel comfortable saying that it is 100% false.
 
Actually, embarrassingly enough, I have no primary source for that one - only bgassassin. Hey, I'm not writing a research paper here - it's a friggin message board! haha. In the search I just ran, I also saw a claim of 1 mm^2 per MB - still decent. But can anyone access this paper? I could have sworn I read it before but my college email is now expired and I can't access it. Seems like it might have some relevant information.

Unfortunately I can't access IEEE papers anymore, and that one's not one I've read, but it'd likely provide the answer. I suppose it's not impossible that they could increase the density, though, as the eDRAM used on the Power7 was basically the first generation of the technology. There are a couple of other papers I found that may be of interest to anyone who can access them:

High density DRAM for space utilizing embedded DRAMs macros in 32nm SOI CMOS
A 0.039um2 high performance eDRAM cell based on 32nm High-K/Metal SOI technology

There are also a couple of papers on the subject of IBM's eDRAM stacking which the SemiAccurate article mentioned:

3D stackable 32nm High-K/Metal Gate SOI embedded DRAM prototype
A novel DRAM architecture as a low leakage alternative for SRAM caches in a 3D interconnect context
A 3D system prototype of an eDRAM cache stacked over processor-like logic using through-silicon vias

Also, here's a quote from the paper I quoted a few posts up that I figure would be illuminating on the on-die/off-die question:

IBM said:
by incorporating the eDRAM L3 cache on the processor die, a 4-MB local L3 region enjoys roughly one sixth the access latency of an off-chip L3 cache
 
Question: If Nintendo isn't releasing specs and won't do so after launch, how are you all going to find out what's in the Wii U?

Is there an NDA in place? Will developer documentation be leaked? Or do you open up the Wii U and do research on the parts, etc.?
 
Question: If Nintendo isn't releasing specs and won't do so after launch, how are you all going to find out what's in the Wii U?

Is there an NDA in place? Will developer documentation be leaked? Or do you open up the Wii U and do research on the parts, etc.?

That's pretty much what will be done, and what has been done in the past.
 
The HDD in that graph connected via USB 2.0 to the 360 loads those "big games" only a second or two behind the 360's internal SATA HDD.

I'm not going to go into this too much, because refreshment#1 and I had a rather large debate about this very article over PM. USB 3.0 is expensive, and a lot of the time it just doesn't work thanks to the drives - you get locked at 2.0 speeds, and that is just unacceptable when it costs more money and uses more power. HOWEVER, I do think an eSATA port would have been nice, but in no way does this cause any problems for developers, who will already be targeting the disc drive's load capabilities.

USB 3.0 controllers aren't expensive anymore; they've almost reached price parity with USB 2.0. I've got a lot of USB 3.0 drives, and I have no issues with any of them so far. USB 3.0 cards are $5 more than USB 2.0 ones, same with the enclosures.
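The "only a second or two behind" result is easy to reproduce with rough sequential-read maths. All the throughput and payload figures below are my own assumptions for illustration, not measurements from the graph being discussed:

```python
# Why a USB 2.0 HDD can trail the 360's internal SATA drive by only
# a couple of seconds on a level load. All figures are assumptions.

ASSUMED_THROUGHPUT_MBPS = {
    "USB 2.0 HDD": 30.0,       # practical ceiling, well under the 60 MB/s spec
    "Internal SATA HDD": 60.0,
    "12x DVD (outer edge)": 15.0,
}

PAYLOAD_MB = 150.0  # hypothetical amount of data read for one level load

load_times = {name: PAYLOAD_MB / mbps
              for name, mbps in ASSUMED_THROUGHPUT_MBPS.items()}

for name, seconds in load_times.items():
    print(f"{name}: {seconds:.1f} s")
```

Under these assumptions the USB 2.0 drive takes 5.0 s against the SATA drive's 2.5 s - a gap of a couple of seconds for a modest read, while the optical drive lags much further behind even before seek times are counted.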
 
I have absolutely zero inside information and I feel comfortable saying that it is 100% false.

Sorry to say it like that, but people with absolutely zero information usually feel comfortable saying anything.

I think the conversation, if there's any at all (people might not think it's worth talking about), should be about the possibility that this could happen, in theory. Because that's as far as we can get here: a theory, unless we have an insider among us who can disclose anything.

It was already said here that this could never happen and that companies don't change their hardware specs in response to information about their competition. I don't think that's always the case, especially with closed computer platforms (video game systems), which tend to be on the market for a long time and also tend to be the only hardware representing their maker on that market. (Sure, there are exceptions, but no console manufacturer wants to split their own market share today... look at what that did to Sega.) That's what makes the video game console market unique compared to, say, the phone market.
 
USB 3.0 controllers aren't expensive anymore; they've almost reached price parity with USB 2.0. I've got a lot of USB 3.0 drives, and I have no issues with any of them so far. USB 3.0 cards are $5 more than USB 2.0 ones, same with the enclosures.

Cost isn't so much a factor as the maturity of the spec. USB3 is not a mature spec.
 
Cost isn't so much a factor as the maturity of the spec. USB3 is not a mature spec.

But he listed cost as one of the major reasons not to use it, and instead went with a connection (eSATA) that not many people use and that few drives on the market support. Nintendo could always give a list of compatible USB 3.0 hard drives that they know will work with its console. Anyway, the argument is moot since the Wii U doesn't have it.
 
Really? How much of it will be off the shelf, though? Is it really as easy as Googling?

None of it is off the shelf, and opening it up can only get you so far. To truly know the specs just by opening it up, you'd need a high level of expertise and an electron microscope or something. It's not as simple as looking at the model numbers on the parts inside and Googling them.
 
But he listed cost as one of the major reasons not to use it, and instead went with a connection (eSATA) that not many people use and that few drives on the market support. Nintendo could always give a list of compatible USB 3.0 hard drives that they know will work with its console. Anyway, the argument is moot since the Wii U doesn't have it.

There are parts costs, and there are also licensing costs, plus an increase in the hardware needed to support it (since, AFAIK, USB still uses the host for processing). It also doesn't help that Intel seems to be delaying full USB3 support on their chipsets because they want to get people interested in Thunderbolt.

Shopping for USB3 products is basically a crapshoot of compatibility guessing. I'm still pissed that the Intensity Shuttle didn't want to work with my computer because I didn't have the exact correct USB3 chipset. So much for the Universal in Universal Serial Bus :/
 
Cost isn't so much a factor as the maturity of the spec. USB3 is not a mature spec.

The USB3 spec is finished, but at this point many "USB3 devices" that are sold to consumers either don't implement the spec correctly (especially hubs), don't take advantage of new transfer modes (hard drive enclosures) or are just plain unfit for purpose (cable adapters, passive cables that are too long).

Basically: There's plenty of crap sold that shouldn't be associated with USB3. This problem will mostly disappear over time, but at the moment, it is very real.
 
None of it is off the shelf, and opening it up can only get you so far. To truly know the specs just by opening it up, you'd need a high level of expertise and an electron microscope or something. It's not as simple as looking at the model numbers on the parts inside and Googling them.

So a leak it is, basically. :D
 
The USB3 spec is finished, but at this point many "USB3 devices" that are sold to consumers either don't implement the spec correctly (especially hubs), don't take advantage of new transfer modes (hard drive enclosures) or are just plain unfit for purpose (cable adapters, passive cables that are too long).

Basically: There's plenty of crap sold that shouldn't be associated with USB3. This problem will mostly disappear over time, but at the moment, it is very real.

Yeah, in the tech world there's a world of difference between a finalized spec and a mature spec. It's been finalized since 2008 but everyone is still derping around on getting it to be mature, especially when the players involved have other interests/standards they want adopted.
 
The USB3 spec is finished, but at this point many "USB3 devices" that are sold to consumers either don't implement the spec correctly (especially hubs), don't take advantage of new transfer modes (hard drive enclosures) or are just plain unfit for purpose (cable adapters, passive cables that are too long).

Basically: There's plenty of crap sold that shouldn't be associated with USB3. This problem will mostly disappear over time, but at the moment, it is very real.

Wouldn't be an issue for Nintendo if they release a list of compatible drives.
 