Rumor: Wii U final specs

None of it is off the shelf and opening it up can only get you so far. To truly know the specs by just opening it up you'd need to have a high level of expertise and an electron microscope or something. It's not as simple as looking at the model number on the parts inside and Googling them.

There are firms that specialize in taking apart a company's competitors' products to identify the internals and the manufacturing process, as well as researching their capabilities. We'll probably get some information that way as well - plenty of people will likely be in the know about the Wii U specs.
 
So leak it is basically. :D
Nah, companies like Chipworks will tear the system down properly, die photos and all. It will take a while, and we don't know how much information they're going to release to the public, but I'm sure we'll get something. We always get something. Though with 3DS, they only released information regarding the memory chip to the public - nothing about the SoC.
 
Wouldn't be an issue for Nintendo if they release a list of compatible drives.

USB 2 is still faster than optical drive reads, so what's the point of clamoring for USB 3?

Oh wait, I forgot that the Wii U's design priorities were all about maximum performance!

…wait.
 
USB 2 is still faster than optical drive reads, so what's the point of clamoring for USB 3?

Oh wait, I forgot that the Wii U's design priorities were all about maximum performance!

…wait.
I just wish to say, this reply is plain offensive. Offensive because it basically ignores a number of reasonable arguments regarding the issue, and there's no insightful remark, data or counter-argument, so it contributes nothing. People invested their time explaining why the inclusion of a faster interface would be desirable, and it would come at a small investment on Nintendo's part.

Kindergarten debate level in action, covering the ears and screaming as loud as the lungs allow. A spec thread deserves better.
 
Nah, companies like Chipworks will tear the system down properly, die photos and all. It will take a while, and we don't know how much information they're going to release to the public, but I'm sure we'll get something. We always get something. Though with 3DS, they only released information regarding the memory chip to the public - nothing about the SoC.
Probably, but how much is it going to mean in practice?

We all know 3ds <<<< Vita, but how many games are on Vita that look better than Revelations?


USB 2 is still faster than optical drive reads, so what's the point of clamoring for USB 3?

Oh wait, I forgot that the Wii U's design priorities were all about maximum performance!

…wait.
People are OK with games that don't run properly without mandatory installs, and only on USB3s

Kindergarten debate level in action, covering the ears and screaming as loud as the lungs allow. A spec thread deserves better.
Apparently it requires more understanding of why Nintendo enforces equal opportunity and discourages methods that enable developers to disregard it.

For Nintendo, it basically boils down to this: are most people going to use USB3? No? Well then, we enforce USB2 upon everyone so the experience doesn't change from one user to another.
 
Well, when I looked at the demo Wii U units from the back there WAS one more USB port (actually, a mini USB I think) but still none of the three was blue, so... I guess we will have to settle for USB2. I don't care all that much.
 
Actually, embarrassingly enough, I have no primary source for that one - only bgassassin. Hey, I'm not writing a research paper here - it's a friggin message board! haha. In the search I just ran, I also saw a claim of 1 mm^2 per MB - still decent. But can anyone access [url="http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=5424375&url=http%3A%2F%2Fieeexplore.ieee.org%2Fiel5%2F5419306%2F5424206%2F05424375.pdf%3Farnumber%3D5424375"]this paper?[/url] I could have sworn I read it before, but my college email is now expired and I can't access it. Seems like it might have some relevant information.

I think the 1 mm^2 per MB is what you're remembering me talking about as that's what I remember what was said about 32nm eDRAM density from IBM.
 
People are OK with games that don't run properly without mandatory installs, and only on USB3s


Apparently it requires more understanding of why Nintendo enforces equal opportunity and discourages methods that enable developers to disregard it.

For Nintendo, it basically boils down to this: are most people going to use USB3? No? Well then, we enforce USB2 upon everyone so the experience doesn't change from one user to another.
A faster interface... not necessarily USB 3.0. IIRC, the person proposing the use of USB 3.0 changed his mind when some of the drawbacks (cost and maturity of the standard) were explained to him.

The point, which I think has been discussed enough, is that Nintendo could've offered the USB 2.0 that is almost omnipresent plus another interface, without any impossible economic sacrifices.
 
Nah, companies like Chipworks will tear the system down properly, die photos and all. It will take a while, and we don't know how much information they're going to release to the public, but I'm sure we'll get something. We always get something. Though with 3DS, they only released information regarding the memory chip to the public - nothing about the SoC.

Hmm, interesting... and this Chipworks - what's stopping them from releasing their full findings? Sorry if it's a dumb question...
 
A faster interface... not necessarily USB 3.0. IIRC, the person proposing the use of USB 3.0 changed his mind when some of the drawbacks (cost and maturity of the standard) were explained to him.

The point, which I think has been discussed enough, is that Nintendo could've offered the USB 2.0 that is almost omnipresent plus another interface, without any impossible economic sacrifices.

They still have zero incentive to do so. Who is going to buy a Wii U or a game only because it can be loaded from a faster-than-USB-2 interface? I doubt anyone. Why, therefore, is it worth any sacrifice at all on Nintendo's part?
 
Hmm, interesting... and this Chipworks - what's stopping them from releasing their full findings? Sorry if it's a dumb question...

They are in the information research business: Identifying the exact chips, manufacturing process and technological capabilities of a product is what they sell.
 
Well, when I looked at the demo Wii U units from the back there WAS one more USB port (actually, a mini USB I think) but still none of the three was blue, so... I guess we will have to settle for USB2. I don't care all that much.

The demo units have extra connections on the back to connect up to dev kits. The only USB ports on the back of the retail console will be two full-size USB2 ports.
 
They are in the information research business: Identifying the exact chips, manufacturing process and technological capabilities of a product is what they sell.

I saw that it costs $200 to view the teardown of the 3DS CPU (ouch). Has anyone read it? Or is it basically information that everybody already knows (dual-core ARM11 variant)?
EDIT: $200 for a PHOTO of the damn die. What a rip-off.
 
Unfortunately I can't access IEEE papers anymore, and that one's not one I've read, but it'd likely provide the answer. I suppose it's not impossible that they could increase the density, though, as the eDRAM used on the Power7 was basically the first generation of the technology. There are a couple of other papers I found that may be of interest to anyone who can access them:

High density DRAM for space utilizing embedded DRAMs macros in 32nm SOI CMOS
A 0.039um2 high performance eDRAM cell based on 32nm High-K/Metal SOI technology

There are also a couple of papers on the subject of IBM's eDRAM stacking which the SemiAccurate article mentioned:

3D stackable 32nm High-K/Metal Gate SOI embedded DRAM prototype
A novel DRAM architecture as a low leakage alternative for SRAM caches in a 3D interconnect context
A 3D system prototype of an eDRAM cache stacked over processor-like logic using through-silicon vias

Also, here's a quote from the paper I quoted a few posts up that I figure would be illuminating on the on-die/off-die question:

Here's a good one that I read before and just came across again. There are some good quotes to pull out of it and it may have some solid answers for die size. I'm still trying to make sense of that part, though.

http://www.ww.fsa.org/events/2010/0316/docs/5.GMC-iyer.pdf

IBM said:
Integrate with Logic without impacting logic Performance, Reliability or Yields

I think the 1 mm^2 per MB is what you're remembering me talking about as that's what I remember what was said about 32nm eDRAM density from IBM.

http://64.91.233.250/forum/showpost.php?p=40629866&postcount=13183

bgassassin said:
That is info that has been tough to find even in past searches. POWER6 is older than what I'd like to use as a reference, though there wasn't really much I could find when I used to look. This time around I looked more at that L4 cache of the z196, but not much there either. That said, considering GC's 1T-SRAM main memory even had a ~10ns latency, I'd like to believe Nintendo would have a design that could surpass an approx. 12-year-old design. I've seen that IBM's eDRAM is supposed to have less than 2ns latency, but I think there is a lack of info as to what that means exactly. Also, at 32MB that would put the size of that chunk of memory at ~67mm^2. I don't see them putting that on-die.

EDIT: Correction. I was still using 45nm measurements. 32nm eDRAM is supposedly 11Mbit/mm^2 so that would put it at ~23mm^2. I guess that's manageable, but I still see them having it on the module instead of on-die. But of course that's my take at least.
;)

Edit #?: Maybe you were thinking of this link, bg: http://www.realworldtech.com/iedm-2010/3/
 
They still have zero incentive to do so. Who is going to buy a Wii U or a game only because it can be loaded from a faster-than-USB-2 interface? I doubt anyone. Why, therefore, is it worth any sacrifice at all on Nintendo's part?
Giving users convenience, options and a better experience for very little cost is not incentive enough? Opening a window of opportunity for developers so their software could perform better - should that not be wanted or appreciated?

I don't see what's wrong with being pro-consumer, instead of advocating every single policy a corporation adopts.
 
I saw that it costs $200 to view the teardown of the 3DS CPU (ouch). Has anyone read it? Or is it basically information that everybody already knows (dual-core ARM11 variant)?
EDIT: $200 for a PHOTO of the damn die. What a rip-off.
Decapping isn't exactly an easy and cheap process. The price seems perfectly fine to me.

And no, we don't really know what's inside the SoC. Dual ARM11 with dual VFPs, yes. But there also seems to be an ARM9 (that's actually used in 3DS mode and not just there for DS compatibility), and we know very little about everything else.
 
None of it is off the shelf and opening it up can only get you so far. To truly know the specs by just opening it up you'd need to have a high level of expertise and an electron microscope or something. It's not as simple as looking at the model number on the parts inside and Googling them.

Actually, with the Wii the 64 MB GDDR3 RAM was a stock part IIRC, or at least something that a simple Google search revealed, and the CPU has its speed written on it.
 
They still have zero incentive to do so. Who is going to buy a Wii U or a game only because it can be loaded from a faster-than-USB-2 interface? I doubt anyone. Why, therefore, is it worth any sacrifice at all on Nintendo's part?

At least one guy told me that he might reconsider buying the console because there is no USB3. He says he's got a pretty fast hard drive in his PS3 and does not want to accept slower loading times.

I told him that I don't care, and that I'm happy that I can buy pretty much any device on the market and it will work. But yeah, those people exist, apparently.
 
At least one guy told me that he might reconsider buying the console because there is no USB3. He says he's got a pretty fast hard drive in his PS3 and does not want to accept slower loading times.

I told him that I don't care, and that I'm happy that I can buy pretty much any device on the market and it will work. But yeah, those people exist, apparently.

Considering experiments with putting SSDs in Playstation 3 systems (SSDs are absolutely fantastic in PCs, but not worth it for the PS3), it is very clear that current gen consoles aren't primarily limited by drive speed but by processing power and fast-access memory. Faster drives would likely only provide extremely marginal benefits to the Wii U, the exclusion of USB3 is in no way a reason not to buy the system.

The Wii U's bigger RAM pool, in comparison to PS3 and 360, might have a very noticeable effect - with potentially fewer necessary (user-visible) load times and less pop-in.
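Back-of-the-envelope, the difference drive interfaces make to raw load times is easy to sketch. All the throughput figures below are rough assumptions for illustration, not measured Wii U numbers:

```python
# Rough time to load 1 GB of game data over various interfaces.
# Throughputs are ballpark real-world figures (assumptions), not specs.
interfaces_mb_per_s = {
    "optical drive (sustained)": 15,
    "USB 2.0 (practical)": 30,   # ~480 Mbit/s on the wire, ~30 MB/s in practice
    "SATA1 HDD": 80,             # link allows 150 MB/s; the drive limits it
    "SATA1 SSD": 150,            # capped by the SATA1 link itself
}

data_mb = 1024  # 1 GB of assets
for name, speed in interfaces_mb_per_s.items():
    print(f"{name}: ~{data_mb / speed:.0f} s")
```

As the thread notes, though, raw transfer rate is only part of the story - seek behavior and how the game streams assets matter at least as much.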
 
Decapping isn't exactly an easy and cheap process. The price seems perfectly fine to me.

And no, we don't really know what's inside the SoC. Dual ARM11 with dual VFPs, yes. But there also seems to be an ARM9 (that's actually used in 3DS mode and not just there for DS compatibility), and we know very little about everything else.

An additional ARM9? That's news to me! What are your thoughts on an ARM Cortex-A5 being included on the Wii U GPU die? And then there is the DSP. There are a lot that run at 120 MHz. I know Li Mu Bai mentioned ARM might be involved in that as well. Don't quite know if he's believable, but NXP (who already provide some 3DS parts for Nintendo) made a 120 MHz microcontroller using the ARM Cortex-M3 that was making news in 2010 for its DSP capabilities. http://www.nxp.com/news/press-relea...x-m3-microcontrollers-top-dsp-benchmarks.html
 
An additional ARM9? That's news to me! What are your thoughts on an ARM Cortex-A5 being included on the Wii U GPU die? And then there is the DSP. There are a lot that run at 120 MHz. I know Li Mu Bai mentioned ARM might be involved in that as well. Don't quite know if he's believable, but NXP (who already provide some 3DS parts for Nintendo) made a 120 MHz microcontroller using the ARM Cortex-M3 that was making news in 2010 for its DSP capabilities. http://www.nxp.com/news/press-relea...x-m3-microcontrollers-top-dsp-benchmarks.html

ARM Cortex-A5? On the GPU? Where did this come up?
 
ARM Cortex-A5? On the GPU? Where did this come up?

It came up by me. haha. We know there needs to be a "Starlet" equivalent and the A5 is a very small design that is functional for their purposes. It's a lowball guess really. Plus AMD are actually using them in their APUs starting next year. Perhaps they liked it so much in their Wii U development that they decided to roll with it.
 
Can't figure out where to post this question, and I'm also curious how this relates to the Wii U. What happens if the MS and Sony consoles are different enough that porting games between them is more difficult than it was between the 360 and PS3?
 
Can't figure out where to post this question, and I'm also curious how this relates to the Wii U. What happens if the MS and Sony consoles are different enough that porting games between them is more difficult than it was between the 360 and PS3?

So far the opposite has shown to be true. We'll hopefully learn more soon to see if that continues to be the case.
 
Well, amuse me then BG :D - hypothetically, what happens if their next gen porting is harder than it was this gen?

I mean other than saying it won't get ports unless the market suggests devs put forth the effort, there's not really anything else to say in that regard.
 
Well, amuse me then BG :D - hypothetically, what happens if their next gen porting is harder than it was this gen?

It will be a matter of mathematics.

(projected revenue from port - cost to make port) / stupidity of publisher

If that's > some internal positive value then they'll do it
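Tongue firmly in cheek, that decision rule as code (every input here is obviously hypothetical):

```python
def will_port(projected_revenue, port_cost, publisher_stupidity, threshold):
    """The joke porting formula from above: port if
    (revenue - cost) / stupidity exceeds some internal value."""
    return (projected_revenue - port_cost) / publisher_stupidity > threshold

# A $5M port expected to bring in $8M, stupidity factor 2, $1M threshold:
print(will_port(8e6, 5e6, 2, 1e6))  # prints True
```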
 
Sounds to me like that was exactly the point. I'm not an insider either and have learned things. What you're saying about him applies to me as well.

I'm sorry, but no. You are referring to his post as "confirmation". When you are sharing your own ideas, we can tell (or ask) if you are guessing/speculating or have actual info. So again, why should we see (or should we have seen at the time) his post as "confirmation" when he isn't a known insider and doesn't elaborate on his info? So I disagree with your post (and tone).
 
I'm sorry, but no. You are referring to his post as "confirmation". When you are sharing your own ideas, we can tell (or ask) if you are guessing/speculating or have actual info. So again, why should we see (or should we have seen at the time) his post as "confirmation" when he isn't a known insider and doesn't elaborate on his info? So I disagree with your post (and tone).

Well, first, I have no tone, good or bad. It only comes off like there is a negative tone because I'm disagreeing with you.

Second, we'll have to agree to disagree then.
 
The HDD in that graph that is connected via USB 2.0 to the 360 loads those "big games" only a second or two behind the SATA HDD of the 360.

That HDD is the bottleneck - either it's the HDD or the SATA controller. I don't know if it uses SATA 1 or SATA 2, but both are above 100MB/s, so if the SATA link is not the bottleneck, it's the HDD.

What kind of HDDs are these? The HDDs in the X360 can't be just anything unless you crack the system, and that chart has no such information - it's too broad. You can't just compare stuff by saying "HDD"; you need to know which model specifically.

The X360 is not a viable benchmark. Someone find the SATA speed on the X360, because it's not 150MB/s if the HDD is fine.

The PS3 doesn't have HDD jail-locking, so it's a better comparison.

Basically, you guys were wasting time on all this comparison debate over an invalid chart. It's only valid for the X360; no other comparisons are valid.
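For reference, the SATA numbers being argued about: SATA1 signals at 1.5 Gbit/s and SATA2 at 3.0 Gbit/s, and 8b/10b encoding means 10 line bits per data byte, so the usable bandwidth works out as:

```python
def sata_mb_per_s(line_rate_gbit):
    """Usable SATA bandwidth after 8b/10b encoding (10 line bits per byte)."""
    return line_rate_gbit * 1000 / 10

print(sata_mb_per_s(1.5))  # SATA1 -> 150.0 MB/s
print(sata_mb_per_s(3.0))  # SATA2 -> 300.0 MB/s
```

So "above 100MB/s" holds for the link either way; whether the drive behind it can sustain that is the real question.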
 
Thanks for the perspective as always. Reading about the Cube architecture is quite fascinating still to this day. I'd like to hope they could pull off something similarly awesome in the Wii U - only better. I'm willing to admit that the separate-die theory is probably the easiest and most likely, but let me run what I've been imagining by you guys in full and then you can tell me how crazy it is. haha
*buckles up*

Basically, I am with bgassassin and others in expecting the GPU to run at a low clock (480 MHz). That would give it a cycle time of ~2 ns. Coincidentally enough, we hear that IBM's eDRAM is capable of latencies below 2 ns. I am not sure how Nintendo/MoSys measured the Cube's 1T-SRAM latency, but at 6.2 ns, it comes to one clock cycle at 162 MHz. Perhaps this is what Nintendo is going for - one clock cycle of sustained latency for eDRAM accesses. Everything about IBM's eDRAM process touts its ability to be placed on-chip, so I really think it would be a waste of that technology if they just placed it on a separate die and called it a day. Would it really take them 2+ years to hook up an RV770 to an eDRAM die via a databus?
It really depends on the wire. 2.5D stacking can be extremely good there, and it would be quite sensible to me, if I were Nintendo - mature fab nodes, good yields, low latencies. It all really comes down to what fab nodes Nintendo have decided to shoot for. Apparently I share the opinion that a mature fab node + 2.5D stacking could be a net win. Perhaps I'm wrong, but just look at the whole 28nm drama that's been shaking the industry.
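The cycle-time arithmetic in the post above is easy to check (the 480 MHz clock is thread speculation, not a confirmed figure):

```python
def cycle_time_ns(clock_mhz):
    """Cycle time in nanoseconds for a clock given in MHz."""
    return 1000.0 / clock_mhz

print(cycle_time_ns(480))  # ~2.08 ns, in line with IBM's sub-2ns eDRAM claim
print(cycle_time_ns(162))  # ~6.17 ns, close to the Cube's 6.2 ns 1T-SRAM latency
```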
 
That HDD is the bottleneck - either it's the HDD or the SATA controller. I don't know if it uses SATA 1 or SATA 2, but both are above 100MB/s, so if the SATA link is not the bottleneck, it's the HDD.

What kind of HDDs are these? The HDDs in the X360 can't be just anything unless you crack the system, and that chart has no such information - it's too broad. You can't just compare stuff by saying "HDD"; you need to know which model specifically.

The X360 is not a viable benchmark. Someone find the SATA speed on the X360, because it's not 150MB/s if the HDD is fine.

The PS3 doesn't have HDD jail-locking, so it's a better comparison.

Basically, you guys were wasting time on all this comparison debate over an invalid chart. It's only valid for the X360; no other comparisons are valid.

360 and PS3 are both SATA1; quite a few people were disappointed that SSDs had almost no visible improvement to load speeds on PS3. SATA1 should still beat USB2, but the Wii U probably doesn't even have a SATA1 controller for its internal flash. While we won't know for sure till it comes out, it's probably hooked up via a USB controller as well... that puts all of the storage at about the same speed (20-30MB/s) sans seek time (from lowest to highest: flash -> USB -> Bluray-a-like drive)
 
An additional ARM9? That's news to me! What are your thoughts on an ARM Cortex-A5 being included on the Wii U GPU die? And then there is the DSP. There are a lot that run at 120 MHz. I know Li Mu Bai mentioned ARM might be involved in that as well. Don't quite know if he's believable, but NXP (who already provide some 3DS parts for Nintendo) made a 120 MHz microcontroller using the ARM Cortex-M3 that was making news in 2010 for its DSP capabilities. http://www.nxp.com/news/press-relea...x-m3-microcontrollers-top-dsp-benchmarks.html
ARM is not really suited for audio. 3DS uses a CEVA DSP. No idea what the Wii U will use, as 120MHz would be a very low clock for a CEVA design.
 
Ah, dude. I feel genuinely bad now. I don't want to upset you. As an olive branch, I promise not to make a single negative comment about the Wii-U for a whole week. Just for you.

I made it. I can't believe it!

I just want to thank my friends, my family and everyone on GAF. This has been the hardest week of my life but I made it through. Follow your dreams.
 
I made it. I can't believe it!

I just want to thank my friends, my family and everyone on GAF. This has been the hardest week of my life but I made it through. Follow your dreams.

I figured it was going to end in a giant anti-Nintendo rant personally. Guess I'm out 5 internet bucks. How dare you be a mature individual! (that is sarcasm btw)


(edit) Err... wait, the sarcasm remark was supposed to be for the whole thing... not the last part about being mature... -_- Open mouth, insert foot.
 
Completely forgot about that, haha. I guess I remember the other one (1mm^2 per MB) as I saw it more times (probably from the same place).

So if I'm reading this right, that makes it 1.375MB/mm^2 since it lists the density as 11Mb/mm^2. 32MB of this would be approx 23mm^2, which is what Fourth Storm mentioned. But most of this stuff is over my head, so I could be full of crap.
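Spelling the conversion out (taking IBM's quoted 11 Mbit/mm^2 for 32nm eDRAM at face value):

```python
MBIT_PER_MM2 = 11                        # quoted 32nm eDRAM density
density_mb_per_mm2 = MBIT_PER_MM2 / 8    # 8 Mbit per MB -> 1.375 MB/mm^2
area_for_32mb = 32 / density_mb_per_mm2  # -> ~23.3 mm^2

print(density_mb_per_mm2)        # 1.375
print(round(area_for_32mb, 1))   # 23.3
```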
 
I made it. I can't believe it!

I just want to thank my friends, my family and everyone on GAF. This has been the hardest week of my life but I made it through. Follow your dreams.

I love how describing how hard it was is basically a massive troll :P
 
deviljho said:
Well, amuse me then BG :D - hypothetically, what happens if their next gen porting is harder than it was this gen?
We learn they have access to sliding technology, and must be using a different world's tech to be so different.
 
ARM is not really suited for audio. 3DS uses a CEVA DSP. No idea what the Wii U will use, as 120MHz would be a very low clock for a CEVA design.

It did sound a bit fishy to me, but ARM's website does talk up the DSP extensions of the Cortex M3 and Cortex M4. And integration on a SoC or MCM should be a breeze. Of course, a "real" DSP is still likely more capable and there is Wii BC to factor in.
 
Damn ruggedly handsome Brits having a laugh at fanboys.

THEY'VE DEDICATED YEARS OF THEIR LIVES! Their money! Potentially even the attention and love of their kids defending the products of a multinational corporation.

And you just mock them!

I'm ashamed to have wanted to buy you a beer SmokyDave! Ashamed!
 
Question: If Nintendo isn't releasing specs and won't do so after launch, how are you all going to find out what's in the Wii U?

Is there an NDA in place? So developer documentation will be leaked? Do you open up the Wii U and do research on the parts etc?

That's pretty much what will be done and what has been done in the past

That's one of the main reasons why I registered on this forum :).


Really? How much of it will be off the shelf though? Is it really as easy as Googling?

What do you mean? Explain.


I just wish to say, this reply is plain offensive. Offensive because it basically ignores a number of reasonable arguments regarding the issue, and there's no insightful remark, data or counter-argument, so it contributes nothing. People invested their time explaining why the inclusion of a faster interface would be desirable, and it would come at a small investment on Nintendo's part.

Kindergarten debate level in action, covering the ears and screaming as loud as the lungs allow. A spec thread deserves better.

Giving users convenience, options and a better experience for very little cost is not incentive enough? Opening a window of opportunity for developers so their software could perform better - should that not be wanted or appreciated?

I don't see what's wrong with being pro-consumer, instead of advocating every single policy a corporation adopts.

Made my morning. Doesn't happen very often. Yes, my intentions are good, but how can I predict how people will twist and perceive things? I can only explain so much, because my English is not perfect either.


USB 3.0 controllers aren't expensive anymore - they've almost reached price parity with USB 2.0. I've got a lot of USB 3.0 drives and I have no issues with any of them so far. The USB 3.0 cards are 5 dollars more than USB 2.0, same with the enclosures.

Cost isn't so much a factor as the maturity of the spec. USB3 is not a mature spec.

Okay, all about that mature spec - that is a valid argument, and I can see Nintendo going for reliability as always. On the other hand, I don't want to be further disappointed by a bottleneck below 40MB/s. At least 40 would be a good advantage over disc media to warrant practical differentiation.

Also, I didn't express the level of disappointment with the lack of USB 3.0 compared to whatever else. It's not a big disappointment, so stop panicking, all of you who know who you are.

Considering experiments with putting SSDs in Playstation 3 systems (SSDs are absolutely fantastic in PCs, but not worth it for the PS3), it is very clear that current gen consoles aren't primarily limited by drive speed but by processing power and fast-access memory. Faster drives would likely only provide extremely marginal benefits to the Wii U, the exclusion of USB3 is in no way a reason not to buy the system.

The Wii U's bigger RAM pool, in comparison to PS3 and 360, might have a very noticeable effect - with potentially fewer necessary (user-visible) load times and less pop-in.

I never said it's a reason not to buy the Wii U. If you and others would get familiar with me and how I talk, it's never my personal opinion that is behind the tech discussion - and that's how it's done right. My personal opinion on this is that I don't really care; I am used to waiting. Of course I have a really great PC setup which is lightning fast, but I am not an impatient guy. I've got diehard experience from the 90s, and although I've cursed Microsoft about a million times, I still don't whine about petty things which don't really matter for the gameplay experience. And secondly, you know that I won't care for most 3rd parties, so I won't even need much extra space at all - unless some super big awesome mega Metroid Prime 4 comes on 2x 25GB discs and they recommend installing, then I would gladly do it.



At least one guy told me that he might reconsider buying the console because of the fact that there is no USB3. He says that he's got a pretty fast hard drive in his PS3 and does not want to accept slower loading times.

I told him that I don't care, and that I'm happy that I can buy pretty much any device on the market and it will work. But yeah, those people exist, apparently.

That's the petty thing I was talking about. Those are usually people who have half the picture of the whole tech stuff, and you get stuff like this. The only thing he can say is that it's because you can use SATA HDDs; whether or not he actually has it connected via SATA is another question.

The funny thing is, we don't even know the final USB speed on the Wii U. It could hit USB 2.0's full 60MB/s theoretical rate, and that could be absolutely fantastic.


360 and PS3 are both SATA1; quite a few people were disappointed that SSDs had almost no visible improvement to load speeds on PS3. SATA1 should still beat USB2, but the Wii U probably doesn't even have a SATA1 controller for its internal flash. While we won't know for sure till it comes out, it's probably hooked up via a USB controller as well... that puts all of the storage at about the same speed (20-30MB/s) sans seek time (from lowest to highest: flash -> USB -> Bluray-a-like drive)

No, the Wii U's flash is more tied to the mobo. That means it's probably connected with some kind of interconnect, but it remains to be seen what kind of flash controller is there, or none at all (handled by the I/O processor). It's not on the motherboard in the Wii U, but I think it's definitely not USB.
 
360 and PS3 are both SATA1, quite a few people were disappointed that SSDs had almost no visible improvement to load speeds on PS3.

That has more to do with the games being designed around the I/O limitations - minimizing random seek/access. It shouldn't be surprising with linear games in that respect. Open world games will tend to have redundant data around the disc and/or repeated/instanced assets in the scene or surrounding area, but these games would be the ones likely to benefit the most (whether or not it's tangible or significant is another thing given the design).

While we won't know for sure till it comes out, it's probably hooked up via a USB controller as well... that puts all of the storage at about the same speed (20-30MB/s) sans seek time (from lowest to highest flash -> USB -> Bluray-a-like drive)

Regardless, the main speedups of SSDs come from parallelization, i.e. multiple flash chips - basically RAID0-like operation. Flash chips only have so much I/O on their own. It seems unlikely that they'd go with more than 2 chips TBH.
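A toy model of that RAID0-style scaling - the per-chip and controller figures here are made up purely for illustration:

```python
def striped_read_mb_s(chips, per_chip_mb_s=20, controller_cap_mb_s=60):
    """Ideal striped read speed: scales with chip count until the
    controller or interface caps it. All figures hypothetical."""
    return min(chips * per_chip_mb_s, controller_cap_mb_s)

for n in (1, 2, 4):
    print(n, "chip(s):", striped_read_mb_s(n), "MB/s")
```

With only 2 chips, as speculated above, the aggregate stays modest, which fits the "everything ends up around the same speed" picture.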
 