Rumor: Wii U final specs

I don't see what would be so challenging for third party devs. They are already used to multicore CPUs and shader-based GPUs.

Nintendo first party teams will need to go through that learning curve though.


if specific effort is required to 'get the most' from the hardware by coding for the GPGPU portion, then ordinarily I'd argue that no third parties will bother and they'll just port with minimum effort.

However, if the next gen consoles have GPGPU elements, then teams will need to get to grips with this anyway, which might benefit WiiU development.

Unless you're a Wii U games dev yourself, you can only expect not to understand, and just need to accept that!

But yeah, all rumours seem to point to all consoles being GPGPU-based, which devs will have to learn anyway, like you say. That will benefit the 720/PS4 before it benefits the Wii U ;-) But yes, in turn that would hopefully mean porting between all three new platforms will be more universal.
 
Why did we have all the moaning from devs that the Wii U was weaker than the PS360 because it had such a slow CPU, then?

Those were rumours. The Tekken dev mentioned something about it to the Western press, which was then mistranslated into 'weak CPU', to which the Tekken dev said he never said/meant that.

'Weak CPU', without context on the overall system architecture, is so far just fanboi jibba-jabba for shit-flinging contests.
 
Why did we have all the moaning from devs that the Wii U was weaker than the PS360 because it had such a slow CPU, then?

It was only one developer, Koei, and they went on to say they are still trying to figure out the system.

"With the Wii U being new hardware, we're still getting used to developing for it, so there are still a lot of things we don't know yet to bring out the most of the processing power. There's a lot that still needs to be explored in that area."

I think Nintendo went with an underclocked CPU because the GPGPU is there to offset some of the computations, and this is what Koei are having to deal with while developing Warriors Orochi 3 Hyper...
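To make the 'offset it onto the GPGPU' idea concrete, here's a minimal sketch of the kind of data-parallel work a CPU can hand off to GPU compute units. The Wii U's AMD-based GPU wouldn't use CUDA and nothing here reflects Nintendo's actual SDK - CUDA is just a familiar stand-in, and the per-particle integration step is a made-up example workload:

```cuda
#include <cstdio>

// One GPU thread per particle: the sort of bulk math (physics, skinning,
// audio mixing) that a GPGPU design can take off the CPU's plate.
__global__ void integrate(float* pos, const float* vel, float dt, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        pos[i] += vel[i] * dt;   // work the underclocked CPU no longer does
}

int main() {
    const int n = 1 << 20;   // ~1M particles
    float *pos, *vel;
    cudaMallocManaged(&pos, n * sizeof(float));
    cudaMallocManaged(&vel, n * sizeof(float));
    for (int i = 0; i < n; ++i) { pos[i] = 0.0f; vel[i] = 1.0f; }

    integrate<<<(n + 255) / 256, 256>>>(pos, vel, 1.0f / 60.0f, n);
    cudaDeviceSynchronize();

    printf("pos[0] = %f\n", pos[0]);   // expect ~0.016667
    cudaFree(pos); cudaFree(vel);
    return 0;
}
```

The catch, as posts above note, is that restructuring engine code so work like this lives on the GPU takes time and money, which is exactly what launch-window ports don't have.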
 
Unless you're a Wii U games dev yourself, you can only expect not to understand, and just need to accept that!

But yeah, all rumours seem to point to all consoles being GPGPU-based, which devs will have to learn anyway, like you say. That will benefit the 720/PS4 before it benefits the Wii U ;-) But yes, in turn that would hopefully mean porting between all three new platforms will be more universal.

slightly aggressive tone?

My point wasn't about specifically learning GPGPU coding; it was more that I expect multiplatform development towards WiiU to be conservative in its approach (see FIFA using old engines, etc.). That means limited funds for dev teams, which means 'get something out of the door quickly'.

That environment doesn't lend itself to teams taking the time to understand the architecture and get the most out of it.
 
it was more that I expect multiplatform development towards WiiU to be conservative in its approach (see FIFA using old engines, etc.). That means limited funds for dev teams, which means 'get something out of the door quickly'.

To get games ready for the launch window, yes, this is going to take place, of course it is. I'd imagine it's a crazy mad rush to get the game "working"

But for FIFA 14 on the Wii U, they are going to have to improve it, otherwise people won't upgrade to the new version.

You can't really judge 3rd party efforts by the launch window.
 
I can imagine the development process for RETRO has been difficult - there's a lot of pressure for them to deliver a visually stunning game, yet with fluctuating hardware and looming deadlines, I bet they've had a really difficult time pulling the project through. I have every faith they'll develop something amazing, but as is always the case with new tech, I bet they struggled to match graphics on the level of The Witcher 2, Skyrim, etc., let alone surpass them.

If we see Halo 4 level graphics, I'll be impressed. If we see anything surpassing that, I'll be amazed.
 
Gemüsepizza;43504669 said:
All this hardware you have listed was not something a console dev usually worked with.
Erm, I'm talking bog-standard desktop tech circa the beginning of the century, and the corresponding know-how. Gamedevs don't live in caves. Their office workstations and/or home desktops were already hosting SMP and shader tech before 2005. Devs' non-cube/ps2 projects most likely required them to know this stuff. The notion that the 360 (will return to ps3 shortly) introduced a sizable amount of tech the likes of which nobody had ever touched is as detached from reality as they come. That does not mean that every gamedev coder was brilliant at SMP code and was writing shaders day-in and day-out in 2005. But guess what - devs aren't even today, and that will remain so in the foreseeable future - there's a sound division of labor and responsibilities on every sizable game project. BTW, I'm talking from the POV of somebody who used to do PC-based game engines in that timeframe, and whose teams (on multiple projects) would include gamedevs of various backgrounds - consoles, PC, handhelds, university graduates. I'm not speaking hypothetically, I'm telling you how things were back then for a representative sample of the industry.

Gamecube = No shaders. PS2 = No pixel shaders. Many console devs had little experience with shaders when the current generation hit, and had to learn how to properly use them.
Let me guess, 'many console devs' in your eyes means ps2-only devs? Because entirely disregarding any likely PC exposure of the gamedevs from that timeframe, cube's TEV was proto-PS1.3 tech, and Xbox was dx8 through and through. Basically, your entire argument rests on the premise that there was this exotic tribe in the Amazon forest known as 'console devs' whose exposure to technology was limited to only what their console vendor of choice dropped on them via parachutes once per hw generation. Which, again, is still not sufficient for you, because ps3's 'ultra exotic' CPU tech was precisely targeting seasoned ps2 devs, who were already versed in widely-asymmetric architectures where autonomous streaming processors were meant to (pre-)chew graphics workloads. Yes, the RSX was a castrated desktop part, so that did throw quite a bump into the learning/porting curve for quite a few, but not because there was anything exotic in this tech.

What? Of course devs had to learn how to properly use this hardware. This was new technology in console space. This goes hand in hand with developing algorithms for this hardware.
No, it does not go 'hand in hand' - most graphics know-how in use today either originates from the academic/CGI/PC space, or finds its way there shortly, and from there on into the wild. Exception is Cell-targeted sw tech, which, let's face it, was a dead end. And how is 'learning how to properly use the new hw' implying that devs had no experience with SMP or shaders per se?

Wii U will profit from some of the advancements made in graphics development, but in the end it's dependent on how much developers will care. I have already pointed this out in a post before, yet everybody seems to ignore it. I mean, it's not like every Xbox 360 title from now on will look like Halo 4. Please tell me, how realistic is it that we will see devs putting their best graphics experts and big money into Wii U development to optimize their engines by the same amount they did for Xbox 360 and PS3?
You're confused, so let me help. You are mixing up high production values (which is normally an asset thing) and use of advanced graphics algorithms (which is normally an R&D thing) - those are different things. Yes, high-production-value projects do tend to also have strong R&D, but a small indie title can be high-tech just as well.

What I mean is: The hardware in the Wii U is not supposed to be something completely new. Because Nintendo didn't want this. They wanted hardware that is easy to develop for.
And yet we get anecdotal accounts of devs who managed to increase the performance of their WiiU pipelines multi-fold in the span of the last devkit cycle. Again, let me restate that how much learning WiiU devs have to do to get the hang of the platform is something you can only find out first-hand.

Maybe because you didn't read all of them. Optimization is always possible. But it costs money and time. There is no doubt that the Wii U's hardware is more capable than that of the Xbox 360 and PS3. But is there any sign that devs will actually care about this? That they will put the same effort into the Wii U that they did with the Xbox 360 and PS3? That they will put more effort into it? And how much effort is needed to produce a jump like PDZ to Halo 4? We won't see a magical jump just because "graphic algorithms will advance".
Which devs? PDZ and Halo4 are both first-party. Are you suggesting Nintendo will neglect their own platform?
 
While that remains a possibility, I find it highly unlikely for reasons that have been repeatedly stated.

I find it likely because of what else we know. With a separate processor for the OS, sound, and having GPGPU functionality, it makes sense that they wouldn't have given the CPU a lot of juice, to focus on power consumption, heat and size.

Considering both PS3 and 360 CPUs deal with OS tasks, sound and graphics calculations, they would probably try to make it on par for tasks excluding those (and considering GPGPU), so it balances out. But maybe some tasks can't be offloaded to the GPU, or only with difficulty, or there's a problem with experience or time. Or they underestimated demands, which is why we hear of problems. Who knows.
 
Blu, after all the arguing I'm not sure I still see your point. Are you saying that the difference in hardware architecture going from PS360 to WiiU is as significant as the one going from Xbox/GC/PS2 to PS360? Because if so, I disagree for the reasons I outlined further up. If not, then what are you saying?

Exception is Cell-targeted sw tech, which, let's face it, was a dead end.
Not entirely I would say. People are incredibly hyped about GPGPU these days (still), which is quite close to the Cell concept. Particularly if the "GPU" cores used for it reside on the CPU.
 
You've already been told by multiple people but continue to repeat the question. Yes, it is about POWER since every gen starts with devs trying to figure out how to best utilize that POWER.
architecture and power are not the same thing. the Wii was very underpowered compared to the 360 and PS3 but it was also hampered by having an outdated architecture.

The WiiU's POWER is similar to current gen and developers can just use existing models and assets with enhancements. So far I see the jump from PS360 to WiiU as a smaller jump than PS2 to Xbox.
i believe the same thing, and i already suggested as much by telling someone who saw this as a PS2-to-Xbox jump that 'I wouldn't go that far'.

I see no reason to expect some sort of hidden potential that blows the best current gen software away.
I don't. We'll see games on Wii U look better than current gen if a talented dev puts in the effort. gauging from the Wii, I doubt we'll see many games push the system to its best, as most Wii games fail to look better than the best Gamecube games, save a few notable exceptions.

The WiiU is designed to run current gen engines. These engines have had years to mature to their current state, and that maturity, along with developer experience, is a huge benefit to the WiiU. This was clearly part of Nintendo's plan, and it was a risky but clever idea.
The next systems are expected to have a generational jump in power that will restart the cycle of immature engines and tools all over again. The learning curve will be smoother than this gen but will still have its fair share of difficulty.
I'm not sure what other answers you're looking for since no, the next systems won't be similar simply because they have multi-core CPUs and shaders, and your questions are terribly half-assed and unclear.
as Gemusepizza (who i was replying to originally) keeps reiterating, a big part of the learning curve was learning how to program for multicore CPUs and working with shaders. that isn't talking about power, but functions. as far as we know, based on all the rumours, we are looking at PS4 and Loop featuring shader model 5.0 GPGPUs. the Wii U has an enhanced shader model 4.1 GPGPU.

if those rumours are true, the Wii U's GPGPU part is architecturally much closer to what we're going to see in Sony and Microsoft's next machines than it is to the parts we saw in the PS3 and Xbox 360.

now, that says nothing of how powerful the part is: its fill rate, how many triangles it can push, how many FLOPS it has. that is talking about its architecture. that isn't the same thing as power.

or to put it another way, writing games for the Wii U is probably going to be more like writing software for a substantially underclocked Xbox Loop than writing games for an overclocked Xbox 360.

Try to ask a question like an adult and don't act like an asshole.
if you take the time to read my posts and actually respond to what i said, rather than presuming i said something else, i won't be so dismissive. gemuse was talking about architecture, not power, and continues to talk about architecture. i was responding to him. it's not difficult to follow and yet somehow you thought i was talking about power.

you don't have to go much further back in this thread than the original post of mine you quoted to see me saying 'I wouldn't go that far' in response to the idea that the Wii U is to the 360/PS3 what the Xbox was to the PS2.

was my question worded too strongly in suggesting that Gemusepizza was saying we're seeing the best the system can do at launch? absolutely. was it about power rather than architecture?

nope.

expecting to see the best of any system at launch is stupid, as most launch games are going to be rushed to meet that deadline, and teams will only have had dev kits with finalised specs to work with for a few months.

i've never said i expected to see large gains over the launch titles anywhere in this thread... but we should see noticeably improved graphics over current gen titles if and when we get one of the best coding teams trying to push the system.

Naughty Dog, for example, would be able to make a game that looked better than anything they could do on the PS3, even if just with all that extra memory. the Wii U is more powerful. it'll be apparent in the right hands.
 
I can imagine the development process for RETRO has been difficult - there's a lot of pressure for them to deliver a visually stunning game, yet with fluctuating hardware and looming deadlines, I bet they've had a really difficult time pulling the project through. I have every faith they'll develop something amazing, but as is always the case with new tech, I bet they struggled to match graphics on the level of The Witcher 2, Skyrim, etc., let alone surpass them.

If we see Halo 4 level graphics, I'll be impressed. If we see anything surpassing that, I'll be amazed.

Isn't most of Retro made up of former PC developers? I think the more mundane architecture of current consoles would be right up their alley.
 
I find it likely because of what else we know. With a separate processor for the OS, sound, and having GPGPU functionality, it makes sense that they wouldn't have given the CPU a lot of juice, to focus on power consumption, heat and size.

Considering both PS3 and 360 CPUs deal with OS tasks, sound and graphics calculations, they would probably try to make it on par for tasks excluding those (and considering GPGPU), so it balances out. But maybe some tasks can't be offloaded to the GPU, or only with difficulty, or there's a problem with experience or time. Or they underestimated demands, which is why we hear of problems. Who knows.

Oh, don't get me wrong, I don't expect the CPU to be a beast by modern standards. I just think that even being very small and not clocked very high, it's highly unlikely to be less powerful than Xenon.
 
Blu, after all the arguing I'm not sure I still see your point. Are you saying that the difference in hardware architecture going from PS360 to WiiU is as significant as the one going from Xbox/GC/PS2 to PS360? Because if so, I disagree for the reasons I outlined further up. If not, then what are you saying?
No, I'm arguing that at the end of the day it was not so much the novel tech in those platforms that dictated the observable state-of-the-art on those platforms through the end of their lifespans. It was the algorithmic know-how pertinent to the hw, developed through the hw's lifespan, which played the crucial role in something like 'PDZ vs Halo 4'. I'm arguing that the sw side of things did not simply come down to 'devs originally did not know SMP or shaders' - that's a misdirected argument at best, and more like a gross bastardisation of the subject.

Not entirely I would say. People are incredibly hyped about GPGPU these days (still), which is quite close to the Cell concept. Particularly if the "GPU" cores used for it reside on the CPU.
Ok, well, it was a dead-end on the CPU end, let's put it like this.
 
Let's put some names in this discussion. I don't think the Wii U games of 2015 will look only a little better than Assassin's Creed 3 and ZombiU.

Devs may have less new technology to master on the Wii U, but I believe in-game graphics will improve notably over the years.

And the Xbox 360 is a good example of how this is possible. When the machine turned 5 years old, you could say devs knew everything about its hardware, no? Yet games like AC3 and Halo 4 show a notable improvement over games launched 2 years earlier.

Wii U hardware doesn't bring anything new to PC developers? Okay, but that won't stop companies from making graphics better and better with each new game they make.
 
Isn't most of Retro made up of former PC developers? I think the more mundane architecture of current consoles would be right up their alley.

There were a few people with a PC background in the original team (I seem to remember one or two from id), but by this point it's a fairly diverse group as far as development backgrounds go.

I thought Retro was mainly ex-Iguana (N64 FPS developer) staff?

The original studio head (who didn't last long) was from Iguana. There might have been a couple of others, I'm not sure, though.
 
Isn't most of Retro made up of former PC developers? I think the more mundane architecture of current consoles would be right up their alley.

Retro was founded by former members of Iguana Entertainment (Turok, South Park) and I know at least one person (can't remember the name) was a former artist from Naughty Dog. Don't know about PC devs, though.
 
slightly aggressive tone?

My point wasn't about specifically learning GPGPU coding; it was more that I expect multiplatform development towards WiiU to be conservative in its approach (see FIFA using old engines, etc.). That means limited funds for dev teams, which means 'get something out of the door quickly'.

That environment doesn't lend itself to teams taking the time to understand the architecture and get the most out of it.
So essentially another Wii situation.

I'm certainly glad I'm not rushing to waste money this time around.
 
So essentially another Wii situation.

I'm certainly glad I'm not rushing to waste money this time around.

Anyone who "wasted money" on the Wii can only blame themselves for not waiting to see what games would be available.
 
I don't see what would be so challenging for third party devs. They are already used to multicore CPUs and shader-based GPUs.

<snip>

However, if the next gen consoles have GPGPU elements, then teams will need to get to grips with this anyway, which might benefit WiiU development.

These two sentences seem to contradict each other. Blu would know more about this than me, but isn't GPGPU essentially unused in current console development? There will be more emphasis on this next generation, which we can only assume will generate better results on Wii U.

Just think, the ports currently on Wii U are doing so without the use of its GPGPU, and devs are still getting to grips with the system. It would be a bit ignorant to think we won't see an improvement of sorts.

This improvement may have to wait until the next consoles from MS/Sony arrive, because by then it may be financially acceptable to spend the time optimising engines for GPGPU-centric systems, but it will come.
 
I wasn't aware of this. I thought the base hardware was exactly the same, just shrunk down.


The CPU, codenamed Xenon, implemented three in-order PowerPC cores with SMT support - meaning the whole chip could work on six threads at the same time. The design was ahead of its time but given its 90nm manufacturing process it only had 1MB of L2 cache to share among all three cores. These days it isn't really considered the ideal approach to a many-core CPU. Private L2 caches with a large shared L3 cache is preferred for scaling beyond two cores.

Which is probably implemented in the WiiU.
We know the CPU will have a large amount of eDRAM.

the new Xbox 360 consumes less than half the power of the original 360!

The WiiU is also very power efficient.
In other words, power consumption is not a reliable determinant of how powerful a system really is.
In the case of the 360, becoming more power efficient didn't make it any less powerful.

One interesting thing about the new design is the inclusion of a "FSB Replacement" block. IBM/GlobalFoundries could have just connected the CPU and GPU with a low-latency internal connection, but doing so would have made the new Xbox 360 faster than previous versions. The FSB Replacement block actually adds latency to the mix and introduces a performance hit to keep the new model from outpacing older versions.


This is what they managed to do with old hardware. The WiiU is new hardware designed with low latency in mind. This is why I think the differences between the WiiU and next gen will be minimal. Sony or MS would have to come out with a really powerful system for people to notice a difference between the consoles at launch. If not... graphics will not be a factor. It will be Gamepad vs Kinect vs VR glasses.
 
I can imagine the development process for RETRO has been difficult - there's a lot of pressure for them to deliver a visually stunning game, yet with fluctuating hardware and looming deadlines, I bet they've had a really difficult time pulling the project through. I have every faith they'll develop something amazing, but as is always the case with new tech, I bet they struggled to match graphics on the level of The Witcher 2, Skyrim, etc., let alone surpass them.

If we see Halo 4 level graphics, I'll be impressed. If we see anything surpassing that, I'll be amazed.
It wouldn't surprise me in the least if they easily surpassed Halo 4's presentation. These are the same guys that made the Prime games and Donkey Kong Wii. Whatever they make on the Wii U, with its far more advanced CPU and 10x the memory, should be absolutely breathtaking.
 
Ok, can someone tell me why Gemüsepizza thinks the GC didn't support shaders at all? I mean, I'm pretty sure the TEV wasn't there for nothing. And while I am aware they weren't your average shaders, they were still shaders nonetheless.
 
Ok, can someone tell me why Gemüsepizza thinks the GC didn't support shaders at all? I mean, I'm pretty sure the TEV wasn't there for nothing. And while I am aware they weren't your average shaders, they were still shaders nonetheless.

I think he's just confused by the fact that the GCN didn't have programmable shaders, which is something a lot of people get confused about. There was only a brief window between the introduction of pixel/vertex shaders in general and actual programmable shaders.
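To illustrate the distinction: a TEV stage evaluates one fixed blend equation per stage and the developer only picks the operands, whereas a programmable shader runs arbitrary developer-written instructions. Here's a rough CPU-side sketch of a single simplified TEV-style stage - the real hardware equation also has bias, scale and comparison options, so treat this as an approximation, not the actual formula:

```cuda
#include <cstdio>

// Simplified TEV-style stage: the formula is fixed in hardware,
// only the inputs a, b, c, d are selected by the developer.
float tev_stage(float a, float b, float c, float d) {
    return d + (a * (1.0f - c) + b * c);   // lerp(a, b, c) plus d
}

int main() {
    // e.g. blend two texture samples by an alpha value, then add a constant
    printf("%f\n", tev_stage(0.2f, 0.8f, 0.5f, 0.1f));   // prints 0.600000
    return 0;
}
```

Chain enough of those stages together and you can fake a lot of 'shader' effects, which is why the TEV counts as shader hardware even though you can't upload a program to it.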
 
Also, for those concerned about the quality, speed & quantity of the Wii U's entire memory department, don't worry, they are "marvels of optimization", from the caches to the RAM. It's a recurring compliment from what I've heard.
 
Also, for those concerned about the quality, speed & quantity of the Wii U's entire memory department, don't worry, they are "marvels of optimization", from the caches to the RAM. It's a recurring compliment from what I've heard.

Have you heard anything about bandwidth specifically compared to PS3 or XBox360?
 
Also, for those concerned about the quality, speed & quantity of the Wii U's entire memory department, don't worry, they are "marvels of optimization", from the caches to the RAM. It's a recurring compliment from what I've heard.
Mon ami, will the OS really take 1 GB of space?
If it uses less than that, will the rest be reallocated as extra RAM for games?
 
Also, for those concerned about the quality, speed & quantity of the Wii U's entire memory department, don't worry, they are "marvels of optimization", from the caches to the RAM. It's a recurring compliment from what I've heard.
Thanks for your reply. I'm assuming that also has to do with whatever the Toki Tori developers are using to reduce their texture RAM usage by 100MB. It is interesting that they cannot say how they are doing it.
 
Have you heard anything about bandwidth specifically compared to PS3 or XBox360?

Can't be too specific, but all parameters are balanced and well thought out, aimed at extreme efficiency for running code, with really low latency.

And the good thing is, it seems there is huge room for improvement from developers themselves. Between the first dev kits and the latest, as they grew accustomed to the memory organization (on top of the improved dev kit components, SDK, etc.), the difference is huge. It's obviously pretty common for new hardware, but really impressive on Wii U.
 
Wii U hardware doesn't bring anything new to PC developers? Okay, but that won't stop companies from making graphics better and better with each new game they make.

IMO the people saying we'll see little to no improvement are fighting a losing battle. No matter how new or old hardware is, there's always room for improvement.

The only thing I don't agree with is the statement that we'll see the same leap in improvement as we saw with the PS3 or 360. I understand it's new hardware, and that PDZ pushed the 360 as hard as Halo 4, just less efficiently. I don't see how it's wrong to believe developers are achieving better utilization of the Wii-U's hardware at launch than they achieved on the PS3 or 360 at launch. Regardless of the new hardware in the Wii-U, there are still practices and general improvements/lessons learned over the years that will benefit not just development on the Wii-U but also the PS4 and 720. The only difference between those three systems is the pool of resources developers can play with.

So while I fully expect Wii-U games to improve and impress, I think expecting a PDZ->Halo 4 level of improvement is unrealistic.
 
Mon ami, will the OS really take 1 GB of space?
If it uses less than that, will the rest be reallocated as extra RAM for games?

Well, the operating system itself, stricto sensu (the files that constitute the "Windows" of the Wii U), no. But all the system "functions" (the "Windows" files + all the services running in the background, the software features attached), apparently yes, mostly for caching purposes. Some techies can explain this better than me; we already talked about this in a previous WUST.

But the 1GB for games/1GB not available for devs is a sure thing, and was revealed here in February.
 
Thanks for your reply. I'm assuming that also has to do with whatever the Toki Tori developers are using to reduce their texture RAM usage by 100MB. It is interesting that they cannot say how they are doing it.

It's likely some kind of texture compression technique that isn't supported on other platforms. Nintendo did that back in the day with hardware S3TC support in the Gamecube. I don't know what the modern equivalent might be.
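For a sense of why texture compression frees up so much memory: S3TC/DXT1, the format the GameCube supported in hardware, packs a 4x4 block of pixels into two 16-bit endpoint colors plus 2-bit per-pixel indices along the gradient between them. A quick sketch of the arithmetic (the Toki Tori devs' actual technique is unknown, so this is only an analogy):

```cuda
#include <cstdio>
#include <cstdint>

// One S3TC/DXT1 block encodes a 4x4 group of pixels in 8 bytes.
struct DXT1Block {
    uint16_t color0, color1;   // RGB565 endpoint colors
    uint32_t indices;          // 16 pixels x 2 bits each, picking a point
                               // on the color0..color1 gradient
};

int main() {
    size_t raw  = 4 * 4 * 4;            // 4x4 block of RGBA8 = 64 bytes
    size_t dxt1 = sizeof(DXT1Block);    // 8 bytes
    printf("compression ratio: %zu:1\n", raw / dxt1);   // 8:1
    return 0;
}
```

At ratios like that, shaving 100MB off a texture budget is entirely plausible.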
 
Well, the operating system itself, stricto sensu (the files that constitute the "Windows" of the Wii U), no. But all the system "functions" (the "Windows" files + all the services running in the background, the software features attached), apparently yes, mostly for caching purposes. Some techies can explain this better than me; we already talked about this in a previous WUST.

But the 1GB for games/1GB not available for devs is a sure thing, and was revealed here in February.
Merci!
 
Ideaman is back! :)

Good to hear more confirmation of Wii U's well-thought-out memory hierarchy. Although, we've really been hearing nothing but good things about this since Brain_Stew's initial bombas back in the day. Even if they kept it at 1 GB permanently, it seems like a nice amount to play around with given the capabilities of the GPU and CPU.

Now, if only we could get some bandwidth numbers! It's obvious Nintendo has gone w/ DDR3, so let us all hope that it's the fast 1066 MHz type (or a very slightly downclocked custom version of it - for clock sync purposes).
 
Ideaman is back! :)

Good to hear more confirmation of Wii U's well-thought-out memory hierarchy. Although, we've really been hearing nothing but good things about this since Brain_Stew's initial bombas back in the day. Even if they kept it at 1 GB permanently, it seems like a nice amount to play around with given the capabilities of the GPU and CPU.

Now, if only we could get some bandwidth numbers! It's obvious Nintendo has gone w/ DDR3, so let us all hope that it's the fast 1066 MHz type (or a very slightly downclocked custom version of it - for clock sync purposes).

With Samsung 30nm or 20nm DDR3 they could potentially go as high as 1.2GHz while keeping within thermal and power constraints. Double the GPU clock seems a reasonable possibility.
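For anyone who wants the arithmetic behind those hopes: peak bandwidth is just effective transfer rate times bus width. Assuming the rumoured DDR3-1066 on a 64-bit bus (both figures are speculation at this point, not confirmed specs):

```cuda
#include <cstdio>

int main() {
    double transfers_per_sec = 1066e6;   // DDR3-1066 = 1066 MT/s
    double bytes_per_xfer    = 64 / 8.0; // 64-bit bus = 8 bytes per transfer
    printf("peak: %.1f GB/s\n",
           transfers_per_sec * bytes_per_xfer / 1e9);   // ~8.5 GB/s
    // For comparison, the 360's GDDR3 (700 MHz, double data rate, 128-bit):
    // 1.4e9 * 16 / 1e9 = 22.4 GB/s. Any main-memory deficit would have to
    // be made up by the Wii U's eDRAM, as discussed above.
    return 0;
}
```

A 128-bit bus or faster chips would scale those numbers up proportionally, which is why the exact memory configuration matters so much.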
 
But the 1GB for games/1GB not available for devs is a sure thing and revealed here in February.

Is this something that can be optimized, giving the developers more RAM to work with in the future? Were the 360/PS3's OSes optimized throughout their lives?
 
Is this something that can be optimized, giving the developers more RAM to work with in the future? Were the 360/PS3's OSes optimized throughout their lives?

Yes and yes. The PS3 initially allocated 128MB to the OS, and it's now down to (I think) 32MB.
 
Is this something that can be optimized, giving the developers more RAM to work with in the future? Were the 360/PS3's OSes optimized throughout their lives?

3DS did that as well, I believe.
 
I can imagine the development process for RETRO has been difficult - there's a lot of pressure for them to deliver a visually stunning game, yet with fluctuating hardware and looming deadlines, I bet they've had a really difficult time pulling the project through. I have every faith they'll develop something amazing, but as is always the case with new tech, I bet they struggled to match graphics on the level of The Witcher 2, Skyrim, etc., let alone surpass them.

If we see Halo 4 level graphics, I'll be impressed. If we see anything surpassing that, I'll be amazed.

And this is why I really hope that Retro is working on a Metroid Prime title. Prime 3 looked fantastic considering the hardware it was on. That and I want Wii U Pad scanning, just for novelty's sake.

I've never heard that about 3DS RAM (apart from in similar GAF posts).
There were rumours of extra processing power being unlocked, though.

I thought Nintendo reduced the amount of RAM that the OS used, thus giving developers a bit more to use. Similar to how the PSP got its CPU clock rate increased.
 