Wii U Speculation Thread of Brains Beware: Wii U Re-Unveiling At E3 2012

Status
Not open for further replies.
boris feinbrand said:
Reasonable guess, I'd say. I don't know why console RAM would be more expensive than desktop RAM, though. Lower power consumption requirements, maybe?

Still 2 GB would be my personal wish for next gen consoles.
The RAM you find in game consoles/handhelds is lower latency, faster, and consumes significantly less power than the RAM found in PCs and smartphones. So yes, the price difference is significant.
 
Vieo said:
It would be sweet if the tablet could double as an eReader. Do it, Nintendo! DO EEEET!

I think your post got lost in the mix. That would be a neat idea to pull off and promote.

I may have missed it, as I looked back and didn't see this posted: Dirt and F1 are confirmed for Wii U.

http://www.mcvuk.com/news/read/mcv-interview-rod-cousens-on-wii-vita-ea-and-pre-owned/082618

MCV Interview: Rod Cousens on... Wii U, Vita, EA and pre-owned

...

In terms of Wii U, the interesting aspect is timing and price and that has not been announced.

Everybody knows it is coming. I anticipate further price cuts on existing consoles prior to Christmas this year, and there’s room for further price cuts next year.

So Wii U needs to be very keenly priced – and in the same way that Sony is having to price Vita when the competition is smartphones and tablets. Pricing is critical in a very fragmented hardware space. So I am not sure they are going to be able to come out and hit a premium price point from the outset.

But Nintendo is smart. It is always innovative. It has demonstrated its breakthrough ability in the past, so let’s see if it can do it again. And the industry should cheer it on.

We will be doing Dirt and F1 for Wii U.
 
Instro said:
The RAM you find in game consoles/handhelds is lower latency, faster, and consumes significantly less power than the RAM found in PCs and smartphones. So yes, the price difference is significant.
I assume Nintendo will use standard DDR3, most likely 1GB, with a quad channel interface if we're lucky (POWER7/ A2 memory controller).
 
I'm wondering what, if any, significant changes will be made to the Wii U from what we saw at E3. I think the form factor for the controller will be modified. I also think the console casing will see some changes. What does everyone else think?
 
LegendofJoe said:
I'm wondering what if any significant changes will be made to the WiiU from what we saw at E3. I think the form factor for the controller will be modified. I also think the console casing will see some changes. What does everyone else think?

A new name. Hopefully.
 
wsippel said:
I assume Nintendo will use standard DDR3, most likely 1GB, with a quad channel interface if we're lucky (POWER7/ A2 memory controller).
DDR3 is ridiculously cheap and will only get cheaper with time. I can really see Wii U using it instead of GDDR; the cost savings would be massive and would enable them to put something like 2GB in for very little cost. They could even go for a split RAM architecture like PS3, with 1.5GB DDR3 and 512MB GDDR5.
 
wsippel said:
I assume Nintendo will use standard DDR3, most likely 1GB, with a quad channel interface if we're lucky (POWER7/ A2 memory controller).

How much do you think Nintendo will reserve for the OS and other background tasks?
 
lwilliams3 said:
How much do you think Nintendo will reserve for the OS and other background tasks?


360 used like 40MB, I think.
3DS uses 32MB.
So, probably like 50MB. Maybe more if they want to have the OS on the tablet while you're in game.
 
wsippel said:
I assume Nintendo will use standard DDR3, most likely 1GB, with a quad channel interface if we're lucky (POWER7/ A2 memory controller).
Would the quad channel interface also apply to the GPU? (aka the only thing that matters) And what about latency? And what about the fact that Nintendo has never downgraded components from one generation to the next?

I'm sticking by GDDR3/5 for the main RAM pool. I don't see any point to going with quad-channel RAM. I don't think it would save any money.

Stabbie said:
I haven't followed the Wii U since E3. Has anything been confirmed since the E3 reveal?
Nope.
 
wsippel said:
I assume Nintendo will use standard DDR3, most likely 1GB, with a quad channel interface if we're lucky (POWER7/ A2 memory controller).

Keeping the promised "tons of embedded RAM" in mind (my guess is around 30MB, GPU + CPU combined), I wouldn't write off this possibility for the main RAM. Even 2 gigs of DDR3 would hardly cost them 10 bucks, and with the quad channel interface it would be reasonably fast.

EDIT: AFAIK GDDR5 is based on DDR3, so what makes it more expensive?
 
Mr_Brit said:
DDR3 is ridiculously cheap and will only get cheaper with time. I can really see Wii U using it instead of GDDR; the cost savings would be massive and would enable them to put something like 2GB in for very little cost. They could even go for a split RAM architecture like PS3, with 1.5GB DDR3 and 512MB GDDR5.
I don't really see the point. DDR3 is actually faster than GDDR5. It's usually held back by the bus, but that wouldn't be an issue with IBM's latest DDR3 controllers (quad channel and dual quad channel, up to >200GB/s).
 
Mr_Brit said:
DDR3 is ridiculously cheap and will only get cheaper with time. I can really see Wii U using it instead of GDDR; the cost savings would be massive and would enable them to put something like 2GB in for very little cost. They could even go for a split RAM architecture like PS3, with 1.5GB DDR3 and 512MB GDDR5.

I don't think GDDR5 will be used. I can see 1.5GB of DDR3 and then the eDRAM.

BurntPork said:
Would the quad channel interface also apply to the GPU? (aka the only thing that matters) And what about latency? And what about the fact that Nintendo has never downgraded components from one generation to the next?

I'm sticking by GDDR3/5 for the main RAM pool. I don't see any point to going with quad-channel RAM.


Nope.

I wouldn't consider it a real downgrade since it's newer than GDDR3.
 
wsippel said:
I don't really see the point. DDR3 is actually faster than GDDR5. It's usually held back by the bus, but that wouldn't be an issue with IBM's latest DDR3 controllers (quad channel and dual quad channel, up to >200GB/s).
Do you know what bandwidth something like quad channel DDR3 at say 1600MHz would have? Would it be below or above GDDR5 in a midrange GPU?

BurntPork said:
Would the quad channel interface also apply to the GPU? (aka the only thing that matters) And what about latency? And what about the fact that Nintendo has never downgraded components from one generation to the next?

I'm sticking by GDDR3/5 for the main RAM pool. I don't see any point to going with quad-channel RAM. I don't think it would save any money.


Nope.
Wouldn't GDDR3 be slower and more expensive than DDR3?
 
Mr_Brit said:
Wouldn't GDDR3 be slower and more expensive than DDR3?
I think they're similar in terms of speed, but GDDR3 has lower latency. I'm not really sure though, since I never bothered to really sit down and compare them.

EDIT: Quick Google search shows that GDDR3 is significantly faster than DDR3, though that doesn't take quad channel into account.

wsippel said:
I don't really see the point. DDR3 is actually faster than GDDR5. It's usually held back by the bus, but that wouldn't be an issue with IBM's latest DDR3 controllers (quad channel and dual quad channel, up to >200GB/s).
But would it save them any money, and how would it be in terms of latency?
 
Mr_Brit said:
Do you know what bandwidth something like quad channel DDR3 at say 1600MHz would have? Would it be below or above GDDR5 in a midrange GPU?
~100GB/s. That's not far from the memory used on the 4870 line of cards (115GB/s), which obviously wasn't exactly midrange and used higher clocked RAM (1.8GHz).
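A back-of-the-envelope check on those figures (a sketch only; the transfer rates and bus widths below are the standard published DDR3/GDDR5 parameters, not anything confirmed for Wii U):

```python
# Peak theoretical bandwidth = effective transfer rate x bus width in bytes.
def bandwidth_gbs(transfers_per_sec, bus_width_bits):
    """Peak memory bandwidth in GB/s (decimal)."""
    return transfers_per_sec * (bus_width_bits / 8) / 1e9

# Quad-channel DDR3-1600: four 64-bit channels at 1600 MT/s each.
ddr3_quad = 4 * bandwidth_gbs(1600e6, 64)   # 51.2 GB/s
# Dual quad-channel (eight channels, POWER7-style) doubles that.
ddr3_octo = 2 * ddr3_quad                   # 102.4 GB/s
# Radeon HD 4870 GDDR5: 1.8 GHz data clock -> 3.6 GT/s effective, 256-bit bus.
gddr5_4870 = bandwidth_gbs(3.6e9, 256)      # 115.2 GB/s

print(ddr3_quad, ddr3_octo, gddr5_4870)
```

Worth noting: four 64-bit DDR3-1600 channels alone top out around 51GB/s, so the ~100GB/s figure lines up with the eight-channel (dual quad-channel) configuration rather than plain quad channel.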
 
BurntPork said:
I think they're similar in terms of speed, but GDDR3 has lower latency. I'm not really sure though, since I never bothered to really sit down and compare them.

EDIT: Quick Google search shows that GDDR3 is significantly faster than DDR3, though that doesn't take quad channel into account.
GDDR3 is DDR2.
 
wsippel said:
GDDR3 is DDR2.
It's only based on DDR2. They're still very different. I'm talking in terms of speed and latency. Newer doesn't always mean faster. Do you think that DDR4 is going to be faster than GDDR5?

Can you answer my questions now?
 
BurntPork said:
I think they're similar in terms of speed, but GDDR3 has lower latency. I'm not really sure though, since I never bothered to really sit down and compare them.

EDIT: Quick Google search shows that GDDR3 is significantly faster than DDR3, though that doesn't take quad channel into account.


But would it save them any money, and how would it be in terms of latency?
An 8800GT uses GDDR3 and has bandwidth of around 55GB/s, and it's expected that the GPU in the Wii U will be around this power or lower. Wsippel says quad channel DDR3 has around 100GB/s of bandwidth, so that should be plenty for the Wii U while also being significantly cheaper than GDDR. If they do end up using DDR3, then 2GB is the minimum amount I'd expect, as opposed to the 1GB I'd expect if they go with GDDR5.
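As a sanity check on that 8800GT number (a sketch using the stock 8800GT memory spec of 900 MHz GDDR3 on a 256-bit bus, which is an assumption about the reference card, not a Wii U spec):

```python
# GDDR3 is double data rate: 900 MHz data clock -> 1.8 GT/s effective.
transfers_per_sec = 2 * 900e6
bus_width_bytes = 256 // 8          # 256-bit bus -> 32 bytes per transfer
bandwidth_gbs = transfers_per_sec * bus_width_bytes / 1e9
print(bandwidth_gbs)                # 57.6, close to the ~55 GB/s cited above
```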
 
AceBandage said:
Around or lower than an 8800GT?
Pretty sure a 4770 is above that.
Brit has already decided that Wii U has an RV730 at 500-600MHz.

Does anyone know if a quad-channel interface would cost more? And how could it be shared between the CPU and GPU?

Lovely Salsa said:
Yea I'm not buying it

Nice try Nintendo
 
[Nintex] said:
Devkits had RV770LE chips according to the latest reports (Radeon 4830).
Source? I've not heard any GPU leaks from the latest dev kits. If true, it's much better than most were expecting.
 
Mr_Brit said:
Aren't most people expecting around RV730? That is quite a bit lower than an 8800GT.
Mostly the naysayers are expecting an RV730. No one in their right mind expects Nintendo to use a GPU based on the 4670.
 
No, Brit. Most people were never expecting that. Just you and, I think, the pixel counting article.

BurntPork said:
It's only based on DDR2. They're still very different. I'm talking in terms of of speed and latency. Newer doesn't always mean faster. Do you think that DDR4 is going to be faster than GDDR5?

Can you answer my questions now?

Everything I've seen from when wsippel first proposed DDR3 a little while back said the difference was negligible. GDDR3 had the lower latency, but DDR3 made up for it through clock speed. The key difference seems to be the read/write ability of GDDR.

Also here is a benchmark for an XFX 4770 that is supposed to be using DDR3 while the other two 4770's (MSI and Gigabyte) have GDDR5. I found this in an old Beyond3D thread.

http://www.motherboards.org/reviews/hardware/1897_8.html

Just from that when looking at performance vs cost there's no way I could see Nintendo justifying GDDR5 over DDR3.
 
bgassassin said:
No Brit. Most people were never expecting that. Just you and I think the pixel counting article.



Everything I've seen from when wsippel first proposed DDR3 a little while back said the difference was negligible. GDDR3 had the lower latency, but DDR3 made up for it through clock speed. The key difference seems to be the read/write ability of GDDR.

Also here is a benchmark for an XFX 4770 that is supposed to be using DDR3 while the other two 4770's (MSI and Gigabyte) have GDDR5. I found this in an old Beyond3D thread.

http://www.motherboards.org/reviews/hardware/1897_8.html

Just from that when looking at performance vs cost there's no way I could see Nintendo justifying GDDR5 over DDR3.
I have a hard time believing that the quad-channel interface isn't expensive, and an even harder time believing that DDR3 won't have major latency issues. If neither of these is an issue, then Nintendo has NO EXCUSE to have less than 3GB of RAM or to price it above $250.
 
Mr_Brit said:
Source? I've not heard any GPU leaks from the latest dev kits. If true, it's much better than most were expecting.
It was posted in this very thread by a guy named lherre or something; acebandage mentions his posts as well, but he seems to have edited it.

Nintendo usually picks the right GPU. The question for me is: will they decrease clockspeeds to fix the overheating issue and improve yields, and will they cheap out on RAM? The latter seems unlikely given that Ubisoft commented on the 'large pool of RAM' that they liked, and Gearbox said the same thing, I believe.
 
^ Don't forget Wii BC. I wouldn't be shocked to see Nintendo use a larger amount for GPU eDRAM.

BurntPork said:
I have a hard time believing that the quad-channel interface isn't expensive, and an even harder time believing that DDR3 won't have major latency issues. If neither of these is an issue, then Nintendo has NO EXCUSE to have less than 3GB of RAM or to price it above $250.

Got your edit in, I see. I don't know how cost effective it will be to use the interface from a POWER7. However, like I said, everything I've read indicates there would not be any major latency issues. Your edit is just you getting into that old habit of overreacting.

[Nintex] said:
It was posted in this very thread by a guy named lherre or something, acebandage mentions his posts as well but he seems to have editted it.

Actually, wsippel posted it. lherre said there's nothing (except for one possible thing) in the specs they have to suggest that it will use an R700 variant.
 
sfried said:
Do you think it's possible they dropped the 1T-SRAM? It seems likely considering they dropped GC BC.
The 4750? I didn't even know that existed. (lol. I'm new to the PC gaming scene.) Hm...

Based on the TDP of that, a 600MHz RV740 @ 40nm could actually fit into the Wii U. That's interesting.
 
bgassassin said:
Got your edit in I see. I don't know how cost effective it will be to use the interface from a POWER7. However like I said everything I've read indicates there would not be any major latency issues. Your edit is just you getting into that old habit of overreacting.
You have to admit that 1GB of cheap-ass DDR3 would be pretty dumb. I could maybe deal with 2GB, but less than that is just silly, especially considering the fact Nintendo has always been good about adding plenty of RAM. I mean, look at the 3DS. Less power than the PS2, but more RAM than the Wii? How does that work?

... And I hit submit by accident again.
 
BurntPork said:
The 4750? I didn't even know that existed. (lol. I'm new to the PC gaming scene.) Hm...

Based on the TDP of that, a 600MHz RV740 @ 40nm could actually fit into the Wii U. That's interesting.
4750 is basically just a DDR3 version of the 4770.
 
Mr_Brit said:
No it's not. It's just a 4770 with a lower clocked core. Everything else is exactly the same as the 4770.
I'M SO CONFUSED


*checks* Oh hey, Brit's right. Okay, so I think a 600MHz 4750 could barely fit in there and it's around the right size to fit on an SoC.
 
[Nintex] said:
Interesting posts from our old friend brain_stew there :)

I agree. Reaffirms what most of us believed. And for those wondering his posts are on page 5 of that link.

BurntPork said:
You have to admit that 1GB of cheap-ass DDR3 would be pretty dumb. I could maybe deal with 2GB, but less than that is just silly, especially considering the fact Nintendo has always been good about adding plenty of RAM. I mean, look at the 3DS. Less power than the PS2, but more RAM than the Wii? How does that work?

... And I hit submit by accident again.

Is there enough info to say 3DS is weaker than the PS2?

Anyway, I don't think it's dumb, especially with it being complemented by the eDRAM. Also, that benchmark test indicates it's not an issue when used on cards, and it would be better for the CPU as well. I get the feeling that 1.5GB is the target. I say this looking at the dev kit kinda like the build IGN had, but acknowledging one thing they forgot: they forgot to count the memory on the graphics card. If we're looking at off-the-shelf parts for the dev kit, the kit might have 1GB of memory and a card that has 512MB, or (unlikely) vice versa. I would guess 1.5GB is the target with 2GB as a possibility. Ubisoft said the amount of memory wasn't an issue awhile back.
 
sfried said:
Do you think it's possible they dropped the 1T-SRAM? It seems likely considering they dropped GC BC.
The Wii had 1T-SRAM too, so they'd still need some sort of equivalent for Wii emulation. My guess: the CPU will have 24MB of eDRAM (which is practically the same thing as 1T-SRAM).
 
Mr_Brit said:
No it's not. It's just a 4770 with a lower clocked core. Everything else is exactly the same as the 4770.
There are DDR3 varieties of the 4750. They were mostly reserved for OEM machines.
 
bgassassin said:
I agree. Reaffirms what most of us believed. And for those wondering his posts are on page 5 of that link.



Is there enough info to say 3DS is weaker than the PS2?

Anyway, I don't think it's dumb especially with it being complimented by the eDRAM. Also that benchmark test indicates that it's not an issue when using it on cards. And it would be better for the CPU as well. I get the feeling that 1.5GB is the target. I say this looking at the dev kit kinda like the build IGN had, but acknowledging one thing they forgot. They forgot to count the memory on the graphics card. If we're looking at off the shelf parts for the dev kit, the kit might have 1GB of memory and a card that has 512MB, or (unlikely) vice versa. I would guess 1.5GB is the target with 2GB as a possibility. Ubisoft said the amount of memory wasn't an issue awhile back.
1GB is twice as much as current consoles, so it wouldn't be "out there" for Ubisoft to call that a large amount. All we know for sure is that it has more RAM than current consoles. Also, I REALLY think IGN made up every bit of info that originated from them.
 
BurntPork said:
1GB is twice as much as current consoles, so it wouldn't be "out there" for Ubisoft to call that a large amount. All we know for sure is that it has more RAM than current consoles. Also, I REALLY think IGN made up every bit of info that originated from them.

I think you're missing what I'm saying. 1GB system + 512MB VRAM, meaning the final would most likely be a unified amount of 1.5GB DDR3. I'm not just focusing on the system amount of 1GB. I can see the person that "leaked" the info way back when doing the same thing that IGN and you just did by forgetting about the VRAM. Then from there the leak focused on 1GB. IGN's system technically had a total of 2.5GB of memory, and I don't think they made it up, as it was consistent with the other early rumors.

Luigiv said:
The Wii had 1T-SRAM too, so they'd still need some sort of equivalent for Wii emulation. My guess, the CPU will have 24MB EDRAM (which is practically the same thing as 1T-SRAM).

Don't forget that what the Wii and GC had was for the GPU. Wii U is using IBM's eDRAM process for the CPU and I can see some type of eDRAM (could be 1T-SRAM-Q) for the GPU.
 
bgassassin said:
I think you're missing what I'm saying. 1GB system + 512MB vram meaning that the final would most likely be a unified amount of 1.5GB DDR3. I'm not just focusing on the system amount of 1GB. I can see the person that "leaked" the info way back when doing the same thing that IGN and you just did by forgetting about the vram. Then from the there leak focused on 1GB. IGN's system technically had a total of 2.5GB of memory, and I don't think they made it up as it was consistent with the other early rumors.
There's no reason to have separate system RAM and VRAM unless they use different types of RAM. Otherwise, it just makes development harder. IGN is really clueless about PC gaming, obviously, and they said nothing about their source saying that it has 512MB of VRAM. Besides that, you can't find 512MB DDR3 anywhere, so if they heard that it was 1GB total (which is what they said they were told), matching it exactly would be impossible anyway. Whether or not they were telling the truth, between the OS overhead and the fact that PCs always have separate RAM and VRAM (ignoring iGPUs, of course), the 512MB of VRAM in the model is probably totally irrelevant to what's in the Wii U.
 
Honestly, even something equivalent to an 8800GT paired with a nice multi-core CPU and 1.5GB of RAM total would still be a tasty proposition in a closed box.
 
BurntPork said:
There's no reason to have separate system RAM and VRAM unless they use different types of RAM. Otherwise, it just makes development harder. IGN is really clueless about PC gaming, obviously, and they said nothing about their source saying that it has 512MB of VRAM. Besides that, you can't find 512MB DDR3 anywhere, so if they heard that it was 1GB total (which is what they said they were told), matching it exactly would be impossible anyway. Whether or not they were telling the truth, between the OS overhead and the fact that PCs always have separate RAM and VRAM (ignoring iGPUs, of course), the 512MB of VRAM in the model is probably totally irrelevant to what's in the Wii U.

You're still missing what I'm saying. The dev kit most likely has a split amount because it uses off-the-shelf parts, and the final will be unified. When the leak occurred, they (and all of us, really) only thought about the system memory, since everything we heard then vs. now seems to relate only to the first dev kits and not what the final will be. The leaks said it had 1GB of memory and an RV770 card. That card most likely had at least 512MB of memory, bringing the total amount to 1.5GB.
 
snesfreak said:
No we don't!
It's Nintendo, it's not even gonna be as powerful as current gen, it'll just be a Wii that can output in 1080P.
Cause they're cheap and don't care about graphics, all they do is come up with retarded gimmicks.
Only soccer moms and grandmas play their shit, they need to stop rehashing Mario and Zelda and release something innovative like an FPS where you fight in a war.

And if they don't they're gonna end up like Sega, after the Wii U fails and they immediately announce another system to replace it which ALSO fails.
Right? RIGHT?

Joke post?
 