I see, that is cool. That's one of the perks of a GPGPU: it can do more computation while using less wattage than a CPU would.
I think that's how it goes (someone with more knowledge of this can drop in and explain).

Isn't GFLOPS/watt a really shitty way to measure power when you think about it?
I've no idea how to work out the input. Is it the same sum?
Well as far as I know volts x amps = watts. But if that calculation is different for the input than it is for the output then I'd welcome some clarification from anyone who knows better.
Do people really think MS and Sony are going to blow 160W of their power budget on the GPU?
CECH3000 was last year's revision, I believe.
I could be (probably am) reading this wrong, but I'm finding the original model had an output of 12V x 32A = 384W?
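For what it's worth, that arithmetic checks out if the rated DC output is simply volts times amps; a quick sketch in Python (the 12V/32A figures are the ones quoted above, and this says nothing about what the unit actually pulls from the wall):

    # Rated DC output power from the label: watts = volts x amps
    def rated_watts(volts, amps):
        return volts * amps

    print(rated_watts(12, 32))  # 384 -- matches the 384W output figure above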
Okay, given this estimate:
We're looking at 60W max power draw, which presumably would be inclusive of the USB draw, to which I've seen people attribute 10W?
That would mean 50W draw under load.
How much of this power "budget" would be reasonable to allocate to the GPU? (A rough split is sketched below the options.)
30-35W
25-30W
20-25W
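Purely as an illustration of how that budget would split, here is a back-of-envelope sketch in Python; the 60W ceiling and the 10W USB reserve are the figures floated above, the GPU numbers are just the poll options, and everything else is lumped together as a guess:

    # Back-of-envelope split of a hypothetical 60W ceiling (illustrative only)
    total_max   = 60                       # W, assumed max power draw
    usb_reserve = 10                       # W, USB ports as attributed above
    under_load  = total_max - usb_reserve  # 50W left for the console itself

    for gpu_w in (35, 30, 25, 20):         # W, the options above
        print(f"GPU at {gpu_w}W leaves {under_load - gpu_w}W for CPU/RAM/drive/WiFi/etc.")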
"All that talk about the Wii U being “weak”, “underpowered”, and not capable of outputting graphics like the Xbox 360 or PS3, is apparently hogwash. Unity Technology's CEO David Hegalson squashes the rumors by acknowledging just how far the Wii U's tech can scale and what developers will be capable of utilizing with the Unity alone."
http://www.cinemablend.com/games/Wi...ble-DirectX-11-Equivalent-Graphics-47126.html
Did anyone happen to catch this recently? Wii-U is supporting the new Unity Engine and the creator is saying some pretty good things about the Wii-U.
David Helgason - Unity Technologies' CEO
He goes on to talk more about how the Wii-U WILL use some of the DX11-equivalent features when working with the Unity Engine.
For those who haven't seen it, this is their new demo on YouTube, made on the PC of course.
http://www.youtube.com/watch?v=Qd1pGJ5Hqt0
Do you understand what GFLOPS are and how they're rated? There seems to be a misunderstanding of what they can change to boost this number. What are you saying they are changing to boost the GFLOPS so high?
^ Already has a topic. The title doesn't match the article. Beaten by Meelow.
You shouldn't be questioning people. That said, it was based on the dev kit GPU supposedly being 576 GFLOPS at 450MHz. It was speculation based on Nintendo continuing to use clock multiples like the GC and Wii. We know the DSP was 120MHz, and 640 ALUs at 480MHz would be ~614 GFLOPS.
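For anyone wondering where those figures come from, they fall out of the usual peak-throughput formula, assuming each ALU retires 2 FLOPs per cycle (a multiply-add); a small sketch in Python:

    # Peak GFLOPS, assuming 2 FLOPs per ALU per cycle (multiply-add)
    def peak_gflops(alus, clock_mhz, flops_per_alu=2):
        return alus * flops_per_alu * clock_mhz / 1000.0

    print(peak_gflops(640, 450))  # 576.0 -- the rumored dev kit figure
    print(peak_gflops(640, 480))  # 614.4 -- the ~614 GFLOPS speculated above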
http://twitter.com/IBMWatson/status/248929547842641920
They sure do like being vague. Might as well just say it's a chip built by IBM.
That's not how it works.
It's this simple:
The Wii U's PSU can supply the console a maximum of 75 watts. Note: The 75w is NOT what the PSU draws from the power point, but what it can feed the Wii U.
But like any PSU, the Wii U's PSU likely doesn't like being placed under 100% load. 80% would likely be the sweet spot where the PSU can run 24/7, 365 days a year, without fault.
Thus ~56W is likely a very realistic constant draw for the console.
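To put numbers on that derating argument (a sketch only; the 75W rating is the figure above, and the load percentages are assumptions):

    # Sustained draw at a given PSU load factor, for a 75W rated output
    rated_output = 75  # W, what the PSU can feed the console

    for load in (0.80, 0.75):
        print(f"{load:.0%} load -> {rated_output * load:.0f}W sustained")
    # 80% load -> 60W sustained
    # 75% load -> 56W sustained (the ~56W figure above is closer to a 75% load)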
So all the reasonable people saying no way it's a Power7 proved right again.
Ho hum.
Well actually we still don't know the CPU, it could be Power7, it might not be, who knows.
P.S. I'm not defending anything, but there were people that said "no way the Wii U uses a GPGPU" and they were proven wrong.
We'll have to see at launch if the CPU is Power7 or something else.
I do find it weird that IBM "corrected" themselves a year and 3 months after the original post.
I'm just glad AMD managed to sweep up so many console CPU/GPUs... lately I've been worried about them since they are more or less being pushed out of the PC CPU industry.
It's their own fault on the PC front. Their latest lineup of CPUs was so underwhelming, it's not even funny.
Yeah, this was the conclusion I was working from, or thereabouts.
So what are some estimates for WiFi power usage and 2GB of RAM (for either DDR3 or GDDR5)?
What does that leave for the GPU?
It's the money difference they have against Intel that's not even funny. Once upon a time they were able to compete because Intel made bad choices, but not now; they can't invest as much in R&D or in securing experimental tech like tri-gate.
Right, this guy has it down.
But I'd go further down than that. 56W with everything working at the same time, disc spinning, CPU & GPU full load, USBs loaded, Wifi, etc.
So, the typical draw is back to 40-45W with a game. Most of that is from a constant load on the CPU, GPU, RAM, and Wifi sending signal.
Everything else will be a small part of that, intermittent disc spin, intermittent game saves to internal flash, 1 or 2 USB devices but idling most of the time, and so on.
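To make that concrete, here is one purely hypothetical way a ~40-45W typical in-game draw could break down. None of these per-component numbers are measured or sourced; they are placeholder guesses chosen only to be consistent with the totals discussed above:

    # Hypothetical breakdown of a ~40-45W typical in-game draw (all values guessed)
    typical_draw_w = {
        "CPU":            12,
        "GPU":            18,
        "RAM":             3,
        "WiFi":            1,
        "disc drive":      3,  # intermittent spin-up, averaged down
        "flash/USB/misc":  5,  # mostly idle, occasional saves
    }
    print(sum(typical_draw_w.values()), "W total")  # 42 W total, inside the 40-45W range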
Hey Bg, what do you think of that 8xxx GPU rumor?
Can I ask where you got the FLOP info on the Dev kit GPU?
Or are you speculating it has 640ALUs and calculating it based on reasonable clocks?
Depends on the deal and if they manufacture the chip or not.
It's not their own fault on GPUs though. Their GPUs are great, and still nobody buys them because everybody is an Nvidia fanboy.
Sadly, winning all the consoles, while not a bad thing, won't help AMD much either. It's not a very lucrative field. They get a minor one-off payment and that's about it, basically, I think.
If winning the consoles was lucrative, AMD stock wouldn't be at historic lows right now.
I could see a danger of AMD going out of business or being bought up by a Qualcomm that doesn't care about PC GPUs sometime in the next gen. I'm sure the console makers don't care very much and have all the patents and agreements in place where it doesn't matter.
Biggest benefit to AMD imo will be all the PC ports of console games next gen being based more on AMD architecture and theoretically running better on AMD PC cards. Theoretically, anyway. If it was that big a deal, though, I'm sure Nvidia wouldn't have allowed it to happen. Besides, within a couple of years both IHVs' GPUs will probably look quite different from what goes into next gen consoles anyway, lessening any benefit.
It would be interesting to see them switch to something like that if those specs are true. It would definitely increase the gap between PS4 and Wii U.
While this is true, I think there's definitely a place for two CPU makers. AMD doesn't have to perform the best to survive. Hell, for most of their life they've rarely had the lead on Intel in CPU performance, yet they've been in business for a long time. There's a place for a low-cost, decent-performing Intel alternative.
Perhaps they do, they certainly don't see them as a pebble in their shoe at this point.
Hell, some people even claim Intel purposefully allows AMD to live to prevent antitrust actions.
They're trying though.
Bulldozer was just a dog, though. Note I mentioned "decent" performance, and BD doesn't have that.
Do the people that deal with the twitter accounts even have direct access to such info?
How the hell do you make an error like that for a year? So stupid.
I swear I haven't seen her in anything in a loooong time.
I generally assume the bold for companies - regardless of who the account is named as, unless it's a verified personal account.
The people dealing with twitter, facebook and all that crap are usually PR or the marketing team.
They might not even know the lineage, or it might be out of bounds for them, so they're trying not to overstep by claiming either that it isn't or that it is.
Indeed, the biggest market is the low end one, always was, always will be.
But the open CPU market seems to place too much value on the tech being the best. It's like if you're buying an Atom/Celeron (call it a Fiat), you're still getting the lineage of an i7/Xeon (a Ferrari), kind of thing; people want the best manufacturer, not always the best bang for the buck. And then Intel has a big name in the market; having a big name doesn't mean they'll always have the edge, but AMD hasn't managed to crack that for a few years now.
Lots of things come out wrong in their first generation, but AMD Fusions are definitely not as bad as Atoms, for instance. They seem to be going for the low end, really.
...
Also, people need to stop worrying about power consumption and how it relates to the Wii U's technical power. The Wii U by all accounts has a dedicated DSP, I/O controller, and I believe even an ARM CPU for some OS functions. Not to mention IBM's modern Power-based CPUs are incredibly efficient...
Do you think that was just for PS4 or also Xbox 720?
Well, I believe that is a real possibility. But the performance achieved with the SI part they are aiming for should be around what has been stated before. After all, Sony is not one to understate the performance of their future hardware. Quite the contrary, I would say.
If those specs are true, those would be some serious efficiency improvements. Based on what a CU contains, the 8870 would have 28 CUs given the 1.1GHz clock, since that comes out to ~3.94TF, and the 8850 24 CUs @ 975MHz. I have felt PS4 would probably end up using Sea Islands, but my biggest question would be the prices listed.
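For reference, the ~3.94TF figure works out if you assume the usual GCN-style layout of 64 ALUs per CU and 2 FLOPs per ALU per cycle; the CU counts and clocks below are just the rumored ones from above:

    # Peak TFLOPS for a GCN-style GPU: CUs x 64 ALUs x 2 FLOPs x clock
    def peak_tflops(cus, clock_ghz, alus_per_cu=64, flops_per_alu=2):
        return cus * alus_per_cu * flops_per_alu * clock_ghz / 1000.0

    print(round(peak_tflops(28, 1.100), 2))  # 3.94 -- the rumored "8870"
    print(round(peak_tflops(24, 0.975), 2))  # 3.0  -- the rumored "8850" at 975MHz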
Could be both though PS4 would definitely be the most likely of the two based on their original target specs.
A Smaller SI for cheaper?
Ahh, if the PS4 does use the 8xxx GPU do you think that could hurt Wii U third party support or possibly help it? (sorry I'm asking all these questions lol)
Personally I find the article not 100% legit, because it didn't sound like it was confirmed but more like speculation on the writer's part.
Compared to the 7950?
The jump was rather big in a direct comparison to the 7800 line (not saying it's impossible) and the 8870 based on those specs is an improved 7950. So I would be surprised to see it released at a price that is $170 cheaper than what the 7950 debuted at.
As for ports in that scenario, the market is still going to be the primary determinant IMO. And if PS4 and Xbox 3 are essentially "port buddies" then that should free up dev resources for a Wii U version of a game.
Thanks, I am just scared that devs will be like "Wii U isn't powerful enough to run this" like they did with the Wii... Should I be scared of that? lol.
Not sure why you would be scared. But there would definitely be some saying that, for varying reasons. Some may not have the resources to devote to a Wii U downport, since it definitely wouldn't be a 1:1 port. Some may just be picky about how their game looks, which would be their right. Or some may just want to make games which require more than 2005-level CPU performance for gameplay reasons, and thus be pretty much unportable.