30 secs in. *Close tab*
...
30 secs in and I already can't stand that voice. What the hell, man?
You're missing out. It's a classic.
30 secs in. *Close tab*
...
30 secs in and I already can't stand that voice. What the hell, man?
Nintendo should copy how to make good games.
i'm just sayin', i am gonna really enjoy the wii u specs "reveal". nintendo's doin' y'all a favor by not coughing up numbers or components, as usual
The GPU in the Wii is virtually identical to the GameCube's, just at a higher clock speed, which is my argument. I don't understand why my point is any less valid than yours when we are speculating on virtually the same GPU; I just assume they will target a higher clock (showing that 800MHz should be possible, never saying that is what would be in the Wii U). I figure if they went 100MHz higher, say 700MHz, the Wii U's GPU would be capable of near parity with the XB3 at the lower 720p resolution, though that is probably very different from what you think I mean.
I am saying the Wii U with those specs and some fixed-function hardware for lighting should make for a high-settings-at-720p vs. ultra-settings-at-1080p comparison, not your completely bonkers 1080p-ultra-to-low-settings gap, which isn't even seen between comparable PC GPUs.
Also, I'm sorry to drag this on so long. I made my statements this morning and have been arguing them for the last 12+ hours... that wasn't my intention at all; I was just giving some simple speculation on what AMD could fit in the Wii U case under a 50W TDP.
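To be clear about where my head is at, here's the back-of-the-envelope math, as a rough sketch. Every number in it (the ALU counts, the clocks, the 30fps target) is a placeholder assumption for the sake of argument, not a leaked spec:

```python
# Rough per-pixel shading budget comparison (all specs are placeholder assumptions).
# Theoretical throughput for this class of GPU is roughly: ALUs * 2 ops (FMA) * clock.

def gflops(alus, clock_mhz):
    return alus * 2 * clock_mhz / 1000.0  # theoretical GFLOPS

def per_pixel_budget(gflops_total, width, height, fps=30):
    # FLOPs available per pixel per frame at the given resolution and framerate
    return gflops_total * 1e9 / (width * height * fps)

# Hypothetical "Wii U-class" part: 320 ALUs at 700 MHz, targeting 720p
wiiu = per_pixel_budget(gflops(320, 700), 1280, 720)

# Hypothetical "XB3-class" part: 1024 ALUs at 800 MHz, targeting 1080p
xb3 = per_pixel_budget(gflops(1024, 800), 1920, 1080)

print(f"1080p has {1920*1080 / (1280*720):.2f}x the pixels of 720p")
print(f"Hypothetical Wii U budget per pixel per frame: {wiiu:,.0f} FLOPs")
print(f"Hypothetical XB3 budget per pixel per frame:   {xb3:,.0f} FLOPs")
print(f"Ratio: {xb3 / wiiu:.2f}x")
```

Under those made-up numbers the per-pixel budget gap comes out well under 2x, which is why I keep saying high at 720p vs. ultra at 1080p rather than ultra vs. low.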
He could make it rain again. Chad Warden only deals wit straight cash.
I would pay him to make a troll video for PS4, Xbox 3, and Wii U.
Who's Chad Warden?
i'm just sayin', i am gonna really enjoy the wii u specs "reveal". nintendo's doin' y'all a favor by not coughing up numbers or components, as usual
Tonight, I realize I have yet a lot to learn about GAF.
And now I'm sad.
Also worth keeping in mind is whatever special hardware Nintendo may have added to the GPU. We don't know much in the way of specifics, but it looks like there will be some extra units on the GPU to help with lighting. It wouldn't bring the console on par with the others, but for games that make use of it, it might bring them a step closer.
Tonight, I realize I have yet a lot to learn about GAF.
And now I'm sad.
Not being wise in the ways of GAF is not something to look down upon yourself for. Just sayin'.
Tonight, I realize I have yet a lot to learn about GAF.
And now I'm sad.
Yeah, but next gen you'll be in on them all. And believe me, this upcoming generation will offer lots of joke/meme ammo.
I just hate being out of inside-jokes.
Damn, you've been here less than a year?
Well, it goes to show how much of an impression you have had on me, and I am sure on others, that I thought you were here for longer.
The reason why it's "not as valid" is that we were never discussing console transitions. You expect the next console to have more power and, in turn, higher clocks than before; that should be understood even in the Wii's case. From my perspective, what we've been debating is how high that new clock will be. And as I've said, looking at Nintendo's philosophy, going with a high clock doesn't fit. Now, could there be a leak (since we know Nintendo won't say it) of official specs confirming they went with an 800MHz GPU? Sure. I'm not saying it's impossible, I'm saying it's improbable.
So what's more bonkers? Essentially saying the Wii U is on par with the PS4 and Xbox 3 when nothing indicates this, or saying the Wii U will not be on par with the PS4 and Xbox 3 when everything so far indicates this? Even then, I said 1080p on high settings vs. 720p on low settings is the maximum gap I see.
And I wouldn't say this has lasted 12+ hours when there were large time gaps between the responses. If that's the case, then I've had past debates last for "full" days.
I work nights, so I sleep in the afternoon. My point was not about console transitions; it was that when they looked to make a stronger GPU than they had in the GameCube, they didn't add more compute units, they added more MHz. That is why it's possible that when they went to third parties and found their specs to be lacking, they decided to bump the frequencies rather than change the chip. You have to remember that the change happened right around the time the rumored finalized chips came out; they might have gotten higher-than-expected frequencies out of those GPUs thanks to NEC's 32nm or 28nm process...
Why that sounds improbable to you is because Nintendo doesn't go for high clocks, but 600MHz is pretty low even by today's standards, so you are just on the opposite side of the same coin... I don't think your point is any more valid than mine. Nintendo should have been able to create a GPU that could hit 700MHz pretty easily, so why would they stop at 600? Your argument is that it's Nintendo, and that is true, but my point is they have increased clocks before, and there is little reason why they would avoid it now.
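And just to put numbers on the "they've done exactly this before" point, here's a quick sketch. The GameCube/Wii clocks are the well-known figures; the 600/700/800MHz Wii U numbers are nothing more than the speculation we've been tossing around in this thread:

```python
# Clock-only scaling: same chip design, higher frequency (the GameCube -> Wii approach).
# The Wii U clocks below are pure speculation from this thread, not confirmed specs.

flipper_mhz = 162    # GameCube GPU (Flipper)
hollywood_mhz = 243  # Wii GPU (Hollywood), essentially the same design clocked higher

print(f"GameCube -> Wii clock bump: {hollywood_mhz / flipper_mhz:.2f}x")

baseline_mhz = 600   # the conservative guess being argued against
for speculated in (700, 800):
    print(f"{baseline_mhz} -> {speculated} MHz would be a "
          f"{speculated / baseline_mhz:.2f}x bump on the same silicon")
```

The GameCube-to-Wii jump was a 1.5x clock increase on essentially the same design, so a 600-to-700 or even 600-to-800 bump would actually be a smaller relative step than the one they already took once.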
Console fans across all the platforms do seem to be a lot harder to piss off.
Besides lighting, are there any other GPU effects that would be good candidates for fixed-function hardware?
Sorry to interject, but where exactly was it said this happened? As far as I recall, that was more of an imagined scenario and not something actually reported on.
Console fans across all the platforms do seem to be a lot harder to piss off.
Not really. I own all three and I still get pissed off at some people's stupidity. As much as I try not to, it never seems to work because the inner me will always be more of a fanboy towards one particular company that I grew up with.
it's 2012, there are not enough women with orange hair
I'm sorry, did I walk out of the speculation thread? It's a scenario. I don't work for Nintendo or any third party; I'm a security guard with way too much time on my hands, though I am a PC enthusiast and I have followed GPUs very closely for the last 10 years, especially ATI/AMD, as I'm sort of a fanboy.
We do know that Nintendo worked on third-party engines and that they bumped the power upwards. This happened around the beginning of the year, which is when final silicon finally became available, so it's just speculation. If you want facts, well, we will probably have to wait anywhere from 6 months to 6 years.
Link...
sounds like Lynx...
Atari!
You need to hang around more congirls and burners.
I've been caught.
Every time someone mentions how Atari failed, the inner fanboy in me wants to lash out at them. Atari is #1 <3
Not really. I own all three and I still get pissed off at some people's stupidity. As much as I try not to, it never seems to work because the inner me will always be more of a fanboy towards one particular company that I grew up with.
Nothing wrong with having a preference. For whatever reason, many folks seem convinced that preferring one over the other is a sin. I don't get it.
Console preference isn't what I mean though. Most have a preference.
Anyway, in the hypothetical event that all 3 have a SKU at the same price point, which one would you be inclined to buy?
I do have all three as well, plus I do a lot of PC gaming too.
Though my PC is far from as powerful as I want it to be. My Xbox 360 is unplugged and sitting in a corner because that console bores me with its lack of interesting exclusives nowadays. My Wii hasn't been opened since I finished Skyward Sword, and my copy of Xenoblade is still sealed.
The only consoles I've touched in the last few months are my 3DS and my PS3.
But I'm still a huge ****ing Nintendo fanboy.
E3 is pretty close doodz. All I know is thank the blizzard gods that diablo 3 is coming out. Will make the wait super easy. Also that thread is moving at crazy speeds. Reminds me of something I can't quite recall.
Are you? =P
E3 is pretty close doodz. All I know is thank the blizzard gods that diablo 3 is coming out. Will make the wait super easy. Also that thread is moving at crazy speeds. Reminds me of something I can't quite recall.
I'm upset that I know the song you're referencing.
A jeep? [/r.kelly]