You've got to be a joke character. Such specs would get universal praise.
If Marcan's source is correct then yes, but with the extra cores and more powerful GPU it won't mean much; it just puts us back to square one, before Marcan's original leak.
Do you think lower IQ at 720p or 1080p?
I personally think it's going to be the same as this gen, with 720p the standard. Agni is 60 fps, so I don't think the possibility of getting those graphics at 30 fps with lower IQ is too far-fetched. Also, the engine is still in development, so optimizations are probably still coming.
I know that you're being funny, but considering the fab shrink, the CPU is based on turn-of-the-century architecture just as much as IBM's other low-wattage CPUs in this line.
I'm sure that the Orbis and Durango CPUs will outperform it handily, but it's probably not as bad a CPU as most are thinking.
But those demos started out running on PCs. Who knows how they are going to run on those systems.
Ah, E3, the convention that raises more questions than answers at times. We will find out next E3...
Now I'm wondering: did the 360 outclass top-of-the-line PCs back when it launched? Because if the 360 was better, who's to say the Durango won't be? Other than hardware costs, that is.
I had a fairly powerful PC when the 360 launched; the video card I had at the time was better than Xenos, but the triple-core Xenon was better than my dual-core CPU.
Looking at the next-gen demo videos like Agni or Unreal Engine 4, how can you say he's wrong? Until something shows the contrary, I think he's right.
Never going to happen again. "Top of the line PCs" are now beyond 400 W. No one will make a console with such extreme power consumption, and as a consequence no one will match bleeding-edge PC specs ever again.
Then why didn't they use a cheap CPU instead?
Familiarity and backward compatibility? I think it's pretty obvious.
Did Intel's R&D go on vacation from 2003-2010 and then decide to make the core tech significantly faster in a short span of time?
The Pentium 4 architecture was a colossal dud, but they also released the Pentium M (Banias/Dothan under "Centrino" branding) in 2003, which formed the foundation for a still-running streak of great Intel architectures.
Interesting if true...
https://twitter.com/marcan42/status/274856397915697152
Hector Martin
‏@marcan42
If you want more evidence that MHz isn't everything, a little birdie points out that Durango (Xbox 720) is specc'ed to have a 1.6GHz CPU.
So what does this mean?
It means that - according to the logic that many GAFers apply to Wii U - Xbox Next will have a sub-current-gen CPU.
If true, it probably means that a large focus for next-gen consoles will be on getting the best bang for your buck on a performance-per-watt basis, especially for the CPU. Sounds like a "duh, that's always the case" response, but that wasn't actually the trend. Large, hot CPUs that didn't come anywhere close to their theoretical max probably weren't the best way to achieve that goal.
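The "MHz isn't everything" point can be made concrete with a back-of-the-envelope formula: rough peak throughput scales with cores × clock × instructions per cycle (IPC), not clock alone. A minimal sketch, with entirely made-up core counts and IPC figures (none of these are confirmed specs for any console discussed here):

```python
# Back-of-the-envelope throughput comparison: clock speed alone is not the story.
# All numbers below are hypothetical illustrations, not real console specs.

def peak_throughput_gips(cores: int, clock_ghz: float, ipc: float) -> float:
    """Rough peak throughput in billions of instructions per second,
    ignoring memory stalls, vector units, and everything else that matters."""
    return cores * clock_ghz * ipc

# A narrow, in-order design at a high clock (Xenon-like; IPC figure invented)...
narrow_fast = peak_throughput_gips(cores=3, clock_ghz=3.2, ipc=0.5)

# ...versus a wider, out-of-order design at a much lower clock (IPC invented).
wide_slow = peak_throughput_gips(cores=8, clock_ghz=1.6, ipc=1.5)

print(f"{narrow_fast:.1f} GIPS")  # prints 4.8 GIPS
print(f"{wide_slow:.1f} GIPS")    # prints 19.2 GIPS
```

Real performance depends on far more than this (caches, memory bandwidth, SIMD width), but even the crude model shows why a 1.6 GHz part is not automatically "sub-current-gen."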
build quality
"The site’s source went so far as to say the build quality wasn’t even up to “horrid” yet."
Aren't there rumors about the Xbox 720 having a "painfully low" processor?
Yep.
http://www.vg247.com/2012/09/06/xbox-720-release-delayed-due-to-manufacturing-issue-report/
So if it is true, all of the current/next-gen consoles will focus much more on the GPU than the CPU, like others said.
I think I've changed my mind, guys. A triple-core processor running at 1.2 GHz compares favorably to 7-year-old technology. This is very good news. And seeing as how the only problem anyone has with the CPU is the clock speed and not the actual architecture from nineteen ninety-nine, which is just before the turn of the century, it should also compare very favorably to the Durango and Orbis CPUs, seeing as how they will be clocked below 2 GHz and also feature GPGPU integration just like the Wii U. I am very happy.
1.6 GHz? And the Wii U's is 1.2 GHz? What other advantages will the Durango CPU have over the Wii U's?
Another core. Who knows what else. We barely know what's in the Wii U CPU, and that's out.
Speaking of the multi-core ARM in the Wii U, did we get any more info about that? If it is like Starlet, I would think it would be clocked at the same frequency as the GPU, which is 550 MHz. Starlet was an ARM9, so if it's 550 MHz, would it probably be ARM11/Cortex-A class?
More like 3-5 more cores. If the rumor is correct (in that the ARM multi-core processor handles the OS on Wii U), then I wouldn't be surprised to see a couple of cores in Durango (or PS4, even) be locked for the OS. We've already seen something similar on PS3. In the end, though, it seems that Durango/PS4 will have more cores available for development than Wii U, regardless.
I fear catastrophe. Someone hold me.
If it turns out to be true that the Xbox 720 is clocked at 1.6 GHz, there are going to be a lot of people jumping out of tenth-floor windows with their 360s strapped to their backs before even waiting for the first media of actual games to appear.
Question for people who know these things:
How powerful would the XB3/PS4 have to be to completely shut the Wii U out of the picture? Like, if you had PC-style graphics settings on all multiplats, how powerful would they need to be so that the Wii U couldn't even handle "lowest settings"?
If Marcan says it's 550 MHz, it's 550 MHz. lherre and Arkam basically confirmed it. It's a bit higher than I expected, at least, and apparently quite a bit higher than Nintendo initially planned. The Zelda and Japanese Garden tech demos were seemingly running on lower-clocked machines, so that's certainly promising.
Question: the Wii U GPU is 550 MHz, correct? Is that above expectations, below, or about right?
If this is true, it makes me wonder whether people will still attack the Wii U CPU.
That is a good point. Until we get some Wii U games that display its power, expect to see those Zelda demo-gigs for a while longer.
Let me see if I understand what you are saying... how much more powerful would the next consoles have to be so that even the worst of their games looks significantly better than the Wii U's best game? I guess twice as powerful is enough, and from what we know the PS4 and X720 are already beyond that.
No, not just a looks thing; a "can't even run at lowest settings" thing. If you took Black Ops 4, turned off all the effects, reduced the draw distance, etc. (the equivalent of turning all the settings to low on the PC version), would it be able to run on Wii U? If not, then Nintendo under-powered the Wii U out of next-gen multiplats. If yes, then there's no real reason for the games not to show up besides publisher apathy/antipathy.
As long as performance of the system and its games is up to par or monstrously better, there shouldn't be any issues (not the case with the Wii U, obviously). Sad to see so many people up in arms crying because they can't potentially have a psychological circle jerk over amazing poly counts, crisp textures, and IQ. It's PERFORMANCE and overall smoothness that have the most bearing on gameplay, potential gameplay, and expanded features; graphics in and of themselves are vastly overrated as a selling point and sticking point. But that's the story of the "gamer" (from critic to consumer), and really this gen as a whole.
Ironically, the "casual" is the least exploited by this type of thinking...
No, people will just admit CPUs are no longer the big driver in a console and that the GPU is more important.
Of course it's more fun to ignore that and retrospectively claim that all criticism was due to clock speed alone.
Basically the exact opposite of what they say in Wii U threads.
Or maybe, just maybe, they won't show up because the cost of porting the game to the Wii U will be too high in relation to the expected return on investment and available audience for a port.
Publishers really hate Nintendo, that's what it is!
As a gamer, I want the best possible experience. While Wii U downports will certainly be possible (maybe even likely), I'm not going to buy them when better versions of the same games are available. Graphics do matter to the experience: frame rate, pop-in, and tearing are all very detrimental. They are what suffers the most on current consoles, and future downports certainly won't fare better on that front.
So you finally believe that the PS4/720 CPUs won't be clocked anywhere near the "7 YEAR OLD TECH!!!" of the PS360?
Only took you a week of denial in EC's Wii U spec thread, lol...
Feels strange now the shoe's on the other foot, eh?
They won't, unless we see new engines spawn out of nowhere with hefty requirements, but even then I'd say the chances of that happening are nil.
Yeah, I said earlier that I don't understand the push to concentrate on dispelling the myth that clock speed is an important factor in performance. I feel like it's akin to me focusing on "it's not the size that counts; it's how you use it!" rhetoric in response to accusations that I have a small penis, when more damaging testimonials like "he lacks stamina" and "he has no idea how to please a woman" are being tossed out there about me by past lovers.
They won't.
The trolls in this thread have already made up their minds that everyone bashing the CPU is a dumb-dumb who thinks clock speed equals power and doesn't understand "architecture" and "OoOE".
Hilarious how PS4/720 CPUs keep getting brought up as a counterpoint.