Why did we have all the moaning from devs that the Wii U was weaker than the PS360 because it had such a slow CPU then?
Actually, it was one dev. And it wasn't moaning.
Why did we have all the moaning from devs that the Wii U was weaker than the PS360 because it had such a slow CPU then?
I don't see what would be so challenging for third party devs. They are already used to multicore CPUs and shader-based GPUs.
Nintendo first party teams will need to go through that learning curve though.
If there is specific effort required to 'get the most' from the hardware by coding for the GPGPU portion, then ordinarily I'd argue that no third parties will bother and they'll just port with minimum effort.
However, if the next gen consoles have GPGPU elements, then teams will need to get to grips with this anyway, which might benefit WiiU development.
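For what it's worth, 'coding for the GPGPU portion' mostly means restructuring work into data-parallel, per-element kernels rather than serial loops. Here's a minimal CPU-side sketch of that shift in style, in Python/NumPy purely for illustration (this is not Wii U, GX2 or any vendor API code, just the general idea):

# Illustrative only: the data-parallel style that GPGPU code favours,
# shown on the CPU with NumPy. On a real GPU the per-element update
# below would be written as a compute shader and run once per element.
import numpy as np

positions = np.random.rand(100_000, 3).astype(np.float32)
velocities = np.random.rand(100_000, 3).astype(np.float32)
dt = np.float32(1.0 / 60.0)

# Serial, CPU-style thinking: touch one element at a time.
def integrate_serial(pos, vel, dt):
    out = pos.copy()
    for i in range(len(pos)):
        out[i] = pos[i] + vel[i] * dt
    return out

# Data-parallel, GPGPU-style thinking: express the whole update as one
# per-element operation and let the hardware run it as wide as it can.
def integrate_parallel(pos, vel, dt):
    return pos + vel * dt

assert np.allclose(integrate_serial(positions, velocities, dt),
                   integrate_parallel(positions, velocities, dt))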
Why did we have all the moaning from devs that the Wii U was weaker than the PS360 because it had such a slow CPU then?
"With the Wii U being new hardware, we're still getting used to developing for it, so there are still a lot of things we don't know yet to bring out the most of the processing power. There's a lot that still needs to be explored in that area."
It was only one developer, Koei, and the quote goes on to say they are still trying to figure out the system.
Unless you're a Wii U games dev yourself, you can only expect not to understand it and just have to accept that!
But yeah, all rumours seem to point to all the consoles being GPGPU-based, which devs will have to learn anyway, like you say. That will benefit the 720/PS4 before it benefits the Wii U ;-) But yes, that in turn would hopefully mean porting between all three new platforms will be more universal.
it was more that I expect multiplatform development towards WiiU to be conservative in its approach (see Fifa having old engines etc). That means limited funds for dev teams, which means 'get something out of the door quickly'.
slightly aggressive tone?
No mate, maybe that's how you read it because your defence systems went on red alert. I stopped reading after your first line though; it set the tone, with which I can't be arsed today.
Gemüsepizza said: All this hardware you have listed was not something a console dev did usually work with.
Erm, I'm talking bog standard desktop tech circa the beginning of the century, and the corresponding know-how. Gamedevs don't live in caves. Their office workstations and/or home desktops were already hosting SMP and shader tech before 2005. Devs' non-cube/ps2 projects were most likely requiring them to know this stuff. The notion that the 360 (will return to the ps3 shortly) introduced a sizable amount of tech the likes of which nobody had ever touched is as detached from reality as they come. That does not mean that every gamedev coder was brilliant at SMP code and was writing shaders day-in and day-out in 2005. But guess what - devs aren't even today, and that will remain so in the foreseeable future - there's a sound division of labor and responsibilities on every sizable game project. BTW, I'm talking from the POV of somebody who used to do PC-based game engines in that timeframe, and whose teams (on multiple projects) would include gamedevs of various backgrounds - consoles, pc, handhelds, university graduates. I'm not speaking hypothetically, I'm telling you how things were back then for a representative sample of the industry.
Gamecube = No shaders. PS2 = No pixel shaders. Many console devs had little experience with shaders when the current generation hit, and had to learn how to properly use them.
Let me guess, 'many console devs' in your eyes means ps2-only devs? Because entirely disregarding any likely PC exposure of the gamedevs from that timeframe, cube's TEV was proto-PS1.3 tech, and Xbox was dx8 through and through. Basically, your entire argument is resting on the premise that there was this exotic tribe in the Amazon forest known as 'console devs' whose exposure to technology was limited to only what their console vendor of choice dropped on them via parachutes once per hw generation. Which, again, is still not sufficient for you, because ps3's 'ultra exotic' CPU tech was precisely targeting seasoned ps2 devs, who were already versed in widely-asymmetric architectures where autonomous streaming processors were meant to (pre-)chew graphics workloads. Yes, the RSX was a castrated desktop part, so that did turn out to be quite a bump.
What? Of course devs had to learn how to properly use this hardware. This was new technology in console space. This goes hand in hand with developing algorithms for this hardware.
No, it does not go 'hand in hand' - most graphics know-how in use today either originates from the academic/CGI/PC space, or finds its way there shortly, and from there on into the wild. The exception is Cell-targeted sw tech, which, let's face it, was a dead end. And how does 'learning how to properly use the new hw' imply that devs had no experience with SMP or shaders per se?
Wii U will profit by some of the advancements made in graphics development, but in the end it's dependent on how much developers will care. I have already pointed this out in a post before, yet everybody seems to ignore it. I mean, it's not like every Xbox 360 title from now on will look like Halo 4. Please tell me, how realistic is it that we will see devs putting their best graphics experts and big money on Wii U development to optimize their engines by the same amount they did for Xbox 360 and PS3?
You're confused, so let me help. You are mixing up high production values (which is normally an asset thing) and the use of advanced graphics algorithms (which is normally an R&D thing) - those are different things. Yes, high-production-value projects do tend to also have strong R&D, but a small indie title can be high-tech just as well.
What I mean is: The hardware in the Wii U is not supposed to be something completely new. Because Nintendo didn't want this. They wanted hardware that is easy to develop for.
And yet we get anecdotal accounts of devs who managed to increase the performance of their WiiU pipelines multi-fold in the span of the last devkit cycle. Again, let me restate that how much learning WiiU devs have to do to get the hang of the platform is something you can only find out first-hand.
Maybe because you didn't read all of them. Optimization is always possible. But it costs money and time. There is no doubt that the Wii U's hardware is more capable than the hardware of the Xbox 360 and PS3. But is there any sign that devs will actually care about this? That they will put the same effort into the Wii U that they did with the Xbox 360 and PS3? That they will put more effort into it? And how much effort is needed to produce a jump like PDZ to Halo 4? We won't see a magical jump just because "graphics algorithms will advance".
Which devs? PDZ and Halo 4 are both first-party. Are you suggesting Nintendo will neglect their own platform?
While that remains a possibility, I find it highly unlikely, for reasons that have been repeatedly stated.
Exception is Cell-targeted sw tech, which, let's face it, was a dead end.
Not entirely, I would say. People are incredibly hyped about GPGPU these days (still), which is quite close to the Cell concept. Particularly if the "GPU" cores used for it reside on the CPU.
You've already been told by multiple people but continue to repeat the question. Yes, it is about POWER, since every gen starts with devs trying to figure out how to best utilize that POWER.
Architecture and power are not the same thing. The Wii was very underpowered compared to the 360 and PS3, but it was also hampered by having an outdated architecture.
The WiiU's POWER is similar to current gen and developers can just use existing models and assets with enhancements. So far I see the jump from PS360 to WiiU as a smaller jump than PS2 to Xbox.
I believe the same thing, and I already suggested as much by telling someone who saw this as a PS2-to-Xbox jump that 'I wouldn't go that far'.
I see no reason to expect some sort of hidden potential that blows the best current gen software away.
I don't. We'll see games on Wii U look better than current gen if a talented dev puts in the effort. Gauging from the Wii, I doubt we'll see many games push the system to its best, as most Wii games fail to look better than the best Gamecube games, save a few notable exceptions.
The WiiU is designed to run current gen engines. These engines have had years to mature to their current state, and that, plus developer experience, is a huge benefit to the WiiU. This was clearly part of Nintendo's plan, and it was a risky but clever idea.
As Gemusepizza (who I was replying to originally) keeps reiterating, a big part of the learning curve was learning how to program for multicore CPUs and working with shaders. That isn't about power, but functionality. As far as we know, based on all the rumours, we are looking at the PS4 and Loop featuring shader model 5.0 GPGPUs; the Wii U has an enhanced shader model 4.1 GPGPU.
The next systems are expected to have a generational jump in power that will restart the cycle of immature engines and tools all over again. The learning curve will be smoother than this gen but will still have its fair share of difficulty.
I'm not sure what other answers you're looking for since no, the next systems won't be similar simply because they have multi-core CPUs and shaders, and your questions are terribly half-assed and unclear. Try to ask a question like an adult and don't act like an asshole.
If you take the time to read my posts and actually respond to what I said, rather than presuming I said something else, I won't be so dismissive. Gemuse was talking about architecture, not power, and continues to talk about architecture. I was responding to him. It's not difficult to follow, and yet somehow you thought I was talking about power.
I can imagine the development process for RETRO has been difficult - there's a lot of pressure on them to deliver a visually stunning game, yet with fluctuating hardware and looming deadlines, I bet they've had a really difficult time pulling the project through. I have every faith they'll develop something amazing, but as is always the case with new tech, I bet they struggled to match graphics on the level of The Witcher 2, Skyrim etc., let alone surpass them.
If we see Halo 4 level graphics, I'll be impressed. If we see anything surpassing that, I'll be amazed.
I find it likely because of what else we know. With a separate processor for the OS, dedicated sound hardware, and GPGPU functionality available, it makes sense that they wouldn't have given the CPU a lot of juice, in order to focus on power consumption, heat and size.
Considering both the PS3 and 360 CPUs deal with OS tasks, sound and graphics calculations, Nintendo would probably try to make their CPU on par for the remaining tasks (counting on GPGPU for the rest) so it balances out. But maybe some tasks can't be offloaded to the GPU, or only with difficulty, or there's a problem with experience or time. Or they underestimated demands, which is why we hear about problems. Who knows.
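As a rough illustration of why some work offloads to a GPU easily and some doesn't (a generic sketch in Python, nothing here is Wii U specific or from any real console SDK): independent per-element work parallelises trivially, while anything with a serial dependency chain, like a feedback loop in audio or gameplay logic, stays stubbornly CPU-bound.

# Generic sketch of offload-friendliness; nothing here is Wii U specific.
import numpy as np

samples = np.random.rand(1_000_000).astype(np.float32)

# Offloads well: every output element depends only on its own input,
# so all one million updates could run side by side on GPU-style hardware.
attenuated = samples * np.float32(0.5)

# Offloads poorly: each step depends on the previous result, so the work
# is inherently serial no matter how many cores or shader ALUs you have.
state = np.float32(0.0)
for s in samples:
    state = state * np.float32(0.99) + s  # simple feedback (IIR-style) filter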
Blu, after all the arguing I'm not sure I still see your point. Are you saying that the difference in hardware architecture going from PS360 to WiiU is as significant as the one going from Xbox/GC/PS2 to PS360? Because if so, I disagree for the reasons I outlined further up. If not, then what are you saying?
No, I'm arguing that at the end of the day it was not so much the novelty tech in those platforms that dictated what was the observable state-of-the-art on those platforms through the end of the platforms' lifespans. It was the algorithmic know-how pertinent to the hw, and developed through the hw's lifespan, which played the crucial role in something like 'PDZ vs Halo 4'. I'm arguing that the sw side of things did not simply come down to 'devs originally did not know SMP or shaders' - that's a mis-directed argument at best, and more like a gross bastardisation of the subject.
Not entirely, I would say. People are incredibly hyped about GPGPU these days (still), which is quite close to the Cell concept. Particularly if the "GPU" cores used for it reside on the CPU.
Ok, well, it was a dead end on the CPU end, let's put it like this.
Isn't most of Retro former PC developers? I think the more mundane architecture of current consoles would be right up their alley.
I thought Retro was mainly ex-Iguana (N64 FPS developer) staff?
My point wasn't about specifically learning GPGPU coding, it was more that I expect multiplatform development towards WiiU to be conservative in its approach (see Fifa having old engines etc). That means limited funds for dev teams, which means 'get something out of the door quickly'.
That environment doesn't lend itself to teams taking the time to understand the architecture and get the most out of it.
So essentially another Wii situation.
I'm certainly glad I'm not rushing to waste money this time around.
I don't see what would be so challenging for third party devs. They are already used to multicore CPUs and shader-based GPUs.
<snip>
However, if the next gen consoles have GPGPU elements, then teams will need to get to grips with this anyway, which might benefit WiiU development.
I wasn't aware of this. I thought the base hardware was exactly the same, just shrunk down.
The CPU, codenamed Xenon, implemented three in-order PowerPC cores with SMT support - meaning the whole chip could work on six threads at the same time. The design was ahead of its time, but given its 90nm manufacturing process it only had 1MB of L2 cache to share among all three cores. These days it isn't really considered the ideal approach to a many-core CPU. Private L2 caches with a large shared L3 cache are preferred for scaling beyond two cores.
the new Xbox 360 consumes less than half the power of the original 360!
One interesting thing about the new design is the inclusion of an "FSB Replacement" block. IBM/GlobalFoundries could have just connected the CPU and GPU with a low-latency internal connection, but doing so would have made the new Xbox 360 faster than previous versions. The FSB Replacement block actually adds latency to the mix and introduces a performance hit to keep the new model from outpacing older versions.
I can imagine the development process for RETRO has been difficult - there's a lot of pressure on them to deliver a visually stunning game, yet with fluctuating hardware and looming deadlines, I bet they've had a really difficult time pulling the project through. I have every faith they'll develop something amazing, but as is always the case with new tech, I bet they struggled to match graphics on the level of The Witcher 2, Skyrim etc., let alone surpass them.
If we see Halo 4 level graphics, I'll be impressed. If we see anything surpassing that, I'll be amazed.
It wouldn't surprise me in the least if they easily surpassed Halo 4's presentation. These are the same guys that made the Prime games and Donkey Kong Country Returns on Wii. Whatever they make on the Wii U, with its far more advanced CPU and 10x the memory, should be absolutely breathtaking.
Basically, your entire argument is resting on the premise there was this exotic tribe in the Amazon forest known as 'console devs'
Ok, can someone tell me why Gemüsepizza thinks the GC didn't support shaders at all? I mean, I'm pretty sure the TEV wasn't there for nothing. And while I am aware they weren't your average shaders, they were still shaders nonetheless.
Also, for those concerned about the quality, speed & quantity of the Wii U's memory setup, don't worry, they are "marvels of optimization", from the caches to the RAM. It's a recurring compliment from what I've heard.
My friend, will the OS really take 1 GB of space?
Iwata confirmed that, and I don't think that is likely to change before launch.
If it uses less than that, will the rest be reallocated to be used as extra RAM for games?
Thanks for your reply. I assume that also has to do with whatever the Tori Tori developers are using to reduce their texture RAM usage by 100 MB. It is interesting that they cannot say how they are doing that.
Have you heard anything about bandwidth specifically, compared to the PS3 or Xbox 360?
The Wii U hardware doesn't bring anything new to PC developers? Okay, but that won't stop companies from making their graphics better and better with each new game they make.
My friend, will the OS really take 1 GB of space?
If it uses less than that, will the rest be reallocated to be used as extra RAM for games?
Well, the operating system itself, strictly speaking (the files that constitute the "Windows" of the Wii U), no. But all the system "functions" (the "Windows" files plus all the services running in the background and the attached software features), apparently yes, mostly for caching purposes. Some techies can explain this better than me; we already talked about this in a previous WUST.
But the 1 GB for games / 1 GB not available for devs is a sure thing, and was revealed here in February.
Thanks!
I believe the same thing, and I already suggested as much by telling someone who saw this as a PS2-to-Xbox jump that 'I wouldn't go that far'.
Come on, the Xbox was a 2x~3x PS2, and the same can be applied to Wii U versus PS360. I'm sure Wii U is more than the "PS2 to Xbox" jump.
Ideaman is back!!
Good to hear more confirmation of Wii U's well-thought-out memory hierarchy. Although, we've really been hearing nothing but good things about this since Brain_Stew's initial bombas back in the day. Even if they kept it at 1 GB permanently, it seems like a nice amount to play around with given the capabilities of the GPU and CPU.
Now, if only we could get some bandwidth numbers! It's obvious Nintendo has gone w/ DDR3, so let us all hope that it's the fast 1066 MHz type (or a very slightly downclocked custom version of it - for clock sync purposes).
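For anyone who wants a ballpark while we wait: peak theoretical bandwidth is just the effective transfer rate times the bus width. A quick back-of-the-envelope in Python, assuming a 64-bit interface (the bus width is my assumption for illustration, not a confirmed spec):

# Peak theoretical bandwidth: transfers per second * bytes per transfer.
# The 64-bit Wii U bus width is an assumption for illustration, not a confirmed spec.
def peak_bandwidth_gb_s(transfer_rate_mt_s, bus_width_bits):
    return transfer_rate_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

print(peak_bandwidth_gb_s(1066, 64))    # DDR3-1066 on a 64-bit bus -> ~8.5 GB/s
print(peak_bandwidth_gb_s(1400, 128))   # Xbox 360's GDDR3 (128-bit @ 1400 MT/s) -> ~22.4 GB/s, the commonly cited figure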
But the 1 GB for games / 1 GB not available for devs is a sure thing, and was revealed here in February.
Is this something that can be optimized, giving the developers more RAM to work with in the future? Were the 360's and PS3's OSes optimized throughout their lives?
3DS did that as well, I believe.
I've never heard that about 3DS RAM. (apart from in similar gaf posts.)
There were rumours of extra processing power being unlocked though.