
Anandtech on X360/PS3's "poor" CPU performance

element said:
This is a little out of control. I've heard complaints, but mostly because developers don't have time to really develop for the platform, and are forced to use old practices in their current games.

Believe this man.
 
gofreak said:
Well, that's a subjective point. In terms of sheer processing requirements, though, we're seeing dedicated physics chips appear in the PC space, which should give you an idea of future requirements.


I think physics processing units are more of a fad and will go the way of math co-processors: absorbed, most likely by the CPU (the GPU is less likely), or split between the two. Either way, most of the CPU-intensive stuff, the physics engine and the AI, will be hampered by budgets and time constraints a lot more than by the CPU.
 
3rdman said:
You're making the same mistake Anand is making... you're comparing a PC (or Mac) environment to a console. The difference is that Intel's and AMD's CPUs are meant to be very strong out-of-order chips, and it seems as though his argument is based around how well PC-centric code runs on those CPUs.

From what I gather, games (which have to be multi-threaded to get the most out of these consoles) are all designed around single-threaded code to run on PCs or Xboxes... heck, most devs don't even get the most out of the PS2 because they are not coding the EE properly.

I'm no techie, but this article really screams biased... or at worst, misinformed.


The problem with people not coding the EE properly lies solely with Sony. Who cares how powerful the PS2 is if it's a beast to program for and you can only get 75% out of it? The same will probably apply to the PS3 with Cell. You can't create an unfamiliar architecture and expect people to get games running at peak efficiency. I'm sure there are a lot of developers out there thinking the same thing as Anand.
 
I'll be glad when we get to see games running on final or close-to-final hardware so we can move on to more interesting subjects, like the games themselves. I swear I've forgotten what speed the PS2 runs at.
 
This article reminds me of the one they did for the GameCube last time around, to some degree. The gist of that one was "the Cube is made up of non-PC parts, and we don't know much about them, but they suck." They still have it up:

http://www.anandtech.com/showdoc.html?i=1566&p=2

At least for this one, they interviewed developers for opinions, instead of just speculating that all the non-PC parts in the machines were crap.

A lot of what they say may be true, but they could work on their presentation a lot. "PCs rule and all the non-PC parts in the new consoles suck" isn't a very compelling argument.
 
element said:
This is a little out of control. I've heard complaints, but mostly because developers don't have time to really develop for the platform, and are forced to use old practices in their current games.


That sounds about right. The article reads as if these developers are still trying to code for a Pentium 4.
 
teiresias said:
The only thing that comes close to being worse than PC programmers who refuse to learn to program anything that Microsoft hasn't spoon-fed them is a tech-site writer who refuses to understand anything at all.
Thanks, teirs. Looks like I don't have to post anymore. Ta!

PS: I fixed it for you.
 
fugimax said:
I dunno, what Anand says agrees with what I've been hearing from devs. The next gen will be a good leap in terms of getting consoles into the HD era. And yes, the GPUs really are the saving grace of these consoles.

Anyone who is a programmer knows that threading and synchronization are a bitch not only to implement but to debug. I would wager there are some devs from last gen who won't be able to handle working in a thread-heavy environment -- and so they won't even try. The result? Underwhelming performance.

I think MS/Sony would have been just as well off using a high-end P4 or AMD64 chip, as Anand states. Yeah, maybe MS saved some money... but what the hell is Sony doing? Cell is anything but amazing *with respect to console gaming.* Yes, Cell is impressive, but not in a console.


I'm sure MS went with IBM for the aforementioned reason of owning the IP to the chip. Sometimes I wonder if they factored in the obscurity of a proprietary chip as well.

IOW, proprietary design + relative lack of PPC chipheads = harder to hack(?)
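
Back to fugimax's threading point: for anyone who hasn't fought this firsthand, here's a tiny C++ sketch (purely my own illustration, nothing from any actual dev's engine) of why synchronized code is such a pain. Remove the lock and four threads quietly corrupt a shared counter; the program still runs fine, which is exactly why these bugs are hell to find.

    #include <iostream>
    #include <mutex>
    #include <thread>
    #include <vector>

    long long counter = 0;     // shared state touched by every thread
    std::mutex counter_mutex;  // guards counter

    void add_many(int n) {
        for (int i = 0; i < n; ++i) {
            // Without this lock, ++counter is a data race: two threads can
            // read the same old value and one increment is silently lost.
            std::lock_guard<std::mutex> lock(counter_mutex);
            ++counter;
        }
    }

    int main() {
        std::vector<std::thread> threads;
        for (int t = 0; t < 4; ++t)
            threads.emplace_back(add_many, 1000000);
        for (auto& th : threads)
            th.join();
        // Prints 4000000 every run with the lock; without it, a different
        // wrong number each run -- and no crash to point you at the bug.
        std::cout << counter << '\n';
    }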
 
Lancelet Pink said:
I think physics processing units are more of a fad and will go the way of math co-processors: absorbed, most likely by the CPU (the GPU is less likely), or split between the two.

You're right, but it'll be absorbed by CPUs like Cell (or Intel's planned processor designs for around 2014) or by GPUs, which already share a lot of commonality with it; Cell is effectively a superset of the PPU. Current CPU architectures can't deliver anywhere near the same level of performance as these designs at a given lithography node.

This is somewhat ironic, as Anand just talked about how they should have gone with an off-the-shelf CPU from Intel or AMD and how the SPEs are a waste of space, or some such bullshit. STI's Cell is very close to the AGEIA PhysX PPU design, AFAIK.

EDIT: HokieJoe, I've been meaning to ask... who is the girl? She looks familiar.
 
Someone on the B3D forums made an excellent point: if the 360's CPU is really only twice as fast as the Xbox's, then how in the world are they getting it to emulate Xbox games? Everyone knows that emulation is a difficult task.
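
To unpack that a bit: the naive way to emulate a CPU is a pure interpreter, where every guest instruction costs the host a fetch, a decode, and a dispatch, which is roughly why the old rule of thumb asks for an order of magnitude more horsepower than the machine being emulated. A toy C++ sketch of the idea (nothing like Microsoft's actual emulator, which is surely far smarter):

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Toy guest machine: each emulated instruction takes dozens of host
    // cycles to reproduce what the guest CPU did in one.
    enum Op : std::uint8_t { ADD, SUB, JMP, HALT };

    struct Insn {
        Op op;
        std::uint8_t dst, src;
        std::uint16_t target;
    };

    void run(const std::vector<Insn>& program) {
        std::uint32_t regs[16] = {0};
        std::size_t pc = 0;
        for (;;) {
            const Insn& i = program[pc++];  // fetch
            switch (i.op) {                 // decode + dispatch
                case ADD:  regs[i.dst] += regs[i.src]; break;
                case SUB:  regs[i.dst] -= regs[i.src]; break;
                case JMP:  pc = i.target; break;
                case HALT: return;
            }
        }
    }

    int main() {
        run({{ADD, 0, 1, 0}, {HALT, 0, 0, 0}});
    }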
 
The article got pulled. Perhaps Anand didn't want to get nailed for some misinformation? Because that whole article teetered on complete hyperbole. Usually Anand does a better job.
 
Mrbob said:
The article got pulled. Perhaps Anand didn't want to get nailed for some misinformation? Because that whole article teetered on complete hyperbole. Usually Anand does a better job.

Yep, could be. I've almost always been impressed with Anand's thoroughness though. Hell, even Kyle over at HardOCP refers to him as "Wonderboy". Who knows, he may have some inside dope, but he did come off as if he had a chip on his shoulder (sorry, I couldn't resist).
 
Found this on the Anandtech boards:

PS3 article is pulled for now because Anand is worried about MS tracing his anonymous insider.

Kristopher

:lol

What a bunch of bullshit. The article has been posted; if anything, someone from MS has already saved it. Nothing you can do about it now! Stick to your guns if you think the article is correct. Pulling it and justifying the pull with a weak excuse just calls the article's authenticity into question in the first place. Anand respect: -10000.
 
Well, I'm partially oblivious to most of this tech stuff, but I am glad sites like Anandtech are around. At least that guy tries to keep everything as unbiased as possible and promotes discussion of his findings, instead of discussion of what some MS/Sony PR dude told my best friend's dad who works at 'insert store here'.

Sounds like PS3 will be a "hoss" at 3D but crap for MS Word.
Stupid Floating Point Hardware... j/k
 
MidgarBlowedUp said:
At least that guy tries to keep everything as unbiased as possible...

There was plenty of bias in that article. Not toward Microsoft or Sony, but toward a platform. The entire angle of that article was about as subtle as getting your toenail ripped off.
 
Lancelet Pink said:
I think physics processing units are more of a fad and will go the way of math co-processors: absorbed, most likely by the CPU (the GPU is less likely), or split between the two. Either way, most of the CPU-intensive stuff, the physics engine and the AI, will be hampered by budgets and time constraints a lot more than by the CPU.

I agree it will eventually be absorbed back into CPUs, but in the meantime the type of processing it provides can't be replicated by PC CPUs. GPUs will eventually be absorbed back into CPUs too, IMO. And that kind of processing can be used without breaking the bank; it's not like art production. I think the limiting factor is certainly processing power before it is budget. A very simple use of it would be to simply switch on physics for more objects, which would take maybe five minutes with a decent engine. More ambitious uses and more ambitious algorithms will take time, but there'll certainly be devs who invest that time, and some who won't.
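
To be concrete about the 'switch on physics for more objects' case, here's a hypothetical C++ sketch; every API name here is invented for illustration, not from any real engine:

    #include <vector>

    // Hypothetical scene object.
    struct Prop {
        bool simulated = false;  // does the physics engine own this object?
    };

    // With more CPU (or PPU) headroom you simply raise the budget and more
    // of the same scene becomes fully dynamic: no new art, no new tools.
    void enable_physics(std::vector<Prop>& props, int budget) {
        for (Prop& p : props) {
            if (budget <= 0) break;
            p.simulated = true;  // engine would now run rigid-body sim for it
            --budget;
        }
    }

    int main() {
        std::vector<Prop> scene(500);
        enable_physics(scene, 200);  // on beefier hardware: raise 200 to 2000
    }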

On a general note, re: the article being pulled, whatever Anand's reasons, I'm not sure the issues he discussed just disappear now ;) A couple of devs on B3D agreed he wasn't far off the truth with some of the more specific points about performance.
 
You know, it amazes me when PC tech websites realize and acknowledge the difference between console development and PC development, and then STILL go on to compare them as if consoles were just PCs.

Someone needs to show him this:

Doom 3 minimum PC requirements:

System: Pentium® 4 1.5 GHz or Athlon™ XP 1500+ processor or equivalent
RAM: 384 MB
CD-ROM: 8X CD-ROM drive
Video Memory: 64 MB VRAM
Hard Drive Space: 2200 MB
Other: English version of Microsoft® Windows® 2000/XP; 3D hardware accelerator card required

Doom 3 minimum Xbox requirements:
733 MHz Pentium III and 64 MB of RAM.

He needs to try to run Doom 3 on a stripped-down 733 MHz P3 with 64 MB of RAM and see if he can achieve the same results id got out of the Xbox... I'll give you a hint about what's going to happen: NOTHING.
 
The PS2 had three CPUs and was vastly different to program for than PCs, and yet they somehow got MGS3 and GT4 to run.
 
OK, here's a question: these game programmers… are most of them any good? I mean, I would guess they're not on the same level as the id/Epic guys, but surely they're more than your average C++ junkies. 'Cause with all this bitching, it sure sounds like some of them need to go back to school and learn how to do THEIR JOBS over again instead of moaning about the difficulties of new architectures. To complain about hardware because you are not talented enough to develop for it is pathetic.

It is one thing to say the hardware just can't do it. But to say it can, but we are too stupid, lazy, or simply incompetent to bother with it, is sad. Sony should get the Cell/RSX engineers (you know, the guys who actually have credentials worth talking about) to provide an engine that utilizes the hardware and then license it to the less competent developers. The likes of Konami and Square Enix don't need handouts like this, but most Western devs sure seem to.
 
I don't think you guys get the point. The goal of new hardware shouldn't be to make the developer's job really challenging; the goal is to get the maximum performance with the least effort possible. It seems the two console makers have failed here. In the end, it's about the games, not about making it hard for the developer to reach his or her final vision.
 
The hardware and software engineers are working together to provide a product to the end user, and the end user wants one thing: performance. I absolutely get the point, and so do all the microprocessor design teams.
 
dorio said:
I don't think you guys get the point. The goal of new hardware shouldn't be to make the developer's job really challenging; the goal is to get the maximum performance with the least effort possible. It seems the two console makers have failed here. In the end, it's about the games, not about making it hard for the developer to reach his or her final vision.

Umm, no. The goal of new hardware is not supposed to be pandering to the requests of lazy and/or incompetent developers. The goal of new hardware is supposed to be pushing the limits of technology. The goal of good (read: GOOD) developers ought to be using new hardware to push software to new levels. If that requires a steep learning curve, so be it. Learn to program around the new architecture instead of bitching and moaning. It's akin to doctors complaining that MRIs are a bitch because they mean learning all-new techniques. Sure, those techniques give superior results for the patient, but damn, it sucks to have to learn them from scratch.
 
Doube D said:
Umm, no. The goal of new hardware is not supposed to be pandering to the requests of lazy and/or incompetent developers. The goal of new hardware is supposed to be pushing the limits of technology. The goal of good (read: GOOD) developers ought to be using new hardware to push software to new levels. If that requires a steep learning curve, so be it. Learn to program around the new architecture instead of bitching and moaning. It's akin to doctors complaining that MRIs are a bitch because they mean learning all-new techniques. Sure, those techniques give superior results for the patient, but damn, it sucks to have to learn them from scratch.

Blimey, here I am agreeing with DD O_O

Sure, developers would like things to be simple. And perhaps with Xbox they have gotten comfortable with the idea of a PC-like development environment. And now X360 comes along and breaks that a little.

Unfortunately, devs don't always get what they want. Publishers want games that look as good as the competition's on PS3/Xbox 360? As a developer, you deliver what the guy with the cheque wants, or you find yourself with no work (or porting licenses to PS2, like a lot of smaller devs did at this stage in the cycle last time round).
 
Doube D said:
OK, here's a question: these game programmers… are most of them any good? I mean, I would guess they're not on the same level as the id/Epic guys, but surely they're more than your average C++ junkies. 'Cause with all this bitching, it sure sounds like some of them need to go back to school and learn how to do THEIR JOBS over again instead of moaning about the difficulties of new architectures. To complain about hardware because you are not talented enough to develop for it is pathetic.

It is one thing to say the hardware just can't do it. But to say it can, but we are too stupid, lazy, or simply incompetent to bother with it, is sad. Sony should get the Cell/RSX engineers (you know, the guys who actually have credentials worth talking about) to provide an engine that utilizes the hardware and then license it to the less competent developers. The likes of Konami and Square Enix don't need handouts like this, but most Western devs sure seem to.
While I mostly agree with you (who would have thought that would ever happen), don't forget that optimizing, and especially debugging, multithreaded code can be, and most of the time IS, a complete nightmare.
 
Let's not forget we are talking about PC programmers. They always complain about new console architectures 'cos, well, they are lazy and deserve a kick in the ass.
 
Why do we have to get all technical?
We know the performance won't be as good as they say. It never is. It never will be.

[image: allard8ej.jpg]
 
mrklaw said:
Blimey, here I am agreeing with DD O_O

Sure, developers would like things to be simple. And perhaps with Xbox they have gotten comfortable with the idea of a PC-like development environment. And now X360 comes along and breaks that a little.

Unfortunately, devs don't always get what they want. Publishers want games that look as good as the competition's on PS3/Xbox 360? As a developer, you deliver what the guy with the cheque wants, or you find yourself with no work (or porting licenses to PS2, like a lot of smaller devs did at this stage in the cycle last time round).


I agree that the new consoles should push the technology envelope; otherwise, why come out with a new console at all? OTOH, I think MS and Sony would be remiss if they didn't consider how easy or hard their respective platforms are to develop for.

I would think striking that balance is a positive for all parties involved.
 
If he had word-for-word quotes this would probably hold more weight, but those probably weren't sensational enough, so he went ahead and fixed 'em up to stir up a beehive. Seriously, AnandTech is a cool site and all, but the guy has his biases. He's basically taking a few minor points that have been made and blowing them up. All he is really saying about Cell is that it probably won't be as good at general-purpose, out-of-order, branch-heavy code, which doesn't mean a damn thing for its target use.

If general-purpose logic circuits were so damn important, there wouldn't be physicists using video cards as processors for their insane computing power. They can use the GPU as a CPU because they know exactly what to ask of it, whereas a CPU from AMD/Intel/IBM has to run a myriad of applications simultaneously, so of course half its die space goes to housekeeping circuitry that Cell and Xenon don't need.
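
The difference is really the shape of the code. A minimal C++ illustration of the contrast (my own sketch): a streaming design like a GPU or an SPE eats the first loop alive, while the second needs exactly the big out-of-order, branch-predicting core Anand is defending.

    #include <cstddef>

    // Data-parallel: the same arithmetic on every element, no branches.
    // This is what GPUs and Cell's SPEs are built to chew through.
    void saxpy(float a, const float* x, const float* y,
               float* out, std::size_t n) {
        for (std::size_t i = 0; i < n; ++i)
            out[i] = a * x[i] + y[i];
    }

    // Branch-heavy and pointer-chasing: unpredictable control flow on
    // every element, the "housekeeping" workload that favors a big
    // out-of-order core with branch prediction and deep caches.
    struct Node {
        int value;
        Node* next;
    };

    int sum_odd_values(const Node* head) {
        int total = 0;
        for (const Node* n = head; n != nullptr; n = n->next)
            if (n->value % 2 != 0)  // data-dependent branch per element
                total += n->value;
        return total;
    }

    int main() {
        float x[4] = {1, 2, 3, 4}, y[4] = {4, 3, 2, 1}, out[4];
        saxpy(2.0f, x, y, out, 4);  // out = {6, 7, 8, 9}
    }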

The point he is making about the tri-core G5 basically feeds off a review comparing the best workstations available. And, as suspected, the G5 is a latency-heavy processor; it's no slouch when it comes to crunching numbers, but that doesn't matter given the latency involved. It's not just the CPU, though: Apple's OS X, with its tons of layers of abstraction, also held that chip back. Basically, if MS has the chip up and running, Anand has no argument here. I mean, come on, look at the supercomputer rankings: IBM dominates the list. They know how to make a CPU.
 
It's kinda funny, all the hubbub created by "anonymous" sources.

I'm always sceptical about anonymous sources. If Anand wants this to be taken more seriously, they need to provide names. If the devs don't want to go on record, then they need to STFU, IMO.
 
Dr_Cogent said:
It's kinda funny, all the hubbub created by "anonymous" sources.

I'm always sceptical about anonymous sources. If Anand wants this to be taken more seriously, they need to provide names. If the devs don't want to go on record, then they need to STFU, IMO.
Ignorance is bliss?
 
dorio said:
I don't think you guys get the point. The goal of new hardware shouldn't be to make the developer's job really challenging; the goal is to get the maximum performance with the least effort possible. It seems the two console makers have failed here. In the end, it's about the games, not about making it hard for the developer to reach his or her final vision.

Well, I think the goal of new hardware, especially when talking about consoles, is to offer the maximum performance while keeping costs down. That sometimes means using a clever, original hardware design instead of going the raw chipset-power route, which in turn means developers have to deal with 'different' architectures and actually have to 'think' of ways to get the maximum out of them.
And programmers are definitely getting lazy. Some would somehow manage to get 8 sound channels out of a 4-channel chip, put 512 colours on screen on a machine that wasn't meant to display more than 32, or code a decent version of, say, Thunder Blade on a 64K machine... or have a PS2 outputting 1080i at 60fps... but most of them won't try hard to overcome hardware limitations and would rather wait for the following gen...
 
jimbo said:
You know, it amazes me when PC tech websites realize and acknowledge the difference between console development and PC development, and then STILL go on to compare them as if consoles were just PCs.

Someone needs to show him this:

Doom 3 minimum PC requirements:

System: Pentium® 4 1.5 GHz or Athlon™ XP 1500+ processor or equivalent
RAM: 384 MB
CD-ROM: 8X CD-ROM drive
Video Memory: 64 MB VRAM
Hard Drive Space: 2200 MB
Other: English version of Microsoft® Windows® 2000/XP; 3D hardware accelerator card required

Doom 3 minimum Xbox requirements:
733 MHz Pentium III and 64 MB of RAM.

He needs to try to run Doom 3 on a stripped-down 733 MHz P3 with 64 MB of RAM and see if he can achieve the same results id got out of the Xbox... I'll give you a hint about what's going to happen: NOTHING.

I think you missed the part where Doom 3 PC absolutely destroys Doom 3 Xbox in the graphics department when compared side by side. Doom 3 on Xbox was a downgraded port from PC. I'd like to see you take the PC version of Doom 3 and get it to run on Xbox as is. Not going to happen.
 
Razoric said:
I think you missed the part where Doom 3 PC absolutely destroys Doom 3 Xbox in the graphics department when compared side by side. Doom 3 on Xbox was a downgraded port from PC. I'd like to see you take the PC version of Doom 3 and get it to run on Xbox as is. Not going to happen.


That's true, but having played both versions, I was quite impressed with the port. And you know what? I didn't have to spend $800 upgrading my Xbox to play it the way it should be played either. :)
 
HokieJoe said:
That's true, but having played both versions, I was quite impressed with the port. And you know what? I didn't have to spend $800 upgrading my Xbox to play it the way it should be played either. :)

Unfortunately, this is also true, and it's why PC gaming is in constant decline. It seems only Blizzard and a few others ever take weaker PCs into consideration when making games.
 
mrklaw said:
Yeah, Xbox is pretty impressive for a Celeron 733, 64(!) MB of RAM and a GeForce 3. Almost pulling off HL2 and Doom 3.

Now consider what that means for the next-gen consoles... drool.

The Xbox NV2A GPU is much more powerful than a GeForce 3.

plain GeForce 3 = ~45 million vertices/sec peak - 800 million pixels/sec fillrate
Xbox NV2A GPU = ~116 million vertices/sec peak - 932 million pixels/sec fillrate

The NV2A GPU is slightly over 2.5 times more powerful than the GeForce 3 in polygon performance, and also significantly more powerful in pixel shader performance, even though the pixel fillrates are pretty similar.
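
(Checking the arithmetic on those peak figures: 116 / 45 ≈ 2.6x the vertex rate, while 932 / 800 ≈ 1.17x the fillrate, which is why the fillrates read as "pretty similar".)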

shpankey said:
The Xbox CPU isn't a Celeron or a P3; it's a hybrid CPU, somewhere in the middle of the two.

True, but the Xbox CPU is closer to a Pentium III than it is to a Celeron. The bus and L2 cache architecture is more like, if not exactly like, a PIII's, even though the *amount* of L2 cache (128K) is the same amount the Celeron version had.
 
midnightguy said:
The Xbox NV2A GPU is much more powerful than a GeForce 3.

plain GeForce 3 = ~45 million vertices/sec peak - 800 million pixels/sec fillrate
Xbox NV2A GPU = ~116 million vertices/sec peak - 932 million pixels/sec fillrate

The NV2A GPU is slightly over 2.5 times more powerful than the GeForce 3 in polygon performance, and also significantly more powerful in pixel shader performance, even though the pixel fillrates are pretty similar.


Not only that, but the NV2A cranks along at 6.4GB/sec of memory bandwidth, vs. the ~2.1GB/sec the GeForce 3 gets over AGP. So the GeForce 3 was also limited by the AGP bus.
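
(For what it's worth, the 6.4GB/sec figure checks out if you assume the commonly quoted 128-bit DDR memory bus at 200MHz: 200MHz × 2 transfers per clock (DDR) × 16 bytes per transfer = 6.4GB/sec.)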
 