element said: This is a little out of control. I've heard complaints, but mostly because developers don't have time to really develop for the platform, and are forced to use old practices in their current games.
Believe this man.
element said: This is a little out of control. I've heard complaints, but mostly because developers don't have time to really develop for the platform, and are forced to use old practices in their current games.
gofreak said: Well, that's a subjective point. In terms of sheer processing requirements, though, we're seeing dedicated physics chips appear in the PC space, which should give you an idea of future requirements.
3rdman said: You're making the same mistake Anand is making... you're comparing a PC (or Mac) environment to a console. The difference in CPUs is that Intel's and AMD's are meant to be very strong out-of-order chips, and it seems as though his argument is based around the idea of how well PC-centric code runs on these CPUs.
From what I gather, games (which have to be multi-threaded to get the most out of these consoles) are all designed around single-threaded code to run on PCs or Xboxes... heck, most devs don't get the most out of the PS2 because they are not coding the EE properly.
I'm no techie, but this article really screams biased... or, at worst, misinformed.
teiresias said: The only thing that comes close to being worse than PC programmers who refuse to learn to program anything that Microsoft hasn't spoon-fed them is a tech-site writer who refuses to understand anything at all.
thanks teirs. looks like i don't have to post anymore. ta!
fugimax said: I dunno, what Anand says agrees with what I've been hearing from devs. The next gen will be a good leap in terms of getting consoles into the HD era. And yes, the GPUs really are the saving grace of the consoles.
Anyone who is a programmer knows that threading and synchronization is a bitch not only to implement but to debug. I would wager there are some devs from last gen that won't be able to handle working in a thread-heavy environment -- and so they won't even try. The result? Underwhelming performance.
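For anyone who hasn't fought this firsthand, here is a minimal Python sketch (purely illustrative, not from any of the devs quoted in this thread) of the kind of synchronization being complained about: four threads bumping one shared counter, which silently loses updates unless every increment is guarded by a lock.

```python
import threading

counter = 0
lock = threading.Lock()  # without this lock, counter += 1 can interleave and drop updates

def work(iterations):
    global counter
    for _ in range(iterations):
        with lock:           # the easy part is writing this; the hard part is
            counter += 1     # finding the one place someone forgot it

threads = [threading.Thread(target=work, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000 with the lock; nondeterministic without it
```

The bug version (no lock) may still print 400000 on some runs, which is exactly why these defects are so miserable to debug.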
I think MS/Sony would have been just as well off using a high-end P4 or AMD64 chip, as Anand states. Yeah, maybe MS saved some money... but what the hell is Sony doing? Cell is anything but amazing *with respect to console gaming.* Yes, Cell is impressive, but not in a console.
Vince said:EDIT: HokieJoe, I've been meaning to ask... who is the girl? She looks familiar.
Mrbob said:The article got pulled. Perhaps Anand didn't want to get nailed for some misinformation? Because that whole article teetered on complete hyperbole. Usually Anand does a better job.
PS3 article is pulled for now because Anand is worried about MS tracing his anonymous insider.
Kristopher
MidgarBlowedUp said: At least that guy tries to keep everything as unbiased as possible...
Lancelet Pink said: I think physics processing units are more of a fad, and will go the way of math coprocessors: most likely absorbed by the CPU, or (less likely) the GPU, or split between the two. Either way, most of the CPU-intensive stuff (the physics engine, the AI) will be hampered a lot more by budgets and time constraints than by the CPU.
Hyoushi said:
this is pretty accurate. developers can't rely on ILP and memory hierarchy tricks to turn their shitty code into gold anymore. bitching about it isn't going to help.
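To illustrate the "memory hierarchy tricks" point (a hypothetical Python sketch; the in-order-core gloss is mine, not the poster's): the two traversals below compute the same sum, but the first walks each row contiguously while the second strides a full row-length between accesses. On the in-order cores in these consoles there is no out-of-order window to hide the resulting cache misses, so the access pattern itself becomes the programmer's problem.

```python
# Same matrix, same result, very different memory-access patterns.
N = 512
matrix = [[1] * N for _ in range(N)]

def sum_row_major(m):
    # walks each row contiguously: cache-friendly
    return sum(sum(row) for row in m)

def sum_col_major(m):
    # jumps a full row-length stride per access: cache-hostile
    return sum(m[r][c] for c in range(N) for r in range(N))
```

Both return N * N; on real hardware with large arrays, the contiguous walk is the one that runs at full speed.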
dorio said: I don't think you guys get the point. The goal of new hardware shouldn't be to make the developer's job really challenging; the goal is to get the maximum performance with the least effort possible. It seems here that the two console makers have failed. In the end, it's about the games, not about making it hard for the developer to reach his/her final vision.
Doube D said: Umm, no. The goal of new hardware is not supposed to be pandering to the requests of lazy and/or incompetent developers. The goal of new hardware is supposed to be pushing the limits of technology. The goal of good (read: GOOD) developers ought to be the utilization of new hardware to push software to new levels. If it requires a steep learning curve, so be it. Learn to program around the new architecture instead of bitching and moaning. It's akin to doctors complaining that MRIs are a bitch since they mean having to learn all new operating techniques. Sure, the techniques result in superior results for the patient, but damn, it sucks to have to learn them from scratch.
Doube D said: Ok, here's a question. These game programmers… are most of them any good? I mean, I would guess they are not on the same level as the id/Epic guys, but surely they are more than your average C++ junkies. 'Cause with all this bitching, it sure sounds like some of them need to go back to school and learn how to do THEIR JOBS over again instead of moaning about the difficulties of new architectures. To complain about hardware because you are not talented enough to develop for it is pathetic.
It is one thing to say the hardware just can't do it. But to say it can, but we are either too stupid, lazy, and/or simply incompetent to bother with it, is sad. Sony should get the Cell/RSX engineers (you know, the guys that actually have credentials worth talking about) and have them provide an engine that utilizes the hardware and subsequently license it to the more incompetent developers. The likes of Konami and Square Enix don't need handouts like this, but most western devs sure seem to.
While I mostly agree with you (who would have thought it would ever happen), don't forget that optimizing and especially debugging multithreaded code can be, and most of the time IS, a complete nightmare.
mrklaw said:Blimey, here I am agreeing with DD O_O
Sure, developers would like things to be simple. And perhaps with Xbox they have gotten comfortable with the idea of a PC-like development environment. And now X360 comes along and breaks that a little.
Unfortunately, devs don't always get what they want. Publishers want games that look as good as the competition's on PS3/Xbox 360. As a developer, you deliver what the guy with the cheque wants, or you find yourself with no work (or porting licenses to PS2, like a lot of smaller devs did at this stage in the cycle last time round).
ThongyDonk said: Why do we have to get all technical?
Because we are discussing technical issues?
Dr_Cogent said: It's kinda funny, all the hubbub created by "anonymous" sources.
I'm always sceptical about anonymous sources. If Anand wants this to be taken more seriously, they need to provide names. If the devs don't want to go on record, then they need to STFU, IMO.
Ignorance is bliss?
jimbo said: You know, it amazes me when PC tech web sites realize and acknowledge the difference between console development and PC development and then STILL go on to compare them as if consoles were just PCs.
Someone needs to show him this:
Doom 3 minimum PC requirements:
System: Pentium® 4 1.5 GHz or Athlon™ XP 1500+ processor or equivalent
RAM: 512 MB RAM
CD-ROM: 8X CD-ROM
Video Memory: 64 MB VRAM
Hard Drive Space: 2200 MB
Other: 384 MB RAM, English version of Microsoft® Windows® 2000/XP, 3D hardware accelerator card required
Doom 3 minimum Xbox requirements:
733 MHz Pentium 3 and 64 MB of RAM.
He needs to try and run Doom 3 on a stripped-down 733 MHz P3 chip with 64 MB of RAM and see if he can achieve the same results id got out of the Xbox... I'll give you a hint of what's going to happen... NOTHING.
Razoric said:I think you missed the part where Doom 3 PC absolutely destroys Doom 3 Xbox in the graphics department when compared side by side. Doom 3 on Xbox was a downgraded port from PC. I'd like to see you take the PC version of Doom 3 and get it to run on Xbox as is. Not going to happen.
HokieJoe said: That's true, but having played both versions, I was quite impressed with the port. And you know what? I didn't have to spend $800 upgrading my Xbox to play it the way it should be played, either.
Doc Holliday said: Where is this article? I can't find it anywhere. Link takes me to a search engine that sucks.
Deadmeat posted this. It's in the Wayback Machine, I guess:
mrklaw said: Yeah, Xbox is pretty impressive for a Celeron 733, 64(!) MB of RAM and a GeForce 3. Almost pulling off HL2 and Doom 3.
Now consider what that means for the next-gen consoles... drool.
shpankey said: The Xbox CPU isn't a Celeron or a P3. It's a hybrid CPU, somewhere in the middle of them both.
midnightguy said: The Xbox NV2A GPU is much more powerful than a GeForce 3.
plain GeForce 3 = ~45 million vertices/sec peak, 800 million pixels/sec fillrate
Xbox NV2A GPU = ~116 million vertices/sec peak, 932 million pixels/sec fillrate
The NV2A GPU is slightly over 2.5 times more powerful than the GeForce 3 in polygon performance, and also significantly more powerful in pixel shader performance, even though the pixel fillrates are pretty similar.
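The ratios quoted above can be sanity-checked in a couple of lines of Python (the figures are simply the peak numbers from the post, restated as variables):

```python
# Peak figures quoted in the post above (millions per second)
gf3_vertices, nv2a_vertices = 45, 116
gf3_fillrate, nv2a_fillrate = 800, 932

vertex_ratio = nv2a_vertices / gf3_vertices    # ~2.58x: "slightly over 2.5 times"
fillrate_ratio = nv2a_fillrate / gf3_fillrate  # ~1.17x: "pretty similar"
```

So the "2.5x polygon throughput but similar fillrate" claim is internally consistent with the numbers given.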