
Interview with Gabe Newell (Valve): "Your Existing Code? Throw it Away"

Welcome to the future, Gabe.
BlackClouds said:
Yeah, this guy is my new hero for having the balls to say some of the comments he has said. Unfortunately, the deceit and lies can really affect game sales. God forbid you show a real picture of your game; you'll get blasted on forums like GAF. Next thing you know your game might not have any customers that still give a damn.
Oh I dunno, it seems it's edgy these days to call out Sony on the PS3 demos/claims. And hilarious when they reference PS2 demos to prove their point (not that he did).
 
Sho Nuff said:
Gabe is dead on and smarter than you, so keep making fun of the fact that he's fat if it makes you feel better.

Your reply showed that I'm indeed smarter than you. I wasn't even serious in my remarks; are you his g/f or something?
 
Tellaerin said:
'Dead on and smarter than you'? Newell's an idiot as far as I'm concerned, fat or thin. I don't think anyone who's 'smarter than me' would ever make asinine remarks like, 'every game developer should be terrified of the next generation of processors'. People whose minds I respect aren't afraid of embracing new and better architectures, even when it means abandoning the techniques they know and starting fresh. Gabe sounds like someone who'd rather see hardware manufacturers stick with the same legacy architecture indefinitely and just keep layering on features, no matter how inefficient that might be, because to do things differently means developers have to make a concerted effort to learn, and this is a prospect they should be terrified by. (In the article, he cites the supposed problems involved in coding for these multicore systems, and suggests that programmers may not manage to master the hardware over the course of the coming generation--when should companies make the transition to multicore processors then, Gabe? Never? Clinging to the current design paradigm for as long as possible isn't going to make the learning curve any shallower or the transition any easier, so unless he's suggesting that hardware stay in its current groove indefinitely, there's no reason not to embrace change now.)

Sorry if that seems harsh to you, but I feel that people who cling to legacy architecture because they don't want to invest the time and effort necessary to learn something superior are dead weight slowing the wheels of progress. :p
*slaps forehead*

Seriously, how did you come away thinking that he is saying that the new hardware sucks and should be avoided? It must have taken you at least five minutes to type all of that and think it through.

Striek said:
Welcome to the future, Gabe.
Same for you, bud.
 
no there are a couple of problems with that part:

first of all, game customers ARE idiots, and the architecture and hardware guys live off theoretical throughput and benchmark results because this is what they live and breathe, not because they're in the middle of some insidious plan to dupe the consumer.

second of all, this idea of his that modest parallelism is going to make the sky fall (you're so fucking stupid gabe. you know what would be great? if you managed to make more than 2 games in 10 years), or that we'll have like 0.0001% utilization across the board without application of doctoral thesis grade problem solving is fucking retarded. these chips are NOT massively parallel. massive parallelism is 2^8 to 2^10 node clusters of 2-to-4-way multiprocessor and multicore commodity hardware. massive parallelism is crap like blue gene, not a single tri-core console chip.* will some code not be reusable? yah, sure. will some code be perfectly reusable? yah, sure. OKAYSORRY!!! that's the way it always works.

for god's sake, why is this guy so rich?

*ok, so that was a bit of a tangent. my point is that the supercomputer guys are looking at massive parallelism right now, so his whole doctoral thesis crap really doesn't apply.
 
Rhindle said:
*slaps forehead*

Seriously, how did you come away thinking that he is saying that the new hardware sucks and should be avoided? It must have taken you at least five minutes to type all of that and think it through.

The impression I got wasn't so much 'the new hardware sucks and should be avoided' as 'Boy, do I wish companies never decided to move to this multicore architecture! Nobody knows how to code for it yet, most of the problems getting the code running aren't solved, most of the people that are productive now aren't going to be productive after the paradigm shift, none of our legacy code's going to be any good anymore... hell, we might not even master the new hardware before the X360's out of production! I wish all this multicore crap would just go away. Things have been fine up until now, and trying to do something new with hardware is just going to introduce a shitload of problems for everybody.' He seems to practically ooze negativity. No enthusiasm whatsoever about the potential of the new hardware, just this litany of inconveniences and drawbacks. Maybe if he'd tempered his remarks with a few comments about the possibilities he sees the new, more powerful architecture opening up for developers, I wouldn't've gotten the impression I did.
 
this makes a lot more sense. he is being very negative, but the reasons he gives are more molehills than mountains. this is why i think he's an idiot.
 
You have two choices here and only two choices.

You make hardware from off the shelf parts that is relatively easy to program for and easy to build, but has a very limited life due to the hardware being tapped out early in its life cycle. (example- Dreamcast/Xbox)

DOA3 to DOAU isn't a big jump, more like a small step up.
Soul Calibur to "nothing surpassed it" isn't a big jump either.

or

You make hardware from custom built parts that are technically way ahead of the "off the shelf" stuff. Taking this route almost guarantees the architecture will be harder to develop for, but it will take much longer to be maxed out.
(example- PS2/Gamecube)

Tekken Tag Tournament to Tekken 5 is a giant leap.
Any RE game to RE4 is a giant leap.
 
Rhindle said:
Seriously, how did you come away thinking that he is saying that the new hardware sucks and should be avoided?
Count me in the clueless group, because I don't know any other way to read a comment like this:

Gabe Newell said:
"The amount of time it takes to get a good multicore engine running, the Xbox 360 might not even be on the market any longer. That should scare the crap out of everybody."
Maybe not 'sucks' as much as 'way too far ahead of its time', but certainly to be avoided given that he's claiming developers won't get anywhere near tapping the full potential of these platforms, treating customers like "morons" and "idiots" in so doing.

Which I guess seems weird to me, given that getting the likes of Gears of War or Heavenly Sword out of a box that doesn't cost much more than $300, and that I only have to buy once every 4-6 yrs, seems like a better value than having to stick with the PC upgrade rat race every 6 months to a year just to play the latest PC games with decent performance.
 
But why are game buyers idiots? Because they truly are just dumb people, or because they are constantly misinformed and misled? Or because companies are trying to lure in a dumber, more docile audience that has no experience with videogames? An audience easier to win over with some wowing special effects than some new ideas? Then if they want to learn more about videogames, they go read about it somewhere and either find hyped-up jackasses on the internet or corporate cocksuckers in a magazine. Fuck the status quo. The industry doesn't have to become another entertainment disaster.

*takes chill pill*
 
Zaxxon said:
I heard somewhere that he refuses to port Half Life and Half Life 2 to OS X and won't even let other companies port Half Life and Half Life 2 to OS X because he's an ex-Microsoft employee that hates Apple. Is that true? Seems like a pretty stupid petty thing.
Nah man, Half-Life was ported. The story goes that Sierra (or Vivendi, who was the publisher at the time?) pulled the plug just as the porting team was finishing up the golden master. Their excuse was that they wouldn't be able to give the userbase the same quality of support as the PC userbase, so instead of doing it half-heartedly they weren't going to do it at all.
 
Tellaerin said:
'Dead on and smarter than you'? Newell's an idiot as far as I'm concerned, fat or thin. I don't think anyone who's 'smarter than me' would ever make asinine remarks like, 'every game developer should be terrified of the next generation of processors'. People whose minds I respect aren't afraid of embracing new and better architectures, even when it means abandoning the techniques they know and starting fresh. Gabe sounds like someone who'd rather see hardware manufacturers stick with the same legacy architecture indefinitely and just keep layering on features, no matter how inefficient that might be, because to do things differently means developers have to make a concerted effort to learn, and this is a prospect they should be terrified by. (In the article, he cites the supposed problems involved in coding for these multicore systems, and suggests that programmers may not manage to master the hardware over the course of the coming generation--when should companies make the transition to multicore processors then, Gabe? Never? Clinging to the current design paradigm for as long as possible isn't going to make the learning curve any shallower or the transition any easier, so unless he's suggesting that hardware stay in its current groove indefinitely, there's no reason not to embrace change now.)

Sorry if that seems harsh to you, but I feel that people who cling to legacy architecture because they don't want to invest the time and effort necessary to learn something superior are dead weight slowing the wheels of progress. :p

HAHAHA HES FAT TOO!
 
sho nuff, since you are in the industry, do you think it's going to take you on the order of 3-5 years to put together an effective engine built for modest parallelism? i'm serious about this. do you have that little faith in your own skill and the skill of your co-workers?
 
kaching said:
Maybe not 'sucks' as much as 'way too far ahead of its time', but certainly to be avoided given that he's claiming developers won't get anywhere near tapping the full potential of these platforms, treating customers like "morons" and "idiots" in so doing.

Which I guess seems weird to me, given that getting the likes of Gears of War or Heavenly Sword out of a box that doesn't cost much more than $300, and that I only have to buy once every 4-6 yrs, seems like a better value than having to stick with the PC upgrade rat race every 6 months to a year just to play the latest PC games with decent performance.
Where are you getting all of that from??? He says nothing about the new hardware being premature, or ill-advised in any way. He's talking to programmers, and sounding a wake-up call. Multi-thread programming IS hella difficult for someone who has never done it before. He's saying that great programmers who put in the work to learn the new architectures will have tons of potential to play with, and marginal programmers become a liability.

It's not a negative message, it's a "get your shit together cause you're going to have to step it up" message.
 
fart said:
sho nuff, since you are in the industry, do you think it's going to take you on the order of 3-5 years to put together an effective engine built for modest parallelism? i'm serious about this. do you have that little faith in your own skill and the skill of your co-workers?

Wait, so if I agree with Gabe I'm saying that I and my coworkers suck? Errrm.

In that case, I recall my argument! Gabe is stupid! Yeah!!!
 
What's with all the hate for Gabe? The guy makes some amazing games. If he speaks in hyperbole then he's trying to get the message out to consumers and mainly hardware manufacturers that we need better tools if we really want to make these new machines sing.
 
Sho Nuff said:
Wait, so if I agree with Gabe I'm saying that I and my coworkers suck? Errrm.

In that case, I recall my argument! Gabe is stupid! Yeah!!!
YES! I WIN!

CHAMPION!!!!
 
If expectations don't go up, then what makes him think anyone will want to shell out money for something new? The motivation to upgrade is that you are going to get $300 worth of improvement over the current gen. If that means slapping in a multicore processor b/c single-core is a dead-end, then so be it. And if that means working longer hours and spending more money to produce results, then so be it. The beast needs human souls to survive. Fucking shut up and get over it. I have high expectations. If Sony and MS don't deliver (b/c Nintendo's already bailed), then they don't deserve my money....period. Let's not make excuses, let's produce results. So far, so good. Valve? It's your turn. PEACE.
 
Rhindle said:
He says nothing about the new hardware being premature, or ill-advised in any way.
Come on, how else can you read this?

Gabe said:
"Most of the problems of getting these systems running on these multicore processors are not solved. They are doctoral theses, not known implementation problems. So it's not even clear that over the lifespan of these next generation systems that they will be solved problems. The amount of time it takes to get a good multicore engine running, the Xbox 360 might not even be on the market any longer. That should scare the crap out of everybody."

Rhindle said:
He's saying that great programmers who put in the work to learn the new architectures will have tons of potential to play with, and marginal programmers become a liability.
Yes he is. But pay attention to WHY he's saying the marginal programmers will become a liability - because the learning curve is too steep for them to overcome, UNLIKE with previous platform architectures.

Gabe said:
"If writing in-order code [in terms of difficulty] is a one and writing out-of-order code is a four, then writing multicore code is a 10," cautions Newell. "That's going to have consequences for a lot of people in our industry. People who were marginally productive before, will now be people that you can't afford to have write engine or game code. They can't get a big enough picture of what's going on in the box so they'll be a net negative on the project."

It's not a negative message, it's a "get your shit together cause you're going to have to step it up" message.
Problem is, he never gets around to saying how people should get their shit together; he never says anything other than how bad it's going to be.
 
fart said:
sho nuff, since you are in the industry, do you think it's going to take you on the order of 3-5 years to put together an effective engine built for modest parallelism? i'm serious about this. do you have that little faith in your own skill and the skill of your co-workers?

Sho Nuff said:
Wait, so if I agree with Gabe I'm saying that I and my coworkers suck? Errrm.
In that case, I recall my argument! Gabe is stupid! Yeah!!!

Well Gabe seems to be talking specifically about the game industry. I can't really speak for US devs because I've only seen how a couple of them do their work internally, but if Japanese devs (who I have a bit of experience with) are any indication, Gabe is pretty much spot on.

I know my response below is long but, I dunno, Sho Nuff read it and give me your impression. Agree? Disagree?

1. Software development in all but the best run software houses is shaky at best and in the game industry is downright terrible. From small things like deciding scope and schedules up front, to allowing feature creep to dominate a project because you see game X doing stuff more whiz-bang than you are, game software dev is pretty darn messy.
2. Game developers (especially ones in Japan) are geeks at heart and wear the suffering of building their tech from scratch as a badge of honor. Developers will make a custom engine requiring new tools, conversion utils, etc. and spend a ton of time and $$ doing it when they could just as easily pick and choose applicable stuff from elsewhere, tweak it, and use it. Or at the very least use something from another project. However, most devs don't carry stuff (even if it's something portable like a file exporter/etc) from project to project, and they also don't share such work with another team (within the same company) doing a parallel effort making something very similar.
3. A lot of guys are stuck doing stuff the old way. You'd be surprised how many guys want nothing more than a tight game loop and to control every pixel that gets thrown upon the screen themselves instead of using modern rendering techniques. Those guys (and there are a lot of them -- plenty of PS2, GC, and Xbox games make this blatantly obvious) are not going to make a smooth transition to next gen.
4. Writing parallel code is not difficult. In CS there are tons of smart guys who have figured out what tasks lend themselves to parallelism. The problem is that there are just as many tasks in game development that are explicitly serial as there are parallel tasks. Devs are going to have to not only separate out the parallel tasks, but also manage their data and bandwidth in such a way that their processing pipelines are kept full, minimizing stalls and staying near practical peak performance. A rough sketch of what I mean by that separation is below.
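
To make point 4 a bit more concrete, here's a toy sketch of the separation (my own example, using modern C++ std::thread for brevity -- obviously not what the actual console SDKs hand you, and the function names are made up): the order-dependent gameplay pass stays serial, the independent per-island physics work fans out across worker threads, and everything meets at a sync point before the results get consumed.

#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct World {
    std::vector<float> island_energy{0.f, 0.f, 0.f, 0.f}; // one slot per physics island
    int frame = 0;
};

void update_gameplay(World& w)              { ++w.frame; }                   // order-dependent: keep serial
void update_physics_island(World& w, int i) { w.island_energy[i] += 1.0f; }  // independent per island

int main() {
    World world;
    update_gameplay(world);                                  // the explicitly serial pass

    std::vector<std::thread> workers;                        // the parallel pass fans out
    for (int i = 0; i < 4; ++i)
        workers.emplace_back(update_physics_island, std::ref(world), i);
    for (auto& t : workers)
        t.join();                                            // sync point before anything consumes the results

    std::printf("frame=%d energy0=%.1f\n", world.frame, world.island_energy[0]);
    return 0;
}

Each worker only touches its own slot, which is the whole trick -- the hard part in a real engine is carving up the data so that stays true.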
 
I know you guys like making fun of Gabe, but he's really on-the-ball with these comments. Multiprocessor code is not the same thing as multithreaded code, and it requires an entirely different mindset for how you approach coding. It's like the difference between sketching something on paper and actually modelling it in three physical dimensions. The smaller developers, especially those who refuse to embrace middleware solutions, are going to be fucked.

2) is a serious problem in Japan. In the Gundam demo that was shown at the PS event yesterday, the guy was bragging about how they had made it from scratch without middleware...meanwhile shooting the non-destructible environments and using only a single one of the PS3's processors. Education can help, and Sony is putting themselves on the line behind Havok and Ageia, but using middleware - even to do obvious, butt-simple things like file I/O or music playback - is seen as a sign of "weakness."

EDIT: The more I read ddkawaii's post, the more it's one of the best summaries of the problems facing developers in the next-gen I've seen on these boards. READ IT, learn it, love it.
 
JackFrost2012 said:
I know you guys like making fun of Gabe, but he's really on-the-ball with these comments. Multiprocessor code is not the same thing as multithreaded code, and it requires an entirely different mindset for how you approach coding. It's like the difference between sketching something on paper and actually modelling it in three physical dimensions. The smaller developers, especially those who refuse to embrace middleware solutions, are going to be fucked.

I don't understand this statement but I will say I don't know of any platform programmed with a high-level language and with a pseudo modern/modern "OS" scheduler that exposes parallel execution to developers in a way fundamentally different than the concept that is commonly referred to as "threads" -- and I don't expect this to change anytime soon.
 
ddkawaii said:
I don't understand this statement but I will say I don't know of any platform programmed with a high-level language and a pseudo modern/modern scheduler that exposes parallel execution to developers in a way fundamentally different than the concept that is commonly referred to as "threads" -- and I don't expect this to change anytime soon.

I just meant that multiprocessor is juggling a lot more balls in the air at once, as it were.
 
kawaii, it may help to know that vestal's background is in fursuits and square.

that said, there are quite a few parallel models that better fit larger-scale parallel systems (particularly distributed, non-shared memory ones) that are not thread-based. however, the thread abstraction is so perfect for shared memory systems that i believe you're right; i don't know of any shared memory system whose basic model is not threading.
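
to make the shared-memory point concrete, here's a trivial sketch (toy code on my part, and modern c++ std::thread at that, not anything era-accurate): two workers just see the same object in the same address space, so all the "protocol" you need between them is a lock. no messages, no copies.

#include <cstdio>
#include <functional>
#include <mutex>
#include <thread>

struct Counter {
    std::mutex m;
    long value = 0;
};

void bump(Counter& c, int times) {
    for (int i = 0; i < times; ++i) {
        std::lock_guard<std::mutex> lock(c.m); // shared address space: a lock is the whole coordination story
        ++c.value;
    }
}

int main() {
    Counter c;
    std::thread a(bump, std::ref(c), 100000);
    std::thread b(bump, std::ref(c), 100000);
    a.join();
    b.join();
    std::printf("%ld\n", c.value); // 200000, every time
    return 0;
}

on a distributed, non-shared-memory machine none of that works, which is why those guys end up with message-passing models instead.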
 
fart said:
that said, there are quite a few parallel models that better fit larger-scale parallel systems (particularly distributed, non-shared memory ones) that are not thread-based. however, the thread abstraction is so perfect for shared memory systems that i believe you're right; i don't know of any shared memory system whose basic model is not threading.

While on the topic of the upcoming generation and the problems facing developers in the immediate future, we should probably concentrate on shared memory, which is what I was talking about earlier.

However, I think that on consoles and PC we'll see a form of in-one-box distributed computing, as well as possibly multiple boxes. So in future discussion I'll be more specific.
 
JackFrost2012 said:
I know you guys like making fun of Gabe...
I have no predisposition either way, I just don't see how his bellyaching (ok, I know, low blow) helps the situation any more than what you and ddkawaii describe is happening in Japan.
 
ddkawaii said:
While on the topic of the upcoming generation and the problems facing developers in the immediate future, we should probably concentrate on shared memory, which is what I was talking about earlier.

However, I think that on consoles and PC we'll see a form of in-one-box distributed computing, as well as possibly multiple boxes. So in future discussion I'll be more specific.
it was a dumb nitpicky point. i knew what you meant.

i think it is extremely cool that we could see distributed systems formed with game consoles, by the way.
 
fart said:
are you sure you don't mean threaded processes?

I know you think you're clever, but "thread" is both a verb and a noun. I know about threaded processes; moreover, I know about threading threaded processes. I also know that what I know about thread(ed|ing) processes is entirely inadequate to the task of next-generation multicore development. Which is why I don't work as a programmer.
 
i'm not familiar with the verb form of thread being used where sequential scheduling or interleaving would be more appropriate. ironically, using your vernacular, what multiple targets gives you is the ability to run multiple threads without threading. har har har
 
*edit* just re-read my posts...excuse the grammar/spelling i'm home sick today and feel horrible....not even good enough to proof my posts before hitting the submit button *edit*

kaching said:
I have no predisposition either way, I just don't see how his bellyaching (ok, I know, low blow) helps the situation any more than what you and ddkawaii describe is happening in Japan.

I don't know anything about dev in Europe so I can only talk about Japan and the US. That said I really don't know what's going on in the US these days other than what I can see from afar. Which is:
1. Some companies like EA obviously have a somewhat streamlined software development process because they can crank out games like clockwork every year and they can get their licensed titles to market in a reasonable time.
2. US devs are a bit less hesitant to use middleware/other people's engines/tools than dev in Japan.
3. US games often suffer the exact same problem(s) Japanese games suffer from when they're done by a team with a not-so-finely-honed software process that obviously didn't use mature outside libraries/tools/code to get a leg up -- technical problems, lacking content, short length, etc.

What this tells me is that similar problems exist everywhere (or at least in the US and Japan). What Gabe said is on the ball but at the end of the day pointless, because what I think we'll see initially next gen is:
1. Huge variance in game technical quality
2. Graphics and sound as a whole should be improved; however, I think there will be a mandatory tradeoff. As graphics and sound quality increases, I think we'll see a reduction in the sheer amount of content in the game. Whether that be model variety, texture variety, levels, whatever -- initially it'll be down.
3. Sure-fire big franchises will remain AAA grade (looks, sound, gfx will be appropriate for next gen). Anything in the tier below those types of titles will probably disappoint most hardcore next-gen gamers.

Then I think after a few months, maybe 6-12 months into each new platform's life, we'll start seeing general technical quality go up as well as content levels going back to what we're accustomed to. At the end of that initial window everyone will be saying "wow, remember when everyone said this gen was gonna be so rough? That was all a bunch of BS."

Of course this sort of thing happens every generation, but this time the problems going into the generation are significantly larger (much harder than the move to PS2, in fact). The good part is, like always, smart devs will get a grip on things, and the jump from disappointing levels to appropriate next-gen quality will be so big even the most demanding of the hardcore should end up being satisfied.
 
fart said:
i'm not familiar with the verb form of thread being used where sequential scheduling or interleaving would be more appropriate. ironically, using your vernacular, what multiple targets gives you is the ability to run multiple threads without threading. har har har

You're obviously very clever; I hope you find a way to apply those smarts towards next-generation coding solutions and not just being an asshole on GAF.
 
whiteman says you should go teach some multi-threaded english. cast has mocked our discussion by characterizing it as a "geek slapfight."

yep, this has been a productive night.

ps, we love you vestal <3 :)
 
ddkawaii said:
I know my response below is long but, I dunno, Sho Nuff read it and give me your impression. Agree? Disagree?

Wow, did you work with my old company? Cos that describes 'em to a tee. No reuse of any code, even on sequels, zero sharing, the idea "we can make better engines than middleware," losing the original code...ahh good times.
 
Zaxxon said:
I heard somewhere that he refuses to port Half Life and Half Life 2 to OS X and won't even let other companies port Half Life and Half Life 2 to OS X because he's an ex-Microsoft employee that hates Apple. Is that true? Seems like a pretty stupid petty thing.

This is not entirely true.

I've heard rumors, and here's the gist of what I heard:

A port of Half Life for MacOS was nearly completed. It was greenlit because an Apple evangelist promised Valve that a ridiculous number of Mac users would preorder.

When the actual preorder numbers came in and fell far short of the number promised, the port was cancelled, even though it was nearly done.

Blame whoever you want for this - Apple, for being deceptive (or just plain optimistic?) about projected numbers, the Mac community, for not stepping up and preordering, or Valve, for cancelling a nearly completed game - I'm just telling ya what I heard. :)
 
iirc (and this was a long long time ago, so big if), what i heard was that someone in the sierra/valve amalgam didn't want to deal with support for a port with a small audience that wasn't 100%, and the porters (megg et al) were having a lot of trouble with network issues (i have no idea what or why they couldn't be resolved). (was this the official reason?)

also, if japanese software houses hate code reuse, shouldn't that help them by gabe's standards?
 
What's with all the hate for Gabe? The guy makes some amazing games. If he speaks in hyperbole then he's trying to get the message out to consumers and mainly hardware manufacturers that we need better tools if we really want to make these new machines sing.
Speaking in hyperbole isn't quite the phrase for it. He's basically playing Chicken Little, running around screaming that the sky is falling for all developers. It's retarded. Sure, Newell has made some good games, but when he says consumers are being duped, then goes on to spew all this crap about how terrible the new hardware is to code on, you have to wonder who shit in his Froot Loops that morning. That, or he's hoping to instill fear in most fellow developers with the hope that they'll all run to buy middleware, something Valve will likely make available once they get their engine up and running on X360 and PS3. Kinda convenient how that works out.

In the world of hardware v. software design you can't have your cake and eat it too. You can't use your old legacy code and still get to use cutting edge hardware.

In an ideal future though, Sony, MS, and Nintendo will assist in middleware heavily, making sure that there are several different choices of quality products that devs can use. Competition will keep prices low and game developers can stop wasting the biggest chunk of dev time just getting a stable engine running. Honestly, every major 3rd party should begin looking into making their own highly versatile engines for their future games; it'd save them assloads of money and time. Naughty Dog and Insomniac did this and were able to turn out some of the best looking PS2 games on yearly timetables. Imagine how much better the engine could be if every studio Sony has, not just two, were using, and tweaking, that one engine.
 
All I know is that raw numbers, system specs, plus benchmarks are often thrown about by the general public, or those who don't know what the numbers mean, to gauge the validity of a system. Hell, all those folks who back in the day went "wow, the Saturn has two RISC based processors!" really didn't know what they were talking about, and I'm starting to see the same shit here... not at the GAF but from the mouths of uninformed game store clerks, I mean.

Not sure if this contributes to the argument, but just saying. Hence why I personally don't say shit about the specs, cuz it's all Greek to me.
 
It's actually a good point, FortNinety, because Newell's comments in this article put all the onus on developers for misleading customers but don't give any consideration to how much customers often mislead themselves, regardless of what devs are doing.
 
fart said:
well, i had an inkling before, but now i know for sure that gabe newell is a fucking idiot.

Gabe has been an idiot for plenty of other reasons, but this isn't one of them.

He's right, existing game code is no good on these new consoles because it's a completely different paradigm. We actually have multiple CPUs so the only way to get the most out of them is writing multi-threaded games which isn't common practice right now.
 
A PC developer dismayed at having to learn to code something other than x86?? Color me really surprised here . . . oh please. IMO, they should be worrying about trying to learn to make something other than a FPS.
 
Dr_Cogent said:
He's right, existing game code is no good on these new consoles because it's a completely different paradigm. We actually have multiple CPUs so the only way to get the most out of them is writing multi-threaded games which isn't common practice right now.
Does absolutely everything have to be thrown away though? Can they not keep the core of routines for some things (maybe AI, environmental effects, etc.) and wrap that in a multi-threaded code base?
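
Something like this is what I have in mind (just a toy sketch on my part, made-up function names, modern C++ threads rather than anything console-specific): leave the old serial routine's internals untouched and run it as one coarse task next to the other work, instead of rewriting it.

#include <cstdio>
#include <functional>
#include <thread>

struct GameState { int ai_ticks = 0; int audio_blocks = 0; };

void legacy_ai_update(GameState& s) { ++s.ai_ticks; }     // old serial code, kept as-is internally
void mix_audio(GameState& s)        { ++s.audio_blocks; } // independent work for another core

int main() {
    GameState state;
    for (int frame = 0; frame < 60; ++frame) {
        std::thread ai(legacy_ai_update, std::ref(state)); // wrap the legacy routine as one coarse task
        mix_audio(state);                                  // runs concurrently on this thread
        ai.join();                                         // frame barrier
    }
    std::printf("ai=%d audio=%d\n", state.ai_ticks, state.audio_blocks);
    return 0;
}

Whether that buys you much obviously depends on the old routine not sharing data with whatever runs next to it.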
 
http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2377&p=1

Nice interview with Tim Sweeney regarding multicore game-dev issues.

He isn't as melodramatic as Newell, but the underlying sentiment is similar. Of course Sweeney's engine has multi-thread support, whereas I'm pretty certain that Valve's Source does not :lol

Some quotes:
AnandTech: Programming multiple threads can be complex. Wasn't it very hard to deal with the typical problems of multithreaded programming such as deadlocks, racing and synchronization?

Tim Sweeney: Yes! These are hard problems, certainly not the kind of problems every game industry programmer is going to want to tackle. This is also why it's especially important to focus multithreading efforts on the self-contained and performance-critical subsystems in an engine that offer the most potential performance gain. You definitely don't want to execute your 150,000 lines of object-oriented gameplay logic across multiple threads - the combinatorial complexity of all of the interactions is beyond what a team can economically manage. But if you're looking at handing off physics calculations or animation updates to threads, that becomes a more tractable problem.

We also see middleware as one of the major cost-saving directions for the industry as software complexity increases. It's certainly not economical for hundreds of teams to write their own multithreaded game engines and tool sets. But if a handful of companies write the core engines and tools, and hundreds of developers can reuse that work, then developers can focus more of their time and money on content and design, the areas that really set games apart.

You can expect games to take advantage of multi-core pretty thoroughly in late 2006 as games and engines also targeting next-generation consoles start making their way onto the PC.

Writing multithreaded software is very hard; it's about as unnatural to support multithreading in C++ as it was to write object-oriented software in assembly language. The whole industry is starting to do it now, but it's pretty clear that a new programming model is needed if we're going to scale to ever more parallel architectures. I have been doing a lot of R&D along these lines, but it's going slowly.
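
To picture the pattern Sweeney is describing, here's a stripped-down sketch (my own toy code, not anything from Epic; modern C++ std::thread and made-up names): the physics subsystem is self-contained, owns its own arrays and runs on a worker thread, while the big pile of gameplay logic stays single-threaded and only exchanges data with it at the frame boundary.

#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct PhysicsState { std::vector<float> pos, vel; };

void step_physics(PhysicsState& p, float dt) {             // self-contained and performance-critical
    for (size_t i = 0; i < p.pos.size(); ++i)
        p.pos[i] += p.vel[i] * dt;
}

void run_gameplay_logic(std::vector<int>& scores) {         // stand-in for the "150,000 lines": stays serial
    for (int& s : scores)
        s += 1;
}

int main() {
    PhysicsState physics{{0.f, 0.f, 0.f}, {1.f, 2.f, 3.f}};
    std::vector<int> scores(8, 0);

    for (int frame = 0; frame < 10; ++frame) {
        std::thread phys(step_physics, std::ref(physics), 1.0f / 60.0f); // hand the subsystem off
        run_gameplay_logic(scores);                                      // meanwhile, serial gameplay code
        phys.join();                                                     // results exchanged only here
    }
    std::printf("pos0=%.3f score0=%d\n", physics.pos[0], scores[0]);
    return 0;
}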
 
dorio said:
What's with all the hate for Gabe? The guy makes some amazing games. If he speaks in hyperbole then he's trying to get the message out to consumers and mainly hardware manufacturers that we need better tools if we really want to make these new machines sing.


It takes Valve 5 to 6 years to get a game out anyway. From Gabe's vantage point I see why he made the comment. By the time they get their next game out, the next next gen of consoles will be here. Just because Valve is slow and inefficient as a developer doesn't mean the entire industry has to follow at the same snail's pace.

Maybe MS and Sony should hire Gabe as a consultant. They probably don't understand that they need to get the best possible tools out to developers ASAP......

As a former MS man, I think Gabe's comments are extremely out of place. He should know his former employer is a company which specializes in getting software tools made. Well, whatever.
 
My impression is that he wasn't just decrying the use of these processors, but that he was also warning developers that they shouldn't expect to be writing engine code, because they may not be up to it. And if they are, it won't be easy.

And frankly, I would agree. Most programmers have very limited experience handling issues related to parallelism, and just because someone says "You want Ph.D. level issues - look at something like Earth Simulator!", that doesn't change the fact that these processors are going to introduce all kinds of crazy issues with starvation, etc. that will drive people insane.

That's not because programmers are dumb or lazy, or because publishers and developers are too cheap to put the necessary time into it. It's because this shit is HARD and it's only getting more difficult, meaning that fewer and fewer people are going to get their hands dirty in engine code.
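
For anyone who hasn't hit these issues before, here's about the smallest illustration I can make of the kind of thing Sweeney's "deadlocks, racing and synchronization" line is about (toy code, modern C++ threads): two threads bump the same counter with no synchronization, increments get lost, and the answer changes from run to run.

#include <cstdio>
#include <thread>

long hits = 0; // shared and unsynchronized on purpose

void worker() {
    for (int i = 0; i < 100000; ++i)
        ++hits; // read-modify-write race: updates from the other thread get clobbered
}

int main() {
    std::thread a(worker);
    std::thread b(worker);
    a.join();
    b.join();
    std::printf("%ld\n", hits); // almost never 200000 (and formally undefined behavior to boot)
    return 0;
}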
 
He's right, existing game code is no good on these new consoles because it's a completely different paradigm. We actually have multiple CPUs so the only way to get the most out of them is writing multi-threaded games which isn't common practice right now.
But there's a lot of multi-threaded code available, even graphics engines. Newell's happens to just not be one of them.

And frankly, I would agree. Most programmers have very limited experience handling issues related to parallelism, and just because someone says "You want Ph.D. level issues - look at something like Earth Simulator!", that doesn't change the fact that these processors are going to introduce all kinds of crazy issues with starvation, etc. that will drive people insane.
I don't know, back when I studied computer science I was taking multi-processor arch. programming classes as a 2nd year. Seems like if you've gotten your bachelor's you probably should have some legitimate experience with multi-threaded archs. That or your school's CS program was cheese.

Also, single-thread systems aren't the only way computers have ever been built. Multi-threaded programming has been around for a real long time, just not in the more entertainment-focused computing industries. So it'll be a bit of a rough transition and a bunch of programmers might not be able to hack it. Too bad for them, that's the price you pay for being mediocre at your job in a competitive field.

Also, geophysical simulation programs are somewhat relevant in that they handle excessively large amounts of data and so use very efficient coding for memory management. There are better examples of those coding methods in real game related code available as well, but then that goes against Newell's belief that multi-threaded programming is only being done on PhD level theoretical work, which is total bunk.

Newell can say whatever he wants but it just seems to me like he's pissy that he's spent half a decade making the Source engine, never considering where the industry and computing as a whole was headed, and now finds himself mildly screwed out of a lot of money for his efforts (or lack thereof). How any programmer didn't see this coming when the leading platform this generation itself was designed for multi-threaded code is beyond me, but thems the breaks.
 