Developing video games has become an increasingly complex endeavor. Most video games created for the legacy consoles (the PlayStation and N64) could be built in less than one year (we estimate that the average development time was six to nine months) and cost less than $1 million to produce. In the PS2/Xbox generation, the average console game required 18 to 36 months to finish and cost an average of $4 million. Thus far in the cycle, current generation console games require between 24 and 36 months to develop, and average development costs have risen to $20 – 30 million, with some economies obtained if games are developed for the Xbox 360 and the PS3 simultaneously. We believe that the first efforts for most next generation games are likely to cost more than $30 million (with some games costing as much as $50 million), but we expect most publishers to advance quickly along the learning curve, and we anticipate that average costs will decline to $25 – 30 million over the next two years.
...
We expect the average for next generation games to settle at around $15 million apiece over time, compared with an average development cost of $2.5 – 4 million for 128-bit console games. Sequels built on a pre-existing “engine” (the underlying software that governs rendering, physics, and how the game’s characters behave and interact) generally cost less to develop than newly created games, and we expect development costs for next generation games to decline as engines are reused later in the cycle. As a rule of thumb, we estimate that game development budgets are set at roughly 20% of a video game’s expected revenues, though the actual percentage depends upon the ultimate success of the underlying game: a title that outsells expectations will show development costs well below 20% of realized revenues, while a commercial disappointment may exceed that threshold. Whether a game is developed internally or externally will frequently affect how the game’s costs are classified on a publisher’s income statement and how the revenues are shared.
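
To make the 20% rule of thumb concrete, a simple worked example (the unit volume and per-unit revenue below are illustrative assumptions, not figures from this report):

\[
\text{development budget} \approx 0.20 \times \text{expected lifetime revenue}
\]
\[
2.5 \text{ million units} \times \$60 \text{ per unit} = \$150 \text{ million expected revenue} \;\Rightarrow\; 0.20 \times \$150 \text{ million} = \$30 \text{ million budget}
\]

On these illustrative assumptions, the rule reproduces the upper end of the $20 – 30 million development cost range cited above.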