Modern Games vs. Old Games, a.k.a. How Games Have Regressed

His video style is derivative of Crowbcat's, and you can tell, because Crowbcat, in a smarter fashion, uses games from the same franchise to make his case crystal clear.

This YouTuber, on the other hand, compares different franchises across different development teams with different gameplay priorities.

It would be like asking why Street Fighter 5 didn't have assists when Marvel vs Capcom 3 had them earlier.

Because of this, it comes across as sloppier than Crowbcat's work.

Yeah, the video's use of lots of different games to prove a point is one of the most irritating things about it. It creates the impression of a mythical "old game" that contained all of the features described, when in reality those features were spread out across dozens of different games. It's not like every game back then had Guerrilla's destruction, Crysis's tree-branch physics, FEAR-level AI, perfect HL2 glass shattering, Splinter Cell's shootable lights... and so on. As we've said, developers picked what was most important for their design.

The other thing is that it uses a lot of examples that were exceptional even back then and presents them as some norm. Not every game was a Deus Ex, MGS2 or Crysis - those games each set a new benchmark in one way or another for their time. It feels disingenuous to compare games that were a labor of love to the yearly COD instalment.

There's also a triviality to some of these details. Will we be making similar videos in 2040 complaining that gaming isn't what it used to be, because look at how great Yakuza's bread looked in a cutscene in 2020?

The video is funny, but you can always cherry-pick. And even if it isn't cherry-picking, it's worth stopping to think about what the real-world reasons for it are:
1. Technology development is not always linear. When game X launches and does something awesome, it doesn't mean that technology (probably custom-built in-house) becomes available to every other developer in the world, or that it's easy to reverse-engineer and implement in your own game (it's not enough to buy the game and play it; you would need access to the whole project in the engine). Gamedev is not like the science community, where one discovery is shared with the whole world.
2. Resources. First of all, at a high level, time is money. For many of the features in the video, the devs probably lacked the time (= money) to implement them. Red Dead Redemption 2 looks so good not only because it's somehow more technologically advanced, but because the devs had the time and money to record, polish and implement all those context-sensitive animations.
3. Resource allocation. Different features matter for different kinds of gameplay. Advanced water physics can be very important in one game and not so much in another. You always need to judge whether the time (= money) needed to do the thing translates into a significant increase in the game's quality. If not, you get these bloated AAAA budgets for games too expensive to fail - and then they fail anyway because they got too big to break even.
4. Developers are people. People with unique skills and abilities. Hair in game A can be better simply because the dev responsible for it was more talented than their opposite number in game B. And knowledge preservation in game studios, in my experience, is really lacking, so when talented people are fired, their expertise goes out the door with them.
I mean no offense to anyone who works in the gaming industry, but after Perfect Dark and Everwild were cancelled I asked myself whether developers are as talented as they used to be. How hard is it to get a spy-themed, objective-based FPS (with, I think, some Deus Ex inspiration) up and running? It's not like Everwild's case, where they were trying to make some artsy, no-combat game and weren't even sure what the gameplay loop was.