Posted on 03/20/2013 at 05:16 PM | Filed Under Feature
This is a crutch the industry needs to do away with. While I realize the complexity of a game's code has grown exponentially since the industry's start, developers still need to make sure that a game runs with a minimum of stability issues and that the main campaign can be finished before it launches. Every time I see a game released that can't be finished out of the box (like Diablo II or Star Trek Armada) or that requires elaborate workarounds to avoid glitches (Ar tonelico II, War in the North), I have to wonder whether it was even playtested.
I used to play quite a few PC games, many of them rock-solid releases (Star Trek: Bridge Commander, Age of Empires) that worked straight out of the box. But over time the patch mentality took hold (thanks to the spread of the Internet) and let many companies call a mulligan on their botched releases. No matter how easy patching makes it to fix these games after the fact, I never got used to it. It's a deep-rooted, fundamental problem in the development process, and the companies that suffer from it (Bethesda, Blizzard, Travellers' Tales) need to change their practices. I've seen too many interesting games come and go simply because I'd lost faith in developers to release working products.

Of course, this practice is only going to get worse. Many players already assume a game won't work right at launch, which lets companies make money on half-baked products. That's a terrible way to treat consumers and a terrible way to do business.