In the internet age, it’s easier than ever for games to be kept up to date: new content, balance fixes for the multiplayer component and, arguably the most important, bug fixes. But how much of an obligation do developers have once their game is released?
Many developers shove their games out the door and immediately move on to other projects. This is most likely due to publisher demands: there is no money in providing free fixes for a game that has already been bought. Other developers take the opposite approach, however, and continue to update their games years after release. Last year, for example, Blizzard released a patch for Diablo II, no less than nine years after the game's initial release.
Relic Entertainment is a developer that supports its games well. Warhammer 40,000: Dawn of War II was released over a year ago and still receives periodic balance updates, along with new maps and game types – all for free. Unfortunately, it's easy to argue that much of this content should have been in the game at launch: it shipped unbalanced and light on maps and game types. It remains an excellent game and, as a PC release, significantly cheaper than a new console game, but perhaps not a complete one.
While it’s easy to argue that developers should support their games, it’s also easy to see why they shouldn’t have to. A released game should be a final product, one that has undergone rigorous quality assurance testing, especially considering the amount of money that goes into many games and the resultant price tag for consumers.
Unfortunately, as games become increasingly complex, more and more bugs slip through the cracks. Fifteen years ago, if a console game shipped with a bug, that was it. Now it's so easy to patch a released game that, it could be argued, games are shipping with bugs that should have been squashed during testing, with the online update option providing a safety net that developers abuse in order to release games on schedule.
One very high-profile bug plagued the release of Assassin's Creed II. After completing chapter 11, the autosave animation plays, indicating it is safe to quit the game. Players who did decide to quit at that point were greeted with a game that wouldn't load properly upon their return, leaving them no choice but to start a new game. Ubisoft did eventually release a patch that fixed players' saves, for which it should be applauded. However, a bug that major should never have made it into a final release.
A recent offender in post-launch support, and one that affected me personally, is LEGO Harry Potter from Traveller's Tales. It is a charming game with a vicious game-breaking bug: a gold brick near the end doesn't generate properly. The issue is unfixable and requires you to start a new game in order to finish, hoping, of course, that the same thing doesn't happen again. My poor girlfriend spent well over 20 hours on LEGO Harry Potter, reached 99-percent completion, and realized her game could never be completed. It's now been nearly two months since the game's release, and there's still no patch to fix the problem. Considering that the game cost £40, is it too much to ask that it at least be finished?
More disconcerting is the solution offered by Traveller's Tales: just start again.
Do developers have an obligation to support their games? If the game needs fixing, like Assassin's Creed II or LEGO Harry Potter, then yes, they do. Consumers have paid good money for these products, and if they don't work and a software fix is simple to provide, there is no excuse for not providing one. If the game is bug-free, then developers shouldn't be obligated to offer dedicated support it doesn't need. Developers like Relic and Blizzard provide it anyway, though, and that's greatly appreciated and should be encouraged.
Have any of you ever encountered such game-breaking issues, or care to share your favorite developers in terms of post-launch support?