Provocative title, I know. I’m in rant mode, and there’s a lot to say, so let’s dive right in with the requisite history.
In 1983, the American home video game market experienced a massive crash, which some at the time claimed would be the death of the video game industry. The crash occurred primarily as a result of an oversaturated market inundated with fungible, high-budget games with few defining characteristics and short-lived entertainment value. Another important cause of the implosion: too many consoles, none of which had a significant leg up on any of its competitors. Just before the crash, one could play almost all of the same games on all of the following consoles: Atari 2600, Atari 5200, Bally Astrocade, ColecoVision, Intellivision, Fairchild Channel F, and Magnavox Odyssey², as well as the following home computers: Apple II, TI-99/4A, Atari 400/800, Commodore VIC-20, and Commodore 64. What a mouthful.
The interim period from 1984 to around 2002 saw a Japanese takeover of the gaming industry, with Nintendo and Sega rising from the ashes to lead innovation for nearly 20 years. Since 2003, however, the game market has yet again become increasingly American-dominated, both in terms of consoles and developers. Games have once again become fungible in nature, in terms of their content and the consoles one can use to play them. Call of Duty 4: Modern Warfare, for instance, exists on all of the following: Nintendo Wii, Nintendo DS, Microsoft Xbox 360, Sony PlayStation 3, Microsoft Windows PCs, and Apple Macintosh PCs (yes, an Apple computer is still a P[ersonal] C[omputer]). Moreover, the Call of Duty brand of military-themed FPS (First-Person Shooter) is becoming the only kind of game you can readily find across this wide spread of consoles. Games released this year alone that follow the same general concept and gameplay include: Deus Ex: Human Revolution, Dead Island, Hard Reset, RAGE, Gears of War 3, Serious Sam 3, Battlefield 3, Call of Duty: Modern Warfare 3, and Resistance 3. What's the difference between all of these titles? Not much: a couple of different twists on the same brown and grey wasteland with plenty of ruined edifices for cover, and a couple of different guns with varying degrees of realism. What were those factors that caused the 1983 crash again?
I’m not necessarily suggesting that the American market is headed directly for a redo of 1983; rather, I’m suggesting that it probably should be, for the benefit of the industry and its consumers. When Nintendo, and later Sega, stepped in to fill the void left by the near-death of Atari et alia in 1984, gaming entered a period of shining innovation that lasted a little less than 20 years. Directly following 1983, nobody quite knew what would sell in a gaming market tired of so much sameness and confusion. Thus, a period of trial and error ensued, with (primarily) Nintendo pushing out massive numbers of titles essentially as marketing experiments, to see what sold and what flopped.
What flopped usually flopped hard, as evidenced by Castlevania II on the NES and a whole series of failed games based on movie franchises like Superman and Batman. The ideas that worked, however, went on to influence gaming all the way up to the modern era, as the effects of Super Mario Bros., The Legend of Zelda, Metroid, Final Fantasy, Dragon Quest, Castlevania, Ninja Gaiden, and Mario’s erstwhile rival Sonic the Hedgehog can still be felt today. Thus, a mindset of creative expression and experimentation pervaded the gaming industry for years, giving gamers some of the most innovative and entertaining variety since the very conception of video gaming. That is, until Microsoft introduced a crude Pentium III PC with a low-end graphics card and slow processor, bundled it with a strikingly unoriginal FPS called Halo: Combat Evolved, and called it the Xbox.
With the recent double success of Microsoft’s Xbox 360 and Activision’s Call of Duty franchise, the focus of the industry seems to have shifted, out of necessity, from innovation and experimentation to simply keeping up with Microsoft by producing more of the same content that has already proven so incredibly profitable in the short term. A not-totally-unforeseen consequence of this business strategy is that Japanese game franchises attempting to continue down the path of experimentation and innovation are becoming too costly to localize and import to the American market. Thus, American gamers will likely never get to see such titles as Ace Attorney Investigations 2, Valkyria Chronicles III, Mega Man Legends 3, and other installments in series that do have clearly established fanbases, but that just don’t make enough money to compete with the more-than-$1.33-billion bastion that is Call of Duty: Black Ops.
Herein lies the argument for a second video game crash: when an industry saturated with first-person shooters finally does buckle, it is possible that the void left in the market will once again push developers to innovate and come up with new, non-FPS games with mass-market appeal. As with all prophecies, it’s hard to say whether this one will come true. However, I’ll say this much: the current gaming industry needs a facelift. It needs one brave developer with the time, resources, and creativity to bridge the gap between nostalgic platformers and Call of Duty clones. Thomas Jefferson once said, “The tree of liberty must be refreshed from time to time with the blood of patriots and tyrants,” and the same might well apply to a gaming industry watered with the blood of Call of Duty and Microsoft.