Catering to the Least Common Denominator
Games are becoming more and more socially acceptable. Whereas before, only the geeks and nerds were the ones gaming, now almost anyone can be considered a gamer, from an eight-year-old girl playing Scribblenauts to a college frat boy fragging noobs in Call of Duty, from a middle-aged mother playing FarmVille on Facebook to a grandparent bowling on the Wii at a family gathering. It’s harder today to find someone who doesn’t play games on at least a semi-regular basis than it is to find someone who does.
As a result of the gaming industry’s growing popularity, developers and publishers are quickly realizing that the market they’re releasing to is expanding rapidly, so they have to step up their game in what content they release to the world. While some studios have begun working on titles and genres they wouldn’t have otherwise touched–such as THQ releasing children’s games, for instance–other companies are taking some of the hardcore crowd’s favorite titles and significantly changing them or adding unnecessary content in an effort to appeal to a much larger demographic and bring in the most profit possible. While this isn’t necessarily a bad thing, it’s not necessarily good either, and I’m nervous about where the industry could end up if it caters every game to the widest range of gamers possible.
The idea of adding multiplayer to franchises that never had it before is a habit gaining momentum, and one I can’t support in most instances. Too many games nowadays have bland and forgettable co-op or multiplayer modes tacked onto them in an effort to please more people, and, more often than not, these needless additions take away from the overall experience more than they add to it.
Take Far Cry 3 for example. The single-player experience in this game was pure bliss, something I’ll remember for a long time. While the story length and character development fell short, the game as a whole excelled at offering fans a fun and mesmerizing single-player experience. But play the co-op and you’re sure to be disappointed. The co-op narrative stood as an excuse to let players kill baddies together, the stealth was broken, and the entire mode felt needlessly tacked on. I love co-op options in games, but if they’re not done as well as the single-player experience, why add them? The time spent providing players with a mediocre co-op mode could have been spent making the single-player story longer or adding on more side missions. I would have preferred that to the bland addition Far Cry 3 provides.
The new Tomb Raider faces the same thing. While I have yet to play it or its multiplayer component, there are reports that it pretty much sucks. The multiplayer is even being handled by a different developer than the one making the story-based game, which is a recipe for disaster in itself. Tomb Raider doesn’t need a multiplayer mode in order to survive. No one even really wanted one. But there it is, in all its average glory.
BioShock 2, an older example, is guilty of this as well. Its uncreative attempt at multiplayer was no more than a cardboard-cutout deathmatch clone and warranted no more attention than it received, which wasn’t much. I feel that had BioShock 2 stayed away from multiplayer, it would have reached the level of greatness that the original had. Apparently the franchise has learned from its mistake, considering the upcoming BioShock Infinite will have no multiplayer component attached.
Why do developers do this? Well, for one, they want to cut down on used game sales so more money goes to them instead of GameStop, which makes sense. Game development is a business, after all. And the addition of multiplayer may attract a bigger chunk of people to buy their game than if it had been a strictly single-player experience. But while developers are trying to collect as much cash as possible, they’re sacrificing their integrity and taking away some of the joy that could have been a reality had their game stayed strictly single-player. This isn’t to say that no single-player franchise can successfully add multiplayer onto its games. Mass Effect 3 and Assassin’s Creed did it well enough. But if developers can’t make a multiplayer or co-op mode at least half as interesting as its single-player counterpart, they’d be better off staying away.
Changing the Game
Plenty of franchises are also guilty of changing their successful formula to a more “dumbed-down” or action-oriented one in order to please the least common denominator and maximize sales. A game that comes to mind is Splinter Cell: Conviction, a 2010 stealth game based on the long-running franchise dating back to 2002. The first Splinter Cell had players control a secret agent named Sam Fisher who was instructed to avoid combat rather than engage in it. Players succeeded by avoiding detection, hiding bodies (bodies they shouldn’t have been creating in the first place), and sneaking through entire missions. If you got into a firefight, you either failed the mission immediately or were very likely to die.
Conviction, however, turned the franchise into a much more fast-paced experience. While the stealth element is still in play, it’s not as crucial to the game. Players can even perform instant kills on multiple foes at once by tagging them and then pressing a button to shoot them all in quick succession automatically. This is a concept far removed from the original’s gameplay. Does that make Conviction a bad game? No. As a matter of fact, I loved it, and I can’t wait for the next installment, Blacklist, to release. However, I’m sure there were more than a handful of diehard fans upset with the series’ drastic change, and I can’t say I blame them. Splinter Cell had a fair share of fans that loved the series to death when it first started. Now it has a wider audience, but I’ll bet these newcomers don’t harbor the same unconditional love the original’s fans did. That’s because the game has been transformed to appeal to a much wider audience, sacrificing the desires of its most loyal fans in service to those who may never have played the series before.
Another game guilty of both adding an unnecessary multiplayer component and changing its DNA to suit the masses is Dead Space 3. When I saw the original trailer that featured co-op gameplay, I literally laughed aloud at the idea. Dead Space has always been about the feeling of isolation and abandonment as you slowly navigate the halls of a dying ship, warding off disturbing creatures wherever they may appear. Adding in co-op takes away the survival horror element and turns the game into a third-person action game. There’s no intensity, no fear, no dread of the unknown when you’ve got a buddy by your side. Regardless of how good the game is, by adding in co-op, Dead Space 3 became just another shooter. The original was praised for its terrifying atmosphere and sense of seclusion; that’s lost when another player drops in to help you out.
I’m not saying all games guilty of doing these things are in the wrong. As a matter of fact, as I’ve already mentioned, some of them benefit from changing their classic blueprint, such as Assassin’s Creed, Mass Effect, and even Splinter Cell. However, if it becomes the norm for developers to cram as much content into their games as possible, or to change their formula so much that they’re hardly recognizable, just to gain mass appeal, we’ve failed as an industry. Sometimes it’s a good thing that not everyone who calls themselves a gamer enjoys a specific title; if they did, you’d have something worse than Call of Duty, and who wants that?