Can A Video Game Be Too Short: A Cost Comparison to the Film Industry
A simple question with a not-so-simple answer. Recently, Crysis 3 ruffled feathers among consumers who considered its campaign too short, clocking in at around 5.5 hours even on the higher difficulties. The game does have a multiplayer component, which always muddies the question of playtime, but as multiplayer is bolted onto more and more titles, deservedly or not, the timeless constant is the single-player component. There are exceptions: franchises like Call of Duty and Battlefield, among many other first-person shooters, heavily emphasize multiplayer. There is also the odd case like Mass Effect 3, whose added multiplayer was highly praised to the surprise of many and still has a healthy player base a year later. But for all the other games that will likely be played through once and then dropped in favor of the next great thing, should a multiplayer component really factor into a game's playtime?
For the sake of this article I will take the stance that it should not. To explore whether a game can be too short, I will first explore the more basic question of how long a game should be. By comparing video games to the film industry, we can better assess, from both a consumer and a developer perspective, what a game's ideal runtime is.
While video games might not have the mass-market appeal of a summer blockbuster, they can go toe-to-toe with even the biggest movies in most respects, good or bad. The entertainment industry as a whole has seen budgets balloon at an alarming rate.
In the late 90s it was rare to see a movie budget breach $70 million, and in 1997 Final Fantasy VII became the most expensive video game ever made with a $45 million budget ($64 million when adjusted for inflation). Fast-forward a decade and, thanks to rapid technological development in both industries, it is nearly impossible to find a summer blockbuster with a budget under $200 million. Meanwhile, Grand Theft Auto IV set a new record, costing over $100 million to develop.
In the last five years budgets have mellowed out somewhat. There are still video games that push the envelope, BioWare Austin's Star Wars: The Old Republic notably cost $200 million to develop, but these games are few and far between and typically sit in the MMO sub-sector, which relies on a slightly different funding model. Beyond the rare exception, AAA games have settled into a $40-$100 million development range, which may climb as teams shift development toward next-generation consoles. Movies have pulled ahead of video games at the high end, but as mentioned above, this gap may close as the next generation of consoles releases.
Another key distinction is that the movie market still maintains a healthy middle: the majority of movies cluster around the center, with significantly fewer pushing to the extreme high and low budgets. The video game market is the exact opposite, having seen its middle fall out and a push toward the extremes. In this bimodal market, budgets at the extremes thrive while $10-$40 million games have all but disappeared. This distinction matters for the overall health of each market, and as a result the video game market has become much more volatile than its entertainment counterpart, the film industry. Despite the divergent market structure, video games have seen top-end revenues keep pace with the film industry: blockbuster movies and AAA games frequently garner about $500 million, and there are rare exceptions in each that break $1 billion.
Why does all this matter, you might ask? Well, let's examine the economics of movies and games from the consumer's side. A consumer who sees a movie in theaters on or near opening weekend will usually spend roughly $10. A person buying a video game in a similar situation can expect to pay either $15 or $60, depending on whether it's an indie or a AAA title. Movies' standard runtimes fall between 90 and 150 minutes, while video games have a much more varied runtime. Given the similar development times and costs, however, you might expect a similar yield per dollar on the consumer's behalf.
By this logic a $15 downloadable title should run the consumer roughly 2-4 hours, and at the higher end a $60 release should be about a 9-15 hour experience. Obviously there are plenty of video games that run well above those constraints; there are entire genres devoted to making 40+ hour experiences (see role-playing games), and if you factor in the multiplayer of the biggest blockbuster franchises, they too well exceed them. So again, why does all this matter? For two reasons. First, consumers need to be aware that games frequently offer more bang for the buck than a run-of-the-mill movie. Second, consumers should be wary, and reviewers have an obligation to inform gamers, when games don't hit that minimum. This isn't to suggest that games which fail to hit the 9-hour mark are inherently bad, but consumers should realize that their dollar might be better spent elsewhere, or that they should wait until the price drops. The principal role of a reviewer is to inform a purchasing decision, and it is critical that the economics play a part. Below is a summary chart for your enjoyment.
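For readers who want to check the arithmetic, here is a minimal sketch of the price-per-hour comparison, using this article's own assumed figures (a $10 ticket and 90-150 minute runtimes) rather than hard market data:

```python
# Price-per-hour parity between movies and games.
# All figures are this article's assumptions, not market data.

MOVIE_PRICE = 10.0                 # typical opening-weekend ticket, USD
MOVIE_RUNTIME_HOURS = (1.5, 2.5)   # standard 90-150 minute runtime

def parity_hours(game_price):
    """Playtime range a game needs to match a movie's dollars-per-hour."""
    return tuple(game_price * hours / MOVIE_PRICE
                 for hours in MOVIE_RUNTIME_HOURS)

print(parity_hours(15))  # indie title -> (2.25, 3.75): roughly 2-4 hours
print(parity_hours(60))  # AAA title   -> (9.0, 15.0): the 9-15 hour range
```

At a movie's implied rate of roughly $4.00-$6.67 per hour, the $15 and $60 price points translate directly into the 2-4 and 9-15 hour ranges cited above.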
Leave a comment if you think a game's playtime should or shouldn't factor into its review score. Does a game's runtime influence your purchasing decisions?
Image courtesy: Kotaku, GamesPress