Video games have had a great year. In 2017, Nintendo made a major comeback from the disappointing Wii U with the Nintendo Switch and two incredible games: The Legend of Zelda: Breath of the Wild and Super Mario Odyssey. The year also brought titles such as Horizon Zero Dawn and Wolfenstein II. It’s safe to say that video games are hitting their stride, delivering deep, intriguing stories set in spectacular worlds. But even amid all this success, video games may also be facing one of their biggest problems.
In recent news, Star Wars: Battlefront II received a lot of criticism for its microtransactions and pay-to-win progression. While EA has since altered the game, players originally had to spend either money or an exorbitant amount of time to unlock their beloved characters; fans estimated that it would take at least 40 hours of multiplayer gameplay to unlock characters such as Luke Skywalker and Darth Vader. Complaints on forums about these unlock times brought in-game purchases into the spotlight and raised a question: should gamers have to pay to access in-game content when they already spent $60-$80 on the game itself?
2012’s Mass Effect 3 is often credited with making microtransactions mainstream. While that may be partly true, in-game purchases have long been a part of gaming; historically, though, they appeared in free-to-play games, where gamers weren’t paying a flat rate for the game in the first place. While Mass Effect 3 did incorporate loot boxes, Overwatch is widely considered the catalyst for loot boxes in full-priced games. In Blizzard’s shooter, players earn a box each time they level up but have the option to purchase additional boxes. That said, Overwatch came with a promise that players would not have to pay for any new maps, characters, or game modes. Essentially, every player who paid the up-front $60 for the game would have the same version.
But now, with games such as Battlefront II, players feel they must pay extra to access a game’s full potential, and other issues are emerging as well. Some free-to-play games, the format in which in-game purchases originated, have moved toward pay-to-win. Many fear that this pay-to-win scheme will make its way into big-budget games, forcing players to invest even more money just to be able to compete and likely creating an uneven playing field between those who can and can’t pay.
Additionally, there have been discussions about the legality of loot boxes. Some argue that purchasing loot boxes is essentially gambling: players invest money for unknown, randomized results. Just last week, an official from the Belgian Gaming Commission said that he would like to ban loot boxes from gaming altogether. The issue becomes more unsettling for some when one considers that many gamers are minors. In fact, Hawaiian State Representative Chris Lee recently held a press conference in which he called the practice “predatory” and described the problem of having a “Star Wars themed online casino designed to lure kids into spending money.”
China already regulates loot boxes under gambling law; this year, in fact, it passed a regulation requiring developers to disclose the probability of obtaining specific items from loot boxes. So it may not be long until U.S. game developers face similar regulations. For now, at least in the case of Battlefront II, consumer outrage was enough to reverse many of the in-game transactions. And in the wake of the Battlefront controversy, major developers will likely revisit microtransactions in future games. Obsidian, which developed Fallout: New Vegas, recently promised that its next RPG would not include any loot boxes or microtransactions. Nonetheless, the gaming industry is in a state of transition, with critical successes like Super Mario Odyssey and unfortunate hiccups like Battlefront II ringing in the new year.