Electronic Arts has brought the long-simmering debate over the potentially exploitative nature of loot boxes to a boil. Fans found the grind to unlock items and characters in Star Wars Battlefront II so egregious (nearly 40 hours of regular play for a single character) that they gave EA’s response to the controversy on Reddit the dubious honor of being the most downvoted post in the platform’s history. Developer DICE has already backpedaled, reducing character unlock costs by 75% and tweaking the item drop system, but to an extent the damage is already done; EA has made its priorities clear.
It’s good that players are voicing concerns now, because this is only the beginning. Take-Two (parent company of 2K and Rockstar Games) reported the other week that microtransactions (e.g. buying in-game items, virtual currency, and DLC) comprised 42% of its most recent quarterly revenue (“It’s been transformative,” said CEO Strauss Zelnick), and that the company intended henceforth to focus exclusively on publishing games with “recurrent consumer spending hooks.” For Ubisoft, the publisher behind Far Cry, Assassin’s Creed, and many others, the figure was 51%. What happens when microtransactions transition from a supplementary revenue stream to the main business model? That reality has arrived.
Yes, players are buying more loot boxes than ever, but does that mean they actually like them?
Publishers look at numbers like Take-Two’s revenue figures, or the popularity of loot boxes in Overwatch, and take them as proof that players want more. But that is the circular reasoning of corporate entertainment production at work, absolving producers of their role in creating that desire in the first place. The naked, cynical capitalism of the phrase “recurrent consumer spending hooks” undercuts any explanation focused on positive player experience. Major video game publishers like EA and Ubisoft are publicly traded companies in an industry with rapidly inflating overhead costs, so following the money is always a reasonable way to explain their behavior.
Publishers have justified these changes at every turn as following player desire, and sales figures suggest there is some truth to that. What they take no ownership of is their role in creating and manipulating that desire through marketing and game design. Yes, players are buying more loot boxes than ever, but does that mean they actually like them?
A brief, recent history of the loot box
The modern loot box (a randomized collection of virtual goods that players can purchase with some combination of in-game and real currency) emerged in Asian free-to-play games like China’s ZT Online and, later, Japan’s Puzzle & Dragons. The system’s first major appearance in the West came in a 2010 patch for Valve’s seminal class-based shooter, Team Fortress 2. Valve added the system to better align the game’s business model with its ongoing, “game as a service” development, and made the game wholly free-to-play soon thereafter. Anxious about breaking gameplay balance, however, the studio restricted loot boxes to cosmetic content (such as weapon reskins or the notorious hats), which allowed players to make their characters stand out without affecting play.
That anxiety was rooted in part in a fear of being lumped in with some of the more abusive trends emerging in mobile and social game microtransactions, such as the infamous FarmVille, which pioneered manipulating players into spending money through timers and social pressure. Social games like FarmVille were an extreme but prescient example of microtransaction-focused design. Braid designer Jonathan Blow once quipped that “it’s just reward structure layered on reward structure layered on reward structure with a hollow center.”
Fast forward to 2017, though, and a slew of publishers have gotten into hot water for offering boxes that impact gameplay. Fans were skeptical of their addition to Middle-earth: Shadow of War, Monolith and Warner Bros.’ heavily anticipated sequel to Shadow of Mordor, which is notably a single-player-only game. While the developer insisted that the game was balanced around playing without additional purchases, many players found the boxes all but necessary to complete its final “Shadow Wars” section in a reasonable amount of time. A steep difficulty spike meant that surviving these raids on your fortresses required either extensive, time-consuming grinding to unlock the most powerful orcs and weapons, or simply purchasing enough loot boxes to achieve the same effect. The “true ending” was more or less gated behind an artificial extension of the endgame that felt disconnected from the rest of the play experience, and it left a lot of players bitter.
Allowing players to spend more on cosmetic items is an acceptable accommodation for vanity
Just prior to Shadow of War, Forza Motorsport 7 strayed from series tradition by changing its one-time VIP purchase, which boosts the rate at which you earn in-game credits to unlock new cars, into a package of discrete, single-use bonuses that have to be continually repurchased, until player furor forced the developer to reverse course. In previous games, players could always choose to make races more difficult (by racing at night, for example, or removing certain player-aiding mechanics) in exchange for a larger reward. Doing so in Forza 7 requires one of these single-use mods, which are acquired randomly through loot boxes. Both Forza and Middle-earth took content that had been freely available in previous games and locked it behind byzantine microtransactions. That sense of something being taken away is what felt most egregious.
Like Star Wars Battlefront II, Forza and Middle-earth are full-priced, $60 games. There is an implicit social contract between players and publishers that $60 is the going rate for a complete and fulfilling experience with the highest production values. For many, allowing players to spend more on cosmetic items is an acceptable accommodation for vanity, but designing games around the expectation that, after that up-front purchase, players will spend more money just to access basic content violates the contract.
Gamers, en masse, are notoriously good at breaking designers’ intentions, rapidly finding the most efficient techniques for churning through content. By providing ways to accelerate through a game with money, developers have set up a sort of related-rates problem: how do you set the cost of microtransactions and the tedium of unsupplemented gameplay so that spending money becomes the path of least resistance, without triggering too much anger or making players feel exploited? EA and DICE’s debacle with Battlefront II shows what happens when you miscalculate and have to solve the problem in public.
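That tradeoff can be made concrete with a toy model. All of the numbers below are hypothetical illustrations (chosen only to echo the roughly 40-hour grind reported for Battlefront II), not real game data:

```python
# Toy model of the grind-vs-pay tradeoff a designer tunes.
# ITEM_COST, EARN_RATE, and EXCHANGE are hypothetical figures for illustration.

def hours_to_unlock(item_cost_credits, credits_per_hour):
    """Hours of unsupplemented play needed to afford an unlock."""
    return item_cost_credits / credits_per_hour

def dollars_to_unlock(item_cost_credits, credits_per_dollar):
    """Real-money cost to buy the same credits outright."""
    return item_cost_credits / credits_per_dollar

ITEM_COST = 60_000   # credits needed for one hero (hypothetical)
EARN_RATE = 1_500    # credits earned per hour of play (hypothetical)
EXCHANGE = 1_200     # credits granted per real dollar (hypothetical)

grind = hours_to_unlock(ITEM_COST, EARN_RATE)    # 40.0 hours
price = dollars_to_unlock(ITEM_COST, EXCHANGE)   # $50.00
print(f"Grind {grind:.0f} h, or pay ${price:.2f}"
      f" (${price / grind:.2f} per hour skipped)")
```

Nudging the earn rate down or the exchange rate up pushes the implied “price per hour skipped” below what players feel their time is worth; set it too aggressively, and you get a Battlefront II-style backlash.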
Wag the dog
If player-centric justifications for adding loot boxes ring false, then we must look closely at how and why publishers are successfully integrating these economies into their games. From the satisfying tactility of opening them to the gameplay systems in which they are couched, loot boxes are being carefully crafted to manipulate players into wanting them.
Microtransactions seriously muddy the waters when it comes to players’ ability to properly appraise the value of the time and money they spend on games, crippling their basic power as consumers. Myriad small purchases are much harder to track than a few large ones. The app economy, several years ahead in this regard, has already produced horror stories of people — especially children — spending far more than they realized.
Developers have no such excuse, however, because the nature of video games means they can create and control these in-game economies under lab-like conditions. All evidence suggests that they are working hard to find new and more insidious ways to manipulate players into making more in-game purchases.
Making loot boxes so performative encourages a sort of “keeping up with the Joneses” mentality
Call of Duty: WWII recently took this to the next level by integrating loot boxes into its “headquarters” social space: rewards drop into the shared area in full view, and other players can watch which cards come out. The game even includes a quest that rewards players for watching three other players open loot boxes. Conspicuous consumption is a well-documented effect, and making loot boxes this performative encourages a “keeping up with the Joneses” mentality that inevitably leads to more spending.
Even more insidious: Call of Duty’s publisher, Activision, also recently filed patents for systems that encourage in-game purchases by manipulating the matchmaking system. For instance, players might be paired against opponents with a substantial item-based advantage, nudging them to buy their way to parity. Conversely, players might be placed into modes and maps that favor their recent purchases in order to make them feel good about the investment. Although Activision insisted that these are purely speculative patents, not yet in place, their intent to manipulate players into opening their wallets is transparent and troubling.
There ain’t no such thing as a free lunch
Always opinionated about industry discourse, Jonathan Blow recently tweeted about the loot box furor, pointing out that the cost of developing AAA games has skyrocketed while the $60 price point has stayed the same, despite inflation. He’s right: the resources required to make today’s massive releases, loaded with dozens of hours of content and cutting-edge production values, go up every year, and something has to give. Developers have reached the limit of how much value they can squeeze out of labor through exploitative “crunch” practices, so players are next in line to bear the brunt of rising costs.
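Blow’s inflation point holds up to back-of-the-envelope arithmetic. The 2% annual rate and the twelve-year window (roughly since the $60 standard took hold) are illustrative assumptions, not precise CPI figures:

```python
# Back-of-the-envelope check on the "flat $60 despite inflation" point.
# The 2% annual rate and 12-year window are illustrative assumptions.

def inflate(nominal, annual_rate, years):
    """Price after compounding inflation for the given number of years."""
    return nominal * (1 + annual_rate) ** years

adjusted = inflate(60.00, 0.02, 12)  # $60 then, expressed in today's dollars
print(f"${adjusted:.2f}")            # roughly $76
```

Under these assumptions, holding the sticker price flat amounts to a real price cut of more than 20% over the period — a gap publishers are now filling with microtransactions.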
The Entertainment Software Rating Board (ESRB) recently announced that it doesn’t consider loot boxes gambling, a classification that would have exposed them to substantially more government oversight. That irked a lot of people who saw such a ruling as a tangible step toward limiting their proliferation, especially among children, but a myopic focus on loot boxes somewhat misses the point. Loot boxes are merely the current, most popular form of “recurrent consumer spending hooks” in games; they will not be the last. The gaming community needs to have an honest, broad discussion about how much games cost to make, and how much players are willing to pay for them.
By all indications, that’s not a conversation major publishers want to have. Taken alongside other recent trends, like narrowing review windows and proliferating pre-order bonuses that encourage purchases in a critical vacuum, there is a troubling irony in an industry built on fantasies of player empowerment standing at the bleeding edge of disempowering and exploiting its own consumer base. Whether it’s scaling back the scope of games or more fundamentally changing the purchase model, something needs to change in the economics of AAA games, and transparency is the only way forward to ensure a happy, healthy future for games and the people who make and play them.
The views expressed here are solely those of the author and do not reflect the beliefs of Digital Trends.