Online multiplayer games have always carried the risk of a toxic player or two in the lobby. Despite moderation measures in text chat, abusive players still get away with bad behavior over voice chat. That won't be the case in Call of Duty for much longer.
Voice Chat Moderation Using AI
If you've been playing Call of Duty multiplayer for a while, you'll know that other players tend to unleash unpleasant language over voice chat. Beyond cussing, they may also utter derogatory or racist slurs, and it doesn't necessarily stop there.
Voice chat has always been an issue because video game companies had no way of flagging audio, but that has changed. Activision is partnering with Modulate, a company whose technology can moderate in-game voice chat.
The AI technology used in the new system is called ToxMod. According to The Verge, it is capable of identifying behaviors like hate speech, discrimination, and harassment in real time. If you're wondering when it goes live, it already has.
The new moderation feature has been rolled out first in North America. Call of Duty: Modern Warfare II and Call of Duty: Warzone already have the system in place, and it will reach other countries by November 10th.
The AI moderation tool will also be included in Activision's latest Call of Duty release, Modern Warfare III. Unfortunately, the feature's rollout does not include Asia, according to the video game company's patch notes.
So far, Activision and Modulate have not disclosed much about how the system works, only that it "triages voice chat to flag bad behavior, analyzes the nuances of each conversation to determine toxicity."
Human moderators will then respond to each incident by "supplying relevant and accurate context." This suggests that a player's emotions and speaking volume may be factored into whether they are flagged. The consequences will likely mirror those of text chat moderation.
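For readers curious what a triage-then-review pipeline like the one described above might look like, here is a minimal, purely illustrative Python sketch. Activision and Modulate have not disclosed how ToxMod actually works, so every name, threshold, and signal below (the keyword screen, the loudness weighting, the review queue) is an assumption made for illustration only.

```python
from dataclasses import dataclass, field

# Hypothetical threshold: ToxMod's real categories and cutoffs are not public.
TOXICITY_THRESHOLD = 0.8

@dataclass
class VoiceClip:
    player_id: str
    transcript: str              # speech-to-text output for the clip
    loudness_db: float           # speaking volume, one contextual signal
    scores: dict = field(default_factory=dict)

def triage(clip: VoiceClip) -> bool:
    """First pass: a cheap keyword screen decides whether a clip needs deeper analysis."""
    flagged_terms = {"slur_a", "slur_b"}  # placeholder word list
    return any(term in clip.transcript.lower() for term in flagged_terms)

def analyze(clip: VoiceClip) -> float:
    """Second pass: combine signals (term hits, shouting) into a single toxicity score."""
    base = 0.6 if triage(clip) else 0.1
    volume_bump = 0.3 if clip.loudness_db > 80 else 0.0  # shouting treated as aggravating
    return min(base + volume_bump, 1.0)

review_queue: list[VoiceClip] = []

def moderate(clip: VoiceClip) -> None:
    """Score the clip; only escalate to a human moderator above the threshold."""
    score = analyze(clip)
    clip.scores["toxicity"] = score
    if score >= TOXICITY_THRESHOLD:
        review_queue.append(clip)  # human reviewer gets the clip plus its context

moderate(VoiceClip("player123", "some SLUR_A shouted mid-match", loudness_db=92.0))
print(len(review_queue))  # -> 1, queued for human review
```

The one design point the companies have confirmed is the last step: the automated score only escalates a clip, and a human moderator with the surrounding context makes the final call.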
Why It's a Big Step
Voice chat is the preferred method of communication in online multiplayer games, largely because players are often unable to type mid-match, leaving voice chat as the only way to communicate with teammates.
When there's a toxic player in the mix, the choice is between muting them and losing the ability to communicate, or enduring their words for the duration of the match. Negative communication can greatly impact both gameplay and the overall experience.
Many players no longer report rule-breakers, mostly because, more often than not, toxic players go unpunished. According to VentureBeat, two out of three gamers don't report instances of in-game toxicity.
In a survey of 1,000 online gamers in the US, 39.9% had been called offensive names, 38.4% had been trolled (teased "for fun" by other players), and 29.9% had been bullied. More concerning still, 15.9% had been sexually harassed and 11.1% had been stalked.