Call of Duty Introduces AI-Powered Voice Chat Moderation
It’s no secret that Call of Duty fans have a reputation as some of the most toxic in all of gaming. Moderation, especially across millions of gamers, requires an army. But inserting an AI that runs in real time as a screening mechanism is pretty genius, and hopefully cleans up COD… although expect some serious “wrongful ban” backlash while they work out the kinks.
Activision, the company behind the popular Call of Duty franchise, has partnered with Modulate to introduce an AI-powered voice chat moderation system called ToxMod. The new technology aims to combat toxic behavior, hate speech, discrimination, and harassment in real time during in-game voice chat. The system will roll out with the upcoming Call of Duty: Modern Warfare III, set to launch on November 10th.
ToxMod uses machine learning to analyze the nuances of conversations and gauge toxicity. While the AI system identifies and reports toxic behavior in real time, it will not take immediate action against players. Instead, it submits reports to Activision's moderators, who then decide on the appropriate enforcement actions.
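ToxMod's internals aren't public, but the workflow the article describes — a model that scores voice chat and only *flags* it for humans, never bans on its own — might look something like this minimal sketch. All names, the scoring, and the threshold here are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Report:
    """A single flagged voice-chat excerpt (hypothetical shape)."""
    player_id: str
    transcript: str
    toxicity_score: float  # model confidence, 0.0 to 1.0


@dataclass
class ModerationQueue:
    """Reports above a threshold wait here for human review."""
    pending: list = field(default_factory=list)

    def flag(self, report: Report, threshold: float = 0.8) -> bool:
        # The model only reports; enforcement is left to human moderators.
        if report.toxicity_score >= threshold:
            self.pending.append(report)
            return True
        return False


queue = ModerationQueue()
queue.flag(Report("player42", "[slur redacted]", 0.93))  # queued for review
queue.flag(Report("player7", "gg wp", 0.05))             # below threshold, ignored
print(len(queue.pending))  # 1
```

The key design choice mirrored here is keeping a human in the loop: the AI narrows millions of conversations down to a reviewable queue, which is what makes moderation at this scale tractable while leaving ban decisions to people.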
The move is part of Activision's ongoing efforts to combat toxicity in the Call of Duty community. Since the launch of Modern Warfare II, the company's existing anti-toxicity moderation systems have taken action against over a million accounts found to have violated the Call of Duty Code of Conduct. ToxMod is expected to further strengthen those efforts and make for a more enjoyable experience for players.
Image: Activision