The latest Call of Duty operator isn't Skeletor, Spawn, or Evil Dead's Ash Williams (characters rumored to be joining Warzone as new operators), but a new AI-powered voice moderation system called ToxMod, built by Modulate.

Call of Duty: Modern Warfare III will launch with ToxMod, Activision's AI-powered voice moderation technology. Image credit: Activision Blizzard.
Okay, so it's not an in-game character, but a funky new AI that will give Activision "global real-time voice chat moderation, at scale" to combat toxic and disruptive behavior, beginning with the launch of Call of Duty: Modern Warfare III on November 10. As an AI tool, ToxMod will be able to identify violations of Activision's toxic speech policies, which cover hate speech, discriminatory language, and harassment, and enforce them in real time.
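For illustration only, here's a rough sketch of what a "flag, then enforce" moderation pass could look like. None of the names, thresholds, or keyword lists below come from Modulate or Activision; they're stand-in assumptions meant to show the general shape of real-time category detection feeding an enforcement decision.

```python
# Hypothetical sketch of a "flag, then enforce" voice moderation pass.
# None of these names, thresholds, or keywords come from Modulate's ToxMod
# or Activision's systems; they are placeholders for illustration.
from dataclasses import dataclass

POLICY_CATEGORIES = ("hate_speech", "discriminatory_language", "harassment")
ENFORCE_THRESHOLD = 0.9  # assumed cut-off for automatic enforcement
REVIEW_THRESHOLD = 0.5   # assumed cut-off for routing to human review

@dataclass
class Flag:
    player_id: str
    category: str
    confidence: float

def classify_clip(player_id: str, transcript: str) -> list[Flag]:
    """Stand-in for an ML classifier that scores a transcribed voice-chat
    clip against each policy category (here: a toy keyword check)."""
    toy_keywords = {
        "hate_speech": ["<slur>"],
        "discriminatory_language": ["<epithet>"],
        "harassment": ["uninstall", "worthless"],
    }
    flags = []
    for category in POLICY_CATEGORIES:
        hits = sum(word in transcript.lower() for word in toy_keywords[category])
        if hits:
            flags.append(Flag(player_id, category, min(1.0, 0.6 + 0.2 * hits)))
    return flags

def moderate(player_id: str, transcript: str) -> list[str]:
    """Turn flags into decisions: automatic restriction for high-confidence
    violations, a human-review queue for borderline ones."""
    actions = []
    for flag in classify_clip(player_id, transcript):
        if flag.confidence >= ENFORCE_THRESHOLD:
            actions.append(f"restrict voice chat ({flag.category})")
        elif flag.confidence >= REVIEW_THRESHOLD:
            actions.append(f"queue for review ({flag.category})")
    return actions

print(moderate("player-123", "just uninstall, you're worthless"))
# -> ['restrict voice chat (harassment)']
```

The real system presumably works on audio rather than tidy transcripts and uses trained models rather than keyword lists, but the two-step shape, detect per policy category, then escalate to enforcement or review, is the part this toy version is meant to convey.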
ToxMod will sit alongside Call of Duty's existing text-based filtering, which covers 14 languages, and the in-game reporting system available to players. Ahead of its integration into Call of Duty: Modern Warfare III, Activision plans to roll out a beta of the new voice chat moderation technology in North America.
The beta is set to kick off this weekend in Call of Duty: Modern Warfare II and Call of Duty: Warzone; initially, it will only cover English-speaking players, with additional language support coming later.
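Purely as a sketch of that staged rollout, gating the feature by region and chat language might look something like the snippet below; the set names and values are assumptions, not Activision's actual configuration.

```python
# Hypothetical sketch of gating a staged rollout by region and chat language;
# the values below are assumptions, not Activision's actual configuration.
BETA_REGIONS = {"NA"}
BETA_LANGUAGES = {"en"}

def voice_moderation_enabled(region: str, chat_language: str) -> bool:
    """During the beta, only English voice chat in North America is moderated;
    wider language support would simply widen these sets later."""
    return region in BETA_REGIONS and chat_language in BETA_LANGUAGES

assert voice_moderation_enabled("NA", "en")
assert not voice_moderation_enabled("EU", "en")
```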
Toxic behavior and Call of Duty go hand in hand thanks to the sheer popularity of the franchise, and even though Activision has restricted over 1 million accounts for violating the Call of Duty Code of Conduct, toxicity is still widely seen as a problem. On the plus side, Activision has shared one encouraging piece of data: 20% of players did not re-offend after receiving their first warning.
With AI now monitoring player behavior in real time, perhaps it's only a matter of time before we get AI-powered cheat and bot detection systems to combat the other side of bad online behavior: cheating.