Call of Duty Enlists AI to Combat Toxic Chat

Published 12 September 2023

By Julia Errens
2 min read

The next instalment of the uber-popular first-person shooter series Call of Duty (more than 400 million lifetime sales), launching on 10 November, will ship with an AI-powered moderation tool designed to tackle toxicity in its multiplayer voice chat.

In partnership with Boston-based gaming moderation start-up Modulate, Call of Duty publisher Activision (US) will use Modulate’s ToxMod AI tool to identify hate speech, bullying, harassment and discrimination in voice chat sessions in real time. Modulate says ToxMod can read context cues, such as tone and voice volume, to distinguish friendly trash talk from verbal assault. Concerning instances will be flagged and forwarded to Activision’s human moderation team for review.
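Modulate has not published ToxMod’s internals, but the workflow described above – automated real-time scoring of voice-chat clips, weighted by context cues such as tone and volume, with only flagged cases escalated to human moderators – can be sketched in broad strokes. The Python snippet below is purely illustrative: the class names, word list, weights and threshold are hypothetical assumptions, not ToxMod’s actual implementation.

```python
from dataclasses import dataclass

# Hypothetical illustration of a voice-chat moderation pipeline:
# an automated scorer weighs what was said against context cues
# (tone, volume), and only high-confidence cases reach human review.

@dataclass
class VoiceClip:
    transcript: str        # text produced by speech-to-text
    loudness: float        # 0.0 (whisper) to 1.0 (shouting)
    hostility_tone: float  # 0.0 (friendly) to 1.0 (aggressive), from a tone model

ABUSIVE_TERMS = {"slur_a", "slur_b"}  # placeholder word list, not a real lexicon
REVIEW_THRESHOLD = 0.7                # hypothetical escalation cut-off


def toxicity_score(clip: VoiceClip) -> float:
    """Combine keyword hits with context cues into a 0-1 score."""
    keyword_hit = any(term in clip.transcript.lower() for term in ABUSIVE_TERMS)
    base = 0.6 if keyword_hit else 0.1
    # Context cues: shouting and a hostile tone help separate friendly
    # trash talk from genuine abuse, as described for ToxMod.
    context = 0.2 * clip.loudness + 0.2 * clip.hostility_tone
    return min(1.0, base + context)


def route_clip(clip: VoiceClip) -> str:
    """Forward a clip to human moderators only if the score clears the bar."""
    if toxicity_score(clip) >= REVIEW_THRESHOLD:
        return "escalate_to_human_review"
    return "no_action"


if __name__ == "__main__":
    clip = VoiceClip(transcript="example slur_a insult", loudness=0.9, hostility_tone=0.8)
    print(route_clip(clip))  # -> escalate_to_human_review
```

The key design point the sketch illustrates is the human-in-the-loop split: the automated layer only triages, and final enforcement decisions remain with Activision’s moderation team.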

This is Activision’s latest step towards improving the culture surrounding Call of Duty, which has been dubbed gaming’s most negative community. After introducing a code of conduct in 2022, the company banned 500,000 toxic players.

Women are especially under fire in multiplayer voice chats across gaming, and research has found that male-identifying gamers play more aggressively against those they perceive as female. Among female players, 76% report disguising their gender to feel safe. Since Call of Duty and similar team shooters (Valorant, Apex Legends, Overwatch) depend on teamwork to succeed, this toxicity and discrimination excludes such players from full participation, including esports careers.

Improving the culture surrounding competitive multiplayer gaming also holds worthwhile opportunities for awareness-raising brand activism. Take Through Their Eyes, a March 2023 experiment by American beauty brand Maybelline that saw two popular male gaming livestreamers play multiplayer games with their voices modulated to sound female, resulting in a deluge of misogynist abuse.