Activision’s AI Tool Fights Toxic Behavior in Call of Duty

The effectiveness of Activision's AI tool in combating toxic player behavior in Call of Duty

Video game publisher Activision has announced the results of tests of its artificial intelligence (AI)-based moderation tool for Call of Duty. The tool, known as ToxMod, was introduced in August of last year and has since flagged more than 2 million accounts for “disruptive voice chat.”

Activision introduced ToxMod in beta ahead of the release of Call of Duty: Modern Warfare III and has gradually rolled it out worldwide. According to company representatives, ToxMod can identify “disruptive” comments in 14 languages across Call of Duty: Modern Warfare II, Modern Warfare III, and Call of Duty: Warzone.

Although the tool has proven effective, Activision emphasizes that only 1 in 5 players report cases of toxic behavior. In response, the company has added in-game messages encouraging players to report any violations of the game’s rules.

The introduction of ToxMod has yielded immediate results and allowed the company to expand its moderation strategy. Activision states that since the AI model was deployed, repeat violations of the game’s rules have gradually declined.

In addition, “serious cases” of disruptive voice chat have dropped by 50% since the release of Modern Warfare III. Players who continue to break the rules face further restrictions, such as having voice and text chat disabled and other social features limited.

Activision also says it is looking for ways to let players provide “additional feedback.” For players who need a reminder of which behaviors can lead to account reports, Activision has updated the Call of Duty Code of Conduct. The company has clearly defined its stance on combating toxic speech, including zero tolerance for violence and harassment, as well as offensive remarks about race, identity or sexual orientation, age, culture, religion, physical or mental disabilities, or country of origin.

Activision is determined to combat toxic behavior in games and plans to continue developing its moderation technologies for both voice and text chat.

Source: the blog guambia.com.uy