Call of Duty Implements AI Tool to Combat Online Toxicity

Nearly two million Call of Duty accounts are under investigation over toxic voice chat

Call of Duty, one of the most popular video game franchises, has recently implemented a groundbreaking AI tool to tackle toxic behavior in its online voice chats. The tool, introduced by Activision, the game’s publisher, has already flagged more than two million accounts for toxic conversations, and those accounts are now under investigation.

The new system, aimed at combating toxic speech and promoting a healthier gaming environment, is designed to identify hate speech, discriminatory language, and harassment among players. It debuted in beta as the voice chat moderation for Modern Warfare II and Warzone in August 2023, initially available only in English for players in North America. The tool has since been expanded globally (excluding Asia) to Call of Duty: Modern Warfare III, with added support for Spanish and Portuguese.
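To make the detection flow concrete, here is a minimal, purely hypothetical sketch in Python of how a transcript-based moderation pass might flag speech by category. Nothing here reflects Activision’s actual implementation: the lexicons, names, and matching logic are illustrative assumptions (real systems rely on trained speech and language models, not keyword lists).

```python
from dataclasses import dataclass

# Hypothetical category lexicons. Real moderation systems use trained
# speech and language models, not keyword lists; this only shows the
# shape of the flagging flow described above.
CATEGORY_TERMS = {
    "hate_speech": {"exampleslur"},
    "harassment": {"examplethreat"},
}

@dataclass
class Flag:
    account_id: str
    category: str
    evidence: str

def moderate_transcript(account_id: str, transcript: str) -> list[Flag]:
    """Scan an already-transcribed voice chat line and return one flag
    per category whose terms appear in it."""
    tokens = set(transcript.lower().split())
    flags = []
    for category, terms in CATEGORY_TERMS.items():
        hits = tokens & terms
        if hits:
            flags.append(Flag(account_id, category, ", ".join(sorted(hits))))
    return flags

if __name__ == "__main__":
    for flag in moderate_transcript("player_123", "you exampleslur"):
        print(f"{flag.account_id} flagged for {flag.category}: {flag.evidence}")
```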

According to Activision, the anti-toxicity system has detected and penalized more than two million accounts for “disrupting voice conversations” under the Call of Duty Code of Conduct. Even in cases that haven’t been reported, the voice chat moderation system takes action against offending players. Active reporting is still crucial, however, which is why the company has introduced messages thanking players for their reports and plans to provide feedback on the actions taken in the future.
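As a sketch of how automated detections and player reports could feed a single enforcement decision: the weights, thresholds, and function names below are invented for illustration. The article confirms only that unreported cases are still acted on and that reporters now receive thank-you messages.

```python
from enum import Enum, auto

class Action(Enum):
    NONE = auto()
    WARN = auto()
    MUTE = auto()

def decide_action(automated_flags: int, player_reports: int) -> Action:
    """Combine automated detections with player reports into one decision.
    Automated flags alone can cross the threshold, so unreported cases
    are still acted on; reports add weight. Weights are assumptions."""
    score = automated_flags * 2 + player_reports
    if score >= 4:
        return Action.MUTE
    if score >= 2:
        return Action.WARN
    return Action.NONE

def acknowledge_report(reporter_name: str) -> str:
    """A thank-you message of the kind the article says players now receive."""
    return f"Thanks for your report, {reporter_name}. Action may follow."

if __name__ == "__main__":
    print(decide_action(automated_flags=2, player_reports=0))  # Action.MUTE
    print(acknowledge_report("player_456"))
```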

Analyzing the data month by month since the voice chat moderation system was introduced, Activision revealed that Call of Duty has seen an 8% decrease in repeat offenders and a 50% reduction in the number of players “exposed to significant disruptions in voice conversations” since the release of Modern Warfare III. Violators of the Code of Conduct can expect consequences ranging from global muting in voice and text chats to restrictions on other social features.
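The escalation itself might look something like the hypothetical sketch below. The penalty tiers mirror those named in the article (global voice/text muting, restricted social features), but the offense thresholds are assumptions.

```python
def apply_penalty(prior_offenses: int) -> dict:
    """Map an account's offense history to escalating restrictions.
    Tiers mirror the penalties named in the article; the thresholds
    are invented for illustration."""
    penalty = {
        "global_voice_mute": True,   # baseline consequence
        "global_text_mute": False,
        "social_features_restricted": False,
    }
    if prior_offenses >= 1:
        penalty["global_text_mute"] = True
    if prior_offenses >= 2:
        penalty["social_features_restricted"] = True
    return penalty

if __name__ == "__main__":
    for offenses in range(3):
        print(offenses, apply_penalty(offenses))
```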

Call of Duty is committed to combating toxicity in its games and will continue to develop moderation technology to address disruptive behavior in both voice and text chats. The company acknowledges that this is an ongoing effort and is dedicated to collaborating with the gaming community to keep Call of Duty fair and enjoyable for everyone.

FAQ:

1. What tool did Activision introduce in Call of Duty games?
Activision introduced an AI-powered voice chat moderation tool in Call of Duty games.

2. What elements does the tool aim to detect?
The tool aims to detect hate speech, discriminatory language, harassment, and similar behavior among players.

3. When was the tool first made available in the games?
The beta version of the AI system was added to Modern Warfare II and Warzone in August 2023.

4. In which region was the system initially introduced?
The tool was initially available only in North America, in English.

5. Which languages were added after the expansion of the system?
After the system was expanded globally (excluding Asia) to Call of Duty: Modern Warfare III, support for Spanish and Portuguese was added.

6. How many accounts were detected by the system?
The anti-toxicity system has detected and penalized more than two million accounts for “disrupting voice conversations” under the Call of Duty Code of Conduct.

7. What are the consequences for players who violate the code of conduct?
Players can expect various consequences, including global muting in voice and text chats, as well as restrictions on other social features.

8. What benefits has the implementation of the moderation system brought?
According to the data analysis, there has been an 8% decrease in repeat offenders and a 50% reduction in the number of players “exposed to significant disruptions in voice conversations” since the release of Call of Duty: Modern Warfare III.

9. How does Activision plan to continue combating toxicity in games?
Activision will continue to develop moderation technology to address disruptive behaviors in both voice and text chats.

Definitions:

1. Toxicity – Player behavior or speech that is aggressive, offensive, discriminatory, or otherwise harmful to other players.

Sources:

– Call of Duty Official Website

The article originally appeared on the blog myshopsguide.com.