New Mechanisms to Combat Toxic Behavior in Call of Duty

Intolerable player behavior in Call of Duty: further steps to combat toxicity

In a recent Call of Duty update, it was revealed that a staggering 2 million player accounts have been penalized for toxic behavior in the game. The announcement came as part of Activision Blizzard’s ongoing effort to introduce new moderation mechanisms in Call of Duty. Specifically, the update covered the automatic voice chat moderation features rolled out in August 2023. The penalized accounts were sanctioned for “disruptive voice chat” in Call of Duty.

Published data on callofduty.com tells a chilling story. For many Call of Duty players, the game has been synonymous with toxicity, especially in online multiplayer modes like Search and Destroy, where players often insult and offend each other in practically every match. While some may argue that Call of Duty has always been a breeding ground for trash talk and jokes, it is clear how much of an impact such communication can have on other players.

The measures appear to be working. In a blog post, Activision Blizzard revealed that thanks to the new moderation mechanisms, there has been a 50% decrease in the number of players exposed to “severe cases of disruptive voice chat” over the past three months. Additionally, there has been an 8% decrease in “repeat offenders” – users who were punished and then continued to break the rules and exhibit toxic behavior in the game. In total, two million player accounts have been penalized for toxic behavior.

Yet a significant problem remains, as Activision Blizzard emphasized. Of all the violations identified by the AI-powered voice moderation system, only 20% had also been reported by other players. In other words, 80% of toxic and offensive communication would otherwise have gone unreported and unnoticed. With the new technology in place, a player report is no longer necessary for action to be taken against these offenders.

If you behave toxically in the game, these systems will identify you, and you will be penalized. It’s as simple as that.

But it doesn’t end there. It was emphasized that in the future, additional features will be implemented as Activision Blizzard’s anti-cheat and moderation teams continue to develop new mechanisms to combat toxic and harmful actions in the game. Many players argue that the game has become “too soft,” and older players claim that “today’s players wouldn’t survive in their lobbies,” but Activision Blizzard remains steadfast: toxicity will not be tolerated.

To stay up to date with Call of Duty news, follow Esports.net.

FAQ:

1. What actions have been taken to combat toxic behavior in Call of Duty?
Answer: The Call of Duty update introduced new moderation mechanisms, including automatic voice chat moderation features. Players have been penalized for disruptive voice chat in the game.

2. What are the consequences for players displaying toxic behavior?
Answer: According to the report, a staggering 2 million player accounts have been penalized for toxic behavior in the game. Since the implementation of the new moderation mechanisms, fewer players have been exposed to severe cases of disruptive voice chat.

3. What are the results of implementing new moderation features?
Answer: With the new moderation mechanisms in place, there has been a 50% decrease in the number of players exposed to severe cases of disruptive voice chat over the past three months. There has also been an 8% decrease in repeat offenders.

4. What is the main challenge associated with moderation in Call of Duty?
Answer: The main challenge is that only 20% of the violations detected by the AI moderation system are reported by other players. This means 80% of toxic communication would otherwise go unreported.

5. What further actions will be taken to combat toxic behavior in the game?
Answer: Activision Blizzard plans to introduce additional features and mechanisms to combat toxic and harmful behavior in Call of Duty. The anti-cheat and moderation teams will continue to work on this issue.

Definitions:
– Moderation mechanisms: Features implemented in the game to control and penalize players for toxic behavior.
– Toxic behavior: Behavior exhibited by players that involves insulting and derogatory language.

Suggested related links:
– [Call of Duty](https://www.callofduty.com/)
– [Esports.net](https://www.esports.net/)

The source of the article is the blog zaman.co.at.