Toxic player behavior remains one of the main problems facing developers of MMO games. Modulate, the company behind the AI content moderation system ToxMod, ran an experiment in one of the games where its system was deployed: just three days after ToxMod was implemented, the number of active users rose by 6.3%, and after 21 days the growth reached 27.9%.

According to Mike Pappas, CEO and co-founder of Modulate, game studios invest significant effort in protecting users from aggression and fostering a positive gaming environment. Effective content moderation not only improves the player experience but also helps retain players.

The financial success of games-as-a-service titles depends largely on loyal players who make purchases in in-game stores. Unchecked toxicity in chats can drive away existing users and deter potential newcomers wary of an aggressive community.

Moreover, the European Union's Digital Services Act imposes fines of up to 6% of a company's annual turnover for non-compliance with user safety requirements. Regulators in the United States are also active: in 2022, Epic Games paid $275 million for violations of the Children's Online Privacy Protection Act (COPPA).

Developers may underestimate the problem of toxicity, especially when a game is at the peak of its popularity. However, according to a Take This survey, 61% of players are inclined to spend less money in games where they have encountered aggression, hate speech, or harassment.

A 2023 study by Dr. Constance Steinkuehler supports this trend: average monthly spending per gamer was $12.09 in toxic games versus $21.10 in "friendly" ones.
