Xbox has released its first-ever Digital Transparency Report, offering insight into the work its community and moderation teams have done over the past six months in the name of player safety – including news that it has taken action against 7.3 million community violations during this period.
Microsoft describes the report as part of its “longstanding commitment to online safety”, and says it aims to publish an updated version every six months so that it can address the key takeaways and do more to “help people understand how to be a positive part of the Xbox community.”
Much of the document is dedicated to reiterating the various tools and processes that Microsoft and Xbox gamers have at their disposal to ensure user safety and adherence to community guidelines, ranging from parental controls to reporting.
However, the second half of Microsoft’s report goes into more specific detail about its actions over the past six months, breaking down the number of player reports versus the number of enforcement actions (i.e. instances where content is removed, accounts are suspended, or both).
In total, Microsoft says it received 33 million player reports between January 1 and June 30 this year, of which 14.16 million (43%) related to player behavior, 15.23 million (46%) related to communications, and 3.68 million (11%) related to user-generated content.
Of these reports, 2.53 million resulted in what Microsoft calls reactive enforcements, and an additional 4.78 million enforcements were carried out proactively (i.e. using “protection technologies and processes” before an issue is raised by a player), bringing the total number of enforcements to 7.31 million. Notably, 4.33 million (57% of all enforcement actions) were related to cheating and inauthentic accounts, based on activity detected prior to player reports.
The full transparency report provides an interesting insight into the moderation challenges a platform like Xbox faces, as well as the steps Microsoft has taken so far to combat them.