Microsoft Develops Tool to Help Identify Child Sexual Predators in Gaming Chat Rooms

Microsoft unveiled a tool Thursday, called Project Artemis, to help identify sexual predators in online gaming and app chat rooms.
The tool, which Roblox helped develop, combs through text conversations and evaluates certain characteristics for patterns used by predators. It then rates the conversation on how likely it is to be an instance of grooming, allowing companies to determine whether it merits review by a human moderator.
Microsoft is sharing a grooming detection technique, code name “Project Artemis,” by which online predators attempting to lure children for sexual purposes can be detected, addressed, and reported. https://t.co/IZvjOExjAy (1/3)
— Microsoft On the Issues (@MSFTIssues) January 9, 2020
"Project Artemis is a significant step forward, but it is by no means a panacea," Microsoft chief digital safety officer Courtney Gregoire wrote in a blog post.
Microsoft began development of Project Artemis in November 2018, then went on to test it on Xbox Live and Skype. The tool will be licensed starting Friday through the nonprofit Thorn, which itself builds tools to prevent the sexual exploitation of children.
Thorn also assisted in the tool's development, alongside The Meet Group, Roblox, and Kik. Dr. Hany Farid, an academic with a long record of work on detecting child sexual exploitation, led the development effort.
"Child sexual exploitation and abuse online and the detection of online child grooming are weighty problems," Gregoire said. "But we are not deterred by the complexity and intricacy of such issues."