Microsoft Shares Its AI Software to Combat Online Grooming by Sex Offenders


This article explores a new detection technique aimed at combating online grooming by sexual predators.


This week, Microsoft shared a new grooming detection technique, code-named "Project Artemis," by which online predators attempting to lure children for sexual purposes can be detected, addressed, and reported. Microsoft has been leveraging the technique on its Xbox platform for several years. It was developed in collaboration with The Meet Group, Roblox, Kik, and Thorn, and will be made freely available via Thorn to qualified online service companies that offer a chat function.

The development of this new technique began in November 2018 at a Microsoft “360 Cross-Industry Hackathon.” Building off the Microsoft patent, the grooming detection technique is applied to historical text-based chat conversations. It evaluates and “rates” conversation characteristics and assigns an overall probability rating. This rating can then be used as a determiner set by individual companies implementing the technique as to when a flagged conversation should be sent to human moderators for review.
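The workflow described above — score historical chats, then route anything past a company-set threshold to human moderators — can be sketched as follows. Project Artemis's actual scoring model is not public, so `score_conversation`, the risk phrases, and the 0.5 threshold below are purely illustrative assumptions:

```python
# Minimal sketch of a score-and-threshold moderation pipeline.
# The scorer and threshold are illustrative, not Project Artemis internals.

def score_conversation(messages):
    """Toy scorer: fraction of messages containing assumed risk phrases."""
    risk_phrases = {"keep this secret", "how old are you", "send a photo"}
    hits = sum(any(p in m.lower() for p in risk_phrases) for m in messages)
    return hits / max(len(messages), 1)

def flag_for_review(conversations, threshold=0.5):
    """Return (conversation_id, score) pairs whose overall probability
    rating meets the threshold set by the implementing company."""
    flagged = []
    for conv_id, messages in conversations.items():
        score = score_conversation(messages)
        if score >= threshold:
            flagged.append((conv_id, score))
    return flagged  # these conversations go to human moderators
```

The key design point from the article is that the technique only *rates* conversations; each company decides its own threshold for when a flagged conversation warrants human review.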

Human moderators would then be capable of identifying imminent threats for a referral to law enforcement, as well as incidents of suspected child sexual exploitation to the National Center for Missing and Exploited Children (NCMEC). NCMEC, along with ECPAT International, INHOPE and the Internet Watch Foundation (IWF), provided valuable feedback throughout the collaborative process.

Microsoft has a long-standing commitment to child online protection. In a recent blog post, they shared:

"First and foremost, as a technology company, we have a responsibility to create software, devices and services that have safety features built-in from the outset. We leverage technology across our services to detect, disrupt and report illegal content, including child sexual exploitation. And we innovate and invest in tools, technology and partnerships to support the global fight needed to address online child sexual exploitation." 

While child sex abusers continue to find ways to exploit children, tech tools like Microsoft's PhotoDNA have revolutionized the detection of abuse imagery, helping to identify newly created material and determine where in the world these child victims are.


The Real Value of Data Science Is Evident in Thorn

Thorn is a technology nonprofit, co-founded by Ashton Kutcher and Demi Moore, that builds software to fight trafficking and child exploitation. Its best-known product is Spotlight, which "takes investigative times down from three years to three weeks" and enables law enforcement officials to:

  • Gather insights from millions of data points to identify and locate those most at risk.
  • Reallocate time spent investigating to recovering victims.
  • Reduce the risk of following misleading information.

While the mechanics of the software are under wraps (presumably to keep it safe from the dark web), since launching it has been used by over 10,000 investigators to:

  • Identify 10 juvenile victims a day
  • Support 62K human trafficking cases
  • Identify over 16K traffickers
  • Reduce investigation time by 67%

Runaway Train Meets Digital Billboards and Location Intelligence

Those of you in the Generation X age group may remember the song "Runaway Train" by American band Soul Asylum.

The song dealt with depression, and the film clip interspersed the band singing with the photos and names of real missing children along with the date on which they each disappeared. 

The song was a huge hit globally, not only resonating with people musically but actually helping to find missing children. Three original versions of the video were screened in the US, and 21 of the 36 missing children shown were located after the video aired. (Subsequent clips were updated to feature other missing children.) It didn't all end in happiness: the UK version of the video featured two girls whose remains were found 16 years later, with Peter Tobin convicted of both murders. The Australian version included missing backpackers, several of whom turned out to be victims of Ivan Milat; their bodies were discovered in Belanglo State Forest, New South Wales, Australia.

The song was relaunched last year, repackaged with a new cover (arguably not as good as the original), a website, and digital billboards, creating a powerful multi-platform tool to find missing children and one of the largest digital out-of-home advertising campaigns in American history. The website and billboards involve a world-first use of geolocation technology: geo-targeted feeds display images of local missing children from the National Center for Missing and Exploited Children (NCMEC) in the video, on digital billboards, and on transit screens. If the video is shared on social media, local missing children are featured automatically.


It's easy to blame tech for societal ills. The underlying social problems have always been there, but now they've expanded exponentially. But sometimes tech fights back.

Further Reading

5 Amazing Examples of Artificial Intelligence in Action

Should You Use AI to Make Decisions About Your Software Team?


Opinions expressed by DZone contributors are their own.
