Google, Meta, Discord, and more team up to fight child abuse online

A new program called Lantern aims to fight online child sexual exploitation and abuse (OCSEA) with cross-platform signal sharing between online companies like Meta and Discord. The Tech Coalition, a group of tech companies that collaborate to fight child sexual exploitation online, wrote in today’s announcement that the program is an attempt to keep predators from evading detection by moving potential victims to other platforms.

Lantern serves as a central database for companies to contribute data and check their own platforms against. When companies see signals, like known OCSEA policy-violating email addresses or usernames, child sexual abuse material (CSAM) hashes, or CSAM keywords, they can flag them in their own systems. The announcement notes that while the signals don’t strictly prove abuse, they help companies investigate and possibly take action like closing an account or reporting the activity to authorities.
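To make that flow a little more concrete, here is a rough, purely hypothetical sketch of what checking an upload against shared signals could look like. Lantern’s actual systems, data formats, and APIs have not been published, so every name and value below is illustrative only, and real deployments use far more sophisticated matching (such as perceptual image hashing) than this.

```python
# Hypothetical sketch only: Lantern's real API and data formats are not public,
# so every name, structure, and value here is illustrative.
import hashlib

# Signal types the announcement mentions: content hashes, usernames/emails, keywords.
shared_signals = {
    "hashes": set(),                         # hashes of known violating media
    "usernames": {"flagged_example_user"},   # accounts flagged by a partner platform
    "keywords": {"example flagged phrase"},  # OCSEA-associated keywords
}

def check_upload(file_bytes: bytes, uploader: str, caption: str) -> list[str]:
    """Return the signal types an upload matches.

    A match prompts human investigation; per the announcement, signals
    don't strictly prove abuse on their own.
    """
    hits = []
    if hashlib.sha256(file_bytes).hexdigest() in shared_signals["hashes"]:
        hits.append("hash")
    if uploader in shared_signals["usernames"]:
        hits.append("username")
    if any(phrase in caption.lower() for phrase in shared_signals["keywords"]):
        hits.append("keyword")
    return hits
```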

A visualization showing how Lantern works. Image: The Tech Coalition

Meta wrote in a blog post announcing its participation in the program that, during Lantern’s pilot phase, it used information shared by one of the program’s partners, Mega, to remove “over 10,000 violating Facebook Profiles, Pages and Instagram accounts” and report them to the National Center for Missing and Exploited Children.

The coalition’s announcement also quotes John Redgrave, Discord’s trust and safety head, who says, “Discord has also acted on data points shared with us through the program, which has assisted in many internal investigations.”

The companies participating in Lantern so far include Discord, Google, Mega, Meta, Quora, Roblox, Snap, and Twitch. Members of the coalition have been developing Lantern for the last two years, and the group says that besides creating technical solutions, it had to put the program through “eligibility vetting” and ensure it jibes with legal and regulatory requirements and is “ethically compliant.”


One of the big challenges for programs like this is making sure they’re effective without creating new problems of their own. In a 2021 incident, a father was investigated by police after Google flagged him for CSAM over pictures of his kid’s groin infection. Several groups warned that similar issues could arise with Apple’s now-canceled automated iCloud photo library CSAM-scanning feature.

The coalition will oversee Lantern and says it’s responsible for making clear guidelines and rules for data sharing. As part of the program, companies must complete mandatory training and routine check-ins, and the group will review its policies and practices regularly.
