The National Center for Missing and Exploited Children launched a tool Monday that allows young people to remove explicit images of themselves that appear online, or block such photos from being shared.
The platform, known as Take It Down, allows young adults from anywhere in the world to submit an anonymous report about explicit or intimate images of themselves posted on certain online spaces. Young adults who are over 18 but appear in imagery taken when they were underage can also submit a report to have the images removed from certain platforms, according to the center, a nonprofit.
The new initiative can’t remove photos everywhere online, but can block or delete them across participating platforms and social-media companies, the center said.
Meta Platforms Inc., the social-media company that owns Facebook, Instagram and WhatsApp, provided the initial funding to build the infrastructure for Take It Down, the company said.
Facebook and Instagram are also among the participating tech apps, a spokeswoman for Meta said.
MindGeek, the parent company of Pornhub, has signed on to participate in Take It Down, as have OnlyFans and Yubo, the center said. The nonprofit said it hopes other platforms and social-media companies will also sign on.
Take It Down works by assigning a unique digital fingerprint, known as a hash value, to specific images or videos. Young adults can make a report on their own behalf, or a parent or guardian can submit a case on behalf of a minor, according to the center.
Once a report is made and an image is assigned a hash value, that code is sent to participating tech platforms, which use it to detect and remove the imagery from their public or unencrypted websites and apps.
The hash values are assigned without the image or video leaving the victim’s phone or computer, or having anyone view it, the center said.
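The center hasn’t published the exact fingerprinting algorithm, but the general idea can be shown with a simple “average hash,” a basic perceptual hash computed entirely on the user’s device. The sketch below is illustrative only; the function name average_hash and the file photo.jpg are assumptions, not part of Take It Down.

```python
# Illustrative sketch only: Take It Down has not published its algorithm.
# An "average hash" stands in here for whatever perceptual hash is used;
# the key point is that only the fingerprint leaves the device, not the image.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> str:
    """Return a 64-bit perceptual fingerprint of an image as a hex string."""
    img = Image.open(path).convert("L").resize((size, size))  # tiny grayscale copy
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    # Pixels brighter than the average become 1 bits; the rest become 0 bits.
    bits = "".join("1" if p > avg else "0" for p in pixels)
    return f"{int(bits, 2):016x}"

# Only this short hex string would be submitted in a report, never the image.
print(average_hash("photo.jpg"))
```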
Reports can also be submitted by those who fear an explicit or private image or video might be shared online, even if it hasn’t appeared yet, according to the center.
In that case, the hash value works to block the imagery from appearing online without consent.
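On the participating platforms’ side, blocking can then reduce to a set-membership check against reported fingerprints at upload time. The sketch below is hypothetical: REPORTED_HASHES, should_block and the sample hash value are invented for illustration, and it reuses average_hash from the sketch above.

```python
# Hypothetical platform-side check, reusing average_hash() defined above.
# Hashes received from Take It Down (example value, not a real report):
REPORTED_HASHES = {"ffc3c3e7e7c3c3ff"}

def should_block(upload_path: str) -> bool:
    """Fingerprint an incoming upload and block it if it matches a report."""
    return average_hash(upload_path) in REPORTED_HASHES

if should_block("incoming_upload.jpg"):
    print("Upload blocked: matches a reported image.")
```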
The center also said it hopes the new platform will prevent an online crime known as sextortion, which occurs when a person threatens to post intimate images or share more explicit content if a victim doesn’t provide money, sexual contact or more illicit imagery.
The nonprofit said it carried out a soft launch of the platform at the end of December and has already received 200 submissions.
The center has also launched a public-service announcement to run on platforms to raise awareness among teens and young adults about their options.
Even though the platform isn’t foolproof—in some cases an altered image might get around the program—the center said it offers victims a new way to fight back against unwanted exposure.
“Having explicit content online can be scary and very traumatizing, especially for young people,” said Gavin Portnoy, a spokesman for the center. “We cannot go back and change what happened, but we can help you move forward.”
Take It Down mirrors a similar platform that was launched for adults in 2021. That initiative, known as StopNCII, is meant to prevent the nonconsensual sharing of intimate images among those over 18; it was led by a U.K.-based charity, the South West Grid for Learning, and funded and developed by Meta.
StopNCII also uses a unique hash value to identify unwanted imagery online.
In December 2022, TikTok and Bumble Inc. joined the initiative to halt what is sometimes called revenge porn, in which nude, partially nude or sexually explicit content is shared online without an adult’s consent, the nonprofit said.
In its first year, StopNCII.org helped 12,000 adults create cases to stop images or videos from being shared online without consent, according to South West Grid.
The National Center for Missing and Exploited Children, as well as Meta and cybersecurity experts, wanted to create a separate platform for teens and young adults because of concerns about shared intimate images of minors, a Meta spokeswoman said.
Take It Down is being promoted across Meta’s platforms and integrated into Facebook and Instagram to make it easy for users to report potentially violating content, the tech company said. Both apps already have a way to report inappropriate content.
Meta said it has developed more than 30 tools to support the safety of teens and families across its apps, including defaulting teen users into the most private settings on Facebook and Instagram and educating teens about the potential harms of taking intimate or explicit photos and videos.
Write to Ginger Adams Otis at Ginger.AdamsOtis@wsj.com