Meta launches new tools to safeguard young people from sextortion and online intimate image abuse


Mia Garlick: “We are focused on doing everything we can to stop these horrific scams.”

Meta has launched new tools to help prevent sextortion and online intimate image abuse on Instagram, which it says will make it more difficult for potential scammers and criminals to find and interact with young people. 

The tech giant is testing new ways of helping people detect potential sextortion scams, including blurring nudity in images and restricting messaging access to teen accounts.

Mia Garlick, Meta’s regional policy director, said the new measures will help protect young people from scammers.

“We are focused on doing everything we can to stop these horrific scams. We will continue to invest in tools and partnerships to support young people to know they can say no to sharing anything that makes them uncomfortable and to provide resources should they find themselves in this situation.”

She added Meta will continue to work with the community and local law enforcement, including the Australian Federal Police, the Office of the eSafety Commissioner, the Australian Centre to Counter Child Exploitation, and local youth online safety partners, to remind young people of the dangers of sending online images of a sexual nature on Meta's apps and across the internet.

Nudity protection in DMs

Among the new tools that will undergo testing is a new nudity protection feature on Instagram, which will blur images detected as containing nudity in DMs. Meta said this tool will also protect users from scammers who may send nude images to trick people into sending their own images in return.

Meta noted that when the feature is turned on, people sending images containing nudity will see a message reminding them to be cautious when sending sensitive images, and that they can unsend the images if they change their mind.

Anyone who tries to forward a nude image they have received will see a message encouraging them to reconsider. If an image containing nudity is received, it will automatically be blurred under a warning screen, meaning the recipient isn't confronted with a nude image, and they can choose whether or not to view it.

The social media platform will also begin testing a message encouraging people not to feel pressure to respond, and providing options of blocking the sender and reporting the chat. People will also be directed to safety tips, developed with guidance from experts, about the potential risks involved.

Preventing potential scammers from connecting with teens

Instagram is also developing technology to help identify accounts potentially engaging in sextortion scams, based on a range of signals.

The tech giant noted that while such signals are not necessarily evidence that an account has broken the platform's rules, taking precautionary steps to prevent these accounts from finding and interacting with teen accounts is critical.

Any message requests from potential sextortion accounts will go straight to the recipient’s hidden requests folder, meaning they won’t be notified of the message and never have to see it.

For those already chatting with potential scam or sextortion accounts, Safety Notices will be shown to encourage them to report any threats.

Instagram is also testing removing the "Message" option on a teen's profile when viewed by potential sextortion accounts, even if they're already connected. The platform will also start testing hiding teens from these accounts in follower and like lists, and making it harder for these accounts to find teen accounts in Search results.

New education resources

Meta will also test new pop-up messages for those who may have interacted with an account removed for sextortion. The message will direct them to expert-backed resources, including Instagram's Stop Sextortion Hub, support helplines, the option to reach out to a friend for those over 18, and Take It Down for those under 18.

New child safety helplines

Instagram is also testing new child safety helplines in the in-app reporting flows, which will allow teens to report relevant issues such as nudity, threats to share private images, or sexual exploitation or solicitation. They will be directed to local child safety helplines where available. 

Meta's new tools on Instagram build on its partnership last year with the Australian Federal Police-led Australian Centre to Counter Child Exploitation (ACCCE), Kids Helpline, and US-based organisation NoFiltr.

The organisations launched a community service announcement to inform young people about the dangers of online sextortion.

Meta also launched Take It Down, a global platform that lets young people take back control of their intimate images and helps prevent them being shared online, taking power away from scammers.
