The end of unsolicited d*** photos? Instagram is making a big change to protect users from receiving unwanted nude photos in their direct messages
Instagram is finally offering its users protection from unsolicited d*** photos, while urging those who send them to unsend them before it is too late.
The Meta-owned app is releasing a tool called “nudity protection” that blurs nude photos received via direct messages (DMs).
It uses AI to automatically detect male or female genitals in photos sent via DM and blurs them, similar to technology that already exists in dating apps.
However, the recipient can still tap the photo to view it in full, even if they are only 13 years old – a safeguard a children’s charity has called ‘completely inadequate’.
Instagram requires users to be at least 13 years old to create an account – a threshold experts and the public alike have already criticized as far too young.
Instagram finally offers users protection against unsolicited d*** photos and ‘flap snaps’ in direct messages (file photo)
In a statement, Meta said Instagram DMs are “overwhelmingly” used harmlessly to send messages and photos to friends and family.
However, sending an unsolicited photo of genitals amounts to ‘intimate image abuse’.
Additionally, “sextortion scammers” who receive intimate images may threaten to share them unless a ransom is paid by the hapless sender.
According to a study earlier this year, 100 children a day fall victim to sextortion scams on social media, including one 16-year-old boy who committed suicide.
“To address this issue, we will soon begin testing our new nudity protection feature in Instagram DMs. This feature blurs images found to contain nudity and encourages people to think twice before sending nudes,” says Meta.
According to Meta, which is led by billionaire Mark Zuckerberg, the new tool will be rolled out in the coming months.
Nudity protection will be enabled by default for users 13 to 17 years old worldwide, but Instagram will show users 18 and older a notification encouraging them to enable this feature as well.
When nudity protection is active and someone receives a nude photo, the image is automatically hidden behind a heavy blur and a warning that the ‘photo may contain nudity’.
The recipient can then choose to tap ‘view photo’ to see the unblurred image.
When someone receives an image containing nudity, it will automatically be blurred under a warning screen. This means the recipient is not confronted with a nude image and can choose whether or not they want to view it. Instagram will also show them a message encouraging them not to feel pressured to respond
‘Ask for help’: Instagram presents a series of safety tips for users who receive or send a nude photo. The sender could potentially be a victim of a sextortion scam
Then Instagram shows the recipient a pop-up message telling them not to “feel pressured to respond” to the person who sent the nude.
They can also tap to block the sender and report the chat, or tap another button to see more safety tips.
And if they try to forward a nude photo they’ve received, they’ll see a message encouraging them to reconsider.
For the person sending the nude, the process will be different.
Thanks to the automatic AI nudity detection technology, the photo will also be blurred for them in the chat, but the sender will receive a different pop-up message.
The message reads: ‘Be careful when sharing sensitive photos… others could screenshot or forward your photos without you knowing.’
They can also unsend the nude photo if they change their mind, but they are warned that ‘there is a chance that others have already seen the photo’.
They will also have the opportunity to see the safety tips, which were developed with guidance from experts, Meta said.
Instagram DMs use end-to-end encryption, which ensures that only the two participants in the chat can see messages – making it easier for pedophiles to go undetected by law enforcement authorities, according to some experts.
This is what users who have sent a nude photo will see when nudity protection is enabled. A message tells them to be careful when sending ‘sensitive photos’ and lets them unsend the photo if they change their mind – but ‘there’s a chance others have already seen the photo’
Anyone who tries to forward a received nude photo will see a message encouraging them to reconsider
UK children’s charity the NSPCC said end-to-end encryption on Instagram and other Meta platforms, such as WhatsApp and Messenger, means the new tool is ‘wholly inadequate’.
“Last year, more than 33,000 child sexual abuse crimes were recorded by British police, more than a quarter of which occurred through Meta’s services,” said Rani Govender, senior policy officer at NSPCC.
‘Meta has long argued that disrupting child sexual abuse in end-to-end encrypted environments would weaken privacy, but these measures show that a balance can be struck between security and privacy.
‘However, the new measures will be completely inadequate to protect children from harm and must go much further to tackle child abuse.
“It is now inconceivable that Meta will continue down the path of deliberately losing the ability to identify and disrupt abuse through their decision to implement end-to-end encryption without safeguards in place for children.”
‘A true sign of their commitment to child protection would be to pause end-to-end encryption until they can share with Ofcom and the public their plans to use technology to stop the prolific child sexual abuse that takes place on their platforms.’
Instagram has a strict policy against posting nude photos in the main part of the app – the photo feed – although this policy does not appear to apply to DMs.
It includes a ban on photos and videos of female nipples, but makes exceptions for women who are “actively breastfeeding.”
Unfortunately, several OnlyFans stars have exploited this loophole, posting videos in which they hold dolls against their bare breasts and pretend to breastfeed.
Users are slamming Instagram for failing to spot the problem, with one saying it’s “a complete mockery of mothers everywhere.”
Earlier this year, Instagram finally began hiding posts that could cause serious harm to children, including posts related to suicide, self-harm and eating disorders.
It follows serious concerns about the way teenagers are affected by social apps, highlighted by the case of 14-year-old Instagram user Molly Russell, who committed suicide in 2017.