Meta has confirmed that it is developing a way to let users automatically block nude photos in Instagram DMs. The feature could help shield users, particularly women, from the unsolicited nude photos to which many are subjected.
Alessandro Paluzzi, a developer and reverse engineer, discovered evidence of the feature.
Paluzzi posted a screengrab of the screen inviting users to activate the feature, titled "Nudity protection." According to the screen's description, photos that might contain nudity in chats are covered using technology on the user's device, and Instagram can't access the photos.
Instagram's owner Meta confirmed to The Verge that the feature is in development.
The optional user controls, which are still in the early stages of development, are intended to help people avoid unwanted photos and other harmful messages.
These capabilities are similar to the Hidden Words feature, which allows users to automatically filter direct messages containing offensive content.
The company says user privacy will be protected and has promised to share further information in the coming weeks.
According to Meta, the technology will not allow the company to view the actual messages or share them with third parties. "We are working closely with experts to ensure these new features protect people's privacy, while giving them control over the messages they receive," a Meta spokesperson said.
Abusive messages, including unsolicited nude photos in DMs, are a growing concern, especially for women. A survey last year found that 43 percent of American women had experienced online abuse, with a third of them reporting sexual harassment.
Women under the age of 35 are far more likely to say they have been sexually harassed on the internet; 11 percent of men under 35 say the same.
According to a separate study from the Center for Countering Digital Hate, roughly 6 percent of Instagram DMs received by women in the public eye were abusive, and the social media platform failed to respond to 90 percent of complaints.
Background image: Ray Mallick/Unsplash