Meta said on Thursday that Instagram will test features that blur direct messages containing nudity, an effort to safeguard teens and keep potential scammers from reaching them as it tries to allay concerns over harmful content on its apps.
The tech giant is under mounting pressure in the United States and Europe over allegations that its apps are addictive and have fueled mental health problems among young people.
Meta said the protection feature for Instagram’s direct messages would use on-device machine learning to analyze whether an image sent through the service contains nudity. The feature will be turned on by default for users under 18 and Meta will notify adults to encourage them to turn it on.
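As described, the nudity check runs on the device itself and the blur defaults to on for minors. Below is a minimal sketch of that gating logic; the classifier call, the threshold, and all names here are illustrative assumptions, not Meta's actual code:

```python
# Hypothetical sketch of the default-on blur logic described above.
# The classifier and threshold are illustrative stand-ins, not Meta's
# actual on-device model or settings.

from dataclasses import dataclass
from typing import Callable, Optional

NUDITY_THRESHOLD = 0.8  # assumed confidence cutoff for blurring


@dataclass
class User:
    age: int
    # None means the user never changed the setting themselves
    nudity_filter_enabled: Optional[bool] = None


def filter_is_active(user: User) -> bool:
    """Minors get the filter on by default; adults must opt in."""
    if user.nudity_filter_enabled is not None:
        return user.nudity_filter_enabled
    return user.age < 18  # default-on only for under-18 accounts


def should_blur(
    image_bytes: bytes,
    user: User,
    classify: Callable[[bytes], float],
) -> bool:
    """Run the on-device classifier and blur when it flags nudity.

    `classify` stands in for an on-device ML model that returns a
    probability that the image contains nudity; the image never has
    to leave the device for this check.
    """
    if not filter_is_active(user):
        return False
    return classify(image_bytes) >= NUDITY_THRESHOLD
```

The key design point the article highlights is that the analysis happens on-device, so the check can run even in end-to-end encrypted chats without the image being sent to Meta's servers.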
Meta also said it was developing technology to help identify accounts potentially engaged in sextortion scams, and that it was testing new pop-up warnings for users who may have interacted with such accounts.