Meta, the owner of Instagram, is introducing a new safety feature to help protect teenagers from seeing unwanted nude photos in their direct messages.
The feature will automatically blur out nude images if the recipient is identified as a teenager.
Meta said this safety measure is aimed at addressing three main problems.
First, teenagers sometimes receive nude photos that they didn’t ask for and don’t want to see.
Second, sending nude photos of teenagers can be against the law, even when the teenagers send the photos themselves. And third, scammers sometimes trick teenagers, especially boys, into sending explicit photos and then blackmail them.
The tech giant is under mounting pressure in the United States and Europe over allegations that its apps are addictive and have fueled mental health problems among young people.
Meta said the protection feature for Instagram’s direct messages would use on-device machine learning to analyze whether an image sent through the service contains nudity.
The company will use this technology to detect nude photos in direct messages. If the recipient is flagged as a teenager based on their birthdate, the photo will be blurred and a warning message will appear.
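In outline, the feature as described combines a local image classifier with an age check. The sketch below is a hypothetical illustration of that logic only, not Meta's implementation: the function names, data fields, and the under-18 threshold are assumptions made for the example.

```python
# A minimal sketch of the described flow, not Meta's actual code: the
# classifier, the age threshold, and the data shapes are illustrative assumptions.
from dataclasses import dataclass
from datetime import date
from typing import Callable


@dataclass
class IncomingImage:
    pixels: bytes               # raw image data received in the DM
    recipient_birthdate: date   # birthdate the recipient gave at sign-up


def is_minor(birthdate: date, today: date | None = None) -> bool:
    """True if the recipient is under 18 on the given day (assumed threshold)."""
    today = today or date.today()
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    return age < 18


def handle_incoming_image(
    image: IncomingImage,
    detects_nudity: Callable[[bytes], bool],
) -> dict:
    """Blur the image and attach a warning when nudity is sent to a minor.

    `detects_nudity` stands in for the on-device machine-learning model
    mentioned in the announcement; a real classifier would run locally
    on the recipient's phone rather than on Meta's servers.
    """
    if detects_nudity(image.pixels) and is_minor(image.recipient_birthdate):
        return {"blurred": True, "warning": "This photo may contain nudity."}
    return {"blurred": False, "warning": None}


# Example: a stand-in classifier that flags every image, just to show the flow.
if __name__ == "__main__":
    msg = IncomingImage(pixels=b"...", recipient_birthdate=date(2010, 5, 1))
    print(handle_incoming_image(msg, detects_nudity=lambda _: True))
```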
Attorneys general of 33 U.S. states, including California and New York, sued the company in October, saying it repeatedly misled the public about the dangers of its platforms.
In Europe, the European Commission has sought information on how Meta protects children from illegal and harmful content.