Instagram is set to roll out a new privacy feature aimed at safeguarding its younger users and curbing the spread of malicious content on its platform. The move comes amid mounting pressure on Meta, Instagram's parent company, over concerns about addictive behavior and mental health harm among young users in the US and Europe.
Privacy Measure to Protect Teens:
The upcoming feature will use machine learning to detect and automatically blur nude images sent in private messages. By screening images in this way, Instagram aims to limit opportunities for scammers and other malicious actors to target vulnerable users, particularly teenagers.
End-to-End Encryption and Enhanced Safety Measures:
Notably, the privacy feature will run locally on users' devices: images are analyzed on the device itself, and Meta does not access the content unless it is reported. This design extends the protection even to end-to-end encrypted chats, strengthening privacy and security for all users. Furthermore, Meta plans to introduce encryption for private messaging on Instagram, bringing it in line with its other platforms, Messenger and WhatsApp.
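Meta has not shared implementation details, but the general idea of on-device screening can be sketched in a few lines. The snippet below is purely illustrative: nudity_score, prepare_for_display, and the 0.8 threshold are hypothetical stand-ins for whatever local model and rules Instagram actually ships; only Pillow's GaussianBlur filter is a real library call.

```python
from PIL import Image, ImageFilter  # Pillow, used here only for the blur step


def nudity_score(image: Image.Image) -> float:
    """Stand-in for an on-device classifier.

    A real implementation would run a small ML model shipped with the
    app entirely on the device; this placeholder just returns 0.0 so
    the sketch stays runnable.
    """
    return 0.0


def prepare_for_display(path: str, threshold: float = 0.8) -> Image.Image:
    """Score an incoming image locally and blur it if it is flagged.

    Both the scoring and the blurring happen on the device, before the
    image is rendered, so nothing needs to be sent to a server even in
    an end-to-end encrypted chat.
    """
    image = Image.open(path)
    if nudity_score(image) >= threshold:
        # Heavy Gaussian blur so the preview is unrecognizable; the
        # recipient could still choose to reveal the original.
        return image.filter(ImageFilter.GaussianBlur(radius=40))
    return image
```

Keeping both the scoring and the blurring on the device is what would allow such a feature to work inside end-to-end encrypted chats without weakening the encryption itself.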
Combating Sextortion Schemes:
In addition to blurring nude images, Meta is developing technology to identify accounts suspected of involvement in sextortion schemes. Users who interact with such accounts will receive warnings, helping them stay vigilant and protect themselves from potential harm, notes NIX Solutions.
As Instagram takes proactive steps to enhance safety and privacy, we’ll keep you updated on any further developments. Stay tuned for more information on how these measures will impact user experience and online safety.