
Meta, the parent company of Instagram, has announced that it will test new features to protect teens from potentially harmful content and scammers on the platform. These include a feature that blurs messages containing nudity, using on-device machine learning to determine whether an image contains nudity. The protection will be turned on by default for users under 18, and adults will be encouraged to enable it as well.
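To make the mechanics concrete, the following is a minimal Python sketch of how such an on-device blur flow could work. The classifier function, score threshold, and setting names here are illustrative assumptions; Meta has not published its actual model or parameters.

```python
# Sketch of an on-device "blur nudity in DMs" flow, per the description above.
# `looks_like_nudity` is a hypothetical stand-in for Meta's unpublished
# on-device classifier; NUDITY_THRESHOLD and the setting defaults are assumptions.
from dataclasses import dataclass
from PIL import Image, ImageFilter

NUDITY_THRESHOLD = 0.8  # assumed confidence cutoff


def looks_like_nudity(image: Image.Image) -> float:
    """Placeholder for an on-device ML model returning a nudity-likelihood
    score in [0, 1]. A real app would run a compiled mobile model here."""
    return 0.0  # stub: always scores "safe" in this sketch


@dataclass
class User:
    age: int
    blur_nudity: bool | None = None  # None = user never touched the setting

    @property
    def protection_on(self) -> bool:
        # Default on for under-18s; adults must opt in (per the article).
        if self.blur_nudity is None:
            return self.age < 18
        return self.blur_nudity


def prepare_incoming_image(image: Image.Image, recipient: User) -> Image.Image:
    """Blur the image before display if the recipient's protection is on and
    the local model flags likely nudity. Because analysis runs on the device,
    the image never has to be sent to a server for scanning."""
    if recipient.protection_on and looks_like_nudity(image) >= NUDITY_THRESHOLD:
        return image.filter(ImageFilter.GaussianBlur(radius=24))
    return image
```

The key design point the article highlights is that the classification happens locally, which is why the feature can work even if message content is later end-to-end encrypted.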

Unlike messages on Meta’s Messenger and WhatsApp apps, direct messages on Instagram are not encrypted, although the company has said it plans to roll out encryption for the service in the future. Meta is also developing technology to identify accounts that may be engaging in sextortion scams and is testing new pop-up warnings for users who may have interacted with such accounts.

In January, Meta announced that it would hide more content from teens on Facebook and Instagram, making it more difficult for them to encounter sensitive material such as suicide, self-harm, and eating disorders. The move was aimed at protecting the mental health and well-being of young users. Even so, the company has faced criticism and legal action over allegations that its apps are addictive and contribute to mental health problems among young people.

That legal action includes a lawsuit filed in October by attorneys general from 33 US states, including California and New York, who accused Meta of misleading the public about the dangers of its platforms. In Europe, the European Commission has requested information on how Meta protects children from illegal and harmful content on its apps. The pressure on Meta to address these concerns and make its platforms safer for young users continues to grow.

By introducing features that blur nudity in messages and warn teens about potential scammers, Meta is taking steps to address concerns about harmful content on its platforms. Analyzing images for nudity on the device itself, rather than on Meta’s servers, is a proactive way to safeguard young users from inappropriate content. As the company confronts legal challenges and regulatory scrutiny, it is working to protect the well-being of its users, particularly teenagers who may be vulnerable to harmful content and scams.
