
A recent study conducted by ECPAT International, Eurochild, and Terre des Hommes Netherlands revealed that children often rely on their own instincts when navigating the digital world and facing threats such as explicit content or online grooming. The study involved focus group discussions with 483 children from 15 countries, including ten EU member states, who expressed a preference for keeping their online activities private and reported struggling to communicate with adults about the risks they encounter online. The biggest threat identified across all 15 countries studied was sexual abuse and exploitation online, such as grooming, self-generated sexual material, and live-streamed child sexual abuse.

The findings of the study shed light on the fact that children feel alone in ensuring their safety online and often lack the necessary tools and information to effectively navigate the online world. The responsibility to protect themselves from online harms, such as cyberbullying, violent content, and negative mental health experiences, falls largely on the children themselves. The report underscores the urgent need for EU countries to establish regulatory frameworks that place the burden of responsibility on online service providers, rather than on children. The planned new law in the EU to crack down on child exploitation online has faced opposition from digital privacy advocates, but the NGOs behind the study emphasize the importance of making the internet safer for children.

There is mounting concern about the use of AI to generate deep-fake child sexual abuse material, with a significant amount of such content being created by adolescents themselves. Children need to be educated about the dangers of disseminating and creating abusive content online. Digital platforms play a key role in fighting illegal content that jeopardizes children’s safety online, and the study’s findings have prompted NGOs to call on platforms to step up their efforts. Snapchat, which has introduced several safeguards to protect its teen users, including limiting contact settings and turning off location-sharing by default, emphasized its commitment to safeguarding the privacy and safety of its users, especially minors.

Snapchat has an age requirement of 13 years old and additional privacy settings for users aged 13-17, but has faced criticism for failing to keep underage users off its platform. The company has expressed its support for the planned EU law to tackle child sexual abuse material, citing the importance of proactively scanning for known CSAM. Snapchat uses established technologies such as PhotoDNA to detect abusive images. The platform’s ‘My AI’ chatbot, powered by OpenAI’s ChatGPT, has raised concerns about potentially harmful or misleading content. As the issue of online child safety becomes increasingly urgent, there is a need for collaboration between online service providers, policymakers, and NGOs to protect children from online exploitation and abuse.

© 2024 Globe Echo. All Rights Reserved.