
Google has recently come under fire for allowing the promotion and distribution of AI-generated deepfake pornographic content on its platforms, particularly YouTube. The platform has been found to host numerous videos advertising AI apps and websites that can remove clothing from images of women to create artificial nudes. Some of these videos even provided tutorials on using the apps, and high school students have used them to generate nudes of their female classmates. Victims of this exploitation have faced bullying, shaming, and mental health harms as a result.

In a particularly disturbing case, a child psychiatrist was sentenced to 40 years in prison for using artificial intelligence to create child sexual abuse images and for sexually exploiting a minor. Among the offenses, he had used AI to remove the clothes from images of his high school girlfriend taken when she was underage, causing the victim immense distress. Signy Arnason of the Canadian Centre for Child Protection expressed shock at Google’s facilitation of these harmful apps and websites, highlighting how easily they can be found on YouTube and in Google search results. Schools have also reported an increase in cases of students being victimized by AI-generated nudes.

Beyond YouTube, Google’s AI nudifier issues extend to Android apps as well. Forbes identified three such apps on the Google Play store that offer to remove clothes from photos, one of which boasted over 10 million downloads. Additionally, the Google Ads Transparency Center displayed 27 ads promoting “deep nude” services, some of them disturbingly named. After Forbes flagged these violations, Google removed the ads and took down multiple YouTube channels and videos associated with the promotion of AI nudifying tools.

The National Center on Sexual Exploitation (NCOSE) raised concerns about how Google profits from these apps, criticizing the company for accepting advertising money from their developers and for hosting them on the Google Play store. NCOSE pointed to Apple as a contrast, noting that the tech giant had swiftly removed nudifier apps from its platform when alerted. The rise of AI-generated deepfake porn, including content involving children, has raised significant alarm, with the National Center for Missing and Exploited Children reporting a surge in reports of AI-generated child sexual abuse material over the past year.

Victims of AI-generated deepfake porn have spoken out about the ongoing trauma and fear caused by the exploitation of their images. In the case of the convicted child psychiatrist, multiple victims testified in court about the lasting impact of having their childhood photos manipulated using AI. The fear of potential exploitation and distribution of these images by others has left these victims feeling vulnerable and anxious. Calls have been made for Google to implement responsible practices and policies to combat the spread of image-based sexual abuse and to prioritize the protection of vulnerable individuals, particularly children, from the harms of AI-generated deepfake pornography.
