Nudify Apps Bypass Google and Apple Moderation Systems

Recent research has highlighted significant concerns regarding apps that allow users to create non-consensual intimate imagery. The Tech Transparency Project has revealed that both Apple and Google continue to host a variety of “nudify” applications in their respective digital marketplaces, despite their stated policies against such software.

Nudify Apps in Major App Stores

According to the report published earlier this week, over 55 nudify apps were identified on Google Play, while Apple’s App Store listed 47 similar applications. These apps have accumulated a staggering total of 705 million downloads, generating approximately $117 million in revenue.

Global Backlash and Company Responses

The study gained traction after a public outcry over Grok, Elon Musk’s AI chatbot, which was found to be used to digitally strip clothing from images of real people. In response, UK regulators are considering a ban on the tool.

  • Katie Paul, the director of the Tech Transparency Project, commented, “Grok is really the tip of the iceberg.”
  • Both tech giants have acknowledged ongoing reviews and say they have suspended several apps under their guidelines.

Google said it takes violations seriously and has removed infringing apps from its platform. Apple, by contrast, has not responded to inquiries about its policies.

Ad Revenue and Misleading Listings

The investigation also uncovered troubling practices around app visibility and advertising. Searching for “nudify” in Apple’s App Store surfaced Grok prominently, and Apple also displayed sponsored ads for similar apps, earning additional revenue from them.

Testing Non-Consensual Imagery

The Tech Transparency Project tested the apps using AI-generated images of models to illustrate how they work, and was able to produce images depicting both partial and full nudity. In one case, researchers tested face-swapping features, demonstrating how easily unauthorized content can be generated.

Concerns for User Safety and Vulnerabilities

This latest report follows a previous investigation that found several apps tied to entities under U.S. sanctions for human rights abuses and involvement in armed conflicts. In that instance, the study found:

  • 52 apps tied to sanctioned entities within Apple’s ecosystem.
  • 18 similar apps on Google Play Store.

Katie Paul voiced concern about both companies overlooking potentially harmful applications, stressing that many of the nudify apps were marketed toward children, which raises serious ethical questions.

Data Privacy Risks

Another critical issue is user data privacy, particularly in light of Chinese laws that can compel companies to provide data to the government. Paul warned that non-consensual images could end up in the hands of foreign governments, compounding privacy and security risks.

Conclusion

The continued proliferation of nudify apps in Apple’s and Google’s app stores points to a gap between stated policies and actual enforcement. Both companies promote their platforms as trusted and safe, yet the evidence indicates oversight is far from rigorous. As investigations continue, the priority remains ensuring user safety and compliance with global standards.