Anonymous Intelligence Signal

Apple and Google App Stores Promote 'Nudify' AI Apps, Violating Own Policies and Reaching Minors

The Lab | unverified | 2026-04-16 09:33:30 | Source: Medianama

A new investigation reveals that Apple and Google are actively directing users to apps that generate non-consensual nude images, despite their own policies explicitly prohibiting such content. The Tech Transparency Project (TTP) found that the Apple App Store and Google Play Store not only host these "nudify" applications but also promote them through search results, autocomplete suggestions, and sponsored advertisements. These apps leverage artificial intelligence to digitally strip clothing from photos of real people, create pornographic videos, and offer sexually explicit AI chatbots, directly facilitating the creation of deepfake abuse material.

The scale of this ecosystem is substantial. TTP's investigation identified apps that have been downloaded 483 million times in total, generating over $122 million in lifetime revenue. Critically, the app stores themselves rated 31 of the identified apps as suitable for minors. This classification raises immediate safety concerns given the rising number of sexual deepfake abuse incidents in schools, where such easily accessible tools can be weaponized.

The findings place Apple and Google under intense scrutiny for a glaring enforcement failure. Their platforms act not as passive hosts but as active promoters of technology that violates their stated rules and enables harm. This creates significant reputational and regulatory risk for both companies as they face pressure to reconcile their public safety commitments with the commercial reality of their storefronts. The report underscores a systemic vulnerability in which policy bans are undermined by algorithmic promotion and lax age-rating processes.