Exposed: How Apple and Google’s App Stores Are Promoting Nudify Apps
Apple’s and Google’s app stores have recently come under fire not merely for hosting questionable nudifying applications but for actively promoting them. This unsettling revelation suggests that the two tech giants are not just slow to address these concerns; their own storefronts may be steering users toward these apps. A detailed investigation by the Tech Transparency Project has uncovered a troubling trend that raises serious questions about user safety and corporate responsibility.
Unveiling the Nudifying Apps
At the heart of this investigation is the discovery of nudify apps—AI-driven tools that can strip clothing from photos or generate explicit content using someone’s likeness. Shockingly, many of these apps were rated suitable for minors, highlighting a significant oversight in content moderation.
How Apple and Google Are Inadvertently Promoting These Apps
When the Tech Transparency Project conducted searches using terms like “nudify” and “deepfake” on both app stores, the results were alarming. Nearly 40% of the top results for these terms linked to apps that could render the subjects of photos nude or scantily clad. The problem isn’t limited to search results: both platforms have run paid advertisements for these apps, with some openly featuring pornographic content.
Additionally, Apple’s autocomplete feature actively funneled users toward this content. When a user typed “AI NS” into the App Store search field, the suggested completions surfaced nudifying apps prominently among the top results. And despite Apple’s policies against advertising adult content, three App Store searches still returned nudify app ads as the very first results.
Why This Issue Is More Pressing Than You Think
The ramifications of these findings are stark. The nudify apps identified have been downloaded over 483 million times, raking in more than $122 million in revenue, which, naturally, benefits Apple and Google through their revenue-sharing models. This raises concerns that financial incentives may be contributing to a lack of stringent enforcement against these platforms.
Following the identification of these problematic apps, Apple removed 15, while Google suspended several others. However, both companies have yet to clarify how these apps passed their review processes or why their age ratings allowed minors to access them.
The Growing Pressure to Take Action
With increasing scrutiny and the emergence of laws targeting explicit deepfake content—such as the UK government’s new proposals and recent criminal convictions in the US—the pressure on Apple and Google to take responsibility is mounting.
Apple’s enforcement record has already drawn criticism. A recent report disclosed that the company threatened to remove the controversial Grok app over its sexualized deepfakes but ultimately allowed it to remain. This incident, along with others, suggests that the tech giants are running out of excuses for turning a blind eye.
Conclusion
As the conversation around digital responsibility intensifies, both Apple and Google will need to reckon with their roles in perpetuating the presence of such harmful applications. It’s crucial that users remain vigilant and continue advocating for safer digital environments.
If you value an online space where content is curated responsibly, consider discussing your thoughts on this issue with fellow users and support initiatives aimed at enhancing app store safety. Together, we can inspire change and foster a safer digital landscape for everyone.

