Apple's App Store search is reportedly helping people find "nudify" apps, according to a new investigation from the Tech Transparency Project. The report says autocomplete, promoted results, and search ranking are still surfacing image-manipulation tools that can generate deepfake nude images of women.

That’s the awkward part for Apple. The company likes to frame App Store review as a hard gatekeeper, but search is where moderation often goes to die: if the listing survives, the ranking system can still hand it a megaphone. Google Play was also named in the report, which suggests this is less about one storefront’s bad day and more about a familiar app-store failure mode that keeps showing up in different clothes.

App Store search terms that surfaced nudify apps

TTP says searches for terms such as "nudify," "undress," and "deepnude" turned up apps that could render women nude or scantily clad in a surprisingly large share of the top results. It also found sponsored placements for some of those apps, which is the sort of detail that should make platform ad teams squirm: if users can buy their way into sensitive searches, moderation has a hole in it big enough to drive a policy memo through.

  • Nearly 40% of the top 10 results for those searches could render women nude or scantily clad.
  • Some searches returned sponsored results for nudify-style apps.
  • The App Store also suggested "image to video AI NSFW" after a search beginning with "AI NS."

Apple removed most of the apps after the report

The report includes examples of how quickly the tools can cross the line. In one case, an app called FaceSwap Video by DuoFace reportedly turned an uploaded image of a clothed woman into a video using a nude body; in another, AI Face Swap swapped faces between a woman in a sweater and a topless image without restrictions. TTP says it also contacted several developers; at least one said it was using Grok for image generation, had not realized the system could produce such extreme content, and would tighten its moderation settings.

Apple did not publicly address the report, but it did remove most of the apps TTP identified. That’s the predictable cleanup phase: take the listings down, avoid the press cycle, and wait for the next batch to show up under slightly different names. App stores have been playing that game for years, and AI image tools have only made it easier to disguise the same old abuse as harmless face swapping.

The harder problem is ranking, not just review

What makes this report sting is that it points beyond individual app approval. If search suggestions and promoted placements can steer users toward sexualized deepfake tools, then the store's discovery system is doing part of the distribution work for those apps. That's a tougher fix than basic takedowns, and it raises the uncomfortable question Apple and Google would rather not answer out loud: how much harm is being created by the way their stores recommend apps, not just by what they allow in?

The likely next move is more removals, a few policy tweaks, and another round of claims that the platforms take safety seriously. The open question is whether either store will touch the incentives baked into search ads and autocomplete, because that’s where this problem keeps leaking back in.

Source: 9to5Mac
