Apple and Google Broke Their Own Rules by Promoting ‘Nudify’ Apps, Report Says

If you want an app you’ve built to be downloaded from the Apple App Store or Google Play Store, it must pass a number of criteria, including security standards.

But a new report out Wednesday says Apple and Google broke their own rules by promoting "nudify" apps that are banned under their own app store policies.

The Tech Transparency Project, a nonprofit technology watchdog, first revealed in January that the Apple and Google stores hosted more than 100 "nudify" or undressing apps. These apps exist solely to take pictures of people, usually women, and edit them to make the person appear unclothed, creating nonconsensual intimate images. Many of these apps use artificial intelligence to create deepfakes.

Apple removed some of the flagged apps at the time. But many remained available, as subsequent investigations found.

In April, TTP discovered that Apple and Google still allowed users to search for a number of disturbing keywords, including "nudity," "undress" and "deep." After examining the top 10 results for those searches in both app stores, TTP found that 40% of the apps advertised themselves as being able to depict women nude or scantily clad, according to the report.

The new report also found that Google and Apple actively promote such apps in their stores, increasing their visibility, with Google in particular generating "a carousel of ads" for some of the nudify apps identified in the investigation.

Apple and Google both have language in their policies prohibiting apps that contain "sexually explicit or sexual content" (Apple) and "suggestive situations where the subject is nude or scantily clad" (Google). And both have enforced these policies in the past, particularly against adult apps.

But Apple and Google make money from app developers by running ads and taking a cut of paid app subscriptions. Analytics firm AppMagic found that these nudify apps were downloaded 483 million times and generated more than $122 million in lifetime revenue.

"This flow of money may be the reason why these two companies are not careful when it comes to policing apps that violate their policies," TTP wrote.

After the news broke this week, Apple told Bloomberg News that it had removed 15 of the reported apps. Google said it had removed seven. Apple also said it blocked many of the search terms TTP flagged in its report. Google told CNET that Google Play does not allow apps containing sexual content and that many of the apps cited in the report had been suspended for violating its policies.

Nonconsensual intimate imagery is a growing problem, thanks in part to AI. We saw with startling clarity how AI-powered apps can be used to create this illegal and abusive content earlier this year, when Grok users made 1.4 million sexualized deepfakes in a nine-day period.

Some US senators at the time asked Apple and Google to remove Grok from their app stores, but neither company did.

We learned this week that Apple privately reached out to Grok to express its concerns about the app's abusive AI capabilities and threatened to remove it. Grok remains available in both Apple's and Google's app stores and is reportedly still capable of creating abusive AI porn, despite the company saying otherwise.
