Apple and Google Criticized for Hosting AI ‘Nudify’ Apps Amid Non-Consensual Deepfake Concerns

A recent investigation found that Apple and Google continue to host numerous applications that use artificial intelligence to remove clothing from images of people, effectively creating non-consensual, sexualized deepfakes. The findings have intensified scrutiny of how both companies enforce content policies in their app stores.

The Tech Transparency Project (TTP) conducted a comprehensive review in January, identifying 55 such ‘nudify’ apps on Google Play and 47 on the Apple App Store. Following inquiries from TTP and CNBC, Apple reported the removal of 28 offending applications and issued warnings to other developers. However, some of these apps reappeared after developers submitted revised versions.

Findings from the Tech Transparency Project

TTP’s investigation involved searching both app stores for keywords such as ‘nudify’ and ‘undress’. The organization then tested the applications with AI-generated images of clothed women and found that many could remove the clothing or superimpose the women’s faces onto nude bodies.

Katie Paul, director of TTP, emphasized the harmful nature of these tools, stating that they are “definitely designed for non-consensual sexualization of people.”

The identified applications primarily fall into two categories:

– AI generators that create nude or sexualized images based on user prompts.

– Face swap apps that place a real person’s face onto a nude body.

Collectively, these applications have amassed over 700 million downloads and generated approximately $117 million in revenue, according to data from AppMagic. Notably, both Apple and Google receive a portion of this revenue.

The ease of creating fake nude images of real individuals through these apps raises significant concerns about potential abuse, including blackmail and online harassment.

A previous CNBC report highlighted a case in Minnesota in which more than 80 women discovered that their social media photos had been used to create sexualized deepfakes. Because the images were not widely distributed, no legal action was taken.

Advancements in AI technology have made the creation of such images alarmingly quick and simple, allowing users to upload a photo and receive a fake nude image within seconds.

Responses from Apple and Google

In response to the findings, Apple stated that it had removed many of the apps identified in the report and issued warnings to other developers. Despite these actions, two of the removed apps later reappeared after their developers submitted revised versions.

Google reported suspending several applications and is currently reviewing the remaining ones. The company did not disclose the exact number of apps removed.

Both tech giants assert a commitment to user safety, and their own guidelines explicitly prohibit such content:

– Google’s policies ban apps that claim to undress people or see through clothing.

– Apple’s guidelines prohibit content that is overtly sexual or pornographic.

TTP criticized both companies for failing to keep pace with the rapid proliferation of these AI-powered applications.

Global and Political Pressure

The issue has garnered international attention, particularly after Elon Musk’s Grok AI on X generated sexualized images of women and children. This incident prompted investigations by the European Commission and other regulatory bodies. Grok acknowledged lapses in safeguards and committed to implementing corrective measures.

In the United States, a coalition of state attorneys general and three Democratic senators urged Apple and Google to take decisive action. They emphasized that the widespread creation of fake nude images violates app store policies and poses significant harm to users.

Concerns Over China-Based Applications

TTP also found that 14 of the ‘nudify’ apps originate from China-based developers, raising additional security concerns.

Paul highlighted the potential risks, noting that Chinese law grants the government access to data held by domestic companies. As a result, fake nude images created with these apps could potentially be accessed by the Chinese government.

TTP’s findings point to a broader problem with the effectiveness of Apple and Google’s app store oversight. Paul remarked that the companies’ failure to enforce their own policies “raises a lot of questions about how they can present themselves as trusted app platforms.”

The onus is now on both companies to demonstrate their ability to curb the spread of these harmful tools and protect users from potential exploitation.