News

Apple Delists AI Apps That Create Non-Consensual Nude Images

According to a 404 Media report, Apple has removed at least three apps from the App Store that let users employ AI to produce non-consensual nude pictures of people.

Apple has taken down at least three apps from its App Store that claimed to use artificial intelligence (AI) to create non-consensual nude photos, according to 404 Media. Instagram had been showing advertisements for these apps.

404 Media reported that Apple acted against these apps only after the outlet provided links to them and their advertisements, suggesting the tech giant needed outside assistance to locate apps that violated its App Store guidelines.


According to the report, 404 Media came across five advertisements while browsing Meta's Ad Library, where all advertisements running on Meta's platforms are archived. Three of the ads linked to apps on the Apple App Store, while the other two promoted web-based services offering the same capabilities.

Some of these apps were billed as "undressing" apps, using AI to remove clothing from ordinary photos of people, while others enabled face swapping onto pornographic images. One advertisement for these apps even featured a picture of Kim Kardashian alongside wording emphasizing the app's ability to remove people's clothing.

The report adds that Meta quickly removed the advertisements, while Apple initially declined to respond and instead requested further information about them. Apple took action only after the story had been published a week earlier.

This is not the first time Apple has been notified about AI-powered deepfake apps on the App Store. Several such apps were discovered on the Apple App Store and Google Play Store in 2022, but neither company took them down. Instead, both asked the apps' developers to stop promoting their features on well-known pornographic websites.


Over the past few months, undressing apps have proliferated at universities and colleges worldwide. Some of these tools are distributed as apps, while others operate as subscription-based web services.

These AI nude generators can cause serious privacy harm to the people depicted and can be used to harass and blackmail victims. By removing these apps, Apple is signaling that it won't tolerate software that lets users create non-consensual nude photos. According to the reports, however, Apple must strengthen its app review process to keep such apps from appearing in the App Store in the first place.


