Apple Delists AI Apps for Non-Consensual Nude Image Creation
Apple has taken down at least three apps from its App Store that claimed they could use artificial intelligence (AI) to create non-consensual nude photos, according to a post on 404 Media. Instagram had been showing advertisements for these apps.
404 Media reported that Apple only acted against these apps after the publication provided links to them and to their advertisements, implying that the tech giant needed outside assistance to locate apps that violated its App Store guidelines.
According to the report, 404 Media came across five advertisements while browsing Meta's Ad Library, where all advertisements running on Meta's platforms are archived. Three of these advertisements directed users to apps on the Apple App Store, while the other two promoted web-based services offering the same capabilities.
Some of these apps were billed as "undressing" apps that use AI to remove clothing from ordinary photos of people, while others allowed face swapping onto pornographic images. One of the advertisements featured a picture of Kim Kardashian alongside text touting the app's ability to digitally strip a subject's clothing.
The report adds that Meta was quick to remove the advertisements, while Apple initially declined to respond and instead asked for more information about them. Apple's takedown came a week after the original story was published.
Apple has been alerted to AI-powered deepfake apps on its App Store before. Several such apps were discovered on the Apple App Store and Google Play Store in 2022, but neither company took them down. Instead, they asked the creators of these apps to stop promoting their features on well-known pornographic websites.
Over the past few months, undressing apps have proliferated across universities and colleges worldwide. Some of these tools are distributed as apps, while others are sold as subscription services.
These AI nude generators can cause serious privacy harm to the people depicted and can be used to harass and blackmail victims. By removing these apps, Apple is making it clear that it won't tolerate software that lets users create non-consensual nude photos. According to the report, however, Apple must strengthen its app review process to keep such apps from appearing in the App Store in the first place.