Apple is finally taking action, removing at least three apps from its App Store that use artificial intelligence (AI) to generate non-consensual nude images.
According to a report by 404 Media, these apps were being advertised on Instagram, in violation of Instagram’s policies.
The report notes that Apple removed the apps only after several users reported them and shared links to the apps and their corresponding advertisements. This is one of those rare instances in which Apple relied on external assistance to identify apps violating its App Store policies.
The investigation by 404 Media revealed five such advertisements in Meta’s Ad Library, where all advertisements on the platform are archived. While two of these ads were for web-based services offering similar capabilities, three led users to apps available on the Apple App Store.
Some of these apps promised features like face swaps on adult images, while others were marketed as ‘undressing’ apps using AI to digitally remove clothing from photos.
While Meta promptly removed these advertisements upon discovery, Apple initially declined to comment when contacted and only requested further details after the story’s publication last week.
This isn’t the first time concerns have been raised about AI-powered deepfake apps on the App Store.
In 2022, similar apps were found on both the Google Play Store and the Apple App Store, but neither company took immediate action to remove them. Instead, they merely urged developers to stop advertising such capabilities on popular adult websites.
In recent months, undressing apps have proliferated in schools and colleges worldwide.
Some of these apps are distributed directly, while others offer their features through subscription services. This trend underscores the importance of vigilant oversight by tech companies to prevent the spread of harmful and unethical applications.
(With inputs from agencies)