India, April 26 -- Apple Inc. (AAPL) has taken down several AI image generator applications that were advertised as tools for creating non-consensual nude images.

A recent investigation by 404 Media revealed how companies used Instagram advertisements to promote apps capable of digitally undressing people in photos without their consent.

Some of these advertisements led users directly to Apple's App Store, where one application was marketed as an "art generator" but was specifically designed to produce non-consensual nude images. These applications offered features such as swapping faces into adult images and digitally removing clothing from photos. The investigation not only exposed the presence of these applications but also highlighted their promotion through popular advertising platforms such as Instagram.