Tech companies Apple and Google were found to have been leading users — specifically children — to apps that could effectively pornify images through artificial intelligence.
Last Wednesday, 9to5Mac reported the findings from January published by the Tech Transparency Project, which concluded both the Apple App Store and Google Play “are helping users to find apps that create deepfake nude images of women.”
The stores were even found promoting these apps and autocompleting search results for them.
About 40 percent of the top 10 apps appearing in searches for “nudify,” “undress,” and “deepnude” could “render women nude or scantily clad.”
These apps let users upload two different images, one ordinary and one sexually explicit, and generate a composite that combines elements of both, sexualizing the person from the ordinary photo.
9to5Mac reached out to the developer of one of these apps and was told the developer “had no idea it was capable of producing such extreme content.”
On Thursday, Apple responded to the outlet, saying the apps were not allowed on its store because its review guidelines prohibit sexual content.
The company said it has removed 15 apps, with others receiving notice they will be removed if they continue to be in violation.
In January, California Democratic Gov. Gavin Newsom went after social media platform X with a similar allegation.
“xAI’s decision to create and host a breeding ground for predators to spread nonconsensual sexually explicit AI deepfakes, including images that digitally undress children, is vile,” he said.