San Francisco Sues AI ‘Undressing’ Websites Over Non-Consensual Nude Images
This article originally appeared on BitcoinEthereumNews.com.
TLDR

- San Francisco’s City Attorney is suing 16 websites that use AI to create non-consensual nude images of women and girls
- The sites have been visited 200 million times in the first half of 2024
- Some sites allow users to create pornographic images of children
- Victims face difficulties removing these AI-generated images once shared online
- The lawsuit seeks $2,500 per violation and aims to shut down the sites

San Francisco City Attorney David Chiu has filed a lawsuit against 16 websites that use artificial intelligence (AI) to create non-consensual nude images of women and girls. The lawsuit, filed in San Francisco Superior Court, targets sites that allow users to “undress” or “nudify” people in photos without their consent. According to Chiu’s office, these websites have received over 200 million visits in just the first six months of 2024.

The lawsuit claims the site owners include individuals and companies from Los Angeles, New Mexico, the United Kingdom, and Estonia. The AI models used by these sites are reportedly trained on pornographic images and child sexual abuse material. Users can upload a picture of their target, and the AI generates a realistic, pornographic version. While some sites claim to limit their service to adults only, others allow images of children to be created as well.

“This investigation has taken us to the darkest corners of the internet, and I am absolutely horrified for the women and girls who have had to endure this exploitation,” Chiu stated. He emphasized that while AI has “enormous promise,” criminals are exploiting the technology for abusive purposes.

The lawsuit alleges that these AI-generated images are “virtually indistinguishable” from real photographs. They have been used to “extort, bully, threaten, and humiliate women and girls,” many of whom have no way to control or remove the fake images once…
Filed under: News - @ August 16, 2024 4:19 pm