San Francisco sues 16 websites that create AI-generated nudes
San Francisco City Atty. David Chiu announced Thursday that his office is suing the operators of 16 AI-powered “undressing” websites that help users create and distribute deepfake nude photos of women and girls.
The lawsuit, which city officials said was the first of its kind, accuses the websites’ operators of violating state and federal laws that ban deepfake pornography, revenge pornography and child pornography, as well as California’s unfair competition law. The names of the sites were redacted in the copy of the suit made public Thursday.
Chiu’s office has yet to identify the owners of many of the websites, but officials say they hope to uncover their identities and hold them accountable.
Chiu said the lawsuit has two goals: shutting down these websites and sounding the alarm about this form of “sexual abuse.”
On these websites, users upload photos of fully clothed real people, and artificial intelligence alters the images to simulate what those people would look like undressed. The sites create “pornographic” images without the consent of the people pictured, Chiu said during a Thursday morning press conference.
According to the lawsuit, one of the websites promotes the nonconsensual nature of the images, stating, “Imagine wasting time taking her out on dates, when you can just use [redacted website name] to get her nudes.”
The availability of open-source AI models means that anyone can access and adapt AI-powered engines for their own purposes. One result: sites and apps that can generate deepfake nudes from scratch or “nudify” existing images in realistic ways, often for a fee.
Deepfake apps grabbed headlines in January when fake nude images of Taylor Swift circulated online, but many other, far less famous people were victimized before and after the pop star. “The proliferation of these images have exploited a shocking number of women and girls across the globe,” from celebrities to middle school students, Chiu said.
Through its investigation, the city attorney’s office found that the websites in question were visited more than 200 million times in just the first six months of 2024.
Once an image is online, it’s very difficult for victims to determine what websites were used to “nudify” their images because these images “don’t have any unique or identifying marks that link you back to websites,” said Yvonne R. Meré, San Francisco’s chief deputy city attorney.
It’s also very difficult for victims to remove the images from the internet.
Earlier this year, five Beverly Hills eighth-graders were expelled for creating and sharing deepfake nude images of 16 eighth-grade girls, superimposing the girls’ faces onto AI-generated bodies.
Chiu’s office said it has seen similar incidents at other schools in California, Washington and New Jersey.
“These images are used to bully, humiliate and threaten women and girls,” Chiu said. “The impact on victims has been devastating on their reputations, their mental health, loss of autonomy and, in some instances, causing individuals to become suicidal.”