Many students know of peers who created deepfake nudes, report says
When news broke that AI-generated nude pictures of students were popping up at a Beverly Hills middle school in February, many district officials and parents were horrified.
But others said no one should have been blindsided by the spread of AI-powered “undressing” programs. “The only thing shocking about this story,” one Carlsbad parent said his 14-year-old told him, “is that people are shocked.”
Now, a newly released report by Thorn, a tech company that works to stop the spread of child sexual abuse material, shows how common deepfake abuse has become. The proliferation coincides with the wide availability of cheap “undressing” apps and other easy-to-use, AI-powered programs to create deepfake nudes.
But the report also shows that other forms of abuse involving digital imagery remain bigger problems for school-age kids.
To measure middle- and high-school students' experiences with, and attitudes toward, sexual material online, Thorn surveyed 1,040 9- to 17-year-olds across the country from Nov. 3 to Dec. 1, 2023. Well more than half of the group were Black, Latino, Asian or Native American students; Thorn said the resulting data were weighted to make the sample representative of U.S. school-age children.
According to Thorn, 11% of the students surveyed said they knew of friends or classmates who had used artificial intelligence to generate nudes of other students; an additional 10% declined to say. Some 80% said they did not know anyone who’d done that.
In other words, at least 1 in 9 students (the 11% who answered yes) and as many as 1 in 5 (if the 10% who declined to answer are counted as well) knew of classmates who used AI to create deepfake nudes of people without their consent.
Stefan Turkheimer, vice president of public policy for the Rape, Abuse & Incest National Network, the country’s largest anti-sexual-violence organization, said that Thorn’s results are consistent with the anecdotal evidence from RAINN’s online hotline. A lot more children have been reaching out to the hotline about being victims of deepfake nudes, as well as the nonconsensual sharing of real images, he said.
Compared with a year ago or even six months ago, he said, “the numbers are certainly up, and up significantly.”
Technology is amplifying both kinds of abuse, Turkheimer said. Not only is picture quality improving, he said, but “video distribution has really expanded.”
The Thorn survey found that almost 1 in 4 youths ages 13 to 17 said they’d been sent or shown an actual nude photo or video of a classmate or peer without that person’s knowledge. But that number, at least, is lower than it was in 2022 and 2019, when 29% of the surveyed students in that age group said they’d seen nonconsensually shared nudes.
Not surprisingly, only 7% of the students surveyed admitted that they had personally shared a nude photo or video of someone without that person's knowledge.
The study found that sharing of real nudes is widespread among students, with 31% of the 13- to 17-year-olds agreeing with the statement that “It’s normal for people my age to share nudes with each other.” That’s about the same level overall as in 2022, the report says, although it’s notably lower than in 2019, when nearly 40% agreed with that statement.
Only 17% of that age group admitted to sharing nude selfies themselves. An additional 15% of 9- to 17-year-olds said they had considered sharing a nude photo but decided not to.
Turkheimer wondered whether some of the perceived decline in sexual interactions online stemmed from the shutdown last year of Omegle, a site where people could have video chats with random strangers. Although Omegle’s rules banned nudity and the sharing of explicit content, more than a third of the students who reported using Omegle said they’d experienced some form of sexual interaction there.
He also noted that the study didn’t explore how frequently students experienced the interactions that the survey tracked, such as sharing nudes with an adult.
According to Thorn, 6% of the students surveyed said they’d been victims of sextortion — someone had threatened to reveal a sexual image of them unless they agreed to pay money, send more sexual pictures or take some other action. And when asked whom to blame when a nude selfie goes public, 28% said it was solely the victim’s fault, compared with 51% blaming the person who leaked it.