Plus: A (potentially) better dating app, what parents want for their kids, and more.
May 2, 2024 [View in browser]( Good morning! Today, senior correspondent Anna North is here to talk about the rise of deepfake nudes, and what teenagers are doing to fight back.
—Caroline Houck, senior editor of news

[Illustration of a woman sitting in a corner, looking at a smartphone.] Getty Images/iStockphoto

How do you stop deepfake nudes?

There's a lot of [debate]( about [the role of technology]( in kids' lives, but sometimes we come across something unequivocally bad. That's the case with [AI "nudification" apps](, which teenagers are using to generate and share fake naked photos of their classmates.

At Issaquah High School in Washington state, boys used an app to "strip" photos of girls who attended [last fall's homecoming dance](, [according to the New York Times](. At [Westfield High School in New Jersey](, 10th grade boys created fabricated explicit images of some of their female classmates and shared them around school. Students from [California to Illinois]( have had deepfake nudes shared without their consent, in what experts call a form of ["image-based sexual abuse."](

Now advocates, including some teens, are backing laws that impose penalties for creating and sharing deepfake nudes. [Legislation has passed]( in Washington, South Dakota, and Louisiana, and is in the works in California and elsewhere. Meanwhile, Rep. Joseph Morelle (D-NY) has [reintroduced a bill]( that would make sharing the images a federal crime.

Francesca Mani, a 15-year-old Westfield student whose deepfaked image was shared, started pushing for legislative and policy change after she saw her male classmates making fun of girls over the images. "I got super angry, and, like, enough was enough," she told Vox in an email sent via her mother. "I stopped crying and decided to stand up for myself."

Supporters say the laws are necessary to keep students safe. But some experts who study technology and sexual abuse argue that they're likely to be insufficient, since the [criminal justice]( system has been [so inefficient]( at rooting out other sex crimes.
"It just feels like it's going to be a symbolic gesture," said Amy Hasinoff, a communications professor at the University of Colorado Denver who has studied image-based sexual abuse. She and others recommend tighter regulation of the apps themselves, so the tools people use to make deepfake nudes are less accessible in the first place. "I am struggling to imagine a reason why these apps should exist" without some form of consent verification, Hasinoff said.

[silhouette face in front of computer screen] Arne Dedert/picture alliance via Getty Images

Deepfake nudes are a new kind of sexual abuse

So-called [revenge porn](, nude photos or videos shared without consent, has been a problem for years. But with deepfake technology, "anybody can just put a face into this app and get an image of somebody — friends, classmates, coworkers, whomever — completely without clothes," said Britt Paris, an assistant professor of library and information science at Rutgers who has studied deepfakes.

There's no hard data on how many American high school students have experienced deepfake nude abuse, but [one 2021 study]( conducted in the UK, New Zealand, and Australia found that 14 percent of respondents ages 16 to 64 had been victimized with deepfake imagery.

Nude images shared without consent can be traumatic, whether they're real or not. When she first found out about the deepfakes at her school, "I was in the counselor's office, emotional and crying," Mani said. "I couldn't believe I was one of the victims."

When sexual images of students are shared around school, they can experience "shaming and blaming and stigmatization," thanks to stereotypes that denigrate girls and women, especially, for being or appearing to be sexually active, Hasinoff said. That's the case even if the images are fake, because other students may not be able to tell the difference. Moreover, fake images can follow people throughout their lives, causing real harm.
"These images put these young women at risk of being barred from future employment opportunities and also make them vulnerable to physical violence if they are recognized," Yeshi Milner, founder of the nonprofit Data for Black Lives, told Vox in an email.

[A teenage girl sitting on the floor watching a smartphone] Getty Images

Stopping deepfake abuse may require reckoning with AI

To combat the problem, [at least nine states]( have passed or updated laws targeting deepfake nude images in some way, and many others are considering them. In Louisiana, for example, anyone who creates or distributes deepfakes of minors can be sentenced to [five or more years in prison](. Washington's new law, which takes effect in June, treats a first offense [as a misdemeanor](.

The federal bill, [first introduced]( in 2023, would give victims or parents [the ability to sue perpetrators]( for damages, in addition to imposing criminal penalties. It has not yet received a vote in [Congress]( but has attracted bipartisan support.

However, some experts worry that the laws, while potentially helpful as a statement of values, won't do much to fix the problem. "We don't have a legal system that can handle sexual abuse," Hasinoff said, noting that [only a small percentage]( of people who commit sexual violence are ever charged. "There's no reason to think that this image-based abuse stuff is any different."

Some states have tried to address the problem by [updating their existing laws on child sexual abuse images and videos]( to include deepfakes. While this might not eliminate the images, it would close some loopholes. (In one recent New Jersey lawsuit, lawyers for a male high school student [argued]( he should not be barred from sharing deepfaked photos of a classmate because federal laws were not designed to apply "to computer-generated synthetic images.")

Meanwhile, some [lawyers]( and legal scholars say that the way to really stop deepfake abuse is to target the apps that make it possible.
Lawmakers could regulate app stores to bar them from carrying nudification apps without clear consent provisions, Hasinoff said. Apple and Google have [already removed several apps]( that offered deepfake nudes from the App Store and Google Play. However, users don't need a specific app to make nonconsensual nude images; many [AI]( image generators could potentially be used in this way.

Legislators could require developers to put guardrails in place to make it harder for users to generate nonconsensual nude images, Paris said. But that would require challenging the "unchecked ethos" of AI today, in which developers are allowed to release products to the public first and figure out the consequences later, she said. "Until companies can be held accountable for the types of harms they produce," Paris said, "I don't see a whole lot changing."

—[Anna North, senior correspondent](

[Listen]( One Flu Over the Cowcow's Nest
Avian flu, which recently leaped from chickens to cows, has now been detected in milk. How worried should humans be about the outbreak? [Listen now](

POLITICS
- What values do you want your child to have?: The answer is probably [different depending on your party](. [[NPR](]
- In case you somehow missed it: Kristi Noem apparently thought that writing about killing a puppy would endear her to voters? [[Vox](]
- Arizona repeals its Civil War-era abortion ban: Why do these laws even exist in the first place? We've got the explainer for you. [[Vox](]

[Kristi Noem speaking at a lectern with a bright red and blue background behind her.] Kent Nishimura/Bloomberg via Getty Images

DESIRE
- Falling for AI: Things can apparently get "very steamy" when you hijack ChatGPT Plus and get it to respond like your boyfriend. [[WSJ](]
- On bodies and fitness: "These lifestyle gyms aren't competing with Ozempic — they're embracing it." [[Quartz](]
- Dating apps suck: Will one made by academics instead of corporations be better? [[Guardian](]
Are baby bonds a good investment? A unique policy program that could help close the racial wealth gap. [Listen now](

Are you enjoying the Today, Explained newsletter? Forward it to a friend; they can [sign up for it right here](. And as always, we want to know what you think. We recently changed the format of this newsletter. Any questions, comments, or ideas? We're all ears. Specifically: If there is a topic you want us to explain or a story you're curious to learn more about, let us know [by filling out this form]( or just replying to this email.

Today's edition was edited and produced by Caroline Houck. We'll see you tomorrow!
[Facebook]( [Twitter]( [YouTube]( [Instagram]( [TikTok]( [WhatsApp]( This email was sent to {EMAIL}. Manage your [email preferences]( or [unsubscribe](param=sentences). If you value Vox's unique explanatory journalism, support our work with a one-time or recurring [contribution](. View our [Privacy Notice]( and our [Terms of Service](. Vox Media, 1701 Rhode Island Ave. NW, Washington, DC 20036.
Copyright © 2024. All rights reserved.