‘A global problem’: US teen fights deepfake porn targeting schoolgirls

Deepfake pornography of famous women like Taylor Swift has sparked outrage and calls for the regulation of artificial intelligence. Yet this powerful technology is not only being used to bully women in the public eye – minors are also being victimised. Schoolgirls are finding themselves targeted by AI-generated deepfake porn made by their own classmates using new, easy-to-access “nudifying” apps. And no federal laws exist to stop it. 

Francesca Mani was summoned to the vice-principal’s office at Westfield High School in suburban New Jersey last autumn, where she was told that she was a “confirmed AI victim”. 

Mani, then 14, was informed that she was among a group of schoolgirls who had fake nude images made of them using artificial intelligence. Boys at Westfield High shared the fake images via Snapchat. 

Francesca Mani, 15, in her back garden in Westfield, New Jersey on February 28, 2024. © Jessica Le Masurier

“I was in shock,” Francesca said. “I started to feel a little sad but I went outside in the hallway and I saw a group of boys mocking a group of girls. They were laughing about it. And then I was just super mad.” 

It was at that moment that Francesca, a junior Olympic fencing star, decided to fight back. 

The fake images had been shared during school lunch breaks and on the school bus, according to police reports. But what the boys had done was not illegal – there are no federal laws against AI-generated deepfakes in the United States. 

When Francesca told her mother, Dorota, what had happened, Dorota was furious. 

“Since the beginning of time, women have to fight for our rights and consent,” said Dorota, a Polish immigrant with a black belt in jujitsu. “It’s the biggest issue here: Consent.” 

The boy who made the images was suspended from school for two days but there was no further accountability.

Francesca chose not to look at the images. But she felt uncomfortable knowing that the boy who made them and the others who had shared them were in her classes.

Some US state laws ban non-consensual deepfake pornography, including in Texas, Minnesota, Virginia, New York, Hawaii and Georgia. But that has not prevented the proliferation of AI-generated nude images at high schools in New Jersey, California and Florida.

Cases like the one at Westfield High underscore the need for the law to catch up with this fast-paced technology. Francesca and her mother contacted local lawmakers and got to work lobbying for both state and federal legislation to make the non-consensual sharing of digitally altered pornographic images a crime. 

Dorota and Francesca Mani outside the White House in Washington, DC, on February 5, 2024. © Marine Pradel

Dorota and Francesca have made frequent trips to Washington, DC, over the past three months to lobby for a bipartisan bill, called the DEFIANCE Act, that would make sharing sexually explicit deepfakes a crime and allow victims to sue those who create and distribute them. 

A bill that would establish criminal penalties is now making its way through the New Jersey legislature, in part thanks to their efforts. 

Both sides of the political spectrum in the US appear to agree that more should be done to protect people, particularly minors, from the proliferation of deepfake porn. 

Lawmakers are also examining how to hold the tech firms hosting these apps and websites accountable. 

“This issue is broader than teenagers and high schools,” Dorota explained on a recent visit to the White House. “We talk about predators and pedophiles, who are using it to hurt others. I think laws and legislation – it’s so crucial.”

Another teen victim from Westfield High School, who wishes to remain anonymous, has taken direct legal action against a classmate. Her lawsuit, filed in February, alleges that she suffered substantial emotional distress and damage to her reputation, and demands financial compensation. 

The lawsuit names the app used to make the images: Clothoff. The app, which is easy to access and inexpensive, claims it can remove clothes from any photo. Its secretive founders are thought to be based in Belarus. 

A photo of the lawsuit filed by a New Jersey teen targeted by deepfake pornography. © Jessica Le Masurier

Deepfake porn ‘exclusively targets women’

In 2023, the total number of deepfake videos online was 95,820, representing a 550% increase over 2019, according to a report by Home Security Heroes, a group that researches online security. Deepfake pornography made up 98% of those online videos. The study also found that 99% of the individuals targeted in deepfake pornography are women. 

Disinformation expert Nina Jankowicz at a conference at Columbia University, in New York City on March 4, 2024. © Jessica Le Masurier

“It’s a phenomenon that almost exclusively targets women,” former Joe Biden administration official Nina Jankowicz told a conference at Columbia University on March 4.  

Jankowicz is an expert on disinformation and the online harassment of women. She became a victim of deepfake pornography herself, and only realised it when she received a Google alert linking to explicit videos featuring someone who looked like her. The deepfake videos had been made from her official portrait, taken while she was pregnant. The image was picked up by right-wing media outlets angered by her role tackling disinformation in the Biden administration, and online trolls turned it into pornography. 

“It looks like me a little bit but it doesn’t really, again, because it’s trained on this very specific portrait of me and I can tell that it’s trained on that,” Jankowicz said, indicating the graphic videos on her laptop. “But no, I didn’t watch the whole thing. They’re each about seven minutes long – that’s a lot to subject yourself to.”

Disinformation expert Nina Jankowicz showing deepfake porn videos featuring her likeness on March 4, 2024, in New York City. © Jessica Le Masurier

She said part of the reason these deepfakes are proliferating so quickly is that the apps used to make them are now so easily available. “This technology has become entirely democratised now, and it’s being deployed not only against celebrities, like Taylor Swift or public figures like me. It’s being deployed against ordinary people – mums, young women in middle and high school whose classmates are making these images of them. Nudifying apps all need to be taken off app stores and marketplaces and things like that.” 

Global rise of deepfake porn

The United Nations has been looking into how to combat the global rise of this technology.

Kathryn Travers, a policy expert at UN Women, believes that education is key. “You have to teach boys how to respect women and how to be responsible online in the same way that you have to teach girls,” she says. 

UN policy expert Kathryn Travers in her office at UN Women on March 7, 2024. © Fanny Chauvin

These AI images may exist in the virtual space but they have real-world impact. Girls and women experience disproportionate levels of violence online, sometimes with devastating consequences.

“They may be in a position in which they have less knowledge about their rights, about who to report to. Many often blame themselves – because of the technology, the potential for piling on, sharing and resharing their images, the impact on mental health is enormous and has resulted in death by suicide by a number of girls,” Travers said.

Parents of children targeted by deepfake porn from across the world have now contacted Dorota for advice. “I’ve been talking to parents and victims from all over the United States – Texas, Wisconsin, California. Recently someone from Greece contacted me and before that Japan, London, Paris, Australia and Canada. It’s a global problem.” 
