Helen Mort couldn’t believe what she was hearing. There were naked photos of her plastered on a porn site, an acquaintance told her. But never in her life had she taken or shared intimate photos. Surely there must be some mistake? When she finally mustered up the courage to look, she felt frightened and humiliated.

Mort, a poet and broadcaster in Sheffield, UK, was the victim of a fake pornography campaign. What shocked her most was that the images were based on photos, dated between 2017 and 2019, that had been taken from her private social media accounts, including a Facebook profile she’d deleted.

The perpetrator had uploaded these non-intimate images—holiday and pregnancy photos and even pictures of her as a teenager—and encouraged other users to edit her face into violent pornographic photos. While some were shoddily Photoshopped, others were chillingly realistic. When she began researching what had happened, she learned a new term: deepfakes, referring to media generated and manipulated by AI.

Helen Mort. Courtesy photo.

“It really makes you feel powerless, like you’re being put in your place,” she says. “Punished for being a woman with a public voice of any kind. That’s the best way I can describe it. It’s saying, ‘Look: we can always do this to you.’”

The revelations would lead her on a frustrating quest for recourse. She called the police, but the officer said there was nothing they could do. She considered getting off the web entirely, but it’s crucial for her work.

She also had no idea who would have done this. She was terrified that it was someone she considered close. She began to doubt everyone, but most painfully, she began to doubt her ex-husband. They’re good friends, but the abuser had used his first name as a pseudonym. “It’s not him—absolutely not. But it’s really sad,” she says. “The fact that I was even thinking that was a sign of how you start doubting your whole reality.”

While deepfakes have received enormous attention for their potential political dangers, the vast majority of them are used to target women. Sensity AI, a research company that has tracked online deepfake videos since December 2018, has consistently found that between 90% and 95% of them are nonconsensual porn. About 90% of those target women. “This is a violence-against-women issue,” says Adam Dodge, the founder of EndTAB, a nonprofit that educates people about technology-enabled abuse.


————

By: Karen Hao
Title: Deepfake porn is ruining women’s lives. Now the law may finally ban it
Sourced From: www.technologyreview.com/2021/02/12/1018222/deepfake-revenge-porn-coming-ban/
Published Date: Fri, 12 Feb 2021 10:00:00 +0000
