Women face new sexual harassment with deepfake pornography

For years, women have faced sexual harassment online, and with the rise of artificial intelligence, it's only getting worse. Deepfakes use AI to create manipulated but realistic images and videos of real people in fake situations and are routinely used against women. A 2019 study found that 96 percent of deepfake videos were nonconsensual pornography. Laura Barrón-López discussed this with Nina Jankowicz.

Read the Full Transcript

Notice: Transcripts are machine and human generated and lightly edited for accuracy. They may contain errors.

  • Geoff Bennett:

    Fears of the rapid rise of artificial intelligence often overlook the devastating and more immediate impact of this technology on women.

    That's the focus of Laura Barron-Lopez's conversation tonight.

  • Laura Barron-Lopez:

    For years, women have faced sexual harassment online. And with the rise of artificial intelligence, it's only getting worse. Deepfakes, which use A.I. to create manipulated, but hyper-realistic images and videos of real people in fake situations, are routinely used against women.

    A 2019 study revealed that a staggering 96 percent of all deepfake videos were nonconsensual pornography.

    Our guest, Nina Jankowicz, is a disinformation researcher and the author of two books on the subject. She ran the Biden administration's Disinformation Governance Board before it was dissolved after intense Republican pressure. She's also the target of deepfake pornographic videos, an experience she wrote about this week in "The Atlantic."

    Nina, thanks so much for being here.

    You are now the target of the very thing that you have researched, disinformation. And for more than a year, you have been experiencing online harassment. What has that experience been like to find yourself in multiple pornographic deepfake videos?

  • Nina Jankowicz, Author, "How to Be a Woman Online: Surviving Abuse and Harassment, and How to Fight Back":

    Well, Laura, I think some people might be surprised to find out that it didn't shock me.

    I have written about this and written about online abuse writ large for many years. And, frankly, I was surprised it took me this long to find these videos. I got a Google Alert a couple of weeks ago when I woke up in the morning. These basically just track news mentions of me. And there was a deepfake porn Web site that had tagged me and my image.

    And there were a couple of videos on there. And, frankly, it's not even in the top 10 worst things that have happened to me over the past year-plus. And that kind of gives you a picture of what it's been like to be the face of a nationwide harassment campaign.

  • Laura Barron-Lopez:

    And what has the impact been on your career from this online harassment?

  • Nina Jankowicz:

    Well, it's exhausting. And that's exactly the point. The point is to make you not want to speak out, to make you not want to stand up for the truth, to just retreat into anonymity online.

    And I think that's something that I have really pushed back against. I know that there are women that look up to me. There are women who I don't want to have to go through this in the future. And so I am making a point of drawing attention to this despicable behavior, whether it's deepfake porn, or violent threats, or just the incessant harassment that women like me are on the receiving end of.

    I have had to get a restraining order against a stalker. I have been named in extraneous lawsuits. It's all taken a lot of time and, frankly, taken away from the work that I'd like to be doing, working on disinformation related to national security. But I'm not going to give up and I'm not going to stay silent.

  • Laura Barron-Lopez:

    And women like Hillary Clinton and Greta Thunberg have been subjects of these deepfake pornographic videos.

    When you recently researched one well-known Web site where people post these videos, you found that there was only one video of former President Donald Trump, but pages of sexually explicit videos depicting his wife, Melania, and daughter Ivanka.

    Are women the usual targets of these deepfake pornographic videos?

  • Nina Jankowicz:

    Yes, I think that's a really important point to make.

    These models are trained on women's bodies. So even if you feed a male image into the kind of face-swap tools that exist, it's not going to work as well, because they have been created by men for the purpose of either demeaning women or pleasuring themselves. And it just doesn't work as well on men's bodies.

    And that's why we see that 96 percent figure, the overwhelming share of nonconsensual pornographic videos targeting women. I think that's an important point to make.

  • Laura Barron-Lopez:

    And A.I. has made these videos more convincing. But even poorly edited videos known as cheapfakes can prove equally damaging.

    What's the motive here?

  • Nina Jankowicz:

    I think the motive is just to, again, demean and discredit women who are in the public eye.

    As I said, the videos of me were actually posted on a Web site for deepfake porn. It's not like they're mixed in there with the real porn, right? But, even so, the idea is to humiliate me, to show me in an instance that would be extremely private, right? And this is entirely the point. The cheapfakes, as well, are meant to cast doubt on somebody's integrity, meant to say that they are not fit for public office or public life.

  • Laura Barron-Lopez:

    And you have spent a lot of time on these forums, where you can actually watch the people who post these deepfake videos interact with one another.

    What did you see on those forums?

  • Nina Jankowicz:

    The men are very concerned about their own privacy. They don't want to be found out for making these videos, but they're not concerned about the privacy of the women that they're making videos of.

    They say, if you're a public figure, if your image is out there, then they have the right to make this art, as they call it. And I don't believe it's art to put somebody in an image that they have not consented to, again, in their most private moments.

  • Laura Barron-Lopez:

    What recourse do women have who find themselves in these pornographic deepfake videos? And have there been any instances of the people posting these being held accountable?

  • Nina Jankowicz:

    So there's a patchwork of state-level laws in the United States that do hold the distributors of deepfake pornography to account, if you can find out who they are.

    But that is very difficult to do. So, when a woman like me finds herself in a deepfake video, what you have got to do is see if you can find out who it is. If they happen to be in your state and in your jurisdiction, then you might be able to bring them to account in civil court. But if they're out of your state, or even out of the country, then you really don't have any recourse.

    And that's why I'm hoping that legislators, rather than wax poetic about the threats of A.I. that we might see in the future, look at the threat that's facing us today and ruining many women's lives.

  • Laura Barron-Lopez:

    And you were on the Disinformation Governance Board for the Biden administration. That was under the Homeland Security Department.

    But it only survived about, what, three weeks. And then, ultimately, it was totally killed due to Republican outcry over what they said were attacks on freedom of expression and freedom of speech.

    Did the administration make a mistake by dissolving that board?

  • Nina Jankowicz:

    I absolutely think the administration made a mistake by dissolving the board.

    And, Laura, I will push back a little bit. It wasn't Republican outcry. It was lies and disinformation about the board and about me and my positions on how to fight disinformation. People said that I was going to be censoring American speech.

    I would never do any such thing. I have stood up my entire career for people facing real freedom of expression threats in places like Russia, Belarus and Ukraine. I would have never taken a job that had anything to do with censorship. And, frankly, public documents have now proven that the board never had anything to do with censorship, nor any capacity for it.

  • Laura Barron-Lopez:

    In this "Atlantic" article, you address policymakers directly, writing that you "begged them," before they act to stop the greater existential threats potentially posed by A.I., to first stop the men who are using it to discredit and humiliate women.

    What do you think it's going to take for lawmakers to actually regulate this?

  • Nina Jankowicz:

    I wish I knew.

    I would ask the lawmakers who are stalling on this, saying, oh, this is just a women's issue, or it's one that we're going to deal with when we think about the broader A.I. threat, to think of your wife, your daughter, your sister, your mom in this situation.

    And, again, that's a little bit trite, right? We shouldn't have to ask for our basic human rights and privacy to be respected just by men thinking of the women in their lives in this position. But if that's what it takes, then I hope that they do it, because, certainly, I think, if they saw any of those women in their lives in that private situation for anybody on the Internet to see, they might think of it differently.

    This is a democratic issue that affects women's political participation. And it's something I think, in the future, we're definitely going to see nations like Russia and China and Iran using against female officials in the United States, because it works.

  • Laura Barron-Lopez:

    Nina Jankowicz, thank you so much for joining and for sharing your personal experience.

  • Nina Jankowicz:

    Thanks for having me.
