New research brief: Deepfake image-based sexual abuse, tech-facilitated sexual exploitation and the law

Posted in: AI | January 24, 2024

The rise of deepfake image-based sexual abuse demands urgent and comprehensive responses spanning technological innovation, legal reform, and societal awareness.

Our new research brief, Deepfake image-based sexual abuse, tech-facilitated sexual exploitation and the law, co-authored with Equality Now with support from law firm Hogan Lovells, provides a snapshot of the legal frameworks in nine focus jurisdictions, including England, Wales, Scotland, Australia, New Zealand, South Africa, Kenya, Nigeria, and the US (Virginia, Texas, California), as well as the European Union and international human rights law.

This briefing paper aims to spark discussions among diverse stakeholders about the challenges of preventing deepfake image-based sexual abuse. Amanda Manyame, Equality Now’s Digital Law & Rights Advisor and author of the brief, explains: “The law has not kept pace with technological advancements, and in many parts of the world does not provide protection from deepfake sexual violations. Where the law does provide protection, it has not yet been applied to see if the law will indeed provide adequate protection. Furthermore, deepfake sexual violations are global and multi-jurisdictional and demand a harmonized and coordinated response from the international community.”

Emma Gibson, Global Coordinator for AUDRi, underlines the urgency of this research: “The potential for harm caused by deepfakes is not only real, but is increasing day by day. Victims face not just emotional and reputational damage, but barriers to justice if they attempt to block this non-consensual dissemination of their likeness, and hold perpetrators to account. Our research is a clarion call for the development of comprehensive legal frameworks that can keep pace with this rapidly evolving technology.”

Fake videos, real harms

To support this research, AUDRi and Equality Now co-hosted a webinar featuring invited guests from the #MyImageMyChoice campaign, which seeks to amplify the voices of victims/survivors of deepfake-based exploitation, and encourage governments and regulators to block websites that disseminate sexually explicit deepfake content.

Sophie Compton, co-founder of the campaign and co-director of Another Body, the award-winning documentary about deepfakes, explained some of the impacts felt by survivors of deepfake exploitation, and how the experience of abuse, and of reporting it, mirrors familiar patterns. “The failure to address this is grounded in misogyny and a failure to listen to women’s experiences. The lack of meaningful response might be blamed on new technology, but women experience these violations as a form of sexual abuse, and this is not new.”
Watch here: Unmasking deepfakes – real life impacts and the legal labyrinth

