New research brief: Deepfake image-based sexual abuse, tech-facilitated sexual exploitation and the law

Posted in: AI | January 24, 2024
The rise of deepfake image-based sexual abuse demands urgent and comprehensive responses spanning technological innovation, legal reform, and societal awareness.

Our new research brief, Deepfake Image-Based Sexual Abuse, Tech-Facilitated Sexual Exploitation and the Law, co-authored with Equality Now with support from law firm Hogan Lovells, provides a snapshot of the legal frameworks in nine focus jurisdictions: England, Wales, Scotland, Australia, New Zealand, South Africa, Kenya, Nigeria, and the US (Virginia, Texas, and California), as well as the European Union and international human rights law.

This briefing paper aims to spark discussions among diverse stakeholders about the challenges of preventing deepfake image-based sexual abuse. Amanda Manyame, Equality Now’s Digital Law & Rights Advisor and author of the brief, explains: “The law has not kept pace with technological advancements, and in many parts of the world does not provide protection from deepfake sexual violations. Where the law does provide protection, it has not yet been applied to see if it will indeed provide adequate protection. Furthermore, deepfake sexual violations are global and multi-jurisdictional, and demand a harmonized and coordinated response from the international community.”

Emma Gibson, Global Coordinator for AUDRi, underlines the urgency of this research: “The potential for harm caused by deepfakes is not only real, but is increasing day by day. Victims face not just emotional and reputational damage, but barriers to justice if they attempt to block this non-consensual dissemination of their likeness and hold perpetrators to account. Our research is a clarion call for the development of comprehensive legal frameworks that can keep pace with this rapidly evolving technology.”

Fake videos, real harms

To support this research, AUDRi and Equality Now co-hosted a webinar featuring invited guests from the #MyImageMyChoice campaign, which seeks to amplify the voices of victims/survivors of deepfake-based exploitation and to encourage governments and regulators to block websites that disseminate sexually explicit deepfake content.

Sophie Compton, co-founder of the campaign and co-director of Another Body, the award-winning documentary about deepfakes, explained some of the impacts felt by survivors of deepfake exploitation, and how the experience of abuse, and of reporting it, mirrors familiar patterns. “The failure to address this is grounded in misogyny and a failure to listen to women’s experiences. The lack of meaningful response might be blamed on new technology, but women experience these violations as a form of sexual abuse, and this is not new.”
Watch here: Unmasking deepfakes – real life impacts and the legal labyrinth

