The recent scandal involving Grok has amplified the discussion around “nudifying” apps and other online harms. This Q&A draws on insights from digital rights and legal experts, both within Equality Now and working at the frontlines of regulation efforts, to examine why this crisis demands urgent action and what effective responses look like.
As many survivors testified in the wake of the Grok episode, while these abuses happen in digital spaces, their impact is profoundly real. They violate privacy, dignity, and bodily autonomy, inflict deep psychological trauma, damage reputations and livelihoods, and silence women and girls by driving them to withdraw from public life. The cumulative effect is a constant reminder that women and girls are not safe anywhere – online or offline.
How does harmful technology like ‘nudifying’ apps come into existence?
This technology isn’t a technical accident. It’s the result of weak regulation and deliberate choices that prioritise speed and profit over safety. Companies make conscious decisions to allow predictable misuse by failing to implement safeguards throughout the technology lifecycle, from design through use, moderation, and modification, especially against harms that disproportionately affect women and girls.
These platforms are created by highly profitable, multi-billion-pound businesses, and unfortunately, violence against women and girls can be profitable. Engagement drives revenue, and keeping users on platforms for as long as possible – regardless of the content – is often the priority.
What laws currently exist to protect people from TFGBV?
Multiple laws exist at international, regional, and national levels that address gender-based violence. However, most of these laws were established before digital technology became a significant driver of this abuse, and do not necessarily offer protections that extend to digital spaces. The rapid pace at which both the harms and the enabling technology evolve makes it difficult for policymakers to keep up.
This challenge is compounded because digital platforms operate across multiple borders, requiring coordinated international and regional approaches to create effective laws.
Some governments are taking action against TFGBV, but the law-making process is slow. Examples of progress include Australia’s Online Safety Act, whose implementation is overseen by the eSafety Commissioner, which can act on behalf of individuals and has secured significant financial penalties; the UK, Denmark, and the Netherlands, which have criminalised the creation of sexually explicit deepfakes; and South Korea, which offers holistic survivor support including counselling, legal advice, and assistance with content removal.
Why is regulation around online harms so difficult?
Governments face multiple challenges when trying to regulate tech-facilitated abuse.
Platform power:
Large platforms wield considerable economic and political power, and some have lobbied governments for less regulation, or threatened to withdraw from territories where regulation might limit their activities. This often results in regulations being watered down.
Scale:
The sheer volume of tech-facilitated abuse against women and girls, and the speed at which it spreads across virtual space, makes prevention, eradication, and justice for survivors seem like an uphill battle.
Anonymity:
Many perpetrators operate anonymously. While the option to be anonymous is an important and valid feature of most online platforms, it makes identifying and challenging abusers more difficult.
Lack of consensus:
While there is broad consensus that tech-facilitated abuse is harmful, its prioritisation and the approaches to prevention and protection vary across countries and regions, making universal, global agreement extremely difficult.
Why can’t countries agree on laws governing digital spaces across borders?
There are many challenges preventing global agreement on digital governance laws.
Different countries and regions have different attitudes towards gender equality, shaping how TFGBV is perceived, recognised and addressed. Many countries have weaker laws around violence against women and girls generally, or do not recognise that TFGBV is part of that continuum.
Some countries view regulating to protect women as contrary to promoting innovation, or have been lobbied by tech companies themselves with this message. Countries also differ in how they define and protect free speech.
What role and responsibility do platforms have in preventing online harm?
Platforms have a responsibility, grounded in international human rights law and standards, to ensure the safety of all users, and increasingly, in many jurisdictions, a legal obligation to ensure their users do not violate laws designed to protect others. Laws should mandate platforms to facilitate the reporting of abuses and harms, take action against perpetrators, and cooperate with legal processes to ensure justice for survivors.
We continue to advocate for governments and regulators to require safety-by-design principles at the development stage and throughout use and moderation, so that companies create and deploy products that are inherently safer for everyone, including women and girls.
What is meant by “safety-by-design”?
Safety-by-design is an approach in which safety considerations are systematically embedded throughout the entire lifecycle of a technology, from the earliest stages of development through deployment, use, and ongoing modification, rather than added as an afterthought. In the context of AI and other emerging technologies, this includes continuous assessment and mitigation of the technology’s potential impact on human rights, especially for groups that are particularly vulnerable to harm, including women and girls, children, and minorities.
This proactive approach prevents foreseeable harms rather than attempting to address them after people have already been hurt.
So, what is the solution?
We continue to advocate for comprehensive and straightforward criminal laws that apply to all forms of tech-facilitated violence against women and girls. Existing laws addressing misogyny and sexual violence should be applied where suitable, but they should also be assessed against this new digital environment to ensure they are fit for purpose.
We need regulators to act at the platform level, holding companies accountable for systemic abuses, and we need the development and enforcement of laws requiring platforms to act swiftly to remove harmful content, preserve evidence, and support survivors when their technology has been used for harm.
Comprehensive survivor support, including counselling, legal advice, and takedown measures, is necessary to mitigate the real trauma caused by online abuse.
Are laws enough to end TFGBV?
Laws are a crucial starting point to create protective guardrails, but ending these harms requires systemic accountability across technology, government, and society. Addressing the underlying social attitudes that enable and perpetuate these forms of violence will contribute to a safer online space for all.
We need to challenge the misogyny and entitlement that drive abuse, and ensure that survivors, and those vulnerable to abuse, are consulted when AI tools and digital policies are developed.
What can policymakers and regulators do to combat TFGBV?
- Develop comprehensive legal frameworks that address criminal law, civil remedies, platform regulation, and survivor support
- Require safety-by-design principles in AI and other emerging technology development
- Work collaboratively across borders to create enforceable international standards
- Resist platform lobbying that seeks to weaken protective regulation
- Adequately fund survivor support services
What can technology companies do to combat TFGBV?
- Implement safety-by-design throughout the lifecycle of technology
- Conduct human rights impact assessments before releasing and during the use of new technologies
- Create effective mechanisms for content removal and evidence preservation
- Recognise that creating safer platforms ultimately benefits long-term sustainability and profitability
What can civil society do to combat TFGBV?
- Amplify survivor voices and experiences
- Hold platforms and governments accountable through advocacy and public pressure
- Challenge misogynistic attitudes that enable these forms of violence
Sources:
Equality Now
Dr Clare McGlynn
AUDRI digital principles