
Online Abuse: US Supreme Court Rulings on Tech Companies' Responsibility to Protect Users from Harmful Content

Posted in: Technology | June 8, 2023

On 18 May 2023, the US Supreme Court ruled in favour of Twitter and Google, preserving the limited liability shield that Section 230 of the Communications Decency Act (CDA) affords platforms for user-generated content.

In the judgement involving Google, the court declined to limit the scope of Section 230, which frees digital platforms from liability for user-generated content. In the judgement involving Twitter, the court ruled that a separate law allowing suits for aiding terrorism did not apply to the ordinary activities of social media companies.

Neither ruling, however, definitively resolved the question of what responsibility digital platforms should have for the content posted on and recommended by their sites.

What is Section 230?

Section 230 of the CDA was a reaction to a court decision holding an online message board liable for what a user had posted because the service had engaged in some content moderation. The provision states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In other words, the internet service provider (ISP) is a conduit of the information and not legally liable for it. Courts have interpreted Section 230 expansively, finding that ISPs are neither criminally liable for failing to monitor or limit the harmful content that users post on their platforms nor civilly liable for the content on their systems.

As a result, Section 230 helped enable the rise of social networks like Facebook and Twitter by ensuring that the sites did not assume legal liability with every new tweet, status update and comment (user-generated content). Limiting the scope of the law could expose digital platforms to lawsuits claiming they had steered people to posts and videos that promoted extremism, urged violence, harmed reputations and caused emotional distress.

What is the issue, and what needs to change?

Digital platforms are no longer mere conduits of user-generated content, but “they are knowingly choosing profits over people,” said Representative Frank Pallone Jr., the chairman of the Energy and Commerce Committee. In the Twitter judgement, the court acknowledged that “platforms use algorithms to steer users toward content that interests them.” 

Digital platforms use AI and machine learning systems that amplify age-old stereotypes and biases, feeding misogyny and racial and religious divisions. Their algorithmic recommendation systems are pervasive and powerful, and they perpetuate and amplify online sexual exploitation and abuse with impunity, with women and girls disproportionately affected.

As a result, tensions arise when efforts to ensure safety and protect users from online violence and exploitation are seen as infringing on other rights, in particular the rights to freedom of expression and privacy. There are concerns that regulating what users post online, and holding digital platforms liable for user-generated content, will lead to self-censorship and/or to platforms erring on the side of caution and removing content, which would in turn infringe on users’ freedom of expression. Traditionally, where such tensions between rights and interests arise, courts apply the proportionality test of the International Covenant on Civil and Political Rights (ICCPR), which provides that, in the event of a crime or a violation of the rights of others, the rights of the alleged offenders can be limited if the limitations are legal, legitimate, necessary and proportionate.

The problem is that, in practice, it is digital platforms, not courts, that balance these rights and interests day to day through their online content moderation policies. In the US, a growing bipartisan group of lawmakers, academics and activists has grown sceptical of Section 230, arguing that it has shielded Big Tech from consequences for disinformation, discrimination and violent content on their platforms. A reform of Section 230 is needed to protect all people, especially those who are vulnerable, and to combat the evolving nature of online abuse, while balancing concerns about privacy, freedom of expression and innovation.

Members of the US Congress have also called for changes to the law, but political realities have largely stopped those proposals from gaining traction. In February 2023, the Safeguarding Against Fraud, Exploitation, Threats, Extremism and Consumer Harms (SAFE TECH) Act, which would reform Section 230 and allow social media companies to be held accountable for enabling cyber-stalking, online harassment and discrimination on their platforms, was reintroduced in the Senate. It remains to be seen how this bill will fare.
