Empowering women by tackling online harassment and abuse

Section 230 of the U.S. Communications Decency Act (CDA) of 1996 says, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

In other words, online intermediaries that host or republish speech are protected against a range of laws that might otherwise be used to hold them legally responsible for what others say and do.

Protected intermediaries include not only regular Internet Service Providers (ISPs), but also a range of “interactive computer service providers” – basically any online service that publishes third-party content. With important exceptions for certain criminal and intellectual property-based claims, CDA 230 creates a broad protection that has allowed innovation and free speech to flourish online. The legal protections provided by CDA 230 are unique to U.S. law. European nations, Canada, Japan, and the vast majority of other countries do not have similar statutes on their books.

In the USA, this is what allows YouTube and Vimeo users to upload their own videos, Amazon and Yelp to offer countless user reviews, craigslist to host classified ads, and Facebook and Twitter to offer social networking to hundreds of millions of Internet users. The sheer volume of user-generated content makes it impossible for online intermediaries to prevent every piece of objectionable material from cropping up on their sites. If they faced potential liability for their users’ actions, most would likely not host any user content at all, or would need to protect themselves by actively censoring what is said, seen, and done online. However, when platforms have censored user-generated content that is critical for women’s rights, such as information on sexual and reproductive health, this has often provoked a backlash.

At the same time, Facebook, Twitter and YouTube have become essential spaces for women to participate in civic life, particularly women journalists, activists, human rights defenders and others with public-facing professions. So, while strong protection for intermediaries is necessary to ensure that the Internet remains a place where everyone can speak freely in real time, technology and social media companies still have a responsibility to ensure that women are able to access and use their services and platforms without facing gender-based online harassment and abuse.

There is no shortage of proposals on how technology companies could counter the misogyny, social prejudice and sexism enabled through their platforms. They must listen, respond and take concerted action to tackle gender injustice online and offline.

Similarly, regulators need to acknowledge the important place of media and communication within the broader objective of promoting gender equality and women’s empowerment.
