There are laws about what can be seen or said in public. So why don’t they apply to social media?
In principle they do. The problem is enforcing them. In part it’s a problem of scale.
There is nothing new about hate speech. What has changed is the mode of delivery.
In Nazi Germany, it was state-controlled newspapers and radio. At the time of the genocide in Rwanda, it was a radio station run by the Hutu government. Today, it is social media, until recently largely unregulated.
Last year the South African president elevated the epidemic of violence against women to national crisis level following pressure from activists, promising to put in place a public national register of offenders, a review of cold cases and harsher penalties for perpetrators.
No apologies for quoting at length from “The Media Isn’t Ready to Cover Climate Apartheid” by Michelle García (The Nation, 17 June 2020).
While praising the public service ethic of many media outlets, whose coverage of the Covid-19 pandemic has been exemplary, she notes an apparent reticence or inability to delve in depth into its impact on the most marginalized. She also questions media preparedness for the greater crisis to follow:
Image: United Nations COVID-19 response
In times of disaster, engaging with affected communities to share useful, timely and accurate information is increasingly recognised as essential.
A group of 153 academics, writers, and social activists published a letter in Harper’s Magazine (7 July 2020) expressing concern that “a new set of moral attitudes and political commitments” is tending “to weaken norms of open debate and toleration of differences in favor of ideological conformity”.