01 Jul 2019
A voluntary way of holding social media to account
Article 19 – the international freedom of expression organization – has proposed creating Social Media Councils (SMCs) as a way of moderating content on social media based on a “multi-stakeholder accountability mechanism”.
Article 19 argues:
“In today’s world, dominant tech companies hold a considerable degree of control over what their users see or hear on a daily basis. Current practices of content moderation on social media offer very little in terms of transparency and virtually no remedy to individual users. The impact that content moderation and distribution (in other words, the composition of users’ feeds and the accessibility and visibility of content on social media) has on the public sphere is not yet fully understood, but legitimate concerns have been expressed, especially in relation to platforms that operate at such a level of market dominance that they can exert decisive influence on public debates.”
Article 19 is proposing an open, transparent, accountable and participatory forum, grounded in international human rights standards, to address content moderation issues on social media platforms. The model relies on:
“A voluntary approach to the oversight of content moderation: participants (social media platforms and all stakeholders) sign up to a mechanism that does not create legal obligations. Its strength and efficiency rely on voluntary compliance by platforms, whose commitment, when signing up, will be to respect and execute the SMC’s decisions (or recommendations) in good faith.”
By adopting a self-regulatory/multi-stakeholder approach, Article 19 is counting on the SMCs:
- Being independent from government, commercial and special interests;
- Being established via a fully consultative and inclusive process;
- Being democratic and transparent in their selection of members and decision-making;
- Being broadly representative: the composition of the self-regulatory body should reflect the diversity of society;
- Having a robust complaints mechanism and clear procedural rules for determining whether standards were breached in individual cases, with the power to impose only moral sanctions;
- Working in the public interest: being transparent and accountable to the public.
The consultation document can be found here. The proposal is clear and insightful. Yet one wonders whether voluntary compliance will be sufficient to address issues of personal and data privacy, harmful content, and online hate speech – especially as this particular watchdog will have few regulatory teeth and will need sufficient, sustainable long-term funding. It is not clear who will foot the bill.
Photo above courtesy of Article 19