Why digital culture needs a new MacBride Commission

Stefania Milan

Forty years have elapsed since the publication of the ground-breaking MacBride Report. In terms of technological innovation, however, it might as well have been ten times as many. Since then, a great deal has changed in both media practice and public discourse. This article reflects on the legacy of the MacBride Commission in the realm of digital culture, broadly defined.

This transformation has affected the ways in which we consume media content, interact with each other, learn and work, engage in consumption and trade. The digital is now king. Societal concerns have shifted too: from satellites to the fifth generation of mobile networks (5G), from radio waves to podcasts, from Western news agencies setting the agenda of public debate to “global” social media platforms where everyone can have a say.

The transition to digital has been accompanied by flamboyant narratives of empowerment, fairness, and equality. Social networking services have been hailed as a “liberation technology” able to correct inequalities in access to the public sphere (Diamond, 2010). Mobile phones are for many people in infrastructure-poor countries a convenient – and often the only – way to trade goods and access news. Biometric identification and algorithmic decision-making increasingly permeate anti-poverty programmes in the Global South. Yet the digital revolution also embeds a great deal of inequality and discrimination. Most of the communications problems identified by the MacBride Commission haunt us to this day.

Living in the “datafied” society

The computational turn set in motion in the 1950s has accelerated spectacularly over the last two decades. Information in all its forms has become a central gear of modern capitalism. The advent of so-called Big Data (datasets so large that software is required to process them) has altered our personal lives and our urban environments (Kitchin, 2021). Cities have become “smart”, allowing local administrators to take informed decisions in near real time about public services. Human beings are “quantified” by an array of dashboards monitoring anything from blood sugar levels to sport performance. Service work is mediated by platforms and mobile apps that match workers with demand. Contact tracing apps and thermal cameras are central weapons in the fight against the Covid-19 pandemic (see Milan et al., 2021). In short, we live in an increasingly “datafied” society, where data have taken centre stage as a way of making sense of the world and intervening in it.

The datafied society harbours both novel possibilities and daunting challenges for its citizens. On the one hand, digital technology facilitates social life, from cruising across town while avoiding traffic to ordering take-away food to finding a sweetheart. The growing availability of data in the public domain, including the “open data” released by public administrations for everyone to peruse, helps bridge the gap between citizens and policymakers. Drones and sensors help citizens gather original data about environmental degradation to support their advocacy efforts. At first sight, the massive presence of surveillance cameras in public space might even translate into an increased sense of safety.

However, the datafied society also tells stories of intrusive citizen monitoring and latent discrimination. For instance, individual and group privacy is at risk with the adoption of security cameras implementing facial recognition, a technology known to discriminate against non-White individuals and to jeopardize the right to protest under authoritarian governments. Algorithmic decision-making in poverty-reduction schemes profiles and keeps watch over vulnerable people, who are left with limited capacity for intervention and redress. States can resist their citizens’ quest for transparency even in the datafied society, obscuring data, threatening datasets with deletion or making the process of obtaining information so cumbersome as to discourage citizens from taking action.

Faced with an increasingly complex technical ecosystem rife with socio-cultural consequences, two questions arise: are the problems identified 40 years ago by the MacBride Commission still relevant today? And how can we translate the core concerns of the MacBride Report to interpret the contemporary datafied society?

The MacBride Report today

The MacBride Commission was tasked with analysing the communication problems of modern societies, with a view to identifying viable solutions to further human development. The report that concluded the work of the group of experts, aptly titled Many Voices, One World, foregrounded three main concerns with respect to the communications systems of the time: excessive media concentration, the commercialization of media, and unequal access to information and communication, in particular for developing countries. A central theme was “the creation and diversification of infrastructures for the collection, transmission and dissemination of various messages” (MacBride, 1980, p. 68). Today, technology might have evolved, but not much else has changed.

Commercialization poses a threat to voice

Communications are increasingly mediated by proprietary platforms, including social networking platforms and chat applications. They sell a dream of empowerment and diversity but monetize user data and time. This is the so-called “attention economy”, in which user attention has become the new commodity. Services are nominally offered free of charge, but users become the product: their traffic data, social networks and preferences are sold to advertisers interested in customizing their messages. Microtargeting is an increasingly attractive proposition not only for selling products, but also in the marketplace of ideas – think of the role of political ads in electoral campaigns.

The commercialization of user data and interactions is made possible by the personalization algorithms that operate behind the scenes in platforms and apps. Personalization algorithms ensure that users are served messages and products in line with their tastes, including their political preferences. They are proprietary and inaccessible to independent scrutiny; built on machine learning, their functioning evolves over time and in unpredictable directions. Their impact on messages and on the way we view and consume online content raises at least two types of concern.

The first has to do with the ability of different voices to be heard in the digital sphere, when algorithms tend to privilege popularity over diversity. The second speaks to users’ ability to gain access to varied points of view, in a digital environment that favours sameness. Social media have been accused of pushing users into “filter bubbles” that prevent them from being exposed to divergent opinions, with potentially detrimental effects on democratic deliberation (Pariser, 2011).
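To make this mechanism a little more concrete, what follows is a minimal, purely hypothetical sketch (in Python) of how a feed ranker that optimizes only for engagement tends to privilege popular, familiar content, and how a diversity-aware variant might counteract that tendency. All topic labels, scores and weights are invented for illustration; the sketch does not reproduce any actual platform’s algorithm.

```python
# Hypothetical illustration only: a toy "engagement-first" feed ranker
# compared with a diversity-aware variant. All topics, scores and weights
# are invented; no real platform's algorithm is implied.

from dataclasses import dataclass

@dataclass
class Post:
    topic: str         # e.g. "politics-left", "politics-right", "local-news"
    popularity: float   # aggregate engagement signal (likes, shares), 0..1

def engagement_rank(posts, user_affinity):
    """Rank purely by popularity weighted by the user's past affinity for a
    topic, so the feed converges on familiar, already-popular content."""
    return sorted(
        posts,
        key=lambda p: p.popularity * user_affinity.get(p.topic, 0.1),
        reverse=True,
    )

def diversity_aware_rank(posts, user_affinity, diversity_penalty=0.9):
    """Same base score, but strongly penalise topics already shown so that
    less popular or less familiar viewpoints still surface in the feed."""
    ranked, shown_topics = [], set()
    remaining = list(posts)
    while remaining:
        def score(p):
            base = p.popularity * user_affinity.get(p.topic, 0.1)
            return base * (1 - diversity_penalty) if p.topic in shown_topics else base
        best = max(remaining, key=score)
        ranked.append(best)
        shown_topics.add(best.topic)
        remaining.remove(best)
    return ranked

if __name__ == "__main__":
    posts = [
        Post("politics-left", 0.9),
        Post("politics-left", 0.8),
        Post("politics-right", 0.6),
        Post("local-news", 0.3),
    ]
    # A user who has mostly engaged with one political leaning in the past.
    affinity = {"politics-left": 1.0, "politics-right": 0.2, "local-news": 0.4}

    print([p.topic for p in engagement_rank(posts, affinity)])
    # -> left-leaning posts crowd out everything else at the top of the feed
    print([p.topic for p in diversity_aware_rank(posts, affinity)])
    # -> other topics surface earlier once repetition is penalised
```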

Platform monopolies are today’s bottlenecks

Not only are interpersonal and social communications ever more caught up in economic dynamics – they are also controlled by a limited number of mega-corporations that hog the market for user data and attention. Take for example Facebook Inc. Headquartered, like most of its siblings, in California, it is a technological conglomerate that embraces the social networking platform Facebook, the photo sharing service Instagram, and the chat app WhatsApp. With 2.2 billion users at the time of writing and half a million new users added daily, Facebook Inc. is a huge player in online advertising, with a 77% share of social network ad revenue.1

Another tech giant, Alphabet Inc., illustrates the extent to which the digital market is vertically integrated. Created in 2015 following the restructuring of Google, it comprises subsidiaries active in artificial intelligence (DeepMind), autonomous driving (Waymo), smart cities (Sidewalk Labs), and drone-based product delivery (Wing), alongside the company’s original core business, internet services (Google).

Platform monopolies can be seen as the present-day equivalent of the Western news agencies widely criticized in the MacBride Report for their role in perpetuating cultural domination and technological dependence on the West. Platform monopolies jeopardize pluralism in ownership (and in worldviews), as the MacBride Report already anticipated. Not only is today’s tech and media industry characterized by a troublesome concentration of power in a handful of quasi-monopolist players – it is also the expression of a Silicon Valley “ideology”. The competitive advantage of platform monopolies echoes the worries of the MacBride Commission, which noted that “[a]s the amount of capital investment required in the communication industry rises, the control of financing and the provision of equipment tends to pass into the hands of large-scale enterprises since only they are able to raise the capital needed” (MacBride, 1980, p. 106).

Unequal access to infrastructure and content

As the MacBride Commission observed, developing countries often find themselves on the losing end. Today Western industry capital increasingly steps in to make up for the inability of developing countries to provide critical infrastructure like high-speed internet. For example, Loon, a subsidiary of Alphabet Inc. active until early 2021, was tasked with developing and marketing high-altitude balloons to bring the internet to the next billion users. But the distorted effects of industry concentration extend to users themselves, affecting their online experience. The controversial case of zero-rating, or free data products offered in developing economies, is a working example of the problem. Offering consumers a stripped-down version of Facebook’s services at no cost, the zero-rating programme Free Basics was accused of confining the Indian poor to a “walled garden” of its choosing (Prasad, 2017).

Concerns over the digital divide – that is to say, the gulf between the “haves” and the “have nots” of the digital revolution – have lost traction since the 1990s, even though, according to the International Telecommunication Union, only 51% of the world’s population today enjoys some form of access to the Internet.2 The market has been tasked with bridging the gap, with platform companies offering measures to correct the imbalance – as shown by zero-rating services. Unfortunately, the market imperative and technological determinism permeate the discourse on development to this day and have replaced concerns about inequality in access. Technology, now as then, is “theorized as a sort of moral force that would operate by creating an ethics of innovation, yield, and result,” as denounced by anthropologist Arturo Escobar (1995, p. 36) – obscuring the need for adequate policy interventions at the global level.

The grassroots fights back

In 1980, the MacBride Report called for democratizing communications and strengthening alternative voices. It identified communication as a basic individual right, advocating for a “right to communicate” as “a prerequisite to many other [rights]” (MacBride, 1980, p. 253). Despite today’s gloomy state of affairs, organized civil society has not given up its role as an advocate for equality and fairness in communication. We can distinguish three strands of mobilization and activism: the fight for digital rights, the creation of alternatives, and the promotion of awareness and digital literacy.

Digital rights – or the adaptation and extension to the digital realm of human rights like the right to privacy and freedom of expression – have replaced the right to communicate in activist discourse. To be sure, something has been lost in the translation of the right to communicate into present-day digital rights – namely the emphasis on autonomy from the market. Nonetheless, digital rights activists mobilize to defend users’ privacy against platform snooping, to ban facial recognition technology in public space,3 to gain the support of the tech industry in advancing human rights globally – and much more.

A second strand of activism follows more closely in the footsteps of the MacBride Report, creating alternative software tools and infrastructure for people to communicate on their own terms. Progressive developers build alternatives to commercial platforms, for example privacy-preserving chat apps. Unfortunately, however, social movements nowadays appear to have given in to the critical mass that only commercial social media can mobilize. As a result, many independent media projects of the 1990s and 2000s have folded, and this type of activism has lost much of its former popularity.

Still other activists seek to empower citizens to take informed decisions about their communicative actions online, educating them about risks and opportunities alike. They may teach people to generate data to support advocacy efforts, train human rights defenders in digital security, or engage in artistic projects aimed at nurturing technological “counter-imaginaries” in the population (Kazansky & Milan, 2021). Others develop software to help social media users reflect on their “information diet” and become aware of the ways in which personalization algorithms shape our worldviews.4

Conclusion

While memories of the MacBride Commission might have faded among activists for fairer communications, its legacy for contemporary digital culture is visible to this day. Its criticism of distorted market forces in the media and communications sector remains strikingly current. On the one hand, platform monopolies enjoy unrivalled power over users and states alike. On the other, technological innovations introduce potential new reasons to worry – think of artificial intelligence technology.

Without a doubt, our digital ecosystem urgently needs a new MacBride Commission able to produce a comprehensive critique of the state of play, and to identify corrective policy measures and directions for activists and practitioners to follow in their attempt to reclaim the central role of communications for human development.

Notes

1. https://financesonline.com/facebook-statistics/

2. https://www.itu.int/en/ITU-D/Statistics/Pages/stat/default.aspx

3. https://reclaimyourface.eu

4. https://tracking.exposed

References

Diamond, L. (2010). Liberation Technology. Journal of Democracy, 21(3), 69–83.

Escobar, A. (1995). Encountering Development: The Making and Unmaking of the Third World. Princeton University Press.

Kazansky, B., & Milan, S. (2021). Bodies Not Templates: Contesting Mainstream Algorithmic Imaginaries. New Media & Society, 23(2), 363–381. https://doi.org/10.1177/1461444820929316

Kitchin, R. (2021). The Data Revolution: A critical analysis of big data, open data and data infrastructures. Sage.

MacBride, S. (1980). Many Voices, One World. Report of the International Commission for the Study of Communication Problems. UNESCO.

Milan, S., Treré, E., & Masiero, S. (2021). COVID-19 from the Margins: Pandemic Invisibilities, Policies and Resistance in the Datafied Society. Institute of Network Cultures. https://networkcultures.org/wp-content/uploads/2021/02/Covid19FromTheMargins-1.pdf

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin.

Prasad, R. (2017). Ascendant India, digital India: How net neutrality advocates defeated Facebook’s Free Basics. Media, Culture & Society, 40(3), 415–431. https://doi.org/10.1177/0163443717736117

 

Stefania Milan (stefaniamilan.net) is Associate Professor of New Media and Digital Culture at the University of Amsterdam. Her work explores the interplay between digital technology, activism and governance. Stefania is the Principal Investigator of the project DATACTIVE (data-activism.net) and of “Citizenship and standard-setting in digital networks” (in-sight.it), funded by the European Research Council and the Dutch Research Council. She is the author of Social Movements and Their Technologies: Wiring Social Change (Palgrave Macmillan, 2013/2016), co-author of Media/Society (Sage, 2011), and co-editor of COVID-19 from the Margins: Pandemic Invisibilities, Policies and Resistance in the Datafied Society (Institute of Network Cultures, 2021).
