Privacy on the frontier of lawlessness

Privacy used to be taken for granted.

Ordinarily, the private life of an individual was not open to scrutiny, while public life was the concern of law, order and decency. In terms of communication, privacy meant that only the addressee could open letters or telegrams, and that telephone operators would not listen in on conversations. Unauthorised disclosure could be sanctioned.

In “Privacy is the new wilderness we must protect” (openDemocracy 2 August 2019), Maciej Ceglowski calls for an updated concept of privacy as a public good if we are to save our rights as individuals in today’s digital world.

Ceglowski calls this ambient privacy – “the understanding that there is value in having our everyday interactions with one another remain outside the reach of monitoring, and that the small details of our daily lives should pass by unremembered. What we do at home, work, church, school, or in our leisure time does not belong in a permanent record. Not every conversation needs to be a deposition.”

The digital surveillance system functions effectively and efficiently because it is automated. Computers monitor and sift, identifying key attributes, making connections and assessing probabilities. Commercial entities and security services use that data: the former to compile marketing trends and the latter to assess imminent culpability. China is using deep-learning systems to search in real time through video feeds capable of capturing millions of faces. State security is building an archive that will be used to identify suspicious behaviour in order to predict who will become an “unsafe” social actor.

In this context, Ceglowski argues:

“Our discourse around privacy needs to expand to address foundational questions about the role of automation: To what extent is living in a surveillance-saturated world compatible with pluralism and democracy? What are the consequences of raising a generation of children whose every action feeds into a corporate database? What does it mean to be manipulated from an early age by machine learning algorithms that adaptively learn to shape our behavior?”

The alternative is a world with no ambient privacy and little data protection, dominated by hostile governments and big corporations. It is the kind of dystopia that George Orwell identified in his novel Nineteen Eighty-Four in which “Power is in tearing human minds to pieces and putting them together again in new shapes of your own choosing.”

What shall we do to prevent it?

PHOTO: SHENZHEN, CHINA: Facial recognition technology used at street crossings to identify jaywalkers and automatically issue them fines by text. Offenders’ faces are displayed on screens at the crossings. Photo: StreetVJ/Shutterstock
