Re-building trust in the age of artificial intelligence

Against the backdrop of the horrific Israel-Hamas conflict, WACC partner the CDAC Network invited reflection on the theme “Communication in conflict: can we safeguard information in the age of AI?” with an inspiring public forum on 15 November in London.

The gathering spotlighted the need for trusted, timely, accurate information and two-way communication in situations of conflict. WACC General Secretary Philip Lee was among the 80 people present in person, with a further 60 taking part online.

Participants were asked to wrestle with questions around communication and conflict:

  • Without accurate and trusted two-way communication, how can people caught in conflict settings make critical decisions for themselves and their families?
  • How do they know how to seek help?
  • How do they connect with each other – and with aid providers?

“Being able to talk to people affected by crisis is absolutely essential to sustaining trust going forward,” said Helen McElhinney, CDAC Network executive director, in her welcome address. She noted that inaccurate mass and social media coverage, which now spreads instantly, has an immediate knock-on effect on conflict resolution.

Shifts in the digital environment

Robert Mardini, International Committee of the Red Cross director-general, in conversation with BBC News presenter Geeta Guru-Murthy, reflected on key conflict trends. He explored shifts in the digital environment and their consequences for people living through conflict and humanitarian response.

“Disinformation and fake news are not new in conflict situations. What is new is how quickly and how easily information can be created, circulated, and even weaponised by all actors,” Mardini pointed out.

He stressed that choices in situations of armed conflict are matters of life and death. “And we should remember that communication is also a form of aid in itself – protecting and saving lives.”

Communication on the frontlines

The panel “On the frontlines of information and conflict: how do people navigate the challenges?” highlighted first-person experiences. Panelists talked about which information is most vital; which channels and media inform lifesaving decisions; and – given the surge in online mis- and disinformation – which sources are trusted and why.

Communication with disaster-affected communities is a two-way street, according to the panel, and those from outside the communities need to understand where people are coming from and to work in close partnership with local organisations.

However, a plea for sensitivity was also made, since direct, unmoderated communication by international organisations with people in conflict situations can sometimes have dire consequences for their security.

The panel identified rumourmongering as a further communication problem, as it makes it difficult to ascertain facts and reliable information. In addition, panelists noted that there are different levels of access to information within communities themselves.

AI and global narratives

Lee named as a highlight of the day the panel on “Who shapes global narratives in today’s AI-enabled world?” The session pointed to the scale and sophistication of AI-fuelled disinformation campaigns in recent conflicts – from Gaza to Syria and Sudan – and explored the actors involved, efforts to respond, and who needs to be part of the conversation.

Participants heard that recent research indicates people perceive AI as synonymous with progress, as complex and hard to understand, and as risky – and that AI regulation is framed in technical rather than values-based terms.

Image: Helen McElhinney, executive director of the CDAC Network, and WACC General Secretary Philip Lee.

One commentator noted that in the West trusted sources are still consulted for big news, whereas in a country like Sudan, trust lies in the local community.

The panel raised questions still to be addressed, including: “What is the future we want and how do we shape it? How do we infuse it with the values we want? And who are we asking these questions of?”

Among the panelists and participants there was general optimism that communication technologies, including AI, could be used to break down barriers and create greater connectivity between people. Yet, ultimately, all present concluded that progress would not be made unless people take on the entrenched power of digital media companies.

Securing a safer communication future

The final part of the forum focused on scaling up digital opportunities and securing safer information landscapes. This panel looked at how to build resilience to emerging threats to media freedom and to create trusted communication in conflict situations. How humanitarian principles can inform sector take-up of emerging technologies was another focal point.

Panelists agreed on the need to keep affected communities at the centre, and for civil society to better understand the role it can play in amplifying the voices of those communities in the design and development of AI.

“The CDAC Network is to be congratulated on tackling perplexing issues that are both urgent and timely,” said Lee, who also represented WACC at the CDAC Network’s annual general meeting following the public forum.

Top image: Panel “On the frontlines of information and conflict” during the CDAC Network public forum on 15 November 2023.
