
Jacinda Ardern met with Emmanuel Macron in Paris in May 2019 to discuss the Christchurch Call. Photo: Getty Images
Those who would like to see the worst excesses of the wild west that is the internet reined in may have high hopes for the work of the Christchurch Call.

That is the international initiative launched by New Zealand Prime Minister Jacinda Ardern and French President Emmanuel Macron seeking the elimination of terrorist and extremist online content after the live-streaming of the horrific mosque killings in Christchurch in 2019. Distressingly, that Christchurch video is still reverberating, referenced this month in a United Kingdom court case involving a man appearing on 10 terrorism-related charges.

Ahead of last weekend’s Christchurch Call leaders’ summit, news of the United States formally joining the community of supporters was lauded. Commentators suggested this might help the Call gather momentum, although it is worth noting that Russia and China are not rushing to sign up. The Call community is a voluntary network where supporters pledge to work together to stop violent and extremist content being posted, but without compromising human rights including freedom of speech.

In the past two years, crisis response protocols developed between the supporters have been used to remove online broadcasts of two terror attacks, in Germany and in Arizona. The community now comprises 32 supporting countries, along with Unesco and the Council of Europe, and 10 online service providers, including heavy hitters Amazon, Facebook, Google, YouTube, Twitter and Microsoft.

It is not difficult to agree with President Macron when he says the pervasive threat of terrorist and hate content online fuelling extremism and terrorist actions is a global issue requiring a collaborative response by governments, tech companies and civil society, all supporting a free, open and secure internet. Getting an effective response is not simple. Moves by the technology giants to moderate awful content to date have generally been too little, too late; their tardy treatment of Donald Trump is a case in point.

The New York Times podcast series Rabbit Hole, exploring how the internet is changing and how it is changing us, described the internet as largely being run by "sophisticated artificial intelligences (AIs) that have tapped into our base impulses, our deepest desires, whether we would admit that or not". That information is then used to show us a reality that is "hyperbolic and polarising and entertaining and, essentially, distorted".

And as the AIs keep showing us this distorted reality, we keep paying attention, and that tells them we would like to see more of it. So we are living inside a loop that is getting faster and faster, "showing us a world that is getting more and more distorted".

It is timely that Ms Ardern is emphasising the need to improve understanding of algorithmic processes which have the potential to cause harm, or to radicalise or incite acts of terrorism and violent extremism, and to develop interventions to address these issues.

The online service providers who support the Call have committed to reviewing the operation of their algorithms and are considering using them, and other processes, to redirect users away from extremist content. Any changes, however, must come without "compromising trade secrets or the effectiveness of service providers’ practices through unnecessary disclosure", according to the Call commitment.

Since the algorithms ensure the tech companies make money, we question whether the big companies will be prepared to lift their game far enough voluntarily. Those working in traditional media outlets, subject to much more outside control and accountability, are understandably cynical. Without regulation, it is hard to see improvement that is not weighed down or watered down by the self-interest of big technology happening quickly, or even at all.

In the meantime, individual internet users can play a part by moderating their own online behaviour and increasing their awareness of how the algorithms might be pushing them towards a distorted view.

Comments

Programmed bias enables incitement. Close them.

Sunlight is the best disinfectant.
I'd much rather see who the radical left-wing extremists and eco-fascists are, so I can avoid them.
Trying to cleanse the internet of their views or suppressing their ramblings for 30 years doesn't fix them; it just pushes them underground, at the expense of everybody else's freedom of speech and free thought.
It is getting harder to believe, but NZ is not run by a dictatorship.