Individuals have limited control in the digital world

Can our democracy survive the digital era, asks Tom Koppe.

Prime Minister Bill English has announced that ''computational thinking'' and ''designing and developing digital outcomes'' will become part of the curriculum for all children (ODT, 29.6.17). This step signals an acknowledgement of contemporary society. I hope there will also be time for ''digital awareness''.

People - and especially youngsters - gain almost all their information through digital channels. However, the online sources, incentives and outcomes which affect our behaviour are developed inside a hidden black box. Recent studies show we should not underestimate the influence of these mechanisms on our behaviour.

Moreover, we can even ask whether our democracy can survive this digital era. In light of that question, the commonly held idea that the internet increases our freedom and fosters our democracy becomes uncertain.

According to the Belarusian thinker Evgeny Morozov, our information infrastructure is occupied by the mechanisms of ''digital capitalism''. Almost all our daily information goes through the systems of a few companies, such as Google and Facebook.

They have gained a dominant position through an effective business model: they provide free information while we hand over our personal data. This data is used to deliver personalised information and advertisements, which their shareholders expect them to leverage for profit. Data is therefore the holy grail of the digital world.

The more personal data these companies hold, the more value they can deliver to advertisers who want to target people based on their preferences.

At a political level, personal data was also used during the Brexit referendum and in the US presidential campaigns, by Democrats and Republicans alike.

Psychologist Michal Kosinski developed a method which creates personal profiles of people based on their social media behaviour. In Das Magazin (December 2016) he described how, from just 68 Facebook likes, a user's skin colour can be predicted with an error rate of only 5%. Hidden algorithms continually judge what we do in order to categorise our skin colour, religion, sexual preference and so on.

Campaign teams use this information to select the people they can influence in the areas where it matters. On the one hand, they show specific groups information that benefits them and supports their candidate; on the other, they target those groups with negative information that smears the opponent.

This tendency undermines the idea that voters choose their candidate on the basis of valid and complete information.

Philip Howard, professor of internet studies at Oxford, has studied online propaganda and told The Guardian that ''the lies, the junk, the misinformation'' are widespread online and amplified by the algorithms of social media.

Combined with increasingly sophisticated voice and video-editing software, this can become worse. For example, it is possible to place a fabricated, offensive racist statement by an opponent on someone's Facebook timeline just before they vote. There will not be enough time to verify the information, and the voter will at least be left in doubt about his or her earlier judgement.

Manipulation becomes easier in the digital era, and we must be more aware of this. We simply cannot know whether politicians are manipulating us in such a disingenuous way.

Voters are locked in a bubble where they see only information from top-down sources that hold the most precise personal data.

Currently, we have limited control over our digital world. The dominant companies are barely accountable, and there is a lack of transparency and countervailing power.

We may have to raise awareness and strengthen institutional structures, as we do in our physical world, in order to uphold a stable democracy.

-Tom Koppe is a Dunedin freelance writer.