Overcoming the “social dilemma”: Problems of AI content curation and its alternatives

Introduction

This article series by Heinrich-Böll-Stiftung's Hong Kong office explores possible alternatives to the internet’s current mechanisms and practices in content curation.


The 2020 docudrama The Social Dilemma portrays our digital world as a gloomy place in which social media has harmed society through surveillance capitalism, invasive data analytics, and manipulative design. When the internet emerged a few decades ago, civil society was optimistic about sharing information and experiences across social boundaries. But as it developed, the internet diverged from its early promise of democratisation.

With habits and usage patterns already entrenched among social media users (a phenomenon social scientists call “path-dependent lock-in”), and with big tech companies dominating the field, the manipulation of content and user data seems an unstoppable avalanche. Google officially dropped its early motto “Don’t Be Evil” from its corporate code of conduct in 2018, and WhatsApp announced plans to share extensive user data with its parent company Facebook from February 2021, despite widespread user concerns.

Ordinary internet users may lack the power that big tech “champions” (star innovators) have to shape and reshape the internet’s architecture, protocols, and underlying manifestos. But given the decentralised, bottom-up nature of the internet, if a significant number of users were to propose and practise new behaviours in pursuit of the “good digital life” they really want, they could in theory initiate powerful paradigm shifts and enable new modes and platforms of communication.

Starting today, we will publish three articles by Kin Ko, one per week, outlining his proposals for overcoming the “social dilemma”. The problem, he argues, largely originates in a reward system built on free content and an advertising model based on pageviews. Most of the resulting income goes to the large social media platforms, while original content creators are forced to earn a living from auxiliary projects. A reward system based on “likes” favours piecemeal, sensational information that appeals to readers, thereby shaping the priorities and focus of writers and news organisations. Today’s first article discusses how to reward original creators through small donations.

Another dimension of the social dilemma is that, to generate pageviews, social media platforms use artificial intelligence algorithms to match readers with content they are likely to engage with. This fails to provide users with a healthy mix of heterogeneous views: the combined effect of AI algorithms, social media platforms, and advertising is to trap users in echo chambers that reinforce homogeneous and extreme viewpoints. In this environment, eradicating misinformation becomes extremely difficult. Kin Ko’s second piece suggests that fact-checking and assistive intelligence may help to dismantle echo chambers.

Users may strive to uphold the values, attitudes, and practices important to their lives and communities, but some values are deliberately ignored or suppressed by big tech companies and power players. The last article in this series introduces an alternative token economy as a possible means of defending users’ core values against the diversionary tactics of big tech and powerful institutions that control the internet’s curation mechanisms.

Reflections on the dark side of industrialisation gave rise to the green movement in the 1960s. Where industrialisation poisoned the environment with insecticides and air pollution, the explosive digital revolution of the last three decades has infected internet users with addictive usage habits, infringed on their privacy, and siphoned off the profits of their content creation. It is time to reflect on the dark side of digitalisation, and to come up with greener options: digital technologies beneficial to society and humanity.

Lucia Siu
Programme Manager - Technology
Heinrich-Böll-Stiftung Asia Global Dialogue, Hong Kong

 

This article was first published on hk.boell.org.