
MUGAMBI KIAI: Social media giants promised free expression; they must not fuel violence or exploitation

None of us wishes to see a repeat of 2007.

by BRIAN ORUTA

Opinion | 26 June 2022 - 09:48

In Summary


  • Done safely and well, social media offers an important tool for free expression, like the free press and the broadcast media before it.
  • And to fix the social media landscape, we must turn our attention both to the spread of hate – and the people we rely on to fight it.

Once more, we are approaching a cliff’s edge. Since 2007, we have witnessed in our elections an exercise in public mud-slinging that is both dispiriting to watch and dangerous for our democracy. I have written of these woes often.

Too much modern political communication teaches us little of the candidates and works instead to divide us. None of us wishes to see a repeat of 2007.

But to address these moments of political crisis, we must grapple with all the causes. Today I wish to draw back the curtain to look at one of the risks we face: social media.  For just as we cannot understand the risks of post-election violence in 2022 without understanding 2007, it is impossible to discuss the health of Kenyan democracy today without discussing the social media titans. 

Free expression and development are what the Silicon Valley giants sold Kenyans when they entered the country. But it has regrettably emerged that the service these giants offer – especially Meta, which owns Facebook, WhatsApp and Instagram, services used by millions of Kenyans every day – falls far short of the ideal.

Done safely and well, social media offers an important tool for free expression, like the free press and the broadcast media before it. This is why Article 19, an organisation that fights for freedom of expression, regularly challenges problematic takedowns and demands more transparency from social media firms. Public debate happens on Facebook, so people must access it on free and equal terms.

What has Facebook meant for the public square in Kenya – and in Africa as a whole? The picture is, at best, mixed.

The silent editor – lifting up hate for profit

If you use Facebook, try an experiment. Log in and bring up your Facebook Feed. Have you ever considered why you are seeing the posts you see? Of course, what you are reading is not a newspaper – no editor has selected it for newsworthiness. But that does not mean what you see is neutral – far from it. It has been chosen by an opaque set of processes inside Facebook.

Facebook’s internal systems (which observers sometimes call the ‘algorithm’, though it is really many systems) promote specific posts for ‘engagement’. Facebook will tend to show you the posts it thinks you are likely to click, like, or share.

The reason is simple: profit. The more we engage, the longer we stay on Facebook and the likelier we are to click an ad, sending money to Facebook (whose 2021 revenue was about $117 billion – more than the GDP of Kenya).

This sounds innocuous until you realise what content tends to win the lottery and go to the top of your feed. Hate. Misinformation. Violence. Lies. These provocations are highly engaging – and thus, sad to say, profitable for Facebook. Unless Facebook actively takes steps to demote this material, the system it has set up by design will promote the very material that brings our people to the cliff’s edge every election season.

It goes further. Last year, a whistleblower at Facebook named Frances Haugen released thousands of internal documents to American regulators. The documents show that Facebook spends 87 per cent of its misinformation budget on the United States. Kenya – part of what Facebook lumps together as the “rest of world” – must share the remaining 13 per cent with billions of other Facebook users.

These crumbs from the giant’s table are not good enough. They mean Facebook is woefully under-resourced to deal with viral hate or misinformation when it happens in Kenya, or in our neighbours such as Ethiopia.

No moderation without moderators

One consequence of this paltry budget is that there are few staff to moderate Facebook. We are often told Nairobi is an African ‘tech hub’. For Facebook, that means in practice that Nairobi is where a small and ill-equipped army of ‘content moderators’ takes terror and hate off our social media feeds.

Moderators – the front-line fighters against hate, violence, and misinformation online – absorb this poison so you and I don’t have to. This is gruelling, dangerous work, done for low pay. After an American news magazine exposed the working conditions of moderators here in Nairobi – who are employed by an outsourcing firm, Sama – Facebook has failed to clean up its work floor. Indeed, in a Nairobi court tomorrow, Meta will argue that the Kenyan courts have no jurisdiction over it – because it does not trade in Kenya. Those of us who use it daily, or who advertise on Facebook, may find that surprising.

As we move closer and closer to our election, many of us will debate politics online. But we should all think carefully about the platforms on which we are holding this debate – and the steps we must take to make them more accountable to us. It is impossible to have a working democracy with a broken social media landscape. And to fix the social media landscape, we must turn our attention both to the spread of hate – and the people we rely on to fight it.

Mugambi Kiai is the Regional Director, Eastern and Horn of Africa, at Article 19, a global freedom of expression watchdog. The views expressed in this column are entirely his own.

