
Facebook must act on hate speech and incitement to protect lives

Davis Malombe, KHRC Executive Director.

Abuses matter just as much when they are fuelled by corporations as by the state, especially in this digital age. That is why the Kenya Human Rights Commission is calling on the American tech giant, Facebook, to honour Kenyan lives and do more to stop incitement to violence and hate on the platform.

This isn’t about censorship – it is about making one of the world’s richest companies spend even a fraction of its resources on preserving Kenyan life. They are obliged to do this, and if they do not, our protective agencies should hold them to account.

The risk to lives is real. The National Cohesion and Integration Commission has published a report estimating the risk of post-election violence at 53 per cent. While we all pray that we will not see a repeat of the horrors of 2007, the hate speech piling up on social media risks making matters worse.

From India to Myanmar to our Ethiopian neighbours, the record is clear: Facebook fails to act when hate spreads on the platform, people carry out attacks, and lives are lost. Apologies after the fact bring no one’s loved ones back.

The company isn’t doing anywhere near enough. In response to a shocking report last week, which showed Facebook failed to catch every single one of the hate and violence-inciting ads submitted in a Global Witness-Foxglove study, Facebook’s local head Mercy Ndegwa cited ‘37,000’ pieces of hate content the firm had taken down. But this statistic is likely to mislead.

How much hate and violence is on Facebook at the moment? The tech giant’s internal estimates from last year indicated Facebook itself thinks it only catches 3 to 5 per cent of hate, and less than 1 per cent of violent incitement. If that is true – Facebook hasn’t said – thousands of posts could still be there.

The company should openly admit what it knows of the risk to Kenyans – how much of the problem it can see – and exactly what it is doing to fight it. At the moment it refuses even to say how many Kiswahili speakers it has, or the size of its election safety team. This information is vital to our national security, yet they are secrets held by a foreign company that will not share them with us.

Forcing Facebook to finally be transparent about what it already knows about hate speech on newsfeeds in Kenya is just the start of solving this problem. There is far more it could do. We know this because in countries that matter more to Facebook – not least the United States – it has stepped in to do more. After the attack on the US Capitol on January 6 last year, Facebook took a series of emergency steps, which it calls the “Break Glass” steps, to stop incitement and hate spreading.

These are clear and simple actions – like demoting video and using something called “reshare friction,” Facebook’s language for ‘stopping posts going so viral’ – that it could take here. I note that Facebook, so far, has refused to say whether it has taken a single “Break Glass” step in the case of Kenya.

There are some who decry this as ‘censorship’ or a battle over freedom of speech.

Although Facebook has refused to answer questions about “the nature of its conversations with the Kenyan government,” two CSs have already come to its defence. In a statement to Reuters, CS for ICT Joseph Mucheru cited the same “37,000 pieces of content taken down” statistic as Facebook’s own spokesperson. His counterpart, Interior CS Matiang’i, also publicly ridiculed the NCIC, implying the ‘Government would not take any recommendation to suspend Facebook into account at all’. Both CSs, however, were mute on the serious claims alleged against Facebook, even as fears of impending election-related violence continue to rise.

Let me be completely clear: No one wants to see Facebook shut down in Kenya. To Matiang’i and Mucheru: We can have social media, freedom of expression and also a safe election.