The page “was left up and not only left up, it was deemed not threatening, not a danger when they’re clearly people blatantly inciting violence, saying they’re going to shoot Black people,” Gittings told a news conference organised by the activist group Avaaz.
The tragic incident highlighted concerns that social networks such as Facebook are being used to foment real-world violence with little or no control by the platforms.
Facebook and other social platforms, which are also often used to organise peaceful events and pro-democracy movements, have been condemned for failing to stop a range of abusive and hateful content including organised violence such as the massacre of the Rohingya minority in Myanmar and the beheading of French schoolteacher Samuel Paty near Paris.
A Facebook spokesperson, queried by AFP, said, “We remain vigilant when it comes to policing hate speech, calls for violence, and misinformation.”
The company said that since August it identified over 600 militarised social movements, and removed their pages or accounts, as part of an effort that took down 22.1 million posts containing “hate speech.”
“We always know there is more to do, which is why we’re constantly working to improve our technology and tighten our policies when necessary to keep dangerous content off our platform,” the company said.
But critics say Facebook still falls short on many occasions.
Executives “need to understand that what happens on Facebook doesn’t just stay on Facebook,” said Joyce Jones, a mayoral candidate in Alabama who had to fight Facebook rumours that dogged her campaign.
“It goes home with us, it goes to the grocery store with us, it goes to our jobs. Our children are affected.”
Social platforms have also been criticised for doing too little to stop deadly misinformation about the coronavirus and in some cases even amplifying hoaxes and false information through algorithms designed to boost engagement.
Kristin Urquiza, whose father died from COVID-19 this year, saw the memorial page on Facebook flooded with abusive comments from people minimising the health crisis or questioning the use of face masks.
“Facebook may not have pulled the trigger, but Facebook did drive away the getaway car,” she told the Avaaz conference.
Critics of Facebook and other social networks argue they should be held accountable for violence organised on their platforms, calling for reforms of a law which shields internet services from liability for content posted by third parties.
But some analysts argue that the platforms can’t bear full responsibility for the deeper social problems which have led to extremism and violence in the streets.
Mark Potok, a fellow with The Centre for the Analysis of the Radical Right, said social networks such as Facebook have been useful to extremist groups in “very quick mobilisations” in Kenosha and other places.
But he added that “there’s an inclination to say it’s all about social media, and that’s not true.”
Potok said militants have found ways to organise with or without Facebook, and that many extremists are now gravitating to fringe platforms with little or no moderation.
“I don’t think these platforms will be able to police all the extremist content out there,” he said.
“There are such enormous numbers being put up every day. I doubt these companies can eradicate their influence.”