Facebook Parts Ways With African Content Moderation Firm Sama
Facebook has parted ways with its content moderation provider in Africa, Sama. This comes nearly a year after reports of low pay, trauma, and alleged union-busting at Sama’s Nairobi office.
According to Foxglove, a legal nonprofit that investigates Big Tech companies, the contract will be taken over by Majorel, a $2.2 billion European outsourcing firm.
Sama also remains a co-defendant, along with Meta, in a Kenyan lawsuit brought by former content moderator Daniel Motaung, who claims that both companies are guilty of multiple violations of the Kenyan constitution.
The content moderation provider has said that it is parting ways with Facebook due to the “current economic climate”. Sama further said it would let go of approximately 3% of its staff, mostly in Nairobi.
A Meta spokesperson confirmed the end of the contract with Sama in a statement.
“We respect Sama’s decision to exit the content review services it provides to social media platforms. We’ll work with our partners during this transition to ensure there’s no impact on our ability to review content,” reads the statement, as quoted by Time.
Sama’s contract to review harmful content for Meta, Facebook’s parent company, was worth $3.9 million in 2022, according to internal Sama documents.
All impacted employees would receive severance packages and “well-being support” for 12 months after their last day of employment, Sama’s statement said.
The remit of Sama’s Nairobi content moderation teams included Ethiopia, where Facebook has been accused of not doing enough to prevent the spread of incitement to violence amid a violent civil conflict.
On Feb. 6, a judge is scheduled to decide whether a Nairobi court has jurisdiction to continue hearing Motaung’s case against Meta. The social media giant argues that the case should not proceed because it does not trade in Kenya.
According to Foxglove, Majorel, the replacement, is no better than Sama in its treatment of moderators, a sign of how troubled the content moderation industry remains.
Despite several such lawsuits against social media companies like Facebook, YouTube, TikTok, and Reddit from around the world, workers are still often forced to spend hours-long shifts moderating some of the most graphic content on the internet, from child abuse material to videos of gruesome accidents and beheadings, often with very few protections in place for their mental wellbeing.