Meta's Facebook is in hot water over allegations that it has neglected the mental health of its content moderators.
After a lawsuit was filed against them in the Kenyan capital on May 10 alleging inhumane working conditions, social media giant Meta and its Nairobi-based content review subcontractor Sama will have to defend themselves in court.
Nzili and Sumbi Advocates filed the complaint on behalf of South African content moderator Daniel Motaung, who claims that Sama dismissed him in 2019, just a few months after he started the job, for attempting to form an employee union.
The allegations could be a setback to Zuckerberg's efforts to establish Meta as a top-name brand.
Meta's Lawsuit in Kenya
Motaung is seeking compensation on behalf of current and former content moderators. He is requesting that both firms provide mental health counseling for moderators who have spent thousands of hours watching graphic video clips depicting rape, murder, and child abuse for little compensation.
He is also asking the court to order the two companies to stop obstructing workers' efforts to form a union.
Furthermore, according to Engadget, Sama uses deceptive job advertisements to recruit poor people from across the region, including Kenya, South Africa, Ethiopia, Somalia, and Uganda. The employees claim they were never told that they would be working as Facebook moderators or that they would be required to watch distressing content as part of the job.
The complaint also questions Sama's 'deceptive' recruitment tactics, alleging that the company advertised positions such as call center agent and content moderator without giving candidates accurate job descriptions.
During his first six months on the job, Motaung says he watched a video of someone being decapitated, and he was ultimately sacked for attempting to lead workers' unionization efforts.
Sama, a startup that describes itself as an "ethical AI" company providing "dignified digital jobs" to individuals in areas such as Nairobi, does keep counselors on hand. Staff, however, were largely skeptical of the counselors, and Sama allegedly rejected the counselors' recommendation to let workers take wellness breaks throughout the day.
Meta's Content Moderators
Meta is no stranger to the problems its content moderators face on a day-to-day basis. Moderators around the world have reported the same issues and have complained to the company about how mentally taxing their jobs are.
As a recent example, Meta agreed in 2020 to pay $52 million to its content moderators in the United States to settle a class-action lawsuit. Moderators who had been diagnosed with psychological problems related to their jobs were each eligible for up to $50,000.
Last year, TikTok faced a similar class-action lawsuit from employees working as content moderators.
TikTok's Content Moderators
The lawsuit against TikTok and its parent company, ByteDance Inc., was brought on behalf of almost 10,000 of their content moderators. The employees complained of frequent exposure to disturbing content, including child sexual abuse, rape, beheadings, and animal cruelty.
As previously reported, content moderator Candie Frazier said in the class-action lawsuit that she had sat through videos depicting cannibalism, crushed heads, school shootings, suicides, and a horrific fall from a building.
Just like Meta's Facebook, TikTok has been criticized by moderators for failing to take psychological hazards seriously and for failing to provide its employees, particularly moderators, with adequate mental health support.