A Facebook moderator based in Kenya is suing the platform’s parent company, Meta, over his work filtering posts that violate the platform’s rules.
On his first day of screening posts, Trevin Brownie, a South African, watched a video of a man taking his own life while a three-year-old boy played nearby, oblivious to what was happening.
The child in the video took two to three minutes to notice something was wrong and call out for his father. Once an adult entered the room, the recording stopped.
Mr Brownie said his work routinely exposed him to the most disturbing aspects of human behaviour, including child abuse, torture and suicide bombings.
He says his time as a Facebook content moderator has cost him part of his humanity, even though he still cares deeply about others.
“Death and seeing death became a norm for me,” he says, adding that deaths no longer affect him as he feels they should.
To Mr Brownie, moderation workers play a crucial role in safeguarding users, particularly during the pandemic when people heavily relied on the internet. He also appreciates Facebook’s ability to bring people from all over the world together.
In January, Sama, the company operating Facebook’s main moderation hub for East Africa, announced that it would no longer offer content review services to social media companies.
Last month, it laid off 260 moderators, including Mr Brownie, as it shifted its focus to annotating videos for training AI computer vision systems.
Mr Brownie expressed his worry about the future, as he and his fiancée planned to get married, and his family in South Africa depended on the money he sent them. He also felt that he had given his soul to the job and sacrificed his humanity for it, only to be laid off.
Mr Brownie admits he would have declined the job had he known its nature beforehand. Even so, he believes the work is important, and he is good at it, having been promoted to a more senior position. He wants to keep doing it, but with more support for his mental wellbeing.
Mr Brownie is one of a group of 184 moderators who, backed by campaign group Foxglove, are suing Meta, Sama, and Meta’s new contractor, the Luxembourg-based firm Majorel.
On Thursday, a court ruled that Meta could be sued for unfair termination in the case, despite the company’s efforts to distance itself from the action. Cori Crider, a director at Foxglove, called the ruling “a milestone”, adding that “no tech giant, however wealthy, should be above the law”.
Under an interim ruling against Meta and Sama, the moderators’ contracts cannot be terminated and they must continue to be paid until the case is decided. The moderators believe they were laid off in retaliation for complaints about working conditions and for unionisation efforts.
In their petition to the court, the moderators claim they were discriminated against and denied employment at Majorel because they had previously worked at the Sama facility.
Text messages shared with their legal team and seen by the BBC indicate that a third-party recruiter told moderators who expressed interest in working at Majorel that candidates from Sama were strictly not accepted.
Meta declined to comment while legal action is ongoing, but said it requires its contractors to provide on-site support from trained practitioners, available around the clock, and access to private healthcare from the first day of employment.
A spokesperson for Sama also declined to comment on the ongoing legal action, but said the company paid its moderators fair, local living wages that ranked among the top 12 paying jobs in Kenya. The spokesperson added that Sama offered extensive mental health services, including licensed and trained mental health professionals on-site, a 24-hour hotline and virtual consultations, and that its wellbeing service remains available for 12 months after an employee’s last day. Majorel declined to comment while the legal action was ongoing.
The BBC has seen emails sent by several moderators to Sama expressing frustration that, because of the injunction, the company cannot provide termination benefits such as free flights to their home countries. Two of the emails praise Sama’s working conditions, while one person expresses dissatisfaction with the legal action.
In February, a Kenyan court granted Daniel Motaung, a former moderator, permission to sue Meta over allegations of poor working conditions. Meta is also facing legal action in Nairobi over allegations that its algorithm contributed to the spread of hate and violence on the platform during Ethiopia’s civil war.
Source: bbc.com