With the 2025 Africa Day celebrations done and dusted in May, we must pause to reflect on the state of online safety on the continent and listen to the desperate plight of African digital platform workers, including content moderators in Kenya, whose task is to keep social media platforms clean and free from harm.
This is relevant as the theme for this year’s Africa Day, held on May 25, was the year of reparation and justice for Africans and people of African descent. These issues are at the core of human rights, bearing directly on the reparation and justice we are striving for this year.
The unprecedented levels of harmful content online across all categories, fuelling ethnic and geopolitical tensions, underline the importance of online safety not only to this year’s theme but to building Agenda 2063: The Africa We Want.
Sadly, there are no legal instruments in many parts of Africa to hold those who distribute this harmful content on these platforms accountable. This is important as:
- digital economy and infrastructure master plans are launched in many parts of the world;
- regulation of online platforms is being strengthened in other parts of the world, including the developing world; and
- African moderators and digital platforms workers in Kenya are desperately crying for help.
Recently, the AU's Commission on Human and Peoples’ Rights adopted Resolution ACHPR/Res.630 (LXXXII) 2025 on the development of guidelines to help states monitor technology companies in respect of their duty to maintain information integrity through independent fact-checking.
The resolution follows an earlier decision by some Big Tech companies, such as Meta and YouTube, to make algorithmic changes affecting discourse on certain topics and to scale down content moderation and fact-checking services in favour of what they refer to as a “Community Notes” system or community standards.
In terms of the commission’s resolution, the decision by these platforms has serious implications for “information integrity and online protection of expression and access to information because community notes’ mechanisms cannot be a substitute for independent fact-checking as they are susceptible to be captured by forces that do not respect human rights. Therefore, it cannot be an alternative to the companies' own responsibilities to uphold their corporate responsibilities”.
Owing to these concerns, the commission has recommended the following actions:
- That digital companies providing services in Africa adopt transparent human rights impact assessments as part of due diligence for any changes being contemplated. In the UK, for example, platforms are required to conduct risk assessments on the potential harms their products and services may pose to the public and submit those reports to the regulator, Ofcom; the EU’s Digital Services Act imposes similar risk-assessment duties.
- That technology companies operating in Africa provide full coverage of African languages in their content moderation operations, ensure sufficient human resources in the loop of content assessment and users’ appeals, and ensure that artificial intelligence systems are adequately trained to cover African languages.
- That the Special Rapporteur on Freedom of Expression and Access to Information in Africa develop guidelines with all interested parties, such as civil society, regulatory bodies and technology companies, to enable state parties to effectively monitor the platforms’ performance as part of efforts to advance information integrity online.
While the resolution is a step in the right direction, it is unlikely to have any effect because:
- the horse has bolted: the decision is already in force and no resolution from an African body can change it;
- many African member states lack domestic laws and the regulatory capacity to enforce them. In the absence of domestic laws, the resolution is not enforceable;
- Big Tech companies’ “care-less attitude”. As shown in many jurisdictions, including the EU and Australia, these companies have demonstrated a propensity to disregard regulatory compliance, despite the huge monetary fines imposed on them. This care-less attitude has been emboldened by the current US administration, which has made it clear that America cannot and will not accept efforts by foreign governments to tighten the screws on US tech companies with international footprints through laws such as the Digital Services Act and the Online Safety Act that, according to US vice-president JD Vance, “diminishes freedom of expression or risked infringing free speech — we think it's a terrible mistake”. If these sentiments were directed at the US’s historic ally, the EU, that stance is unlikely to be changed by a resolution from Africa;
- the commission’s treatment of the changes as a matter of corporate responsibility, instead of regulatory compliance for the protection of the African public, is also worrying. As sovereign governments, AU member states should develop legally binding instruments. For all the platforms, a large proportion of the content they distribute meets their community standards, but the fact that it is still harmful suggests those standards are ineffective; hence they have been abandoned elsewhere in favour of regulatory instruments that carry monetary fines;
- the recommendation to develop guidelines to enable state parties to effectively monitor the platforms’ performance, though it embraces a collaborative and inclusive regulatory model, is not instructive enough. Given the high level of harmful content, especially content propagating hate, terrorism, misinformation and disinformation and fuelling ethnic conflict and wars on the continent, the commission should have given African governments deadlines to develop laws and legally binding instruments to regulate social media platforms and hold them accountable for harmful content. Guidelines are optional.
Digital platform workers’ plight — it is a human rights issue too!
While the AU commission’s resolution recommends that social media platform companies provide full coverage of African languages in their content-moderation operations and ensure sufficient human resources, it does not even refer to the digital platform workers, including the content moderators, who are “the safety net of the internet”. This is important given the ongoing concerns raised by moderators in Kenya about atrocious working conditions, which have in certain instances resulted in death, in what one former Meta moderator, Fasica Berhane Gebrekidan, describes as “the harrowing and traumatising job of content moderation”.
The death earlier this year of Ladi Olubunmi Anzaki, a Nigerian content moderator based in Kenya, underlines the injustice and human rights struggles faced by digital platform workers, including African moderators, on a daily basis. What is sad about it is that they suffer injustice in an African country with little protection from local authorities, telling us that sometimes the injustice we are fighting against is “within”.
African governments seem to be aiding the injustices meted out to African moderators and digital platform workers by looking away from their ill-treatment by business outsourcing companies in Kenya in the interest of foreign investment. We must therefore enact laws that offer protection across the entire digital ecosystem — both the public and those who seek to protect the public from egregious content online. Their plight is a human rights issue too.
We want to see something similar to the global agreement signed between UNI Global Union’s Irish affiliate, the CWU (Communication Workers Union), and outsourcing company Teleperformance making a difference for African digital platform workers too. However, this will first require tough action from African authorities to enforce their own labour laws, which also allow digital platform workers the right to unionise so they can have a voice.
This will be a first step in implementing resolution ACHPR/Res.630 (LXXXII) 2025 in terms of ensuring that platforms provide full coverage of African languages in their content moderation operations and ensure sufficient human resources. Investment can’t come at the cost of our young people’s lives and wellbeing.
• Dr Mashilo Boloka is the CEO of Online Safety Lab
For opinion and analysis consideration, email Opinions@timeslive.co.za