Nigeria Issues 2024 Online Trust and Safety Compliance Report
In 2022, Nigeria issued the Code of Practice for Interactive Computer Service Platforms and Internet Intermediaries (the “Code”). The Code serves as the primary legal framework for online trust and safety in Nigeria. Among other obligations, the Code requires Large Service Platforms[1] to incorporate locally, appoint liaison officers, remove harmful or illegal content in a timely manner, and submit annual compliance reports to Nigeria’s online trust and safety regulator (the “NITDA”).
The 2024 Online Trust & Safety Compliance Report
In July 2025, the NITDA published the Online Trust & Safety Compliance Report for 2024 (the “Report”). The Report evaluates the annual compliance reports submitted by online platforms operating in Nigeria and measures their alignment with the Code.
Now in its second year, the Report covers compliance reports submitted by Google, TikTok, and LinkedIn. It notes that X failed to submit a compliance report for the year 2024 and that Meta failed to submit its compliance report using the prescribed template, thereby limiting the NITDA’s ability to assess Meta’s compliance.
Key Highlights from the 2024 Trust & Safety Report
Account Closures & Deactivations: The Report noted that 28,106,057 accounts were deactivated in Nigeria across the Google, TikTok, and LinkedIn platforms for violations of the Code. The violations recorded include fake accounts, bullying, harassment, hate speech, and child pornography.
Content Removals (With/Without Court Order/Notice): The Report noted that 58,909,112 posts were taken down for violating the provisions of the Code. These violations covered issues such as child endangerment, hate speech, fake news, and more.
Complaint Handling: The Report noted that, across Google, TikTok, and LinkedIn in Nigeria, 420,430 removed posts were re-uploaded following user appeals, and that 754,629 complaints were recorded across the same platforms during the year under review.
Child & User Safety: The Report highlights several initiatives by Google, LinkedIn, and TikTok aimed at safeguarding children and users more broadly. The Report notes that:
(a) Google combines AI-driven systems with human review to detect and address violations, with a particular focus on combating Child Sexual Abuse Material (CSAM). The Report notes some of Google’s tools, which include machine learning classifiers, hash-matching technology, and mandatory reporting to the National Center for Missing & Exploited Children (NCMEC).
(b) LinkedIn enforces a strict minimum age requirement of 16 and operates a three-tier moderation framework (automated prevention, AI- and human-led review, and user reporting) to remove harmful or inappropriate content.
(c) TikTok supports content moderation in over 70 languages and has dedicated over $2 billion globally to trust and safety. TikTok also operates a three-tier moderation framework (automated prevention, AI- and human-led review, and user reporting) to remove harmful or inappropriate content. The Report notes TikTok’s work with NGOs as well as its #SaferTogether Campaign in Nigeria, which educates parents, guardians, and creators on available safety features. TikTok has also introduced a number of new features, including a guided meditation feature that activates after 10 pm to help teens wind down, and a time away tool.
Key Takeaways
Overall, the level of compliance with the Code remains very low. Given Nigeria’s population of over 220 million and more than 120 million internet users, the 1,000,000-user threshold effectively captures virtually all major global platforms.
Nigeria is aligning with global trends (e.g., EU Digital Services Act, India’s IT Rules), positioning itself as part of the international push for platform accountability. This shows Nigeria’s intent to shape, not just follow, global norms.
Compliance can be strategic as the reporting process effectively opens a channel for constructive engagement. Platforms that work transparently with the regulator could help shape future digital policy and gain goodwill.
While the Code is framed as embodying best practices, its legal force extends further because under its provisions, a breach of the Code is deemed a breach of the Nigerian Communications Act 2003, the National Broadcasting Commission Act 2004, and the NITDA Act 2007. This linkage exposes platforms to direct legal liability, elevating the Code from a soft guidance framework to a binding compliance obligation.
[1] Large Service Platforms are defined as Interactive Computer Service Platforms/Internet Intermediaries whose registered users in Nigeria exceed 1,000,000.
Balogun Harold's insights are shared for general informational purposes only and do not constitute legal advice. For tailored guidance, please contact our Technology and Market Entry Lawyers at bhlegalsupport@balogunharold.com

Olu A.
LL.B. (UNILAG), B.L. (Nigeria), LL.M. (UNILAG), LL.M. (Reading, U.K.)
Olu is a Partner at Balogun Harold.

Kunle A.
LL.B. (UNILAG), B.L. (Nigeria), LL.M. (UNILAG), Barrister & Solicitor (Manitoba)
Kunle is a Partner at Balogun Harold.

Ore O.
LL.B. (UNILAG), B.L.
Ore is a Legal Analyst at Balogun Harold.