Age assurance debate arrives in Bangladesh

The dominos continue to fall in the game of global online safety legislation targeting social media platforms. Bangladesh is weighing what restrictions on social media might look like in a local context, as the government plans the launch of a digital wallet system for all citizens linked to a unique digital ID, which has implications for age verification.
The debate has hit the pages of the Daily Star, which this week published an essay arguing that “online child safety needs age assurance, not age policing.” It illustrates the complexity of regulating massive platforms in different national contexts, where local culture and infrastructure factor into the conversation.
Taking away social media means giving kids alternatives
For Bangladeshis, the regulatory issue comes with the same concerns that worry privacy advocates everywhere. The Star quotes Meem Arafat Manab, a researcher on tech policy, who believes “enforcing a ban will require monitoring, which could result in excessive monitoring of online activity and raise concerns about data privacy that may prove to be unpopular with the public.”
He also raises an increasingly pressing concern, arguing that “the government does not have the necessary leverage over social media apps to force them to use IDs to verify age, as the social media companies aren’t local.” The jurisdictional breadth of online safety laws has been a sore point for U.S. companies who don’t think they should be subject to foreign laws. But it is the question of leverage that matters most, as whole nations and continents reckon with the question of how to make the world’s most powerful companies bend to their legislative will.
Other concerns accumulate: easy workarounds in the form of VPNs, restricted access to educational platforms, a lack of alternatives for youth in Bangladesh’s densely packed cities. Manab notes the lack of playgrounds and parks in Dhaka, the capital. “Many children don’t have the habit of reading or watching movies. Children need something to do, and so, they turn to social media. We need to give them more options.” He also calls for more support, in the form of a strong social support system that can address mental health crises in youth.
Ultimately, the paper argues that “a direct transplantation of a foreign social media restriction model into Bangladesh would likely face structural and social barriers.” It advocates for stronger digital literacy to “help children understand online risks, privacy concerns, cyberbullying and responsible engagement.” That puts the onus for keeping kids safe on educators, while offering no direct intervention – and fails to account for the ways in which social platforms purposefully design their products to manipulate kids.
Blocking URLs ineffective in wider digital context
Digital education alone is insufficient protection against predatory social platforms. But one does not simply shut down Facebook, either. In another Daily Star opinion piece, authors Khan Khalid Adnan and Azfar Adib suggest that “Bangladesh’s digital child protection policy still rests on a dangerously comforting illusion: that harmful online content can be managed by blocking websites.”
It’s a politically convenient approach, the authors say, “because it allows the state to appear decisive without confronting the actual architecture of digital harm.”
Sadly, “it is also a technically weak approach.”
In 2019, Bangladesh blocked 1,279 adult content websites in a would-be war on porn. But, the authors say, “a blocked URL does not protect anyone. Children do not experience the internet only through a list of prohibited websites. It happens through phones, feeds, games, livestreams, messaging apps, search results, advertising systems, influencer content, and increasingly, AI interfaces. A policy designed for static websites is badly mismatched with a digital environment built around algorithmic exposure.”
A reactionary approach, they argue, has no long-term impact. “The state blocks after panic, prosecutes after harm, and announces crackdowns after public outrage. What it does not do is require platforms, app stores, payment systems, gaming environments, and AI services to design age-appropriate access into their systems before harm occurs.”
“Bangladesh is not behind merely because it lacks a specific age assurance law. It is behind because its regulatory instinct remains reactive, moralistic, and enforcement-heavy. Criminal law can punish an offender, but it cannot by itself stop a 12-year-old from entering an adult content site, joining an unsafe stranger chat, being nudged into gambling, or receiving self-harm content through recommendation systems.”
Bangladesh needs a coherent age assurance framework
Bangladesh’s online safety law, the Cyber Security Ordinance 2025, includes provisions to address online gambling and the sexual harassment of women and children online, and recognizes internet access as a civic right. But, say Adnan and Adib, it does not have a coherent age assurance framework.
What’s needed is a serious framework that “defines platform duties, minimum age thresholds, verification standards, independent audits, appeal mechanisms, data minimisation rules, and penalties for negligent design.”
In effect, Bangladesh needs age assurance, “not crude age policing.” Digital trust is thin in Bangladesh, which was under military rule for 15 years and has seen its government slide toward autocracy since the mid-1990s. “Bangladesh’s history of digital regulation gives citizens every reason to fear that a child safety policy could become another surveillance instrument.”
That said, the authors note the availability of highly effective, privacy-preserving age assurance tools that provide only a yes or no answer to age queries. They believe Bangladesh should follow the lead of other nations like Australia and the UK with a law that would “place legal duties on high-risk services, require privacy-preserving age checks for adult content and gambling, demand stronger protections in social media and gaming environments, and prohibit platforms from using children’s data to optimize addictive engagement.”
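The “yes or no” design the authors describe can be sketched in a few lines: an identity provider checks the full record and issues a signed claim containing only a boolean, so the relying platform never sees a birthdate, name, or ID number. Everything here is an illustrative assumption, not any specific product or the authors’ proposal; the function names (`issue_age_token`, `verify_age_token`) and the shared-key HMAC are stand-ins for the public-key signatures a real scheme would use.

```python
import hmac
import hashlib
import json

# Hypothetical shared secret between the ID provider and the relying platform.
# A real deployment would use public-key signatures; HMAC keeps the sketch short.
PROVIDER_KEY = b"demo-key-not-for-production"

def issue_age_token(birth_year: int, current_year: int = 2025) -> dict:
    """ID provider side: sees the full record, emits only a signed yes/no claim."""
    claim = {"over_18": current_year - birth_year >= 18}  # simplified age math
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}  # carries no name, ID number, or birthdate

def verify_age_token(token: dict) -> bool:
    """Platform side: learns only whether the holder is over 18."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(PROVIDER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"]) and token["claim"]["over_18"]
```

The data-minimisation point is in the return value of `issue_age_token`: the platform can verify the signature and read the boolean, but there is nothing else to retain, leak, or repurpose for surveillance.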
But first, Bangladesh has to give up on blocking websites. It “can either remain trapped in a censorship-based model that is easy to announce and easy to bypass, or it can build a rights-respecting age assurance regime that protects children without turning every citizen into a monitored subject,” the authors say. “The first option is familiar, ineffective, and politically lazy. The second is difficult, technical, and institutionally demanding – but necessary.”
Ordinance should impose duties of care on platforms
A paper by Mohammad Yamin Hoque of the Bangladesh Army International University of Science and Technology interrogates the Cyber Security Ordinance 2025, which criminalizes online child sexual abuse material (CSAM), revenge porn, and sextortion, with harsher penalties for offenses against minors. The law, the author says, “reveals gaps in platform accountability, victim support, and prevention.”
One critical weakness is that the Ordinance “imposes no duties on social media platforms, messaging services, or content hosts. International best practices increasingly recognize that intermediaries, not just end-users, must take responsibility for child safety.”
The paper includes several recommendations. Bangladesh could impose duties of care on platforms operating in the country, establish mandatory transparency reporting, mandate age assurance for child-accessed services, establish a Child Online Safety Division within the National Cyber Security Agency, and impose “meaningful financial penalties for non-compliance.” Ofcom in the UK is cited as a useful regulatory model.
“The Cyber Security Ordinance 2025 provides a foundation,” it says. “Building upon it a comprehensive ecosystem for child online safety will determine whether Bangladesh’s digital future is one of opportunity or exploitation for its youngest citizens.”