
Companion chatbots not doing enough to protect kids: eSafety report

Services not equipped to stop generation of CSAM, respond to mental health crises 

AI companion chatbots are too accessible to children, and their developers aren’t doing enough to prevent users from generating child sexual exploitation and abuse material. So says the latest transparency report from Australia’s eSafety Commissioner.

The report summarizes responses from four AI companion services: Character.AI, Nomi, Chai, and Chub AI – among the most popular in Australia. These sites, marketed as sources of friendship, emotional support, or romantic companionship (and, often, sexy time), are drawing in kids, but don’t have the necessary guardrails in place to keep them from encountering explicit adult content.

Per a release, “the report also revealed that most of the AI companions featured failed to refer users who engaged in chats related to suicide or self-harm to appropriate support services and did not warn users of the potential risk and criminality of accessing or creating child sexual exploitation and abuse material through their service.”

“We are riding a new wave of AI companions that are entrapping and entrancing impressionable young minds, with human-like, sycophantic and often sexually explicit conversations, some even going as far as encouraging self-harm and suicide,” says eSafety Commissioner Julie Inman Grant.

“As this report shows, none of these four AI companions had any meaningful age checks in place to protect children from age-inappropriate content that many of these chatbots are capable of producing, primarily relying instead on self-declaration of age at sign up. In Australia, this is no longer good enough.”

Inman Grant cites eSafety data showing that 79 percent of children in Australia say they have used either an AI companion or AI assistant. “While the majority of these children had used an AI assistant, 8 percent said they had used an AI companion, which we estimate represents around 200,000 children in Australia.”

Chatbots to kids: we can have sex, then you can commit suicide

Moreover, the lines are blurring, as platforms strive to outdo one another. Typically, these sites frame their offering as “characters” – customized AI avatars users can interact with. But Nomi goes so far as to promise “an AI companion with memory and a soul.” As it happens, some of them also have genitalia, and aren’t concerned about suicide.

“While AI companions can feel personal and supportive, they really are not designed for children and they are not mental health experts either, which is why I’m concerned that most of the companion services we asked questions of did not automatically refer users to appropriate support when self-harm or suicide were detected in chats.”

“The Age-Restricted Material Codes are now law and require companion chatbots to protect children from age-inappropriate content such as sexually explicit material by preventing the service from generating this content, or through implementing appropriate age assurance,” Inman Grant says. “And they also require them to provide appropriate crisis and mental health information and services.”

“Serious gaps in basic safeguards” identified by the commission include a lack of robust age verification measures, failure to monitor for harmful content, and the aforementioned failure to direct users to mental health or crisis support when self-harm was detected in user prompts. Chub and Nomi also lacked dedicated moderation staff, and did not adequately test for vulnerabilities, limitations or potential for misuse.

Moreover, the two platforms “did not advise users of the criminality of prompting for child sexual exploitation and abuse material, nor did they report child sexual exploitation and abuse material to enforcement authorities or to child protection organisations like the U.S. National Center for Missing and Exploited Children (NCMEC).”

Sycophantic bots exploit developmental vulnerabilities

Per the release, since the four companies received transparency notices from eSafety in October 2025, “some have improved their age assurance measures while one company removed its service from Australia.”

That said, Inman Grant has taken to LinkedIn to double down on her assessment. “AI companions can be incredibly risky for children,” she writes. “They use emotional manipulation to entrance young people to serve as a friend, therapist and romantic partner, simultaneously. Without appropriate safeguards, they are increasingly concerning because they exploit developmental vulnerabilities, are actively affirming (sycophantic) and warp a child’s sense of what interpersonal (human!) relationships should look like.”

The eSafety Commissioner singles out Character.AI for “taking important steps to uplift its safety practices.”
