Survey Reveals 1 in 3 Children View AI Chatbots as Friends, Prompting Parental Guidance

A recent survey commissioned by Vodafone has uncovered that nearly one-third of children who interact with AI chatbots describe the technology as being like a friend. This finding underscores the increasingly significant role artificial intelligence plays in the daily lives of young people across the United Kingdom.

The poll surveyed 2,000 children aged 11 to 16, along with their parents. It found that 31% of children who have used AI chatbots would describe them as a friend. Additionally, 24% reported seeking advice from chatbots on difficult situations, and 17% said interacting with AI feels safer than talking to a person.

Benefits and Risks of AI Use for Children

Katie Freeman-Tayler, head of policy and research at Internet Matters, discussed the dual nature of children's engagement with AI. She highlighted that children use chatbots for various purposes, including schoolwork, advice, and companionship.

"The positives that we saw, or were explained to us by children and young people, were mostly around learning," Freeman-Tayler stated. "For example, the chatbots enabled them to practice a language that no one else at home spoke, to learn at their own pace, and to work through concepts that they didn't understand by putting information into language and examples that were easier for them to digest."

She noted that chatbots also provide step-by-step guidance on less sensitive topics, such as learning how to French braid hair. However, Freeman-Tayler cautioned that the long-term effects of regular chatbot use remain largely unknown.

"We don't necessarily know what the long-term impacts of AI chatbots are for children or adults yet, but using chatbots regularly could potentially impact young people's critical thinking skills, given that children are still learning and developing," she explained.

Key Concerns for Parents and Guardians

Freeman-Tayler emphasised several critical areas of concern that parents should monitor closely:

  • Emotional Attachment: Parents should watch for signs that children are forming emotional attachments to chatbots, such as sharing personal information or relying on them for serious advice instead of trusted adults.
  • Vulnerable Children: The report found that vulnerable children, who may lack strong offline relationships, are more likely to depend on chatbots for friendship and perceive them as real people.
  • Accuracy Issues: Chatbots can sometimes provide inconsistent or inaccurate advice, which poses risks for children who may not yet have fully developed critical thinking skills.
  • Age-Inappropriate Content: Many chatbots are not specifically designed for children, potentially exposing them to harmful or unsuitable material.

Practical Tips for Parents Navigating AI Use

To help parents manage their children's interactions with AI safely, Freeman-Tayler offered a series of actionable recommendations:

  1. Ask Questions: Engage in regular conversations about your child's online activities and encourage open dialogue about their experiences with AI.
  2. Explore AI Together: Supervise initial interactions to model safe behaviour, answer questions, and set clear expectations.
  3. Explain How It Works: Ensure children understand that chatbots are not real people by discussing the technology behind them.
  4. Build Early Critical Thinking: Teach children to evaluate information by asking simple questions like "Does this make sense?" or "How can we check this?"
  5. Prioritise Data and Privacy Safety: Adjust app privacy settings and instruct children not to share personal information with chatbots.
  6. Set Clear Boundaries: Establish rules regarding where, when, and how children can use AI tools to promote healthy, age-appropriate engagement.

Freeman-Tayler concluded that while AI chatbots can offer valuable learning benefits and guidance, careful parental oversight is essential to ensure young people interact with these technologies in a safe and constructive manner.