AI, Care and the Curriculum: Where Should We Draw the Line?

March 23rd 2026

I was delighted to sit on a panel at the Childcare Expo to discuss artificial intelligence (AI), a subject that has been gathering momentum across the media, not least through the recent Cambridge University research reported by the BBC and through the wider debate about AI toys for children. I also spoke about this on Radio 5 Live last week (25 minutes in, if you want to listen back), where the conversation quickly moved beyond novelty and into something much more important: what does AI mean for care, for learning, and for the relationships at the heart of early childhood education?

In Early Childhood Education, we need to be clear about the difference between digital technology and AI. Digital technology, such as cameras, tablets or recording tools, supports children’s learning by helping us document, communicate and extend experiences. It is a tool, firmly in the hands of the educator and the child.

AI is different. It does not simply support learning; it begins to interpret, generate and suggest. It can influence what we notice, how we respond and even how we understand children’s development. This represents a fundamental shift in how we think about the curriculum.

Across the world, this shift is prompting both curiosity and concern. Governments are moving cautiously, introducing safeguards that recognise children’s particular vulnerability to AI systems. The direction is clear. The younger the child, the greater the need for protection, not acceleration. At the same time, the technology sector is moving quickly, designing products specifically for young children. This tension sits at the heart of the current debate.

From an organisational perspective, it is easy to see why AI is attractive. In a sector under pressure, it offers practical benefits. It can support note taking, improve communication with families, and help analyse patterns in data to inform early help and intervention. It can reduce time spent on repetitive administrative tasks and allow leaders and educators to focus more on children and families. Used in this way, AI strengthens the system. It works behind the scenes, enabling better practice without replacing it.

The challenge comes when AI begins to move from the office into the curriculum.

This is where examples such as Gabo the Bear, an AI-enabled companion toy, become important. On the surface, such innovations offer comfort, interaction and personalised responses. For busy families and stretched services, this can feel helpful. However, in Early Childhood Education, relationships are not an optional extra. They are the curriculum.

Through relationships, children develop language, emotional regulation, trust and a sense of belonging. These are not simple exchanges of information. They are deeply human processes shaped by empathy, attunement and care. An adult brings memory, context and ethical judgement to each interaction. An AI system responds based on patterns in data. It generates a response, but it does not feel it. That distinction matters.

Introducing AI into this relational space risks subtly redefining care. It may shift expectations about what it means to be listened to or understood. It may reduce opportunities for children to engage in the rich, sometimes complex interactions that build resilience and social understanding. The question is not whether AI companions are engaging, but whether they belong in spaces where relationships are foundational.

This does not mean there is no place for AI in Early Childhood Education. It means we need to think carefully about where that place is. AI can be highly effective when it supports professional practice. It can help educators organise information, reflect on patterns and communicate more effectively. In doing so, it can create the conditions for stronger pedagogy. However, it should not become the pedagogy itself.

There are also important ethical considerations. AI systems are trained on data that may reflect existing inequalities, which means they can reinforce bias if not carefully managed. There are concerns about data privacy, surveillance and the commercialisation of childhood. Environmental costs must also be considered, given the resources required to sustain AI infrastructure.

For the Early Years sector, AI is no longer a future issue. It is already here. The real question is how we respond. Do we adopt it uncritically because it is available, or do we shape its use in line with our values? If we believe that Early Childhood Education is rooted in relationships, care and human connection, then we must be clear about our boundaries. AI can support our work, but it cannot replace the human interactions that sit at the heart of learning.

The curriculum must continue to be built through observation, curiosity and shared experience. AI may help us do our work more effectively, but it should never take the place of the relationships that define it.

The task ahead is not to reject AI, but to position it carefully. To ensure it remains a tool that supports practice, rather than a force that reshapes it in ways we do not fully understand. Because in the end, the question is not whether AI has a place in Early Childhood Education. It is whether we are prepared to ensure that its place remains firmly in service of children, rather than in place of the relationships they need to thrive.