
AI for kids: we don't teach them to use ChatGPT

What can an 11-year-old actually learn about AI? Not how to prompt a bot — but how the bot thinks under the hood. A first-hand look at our Black Belt AI program.

Context: AI is everywhere, but kids are learning it the wrong way

Over the last two years, more and more parents have come to us with the same question: "I want my child to learn AI." We get the worry — the world is changing fast. But what a parent is picturing often isn't what the child actually needs.

Most "AI for kids" content online boils down to: how to use ChatGPT to write a school essay. That's "using a tool", not understanding AI. It's like saying you've learned maths because you can use a calculator.

We teach something quite different. In the Black Belt AI program (ages 11–13), kids learn how AI thinks under the hood — how a model is "trained", how it makes mistakes, what data is, how to evaluate whether it's working.

What an 11-year-old can grasp about AI

Concept 1: AI = telling things apart

The closest analogy: AI is a program that learns to tell things apart. If we want an AI to recognise a cat in a photo, we show it thousands of pictures labelled "cat" and thousands without. After enough examples, it starts telling them apart on its own.

An 11-year-old gets this immediately. We draw two squares on the board. "What makes a cat different from a dog?" — discussion. "What makes a happy face different from a sad one?" — discussion. AI does exactly that, just faster and with more examples.
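In class the tools are visual (like Teachable Machine), but the "telling things apart" idea can be sketched in a few lines of Python. Everything here is invented for illustration — the two features and all the numbers are made up, and a real model learns from thousands of photos, not four:

```python
# A toy "telling things apart" sketch: a 1-nearest-neighbour classifier.
# The features (ear pointiness, snout length) are invented for illustration.

# Labelled examples the "AI" has already seen: (ear_pointiness, snout_length)
examples = [
    ((0.9, 0.2), "cat"),
    ((0.8, 0.3), "cat"),
    ((0.3, 0.8), "dog"),
    ((0.4, 0.9), "dog"),
]

def classify(features):
    """Pick the label of the closest known example."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    closest = min(examples, key=lambda ex: distance(ex[0], features))
    return closest[1]

print(classify((0.85, 0.25)))  # near the cat examples -> "cat"
print(classify((0.35, 0.85)))  # near the dog examples -> "dog"
```

That's the whole trick: compare a new thing to labelled examples and pick the closest match. Everything else is scale.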

Concept 2: AI makes mistakes — and makes them predictably

The biggest theme in the first month of the program: AI isn't "smarter than a human". AI is just faster at some things and a lot dumber at others.

When a child trains their own small model (using Teachable Machine or a similar tool) and sees the model confuse "cat fur" with "a fluffy carpet" — they're learning the limits of AI. That's a more valuable lesson than any ChatGPT prompt tip.
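The "cat fur vs. fluffy carpet" mistake is easy to reproduce in the same toy style. This sketch (features and numbers invented for illustration) shows why the error is *predictable*: if the only signal the model learned is fluffiness, anything fluffy looks like a cat:

```python
# Why models fail predictably: this toy model only ever learned one
# feature, "fluffiness". Numbers are invented for illustration.

examples = [
    (0.90, "cat"),      # fluffiness of things labelled "cat"
    (0.85, "cat"),
    (0.10, "not cat"),
    (0.20, "not cat"),
]

def classify(fluffiness):
    """Pick the label of the closest known example."""
    closest = min(examples, key=lambda ex: abs(ex[0] - fluffiness))
    return closest[1]

print(classify(0.88))  # an actual cat -> "cat"
print(classify(0.92))  # a fluffy carpet -> also "cat": a predictable mistake
```

The model isn't being "dumb" at random — it's doing exactly what its training data taught it, which is the lesson.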

Concept 3: Data drives behaviour

If an AI only sees photos of blonde hair, it won't recognise dark hair well. If it only learns from one language, it won't know another. AI ethics starts right there — and kids understand it.

A conversation we have a lot: "Why do you think some AIs don't understand the words you use every day?" Answer: because their data about your world isn't complete. A big lesson.
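The "incomplete data" point can also be sketched. Here is a toy mood guesser that only ever saw four words (the word lists are invented for illustration) — ask it about anything outside its data and it simply has no answer:

```python
# "Data drives behaviour": a toy mood guesser that only knows the
# handful of words it was trained on. Word lists are invented.

seen_words = {
    "great": "happy",
    "fun": "happy",
    "boring": "sad",
    "broken": "sad",
}

def guess_mood(sentence):
    """Vote using only the words the model has data for."""
    votes = [seen_words[w] for w in sentence.lower().split() if w in seen_words]
    if not votes:
        return "no idea"  # its data about your world isn't complete
    return max(set(votes), key=votes.count)

print(guess_mood("school was great fun"))  # -> "happy"
print(guess_mood("lowkey mid tbh"))        # slang it never saw -> "no idea"
```

Swap "words" for "photos of hair colours" and it's the same story: the gaps in the data become gaps in the behaviour.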

What kids actually build

The Black Belt AI program runs for one term. A typical project arc:

  1. Weeks 1–2: A "guess what I drew" game — a model that recognises drawings.
  2. Weeks 3–5: A voice assistant for a parent — say "turn on the light" and the program responds.
  3. Weeks 6–8: A small posture-detection system — a camera spots whether the child is sitting up straight (mum can use this one).
  4. Weeks 9–11: An AI-driven robot — a Lego robot following voice commands.
  5. Week 12: A final project of the child's choosing.

Kids do all of this themselves. The teacher helps but doesn't do the work for them. Final projects are often "mum, look what I made" moments — and very often they aren't school projects but something the child actually keeps using afterwards.
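To give a flavour of the weeks 3–5 project, here is a minimal sketch of the "voice assistant" idea, with typed text standing in for real speech recognition. The commands and responses are invented for illustration — each child wires up their own:

```python
# A minimal "voice assistant" sketch: typed text stands in for speech.
# Commands and responses are invented for illustration.

commands = {
    "turn on the light": "light is ON",
    "turn off the light": "light is OFF",
}

def respond(heard):
    """Match a known command phrase inside whatever was 'heard'."""
    heard = heard.lower().strip()
    for phrase, action in commands.items():
        if phrase in heard:
            return action
    return "sorry, I don't know that command"

print(respond("Please turn on the light"))  # -> "light is ON"
print(respond("make me a sandwich"))        # -> "sorry, I don't know that command"
```

In class the "heard" text comes from a speech-recognition block, but the matching logic the kids build is this simple — which is exactly why they can own it end to end.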

Best example from last year's group

A twelve-year-old built an AI that detects when his baby brother cries. The app texts mum when the brother wakes up. It actually works. Nobody assigned him that project — he came up with it once he realised what he could actually do.

What kids don't learn (and why that's fine)

Let's also be straight about what we don't do.

The goal isn't to turn our twelve-year-olds into AI engineers. The goal is to give them a good intuition for something that will surround them for life — to know what AI can do, what it can't, and how to use it wisely.

What parents should know before signing up

  1. Black Belt AI is for ages 11–13. Younger kids aren't ready — not cognitively, but in terms of attention span.
  2. Black Belt isn't recommended as a first program. Ideally, before AI, the child has the Blue or Red Belt. If they're coming "from outside" — we do an assessment.
  3. There's no homework, but kids often work on their projects at home. Don't push it — let them do it only if they want to.
  4. After Black Belt AI, the natural next step is Python (Black Belt III). But kids can also wrap up the DigiKids path here and choose where to go next.

Bonus: AI ethics for kids

Five questions we ask kids during the program:

  1. Is AI always right? (No.)
  2. Who's responsible when AI gets it wrong — the developer or the user? (Discussion.)
  3. Can you copy someone else's work and call it your own? (No.) Can you hand in work that AI helped with? (Yes, as long as you say so.)
  4. Can AI be biased? (Yes, because of the data.)
  5. What happens if everyone uses AI to write their homework? (Open question, great discussion.)

Honestly, these conversations are valuable beyond the coding itself. A child discussing AI ethics at twelve understands the dilemmas of the world they're walking into.


More about the program: Black Belt — AI and Robotics. The trial class is free, and afterwards we give you our honest opinion on whether the program is right for your child at this stage.

Black Belt AI trial class.

See what a 12-year-old looks like when they're learning to train their own AI model.

Book a trial class