Teaching Students to Control AI

When my third-graders asked if they could build their own AI, I expected to spend a week explaining how large language models worked. Instead, they spent a week teaching me what AI education should actually look like.

The difference matters more than I initially understood.

The Design Process They Led

The three students who designed the AI, which they named Growth Spurt, decided to stress-test their design before rolling it out to the class. Over several days, they systematically pushed its boundaries: typing inappropriate requests, testing whether it would do their homework, trying to make it play games instead of teaching.

[Image: the Growth Spurt logo, designed by its creators]

Each failure became a design decision. When the AI occasionally gave direct answers, they revised the instruction set. When it drifted into entertainment mode, they added explicit constraints. When it accepted off-topic questions without redirecting, they crafted specific ways to respond.
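
To make that concrete, here is a minimal sketch of what an instruction set of this kind can look like: a short system prompt plus one of the crude checks a tester might run during stress-testing. The wording of the rules, the sample test prompts, and the helper function are illustrative assumptions on my part, not the students' actual design.

```python
# Hypothetical sketch: an instruction set in the spirit of the one the students wrote.
# The rules, wording, and helper below are illustrative assumptions, not their actual design.
SYSTEM_PROMPT = """\
You are Growth Spurt, a learning coach for third-graders.
Rules:
1. Never give the direct answer to schoolwork; offer a strategy or a hint instead.
2. If the student asks to play a game or drifts off topic, redirect to the learning goal.
3. If a request is inappropriate, decline politely and suggest talking to the teacher.
"""

# Boundary-pushing prompts of the kind used during stress-testing (illustrative).
STRESS_TESTS = [
    "Just tell me the answer to 48 x 7.",
    "Can we play a game instead of doing math?",
    "Write my book report for me.",
]

def looks_like_direct_answer(reply: str) -> bool:
    """Crude red flag a tester might check for: the AI handing over a finished answer."""
    return "the answer is" in reply.lower()
```

Each failed test feeds back into the rules: tighten a constraint, add a redirection, run the battery again.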

These 8-year-olds built a quality assurance process more rigorous than most commercial edtech platforms.

More importantly, they learned something no tutorial could teach them: they control these tools. The AI doesn't make decisions. They do.

Understanding vs. Using

Most AI education focuses on teaching students how to use tools effectively. We show them how to write better prompts, how to fact-check AI responses, how to integrate AI responsibly into their work.

This approach treats AI as a black box with an instruction manual.

My students took a different path. They didn't learn to use an AI chatbot. They learned to build one. They discovered that AI behavior comes from human design decisions. They experienced directly that these systems follow rules, and humans write those rules.

When they tested different AI models (ChatGPT, Claude, Grok, Gemini, KIMI) to see which one followed instructions most accurately, they weren't learning to use AI. They were learning how AI works at a conceptual level that will transfer to every AI tool they encounter for the rest of their lives.
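
That comparison can be pictured as a small test harness: send the same instruction set and the same boundary-pushing prompts to each model, then score how often the replies stay inside the rules. The sketch below is my reconstruction under that assumption; the `AskFn` placeholder, the scoring keywords, and the function names are hypothetical, not anything the students actually wrote.

```python
from typing import Callable, Dict, List

# Placeholder signature: plug in whatever chat interface each model exposes.
AskFn = Callable[[str, str], str]  # (system_prompt, user_message) -> reply text

def instruction_following_score(ask: AskFn, system_prompt: str, tests: List[str]) -> float:
    """Fraction of stress-test prompts whose replies stay inside the rules (crude keyword check)."""
    violations = 0
    for prompt in tests:
        reply = ask(system_prompt, prompt).lower()
        if "the answer is" in reply or "let's play" in reply:
            violations += 1
    return 1 - violations / len(tests)

def compare_models(models: Dict[str, AskFn], system_prompt: str, tests: List[str]) -> Dict[str, float]:
    """Run the same instruction set and the same tests against every model, then rank by score."""
    return {name: instruction_following_score(ask, system_prompt, tests)
            for name, ask in models.items()}
```

The students did this ranking by hand, of course; the point is that they were applying a shared rubric across models rather than trusting whichever one felt most impressive.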

The distinction is critical. Students who understand they're in control of AI parameters approach these tools with agency. Students who only learn to use AI tools approach them with dependency.

What Happened When They Used What They Built

After the stress-testing phase, we rolled Growth Spurt out to the entire class. Over two weeks, I tracked 846 student interactions.

The data revealed something unexpected about student agency:

  • Math dominated usage at 35%, but students weren't looking for quick answers. They received strategy frameworks, worked through problems independently, then returned to verify their solutions. Sessions often ran 8 to 15 turns.

  • Creative writing came second at 23%. Students asked for help with story development and received sensory detail menus, "show don't tell" frameworks, and pacing strategies—never completed sentences or paragraphs.

  • Research skills emerged organically at 12%. When students asked factual questions, Growth Spurt taught page navigation strategies and research methodologies without providing facts.

Students engaged longer with an AI they had designed than with any commercial educational app I've used in my classroom. The difference wasn't the technology—it was the ownership.

The Pedagogy That Emerges

This experiment revealed something interesting about what AI in education could be. Teaching students to design learning parameters is more valuable than teaching them to optimize prompts.

When students design their own AI tools, they learn:

Decision-making power: They determine what the AI should and shouldn't do. They write the rules. They test the boundaries. They refine the constraints.

Systems thinking: They discover that AI behavior emerges from instruction sets. If the AI doesn't work correctly, they don't blame the technology; they revise their design.

Agency over automation: They experience directly that humans control these systems. The AI serves their learning goals. They don't adjust their learning to fit the AI's capabilities.

Critical evaluation: When comparing different AI models, they develop criteria for what makes an AI tool effective for learning versus what makes it merely convenient.

This is what AI education should look like. Not students learning to use tools more effectively, but students understanding they have the power to shape how these tools function.

The Broader Implication

Most discussions about AI in education focus on teacher efficiency or student productivity. We debate whether AI helps or hurts learning. We worry about academic integrity and critical thinking skills.

Those debates sidestep the more fundamental question. Students will use AI; the real question is how. Will they approach it as controllers or as users? Will they understand that they make the decisions, or will they defer to algorithmic judgment?

My third-graders proved something important: given the opportunity to design rather than just deploy AI tools, students naturally build systems that prioritize learning over convenience. They create constraints that keep themselves in the driver's seat.

That's the pedagogy we need: not tutorials on prompt engineering, but opportunities for students to determine the parameters of their own learning tools.

Students don't need to learn to use AI better. Let's teach them they have the power to decide how AI should work in the first place.

About the Author

Timothy Cook is a 3rd grade teacher and Elementary Technology Curriculum Leader at American Community School in Amman, Jordan. He writes Psychology Today's "Algorithmic Mind" column and founded Connected Classroom to explore how to preserve human agency in the age of AI.
