Empty Echoes: When AI "Experts" Won't Climb Their Own Wall
Expertise Without Demonstration
Marcos has been teaching rock climbing at Rockwall Academy for five years. His students admire his eloquent explanations of climbing theory, his detailed analysis of various techniques, and his thoughtful reflections on the psychology of overcoming fear. But there’s a major red flag with Marcos. In five years, none of his students has ever seen him climb the wall himself. When asked to demonstrate, he offers another brilliant lecture about proper technique or suggests videos of professional climbers instead. But he never climbs.
“Our rock climbing instructor, who can eloquently describe hand positions without ever showing them in practice, distances himself from the critique necessary to test the viability of his expertise.”
This scenario seems absurd in athletics, yet it perfectly captures the current landscape of AI discourse in education. Professional development sessions, social media posts, and educational publications offer elaborate theories and frameworks (often obviously AI generated, with no disclosure) without demonstrating any concrete follow-up once a framework is implemented. They discuss what educators "should consider" and what approaches "might be effective" without risking specific, public, critiquable solutions of their own.
The Rise of Hollow AI Discourse in Education
The ease of access to AI writing tools has created a curious irony. I’m not claiming that this is a negative, but much of what we read about AI in education is itself AI-generated, or might as well be, given its lack of substantive content. This hollow discourse follows distinctive patterns that, once recognized, become impossible to unsee.
The most revealing indicator is a linguistic technique used by strong public speakers. We can call it the "solution mirage." Remember that AI was trained on expert writing and human speech patterns, so the use of this technique by generative AI is perfectly reasonable. The solution mirage is a technique you will spot almost instantly after I point it out:
Extensive paragraphs about what educators "should consider" or "might implement" without a single concrete or verifiable example.
Here’s an example: “The emphasis on successful AI integration in school curricula is ensuring teacher readiness, age-appropriate curriculum design, inclusivity, and student safety...”
These articles or social media posts excel at describing problems with impressive vocabulary but offer nothing beyond vague recommendations to "leverage AI thoughtfully" or "emphasize teacher readiness." Like our rock climbing instructor who can eloquently describe hand positions without ever showing them in practice, they keep their distance from specificity, and that distance prevents the critique necessary to test the viability of an idea.
Another telltale sign is the "balanced perspective trap." AI-generated content typically presents perfectly counterweighted viewpoints.
Here’s another example: “AI in education offers powerful opportunities to enhance learning and support teachers, but its use requires careful consideration of ethical, pedagogical, and equity implications to ensure it serves all students effectively and responsibly.”
By never taking a substantive position that might invite criticism, this artificial balance reveals a fundamental aversion to risk. The irony, of course, is that this very quality, risk-taking, is what these articles often claim students and teachers should embrace.
Stylistic patterns also emerge: the em dash, appearing with notable frequency, alongside multi-clause sentences and hedging language ("potentially," "arguably," "might consider"). These are signals of algorithmic writing patterns. The content reads impressively but commits to nothing that could be proven wrong.
The rock climbing metaphor points to an unspoken social contract in education: do as you say. If we expect students to take intellectual risks, or in this case to climb a high wall where they might fall, then those guiding them should demonstrate the same courage. Asking students to engage with AI tools critically while experts hide behind theoretical frameworks creates a fundamental disconnect that undermines learning.
The Business of Privatized Solutions vs. Shared Knowledge
AI-generated content about educational technology operates at a fraction of the cost of expert-written analysis, creating a market that incentivizes volume over substance. This economic reality reveals another dimension of the problem, one we can all see: private solutions are being sold rather than public ideas being offered for critique. Education continues to function as a marketplace rather than a community of practice. The climbing instructor who never climbs might be hiding not just fear but a business model. Could he be selling a proprietary climbing technique too valuable to demonstrate freely? Join his course to find out!
This privatization of educational solutions undermines the collaborative foundation of teaching itself. Throughout history, educational innovation has relied on public experimentation, critique, and refinement. When today's AI experts respond to complex challenges with "contact me for consulting" rather than "here's what I've tried," they remove themselves from this tradition of shared professional growth.
Why Public Experimentation Matters
Genuine expertise reveals itself through a willingness to climb the wall in front of students, even when those attempts might fail in public view. Meaningful AI discourse in education requires showing, not just telling; attempting, not just theorizing.
Consider how different our rock climbing instructor would appear if he regularly climbed alongside students, narrating his thought process, acknowledging his fears, and occasionally falling. His occasional failures wouldn't diminish his expertise but enhance it through authenticity. The most valuable voices in AI and education today are those willing to share specific prompting techniques, classroom protocols, assessment frameworks, and implementation failures. Put your ideas at risk of critique to benefit the broader community.
This means abandoning the security of vague recommendations and embracing the vulnerability of specific suggestions. It means acknowledging that early attempts with new technologies will be imperfect, but that these imperfections create the foundation for collective improvement. Most importantly, it means rejecting the notion that educational innovation should be privatized rather than publicly shared and refined.
Two Concrete Strategies for the Classroom
1. Comparative Source Analysis
Provide students with a structured comparative matrix that prompts them to evaluate each source on: factual accuracy and completeness, bias and perspective, contextual understanding, depth of reasoning, unique contributions to understanding, and limitations and blind spots.
In a high school literature class studying "The Great Gatsby," students might analyze the novel's portrayal of the American Dream using:
ChatGPT's analysis of symbolism and themes
A scholarly article on class critique in Fitzgerald's work
Historical documents about 1920s economic conditions
Modern critical perspectives on wealth inequality
Students create comparison grids identifying where AI effectively summarizes standard interpretations versus where scholarly analysis reveals subtleties the AI missed or where historical context provides insights absent from both.
2. AI-Human Debate Partnerships
Students formulate an initial position on a topic and use AI to gather supporting evidence, counterarguments, and relevant examples. Instead of accepting this information at face value, students must evaluate each point.
In a high school government class studying constitutional interpretation, students prepare for a debate on privacy rights in the digital age. They collaborate with AI to explore legal precedents and competing interpretations, but must critically evaluate AI-generated content against constitutional principles they've studied. Students are responsible for constructing logical reasoning chains that connect evidence to conclusions. AI might help organize but cannot fully execute.
Be the Instructor Who Climbs
The next time you encounter an article about "leveraging AI to transform education" that leaves you without a single concrete step to take tomorrow, remember the rock climbing instructor who never climbs. The wall needs climbers willing to demonstrate both the attempt and the occasional fall because that's how we all learn to climb higher.