You Trained Your Teachers on AI. Now What?
I’m starting to see a clear continuum in how school leaders feel they are leading AI integration at their schools. On one end of the continuum is wonder, and maybe a little worry. These leaders frequently ask, “What should I be doing?” and “Am I doing enough?” On the other end of the continuum are school leaders who are confident they are leading this changing landscape well. And the source of that confidence is most often a statement like this: “Our teachers have already had AI training.”
Training is a fantastic first step. But it doesn’t address the question that actually matters when it comes to leading with clarity in the age of AI: What are students actually experiencing across your classrooms?
Unfortunately, what I’m hearing and seeing at schools whose entire faculty has experienced some type of AI training is that students are receiving mixed messages. In English class, generative AI is a brainstorming partner and a source of feedback on their writing. In history, they’ve been told it’s borderline cheating. In science, the teacher hasn’t mentioned AI at all.
If this narrative sounds all too familiar and mirrors the student experience at your school, this article is for you.
Again, training is an excellent first step, but schools need to actively pursue alignment. Without alignment, teachers take the same input and create slightly (or drastically) different outputs, leading to different expectations and mixed messages for students.
What Alignment Actually Means
Alignment doesn’t mean every teacher teaches in the same way, but it does mean the school has clarity on what matters most. There is shared language, shared expectations, and a consistent experience for students. This kind of coherence is what research points to when highlighting collective teacher efficacy as a major driver of learning. It’s not about isolated excellence. It’s about your team moving in the same direction.
AI has a way of exposing a lack of alignment quickly because it forces schools to answer questions they can’t afford to answer differently from one classroom to the next.
What is acceptable use?
When does support of learning become a substitute for learning?
What are we actually trying to teach students in an AI world?
If those answers vary from classroom to classroom, students have to figure them out on their own. Simply put, that is not their responsibility. It is the school’s responsibility to establish those answers and then help students apply them…consistently!
Alignment around AI isn’t about choosing one tool or writing a single policy. It’s about being able to say, as a school, “This is how we think about AI. This is how we use it. This is what we expect from teachers and students.” If that can’t be said clearly, alignment isn’t there yet.
From Training to Alignment
That’s the gap we designed the AI Blueprint Summit to address. Not more training, but alignment. It creates the conversations, time, and space for school leaders to move from “our teachers learned about AI” to “our school has clarity.”
Clarity in areas like:
How AI fits into the mission of our school
What still counts as learning when AI is involved
Where and how AI enhances curriculum, instruction, and assessment
What constitutes appropriate and ethical use for teachers and students
What consistent practices students will experience across classrooms
At the end of the day, students don’t experience your training. They experience your system.
And if that system changes from classroom to classroom, students will do what they’ve always done. They’ll adapt, adjust, and figure out how to succeed in each individual room.
But they won’t gain clarity. And in a changing world that is being shaped by AI, clarity is what they need most.
The AI Blueprint Summit
If you want to learn how to align your faculty’s use and understanding of AI, amplify student learning, and build a healthy, mission-aligned AI culture in your school, I encourage you to bring your leadership team to the AI Blueprint Summit this summer.

