Help One Hundred Schools

Deep Fakes & AI Safety: What Every School Leader Must Know
In this episode of the Help 100 Schools Podcast, Karl Boehm interviews Evan Harris, president of Pathos Consulting Group, to discuss how deep fakes and AI abuse are creating new safety challenges for schools—and what leaders must do now to prepare. From fake crisis videos to AI-generated sextortion, these threats are no longer hypothetical.
Evan brings a combination of experience as a former teacher and administrator, national advisor on AI risks, and researcher with Stanford’s Human-Centered AI Institute. Having co-authored the NAIS legal guide on deep fake sexual abuse, he shares real-world cases, prevention strategies, and actionable steps schools can take to protect students and strengthen digital safety.
What’s Covered:
1. The Reality of Deep Fake Risks
- Deep fakes aren’t just a tech buzzword—they’re fueling new forms of bullying, sextortion, and reputational damage.
- Real-world cases: a Baltimore principal framed with cloned audio, and fake videos of gunshots or fires disrupting schools.
- Why schools can’t wait until it happens to them.
2. Three Essential Buckets for School Safety
- Policy: Write tech-neutral rules that cover both harmful media and threats to create it.
- Crisis Readiness: Don’t just keep a binder—practice scenarios as a team.
- Prevention: Train staff, build student awareness, and partner with parents early.
3. The Human Factor
- Why leaders, when confronted with this issue, instantly shift from administrator to parent.
- How victim notification can either lessen or multiply the damage.
- The importance of trauma-informed counseling, agency, and dignity for students.
4. Big Schools vs. Small Schools
- Larger schools may have more resources, but small schools feel disruption more acutely.
- The biggest vulnerability isn’t your IT system—it’s your people.
5. Five Steps You Can Take Today
1. Update your handbook policy with broad language and clear examples.
2. Use inoculation theory—show your community a safe fake example so they know what to look for.
3. Engage parents first so they’re prepared when kids come home with questions.
4. Run a crisis comms tabletop with your leadership and MarCom teams.
5. Plan victim notification protocols with compassion and care.
Evan’s Top Takeaways for Schools:
- Prevention is possible—but you must start before a crisis.
- This isn’t a niche problem. Research shows up to 1 in 5 high schoolers know of a classmate targeted with deep fake abuse.
- Parent partnerships are non-negotiable. They’re critical for prevention and communication.
- Skills matter more than binders. Crisis readiness comes from practice, not paperwork.
- Every child deserves safety. Protecting students’ digital dignity is core to your mission.
Connect with Evan:
- Email: evan@pathosgroup.ai
- Connect on LinkedIn: Evan Harris AI
- Explore: Pathos Consulting Group
Have questions about AI safety or deep fakes in schools?
Tag us on social media and let us know what you’re seeing in your community.
And don’t forget to subscribe to Help 100 Schools for more insights on leadership, safety, and the future of education.