Imagine: you’re a faculty member and receive an email introducing a new AI teaching assistant (AITA). Instead of curiosity, your first reaction may be fear: “Am I being replaced? Are they implementing something to monitor me? Is this just going to help students cheat?” As instructional coaches, we can be quick to leap into technology training with tools in hand, forgetting that successful adoption starts not with the technology but with trust.
Leading with Trust
Often the earliest adopter at the various institutions I’ve worked for, I recently shocked some industry colleagues when I offered the only dissenting opinion on adopting AI-driven customer relationship management (CRM) software. The primary audiences I support are post-traditional students, students on academic probation, first-generation students, and students from low socioeconomic backgrounds. I often hear frustration from students about our institution’s automated phone system, about offices not using personal email addresses, and about reaching only voicemail after business hours. I knew immediately that if we implemented an AI-driven version of that same automation, the trust we are constantly working to build with students might suffer irreparable damage. As coaches, we often sit at the intersection of these relationships, uniquely positioned to gauge institutional trust and to discern the difference between a novel technology and an appropriate application.
Before introducing AI in higher education classrooms, let’s explore three essential layers of trust: institutional, instructor, and student.
1. Institutional Trust: “Will I Be Replaced?”
Many instructors and staff view institutional tech initiatives with skepticism as organizations face constant pressure to meet growing, more expensive needs with shrinking funding. AI adoption can trigger fears of surveillance, deskilling, or job insecurity, especially when framed as a cost-saving or efficiency measure. When trust in the institution is low, even the best tool can do more harm than good.
Real-Life Coaching Moment: After the sales call for the AI-driven customer relationship management system, a co-worker said, “That system is going to replace us. If students can get help 24/7, what can we provide?”
Coaching Strategy: As one of the main system administrators for our current customer relationship management system, I asked whether they had felt the same way when that system was first implemented, since it let us write and schedule communications on their behalf. They admitted to feeling very skeptical in the beginning. “Once many of those communications were automated, were you finally free to work on something else?” They noted that response rates increased and that they had more time to craft intentional replies, since they were no longer spending hours determining which students needed outreach and writing individual messages in the hope of a response.
Trust Tip: Technologies help reduce our labor so we can spend more time on things that matter, such as building relationships instead of cold calling.
2. Instructor Trust: “Can I Use This Without Losing Control?”
Faculty often carry deep pedagogical values—and sometimes deep imposter syndrome. The fear isn’t always that AI will do it wrong. It’s that it will do it better, faster, or differently. Coaches must assure instructors that AI is a collaborator, not a competitor.
Real-Life Coaching Moment: I worked with a faculty committee that wanted to draft an institutional policy refusing to experiment with AI and asking that it be blocked on all campus networks. Their objection was not that the tool lacked value but that it was sometimes too useful, and students would use it to cheat. Their resistance was not about the tool. It was about control.
Coaching Strategy: Acknowledge the history. I did not begin with “You have to…” Instead, I asked them to join me in identifying at least three other technologies from the past 40 years that caused similar panic. They quickly named Google, Wikipedia, YouTube, cell phones in class, smartwatches, and countless other technologies we now use daily. I asked how we overcame those tensions and how we might apply those lessons to the rise of AI. They recalled having to teach students when it was appropriate to use these technologies and warning them about how the tools could mislead if misused. Soon, they shifted their focus to how AI use could be modeled, since banning something only increases its use.
Trust Tip: Never blame the tool. Blame the one misusing it. A hammer is made to drive nails into wood, but we do not ban hammers because some people wield them as weapons.
3. Student Trust: “Is This What I’m Paying For?”
Students, too, are sensitive to how AI is used. Introduced without context, it can feel like a gatekeeper standing between them and their instructors and staff. And would they pay tuition to be taught by AI? Learning suffers when students don’t trust how or why AI is used.
Reflection Prompt for Coaches: Have any students at your institution raised concerns about AI teaching aids, or expressed fear that their instructors automatically assume they’re using AI to cheat?
Coaching Strategy: Help instructors engage students in dialogue about AI. Offer language like: “I use AI as a conversational tool because I often craft better ideas through dialogue. Once I have my idea, though, I know to stop using AI and craft the content myself.” Transparency is not a loss of authority; it is a bridge to deeper learning and a license for trust.
Student-Centered Prompt: Ask students to talk with an AI about a subject they know very well and rate its accuracy. Turn bias into responsibility.
Conclusion: Start With Trust
As AI continues to shape the human world, we must remember that its success depends less on the sophistication of its design and more on the depth of our relationships. Instructional coaching is not just about helping faculty use tools; it is about helping them trust themselves, their students, and the systems they are part of.
Start with trust, and the tech will follow.