Somewhere on your campus, in a sterile, beige conference room, the “Presidential Blue-Ribbon AI Task Force” is meeting for the third time this month.
Around the table sit the CIO, a lawyer from the General Counsel’s office, the head of university communications, and a well-meaning Dean who’s been tasked with herding the cats. They’re looking at charts. They’re talking about server capacity, data security, enterprise licenses, and risk mitigation. They’re drafting another university-wide policy—a document destined to be equal parts threatening and useless.
Notice who isn’t in that room.
There’s no art history professor who just discovered a student using Midjourney to create stunning, historically-informed pastiches. There’s no nursing instructor figuring out how to use AI simulators to train clinical reasoning. There’s no philosophy Ph.D. wrestling with how to teach Kant’s Categorical Imperative when students can ask an AI to write a perfect essay on it in 30 seconds.
In short, the people actually on the front lines of the AI revolution—the faculty—are conspicuously absent. And that is why your university’s AI strategy is almost certainly backwards.
The dominant approach to AI in higher education has been a top-down, centralized, command-and-control model. It’s a strategy led by IT and Administration. It’s a strategy focused on plumbing and policy. And it is a strategy that is doomed to fail.
The Two Failed Models of AI Strategy
Universities have defaulted to two modes of thinking when it comes to AI, both of them fundamentally flawed because they misunderstand the nature of the disruption.
Model 1: The IT-Led Strategy (The Plumber’s Fallacy)
When a paradigm-shifting technology arrives, who do you call? The tech people, of course. This seems logical, but it’s a critical error. Asking your IT department to lead your academic AI strategy is like asking the plumber to design your restaurant’s menu.
The plumber is essential. They make sure the water runs and the pipes don’t leak. You cannot run a restaurant without them. But their expertise is in infrastructure, not culinary arts.
Similarly, your CIO and their team are experts in security, procurement, bandwidth, and software deployment. Their job is to ask: “Is it secure? Can we support it? How many licenses do we need? What’s the risk to the institution?” These are vital, necessary questions. But they are not the questions that drive academic innovation.
An IT-led strategy results in a focus on tools over teaching. It leads to decisions like banning tools that pose a security risk or signing a massive enterprise deal for a single, “approved” AI platform. The conversation becomes about controlling the what, not enabling the how. It’s risk management, not educational development.
Model 2: The Administrator-Led Strategy (The Policy Prison)
If IT isn’t leading, the Provost’s office is. This model is driven by administrators whose primary concerns are academic integrity, institutional reputation, and creating a uniform policy to apply to everyone. This is the “policy prison” model.
It’s an attempt to legislate our way through a revolution.
This approach produces the endless stream of memos we’ve all received. They are documents written by lawyers and deans, for lawyers and deans. They are obsessed with defining cheating and creating punitive measures for misuse. While academic integrity is non-negotiable, a strategy that begins and ends with plagiarism detection is like trying to invent the automobile by building a better horsewhip. It completely misses the point.
This model treats faculty not as strategic actors, but as policy enforcers. It hands them a rulebook and expects them to police their classrooms, stifling the very experimentation and nuance that this moment demands. It’s a model built on fear, not opportunity.
The Only Model That Works: A Faculty-Led Strategy
The great irony is that while the task forces are meeting, the real AI strategy is already happening. It’s happening in thousands of individual decisions made by faculty members every single day.
- A political science professor redesigns her final exam from a take-home essay to an in-class oral defense, forcing students to synthesize and defend their ideas in real time.
- A chemistry instructor experiments with an AI tool that helps students visualize complex molecular structures, leading to a demonstrable increase in comprehension.
- An English professor allows the use of ChatGPT for brainstorming and outlining but requires students to submit a detailed reflection on how they used the tool and how it shaped their thinking.
These aren’t just small tactical choices. These are strategic acts of pedagogical innovation.
The institutions that win the next decade will be the ones that recognize this reality and invert the pyramid. They will understand that a successful AI strategy doesn’t flow from the top down. It grows from the bottom up. It must be faculty-led for three simple reasons.
1. Proximity: Faculty Are at the Point of Impact
No one is closer to the action. An administrator sees AI as an abstraction on a risk assessment matrix. A faculty member sees it in the eyes of a student who is either struggling to understand its output or using it to achieve a new level of insight.
Faculty are the ones fielding the questions, seeing the results, and witnessing the “a-ha!” moments. They are the only ones who can provide the ground-level feedback loop necessary to develop an intelligent strategy. Any approach that ignores this granular, real-world context is just theoretical navel-gazing.
2. Context: The AI Revolution is Not Monolithic
A top-down policy assumes AI means the same thing in every discipline. This is a profound failure of imagination.
- For a nursing student, AI might be a simulation tool for practicing diagnostic conversations with patients.
- For a law student, it’s a powerful engine for legal research and brief analysis.
- For a computer science student, it’s the very material they are learning to build and critique.
- For a poetry student, it’s a bizarre and fascinating collaborator for generating experimental verse.
To create a single, uniform AI policy for all these contexts is absurd. It’s like giving the same paintbrush to a watercolorist, a house painter, and a car detailer. The tool is the same, but the craft, the context, and the standards of excellence are wildly different. Only faculty, as disciplinary experts, can provide that essential context.
3. Pedagogy: The Innovation Isn’t the Tech, It’s the Teaching
This is the most critical point. The AI disruption in education is not fundamentally about technology. It’s about pedagogy. The exciting part isn’t what the AI can do, but how it forces us to re-architect learning itself.
When students have a tool that can instantly produce a competent essay, the act of assigning an essay as a terminal assessment becomes meaningless. This forces us to ask a better question: What was that essay assignment really trying to measure? Critical thinking? Research skills? Synthesis? Argumentation?
Now, how can we design a new assessment that measures those skills more directly, perhaps by incorporating the AI as a tool, a subject of critique, or a sparring partner? This is a pedagogical question, not a technological one. And faculty are the only ones qualified to answer it.
The Missing Middle: Frameworks for Faculty Leadership
Arguing for a faculty-led strategy isn’t an argument for chaos. It’s not about letting “a thousand flowers bloom” with no coherence. That would be just as ineffective as a rigid top-down mandate.
This is where frameworks like TEACH come in.
A framework provides the “missing middle”—a structure that enables guided autonomy. It replaces top-down rules with a shared set of questions and principles, empowering faculty to lead from within their own disciplines.
The TEACH Framework provides the structure for those conversations:
- Tools & Tech Fluency: What does “competence” with these tools look like for a future psychologist, engineer, or artist?
- Ethical & Legal Awareness: How do we move beyond plagiarism to discuss data bias, intellectual property, and the ethics of AI-generated content in our specific field?
- AI-Integrated Pedagogy: How do we, as a department, redesign our capstone projects, our introductory courses, and our core assessments for this new reality?
- Curriculum Innovation: What does this mean for the learning outcomes of our entire degree program? Are we teaching skills that are being automated, and how do we pivot to what’s durable?
- Human-Centered Practice: How do we identify the core human-to-human value we provide—mentorship, inspiration, ethical guidance—and double down on it?
When faculty lead these conversations, guided by a robust framework, the entire institution gets smarter. The strategy becomes resilient, context-aware, and deeply embedded in the core mission of teaching and learning.
Let’s Stop Waiting and Start Leading
The future of higher education will not be defined by the institution with the most restrictive AI policy or the biggest enterprise software license. It will be defined by the institution that successfully mobilizes the creative, intellectual, and pedagogical power of its faculty.
Treating faculty as barriers to change is a fatal, unforced error. They are your single greatest asset in navigating this transition. They are not the people who need to be managed; they are the leaders you have been waiting for.
It’s time to stop drafting memos and start building capacity. It’s time to invert the pyramid.
Want to Equip Your Faculty to Lead?
A faculty-led AI strategy requires tools and training built for educators, not IT managers.
Discover the SCALE AI Leadership Toolkit. Move your institution beyond ad hoc efforts and toward a comprehensive, cohesive institutional strategy in a short period of time. Visit SCALE AI Leadership Toolkit to get started.
Get the full TEACH Core Toolkit. Access our complete suite of editable templates, implementation guides, departmental workshop plans, and presentation resources to accelerate your faculty-led strategy. Visit TEACH Framework to learn more.
Book a live TEACH Bootcamp or SCALE Workshop. Bring our team to your campus (virtually or in-person) for an immersive, hands-on training day designed to turn your faculty into confident AI leaders. Visit the Navigate AI About page and fill out the contact / email form to get the conversation started.
Let’s give the real leaders the tools they need to do the job.