Somewhere on your campus, in a sterile, beige conference room, the “Presidential Blue-Ribbon AI Task Force” is meeting for the third time this month.
Around the table sit the CIO, a lawyer from the General Counsel’s office, the head of university communications, and a well-meaning Dean who’s been tasked with herding the cats. They’re looking at charts. They’re talking about server capacity, data security, enterprise licenses, and risk mitigation. They’re drafting another university-wide policy—a document destined to be equal parts threatening and useless.

Notice who isn’t in that room.
There’s no art history professor who just discovered a student using Midjourney to create stunning, historically-informed pastiches. There’s no nursing instructor figuring out how to use AI simulators to train clinical reasoning. There’s no philosophy Ph.D. wrestling with how to teach Kant’s Categorical Imperative when students can ask an AI to write a perfect essay on it in 30 seconds.
In short, the people actually on the front lines of the AI revolution—the faculty—are conspicuously absent, or reduced to a single token representative. And that is why your university’s AI strategy is almost certainly backwards.
The dominant approach to AI in higher education has been a top-down, centralized, command-and-control model. It’s a strategy dictated by the administration rather than fostered and empowered by it. It’s a strategy focused on plumbing and policy. And it is a strategy that is doomed to fail.
The Two Failed Models of AI Strategy
Universities have defaulted to two modes of thinking when it comes to AI, both of them fundamentally flawed because they misunderstand the nature of the disruption.
Model 1: The IT-Led Strategy (The Plumber’s Fallacy)

When a paradigm-shifting technology arrives, who do you call? The tech people, of course. This seems logical, but it’s a critical error. Asking your IT department to lead your academic AI strategy is like asking the plumber to design your restaurant’s menu.
The plumber is essential. They make sure the water runs and the pipes don’t leak. You cannot run a restaurant without them. But their expertise is in infrastructure, not culinary arts.
Similarly, your CIO and their team are experts in security, procurement, bandwidth, and software deployment. Their job is to ask: “Is it secure? Can we support it? How many licenses do we need? What’s the risk to the institution?” These are vital, necessary questions. But they are not the questions that drive academic innovation.
An IT-led strategy results in a focus on tools over teaching. It leads to decisions like banning tools that pose a security risk or signing a massive enterprise deal for a single, “approved” AI platform. The conversation becomes about controlling the what, not enabling the how. It’s risk management, not educational development.
Model 2: The Administrator-Led Strategy (The Policy Prison)

If IT isn’t leading, then it’s likely that the Provost’s office (or an extension of it via the Center for Teaching and Learning or the Digital Learning Unit) is. This model is driven by administrators whose primary concerns are academic integrity, institutional reputation, and creating a uniform policy to apply to everyone. This is the “policy prison” model.
It’s an attempt to legislate our way through a revolution.
This approach produces the endless stream of memos and emails we’ve all received at some point in our careers. They are documents written by lawyers and deans, for lawyers and deans. They are obsessed with defining cheating and creating punitive measures for misuse. While academic integrity is non-negotiable, a strategy that begins and ends with plagiarism detection is like trying to invent the automobile by building a better horsewhip. It completely misses the point.
This model treats faculty not as strategic actors, but as policy enforcers. It hands them a rulebook and expects them to police their classrooms, stifling the very experimentation and nuance that this moment demands. It’s a model built on fear, not opportunity.
Sidebar: There’s a phantom version of Model 2: the Administrator-Ambivalence Strategy. Since no one in administration really knows much about AI, they stay silent on it, not so much because they want to, but because they don’t know how to lead through this massive, disruptive change, at least not substantively or with a strategy that makes sense.
The Only Model That Works: A Faculty-Led Strategy

The great irony is that while the task forces are meeting, the real AI strategy is already happening. It’s happening in thousands of individual decisions made by faculty members every single day.
- A political science professor redesigns her final exam from a take-home essay to an in-class oral defense, forcing students to synthesize and defend their ideas in real time.
- A chemistry instructor experiments with an AI tool that helps students visualize complex molecular structures, leading to a demonstrable increase in comprehension.
- An English professor allows the use of ChatGPT for brainstorming and outlining but requires students to submit a detailed reflection on how they used the tool and how it shaped their thinking.
These aren’t just small tactical choices. These are strategic acts of pedagogical innovation.
The institutions that win the next decade will be the ones that recognize this reality and invert the pyramid. They will understand that a successful AI strategy doesn’t flow from the top down. It grows from the bottom up. It must be faculty-led for three simple reasons.
1. Proximity: Faculty Are at the Point of Impact
No one is closer to the action. An administrator sees AI as an abstraction on a risk assessment matrix. A faculty member sees it in the eyes of a student who is either struggling to understand its output or using it to achieve a new level of insight.
Faculty are the ones fielding the questions, seeing the results, and witnessing the “a-ha!” moments. They are the only ones who can provide the ground-level feedback loop necessary to develop an intelligent strategy. Any approach that ignores this granular, real-world context is just theoretical navel-gazing.
2. Context: The AI Revolution is Not Monolithic
A top-down policy assumes AI means the same thing in every discipline. This is a profound failure of imagination.
- For a nursing student, AI might be a simulation tool for practicing diagnostic conversations with patients.
- For a law student, it’s a powerful engine for legal research and brief analysis.
- For a computer science student, it’s the very material they are learning to build and critique.
- For a poetry student, it’s a bizarre and fascinating collaborator for generating experimental verse.
To create a single, uniform AI policy for all these contexts is absurd. It’s like giving the same paintbrush to a watercolorist, a house painter, and a car detailer. The tool is the same, but the craft, the context, and the standards of excellence are wildly different. Only faculty, as disciplinary experts, can provide that essential context.
3. Pedagogy: The Innovation Isn’t the Tech, It’s the Teaching
This is the most critical point. The AI disruption in education is not fundamentally about technology. It’s about pedagogy. The exciting part isn’t what the AI can do, but how it forces us to re-architect learning itself.
When students have a tool that can instantly produce a competent essay, the act of assigning an essay as a terminal assessment becomes meaningless. This forces us to ask a better question: What was that essay assignment really trying to measure? Critical thinking? Research skills? Synthesis? Argumentation?
Now, how can we design a new assessment that measures those skills more directly, perhaps by incorporating the AI as a tool, a subject of critique, or a sparring partner? This is a pedagogical question, not a technological one. And faculty are the only ones qualified to answer it.
The Missing Middle: Frameworks for Faculty Leadership

Arguing for a faculty-led strategy isn’t an argument for chaos. It’s not about letting a thousand flowers bloom with no coherence. That would be just as ineffective as a rigid top-down mandate.
This is where the Navigate AI ecosystem of frameworks comes in.
Faculty don’t need another memo. They need a set of scaffolds—tools that provide structure without stifling creativity. Frameworks create the “missing middle”: a common language and set of guiding questions that empower faculty to lead from within their disciplines, while giving administrators confidence that the innovation is intentional, ethical, and strategic.
- TEACH gives faculty a compass for integrating AI into their teaching, from tools and pedagogy to curriculum and human-centered practice.
- META helps redesign assignments for the AI age, shifting focus from final artifacts to the learning process itself.
- FAFI (Faculty AI Fluency Index) provides a benchmark to measure and grow faculty capacity, moving from novice awareness to confident leadership.
- SCALE offers leaders a model for building institution-wide AI strategy that is cohesive, faculty-led, and sustainable.
Each framework tackles a different layer of the challenge. Together, they form a coherent ecosystem—a roadmap for institutions that want to empower faculty to lead while still maintaining shared standards and direction.
When faculty lead these conversations with the support of clear frameworks, the institution doesn’t just keep pace with AI. It gets smarter, faster, and more resilient.
Let’s Stop Waiting and Start Leading
The future of higher education will not be defined by the institution with the most restrictive AI policy or the biggest enterprise software license. It will be defined by the institution that successfully mobilizes the creative, intellectual, and pedagogical power of its faculty.
Treating faculty as barriers to change is a fatal, unforced error. They are your single greatest asset in navigating this transition. They are not the people who need to be managed; they are the leaders you have been waiting for.
It’s time to stop drafting memos and start building capacity. It’s time to invert the pyramid.
Want to Equip Your Faculty to Lead?
A faculty-led AI strategy requires tools and training designed for educators, not just IT managers or risk officers. That’s what Navigate AI was built for.
Explore the Frameworks: Start with TEACH for classroom integration, META for assignment redesign, FAFI for measuring faculty fluency, and SCALE for leadership and strategy. Each provides practical tools, guides, and diagnostics you can use immediately.
Download a Toolkit: Get the TEACH Starter Toolkit or the FAFI Guide to benchmark where your faculty are today and chart your next steps.
Bring It to Your Campus: Book a Navigate AI Bootcamp or Workshop (virtual or in-person) to turn your faculty into confident AI leaders in just one day.
The future won’t be decided by the institution with the strictest policy. It will be decided by the one that unleashes the power of its faculty.
Let’s stop drafting memos. Let’s start building capacity.