Helping leaders prepare for a future where AI is a part of life.

Your University’s AI Strategy is Almost Certainly Backwards

Somewhere on your campus, in a sterile, beige conference room, the “Presidential Blue-Ribbon AI Task Force” is meeting for the third time this month. Around the table sit the CIO, a lawyer from the General Counsel’s office, the head of university communications, and a well-meaning Dean who’s been tasked with herding the cats. They’re looking at charts. They’re talking about server capacity, data security, enterprise licenses, and risk mitigation. They’re drafting another university-wide policy—a document destined to be equal parts threatening and useless.

Notice who isn’t in that room. There’s no art history professor who just discovered a student using Midjourney to create stunning, historically informed pastiches. There’s no nursing instructor figuring out how to use AI simulators to train clinical reasoning. There’s no philosophy Ph.D. wrestling with how to teach Kant’s Categorical Imperative when students can ask an AI to write a perfect essay on it in 30 seconds.

In short, the people actually on the front lines of the AI revolution—the faculty—are conspicuously absent. And that is why your university’s AI strategy is almost certainly backwards. The dominant approach to AI in higher education has been a top-down, centralized, command-and-control model. It’s a strategy led by IT and Administration. It’s a strategy focused on plumbing and policy. And it is a strategy that is doomed to fail.

The Two Failed Models of AI Strategy

Universities have defaulted to two modes of thinking when it comes to AI, both of them fundamentally flawed because they misunderstand the nature of the

Read More »

AI Literacy is First Aid. Your University Needs Surgeons.

Imagine your entire faculty has just completed a mandatory First Aid certification. They can all define “aneurysm,” apply a tourniquet, and perform CPR on a dummy. They are, in a word, literate in emergency medical care. Now, ask one of them to perform open-heart surgery. The absurdity of that request captures the exact situation facing higher education today.

We are in the midst of “Peak AI Literacy.” The landscape is saturated with awareness-level initiatives. These efforts are not useless. Like First Aid, they are essential for establishing a baseline of knowledge and preventing immediate harm. They have successfully moved the institutional conversation from “What is AI?” to “AI is here.” We have achieved awareness. But awareness is not a strategy. And literacy is not fluency. Acknowledging a challenge is not the same as being equipped to solve it. We are certifying a campus full of first-aiders while the moment demands a generation of surgeons.

The Glossary-and-a-Prayer Approach to Faculty Development

Let’s be honest about what most “AI Literacy” training entails. It’s a glossary of terms (LLM, generative, prompt), a list of popular tools, and a well-meaning but vague discussion about ethics and cheating. It’s a “Glossary-and-a-Prayer” approach. We give faculty a few new words and pray they can figure out the rest. This is insufficient because it fails to address the three deep, structural gaps that exist between knowing about AI and knowing what to do with it.

1. The Pedagogical Gap: From “What is it?” to “How do I

Read More »

Why TEACH Is the AI Pedagogy Framework Higher Ed Faculty Actually Need

Another email from the Provost’s office. Subject: “Updated Guidance on Generative AI.” It’s the fourth one this semester. It’s three pages long, filled with toothless platitudes about academic integrity and vague suggestions to “innovate.” It’s everything and nothing. It’s a document written by a committee to protect an institution, not to empower a single person standing in front of a classroom.

This is the state of AI in higher ed. While universities are busy forming task forces and issuing memos, faculty are on the front lines of a pedagogical revolution with no map, no compass, and certainly no useful air support. We’re left to figure out the difference between a student using AI to cheat on a paper and a student using it as a brilliant Socratic partner. The former is a headache. The latter is the future, and most of us were never trained to tell the difference.

Let’s be honest: the firehose of AI resources online is useless. It’s built for venture capitalists, developers, or futurists making grand pronouncements. It’s not for the history professor trying to design a final paper that can’t be faked in 90 seconds, or the biology instructor wondering how to teach lab reports when an AI can write a flawless one. We don’t need another TED Talk. We need a framework. A way to think. That’s why we built TEACH.

Ditch the Panic Memos. Adopt a Pedagogy.

TEACH is not another top-down mandate. It’s a faculty-first framework for making strategic, defensible decisions about AI

Read More »