When AI Can Build the Course in 15 Minutes, What’s Left for Faculty?

Recently, UGA Online shared a powerful and innovative example of “AI in the middle” of teaching. They shared how an AI Course Starter they built can take a faculty member’s syllabus, rubrics, and course documents and auto-build a course shell in the LMS in about 10–15 minutes.

Here’s what their system does:

  • Creates a course description and learning outcomes
  • Builds weekly modules with descriptions and module-level outcomes
  • Suggests three assignments and three discussions for each module
  • Proposes an overarching project for the course
  • Uses recursive agents to connect later modules back to earlier ones for scaffolding
  • Runs quality checks for style (first-year vs graduate) and for content

Crucially, it doesn’t create the content or videos. It handles the structure and LMS plumbing so faculty can stay in their disciplinary lane. They can spend their time refining, editing, and making pedagogical decisions instead of wrestling with Brightspace LMS.

I love that direction. It’s innovative. It’s helpful. It’s exactly the kind of infrastructure that can reduce friction for faculty and support more coherent learning experiences for students.

But what is impressive and helpful in the moment may also be a signal worth examining more closely. I’d suggest it raises a deeper question that higher ed can’t afford to ignore.


Today’s Win vs. Tomorrow’s Risk

From an institutional perspective, leadership will see one very clear benefit today:

“We can save faculty a lot of time and frustration.”

True (check). Good (check). Much needed (check).

However, in a world of declining enrollments and tightening budgets, those same leaders who are impressed today may see something very different 5–10+ years from now:

“If this infrastructure can do more of the design work, maybe we don’t need as many people doing it…or teaching it.”

If an AI agent can build a coherent course shell, align outcomes, and wire in active learning ideas in minutes…and when it can grade for faculty, respond to students, and even serve as the content (or Socratic teaching) expert, it forces us to confront a hard question:

What, exactly, are we saying the essential work of faculty is?

If we don’t answer that clearly—and loudly—we risk drifting toward a future where more and more of the design, structure, and even interaction around courses is automated… and the human parts of teaching are treated as “nice-to-have” instead of non-negotiable.


Tools That Work For Faculty vs. Fluency In Faculty

This is why I don’t think the conversation can stop at “Look how much time this saves!”

Yes, we need tools that:

  • Reduce LMS friction
  • Handle repetitive design tasks
  • Provide solid starting points for outcomes, modules, and assignments

But we also need a parallel investment in faculty AI fluency as part of the future of work in higher ed.

That means:

  • Helping faculty understand what these systems can and can’t do
  • Equipping them to co-design with AI—prompting, critiquing, and iterating—rather than passively inheriting whatever the agent spits out
  • Making the human side of teaching visible: coaching, feedback, relationships, contextual judgment, and ethical discernment

If AI handles more of the “plumbing,” then the core argument for faculty has to be anchored in the things AI can’t do well:
knowing specific students, reading the room, connecting messy real-world context to disciplinary knowledge, and shaping judgment over time.

That only works if faculty themselves are fluent enough with AI to:

  • See its blind spots
  • Push back when automations don’t fit their discipline
  • Redesign learning experiences because these tools exist, not just slot content into AI-generated structures

The Real Future-of-Work Question for Higher Ed

So yes: I’m enthusiastic about what UGA Online has built. The AI Course Starter is a brilliant idea, and so are similar builds like it. They’re a glimpse of what’s possible when AI is used as infrastructure rather than as a gimmick.

But alongside every new AI tool that does work for faculty, we need an equally serious effort to prepare faculty to do new kinds of work with AI.

Otherwise, we’re just speeding up yesterday’s model…AND quietly eroding the case for the people inside it (which several recent articles have already been forward-thinking about).

So, the strategic question for institutions isn’t just:

“How can we use AI to make course design more efficient?”

It’s also:

“How can we develop a faculty workforce that is AI-fluent, irreplaceably human, and clearly essential to student learning in an AI-shaped future?”

If we can answer that second question well, tools like UGA’s Course Starter won’t be a threat to faculty roles. They’ll be part of the evidence that human teaching is too important not to augment.
