AI Fluency: Course Design
Course design has always required careful planning, clear objectives, and thoughtful alignment between learning goals, activities, and assessments. With the rise of generative AI, instructors now have additional tools that can support this work. AI can help with brainstorming ideas, drafting materials, and organizing content, but these capabilities do not replace the expertise, judgment, and creativity that instructors bring to the design process.
Designing a course in the age of AI means thinking intentionally about when these tools can be helpful and when the work must remain fully human-led. It also means understanding how AI is changing professional and disciplinary landscapes and considering what skills students will need in a world where AI is widely used. When instructors make intentional choices about AI’s role in course design, they ensure that technology supports learning rather than shaping it on its own.
Grounding Course Design in Clear Learning Objectives
Strong course design begins with strong learning objectives. Learning objectives describe what students should know, be able to do, or create by the end of a course or module. They guide the selection of content, shape assessments, and help students understand the purpose behind their work. Starting with clear objectives ensures that every part of the course supports meaningful and measurable learning.
Bloom’s Taxonomy is a helpful tool for writing and refining objectives. It organizes cognitive skills from foundational tasks such as remembering and understanding to more complex tasks such as analyzing, evaluating, and creating. Using verbs that reflect these levels allows instructors to communicate expectations more clearly and helps students understand the type of thinking required.
In the AI 101 workshop, instructors also explored the AI-Revised Bloom’s Taxonomy, which considers how AI tools can perform or support the tasks at each level. It encourages instructors to identify which skills students should practice independently and which tasks may be supported by AI. This distinction helps preserve essential human thinking while still acknowledging the realities of modern tools.
AI can support this process by helping instructors draft or revise learning objectives, identify appropriate Bloom’s level verbs, or check for clarity and measurability. When used intentionally, AI becomes a helpful brainstorming partner rather than the author of the objectives. The instructor remains responsible for ensuring that each objective aligns with disciplinary expectations, course goals, and the level of student performance expected in the class.

Defining AI Boundaries in Your Discipline
Designing a course in the age of AI requires understanding how your discipline is already shaped by emerging technologies. Every field has tasks that are well supported by AI and tasks that still rely on human judgment, creativity, or ethical reasoning. When instructors explore these boundaries, they gain clarity on what students should learn to do independently and what skills they may safely practice with AI support.
The AI 101 workshop encouraged instructors to think first about the real world rather than the classroom. This begins with examining how professionals in your field use AI today. For example, marketers may use AI to generate drafts but still refine content themselves. Analysts may rely on AI for basic data summaries but make final decisions using their own interpretation and expertise. Understanding these practices helps instructors recognize which human skills remain essential for learners.

Departmental goals also play an important role. Many programs outline the knowledge, skills, and values they expect students to develop as they progress through a degree. These goals often reflect the competencies that students will need in the workforce, including areas where human decision-making cannot be replaced by AI. Aligning course-level objectives with these broader expectations ensures that students are prepared for both the challenges and the opportunities they will encounter after graduation.
Defining AI boundaries is not about restricting students unnecessarily. It is about clarifying the purpose of a learning experience and communicating which parts of an activity require human thinking, analysis, or creativity. When instructors consider disciplinary practices and departmental goals together, they can design courses that prepare students to work effectively with AI while still strengthening the core skills that make their discipline unique.
Designing for Transparency and Responsible Use
As AI becomes more common in higher education, students benefit from clear guidance on when and how these tools can be used in a course. Transparency helps students understand the purpose behind assignments, builds trust, and reduces confusion about academic integrity. When instructors are open about their expectations, students can make better decisions about how to use AI responsibly.
Communicating expectations does not mean simply labeling assignments as “AI allowed” or “AI not allowed.” Instead, it involves explaining the reasoning behind those decisions. When students know why an activity should be completed without AI, such as practicing a foundational skill or developing original ideas, they are more likely to value the learning process. Likewise, when AI is permitted or encouraged, students should understand how to use the tool in a way that supports thinking rather than replacing it.
Transparency also includes discussing potential concerns with students, such as privacy, accuracy, or overreliance. These conversations help students reflect on their own habits and assumptions about AI and encourage a more mindful approach to using technology. Some students may be hesitant about AI, while others may rely on it heavily. Giving all students a shared understanding of its role in the course creates a more equitable learning environment.
Using AI as a Design Partner, Not a Designer
AI can be a helpful partner during course design, but it should not replace the expertise and decision-making that instructors bring to the process. Tools like Gemini and ChatGPT can generate ideas, reorganize content, or provide examples, yet they work best when guided by clear expectations and thoughtful review. The instructor remains responsible for ensuring that all materials support learning goals, reflect disciplinary standards, and align with the values of the course.
Treating AI as a partner means using it to explore possibilities rather than to make final decisions. Instructors might prompt AI to suggest activity formats, brainstorm module themes, or outline potential sequences for a unit. These ideas can spark creativity, but they should always be evaluated for accuracy, appropriateness, and relevance. Reviewing AI output helps prevent issues such as misinformation, bias, or misalignment with course objectives.
AI can also support alignment by helping instructors connect activities, assessments, and objectives. For example, an instructor might ask AI to identify which Bloom’s level a task best matches, or to generate assessment ideas that correspond to a specific learning outcome. These suggestions can save time and offer new angles, but they should be considered starting points that the instructor refines.
Working with AI in this way helps preserve the human elements of teaching, such as empathy, judgment, and disciplinary insight. When instructors stay in control of the design process, AI becomes a tool that enhances creativity and organization rather than a system that directs the course.
Try it: Course Design Practice
Use the scenario below to practice applying the course design strategies introduced in this article. This activity mirrors the hands-on work from the Course Design workshop and can be used with any AI tool.
Scenario
You are designing a new module for a 300-level course called Environmental Policy and Decision Making. One of your course-level learning objectives is:
“Students will be able to evaluate policy options using environmental, social, and economic criteria.”
You want to create module-level objectives that support this skill, but you are not yet sure how to break it down or how AI might assist in drafting clear and measurable objectives.
Your Task
Use AI to break this course-level objective into clear module-level objectives and begin drafting a plan for how the module might be structured. Your goal is to create scaffolded objectives that align with the course goal while also identifying which skills students should practice with or without AI support.
Steps to Try
Step 1. Provide the AI with context.
Share your course-level objective, the course level, the module topic, and any relevant disciplinary expectations.
Step 2. Ask the AI to propose three module-level objectives.
Try a prompt such as:
“Propose three module-level learning objectives that support the course objective above. Include measurable verbs and identify the Bloom’s level for each one.”
Step 3. Ask the AI to identify which objectives require human thinking and which could use AI support.
Use a prompt such as:
“For each objective, explain which parts students should complete independently and which parts might be appropriate for AI-supported work.”
Step 4. Ask the AI to suggest two or three learning activities that align with the objectives.
Request ideas that incorporate skills such as analysis, comparison, or synthesis.
Step 5. Review and revise.
Compare the AI’s suggestions with your understanding of the discipline. Keep the ideas that support your goals, adjust any mismatches, and revise language to reflect your instructional voice.
Optional Variations
- Ask AI to generate module objectives at different complexity levels and choose the ones that best fit your course.
- Provide the AI with a short reading or case study and ask it to propose objectives tied to that content.
- Try asking AI to map your objectives to assessment ideas so you can check alignment.
Reflection Questions
- Which suggested module-level objectives aligned most closely with the learning focus you had in mind?
- Did AI reveal any gaps or assumptions in your original thinking about the module?
- How did reviewing the objectives through Bloom’s levels support your planning?
- Which skills do you feel should remain clearly human-driven in this module, and why?
- What changes did you make to ensure the final objectives reflected your disciplinary voice and teaching values?
The AI Fluency Article Series: Your Next Read
AI Fluency: Designing Assessments and Summarizing Student Work
This article focuses on designing meaningful assessments in an AI-rich learning environment. Drawing on the Fraud Triangle and workshop strategies, it helps instructors reduce opportunities for misuse, create authentic assessments, and use AI to brainstorm questions, refine prompts, and develop rubrics. The article emphasizes instructor oversight, academic integrity, and assessment designs that highlight student thinking.
Other Articles in This Series
- AI Fluency: 101 for Instructors
- AI Fluency: Prompting Basics
- AI Fluency: Course Design – Current Read
- AI Fluency: Designing Assessments and Summarizing Student Work
- AI Fluency: Developing Instructional Content and Learning Activities
Workshop Information
AI Fluency Series
If there are no available workshops, please feel free to request an instructional consultation about this topic.