Practical AI for Higher Ed: Designing, Assessing, and Innovating

SOURCE 

Listen on Spotify

Continue the Discussion

Watch on YouTube

PURPOSE

Problem statement: Most assessments currently used in higher education courses were not designed with agentic AI in mind, and AI tools can now complete many of them with minimal student effort.

Why it matters: Faculty who rely on AI detectors are exposing their institutions to false positives, student appeals, and litigation. Faculty who ignore AI are watching a growing gap open between what students submit and what students actually learn.

Today's deliverable: You will leave with one revised assessment that can withstand AI use, not because it blocks AI but because it requires thinking that AI cannot replicate on the student's behalf.

PROCESS

Key vocabulary (plain language):

Agentic AI — AI that can take actions on your behalf, including navigating websites and submitting assignments without your direct involvement

AI detector — A tool that claims to identify AI-generated text; currently unreliable with high false-positive rates

Teach-back session — An assessment in which the student teaches content to peers or the instructor, demonstrating actual understanding

Oral exam — A live or recorded conversation in which students explain their reasoning, not just their answers

Scenario-based learning — A learning strategy that places students inside a realistic, complex situation and asks them to navigate it

Custom GPT / Gem — A user-built AI tool trained on specific data and instructions to serve a defined purpose

Reflection flip — Shifting a written reflection to a video or audio format to increase authenticity and reduce AI substitutability


Five-step redesign process:

  1.  Identify one current assessment that AI could complete with minimal student involvement

  2.  Name the learning outcome that assessment was meant to measure

  3.  Choose one of the eight AI-resilient strategies: teach-back, oral exam, video reflection, group AI critique, community-based project, simulation, presentation, or scenario-based work

  4.  Draft the revised assessment prompt with specific constraints: format, timing, audience, and evidence of process

  5.  Add one checkpoint that captures student thinking before the final submission, such as a planning note, outline, or recorded decision log


Worked example:

Original: "Write a 500-word reflection on what you learned in this module."

Revised: "Record a 3-minute video in which you explain one concept from this module using an example from your own workplace or community. In the first 30 seconds, state the concept. In the next 90 seconds, apply it. In the final 60 seconds, name one thing you would do differently based on what you now know."

Why it works: 

  1.  The personal example requires lived experience that AI cannot fabricate credibly.

  2.  The time structure creates a thinking constraint, not just a format constraint. 

  3.  The reflection pivot at the end asks for genuine judgment.


PRODUCE

Drafting task: Revise one assessment you currently use or plan to use in an upcoming course. Apply the five-step process above. Your revised prompt should be specific enough that a colleague unfamiliar with your course could understand what students are being asked to do and why.


Choice options:

  • Option A: Write your revised assessment prompt as a text document. Include the original prompt, the revised prompt, and two sentences explaining your reasoning.

  • Option B (accessibility option): Record a 5-minute audio or video walkthrough of your redesign. Explain your original assessment, your revised version, and what changed in your thinking. Transcribe it with any auto-caption tool before submitting.

Constraints: 15 minutes for the draft. Your revised prompt should be under 200 words. Aim for clarity over completeness; you can refine after peer feedback.


Three-level support ladder:

  • Level 1 (getting started): Open the webinar's list of eight assessment types and pick the one that sounds least like what you currently do. That friction is a signal worth following.

  • Level 2 (stuck on framing): Ask yourself: "If a student used AI to complete this, what would be missing from their submission?" The answer tells you what the real learning target is. Build your revised prompt around that gap.

  • Level 3 (stuck on specifics): Take your original assessment prompt and paste it into a Custom GPT or Gemini Gem with this instruction: "What would a student miss learning if AI completed this for them? Suggest three revisions that require personal evidence of understanding." Use the output as a starting point, not a final product.

PACKAGE

Submit your revised assessment and one insight to the LinkedIn discussion.

Share three things in your post:

  1. Main takeaway for your specific job: What did you learn that matters for your actual teaching context? Be specific.

  2. Where you got that insight from: Which disciplinary perspective surfaced this? What did you ask the AI that revealed it?

  3. What you're going to do with that information: One concrete action you'll take in the next two weeks because of this exploration.

React with ❤️ on posts that give you ideas for your own practice.

Created by Ronald Lethcoe
