Demo View - Illustrative session data. The live version - generated from real transcripts - is available in the teacher dashboard.
Session A - AI Mechanics

Live AI Demo: Prompting Personas

★★★★★
5 / 5 - Landmark session. Set the tone for the entire course.
What Went Well
  • Students drove every prompt themselves. Each student named a persona - Harvard professor, empathetic teacher, substitute who doesn't care - and owned the output. 100% participation without enforcement.
  • The contrast between outputs was the single clearest demonstration of how AI works. Students could see it mirroring their instructions in real time rather than thinking independently.
  • Student 4's "why should we try if there's no accountability?" was a genuine philosophical question - generated spontaneously from the exercise with no setup from the teacher.
  • Student 1 noticed mid-exercise that changing three words in a prompt shifted the entire output. That level of analytical observation is exceptional for this age group.
  • The absurdist persona (substitute who gives everyone 100%) created group energy that carried into the follow-up discussion. The best engagement often comes from students' own humor.
Opportunities
  • Students watched and reacted; they didn't write their own prompts independently. A follow-on session where each student drafts a prompt for the same task would deepen retention significantly.
  • The richest philosophical moment - "who's responsible for bad AI output?" - surfaced at the end and got cut off. Build 10 minutes of protected reflection time into future high-engagement sessions.
  • Students 5 and 6 were primarily observers. Assigning each student a required "prompt direction" before the session ensures every voice shapes the exercise.
Student Snapshot This Session
Student 1
Drove the Harvard persona. Spotted output changes in real time. Strongest analytical engagement of the session.
Student 2
Led the empathetic teacher prompt with real ownership. One of her best sessions - personal voice came through clearly.
Student 3
Substitute teacher idea was the highlight - funny, creative, and sparked more discussion than any planned prompt.
Student 4
Raised the accountability question unprompted. A genuine philosophical insight. Worth returning to explicitly.
Student 5
Present and listening. Minimal output this session - structured role assignment would help next time.
Student 6
Engaged with the outputs but didn't drive a prompt. Check in privately - may need a lower-stakes entry point.
Adapt the Next Lesson
Give Students the Keys

This session proved what happens when students direct the exercise. Next session, give each student an identical task and ask them to write their own prompt for it independently. The output comparison - six different prompts, six different results, same task - teaches more about how AI works than any explanation.

Open the next class by returning to Student 4's accountability question: "Last time someone asked - if AI just does what we tell it, who's responsible when it goes wrong? I've been thinking about that all week." That framing elevates a student's instinct and sets up a richer discussion than cold-starting from theory.

Assign Students 5 and 6 each a specific persona direction before the next session. For example: "Your job next class is to give us a prompt for a first-year teacher who's terrified of technology." Pre-assignment eliminates the pressure of thinking on the spot.

How this was generated: This review is synthesized from the session transcript using pattern analysis across student contributions, question quality, and pedagogical moments. Student identities and voice signatures are anonymized before any analysis is run. This is illustrative demo data.