Case Study #3: Using Brisk Teaching to Support Explicit Lesson Planning in EPSY 3029: Survey of Exceptional Children
Julie Cuccio Slichko, Associate Professor of Educational Psychology, Counseling and Special Education at SUNY Oneonta, introduced Brisk Teaching to help education majors understand Explicit Instruction (EI) lesson planning, a concept students found difficult to grasp. Students were given the option to use this AI tool to generate actual lesson plans in their certification areas (such as ninth-grade poetry). This hands-on approach turned abstract educational concepts into concrete examples that students could analyze against established EI frameworks. The implementation was conducted as an optional extra-credit assignment with seven students, allowing for a focused pilot before potential wider adoption.
Results showed specific strengths in Brisk's capabilities. Students found that, with adequate prompting skills, the AI-generated lesson plans included most required EI components (matching the Archer & Hughes template), produced grade-appropriate content, and incorporated student-choice elements. However, they also identified clear limitations: the AI could not account for students' prior knowledge (a critical EI component), produced excessively detailed plans that required time-consuming teacher editing, and lacked the contextual awareness that comes from knowing specific students. These findings led students to conclude that while Brisk can assist with lesson planning, teacher expertise remains essential for effective implementation.
This case study offers practical insights about AI's role in teacher preparation. Slichko's reflection on having future students first create their own EI lessons before comparing them with AI-generated versions reveals an important pedagogical approach: using AI as a comparison point rather than a model to imitate. As long as the teacher keeps students in the driver's seat, helping them develop expertise in both the subject matter and AI skills rather than dependence on AI assistants, and to learn by judging AI output rather than relying on it, such tools can enhance student agency and engagement instead of undermining them. The pilot nature of this implementation, with just seven volunteer students, demonstrates a measured approach to integrating new technologies. By having future teachers critically evaluate AI-generated lesson plans against theoretical frameworks, both of which they must become capable of judging, the activity developed not just lesson-planning skills but also the critical thinking needed to integrate technology appropriately in their future classrooms. The case shows how AI tools can serve both as practical teaching aids and as objects of critical analysis in professional education programs.