Developing an Evaluation Framework
Below are the key approaches we took in collaboratively developing our framework for evaluating AI tools and their use. The framework operates at both the institutional and instructional levels, and its real value lies in adapting it to different contexts.
- Research and Identify AI Tools: Conduct thorough research to identify various AI tools that are likely to offer educational benefits for faculty and students. Look for tools with good reviews, documentation, user feedback, and proven track records in education.
- Assess Features and Functionality: Review the features and functionalities of each AI tool. Ensure that they align with your specific learning objective(s) and enhance the learning experience.
- User Interface and Experience: Test the user interface of the AI tool to ensure it is intuitive and user-friendly. A complicated interface can hinder student engagement and learning, as well as faculty adaptation and use.
- Data Privacy and Security: Evaluate the AI tool’s data privacy and security measures. Ensure that student data is protected and that the tool complies with relevant privacy regulations. 1EdTech’s TrustEd Apps™ Generative AI Data Rubric (Data Privacy section), a supplier self-assessment tool still in early development, may help identify the questions that need to be asked.
- Data Collection, Storage, and Use: Determine what data the AI tool collects from students and how it is stored. Ensure that personally identifiable information (PII) and sensitive data are handled securely and that data retention policies comply with relevant regulations. Determine if the data collected is used to train the tool and the potential impacts this may have on teaching or learning in terms of privacy or intellectual property rights. Students should be able to refuse to use tools if they believe their personal information is not protected.
- Vendor Policies and Agreements: Carefully review the privacy policy and terms of service of the AI tool provider to understand how they handle student data and what responsibilities they hold. Students should also have access to these policies and to information about the other issues listed here.
- Data Sharing: Check if the AI tool shares student data with third parties or if it aggregates data across institutions. Be cautious about tools that may share data without explicit consent or for purposes beyond the scope of the educational context.
- Data Anonymization and De-identification: Verify that the AI tool anonymizes or de-identifies student data to protect their privacy. This limits the harm caused by data breaches and unauthorized access.
- Access Controls: Check the access controls and permissions for the AI tool. Instructors should only have access to the data necessary for teaching, while students should have appropriate control over their personal information.
- GDPR and Compliance: If the AI tool operates in or collects data from users in the European Union, ensure that it complies with the General Data Protection Regulation (GDPR) and other relevant data protection laws.
- Security Audits and Certifications: Inquire whether the AI tool provider undergoes regular security audits and holds relevant certifications to ensure that their data protection practices meet industry standards.
- Incident Response and Data Breach Policies: Understand the AI tool provider’s incident response plan and data breach policies. Make sure that they have processes in place to handle any potential security breaches promptly and responsibly.
- Data Ownership and Portability: Clarify who owns the data generated through the AI tool and ensure that students have the right to access and export their data upon request.
- Compatibility and Integration: Check whether the AI tool integrates seamlessly with the existing learning management system, or can be easily accessed on its own, if that is your expectation.
- Vendor Reputation and Support: Research the reputation of the AI tool’s vendor. Consider factors like customer support, ongoing updates, and responsiveness to issues or concerns, all of which can affect teaching and learning post-adoption.
- Instructor Training and Support: Consider what training and support are provided to help instructors use the AI tool effectively, including any demo videos the tool’s makers may have produced. Imposing a tool on students or faculty without post-adoption support can adversely affect teaching and learning.
- Institutional Approval and Policy Compliance: Ensure that the AI tool meets institutional policies and guidelines for educational technology adoption. (The New York State Information Technology policy on the Acceptable Use of Artificial Intelligence Policies may also be a helpful resource to consult.)
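The criteria above could be tracked in a simple weighted rubric so that reviewers score candidate tools consistently. The sketch below is illustrative only: the criterion names are drawn from this list, but the weights, the 0–5 scoring scale, and the sample ratings are hypothetical choices an institution would set for itself.

```python
# Illustrative weighted rubric for the evaluation criteria above.
# Weights and the 0-5 scale are hypothetical, not prescribed by the framework.
CRITERIA_WEIGHTS = {
    "features_and_functionality": 3,
    "user_interface": 2,
    "data_privacy_and_security": 5,
    "vendor_policies": 4,
    "compatibility_and_integration": 2,
    "training_and_support": 3,
    "institutional_policy_compliance": 5,
}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion ratings (0-5) into one weighted average."""
    total_weight = sum(CRITERIA_WEIGHTS.values())
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items()) / total_weight

# Example: a reviewer rates one candidate tool 4/5 on every criterion.
ratings = {c: 4 for c in CRITERIA_WEIGHTS}
print(round(weighted_score(ratings), 2))  # → 4.0
```

A rubric like this makes it easier to compare tools side by side and to surface disagreements between reviewers before a tool reaches pilot testing.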
Once a potential tool has been identified for use in a course or program, faculty should evaluate the tool prior to implementation. Here is a step-by-step guide to evaluating AI tools:
- Define Learning Objectives: Determine how the AI tool can complement or enhance the achievement of course learning objectives.
- Trial and Pilot Testing: Conduct a trial or pilot test of the AI tool with a small group of students or colleagues. Gather feedback on its effectiveness and usability.
- Learning Analytics: Assess the tool’s ability to provide valuable learning analytics and insights for instructors and students. Analytics can help identify areas for improvement and measure learning outcomes.
- Feedback and Assessment: Collect feedback from students who used the AI tool and assess its impact on their learning experience and outcomes.
- Integration with Curriculum: Ensure the AI tool can be integrated seamlessly into the course curriculum without disrupting the overall flow of the course.
- Comparison with Evidence-based Pedagogies: Compare the AI tool’s effectiveness with proven instructor-led teaching methods, both face-to-face (F2F) and technology-based, to gauge its added value.
- Support for Multimodal Learning: Verify if the AI tool supports multimodal learning, allowing students to engage with content using various formats, such as text, audio, video, and interactive elements.
- Long-Term Viability: Assess the long-term viability of the AI tool, considering its potential for future updates and scalability.
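The pilot-testing and feedback steps above can be sketched as a small feedback summary. The field names, the 1–5 rating scale, and the sample responses below are assumptions for illustration, not part of any particular survey instrument.

```python
# Illustrative aggregation of pilot-test feedback (see "Trial and Pilot
# Testing" and "Feedback and Assessment" above). Fields are hypothetical.
from statistics import mean

def summarize_pilot(responses: list[dict]) -> dict:
    """Aggregate 1-5 ratings and free-text comments from a pilot group."""
    return {
        "n": len(responses),
        "avg_usability": round(mean(r["usability"] for r in responses), 2),
        "avg_learning_value": round(mean(r["learning_value"] for r in responses), 2),
        "comments": [r["comment"] for r in responses if r.get("comment")],
    }

# Example: three pilot participants rate the tool and leave comments.
pilot = [
    {"usability": 4, "learning_value": 5, "comment": "Helped me revise drafts."},
    {"usability": 3, "learning_value": 4, "comment": ""},
    {"usability": 5, "learning_value": 4, "comment": "Interface was intuitive."},
]
print(summarize_pilot(pilot))
```

Even a lightweight summary like this gives faculty concrete evidence, averages plus representative comments, to weigh against the course learning objectives before full adoption.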