Wednesday, April 10, 2024

Rubric for Evaluating AI Tools for Schools

Ever since the launch of ChatGPT and other generative AI tools, there has been an explosion in the creation of AI tools for schools. It seems that every week I am hearing about a new AI product launch designed for education.

Over time I have shared many of my favorite AI tools in the "Cool Tools" series - bit.ly/cool-tools-23 - and in my resource document "The AI Toolbox: Best AI Tools for Schools" - bit.ly/curts-aitools

All of these products promise to improve teaching and learning by assisting teachers, creating educational content, personalizing learning for students, and more. But how can a school evaluate these tools to see if they really do provide safe, educationally sound, cost-effective solutions?

To assist with this process I have created a "Rubric for Evaluating AI Tools in Education" which covers 18 essential criteria for assessing these products. The rubric is freely available for anyone to use or modify as needed.

See below to get your own copy of the rubric, as well as detailed directions on how to make the best use of it. As with anything AI-related, this is always a work in progress. I welcome your suggestions for improvements to make this an even more valuable tool for educators.

💾 Download the Rubric

The rubric is available as a Google Document. You can get your own copy using any of the options below.
As with all of my resources, this rubric is licensed under a Creative Commons Attribution Non-Commercial 4.0 International license. In short, you may copy, distribute, and adapt this work as long as you give proper attribution and do not charge for it.

👀 Overview
  • This rubric is designed to guide K-12 schools in the comprehensive evaluation of artificial intelligence (AI) tools for educational use.
  • It encompasses a broad range of 18 criteria essential for assessing the suitability, effectiveness, and safety of AI technologies within the learning environment.
  • This framework aims to simplify the selection process by providing a clear set of criteria for identifying AI tools that are most beneficial and appropriate for school settings.

⚖️ Evaluation Process
  • Team Collaboration: Evaluation is most effective when conducted collaboratively. Users are encouraged to form teams that include a diverse range of perspectives, including educators, IT staff, administrators, and where appropriate, students, to discuss each criterion and its relevance to the AI tool under consideration and the specific needs of their educational institution.
  • Guiding Questions: Each criterion is accompanied by one or more guiding questions. These are designed to prompt thorough consideration and discussion of how well the AI tool meets each criterion. Teams should use these questions to guide their evaluation and discussions.
  • Documentation: Below each criterion, there is a designated space for notes. Teams should document key observations, findings, and any consensus or differing opinions that arise during their evaluation. Including examples, specific features of the tool, and any relevant experiences or insights can be particularly useful.
  • Scoring: Adjacent to the notes section, there is a space to assign a score to each criterion. Teams are given the flexibility to determine their scoring scale (e.g., 1-5, 1-10, or qualitative descriptors like Excellent, Good, Fair, Poor). It's important to define this scale clearly before beginning the evaluation to ensure consistency and objectivity across all criteria.
  • Consensus and Decision-making: While individual perspectives are valuable, arriving at a consensus for each score can foster a unified approach to the decision-making process. If consensus cannot be reached, consider documenting the range of scores and discussing these discrepancies as part of the final evaluation.
  • Final Considerations: The comprehensive evaluation of an AI tool is not solely about tallying scores but understanding the nuances of how the tool aligns with educational goals, supports teaching and learning, and fits within the technological and ethical framework of the institution. The notes and discussions will be invaluable in making an informed decision beyond the numerical score.
  • Review and Action: After completing the rubric, review the scores and notes as a team to identify strengths, areas for improvement, and any deal-breakers. This review should inform whether the tool is a suitable choice for your institution and what next steps, if any, are necessary to implement, trial, or further investigate the tool.

💼 Vendor Involvement

To further enhance the utility of this rubric in evaluating AI tools for educational purposes, schools are encouraged to share it directly with the companies that develop these tools.
  • This initiative allows developers to provide detailed insights into how their products meet the specified criteria, offering a preliminary but valuable layer of assessment.
  • By doing so, educational institutions can gather critical information upfront, simplifying the initial stages of their evaluation process.
  • This approach not only streamlines the selection of AI tools by leveraging developers' expertise and insights, but also fosters a collaborative dialogue between schools and companies, ensuring that educational needs and goals are effectively communicated and addressed.

📋 Evaluation Rubric

1 - Alignment with Curriculum
  • Does the AI tool align with the curriculum goals and learning objectives?
  • Can the tool be customized or adapted to align with the specific curriculum goals and learning objectives of different subjects, grade levels, or educational programs?
2 - Ease of Use
  • Is the tool user-friendly for both students and teachers?
  • How intuitive is the user interface of the AI tool for first-time users?
  • What is the estimated learning curve for both students and teachers to become proficient in using the tool?
3 - Teacher Role
  • How does the tool support educators in their instructional practices?
  • What features does the tool offer for educators?
  • Is the emphasis of the tool on empowering educators with additional resources and insights, rather than automating teaching tasks in a way that could marginalize the role of the educator?
4 - Age Appropriateness
  • Is the content and interaction style of the AI tool suitable for the age and developmental stage of the students who will be using it?
  • How does the tool adjust to cater to the cognitive, social, and emotional needs of different age groups within the K-12 spectrum?
5 - Accessibility
  • Is the tool accessible to all users, including those with disabilities?
  • What specific features does the tool offer to support users with disabilities?
  • Does the provider offer a VPAT (Voluntary Product Accessibility Template) for this tool?
6 - Multi-Lingual Support
  • Does the AI tool support multiple languages, and can it provide content, instructions, and feedback in languages relevant to the users?
  • How does it ensure that language does not become a barrier to learning and engagement for non-native speakers?
7 - Engagement
  • Does the tool engage students in a meaningful way that enhances learning?
  • How does the tool personalize learning experiences to match the interests, skill levels, and learning pace of individual students, thereby enhancing engagement?
8 - Content Accuracy
  • Is the information and content provided by the AI accurate and reliable?
  • How can users report errors, and how quickly are these addressed?
9 - Bias and Fairness
  • Does the tool demonstrate a commitment to reducing bias and ensuring fairness across diverse populations?
  • Does the tool use diverse datasets that represent the varied demographics of the user population, including race, gender, socio-economic status, and abilities?
  • Are there any mechanisms in place for teachers and users to report potential biases or unfair outcomes?
10 - Content Filtering
  • Does the AI include safeguards against inappropriate content?
  • Does the tool provide a way for teachers and students to report instances of inappropriate content that may have been missed by the filters?
  • How does the tool respond to inappropriate content submitted by students or other users?
11 - Integration
  • Can it be easily integrated with existing tools, services, learning management systems (LMS), and other educational technologies?
  • Which specific tools and platforms does it integrate with, and in what ways (input, output, single sign-on, etc.)?
12 - System Requirements
  • What are the minimum hardware specifications and internet speeds required to use the tool effectively?
  • How does the tool scale and perform under different network conditions and with varying numbers of simultaneous users?
13 - Free Option
  • Is there a viable free version of the tool that provides educationally valuable functionality?
  • What are the features and limitations of any free version?
  • Is there an option to pilot any paid versions of the tool to evaluate the full functionality?
14 - Cost-Effectiveness
  • Is the tool cost-effective considering its educational benefits?
  • How does the cost of this tool compare to similar tools or free alternatives in terms of features, effectiveness, and user satisfaction?
  • What unique benefits does this tool offer that justify any additional expense?
15 - Updates and Maintenance
  • How are updates handled, and is ongoing maintenance/support provided?
  • How often are new features added to the tool?
  • Does the provider appear to be actively and intentionally developing and improving the product?
16 - Training Resources
  • Are there a variety of training materials available, including manuals, online tutorials, webinars, and interactive courses, to cater to different learning preferences and schedules?
  • Does the provider foster a community of users through forums, user groups, or social media platforms where educators can share experiences, ask questions, and offer peer support?
17 - Customer Support
  • How accessible is customer support for the AI tool, and what channels (e.g., email, phone, live chat) are available for teachers and administrators to seek help?
  • How quickly does the support team respond to inquiries and issues, and are there any dedicated support options for educational institutions?
18 - Data Privacy
  • Does the AI tool clearly outline what student data is collected, how it is used, and who has access to it?
  • Are there explicit assurances that the data collected is solely for educational purposes and not for commercial use?
  • How does the tool ensure compliance with educational data protection standards and regulations (e.g., FERPA, COPPA, GDPR)?
  • What measures are in place to protect against data breaches and other cyber threats?


Post by Eric Curts
📮 Join the "Control Alt Achieve" email discussion group
💬 Join the "Control Alt Achieve" Facebook group - bit.ly/caa-fb
🔔 Get new blog posts automatically through email - FollowIt link
📰 Sign up for my email newsletter
🐦 Connect on socials: Threads - Twitter - Facebook - LinkedIn - Instagram - Mastodon - Bluesky
▶️ Subscribe to my YouTube channel
📧 Reach out through email - ericcurts@gmail.com
📗 Check out my "Control Alt Achieve" book
🔗 See my "EdTech Links of the Week" - bit.ly/caa-links
🏫 Bring me to your school, organization, or conference with over 70 PD sessions to choose from
