
Introduction

This guide builds on and extends the work of two previous guides published by the SUNY FACT2 Council. The first edition of the SUNY FACT2 Guide to Optimizing AI in Higher Education, published in fall 2023, was written to help faculty get their initial footing given the sudden availability of generative AI tools and the significant challenges that ensued. The second edition, published in summer 2024, built on the work of the first, integrating more detailed recommendations in response to emergent practices in teaching, research, creative applications, and assessment. Those guides provide a broad introduction to AI in higher education, and readers looking for more general information may wish to consult them as a starting point.

Now, in late 2025, our thinking about generative AI for teaching and learning has evolved, as have the needs of faculty and institutions. Many of our initial concerns about AI remain valid, including biases trained into large language models (LLMs), negative environmental impacts, and the potential for violations of data privacy. All of these concerns are addressed in some depth in the first and second editions of the Guide to Optimizing AI in Higher Education. At the same time, new concerns have emerged, along with a more sophisticated understanding of AI's potential and of how AI is currently being applied across SUNY and at other institutions. This guide reflects that evolution and provides more specific guidance to address common concerns, including frameworks to help faculty and institutions make good decisions about the role of AI in teaching and learning.

It is important to acknowledge that caution in the adoption of AI tools is still warranted. As we prepare our students to live and work in a world where AI will remain present, we have a responsibility to help them think about the implications of their usage. Knee-jerk rejection of AI is not helpful, nor is thoughtless embrace of it. Instead, we need to weigh the potential harms alongside the potential benefits, and help our students do the same. We need to help them think critically about the hype, both positive and negative, that characterizes public discourse about AI, especially in relation to educational practices. And perhaps most important, we need to equip students and faculty to critically evaluate when it is appropriate to use AI (and when it is not), and to make informed, ethical decisions as they do so.

This guide provides frameworks to address three key areas where faculty continue to face challenges with ethical AI use and adoption: considerations for policy development, evaluating AI tools for teaching and learning, and the use of AI tutors to support student learning experiences.

Higher education institutions continue to grapple with ensuring that faculty and students have clear and helpful guidance to navigate the use of AI. Part 1 of this guide provides an overview of important considerations for the development of AI policies and guidelines. It also describes a transparent process for developing guidance with the input of stakeholders and for making decisions about framing policies in ways that will ensure buy-in. Importantly, this section acknowledges the presence of AI tools across multiple dimensions of an institution’s infrastructure, not just the generative AI tools that have become widely available in recent years. Finally, this section also includes helpful examples of policies that have already been developed by other institutions as a point of reference.

A key challenge that faculty face is the proliferation of AI tools, both within and outside the landscape of educational technology. In the face of many options, it can be difficult to make deliberate choices about whether tools will serve specific teaching and learning needs and goals. Part 2 of this guide provides a clear and easy-to-use rubric for evaluating AI tools that includes key dimensions to guide our decision-making. This section also includes illustrative case studies to demonstrate how to use this rubric effectively.

Finally, higher education continues to explore the potential for AI to provide personalized learning experiences for students. Part 3 considers one of these possibilities, providing an overview of the role that AI tutors can play in students’ learning. In addition to considering best practices for use of AI tutors, this section considers the limitations and potential drawbacks of tutors for student learning.

License


AI in Action: A SUNY FACT2 Guide to Optimizing AI in Higher Education Copyright © 2025 by SUNY FACT2 Task Group on AI in Action; Kati Ahern; Nicola Marae Allain; Abigail Bechtel; Angie Chung; Billie Franchini; Meghanne Freivald; Ken Fujiuchi; Dana Gavin; Jack Harris; Keith Landa; Alla Myzelev; Victoria Pilato; Ahmad Pratama; Russell V. Rittenhouse; Carrie Solomon; Angela C. Thering; and Shyam Sharma is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.