
Academic Integrity in the Age of AI Tools
Winning Essay by Malanie Cline

Looking back only five years, few of us could have anticipated the mainstream adoption of AI, let alone its reach into sectors as large as education. What began as a quick thesaurus, a simple calculator, and a grammar checker soon grew into tools that create outlines, brainstorm ideas, and, before we knew it, write entire essays for students. While AI tools can offer students real support, these same tools challenge our definitions of academic integrity. This calls for a recalibration of what it means to learn and create in higher education.

When used responsibly, AI tools can enhance learning in educational environments. For example, AI can serve as a tutor when financial or geographical barriers impede a student’s access to tutoring. Non-native speakers can also use AI to improve their writing in a second or third language. Used mindfully, AI tools can increase access to quality education; used simply to reach an easier outcome, they cannot.

By contrast, misuse of AI threatens core academic values, chief among them integrity. When AI tools are used to generate and submit entire essays, original work goes out the window. Feeding one’s own writing into AI tools to be rewritten in different tones flattens its uniqueness over successive iterations and strips away the distinctive qualities that make human writing exceptional. These uses of AI undermine personal learning and blur the line between assistance and dishonesty. Students may not intend to submit dishonest work, but the boundaries are unclear amid the fast-paced evolution of these tools.

Given this fast pace, academic integrity must evolve alongside technology. This can take several forms. Transparent use policies that clearly state when AI has been used can remove ambiguity about what counts as acceptable use in teaching settings. Teaching students how to use AI ethically builds literacy in the topic and sharpens those blurred lines; students may not even recognize tools like Grammarly or citation generators as AI. When these tools are explicitly labeled and discussed in the classroom, ethical use is normalized and unintentional violations of school policy are reduced. Lastly, a shift toward handwritten and process-based assignments that emphasize critical thinking can help ensure that submitted work reflects the student’s own thinking. This can take the form of in-class writing or multi-step assignments that include the outlines, drafts, and revisions used to craft the final piece.

While AI is not fundamentally a threat, how we use it matters greatly. At a time when AI has moved to the forefront of technology, simply avoiding shortcuts in coursework is not enough. We must redefine what it means to learn and think as AI tools become more prevalent and unavoidable. Human creativity can be preserved rather than replaced when AI is embraced within ethical boundaries. These steps are not the responsibility of any one party: institutions, teachers, and students must work collectively to form a culture of integrity that embraces the future of technology, not the fear of it.

About the Winner

Malanie Cline recently earned her Bachelor of Science in Computer Engineering from West Virginia University and is currently pursuing her Master’s in Biomedical Engineering at Case Western Reserve University. Her focus is on integrating computational methods with medical technologies. She is passionate about advancing emerging technologies that make healthcare more accessible and sustainable worldwide. Her interests include developing systems that improve the transport and accessibility of medical devices, as well as exploring innovative stem cell research from noninvasive biological sources to expand regenerative medicine possibilities. Malanie is deeply motivated by the potential of technology to improve health outcomes and reduce disparities in medical care across different regions.