May 26, 2025
Wall Street Journal
College Students Ride the AI Cheating Wave
Editor's note: In this Future View, students discuss artificial intelligence and cheating. Next we'll ask: 'Have you noticed DEI programs being canceled or scaled back at school? Is this good or bad?' Students should click here to submit opinions of fewer than 250 words by May 26. The best responses will be published Tuesday night.
Why I Write
Writing an undergraduate paper isn't about the actual paper. As an English major, I write to understand what I have read. Using artificial intelligence to write a term paper for my Shakespeare class wouldn't only be dishonest; it would also rob me of my education. The odds of my saying something novel about 'To be or not to be' are about zero, and I know academia isn't hurting for the musings of a 20-year-old student fueled by energy drinks in the library at 2 a.m.
I write not because anyone else needs to read my thoughts, but because I need to write them. Delivering a finished paper takes hours of reading, rereading, outlining, drafting and editing. Even then, as one of my professors said, papers are never really finished; they are only due. Writing may be draining and never perfect, but it's always rewarding.
Slaving over term papers every semester for three years has made me a more careful reader, insightful thinker and articulate writer. When my professor grades my work, he judges the merit of my thought and engagement with the text. The page must reflect me, then, not the output of a chatbot.
AI has its merits. The chatbot Grok can do a deep search faster than I can find someone on LinkedIn, and ChatGPT wrote me a better workout program than my personal trainer did. The technology may well improve the quality of work in many spheres, but the classroom isn't one of them.
—Moira Gleason, Hillsdale College, English
The Future of Medicine
AI isn't cheating—it's preparation. As a medical student, I have found that success isn't measured by memorization but by the ability to make informed decisions that save lives. Increasingly, that means working alongside AI.
Tools including ChatGPT, Gemini and OpenEvidence are already helping clinicians navigate complex cases on hospital wards—whether by narrowing treatment options or summarizing the latest research on a rare disease.
Research published in Nature has shown that a doctor using AI chatbots can often make better clinical decisions than a doctor working alone. Students cross an ethical line only when they use AI to avoid learning. A medical professional needs to know enough to use AI properly and safely. If a model 'hallucinates' and a doctor doesn't realize it, mistakes can slip by. That's dangerous in the classroom and the clinic.
But there's a clear difference between outsourcing thinking and using AI to enhance it. Medical schools shouldn't only allow AI—they have a duty to teach how to use it responsibly. Generative AI can simulate patient scenarios, break down difficult concepts or offer alternative ways of thinking. The next generation of physicians must know how to collaborate with AI, critically question its output and integrate it safely into patient care. That isn't cheating. That's learning that prepares students for the future of medicine.
—Dhruva Gupta, Harvard University, medicine
Why Attend Class?
College students have begun unloading their coursework entirely onto AI. It has become an excuse not to attend class at all; a chatbot already knows all it needs to complete the coursework. Using AI to write an essay or complete an assignment negates the entire point of education—all while disadvantaging students who actually put time and effort into earning their degrees. The widespread use of AI pushes students to rely on chatbots just to keep up with their peers.
AI is valuable when it isn't used to complete assignments for students. Its ability to compile information quickly into study guides or explanations of course material can support a student's learning. Unfortunately, most students aren't using AI that way.
—Patryk Zielinski, University of Connecticut, economics
Education Must Evolve
As generative AI tools become more sophisticated, our definition of academic dishonesty and cheating must evolve. Using AI to help review, write or structure essays and problem sets isn't inherently dishonest. Cheating implies an unfair advantage: using prohibited means not available to others. But AI tools are becoming as ubiquitous as calculators. What matters is how educators design assignments, and how teachers shift their focus from assessing rote knowledge to assessing critical skills.
These new tools should prompt a re-evaluation of our educational goals. If a chatbot can produce a coherent response to a question instantly, perhaps that question no longer measures meaningful understanding. Rather than testing easily searchable facts, teachers should design assessments that demand analytical thinking, synthesis and original insight—the skills AI can't fully replicate. In a world in which AI is universally accessible, the abilities students need to cultivate are different. Schools need to adapt to that reality.
Rather than lowering standards, AI can raise them, redefining what it means to learn.
—Shira Shturman, Reichman University, law