Student AI use: Stupid, sneaky, and skillful

Another academic year concludes in the age of artificial intelligence (AI). The chatbots are certainly more capable: ChatGPT 5.4, Gemini 3.1, and Opus 4.7 are extraordinary. And though enthusiasts made much of OpenClaw, it will be a year or so before ordinary people are using Claude’s Computer Cowork or OpenAI’s Operator. But when they do, students will be able to direct agents to complete their work without detection. For an essay, a student might specify: “following the assignment instructions and rubric: develop a thesis, find sources, read and summarize the sources, create an outline, and write the prose; avoid the tell-tale signs of AI prose, use a humanizer, and complete all steps with the pace and haphazardness of a human.”

Because I required my students to include a link to their drafts’ version histories and to document their AI usage—even sharing their AI conversations—I have already seen students use AI to complete each of those steps individually. And that was just part of what I saw through pangram.com and Process Feedback.

  • obvious misconduct
    1. stupidly copying-and-pasting (and falling for a poison pill)
    2. sneakily copying-and-typing (via “humanizer” plugin or by hand)
      • there’s even Scrawl AI, which can write out assignments in a person’s handwriting
  • definitely concerning (and bordering on misconduct)
    1. copying-and-pasting and then rewriting every paragraph themselves
    2. asking for a thesis, sources, source summaries, and an outline, and then genuinely writing the prose themselves (then possibly editing it in Grammarly)
  • skillfully appropriate
    • developing a novel thesis (“I’m interested in topic X, and I’m thinking Y & Z, help me brainstorm.”),
    • finding sources (“What are the best sources on X? Why are they the best? Compare these to the sources in the Wikipedia article on the topic.”)
    • understanding the sources (“What is the novel contribution relative to the literature? What were the methods, strengths, and limitations?”)
    • polishing their writing (“Here are common writing issues, as well as my own foibles; critique this draft.”)
  I share my own prompts with students to stress that these are the product of many conversations with AI, and that even so they should serve only as the starting point for new interactions.

This summer I’m going to continue adapting my courses to AI and the new cohort of AI “native” (dependent) students. I will lessen the weight of out-of-class writing by introducing low-stakes pop quizzes. I also plan to introduce some “use AI appropriately” exercises, such as creating self-tests and asking for critique.

Ultimately, though, I am not optimistic about education. A few students will use AI well: those who enjoy learning and who have the ability to focus and to orchestrate tasks, threads, and agents. Ironically, this group will include students who were kept away from screens for much of their childhoods. Otherwise, I fear AI will enfeeble many, widening the achievement gap.
