By William Curvan

At the beginning of the semester, Carnegie Mellon students received multiple emails from the University Education Council inquiring about their thoughts on generative AI. The Council said it would use student responses “to inform the creation of various resources to support students’ effective and ethical use of generative AI tools.”

Currently, there are no prescriptive guidelines on how instructors should handle generative AI in their classrooms. The Eberly Center, a teaching development center that offers resources for faculty and graduate student instructors, provides examples of different approaches but leaves the decisions up to instructors.

As a First-Year Writing instructor, Literary and Cultural Studies Ph.D. candidate Catherine Evans has been trying to navigate best practices. She told The Tartan that she and other instructors in the English department have begun “incorporating new curriculum elements” that explicitly discuss the merits and drawbacks of generative AI.

In one of Evans’ Writing About Literature, Art, and Culture lessons, she asks students to evaluate two different summaries of a literary work. She said students often find that one of the summaries has very rigid prose and sometimes includes scenes that were not in the novel. Without realizing it, the students have identified the summary written by ChatGPT, which “just produces the most average piece of writing,” Evans said.

According to Evans, some instructors have made their courses more resistant to ChatGPT-penned essays. This includes teaching films, changing the genres of focus, and using new material that is less familiar to ChatGPT.

“It knows Shakespeare pretty well, but if you ask it to write about something made in 2023 that there are less scholarly articles about, it won’t know what to do,” Evans explained.

She said that she allows students to use ChatGPT in the brainstorming process as long as they make it absolutely clear where it was used and how. Evans said she caught a lot of “suspected AI use” in assignments early in the course, but that this was uncommon by the time students submitted their final essay. She even noticed that students would call each other out during the peer review process if they suspected one another of using generative AI.

Evans said she is worried about the consequences of using ChatGPT to aggregate online resources to begin one’s research process. She argued that research papers and publications that are already prevalent in the data set are going to show up the most when users ask for sources.

In her class, Evans asks students to consider whose voices AI privileges. “Plagiarism isn’t just finishing your assignment, it’s perpetuating systems of harm in higher ed,” she tells them.

Literary and Cultural Studies Ph.D. candidate Rachael Mulvihill teaches a First-Year Writing course called “Writing about Public Problems.” She said that the nature of the course is less conducive to using generative AI, as students must go through a lengthier process of research and interviews. The topics are more personal, so students tend to “get into a little more,” she told The Tartan.

Mulvihill allows students to use AI as a brainstorming tool to develop survey questions, as long as students give the writing a second pass. She also allows them to use it in early drafts, provided they clearly cite ChatGPT content.

Mulvihill is also a teaching assistant for Business Communication. The professor, Beth Walter, has begun to include ChatGPT usage in an assignment about sending a “negative news email” (for instance, an email about employee layoffs or pay cuts). In the assignment, she wants students to consider how ChatGPT’s tone affects the communication.

Elizabeth Walker, a Ph.D. candidate in Literary and Cultural Studies, wants to help her students “see where it can be a useful tool,” while also guiding them to be “very judicious about their usage of generative AI, because it does have its limitations,” she said.

Walker, who teaches a Dietrich Grand Challenge Seminar, recently received a grant from the Eberly Center to study potential applications of generative AI. She is studying how generative AI can “help students read academic texts more easily.”

Walker believes that many academic texts are not accessible and can be intimidating to someone delving into a field for the first time. She acknowledges that ChatGPT is not always the best at summarizing content but thinks that smart prompts can help overcome this.

“I’m curious to see how generative AI can help students get their minds around an article,” Walker said. She also foresees it being helpful for non-native English speakers reading English-language academic texts.

In her curriculum, students examine content written by ChatGPT and then discuss how to refine and revise it, Walker said. “Part of my job in the classroom is to learn how to ask questions to make ChatGPT as accurate as possible.”

Walker foresees another interesting application for ChatGPT in discussion-based classes. Instead of asking students “what do you all think?”, Walker said a teacher could instead use ChatGPT’s response to a discussion question as a starting point by asking students if they think it’s right.

When asked about adopting a university-wide policy, Walker said she believes “we need to have a policy in terms of disclosure. … I’m in the camp of making sure students know why disclosing use of it is important.”

While generative AI has helped many classrooms, it can still facilitate plagiarism. One professor told The Tartan that they had to pursue an academic integrity violation claim against a student who submitted an assignment that was written almost entirely by ChatGPT.

Both Mulvihill and Evans said a university-wide AI policy would have to tailor its rules to each academic discipline, which may use the technology differently. “It’s important that our school is talking about it,” Mulvihill said of the policy, especially because Carnegie Mellon is among the schools setting a precedent for AI use.

