
The Future of Academics and ChatGPT

College faculty evaluate the impact of AI on the study of humanities

Navigating AI can seem daunting as it moves deeper into our careers, our studies, and our daily lives. With its ability to generate creative ideas, write compelling articles for any audience, and develop movie scripts (among other things), ChatGPT has prompted concerned conversations among both faculty and students, who fear the software’s capacity to spread misinformation unchecked.

The College of Humanities has been working to address these potential problems in a multitude of ways. Humanities Center Director Rex Nielson (Luso-Afro-Brazilian Literature and Culture) and Associate Dean Leslee Thorne-Murphy (British Literature) approached Associate Professor Brian Jackson (Rhetoric and Writing Studies) about chairing an AI task force. Assistant Professor Meridith Reed (Rhetoric and Writing Studies) also wanted to address AI’s implications for the College, so she incorporated AI into her classes to teach her students how to use it effectively. These professors hope to allay fears about AI by helping students learn actively rather than rely on technology and by teaching them to use AI well.


The Problem with ChatGPT

ChatGPT stands for Chat Generative Pre-trained Transformer. In short, it is an artificial intelligence chatbot that, when given a prompt, generates a detailed response. ChatGPT often provides clear, relevant answers, but it has real limitations. Sometimes it produces responses that are incorrect, biased, or even offensive. When asked to provide research and sources, it has a tendency to invent sources that don’t exist.
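For readers curious about what sits behind the chat window, the sketch below shows how a program might send a prompt to a model of this kind through OpenAI’s Python library. It is a minimal illustration only; the model name, prompt, and setup are assumptions for the example and are not part of the College’s coursework or the task force’s materials.

```python
# Minimal sketch (illustrative, not from the article) of prompting a chat model
# through OpenAI's Python library. Assumes `pip install openai` and an
# OPENAI_API_KEY environment variable; the model name is an assumption.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name, chosen only for illustration
    messages=[
        {"role": "user",
         "content": "Summarize the main themes of 'Frankenstein' in three sentences."}
    ],
)

# The reply usually reads fluently, but, as the article notes, it can contain
# errors or invented sources, so it still needs a human reader's judgment.
print(response.choices[0].message.content)
```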

Even when ChatGPT produces accurate and compelling responses, many worry that it can also enable laziness and lead to a lack of creativity in the arts. Particularly in the College of Humanities, many professors worry that students will rely so heavily on ChatGPT that they never learn how to write. Furthermore, even if students do complete assignments without the help of AI, faculty have struggled to differentiate original work from AI-generated projects.

The Need for an AI Task Force

As chair of the AI task force, Jackson searched for a panel of people from different backgrounds to join him, hoping to get diverse opinions on how to address AI’s implications and problems. He invited various faculty and staff members from the Research and Writing Center, Department of Instruction in the Harold B. Lee Library, Center for Teaching and Learning, Office of Digital Humanities, and Department of Linguistics to be part of the task force. Faculty members from the English Department, like Reed, also participated.

Through their discussions, each member of the task force expressed different opinions on how to handle AI in the College. For instance, Jackson encourages his students to use ChatGPT so they can understand how it generates text. He tells them to study what the system produces and consider whether it taught them anything about improving their own writing. Through this process, Jackson hopes students will compare their writing with the AI’s output and see where the tool is helpful and where it is problematic.

The task force also analyzed public perception of AI, noting that people generally like what it writes but don’t like knowing that a robot wrote it. People see AI-written work as devalued and disingenuous.

With this in mind, faculty needed to show their students how to work through the writing process rather than cut corners with AI. The task force created a guide called “Teaching with Artificial Intelligence” for all faculty in the College, which, among other things, included an introduction to the technology, issues for faculty to think through, and sample class activities. These activities had students create content (such as media posts and journals) using AI. Professors could then ask their students questions about what the program generated: “What did you expect it to generate? What did it leave out? What has changed about your understanding of AI?” These assignments showed students AI’s limitations, demonstrating why it is so essential to learn to write without the help of technology.


Meridith Reed’s Classroom Experiment

Reed first learned about ChatGPT in December 2022, and while she recognized the potential problems involved in using AI, she knew that she needed to teach her students how to use it effectively. Reed says, “I personally think that we would be putting our students at a disadvantage if we never talked about how to use it or how to use it well.”

She experimented with ChatGPT in her Writing 150 and Professional Writing courses. Early on, she asked students to write a literacy narrative, compare their narratives with one written by ChatGPT, and then use ChatGPT however they wished as they revised their first drafts. She also asked students to analyze the advantages and disadvantages of using the application in their writing.

Reed found that in testing out ChatGPT, her students became more aware of its capabilities and limitations. She says, “Most students ended up concluding that ChatGPT can be useful for revision, brainstorming, or idea generation, but they didn’t feel like they wanted to rely on it to write their essays.”

Additionally, Reed saw that professional writing students were initially nervous that ChatGPT would take away their future jobs, but that fear gradually lessened as the semester progressed. Reed says, “I think they realize there are things that humans bring to the writing process that ChatGPT can’t do.”

Reed knows that AI writing isn’t all fun and games, especially because it can be difficult to catch when a student uses it to cheat, but she believes that the number of students who cheat will remain the same. She says, “I don’t see why their motivations would change significantly.”

She also acknowledges that with tools like ChatGPT, students may no longer face the struggle of staring at a blank page, which she believes is valuable for learning to write. While AI can generate interesting, unique ideas, Reed hopes her students will recognize the value in creating their own.

Reed warns that one of ChatGPT’s limitations is its poor research. Sometimes it will reference real journals and researchers but make up the facts, introducing errors that students may not catch. She believes students need instruction and practice to become aware of these potential issues and to develop the skills to navigate them. Reed plans to continue teaching her students how to use AI effectively because, as she says, “AI is here to stay. It’s going to be used as part of the writing process in professional work. We would be putting ourselves at a disadvantage if we never taught students about it.”

Student Responses to AI

Students also see the potential dangers of using ChatGPT. Alyson Bishop (English ’25) took Reed’s writing class, where she completed an assignment using ChatGPT. She says, “I think ChatGPT helped in the way that it went a lot faster than if I had done it myself, but it probably hindered my learning.”

She ultimately came to a similar conclusion as Reed, deciding that while ChatGPT can be a useful resource, it cannot replace the actual writing process. Bishop says, “There’s so much you’re going to lose if you’re not a real person trying to analyze and have feelings about a text.”

The Future of AI

Jackson asks, “In the future, are we smart enough, forward thinking enough, and brave enough to ask whether technology is aligned with our values? What do we want our students to learn and how does technology augment or automate?” While AI has its pros and cons, faculty are finding ways to use it as a resource rather than a crutch that eliminates creativity and deep thinking.