
Editing in the Age of AI

What does the rise of AI mean for student writers and editors who are about to enter the workforce? BYU linguistics professors offer their insights on this unique challenge.

When it comes to fears and phobias, most people probably think of spiders, heights, or small spaces. But perhaps a more universal fear—the fear of the unknown—is becoming increasingly relevant with the rise of AI. On November 13, 2025, linguistics faculty Matt Baker (Editing, Business Communication), Holly Baker (Editing and Publishing), and Professor Mark Wolfersberger (Editing and Language Teaching) gave a presentation for STET: The Editor’s Network (BYU’s editing and publishing student association) on this very topic. They shared their research exploring the future of writing and editing with AI and how writers and editors can work alongside this technology.

A photo of Matt Baker
Photo by Colby St. Gelais

Understanding AI

Matt Baker’s presentation focused on understanding what AI is and what it is not. He explained that society has begun to fall into the trap of anthropomorphizing AI, which Baker defined as “ascribing human-like traits to non-human entities.” People subconsciously humanize AI by using certain verbs, such as learn, know, and need, when speaking of AI interfaces. However, AI does not possess the human ability to know or to need. Thus, Baker believes this anthropomorphization of AI is a dangerous practice “because it’s not accurate to what [AI] is. It’s not a human.”

Some of the potential negative effects of giving AI these human characteristics, according to Baker, include misunderstanding how AI functions, maintaining unrealistic expectations of its capabilities, and allowing the technology to act as a scapegoat for unethical decisions. And, perhaps most relevant to the topic of writing and editing, AI’s widespread use could limit human creativity.

The Need for Human Editors

Holly Baker also examined the element of human creativity, making the point that a “human editor has experience in the world that ChatGPT does not have.” This experience, she explained, gives humans much more wisdom than AI could possibly possess.

A photo of Holly Baker
Photo by David John Arnett

She provided multiple examples of AI editing and writing suggestions that were blatantly incorrect and explained that if we let AI “do our copywriting for us, it would actually introduce errors into the text.” She believes that these errors demonstrate “the need for a human editor who better understands the context.”

Baker expressed no concern about the possibility of AI replacing editors because “human expertise is superior to any artificial intelligence.” However, she recognizes that the world is changing and that editors will need to know how to work alongside AI. She suggests preparing for this partnership by developing expertise in the field of editing; the job of human editors, in her view, is to “recognize when AI is not doing it correctly” and to discern when “we need to step in and fix it.”

AI as a Tool

A photo of Mark Wolfersberger
Photo by Colby St. Gelais

Wolfersberger agreed with Holly Baker while also acknowledging that AI has many positive aspects. He focused his remarks on the practical logistics of writers and editors working with AI and using it as a tool. He encouraged writers and editors to “think about ways in which [AI] might be able to support some of your weaknesses.”

He walked step-by-step through the writing process and suggested how writers might use AI. One idea he gave was for writers to begin with a “brain dump” and then to use AI to help organize that information. He also recommended writers use AI in the beginning stages of writing to create a preliminary outline. Using AI in this way helps writers feel inspired and guides the direction of their writing.

Wolfersberger did warn writers and editors against asking AI to copyedit their work because “it tends to correct things that don’t need correction,” a point Holly Baker’s examples had already demonstrated. Baker issued her own warning: “Ceding knowledge, skills, and expertise to AI tools weakens both the editor and the final product.” It can also destroy the author’s unique voice.

As AI becomes increasingly relevant in the world, it is even more important for students to learn how to approach it in the workplace. By listening to the counsel given by these BYU Humanities faculty, writers and editors can be guided through that experience, taking comfort in this point made by Holly Baker: “There’s always going to be the need for that human element, which is encouraging.”

Learn more about BYU’s STET: The Editor’s Network events here.