ChatGPT was introduced to me in my sophomore year, the spring of 2023, by Professor of Writing and Rhetoric Hector Vila. We pasted our writing into the then-novel software, asked it for an improved version and reflected on what we liked about our work versus the robot’s. I watched my essay about my complicated relationship with my family in the Netherlands transform into something decidedly less authentic. I closed the tab, assured that the technology could never replace good human writing.
In that same semester, I quietly nodded along at a Campus meeting when the executive team at the time set a ChatGPT policy for our newsroom. With some caveats, writers could use the tool in the reporting and journalistic process if they informed their editors. For producing the text of an article or editing, it was forbidden. While we still maintain these standards, over two years later I feel the need to recognize their limitations and add some transparency about how The Campus is navigating the rapidly changing landscape of research and writing.
For a long time after that initial semester of ChatGPT-craze, I tried to ignore generative artificial intelligence (AI) almost entirely. I tuned out my dentist’s AI-induced skepticism when I told him I plan to pursue a career in journalism and shut my eyes to how much my peers were implementing the tool. The idea that the writing I spent hours on could theoretically be done in just a few seconds terrified me.
But when I stepped into my managing role on The Campus last fall, I adopted responsibility for the quality, accuracy and humanness not only of my own writing, but of approximately 20 pieces of others’ work per week. That meant I had to stop turning away.
I have viewed, and partly still view, The Campus as relatively safe from what I see as generative AI’s many pitfalls: factual errors, limitations in original storytelling and potential bias. Unlike large newsrooms that have struggled to handle AI, we report on a small campus and town community; it would be difficult for generative AI software to gather sufficient information for the kinds of articles we publish, many of which require quotations from interviews. Plus, no one is paid for the work they do for us, and participation is entirely voluntary. What incentive would students have to make a machine write for them?
I was also once confident that I would be able to tell if I encountered a submission written with ChatGPT or a similar tool. But with the technology’s mass adoption and sharpening capabilities, I can no longer be sure. That weighs on me.
Since the 2022–23 executive team set their AI policy, not one writer or editor has ever notified us that they have used it. Yet according to a recent study conducted by Middlebury professors, over 80% of students here use it to some degree for classwork, leaving me skeptical that it has not touched Campus contributions. Last week, we enthusiastically welcomed members of the class of 2029 to start writing for us. They have had access to ChatGPT and similar tools since their sophomore year of high school and know how to use them better than I do. While I do not accuse them of generative AI use based on this fact alone, it is indisputable that they are more likely to use it than previous classes.
Luckily, there are aspects already enmeshed in The Campus’ editorial process that limit the influence generative AI can have over what readers find in our final pages. We use Google Docs to write and edit, meaning that most of the time, we have access to writers’ editing history. Each of our sections has a team of four or five editors that meets in person to work on writers’ pieces in “Suggestion Mode.” These suggestions and comments are labeled with time stamps, guaranteeing that revisions are not copy-pasted from another window and revealing how quickly they were conducted. The executive team accepts or rejects editors’ suggestions and adds in our own changes in “Editing Mode.” Opinion works a little differently. We require opinion writers to accept and reject edits themselves and respond to all comments and questions, necessitating interaction and dialogue to bring their final piece to fruition.
Anyone who has written for The Campus knows that edits typically aren’t light. This is part of our process for outputting the highest quality material possible and training our contributors.
While I reiterate that our policy remains the same as it was in 2023 — we must be notified if you use generative AI for any part of your reporting process, and you may not use it to write — the thought of adding an editor’s note disclosing that AI contributed to an article pains me. After I tried to shut its existence out for so long, it would feel like surrender. Still, if transparency demands it, we will not hesitate.
I have also nervously wondered whether our writers could unknowingly be citing information from the internet, or directly from a person who has used AI to generate a typed response to email interview questions. We avoid conducting email interviews unless we need last-minute information that is vital to a story and could not be communicated over the phone, or unless an essential source refuses a phone or in-person interview. We ask that community members contacted for an interview with The Campus make time for a conversation whenever possible.
I am well aware that these measures will only go so far and that the changes AI will bring are out of our control, and I am also aware that it is far from useless for those who know how to use it properly. But students should know that The Campus is a tool, too, and we want to be useful to anyone looking to practice their research and writing abilities. Readers should know that our purpose is to be an authentic resource to learn about the Middlebury community, compiled from hours of thoughtful, human work every week.
Madeleine Kaptein '25.5 (she/her) is the Editor in Chief.
Madeleine previously served as a managing editor, local editor, staff writer and copy editor. She is a Comparative Literature major with a focus on German and English literatures and was a culture journalism intern at Seven Days for the summer of 2025.

