In the world of education, ChatGPT has sparked mixed reactions. Some teachers believe that AI renders their assignments meaningless, while students argue that those assignments were already devoid of meaning and prefer using AI to save time.
Recently, however, the narrative seems to have reversed, shifting the focus from "student cheating" to "professor productivity."
Ian Bogost, a professor of computer science and engineering at Washington University in St. Louis who is also a writer and game designer, recently published an article on the subject. After interviewing several professors, he found that ChatGPT has indeed become a productivity tool for educators, letting them write recommendation letters and develop course syllabi with remarkable efficiency.
In one surprising case, a professor started from a template recommendation letter generated by ChatGPT, turned it into a "heartfelt and personalized" letter, and ultimately helped a student secure a prestigious scholarship at the University of Cambridge.
When it comes to recommendation letters, though, some commenters online joked that the problem had been solved long before AI arrived: the professor's job is to provide the recommendation, while the student takes charge of writing the content.
ChatGPT's Reputation in Academia Takes a Dramatic Turn
Since its inception, ChatGPT has faced relentless scrutiny from major universities, casting a dark shadow over the field of education. Reports claiming the demise of university essays, the impending doom of high school English, and the prevalence of AI-generated student assignments have perpetuated a sense of gloom in academia.
Several institutions have launched investigations into ChatGPT, prompting updates to academic integrity warnings within their teaching guidelines. Some have even introduced dedicated courses to foster discussions on the matter.
As the drawbacks of ChatGPT gradually come to light, professors have shifted their concerns away from “cheating” and are now exploring how to leverage ChatGPT to automate their own tasks.
While large language models may not excel at producing accurate facts and knowledge, they prove highly adept at routine, low-stakes writing tasks, generating remarkably plausible output.
On tasks that are highly repetitive and low in stakes, ChatGPT has put students and faculty on the same side.
Take “recommendation letters” as an example. While ChatGPT may not explain why a professor would (or wouldn’t) recommend a specific candidate for a particular role, it can offer detailed templates that require minor modifications to key information.
A professor at the University of Texas, who asked not to be named, has adopted a routine of drafting a rough version with ChatGPT before delivering lectures or writing recommendation letters, a practice the professor says is quite widespread. Some may view this shortcut as shirking, but professors juggle many pressing tasks, and recommendation letters tend to rank low among them. Moreover, ChatGPT can cut the letter-writing time in half.
Matt Huculak, Head of Advanced Research Services at the University of Victoria Library and another scholar using AI writing tools, believes academia has an open secret: most professors keep template letters sorted into "excellent," "good," and "average," making slight adjustments to fit the circumstances before reusing them.
However, Huculak wonders if the advent of ChatGPT can put an end to this practice, especially for top-tier students who cannot be defined by templates. To explore this, he conducted an experiment where he asked ChatGPT to generate a recommendation letter for an excellent student, not as a template but as a case study for creating a distinct and non-formulaic letter.
Huculak expressed that the writing process was unlike anything he had experienced in a long time. It resulted in an exceptionally personalized and heartfelt recommendation letter, which ultimately helped the student secure a prestigious scholarship at the University of Cambridge.
Inspired by this success, Huculak began incorporating ChatGPT into his work, using its output as raw material for crafting non-formulaic texts. He found the process of "rearranging the materials" remarkably comfortable, eliminating the fear of facing a blank page.
Stephanie Kane, a lecturer at George Mason University, likewise praises ChatGPT for helping her get past the hardest part: getting started.
Every time Kane embarks on developing a new course syllabus, she relies on ChatGPT to provide ideas. She describes ChatGPT as a “rubber duck that talks back” (similar to the concept of explaining code or documentation to a rubber duck to stimulate inspiration and uncover contradictions).
However, Kane soon discovered that ChatGPT couldn’t provide real-world readings, such as relevant books or papers. It could only offer related topics or concepts.
Kane even prefers ChatGPT to consulting colleagues, since it adds no social pressure. "I can ask any question without worrying about sounding foolish or unprepared," she remarked.
While Huculak and Kane employed ChatGPT to break free from formulaic templates, Hank Blumenthal, a filmmaker who straddles industry and academia, sought out exactly those clichés through ChatGPT.
Blumenthal, unable to secure an academic position in his field, wondered whether his diversity, equity, and inclusion statements were too unconventional for academia's taste.
He mused, “My diversity manifesto is about all the films I’ve made, employing Black, Asian, female, and diverse crew members, directors, and actors. However, I believe schools want something else.”
Whatever that something else is, Blumenthal believed ChatGPT could help him produce it, rather than relying solely on his own perspective on diversity.
Another unnamed American university professor admitted to using ChatGPT to generate formal assessment criteria, which have now become an integral part of courses and degree applications.
The professor marveled at the result: it sounded exactly like what an uninformed evaluator would want to hear when assessing a course, and the generated material was good enough to serve as a genuine proposal.
One common lament regarding large language models is their lack of originality after being trained on heaps of existing material.
However, professors are rarely asked to present truly novel ideas. The bulk of their daily work consists of office tasks such as writing letters, handling forms, and composing reports, tasks that artificial intelligence is fully capable of handling, or at least of producing a usable first draft for.
The same goes for students, who often find themselves overwhelmed and exhausted. They struggle to comprehend specific requirements due to varying demands from different professors, feel suffocated by tuition fees, grapple with uncertainty about their future prospects, and face the challenges of transitioning into adulthood.
Students come to university primarily to gain the college experience, followed by learning and obtaining credentials.
While some university lecturers may see class assignments as cheapened commodities once chatbots intervene, students see those assignments as distractions that keep them from figuring out what they are actually supposed to accomplish.
In these regards, artificial intelligence serves as a tool to eliminate frustrating barriers, allowing all of us to focus on what truly matters.