Since becoming widely accessible in 2022, artificial intelligence has found its way into nearly every part of our lives – especially in schools. In recent weeks, it has also become a very prominent and controversial topic at WCSU.
Both students and professors are highly aware of and engaged with the world of AI, whether as consumers or critics. At first glance, you might think that students would be reveling in a tool that can seemingly generate assignments in seconds; you might think that professors and teachers would unanimously despise the very concept. A closer look at AI usage within our university reveals a world of nuance in the face of a technology that changes how we approach everything.
What are the Concerns for AI at WCSU?
Students, professors, and anybody else who interacts with artificial intelligence – or whose life is impacted by it – share a long list of common concerns.
The most immediate concern is the practice of students relying on artificial intelligence to edit or fully complete their work. Students worry that they will not truly learn what they need to operate in their desired careers after college. In a similar vein, students are wary of artificial intelligence being fused into their lessons and class structure.
“A lot of students think that professors want them to sound like professional academics. We don’t. We want them to sound like themselves.”
– Dr. Brian Clements, head of the Kathwari Honors Program
There’s also the issue of reliability and credibility. When training their models, most (if not all) large-scale AI companies cast as wide a net as they can, with little discernment about the information they take in. This means two things: the models absorb false information from forums and posts on social websites, and legitimate research and creative work stored online is taken without the knowledge or consent of its creators and authors. Because of these questionable practices, many professors treat AI usage in schoolwork – uncited or otherwise – as plagiarism.
When a model tries to fill gaps in its knowledge, it generates its own fake facts and sources, commonly known as “hallucinations.” While there are usually guardrails to prevent obscene or hateful messaging, the information provided by commercial generative AI is nowhere near consistently accurate or trustworthy.
Aside from these concerns about plagiarism and reliability, there are deeper worries about the environmental and humanitarian impacts of the industry. The rapid growth of the infrastructure needed to sustain artificial intelligence (such as data centers) has taken a toll on surrounding residents’ livelihoods, cost of living, and even health.
Students and professors are aware of all the factors listed above, which has made the use of artificial intelligence extremely controversial at WCSU.
To get a better understanding of the ethical concerns surrounding AI, The Echo spoke with Dr. D.L. Stephenson, a humanistic studies and interdisciplinary studies professor who holds a doctorate in communication and a master’s in rhetoric. Dr. Michelle Brown, the dean of the Maricostas School of Arts and Sciences, asked Dr. Stephenson to design a course in artificial intelligence ethics in the summer of 2024.
When speaking on the rationale for the course, Stephenson pointed to the “ubiquitous” nature of the technology. “AI is being used by companies and businesses to, for example, make hiring decisions, perform customer service tasks, assist universities and colleges with the admission process, help banks and financial institutions make loan approvals, to steer online search engines…”
Professor Stephenson brings to light underlying questions we should be asking about artificial intelligence. While its benefits are worth considering, it’s also important to take the concerns about AI very seriously.
Should we allow ourselves to use AI as a central tool for research and study? How would that bleed into the arts? How can professors teach skills to students who can enter a prompt and get results in seconds? She urges students, professors, and AI developers themselves to act as moral agents when considering this technology.
“Just because you can doesn’t mean you should,” said Stephenson. “… AI technology can do a vast amount of great things, but also, in equal measure, an incredible amount of harm.”
Potential Positives of AI at WCSU
Even though there’s significant pushback against the use of generative AI in educational and professional institutions, the case for keeping it around is not just about removing the burden of labor.
Both advocates and critics know that AI is not only here to stay; it’s still being developed and refined. While it has a long way to go in terms of research, it may find proper applications in technical fields down the line.
Professors acknowledge that AI could be used as a tool in a number of ways, with the understanding that it’s in a very experimental phase right now.
Dr. Gadkar-Wilcox, chair of WCSU’s history department, spoke with us about AI in terms of research. He felt that “…It is perfectly fine for them to use AI for the sort of things that they might use Wikipedia for.” He said it could be useful for checking basic facts that are corroborated in so many places that it would be extremely difficult for an AI to hallucinate an answer. However, anything beyond basic facts is very likely to be generated with false or completely made-up information. “In the future I see something for AI there, but right now I can’t recommend that to my students,” Gadkar-Wilcox said.
A student in the Management Information Systems major endorsed using AI for basic functions and to “speed up monotonous tasks,” but cautioned that “you have to know and be able to explain what the AI is producing.”
Overall, most professors who let their students engage with AI do so with the understanding that the use should stay surface level and be heavily monitored. This practice, funnily enough, lets professors speak very directly with their classes and prompts open communication about the quality of work that students and professors expect from each other.
The Crossroads at WCSU
The expectations and standards for AI usage and quality of output vary greatly depending on what you’re studying.
Dr. Brian Clements, head of the Kathwari Honors Program at WCSU, is one of the professors in the Professional Writing department. Like all the other professors in the writing department, he classifies AI usage as plagiarism in the syllabi for his classes. He has spoken about his belief that artificial intelligence is very much a hindrance to students’ writing and does them no favors when engaging with course material.
He has also, in one project, allowed his honors students to use artificial intelligence tools for graphic design.
In this honors class, students were assigned to participate in an international design competition called “120 Hours,” in which entrants respond to a prompt and translate their response into a two-page graphic.
In Dr. Clements’ own words: “Basically every major is represented among the 75 or so students who are in the 2 sections of this class. I know that the vast majority of those students have no graphic design expertise. So, I’m allowing them to use AI tools to help them with the design portion. There’s a written component I’m not allowing them to use AI for.”
The logic here is understandable. It would be unrealistic to expect students with no prior experience in graphic design to produce a competition-ready design from scratch. Instead, the focus is on what Dr. Clements can teach: the writing.
This is far from the only example of certain aspects of work being outsourced to AI while the core work of a field remains something students must accomplish themselves.
Professors in the Marketing department, while engaging with AI often, likewise outsource a lot of graphic design work to artificial intelligence for the same reason: graphic design is not in the students’ skill set, but it is still needed for products and projects in the department.
According to student Johssa Daniels, the marketing department actually engages with – and scrutinizes – AI output often. She said that her department, like the others, uses AI for “outside what you’d usually be doing.”
Johssa explained that it is also not uncommon for professors in the Marketing department to give questions on an assignment, then tell students to put those questions into different AI chatbots to analyze how different the answers might be.
Consistent exposure to AI output might not be a bad thing for students, either.
“As someone who works with it pretty consistently, I can see when things are not human-created,” Johssa said.
It’s not just the marketing department, or the writing department, or any specific corner of WCSU. Every professor knows that their students will have to engage with AI at some point – if not as a user, then certainly as a consumer. It pays to have students familiar enough with AI output to judge whether it’s worthwhile or shoddy work.
There’s no simple response to the daunting force that is modern generative artificial intelligence. With everyone recognizing it as world-changing, we all scramble to find our place amid the changing tides.
More stories regarding AI usage and policy at WCSU will be coming soon. Check back in with The Echo to read more.