By: Steven Moe
There is no denying Artificial Intelligence’s impact on society in the past few years. From self-driving cars to smart home devices, from social media to chatbots, AI has captured attention from all corners of society. ChatGPT alone gained 1 million users in its first five days of public availability.
Programs dedicated to generating new content are an emerging technology. Many of the most prominent platforms were released in 2021 and 2022 and were quickly embraced by students and other online users.
Here at Glenelg, and across the county, the topic and use of AI technology has gained so much recent traction that county officials added specific language to this year’s Code of Conduct to address concerns about plagiarism. The new language states that “Plagiarism, using the work or ideas of others, may also include the use of Artificial Intelligence writing programs without proper acknowledgment.”
The addition specifically mentions writing and therefore focuses mostly on the English department, but plagiarism isn’t limited to text. Midjourney, DALL-E (ChatGPT’s artistic cousin), and other image generators saw user growth in the millions, according to colorlib and MarketSplash. AI doesn’t have to be generative, either: Adobe Photoshop can now generate new art, but it can also manipulate authentic pieces.
Greg English, the instructional team leader for the Art department, said it’s a topic of concern in the department, especially in photography classes where students are learning concepts related to photo manipulation.
“It’s pretty incredible, what it can do,” English said. “But it’s also very scary to me – it can have a lasting impact on students.”
One of the questions that teachers and students have to navigate is what exactly is, or should be, considered AI. Grammarly, for example, has been used in secondary schools and colleges for years, but, technically, its suggestions are not the student’s own work. If a piece starts as a human creation but is polished by AI convincingly enough, the line between human and machine blurs, putting more pressure on teachers to judge a work’s authenticity.
Glenelg and Howard County are still adjusting to this latest form of technology. But cell phones were new once, too. Should schools integrate AI technology into schoolwork or keep it strictly policed?
Sarah Jansson, instructional team leader for the English department, believes that strictly policing how students use AI now will lead them to make good decisions in the future. Students still plagiarize and cheat themselves out of an education in the present, Jansson argued. Whatever AI will lead to, her students are her focus right now.
The issue at hand is whether a piece of writing is produced by AI or a student.
“I have seen both instances,” Jansson said. “There are some instances where it has been somewhat difficult to tell, but the majority of the time it is clear that a student used AI.”
Glenelg staff deal with the reality of what AI brings to academics, and that reality is that plagiarism makes learning difficult, if not impossible. Students who decide to use AI to cheat prioritize short-term gains over the risk of jeopardizing their education.
“Students are not actually learning or demonstrating knowledge of course content if all they are doing is turning in AI work,” Jansson said.
No matter the arguments surrounding AI, the county policy remains. Using AI without permission or transparency could land students in trouble with their teachers and administrators, and could even affect college admissions.