AI has taken the internet by storm since its rise in popularity nearly a year ago, from generating images to writing essays in seconds. Today, Seton Hall appears to embrace AI to a certain degree, acknowledging some of its benefits while remaining aware of its drawbacks.
The Department of Information Technology published a site called the “Artificial Intelligence Resource Center” to guide faculty members in addressing the use of AI. The site offers strategies for integrating AI into curricula, reducing AI-generated content in student work and requesting consultations with the Teaching, Learning, and Technology Center Instructional Designer.
This new site will continue to change as more information is gathered, according to Paul Fisher, the director of the Teaching, Learning, and Technology Center and associate chief information officer.
“We need to ensure the University community is aware of both the rewards this kind of technology can provide and the risks that it presents,” Fisher said. “For example, additions of AI in our security suite of software help us to weed out false positives and become aware of potential threats faster than before.”
Fisher said that people can use AI “with both positive and negative goals in mind.”
“Taking a shortcut with AI to write or analyze or some other skill won’t make you very good at what you are trying to learn to do,” he said. “We need to make our students aware that in the long run, using AI will not get them to where they want to be.”
He added that the university licenses TurnItIn, a system that detects generative AI and other forms of plagiarism in written work, and that the department is “actively looking at other tools to deploy that will help both faculty and students.”
Some professors have expressed concern about students relying on AI programs to do their schoolwork.
Dr. Russell Sbriglia, associate professor and director of Undergraduate Literature Studies in the English Department, said that although some professors have proposed doing more in-class writing assignments to reduce the use of generative AI among students, students can still just as “easily use ChatGPT in class as out of class.”
“My approach has thus far been to stress to students the limitations of ChatGPT,” Sbriglia said. “This includes modeling for students the kinds of errors to which generative AI software is prone. It also includes stressing just how integral writing is to the process of critical thinking.”
He also said that students using AI software for their writing assignments “outsource their ability to think critically.”
“It is through the very act of writing that we come to organize, construct, and even discover our thoughts on a given topic or text,” he said. “To skip that process altogether by using AI generative software is to curtail one’s ability to think critically.”
He added that professors will need to “become more nimble and creative” when creating assignments that “develop or test for the kinds of knowledge and skills that such programs simply cannot ‘fake’” as generative AI continues to advance and become more sophisticated.
“At the end of the day, artificial intelligence simply does not ‘think’ like an actual human brain,” Sbriglia said.
The English Department recently updated its Plagiarism Policy to address the use of generative AI, such as ChatGPT, among students. The policy states that any use of AI tools for essays, journals, in-class writing assignments and other coursework is “in violation of the English Department Academic Integrity Policy,” “with the exception of exercises developed by your professor.”
While some professors are worried about students relying on AI for their assignments, others have embraced the technology and adjusted their curricula so that students can use it as a tool for learning rather than for cheating.
Dr. Nada Khan, a professor in the Department of Chemistry and Biochemistry, said she is “happy that AI is playing an important role in exploring new technologies and ideas.”
“Rather than worrying about students relying on AI, I work on tweaking my exams and assignments in a way that instead of cheating, copying, relying on AI, students can use it as a tool to expand their knowledge,” Khan said.
She added that she uses “different and new exam models and other ways” to test her students’ knowledge, and that it is easy for her to tell whether an exam was completed with the help of AI.
“Since I know the learning nature of my students during in-class work, it is easy for me to determine if the exam was completed with the help of AI,” Khan said. “This is another reason why I make them work on in-class assignments.”
The Office of the Provost, the Center for Faculty Development, and the Teaching, Learning and Technology Center at Seton Hall will hold a panel discussion called “Impact Talks Presents: Generative AI and Higher Education” on Nov. 2 in Bethany Hall from 1–2:30 p.m.
The panel will host keynote speaker Dr. Jason Gulya, a professor at Berkeley College and chair of the Artificial Intelligence Council, along with Seton Hall faculty and student representatives sharing their thoughts on the “rapid advancement of generative Artificial Intelligence (AI) and its impact on higher education.”