Washburn professors give their take on AI and technology in the classroom

Graphic by Morgan Albrecht

Professors battle the use of AI in their classrooms. The popularity of AI programs is causing many educators to rethink their teaching methods.

In the 21st century, technology is incorporated into many aspects of everyday life. Technology is everywhere, whether it be a cellphone used to call or text a loved one or a smart TV used to stream popular shows on Netflix. This raises the common question: how does technology fit into higher education?

For Carson Kay, Ph.D., assistant professor of communication studies, technology is a tool that offers many benefits in a college classroom.

“I personally believe that technology is one of the greatest tools that we have in the classroom,” Kay said. “If we’re thinking about college not only being something that helps us develop as human beings but also something that practically prepares us for the workplace, I think it’s really important that we learn how to use technology in impactful ways in a learning environment so that we have practice when we are out in the world doing those jobs.”

One of those jobs that will inevitably involve technology is teaching. Just as professors have to learn to navigate teaching with forms of technology, students in the education department who hope to work in a classroom setting will also have to learn effective ways to use technology in the material they teach.

Gary Graves, educational technology lecturer of education studies, plays a key role in setting these students up for success in their future classrooms.

“The technology I try to introduce into my classes are things that my students, who will be teachers someday, will be able to use in their classroom,” Graves said. “So showing students different ways they can utilize [technology] in their profession is a big goal of mine.”

At Washburn, classroom technology policies vary by professor. Some allow students to use phones, tablets or laptops to take notes, record lectures and access course materials. However, that freedom can hinder learning: devices distract not only the students using them but also those around them.

“I certainly find myself occasionally glancing at my phone to see if anyone has sent me a message, if I have any alerts or what’s going on, and that’s kind of become a normal part of the day-to-day experience for so many people, so it can be really challenging to set that aside and be present and focus on the content at hand,” Kay said.

Divided focus and distractions aren’t the only challenges that technology brings to the classroom. Since students can access information on any subject in a matter of seconds with the help of search engines like Google, plagiarism and academic dishonesty become a concern for professors.

Teresa Clounch, Ph.D., associate vice president for student life and dean of students, handles cases of academic dishonesty. She serves as a mediator between students and faculty when incidents in this area arise.

“Just as there’s opportunities to work with [technology], there are some ways to see if a student is using it in an incorrect manner,” Clounch said.

Some of the most common issues Clounch sees involve plagiarism in student papers and essays. This includes students passing off other students' work as their own or using information found online without properly crediting the author.

Recently, Clounch has worked on cases involving artificial intelligence software, such as ChatGPT. Students enter prompts for papers or open-ended test questions and submit the software's response as their own.

The popularity of AI programs has grown over the last few years, and students use them in a multitude of ways.

“I’ve seen ChatGPT be used in constructive ways but also in ways that are what I would consider moments of academic impropriety or questionable ethics,” Kay said.

The biggest challenge professors face with AI programs is distinguishing between students' real work and fabricated work.

“I’ve seen some discussion posts that have a little bit of what I call a ‘ChatGPT voice’ to them,” Kay said. “They’re very clean. They’re very grammatically correct, but they just don’t sound quite human. So it is certainly challenging to make an assessment on how to address it because there isn’t really a detection software system that can accurately and consistently be like, ‘Yes, this is plagiarized,’ or, ‘Yes, this is ChatGPT.’”

With AI programs being so new to the world of academics, professors are finding ways to set boundaries on how much students are allowed to use them.

“I think ChatGPT has strong potential to be really beneficial in the educational setting,” Kay said. “I encourage my students to think about it as a platform that can certainly help with idea organization. Organization is something that takes so long for people to practice and get comfortable with. I say you’re welcome to use ChatGPT in my class to help you structure and organize big ideas if you’re struggling to come up with what main topics to think about.”

Professors have accepted the fact that AI is here to stay, so their main goal now is to educate themselves on ways AI can be used, either positively or negatively, and how to identify when AI is being used in dishonest ways.

“There’s definitely going to be a need to provide educational opportunities, especially for instructors to know what we’re looking at,” Kay said. “Having avenues to learn from experts about the technology will absolutely be important.
“The work is in progress, but it is going to be a process that I think is going to take quite a long time before we get to a place where we feel 100% comfortable with how we navigate AI in [educational] settings,” Kay said.

Universities are dealing with the emerging use of AI in different ways. Washburn currently sets limits on classroom AI use instructor by instructor. As the programs evolve, the need for clearer guidelines will grow, but educators say their biggest need right now is learning to identify and prevent the unethical use of AI software in academic settings.

Edited by Stuti Khadka and Jeremy Ford
