When it comes to AI’s impact on the classroom, what is the role of the Walker Center for Teaching and Learning?
The Walker Center has been really lucky to be able to have proactive conversations around this. The programming we offer isn’t required of faculty, and the people who choose to come to us with questions based on the availability of AI are saying, ‘How do I rewrite my assignment? I just want to know what I can do.’
Depending on the pocket of campus you’re talking about, the concern is going to be different; but in this moment, our biggest concern is how we continue to fairly assess student learning when, in a lot of disciplines, we’re used to assessing learning through student writing, and there are really widely available tools that can do that writing for you.
The concern may be different, I think, for tools positioned to make your writing sound more skillful, but the tools that we tend to think about regarding AI are the ones that can just produce the work for you.
What are some of those tools and their particular issues, in your view?
Until November of ’22, when OpenAI released ChatGPT and we all became aware of generative AI, students may have been encouraged in some cases to use tools like Grammarly and QuillBot to polish their writing. Grammarly didn’t feel that different from ordinary spell check, so even if they weren’t openly encouraged, I think it’s reasonable for the students to have felt that they were simply using tools that strengthened their work. But both now have AI content-generating capabilities, making the scope of their impact on student writing much more vast.
And while there was a tendency to think ChatGPT, as a free tool, might equalize disparities among students, it’s hard to think of this thing as an equalizer given recent developments. What we’re seeing with ChatGPT—the free version—is it’s just not as advanced as the paid versions, like GPT-4. GPT-4 is a lot more powerful and has capabilities that are a lot more advanced.
So there’s a significant concern about intentional or inadvertent academic dishonesty?
Despite the fact that cheating has always existed, what we’re dealing with here feels different. There are very few barriers to accessing generative AI tools: you can put your essay prompt into one, and it writes the essay, and there you go. There are differences in quality based on what tool you can afford, and some barriers remain to making this undetectable, but I think it’s understandable to feel like this is a crisis.
What we know about AI detectors is that they’re much more likely to identify text generated by ChatGPT than text generated by GPT-4. So, when we’re looking at detectors as a way of maintaining proper academic conduct, what we’re likely seeing is that students without the resources to subscribe to more advanced tools, like GPT-4 and Undetectable, may be the students more likely to get caught.
The Office of Student Conduct is dealing with lots of very complicated situations, such as inadvertent misconduct—when a student might actually write a paper and then put it through QuillBot to make it sound ‘smarter.’ Then a professor puts it through an AI detector and it trips as something generated by AI, and you’ve got a student in a position to genuinely say, ‘I did write that, and I used every tool at my disposal to make my work as good as it could be, and now I’m in trouble.’ Depending on the learning outcomes for the course, the professor’s expectations, and the goals of the assignment, a student in that situation may – or may not – have crossed an ethical boundary.
It’s complicated because I think we’ve been driven to think about this in really black and white terms: ‘Did you use AI to write this paper for you?’ I don’t think it’s always as simple as that.
How do you see the Walker Center engaging on these concerns?
I think there are unethical ways that faculty could use AI, just as there are unethical ways students could use it. I think a really valuable step in our programming will be to help faculty with workshops to look at how AI tools can make their lives a little easier alongside looking at where there might be a line you wouldn’t feel comfortable crossing. And maybe part of our job is also to help students understand how to find that line.
My Ph.D. is in English, and I spent many years teaching literature and composition at the college level. When grading papers, it can be tedious to recreate the same comments over and over and over. In general, freshman writers have similar areas for growth and make similar mistakes. If I were to ask ChatGPT to give me 10 examples of margin notes I could use to address argument-driven topic sentences, it may be able to do so, and some of them may be useful, but it still takes my expertise and judgment to determine whether or how to use what is suggested. The suggested examples take only a few seconds to create, but I still have to know how I want to talk to the student. But maybe it’s nice that I don’t have to come up with all of the raw material.
Your thoughts on the reality of AI being here to stay in higher education?
In my mind, saying, ‘Just don’t touch it. It’s complicated,’ isn’t realistic. Because once our graduates are applying for jobs, those job applications are going to be screened by AI-driven technology, looking for keywords and for certain structures and kinds of composition. If our graduates don’t know how those tools work or how to use them to make themselves most marketable, they’re not going to be as successful in the job market competing against people who do.
In 2024, if we try to police out the usage of AI because we’re fearful it means students aren’t learning, we might end up with students who graduate without knowing the nuances of how to use this tool. If they leave college or go on to grad school without knowing how to use this tool that is going to be ubiquitous in our world, we haven’t prepared them for their careers or the job market or their next stage in their education.
The challenge for a teaching and learning center is similar. There are times when I wish we could put this genie back in its bottle, and there are times I genuinely mourn for the way I used to be able to teach. Despite those feelings, which I think are valid, faculty have to move forward. That’s where the Walker Center aims to serve faculty at UTC. We’re consistently learning and researching and experimenting and talking to other professionals in the field so that we can bring the best disciplinarily appropriate suggestions, ideas and tools to our campus. The technology may be changing constantly, but we’re being diligent, and we’re always open to conversations on this topic – regardless of where an instructor may be on their journey to teaching alongside AI.
–Dr. Victoria Bryan is director of the Walker Center for Teaching and Learning, a resource for best practices in teaching strategies and learning technologies.