

USF faculty members encourage colleagues to explore the possibilities of AI across disciplines

In a series of seminars and workshops, University of South Florida faculty are addressing the myriad effects artificial intelligence technologies, such as ChatGPT, will have on research and teaching in higher education. Through these conversations, faculty members have come away encouraging their peers to consider how to incorporate the new technology into their lives in and out of the classroom.

What is ChatGPT? 

Launched in November 2022 by San Francisco-based research and deployment company OpenAI, ChatGPT is a general-purpose, conversation-based AI program that interacts with users through dialogue, answers questions (for example, “Got any creative ideas for a 10-year-old’s birthday?”) and performs text-based tasks (“Write a poem in the style of Patricia Lockwood”).

A notice on the site warns users of the program’s limitations, which include the occasional generation of incorrect information, the production of harmful instructions or biased content, and limited knowledge of world events after 2021. The site reported one million users in its first five days and has since surpassed 100 million, frequently running at capacity.

In the months since ChatGPT’s launch, the tenor of the discourse has ranged from defeatism to excitement across industries, including higher education.

USF faculty members hope to set an example for other universities 

John Licato, an assistant professor in the USF Department of Computer Science and Engineering, aims to quash overreactions and underreactions alike.

“I want to pull people away from the two extremes,” said Licato, who hosted a seminar earlier this month titled “ChatGPT, Cheating, and Chaos: What All Educators Need to Know, and What's Next” at the USF Institute for Artificial Intelligence + X.

“There are some who think their courses will not be affected and there are some who think this is the end of the world, when the truth is it’s going to affect every course and there is a way forward from this, but it is going to look very different.”  

At a recent faculty meeting, when Licato broached the subject with his colleagues, he found himself, for the first time, on the other side of the automation conversation. 

“I was trying to make the point that this is not just a spinbot,” Licato said. “Its ability to write code is there. It can do most of our homework at the introductory level.”  

When he gave an example of ChatGPT-generated code, his colleagues began to wonder aloud, “Well if it can do this, then why are we teaching this?” and “Is it time to finally shift?”  
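For readers outside computer science, the snippet below is a minimal sketch of the kind of introductory-level programming exercise that tools like ChatGPT can now complete on request. It is a hypothetical illustration, not the example Licato showed his colleagues.

```python
# Hypothetical introductory assignment of the kind ChatGPT can typically solve:
# return the first n Fibonacci numbers.
def fibonacci(n):
    """Return a list of the first n Fibonacci numbers."""
    sequence = []
    a, b = 0, 1
    while len(sequence) < n:
        sequence.append(a)
        a, b = b, a + b
    return sequence

print(fibonacci(10))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```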

The irony isn’t lost on Licato, who has watched professionals in other disciplines struggle with the rise of automation in recent years.  

“We automate things, and then the people in that discipline say, ‘Okay, well, what’s the purpose of our field anymore? Let’s change it.’ And we kept saying, ‘Yeah, your field’s going to get automated, but all you have to do is learn how to code, and you’ll be fine.’ But what we're doing might be automatable as well,” he said.  

Despite lingering uncertainty, Licato remains optimistic on all fronts. He considers ChatGPT a tool, one that is pretty good at generating simple arguments and even better at synthesizing large quantities of information. Preliminary results of Licato’s own research suggest most people are unable to distinguish between human-generated and ChatGPT-generated arguments. In some cases, Licato found ChatGPT capable of generating arguments even more persuasive than those written by people, at least for uninitiated audiences.

“Chess is a good example that I like to use to keep people positive,” said Licato, who points to the pair of six-game chess matches between Deep Blue, an IBM supercomputer, and then-world chess champion Garry Kasparov as a useful analogy.

“When Deep Blue beat Garry Kasparov, a lot of people thought it was the end of chess. They claimed the great thing about chess was that we were able to be creative and now these machines are more creative than us. But chess has undergone an amazing renaissance in recent years. The current grand masters are people who grew up with access to these amazingly powerful AI tools, and it changed the way they play, but it didn’t kill the game. In fact, it made it more exciting to see that, while these two grand masters are playing, we can have the super chess engine on the side telling the audience how they’re doing relative to each other. I think it accentuated the game. It made the human element more interesting.”  

“It’s a tool, and it has strengths and weaknesses,” Licato said. “We just have to figure out how to use those strengths that complement our goals.”  

Timothy Henkel, assistant vice provost for teaching and learning, and David Tai, associate director for digital learning ecosystems, have echoed Licato’s sentiments. In a workshop hosted by the Center for Innovative Teaching and Learning titled “ChatGPT: Reimagining Learning and Teaching,” Henkel and Tai encouraged faculty across USF to embrace the new technology.

“Look at how you could think about using ChatGPT in your teaching and learning,” Henkel told the audience.   

Henkel took the audience on a tour of ChatGPT to demonstrate the program’s wide-ranging abilities and applications. He began by asking the program to produce the 2001 revised edition of Bloom’s Taxonomy, a hierarchical framework educators use to organize learning objectives by complexity. He then asked it to determine how the learning objectives he had developed for an upper-level undergraduate marine biology course corresponded with Bloom’s Taxonomy. In seconds, the program matched each of Henkel’s learning objectives with its respective taxonomic level.

Henkel then asked the program to help him create learning objectives for a new module on coral reef ecology. Again, in a matter of seconds, ChatGPT produced a series of learning objectives (for example, “Design and propose a conservation plan for a specific coral reef ecosystem, considering the ecological, economic, and social impacts and trade-offs”) that could be applied to Henkel’s hypothetical new module, all of them aligned with Bloom’s Taxonomy.
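Henkel worked in the ChatGPT web interface, but the same kind of exchange can be scripted against OpenAI’s chat API. The sketch below is a hypothetical illustration of that workflow; the model name, the placeholder learning objectives and the exact prompt wording are assumptions, not what Henkel used.

```python
# Hypothetical sketch: asking a chat model to classify course learning
# objectives by level of the 2001 revised Bloom's Taxonomy.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Placeholder objectives for illustration only.
objectives = [
    "Explain how ocean acidification affects coral calcification.",
    "Compare sampling methods used to survey reef fish populations.",
]

prompt = (
    "Using the 2001 revised Bloom's Taxonomy, identify the taxonomic level "
    "of each of the following learning objectives:\n"
    + "\n".join(f"- {objective}" for objective in objectives)
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumption: any chat-capable model will do
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```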

“As the user, it’s my job to interpret that text and to decide what has meaning and value for me,” Henkel said. “This can help you brainstorm, and you can keep asking more and more questions.”  

Henkel continued to prompt ChatGPT to “create an engaging assignment” and a corresponding rubric.  

“With ChatGPT, it really is all about the prompts,” Henkel said. “The questions you put in and what you are asking of it establish what it’s going to be giving back to you.”
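Because ChatGPT carries the thread of a conversation, follow-up requests such as Henkel’s assignment-and-rubric prompt build on what came before. The sketch below shows that same iterative pattern through the API, where the earlier exchange is passed back explicitly; as above, the model name and prompt wording are assumptions made for illustration.

```python
# Hypothetical sketch: iterative prompting, where each follow-up request is
# answered in the context of the earlier exchange.
from openai import OpenAI

client = OpenAI()

messages = [
    {"role": "user", "content": (
        "Write three learning objectives for a new module on coral reef "
        "ecology, aligned with the 2001 revised Bloom's Taxonomy."
    )},
]
first = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)

# Feed the model's own answer back in so the next prompt refines it.
messages.append({"role": "assistant", "content": first.choices[0].message.content})
messages.append({"role": "user", "content": (
    "Now create an engaging assignment for those objectives, along with a rubric."
)})
second = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)

print(second.choices[0].message.content)
```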

When a member of the audience asked how to reimagine the essay assignments in her asynchronous course, Henkel encouraged faculty to connect with their students over questions of academic integrity, the value of learning and the student experience.  

“As we’re reimagining teaching and learning in a world that has these tools, we have to ask ourselves and our students, ‘What do we value about higher education?’” Henkel said. “What conversations are we having with our students to really ensure that they know what we value about the student experience and about learning in general?  

“We need to think about how are we sharing with our students what we value about learning, about our disciplines, about the process of writing — after all, writing is a way of practicing thinking. And so, if you want to be a better thinker and a better doer, you’re going to need to be a better writer.  

“We want to ask our students what they value about the process. What do they value about learning and what are they looking for? And are we creating opportunities that are relevant and meaningful for them to develop those skills?  

“Helping them to think about whose voice do they want going out into the world. Do they want to just be echoing the algorithm? Or do they want their voice to be known and to be heard? Whose voice matters?”  

Henkel also underscored the utility of USF’s academic integrity policy, which emphasizes honesty, respect and fairness.  

“I think it’s very important that we set clear expectations about what is and what is not acceptable,” he said. “If the view is that the augmented text violates those things, talk to your students about why you believe that and why you want them to be completing tasks on their own. If there’s opportunities where you want them to be using these tools and build those skill sets, talk about that as well.”  

Henkel brought the workshop to a close by encouraging faculty to engage with ChatGPT and other AI tools on their own.   

“We’re going to have to walk in this grey space, not only together as colleagues and faculty but also with our students,” Henkel said. “We’re just beginning to have these conversations.”  

Several units across the university, including the Muma College of Business, are beginning to host additional seminars that address ChatGPT. USF Innovative Education is also preparing to launch a free program that covers the most critical aspects of AI, including the use of ChatGPT and other AI tools. You can learn more about how to register here.
