
Teaching students to navigate a world with AI
More than two years after ChatGPT burst onto the scene, most university instructors are still grappling with how generative artificial intelligence (AI) models are transforming the paradigm for teaching and learning.
According to a 2025 Digital Education Council survey of faculty in 28 countries, 39 percent haven’t incorporated AI into their teaching. Among those who have, only half have directly taught students how to use and evaluate AI.
It’s a conundrum. How do you teach students to use a new technology they’ll surely encounter in their careers but that could, if misused, short-circuit the learning process in their college coursework?
“Generative AI has the potential to enhance teaching and learning, but only if it is used intentionally,” said Assistant Vice Provost Quinn Warnick, who co-directed a Teaching in the Age of Generative AI faculty cohort offered by Technology-enhanced Learning and Online Strategies as part of the office’s broader efforts to help instructors understand and adopt new AI tools. “In a rapidly evolving AI landscape, some of our most innovative teachers are rethinking assignments and classroom activities, always with a focus on improving student learning.”
At Virginia Tech, early adopter faculty are incorporating AI into their courses and teaching students how to use AI models ethically. Some have even developed entire courses around AI literacy. Here’s what they had to say about the ups and downs of teaching students to use AI — and why it matters.
Jen Mooney
- Senior instructor and director of technical and scientific communication, Department of English
- Course: Technical Writing: Writing with AI
Why are your students interested in a class on writing with AI?
On the first day of class, we did a survey and 47 percent said they wanted to learn about AI to prepare for their future careers. But most of them are worried that they’ll be accused of cheating for using it in a class. So they’re very cautious. Even in this course, where I say, “You can use AI in all of these ways,” they have asked me, “But can we do that?”
When is it OK to use AI?
In the syllabus, I tell them how they can use AI and how they can’t. One assignment says, “Create an outline (for a paper) in a Word doc, then use a prompt to ask AI to generate an outline. Compare the two, choose the best parts of each one to create a new outline, and analyze why you chose those parts and rejected other parts.” So it's a show-your-work process. But I think it's crucial that our students, especially those in engineering and computer science — but actually, everyone — learn how to use AI efficiently and ethically.
Yoon Jung Choi
- Assistant professor, Department of Industrial Design
- Course: AI in Design
What AI-informed assignments did your students work on in your class?
One week was about experimenting with prompts in ChatGPT and Midjourney. Another session was about creating a written story that led to a series of comic strips. Because they’re industrial design students, they need to understand prototyping, so for one assignment they sketched an idea for a stool, exported it to Vizcom.ai, and used the tool to render the design in 3D.
What challenges have you experienced when students work with AI?
One thing I noticed using generative AI is that students didn’t really want to go through the iterative process. They wanted results really quickly: “The first one? I love it.” We had to push them to go through many iterations to find the right images. We also talked about ethics when you import someone else's design into an AI tool — like an image of a Dyson vacuum cleaner so you can create a pencil case that looks like a Dyson. Students should be really aware of copyright issues. They don’t own that product, so they shouldn’t use those images in the process of generating a new product.
David Hicks
- Professor, School of Education
- Course: Secondary School Teaching Methods
What got you interested in teaching about AI?
Being in education and teaching the methods class for future teachers, the whole concept for me is: If I do not have my students investigate, play with, and try to break AI — and let it break them — who else will be doing it? I need to understand what it does, and I need to have my students explore it and play with it because their students may be using it and they're going to get blindsided. I want them to see how AI can support student learning. I also want them to recognize that it can produce really rubbish stuff, so they’ve got to be critical of what it's producing.
What’s your advice for a faculty member who is unfamiliar with AI?
Take a topic that you're passionate about, whether it’s triathlons or Bertolt Brecht, and have a discussion with generative AI. Give your background, give your explanation of what you're interested in, and then have a conversation and see where it takes you. I'm a history educator, so I’ve created custom GPTs that scaffold inquiry, provide feedback on historical sources, and model disciplinary thinking — leveraging AI’s ability to see, hear, and converse about sources to support both students and preservice teachers in developing disciplinary and AI literacy.
Shilpa Rao
- Assistant professor of practice, Department of Marketing
- Course: Introduction to AI in Marketing
What do you hope your students learn about generative AI in your class?
Introduction to AI in Marketing is a Pathways general education course, so it’s a mix of majors. About 50 percent of the students at least knew about ChatGPT, and the rest said that they weren’t very acquainted with it. But in just a couple of years, we are going to be surrounded by a whole lot more of it. There is no going back from that. You can’t just opt out. AI will change the landscape of industry, and we are hoping to bridge the gap to make students more ready for industry jobs. The course will help students understand the pros and cons of generative AI. It is essential to understand that AI can enhance the learning process but not replace it.
How has AI helped you be a better teacher?
I uploaded my syllabus to AI with the prompt, “Read the attached syllabus and let me know what I can do to improve it.” It came back with a number of suggestions, some related to grading, some related to the verbiage to make it more accessible for everyone. It even suggested that I provide additional detail on a specific grading penalty. I was impressed. I have also been using AI to generate images for class content. This is especially helpful when there are no readily available images for a complex or unique scenario.
Junghwan Kim
- Assistant professor, Department of Geography
- Course: Generative AI Applications in Social Science
How do you discuss the ethics of AI in your class?
One week we had a class debate focusing on ethics and responsible use of AI. The topic was, “Should students use generative AI for academic writing?” I assigned sides, and it was interesting to see the different opinions from the students. For example, if I say, “A student plugged a prompt into AI, copied the response, and submitted it to a teacher,” most students say, “This is cheating.” But something like, “A student created multiple AI responses based on the student’s detailed prompt, used the best parts, edited, and submitted” — that’s a gray area.
Should there be a universitywide policy about AI?
My understanding is that a one-size-fits-all policy doesn’t work because there are so many use cases that cannot be captured in just one policy document. Also, a recent survey of Virginia Tech students showed significant differences in AI perception between STEM and non-STEM majors, suggesting that more nuanced policy approaches are needed. Of course, policies should be grounded in principles like transparency and confidentiality, but at the same time I think that each class should have some kind of specific guidelines about use cases of generative AI. In an introductory computer science class, it may not be good to use AI to create code, because students who do don’t actually develop their computational thinking. But for an advanced class that’s more project based, it’s probably OK.
Andrew Katz
- Assistant professor of engineering education
- Course: Quantitative Data Analysis in Engineering Education
How do you think faculty members are feeling about AI in the classroom now?
When AI models were first coming out, it was easy to be a little dismissive, like, “The models can't do these things that I'm asking students to do.” Over the last year, more people have probably realized, “These models can do this,” so they are starting to be a little more open to thinking about how to creatively incorporate generative AI into their classrooms. If faculty members haven't tried to use these models in a while, I think that's a big gap in their way of thinking about what their students are doing.
What if an instructor says, “We are never using AI in my classroom”?
In engineering, we’re thinking a lot about preparing students for professional careers. How much do you want to tell students they can’t use this technology that they’re going to end up using in the workplace? I tell my students that they can use AI models as long as they’re transparent about how they used them. There’s a difference between using a model to help clean up their grammar versus having it completely draft a paragraph or an entire paper for them.
Technology-enhanced Learning and Online Strategies offers resources to support faculty as they explore the opportunities and challenges that generative AI poses to teaching and learning. Enroll in a workshop, review the list of recommendations, join the conversation, or request a consultation.