311. Upskilling in AI

With so many demands on faculty time, it can be difficult to prioritize professional development in the area of AI. In this episode, Marc Watkins joins us to discuss a program that incentivizes faculty development in the AI space. Marc is an Academic Innovation Fellow at the University of Mississippi, where he helped found and currently directs the AI Institute for Teachers.

Show Notes


John: With so many demands on faculty time, it can be difficult to prioritize professional development in the area of AI. In this episode, we examine a program that incentivizes faculty development in the AI space.


John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.


John: Our guest today is Marc Watkins. Marc is an Academic Innovation Fellow at the University of Mississippi, where he helped found and currently directs the AI Institute for Teachers. Welcome back, Marc.

Marc: Thank you, John. Thank you, Rebecca. It’s great to be back.

Rebecca: We’re glad to have you. Today’s teas are:… Marc, are you drinking tea?

Marc: I am. I have a Cold Brew Hibiscus, which is really great. It’s still very warm down here in Mississippi, so it’s nice to have something that’s a little bit cool and refreshing.

Rebecca: That sounds yummy. How about you, John?

John: I am drinking a peppermint spearmint tarragon blend today. And it’s not so warm here. In fact, my furnace came on for the first time yesterday.

Rebecca: Yeah, transitions. And, I have English tea time today.

Marc: Well, that’s great.

John: So we have invited you here to discuss your ongoing work related to ChatGPT and other AI tools. Could you first describe what the AI Institute for Teachers is and its origins?

Marc: Sure. I was last a guest here on your show in January of this year, and it seems like 1000 years ago [LAUGHTER], but during that spring semester, I took a much deeper dive into a lot of the generative AI tools than the original pilot in the fall. And we started noticing that the pace at which big tech was deploying these tools and integrating them with existing software from Microsoft and Google was only accelerating. So in about April or May, I went to my chair, Stephen Monroe, and said, “I think we need to start training some people to get them prepared for the fall,” because we kind of thought that fall was going to be what it is right now, which is a chaotic mash up of everything you can imagine: some people dove in deeply, some people tried to ban it, some people are trying to take critical approaches with it too. So we worked with the Institute of Data Science here at the University of Mississippi, and we got some money, and we were able to pay 23 faculty members $1,000 apiece to train them for a day and a half on everything we knew about generative AI: AI literacy, ethics, what tools were working in the classroom and which weren’t. And their whole goal was to go back to their home departments over the summer and serve as ambassadors and help prepare them for the fall semester. We’ve had funding for one Institute, and now we’re doing workshops, and searching, as we all do, for more funding.

Rebecca: How did faculty respond to (A) the incentive, but (B) also [LAUGHTER] the training that went with it?

Marc: Well, not surprisingly, they responded really well to the incentives. When you can pay people for their time, they generally do show up. We had quite a few people wanting to take the training, both internally at the University of Mississippi and from people who started finding out about it, because I was posting about it on Twitter and writing about it on my Substack. So we had interest from graduate students in Rome, interest from other SEC schools wanting to attend, and even interest from a community college in Hawaii. We’ve definitely seen a lot of interest within our community, both locally and more broadly, nationally.

Rebecca: Did you find that faculty were already somewhat familiar with AI tools? I had an interesting conversation with some first-year students just the other day, and we were talking about AI and copyright. And I was just asking, “Hey, how many of you have used AI?” And another faculty member and I indicated that we had used AI, to make it safe to admit it. And many of them really kind of shook their heads like “No, they hadn’t,” and they were unsure. And then I started pointing to places where we see snippets of it, in email and in texting and other places where there’s auto-finishing of sentences and that kind of thing. And then they’re like, “Oh, yeah, I have seen that. I have engaged with that. I have used that.” What did you find about faculty’s knowledge?

Marc: Extremely limited. They thought of AI as ChatGPT. And one of the things we did with the session was basically frame it out as: “Look, this is not going to remain a single interface anymore.” One of the things that happened during the institute that was completely wild to me came on the last day. I woke up that morning, and I’d signed up through Google Labs, as you can as well, to turn on the features within the Google suite of tools, including in search, Google Docs, Sheets, and everything else. And they gave me access that last day, right before we began. So I literally just plugged in my laptop and said, “This is what it’s going to look like in Google Docs when you have generative AI activated.” It pops up and immediately greets you with a wand and the phrase “Help me write.” And what I tried to explain to them, and have explained to faculty ever since, is that it makes having a policy against AI very difficult when it shows up in an existing application with no indication whatsoever that this is in fact generative AI. It’s just another feature in an application that, from many of our students’ perspectives, they have grown up with their entire lives. So yeah, we need to really work on training faculty, not just in the actual systems themselves, but also on getting outside of the mindset that the AI we’re talking about is just ChatGPT. It’s a lot more than that.

John: Yeah, in general, when we’ve done workshops, we haven’t had a lot of faculty attendance, partly because we haven’t paid people to participate [LAUGHTER], but what’s been surprising to me is how few faculty have actually explored the use of AI. My experience with first-year students was a little different than Rebecca’s: about half of the students in my large intro class said that they had explored ChatGPT or some other AI tool, and they seemed pretty comfortable with it. But faculty, at least in our local experience, have generally been a bit avoidant of the whole issue. I think they’ve taken the approach that this is something we don’t want to know about, because it may disrupt how we teach in the future. How do you address that issue, and get faculty to recognize that this is going to be a disruptive technology in terms of how we assess student learning, how students are going to be demonstrating their learning, and how they will be using these tools for the rest of their lives in some way?

Marc: That’s a great question. We trained 23 people, and I’ve also been holding workshops for faculty, and the enthusiasm was a little bit different in those contexts. And I agree: faculty feel overwhelmed, and maybe some of them want to ignore this and don’t actually want to deal with it, but it is here and it is being integrated at phenomenal rates into everything around us. If faculty don’t come to terms with this, and start thinking about engagement with the technology, both for themselves and for their students, then it is going to create incredible disruption that’s going to be lasting; it’s not going to go away. We’re also not going to have AI detection come in and save the day for them the way plagiarism detection did. Those are all things we’ve been trying to very carefully explain to faculty to get them on board. Some of them, though, just aren’t there yet. I understand that. I empathize, too. These things take a huge amount of time to think about and talk about. And we’re just coming out of the pandemic; people are exhausted, they don’t want to deal with another, quote unquote, crisis, which is another thing that we’re seeing too. So there are a lot of factors at play here that make faculty engagement less than what I’d like to see.

Rebecca: We had a chairs’ workshop over the summer, and I was somewhat surprised, based on our experience with other interactions with faculty, by how many chairs had used AI. It was actually a significant number, and most of them were familiar with it. That to me was encouraging [LAUGHTER]; it was like, “Okay, good, the leaders of the ship are aware. That’s good, that’s exciting.” But it’s also interesting to me that there are so many folks who are not that familiar, who haven’t experimented, but seem to have really strong policies around AI use, this idea of banning it or wanting to use detectors, without really being familiar with what those tools can and cannot do.

Marc: Yeah, that’s very much what we’re seeing across the board too. The first detector that I’m aware of that really came online for everyone was basically GPTZero. There were a few others that existed beforehand; IBM was involved with one called GLTR, the Giant Language Model Test Room. But those were all based on GPT-2, so you’re going back in time to 2019. I know how ridiculous it is to go back four years in technology terms and think about this… that was a long time ago. And education really started adopting detection based off of that panic. The problem with putting a system like that in place in education is that it’s not necessarily very reliable. Turnitin also rolled out its own AI detector, and a lot of different universities began to explore and play around with it. I believe, and I don’t want to be misquoted here or misrepresent Turnitin, when they initially came out with it, they were saying there was only a 1% false positive rate for detecting AI. They’ve since raised that to 5%. And that has some really deep implications for teaching and learning. Most recently, Vanderbilt’s teaching center made the decision to not turn on the AI detection feature in Turnitin. Their reasoning was that they had, I think, some 75,000 student papers submitted in 2022. If they had had the detector on then, that would have falsely flagged about 3,000 papers. And they just can’t deal with that sort of situation at a university level. No one can. You’d have to investigate each one. You would also have to give students a hearing, because that is part of due process. It’s just too much. And that’s one of the main concerns I have about these tools: they’re just not reliable in education.

John: And it’s unreliable in terms of both false positives and false negatives. So some of us are kind of troubled that we have allowed the Turnitin tool to be active, and have urged that our campus shut it down for those very reasons. Vanderbilt was one of the biggest campuses to do that, and I think quite a few campuses are moving in that direction.

Marc: Yes, the University of Pittsburgh also made the decision to turn it off. I think several others did as well, too.

Rebecca: It’s interesting: if we don’t have a tool to measure, a tool to catch, if you will, then you can’t really have a strong policy saying you can’t use it at all. [LAUGHTER] There’s no way to follow up on that or take action on it.

Marc: Where we’re at, I think, for education is a sort of conundrum, and we’re trying to explain this to faculty. Much more broadly in society, though, if you don’t have a tool that works when you’re talking about Twitter, I’m sorry, X now, for understanding whether material is actually real or fake, that becomes a societal problem too, and that’s what they’re trying to work on with watermarking. I believe the big tech companies have agreed to watermark audio, video, and image outputs, but they’ve not agreed to do text outputs, because text is a little bit too fungible: you can go in and copy it, you can change it around a little bit too much. So it’s definitely going to be a problem when state governments start to look at this, and they start wondering whether the police officer taking your police report is writing it in their own words, or whether the tax official is using this as well. It’s going to be a problem well outside of education.

Rebecca: And if we’re not really preparing our students for that world in which they will likely be using AI in their professional fields, then we’re not necessarily doing our jobs in education and preparing our society for the future.

Marc: Yeah, I think training is the best way forward, and again, going back to the idea of intentional engagement with the technology: giving students situations where they can use it, where you, as a faculty member, hopefully have the knowledge and the resources to begin to integrate these tools, talk about the ethical use cases, understand the limitations and the fact that it is going to hallucinate and make things up, and think about what sort of parameters you want to put on your own usage too.

John: One of the things that came out within the last week or so, I believe,… we’re recording this in late September… was the introduction of AI tools into Blackboard Ultra. Could you talk a little bit about that?

Marc: Oh boy, yes indeed, they announced last week that the tools were available in Blackboard Ultra. They turned it on for us here at the University of Mississippi, and I’ve been playing around with it, and it is a little bit problematic. For right now, with a single click, it will scan the existing materials in your Ultra course and create learning modules. It will create quiz questions based off that material, it will create rubrics, and it will also generate images. Now, compared to what we’ve been dealing with in ChatGPT and all these other capabilities, this is almost a little milquetoast by comparison. But it’s also an inflection event for us in education, because it’s now here, directly in our learning management system; it’s going to be something we’re going to have to contend with every single time we open it up to create an assignment or an assessment. And I’ve played around with it. It’s an older version of GPT. The image generation, I think, is based on DALL-E, so you ask for a picture of college students and you get some people with 14 fingers and weird artifacts all over their faces, which may not be what would actually be helpful for your students. And the learning modules it produces are not my thinking, necessarily; they’re just what the algorithm is predicting based off the content that exists in my course. We have that discussion with our faculty, we have them cross that Rubicon and say: “Okay, I’m worried about my students using this. What happens to me and my teaching, my labor, if I start adopting these tools? There could be some help, definitely; this could really streamline the process of course creation and actually make it aligned with the learning outcomes my department wants for this particular class.” But it also gets us into a situation where automation is now part of our teaching. And we really haven’t thought about that. We haven’t really gotten to that sort of conversation yet.

Rebecca: It certainly raises many ethical questions, and really questions about disclosing to students what has been produced by us as instructors and what has been produced by AI, and about the authorship of what’s there. Especially if we’re expecting students to [LAUGHTER] do the same thing.

Marc: It is mind boggling, the cognitive dissonance of having a policy saying “No AI in my class,” and then all of a sudden, it’s there in my Blackboard course, and I could click on something. And, at least in this iteration of Blackboard, and they may very well change this, once you do this, there’s no way to natively indicate that this was generated by AI. You have to manually go in there and say this was AI-created. I value my relationship with my students; it’s based on mutual trust. I think almost everyone in education does. If we want our students to act ethically and use this technology openly, we should expect ourselves to do the same. And if we get into a situation where I’m generating content for my students and then telling [LAUGHTER] them that they can’t do the same with their own essays, it is just going to be kind of a big mess.

John: So given the existence of AI tools, what should we do in terms of assessing student learning? How can we assess the work reasonably given the tools that are available to them?

Rebecca: Do you mean we can just use that auto-generated rubric right, that we just learned about? [LAUGHTER]

Marc: You could, or you can use an auto-generated rubric separately from Blackboard. One of the tools I’m piloting right now is a feedback assistant developed by Eric Kean and Anna Mills; I consulted with them on this, too. Anna is very big in the AI space for composition. It’s called MyEssayFeedback, and I’ve been piloting it with my students. They know it’s an AI, they understand this, and I did get IRB approval to do so. I’ve just gotten the second round of generated feedback, and it’s thorough, it’s quick, it’s to the point. And it’s literally making me say, “How am I going to compete with that?” And maybe the answer is that I shouldn’t be competing with that; maybe I won’t be providing that feedback, and I should be providing my time in different ways instead. Maybe I should be meeting with students one on one to talk about their experiences. But I think you raise an interesting question. I don’t want to be alarmist; I want to be as level-headed as I can. But from my perspective, all the pieces are now there to automate learning to some degree. They haven’t all been hooked up yet and put together into a cohesive package, but they’re all there in different areas. And we need to be paying attention to this. Our hackles need to be raised just slightly at this point to see what this can do, because I think that is where we are headed with integrating these tools into our daily practice.

Rebecca: AI generally has raised questions about intellectual property rights. And if our learning management systems are using our content in ways that we aren’t expecting, how is that violating our rights, or the rights that the institution has over the content that’s already there?

Marc: For a lot of the people that I speak with, their course content and their syllabi are, from their perspective, their own intellectual property in some ways. We get debates about that, about whether the university actually owns some of the material, but we have had instances where lectures were copyrighted in the past. And if you’re allowing the system to scan your lecture, you are exposing it to generative AI. That gets at one aspect of this. The other aspect, which I think Rebecca is referring to, is that the material used to train these large language models may itself have been stolen or not properly sourced from the internet, and you’re using it while trying to teach your students [LAUGHTER] to cite material correctly, so it’s just a gigantic conundrum of legal and ethical challenges. The one silver lining in all this, and this has been across the board with everyone in my department, is that this has been wonderful material to talk about with your students. They are actively engaged with it, they want to know about this, they want to talk about it. They are shocked and surprised about all the depths that have gone into the training of these models, and the different ethical situations with data and all of it too. So if you want to engage your students by talking to them about AI, that’s a great first step in developing their AI literacy. And it doesn’t matter what you’re teaching: it could be a history course, it could be a course in biology, this tool will have an impact in some way, shape, or form on your students’ lives, and they want to talk about it. Maybe something else to mention is that there are a lot of tools outside of ChatGPT, and a lot of different interfaces as well. I don’t know if I talked about this before in the spring, but one set of tools that’s really been effective for a lot of students is the reading assistants; one that we’ve been employing is called ExplainPaper.
Students upload a PDF to it, it calls on generative AI to scan the paper, and you can actually select whatever reading level you want and have the paper translated into your reading level. The one problem is that students don’t realize that they might be giving up some close reading, critical reading skills to it, just like we do with any sort of relationship with generative AI. There is that handoff and offloading of thinking, but for the most part, they have loved it, and it’s helped them engage with some really critical art texts that normally would not be at their reading level and that I would usually not assign to certain students. So those are helpful. There are plenty of new tools coming out too. One of them is Claude, Claude 2 to be precise, by Anthropic. It just came out, I think, in July for public release; it is as powerful as GPT-4, and it is free right now if you want to sign up for it. The reason why I mention Claude is that the context window, what you can actually upload to it, is so much bigger than ChatGPT’s. I believe its context window is about 75,000 words. So you can actually upload four or five documents at a time and synthesize those documents. One of the things I was using it for as a use case: I collected tons of reflections from my students this past year about the use of AI. It’s all in a messy Word document, 51 pages single spaced, all anonymized so there’s no data that identifies them. But it’s such a time suck to go through and code those reflections. So I’ve just been uploading them to Claude and having it use sentiment analysis to point out which reflections from these students are positive, and in what way, and it does it within a few seconds. It’s amazing.

John: One other nice thing about Claude is that it has a training database that extends into early 2023, so it has much more current information. In some ways that’s a little concerning for those faculty who were trying to ask more recent questions, particularly in online asynchronous courses, so that ChatGPT could not address them. With Claude’s expanded training database, that’s no longer quite the case.

Marc: That’s absolutely correct. And to add to our earlier discussion about AI detection: none of the AI detectors that I’m aware of has had time to actually train on Claude. So if you generated an essay… and you’re free to do this on your own, your listeners are too… if you generated an essay with Claude and uploaded it to one of the AI detectors, very likely you’re going to get zero detection or a very low detection rate, because it’s, again, a different system. It’s new; the existing AI detectors have not had time. So the way to translate this is: don’t tell your students about it right now, or rather, be very careful about how you introduce this technology to your students, which we should do anyway. But this is one of those tools that is massively popular; a lot of people just haven’t known about it because, again, ChatGPT takes up all the oxygen in the room when we talk about generative AI.

John: What are some activities where we can have students productively use AI to assist their learning or as part of their educational process?

Marc: That’s a great question. We started developing very specific activities that look at different pain points for writing classes. One of them was getting students to actually integrate the technology: we built a very careful assignment that called on very specific moves for them to make, both in terms of their writing and their integration of the technology. We also looked at building assignments around research questions that way. We have assignments from my Digital Media Studies students right now about how they can use it to create infographics. Using the paid version, ChatGPT Plus, they have access to plugins, and those plugins give them access to Canva and Wikipedia. So they can use Canva to create full-on presentations based off of their own natural language, and use actual real sources, by using those two plugins in conjunction with each other. I just make them then go through it, edit it with their own words, their own language, and reflect on what this has done to their process. So, lots of different examples. It really is limited only by your imagination at this point, which is exciting, but it’s also kind of the problem that we’re dealing with: there’s so much to think about.

Rebecca: From your experience in training faculty, what are some getting-started moves that faculty can take to get familiar enough to take this step of integrating AI by the spring?

Marc: Well, one thing they could do is take one of a few really fast courses. Ethan Mollick, from the Wharton School of Business, put out a very effective training course through YouTube, I think it’s four or five videos, very simple to take, to get used to understanding how ChatGPT works, how Microsoft’s Bing works as well, and what sort of activities students and faculty can use them for. Microsoft has also put out a very fast course, I think it takes 53 minutes to complete, about using generative AI technologies in education. Those are all very fast ways of coming up to speed with the actual technology.

John: And Coursera has a MOOC through Vanderbilt University, on Prompt Engineering for ChatGPT, which can also help familiarize faculty with the capabilities of at least ChatGPT. We’ll include links to these in the show notes.

Marc: I really, really hope Microsoft, Google, and the rest of them calm down, because this has gotten a little bit out of control. The integration of these tools is often without use cases; they’re often waiting to see how we’re going to come along and use them. And that is concerning. Google has announced that they are committed to releasing their own model that’s going to be in competition with GPT-4, I think it’s called Gemini, by late November. So it looks like they’re just going to keep heating up this arms race, and you get bigger, more capable models, and I think we do need to ask ourselves more broadly what our capacity is just to keep up with this. My capacity is about negative zero at this point… going down further.

John: Yeah, we’re seeing new AI tools coming out almost every week or so now in one form or another. And it is getting difficult to keep up with. I believe Apple is also planning to release an AI product.

Marc: They are. They also have a car they’re planning to release, which is the weirdest thing in the world to me: you could have your iPhone charging in your Apple Car.

John: GM has announced that they are not going to be supporting either Android Auto or Apple CarPlay in their electric vehicles. So perhaps this is Apple’s way of getting back at them for that. And we always end with the question: what [LAUGHTER] is next? …which is perhaps a little redundant, but we do always end with that.

Marc: Yeah, I think what’s next is trying to critically engage the technology and explore it not out of fear, but out of a sense of wonder. I hope we can continue to do that. I do think we are seeing a lot of people starting to dig in. And they’re digging in real deep. So I’m trying to be as empathetic as I can be for those that don’t want to deal with the technology. But it is here and you are going to have to sit down and spend some time with it for sure.

John: One thing I’ve noticed in working with faculty is that they’re very concerned about the impact of AI tools on their students and student work, but they’re really excited about all the possibilities it opens up for them in terms of simplifying their workflows. So that, I think, is a positive sign.

Rebecca: They could channel that to help understand how to work with students.

Marc: I hope they find that there’s a positive pathway forward with that too.

John: Well, thank you. It’s great talking to you and you’ve given us lots more to think about.

Marc: Thank you guys so much.


John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.


310. Community Effects of Incarceration

Some students receive substantial support on their educational journey within their homes, communities, and schools; others face substantial barriers. In this episode, Arpit Gupta joins us to discuss his recent study that examines the effect of community incarceration rates on the academic performance of children in affected households and on their classmates.

Arpit is an Associate Professor of Finance at the Leonard N. Stern School of Business at NYU. Arpit has published extensively in highly ranked finance, economics, science, law, and management journals on topics ranging from housing markets, infrastructure investment, bail, local journalism, racial housing gaps, incarceration, and remote work.

Show Notes

  • Gupta, A., Hansman, C., & Riehl, E. (2022). Community impacts of mass incarceration. Working paper, May 3.
  • Norris, S., Pecenco, M., & Weaver, J. (2021). The effects of parental and sibling incarceration: Evidence from Ohio. American Economic Review, 111(9), 2926-2963.
  • Lazear, E. P. (2001). Educational production. The Quarterly Journal of Economics, 116(3), 777-803.
  • Chetty, R. (2016). Improving opportunities for economic mobility: New evidence and policy lessons. In Economic Mobility: Research and Ideas on Strengthening Families, Communities, and the Economy, edited by Brown, A., Buchholz, D., Davis, D., & Gonzalez, A., 35-42.
  • Chetty, R. (2021). Improving equality of opportunity: New insights from big data. Contemporary Economic Policy, 39(1), 7-41.


John: Some students receive substantial support on their educational journey within their homes, communities, and schools; others face substantial barriers. In this episode, we discuss a recent study that examines the effect of community incarceration rates on the academic performance of children in affected households and on their classmates.


John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.


Rebecca: Our guest today is Arpit Gupta. Arpit is an Associate Professor of Finance at the Leonard N. Stern School of Business at NYU. Arpit has published extensively in highly ranked finance, economics, science, law, and management journals on topics ranging from housing markets, infrastructure investment, bail, local journalism, racial housing gaps, incarceration, and remote work. Welcome, Arpit.

Arpit: Thanks so much for having me.

John: It’s great to see you again. It’s been a while since we last talked… 20 years or so.

Arpit: Yeah, it’s been a while. So I owe my economics career to John having him teach me at a very formative time in my life. Very happy to be back here.

John: Back at the TIP program, way back. And you would have probably done that anyway, because you had a lot of interest in it even back then. Today’s teas are: …are you drinking any tea, Arpit?

Arpit: …just drinking water at the moment.

Rebecca: It is the foundation of tea.

John: It’s one of our more popular teas.

Rebecca: I have an Awake tea today.

John: I have a Darjeeling tea today.

Rebecca: So we’ve invited you here today to discuss your May 2022 working paper on community impacts of mass incarceration, co-authored with Christopher Hansman and Evan Riehl. Could you tell us about the origin of this study?

Arpit: Yeah, so Chris, Evan, and I were all graduate students at Columbia University. Chris and I were also roommates. And we had a third roommate who was a public defender. So we would just come home and hear interesting stories of his experiences at work and things he was seeing. One of the things that he brought home and talked to us about was the fact that bail was an interesting process, and there was an interesting random assignment of cases across bail judges. And so that was our first project; it stemmed directly from talking to this roommate, who was a collaborator on that project. Another thing that he mentioned is that, the way he saw it, incarceration spells really had rippling effects, not just directly on the individuals concerned, but on broader communities in different ways. And we felt that that was a really interesting insight that had been explored in some other non-economics research. And we wanted to explore this concept further, because we felt it was an important, essential public policy question. And so we spent many years trying to get the right data and setting to explore these broader community impacts of incarceration.

John: So earlier studies had found that incarceration of a parent had significant effects on education for children within the household. Could you just talk a little bit about those effects before we talk about your contribution to this literature?

Arpit: Yeah, absolutely. So there is a pretty broad literature on this topic. And I would separate the papers that are not in economics from the papers that are in economics. There are a number of great studies that, for example, will track cohorts of people across generations to see what the rippling long-term implications of incarceration are. There are a variety of these papers that explore what I would describe as the multi-dimensional aspects of incarceration on different outcomes for the individuals and families concerned. And I would characterize this non-economic literature as really highlighting how disproportionate and spatially concentrated incarceration is. That’s the key insight of this broader sociological literature: that incarceration is something that affects a lot of people in very concentrated and bad ways. The economics literature has taken a bit of a different approach and has primarily focused on the direct impacts of incarceration, with some literature starting to look at how it also affects household members. A lot of that literature has been in Scandinavian countries, where they have a different justice system and really good data. Some of those papers have actually found positive effects of parental incarceration on children’s outcomes, which might make sense if you’re removing, for example, a negative role model from a child’s life, or if the criminal justice system itself offers positive remediation, restorative justice, and so forth, that improves someone’s outcomes after they’ve returned from prison. The closest paper to our study in the United States is a paper by Norris, Pecenco, and Weaver, which focuses on the effects of incarceration for students in Ohio.
And there, they argue that incarceration of a parent reduces the odds that the child is going to be involved in the criminal justice system in the future, so that they are less likely to be arrested. And they find more mixed evidence on the education impact. They don’t find much evidence for negative education impacts, but that’s based on a somewhat smaller sample with larger standard errors.

John: Your study, though, goes a little bit further, because you’re looking not just at the effect on children within the household, but also at spillover effects into their classrooms and schools from the incarceration of adults in the household. How did you separate out the effect of differences in incarceration rates from all the other factors that might influence such outcomes in those communities?

Arpit: Absolutely. So this is going to be, of course, a key distinction between how economists think about the problem versus other disciplines. We’re focused on the question of identification: how can we identify whether the negative or positive impacts you’re looking at can be attributed to incarceration, or are just reflective of other background trends? Let me start first with how we think about these effects in aggregate, because that gets at the community dimension of the problem, which is our central focus. So the big question that we’re really interested in is: what happens to a community when a lot of people within that community are behind bars? How does the impact on that set of individuals spill over and affect the overall community? And of course, this is an even harder identification problem than just looking at the effects on one person, because you wonder about the omitted background factors that can affect entire communities. But we find that when a county has a relatively more strict set of judges, that actually has a large impact on the overall performance of all the students in the area. So that suggests that there are large impacts of incarceration that broadly affect all the students in a particular area. And that motivates us to think about what the size of the effect of incarceration on children’s outcomes is, and what the mechanisms are by which they’re affected. We then dig more deeply into the effects on the directly affected children, those whose parents are themselves incarcerated. There, we similarly use judicial variation, and we also look at the spillovers onto other children in the classroom. So the key innovation, the key contribution, I think, of our analysis is to take this question that has been studied before, but adapt it to the problem of thinking about the more aggregate consequences and the mechanisms by which incarceration affects broader communities.
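
The judicial-variation logic Arpit describes can be sketched with a toy simulation. This is a hedged illustration of the identification idea, not the paper’s actual estimation: the number of judges, the effect sizes, and all variable names are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
n_cases, n_judges = 200_000, 50

# Cases are randomly assigned to judges who differ in how often they
# incarcerate, so judge stringency shifts incarceration without being
# related to a family's unobserved circumstances.
stringency = rng.uniform(0.2, 0.6, size=n_judges)   # P(incarcerate) by judge
judge = rng.integers(0, n_judges, size=n_cases)     # random assignment
z = stringency[judge]                               # the instrument

u = rng.normal(0, 1, n_cases)                       # unobserved disadvantage
incarcerated = (rng.uniform(0, 1, n_cases) < z + 0.1 * (u > 1)).astype(float)

# Child's test score: a true -0.05 SD causal effect of incarceration,
# plus the unobserved disadvantage that also raises incarceration risk
score = -0.05 * incarcerated - 0.3 * u + rng.normal(0, 0.1, n_cases)

# Naive OLS is biased by the shared unobservable...
ols_estimate = np.cov(incarcerated, score)[0, 1] / np.var(incarcerated)
# ...while the judge-stringency instrument recovers the causal effect
iv_estimate = np.cov(z, score)[0, 1] / np.cov(z, incarcerated)[0, 1]

print(f"true effect: -0.050  OLS: {ols_estimate:.3f}  IV: {iv_estimate:.3f}")
```

The naive comparison confounds incarceration with disadvantage; the instrument isolates only the part of incarceration driven by the luck of which judge a case draws.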

John: And you also use an event study approach too, to provide more support. Could you talk a little bit about that part of the analysis?

Arpit: So we use those in both our direct and indirect analyses, where we were trying to understand what the impact on a student is when a family member is incarcerated. And the event study approach basically looks before and after that arrest at the outcomes for the children, as measured by things like test scores, suspension rates, misbehavior rates, and so forth. We’re interested in a somewhat multi-dimensional set of outcomes for children, because we want to know both how this child is doing and whether there are behavioral disruptions that may stem from an incarceration in the background at home, because if you’re misbehaving in the classroom, that’s something that will potentially negatively affect other children’s learning in the classroom. The event study is looking within the child before and after that arrest period. And we also do that same event study analysis at the classroom level, looking at what happens to the performance of other students in the classroom when one of the students’ family members is arrested.
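
The before-and-after comparison Arpit describes can be sketched in a few lines. This is a toy illustration, not the authors’ code: the panel, the assumed 5%-of-an-SD drop, and every other number are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy panel: 500 affected students observed for 8 periods, with a
# household arrest at event time 0.  Scores are in SD units.
n_students = 500
event_times = np.arange(-4, 4)                      # -4..-1 pre, 0..3 post

student_effect = rng.normal(0, 1, size=(n_students, 1))
noise = rng.normal(0, 0.2, size=(n_students, len(event_times)))
post = (event_times >= 0).astype(float)
scores = student_effect + noise - 0.05 * post       # assumed post-arrest drop

# Event-study estimate: each period's average score, measured relative
# to the student's own pre-arrest average
pre_mean = scores[:, event_times < 0].mean(axis=1, keepdims=True)
effect_by_time = (scores - pre_mean).mean(axis=0)

for t, b in zip(event_times, effect_by_time):
    print(f"event time {int(t):+d}: {b:+.3f} SD")
```

Estimates near zero before the arrest and a drop after event time 0 is the pattern the design looks for; flat pre-trends are what lend the comparison credibility.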

Rebecca: How big was the impact of incarceration on children in the affected households and in their classrooms?

Arpit: So for one individual child, the effect on math and English scores is something like 5% of a standard deviation. So it’s an effect that is sizable, if you think about many educational interventions as having very heterogeneous effects, and it’s often very hard to get meaningful moves in child performance. But the really big part of the analysis, I think, was trying to reconcile those direct effects, the ones that are one to one and a half percent of a standard deviation, against the overall impact of incarceration on the whole community. So what happens if I take a whole county and I change the mix of judges so that I have much more incarceration? What is the overall educational impact there? When we looked at that overall community-level perspective, we actually found that a one standard deviation change in county-level stringency affects test scores by between one and a half and three and a half percent of a standard deviation. So we’re basically getting very big aggregate effects that the individual effects alone can’t explain. And so we think that there’s scope for these spillover effects, by which an arrest directly affects how a child behaves in the classroom, which then spills over to the other children in the classroom and thereby amplifies the effect, so as to generate larger negative overall effects. And one channel that we use to identify those is to look within the classroom itself. We’re not going to measure all the potential spillovers between children, but it’s one area where we think there are spillovers, and we think that those spillovers can also account for some fraction of the overall community effect.

Rebecca: Can you translate some of that standard deviation talk [LAUGHTER] for people who don’t know anything about statistics?

Arpit: For example, at the county level, when we are thinking about a one standard deviation increase in stringency, we’re thinking about a 15 to 20% increase in incarceration. So that’s the range of variation that we’re looking at at the county level when thinking about what a typical shock to incarceration looks like, and that’s a pretty substantial increase in incarceration levels.
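
To make the standard-deviation translation concrete, here is a small worked example. The 100-point exam scale is invented for illustration; only the 5%-of-an-SD figure comes from the conversation.

```python
# Translating "percent of a standard deviation" into points on a
# hypothetical exam whose scores have a standard deviation of 100
# points.  The exam scale is an assumption, not from the study.
exam_sd_points = 100
direct_effect_sd = 0.05                 # ~5% of a standard deviation

direct_points = direct_effect_sd * exam_sd_points
print(f"affected child scores about {direct_points:.1f} points lower")
```

So a 5%-of-an-SD effect on this hypothetical exam would mean scoring about 5 points lower; small for one test, but meaningful relative to what most educational interventions move.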

John: So you’re finding the effect on any one other student is relatively small, but the aggregate effect on all the students in the class is relatively large. Is that correct?

Arpit: That’s right. So when we look at those other students in the classroom, the effects for those students in response to the incarceration of a peer’s family member are on the order of 0.3 to 0.4 percent of a standard deviation. You should basically think of that as a really small number. And the only way we’re getting the power to analyze this is that we’re looking at this North Carolina data, which is really great; a lot of people have worked with it exactly because it is so comprehensive. So we’ve got all the student rolls, we’ve got all the arrest records, and all of these are matched together. Using this really holistic sample allows us to quantify effects that are pretty small for any one individual child, but there are just a lot of exposures that can aggregate up. And so we think that this classroom disruption channel can explain something like 15% of that relationship between aggregate incarceration and test scores. So it all adds up to explain a more meaningful fraction of this overall relationship between what happens when a lot of people in an area go to jail and what happens to student performance in that area.
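
The aggregation logic here is simple arithmetic: each peer effect is tiny, but one arrest touches a whole classroom. The per-peer figure is the one quoted above; the class size is invented for illustration.

```python
# Back-of-the-envelope aggregation of small peer effects.
peer_effect_sd = 0.0035          # ~0.35% of an SD per classmate (quoted above)
class_size = 24                  # hypothetical number of exposed peers

spillover_per_arrest = peer_effect_sd * class_size
print(f"one arrest shaves a combined {spillover_per_arrest:.3f} SD "
      f"off classmates' scores")
```

Scaled up across the many arrests in a high-incarceration county, exposures like this are how individually tiny effects can add up to a visible share of the aggregate relationship.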

John: What sort of mechanism are you hypothesizing might be the cause of the spillover effects to other students in the classrooms?

Arpit: So let’s start with what we can measure in our data. What we observe is that affected children, whose family members are incarcerated, see increases in suspension days, they’re absent more often, and they’re involved in more fighting incidents, and typically it takes two people to fight. So that tells us that there are other people in the classroom involved with these affected students. And we think this relates very closely to the idea that there are classroom-level externalities; there is a large literature, actually, papers by Lazear and others, that highlights the importance and implications of classroom-level externalities and classroom disruptions when it comes to learning. It also comes up, by the way, when I talk to people in North Carolina who are teachers. One thing that they really bring up is that children come into the classroom with all sorts of backgrounds that change behavior in the classroom, and that impairs the learning experience for other students. So what we can measure most cleanly is the existence of these behavioral disruptions, which affect how students behave in the classroom and influence, through that channel, the learning experience of other children. That doesn’t need to be the only mechanism going on here; there can be other spillover channels between children that we can’t observe in our data. There can also be channels outside of peer interactions between children, through other community interactions between people, that we also can’t measure in our data. So we think of this project as really trying to open up a set of analyses considering the broader web of social interactions when incarceration happens.

Rebecca: What are some of the public policy implications of the study?

Arpit: So the challenge, of course, is that we’re measuring one side of the equation, sort of the costs of incarceration, and you have to balance those against some of the possible benefits of incarceration, because children are also affected by crime in the local community. And so it’s a difficult tradeoff to try to balance both the costs and benefits of incarceration in tandem. So I don’t think our results actually have a clear takeaway. I think the biggest thing that I personally took away from the analysis is that if we have different techniques, different ways of trying to reduce and address crime, it would be ideal if we were able to lean on ways that rely less on the incarceration channel, which imposes these additional externalities and costs and burdens on local communities, and instead found other ways of trying to address and mitigate and reduce crime. So for example, in a different setting, when it comes to thinking about bail, which is a topic we’ve also researched before, there is sometimes a choice between arresting individuals and putting them in jail, compared to something like house arrest, or something like electronic or digital monitoring. These systems are also not perfect; there are also a lot of costs and tradeoffs there. But to the extent that you can find ways of deterring and mitigating crime that don’t rely as much on the incarceration channel, I think that lowers the negative spillover effects on local communities. I want to mention that when we look at these multi-dimensional impacts of the original incarceration event on the student, we actually find, consistent with prior literature, that to the extent that we can observe juvenile offenses, we don’t observe increases in crime; if anything, there are decreases in criminal activity. That, again, is consistent with some of the prior literature.
And the way to interpret that, I think, is to again think of there being multiple dimensions along which people are affected. So there can be a negative role model effect: you observe someone going to jail for a crime and think, “Well, I’m not going to commit that crime,” but you may still act out in the classroom. So we shouldn’t think of the responses to these kinds of disruptive background events as happening on some uniform spectrum of good behavior or bad behavior; it’s much more multi-dimensional how people respond to stressful situations.

John: Did you find a difference in the effect whether it was a male or a female in the household who is incarcerated in terms of the impact on children?

Arpit: Everything I’ve said so far, I’ve been trying to be careful to say these are individuals in the household, because really, what we’re doing is a household-level match. So we’ve got the address, and what we really know is that someone who lives at this address was arrested. We view that mostly as a strength of our approach, which is trying to identify household members. It recognizes the intergenerational and complicated backgrounds many families have, but it does make it a little more challenging to establish the true relationship between individuals. And so one thing that we did there is try to identify probable female parents or guardians and male parents or guardians, or simply assign age ranges and things like that. We did find the effects on children were much larger when we were looking at the incarceration of a female parent. That makes a lot of sense if you think that mothers and female guardians play uniquely important roles within the household. And when it comes to the children themselves, the effects were actually pretty similar between boys and girls.

John: In the US, we have one of the worst rates of intergenerational income mobility. Might this type of issue be one of the causes of that, in that in low-income communities, where incarceration rates tend to be higher, it’s putting children in those communities at a further disadvantage, which can have some long-term consequences?

Arpit: One thing I want to mention is that where we’re taking the paper is to adopt the community frame and think about other community outcomes that might potentially change as a result of incarceration. So I do think that this is probably one of the reasons that we have not just a low average rate of social mobility in the United States, but also very regionally varying rates of social mobility across the United States. I remember when the first Chetty map was released that showed the geography of economic mobility in the United States. My home state, North Carolina, is actually incredibly low for social mobility. And that’s surprising, actually, because North Carolina is where everyone’s moving to. It is incredibly economically dynamic, it has lots of job centers, and it has a low cost of housing. It has a lot of features which you might expect to be associated with high economic mobility. And in fact, like much of the South, you actually observe pretty low social mobility there. And I do wonder whether one reason for that is that we have these very high rates of incarceration across much of the United States. And it’s not an easy thing to just stop incarcerating, because we all know that the criminal justice system is also there to protect low-income communities from the negative consequences of crime. So the public policy challenges of figuring out what to do about this are really complicated. But we want to know why people who grew up in the same state that I did don’t necessarily have great opportunities compared to people who grew up elsewhere. So we’re hoping to use this setting, use this analysis, to dig a little bit deeper into this question.
And one fact that is already out there that I think is very related is that the analysis by Chetty and others, which looked at the geography of social mobility, found that a big correlate, something that associates strongly with social mobility across the United States, is the presence of two-parent households. So the number of absent fathers associates very strongly with the lack of social mobility in an area. Of course, that is not a causal statement; you could imagine things going the other way, so that a lack of social mobility impacts family structure in different ways. But I think that’s a diagnostic that is suggestive of the idea that something about incarceration affects broader communities, affects family formation, affects family stability in ways that impact people’s ability to build stable relationships. And all of that has really persistent negative impacts.

Rebecca: As an educator, this study makes me think about how, as a teacher in a classroom, I’m kind of experiencing the phenomenon that you’re studying, and about the kinds of things that I might consider doing for classroom management, or the ways I might better understand what I’m observing. I think it’s food for thought for educators to be more aware of what’s happening in their communities.

Arpit: The other question, I think, for economic policy is about these measures of teacher value added, which are being thought of as ways of assessing or even compensating teachers based on the increases in test scores that they produce in the classrooms they teach, right? And this makes sense to economists: we want to value and grade people based on the incremental gains they’ve generated for the population coming in. But one thing I actually hear a lot from teachers is that they’re very worried about this as something that affects them, because they’re saying, “Well, it’s not my fault if I happen to have a classroom in a particular year where the children are going through a lot of stuff at home. They’re not necessarily going to learn as much, that might affect other children in the classroom as well, and that’s something that I will potentially be judged for, something outside of my control.” And that is a serious problem for this whole teacher value added methodology, because these kinds of background events don’t necessarily follow a predictable sequence. They can happen at various times over students’ lives and over a teacher’s career across different classrooms. And so it’s very hard, statistically, to separate out whether a student is doing well or badly because of the teacher or because of some background events. It also impacts, I think, how we statistically evaluate and think about evaluating teachers.
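
The concern about value-added measures can be illustrated with a toy simulation: two equally good teachers, one of whose classrooms is hit by a disruptive background event. Every magnitude here is invented for the sake of the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two identical teachers; class B experiences a background shock
# (e.g., a disruptive event in a student's household).
class_size = 25
within_class_noise = 0.05                    # kept small so the point is visible
background_shock = {"A": 0.0, "B": -0.08}    # class B's disruption, in SD units

measured = {}
for teacher, shock in background_shock.items():
    scores = shock + rng.normal(0, within_class_noise, class_size)
    measured[teacher] = scores.mean()        # naive "value added"

print(f"teacher A: {measured['A']:+.3f} SD, teacher B: {measured['B']:+.3f} SD")
# The teachers are identical; B only looks worse because of the shock.
```

Because shocks like this don’t follow a predictable sequence, a value-added measure that ignores them will sometimes penalize a teacher for events entirely outside their control.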

Rebecca: I imagine it also impacts classroom management, and observations of classroom management and other tools that we currently use to evaluate teachers… if behavior in one class is different than in others, or students have different traumatic experiences impacting their behavior, that’s not necessarily apparent to an observer.

John: And we have probably put far too much weight on tying teacher and school compensation and budgets to student performance, because, as you said, there’s so much that’s outside the control of the teachers or the school districts.

Rebecca: That’s also the schools that tend to struggle to get teachers and things too, right?

John: And we’re penalizing those teachers and those school districts, often, that face the most severe challenges and need the most support. You mentioned this dataset from North Carolina is a very rich one, but you had to do a bit of work to get all that data together. Because there is a lot of data on student outcomes, but you also have to tie this to incarceration. Could you talk a little bit about how you matched the household data, or the incarceration data, to the schooling data?

Arpit: Oh, man, this is my favorite part of the project, because it allows me to reminisce about my sort of white whale moment of a project. So I think all of us as researchers need to think about which projects we really want to see through, which are the ones that we’re really going to go to bat for, and this was one of those projects for me. I just felt that this question needed to be answered. And so, together with my collaborators, we really just spent a really long time trying to figure out how to get the right data for this. You have to put together the criminal justice records for a given area, you need to put together the education records, and then you need to figure out how to link the two. In some states you can get one, in some states you can get the other, and it’s very hard to find a set of states where the two of them match. So we tried a whole range of states, a whole range of datasets; many times we got very close but were stopped at the last minute. And finally, we were able to work with the state of North Carolina, which has an excellent set of education records and these great criminal justice records, and we were able to figure out a way of merging and matching the two sets of records at the household level, have a pretty good sense of the direct linkages between the children in our sample and the criminal defendants, and then use the classroom identifiers in the dataset to identify other spillover effects and look at the broader geographic implications. So all of that wound up working out for us in the end, but it was a long haul to get there. And I think a lesson that I took away from this project is that if you want something done well, you really have to work at it. There’s no substitute for putting in the shoe leather: calling people, cold calling people, emailing people, and just hearing no, no, no, again and again and again until you’re able to figure out something that works.

John: And the matching between households for the students and the incarcerated people was based on household addresses. Is that correct?

Arpit: That’s right. So that match was done by the North Carolina education folks. They took their records, they imported the criminal justice records, matched that at the household level, and then gave us a data set that had removed all identifiers that we could work with for research.

John: It’s a wonderful data set and it’s a really impressive piece of work.

Arpit: Thank you very much.

Rebecca: So we always wrap up by asking what’s next?

Arpit: For us on this project, we’re really trying to see if we can think about some of these broader implications of incarceration on communities outside of the educational impacts that we’ve been talking about so far. So thinking about the impacts on family structure, thinking about whether it spills over into the usage of other government programs, whether it has employment effects, kind of housing market access, I think that there are a whole range of different outcomes, particularly at these broader community levels that I think are shaped by the number of people in that local community that are impacted by incarceration. So I think those are the overall community spillovers, we’re interested in understanding.

John: Well, thank you. This is some really impressive work. And I have to say I’m really impressed by all the work that you’ve been doing in so many areas. You’re doing some wonderful work on some really important topics.

Arpit: Thank you very much, John. I had an economics teacher growing up who inspired me to work on these topics.

Rebecca: Well, thank you so much. We’re looking forward to sharing this with our audience.

Arpit: Thanks.


John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.


309. Preparing Students for an AI Future

New technology is often seen as a threat to learning when first introduced in an educational setting. In this episode, Michelle Miller joins us to examine the question of when to stick with tools and methods that are familiar and when to investigate the possibilities of the future.

Michelle is a Professor of Psychological Sciences and President’s Distinguished Teaching Fellow at Northern Arizona University. She is the author of Minds Online: Teaching Effectively with Technology and Remembering and Forgetting in the Age of Technology: Teaching, Learning, and the Science of Memory in a Wired World. Michelle is also a frequent contributor of articles on teaching and learning in higher education to publications such as The Chronicle of Higher Education.

Show Notes


John: New technology is often seen as a threat to learning when first introduced in an educational setting. In this episode, we examine the question of when to stick with tools and methods that are familiar and when to investigate the possibilities of the future.


John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.


John: Our guest today is Michelle Miller. Michelle is a Professor of Psychological Sciences and President’s Distinguished Teaching Fellow at Northern Arizona University. She is the author of Minds Online: Teaching Effectively with Technology and Remembering and Forgetting in the Age of Technology: Teaching, Learning, and the Science of Memory in a Wired World. Michelle is also a frequent contributor of articles on teaching and learning in higher education to publications such as The Chronicle of Higher Education. Welcome back, Michelle.

Michelle: Hey, it’s great to be here.

Rebecca: Today’s teas are:… Michelle, are you drinking tea?

Michelle: I’m actually still sticking with water. So it’s a healthy start so far for the day.

Rebecca: Sounds like a good plan.

John: I have ginger peach black tea today.

Rebecca: And I’ve got some Awake tea. We’re all starting the day [LAUGHTER].

John: So we’ve invited you here to discuss your August 17th Chronicle article on adapting to ChatGPT. You began that article by talking about your experience teaching a research methods course for the first time. Could you share that story? Because I think it’s a nice entree into this.

Michelle: Oh, thank you. I’m glad you agree. You never know when you’re sharing these kinds of personal experiences. But I will say this was triggered by my initial dawning awareness of the recent advances in AI tools, which we’re all talking about now. So initially, like probably a lot of people, I thought, well okay, it’s the latest thing and I don’t know how kind of attentive or concerned I should be about this. And as somebody who does write a lot about technology and education, I have a pretty high bar set for saying, “Oh wow, we actually kind of need to drop everything and look at this,” I’ve heard a lot of like, “Oh, this will change everything.” I know we all have. But as I started to get familiar with it, I thought “Oh my goodness, this really is a change” and it brought back that experience, which was from my very first assignment teaching the Research Methods in Psychology course at a, well, I’ll just say it was a small liberal arts institution, not my graduate institution. So I’m at this new place with this new group of students, very high expectations, and the research methods course… I think all disciplines have a course kind of like this, where we kind of go from, “Oh, we’re consuming and discussing research or scholarship in this area” to “Okay, how are we going to produce this and getting those skills.” So it is challenging, and one of the big challenges was and still is, in different forms, the statistical analysis. So you can’t really design a study and carry it out in psychological sciences without a working knowledge of what numbers are we going to be collecting, what kind of data (and it usually is quantitative data), and what’s our plan? What are we going to do with it once we have it, and getting all that statistical output for the first time and interpreting it, that is a big deal for psychology majors, it always is. So students are coming, probably pretty anxious, to this new class with a teacher they haven’t met before. 
This is my first time out as the instructor of record. And I prepared and prepared and prepared, as we do. And one of the things that I worked on was, at the time, our methodology for analyzing quantitative data. We would use a statistics package, and you had to feed it command-line-style input; it was basically like writing small programs to then hand over to the package. And you would have to define the data, you’d have to say, “Okay, here’s what’s in every column and every field of this file,” and there was a lot to it. And I was excited. Here’s all this knowledge I’m going to share with you. I had worked for years to figure out all my tricks of the trade for how to make these programs actually run. And so I’ve got my stack of overheads. I come in, and I have one of those flashbulb memories. I walked into the lab where we were going to be running the analysis portion, and I look over the students’ shoulders, and many of them have opened up and are starting to mess around and play around with the newest version of this statistics package. And instead of these [LAUGHTER] screens with some commands, what am I looking at? I’m looking at spreadsheets [LAUGHTER]. So the data is going into these predefined boxes. There’s this big, pretty, colorful interface with drop-down menus… All the commands that I had to memorize [LAUGHTER], you can point and click, and I’m just looking at this and going, “Oh no, what do I do?” And part of my idea for this article was going back and taking apart what that was like and where those reactions were coming from. And as I put in a very condensed form in the article, I think it really was one part just purely anxiety, and maybe a little bit of loss, saying, “But I was going to share with you how to do these skills…” partly that “Oh no, what do I do now?” I’m a new instructor. 
I have to draft all this stuff. And then partly, yeah, curiosity, saying, “Well, wait a minute, is this really going to do the same thing as how I was generating these commands? And I know you’re still going to need that critical thinking and the top-level knowledge of ‘Okay, which menu item do you want?’ Is this going to be more trouble than it’s worth? Are students going to be running all the wrong analyses because it’s just so easy to do? And is it going to go away?” So all of that complex mix is, of course, not identical to, but I think pretty similar to, how I felt… maybe how a lot of folks are feeling… about what the role of this is going to be in my teaching and in my field, and in scholarship in general going forward.

Rebecca: So in your article, you talk a lot about experimenting with AI tools to get started in thinking about how AI is related to your discipline. And we certainly have had lots of conversations with faculty about just getting in there and trying it out, just to see how tools like ChatGPT work and to become more familiar with how they might be integrated into their workflow. Can you share a little bit about what you’d recommend for faculty, or how you were thinking about [LAUGHTER] jumping in, experimenting, and just getting started in this space?

Michelle: Well, I think perhaps it also can start with a little bit of that reflection, and I think your listenership probably has a lot of very reflective faculty and instructors. And I think that’s the great first step: “Alright now, if I’m feeling worried, or I’m feeling a very negative reaction, where’s that coming from and why?” But then, of course, when you get in and actually start using it, the way that I had to get in and start using my statistics package in a brand new way, then you do start to see, “Okay, well, what’s great, what’s concerning and not great, and what am I going to do with this in the future?” So experiment with the AI tools, and do so from a really specific perspective. When I started experimenting at first, I think I thrashed around and wasted some time and energy initially, looking at some things that were not really education focused. Something that’s aimed at people who are, say, social media managers, and how this will affect their lives, is very different from me as a faculty member. So make sure you narrow it down, and be a little planful about what you look at, what resources you’re going to tap into, and so on. That’s a good starting point. Now, here’s what I also noticed about my initial learning curve with this. I decided to go with ChatGPT myself as the tool I wanted to get the most in depth with. And I noticed that, of course, like with any sort of transfer-of-learning situation, and so many of those things we do with our students, I was falling back into a kind of an old pattern. My first impulse was really funny: it was just to ask it questions, because, now that we’ve had several decades of Google and other kinds of search engines under our belts, we get into these AI tools and we treat them like search engines, which, for many reasons, they really are not. Now, this is not bad; you can certainly get some interesting answers. 
But I think it’s good to really have at the front of your mind to transition from simply asking questions to what these tools really shine at, which is following directions. I think one of the best little heuristics I’ve seen out there, just very general advice, is: role, goal, and instructions. So instead of coming in and saying “what is” or “find” or something like that, what perspective is it coming from? Is it acting as an expert? Is it acting as an editor? Is it going to role play the position of a college student? Tell it what you’re trying to accomplish, and then give it some instructions for what you want it to do. That’s a big step that you can get to pretty quickly once you are experimenting, and that’s, I think, real important to do. So we have that. And of course, we also want to keep in mind that one of the big distinguishing factors is that these tools have memory: your session is going to unfold in a particular and unique way, depending not just on the prompts you give it, but on what you’ve already asked it before. So, once you’ve got those two things, you can start experimenting with it. And I do think coming at it from very specific perspectives is important, as I mentioned, because there’s so little super-general, discipline-independent advice that I think is really going to be useful to you. A lot of us start in a sort of low-stakes, tentative way with other interests we might have. So for example, one of the first things that I did to test it out myself was I had it work out a kind of tedious little problem in knitting. I had a knitting pattern, and there’s just a particular little counting algorithm for where to put increases in your pattern that always trips us up. And I was about to go, “Oh, I gotta go look this up,” and then I thought, “You know what, I’m gonna see if ChatGPT can do this.” And it did that really well. 
And by doing that in an area where I kind of knew what to expect, I could also push its parameters a little bit and make sure: is this plausible? Does what it’s given me… [LAUGHTER] does that map onto reality? And I can fact-check it a little bit better as I go along. So those are some things that I think we can do, for those who really are starting from scratch, or close to it, right now.
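Michelle’s “role, goal, and instructions” heuristic can be sketched as a small prompt template. This is purely illustrative; the function name and example wording are invented for this sketch, not taken from the episode:

```python
def build_prompt(role: str, goal: str, instructions: str) -> str:
    """Compose a prompt using the role/goal/instructions heuristic."""
    return (
        f"Act as {role}. "          # role: the perspective the tool adopts
        f"I am trying to {goal}. "  # goal: what you want to accomplish
        f"{instructions}"           # instructions: the concrete task
    )

# Example: giving directions rather than just asking a question
prompt = build_prompt(
    role="an experienced writing tutor",
    goal="improve the clarity of a draft paragraph",
    instructions="Point out the three weakest sentences and suggest revisions.",
)
print(prompt)
```

The same template would fit Michelle’s knitting example (“act as an expert knitter…”) or role playing a college student reading your syllabus.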

John: You’re suggesting that faculty should think about how AI tools such as this… and there’s a growing number of them; it seems more are coming out almost every week… how they might be useful in your discipline and in the types of things you’re preparing students for, because, as you suggested, it’s very different in different contexts. It might be very different if you’re teaching students to write than if you’re teaching them psychology or economics or math. It’s always tempting to prepare students for the way we were prepared for the world that we were entering in our disciplines, but as you suggest in the article, we really need to prepare students for the world that they’re going to be entering. Should people be thinking about how students are likely to be using these tools in the future and then helping prepare them for that world?

Michelle: Yeah, that’s a really good way to start getting our arms around this. In the thinking that I’ve been doing, going through this over the last couple of months… that just absolutely keeps coming up as a recurring thing: this is so big, complicated, and overwhelming, and means very different things for different people in different fields. Being able to divide and break down that problem is so important. So, yeah, I do think that. For example, one of the very basic things that I’ve made some baby steps toward using myself is that ChatGPT is really good at reformulating content that you give it, expanding or condensing it in particular. The other day, for example, I was really working to shape a writing piece, and I had sort of a longer overview and needed to take it back down to basics and give myself some ideas as a writer. So I was not having it write any prose for me. But I said, “Okay, take what I wrote and turn it into bullet points,” and it did a great job at that. I had a request recently from somebody who was looking at some workshop content I had and said, “Oh, we really want to add on some questions where people can test their own understanding.” And you know, as the big retrieval practice [LAUGHTER] advocate and fan of all time, I’m like, “Oh, well, that’s a great idea. Oh, my goodness, and I’m gonna have to write this, I’m on a deadline.” And here too, I got, not a perfectly configured set of questions, but a really good starting point. I was able to really quickly dump in some text and some content and say, “Write this many multiple choice and true/false questions.” And it did that really, really well. So those are two very elementary examples of things that we can get in the habit of doing as faculty, and as people who work with information and knowledge in general.

Rebecca: I’ve used ChatGPT quite often to get started on things too: to generate design prompts, all kinds of things, and to have it revise and add things and really get me to think through some things, and then I kind of do my own thing. But I use that as a good starting point so I don’t have a blank page.

Michelle: Absolutely. Yeah, the blank page issue. And I think where we will need to develop our own practice is to say, “Okay, make sure we don’t conflate or accidentally commingle our work with ChatGPT’s,” as we figure out what those acceptable parameters are. But that reminds me too… I mean, we all have the arenas where we shine and the arenas where we have difficulty as faculty, as working professionals. I know graphic design is your background. I’m terrible; I’m great at words. But it reminds me, one of the things that I made myself go and experiment with was creating a graphic, just for my online course that’s running right now, which, for me, would typically be kind of an ordeal of searching and trying to find something that was legitimate to use, and a lot of clipart. And I had it generate something. Now, I do not advise putting in something like “exciting psychology image in the style of Salvador Dali” [LAUGHTER] and seeing what comes out. He was not the right choice. It was quite terrifying. But after a lot of trial and error, I found something that was serviceable. And there too, it’s not like I need to develop those skills; if I did, I would go about that very, very differently. But it’s something that I need in the course of my work, and it’s a little outside of my realm of expertise. So it was helpful there too. So yeah, the blank page… I think you really hit on something there.

John: Now did you use DALL-E or Midjourney or one of the other AI design tools to generate that image?

Michelle: Oh my goodness. Well, here again, [LAUGHTER] how far out of my proverbial comfort zone I was is really going to show. I did use DALL-E, and I really wrestled with it for a couple of reasons. As a non-graphics person, it did not come easily to me. Midjourney as well: if you’re not a Discord user, you’re really fighting to figure out that interface at the same time, and for those who are familiar with the cognitive load concept, [LAUGHTER] “I’m trying to focus on this project, but all this other stuff is happening.” And then I had a good friend who’s a computer engineer and designs stained glass as a hobbyist [LAUGHTER] who kind of took my hand and said, “Okay, here’s some things you can do.” It actually came up with something a lot prettier, I have to say.

John: You had just mentioned two ways in which faculty could use this: to summarize their work or to generate some questions. Not all faculty rely on retrieval practice in an optimal manner. Might this be something that students can use to fill in the gaps when they’re not getting enough retrieval practice, or when they’re assigned more complex readings than they’re able to handle?

Michelle: Yeah, having the expertise is part of it, and I think we’re going to see a lot of developing understanding of that really cool tradeoff and handoff between our expertise and what the machine can do. I’m kicking around this idea as well, so I’m glad you brought that up. A nice side effect could be a new era for retrieval practice, since one limiting factor there is getting quality prompts and questions for yourself. It’s funny, one of the things that I did do… I’m taking a little prompt engineering course right now to try to build some of these skills and facility with it. And one of the things they assigned was a big, dense article [LAUGHTER] on prompt engineering, which was really great, but a little out of my field, and so I’m going, “Well, did I get that?” And then I thought, I’d better take my own medicine here and say, “Well, what’s the best way to ensure that you do, and to find out if you don’t have a good grasp of what you were assigned?” So I gave it the content, I gave it, again, a role, a goal, and some instructions, and said, “Act as a tutor or a college professor. Take this article, and give me five questions to test my knowledge.” And then I told it to evaluate my answers [LAUGHTER] and see whether they were correct. So that was about as meta as you can get, I think, in this area right now. So I’ve done it. And here again, it does a pretty good job, actually an excellent job. Do you want to use it for something super high stakes? Probably not, especially without taking that expert eye to it. But wow, here’s content that was challenging to me personally. It did not come with built-in retrieval practice, or a live tutor to help me out with it. I read it, and I’m going, “I don’t know, I don’t have a really confident feeling.” So I was able to run through that. 
And so yeah, that could be one of the initial steps that we suggest to students as a potentially helpful and not terribly risky way of using these really powerful new tools.
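The self-quizzing workflow Michelle describes (hand the tool the assigned content, give it a role, a goal, and instructions to write questions, then ask it to grade your answers in the same session) might look like this two-step prompt sketch. The wording is hypothetical, not her actual prompts:

```python
def quiz_prompts(article_text: str, n_questions: int = 5) -> tuple[str, str]:
    """Build a two-step prompt sequence for AI-assisted retrieval practice."""
    # Step 1: role + goal + instructions, followed by the assigned reading.
    generate = (
        "Act as a tutor or a college professor. "
        "I want to test my understanding of the article below. "
        f"Write {n_questions} questions about it, then wait for my answers.\n\n"
        + article_text
    )
    # Step 2: sent after you reply with your answers, relying on the
    # tool's conversation memory within the session.
    evaluate = (
        "Now evaluate my answers: tell me which ones are correct, "
        "and explain any mistakes I made."
    )
    return generate, evaluate

first, second = quiz_prompts("(paste the assigned article text here)")
```

As Michelle notes, the output still needs an expert eye; this is a low-stakes study aid, not a grader.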

Rebecca: One of the things that this conversation is reminding me of, along with some of the others that we’ve had about ChatGPT, is that we talk a little bit about how students might use it in an assignment, or how we might coach a student to use it, but we don’t often talk a lot about ways that students might just come to a tool like this on their own, and how they’re going to use it without us having any [LAUGHTER] impact. I think we often jump to the conclusion that they’re gonna have a tool write a paper or whatever. What are some other ways that we can imagine, or roleplay, or experiment in the role of a student, to see how a tool like this might impact learning?

Michelle: So that is another neat running theme that comes up, I think, with these AI tools: role playing. I mean, this is what it’s essentially doing. And so having us roleplay the position of a student, or having it evaluate our materials from the perspective of a student, could be useful. But it kind of reminds me: let’s not have a total illusion of control over this. I think, as faculty, we have a very individualistic approach to our work, and I think that’s fine. But yeah, there’s a lot happening outside of the classroom that we should always have in mind. Just like with me and that hyper-planned first course I was going to be teaching: it just happened, and students were already out there experimenting with “Oh, here’s how I can complete this basic statistics assignment with the assistance of this tool I’m going to teach myself.” So that could be going on, almost certainly is going on, out there in the world of students. And it’s another time to do something which I know I have to remind myself to do, which is ask students and really talk to them about it. Early on, I think there was a little bit of, “Oh, this is sort of a taboo or a secret, and I can’t talk to my professors about it even if I want to broach it,” and professors didn’t want to broach it with their students, because we don’t want to give anybody ideas or suggest some things are okay when they’re not. But I think we’re at a good point to just level with our students and ask them, “How do you think we could bring this in?” I think next semester I’m going to run maybe an extra credit assignment and say, “Okay, we’re gonna have a contest; you get bragging rights, and maybe a few points, for “What is a good creative use of this tool in a way that relates to this class? 
Or can you create something, kind of a creative product or some kind of a demonstration that in some way ties to the class?” And I’ve learned through experience when I’m stumped, and I don’t quite know where to go with a tool or a technique or a problem, take it to the students and see what they can do with it.

Rebecca: I can see this as a real opportunity to just ask the students how they’re using it, and then take a look at the results that it’s creating. And this is where we can provide some information about how expertise in a field [LAUGHTER] could actually make that result better, and why the result is or isn’t what they think it is.

Michelle: Absolutely, and some of the best suggestions that I’ve seen out there… I’m eagerly consuming suggestions across a lot of disciplines, as much as I can. The most intriguing ones I’ve seen are things with a media literacy and critical thinking flair that tell students, “Okay, here’s something to elicit from your AI tool that you’re using, and then we, from our human and expert perspectives, are going to critique that and see how we could improve it.” So here too, critical thinking and those kinds of evaluation skills and abilities are some of the most prized things we want students to be getting in higher education. And they are simultaneously, for many different reasons, some of the hardest. So if we can bring that to bear on the problem, I think that can be a big benefit.

John: In the article, you suggested that faculty should consider introducing some AI based activities in their classes. Could you talk a little bit about some that you might be considering or that you might recommend to people?

Michelle: One of the things that I am going to be teaching, actually for the first time in a very long time, is a writing in psychology course, which has the added challenge of being fully online asynchronous, so that’s going to be coming up pretty soon for me. It’s still under construction, as I’m sure a lot of our activities are, and a lot of the things we’re thinking about in this very fluid and rapidly developing area. I think things like outlining, things like having ChatGPT suggest improvements, and finding ways for students to also track their workflow with that. I do think that our different professional [LAUGHTER] lives, as I mentioned in the article, should really lead the way: what work are we doing as faculty and as scholars in our particular areas? One of the things we’re going to have to be looking at is, alright, how do I manage any output that I got from this, knowing what belongs to it and what was generated by me? What have I already asked it? If there are particularly good prompts, how do I save those so I can reuse them? …another really good thing about interacting with the tools. But I’m playing around with some different ideas about having students generate maybe structures or suggestions that they can work off of themselves, and having ChatGPT give them some feedback on what they’ve developed so far. So one of the things you can ask it to do is critique what you tell it, so [LAUGHTER] you can say, “Okay, improve on this.” And then you can repeat, you can keep iterating on that, and you can keep fine-tuning in different areas. You can also have it improve on its own work. So once it makes a suggestion you can… I mean, it’s virtually infinite what you can tell it to go back and do: refocus, expand, condense, add and delete, and so on. So that’s kind of what I am shaping right here. 
I think too, at the Introduction to Psychology level, which is the other level that I frequently teach, I’m not incorporating it quite yet. But I think having students have the opportunity or option to create a dialogue, an example, maybe even a short play or skit that it can produce to illustrate some concepts from the book… and there ChatGPT is going to be filling in all the specifics, the student won’t be doing it, but it’ll be up to them to say, “Well, what really stood out to me in this big, vast [LAUGHTER] landscape of introductory material that I think would be so cool to communicate to another person in a creative way?” And this can help out with that. I’m also going to be teaching my teaching practicum for graduate students coming up as well. And, of course, I want to incorporate the latest state-of-the-art information about it. But also, I haven’t tried it myself yet, but supposedly it’s pretty good at structuring lesson plans. We don’t do formal lesson plans the way they’re done in K through 12 education, of course, but you can give it the basics of an idea and then have a plan that you’re going to take into a course, since one of the things they do in that course is produce plans for courses. And I gotta say, the formatting, and exactly how that’s all going to be laid out on the page, is not a critical skill; it’s not what they’re in the class to do. It’s to really develop their own teaching philosophy, knowledge, and the ability to put those into practice in a classroom. So if it can be an aid to that, great. And I also want them to know what the capabilities are, if they haven’t experimented with them yet, so they can be very aware of that going into their first classes that they teach.

Rebecca: When you mentioned the example of a writing-intensive class that’s fully asynchronous online, I immediately thought of all of the concerns [LAUGHTER] and barriers that faculty are really struggling with in highly writing-intensive spaces and fully online environments, especially around things like academic integrity. Can you talk a little bit about [LAUGHTER] some of the things that you’re thinking about as you work through how you’re going to handle AI in that context?

Michelle: As I’ve been talking with other faculty right now, one of the things that I really settled on is the importance of keeping these threads of the conversation separate, so I’m really glad we’re piecing that out from everything [LAUGHTER] else. Because once again, it’s just too much to say: on the one hand, how do I prepare students and give them skills they might need in the future? How do I use it to enhance learning? And oh my gosh, is everybody just going to have AI complete their assignments? It’s kind of too much at once. But once we do piece that out… as you might pick up, I’m a little enthusiastic about some of the potential, but that does not mean I don’t think this is a pretty important concern. So I think we’re going to see a lot of claims about “Oh, we’re going to AI-proof assignments,” and I think probably many of your listeners have already run across AI detection tools and the severe problems with those right now. So I think we have to just say right now, for practical purposes: no, you cannot really reliably detect AI-written material. If you’re teaching online especially, I think we should all just say flat out that AI can take your exams. If you have really conventional exams, as I did before [LAUGHTER] this semester in some of my online courses, it can take those. And just to drive home to folks: this is not just simple pattern matching, looking up your particular question that floated out into a database. No, it’s processing what you’re putting in. And it’s probably going to do pretty well at that. So for me, I’m thinking about a lot of these, in my own mind, more as speed bumps. I can put speed bumps in the road, and I want to know what speed bumps are going to at least discourage students from just dumping the class work into ChatGPT. 
To know what’s effective, it really helps to go in and know what it does well and what it really stumbles on; that will give you some hints about how to make it less attractive. And that’s what I’m settling on right now myself, and what I’ve shared with students, as I’ve spoken with them really candidly: I’m not trying to police or catch people, and I am not under an illusion that I can just AI-proof everything. I want to remove obvious temptation. I want to make it so a student who otherwise is inclined to do the right thing, wants to have integrity, and wants to learn doesn’t go in feeling like, “Oh, I’m at a disadvantage if I don’t just do this, it’s sitting right there.” So creating those nudges away from it, I think, is important. And yeah, I took the step of taking conventional exams out of the online class I’m teaching right now, and I have been steadily de-emphasizing them more with every single iteration. I think those who are into online course design might agree: well, maybe that was never really a good fit to begin with. That’s something that we developed for face-to-face environments, and we just transplanted it into that environment. But I sort of ripped off that [LAUGHTER] bandaid and said, “Okay, we’re just not going to do this.” I’ve put more into the other substance of the course; I put in other kinds of interactions. Because if I ask them Psychology 101 basic test questions, even if I write them fresh every time, it can answer those handily, it really can.

John: Recently, someone ran the micro and macro versions of the Test of Understanding in College Economics through ChatGPT. And I remember that, on the macro version, ChatGPT-4 scored at the 99th percentile on this multiple-choice test, which is basically the type of thing that people would be putting in their regular tests. So it’s going to be a challenge, because many of the things we use to assess students’ learning can all be completed by ChatGPT. What types of activities are you thinking of using in that online class that will let you assess student learning without assessing ChatGPT’s or other AI tools’ ability to represent learning?

Michelle: Well, I’ll share one that’s pretty simple, but that I was doing anyway for other reasons. So just to take one very simple example of something that we do in that class: I really got a big kick out of Kahoot!, especially during the heyday of fully hybrid teaching, where we were charged as faculty… I know at my institution… with having a class that can run synchronously with in-person and remote students at the same time, and run [LAUGHTER] asynchronously for students who need to do their work at a different time. And that was a lot, and Kahoot! was a really good solution to that. It’s got a very K through 12 flavor to it, but most students just really take a shine to it anyway. And it is familiar to many of them from high school or previous classes. So it’s a quiz game; it runs a timed, gamified quiz. So students are answering test questions in these Kahoot!s that I set up. And because it has that flexibility, they have the option to play the quiz game sort of asynchronously on their own time, or we have those different live sessions that they can drop in on and play against each other and against me. So that’s all great. But here’s the thing: prior to ChatGPT, I said I don’t want to grade this on accuracy, which feels really weird, right, as a faculty member, to say, well, here’s the test and your grade is not based on the points you earn for accuracy. It’s very timed, a little hiccup in the connectivity you have at home can alter your score, and I just didn’t like it. So what students do for their grade is a reflection. I give the link to the Kahoot!, you play it, and then what you turn in to me is this really informal and hopefully very authentic reflection, saying, “Well, how did you do? What surprised you the most? 
Were there particular questions that tripped you up?” And also getting them to say, “Well, what are you going to do differently next time?” And for those who are big fans of teaching metacognition, I mean, that comes through loud and clear, I’m sure. So every single module they have this opportunity to come in and say, “Okay, here’s how I’m doing, and here’s what I’m finding challenging in the content.” Is it AI-proof? Absolutely not. No, it really isn’t. But it is, I think, at least at that tipping point where you’d have to go through real contortions to come up with something that’s going to pass the sniff test with me… and I’ve now read thousands of these, so I know what they tend to look like. And Kahoot!s are timed. I mean, could you really quickly transfer the questions out and type them in? Yes. It’s simply a speed bump, but the time would make it a real challenge to toggle back and forth. So I feel good about having that in the class. And it’s something, again, I’d been developing for a while; fortunately, I didn’t just come up with it the minute that ChatGPT impinged on this class, it was already in place. And I was able to elevate that and have that be part of it. So they’re doing that. I also do a lot of collaborative annotation, which I continue to be really happy with… I use Perusall. I know that’s not the only option there is, but it’s great. They’ve got an open textbook, and they’re in there commenting and playing off each other in the comments. So that is the kind of engagement I think we need anyway, and it is less of a temptation. And so I feel like that’s probably better than having them try to quickly type out answers to, frankly, pretty generic definitions and so on that we have in that course. Some people are not going to be happy with that, but that’s really truly what I’m doing in that course instead.

John: Might this lead to a bit of a shift toward more people using ungrading techniques with those types of reflections, as a way of shifting the focus away from grading, which would encourage the use of ChatGPT or other tools, and toward learning, which might discourage them from being used inappropriately?

Michelle: What a fantastic connection. And you know what? When I recently led a discussion with faculty in my own department about this, that is actually something that came up over and over, even if it wasn’t called ungrading, because not everybody is conversant with that concept. But there are these trends that have been going on for a while of saying, you know, is a timed multiple-choice test really what I need everything to hinge on in this online course? Ungrading… I think there’s this emerging idea I’ll call both-sides-ism, or collaboration between student and teacher, which I think was also taking root through pandemic teaching, and that came to the forefront with me of just saying, “Okay, we’re not going to just be able to keep running everything the way it’s traditionally been run,” which sometimes does have that underlying philosophy of, “Okay, I’m going to make you do things, and then you owe me this work, and I’m going to judge it, and you’re going to try to get the highest points with the least effort.” I mean, that whole dynamic is what I think powers this interest in ungrading, which is so exciting, and it’s maybe going to be pushed ahead by this as well. Ultimately, the reason why you’re going to do these exercises I assign to you is because you want to develop these skills. You are here for a reason, and I am here to help you. So that is, I think, a real positive perspective we can bring to this, and I would love to see those two things wedded together. Especially now that tests can be taken by ChatGPT, we should relook at all of our evaluation and the underlying philosophy that powers it.

John: One of the concerns about ChatGPT is that it sometimes makes mistakes, it sometimes will make stuff up, and it’s also not very good with citations. In many cases, it will just completely fabricate citations, where it will get the names of people who’ve done research in the field, but will grab other titles or make up other titles for their work. Might that be a way in which we could give students an assignment to use one of these tools to generate a paper or a summary on some topic, but then have them go out and verify the arguments made and look for citations and document it, just as a way of helping prepare them for a world where they have a tool which is really powerful, but is also sometimes going off in strange directions, so that they can develop their critical thinking skills more effectively?

Michelle: Yeah, looping back to that critical thinking idea. Could this also be a real way to elevate what we’ve been doing and give us some new options in this really challenging and high-value area? And yes, this is another thing that I think faculty hopefully will discover and get a sense of as they experiment themselves. I think probably a lot of us have also experimented with, just ask it about yourself. Ask it, what has Dr. Michelle Miller written? There’s a whole collaborator [LAUGHTER] I have never heard of, and when it goes off the rails, it goes. And it’s one thing to say really kind of super vaguely, “Oh, AI may produce output that can’t be trusted.” That has that real, okay, caution, but not really, feel to it. It’s a whole other thing to actually sit with it and say, alright, have it generate these citations. They sure do look scholarly, don’t they? They really look right. Okay, now go check them out. And say, this came out of pure thin air, didn’t it? Or it was close, but it was way off in some particular way. So as in so many areas, to actually have the opportunity to say, okay, generate it and then look at it, and some of the issues are staring you right there in the face. So I think that we will see a lot of faculty coming up with really dynamic exercises that are finely tuned to their particular area. But yeah, when we talk about writing, all kinds of scholarly writing and research in general, I think that’s going to be a very rich field for ideas. So I’m looking forward to seeing what students and faculty come up with there.

Rebecca: That’s a nice lead into the way that we always wrap up, Michelle, which is to ask: “what’s next?”

Michelle: Well, gosh, alright. So I’m continuing to write about and disseminate all kinds of exciting research findings. I’ve got my research-based Substack; that’s still going pretty strong. Over the summer, I actually focused it on ChatGPT and AI for a couple of months, but now I’m back to more general topics in psychology, neuroscience, education, and technology. So, articles that pull in at least three out of four of those. I’ve got some other bigger writing projects that are still in the cooker, and so I’ll leave it at that with those. And I’m continuing to really develop what I know about and what I can do with ChatGPT. As I was monitoring this literature, it was really very clear that we are at a very, very early stage of scholarship and applied information that people can actually use. Those are all things that are very much on the horizon for my next couple of months.

Rebecca: Well, thank you so much, Michelle, we always enjoy talking with you. And it’s always good to think through and process this new world with others.

Michelle: Absolutely.

John: It certainly keeps things more interesting and exciting than just doing the same thing in the same way all the time. Well, thank you.


John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.


308. Design for Learning

We tend to design courses for ourselves because we are the audience we know best. In this episode, Jenae Cohn joins us to explore how user-experience design principles can help us create effective and engaging learning experiences for the students we have right now. Jenae is the Executive Director of the Center for Teaching and Learning at the University of California at Berkeley. She is the author of Skim, Dive, and Surface: Teaching Digital Reading. Her newest book, co-authored with Michael Greer, is Design for Learning: User Experience in Online Teaching and Learning.

Show Notes

  • Cohn, J. (2021). Skim, dive, surface: Teaching digital reading. West Virginia University Press.
  • Cohn, J., & Greer, M. (2023). Design for learning: User experience in online teaching and learning. Rosenfeld Media.
  • Global Society of Online Literacy Educators
  • Horton, S., & Quesenbery, W. (2014). A web for everyone: Designing accessible user experiences. Rosenfeld Media.
  • Web Accessibility Guidelines
  • Copies of Design for Learning may be ordered at the Rosenfeld Media website. The discount code for listeners is TEA20. It’ll be available on Wednesday, 9/27 and will give listeners access to 20% off the book for one month (i.e. 30 days).


John: We tend to design courses for ourselves because we are the audience we know best. In this episode, we explore how user-experience design principles can help us create effective and engaging learning experiences for the students we have right now.


John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.


Rebecca: Our guest today is Jenae Cohn. Jenae is the Executive Director of the Center for Teaching and Learning at the University of California at Berkeley. She is the author of Skim, Dive, and Surface: Teaching Digital Reading. Her newest book, co-authored with Michael Greer, is Design for Learning: User Experience in Online Teaching and Learning. Welcome back, Jenae.

Jenae: Thank you. I’m so glad to be back.

John: It’s good to see you again.

Jenae: …Good to see you, too.

John: Today’s teas are… Jenae, are you drinking any tea?

Jenae: I sure am. I’m always prepared to drink tea. Especially when I’m talking to the two of you. But I went for a classic English breakfast tea this morning. Do you both have some tea with you?

Rebecca: Yeah, I have English tea time.

Jenae: We’re matching…

Rebecca: Yeah…


John: And, I’m not. I have [LAUGHTER] ginger peach black tea today.

Jenae: That sounds really good, though.

John: It is.

Rebecca: Sounds like a good way to start the day, for sure. So we invited you here today to discuss Design for Learning. Can you talk a little bit about how this book project came about?

Jenae: Absolutely. So my colleague Michael and I have a lot of shared interests. Michael and I both are trained in rhetoric and composition. And we both are people really interested in online writing, online reading, and online learning, broadly speaking. We both served on the board for the Global Society of Online Literacy Educators, which is an organization dedicated to supporting folks who teach reading and writing online, broadly speaking. Through that organization, we got to know each other better. And we just realized how much we wanted to talk about what it really meant to create quality online learning experiences. And something kept cropping up for the two of us. I should say that both of us have had like a hodgepodge of jobs in and around higher education. We kind of joke that we were both sort of like these misfits in higher ed, people who have kind of done a bit of teaching, a bit of admin. He’s worked in publishing; I did a lot of work in instructional design and just higher education pedagogy. And something we noticed, just in the various roles that we were in, was that educators, professors, faculty could learn a lot from user experience frameworks. And we were reading a lot about UX and UI in the work that we were doing around instructional design and, for him, publishing, and it just dawned on us: why are we not bridging these conversations between the work of thinking about designing learning interfaces and the work of building really good, high-quality learning experiences? I think we noticed that in higher ed, there is this tendency to kind of try and reinvent the wheel around defining what a good teaching experience, especially what a good online teaching experience, is by just creating really kind of exhausting templates and tons of checklists and rules. And we really thought those are useful, but wouldn’t it be more useful just to remember that students are people navigating devices online?
And can’t we use the frameworks that help inform those design decisions to inform the design of learning experiences to make them better? So that was really the genesis of this project. We started off thinking we’d write a bunch of blog posts, and then it struck us that blogs and articles were great, but wouldn’t it be even better if we wrote a book? [LAUGHTER] So we put it all together, and it resulted in this book.

John: So who’s the intended audience of this book?

Jenae: We really are targeting a broad audience with this book, I’d say primarily with folks who do instructional-design-style work in mind. So in higher ed, that could be faculty; a lot of faculty play the role of instructional designers, as well as facilitators and teachers, of course. But we also hope that this book would really reach folks who do dedicated instructional design support. We also hope that this would just reach people who are having to teach online or do trainings or workshops online, and who are still really struggling with it. This book, I would say, was written before the pandemic happened. We were drafting and conceptualizing it before the pandemic, and of course, the pandemic shaped the drafting as we went. There’s still COVID-19 out there, so I don’t want to say we’re beyond the pandemic. But in this moment where we’re beyond perhaps like a peak point of the pandemic, let’s just say, there may be folks who are still wanting to be more intentional about what it means to provide more equitable access to online learning experiences, who want to be designing in a more intentional way, and who want to be really thinking critically about how to create more sustainable online learning experiences as well that really work. I think we were also on a mission with this book to prove that really, anyone can do this; you just need to keep some known principles in mind. Again, this is not totally new territory, and scholars in user experience and human-computer interaction have been thinking for a very long time about how to make information accessible online, and how to make sure that information and interactions are easily navigable. And so that was really the literature we wanted to tap into. So that’s all to say that I think the people who benefit from reading this book are really anyone who wants to be creating a better online learning experience for whatever teaching situation they’re in.

Rebecca: I’m, of course, super excited about this book, because I’m a UX designer. I love that you use that framework to write this book. Can you talk a little bit about why you chose this approach?

Jenae: Absolutely. I’m so glad that you appreciate that this book exists. We’ve gotten really good reception from the UX community on it as well. I would say that we use this framework because we felt like it really centered the learner in an important kind of way. I think that in a lot of teaching situations, people who educate or design learning are often thinking more about the content: What information do I have to deliver? What are the main things that I need to make sure people know how to do? Those aren’t bad things to focus on; we need to cover content, and we need to make sure that there are clear outcomes. But I think it’s most important to really think about how is someone engaging with that content? How are they understanding it? What are their opportunities to understand that content in a variety of different ways? And I think what a user experience framework allows us to do is to center that reminder. Learners have these embodied experiences that shape how well they’re going to be able to learn, how well they’re going to be able to interface with the information. And if we’re talking about that in an online context, in particular, it’s impossible to do so without addressing what it means to, again, engage with and use these online environments effectively. So I think a UX framework really just allows us to be more centered in reminding ourselves who really benefits from the learning experiences we design, and who really needs to have access to [LAUGHTER] the information to be successful. And I think UX frameworks just really help us center that.

John: Can you talk a little bit about how this approach centers the user in terms of practical ways in which that’s built into the design process?

Jenae: Sure. One way to sort of think about that is to really take a step back and try to remind yourself just who is taking your class in the first place. Start there, from a place of trying to be curious about who your learners really are. I think that it’s easy to make assumptions, I’ll just say, in higher education in particular, since I think that’s primarily the audience for this particular podcast. I think a common misconception, for example, is that all students entering the class are traditional college age, 18 to 21 years old. But, like, I should put a big asterisk on that and say that’s probably not the typical age at most institutions anymore; that’s just the stereotype of who a college student is. And there may be some assumptions about what their prior learning experiences were like that brought them into a college classroom… about the prior knowledge that they had. And so what I think user-centered design encourages us to ask is, “Do we know that? How do we know that? What information do we need to gather to remember who’s actually coming into our rooms?” And I’m not suggesting that any educator has to, like, do deep-dive demographic data work to find out who their learners are. But I think most of us can kind of anticipate the range of people who are coming into our classes. We might anticipate just the different types of learners that we may encounter. And by that, I mean it’s worth, I think, before you start designing, just trying to remember: What are the different motivations that students have for coming into the class? What are their purposes for being there? What are the main things that students are going to want to do by being in your class or your training or your workshop at any given moment? So start by just sort of trying to map out who those people are.
And then try to anticipate: okay, given this motivation, or this purpose that this learner may have, what kinds of things might they be looking for… literally looking for in my online course? What things will they click on first? Which links are they going to want to access most frequently? Which resources are going to benefit them on the site most? And then try to design your learning management system course site, or if you’re not using a learning management system, your course website, broadly speaking, to really privilege the resources, the links, the activities, the pages that are going to be best aligned with what you anticipate your users or your learners may need. And Rebecca, I’m sure, can speak to this given her expertise, too, but UX design really is a whole process of trying to consider how the visual information, how even, like, the tactile information, say how your keyboard is set up, how your device is set up, how all that allows you to most easily use and engage with the products, so to speak, that you’re building. And in this case, we want to think about how you can build the best online course that you can, in a way that allows users to most easily find the information you anticipate they will most frequently need.

Rebecca: So one of the things I’m hearing you say, is really thinking about the wide variety of learners that we have and the different needs that they have and trying to address that. One of the things that’s really popular in UX design and that you talk about in your book are personas. Can you talk a little bit about how learner personas can help us think through the different kinds of learners that we have in our class in a really practical, tangible way. You just kind of provided that theoretical framework, but I love that the personas is such a practical application of that.

Jenae: Yes, thanks for asking that, Rebecca. I was debating whether to dive into that with the last question, but let’s dive into it now. So for those who aren’t familiar, personas are an exercise where you really try to create a character sketch, I would say, of the user you’re imagining is going to engage with your course, or in this case, try to imagine an example of a student who’s going to be in your class. And by creating a character sketch, I mean I encourage instructors, if they have the time, to sit down and say, “Okay, what might be the name of someone in my class? What might be their age? What might their prior experiences with learning my topic have been? Why are they here? What brings them to college? Or what brings them to this class in the first place? What are going to be some of their biggest challenges? What are going to be some of their biggest hopes? What are the things that they’re going to be most excited about doing in this class?” And again, it’s a bit of an imaginative exercise. And so I think it’s easier to do with more teaching experience, but it’s also not impossible to do even if you’ve had relatively limited experience. It’s really just an exercise in trying to think through who might be the real people that you are engaging with. I do want to say that there’s been a lot of conversation in the UX community, and again, Rebecca, you may have some thoughts on this too, about the stereotypes that personas can sometimes perpetuate. For example, I think there have been concerns in the UX community that when you try to characterize, say, an older user of an online interface, a stereotype might be that they struggle more or are more challenged with using technology than, say, a younger user. And that that might be a challenge to anticipate.
And so I want to be mindful, for example, that if you are going to be in the practice of building personas, which we talk about in the book, because I do think it is a useful exercise to kind of try and make concrete for yourself who is going to be on the receiving end of your experience, that you do try to check yourself a little bit on reinforcing stereotypes, to the best of your ability. It’s easy to do; stereotypes exist because we notice patterns sometimes in how people behave. And that can sort of reproduce some harmful assumptions about who those users are. But again, to the best of your ability, attempt to anticipate what the needs might be based on what you do know about who might be in the room, just again kind of reminding yourself that you’ll want to think about your personas in nuanced ways, and not necessarily make assumptions about who they are. And for the question of how am I supposed to write a generalized description of a persona while avoiding all possible stereotypes about who they might be, I would say, again, time allowing, try to run your personas by other people, and just see what their reactions are to reading them. For example, if you have a trusted colleague, or a friend who teaches a similar class, or who you work with regularly, just show them what you’ve drafted and say, “Does this feel like a real person to you?” And attempt to ask diverse people about how your persona sketches are landing or how realistic they feel to them. That’s always a good way to kind of gut check, and just make sure that as you’re anticipating your users’ needs, you’re not falling too much into your own biases about who the people are that you’re supporting in your course.

Rebecca: So one of the things, I think, people do sometimes run into when they’re making personas is to create the ideal student that doesn’t exist, and also to recreate themselves. And so one strategy that I often recommend is creating aggregates of people that you do know, because then they’re more realistic in terms of the way they might interact. So if you’ve taught a class before, you might have a real pool of people you could draw from [LAUGHTER] to create a persona. Obviously, that’s more difficult when it’s a new place. And I was also going to offer up, in terms of thinking about disability and thinking about accessibility, that there’s a book called A Web for Everyone. It was published quite a while ago, but they still have a lot of resources online, including some personas for people with a wide range of different kinds of disabilities. And sometimes that can be really useful in just thinking through kinds of scenarios that you might not think of on your own.

Jenae: That’s fabulous. I would love to see that resource about sort of supporting accessibility, especially. That’s such a huge issue in designing online learning experiences, particularly. I’m so glad you mentioned those resources. That’s fantastic.

John: And while there may be those types of biases that you might have, those who’ve taught classes multiple times do know some of the types of problems that past students have had. So those issues that they’ve experienced in the past could be built in. But one of the other things you suggest is doing a pre-course survey, so that you get some more information about the actual students in the room rather than those you may have been thinking about when you initially designed the course. Could you talk a little bit about that survey?

Jenae: Yes, I’d be happy to talk about the pre-course surveys. So this is a practice that, I think, has multiple benefits. So in a pre-course survey, I think instructors have this wonderful opportunity just to ask students what their motivations are for engaging with the class, what brought them here, how they would characterize some of their prior experiences with learning similar topics, if any, and just to voice what concerns they have, or what things are exciting to them about the term ahead. I’m giving a lot of examples of possible questions, and I just want to acknowledge that not all instructors will want to ask all of those questions all at once. But those kinds of questions that really get at motivation and concerns, I would say, in a nutshell, can be really critical, both for adjusting, I think, those persona expectations. So, creating personas should be an iterative process, I should say, as well. It’s not a one-and-done thing where you anticipate who your learners are prior to the course starting and then you’re like, “Okay, I figured it out, I know who all the students are.” Knowing who the real students are can then allow you to go back to what you anticipated. I think both of you, Rebecca and John, were speaking to how you could use information from prior terms to inform your current term or current course. Great, you could sort of just align your prior understanding with this current information you might get from these surveys to then go into your course website, or your course learning management system, your syllabus, and say, “Okay, is this design going to work for the group of people who are actually here, based on what I’m reading?” …recognizing, of course, that nothing’s gonna be perfect for everyone. But you can do the best you can to try and make the materials as good as possible for the group that you have in front of you. I would say that you want the survey to feel less burdensome for your students to complete.
I’m giving a lot of examples of questions that I think are ideal as open-ended questions. Some of these you could turn into multiple choice or kind of Likert-scale-style questions, because you can just use it as an opportunity to take the temperature. “On a scale of one to 10, for example, how confident do you feel in your ability to pick up new quantitative concepts?” …for example, if you’re teaching in a STEM-style discipline. Or, “On a scale of one to 10, how comfortable do you feel as a writer or with writing tasks?” …if you’re teaching something more humanities- or writing-centric. You can get really creative in trying to solicit some feedback. And I also encourage instructors to be judicious in what they’re asking in these pre-course surveys, to try and ask questions with the end goal of helping you, as the instructor, make small tweaks to the design of the course. Think about this information as a way to say, “Okay, are there certain links I should put on the homepage that I didn’t think needed to be on the homepage? Or should I reorganize the menu on my learning management system in a way that highlights some resources more than others, based on the information I’m getting in the survey? Should I reorganize a module to introduce some content before other content, because I’m seeing a trend in the surveys of less confidence in one area of the course than I was expecting?” So thinking about how the answers might inform your design, from a research-based perspective, really, I think, can make your course even stronger. And I think it’ll feel better, both for you and the students, because it helps the students see that you’re curious about them; you want to know who they really are. And we know that engaging personally with people really matters for good teaching. For the instructor, too, it can be really frustrating if you design something and it doesn’t land with your students.
You feel like you spent a lot of time building something that didn’t work. That’s a really disheartening experience. So getting the feedback might allow you to avoid [LAUGHTER] feeling so disappointed if the information didn’t land the way you were expecting it to. And this isn’t foolproof; there’s always room, again, for iteration. But I do think the surveys can at least help you anticipate a little bit better how the progression through your course could go.

Rebecca: I can imagine that some of those surveys with open-ended questions could lead to better understanding how students name things or label things, which could give you a lot of clues about the actual user design of a course, just in how you might name or provide quick descriptions of things. In your book, you talk a lot about instructional text design, which obviously shows up all over online learning, from instructions for assignments to just how we might label a folder [LAUGHTER]. There’s lots of skill there. Can you talk a little bit about the basic principles that you’d recommend for course designers to follow when they’re writing instructional text?

Jenae: Absolutely. And I realized, as you were talking and responding, I was nodding along, and then it struck me: “I’m on a podcast; no one’s going to know that I’m nodding and agreeing with you right now.” So [LAUGHTER] for the listeners’ sake, I was nodding along quite vigorously with that entire response. Instructional text, I think, is one of the most underrated and one of the most important things to design for any online course experience. I think that online course designers have a real tendency to rely too heavily on video and on images. There’s an assumption that if you’re working online, everyone’s just using video all the time, or everyone’s just wanting to engage with the flashiest multimedia possible. That is still important. I mean, we have two chapters in the book all dedicated to video, so I don’t want to undermine that. It is important to engage with multimodal artifacts and to build multimodal interventions when you’re teaching in a multimodal environment like the internet. However, for students who may have low internet access and low bandwidth, and for students with disabilities, text remains one of the most accessible and easiest ways to find information in an online course. I’d also say text is one of the most mobile-friendly pieces to think about, and we know that an increasing number of students are accessing their courses or coursework through their smartphones. I’ll answer your question directly now, but I wanted to provide that context. I would say when it comes to designing instructional text, I really encourage instructors to think about two big things: to think about the hierarchy of the information that they’re writing, and to think about the discrete chunks of information that they’re wanting to communicate. So when I talk about the hierarchy of text, I think it’s important when we’re writing to consider: what are the sections of our text?
Most academics, most instructors, are used to, when they’re reading or writing, creating headers, and sub-headers, and paragraphs that denote a certain order of information. And when you’re teaching online, especially, you have to think even more critically: How are you labeling the text? How are you indicating which things are instructions versus content? How are you labeling the order of the content that you want students to read in? How are you even labeling the order of instructions, since there are usually multi-tiered sets of steps? So using header text, and different layers of header text, is a really important web accessibility measure. And again, it helps readers see the structure visually, and if they’re using a screen reader tool, it helps them navigate that text more easily. I should take one step back and say that when I’m referring to header text, I mean that when you’re working in a rich text editor on any website, you can typically see an option to select different layers of headers: header ones are usually the highest, biggest-level header; header twos go below that; header threes go below those. So just being mindful that increasing text size is not the same thing as using headers is one really, really simple way to create hierarchy, and again, to denote the correct order of reading the text information. And when I say chunking text, this is as simple as just thinking about paragraphing, making sure that you are spacing out pieces of content in really critical ways. Anyone who’s read a piece of writing with a super long paragraph knows that’s a lot harder to kind of discern; it’s a lot harder to see how one idea moves to the next. Shorter paragraphs are typically easier to get a sense of when you’re moving from one idea to a new idea.
And so even though long paragraphs have their purpose, perhaps especially in scholarly writing, or even, I would say, in more creative writing in some cases, when you’re doing really instructional or technical work, which you’re often doing when you’re designing a class, shorter is better; more chunked is easier to access, because you’re assuming that people are doing things with your information. So those are the two qualities I would just be thinking about with instructional text. There’s a whole other component that we didn’t really address in the book, but that I’ll just touch on very briefly here, which is also thinking about the visual appearance of your text. A lot of accessibility folks speak to some best practices and guidelines around font face, and font size, and some of these factors when you’re designing text as well. I’m not an expert, I should say, in that type of graphic design or font sizing, but I want to point it out anyway, because I think if you are designing online, it’s important again to do the best that you can to try and anticipate those needs. So I think as a general rule: making sure your font sizes are not super teeny tiny, or super large. Making sure that you’re using standard font faces: Arial, Helvetica, any sort of sans-serif font is typically considered a best practice. The rules around this change all the time; Web Accessibility Guidelines change as technology evolves, so I never like to give super hard and fast rules, and again, it’s not my area of expertise. But it’s another piece to keep in mind, that visual and verbal information is intertwined. Text is a visual medium; online learning experiences are largely a visual medium by default. And so the more mindful we can be of what that looks like, and the more mindful we can be of how the visual experiences we design online are compatible with accommodations for disabled users, the better.
We just anticipate our users’ needs, our learners’ needs more proactively, and it raises the boats for everyone. It just gives everybody a deeper chance to succeed if we’re just thinking about these interface choices in more deliberate ways.
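The header hierarchy Jenae describes can be made concrete: assistive technologies build a navigable outline from semantic heading levels (h1 through h6), not from font size, so bold or enlarged plain text never appears in that outline. A minimal sketch, using only Python’s standard library; the sample course-page content is invented for illustration:

```python
# Sketch: how a screen reader's "jump to heading" list is derived from
# semantic heading tags rather than visual styling. Illustrative only;
# real assistive technologies are far more capable than this.
from html.parser import HTMLParser


class HeadingOutline(HTMLParser):
    """Collect (level, text) pairs for each h1-h6 element encountered."""

    def __init__(self):
        super().__init__()
        self._level = None   # level of the heading currently open, if any
        self.outline = []    # list of (level, heading text)

    def handle_starttag(self, tag, attrs):
        # Match h1..h6 only; tags like <hr> are skipped.
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self._level = int(tag[1])

    def handle_data(self, data):
        # Text immediately inside an open heading becomes an outline entry.
        if self._level is not None and data.strip():
            self.outline.append((self._level, data.strip()))
            self._level = None


# Hypothetical course page: one h1, two h2 sections, nested h3 steps.
page = """
<h1>Week 3: Research Methods</h1>
<h2>Readings</h2>
<h2>Assignment Instructions</h2>
<h3>Step 1: Choose a topic</h3>
<h3>Step 2: Annotated bibliography</h3>
"""

parser = HeadingOutline()
parser.feed(page)
for level, text in parser.outline:
    print("  " * (level - 1) + text)
```

Running this prints an indented outline that approximates the jump list a screen-reader user navigates by; a visually enlarged line of body text would not appear in it at all, which is exactly why heading levels, not font sizes, create accessible hierarchy.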

Rebecca: I love that you’re really talking about how the instructional text is also part of digital accessibility. It’s important to have plain language, it’s important to chunk your content and these sorts of things. So I’m really excited that you’re incorporating that into the work that you’re doing.

Jenae: Thank you. It is exciting. I think it’s one of these things that, when Michael and I were first discussing this book, it was a real lightbulb moment for us that there was such a robust literature out there that discussed all these great principles for making sure that online information was easy to find. And it was striking to us that a lot of folks in teaching professions weren’t getting access or exposure to that information. We started thinking about this, again, prior to the pandemic, in the mid-2010s. And even at that point, online courses were growing, and mobile access was becoming a more common way that students were engaging with courses. So, why not tap into these existing conversations, these industry best practices for engaging with online interfaces, in spaces like higher ed and learning and development, where these dialogues seem not to have met each other as fully as they could?

Rebecca: Our chief technology officer and I were having a conversation about some of these things yesterday, talking about how our student body is diversifying and that we have far more students with disabilities who are able to attend college and have access to college in a way that maybe they haven’t in the past. And as you were talking about headings and paragraphs, something that people might not know is that if you use a screen reader, you’re not necessarily visually interacting with the text. Instead, you’re interacting with it programmatically. So just like a vision-centered [LAUGHTER] user might skim headings visually, a screen reader user navigates by headings in much the same way. By choosing a heading level two, you allow someone to find that section more easily. And by breaking things into paragraphs and delineating what kind of content that is, you allow a screen reader user to jump to a particular part of the content. When we don’t do that, a screen reader user has to listen to everything from the top to the bottom of the page.

Jenae: Great example. Yes, and that’s such a frustrating experience to have to do that. If we can be just a little bit more attentive to the information architecture of what we’re trying to communicate and convey… information architecture is a technical term, but it’s also a metaphor [LAUGHTER]… we have architecture and design to help create solid foundations for the places where we live. Similarly, when it comes to information, we need to be building solid infrastructure to help people navigate their way through a course. One of my colleagues a while ago used a metaphor for online learning design I’ve never forgotten, and we’ve alluded to it a bit in the book, which is that when you’re building something online, it’s like you’re building a whole house [LAUGHTER]. When you walk into an in-person classroom, the architecture is literally there, and you make assumptions about the room and the space the second that you walk in the door. When you’re designing text online, or just when you’re thinking about the whole online learning experience, it’s a total blank canvas; you have to build that architecture and those hierarchies. If you’re not attentive, you’re absolutely right, the consequence is that it can be a big overwhelming mess of information. And I think it’s a useful practice for instructors, even when they’re not teaching online, to think about these things. It’s also just a great exercise in getting very focused on what information you want to prioritize when you’re communicating assignment instructions or picking out content-based readings for your students. What do you want them to focus on? What are the big things you really need them to learn or pay attention to?
And so if your course design, your visual design, can align with the hierarchy of choices you’re making as an instructor, or the priorities that you’re setting, it just makes it easier for everyone to have equal access to that information, so that students can spend more time focusing individually on how they’re processing, applying, and doing higher-order thinking with that information. They don’t have to spend so much energy just trying to take in the basics before they have the opportunity to really work with it and apply it meaningfully.

John: You provide a lot of other information in your book, and we encourage people to read it. If they want to find out more about creating videos or providing effective webinars, and so forth, there are some really nice hints and suggestions throughout. But one of the things you end with is ways in which instructors can continuously improve their courses by soliciting feedback to make the course better each time. Could you talk a little bit about how you would encourage instructors to continuously work on developing their courses?

Jenae: Sure. So I really like that section of the book, because what I hope it communicates is that thinking about your course design is a reflective and iterative process. I don’t think a course is ever really fully perfect and done; there are always things you can do and modify each time you teach or offer the experience. So I don’t think getting feedback on the course has to be hard, and I don’t think it has to take a ton of time. We talk about multiple ways of getting information about how the course is working, and I’m going to start with some of the easiest and most passive ways to get information, then work our way to some of the more active or personalized interventions. So one thing I think is worth really paying attention to, after you finish teaching a course, is some of the analytics that are available in your learning management system or your course website. And I recognize that some folks are really reluctant to look at the analytics, because there is a surveillance economy implicated in the tracking of course analytics. Every site on the web tracks your movements; every site on the web knows how long you’ve stayed on a certain page and what things you’ve clicked on. A learning management system is no exception. Unfortunately, that information can get weaponized to discriminate against students and users in problematic ways. On the web, outside of learning, for example, analytics can be gathered and sold to advertising companies to spread information about your activities for profit. So I just want to note that context, but you can also use this information for good and for some useful things as well.
So seeing which resources students are clicking on the most in your class can be really useful information: “Huh, seems like a lot of people found that resource useful.” You don’t have to identify which individual students looked at which resources; you can look at this data in aggregate, and most learning management systems have an analytics dashboard you can access to do that. I think that’s incredibly useful, just to see what was clicked most often and what wasn’t. You might also want to track, for example, which pieces of information students spent more time on. That could indicate a couple of different things: if students spend a lot of time on one particular piece of content over another, it might indicate that something was really challenging, or that they found it useful. You’d have to contextualize that data based on what you were seeing in the course, but if you’re willing to look at that information in the context of how your term went, it might give you some passive information that could surprise you. I would even look, for example, at assignment submissions: how many delays were there on certain assignments versus others? In which assignments did students request extensions more than others? Again, this is information that might help you judge whether the pacing was appropriate for the course, whether assignments were sequenced appropriately, that kind of thing. If you want to get more active, and if you gave a pre-course survey, you can do a post-course survey. Most institutions, of course, have formal evaluations of teaching, but we know that institutional student evaluations of teaching can be fraught. Sometimes they ask the kinds of questions we don’t always find most useful as instructors. So if you do your own very brief post-course evaluation, you can focus it on the design of the course itself.
I think it’s worth asking students at the end of the course: How easy was the course site to navigate? How accessible did the materials feel for your ability to learn? You could return to some questions from your pre-course survey. If you asked a Likert-scale question about rating their confidence with learning something on a scale of one to ten at the start of the course, you could ask by the end, “How has your rating changed?”, even referring back to the original data they submitted in the pre-course survey. So that’s another way to ask them. What I love to do at the end of the course, if it’s possible, is even a brief post-course interview with students. We mention this a bit in the book. Again, it’s time consuming, but if you have a small-ish class where you could have conferences at the end of the term, a five-minute conversation is enough to ask students: “How did it go? What aspects of the course design did you like most? Which were most challenging to you?” That’s another way to get information. Finally, I’ll mention one more technique we write about in the book, which is never to discount your own reflection on your experience. This is another form of user research. Even though you are not the end user for the course, you are the designer, and so I think it’s always useful to jot down a few notes and treat that as research when you’re done, too. What did you notice about user interactions on your course site throughout the term? What things surprised you? What things went exactly as you expected? You can use those notes to iterate and improve the experience for the next time you offer it.
So those are just a few techniques, many of which, again, are drawn from the field of user experience research; surveys and interviews, for example, are pretty common user experience research practices. There are other UX research practices too that, depending on your time and your resources, can be great: seeing students engage in the course, asking them about it, really seeing what it looks like for them to interact in the course. That’s a good way to get at good information. I just want to encourage anyone who’s teaching not to shy away from getting that kind of feedback, because it does make teaching more satisfying, I think, when you’re getting more information about what’s working and what isn’t.

Rebecca: So, you know this question’s coming…[LAUGHTER] We always wrap up by asking: “What’s next?”

Jenae: Yes, I do know it’s coming, and it’s funny, because I was thinking about it: what am I [LAUGHTER] doing next? So to be honest, I don’t have a clearly defined project. I’m doing a lot of little things, and I might be taking a little break, because I have written two books in about two and a half years [LAUGHTER]. So that’s been a lot… wonderful. I think I’ve been bitten by the writing bug, for sure, and so I suspect there’s more writing in my future, but nothing immediately next. I’m still very curious about what it’s going to mean to keep designing really good online learning experiences in the future; I don’t think we’re done with that conversation. I’m really curious about how that’s going to evolve in the context of creating more inclusive and equitable learning environments for students. So I imagine those are topics I will continue to explore to some extent, but we will see. Of course, with AI too, and its impacts on online learning, I’m sure there’s going to be a whole set of ways to think about these topics that will continue to evolve. So I’m keeping my eyes open and my ear to the ground on how things are developing, and we’ll just see what ideas emerge from there.

Rebecca: Well, it’s always a pleasure to talk to you, Jenae. Thanks for all the work that you do.

Jenae: Likewise, thank you, again, for having me and for engaging with these excellent questions. And if you listened to this podcast, we’ll put in the speaker notes, I’ll give you a little gift of a promo code. If you’d like to buy the book, we can give you a 20% off discount with thanks to Rosenfeld Press who published this book.

John: Well thank you. We’ll be sure to include that in the show notes and it’s always great talking to you.

Jenae: Wonderful, and likewise, thank you again.


John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.


307. Career Readiness

Students do not always understand how the work that they do in our classes helps prepare them for their future careers. In this episode, Chilton Reynolds and Ed Beck join us to discuss one institution’s approach to helping students understand and articulate how their course learning activities intersect with career competencies. Chilton is the Director of the Faculty Center for Teaching, Learning, and Scholarship at SUNY Oneonta. Ed is an Open and Online Learning Specialist, also at SUNY Oneonta. Chilton and Ed have both worked on integrating career readiness skills into the curriculum.


John: Students do not always understand how the work that they do in our classes helps prepare them for their future careers. In this episode, we discuss one institution’s approach to helping students understand and articulate how their course learning activities intersect with career competencies.


John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.


John: Our guests today are Chilton Reynolds and Ed Beck. Chilton is the Director of the Faculty Center for Teaching, Learning, and Scholarship at SUNY Oneonta. Ed is an Open and Online Learning Specialist, also at SUNY Oneonta. Chilton and Ed have both worked on integrating career readiness skills into the curriculum. Welcome Chilton and Ed.

Chilton: Hey, It’s nice to be here.

Ed: Thanks, John.

Rebecca: Today’s teas are… Chilton, are you drinking any tea today?

Chilton: I am. It’s afternoon here, so I’ve moved to iced tea. I make my own decaffeinated, slightly sweetened, peach iced tea for the afternoon.

Rebecca: Sounds nice and refreshing.

Chilton: Yes.

Rebecca: What about you, Ed?

Ed: I am drinking a Chamomile honey and vanilla tea, in a very fancy special mug.

Rebecca: Oh, that’s a Tea for Teaching mug. I wonder where you got it.

John: And I am drinking an Irish Breakfast tea today.

Rebecca: Also in a tea for teaching mug. I have Lady Grey, I think.

John: We’ve invited you here today to discuss your work at SUNY Oneonta in making explicit connections between course learning objectives and career readiness skills. Can you tell us a little bit about that?

Chilton: Yes, we’d love to. And again, thanks for having us. We’re excited to be here to share this project that we’ve been doing. We just finished our first cohort, where we are trying to help our students make really clear connections between what’s going on in the classroom and the career competency skills that they’ll be using after they leave college. The focus of this program was on helping faculty build into their courses time for students to reflect on what they’re doing in the classroom and to say explicitly: “Here’s the skill that we are trying to build towards. Here’s what we’re doing in class. Now, as a student, practice actually making that connection. We want you to either think about it in writing or say it out loud, and practice saying it out loud, so that those connections can become as strong as possible after you leave this class.” We specifically targeted lower-level classes, because we thought that by the time students get to their senior seminars, they’re doing that in class already, but we don’t have these conversations in our 1000- and 2000-level classes. So the more we can do this in our lower-level classes, hopefully, when they get to those upper-level classes, they can say, “Oh, yes, I do remember talking about technology skills or communication skills early on, and I can make connections now between what happened in that class and what’s going on.”

Ed: Yeah, the big thing that we’re always talking about in the instructional design field, in the faculty development field, is authentic learning. But I joke sometimes: if a student completes a course built on authentic learning, but can’t talk about it in an interview, or articulate it to themselves or others, did it really happen? And this is our practice. This is us saying, “If we’re going to do all the effort to make sure that our courses are built on authentic learning, that we’re building authentic tasks into them, let’s go ahead and do the next step of reflection, of practicing, so that students are prepared to speak about it.”

Rebecca: Can you talk a little bit more about how you rolled it out to faculty, because you’re talking about working on it through the center, and then getting faculty to adopt it in these lower level classes? Can you talk a little bit about those details?

Chilton: Yes. So we started with a call for faculty. We had actually gotten a grant; there was local money from our institution to do this. Our incoming president had created… we didn’t have a strategic plan at the time, so he created an initiative called “regaining momentum” that was very much focused on re-engaging our students, both incoming students and current students. One of its focuses was on career readiness, on how we help our students make those connections. So we applied for the grant, received it, and in doing so promised that we would do this over three years, the first year being our first cohort. We put out a call for proposals, went to a couple of faculty that we knew were doing some of this from our previous work and said, “Would you be willing to be a part of this?”, and then also had the full call for everybody across campus. We were looking for 10 faculty, and I think we had 11 proposals to begin with. A team built from across campus, which we can talk about in just a second too, reviewed those proposals and selected a cohort of 10, which is what we had funding for in the first year, and then we went through the process with them over the year.

Ed: In our center, we’ve really been thinking about how we can focus on the student experience. We’re in a transition phase right now: we used to be known as the Teaching, Learning, and Technology Center, and as of July 1st, we’re now the Faculty Center on campus. We were thinking about how we stop leading with technology; we were always thinking about teaching, but we wanted to lead with that. One of the things we were doing was focusing on the AAC&U high-impact practices. We went through that long list of high-impact practices and asked, “Okay, what fits into the work that we are already doing as a center?” and identified some of them. We had already been doing sessions and cohorts on project-based learning with our faculty members, we had already been investigating and helping build ePortfolios, and we always saw ourselves as the collaborative learning people. So what we wanted to do was create a cohort of people who were thinking about this, tie it to a goal that we could keep coming back to, and have these faculty meet with each other throughout a semester to really create a community around a central idea. That’s where the idea really came from: to keep reconnecting through the semester and focus on building that community, versus the one-off presentations that faculty development can sometimes feel like.

Rebecca: Can you talk a little bit about the career readiness competencies that you’re focusing on?

Chilton: Yeah, when we started this application process, we were connecting with other groups across campus, and one of those, when we talk about career development, should be the Career Development Center. So we reached out to them, and they talked about how they were using the NACE career readiness competencies. NACE, the National Association of Colleges and Employers, is a national organization that connects what’s going on in the classroom to careers afterwards. There are eight competencies, and the Career Development Center aligns a lot of its work specifically with them. Additionally, we found out that some of our co-curricular activities also aligned with the NACE competencies. We have a Lead program, which is a leadership program on our campus, and that uses the NACE competencies as well. The School of Liberal Arts has a program going on right now trying to do a lot of this similar work outside of the classroom, helping students connect what they had been doing in their classes through the NACE competencies. So we found there was a lot of work already happening on campus, and we really wanted to make sure that we aligned with that as well. What we like about the NACE competencies is that they really align with a lot of the work that goes on in our classrooms. That’s what resonated with us, because we said: we’re focused on what’s going on in the classroom and how we can help support faculty in doing more useful work inside the classroom, and the NACE competencies really do that. So we think about things like professionalism, communication, critical thinking, teamwork, technology, and leadership, and then there’s equity and inclusion, and career and self-development.
“Soft skills” is the word that got used a lot in the past, as in: “Yes, we do these things, but we didn’t really help students make those connections.” So we felt like it was a great framework to take into the classroom and say: you’re doing this as faculty, you know you’re doing this, but the students don’t always know that they’re doing it, so how can we help with that? The other thing I’ll follow up with is that, as we were exploring this more, we reached out to the POD Network and actually found out [LAUGHTER] that within SUNY there was already work going on around some of this as well. SUNY’s Center for Professional Development has a whole certificate program around connecting career readiness skills to the classroom, and it uses the NACE competencies as part of that as well. So there were really a lot of tie-ins; we saw really strong connections between what was happening on our campus and things that were happening locally.

John: We have talked about that to some extent in our previous podcast with Jessica Krueger, and we’ll include a link to that in the show notes.

Chilton: And one thing I’d follow up with, John, is that we have a couple of pre-professional programs, and this seems to fit really well there: career readiness makes sense when you have a pre-professional program that’s preparing you for a specific profession. But we were also trying to reach into our liberal arts programs, our science programs, and lots of other programs that might not be as focused on a specific profession but still connect to these career readiness competencies.

John: And since we’re doing these things in the classes anyway, it’s nice for students to be able to recognize that these are skills that are going to be helpful for them in their future careers. And when they can see that, I think that may help provide a little more intrinsic motivation to engage in these practices and develop those skills. How have students responded to that?

Chilton: So we are in the first year of this, and this is one of the things we were reflecting on as we were preparing for this episode: we realized our first year was focused on what faculty are going to be doing. As Ed said, we’ve been working with some faculty who have been doing this on a smaller scale. But as far as this program goes, we’re looking forward in year two to really hearing from students and hearing how that’s going to go, so we’ll have to provide some feedback in the liner notes later on to let you know what we hear from the students.

Ed: Yeah, I’m gonna lead a committee to do the IRB and create some surveys to send out to students who are part of the program, so we have a little bit more of that student voice that we can report back on, because I think that’s really important. It grew out of a proposal like that, which I’ll talk about a little bit later. We had done student interviews and gathered student feedback once before, and that really helped create the framework that we’re now setting up with a cohort of faculty members.

Rebecca: I really love hearing that you’re using NACE across your institution in different spaces. So you mentioned that Career Development Center is doing it as well as your center. Can you talk a little bit about how that collaboration is working?

Chilton: Yeah, so we really see this as a partnership, and it’s one of the things we really tried to be intentional about early on, because when you say career readiness, that is a Career Development Center thing, and we don’t want there to be any perception that we’re trying to take over what they’re doing; we want to support them so that when students come to them, they are more prepared. Part of the original proposal was going to the Career Development Center and saying: we want to do this with you, would you be willing to partner with us? We can do more of this in the classroom. It was very much a partnership, very much us wanting to say, “What is it that you do in the Career Development Center, and where can we help support you?”, and then making sure that we feed into what they’re already doing. So there’s no appearance of us coming in and trying to take over their programming, just helping our students be more prepared when they come to their programs.

Ed: Actually, we had a great day at a winter workshop where the Career Development Center sat with our faculty and said, “Here are some of the things that we are already doing, here are the services that we’re purchasing, here are the things that we do in one-on-one consultations, and here is what it could look like if you invited us into your course.” Some of our faculty members did that and invited the Career Development Center into their course to speak to their students. And some of our faculty members were doing other things that incorporated the competencies but didn’t necessarily involve an outside group like the Career Development Center. So we had a wide range, even within our cohort, of what they were doing. During that winter workshop, we also brought in an outside trainer, and that was really nice. Chilton mentioned that the SUNY Center for Professional Development, the CPD, had already been doing a four-course sequence on the NACE competencies, which was meant for a variety of professionals, not just faculty. The instructor who came highly recommended to us was Jessie Stack Lombardo, also from SUNY, the SUNY Geneseo Career Development Center Director. She came in and did some workshops with us and the faculty, thinking about the small things we can do in our classes that help students reflect and help students make those connections.

John: Could you tell us a little bit about the impetus for starting this program?

Ed: Yeah, so even before the program, we were, of course, working with wonderful faculty members here at SUNY Oneonta. One of the things we’d been doing quite a bit was thinking about making websites and ePortfolios, having opportunities for students to build their own web space, their own web presence. So even before the cohort happened, we had one great instructor who said, “Hey, I would really love to build ePortfolio projects into my course, would you help me do that?” During this time, John, you know, we were doing SUNYCreate, a domain-of-one’s-own initiative. We were giving websites to students. That was a technology-focused initiative, but we were doing a lot of these things already. I said, “Yeah, let’s do that.” I was invited into that class several times, and we were so proud of the way this course came out. I want to give such a big shout out to Dr. Sarah Portway; she later went on to win the Chancellor’s Award for Excellence in Teaching based on a lot of the work she was doing. The students were building a fashion magazine online, taking the articles they submitted for that magazine and bringing them back to their portfolios, showcasing them on their own sites in addition. And she said, “Hey, why don’t we take this thing on the road? Why don’t we go to the AAC&U’s Institute on OER and ePortfolios?” And we said, “Okay, let’s do an IRB and get some student feedback to bring to the conference, so that we have that student voice when we go through.” And the feedback was fantastic; students really responded to it. It was a wonderful presentation. But we were also starting to realize during those interviews that, while it wasn’t negative, it wasn’t all positive either. Students were still not making all the connections between the skills they had practiced and acquired and being able to articulate them.
I have this memory of a student saying to me, “I wish I could have put this on my portfolio, but it was a group assignment, so I can’t put a group assignment on my personal portfolio.” And I remember stopping the interview format and saying to her, “Oh, I would absolutely put that group assignment on your portfolio if you’re proud of it. I would absolutely describe what it is you did to contribute to that group atmosphere, and talk about how you can be a successful collaborator and how you work in team environments. And then put that thing you’re proud of, that artifact you’re proud of, on your portfolio, but with the framing of what it means for you to be a good teammate and a good collaborator.” And the student said, “Oh, I never thought about it like that, I guess I could do something like that.” And Dr. Portway, being a fantastic instructor, never satisfied with how things went in the last class, said, “We need to think about this a little bit more. We need to be more explicit. We’re already doing all these authentic assignments, and at some level it’s hitting, and we definitely want to keep going down this road. But on some level, we are missing something in helping those students make those connections. What do we need to do in the classroom activities, in the way the assignments are presented, that walks them through that and makes them just a little bit more prepared? Because the authentic skills were already in the course; they just needed help making that connection.” And that was really the thing for me: I walked back from that experience, knocked on Chilton’s door, and said, “We need to be doing more of this, and we need to be doing it not one at a time, but with groups of faculty members.” That was really important to me.

Chilton: And what was interesting to me, to carry that on a little bit more, was that when we first had this proposal, ePortfolios was in the proposal title. We were really focused on wanting to do ePortfolios for everyone. And some of the feedback we received was, “Yes, ePortfolios can be a part of that, but this could be a much wider conversation,” which is, again, how we got back to NACE. There are these bigger frameworks that we can be a part of. So we went from “Yes, here’s this great tool” to “No, no, no, let’s look at it from a framework perspective.” And now we’re at the point where we’re saying yes, some of the projects will be ePortfolios, some of them will be other things, and that’s okay. This was bigger than the tool. This is about helping our students think about what they’re doing and helping them connect to things that will be useful for them after they leave college.

John: One quick follow up, you mentioned that you have groups of faculty who worked on it. As I understand this, this was a faculty learning community that you put together, where faculty received some slight funding or a small stipend as part of the participation. Have you done any work there with entire departments in revising their curriculum yet?

Ed: No, we haven’t done the departmental level work yet. Right now we’re focusing on coalitions of the willing, having faculty who are interested in these types of things. One of the hopes is that after doing three cohorts, having worked with multiple faculty one year, then the second year, then a third year, new faculty each year that we can get to a point where we’re ready to have a bigger discussion. There was one participant in the group that was really focused on making a freshman ePortfolio with the explicit reason to keep contributing to it throughout the program. And I think that shows a lot of promise. But we’ve still got to do some work to get the buy-in from the rest of the department to make sure that it gets used. So I mean, there is a lot to be done there. And it’s one of the things I’m hopeful for the future.

Chilton: To add on to that, one of our goals out of this is to be able to build a repository where we can share, and our hope is that we can have enough examples that when we go to a department, we can say, here are some small changes you can make. Ed mentioned this earlier: we want to be able to have a breadth of “here are some small tweaks you can make” or “here are some larger things you can do,” and to have examples that are multidisciplinary, covering a wide range of implementation needs, as well as examples from different departments, so that when we go to a department, we can say this doesn’t have to be a large change; it could just be making some small changes to help those students make connections.

Rebecca: So I wanted to follow up on an earlier point that you were making, from an experience that I’ve had as an instructor, and I’m sure many other instructors have had is you work really hard to make these kinds of career-ready activities, things like professional email writing, and portfolio projects, and team projects, the list goes on. We do many of these things as instructors, and then you inevitably have this conversation, a one-on-one maybe with a student. And you just realize they have no idea why they were doing any of the things and you’re like, “Oh, I failed the student clearly [LAUGHTER]. I could have done a better job.” And so it seems like frameworks like NACE could be really helpful, both for instructors and students to just be more explicit about those things and to practice talking about them. Can you talk a little bit about that piece of the puzzle?

Ed: Yeah, I think it’s so important to have small opportunities to embed a skill or embed a practice in there. So I’m going to start off with a very small thing that I think anybody could throw into their class. At the end of the course, it’s reflection time; we want to talk about what you learned. Let’s take a moment and think about a common interview technique, the STAR interview technique (you’ve probably heard of it), where you describe a Situation, then the Task that you were assigned, the Action that you took, and then a Result. Explain that; say, “Hey, this is how a lot of times we make sure that we have an action-oriented response to an interview question.” Now talk about this course using the STAR method. What situations did your instructor put you in? What were you asked to do? What did you do in order to be successful there? And then, is there anything else you want to share about it? Are there next steps that you should continue with, that your instructors put you on the path to? Or are there things that you’ve realized about yourself that you need to continue on with for the future, for the next thing? That’s so simple; it’s not rewriting an entire course. Yet it’s a little opportunity to say: this is important, and what we did had meaning, so take a moment to integrate that into your context. How will you talk about this course in the future?

Chilton: What was interesting to me when we were doing this was that when we first started out, we listed a whole bunch of sample outcomes that get at what you were talking about: I’m going to do this email, I’m going to have them do this thing, and it’s going to be great. And as we got to the conversation with our faculty, we realized that what we were missing was really creating the places for the students to practice making the connection. We have them practice the skills all the time; we’re like, “Yes, we do this.” As the faculty member, we understand that there is a connection between this and career readiness, but unless the students are actually practicing making the connection, not just doing the action, but making the connection, then it doesn’t always stick for them. And so that’s where we started to shift from “What do we want the faculty to do?” to “How do we want the students to practice this so that it does stick for them?” So it is meaningful for them in a way that they can think about it again, hopefully, a year or two years from now, when they’re finishing their college career and starting to think more about career readiness. That was a shift for us, from what the faculty member is going to do to how we help the students really intentionally practice what they are doing, practice talking about what they have done, and make that connection to, in this case, the NACE framework, because we thought it was such a good framework to talk about.

Ed: I feel like we’re saying NACE too often. So I feel like it’s always helpful to be a little more specific. So let’s talk about communication. We’re teaching communication to students all the time. One of the key aspects is audience. So have the conversation with your students: when we communicate to different audiences, we use different standards. So part of the reason why I’m asking you to write a more formal paper, in research-style format, is I want you to be prepared to speak to other experts in your field. But when we shift to the oral presentation, I want you to adjust your language so that you’re speaking to a non-expert; you’re speaking to your future colleagues, you’re speaking to a potential customer. And when you make that switch, make sure it’s intentional. And then at the end of this course, I’m going to ask you to reflect on that, to think about what choices you made when you were speaking to someone who you expect to already understand and be embedded in the discipline versus someone who you do not expect to be. How is it different when you talk to a colleague versus when you talk to your friends and family about what you do? That’s an important communication competency. So let’s talk about it and the intentional choices that we can be making.

John: How many faculty members and how many departments were involved in this project so far?

Chilton: So we had our first cohort, and in that cohort we specifically targeted having 10 faculty. But we were very specific about trying to have faculty from as many departments and as many schools as possible. We have three schools on our campus; we ensured that we had representation from all three schools, and we ensured that we had representation from multiple departments. So in the end, we had nine different departments as a part of this. We did have overlap in one department, from two of our participants. As we said before, we did focus, and said in the call, that we wanted you to work with a 1000- or 2000-level class, so that was part of the call as well. We actually had a couple of people that applied for this that were planning on doing this in a 3000-level class. We reached out to all of them and said, “Do you have any lower-level classes that could be part of this?” Two of them said yes and one didn’t, so that was one person that we weren’t able to take in. We’re focusing in years two and three, again, on lower-level courses, and we’re going to try to continue to have faculty from as many different departments as possible, so that when we get to the end of this, we have a nice repository of examples from as many different disciplines and as many different schools as possible.

Ed: And we can invite some of the Cohort One faculty back as mentors, and we can incorporate them into year two in a different way, as we continue to try to build a larger community and push a conversation that we think needs to happen on campus.

Rebecca: Can you talk a little bit about what was expected of a faculty member who was accepted into the program?

Chilton: So we’ve spelled that out upfront. We had already been planning on our campus what we call the SUNY digital learning conference, which was focused on open and public education, and we purposely built in a track there about career readiness. Originally, as we’ve been talking about, we were focused on ePortfolios, and so I thought a lot of them would be doing ePortfolios. But in the expansion of that, we wanted to make sure that we really talked about how we can make connections. So we said that we would pay for the faculty to be a part of that conference. They attended that conference in November of 2022, and that was the first part, kind of the kickoff for this cohort. We then had a January full-day workshop, as Ed talked about earlier, and brought in Jessie Stack Lombardo from SUNY Geneseo to be our speaker for that (and she wasn’t just a speaker; she really planned the day, and it was very highly interactive with those ten faculty). As Ed said, we had a staff member from the Career Development Center who was a part of that and presented locally. Jessie then talked about some different frameworks you could draw on, including NACE, and how you can start to think about both small changes and large changes. And then we had said in the call that the expectation would be that by the end of the Spring 2023 semester, they would turn in a revised syllabus and examples of work they are doing to the group. We realized that wasn’t specific enough, so we then created a rubric that focused in on three specific areas of what they would need to do. The first part of that rubric is what changes they would make in their syllabus to really spell out the NACE competencies: Are you focusing on all of them? Are you focusing on one of them? It didn’t have to be a lot, but we did want it to be addressed in their syllabus in some way.
And then, what is the activity they’re going to be doing where they actually have students practicing, and how will the students receive feedback on that? There were three levels: “not present” was one part of the rubric, and then we had two levels of “yes, it meets expectations.” We were thinking, again, about the small changes that could happen, but then we also had an “above expectations” level: if you were really dreaming about what it could be, where could you take it and what could it be? So we wanted to have a “yes, you meet expectations” level that would help us get small changes that would be usable by everybody, and then what could this look like if you really wanted to really [LAUGHTER] dive into the deep end with it and explore what could happen with it a little bit more, and make it so that it was better for students, not just in the class, but beyond the classroom.

Ed: Yeah, the only thing I’ll add to that, Chilton, is we also met once a month during the spring semester, so we had recurring meetings throughout. With those rubrics, between “present” and “highly effective” (and we’ll share the rubric so that you can put it in the notes if you’d like), we were starting to think about whether you not only incorporated the NACE competency in your course, but also presented your prompts in a way that gave the students the opportunity to think about future activities they could take on, future things they would want to do. And that was really important to us as we were doing it: to not only create a moment of reflection for the students at that moment, but also to make that connection of, okay, now that I’ve had that moment of reflection, now what? Should I be picking out some different courses? Should I be finding an internship? Should I be doing something now to set myself up for success? And so we don’t get that panicked feeling when the student is in senior year and they go into the Career Development Center and say, “Okay, what do I do now?”

John: What type of incentives were offered to faculty to participate in the program?

Chilton: So we spelled that out: we paid for their participation in the conference in November, so they were able to go for free and participate. It was on our campus, which made it easy for them to go; it was just their conference registration that we covered as a part of this. In addition, we paid them a stipend for attending the January workshop: officially, the stipend to attend the workshop was $90. And then when they completed and turned in their final version of their revised syllabus and examples of activities, there was another $510 stipend. So in total, it was a $600 stipend. But as a part of that final revision, we actually did review their submissions, looked at the rubric, and gave them feedback… for a couple of people, we said, “Hey, you’re missing…” and asked them to go back and do some additional work. So we did hold them accountable to that rubric before the final stipend. And it was a useful and interesting conversation when the leadership team met to look at those, to be able to say, “What do we like about this? What are we thinking for cohorts two and three? What might we ask for more specifically next time to make this even more meaningful for our students?” So we’re already starting to think about cohort two and looking forward to that for next year.

Rebecca: Can you talk a little bit about how faculty responded to their participation?

Ed: We take the faculty’s response and the feedback they gave us really seriously. We gave them the opportunity: they had the reflections that they were doing, where, of course, we knew who was speaking, but we also gave them some opportunities to give us anonymous feedback, so that they could tell us how they really felt about us. And we were just really pleased with it in year one. We do recognize that we have to keep honing our message; we have to keep defining what we mean by career readiness and what we mean by incorporating it into class. We need to have our elevator pitch a little bit more refined and down. Because what’s evolved through this conversation is that the skills are already there, but we can be more intentional about it. And we can be intentional in the ways we ask students to reflect and practice, in ways that we really believe can be beneficial for students. But that can still be a difficult conversation. When people see career readiness in 1000- and 2000-level classes, some people bristle at or are turned off by that because they’re thinking, “Oh, just one more thing that I have to do.” Now, we didn’t get that from our participants in the cohort that much, because they applied and they came here on purpose. It was nice to have a group that really wanted to be here and was willing to try some things with us in this space that we were creating. But overall, I would say that the feedback has been very positive.

Chilton: Looking through the feedback from faculty, there was one quote that stuck out to me that I’ll read quickly. It came from one of our professors in our communication arts department, who said, “Students said that they felt more confident.” This is actually one of the professors we recruited into this program that had been doing this already; this professor did have some experience with students doing something like this, but said that “Students felt more confident in their skills as a professional, and were able to articulate how the experiences they had in my class connected to the expectations employers would have of them. They also appreciated being told why we had to do certain projects to help them transition from college to life after college.” And so I think that really speaks to how the professors enjoyed having time to be able to do that.

John: We only have courses from 100 to 500 levels. It seems there has been a bit of a course number inflation there at [LAUGHTER] Oneonta. That was just a joke. I’m sorry.

Chilton: We were told that everybody in SUNY was moving to four-digit course numbers, and so, over the past two years, we did this really big project to move from three-digit to four-digit numbers, because, at least as we heard, everybody in SUNY was doing this, so we had to do it. It’s very intriguing that not everybody had to do that. [LAUGHTER]

Ed: And simultaneously, as we were going from three digits to four digits, we didn’t have 400-level classes previously. And the feedback we were getting was that that was seen as a deficiency by some people who were reviewing our students’ transcripts, even though we called all of our upper-division courses 300 level. People applying to professional schools would get that explanation and would understand why there weren’t 400-level courses, but other people who are maybe not as skilled at reading a transcript would ask, “Well, did this student avoid all 400-level work?” And so simultaneously, as we were adding another digit, we were also transitioning to having 1000-, 2000-, 3000-, and 4000-level classes. So that was a big change that took a lot of curriculum writing and mapping over the last two years. It wasn’t just as easy as adding a zero onto every course.

Rebecca: Sounds like such a fun project. Sign me up. [LAUGHTER] So we always wrap up by asking what’s next?

Chilton: So we are excited about Cohort Two. We are going to be starting our recruitment in the fall. We actually have a fall faculty institute on campus; this year it is very much focused on the communities of practice that are already happening on campus and how you can get involved. And so that’s going to be one of our big recruitment pitches for Cohort Two. In Cohort Two, we are looking to include faculty from a wider range; we are going to be starting to get into faculty that might not have as much experience in doing this. So we are thinking about how we hone our pitch and how we focus this for a wider audience, to be able to say, “No, this is not big changes in your classes. This is just asking one additional question, or allowing one additional time for students to be able to practice connecting what you’re already doing to these career readiness competencies.”

Ed: And I would say what’s next for me is this experience has really solidified the idea for me that we need to continue in faculty development centers, to make spaces where faculty can repeatedly come back and interact on the same topic, getting away from that kind of one and done workshop, and identifying major things that we want to return to through the year inviting people into that space to share. Because when those faculty, when they get an opportunity to think and show off what they’re doing, it really is a wonderful spread of ideas. And you get a lot from all the energy in the room.

Rebecca: Well, thank you so much for joining us and sharing how this project’s unfolded at your institution.

John: And you’re both doing some really great work at SUNY Oneonta and it’s great to keep in touch and thank you for joining us.

Chilton: And thank you. It’s a pleasure to be here with you. So thanks for taking the time to be with us today.


John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.


306. Gender Bias and Timing of SETs

A number of studies demonstrate gender bias in course evaluations. In this episode, Whitney Buser, Jill Hayter, and Cassondra Batz-Barbarich join us to discuss their research that looks at the timing of when these gender differences emerge and theories for why they exist.

Whitney is the Associate Director of Academic Programs in the School of Economics at Georgia Tech. Jill is an Associate Professor of Economics in the College of Business and Technology at East Tennessee State University. Cassondra is an Assistant Professor of Business at Lake Forest College. Whitney, Jill, and Cassondra are the authors of an article entitled “Evaluation of Women in Economics: Evidence of Gender Bias Following Behavioral Role Violations.”

Show Notes


John: A number of studies demonstrate gender bias in course evaluations. In this episode we discuss research that looks at the timing of when these gender differences emerge and theories for why they exist.


John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.


John: Our guests today are Whitney Buser, Jill Hayter, and Cassondra Batz-Barbarich. Whitney is the Associate Director of Academic Programs in the School of Economics at Georgia Tech. Jill is an Associate Professor of Economics in the College of Business and Technology at East Tennessee State University. Cassondra is an Assistant Professor of Business at Lake Forest College. Whitney, Jill, and Cassondra are the authors of an article entitled “Evaluation of Women in Economics: Evidence of Gender Bias Following Behavioral Role Violations.” Welcome, Whitney, Jill, and Cassondra.

Whitney: Thank you for having us.

Cassondra: Thank you so much.

Rebecca: Today’s teas are:… Whitney, are you drinking tea?

Whitney: I am. I have some jasmine tea.

Rebecca: Always a good choice. Jill. How about you?

Jill: Harney and Sons Hot Cinnamon Spice.

Rebecca: Oh, that’s such a good choice. I love that one. It’s a family favorite at my house. How about you, Cassondra?

Cassondra: Yesterday, we made a sun tea on the porch. So it’s sweet peach tea.

Rebecca: This is a good variety. How about you, John?

John: And I have ginger peach black tea from the Republic of Tea.

Rebecca: So we’re combining choices here [LAUGHTER]. And I have Awake tea, despite the fact that it is early afternoon here.

Jill: I also had three cups of coffee this morning.

Rebecca: It’s one of the most popular kinds of tea, Jill.

John: We’ve invited you here today to discuss your research on gender bias in student evaluations of instructors. Could you tell us how the study came about?

Whitney: Jill and I have been working on this for about six years, believe it or not. It’s been a long process for us. And actually, at the very beginning, we had a different third co-author working with us. The original three of us met at a conference, and we had just attended a session that talked about teaching evaluations. And afterwards, we just naturally began talking about this, because we all had these really, really strong feelings about teaching evaluations. All three of us at the time were young, young in our careers, young age-wise. We were female PhD economists. And we were all earning tenure, or I think Jill had just earned tenure. But we all had this similar experience of having what we felt was a very positive class climate, and a lot of camaraderie between ourselves and the students, until the grades were returned for the first time. And then we could feel a definite shift, and it was upsetting to all of us. We all got into this because we love teaching and we want to do a good job at that. It was just something that we were picking up on. So that was our anecdotal experience. Jill had a little data on it herself, because she would do mid-semester evaluations, just to gauge the class climate and see what students were needing. And I had an experience where, in my first position, they did a surprise midterm evaluation, just to kind of see how the new professor was doing, that I didn’t know about. And I got glowing reviews from the students; everything was very, very positive, wonderful. And six weeks later, same students, but grades returned, the evaluations looked a little different. And the comments were a bit different. So we had a little data to back up this idea too. And one thing, if the people listening today haven’t read the literature: there’s an extensive literature on course evaluations. And it consistently finds gender bias in those.
But the thing about that literature is it only looks at evaluations, which are typically done on the very last day of class, maybe even after that, maybe a couple of days before, but at the end of the semester. And we really haven’t seen anyone look into how these opinions of students evolve over the semester, or how students feel at the beginning or the middle of the semester. So that’s what we wanted to do with that. And in my opinion, and this is just me speaking here, Jill can have her own other motivations, or our other co-author that has worked with us before could feel differently. But for me, it was really important to acknowledge that society has come a long way in the past several years with gender bias. And I don’t think that modern students are shocked by female faculty any longer, I don’t think they have an explicit distaste for female faculty. Anecdotally, I feel that my students are actually happy when they meet me. And they have expectations of me to be warm, comforting, approachable. But I do think that when you expect someone to be more comforting and approachable, and they give you a grade back, that’s not always an “A” in a difficult quantitative subject like economics, you can get a bit of a Grinch Who Stole Christmas effect. I thought it was going to be one way and now my expectations are taken down. We all know no one likes that dopamine depletion of having expectations not met. So, to me, if we’re going to talk about gender bias, we really have to talk about it in this nuanced way, so that it doesn’t get automatically dismissed by people who don’t see an explicit bias and then say, “Oh, hey, there’s nothing here.” And then the last thing that I think is really important here for the motivation for the paper is that we have this expectation that bias would grow over the semester. So if bias grows over the semester, that means the earlier in the semester you evaluate, the smaller the bias will be. 
And one thing that the literature is missing is a very concrete, objective way to deal with bias. What we were hoping to find was: move the evaluations up in the semester a bit, and you minimize or eliminate bias, and that’s concrete and objective. Towards the end today, we’ll talk about what we actually found and whether or not we found that. But that was one of the motivations.

Jill: So that’s the motivation behind the original paper, but then Cassondra, who has a PhD in psychology and was doing work in this area, read our paper and its results and reached out to Whitney and me. And so a second paper with Cassondra takes more of a psychology approach, in terms of a lot of what Whitney is talking about, with respect to role-incongruity theory and social role theory, and Cassondra is going to talk more about that later. Whitney has described the motivation of that first paper; the second paper takes a very different perspective, looking at it from a more psych perspective. Cassondra, you might want to chime in?

Cassondra: Absolutely. I think you summarized it well. I joined the paper as Whitney and Jill were trying to find a home for it. And we thought that our interests, though coming from very different backgrounds, would blend nicely for this particular topic, as there’s a lot of scholarship in psychology that looks at understanding the reasons behind this bias. And so I was brought in to really help kind of think about how we frame that in a way that might appeal to an even broader range of audiences.

Rebecca: At the beginning of the paper, and Whitney, you’ve kind of pointed to this today about being a young faculty member, you also noted in the paper that women are underrepresented among economics faculty, especially at the level of full professors. Can you tell us a little bit about the extent of this under-representation?

Jill: Women have earned more than half the doctoral degrees for over a decade, but are underrepresented, particularly among tenure-track faculty. In the paper we cite that 36% of full professors are female. In economics, that’s a smaller percentage: 17 and a half percent of full professors in economics are female, although 35% of PhDs in economics go to females. It’s a smaller percentage of female faculty receiving full professor rank in economics. That’s what we mean by that underrepresentation. In terms of economics specifically, it’s oftentimes left out of the STEM fields, and depending on which university or college you’re at, economics can sometimes be found in the social sciences, in the arts and sciences, or in the business school. At my institution, Whitney’s institution, I believe, and Cassondra’s, I think we’re all in the business school. But sometimes economics gets put in there with the social science fields, and it’s not thought of as being this more quantitatively heavy subject, though it oftentimes is, by its nature. And so females face this in those more math-heavy classes, like the STEM classes. I think my students, when I started off, and I think Whitney was getting at this, with us being more junior faculty members, considered me a peer instead of the professor in the course. And that made it tough, because, to Whitney’s point about that returned grade feedback and the perception that students had of me at day one versus midway through the course, I was now coming across as someone that was handing back maybe less than 100% or “A” grades. And in my business school, my principles of economics courses are required. They might not even want to be in there, but they have to be in there to get a business degree. Earlier on, that was a challenge I faced. I’m 13 years into my career; I’m going up for full professor this summer. But starting off was really a challenge.
And I remember having female mentors in my graduate program. They tried to prepare me for this, they tried to say it’s going to be challenging early on, you’re going to have to go against some of these perceptions, alot of the perceptions that we measure in this paper..

John: To what extent is the underrepresentation of women faculty due to a cohort effect where women have become a larger share of PhD economists in the last few decades, but that was less true 20 or 30 years ago and how much of it might be due to the impact of gender bias on evaluations on career pathways for women?

Jill: Really, what this paper looks at is standard evaluations of teaching and the bias, or potential for bias, that exists there, so I'll just speak to that. Where I currently am, evaluations of teaching are weighted heavily for retention of faculty and for tenure and promotion decisions, and when we're hiring new faculty, we look at any previous course evaluations and experience with teaching. At every level in academia, these are used as some gauge of teaching effectiveness. One of the questions that we, and accrediting bodies, are looking at is whether or not this is the measure that should be used, and what different measures might be options for assessing teaching effectiveness. We know that they're flawed: our study shows that they're flawed, and previous literature has suggested that they're flawed as well. And yet, for most schools, this is the single measure that's being captured… I know that it's different depending on the institution; at mine, some departments don't give them a whole lot of weight in tenure and promotion decisions, but certainly, in my experience in my College of Business and Technology, these are weighted heavily. So think about a junior faculty member starting off. When Whitney and I met at the conference, my evaluations were lower, so I was putting a lot of time into my teaching, improving and bringing up those scores. My male colleagues, just in discussion with them, didn't have the same experience that I was having with respect to these SETs. So when we think about the allocation of time and resources as a tenure-track junior faculty member, I was putting more into what I would consider just catching up, getting those SET scores higher so that it's reflected in my tenure and promotion packet, and that's less time that I'm allocating toward research or other things. That's my view on it. I think Whitney has a couple of other thoughts on that.

Whitney: One of the things we tried to make clear in the paper is that the literature is very clear that evaluations do have a gender bias. And these evaluations are being used, and they are, in hiring decisions, annual evaluations, promotion and tenure evaluations, and merit pay raise decisions, so they're being used at every single level of advancement. It's not one small piece. It's a piece that's used throughout and very much integrated into the process.

Rebecca: You mentioned at the top of our interview that the second paper shifts more towards psychology, and specifically describes ways in which both social role theory and role-congruity theory may explain the bias against female faculty in student evaluations. Can you briefly summarize these arguments for our listeners?

Cassandra: So social role theory is a theory that has been put forth for decades by Alice Eagly, a very prominent scholar in the social psychology world, as well as her colleagues. It has been used as a framework to understand the complexities and origins of gender gaps in the workplace in particular, whether that be inequities in experiences, the expectations that are different for women, and, of course, outcomes such as promotion at work. Essentially, social role theory suggests that the gender inequities we see in society today originated from men and women being distributed into social roles based on physical sex differences: women biologically were able to have children, and men, on average, were physically stronger. Thousands of years ago, those differences had an evolutionary benefit to a well-functioning society; people were contributing in the ways in which they were best equipped to do so. And the assignment of men and women into these roles led them to adopt role-specific qualities and skills. So women, who were bearing children, were friendly, helpful, sensitive, concerned with others, kind, caring. We refer to these now as more communal qualities. And for men, the provider and protector role led them to have attributes such as ambition, being assertive, authoritative, dominant. These are qualities that we now label as agentic. While technology has since caught up and made these biologically driven role assignments unnecessary, we continue to see a division of labor along these lines in the modern world, and society at large still holds the belief that women do possess, and should possess, these more communal qualities, and that men do and should possess more of the agentic ones. Relatedly, role-congruity theory helps us understand the consequences when men and women fail to fulfill these expectations.
And we know the failure to fulfill these expectations is more consequential for women. This experience of bias driven by the failure to behave in communal ways, in other words, by violating these cultural expectations, can be seen in all areas of society, but particularly in traditionally male-dominated positions, like college professors, or in male-dominated fields like economics. [LAUGHTER] And so women in these roles are already going to experience some degree of backlash for being in gender-incongruent positions. But that is especially true if they also behave in traditionally more agentic ways: being more assertive, demonstrating their power, which we argued was what was occurring when giving critical feedback back to students.

John: To approach this, you gave evaluations to students at two different points of the semester. Could you tell us a bit more about the study design, how large the sample was and how many faculty and institutions participated in the study?

Whitney: Sure, we had a really rich data set for this study. That's one of the reasons we were able to get two different papers out of it, and maybe even some future research. We collected all of this data in person on paper and entered it, which was an arduous process. As I said, we had been working on this project for about six years, and about a year and a half of that was just data collection. We have a lot of people to thank who did that for us for no author credit on this paper, males and females across the United States gathering that data for us, whom we're really appreciative to have. In the end, we wound up with about 1200 students in total. We weren't quite 50/50; we were 60/40, favoring men, which is typical for economics classrooms, even though it is required in a lot of majors (that's where you're getting a lot of the women taking it). And like you said, John, we surveyed them twice. We surveyed them on the second day of class; we wanted as close to a first impression as possible without having a major sample issue with drop/add. And then we surveyed them the day after they got their first midterm grade back. So we got the first impression, and then we got the way they felt after they had their first grade returned. We did this at five different colleges and universities, with three male professors and four female professors contributing data. One of the big questions people have asked us over time is, "Well, how does race play into this?" And that's something that's beyond the scope of our research. I will say that we only had one underrepresented minority in our sample, again, typical of economics professors, and it was one of our male instructors. So we would expect a downward bias from race and maybe an upward bias from gender, or have those two at least washing one another out in the paper.
When we asked these students how they felt after their grades were returned, this was about four weeks into the semester, so still pretty early. What we really wanted to ask about were the specific qualities that had been hypothesized in the literature as drivers of bias or drivers of differences. So we asked students to rate their instructor on a bunch of different qualities. Cassie really helped us out here, because she came in and said, "Well, you know, we can categorize these qualities into communal qualities, agentic qualities, and neutral qualities…" which was really the way to approach it, because of course we get different things for communal versus agentic qualities. So we asked our students things like: "How knowledgeable do you find your professor? How challenging? Do you find them to be approachable? Do you find them to be caring? Are they interesting?" And then we asked a couple of very general questions, like "Would you recommend the course?" All of this set us up with a really nice dataset where we could look between genders and across time as well.

Rebecca: So I think everyone’s probably dying to know exactly what you found. [LAUGHTER]

Jill: I'm just going to provide an overview of the results, because we do a number of different specifications and use different econometric methods, and you can get all of those results in detail in the paper. But in general, on the second day of class, we find that women are receiving lower ratings across the five agentic and gender-neutral instructor characteristics that we measured. They were rated higher on that second day of class on the more communal characteristics, though not all of those differences were statistically significant. Immediately after the first exam grade was returned to students, women were receiving lower ratings on all seven measured characteristics. Each difference was significant except for the caring and approachable, more communal, characteristics. And men were now receiving higher ratings on all the different aspects relative to the second day of class. Over time, what we see is that men's evaluations rose on all characteristics from the second day of class to the period after the first exam was returned. In contrast, women's evaluations were not trending upward: a couple stayed the same, but overall, they were going down. So those are some overview findings. Again, the more specific results, by specification, can be found in the paper.

John: We will include a link to both papers in the show notes, so people can go back and review them. To summarize, you found relatively weak evidence of significant gender bias on the second day of class, but that gap increased fairly dramatically after the first graded exam. What do you attribute that change to? Was it because of the feedback students were getting from grades, as Whitney mentioned before?

Whitney: We were attributing that, and Cassie can speak to this with more authority on the theoretical point, to backlash theory: this idea that if I expect one thing and I don't get it, there's this need to push back so that things return to congruence.

Cassandra: Exactly, Whitney is spot on there. What we thought this was evidence of was women behaving in gender-incongruent ways. Women are supposed to be warm and caring and friendly, and when you get back a grade that perhaps wasn't an "A," that feels harsh and critical: a woman is asserting her power and dominance in the classroom, which, again, she already is by being in a male-dominated profession. And those two things combined can result in this backlash.

Rebecca: So if we take these findings, and think institutionally, what are some things that institutions might want to think about moving forward?

Whitney: That's a good question. If you remember, from the very beginning, we were saying we were really hoping to find a nice, objective, concrete solution; we anticipated finding it through timing. And that's what I would really like to do with future research: find something concrete and objective to treat this with. We weren't able to do that, because we found bias from the beginning, and we found that it came so quickly in the semester that it's not something we can fix by just moving evaluations back to midterm or something like that. Since we can't do that, we've talked about other ways for institutions to take this. One takeaway really is just an awareness that these gender biases exist and that these evaluations are flawed. This is really well established in the literature, but not necessarily in the general sphere of knowledge. When we published this paper, Georgia Tech did a little feature in their daily digest, and I had two female engineering faculty email me and say, "I knew this in my gut for years, but nobody's ever quantified it." That, to me, is evidence that it's not in the general sphere of knowledge, even though the literature defines it well. In terms of concrete solutions, we're seeing a lot of schools and accreditors, like AACSB, starting to require multiple indicators of teaching effectiveness in evaluation: so evaluations and peer reviews, or maybe some other form of observation, something to that effect, where we have a more global and inclusive way to look at someone's teaching effectiveness. That's a great takeaway; hopefully that will reduce the weight of the impact of evaluations just by having other factors in there. And just one final point that I want to make, and this is a really big sticking point to me for the paper: all of us are researchers. We all deal with statistics, statistical significance, and robust research methods.
And then, when those of us in Chair and Dean roles go to look at evaluations, all of a sudden, all that training completely goes out the window, and we look at the difference between a 4.2 and a 4.4. And I know those differences sound really small; they are that small. And we say, "Oh, well, this person does better than this person; this person deserves to be hired over this person." Never in our research, or in a formal presentation, would we ever compare two means that close without significance testing, number one, and without making sure they're actually comparable, and then say, "Oh, there's a difference." It's just something that I think we need to recognize: we would not accept this as good research or good methodology in any other area of our work. It's something that we should keep in mind as we move forward with this.

John: Now, you mentioned the use of peer evaluations as another way of providing, perhaps, more balance, but might they be subject to the same type of bias?

Whitney: Yeah, all the biases that we see with student evaluations, I can imagine we would see with peer evaluations as well.

Jill: But there are creative ways to do peer evaluations. Here at ETSU, we have a Center for Teaching Excellence, and I'm confident Georgia Tech and Lake Forest have their own versions of that. So there are creative ways. And again, it's not that SETs are necessarily bad, but knowing what we know about their flaws, coupling them with an additional measure or two can be a lot more insightful, I think, into the true teaching effectiveness of instructors.

John: And one thing I'm wondering is whether the measured effect might be larger in economics because, at least at many institutions, grades in economics and STEM classes are often lower, which might magnify the effect of this difference. It would be interesting if there were a study that also included some classes, maybe in the humanities, to see if perhaps there's less of an effect there, because the role-incongruity issue may not appear to be as severe in disciplines where grades across the board tend to be higher.

Whitney: I think you're right about that. Most people take economics because it's a required class, and certainly the grades are a big factor. The two things that showed the most significance outside of our key variable of interest were interest in economics and expected grade. Those were the things that mattered across the board… now, we still found gender bias controlling for those things, but they mattered.

Rebecca: So we talked a little bit about things that institutions might want to start thinking about: institutional policy and things that might shift how we use teaching evaluations. Are there any other strategies that institutions or instructors can use, or adopt, to try to reduce this bias in the short term?

Cassandra: That's really the million-dollar question, because this type of bias exists in a lot of different domains, whether we're talking managers and their subordinates or teachers and their students. One thing that's often recommended is simply making people aware that this bias exists and providing training on how to better approach evaluations, whether that's how to use a rating scale or how to ensure you aren't engaging in a halo effect, for example. Another strategy is requiring that people justify the ratings they provide with qualitative comments… if you're just asked to rate on a scale how competent a person is, bias may creep in more than if you're asked to justify why that particular rating was given for competence. A last recommendation I'll share here is making these evaluations more public. So if there are a couple of people, say peers, evaluating myself or Whitney or Jill in the classroom, well, they need to come together, share, and publicly disseminate the evaluations they gave us. This social accountability can help to mitigate bias and ensure that the ratings you're giving are, in fact, justified.

John: So we’ve got a long ways to go with this. It’s a problem that’s been recognized for quite a while with a lot of studies. But there hasn’t been that much done to address that. And those are some good suggestions that institutions may want to try. We always end with the question: “What’s next?”

Cassandra: [LAUGHTER] That's a good question. Of course, I think the three of us collectively would say we hope that administrators and decision makers start asking questions about their use of student evaluations of teaching and how they might seek to mitigate this bias, based on the recommendations Whitney has already shared. But we also hope that women faculty feel more empowered to advocate for themselves when it comes time for promotion and tenure decisions to be made. At my institution, part of the promotion process is writing letters and going through interviews. So speaking to this, bringing an awareness to the people who are making the decisions that this exists, and that it is not just an opinion, that there is empirical evidence of its existence. But we are also really interested in exploring more fully how providing feedback, particularly critical feedback, like in our study, where the professors are giving back grades, might impact the perceptions of men and women in other contexts as well. Is this a phenomenon we would see, for example, between a manager and their team? Do people respond differently to critical feedback from a manager because of the manager's gender? And how much are these differences, perhaps, driven by perceptions of how communal or agentic they are in their delivery of that feedback? In other words, are we seeing the same pattern in other contexts? Ultimately, we hope that by better understanding how perceptions of communion and agency impact the interactions women have at work, particularly women in male-dominated or gender-atypical roles, we can discover ways to alleviate some of that backlash through more targeted interventions, training, and perhaps better timing. At a minimum, it's important to highlight the various ways gender bias continues to persist in our society, because without that awareness, nothing can be changed.

John: Whitney, Jill?

Jill: I think that was great. [LAUGHTER]

Whitney: Yeah, I think, Cassie, you did a great job. And Cassie certainly helped us out with bringing formal language and theory to things that we felt as intuitive and we felt in our gut as important. We don’t have a lot of language for that in the economic space. And so blending these two disciplines together has been very helpful for looking at the situation.

Rebecca: Well, thank you all for joining us. And the research that you’re doing is really important and impactful. So we hope our listeners will use it.

Whitney: Thank you so much.

Cassandra: Thank you.

Jill: Thank you so much.


John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.


305. 80 Ways to Use ChatGPT in the Classroom

Faculty discussions of ChatGPT and other AI tools often focus on how AI might interfere with learning and academic integrity. In this episode, Stan Skrabut joins us to discuss his book that explores how ChatGPT can support student learning.  Stan is the Director of Instructional Technology and Design at Dean College in Franklin, Massachusetts. He is also the author of several books related to teaching and learning. His most recent book is 80 Ways to Use ChatGPT in the Classroom.

Show Notes


John: Faculty discussions of ChatGPT and other AI tools often focus on how AI might interfere with learning and academic integrity. In this episode, we discuss a resource that explores how ChatGPT can support student learning.


John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.


John: Our guest today is Stan Skrabut. Stan is the Director of Instructional Technology and Design at Dean College in Franklin, Massachusetts. He is also the author of several books related to teaching and learning. His most recent book is 80 Ways to Use ChatGPT in the Classroom. Welcome, Stan.

Stan: Well, thank you ever so much for having me on. I have been listening to your podcast since the first episode, you guys are crushing it. I recommend it all the time to my faculty. I’m excited to be here.

John: Thank you. And we very much enjoyed your podcast while you were doing it. And I’m hoping that will resume at some point when things settle down.

Rebecca: Yeah, we’re glad to have you here.

Stan: Yeah, thanks.

John: Today’s teas are:… Stan, are you drinking any tea?

Stan: A little bit of a story. I went over to the bookstore with the intent of getting tea. They had no tea in stock. I went to the vending machine on the same floor. The vending machine was down. I went to another building. I put in money. It did not give me tea. I’m stuck with Mountain Dew. I’m sorry. [LAUGHTER]

Rebecca: Not for lack of trying. Clearly. [LAUGHTER]

Stan: I tried. I tried.

Rebecca: I have some blue sapphire tea.

John: And I have Lady Grey.

Rebecca: You haven't drunk that in a while, John.

John: No. [LAUGHTER]

Rebecca: A little caffeine today, huh? [LAUGHTER]

John: Yeah, well, I am back in the office. I've returned from Duke, and I have more options for tea again.

Rebecca: That’s good. So Stan, we invited you here today to discuss 80 Ways to Use ChatGPT in the Classroom. What inspired you to write the book?

Stan: Well, I'm an instructional technologist, and my responsibility is to help faculty deliver the best courses possible. In November 2022, ChatGPT came onto the scene, and by December, faculty were up in arms: "Oh, my goodness, this is going to be a way that students are going to cheat, and they'll never learn anything again." As an instructional technologist, I see technology as a force multiplier, as a way to help us do better things quicker and easier. And so I didn't feel threatened by ChatGPT. I've been looking at the Horizon Reports for the last 20 years, and they said, "AI is coming. It's coming. It's coming." Well, it's here. And so it was just a matter of sitting down in January, writing the book, publishing it, and providing a copy to all the faculty, and we just started having good conversations after that. But the point was that we should not ban it, which was the initial reaction; this is a tool like all the other tools that we bring into the classroom.

Rebecca: Stan, I love how you just sat down in January and just wrote a book as if it was easy peasy and no big deal. [LAUGHTER]

Stan: Well, I will have to be honest that I was using ChatGPT for part of the book. I asked ChatGPT to give me an outline of what would be important for faculty to know about this, and I got a very nice outline. Then it was a matter of creating prompts. I'd write a prompt, and then I would get the response back from ChatGPT. It was a lot of back and forth with ChatGPT, and I thought ChatGPT did a wonderful job in moving this forward.

John: Most of the discussion we've heard related to ChatGPT comes from people concerned about the ability to conduct online assessments in its presence. But one of the things I really liked about your book is that most of it focuses on productive uses of ChatGPT by both faculty and students, and on classroom uses, because we're not always hearing that sort of balanced discussion. Could you talk a little bit about some of the ways in which faculty could use ChatGPT or other AI tools to support their instruction and to help develop new classes and new curricula?

Stan: Yeah, absolutely. First of all, I would like to say that this is not going anywhere. It is going to become more pervasive in our lives. Resume Builder did a survey of a couple thousand new job descriptions that employers were putting out, and 90% of them are asking for employees to have AI experience. In higher education, it's upon us to make sure that the students going out there to be employees know how to use this tool. With that said, there has to be a balance. In order to use the tool properly, you have to have foundational knowledge of your discipline. You have to know what you're talking about in order to create the proper prompt, but also to assess the response. ChatGPT sometimes doesn't get it right… it's just how ChatGPT is built: it's built on probabilities that these word combinations go together. It's not pulling full articles that you can go back and verify; it works kind of like the human mind does. We have built up knowledge all these years, but my memory of what happened when I was three, four, or five years old is a little fuzzy. Who said what? I'm pretty confident about what was said, but it's still a little fuzzy, and I would need to verify it. So I see ChatGPT as an intern: everybody gets an intern now. They do great work at all hours, but you, as the supervisor, still have to verify that the information is correct. Back to the classroom: students, or really anyone using it, should not just hit return on a prompt and then hand the output in to their supervisor or instructor without verifying it, without making it better, without adding the human element to working with the machine. And that is, I think, where we can do lots of wonderful things in the classroom. From the instructor side: go ahead and use this for your first draft.
Now turn on the review tools that track changes and show me how you made it better as you're working toward your final product. Or instructors can go ahead and craft an essay, craft some supposedly accurate information with ChatGPT, throw it into the hands of the students, and say: "Please assess this. Is this right? Where are the problems? Where are the biases? Tell me where the gaps are. How can we make this better?" So, using it as something to assess. Those are some initial ways to start with students, or to use it in class. I don't know if I'm tapping into all the things. There are just so many things that you could do with this.

John: And you address many of those things in the book. Among the things you address are having it generate assignments, or even, at a more basic level, having it develop syllabi, course outlines, learning objectives, and so forth when faculty are building courses.

Stan: Oh, absolutely. We have a new dean at our School of Business, and he came over and wanted to know more about ChatGPT and how we can use it. They're looking at creating a new program for the college, and it's like, "Well, let's just start right there. What are the courses that you would have for this new program? Provide course titles and descriptions." Here comes a list of 10 or 12 different courses for that particular program. Okay, let's take this program: what are the learning outcomes for this particular program? So we just copied and pasted, asked for learning outcomes, and here comes the list of outcomes. Now, for these different outcomes, provide learning objectives. And it starts creating learning objectives. And so you can just continue to drill down. But this moves past the blank page. Normally, you'd bring in a group of faculty to work on that program, ask for their ideas, send everybody off, they would pull ideas together, and you would start crafting this. This was done in 30 seconds. And now, okay, here's the starting point for your faculty: Where are the problems with this? How can we make it better? Now go. Instead of a blank page, starting with nothing. That was one example. But even for your own course: starting from a course description, you can ask ChatGPT to provide a course plan for 16 weeks. What would I address? What would be the different activities? Describe those activities. If you want the activities to use transparent assignment design, it'll craft them in that format. It knows what transparent assignment design is, and it will craft it that way. And then, going back to assessment, you can build content. Looking at OER, open educational resources, it can give you a jumpstart on that OER content.
What are the gaps that I want to fill? Or take content that's there and localize it to your area: here we are in New England, in Massachusetts specifically, and I need an example. Here's the content we're working with; give me an example, a case study, and it will craft a case study for you. It allows you to go from your zone of drudgery to your zone of genius very rapidly. I've been working on a new book and got down to the final edits, and I was like, "Oh, I'm missing conclusions to all these different chapters." I just fed in each whole chapter and said, "Could you craft me a conclusion to this chapter?" And it just knocked them out. I mean, I could do it, but that's my zone of drudgery, and I'd rather be doing other things.

Rebecca: It's interesting that a lot of faculty and chairs and administrators have been engaged in this conversation around ChatGPT quite a bit, but many of them haven't actually tried ChatGPT. So if you were to sit down with a faculty member who's never tried it before, what's the first thing you'd have them do?

Stan: This is an excellent question, because I do it all the time. I have a number of faculty members that I’ve sat down with, looked at their courses, and said, “What is the problem that you’re working with? What do you want to do?” And that’s where we start. We say, “What is the problem that you’re trying to fix?” ChatGPT version three was given about 45 terabytes of information. They say the human brain holds about 1.25 terabytes. So this is like asking thirty-some people to come sit with you to work on your problem. One class was a sports management class dealing with marketing. They were working with the Kraft Group, which owns the Patriots, working on specific activities for their students and developing marketing plans and such. We just sat down with ChatGPT and started at a very basic level to see what we could get out of it. And the things we weren’t happy with, we just rephrased, had it focus on those areas, and it just kept improving what we were doing. But one of the struggles that I hear from faculty all the time, because it’s very time consuming, is creating assessments: multiple choice questions, true and false, fill in the blank, all these different things. ChatGPT will do this for you in seconds. You feed in all the content that you want and say, “Please craft 10 questions.” Give me 10 more, give me 10 more, give me 10 more. And then you go through, identify the ones you like, and put them into your test bank. It really comes down to the problem that you’re trying to solve.

John: And you also noted that it could be used to assist with providing students feedback on their writing…

Stan: Absolutely…

John: …that you can use it to help generate that. Could you talk a little bit about that?

Stan: We’re right now working with the academic coaches, and this is one of the areas to sit down with. I’m not only the Director of Instructional Technology and Design; I also have a dotted line as Director of the Library. So I’m trying to help students with their research, and the writing and the research go hand in hand. From the library side, we look at what the students are being assigned, and then sit down, start with a couple of key terms or phrases that we want, and have ChatGPT give us ideas on those different terms. And it’ll provide ten or twenty different exciting ideas to go research. Once again, getting past the blank page. It’s like, “I gotta do an assignment. I don’t know what to do.” It could be in economics: “I don’t know what to write about in economics.” It’s like, “Well, here, pull these two terms together; what does it say about that?” So we start at that point. And then once you have a couple of ideas that you want to work with, what are some keywords that I could use to start searching the databases? It will provide you those ideas. It’ll do other things: it’ll draft an outline, it’ll write the thing if you want it to, but we try to take the baby steps in getting them to go in and research, getting them pointed in the right direction. On the writing side, for example, I have a class that I’m going to be teaching at the University of Wyoming to grad students. I’m going to introduce ChatGPT. It’s for program development and evaluation, and I’m going to let them use ChatGPT to help with this. One of the things that academic writers struggle with is the use of active voice. They’re great at passive; they’ve mastered that. Well, this will take what you’ve written and, when you say, “Convert this to active voice,” it will rewrite it and work on those issues.
I was working with one grad student, and after playing with ChatGPT a couple of times, she finally figured out what the difference really was and how to overcome the problem, and now she is writing actively, more naturally. But she struggled with it. With ChatGPT, you can take an essay, put it into ChatGPT, and say, “How can I make this better?” And it will provide guidance on how you can make it better. You could ask it specifically, “How can I improve the grammar and spelling without changing any of the wording here?” It’ll go and check that. So for our academic coaches, because there’s high volume, this is another tool that they could use to say, “Here’s the checklist of things that we’ve identified for you to go work on right away,” not necessarily giving solutions, but giving pointers and guidance on how to move forward. So you can use it at different levels and from different perspectives, not where it does all the work for you, but incrementally: “Here, assess this and do this.” And it will do that for you.

Rebecca: Your active and passive voice example reminds me of a conversation I had with one of our writing faculty, who was talking about the labor previously involved in creating example essays for students to edit as a way to work on writing skills. She just had ChatGPT write essays [LAUGHTER] of different qualities for students to compare and edit as a writing activity in one of her intro classes.

Stan: Absolutely. What I recommend to anyone using ChatGPT is start collecting your prompts. Have a Google document or a Word document, and when you find a great prompt, squirrel it away. In some of the workshops that I’ve been giving on this, I demonstrate high-level prompts that are probably two pages long. You feed this information to ChatGPT, and the prompt spells out everything about the information that you’re going to be collecting, how you want to collect it, how you want it to be output, and what items it will output. You’re basically creating a tool that you can then call up. For example, when developing a course, it will write the course description and give you learning outcomes, recommended readings, activities, and an agenda for 16 weeks, all in one prompt. And all you do is say, “This is the course I want,” and let it go. It’s amazing: we can build this tool just like we build spreadsheets. We build these very complex spreadsheets to do these tasks, and we can do the same with ChatGPT; we just have to figure out what problems we’re trying to solve.

John: Our students come into our classes with very varied prior preparation. In your book, you talk about some ways in which students can use ChatGPT to help fill in some of the gaps in their prior understanding to allow them to get up to speed more quickly. Could you talk about some ways in which students can use ChatGPT as a personalized tutor?

Stan: I’m going to take you through an example that I think can be applied for students. A student comes to your class. Ideally, they’re taking notes. One of the strategies that I use is I have my notebook, I’ll open my notebook, and I’ll turn on Otter.ai, which is a transcription program. I will go over my notes and basically get a transcription of those notes. I can then feed that transcription into ChatGPT and say, “Clean it up; make a good set of notes for me,” and it will do that. Then I can build this document, review what we did in class, build a nice clean set of notes, and have that available to me over a series of sets of notes. I could do the same thing with a textbook: highlight key points and talk through them to transcribe them, or just cut and paste. Then I can feed that information into ChatGPT and say, “Build me a study bank” that I can turn into a Quizlet, for example, or “I need to create some flashcards. What are the key terms and definitions from this content? Create some flashcards from that material.” Or it could be that, no matter how great the instructor is, I still don’t get it. They introduced a term that is just mind-boggling, and I still don’t get it. So I can then ask ChatGPT to explain it at another level. They say that some of the best or most popular non-fiction books, the ones that get on the bestseller lists, are written at a certain grade level. And I know that I typically write above that grade level, so I can ask ChatGPT to rewrite my text at a lower grade level. I could, as a student, ask ChatGPT to give an explainer at a level that I do understand. Those are certain ways that you can do this. You basically can build your own study guides that have questions and examples from all the materials, so you can feed that material in and get something out, just enhance it.
And I think for faculty, this is also an easy way to create good study guides: you can pull the key points and build the study guides a lot more easily. Starting from a blank page and trying to craft one by hand can be very difficult. But if you already have all your material, you feed it in there and say, “Here, let’s build a study guide out of this, with some parameters.” Definitely much more useful.

Rebecca: We’ve talked a lot about how to use ChatGPT as an individual, either as an instructor or as a student. Can you talk a little bit about ways that instructors could use ChatGPT for in-class exercises or other activities?

Stan: Absolutely. And I’m sorry, some of these examples other folks actually contributed first. I saw them and thought they were just brilliant, but I don’t have their names right in front of me, so I apologize ahead of time. But as an instructor, I would invite ChatGPT into the classroom as another student. We call it Chad, Chad GPT, and bring Chad into the classroom. So you could have an exercise in your classroom: ask the students to get into groups and talk about an issue, and then, up on the whiteboard, you start getting their input, you start listing it. Once you’re done, you can feed Chad GPT the same prompt, get the list from Chad GPT, and then compare it to what you’ve already collected from the students, what their input has been. From there, you can do a comparison: “We talked about that, and that, and that. Oh, this is a new one. What do you think about this?” And so you can extend the conversation with what Chad GPT has provided… and there I go, Chad; I’ll be hooked on that for a while. Or if students have questions that come up in class, you can field them to the rest of the class, get input, and then say, “Okay, let’s also ask Chad; see what Chad has to say about that particular topic.” For those grouping exercises, we typically do the think-pair-share exercise; well, now each group gets Chad. They have to think about it first, write something down, pair up and discuss it, then add ChatGPT into the mix, talk about it a little bit more, and then share with the rest of the class. There are lots of different ways that you can bring this into the classroom, but I bring it right in as another student.

Rebecca: Think-pair-chat-share. [LAUGHTER]

Stan: Yep. And that one’s not mine; somebody was clever enough to come up with it, and I just happened to glom on to it. But yeah, definitely a great way of using it. It’s a new tool. We’re still finding our way, but it’s not going away.

Rebecca: So whenever we introduce new technology into our classes, people are often concerned about assessment of student work using said technologies. So what suggestions do you have to alleviate faculty worry about assessing student work in the age of ChatGPT?

Stan: Well, students have been cheating since the beginning of time. That’s just human nature. Going back to why they’re cheating in the first place: in most cases, they just have too much going on, and it becomes a time issue. They’re finding the quickest way to get things done. So ensuring that assignments are authentic, that they’re real, that they mean something to a student, is certainly very important in building this. The more an assignment is personally tied to the student, the harder it is for ChatGPT to tap into it. ChatGPT is not connected to the internet yet, so having current information is always a consideration. But I would go back to transparent assignment design, and the part of transparent assignment design that is often overlooked is the why. Why are we doing this? If you use ChatGPT to do this, this is what you’re not going to get from the assignment. So when building those assignments, I recommend being very explicit: yes, you can use ChatGPT to work on this assignment, or no, you cannot, but here’s why. Here’s what I’m hoping that you get out of this, why this assignment is important. Because otherwise, it just doesn’t matter. And when I have an employee who simply hits the button and gives me something from ChatGPT, I’m going to ask, “Why do I need you as an employee? Because I could do that. Where’s the human element?” Bring that human element into it: why is this important? What learning are you shortcutting if you just rely on the tool and don’t grasp the essence of the particular assignment? But I think it goes back to writing better assignments… at least that’s my two cents on it.

Rebecca: Thankfully, we have ChatGPT for that.

John: For faculty who are concerned about these issues of academic integrity, certainly creating authentic assignments and connecting to individual students and their goals and objectives could be really effective. But it’s not clear that that will work as well when you’re dealing with, say, a large gen-ed class. Are there any other suggestions you might have for getting past this?

Rebecca: John? Are you asking for a friend? [LAUGHTER]

John: [LAUGHTER] Well, I’m gonna have about 250 students in a class where I had shifted all of the assessment outside of the classroom. I am going to bring some back into the classroom in the form of a midterm and final, but those are only 10 and 15% of the grade, so much of the assessment is still going to be done online. And I am concerned about students bypassing learning and using this, because it can do pretty well on the types of questions that we often ask in introductory classes in many disciplines.

Stan: That’s a hard question, because there are certainly tools out there that can identify where text is suspected of being written by AI. ChatGPT produces original text, so you’re not dealing with plagiarism, necessarily; you’re dealing with the fact that it’s not yours, it’s not human-written. There are tools out there, but they’re not necessarily 100% reliable. Originality.AI is a tool that I use, which is quite good, but it tends to skew toward flagging everything as AI-written. Turnitin has incorporated technology for identifying AI, but it’s not reliable. This honestly comes down to an ethics issue: folks who do this feel comfortable bypassing the system for the end game, which is to get a diploma. But then they go to the job and they can’t do the job. A recent article that I read in The Wall Street Journal expressed a lot of concern about employees not having the skill sets that they need. And how do you convince students of this? “Why are you here? What’s the whole purpose of doing this? I’m here to guide you, based on my life experience, on how to be successful in this particular discipline, and you don’t care about that.” That’s a hard problem to fix. So I don’t have a good answer for that. I’m always on the fence, because students bypassing the work hurts the integrity of the institution, but it’s hard to stop. Peer review is another tool, having them go assess it. They seem to be a lot harder [LAUGHTER] on each other. Yes, this is a tough one. I don’t have a good answer. Sorry.

John: I had to try again, [LAUGHTER] because I still don’t have very good answers either. But certainly, there are a lot of things you can do. I’m using clickers. I’m having them do some small group work in class and submit responses. And that’s still a little bit hard to use ChatGPT for, just because of the timing. It was convenient to be able to let students work on things outside of class, although Chegg and other sites had made most of the solutions to those questions visible pretty much within hours after new sets of questions were released. So this perhaps just continues the trend of making online assessment tools in large classes more problematic.

Stan: Well, one of the strategies that I recommend is master quizzing. Master quizzing is building quizzes that draw randomly from banks of thousands of questions, and students get credit when they ace it. Then the next week, they have another one, but it’s also cumulative, so they get previous questions too. And you have to ace it to get credit. Sorry, that’s how it is: cheat all you want, but it’ll get old after a while.

John: And that is how my course is set up. They are allowed multiple attempts at all those quizzes, and the questions are randomly drawn. There’s some spaced practice built in too, so it’s drawing on earlier questions randomly. But again, pretty much as soon as you create those problems, they show up very quickly in the online tools on Chegg and similar sites. Now they can be answered pretty well using ChatGPT and other similar tools. It’s an issue that we’ll have to address, and some of it is an ethics issue. And some of it is, again, reminding students that they are here to develop skills, and if they don’t develop the skills, their degree is not going to be very valuable.

Rebecca: I wonder if putting some of those Honor Code ethics prompts at the beginning or end of bigger assessments would [LAUGHTER] prime their pump, or just cause more ChatGPT to be used. [LAUGHTER]

John: That’s been a bit of an issue, because the authors of those studies have been accused of faking the data, and those studies have not been replicated. In fact, someone at Harvard was suspended recently and is now engaged in a lawsuit about that very issue. So the original research that was published about having people sign their names before beginning a test hasn’t held up very well, and at least some of the data seems to have been manipulated or fabricated. [LAUGHTER] So, right now, ChatGPT allows you to do a lot of things, but they’ve been adding more and more features all the time. There are more integrations; it’s now integrated into Bing on any platform that will run Bing. It’s amazing how well it works, and the improvements are coming along really rapidly. Where do you see this going?

Stan: In November 2022, ChatGPT launched, built on GPT-3; now, basically only half a year later, we’re into GPT-4. I mean, it’s everywhere. For example, in selling books, one of the things that you want to do is try to sell more books. So I went back to Amazon, pulled out all the reviews that I had, fed them into ChatGPT, and said, “Tell me what the top five issues are.” In seconds it assessed them, nice and neatly, where this would have taken a large amount of time for me to do. Everything is going to have AI in it. AI is being built into Grammarly. All the Microsoft products are going to have AI built in. We’re not getting away from it. We have to learn how to use this in our professions, in our disciplines. With GPT-4, it was reported that somebody drew a wireframe diagram of a website, the buttons, masthead, and text, took a picture of it, gave it to GPT-4, and it wrote the code for that website. It’s gonna be exciting. Buckle up. We had consternation back in January; we’re gonna have a lot more coming up. It’s just part of what we do. We have to figure out how to stay relevant, because this is so disruptive. In the long line of technologies that have come out, this is really disruptive. We can’t fight against it; we have to figure out how to use this tool appropriately.

Rebecca: The idea of really having to learn the tool resonates with me because this is something that we’ve talked about in my discipline for a long time, which is design. But if you don’t really learn how to use the tools well and understand how the tools work, then the tools kind of control what you do versus you controlling what you’re creating and developing. And this is really just another one of those kinds of tools.

Stan: Well, even in the design world, I’ve gone to Shutterstock, and there is a feature that allows you to create a design with AI. The benefit for a designer is that they have a certain language of tone and texture. Their language is vast, and a prompt they craft would look entirely different from mine; my “snowman with sticks for arms” would be entirely different. Specifying the aspect ratio of 16 x 9, everything that you craft into this prompt and feed in: somebody who does design and knows the language will get something better than a mere mortal like me putting that information in. So for somebody who’s in economics, you have a whole language about economics. Somebody who is trying to craft a prompt related to that discipline has to know the foundations, the language of that discipline, to even get close to being correct in what they’re gonna get back. And students have to understand this: they cannot bypass their learning, because they will not have the language to use the tool effectively.

John: And emphasizing to students the role that these tools will be playing in their future careers might remind them of the importance of mastering the craft in a way that allows them to do more than AI tools can. Though I do wonder [LAUGHTER] at what point AI tools will be able to replace a non-trivial share of our labor force.

Stan: It’ll affect the white-collar workforce a lot quicker. And a nice analogy for AI comes from Marvel: you have Iron Man, Tony Stark. It’s the mashup of the human and the machine. He’s using it to get further and faster in his designs, and to do things we hadn’t thought about before. And I see this tool being able to do that. We’re bringing so much information and data to it, it’s mind-boggling: suddenly you see a spark of inspiration that you couldn’t get to by yourself without a lot of labor, and it’s just there. And you can take that and run with it. For me, it’s tremendously exciting.

Rebecca: So we always wrap up by asking, what’s next?

Stan: Great question. Right now, I’m getting edits back from my editor for my next book. It’s Strategies for Success: Scaling your Impact as Solo Instructional Technologists and Designers. I’ve been doing this for about a quarter century, mostly by myself, helping small colleges figure out how to do this: how do I keep my head above water and try to provide the best support possible? So, sharing what I think I know.

Rebecca: Sounds like another great resource.

John: Well, thank you, Stan. It’s always great talking to you, and it’s good seeing you again.

Stan: Yeah, absolutely. And also, a free book: I’m gonna give one to the first 100 listeners, but I can go more. There’s a link, it’s bit.ly/teaforteachinggpt, so it’s in the show notes to share, and the first 100 get a free copy of the book.

John: Thank you.

Rebecca: Thank you.

John: And we’ll put that in the show notes.


John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.


304. ChatGPT Inspired Course Redesign

AI tools such as ChatGPT have the potential to significantly disrupt how we work and how we learn. In this episode, Don Donelson joins us to discuss a course redesign strategy that could help prepare students for a world in which AI tools will be ubiquitous. Don is a senior lecturer in the Miami Herbert Business School at the University of Miami. He is a recipient of the Spring 2016 University of Miami Excellence in Teaching Award and the Dean’s Excellence in Teaching Award from the Miami Herbert Business School.

Show Notes


John: AI tools such as ChatGPT have the potential to significantly disrupt how we work and how we learn. In this episode, we discuss a course redesign strategy that could help prepare students for a world in which AI tools will be ubiquitous.


John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.


Rebecca: Our guest today is Don Donelson. Don is a senior lecturer in the Miami Herbert Business School at the University of Miami. He is a recipient of the Spring 2016 University of Miami Excellence in Teaching Award and the Dean’s Excellence in Teaching Award from the Miami Herbert Business School. Welcome back, Don.

Don: Glad to be here.

John: Today’s teas are: …Don, are you drinking tea?

Don: I am. It’s the same tea that you’re drinking, black currant, and it’s great.

Rebecca: It’s a John favorite for sure. I have a Tazo Awake tea today.

John: Does that mean you’re woke? That may be an issue down in Florida.

Don: …not in a private school.

Rebecca: It means that I couldn’t make a pot of tea. I didn’t have time. So I had to use a single tea bag. [LAUGHTER] That’s what it means.

John: And I am still using the mug from Australia that Clare McNally gave me with kangaroos all over it.

Rebecca: I like that mug.

John: I do too.

Rebecca: I look forward to seeing it in person.

John: Soon.

Rebecca: Yeah, you’ll be back soon, right? A couple weeks… I’ve got my grad studies mug. We’ve invited you here today to discuss your plans to revise the graduate and undergraduate core courses in critical thinking and business communication at Miami Herbert. Can you tell us a little bit about this course?

Don: So this course started at the grad level, MBAs in particular, in 2008. That’s what I was hired to teach. And it grew with the program, expanding into specialized master’s programs. And then it went out into the undergrad program. And it’s a core course required for all full-time business students, undergrad and graduate. This past year, we had 46 sections of undergrad courses and 21 sections of graduate courses, about 900 students or so in total.

Rebecca: So a really small situation going on here.

Don: Oh, yeah, very small, [LAUGHTER] no problems with scaling or anything like that.

John: What was the typical focus of this course in the past before this revision that you’re working on?

Don: So the course was called “Critical Thinking and Effective Written and Oral Communication,” and it lived up to its name: it was about those three things. At the time that we started in 2008, we called them soft skills. We don’t use that phrase anymore; we like to call them fundamentals, or something of the sort. We think that “soft skills” sends a bad message. The course has been overhauled before; this will be the third overhaul. And from the very beginning, the main evaluations in the courses have been based on writing memos and giving presentations.

Rebecca: Which should be about the kind of communication you’d be doing in business. [LAUGHTER]

Don: Yeah, it’d be based on hypothetical cases, and some non-hypothetical cases: the standard Harvard Business Publishing 10- or 20-page case on “How did Netflix beat Blockbuster?” or something of the sort.

Rebecca: What prompted this big overhaul?

Don: Well, the accreditation body, AACSB, required program evaluation. And it’s sometimes an annoying task that people do just to go through the motions. But what we found, the first time I went through it, was that we actually learned a lot from going through those motions. And so in my department, at least, we institutionalize curriculum audits on a semester basis. And so, in between the fall and spring semesters, we have a shorter meeting, where we kind of look at what happened in all the instructors, faculty teaching in that space in the fall, what happened and what worked, what didn’t work, and we might make some minor revisions. And then at the end of the spring semesters, we’d have a little bit of a longer meeting. And the last few of those had turned out some opportunities for change.

Rebecca: Can you talk a little bit about what some of those opportunities were?

Don: Yes. So we found that this is a challenging course to get buy-in from the students. And we still haven’t figured out 10% of what we can figure out about teaching in this space, I’m sure. But one of the things that we’ve seen is that it is somewhat of an innovative curriculum, and one of the challenges with being innovative is students haven’t had material like this before. It’s a core course, and so students have to take it, and there are always challenges with that. But this is a bit of a different challenge. I was talking with John the other day; he has some core required economics courses for business students, and some of the challenges that come with those. But this is a bit different, in that those students know that economics is a field of study. They know that people take economics courses and there’s a textbook. Critical thinking and communication, though, have been taught as separate add-ons in other courses, not as a discrete course itself. And so we think there are some challenges with that. And really, the challenge that we’ve seen, in addition, is that from the faculty view, critical thinking and communication are not separate things; they are one thing. Critical thinking, which I would call problem solving, is really what we’re teaching, and communication is a component of that. But from the student view, we’ve had a hard time getting them to see those as integrated. So when they do a memo as an evaluation metric, they see it as, “Well, that’s just looking at the writing and not critical thinking.”

Rebecca: That’s interesting. My department, which is art and design, has done some of those same things in our design courses around critical thinking and writing across the curriculum. We would do projects, and embedded in those projects would be things like memos and other forms of communication as a way for our student designers to think critically about the decisions they were making. But we would sometimes run up against the same kinds of challenges: how do you really make that feel practical and relevant, and also keep it interesting? I think it helped, in those cases, that it was tied to a project. So is that a challenge you face in this particular class, because these are standalone case studies, and it’s hard for students to buy in or place themselves in a business space?

Don: That’s one of the things, actually, that I think is going to change: more of an arc to the course. One of the things that I’m looking at is more integration of assignments, so that things build toward the other assignments and skills build on top of each other. Ideally, the assignments that they’re doing all build toward one culminating assignment, a capstone-type project.

Rebecca: Where does this course fit into their other required courses? Is it something that happens in the beginning? or in the middle? towards the end?

Don: So that’s partially an administrative question that is dependent on staffing. We see some students wait until the very last semester to take it, particularly the students who don’t have English as a first language, but they can start taking it as early as their sophomore year. But usually, it’s junior.

John: What’s the difference between the undergraduate and the graduate versions of the course?

Don: So the graduate versions are taken on a quarterly basis, and the undergraduates on a semester basis and so there’s more contact time in the undergraduate version of the course. They use different materials, and they’re more in depth. And so much like you would see with undergraduate economics classes, the graduate version of the economics classes might have similar titles, but go far more in depth into the material.

John: So one of the main issues is that students don’t see the critical thinking aspect of it as being important in their writing. How are you going to change your course to focus a bit more on the development of those critical thinking skills?

Don: Well, this is where I need to go back and add more to what Rebecca asked before about what prompted this, because, of course, ChatGPT prompted a lot of the revisions as well. And so ChatGPT, AI in general, while it’s kind of an independent axis of revision, we were thinking about some of these other problems well before ChatGPT even became a thing that people were aware of. But they go hand in hand, really. A lot of the problem that I’ve seen with the writing assignments, and why students don’t necessarily view them as critical thinking and focus on the writing, is because there’s writing for aesthetics, and then there’s writing for substance. And if you’re teaching anything about writing, you kind of have to be teaching both. But when you’re teaching both together, the students tend to focus more on the aesthetics. And they connect it back to English composition classes that they might have taken in ninth grade or 10th grade. Those classes are certainly very important, but they’re a bit different from what we’re doing in these classes. And so I think it primes them to approach the course in a way that is not really conducive to getting what we want out of it. And so with AI, well, it remains to be seen, but it looks to me like you don’t need to be teaching the aesthetics of writing so much anymore in a class like this. And so I’m going to experiment with just not. [LAUGHTER]

Rebecca: It’s interesting, because in design classes where we were doing some similar kinds of things, aesthetics obviously always come about, because if we’re doing visual design, aesthetics are a part of that conversation. But we would have the same thing. It was like, “Well, that looks nice. That reads nice. It just doesn’t say anything.” [LAUGHTER]

Don: Right. Yeah. [LAUGHTER]

Rebecca: So it’s interesting that we bump up against these same kinds of challenges across a wide variety of disciplines. And that ChatGPT does offer some opportunity to focus on some different things.

Don: Absolutely.

Rebecca: I’m curious what exactly you’re going to focus on and how you’re gonna leverage ChatGPT in the context of this class.

Don: So I think ChatGPT is an insane, wild, amazing tool. And it’s only going to be more wild, more insane, more amazing next month, [LAUGHTER] or six months from now. But I see it really changing the way that I teach and the way I prepare for teaching, on the end of creating lesson plans and what it does for me as a teacher, and the kinds of things that I can do in a class that I wouldn’t have been able to do before without a lot more hours in the day. And then also, from the student side, changing the way that they do assignments. They’re not going to be writing memos outside the classroom in the way that they have in business for decades and centuries. They’re going to be using ChatGPT. And so in this course, which is meant to be a practical course, we can’t make it practical if we’re not allowing them to use the tools that they’re not only going to be encouraged to use, they’re probably going to be required to use. If they don’t use ChatGPT in the future, they’re going to have bosses saying, “Why are you spending X amount of hours on this client memo instead of doing something else?” And so we really need to prepare them for that world. There’s been some early research, and I’ll get John the citations, but we looked at research over the break between fall and spring this past year, some preliminary research about the kinds of jobs and the way the labor market is going to be affected in the future by AI and ChatGPT. The jobs that were predicted to be the hardest hit, in terms of reduced wages and reduced demand, are jobs that involve writing, and the jobs that were predicted to be the most insulated from AI were jobs that involve problem solving and critical thinking. And so really, when you look at that research, it doesn’t even give us a choice. Even if we weren’t thinking about making some kind of revisions before, we’d probably need to just on that alone.

John: So is the focus now shifting more to the critical thinking skills and a little bit less on the basic structure of writing?

Don: Yes. And that’s really where, even though the impetuses for the revisions were independent, in practice, they’re not going to be that independent. And so it really dovetails nicely. I’ll give you an example: if a student is writing a memo where a business is making a decision between two or three different courses of action, and one of the main criteria is the profitability of those courses of action, the structure is kind of guided by the math of profitability. So if you’re not talking about revenues independently of talking about costs, you’re not proving profitability. And so when we talk about structure in this course, that’s really what we mean. But students very often, because of some of the things I’ve talked about previously, are looking at it as the five-paragraph structure. And that’s not really what we mean. And so by being able to focus less on the aesthetics of writing, and more on the substance, I think we’ll be able to undo some of that priming…

Rebecca: …almost like this shift to articulating the decision making…

Don: Yes.

Rebecca: …rather than talking about writing, because articulating, it could be verbal, it could be in written language, it could be in a lot of different formats. But the point is that you thought critically about the issue, and how you made the decision. [LAUGHTER]

Don: Yes, exactly. In presentations, I’ve never had as much of the same problems as we have in the memos. Part of that, I think, is because of contemporaneous feedback. My students early on learn that this comment is kind of a trolling comment, and it’s not really meant as an attaboy or attagirl. Sometimes a student will give a speech, and when they’re done, I’ll say, “I’m very impressed with your public speaking skills.” And they think, at first in the early parts of the class, that that’s a compliment. But they realize that it’s actually not a compliment. What I really mean is: “No one would be buying what you’re selling, no one would be buying this stock, no one would be making a decision based on this, but you have very impressive charisma and confidence.” And that’s not really what we’re about… maybe in politics, but that’s a different question.

Rebecca: I’m curious about integrating ChatGPT as part of the process. Are you thinking about requiring students to reveal and discuss how and why they used ChatGPT in particular instances, and how they leveraged the tool?

Don: So I think part of it is going to be first showing them how ChatGPT is not a critical thinking tool. And so I think it’ll be kind of walking on the escalator backwards for a bit, just so that we can walk forward. It’s not going to be: ChatGPT’s here, so you should use it. Go. ChatGPT is like a personal assistant who is extremely capable and competent, but will do precisely what you tell it to do, and nothing else. The input you give it determines the quality of the output. And so if you go to ChatGPT and you say, “I’m writing a letter of recommendation for Rebecca, and she was a great student, and she’s applying to law school, period,” it’s going to give you about what you would imagine… it’s going to make up some stuff about Rebecca, it might not even get what program you’re in right. It’s not going to use the last name, because I didn’t give it one. And it’s going to give you a very fluffy, perhaps disingenuous response. Now if I give ChatGPT a really robust, almost stream-of-consciousness prompt: Rebecca Mushtare was a student in the spring of whatever, and she got this grade, and she did phenomenally in these areas, and on this assignment she really stood out most because of this, this, and this, it might give me a much more usable response that I can then play with. And so I think that’s going to be the first thing to instruct students on: what it does not do, which is critical thinking. And from there, I think they’ll have to use it however they feel comfortable. We’re still going to have some writing assignments that are scored. But what I’m hoping for is that these changes will make it so that they’re focused much more on the critical thinking parts of it. And so for some students, that might look like writing a fairly complete draft on their own, and then putting it into ChatGPT and telling it to edit this for brevity and clean up grammar mistakes, or something of the sort.
For some students, it might be much more of a back and forth kind of a conversation with ChatGPT, which I think a lot of students will be surprised to learn that it functions in that way. And when I find myself using it, it’s mostly as a conversation. Like, I didn’t like what you did here, cut that part out and do this again.

Rebecca: It’s funny that we don’t always think about it as a chat tool, despite the fact that chat is in its name.

Don: Yeah, exactly.

John: Before making this major change in your curriculum, have you experimented with any changes in this course recently to put more focus on critical thinking skills before introducing ChatGPT?

Don: Yes, so in some of the sections, especially at the graduate level, since we have so many different master’s programs. When I first started, it was MBA, and pretty much that’s it. Now, with where the business world is going, there’s a lot more demand for specialized skill sets. And so, in addition to MBAs, we have a Master of Science in Finance, a Master of Science in Sustainable Business, so on and so forth. And each of those sections affords some opportunity to take things in a different direction. Really, it’s not even just an opportunity; we kind of want to, to be more responsive to those fields. And so in the graduate sections, we’ve had some isolated ability to experiment with more problem-based learning, which I think ChatGPT goes really, really well with on the faculty end, as far as creating problem-based learning curriculum. But we haven’t experimented with the AI component of it yet, really, because it’s so new, even though it doesn’t feel like it right now… it kind of feels like it’s been around 20 years, and yet we haven’t used it. But it’s very new. And so I don’t know about every other institution, but we don’t move at the pace of jets when it comes to curriculum revision at the University of Miami. We move, I think, faster than probably most, but still, it takes time. And so we haven’t had the opportunity to do anything with the AI yet. But we’ve revised in the past couple of years to focus more on some of the problem solving in some of the graduate sections.

Rebecca: The faculty member in me heard I can use ChatGPT to help me with problem-based learning classes, and I want to know more about that.

Don: Oh, yeah. [LAUGHTER] So when you type into ChatGPT, you have to give it really, really good direction: the who, what, when, where, why… that you are a professor teaching a negotiation class, and it is an upper-level undergraduate course, and you are going to create simulation practice for negotiation in which you play one role and the student plays the other, and you will create a scenario and interact with the student, but wait for the student’s response after each of your responses. And at the conclusion, give the student feedback based on what you know about the science of negotiation, from a management sciences perspective as well as a legal perspective. And then you hit go, and you will be blown away with what ChatGPT starts to create. And so it will give you a little blurb. A couple of weeks ago, I did something of the sort, and it said: Sally is the owner of a handmade furniture manufacturing company in North Carolina and has been contacted by so-and-so, who owns a furniture retail store. So-and-so has been impressed with Sally’s furniture and wants to arrange a distribution agreement. The meeting begins over the phone, and so-and-so asks Sally what her goals are in this arrangement. And then that’s where I would type in, and I said my goals are to reach this level of profitability and to have a productive long-term relationship with the other party, and it responded back. So it can create an entire dialogue that you can then tweak afterwards and say, “Well, I liked this part of it, I didn’t like this part of it; write the Python code for this,” and it will write the whole Python code and allow you to turn it into a web-based interactive program. It’s really quite wild.

John: So basically, it gives every faculty member the ability to create interactive simulations for their classes, which could be done for pretty much any topic I would think.

Don: Absolutely. In the past, that kind of thing was, in some courses, probably a bit aspirational. It’s the kind of thing that would probably require some kind of course release to develop. And for faculty who become really comfortable with it, it will get to a point where it’s doable within a day or two of a lesson. And so, on kind of a miniature scale, you can do these on a daily basis, really high-quality ones.

Rebecca: It sounds like something that we can use in a lot of contexts in higher ed, including if we want to do simulations for interviews for new positions or other things as well, if you’re trying to better understand how someone might approach a problem.

Don: Absolutely. I think that’s a very good application, in fact.

John: One of the things though, that I think has generated some panic for a lot of faculty is the effect that this may have on how we assess student learning. So how can faculty address issues related to the ethical use of artificial intelligence?

Don: Well, I’ve never known any faculty to ever panic over a technological innovation… sarcasm ended. So I think faculty have to assess this on their own, but also as part of a community. One thing that I think is going to be an early problem is faculty doing things in different ways; I think that’s probably unavoidable. And so I say all that as kind of a disclaimer: my approach, and what I think our approach is going to be in my department (and I don’t even know that for sure), is perhaps going to be different than others’ approaches. And so, since this course is supposed to be so much of a practical course, and the writing is on the wall, no pun intended… well, it’s in the AI software… I view that we really have no choice. There’s been a lot of commentary in The Atlantic magazine, a lot of commentary in the higher education journals. And most of what I have seen focused on this question, but using as an assumption that it’s wrong to use ChatGPT. And so the easiest way to make it not a question of cheating is to allow it to be used, and then it’s not cheating. And so that’s the direction that I’m leaning in. And I think, ultimately, for the practical tools, for the practical courses, that’s going to be the direction it goes. But again, I can’t even speak for my own department on that, because we’re so new in this.

John: And that will be an issue, I think, everywhere, as it has been in the past with things like calculators, or smartphones, or even Apple watches, I remember getting all these memos coming in from various places at one point to make sure your students are not using a smartwatch while they were taking an exam, because somehow the answers are going to miraculously appear on that tiny little screen for the test that you’re giving them.

Don: Right. And I think you can’t really separate the assessment design and the student response to the assessment in this. There are going to be some courses, I can imagine, in different disciplines, that are focused on more fundamental, foundational skills, where it’s going to be more of a challenge. Now, I’m not saying that students don’t need to know and learn about the aesthetics of writing; that has to keep happening, but not in this course. And so I don’t know how the faculty in those spaces, and really the 9th- and 10th-grade composition teachers that I talked about before, deal with it… probably in-person assessments, that sort of thing. But for this practical application course, I would view it really as kind of training track runners to hop on one foot. That wouldn’t be very practical. And so if you have a cheating or plagiarism or honor code policy that requires them to only use one foot, then it would be plagiarism for them to run on both feet. But that wouldn’t be very helpful. And so I really view ChatGPT the same way in a practical sense: if your plagiarism or honor code policy defines ChatGPT as out of bounds, you’re training them to run on one foot.

Rebecca: So we’ve talked a lot about the writing component, and really building in stronger structures to focus on critical thinking. One of the other issues that you identified was that students don’t necessarily see the intrinsic value of the course or like get the buy-in. Can you talk a little bit about the ways that you’re redesigning to help with that piece of it as well?

Don: Yeah, and so certainly a lot of students do get that. But it really depends on how intrinsically motivated they are. And I think it requires faculty to kind of sell it somewhat. And so in my courses, I’ve found success with selling it in that way, which I really don’t like to do, and it’s something that a lot of faculty probably think is kind of an icky thing to do. But, for instance, I will repeatedly tell students: I’m not here to make myself feel good. I’m not here to make myself feel smart by putting you all down. I’m here to help you all get jobs, and to get promotions at those jobs, and do well in your careers. And so I will focus a lot on pointing things out as criticism while also telling them these are not affecting your grade, however, X, Y, and Z. Little things like, in a presentation… you know, we have to have time limits for presentations because it’s basic math: we have X number of students and 75 minutes in a class session. And so when a student has five minutes and they go over, what do you do? I’m not going to take off on grades for that. But I am going to point out for a student that, in some settings, if you’re given a time limit, that’s because the CEO has another meeting five minutes after you start and you will be cut off, not because they don’t like what you’re saying, not because you haven’t followed the directions, but because they’ve got somewhere else they need to go. And so a lot of the problem, I think, is just that students are so focused on grades, to the shock of everybody, [LAUGHTER] that when the things that you’re grading and are affecting their grades are these kinds of… and aesthetic isn’t the right word, but they would view grading something like that as a bit ticky-tack.
And when you’re scoring things like that, it’s much harder to have a serious conversation about the nitty-gritty substance, and how, if you’re trying to prove that this course of action is more profitable than the others, and you didn’t provide any support for the change in costs, you really can’t have accomplished your goal. You don’t get the same attention from the students, or the same response, if you’re also talking about things like “You went five and a half minutes when you only had five minutes,” or “You didn’t use 10-point font when you were told to use 10-point font.” And with ChatGPT, on that second example with the 10-point font: if the instructions said 10-point font and the students input the instructions, it would produce it in the appropriate formatting.

John: And I know in the past, when I’ve graded student papers, I, as many other people do, spent far too much time correcting grammatical errors, reminding them that there’s a difference between singular and plural, or the difference between all the various homonyms out there. It might be easier for us to evaluate student work when we can actually focus on the arguments they’re making, and their ability to engage in critical thinking, rather than getting ourselves so tied up in all this minutiae, which I always try to avoid doing; but when I see so many errors in student work, it’s hard not to at least correct some of it so that they could become more proficient. In the future, they may not need to have that type of correction.

Don: Yeah, John, I think you really hit the nail on the head there. You really feel like you have an obligation to correct those, and when communication or writing is one of the titular topics of the course, even more so. But I have always felt that you get diminishing returns on the things that you focus on. And every time you’re talking about grammar, you are not talking about the critical thinking. And the grammar does matter. I can tell you that I have lots of conversations with CEOs, with HR directors, etc., in the ongoing effort to make sure that my curriculum is responsive to what’s happening in the market. And one thing that I consistently hear is that grammatical errors and spelling errors, on slides or in cover letters, are catastrophic. And it’s not because they’re nitpicky, it’s because the markets are so competitive, and they get a window that’s maybe 5% or less of what someone’s actual quality as a candidate may be. There’s going to be some other candidate that is just as qualified and equal in every other way who didn’t make grammatical mistakes in their PowerPoints, and so on and so forth. So it is important. But it doesn’t matter how good your grammar is or how compelling your vocabulary is: if you are missing some of the logical components of the argument, you cannot be correct.

John: And when students get feedback, where they see dozens of comments on it, the easiest strategy is to focus on correcting those small grammatical errors that are riddled through it. You might also have told the student that they don’t have a very substantive argument. But if they’re going to make a lot of corrections, it’s easier for them to focus on correcting the grammar and ignoring the more fundamental problem.

Don: Right. The very first writing assignment I ever did as a graduate student was a 50-page memo and I got back no comments anywhere except for on the front page: “Do you talk like this?” I think probably somewhere [LAUGHTER] in between those two is ideal. But you’re exactly right, that the more that we focus on things like grammar and tense and such things, the less we can focus on the meat and the critical thinking.

Rebecca: It’s funny how often the level of polish is what takes someone from a really high grade to an excellent grade.

Don: Right, exactly.

Rebecca: Even when something’s foundational, that’s often not how our feedback structures work, even if we keep form and function feedback separate and weight them very differently. It’s really easy to address the form issues, because it’s almost like a series of checkboxes and it doesn’t require a lot of thought; the critical thinking part’s the hard part. And so it’s funny that even if they’re weighted differently, and we keep the comments separate, students will always flock towards the thing that’s kind of easy to fix. I mean, who wouldn’t? Then it becomes a checklist.

Don: And that’s exactly really kind of where this boils down to me, it’s not to say that those things aren’t important. They’re still very important. But in the world in which ChatGPT is a real thing, which it now is, and will continue to be and only be more powerful than it is, the juice that we get out of spending time in class or in feedback, in office hours, whatever it may be, talking about those sorts of things, is getting much less of a return than it did before ChatGPT. I am not a walking detector of 100% perfect polish by any means. But it seems to me that the product that ChatGPT can produce, in terms of those things that you were speaking about, Rebecca, is pretty dang good and hard to distinguish for me from highly polished products. But again, where it is easy to distinguish is this is a load of crap [LAUGHTER] that is fluff and has no substance to it, but a very polished load of crap, but nonetheless…

Rebecca: It’s pretty crap. [LAUGHTER]

Don: Exactly. It’s very pretty crap with a nice bow.

John: …which reminds me of some work that I graded just the other night, where spelling and grammatical issues have mostly disappeared in student essay responses since the advent of ChatGPT, but the substance is not always there. And there were many responses that I provided feedback on which said, “this is a really nice response, but not to the question that you were asked to address.”

Rebecca: Yeah, or you spent two paragraphs and you haven’t actually said anything yet.

John: So teaching students how to use ChatGPT or other AI tools more effectively might allow them to be more productive in their learning as well as beyond their college experience.

Don: And might allow us to make for more productive learning environments as well.

Rebecca: So we’ve talked a lot about course content, and what to maybe focus on and not focus on. One of the most important things a course has is its syllabus or course outline. Can you talk a little bit about course policies and the way that you might make change in that realm?

Don: Yeah, so I think you’re gonna have to be more detailed than you’re probably used to being in terms of putting language in syllabi, very specific and upfront. And so some of the policies that I’ve seen, that I’ve liked elements of and am going to end up including in my syllabi, spell out the explicit weaknesses of ChatGPT: it is not a critical thinking device, it will produce responses only as deep or as shallow as you instruct it to, you are still responsible for the critical thinking, essentially. And be very explicit in terms of what’s allowed and what’s not allowed. And I think also it would probably be a good idea for faculty to be putting in explicit language that what is allowed in this course is not necessarily the same as what will be allowed in other courses, and it is incumbent on students to navigate those differences themselves.

Rebecca: And part of the reason why things might be different across courses is because the focus of those courses is different and really helping students understand that there’s reasons why policies might be different in other classes. It’s not necessarily arbitrary.

Don: Right, exactly.

Rebecca: So we always wrap up by asking: “What’s next?”

Don: Well, what’s next is I figure out how to do all this stuff, [LAUGHTER] and not just to talk about it.

Rebecca: …and you’re gonna send us a memo, right, with that in it. [LAUGHTER]

Don: Oh yeah. Yeah, I’m happy to. [LAUGHTER]

John: …or at least have ChatGPT generate a memo explaining….

Don: Exactly. So yeah, what’s next is to put this stuff into action. Of course, as I mentioned, some of the things here have already been experimented with, the non-ChatGPT parts of it at least, but really kind of integrating them and seeing if what I am imagining is what comes to fruition, in terms of: do these things dovetail as well as I think? I really think that they do. That kind of pre-existing urge to go more towards the critical thinking element really does, I think, dovetail well with the AI. But putting it into practice will take place over the course of probably all of next year. And so there are going to be some experimental sections; most of the sections are probably not going to look very different than they did in the spring, and I think that’s probably a very good plan. But there’s going to be some experimenting in some of the sections at the undergraduate level, and there’s a faculty learning community on problem-based learning that this course is going to be participating in in the fall. And so a lot is going to come out of that, I think, as well.

John: Do you think there’ll be much buy-in from other people teaching the course?

Don: So, students, by and large, do not like writing. Faculty, by and large, do not like grading writing. And so I don’t think this is one of those political monsters of “How are we going to get this through? How are we going to make these changes work?” I think there are probably a lot of people who have nervousness about how you would make these changes. But with those two facts, which I don’t think you’d get much disagreement on, even across disciplines, I don’t think it should be that difficult for this to be implemented.

Rebecca: Well, I hope you’ll join us after you’ve implemented some of these things to share some of your reflections and let us know how it went.

Don: I’m happy to.

John: Well, thank you, Don.

Don: Thank you.


John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.


303. Higher Ed Then and Now

Teaching practices have gradually evolved as we’ve learned more about how humans learn. From one year to the next, these changes may appear small, but the cumulative effect is profound. In this episode, Todd Zakrajsek joins us to reflect back on the changes that have occurred in higher ed during our careers.

Todd is an Associate Research Professor and Associate Director of a Faculty Development Fellowship at the University of North Carolina at Chapel Hill. He is also the director of four Lilly conferences on evidence-based teaching and learning. Todd is the author of many superb books, and has published four books in the past four years. His most recent book is the fifth edition of Teaching at Its Best, a book he co-authored with Linda Nilson.

Show Notes

  • Zakrajsek, T. and Nilson, L. B. (2023). Teaching at its best: A research-based resource for college instructors. 5th edition. Jossey-Bass.
  • Zakrajsek, T. D. (2022). The new science of learning: how to learn in harmony with your brain. Routledge.
  • Harrington, C., Bowen, J. A., & Zakrajsek, T. D. (2017). Dynamic lecturing: Research-based strategies to enhance lecture effectiveness. Routledge.
  • EdPuzzle
  • PlayPosit
  • ChatGPT
  • Wayback Machine


John: Teaching practices have gradually evolved as we’ve learned more about how humans learn. From one year to the next, these changes may appear small, but the cumulative effect is profound. In this episode, we reflect back on the changes that have occurred in higher ed during our careers.


John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.


John: Our guest today is Todd Zakrajsek, and I am with Todd here in Durham, North Carolina. Todd is an Associate Research Professor and Associate Director of a Faculty Development Fellowship at the University of North Carolina at Chapel Hill. He is also the director of four Lilly conferences on evidence-based teaching and learning. Todd is the author of many superb books, and has published four books in the past four years. His most recent book is the fifth edition of Teaching at Its Best, a book he co-authored with Linda Nilson. Welcome back, Todd.

Todd: Well, thank you, John. Well, this is exciting. And Rebecca may be a long ways away, but I have never been arm’s length from a person who interviewed me for a podcast before.

Rebecca: Isn’t that cool?

John: And we’ve only ever done that before at a conference or at Oswego.

Todd: I feel very special.

Rebecca: Well, we can celebrate with our teas. So, today’s teas are:… [LAUGHTER]

Todd: I’m drinking a peach mango that I got from some teas that John brought, which are fantastic.

Rebecca: John, how about you?

John: I am drinking a Tea Forte black currant tea, which I brought from Oswego, in a new mug that was given to me by Claire McNally, when she visited this area last week.

Todd: Love Claire, she’s fantastic.

John: And it has kangaroos on it.

Todd: Yeah.

Rebecca: And I can’t see it. Let me see it, John. Oh, that’s a cool mug.

Todd: It’s a good mug. I got a mug from her university. But I didn’t realize I should have brought it. So I feel bad about that. But it is a podcast. So I didn’t think about what it would look like.

John: That’s true, we generally don’t do a lot of visuals on here.

Rebecca: And I have a blue sapphire tea in my Tea Rex mug.

Todd: Well, that’s a nice mug.

John: We’ve invited you back to talk a little bit about how some of the changes you’ve observed in college teaching across your career have impacted how you teach today. When did your work in higher ed begin?

Todd: Actually, it started when I was a graduate student. So back in 1987. So there’s no reason to try to figure out how old I am. Now I’ve basically specifically dated myself here. I started teaching, I got to teach an introduction to statistics course. And I had so much fun that I taught again the following year. And by the time I left my graduate program, I had taught more courses in that program than any other graduate student had ever taught in the psychology department there. I really loved teaching right from the beginning, and from the beginning I was very concerned about student learning, and just getting rolling.

Rebecca: What was it about the teaching, Todd, that really got you hooked?

Todd: Just watching the students. It’s the same thing as it is today: when you have an individual who’s struggling with something, and suddenly they get it, and you realize that they may eventually get it on their own, but you realize how much you’ve helped them to move that along very quickly. And facilitating the learning process, I just really love that. That doesn’t mean I was fantastic at it. But I really did love it.

Rebecca: Sometimes the things we love the most are things that we’re not great at to start with.

Todd: That’s true.

John: My experience was similar, actually, I started in 1980, with a course where I had a fellowship, so I didn’t have to teach. But there was a sudden shortage in the department. And they asked me to fill in. And I was planning to go on into research. But it was just so much fun teaching that I’ve never stopped.

Rebecca: I taught as a graduate student too, and taught the whole time I was there. But I started a little bit later in 2003.

Todd: Alright, so that was a couple of years later.

Rebecca: Just a couple.

Todd: Yeah, I had kind of a funny start, I will mention that. When I first started, after the first semester of teaching, my students got almost all As and Bs. And the department chair called me in and he said, “I’m not going to have you teach any more courses.” And I said, “Why not?” And he said, “Well, you give grades away like candy, we have to have better standards than that.” And I said, “Well, what are you basing that on?” And he said, “Well, you know, we looked at the grade point averages.” And I said, “Well, how about if I bring in my final exam and just walk through it, and then you can tell me how it could change to be more rigorous.” And so it was great. I showed it to him. At the beginning, like the bottom of the first page, the students had to calculate a statistical value, then I had them explain how they came about that number, and, if they had used a different test, what the interpretation might have been and why it would have been inappropriate. I thought it was important for them to understand how these things can change. The chair said, “I can’t believe you have your students in the first class actually talk about various tests like that.” And I said, “Yeah, I did.” Then we turned the page and he said, “You did nonparametric tests?” I said, “Well, yeah, we did parametric tests, but then I thought they should know the nonparametric equivalents.” And he said, “We never do that.” And then he turned to the last page and he said, “You had them do a two-way ANOVA? You’re only supposed to go through one-way ANOVA.” I said, “Yeah, but we’d finished everything and we still had a week left. And I figured I might as well introduce the next concept to them. And so I showed them how to do a two-way ANOVA, and they ended up with all As and Bs. So if you could help me figure out how to push their grades down and give them lower grades, I’m perfectly happy to do that.” And he then set me up with two courses the next semester. But that reliance on grade distributions to evaluate teaching is always funny.

Rebecca: Todd, it’s just funny, as we’ve gotten to know you through the podcast [LAUGHTER] it sounds so perfect that that was your first experience. [LAUGHTER]

Todd: Yeah, I’ve lived my entire career on the edge. [LAUGHTER]

John: And those sorts of arguments are still occurring in a lot of classes today about rigor and the need to keep grades lower.

Todd: Yeah.

John: They’re less severe than they were a few years ago.

Todd: Yes, but we’re also still judging how well a person is teaching based on student evaluations, when we should be looking at authentic assessment. Some things have changed through the years; some things have not.

Rebecca: Well, technology is one of those things that has changed.

Todd: Woosh, yeah.

Rebecca: Can you talk a little bit about what tech was like in the classroom when you first started and how it’s evolved a bit?

Todd: Yeah, I know you have some listeners who have been teaching for a very long time. So those of you who have been teaching for like 30 to 40 years, just stop and think back about what it was like when we first started. For those of you who have been teaching, like Rebecca, since 2003, let’s just mention that technology back then was mostly pens and chalk and chalkboards. Back then, of course, there was technology… there’s always technology… but we were using overhead projectors. This was long before the internet came along to really be used in classes. LCD projectors were not out yet. Canvas, Blackboard, Sakai, all those learning management systems were not around. We didn’t have any way to email individuals; you couldn’t email your students back then. And there was no ChatGPT to write your papers for you.

Rebecca: But there were calculators that could do all the work for you.

Todd: Yes, but this is the cool part. Back when I started teaching statistics… I’m glad you mentioned the calculators… a huge debate back then was whether or not the students should calculate the statistical values by hand using the calculator, because computers had just come onto the scene and we could punch the data into a computer and have it run an ANOVA for you. Should you calculate it by hand? Should you run it through the computer? And there was a huge camp that said you should do it by hand or you will never understand a statistical value. And I said, “You know, we’ve got the technology there. Why don’t we have the students use the computer to do the mundane stuff, and we’ll have more time to talk about the theoretical and the important implications.” But even back then we were having the discussions about whether to use the technology at hand or not. Oh, and by the way, we were also hanging grades on doors. So we would figure out the grades, we’d tack them to the door, and then the students who wanted to know their grades for the class would swing by and look at the door.

John: And they were sorted alphabetically, to make it easier for people to find where they were in the grade list.

Todd: Yeah, it was great. We listed them according to their social security number, [LAUGHTER] which was a little different back then. And yeah, we actually did that back then. But as John pointed out, they were listed by number so nobody knew whose number went with whom, except, surprisingly, they were alphabetical on the door. So not only could you figure out Armstrong’s exam score, you’d get Armstrong’s social security number as well. Yeah, times have changed.

John: And it was also back in the day of dittos and mimeos as well, which were the only way of disseminating information on paper.

Todd: This is so much fun. We’ll get to the real meat of this thing, but that walk down memory lane has some fun stuff too. The dittos…

Rebecca: I remember dittos, just for the record, okay.

Todd: Yes. So you probably remember, if you dittoed just before class and you handed it out in class, the students would all pull the ditto up to their faces so they could smell the ditto fluid. And they got that smell. I was running dittos one time in the graduate student office, and the machine ran out of fluid, so I had to put some more fluid in, and I looked down and noticed that the floor was kind of eaten away by this ditto fluid. And then… this is the best part… about a month later, I was digging for something in the closet and I found extra tiles, and I thought they should put these tiles down to replace the ones that were all eaten away. And on the side of the box it said these tiles were long lasting and durable, reinforced with asbestos. So that ditto fluid was eating through asbestos tiles. That’s some strong stuff.

John: …to make it a little bit more friable so that it would disseminate in the air nicely.

Todd: Well, there had to be something to help the faculty members who were running all their own dittos to not mind doing it, and one way of doing this is to have them use ditto fluid, because I’ll tell you, you may not have liked it when you started, but by the end, it was all right. [LAUGHTER]

Rebecca: It’s funny that we’re taking this walk down memory lane, because on our campus, I was in our historic lecture classroom today in Sheldon Hall.

John: What are some of the other changes that have occurred and how have they influenced how we teach?

Todd: Yeah, so it’s interesting. I did the walk down memory lane and we were chatting about this stuff. It’s all fun, but thinking about how the changes have taken place, I think that’s really important. So there have been massive changes. I think that we tend to forget it’s so easy to communicate with students now. Heck, people are texting now, so you can text back and forth with students. But think about how that has changed through time. There was a time when I would have to call and leave a message for a student on an answering machine, and then they would call back and we would try to find a time that we could talk on the phone. If we wanted to have a conversation, I could either leave a note for the student or I could call and leave a message that says please come see me after class. So even having a conversation with a student was difficult. Then it became easier with email, because you could start emailing back and forth. And now we have Zoom. And the equity in the way that this has changed… just think about the difference: if I’m leaving a message for a student, they may not even have an answering machine if they’re living off campus with limited means back then. So even getting in touch with a student would be challenging. Now I can have a Zoom conversation with a student who doesn’t have to hire a babysitter, who doesn’t have to find reliable transportation, who doesn’t have to drive across town and burn gas, and do all of those things that it would take to have a 15-minute conversation that in the past would have been really hard, and even four or five years ago would have been challenging. The grades: why in the world would a person have to leave… and I was teaching in very northern Michigan, there were days that the wind chill was 75 degrees below zero… and students would leave their dorm rooms and walk across campus to see a grade on a door. It’s actually physically dangerous. And now we have learning management systems, where we can post things for students. Interlibrary loan used to take weeks to get a document that you can now go online and get. People can lament all of these technological changes at times, but we’re actually creating more and more equity within the higher education system as we make certain things easier. I’m not saying that we’re anywhere near an equitable system yet, but we’re moving in a really good direction. And a lot of those changes are helping us to get there.

Rebecca: I’m thinking about all the times when I get to go to the door or meet after class, it really assumes that students are a certain kind of student, they’re full time, they have time. And our students now are working [LAUGHTER], and where they’re juggling a lot of different schedules and things.

Todd: Yeah, and I mean, we want to be careful too. And I agree with you 100%. But they were juggling back then too. But some of the things we were doing… for instance, I taught a night class. Now I would probably suggest, if I was going to teach a class from 7 to 10 pm, that I would teach it through Zoom, because there are a lot of reasons that’s good to do. But I had students that I noticed in class would very quickly, at the end of class, start talking to other students, and I couldn’t figure out what was going on, because there was a lot of buzzing and stuff. And what I found was that there were certain students who were uncomfortable… and we were on a very safe campus… but they were uncomfortable walking to their cars at 10 o’clock at night. So I started saying to the students, “Hey, I’m gonna park my car… and when we showed up, there were quite a few cars there… but I’ll be under the second light. I drive a little red Chevette, not a Corvette, a Chevette, but I’ll have my car there. If you want to park near me, we can walk out together.” And there were students that were not paying attention to almost any of the class because they were fearful of how they were going to get to their cars safely. When you think about Zoom and stuff, there are even safety factors. I would never have a review session now, like I used to, at 8 to 9 pm the night before the exam, because I’m exposing people to potentially dangerous situations. Now we’d have Zoom sessions. But I can tell you, 40 years ago there was no concept of what Zoom would be and how it would work. Even Star Trek didn’t have stuff like that.

John: And besides the inequity associated with people who were working, many campuses had a lot of commuting students who could not easily get back to campus for office hours. Or if they were just taking classes on Monday, Wednesday, Friday, and your office hours were on a Tuesday, they’d have to come in that extra day, arranging childcare or their work schedule to be able to fit that in.

Todd: Yeah, it really did start to change that system. So we got a little bit more equity, and, like you were saying too, for the commuting students, the part-time students, the students taking distance courses. When I first started teaching, I was writing… oh my word, remember the correspondence courses? …you’d mail away and get a packet of material, and you’d take a test at a local library. And people talked about distance education being not as good as on campus, but at least better than nothing. And now we’re finally getting to a system where we can stop assuming that those folks who are coming in for part-time courses and stuff are just getting something better than nothing. They’re actually getting something similar to full college courses; some of those online courses are actually as good as or better than college courses that are on campus. But all that’s changing with the technology. It’s crazy.

John: And there’s a lot of research that supports that, in terms of the relative learning gains with online and face-to-face courses, as well as hybrid courses, which seemed to outperform the others in a few meta-analyses that have been done. But those were options that just weren’t available back then. And the early online courses were often designed to be replicas of face-to-face classes, and they probably didn’t work quite as well. But we’ve learned a lot since then, which brings us to the issue of research. During the time that you’ve been teaching, there’s been a lot of research on teaching and learning. But while some of it was taking place, it wasn’t very widely disseminated to faculty.

Todd: Yeah, that is true, too. It’s so much easier to get research out now. It’s easier to gather data, it’s easier to write it up, it’s easier to edit it… all of those types of things are happening now that couldn’t happen before. And as a result, we’re learning a lot more about how people learn… you know, the book I did, The New Science of Learning, looks at a lot of the ways that students learn. Part of it’s just the ease of getting to information, but part of it’s also being able to investigate how people process information. I used to teach Introductory Psychology back then, and we would talk about the stages of sleep. And nobody really knew, for instance, what REM sleep was about; we knew that you had to have it or else it caused some problems. Deep sleep we knew was important; we now have indications that deep sleep is necessary for the consolidation of semantic memory. You can get eight hours of sleep, but if your sleep is interrupted and you don’t get deep sleep, the information doesn’t get consolidated. Procedural memory… how to give shots and kick balls and do anything procedural… looks like it’s more solidified during REM sleep. So the different types of sleep are associated with learning different types of information long term. We never knew that before all this technology was around. In fact, I remember from my intro psych class being told that you were born with a certain number of neurons, and as you lived through life, neurons would die. And if you killed them by drinking or doing drugs or something, they were gone forever, and you would never get more. And if you broke a connection, it was broken forever. That’s just simply not true. But it’s what we thought back then. So technology has really allowed us to look better at how people learn, different ways of helping them learn, and different ways they can even study. By the way, before we move on, we now have this physiological demonstration about staying up all night and cramming the night before the test. Even though it gets you slightly higher grades on the test, we now know that because the information is not consolidated, it won’t be there a week later or two weeks later. So we’ve always told students they shouldn’t do it, but now we can actually show them why it doesn’t work.

John: And the LMS itself has offered a lot of ways of giving more rapid feedback to students, with automated grading and tools that give them more low-stakes testing opportunities. And those were things that we just couldn’t easily do back when you started teaching.

Todd: No, John, that’s a really good one. And we know that one of the most consistent findings right now in all of the learning and memory research is that the more often you do something, the easier it becomes: long-term potentiation. Which means the more frequently you retrieve information from your long-term memory, the easier it is to retrieve. And just like you mentioned, LMS systems can now be set up so that you can do practice quizzes… you could do dozens or hundreds of practice quizzes and keep pulling that information out over and over and over again. That was just not possible before this. And so the LMS helps with that. It helps by giving feedback, really good feedback, so that students know what they’re doing well and what they’re not doing well. And it helps faculty members to design feedback specifically for certain types of projects, so that I can more easily give more feedback without spending a lot more time on it. So LMSs have done a tremendous amount of work. And that’s not even mentioning the fact that you can have the LMS loaded with the content, so students can log in and get their information without leaving their house. If there are financial challenges with your class, you can put in articles; the students may not have to buy a book, they can read the articles. And so we’ve got students who are able to come to classes because they can afford to be there. By the way, I remember being on a committee when I was a graduate student, and we were looking at financial aid and different financial systems. And I remember asking the Chief Financial Officer, “What increase in tuition does there need to be before you start to see students drop off because they can’t afford to be here?” And this was about 40 years ago, but he said $100 for a year: if they have to pay $100 more this year than last year, some students won’t come back. If we look at the price of textbooks now, textbooks can cost $400. So a book like that is definitely going to make the difference between some students being able to take the class or not. So LMS systems make this possible.

John: And they also make it easier to share OER resources that don’t have any cost for students, or some less expensive adaptive learning platforms, giving all students that first-day access. I remember, not so long ago, when I was still using textbooks in some classes, students would wait several weeks before they got the book. And that put them at a severe disadvantage. And the people who were being put at a disadvantage were generally the students who came in with the weakest backgrounds, because they came from lower-resourced school districts.

Todd: Yeah, if they had the resources, they’d have the better foundational background material, and they’d be able to buy the books. And you mentioned OERs. Open educational resources are another thing that’s really valuable, because back then, before the technology, you couldn’t produce something that would be readily available throughout the world. And so with this project that’s going on now, where they’re doing introductory-level books in all the different disciplines, you can get an OER introductory psychology textbook that students can log in and read. None of that was possible before the technology. So even the creation of OERs has changed so much.

Rebecca: Well, speaking of digital materials, libraries have changed significantly too over time from having completely physical collections and interlibrary loans and things that take a lot of time to having a lot of digital resources, which changes access to research and materials that you can populate into your classes, but also can aid students in the work that they’re doing. Can you talk a little bit about the change in libraries and how that’s impacted how you’ve taught?

Todd: Yeah, you know, libraries have been fascinating to watch over the last 40 years, because the biggest challenge librarians used to have before them was which books to put on the shelves, because there was a finite amount of shelf space and there were lots and lots of books. And so that was the big thing. We used to take out journals that weren’t used very much to make room for other journals. Through time, little by little, they started digitizing all that stuff. And I can remember chatting with librarians. One conversation I had was back around 2001. I said, “It’s gonna be interesting, because there’s gonna come a day when there’ll be no books in the library,” and the Dean of Libraries said, “Well, there’s always going to be books.” I said, “Not always, potentially. But even if we reduce them, what is your foresight? How is the library going to change?” And so he had a couple of ideas. But what our conversation basically boiled down to is this: I always felt like a library was the brain of the campus; it had the books, and it had all of the information that you could go and get. As the books left, and things were digitized in a way that you could find this stuff… when the internet came along, you could get all the information right from your dorm room or from your apartment, you could get anything you needed… the library was still a physical space in the middle of campus. And what it should become is a learning commons, a place where people go to share and to learn from one another. And I think that’s what’s really changed: individuals still pile into libraries and use the space, but they use it in different ways. They go there to meet other individuals to work, which they did before, but they took away that aspect of going there for the books. And it meant all of those shelves got emptied, and they started pushing them out. And you can go into libraries right now that have very few shelves.
But they have webcams, they have smartboards, they have spaces where folks can plug in their computers and share with one another. They’ve got screens set up so that you can project and have students sitting around a table, they’ve got Google Glass set up, all of these types of things that bring students together to use technology to learn from one another.

John: And they have cafes to help support that to make it easier for people to gather.

Todd: Yeah, you could swing by and get a cup of tea.

Rebecca: It’s funny, even when I was in high school, my sister and I would rely on going to the library to have access to a computer so that we could even type a paper, because we didn’t have one at home. And that role as a place to access technology started a long time ago, but it’s amped up quite a bit over the last 20 years.

Todd: Yeah, and I agree completely. And the computers are there. I mean, even right now, with the books dissipating, there are still large numbers of computers. And oftentimes there’ll even be an area in a library that’s carved out with really high-end computers. But it gives students an opportunity to go. We make this assumption that everybody has a computer, and they don’t. But libraries give them that opportunity.

John: Yeah, for those students working on smartphones or Chromebooks, that gives them access to all the tools that students with $2,000 or $3,000 or $4,000 computers have.

Todd: Yes, because smartphones can work for lots of things, but they’re a little tough to write a paper on.

John: When I started teaching, and probably when you did too, the predominant mode of instruction, which actually still is often the predominant mode of instruction in many departments, was lecture. That’s changed quite a bit since then. Could you talk a little bit about the shift from lecture-based courses to courses that involve much more active learning activity?

Todd: Yeah, or they just involve a lot more of everything. Take the concept of flipped classrooms, which was almost impossible 30 or 40 years ago, because you really couldn’t get the information to the students. Yes, it was kind of possible, but whoo, if it’s hard now, it was really hard back then. But now we have the ability to get information out to students so that they can read it before they come to class. But coming back to the lectures… so I’m going to take this moment, and those of you who know me know that I’m going to do this: we still have no evidence that lectures are bad, but there’s something that we need to really keep in mind, and I think this is vital. I do think it’s important for us to be able to talk about buzz groups and jigsaws and fishbowls and lectures and Socratic lectures, discussion lectures, all those different methodologies out there, so that we know what we’re talking about when we chat with one another. But I do think it’s time that we stop talking about lectures being more effective than one thing, or fishbowls being more effective than something else, and look at the components of what is valuable in a learning experience.

John: And a good reference for that is a book on Dynamic Lecturing, which you happen to be a co-author of.

Todd: That is true. And in fact, there’s Dynamic Lecturing, and then there’s a chunk about that in The New Science of Learning, and then there’s a whole chapter about that in Teaching at Its Best. That’s a good point, John, thank you.

Rebecca: It’s almost like you’re trying to slip it in everywhere you are.

Todd: Because the research… people keep talking about one methodology being better than another. Here it is, folks: you can be a hideous lecturer, or you can be a phenomenal lecturer. If you’re a hideous lecturer, your students are not going to learn anything. If you’re a phenomenal lecturer, students will learn from you, but they won’t learn all the time; it depends on some student factors. I’ve actually been exposed to group work in flipped classrooms that was awful. And so the concept is, we start thinking about… and this is why it’s going to come back to the technology… the elements that need to be there, that are necessary for learning to take place. I’m just going to do this, because it’s not the topic, so I’ll make it very brief: let’s just go with three things. Number one, you have to have attention. As a teacher, if my learners aren’t attending to what I’m saying, if they’re on their phone or thinking about bacon, then they can’t process what I’m presenting. And if you’re having a think-pair-share and they’re not attending to the person they’re sitting next to, it’s the same thing; you have to have attention. Number two, there has to be some value. If I’m hearing somebody or I’m reading something, and it has no value to me, it’s really hard to get it into long-term memory and to learn it. And number three, I have to have a clue of what’s happening; I’ve got to understand some aspects. Now, if we think about attention, value, and understanding, we can flip back to the technology. This is why gaming works. Gaming draws the attention, it increases the value, because you want to win the game, and it has understanding. We have all played games… you open up the old board games, and now it’s digital, where you don’t have a clue what the game is. It’s like: if you advance a player four pieces and the opponent advances five pieces, you have to go back three spaces, unless it’s a Tuesday. When the instructions are that complicated, you don’t understand. So we can use technology to help with attention, we can use technology to help with the value of what’s going on, and we can use technology to help with understanding. Those are things that were very difficult before. And they allow us to do things like a mini lecture, then shift over to an active learning exercise, and then say, “Take all this information and create a Zoom session tomorrow that will go over it again.” So the technology has really helped us to be able to do all of these things to get at the core of learning, a topic I barely care about. [LAUGHTER]

John: That’s an important one, because people often see this as this binary issue where you lecture or you use active learning. And there are some really effective ways to combine them. And in fact, in that book on dynamic lecturing, it was suggested that lecture can be more important in introductory courses, when students don’t have as much of a knowledge base.

Todd: You’re absolutely right. Discovery learning is a really great way to learn if you’ve got a lot of time. I can just put you into a room with some other people and say, “Here’s some data, and here’s some things we need to know. Go.” And if you don’t have any foundational knowledge at all, it takes forever to figure it out; if you go online, you don’t know what to look for. But I could do a five-minute lecture, and at the end of five minutes, set it up and say, “Now go and work with your neighbors. In fact, here’s what we’re going to do: we’re going to have you each work in small groups in class, and I’m going to open up a Padlet. At each table, I want you to go in and add your information, or put it into the column that corresponds with your group number.” As an instructor, I can watch everything develop in front of me. While I’m in the room, I can look at my laptop and see it, and walk over to a table and say, “Looks like you’re struggling a little bit.” I’ve lectured, I’ve put them into small groups, I’ve had them use technology, I’ve created a little bit of competition on who can come up with what, and I’ve had a way to monitor it and give them feedback. That is so different from what teaching used to look like. So pulling it all together, that’s what we do.

Rebecca: The tools to be able to monitor have been really helpful in my own teaching and being able to get a better pulse on what’s going on and get a nice overview and then be more targeted in how to interact with small groups rather than just kind of wandering around more aimlessly like I think I did initially. [LAUGHTER]

Todd: Yeah, and this is all going to be great until we get our cognitive load headbands that I’m waiting to be developed. So anybody who’s listening, take this idea, run with it, you can make a bazillion dollars, and then take me out to dinner or something. I want a headband, and the headband has a light, and it measures brainwave activity. And then as I’m teaching, if it starts to be a little bit too much, you’re moving out of that zone of proximal development, the light turns from green to yellow. And then when it hits red, it’s like when you’re trying to put together an IKEA bookcase and someone comes by and says, “What do you think of this?” and you say, “Errr, I’m working on an IKEA bookcase right now” …that shutting down, that’s the red light. I’m telling ya, that’s going to be the technology we’ll want next.

Rebecca: It would be so helpful. [LAUGHTER]

Todd: You can actually look and see somebody else’s zones of proximal development and their cognitive load. Whoof. Which, by the way, there’s a little party game that people play periodically at parties: if you were a superhero, what would you want your superpower to be? I was in a room one time, and one person said they wanted to fly, and somebody else said that they wanted to be invisible, and real quickly in my head I’m thinking, what could you possibly gain that wasn’t illegal or creepy if you’re invisible? So aside from that, and transporting and everything else, they got to me, and I said, “I want to be able to see people’s zones of proximal development. If that were my superpower, I’d be the best teacher.”

John: I bet that went over really well at those parties. [LAUGHTER]

Todd: Yeah, my friends all said “You are amazingly smart and quite insightful.” They used different words, but that’s what I heard. [LAUGHTER]

Rebecca: They didn’t start with what is that? [LAUGHTER]

Todd: As soon as I start talking, most of my friends just shake their head and drink whatever beverage they have near them. [LAUGHTER] So yeah, it’s good times, good times. They’re all impressed. They don’t say it all the time, but I know they are.

Rebecca: I think one of the things that often happens with technology is that it allows us to get things quickly and move through things quickly. But sometimes, as you just noted, learning doesn’t happen quickly.

Todd: Yeah.

Rebecca: Can you talk a little bit about speed and the difference between maybe not having all the technology and all the things really quick versus maybe now where we have it at our fingertips, but do we always want it at that speed?

Todd: So there’s another study that I’m waiting to see. This is an easy study, folks, somebody can run this one quickly. We all know that students are listening to any recorded lectures or recorded material that they have to watch at faster speeds: 1.7x is about the best speed that we tend to see people listening at, 2.0 is a little bit fast for some folks, and 1.0, normal speed, is no good, too slow. So what I’m curious about is the space between words and between sentences. Because our brains move so fast, we can listen faster than somebody can talk, and with all this other stuff going on, I can be thinking and processing while you’re talking to me. But if I bump that up to 1.7, I think we close those gaps, and I hear it a lot faster. What I don’t think is happening, though, is the cognitive processing while I’m listening, the active listening component of it. So I think technology can create concerns in those directions. And students who do try to process material too fast… we’ll wait and see.

John: And that’s especially important in flipped classrooms where students do watch these videos outside. One of the things I’ve been doing with those, though, is embedding questions in the video so they can watch them as quickly as they want. But then they get these knowledge checks every few minutes. And then if they find they’re not able to answer it, they may go back and get their attention back and watch that portion again.

Todd: Yeah, I think that’s a really good way to go. EdPuzzle’s kind of a fun technology to use. I don’t know if that’s the one you used.

John: I’m using PlayPosit, which is a bit more expensive, but it works beautifully. I love it. They did just double the price this year, though; it was bought by a new company.

Todd: This is the tricky spot now as the prices are going up. You know, inflation is a terrible thing to waste. Anytime somebody can raise prices now it’s like, “ooh, inflation.” So prices double. Inflation was 8% and runaway; now it’s back down around 3%. But when inflation was 8%, they doubled the price and said, “Hey, we’ve got to.” But yes, some of them are expensive. There are lots of things that are less expensive; oftentimes we pay for functionality that helps us, but there are the freemium kinds of things, stuff that’s inexpensive. I just want to let everybody out there know: just about anything you want to do in class, or can think about doing, there’s a way to do it either for free or probably for under $100 a year, which I know can be expensive for some people; it’s about eight bucks a month. And things like Padlet, which I think might be up around $140 now, so maybe $12 a month, can change how much time you spend doing things, and how much time for students. But yeah, I love the embedded questions to help slow things down.

Rebecca: I think that the cognitive load can happen really quickly if we’re piling lots of information in but not always providing the time to process and use that information in some way in the kind of activities that you were talking about. Or knowing when everybody’s red light is going off in the class.

Todd: Or when people try to do multiple things. I mean, now you’ve got the technology around. So if students are trying to listen to an assignment while they’re texting their friend and have a TV on, I mean, we’re living in an age where there is a lot going on, and people believe they can process lots of things. Evolution doesn’t happen quite that fast. And so I think we have to be careful with that one.

John: One other thing that’s happened is back when you and I both started teaching, the only way students generally communicated their learning was either on typed pages or on handwritten notes. Now we have many more types of media that students can use. And also we’ve seen a bit of an expansion of open pedagogy. How does that help students or how does that affect student learning?

Todd: Wow, that’s really changed a lot as well. Blue books. Remember the blue books? I think they still sell blue books in the library. They may cost more than the eight cents, I think it was, when I started, but the concept was: you wrote things down, you turned them in to the faculty member, the faculty member would grade them and turn them back. One of the big things that I caught years and years ago was so much wasted cognitive energy in terms of what they produced. I’d read a paper from a student and think, this is amazing, and no one will ever see it. It was written for me, I graded it, and now it’s done. I think the technology has changed so many things. One of the biggest things, and I would encourage all the listeners, any faculty member out there: whenever possible, create something that will take the students’ work, the things that they’re doing, and use it to make society better. It’s not that hard. There are assignments that you can do on Wikipedia. For anybody who wants to complain about Wikipedia: if you don’t like it, I’m going to go back to Tim Sawyer, who was a faculty member of mine the very first time I ever did TA work. I was complaining about some students. And he said, “You can complain three times. And after you’ve complained three times, either stop talking” …he was a little bit ruder about that… “or do something about it… just shut up or do something.” And so I complained about Wikipedia for a while, that it wasn’t all that effective. And I thought, well, if I don’t like the page on cognitive load on Wikipedia, I could give an assignment to my cognitive psych class to go on to Wikipedia and fix it. So you can have Wikipedia assignments; there are so many things you could do. Here’s one for you. 
If you’re doing one on communication, you could have your students go and take pictures or short videos somewhere on campus of something that’s meaningful to them, and then jot down why it’s meaningful, take that compilation of stuff, and send it over to the office on campus that does publicity. What better way of drawing students to campus than to have all of these students saying, “I love sitting by the pond because…”? In the past, we would have had students write a paper about someplace on campus that they think is effective, put it in the blue book, we would grade it, and we would turn it back to the students. And that is a waste of possibilities. So I think we do have lots of ways that we can get students involved in helping through technology.

John: One of our colleagues in SUNY, Kathleen Gradel, had an assignment for a first-year course where the students went out, took pictures, geocoded them, and added them to a map layer that was then shared with other first-year students, showing useful resources on campus and their favorite spots on campus, which is another great example of that type of authentic learning.

Todd: Yes, for authentic learning, there are just so many possibilities because of the technology. If anyone doesn’t have ideas, ask deans, ask the provost, ask the president on your campus what kind of information would be helpful, either for the next round of accreditation or just for helping the campus, and we can design those things. Another one I did was taking students to the museum; almost any class can find some way to tie museums in. Through the museum, not only would the students write things that the folks who did curation at the museum could use, but it also helped the students see how issues from the museum, how artifacts and things, can be used in their own lives, to better understand.

Rebecca: When I first started teaching, community-based learning was popular, a fad at the time, and I think having the experience of being a student in a class like that, and then also a faculty member teaching classes like that, has really informed the kinds of projects that I do. Maybe they’re not always community-based learning, but they’re often community oriented, whether it’s the campus or even the surrounding community that the campus is situated in, to help students get connected. There are so many nonprofits that need partners, and there’s always a project that can be done. [LAUGHTER]

Todd: There is. And I used to be a director for a service learning component of the campus. And yeah, there’s just so much out there that we can do to help others.

Rebecca: And students always had such a strong connection. And they didn’t want to fail because other people were depending on them. And so there was a real investment in the work that they did on projects like that.

Todd: I will admit, I’ve never experienced it myself, and I’ve never heard of it from anybody else: when students are doing some kind of authentic learning, and that authentic learning is then used to help somebody else, I have never heard students say, “What a waste of time” or “I hate that class” or “Those assignments are just busy work.” They’ve never used those terms.

John: One common sort of project is to create resources that could be shared with elementary or secondary school students in the disciplines. And again, they can see the intrinsic value of that.

Todd: Yeah. Students could write short manuals on how to learn and then pass that on to the first-year students. And so upper-division students could be helping the lower-division students because not everybody can get a copy of The New Science of Learning, third edition.

John: …available from… [LAUGHTER]

Todd: Available at… used to be Stylus. Since Stylus was sold to Routledge, now it’s available at Routledge. [LAUGHTER]

Rebecca: Given the historical background that we’ve walked through today, what if we think about the future? Where do you see technological changes or learning theory changes impacting the future of higher ed?

Todd: Yeah, we’re living at an interesting time. I like to point out to folks that when you go back to Socrates’ and Plato’s time, there was a thought that if you wrote something down, it would weaken the mind, so we shouldn’t write things down. Luckily, some individuals wrote things down, or we never would have known. We’ve gone through several iterations of those kinds of things. Samuel Johnson, I believe it was, said that with the ready availability of books, teachers would no longer be needed; if you wanted to learn something, you could go get a book on it. Well, that was a couple of hundred years ago. And we still have faculty members, we have students writing things down, we’re reading; I can’t imagine how you could teach without writing things down and having books. The internet came along while we were teaching, as we were discussing earlier; we watched the internet show up. And there were people who said, “Well, with the internet, there’s going to be no need for teachers anymore, because students can get whatever they want.” I can’t imagine teaching without the internet right now. So as we’ve gone through each of these iterations, there’s been this fear that maybe we’d be supplanted by some technology, followed by “I don’t know how I’d work without that.” It’s a little trickier now, because with generative AI, we’re talking about not just something being available, but actually creating something. I don’t know what that’s going to look like. But there are some real possibilities that generative AI, like ChatGPT, could do things like help students who have writer’s block get started. And that’s an individual who maybe could produce something really cool, but just can’t get started. I didn’t publish my first book until about seven or eight years ago, because I’m one of those individuals who has a terrible time with a blank screen. I just have a terrible time with that. And so now, I don’t use ChatGPT to actually write anything significant. 
But I will tell you that I will use it for the first paragraph. That’s all, just one paragraph. And then I completely rewrite that, and there’s no actual trace of it. But it’s something that gets me going.

John: So can we count on more than a book a year going forward? [LAUGHTER]

Todd: No, no, no, no, you can’t. So exhausting. But for the students who can use it that way, I think it’s going to be helpful for them. There will be a type of student who couldn’t have produced before, but now can. We are definitely going to run into some challenges, though, with students who are going to just use generative AI, use artificial intelligence, to create something and hand it in instead of doing the work. So I do think we’re in a challenging time right now, and I wouldn’t make light of that. There’s actually something that I find fascinating in this. Right now, more than ever before, we can actually have artificial intelligence create something for us; especially in higher education, this hasn’t been done before. The tricky thing is that we were the ones able to make that possible, because we learned things. If we let a machine do that work for us, we’re not going to be put into the situation, or our students coming along will not be put into a situation, where they’re intelligent enough to do the things that need to be done when they need to be done. And so I do think we’re facing a real dilemma right now. If my students, for instance, always use some artificial intelligence to create a paper and hand it in, and I can’t catch it, they may end up with an A in that portion of the class. But there’s going to come a day when they’re going to have to write something, or be able to read something and tell if it’s written well. And so I’m a little bit nervous that we’re entering a phase where, by bypassing some cognitive processing that needs to be done, we may be limiting what we’re able to do in the future. Wrapping this up, though, I don’t want to be the person who says if you use a calculator, you’ll never understand this statistical test. So I don’t know where the balance is. But I do think we’re going to have to make decisions coming up that we’ve never had to make before.

John: Generative AI is drawing on that wealth of knowledge that has been produced. And for that to continue to grow in the future, we do need to have some new materials being created. So that is an interesting challenge, unless…

Todd: …unless it creates it. So that was one I thought about. By the way, sometimes you’re sitting around just thinking about stuff, and it’s interesting. I was thinking, how do I acquire new information? The way I acquire new information is I go read articles, I read books, I read a ton of stuff. And then I say, I think this is valuable, I don’t think that’s valuable. And then I put it together and say, here’s what I’m thinking. And now I’m looking at this generative AI that goes out and scans the environment and pulls these things and then creates something new. It doesn’t have the cognitive processing that I have at this point, but…

John: …it’s in the early stages.

Todd: We have some folks who are very concerned out there, especially in European countries, who are starting to put some guardrails up, because at the point that it keeps grabbing stuff, and then generating, and then it grabs the stuff it generated, it’s going to be interesting. But as of right now, I just read another article, I think it was yesterday, that they’re going out and grabbing the most popular or most frequently written things and then putting that down as if it were right.

Rebecca: The way that you might prioritize as a human with an expertise in something, is going to be really different than a system that’s prioritizing based on popularity, [LAUGHTER] or like how current something is like when it was last published. That’s a really different value system that really changes priorities.

Todd: Yeah, and I think it changes how we teach. I think the way we teach is going to fundamentally shift, because we’re going to have to work with students with all these things being available, and explain to them and talk to them about the learning process and the value of the learning process. And keep in mind, this isn’t just about ChatGPT writing papers; everybody’s freaked out about that right now. We shouldn’t lose sight of the fact that you could already get fresh, cleanly written papers that have not been plagiarized at all; we’ve been able to do that for 20 years. There are paper mills; I could write away or contact somebody and say, “Please write me a 10-page paper on Descartes,” and they would write it, and I could turn that in. What has actually happened recently is that everybody can do it, even those who can’t afford to have a paper written at $10 a page or whatever it’s costing. And so equity comes back again. [LAUGHTER] Now we’re an equal opportunity cheater. So we have to be careful with that. But I think the way we teach is going to change, because all that information is going to be available, kind of like the internet on hyperspeed. And then what do you do with that? It’s going to be really intriguing. I think it’s an exciting time.

Rebecca: So Todd, this episode’s gonna come out right at the beginning of this semester. So you’re saying we need to be thinking about how to change our teaching. ChatGPT’s here, what are you doing for the Fall differently?

Todd: Well, I think the biggest thing is what we were just talking about: looking more at the learning process, which has been a big thing for me for the longest time, explaining and talking through the learning process. I can hand you all this information, but if I hand it to you, you don’t learn. In fact, one of my favorite examples came from a friend of mine, and it was the gym: if you want to get in better shape, I could pay somebody to go do sit-ups for me. And then I could somehow log in the book at the gym that 100 sit-ups were done, to use the passive voice there, and somebody else did them for me. I’m not going to get in better shape unless I do the sit-ups. I have to do the work; I have to run, lift weights, do the sit-ups in order for me to be able to gain. We just need to turn that into a cognitive process for our students: to really gain cognitively, they have to do the work. And so I think, more than ever, it’s: how do we convince students of that? And for the faculty members who say, “Well, that’d be great, but my students just want the grade,” if that’s the case, we have a bigger problem than whether or not some technology can write a paper for them.

John: So how do we convince students that it is important for them to acquire the skills that we hope they get out of college?

Todd: I think this is probably going to come down to community building; it’s been there forever. If you really want your students to do the work, the best thing you can do, in my view, and that’s why I’m going to say, Rebecca, that I don’t think a lot has changed in the way I teach, is: you build a community, you build relationships, you talk to the students about the importance of things. If you’re sincere about that, and they get that, then yes, there are going to be some students who are going to mess with the system; they have always been there. But you’re also going to get a lot of students who will say, “Yeah, that’s a good point.” And then they’ll do the work. I don’t teach as many undergraduates as I used to; I’m teaching more faculty than ever because of being the faculty developer. But there were years that I would have to tell my students: don’t put more time than this in on your paper; you have other classes, you need to do the work in the other classes. Because, and I’m telling you, I am very proud of this, my students would spend a ton of time on this stuff for my class, because they didn’t want to let me down. And I would say, you’ve already got an A, I’m proud of what you’re doing, please go work on your other classes. That kind of scenario happens when you build community. And I’m not saying it’s easy; I would never say it’s easy, and it’s not going to happen for everybody. But it is the foundation of good teaching.

Rebecca: So we always wrap up by asking, what’s next?

Todd: There’s just so much going on right now. I think what’s next for me is that I am still in that headspace of coming back from the pandemic. Anybody who says, “Yeah, but the pandemic’s all over” …wait for November; we don’t know, we’re going to see. But I still think what’s next is thinking about how we teach and learn in this environment. So moving in that space, it’s probably not surprising, I’m working on the next book. One of the things I want to do now, since the last couple of books that I’ve done have been pretty heavy books, is write something that’s a little bit lighter. So it’s going to be more of a quick guide with more narrative, having some fun; I love telling stories, I love having fun with people. So I’m going to try to create a book that’s kind of like the science of learning and teaching at its best, but really accessible and more of a story-based way of looking at things.

Rebecca: Who is your audience for that book?

Todd: Anybody who will read it! [LAUGHTER] Anytime I write anything, I have to have the audience firmly in mind and think about who I am talking to. And I really believe there is a pretty big overlap between students and faculty who don’t know specific things. And I’m not saying this in a mean way toward any of my faculty colleagues at all, but there are a lot of people who aren’t taught about things like long-term potentiation and deep sleep in terms of semantic memory, and looking at depth of processing and those types of things. The same type of thing we can say to a student, we know you shouldn’t cram, but here’s why you shouldn’t cram… faculty learn a lot from that as well. And so my audience for this book is going to be faculty and students: students, because I think it’ll be more fun to read about how to learn in a narrative form like that, and faculty, because it’s more fun to learn when you read in that kind of format, for some people. We’ll see.

John: And if faculty design their courses to take advantage of what we know about learning, it can facilitate more learning.

Todd: Wouldn’t that be cool? We could just keep rolling, rolling. What a huge amount of work faculty do; they’re hard-working folks who are just cranking away all the time. Number one, making their lives a little bit easier by helping them understand things would be great. And just having a little bit more fun would be a nice way to go, too.

Rebecca: Hey, anytime you can save time, so that we can have more play in our lives is better.

Todd: Yeah, just to do whatever you want to do.

John: Yeah, ending on a note of fun is probably a great way to end this.

Rebecca: Well. It’s always great talking to you, Todd. Thanks for chatting with us and going on the Wayback Machine.

Todd: Oh, you know, I love the Wayback Machine.

Rebecca: I love it too.

Todd: For those of you who don’t know about that, you should check out the Wayback Machine.


John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.


302. Flipped Team-Based Learning

Flipped classrooms allow for class time to be used to put content into action. In this episode, Tina Abbate joins us to discuss the team-based approach that she uses in her classes to help develop the real-world skills important in her field.

Tina is a Clinical Assistant Professor at Stony Brook University’s School of Nursing. She holds a collection of credentials including a PhD, MPA, an MS, and is a registered nurse (RN). She teaches in-person and online undergraduate nursing classes at Stony Brook and conducts research on active learning strategies and the retention of information. She works as a nursing supervisor at two local hospitals.  She is the recipient of the 2023 SUNY FACT2 Award for Excellence in Instruction and was a recipient of the Stony Brook University Award for Excellence in Teaching an In-Person Course.

Show Notes


John: Flipped classrooms allow for class time to be used to put content into action. In this episode, we look at one instructor’s team-based approach that emphasizes real-world skills important to the field.


John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.


Rebecca: Our guest today is Tina Abbate. Tina is a Clinical Assistant Professor at Stony Brook University’s School of Nursing. She holds a collection of credentials including a PhD, MPA, an MS, and is a registered nurse (RN). She teaches in-person and online undergraduate nursing classes at Stony Brook and conducts research on active learning strategies and the retention of information. She works as a nursing supervisor at two local hospitals. She is the recipient of the 2023 SUNY FACT2 Award for Excellence in Instruction and was a recipient of the Stony Brook University Award for Excellence in Teaching an In-Person Course. Welcome, Tina.

Tina: Thank you. Thank you so much for having me here today.

John: We’re very happy to see you again. We saw you at the SUNY Conference on Instruction and Technology (or CIT) about a month or so ago. And our teas today are:…. Tina, are you drinking tea?

Tina: I am. I am drinking a chai tea. Very good.

Rebecca: That sounds nice and warming.

Tina: Yup.

Rebecca: It’s a little chilly here, although it’s summer and it was hot yesterday. It is not hot today. [LAUGHTER]

Tina: Yes, for sure the weather has been very odd.

Rebecca: So I have my tea for teaching mug today. And in it, I think actually a mix of a couple of different black teas because I switched when I had a half a cup left. [LAUGHTER] I’m not sure what we call this today, but it’s a mix of black teas.

Tina: That sounds delicious.

John: Well, it sounds like a great tea to have while discussing blended learning.

Rebecca: A high quality blend. [LAUGHTER]

John: And we’re having a real cold spell here in Durham, North Carolina. The temperature has dropped down to 87 today, and I am drinking a Tea Forté black currant tea.

Rebecca: That’s a nice summer tea.

John: It is.

Rebecca: So we’ve invited you here today to discuss your use of active learning tools. But before we jump into that, we were curious about your wide range of degrees, credentials, and certifications. We didn’t even list them all. Can you share a little bit about your pathway into your current position at Stony Brook?

Tina: Sure. Well, when I went back to grad school, I certainly didn’t intend to get three graduate degrees. I had gotten into Binghamton’s BS-to-PhD program because I wanted to do research, and my ultimate goal was an executive leadership position at a hospital, because I really enjoyed the leadership role of nursing. So just to backtrack, I graduated from Binghamton University in the year 2000 and started right in the NICU (neonatal ICU) at Stony Brook, and I worked as a NICU nurse for six years. And in that time, I knew that I wanted to go back to school. And like I said, I got into the BS-to-PhD program at Binghamton. They awarded me a fellowship. So I moved from Long Island; my daughter was one at the time. And I started my education there at Binghamton and continued it through the graduate program. About a year into my doctoral studies, they asked if I wanted to teach clinical, and I had taught in other capacities; I used to teach violin and piano when I was younger, but I never really thought of teaching as a career goal for me. However, I was a poor graduate student, and I said, “Sure, I’ll do it.” And I had about six students in the NICU. I was teaching clinical, and, I don’t know, something came over me. I found my professional soulmate; something clicked so hard for me in that clinical that I wanted more. So I continued asking for teaching assignments. And it’s hard to articulate the feeling that you have, but I felt like I found my niche. So I was a clinical instructor for about six years, and then I moved into the classroom setting. At that time, I still worked as a nursing supervisor, so I enjoyed the leadership role. And Binghamton started a dual master’s degree program, where you get your master’s in nursing with a concentration in whatever you wanted; I chose education. And the other part of the dual degree was a Master of Public Administration. So I was in the first cohort to move through that program. 
So I graduated first with my Master of Science in Nursing, and my functional role was educator. Then two years later, I completed the Master of Public Administration, and then eventually the PhD. And it all just aligned so perfectly in my current career, because obviously I’m an academic at heart through and through, so those degrees have assisted me in that role. I still work in administration; I teach research, I teach leadership and management. So I’ve utilized all of the degrees, and I still utilize them actively every day. So this pathway was kind of carved out for me, I think, and I just feel very fortunate that I’m able to apply all of the degrees that I’ve gone for.

John: At the SUNY Conference on Instruction and Technology, you gave a presentation on how you structure your courses. And you mentioned that you were using a flipped team-based learning class structure. Can you tell us a little bit about how your classes are structured, and what a typical class day would look like in one of your classes?

Tina: Sure. So any class that I’m involved in or coordinate, the structure that I utilize is a flipped team-based learning approach. And this essentially requires students to prepare prior to coming to class. It has some benefits there, there’s flexibility, students can learn at their own pace, it really amps up the student responsibility for learning, as we know, and then it also gives us the opportunity for higher level learning because they’re interacting with the concepts outside prior to class. And the team-based part of it I like is because that increases that collaboration amongst students. We know that nursing healthcare is a team sport, so I like to engage the students in teamwork so that they can collaborate and work on their team dynamics, and their own personal team skills. So how my classes operate is, prior to each class, students complete a set of videos, and they’re interactive videos, they’re accessible videos for all types of learners, and it carries weight in their grade. So basically, in these pre-class videos, students get a little voiceover content from me about a concept, and then they get tested on it using a variety of types of questions: matching, true-false, multiple choice, hotspots, you name it. As they move through the videos, they are taking notes on a note-taking guide. So all the concepts are there for them to just follow along, take notes. So they’re seeing, hearing, they’re doing something as they move through the videos. And that note-taking guide eventually acts as a study guide for them, because they have to take a quiz every single class. So they complete these videos before class. And then I start each class with a micro-lecture review using Kahoot!, which is just a game-based learning platform. And in this micro-lecture review, I’m really drilling down to the concepts and helping these students reconcile any last residual confusion that they may have about these concepts. And then after the Kahoot!, they take a quiz. 
Now, since they’ve interacted with the concepts so many times prior to taking this quiz, I push the level of the questions in these quizzes. There are 15 questions and I try to push the level as high as I can. And the students are able to rise to the occasion because they are not hearing the information for the first time when they walk into class. They have a vague sense of the concepts, we nail it down, and then they take the quiz. After the quiz, the rest of class is comprised of team-based activities. And that’s what every class looks like for me.

Rebecca: Can you talk a little bit more about the embedded questions that you have in the videos and how students have responded to that aspect of a flipped classroom?

Tina: Absolutely. I use a program called Articulate 360. Articulate 360 has many different types of functions in it, but I focus more on the Storyline aspect of this product, where I’m able to set up these video clips. So if you already have voiceovers, you can basically chop up that voiceover into different bits and put it into a story file, the type of file that they reference there. And then in between each clip, you can embed any type of quiz question that you could possibly imagine, and you can set up different parameters. So for example, I like to elevate the stakes a little bit, so for these pre-class videos, the grade that counts is the students’ first pass. So it’s not like they can retake the video for a higher grade; whatever they get at the end of that first pass of the video is the grade that counts. And they have two opportunities to answer each quiz question correctly. And I also embed a lot of feedback, so if they get the answer wrong, they’ll see a pop-up with some review, and then whether they still got it wrong or they got it right, there’s an explanation that pops up for the right answer. So I do survey my students in the middle of the semester using a Google Form, and then at the end using the university platform, and the feedback about the videos has been very positive. They really do appreciate it; even though it means extra work, I’m still not giving them 20 chapters to read. I’m giving them something that passes along a bit more quickly and has a better chance of sticking in their memories. And they also appreciate the note-taking guide, because it also becomes a study guide, not just for the quiz, but for the final exam at the end.

Rebecca: Like I’ve counted four or five layers of accountability on that same content. [LAUGHTER]

Tina: Exactly.

Rebecca: We’ve got the note taking guide. We’ve got the embedded questions, and we’ve got the Kahoot!, and then we’ve got the quiz, and then the exam at the end.

Tina: Yeah, so it’s all about building on these concepts, having the knowledge, and then being able to apply it in the classroom.

John: In your presentation, you mentioned that you were de-identifying the names of students taking the Kahoot!, but maintaining a leaderboard in the classroom. Could you tell us a little bit about how that works?

Tina: So Kahoot! is based on answering the questions correctly or incorrectly. And part of the score is how quickly you answer the question. So ideally, you want to answer quickly and answer the questions correctly. So at the end of the Kahoot!, they get a score. And just again, to raise the stakes, students have to hit a certain benchmark of points to receive full credit. And I try to push that benchmark a little bit, not to make it impossible, but just to make it a little bit challenging for them, to give them something to work towards. So for example, in one semester, they have to reach 70,000 points to get the full credit, and then it’s prorated from there. So every time I have a class, I load the data into this program that was built by our Center for Excellence in Learning and Teaching on campus. One of the computer scientists built this leaderboard and showed me how to upload the files, which are basically just CSV files. And what it does is this leaderboard shows their rank in the class, their total score, and the score for that week, so that they can monitor their progress. And everybody else is de-identified with random words, but they can see their own name, and they can see their rank in the class.
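The leaderboard logic Tina describes (a cumulative rank, that week’s score, pseudonyms for everyone but the viewer) and the prorated-credit rule around the 70,000-point benchmark are simple enough to sketch. This is an illustrative Python sketch, not the actual tool built by the Stony Brook teaching center; the row layout, function names, and alias scheme are all assumptions:

```python
def prorated_credit(score, benchmark=70000):
    """Full credit at or above the benchmark, prorated linearly below it."""
    return min(score / benchmark, 1.0)

def build_leaderboard(rows, aliases, reveal=None):
    """Turn weekly (week, name, score) Kahoot! export rows into a
    de-identified leaderboard: (rank, display name, cumulative total,
    latest week's score). Only the `reveal` student sees their real name."""
    totals, latest = {}, {}
    last_week = max(week for week, _, _ in rows)
    for week, name, score in rows:
        totals[name] = totals.get(name, 0) + score
        if week == last_week:
            latest[name] = score
    ranked = sorted(totals, key=totals.get, reverse=True)
    return [
        (rank, name if name == reveal else aliases[name],
         totals[name], latest.get(name, 0))
        for rank, name in enumerate(ranked, start=1)
    ]
```

In practice the rows would be parsed from the weekly CSV exports (for example with Python’s `csv` module), and one view would be rendered per student so that each sees their own name and everyone else behind a random word.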

John: And one of the advantages, I think, of using Kahoot! is it does provide some practice in developing automaticity. So that students can practice retrieving information quickly, which I would think would be especially important in health-care situations.

Tina: Absolutely. And I’ll have some students that come to me and they just absolutely despise Kahoot! because of the stress. And if you’ve ever taken a Kahoot!, and I have, it is stressful, you have to really think on your feet very quickly, especially since your score is based on how fast you answer the question. So what I tell them from the beginning is if you really are struggling with Kahoot!, and you don’t like Kahoot!, Kahoot is really for you, it’s meant for you, because I want you to think of a situation in a hospital setting. If a patient is deteriorating, we call something called a rapid response. And a team of people flow to the room to address whatever issue it is, perhaps the patient’s having difficulty breathing, whatever, chest pain, this now has become a very emergent situation. And in that situation, you have to be, as the primary nurse or a nurse assisting someone else, you have to have laser focus, and someone may ask you to just go get a piece of gauze. And if you’re new in the role you may be so flustered, just by getting that piece of gauze. So, this is really like a precursor to that. So I tell the students to use Kahoot! as a mechanism to help with your laser focus in situations where the outcome is dependent on what you’re doing.

Rebecca: Another thing that seems really relevant to a healthcare setting is the team-based learning aspect of your course. Can you talk a little bit about how you arrange the team-based activities and also how you set your students up for success on teams.

Tina: So with team-based learning, as we know, it’s simply a collaborative learning strategy, and how the team activities look depends on the course. So I can talk to you about my research course. That happens every fall semester, and I have 160 students; this is the graduating class. These are the seniors, they’re in the last two semesters of the program. And what we do in that course is the team-based activity portion of class is working on a project. So I’ll tell you a little bit about the project, which is experiential in nature. Stony Brook University is attached to Stony Brook University Hospital. So every year, I pick a unit, I meet with the manager, and they give us a clinical problem to solve. So for example, this fall the students and I will be working with the surgical ICU, and the clinical topic is nurse wellbeing. So, as we know, we’re in this post-pandemic world and wellbeing has really moved to the forefront. Things like burnout and compassion fatigue are very prevalent in the healthcare environment and just globally as humans. I think we’re just a little tired of living in this fight or flight for so long. And now we’re trying to come back from this. So this fall semester, the students will be working in teams to find a solution for the surgical ICU for nurse wellbeing. So what we do is we search for articles together, and that’s how they get to their solution. We use a framework: Melnyk’s seven steps for evidence-based practice. So in undergrad nursing, even though it’s called the nursing research course, the students are expected to utilize the research that has been done on a topic to make changes to their practice. Our expectation is not for them to actually conduct research. 
That’s a PhD-level thing, but according to the essentials in baccalaureate nursing, which our accrediting body uses to tell us what curriculum to teach to the students, the expectation is that they know how to read the research, how to critique it, how to appraise it, how to synthesize it, and how to use the research to develop solutions. So from there, they work in teams of eight throughout the semester, they develop their solution, they put it into a six-minute video project, and I choose the top two projects. Those top two projects then move on to the implementation phase. So then the unit will implement and evaluate the solution. And in addition to that, we put in for posters at conferences. For example, last year, we had two posters at ENRS. When I was assigned the research course, I was like, “Oh, boy, how am I going to make this interesting?” …because we know that research content can be a bit dry. So I ran the course for a couple years, and I knew that I had to do something with it. And that’s where I started moving towards this more experiential learning opportunity for the students. And so far, it’s been going really well.

Rebecca: So I heard you say something about teams of eight, and I almost maybe had a heart attack, [LAUGHTER] just thinking about how big that team is, and how to manage that. Can you talk a little bit about some of the structures you have in place to help a group that size, which is relatively large, be successful?

Tina: Sure. So teams of eight… that means I have 20 teams in total. And we’re all reviewing the same articles. So then I know the answers to all the questions. And basically, Google Drive is my answer. Every team has their own folder, within that folder are subfolders, I have them buddy up and be assigned to a certain number of articles. As a team, they have like individual and buddy responsibilities, which is clearly articulated in a contract that they review and fill out at the beginning of the semester. So they have individual responsibilities, buddy responsibilities, and they have team responsibilities. And every single class looks the same. So by the second class, they’re already into the mode. I don’t throw them any curveballs, every class structure is exactly the same, so they know what to expect. And they have appraisal forms to fill out. They have tables to fill out as a team to keep all of their literature organized. And the structure that I have in place seems to be working because there’s very little confusion now that I’ve kind of worked out all of the kinks. And I also always keep instructions projected just to make sure that they are apprised of the flow of class.

John: You mentioned Melnyk’s seven steps of evidence-based practice. Could you give us a brief overview of that framework?

Tina: Absolutely. So there are many evidence-based practice models out there. Stony Brook goes with Melnyk’s, and there are seven steps, and actually I begin with step zero: igniting that spirit of inquiry. And one of my main end goals of the course is for them to stay curious about how they can improve practice as a nurse for their patients. So that’s step zero. And then basically, what we do is we take the clinical problem, and then we frame it in the form of a question, a PICO question. And that helps us to find our articles. So once we find our articles, we go through the articles, we decide what we’re going to keep and what we won’t want to keep, then we start to critically appraise these articles: review them, read them, understand them. The students put that information into a literature review table, which is just the main elements of each article. After we’ve appraised all of the articles, the next step is to synthesize all of the articles. So what is the bigger picture? For that synthesis step, students complete synthesis tables. And when they create these synthesis tables, the beginnings of their proposed solution begin to emerge. So then students put their solution together based on the synthesis table. And then the next step in this process is to implement the solution and then evaluate the solution. And of course, dissemination is always the last step.

John: You also mentioned that you use collaborative testing on exams. I’ve done this with a two-stage exam process where students take the exam individually first and submit that, but then take it again as a group. That’s been tremendously successful, and it appears to be really beneficial in terms of student learning. It’s also just so much more fun to watch the students work in groups on exams than it is to go over the exam the next day with the whole class. That collaborative exam format has been so much better than I ever expected it to be. Could you tell us a little bit about how you do collaborative testing on your exams?

Tina: I absolutely adore collaborative testing. If you have to assess students using exams, this is really maximizing the use of exams. So in my courses, students take collaborative exams in teams of three. And as we know, the research says that collaborative testing may decrease test-taking anxiety. The students have to take a large licensing exam at the end of the program, so it may help some of these students with that. Like you said, there’s immediate feedback on test performance, and it really scales back the number of questions I get afterwards. I don’t even do exam reviews anymore, because with the immediate feedback they get, they’ve reconciled any confusion on the exam, so an exam review is no longer required. It increases student engagement and collaboration. I love how, like you said, they debate, they discuss, that peer instruction. There are some people out there who can read a book and retain 100%. But generally speaking, you’ll have a better chance of retaining more information if you’re teaching someone else versus reading a book. Of course, that just varies learner to learner. So that could be really something to hone in on when it comes to collaborative testing. So, yes, the traditional way is to take the test individually, and then they take it again in a team. And in our program, for the clinical courses like medical-surgical nursing, pediatrics, all of those, I would always recommend doing individual then collaborative, because you really want to assess that individual on their performance and understanding of the concepts. But I teach research, and I teach leadership and management; these are non-clinical courses, so I skip the individual part and take them right to a collaborative exam. So for example, for my research course, the students don’t know who they are paired up with, or in a team with, until about an hour before the exam. They get two articles, a quantitative article and a qualitative one. And then they have a set of questions to answer. 
Essentially, we’ve been preparing for this type of exam throughout the semester. So they end up doing really well. In my transitions to professional practice course, where I’m teaching leadership and management, that is a traditional final exam, with multiple-choice and select-all-that-apply types of questions. And I actually do it on Zoom: they go into breakout rooms, they share their screen, and they take the exam; there’s a scribe who enters the answers. And also, when it comes to accommodations, kind of as a side note, I’ve been able to set up strategies for individuals that do have accommodations so that they can maximize their experience as well.

John: When I first tried this, I was so excited about how the students were reacting with the collaborative exam that I took a short video clip while they were doing it and sent it to Rebecca. She was working with me in the teaching center at the time. It was just a remarkably positive experience.

Tina: Do you notice a difference, I would say an estimate of 10 points, between the individual and the collaborative mean?

John: Generally, yeah. And the group one is virtually always higher than each individual score, except in one case in my class, where one student had a higher score than his group, and that’s because during the group discussions the student gave in to peer pressure within the group. I encouraged him to be more assertive when he’s confident about his answer. But that only happened with one student on one exam.

Tina: That’s pretty rare. I just love watching them engage like that. So I’ll pop into the breakout sessions, and they’re collaborating and negotiating, and it’s just fantastic.

Rebecca: You mentioned earlier about your research class having a project coming up about wellbeing. And I think that’s a topic that we’ve been talking a lot about in higher ed in a lot of situations. Can you talk a little bit more about that project and some of the research that’s going into it and some of the outcomes of it?

Tina: Absolutely. I mean, wellbeing is such a hot topic right now in probably every type of job you could think of. And it’s interesting: wellbeing has kind of always been in the background, and I think the pandemic really shoved it into the forefront, where it really should have been. That really needs to be, in my opinion, the top priority of any workplace, because if your employees are well, it has a positive trickle-down effect. So it has gotten to the point now where our accrediting body, which tells us the essentials that we need to teach to our students, has added a wellness component, and we’re adopting these new essentials in the next year. These are new essentials for us to follow. So it made it into the essentials, which is very telling. And now faculty are charged with teaching students, and monitoring students, about their wellness and wellbeing. So this was pretty timely: because of the pandemic, the clinical topics that we’ve been doing for this EBP project have been things like compassion fatigue and burnout. And now this year, we’re doing wellbeing. Last fall, we worked with the cardiothoracic ICU, and the EBP project topic was compassion fatigue. And we wove a lot of wellbeing into the solution, which is actually kicking off on July 1. So this year, instead of doing compassion fatigue, which has a bit of a negative connotation, let’s flip it to the positive. And like I said, we’re working with the surgical ICU, and we want to customize a wellness solution for that unit. So in the meantime, by proxy, I can teach the students about their own wellbeing and their own wellness. So I have a lot of content in there, so that they’re learning about this clinical topic to help develop a solution, but they’re also learning about it for themselves. And I do a few things with them, and I’m definitely evolving this as we move along. And I’m lucky enough that I have the graduating class in the fall and the spring, so I move it through from the fall to the spring semester. 
So in addition to educating them on the different ways to promote your own wellness, we start each class with a mindfulness activity. I have a sound bowl that a student actually gifted to me; we do meditation, mindful breathing, every class is something different. This year, I’m inviting students to lead some of these sessions. So I want it to grow so that other students can participate and lead us, and it’s literally three to five minutes at the beginning of every class: all lights down, devices off, phones flipped down, and we just take the time to be as present as possible. And I also help them keep an eye on their level of burnout, and I give them the professional quality of life survey at the start of every class. And halfway through, I’ll do a comparison of statistics between the different cohorts, because I have the traditional cohort and I have the accelerated one; we look to see how our scores are doing over time, just to have that educational component to it. And then there’s also the Insight Timer app. If that’s an app that you don’t have, I would highly recommend that you download it. It has so many mindfulness types of activities that you can do. There’s a journal, you can track your progress. They have classes, and the paid version, which is, I think, maybe $60 for the year, offers so many different bells and whistles. It’s really just a phenomenal app to use if you’re looking to promote your own wellness. So the other thing I wanted to mention, since I attended that CIT conference, is I would love to use ChatGPT to develop a wellness assignment. So I’m still thinking about the inner machinations of how that would work. But hey, you know, if AI is here, might as well see if we can use it to promote wellbeing.

John: And it’s nice to have that focus of using ChatGPT positively, because this is something that’s going to be part of students’ lives going forward. Maybe not this specific tool, but AI tools are not going to disappear, and using them for good would be a nice alternative to the concerns that many faculty have about the use of these tools. During your presentation at the CIT conference, you also mentioned using a variety of edtech tools. What are some of the tools that you use in your classes?

Tina: Sure. So I’ve trialed some apps here and there. I’ve used Plotagon. I used GoAnimate for Schools, which is now VYOND, just for them to create case scenarios in their leadership and management class. And based on feedback, the one that they really liked is now a bit pricey. So I tried a free version of an app, and it really didn’t go well, based on feedback. And that’s how it works in education: you try something out and you survey the students, and if the experience over time is really not positive, you need to move on to something else. But things that really have stuck are Kahoot! and Articulate 360, which I told you about. As for how I communicate with the students, I use GroupMe. I prefer to communicate with them using that application over Brightspace or traditional email. They join via QR code, and I have them all in one group chat and I can post quickly. They can send me direct messages, they can post questions in our group chat. And it just seems to really streamline communication, because we’re all competing for their cerebral real estate; they have a lot going on, a lot of deadlines, so I find that this GroupMe app is really helpful. And I also try not to spam them with too many messages, though. It really seems to work. And then again, Google Drive: I can’t even begin to list what I use Google Drive for. Whether I want to survey them or whatever it is, Google Drive has it for us.

Rebecca: So speaking of Google, [LAUGHTER] you mentioned earlier using a Google form for a mid-semester evaluation. Can you talk a little bit more about that, and how you’ve used that to make adjustments in your class for the latter half of the semester?

Tina: Sure. So a Google Form is a pretty nice way to just give a quick survey to your students; I do that in the middle of the semester. And I have to tell you, that’s where I get my best data, because they are in the throes of it. And my response rate is typically over 90%, as compared to at the end, where they’re kind of just fizzling out, tired, maybe a bit over it, generally speaking. So I don’t get the response rate in the final survey that I do in the mid-semester one. When I analyze it, it’s very short: a couple of Likert questions, plus “What do you like? What don’t you like?” And if there’s enough of a theme in the qualitative questions, or in the Likert scales, I’m able to make changes prior to them departing from me, instead of waiting for the next cohort to come in. For example, some things that came up were: “It can be a bit loud in the classroom.” So I’ve done something to control the volume in there, because it’s a very active classroom. Or “we feel like we’re sitting around too long during the TBL activities.” So now I have a mechanism for them to let me know when they’re done with their activities, so that they’re not sitting around waiting. So those types of things. If they say, “let’s skip the final exam,” then that’s not anything that I can honor. But I’ve gotten some really good raw feedback that’s helped me evolve my classes. I’m just always so grateful for the student experience, because they inform me where this needs to go. Another way that I use a Google Form is with team-based learning. Michaelsen says that you should have the team members evaluate each other on their team performance. And typically, this is done at the end. But I like to do it in the middle of the semester, where they’re evaluating each other so that they have an opportunity to remediate, and then by the end, hopefully, their team’s performance scores have gone up. 
The challenge, though, with a Google Form is that it’s very hard for me to share the feedback back with the students; it requires a lot of copying and pasting. And there’s a lot of room there for human error. So currently, I do bring in the students that are rated poorly, just to give them some one-on-one guidance on how to improve their team performance. But in the meantime, to work around that, I did trial a product called Kritik that offers that ability, where the students will get their feedback back. But I reached out to our Center for Excellence in Learning and Teaching, and right now we have a sandbox, and we’re working on trying to do a Kritik-like type of peer evaluation in Brightspace, using PeerMark. And we’re getting very close to ironing out some of the finer details. So I’m going to finally have an evaluation where every student can see their feedback from their team members based on their performance, so they know what they’re doing well and where they need to improve.
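The copy-and-paste bottleneck Tina describes, routing each form response back to the student it evaluates, is also the kind of step a short script could automate while the Brightspace/PeerMark setup is being finished. This is a generic sketch, not Kritik or PeerMark: the response keys (`evaluatee`, `rating`, `comment`) are hypothetical column names for a form export, and evaluator names are deliberately dropped so the feedback each student receives stays anonymous:

```python
from collections import defaultdict

def feedback_by_student(responses):
    """Group peer-evaluation form responses by the student being evaluated,
    returning each student's mean rating and anonymized comments."""
    grouped = defaultdict(lambda: {"ratings": [], "comments": []})
    for response in responses:
        entry = grouped[response["evaluatee"]]  # evaluator's name is ignored
        entry["ratings"].append(response["rating"])
        entry["comments"].append(response["comment"])
    return {
        student: {
            "mean_rating": sum(data["ratings"]) / len(data["ratings"]),
            "comments": data["comments"],
        }
        for student, data in grouped.items()
    }
```

Each student’s bundle could then be emailed out or pasted into an individual feedback field in one pass, replacing the manual copying between spreadsheet rows.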

John: You teach both face to face and also online. Do you use many of the same techniques in your online classes that you use in your face-to-face classes? How do you modify your class for online delivery?

Tina: So I do everything the same, except that it’s in an asynchronous format. So students really have to be self-disciplined in an asynchronous online type of environment. The online classes that I teach are post-licensure undergrads, so they have their two-year Registered Nurse license, and they’re looking to get their four-year degree. So some of the assignments we tailor a little bit differently, just because they have nursing experience, whereas my pre-licensure students do not. So maybe the assignments vary a little bit, but the structure is the same, using Articulate. I don’t use Kahoot! with them, only because I don’t have them in front of me, but they do have the quiz. And they have the TBL activities and things of that nature. So it’s the same, but it’s just in an asynchronous format.

Rebecca: I know that we mentioned in the intro that you do some research on some of your teaching practices. Can you tell us a little bit about some of that work?

Tina: Sure. So a colleague and I got IRB approval, and we’re just starting to do some research on this evidence-based practice project that the students do in my class. And we’re just starting off with a cross-sectional study. We have a valid tool that’s been out in the literature that measures their perceived knowledge, skills, and attitudes regarding evidence-based practice. So, I’m not building logistic regression models or anything yet, but starting off with a cross-sectional study to understand, pre and post, at the beginning and at the end of their research class, if there’s any impact or change in their knowledge, skills, and attitudes regarding evidence-based practice. So that’s where I’m starting. And I’d like to move on from there eventually.

John: And speaking of moving on, our last question is: what’s next?

Tina: So, I just would like to continue publishing and presenting, and continuing my research. Like I mentioned earlier, I’d like to introduce an AI tool for wellbeing. And Stony Brook just purchased several VR headsets, and because my courses include a lot of content about compassion, wellness, and wellbeing, I would love to develop a simulation about empathy. I think that would be a fantastic use of VR, apart from, like, typical clinical scenarios. And that’s really my plan for now.

John: Well, thank you for joining us. And when you do have some results from your research, we’d love to have you come back and talk about it.

Tina: Thank you. Definitely. I really appreciate you inviting me. This is a wonderful opportunity for me. Thank you.

Rebecca: Yeah, thank you for letting us use your class as a little case study for folks to think about ways that they could change, improve, and reconsider their own classes. Thank you.


John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.