314. Handbook of Online Higher Ed

Since its start in the late 1990s, asynchronous online instruction has spread throughout the world and has been the subject of extensive experimentation and study. In this episode, Safary Wa-Mbaleka, Kelvin Thompson, and Leni Casimiro join us to discuss their new handbook that examines effective practices in online learning from a global perspective.

Safary is an Associate Professor of Leadership in Higher Education at Bethel University in St. Paul, Minnesota. He has authored and co-authored more than 40 scholarly journal articles and more than 20 books and book chapters. Kelvin is the Vice Provost for Online Strategy and Teaching Innovation at the University of Louisville. Kelvin developed the BlendKit Course open courseware as part of the Blended Learning Toolkit, and he co-hosts TOPcast: The Teaching Online Podcast. Leni is a Professor of Education, the Associate Dean of the AIIAS Graduate School, Chair of its Education Department, and the Director of AIIAS Online, the virtual campus of the Adventist International Institute of Advanced Studies (AIIAS) in the Philippines. Safary, Kelvin, and Leni are frequent invited speakers on topics related to online instruction. They are the co-editors of The Sage Handbook of Online Higher Education.

Show Notes

Transcript

John: Since its start in the late 1990s, asynchronous online instruction has spread throughout the world and has been the subject of extensive experimentation and study. In this episode, we discuss a new handbook that examines effective practices in online learning from a global perspective.

[MUSIC]

John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.

[MUSIC]

John: Our guests today are Safary Wa-Mbaleka, Kelvin Thompson, and Leni Casimiro. Safary is an Associate Professor of Leadership in Higher Education at Bethel University in St. Paul, Minnesota. He has authored and co-authored more than 40 scholarly journal articles and more than 20 books and book chapters. Kelvin is the Vice Provost for Online Strategy and Teaching Innovation at the University of Louisville. Kelvin developed the BlendKit Course open courseware as part of the Blended Learning Toolkit, and he co-hosts TOPcast: The Teaching Online Podcast. Leni is a Professor of Education, the Associate Dean of the AIIAS Graduate School, Chair of its Education Department, and the Director of AIIAS Online, the virtual campus of the Adventist International Institute of Advanced Studies (AIIAS) in the Philippines. Safary, Kelvin, and Leni are frequent invited speakers on topics related to online instruction. They are the co-editors of The Sage Handbook of Online Higher Education, which we’ll be talking about today. Welcome Safary and Leni, and welcome back, Kelvin.

Safary: Thank you.

Leni: Thank you.

Kelvin: Good to be here.

Safary: A pleasure to be here.

Rebecca: Today’s teas are? Safary, are you drinking tea?

Safary: I’m having water this morning.

Rebecca: A key ingredient to tea, I might add. [LAUGHTER]

Rebecca: How about you, Leni?

Leni: I use green tea, particularly this Japanese matcha. [LAUGHTER]

Rebecca: Nice. How about you, Kelvin?

Kelvin: I have deconstructed tea. That’s also called water.

Rebecca: [LAUGHTER] Popular globally.

John: And speaking of globally, Rebecca and I are both drinking Moon Bird tea, which is a gift from one of our listeners in France who sent this to us a few weeks ago. So again…

Safary: Wow.

John: …thank you, Myriam.

Rebecca: Yeah, it has a nice hint of pear and elderflower.

John: …which is also a green tea.

Rebecca: Yeah.

John: We’ve invited you here today to discuss the SAGE Handbook of Online Higher Education. Could you tell us a bit about the origin of this book project?

Safary: The origin of this project is actually something that has to do with me having worked with Kelvin several years ago at the University of Central Florida. And right after that, I decided to work in the Philippines, and that’s where I met Leni Casimiro and we worked together. And at both institutions, we were working with online education. And eventually I was transferred to work for two years in Kenya. During the COVID-19 pandemic, I happened to be in Kenya, and I quickly saw the great need of people wanting to have online education. The demand went up, but the resources were scattered all over the place. And immediately the idea came that we needed a project that captured the whole world, because now this was a worldwide phenomenon; it was no longer something peculiar to Kenya or the Philippines or the U.S., the whole world was in need of a tool like this. And that’s how I reached out to Kelvin and to Leni. Thankfully, they both agreed to be part of the project. And I think, from my perspective, that’s where it came from. I don’t know about them… how they think about this? [LAUGHTER]

Leni: Well, for me, it’s really a big project that we did, combining the different parts of the world. You see, Kelvin comes from and represents the West, while I represent the opposite, the East. And although Safary comes from the East as well, he can represent the African continent. And so this really makes the book a global project, really a blend of different perspectives. And so I can say that online learning is represented all over the world in this particular book. And this is indeed a big surprise to all the readers and a big discovery for everyone.

John: Speaking of readers, what is the intended audience of this book?

Kelvin: Well, I mean, honestly, I would say anyone, anywhere, around the whole planet, who in any way touches online or digital education, should access this book. It’s great for libraries and institutions to acquire and be in their communities. It’s a big book. There’s stuff in there for everybody. So I think it’s a great resource.

Rebecca: Speaking of the size of the book, the handbook contains 50 chapters. Can you talk about how you selected those chapters?

Kelvin: I think the scope and the sequence and the layout of the chapters and the sections trace back originally to Safary’s proposal with the publisher, but it was intended to be rather comprehensive, with sections like fundamentals and student support and administration and instructional design, instructional delivery, and regional specifics: particular regions around the world, and how online education might differ a little bit in, say, the African context versus the European context. But over time, as we were recruiting authors, and as the writing process started, you get a little bit of evolution: the sections might morph a little bit, the distinctives of a given chapter might adjust based on the interests and specializations of the authors. So that’s a little bit of insight into the evolution. But I credit Safary for the vision, which I would say is probably about 80-plus percent of what he originally had envisioned in the layout. That’s my guess. Safary, would you agree with that?

Safary: Yeah, the thing is that, when you work on a huge book like this, especially a handbook for Sage, they want to have the complete plan when you submit your proposal. Before I can get my co-editors to agree with me, they need to have some kind of idea: okay, this is what I have in mind. So usually, when I work on a handbook like this, I come up with a rough draft. And Kelvin and Leni were very good at catching certain things that I wouldn’t have caught, because of their expertise, their experience, and the regions that they represent. And so, in the end, the Table of Contents we have here was really the product of these three brains that are speaking today.

Leni: I really liked the way Safary has chosen the chapters of this book. Well, we can say that he really originated the choice of these chapters. As you can see, from the perspective of a reader, when you look at the content, you can look at the sequencing, and you will find that you are actually looking into the step-by-step development, or the step-by-step process, of engaging in online education. I will say it’s almost like a manual: almost every step that you will go through in undertaking online education in your institution is covered in this book. That’s why it’s really a very important book for every school to have.

John: We had some challenges coming up with a brief intro for each of you, because each of you has done so much with online education in many different roles in many different places. But you also have an editorial board for this book, which is a little bit different than many other books that we’ve seen in terms of handbooks. What was the role of the editorial board in putting this handbook together?

Safary: Yes, we had an editorial board. When you have a project of this magnitude, it is really important to have experts from different parts of the world, especially for the global perspective, and of course, experts on the different topics that are represented in the handbook. As much as we have experience with online education, we cannot assume to know it all… there are areas where we definitely need help. And so we selected very well known, very well recognized experts from different parts of the world. As far as online education is concerned, all the names that are there are people who are very well respected in the field of online education within their respective countries. The role they played was to be our experts in checking the accuracy and the quality and the completeness of the chapters that were submitted to us. So basically, each chapter went to two to three reviewers, and the editorial board members were the primary reviewers to help us really catch everything. And the work they did… I know that some chapters had more feedback than others, but I can say that the contribution they gave through their feedback was very substantive in improving this handbook. I don’t know, Leni, how you found that when you were working with the editorial members who were assigned to you?

Leni: Yeah, actually, the editors we chose, I can say they are truly excellent and helpful. During the early parts of the writing of the chapters, we lead editors were having something like a tug of war with the chapter authors. They tend to bargain their thoughts with us, but when the editorial board came into the picture, it gave a more balanced outlook into writing the chapters. And so we really appreciate their services. The other thing is that these editorial board members are experts in the area, and so we can truly depend on them. Their feedback was truly much valued and contributed much to the excellence of the contents of this book.

Rebecca: So the handbook is divided into seven sections. Can you provide a brief overview of each of those sections to give us the lay of the land?

Leni: Oh yeah, seven sections. It’s nice to give an overview for people to know what the book contains. The first section, of course, is the fundamentals of online education. It contains the introduction to the topic of the book, online learning, and some variations in online delivery, like blended, MOOC, and ERT, emergency remote teaching… we just really call it ERT, and that became popular during the pandemic. The second section, online education around the world: this section is the most colorful part of the book, at least for me, because it tours us around the world and gives us a view of how online education grew in varied contexts like the US, Canada, Europe, Asia, Latin America, Africa, Australia, and the Middle East. The third section, Online Instructional Design: this section now brings us to the T-cell of online learning, the design of online instruction, with a focus on how learning happens online. This is now the more serious part of the book. While we came from the most colorful, we now go to the serious part of the book. And then the fourth one, Online Instructional Delivery: this section focuses on the hammer and nail of online learning, the actual online teaching, and this is the most exciting part. Because this is now the delivery; the previous one was the most serious part, this one is the most exciting part. And then perhaps, Kelvin, can you say something about the fifth section, [LAUGHTER] Instructional Technology for Online Education?

Kelvin: Here’s what I would say about that: if you’ve got the most serious part and the most exciting part that you talked about, maybe the fifth, instructional technology for online education, is the most invisible. Maybe that’s what it is. Nobody thinks about plumbing until it doesn’t work. [LAUGHTER]

Leni: Thank you. So that’s technology. I would say this section is essential, because you cannot teach without knowing how to use technology. [LAUGHTER] And the sixth section, Online Education Administration and Management: I would say this is the driver’s seat of the online bandwagon. [LAUGHTER] Online education can never prosper without the support of the school administration. So, leading school reforms, like entering the field of online education, requires certain strategies to be certain of success. Therefore, I would say this section will indeed equip the readers with those skills. Perhaps Safary can tell us what section seven is?

Safary: I would say the last section is about customer service, given that the students are the customers. So, customer service: how to make sure we deliver the best customer service to the online students. And so it discusses all those different aspects of how to really prepare and plan effective service to the students, because many times, when people are migrating from face-to-face to online or integrating online education, they forget that online students actually need serious support. And this support definitely needs to be defined. And people who are dealing with the students need to be trained. And so the last section actually deals exactly with that.

Leni: For me… because I was looking at the table of contents, and I was smiling in my mind: wow, this is really neatly done. And so these words came to my mind, and I said, oh, the seventh section, this section focuses on the heart of every online classroom, the students. And because the students are the reason why we offer online learning, we ought to know how we should support them.

Rebecca: One of the things that I love about working on collaborative projects that are really big, and then you have these opportunities to reflect together, is how you summarize what you did. It’s probably really different than while you were right in the middle of it. And it’s fun watching the facial expressions and things as you guys are describing the different sections.

John: With 50 chapters, there’s a great deal of breadth and depth on these topics. In section one, though, you address two topics which are not always considered part of traditional online education: the use of MOOCs and ERT, emergency remote teaching. But these have played fairly important roles. Could you talk a little bit about the role of MOOCs and Emergency Remote Teaching in the larger environment of online higher education?

Leni: As I see it, MOOC and ERT are connected to the overall theme of the book, because technically they are both delivered online. Online learning can be synchronous or asynchronous. A MOOC is mostly taken asynchronously, while ERT is done synchronously, because it is generally a replication of the face-to-face classroom through the web. However, there are certain arguments in the field as to whether we can classify these two under online learning, because they are believed to not use the principles of effective online teaching. And they say, is there instructional design in ERT? There are more questions to raise, to the point that some people believe they should not be called online learning. But for me, we have a common denominator: course delivery through the web. Maybe we can hear from my co-editors here, Kelvin and Safary, what they think about it?

Kelvin: I was thinking, John, when you asked that question, that the combination of Emergency Remote Teaching and Massive Open Online Courses is part of the popular conception of what online education is. It’s sort of what a layperson might think: it’s just one big thing. So if you didn’t address Emergency Remote Teaching, Massive Open Online Courses, maybe even blended or hybrid learning, those permutations, it might not provide quite the same way in for the broadest possible audience. But then, once we’ve ushered you into the house through the front door, I hope we do a good job of taking you on a more detailed guided tour through the nuances and everything that online education can be, without just being stuck at that surface level.

Safary: If I may add something on ERT. Personally, the reason why I wanted to see this chapter there was that outside of the United States, and maybe Canada and a little bit of Latin America, when ERT came, Emergency Remote Teaching, many people called it online education. And as we know, online education, the way we know it traditionally, is much more than translating your face-to-face class to a Zoom class or Google Meet class. And let’s face it, the word there is emergency. This was an emergency modality, and obviously an emergency is never the best option; it just means better than the chaos that you’re going through. And so many people who didn’t know online education came to believe that Emergency Remote Teaching means online education. And many people who were against online education to start with said, “Okay, we have already said that this thing is really bad,” because it was an emergency. So it was very important to distinguish what Emergency Remote Teaching is. And in the future, if somebody wants to use that for another calamity that happens, then they know what steps to take, but it does not replace what is known, what we define as quality online education.

Rebecca: One of the parts of your book, the second section, is about online education around the world. And getting that tour around the world is not something we typically get the opportunity to have. So can you talk a little bit about some of the global differences in how online higher education is structured and practiced across continents and regions?

Safary: This section came up as we were trying to make the book global. We really wanted to hear the voices of the people from around the world and not just the United States… the United States being the lead on online education, no question about that. We wanted to know where things are in the different regions that were represented. We had to even go online to try to track people down from different countries. It was not easy finding people from certain regions where we didn’t have a network. But as a result, we were able to bring on board chapters from different parts of the world. We had a chapter from the United States, we had a chapter from Europe, from Canada, from Asia, from Latin America, from Africa, from Australia, and from the Middle East. So we were able to see what was happening in each one of them. And these chapters were kind of similar, in a way, where we wanted to know what is happening, what are the challenges, what are the achievements that people have in those regions, so that people from those regions who decide to do more work on online education have a place where they can learn what is happening in the whole region from this book. They can have this as a reference to understand what was happening in their region. It is true that when you have one chapter… for example, I co-authored the chapter on Africa, because I was still in Africa at that time. It’s a chapter that’s covering 52 countries; you cannot really cover 52 countries, so we just had to have illustrations from some of the African countries, because there’s no way we have data on all 52 countries. But at least there were some common themes that were coming up from the different African countries, if I can speak from that specific region.

Leni: I can speak from the perspective of an Asian, because I come from Asia. And I would say, we cannot deny that online education started in the West. But because we live in a connected world, it spread easily. Basically, I can see a lot of similarities around the world. The only differences I noticed… because your question asks what are some of the global differences in how online education is structured and practiced… now, I would say the only differences I noticed are in the approaches to online learning, depending on the level of maturity in using this modality and the resonance of the context they serve. Institutions that have been engaged in online learning for a long time definitely deal with issues that are different from those of newcomers; the needs of the contexts they serve also differ, so the strategies utilized also differ. One thing I would highlight, though, is that you can clearly see the creativity and continuity of people in different parts of the world in running online education. And we still can learn from each other. That’s why I said a while ago that the section on global online education is really colorful.

Rebecca: One of the things that I think is really interesting about that section is that it can also give us insight, as instructors who teach a global audience, into the contexts that students might be coming from. And that’s insight we often don’t have.

Safary: I think that is a very good point. Now that we have online education, people are teaching in many different countries. I remember, just a couple of weeks ago, I was approached by one of my former students who wanted me to teach a class in the Caribbean. If things had worked out for me for that class, I would have just glanced at the chapter that covers a little bit of the Caribbean to see what I need to watch out for. So that is definitely a good point for the section on the different regions in this handbook.

John: When online education first started, there wasn’t really that much known about what would work effectively. And as online education evolved, we saw the role of instructional design become an important part of the practice of online education. And section three deals with online instructional design. And that’s helped facilitate and inform online education, along with a lot of research that’s been done since the early stages. How have instructional design practices evolved since the early stages of online education in the latter part of last century?

Kelvin: That’s a good question. And I guess I’ve been in this field watching this firsthand and touching it for about 25 years now. So I sometimes say I was not exactly on the first floor of the building, but just one step above. And what I would say is that when I started in the late 90s, what we saw a lot was adaptation of traditional instructional systems design models and practices, that is, constructs that were used quite often in corporate education. See if this takes you back to the past: CD-ROM development, military learners. Those kinds of methods, practices, and models were adapted to this online context. And some of that was constrained: you were making a bounded system, which was quite often the context, like a CD-ROM. And now you’re talking about the internet, a networked open system. And I remember some of those early days, like, “Okay, what can we learn from these models? How can we adapt those?” Over time, though, we learned that this is a unique context, which then began to have its own models and practices and processes and research and iterations and development. And I think of even much newer developments: alongside constructs like inclusive pedagogy, we see practices and thrusts like inclusive design as a very specialized subset. So we’ve got a very robust research and professional practice literature that has grown up, and these, arguably, two and a half decades of online education experience to draw upon. And I guess I’ll just say this about that: throughout my time in this field, what I’ve seen is that online tends to make the formerly invisible, visible; the formerly implicit, explicit. And I think the instructional design and development field has learned from that. Online education has drawn us along in what it means to bring learners in from really anywhere and bring them together in a learning community, and how we excel in that. That’s been a really rich progression over these last two and a half decades.

Safary: If I may add to that, the reason why we had this section was that many people who are new to online education think that online education is about uploading all the files that you have been using face to face and then letting the students read them, and that’s online education. It leads to a lot of frustration from the students, because there was no instructional design for online learning. And so we needed to have a section that would guide people in that. It is also for instructional designers, in colleges and universities that already have instructional designers. Some of them have not gotten a degree in instructional design, so they have limited knowledge; they just happen to know a little bit more than everybody else, but they don’t really have a solid foundation. And so that section helps to kind of guide people in the proper instructional design for online learning.

Rebecca: So sections four and five focus on online instructional delivery and instructional technology. These are topics that we love to talk about and have had episodes of this podcast on. But given the time constraints, we probably can’t dig in fully here. Can you help us identify some of the most important changes that have occurred in how well-designed online courses are taught?

Leni: That’s a nice question. Kelvin also said a while ago… he was mentioning the early years of online instructional design. I would say perhaps 1998 to 2000; those are the early years I’ve been involved, still in the planning stages of online delivery. Most of the online courses we developed were primarily text based and were delivered asynchronously. That was after the military use Kelvin mentioned; online learning was already in the university. Why text based? Because even our students, in the context we were serving, did not have the capability or the capacity to access videos or higher-level technology tools. That’s why we designed courses in a way that they could access. And so, yes, it was primarily text based and asynchronous. However, through the years, I would say two forces caused the major changes in the way we design online courses: first, technological developments, particularly in instructional technology; and second, changes in the needs or nature of our stakeholders, the students. Well, technological developments without a doubt have increased the repertoire of instructional media that we can use in designing truly engaging online courses. But as I’ve said earlier, technology is not the heart of online learning… it’s our students. And we saw how the nature of our online students changed over time as well. While many of them were happy with plain text-based asynchronous online courses during the early days, now they want more real-time meetings. And the flexibility they want is indeed tremendous, I tell you. We notice that there is a greater demand now for more flexible and personalized learning approaches. And these topics are dealt with in this handbook. I know Kelvin has written on this, and some other chapters also address these flexible learning and personalized learning approaches. These are now the needs and demands of the new generation of online students.

John: This is bringing me back to when I started teaching online back in 1997, when many of the students had 300 baud… [LAUGHTER] …or 1200 baud modems, and you couldn’t do much more than text. And I remember putting in some Flash-based videos, and many students couldn’t access those because they didn’t have the download speed, especially students in more rural areas. So there was a lot of resistance to online education when it was first introduced, which is one of the reasons why I think instructional design practices became a part of early online education, to help ensure its quality. And we do have, in most institutions, a fairly elaborate process of instructional design assistance and instructional design review for online courses, which is something that’s never really happened in the same way for most face-to-face courses. Might it be time to start applying some of the design techniques and practices being used for online course delivery to in-person course delivery?

Safary: I remember, about 15 years ago, I was training faculty on online teaching in the Caribbean. And I remember many of them, at the end of the training, saying, “I have improved my face-to-face teaching because of the training that I have been going through for online teaching.” So I definitely believe that if people get the proper training in online teaching, they can use that knowledge to improve face-to-face teaching. Because, let’s face it, many people are teaching not because they have a degree in education, but because they have a degree in whatever field they come from; they have never learned how to teach. And so when they go through the training for online teaching, they discover a lot of principles that they should have been using face to face as well. So I definitely agree with you on that one.

Kelvin: Yeah, it’s true. I say it all the time: online makes the formerly implicit, explicit; the formerly invisible, visible. And I think that’s why online has been a vehicle for applying thoughtful design and teaching practices and the improvement thereof. Once you sort of concretize the elements that make up an online education experience, then you can see, well, how are they arrayed? Are they lined up properly? Does this cause lead to the desired effect? And you can work on improvement. No offense to anyone in this, but when we just deal in the ephemeral, we walk into a space, four walls and a door, and we say words into the air, and it’s much harder to see how those parts fit together or don’t. And it’s harder to be reflective. So I think that’s the reason that online education has brought more emphasis to potential improvements, continuous improvements, and so I welcome it as a vehicle for a more thoughtful process in general. I love an elegant turn of phrase from Caroline Boswell: she frames teaching as a student success intervention. Or, as I put it, I’m one of those odd people who sees a connection between teaching and learning. And not everybody does.

Rebecca: You’re kind of cueing up our next question perfectly, Kelvin. The final section of your book is really about student support. And our students are often distributed when we’re teaching online. So what are some of the biggest challenges in terms of supporting students in these online programs or online courses?

Kelvin: Yeah, I would welcome Leni’s and Safary’s viewpoints on this as well. But to me… I’ll keep it simple and say that the biggest challenge is the diversity of student profiles. The different backgrounds, the multifaceted demographics, and being resourced or not resourced, having a technological connection or not having a technological connection… that diversity makes it awfully hard to assure a kind of equitable experience for everyone. So that’s the gap that emerges, which student support is trying to address… not to mention the diversity of approaches to design and development in the actual experience. But I’m curious what Leni and Safary would say to that.

Leni: I would go for the opposite, on the side of the teachers. I would say the greatest challenge in student support is personalizing your support. It’s related to your point about diversity. Almost every online student has their own unique needs and contexts. So by considering different personalities and backgrounds as well, you may be able to personalize your support. But in the name of efficiency, you may find yourself dehumanizing the process. What do I mean by this? Well, machines can never replace human touch. And human touch is what every online student needs.

Safary: If I may speak a little bit from the experience I had in Kenya during the COVID-19 pandemic, we migrated our classes to online delivery. And I quickly realized that… and this was something that was going on in all of Africa, I know this because I was involved in different international associations for online education all over the continent… and so we were meeting and discussing some of these issues. The major challenges at that time, I don’t know about today, had to do with infrastructure, because most universities didn’t have online education platforms or online education structural systems, so the technology was not in place. Many students did not have access because the internet was extremely slow; some were using loads of data to access the materials and would run out. Some had issues with electricity. These are things we take for granted in the West. These are not issues that we would discuss even in textbooks on online education, but they are real issues that cannot be ignored. And so that was a major challenge in supporting online students, because the infrastructure was not in place. And I think the issue is still the same. But more and more work is being done. I remember, for example, in Kenya, what the government did: they gave free data access to all the faculty in the whole country, as long as it was used only for instructional purposes. [LAUGHTER] If you wanted to use it for something else, it wouldn’t work. I mean, that was quite creative, to try to help education move forward, because everything was just stuck because of COVID-19.

John: Over the past year, we’ve seen fairly explosive growth in the use of generative AI large language models, including ChatGPT, Claude, and a few others that have come out very recently. And that opens up a lot of interesting opportunities, but also some challenges for online education, particularly concerning the assessment of asynchronous learning. How do you see online education adapting in response to the widespread availability of tools like this, which will only become more powerful over the next few years?

Kelvin: It’s sort of the very definition and epitome of disruptive innovation or disruptive technology. And just to be clear about this, I don’t think it’s limited or focused on asynchronous online education; I think it’s everything. For me, it’s really an opportunity to address learning and the assessment of learning much more meaningfully, and, I’ll use one of Leni’s words, in a more personalized and relational way. I think one of the things we’re seeing with the injection of these various forms of artificial intelligence into the learning setting is the value proposition of the human. And I think it was Cathy Davidson, years ago, of HASTAC, who said something like, “If we faculty can be replaced by a computer, we should be.” That is, if all you’re offering is something that is easily rendered more efficient and scalable by a machine, then, well, what are you doing it for? I think that the opportunity to really gauge learning, which is a very personal and a meaningful thing… we act like it’s something that’s kind of homogenized and industrialized, but learning… I don’t know what learning is, frankly. I can’t crack open a human and see all that is happening with the connections and the making of meaning in all the background experiences. All I can do is get insight; in dialogue, in the creation of artifacts, I get a glimpse. If we’re product oriented, to the exclusion of the process, and to the exclusion of the human context, well, that can certainly be disrupted, maybe stolen, by artificial intelligence and machines. But if we keep the emphasis on humans, on “Well, John, tell me about this…” that’s more meaningful. I learned a practice a long time ago from a faculty member I studied under, who adopted the practice of a learning summary. And in any course, again, that’s just one artifact, but it gives a glimpse into the articulation of what learning is really about. So I think we need to push the envelope on “What does authentic assessment mean?
What does meaningful learning look like?” Now, that’s hard to do at scale. Are you going to have a personal oral defense with every student for every assignment? Probably not. But if we see artifacts and products as breadcrumb trails leading to a destination of a more substantive dialogical process, well, then maybe that’s something. So I don’t think we know yet how this is going to play out. And I think your listeners are gonna find cold comfort from me in getting to an easy solution. But I think the future of responding to generative AI is to lean more into the human and the relational than less.

Rebecca: So we always end by asking what’s next?

Safary: Well, as far as this project is concerned, what is next, really, is that we want to continue building a community of online higher education scholars and practitioners, so that the momentum that has been created by this book can continue, because this is one of the few, maybe rare, books that really has so much global contribution to online education. Many of the books that are written are usually regional, specific to one region of the world. And so this is the first time we have a network of, I think, around 100 people who contributed to this, coming from many different countries. And I feel this has created synergy in the discussion of online education in a way that we should not let go. So one of the things that we have been talking about is the possibility of holding a summit on online higher education in the next few months, once everybody has gotten a chance to hold a copy of this book, to bring different experts together from different parts of the world and try to address online education from different parts of the world, while addressing common issues such as assessment, which is one of the major controversial issues anywhere; everybody talks about the challenges of online assessment. So things like this, and probably this artificial intelligence, which is a new thing, we may want to go deeper into that… we were not able to dig too deep into it. Although we addressed it in the book, we didn’t go too deeply, because it was still kind of new; ChatGPT was just coming out when we were finishing the handbook. And so that is one of the things that we are looking into. There is also another handbook in the making with SAGE that will focus specifically on instructional design in higher education. So that would be like an extension of this project. We want to continue building on this work, because we consider it very important.

Leni: I’m really optimistic about the next steps on this, because it’s like a seminal book that really has a global perspective. As Safary says, it’s not the same as the other online learning books. So we can also see a lot of developments coming up. And so I will say this book is just step one; the next steps will definitely be coming, because the field is always growing. We have seen its growth, and it will still grow. And so there’s more to follow, I believe.

Rebecca: Well, thank you all for joining us. I know that our listeners will really enjoy the handbook and all that it has to offer.

John: Well, thank you, and it’s great talking to all of you and we’re looking forward to reading the book.

Safary: Thank you so much for the opportunity. Really appreciate that and wish everybody a wonderful reading experience.

Kelvin: Thanks for having us, Rebecca and John.

Leni: Thank you very much.

[MUSIC]

John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.

[MUSIC]

313. Supporting Neurodiverse Students and Faculty

Many discussions of inclusive teaching practices ignore the role of neurodiversity in higher ed. In this episode, Liz Norell joins us to discuss strategies that faculty and institutions can use to create a welcoming environment for neurodivergent students and faculty. Liz is a political scientist and the Associate Director of Instructional Support at the University of Mississippi’s Center for Excellence in Teaching and Learning.

Show Notes

Transcript

John: Many discussions of inclusive teaching practices ignore the role of neurodiversity in higher ed. In this episode, we discuss strategies that faculty and institutions can use to create a welcoming environment for neurodivergent students and faculty.

[MUSIC]

John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.

[MUSIC]

John: Our guest today is Liz Norell. Liz is a political scientist and the Associate Director of Instructional Support at the University of Mississippi’s Center for Excellence in Teaching and Learning. Welcome back, Liz.

Liz: Thank you. I’m happy to be here.

Rebecca: Today’s teas are… Liz, are you drinking tea?

Liz: I am not. I am drinking some vitamin water, tropical mango flavor.

Rebecca: So there’s some stuff mixed with water. That’s tea, right?

Liz: Yes, sure.

John: And I am drinking Prince of Wales tea today.

Rebecca: Oh, I like that one, John. Haven’t had that in a while. And I have Awake today. It’s Monday… [LAUGHTER] when we’re recording.

Liz: That feels fair.

John: But it’s getting really boring. [LAUGHTER]

Rebecca: I know. [LAUGHTER]

John: We’ve invited you here today to discuss issues related to neurodiversity in higher education. Before we start, though, could you define neurodiversity? And how is this different from neurodivergence?

Liz: Sure. So I think these two terms get kind of conflated with one another a lot. And so I tried to be really explicit in talking about neurodiversity versus neurodivergence. And there are a lot of different perspectives on these two terms, and some of the baggage that they carry with them. But when I think about neurodiversity, I just think about a diversity of brains. And so any group that has more than one person is going to be neurodiverse. We all have different brains. But neurodivergence is a brain that works differently than how we typically think of brains working. And there are lots of diagnoses that get put under the umbrella of neurodivergent. Some of the most common ones are autism, ADHD, there’s Tourette’s. There’s lots of other ones, dyslexia, dyspraxia, OCD sometimes gets lumped in there, bipolar disorder will get lumped in there. So neurodivergence is just a brain that works differently than the way we typically think of brains working.

Rebecca: If we’re thinking about college student populations, and I know that this is partially a guess, because we don’t actually know, but how many do we think are neurodivergent?

Liz: The estimates that I’ve seen have been around like one in six, maybe, but I think that it’s probably closer to 30 or 35%, honestly. I think a third is probably a reasonably good estimate. It’s a large number. And I will say that a lot of those students either may not know that they have some sort of neurodivergence, or they may not ever tell us that they have a neurodivergence, and we’ll talk more about that a little bit later, I think.

John: What proportion of neurodivergent students have accommodations through campus disability services?

Liz: Who knows? If we don’t know the denominator, it’s hard to know what proportion of people would be registering for accommodations. I think there are certainly a lot more students who are registering with disability services with an official diagnosis, but there are some barriers to that. And the first one is that it’s really expensive and time consuming to get a diagnosis. So I should say, and I should start pretty early in our conversation by saying, that I recently went through the diagnosis process to get a diagnosis of autism at age 45. And it took me a year from reaching out until I had the diagnosis. And I was able to navigate that because I have some experience interacting with medical teams. I had good insurance, but it still took me a year to get an appointment and to get the diagnosis. And so there are a lot of students who may not have the tools or the time or the resources to go through that, even if they suspect it. And I went 44 years of my life without even suspecting that I might be neurodivergent. I think there are a lot of barriers to that. And then once you have the diagnosis, it can be very intimidating to disclose that, and going through the campus accommodation process takes so much time and advocacy. And that comes from a population that’s already taxed in terms of their bandwidth and their resources and just their ability to get these things done. So I don’t think that the number of people who seek accommodations is even close to representing the population of students who might have these conditions.

John: It would seem that there’s a bit of an equity issue here, in that students from wealthier households, students from continuing-generation households, are much more likely to have the resources to go through the process of having their need for accommodations documented.

Liz: That’s right. And I think this gets into some of the language around neurodivergent versus neurotypical. A lot of people who are neurodivergent, who have some sort of condition or way of thinking or way of operating, have been socialized to think that there’s something wrong with them, this kind of medical model of disability, instead of the more social model of “this is a socially constructed difference.” And so to seek out a diagnosis requires a kind of self-containment, I think, to recognize that this is not something wrong with me. A lot of people who have a neurodivergent brain probably feel like they should be able to act like a neurotypical person. So they don’t want accommodations, because they feel like that is somehow making them less than their neurotypical peers. And it’s this medical model that has infused so much of our talk about disability, and it’s especially pernicious here, where we know that there are real struggles that students with these neurodivergent brains have, ones we are just not accommodating well in the classroom.

Rebecca: I do want to mention at this juncture that we do have an Episode 221 – Disability in Higher Education with Kat Macfarlane that really talks in detail about the accommodation process. And so that’s a really great place to learn more about that process; it was the subject of most of that episode. Most of our college faculty generally haven’t been trained to address issues of neurodiversity. Can you talk about some of the common challenges our neurodivergent students face in classrooms?

Liz: Yes. And it’s absolutely the case that we have not been trained in this. And I think also many of my faculty colleagues, past and present, have this idea that an accommodation is somehow like special treatment that’s making a class easier for students, when we should, John, as you mentioned, be thinking about this as an equity issue. So accommodations are meant to provide equal opportunity for success. And if you’re bringing some of these conditions into the classroom, you’re already operating at a deficit. So what are those? Well, it can be things like being really easily overloaded by sensory information. We see this a lot with autism and ADHD, where, as someone who now understands herself to be autistic, I think about this phrase, “the lights are too loud”; it just feels very harsh. And when people are talking over each other, I get very flooded very quickly. This has been the case my whole life. If there are unfamiliar foods or drinks, that can be really overloading, and so background noise, people who are close to each other, uncomfortable seating, these are all things that can show up in our classrooms that can cause someone with a neurodivergent brain to go into a kind of overload. And that, of course, reduces their ability to pay attention and to learn and to retain information. Unclear communication is a huge challenge for people with neurodivergent brains, because it’s often the case that there’s some sort of inability to recognize sarcasm, or to pick up on some nonverbal communication. Oftentimes, people with neurodivergent brains will interpret everything very literally. And so they miss out on some of the nuance. And for me, it’s been this, like, obsession with choosing the just-right word, because I need it to be precise. And I can get really fixated on that sometimes, in a way that feels very pedantic. But that is really just me very much trying to communicate clearly.
When there’s unclear terminology, like “write professionally,” or “be collegial,” or “work well with others,” I don’t know what any of that means. I have no idea. And there’s an assumption that there are some shared social norms that may not be as visible to people with a neurodivergent brain. There are, of course, a lot of well-documented social aspects to neurodivergence. So just not really knowing how to work with others in an effective way, or that sense of “I’m different, I’m broken, I’m not as good as…” that I mentioned earlier, can carry over into social dynamics. And then the last one that I think is really important for us to think about in terms of higher ed is executive function. So executive function is that ability to kind of be a taskmaster of your own attention and brain. And so things like prioritizing work, time management, how to take notes, how to make decisions, how to cope with the ups and downs of life, due dates, all of those things like managing systems are really hard when you have a neurodivergent brain. And we often assume that our students have those skills. And so we don’t scaffold them. We don’t help them. We don’t point them to resources, and that can be really hard. So those are just like four big clusters: sensory overload, communication, social interaction, and executive function.

John: How can faculty anticipate or design with neurodivergence in mind, particularly when many students with disabilities choose not to self identify?

Liz: Just being aware of these things that I’ve just mentioned is hugely helpful. And I think the hard work is really just awareness. So for example, I have heard lots of my colleagues, and myself at earlier points in my career, lament about students who are distracted by their cell phones or their laptops, who seem to need to go to the bathroom three times during a 50-minute class, or who otherwise seem to be just kind of disconnected from class. We see that as a sign of disrespect and of not paying attention. But a neurodivergent brain often really struggles to sit still and make visual contact with another person or object. And so it’s often the case that our neurodivergent students can learn better if they are doing something; the more physical, the better. So for some of my students, it’s things like knitting in class or coloring or doodling. This is actually not them disengaging or not paying attention. It’s them doing something that allows them to focus their attention on what you’re saying. So I like to think about the performance of attention, what we often think of as paying attention. If a neurodivergent student is going to perform attention, they’re probably not actually listening to anything you’re saying, because they’re using all of their brainpower to do the things that you think mean they’re paying attention. With that said, beyond this reorienting of our thinking about what students are doing and what that means in terms of their engagement with us, I think it helps to be really clear about scaffolding, about what tasks are needed, and to provide clear deadlines. So Karen Costa, who is just brilliant, talks a lot about ADHD, and she is a person with ADHD. And she talks about the need for more structure, not less, that flexibility can be useful, but you need a lot of structure.
And for people with neurodivergent brains, it can be really helpful to have lots of small deadlines that are low stakes with some grace around them, but clear structure is really important… messaging to students, like, here’s how you do this class. So if that’s working in a group, that means giving specific roles and asking the students to decide who’s going to be the note taker, and who’s going to be the recorder, and who’s going to be the crazy-idea person, and who’s going to be the let’s-bring-it-back-to-the-text person. So just kind of delineating some specific roles, communicating clearly and in multiple modalities. Especially if you’re doing a lot of audio lecturing, or giving out directions, make sure that those are also available in writing, so students can come back to them later when they know that you said something but they don’t remember what. And then just being really aware of the sensory environment in which you’re learning. So if that’s in a physical classroom, thinking about ways to give students permission to make themselves comfortable, if that’s getting up and walking around a little bit, just sort of saying, like, I know that that helps some of you concentrate, fine with me if you do that. Telling them that they can get up and leave the room for a couple of minutes if they need a break. If they want to bring in things to play with, color, or fidget, create whatever, that’s fine too. A lot of my students like to just sit on the floor. I mean, that sounds like a disaster for me and my middle-aged body, but when you’re 18, it’s like easy to get up off the floor, and so if you want to sit on the floor, sit on the floor, it’s cool, if that helps you be more comfortable. So I think it’s awareness, and then just messaging to students that they can do what they need to do. And I just want to say one more thing about this.
And that is, even if you are not neurodivergent, even if you are what people define as neurotypical, you know people who are neurodivergent: students you’ve had, friends, family members, colleagues. Talk about some of the ways that they have given themselves permission to make their environments work for them, so that you’re messaging to students that you understand and that you support those kinds of self-advocacy efforts. You don’t have to do all of that on the first day of school, [LAUGHTER] first day of class. It’s a lot, but I often include something on my syllabus that says: you may have accommodations or you may not, but if there’s something that I can do to make this class easier for you to participate in meaningfully and be successful in, according to your own goals, then I will do it, as long as it’s something within my power to do. So you don’t need formal accommodations to ask me to do something to help you.

Rebecca: One of the things that I heard you mention, and that often comes up in inclusive pedagogy and other spaces, is the idea of scaffolding. Here, again, we have this assumption that everybody knows exactly what that is all the time. But part of that is really about helping students understand their priorities, perhaps within a class, and also how to manage their time related to certain kinds of tasks. Can you talk a little bit about that component of scaffolding and what that might actually look like in practice?

Liz: Yeah, it’s hard for me to tease that out without thinking about lots of other things too. Because we, as faculty, are coming into the classroom with certain ideas about what should be important, and what students should want to do in order to be successful, whatever that means. And I have really had to learn over my teaching career to check myself on that, because my priorities are not the same as my students’. I remember once I was grading a student’s final exam. It was very, very early in my teaching career, I’m embarrassed to even say what they had done for their final, but it was multiple choice, and I was grading it by hand, and I told them they could just stick around, I would grade it real quick and then give them their final grade, because I had done everything else before then. And I looked at the student, who had come to class every day and had really meaningfully participated, and said, “Your final grade is a C,” and I was so apologetic, like, “Oh my gosh, I’m sorry.” And this student was like, “YES! I PASSED!” And it was a real moment for me of just saying, okay, you cannot put your own values and goals onto students, because that was literally the student’s dream: to get a C and not have to take the class again. I think when we’re talking about students, it’s really having that very frank conversation, like, some of you are here because you are being required to take this class, and I know it wasn’t your choice, and I’m going to try to make it the least awful it can be for you. And I’m going to ask you to try to invest at least enough time to give it a fair shot. But I don’t expect that everybody has the same goals, and so let’s take a moment and reflect on what success for you looks like. And then what do you need to do over the course of the next 15 weeks, or whatever it might be, in order to meet that goal?
And so we sort of assume that our students’ goals are to get a good grade and to move on to whatever the next thing is. But maybe not; maybe they’re just taking the class for fun, maybe they don’t care about the grade. Maybe they’re an adult learner who has a curiosity about something. After I graduated undergrad, I took an ethics class online, this was like 2001 or 2002 maybe, so it was very early days of online teaching and learning. And I took it through the community college just because I had never taken it, and I thought it might be interesting. It really wasn’t interesting for me. So I just, like, stopped paying attention. But I did not consider that a failure. Like, I got a little bit of information. I also took a macroeconomics class, because I had never taken a macro class; I had taken micro. And it was all online, and I did all the work, but I didn’t turn in any assignments, because I just didn’t care about those. I just wanted to learn it. So I think having these conversations can be really helpful in students figuring out what it is that they want to get out of a class.

John: Going back to that point you made about structure, this is something we’re seeing an awful lot… certainly in the work on inclusive teaching, in Viji Sathy and Kelly Hogan’s book, as well as the work of Mary-Ann Winkelmes on transparency in learning and teaching. There seems to be a convergence that providing students with structure and support can do a lot; it can benefit pretty much all students. In past discussions, when we spoke to those authors, much of the focus was on the benefits to first-gen students and to students who were historically minoritized. But it’s kind of nice to know that the same inclusive teaching strategies also address issues of neurodivergence.

Liz: I had an experience at my last job, where I just kept asking my dean, “Give me a set of rules and I will follow them.” And she would say, “Well, just use your best professional judgment.” And I don’t know what that means. I think people with certain kinds of neurodivergence just want you to tell them what to do, and they will do it. Give me a set of very clear expectations, and I will meet them, and this can work for everyone… just, like, clarity. And it doesn’t have to be punitive. I’m very fond of Cate Denial’s Pedagogy of Kindness, and this notion that it should be kind, which can be, as Sarah Rose Cavanagh says, like a warm demander. I want you to have expectations of me, I want to know that you care about me. But just be really clear, because clarity is kind. And yes, it helps with all of those things. So the first-generation students, the historically minoritized, the neurodivergent, lots of different kinds of people benefit from this. And even the third-generation, middle-class, neurotypical student also [LAUGHTER] benefits from clarity.

Rebecca: Imagine that, not spending all of our cognitive energy trying to figure out what people want.

Liz: Exactly. [LAUGHTER]

Rebecca: What are some of the challenges that neurodivergent faculty face in their careers? We’ve talked a lot about students, but we also know that faculty exist, too.

Liz: We do. And we’re not all, like, cut from the same cookie cutter. I can tell you, just from my own experiences, that higher ed can be really hostile to those who are neurodivergent. I’ve had really great experiences, and I’ve had some really challenging ones. I think that it’s helpful when we’re aware of these things for students, because we often have the most power over their educational experience. But we also share power with our colleagues. And so knowing what some of these things are can help us understand the behaviors of our colleagues that we might have been inclined to read as subversive, or unprofessional, or lacking in collegiality. Those words get used a lot, for a lot of different kinds of identities and traits. Neurodivergence is certainly one of them. So as a woman who’s neurodivergent, that intersectionality means that I’m always on the lookout for that kind of language, like “unprofessional” and “not collegial” and “you’re being difficult in some way.” Well, or maybe I just don’t understand what it is that you’re asking of me. I also think we need to be really careful when we think about this idea of fit. Especially in hiring, we are looking for someone who will fit. But fit often means “like me.” And it can be very exclusionary to people who have some sort of neurodivergence, because they may not act the way that you do. But that’s actually a strength, I think. When you look at the different kinds of neurodivergent conditions, ADHD brains are so good at hyperfocus. They just don’t always do a good job of, like, understanding time, right, there’s a kind of time blindness. But they’re so good at that. And autistic brains are so excited about the things that they’re excited about. And that energy is so captivating. And so these are not weaknesses; these are strengths that can really help us appreciate things in our work that we wouldn’t if we didn’t have those around.
So when thinking about working with colleagues, all of the things that I said before, sensory overwhelm, communication, social interactions, and kind of executive function, we should be thinking about those things with our colleagues as well. So when I design a workshop, for example, in our Center for Excellence in Teaching and Learning, I’m thinking about: “Okay, how do I create a space where the chairs could be moved away from the rest of the group, so that people can have a little bit of space and kind of get away from that? Can I dim the lights a little? Can I make sure everybody’s using a microphone, but also let people know that if they want to put in their Loop earplugs, as I do, they can do that to kind of limit some background noise? Can I make sure that everything I say is also written down somewhere, so that people have something to refer to later? Can I talk about my own experiences in a way that normalizes other people doing the same?” All of those things can be used to make the environment more inviting for our colleagues. The last thing I want to say is that it is so exhausting, as a person with a disability of any kind, to constantly have to advocate for yourself. So the more that non-disabled people can lend their support and their voices to advocating for easier pathways to accessibility, the less you’re taxing your disabled colleagues. So think about: what can I do that, if I did it, would make it easier for a disabled colleague to come behind me and ask for the same?

John: Near the end of the summer you posted on the social media site formerly known as Twitter, something about a podcast and puzzles set of workshops. Have you started that? And could you talk a little bit about that?

Liz: Yes, so I’m really excited about this. And a lot of people who are not at the University of Mississippi are also excited about it. And I’m just trying to get the people on campus excited about it. So the idea here, and this was specifically created as a neurodivergent-friendly space, is that faculty and staff can come to our center for an hour every other week. So this Wednesday will be our next one. And we play a pedagogy podcast (we played an episode of this podcast), and we do a puzzle or some other kind of individual activity, “parallel play” is what it’s called. So I’m working on a puzzle. It’s in the next room, and I’m not done with it. And it’s driving me crazy, because I don’t like unfinished puzzles. But I have committed to not working on it, except when everybody else is here. But it’s a Funko Pop puzzle of Ted Lasso, so it’s really fun. And we’ve had two of these now, and it’s a small but mighty group who are into this, but the lights are low, it’s indirect, diffuse lighting, there’s lots of different kinds of seating, and one person comes and colors, a couple of people have come and done puzzles. But the idea is that it’s just a way to get together in a social space without the social expectation of small talk. So you can just come and show up and listen to a podcast and leave. If you want to stay and talk you can, you don’t have to. I am hoping that this becomes a movement of podcasts and puzzles. And I’m going to stick with it as long as it takes for me to make it so here, but we have probably like five or six people who have come to one of them. And I hope many more will as it continues to spread. It’s kind of hard to describe in a website or an email, but I think once people come they see the brilliance of this… I say with all possible modesty. [LAUGHTER]

John: Have people actually finished a puzzle during the course of one of these meetings?

Liz: This Wednesday will be our third meeting. So I think the Funko Pop Ted Lasso crossover puzzle will finish this week, I hope. It’s going to drive me absolutely batty if it doesn’t, and then we’ll move on to another one.

Rebecca: That sounds like a lot of fun.

Liz: It is. And it’s a good way to kind of model why I think so many of our neurodivergent students would really thrive in a way of learning that’s very different from what we’re used to in higher ed. And that’s probably why it’s hard for people to imagine why we’re doing this or what it looks like. But we have writing groups where people come into our room and do their own writing. And just that body doubling of having someone else there while I’m trying to do something is enormously helpful. And so in this case, I took two things that I love, that I never make time for because I feel guilty about all the other things I should be doing, so puzzles and listening to podcasts about teaching, and I just put them together. And I hope that more people will see ways to create these spaces that are perhaps a bit unconventional in higher ed, but that can open our imagination to the ways that we can model learning in different ways than the more traditional models that we’re used to seeing.

Rebecca: I like the analogy with the writing group, in that it’s really holding people accountable to do a particular thing, which is to attend to teaching in a different way, by listening to a podcast as opposed to a different kind of workshop or something and allowing them to do something with their hands.

Liz: I also have this very large bucket of fidget toys that I take to every workshop. And I say just borrow a fidget and just play with it and see how that changes your experience of the workshop. And if you find it to be soothing, imagine what normalizing this in your classes might do for your students. So my colleague is just a couple of doors down, and I have one of these little pop balls that make these really satisfying noises. And the first time I brought this to a workshop, she said, “Is that the sound I’ve been hearing?” I just play with it all the time.

John: Do you have any other advice for faculty and campuses who wish to better address neurodiversity?

Liz: There’s this phrase in the autism world and the disability world and I’ve been hearing it more and more and it is, “Nothing for us, without us.” And so I can tell you my perspective as someone who is neurodivergent, there’s so much expertise on your campus, and you should talk to those people. So that might be in the disabilities support services area. It might be students in your class, but just like have these conversations and find out, what can I do from my position, whatever it might be, that can make this place more welcoming to people who are neurodivergent? And I think when you’re asking that question, just like with anything else that we might be doing, then people are going to assume good intent. And they’re also going to be much more forgiving, if you make a stumble of some kind, whatever that might be. I don’t know. And so just talk to people, ask them. I feel like this is the most obvious advice that we give as faculty developers, but it’s ask your students, just ask them, they just want to be asked. And so if I was to give any advice that would be that just ask your students: “What can I do that would make this easier for you?”

Rebecca: I know, one thing that we talk a lot about on our campus is that access is really the doorway to belonging. If you don’t have access…

Liz: Yes.

Rebecca: …you’re not going to feel like you belong.

Liz: Just knowing that someone is thinking about what you might need is enough to make people feel like they’re included, and knowing that you’re listening when they tell you what they need would be helpful.

Rebecca: So we always wrap up by asking: “What’s next?”

Liz: I have so many writing projects that I’m just sort of getting started with. So I just recently finished a manuscript on my book, The Present Professor, that I mentioned the last time we talked. And so that’s going through the publishing process and will eventually go out in the world, I assume, knock on wood. So I’m filling my time while I wait for progress there on, it seems like, about a dozen other writing projects, all of which are just kind of me thinking. I’ve been really interested lately in talking about the role of learning outcomes, and what we decide rises to the importance of a learning outcome. And if I may say this one controversial thing that I just keep saying to everyone I know, I don’t think you as a student should be able to fail a class for doing something or failing to do something that is not a learning outcome of the course. So if turning in something two days late means that I fail the assignment, then shouldn’t that be a learning outcome, timeliness? I don’t know. It just feels to me like, if we’re going to assess learning, then we should be assessing learning, and not all the other things that are performance of learning. So I’ve been thinking a lot about that and a whole bunch of other things. That’s what’s next, something, many things.

John: We had a similar conversation with Kevin Gannon, not too long ago who talked about…

Liz: …performative hardassery…

John: That was the technical term…

Rebecca: [LAUGHING]

John: …but it was…

Liz: Yes

John: …in terms of rigor, who distinguished between cognitive rigor and logistical rigor.

John: Well, thank you. It’s great talking to you and we look forward to more conversations in the future. And when your book gets closer to coming out, we’d be very happy to have you back on to talk about that as well as any other topic that comes up in between now and then.

Liz: Absolutely. I so appreciate the work you guys do, and I’m grateful and honored to be a part of it.

Rebecca: It was great talking to you. Thank you

[MUSIC]

John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.

[MUSIC]

312. Alice: Finding Wonderland

Many of our disciplines are unfamiliar to students until their first encounter in an introductory course. In this episode, Rameen Mohammadi joins us to discuss his first-year course that introduces students to computer science using an approachable hands-on experience.

Show Notes

Transcript

John: Many of our disciplines are unfamiliar to students until their first encounter in an introductory course. In this episode, we look at a first-year course that introduces students to computer science using an approachable hands-on experience.

[MUSIC]

John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.

[MUSIC]

Rebecca: Our guest today is Rameen Mohammadi. Rameen is the Associate Provost at SUNY Oswego and an Associate Professor of Computer Science. Welcome, Rameen.

Rameen: Thank you. Thank you both.

John: It’s good to have you here. Today, for the first time ever, we’re all drinking the same tea, which is Chimps tea, a black tea with honey flavor, fig, and thyme. And this is a gift from one of our listeners, Myriam in France. Thank you, Myriam.

Rebecca: Yeah, so far it’s really tasty.

Rameen: Yeah, really excellent tea. We love it.

Rebecca: So we invited you here today, Rameen, to discuss your First-Year Seminar course that combines animation and storytelling using the Alice 3 programming environment. Before we discuss that, though, could you provide an overview of the goal of first-year seminar courses at Oswego?

Rameen: This is not a standard first-year seminar. First-year seminar courses are designed to extend orientation, familiarize students with resources, and things like that. Our perspective about this type of course, which we call signature courses at SUNY Oswego, is that you are welcoming students to the intellectual community that we have. So for first-year students, we desire a number of outcomes to be met by these courses. One of them is critical thinking; they have to have a significant critical thinking component. Also, these courses need to have both writing and oral communication embedded in them. And one of my favorites is that they have to enhance the cultural competency of our students. We’re a very diverse student body, so there’s quite a bit of opportunity to make sure students experience other perspectives. And I think courses of this type really need to address that. Our provost, Scott Furlong, brought the idea to us even during his interview at SUNY Oswego, about what they had called, at his previous institution, passion courses. Now, as I said, we call them signature courses here. But those of us who love our discipline certainly can understand when somebody uses the term passion. So what makes you excited about your discipline? That’s what the course should help students experience.

John: So, you’re using the Alice 3 programming environment. Could you talk a little bit about the types of things that students are going to be doing in the class?

Rameen: So Alice 3 is a VR programming environment. So what you do is you build a scene, you can bring avatars of various types, could be people, could be aliens, could be dogs, into a scene, and you have props, trees, mountains, buildings, that you could bring into a scene, and then you learn to program something. So they can talk to each other, they can move from point A to point B. And it actually turns out, they’re able to, and they will be, writing reactive programming, which typically is what we do when we design games. So the user acts in some way, and then you program the reaction in the VR world in that context, or things run into each other. And obviously, when you’re designing games, cars or other things may run into each other, and you have the ability to detect that and actually act on that. But at this point, they are already running about a month ahead of where I thought they could be in just about a month of the semester. So I’m really hoping we can get that far.

Rebecca: Can you talk a little bit about why you chose the Alice platform, and what you were really hoping to foster with students.

Rameen: So, just a bit of background about Alice. Alice is supported by researchers at Carnegie Mellon. I think Randy Pausch, when he was at the University of Virginia, is really the person that began the innovation with Alice. And he then moved to Carnegie Mellon. Many people in computer science would know who he is because of his work in VR, but what he should universally be known for is The Last Lecture, which is a pretty amazing hour-plus lecture he gave before he died from cancer. But that group has been working on Alice for a very, very long time, and of course, has had new actors along the way. Don Slater is one of the people that has been part of that group for a long time, and he’s very much involved and was, at the time when I met him, very much involved in advanced placement. And that’s something I’ve been involved in for a long time, advanced placement for computer science. So one of the things we do in AP readings, we have people do professional development activities, and he gave a talk about Alice, and this is a long time ago. But when I first listened to him talk about it, and he showed the features of the system, I really didn’t have a place for it in anything I taught at the time. So it has been brewing in the back of my head as a thing to build a course around for a long time, but really couldn’t have done it until the opportunity came along to build a signature course.

John: For those students who do go on to computer science, it seems like a really nice way of introducing the concept of object-oriented programming. Does it work in that way?

Rameen: So the thing to understand about object orientation is that most of us who are software engineers by discipline, database types, are very comfortable thinking of student information as being an object, and the fact that we have a container of student objects and so on. But it turns out that’s not necessarily as comfortable for students as it is for those of us who do this for a living. But when you say, here’s an avatar, and you put this on the screen, and you could tell it to go from point A to point B, that seems like a very natural idea to students, and the fact that this particular avatar, so suppose it’s a bird, has wings, as opposed to a person who has legs, you don’t have to explain that. It’s a concept that’s inherent in being a human being who’s 18 or 19 years old. So some aspects of object orientation that often are difficult for students are really obvious in this context. So any object like an avatar of a person, dog, cat, whatever, they can all be moved from point A to point B. So they share a set of expectations and attributes. They have a location in a 3D world, and you can move them from A to B; piece of cake, they understand. But then you say, “Well, this one is a bird, it has wings.” So the fact that you can spread the wings or fold the wings would be a characteristic that exists only because it’s a bird. So inheritance, which is a concept that we like to teach in computer science, is just built into the way the system behaves. And no student will say, “Well, what do you mean, a bird can spread its wings or fold its wings?” People just naturally know what it all means. And believe me, it’s not always natural in some of the other things we try to do with students to teach these topics. So it does lend itself extremely well to understanding that objects have attributes, they have functionality, and it’s all there on the screen, and they can see it.
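The inheritance idea Rameen describes, shared abilities on every scene object with bird-specific abilities layered on top, can be sketched in a few lines of Python. The class and method names here are illustrative only, not Alice’s actual generated API:

```python
# Every scene object shares a location in the 3D world and can move A to B.
class SceneObject:
    def __init__(self, x=0.0, y=0.0, z=0.0):
        self.position = (x, y, z)

    def move_to(self, x, y, z):
        self.position = (x, y, z)


# A Bird inherits moving, and adds wing behavior that only birds have.
class Bird(SceneObject):
    def __init__(self):
        super().__init__()
        self.wings_spread = False

    def spread_wings(self):
        self.wings_spread = True

    def fold_wings(self):
        self.wings_spread = False


bird = Bird()
bird.move_to(0, 5, 2)   # inherited from SceneObject
bird.spread_wings()     # exists only because it's a bird
```

The point of the sketch is the one Rameen makes: `move_to` comes for free from the shared parent class, while `spread_wings` lives only on `Bird`.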

Rebecca: I think Alice is really nice, because it is so visual. And so you get those immediate, “I can see the thing I did,” whereas I remember when I started learning some code, I was building a database for car parts, and it was completely abstract. And I cared nothing about car parts.

Rameen: Yeah.

Rebecca: So it didn’t make it that accessible to me.

Rameen: Exactly, it’s not. Then the other aspect of it that I think we need to think about, about the platform, is that you don’t write a single line of code; you generate thousands and thousands of lines of code, but you don’t write any. So if you have a particular avatar as the object that you’re processing at the time, in building your code on the screen, you could just drag and drop functionality it has into your code. If you need to loop and repeat steps, you drop a loop into your code and then put the steps you want to repeat inside that loop. So all the typical barriers a student had with syntax of various languages, whether it was Java, Python, or C++, kind of wash away, because you don’t really have to know syntax at all; you need to know “What are you trying to do?” and what will enable you to do it, and then you can execute that. So far, clearly, that’s not a problem for them. Here is the screen, this part of it is dedicated to X, that part of it is dedicated to Y, and they’ve been able to handle it probably from week one. So all the standard things that tend to take a long time don’t take any time. And besides, doing 3D graphics, if you are a computer science person, is in my mind a super senior-level type of activity. You’ve got to teach them an awful lot about data structures and other event-handling elements that they must learn, because that is what we all had to learn. But guess what, you can learn it with Alice in short order. And this course is proving that you can.

John: Now one of the challenges that I could imagine you might have as that students come in with different levels of prior knowledge or interest or engagement with computer science. Some students may have not written a single line of code in any language, while others may also be taking other CS courses at the same time, or have some prior programming experience. How do you address the differences in background?

Rameen: So my sample case here is small, I only have 17 students, but this is not a computer science required course. So this is one that has art students in it, it has biology students, and it does have a few computer science students, and then maybe one with an AP computer science background from high school, and none of them are doing any better than these other kids. So I guess the point is, it levels the playing field in a pretty significant way. If you can think a thought, you can probably write code in Alice. And I’m finding it quite interesting, since I’m not preparing them for another course… not only does this course not have a prerequisite, it’s not a prerequisite for anything else. So the way I designed the course going into it, I went into it thinking, okay, so if storytelling and writing a really cool story within groups is the best I can get out of them, great. If I can get them to a point where they can write new functionality for objects, and I can help them write reactive programming so they react to the mouse click or collisions of objects and so on, maybe I’m dreaming, but that would be fantastic. At this point, I’m pretty certain I can get them there without other stuff. But that was kind of the key coming into the course, I walked in with a mindset of being flexible, that if they are struggling, I’m not going to keep pushing it like I would typically do in a CS course, which is partly why you would also lose students. But at least in my experience with these kids, and I can’t say until I teach it again (and hopefully I can) whether it will be the same, the way it works is that you show them how to do something, and then they go to work, and they start doing it, and then they make mistakes, as we all do, and then you give them a little bit of a hint about potentially maybe a different technique they could have used to accomplish the same task. I’m just going to give you an example. 
So you want the bird to go from point A to point B, so it’s on the ground, needs to go up on top of the tree. So Alice lets you put a marker where you want it to go on the tree, because you can’t go to the tree, you’re going on a branch of a tree. So you need to know how to put a marker there. So you put a marker there. And then it just goes from point A to point B, it goes from the ground to the top of the tree. Then you say, “Wait a minute, that’s really not the way birds fly.” So now you’ve got to figure out, well, how am I going to flap its wings to go from point A to point B, to go from the ground to the branch on that tree? So it turns out, and I’ve come up with a solution to this myself, obviously, you can’t really teach these things if you haven’t thought about how you could possibly solve them. And one of my students, after like three weeks of instruction, she figured out how to do what we call in Alice a “do together.” So as it’s moving from point A to point B, the step that is happening at the same time is the flapping event of spreading and folding the bird’s wings, and she made it very clear that the bird was flying [LAUGHTER] from the ground to the tree with no interruption. Then we need to talk about, well, do they really need their legs hanging out as they’re flying? I don’t know much about birds, but I think they fold their legs back. So now we have to learn how to address some kind of functionality that is about a part of the body of the bird. So this is the way the learning is happening in the course, kind of naturally. You’re trying to make a realistic action on the screen in the animation. Well, how are we going to do that? Well, we have to now address the joints, like the hip joint or the knee joint or the ankle joint, to make that much more natural in the way it works. And there’s no persuasion here; the student is trying to make an interesting thing. And then I’m there to help them figure out how to make that much more realistic.
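A “do together” composes two animations in parallel and advances them in lockstep. Alice builds this with drag-and-drop blocks, so the sketch below is a hypothetical plain-Python rendering of the idea, with every name invented for illustration:

```python
# Each animation is a function of progress t in [0, 1] that updates the state.
def move(state, t, start=0.0, end=10.0):
    # Glide the bird's height from the ground to the branch.
    state["height"] = start + t * (end - start)

def flap(state, t, beats=4):
    # Alternate spread/folded wings several times over the same interval.
    state["wings_spread"] = int(t * beats * 2) % 2 == 0

def do_together(state, steps, *animations):
    # Advance every animation by the same time step, like Alice's "do together."
    for i in range(steps + 1):
        t = i / steps
        for animation in animations:
            animation(state, t)

bird = {"height": 0.0, "wings_spread": True}
do_together(bird, 20, move, flap)  # the bird climbs while flapping
```

Running the two animations one after the other would give the unnatural “slide up, then flap in place” motion; stepping them together is what made the student’s bird look like it was flying.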

Rebecca: What I really love about these courses, and in what you’re describing with Alice, as someone who’s also taught code to students, particularly ones that are not in computer science, is that they’re thinking like a computer scientist, and you’re really getting them completely within the discipline, you’re hooking them right in because they’re leading with their curiosity. They’re not satisfied with the way something looks, so they’re digging in and digging in and digging in. And unlike our traditional way of structuring curriculum, where we think this is the foundational information, and this is the next thing we build on, it almost turns it totally on its head [LAUGHTER], and does it like backwards from what we traditionally do. And it’s really fun.

Rameen: Well, I think the students are at the center of that type of a decision. For years, you see human beings that probably could do this kind of work, but shortly after they try and they get error after error after error, they say, “Hey, listen, it’s great that there are geeks like you who like to do this kind of thing. It’s not for me,” and they walk away from the discipline, even though they could have had great contributions in computer science. So for me, some of these concepts are introduced this way, where syntax and semantics, which is typically what slows people down when they first begin, even the systems we use… like how do you type your program? And how do you run your program? …there’s a whole bunch of instruction around how you do anything. With Alice, you just go to alice.org, you download Alice 3, but once you do, it’s here you go, you click here, and then you set up your scene; you click here, you begin writing code. Well, how do you write code? Well, the object is on the left side, you drag the command from the left to the right. How far do you want it to go? Well, you’ve got to choose a certain distance that it needs to travel… really, really easy for students to take to right away. And I just had no idea what I should expect. You watch a lot of YouTube videos; I certainly did when I was preparing this course, of all these different people, young and old, building things and being proud of what they had built. And I thought, if I could bring that to a course for our first-year students, that would be really, really awesome. And I think that’s what has happened.

John: You mentioned that the students are able to interact. Are they all in one virtual shared space for the class? Or do they have to invite the other students into the spaces that they’ve developed?

Rameen: So this is a really good question, John. So when I imagined how the course was going to work, I had to think of a number of things. One, I asked our technology people to install Alice on all lab computers because I can’t assume or assert that every single student that will take a course like this will necessarily have the equipment that could enable them to run it. Even the Mac kids who had trouble at first installing the thing, and I needed people to help them to get it installed, even they could continue to work because we had the software on our machines. The type of collaboration that I advocate for in class is a little untraditional. At least I think you could argue that it may be. So like the other day, I gave them a 10-question quiz. So they answer the 10-question quiz. And then I said, “Find a couple of other people and persuade them why your answers are correct and their answer is wrong.” So now the whole room starts talking about the quiz. I don’t know if they’ve ever had an experience where somebody says, “This is not cheating, what I’m asking you to do.” Who gives a quiz that says, talk to everybody else to see what they answered for the quiz.

Rebecca: John does. [LAUGHTER]

Rameen: And that’s not surprising, but in my mind, is it about the learning process? Or is it about assessing or giving a grade? This is a very low-stakes experience. So why exactly would I care if you talk to someone else about it? So why not persuade someone else that your answer is right? That’s a very different tactic than to say, “Do you know the answer? And are you right or wrong?” Persuading someone requires talking to them, requires thinking for yourself, first of all. Well, why is this answer right? And then, opposite of that, you hear the explanation. Are you persuaded that what they said is accurate? Or do you think they’re wrong? In which case now you’re giving them back a different perspective, and then they change their answer. And of course, you could change your answer for the wrong reason. That’s just one example. I really want them to collaborate and work with each other. And every time somebody does something interesting, like the young woman who built the code that I had not been able to write myself, having the bird fly from point A to point B looking very natural, I had her come to the front of the room, plug in with a connector that is in the room, and show everybody how she wrote her code. And we’ve done that at least a dozen times so far, where people just come up, plug in their computer, and show everybody their code. So we often are worried about students cheating and using other people’s work. But if it is about collaborative learning, then you really have to cultivate the idea that, you know, that was a really good idea, maybe I can do that. And I think, hopefully, the course will continue to behave that way, where I’m confident everybody’s learning from it. That’s the concern, that I’m the only one who knows something, whoever I am as a student, and everybody’s just copying me or whatever. That is not my experience so far in this course. They’re just trying to do it better, and if you have a better idea, maybe I can take that and move with it. 
Then maybe I’ll make it a better idea. And you see that also with students. One of them figures out how to make somebody walk more naturally, and then the other one even enhances that, even makes it even more realistic in the way you would walk. And that’s kind of what I like to see happen and is actually happening.

Rebecca: So how are students responding? Are you cultivating a whole new crop of computer scientists?

Rameen: So, this is an interesting question. I am wondering, for those who are not computer science students, whether or not they decide that this may be something for them. But I’m also… with Rebecca here, it’s good to bring this up… they might become interested in interaction design as a discipline to pursue and become passionate about. For those of us who do this kind of work for a living and have done it for 40 years or whatever, the engagement aspect is the critical aspect. If they are really invested in the learning process, they can overcome an awful lot of barriers. Frankly, I cannot persuade you to put the time in, but if you’re persuaded that you could do something a little bit better, then I’m done. As a teacher, I’ve set up the environment the way it should be, where you are driving the learning process yourself as a student, and it looks like that’s what they’re doing at this point. And we’re only a month into the course and they are behaving that way. Now, whether or not they will continue to take more computer science courses and get a degree in computer science, I’m not really sure. But, hopefully, if they do interaction design, they’ll be better interaction design students, because some of the structures that they have to learn here would definitely benefit them in that curriculum too.

Rebecca: Yeah. When can I come recruit?

Rameen: Anytime. I’ve already had the Office of Learning Services. That was one of the things I wanted to point out: I’ve had the Office of Learning Services come in, and they talked about the learning process. And besides what they have done for me in talking about the learning process, and it’s all research-based discussions, which is really critical for students to hear, things that actually do work and we can prove they can work, I talk about the learning process on a regular basis with them. And I’m very interested in them understanding why we’re doing anything that we’re doing. I mean, they may be used to somebody standing in front of them talking for an hour, and I just don’t do that. I may talk for 10 minutes and then have them work on stuff, and then as we see gaps in what they understand, then I talk for 10 more minutes, maybe. I try not to talk a whole lot. I want them to be working. So the time I’ve spent is on building what they are supposed to do to learn, not so much talking to them on a regular basis during the 55 minutes that the class goes on. Unfortunately, I’m not quite sure that that’s the experience they’ll have in other courses that they take, because to me, there is a freedom embedded in the way the course is designed that is hard to replicate if you have to cover from A to Z of some topic. You get to “H” and people are having trouble, well, you just keep going. Well, I don’t know if everybody has the luxury to say, well, maybe we need to pause longer because we can’t get past this point. I mean, what’s the point? If we don’t get past this point, you’ll never catch up to where the end is. So I am hoping some of them will decide to be CS majors as a result, but I’m more interested in seeing how they will do if they take more CS courses. I mean, if they take a CS 1 course, are they going to do better than a typical person taking a CS 1 course? If they move on and take data structures and other courses we require, will it come to them easier? 
I think it’s a really interesting question. And I think there’s a lot of research that advocates for the fact that they will do better, but I like to see it firsthand.

John: One of the things I believe was mentioned in the title of the course was storytelling. What types of storytelling take place in the class? Is it the design of the scenes, or is there some other type of communication going on?

Rameen: So the way you do animation, in general… and I probably should back up and say I spent an ungodly amount of hours trying to learn how to do this, but I went about it backwards, because I went after event-driven programming and interesting things that I knew Alice could do to write games. But then you step back from it and say, well, students can’t start there. I mean, that’s just not a good place. So then you look at the alice.org website, which gives you a tremendous amount of resources, and say, “Oh, designing scenes happens to be the first thing they teach you to do. Oh, maybe I should learn how to design a scene.” So you put the pieces that you want in your story on the screen, and if you don’t want them to appear, you can make them invisible. But the way Alice works, you have to put all the components in on the front end. It’s a little different than the way we do object-oriented programming. When we do object-oriented programming, you create things when you need them; you don’t think about setting up the scene on the front end. So that took a little getting used to, but that’s what you do. And then the characters you put on the screen can move, they can talk, they can fly, they can do whatever you need them to do. And my biggest interest, when I was conceiving of the course, was storytelling: I really want students, especially those from other cultures, other backgrounds, to tell their story, to find a way to tell their story. And this is probably going to be starting as a group project for my students in a few weeks. And we’ll see how that actually goes. And obviously it has to have a beginning, a middle, and an end for it to be an actual story. But I’m just excited to see what they will actually decide to do and how they actually do it. Along the way, though, they’re going to need some tools. And that’s kind of what my contribution will be: making sure that they can tell the story the way they want.

John: It sounds as if developing this course required you to learn quite a few new things that were outside your normal teaching experiences. Do you have any advice for faculty who are working on the development of similar courses?

Rameen: So for those out there who teach for a living: the opportunity to build something from the ground up, especially something that, frankly, when I first thought of it, I thought would be a lot easier than it turned out, because there was so much that I didn’t know how to do… but when you don’t know something, you don’t necessarily know that you don’t know it. People who were doing it, and I was watching them do it, made it look very easy to me. But once I began to do it, I discovered how much work there was to come to a point that actually orchestrates a course, I mean something that is meaningful and has a clear direction to it. So if you have that opportunity, even after 40 years of teaching, to start over in some ways and build something that you feel not particularly comfortable about, I really highly recommend you do that. Because that is what your students are experiencing every single day when they are trying to learn this stuff that you know so well. So having a little bit of a taste of what it takes to learn something you know very little about, I think, is critical. So my message to the faculty who are listening is that if that opportunity arises, by all means, take it.

Rebecca: You certainly get a lot more empathy for what it feels like to not be an expert, when you’re learning something brand new again.

Rameen: Well, that’s the thing about most of the things we do. I’ve been programming for 50 years. So it’s one of those things where you’re completely in tune with the idea of: understand the problem, solve the problem, whatever. But where should the camera be in a 3D world in order for it to point at the person talking just the right way? I had to figure that out. It didn’t come naturally to me. In the first bunch of programs I wrote, the camera was always in the same spot, and then I began learning that, “Oh, I have control over where this camera goes. [LAUGHTER] So maybe it needs to be somewhere else when this person is talking versus this other one.” That’s been a lot of fun, to get a sensibility back into the system here that this stuff is not as obvious as it may seem.

John: We always end with the question: “What’s next?”

Rameen: So I certainly would like to teach the course more, and I also want to do some presentations with the faculty in computer science, and, if there is interest among graphic design faculty, to do some for them, too, because I think the platform is extremely powerful. It doesn’t cost anything, and the resources that exist have been getting developed for a very long time and are pretty mature. And again, back to no cost: we all know how much books cost. You really don’t need one; you just use the exercises that they give you at the Carnegie Mellon site for Alice and go with it. So I really want to advocate for faculty to consider using it beyond just this first-year signature course.

John: Well, thank you. This sounds like a really interesting project that can really engage students.

Rebecca: Yeah, it sounds like a lot of fun. I can’t wait to come visit.

Rameen: Yeah. Thank you both. Really, this was fun. Thanks for the tea also.

John: Well, we have Myriam to thank for that.

[MUSIC]

John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.

[MUSIC]

311. Upskilling in AI

With so many demands on faculty time, it can be difficult to prioritize professional development in the area of AI. In this episode, Marc Watkins joins us to discuss a program that incentivizes faculty development in the AI space. Marc is an Academic Innovation Fellow at the University of Mississippi, where he helped found and currently directs the AI Institute for Teachers.

Show Notes

Transcript

John: With so many demands on faculty time, it can be difficult to prioritize professional development in the area of AI. In this episode, we examine a program that incentivizes faculty development in the AI space.

[MUSIC]

John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.

[MUSIC]

John: Our guest today is Marc Watkins. Marc is an Academic Innovation Fellow at the University of Mississippi, where he helped found and currently directs the AI Institute for Teachers. Welcome back, Marc.

Marc: Thank you, John. Thank you, Rebecca. It’s great to be back.

Rebecca: We’re glad to have you. Today’s teas are… Marc, are you drinking tea?

Marc: I am. I have a Cold Brew Hibiscus, which is really great. It’s still very warm down here in Mississippi. So it’s nice to have something that’s a little bit cool. That’d be refreshing.

Rebecca: That sounds yummy. How about you, John?

John: I am drinking a peppermint spearmint tarragon blend today. And it’s not so warm here. In fact, my furnace came on for the first time yesterday.

Rebecca: Yeah, transitions. And, I have English tea time today.

Marc: Well, that’s great.

John: So we have invited you here to discuss your ongoing work related to ChatGPT and other AI tools. Could you first describe what the AI Institute for Teachers is and its origins?

Marc: Sure. I think I was last a guest here in January of this year on your show, and it seems like 1000 years ago [LAUGHTER], but during that spring semester, I really took a much deeper dive than the original pilot with a lot of the generative AI tools in the fall. And we started noticing that the pace at which big tech was deploying these tools and integrating them with existing software from Microsoft and Google was only accelerating. So in about April or May, I went to my chair, Stephen Monroe, and said, “I think we need to start training some people to get them prepared for the fall,” because we kind of thought that fall was going to be what it is right now, which is a chaotic mash-up of everything you can imagine: some people dive in deeply, some people tried to ban it, some people are trying to take some critical approaches with it too. So we actually worked with the Institute of Data Science here at the University of Mississippi, and we got some money. And we were able to pay 23 faculty members $1,000 apiece to train them for a day and a half on everything we knew about Generative AI: about AI literacy, ethics, what tools were working in the classroom, and which weren’t. And their whole goal was to go back to their home departments over the summer and serve as ambassadors and help prepare them for the fall semester. So we started that. We’ve had funding for one institute, and now we’re doing workshops and searching, as we all will, for more funding to do more.

Rebecca: How did faculty respond to (A) the incentive, but (B) also [LAUGHTER] the training that went with it?

Marc: Well, not surprisingly, they responded really well to the incentives. When you can pay people for their time, they generally do show up and participate. We had quite a few people wanting to take the training, both internally from the University of Mississippi and then, as people started finding out about it, because I was posting about it on Twitter and writing about it on my Substack, we had interest from graduate students in Rome, interest from other SEC schools wanting to attend, and even interest from a community college in Hawaii. We’ve definitely seen a lot of interest within our community, both locally and, more broadly, nationally.

Rebecca: Did you find that faculty were already somewhat familiar with AI tools? I had an interesting conversation with some first-year students just the other day, and we were talking about AI and copyright. I was just asking, “Hey, how many of you have used AI?” Another faculty member and I indicated that we had used AI, to make it safe to admit it. And many of them really kind of shook their heads like, no, they hadn’t, and they were unsure. And then I started pointing to places where we see snippets of it, in email and in texting and other places where there’s auto-finishing of sentences and that kind of thing. And then they’re like, “Oh, yeah, I have seen that. I have engaged with that. I have used that.” What did you find about faculty’s knowledge?

Marc: Extremely limited. They thought of AI as ChatGPT. And one of the things we did with the session was basically frame it out as, “Look, this is not just going to remain as a single interface anymore.” One of the things that actually happened during the institute that was completely wild to me was the last day. I woke up that morning, and I’d signed up through Google Labs, and you can do it as well, to turn on the features within the Google suite of tools, including in search and Google Docs and Sheets and everything else. And they gave me access that last day, right before we began. And so I literally just plugged in my laptop and said, “This is what it’s going to look like in Google Docs when you have generative AI activated. It pops up and immediately greets you with a wand and the phrase ‘Help me write.’” And what I tried to explain to them, and have explained to faculty ever since then, is that it makes having a policy against AI very difficult when it shows up in an existing application with no indication whatsoever that this is in fact Generative AI. It’s just another feature in the application that you have grown up with, from many of our students’ perspectives, their entire lives. So yeah, we need to really work on training faculty, not just in the actual systems themselves, but also getting them outside of that mindset that the AI we’re talking about is just ChatGPT. It’s a lot more than that.

John: Yeah, in general, when we’ve done workshops, we haven’t had a lot of faculty attendance, partly because we haven’t paid people to participate [LAUGHTER], but what’s been surprising to me is how few faculty have actually explored the use of AI. My experience with first-year students was a little different than Rebecca’s: about half of the students in my large intro class had said that they had explored ChatGPT or some other AI tool. And they seem pretty comfortable with it. But faculty, at least in our local experience, have generally been a bit avoidant of the whole issue. I think they’ve taken the approach that this is something we don’t want to know about, because it may disrupt how we teach in the future. How do you address that issue, and getting faculty to recognize that this is going to be a disruptive technology in terms of how we assess student learning and in terms of how students are going to be demonstrating their learning, and also using these tools for the rest of their lives in some way?

Marc: That’s a great question. We trained 23 people, and I’ve also been holding workshops for faculty too, and again, the enthusiasm was a little bit different in those contexts. And I agree about faculty: I feel like they feel overwhelmed, and maybe some of them want to ignore this and don’t actually want to deal with it, but it is here, and it is being integrated at phenomenal rates into everything around us. But if faculty don’t come to terms with this, and start thinking about engagement with the technology, both for themselves and for their students, then it is going to create incredible disruption that’s going to be lasting; it’s not going to go away. We’re also not going to have things like AI detection come in and save the day for them, like it did with plagiarism detection. And those are all things we’ve been trying to very carefully explain to faculty to get them on board. Some of them, though, just aren’t there yet. I understand that. I empathize, too. This is a huge amount of time that you spend on these things, to think about and talk about as well. And we’re just coming out of the pandemic; people are exhausted, they don’t want to deal with another, quote unquote, crisis, which is another thing that we’re seeing too. So there are a lot of factors at play here that make faculty engagement less than what I’d like to see.

Rebecca: We had a chairs’ workshop over the summer, and I was somewhat surprised, based on our experience with other interactions with faculty, how many chairs had used AI. It was actually a significant number, and most of them were familiar. And that to me was encouraging. [LAUGHTER] It was like, “Okay, good, the leaders of the ship are aware. That’s good, that’s exciting.” But it’s also interesting to me that there are so many folks who are not that familiar, who haven’t experimented, but seem to have really strong policies around AI use, or this idea of banning it or wanting to use detectors, without really being familiar with what they can and cannot do.

Marc: Yeah, that’s very much what we’re seeing across the board too. The first detector that I’m aware of that really came online, I think, for everyone was basically GPTZero; there are a few others that existed beforehand too. IBM had one called the Giant Language Testing Lab. But those were all based on GPT-2, so you’re going back in time to 2019. I know how ridiculous it is to go back four years in technology terms and think about this… that was a long time ago. And we really started adopting that through education, or it seemed to be adopted in education, based off of that panic. The problem with institutions of education putting a system like that in place is that it’s not necessarily very reliable. Turnitin also adopted their own AI detector as well. A lot of different universities began to explore and play around with it, I believe, and I don’t want to be misquoted here or misrepresent Turnitin. I think when they initially came out with it, they were saying there was only a 1% false positive rate for detecting AI. They’ve since raised that to 5%. And that has some really deep implications for teaching and learning. Most recently, the Vanderbilt Center for Excellence in Teaching and Learning made the decision to not turn on the AI detection feature in Turnitin. Their reasoning was that they had, I think, in 2022, some 75,000 student papers submitted. If they had had the detector on then, it would have given them false positives on about 3,000 papers. And they just can’t deal with that sort of situation at a university level. No one can. You’d have to go through it investigating each one. You would also have to give students a hearing, because that is part of due process. It’s just too much. And that’s one of the main concerns that I have about the tools: they’re just not reliable in education.

John: And it’s not reliable both in terms of false positives and false negatives. So some of us are kind of troubled that we have allowed the Turnitin tool to be active, and have urged that our campus shut it down for those very reasons. A number of campuses, Vanderbilt being one of the biggest, I think, have done that, and I think quite a few campuses are moving in that direction.

Marc: Yes, the University of Pittsburgh also made the decision to turn it off. I think several others did as well, too.

Rebecca: It’s interesting, if we don’t have a tool to measure, a tool to catch if you will, then you can’t really have a strong policy saying you can’t use it at all. [LAUGHTER] There’s no way to follow up on that or take action on that.

Marc: Where we’re at, I think, for education, is a sort of conundrum. We’re trying to explain this to faculty. Much more broadly in society, though, if you can’t have a tool that works, when you’re talking about Twitter, I’m sorry, X now, and understanding if material is actually real or fake, that becomes a societal problem too, and that’s what they’re trying to work on with watermarking. And I believe the big tech companies have agreed to watermark audio outputs, video outputs, and image outputs, but they’ve not agreed to do text outputs, because text is a little bit too fungible: you can go in and copy it, you can change it around a little bit too much. So it’s definitely going to be a problem when state governments start to look at this, and they start wondering whether the police officer taking your police report is writing it in their own words, or the tax official is using this as well. So it’s going to be a problem well outside of education.

Rebecca: And if we’re not really preparing our students for that world, in which they will likely be using AI in their professional fields, then we’re not doing our jobs in education and preparing our society for the future.

Marc: Yeah, I think training is the best way to go forward too and again, going back to the idea of intentional engagement with the technology and giving the students these situations where they can use it and where you, hopefully if you’re a faculty member, you actually have the knowledge and the actual resources to begin to integrate these tools and talk about the ethical use case, understanding what the limitations are and the fact that it is going to hallucinate and make things up, and to think about what sort of parameters you want to put on your own usage too.

John: One of the things that came out within the last week or so, I believe,… we’re recording this in late September… was the introduction of AI tools into Blackboard Ultra. Could you talk a little bit about that?

Marc: Oh boy, yes indeed, they announced last week that the tools were available to us in Blackboard Ultra. They turned it on for us here at the University of Mississippi, and I’ve been playing around with it, and it is a little bit problematic, because for right now, what you can do is, with a single click, it will scan the existing materials in your Ultra course and it will create learning modules. It will create quiz questions based off that material, it will create rubrics, and it will also generate images. Now, compared to what we’ve been dealing with in ChatGPT and all these other capabilities, this is almost a little milquetoast by comparison. But it’s also an inflection event for us in education, because it’s now here, it’s directly in our learning management system, and it’s going to be something we’re going to have to contend with every single time we open it up to create an assignment or to do an assessment. And I’ve played around with it. It’s an older version of GPT. The image version, I think, is based on Dall-E, so you ask for a picture of college students and you get some people with 14 fingers and weird artifacts all over their faces, which may not be what would actually be helpful for your students. And the learning modules there are not my thinking, necessarily; it’s just what the algorithm is predicting based off the content that exists in my course. We have that discussion with our faculty, we have them cross that Rubicon and say, “Okay, I’m worried about my students using this; what happens to me and my teaching, my labor, if I start adopting these tools? There could be some help, definitely; this could really streamline the process of course creation and actually make it align with the learning outcomes my department wants for this particular class.” But it also gets us into a situation where automation is now part of our teaching. And we really haven’t thought about that. We haven’t really gotten to that sort of conversation yet.

Rebecca: It certainly raises many ethical questions, obviously, really about disclosing to students what has been produced by us as instructors and what has been produced by AI, and about the authorship of what’s there. Especially if we’re expecting students to [LAUGHTER] do the same thing.

Marc: It is mind-boggling, the cognitive dissonance of having a policy saying “No AI in my class,” and then all of a sudden it’s there in my Blackboard course, and I could click on something. And, at least in this integration of Blackboard, and they may very well change this, once you do this, there’s no way to natively indicate that it was generated by AI. You have to manually go in there and say this was AI-created. And I value my relationship with my students; it’s based off of mutual trust. I think almost everyone in education does. If we want our students to act ethically and use this technology openly, we should expect ourselves to do the same. And if we get into a situation where I’m generating content for my students and then telling [LAUGHTER] them that they can’t do the same with their own essays, it is just going to be kind of a big mess.

John: So given the existence of AI tools, what should we do in terms of assessing student learning? How can we assess the work reasonably given the tools that are available to them?

Rebecca: Do you mean we can just use that auto-generated rubric right, that we just learned about? [LAUGHTER]

Marc: You could; you can use the auto-generated rubric separately from Blackboard. One of the tools I’m piloting right now is a feedback assistant; it was developed by Eric Kean and Anna Mills. I consulted with them on this, too. She’s very big in the AI space for composition. It’s called MyEssayFeedback. And I’ve been piloting this with my students. They know it’s an AI; they understand this. I did get IRB approval to do so. But I’ve just gotten the second round of generated feedback, and it’s thorough, it’s quick, it’s to the point. And it’s literally making me say, “How am I going to compete with that?” And maybe the answer is that I shouldn’t be competing with that, maybe I’m not going to be providing that feedback. But then maybe I should be providing my time in different ways. Maybe I should be meeting with them one on one to talk about their experiences. But I think you raise an interesting question. I don’t want to be alarmist; I want to be as level-headed as I can. But from my perspective, all the pieces are now there to automate learning to some degree. They haven’t all been hooked up yet and put together into a cohesive package. But they’re all there in different areas. And we need to be paying attention to this. Our hackles need to be raised just slightly at this point to see what this can do. Because I think that is where we are headed with integrating these tools into our daily practice.

Rebecca: AI generally has raised questions about intellectual property rights. And if our learning management systems are using our content in ways that we aren’t expecting, how is that violating our rights, or the rights that the institution has over the content that’s already there?

Marc: For a lot of the people that I speak with, their course content and their syllabi are, from their perspective, their own intellectual property in some ways. We get debates about that, about whether the university actually owns some of the material. But we have had instances where lectures were copyrighted before in the past. And if you’re allowing the system to scan your lecture, you are exposing that to Generative AI. And that gets at one aspect of this. The other aspect, which I think Rebecca is referring to, is the issue that the training material for these large language models may itself have been stolen or not properly sourced from the internet, and you’re using it while you’re trying to teach your students [LAUGHTER] to cite material correctly, so it’s just a gigantic conundrum of legal and ethical challenges. The one silver lining in all this, and this has been across the board with everyone in my department: this has been wonderful material to talk about with your students. They are actually actively engaged with it, they want to know about this, they want to talk about it. They are shocked and surprised about all the depths that have gone into the training of these models, and the different ethical situations with data and all of it too. And so if you want to just engage your students by talking to them about AI, that’s a great first step in developing their AI literacy. And it doesn’t matter what you’re teaching; it could be a history course, it could be a course in biology, this tool will have an impact in some way, shape, or form in your students’ lives, and they want to talk about it. I think maybe something else to talk about is that there are a lot of tools outside of ChatGPT, and a lot of different interfaces as well. I don’t know if I talked about this before in the spring, but the one tool that’s really been effective for a lot of students is the reading assistant tools; one that we’ve been employing is called ExplainPaper.
They upload a PDF to it, it calls upon generative AI to scan the paper, and you can actually select whatever reading level you want and have it translate the paper to that level. The one problem is that students don’t realize that they might be giving up some close reading, critical reading skills to it as well, just like we do with any sort of relationship with generative AI. There is that handoff and offloading of thinking, but for the most part, they have loved it, and it’s helped them engage with some really critical art texts that normally would not be at their reading level, that I would usually not assign to certain students. So those are helpful. There are plenty of new tools coming out too. One of them is Claude 2, to be precise, by Anthropic. That just came out, I think, in July for public release. It is as powerful as GPT-4, and it is free right now if you want to sign up for it. The reason why I mention Claude is that the context window, what you can actually upload to it, is so much bigger than ChatGPT’s. I believe their context window is 75,000 words. So you can actually upload four or five documents at a time and synthesize those documents. One of the things I was using it for as a use case: I collected tons of reflections from my students this past year about the use of AI. It’s all in a messy Word document, 51 pages single spaced. It’s all anonymized, so there’s no data that identifies them. But it’s so much of a time suck to go through and code those reflections. And I’ve just been uploading it to Claude and having it use sentiment analysis to point out which reflections are positive from these students, and in what way, and it does it within a few seconds. It’s amazing.

John: One other nice thing about Claude is that it has a training database that ends in early 2023. So it has much more current information, which actually, in some ways, is a little concerning for those faculty who were trying to ask more recent questions, particularly in online asynchronous courses, so that ChatGPT could not address them. But with Claude’s expanded training database, that’s no longer quite the case.

Marc: That’s absolutely correct. And to add to this earlier discussion about AI detection: none of the AI detectors that I’m aware of had time to actually train on Claude, so if you generated an essay… and you guys are free to do this on your own, your listeners are too… if you generated an essay with Claude and you tried to upload it to one of the AI detectors, very likely you’re going to get zero detection or a very low detection rate, because it’s, again, a different system. It’s new; the existing AI detectors have not had time. So the way to translate this is: don’t tell your students about it right now, or in this case, be very careful about how you introduce this technology to your students, which we should do anyway. But this is one of those tools that is massively popular; a lot of people just haven’t known about it because, again, ChatGPT just takes up all the oxygen in the room when we talk about Generative AI.

John: What are some activities where we can have students productively use AI to assist their learning or as part of their educational process?

Marc: That’s a great question. We actually started developing very specific activities for them to look at different pain points for writing classes. One of them was getting them to actually integrate the technology that way. So we built a very careful assignment, which called on very specific moves for them to make, both in terms of their writing and their integration of the technology. We also looked at bringing in research questions, building assignments that way. We have assignments for my Digital Media Studies students right now about how they can use it to create infographics. Using the paid version of ChatGPT Plus, they can have access to plugins, and those plugins then give them access to Canva and Wikipedia. So they can actually use Canva to create full-on presentations based off of their own natural language, and use actual real sources, by using those two plugins in conjunction with each other. I just make them then go through it, edit it with their own words, their own language, and reflect on what this has done to their process. So lots of different examples too; I mean, it really is limited only by your imagination at this time, which is exciting, but it’s also kind of the problem that we’re dealing with: there’s so much to think about.

Rebecca: From your experience in training faculty, what are some getting started moves that faculty can take to get familiar enough to take this step of integrating AI by the spring?

Marc: Well, I think one thing that they could do is take one of a few really fast courses. I think it’s Ethan Mollick from the Wharton School of Business who put out a very effective training course that was all through YouTube. I think it’s like four or five videos, very simple to take, to get used to understanding how ChatGPT works, how Microsoft’s Bing works as well, and what sort of activities students can use it for, and what sort of activities faculty could. Microsoft has also put out a very fast course, I think it takes 53 minutes to complete, about using generative AI technologies in education. And those are all very fast ways of basically coming up to speed with the actual technology.

John: And Coursera has a MOOC through Vanderbilt University, on Prompt Engineering for ChatGPT, which can also help familiarize faculty with the capabilities of at least ChatGPT. We’ll include links to these in the show notes.

Marc: I really, really hope Microsoft, Google, and the rest of them calm down, because this has gotten a little bit out of control. And the integration of these tools is often without use cases; they’re often waiting to see how we’re going to come up with ways to use them. And that is concerning. Google has announced that they are committed to releasing their own model that’s going to be in competition with GPT-4, I think it’s called Gemini, by late November. So it looks like they’re just going to keep on heating up this arms race, and you get bigger, more capable models, and I think we do need to ask ourselves more broadly what our capacity is just to keep up with this. My capacity is about negative zero at this point… going down further.

John: Yeah, we’re seeing new AI tools coming out almost every week or so now in one form or another. And it is getting difficult to keep up with. I believe Apple is also planning to release an AI product.

Marc: They are. They also have a car they’re planning to release, which is the weirdest thing in the world to me: you could have your iPhone charging in your Apple Car.

John: GM has announced that they are not going to be supporting either Android Auto or Apple CarPlay in their electric vehicles. So perhaps this is Apple’s way of getting back at them for that. And we always end with the question, what [LAUGHTER] is next, which is perhaps a little redundant, but we do always end with that.

Marc: Yeah, I think what’s next is trying to critically engage the technology and explore it not out of fear, but out of a sense of wonder. I hope we can continue to do that. I do think we are seeing a lot of people starting to dig in. And they’re digging in real deep. So I’m trying to be as empathetic as I can be for those that don’t want to deal with the technology. But it is here and you are going to have to sit down and spend some time with it for sure.

John: One thing I’ve noticed in working with faculty is that they’re very concerned about the impact of AI tools on their students and student work, but they’re really excited about all the possibilities it opens up for them in terms of simplifying their workflows. So that, I think, is a positive sign.

Rebecca: They could channel that to help understand how to work with students.

Marc: I hope they find out that there’s a positive pathway forward with that too.

John: Well, thank you. It’s great talking to you and you’ve given us lots more to think about.

Marc: Thank you guys so much.

[MUSIC]

John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.

[MUSIC]

310. Community Effects of Incarceration

Some students receive substantial support on their educational journey within their homes, communities, and schools; others face substantial barriers. In this episode, Arpit Gupta joins us to discuss his recent study that examines the effect of community incarceration rates on the academic performance of children in affected households and on their classmates.

Arpit is an Associate Professor of Finance at the Leonard N. Stern School of Business at NYU. Arpit has published extensively in highly ranked finance, economics, science, law, and management journals on topics ranging from housing markets, infrastructure investment, bail, local journalism, racial housing gaps, incarceration, and remote work.

Show Notes

  • Gupta, A., Hansman, C., & Riehl, E. (2022). Community impacts of mass incarceration. Working paper, May 3.
  • Norris, S., Pecenco, M., & Weaver, J. (2021). The effects of parental and sibling incarceration: Evidence from Ohio. American Economic Review, 111(9), 2926-2963.
  • Lazear, E. P. (2001). Educational production. The Quarterly Journal of Economics 116(3), 777–803.
  • Chetty, R. (2016). Improving opportunities for economic mobility: New evidence and policy lessons. In A. Brown, D. Buchholz, D. Davis, & A. Gonzalez (Eds.), Economic Mobility: Research and Ideas on Strengthening Families, Communities and the Economy (pp. 35-42).
  • Chetty, R. (2021). Improving equality of opportunity: New insights from big data. Contemporary Economic Policy, 39(1), 7-41.

Transcript

John: Some students receive substantial support on their educational journey within their homes, communities, and schools; others face substantial barriers. In this episode, we discuss a recent study that examines the effect of community incarceration rates on the academic performance of children in affected households and on their classmates.

[MUSIC]

John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.

[MUSIC]

Rebecca: Our guest today is Arpit Gupta. Arpit is an Associate Professor of Finance at the Leonard N. Stern School of Business at NYU. Arpit has published extensively in highly ranked finance, economics, science, law, and management journals on topics ranging from housing markets, infrastructure investment, bail, local journalism, racial housing gaps, incarceration, and remote work. Welcome, Arpit.

Arpit: Thanks so much for having me.

John: It’s great to see you again. It’s been a while since we last talked… 20 years or so.

Arpit: Yeah, it’s been a while. So I owe my economics career to John having him teach me at a very formative time in my life. Very happy to be back here.

John: Back at the TIP program, way back. And you would have probably done that anyway, because you had a lot of interest in it even back then. Today’s teas are: …are you drinking any tea, Arpit?

Arpit: …just drinking water at the moment.

Rebecca: It is the foundation of tea.

John: It’s one of our more popular teas.

Rebecca: I have an Awake tea today.

John: I have a Darjeeling tea today.

Rebecca: So we’ve invited you here today to discuss your May 2022 working paper on community impacts of mass incarceration, co-authored with Christopher Hansman and Evan Riehl. Could you tell us about the origin of this study?

Arpit: Yeah, so Chris, Evan, and I were all graduate students at Columbia University. Chris and I were also roommates. And we had a third roommate who was a public defender. So we would just come home and hear interesting stories of his experience at work and things he was seeing. One of the things that he brought home and talked to us about was the fact that bail was an interesting process, and there was an interesting random assignment across bail judges. And so that was our first project; it kind of stemmed directly from talking to this roommate and his collaborator on that project. And another thing that he mentioned is that, the way he saw it, incarceration spells really had rippling effects, not just directly on the individuals concerned, but they affected broader communities in different ways. And we felt that that was a really interesting insight that has been explored in some other non-economics research. And we wanted to explore this concept further, because we felt it was an important, essential public policy question. And so we spent many years trying to get the right data and setting to explore further these broader community impacts of incarceration.

John: So earlier studies had found that incarceration of a parent had significant effects on education for children within the household. Could you just talk a little bit about those effects before we talk about your contribution to this literature?

Arpit: Yeah, absolutely. So there is a pretty broad literature on this topic. And I would sort of separate the papers that are not in economics from the papers that are in economics. There are a number of great studies that, for example, will track cohorts of people across generations to see what the rippling long-term implications of incarceration are. There are a variety of these papers that explore what I would describe as the multi-dimensional aspects of incarceration on different outcomes for the individuals and families that are concerned. And I would characterize this non-economic literature as really highlighting the disproportionate, spatially concentrated nature of incarceration. And that’s kind of the key insight of this broader sociological literature: that you think of incarceration as something that affects a lot of people in very concentrated and bad ways. The economics literature has taken a little bit of a different approach and has primarily focused on the direct impacts of incarceration, with some literature starting to look at how that also affects household members. A lot of that literature has been in Scandinavian countries, where they have a different justice system and really good data. Some of those papers have actually found positive effects of parental incarceration on children’s outcomes, which might make sense if you’re removing, for example, a negative role model from a child’s life, or if the criminal justice system itself offers positive remediation, restorative justice, and so forth, that improves someone’s outcomes after they’ve returned from prison. The closest paper to our study in the United States is going to be a paper by Norris, Pecenco, and Weaver, which focuses on the effects of incarceration for students in Ohio.
And there, they argue that incarceration of a parent reduces the odds that the child is going to be involved in the criminal justice system in the future, so that they are less likely to be arrested in the future. And they find more mixed evidence on the education impact. They don’t find much evidence for negative education impacts, but that’s done on a somewhat smaller sample with larger standard errors.

John: Your study, though, goes a little bit further, because you’re looking not just at the effect on children within the household, but also at spillover effects into their classrooms and schools from the incarceration of adults in the household. How did you separate out the effect of differences in incarceration rates from all the other factors that might influence such outcomes in those communities?

Arpit: Absolutely. So this is going to be, of course, a key distinction between how economists think about the problem versus other disciplines. We’re thinking about the question of identification: How can we identify whether the negative or positive impacts you’re looking at can be attributed to incarceration, or are just reflective of other background trends? Let me start first with how we actually think about these effects in aggregate, because that gets at the community dimension of the problem, which is kind of our central focus. So the big question that we’re really interested in is: what happens to a community when a lot of people within that community are behind bars? How does the impact on that set of individuals spill over and impact the overall community? And of course, this is an even harder identification problem than just looking at the effects on one person, because you wonder what omitted background factors could affect entire communities. But we find that when a county has a relatively more strict set of judges, that actually has a large impact on the overall performance of all the students in the area. So that suggests that there are large impacts of incarceration that broadly affect all the students in a particular area. And that motivates us to think about what the size of the effect of incarceration on children’s outcomes is, and what the mechanisms are by which they’re affected. But we then dig more deeply into thinking about the effects on the directly affected children, those whose parents are themselves incarcerated. There, we similarly use judicial variation, and we also look at the spillovers onto other children in the classroom. So the key innovation, the key contribution, I think, of our analysis is to take this question that has been studied before, but adapt it to the problem of thinking about the more aggregate consequences and the mechanisms by which incarceration affects broader communities.

John: And you also use an event study approach too, to provide more support. Could you talk a little bit about that part of the analysis?

Arpit: So we use those in both our direct and indirect analyses, where we were trying to understand what the impact is on a student if their family member is incarcerated. And the event study approach basically looks before and after that arrest at the outcomes for the children, as measured by outcomes such as test scores, suspension rates, misbehavior rates, and so forth. So we’re interested in a somewhat multi-dimensional set of outcomes for children, because we want to know how this child is doing, and we want to know whether there are behavioral disruptions that may stem from having a background incarceration at home that may then affect other children, because if you’re misbehaving in the classroom, that’s something that will potentially negatively affect other children’s learning in the classroom. The event study is looking within the child before and after that arrest period. And we also do that same event study analysis at the classroom level, basically: looking at what happens to the performance of other students in the classroom when one of the students’ family members is arrested.

Rebecca: How big was the impact of incarceration on children in the affected households and in the classrooms?

Arpit: So for one individual child, the effect on math and English scores is something like 5% of a standard deviation. So it’s an effect that is sizable enough, if you think about many educational interventions as having very heterogeneous effects, and it’s often very hard to get meaningful moves in child performance. But the really big part of the analysis, I think, was trying to reconcile those direct effects, the ones that are one to one and a half percent of a standard deviation, against the overall impact of incarceration on the whole community. So what happens if I take a whole county and I change the mix of judges and I have much more incarceration? What is the overall educational impact there? So when we looked at that overall community-level perspective, we actually found that a one standard deviation change in the county-level stringency is actually affecting test scores by between one and a half and three and a half percent of a standard deviation. So we’re basically getting very big aggregate effects that the individual effects alone can’t explain. And so we think that there’s scope for these spillover effects, by which incarceration directly affects how a child behaves in the classroom, which then spills over to the other children in the classroom and thereby amplifies the effect, so as to generate larger negative overall effects. And one channel that we use to identify those is to look within the classroom itself. We’re not going to measure all the potential spillovers between children, but it’s one area where we think there are spillovers, and we think that those spillovers can also account for some fraction of the overall community effect.

Rebecca: Can you translate some of that standard deviation talk [LAUGHTER] for people who don’t know anything about statistics?

Arpit: For example, at the county level, when we are thinking about a one standard deviation increase in stringency, we’re thinking about a 15 to 20% increase in incarceration. So that’s kind of the range of variation that we’re looking at at the county level when thinking about what the typical shock to incarceration is, and that’s a pretty substantial increase in the incarceration levels we’re seeing as a consequence.

John: So you’re finding the effect on any one other student is relatively small, but the aggregate effect on all the students in the class is relatively large. Is that correct?

Arpit: That’s right. So when we look at those other students in the classroom, we’re getting effects for those students, in response to the incarceration of a peer’s family member, that are on the order of 0.3 to 0.4 percent of a standard deviation. You should basically think of that as a really small number. And the only way we’re getting the power to analyze this is that we’re looking at this North Carolina data, which is really great; a lot of people have worked with it exactly because it is so comprehensive. So we’ve got all the student rolls, we’ve got all the arrest records, and all of these are matched together. And so using this really holistic sample allows us to try to quantify these effects, which are pretty small for any one individual child, but there are just a lot of exposures that can aggregate up. And so we think that this classroom disruption channel can explain something like 15% of that relationship between aggregate incarceration and test scores. So it all adds up to explain a more meaningful fraction of this overall relationship between what happens when a lot of people in an area go to jail and what happens to student performance in that area.

John: What sort of mechanism are you hypothesizing might be the cause of the spillover effects to other students in the classrooms?

Arpit: So let’s start with what we can measure in our data. What we observe is that children who are affected, whose family members are incarcerated, are seeing increases in suspension days, they’re absent more often, and they’re involved in more fighting incidents; typically it takes two people to fight. So that sort of tells us that there are other people in the classroom involved with these affected students. And so we think that this relates very closely to the idea that there are classroom-level externalities, and there is a large literature, actually, papers by Lazear and others, that highlights the importance and implications of classroom-level externalities and classroom disruptions when it comes to learning. It also comes up, by the way, when I talk to people in North Carolina who are teachers. One thing that they really bring up is that children come into the classroom with all sorts of backgrounds that change behavior in the classroom, and that impairs the learning experience for other students in the classroom. So that’s what we can measure most cleanly: the existence of these behavioral disruptions by students affecting how they behave in the classroom and influencing, through that channel, the learning experience of other children. That doesn’t need to be the only mechanism that’s going on here; there can be other spillover channels between children that we can’t observe in our data. There can also be other channels outside of peer interactions between children, through other community interactions between people as well, that we also can’t measure in our data. So we think of this project as really trying to open up a set of analyses, considering and thinking about the broader web of social interactions when incarceration happens.

Rebecca: What are some of the public policy implications of the study?

Arpit: So the challenge, of course, is that you’re measuring one side of the equation. We’re measuring sort of the cost of incarceration, and so you have to balance that against some of the possible benefits of incarceration, because children are also affected by crime in the local community as well. And so it’s a difficult trade-off to try to balance both the costs and benefits of incarceration in tandem. So I don’t think our results actually have a clear takeaway. I think the biggest thing that I personally took away from the analysis is that if we have different techniques, if we have different ways of trying to reduce and address crime, it would be ideal if we were able to lean on ways that rely less on the incarceration channel, which imposes these additional externalities and costs and burdens on local communities, and instead found other ways of trying to address and mitigate and reduce crime. So for example, in a different setting, when it comes to thinking about bail, which is a topic we’ve also researched before, there is sometimes a choice between arresting the individual and putting them in jail, compared to something like house arrest, compared to something like electronic or digital monitoring. These systems are also not perfect. There are also a lot of costs and tradeoffs there. But to the extent that you can find ways of deterring and mitigating crime that don’t rely as much on the incarceration channel, I think that lowers the negative spillover effects on local communities. I want to mention that, when we look at these multi-dimensional impacts of the original incarceration event on the student, we actually find, consistent with prior literature, that to the extent that we can observe juvenile offenses, we don’t observe increases in crime; if anything, there are decreases in criminal activity. That, again, is consistent with some of the prior literature.
And the way to interpret that, I think, is to again think of there as being multiple dimensions by which people are affected. So you can observe that there’s a negative role model effect: you observe someone going to jail for a crime… well, I’m not going to commit that crime, but you may still act out in the classroom. So we shouldn’t think of the responses to these kinds of disruptive background events as happening on some uniform spectrum of good behavior or bad behavior; it’s much more multi-dimensional in how people respond to stressful situations.

John: Did you find a difference in the effect depending on whether it was a male or a female in the household who was incarcerated, in terms of the impact on children?

Arpit: Everything I’ve said so far, I’ve been trying to be careful in sort of saying, these are individuals in the household, because really, what we’re doing is a household-level match. So we’ve got the address, and so what we really know is that this is someone who lives at this address who is arrested. We view that mostly as a strength of our approach, which is trying to identify household members. It sort of recognizes the intergenerational and complicated family backgrounds many families have, but it does make it a little more challenging to establish the sort of true relationship between individuals. And so one thing that we did there is try to identify probable female parents or guardians, male parents or guardians, or simply assign age ranges and things like that. We did find the effects on children were much larger when we were looking at the incarceration of a female parent. So that makes a lot of sense, if you think that mothers and female guardians play uniquely important roles within the household. And when it comes to the child themselves, the effects were actually pretty similar between boys and girls.

John: In the US, we have one of the worst rates of intergenerational income mobility. Might this type of issue be one of the causes of that, in that in low-income communities, where incarceration rates tend to be higher, it’s putting children in those communities at further disadvantage, which can have some long-term consequences?

Arpit: One thing I want to mention is where we’re taking the paper, which is to adopt the community frame and think about other community outcomes that might potentially change as a result of incarceration. So I do think that this is probably one of the reasons that we have not just a low average rate of social mobility in the United States, but also a very regionally varying rate of social mobility across the United States. I remember when the first Chetty map was released that showed the geography of economic mobility in the United States. My home state, North Carolina, is actually incredibly low for social mobility. And that’s surprising, actually, because North Carolina is where everyone’s moving to. It is incredibly economically dynamic, it has lots of job centers, and it has low-cost housing. It has a lot of features which you might expect should be associated with high economic mobility. And in fact, like much of the South, with these very regionally varying patterns across the United States, you actually observe pretty low social mobility. And I do wonder whether one reason for that is that we have these very high rates of incarceration across much of the United States. And it’s not an easy thing to just stop incarceration, because we all know that the system of criminal justice is also there to protect low-income communities from the negative consequences of crime. So the public policy challenges of how to figure out what to do about this are really complicated. But we want to know why it is that people who grew up in the same state that I did don’t necessarily have great opportunities compared to people who grew up elsewhere. So we’re hoping to use this setting, use this analysis, to dig a little bit deeper into this question.
And one fact that is kind of already out there that I think is very related is that analysis by Chetty and others, which looked at the geography of social mobility, found that a big correlate, something that associates strongly with social mobility across the United States, is the presence of two-parent households. So the number of absent fathers associates very strongly with the lack of social mobility in an area. Of course, that is not a causal statement; you could imagine things go the other way, so that lack of social mobility has its own impacts in different ways. But I think that’s a diagnostic that is suggestive of the idea that something about incarceration affects broader communities, affects family formation, affects family stability in ways that impact people’s ability to build stable relationships. And all of that has really persistent negative impacts.

Rebecca: As an educator, this study makes me think about how, if I’m a teacher in a classroom, I’m kind of experiencing the phenomenon that you’re studying, and about the kinds of things that I might consider doing for classroom management, or the ways that I might better understand what’s happening or what I’m observing. I think it’s food for thought for educators to just be more aware of what’s happening in their communities.

Arpit: The other question, I think, for economic policy is about these measures of teacher value-add, which are being thought of as ways of assessing or even compensating teachers for the increase in test scores that they produce in the classrooms that they have, right? And this makes sense to economists: we want to value and grade people based on the incremental value that they’ve added to the population coming in. But one thing I actually hear a lot from teachers is that they’re very worried about this possibility as something that affects them as teachers, because they’re saying, “Well, it’s not my problem if I happen to have a classroom in a particular year where the children are going through a lot of stuff at home. They’re not necessarily going to learn as much, and that might affect other children in the classroom as well. And that’s something that I will potentially be judged for, something outside of my control.” And that is a serious problem for this whole teacher value-add methodology, because these kinds of background events don’t necessarily follow a predictable sequence. And so they can happen at various times over students’ lives, over a teacher’s career, across different classrooms. And so it’s very hard, statistically, to separate out whether a student is doing well or badly because of the teacher, or because of some background events. It also impacts, I think, how we statistically evaluate and think about evaluating teachers.

Rebecca: I imagine it also impacts classroom management and observations of classroom management and other tools that we use to evaluate teachers currently… behavior in that class is different than others, or they have different traumatic experiences impacting their behavior. That’s not necessarily being observed by an observer.

John: And we have probably put far too much weight on teacher and school compensation and budgets tied to student performance, because, as you said, there’s so much that’s outside the control of the teachers or the school districts.

Rebecca: Those are also the schools that tend to struggle to get teachers and things too, right?

John: And we’re often penalizing those teachers and those school districts that face the most severe challenges and need the most support. You mentioned this dataset from North Carolina is a very rich one, but you had to do a bit of work to get all that data together, because there is a lot of data on student outcomes, but you also have to tie this to incarceration. Could you talk a little bit about how you matched the household data, or the incarceration data, to the schooling data?

Arpit: Oh, man, this is my favorite part of the project, because it allows me to reminisce about my sort of white-whale moment of a project. So I think all of us as researchers need to think about which are the projects that we really want to see live, which are the ones that we’re really going to go to bat for, and this is one of those projects for me. I just felt that this question needed to be answered. And so, together with my collaborators, we really just spent a really long time trying to figure out how to get the right data for this. So you have to put together the criminal justice records for a given area, you need to put together the education records, and then you need to figure out how to link the two of them. In some states you can get one, in some states you can get the other, and it’s very hard to find a set of states where the two of them match. So we tried a whole range of states, a whole range of datasets; many times we got very close, but were stopped at the last minute. And finally, we were able to work with the state of North Carolina, which has an excellent set of education records and these great criminal justice records, and we were able to figure out a way of merging and matching the two sets of documents at the household level, have a pretty good sense of the direct linkages between the children in our sample and the criminal defendants, and then use the classroom identifiers in the dataset to identify other spillover effects, looking at the broader geographic implications. So all of that wound up working out for us in the end, but it was a long haul to get there. And I think it’s definitely a lesson that I took away from this project that if you want something done well, you really have to work at it. There’s no substitute for putting in the shoe leather: calling people, cold calling people, emailing people, and just hearing no, no, no, no, no again and again and again until you’re able to figure out something that works.

John: And the matching between households for the students and the incarcerated people was based on household addresses. Is that correct?

Arpit: That’s right. So that match was done by the North Carolina education folks. They took their records, they imported the criminal justice records, matched that at the household level, and then gave us a data set that had removed all identifiers that we could work with for research.

John: It’s a wonderful data set and it’s a really impressive piece of work.

Arpit: Thank you very much.

Rebecca: So we always wrap up by asking what’s next?

Arpit: For us on this project, we’re really trying to see if we can think about some of these broader implications of incarceration on communities outside of the educational impacts that we’ve been talking about so far: thinking about the impacts on family structure, thinking about whether it spills over into the usage of other government programs, whether it has employment effects, housing market access. I think that there is a whole range of different outcomes, particularly at these broader community levels, that are shaped by the number of people in that local community that are impacted by incarceration. So I think those are the overall community spillovers we’re interested in understanding.

John: Well, thank you. This is some really impressive work. And I have to say I’m really impressed by all the work that you’ve been doing in so many areas. You’re doing some wonderful work on some really important topics.

Arpit: Thank you very much, John. I had an economics teacher growing up who inspired me to work on these topics.

Rebecca: Well, thank you so much. We’re looking forward to sharing this with our audience.

Arpit: Thanks.

[MUSIC]

John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.

[MUSIC]

309. Preparing Students for an AI Future

New technology is often seen as a threat to learning when first introduced in an educational setting. In this episode, Michelle Miller joins us to examine the question of when to stick with tools and methods that are familiar and when to investigate the possibilities of the future.

Michelle is a Professor of Psychological Sciences and President’s Distinguished Teaching Fellow at Northern Arizona University.  She is the author of Minds Online: Teaching Effectively with Technology and Remembering and Forgetting in the Age of Technology: Teaching, Learning, and the Science of Memory in a Wired World. Michelle is also a frequent contributor of articles on teaching and learning in higher education to publications such as The Chronicle of Higher Education.

Show Notes

Transcript

John: New technology is often seen as a threat to learning when first introduced in an educational setting. In this episode, we examine the question of when to stick with tools and methods that are familiar and when to investigate the possibilities of the future.

[MUSIC]

John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.

[MUSIC]

John: Our guest today is Michelle Miller. Michelle is a Professor of Psychological Sciences and President’s Distinguished Teaching Fellow at Northern Arizona University. She is the author of Minds Online: Teaching Effectively with Technology and Remembering and Forgetting in the Age of Technology: Teaching, Learning, and the Science of Memory in a Wired World. Michelle is also a frequent contributor of articles on teaching and learning in higher education to publications such as The Chronicle of Higher Education. Welcome back, Michelle.

Michelle: Hey, it’s great to be here.

Rebecca: Today’s teas are:… Michelle, are you drinking tea?

Michelle: I’m actually still sticking with water. So it’s a healthy start so far for the day.

Rebecca: Sounds like a good plan.

John: I have ginger peach black tea today.

Rebecca: And I’ve got some Awake tea. We’re all starting the day [LAUGHTER].

John: So we’ve invited you here to discuss your August 17th Chronicle article on adapting to ChatGPT. You began that article by talking about your experience teaching a research methods course for the first time. Could you share that story? Because I think it’s a nice entree into this.

Michelle: Oh, thank you. I’m glad you agree. You never know when you’re sharing these kinds of personal experiences. But I will say this was triggered by my initial dawning awareness of the recent advances in AI tools, which we’re all talking about now. So initially, like probably a lot of people, I thought, well okay, it’s the latest thing and I don’t know how kind of attentive or concerned I should be about this. And as somebody who does write a lot about technology and education, I have a pretty high bar set for saying, “Oh wow, we actually kind of need to drop everything and look at this,” I’ve heard a lot of like, “Oh, this will change everything.” I know we all have. But as I started to get familiar with it, I thought “Oh my goodness, this really is a change” and it brought back that experience, which was from my very first assignment teaching the Research Methods in Psychology course at a, well, I’ll just say it was a small liberal arts institution, not my graduate institution. So I’m at this new place with this new group of students, very high expectations, and the research methods course… I think all disciplines have a course kind of like this, where we kind of go from, “Oh, we’re consuming and discussing research or scholarship in this area” to “Okay, how are we going to produce this and getting those skills.” So it is challenging, and one of the big challenges was and still is, in different forms, the statistical analysis. So you can’t really design a study and carry it out in psychological sciences without a working knowledge of what numbers are we going to be collecting, what kind of data (and it usually is quantitative data), and what’s our plan? What are we going to do with it once we have it, and getting all that statistical output for the first time and interpreting it, that is a big deal for psychology majors, it always is. So students are coming, probably pretty anxious, to this new class with a teacher they haven’t met before. 
This is my first time out as the instructor of record. And I prepared and prepared and prepared as we do. And one of the things that I worked on was, at the time, our methodology for analyzing quantitative data. We would use a statistics package and you had to feed it command line style input, it was basically like writing small programs to then hand over to the package. And you would have to define the data, you’d have to say, “Okay, here’s what’s in every column and every field of this file,” and there was a lot to it. And I was excited. Here’s all this knowledge I’m going to share with you. I had to work for years to figure out all my tricks of the trade for how to make these programs actually run. And so I’ve got my stack of overheads. I come in, and I have one of those flashbulb memories. I walked into the lab where we were going to be running the analysis portion, and I look over the students’ shoulders, and many of them have opened up and are starting to mess around with and play around with the newest version of this statistics package. And instead of these [LAUGHTER] screens with some commands, what am I looking at? I’m looking at spreadsheets [LAUGHTER]. So the data is going into these predefined boxes. There’s this big, pretty colorful interface with drop down menus… All the commands that I had to memorize [LAUGHTER], you can point and click, and I’m just looking at this and going, “Oh no, what do I do?” And part of my idea for this article was kind of going back and taking apart what that was like and where those reactions were coming from. And as I kind of put in a very condensed form in the article, I think it really was one part just purely sort of anxiety and maybe a little bit of loss and saying, “But I was going to share with you how to do these skills…” partly that “Oh no, what do I do now?” I’m a new instructor. 
I have to draft all this stuff, and then partly, yeah, curiosity and saying, “Well, wait a minute, is this really going to do the same thing as how I was generating these commands and I know you’re still going to need that critical thinking and the top level knowledge of “Okay, which menu item do you want?” Is this going to be more trouble than it’s worth? Are students going to be running all the wrong analyses because it’s just so easy to do, and it’s going to go away.” So all of that complex mix is, of course, not identical to, but I think pretty similar to how I felt… maybe how a lot of folks are feeling… about what is the role of this going to be in my teaching and in my field, and in scholarship in general going forward?

Rebecca: So in your article, you talk a lot about experimenting with AI tools to get started in thinking about how AI is related to your discipline. And we certainly have had lots of conversations with faculty about just getting in there and trying it out, just to see how tools like ChatGPT work, to become more familiar with how they might be integrated into their workflow. Can you share a little bit about what you’d recommend for faculty, or how you were thinking about [LAUGHTER] jumping in and experimenting and just getting started in this space?

Michelle: Well, I think perhaps, it also can start with a little bit of that reflection, and I think probably your listenership has a lot of very reflective faculty and instructors here. And I think that’s the great first step of “Alright now, if I’m feeling worried, or I’m feeling a very negative reaction, where’s that coming from and why?” But then, of course, when you get it and actually start using it, the way that I had to get in and start using my statistics package in a brand new way, then you do start to see, “Okay, well, what’s great, what’s concerning and not great, and what am I going to do with this in the future?” So experiment with the AI tools, and do so from a really specific perspective. When I started experimenting at first, I think I thrashed around and kind of wasted some time and energy initially, looking at some things that were not really education focused. So something that’s aimed at people who are, say, social media managers, and how this will affect their lives, is very different than me as a faculty member. So make sure you kind of narrow it down, and you’re a little planful about what you look at, what resources you’re going to tap into, and so on. And so that’s a good starting point. Now, here’s what I also noticed about my initial learning curve with this. So I decided to go with ChatGPT, myself, as the tool I wanted to get the most in depth with. So I did that, and I noticed, of course, like with any sort of transfer of learning situation, and so many of those things we do with our students, that I was falling back into a kind of old pattern. So my first impulse was really funny: it was just to ask it questions, because I think now that we’ve had several decades of Google under our belts and other kinds of search engines, we get into these AI tools and we treat them like search engines, which for many reasons they really are not. Now, this is not bad, you can certainly get some interesting answers.
But I think it’s good to really have at the front of your mind to kind of transition from simply asking questions to what these tools really shine with, which is following directions. I think one of the best little heuristics I’ve seen out there, just very general advice, is: role, goal, and instructions. So instead of coming in and saying “what is” or “find” or something like that, what perspective is it coming from? Is it acting as an expert? Is it acting as an editor? Is it going to role play the position of a college student? Tell it what you’re trying to accomplish, and then give it some instructions for what you want it to do. That’s a big kind of step that you can get to pretty quickly once you are experimenting. And that’s, I think, real important to do. So we have that. And of course, we also want to keep in mind that one of the big distinguishing factors as well is that these tools have memory, your session is going to unfold in a particular and unique way, depending not just on the prompts you give it, but what you’ve already asked it before. So, once you’ve got those two things, you can start experimenting with it. And I do think coming at it from very specific perspectives is important as I mentioned because there’s so little super general advice, or discipline-independent advice that I think is really going to be useful to you. And so doing that, I think a lot of us we start in a sort of a low-stakes, tentative way with other interests we might have. So for example, one of the first things that I did to test it out myself was I had it work out a kind of a tedious little problem in knitting. So I had a knitting pattern, and there’s just a particular little counting algorithm where to put increases in your pattern that always trips us up. And I was about to like, “Oh, I gotta go look this up,” then I thought “You know what, I’m gonna see if ChatGPT can do this.” And it did that really well. 
And by doing that in an area where I kind of knew what to expect, I could also push its parameters a little bit, make sure is this plausible? is what it’s given me… [LAUGHTER] does that map onto reality? and I can fact check it a little bit better as I go along. So those are some things that I think that we can do, for those who really are starting from scratch or close to it right now.
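Michelle’s “role, goal, and instructions” heuristic can be sketched as a small prompt template. This is an illustrative sketch only; the function name, template wording, and sample values below are invented for the example and are not quoted from the episode.

```python
# Minimal sketch of the "role, goal, instructions" prompt heuristic:
# instead of asking a question, tell the tool what perspective to take,
# what you are trying to accomplish, and what to do.

def build_prompt(role: str, goal: str, instructions: str) -> str:
    """Assemble a prompt from the three heuristic parts."""
    return (
        f"Act as {role}. "      # role: the perspective the tool should take
        f"I am trying to {goal}. "  # goal: what you want to accomplish
        f"{instructions}"       # instructions: the concrete task
    )

# Example use, echoing the self-quizzing workflow described later
# in the conversation:
prompt = build_prompt(
    role="a tutor or a college professor",
    goal="check my understanding of a dense article",
    instructions="Give me five questions to test my knowledge, then "
                 "evaluate my answers when I reply.",
)
print(prompt)
```

The value of the pattern is less the string formatting than the habit it enforces: every session opens with a perspective, an objective, and a concrete task rather than a bare question.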

John: You’re suggesting that faculty should think about how AI tools such as this… and there’s a growing number of them, it seems more are coming out almost every week…, how they might be useful in your disciplines and in the types of things you’re preparing students for, because, as you suggested, it’s very different in different contexts. It might be very different if you’re teaching students to write than if you’re teaching them psychology or economics or math. And so it’s always tempting to prepare students for the way we were prepared for the world that we were entering into in our disciplines. But, as you suggest in the article, we really need to prepare students for the world that they’re going to be entering. Should people be thinking about how it’s likely that students will be using these tools in the future and then helping prepare them for that world?

Michelle: Yeah, that’s a really good way to start getting our arms around this. In the thinking that I’ve been doing as I’ve gone through this over the last couple of months… that just absolutely keeps coming up as a recurring thing: this is so big, complicated, and overwhelming, and means very different things for different people in different fields. Being able to kind of divide and break down that problem is so important. So, yeah, I do think that, and, for example, one of the very basic things that I’ve made some baby steps towards using myself is, ChatGPT is really good at kind of reformulating content that you give it, expanding or condensing it in particular. The other day, for example, I was really kind of working to shape a writing piece, and I had sort of a longer overview, and I needed to go back and kind of take it back down to basics and give myself some ideas as a writer. So I was not having it write any prose for me. But I said, “Okay, take what I wrote and turn it into bullet points,” and it did a great job at that. I had a request recently from somebody who was looking at some workshop content I had and said, “Oh, we really want to add on some questions where people can test their own understanding.” And you know, as the big retrieval practice [LAUGHTER] advocate and fan of all time, I’m like, “Oh, well, that’s a great idea. Oh, my goodness, and I’m gonna have to write this, I’m on a deadline.” And here too, I got, not a perfectly configured set of questions, but I got a really good starting point. So I was able to really quickly dump in some text and some content and say, “Write this many multiple choice and true/false questions.” And it did that really, really well. So those are two very elementary examples and some things that we can get in the habit of doing as faculty and as people who work with information and knowledge in general.

Rebecca: I’ve used ChatGPT quite often to get started on things too: to generate design prompts, all kinds of things, and to have it revise and add things and really get me to think through some things, and then I kind of do my own thing. But I use that as a good starting point to not have a blank page.

Michelle: Absolutely. Yeah, the blank page issue. And I think where we will need to develop our own practice is to say, “Okay, make sure we don’t conflate or accidentally commingle our work with ChatGPT’s,” as we figure out what those acceptable parameters are. But that reminds me too, I mean, we all have the arenas where we shine and the arenas where we have difficulty, again, as faculty, as working professionals. I know graphic design is your background. I’m terrible. I’m great at words, but it reminds me, one of the things that I kind of made myself go and experiment with was creating a graphic, just for my online course that’s running right now, which, for me, would typically be kind of an ordeal of searching and trying to find something that was legitimate to use and a lot of clipart, and I had it generate something. Now, I do not advise putting in something like “exciting psychology image in the style of Salvador Dali” [LAUGHTER] and seeing what comes out. He was not the right choice. It was quite terrifying. But after a lot of trial and error, I found something that was serviceable, and there too, it’s not like I need to develop those skills. If I did, I would go about that very, very differently. But it’s something that I need in the course of my work, but it’s a little outside of my real realm of expertise. So helpful there too. So yeah, the blank page… I think you really hit on something there.

John: Now did you use DALL-E or Midjourney or one of the other AI design tools to generate that image?

Michelle: Oh my goodness. Well, here again, [LAUGHTER] how far out of the proverbial comfort zone I was is really going to show. I did use DALL-E, and I really wrestled with it for a couple of reasons. As a non-graphic person, it did not come easily to me. Midjourney as well: if you’re not a Discord user, you’re really kind of fighting to figure out that interface at the same time, and those that are familiar with the cognitive load concept will recognize the feeling of [LAUGHTER] “I’m trying to focus on this project, but all this other stuff is happening.” And then I had a good friend who’s a computer engineer and designs stained glass as a hobbyist [LAUGHTER] who kind of took my hand and said, “Okay, here’s some things you can do.” It actually came up with something a lot prettier, I have to say.

John: You had just mentioned two ways in which faculty could use this: to summarize their work or to generate some questions. Not all faculty rely on retrieval practice in an optimal manner. Might this be something that students can use to fill in the gaps when they’re not getting enough retrieval practice or when they’re assigned more complex readings than they’re able to handle?

Michelle: Yeah, having the expertise is part of it, and I think we’re going to see a lot of developing understanding of that really cool tradeoff and handoff between our expertise and what the machine can do. I’m kicking around this idea as well, so I’m glad you brought that up. A nice side effect could be a new era for retrieval practice, since getting quality prompts and questions for yourself is something of a limiting factor. It’s funny: I’m taking a little prompt engineering course right now to try to build some of these skills and the facility with it. And one of the things they assigned was a big dense article [LAUGHTER] on prompt engineering, which was really great, but a little out of my field, and so I’m kind of going, “Well, did I get that?” And then I thought, I’d better take my own medicine here and say, “Well, what’s the best way to ensure that you do, and to find out if you don’t have a good grasp of what you were assigned?” And so I gave it the content; I gave it, again, a role, a goal, and some instructions, and said, “Act as a tutor or a college professor, take this article, and give me five questions to test my knowledge.” And then I told it to evaluate my answers [LAUGHTER] and see whether they were correct. So that was about as meta as you can get, I think, in this area right now. So I’ve done it. And here again, it does a pretty good job, actually an excellent job. Do you want to use it for something super high stakes? Probably not, especially without taking that expert eye to it. But wow, here’s something, here’s content that was challenging to me personally. It did not come with built-in retrieval practice, or a live tutor to help me out with it. I read it, and I’m kind of going, “I don’t know, I don’t have a really confident feeling.” So I was able to run through that.
And so yeah, that could be one of the initial steps that we suggest to students as a potentially helpful and not terribly risky way of using these really powerful new tools.

Rebecca: One of the things that this conversation is reminding me of, and some of the others that we’ve had about ChatGPT, is that we tend to talk a little bit about how students might use it in an assignment or something, or how we might coach a student to use it. But we don’t often talk a lot about ways that students might just come to a tool like this, and how they’re just going to use it on their own without us having any [LAUGHTER] impact. I think, often, we jump to the conclusion that they’re gonna have a tool write a paper or whatever. What are some other ways that we can imagine or roleplay or experiment in the role of a student to see how a tool like this might impact our learning?

Michelle: So that is another kind of neat running theme that does come up, I think, with these AI tools: role playing. I mean, this is what it’s essentially doing. And so having us roleplay the position of a student, or having it evaluate our materials from the perspective of a student, I think, could be useful. But it kind of reminds me, let’s not have a total illusion of control over this. I think, as faculty, we have a very individualistic approach to our work. And I think that’s fine. But yeah, there’s a lot happening outside of the classroom that we should always have in mind. So just like with me on that hyper-planned first course that I was going to be teaching, it just happened, and students were already out there experimenting with “Oh, here’s how I can complete this basic statistics assignment with the assistance of this tool I’m going to teach myself.” So that could be going on, almost certainly is going on, out there in the world of students. And it’s another time to do something which I know I have to remind myself to do, which is ask students and really talk to them about it. Early on, I think there was a little bit of, from the students, “Oh, this is sort of a taboo or a secret, and I can’t talk to my professors about it, and I don’t want to broach it,” and professors didn’t want to broach it with their students, because we don’t want to give anybody ideas or suggest some things are okay where they’re not. But I think we’re at a good point to just kind of level with our students and ask them, “How do you think we could bring this in?” I think next semester, I’m going to run maybe an extra credit assignment and say, “Oh, okay, we’re gonna have a contest: you get bragging rights, and maybe a few points. What is a good creative use of this tool in a way that relates to this class? Or can you create something, kind of a creative product or some kind of a demonstration, that in some way ties to the class?” And I’ve learned through experience, when I’m stumped and I don’t quite know where to go with a tool or a technique or a problem, take it to the students and see what they can do with it.

Rebecca: I can see this is a real opportunity to just ask the students how they are using it, and then take a look at the results that it’s creating. And then this is where we can provide some information about how expertise in a field [LAUGHTER] could actually make that better, and why that result is, or isn’t, what they think it is.

Michelle: Absolutely, and some of the best suggestions that I’ve seen out there… I’m kind of eagerly consuming suggestions across as many disciplines as I can. The most intriguing ones I’ve seen are things with a media literacy and critical thinking flair that tell students, “Okay, here’s something to elicit from the AI tool that you’re using, and then we, from our human and expert perspectives, are going to critique that and see how we could improve it.” So here too, critical thinking and those kinds of evaluation skills and abilities are some of the most prized things we want students to be getting in higher education. And simultaneously, for many different reasons, they are some of the hardest. So if we can bring that to bear on the problem, I think that can be a big benefit.

John: In the article, you suggested that faculty should consider introducing some AI based activities in their classes. Could you talk a little bit about some that you might be considering or that you might recommend to people?

Michelle: One of the things that I am going to be teaching, actually for the first time in a very long time, is a writing in psychology course, which has the added challenge of being fully online asynchronous, so that’s going to be coming up pretty soon for me. It’s still under construction, as I’m sure a lot of our activities are, and a lot of the things that we’re thinking about in this very fluid and rapidly developing area. I think things like outlining, things like having ChatGPT suggest improvements, and finding ways for students to also kind of track their workflow with that. As I mentioned in the article, I do think that our different professional [LAUGHTER] lives should really lead the way: what work are we doing as faculty and as scholars in our particular areas? One of the things we’re going to have to be looking at is, alright, how do I manage any output that I got from this, knowing what belongs to it and what was generated by me? What have I already asked it? If there are particularly good prompts, how do I save those so I can reuse them? …another really good thing about interacting with the tools. But I’m kind of playing around with some different ideas about having students generate maybe structures or suggestions that they can work off of themselves, and having ChatGPT give them some feedback on what they’ve developed so far. So one of the things you can ask it to do is critique what you tell it, so [LAUGHTER] you can say, “Okay, improve on this.” And then you can repeat, you can keep iterating on that, and you can keep fine tuning in different areas. You can also have it improve on its own work. So once it makes a suggestion, you can… I mean, it’s virtually infinite what you can tell it to go back and do: to refocus, expand, condense, add and delete, and so on. So that’s kind of what I am shaping right here.
I think too, at the Introduction to Psychology level, which is the other level that I frequently teach within, I’m not incorporating it quite yet. But I think about having students have the opportunity or option to create a dialogue, an example, maybe even a short play or skit that it can produce to illustrate some concepts from the book. And there, ChatGPT is going to be filling in kind of all the specifics, the student won’t be doing it, but it’ll be up to them to say, “Well, what really stood out to me in this big, vast [LAUGHTER] landscape of introductory material that I think would be so cool to communicate to another person in a creative way?” And this can help out with that. I’m also going to be teaching my teaching practicum for graduate students coming up as well. And, of course, I want to incorporate just kind of the latest state of the art information about it. But also, supposedly (I haven’t tried it myself yet) it’s pretty good at structuring lesson plans. We don’t do formal lesson plans the way they’re done in K through 12 education, of course, but to give it the basics of an idea and then have a plan that you’re going to take into a course, since one of the things they do in that course is produce plans for courses. And I gotta say, the formatting, exactly how that’s all going to be laid out on the page, is not a critical skill; it’s not what they’re in the class to do. It’s to really develop their own teaching philosophy, knowledge, and the ability to put those into practice in a classroom. So if it can be an aid to that, great. And I also want them to know what the capabilities are if they haven’t experimented with them yet, so they can be very aware of that going into the first classes that they teach.

Rebecca: When you mentioned the example of a writing intensive class that’s fully asynchronous online, I immediately thought of all of the concerns [LAUGHTER] and barriers that faculty are really struggling with in highly writing intensive spaces and fully online environments, especially around things like academic integrity. Can you talk a little bit about [LAUGHTER] some of the things that you’re thinking about as you’re working through how you’re gonna handle AI in that context?

Michelle: As I’ve been talking with other faculty, one of the things that I’ve really settled on is the importance of keeping these kinds of threads of the conversation separate, and so I’m really glad we’re kind of piecing that out from everything [LAUGHTER] else. Because once again, it’s just too much to say, well, on the one hand, how do I prepare students and give them skills they might need in the future? How do I use it to enhance learning? And, oh my gosh, is everybody just going to have AI complete their assignments? It’s kind of too much at once. But once we do piece that out… as you might pick up, I’m a little enthusiastic about some of the potential; that does not mean I don’t think this is a pretty important concern. So I think we’re gonna see a lot of claims about “Oh, we’re going to AI-proof assignments,” and I think probably many of your listeners have already run across AI detection tools and the severe problems with those right now. So I think we have to just say right now, for practical purposes: no, you cannot really reliably detect AI-written material. I think that if you’re teaching online especially, I think we should all just say flat out that AI can take your exams. If you have really conventional exams, as I did before [LAUGHTER] this semester in some of my online courses, if you’ve got those, it can take those. And just to kind of drive home to folks, this is not just simple pattern matching, looking up your particular question that floated out into a database; no, it’s processing what you’re putting in. And it’s probably going to do pretty well at that. So for me, I’m kind of thinking about a lot of these, in my own mind, more as speed bumps. I can put speed bumps in the road, and I want to know what speed bumps are going to at least discourage students from just dumping the class work into ChatGPT.
To know what’s effective, it really helps to go in and know what it does well and what it really stumbles on; that will give you some hints about how to make it less attractive. And that’s kind of what I’m settling on right now myself, and what I’ve shared with students, as I’ve spoken with them really candidly to say I’m not trying to police or catch people, I am not under an illusion that I can just AI-proof everything. I want to remove obvious temptation. I want to make it so a student who otherwise is inclined to do the right thing, wants to have integrity and wants to learn, doesn’t go in feeling like, “Oh, I’m at a disadvantage if I don’t just do this, it’s sitting right there.” So creating those nudges away from it, I think, is important. And yeah, I took the step of taking out conventional exams from the online class I’m teaching right now. And I have been steadily de-emphasizing them more with every single iteration. I think those who are into online course design might agree: well, maybe that was never really a good fit to begin with. That’s something that we developed for these face-to-face environments, and we just kind of transplanted it into that environment. But I sort of ripped off that [LAUGHTER] bandaid and said, “Okay, we’re just not going to do this.” I’ve put more into the other substance of the course, I put in other kinds of interactions. Because if I ask them Psychology 101 basic test questions, even if I write them fresh every time, it can answer those handily, it really can.

John: Recently, someone ran the micro and macro versions of the Test of Understanding in College Economics through ChatGPT. And I remember on the macro version ChatGPT-4 scored at the 99th percentile on this multiple-choice quiz, which basically is the type of thing that people would be putting in their regular tests. So it’s going to be a challenge because many of the things we use to assess students’ learning can all be completed by ChatGPT. What types of activities are you thinking of using in that online class that will let you assess student learning without assessing ChatGPT’s or other AI tools’ ability to represent learning?

Michelle: Well, I’ll share one that’s pretty simple, but I was doing anyway for other reasons. So just to take one very simple example of something that we do in that class, I really got a big kick out of Kahoot!, especially during the heyday of fully hybrid teaching, where we were charged as faculty, I know at my institution, with having a class that can run synchronously with in-person and remote students at the same time, and run [LAUGHTER] asynchronously for students who need to do their work at a different time. And that was a lot, and Kahoot! was a really good solution to that. It’s got a very K through 12 flavor to it, but most students just really take a shine to it anyway. And it is familiar to many of them from high school or previous classes right now. So it’s a quiz game, it runs a timed gamified quiz. So students are answering test questions in these Kahoot!s that I set up. And because it has that flexibility, they have the option to play the quiz game sort of asynchronously on their own time, or we have those different live sessions that they can drop in on and play against each other and against me. So that’s all great. But here’s the thing: prior to ChatGPT, I said I don’t want to grade this on accuracy, which feels really weird, right, as a faculty member to say, well, here’s the test and your grade is not based on the points you earn for accuracy. It’s very timed, a little hiccup in the connectivity you have at home can alter your score, and I just didn’t like it. So what students do is, for their grade, they do a reflection. So I give the link to the Kahoot!, you play it, and then what you turn in to me is this really informal and hopefully very authentic reflection, say, “Well, how did you do? What surprised you the most?
Were there particular questions that tripped you up?” And also kind of getting them to say, “Well, what are you going to do differently next time?” And for those who are big fans of teaching metacognition, I mean, that comes through loud and clear, I’m sure. So every single module they have this opportunity to come in and say, “Okay, here’s how I’m doing, and here’s what I’m finding challenging in the content.” Is it AI-proof? Absolutely not. No, it really isn’t. But it is, I think, at least at that tipping point where the contortions you’d have to go through to come up with something that is gonna pass the sniff test with me would be real work, and I’ve now read thousands of these, so I know what they tend to look like. And Kahoot!s are timed. I mean, could you really quickly transfer them out and type them in? Yes. It’s simply a speed bump. But the time would make that also a real challenge, to kind of toggle back and forth. So I feel good about having that in the class. And so it’s something, again, I’ve been developing for a while; I didn’t just come up with it, fortunately, the minute that ChatGPT really impinged on this class, but it was already in place. And I kind of was able to elevate that and have that be part of it. And so they’re doing that. I do a lot of collaborative annotation, and I continue to be really happy with… I use Perusall. I know that’s not the only option there is, but it’s great. They’ve got an open source textbook. And they’re in there commenting and playing off each other in the comments. So that is the kind of engagement I think that we need anyway, and it is less of a temptation. And so I feel like that’s probably better than having them try to quickly type out answers to, frankly, pretty generic definitions and so on that we have in that course. Some people are not going to be happy with that, but that’s really truly what I’m doing in that course instead.

John: Might this lead to a bit of a shift toward more people using ungrading techniques with those types of reflections, as a way of shifting the focus away from grading, which would encourage the use of ChatGPT or other tools, and toward learning, which might discourage them from being used inappropriately?

Michelle: What a fantastic connection. And you know what? When I recently led a discussion with faculty in my own department about this, that is actually something that came up over and over. It’s not ungrading per se, because not everybody is even kind of conversant with that concept, but there are these trends that have been going on for a while of saying, you know, is a timed multiple-choice test really what I need everything to hinge on in this online course? Ungrading, this idea of, I think, this emerging almost idea I’ll call both-sides-ism, or collaboration between student and teacher, which I think was also taking root through pandemic teaching, and that came to the forefront with me of just saying, “Okay, we’re not going to just be able to keep running everything the same way traditionally it’s been run,” which sometimes does have that underlying philosophy of, “Okay, I’m going to make you do things and then you owe me this work, and I’m going to judge it and you’re going to try to get the highest points with the least effort.” I mean, that whole dynamic, that is what I think powers this interest in ungrading, which is so exciting, and it’s gonna maybe be pushed ahead by this as well. Ultimately, the reason why you’re going to do these exercises I assign to you is because you want to develop these skills. You are here for a reason, and I am here to help you. So that is, I think, a real positive perspective we can bring to this, and I would love to see those two things wedded together. Especially now that tests can be taken by ChatGPT, we should take another look at all of our evaluation and sort of the underlying philosophy that powers it.

John: One of the concerns about ChatGPT is that it sometimes makes mistakes, it sometimes will make stuff up, and it’s also not very good with citations. In many cases, it will just completely fabricate citations, where it will get the names of people who’ve done research in the field, but will grab other titles or make up titles for their work. Might that be a way in which we could give students an assignment to use one of these tools to generate a paper or a summary on some topic, but then have them go out and verify the arguments made, look up the citations, and document that, just as a way of helping prepare them for a world where they have a tool which is really powerful, but is also sometimes going off in strange directions, so that they can develop their critical thinking skills more effectively?

Michelle: Yeah, looping back to that critical thinking idea. Could this also be a real way to elevate what we’ve been doing and give us some new options in this really challenging and high-value area? And yes, this is another thing that I think faculty hopefully will discover and get a sense of as they experiment themselves. I think probably a lot of us have also experimented with, just ask it about yourself. Ask it, what has Dr. Michelle Miller written? There’s a whole collaborator [LAUGHTER] I have never heard of, and when it goes off the rails, it goes. And it’s one thing to say, really kind of super vaguely, “Oh, AI may produce output that can’t be trusted.” That has that real, like, okay, caution, but not really, feel to it. It’s a whole other thing to actually sit with it and say, alright, have it generate these citations. They sure do look scholarly, don’t they? They really look right. Okay, now go check them out. And say, this came out of pure thin air, didn’t it? Or it was close, but it was way off in some particular way. So as in so many areas, to actually have the opportunity to say, okay, generate it and then look at it, and some of the issues are staring you right there in the face. So I think that we will see a lot of faculty coming up with really dynamic exercises that are finely tuned to their particular area. But yeah, when we talk about writing, all kinds of scholarly writing and research in general, I think that’s going to be a very rich field for ideas. So I’m looking forward to seeing what students and faculty come up with there.

Rebecca: That’s a nice lead into the way that we always wrap up, Michelle, which is to ask: “what’s next?”

Michelle: Well, gosh, alright. So I’m continuing to write about and disseminate all kinds of exciting research findings. I’ve got my research-based Substack, and that’s still going pretty strong. Over the summer, I actually focused it on ChatGPT and AI for a couple of months. But now I’m back to more general topics in psychology, neuroscience, education, and technology. So articles that pull in at least three out of four of those. I’ve got some other bigger writing projects that are still in the cooker, and so I’ll leave it at that with those. And I’m continuing to really develop what I know about and what I can do with ChatGPT. As I was monitoring this literature, it was really very clear that we are at a very, very early stage of scholarship and applied information that people can actually use. Those are all things that are very much on the horizon for my next couple of months.

Rebecca: Well, thank you so much, Michelle, we always enjoy talking with you. And it’s always good to think through and process this new world with others.

Michelle: Absolutely.

John: It certainly keeps things more interesting and exciting than just doing the same thing in the same way all the time. Well, thank you.

[MUSIC]

John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.

[MUSIC]

308. Design for Learning

We tend to design courses for ourselves because we are the audience we know best. In this episode Jenae Cohn joins us to explore how user-experience design principles can help us create effective and engaging learning experiences for the students we have right now. Jenae is the Executive Director of the Center for Teaching and Learning at the University of California at Berkeley. She is the author of Skim, Dive, and Surface: Teaching Digital Reading. Her newest book, co-authored with Michael Greer, is Design for Learning: User Experience in Online Teaching and Learning.

Show Notes

  • Cohn, J. (2021). Skim, dive, surface: Teaching digital reading. West Virginia University Press.
  • Cohn, J., & Greer, M. (2023). Design for learning: User experience in online teaching and learning. Rosenfeld Media.
  • Global Society of Online Literacy Educators
  • Horton, S., & Quesenbery, W. (2014). A web for everyone: Designing accessible user experiences. Rosenfeld Media.
  • Web Accessibility Guidelines
  • Copies of Design for Learning may be ordered at the Rosenfeld Media website. The discount code for listeners is TEA20. It’ll be available on Wednesday, 9/27 and will give listeners access to 20% off the book for one month (i.e. 30 days).

Transcript

John: We tend to design courses for ourselves because we are the audience we know best. In this episode we explore how user-experience design principles can help us create effective and engaging learning experiences for the students we have right now.

[MUSIC]

John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.

[MUSIC]

Rebecca: Our guest today is Jenae Cohn. Jenae is the Executive Director of the Center for Teaching and Learning at the University of California at Berkeley. She is the author of Skim, Dive, and Surface: Teaching Digital Reading. Her newest book, co-authored with Michael Greer, is Design for Learning: User Experience in Online Teaching and Learning. Welcome back, Jenae.

Jenae: Thank you. I’m so glad to be back.

John: It’s good to see you again.

Jenae: …Good to see you, too.

John: Today’s teas are… Jenae, are you drinking any tea?

Jenae: I sure am. I’m always prepared to drink tea. Especially when I’m talking to the two of you. But I went for a classic English breakfast tea this morning. Do you both have some tea with you?

Rebecca: Yeah, I have English tea time.

Jenae: We’re matching…

Rebecca: Yeah…

[LAUGHTER]

John: And, I’m not. I have [LAUGHTER] ginger peach black tea today.

Jenae: That sounds really good, though.

John: It is.

Rebecca: Sounds like a good way to start the day, for sure. So we invited you here today to discuss Design for Learning. Can you talk a little bit about how this book project came about?

Jenae: Absolutely. So my colleague Michael and I have a lot of shared interests. Michael and I both are trained in rhetoric and composition. And we both are people really interested in online writing, online reading, and online learning, broadly speaking. We both served on the board for the Global Society of Online Literacy Educators, which is an organization dedicated to supporting folks who teach reading and writing online, broadly speaking. Through that organization, we got to know each other better. And we just realized how much we wanted to talk about what it really meant to create quality online learning experiences, and something kept cropping up for the two of us. I should say that both of us have had like a hodgepodge of jobs in and around higher education. We kind of joke that we were both sort of like these misfits in higher ed, people who have kind of done a bit of teaching, a bit of admin. He’s worked in publishing; I did a lot of work in instructional design and just higher education pedagogy. And something we noticed, just in the various roles that we were in, was that educators, professors, faculty could learn a lot from user experience frameworks. And we were reading a lot about UX and UI in the work that we were doing around instructional design and, for him, publishing, and it just dawned on us: why are we not bridging these conversations between the work of thinking about designing learning interfaces and the work of building really good, high-quality learning experiences? I think we noticed that in higher ed, there is this tendency to kind of try and reinvent the wheel around defining what a good teaching experience, especially what a good online teaching experience, is, by just creating really kind of exhausting templates and tons of checklists and rules. And we really thought those are useful, but wouldn’t it be more useful just to remember that students are people navigating devices online?
And can’t we use the frameworks that help inform those design decisions to inform the design of learning experiences to make them better? So that was really the genesis of this project. We started off thinking we’d write a bunch of blog posts and then it struck us that blogs and articles were great, but wouldn’t it be even better if we wrote a book [LAUGHTER]. So we put it all together, and it resulted in this book.

John: So who’s the intended audience of this book?

Jenae: We really are targeting a broad audience with this book, I’d say primarily with folks who do instructional-design-style work in mind. So in higher ed, that could be faculty; a lot of faculty play the role of instructional designers, as well as facilitators and teachers, of course. But we also hope that this book would really reach folks who do dedicated instructional design support. We also hope that this would just reach people who are having to teach online or do trainings or workshops online, and who are still really struggling with it. This book, I would say, was written before the pandemic happened. We were, I would say, drafting and conceptualizing it before the pandemic, and of course, the pandemic shaped the drafting as we went. There’s still COVID-19 out there, so I don’t want to say we’re beyond the pandemic. But in this moment where we’re beyond perhaps like a peak point of the pandemic, let’s just say, there may be folks who are still wanting to be more intentional about what it means to provide more equitable access to online learning experiences, who want to be designing in a more intentional way, and who want to be really thinking critically about how to create more sustainable online learning experiences as well, that really work. I think we were also on a mission with this book to prove that really, anyone can do this, you just need to keep some known principles in mind; again, this is not totally new territory, and scholars in user experience and human-computer interaction have been thinking for a very long time about how to make information accessible online, and how to make sure that information and interactions are easily navigable. And so that was really the literature we wanted to tap into. So that’s all to say that I think the people who benefit from reading this book are really anyone who wants to be creating a better online learning experience for whatever teaching situation they’re in.

Rebecca: I’m, of course, super excited about this book, because I’m a UX designer. I love that you use that framework to write this book. Can you talk a little bit about why you chose this approach?

Jenae: Absolutely. I’m so glad that you appreciate this book exists. We’ve gotten really good reception from the UX community on it as well. I would say that we use this framework because we felt like it really centered the learner in an important kind of way. I think that in a lot of teaching situations, people who educate or design learning are often thinking more about the content: What information do I have to deliver? What are the main things that I need to make sure people know how to do? Those aren’t bad things to focus on; we need to cover content, and we need to make sure that there are clear outcomes. But I think it’s most important to really think about how is someone engaging with that content? How are they understanding it? What are their opportunities to understand that content in a variety of different ways? And I think what a user experience framework allows us to do is to center that reminder. Learners have these embodied experiences that shape how well they’re going to be able to learn, how well they’re going to be able to interface with the information. And if we’re talking about that in an online context, in particular, it’s impossible to do so without addressing what it means to, again, engage with and use these online environments effectively. So I think a UX framework really just allows us to be more centered in reminding ourselves who really benefits from the learning experiences we design, and who really needs to have access to [LAUGHTER] the information to be successful. And I think UX frameworks just really help us center that.

John: Can you talk a little bit about how this approach centers the user in terms of practical ways in which that’s built into the design process?

Jenae: Sure, one way to sort of think about that is to really take a step back and try to remind yourself just who is taking your class in the first place. Starting there, starting from the place of trying to be curious about who your learners really are. I think that it’s easy to make assumptions, I’ll just say in higher education, in particular, since I think that’s primarily the audience for this particular podcast. I think a common misconception, for example, is that all students entering the class are traditional college age, 18 to 21 years old, but like, I should put a big asterisk on that and say, that’s probably not the traditional age at most institutions anymore. But that’s the stereotype of kind of who a college student is. And there may be some assumptions about what their prior learning experiences were like that brought them into a college classroom… about the prior knowledge that they had. And so what I think user-centered design encourages us to ask is: “Do we know that? How do we know that? What information do we need to gather to remember who’s actually coming into our rooms?” And I’m not suggesting that any educator has to, like, do deep-dive demographic data work to find out who their learners are. But I think most of us can kind of anticipate the range of people who are coming into our classes. We might anticipate just the different types of learners that we may encounter. And by that, I mean, it’s worth, I think, before you start designing, just trying to remember, what are the different motivations that students have for coming into the class? What are their purposes for being there? What are the main things that students are going to want to do by being in your class or your training or your workshop at any given moment? So start by just sort of trying to map out who those people are.
And then try to anticipate, okay, given this motivation, or this purpose that this learner may have, what kinds of things might they be looking for… literally looking for in my online course? What things will they click on first? Which links are they going to want to access most frequently? Which resources are going to benefit them most on the site? And then try to design your learning management system course site, or if you’re not using a learning management system, your course website, broadly speaking, to really privilege the resources, the links, the activities, the pages that are going to be best aligned with what you anticipate your users or your learners may need. And Rebecca, I’m sure, can speak to this given her expertise, too, but UX design really is a whole process of trying to consider how the visual information, how even, like, the tactile information, say how your keyboard is set up, how your device is set up, how that allows you to most easily use and engage with the products, so to speak, that you’re building. And in this case, we want to think about how you can build the best online course that you can, in a way that allows users to most easily find the information you anticipate they will most frequently need.

Rebecca: So one of the things I’m hearing you say, is really thinking about the wide variety of learners that we have and the different needs that they have and trying to address that. One of the things that’s really popular in UX design and that you talk about in your book are personas. Can you talk a little bit about how learner personas can help us think through the different kinds of learners that we have in our class in a really practical, tangible way. You just kind of provided that theoretical framework, but I love that the personas is such a practical application of that.

Jenae: Yes, thanks for asking that, Rebecca. I was debating whether to dive into that with the last question, but let’s dive into it now. So for those who aren’t familiar, personas are an exercise where you really try to create a character sketch, I would say, of the user you’re imagining is going to engage with your course, or in this case, try to imagine an example of a student who’s going to be in your class. And by creating a character sketch, I mean, I encourage instructors, if they have the time, to sit down and say, “Okay, what might be the name of someone in my class? What might be their age? What might their prior experiences with learning my topic have been? Why are they here? What brings them to college? Or what brings them to this class in the first place? What are going to be some of their biggest challenges? What are going to be some of their biggest hopes? What are the things that they’re going to be most excited about doing in this class?” And again, it’s a bit of an imaginative exercise. And so I think it’s easier to do with more teaching experience. But it’s also not impossible to do even if you’ve had relatively limited experience. It’s really just an exercise in trying to think through who might be the real people that you are engaging with. I do want to say that there’s been a lot of conversation in the UX community, and again, Rebecca, you may have some thoughts on this, too, about the stereotypes that personas can sometimes perpetuate. For example, I think there have been concerns in the UX community that when you try to characterize, say, an older user of an online interface, a stereotype might be that they struggle more or are more challenged with using technology than, say, a younger user. And that might be a challenge to anticipate.
And so I want to be mindful, for example, that if you are going to be in the practice of building personas, which we talk about in the book, because I do think it is a useful exercise to kind of try and make concrete for yourself who is going to be on the receiving end of your experience, that you do try to check yourself a little bit on reinforcing stereotypes, to the best of your ability. It’s easy to do; stereotypes exist because we notice patterns sometimes in how people behave, and that can sort of reproduce some harmful assumptions about who those users are. But again, to the best of your ability, attempt to anticipate what the needs might be based on what you do know about who might be in the room, just, again, kind of reminding yourself that you’ll want to think about your personas in nuanced ways, and not necessarily make assumptions about who they are. And to the question of, how am I supposed to write a generalized description of a persona while avoiding all possible stereotypes about who they might be? I would say, again, time allowing, try to run your personas by other people, and just see what their reactions are to reading them. For example, if you have a trusted colleague, or a friend who teaches a similar class, or who you work with regularly, just show them what you’ve drafted and say, “Does this feel like a real person to you?” And attempt to ask diverse people about how your persona sketches are landing, or how realistic they feel to them. That’s always a good way to kind of gut check, and just make sure that as you’re anticipating your users’ needs, you’re not falling too much into your own biases about who the people are that you’re supporting in your course.

Rebecca: So one of the things, I think, people do sometimes run into when they’re making personas is to create the ideal student that doesn’t exist, and also to recreate themselves. And so one strategy that I often recommend is thinking about creating aggregates of people that you do know, because then they’re more realistic in terms of the way they might interact. So if you’ve taught a class before, you might have a real pool of people you could draw from [LAUGHTER] to create a persona; obviously, that’s more difficult when it’s a new place. And I was also going to offer up, in terms of thinking about disability and thinking about accessibility, that there’s a book called A Web for Everyone. It was published quite a while ago, but they still have a lot of resources online, including personas for people with a wide range of different kinds of disabilities. And sometimes that can be really useful in just thinking through kinds of scenarios that you might not think of on your own.

Jenae: That’s fabulous. I would love to see that resource about sort of supporting accessibility, especially. That’s such a huge issue in designing online learning experiences, particularly. I’m so glad you mentioned those resources. That’s fantastic.

John: And while there may be those types of biases that you might have, those who’ve taught classes multiple times do know some of the types of problems that past students have had. So those issues that they’ve experienced in the past could be built in. But one of the other things you suggest is doing a pre-course survey, so that you get some more information about the actual students in the room rather than those you may have been thinking about when you initially designed the course. Could you talk a little bit about that survey?

Jenae: Yes, I’d be happy to talk about the pre-course surveys. So this is a practice that, I think, has multiple benefits. So in a pre-course survey, I think instructors have this wonderful opportunity just to ask students what their motivations are for engaging with the class, what brought them here, how they would characterize some of their prior experiences with learning similar topics, if any, and just to voice what concerns they have, or what things are exciting to them about the term ahead. I’m giving a lot of examples of possible questions, and I just want to acknowledge that not all instructors will want to ask all of those questions all at once. But those kinds of questions that really get at motivation and concerns, I would say, in a nutshell, can be really critical, both for adjusting, I think, those persona expectations. So, creating personas should be an iterative process, I should say, as well. It’s not a one-and-done thing where you anticipate who your learners are prior to the course starting and then you’re like, “Okay, I figured it out, I know who all the students are.” Knowing who the real students are can then allow you to go back to what you anticipated. I think both of you, Rebecca and John, were speaking to how you could use information from prior terms to inform your kind of current term or current course. Great, you could sort of just align your prior understanding with this current information you might get from these surveys to then go into your course website, or your course learning management system, your syllabus, and say, “Okay, is this design going to work for the group of people who are actually here, based on what I’m reading?” …recognizing, of course, that nothing’s gonna be perfect for everyone. But you can do the best you can to try and make the materials as good as possible for the group that you have in front of you. I would say that you want the survey to feel less burdensome for your students to complete.
I’m giving a lot of examples of questions that I think are ideal as open-ended questions. Some of these you could turn into multiple-choice or Likert-scale-style questions, because you can just use it as an opportunity to take the temperature. “On a scale of one to 10, for example, how confident do you feel in your ability to pick up new quantitative concepts?” …if you’re teaching in a STEM-style discipline. Or “On a scale of one to 10, how comfortable do you feel as a writer or with writing tasks?” …if you’re teaching something more humanities- or writing-centric. You can get really creative in trying to solicit some feedback. And I also encourage instructors to be judicious in what they’re asking in these pre-course surveys, and to ask questions with the end goal of helping you as the instructor make small tweaks to the design of the course. Think about this information as a way to say: “Okay, are there certain links I should put on the homepage that I didn’t think needed to be on the homepage? Should I reorganize the menu in my learning management system in a way that highlights some resources more than others, based on the information I’m getting in the survey? Should I reorganize a module to introduce some content before other content, because I’m seeing a trend in the surveys of less confidence in one area of the course than I was expecting?” So thinking about how the answers might inform your design, from a research-based perspective, really can make your course even stronger. And I think it’ll feel better, both for you and the students, because it helps the students see that you’re curious about them, that you want to know who they really are. And we know that engaging personally with people really matters for good teaching. But for the instructor too, it can be really frustrating if you design something and it doesn’t land with your students.
You feel like you spent a lot of time building something that didn’t work, and that’s a really disheartening experience. So getting the feedback might allow you to avoid [LAUGHTER] feeling so disappointed if the material didn’t land the way you were expecting it to. And this isn’t foolproof. There’s always room, again, for iteration. But I do think the surveys can at least help you anticipate a little bit better how the progression through your course could go.

Rebecca: I can imagine that some of those surveys with open-ended questions could lead to better understanding of how students name or label things, which could give you a lot of clues about the actual user design of a course, just through how you might name or provide quick descriptions of things. In your book, you talk a lot about instructional text design, which obviously plays lots of roles in online learning, from instructions for assignments to just how we might label a folder [LAUGHTER]. There’s a lot of skill there. Can you talk a little bit about the basic principles that you’d recommend course designers follow when they’re writing instructional text?

Jenae: Absolutely, and I realized, as you were talking and responding, I was nodding along. And then it struck me: “I’m on a podcast, no one’s going to know that I’m nodding and agreeing with you right now.” So [LAUGHTER] for the listeners’ sake, I was nodding along quite vigorously with that entire response. Instructional text, I think, is one of the most underrated and one of the most important things to design for any online course experience. I think that online course designers have a real tendency to rely too heavily on video and on images. There’s an assumption that if you’re working online, everyone’s just using video all the time, or everyone’s just wanting to engage with the flashiest multimedia possible. That is still important. I mean, we have two chapters in the book dedicated to video, so I don’t want to undermine that. It is important to engage with multimodal artifacts and to build multimodal interventions when you’re teaching in a multimodal environment like the internet. However, for students who may have low internet access and low bandwidth, and for students with disabilities, text remains one of the most accessible and easiest ways to find information in an online course. I’d also say text is one of the most mobile-friendly pieces to think about, and we know that an increasing number of students are accessing their courses or coursework through their smartphones. I’ll answer your question directly now, but I wanted to provide that context. When it comes to designing instructional text, I really encourage instructors to think about two big things: the hierarchy of the information that they’re writing, and the discrete chunks of information that they’re wanting to communicate. So when I talk about the hierarchy of text, I think it’s important when we’re writing to consider: what are the sections of our text?
Most academics, most instructors, are used to, when they’re reading or writing, creating headers, and sub-headers, and paragraphs that denote a certain order of information. And when you’re teaching online, especially, you need to think even more critically about how you are labeling the text. How are you indicating which things are instructions versus content? How are you labeling the order of the content that you want students to read in? How are you even labeling the order of instructions, since there are usually multi-tiered sets of steps? So using header text, and different layers of header text, is a really important web accessibility measure. And again, it helps readers see the structure visually, and if they’re using a screen reader tool, it helps them navigate that text more easily. I should take one step back and say that when I’m referring to header text, I mean that when you’re working in a rich text editor on any website, you can typically see an option to select different layers of headers: header ones are usually the highest, biggest-level header, header twos go below that, header threes go below those. So being mindful that just increasing text size is not the same thing as using headers is one really, really simple way to create hierarchy, and again, to denote the correct order of reading the text information. And when I say chunking text, this is as simple as just thinking about paragraphing, making sure that you are spacing out pieces of content in really critical ways. Anyone who’s read a piece of writing with a super long paragraph knows that’s a lot harder to discern; it’s a lot harder to see how one idea moves to the next. Shorter paragraphs typically make it easier to get a sense of when you’re moving from one idea to a new idea.
And so even though long paragraphs have their purpose, perhaps especially in scholarly writing, or even in more creative writing in some cases, when you’re doing really instructional or technical work, which you’re often doing when you’re designing a class, shorter is better; more chunked is easier to access, because you’re assuming that people are doing things with your information. So those are the two qualities I would be thinking about with instructional text. There’s a whole other component that we didn’t really address in the book, but that I’ll just touch on very briefly here, which is thinking about the visual appearance of your text. A lot of accessibility folks speak to some best practices and guidelines around font face, and font size, and some of these factors when you’re designing text as well. I’m not an expert, I should say, in type or graphic design, but I want to point it out anyway, because I think if you are designing online, it’s important again to do the best that you can to try and anticipate those needs. So, as a general rule: making sure your font sizes are not super teeny tiny, or super large; making sure that you’re using standard font faces: Arial, Helvetica, any sort of sans-serif font is typically considered a best practice. The rules around this change all the time, Web Accessibility Guidelines change as technology evolves, so I never like to give super hard and fast rules, and again, it’s not my area of expertise. But it’s another piece to keep in mind, because visual and verbal information are intertwined. Text is a visual medium, and online learning experiences are largely a visual medium by default. And so the more mindful we can be of what that looks like, and the more mindful we can be of how the visual experiences we design online are compatible with accommodations for disabled users, the more proactively we anticipate our users’ needs, our learners’ needs, and it raises all boats for everyone. It just gives everybody a deeper chance to succeed if we’re thinking about these interface choices in more deliberate ways.
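[Editor’s aside for readers: the heading-hierarchy practice described above can even be checked programmatically. This is a minimal sketch, using only Python’s standard library; the function name, the check it performs, and the sample course-page markup are illustrations by the editor, not something from the book or the episode.]

```python
from html.parser import HTMLParser


class HeadingCollector(HTMLParser):
    """Collect heading levels (h1-h6) in document order."""

    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag names; match "h1" through "h6".
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            level = int(tag[1])
            if 1 <= level <= 6:
                self.levels.append(level)


def heading_skips(html):
    """Return (prev, curr) pairs where the hierarchy jumps down more
    than one level at a time, e.g. an h1 followed directly by an h3,
    which leaves screen-reader users without an intermediate level."""
    parser = HeadingCollector()
    parser.feed(html)
    skips = []
    for prev, curr in zip(parser.levels, parser.levels[1:]):
        if curr > prev + 1:
            skips.append((prev, curr))
    return skips


page = """
<h1>Module 1</h1>
<h3>Readings</h3>      <!-- skips h2 -->
<h2>Instructions</h2>
<h3>Step-by-step</h3>  <!-- fine: one level down -->
"""
print(heading_skips(page))  # [(1, 3)]
```

Note that this only catches skipped levels, not the separate mistake Jenae mentions of enlarging plain text instead of using heading elements at all; text styled big with font size alone produces no heading tags for this (or a screen reader) to find.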

Rebecca: I love that you’re really talking about how the instructional text is also part of digital accessibility. It’s important to have plain language, it’s important to chunk your content and these sorts of things. So I’m really excited that you’re incorporating that into the work that you’re doing.

Jenae: Thank you. It is exciting. I think it’s one of these things that, when Michael and I were first discussing this book, it was a real lightbulb moment for us that there was such a robust literature out there that discussed all these great principles for making sure that online information was easy to find. And it was striking to us that a lot of folks in teaching professions weren’t getting access or exposure to that information. And we started thinking about this, again, prior to the pandemic, in the mid-2010s. And even at that point, online courses were growing, and mobile access was becoming a more common way that students were engaging with courses. So, why not tap into these existing sets of conversations that are industry best practices for engaging with online interfaces, in spaces like higher ed and learning and development, where these dialogues seem not to have met each other as fully as they could?

Rebecca: Our chief technology officer and I were having a conversation about some of these things yesterday, as we talked about how our student body is diversifying and how we have far more students with disabilities who are able to attend college and have access to college in a way that maybe they haven’t in the past. And as you were talking about headings and paragraphs, something that people might not know is that if you use a screen reader, you’re not necessarily visually interacting with the text. Instead, you’re interacting with it programmatically. So just as a kind of vision-centered [LAUGHTER] user might skim headings visually, a screen reader user navigates by headings in the same way. By choosing a heading level two, you allow someone to find that section more easily. And by breaking things into paragraphs, and delineating what kind of content that is, you allow a screen reader user to jump to a particular part of the content. When we don’t do that, a screen reader user has to listen to everything from the top to the bottom of the page.

Jenae: Great example. Yes, and that’s such a frustrating experience to have to do that. If we can be just a little bit more attentive to the information architecture of what we’re trying to communicate and convey… information architecture is a technical term, but it’s also a metaphor [LAUGHTER]… we have architecture and we have design to help create solid foundations for the places that we live. Similarly, when it comes to information, we need to be building solid infrastructure to help people navigate their way through a course. One of my colleagues a while ago used a metaphor for online learning design I’ve never forgotten, and we’ve alluded to this a bit in the book, which is that when you’re building something online, it’s like you’re building a whole house [LAUGHTER]. When you walk into an in-person classroom, the architecture is literally there, and you make assumptions about the room and the space the second that you walk in the door. When you’re designing text online, or just when you’re thinking about the whole online learning experience, it’s a total blank canvas; you have to build that architecture and those hierarchies. If you’re not attentive, you’re absolutely right, the consequence is that it can be a big overwhelming mess of information. And I think it’s a useful practice for instructors, even when they’re not teaching online, to think about these things. It’s also just a great exercise in getting really focused on what information you want to prioritize when you’re communicating assignment instructions or when you’re picking out content-based readings for your students. What do you want them to focus on? What are the big things you really need them to learn or pay attention to?
And so if your course design, your visual design, can align with the hierarchy of choices you’re making as an instructor, or the priorities that you’re setting, it just makes it easier for everyone to have equal access to that information, so that students can spend more time focusing individually on how they’re processing, applying, and doing higher-order thinking with that information. They don’t have to spend so much energy just trying to take in the basics before they have the opportunity to really work with it and apply it meaningfully.

John: You provide a lot of other information in your book, and we encourage people to read it. If they want to find out more about creating videos, about providing effective webinars, and so forth, there are some really nice hints and suggestions throughout. But one of the things you end with is ways in which instructors can continuously improve their courses by soliciting feedback to make the course better each time. Could you talk a little bit about how you would encourage instructors to continuously work on developing their courses?

Jenae: Sure. So I really like that section of the book, because what I hope it communicates is that thinking about your course design is a reflective and an iterative process. I don’t think a course is ever really fully perfect and done; there are always things you can do and modify each time you teach or offer the experience. So, I don’t think getting feedback on the course has to be hard, and I don’t think it has to take a ton of time. We talk about multiple ways of getting information about how the course is working. And I’m going to start with some of the easiest and most passive ways to get information, and then we’ll work our way to some of the more active or personalized interventions for getting information about the course. So one thing I think is worth really paying attention to, after you finish teaching a course, are some of the analytics that are available in your learning management system or your course website. And I recognize that some folks are really reluctant to look at the analytics, because there is a surveillance economy implicated in the tracking of course analytics. Every site on the web tracks your movements; every site on the web knows how long you’ve stayed on a certain page and what things you’ve clicked on. And a learning management system is no exception to that. Unfortunately, that information can get weaponized to discriminate against students, to discriminate against users in problematic ways. On the web, outside of learning, for example, analytics can be gathered and sold to advertising companies to spread information about your activities for profit. So I just want to note that context, but you can also use this information for good and for some useful things as well.
So seeing which resources students are clicking on the most in your class can be really useful information for you to say, “Huh, seems like a lot of people found that resource useful.” You don’t have to identify which individual students looked at which resources; you can look at this data in aggregate, and again, most learning management systems have an analytics dashboard you can access to look at this. I think that’s incredibly useful just to see what was clicked most often and what wasn’t. You might also want to track, for example, which pieces of information students spent more time on. It could indicate a couple of different things: if students spent a lot of time on one particular piece of content over another, it might indicate that something was really challenging, or that they found it useful. You’d have to contextualize that data based on what you were seeing in the course. But I think if you’re willing to look at that information, again in the context of how your term went, it might just give you some passive information that could surprise you. I would even look at, for example, with assignment submissions, how many delays were there on certain assignments versus others? In which assignments did students request extensions more than in others? Again, this is just information that might help you decide whether the pacing was appropriate for the course, whether assignments were sequenced appropriately, that kind of thing. If you want to get more active, and if you gave a pre-course survey, you can do a post-course survey. Most institutions, of course, have formal evaluations of teaching, but we know that institutional student evaluations of teaching can be fraught. Sometimes they don’t ask the kinds of questions we find most useful as instructors. So if you do your own very brief post-course evaluation, you could focus it on the design of the course itself.
I think it’s worth asking students at the end of the course: How easy was the course site to navigate? How accessible did the materials feel for your ability to learn? You could return to some questions from your pre-course survey. If you asked a Likert-scale question about rating their confidence with learning something on a scale of one to 10 at the start of the course, you could ask them by the end, “How did your rating change?” …even referring back to the original data that they might have submitted to you with the pre-course survey. So that’s another way to ask them. And what I love to do at the end of the course, if it’s possible, is even a brief post-course interview with students. We mention this a bit in the book. Again, it’s time consuming. But if you have a small-ish class where you could have conferences at the end of the term, you could take a moment for just a five-minute conversation to ask students: “How did it go? What aspects of the course design did you like most? Which were most challenging to you?” That’s another way to get information. Finally, I’ll mention one more technique we write about in the book, which is to never discount your own reflection on your experience as well. This is another form of user research. Even though you are not the end user for the course, you are the designer, and so I think it’s always useful just to jot down a few notes and treat that as research when you’re done, too. What did you notice about user interactions on your course site throughout the term? What things surprised you? What things went exactly as you expected? You can use those notes to iterate and improve the experience for the next time that you offer it.
So those are just a few techniques, many of which, again, are drawn from the field of user experience research; surveys and interviews, for example, are pretty common user experience research practices. There are other UX research practices too: depending on your time and your resources, it’s great if you can watch students engaging in the course as well, asking them questions and really seeing what it looks like for them to interact with the course. That’s a good way to get at good information about it. I just want to encourage anyone who’s teaching not to shy away from getting that kind of feedback, because it does make teaching more satisfying, I think, when you’re getting more information about what’s working and what isn’t.
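[Editor’s aside for readers: the pre/post survey comparison described above can be as simple as pairing each student’s two ratings. This is a minimal sketch by the editor; the function name, the anonymized IDs, and the numbers are made-up illustrations, not data from the episode or the book.]

```python
def confidence_shift(pre, post):
    """Given pre- and post-course 1-10 confidence ratings keyed by an
    anonymized student ID, return each student's change plus the average
    shift across the students who answered both surveys."""
    # Only compare students with a paired response; sort for stable output.
    both = sorted(pre.keys() & post.keys())
    shifts = {sid: post[sid] - pre[sid] for sid in both}
    avg = sum(shifts.values()) / len(shifts) if shifts else 0.0
    return shifts, avg


pre = {"s01": 3, "s02": 6, "s03": 5}
post = {"s01": 7, "s02": 6}  # s03 skipped the post-course survey
shifts, avg = confidence_shift(pre, post)
print(shifts, avg)  # {'s01': 4, 's02': 0} 2.0
```

A per-student shift like this is only meaningful if the pre- and post-course surveys can be linked, so collecting a consistent (ideally anonymized) identifier on both surveys is part of the design choice here.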

Rebecca: So, you know this question’s coming…[LAUGHTER] We always wrap up by asking: “What’s next?”

Jenae: Yes, I do know it’s coming, and it’s funny, because I was thinking about it: what am I [LAUGHTER] doing next? So to be honest, I don’t have a clearly defined project. I’m doing a lot of little things, and I might be taking a little break, because I have written two books in about two and a half years [LAUGHTER]. So that’s been a lot… wonderful. I think I’ve been bitten by the writing bug, for sure, and so I suspect there’s more writing in my future, but nothing immediately next. I’m still very curious about what it’s going to mean to keep designing really good online learning experiences in the future; I don’t think we’re done with that conversation. I’m really curious about how that’s going to evolve in the context of creating more inclusive and equitable learning environments for students. So I imagine those are topics I will continue to explore to some extent, but we will see. And of course, with AI too, and the impacts of that on online learning, I’m sure there’s gonna be a whole set of ways to think about these topics that will continue to evolve. So I’m keeping my eyes open and my ear to the ground on how things are developing, and we’ll just see what ideas emerge from there.

Rebecca: Well, it’s always a pleasure to talk to you, Jenae. Thanks for all the work that you do.

Jenae: Likewise, thank you again for having me and for engaging with these excellent questions. And if you listened to this podcast, we’ll put a little gift in the show notes: a promo code. If you’d like to buy the book, we can give you a 20% discount, with thanks to Rosenfeld Press, who published this book.

John: Well thank you. We’ll be sure to include that in the show notes and it’s always great talking to you.

Jenae: Wonderful, and likewise, thank you again.

[MUSIC]

John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.

[MUSIC]

307. Career Readiness

Students do not always understand how the work that they do in our classes helps prepare them for their future careers. In this episode, Chilton Reynolds and Ed Beck join us to discuss one institution’s approach to helping students understand and articulate how their course learning activities intersect with career competencies. Chilton is the Director of the Faculty Center for Teaching, Learning, and Scholarship at SUNY Oneonta. Ed is an Open and Online Learning Specialist, also at SUNY Oneonta. Chilton and Ed have both worked on integrating career readiness skills into the curriculum.

Transcript

John: Students do not always understand how the work that they do in our classes helps prepare them for their future careers. In this episode, we discuss one institution’s approach to helping students understand and articulate how their course learning activities intersect with career competencies.

[MUSIC]

John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.

[MUSIC]

John: Our guests today are Chilton Reynolds and Ed Beck. Chilton is the Director of the Faculty Center for Teaching, Learning, and Scholarship at SUNY Oneonta. Ed is an Open and Online Learning Specialist, also at SUNY Oneonta. Chilton and Ed have both worked on integrating career readiness skills into the curriculum. Welcome Chilton and Ed.

Chilton: Hey, it’s nice to be here.

Ed: Thanks, John.

Rebecca: Today’s teas are… Chilton, are you drinking any tea today?

Chilton: I am. It’s afternoon here, so I’ve moved to iced tea. I make my own decaffeinated, slightly sweetened, peach iced tea for the afternoon.

Rebecca: Sounds nice and refreshing.

Chilton: Yes.

Rebecca: What about you, Ed?

Ed: I am drinking a Chamomile honey and vanilla tea, in a very fancy special mug.

Rebecca: Oh, that’s a Tea for Teaching mug. I wonder where you got it.

John: And I am drinking an Irish Breakfast tea today.

Rebecca: Also in a Tea for Teaching mug. I have Lady Grey, I think.

John: We’ve invited you here today to discuss your work at SUNY Oneonta in making explicit connections between course learning objectives and career readiness skills. Can you tell us a little bit about that?

Chilton: Yes, we’d love to. And again, thanks for having us. We’re excited to be here to share this project that we’ve been doing. We just finished up with our first cohort, where we are really trying to help our students make clear connections between what’s going on in the classroom and the career competency skills that they’ll be using after they leave college. So the focus of this program was really on helping faculty build into their courses times to allow students to reflect on what they’re doing in the classroom, and really say explicitly: “Here’s the skill that we are trying to build towards. Here’s what we’re doing in class. Now, as a student, practice actually making that connection. We want you to either write it and think about it in writing, or say it out loud, and practice saying it out loud, so that those connections can become as strong as possible after you leave this class.” We specifically targeted lower-level classes, because we thought that by the time students are getting to their senior seminars, they’re doing that in class already, but we don’t have these conversations in our 1000- and 2000-level classes. So the more we can do this in our lower-level classes, hopefully, when they get to those upper-level classes, they can say, “Oh, yes, I do remember talking about technology skills or communication skills early on, and I can make connections now between what happened in that class and what’s going on.”

Ed: Yeah, the big thing that we always are talking about in the instructional design field, in the faculty development field, we’re talking about authentic learning all the time. But I joke sometimes when I say like if a student completes a course built on authentic learning, but can’t talk about it in an interview, or articulate it to themselves or others, did it really happen? And this is our practice. This is us saying, “if we’re going to do all the effort to make sure that our courses are built on authentic learning, we’re building authentic tasks into them. Let’s go ahead and do the next step of reflection, of practicing, so that students are prepared to speak about it.”

Rebecca: Can you talk a little bit more about how you rolled it out to faculty? You’re talking about working on it through the center, and then getting faculty to adopt it in these lower-level classes. Can you talk a little bit about those details?

Chilton: Yes. So we started with a call for faculty. We had actually gotten a grant; there was local money from our institution to be able to do this. Our incoming president had created… we didn’t have a strategic plan at the time, so he created an initiative called “Regaining Momentum” that was very much focused on re-engaging our students, both incoming students and current students. And one of the focuses was on career readiness, so that was kind of “how do we help our students make those connections?” We applied for the grant, we received it, and in doing that had promised that we would do this over three years, the first year being our first cohort. So we put out a call for proposals and went to a couple of faculty that we knew were doing some of this from some of our previous work, and said, “Would you be willing to be a part of this?” and then also had the full call for everybody across campus. We were looking for 10 faculty, and I think we had 11 proposals to begin with. After that, a team that was built from across campus, and we can talk about that in just a second too, reviewed those proposals, and then we had a cohort of 10, which is what we had funding for in the first year, and we went through the process with them over the year.

Ed: In our center, we’ve really been thinking about how we can focus on the student experience. We’re in a transition phase right now: we used to be known as the Teaching, Learning, and Technology Center, and as of July 1st, we’re now the Faculty Center on campus. We were thinking about how we stop leading with technology. We were always thinking about teaching, but we wanted to lead with that. And one of the things that we were doing was focusing in on the AAC&U high-impact practices. We went through that long list of high-impact practices and said, “Okay, what fits into the work that we are already doing as a center?” and identified some of them. We had already been doing sessions and cohorts on project-based learning with our faculty members, we had already been investigating and helping build ePortfolios, and we always saw ourselves as the collaborative learning people. So what we wanted to do was create a cohort of people that were thinking about this, tie it to a goal that we could keep coming back to, and have these faculty meet with each other throughout a semester to really create a community around a central idea. That’s where the idea really came from: to keep reconnecting through the semester and focus on building that community, versus the one-off presentations that faculty development can sometimes feel like.

Rebecca: Can you talk a little bit about the career readiness competencies that you’re focusing on?

Chilton: Yeah, when we started this application process, we were connecting with other groups across campus. And one of those, when we talk about career development, should be the Career Development Center. So we reached out to them, and they talked about how they were using the NACE career readiness competencies. NACE is a national organization that connects what’s going on in the classroom to careers afterwards. There are eight competencies, and they align a lot of their work specifically with those. Additionally, we found out that some of our co-curricular activities also aligned with the NACE competencies. We have a Lead program, which is a leadership program on our campus, and that uses the NACE competencies as well. The School of Liberal Arts has a program going on right now where they were trying to do a lot of this similar work outside of the classroom, helping students connect what they had been doing in their classes with the NACE competencies. So we found there was a lot of work already happening on campus, and we really wanted to make sure that we aligned with that as well. What we like about the NACE competencies is that they really align with a lot of the work that goes on in our classrooms. That’s what resonated with us: we’re focused on what’s going on in the classroom, and on how we can help support faculty in doing more useful work inside the classroom, and the NACE competencies really do that. So we think about things like professionalism, communication, critical thinking, teamwork, technology, and leadership. Then there’s equity and inclusion, and career and self-development.
Those soft skills… that’s the word that got used a lot in the past to kind of say, “Yes, we do these things,” but we didn’t really help students make those connections between what’s going on. So we felt like it was a great framework to take into the classroom and say, you’re doing this as faculty, you know you were doing this, but the students don’t always know that they’re doing this, so how can we help do that? The other thing I’ll follow up with is that, as we were exploring this more, we reached out to the POD Network and actually found out [LAUGHTER] from SUNY that there was already work going on around some of this as well. So our Center for Professional Development has a whole certificate program around connecting career readiness skills into the classroom, and our use of the NACE competencies is a part of that as well. So there were really a lot of tie-ins; we saw really strong connections between what was happening on our campus and things that were happening locally.

John: We have talked about that, to some extent in our previous podcast with Jessica Krueger, and we’ll include a link to that, in the show notes.

Chilton: And one thing I’ll follow up with, John, is we have a couple of pre-professional programs. And this seems to fit really well there; career readiness makes sense when you have a pre-professional program that’s preparing you for a specific profession. We were also trying to reach into our liberal arts programs, into our science programs, into lots of other programs that might not be as focused on a specific profession, but still are connecting into these career readiness competencies.

John: And since we’re doing these things in the classes anyway, it’s nice for students to be able to recognize that these are skills that are going to be helpful for them in their future careers. And when they can see that, I think that may help provide a little more intrinsic motivation to engage in these practices and develop those skills. How have students responded to that?

Chilton: So we are in the first year of this, and this is one of the things we were reflecting on as we were preparing for this: we realized our first year was focused on what faculty are going to be doing. As Ed said, we’ve been working with some faculty on this that have been doing it on a smaller scale. But as far as this program, we’re looking forward in year two to really hearing from students and hearing how that’s going to go, so we’ll have to provide some feedback in the liner notes later on to let you know what we hear from the students.

Ed: Yeah, I’m gonna lead a committee to do the IRB and create some surveys to send out to students that are part of the program, and have a little bit more of that student voice that we can report back on. Because I think that’s really important. It grew out of a proposal like that, which I’ll talk about a little bit later. But we had done student interviews and student feedback once before, and that really helped create this framework that we are now trying to set up with a cohort of faculty members.

Rebecca: I really love hearing that you’re using NACE across your institution in different spaces. So you mentioned that the Career Development Center is doing it as well as your center. Can you talk a little bit about how that collaboration is working?

Chilton: Yeah, so we really see this as a partnership. And it’s one of the things that we really tried to be intentional about early on. Because when you say career readiness, that is a Career Development Center thing, and we don’t want there to be any perception that we’re trying to take over what they’re doing; we want to support them so that when students come to them, they are more prepared. A part of the original proposal was going to the Career Development Center and saying, we want to do this with you, would you be willing to partner with us? We can do more of this in the classroom. It was very much a partnership, it was very much us wanting to say, “What is it that you do in the Career Development Center, and where can we help support you?” And then make sure that we feed into what you’re already doing. So there’s not any appearance of us coming in and trying to take over your programming, but just helping our students be more prepared when they come to your program.

Ed: Actually, we had a great day at a winter workshop where the Career Development Center sat with our faculty and said, “Here are some of the things that we are already doing, here are the services that we’re purchasing, here are the things that we do at one-on-one consultations, here’s what it could look like if you invited us into your course.” And some of our faculty members did that and invited the Career Development Center into their course to speak. And some of our faculty members were doing other things that incorporated the competencies but didn’t necessarily incorporate an outside group like the Career Development Center. So we had a wide range, even among our cohort, of what they were doing. During that winter workshop that I was referencing, we actually brought in an outside trainer. And that was really nice. Chilton mentioned that the SUNY Center for Professional Development, the CPD, had already been doing a four-course sequence on the NACE competencies, which was really meant for a variety of professionals, not just faculty. But the instructor that came highly recommended to us was Jessie Stack Lombardo, also from SUNY, the SUNY Geneseo Career Development Center Director. And she came in and did some workshops with us and the faculty, thinking about what are the small things that we can do in our classes that help students reflect, that help students make those connections?

John: Could you tell us a little bit about the impetus for starting this program?

Ed: Yeah, so even before the program, we were, of course, working with wonderful faculty members here at SUNY Oneonta. And one of the things that we’d been doing quite a bit was thinking about making websites and ePortfolios, having opportunities for students to build their own web space, build their own web presence. So even before the cohort happened, we had one great instructor that said, “Hey, I would really love to be thinking about building ePortfolio projects into my course, would you help me do that?” During this time, John, you know, we were doing SUNYCreate, a domain of one’s own initiative. We were giving websites to students. That was a technology-focused initiative. But we were doing a lot of these things already. I said, “Yeah, let’s come in. Let’s do that.” I was invited into that class several times. And we were so proud of this course, the way it came out. I want to give such a big shout out to Dr. Sarah Portway. She later went on to win the Chancellor’s Award for Excellence in Teaching based on a lot of the work that she was doing. They were building a fashion magazine online; the students were taking the articles they submitted for that fashion magazine and also bringing them back to their portfolios and showcasing them on their own sites in addition. And she said, “Hey, why don’t we take this thing on the road? Why don’t we go to the AAC&U’s Institute on OER and ePortfolios?” And we said, “Okay, let’s do an IRB, get some student feedback from that, to bring to the conference so that we have that student voice when we go through.” And the feedback was fantastic. Students really responded to it. It was a wonderful presentation. But we were also starting to realize during those interviews: it wasn’t a negative, but it wasn’t all positive. Students were still not making all the connections between the things they had practiced and the skills they had acquired, and being able to articulate that. 
I have this memory of a student saying to me, “I wish I could have put this on my portfolio, but it was a group assignment, so I can’t put a group assignment on my personal portfolio.” And I remember just kind of stopping the interview format and saying to her, “Oh, I would absolutely put that group assignment on your portfolio if you’re proud of it. I would absolutely describe what it is you did to contribute to that group atmosphere and talk about how you can be a successful collaborator and describe how you work in team environments. And then put that thing that you’re proud of, that artifact that you’re proud of, on your portfolio, but also with the framing of what it means for you to be a good teammate and what it means for you to be a good collaborator.” And the student said, “Oh, I never thought about it like that, I guess I could do something like that.” And Dr. Portway, being a fantastic instructor, never being satisfied with how things went in the last class, was kind of like, “We need to think about this a little bit more. We need to be more explicit. We’re already doing all these authentic assignments. And at some level, it’s hitting. And we definitely want to keep going down this road. But on some level, we are missing something in helping those students make those connections. What do we need to do in the classroom activities, in the way that the assignments are presented, that helps walk them through that to make them just a little bit more prepared? Because the authentic skills were already in the course. They just needed help making that connection.” And that was really the thing for me: I walked back from that experience and knocked on Chilton’s door and said, “We need to be doing more of this, and we need to be doing it not one at a time, but with groups of faculty members.” And that was really important to me.

Chilton: And what was interesting to me, to carry that on a little bit more, was that when we first had this proposal, ePortfolios was in the proposal title. We were really focused on wanting to do ePortfolios for everyone. And some of the feedback we received was “Yes, ePortfolios can be a part of that, but this could be a much wider conversation,” which is again how we got back to NACE. There are these bigger frameworks that we can be a part of. So we went from “yes, here’s this great tool” to “no, no, no, let’s look at it from a framework perspective.” And now we’re at the point where we’re like, yes, some of the projects will be ePortfolios, some of them will be other things. And that’s okay. This was bigger than the tool. This is about helping our students think about what they’re doing and helping them connect to things that will be useful for them after they leave college.

John: One quick follow-up: you mentioned that you have groups of faculty who worked on it. As I understand this, this was a faculty learning community that you put together, where faculty received some slight funding or a small stipend as part of their participation. Have you done any work there with entire departments in revising their curriculum yet?

Ed: No, we haven’t done the departmental level work yet. Right now we’re focusing on coalitions of the willing, having faculty who are interested in these types of things. One of the hopes is that after doing three cohorts, having worked with multiple faculty one year, then the second year, then a third year, new faculty each year that we can get to a point where we’re ready to have a bigger discussion. There was one participant in the group that was really focused on making a freshman ePortfolio with the explicit reason to keep contributing to it throughout the program. And I think that shows a lot of promise. But we’ve still got to do some work to get the buy-in from the rest of the department to make sure that it gets used. So I mean, there is a lot to be done there. And it’s one of the things I’m hopeful for the future.

Chilton: To add on to that, one of our goals out of this is to be able to build a repository where we can share, and our hope is that we can have enough examples that when we go to a department, we can say, here are some small changes you can make. Ed had mentioned this earlier: we want to have a breadth of “here are some small tweaks you can make” or “here are some larger things you can do,” and to have examples that are multidisciplinary, with a wide range of implementation needs as well as examples from different departments, so that when we go to a department, we can say this doesn’t have to be a large change, it could just be making some small changes to help those students make connections.

Rebecca: So I wanted to follow up on an earlier point that you were making, from an experience that I’ve had as an instructor, and I’m sure many other instructors have had: you work really hard to make these kinds of career-ready activities, things like professional email writing, and portfolio projects, and team projects, the list goes on. We do many of these things as instructors, and then you inevitably have this conversation, a one-on-one maybe, with a student. And you just realize they have no idea why they were doing any of the things, and you’re like, “Oh, I failed the student clearly [LAUGHTER]. I could have done a better job.” And so it seems like frameworks like NACE could be really helpful, both for instructors and students, to just be more explicit about those things and to practice talking about them. Can you talk a little bit about that piece of the puzzle?

Ed: Yeah, I think it’s so important to have small opportunities to embed a skill or embed a practice in there. So I’m going to start off with a very small thing that I think anybody could throw into their class. At the end of the course, it’s reflection time; we want to talk about what you learned. Let’s take a moment and think about a common interview technique: the STAR interview technique, you’ve probably heard of it, where you describe a Situation, then you have the Task that you were assigned, the Action that you took, and then a Result. Explain that, say, “Hey, this is how a lot of times we make sure that we have an action-oriented response to an interview question.” Now talk about this course using the STAR method. What situations did your instructor put you in? What were you asked to do? What did you do in order to be successful there? And then is there anything else you want to share about it? Are there next steps that you should continue to take that your instructors put you on the path to? Or are there things that you’ve realized about yourself that you need to continue on with for the future, for the next thing? That’s so simple, it’s not rewriting an entire course. Yet it’s a little opportunity to say, this is important, and what we did had meaning, so take a moment to integrate that into your context. How will you talk about this course in the future?

Chilton: What was interesting to me when we were doing this was that when we first started out, we listed a whole bunch of sample outcomes that get at what you were talking about: “I’m going to do this email, I’m going to have them do this thing, and it’s going to be great.” And as we got into the conversation with our faculty, we realized that what we were missing was really creating the places for the students to practice making the connection. We have the students practice the skills all the time; we’re like, “Yes, we do this.” As the faculty member, we understand that there is a connection between this and career readiness, but unless the students are actually practicing making the connection, not just doing the action, but making the connection, then it doesn’t always stick for them. And so that’s where we started to shift from, what do we want the faculty to do, to, how do we want the students to practice this so that it does stick for them? So it is meaningful for them in a way that they can think about it again, hopefully, a year, two years from now, when they’re finishing their college career and starting to think more about career readiness. That was a shift for us: from what is the faculty member going to do, to how do we help the students really intentionally practice what they are doing, practice talking about what they have done, and making that connection to, in this case, the NACE framework, because we thought it was such a good framework to talk about.

Ed: I feel like we’re saying NACE too often. So I feel like it’s always helpful to be a little more specific. So let’s talk about communication. We’re teaching communication to students all the time. One of the key aspects is audience. So have the conversation with your students. When we communicate to different audiences, we use different standards. So part of the reason why I’m asking you to write a more formal paper, in research-style format, is I want you to be prepared to speak to other experts in your field. But when we shift to the oral presentation, I want you to adjust your language so that you’re speaking to a non-expert: you’re speaking to your future colleagues, you’re speaking to a potential customer. And when you make that switch, make sure it’s intentional. And then at the end of this course, I’m going to ask you to reflect on that, to think about what choices you made when you were speaking to someone who you expect to already understand and be embedded in the discipline, versus someone who you do not. How is it different when you talk to a colleague versus when you talk to your friends and family about what you do? That’s an important communication competency. So let’s talk about it and the intentional choices that we can be making.

John: How many faculty members and how many departments were involved in this project so far?

Chilton: So we had our first cohort, and in that cohort, we specifically targeted having 10 faculty. But we were very specific about trying to have faculty from as many departments and as many schools as possible. So we have three schools on our campus; we ensured that we had representation from all three schools, and we ensured that we had representation from multiple departments. So in the end, we had nine different departments as a part of this. We did have overlap in one department, from two of our participants. As we said before, we did focus, and said in the call, that we wanted you to work with a 1000- or 2000-level class. And so that was part of the call as well. So we actually had a couple of people that applied for this that were planning on doing it in a 3000-level class. We reached out to all of them and said, “Do you have any lower-level classes that could be part of this?” Two of them said yes and one didn’t, which was the one person that we weren’t able to take in. We’re focusing in years two and three, again, on lower-level courses, and going to try to continue to have faculty from as many different departments as possible, so that when we get to the end of this, again, we have a nice repository of examples from as many different disciplines and as many different schools as possible.

Ed: And we can invite some of the cohort I faculty back as mentors, and we can incorporate them into year two in a different way, as we continue to try to build a larger community and push a conversation that we think needs to happen on campus.

Rebecca: Can you talk a little bit about what was expected of a faculty member who was accepted into the program?

Chilton: So we spelled that out upfront. We had already been planning on our campus what we call the SUNY digital learning conference, which was focused on open and public education. And we purposely built in a track in there that was about career readiness. Originally, as we’ve been talking about, we were focused on ePortfolios, and so I thought a lot of them would be doing ePortfolios. But in the expansion of that, we wanted to make sure that we really talked about how we can make connections. So we said that we would pay for the faculty to be a part of that conference. So they attended that conference in November of 2022. And that was the first part, kind of the kickoff for this cohort. We then had a full-day workshop in January; as Ed had talked about earlier, we brought in Jessie Stack Lombardo from SUNY Geneseo to be our speaker for that, and she wasn’t just a speaker, she really planned the day, and it was very highly interactive with those ten faculty. So as Ed said, we had a staff member from the Career Development Center who was a part of that and presented locally. Jessie then talked about some different frameworks you could use, including NACE, and how you can start to think about both small changes and large changes. And then we had said in the call that the expectation would be that by the end of the Spring 23 semester, they would turn in a revised syllabus and examples of work that they are doing to the group. We realized that wasn’t specific enough. So we then created a rubric that focused in on three specific areas of what they would need to do. The first part of that rubric is what would be the changes they would make in their syllabus to really spell out the NACE competencies: Are you focusing on all of them? Are you focusing on one of them? There didn’t have to be a lot in there, but we did want it to be addressed in their syllabus in some way. 
And then, what is the activity they’re going to be doing where they actually have students practicing, and how will the students receive feedback on that? Well, there are three levels: “not present” is a part of the rubric, and then we had two levels of “yes, it meets expectations.” And we were thinking again, what are those small changes that could happen? But then also we had an “above expectations” level: what’s something, if you were really dreaming about what it could be, where could you take it and what could it be? So we kind of wanted to have “yes, you meet expectations,” which would help us get small changes that would be usable by everybody, and then, what could this look like if you really wanted to [LAUGHTER] dive into the deep end with it and explore what could happen with it a little bit more, and make it so that it was better for students, not just in the class, but beyond the classroom.

Ed: Yeah, the only thing I’ll add to that, Chilton, is we also met once a month during the spring semester. So we had recurring meetings throughout the Spring semester. With those rubrics, between “present” and “highly effective” (and we’ll share the rubric so that you can put it in the notes if you’d like), we were starting to think about whether you not only incorporated the NACE competency in your course, but also presented your prompts or your activities in a way that gave the students the opportunity to think about future activities they could take on, future things they would want to do. And that was really important to us as we were doing it: to not only create a moment of reflection for the students at that moment, but also to make that connection of, okay, now that I’ve had that moment of reflection, now what? Should I be picking out some different courses? Should I be finding an internship? Should I be doing something now to set myself up for success? And so we don’t get that panicked feeling when the student is at senior year and they go into the Career Development Center and say, “Okay, what do I do now?”

John: What type of incentives were offered to faculty to participate in the program?

Chilton: So we spelled that out: we paid for their participation in the conference in November. So they were able to go for free and participate. It was on our campus, which made it easy for them to go. It was just their conference registration that we covered as a part of this. In addition, we paid them a stipend for attending the January workshop. Officially, it was a $90 stipend to attend the workshop. And then when they completed and turned in the final version of their revised syllabus and examples of activities, there was another $510 stipend. So in total, it was a $600 stipend. But as a part of that final revision, we actually did review their submissions, looked at the rubric, and gave them feedback… to a couple of people, we said, “Hey, you’re missing…” and asked them to go back and do some additional work. So we did hold them accountable to that rubric before the final stipend. And so it was a useful and interesting conversation when the leadership team met to look at those and say, “What do we like about this? What are we thinking for cohorts two and three? What might we ask for more specifically next time to make this even more meaningful for our students?” So we’re already starting to think about cohort two and looking forward to that for next year.

Rebecca: Can you talk a little bit about how faculty responded to their participation?

Ed: We take the faculty’s response and the feedback they gave us really seriously. We gave them the opportunity: they had the reflections that they were doing, where, of course, we knew who was speaking, but we also gave them some opportunities to give us anonymous feedback, so that they could tell us how they really felt about us. And we were just really pleased with it in year one. We do recognize that we have to keep honing our message; we have to keep defining what we mean by career readiness, and what we mean by incorporating it into class. We need to have our elevator pitch a little bit more refined and down. Because what’s evolved through this conversation is, we’ve really talked about how the skills are already there, but we can be more intentional about it. And we can be intentional in the ways we ask students to reflect and practice, in ways that we really believe can be beneficial for students. But that can still be a difficult conversation. When people see career readiness in 1000- and 2000-level classes, some people bristle or are turned off by that because they’re thinking, “Oh, just one more thing that I have to do.” Now, we didn’t get that from our participants in the cohort that much, because they applied and they came here on purpose. That was nice, to have a group that really wanted to be here and was willing to try some things with us in this space that we were creating. But overall, I would say that the feedback has been very positive.

Chilton: Looking through the feedback from faculty, there was one quote that stuck out to me that I’ll read quickly. It came from one of our professors in our communication arts department, where this professor said, “Students said that they felt more confident.” So this is actually one of the professors we recruited into this program that had been doing this already. This professor did have some experience with students doing something like this, but said that “Students felt more confident in their skills as a professional, and were able to articulate how the experiences they had in my class connected to the expectations employers would have of them. They also appreciated being told why we had to do certain projects and how they help them transition from college to life after college.” And so I think that really speaks to how the professors enjoyed having time to be able to do that.

John: We only have courses from 100 to 500 levels. It seems there has been a bit of a course number inflation there at [LAUGHTER] Oneonta. That was just a joke. I’m sorry.

Chilton: We were told that everybody in SUNY was moving to four-digit course numbers, and so, over the past two years, it was this really big project that we did to move from three-digit to four-digit numbers, because, at least as we heard, everybody in SUNY is doing this, so we have to do it. It’s very intriguing that not everybody had to do that. [LAUGHTER]

Ed: And simultaneously, as we were going from three digits to four digits, we didn’t have 400-level classes previously. And the feedback we were getting was that that was seen as a deficiency by some people who were reviewing our students’ transcripts, even though we called all of our upper-division courses 300 level, and people applying to professional schools would get that explanation and understand why there weren’t 400-level courses. Other people, who are maybe not as skilled at reading a transcript, are like, “Well, did this student avoid all 400-level work?” And so simultaneously, as we were adding another digit, we were also transitioning to having 1000-, 2000-, 3000-, and 4000-level classes. So that was a big change that took a lot of curriculum writing and mapping over the last two years. It wasn’t just as easy as adding a zero on to every course.

Rebecca: Sounds like such a fun project. Sign me up. [LAUGHTER] So we always wrap up by asking what’s next?

Chilton: So we are excited about Cohort Two. We are going to be starting our recruitment in the fall. We actually have a fall faculty institute on campus. This year it’s very much focused on the communities of practice that are already happening on campus and how you can get involved. And so that’s going to be our first, not our first, but that’s gonna be one of our big recruitment pitches for Cohort Two. In Cohort Two, we are looking to include more faculty from a wider range; we are going to be starting to get into faculty that might not have as much experience in doing this. So we are thinking about how we hone our pitch and how we focus this for a wider audience, to be able to say, “No, this is not big changes in your classes. This is just asking one additional question, or allowing space one additional time for students to practice connecting what you’re already doing to these career readiness competencies.”

Ed: And I would say what’s next for me is that this experience has really solidified the idea for me that we need to continue, in faculty development centers, to make spaces where faculty can repeatedly come back and interact on the same topic, getting away from that kind of one-and-done workshop, and identifying major things that we want to return to through the year, inviting people into that space to share. Because when those faculty get an opportunity to think and show off what they’re doing, it really is a wonderful spread of ideas. And you get a lot from all the energy in the room.

Rebecca: Well, thank you so much for joining us and sharing how this project’s unfolded at your institution.

John: And you’re both doing some really great work at SUNY Oneonta and it’s great to keep in touch and thank you for joining us.

Chilton: And thank you. It’s a pleasure to be here with you. So thanks for taking the time to be with us today.

[MUSIC]

John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.

[MUSIC]

306. Gender Bias and Timing of SETs

A number of studies demonstrate gender bias in course evaluations. In this episode, Whitney Buser, Jill Hayter, and Cassondra Batz-Barbarich join us to discuss their research that looks at the timing of when these gender differences emerge and theories for why they exist.

Whitney is the Associate Director of Academic Programs in the School of Economics at Georgia Tech. Jill is an Associate Professor of Economics in the College of Business and Technology at East Tennessee State University. Cassondra is an Assistant Professor of Business at Lake Forest College. Whitney, Jill, and Cassondra are the authors of an article entitled “Evaluation of Women in Economics: Evidence of Gender Bias Following Behavioral Role Violations.”

Show Notes

Transcript

John: A number of studies demonstrate gender bias in course evaluations. In this episode we discuss research that looks at the timing of when these gender differences emerge and theories for why they exist.

[MUSIC]

John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.

[MUSIC]

John: Our guests today are Whitney Buser, Jill Hayter, and Cassondra Batz-Barbarich. Whitney is the Associate Director of Academic Programs in the School of Economics at Georgia Tech. Jill is an Associate Professor of Economics in the College of Business and Technology at East Tennessee State University. Cassondra is an Assistant Professor of Business at Lake Forest College. Whitney, Jill, and Cassondra are the authors of an article entitled “Evaluation of Women in Economics: Evidence of Gender Bias Following Behavioral Role Violations.” Welcome, Whitney, Jill, and Cassondra.

Whitney: Thank you for having us.

Cassondra: Thank you so much.

Rebecca: Today’s teas are:… Whitney, are you drinking tea?

Whitney: I am. I have some jasmine tea.

Rebecca: Always a good choice. Jill. How about you?

Jill: Harney and Sons Hot Cinnamon Spice.

Rebecca: Oh, that’s such a good choice. I love that one. It’s a family favorite at my house. How about you, Cassondra?

Cassondra: Yesterday, we made a sun tea on the porch. So it’s sweet peach tea.

Rebecca: This is a good variety. How about you, John?

John: And I have ginger peach black tea from the Republic of Tea.

Rebecca: So we’re combining choices here [LAUGHTER]. And I have Awake tea, despite the fact that it is early afternoon here.

Jill: I also had three cups of coffee this morning.

Rebecca: It’s one of the most popular kinds of tea, Jill.

John: We’ve invited you here today to discuss your research on gender bias in student evaluations of instructors. Could you tell us how the study came about?

Whitney: Jill and I have been working on this for about six years, believe it or not. It’s been a long process for us. And actually, at the very beginning, we had a different third co-author working with us. The original three of us met at a conference, where we had just attended a session that talked about teaching evaluations. And afterwards, we just naturally began talking about this, because we all had these really, really strong feelings about teaching evaluations. All three of us at the time were young: young in our careers, young age-wise. We were female PhD economists. And we were all earning tenure, or I think Jill had just earned tenure. But we were all having this similar experience of what we felt was a very positive class climate, and a lot of camaraderie between ourselves and the students, until the grades were returned for the first time. And then we could feel a definite shift, and it was upsetting to all of us. We all got into this because we love teaching and we want to do a good job at it. It was just something that we were picking up on. So that was our anecdotal experience. Jill had a little data on it herself, because she would do mid-semester evaluations, just to gauge the class climate and see what students were needing. And I had an experience where, in my first position, they did a surprise midterm evaluation, just to see how the new professor was doing, that I didn’t know about. And I got glowing reviews from the students; everything was very, very positive and wonderful. Six weeks later, same students, but grades returned, the evaluations looked a little different. And the comments were a bit different. So we had a little data to back up this idea too. And one thing, if the people listening today haven’t read the literature: there’s an extensive literature on course evaluations. And it consistently finds gender bias in those.
But the thing about that literature is it only looks at evaluations, which are typically done on the very last day of class, maybe even after that, or maybe a couple of days before, but at the end of the semester. And we really haven’t seen anyone look into how these opinions of students evolve over the semester, or how students feel at the beginning or the middle of the semester. So that’s what we wanted to do with this study. And in my opinion, and this is just me speaking here (Jill can have her own motivations, and the other co-author that has worked with us before could feel differently), for me it was really important to acknowledge that society has come a long way in the past several years with gender bias. And I don’t think that modern students are shocked by female faculty any longer; I don’t think they have an explicit distaste for female faculty. Anecdotally, I feel that my students are actually happy when they meet me. And they have expectations of me to be warm, comforting, approachable. But I do think that when you expect someone to be more comforting and approachable, and they give you back a grade that’s not always an “A” in a difficult quantitative subject like economics, you can get a bit of a Grinch Who Stole Christmas effect: I thought it was going to be one way, and now my expectations have been taken down. We all know no one likes that dopamine depletion of having expectations not met. So, to me, if we’re going to talk about gender bias, we really have to talk about it in this nuanced way, so that it doesn’t get automatically dismissed by people who don’t see an explicit bias and then say, “Oh, hey, there’s nothing here.” And the last thing that I think is really important here for the motivation of the paper is that we had this expectation that bias would grow over the semester. So if bias grows over the semester, that means the earlier in the semester you evaluate, the smaller the bias will be.
And one thing that the literature is missing is a very concrete, objective way to deal with bias. What we were hoping to find was: move the evaluations up in the semester a bit, and you minimize or eliminate bias, and that’s a concrete, objective solution. Towards the end today, we’ll talk about what we actually found and whether or not that held up. But that was one of the motivations.

Jill: So that’s how the original paper came about in terms of motivation. But then Cassondra, who has a PhD in Psychology and was doing work in the area, had read our paper and its results, and she reached out to Whitney and me. And so a second paper with Cassondra takes a more psychological approach to a lot of what Whitney is talking about, with respect to role-congruity theory and social role theory, which Cassondra is going to talk more about later. Whitney’s described the motivation of that first paper; the second paper takes a very different perspective, looking at it from a more psych angle. Cassondra, you might want to chime in?

Cassondra: Absolutely. I think you summarized it well. I joined the paper as Whitney and Jill were trying to find a home for it. And we thought that our interests, though coming from very different backgrounds, would blend nicely for this particular topic, as there’s a lot of scholarship in psychology that looks at understanding the reasons behind this bias. And so I was brought in to really help think about how we frame that in a way that might appeal to an even broader range of audiences.

Rebecca: At the beginning of the paper, and Whitney, you’ve kind of pointed to this today about being a young faculty member, you also noted in the paper that women are underrepresented among economics faculty, especially at the level of full professors. Can you tell us a little bit about the extent of this under-representation?

Jill: Women have earned more than half the doctoral degrees for over a decade, but they are particularly underrepresented among tenure-track faculty. In the paper we cite that 36% of full professors are female. In economics, that’s a smaller percentage: 17.5% of full professors are female, although 35% of PhDs in economics are earned by women. It’s a smaller percentage of female faculty receiving full professor rank in economics; that’s what we mean by that underrepresentation. In terms of economics specifically, it’s oftentimes left out of the STEM fields, and depending on which university or college you’re at, economics can sometimes be found in the social sciences within arts and sciences, or it can be found in the business school. At my institution, Whitney’s institution, I believe, and Cassondra’s, I think we’re all in the business school. But sometimes, when economics is put in there with the social science fields, it’s not thought of as being this more quantitatively heavy subject, although it oftentimes is by nature. And so female faculty in those more math-heavy classes face what women in the STEM classes face. When I started off, and I think Whitney was getting at this with us being more junior faculty members, I was considered by students a peer, instead of the professor in the course. And that made it tough, because, to Whitney’s point about returning grade feedback, the perception that students had of me at day one versus midway through the course had shifted: I was now coming across as someone that was handing back maybe less than a 100% or “A” grade. And in my business school, my principles of economics courses are required. Students might not even want to be in there, but they have to be in there to get a business degree. Earlier on, that was a challenge I faced. I’m 13 years into my career; I’m going up for full professor this summer. But starting off was really a challenge.
And I remember having female mentors in my graduate program. They tried to prepare me for this; they tried to say it’s going to be challenging early on, you’re going to have to go against some of these perceptions, a lot of the perceptions that we measure in this paper.

John: To what extent is the underrepresentation of women faculty due to a cohort effect where women have become a larger share of PhD economists in the last few decades, but that was less true 20 or 30 years ago and how much of it might be due to the impact of gender bias on evaluations on career pathways for women?

Jill: Really, what this paper looks at is the standard evaluations of teaching and the bias, or potential for bias, that exists there. So I’ll just speak to that. Where I currently am, evaluations of teaching are weighted heavily for retention of faculty and for tenure and promotion decisions. And when we’re hiring new faculty, we look at any previous course evaluations and experience with teaching. At every level in academia, these are used as some gauge of teaching effectiveness. I think one of the questions that we, and accrediting bodies, are looking at is whether or not this is the measure that should be used, and what different measures might be options for measuring teaching effectiveness. We know that they’re flawed; our study is showing that they’re flawed, but previous literature has suggested that they’re flawed as well. And so the fact that for most schools, this is the single measure that’s being captured… and I know that it’s different depending on the institution: at my institution, some departments don’t give them a whole lot of weight in tenure and promotion decisions. But certainly, in my experience in my College of Business and Technology, these are weighted heavily. So think about a junior faculty member starting off. When Whitney and I met at the conference, my evaluations were lower, and I was putting a lot of time into my teaching, improving and bringing up those scores. My male colleagues, in discussions with them, didn’t have the same experience that I was having with respect to these SETs. And so when we think about allocation of time and resources as a tenure-track junior faculty member, I was putting more into what I would consider just catching up, getting those SET scores higher so that it’s reflected in my tenure and promotion packet. And that’s less time that I was allocating toward research or other things. That’s my view on it. I think Whitney has a couple of other thoughts on that.

Whitney: One of the things we tried to make clear in the paper is that the literature is very clear that evaluations do have a gender bias. And if these evaluations are being used, and they are, in hiring decisions, annual evaluations, promotion and tenure evaluations, and merit pay raise decisions, then they’re being used at every single level of advancement. It’s not one small piece. It’s a piece that’s used throughout and deeply integrated into the process.

Rebecca: You mentioned at the top of our interview that the second paper shifts more towards psychology, and specifically describes ways in which both social role theory and role-congruity theory may explain the bias against female faculty in student evaluations. Can you briefly summarize these arguments for our listeners?

Cassondra: So social role theory is a theory that has been put forth for decades by Alice Eagly, a very prominent scholar in the social psychology world, as well as her colleagues. And this has been used as a framework to really understand the complexities and origins of gender gaps in the workplace in particular, whether that be inequities in experiences, the expectations that are different for women, and of course, the outcomes, such as promotion at work. Essentially, social role theory suggests that the gender inequities we see today in society originated from men and women being distributed into social roles based on physical sex differences: women biologically were able to have children, and men, on average, were physically stronger. Thousands of years ago, those differences had an evolutionary benefit to a well-functioning society; people were supporting it in the ways in which they were best equipped to do so. And the assignment of men and women into these roles led them to adopt role-specific qualities and skills. So women, who were bearing children, were friendly, helpful, sensitive, concerned with others, kind, caring. We refer to these now as more communal qualities. And for men, the provider and protector role led them to have attributes such as ambition, being assertive, authoritative, dominant. These are qualities that we now label as agentic. So while technology has of course since caught up and made these biologically driven role assignments unnecessary, we continue to see a division of labor along these lines in the modern world, and society at large still holds the belief that women do possess, and should possess, these more communal qualities, and that men do and should possess more of these agentic ones. Relatedly, role-congruity theory helps us understand the consequences when men and women fail to fulfill these expectations.
And we know the failure to fulfill these expectations is more consequential for women. This experience of bias, driven by the failure to behave in communal ways, in other words, by violating these cultural expectations, can be seen in all areas of society, but particularly in traditionally male-dominated positions, like college professors, or in male-dominated fields like economics [LAUGHTER]. And so women in these roles are already going to experience some degree of backlash for being in gender-incongruent positions. But that is especially true if they also behave in traditionally more agentic ways, being more assertive, demonstrating their power, which we argued was what was occurring when giving critical feedback back to students.

John: To approach this, you gave evaluations to students at two different points of the semester. Could you tell us a bit more about the study design, how large the sample was and how many faculty and institutions participated in the study?

Whitney: Sure, we had a really rich data set for this study. That’s one of the reasons we were able to get two different papers out of it, and maybe even some future research, because we took all of this data, collected it in person on paper, and entered it, which was an arduous process. As I said, we had been working on this project for about six years; about a year and a half of that was just data collection. And we have a lot of people to thank that did that for us for no author credit on this paper, so we had males and females across the United States gathering that data for us, and we’re really appreciative of them. In the end, we wound up with about 1200 students in total. We weren’t quite 50/50; we were 60/40, favoring men, which is typical for economics classrooms, even though it is required in a lot of majors (that’s where you’re getting a lot of the women taking it). And like you said, John, we surveyed them twice. We surveyed them on the second day of class; we wanted as close to a first impression as possible without having a major sample issue with drop/add. And then we surveyed them the day after they got their first midterm grade back. So we got the first impression, and then we got the way that they felt after they had their first grade returned. We did this at five different colleges and universities; we had three male professors and four female professors contributing data. One of the big questions that people have asked us over time is: “Well, how does race play into this?” And that’s something that’s beyond the scope of our research. I will say that we only had one underrepresented minority in our sample, again typical of economics professors, and it was one of our male instructors. So we would expect a downward bias from race and maybe an upward bias from gender, with those two at least washing one another out in the paper.
And when we asked these students about how they felt after their grades were returned, this was about four weeks into the semester, so still pretty early. What we did was we really wanted to ask about the specific qualities that had been hypothesized in the literature as drivers of bias or drivers of differences. So we just asked students to rate their instructor on a bunch of different qualities. Cassie really helped us out here, because she came in and said, “Well, you know, we can categorize these qualities into communal qualities and agentic qualities and neutral qualities…” which was really the way to approach it, because of course we expect different things from communal versus agentic qualities. So we asked our students things like: “How knowledgeable do you find your professor? How challenging? Do you find them to be approachable? Do you find them to be caring? Are they interesting?” And then we asked a couple of very general questions: “Would you recommend the course?” All of this set us up to have a really nice dataset where we could look between genders and across time as well.

Rebecca: So I think everyone’s probably dying to know exactly what you found. [LAUGHTER]

Jill: I’m just going to provide an overview of the results, because we do a number of different specifications and use different econometric methods, and you can get all of those results in detail in the paper. But in general, on the second day of class, we find that women were receiving lower ratings across the five agentic and gender-neutral instructor characteristics that we measured. They were rated higher on that second day of class on the more communal characteristics, though not all of those differences were statistically significant. Immediately after the first exam grade was returned to students, women were receiving lower ratings on all seven measured characteristics. Each difference was significant except for the caring and approachable, more communal characteristics. And men were now receiving higher ratings on all the different aspects relative to the second day of class. Over time, what we see is that men’s evaluations were getting higher on all characteristics from the second day of class to the period after the first exam was returned. In contrast, women’s evaluations were not trending upward; we had a couple that were staying the same, but overall they were going down. So those are just some overview findings. Again, the more specific results, by specification, can be found in the paper.

John: We will include a link to both papers in the show notes too, so people can go back and review them. To summarize, what you found is there was relatively weak evidence of significant gender bias on the second day of class, but that gap increased fairly dramatically after the first graded exam. So what do you attribute that change to, was it because of the feedback students were getting from grades as Whitney had mentioned before?

Whitney: We were attributing, and Cassie can talk about this with more authority on the theoretical point, but we’re attributing that to backlash theory, this idea that if I expect one thing, and I don’t get it, there’s this need to back off so that things go in congruence.

Cassondra: Exactly, Whitney is spot on there. What we thought this was evidence of was women behaving in gender-incongruent ways. Women are supposed to be warm and caring and friendly, and when you get back a grade that perhaps wasn’t an “A,” that feels harsh and critical, and a woman is asserting her power and dominance in the classroom, which, again, she already is in a male-dominated profession. And those two things combined can result in this backlash.

Rebecca: So if we take these findings, and think institutionally, what are some things that institutions might want to think about moving forward?

Whitney: That’s a good question. If you remember, from the very beginning we were saying we were really hoping to find this nice, objective, concrete solution, and we anticipated finding it through timing. That’s what I would really like to do with future research: find something concrete and objective to treat this with. We weren’t able to do that, because we found bias from the beginning, and we found that it came so quickly in the semester that it’s not something we can fix by just moving evaluations back to midterm or something like that. Since we can’t do that, we’ve talked about other ways for institutions to take this. And one takeaway really is just an awareness that these gender biases exist and that these evaluations are flawed. This is really well established in the literature, but not necessarily in the general sphere of knowledge. When we published this paper, Georgia Tech did a little feature in their daily digest, and I had two female engineering faculty email me and say, “I knew this in my gut for years, but nobody’s ever quantified it.” That, to me, is evidence that it’s not in the general sphere of knowledge, even though the literature defines it well. One concrete change that we have seen is that a lot of schools and accreditors, like AACSB, are starting to require multiple indicators of teaching effectiveness in evaluation. So evaluations and peer reviews, or maybe classroom observation, something to that effect, so that we have a more global and inclusive way to look at someone’s teaching effectiveness. That’s a great takeaway; hopefully it will reduce the weight of the impact of evaluations just by having other factors in there. And just one final point that I want to make, and this is a really big sticking point for me in the paper: all of us are researchers. We all deal with statistics and statistical significance, and robust research methods.
And then, when those of us in Chair and Dean roles go to look at evaluations, all of a sudden all that training completely goes out the window, and we look at the difference between a 4.2 and a 4.4. And I know those differences sound really small; they are that small. And we say, “Oh, well, this person does better than this person; this person deserves to be hired over this person.” Never in our research, or in a formal presentation, would we compare two means that close without significance testing, number one, and without making sure they’re actually comparable, and then say, “Oh, there’s a difference.” We would not recognize this as good research or good methodology in any other area of our work. It’s just something that we should keep in mind as we move forward with this.

John: Now, you mentioned the use of peer evaluations as another way of providing, perhaps, more balance, but might they be subject to the same type of bias?

Whitney: Yeah, all the things that we see with student evaluations, I can imagine we would see with peer evaluations as well.

Jill: But there are creative ways to do peer evaluations. Here at ETSU, we have a Center for Teaching Excellence, and I’m confident Georgia Tech and Lake Forest have their own versions of that. So there are creative ways. And again, it’s not that SETs are necessarily bad, but knowing what we know about the flaws in them, that, coupled with an additional measure or two, can be a lot more insightful, I think, into the true teaching effectiveness of instructors.

John: And one thing I’m wondering is if the measured effect might be larger in economics, because, at least at many institutions, grades in economics and STEM classes are often lower, which might magnify the effect of this difference. It would be interesting if there were a study that also included some classes, maybe in the humanities, to see if perhaps there’s less of an effect because of that role-incongruity issue. It may not appear to be as severe in disciplines where grades across the board tend to be higher.

Whitney: I think you’re right about that. For most people, when they take economics, it’s a required class, and certainly the grades are a big factor. The two things that showed the most significance outside of our key variable of interest were interest in economics and expected grade. Those were the things that mattered across the board… now, we still found gender bias controlling for those things, but it mattered.

Rebecca: So we talked a little bit about things that institutions might want to start thinking about: institutional policy and things that might shift how we use teaching evaluations. Are there any other strategies that institutions or instructors can use, or adopt, to try to reduce this bias in the short term?

Cassondra: That’s really the million-dollar question, because this type of bias exists in a lot of different domains, whether we’re talking managers and their subordinates or teachers and their students. One thing that’s often suggested or recommended is simply making people aware that this bias exists, and providing training on how to better approach evaluations, whether that’s how to use a rating scale or ensuring that you aren’t engaging in a halo effect, for example. Another strategy is requiring that people justify the ratings they provide with qualitative comments… if you’re just asked to fill out, on a scale, how competent this person is, well, bias may creep in more than if you’re asked for a justification of why that particular rating was given for competence. A last recommendation that I’ll share here is making these evaluations more public. So if there are a couple of people, say peers, that are evaluating me or Whitney or Jill in the classroom, they need to come together, share, and publicly disseminate the evaluations that they had given to us. This social accountability can help to mitigate bias and ensure that the ratings you’re giving are, in fact, justified.

John: So we’ve got a long ways to go with this. It’s a problem that’s been recognized for quite a while with a lot of studies. But there hasn’t been that much done to address that. And those are some good suggestions that institutions may want to try. We always end with the question: “What’s next?”

Cassondra: [LAUGHTER] That’s a good question. Of course, I think the three of us collectively would say we hope that administrators and decision makers start asking questions about their use of student evaluations of teaching and how they might seek to mitigate this bias, based on the recommendations Whitney already shared. But we also hope that women faculty perhaps feel more empowered to advocate for themselves when it comes time for promotion and tenure decisions to be made. At my institution, part of the promotion process is writing letters and going through interviews. So speaking to this means bringing an awareness to the people who are making the decisions that this exists, and that it is not just an opinion: there is empirical evidence of its existence. But we are also really interested in exploring more fully how providing feedback, particularly critical feedback, like in our study, where the professors are giving back grades, might impact the perceptions of men and women in other contexts as well. So is this a phenomenon we would see, for example, between a manager and their team? Do people respond differently to critical feedback from a manager because of their gender? And how much are these differences, perhaps, driven by perceptions of how communal or agentic they are in their delivery of that feedback? In other words, are we seeing the same pattern in other contexts? Ultimately, we hope that by better understanding how perceptions of communion and agency impact the interactions that women have at work, particularly women in male-dominated or gender-atypical roles, this greater understanding will allow us to also discover ways to alleviate some of that backlash through more targeted interventions and training, and perhaps better timing. Because at a minimum, it’s important to highlight the various ways gender bias continues to persist in our society, because without that awareness, nothing can be changed.

John: Whitney, Jill?

Jill: I think that was great. [LAUGHTER]

Whitney: Yeah, I think, Cassie, you did a great job. And Cassie certainly helped us out with bringing formal language and theory to things that we felt as intuitive and we felt in our gut as important. We don’t have a lot of language for that in the economic space. And so blending these two disciplines together has been very helpful for looking at the situation.

Rebecca: Well, thank you all for joining us. And the research that you’re doing is really important and impactful. So we hope our listeners will use it.

Whitney: Thank you so much.

Cassondra: Thank you.

Jill: Thank you so much.

[MUSIC]

John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.

[MUSIC]

305. 80 Ways to Use ChatGPT in the Classroom

Faculty discussions of ChatGPT and other AI tools often focus on how AI might interfere with learning and academic integrity. In this episode, Stan Skrabut joins us to discuss his book that explores how ChatGPT can support student learning.  Stan is the Director of Instructional Technology and Design at Dean College in Franklin, Massachusetts. He is also the author of several books related to teaching and learning. His most recent book is 80 Ways to Use ChatGPT in the Classroom.

Show Notes

Transcript

John: Faculty discussions of ChatGPT and other AI tools often focus on how AI might interfere with learning and academic integrity. In this episode, we discuss a resource that explores how ChatGPT can support student learning.

[MUSIC]

John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.

Rebecca: This podcast series is hosted by John Kane, an economist…

John: …and Rebecca Mushtare, a graphic designer…

Rebecca: …and features guests doing important research and advocacy work to make higher education more inclusive and supportive of all learners.

[MUSIC]

John: Our guest today is Stan Skrabut. Stan is the Director of Instructional Technology and Design at Dean College in Franklin, Massachusetts. He is also the author of several books related to teaching and learning. His most recent book is 80 Ways to Use ChatGPT in the Classroom. Welcome, Stan.

Stan: Well, thank you ever so much for having me on. I have been listening to your podcast since the first episode… you guys are crushing it. I recommend it all the time to my faculty. I’m excited to be here.

John: Thank you. And we very much enjoyed your podcast while you were doing it. And I’m hoping that will resume at some point when things settle down.

Rebecca: Yeah, we’re glad to have you here.

Stan: Yeah, thanks.

John: Today’s teas are:… Stan, are you drinking any tea?

Stan: A little bit of a story. I went over to the bookstore with the intent of getting tea. They had no tea in stock. I went to the vending machine on the same floor. The vending machine was down. I went to another building. I put in money. It did not give me tea. I’m stuck with Mountain Dew. I’m sorry. [LAUGHTER]

Rebecca: Not for lack of trying. Clearly. [LAUGHTER]

Stan: I tried. I tried.

Rebecca: I have some blue sapphire tea.

John: And I have Lady Grey.

Rebecca: You haven’t drunk that in a while, John.

John: No. [LAUGHTER]

Rebecca: A little caffeine today, huh? [LAUGHTER]

John: Yeah, well, I am back in the office. I’ve returned from Duke and I have more options for tea again.

Rebecca: That’s good. So Stan, we invited you here today to discuss 80 Ways to Use ChatGPT in the Classroom. What inspired you to write the book?

Stan: Well, I’m an Instructional Technologist, and my responsibility is to help faculty deliver the best courses possible. In November 2022, ChatGPT came onto the scene, and by December, faculty were up in arms: “Oh, my goodness, this is going to be a way that students are going to cheat, and they’ll never learn anything again.” As an instructional technologist, I see technology as a force multiplier, as a way to help us do better things quicker and easier. And so I didn’t feel threatened by ChatGPT. I’ve been looking at the horizon reports for the last 20 years, and they said, “AI is coming. It’s coming. It’s coming.” Well, it’s here. And so it was just a matter of sitting down in January, writing the book, publishing it, and providing a copy to all the faculty, and we just started having good conversations after that. But the effort was to show that we should not ban it… that was the initial reaction… that this is a tool like all the other tools that we bring into the classroom.

Rebecca: Stan, I love how you just sat down in January and just wrote a book as if it was easy peasy and no big deal. [LAUGHTER]

Stan: Well, I will have to be honest that I was using ChatGPT for part of the book. I asked ChatGPT to kind of give me an outline: what would be important for faculty to know about this? So I got a very nice outline. And then it was a matter of creating prompts. And so I’d write a prompt, and then I would get the response back from ChatGPT. It was a lot of back and forth with ChatGPT, and I thought ChatGPT did a wonderful job in moving this forward.

John: Most of the discussion we’ve heard related to ChatGPT is from people who are concerned about the ability to conduct online assessments in the presence of this. But one of the things I really liked about your book is that most of it focuses on productive uses by both faculty and students and classroom uses of ChatGPT because we’re not always hearing that sort of balanced discussion about this. Could you talk a little bit about some of the ways in which faculty could use ChatGPT or other AI tools to support their instruction and to help develop new classes and new curriculum?

Stan: Yeah, absolutely. I guess first of all, I would like to say that this is not going anywhere. It is going to become more pervasive in our life. Resume Builder went out and did a survey of a couple thousand new job descriptions that employers were putting out. 90% of them are asking for their employees to have AI experience. In higher education, it’s upon us to make sure that the students who are going out there to be employees know how to use this tool. With that said, there has to be a balance. In order to use the tool properly, you have to have foundational knowledge of your discipline. You have to know what you’re talking about in order to create the proper prompt, but also to assess the response. With ChatGPT, sometimes it doesn’t get it right… it’s just how ChatGPT is built: it’s built on probabilities that these word combinations go together. So it’s not pulling full articles that you can go back and verify, kind of like how the human mind works. We have built up knowledge all these years. My memory of what happened when I was three, four, or five years old is a little fuzzy. Who said what? I’m pretty confident what was said… I’m pretty confident, but it’s still a little fuzzy, and I would need to verify that. So I see ChatGPT as an intern. Everybody gets an intern now. They do great work at all hours, but you as the supervisor still have to verify the information is correct. Back to the classroom: students can’t, or should not… regardless of who’s using it, they should not just hit return on a prompt, and then rip that off and hand it in to their supervisor or instructor without verifying it, without making it better, without adding the human element to working with the machine. And that is, I think, where we can do lots of wonderful things in the classroom. You know, from the instructor side: go ahead and use this for your first draft.
Now turn on the review tools that track changes and show me how you made it better, as you’re working towards your final product. Instructors can go ahead and craft an essay, craft out some supposedly accurate information from ChatGPT, throw it into the hands of the students and say: “Please assess this. Is this right? Where are the biases? Tell me where the gaps are. How can we make this better?” …using it as something to assess. Those are some initial ways to start asking students to use it, or using it, in the class. I don’t know if I’m tapping into all the things. There’s just so much that you could do with this thing.

John: And you address many of those things in the book. Among those things that you address was having it generate some assignments, or even at a more basic level, having it develop syllabi, or course outlines and learning objectives and so forth, for when faculty are building courses.

Stan: Oh, absolutely. We have a new dean at our School of Business. And he came over and wanted to know, “Tell me a little bit more about ChatGPT and how we can use this.” They’re looking at creating a new program for the college. And it’s like, “Well, let’s just start right there. What are the courses that you would have for this new program? Provide titles and descriptions.” Here comes the list of 10 or 12 different courses for that particular program. Okay, let’s take this program: what are the learning outcomes for this particular program? So we just copied and pasted, asked for learning outcomes, and here comes the list of outcomes. Now, for these different outcomes, provide learning objectives. And it starts creating learning objectives. And so you can just continue to drill down. But this moves past the blank page. Normally you’d bring in a group of faculty to work on that program, ask what their ideas are, send everybody off, and they would pull ideas together and you would start crafting this. This was done in 30 seconds. And now, okay, here’s the starting point for your faculty. Where are the problems with this? How can we make it better? Now go. Instead of a blank page, starting with nothing. That was one example. But even for your course, using ChatGPT, having a course description, you can ask it to provide a course plan for 16 weeks. What would I address in this? What would be the different activities? Describe those activities. If you want the activities to use transparent assignment design, it’ll craft them in that format. It knows what transparent assignment design is, and it will craft it that way. And then, going back to assessment, you can build content. So looking at OER content, open education resources, it can get you a jumpstart on that OER content.
You can fill in the gaps that you want, or take content that’s there and localize it to your area: say, here we are in New England, Massachusetts specifically, and I need an example. Here’s the content that we’re working with; give me an example, a case study, and it will craft a case study for you. It allows you to go from that zone of drudgery to your zone of genius very rapidly. I’ve been working on a new book, and got down to the final edits, and I was like, “Oh, I’m missing conclusions to all these different chapters.” I just fed the whole chapter in and said, “Could you craft me a conclusion to this chapter?” And it just knocked it out. I mean, I could do it. But that’s my zone of drudgery, and I’d rather be doing other things.

Rebecca: It’s interesting that a lot of faculty and chairs and administrators have been engaged in this conversation around ChatGPT quite a bit, but many of them haven’t actually tried ChatGPT. So if you were to sit down with a faculty member who’s never tried it before, what’s the first thing you’d have them do?

Stan: This is an excellent question, because I do it all the time. I have a number of faculty members that I’ve sat down with, looked at their courses, and said, “What is the problem that you’re working with? What do you want to do?” And that’s where we start. We say, “What is the problem that you’re trying to fix?” ChatGPT version three was given 45 terabytes of information. They say the human brain holds about 1.25 terabytes. So this is like asking thirty-some people to come sit with you to work on your problem. One class was a sports management class dealing with marketing. And they were working with the Kraft enterprises that has the Patriots, working on specific activities for their students and developing marketing plans and such. We just sat down with ChatGPT and started at a very basic level to see what we could get out of it. And the things we weren’t happy with, we just rephrased, had it focus on those areas, and it just kept improving what we were doing. But one of the struggles that I hear from faculty all the time, because it’s very time consuming, is creating assessments: multiple choice questions, true and false, fill in the blank, all these different things. ChatGPT will do this for you in seconds. You feed in all the content that you want, and say, “Please craft 10 questions… give me 10 more, give me 10 more, give me 10 more.” And then you go through and identify the ones you like and put them into your test bank. It really comes down to the problem that you’re trying to solve.

John: And you also note that it can be used to assist with providing students feedback on their writing.

Stan: Absolutely

John: …that you can use it to help generate that. Could you talk a little bit about that?

Stan: We’re right now working with the academic coaches, and this is one of the areas to sit down on. I’m not only the Director of Instructional Technology and Design; my dotted line is also Director of the Library. So I’m trying to help students with their research, and the writing and the research go hand in hand. From the library side, we look at what the students are being assigned, then sit down and just start with a couple of key terms or phrases, keywords that we want, and have ChatGPT give us ideas on these different terms. And it’ll provide ten, twenty different exciting ideas to go research. Once again, getting past the blank page. It’s like, “I’ve gotta do an assignment. I don’t know what to do.” It could be in economics: I don’t know what to write about in economics. It’s like, “Well, here, pull these two terms together, and what does it say about that?” So we start at that point. And then, once you have a couple of ideas that you want to work with: what are some keywords that I could use to start researching the databases? And it will provide you those ideas. It’ll do other things: it’ll draft an outline, it’ll write the thing if you want it to, but we try to take baby steps in getting them to go in and research, getting pointed in the right direction. On the writing side, for example, I have a class that I’m going to be teaching at the University of Wyoming to grad students. I’m going to introduce ChatGPT. It’s for program development and evaluation, and I’m going to let them use ChatGPT to help with this. One of the things that academic writers struggle with is the use of active voice. They’re great at passive; they’ve mastered that. Well, this will take what you’ve written and, if you say “convert this to active voice,” it will rewrite it and work on those issues.
I was working with one grad student, and after playing with ChatGPT a couple of times, she finally figured out what really was the difference and how to overcome that problem, and now she is writing actively, more naturally. But she had struggled with it. With ChatGPT, you can take an essay, push it up into ChatGPT and say, “How can I make this better?” And it will provide guidance on how you can make it better. You could ask it specifically, “How can I improve the grammar and spelling without changing any of the wording here?” It’ll go and check that. So for our academic coaches, because there’s high volume, this is another tool that they could use to say, “Here’s the checklist of things that we’ve identified for you to go work on right away,” not necessarily giving solutions, but giving pointers and guidance on how to move forward. So you can use it at different levels and from different perspectives, not where it does all the work for you; you can do it incrementally and say, “Here, assess this and do this.” And it will do that for you.

Rebecca: Your active and passive voice example reminds me of a conversation I had with one of our writing faculty, who was talking about the labor previously involved in creating example essays for students to edit as they work on writing skills. She just had ChatGPT write things that [LAUGHTER] are of different qualities, for students to compare and also to edit as a writing activity in one of her intro classes.

Stan: Absolutely. What I recommend to anyone using ChatGPT is to start collecting your prompts: have a Google document or a Word document, and when you find a great prompt, squirrel it away. In some of the workshops that I’ve been giving on this, I demonstrate high-level prompts that are probably two pages long, where you basically feed this information to ChatGPT and it spells out everything about the information that you’re going to be collecting, how you want to collect it, how you want it to be output, and what items you’re going to output. You’re basically creating a tool that you can then call up. For example, for developing a course, it will write the course description and give you learning outcomes, recommended readings, activities, and an agenda for 16 weeks, all in one prompt. And all you do is say, “This is the course I want,” and let it go. It’s amazing what we can build with this tool, just like we build spreadsheets… we build these very complex spreadsheets to do these tasks. We can do the same with ChatGPT; we just have to figure out what problems we’re trying to solve.

John: Our students come into our classes with very varied prior preparation. In your book, you talk about some ways in which students can use ChatGPT to help fill in some of the gaps in their prior understanding to allow them to get up to speed more quickly. Could you talk about some ways in which students can use ChatGPT as a personalized tutor?

Stan: I’m going to take you through an example that I think can be applied by students. A student comes to your class. Ideally, they’re taking notes. One of the strategies that I use is: I have my notebook, I’ll open my notebook, and I’ll turn on Otter.ai, which is a transcription program. And I will go over my notes and basically get a transcription of those notes. I can then feed that transcription into ChatGPT and say, “Clean it up; make a good set of notes for me.” And it will do that. And then I can build this document, review what we did in class, build a nice clean set of notes, and have that available to me. Over a series of sets of notes, I could do the same thing by reviewing a textbook: highlight and talk through, transcribe, key points of the textbook, or I can cut and paste. And then I can feed that information into ChatGPT and say, “Build me a study bank so I can build a Quizlet, for example,” or “I need to create some flashcards: what are the key terms and definitions from this content? Create some flashcards from that material.” It could be that, no matter how great the instructor is, I still don’t get it. They introduced a term that is just mind boggling, and I still don’t get it. And so I can then ask ChatGPT to explain it at another level. They talk about non-fiction: some of the best or most popular non-fiction books out there, the ones getting on the bestseller lists, are written at a certain grade level. And I know that I typically write higher than that grade level; I can go ask ChatGPT to rewrite it at a lower grade level. I could, as a student, ask ChatGPT to give an explainer at a level that I do understand. Those are certain ways that you can do this. And you basically can build your own study guides that have questions and examples of all the materials; you can feed that material in, get something out, and just enhance it.
And I think for faculty, this is also an easy way to create good study guides; you can get the key points and build the study guides a lot more easily. Starting with the blank page and trying to craft it by hand can be very difficult. But if you already have all your material, you feed it in there and say, “Here, let’s build a study guide out of this, with some parameters.” Definitely much more useful.

Rebecca: We’ve talked a lot about how to use ChatGPT as an individual, either as an instructor or as a student. Can you talk a little bit about ways that instructors could use ChatGPT for in class exercises or other activities?

Stan: Absolutely. And I’m sorry: some of these examples other folks actually contributed first, and I saw them and thought they were just brilliant, but I don’t have their names right in front of me, so I apologize ahead of time. But as an instructor, I would invite ChatGPT into the classroom as another student. We call it Chad… Chad GPT… and bring Chad into the classroom. So you could have an exercise in your classroom: ask the students to get into groups, talk about an issue, and then, up on the whiteboard, you start getting their input, you start listing it. And then, once you’re done, you can feed Chad the same prompt, get the list from Chad, and compare it to what you’ve already collected from the students, what their input has been. And from there, you can do a comparison: “We talked about that, and that, and that… oh, this is a new one. What do you think about this?” And so you can extend the conversation with what Chad has provided… and there I go, Chad… I’ll be hooked on that for a while. But you can extend the conversation with this. Or, if students have questions that are coming up in class, you can field them to the rest of the class, get input, and then say, “Okay, let’s also ask Chad, and see what Chad has to say about that particular topic.” For grouping exercises, we typically do the think-pair-share exercise; well, part of that is each group gets to have Chad in it. So for each group, you can have Chad come in: they have to discuss, they have to think about it first, write something down, pair, discuss it, then add ChatGPT into the mix, talk about it a little bit more, and then share with the rest of the class. There are lots of different ways that you can bring this into the classroom, but I bring it right in as another student.

Rebecca: Think-pair-chat-share. [LAUGHTER]

Stan: Yep. And that’s not mine; somebody was actually clever enough to come up with that. I just happened to glom on to it. But yeah, it’s definitely a great way of using it. It’s a new tool. We’re still figuring out our way, but it’s not going away.

Rebecca: So whenever we introduce new technology into our classes, people are often concerned about assessment of student work using said technologies. So what suggestions do you have to alleviate faculty worry about assessing student work in the age of ChatGPT?

Stan: Well, students have been cheating since the beginning of time. That’s just human nature. Going back to why they are cheating in the first place: in most cases, they just have too much going on, and it becomes a time issue. They’re finding the quickest way to get things done. So ensuring that assignments are authentic, that they’re real, that they mean something to a student, is certainly very important in building this. The more it’s personally tied to the student, the harder it is for ChatGPT to tap into that. ChatGPT is not connected to the internet yet, so having current information is always a consideration. But I would go back to transparent assignment design, and the part of transparent assignment design that is often overlooked is the why. Why are we doing this? If you use ChatGPT to do this, this is what you’re not going to get from the assignment. So, when building those assignments, I recommend being very explicit that yes, you can use ChatGPT to work on this assignment, or no, you cannot, but here’s why. Here’s what I’m hoping that you get out of this, and why this assignment is important. Because otherwise, it just doesn’t matter. And then, when I have an employee who simply hits the button and gives me something from ChatGPT, I’m going to ask, “Why do I need you as an employee? Because I could do that. Where’s the human element?” …bringing that human element into it, why is this important? You’re shortcutting your learning if you just rely on the tool and don’t grasp what the essence of this particular assignment is. But I think it goes back to writing better assignments… at least that’s my two cents on it.

Rebecca: Thankfully, we have ChatGPT for that.

John: For faculty who are concerned about these issues of academic integrity, certainly creating authentic assignments and connecting to individual students and their goals and objectives could be really effective. But it’s not clear that that will work as well when you’re dealing with, say, a large gen-ed class, for example. Are there any other suggestions you might have in getting past this?

Rebecca: John? Are you asking for a friend? [LAUGHTER]

John: [LAUGHTER] Well, I’m gonna have about 250 students in a class where I had shifted all of the assessment outside of the classroom. And I am going to bring some back into the classroom in terms of a midterm and final, but they’re only 10 and 15% of the grade, so much of the assessment is still going to be done online. And I am concerned about students bypassing learning and using this, because it can do pretty well on the types of questions that we often ask in introductory classes in many disciplines.

Stan: That’s a hard question, because there are certainly tools out there that can identify where text is suspected of being written by AI. ChatGPT produces original text, so you’re not dealing with plagiarism, necessarily, but you are dealing with the fact that it’s not yours; it’s not human written. There are tools out there, but they’re not necessarily 100% reliable. Originality.ai is a tool that I use, which is quite good, but it tends to skew toward saying everything is written by AI. Turnitin has incorporated technologies for being able to identify AI, but it’s not reliable. This honestly comes down to an ethics issue: folks who do this feel comfortable bypassing the system for the end game, which is to get a diploma. But then they go to the job and they can’t do the job. A recent article that I read in The Wall Street Journal raised a lot of concern about employees not having the skill sets that they claim to have, and how do we convince students of this: “Why are you here? What’s the whole purpose of doing this? I’m here to guide you, based on my life experience, on how to be successful in this particular discipline, and you don’t care about that.” That’s a hard problem to fix. So I don’t have a good answer for that. I’m always on the fence on it, because students being able to bypass things hurts the integrity of the institution, but it’s hard. Peer review is another tool, you know, to have them go assess it. They seem to be a lot harder [LAUGHTER] on each other. Yes, this is a tough one. I don’t have a good answer. Sorry.

John: I had to try again, [LAUGHTER] because I still don’t have very good answers either. But certainly, there are a lot of things you can do. I’m using clickers. I’m having them do some small group work in class and submit responses. And that’s still a little bit hard to use ChatGPT for, just because of the timing. But it was convenient to be able to let students work on things outside, although Chegg and other places had made most of the solutions to those questions visible pretty much within hours after new sets of questions had been released. So this perhaps just continues the trend of making online assessment tools in large classes more problematic.

Stan: Well, one of the strategies that I recommend is master quizzing. Master quizzing is building quizzes that are thousands of questions large and randomly drawn from, and students get credit when they ace it. And then the next week, they have another one, but it’s also cumulative, so they get previous questions too. And you have to ace it to get credit. Sorry, that’s how it is: cheat all you want, but it’ll get old after a while.

John: And that is how my course is set up. They are allowed multiple attempts at all those quizzes, and they are random drawings. And there’s some spaced practice built in too, so it’s drawing on earlier questions randomly. But, again, pretty much as soon as you create those problems, they were very quickly showing up in the online tools in Chegg and similar places. Now, they can be answered pretty well using ChatGPT and other similar tools. It’s an issue that we’ll have to address, and some of it is an ethics issue. And some of it is, again, reminding students that they are here to develop skills, and if they don’t develop the skills, their degree is not going to be very valuable.

Rebecca: I wonder if putting some of those Honor Code ethics prompts at the beginning or end of bigger assessments would [LAUGHTER] prime their pump or just cause more ChatGPT to be used. [LAUGHTER]

John: That’s been a bit of an issue, because the authors of those studies have been accused of faking the data, and those studies have not been replicated. In fact, someone was suspended at Harvard recently, and is now engaged in a lawsuit about that very issue. So the original research that was published about having people put their names on things before beginning a test hasn’t held up very well, and the data seems to have been… at least some of it seems to have been… manipulated or fabricated. [LAUGHTER] So, right now, ChatGPT allows you to do a lot of things, but they’ve been adding more and more features all the time. There are more integrations; it’s now integrated into Bing on any platform that will run Bing. And it’s amazing how well it works, but the improvements are coming along really rapidly. Where do you see this going?

Stan: In November 2022, ChatGPT was built on GPT-3; we’re now into 4. And this is only half a year later, basically, that we got to 4. I mean, it’s everywhere. For example, in selling books, one of the things that you want to do is try to sell more books. So I went back to Amazon, pulled out all the reviews that I had, fed them into ChatGPT, and said, “Tell me what the top five issues are.” In seconds it told me. It just assessed it, where this would have taken a large amount of time for me to do, and it just did it nice and neatly. Everything is going to have AI in it. Grammarly is having AI built into it. All the Microsoft products are going to have AI built in. We’re not getting away from it. We have to learn how to use this in our professions, in our disciplines. With GPT-4, it was said that somebody had drawn a wireframe diagram of a website… buttons and masthead and text… took a picture of it, gave it to GPT-4, and it wrote the code for that website. It’s gonna be exciting. Buckle up. We had consternation back in January; we’re gonna have a lot more coming up. It’s just part of what we do. We have to figure out how to stay relevant, because this is so disruptive. In the long line of technologies that have come out, this is really disruptive. We can’t fight against it; we have to figure out how to use this tool appropriately.

Rebecca: The idea of really having to learn the tool resonates with me because this is something that we’ve talked about in my discipline for a long time, which is design. But if you don’t really learn how to use the tools well and understand how the tools work, then the tools kind of control what you do versus you controlling what you’re creating and developing. And this is really just another one of those kinds of tools.

Stan: Well, even in the design world, I’ve gone to Shutterstock, and there is something that allows you to create a design with AI. The benefit for a designer is that they have a certain language: tone and texture. Their language is vast, and for them to craft a prompt would look entirely different from mine… “a snowman with sticks for arms” …it’d be entirely different. But getting the aspect ratio of 16 x 9, everything that you craft into this prompt and feed in: somebody who does design and knows the language would get something quite different than a mere mortal like me putting that information in. So, for somebody who’s in economics, you have a whole language about economics. Somebody who is trying to craft a prompt related to that discipline has to know the foundations, the language of that discipline, to even get close to being correct in what they’re gonna get back. And students have to understand this: they cannot bypass their learning, because they will not have the language to use the tool effectively.

John: And emphasizing to students the role that these tools will be playing in their future careers might remind them of the importance of mastering the craft in a way that allows them to do more than AI tools can. At some point, though, I do wonder [LAUGHTER] when AI tools will be able to replace a non-trivial share of our labor force.

Stan: It’ll affect the white collar workforce a lot quicker. And I look at it… a nice analogy for AI is in Marvel: you have Iron Man, Tony Stark. And it is the mashup of the human and the machine. He’s using it to allow himself to get further and faster in his design, and to do things that we hadn’t thought about before. And I see this tool being able to do that: we’re bringing so much information and data to it, it’s mind boggling. Suddenly you see a spark of inspiration that you couldn’t get to by yourself without a lot of labor… suddenly it’s there. And you can take that and run with it. For me, it’s tremendously exciting.

Rebecca: So we always wrap up by asking, what’s next?

Stan: Great question. Right now, I’m getting edits back from my editor for my next book: Strategies for Success: Scaling your Impact as Solo Instructional Technologists and Designers. I’ve been doing this for about a quarter century, mostly by myself, helping small colleges figure out how to do this: how do I keep my head above water and try to provide the best support possible? So, sharing what I think I know.

Rebecca: Sounds like another great resource.

John: Well, thank you, Stan. It’s always great talking to you, and it’s good seeing you again.

Stan: Yeah, absolutely. And also, a free book… I’m gonna give one to the first 100 listeners, but I can go more. There’s a link: it’s bit.ly/teaforteachinggpt . It’s in the show notes to share, but the first 100 get a free copy of the book.

John: Thank you.

Rebecca: Thank you.

John: We’ll stop the recording, and we’ll put that in the show notes.

[MUSIC]

John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.

Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer.

Ganesh: Editing assistance by Ganesh.

[MUSIC]