12. The Active Learning Initiative at Cornell

In this episode, we discuss Cornell’s Active Learning Initiative with Doug McKee, an economist at Cornell and a co-host of the Teach Better podcast. The initiative, designed to increase the use of active learning in instruction at Cornell, provides funding to departments to redesign courses around evidence-based active learning techniques. Doug provides an overview of the program and discusses how it is being used to transform economics classes.

Show Notes

John: Our guest today is Doug McKee. He will be discussing the Active Learning Initiative at Cornell. Doug is an economist at Cornell and also a co-host of the Teach Better Podcast, which is one of our favorite podcasts on teaching and learning.

Rebecca: Today our teas are:

John: Ginger peach white tea.

Doug: I’ll be honest. I can’t stand tea.

[LAUGHTER]

Rebecca: Absurd.

[LAUGHTER]

Doug: It’s not that I don’t need the caffeine, but my preferred method of caffeine injection is coffee.

Rebecca: Yeah, that’s a bummer. It’s not “The coffee podcast.” There’s lots of coffee here…

Doug: Maybe next week I can be on the coffee for teaching podcast.

[LAUGHTER]

John: Or we could have a special episode.

Rebecca: Yeah.

Doug: Oh… a special coffee episode.

Rebecca: That would be kind of fun.

Doug: You’d need guest hosts.

Rebecca: Yeah.

John: The Starbucks next door is closed over the break, so we’re…

Rebecca: Anyways… I’m drinking a golden English breakfast.

Doug: Do you change the teas every single week?

Rebecca: We try to.

John: We generally do, but we got a couple hundred of them….

Doug: But, don’t you just have a few that you like?

Rebecca: I have a favorite, yeah.

Doug: Don’t you just want to drink that one all the time?

Rebecca: Sometimes.

John: Sometimes we do. I’ve used the same ones occasionally. The last couple of episodes, I had the ginger peach black tea….

Rebecca: …that’s his favorite.

John: …and the ginger peach green tea… and this is a ginger peach white tea, which I drink later in the day because it has a little less caffeine.

Doug: uh-huh…

Rebecca: I like the English afternoon tea.

Doug: I would think in the afternoon you’d want the one with more caffeine.

John: I’m generally better in the afternoons and evenings. I have more trouble getting energetic in the morning.

Rebecca: John loves 8 a.m. meetings.

John: Could you describe the Active Learning Initiative at Cornell? How did it originate and how is it working?

Doug: The first thing I want to say is I’m a relative newcomer to the Active Learning Initiative. I joined Cornell about a year and a half ago, in large part because I was excited about joining the Active Learning Initiative. But it’s a project, or really a program, that started back in 2013. It was the brainchild of Peter LePage, who was actually a guest on episode 50 of the Teach Better Podcast, and he is a former Dean of the College of Arts and Sciences. He’s a physicist, and he had read a fair amount of the literature… the big literature on active learning in physics… and wanted to come up with a program that would actually instill this way of teaching across the college. He also happened to be a friend of Carl Wieman, and so he knew all about the Science Education Initiative as it was happening… and the Active Learning Initiative is a program modeled on the same principles as the Science Education Initiative… and the idea is that departments compete for grants, and once they get that money, they use it to hire people with disciplinary knowledge (usually postdocs in their own department) who can co-teach with faculty and train faculty in how to use these methods. Because what I find is, when you ask faculty why they don’t teach actively or why they depend on the pure lecture, even though there’s a fair amount of evidence that active learning works better on a variety of dimensions, they say “we don’t have the time”… well, the department education specialist or the postdoc can help with that…. they say “we don’t know how”…. and so the department education specialist comes in with that knowledge of how to develop clicker questions and what a good small group activity looks like, and then finally, when they say ”well, I don’t believe it,” the department education specialist can both reference the literature… talk in the language of the discipline… and… and this is a big part of both the Science Education Initiative and the Active Learning Initiative… create knowledge and actually try things in that discipline and evaluate what the effects are on student learning of teaching in new ways…

John: …and the STEM fields have been a bit of a leader in that. Physics, in particular, has been very active in the development of this.

Doug: That’s right. So, the first round of the grants funded by the Active Learning Initiative at Cornell were given to the Physics department and the Biology… actually, it was a co-proposal by their two biology departments. Why? I don’t understand why universities can’t have one biology department anymore, but they don’t.

John: Out of curiosity, what are the two areas?

Doug: Ecology and evolutionary biology… and then there’s the, I think, molecular biology and like these kinds of biologists…. I don’t know… I’m not a biologist, but biologists have a hard time talking to each other, it turns out. But these biologists could talk to each other even though they’re in different departments, and…

John: … maybe because they were in different departments it was easier.

Doug: …it was easier… You’re right, it was easier…

[LAUGHTER] …as long as they just had to talk about teaching and not about who the appropriate person to hire was… and so those were the first two departments… and then since then, in 2017, there was another round of grants awarded, and one of those was economics, and the idea there was to branch out beyond the sciences and the STEM fields into the social sciences and even the humanities.

John:… and this was funded by a donation from alumni, right?

Doug: So, the funding comes from an alumnus who had a passion for active learning. I think what happened was he and Peter were at a dinner… and Peter was talking about how excited he was about changing how he taught his classes, and how he wished that it would happen more often, and this person came up to him and said “I heard what you were talking about and I’d be really interested in funding that.”

Rebecca: That’s exciting.

Doug: On the other hand, the Science Education Initiative, as far as I know, was funded by internal money. Carl Wieman was very persuasive in convincing the administrations at the University of Colorado and the University of British Columbia… and now he’s at Stanford, and he has internal money at Stanford, to do a lot of the same thing.

Rebecca: Funding is great to support some of these things, but I would say that our Writing Fellows program on our campus is not that different from hiring these postdocs, because we have writing fellows that are assigned to each of our schools. We have a previous episode on that topic with Stephanie Pritchard, and we have writing experts who meet with faculty in different departments to help them develop writing assignments to meet some of our standards for writing across the curriculum, and help faculty understand how to teach writing….

Doug: Right.

Rebecca: … in their disciplines. So, in some ways, it’s a very similar sort of model. It’s not about active learning, but it’s about how to do this writing infusion.

Doug: So in many ways, it does sound similar, but I’m gonna point out some of the differences.

Rebecca: Great!

Doug: These postdocs live in the department, and they have PhDs in that discipline, and so in some ways they get way more cred that way… and respect from the faculty, because they’re thought of as part of the department. Another difference is that the departments have to write proposals, so they’ve committed and they’ve said in the proposal how excited they are about this, whether they are or are not.

John: They’ve at least made a commitment.

Doug: They’ve at least made a commitment and said… stated on paper that they’re excited and they want this to happen and this should be a success. It’s not some external entity, and I think that makes a difference.

Rebecca: How long do the postdocs stay in the department?

Doug: So, it varies from department to department, but we felt strongly that, since postdocs in economics are generally two years, our postdocs should be two-year postdocs. But in physics, I think, the postdocs tend to be longer, and so their postdocs are three-year postdocs. One year is just not enough, because they arrive and they immediately have to start preparing to go back out on the job market and try to get jobs elsewhere.

John: But I think you had mentioned that while they were there for two years, there were going to be a number of postdocs that were staggered over a four-year period.

Doug: Right. So our project is a five-year project, and what we’re doing is hiring four two-year postdocs, starting with one in the first year and then one every year after that, and so during the middle three years of the program we’ll have two in residence, and in the first and the last years we’ll have one.

John: You mentioned the Economics Department as one of the recipients of this. How many other departments were there?

Doug: Five. So the physics department has another grant to overhaul how they teach their lab courses, and this is actually… Natasha Holmes, I think, is arguably the world’s expert on really modernizing and changing how we teach lab courses; taking them away from following recipes to creating the recipes and answering interesting questions using experiments, and creating those experiments, and she recently joined the Cornell faculty. The music department is using a grant from the Active Learning Initiative to integrate active learning into a composition course where students all have keyboards in the classroom and the keyboards act a little bit like music clickers, so the…

John: with MIDI controllers, probably?

Doug: Right… and the instructor can then listen and select different things that different students have played and then play them for the whole class. It looks fantastic. Sociology is doing fairly straightforward integration of active learning into their large lecture classes, using group activities and clickers. They’re also standardizing, and making much more active, the discussion sections for those classes. The math department is overhauling their introductory calculus courses. Let’s see, that’s math, physics, economics, sociology, music, and then the sixth is classics…. and Classics is creating a brand new course that’s not a pure lecture course, that has students, I think, doing projects during the semester.

John: …and your department is doing how many classes?

Doug: We are treating, or transforming, eight classes.

John: Which classes are you doing?

Doug: So, we’re basically doing the entire core curriculum, or the required courses for the major. It’s not exactly that right now. We’re really doing seven plus a popular elective course called behavioral economics… and why aren’t we doing all eight of our required classes? Well, one of them is a class that we don’t actually own, but we’re hoping to somehow, over the next five years, find the money so we can treat that class too, because it would be a shame to just do seven out of eight.

Rebecca: How are your faculty in your department responding to the initiative and getting involved?

Doug: So we have a small core that’s very excited and they’ll be involved in the program early on in the first couple years. Then we have another set of faculty that are very excited in theory, as long as it’s far enough in the future…. And I think we have faculty that are not that excited, but aren’t actually being affected by it, and so they’ve been fairly passive. I’ve been thrilled not to have anyone in the department that’s actively opposed, and so it’s been great, actually, and the vast majority of the work happens when a course is actually being transformed. Right now we’re making it very easy on ourselves. The first course that we’re transforming is one of my own courses.

John: Which makes it easier.

Doug: Yeah.

Rebecca: Yeah.

Doug: So we can get the process right. We can just get the whole procedure right… get that stuff in shape as well as we can. So… okay, we’re gonna do a control course in the fall and a treatment course in the spring. We’re gonna teach it fairly plain vanilla in the fall, we’re gonna treat it in the spring, and we want really good measurements of learning. How do we go about creating those measurements of learning? Because unlike physics, which has like 80 standard concept inventories, economics has one. It’s the Test of Understanding of College Economics, and it’s pretty good, but it’s only for principles courses, and so we have to build from scratch… and next year we’ll actually be bringing more faculty into the fold, and we’ve scheduled it so that we bring in the faculty that are excited about the project first.
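
As a rough sketch of the kind of fall-control versus spring-treatment comparison Doug describes (this is not code from the project; the scores, sample sizes, and use of SciPy are illustrative assumptions), one could compare results on a common end-of-course assessment like this:

```python
# Hypothetical sketch: compare a fall (control) and spring (treatment) section
# on the same end-of-course assessment. Scores are invented percentages.
import numpy as np
from scipy import stats

control = np.array([62, 71, 55, 80, 68, 74, 59, 66, 77, 70])    # fall, plain-vanilla course
treatment = np.array([70, 78, 64, 85, 73, 81, 69, 75, 88, 72])  # spring, transformed course

# Welch's two-sample t-test (does not assume equal variances)
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)

# Cohen's d as a rough effect-size measure
pooled_sd = np.sqrt((control.var(ddof=1) + treatment.var(ddof=1)) / 2)
cohens_d = (treatment.mean() - control.mean()) / pooled_sd

print(f"t = {t_stat:.2f}, p = {p_value:.3f}, Cohen's d = {cohens_d:.2f}")
```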

Rebecca: What have you learned from the process so far?

Doug: Measurement is hard… and it’s not even that it’s just difficult… it’s a lot of work… and so we had a meeting with some of the folks in biology where we talked about what we were doing, and Ron Harris-Warrick said something. He said, “It seems like you’re spending 90% of your time on measuring and assessing, whereas we spent 90% of our time actually changing the teaching,” and I said, “That’s exactly right,” because we’re not changing the teaching yet and we want to know if, in fact, what we do when we change the teaching works… and so we need really good measures, and that’s where all our investment has been. Now, is that changing? Yes. In two weeks we start classes up again, and we start teaching the transformed class, and so we have actually done a fair bit of work, and we have the big picture. We know what we’re going to do, and we’re very excited about it, but we have a lot of work to do this spring, where we focus on actually changing the teaching.

Rebecca: So, you’re taking this week by week?

[LAUGHTER]

Doug: Well, it’s gonna be a lot of just-in-time curriculum development.

[LAUGHTER]

John: I’m very familiar with that process.

Doug: Right… right… right. I’ve got to say, after a semester of focusing on assessment, which has been really fun, we’ve created what we think could be the start of two standard assessments… like, two… we’ve tripled the number of standard assessments in the field in one semester. I mean… we have work to do still, and we’re the only ones using them right now, but that’s changing. I am pretty excited to actually change how I teach this class, and we have a lot of ideas about things we can do.

John: You’ve already been doing a lot of active learning in your classes before, so what are you going to be doing differently?

Doug: I divide it up into two chunks. So, we’ve got some high bang for the buck things… things like two-stage exams. One problem that we’ve had in the past, and I think it’s a very common problem, is this: you give an exam… the students take the exam… and then there’s a whole bunch they can learn after the exam about the mistakes that they made, and more than half my students don’t even come and pick up their exams. They see the number and then… if it’s bad they get sad, and if it’s good they’re like “I’m fine”… and neither group is learning from their mistakes. So what we’re doing is we’re saying you’re gonna take the exam… and then you’re gonna take it again in groups, and then your grade for the exam will be a combination: 80% your individual grade and 20% the group grade, and so at that point you’re actually discussing it. It forces students to actually discuss and talk about these problems in another step, and I think that’ll make a big difference.
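
As a minimal illustration (not code from Doug’s course; the function name and example scores are made up), the 80/20 combination he describes for a two-stage exam could be computed like this:

```python
# Hypothetical sketch of the two-stage exam grade described above:
# 80% individual score plus 20% group score.
def two_stage_grade(individual: float, group: float,
                    individual_weight: float = 0.8) -> float:
    """Combine individual and group exam scores (both on a 0-100 scale)."""
    return individual_weight * individual + (1 - individual_weight) * group

# Example: a student scores 72 alone and their group scores 90 on the retake.
print(two_stage_grade(72, 90))  # 0.8 * 72 + 0.2 * 90 = 75.6
```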

John: You had a podcast episode on that not too long ago.

Doug: Yes, with Teddy Svoronos, who has been giving two-stage exams for quite a long time at the Kennedy School at Harvard.

John: So listen to that podcast. It’s a great way to learn more about that. We’ll put a link in the show notes.

Doug: It’s a great podcast.
Another high bang for the buck thing, we think, is assigning points in their grade to whether or not they participate in answering clicker questions in class. I have been, for my entire teaching career, adamantly opposed to giving points for class participation. I’ve always felt that it cheapens it, and I want students to be there and participate because it improves learning… not because they’re getting points toward their grade… and over the summer I was convinced that that is actually the wrong attitude. Students look at how you assign points for their final grade as a signal of how important things are.

Rebecca: …of what you value.

Doug: …of what I value…. and so if I say there’s no points for showing up at class and it’s not required, then they’ll be like: “Oh, he doesn’t think it’s important” and I can say it’s really important over and over and it doesn’t have nearly the…

John: Well, clicker questions offer some other things. It gives you some feedback too.

Doug: Right.

John: …in addition to other things, because if one person asks a question or answers a question or raises a point you know how that person responds or how they understand the concept.

Doug: Right.

John: But, with clickers you can get a feel for how all of your students are doing.

Doug: Right.

John: …and you can adjust what you’re doing in class to compensate for that somewhat. So there are some good merits to clickers whether they’re graded or not.

Doug: Right, but you’re not gonna get anything out of the clickers if the students aren’t in the classroom.

John: Right.

Doug: ….and so I assign points to get them into the classroom and I think that’s just gonna bring up a big chunk, because at the end of the semester you look at the grade distribution and you look to see who got the highest grades in the class… and guess what? You know all those people because they show up in class.

John: Yes.

Doug: ….and then you look at the bottom and you’re like, how did I miss these people? They did terribly the whole semester….

John: …but you didn’t see them as much.

Doug: I don’t even recognize them. I look at the pictures and they don’t even look familiar… and so we’re pulling those people in. Those are a few of the little things… little in terms of effort… but, we hope, big in terms of impact. But the class has clicker questions… the class already has… I do short lectures… and they do problems to practice… and so where we’re hoping to really gain in the spring is by adding something called “invention activities.” These are an idea that Dan Schwartz and his students have been writing papers about for a little while… and probably the easiest thing to read if you’re interested in invention activities is the “J” chapter, for “Just-in-Time Telling,” in Dan Schwartz and co-authors’ book The ABCs of How We Learn, which is an amazing book. I highly recommend it. If you read any book on teaching at all, The ABCs of How We Learn is a great one in terms of telling you practical things you can do and giving you the evidence… and Dan Schwartz does great work… and so the idea is, in a nutshell, let students grapple with a problem before you actually teach them the solution… because it primes their brain… it sets up those knowledge structures that you can then hang the expert methods on.

John: So it activates prior knowledge….

Doug: Exactly.

John: …and gets them ready to form more complex….

Doug: It even creates prior knowledge… and so he has these two papers. One is called “A Time for Telling” and the other is called “Inventing to Prepare for Future Learning,” where he does these really amazing experiments. The one that I really like in “A Time for Telling” is about casework. He has these three groups, and one of the groups he gives a bunch of data on classic psychology experiments and just has them graph it and talk about it and try to figure out why the patterns are there, and then he explains what we actually learn from these psychology experiments… and then they play around with it, and then he gives them a test… where they have to predict what the outcome would be of some new psychology experiments. That’s group A, and they do great. But then group B… he skips that first stage, and instead they read a chapter about what we can learn from these psychology experiments and they summarize it… which doesn’t sound so bad… and then he gives the lecture and they do far worse… and so I’m really hoping that we can get a big bang from these priming activities, and I think this class… it’s a second-semester class on statistical methods for economists, also known as econometrics… ‘cause god forbid economists use a word that other people already know…

[LAUGHTER] but there’s a whole bunch of methods there and so I look at that class as a really great match for this method…. where I can show them data and put them in this situation where a new method would be really valuable… and the old methods work poorly… and have them play around with it and then teach them the method.

John: That sounds really promising.

Doug: I think it’ll be super fun.

John: We just did a reading group here on Small Teaching, and one of the chapters in it that people were pretty excited about was on prediction, and it…

Doug: Exactly.

John: …summarized a lot of the research on that (which is another good reference in addition to what you just mentioned)

Doug: Right.

John: … to see a summary of the literature on that and it sounds like it’s quite effective in a wide variety of studies now.

Doug: I mean, I do think you have to be careful with students in explaining why you’re doing it upfront. I think it can be very frustrating to give students problems that you know they’re going to fail at. Students don’t like failing.

Rebecca: No.

John: But the research shows that when they have wrong predictions they actually have larger learning gains.

Doug: Right.

John: …as a result of the prediction activity.

Doug: Right.

John: … and as long as they know that, and as long as the prediction is low- or no-stakes….

Doug: Right.

John: ….it shouldn’t harm them… but it doesn’t feel as good, so there’s a metacognitive issue there.

Doug: Well, I try to model making lots of mistakes in the classroom.

John: I do that too, but I’ve never called it a strategy… but…

Doug: Exactly.

Rebecca: I think students respond really well to seeing their faculty be vulnerable and human…

Doug: Oh, I agree with that.

Rebecca: …and I know we all make mistakes and I think you’re right that students are… they’re afraid to fail…

Doug: Right.

Rebecca: …and they’re afraid to be wrong. So having an environment that’s set up so that it feels safe to make those mistakes…

Doug: Right.

Rebecca: …and then get better… and the goal is to get better… really sets up an atmosphere that really supports learning because everyone feels safe about learning… and I think that that’s not always the way that students think about learning.

Doug: Not at all. Not at all. They think “if I get the answer right I’ve learned and if I get the answer wrong I haven’t” and it’s actually the opposite. If you get the answer right you haven’t learned it…like, you knew it.

Rebecca: Yeah.

Doug: Yeah, if you get the answer wrong… oh my god, there’s an opportunity to learn something…

John: …and it’s hard to convince some of them of that sometimes.

Doug: It is. It is. I tell them I want you to get things wrong, because if you just get everything right you’re wasting time…. then I’m wasting my time.

John: I tell them the same thing.

Rebecca: Yeah.

Rebecca: So, can I circle back to something that we talked about earlier? I heard you use the words “fun” and “assessment” in the same sentence.

[LAUGHTER]

John: It’s not a common juxtaposition.

Rebecca: Exactly. I was hoping that you could talk a little bit more about what was fun about this development.

Doug: Oh my god, I’d love to.
Yes. So, first of all, I admit that I like writing tests. I think it’s really fun to come up with new scenarios that get at concepts that I’ve taught before… and so I can take knowledge in some completely different area and then frame it. I remember one time I had to give a makeup exam, but I didn’t really want to write a whole makeup exam, and so all I did was change the wording… so a question about hospitals became a question about pet stores, and a question about restaurants became a question about something completely different… but the methods were all identical, and I think that’s kind of fun… but most people wouldn’t think that’s fun. The part of the process that I think more people would find fun, and that was a key part, is something that Carl Wieman and Wendy Adams call “think-alouds,” where we draft the assessments and we bring a student in and have them do it. These assessments are 20 to 25 multiple-choice questions, and if all you see is the sequence of multiple-choice answers, it’s hard to tell what’s going on in a student’s head… and so what we do is sit them down and say “vocalize everything you’re thinking as you take this”… and we don’t say anything else… we don’t ask them to explain what they’re doing, because that changes how you actually answer it. Just vocalize what you’re doing as if you’re taking it… and the results are so eye-opening. I learned so much about how students actually think about these things… the amount of just pure pattern matching would blow anybody’s mind. They say things like “You’re asking about a confidence interval… well, I remember that confidence intervals had something to do with standard errors, and so I’m gonna choose the answer that says something about a standard error.” It’s kind of frightening, to be honest… and then pretty often you show them a picture, and you ask a question that’s related to the picture… and they ignore the picture. They don’t even look at the picture. Yeah, OK, that’s not good… and so we did five of these during the fall with the big assessment of learning that we wanted to use both in the fall and in the spring… the control course in the fall and the treatment course in the spring. We’re gonna compare results, and each time we did these think-alouds we would have like ten changes. We would add options because we’d find out that students… it was really common for them to make one kind of mistake that we hadn’t foreseen… and get an answer that wasn’t one of the choices. Sometimes they would pick the right answer for a completely wrong reason, like “oh well, it’s always the one that’s a yes.” Oh… well, maybe we should make the right answer the one with the “no.”

John: So, to develop assessments or concept tests, it’s really helpful to know the common misperceptions so that you can break them down.

Doug: Absolutely.

Rebecca: What I’m hearing is what we call in design a user test.

Doug: Right.

Rebecca: So you’ve designed something, and now you’re testing it out on a test audience

Doug: Exactly.

Rebecca: …and making revisions.

Doug: …and no one does that.

Rebecca: Yeah, it’s kind of interesting, right? Like we don’t think of students as being users of a class, but essentially they are.

Doug: Right. I think when we teach, usually we have so little insight into what’s actually going on in students’ heads… and I think in a pure lecture it’s the extreme case. You get zero insight, because you teach and they listen… and the only time you ever see what they’re learning is when you see their test results, or what they’ve written… and so a big part of why clickers are useful is not only because they activate students’ brains and they’re practicing things… but the feedback it gives the faculty… and so doing these think-alouds… it’s like an extreme version of insight into what students are actually thinking.

Rebecca: How do you recruit these students for the think-alouds?

Doug: What we did is we have the rosters from the previous year… students who had taken the class… and we invited a broad range, so we didn’t just get the A students… we got the A students… the A- students… the B students… even a couple C students, and then I think we paid them twenty dollars each… but you pay them a little bit and they show up.

Rebecca: That’s great.

Doug: Yeah… yeah, yeah, yeah.

Rebecca: Have you thought about using the think-aloud method at all as part of your class? I’m almost wondering if there’s a way to kind of integrate that more into, probably not a large class, but a smaller class.

Doug: That’s a really interesting idea. I think when you give students kind of meaty problems to do in class, and they’re working on them in small groups, you can go around and you can just listen. There are these incredible transcripts of what students said to each other during the invention activities in the “Inventing to Prepare for Future Learning” paper. The subjects were ninth-grade statistics students, but it’s remarkable just how similar teaching is. There’s a lot more in common across grade levels than there is difference. I mean, there are differences; third graders are not exactly the same. But my classes look far more like a good third-grade class than they do a pure lecture class.

Rebecca: It makes a lot of sense, right? I mean, active learning works, and they figured that out in elementary school.

Doug: Right, can we please not forget that?

[LAUGHTER]

John: Another way of getting the same sort of thing was suggested by Eric Mazur when he was here during a visit, and I’m sure he’s mentioned this in other places as well. When he develops clicker quizzes, he tries to aim for about half of the students getting them correct and the other half wrong, so that you’ve got a good base when he does the peer instruction component of it. What he does to develop the questions is pose them as free-response questions and let students write their responses, and then he’ll use those to pick the most common misconceptions that he’ll build into the clicker questions, which gives you a little more scale, but not quite as much information as a think-aloud, perhaps.

Doug: …and the idea is that you’re iteratively refining your class semester over semester. I wish that were a more common attitude toward teaching… but what I find is… I meet a lot of faculty that… they developed the class… they fix the things that are obviously broken… and then it’s done… and then they come in… they give their lecture… they walk out… and there’s very little investment after the fact.

John: We’re creatures of habit.

Doug: Right.

John: We tend to resist change. That’s one of the things behavioral economics tells us.

Doug: I was gonna say that classical economics tells us that people respond to incentives.

John: Right

Doug: If you’re above the bar… like you’ve responded to all the incentives… that are all the professional incentives that are there.

Rebecca: As a designer I just can’t help myself from redesigning and redesigning and redesigning… it’s never done.

[LAUGHTER]

John: I’m never satisfied with the way my course goes and I keep wanting to make it better.

Doug: Me too.

Rebecca: Yeah, but that’s why you two run a teaching center and have the teaching podcasts. You’re not the problem.

John: Well, my students might disagree at times… but that’s another issue.

[LAUGHTER]
John: So, are the other departments involved doing as much with the assessment component, or is this something that you’re perhaps focusing on a bit more? Or is it built into all of it?

Doug: No, I would say… I believe, no. It’s kind of a pet issue of mine… like I think assessment is super important. I think math is leaning on existing assessments… which is fine…

John: …For departments or majors where you have existing assessments, that works well, but as you mentioned, we just have the micro and macro TUCE exams…

Doug: That’s all we have.

Rebecca: I mean, you have something.

John: We have something and we use it.

Rebecca: There’s nothing in my field.

Doug: So, in sociology there’s nothing either… and so sociology has actually been investing in assessments there. It’s tough… it’s tough. Sociologists are allergic to multiple choice, and so they will never use multiple choice… and I’d like to see a graphic design assessment that’s multiple choice.

Rebecca: I made one.

[LAUGHTER]
Rebecca: It’s hard, though.

Doug: So, Natasha Holmes and I have this ongoing back and forth where I say “Here’s something that’s tough to teach and evaluate with multiple choice” and she says “No, you could do it” and the line moves. So we agree that there are plenty of things that can’t… we agree on a lot that can …and we agree that fluency in playing the piano… you can’t… but where the line is in between, we go really back and forth… and I find that over time I believe more and more can.

Rebecca: I think there’s a lot of conceptual things that you can test that way.

Doug: Right.

Rebecca: …and measure certain kinds of understanding.

Doug: Right.

Rebecca: …but maybe not always… you can’t do a practical application with the multiple choice.

Doug: Right.

Rebecca: Like in graphic design, for example.

Doug: Right. So, not everything, but a lot.

Rebecca: Yeah.

John: But you could devise, as we talked about in an earlier meeting today…. you could use rubrics for those areas.

Doug: Right… right.

John: We had a number of faculty a few years ago, who were new faculty in creative fields, and they were very concerned about how to evaluate creative work… so we put together a panel of people from art, from music, from screenwriting, a playwright, and from creative writing, and what was remarkable is they all said exactly the same things: that they use rubrics very heavily; that while people perceive this as creative work, there are very specific things that they’re looking for in the writing; and that as long as they develop good rubrics that capture what they’re looking for, it lets students know what they should be striving for and what they view as important. Just as you said that giving students points matters, giving students rubrics helps them see what you think is important in their work… and it makes it easier for them to try to meet those standards.

Doug: Do you have a book that you recommend?

John: I don’t.

Doug: Well, wouldn’t it be great?

John: It would be. I’m not sure if there is one.

Rebecca: I don’t know of one, but I think it’s not a lot different. Revising and rewriting a rubric is no different than revising and rewriting multiple-choice questions.

Doug: Right, right.

Rebecca: It takes time to refine that and get it to measure exactly what you want it to measure.

Doug: Right.

John: It just impressed me, though, that they all had exactly the same thing in fields that were so diverse.

Doug: Right.

Rebecca: Yeah, there are still values in the discipline, right? There are still things that we value. So, it might be innovation… so, maybe that’s an item on your rubric.

Doug: It’s pretty easy to get inter-rater reliability with a multiple-choice test. I think if the correct answer is A and you give a C, John and I are both gonna say it’s wrong; whereas, with a rubric it’s not obvious.
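
As a hedged aside (not from the episode; the rubric scores below are invented), here is one standard way to put a number on the inter-rater reliability Doug mentions, using percent agreement and Cohen’s kappa between two raters scoring the same set of student work:

```python
# Hypothetical sketch: agreement between two raters applying the same
# rubric (scores 1-4) to ten pieces of student work.
from collections import Counter

rater_a = [3, 4, 2, 3, 1, 4, 3, 2, 4, 3]
rater_b = [3, 4, 2, 2, 1, 4, 3, 3, 4, 3]

n = len(rater_a)
observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Agreement expected by chance, from each rater's marginal score distribution
counts_a, counts_b = Counter(rater_a), Counter(rater_b)
expected = sum(counts_a[s] * counts_b[s] for s in set(rater_a) | set(rater_b)) / n**2

kappa = (observed - expected) / (1 - expected)
print(f"Observed agreement = {observed:.2f}, Cohen's kappa = {kappa:.2f}")
```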

Rebecca: Right.

Doug: But I think… I mean… like so many things… like it’s not that rubrics are good and multiple choice exams are bad. There are plenty of really good multiple-choice assessments and bad ones and there’s a big difference between a bad rubric and a good rubric.

Rebecca: Um-hm

Doug: This is where people say “I tried active learning… it didn’t work …and so I don’t believe in active learning anymore.”

John: But by building on the results, and by working on the development of tests and assessments that can be used across time and across disciplines…

Doug: Right.

John: ….that makes it easier to build a case for the efficacy of active learning…

Doug: Right.

John: ….and that’s one of the reasons why I think this has been so effective in so many of the STEM fields, particularly in physics.

Rebecca: I think another thing to think about is, if you’ve never done active learning as a teaching method before, then like you are a beginner, you’re not an expert…

Doug: Right.

Rebecca: …and so just like our students, the first time out of the gate….

Doug: Right.

Rebecca: ….we’re not going to be an expert, and it’s not going to be perfect, so we have to be vulnerable as learners as well… and so I think sometimes reminding folks that… wait a second, right now you’re a learner, and it’s ok… it’s ok not to be perfect…. and it’s ok that it failed. We can learn a lot from it….

Doug: Right.

Rebecca: …it can be really useful.

Doug: Right. Boy, that attitude… I wish it was more common.

John: New faculty, while they may have been exposed to more active learning techniques, are also sometimes reluctant to go against department standards of teaching, because they know that there’s some chance that new things they try may fail… and it’s certainly safer and easier to do it after tenure, although by then people often get into habits that are hard to break.

Doug: So, a strategy that my department is planning to apply, and their current chair is highly supportive of, is bringing brand new faculty into the Active Learning Initiative right away… and so giving them classes that have already been transformed… so it becomes the new norm… so it’s not that you’re taking a risk or trying these new things and if it doesn’t work the rest of the faculty are going to shake their head… it’s “This is how you’re supposed to do it.”

John: Yes, having that as a prepackaged method of teaching…

Doug: Exactly.

John: …certainly would make it easier to disseminate that.

Doug: That’s the hope.

John: Excellent.

Doug: ….and they also won’t have the… I guess… the institutional power to fight against it as hard… as an established faculty member.

John: That’s true.

Rebecca: Yeah. How have students responded to some of the initiatives? I know in your department you haven’t gotten to that active learning stage yet, but I know that you’ve had some other programs…

Doug: Ok.

John: You’ve been doing active learning techniques in other classes for a while.

Doug: So, I can say three things. First, in physics, course evaluations improved and the students seem to really like it, and they feel like they’re getting more out of the class than they are [in other classes]… because what they’re doing is taking these highly flipped physics classes along with other classes outside of physics that are not… and so they see the contrast… and I think both physics and biology have done a good job of explaining why they’re doing what they’re doing… so when students say “Why are you making us struggle? Why don’t you just tell us the answers, and then we’ll know how to do it?” they’re explaining why that’s actually less effective… and to be honest, in my experience, I do a fair amount of active learning in my classes already… and you have to explain why you’re doing it.

Rebecca: I agree. Students respond really well to that.

Doug: Right.

Rebecca: Yeah.

John: We’re much better in any case. They still sometimes resist because one of the problems with a lot of these approaches is that they don’t get as much positive feedback right away…

Doug: Right.

John: …because they do make mistakes and fail and that doesn’t feel quite as good as…

Doug: Right.

John: …doing well on the one or two or three exams that they might have happened to have had otherwise after cramming the night before.

Doug: In all three cases, I think, explaining what you’re doing, and why, before you actually do it… it’s pretty effective… and again you’re always going to have these students that fight against it. They’ve succeeded their whole… and it’s usually the top… and it’s those students that have succeeded in memorizing, and taking notes, and doing well on exams their whole life… and now you’re teaching them in this new way… and it’s harder. They’re gonna resist… not that surprising…

John: But it can work the other way, too. The students who find that it’s more helpful to go through active learning techniques might encourage other faculty to adopt it.

Doug: That’s the hope… and I think the great majority, I think, have a pretty positive experience… and even more by the end of the semester. I think most of the students are going to have bought in if it’s done well. I mean, I’m not gonna say that in every single class that did active learning the students loved it, because there are some crappy active learning classes out there. I knew someone… they decided to flip their class, and they admitted all of this… like, I heard this from them, not their students… they said, “I turned all my lectures into videos, and it was great, and I really invested in the videos, and then we would show up to class and I’d be like, OK… and I had no idea what to do in the class.” They hadn’t invested in the class part.

Rebecca: Right.

John: …and that’s the big part of active learning.

Rebecca: …that’s the active part.

Doug: That’s right. Right.

John: In most disciplines, you don’t need to create videos. There’s a lot already on YouTube. The real work needs to be on what you’re going to do in class to give you the most value added there.

Doug: That’s right. That’s right.

Rebecca: Our last question usually is what are you gonna do next?

Doug: So, we have a whole transformation plan, which I talked a little bit about, but at this point my spring will be heavily invested in creating and trying these invention activities. I’m crossing my fingers that they work. It’s the first course we transform, and the department is part of the Active Learning Initiative. It’s kind of a big ask… that we try something completely new and actually see some big results. It could be that, at the end of the semester, we say “why didn’t we just take that lousy pure lecture class?”… because that would be a lot easier to improve. I don’t know. Well, it’ll be exciting to find out. But the other big thing on my plate that I haven’t talked about already is taking these assessments that we’ve drafted in the fall and piloting them with partners at other institutions… and so if any of your listeners teach either introductory statistics or a second-semester econometrics course, please contact me, douglas.mckee@cornell.edu, because we would love to have you pilot this thing…. and I’m hoping to give a big talk on both of these assessments with our postdoc George Orlov, who’s amazing… and will be on the academic market next fall, so if you’re looking to hire an economist that really loves teaching, and is great at it, and does really super interesting research, he’s your guy.

John: But not yet… not until he finishes…

Doug: Oh no… don’t hire him yet…

[LAUGHTER] …and so, what we’re hoping to do is make these published standards that can be used across the discipline as ways of evaluating teaching… and that can be big picture… we did this big intervention… or, I think a lot of people use these standard assessments as ways to identify where the soft spots are in what their students are learning… and so people do that too… and so the vision is, over the next five years, to take what we’re doing now and multiply it by eight… and so there’s a lot to do.

Rebecca: … big shoes to fill.

Doug: …we bit off a lot. We promised a lot. Now, we just have to execute. So far, so good.

Rebecca: I can’t wait to hear about the results and hopefully we have follow up and find out what happened.

Doug: We’ve submitted an abstract to the Conference on Teaching and Research in Economic Education, which will be this June, where we report on the results of the transformed class, and so we won’t know… we will be getting the data in…

Rebecca: …moments before it’s due…

Doug: …moments before presenting the work. Hopefully, we’ll have a lot of incentive to put that together.

Rebecca: Great.

John: Excellent.

Doug: Oh, thank you both.

Rebecca: Well, thank you!

John: This was fascinating. Thank you for coming down here… or coming up here to Oswego.

Rebecca: Yeah, we’ve really enjoyed your visit.

Doug: It’s been a lot of fun. You guys are doing great work here.

John: …and we really enjoy your podcast!

Doug: Thank you. More podcasts! I’m your number one fan.

John: OK, Thank you.

Rebecca: Thank you.