Many faculty are finding themselves teaching a fully online course for the first time this fall. In this episode Alexandra Pickett joins us to discuss how faculty can use the research-based SUNY Online Course Quality Review rubric, known as OSCQR, to help them design more effective online courses.
Alex is the SUNY Online Director of Online Teaching and an adjunct professor in the Education Department at SUNY-Albany. Previously, she was the Director of the Open SUNY Center for Online Teaching, and prior to that the Associate Director of the SUNY Learning Network for over 12 years, and has directly supported and coordinated the professional development of over 5,000 online SUNY faculty.
- SUNY Online Course Quality Review rubric (OSCQR)
- Quality Matters
- OSCQR interactive dashboard and self-assessment
- OLC Effective Practice Award for OSCQR
- WCET Outstanding Work (WOW) Award
- Chico rubric
- Self-serve/Self-paced openly licensed tools and resources including Bb templates to quick start effective course designs for various modes of Remote Online Teaching
- SUNY Online Teaching Community
John: Many faculty are finding themselves teaching a fully online course for the first time this fall. In this episode we discuss how faculty can use the research-based SUNY Online Course Quality Review rubric, known as OSCQR, to help them design more effective online courses.
John: Thanks for joining us for Tea for Teaching, an informal discussion of innovative and effective practices in teaching and learning.
Rebecca: This podcast series is hosted by John Kane, an economist…
John: …and Rebecca Mushtare, a graphic designer.
Rebecca: Together, we run the Center for Excellence in Learning and Teaching at the State University of New York at Oswego.
John: Our guest today is Alexandra Pickett. Alex is the SUNY Online Director of Online Teaching and an adjunct professor in the Education Department at SUNY-Albany. Previously, she was the Director of the Open SUNY Center for Online Teaching, and prior to that the Associate Director of the SUNY Learning Network for over 12 years, and has directly supported and coordinated the professional development of over 5,000 online SUNY faculty.
John: Welcome back, Alex.
Alex: Hey, John. Hi, Rebecca. Nice to see you again.
Rebecca: Good to see you too. Today’s teas are:
Alex: As you may know, I only drink Darjeeling tea, always organic. And I just love my Darjeeling tea. It’s delicious.
John: Ginger peach green tea.
Alex: Sounds good.
John: It’s delicious. It’s from the Republic of Tea.
Rebecca: I now have iced tea.
John: I had that earlier today, a few times.
Rebecca: Yeah, it’s getting a little warm in my studio. It needs to be cold now. [LAUGHTER]
Alex: I know, it’s warm right now. I’m getting warm too. Iced tea sounds good.
Rebecca: We invited you here today to discuss OSCQR, the SUNY Online Course Quality Review rubric designed by SUNY Online to support quality in online courses. Can you tell us a little bit about OSCQR?
Alex: Sure, I’d love to. So, OSCQR was developed, sort of with the advent of Open SUNY when we were developing the Open SUNY Plus programs and wanted a way to help campuses do a real systematic review of the courses in those Open SUNY Plus programs. Open SUNY launched in January of 2014. And so we began when we made the decision to develop OSCQR, and there were many reasons that went into that decision. We’ve been using rubrics and checklists pretty much from the first day, but over the years we evolved and ultimately bought into Quality Matters in order to have a branded solution for online course quality and course design tools. And so when we evolved into SUNY Online, we decided to develop OSCQR and kind of put aside Quality Matters for a variety of reasons. We needed to be able to use it in a way that was formative, and Quality Matters started having more and more sort of restrictions on us. And the financial model changed. And so it just made more sense for us to develop our own rubric. And so in June 2014, we launched the first OSCQR interactive dashboard and rubric. Actually, June 2014 was when we started the design of it, and then in September 2014, we launched OSCQR 1.0. And we had 50 instructional design standards and 50 accessibility standards. And we used that rubric for the launch of the first Open SUNY Plus programs. And so we had cohorts of faculty in SUNY that were reviewing and refreshing online degree programs that were identified for participation in that Open SUNY Plus program. In October 2014, so that’s like the next month, we launched Wave II of Open SUNY Plus and added programs and campuses to the Open SUNY Plus cohort of campuses. And so we launched them with OSCQR 2.0, which improved the interactive dashboard and rubric. And then we started winning some awards. We won the OLC Effective Practice award in November 2015. In June 2016, we launched the third edition. So we’re currently in OSCQR 3.1.
And so the third edition was actually launched in June 2016. And that edition consolidated the standards into 50 standards that integrated the accessibility and instructional design standards. So that’s where we are today: we have a set of 50 instructional design course quality standards. We have won a number of additional awards after the first OLC Effective Practice award. We won a Newton award for innovation in 2016. We won the WCET WOW award for it in 2018. And we partnered with OLC in 2016 when they adopted OSCQR as their online course quality scorecard, and we were thrilled that OLC wanted to adopt OSCQR, to give OSCQR a national home and take it under their wing. So, we’re just thrilled with that partnership and with that umbrella, and have continued to improve and evolve OSCQR for the benefit of all of us in terms of course quality review and refreshes, formative online course design, and summative course reviews, in a variety of modes. The flexibility we were able to design into it addresses everything that was missing from previous quality checklists or rubrics that we had used. We were able to really think about what we wanted the tool to do, what we needed it to do, why we were using it, who was using it, when they were using it. And we designed all of that in there in terms of flexibility.
Rebecca: Can you talk about some of the ways that OSCQR is different from some of these other tools? You mentioned the formative feedback, and I know it’s also Creative Commons licensed. Are there features that make it unique?
Alex: Yeah, I think that flexibility that I mentioned was intentionally designed into the tools, and it’s actually a set of tools. It’s not just one thing. And so, technically, there is a PDF version of the rubric that is kind of a standalone thing, and that’s the thing that OLC has adopted and distributes through their website. And that is intended as a tool that anyone can download. And so anyone, whether you’re a member of OLC or not, you can go there and download the tool, the PDF, and they just ask for your name and your email address so that they can send you the PDF. It’s low barrier, no commitment. So, it’s just a PDF, and an online faculty person can use that as a self-assessment tool. So, after you’ve taught your course the first time, or if you’ve been teaching for a long time, you can take that checklist, that PDF, that rubric, and do a self-assessment. Just reflect on your own design of your own course and answer the questions based on your course. And then you can either take the results of that to an instructional designer and work with them to improve your course, or you can use the companion website that sort of is bundled in with OSCQR and mine it for ideas to improve your own online course, and there’s lots of information on that companion website to help you think through how to address each particular standard. So, that’s one way to use OSCQR. I mentioned that it was a set of tools or like a collection of tools. So the PDF is one. There’s an interactive online rubric. And there’s an interactive online dashboard that can be used together or can be used independently. So, as an instructor, you can use the interactive online rubric, an instructional designer can use the standalone interactive rubric, and you can even use that in a variety of ways.
So, an individual instructor could use the interactive version, instructional designers could use the interactive version summatively, as part of their course review process, to preflight a course to say “Yes, it’s okay to go up online and be live.” You could use it in sort of an initiative kind of a way where you have peers in biology review all of the biology online courses in the department or in a program. You could use it in a way where you have different experts. So, you could have the instructor, you could have a librarian, you could have a technologist, you could have a student, you could have an instructional designer, as a group, review the course with all their different lenses. And the interactive version of the rubric actually supports that model because it’s essentially designed in a Google sheet. And so each reviewer has their own tab or their own sheet within that Google Sheet. There’s tons of code behind the actual rubric and the way that it’s designed; it actually will aggregate the ratings and the comments from each of the reviewers into an action plan. And that action plan then has a point of view based on what the inputs have been, and it will categorize the things that need improvement based on priority. So, it’ll tell you these things are important, and these things are essential. And then it also will categorize based on amount of time to fix. So, there are things that might take half an hour to fix, things that might take an hour or more to fix, or things that might take more than two hours to fix. And so the purpose behind this is to help whoever is going to refresh the course to prioritize. So, it’s a point of view. It’s our point of view. Because it’s an openly licensed open-source tool, if you have a different point of view, you can change that. So, it’s entirely customizable and changeable by whoever is implementing it.
Now, of course, the average bear is not going to be able to make those changes because there is code involved, but those tools, when you adopt them and want to adapt them… that’s more at the campus level or the institutional level, so that they can customize it for the particular use. So, of course, there might be an instructor out there who has these Google coding skills who could do that, but it’s more intended to be used as is if you’re an individual instructor. Everything in it you can change. If you don’t like the standards, you can change the standards. You can add standards. You can create different standards for different disciplines. So, for example, if you have a dental assisting program that has certification by the American Dental Association, they might have very specific criteria for their courses that might be different from your Psych 101 course. And so you can create different rubrics for different programs or disciplines, or you can customize them to meet the needs of whatever courses you are reviewing and whatever model you’re going to use for the review. So, you can use those sort of team collaborative models, you can use a peer-review process, you can have an instructional designer conduct a formal review of an online course summatively before a course goes live. You can have an individual instructor self assess, and you can have instructors and instructional designers collaborate in a professional development activity, formatively. So, one of the differences with some of the other rubrics is that those rubrics are intended to be used summatively on courses that have been delivered a number of times by faculty who have some experience. OSCQR was intentionally developed to be used formatively with online faculty. So, as they design their course, they know what the standards are, so that by the time they’re done, it’s not like they’re going to get a whole page of stuff that they have to change or fix.
If they are following the standards, the review at the end (if there is one) is just going to be clean-up stuff. I would say that OSCQR is focused on instructional design; it says nothing about the teaching of the course. So, it is intentionally that way. It is focused on the instructional design of the course to assist and scaffold quality in the design of the course. It’s not to say that we don’t know what makes effective teaching, but we just haven’t designed that aspect of the rubric yet. And also, there’s some challenges and issues and sensitivity that we want to have when we’re talking about the teaching of a course. But the reason I’m belaboring this point is that there is sometimes the tendency to forget that a course is both the design and the teaching that impact quality. And you can have a course that is stamped with Quality Matters and stamped with OSCQR and stamped with the Chico rubric and is gold, but then it’s not taught in a way that is effective, and so it’s not a good course. So, you need both halves. And so OSCQR addresses the instructional design of the course. So that’s one of the things I think that makes it unique: it’s designed to be flexible, to be used in a variety of models, and to be customizable and adaptable to the distinct uses, the distinct disciplines, the distinct campuses. Like you said, Rebecca, it is openly licensed. The interactive rubric and the interactive dashboard are built in Google Sheets and are available to be customized if people want to for use. I talked a little bit about the interactive rubric. And I wanted to mention the interactive dashboard, which works with the interactive rubric. The dashboard is also built in Google Sheets, also openly licensed, and also equally flexible and customizable. It’s intended for larger scale online course quality initiatives, and typically at the institutional level.
So, if you have a department or an institution where you’re trying to do a larger course quality initiative, so you’re trying to review all of the online courses on the campus or you’re trying to review all of the online courses in a program, you might want to adopt the dashboard to facilitate that. So, as an instructional designer or manager of the process, a project manager, you might want to use the tool to generate a bunch of rubrics, associate them with specific courses, and assign the people who are going to review them in whatever mode you’re going to do the review. So, whether it’s an individual instructional designer or a team of people, you can assign them from the rubric. And because these are Google Sheets, this stuff is automated, so people will be notified that they’ve been assigned the rubric. And then from the dashboard, you can coordinate that and view that. So, the dashboard gives you some tools that will let you know what percentage of the course review is complete. So you can track it all in one place. It gives you some tools to do some analytics, so that, across all of the courses that you’re reviewing, you can see, for example, trends, and if everybody is doing very poorly on standard 3B, you can see that and then maybe address that with some professional development. And it gives you sort of quick access to your notes, a single place where you can track and link and generate the different rubrics that are necessary in whatever your initiative is. Whether you’re doing general course quality reviews, or whether you’re targeting a particular thing, like I’ve seen some campuses say, “Okay, this year, or in this group of years, we’re going to target accessibility,” for example, and so they will have an initiative that is at the foundation of the review activities.
It might be that they want to improve instructor presence, teaching presence in the course, so they could potentially have a targeted focus for the reviews and have a multi-year plan for that. It could just be that you have 15 courses in a degree program that you want to have refreshed in time for the spring semester or next fall. And so you plan that out. And you can have rubrics generated for all of those courses in that program, and you can track the progress of the reviews from the dashboard. So, that’s another thing I think that makes OSCQR unique: it’s really taking the perspective of the faculty, the instructional designer, and the campus and making it maximally flexible for the different use cases that different scenarios might bring. I think that was the intention behind the design. The other part of it, I think, that is unique is that we don’t score faculty. This is not an evaluation of their course; we don’t give you a passing grade, and you don’t get points. It really is, and always was intended as, a professional development tool to open conversations with faculty, between faculty, and with instructional designers on the best practices in online course design. And so the assumption is not that you will have nothing to fix in your course when you’re done. Because, as I’ve mentioned before, online course design is iterative. It is an ongoing process that you are continuously improving. This tool assists in the continuous improvement of the design of the online course, assuming that it can always be better. Technology changes, understanding of how people teach well online changes, and so every time you teach online, you can review and improve the design of your course and your teaching practices.
So, I think using it as a professional development tool allows all of us who are involved in and interested in online course quality to focus on the best practices and to focus on the conversation around best practices and instructional design, toward continuous improvement rather than on evaluation of a course, or evaluation of an instructor and the design of their course. So, I think that’s a fundamental difference. And for people who are used to having to score 80 percent in order to get the stamp of approval or whatever, it’s a little weird. And I’ve seen people actually change OSCQR to have points. I would always argue against that, although, you know, you can do whatever you want because it’s openly licensed. To me, it is much more important to think innovatively about the design of a course, to have faculty and instructional designers make positive and incremental progress toward quality. I think of quality kind of like a Socratic ideal: you are always striving for it. You don’t hope or anticipate that your course is going to suck, [LAUGHTER] you want it to be of high quality, and OSCQR can help you do that. And they are research-based standards. We have organized them in a way that I think makes sense as an instructor or as an instructional designer in how you approach this. Another thing that I think is unique about OSCQR is that it really is thinking about how you do a course review and what you look at first, and then helping people to focus in on the standards to really think about what’s going on in the course, and then give some substantive feedback to the instructor or the instructional designer, whoever is going to be making the changes in the course, to be able to help move that course, in that particular standard, incrementally forward.
The companion resource that I mentioned earlier is important in this process, because it addresses each standard individually and looks at what the standard means, examples and suggestions on how to improve each standard, and some additional background resources that you can refer to. There’s citations from the research that support each standard. And then there’s the option and opportunity to leave a comment on a standard if you want to talk about a particular standard or have a question about it. And there’s also the opportunity to make a suggestion for an improvement to the standard, or an addition of a suggestion or example for each standard. And we really invite and encourage folks to interact with the rubric in that way, to have influence on the standards, and we have developed and are in the process of developing additional standards based on community suggestions. So, I think those are some of the things that are unique about it.
John: You mentioned that the goal is to have courses iteratively improve, and you’ve talked a little bit about how the OSCQR revision process takes place. Could you tell us a little bit more about how it evolves and the process of evaluating the standards and making OSCQR better all the time?
Alex: Sure. We’re in version 3.1 of OSCQR currently, and we are in the process of thinking through what the next iteration of OSCQR is going to be, and we have tons of ideas about how we can continuously improve, both from our communities who are using it, as well as from our internal plans, and we’ve always envisioned OSCQR to be something that iterates; we want to practice what we preach. And for example, we have a set of standards that address mobile learning that we have been working on for a while. And I think you were part of the FACT2 task group, John, that helped us work on these mobile standards. So, we’ve been working with entities, groups within SUNY, to think about things like accessibility, mobile standards, courses with labs, language courses, and thinking about standards that we might be able to add on to OSCQR to make it a more customized experience based on the type of course; not all courses are going to have mobile learning necessarily specifically highlighted. My daughter, who’s 18, just had her first year of college. I found her last year writing a paper on her phone, and I was like, “Are you kidding me?” and then I started talking to some researchers who were making some suggestions for OSCQR for the mobile standards. And there is ample evidence to suggest that she is not the only one. [LAUGHTER]
John: And I think that was especially true with the sudden pivot back in March, when many people who chose not to engage in online learning because they didn’t have the computer resources to do that effectively in their homes ended up adopting their phone as a primary means of interaction. And that was a challenge for many people, because mobile platforms are really good for many things, but, perhaps, writing papers may not be their optimal use… or taking extended essay exams and so forth on a mobile device may not be the best way or the most efficient way of doing that.
Alex: You know, if that’s the only device that the student has from home, because they only have one computer and their parent is working at home and they have limited access, that might be their only device. So, yeah, it’s very, very surprising some of the things that we learned as a result of the COVID pivot, and beyond, even in researching the standards: the pitfall of assumptions. It’s really hard to see around assumptions because that’s the nature of them, you don’t know. It’s only when they kind of hit you smack in the head that you realize. So, you were asking about how OSCQR has evolved. One of the things that we changed between version 2.1 and 3.0 was we collapsed the instructional design and the accessibility standards together. There were some redundancies and we wanted to integrate them, rather than have them be two separate processes, for ease of use. What we found was that doing two separate reviews was too much work, and so people were doing one review or the other. By removing the redundancies and integrating the accessibility standards from the get go, we were able to get down to 50 standards, which was much more doable. And then the other thing that we did was we organized the standards into categories. And so we have a course overview and information category, and those are all the typical things that you would want to see at the start of a course review. And that sort of sets the stage for the online course. So, all of your syllabus and information documents are in that area. And whenever I do a course review, those are the first things I look at. I want to see what the expectations are, what the assignments are, how the students are going to be evaluated, what the learning activities are, what the grade percentages are, those kinds of things. And so that gives you a good overall snapshot of the course.
And it’s also super, super important to start the course off in these areas really well, so that students are not confused, so that expectations are crystal clear, so that things are findable. So, that’s the first category. Tools and technology is the second category, and you want to really focus in on what tools and technology students are going to be asked to use during a particular course. What skills are required? What prerequisites might there be? Accessibility figures in here a little bit. The third category is design and layout. And I think a lot of the accessibility standards are in this section, and it really talks about how you chunk a course, how you lay out the different components of the course. It talks very specifically about some of the accessibility things like font size, flashing text, colors, using tables, slideshows, and all of those things. It gives specific suggestions about all of those different ways of presenting content. The next category is the content and the activities, helping faculty and instructional designers think about activities that are learner centered, that are targeting Bloom’s in the correct space, depending on the discipline and level of the course. Again, there’s some accessibility standards in this one, and thinking about the variety of ways that you can engage students in an online environment. Then interaction is the next category, and that’s more specifically about the design of the learning activities, the expectations for feedback, any kind of netiquette expectations you might have, how you develop community, how you develop a sense of presence, both from the instructor and the student perspective, how that’s actually scaffolded in the design of the course, how you break the ice, how you answer questions, how you facilitate interaction, and any kind of collaborations, and so forth. The last category is assessment and feedback.
And this has to do with your grading policies, the methods that you use to assess mastery or learning, giving students opportunities to self assess or to check their understanding, the grade book and how that is set up and pointed to by the instructor in the course, and then how you solicit feedback from the students in the course to help you to improve the design of the course… to understand what’s working well from their perspective and what could be improved. So, that’s kind of the overall sort of buckets of standards. Like I said, there’s 50 of them. And they fall into each of those six categories. And so that was one of the things that we did when we moved to version 3.0. Version 3.1, which we’re currently in, was when we developed the companion resource that goes with it. And so if you go to OSCQR.SUNY.edu, you’ll find sort of the other half of the coin to the rubric and the dashboard. It’s just a simple website that has a page for each of the standards. And each of the pages, as I described, has information in a consistent way that addresses an explanation, and there’s a little video on each of the standards that has people from our community talking about how they have implemented this particular standard, why it’s important, and any thoughts they have about the particular standard… which I think is super cool, because it’s folks from our own community, and citations, and all of those other things that I mentioned earlier, are consistently on each of the pages.
John: As we’re moving into a fall semester during a pandemic, where most institutions in SUNY and many throughout the country are engaged in this magical thinking that we’re all going to somehow go back and despite the fact that the virus is spreading, especially in college age groups right now, as people have started going back to parties and other things, many institutions are going to try to imagine that that problem will somehow go away by the start of the fall in August. In case that magic doesn’t occur and we move online, how might the OSCQR rubric be helpful for those faculty who have to transition to online teaching? How might they use that to make their transition perhaps more effective for students?
Alex: Great question. OSCQR was designed way before COVID, with not a glimmer of COVID anywhere near it. So, it was intended to be a tool used by faculty and instructional designers to support fully online instruction, and perhaps blended instruction, but targeting the online component of blended instruction. And so it really does not have anything to say about any synchronous online or any primarily synchronous online courses. But, I would say that any course that will be offered in the fall during this pivot that we’re all doing could be informed and influenced positively by faculty taking a look at the standards for the online components of the course. As I’ve said other times, this stuff is not necessarily intuitive and, in fact, it’s different from a face-to-face class. It’s not better or worse, it’s just different, and you use different tools. You have different options and different limitations in an online teaching and learning environment than you do in a face-to-face teaching environment. And so some of these things are not necessarily intuitive, and there may actually be a feeling of resistance to some of them because you either don’t understand them or they just may not make sense because you haven’t actually experienced it yet yourself, either as a student or as an instructor. And so I think anything that helps people understand the unique aspects of an online teaching and learning environment is going to help you better prepare. So, when new or novice online faculty are faced with moving all or some of their instruction or content online, these standards are going to help you understand that better and help you understand what are the things to think about, what are the things to target, and how. And I think the rubric in conjunction with the companion website would be a really good tool to use formatively.
So, as you are reconceptualizing, as you are thinking about what you’re going to do in the fall, how you’re going to do it, for the pieces that are going to be primarily online, the rubric can give you some signposts, some goal lines, some suggestions for how to do that as best as possible, given the nature of online asynchronous teaching and learning. And I think that by learning more, by looking through and understanding the standards and what they’re suggesting, and what they’re trying to address, that will deepen your understanding of how to present your content most effectively online, how to facilitate collaboration and interaction, either online asynchronously or even online synchronously, a little bit. And it’ll certainly help you think through issues regarding providing asynchronous feedback and thinking about authentic online assessment and doing that asynchronously. So, I think anything that helps one in a formative way to understand what are the standards that exist, that are research based, that we understand to positively and significantly affect the experiences of faculty and the learning of students, will be a good tool to explore and to leverage and to use. These are open and available to anyone. It’s a website. But, I would suggest taking advantage of instructional designers to help you and of any professional development that might be offered by your campus, by your instructional designer, by SUNY, in addition to looking at the freely available resources that are provided to help walk you through a process, to guide you through a process. And I would look around your campus to see what faculty might be in your department, or even outside of your department, who have experience teaching online.
We have faculty across the SUNY landscape who have been teaching online for 20 plus years, with vast amounts of knowledge and information in every discipline conceivable, who have already made all of the mistakes, who have already developed all of the materials and understanding, and who are willing to share. And we have an amazing community of faculty and instructional designers and people who have expertise in online learning within SUNY. It’s so unique because we are such a large system, and we have been doing this for a long time in some areas. And so I would really encourage folks to see what’s going on on their campus and what resources and supports are available on your campus to help you. You are not alone. You don’t have to reinvent the wheel, and if anyone is sitting there in front of a blank course management shell thinking “What the heck am I supposed to do with this?” just know that you don’t have to start there. I have publicly posted, in my self-paced and self-serve resources area, downloadable templates that will quickstart you into any learning mode, any course design, whether it’s primarily synchronous. I have one that’s using Zoom, one that’s using Collaborate Ultra, I have one that is intended for blended instruction, and I have one that is fully online. We’re working on getting Moodle, Canvas, and Brightspace templates up. The common cartridges are posted already, so if someone wants to take the common cartridge and put it into their own system, they can. So, you don’t have to start from scratch. We have ways to quickstart you, all OSCQR infused, following our OSCQR standards. And that’s my worst fear: that some lone instructor is out there, really struggling and having to recreate wheels, when there’s no need to duplicate those wheels, when there are tools and resources and people out here who can help you and guide you to find exactly what you need.
I’m hoping that anyone who’s listening to this and feeling a little overwhelmed can know that they’re not alone and can know that there are places to turn and people who are willing to work with you, with your campus, with your instructional designer, to make sure that you have what you need in your hand to help you get past those beginning stages of staring at this blank shell and not knowing what you’re going to do next. I wake up at night thinking about those faculty. I’ve heard stories from faculty who have put hours and hours and hours of work into stuff that has nothing to do with their discipline, who are leaving their husbands and their children and their life on the side because 100% of their time is focused on climbing the learning curve of the learning management system, because they’re trying to do a good job and they’re just struggling because there’s so much to know, and so much to do, and so much stuff out there. I can imagine how confusing and overwhelming it would be, and I’ve talked to some of these instructors. And so I just want them to know that they’re not alone and that there are people who can help them, and we can help point them to the resources that can get them started more quickly, and be more efficient, more effective, and ultimately happier at the end of the day and more successful, and their students too. That’s the goal: to have everyone be able to do what we’re being asked to do as well as possible without killing ourselves in the fall. [LAUGHTER]
John: We always end, as you know, with the question, what’s next? So, what’s next for OSCQR?
Alex: What’s next for OSCQR? We’re in a bit of a struggle right now because there are so many other competing priorities. But we do have plans for OSCQR. Like I mentioned, the mobile standards are pretty much ready to go. We are thinking about the next set of standards and wanting to work with folks like the FACT2 task groups to help inform and influence the next types of standards. We’re thinking about courses with labs, language courses, courses with synchronous components, now in COVID land. And so we always envisioned OSCQR as a tool that would continuously evolve, continuously change. I’ve had this tool designed in my head for 20 years. In my dream, when you begin to generate your OSCQR rubric, you would be presented with a wizard that would ask you certain questions about the type of course or the nature of the program that you’re about to review. And then you would select from a menu: Will it have labs? Will it have hands-on activities? Will it have whatever… the different types of things? And then you make your selections and it will generate a rubric customized on your input. That’s a ways off. But right now, we’re going to potentially work on getting the mobile standards in there. [LAUGHTER] One step at a time, and I also need some technical resources to assist. So, we’re working on developing that capacity. So yeah, stay tuned, because OSCQR is a living, breathing being. It’s kind of a toddler, I would say, right now, and it will be growing up over time and being improved along with the rest of us. As we continue to learn more about how people teach and learn well online, we will continue to enhance and expand what OSCQR does and how it does it, all for the purpose of helping faculty and instructional designers address the issue of quality online.
John, I wanted to mention that we have this amazing community in SUNY, and what I’d like to do in the links for this podcast is ask folks to join our online networking community so that we can continue the conversation. We have the SUNY Online Teaching Fellows role, which allows us to collect people’s names and send information out to them periodically when we have new tools and resources and supports. And so I’d love to invite everyone to become a SUNY Online Fellow, and then to join the online networking community, so that you can join the OSCQR user group if you want and continue to have conversations around online course quality and OSCQR, if folks are interested.
John: And you mentioned that OSCQR is a toddler, but it’s a toddler that has become pretty well known. I remember the early days when it was just under discussion; now it’s being discussed internationally.
Alex: Yes, and there’s research on it, too. I have a link on the OSCQR website for all of the times I’ve found it in the media and all of the research that I’ve seen done with it. And if anyone has any additions to those lists, I’d love to have them added. So, yeah, it is internationally used. There are hundreds and hundreds of institutions outside of SUNY that are using it at the institutional level and at the individual level, both in the United States and outside of the United States. I think of OSCQR affectionately as a toddler, but maybe it’s more of a teenager. I don’t really know… maybe that metaphor doesn’t work. It certainly is well established, I agree… well known, certainly, and when the OLC adopted it in 2016, that really elevated the standing of the tool nationally. And so I am grateful to the OLC for giving us that recognition.
John: Well, thank you, Alex. It’s been great talking to you again.
John: If you’ve enjoyed this podcast, please subscribe and leave a review on iTunes or your favorite podcast service. To continue the conversation, join us on our Tea for Teaching Facebook page.
Rebecca: You can find show notes, transcripts and other materials on teaforteaching.com. Music by Michael Gary Brewer. Editing assistance provided by Ryan Schirano.