It's summertime! And how do teachers and instructors often spend their summers? Why, reworking their courses, correct? In this episode, Sharona and Boz share their thoughts, ideas, and recommendations for working through the course redesign cycle and iteratively examining your course. Whether you are newer to alt grading or have been doing it a long time, this episode contains questions and ideas to help you work on your course for the upcoming school year!
Links
Please note - any books linked here are likely Amazon Associates links. Clicking on them and purchasing through them helps support the show. Thanks for your support!
Resources
The Center for Grading Reform - seeking to advance education in the United States by supporting effective grading reform at all levels through conferences, educational workshops, professional development, research and scholarship, influencing public policy, and community building.
The Grading Conference - an annual, online conference exploring Alternative Grading in Higher Education & K-12.
Some great resources to educate yourself about Alternative Grading:
Recommended Books on Alternative Grading:
Follow us on Bluesky, Facebook and Instagram - @thegradingpod. To leave us a comment, please go to our website: www.thegradingpod.com and leave a comment on this episode's page.
If you would like to be considered to be a guest on this show, please reach out using the Contact Us form on our website, www.thegradingpod.com.
All content of this podcast and website are solely the opinions of the hosts and guests and do not necessarily represent the views of California State University Los Angeles or the Los Angeles Unified School District.
Music
Country Rock performed by Lite Saturation, licensed under an Attribution-NonCommercial-NoDerivatives 4.0 International License.
104 - Summer redesign
===
Boz: Welcome to the Grading Podcast, where we'll take a critical lens to the methods of assessing students' learning, from traditional grading to alternative methods of grading. We'll look at how grades impact our classrooms and our student success. I'm Robert Bosley, a high school math teacher, instructional coach, intervention specialist and instructional designer in the Los Angeles Unified School District and with Cal State LA.
Sharona: And I'm Sharona Krinsky, a math instructor at Cal State Los Angeles, faculty coach and instructional designer. Whether you work in higher ed or K 12, whatever your discipline is, whether you are a teacher, a coach, or an administrator, this podcast is for you. Each week, you will get the practical, detailed information you need to be able to actually implement effective grading practices in your class and at your institution.
Boz: Hello and welcome back to the Grading Podcast. I'm Robert Bosley, one of your two co-hosts, and with me as always, Sharona Krinsky. How are you doing today, Sharona?
Sharona: I am doing well. We're recording this right after the 4th of July, and so I was a little afraid that I was gonna get kept up at night because I live in downtown LA and people do interesting things. I think I was on the phone with you when a firework went off, literally like 10 floors below me, and startled the heck outta me last night. So I was afraid I wasn't gonna get any sleep, but I did okay. I did okay. And I got to watch a lot of fireworks from my balcony, so that was kind of nice. How about you? How are you doing?
Boz: I'm doing pretty good. Do wanna give a quick happy birthday and shout out to my mom whose birthday was on the fourth yesterday.
Sharona: Happy birthday Mom. Mom Bosley.
Boz: Although the only time she ever listens to this podcast is when she's tired and can't sleep. She says she puts sleep on and it puts her to sleep.
Sharona: I'm confident it's your voice that puts her to sleep and not mine. Absolutely, a hundred percent. Totally confident. Yeah. Well, speaking of moms, I definitely had an emotional week because I went back to the cemetery for the first time after my mom passed, and I was not expecting to go back that soon. So I had a little bit of an emotional rollercoaster this week. So shout out to all of those who have lost a loved one, because this grief cycle stuff. Woof.
Boz: I knew you had gone back. I didn't know that was your first time.
Sharona: Yeah. That was my first time. I was at a funeral for someone else, a family friend, and it was in the same room that my mom's funeral was in. And so that, yeah, kind of hit me like a ton of bricks. And then I went over and sat in the garden of Rebecca, which is the name of the part of the cemetery where my mom is buried. So that part helped, but the first part was a little rough.
And I am surprised you're still talking to me with all the projects I've been having you do this week, because you've been doing a lot of coding on our assessments this week. So thank you for hopping on today.
Boz: Well, you know, this is always one of our more enjoyable things of the week. We both find this extremely recharging and energizing. At least for me, this is one of the things that I really enjoy doing.
Sharona: I would agree. So speaking of enjoying doing things, what are we talking about this week? Because we don't have any guests on so.
Boz: So, you know, one of the things that did come up some at the conference, and something that's come up several times with us in different episodes and with different guests, is that very few people, if anyone, ever gets this right the first time, and there's nothing wrong with that. We've talked a lot about how many mistakes we made the first time.
Sharona: Well, to clarify the first, second or third time we do this, not just the first time.
Boz: Yeah. In fact, one of our early keynotes that was still one of the more popular talks was Dr. Robert Talbert talking about how many different iterations he's had over the years and how drastically different they are from now to where they began. So that got me thinking. This is a process that oftentimes does take a very long time to get right. And I don't know if anyone ever completely gets to a point where they're like, oh, I've got the magic set up. I'm done. This is how it's always gonna be.
So we're always looking at revising and making modifications, which is exactly what we want our students doing anyways. But a lot of times that happens during the summer. And since we're kind of in that process right now, I wanted to talk about how you do that. Like, what is some advice we would give on how someone should, or could, go about doing that?
Sharona: When you brought this topic up, I thought it was very interesting. Because like you said, we do spend a lot of our summer iterating and revising, and because of the extended role I took on last year, this is my first summer where I am working on both a course that is very mature in its alternative grading design and courses that are very, very, very early in the process. Yeah, so I was excited that you brought that up. So I'm thinking that we'll just kind of do a back and forth where each of us gets to give our perspective on what we think people might consider doing in the summer because again, we work with classes, so academic year is a thing.
Now, I know that many of you teach summer school, and in fact, we just had Robert Talbert on a few episodes ago to talk about his new redesign of his summer course, and we're supposed to get him back to find out how it went. But I wanted to start, I'm gonna give you the first crack at this. So what is your first piece of advice to people that are working on a course this summer and either considering alternative grading or they've already tried it. So which one do you wanna do first?
Boz: If I'm giving advice to someone that has been doing it and they're trying to refine their structure or their practice, I would actually break this down into two different groups. Kinda like you just said, you're dealing with two different classes, one that we've been working with for several years now, and one that is very early in the process. I think that's the same, whether it's a course or a practitioner: someone that just might be new to it, you know, last semester was your first, second, or third semester doing it, compared to someone like Dr. Kate Owens that's been doing it for a decade and a half. I do think the iteration process, the refinement, is definitely different for those two different categories.
So I'm gonna first look at the new practitioner, someone that's been doing this for three, four semesters or less. And I really do think, at least for me and for a lot of the people I've worked with, the first thing you look at is the calendar. Especially among my K 12 colleagues, that seems to be the biggest thing that doesn't work out the way we initially planned. So that's my first advice to a new practitioner: look at your calendar. What did you not get to? What did you have to rush? What learning targets did you get halfway through and you're like, wow, I don't know how I planned for three weeks, I don't know how I'm gonna stretch this out for another week? So really looking at and refining that calendar would be my first piece of advice.
Sharona: Well, and related to that, we spoke to a couple schools recently where they had not preplanned all of their attempts at gathering evidence in the calendar. Mm-hmm. And so they found that they might've only had one or two attempts on some of their learning outcomes, while another learning outcome might have five or six. And that threw the students for a loop, because they had gotten used to having five or six attempts at something. So they were expecting more chances at a different topic, and the calendar didn't allow for it. So I would agree with you. And when we talk about the course calendar, I would encourage people to do backwards design on that and say,
Boz: Oh, absolutely.
Sharona: I'm gonna start with calendaring all of my opportunities, or as many as I can, for gathering evidence of learning.
Boz: Yeah, I would definitely recommend that. So that's something that especially comes up with my K 12 colleagues, is that doing this process has made me get much more organized and much more planned out. I think, especially early in my career, we're like, okay, here's where I'm gonna start, here is the order I'm gonna go in and let's see how far through this I get. Whereas doing alternative grading and knowing, okay, I need to have this many attempts, it's made me be more organized, made me be more planned out. But again, first time I did that, I learned, oh yeah, okay, this unit took a lot longer. This unit didn't take as long, that's why I was saying, let's go back to that calendar. But definitely, definitely the backwards design is, I think, essential, especially if you're gonna do some of the architectural decisions that we've made.
Sharona: I would agree. I would also caveat that with: if you're unsure how long a unit is gonna take, instead of just adjusting the calendar, you could instead adjust what evidence you take on a certain assessment. So instead of moving your assessments, because let's say you didn't get through all the content you thought you were gonna get to before that assessment, go ahead and have that assessment, but just change how much of whatever it is you're looking for. Maybe you make it an attempt on something earlier that you had taught a bit before. Say this particular assessment was gonna be on learning outcome four and you're not far enough along. You have to decide: are you gonna do it on learning outcome three, or are you gonna do a partial gather on learning outcome four? Moving that assessment is actually gonna have a bigger impact on your calendar than just adjusting the content of that assessment.
Boz: It's interesting that you said that because that would be my second place that I would recommend newer practitioners go to, and the first place that intermediate practitioners I would recommend them go to is looking at your assessments. A, are they actually measuring the learning targets in the way that you wanted? And B, are there other different ways that you can collect it? Are there more authentic ways to collect it? Are there ways to collect that you weren't doing that you want to do?
Sharona: I would agree. When we did our stats class, the first three iterations were massive changes to the grading architecture. We changed our proficiency scales, we changed the number of learning outcomes. We changed a lot. Once we finally got to something we liked, the next big change was refining those assessments. That's actually, I would say that's the thing that we continue to iterate the most on is we make new assessment questions. We really look to say, is this still operating? Is it still relevant? Is it still current? And we do a lot of that work every summer.
Boz: Yes. So what other advice would you give?
Boz: Now, I wanna bring this up 'cause I think it was an interesting process, because we had used the five attempts, needing to get two of the five, in our grading architecture that you were talking about. Our decision number three, how do we decide when a learning target has been shown to be proficient by a student: we do what we call the N times. So they have to get it on two separate occasions, and we had this five-assessment model for years. And that was one of the things that, 'cause we do this in a coordinated class, one of our colleagues brought up. But when they brought that up, we didn't just make that change. We actually went into our data. We did data analysis. We looked at it and we noticed that of the students who had mastery, most of them only needed four attempts. And of the students who did use all five attempts, even though we didn't have a ton of them, most of them didn't get it. That fifth one didn't seem to help. Although, to be quite honest, most of the time when students don't have mastery on a specific learning target, it's because they didn't do multiple ones of the five.
Sharona: Very, very few students actually took all five assessments.
Boz: But yeah, this was a pain point that one of our colleagues brought up. One of the things that we've talked about is that you have to utilize tools to help do some of this stuff, but it can be a lot of work. I mean, they were talking about the amount of grading, and when they brought it up and we looked at the data, we're like, you're right. That fifth one just does not statistically serve the purpose that we were hoping for. So we took it out, and what happened? Our pass rates didn't change. Or actually, they went up. Taking that fifth one out didn't hurt.
Sharona: Right. Well, another impact of taking that one out is we were able to actually redo our calendar.
Boz: Yep.
Sharona: So, out of the five, we actually took the fourth one out. Mm-hmm. Because the fifth one is during finals week. So by taking that out, we were able to slow down the assessment process on the first three, which means that students had more time for the feedback loop. So even though we took out a chance for students to demonstrate proficiency, we were able to slow down the pace of the other ones, which in theory is gonna give students more chance to learn from their mistakes. So we're gonna be continuing with that structure.
Boz: And another thing that we did that's related to assessments and the calendar, since those are the two we've kind of been talking about: by taking that one out, it also gave us a little bit more freedom, where we also slowed down things at the beginning but yet started assessing even earlier. Which kind of sounds weird unless you've looked at one of our calendars. And that came from really our discussions with Dr. Thomas Guskey about giving students early success. Because that was one of the other things that we noticed when we looked at our data: out of the students that weren't getting a C or better, we lost a lot of them early in the semester. Their attendance dropped off after the first couple of weeks. So taking that fourth attempt out, going from five to four, allowed us some freedom in the calendar and gave us an opportunity with those first couple of learning targets to really just focus on those. Giving those students, you know, that early success, which has seemed to help. Looking at what you were saying, looking at pain points, some of those are the kind of things that we're talking about. What didn't work, for whatever reason? And how can we adjust things to see if we can help with those pain points?
Sharona: Okay. So I'm gonna kick it back to you. What is your second piece of advice, or things you would recommend people look at?
Boz: Really, my second one was the assessments. The second one for newer practitioners, and really the first for the intermediate, was looking at your assessments. But the next thing that I would recommend is, like you said, really going back to those four architectural decisions and the four pillars. Where are you having those pain points? Where is it breaking down for you? Or not even breaking down, just where do you think you could go from good to great? I do think it becomes much more diverse and different from class to class. Like, my pain point might be very different than yours, or even my pain point in one class might be different than in another class.
Sharona: Well, and I wanna dig in a little bit to that because to me, when we talk about pain points, which is the way I think about this, it could be something that is causing the instructor difficulty, because of workload or it's just not working the way you want. Or it could be something that's causing the student difficulty. Right? And those could be two entirely different things.
Boz: Yeah. Good point.
Sharona: So I think it's important to put both hats on. To say where is this not working for you as an instructor? Or where do you think you could get better? Like you saw an opportunity and there's just something you really wanna accomplish with the class. So put that hat on and ask yourself that question, but then put the student hat on. What did you see the students struggle with? Where did they react or tell you or give you feedback that makes you think, oh, from a student perspective, I could do something better? And I know that one thing that came up at the conference for me, that we're intending to do, is the why of why we do alternative grading. Because we've really mostly focused on the how to try to get students through this transition. But I think that we're gonna try to figure out how to communicate the why we do this. What do you think?
Boz: Well, I actually had a question for you. No, I'm laughing because my big like closing point. Is around that. But, since you brought this up, do you think that one of those two hats should go on first? Or does it matter? Like, do you put the teacher hat on first? Do you put the student hat on, or is that personal preference?
Sharona: I think much of this is personal preference. I think that, for me, whatever comes to top of mind is gonna get jotted down. Like, I'm gonna brainstorm what are all my pain points, and I'm going to try to be intentional. But the student or the teacher one could be the top of what comes up to my brain. But once I have the list, then I might go through and prioritize and, for me, I want to hit sort of equally between students and teachers, until one of them's complete. Like, I don't wanna prioritize just my personal pain and ignore the students, and I don't want to prioritize just the students and ignore the teacher.
Boz: There is a lot of crossover. A lot of that does go hand in hand. But I also, especially if I'm thinking about the newer practitioners. Redesigning a course, truly redesigning a course for alternative grading, is not a small process. And I think you're gonna have that in the show notes, but we have this whole course redesign cycle that we've come up with and used in our PDs. This is not a small process, and I don't want new practitioners, especially very new practitioners, to go, okay, I just spent all this work, there were a lot of things that need to be refined, I have to refine them all.
There's nothing wrong with going, okay, here was one or two things that, I know I might have more than that, but these are the one or two that caused the most issues. So I'm gonna try to figure out how to do these things and then later semesters I'll figure the other ones out. So don't get overwhelmed. Don't think you have to fix every single problem that you noticed or that you were able to identify. Really look at those one to two, maybe three, big ones. And try to focus on that. And I like the way you were talking about with, as you're brainstorming, kind of putting on the different hats of looking at it from teacher pain points and student pain points. I do like to look at those separately just because it helps me focus.
But once I have that brainstorm done, yeah, really looking at, okay, which of these cause the biggest problems and which of these do I wanna try to fix? Maybe I'm gonna try to fix two things, and maybe one's kind of teacher focused and one's student focused. Maybe they're both student, maybe they're both teacher. 'Cause it just depends on what side your problems came up from, and what those pain points were for you specifically.
Sharona: And for me, if there is something where it shows up on both lists that it's a problem for students and a problem for teachers? That might increase its importance. Yeah. So if something didn't work for either side, that might be something I'm going to prioritize to fix.
Boz: So I have been kind of talking about the new practitioner and even the intermediate. Are things different when you do have, you know, you are more experienced and you are working with a course that you have been working with for a while. Is that process different?
Sharona: I would say it's not different. However, you have an opportunity to take a bigger picture look with a mature course, in my opinion. So I'm gonna give you two examples. In the pre-calculus that I'm working on, since it's still traditionally graded and I'm taking a very, very slow redesign process, I am really down in the minutiae of that class. I am trying to pace things day by day. I'm trying to write individual assessment questions. I am way down in the weeds. Our statistics class is very mature, which gives me an opportunity to look at sort of the earlier parts of the course redesign. In particular, really looking at the learning outcomes and even perhaps going from the learning outcomes entirely up to the course content. So I'm working on both levels, but I don't know that it's that different. What do you think? Do you think it's different or?
Boz: Yeah, I do think it's different, even though I would agree with what you described. Like, when I was a new practitioner with a new course, I was really trying to get a working model. Whether it was the assessments, the calendars, the grading architectural decisions, refining my learning targets, I was trying to get something that worked to at least an acceptable degree. With the statistics course, we're there.
Sharona: Mm-hmm.
Boz: Now, look, I do think we have the freedom and the ability now to go back to some of those earliest parts of the redesign. Even before the grading architecture, really looking at our learning targets and the purpose of the course, the why of the course. And I think the events of the last couple of years have really made that a necessity, more so than ever before. And I'm talking about AI.
Sharona: Right. Well, we often tell people when they redesign a course that you should have an elevator pitch of the content. And our elevator pitch for our quantitative reasoning with Statistics course is we want to create a critical consumer of data. We want students to come out of our course being able to critically consume data. Well, the biggest, most massive amount of data that our students are now dealing with is AI driven data. Whether it's search results or whatever. That is data. And it's being created on the fly by this AI situation. And the stuff that we had them doing before and the types of assessments we have them doing can be handled by AI. So we've got like a double whammy.
Boz: So our purpose of our course, our overarching goal of the course, hasn't changed.
Sharona: Right? We want them to be critical thinkers.
Boz: Critical consumers. Yeah. We want 'em to be critical thinkers and critical consumers of data. But with that goal and in the new environment, are our learning targets still appropriate and still working towards that goal? And this is what I mean when I was saying that when you've got a more mature class, I think you've got the freedom to step back and look at a little bit bigger picture. So we step back to our why. Our why hasn't changed. But because of the environment that we're now in, our learning targets might not be addressing and driving our students to our overall goal of the class.
Sharona: And it could be the learning targets. Or, when we go look at our learning targets, we're like, yes, we still want a student to be able to do this. It could be a breakdown in the assessments. Exactly. So it could happen in multiple places. And when we initially designed this course, we had a year before we had to implement the course to do the design. And like we've said many times, we still royally, royally messed it up. But even though I feel a tremendous amount of pressure, that I'm putting on myself internally, to perhaps make some of these changes, we're not rushing into it. And we're certainly not doing it in isolation. So I am not spending the next four weeks rewriting my learning targets for that class. I can't. It's too fast.
Boz: Because if you do that. That's gonna create a domino effect that okay, we completely redo our learning targets. Now our assessments don't make any sense. Now we've gotta redo the... yeah. And in four weeks, that's a lot of work.
Sharona: I know. I've done that. So yeah, don't do that. That's bad. But yeah, we're having to look at the learning outcomes and even the content. I mean, one thing I'm grappling with, not so much on the mature class, actually, but also on the pre-calculus: are we teaching the right things? Especially in STEM, because that's what I'm familiar with. I know writing has had to grapple with this perhaps even more, and faster. But for me in STEM, how many of our intro STEM classes are still a collection of head facts? Yeah. And is that even the right thing?
I mean, two years ago when AI came out and it couldn't add two plus three and consistently get five, I was kinda like, whatever, this is a tempest in a teapot, right? No, no, no, no, no. All those generative AI models? They're all hooked up to math engines now. They don't make those mistakes, in general. Now that being said, I was tutoring a calculus student and we put a very complicated math problem into chat and it gave us two different answers with the two different methods that it recommended. Neither of which matched the answer that had been given as the sort of answer key. So we really couldn't verify which of these three answers was the correct one. And it was a calc three problem on a topic I've never taught, so I couldn't even verify it. I was like, ugh.
So yeah, I mean, this is really tough and it's gonna take some real head sweat to figure this out. And that's from the teacher side. But I see this just as big from the student side. So if I'm coming in as a student to a course that's supposed to help me critically consume data, and I don't know how to do an assessment, and I dump it into generative AI and it seems to give the right answer, what should I do as a student? I mean, we can all say that cheating is wrong, et cetera, et cetera, but how do we convince the student? And how do we give the student confidence in themselves to be able to bridge the gap between where their head knowledge is and what the generative AI models can do? So this issue with the content and the learning outcomes, to me, has both the teacher and student hat on it with the AI. Yeah. And I'm very concerned about my ability to get through it fast enough. Because the AI is iterating faster than I can.
Boz: That's, that's true.
Sharona: Now that being said, full disclosure, and I'll tell my students this: I am using AI to generate better assessments on my existing stuff. Because one of the things about that critical data consumption is we always want it to be culturally relevant data, and timeframe-relevant data. So we've had a number of questions over the years that talk about COVID. Well, COVID is out. It's not in the current lexicon of these students. They're not thinking about COVID anymore.
Adults like us, we still think about COVID. These kids are not. So it used to take me an hour or two hours to come up with a well-written version of an assessment when I wanted to update. Generative AI does an excellent job. Because although we're using real world adjacent data, it doesn't have to be a hundred percent accurate. Like we do things with surveys and we're like 63% of people in this made up survey said something. Well, I don't care if it's 63 or 62 or 64 'cause it's not actual real data. So the AI coming up with it works just fine.
Boz: But that's an interesting point that you brought up. I don't know if you realize this, and it's probably gonna make us both feel old, but do you realize our incoming freshmen this next year would've been in seventh and eighth grade during the pandemic?
Sharona: I do realize that because, you know, my sons were in high school and they're graduating college. Yeah. Both of them. I supposedly am gonna have two college graduates in this academic year coming up. So, yes.
Boz: But yeah, but I mean, if you're talking about looking at some of our assessments because we do try to be somewhat current event up to date, but. Yeah, how many sixth and seventh graders were or seventh and eighth graders were paying that close attention to political things during pandemic. So yeah, going back and looking at some of our assessments and the fact that some election things have happened and some other political things have happened. Using AI to help make those assessments, incredible time saver even if every single fact, because it is an assessment. 'cause it is something that knowing the current event and the details of the current event isn't the important part. It's the being able to critically analyze the data around it is.
Sharona: So what's your next thing that you think you might have new or more seasoned practitioners do in the summer to think about their course redesign?
Boz: I think this is really, and this is what you kind of talked about a little bit already, but how do you actually find some of these pain points? How do you decide what you want to iterate on or revise or polish up? And I think that's going back and asking yourself: why are you doing this? What is the purpose? What is your goal of doing this? Kind of what Doug Wilson was talking about a couple episodes ago, and something that came up last year when we went to CMC and went to Dr. Sean Nank's closing session. Even though his wasn't just about grading, his was about teaching, period, but asking yourself the why. Why did you start this in the first place? Why did you wanna change your grading? What was the purpose? What were you trying to accomplish, and have you accomplished it? That really, I think, can help focus you: okay, if this is my why, these are the parts that didn't really work. That's what needs to be refined.
Sharona: I also think that it's worth looking at a little bit of the bigger picture going on institutionally for you. So for example, I know that I've been hearing from a lot of people in higher ed that the push to large classes, not just larger, but large, is happening across the country. And with some of the funding things going on, I see that as continuing. So can you anticipate any structural changes that might come to your environment this coming year, and what impact might that have on your course? To me, that's another pain point you might wanna look at. Things that worked at a course of size 25 might not work if you end up with a course of 50 or a course of 75. And there's also a bigger thing I'm hearing: a lot of schools switching from an individual instructor section model to a large lecture with a TA support structure. So are you suddenly gonna be working in a team? And if you're gonna work in a team, what do you have to do if you're doing alternative grading? So these are just some of the other things that are sort of occurring to me as potential pain points.
Boz: Yeah. Which might be a little bit more specific to our higher ed listeners. Yes. Usually, and I say that as a caveat, usually, those kinds of big structural changes don't hit the K-12ers all at once. You're not going from a class size of 28 to 38 or 48 most of the time. Now, if you're a K-12er, and you've seen some of those massive changes happen quickly and you're screaming at me right now, I might be speaking just from my viewpoint, which is in a state that has unionized teachers. I am part of LAUSD, which is the second biggest district in the country. And our union, UTLA, is a pretty strong union, so I am speaking from a possibly very rose-colored lens, but I don't think it happens as much in K-12 as it can in higher ed.
Sharona: Yeah, well, I'm definitely seeing that. I am in some groups that are talking about what's happening, and the number of universities that are going to large lecture calculus classes, large lecture pre-calculus classes, that have had small class sizes for a while. They've had classes from 25 to 40, and now they're going to a hundred fifty, two hundred, two hundred fifty, three hundred students in a section. And that has a massive impact on the ability to do alternative grading. It can be done, but it takes a lot of intentionality.
Boz: So, question, because I had never thought about something like that happening. 'cause in my world, in my K 12 world, it just wouldn't. If you have something like that, would you just like throw out the entire thing and start from scratch? Or throw out big chunks? Like what would you keep or possibly keep and what would you be like, yeah, I'm throwing this out and starting from scratch.
Sharona: For me, the big thing would be the assessments. You're gonna have to look at your assessment and feedback process. Because it's a lot different at that scale. Like right now we have one to two weeks between iterations on our assessments in the statistics course. I can do that if I have three classes of 75 students. If I have a class of 300, I can't give that kind of individualized feedback in a one-week cycle. Then if I have a bunch of TAs and I'm not used to doing norming sessions, I have to learn how to do norming sessions.
Boz: Yep.
Sharona: So.
Boz: And you've gotta learn how to train your TAs to give feedback the way you want it done.
Sharona: Exactly. So the assessment cycle and structure in the larger classes. That's one of the biggest things. For our statistics course, because we use clickers, I don't think that part would have to change so much. Like, I am comfortable with the majority of our class structure in statistics. I'm not comfortable with the assessment structure. Because it's designed for instructors for whom a full load was a hundred students; it's now gonna be 120 students, 'cause we went from 25-person sections to 30. So a full load is four to five classes, so you're at 120 to 150 students total, but that's total for, like, your entire semester. Yeah. If that's just a single class and that's only, like, six units of your working load, that's just different, and you may have to resort to more multiple choice. You may have to resort to changes in your evidence of learning. You could potentially do portfolios of work. There's a lot of ways you can do it, but it just really changes the feedback cycle structure.
Boz: And I would think, and again, I've never had this experience even though I've been in the higher ed world for six years, going on my seventh. Outside of being a student, I have never been in a class that size. I've never had to be the instructor. But I would think almost your entire grading architecture would need to be redone if you were going from 25 to 40 up to 80 or more students. I would think at least the grading architecture would likely have to be revised, or thrown out and started over?
Sharona: Well again, I don't know that I would necessarily change, like, the number of attempts. Or, I mean, I might change the number of attempts. The amount of evidence I need I would like to keep, but how I get that evidence might have to change. Yeah, it would have to change, most likely. So I think that's very interesting, and I think overall the biggest piece of advice is to sit down and reflect with a piece of paper, with a pencil, or with a computer. Like, literally write down what you can remember from how your class went or what you've heard that sparked an interest. Do a brain dump. Because for me, I can't keep stuff in my head so much anymore. I need to write it down and then bounce it against the four decisions, the four pillars, the course redesign cycle. So that would be the biggest thing. That's what we tell our students to do, right? That's how we learn.
Boz: Again, this has come up a dozen times or more now: the importance of students doing reflection, how important that is in their learning process. That doesn't change because you're not a student and you're a teacher now. I would completely agree with that, regardless of how you do it or when you do it. But give yourself some time and space just to reflect on the semester, on the year, and on what things you wanna change. And I think this is true whether you do alternative grading or not. I have found I need it more, or at least I have recognized the need for it more, since going to alternative grading. But reflecting on your practice as an educator, reflecting on your pedagogical methods and philosophies, that's just how you grow as an educator.
Sharona: There's one group that I wanna add to what you said about the reflective process, because you pointed out, even if you're not doing alternative grading. I wanna directly talk to people who are in my situation with the pre-calculus course. So if you want to go to alternative grading and you cannot do a full course redesign, what I would recommend is go ahead and start your full course redesign, even if you can't implement it right now. Go through the process, and then find what parts of your redesigned course you can put in right now. Whether it's going ahead and using your learning outcomes, or designing your learning outcomes and using those to drive your assessments. Even if you are forced through circumstance to still use points and percentages, there are so many of these practices that you can do without completely going away from traditional grading. It's still gonna make the course better. Because that's where I'm at.
Boz: Yeah. We had the whole episode where we were looking at, you know, the Grading for Growth blog, the 25 small steps, and we actually came up with the 26th one. And I think that even goes back to looking at your why, why you're doing this. Because like you said, if you know your why and you go through the process, even knowing that you can't fully implement it, those important pieces, like you were saying, what are some of the small things you can do inside the traditional, will be more clear if you really understand your own personal why: why you're wanting to start these changes, why you're wanting to get to alternative grading, or why you teach what you teach.
Sharona: So we're coming up closer to time. Any last pieces of advice other than the why? Anything we missed?
Sharona: So I'm gonna do the flip side of give yourself some grace, and I'm gonna say give yourself the freedom to try and have fun with it. Instead of just worrying all the time about what might go wrong, what could go right? Could you get your students to a level of achievement that you never dreamed? Could you bring up topics that you've always wanted to do but felt you were restrained from, or couldn't because your students would reject it? Have some fun. Why are we busting our you-know-what trying to be educators if it's not fun?
We love to learn. That's why we're educators. What if we could create students who love learning? I feel like this has opened up the opportunity for me to be the kind of teacher that, like, my mom was. It came up again this week, where someone said in a completely random setting, somebody I didn't know, oh my gosh, the impact your mom had on me, and I ended up becoming an elementary school math teacher or a high school math teacher. What opportunities are we creating for ourselves as educators if we just open ourselves up to the incredible potential that I think alternative grading unlocks? So mine's the positive side of the whole thing.
Boz: I think that is an absolutely beautiful point, and a better point to end on I don't know if we can come up with, so.
Sharona: Probably not.
Boz: I wanna thank everyone, and do know that we've got several interviews that we're hoping to get planned out coming up. A couple of follow-ups on some of our previous episodes that have been really popular, and just a couple of big names as well. So until then, and until next week, we'll see you later.
Sharona: Please share your thoughts and comments about this episode by commenting on this episode's page on our website, www.thegradingpod.com. Or you can share with us publicly on Twitter, Facebook, or Instagram. If you would like to suggest a future topic for the show or would like to be considered as a potential guest for the show, please use the Contact us form on our website. The Grading podcast is created and produced by Robert Bosley and Sharona Krinsky. The full transcript of this episode is available on our website.
Boz: The views expressed here are those of the hosts and our guests. These views are not necessarily endorsed by the Cal State System or by the Los Angeles Unified School District.