Artwork for podcast Educator's Playbook
Artificial Intelligence & Embracing New Technologies
Episode 326th September 2023 • Educator's Playbook • Penn GSE
00:00:00 00:34:45

Share Episode

Shownotes

Let's delve into the ever-evolving world of technology in K-12 education. New generative AI tools like ChatGPT seemingly hit our classrooms overnight. And as it becomes ever more present, educators are finding themselves in a dynamic landscape, forced to adapt and adopt on-the-fly. By May of last year, nearly 60 percent of students aged 12-18 had interacted with ChatGPT, according to one study, with many favoring it over traditional search engines for their academic pursuits. Yet, with these advancements come both challenges and opportunities. To explore this technological shift, Kimberly is joined by Penn GSE professor Ryan Baker, an expert in online learning and how data and emerging technologies can enhance the educational experience. Also weighing in on the topic is principal, author and McGraw Prize winner Chris Lehmann, adding depth to the conversation about what it truly means to integrate new tools into schools. Tune in as we explore the merits and pitfalls of ed tech in today's classrooms.

FEATURING:

  • Ryan Baker, Professor, University of Pennsylvania Graduate School of Education
  • Chris Lehmann, Founding Principal and CEO, Science Leadership Academies

NEWSLETTER:

RELATED PLAYBOOKS:

OTHER RESOURCES:

Transcripts

Chris Lehmann (:

What does it mean to have these tools in our school?

Kimberly McGlonn (:

This is the Educators Playbook from the Penn Graduate School of Education. Oh, Technology, always moving forward with or without us. While the possibilities of things like generative AI are pretty exciting, they can also be overwhelming, especially for educators being forced to adapt and adopt in real time. Most classroom teachers and students started last school year never having heard of ChatGPT, for example, but by May, nearly 60% of students ages 12 through 18 had already used it in some capacity. According to a recent poll from Common Sense Media, just as important, the majority of those students said they're more likely to use ChatGPT than Google or any other search engines for their schoolwork.

(:

Hi, I'm your host, Kimberly McGlonn. After finishing my PhD in curriculum and instruction, I spent 20 years as a high school English teacher. Today on the Educators Playbook, we're focusing on technology in the classroom. It's something I personally have a love-hate relationship with, and I suspect I'm not alone. Let's break down the good and the bad and how as educators, we can navigate a landscape that continues to change by the day. I'm joined by Penn GSE, professor Ryan Baker. He studies how data and emerging technologies like AR and AI can be best used to study and improve online learning.

(:

Ryan, thanks for coming on the podcast. Can you introduce yourself and what it is that you do?

Ryan Baker (:

Sure. I'm Ryan Baker. I am a professor of learning sciences and technologies at the University of Pennsylvania Graduate School of Education. I also direct the Penn Center for Learning Analytics, which is a research lab dedicated to seeing how data can help to improve education both higher ed institutions like Penn and also at the K-12 level.

Kimberly McGlonn (:

Fascinating. You know, technology is changing, it's always changing, and the current hot topic in that conversational space is around artificial intelligence. How do we get where we are so quickly? It seems like AI came out of nowhere, at least in the K to 12 space. If you could help me help our audience understand what's changed so quickly.

Ryan Baker (:

Yeah, it really does feel like there was this flip of a switch and I think that historically, artificial intelligence has been around since the 1960s and at some level you could call this the third wave of it, but the difference between the second wave and the third wave was enormous. You could see maybe two or three years ago the precursors to things like today's ChatGPT and how powerful they were going to be, but I don't think just about anybody outside of that very narrow area realized how transformational GPT-4, ChatGPT-3.5, ChatGPT was going to be compared to the stuff before it. I certainly didn't, and I was working in artificial intelligence second wave kind of stuff, machine learning, which machine learning is pretty powerful too, but it was much more narrow. You spent intensive effort to build something that would work on a specific problem, and the difference from that to ChatGPT is that the latest iterations of ChatGPT, they took way more effort to build in the first place, but now they can be used on a much broader range of tasks and problems.

Kimberly McGlonn (:

We throw around this phrase "Artificial intelligence," and even hearing you repeat it back to me, I'm wondering how you define it. How do we understand how AI works and the different software that's used is a part of that ecosystem?

Ryan Baker (:

I'm sure there are better definitions out there on the web, but I would say that artificial intelligence is something that can behave in a fashion that seems intelligent, that seems like it does the right thing when something is called for. You can see that, for example, in the Turing test, which was the classic way of people trying to decide whether something is artificially intelligent or not, could people tell the difference between it and a human or not? The truth is that these days, GPT can behave in some cases, in ways that seem like it meets that test.

(:

Overall, I guess that the goal is that human beings have a general adaptive capability of doing a lot of different things and learning to do new things. Today's AI technologies have that ability to act like that, even though at some level it doesn't seem like they think at all the way we think, They're fundamentally different under the hood, and that was true of both first generation AI, second generation AI, and today's third generation foundation models, large language models like ChatGPT, they reason in a fundamentally different way than we do, but they're able to produce similar looking behavior on the surface.

Kimberly McGlonn (:

Is that because of who created them or is that the function of how they're designed?

Ryan Baker (:

It's the function of how they're designed and built. Take second generation machine learning, for example. That was the case of you find a bunch of examples of what you want to detect and then you engineer features of your data that can infer those things and then if you give enough examples and powerful enough algorithms, it can find a model for that specific case that can do what you want. GPT, and for that matter, the image generation stuff like Midjourney, and Stable Diffusion, and Dall-E work in a different fashion. They're just trying to predict, in general, across a lot of text what comes next, but you layer on that a layer of also telling it when it did the right thing and when it did the wrong thing, which is what the reinforcement learning that they used to train ChatGPT, beyond that.

(:

If you get enough examples of what follows what in text and enough examples of telling it did the right thing and the wrong thing, combined with enormous amounts of processing power, the end result is a model that can simulate what a human being would respond to something with, which is essentially what it's doing, but in a way that allows it to have this emergent property of creative behavior in cases that it clearly was never designed to do. That's what distinguishes the third wave from the second wave. It's that emergent creative seeming behavior.

Kimberly McGlonn (:

Using that word seeming over and over and over again I think is such an interesting way of reminding us that there's a degree of it that's just, it's artificial, that's fake. It presents as true seeming, but then there's some space for, "Is it true? Is it real?"

Ryan Baker (:

Well, and ultimately how do we know what is real? I don't want to get too philosophical, but Elon Musk has often expressed a belief in solipsism a belief that he's the only person in the universe. How do we know that the people we talk to are any more real than GPT? They certainly behave in subtlely different ways. GPT when it fails, fails in ways that reveal its artificialness, right? It fails in different ways than a human being would, because it doesn't really at a deep level get what it's doing, but it's still able to behave in incredibly sophisticated ways.

Kimberly McGlonn (:

Fascinating. How have you seen through your research and your field studies this appear of AI within the classroom space?

Ryan Baker (:

Second generation AI has been widely used in classrooms for quite a while, but third generation AI, which I think is what you're mostly asking about, is just starting to emerge. On the one hand in K-12, you see examples like Conmigo, which allow students to engage in dialogue, asking for explanations, discussion, question and answer with these systems, and what you see is that it's able to do a lot of things and able to give support in a richer fashion than second generation AI was, but in a way that is much faster than asking a teacher. There's also a lot of research projects emerging that are built on top of these technologies.

(:

Mycenter has been involved in several of them, including projects to create a virtual teaching assistant, including projects to create a virtual language learning system and tutorial that we're very excited, about projects... For example, Chris Callison Birch's work in computer information science here at Penn that works on summarizing teachers lectures automatically. There's a lot of possibilities for this that I think are really exciting and not least of which is just students using these tools directly themselves. There are members of my lab who are using it to help improve their writing, who are using it to get ideas, who are using it to speed up their qualitative coding of data and who are using it to help them program faster.

Kimberly McGlonn (:

Those are all such positive benefits of how we can have good relationships with AI. But I'm also curious, are there some places, cases where you see how AI could be problematic in this third wave?

Ryan Baker (:

There's tons of ways that AI can be problematic. One of the things that people focus on a lot, I think, is the potential for students to just use the AI to write their assignments for them. I'm actually less worried about that in the long-term than a lot of people are, simply because I think that that's going to be part of the world of work using these tools. We're just going to have to adapt our educational system for that world anyways. If an assignment can just be done by GPT, it becomes something much like "Do we ask people to add five digit numbers anymore in the world of work," right? We have a computer do that. Similarly, some basic writing tasks will just become things that people use computers for. On the other hand, there are areas where I do think that these technologies can be problematic, in particular when they're wrong, when they hallucinate people becoming overly dependent on them in ways that aren't critical of what their limitations are.

(:

Because right now, for example, these tools, if you ask it for citations, will make up the citations. Although they can be right about a lot of things, they can be wrong about a lot of things, too. Being able to be a sophisticated customer and know where they're right and where they're wrong I think is a really important skill. I also think that we underestimate a little bit the general risks when people use these technologies, even beyond education in non-thoughtful fashions, there are people who are hooking up these technologies to their email inboxes right now, that scares me. Actually allowing these technologies to read and send emails for them. People like Mark Ridell have shown, for example, that these technologies are prone to prompt injection attacks where people can actually put fake information on their webpage in a way that's invisible to the human eye, but that these tools can see. It ends up getting put into the responses of these tools. The potential for misinformation, propagation of incorrect facts, and people actually hacking these technologies is going to create risk for people in the medium term.

Kimberly McGlonn (:

When I think about how that plays out for classroom teachers, I'm curious as to what you think teachers can do to both spot where students have this unhealthy reliance that you're talking about, but also what's the role of teachers in figuring out how to help students navigate their relationship with this third wave of AI?

Ryan Baker (:

That's a wonderful question. I think that teachers have an important role to play in helping students know what the technologies can be used for, know what they can't be used for, know what they shouldn't be used for. Just like teachers had to learn how to help students work in a world of Wikipedia as opposed to Encyclopedia Britannica, teachers will have a key role in helping students understand how to use these technologies appropriately and effectively in the world of work. We're still coming up to a societal consensus on how these tools can be used appropriately. It's no wonder that it's a challenge for teachers, but learning to use these tools like any other tool, there's ways to use it effectively, ways to use it ineffectively and ways to use it problematically. Teachers have a role to play in helping students learn that.

Kimberly McGlonn (:

That all makes sense, for sure. It's like any other kind of literacy that teachers have to figure out how to usher in some sensibility for, for their kids.

Ryan Baker (:

Great word literacy is a great term for it. It's an AI literacy.

Kimberly McGlonn (:

But I think too, there's going to be some cheat codes that students are going to open up in terms of how to get around what they're being asked to do. I want to give teachers a cheat code of their own in this conversation. How do they go about creating those kinds of question prompts, those kinds of assignments that will provoke students to have to really engage with an assignment versus just being able to simply ask a question and then regurgitate something that's unoriginal?

Ryan Baker (:

That's a great question, and let me answer it in two ways. The first way is that teachers might think and there'll be people out there trying to sell them products that will tell them they can do this, that you can just catch the kids who do it. I wish that were true. ChatGPT Zero, I largely hear, is the best of these tools and ChatGPT Zero still can relatively easily be defeated. If you try to catch the kids, you'll catch the ones who don't really know what they're doing when they use it, who just type in "Write my assignment for me," and then copy and paste it. It's an arms race and the smartest kids, you're not going to win.

(:

The smart lazy kids are going to find ways to use these tools. Instead, you want to create assignments that help kids to understand how to leverage it and where it's limited. So for example, there are things it's good at, but right now, in 2023, if you require citation to real sources and sources that are not just somebody's webpage, largely these tools are not going to be able to do that quite right. That's a case.

(:

If you ask your students to synthesize ideas in various ways, these tools can be good at it, but if they have to synthesize it with something brought up in class, if they have to connect to things that are not on the open web, even if the student gets ChatGPT to do it for them, the process of creating that response, the process of iteratively prompt engineering, modifying your prompt over time, actually there's a lot of understanding involved in that. Basically if you design assignments such that the student has to actually think about what they're doing as they use these tools, even if they use these tools, then you'll be in relatively good shape because the students will learn how to work with the tools.

Kimberly McGlonn (:

I guess, in principle, the role of the teacher is to facilitate learning in a way that fosters a room to think critically and originally, and so much of that is the nature of the assignment or the assessment. I think that this is an opportunity for teachers to rethink what they're asking kids to do in class, in particular. Do we go back to the in-class writing prompts versus the typed writing prompts that you can do over long periods of time or is there a balance where you have to do some due diligence around, to your point, calling kids into conversation with what was shared in person, in ways where they can't be completely dependent on AI for answer finding?

Ryan Baker (:

That's right. Ways where the default answer that comes up might not be a great answer, and they have to address it critically. Rather than just giving a simple prompt, ask the student to engage with arguments given by their classmates, ask them to engage with an argument the teacher gave. Ask them to critically reflect on what GPT is saying and whether it's a complete answer. Even if the student goes through a dialogue with GPT where it says, "Well, give me the critique on this," and "Here's what the student said and give me a critique on this." They still have to think about whether that's an appropriate answer. It's that critical thinking that becomes the skill, not just regurgitating text. We've gotten to a world, basically, where the traditional essay, "Write me a three paragraph essay on the Krebs cycle." There's tools now that can do it for you. That's no longer in a lot of ways the most important part, but ask students to draw a diagram of things. GPT with plugins can create a diagram, but it's really blatant whether GPT created that diagram.

Kimberly McGlonn (:

I think that in every disciplinary area from the sciences, to mathematics, to the humanities, teachers are going to have to think differently about what they're calling students to do in terms of the evaluation that they'd like to produce, which I actually think will drive perhaps more effective assessments and more effective activities in terms of how they're bringing students in relationship with the material in ways that perhaps rote memorization and then that rote response would never do anyways.

Ryan Baker (:

Exactly, and it'll train kids for the world that they really have to work in. Used to be not too terribly long ago, that a lot of kids were being trained to work in the factory. Working in the factory now isn't what it was then, and those jobs are gone. At that point. We started to really focus in a lot of ways on training kids for a relatively low end knowledge worker jobs. Well, those jobs are gone too now, and learning how to use these tools is an incredible asset for that. It's no hyperbole to say that in certain areas of work, people's efficiency has radically increased through knowing how to use these tools properly. I wouldn't say that it's radically increased my efficiency and effectiveness, but it's enabled me to do things I couldn't do before and it's enabled me to do some things faster than I could before.

Kimberly McGlonn (:

How do you think this new landscape where AI has proliferated so much of school and work, how will it work out depending on how well-resourced the school is or the kind of work that we're trying to predict that all students will be able to participate in? How will this play out across this country?

Ryan Baker (:

I actually think it has the potential to be a leveler, not a separator, although I'll say why I might be wrong. I think it has the potential to be a leveler simply because chat ChatGPT 3.5 is not that bad at all, and it's free and you can use it on a phone and you can use it on a pretty simple phone and you can use it on a Chromebook. As a result, you don't have to have expensive licenses the way you used to. Now there are going to be a lot of companies that are going to sell you expensive licenses and build stuff on top of it, and that stuff might be better than just chat ChatGPT by itself.

(:

But the fact is, it used to be that schools had to have a very expensive library and then it became that you could get information on Wikipedia that was pretty good. Similarly, chat, ChatGPT is kind of going to be a leveler. Now, the way it might not be a leveler is that school districts are responding to it differently. Some school districts, particularly large urban districts are banning it and other wealthy suburban districts are embracing it. It might be that it becomes a separator, ironically, because schools where there's the most potential for it to be a leveler, are the very ones that are banning it.

Kimberly McGlonn (:

I'm curious if you were to think about what your research says is this trajectory that we're on and what the future of tech and AI, particularly as it relates to the classroom space, what do you see as being what happens next? What are we moving towards with this?

Ryan Baker (:

There's a lot of people out there who will tell you gladly what the future is going to look like. I feel like I have a sense of what the next three years are going to look like. The workforce is going to change for a lot of people and we're going to have to train people for that. That's three years. In 20, I don't entirely have a sense of what jobs are going to be at risk. I don't entirely have a sense of what's going to transform. I could hypothesize, but I almost feel like it's really hard to tell. The emergent behaviors of ChatGPT 3.5 and 4, compared to even GPT-2, it's changed so much in such a short time, society hasn't adapted to it, and so thinking about how that's going to adapt in 20 years and how the world of work is going to change at that point, I don't have a good sense. People better learn how to adapt quickly, because that's going to be the competency that's going to stand them well in the future.

Kimberly McGlonn (:

How do teachers in the classroom space stay in relation to all that change?

Ryan Baker (:

Well, teachers are going to have to learn to be adaptable themselves, right? They're going to have to learn these technologies and they're going to have to learn them quick and they're going to have to learn how to evaluate new technologies that come forward relatively quickly, and that was already the case, but it's going to get more so. They're going to have to be following how it changes areas of the work life that they're not themselves exposed to. They're going to have to learn more quickly or otherwise they're going to be miserving their students. That's always the case to some degree, but it's become more so now. But I think we can't just ask teachers to do one more hard thing on top of all the hard things they already do. If teachers are going to have the time and the head space to do that in our current system, which is not going to suddenly let them have an extra prep period per a day, they're going to have to do the same kind of things, honestly, I think that I'm doing.

(:

I am continually studying these tools to figure out how I can make myself more efficient. One area that I think got real promise is automated grading. Both second generation tools, like for example, the [inaudible 00:19:11] platform, which I often rave about. It's an automated homework platform and just using tools like ChatGPT to do some of your grading for you, not necessarily to do the final grade, you can still look at it, but to generate feedback, can make teachers more efficient so that they can spend their time skilling up and learning how to do this and spend their time really working with their students and mentoring them. Creating is incredibly time-consuming. That's one place where these tools can provide time savings to be able to have the time and the headspace to work on other things.

Kimberly McGlonn (:

I appreciate that as a tip, and I appreciate it as a recognition of just how challenging demands on the time of educators today is and how that continues to intensify. I think that's one of the things we have to continue to acknowledge is that that is an incredible pressure on teachers. I think that there is an opportunity to recapture some of that time and to reduce some of the emotional and intellectual burden by using these tools as supports when you're not in front of the classroom as well as when you are in front of the classroom as tools for facilitating learning. Thank you so much, Dr. Ryan Baker for joining us today on The Educator's Playbook. It's been great talking with you.

Ryan Baker (:

You're most welcome. It was a fun conversation. I hope it's interesting to everybody out there. This is an exciting time. It can be a stressful time, but there's so much potential for improving education and there's so much potential for these tools to change what the meaning of work is, and to push us all towards doing the parts of the work that really have value for humanity and automate some of those parts that were drudgerous and really aren't that important.

Kimberly McGlonn (:

Excellent. Thank you.

Ryan Baker (:

Thank you.

Kimberly McGlonn (:

Thanks again to Ryan for taking the time to sit down with me. It's really helpful to hear the perspective of someone who's been working in this space for so long, especially since mass adoption of AI seemingly happened overnight for the rest of us. While generative AI and these large language models are hot right now, what about when we move on to the next thing? How do we incorporate whatever that is into our schools, and what lessons have we learned in the past, say from allowing students to use smartphones or laptops? I'm now joined by McGraw Prize winner, Chris Lehmann, to discuss how technology fits into the ethos of Philadelphia's Science Leadership Academies. Chris, thanks so much for coming on the show. Introduce yourself and what you do to our listening audience. That'd be really helpful.

Chris Lehmann (:

Sure. My name is Chris Lehmann. I'm the founding principal and CEO of SLA Schools. Day-to-day, I am at SLA Center City, our first campus, and I work with the principals of the other two SLAs here in the school district of Philadelphia.

Kimberly McGlonn (:

When did you begin that project? Then we'll talk a little bit more about what SLA really is set up to do and what it has done.

Chris Lehmann (:

Yeah, so we opened our doors back in 2006. I was employee zero back in 2005. SLA is a fully inquiry driven and project based school here in the district. Everything that we do is built on our five core values, inquiry, what are the big questions we can ask? Research, how do we find answers to those questions? Collaboration, how do we work together to make those answers deeper, better, richer? Presentation, how do we show what we know? Reflection, how do we step back and learn from what we've done?

Kimberly McGlonn (:

We go back to 2005, 2006, and you come up with this idea that there should be these guiding principles. How did you arrive at that as a framework for what SLA would aspire to do?

Chris Lehmann (:

When I was a teacher, I was a teacher in New York City, in the progressive schools world there and saw a lot of what project-based learning could do, and at the time, the school that I was at was probably 80% of what I thought a school could be, and that 20% was kind of that loose tooth I could never stop wiggling, right? Philadelphia announced that they were going to start a whole bunch of new small schools. They eventually did five. I really thought like, "Okay, I'm from Philadelphia," grew up outside the city. I went to college inside the city, and I really wanted to make a difference in my hometown.

(:

This idea of inquiry driven and project based was always a powerful idea to me. They knew they wanted to start a science high school, but they didn't necessarily have a big vision for what the school could be. What I was able to pitch to them is this idea that you could create a learn by doing high school, and that was this idea of what are the big questions we can ask? How do we seek out answers to those questions? Then marrying that to modern tools and modern ideas and it stemmed from there.

Kimberly McGlonn (:

That makes a lot of sense. The landscape of education writ large has changed so much since 2005, 2006. Thinking even more specifically about how digital tech has changed the world of education, how do you see those changes? How do you conceptualize those changes when we think about the last 10 to 15 years?

Chris Lehmann (:

Well, I think what we're seeing is that our world has changed so much. Obviously, the pandemic was a massive shift, but even before the pandemic, everything from how we get our information, to how we shop, to how we communicate with others has really gone through a revolution and schools have got to do the same thing. The notion for us was and has always been, what does it mean to have these tools in our school? How do we leverage these tools in such a way? I think there's a lot of stuff out there where if you go to the exhibit floor of a conference now people will sell you electronic curricula.

(:

Or they'll sell you the new ways to monitor, or the new ways to test, which is about as thin value for these tools as possible. The question we always ask is, "How do we leverage the tools at our disposal to allow kids to become more active and involve citizens, to allow them to become content creators as well as content consumers, to allow them to be better collaborators with people both in and outside of our walls?" Then also, I think in what we've seen certainly in the last three or four years is how do we get kids to critically examine the way these tools use us? We're living in a fascinating moment where the attention economy has become one of the really profound pieces that we all go through. The average person in America spends four hours a day on their cell phones. Now, whether or not that's a good thing are questions we need to ask. We need to point out to kids that the folks who are designing the apps, the folks who are wanting us on our phones are doing it for a reason.

Kimberly McGlonn (:

Yeah, there's an agenda there.

Chris Lehmann (:

Attention is the agenda. If you're not paying for an app, you're not the audience, you are the product.

Kimberly McGlonn (:

That's right.

Chris Lehmann (:

Right.

(:

Looking at helping kids to understand why they use a tool, when they use a tool, how they use a tool, and what that is doing to the way that we think, to the way that we engage in our world to the ways that it affects how we see ourselves, those are really, really important questions that we have to ask these days. We're asking all of that in the context of inquiry driven curriculum that is asking kids to be authentic agents in their world that is relevant to the people they are today. In doing so, we have to consider these tools.

Kimberly McGlonn (:

I'm curious as to which of these tech strategies, tools you've been experimenting with and implementing at Science Leadership Academies and what you've learned along the way of choosing them.

Chris Lehmann (:

We use a lot of different tools. We love Canvas, we're big fans. We love the Google Workspace. Those are the two every day, every minute. We obviously love video conferencing and the ability to have these kinds of conversations. On some level they're so integrated into everything we do it, they allow us to kind of have the education we want. But to give you just some anecdotal stories, this is last year when our 11th graders were looking at creating short videos that were issue oriented, that they were looking at how do we change our world? How are we active agents in our world?

(:

We were able to have video conferences with folks that we couldn't get in the building. For our kids to be able to have a video conference with their senator was an amazing thing, for kids to have video conferences with anti-gun violence activists who couldn't get to school, the ability for all of these tools to allow you to see your school not as a black box, but as a permeable membrane that the world can come in and out of and you can come in and out of, right?

(:

The idea that kids are creating films and podcasts and work that exists in the wider world and that they are getting audience for that work and that it is authentic and real, and that work can have meaning is really, really powerful. The idea that kids can collaborate not just when they're sitting next to one another, but at any time is incredibly powerful. The idea that we can construct just thoughtful ways to organize curricula so that every classroom is a blended classroom, these are all the kind of tools that you can do.

(:

Again, there's dozens of them. But I think the other question is understanding when we look at social media, it's not about the tool, but about the education around the tool. When we take a look, we did a school wide, we call it a technology reboot last year. We really found coming back from the pandemic, kids were struggling to stay off their cell phones in ways that we had never seen before. The advancement of the attention economy and the advancement of the neuropsychology around the way not just kids, people use these apps has really gotten much more sophisticated. We did a whole school-wide conversation around what does it mean to be intentional in your phone use, and what does it mean to be intentional? Don't just scroll. Or what does it mean and what does this doing to the way that your attention works? What does this doing this to your in-person interactions, right?

Kimberly McGlonn (:

Yeah. It's driving more self-awareness so that there's a healthier relationship versus being owned by the tech, but still trying to pursue some capacity to maintain the relationship from a position of agency and self-possession.

Chris Lehmann (:

That's right.

Kimberly McGlonn (:

That's huge.

Chris Lehmann (:

I don't think schools that just collect the cell phones in the Yonder pouches are necessarily getting it right, because if we don't engage kids in these conversations, we absent ourselves from them. Adults need to be involved in these conversations. Now that being said, our cell phone policy at school is like if you're in class and you're not using your cell phone for a classroom reason, we don't want to see it out. Obviously, you get a call from your parents, step outside, take the phone call, but we've really tried to message to kids that class isn't the time for TikTok scrolling.

Kimberly McGlonn (:

Yeah, no, that makes a lot of sense.

Chris Lehmann (:

We're having conversations as a school. What does it mean that if you don't look at your phone for 65 minutes, the first thing you do when you have a break is look at your phone. What does that mean and what does that say about your attention and what does that say about-

Kimberly McGlonn (:

Your investment?

Chris Lehmann (:

Right.

(:

The nature of these devices.

(:

Obviously not every kid, every period, and not just kids, but adults, too. God knows if I'm in a meeting and I keep my cell phone in my pocket, the first thing I do when I get out of that meeting is check my cell phone. I think we all do that, and I think that that's a piece of this puzzle that's really important in all of the things that we do with kids is that we don't just talk about kid behavior. You hear that in schools all the time. I think David Perkins, with his book Making Learning Whole talks about this idea that school is the junior varsity version of the adult world.

Kimberly McGlonn (:

Which means when you're in adult world, you're still in high school.

Chris Lehmann (:

Right. What we try to do all of the time is remember that the behaviors we see in schools are part of a human continuum. Then it's not just kid behavior. We're actually, that's othering in this very interesting way, right? When we say this is human behavior. If we can recognize in ourselves behavior we see in kids, then we remember that these things that kids are doing are part of the human experience. It may look different because they're 15, not 52, but the procrastination technique that a 14-year-old might use isn't necessarily all that different than what their 52-year-old principal might do when he's trying to get out of work. If we see that as human behavior, not kid behavior versus adult behavior, we allow ourselves to humanize the ways in which we deal with trying to get all of us to engage in more productive behavior, be more thoughtful and kind in the way that we come up with ways to keep kids engaged in school as opposed to the punish and lock away and what have you.

Kimberly McGlonn (:

Just given your proximity to how this really complicated tech landscape is playing out, what is some advice, some resources that you would recommend for educators who are trying to incorporate more technology into their classes? They're not trying to reject the presence of cell phones.

Chris Lehmann (:

Again, not about the tech.

(:

What we talk about is what is an in inquiry driven and project based curricula? What are the frameworks for curriculum design? What are the frameworks for authentic work? Then what are the tools necessary? I think that what I would say to any educator is not "How do you use more tech," but rather "What are your pedagogical goals and what are the tools you need to get there?" That, to me, is a more helpful lens. I find that a lot of times when people put the technology first, what you get is the whizzbang stuff.

(:

You don't get authentic use. For me, an old friend of mine used to say that we don't talk about taking kids to the pencil lab. What are the tools necessary to fully realize the goals of your curricula? By doing that, your goal is in the right place. If you say, "How do we use more tech," then you're at the mercy of the vendor. Whereas if you say, "What does it mean to be a student at this school? What do we hope for our kids?" Then you say, "What are the tools necessary to manifest that?"

Kimberly McGlonn (:

I think that one of the things I appreciate so much about what Science Leadership Academy does is the way it frames the guiding question around not what we're going to do to achieve whatever it is that we want to achieve, but why we're moving in that direction. Thank you you, Chris, for sharing with us your journey in establishing Science Leadership Academy and how it's really done so much to set what you're doing there up for being a lab, not just for the resources and the tools, but the approaches and the guiding philosophies that will ultimately drive change.

Chris Lehmann (:

Thanks. My pleasure.

Kimberly McGlonn (:

We'll talk soon. Chris, thanks again.

Chris Lehmann (:

All right, take care.

Kimberly McGlonn (:

Thanks again to Chris and thank you all for listening. Speaking of technology, be sure to subscribe, like and review us on whatever device you're using to listen to this podcast and be sure to check out our newsletter and online articles archive at educatorsplaybook.com. Some of the most useful advice I ever receive came from fellow teachers. Here's a helpful tip that you, too, can implement in your classroom.

Listener (Quinn) (:

Hi, my name is Quinn. I'm currently a teacher in West Philadelphia and I've taught for seven years. One tip I'd like to offer is that if you're looking for a brain break, one really fun thing to do is give kids five minutes to look up trivia questions from a category of your choosing and have them ask them of you if you have a little downtime. It's a really good way to get up some competition between you and them, which kid doesn't love a little competition. If you do well, you seem like a genius, that can never hurt in front of kids.

Kimberly McGlonn (:

What works best for you? Give us a call at (267) 225-4413 or share your own advice on social media and tag us with the hashtag #PennGSEPlaybook. Educator's Playbook is produced in Philadelphia by the University of Pennsylvania's Graduate School of Education in Partnership with RadioKismet. This podcast is produced by Amy Carson. Our mix engineer is Justin Berger. Christopher Plant is RadioKismet's head of operations, and Ben Geise is our project manager. Matthew Varhoss is our executive producer. I'm your host, Kimberly McGlonn. Thanks so much for listening. This has been Educator's Playbook.

Chapters

Video

More from YouTube