What if the data that matters most isn't grades or test scores, but the messy, beautiful process of how students actually learn?
Erin Mote, CEO of InnovateEDU, joins Brett and Rebecca for a timely conversation on National AI Literacy Day 2026. Erin founded the EdSafe AI Alliance and leads a network of educational partnerships touching thousands of school districts nationwide. Her background spans enterprise architecture, personalized learning platforms, and global education technology initiatives.
This episode tackles one of education's most pressing questions: who owns the learning process data that AI systems are quietly collecting? Erin introduces the "ground lease on the family farm" metaphor—describing how foundational models are capturing the intellectual property of teachers and students to fuel AGI development. The conversation moves from policy to practice, exploring the White House's new AI framework (released the day of recording), EdTech accountability gaps where safety features become paid add-ons, and emerging research on AI bias in grading. Punya Mishra's work at ASU reveals how student dialect and cultural references can lower AI-assigned scores, raising urgent questions about fairness and trust.
What You'll Learn:
AI literacy as discernment — Why the Blueprint for AI Literacy focuses on critical thinking grounded in the science of learning and development, not just tool proficiency
Learning process data vs. PII — How students think, correct mistakes, and sequence productive struggle—and why this data is foundational for AGI
EdTech accountability tensions — The pattern of features pushed open by default while safety becomes a paid upgrade, and what shared responsibility really means
AI bias in grading — Research showing how dialectical differences like "y'all" and cultural preferences (rap vs. classical music) affect AI scoring
National AI Literacy Day evolution — From its founding three years ago to 140+ supporting organizations in 2026, with statewide events, year-round curriculum, and student town halls
The episode features two rounds of The Rhythm Project's AI Effect game, exploring AI-generated apologies and the ethics of using AI to grade 140 essays, plus Ocean's 11 recommendations from Erin's dream team of education innovators.
Brett and Rebecca bring fresh perspectives from recent work: Brett shares insights from presenting with Utah's Matt Winters on six panels exploring humanity in AI policy, while Rebecca reflects on governance research at Edinburgh Futures Institute as she completes her master's in AI ethics.
Tune in, subscribe, and share if you're ready to turn up the volume on what's possible in education.
Transcripts
Rebecca Bultsma: I think what bothers me the most is companies who have safety features that are add-ons and paid features.
Rebecca Bultsma: It makes me crazy like the fact that if you want to try and protect students, you have to pay extra for that.
Erin Mote: Between Congress and the White House here, between states, the White House and Congress, we're gonna have to see how this shakes out. But the thing I am really delighted about is how child privacy data protection is table stakes in this conversation.
Brett Roer: We are live. Welcome everyone to the AmpED to 11 podcast. We are joined today by the incredible Erin Mote from Innovate EDU. How are you doing today, Erin?
Erin Mote: I'm great. How are you guys?
Rebecca Bultsma: Living the dream.
Brett Roer: Before we dig in, Rebecca, obviously my amazing cohost is joining us today. Rebecca, what is one thing you need our listeners to know about what's going on in your world of AI?
Rebecca Bultsma: Oh, there's always so many interesting things. I just got back from the UK where I spent a couple of weeks, uh, and there's just great work happening all over the place. Interesting research. I was at, uh, the Edinburgh Futures Institute, and on the nerdy side of things there's lots of great work happening in the realm of governance, worldwide governance, and interesting papers being written that nobody who's listening to this podcast cares about.
Rebecca Bultsma: But there is good work being done, and I'm excited to talk to Erin about the work she's doing a little closer to home today.
Brett Roer: Amazing way to really excite the crowd. So we're filming this, we're recording this on Friday, March 20th. That is important because this is going to be our special AI Literacy Day episode release, and Erin is one of the pioneers and leaders of AI Literacy Day here in the United States and beyond.
Brett Roer: So, Erin, for folks who don't know you: who are you, what is InnovateEDU, and what does all the work you touch look like?
Erin Mote: Hi, everybody. Well, that's a, that's a lot of questions all in one. So I'm Erin Mote. I'm the CEO of InnovateEDU.
Erin Mote: And I'd like to say we're a house of brands, not a branded house. Uh, because at InnovateEDU we house a number of large scale multi-stakeholder alliances, and that means we touch two in three school districts in the United States through those alliances, with partners who work side by side on a set of issues that's really about radical system transformation.
Erin Mote: Those alliances run the gamut, from workforce ecosystems to the EdSafe AI Alliance.
Erin Mote: And we have some other alliances that are focused on what I would consider more practice-based work. So Brett, you know, well, the Educating All Learners Alliance, really this uncommon alliance of over 175 partners that are focused on new models, ways to come together to serve students with disabilities across the ecosystem.
Erin Mote: So we have folks rowing together across lots of different parts of the ecosystem.
Erin Mote: So lots of different topics that this incredible team at InnovateEDU touches. I have the most incredible team of mission-oriented, really thoughtful advocates for change who think about how we put our partners at the forefront in driving these conversations. How do we center the expertise, wisdom, and knowledge of practitioners from the field?
Erin Mote: And most of all, how do we create sort of timely, relevant tools, some of which utilize AI, to really guide the field towards evidence-based, research-backed practices? So, um, many, many hands in this work, which I think is always the most important thing to say, that I'm just one person as part of an incredible set of partners.
Erin Mote: It's our partners, our leaders, and really my team who make all of this work possible.
Brett Roer: That was a very succinct, well done version. Oftentimes when people interact with Erin Mote or any of those organizations, they've only seen bits and pieces of it.
Brett Roer: And thank you for kind of explaining how they all come together and support this incredible work. As we said, we are dropping this on National AI Literacy Day. I'd love it if you can let our listeners at home know the origin of this, the trajectory and the roadmap, and like what are some things that maybe today they should know about that might have been different from one year ago for national AI literacy.
Erin Mote: We founded National AI Literacy Day three years ago, really focused on building awareness and understanding of this technology.
Erin Mote: What is it, and how should we think about this technology? It's about discernment and critical consumerism, but also the ability to have really important conversations around skills and capacity, and to begin to dive into what this technology does. It's important for us to name that when we started this work, it was really driven by superintendents and leaders in the field who were focused on how to start.
Erin Mote: And so we started with a small set of founding partners.
Erin Mote: So I wanna give Amanda Bickerstaff and her team a shout out there for really helping us originally conceive of this day and what it would mean to have a rally point around AI literacy. And how it's evolved over three years, I think it's really become a movement for AI literacy. The website itself, and how we think about the day, has really evolved and changed.
Erin Mote: So we work in partnership with Hour of AI. We work in partnership with Day of AI. We work in partnership with All4Ed and the work that they're doing on Digital Learning Day to blend digital learning and media literacy with AI literacy. And so even when you go to the website now, the materials are year round.
Erin Mote: It's not just one day a year that you should be doing an AI literacy lesson. It's a recognition that we need to come together and really build this national movement. And so I'm really excited to see the growth in supporting partners. As I'm talking to you, we have over 140 supporting organizations who are doing events and activities and curriculum on that day. We have statewide events happening in Florida, North Carolina, Colorado, and Ohio that are really focused on how they can be in service to districts, communities, and families.
Erin Mote: And I think it's really a day for everyone: educators, students, families, and communities.
Brett Roer: Amazing. I'm gonna turn it over to Rebecca in just a moment because a lot of things have just changed even in the past week on a national scale here in the United States when it comes to AI literacy and frameworks. One thing I just wanted to share about all these events, where can folks go, right? This is gonna drop the morning of National AI Literacy Day.
Brett Roer: Where's the website where they can go and find all of these and engage with them on that day?
Erin Mote: Everything is on the National AI Literacy Day website.
Erin Mote: And one of the things that I think is really important about how we're thinking about National AI Literacy Day moving forward is all of the curriculum, all of the PD, all the live sessions that happen that day will be recorded. And so folks will have access to gain insights anytime, anywhere over the next year.
Erin Mote: And we'll keep adding to that. An educator who only has time for a 15-minute lesson on National AI Literacy Day? That's awesome. And then come back for a 45-minute lesson a month later, and really begin to build our fluency with AI in a way that's developmentally appropriate by grade band and grade level.
Erin Mote: And we're hosting a live event that day in partnership with Matt Winters and the state of Utah.
Brett Roer: So for those that are out there, you can find that on the website to register and join us. And you named another amazing partner in this, Matt Winters. So last week I got to present on six panels with him, including the state of the state of national artificial intelligence.
Brett Roer: And so I wanna just give a shout out to the state of Utah, which I'd never been to before and which I'm in love with. But I wanna share one quick anecdote, and then I would love for you to talk about the national framework with Rebecca, who's much better equipped to talk about the implications of it.
Brett Roer: At the end of the event, one of Utah's state leaders closed things out.
Brett Roer: And she gave this amazing, amazing closing speech about the importance of humanity in the world of AI and education, which is obviously something that resonates with all of us. And she said this line that was so powerful that, in a room of about a thousand people, I just started clapping and no one else did.
Brett Roer: And I went up afterwards and said like, that was me awkwardly clapping. But it was great. But the reason I wanna shout her out is before I said that to her, she was talking to a teacher. And like, she's on a pedestal, literally like on the stage. But like she got down on one knee and was like eye to eye and like looking this person in the eye.
Brett Roer: And I was just like, that's awesome. And then the same with me. And then she like, okay. And then she like sits down on the stage so she could be eye to eye with me and have like a deep conversation even though she's in the heart of legislative season and she's like very self-deprecating. She's like, that wasn't even a good speech.
Brett Roer: So with that, Rebecca, take it away, and let's discuss this amazing national framework.
Rebecca Bultsma: And I do wanna talk about that. I just have a quick question for you first. It's something I've been thinking about, and it has to do with something you actually already brought up: how there are grade-appropriate bands and ways to be looking at this. And I think about that a lot, because I feel like at some point it morphs from general digital literacy, uh, for the younger kids, into more AI literacy.
Rebecca Bultsma: Tell me a little bit about where you see that overlap. Do you see it as an overlap, or are we straight into AI literacy now instead of digital literacy? And when do we start talking to kids about AI specifically?
Erin Mote: Well, we released a, a blueprint for AI literacy that took this on from a grade level band perspective, and I'd encourage people to dig into that and to take some time to read that.
Erin Mote: But to your question, Rebecca, of what is at the core of AI literacy.
Erin Mote: So discernment, I think the ability to engage in discourse, discourse and dialogue, the ability to interrogate the inputs and the outputs of this technology is incredibly important to build that critical consumerism that I think is gonna be essential, not just for kids, but frankly even for seniors right now who might be getting, you know, different types of messages or phone calls or, or seeing things online.
Erin Mote: The ability to discern what is real and what is not.
Erin Mote: Productive struggle, um, with the technology, right. I think we are very focused at EdSafe AI in orienting the building of AI in education towards the learning sciences. We'll be releasing a benchmark in September with an internationally recognized group of experts, folks like Rose Luckin and Ryan Pam.
Erin Mote: When our youngest learners are working on these tools, it's not about tool proficiency. It's about foundational thinking skills.
Erin Mote: How do we classify things? What things are alike or different when you're in kindergarten, first, and second grade? And then I'll just say the thing I'm most focused on right now from an access and opportunity perspective, and this is why I'm really excited about the DOL, the Department of Labor's AI literacy framework, which talks about broadband as a prerequisite for, um, AI literacy, is who has access to these tools?
Erin Mote: And how do young people learn to augment their team with AI?
Erin Mote: How do they still gain the skills to work in a multifunctional team? But one of those teammates might be AI, and how do you manage that teammate? How do you manage the inputs and the outputs? How do you manage that in your process flow? And um, as an enterprise architect, I'm digging into this all the time.
Erin Mote: AI is so much my teammate on lots of different things. You know, I really push some AI tools, I think, to the absolute limit when it comes to coding and full stack development.
Rebecca Bultsma: I think what you said just made me think about why the AI literacy piece is so important. Because I think foundationally we lack a common language and understanding, and to me, that's really what this AI literacy piece is about.
Rebecca Bultsma: Because, um, you're right, there are all of these different terms that we're all understanding differently, I think, um, and places where they overlap. And I think that common language is gonna be a huge part of getting everybody on the same page. Seniors, parents, adults, leaders, and students. Uh, because I'm seeing that disconnect happening in a lot of places.
Rebecca Bultsma: Even the idea of AI, right? It's such a broad term, and then AI literacy, what does it even mean when AI means a hundred different things? So I think a huge part of this conversation is going to be developing a common language that we can all speak and understand.
Erin Mote: I totally agree with you, and I think you make an excellent point there about AI.
Erin Mote: And the things we were most worried about when we started this work have shifted, in ways that even took me by surprise. The term itself keeps changing underneath what we're actually talking about, because the technology keeps moving.
Rebecca Bultsma: Brett, go ahead.
Brett Roer: I could share some wisdom that happened yesterday. So we're working with 10 districts across Ohio, building AI community playbooks, and the biggest unlock has been, we mandate silence until someone has hit record, with consent, for wisdom collection. And one of the practices we do is we give them, you know, a glossary of, let's say, 50 of the most common, ubiquitous terms we find across frameworks.
Brett Roer: And we have them sit down and say, like, do you understand this as it's written? If not, what seems unclear to you? And we had them do this with a group of eight to 10 district members, from teachers all the way up to superintendents. And it eliminates the idea of having to wordsmith it.
Brett Roer: And then it creates a shared definition in the community's own words.
Brett Roer: And then you can obviously scaffold it up or down, but like get to the heart of what are you trying to make sure people understand. And then that way when you use that term throughout the community playbook or then go backwards and change your instructional philosophies and frameworks, you've actually understood what your community was trying to ascertain there.
Brett Roer: And like, it's just a different way of thinking about it. But then everyone actually has common language in a common playbook. So just highlighting that it doesn't mean five people sitting down and trying to rearrange a sentence, which is how I feel like we used to do that in education. It's: just listen, present things and get reactions, and then AI is a great tool to hopefully gather all that wisdom and mirror it back to your community.
Brett Roer: Just sharing that, because the efficiency allows you to do that and mirror back and get immediate revision, is just something I'm, I've noticed out in these districts that they're not used to doing it that way and they just can't believe you can do that in hours instead of never or months. So just highlighting that.
Erin Mote: I'll call out two districts in California that did exactly this.
Erin Mote: And that was a brave thing to do two years ago, to go out to the community and prioritize the community first in these conversations. And among the other districts that I've seen do this really well, one is Santa Ana, also out in California, who oriented those conversations around their portrait of a graduate, updating their portrait of a graduate to consider an AI-infused world.
Erin Mote: What's so beautiful about both of those examples is that they started with their communities.
Erin Mote: And again, they used that as a starter to begin to have those conversations in community. So shared orientation, and that calling in of communities. I really appreciate that wisdom.
Brett Roer: Erin, if you could really share with everyone what just was released, I believe you said it was this week from the US government and what that might entail.
Brett Roer: This morning, on March 20th, this was dropped. So hot news. If you could share with everyone what it is, maybe where they could access it, and any of the roles that you, or the organizations you're in partnership with, played in it. And then what does that mean for next week, and then obviously beyond?
Erin Mote: Sure. This morning the White House released its new AI framework.
Erin Mote: And it's only four pages, that's the good news. It's on the White House website. We'll be sure to put it in the show notes here. And you know, I will say, I'm not gonna comment on the content other than there's a set of domains. It's really important to know where they start: the first domain is child safety and child privacy.
Erin Mote: It's table stakes. And frankly, I think table stakes for communities across this country. Table stakes for the First Lady in particular, around making sure that we're emphasizing child safeguarding when it comes to digital ecosystems and AI technology. There's gonna be a lot of work over the next couple weeks with a lot of different players in Congress, whether that's Senator Blackburn and the work that she's put forward.
Erin Mote: Or Senator Cruz and the work he's putting forward.
Erin Mote: There's also the work that's called out around AI literacy and workforce development.
Erin Mote: And then maybe the thing that people don't think is most sexy, but I think is the sexiest part of it all, which is AI infrastructure for equitable global development. Um, research and development, testing and evaluation, like sandboxes, calling out Utah again for their extraordinary work on that testbed infrastructure.
Erin Mote: And then the ability to have data sets, so that we're creating the type of public infrastructure. Rebecca talked about the UK; they've done incredible work there in creating public infrastructure that allows everyone from startups to big tech companies to really test their models, test their data, against nationally representative data sets.
Erin Mote: Between Congress and the White House here, between states, the White House and Congress, we're gonna have to see how this shakes out. But the thing I am really delighted about is how child privacy data protection is table stakes in this conversation.
Rebecca Bultsma: I agree with you a hundred percent, a thousand percent.
Rebecca Bultsma: That's something that's been on my mind for a long time, especially around EdTech. It's part of the reason I sometimes struggle to work with EdTech companies. It's not necessarily their fault, but the way the system is set up, there's a responsibility put on schools and superintendents that maybe doesn't fully belong there, right? That should be more of a shared responsibility.
Erin Mote: It's a huge burden on our schools and our districts and our superintendents and our tech directors and our educators, and, and it creates, I think, a situation where they're just like struggling to keep their head above water when it comes to the sort of changing regulatory undergirding.
Erin Mote: And you see companies pushing AI as features turned on by default, not as something districts chose to opt into.
Erin Mote: You're like frantically trying to get the chickens back in the chicken coop before the fox gets in the hen house. And so just really, really, I hope that we will have a clarion call here around what is the shared responsibility in this space.
Rebecca Bultsma: A hundred percent. And I, I think what bothers me the most is companies who have safety features that are add-ons and paid features.
Rebecca Bultsma: It makes me crazy like the fact that if you want to try and protect students, you have to pay extra for that.
Erin Mote: That's a bee in my bonnet, Rebecca.
Erin Mote: If you wanna do business in schools, you have an enhanced set of responsibilities to be a steward of young people. That's not just on the director of tech.
Rebecca Bultsma: And you've named the thing that bothers me most, which is business in schools. The idea that there is this much for-profit business in schools, sucking money out of the school systems and then also harvesting the data of students, sometimes unbeknownst to the school leadership just based on the fact that they have so much going on. Plus the level of burnout it's contributing to for our leaders and our staff trying to manage these escaped chickens, as you've mentioned.
Rebecca Bultsma: Okay, switching gears, because I know you're always experimenting and playing with the latest tools.
Rebecca Bultsma: Oh great, now I can control Claude Code from my phone. How am I gonna use this? Right? Like, all of these latest features. So I wanna know a little bit about your stack, your favorite tools right now, what you use, what you're using 'em for. Nerd out with me for a sec.
Erin Mote: Yeah, I mean, I think I'm pushing Claude Opus, like, to its absolute limit.
Erin Mote: I have like, you know, I think like many people, I just went and got a Mac Mini because like, I can't, I can't. I can't have it on my regular computer right now. And so, you know, I, I use Opus. It's really like my force multiplier. I don't just use it for snippets. I am really doing some architecture of like, functional prototypes.
Erin Mote: Things like enrollment and graduation rate trends, population and zip code data.
Erin Mote: And so it's really hard to think about closing schools. Those are real students, real communities, real families. And so how do we build tools that allow you to at least have data as a neutral actor, and synthesize a lot of different types of data? Maybe not even educational data. Again, population data, workforce data, to be able to make decisions that are best for your communities.
Erin Mote: And then visualize those data in ways that they're understandable to communities.
Erin Mote: And I think that's really the power of some of these tools, to take the invisible and make it visible. That's the type of stuff that I am really, really enthusiastic about. And then of course, I use some tools to just monitor my ongoing pipelines, uh, so I know when something breaks, because it breaks all the time.
Rebecca Bultsma: Amazing. Love it. Love it. I'm here for all of that. Brett, we'll let you talk for a second, because we could easily keep going. I feel like Erin and I are best friends now. We bonded over chicken...
Erin Mote: Bonnets.
Rebecca Bultsma: Yep.
Brett Roer: So when you said the bee in the bonnet, I'm glad I was on mute, because I was cackling over here. But I shared this with Rebecca again.
Brett Roer: One of the panels last week was in front of mostly chief technology officers and district tech leaders.
Brett Roer: Right. So with all these things you're all just talking about, like, which is the right one, and when to choose which, I guess would be the best way to say it. We had some EdTech leaders on there, and myself and Matt Winters on there. And there was a gentleman in the audience who was passionate.
Brett Roer: I loved it, and I wound up feeling like I was moderating a little with Matt to make sure this stayed productive. But afterwards I talked to this gentleman. Turns out he's the technology director of the largest school district in Utah and the head of their state technology team. So he's basically the gatekeeper, and Utah has this great DPA that applies across all districts.
Brett Roer: And we started talking about, I said, you know, and again, not knowing who he was, just trying to like keep the peace and just make sure this is a productive use of everyone's time. I said, you know, sir, like you're so passionate and informed about this. I made sure with Matt, I said, Matt, can, can districts like add addendums to the statewide DPA?
Brett Roer: And he said yes, there are addendums they could be adding that would strengthen protections.
Brett Roer: I was like, great. But I'll turn it back to you all, because I would love to hear, not knowing their DPA but in general: what are some exact addendums, rules, or ideas that you wanna make sure EdTech directors or chief technology officers out there are raising? Like, definitely make sure you've asked this exact question or this type of question, and see if you can get that in writing from them.
Brett Roer: Uh, and, you know, let's move this, let's make this informed anger productive for people out there. So open it up to both of you, you're both leaders in this field. What are some things you might do if you were in their seats to keep EdTech companies more honest and protect your students' data?
Rebecca Bultsma: Well, I haven't read what's already there, so I'm sure some of this is already covered.
Rebecca Bultsma: But I would want to know who their third-party partners are and who they're sharing data with.
Rebecca Bultsma: Uh, often the sharing is actually monetized, right? So they're charging the district, making money there, and then they're selling some of the data or sharing it, whatever arrangement they have. And student data is ending up in a lot of hands. And we know, as part of the research that I'm doing, that one of the biggest risks of AI for students and children is that data collected about them now will follow them for the rest of their lives and be used to make decisions about them, but also to manipulate them, to tailor their algorithms, and to tailor ways that will impact them in the future.
Rebecca Bultsma: And what happens to that student data? We shouldn't assume that AI is neutral or that it's just making the invisible visible. The truth is, we don't know what it's doing. We don't know how it's working or how that data's being used to make decisions about students. We've seen this before. It happened in the UK: people were using AI to grade 'cause they thought it was nice and neutral, and it ended up causing massive problems with university entrance.
Rebecca Bultsma: It graded students from lower-income public schools lower than students from private schools. And so we just kind of need to know how that data is being used to make any sort of decisions about kids anywhere. I would have so many questions, um, because kids can't consent. Like, we're making decisions about student data now that they can't consent to.
Rebecca Bultsma: And what data actually needs to be collected in the first place.
Rebecca Bultsma: And then what happens to it and where it goes. I have, I could talk forever about this though.
Erin Mote: Yeah. So we released a set of resources that I would encourage folks to look at, one for superintendents, and then another set for CIOs and CTOs, around what are the 12 to 15 questions you should be asking before signing a deal.
Erin Mote: So it's everything from, to Rebecca's point, is this data being used as training data? What do your third-party subcontractors look like? Where's the data being stored? Is it being offshored? This is a really important thing to think about: as the cost of tokens and processing goes up, it's more expensive to store data domestically than it is overseas.
Erin Mote: And then some alignment questions, uh, around values and safeguards.
Erin Mote: Really important that we're stewarding this. But I also, you know, get to look at term sheets of deals from foundational models and from EdTech tools for some of our districts and states, because we're in a relationship with them, um, as policy labs. And here's the thing that has me most alarmed: they are clear about how and what to ask for around safeguarding PII, because I think we as a sector have really hit that over the last five to seven years.
Erin Mote: There's been lots of training and resources, and we still have work to do, but I think folks are attentive to some of the questions that Rebecca's raising about profiling and tracking and that type of thing. The thing that I don't see us safeguarding is the intellectual property of teachers and students.
Erin Mote: It's like signing a ground lease on the family farm. These foundational models are capturing the intellectual property of teachers and students to fuel AGI development.
Erin Mote: And what do you need to know to shape AGI? You need to know how people correct mistakes, what mistakes people are learning from, how people sequence the productive struggle of learning. If you look at the things being built and developed at Google DeepMind and others, they're thinking about the learning process for AGI.
Erin Mote: They want your learning process data.
Rebecca Bultsma: Erin, I love you. Like, I love being able to finally have these conversations with people, because it's something nobody's talking about or thinking about, and you are 100% right. That's part of the reason I have a huge problem with AI detectors in school. I have big problems with AI detectors in general, but what we're not talking about is that you're taking student intellectual property and using it in models that are unapproved.
Rebecca Bultsma: I think the most fascinating part of what you said, because it's a hundred percent right, is that we've also seen it in other directions with these foundational models. I think it was last Christmas that OpenAI launched the ChatGPT shopping feature. And it wasn't about shopping. It's about seeing how people make decisions, what persuades them to take an action, and what it takes to help someone make a very specific decision.
Rebecca Bultsma: Thank you for naming that.
Erin Mote: And as an enterprise architect, if I was building these tools, I'd think about like what's the biggest sociocognitive modeling experiment that I could get at? And for me, frankly, that would be schools and higher ed institutions, because that's where I know that there are millions of data points of interaction in the learning process.
Erin Mote: And so it's not an accident that we see the deployment of free tools, um, in order to capture this data. There's a whole set of business pressures here that are driving why AI companies are so attracted to education.
Rebecca Bultsma: And it's money. At the core of it, it's money, right? All of this, it's money. So yeah. Anyway, go ahead, Brett.
Rebecca Bultsma: Like I said, I feel like Erin and I are BFFs now. We could probably do this forever. You're just gonna have to cut us off at some point.
Brett Roer: All right, let's move us forward.
Brett Roer: We know one of the organizations that we're all big fans of, and we try to use on the show, is The Rhythm Project's AI Effect game. Because as we know, there's no right or wrong answer to these scenarios. They're difficult. There's gray areas, there's tensions. So Erin, we're gonna play one round of the AI Effect game.
Brett Roer: So I'm gonna share my screen in a moment, and because you're our esteemed guest, you are gonna pick the scenario that most resonates with you that you'd like to explore. And then we'll go over the rules one more time, and then we'll have some fun with it. So I'm just pulling up a PDF of the cards. The AI Effect game is the game to uncover how AI can strengthen human connection, or where it might pull us apart.
Brett Roer: …you'll complete the sentence with either support or oppose.
Erin Mote: Yes.
Brett Roer: Great. I am gonna scroll very slowly.
Brett Roer: If you see one that just catches your eye that you would be excited to do, just just say stop. And we will do that one. Sound good?
Erin Mote: Great. Okay. You're gonna test my eyesight. How about we talk about generating an AI created message to apologize to a friend after a fight?
Brett Roer: How about we do that? Let's do it.
Erin Mote: Do that.
Brett Roer: Okay. So I'm gonna say it again. And then Erin, since you are esteemed guest, you'll get to go first. So the question is, using AI to generate a message to apologize to a friend after a fight. Take it away, Erin and then Rebecca. And then I'll close.
Erin Mote: So I'm gonna say support because I do this all the time.
Erin Mote: To your point, Brett, that human connection that drives understanding and relationship, I think AI can be a tool to help us, um, speak to be heard rather than just speak.
Rebecca Bultsma: I love that. Yes. And, uh, I'm not a fan of ever copying and pasting anything from AI, but I do use it a lot, because I'm a lot like you, Erin.
Rebecca Bultsma: I'd like to have that vantage point. So I'd say support, um, you know, as part of the process of figuring out what you're apologizing for, why you were wrong and identifying something that you haven't yet seen. But then I'm a huge advocate of make it your own. Uh, don't copy and paste. Use it as a guide. And, uh, yeah.
Rebecca Bultsma: So I, I would say it depends and support because, you know, I'm gray. I live in gray.
Brett Roer: So like often I'll be like, okay, here's what happened. Here's what I know about this person that obviously offended them or made this disagreement occur. Here's what I know they need to hear that I authentically believe and I need help presenting it to them and making sure I don't start with the thing that I think is most important.
Brett Roer: Starting with what they think is most important. And then obviously refining that draft. But getting the thought process out of your brain, knowing you can compartmentalize it, take the emotion out, and have help making it more logical in the way that that person receives it, that is how it strengthens human connection.
Brett Roer: Obviously there's a way to use this to erode human connection, by just copying and pasting it, not giving any context, or leaving the prompt in the text message or note you send. That is what I keep hearing about. Yes. So thank you, Erin, for giving us that.
Brett Roer: …I wanna tell you about something we did yesterday, at a session in a Marriott hotel.
Brett Roer: …So this gentleman said:
Brett Roer: "I create the stations, and it has primary sources from history, and then it asks them questions." And what he loves about it is it immediately gives you a score on a rubric for your grade and one sentence of feedback. And I was like, you know what's interesting, sir, is we're about to play a scenario like this.
Brett Roer: So I shared it with the group. A kid said, "Oh, we're using that tool today. I love it." And the teacher was like, "This is great, 'cause they get instant feedback and they keep trying to get the high score on the rubric." So, you know, from a bird's eye view, perfect, right?
Brett Roer: High engagement, the rigor's there, the standards are there, kids are persevering, and there's a productive struggle. And so I said, you know, I'm not judging what you're doing, but is there any policy at your school on whether you should be using AI to grade papers, et cetera? And he's like, "I don't know."
Brett Roer: …I'm gonna read it to our audience.
Erin Mote: Yeah.
Brett Roer: Great. Okay. So the teacher in this scenario is Mr. Thompson. He has 140 students across five sections. On Friday, every one of those kids wrote a full essay for a period. It's Sunday night. Mr. Thompson has his own family, his own life, and a full week of teaching ahead of him facing Monday morning.
Brett Roer: So he opens up a safe, district-approved tool, and it's gonna score all 140 essays on a one-to-four rubric and give one sentence of personalized feedback for each student. It already passed all the DPA agreements; it meets all the data privacy requirements. He reviews all the outputs. Two of the 140 scores felt off.
Brett Roer: He then enters all 140 grades into the grade book. Kids have access to the material and the feedback. He goes to bed, and they get their grades Monday morning. They didn't know the feedback was AI generated. Mr. Thompson comes in Monday morning. He rested over the weekend, he practiced self-care, and he's energized and ready to teach.
Brett Roer: …So here's my question for y'all.
Brett Roer: Oversight? And what would your district expect a teacher to do before submitting AI assisted grades? Let's just start there.
Rebecca Bultsma: I, I just can't side eye this hard enough, honestly. Like, I, I hate everything about it. I like, but then again, as an ethicist, like one of the core founding principles you will find in every single ethical AI framework is accountability, right?
Rebecca Bultsma: There's a classic example where researchers were trying to teach an AI to tell wolves apart from dogs. They trained it on thousands of pictures of wolves and dogs, and then they realized, when they were testing it, that the computer had decided a picture was a wolf only if there was snow in the picture. That was something the testers never even thought about. They didn't notice that every picture they showed of a wolf had snow in it, but that was what the AI picked up on. Which is why we don't really know what the AI is basing its decisions on when it's grading against a rubric.
Rebecca Bultsma: Maybe there's some indicator in there that the kid is from a different socioeconomic class or we don't know what it's picking up on because there's no transparency, no explainability, and no accountability in this scenario. So I do think it can be tweaked, and I do recognize there are benefits to helping teachers with their workload.
Rebecca Bultsma: I just don't think this is the place for it or in this way.
Erin Mote: …you have to disclose you're using AI.
Erin Mote: Like, we owe reciprocal trust in our education system. That's the foundation of human relationships. And so if you aren't disclosing to your students that you're using AI when you're using it, you're breaking reciprocal trust. And I don't think that's the type of thing I would want between my educator and my students.
Erin Mote: The other thing I'll say is, you know, for folks who wanna dig into the bias in tools, there's a wonderful researcher, Punya Mishra at Arizona State University, who I think has made some of the most consumable, understandable research available about this in the education profession. He sits at the Mary Lou Fulton Teachers College.
Erin Mote: …He thinks like an educator.
Erin Mote: Well, I'll give you two. One I talked about in Congress actually last April, when talking about bias. Because I think when we think about bias, we think about things like race or zip code, those types of things. But there are all these contextual clues that AI is actually using to rank students lower.
Erin Mote: So I say y'all all the time. Many members of Congress, even during my hearing, said y'all. If AI was grading that transcript, it would've knocked them down in terms of a grade. Dialectical difference words like y'all have an outsized impact on young people and whether or not they're scored higher or lower.
Erin Mote: …In another study, students were given the same assignment, and the AI scored students who referenced rap music differently from students who referenced classical music.
Erin Mote: It's just repeating our embedded human bias. And so this is not a place where I think we should be using AI. And in fact, we are really big advocates of helping people understand how to keep a human in the loop from a policy perspective, particularly on consequential decisions. Because I don't think we want AI making decisions about what grade a kid gets, or whether they get into college, or what college loans they get access to, or whether or not you get a mortgage.
Brett Roer: Perfect. And so I wanna share, first of all, thank you because. What's great about these scenarios is not everyone has your vantage points and level of expertise that's needed.
Brett Roer: So for example, when we talk about humans in the loop, which is the next section, there are clear questions. What should students be told? How should parents be informed? If there's a tool used in grading, is there a difference between using it to plan a lesson, using it to evaluate, and using it to give feedback?
Brett Roer: And do you have any current guidance on when and how teachers must disclose it? And then how do you communicate it? But the last part, after you talk through all that, is we've been using red, yellow, green, because it's just an easy framework for people to get a sense of where you land. So at this point, Rebecca, would you be picking the color red?
Brett Roer: I know that there's much more nuance to this, but where would you fall on this color scale right now?
Rebecca Bultsma: …I'm flagging it as "needs more training."
Brett Roer: Well done. Always, always sticking to the script. Love it. Take that algorithm. So anyway, I wanna thank you first, both of you for like going through that experiment.
Brett Roer: And this is exactly the kind of things that educators are grappling with. Like how do you use a tool that's supposed to help you in an authentic way? And even if you, like, one of the questions is, what would you train it on if you had access to like, so what are the state standards? What should the teacher be putting at the top?
Brett Roer: Like, to make sure it is aligned to what students should be learning at that age, and ways it can support it. But I really wanna highlight and applaud both of you for recognizing that if this isn't done with fidelity, and if it's done in a way that could be punitive to students, it shouldn't be acceptable.
Brett Roer: …in assessment or in an evaluative context.
Rebecca Bultsma: A hundred percent. You could have students use it themselves as they work through their process, if it's an approved tool, and take that feedback, that maybe the teacher generated, to help refine their submission. Or maybe the assignment itself isn't about the output; it's about how they used the feedback, what they chose to implement and not, where they felt it was lacking, and to teach.
Rebecca Bultsma: Exactly. Circling back to what Erin talked about as those AI literacy skills, how they decided to interpret the feedback from the ai, whether or not they decided to use it, how valuable it was, and where it fell short.
Erin Mote: Yeah, and I think like all the learning happens in the drafting. Like that is actually where we know that learning happens, is in refinement and drafting and the productive struggle and taking inputs and outputs and making decisions about what is the way you wanna structure that argument.
Erin Mote: …the evidence you wanna use to make your case.
Erin Mote: …the science of learning and development.
Erin Mote: And those are the types of things that we need to be thinking about. Again, when we're anchoring in the learning sciences and developing purpose-built educational tools, we're not just thinking about, you know, having a tool that is about rote memorization and knowledge acquisition, but it's about like, what are the scaffolds that keep you supported in that, in that learning experience, in that Socratic thinking, in that pushing and that discernment.
Erin Mote: So those are the types of tools I think that I am really excited about right now.
Brett Roer: Now that we've actually asked people these questions and then framing it against like pedagogy, data, safety, privacy, all the frameworks that you've referenced, we might actually move somewhere and address those points that Erin just said. That's what we really want kids to leave with. It's not really about the number grade, it's about all the skills and values that they can lead your community with.
Brett Roer: So thanks. Thanks both of you for bringing your expertise and personal beliefs to that, to both those scenarios. That was excellent. I know we could probably do this for another, what do you want? Maybe we should do like another three, four hours here, long form. Or maybe we could,
Rebecca Bultsma: if Erin wasn't so busy, I'd love it, but I know how busy she is.
Rebecca Bultsma: So I say we, we skipped to her Ocean's 11 list.
Rebecca Bultsma: …people our listeners should know about.
Erin Mote: Well, I mean, there are so many great people doing great work. I am always sort of elated and inspired by the educators and the leaders in districts who are doing this work day to day, fighting the good fight, being in conversations with communities and parents. I'll call out, you know, a couple things that I think are really important.
Erin Mote: First, I'm gonna call out a podcast that I, um, have been listening to while driving. So, not instead of yours. I want everyone to listen to your podcast, but I also want them to go listen to The Last Invention, which is a long form podcast about the history of AI. I think it's a fascinating way, in eight episodes, to understand this technology, how it has existed since World War II, and why it feels so disruptive right now as an arrival technology.
Erin Mote: I would say that, you know, I'm gonna talk in broad categories like. I think we are inviting industry to the table. So, um, I know that can feel a little hard for folks. I know that there is some friction in that conversation, but for me, I think we need the systems architecture that bends the arc to shape these tools for what we want rather than what companies wanna get outta them.
Erin Mote: And how do you hold both at the same time in these conversations? You already called out Michelle Culver. She's my girl from the Rithm Project, and I love the conversations; she's pushing. Rebecca Winthrop is another person who I think isn't afraid to say the hard thing. Isabel Hao. It's funny how I'm naming all women.
Erin Mote: I'm sorry about that.
Rebecca Bultsma: Don't be sorry. Never
Erin Mote: …folks like, you know, Ronit…
Erin Mote: And then, listen, probably the people I talk to the most, and enjoy the conversations with the most, are teachers who are actually in classrooms. I, you know, believe that we need to honor the voices of educators, who have a lot to say and a lot of wisdom around what it means to be moving from a schooling system to a learning system.
Erin Mote: And when I close my eyes and imagine where we get with this arrival technology, I hope that the experiences we create for Robert and Claire, my kids, are less about schooling and rote memorization and assessments, and more about the types of learning experiences that we can create. And we can't do that unless we center the voices of educators, students, and families in this conversation.
Brett Roer: Thank you, Erin Mote. That was very eloquent and passionate, and a wealth of amazing people that folks should be looking out for, especially an amazing field of female leaders in this space.
Brett Roer: So I wanna say thank you again. Thank you to all our listeners. This is dropping on National AI Literacy Day. We hope you engage with all of the amazing organizations, solutions, and people that Erin has shared with you today, and that everyone listens alongside us. Thanks again, everyone. Enjoy the podcast, and have a wonderful day.