From Compliance to Capacity: Rethinking Accountability in AI with Julia Fallon
Episode 30 • 23rd March 2026 • AmpED to 11 • Amplify and Elevate Innovation
Duration: 01:03:28


Shownotes

What does it mean when policy disconnects from the classroom, and why is AI forcing us to finally pay attention?

Julia Fallon is the Executive Director of SETDA (State Educational Technology Directors Association) and has spent three decades navigating the intersection of education, technology, and policy. She calls it "the unglamorous middle", the space where good ideas either fail quietly or become real. After 17.5 years with Washington State's Office of Superintendent of Public Instruction, Julia now leads a national network of state EdTech leaders working to operationalize values, not just write rules.

This conversation with Brett and co-host Rebecca Bultsma explores the gap between what we say we want in education and what our systems actually support. Julia makes the case that AI isn't creating new problems; it's exposing the ones we've been ignoring. She walks through the three divides in EdTech (access, design, and use), the difference between compliance and capacity building, and why we need to get as clear about where not to use AI as we are about where to use it.

What You'll Learn:

  1. Why policy is how values get operationalized — and what happens when implementation doesn't match intent
  2. The three divides in EdTech — access (nearly solved), design (teacher capacity), and use (active vs. passive tools)
  3. What ethical AI really means — building systems that notice harm and respond before it scales
  4. The accountability paradox — how "everyone's responsible" leads to no one being accountable
  5. Responsible unadoption — when and how to put tools down that don't align with your values
  6. SETDA's quality indicators for EdTech procurement — five practical filters for vetting tools

Rebecca brings her ethics lens to the conversation. Julia turns the tables and asks Brett and Rebecca: How do we keep AI leadership from becoming AI compliance? The episode wraps with shoutouts to state leaders and Brett's "jumbo cannoli" sign-off.

Tune in, subscribe, and share if you're ready to turn up the volume on what's possible in education.

Transcripts


Rebecca Bultsma: Some people feel very, very strongly that it's important that kids learn traditional ways, traditional methods, and not use AI.

Rebecca Bultsma: And so that ends up being really hard. Um, and it's hard to find spots that everybody agrees on.

Julia Fallon: The model isn't working, and AI is just showing it even more so. And do we need to have more project-based learning types of environments for kids, you know, where they can explore things too?

Julia Fallon: Maybe, you know, then they can learn where their, their passions are and they can go off.

Brett Roer: We are live on the AmpED to 11 podcast. Thank you to our millions of listeners for joining us today. Rebecca, how is everything going in your homestead of Canada?

Rebecca Bultsma: You know what? Just living the dream…

Rebecca Bultsma: So we might have to start another podcast, uh, for the real stuff. But, uh, the, uh, AmpED to 11 after dark podcast potentially. We'll, uh, see how that goes with all the good stories. But other than that, I'm, I'm doing great. How about you?

Brett Roer: I'm excellent. Today is Versha Munshi-South, the Chief of Staff of Amplify Elevate's birthday.

Brett Roer: We were just in our favorite school district in Westchester. We just interviewed students and while I truly am so excited to introduce our guests in a moment, there's nothing better than being in person with students. It was the best thing ever until this podcast right now. So, without further ado, we need to introduce this rockstar, I guess you could say.

Brett Roer: …she/her, and we're a first…

Brett Roer: Julia, thank you for joining us on the AmpED to 11 podcast. How are you?

Julia Fallon: I'm good, thank you. I'm really glad that I'm here.

Brett Roer: We are as well. And did you ever think that would be the introduction to a professional setting of a podcast appearance?

Julia Fallon: No, but I love it. I love that you're using my entire title across, across the way.

Brett Roer: Yes. So first, for those that don't know, we are gonna go deep cuts in your journey. We're gonna talk about, you know, getting Ukrainian cured pork sent to you halfway across the country. That's all to come. But could you first start by just sharing with people: what is the actual incredible work you're leading in the world of education?

Brett Roer: You have such a really important seat at the table. Could you share with everyone what it is you do and how you serve education today?

Julia Fallon: …the intersection of education, technology, and policy…

Julia Fallon: But I have been in the EdTech space, both at the higher ed side and K-12 space, now for 30 years, so people can kind of figure out how old I am from that whole thing. But I spent 17 and a half years working at the Office of Superintendent of Public Instruction here in Washington State, which is where I'm based, in various ed tech roles, from career and tech ed to the pure ed tech, more of the boxes and wires side, you know, tech plans, things like that, to working in the federal program, Title II, Part A, which is around supports for teachers and administrators, and really helping our folks understand what tech integration looks like and what good professional learning looks like in that space.

Julia Fallon: …association of state-level ed tech …

Julia Fallon: And we also have corporate members that share our vision and mission. And I always say we're small and mighty because compared to our counterparts, like a CoSN or an ISTE, we do not have the numbers. We don't have a, a conference that has thousands of folks at it. We are very small. Um, if you think about folks that work in departments of education, there's only a handful.

Julia Fallon: You know, there's some states that actually have dedicated even more staff, but typically it's been a handful. And to find people that are doing the same type of work that you're doing in a state bureaucracy or a state government has been a really great experience just being able to go, Hey, who knows about this?

Julia Fallon: Because that's not maybe my wheelhouse. And, and be able to do all that kind of stuff. So really the work that we do at SETDA has really been helping state leaders move from that reaction to intention. And you know, we're gonna, I know we're probably gonna dip into the AI conversation here as well, but it's really about how do we…

Brett Roer: That is impressive. I'm gonna, Rebecca is gonna be chomping at the bit because she's gonna have so many more incredible questions to really get in the weeds about the work you're leading.

Brett Roer: Here's something I'd love to ask you. Sometimes we ask people, you know, if you had a magic wand or if you had the right levers to make change at scale, what would you do? And you're one of the few people we're talking to who like actually do have some of those levers. What is the coolest part about your job?

Brett Roer: Like, what's something you really get to do, or you can't believe you have a seat at the table or the ability to influence, you know, change in, in the United States in education?

Julia Fallon: …like one of my, like, what are …

Julia Fallon: And I think that's what SETDA really exemplifies over time. We turned 25 this year, but the idea is really about how do we continue to connect people in ways that we can then move needles, right? And, and everything else. We have technology that lets us do it. I mean, you know, the technologies today, like we're on a podcast.

Julia Fallon: Remember, imagine doing this like 20 years ago: it would've been like we had to go to a studio, we had to do some other stuff. So the, uh, barriers to entry are much lower. But I think we're getting better at connecting virtually and making it really engaging and everything else. And I think again, it helps just support communities, and affiliation, I think, is one of those types of skills that the internet brought to everybody.

Julia Fallon: …one another. So I think the …

Rebecca Bultsma: I'm really glad that you brought that up because you were talking about, um, before we jumped on here, one of your kids who's getting ready to go to college, maybe in Canada, which fingers crossed, but that kind of leads me to think a little bit. We have kids the same age who kind of were in school during the pandemic.

Rebecca Bultsma: What, what did we miss there that we can get right here, do you think? Like what are the opportunities? What did we miss that we have a second chance at, and what do we risk if we don't get it right this time?

Julia Fallon: Well, what's interesting is, I think we have, someone asked this the other day, also asked me this question.

Julia Fallon: …, right? I think, I wanna say …

Julia Fallon: We're seeing sort of. Ripples from that, right? But at the time, I think some people were like, well, why do we need it? What do we need to do with it? Some teachers like, well, I don't need to know about it in teacher prep programs and we're not gonna, you know, address it. And I think what started to happen is when the pandemic hit, we made the joke.

Julia Fallon: I made the joke when I was a state leader, like, we should get, I told you so shirts we already knew. Well, we could say, you know, check the box that, you know, 96% or 98% of schools were connected and kids had devices. We weren't all the way there. Right? We still had that little bit to go. But what happens is now everybody's home during the pandemic, right?

Julia Fallon: …connected at all, et cetera, …

Julia Fallon: Meanwhile, for three weeks, everybody's kind of putzing around, versus trying to figure out, how do I serve what I can serve right now, and then get those folks that aren't connected or need devices up and running? So professional learning, you know, teachers not having the skills for what it means to, like, teach online and virtually, parents not understanding what's also happening.

Julia Fallon: You know, that sort of thing. Kids maybe not having the skills, trying to figure out developmentally, like, kindergarteners and Zoom is a lot different than having high school students on Zoom at the time. And I think what's happening now with AI, and I'm hoping that we've learned the lessons of the last five years, is to be really intentional about how we implement the technology.

Julia Fallon: …about it's gonna transform the …

Julia Fallon: It's, it's accelerating. And I had posted on LinkedIn recently, I think it's actually glaringly obvious now where we have gaps in the system, not just technology. Technology seems to be getting the scapegoat here, but it's really showing the gaps in the systems of this model that we've been kind of holding on nostalgically to as well.

Julia Fallon: Um, one of the things I said during the pandemic was nobody missed the bell schedule, right? Or chem lab. They didn't miss that part. They missed ritual and they missed community, right? They were talking about when are we gonna have a football team? When are we gonna have the prom? When can the kids get together because they miss each other?

Julia Fallon: It was about that human connection. And I wish that we could also look at technology as like, how do we, how do we help that human connection happen? Um, how do we have more of those conversations? You know what I mean? How do you free a teacher up with, like, let's say, AI so they can spend more time with their students, right?


Julia Fallon: I believe it's a civic thing. I think as a country, I think that helps us. But I mean, other people have other ideas about it being a workforce, you know, development kind of thing, which it does too. But I think for me, civics comes first. You know, being able to participate in my community. And I think schools should reflect the values of their community.

Julia Fallon: So I think sometimes when we think that things can come from on high, I feel like, well, where's student voice? Where's parent voice? Where's the community voice? And if they've all decided they wanna go in a certain direction, we should honor that in some way, as long as they've kind of gone through a process that reflects all of those things.

Julia Fallon: …I help with policies, but …

Rebecca Bultsma: Like what do you think the biggest disconnects are between, like, let's say local, state, and, like, national, um, policy ambitions around AI and, like, the implementation realities? And how do we ever start reconciling that? You probably see a lot of this at every level, and I know that's a wicked problem, but I think it starts by talking about it a little bit, and I'm curious what you think.

Julia Fallon: Well, there's always the, the, the, the stuff that has to happen between policy and implementation, right? Like I, I do believe, and I know it's hard to say that in probably 2025, 2026, is I do believe that there's good policies out there, right? The intent is good. When you look at legislative intent, I don't think anybody is trying to be a jerk here, right?

Julia Fallon: …about. Ultimately the intent …

Julia Fallon: Who sets the rules, who lives with the consequences? We have a Congress that isn't necessarily reflective of all the people in this country, you know, in the United States in particular. Uh, how do we make sure that voice gets in there? And they're also not very young, I'll put it that way. So they have ideas about how things were and how things should be.

Julia Fallon: And I think we have a generation of kids, I think about my own kids and the ones that are coming behind, that they are looking at the world a little bit differently. And I think that they're actually more community focused than previous generations. We tend to be more individualistically focused here in the United States; I think they're more community focused because they realize they're gonna have to work together in order to be successful in life and that sort of thing.

Julia Fallon: …gets approved, but also what …

Julia Fallon: So we can write the best privacy laws out there in the world, but what happens when it fails, right? How do we, how do we hold companies accountable? You know what I mean? We, we, we're not even gonna get into cybersecurity, I hope, because that's another bucket of things, but like, who's accountable at the end of the day when all that data is out there?

Julia Fallon: Right? You had said something on another podcast about asking about who does it harm and who does it leave out. You know what I mean? Who's left behind? I, I read some language about that. And does the guidance arrive before or after harm? Right? Is it done in a proactive way or is it done in a reactive way? And then, in terms of K-12 education policy, are educators supported, right, or are they left on their own to figure it all out?

Julia Fallon: …you know, without policy, sort …

Julia Fallon: You know, I think policy actually, it, it's how values get operationalized at the end of the day. That's really where I see it. And it's one thing to say that we care about ethics. It's another thing to actually design systems that uphold those things when things get complicated.

Brett Roer: I would love, again, we're so fortunate that we have someone like you who gets to see this at a national scale. You're supporting, like, you're really at that intersection where, like, you're living with government policy, the direct impact you're seeing at the top. And you know, most of the work I'm leading and most of the people I'm in community with, we're seeing the end result, the end user result in impact.

Brett Roer: …So one of them was, I was …

Brett Roer: And obviously I said, that's not the case. But they were like, well, where should we be seeing outside? Like, what else is happening outside that we should know about? And I said, because we'd already been talking. I said to Julia's point, like you've already brought up how community is the thing that fills your cup.

Brett Roer: Like everything you've mentioned brings humanity back. It's not ai, you know, but then I said like, but when we've been discussing how you feel about AI and these things. We can always adjust a framework or attach a framework to your community values, as you said before, but like the answers are really right here in this room.

Brett Roer: Like you all can have targeted conversations that would allow the wisdom in this room to shape what your AI guidance and policy is. So if you were somebody in that room, Julia, where would you start first? What would be kind of things you'd want that community to do if they have already made the decision to bring people together?

Brett Roer: …out in those rooms? And then …

Julia Fallon: Oh, that's a good, those are good questions. I think. I think coming up with some maybe shared principles, you know what I mean?

Julia Fallon: That you can actually implement, right? Like it's one thing to be really like 30,000 foot, but at the end of the day, I think communities are trying to figure out, in particular teachers, right? Because at the end of the day, that's, I mean, they're the ones that are impacted the most by what's happening, but they need, they don't really need mastery of everything.

Julia Fallon: …things that we called out in the …

Julia Fallon: …accessibility stuff. It actually …

Julia Fallon: There might be some students you don't use technology for because it's not probably appropriate for them to be able to get to a learning goal and things like that. But also, while you're doing that, you can have those conversations with students, because students need a safe place to learn how to use technology in a way that lets them be responsible citizens, responsible when they become employees, responsible employers when they're starting businesses and doing all of that kind of stuff.

Julia Fallon: And it's, again, it goes back to what shared principles are you all talking about. Do you want all your students to be, you know, like, post-secondary success, whatever that looks like: college, you know, certificate, apprenticeship programs, like what does that look like? Do they have a set of standards that they all are gonna know?

Julia Fallon: …you know, we're still gonna …

Julia Fallon: All of those things that you would talk to kids about, even if technology wasn't there. It's still a conversation. But those are principles that I think you need to land on, some shared principles. And I think where districts and states can align is, what do those principles also look like, right? I don't think we need one national rule book again, you know what I mean?

Julia Fallon: But I think we should have some, some shared principles about what we see. And then I feel like accountability should be building capacity, not fear mongering, right? So when stuff becomes compliance-driven, innovation completely shuts down and people aren't creative because they, they're, they're fearful they're gonna get in trouble.

Julia Fallon: So what does it look like if accountability built capacity? Do you create spaces, you know, as a community for people to try things? You know what I mean? And, and not be fear-based and, and everything else. So I think that's where I would start them. I'm trying to think of the people that you would talk to.

Julia Fallon: …question for me, Brett, because I'm …

Julia Fallon: But just like any other emerging technology. Our, our field is really small. I mean, I don't think people realize, especially if you're not an Ed Techer that's listening to this, our field is actually quite small. And I think reputation and community and connection actually can tell a lot about who you can trust in the space as well when it comes to all of this stuff.

Julia Fallon: So to lean on that and see who they're connected to, and I'm very particular about my LinkedIn, like I just don't accept things from everybody because I don't want people to necessarily leverage my reputation, right? If I don't know them, if I, if, if it's somebody that I share 78 connections with, I'm more apt to say, yeah, let's do this.

Julia Fallon: …connecting. So that's my own sort …

Brett Roer: Yeah. And I wanna ask, I wanna get Rebecca's insight on that question as well. So Rebecca, I'll let those wheels churn there. But Julia, I think this is important. We collectively, all three of us, and you know, like you said, there's kind of like a circle of people that we get to actually interact with as humans, ascertain their values and see their true motivations.

Brett Roer: And that's one of the things I love about the fact that we get to bring on guests like you onto this podcast. Like I might be directly impacted as a teacher who's, my district has adopted X standards or X, you know, alignment to AI policy or guidance. And I'll never know the difference. So like that's why I'm so grateful that people like you come on and you share some of those best practices because you wouldn't be amplifying those and lifting those up unless you truly believed in them.

Brett Roer: So I just wanna say thank you. Now it's your turn. Rebecca, what would you say to that? Like what are, what would be your feedback and thoughts around the questions that surface on my end?

Rebecca Bultsma: …I think it's hard, there's so many …

Rebecca Bultsma: Tipping is ethical. You know, like all of these really, really kind of simple things that something like this, um, you know, we may feel like it's ethical to prepare kids for a future with AI, whereas some people feel very, very strongly that it's important that kids learn traditional ways, traditional methods, and not use AI.

Rebecca Bultsma: …staff and you agree on these …

Rebecca Bultsma: But it's so much easier said, uh, than done. And that's why it's hard to do it in bigger communities and the difference between K12 and post-secondary and then kids enter the workforce where they. Do or don't have the skills that they need. So it's, it's messy. There's not a lot of, uh, clean answers, but the best thing you can do is listen and talk to people and find out what works best.

Rebecca Bultsma: But I think we have to remember that kids are getting a lot of disconnected guidance right now, like even from class to class. Like, absolutely not in this class. Okay, yes, you can use it in this way, in this class. And it's just a lot for them to manage and, and they're getting a lot of mixed information about competing priorities.

Rebecca Bultsma: …have deep-seated, like, feelings …

Rebecca Bultsma: And so it's just interesting how influential teachers and leaders are. And so we just, we have to get this right, whatever right looks like. Um, it looks different for everybody in their mind. For me, it's having kids prepared to exist and get jobs in the real world, which I think involves learning and understanding AI, which starts with teachers learning and understanding AI and the people making decisions about it.

Rebecca Bultsma: But again, chicken, egg, hard, complicated. What's kind of the biggest disconnect that you're seeing out there? Like, are you seeing this kind of polarization when it comes to AI?

Julia Fallon: I'm, and I was, I was just gonna, you just said something that is interesting because I think sometimes I don't know if ethical AI is about getting it right, necessarily.

Julia Fallon: I, because it's just, I don't know if we're ever gonna get it right. Like, you know what I mean? Like I think, but,

Rebecca Bultsma: And what is right?

Julia Fallon: Yeah. And I wonder sometimes if it's more about building systems that can notice harm and respond before it scales, right? Like that's a different way to look at it, right?

Rebecca Bultsma: …comes in though, right? Yes. …

Julia Fallon: well, if everybody's responsible, then no one's accountable, right? That's

Rebecca Bultsma: exactly

Julia Fallon: kind of thing. So.

Rebecca Bultsma: Do we have individual responsibility to use it in ways that don't cause harm? Is it the school's job to teach us?

Rebecca Bultsma: Is it the ed tech company's job to build that in? Is it the foundational tech company's jobs to make sure that their systems don't cause harm? Does the government, are they responsible? It's almost like the gun debate, but like on a different scale of who is accountable, what is accountable, and what is the right way to do this?

Rebecca Bultsma: And it's just like you said, right now we're living in a world where there is no real accountability for any of the harm that AI is causing short and long term. And that's part of the problem. There's not a lot of, um, accountability.

Julia Fallon: I do wanna, I do wanna do a shout out though because I do think that educators who slow down and ask better questions are really underrated.

Julia Fallon: …really listen to students. I …

Julia Fallon: And, you know, building that trust. And sometimes I wish that we would listen to those folks that are like closest to implementation because I think they see things a lot earlier than the rest of us. And I feel like we need to listen to them a little bit more. Um, 'cause there's always folks that are gonna go first, right?

Julia Fallon: They are like our scouts and then say, okay, so what did you learn? You know, what, what should we work out? You know, can we get around this pothole here? Or is this not the way to go? And I had the fortunate luck of sitting next to somebody on a, you know, cross country flight who actually, now mind you, I was also trying to look up his credentials while I was talking to him to make sure he was like, legit, like, you know, oh, you run AI for the Department of Defense?

Julia Fallon: Oh, wait a second. You know, like, I wanna make sure this guy is really legit. And it, it, he truly was. And we happened to not have anybody in that middle seat. And I was like, this is, this is like just as ChatGPT is hitting, right? August of '23 is what I keep thinking about in my head in terms of dates.

Julia Fallon: And I said, what …

Julia Fallon: Like if you used it in a space, it's not, it's not good. It's not good. Right. Be very clear about where not to use it as much as you are clear about where to use it. So I, I keep thinking about him and that conversation that I had, you know, across the country from Seattle to DC about, oh, where do we not do, I don't think we have a lot of conversation about where not to use it.

Julia Fallon: You know, everybody's like, let's just use it everywhere. I'm like, well then maybe not, you know, like, maybe we need to slow down here a little bit and, uh, that sort of thing. But what's happening right now in the United States too is all this rhetoric about taking tech outta the classroom. Right. So it went from cell phone bans to device bans.

Julia Fallon: …probably get some sort of tech …

Julia Fallon: I'm like, let's say we, let's say the scenario happens and we get rid of tech in the classroom, just we're gonna do that. It happens. And that's the worst case scenario. You still need it for the front office. You know, we have lunch room, we have bus schedules, we have security systems, we have assessment data.

Julia Fallon: Are we all gonna start doing pencil and paper assessments again, because that's where we're gonna move to if you're not gonna have any tech in the classroom, because if you give a third grader a test, do they fail it because, you know, they didn't, didn't do well in the assessment because they don't know how to use a computer or they don't know the stuff.

Julia Fallon: Like you won't be able to ferret that out if they don't have those experiences. So I, I think people are kind of extreme because they're just overwhelmed, and there's a lot of fear out there right now and everything else. But how do we get them to think about, okay, there's instructional tech, there's, I, I don't wanna get to a situation either where some kids that need tech in order to be successful, right?

Julia Fallon: …and the laptop because no one …

Julia Fallon: So I'm, I'm trying to figure out how we get back to that moderate, you know, what works kind of space versus these extremes like no tech, all the tech and everything else. So that's been an interesting, interesting conversation that's been going on. And I think it's actually, frankly distracting us from the real issue, right?

Julia Fallon: About agency. Are we gonna give kids agency and choice with their stuff? And the system that we have now is not working.

Rebecca Bultsma: Isn't it interesting just that we're seeing at every level of society that kind of polarization to all or nothing, or one side or the other? I think that's so interesting because, you know, all of this and all of ethics, everything lives in the gray, in the nuance, and I think even the idea of labeling all AI…

Rebecca Bultsma: …like versions of it that are …

Rebecca Bultsma: Not only in this but in every aspect of everything we do. And that's part of the problems I think we're running into largely, labeling people as good or bad, or this as good or bad, or technology as good or bad, and trying to go all or nothing. I think, uh, that's not the world we live in. As you mentioned, this generation sees that, knows that, but the world is being run by people, our generation and a lot older than us, who really see the world in those binaries, and it suits their needs and purposes often to keep things that way.

Rebecca Bultsma: And uh, it's not helping very many people.

Brett Roer: …would love to know, Julia, like, …

Brett Roer: Like, if we're not supposed to use it here, you know, like she had a very strong opinion that like she morally would never do that and she knows many others do. And it, you know, it goes on, it goes, um, unpunished. So she's like, and I said, you know, that's a, that's, she's navigating that every day. Like obviously everyone has that moral imperative.

Brett Roer: You, you're allowed to cheat or not cheat, but like, she's so ethical, but she's seeing other people win with no effort because the teacher doesn't notice it and it won't change their assignments. She's indirectly being punished by that and her values have to be, you know, pushed every day. But then we came up with this idea, which is around the table.

Brett Roer: …extreme, we all, that's easy. …

Brett Roer: These students talked about physics. It's like having a second teacher, and they've learned so much by being able to rely on AI-generated physics support. So we said, great, what if we just went around and everyone said, like, well, what's the gray area? Like, where do you just personally feel like, I don't know when I'm supposed to, or not supposed to, in the absence of guidance?

Brett Roer: I said, we all just did that. Could that kind of help us as a community kind of shape like where we actually are and then like adopt policy around that?

Julia Fallon: I don't, I don't know because I think about, let's say that, I mean in the world, right? Like let's say, you know, like we're, we're adults, right? We're in there in the spaces.

Julia Fallon: My, my, no one's telling me that I can't use it. You know what I mean? No one's saying that I can't use it for something. And I think a big question no one's really talking about is assessment. Because I'd rather the teacher go to the kid, whatever you punch in there and you get the report, we're gonna know about, you know, how cells divide, right?

Julia Fallon: …? That's a thing that we all …

Julia Fallon: How did they confirm that that was like a reputable source where I could really rely on that? I guess, I guess it goes back to, you know, when we can Google the 50 state capitals, do we need to teach state capitals anymore? I mean, so for me, that gray area isn't, you know, is the teacher rewarding the, the, the output at the end of the day, or are they rewarding the process that the student took to get there?

Julia Fallon: …think are worth, you know, a …

Julia Fallon: And that looks different than one school that I went to. It looks different than our legislators went to school. And that's a different, and, and it also cuts off this instant access for grades, right? That parents want, right? Like they wanna know that the kid took the test and I can check the grade book at the end of the day and they got an A that might get broken down a little bit 'cause teachers are gonna need more time to actually assess whether or not a student understands or understands how to look for the information.

Julia Fallon: Those are the skills that you're gonna need in your work life later on, you know what I mean? Unless you're gonna be a biologist or, you know, a stem cell scientist, you're gonna need to get in there a little bit more, right, about how cells divide and everything else. But I think for us, this is where the model isn't working, and AI is just showing it even more so.

Julia Fallon: …and they can go off, do the…

Julia Fallon: You know, anybody that's been in a combine these days, they're doing TikToks while the equipment's running the thing. I mean, I see that all the time, right? But they also understand what that information is getting them, so they can make adjustments on the fly and everything else. So that gray area is, I mean, how do we do assessments?

Julia Fallon: How do we help parents understand what's happening? I think when they're not invited to the conversation, they become even more fearful because it doesn't look like what they know. But if they could say, Hey, no, my kid is gonna learn this, I think they would be all in. So I don't know if that answers your question, Brett.

Julia Fallon: I feel like, people, the cheating thing is the cheating thing. Kids have always…

Brett Roer: Gray area questions, gray area answers. Rebecca, your thoughts?

Rebecca Bultsma: I was just thinking about how this, again, all comes back to the policy thing. 'Cause it's not even necessarily about what the parents want. The parents can demand that, but the legislators need a way to categorize.

Rebecca Bultsma: …something that they can grade…

Rebecca Bultsma: Collaboration, communication, you know, cooperation. All of those things that are harder to put into metric categories that then help us make binary decisions about kids and their futures and SATs and all of these things. And it's, it's just not gonna be as neat and tidy into big buckets like we've always had.

Rebecca Bultsma: Like we, you know, want at certain levels. And I don't know how we even start to come at something like that. I do think AI can help with that somehow, but the goal is to have it help support how we think and that process, rather than just give us outputs faster, right? Having AI do stuff for you is not the solution.

Rebecca Bultsma: …Uh, we all know that…

Julia Fallon: What we're trying to do is make sure that state leaders have… So again, I think the conversation's going to, like, research and evidence, where we can make really good decisions based on what we know at the time, right?

Julia Fallon: We're not gonna always have everything ready to go, but can we help state leaders figure out where they can go get information, right? Research, talk to districts, whatever, to make a policy decision or help inform policy as it gets written, because eventually it gets implemented and everything else.

Julia Fallon: …say, hey, listen, AI is now…

Julia Fallon: …the space. And again, in the…

Julia Fallon: Like, we have access almost worked out. We still have to work on home access, and that's one thing. But teachers need time and space, right, to be able to figure out how to use technology effectively in their instructional practices. And this is an equity issue. And I know that is a dirty word sometimes here in the United States at the moment, but it's really about, like, even equitable access in a school.

Julia Fallon: And I often use my kid as an example. You know, she has to take biology, right, at some point in her high school career. And her last name starts with F, and her best friend's last name starts with M, and they will be in a section of Biology 101, and they will have two different types of experiences. Now, I'm not saying that they have to look exactly the same, but one teacher might be tech-averse and one teacher might be more tech-forward.


Julia Fallon: So that's the design divide, right? And then we have the use divide, which is, I want to make sure that my kid is using it in an active way, not in a passive way, right? They're not just doing test prep or, you know, remedial stuff. She's actually using it, like you said, to create, to collaborate, to do research.

Julia Fallon: Like, they're using it in an active way to be able to demonstrate knowledge and understanding. So I think for me, yeah, it's just trying to figure out what kind of system did I create in my space. But as a state leader, we wanna make sure that we are getting dollars for professional learning, whether it's from the feds or from state budgets, or, you know, cybersecurity.

Julia Fallon: …You need to have actually…

Julia Fallon: So if you want students to come out with skills, then you better be investing, right, in the ed system to do all of that kind of stuff. But our leaders are really coming together right now. You know, many states have put out AI guidance; now they're like, okay, now I need to implement.

Julia Fallon: Right? Now that districts have the guidance and they have sort of a container to work with, what's next? So what's the state role? We often ask that question all the time. What's the state's role? Because the state can't do it all, right? And again, we want it to reflect a local community. So how do you help a small rural district in Washington state be able to do this when they only have, you know, 600 kids total?

Julia Fallon: …support them? Maybe they don't…

Julia Fallon: And, you know, I don't remember what the stat is, in terms of, like, these 20 million students are in rural and smaller communities, right? And that's a whole untapped market. And I know the ed tech sector, and developers, we haven't talked about developers, they overlook that because it's not a volume buy, right?

Julia Fallon: But you could actually go in and do a lot of fidelity on your product and impact, and contribute to research on whether or not it works in a smaller school district with, you know, a hundred kids. Does your reading curriculum work, or your platform and everything else? So our group, our folks, we try to use the state trends report and also just things that are coming top of mind.

Julia Fallon: …the ground, like what's coming…

Julia Fallon: AI is one of those things. Cybersecurity. And then we have some state-level programs like E-Rate and Title II that we have, you know, our hands in. We help states be able to come together to ask those types of questions so they can support their districts a lot better at the end of the day.

Brett Roer: Julia, you said so many things I wanna go deeper on, but I also wanna make sure we give you some equity of voice at the AmpED to 11 podcast.

Brett Roer: You know, we always allow our guests to, you know, reverse it. It's your turn to ask Rebecca and me any questions you might have. You get one shot at this. You know how to master a stage and rock a crowd. So let's go. What do you got?

Julia Fallon: I think my question is, how do we keep sort of AI leadership from becoming another compliance exercise instead of a capacity-building one, right?

Julia Fallon: …and I am a modern school…

Julia Fallon: Like, how do we get beyond that? But AI, how do we make sure it's not another compliance exercise?

Brett Roer: Rebecca, we checked the box. You go first or second, you choose.

Rebecca Bultsma: I usually go first, Brett, so I'll let you go first on this front.

Brett Roer: I always say it's really hard to follow, and now it's gonna be even harder to lead off.

Brett Roer: Okay, so I think you just said it, Julia. I was just talking with someone, and they were like, for their parents, for example, well, we have some parents who, you know, think one thing, and we have others who don't want it at all. And I said, why don't we create a space that's called "I hate it, but I'm willing to try," and just show them why it's relevant to how to be either a better parent, or just, what's the thing in your personal life as a busy adult?


Brett Roer: That you have to get them to that moment where immediately, it's not like you just taught me one magic trick. You taught me something that I can replicate and do in different problem sets that I have to do. So regardless, that's with anyone with AI, whether it's an AI leader or not. And then the last thing is, everyone should take Rebecca's amazing course, called, like, Gen AI.

Brett Roer: …to do, and that's it. And…

Rebecca Bultsma: Oh, I think you've gotten to the core of it. I think you need to figure out, um, you need to identify people's pain points and speak to exactly what their actual needs are, instead of just trying to ram it down their throats.

Rebecca Bultsma: Right. Like it, in my experience with the thousands and thousands of people I've, I've talked to about this, it's just a matter of figuring out one little way. That it can be useful even in one little aspect of your life, even if it's your personal life. And that is almost like a, a little magical gateway into being a little bit more open-minded about it.

Rebecca Bultsma: …headfirst, as you mentioned…

Rebecca Bultsma: And so I think, uh, acknowledging those things and really, um, not buying into the hype or trying to push the hype at people. Going in very, very measured with, yes, here's some opportunities, here are some risks. And what are a couple of the biggest pain points we hear about consistently, and how can we solve just one little thing?

Rebecca Bultsma: I'd say the other thing, uh, is, as you mentioned, just bringing students into the conversation. Some of the best guests we've had and conferences we've been to have this one kind of teacher or EA who's just passionate about AI, who starts a little club with the students, and then the students end up telling the board about it, or teaching the parents about it, and teaching the superintendents about it, in a way that makes it engaging and fun and approachable and helps that light bulb moment happen.

Rebecca Bultsma: …just have to stop, just, like…

Rebecca Bultsma: And I just don't think that's gonna work here.

Julia Fallon: I have another question, though, I wanna ask, because it's related. What does responsible unadoption look like when the tool doesn't align with our values? Because I think that in education, we tend to carry things for a long time.

Julia Fallon: When we probably could have put them down. And let's say a community, and maybe it's students, because I'm hearing a lot from students that they don't wanna use it. So maybe the community decides they don't wanna use it. So what does unadoption look like when it doesn't align with our values?

Rebecca Bultsma: …how this, uh, oversold and…

Rebecca Bultsma: Like, uh, one-to-one laptops revolutionize education forever, and people just aren't seeing that. But it's not like we're getting rid of computers. Maybe some people are, but I think that needs to be a conversation too. And it probably starts with being very intentional about the tools we even bring in, in the first place.

Rebecca Bultsma: …Like there's over…

Rebecca Bultsma: …to be mindful of that. And I…

Rebecca Bultsma: …it's not working, don't make it…

Julia Fallon: I do see that sentiment out there and everything else. Along with CoSN and ISTE and Digital Promise and CAST at InnovateEDU, about two years ago we started a conversation about how could we help. Because, and I know this is a bird walk, I know on my own, when I walk the floor, I have a certain set of questions I always ask, you know what I mean, about privacy and about accessibility.

Julia Fallon: …And I'm not saying that you…

Julia Fallon: Like, we don't need another LMS, please don't build another LMS because right now no one has the wherewithal to actually migrate to another LMS, but try to figure out what the pain points are for the one that somebody might be using, and then solve for that. Right? Or what kind of things are you trying to solve for and come in from that.

Julia Fallon: Things that, I share the same sort of thing. We've wasted a lot of money, and it's hard to ferret out exactly where technology has impact. It's just part of the infrastructure. I don't say, well, how does the bell schedule impact the kids' learning and everything else? That's what drives me crazy, you know what I mean?

Julia Fallon: At the end of the day. So I share that sentiment as well and everything else, but I do know that we put out those quality indicators. We're hoping that people are starting to get them, and then hopefully products can start to validate against some of those indicators to show that they actually meet the standards that we put out there.


Rebecca Bultsma: Like, that's not working. They need support and they need help with that too, right? Like it's a lot to ask of people.

Brett Roer: The adoption methodology I would take on, if we wanted to think of AI as a way to build community and empower people, right? When used correctly. So, like, when I have supported EdTech companies as a consultant and outcomes weren't matching intentions, the first thing I would do, you know, pre-AI, is type and take low-inference notes and reach out to the end users, the teachers and/or students.

Brett Roer: …know how much someone's using a…

Brett Roer: Why? Especially if it's an anomaly with that product and a student's performance versus another one. Or especially about, like, usage, 'cause then there might be a specific reason why students aren't engaging with that tool, and you have that answer. And then, knowing this as a, you know, parent and a school board member, people do not like waste.

Brett Roer: And they would love to hear, hey, this product didn't deliver on metrics, and our kids are explaining why. Well, now it's an easy answer. And, like, that's what would get people up in arms: our own kids see this doesn't work. That's a very common sense way, and you can get buy-in from your community.

Brett Roer: …the more ethical thing. Um…

Brett Roer: I remember last year at ISTE, it was the first time I'd ever seen like a very clear vendor expo hall checklist of questions you should ask based on your role. And I was so impressed by that. So thank you for bringing that to this kind of scale because yeah, that's, that's not what educators go to school for, right?

Brett Roer: They believe in stuff that people say will help kids. So thank you for building that.

Julia Fallon: You're welcome. Procurement: not very sexy, but it gets the job done. So we've also been working on, we put out a guide, in relation to those quality indicators, around what kind of questions you can be asking around those indicators as part of the procurement process.

Julia Fallon: Because we know procurement officers also are not, you know, well versed in the tech space either, but we wanna be sure those questions come into the RFP or the proposal, so that whoever is evaluating it then does have answers in one place.

Rebecca Bultsma: …historically, um, haven't been…

Rebecca Bultsma: And so I just think there needs, again, for context, I do my research in the UK, right? Where there's, you know, GDPR and all of these data privacy protections. And so this is something I think about a lot. But, uh, yeah, I do think there's a lot to think about, and it's a lot on school leaders, and there needs to be more.

Rebecca Bultsma: Like you're providing resources to help support them. So thank you for that.

Julia Fallon: But also giving school leaders, like, the voice to say, I'm all about data minimization. And I always use my kid as an example, when she was in, you know, third grade, where field trip forms come home, like, every five minutes, 'cause they're taking 'em out, right?

Julia Fallon: …something. I don't need to have…

Rebecca Bultsma: …of them are shared with, like…

Rebecca Bultsma: Because without data, there's no AI, and it's, like, the most valuable currency that exists right now. And we're paying ed tech companies in money and in data, and then a lot of them are making money on the back end of it too. And so it's just a bigger consideration that didn't used to be a consideration as part of the procurement process. But now that is the number one currency, you know. For any tech company in ed tech, it's how do we get data?

Rebecca Bultsma: …it's just another thing, you…

Brett Roer: Not at all. That's why people tune in.

Rebecca Bultsma: For the data, for the data talk

Brett Roer: For the rich discourse. I mean, this is what it's for. You aren't afraid to dive into the tough challenges ahead.

Brett Roer: We're almost at time, but here's our last and final question, Julia. This is really important. We're gonna end on a very high note, okay? And afterwards, I'm gonna unveil my new closing line for the podcast. And Julia, it was not inspired by you, but you'll appreciate it more than anyone. So Julia, our last question: right, we've talked about the challenges.

Brett Roer: Let's leave on some bright spots, right? You truly get to work across the nation with some of the top leaders. Take a moment: who do you wanna shout out, give flowers to? These are the people that are helping to build the next generation, to get education right. Give some flowers to people that other folks need to know about, should reach out to, should research, should study, should listen to their podcast.

Brett Roer: …work forward that you wanna…

Julia Fallon: I will always shout out my state leaders, collectively, as a whole. So if you don't know who your state ed tech director is, contact me or look at our website, or, you know, do a little research yourself on Google, but you probably should know who those folks are.

Julia Fallon: They are often doing the unsung work of trying to move the needle forward, and there's good stuff happening in all states. So I will say that, you know what I mean? I know it's politically charged right now, but there's good work happening in all states, and we wanna make sure that everybody gets their flowers for that type of unsung work.

Julia Fallon: It's government work. It's not very sexy. It's, you know, not very recognized as well. But I think of folks like Bre Urness-Straight out of the state of Washington here, you know, leading both the development of AI governance, but also really trying to understand how to create an ecosystem from a broadband perspective, and access to everything, right?

Julia Fallon: …outta Utah, both Rick Gaisford…

Julia Fallon: And I know that Utah right now is having its own political moments, um, with technology. But they hired Matt Winters, who's an AI specialist that sits at the state level. And you think about Utah, and I often say that Utah is leading the country in ed tech strategy, and people are very surprised by that.

Julia Fallon: But they have continued those investments for lots and lots of years around how to make sure that, you know, communities are connected, that educators are supported, that students are gonna get the skills and they're gonna get the experiences using technology as a whole. I think about folks like Dorann Avey, who's probably wearing, you know, multiple hats in Nebraska.

Julia Fallon: …where it's top down…

Julia Fallon: So it's high-quality stuff, but it's an innovative way. Very not sexy, though, but it definitely helps, uh, build the capacity of all of those teachers that are out there. I think about, like, Brad Hagg out at Indiana, who's leading as the ed tech director. He's very well versed in the cybersecurity space.

Julia Fallon: You know, he can speak about that both on the national and the state level, and how states can actually help provide cover for an entire state, you know, smaller districts all the way there, and things like that. Um, I think about some other folks that are not necessarily in my immediate circle or my membership community, but I think about, like, a Rob Dixon out of Wichita.

Julia Fallon: …systems capacity and making sure…

Julia Fallon: And that sort of thing. So I love talking to him. I love talking to, like, Michael Connor, who's obviously left a district but now has started his own company. Just my own handful of folks that I like to, not test ideas with, but be able to be like, hey, I'm cranky about this. And they challenge you, either your thinking, or they give you something new to think about, but it's in a very collaborative and kind of safe space to have those conversations as well and everything else.

Julia Fallon: So yeah, those are some of the folks that I would, like, call out. Jacob Kantor's been a great connector in getting the ed tech community, the ed tech developer community, to really understand where states are coming from and try not to keep building stuff that we don't need.

Julia Fallon: …folks that I am keeping eyes…

Brett Roer: Bouquets. Bouquets. And I think you have found a kindred spirit in Rebecca when it comes to AI and society, really, probably more.

Brett Roer: She needs someone to vent to. You two would be kindred spirits, I think, in, like, moving forward, and then being like, okay, and now what do we do? So hopefully this is another member you reach out to. That was incredible, and a whole lot of flowers. All right, so Julia, this is my new sign-off. So we were just at a school, and they are famous.

Brett Roer: The superintendent brought crumb cake to one of their first superintendent's days when they were a new superintendent, and it was a hit. So I went back, and it was Versha's birthday, and this crumb cake truly was one of those things where you're like, how does this exist on this planet? So I tried to find it so I could bring it to the school, because I know the superintendent likes to end…

Brett Roer: …left, I told Versha, you know, I…

Brett Roer: And I said, every day could be jumbo-cannoli worthy if you try. So that's my message for all the AmpED to 11 listeners out there. When the world gets you down and you're like, I don't deserve a jumbo cannoli in life? No. Make every day a jumbo cannoli day, and you're gonna have a pretty interesting life.

Julia Fallon: I'd say I wholeheartedly agree with that sign-off. I even just think, a regular cannoli, there's reason for a regular cannoli every single day. Brett and I share our love of cannolis, so this is where that's coming from. But jumbo for sure. But a regular cannoli, I could eat one every day.

Rebecca Bultsma: Just outta curiosity, how big is a jumbo cannoli? Like, don't the…

Brett Roer: I will send pictures.

Rebecca Bultsma: Right.

Brett Roer: You know, I've put it on LinkedIn, but it usually is about… it's a really good question. I dunno.

Julia Fallon: I feel like it's two football…

Julia Fallon: …something in there. It's…

Brett Roer: I hadn't. There's a whole backstory, Rebecca. We'll film a whole thing about that, and Julia, you can be on the pod. It's called Holy Cannoli. Anyway, that was a long-winded way of saying you deserve a jumbo cannoli every day. You don't have to have one every day, but make your life that interesting. Signing off on the AmpED to 11 podcast.

Brett Roer: Take care.
