In this thought-provoking episode of Voices for Excellence, Dr. Michael Conner sits down with Rebecca Bultsma — international AI ethics researcher, Chief Innovation Officer, and co-host of AmpED to 11 — to explore what it really means to lead in the age of artificial intelligence.
From Kendrick Lamar lyrics to ethical paradoxes, this conversation moves from AI strategy and governance to the human heartbeat of education. Rebecca unpacks her belief that “ethics is the seatbelt of AI strategy”—a grounding metaphor for how innovation must move fast, but never without responsibility.
Together, Dr. Conner and Rebecca dive into AI strategy, ethics, and the future of teaching, learning, and school design.
Rebecca challenges listeners to imagine a future where education isn’t confined to classrooms, grades, or standardized tests, but exists as a network of personalized experiences, guided by ethical innovation and human connection.
This is more than a conversation about technology — it’s a blueprint for human-centered transformation in the age of AI.
Michael Conner: So let me just tell you what happened and what I mean when I say I wanna be like you when I grow up. She is absolutely brilliant, a genius. So we're at an AI symposium in Alban. And Rebecca, we first see each other within the first five minutes. She was like, you got some time? I wanna show you something.
Michael Conner: Second, I'm trying to store everything that Rebecca's saying in the notes of my old iPhone. And third, I'm just absolutely enamored by the depth of knowledge that Rebecca brings into this AI vertical that we talk about with education. Rebecca Bultsma is just somebody that I absolutely look up to. I admire her work and unpack it surgically, in the context of how I can build on my individual practices as well as strengthen nodes and elements within my organization and tech and ed tech companies.
Michael Conner: So I wanna welcome, and I know she's gonna be mad at me when I say this, the...
Michael Conner: Rebecca, literally, to my audience.

Michael Conner: It was really a social networking event, and when, you know, you see two academics, two intellects come together, Rebecca's like, I gotta show you something, you got a few minutes? And I automatically knew with that. And also, Rebecca is a domestic and international AI expert. She's an expert in various nodes, various elements, or I should say rudiments, within...
Michael Conner: So Rebecca, you know, selfishly, I want to be able to reincarnate that conversation for our audience and go even deeper, because again, I think the level of expertise that you have in AI, and being a czar in a variety of different tenets in this ecosystem, in the AI operating model, I think that this will be a great professional learning experience for my audience with you.
Michael Conner: So Rebecca, welcome to VFE.
Rebecca Bultsma: Oh, what an intro. I appreciate that so much, Michael. Thank you. Thank you so much. I'm excited to be here.
Michael Conner: Absolutely. Excited to have you, excited to, like I say, unpack your IP so that my audience will be able to apply strategies in a sophisticated and strategic context for better outcomes and systems for students.
Rebecca Bultsma: You know, I've actually been thinking about this, because I have what might be considered an unexpected taste in music. I'm actually a huge Kendrick Lamar fan, and the song that I picked for this, I had to decide between a couple, so I just picked them both. I picked Not Like Us, just because in the ethics research that I do, I spend a lot of my time helping organizations and people figure out that no matter how advanced and cool and, um, bright and shiny AI is, it's not like us.

Rebecca Bultsma: That's kind of what I do in AI ethics. I'm always asking, or having people ask: what's real, what's authentic, what's uniquely human that we need to protect? And I guess also Humble, by Kendrick Lamar, because it kind of speaks to the power paradox that I think about and talk about a lot. We just always need to be checking ourselves.
Rebecca Bultsma: And I mean, really, that whole album was all about duality, and weakness and strength, and pride and humility, and all of these paradoxes. And we just need to recognize, and this is something I struggle with daily, this paradox: this technology is incredibly powerful, um, but we are incredibly vulnerable to it.
Michael Conner: Rebecca, I respect the choice, right? Because I'll never forget this, Rebecca. I was at a social event, and you know, most people, they see Dr.
Michael Conner: Michael Conner.
Rebecca Bultsma: Yes, they do. With the great socks, the great shoes. Yep. Polished.
Michael Conner: Leader practitioner, classroom practitioner. When we think about that, and then that power paradox, and using the song Humble, that's, woo. If I had more time, I would want to get into that genius of a brain of yours to unpack Humble with threads of AI. But there is, I think, this paradox that we are experiencing, and I think this paradox is multi-tenant.

Michael Conner: It's multi-nuanced, and it needs that level of experimentation, continuous experimentation, to be able to navigate through, I think, this inevitable paradox we're gonna continue to live in, and in education specifically now: these tenets of how technology and AI improve themselves, then being not just a disruptor, but a meta in the model.
Michael Conner: How do we have that intersectionality with this strangulation, with that? And then also, the second part of this: how do we underscore policy to impact classrooms and instruction to be future-driven, not just the complicit, standardized policy language around AI integration and classroom instruction? How do we now make our classes future-driven?
Rebecca Bultsma: I think about it like: strategy is our destination and AI is the vehicle, but ethics is the seatbelt that we wear, because we can't just floor it. We all have to arrive there safely together. Um, and I actually have four kids who are, you know, 20, 21, 22, 23, and they use the Gen Z slang a lot. And I actually kind of adapted one of their phrases, my favorite, which is, you know, the vibes. The vibes are off.
Rebecca Bultsma: The vibes are immaculate. But this vibe framework, I thought about it because this generation makes every decision kind of based on vibes, right? Like, oh, this place has good vibes. Oh, the vibes were off, I broke up with her. And they can kind of sense authenticity, or the lack thereof. They can't maybe articulate it, but they can sense it.
Rebecca Bultsma: I think it needs to be visible, intentional, with purpose. We need to be using it as part of our strategy, not even ed tech, right? We didn't buy this because they had a good sales pitch; how are we intentionally going to use this technology in this way? Beneficial is the, you know, the equity check: does it help everybody, all students? Are there people who are being left behind or being harmed? And earned.
Rebecca Bultsma: Because trust isn't given, it's proven through consistent actions. And if you mess up, you own it. If you succeed, you share it. But everything that we do, like, we just need to make it a vibe, right? Like with AI, make it part of the conversation. Not like a checklist, but just give it the vibe check, really. That's how I think about it.
Michael Conner: Absolutely, Rebecca. I love the analogy where you had ethics as a seatbelt. When you said that, immediately I thought, wow, that is ethics in itself.
Michael Conner: Right? But the vibe framework, and it is funny that you talk about vibes, right? Because I got a Gen Alpha, who you met. Yep. And when they talk about vibes, he incorporates vibes, and I think, what's vibes? And I'm like, what are you talking about? Yep. Like, the vibes. Yeah, the vibes.
Michael Conner: But now, I know you, Rebecca. Like I said, you're a czar with this, and I lean on you to find out information with regards to disruptive technologies. I.e., hey, you got a few moments? Let me show you this. That turned into a 500-level course in about 30 minutes that I loved. But Rebecca, what are the most advanced forms of AI and disruptive technologies we should be having structured conversations around right now?
Michael Conner: Right? If I'm a leader today, what are those conversations I need to have? And then also, what type of professional learning should be implemented, with this degree of consistency and integrity, right, so that these forms of disruptive technologies in education become lucid and seamless in the model?
Rebecca Bultsma: So OpenAI just launched their most powerful version of ChatGPT for everybody; even the free tier has access to it. And to kind of match that, Anthropic's Claude, their number one competitor, um, released a similarly powerful model, Opus 4.1. And Elon Musk's company released Grok 4. So these are extremely powerful, what we call frontier models, that can do amazing things.
Rebecca Bultsma: If I know I'm going to be, um, making a decision, or speaking to a group, I try to build a research persona of the group I'm speaking to, and then have it pick apart my arguments or my delivery to see what I'm missing, right? So I think that's a key technology. That's the biggest one, top of mind, right?
Rebecca Bultsma: Like ChatGPT, these frontier models that are amazing and smart. Um, but they also create outputs that are impossible to tell if they're real or fake, whether that be images or songs or writing. We need to be talking about what's called agentic AI, which are like AI agents, which everybody's talking about, and really it's not as...
Rebecca Bultsma: So for me, if I was going to give a keynote at a particular event, let's say it's a company in the real estate sector in British Columbia, Canada, I would use ChatGPT 5. I would assign it a deep research project to go out and find out everything it could about this organization, its competitors, its history, and, um, you know, the perspectives and perceptions of the company.
Rebecca Bultsma: And it will just go away and take 10 or 15 minutes, and go everywhere on the internet, and basically return me, like, a master's-level thesis-type paper with 150 sources. But it's going off to do that on its own while I do other things, and that's really what an agent is. The best example of this is Google just came out with a beta project, uh, called Google...
Rebecca Bultsma: All you have to do is then push a button. It builds it by you just talking to it, and it's kind of doing things in the background. That's all agents are. So if we build that for your son, let's say, he has it forever. Every piece of information he finds on the internet or gets from his teacher, he just puts it in, and in the background it does all of that, makes sure it's...
Rebecca Bultsma: ...semi-accurate, you know, whatever. So that's kind of how agents work. So agents are one big thing that we need to be talking about. The other one is this idea of robotics and autonomous assistants, which I'm not gonna get into, it's a whole separate thing, but it's just the idea that we're putting this powerful technology into physical things, you know, beyond our Roomba, something that can actually act. And there's a lot of risk and benefits associated with that too.
Michael Conner: Really good point. To my audience: Rebecca, I use VFE as an asynchronous professional learning tool that a lot of my audience can segment, rewind, go back to, promulgate, uh, challenge against their current practices, creative tension, Peter Senge's theory, if we wanna be explicit.
Michael Conner: But I really want to go into these agents, right? And the reason why I'm sounding a little disjointed with this, Rebecca, is because I was just talking about this form of AI, and I want you to be able to highlight what this AI is. And I'm forecasting, and again, this is my own personalized theory of action, value proposition, or even hypothesis, and people can challenge it.
Rebecca Bultsma: Potentially.
Michael Conner: Potentially. And I'm thinking about how, with superintelligence, right, we are talking about these really new, robust frontier models, around how they're just improving simultaneously.
Michael Conner: We look at the data sets on the back end. But this agent, this new level of AI, you said that there's a lot of risk and benefits. To me, that's a radical architecture for education models, and I'm thinking that because of this, we can bring school one-to-one. Can you just elaborate? I know I might be way out. Like, people 10 years ago were saying, Mike, your thinking is way out there.
Michael Conner: Toys R Us being one of my favorites. Can this happen in education? I think it can, because if it's happening in the economy, it can be mirrored in education. But everybody kind of pushes me away when I say this: new AI can really now radicalize how one-to-one instruction can be delivered. And let's be honest, with the exception of some nuances, these are the risks.
Michael Conner: It can get to the level of a zone of proximal development that you or I can't get to with a student. I just want to hear your thinking on this.
Rebecca Bultsma: So we just have to, like, reimagine. When we think about a school day, like, the time we actually spend delivering instruction versus, like, character education and, uh, things like hobbies. People freaked out when this kind of AI school, this Alpha School, popped up in Arizona a couple of years ago, and their whole premise was: yeah, the kids are gonna learn with AI for maybe two hours, and the rest of the day we're doing character education.
Rebecca Bultsma: They chase something they're passionate about, and we help them do that. And I don't think we're that far off. If you think about it, if we would've told you, even pre-AI: guess what, what if every kid could go home every night and have a personalized tutor who is endlessly patient and could help them understand things, meet them where they're at? Like, that would be the dream for kids.
Rebecca Bultsma: You know, people talk about flipped classrooms, but that idea that you kind of absorb the content at home, and then you come back and you experience and engage in all of those wonderfully human things that you can't necessarily do with the AI, I think that would be great. And we'll still need people.
Rebecca Bultsma: So, so much. But then they'll still have the opportunity to chase things that they're passionate about. And you know, if you have a teacher who doesn't care at all about this thing that you care about, well, you can still learn the things from them, but you can still learn in the way that you learn best, using things you care about.
Rebecca Bultsma: It's probably a disjointed answer, but your short answer is yes. I think we end up with one-to-one. I don't know if that will be the fully-replace-education-as-it-is-now model. I think it, like everything else, will land somewhere in the middle. Um, but I think we can do it.
Michael Conner: Absolutely. And Rebecca, I always, what do they say?
Michael Conner: Hundred percent, I agree with you. I want you to do this, because we're seeing these frontier models, these new agentic AI models, uh, superintelligence coming out, forms of AI that can now, you know, do logic and reasoning, which is scary in itself. With this notion, same level set that we were talking about: where do you see school being reauthorized or redefined, specifically?
Rebecca Bultsma: So are we talking about what needs to change, or what does that look like in the future, or what needs to happen now for change?
Michael Conner: Yeah, what does that look like in the future, Rebecca? Like, when we talk about seat time: what makes Rebecca Bultsma complete a course in 60 hours such that you master it?
Michael Conner: Where, shoot, Michael Conner might need 260 hours, right? Like, how do we challenge seat time like that with AI now?
Rebecca Bultsma: You and I both know, and this has been talked to death, but, you know, that idea of traditional school, the schools we went to, which worked out great for us because we obviously, you know, fit well into that model. But that Industrial Revolution era, with bells and grades and summers off, which was basically for farming back then, and that 150-year-old operating system.
Rebecca Bultsma: Cell phone plans, you know, stuff that kids today just don't even understand. Do you remember when you had to pay per text message, and if you wanted, like, an E, you had to hit that three, like, five times, you know? But I really think the idea of school will become more like a network, and not a building, necessarily.
Rebecca Bultsma: Um, it's not a place you'll go. There'll be physical hubs for labs and for sports, for social connection. But I think school is your personalized AI learning companion that can go with you anywhere you go. Um, everything's school, as long as you're learning. And I already do this, right? As a student, my school is in Scotland, right?
Rebecca Bultsma: The idea of a teacher: instead of just being, like, a discipline manager and a logistics manager, it'll be more of, like, a learning architect, right? They're not gonna be delivering content, because let's face it, AI will be able to deliver content better, and in any way that you want it.
Rebecca Bultsma: I'm an auditory learner and a verbal processor, so I like to listen to content and then have a conversation with AI, and have it, um, devil's-advocate all of my ideas, or poke holes. Like, I like that back and forth, and a teacher can't do that with every student, right? So that's what teachers will become, but they will be important for teaching kids things like empathy and creativity.
Rebecca Bultsma: I don't think we'll have standardized tests anymore, thank goodness. I think maybe it'll be AI assessing you continuously, through projects and conversations and creation, and it'll know that you understand physics because you built something that works. Um, not because you filled in C on your multiple-choice test and got lucky that time.
Rebecca Bultsma: Right. You know, I wonder what credentials will look like. You know, you and I are obviously big, like, degree people, right? But I wonder if it'll be less about taking that traditional path and checking that box that you graduated, and more about mastering a portfolio of skills, and your portfolio gets maintained your whole life, from 35 to 50.
Rebecca Bultsma: So it'll just become as personalized as, you know, their Spotify playlist. And I think the idea of classroom management maybe becomes irrelevant. I think a lot of behavior problems we have are because kids are bored, you know, and I think we lose a lot of that with personalized pacing and engagement, and kids not being as frustrated because they're lost, or they can't keep up, or they feel dumb.
Rebecca Bultsma: And, uh, yeah, I think it will all come together. Like you said, it'll land somewhere in the middle, and yeah, we just have to steer it to kind of an ethical place, where people don't become displaced professionally and kids don't fall through the cracks.
Michael Conner: Rebecca, I love your answer, because you just gave a really simplified, layman's-terms version of disrupting every single element in the operating model: classroom practitioner to learning architect, moving away from the traditional word of school to a network, moving away from the traditional school building and classroom to a physical hub.
Michael Conner: He was thriving. I mean, he's a very, very smart kid, thriving in school now, but exponentially, 'cause he's like that. He wants to run around, really wants to explore, does his best thinking at selective times. We have to think about how we're going to get there, right? And it kind of leads into the next question, Rebecca: if we're talking about a model where our practitioners are learning architects, that right there is a comprehensive redesign of pre-service programs.
Michael Conner: I mean, where do you think this leads, Rebecca?
Rebecca Bultsma: I think the things most vulnerable to AI disruption are the things everyone is trying so desperately to protect and hang on to. So standardized testing, traditional grading, homework as we know it, the college application essay: it's all about to get absolutely cooked.
Rebecca Bultsma: Just to borrow another phrase from Gen Z and Alpha. But, um, the people who are fighting so hard to keep things as is are the ones who are basically fighting for their own irrelevance. So I think most vulnerable: standardized testing, ACTs, SATs, state assessments. Like, they're dead institutions walking.
Rebecca Bultsma: So I think, well, it'll be like fax machines, I really think it will be. I think traditional homework, the way we know it, that people are still kind of hanging onto, the take-home essay. Like, we are trying so hard to catch kids cheating and, you know, prove that they cheated, and they can take a picture of their homework and have AI do their homework for them.
Rebecca Bultsma: Right. The way that we've always done homework, like, that can't happen. It's basically prompting practice for kids. Teachers just have to assume AI is gonna touch every single take-home assignment. Like, you're just teaching kids to be better at using ChatGPT and not getting caught, rather than teaching them exactly what you're trying to teach them.
Rebecca Bultsma: If you gauge a fish by his ability to climb a tree, he'll live his whole life thinking that he's dumb. We just happened to fit the mold. And so that gatekeeping complex, I think it's gotta go. I think we're limiting kids for life by making them think that if they don't fit into this box... I think that's gotta go, this gatekeeping.
Rebecca Bultsma: Like, you're out of the norm, but you still have to get there following this path. We don't necessarily say learning paths. Um, it will make it look like we're still trying to teach everybody Latin, you know? Like, the schools clinging to everybody-turn-to-page-47. It's like cosplaying education at this point.
Rebecca Bultsma: Like, that stuff is all vulnerable. I think the biggest thing that's vulnerable is ego. I might get heat for saying this, but: the teacher who really just wants to teach the same curriculum 25 times and then collect a pension. The administrator whose whole identity is maintaining standards of excellence, or keeping everybody in line and checking all the boxes. Uh, the parents who think that their kids' school education needs to look exactly the way that theirs did.
Rebecca Bultsma: Coaches who see potential in kids before they see it in themselves. I have a kid like that too. Um, counselors who notice that somebody's struggling, who can kind of see what AI can't necessarily see. You see on TikTok those principals that know every kid's name and story, or have, like, a secret handshake with every kid that gives them that sense of importance and belonging. That stuff will thrive.
Rebecca Bultsma: That's the stuff. That's what I think is vulnerable, and the stuff that I think will thrive.
Michael Conner: Rebecca, I think that everything you stated within your answer is pivotal for 22nd-century design. A lot of that is represented in a very sequential and simplistic way of alignment to various indicators.
Michael Conner: And, you know, I know Generation Z will be formally matriculating out of the PK-12 continuum at the conclusion of the 25-26 academic year. So we will have a thread, a vertical thread, of all Generation Alpha students, that's including my son, who's in fifth grade.
Michael Conner: What advice would you give me, if I was a leader, regarding your top two recommendations for preparing my system, my organization, to be AI-driven?
Rebecca Bultsma: I would say the first one would be: set up permission-to-fail infrastructure. Um, you know, the obvious answers are build AI literacy, create policies. That's, like, table stakes.
Rebecca Bultsma: That's like saying, learn to read and have rules. But build a permission-to-fail infrastructure. Stop trying to get it right; you won't, the technology is moving too fast. You're just trying to build a static solution for a technology that's always evolving. Give people permission to fail and experiment.
Rebecca Bultsma: We don't do that great with our kids sometimes, and we definitely don't do that great as adults. But the way that I learned how to use this technology when it first came out, there was nobody around who could tell me how it worked, except for computer science guys who could not explain it to me in regular language.
Rebecca Bultsma: And so I just had to figure it out, and I just started sharing publicly all the ways that it had failed and what I learned. And we both know you learn more from your failures than your successes. So create a culture of safety where people can experiment and fail. Because this is the thing, and you'll see this too: my Gen Z kids...
Rebecca Bultsma: A patch, a fix didn't work? Cool, how are they gonna fix it? That's the mentality. It's not like for us: oh no, I failed that test, my life is over, right? So they expect iteration and failure, so, like, let's make it safe for them, let's make it safe for us. The second one is a little more unorthodox, but I really think we need to have student shadow boards. And hear me out.
Rebecca Bultsma: Not student councils, not focus groups. I think: match every adult board member at a school with two students, one who's, like, a tech nerd, and the other one who, like, doesn't like the technology. And every AI proposal that goes before the board also goes to the student shadow board. I think maybe you pay them, not in pizza.
Rebecca Bultsma: They know. And if you can incentivize them to work with you, instead of just trying to make all the decisions for them about things that we do not understand, but they do. Um, I think it will kill expensive AI initiatives that make no sense in real-world stuff, it will help board members better understand it, and it would make kids part of a conversation they really should be in, and more.
Rebecca Bultsma: Anyway, it's one thing to be a student council president and plan dances, but if you're actually advising people in leadership, and showing them the AI and how kids are gonna use it, or what they're saying about it, they know. I think that's a huge thing we're missing. So I think these recommendations do two things that matter.
Rebecca Bultsma: I think those two suggestions create a culture where everything else works. I think it lays the foundation for everything else.
Michael Conner: Absolutely. No, Rebecca, great. I love your two recommendations. Student shadow boards, love it. I tell you, that's why I love you. Thank you, man. You're such an innovator with that.
Michael Conner: They're going through multiple iterations of what they're doing for that specific task in that game. Um, and they have ample opportunities to do that. And I see them collectively thinking, I see them having shared experiences, I see them, you know, going through these really divergent experimentations within the game.
Michael Conner: So that absolutely needs to be emulated in the classroom. It needs to happen with our teachers and leaders, with these tools and with these advanced technologies. Um, Rebecca, we need to see more of that. It's okay to fail, right? Iterative and dynamic with that. And yes, Rebecca, I was one of those students that...
Rebecca Bultsma: oh, I know.
Rebecca Bultsma: Same. Same, right? Like it was the end of the world and this group isn't like that.
Rebecca Bultsma: I picked three.

Rebecca Bultsma: I have three.
Michael Conner: You have three? Okay, let's see if you're a rule follower. What three words do you want today's audience to leave our podcast with, with regards to AI and the learner attributes of Generation Alpha and Generation Beta for future readiness?
Rebecca Bultsma: Number one, curious. They are curious. Everything they learn today will probably be, like, obsolete by the time they graduate.
Rebecca Bultsma: A lot of it. And so, I've learned this from my kids: they don't memorize anything. They just stay curious enough to figure out how to figure it out again when they need to. We're forcing them to memorize stuff; they're not retaining it. They're just figuring out how to solve problems. They'll figure out how to get around the AI detector so that they can do that again later.
Rebecca Bultsma: Curious, which makes them good at AI, because they're like, oh, I wonder if it can do this, I wonder if it can do this. So they're curious. The second one is unimpressed. I think it's my favorite. They are unimpressed by technology. They're not intimidated, they're not obsessed, they're unimpressed. They look at ChatGPT like I look at calculators, right?
Rebecca Bultsma: Useful tool, nothing more. They're not dazzled like we are. They're not scared. Some of them don't even like it. Um, oh, it can write an essay? Cool. Like, what else? And so they're tough to impress. The moment you're too impressed by technology, uh, you stop questioning it. And I think that's what is happening a lot in our generation.
Rebecca Bultsma: I'm waiting for, you know, version two. Or, it's fine, but it doesn't run their lives. Like, none of my kids are as obsessed with ChatGPT as I am, because it's so new and shiny to me. But this is a generation that grew up with technology that's all kind of new and shiny, and so they're unimpressed, really.
Rebecca Bultsma: They're unimpressed with everything, which I think is kind of a benefit. I think this one's kind of a, I dunno, it feels a little too generic, but this idea that, yes, they're human, but they're, like, all about being human, right? Like, the more artificial everything in their sphere becomes, the more they care about things being human and authentic.
Rebecca Bultsma: So the kids, I think they realize that the people who are gonna stand out aren't the ones who are gonna be generic and more machine-like; they're gonna be the ones who are messy and authentically human. We saw that kind of with TikTok, this move from Instagram, where one of the criticisms was that it just showed everyone's best, perfect version of their lives, right?
Rebecca Bultsma: Authentically human. So I think that's kind of their survival strategy too, uh, in their careers. I think they're going to need all of those things to kind of survive. And I think those things are how you stay human, uh, and humble, in an era of god-like technology that's kind of creeping into every aspect of being.
Rebecca Bultsma: We also need to be thinking about what words our current system discourages. Like, if you think about it, our current school system sometimes discourages things like messy authenticity and humanness and, uh, you know, things like being okay to fail. And so we just need to look at what these kids are...
Michael Conner: Right, right. And Rebecca, one word that is gonna stay with me forever now. Like, you already have cognitive offloading, so you already...
Michael Conner: You know, impacted my IP with that. But stay unimpressed, I love it. Miss Rebecca Bultsma, thank you for coming on Voices for Excellence. If any of my audience want to get in contact with you, or even extend on our conversation, how would they be able to do that?
Rebecca Bultsma: I'm on LinkedIn fairly often. I post there every day, just talking about the things that we're talking about right now. Or, uh, I have a website, rebeccabultsma.com, that people can contact me on. But I'm always happy to have conversations about my favorite thing to talk about. So thank you for giving me the chance to talk about my favorite thing to talk about,
Rebecca Bultsma: Mike.
Michael Conner: So thank you for coming on VFE, Rebecca.
Rebecca Bultsma: my pleasure. Thank you.
Michael Conner: On that note, onward and upward. Everybody have a great evening.