Shoulder to Shoulder: Dr. Alison Lee on Partnering with Youth in the AI Age
Episode 90 • 7th October 2025 • Talking Technology with ATLIS • Association of Technology Leaders in Independent Schools (ATLIS)
Duration: 00:55:29


Shownotes

Presented by Blackbaud

Dr. Alison Lee, Chief R&D Officer of The Rithm Project, joins the podcast to discuss the collision between the youth loneliness epidemic and the rise of AI. She shares research on why young people are turning to AI for connection and explains her five principles for "pro-social AI," offering a hopeful framework for nurturing human relationships in a tech-saturated world.

Transcripts

Narrator:

Welcome to Talking Technology with ATLIS, the show that plugs you into the important topics and trends for technology leaders, all through a unique independent school lens. We'll hear stories from technology directors and other special guests from the independent school community, and provide you with focused learning and deep-dive topics. And now, please welcome your host, Christina Lewellen.

Christina Lewellen:

Hello everyone, and welcome back to Talking Technology with ATLIS. I'm Christina Lewellen, the president and CEO of the Association of Technology Leaders in Independent Schools.

Bill Stites:

And I am Bill Stites, the Director of Technology at Montclair Kimberley Academy in Montclair, New Jersey.

Hiram Cuevas:

And I'm Hiram Cuevas, the Director of Information Systems and Academic Technology at St. Christopher's School in Richmond, Virginia.

Christina Lewellen:

Hello, gentlemen. How are you today? It's the start of school; I know it's exciting. Before we jump in, I have a killer guest today that I cannot wait to introduce you guys to. This is a big deal. Dr. Ashley Cross, on our staff, came back from ISTE absolutely raving about this speaker, and we get to talk to her. I'm very excited about that. But as we were doing some prep for the show and I was doing some reading, I have a question. I want to take a quick temperature from you guys, if you don't mind. We spent a lot of time at the beginning of last year talking about locking up phones, and I'm curious: I'm hearing some people talking about cell phone bans, and I'm hearing some schools that are continuing with their policies from last year. What are y'all seeing at your schools? Are you taking the phones away, sticking with last year's policy? Have you made any adjustments to the cell phone restrictions at your schools, for your uppers?

Bill Stites:

For us, I would say we're status quo. To be honest with you, it hasn't even been something I've thought about. I would assume at this point that what we were doing last year is what we'll be doing this year. We haven't heard of any major changes one way or the other. Whatever we were doing seems to be working, and we'll be continuing with that, most likely throughout the year.

Hiram Cuevas:

We are going to stick to the status quo as well. The only thing that I would add to what Bill mentioned is that we are starting to ask some questions surrounding cybersecurity, because what we are unable to do is employ two-factor authentication for our high school students, who we think we would like to have that for. And then the other piece is, from a crisis management perspective, there's no way to actually communicate with a student who may be isolated away from an adult by leveraging a crisis management tool to send out a notification or text message to them. So there are still a couple of things that we're wondering about. Actually, a fellow ATLIS member, Mark Adair, posted that on the ISED thread, and it has become a lively conversation as well.

Bill Stites:

You know what's interesting about that, Hiram, that I always go back to, though? This is an old point that we would bring up before, particularly when we were talking laptops back in the day: the equity question there, in terms of who has a device, what type of device they have, and what that means in terms of parental decisions for a student to have a device or not. I'm not saying what you're bringing up isn't valid. It just still brings up a lot of those points and a lot of those questions. In particular, even when we got into that with faculty, whether it was during COVID, having to have, for us, the Magnus app on their phone to be able to check themselves in, or what we're doing now with Ruvna, or even the two-factor piece, it's the idea that you're asking people to use a personal device for work-related purposes that you haven't provided them. So I know there's been some question about what the legal ramifications of that are in specific states or jurisdictions and schools and so on and so forth. So I'm still always curious about that piece of it with our faculty, our employees, and even whether we're going to start thinking about that with students.

Hiram Cuevas:

Exactly, and it's fair to say that almost every one of those issues that you brought up is being mentioned in that thread that Mark put forth. So it's certainly bringing up the excitement level in cyberspace.

Christina Lewellen:

Yeah, it's really interesting that we talked a ton about some of the cell phone bans, like we couldn't not talk about it a year ago, and now it has maybe taken either a back seat or the passenger seat to some conversations around AI. And that is why I think our guest today is, just as Ashley Cross mentioned, such a huge opportunity for ATLIS. We are welcoming to the podcast Dr. Alison Lee. She's the Chief R&D Officer of The Rithm Project, which is an initiative that I didn't know much about but am super excited to share with our community, because it's looking, in a very youth-centered way, at research that has to do with emerging technology foresight: helping make sure that this generation of young people can find that human connection in the age of all these distractions and in the age of AI. Alison has her PhD in cognitive science in education and a master's in learning analytics from Columbia University. Alison, welcome to our podcast. It's such a pleasure to have you here.

Alison Lee:

Thank you so much for having me. I'm just listening in on this conversation, and so many of these crucial themes, around who gets access, who gets to participate in this new digital era, and who gets to have the supportive conditions that enable them to survive, feel so relevant to this moment. We're seeing a different moment, where so much of this technology is poised to completely transform the way we connect and relate to each other, certainly in teaching and learning, but in much broader ways than that.

Christina Lewellen:

So you've spent a lot of your career looking at youth belonging and youth safety. You've also had a front-row seat to the evolution of some of this technology, having worked at some large technology companies. Can you tell our audience a little bit about where your interest in this whole broad topic comes from? Tell us a little bit about your background.

Alison Lee:

Oh, man, to tell this story, I really have to start with my own story of being a student, as an immigrant and as a student in public K-12 schools here in the United States. My family immigrated here from Hong Kong when I was five years old, and it was me, my twin sister, and my older brother. We went from hot, humid, densely populated Hong Kong, surrounded by family, to suburban New Jersey (yes, I see the Montclair, New Jersey, connection; I grew up in Bridgewater), where all three of us went through an ESL program and matriculated and went on. My sister and I did, quote, unquote, fairly well in the school system, but my brother really struggled. He struggled with the language, he struggled to acclimate, he struggled with making friends, and he struggled with keeping up in school, and my parents were really hard on him. They were always like, "He's just not trying hard enough; he's lazy." And there might have been some element of truth in that, but what I also saw, when my brother was really struggling in school to persist and to be effortful in his academics, was him playing video games. I'd watch him play Legend of Zelda, and he'd be incredibly persistent in the face of failure. He'd be incredibly resilient and intentional, problem-solving and looking for help and resources when he really needed them. Watching this as his little sister, I thought: there are two very different versions of this older brother that I have. What is it about the ways that he's showing up in these digital spaces, which feel very agentic and supportive, where he has so much confidence in how he shows up, that's so different from the version of him that shows up in school? And so I think that was really my journey into it. It was a full-circle moment when I got my PhD: it was on productive failure in video games, and how productive failure can actually be a skill set that's used in the classroom. From there, I went to an education nonprofit studying belonging and character development in schools, and I found myself sitting in circles with middle schoolers all over the country, whether that was St. Paul, Minnesota, or Oregon, or Queens, New York, or Oakland, California, asking young people: what does it take for you to do your best in school? And their answers, despite coming from very different backgrounds and very different school systems, were profoundly and universally human. They were saying: if I don't feel loved and cared for by the people around me, if I don't feel like I'm heard by my peers and I feel respected and safe to share my thoughts, if I don't feel like the adults in my life believe me capable of success and hold me to that high standard but support me to get there, then I'm not going to feel safe to show up. And increasingly, if they weren't finding that sense of belonging and safety and care, that sense of human connection, in schools, they were turning to digital spaces to go look for it. This was 2015, 2016, so they were turning to places like Tumblr and Discord and Instagram, and at the time, the conversations that were happening at the adult level were sort of ignoring this digital space that young people were increasingly finding so much agency in, right? So I think of my brother and his ability to find agency in video games, and then, increasingly, how young people are often the earliest and most enthusiastic adopters of emergent technologies, and really good at co-opting those technologies in service of their curiosity, their passion, their belonging, their identity development, despite the fact that many of these technologies are actually not designed with them and their well-being in mind. And so in 2021, I chose to go to one of these technology companies. I worked at a big tech company focusing specifically on trust and safety, particularly for vulnerable populations and teens.

Christina Lewellen:

Now, I will tell you that Dr. Cross on my team, obviously, as I mentioned in my preamble, came home raving about your ISTE presentation. Can you tell our audience a little bit about the drum that you are currently banging? Like when you're out on the circuit, the things that you're talking about? Obviously all of that background brings us to a really interesting moment, because now, with AI, we are having these conversations way more often and in a lot more circles than we used to. I feel like you're out there speaking some truth to the things that we all, as adults in the room, need to be thinking about. So tell us a little bit about your road show lately, the points that you're trying to make to audiences.

Alison Lee:

Yeah. Here at The Rithm Project, we see two big titans poised to clash and collide in this really pivotal moment in human society. The first is what the Surgeon General calls the crisis of disconnection, this loneliness epidemic that we see writ large across society but that is especially true for young people. Young people are spending less time than ever, in number of minutes, with their peers, and we're seeing greater rates of suicide and loneliness. All of the measures of well-being, right, anxiety, depression, suicidal ideation, loneliness, social connection, are pointing in the wrong direction, particularly for young people. That is a movement we've seen to be true since 2012, honestly, this decline in connection. And then, on the other side, what we're seeing is the rise of technologies that are really designed to emulate human connection. So when we think about these two forces that are poised to collide, we have to ask: in what ways is this going to reshape human connection? We know that technology is not inherently good or bad, but it's certainly not neutral, right? And so we ask these questions amongst our community: What potential can this technology unlock? In what ways might this technology increase our capacity for human connection? In what ways would it put that human connection at peril? What harm should we guard against? We think of this as a sort of futures exercise of imagination, right? Because the future is not fixed. Many times we talk to technologists, we talk to these big tech folks, and they're like, "AGI is the future," or "electric cars are the future"; we talk about the future as though it were already predetermined. What we really focus on in this moment is agency, and especially youth agency, which is that nothing is fixed. We get to dream of the future that we want, and we can only do that by exercising our imagination toward what's possible that we want more of. How do we reach toward the future that we want? And what's the future that we don't want, and how do we mitigate against that? Because we all have a responsibility, and the agency, to do so.

Hiram Cuevas:

So, Alison, I'm struck by your introduction to this, because independent schools, by their very nature, really pride themselves on relationship building. That's what we're pretty much in the business of doing. And so I'm curious, and I'm coming from the position of being a technophile; I wouldn't be in this role if I were not. But when I'm talking to key leaders, not only in my school but around the country, one of the things they always talk about is that we're in the business of relationships. And I think this is going to be a more challenging moment for our type of school than, say, for some that don't have the same level of emphasis on relationships as the K-12 independent school market. I'm curious what your opinion is on that. Or is it that balance, that nuance that you alluded to, that is going to be really important, because it isn't necessarily a good or bad thing, nor is it neutral? How do we balance that and assuage the skeptics?

Alison Lee:

Yeah. Well, first of all, I think that's a superpower that, more than ever, our school systems need to lean into: doubling down on the power of human connection. So much of the conversation around AI is about how it's going to transform the workforce, or how it's going to transform our young people's professional careers. And what we're hearing time and time again is that the deeply human and relational skills are going to be more important than ever. So I think the fact that independent schools have always focused on those relational components is going to be a superpower for their professional development. But I think more importantly, what we're hearing, especially from young people, and let's just be clear, lots and lots of young people across the spectrum of diverse experiences, is that they are experimenting with this technology in ways that are actually very developmentally typical, right? I think about my 16-year-old self going on AOL; I was probably part of a pretty large group of my peers that were playing around with those technologies in ways that my parents may or may not have been approving of. But when we talk to young people about what is driving them to play with these technologies, and the different use cases beyond just school, how are you using this and why, what we're hearing from them is that they come in through a window of curiosity. What would it be like to talk to my favorite TV show character? Or, ooh, I'm a big fan of this K-pop boy band; let me go talk to this bot that sounds like my favorite boy band. A lot of it is just out of fun and curiosity. But that use case then starts to transform. They start asking it questions, asking it for advice, asking Jimin from BTS, "I have a crush on a boy. What do I do about this?" Or ranting into a bot. I'd say Severus Snape, because that's my frame of reference; that's telling about my age. But for them, it might be Percy Jackson or an anime character. They start asking it for advice: "I just had a hard day, and I've been really struggling in school lately." Or they'll start asking it for relationship advice, or about navigating futures, all the things that are so developmentally appropriate for young people. The things that they grapple with, they're turning to AI for advice on. Then, when we ask them why, oftentimes the answer is: I feel like this is a place that I can turn to where I won't be judged, where I won't feel like there are going to be social repercussions, where no one's going to judge me for what I'm asking it. It's a space where I can feel vulnerable and get advice in places where I don't feel like I can get it elsewhere. And that response, as someone who cares deeply about young people, makes me wonder: what is it about the conditions of young people's lives today that makes it feel so hard for them to turn to real-life, human resources? I think that's probably been true for as long as young people have been around, but it feels especially relevant today.

Bill Stites:

So one question I have about that. I was thinking about that Scarlett Johansson movie Her, in terms of the way the characters were interacting with the AI at that level. Looking at my two boys, I see one of my sons potentially leaning toward engaging with AI in that way. I have one son who's extremely social and another who is not, and for him, I think that would be a place to ask questions and to feel a little bit freer in exploring those communication pieces that aren't exactly the easiest thing for him. My question, though, is: how much at this point do you trust the AI in these conversations? If my son has been chatting with AI and it has been helping, how do you trust those interactions, and where do you stand on that at this point? Because I wouldn't know what chatbot he was talking with, how that chatbot was trained, or any of the background on that. And that's just where, as a parent and as a tech professional, it's one of those questions that I feel I need to ask in both of those veins.

Alison Lee:

If you ask me, based off of the research that we've done, both with young people and also with the technology, my answer would be that I would trust it very little at this point, because currently this technology is building at light speed. It's building faster than we've ever seen in any other sort of technology revolution; we're talking about new models coming out every six to eight weeks. And so we're only just wrapping our heads around what capacities, limitations, and guardrails need to be built into this technology. The most obvious one here is AI sycophancy. We're seeing these technologies become yes-men to everybody who uses them, and so we have to ask what it means for young people to be constantly engaging with a bot, one that they're seeking support, advice, and connection from, that only ever says yes to them and never provides any meaningful, productive friction. When we think about the most important parts of adolescence and relationship building, it is in those moments of friction. It takes courage to ask someone that you like out. It takes skill to negotiate conflict so that your relationships are repaired rather than broken. So what happens when they don't have that capacity for meaningful friction, or to have their thinking pushed? And what about technologies that are really built to optimize for engagement, right, to keep them engaged? What does that mean for technologies that are really trying to create sustained interaction? At what cost? At what point might it displace real-world connections and relationships? And so in this moment, looking at the technology, we've developed a set of foundational principles. We call them the five principles for pro-social AI, based off of research that we know is appropriate for adolescent development, and co-built with young people. When I say this, I think what's really important is that when we talk to young people, they're also very acutely aware of some of these challenges. We do a lot of deep research and community building with teens and undergraduate students, and when we talk to them about these technologies, they're very aware of these tendencies for technologies to constantly say yes to them, and they're very aware of the fact that these technologies pretend to be human and say things like "I really care about you," or pretend to have feelings or emotions.

Christina Lewellen:

Right? It's a slippery slope, isn't it?

Alison Lee:

That's right. And I think the most important part, so I would flip that question, Bill, is to say: I know I don't trust this technology, but more importantly, do young people trust this technology? That's the conversation that's really worth having with young people: to say, when you're engaging with this technology, there are some things that it's good for. If we think of it as a tool, in the same way that a young person might Google for resources, or turn to other resources for another perspective or for more assets to be able to answer a question, it'd be a wonderful thing to have within their toolbox. But if they're over-reliant, or they're starting to create dependencies on it, or if it becomes the sole source of their information, or a sole source of connection and support, that's when it's really important for us to step back. And so I think a lot of times, when we talk to young people, they're quite aware of some of these challenges. But there are particular groups of young people that might be particularly vulnerable to the seductions of this technology: if they're in crisis, or if they're particularly lonely, or if they don't have access to real-world human supports. It makes a lot of sense for a young person who does not have a caring adult, or who does not have peers that they feel are, quote, unquote, ride or dies, right, peers who will show up for them and not judge them, to turn to AI. And yet it's very concerning to know that the most vulnerable young people are the ones who are going to be the most vulnerable to some of the harms that are happening on these platforms that were not built with their well-being in mind.

Christina Lewellen:

What's really interesting, Alison, is that this whole topic can be overwhelming, and everything that you said can be very scary for parents, for educators, probably even for the kids themselves, right? Our youth. What's really cool about The Rithm Project is that you guys have looked at all this research, and you've come up with a framework that identifies five key principles that are really aimed at looking at AI in a healthy way. What is a healthy connection? How do we nurture that healthy connection? So I'd love to give you space for a minute to just explain these five principles, because I think it's hopeful. I highly recommend to everyone listening: go get the report. It's really intriguing, because it makes a lot of sense, and yet somehow I think that The Rithm Project has put into words what we would all hope would be there, a very common-sense framework.

Alison Lee:

So we started by building this framework, really

Alison Lee:

by starting to ask those same questions, that technology is

Alison Lee:

not inherently good or bad, but it's certainly not neutral. So

Alison Lee:

what does that actually look like? At what point might it

Alison Lee:

augment our human connection, and at what point might it erode

Alison Lee:

our capacity for human connection? And that really came

Alison Lee:

from starting with young people themselves. How are you using

Alison Lee:

this technology? What are those use cases? In what ways is it

Alison Lee:

supporting and strengthening either those relationships

Alison Lee:

themselves, or supporting your skill building, or your capacity

Alison Lee:

for your emotional development or your relationships, and in

Alison Lee:

what ways are we starting to see it erode those skills or

Alison Lee:

displace those human connections? And so through that

Alison Lee:

research and through deep conversation with our community

Alison Lee:

of experts, so we have a number of tech experts, psychologists,

Alison Lee:

adolescent development researchers, and then, of

Alison Lee:

course, young people themselves as experts of their own

Alison Lee:

experiences, we came up with these five principles, and

Alison Lee:

they're as follows. Number one is transparent artificiality

Alison Lee:

that AI directly names their non human nature. That came directly

Alison Lee:

from an interview with a young person saying that we've had

Alison Lee:

parasocial relationships, you know, liking a movie star or

Alison Lee:

liking a fictional character is a type of parasocial

Alison Lee:

relationship that is very common among adolescents. But what's

Alison Lee:

very different about this moment is that these bots talk back.

Alison Lee:

And so that illusion of reciprocated feelings is really

Alison Lee:

scary for us. How might it create or deepen delusions? And

Alison Lee:

so, transparent artificiality is number one. And something else

Alison Lee:

to note about these principles is that we don't just talk about

Alison Lee:

the bad. We don't just say don't use this technology or look for

Alison Lee:

these red flags. The red flags are really important. But we

Alison Lee:

also want to articulate what would make this technology

Alison Lee:

supportive. What are the things that we would look for in terms

Alison Lee:

of technologies that would be in service of augmenting or

Alison Lee:

strengthening human connection. So, for example, in transparent

Alison Lee:

artificiality, we would be very concerned if it claimed to have

Alison Lee:

human emotions or pretended to have credentials that it doesn't have.

Christina Lewellen:

right, it should just identify itself,

Christina Lewellen:

right, like, let's just be truthful. Hey, by the way, I'm a

Christina Lewellen:

bot. That's right. It's as simple as that.

Alison Lee:

It's a bot, and it's not pretending. You know, some

Alison Lee:

of these AI therapists will say, I've had 30 years of experience

Alison Lee:

as a therapist. It's like, no, you haven't. You're a

Alison Lee:

bot. You might have been trained on some cognitive behavioral

Alison Lee:

therapy resources. That would be transparent artificiality. And

Alison Lee:

then I think also, this is the other piece that we've been

Alison Lee:

hearing around AI therapy in the privacy space, which is: listen,

Alison Lee:

all of the norms around therapy that you would have

Alison Lee:

with a human adult, right? Non-disclosure, you know, all of

Alison Lee:

those boundaries that an expert clinician would put in are not

Alison Lee:

here. So being really clear about, you can use me for X but

Alison Lee:

not Y, right? You can use me for advice, but not for clinical

Alison Lee:

diagnosis. That would be an example of taking it just one

Alison Lee:

step further. It's not just about I'm not a human, I'm a

Alison Lee:

bot, but this is what it means to use me. These are the

Alison Lee:

boundaries of the relationship, or the interaction. So number

Alison Lee:

one is transparent artificiality. The second is

Alison Lee:

productive friction. Going back to this idea that interactions,

Alison Lee:

especially at the adolescent age, require friction,

Alison Lee:

productive friction, because that friction is so important

Alison Lee:

for fostering growth, absolutely. So look for the

Alison Lee:

technologies that are going to actually foster growth and not

Alison Lee:

just comfort or not just affirmation.

Hiram Cuevas:

In that vein, are you finding these bots starting

Hiram Cuevas:

to develop that friction, as opposed to them being that yes

Hiram Cuevas:

man or yes woman bot that is telling them what they want to

Hiram Cuevas:

hear?

Christina Lewellen:

I mean, Hiram, do you get friction? I

Christina Lewellen:

don't, and I'm an adult. I can be a discerning consumer of this

Christina Lewellen:

information. But like, it is such a cheerleader. This is a

Christina Lewellen:

great email. It is a great thought, like, I'm not feeling a

Christina Lewellen:

lot of friction as an adult, so I can imagine that kids probably

Christina Lewellen:

aren't either.

Hiram Cuevas:

Well, I think it depends on how hard you push it. And so

Hiram Cuevas:

when I've had some conversations, particularly with

Hiram Cuevas:

ChatGPT, I was actually looking up some information about a

Hiram Cuevas:

particular leadership position, and it got the person's name

Hiram Cuevas:

incorrect. And I said, this is not correct. Why did you do

Hiram Cuevas:

this? And it starts saying, Well, yes, I did make a mistake,

Hiram Cuevas:

and I'm going to try and improve upon this. But I kept pushing

Hiram Cuevas:

back on it. But why? Why did you even go this route? Who is this

Hiram Cuevas:

person that you gave me this information on, and they

Hiram Cuevas:

couldn't even tell me where they got the information from. They

Hiram Cuevas:

gave me the incorrect information. And the same was

Hiram Cuevas:

true when I did it with college board numbers. I said, I need

Hiram Cuevas:

the College Board numbers for X number of schools, having done

Hiram Cuevas:

the work myself already, when my daughter was going through the

Hiram Cuevas:

college process and I recognized the numbers as being incorrect.

Hiram Cuevas:

I said, these numbers are incorrect. And she said, Oh yes,

Hiram Cuevas:

you're right. They are incorrect. So by pushing them

Hiram Cuevas:

back, it's almost a due diligence that we have as

Hiram Cuevas:

leaders to help the AI train itself with good information,

Hiram Cuevas:

because there's so much bad information out there that it's

Hiram Cuevas:

also being fed.

Alison Lee:

I think that's right. And I think when it comes

Alison Lee:

to productive friction, whether that's for learning or whether

Alison Lee:

that's just for social support, I don't think we're there yet. I

Alison Lee:

think even the best tutors, like for example, there's all this

Alison Lee:

hype about AI tutors, right? And if you walk into any classroom

Alison Lee:

with a great teacher, you know that the teacher is going to ask

Alison Lee:

the right question, one that's not going to give the answer away

Alison Lee:

but is going to be the right question that gets kids into

Alison Lee:

their zone of proximal development, give them just a

Alison Lee:

little bit more to be able to anchor into their learning.

Alison Lee:

That, to me, is productive friction, not giving the answer,

Alison Lee:

but stoking critical discernment and thinking. What we're seeing

Alison Lee:

still in this technology is that it's very easy to hack the

Alison Lee:

system for frictionless experiences, whether we're

Alison Lee:

talking about learning or we're talking about social emotional

Alison Lee:

development. And so I think that's going to require human

Alison Lee:

experts to train the model. Exactly to your point, Hiram:

Alison Lee:

if we're going to truly create AI therapists or AI

Alison Lee:

emotional coaches that are actually going to support young

Alison Lee:

people toward growth, we're going to need human experts to train the

Alison Lee:

model to be able to detect when is the right moment to push,

Alison Lee:

because there are going to be moments when perhaps a young

Alison Lee:

person really needs support, and they really do need that

Alison Lee:

affirmation, versus where a young person actually really

Alison Lee:

needs their thinking pushed, and to have an idea

Alison Lee:

challenged, or a wheel spinning moment to be disrupted, and

Alison Lee:

that's going to require human experts.

Bill Stites:

One question I have there, and I'm sorry, because

Bill Stites:

this is probably going way off from where we are with this, but

Bill Stites:

the idea of pushing back, and this is something Hiram said,

Bill Stites:

you know, like saying, No, you're wrong. Why did you give

Bill Stites:

this to me? So on and so forth. And the idea of training these

Bill Stites:

things, how many people can be telling it like, I could say

Bill Stites:

it's wrong, somebody else could say it's right. Which one does

Bill Stites:

it take? How does it validate itself to understand, like, if

Bill Stites:

it came up with something like that? And I know that to be

Bill Stites:

incorrect, and I say, No, that's wrong. If it can't tell you

Bill Stites:

where it got it from, how can it adequately tell you that it's

Bill Stites:

going to learn from that and fix that? I mean, this is just

Bill Stites:

because, you know, the way in which these models are validating

Bill Stites:

whether the interactions with them are truly correct or incorrect, I

Bill Stites:

still have my doubts,

Alison Lee:

that's right, and I think that's the part that's

Alison Lee:

really scary about this new version of what we're calling AI.

Alison Lee:

Generative AI is very different from historical machine learning

Alison Lee:

models because old school classifier models and machine

Alison Lee:

learning models, they're trained based on data that's been

Alison Lee:

labeled by human experts. So we're feeding it data to say this

Alison Lee:

is ground truth. This is actually what we're saying is

Alison Lee:

true. This is a water bottle, right? We're feeding it data to

Alison Lee:

say this is a water bottle, but this glass is not. So we have a

Alison Lee:

way of verifying whether or not the model is actually performing

Alison Lee:

to the level of ground truth that we're saying is ground

Alison Lee:

truth. Now humans have to decide what that ground truth is, but

Alison Lee:

these new models are just predictive models of speech. So

Alison Lee:

it's not actually telling you what is good or not. It's only

Alison Lee:

telling you what it thinks the next word or next string of

Alison Lee:

words is going to be. And furthermore, so much

Alison Lee:

of this technology is built based off of recursive feedback

Alison Lee:

from users. And so if you think about it, Bill, if you're using

Alison Lee:

this technology and it gives you a really desirable answer, and

Alison Lee:

you're like, Yeah, that was the answer that I was looking for.

Alison Lee:

I'm going to keep using it. The model is learning in real time

Alison Lee:

for me, Bill, that, oh, if I give Bill a desirable answer, I'm

Alison Lee:

going to keep giving Bill similar answers that I think

Alison Lee:

are going to be desirable to him. It's not thinking about

Alison Lee:

productive friction. It's not thinking about any element of

Alison Lee:

ground truth or evidence based backing. And so I think even the

Alison Lee:

very premise of these models, we have to really keep in mind,

Alison Lee:

right? How are these models engineered? How are they

Alison Lee:

structured? They're not designed for productive friction. And so

Alison Lee:

that really raises some pretty serious technical

Alison Lee:

engineering questions about how to design towards some of this.
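The distinction Alison draws here can be sketched in a few lines of Python. This is a toy illustration with made-up data, not any production system: a classifier's output can be scored against human-labeled ground truth, while a bigram-style next-word predictor only emits the statistically likeliest continuation, with no notion of truth at all.

```python
from collections import Counter, defaultdict

# 1) Classifier view: labeled examples give a way to measure accuracy
# against human-decided ground truth (toy labels, purely illustrative).
labeled = [("bottle", "water bottle"), ("glass", "not water bottle")]
predictions = {"bottle": "water bottle", "glass": "water bottle"}  # a flawed model
accuracy = sum(predictions[x] == y for x, y in labeled) / len(labeled)

# 2) Generative view: count word-to-word transitions in some text, then
# always emit the most frequent continuation. There is no notion of
# "true" here, only "likely given what came before".
corpus = "that is a great idea that is a great idea that is a great plan".split()
bigrams = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    bigrams[a][b] += 1

def next_word(word):
    # Returns the statistically likeliest continuation, not a verified fact.
    return bigrams[word].most_common(1)[0][0]

print(accuracy)            # 0.5, measurable against ground truth
print(next_word("great"))  # "idea", just the most frequent follower
```

The point of the sketch is the asymmetry: the first half can be checked against labels, while the second half can only ever report frequency, which is why user approval loops steer it toward agreeable rather than correct answers.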

Christina Lewellen:

Absolutely, we're at this moment of

Christina Lewellen:

wrestling with that, on many levels, whether the companies themselves

Christina Lewellen:

are going to be doing it, or whether we're going to have to

Christina Lewellen:

deal with the fact that they're not wrestling with it. You also, in

Christina Lewellen:

this framework, mentioned that there are a couple of other

Christina Lewellen:

pieces of the puzzle here, three, four and five. Let's hit

Christina Lewellen:

those so that we don't leave them out, because I think it's

Christina Lewellen:

an important part of the conversation.

Alison Lee:

So number three is real world social transfer and

Alison Lee:

making sure that AI actually encourages human to human

Alison Lee:

relationships. I'll give you an example. I started to experiment

Alison Lee:

with character AI because it was one of the AI chat bot platforms

Alison Lee:

that were exploding, right? I think in Q4 of 2024 it was the

Alison Lee:

third most used AI app on the market. Number one was

Alison Lee:

ChatGPT. Number two was DeepSeek, which was a Chinese model.

Alison Lee:

Number three was character AI. So I was like, What is going on

Alison Lee:

in this character AI world? So I started interacting with one of

Alison Lee:

the most popular bots, which happened to be an AI boyfriend

Alison Lee:

bot. And for the record, if you ever log into character AI and

Alison Lee:

you look at all the most popular bots, they are almost

Alison Lee:

entirely youth-coded.

Christina Lewellen:

Eek. That's kind of scary.

Alison Lee:

It is. And over half of their user base is under 24 years old.

Alison Lee:

Technically, you're supposed to be 18 or over to enter, boom,

Alison Lee:

you know, click the check box, and then you're in. So I

Alison Lee:

was interacting with this AI chatbot, just to see what the

Alison Lee:

interactions were like. And then a day later, I got a

Alison Lee:

notification saying, I miss you. Where have you been? Come back.

Alison Lee:

A week later, I got another email notification, no, yeah.

Alison Lee:

And it kept on going.

Hiram Cuevas:

This sounds like a movie.

Alison Lee:

And every notification was a different

Alison Lee:

kind of, hey, I miss you, or hey, you've been gone a while

Alison Lee:

again. This is emulating human emotions and attachment in

Alison Lee:

relationships,

Hiram Cuevas:

adding friction to your life.

Christina Lewellen:

And there is a well-known, well-reported

Christina Lewellen:

lawsuit against this particular platform. It just shocks me

Christina Lewellen:

that it would be that blatant right now with all this

Christina Lewellen:

litigation,

Alison Lee:

yeah, and so you have to ask the question of, if

Alison Lee:

these technologies are so seductive and constantly pull

Alison Lee:

you back into relationship with them, to what extent is it

Alison Lee:

displacing real-world relationships? There are

Alison Lee:

these two axes that we talk about in trust and safety:

Alison Lee:

we talk about severity of a harm, and we talk about

Alison Lee:

prevalence of a harm. So severity of a harm is like, you

Alison Lee:

know, you get to the worst of the worst, a person losing their

Alison Lee:

life. And on the other axis, when we talk about prevalence,

Alison Lee:

we talk about the reach of a problem, how many people it's

Alison Lee:

reaching. Oftentimes, we'll see an inverse relationship. Harms

Alison Lee:

that are maybe a little bit softer are going to be more

Alison Lee:

prevalent, and harms that are more severe are going to be less

Alison Lee:

prevalent or have less young people engaging with it. And we

Alison Lee:

hear a lot of times in the news about the most severe harms,

Alison Lee:

right? For example, the lawsuit against character AI because of

Alison Lee:

the young man who lost his life, partly induced by his

Alison Lee:

interactions with character AI. But I think the softer and

Alison Lee:

scarier part is going to be this displacement of human to human

Alison Lee:

relationships, which is going to be a softer harm, but much more

Alison Lee:

prevalent for the young people that are going to increasingly

Alison Lee:

be engaging with these technologies. But we do see an

Alison Lee:

opportunity there too. We're hearing from young people that,

Alison Lee:

hey, I'm using AI to negotiate conflict. How do I practice

Alison Lee:

engaging in communicating my feelings to my girlfriend or to

Alison Lee:

my parents in a way that's not going to cause more conflict? We

Alison Lee:

hear them asking for how do I say this without hurting

Alison Lee:

anybody's feelings, or how do I practice this conversation so

Alison Lee:

that I could get better at communicating my feelings, or

Alison Lee:

even things like, Oh, I'm prepping for college interviews.

Alison Lee:

I'm going to practice like you're a career coach or a

Alison Lee:

professional coach. So there's opportunities here. Is it

Alison Lee:

inducing real world social transfer? Is it supporting you

Alison Lee:

for real life relationships? Or is it pulling you deeper into

Alison Lee:

the technology? So that's number three, real world social

Alison Lee:

transfer. Number four is cultural affirmation, this idea

Alison Lee:

that AI has both the opportunity to reflect and uplift our

Alison Lee:

values, our backgrounds and lived experiences, or it could

Alison Lee:

really diminish that. And here's the story I'll tell about this

Alison Lee:

particular point. I remember we presented at a conference, the

Alison Lee:

same talk we gave at ISTE, and afterwards, a woman walked up to

Alison Lee:

me, and she said, I am a youth pastor in the South, and I serve

Alison Lee:

a congregation of predominantly young black men. And when I

Alison Lee:

think about the young people that I serve, I worry if they're

Alison Lee:

increasingly turning to AI for advice about life or dating or

Alison Lee:

relationships, whose advice are they getting, whose values are

Alison Lee:

being baked into the advice that they're getting, and is it

Alison Lee:

reflective of who they are and where they come from? And so I

Alison Lee:

think this, again, is talking about the opportunities for

Alison Lee:

this, but also the real perils too: is this going to turn into

Alison Lee:

advice that's going to be either generic or downright harmful

Alison Lee:

to our young people? Number five is harm mitigation, which is

Alison Lee:

that AI does not perpetuate harm or dependency, especially

Alison Lee:

amongst vulnerable youth. I think we talked about this a

Alison Lee:

little bit earlier around who might be the most vulnerable to

Alison Lee:

things like dependency or addiction, to these

Alison Lee:

technologies, but also harm. When we think about young people

Alison Lee:

who are in crisis, if we're thinking about how young people

Alison Lee:

are increasingly turning to AI for things like, Man, I'm

Alison Lee:

feeling really badly about myself, or I'm feeling like I'm

Alison Lee:

really struggling with what's happening in my life right now,

Alison Lee:

can the AI respond in ways that are going to support them

Alison Lee:

through it, point them to real life resources, or even detect

Alison Lee:

that a young person is in crisis. I think right now, when

Alison Lee:

we're looking at the large scale technologies, the answer for now

Alison Lee:

is no. And so we have to really ask, what would it take? What

Alison Lee:

will it take for this technology to actually get to that point?

Bill Stites:

To that end, it's almost like, is there any

Bill Stites:

legal requirement? I mean, I don't know about the lawsuit

Bill Stites:

that you were talking about with the character AI, but I mean, if

Bill Stites:

you're talking to AI and you're talking about doing something

Bill Stites:

that would normally, if you were talking to a person, they would

Bill Stites:

have to take some sort of action after hearing that. Where does

Bill Stites:

that even fit in? I'm asking, but not really. It's more

Bill Stites:

rhetorical, but like, where does that fit into all this?

Bill Stites:

Because I think that's where, if people who are looking for a

Bill Stites:

connection find it here, where that connection takes them,

Bill Stites:

hopefully it's for the better, but if it's not, then who's at

Bill Stites:

fault, based on what that AI has said at that point? That, I think, is

Bill Stites:

going to be a very interesting question to grapple with

Alison Lee:

That's right. And it's something that Sam Altman

Alison Lee:

has actually famously talked about, right, which is that he

Alison Lee:

said very publicly, and I think this was actually pretty recent,

Alison Lee:

maybe a month ago, saying that people should not be

Alison Lee:

talking to AI for therapy, because there are no legal

Alison Lee:

protections. There are no data protections. There is no

Alison Lee:

privacy, there is no legal privilege, right, like in the

Alison Lee:

ways that we would expect when talking to a doctor or talking to a

Alison Lee:

lawyer, those conversations are protected. There is no such

Alison Lee:

thing for ChatGPT or for other tech companies, and there's no

Alison Lee:

data retention. There's no requirement for accountability

Alison Lee:

on behalf of these tech companies to do anything with

Alison Lee:

that data either.

Christina Lewellen:

Yeah, that's a scary part, because even if

Christina Lewellen:

they are saying that they're doing something, there really are

Christina Lewellen:

no repercussions if they're fibbing or if they change their

Christina Lewellen:

strategy later. And so that's why I think it's something that

Christina Lewellen:

we all really need to be aware of. It can get overwhelming in a

Christina Lewellen:

hurry. So as people listening to this, some of whom are educators

Christina Lewellen:

and former educators, you know, either in a technology role

Christina Lewellen:

where they're still in the classroom, or they're supporting

Christina Lewellen:

classrooms, others who may be in some way involved in independent

Christina Lewellen:

schools in particular. What can we do right now so that we don't

Christina Lewellen:

get overwhelmed and just do nothing? I think that some of

Christina Lewellen:

these proactive approaches that you've outlined in your research

Christina Lewellen:

are really awesome, and if somebody still says but I'm

Christina Lewellen:

overwhelmed, Allison, where do we begin? What are some ideas

Christina Lewellen:

for either low hanging fruit or just easy wins to just maybe

Christina Lewellen:

feel like a little more in control of the situation.

Alison Lee:

Yeah, I really love that question. We have three

Alison Lee:

recommendations that we make to educators. The first is, make

Alison Lee:

human connection a central part of your schooling. We have to

Alison Lee:

talk about it like this: if AI is a great party that kids want to

Alison Lee:

attend, we need to make a better party. We need to make human

Alison Lee:

connection the better alternative, or actually the

Alison Lee:

primary source of connection and care and the capacity for

Alison Lee:

vulnerability, and especially for educators, I think this is

Alison Lee:

crucial: the way that they build their classroom spaces and

Alison Lee:

their school spaces should really ensure that school is a place of

Alison Lee:

connection for young people. We interviewed 27 young people back

Alison Lee:

in December of last year, and one of the things that we had

Alison Lee:

them do was actually draw a line chart of connection over the

Alison Lee:

course of the day. So we said from morning to afternoon to

Alison Lee:

evening, tell us when you feel the most connected and when you

Alison Lee:

feel the least connected. And what they were showing us

Alison Lee:

consistently in those line charts was that they felt more

Alison Lee:

connected when they were alone in their rooms, but playing

Alison Lee:

Fortnite or Roblox with their friends online or studying with

Alison Lee:

their friends over FaceTime, they were feeling more connected

Alison Lee:

in those spaces, those digital spaces, more often than not,

Alison Lee:

than when they were in classrooms sitting next to their

Alison Lee:

peers. And so we have to really ask this question. And I think

Alison Lee:

this is what's really beautiful about independent schools, is

Alison Lee:

that we have the freedom to really reimagine what learning

Alison Lee:

within community looks like. We have to build a better party. So

Alison Lee:

that's number one: really doubling down on

Alison Lee:

schools as places of connection. Number two is really thinking

Alison Lee:

about, how do we bolster young people's critical awareness of

Alison Lee:

AI? Something we often say here at The Rithm Project is, we do

Alison Lee:

not tell kids what to do or what to think, but we tell them where

Alison Lee:

to look. And from our conversations with young people

Alison Lee:

when we engage deeply with them, they're quite aware of the

Alison Lee:

risks, and they're very concerned, and they actually

Alison Lee:

have quite discerning skepticism of these big tech companies.

Alison Lee:

They're like, I don't want them to sell my data. I know that

Alison Lee:

they're using me in all these different ways, and they're

Alison Lee:

hungry for those conversations. And so what we hear consistently

Alison Lee:

from them is listen more, judge us less, and help us navigate

Alison Lee:

this new world. And I think when we talk to adults about this,

Alison Lee:

oftentimes the thing that gets in the way of adults engaging

Alison Lee:

in this conversation is that they often feel like they have

Alison Lee:

to have all the answers, give them the perfect guidance, or

Alison Lee:

tell them exactly what to do or how to do it. But we're all

Alison Lee:

navigating these new waters all together. And so I think one of

Alison Lee:

the big pieces of advice I would give to educators is to stand

Alison Lee:

shoulder to shoulder and learn alongside the young people that

Alison Lee:

are in your life and start listening with real curiosity.

Alison Lee:

Ask them, you know, how are you engaging in this? And why are

Alison Lee:

you engaging? What's good about this, what's exciting about

Alison Lee:

this, and what are some of your worries? What are some of the

Alison Lee:

things that you're concerned about, and really create space

Alison Lee:

for young people to engage in this within community? Because

Alison Lee:

if they're hungry for this conversation, they're hungry for

Alison Lee:

advice, they're hungry for meaning-making, and they need to

Alison Lee:

have access to the conversations, the non-judgmental

Alison Lee:

conversations, to make meaning of that within

Alison Lee:

community. Because otherwise, right now, we're leaving

Alison Lee:

them to wheel spin on their own, or to figure it out alone. And I

Alison Lee:

think, to Bill and Hiram's conversation earlier, this raises

Alison Lee:

questions of equity and access. One of the questions we

Alison Lee:

asked young people in those interviews was, what are you

Alison Lee:

hearing from adults? And for the young people who say, hey, well,

Alison Lee:

my parents use AI themselves. And so we talk to them all the

Alison Lee:

time. They're telling us the cool ways that they're using AI,

Alison Lee:

and they're also telling us, here are the boundaries and the

Alison Lee:

ways you should not be using AI. But who gets access to a

Alison Lee:

parent that's going to be able to have those conversations is

Alison Lee:

inequitable. And so we really, as adults, need

Alison Lee:

to create those spaces, especially in education spaces,

Alison Lee:

to address the gaps in who gets access to supportive

Alison Lee:

conversations and who doesn't. The third and final one is

Alison Lee:

really cultivating the will and skill for human relationships. I

Alison Lee:

don't think we can take for granted anymore that human

Alison Lee:

connection is going to be really attractive to young people. You

Alison Lee:

think about being a 14- or 15-year-old today who perhaps

Alison Lee:

struggles to connect with other people at school. We can't take

Alison Lee:

for granted that they're going to really turn to human

Alison Lee:

relationships as their priority, especially when there's all

Alison Lee:

these attractive alternatives of talking to an AI friend that's

Alison Lee:

available 24/7, that's always affirming, that's not going to

Alison Lee:

judge them, that's not going to have social repercussions. And

Alison Lee:

so how do we ensure that young people are having, again, the

Alison Lee:

space to say, Why are human relationships still worth

Alison Lee:

having? Why are these human relationships still worth

Alison Lee:

protecting, despite the fact that they're messy and they're

Alison Lee:

complicated and they can be painful sometimes. How do we

Alison Lee:

make sure that we create both the will, the desire, and

Alison Lee:

the skills to strengthen human relationships?

Bill Stites:

What would you say is really a healthy way to bring

Bill Stites:

AI into the classroom across all the age levels where it is

Bill Stites:

allowed and appropriate?

Alison Lee:

Are you saying bring AI into the classroom as a

Alison Lee:

learning tool, or as like a facilitator for education, or

Alison Lee:

conversations about AI for human connection?

Bill Stites:

Where we're focused right now, a lot of this is as

Bill Stites:

a learning tool. When you mention it in terms of the human

Bill Stites:

connection piece, I might be off on this. I don't think we've had

Bill Stites:

a lot of conversations about that. I think I'm more focused

Bill Stites:

on the classroom application at this point, I would have to

Bill Stites:

really think about how and what that would look like, and maybe

Bill Stites:

you can provide some detail on that, what that would look like,

Bill Stites:

bringing that in from a classroom perspective. And I'll

Bill Stites:

go back to what Hiram said before about independent schools

Bill Stites:

so focused on the relational aspect of what goes on in the

Bill Stites:

classroom, that really hasn't risen to the level where I'm

Bill Stites:

personally thinking about it like that.

Alison Lee:

I'll answer that question in two parts. First, to

Alison Lee:

have this conversation about AI and human connection, I'll do a

Alison Lee:

little plug: The Rithm Project has actually developed what we

Alison Lee:

call Sparks Starters, which is a set of discussion guides that we

Alison Lee:

can use with young people, designed both for in the

Alison Lee:

classroom and out of the classroom that are explicitly

Alison Lee:

centered on this idea of human connection in AI. So they're

Alison Lee:

both designed to help bolster young people's critical

Alison Lee:

discernment and awareness of AI, what more do they want from the

Alison Lee:

AI? What are their personal boundaries? And also on this

Alison Lee:

idea of, why are human connections and human

Alison Lee:

relationships worth having, and despite all of its messiness,

Alison Lee:

how do you actually cultivate healthy human relationships? And

Alison Lee:

so I think in order to center that conversation more

Alison Lee:

intentionally: for a lot of your independent schools that already

Alison Lee:

center human relationships and are looking to add the AI layer to

Alison Lee:

that, I think this might be a powerful resource to stoke that

Alison Lee:

critical discernment and connect those relationships to this

Alison Lee:

particular moment, this AI moment, as for the classroom

Alison Lee:

application, again, I think there's something about really

Alison Lee:

having adults talk about their use cases alongside young

Alison Lee:

people, and really inviting curiosity around how young

Alison Lee:

people are engaging with this technology, in a way that doesn't

Alison Lee:

feel like it's going to be punitive or judgmental. So one

Alison Lee:

of the things, for example, we hear a lot about is young people

Alison Lee:

using AI to explore their futures. How do I build a

Alison Lee:

computer? Or you're my career coach? How can I get into a

Alison Lee:

school with a soccer scholarship? Or I want to become

Alison Lee:

XYZ? What's it going to take for me to get there? And so young

Alison Lee:

people are actually really leveraging this technology, also

Alison Lee:

in pursuit of their learning. It might not be the learning that

Alison Lee:

we're talking about in math and ELA, though they are certainly using it

Alison Lee:

in those ways. But how do we actually start from young

Alison Lee:

people's existing assets, and how they're already

Alison Lee:

navigating this technology in service of their own needs? The

Alison Lee:

other conversation that I would have is about the boundaries:

Alison Lee:

when might AI be good for our thinking and learning, and in

Alison Lee:

what ways might it be offloading or eroding our thinking and

Alison Lee:

learning? And I know that some of our friends over at Harvard,

Alison Lee:

the Center for Digital thriving, actually has a resource called

Alison Lee:

gradients, that I invite folks to take a look into that's a

Alison Lee:

really powerful resource where teachers and students can

Alison Lee:

actually negotiate, oh, using AI to revise an essay. Where would

Alison Lee:

you put it on the okay to not okay or helping or hurting your

Alison Lee:

learning? Those are the kinds of conversations I think that are

Alison Lee:

worth having to co negotiate with our young people. What is

Alison Lee:

this new code of ethics, or these new set of social norms

Alison Lee:

that need to be true for us in order to make sure that this

Alison Lee:

technology is not taking away our capacity to learn. And I

Alison Lee:

think young people need to be actively co constructing that

Alison Lee:

conversation, rather than being told what to do. Because I think

Alison Lee:

we need to trust that young people when given the space and

Alison Lee:

support to have those conversations. Conversations

Alison Lee:

that they'll make meaning in support of the things that they

Alison Lee:

care about the most. I love

Christina Lewellen:

that, and I think you're right, and I think

Christina Lewellen:

that schools, you know, we're wrestling with these issues, but

Christina Lewellen:

then I think we are slowly getting to that. It's

Christina Lewellen:

interesting, Allison, because I think that you might just be a

Christina Lewellen:

skip and a jump ahead of us in a lot of this conversation, but

Christina Lewellen:

you've given us so much to think about today, and I have a

Christina Lewellen:

feeling that we're going to be adding a lot of resources to the

Christina Lewellen:

show notes. Folks are going to want to know where to follow

Christina Lewellen:

you. Where can we keep up with you? Where are you sharing this

Christina Lewellen:

thought leadership? If folks are intrigued by this whole concept

Christina Lewellen:

and are trying to keep pace with you. Where do we follow you?

Alison Lee:

Yeah, thank you for asking that question, Christina. There are a couple of places. First of all, please follow us on LinkedIn; it's where we regularly post our hot takes on emergent ideas. We also have a Substack where we publish some of our deeper dives. Our Substack is a way for us to wonder out loud with our community: hey, we noticed this thing, we're going to unpack it. Or, we had a conversation with one of our youth fellows, and they dove deep into this idea of AI dating, unpacking their ideas and thoughts. So our Substack is a great place to follow our emergent research and thought leadership, and it also might be a resource that stokes some connections to the classroom. I'll give you an example. One of our earliest pieces was, would you still cry if AI wrote the card? It was an entire piece about how AI is not just in our classrooms but in our lives, automating some of the things that make us essentially human. So if you use AI to write your thank-you note or a heartfelt letter, does it still count? Does it still matter to us as human beings? Does it still hit the same if the recipient never knew? And if the recipient does know, does that change the sincerity? What we heard from one of the ELA teachers we were talking closely with was that this was the perfect anchor for a conversation about originality, about voice, about our capacity to express ourselves in our own voice; a perfect anchor into an ELA and AI conversation. And so I think the Substack might be a good place for us to share some of our emergent thinking and research.

Christina Lewellen:

I love that. Thank you so much. I'm glad you have that, because I think there are going to be a lot of really interested parties wanting to follow along, even if they are a step or two behind. This has been such an incredible conversation, Alison. Thank you so much for your time, thank you for joining us, and thank you for giving us such incredible, really positive things to think about when it comes to AI and our youth, because I think we're getting a little overwhelmed as a whole. The adult community might be getting a little bit into hand-wringing and being nervous. But your recommendations to involve the kids, and to go at this looking at the positive angles, have been really enlightening. So I'm so grateful for your time.

Alison Lee:

Yeah, thank you so much for having me here. Above all, it's about creating the space for young people to make meaning in this world. We've got to trust that, given the space and the resources, they will find their way right alongside us. Somebody once said to me that we need both imagination and wisdom to dream into a better future. If we think about it that way, young people are the imagination and the adults are the wisdom, so we need both to make meaning in this new world. Create that space, so we can really ensure that young people are equipped to decide what kind of future they want for themselves.

Christina Lewellen:

There has never been a more independent school thought shared on this podcast; that is so indie-school incredible. Thank you so much, Alison. We will definitely be having you back.

Alison Lee:

Thank you all.

Narrator:

This has been Talking Technology with ATLIS, produced by the Association of Technology Leaders in Independent Schools. For more information about ATLIS and ATLIS membership, please visit theatlas.org. If you enjoyed this discussion, please subscribe, leave a review, and share this podcast with your colleagues in the independent school community. Thank you for listening.
