Hello and welcome to Data Driven, the podcast where we explore the emerging fields of data science and artificial intelligence.
In this second episode of the fifth season, Frank and Andy speak to Chris Wexler about using AI to protect the vulnerable.
Speaking of which, I would like to advise you, dear listener, that this show touches on some sensitive areas, namely child sexual abuse materials.
If you have little ears or sensitive persons within listening range, you may want to pause or skip this episode.
Don't say we didn't warn you.
Now on with the show.
Hello and welcome to Data Driven, the podcast where we explore the emerging fields of data science, machine learning, and artificial intelligence.
If you like to think of data as the new oil, then you can think of us as, well, Car Talk, because we focus on where the rubber meets the road.
And with me on this epic virtual road trip down the information superhighway, as always, is Andy Leonard. How's it going, Andy?
Good Frank, how are you brother?
I'm doing alright, I'm doing alright. We've had a chaotic week at Chateau La Vigne. We ended up going to Baltimore in the middle of the night on Wednesday.
Wow. What was in Baltimore?
A really good pizza, but mostly we went because there was a bad situation where a pit bull was about to go to a shelter, and we do a lot of fostering and rescuing of dogs.
So we just got her out, and we've spent the rest of the week, all of our free time, trying to find her a new home. She landed in the new home on Saturday and she's doing great.
That's awesome, and it's really awesome y'all do that kind of stuff.
Yeah, I always wanted to do it, but it's only been in the last, you know, maybe five to ten years that I've been able to, so we've been doing that.
Cool. The risk of fostering is primarily foster failing. That's how we got our current dog count up to five.
Meanwhile, my wife and I counted something like 12 dogs who have come through our house over the last two, three years.
So it's a good thing to do. We have the space to do it.
You know, at the time we didn't know anything about this one, so we had to kind of keep her isolated.
So we had this airlock system. She's a super sweetheart with people, but she's kind of iffy around other dogs, and she's super strong, so once she has her mind set on something, it takes a lot of effort to corral her.
But she's super happy. She's the only dog in her new home and she has them wrapped around her little paw already.
How are things going with you?
That's funny. Things are good, you know, a pretty quiet weekend here. We have warmer weather. We're recording this on the Ides of March.
Well, debatably the Ides of March. Depending on who you talk to, it's probably the 13th, but I don't know.
Well, you are smart.
But we're on the 15th of March 2021, and it's starting to warm up. Our greenhouse is being put to use.
We have some seedlings in there, and that's always fun. And we've got some raised beds out to the side of the house.
We're starting to see different things come up there. They're kind of colder-weather crops, so we started sowing a couple, three weeks ago.
And it's been nice. I love getting outside and working, especially this time of year. The bugs haven't shown up yet.
The pollen is really low. It's there, but it hasn't affected me yet.
Oh good good yeah.
But that time is coming, so let's enjoy it while we can.
Oh, I totally agree. There's like two weeks a year where the weather in DC is wonderful, and this is one of those two weeks.
That's it, that's it.
Uh, so today we have an awesome guest, and this is, you know, we always talk about where the rubber meets the road in terms of how data becomes AI and how AI can help businesses. I think this time we have an interesting guest because now we're not just talking about AI helping business.
We're helping society.
And, you know, I'll make sure Bailey has a kind of intro speech, so that if you have little ears in the car.
You may want to listen to this later or listen to this on a headset, because we're going to be talking about human trafficking and all the sorts of horrible things that happen to kids.
But he's doing some great work in terms of leveraging the power of AI.
To help remove child sexual abuse materials that are online, as well as, you know, fight human trafficking and all the bad things that happen with technology. We like to focus on all the wonderful things, but there's clearly a large underbelly.
To the Internet. And I'm a big believer in transparency, because, well, I grew up in New York City.
Cockroaches are inevitable no matter what you do. One thing is, when the lights come on, they all scatter. So I think bad things tend to happen in the shadows.
The more light you turn on, I think the better it is for society as a whole. So with that, I'd like to introduce.
Chris Wexler, who is the CEO at Krunam. We covered this in the green room.
Kru-nam. It's OK.
Krunam. There we go.
I need to drink more coffee in the morning. But Krunam is in the business of removing, and I like this term that he uses, digital toxic waste from the Internet, and using AI to identify, and I'd never heard this acronym before, CSAM: child sexual abuse materials.
And other awful content, to help content moderation. His technology is already in use by law enforcement and is now moving into.
The private sector. And there's a whole bunch of stuff we could talk about, but what's particularly interesting is that it's a for-profit startup, a social benefit corporation, so we can talk about that.
But I'd like to welcome Chris to the show, and thank him for putting up with some of.
The scheduling growing pains that we're having.
Yeah, no, it's really great to meet you guys.
And I understand having five dogs. Hearing the intro, I definitely understand. I like to refer to my house as the event horizon: if an animal comes in, it never gets out. So I understand.
Ha ha ha ha.
Yeah, our track record is 50/50.
I tell you, if that dog were better with other dogs, she would probably be a current resident.
How did you get started in this? And your name, how did the name of the company come about? 'Cause I think that's an interesting story right there.
Yeah, well, Krunam is named in honor of a human trafficking and child trafficking warrior in Thailand, Kru Nam. Her name is two words.
Kru Nam was a street artist in Chiang Mai, actually doing very well, very well renowned. She did a project with the street kids there.
And said, hey, just paint your life. And she could not believe what they painted. It was eye-opening.
And when she realized that a lot of the karaoke bars in Chiang Mai were fronts for child sexual trafficking, she was compelled to do something. And unlike, I think, 99.9% of the population, including myself.
She just marched into.
The karaoke bars and pulled kids out. And she had done this 20 times, and had twenty kids in her little apartment, when the traffickers came and said, if you do that again, we're going to kill you. At which point she went north, found a way to keep doing it, and has constantly been evolving her tactics.
And with what she's done for the last 20 years, she's now saved thousands of kids. One of the first kids she rescued was the first stateless child in Thailand to graduate from university.
She's just been such an inspiration to us and you know, I think if you go from top to bottom.
In our organization at Krunam, we've.
All been confronted with what's going on in the world and been compelled to change what we're doing.
To try to help others in the space of human trafficking. And so it just made sense to all of us to name the company in her honor.
Well, we talked a little in the green room about some of the other organizations. You mentioned your brother had started a similar organization.
Yeah, he and David Batstone started Not For Sale back in, I think, 2006 or 2007.
And it was actually started because Kru Nam reached out to them and said, I've got 40 kids in a field and our lean-to burned down.
And I hear you might be able to help. So my brother strapped $10,000 to his body to go up to the field so she could rebuild a space for them.
So she even started that organization, but they have since been bringing innovation to the field of fighting human trafficking.
Left and right.
And so, it's interesting.
Krunam is a joint venture with a company out of London called Vigil AI, which has largely been in the defense and public safety space, really doing proof-of-concept.
Projects. Like, stuff that, you know, the geek in me just gets so excited about when I hear what they do.
Vigil AI was one side of it. The other side was Just Business, which was the venture group that Not For Sale, the nonprofit, started. Because what they realized was that.
The dichotomy of for-profit and nonprofit really didn't work when you're trying to solve really big problems. It's great for direct service, but when you're trying to solve a really big problem.
Any bit of money that you get comes with a lot of strings attached, either governmental (a lot of nonprofits are really, you know, pseudo-governmental projects) or from a large foundation that's donating money. So you're constantly changing who you are to keep your funding.
And what they realized was, well, Dave had a background in venture capital, and so they went and started companies. What they like to say is they were a cause in search of a company.
And the first one they started was REBBL drinks, which, if you're ever in Whole Foods, is one of the most popular drinks at Whole Foods and at many other retailers. It's one of the fastest-growing natural drink companies in the history of the US.
They're the sole financial partner of Velocity, one of the big innovators in the corporate relocation space. And if you're ever in, say, Amsterdam (or they just opened up in The Hague).
There's Dignita, which is a brunch place that started in Amsterdam that was all about giving women who got out of trafficking in the red light district training in the hospitality business. And now.
People who go eat there don't even realize it unless they read the back of the menu, because it's one of the top-rated brunch places in all of Amsterdam.
And so, you know, we like to say we can't do good until we do well. So we're building world-class companies.
All built with social justice built into them, at the scale of capitalism, because it's a powerful tool. And that's kind of why we went into.
AI. We decided that was so important for us, because AI is an amazing, critical tool for the future.
And, you know, particularly in the age of COVID, with all of us behind computer screens and not traveling.
The tactics of abuse changed, and a lot more was happening online. A huge spike in CSAM. And the reason we say CSAM and not child *********** is that child *********** implies consent, and there is none in that situation. It's child sexual abuse. So that's why we say CSAM.
With COVID, what we're seeing is a shift to people paying for shows online, and then they record it, and then they share the images.
And so this is a critical new front. Not even new, but a critical, growing front in fighting human trafficking.
And so AI is the best tool to do that. And my background is, I was on the marketing technology side of.
Things. I was with some of the largest ad agencies in the world over the last 20 years.
And really was on the other side of it. I was, you know, one of the first customers of Facebook and one of the first customers of Google, constantly evolving my marketing tactics to, you know, sell one more garbage bag or one more motorcycle to a middle-aged man. And, you know, I established.
Data analytics practices to learn how to do that better. Eventually that evolved into, you know, AI projects, and what I realized was, I mean, that was a good career, but I could take those skills.
And really make an impact. And so that's why I came on board to lead this new joint venture. It's an exciting time for me because.
I feel like a lot of my history has been able to kind of come together here, and I have the skills that can really help make a difference. And so that's why we're doing Krunam.
Wow. I mean, there's so much to unpack in there in terms of the AI and the social good.
The detection of this. But the first thing that comes to mind is.
How do you train an AI for this? And do people have to.
How do you get people... 'cause I know this has come up with Facebook, at least in the news.
I'm sure the real story is a bit more nuanced, and probably even worse: people have to go to counseling because they look at all this horrible material.
And I mean, is it kind of the same thing here? Is this stuff labeled, and.
Like, how do you train an AI?
Yeah, I'm glad you asked that question, 'cause that's exactly what got me excited about the work Vigil AI had done. We're actually bringing to market something that's been in.
Development since 2016, but, you know, it's a perfect example of public-private partnerships working together. And so.
Well, first I'll tell the story of how it came to be, and then I'll definitely address how you do it, because that is exactly the problem we're solving.
Back in, I think, 2015, one of our co-founders, Ben Gancz, who was a child sexual assault investigator with law enforcement in the UK.
You know, you always have those people in every organization who say, there has to be a better way to do this.
There has to be technology that makes this better. And he was spending 70 to 80% of his time.
Going through confiscated materials, to classify and understand what it was, before they could even start investigating.
And he's like, AI has to be better at this than I would ever be. You know, you get exhausted, you get tired. It's brutal. I mean, it's just emotionally and psychologically draining work.
And he saw Scott Page, our CTO and co-founder, speak, and he said, hey, let's figure this out.
And a year later, probably, they formed a company. And here's where the public part of it comes in. Back in 2013.
The UK Home Office.
Which is, for those who are not familiar, their version of kind of Homeland Security and the FBI together.
Started building a database for law enforcement called CAID, which is the Child Abuse Image Database.
Because they recognized, well, if.
We're constantly finding the same things, 'cause obviously things get copied on the Internet.
And in digital spaces, why don't we build a database, and that'll speed things up. So they had that, and it's a great tool.
What Scott and Ben did, on a pro bono basis, was go in and say, hey, let's see if we could use the latest in computer vision and AI.
To determine the classifications of this material.
And speed things along. And, you know, that's where Scott's background in computer vision comes in. He was doing work 15 years ago on trying to brighten dark images, something that every one of our iPhones now does just on its own. He was building those kinds of algorithms.
15 years ago. He's an expert in computer vision and machine learning and deep learning.
And so they put it in there, not knowing if anything was going to work. And they literally had to bring their rig into a Faraday cage, because this material is not connected to the Internet in any way.
And then they were back to back with.
The law enforcement investigators, who are the only ones allowed to actually view this material (it's illegal material). And so it was a weird situation where they're like.
OK, here's what I see. And they're coding and working it out.
That way. And when they ran the first test, they got a fairly high success rate.
It was a eureka moment that it could be done. 'Cause, you know, AI is really good at going, that's a cat, that's a dog.
Or, you know, that's a tree, and that's a chair. It's less good at... I mean, for all the talk of facial recognition, we all know about the problematic nature of mistakes in facial recognition, and there have been millions and millions of examples that the algorithm has seen.
And here we were asking the algorithm to determine implied behavior by body position and context.
And frankly, we didn't know if it was going to be possible. So when it hit a fairly high percentage success rate off the bat, they knew they could fine-tune it to the point where it would be usable.
And they did all that work pro bono, because they just wanted to make the world a better place. It eventually became a paid project with the Home Office, who have continued to be a really strong partner of ours, helping us not only with access.
To the CAID database, which is the largest of its kind in the world. I mean, that's one of the problems for a Google or a Facebook: if they find this material on their platforms, they're not allowed to.
Keep it. And if there's a third party like us, you're not allowed to send that material to another company to train their AI. So the fact that this horrific data is being collected and protected.
By the Home Office, and classified by trained investigators.
Every day. You know, right now we only train the algorithm on data whose classification has been verified by three votes. And so it's really a gold standard.
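As a rough illustration of the three-vote verification rule described here, a labeling pipeline might keep only unanimously classified samples for training. This is a hypothetical sketch; the function names, image IDs, and category labels are illustrative, not Krunam's actual schema:

```python
# Hypothetical sketch: keep a training example only when all three
# investigators assigned the same classification.

def unanimous_label(votes):
    """Return the agreed label if all votes match, else None (discard)."""
    return votes[0] if len(set(votes)) == 1 else None

# Illustrative annotations: image ID -> three investigators' votes.
annotations = {
    "img_001": ["category_a", "category_a", "category_a"],  # unanimous: kept
    "img_002": ["category_a", "category_b", "category_a"],  # disagreement: dropped
}

training_set = {
    image_id: label
    for image_id, votes in annotations.items()
    if (label := unanimous_label(votes)) is not None
}
print(training_set)  # {'img_001': 'category_a'}
```

Discarding anything short of unanimous agreement trades dataset size for the label cleanliness the speaker emphasizes.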
Level of training, and of data cleanliness, and frankly data privacy. You know, that's a big issue here.
The last thing we want to do is extend the re-victimization and misery of these kids who have already been victimized and then put into photos. And so it was a rare situation where the data was really, really clean.
And because it's so much work to train on it, there's a limited number of people who can even do this.
And so, you know, that's where AI and computer vision are really powerful.
'Cause, you know, you mentioned how brutal this is for content moderators. We really view ourselves as a digital.
Protection company. Not only are we obviously protecting kids long term and breaking the cycle of violence.
But we're protecting the content moderators, because that is awful work. And, you know, studies have shown that after 15 to 20 minutes of doing work like this, your performance degrades horribly, because it's emotionally exhausting. Computers.
Don't get emotionally exhausted, right? They're really good at this.
I think the biggest aha for me is that AI is not a good or a bad technology. AI can be used for good and bad. It's null.
It's a null technology, right? And if we can use it to do the right thing, that's great, and that's what we're doing here, you know?
As much as I liked, you know, selling motorcycles to middle-aged white guys, I think this is making a little better impact on the world.
Um, what's fascinating here is, you know.
There are a lot of problems with facial recognition, but I think the problems are twofold. One, the input data is not good enough. Two, there's too much faith.
Poured into it.
What about the presence of, say, a false positive in the AI models you build out?
Obviously, we would never recommend any of our partners rely solely on our algorithm.
The way we would recommend using our classifier, there are a couple of things. One is we want you to organize the data for the content moderators. So, let's say maybe you're comfortable.
Saying, if it's a 99% confidence... 'cause we feed back a confidence level on any image, and on video as well. Which, actually, I'm underplaying how important that is, because the current standard technology is a great bit of technology that Microsoft built back in.
2008 called PhotoDNA, which is a photo hashing (a perceptual hashing) technology, essentially fingerprinting known images.
And it's done a lot of great work. But the problem is, this material is getting recreated every day, or a logo gets added, or there's cropping, or a filter is applied, and that might change the hash. So it's a great technology.
But it's not a complete technology.
Our algorithm actually finds previously unknown images, because it's looking for patterns, it's looking for elements like that. And so then the worry is false positives, right? And frankly, false negatives as well.
So when you're in a situation where you're not 100% confident on anything, our output is a confidence level. Much like a human being, there are things.
Just because there's variation in the human body, a 25-year-old might look 16 and a 16-year-old might look 25, and so on.
And because it's been trained by humans, it has the same troubles with those kinds of images. And so it might kick out a 70% confidence level for an image like that.
Going, we think it is. And then in a situation like that, we definitely want a human being involved. I mean, any AI needs to have human checks. And so with our classifier.
First, we recommend.
Organizing the data in a way so a human content moderator isn't mode-switching all the time. You might group a lot of alike images together, so you don't have the mental.
Exhaustion of constantly going, is it this or that?
Is it this? Is it this? Is it this? It helps the human perform. We also highly recommend having.
Essentially a preparation warning. So if an image is particularly heinous (and they're all pretty bad, but if it's particularly heinous), you're not shocked by it.
We recommend having an alert come up saying, prepare yourself for this.
And that, again, helps with the psychological preparation for that moderator.
And then you want to build the right governance, and so we do governance work too. We do consulting in the governance space, because we have in the company not only great technologists but also experts in AI governance and ethics, and experts in human trafficking. So we can actually help.
Kind of all along the way here.
Every company is going to set a slightly different threshold. Like, take Facebook: they've already banned this content because they don't allow nudity. So then the question is.
Is it CSAM or not? And, you know, are they going to report it or not? And so, you know.
A false positive clogs the law enforcement pipeline, and then law enforcement is wading through a bunch of stuff that isn't CSAM, so there's definitely a negative there.
But let's say you might set a threshold: anything over a 98% confidence, or a 99% confidence, we're going to just automatically quarantine and move out.
Anything between 70 and 95 will have a single check. Anything from 50 to 75, we might have a double check, with multiple people looking at it. It allows you to optimize your workflow in a way that is less traumatic.
For your moderators.
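A tiered workflow like the one sketched in this exchange might look as follows in code. The thresholds are the illustrative numbers from the conversation, not Krunam's actual configuration, and the routing labels are hypothetical:

```python
# Sketch of a tiered review workflow driven by classifier confidence.
# Threshold values are illustrative, per-company choices.

def triage(confidence):
    """Route an item based on classifier confidence in [0.0, 1.0]."""
    if confidence >= 0.98:
        return "auto-quarantine"          # high confidence: remove immediately
    if confidence >= 0.70:
        return "single human review"      # one moderator confirms
    if confidence >= 0.50:
        return "double human review"      # two moderators must agree
    return "no action"                    # treated as negative

for score in (0.99, 0.85, 0.60, 0.20):
    print(score, "->", triage(score))
```

The point of the tiers is exactly what the speaker describes: human attention is spent only where the model is uncertain, which reduces both moderator trauma and false reports into the law enforcement pipeline.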
And frankly, once you see the output of the algorithm.
For any organization, that's where you start making determinations like, OK, we're confident it can do this, but we're not confident it can do that, and you start adjusting your workflow accordingly. And that's how AI is best: making those decisions at scale.
That's where, like I said, we're not only protecting kids, we're protecting content moderators, and we're protecting these companies and their bottom line. Nobody wants to be on the front page of the Washington Post or the Wall Street Journal because there's CSAM on your platform.
And so that's a really critical part of who we are. We're a protection company.
Interesting. I like the reframing of that, that you're a protection company. I think part of it is the normalization of AI in business.
Like, they're not an AI company per se. You're a protection company, which I think just happens to use AI.
Yeah, I think that was a determination we made early as we pulled together this joint venture.
You know, if you build houses, you're not a hammer company.
Because, as the technology evolves, we're going to be evolving our technology use. AI and machine learning and deep learning are all absolutely vital tools for us right now.
But as we move into looking at grooming and other things, natural language processing is going to be something we'll probably get into.
And when you say grooming, maybe explain that for the benefit of folks who don't know the term. I think I know what you mean.
Oh, sure. Well, you know, one part of the cycle of trafficking.
Children is either getting them OK with sending you images, or trying to convince them to run away. So then, you know, go, hey, run away and come to the bus stop and I'll come pick you up, and then they're trafficked.
And those are known patterns. There are actually quite a few organizations and companies working on that: on text detection, on understanding how to head that off before kids are even trafficked. Which is the ideal. You want to head it off before they're hurt.
And so, you know, as we talk about.
Us being the toxic waste management of the Internet, that's part of what we do.
You know, I've been involved in the Internet pretty much since the beginning. I was on Wall Street early on, scratching my head during the first Internet bubble, going, I don't understand how these companies make money.
I literally was sitting at my desk, and the CEO of, I think it was furniture.com, was sitting across from me.
He said, hey, we're only losing 5% per transaction.
And I go, what are you going to do about that? And he said, I'm making it up in volume. And he was out of business soon after.
Ha ha ha.
But it never did make sense to me. I then got into the digital marketing space, and I was there at the very beginning of that really booming.
But, you know, if you look at how technology gets adopted by society, with any big communication platform, it takes about 30 years for society to figure it out. The first 10 years is kind of promotion and early adoption.
And if you look at that, that was probably a little quicker with the Internet, but it was kind of.
Like that. And then you have 10 years of growth and adoption. I think maybe you could call those the social media years of the Internet. And then you have 10 years of reckoning, of really understanding, oh, that actually did.
This to society. And we're in those ten years right now. We're seeing what it's done to our body politic. We've seen the nasty side effects it's having.
In the human trafficking space, in cyberbullying. Like, we're seeing the warts of a system that has by and large been a great societal positive. And so it's not surprising that governments are talking about regulation, and.
That companies are finally getting really serious about monitoring what's going on on their platforms.
Because, you know, unfortunately (or fortunately).
Technology is moving a lot faster than our human brains and joint society can handle. And so, you know, that's why I'm compelled. I was part of the early promotion.
I was part of the acceleration. I thought, oh boy, this is all great. And, you know, some of the algorithms.
That kind of drove the insanity, like the Cambridge Analytica stuff. I looked at (not literally Cambridge Analytica, but similar technologies) when I was doing marketing, going, well, we could probably do that. I didn't do it, but I was part of the problem for a long time, and now I think it's important to be part of the solution.
Chris, these people engaging in these activities, in child trafficking, human trafficking... there's a lot of money in this, and they're bad actors.
And, you know, Lord knows what motivates people to do this sort of thing.
But are you concerned at all about your own safety?
Sure. I mean, that is always a worry.
But in my mind, I'd rather be in danger than the kids.
That's a concern. It's a concern when we talk about Kru Nam. In fact, frankly, one of the reasons the woman Kru Nam is happy we named the company after her: when we asked her if we could do it, she said.
Well, the more famous I am, the harder it is for the police to give me trouble.
So, you know, she lives in real danger.
That's crazy that she's worried about the police.
Yeah, well, I mean, there are bad actors through the entire chain. And so for her, fame is a protection.
I couldn't live with myself if I didn't get out of the foxhole and try to do something more.
So for me, I was just compelled. I'd rather take the flak than the kids. So if I can draw their fire, that's good for me.
So another question is.
These are probably well-funded bad actors.
What sorts of countermeasures... I mean, obviously, PhotoDNA is at this point a 13-year-old technology.
I would imagine that this is going to evolve into, for lack of a better term, an arms race.
Yeah, I mean, that's, I think, where the brilliance of Scott Page's and Ben's approach to building the technology comes in.
It's reading what humans read. And if they have to obscure the images so much that it doesn't look like CSAM, I think we've won.
And so really, all we're looking for is something that looks like it. It's using computer vision.
In a way that it's perceiving the image like a human would perceive it. Obviously, computers look at it very differently. You know, the last thing we humans do is start with edge detection and then look at shading; that's not how we process images. But literally it's looking at the image and going, what would someone see when they look at this?
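For readers curious what the edge-detection step mentioned here means concretely, a toy Sobel-style horizontal gradient looks like this. It is only meant to show the kind of low-level operation computers start with (and humans don't); real classifier pipelines are far more sophisticated:

```python
# Toy Sobel-style horizontal-gradient sketch on a tiny grayscale grid.
# The response is large where brightness changes sharply: an edge.

def sobel_x(img):
    """Horizontal-gradient magnitude for interior pixels of a 2-D grid."""
    kernel = [(-1, 0, 1), (-2, 0, 2), (-1, 0, 1)]
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = abs(sum(
                kernel[j][i] * img[y + j - 1][x + i - 1]
                for j in range(3) for i in range(3)
            ))
    return out

# A vertical edge: dark left half, bright right half.
image = [[0, 0, 255, 255]] * 4
edges = sobel_x(image)
print(edges[1])  # [0, 1020, 1020, 0]: strong response where brightness jumps
```

Edges and shading are just the first features extracted; the point above is that the higher layers of the model end up judging the image roughly the way a person would.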
On many levels, you can't obscure that. In fact, we already added a classification, because the algorithm kept finding.
Cartoon, manga, and ****** versions of CSAM.
Because it looked like it. It went, oh, this is the same, only it's a cartoon. And so, while we haven't done the testing on deepfakes, or even things that are made from whole cloth.
Because that's coming too: generated CSAM. And, I guess.
It's like vegan meat. I guess it's "ethically created" CSAM, because nobody was damaged, but it's still damaging.
Content. Yeah, the computer is going to see the same thing, because it has to look right to humans, and as a result, that's what the computer is going to look at.
I have to assume that there will be, you know, some kind of countermeasure they come up with, and then we'll just have to adjust. But because
we're building it on an end user's perspective of how someone looks at it,
it's relatively future-proof versus a kind of transitional technology like PhotoDNA, which turns the image into an anonymous hash that's a little easier to defeat. Now, obviously there have been ways to make that stronger, but
just the nature of this should be a little future-protected. Knock on wood.
Interesting. I like the fact that you've already thought about that, 'cause it's not like you're running around saying, we solved it, we solved it, only to run into what is, in my opinion, one of the most powerful forces in the universe: the law of unintended consequences.
Yep, yep. Well, I think the one thing we know is we're not perfect, and, you know, anybody who's done technology, and particularly innovation in technology, you go, OK, I've got two years. How do I keep evolving? How do I keep making it better?
I had that terror for 20 years in marketing: you'd build something and go, OK, I've got two years to build the new one.
And so that's very much our mindset, and, you know, we're lucky we have a strong technology team that's constantly looking at edge cases of how to do things. I would take Scott Page and team going to war with anybody, and, you know, so there you go.
Yeah, a great answer.
All right, so this is the point in the show where we get to the prefab questions we have. So, you mentioned your early days in the dot-com era, and I'm sure we can swap some stories too, 'cause I was at a couple of startups at the time. Uh, how did you find your way into data and AI? Did data find you, or did you find your way into data?
I found my way into data.
And it's funny, because I chose a college based on the lack of a math requirement.
I always had the skills, but I didn't really like doing it.
Even in college, I realized the power of data. And this is way back in the day, though not so far back that we didn't have computers.
Just to date myself, I got my first PC while I was in college, so it was the age of moving off of mainframes into having a computer sitting on your desk.
And I was getting a degree in political science at American University in DC.
And I took a campaign management class.
And it was kind of crazy. They would do this thing over J-term: it was two weeks long, a full three credits, and you'd have class from 8:00 AM to 6:00 PM, and then at the end of it you had to present a 200-page group paper, so you did that in the off hours. They really wanted to simulate the last few days
of the campaign and how brutal that is.
Well, I got told that I was working on the 2nd District of Utah and I had to figure out the voter turnout approach.
Well, I had no idea what to do. I was, you know, 21 years old; I didn't know. So I went down to the Federal Election Commission,
Got the precinct level data and built a spreadsheet.
And back then it probably wasn't even Excel; it was probably Lotus 1-2-3, I don't remember. But I loaded data for every precinct for the last four elections.
So much so that, you know, the processing power was so bad, I had to hit F9 and then wait 20 minutes for it to recalculate.
But it was a way for me to... I was like, this is the only way I'm going to figure out which precincts to target. And so technology was always a tool;
I needed a way to get to it. And so whether it was that, or then, you know, building models on Wall Street,
and when I say models, nothing like today's insanity, very light models by today's standards that I was doing on Wall Street.
And then once you have that basis, and that lack of fear of how the technology works, you're just looking for good data to make better decisions,
and for clean data so you can make a smarter decision. And so that just became kind of my superpower in my career.
And so it just kept going and going.
For the kids listening: F9 is how you used to update spreadsheets. Now it happens so fast it's automatic.
Yeah. And I remember when I'd accidentally hit F9 and go, well, there goes an hour.
Yes. What would you say is your favorite part of your current gig?
It's that I'm helping.
I'm helping people that can't help themselves right now, and I'm helping detoxify the Internet for people who are going to run into this stuff.
There's a real joy in knowing that the output of what I do is going to benefit a lot of people.
So we have a number of complete-this-sentence questions. When I'm not working, I enjoy blank.
Baseball. I'm a huge baseball fan, a fan of my Minnesota Twins, which means we just hit 6,000 days of losing playoff baseball. So I'm obviously a glutton for punishment.
Ha ha ha ha ha.
Yeah, that can be rough.
It certainly can be. I'm, uh...
we've lost literally 17 times in a row to the Yankees, and I hated the Yankees even before that.
I was taught as a young child not to hate, but I think hating the Yankees is a virtue.
I'm a Yankees fan, but I'll let it go.
I was gonna say... yeah.
And I was liking you up until now. Oh wow.
They hate us 'cause they ain't us. That's what Yankee fans say.
Oh my goodness.
Ha ha ha.
So, uh, Braves and Nationals here.
Yeah, yeah. I was in DC; I went to AU and lived in DC pre-Nationals, so the Orioles would always play a series at RFK Stadium right before the season.
And I'd always go down, and for some reason, I don't know why, it was cold and the hot dogs were cold, but it didn't
matter. I still went.
The fourth question, our second fill-in-the-blank: I think the coolest thing in technology today is blank.
There are so many things. I think the coolest thing is the application of AI to travel and driving.
I'm looking forward to the day when we are not trusting another human being on the road not to crash into us.
That segues nicely into the next question. I look forward to the day when technology can blank.
I'm exposing how nerdy I am: baseball, and also Star Trek.
I'm looking forward to the day when they can teleport me to a warm climate in the middle of January and it doesn't take 12 hours.
All right, so you mentioned baseball.
If we can move a quark six inches, I figure we can move me to Tahiti.
Oh, I like that idea. I like that idea.
Uhm, since you mentioned baseball and Star Trek.
What about the Niners?
This is the San Francisco 49ers.
No, this is an obscure Star Trek baseball nerd-out reference.
Oh the Niners. Oh yeah.
How did I not think of them? You would think that would be at the forefront of my brain, like the Reese's Peanut Butter Cup of my pop culture experience.
Right, right, right, right.
But I do remember watching that. It's much like when you hear National Public Radio talk about sports,
right, where they,
you know, say the local nine played a game of baseball today. It's just like, oh boy, these people don't really know what they're doing.
So for the 90% of the audience who probably didn't get the reference: we're referring to Deep Space Nine.
Deep Space Nine was a spinoff from kind of the traditional Star Trek franchise, and I think it's still the high watermark for Star Trek.
If you watch my live streams carefully, there's a model of Deep Space Nine behind me.
And Captain Sisko, who, in my mind... for me the question was never Kirk or Picard; it was Sisko versus anyone else.
But he was a big baseball fan, and that features prominently. I've been rewatching Deep Space Nine,
so it's fresh in my head.
He's a big baseball fan in a time when baseball has really waned, and in one of the episodes they were playing a game, and the team they set up was called the Niners.
And it was against Captain Sisko's rival from the Academy, or something like that.
Oh my gosh, I had forgotten about that completely. We loved that.
Yeah, that was a great episode, 'cause most of those episodes raised pretty heavy, kind of existential questions.
But every once in a while they did kind of a lighthearted show, and that was one of them.
That and the one where they hang out in Vegas, which is bizarre.
Well, we could talk Star Trek for hours.
I could tell I could just tell.
Our next question is: share something different about yourself, Chris. But we remind everyone, not just you, that it's a family podcast and we want to keep that clean rating.
Boy, everything. I have a view that everybody is different, and so I'm sure that, you know...
it actually reminds me of a story from when my sister was going off to college. My dad hated bacon.
I know, there are people out there that don't like bacon. And so we always had tuna fish with pancakes.
Come on wow.
Because my mom was like, we need protein. And I remember the look on my sister's face when she said, wait, this is weird. What else do we do that's weird?
Uh, I think the biggest thing that's different about me:
I have deep experience in kind of every phase of the American demographic. I grew up in kind of an established urban neighborhood but went to a very poor urban
high school, all while attending an affluent suburban megachurch,
and all while having farming grandparents.
It's been a blessing in my life that
I accidentally didn't have a bubble. It taught me a superpower of empathy that allows me to look at, like, the current political state right now
and go: people, we just need to meet each other, 'cause we're a lot more similar than you think.
And I think I can thank my parents for that unique background. They literally decided
to support Minneapolis, and they live not too far from where George Floyd was murdered.
They decided to support the community. And so I think that's probably the biggest difference about me that you would never know unless we talked about it.
And the next question.
Where can people learn more about you and what you're working on?
Our website, which is still in a nascent state right now, but we'll be rolling out a new one soon, is at krunam.co. You can sign up there. LinkedIn is another good spot for us:
if you follow us on LinkedIn, just keep watching, 'cause we're ramping up our content work.
Cool, cool. I sent you a LinkedIn invite this morning.
Yeah, it's very important work. I'm going to try and connect with you as well.
We are sponsored by Audible, and you can get a free audiobook on us if you go to thedatadrivenbook.com.
And then if you sign up and, you know, subscribe to Audible... I think I've got like the maximum platinum-coated gold subscription, because I have myself and two teenagers in the house who love audiobooks. So because of that, we ask all of our guests:
if you have a favorite audiobook, or, if you don't listen to audiobooks, a favorite book recommendation? You don't have to limit it to one. We'd love to know what you say.
Well, this is a book I've literally purchased for people many times, and it is not what you would expect. It is called The Power Broker by Robert Caro.
It's the telling of Robert Moses's New York. He started as someone who wanted to build parks for moms in New York City and was heralded for it, and he took that power and corrupted
so much in the world. He literally moved millions of people's apartments in New York, and his thinking
really formed the modern freeway system around the country. It is a fascinating look at power corrupting an individual, and at
how public works impact society. It is just a masterwork of how the world works, and so I highly recommend it.
And it's perfect for Audible, 'cause it's about a thousand pages, so there's a lot of jogging you can get done while listening to The Power Broker.
Interesting, yeah. I grew up in New York City, you know, years after
Robert Moses, and he was...
no one had a neutral opinion on Robert Moses.
Nope, people either hated him or admired him. I wouldn't say loved him.
But they admired his ability to get things done.
Uh, you know, and for those who don't know, ultimately he's the one that kind of
turned the Interstate Highway System in the US from kind of a way to move
goods from city to city (or, let's be real, it was a defense project that was labeled as a civilian project, like the Internet)
Exactly sure, yeah.
into kind of the daily commuting engine that it has become across the states.
I don't have a better answer.
And he was the brilliant one:
he would write up the plans so he'd be ready to move faster than anybody else. There are so many practical lessons in the book about how to get things done, and about thinking systemically, that are just fascinating. And, unfortunately, about how certain groups were excluded
and actually damaged by what was going on. It's a fascinating look at the history of that time.
It's a brilliant masterwork by Robert Caro. The other book I would highly recommend is Not for Sale by David Batstone.
It is really what started everything that we are at Krunam. It tells stories about human trafficking, but not in a way that is so heavy:
it shows the hope and the power of doing something, so I highly recommend Not for Sale as well.
That's really the book that started a movement and then eventually started this company, so I probably should have started there, but I love The Power Broker so much.
Yeah, I'll definitely check that out, 'cause I grew up kind of in the post-Robert Moses New York, and it was interesting as someone who grew up not on the island of Manhattan.
For those who don't know, basically Robert Moses wanted to put a freeway straight through Manhattan.
The power center of New York City is Manhattan; the boroughs are kind of an afterthought. So he was able to kind of, you know, pillage
the outer boroughs, but not necessarily Manhattan. Manhattan was kind of his Waterloo. At least that's my
recollection from older family members.
That's exactly right. If you look at any freeway system in any city,
you have the same situation. Like, if you look at DC, there were power brokers through the entire city; that's why there is a Beltway.
The freeway was supposed to go right down Pennsylvania, and instead it kept getting pushed out a road, and another road, and another road, and there was a Beltway.
Right, I don't think the Beltway touches the boundaries of DC.
Exactly. And if you look at my hometown of Minneapolis and Saint Paul,
they put the major freeways right through the heart of the African American communities and took out all the African American businesses.
And I think if you look at the history of where freeways have been placed, you see who was powerful and who was oppressed at that time.
It's fascinating. It's one of those things you just don't realize until you dig into it.
Well, my grandfather went from liking Moses to despising him.
Well, and I think you do the same thing while reading the book: you go from, oh, what a good man, to, what a power-hungry monster. And that's part of the brilliance of it.
Chris, I want to ask you another question, a little bit of a follow-up here, and I know we're pushing time, so I apologize. I'd like to ask you about the book Not for Sale and the concept, and you brought up faith; I'd like to explore that intersection.
Absolutely. Well, you know, I grew up in the church,
and that was a core part of
who I was and am: my faith.
Uh, and, you know, when you look at it,
I think the phrase in the Bible is: however you treat the least of these, you treat me. That's been a motto that I've carried through my entire life, and I'm glad to be doing it with Krunam.
Certainly trafficking victims could fall into that category very easily, being treated as the least of these.
Exactly, yeah. OK, I appreciate that answer.
Well, awesome. I think you're doing great work.
Yeah, I appreciate your time, and your putting up with these bugs, the glitches that we've had, including my machine crashing yet again on another PC. Time to re-evaluate Zencastr.
But with that, I'll let the nice British lady end the show.
Thanks for listening to data driven.
We know you're busy, and we appreciate you listening to our podcast.
But we have a favor to ask: please rate and review our podcast on iTunes, Amazon Music, Stitcher, or wherever you subscribe to us.
You have subscribed to us, haven't you?
Having high ratings and reviews helps us improve the quality of our show and rank us more favorably with the search algorithms.
That means more people listen to us, spreading the joy. And can't the world use a little more joy these days?
Now go do your part to make the world just a little better and be sure to rate and review the show.