Intuition and Experience with US Army's Kris Saling
Episode 18 • 19th January 2022 • Tech Transforms • Mark & Carolyn
Duration: 00:44:40


Shownotes

In an AI-driven world, the role of intuition and experience can be hard to define. Kris Saling, Chief Analytics Officer for the Army Talent Management Task Force and Director of People Analytics in the Office of the Assistant Secretary of the Army for Manpower and Reserve Affairs (M&RA), joins Tech Transforms to give insight on talent management within government agencies.

Episode Table of Contents

  • [00:41] The Analytics and Technology of Talent Management
  • [07:35] Ensuring Unbiased Data Talent Management
  • [16:28] Talent Management Prediction Vectors
  • [22:21] Quality of Life
  • [31:58] Talent Management in AI Analytics
  • [38:39] How Do We Ensure Trust in Talent Management


The Analytics and Technology of Talent Management

Carolyn: Kris Saling is the Deputy Director of Army People Analytics and the Chief Analytics Officer for the Army Talent Management Task Force. She coordinates analytic and technology solutions, writes policy, and resources innovation to promote data-driven decision-making across the Army's people enterprise. Kris, welcome to Tech Transforms.

Kris: Thanks so much, and I'm really happy to be here. This is going to be fun.

Carolyn: So I want to start off with a two-part question. Let's start with the awesome poster of Sherlock Holmes behind you. Tell us the story behind that.

Kris: There are a bunch of stories behind that. The big one is people ask me why I went into data science out of all the things I could have gotten into. My usual answer for them is because I read too much Arthur Conan Doyle when I was growing up. I just love the idea of sifting through all this information, finding clues, and solving problems, and that just persisted.

That's up there for some motivation, but I'm also a huge Robert Downey Jr. fan. He established a company that specializes in sustainability work through AI, called FootPrint Coalition. He took the whole Tony Stark thing and decided he was going to make that his real life.

Carolyn: I'm loving him even more. I've always been a big fan. How can you not be?

Kris: Yes, save the planet through AI, how can you not love that?

Carolyn: Before we move on to your job, do you have a favorite Arthur Conan Doyle story? What's your favorite Sherlock Holmes?

A Long and Unusual Story

Kris: There are so many of them that stick, but I'm trying to remember the title of it. It's one of the first ones, where he first meets Watson, and just some of their banter. It is a really long and unusual story. Half of the story is a flashback where he's talking to the perpetrator of one of these crimes, who talks about his migration across the Wild West frontier. I'm going to have to try and remember what that was, but it's just the meeting between him and Watson.

Just the dynamic of this very strait-laced professional trying to sit there and figure out what the heck he has in this certifiably insane new roommate. But the fact that they connect on an intellectual level just makes that dynamic all kinds of fun.

Carolyn: Do you remember?

Mark: It's like Jarvis.

Carolyn: Do you remember how old you were when you first got hooked? Your first story?

Kris: 12 or 13 I think?

Carolyn: So now we know what inspired you to get into the line of work that you're in. Mark and I are really intrigued by your titles. Is it HR, or is it operations? It's super cool. You have a quote that says you're leveraging AI to leverage AI. Will you unpack what it is that you do, including how the AI fits in?

Kris: I have two bosses. One of them is the Assistant Secretary of the Army for Manpower and Reserve Affairs. The other is the Director of the Army Talent Management Task Force. In both of my roles, I essentially have the same portfolio.

Applying Talent Management in People Analytics

Kris: My short way of explaining it is that it's all things talent data and data talent. We're doing a lot to revitalize how the Army is looking at its personnel management systems, going from just transactional human capital-type work to actual people analytics. We're starting to look at not just quantity analytics, but quality, distribution, and employment. How do we optimize performance, and how do we do targeted retention to keep our top talent? Those kinds of problems.

On the data talent side, I run a planning team that looks at the Army data workforce. I've run this for about two years now. We look at how we do talent management for the folks who are doing everything from very high-end analytics, building AI solutions, and high-end data architecture down to what basic data literacy looks like. So all of that fits within this really broad portfolio.

I've been really fortunate that my bosses have allowed me to essentially build my portfolio. They just come to me and say, "We're giving you some free space to think about what the Army needs to understand about its people and what kind of talent we need to make that happen. So go play." And I swear this has been my favorite job in the Army; it's so much fun.

Mark: Who are your customers, then? Are your customers commanders over certain areas, and you're looking to help them as far as how they field their organization? Or is it not really operational or logistics-type stuff?

Kris: Oh, it is. One of the projects we have ongoing right now is a campaign of learning.

Data Talent Management

Kris: We're trying to see how far down in the operational Army we can push data talent and how we build the supporting architecture. How do we build the supporting environments that we need in order to do analytics at that level? Also, how do we make it accessible? But then, how do we train up not just the data professionals who are going to be working at that level, but also the commanders, to interpret the information and turn it into action, into decisions?

Mark: When you talk about the architecture, is that the architecture of the human capital and the resources that you're providing to do whatever the mission would be?

Kris: There's some of that. Some of it is process engineering, the rest is actual data architecture. Have we developed our pipelines in order to be able to push data safely and securely to these organizations that haven't previously had access? They've had to send RFIs or requests for information up higher.

We don't want them to have to send in a request and then, 48 hours or two weeks later, whatever it is, get back some analysis. You've all seen the requirements model where it's like playing telephone: the more steps you have between the person who actually needs something and the person who's going to deliver it, the more it gets garbled.

We want commanders to be able to ask their questions of the right kind of talent, people who can interpret them appropriately, understand what's needed in the data, and engineer that solution. Now, it might just be a couple of folks at the question's point of origin, basically with reach-back capability up at the higher ranks.

Ensuring Unbiased Data Talent Management

Kris: But again, we're still in this campaign of learning. We're experimenting with what works best and what's the best use of our talent.

Carolyn: Do you have challenges with ensuring that the AI is unbiased?

Kris: That's a challenge everywhere. The idea that AI is ever going to be unbiased is like saying that humans are ever going to be unbiased, because we all have biases.

Carolyn: It's really just an extension of us.

Kris: It's only as good as the decisions made by the people who program it or the decisions it's learning from.

The way you get around bias in AI is the same way you get around bias in an organization. You bring in a lot of diverse perspectives, you bring in a lot of quality control. And you ask a lot of the “what if” questions.

So you have to go back through and essentially triage the model when you're developing a machine learning or deep learning model, or when you have AI making those decisions and recommendations. Part of that is going back through and looking at what the model's keying in on. Is it actually making decisions based on the things you want it to be making decisions on? Do you need to hide some of the variables?

There's an example I used. There was an AI that somebody created a few years ago to predict winners of the Academy Awards. It turned out it was just keying in on anything Daniel Day-Lewis was in. It's just like, "Okay, granted, that is correlated, but we don't want you predicting based on this. He might turn in a lemon."
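As a minimal sketch of that kind of model triage, assuming common scikit-learn tooling rather than anything the Army actually uses, you can check what a trained model is keying in on via permutation importance and retrain with suspect variables hidden. The dataset and column names below are hypothetical.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

def triage(df: pd.DataFrame, target: str, hide: frozenset = frozenset()):
    """Train a model, then report which features it actually leans on."""
    features = [c for c in df.columns if c != target and c not in hide]
    X_train, X_test, y_train, y_test = train_test_split(
        df[features], df[target], random_state=0
    )
    model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
    imp = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
    # Large importances on variables you don't want driving decisions are the red flag.
    print(pd.Series(imp.importances_mean, index=features).sort_values(ascending=False))
    return model

# Second pass with a suspect variable hidden from the model (hypothetical columns):
# triage(oscars_df, target="won_best_picture", hide=frozenset({"lead_actor"}))
```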

From the End Game Backwards

Mark: You mentioned in a recent interview that you look at the problem from the end game backwards. This builds on the question we just asked, because it seems like a very difficult thing to do. If you start with the end game, then aren't you basically saying this is the outcome we want to see? How do you create the algorithm to give you something that, well, you don't know? You don't want to necessarily point it to the answer, but you want to see what you might be able to get out of it without doing that.

Kris: It's beginning with the end in mind. When we start looking at the outcomes we want to generate, we have to focus on something to train the model, and sometimes that focus is a proxy for what we actually want it to focus on. More and more, we're getting our folks to build things using very modular, reusable, very tailorable code. If we decide we want it to do business in a different way, or we want it to key in on something differently, that's fairly easy to go in and modify.

We're doing this right now with some of the predictive tools we're looking at for performance. The way we're looking at performance is based on how the Army currently measures performance, but we have a number of efforts going on, through our campaign of assessments, to change how we measure performance. We need that to be something we can iterate on later. One of the things we've keyed in on over time for the Army is that there are no more end states.
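To make the modular, retargetable idea concrete, here is a hedged sketch, not the Army's actual code: the outcome column and feature list live in configuration, so re-pointing a model at a new performance measure is a config change rather than a rewrite. All column names are made up.

```python
from dataclasses import dataclass, field

import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

@dataclass
class ModelConfig:
    """Everything likely to change later lives in config, not code."""
    target: str = "high_performer_next_fy"      # hypothetical outcome column
    features: list[str] = field(default_factory=lambda: [
        "years_of_service", "deployments", "assessment_score",
    ])
    test_size: float = 0.2

def train(df: pd.DataFrame, cfg: ModelConfig) -> GradientBoostingClassifier:
    X_train, X_test, y_train, y_test = train_test_split(
        df[cfg.features], df[cfg.target], test_size=cfg.test_size, random_state=0
    )
    model = GradientBoostingClassifier().fit(X_train, y_train)
    print(f"holdout accuracy on '{cfg.target}': {model.score(X_test, y_test):.2f}")
    return model

# When the performance measure changes, only the config changes:
# train(df, ModelConfig(target="new_assessment_outcome"))
```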

How Intuition and Experience Helps in Process Improvement

Kris: We're not going to plan toward an end state. Nothing is just a one-and-done; this is continual. We've got to keep upgrading, we've got to keep developing, and we have to keep this thing adaptive and learning. So as we get more traction on that, we generate more of a flavor of process improvement.

Carolyn: Your job sounds really broad. I heard you say you use AI to source new talent, retain talent, and help identify the talent that's going to be best for this or that mission, and identify who should be on these different teams.

Kris: That last one is something we're trying to get to. We're doing a lot of job competency studies right now to see what's actually required for different positions. We don't have that at the granular level of data that I'm looking for to be able to make some of those recommendations. What I envision out of our marketplace is that eventually you get a recommendation engine, and it starts looking like Amazon: since you liked this job, you might also like these. Since you did this job, these might be the next best for you based on your capabilities.

So we have an effort right now to basically make an intelligent individual development plan where folks can go through, see where their skills might best lead them, or pick goal positions down the way. This can show them, "Okay, here are the gaps in your current resume." My eventual dream for that is: here are the gaps in your resume, would you like to sign up for a class? Would you like to take a self-initiated assessment or talk to your career coach? All these different ways that we can help them bridge that gap.
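A toy version of that gap analysis, purely illustrative rather than the marketplace's actual logic, might compare a person's current skills against the competency profile of a goal position and attach a suggested next step to each gap. Every position, skill, and remedy below is hypothetical.

```python
# Hypothetical skill-gap check for an "intelligent individual development plan".
GOAL_POSITION_SKILLS = {
    "data_engineer": {"python", "sql", "data_modeling", "cloud_architecture"},
}

# Ways to close a gap, per the interview: a class, a self-assessment, or a coach.
REMEDIES = {
    "cloud_architecture": "sign up for a cloud fundamentals course",
    "data_modeling": "take a self-initiated assessment",
}

def development_plan(current_skills: set[str], goal: str) -> list[str]:
    gaps = GOAL_POSITION_SKILLS[goal] - current_skills
    return [
        f"Gap: {skill} -> {REMEDIES.get(skill, 'talk to your career coach')}"
        for skill in sorted(gaps)
    ]

if __name__ == "__main__":
    for step in development_plan({"python", "sql"}, "data_engineer"):
        print(step)
```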

Where the Army Needs To Be

Mark: It sounds like you're working with the leadership of the Army to determine where the Army needs to be 5, 10 years down the road.

Kris: That was the charter they gave me. It's just like, "Okay, what do we need to know about the army? What's your 10-year life plan?" I'm like, "Oh this is going to be fun."

Carolyn: I had never really thought about using AI for this purpose. It makes sense, and I'm wondering about your role: how new is this job? For other organizations, even in the commercial world, is this a thing? Or is this a thing that you're creating right now?

Kris: This particular job is one created for the Army. I've been in and out of M&RA for the past six years, which is unusual for an Army officer; normally you PCS a lot. But I started out in the G-1, the Army's personnel directorate, working on how we actually use our data, how we collate all of it together out of all the different systems we have spread throughout the Army in a way where we can rapidly access it and utilize it to solve problems.

We put together what was at the time called the Human Capital Big Data Plan. It just basically directed consolidation of all our personnel data into this massive data warehouse that we house out in Monterey with the Research Facilitation Lab. From there, we've grown a number of different initiatives. We started using this data in creative ways. We've really started to see the gaps between what we need, as far as data, to answer our questions.

Asking the Right Questions

Kris: We see what we don't have, and basically whether we're asking the right questions. I noticed I didn't quite address this earlier: the leveraging AI to leverage AI piece. AI takes a lot of data, so we are making more data. We're using natural language processing, optical character recognition, and a whole bunch of other techniques that fall under the advanced analytics umbrella to go through and read old evaluations, files, transcripts, and all of the stuff that's sitting in TIF files in our repositories at Human Resources Command. We're trying to turn that into more usable information.
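As a rough sketch of that kind of pipeline, assuming common open-source pieces (pytesseract for OCR, spaCy for entity extraction) rather than whatever Human Resources Command actually runs, scanned TIF evaluations could be turned into rows of roughly structured data. File paths and output fields are illustrative.

```python
from pathlib import Path

import pytesseract          # OCR wrapper around the Tesseract engine
import spacy                # NLP: tokenization and named-entity recognition
from PIL import Image

nlp = spacy.load("en_core_web_sm")

def digitize_evaluation(tif_path: Path) -> dict:
    """Turn one scanned evaluation into a row of roughly structured data."""
    text = pytesseract.image_to_string(Image.open(tif_path))
    doc = nlp(text)
    return {
        "file": tif_path.name,
        "people": [ent.text for ent in doc.ents if ent.label_ == "PERSON"],
        "orgs": [ent.text for ent in doc.ents if ent.label_ == "ORG"],
        "raw_text": text,
    }

# Hypothetical batch run over a folder of scanned evaluations:
# rows = [digitize_evaluation(p) for p in Path("evaluations/").glob("*.tif")]
```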

Carolyn: How much data are you crunching every day?

Kris: A lot.

Mark: Is this more of a future state for us? Or is this a state now, where we're able to take advantage of this?

Kris: This is now. We have this argument with Futures Command a lot. It's like, AI is not the future, AI is now, because right now at the RFL we're supervising seven separate AI projects. We're actually using an IBM Watson implementation and a couple of other things to do AI projects and predictive modeling. One of those that we mentioned before was retention prediction.

Hopefully this February or March; it's already been tested for a trial, and we're just waiting to get it up and accessible. The RPMA, the Retention Prediction Model-Army, we've got in partnership with the Institute for Defense Analyses. We're just putting it into our systems and getting it hooked up to all the right data feeds. But it creates an individual vector of attrition for every active duty person right now.

Talent Management Prediction Vectors

Kris: We can say, based on this quarter, next quarter, out to 20 years from now, what's the likelihood we're going to retain this person. That's cool, but the best part of it is not just the individual aspect. It makes it really easy for us to combine that data and look at those prediction vectors by demographics, by particular skill sets, by commissioning source, by point of accession.
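To make that "combine the vectors" step concrete, here is a hedged pandas sketch, not the RPMA itself: given one predicted retention probability per person per future quarter, grouping by demographic or skill fields shows where attrition risk concentrates. The data and column names are invented.

```python
import pandas as pd

# Hypothetical model output: one predicted retention probability per person per quarter.
predictions = pd.DataFrame({
    "person_id":  [1, 1, 2, 2, 3, 3],
    "quarter":    ["FY25Q1", "FY25Q2"] * 3,
    "skill":      ["cyber", "cyber", "cyber", "cyber", "logistics", "logistics"],
    "p_retained": [0.95, 0.90, 0.70, 0.55, 0.92, 0.91],
})

# Aggregate individual retention vectors by skill set to spot where risk concentrates.
by_skill = (
    predictions
    .groupby(["skill", "quarter"])["p_retained"]
    .mean()
    .unstack("quarter")
)
print(by_skill)  # low cells flag groups that may need targeted retention efforts
```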

We can really figure out trends in who's staying and who's going, and where we need to do some targeted retention efforts if we're going to keep the right talent that we need for the future. We're also trying to develop some algorithms to help us figure that out in our marketplace. We want to have something that looks like LinkedIn's Talent Insights, where you start seeing demand go up for certain skills.

You've got your human resources professional; well, this person also needs to have a background in analytics now, because we're seeing all these things. We want it to pop up in the marketplace and tell a commander, "Hey, we're starting to see this demand signal here. Do you want to add this to your request?" Yes, they do. I also want to be able to funnel that information to our schoolhouses as we're developing the training talent, so that they know, "Hey, this is an increasing demand in your field. You need to make sure that we have this in the program of instruction."

Carolyn: Do you collaborate?

Mark: Are you looking across the Army at warfighter needs? Or are you talking more like IT and some of the different skill sets like that?

Project RIDGWAY

Kris: No, our best client in the campaign of learning right now is the 18th Airborne Corps and the 82nd Airborne Division. They're doing Project RIDGWAY, which is their move toward becoming an AI-driven corps and division. They have a number of different exercises that have gone on through that process to test how they're going to use this information, to see what they need to automate, and how it's going to change their business processes and their decision flow.

We're looking at supporting them with this talent. But then, we're also looking at how we do it at the high level: what does the leading edge for these capabilities look like? We've got the Army Research Laboratory working on this. We have the AI Integration Center at Carnegie Mellon looking at this. Also, we have Futures Command doing a lot of great data science work. We have the Center for Army Analysis. And we have a lot of folks who are really digging in and looking at what the future looks like.

Carolyn: You're collaborating a lot within DoD and with Carnegie Mellon. What about industry at large? You mentioned LinkedIn's Talent Insights. Do you collaborate with industry?

Kris: Frequently. We're actually trying to leverage some of the collaboration with academia and industry that our Air Force partners have done so far. I talk a lot to the guys who founded Digital University. They have a lot of networks going with both academic partners and industry partners, and in how they're developing their curriculum for their advanced analysts. I do have some other partnerships going on with various vendors that we're trying to...
