How To Future-Proof Your Privacy Career
Episode 72 • 25th April 2023 • Privacy Pros Podcast • The King of Data Protection - Jamal Ahmed
Duration: 00:48:33


Shownotes

Want to know the number one skill to future-proof your career in privacy?

Heidi Saas, Data Privacy and Technology Attorney, has got you covered!

Hi, my name is Jamal Ahmed and I'd like to invite you to listen to this special episode of the #1 ranked Data Privacy podcast.

In this power-packed episode, you'll learn about the number one skill you need to succeed in the world of privacy, and discover how AI will shape the future of the industry. 

Plus, Heidi shares practical tips and strategies to help you up-skill and stay relevant in the ever-evolving world of privacy.

But that's not all! 

You'll also discover the qualities that make a standout Privacy Pro, and learn how to embody them to achieve success.

And if that's not enough, Heidi also shares invaluable advice for Privacy Pros who have been laid off, the importance of Privacy regulations to safeguard consumer rights, and her personal journey into the world of Privacy

With this episode, you'll gain access to actionable insights that will help you build a credible and fulfilling career in privacy.

Heidi Saas is a Data Privacy and Technology Attorney who regularly advises SMEs and startups working in a wide variety of industries, on a global scale.

She delivers thoughtful process management, utilising emotional intelligence, legal training, and business acumen to help her clients understand their challenges and help them map a better path forward. In addition to being an Attorney and a Privacy Pro, Heidi is also an A2J Advocate, Mentor, Coalition Builder, and Disruptor for truth and justice initiatives.

Follow Jamal on LinkedIn: https://www.linkedin.com/in/kmjahmed/

Follow Heidi on LinkedIn: https://www.linkedin.com/in/heidi-saas-31a7a16/

Get Exclusive Insights, Secret Expert Tips & Actionable Resources For A Thriving Privacy Career That We Only Share With Email Subscribers

 https://newsletter.privacypros.academy/sign-up

Subscribe to the Privacy Pros Academy YouTube Channel

► https://www.youtube.com/c/PrivacyPros

Join the Privacy Pros Academy Private Facebook Group for:

  • Free LIVE Training
  • Free Easy Peasy Data Privacy Guides
  • Data Protection Updates and so much more

Apply to join here whilst it's still free: https://www.facebook.com/groups/privacypro

Transcripts

Heidi:

That's a great area for people to start learning new things, because once it's mandated, we already don't have enough people trained in the area to provide services that businesses are going to have to have. So the more people we can get into that, the better. And I think privacy professionals are best suited to do that. It is so hard to do all of this research and have all of this knowledge and be ignored, but you're going to be ignored unless you put it out there. You've got to get into the arena. So if all you do is read some guidance article and you pull some key takeaways, put that in a post and put that out on LinkedIn and just wait, the community will support you. The community will correct you. The community will be there for you. The community may just ignore you, but you've got to start showing up and say, here's what I know, here's what I can do so that you can participate in the community. That's how you build credibility.

Intro:

Are you ready to know what you don't know about Privacy Pros? Then you're in the right place.

Intro:

Welcome to the Privacy Pros Academy podcast by Kazient Privacy Experts. The podcast to launch, progress and excel your career as a Privacy Pro.

Intro:

Hear about the latest news and developments in the world of privacy. Discover fascinating insights from leading global privacy professionals and hear real stories and top tips from the people who've been where you want to get to.

Intro:

We’re an official IAPP training partner.

Intro:

We've trained people in over 137 countries and counting.

Intro:

So whether you're thinking about starting a career in data privacy or you’re an experienced professional, this is the podcast for you.

Jamal:

Good morning, good afternoon, and good evening from wherever you are listening to the Privacy Pros podcast today. I am super excited to have an amazing guest with us, and this is someone I've been trying to find an opportunity to get on the show for some time now. So I'm really excited to present to you guys Heidi Saas. Heidi Saas is a data privacy and technology attorney who regularly advises SMEs and startups working in a wide variety of industries on a global scale. She delivers thoughtful process management utilizing emotional intelligence, legal training, and business acumen to help her clients understand their challenges and help them map a better path forward. In addition to being an attorney and a Privacy Pro, Heidi is also an A2J advocate, mentor, coalition builder, and disruptor for truth and justice initiatives. Heidi, welcome to the Privacy Pros.

Heidi:

Thank you so much for having me. My name is Heidi Saas and these are my opinions, not legal advice. Now we can let it flow. We're ready to go. I'm honoured to be here. You've already talked to so many luminaries in our field, and I'm excited to be included in such great company. So thank you so much for the invitation.

Jamal:

My absolute pleasure. Now, Heidi, one of the things that you spoke about at the IAPP conference was your journey, and also what people who have been laid off can do to really upskill and find their niche, find their space, in privacy. So I'm going to get straight into that. Before we do, could you be kind enough, for the listeners who haven't had the chance to get to know you yet, to tell us a little bit about your journey so far into privacy?

Heidi:

Sure. Well, I'm a consumer rights attorney, and previously I ran a consumer rights law firm. It was a national firm, and I scaled the firm to over 200 attorneys and managed the attorneys for years. And it was new what we were doing, because it was during the financial crisis. So we used limited scope representation, which you guys call alternative fee structures. And it wasn't new in our country. It just wasn't used, because it wasn't available. But that was the only way we could find to afford people access to legal services in such a financial crisis. So at the same time that we were trying to offer these services, we were lobbying for the passage of the CARD Act and Dodd-Frank, with the establishment of the CFPB. I feel personally invested. I'm so excited. The consumer agency I always wanted for people. After I finished doing all of that, it took a terrible toll on me. Disruption work is not easy. And I was teaching hundreds of lawyers in new areas, and we were under attack from all sides because the monied, powerful interests were all against us. And so they came for us in as many ways as they could. And ultimately, when I left, I needed a few years just to gather my thoughts and recover and focus on building my family. Because as a woman, it's just not easy to pick the right time to have a family. When you're a lawyer especially, there's no right time, because motherhood is where legal careers go to die. So I needed some time to do that. And then when I came back, I said, you know, if I am going to work on consumer rights, I have to learn more about data, data privacy and technology, because that's where the consumer rights violations are happening now. And so I got into learning more about it. I did the Google Analytics courses, and I was like, oh, my God, they're doing what? And so then I moved into the IAPP, and I said, what are you guys doing about it?
I took the exam, and I show up, and I'm like, hey, everybody, I hate data brokers, and I'm here to see about privacy and ethical AI and consumer rights. And they were like, okay, lady, and I said, well, what are you doing?

Heidi:

And it was:

Jamal:

No, I love that. And I can see how that consumer rights journey fed you into data privacy, and you went down the rabbit hole of all the analytics and you were like, wow, I really have to do something about this. I can't just let this happen. I'm going to go in and I'm going to do my best. And even if people are saying they're not interested right now, I know it's about doing the right thing and I'm going to give it my all. And you've been doing amazing work since. And for anyone who hasn't met Heidi yet, please go check out her LinkedIn profile. Read through some of the amazing stuff she's been doing and you will see why we need to get behind people like Heidi and why we need more people like Heidi in our industry. And Heidi, you know what I love most about what you said there and about the way you come across: being a privacy professional is not about ticking a bloody box. It's not about populating a bloody template. It's about doing the right thing. It's about helping businesses to go beyond compliance so they can gain that trust, so they can build that confidence and ultimately create more of an impact with whatever business they have, in whatever parts of the world they're actually looking to serve people and make a positive difference.

Heidi:

Fair Credit Reporting Act. In:

Jamal:

Highly unlikely. It's a little bit like what we say when you shoot yourself in the foot. They definitely don't want to shoot themselves in the foot now, do they?

Heidi:

And they don't want to pass legislation that carves out: everybody else has to do this but us. It's cool for us, right? Because that looks like BS, and people know BS; they're going to catch on. So there are other ways to go about this. Agencies have rulemaking authority and they are working on it right now. This past week, I was discouraged to hear some people refer to our agencies that are there to protect consumers as activist agencies, as if that's some nasty negative label to put on them to delegitimize their work. People forget that the agencies have always had this mandate to do this work. Whether they did the work or not depends on what administration was there, what funding they were getting from Congress, who was in charge of the agency at the time. And quite frankly, it broke my heart to know that Mick Mulvaney was put in charge of my little baby agency. And I don't even know if he stepped foot through the doors, but I still wanted to go over there and burn some sage when it was over, just to clear things out for the people to come back and do the work for consumers. So I'm glad to see the government agencies are doing their jobs. And I'm discouraged to hear people in the privacy community call them activists, because those corporate lawyers are the same ones that, as soon as the legislation or rulemaking comes out, are going to go running into court complaining, oh, it's just too burdensome. And they're the ones pushing the four corners of it. So you know what? It's fair for you, it's fair for them. If you don't want to get dirty, then put your comments in. Otherwise, shut up when the rules come out. That's where I'm at.

Jamal:

Yes, exactly. I completely agree with you. If you don't take responsibility for doing the right thing now, then don't complain later when you have to deal with the consequences.

Heidi:

How often does your government say, tell me what the rules should be? That's a huge opportunity right now.

Jamal:

It was really interesting because I remember when we got the GDPR here. I'm in the UK now, but when we were part of Europe, I remember companies like Microsoft and other big companies were creating their own rules and saying, hey, here, we've written the law for you, just take this and accept it. What are your thoughts on that?

Heidi:

So in our country, that's how we do law. That's what we do. So the people that write law are not the people that you vote for, the people that you see giving sound bites on TV, on Capitol Hill. Those aren't the people that write the law. Lobbyists, think tanks, those sorts of groups write law. I know this because when I was in college, I worked at a lobbying law firm, and a lot of the documents that I was drafting for my bosses started with an 'S' and a number, because it was a Senate bill that was going to be introduced. We were putting it down on paper, and then we would hand it to the legislator, and they would stand up and say, I'd like to introduce this bill, and read from what I was typing. So I know first hand that's where law comes from; it doesn't necessarily come from the legislators. They also don't have time to write law because they're constantly trying to campaign and keep their job. In the House, they're up for re-election every two years. As soon as you're done with one election cycle, you're starting another one. How much time do you really have to spend doing things for the people? You have to rely on other people, right? So you have the lobbyists and the think tanks and the other groups that are there trying to advance their cause for you. And everybody owes somebody something all the time. So that's how it's done in our country. It's new to you, but that's how we do stuff. Look at the rules, the EU's AI Act, and the machinations: companies are coming around saying, oh well, this is the definition of AI, and this is the definition of AI. And they're trying to create a definition for a safe harbour that includes all boats. Well, that's not the purpose of the law. What's happened is that everybody has gone and found whatever tool increases operational efficiency, and they thought it was great, so they piped it in and went ahead and integrated it into their system.
Well, now we're telling you there are harms, potential harms to humans, from your use of these tools in certain respects. And so it's your obligation to go back through those tools as part of your overall data pipeline review. This is what I'm telling people that are laid off or going into new areas of privacy. It is not your obligation to understand algorithmic bias, but it will be to your great benefit if you do. If you learn how these models work, if you learn a little bit of data science, if you learn a little bit of ethical training in addition to the privacy skills that you already have, you can review the whole pipeline. Where are you getting your data? What are your tools doing to that data? Here are the areas where you need to make some changes, and then what happens to your data later? It's the entire lifecycle. Right. That's a good area for people to get into, the algorithmic bias area. But that's only if you're helping your employer with in-house work. Because the audits that are mandated, and coming to be mandated, must be independent. So if you have to have an independent auditor come, it's to your benefit that you have someone in-house that is already working on the issue. And then after your independent audit, you'll know what that report says, and then you know where you have gaps in your analysis. It's a second set of eyes. But it will be to their benefit to have that sort of training first hand around pre-audits, because for some of these audits, the disclosures are mandatory on what they find. That's why they have to be independent. We don't have transparency and accountability without independent audits.

Jamal:

Yeah, I truly agree. For us to have true accountability, there has to be some kind of independent oversight, making sure that all of the different areas are actually being accounted for, and where there are gaps identified, we're actually aware of them and we can actually do something about it. And the other problem is, when you don't have independent assurances, you get side-tracked, you get lost in your view and you have this tunnel vision, and sometimes you miss the wood for the trees just because you're in the thick of it so much. So having that independent oversight really helps just to give that fresh perspective: someone coming in who hasn't been looking at this for so long that they've got a bit lost in it. I want to talk about ethical AI. And the second thing you mentioned was about how people who have been impacted by the recent layoffs can really upskill and carve themselves out a niche in this really exciting privacy space, focusing specifically on AI or something else. I want to come and touch on both those things. Before I do that, one of the things that was really fascinating was how you've given me an insight and really opened my eyes to how laws are passed in the US. It is not the actual politician, it is not the executive, that passes the laws. It is whoever has the biggest lobbying power and whoever has the biggest lobbying budget. Does that sound about right?

Heidi:

Yeah, it has been for a long time. There are disclosures required in some instances, but dark money refers to the money that's not even required to be disclosed, because there's required disclosure money, and then as soon as you have a regulation, people carve out their own little exceptions. And then there's the mountain of other money over here that's flowing all over the place to get things done. If you look at the Virginia privacy bill, for the state of Virginia, okay, the state of Virginia's privacy law does not cover Alexa. It's also the location of HQ2 for Amazon. You see a little connection there? Because it was covered until the last minute, until they got a phone call: we don't like this language, change that language. So they tweaked a couple of words. Guess what? Alexa is no longer covered. Bill's passed. Everybody's excited. They've enshrined privacy. Yes, they've enshrined privacy by passing a law. No, you didn't. You were policy blocking good law, and you've done damage to consumers in the process. So, yes, that was purchased. Open Secrets is an organization that we have in our country that follows the money: the campaign money, the dark money trail, the bills being passed, and that sort of thing. So you can go and find out how many lawyers and lobbyists they have working at just one giant law firm. You can go and look, and then you can look and see who their clients are, and you can see which industries they're lobbying. So it's all disclosed. Yeah. Dark money, policy, consumer rights, all of those things are connected. It's one ecosystem.

Jamal:

Wow, that's super fascinating. And I'm sure if I actually started exploring into that, I would get lost and fascinated at the same time. So I'll save that for another time. Now, the next thing I really want to focus on, Heidi, while we have you here, is about ethical AI. The question I have for you is, is it possible for AI to be ethical?

Heidi:

All of AI or generative AI? Because we need to separate this by the AI tools that I'm concerned about people using now.

Jamal:

Tell us the difference between the two. What's the main difference between the two types of AI you just mentioned there?

Heidi:

The AI tools that we use now, for example, a hiring tool. You say, I want to go and hire someone, and you're using a hiring tool, and it screens through your candidates, and it brings you the best candidates, and then you go through that, and then you onboard them, and then whatever. They're part of your HR system. Right. That tool right there, that is a tool that impacts a lot of different humans in a lot of different ways. Now, the large language models haven't been incorporated into that yet, okay? Because it just came out. So they haven't found a way to make this unsupervised tool worse yet by feeding it unsupervised data. Right. The large language models took everything on the Internet, which, you know, everything's true on the Internet. Right. And so they put that into this big pot, like a wishing well, I explain it to people. You throw some data in and some prompts, and then it kicks back some magic. But if you look at that, it doesn't have citations or attribution. You don't know where that came from. It hallucinates. It's a good example of how the ways regular AI tools are wrong can just be amplified by this unsupervised area. It is regulated, but it can't tell you how it got the results. Now, as a lawyer, when I'm reading something and somebody states a position and I don't have a footnote or a citation or something to back up what you just said, my brain stops reading. I don't have any sort of trust in what I'm reading if it can't give me citations. So that's kind of the issue that's going on over here. There are a lot of other privacy issues. I don't know that generative AI will ever be GDPR compliant, just because of the nature of everything that's in it, right?

Heidi:

Every single piece of data that went in, billions and billions of data points. We don't know if we gave consent for what's in there or not. We don't know if we can go back now that the cat's out of the bag and try to get consent for a billion data points that they have in there either. So that requires a little bit more analysis before people start rolling this into every single business tool that they can to increase operational efficiency. Think about the cost. If you don't know how these tools work and you can't explain them, you're in trouble. If you are doing this so you can get rid of humans and lay them off, and you're relying on a tool that you don't know how it works, you don't test the results and you're just doing it anyway to save money, you're going to get in trouble. So these are the sorts of problems that I have with people using generative AI right now, just because it's so magical and transformative. Especially law firms. If law firms want to use generative AI to replace lawyers, you better hope and pray you don't come after other people for the unauthorized practice of law when they use the very same tool in civil practice. Because that is hypocrisy of the highest order; lawyers cannot have a lock on generative AI and say it's unauthorized practice of law for anyone else to use it. I see that coming. They're already working on it. So that is a problem. Now, the regular AI tools that we use impact protected classes of persons in a variety of ways. An example that I like to give people about the fallacies of AI is that I've got a picture in front of me and the robot says it's 99% sure that it's a snowplough, right? 99% sure it's a snowplough. So I look at the toddler and the toddler goes, uh oh, school bus. So what I'm looking at is a picture of a school bus on its side. The toddler is absolutely right and did two things: identified what it was properly, and that there was a problem with it.
Okay, whereas the AI just flat-out lied to me and said, that's a snowplough. That is the level of accuracy that people are using to place people in jobs, housing, insurance, lending products, educational programs. That cannot be how these tools work. That can't be an acceptable level of results. Like, just because it produces something doesn't mean it's not garbage. The reason it produces garbage is because all the data that goes into it is garbage. Right? They go scrape whatever data they can from the web and feed it into the system, and then you're screened out.

Heidi:

Now people in protected classes especially are feeling the damage of this, because we have automated systemic racism. And now the tool is responsible, not the people. Because we used to go after the biased people, saying, they're a racist, that's why I didn't get a job. Now it's the tool. And they're like, it's the tool. No, I'm sorry, it's your racist tool, so you're still going to get in trouble for it. These are the impacts that we've got to chase down so the businesses know that's where the real risk is. So one more quick story on algorithms and why you shouldn't trust them. One metric that's used in the hiring tools is called job progression. You start at the window, you move up to fries, eventually you get to be the manager. If you have a gap in your resume for any reason, motherhood, a sick kid, a layoff, a gap for any reason, job progression just discarded you. It doesn't process your data any further in the application process, because you didn't meet that metric. They can't gauge you and give you a score on job progression. You're in the garbage; you never got seen by a human. How is that fair? Because at this point, after COVID, that's almost everybody. And so these machines aren't working for anybody. If these machines were doing a great job at picking candidates, people wouldn't have quit en masse and said, I hate my job. The Great Resignation: people said, I hate my job, and they took off. That wouldn't have happened. The algorithm got them that job two years ago, and they hate it. They left, and they were super sour about it, too. It wasn't just like, I'm going to go and get another job. They were just like, I'm out. I hate you. So, I mean, the system doesn't work for employers. It doesn't work for employees. It only works for the people making these garbage tools. Now we have to hold them responsible so that we can get validity out of these tools, so that they can be used for trustworthy purposes.
The Fair Credit Reporting Act is what needs to regulate that: the life-critical areas of employment, housing, education, lending, and insurance, at a minimum. So that data needs to be toxic for data brokers to use unless they can verify that the data is true before it is used against people for these purposes.
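[Editor's note: the "job progression" screening Heidi describes can be made concrete with a short sketch. This is a hypothetical illustration, not any vendor's actual code; the candidate data, gap threshold, and function names are all invented to show how a resume gap can silently discard an applicant before a human ever sees them.]

```python
# Hypothetical sketch of a "job progression" screen: candidates whose
# employment history has any gap are discarded before human review.
from dataclasses import dataclass


@dataclass
class Candidate:
    name: str
    jobs: list  # employment history as (start_year, end_year), oldest first


def has_gap(jobs, max_gap_years=1):
    """True if any gap between consecutive jobs exceeds max_gap_years."""
    for (_, prev_end), (next_start, _) in zip(jobs, jobs[1:]):
        if next_start - prev_end > max_gap_years:
            return True
    return False


def screen(candidates):
    """Keep only candidates the metric can score; the rest are never seen."""
    return [c for c in candidates if not has_gap(c.jobs)]


candidates = [
    Candidate("A", [(2015, 2018), (2018, 2023)]),  # continuous history
    Candidate("B", [(2012, 2016), (2019, 2023)]),  # 3-year gap: discarded
]
print([c.name for c in screen(candidates)])  # → ['A']
```

Whatever the reason for the gap (motherhood, a sick kid, a layoff) candidate B is dropped with no score and no appeal, which is exactly the fairness problem Heidi is pointing at.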

Jamal:

Wow, that's super fascinating. And I especially love how you put it into context when you were saying there's a school bus on its side, AI says it's 99% sure it's a snowplough, but even a toddler can see that actually it's not, it's a school bus and there's something wrong with it. And that is the level of accuracy we're saying we're happy with to make these life-changing decisions that can have massive impact on individuals across all things. The other thing you've mentioned a few times is it's garbage because it's garbage that's fed into it. So is there an argument that says, if we fed it only quality data, trusted, verified information where we know the source of that information is reliable, then we could actually create powerful AI?

Heidi:

Well, then you would have a better training set to begin with and a better model to work with. And then when you do go live, you should get better results. And if not, you will have a better chance at chasing down where exactly you need to make the changes to remediate the issues, to get more refined results. Data quality is paramount. Why would you spend so much money building a tool and then feed it a bunch of garbage and expect results you can rely upon? That just doesn't make sense to me. But the only way we can get that high quality of data: you could tell people all day long that you've got to use high-quality data to get high-quality results, and they're like, yeah. But if that data is toxic for them to use, and there's a private right of action that lights you up if you don't listen, people are going to start hearing you and they're going to start making the right decision. So, like I said, disruption work for me is not just pointing out problems. There are some other people doing disruptive things in privacy, and yeah, we all need to play our part, but disruption work is different for me because I come with solutions. Solutions that leave all stakeholders in a better position when we're done with this. Otherwise, why go through transformative change? It's hard, it's expensive, and it takes every stakeholder's buy-in. That's why we need to collaborate to find out where's the right balance between businesses' legitimate need for the use of data and consumers' rights and agency in their data, not to have it used against them in this way. So that's where the balancing is right now.

Heidi:

If everybody works towards this goal, then we can come out with a system that is better in the end. It's unfortunate that I'm going to try to make this data toxic for people to use, and it's going to hurt them in the end if they don't listen. That is unfortunate, but also it is unfortunate that there are businesses that are willing to go through every loophole they can to make money, and they're going to do it until somebody lights them up. So it is unfortunate, but that is a part of disruption work. This behaviour is not okay. This is the right solution. These are the people that should work on the team. And here's where we need to have our transformative change so that we're all better off in the end. Consumers are going to have their rights and agency over their data, and businesses are going to have that data quality that you're looking for so they can use it for something. They're also going to minimize their risk. They're going to build trust and brand messaging with their clients by showing how trustworthy they are in the use of that data. That is going to help them move forward and succeed in business. It's also going to help them make more money, because the data that they have is trustworthy data. So if you are going to use that data for other purposes and you have permission to sell that data, you can sell it for a higher price than the other slurry that they're scraping from the web, because it's verifiable data. Like other assets, I predict data is going to be cut into tranches. You're going to have FCRA-compliant data. You're going to have the maybe-compliant, fight-it-out-in-court data. And then you're going to have the slurry, because people are still going to buy that stuff. So I think, like other assets, once there is proper regulation in place, those are the kinds of tranches we're going to start to see. And I think it's to the benefit of consumers and our international data sharing partners.
They're waiting for us to do something legitimate other than pinky swears in executive orders. They're waiting for us to do something concrete and provide redress because we don't have the same rights you do. But if we have rights closer to yours, we might actually get a chance at Adequacy. Wouldn't that be cool? Shame on them if they give us Adequacy before we change our rights, though. That's just where I'm at. The US does not deserve Adequacy right now. No, we don't. We still need to do some work over here and then I will be in a better position. Plus we will regulate the people you're trying to reach. All the big tech companies are here, so we have to do something here to reach them and regulate them properly to improve the lot of international data transfers and all of the other things that are just waiting for the Americans to do something.

Jamal:

Yeah, I completely agree. And look, the other challenge with Adequacy is, as much as it's based on privacy, it's also based on some of the other human rights. And you still have lots of states with capital punishment. So unless we see that wiped out, Adequacy is not going to happen anytime soon. But then we see this farcical other mechanism that they've created, where there's meant to be some oversight, but all it is is a rubber stamp to say yay or nay, and there's no more information. That completely makes me fume. And I've gone almost as far as saying it makes my blood boil to see what they've done with enshrined privacy rights for Europeans. Schrems I: okay, great, now this doesn't work. Schrems II: okay, this doesn't work. And now they're coming up with something almost comical and nonsensical and trying to pass it off as offering equivalent guarantees or equivalence of protection for Europeans, when it's going to do no such thing. What are your views on that, Heidi?

Heidi:

I think it's garbage. My favourite word, to start. Like I said, the executive orders are pinky swears. They could be undone by the next executive that comes in. So, I mean, it's not worth the paper it's written on. It was more of a show pony. I think the big issue, you're absolutely right, is people know bull. And as soon as I saw that, I was like, I cannot wait to see what this redress process looks like. And I saw it and was like, oh, you can't even call that a court. The decisions are secret, and it's just kind of a rubber stamp, and it's basically everything's fine. But also, from my perspective, I've got to pay for a court to enforce your rights that I don't even have. That's bull. Like, seriously, my judiciary has to set up a court system for you guys to come and enforce rights I don't even have, and I get to pay for that? No, thank you. So that's a uniquely American experience right there. They've got more rights than I do. As soon as you can give me some rights, maybe I can set up adequate redress mechanisms on both sides. But we've got to get some rights and agency in our data first before we start placating other people with all their fancy rights. I'm definitely GDPR jealous.

Jamal:

Thank you for sharing that, Heidi. The other thing I want to touch on, the other thing I want you to really take a deep dive on, is your experience and your top tips for someone who's maybe been laid off or lost a job recently, or even for lawyers who are looking to pivot their careers into data privacy. What advice and top tips do you have for them to make that successful transition, and for people who are already practicing as data privacy professionals to take their career to the next level?

Heidi:

So curiosity, I think, is the most important thing you can have. And my advice for lawyers is, I think you need to understand one thing: you're not offered the option to fail in the law. Technology requires failure. So make peace with those two. You have to have failure in technology so you learn and improve on the next iteration. That's a new mindset that isn't native to lawyers. So if lawyers can get over the one bite at the apple and then work with technology and realize that every failure is another opportunity to improve what you're doing, that's a mindset shift for lawyers, and they'll be fine with that. For other privacy professionals, I would encourage you just to be curious. If you are curious and you see something going on with cookie deprecation, they're on borrowed time anyway, and people are looking at your business and they're finding out where all the cookies are to make sure they've got the Do Not Sell for California and this and that. Well, why did you stop there? Aren't you curious? Isn't there something else you can do besides cookies so that you can set this up in a better way for your company? So be curious and learn more about it. Or if you're working on the cyber side, and there's a lot of overlap between privacy and cyber, then talk to the people that work in those departments so that you know what salting and hashing is about. It's not just food. So, I mean, those sorts of curiosities, so that you have more skills in your toolkit. So when you get dropped into a situation with a new job, you can say, here's what I can help you do, here's what I bring to the table to help you through this issue, and you're going to have a better chance. Instead of hitting the ground and finding out what's everybody doing over here, you're already going to have some ideas that may or may not be listened to, but at least you can bring those new ideas to the table. Some of those new ideas mitigate risk and increase revenue, but you've got to be heard, right?
Yeah, exactly. Those are definitely things the C-suite wants to hear, but you've got to get them to hear you first. So if you upskill before you get into a new job, you can hit the ground running in a better way. Brand safety and trust people, especially, already have the right skill set to shift right over into bias. So start learning those algorithmic bias audits. Start working with other people in that field. It's not that complicated. Don't be afraid of the data science, it's just a little math. But that's a great area for people to start learning new things, because once it's mandated, we already don't have enough people trained in the area to provide the services that businesses are going to have to have. So the more people we can get into that, the better. And I think privacy professionals are best suited to do that. What's your thought on that?

Jamal:

Yeah, I completely agree. Privacy professionals will play a great part in that, especially those who are passionate and actually care about people's privacy rights and don't just want it because it offers the big pay cheque. And what's really interesting there, Heidi, is you talk about curiosity, and at the Privacy Pros Academy, in any program that we do, I take them through what I call my C Five methodology. And the first C of that C Five methodology is basically clarity. And I say, for you to be able to get the clarity, you have to get curious. You have to go and ask the questions. You have to dig, you have to be like an investigator. And it's when you get really curious and you ask powerful questions that you then get that clarity that you need to then go and confidently be able to provide credible solutions. How can you offer a solution if you don't know the full information? How can you offer a solution if you haven't asked the right question? You don't even know what you're dealing with. So the C Five methodology basically goes: you need clarity. Once you have the clarity, then you have the confidence. You know what you're dealing with. And that gives confidence to anyone who is going to listen to you provide credible solutions. You need to make sure you're competent in whatever it is you're doing, which is where the training and the upskilling comes in. And you also need a powerful community and environment around you to help you thrive. And sometimes that will be the community with the client or the employer you're working with, making sure you know different stakeholders in different parts, maybe, and creating a hearing committee for yourself. What we provide is a community of like-minded individuals who come together, who are there to support you, who are there to answer questions, people from different industries, people with different backgrounds, different levels of experience.
Because then you get to mishmash everyone's ideas, blend everyone's approaches, and come up with more pragmatic and powerful solutions. What are your thoughts on that, Heidi?

Heidi:

No, the privacy community, I love the privacy community. Number one, I think it's female-led, mostly because when privacy became a thing, the guys in cyber found the nearest woman and said, hey, learn that. And so now we have all these veteran leaders in the field. Seriously, for so many of the veteran women that we know in privacy, that's the origin story. They were volunteered to learn it, and so they got onto it. Now they own it. And I think it's amazing. But that example has led other women to go, I see a place for me here. I'm going to get in here, because law, technology, finance, politics, all of these powerful areas are very heavily male-dominated. Privacy is right there. Nothing works unless you have privacy. Guess what? The women own that. And so I think that's important for other people to see. We're also welcoming of people with pink hair and people that have pronouns. It's just an inclusive kind of whole community, and I love that celebration of openness, because we know we need more people to help, and we're excited when we find more people that want to learn more about this. Because privacy is a human right. We're all invested in this. It's not like we all found some niche market that only impacts one small little area. No, this impacts all of our human lives. Privacy in some countries is a matter of life or death. And so the perspective and diversity of thought that we have in this industry, I think, has made us grow so quickly and so beautifully that I think it's one of the most exciting areas you could practice in.

Heidi:

Now, the last bit of advice: unless you're a privacy business and you're trying to get into this field, if you drop into our community and say, hey, I built a new privacy widget and I do this, you will immediately be crushed if people don't know you. I mean, you've seen it happen, right? One of our technologists or engineers will be like, yes, we see your widget. How about that? We've seen it, sudden death, right? So I'm starting to see people, toolmakers, especially toolmakers from other countries and whatnot, starting to approach me to try to get some advice, to say, how do I come into your country without getting killed by your community? Because I'm trying to offer services in the privacy community, and credibility has to be earned. On that point of credibility, that's where I would encourage people to get into the arena. It is so hard to do all of this research and have all of this knowledge and be ignored, but you're going to be ignored unless you put it out there. You've got to get into the arena. So if all you do is read some guidance article and pull out some key takeaways, put that in a post and put that out on LinkedIn and just wait. The community will support you. The community will correct you. The community will be there for you. The community may just ignore you, but you've got to start showing up and saying, here's what I know, here's what I can do, so that you can participate in the community. That's how you build credibility. You also build personal relationships, and you're more likely to get a job from another human than from an algorithm. And so it's the hardest thing to just get into the arena, but you've got to let it go. You've got to just jump in and not worry about it. Just hit post and walk away from the computer, come back later and see how you did. But getting into the arena is the hardest part, and you're the only one holding you back. So if people want to get into this discussion, I welcome you.
Please get into this discussion. If you're bringing a new tool to market, you'd better know some other people already in this market so that you can either get some support or have some advice on some of the gaps in your tool development or your tech stack before you go live with that. So subject matter experts are available in our field. That's something that founders need to take advantage of as well. The community has toolmakers, founders, VCs, data scientists, ethicists. It's not just privacy pros. Our community has all of the necessary people with the keys to unlock the answers to a better digital future. The busier we are, the more people join us and get more training, and the more we keep talking about what we want our goals to be, the more we're being heard. And the more of us who join in and do this work, the better off we're all going to be.

Jamal:

Absolutely. And some amazing tips. I especially love what you were saying about credibility there and getting into the arena. It's just about posting your takeaways. And one of the things we encourage all of our mentees to do at the Privacy Pros Academy, when they've completed a session with me, is to create a post. They share the key takeaways on LinkedIn with what they've learned. And as you said, it helps them to stand out, it helps them to create those networking opportunities, to foster those great relationships and work on exciting projects, and also get spotted by hiring managers and recruiters as well. And on this particular cohort of the Accelerator program, I'm not sure if you've come across them, Heidi, but these guys have really taken this to another level. Every week, every single one of them goes and creates a video and posts it on LinkedIn, and they talk about their experiences, they talk about what they've learned, they talk about why it's important, how they feel, and they're making massive waves. And I'm so proud of every single one of them, because you should have seen their faces the first week when I said, on this program, we're going to be making videos every week. They haven't even done ten videos yet, but they're absolutely loving it. Some of them have had opportunities coming to them to be on podcasts, to get on webinars. Debbie Reynolds even mentioned one of them in her newsletter. And then we've had one who's had three interviews from that. They said, I watched all your videos and we'd love to offer you a position. And it was 30% more than what she was currently earning. And she was like, you know what? Nah, I'm better than that. Thanks, but no thanks. But isn't it powerful to be in a position where you have such opportunities? And it all starts by taking action, getting in the ring and putting yourself out there. Just like you said, Heidi, belonging matters.

Heidi:

Belonging matters, it does. Belonging matters. Now that you've found your people, the opportunities are endless. That was one of my favourite parts of being in Washington last week: the connections that you could make. Because I collect smart people, and I have for years. And so I know how you think and I know what you're doing, and I think you two should meet, and great things happen for these people. And it's not about trying to find out how can I make money by making these connections, because a lot of people get hung up on that. And you miss the human value of the interaction of building relationships when you bring in money. So I think that's a great way to do it. They're giving testimonials about their experience. Other people see themselves and they see room for themselves there. It gives them permission to be vulnerable and say, I could do that too. So belonging matters. And as soon as you get in, if you can reach back and pull someone with you, I would appreciate that, because if you're excited about it and you talk to other people, they'll be excited about it. So for all the new people that are coming in, please bring your friends. We just need as many people as possible. I wish I had more time to do more teaching, but I've had to pull back on that because I'm just swamped with other things. But that's why I'm glad that you are out there teaching, and not just teaching, you're teaching the skills they need so that they can do the work the right way the first time because they took the class with you, instead of doing the work and then having to redo it later. So your professionals are standing at a different level than some other people who were just declared DPO. I feel for those people. I am concerned for those people who have now been designated DPO or designated Ethics Officer or something.
That's a terrible position for people, because otherwise they have to leave their job or learn all of these new things. So I feel bad for them. And I'm glad that your training programs offer something so that people aren't going to find themselves in that position. Like your candidate who was just like, no thanks. Let the demand build before you take an offer. I like that. So apply the training that you've got. Yeah, I think it's great. I love what you're doing. Thank you for that.

Jamal:

Thank you very much for the acknowledgment. And yeah, we are definitely revolutionizing how data privacy is done and taking it to a world-leading level. And thank you so much for the kind compliments, but I can't take credit for that. I'm just the guide. It's the amazing, driven privacy professionals who come and say, I want to be better, I want to be the best I can. I'm going to put in the effort. I'm going to take the action. I'm going to do whatever it takes. Jamal, just guide me. So I can't take any of the credit for that. It has to be every single mentee who comes and says, I want to be part of the Privacy Pros Club. And we just help them to be their best selves. Sometimes they need a little bit of help with confidence. Some people suffer from impostor syndrome. Others see how much value they've already built as a business analyst, in IT, as a journalist, whatever it is, and they bring all of those skills to the table. And the biggest fear they have is, oh, I don't want to start again. They're looking for a career change, this privacy looks interesting, but they don't want to start again. And I say to them, you're not starting again. You're already coming with all that value. All we're going to do is stack more value on top of that by helping you understand how to do privacy in a pragmatic way that actually goes beyond tick-box templates and exercises and actually makes a massive difference, so everybody can win. And together we can actually make sure that every woman, every man, and every child will be able to enjoy freedom over their personal information, wherever they are in the world. And that's what every single person who signs up with the Privacy Pros Academy is committed to. We're committed to the main cause. They just energize me, my mentees. I love them so much. They energize me so much.
And if you're listening and you also want to have that community around you, and you also want to be empowered to be a world-class privacy professional or privacy leader, then drop me a DM, get in touch with us, reach out and see what we can do for you. There's nothing wrong with just DMing me a message. I'm more than happy to respond. I've got great resources I can share with you. Heidi is also there. She's always on LinkedIn creating all those provocative posts, trying to spark conversations. That's what she's trying to do. She's trying to get people like me and you into those conversations, so we can start talking about these things, we can create awareness, and ultimately, by doing that, we will work towards making that huge difference that Heidi and me and so many other privacy pros around the world are working really hard for, because we actually care about doing the right thing for people.

Heidi:

One more caveat about being in the arena: cutting people down. I try to provide constructive criticism in a here's-the-problem, here's-the-solution type of way. I don't just come in and say, this is garbage and I hate you. I give you something, a call to action on that. That's an important part of being in the arena. We don't want people just there complaining. If that's what you want to do, then you should save that for Twitter and leave it out of LinkedIn. We don't work that way. Bring something with some value to it, and be careful about what you're putting out there if it's just all negative all the time. Ruth Bader Ginsburg said, fight for the things that you care about, but do it in a way that will lead others to join you. So that is my entire career, and I feel that is how I build communities and find support for what I'm doing. I'm passionate about what I do, but it's only effective if I put it out there in the arena and I do it in a way that inspires other people to join me. So get into the arena, but don't just go in there throwing rocks.

Jamal:

I love that. I love that. One of the assumptions we adopt on the Privacy Pros Accelerator Program is the mindset that every single person is doing the best they can with the knowledge and resources available to them. And it's up to us as world-class privacy leaders to go and educate and empower them with kindness, to bring them up to the level that's required to solve whatever challenge we're facing that day. Heidi, thank you so much for sharing all of your valuable tips on this podcast. It's been an absolute pleasure having you. We spoke about AI, we spoke about US politics, we spoke about so many different things, and you also shared some amazing tips for aspiring privacy professionals and for professionals who want to take their career to the next level. It's been an absolute pleasure having you, and I look forward to catching up with you again soon.

Outro:

If you enjoyed this episode, be sure to subscribe, like and share so you're notified when a new episode is released.

Outro:

Remember to join the Privacy Pros Academy Facebook group where we answer your questions.

Outro:

Thank you so much for listening. I hope you're leaving with some great things that will add value on your journey as a world-class privacy pro.

Outro:

Please leave us a four- or five-star review.

Outro:

And if you'd like to appear on a future episode of our podcast or have a suggestion for a topic you’d like to hear more about, please send an email to team@kazient.co.uk

Outro:

Until next time, peace be with you.
