The Groundbreaking Book Redefining Privacy By Design
Episode 93 • 14th November 2023 • Privacy Pros Podcast • The King of Data Protection - Jamal Ahmed
Duration: 00:35:58


Shownotes

Unlocking the Power of Privacy by Design: An Exclusive Conversation with R. Jason Cronk, Author of the IAPP Textbook "Strategic Privacy By Design"

In this episode, we have seasoned privacy engineer, developer, lawyer and author R. Jason Cronk on the show. Jason gives a comprehensive breakdown of the essence of privacy by design, the nuances of privacy threat modelling, and the differences between normative and tangible privacy harms, including how companies often overlook the former.

By the end of this episode, you'll learn:

  • How to identify and model privacy threats
  • The essential skills for mastering privacy by design
  • The key qualities that define a successful privacy leader

Don't miss your chance to learn the ins and outs of privacy by design!

With over two decades of experience in privacy and trust consulting, R. Jason Cronk is a seasoned privacy engineer, developer, lawyer and author of the IAPP textbook "Strategic Privacy by Design".

He is also the founder and president of the Institute of Operational Privacy Design, a non-profit organisation of privacy professionals which seeks to define and drive the adoption of common and comprehensive standards to protect individuals’ privacy. His knowledge and involvement reach across the spectrum: he is an active member of the academic, engineering, legal and professional privacy communities and a pioneering voice in the development of privacy by design. Whether it is writing books, developing models and frameworks, or training companies and individuals alike, he is tirelessly advocating for privacy across the world.

If you're ready to transform your career and become the go-to GDPR expert, get your copy of 'The Easy Peasy Guide to GDPR' here: https://www.bestgdprbook.com/

Follow Jamal on LinkedIn: https://www.linkedin.com/in/kmjahmed/

Follow Jason on LinkedIn: https://www.linkedin.com/in/rjc06c/

Subscribe to the Privacy Pros Academy YouTube Channel

► https://www.youtube.com/c/PrivacyPros

Join the Privacy Pros Academy Private Facebook Group for:

  • Free LIVE Training
  • Free Easy Peasy Data Privacy Guides
  • Data Protection Updates and so much more

Apply to join here whilst it's still free: https://www.facebook.com/groups/privacypro

Transcripts

Jason:

I've known, and many privacy professionals have known for years, that the Facebook model did not match GDPR. They kept trying to argue: Oh, it's legitimate interest. No, it's not legitimate. Okay, it's contract. No, it's not contract. Oh, it's consent. No, there's no way one checkbox is going to give you consent for everything you're doing.

Intro:

Are you ready to know what you don't know, Privacy Pros? Then you are in the right place. Welcome to the Privacy Pros Academy podcast by Kazient Privacy Experts, the podcast to launch, progress and excel your career as a privacy pro. Hear about the latest news and developments. Discover fascinating insights from leading global privacy professionals.

And hear real stories and top tips from the people who've been where you want to get to. We've trained people in over 137 countries and counting. So whether you're thinking about starting a career in data privacy, or you're an experienced professional, this is for you.

Jamal:

Hello, and welcome to another episode of the Privacy Pros Podcast. I'm your host, Jamal Ahmed, founder and lead mentor at the Privacy Pros Academy. And I'm thrilled to have you joining us today, because today we have a very special episode with a very special author. By the end of this episode, you will gain a deeper understanding of what Privacy by Design is, tips on how you can implement Privacy by Design better in your organization, and how Privacy by Design can set you apart from the rest of the privacy pros and transform you into a privacy leader. So, to talk more about Privacy by Design, here is the person who wrote the book on it.

With over two decades of experience in privacy and trust consulting, R. Jason Cronk is a seasoned privacy engineer, developer, lawyer, author of the IAPP textbook Strategic Privacy by Design, and founder and president of the Institute of Operational Privacy Design, a not-for-profit organization of privacy professionals which seeks to define and drive the adoption of common and comprehensive standards to protect individuals' privacy. His knowledge and involvement reach across the spectrum: he is an active member of the academic, engineering, legal, and professional privacy communities and a pioneering voice in the development of privacy by design. Whether it's writing books, developing models and frameworks, or training companies and individuals alike, he is tirelessly advocating for privacy across the world. Welcome, Jason.

Jason:

Yeah, thanks for having me. I'm excited to be here.

Jamal:

We're excited to have you. Let's get into Privacy by Design. First of all, what is Privacy by Design?

Jason:

So a lot of people think of privacy by design as basically being proactive about privacy: when you're designing products, services, or even business processes, you think about privacy up front. That's absolutely true, but there's a lot more detail to it.

So when I got into the concept and the idea, back when Ann Cavoukian was the Information and Privacy Commissioner of Ontario, she had these seven principles of privacy by design. But you would turn those over to engineers and they would say, I don't know what to do with this. This is great, but I don't know what to do with it. So I started investigating, saying, what do we actually need to do privacy by design? There's a lot more that goes into it in terms of actually understanding risk, understanding the harms of what you're doing, understanding the controls and mitigations you can put in place, understanding the competing values and quality attributes of a system.

You may be able to increase privacy, but maybe that's going to reduce usability, and you have to balance these competing qualities in designing what you're going to do. So I think there's a lot to unpack in privacy by design, but the very simple answer is it's being proactive and thinking about privacy as one of the attributes you want in your product or service.

Jamal:

Okay, great. Thank you for explaining it. And I guess, based on the explanation you've just given, it makes sense why your book is entitled Strategic Privacy by Design. It's taking those principles that Dr. Ann Cavoukian came up with and then saying, how can we strategically adopt and implement this in the business in a way that actually makes sense.

So let's find out a little bit more about that.

Jason:

Actually, I want to emphasize that a lot of the things I use are leveraged from people who came before me, who were smarter than me, who thought about these things in different ways. I don't want to downplay their contribution. My contribution was putting it all together.

So the book title, Strategic Privacy by Design, came exactly from that. Everybody at the time knew Ann Cavoukian and the seven principles, so I didn't want to just call it Privacy by Design, because I didn't want to take that from her. But also, my methodology leverages Jaap-Henk Hoepman's privacy design strategies and tactics. So I was like, okay, we're taking strategies and we're taking privacy by design, so let's put them together to come up with a systematic method of how we're actually going to do this.

How are we actually going to design privacy into an enterprise? We all know the end result is going to be adherence to the seven principles, or adherence to a quality of privacy. But what is the strategy? What is the methodology? How are we going to do it systematically? And that's what I think was lacking.

And that's where I grabbed all the components from all these different places and put them together in what I think is a cohesive process.

Jamal:

Thank you for explaining that. It's very modest of you, giving credit where it's due, and that's very commendable and respectable. Thank you. Now, one of the things that you talk about in the book, and one of the things that you actually deliver in some of the programmes at the Institute of Operational Privacy Design, is privacy harms. So what I want to know is, what do you mean by privacy harms, and how do we actually identify them?

Jason:

So again, this is where I'm leveraging people who have come before me, who have worked on these sorts of things. I primarily use Professor Daniel Solove's taxonomy of privacy, in which he breaks privacy harms into four categories: information collection, information processing, information dissemination, and invasions, which includes intrusions, like spam or people coming to your door, and decisional interference.

That is, interfering with people's autonomy. And there are many different models of privacy harm: Woody Hartzog, in his book Privacy's Blueprint, talks about three pillars of privacy. You have Alan Westin and his four states of privacy. You have Ryan Calo with subjective and objective harms.

The thing I really like about Solove's taxonomy is that it is comprehensive. In other words, I haven't found anything that people would consider a privacy violation that can't fit into one of his buckets. But it is also granular enough to work with. In other words, you can look at a specific harm, like aggregation or breach of confidentiality.

And you can say, okay, where in my design is there a potential for this? Is there a risk of this happening? And then how do I mitigate that? So it's comprehensive and granular enough that I find it very workable. Certainly it's not the only one; people can work with other models.
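To make that "comprehensive but granular" point concrete, here is a minimal sketch of the taxonomy as a simple lookup table in Python. The category and harm names follow Solove's published taxonomy; the encoding itself, including the harms_to_review helper, is purely illustrative and not from Cronk's book or any library.

```python
# Sketch: Solove's taxonomy as a review checklist. Category and harm names
# follow the published taxonomy; this encoding is illustrative only.

SOLOVE_TAXONOMY = {
    "information collection": ["surveillance", "interrogation"],
    "information processing": ["aggregation", "identification", "insecurity",
                               "secondary use", "exclusion"],
    "information dissemination": ["breach of confidentiality", "disclosure",
                                  "exposure", "increased accessibility",
                                  "blackmail", "appropriation", "distortion"],
    "invasions": ["intrusion", "decisional interference"],
}

def harms_to_review(taxonomy=SOLOVE_TAXONOMY):
    """Flatten the taxonomy into a checklist: for each harm, ask
    'where in my design is there a potential for this?'"""
    return [harm for harms in taxonomy.values() for harm in harms]

print(harms_to_review())
```

Walking a design against each entry in that flattened list is essentially the review loop being described.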

Now, Professor Solove did produce a follow-on to that with Professor Danielle Citron, a typology of privacy harms. And the only thing about that is, I like it, but it conflates what I consider normative harms with tangible harms. And there's a difference there.

So if I were to come in and install a surveillance camera in your house that you didn't know about, we would all agree that it's a privacy violation. But a lot of people would say, what's the harm, right? I'm not blackmailing you. I'm not trying to extract money from you. I'm not selling it on the Internet.

I'm not doing anything. But we would all agree that it's creepy and it's a violation of your personal space, a violation of your privacy. It is a normative harm; it is a violation of our social norms. And of course there are tangible harms that can come from that.

Again, I could turn around and blackmail you if I found something incriminating. I could sell it, or put it out on the Internet and potentially embarrass you. I could do a lot of things that could potentially have tangible consequences.

But there's a difference between the tangible consequences and the normative harm. And a lot of times, companies will focus on the tangible harms and forego thinking about the normative harm. A great example I use is from Facebook and one of the lawsuits. I think it might have been around Cambridge Analytica, or it might have been one of their other lawsuits.

Their phone app was collecting people's SMS messages, and in the email exchanges the executives were like, hey, people won't like this. They'll be upset that we're doing this. They'll be concerned that it's a violation of their privacy. What can we do? Oh, let's try to suppress the pop-up message from Android that tells people we're doing that.

So it wasn't the underlying harm they were trying to prevent. What they were trying to prevent was the tangible consequences: people being embarrassed, upset, deleting the Facebook app, changing their behavior, et cetera. But they still weren't addressing the normative harm.

Again, when you talk about privacy harms, I talk about normative harms. A lot of normative harms, when they reach enough momentum, when there's this kind of discussion in society of whether this is something society agrees with, become instantiated within law.

And so they become a legal requirement. But with technology changing and everything, there are a lot of norms around our activities that are still developing. What's the appropriate use of information for behaviorally targeted advertising?

It's a little bit more settled in Europe, but in the United States it's still a question of, does society agree with it? If you ask the advertisers, they say, yeah, everybody's okay with it. If you ask people on the street, they're like, no, I don't want to be tracked and targeted.

So we're still fleshing that out, and that's where the law can step in and put a line in the sand and say, this line is okay to cross, this line is not okay to cross. But there are also a lot of social norms that aren't codified in law. Think about schoolyard antics.

If a kid walks up to another kid and whispers something in their ear, and then that kid turns around and announces it to the schoolyard, that's a violation of their social norms. That's why they whispered in the ear: they didn't want it disclosed to everybody. But it's probably not a legal violation.

Jamal:

Thank you for explaining that. And I guess normative harms can quite easily be overlooked by companies. So, for example, you were talking about if you came and put a camera in my house. Okay, you might not be doing anything with it, but the fact that I know the camera's there is going to alter my behavior; it's going to moderate the way I think about doing…

Jason:

But here's the thing. Those are the tangible consequences: that you've changed your behavior, that you're upset, annoyed, whatever, because you know about it. So if I'm trying to address those consequences, what do I do? I hide the camera better. I make it a pinhole camera.

So it's harder to find. I encrypt the data, so if you're able to see that there's data going over your network, you can't sniff it and see that it's a camera connected to it. Or I put it on a separate network altogether. So there are all sorts of things I can do to reduce the tangible consequences.

But in a way that makes it creepier, right? Because I'm trying to evade your finding it. Like I said, this is what some companies like Facebook do. They address the tangible consequences and not the normative consequences.

Jamal:

Why do you think that happens?

Jason:

That's a loaded question. There's a lot of answers to that.

Obviously, companies are trying to make money, and there is money to be made in invading people's privacy, right? There's benefit to be had for them in doing things with data that people don't necessarily want done with it.

So I think that's the bottom line. But, it's funny.

I know you do some consulting. I don't know if you've run into this in the consulting world: a client calls you and says, hey, what do we need to do better from a privacy perspective? You tell them, here are all the things you need to do.

And they're like, okay, what do we need to do from a compliance perspective? You tell them that, and then they say, okay, what do we need to do so that the regulators don't actually go after us, right? So they're lowering the bar. It's not that they refrain from the behavior because it's ethically questionable or, again, goes against social norms.

It's that they want to do as much as they can until somebody calls them out on it, and do anything they can to prevent somebody from calling them out. Again, if consumers don't know, then no harm, no foul, right? If you don't know that I'm watching your every move in your house, who cares, right?

You weren't hurt. So, because of that money-making, there's an incentive to try to do as much as they can and not get called out on it.

Jamal:

Okay, that does make sense. And what you've just described does sound very familiar from lots of consulting scenarios.

Jason:

Yes

Jamal:

It's not that they actually don't want to do the right thing. It's that when they start looking at all of the things they need to do, because they were so immature to begin with, it starts becoming quite onerous. They start thinking about budget and how it's going to impact other things, and so they just want to know the minimum they can do to, I don't want to use the words get away with it, but that's almost how it feels sometimes.

Jason:

Yeah, I think the thing is there hasn't been a reckoning in the privacy world yet. Or actually, in the business world. Privacy is great when you can stick it in legal and they can write some notices and sign off and say, we're doing everything right. But then it actually affects your business model: you are trying to make money doing X, and either the law or the social norms say X is not going to cut it. This is what we're seeing with Facebook and Meta and their ad model in Europe right now. I don't know about you, but I've known, and many privacy professionals have known for years, that the Facebook model did not match GDPR. They kept trying to argue: Oh, it's legitimate interest. No, it's not legitimate. Okay, it's contract. No, it's not contract. Oh, it's consent. No, there's no way one checkbox is going to give you consent for everything you're doing.

So they tried to make arguments, when anybody in the know, who has actually read GDPR and looked at what Facebook's doing, knows it didn't fit any of those models. They're just trying to stay alive, because the law runs counter to their business model. And I think that's one of the fundamental problems. You can still do your business and just layer on security, right? Stronger encryption, better access control, et cetera; that's not going to affect your actual business model. Whereas privacy comes in and actually affects your business model.

So I've been using the analogy recently that privacy is where industrial manufacturing and pollution were in the 50s, 60s, and 70s. If you think back then, a company manufacturing products or doing some chemical process would just stick their waste out the back of the factory into the local stream, polluting downstream, but they didn't have to internalize all those costs.

It's what economists call an externality. They were externalizing those costs. Those costs were being borne by society, by the downstream residents, by people far off in the future who got cancer and those sorts of things. And in the 60s, 70s, and 80s, governments started realizing, hey, we can't let companies externalize this cost.

We need to have them put in internal procedures to address it. And this is where privacy is, right? Companies have been utilizing people's data for decades and making money off it. But the cost, the harm, the risk is being borne by the individuals. It is an externality; the company isn't feeling it.

And now all these regulations are coming along and saying, hey, if you're going to do this process that has all these external effects on individuals, you need to put in place certain controls, just like manufacturing controls to make sure you're not putting pollutants out in the air. You need to put in place certain controls to reduce the risk to individuals and internalize those costs. And unfortunately, your business model of making a certain toy out of arsenic is not going to fly anymore, because you can't internalize the cost. You have to change your business model and find a new way to manufacture a new toy that doesn't use arsenic. It's similar with businesses that have, and I'm going to use air quotes here, "toxic" data practices. They can't reform. They have to find a new business to get into if they're going to meet their privacy obligations. And I think that's the fundamental thing we're struggling with right now: businesses saying, oh yeah, we can just hire lawyers and do privacy. And no, you're going to have to actually figure out whether your business can work in a privacy-friendly world.

Jamal:

That's a great analogy. And I think for a lot of people trying to get their heads around why this is so important, it will make perfect sense now. So thank you for that. That brings us nicely onto my next question, which is: what are privacy threats, and how do we model them?

Jason:

By the way, privacy threat modeling is a nascent, up-and-coming field. Ultimately, you're trying to come up with a systematic way of identifying who the threat actors are, what the potential threats are, and what the potential consequences of those threats are.

You look at the Solove taxonomy and ask, what are the activities that could create a harm under it? For instance, one of the harms is surveillance, which Solove defines as the watching, listening to, or monitoring of an individual.

Okay, so look at my product. I'm pulling up a phone. Is there any ability to watch somebody with this? Okay, it's got a camera. Who is the actor? An app developer, or a person using an app. Okay, is there any ability to listen? Yes, it's got a microphone.

Is there an ability to monitor? Okay, it's got an accelerometer, it's got a GPS. So here are all these things we would consider threat vectors, means by which a threat actor could end up producing the consequence of the person being surveilled.
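The enumeration just walked through, checking each feature of the product against each potential actor for a given harm, can be sketched in a few lines of Python. Everything here (the ThreatVector class, the feature and actor lists) is a hypothetical illustration of that process, not an established tool or Cronk's own methodology.

```python
# Sketch of the feature-by-actor enumeration for the surveillance harm.
# The ThreatVector class and the lists below are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class ThreatVector:
    actor: str        # who could act, e.g. an app developer
    capability: str   # the product feature enabling the threat
    harm: str         # the Solove-taxonomy harm it could produce

PHONE_FEATURES = ["camera", "microphone", "GPS", "accelerometer"]
ACTORS = ["app developer", "person using an app"]

def surveillance_vectors(features, actors):
    """Pair each sensing feature with each actor as a candidate vector for
    surveillance (watching, listening to, or monitoring an individual)."""
    return [ThreatVector(actor, feature, "surveillance")
            for feature in features for actor in actors]

for v in surveillance_vectors(PHONE_FEATURES, ACTORS):
    print(f"{v.actor} could use the {v.capability} -> {v.harm}")
```

Repeating the same loop for each harm in the taxonomy yields the systematic inventory of actors, vectors, and consequences described here.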

This is an interesting and advancing area of privacy. Remember, privacy is like 10, 15, 20 years behind security. I was talking to somebody at NIST about this, and they pointed out, look, NIST has been dealing with cybersecurity issues since like the 60s and 70s. Privacy is just now catching up.

So even security threat modeling is only a decade old, and privacy threat modeling is something very new and up-and-coming. But essentially, it is about trying to identify specifically who the potential threat actors are, and by what means they are able to do something that would result in a consequence to an individual.

One of the things I always like to emphasize, to differentiate this from security threats: when I think of a threat actor, that could be the organization itself. It could be vendors or other departments within the organization. We typically think of security threat actors as actors working in contravention to the organization's activities.

Like, I've got an app and somebody is trying to hack into it, or somebody is trying to misuse the app for something different from what it was intended for. Those are more security threat actors. Privacy threat actors are organizations or entities that are using the product as it's intended, but with these privacy consequences resulting. And it's a stark difference, because for the longest time, and I used to rail against this, people thought of privacy threats as things like data breaches.

No, that's a security threat. The privacy threat is the organization's marketing department taking the phone number that was used for two-factor authentication and now sending out SMS marketing messages. It's furthering the organization's mission of growing the company, but it is using data for a purpose for which it was not collected, a secondary use. And it is intruding into people's personal space by sending spam messages that they didn't ask for and didn't want. So these are the privacy threats, and the threat actor is the marketing department, not some external hacker.

Jamal:

Got it. Thank you for explaining and clarifying that; it makes it super clear. And I think it's very exciting, because this means this is an area with a lot of space to grow and develop, and people can come and really make an impact by getting stuck into really understanding privacy threat modelling.

Jason:

Absolutely.

Jamal:

Okay. And I know at the Institute of Operational Privacy Design you have a number of courses where people can come and actually upskill and get certified in some of those. Can you tell us more about that?

Jason:

Yeah, so pre-pandemic I was doing corporate training, but it was exclusively in person. And obviously the pandemic happened, so I started to move that online, and I started to see a demand for more discrete courses, not necessarily just my entire privacy by design offering, but more individual courses on different engineering or design topics: privacy risk modeling, privacy threat modeling. And you mentioned certifications; my training is not necessarily a certification course. It is to, as you said, upskill. It's to gain more knowledge and be better. But a lot of people are really focused on that certification and getting those letters after their name, not necessarily on understanding the material.

So there's a disconnect between what I think people need to learn and what the market is asking for, and I'm still struggling to find that proper balance of giving people what they want. An example is my courses, the individual lessons tend to be 45 minutes to an hour.

And I know another trainer, a company that provides training material for companies, whose courses are like 10 minutes long. They have a course on DPIAs that's just seven minutes long. You can't really learn anything other than what the letters DPIA stand for in seven minutes, right?

So it boggles my mind. Is this really even useful? Because it's not going to teach people how to do a DPIA; it's just going to be seven minutes of their time. And companies do buy that; companies want to check the box and say, we trained our people on privacy.

But ultimately, did they actually train them? Or is it just a performative compliance activity?

Jamal:

I completely hear what you're saying there, Jason, and I feel exactly the same way. That's one of the reasons why I started the Privacy Pros Academy: there were so many people looking to come into the industry who needed to upskill and get certified, but some people have a mindset of, I just need these letters after my name to help me land the roles. And then you've got these materials being provided by the institution, and people hiring a lawyer to come and read the slides out. Like you said, the DPIA section, if we're talking about the same thing, lasts about seven minutes, and all it says is: this is what a DPIA is supposed to do, this is when you need to go to the regulators, and here are a couple of things you need to think about. That's it. That's not going to empower anyone to do one.

So one of the things we've actually done at the Privacy Pros Academy is move away from any ties we had with these membership associations, and what we're focusing on now is delivering in-depth content to help people actually become the go-to expert, so they know exactly how to conduct a data protection impact assessment and what they need to be looking for. When we talk about rights and freedoms, a lot of people don't even know what they mean. The rights and freedoms: what are you looking at? And some of the things you've mentioned today, the privacy threats, the privacy harms: when you take that and incorporate it into your DPIA framework, it makes it a hundred times better. It really helps businesses not just protect their reputation, but also build trust and show that you've really thought about the customer, and you've now got a solution that is much stronger than it would have been otherwise.

Jason:

I like the way you tied that together because that's absolutely the case. I was talking to somebody about DPIA training and I started talking about rights and freedoms and assessing the likelihood of violations of people's rights. And they're like, that's not what we do for a DPIA.

And it's, yes, that's absolutely what you're supposed to be doing for a DPIA. Look at the GDPR: it says, I can't remember the exact phrasing, but assess the likelihood of violations of people's rights and freedoms. And what are the rights and freedoms?

You look at the EU Charter of Fundamental Rights and those sorts of things. And people don't think about that. I remember, this was when GDPR first came out, a DPIA that was essentially three items. There were three items in the data protection impact assessment.

Are we targeting large numbers of people? Are we… and I can't even remember the three, but there's…

Jamal:

…criteria. Yes.

Jason:

Yeah, that's right. And that's all they were looking at. But no, those are just examples, right? They're just trying to provide some illustrative examples.

And basically each was one row on an Excel spreadsheet. They had 100 items, 100 different products and services they were looking at, each product or service with a row, and it's, here's our DPIA. Does it target large numbers of consumers?

Does it do this? Does it do that? Nope. Okay, we're done. No high risk there. It's a fascinating exercise. And this is an interesting story: there was a company, and this was roundabout, I didn't talk directly to the company, but I talked to somebody who was talking to them about doing consulting work, and they had a backlog of 10,000 privacy impact assessments.

And they needed to hire an outside consultant to go through these, and I'm like, yeah, that's ridiculous. You can't do that with any level of granularity. It's a factory; you're just rubber-stamping things at that point. So Woody Hartzog, in his book Privacy's Blueprint, says that one of the, for lack of a better term, controls for reducing privacy harms is increasing transaction costs, increasing the cost of doing something for a business. A perfect example: the telephone sales rules in the United States.

They say you can't use an automated dialer to call out and spam people with telephone solicitations, right? Because that's really cheap and effective: I can get an auto-dialer and essentially call everybody in the United States in a matter of a day or days, because it can send out thousands and thousands of calls an hour.

So they said, no, you can still call people, but you have to have a human. Well, humans are expensive and they take time, and you can't do thousands an hour if you're using a human to make a solicitation call. So that is an increase in transaction cost. My argument is it's the same thing with the DPIA: if you're doing 10,000 different things that are potentially privacy-impactful, then maybe you do need to slow down. This transaction cost means you need to actually sit and think about it, and not try to, what I call, invade people's privacy at scale.
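A quick back-of-the-envelope calculation shows why requiring a human raises the transaction cost so sharply. All throughput and cost figures below are assumptions for illustration only, not numbers from the episode:

```python
# Back-of-the-envelope transaction-cost comparison. All figures are assumed.

autodialer_calls_per_hour = 5_000   # assumed machine throughput
human_calls_per_hour = 15           # assumed manual throughput
hourly_wage = 20.0                  # assumed cost of a human caller, USD

machine_cost_per_call = 0.01        # assumed pennies per automated call
human_cost_per_call = hourly_wage / human_calls_per_hour  # ~$1.33

print(f"automated: ~${machine_cost_per_call:.2f}/call "
      f"at {autodialer_calls_per_hour:,} calls/hour")
print(f"human:     ~${human_cost_per_call:.2f}/call "
      f"at {human_calls_per_hour} calls/hour")
# Requiring a human raises the per-contact cost by roughly two orders of
# magnitude, which is the brake on "privacy invasion at scale" above.
```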

Jamal:

Yes. Very insightful. Thank you, Jason. Now we spoke about privacy by design. What skills do we need to do privacy by design well?

Jason:

So I think there's a mix of skills. One is being empathetic: understanding that there are people at the end of the line. Now, I say that; obviously you don't want to be too empathetic, you still have to get work done. But you want to understand that when you've got tens of thousands of customers, millions of customers, these aren't just statistics.

There are actual people whose lives are affected by this. One of the things I run into a lot: I was speaking to a bunch of Python developers, talking to them about privacy, and afterwards a woman came up and said, I wish more companies were doing this, because she had been the victim of stalking and had to remove herself from social media and change all of her contact information. So it's about being empathetic and knowing that this is impacting people.

It's not just statistics. I think that's one thing. Two is, and I hate to say this, logic and mathematics. I run into this in privacy risk analysis and assessment, where even just a basic understanding of math and logic can be really helpful in doing assessments and identifying things. And it's unfortunate, but I know a lot of lawyers, very smart people, who went into law because they didn't like math. So now I come to them and say, oh yeah, we need to do some math here in order to figure this out, and they're like… But there's also so much in economics, behavioral economics, understanding cognitive biases. So psychology. Being really a jack of all trades can be helpful in understanding this.
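As a taste of the "basic math" being described, here is a toy expected-harm comparison for prioritizing two privacy risks. The scenarios and all numbers are invented for illustration and are not from Jason's risk model:

```python
# Toy quantitative comparison: expected individuals affected per year.
# Scenario names and all figures are hypothetical.

scenarios = [
    # (description, probability of occurring per year, individuals affected)
    ("secondary use of 2FA phone numbers for marketing", 0.80, 50_000),
    ("re-identification of 'anonymized' location data", 0.05, 200_000),
]

for name, p_year, affected in scenarios:
    expected = p_year * affected   # likelihood times scale of harm
    print(f"{name}: ~{expected:,.0f} expected individuals affected/year")
# 40,000 vs 10,000: the mundane secondary use outranks the exotic attack,
# something the simple arithmetic surfaces where gut feel often misses it.
```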

So it's really hard to name a specific skill set, but I think having knowledge in a lot of areas, and those personality traits or transferable skills of attention to detail, being able to pinpoint and pull out the important facts and information from what's going on, and then analyzing and using those, are extremely important.

Jamal:

You mentioned paying attention to detail, which is what I like to think of as clarity. One of the challenges we have is that lots of people, businesses, clients, even privacy professionals, say a lot of stuff, but you'll walk away not knowing what to do.

And sometimes when we say something to the business, they just nod, pretend they know what we're talking about, and move on.

What we have at the Privacy Pros Academy is our C5 methodology. And we say the first part of that C5 methodology is clarity. You need to be clear, and you need to understand clearly what the situation is, what the conversation is, because without that clarity you won't have the second C, which is confidence. How can you go and address a problem or offer a solution if you haven't even understood the problem well? And how do you expect somebody to have confidence in what you're saying when you haven't clearly explained what they need in order to logically figure out how this slots in? If you can't explain something clearly, and there's no confidence in what you're saying, then how do you expect people to take you credibly? So we have the C5 methodology that we run across, and that helps with some of the things you've spoken about, like the details. And the other thing you mentioned, which I think is a little underspoken and undersold, is having that well-rounded approach, where we have emotional intelligence, an understanding of psychology, where we understand cognitive biases, where we understand how things work, and bringing that holistic picture into the way you look at challenges and the way you come up with solutions. I think that is what separates the great professionals from the average and mediocre privacy professional.

So that was really helpful.

Jason:

Absolutely, I think that's important. It's certainly been important in my work, and privacy just touches on so many other facets. Obviously we've got law and regulation. We've got IT and technology. We've got psychology, how people behave. We've got risk assessments and understanding statistics, likelihoods and impact. We've got behavioral economics and how people make calculations in their heads as to what they do. There is just so much there. One of my character traits is I tend to be a generalist. I'm good at privacy, but for the most part, I know enough about a lot of different topics to be dangerous.

Jamal:

Thank you for sharing, Jason. I want to say thank you very much for your time and for sharing all of these valuable insights and gems. So, we covered privacy harms and how we actually identify them.

We said we need to look for both the normative harms and the tangible harms. Then we spoke about privacy threats, and we said that what we need to think about when we're thinking about privacy threats is actually the privacy threat actors. And when we talk about privacy threat actors, we're not just talking about them from the security lens.

It could be anyone who actually has access and is able to use this, so we need to think about it from that holistic point of view, and we need to think about the consequences. You gave some great tips on what we need to do to become better privacy professionals, especially when it comes to privacy by design: skills like empathy, having that logical approach, being able to pay attention to detail, and having that well-rounded background, so we can hold conversations and work to a level of competence that enhances the value we bring to our clients and our employers. That's really going to help us have a thriving career. Jason, it's been an absolutely fantastic episode. Thank you so much. Are there any last words you'd like to share with our listeners before you go?

Jason:

Thanks for having me. It's been a great conversation. We could have gone for hours and hours. If anybody has any questions about privacy by design, please feel free to reach out to me. I love discussing this and I love sharing my knowledge.

I view my role part-time as a privacy professional who is doing this as a job, and part-time as a privacy advocate who wants to see more privacy in the world. So I have those dual roles. I do want to mention the Institute of Operational Privacy Design. You did mention it up front, but we are a professional organization dedicated to creating standards and certifications around privacy by design.

We'd love to have new members and new volunteers; there are lots of opportunities there. So if anybody is interested in learning more about the organization, please reach out. We are also looking for beta companies who want to apply our standard and potentially get certified as some of our initial companies on board.

Thank you.

Jamal:

Okay, and if you're looking for more details on those, we will link the website in the show notes and also in the description below. So make sure you go check it out.

Jason:

Thanks, Jamal.

Outro:

If you enjoyed this episode, be sure to subscribe, like and share so you're notified when a new episode is released. Remember to join the Privacy Pros Academy Facebook group where we answer your questions. Thank you so much for listening. I hope you're leaving with some great things that will add value on your journey as a world class Privacy Pro.

Please leave us a comment and a four or five star review. And if you'd like to appear on a future episode of our podcast, or have a suggestion for a topic you'd like to hear more about, please send an email to team@kazient.co.uk. Until next time, peace be with you.
