#175 Ethical Data Usage - Informing and Educating Consumers - Interview w/ Esther Tham
Episode 175 • 1st January 2023 • Data Mesh Radio • Data as a Product Podcast Network

Shownotes

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/

Please Rate and Review us on your podcast app of choice!

If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here

Episode list and links to all available episode transcripts here.

Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.

Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here. You can download their Data Mesh for Dummies e-book (info gated) here.

Esther's LinkedIn: https://www.linkedin.com/in/esthertham/

In this episode, Scott interviewed Esther Tham, Experience Designer at Thoughtworks. Scott reached out to talk about data ethics based on a post Esther made on LinkedIn.


Some key takeaways/thoughts from Esther's point of view:

  1. When designing a user experience (UX), companies should aim for as little friction as possible during sign-up or transactions. For an ethical company, that means collecting as little information as possible while still maximizing the value of the service to the user.
  2. Companies: if you don't need it, don't collect it! Collecting unneeded data isn't ethical, and it also increases your attack surface for a data leak and potentially lowers consumer trust.
  3. We don't have many proof points of companies doing the right thing and disclosing, in an understandable way, the potential issues of sharing information with them. Doing so would likely increase consumer trust. But is that trust worth more than the hassle to a company? We need companies willing to try being more ethical to really know, but it's a cost with a very uncertain upside, so that's not too likely.
  4. People need to learn that their personal data has value - and risk - associated with it. Don't hand it over without thinking about how it might be used or misused. But most people are nowhere near that thought process yet. Right now, most people are at most worried about getting scammed, not whether a company should have their data and how the company might misuse it.
  5. Ethics isn't just about collection or even usage, protection is also crucial. If you can't protect sensitive information, you shouldn't be collecting it.
  6. How can we encourage the general population to really care about ethical collection and use of their data? Is it just better explanation of how it's used? With greater understanding, will most people actually care?
  7. The question of who is responsible for ethical data collection is an interesting one. On the one hand, companies should behave ethically. On the other hand, they often don't, so how much responsibility falls on the consumer to protect sensitive information by not handing it over in the first place?
  8. A designer's role is to advocate and build for the user. But we still don't really know exactly what most users want when it comes to data ethics. Do they really care about ethics around their data or are they willing to trade their data for services? Is it about more education/communication or do users genuinely not care? We need brave companies willing to test.
  9. How do we press companies to be more ethical in the data they collect, how they protect data, and how they use data? Have many companies suffered damage - reputational or otherwise - from ethics breaches? We clearly can't trust every organization...
  10. On the flip side, how do companies that actually are doing the right things ethically communicate that to consumers? Do consumers value high ethics enough to seek out companies known for them? Is the cost of behaving ethically worth it - does it result in a tangible benefit? We assume behaving ethically carries an additional cost too, so there needs to be an upside for companies to consider it.
  11. It's easy for consumers to have a false sense of security online relative to their data. While identity theft and similar issues are on the rise, companies are still asking for - and consumers are still freely giving - sensitive PII.
  12. Very few people really think about potential misuse of the data we give to private companies, often with little explanation from those companies about what they will use our data for. Can we really expect companies to fully explain their projected use of data when that might simply confuse more people? We can press them to do it, but we likely can't expect them to do it willingly.
  13. However, when companies do try to explain their use of data, does anyone read it? Are EULAs actually useful? Do we need something in addition to a EULA to explain what will be collected and how it will be used?
  14. Most people don't seem all that concerned about the data they share until it appears to have been used improperly, especially if a company sold their data to a partner or some scammer got ahold of it. And most don't expect a scam to happen to them, so they only react after the data is already out there. Can we change their approach and view?




Esther started the conversation with her background growing up in Singapore, one of the safest large cities in the world. But that kind of environment can also lull people into a false sense of security when transacting online. Even the government was not really thinking about misuse: until 2006, a person's birth certificate, passport, and national ID numbers were all the exact same number. And until 2016, companies could ask for your full national ID number, and many people gave it out without a second thought to get access to services.


The Singapore government has pretty stringent requirements to only collect information for legitimate purposes and store it securely. The few incidents of government employees accessing the private information of others have led to strong repercussions. Basically, it's not like many other countries with potentially strong laws but little to no enforcement. When it comes to private companies, though, even in Singapore a number are still asking for a significant amount of data without giving clear justification or expected usage.


For Esther, creating a low-friction user experience is sometimes a bit of a double-edged sword relative to ethical collection. Do you store someone's credit card information to make transacting easier next time? Do you try to collect as much information as possible to show them relevant ads? Let's be honest though, that second one is about selling ads, not a 'better ad experience'… So how do companies effectively balance collecting enough information to provide a great experience without collecting things they don't need?


A fantastic point Esther brought up is that data ethics isn't only collection and usage, though that's often where we focus. Data protection, especially around sensitive information, is a major ethics challenge/question. If you can't protect the data, should you really be collecting it? If you are in possession of sensitive data, what level of duty do you have to protect it? Are you salting and hashing your passwords and other sensitive information?
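
As a quick, concrete illustration of that last question - a minimal sketch, not something walked through in the episode - here is what salting and hashing a password looks like in Python using only the standard library. The function names and the iteration count are illustrative assumptions, not recommendations from the episode:

```python
import hashlib
import hmac
import os

ITERATIONS = 600_000  # illustrative work factor to slow brute-force attempts

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return a (salt, digest) pair to store instead of the raw password."""
    # A unique random salt per user means identical passwords don't produce
    # identical hashes, which defeats precomputed (rainbow table) attacks.
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the digest from the attempt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(candidate, digest)
```

The point: the raw password is never stored, so even if the stored (salt, digest) pairs leak, an attacker still has to brute-force each password individually.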


Again, Esther believes we have to think about who the burden is on to protect information. If a company collects it, presumably the burden should be on them. But if people willingly give up very sensitive information to companies, what level of responsibility do we place on people to be informed and smart about giving their data away? Who gets the 'shame on you' the second time something happens? And how do consumers protect themselves from companies like the US credit bureaus that collect information without consent - especially when those companies haven't been ethical in their level of data protection?


Another interesting point Esther raised is how companies can properly explain what data they will collect and how they will use it. Be honest - show of hands - how many of us actually read through most of the EULAs we agree to? Is that because they are insanely tedious, or because we genuinely don't care or have resigned ourselves to our data being misused anyway? Is that just a friction point in the onboarding experience?


In Esther's view, most people don't seem all that concerned about the data they share until it appears to have been used improperly, especially if a company sold their data to a partner or some scammer got ahold of it. And most don't expect a scam to happen to them. So consumers need a better way of understanding their information attack surface and should not give information away as freely - but it's mostly on companies to not ask for it. And if they can profit off it, can we really expect companies to stop asking? Is this a chicken-and-egg scenario where neither side is really going to move first?


If a company really does a good job of disclosing the potential risks of handing over data and exactly how it will be used, that presumably increases customer trust, according to Esther. Unless people simply don't want anyone to use their data and are happier when they don't have to think about it. And then, for a company, is that increased trust worth the hassle, or even the potential friction and scaring off certain users? It might be the 'right' thing to do ethically, but how many companies are _really_ focused on acting ethically? We aren't sure how high the cost is for companies, we aren't sure it leads to anything all that positive, and we aren't sure consumers really care… so why would a company do it unless they have to, or genuinely feel the need to act ethically? Are any companies brave enough to test it out?


Esther wrapped the conversation up with a call to action: understand that your data has value, and consider whether giving it away is worth the value a company is providing you - and how it could be misused. That doesn't mean be paranoid, but also don't give out information quite so easily.


Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/

If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/

If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here

All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf
