#98 How to Nail Your Data Mesh Vendor Assessment: A Journey Story - Interview w/ Jen Tedrow
Episode 98 • 7th July 2022 • Data Mesh Radio • Data as a Product Podcast Network
Duration: 01:09:42


Shownotes

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/

Please Rate and Review us on your podcast app of choice!

If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here

Episode list and links to all available episode transcripts here.

Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.

Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here.

Jen's LinkedIn: https://www.linkedin.com/in/jentedrow/

In this episode, Scott interviewed Jen Tedrow, a Product Management Consultant at Pathfinder Product Labs who is currently working with a large client on a data mesh implementation. She was only representing her own perspective in this episode.

Some key takeaways/thoughts from Jen's point of view from the conversation:

  1. A data mesh vendor assessment is likely to be different than almost any other vendor assessment you've done before, especially if you aren't evolving something existing. There is so much more to cover and the overall platform needs to meet your needs, integrate with how you handle data on the application side including integrations, comply with your governance standards, fit within budget, etc. That's a lot of needles to thread.
  2. Spend considerably more time doing the discovery process in your data mesh vendor assessment than you would for a normal vendor assessment. There are a lot of potentially hidden needs / wants and it is far better to surface them early.
  3. By digging deep into stakeholders' desired outcomes, you can understand what you need to deliver - and you can also get insight into driving buy-in. Address the challenges preventing them from reaching their desired outcomes and they will feel seen and heard.
  4. As many guests have said, lead with empathy. Change is painful. But if you are realistic with people and make them feel seen and heard, it will be much less painful.
  5. When speaking with potential users, again really spend the time to make them feel seen and heard - reflect back to them what you heard. And have them share their ideal state. You may not be able to fully deliver on it but it's important to understand where they want to go.
  6. As you learn new information, share it in a continuous stream with stakeholders so they understand the recommendations you are making along the way and at the "end" of the assessment - the work doesn't really end when you finish the assessment, hence "end".
  7. Be prepared for there to be capability gaps - possibly significant - between what you want now and what is available in the market or that you are able to build in your budget. There are just a number of capabilities that aren't really part of any vendor offering at the moment. Waiting until everything is perfect will mean you are waiting for a while!
  8. It's crucial to focus on what you value most right now - and in the future - when making vendor assessments. There are so many nice-to-haves in a data mesh implementation but you'll need to compromise. Jen and team developed a good framework for evaluating whether offerings fit actual needs or just wants.
  9. The three most important aspects to meet right now for Jen and team through the self-serve platform were user experience/low barrier to usage, automation, and ability to easily integrate the tools together and with the existing stack.
  10. When figuring out what capabilities you need from your platform, create high-level, task-based use cases, not systems requirements. This will prevent you from steering too much towards serving any one specific use case or getting bogged down in tech instead of capabilities.
  11. Look to past instances of failed or underwhelming implementations of tools and processes in your organization to find the common ways implementations fail in said organization. And then work to avoid those :D
  12. It is crucial to make sure everyone understands a data mesh implementation is an iterative process. You will continue to listen to feedback and evaluate how things are working and then make further improvements - it's a journey!
  13. It will likely be quite tough to move forward in a data mesh implementation if you don't align your data strategy and work with your business partners to create a mutually beneficial target outcome. And carve out time for teams to actually be able to deliver data products and participate in your data mesh implementation - they need time to do the actual work. Make it a priority.
  14. It's very important to provide an easy way for teams to start participating in your data mesh implementation/journey. Just asking a team to participate won't cut it. Make it low friction, make it beneficial to them. Easier said than done but still very important.



Jen has done a number of vendor assessments in the past. But this one was a doozy. There isn't a ton of information yet about how to do data mesh really well - especially the platform side - so it is difficult to assess exactly what capabilities you need. And even when you do know what capabilities you require, there are still a number of gaps in vendor offerings, which makes it more difficult. Then add in that you are likely bringing on multiple new vendors at once and making sure they play nicely together and with your existing technology stack. And then there is budget... So, as mentioned, it was a doozy.


Overall, Jen's role was to account for four specific capabilities: data discoverability, provisioning, observability and quality, and access control.


For Jen, most vendor assessments have a much tighter scope around what you are trying to assess - e.g. looking for a streaming technology or an integration provider. But with a data mesh platform, there are so many moving pieces that it was a uniquely difficult challenge to find a harmonious match that covers as much of the current needs as possible while working with the vendors to grow together to cover future needs.


Jen used a framework she has historically applied to vendor assessments: Discover -> Align -> Assess. The goal is to answer what is important now and where we need to go - and then whether a given offering helps address both of those points. But the big difference with this assessment was how much more time was spent on discovery.


Discovery in general can be challenging for a few reasons, per Jen. One is that people want to move quickly - make the decision and plow forward. But that can often lead to picking a poor direction and creating hard-to-pay-down tech debt. So investing the time to understand all your needs/wants is crucial. Digging deep into stakeholders' desired outcomes has a few benefits: 1) you know what you need to deliver and 2) you know how to drive buy-in - by addressing the challenges preventing them from reaching their desired outcomes.


Jen stressed, as many many past guests have, the importance of leading with empathy when working on anything data mesh related. Change is on the horizon for everyone and change is painful. When leading with empathy, stakeholders and users alike have been very willing to share with Jen their challenges and where they want to go in the future with data.


"Why are we spending this time together?" is an important question to answer, per Jen. The people you interview will be more willing to openly share if you spend your time listening rather than selling. Make them feel seen and heard, spend the time to reflect back what someone said - let them know you weren't just hearing but listening. And talk with them about what is the current state and what is their desired state. And of course let them know that implementing data mesh isn't a threat to their jobs.


Per Jen, you may not be able to give everyone their desired state, especially in the initial implementation phase, but understanding why they want certain capabilities might make it easier to deliver something of value to them, if not the ideal. And then spend time with the stakeholders constantly sharing what you are learning so there isn't a sudden recommendation at the end of an evaluation phase - that constant feedback lets people know why you are making the decisions you are.


Right now, pretty much no matter what capabilities you are looking for in your data mesh platform as part of a vendor assessment, expect gaps per Jen. Zhamak has mentioned this frequently too. It is important to evaluate what is necessary and what is nice to have now and then what will be necessary down the road. Will the vendors or offerings you are looking at be able to grow into those gaps over time to meet future needs too?


One aspect that made this assessment so different for Jen was that vendor assessments are typically looking for a single or logically bundled capability and doing a vendor bakeoff. It is based on the "known knowns". But for data mesh right now, there had to be so much more discovery work. It was crucial to focus on what stakeholders really value. For Jen and team, right now, that was on lowering the barriers and friction to usage so the UX (user experience) was pretty crucial. As was being able to stitch solutions together without a ton of custom work.


For Jen, a few requirements really came to the surface as crucial. Again, that user experience was one. But automation was another. How could they make creating and managing a data product an easy transition - or at least as easy as possible?


What worked well for Jen and team to really understand what capabilities were actually crucial was focusing very much on task-based high level use cases without involving any necessary systems requirements. It meant they could focus much more on what needed to get accomplished instead of specific examples that had more custom needs. Doing this created a very clear picture of what they actually needed.


When thinking about how to adapt data mesh to your organization, Jen recommends looking at what has worked - and even more closely at what hasn't worked - in past tool and process implementations specifically in your organization. Identify what caused failures so you can avoid going down the same path. There are so many potential areas of friction in a data mesh implementation; do your diligence to find the failure patterns common in your organization so you can avoid them.


Jen talked about how a successful data mesh implementation will really be about the intersection of people, process, and technology. You need to be including good change management principles into everything you touch. And make sure people understand that this will be iterative. It won't be perfect from day one but you are going to be listening and improving along the way.


To drive momentum, Jen recommends highlighting - loudly and often - early adopter successes. It shows that you are adding value but also rewards your early adopters, hopefully spurring others to move forward in their own data mesh participation too. And be honest with everyone that doing something like data mesh will involve change - and change is painful.


Jen and Scott discussed what use cases to look for in your early journey. Jen recommends balancing three factors: 1) what will be impactful, 2) what is possible, and 3) who is willing to partner with you. It's important to show those early successes to keep the funding coming as data mesh is not a single upfront cost - it requires continuous investment.


Jen recognizes how lucky she is to have a leader that is sharing their vision widely, driving buy-in and aligning strategy on data mesh with business partners to make it successful for all parties. A big crucial aspect is that teams have enough time carved out to actually create data products - without this, incentivization is tough. And constantly look to raise the visibility and amplify the wins.


For Jen, it's been important to repeatedly paint a compelling vision in many conversations. It's fine to be a bit repetitive. Share the current picture and then talk about what it could become. This is important to making participants "willing to accept the pain of change". You want to develop a symbiotic, mutually beneficial relationship with those early adopters - participating in data mesh has to be a win for them too. And teams aren't ready to simply adopt data mesh; you need to create the processes to support and enable them.


Jen wrapped up the conversation reiterating a few points about your vendor assessment process: 1) be prepared to spend more time on discovery than you probably think is necessary going in because it will highlight the pain points and capabilities that are most crucial; 2) focus on task-based use cases when considering necessary capabilities - keep the systems out of it; 3) really spend the time to understand your sourcing process internally; 4) it's okay to have very frank discussions with vendors - look to spend your AND their time wisely and share your hard constraints and requirements; and 5) constantly reflect back your progress and learnings in your assessment along the way and especially share the results of the assessment broadly to continue to share information and drive buy-in.




Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/

If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/

If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here

All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf
