#95 Measuring Your Data Mesh Journey Progress with Fitness Functions - Interview w/ Dave Colls
Episode 95 • 30th June 2022 • Data Mesh Radio • Data as a Product Podcast Network

Shownotes

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/

Please Rate and Review us on your podcast app of choice!

If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here

Episode list and links to all available episode transcripts here.

Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.

Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here


In this episode, Scott interviewed Dave Colls, Director of Data and AI at Thoughtworks Australia. Scott invited Dave on because of a few pieces of content, including a 2021 webinar on fitness functions with Zhamak. There aren't any actual bears, as guests or referenced, in the episode :)

To start, some key takeaways/thoughts and remaining questions:

  1. Fitness functions are a very useful tool to assess questions of progress/success at a granular and easy-to-answer level. Those answers can then be rolled up into a bigger picture. You should start with fitness functions early in your data mesh journey so you can measure your progress along the way. To develop your fitness functions, ask "what does good look like?"
  2. Focus your fitness functions on measuring things that you will act on or are important to measuring success. Something like amount of data processed is probably a vanity metric - drive towards value-based measurements instead.
  3. Your fitness functions may lose relevance and that is okay. You should be measuring how well you are doing overall, not locking on to measuring the same thing every X time period. What helps you assess your success? Again, measure things you will act on, otherwise it's just a metric.
  4. Dave believes the reason to create - or genesis of - a mesh data product should be a specific use case. The data product can evolve to serve multiple consumers, but to start, you should not create a data product unless you know how it will (likely?) be consumed and have at least one consumer.
  5. Team Topologies can be an effective approach to implementing data mesh. Using the TT approach, the enablement team should focus simultaneously on 1) speeding the time to value of the specific stream-aligned teams they are collaborating with and 2) looking for reusable patterns and implementation details to add to the platform to make future data product creation and management easier.
  6. We still don't have a great approach to evolving our data products to keep our analytical plane in sync with "the changing reality" of the actual domain on the operational plane. On the one hand, we want to maintain a picture of reality. On the other, data product evolution can cause issues for data consumers. So we must balance reflecting a fast-changing reality against disrupting data consumers, including downstream use and cross-data-product interoperability. There aren't great patterns for how to do that yet.
  7. There is a tradeoff to consider regarding mesh data product size. Dave recommends you resist the pull of historical data ways - and woes - of trying to tackle too much at once. The smaller the data product, the less scope it has, which makes it easier to maintain and quickens the deploy-and-feedback cycle. But smaller-scope data products increase the total number of data products, likely making data discovery harder. And do we then end up with data product owners with many data products in their portfolios? Dave recommends using the Agile Triangle framework to figure out a good data product scope (link at the end).

Dave mentioned he first started discussing fitness functions regarding data mesh to shift the conversation from people asking "what do we build?" to "what does good look like?" Fitness functions, when done right, can give a good view of how well an organization is doing relative to data mesh implementation goals by providing objective measures of success at a granular level that can be rolled up into a bigger picture.


So what is a fitness function? As defined by Thoughtworks in a May 2018 Technology Radar, "Borrowed from evolutionary computing, a fitness function is used to summarize how close a given design solution is to achieving the set aims. ... An architectural fitness function, as defined in Building Evolutionary Architectures, provides an objective integrity assessment of some architectural characteristics, which may encompass existing verification criteria, such as unit testing, metrics, monitors, and so on." Source: https://www.thoughtworks.com/radar/techniques/architectural-fitness-function
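
As a rough illustration (not something from the episode or the radar entry), an architectural fitness function can be as simple as an automated test over a catalog of data products; the catalog structure and checks below are hypothetical:

```python
import unittest

# Hypothetical catalog entries - in practice these might come from a registry or metadata API.
CATALOG = [
    {"name": "orders", "owner": "sales-domain", "slo": {"freshness_minutes": 60}},
    {"name": "customers", "owner": "crm-domain", "slo": {"freshness_minutes": 1440}},
]

class DataProductFitnessFunctions(unittest.TestCase):
    """Objective, automatable checks of characteristics we say we care about."""

    def test_every_data_product_has_an_owner(self):
        for product in CATALOG:
            self.assertTrue(product.get("owner"), f"{product['name']} has no owner")

    def test_every_data_product_declares_an_slo(self):
        for product in CATALOG:
            self.assertIn("slo", product, f"{product['name']} declares no SLO")

if __name__ == "__main__":
    unittest.main()
```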


Fitness functions can take us from measuring success on vanity metrics - like the amount of data processed or stored - to value-based metrics, per Dave. It is important to think about what good looks like now and in the future. Putting your fitness functions in place early in your data mesh journey gives you a good sense of where you've been when designing where you want to go. Fitness functions give you the ability to stay focused on "why are we doing this?" - intentionality is crucial.


When thinking concretely about fitness functions for a data product, Dave gave a few examples: does it pass its testing health checks, is it satisfying its SLOs, etc. It can be a good idea to start with target metrics that have yes/no answers as you begin using fitness functions. Metric measurements without context are typically not valuable: a latency of 5 minutes might be great for one data product and not another, and 90% accuracy might be atrocious for one and great for another.
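
To make the yes/no idea concrete, here is a minimal sketch (an illustration, not from the episode) of per-data-product fitness checks with context-specific targets; the metric names and thresholds are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class DataProductMetrics:
    """Hypothetical observed metrics for a single data product."""
    latency_minutes: float  # end-to-end freshness/latency
    accuracy: float         # fraction of records passing quality checks
    tests_passing: bool     # did the product's test suite pass?

@dataclass
class FitnessTargets:
    """Per-product targets - context matters, so each product sets its own."""
    max_latency_minutes: float
    min_accuracy: float

def fitness(metrics: DataProductMetrics, targets: FitnessTargets) -> dict[str, bool]:
    """Return yes/no answers for each fitness function."""
    return {
        "meets_latency_slo": metrics.latency_minutes <= targets.max_latency_minutes,
        "meets_accuracy_target": metrics.accuracy >= targets.min_accuracy,
        "health_checks_pass": metrics.tests_passing,
    }

# The same observed metrics can pass one product's targets and fail another's -
# a 5-minute latency may be fine for one use case and unacceptable for another.
observed = DataProductMetrics(latency_minutes=5, accuracy=0.93, tests_passing=True)
print(fitness(observed, FitnessTargets(max_latency_minutes=15, min_accuracy=0.9)))
print(fitness(observed, FitnessTargets(max_latency_minutes=1, min_accuracy=0.99)))
```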


You can implement fitness functions for all aspects of a data mesh implementation, in Dave's view. Look at the four key principles of data mesh from Zhamak's work and you can start to break down your goals for each one into fitness functions. A good overall question to try to answer: are we reducing the interdependence of domains? So for the domains, are they providing value to consumers via their data products? For the platform team, have they made it easier to create and manage data products? For governance, is the value of the whole implementation greater than the sum of the parts? You can evaluate fitness functions at a micro level and then roll those measurements up to get a more complete picture of how your implementation is going.
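
To sketch that roll-up from micro-level answers to a bigger picture (again an illustration, not from the episode), per-check results could be grouped by data mesh principle into simple pass rates; the example checks and the 80% threshold are arbitrary:

```python
from collections import defaultdict

# Hypothetical micro-level results: (data mesh principle, check, passed?)
results = [
    ("domain ownership", "orders product has an accountable domain owner", True),
    ("data as a product", "orders product meets its SLOs", True),
    ("data as a product", "customers product meets its SLOs", False),
    ("self-serve platform", "new product can be scaffolded in under a day", True),
    ("federated governance", "products publish interoperable identifiers", False),
]

def rollup(results):
    """Summarize yes/no fitness checks into a pass rate per principle."""
    by_principle = defaultdict(list)
    for principle, _check, passed in results:
        by_principle[principle].append(passed)
    return {p: sum(v) / len(v) for p, v in by_principle.items()}

for principle, rate in rollup(results).items():
    status = "on track" if rate >= 0.8 else "needs attention"  # arbitrary threshold
    print(f"{principle}: {rate:.0%} of checks passing ({status})")
```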


For Dave, when assessing that bigger picture, it is good to think about measuring success over time. What you measure with fitness functions can - and should - evolve, but having rates and ratios that span your implementation timeline gives you a good indication of where you've been improving and where you still need to work.


Similar to Shane Gibson's episode on using patterns in data, fitness functions that are valuable to other organizations may not be valuable to yours, and they may lose relevance over time, per Dave. Measure things that will cause you to act based on the outcomes. Nothing in your data mesh journey should really be seen as done and fixed. Things should evolve, or it means your organization is stagnant. Change for the sake of change is obviously bad, but you should evaluate whether your fitness functions are still helping you measure against your idea of what good looks like. Do we need fitness functions for our fitness functions?


Dave talked about how well Team Topologies aligns with implementing data mesh, as organizational change is such a crucial part of success. The Team Topologies approach focuses on enabling the domain team - called a stream-aligned team - to be "the primary unit of value delivery in IT". A platform team enables the stream-aligned teams and, at most, loosely coordinates with them, to avoid blocking their work wherever possible. Per some past research conducted by Dave, it took 12x longer for a team to do work if they had to go outside their team - so prevent that if possible! But Dave warned that right now, especially in data mesh, it is important not to just add more and more work to the stream-aligned team.


Team Topologies can help us answer how to build capabilities in a decentralized world, especially to implement something like data mesh. Per Dave, it is helpful when stream-aligned teams collaborate in a multi-disciplinary way with the enablement teams to develop their first data products. The enablement team is also tasked with bringing incremental learnings back into the platform to make the next team's work creating and maintaining data products better - continuous improvement via learning from each implementation.


On this topic, Dave talked about how useful it is to optimize for learning - rather than purely for the initial value creation of each data product - in a data mesh implementation. With the enablement teams bringing learning back to improve the core platform, you can add friction-reducing enhancements like sensible defaults and starter kits/templates. Focus on learning efficiently.


When designing/creating a mesh data product, Dave recommends a customer-led approach: there should be a consumer with a specific use case as the reason to create the data product. A mesh data product should be a valuable representation of the domain through data. But the operational and analytical planes naturally diverge unless we evolve the analytical plane to match the new reality in the operational one - and we don't yet have a great way of doing that without disrupting data consumers.


Per Dave, creating a "thin slice" is one way to help keep your mesh data products representative of your domains. Dave recommends you resist the pull - and resulting woes - of past data ways of trying to tackle too much at once, hence thin slicing. Get to value delivery and feedback quickly. A thinner slice has a reduced scope, so you will likely have less difficulty maintaining that singular data product. But Scott wanted to make sure people understand that the micro-microservices model - as in, small even for microservices - was kind of a disaster, so be careful not to slice too thinly. Way too many data products can also make data discovery more difficult. It's all tradeoffs in the end :) And you can use fitness functions to measure whether you are making the right tradeoffs too!


Dave's LinkedIn: https://www.linkedin.com/in/davidcolls/

Zhamak's data mesh book: https://www.oreilly.com/library/view/data-mesh/9781492092384/

Building Evolutionary Architecture book: https://www.oreilly.com/library/view/building-evolutionary-architectures/9781491986356/

Team Topologies book: https://teamtopologies.com/book

The Agile Triangle regarding helping you decide how thin to slice: https://www.projectmanagement.com/blog/blogPostingView.cfm?blogPostingID=5325&thisPageURL=/blog-post/5325/The-Agile-Triangle#_=_



Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/

If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/


All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf
