Building a data-driven culture is a common goal for many companies. The promise: data will provide all the answers we need. Budgets are set, people are hired, and priorities are determined. And yet, most companies are unable to use the bulk of their data. Making data the central focus doesn't work for the majority of companies. In this short talk, Ruben will explain the three myths of data-driven cultures and what to strive for instead.
Why You Shouldn't Build a Data-driven Culture
Ruben Ugarte, Data Strategist
Sarah Berger, Sr. Marketing Manager, Kin + Carta Americas
The Myths of Data-Driven Culture
So today I wanna have a special sort of conversation around some of the challenges with data, how companies are approaching data, some of the myths of data-driven cultures, and then an alternative: what I've been tentatively calling the “data-supported culture.” I've been hearing about some controversy around this, especially for companies who are investing a lot; maybe this goes counter to that. So the idea for this conversation is that we're gonna hold two opposing ideas at the same time, but we can still function, we can still make do here. I want you to keep that in mind as we go through the next few slides. So let's start with stats. I love reading the surveys that different companies do on how companies are actually using data and investing in it and so on. There's a couple surveys I pulled here, one from 2019 and one from 2020, and every survey is really similar. They all kind of say the same thing: we're spending lots on data, we're gonna spend more on data in the next few years, we can't really use most of it, and we're kind of struggling to get insights from the bulk of our investment, but it's still worth it, right? Some of the stats here: 92% will invest more in data; 69% don't consider themselves data-driven yet; 57% thought they were on par with their peers, and 17% felt they were behind, and so on. The numbers do get better, though.
I've been reading these stats for a few years now, and the numbers are actually improving; they're not staying still. But there's also something interesting about the gap between how companies perceive their progress with data and how they're actually performing, and that's one of the things I wanna focus on. I think a lot of these companies are actually doing much better than they think; they might just be playing the wrong game.
So let's jump into that, right? When I talk about playing the wrong game, I mean there's an idea of what a data-driven organization looks like, what it should look like on a day-to-day basis, and this idea might actually not be correct for a lot of companies. It might be coming from the wrong sources. So for that, we can look at some of the myths of data-driven cultures. The first myth is that data should be the first resource; that is, everything gets optimized around data. That's how I've seen a lot of companies start to approach it: you're sort of data-driven first. And this is similar to when big companies like Google and Facebook changed from being web companies to mobile-first companies. I think Google now considers itself an AI-first company. So some companies wanna be a data-first company, which I'm not sure is actually the right fit for most companies. So that's myth number one.
Myth number two is really about internal proficiency, and that is everyone in the company should be able to use data, visualize it, work with it, get insights out of it. For anyone who is trying to deploy a large-scale data program within an organization of a significant size, this is really hard to do. And then lastly, the third myth is around how you treat non-data opinions. There's a quote, often attributed to W. Edwards Deming, "In God we trust; all others must bring data," that seems to be the model for a lot of companies. If you don't have any data to back up this hunch, then there's no role for it here. Some of the companies that we know use data really well are the big tech companies, right? We hear about them for good and for bad; recently maybe more for bad, on how they're using data and the privacy implications. And what I would argue is that for a lot of these companies in this picture, not all of them, data is their product.
So companies like Google and Facebook happen to offer an email service and a map service, and Facebook has Messenger and WhatsApp and so on, but what they are really doing is collecting data, and that's their product. They arrange their company structure around that: they have lots of data scientists, lots of data engineers, lots of ways to support that emphasis. We can maybe argue the same for Tesla; data is fundamental to actually making the cars work, especially the self-driving cars of the future. But other companies are not in the business of data. Data is there to support the business, to help them make better decisions, to increase sales or reduce costs. It's not the main thing; you're not rearranging your entire company around data. And I'll show you a little bit of what it looks like when you do make this a priority, when this is the thing you wanna focus on. But that's the first thing.
I'm a big fan of basketball, and I have been following basketball for the past few years, and there's been a trend, perhaps since Moneyball by Michael Lewis came out, about baseball, to use analytics, to try to quantify everything in the game. The best example of this in basketball is the Houston Rockets. They're very driven by analytics in how they structure teams, what kind of shots they take, what kind of shots they avoid, and it works for them; they're a consistent playoff team. But they haven't won the championship, and in the NBA that's typically the criterion you wanna use. And they might actually be trying to quantify a few things that are really hard to quantify. In the last few seconds, when you wanna take a shot, should you let data tell the player what shot to take, or just let that player's intuition and experience decide? So that might be an example of an organization that's trying to use data too much and maybe diminishing some of the actual talent, the human elements of basketball. So, making data the first resource: that's myth number one.
Internal Proficiency in Regards to Data
Number two, internal proficiency. Every company has different kinds of roles and teams. Some teams are more technical, some teams are less technical. Some work is becoming inherently more technical, like marketing. But one of the challenges I see companies face is that they have lots of people and they need to train them on how to use data, how to make data self-service, or maybe "democratize" data, as we hear sometimes. In reality, for some companies, not everyone will be able to use data at a really high level of proficiency, or maybe even any proficiency. Instead of trying to make everyone data-proficient on their own, some companies might actually be better off having support, you know, a data analyst or someone who can support the teams, which means maybe less direct access to data, but it still gets you to the end point, which is the insights. And I think the focus on hiring is important here, which a lot of companies are doing, but even better is a focus on training, giving training to the right people so they can use data. There's a lot of work that goes into simply getting people trained up on how to use data.
This is a company I was reading about recently, Stitch Fix. They do personalized styling for you, and if you get a chance you can read a little bit about their history, but this is a great example of what it means when you really arrange a company around data and you really commit to it. Their data science team is about 200 people in a company of about 1,500; they're a public company. They use data to determine what recommendations to make, but they treat it quite highly, and in fact the data team has a seat at the executive level, at the strategy level, so they actually set strategy with data. It's still not common to see a data executive who just represents data; typically data reports to the engineering department, so it's sort of mixed in with whatever else the company does from a technical perspective. They invested quite a bit from a very early stage, and they're likely a company you can say treats data as their product; it's what allows them to run their company. So if that's the path you wanna take, then typically you have to make similar investments and similar decisions around how you prioritize data and what kind of voice it gets in the company.
Don’t Disregard Intuition
And lastly, the non-data opinion. I see this more and more: we'll be working with a team and we'll have an executive who has 30 years of experience in the field, and they may have a hunch, but the hunch is not really backed by data just yet, and sometimes their opinion can get dismissed. Not to say that every company should go and become anecdote-driven, but there's a balance here, right? The balance between what the numbers are saying and what the intuition of the people in the company is saying, and trying to make the best decision out of that. I actually think that data can help people train their intuition, in the sense that you make a decision, you have the data, you run the experiment, you get the results, and you get this feedback mechanism on how good your initial guess or your initial plan was, and you can use that to build your intuition and get better at it. So I think there's great potential there. I don't see it as much in companies just yet, but we're still at an early stage in the industry.
So that being said, let me talk about what I think is really the right approach for most companies: companies that are not treating data as their product, that are not investing at the same level as, say, Stitch Fix, and that do wanna balance the intuition and experience a team has with the numbers. I'm calling it a data-supported culture, and the way I think about it, it's almost like a bridge. You have different teams: sales, marketing, operations, customer success, and data is really meant to support those teams, to help them make better decisions, but those teams still have flexibility in how they make those decisions: when they need to go to the data, when they can dismiss the data. It's hard to find that balance. And when it comes time to establish this type of culture, these are the five steps, in a broad sense, that companies should go through.
People, Process, Providers
So to me, everything starts with people, because at the end of the day, these are the people who are actually gonna be using the data, so we need to figure out what their expertise is, how technical they are, and make some realistic decisions here. Is this a company where we can have 100% self-service data, where everyone can access it and build their own reports? Maybe. If it's not, then we need to figure out how we're gonna hire and support the different teams and roles. I have come across teams where everyone knows how to use SQL, everyone basically knows how to hop into the database, run a query, export data, and get the answers, and that's fantastic; that kind of company can benefit from more advanced tools and techniques. I've also come across companies that have never had much data, where no one knows SQL and no one really knows how to build charts in a comprehensive way, so we need to be careful, because giving data to people who don't know how to use it just yet is just gonna frustrate them, right? And the thing to keep in mind is that for any company, data is an addition to what you're doing. You already have a marketing team that's busy with campaigns and everything else they're doing, and you're asking them to add data, to use it, on top of their already busy day job, so we wanna make it easy for teams to get the most out of it.
Once we get our people, we look at process. Process to me is simply figuring out how you're gonna get insights out of data. This might be figuring out what reports and dashboards you wanna build, but also at what frequency you wanna use the data. Do you wanna report on it on a weekly basis? Do you wanna have a meeting where you talk about it? Do you wanna build it into your sprint, to be able to maintain it? So everything around maintaining data, using data, and getting insights out of it, and thinking about this ahead of time. You can imagine how common it is, and we see it in the stats, that companies go through the whole technical process of implementation, choosing tools, and then they get to a point where they're not really sure: okay, how do we mine this for insights? How do we actually find the things that will make our business better?
Next is providers. This is a very popular category; this is tools, effectively. Most people who reach out to me reach out about tools, so here we tackle it, and we figure out what sorts of tools and technology you need. There are lots of tools, lots of technology, and I think it's getting better every single day, so you have lots of options here. And the thing is, once you get to this stage, having figured out people and process, we kind of know what kind of tools we need, right? We make choices based on what's actually gonna fit our organization, not on some ideal state. Then we have training: training end users how to use the data. Again, depending on who's gonna access it, make sure there is comprehensive training here, and I typically recommend companies split it four ways. You have group training, which is what's typically done, but then you supplement that with one-on-one training, so you make it really specific to the person who's using the data; reactive training, where you may create, let's say, a channel in Slack where people can post their questions and get responses quite quickly; and of course documentation, where you start documenting dashboards, best practices, common analysis workflows, things like that.
Machine Learning, but not Machine Controlled
And then lastly, for some companies, you might get into AI and machine learning, which, going back to the surveys once again, is typically where a lot of companies are focused. There's a lot of possibility here. There's a bit of hype in this world compared to what might actually be doable, but there is lots of potential in the machine learning world for a lot of companies; it's just typically towards the end of a good process. I actually saw another stat, from data scientists this time, on how they spend their time, and I believe the number was about 60% of data scientists' time spent simply wrangling data: trying to figure out where data is, what's there, how to fix it, how to organize it, before they can start working on the models. So if you spend the right amount of time on the previous steps, the machine learning element can flow much more smoothly, because you have the data, you've figured it out, you've implemented it; you don't have to go try and figure out where it's gonna come from. And that's sort of the argument here. For a lot of companies, when they first started working with data, the idea was that the company would be extremely data-driven, with dashboards all over the wall predicting the future, almost a Minority Report type of world. But the reality was that companies would get data, they would use it to make decisions, they would see how their work was being impacted, but it wasn't everything. They still had to make decisions without data; they still had to make choices with incomplete numbers or maybe a complete lack of numbers. So data was there to support, but not to replace, what the company could already do. Final key takeaways here as we get to the final few minutes. Data guides decisions, but it's not everything; it doesn't have to be how you make every single decision.
There's a balance between this and intuition.
I think the second takeaway is working with people constraints. Every company is different, every makeup is different, so you have to take that into account. You can't just look at someone like Stitch Fix and say we're gonna do that with our makeup; you would have to completely match their structure, how many data scientists they have, how they actually set them up, and so on. Number three, trusting intuition. Again, I see lots of potential to use data to build intuition and help people get better at it, and I don't think intuition is a dirty word or something that should be completely avoided in a data organization. And lastly, playing the right game: knowing what game you're playing. If you are playing the game where data is your product, like a Stitch Fix, like a Google, like a Facebook, that's fine, but you gotta invest in it properly. If you're not that kind of company, then you should be careful, 'cause you might just find yourself with missed expectations year after year, even though you might actually be doing fine. So with that being said, passing it on for some questions, if there are any, and I'll do my best.
When You’re Headed Down the Wrong Path
Thank you Ruben. All right we have a couple questions here, so I'll start with the first one. What are some early indicators that you might be going down the wrong path or taking the wrong approach to building a data-supported culture?
That's a good question. A few things come to mind. First, maybe just a high level of frustration. I've worked with companies like this: we have some numbers, we can make some decisions, the company is actually doing well, they're growing in the things that matter to them, but there still seems to be frustration. Maybe it's not good enough that yes, we have some numbers, but we're missing these other things, or we have this but we don't have machine learning. That to me sounds like an expectation issue, right? There's an expectation of where you should be that doesn't match what you're doing, but what you're doing itself is actually pretty good. The second thing would be friction between what you hoped the plan would do and what's actually happening with your people. So we go back to this: maybe we have a company that's not geared towards self-service data, but you're trying to really make it work. You're really trying to get everyone to be self-sufficient, and you're going through training after training and it's just not picking up; people still need to go to a data analyst, they still need support. That to me would be friction between what you think the plan should be and what your organization can actually support.
And then lastly, it might be if you're seeing some discrepancies, or budgets that are hard to justify. If you have a data budget of a few hundred thousand dollars a year, or maybe more, there will be questions around whether you're truly getting value out of this money, and this comes up quite a bit, especially with tools. You're spending six or seven figures worth of tools and someone will say, is this worth it? I think that's also a good chance to reconsider: maybe you're spending on the wrong tools, or using them in the wrong way, because the value's not there; it's not apparent. It could also be that the value has to be made apparent, that you have to do some work to show how this is being used. But nonetheless, those would be three signs you can look for.
Backing Intuition with Data
Our next question comes from Sam. Do you have any suggestions for how to back up an intuitive hypothesis to test if there isn't any initial data?
How to back up an intuitive hypothesis, yeah. There's more talk these days about decision frameworks, or actually experimentation frameworks might be a better word, where you design the experiments, you run them, you get the data back, or some data back, you learn something, and then you do that over and over again. So when there's less data or no data and you only have intuition or a guess, I think you can still run through the process: you can still document what that initial intuition is, what you think will happen, why you think that's the case. But the valuable thing is to actually just run it, right? As you run those experiments, you get the data back. Now, you may have constraints around what you can do; you may have to go through an engineering team and build something, and that might take a couple of weeks and so on, and that's fine. But in a lot of cases, when it comes to intuition, or when you have little data, what we really wanna do is go get the data, and to do that we have to run the experiments, see if it worked or not, come back, and then make another choice. It's a tricky thing, because you might get stuck with little data and say we can't move because we have no data, and you're in this loop, so you really just wanna get started.
Thank you. And then we will wrap up. We have one final question here, how do you know you have too many cooks in the kitchen when you're building your data-supported culture?
Yeah, when I work with clients I always tell them there has to be ownership, or accountability, so that the plan works. So I don't come across the 20-cooks situation too much, but I can see it happening where you have maybe multiple technical people or multiple business people owning it. What I would say is, I would look for confusion among people who wanna get access to the data: if someone has a question and they don't know who to go to, or they're going to different people or the wrong people, that would be one sign. And the second thing is, are you moving? With the process I showed before, ideally you wanna be moving through it: you choose the people, the process, the tools, the training, then you start adding more stuff to it, more data points, then maybe you get to the machine learning world. The pace might be slow on your end, or it might feel slow to you, but it should at least be progressing forward. If that's not happening, that might be a case of multiple people trying to go in different directions and the whole program coming to a standstill. So I'd look for those two things when it comes to too many cooks in the kitchen.
Don’t Look Far for the Perfect Candidate
I think this is in regard to best practices in training, but this person's asking, how can I make the most of my organization's existing workforce?
I think this is a fantastic area to start, you know. Hiring gets talked about quite a bit, and there are of course a lot of data roles you may wanna bring in, but the data scientists, data analysts, and data engineers are quite scarce. Nonetheless, training can be a way to close some of the gap and to find the people in your organization who might have an inclination for what you wanna do. For example, you might have an engineer who wants to get more into data and might wanna take on more of a data engineer role, or you may have a business person who may wanna take on more of a data analyst role, and so on. So training can be the way to take some of those people and actually use them. The four things I talked about would be one approach. I also think of training as a process, almost like a loop: you take those four things, the group training, the individual, the reactive, and the documentation, and run them in a loop to keep building people up. So that's using the loop, and the second thing would be looking for those people, 'cause I come across them all the time. I work with really smart people across different companies, and I'll work with someone and think, wow, this person could probably do quite a bit more if they were trained properly, and the company doesn't have to go out and find some mythical person out there, some sort of unicorn candidate. So those two things.