059 - How Design Thinking Helps Organizations and Data Science Teams Create Economic Value with Machine Learning and Analytics feat. Bill Schmarzo

Experiencing Data with Brian T. O’Neill

February 23, 2021 | 00:43:08

Show Notes

With a 30+ year career in data warehousing, BI, and advanced analytics under his belt, Bill has become a leader in the field of big data and data science – not to mention a popular social media influencer. Having previously held senior leadership roles at Dell EMC and Yahoo!, Bill is now an executive fellow and professor at the University of San Francisco School of Management, as well as an honorary professor at the National University of Ireland, Galway.

I’m so excited to welcome Bill as my guest on this week’s episode of Experiencing Data. When I first began specializing my consulting in the area of data products, Bill was one of the first leaders I noticed leveraging design thinking on a regular basis in his work. In this long overdue episode, we dug into some examples of how he’s using it with teams today. Bill sees design as a process of empowering humans to collaborate with one another, and he also shares insights from his new book, “The Economics of Data, Analytics, and Digital Transformation.”

In total, we covered:

Quotes from Today’s Episode

There’s certainly a UI aspect of design, which is to build products that are more conducive for the user to interact with – products that are more natural, more intuitive … But I also think about design from an empowerment perspective. When I consider design-thinking techniques, I think about how I can empower the wide variety of stakeholders that I need to service with my data science. I’m looking to identify and uncover those variables and metrics that might be better predictors of performance. To me, at the very beginning of the design process, it’s about empowering everybody to have ideas. – Bill (2:25)

Envisioning workshops are designed to let people realize that there are people all across the organization who bring very different perspectives to a problem. When you combine those perspectives, you have an illuminating thing. Now let’s be honest: many large organizations don’t do this well at all. And the reason why is not because they’re not smart, it’s because in many cases, senior executives aren’t willing to let go. Design thinking isn’t empowering the senior executives. In many cases, it’s about empowering those frontline employees … If you have a culture where the senior executives have to be the smartest people in the room, design is doomed. – Bill (10:15)

Organizational charts are the great destroyer of creativity because you put people in boxes. We talk about data silos, but we create these human silos where people can’t go out … Screw boxes. We want to create swirls – we want to create empowered teams. In fact, the most powerful teams are the ones who can embrace design thinking to create what I call organizational improvisation. Meaning, you have the ability to mix and match people across the organization based on their skill sets for the problem at hand, dissipate them when the problem is gone, and reconstitute them around a different problem. It’s like watching a great soccer team play … These players have been trained and conditioned, they make their own decisions on the field, and they interact with each other. Watching a good soccer team is like ballet because they’ve all been empowered to make decisions. – Bill (15:30)

I tend to feel like design thinkers can be born from any job title, not just “creatives” – even certain types of very technically gifted people can be really good at it. A lot of it is focused around the types of questions they ask and their ability to be empathetic. – Brian (25:55)

The best design thinkers and the best data scientists share one common trait: they’re humble. They have the ability to ask questions, to learn. They don’t walk in with an answer…and here’s the beauty of design thinking: anybody can do it. But you have to be humble. If you already know the answer, then you’re never going to be a good designer. Never. – Bill (26:34)

From an economic perspective … The value of data isn’t in having it. The value in data is how you use it to generate more value … In the same way that design thinking is learning how to speak the language of the customer, economics is about learning how to speak the language of the business. And when you bring those concepts together around data science, that’s a blend that is truly a game-changer. – Bill (36:03)

Links

Transcript

Brian: Welcome back to Experiencing Data. This is Brian T. O’Neill and today, Bill Maher-Schmarzo, sorry. We were just talking about your name, and it was like, “It could be an adjective. It could be a verb. And it’s a noun, too.” [laugh]. Bill Schmarzo, thanks for coming on Experiencing Data with me. How are you?

Bill: Yeah, Schmarzo as a verb, what would that mean? You Schmarzo-ed something. Would that be-

Brian: I don’t-

Bill: -would that be-that’d be a great success or a horrible failure?

Brian: It kind of falls into that Yiddish vibe, immediately. I don’t know, like-

Bill: [laugh].

Brian: You’re an [crosstalk 00:00:56]-

Bill: [singing]. Schmarzo.

Brian: [laugh].

Bill: [laugh].

Brian: Excellent. No, no. It’s great to have you on here. And I’ve had you on my list for some time. Part of the reason I had you on here is I think when I decided to focus my work on trying to bring design into the world of data products and helping teams get better outcomes from their work with design, you were one of the few people where I even saw that word anywhere remotely close to your job title, your work description, the way you think about things.

And I’m like, okay, I got to get him on the show at some point to see why. So, maybe we can just start there. I think design is really hand-wavy for a lot of people. It’s this kind of like fluffy extra stuff. If you have some extra money, maybe you throw some of that little magic dust in at the end.

It’s not a normal way that, especially, non-digital native companies tend to operate. I think it’s changing slowly; I would say it’s pretty normal now for most tech companies-like, when I consult, designing user experiences is an early hire with your technology people, as well. But talk to me about why does design matter at all in data science and analytics? You’ve had some experiences with this and I want you to tell us about it, but help someone who hasn’t really bought into this or kind of hears it as this creative-y, hand-wavy thing, why do we need that?

Bill: Well, Brian, it’s a loaded question. And the good news is, there’s actually not a simple answer to this. I think it’s actually a complex answer because it all starts with what you mean by design. Now, there’s certainly a UI aspect of design, to build products that are more conducive for the user to interact with, more natural, more intuitive. And product companies, for the most part, over the last 10, 15 years, have really embraced design from a UI perspective.

But I also think about design from an empowerment perspective. So, when I think about design, and particularly things like design thinking techniques, I’m thinking about how do I empower the wide variety of stakeholders that I need to service with my data science, to identify and uncover those variables and metrics that might be better predictors of performance. And so to me, at the very beginning of the design process, it’s really about empowering everybody to have ideas. All ideas are worthy of consideration-which, by the way, doesn’t mean all ideas are good-it’s about having that ability to diverge your broad thinking in order to converge. So, to me, it’s an empowerment process where everybody has a chance to have a voice because you never know who might have the best idea.

The second part about design that I think is very critical is that good design-whether it be customer journey maps, or personas, or service design templates-forces you to speak the language of your customer. Way too many product companies and way too many service companies are internally-out focused. That is, they think about their products and their services first, and then how does the customer fit into what I have to offer? Wrong. You need to understand what your customer is trying to accomplish. That’s why I love a customer journey map.

I think a customer journey map is an illuminating process, to understand what the customer is trying to accomplish. And then being able to figure out how do my products and services help support that journey map? So it’s very much-design is a pivot in how people think. That is you stop thinking internally out, and you start focusing externally back in. And the reason why this is so important is because the only real source of value creation is around the customer.

The customer is the only person with ink in their pen. They’re the only ones who are buying things. Even if you think about [unintelligible 00:04:52] in a B2B market, well, you’re probably B2B2C at some point. And so you need to be able to speak the language of the customer and walk in their shoes in order to be able to identify, validate, verify, and prioritize the sources of customer value creation.

Brian: So, you’re preaching to the choir here on this, but tell me, for someone who hasn’t experienced this, can you give me a specific example where-or maybe you’ve seen a team where there was like a before-after. So, in the old way, they were doing things this way, and this other way, when we tried doing this project, or this product, or whatever the thing was that they were making, we went through some of this process, what light bulbs went on? Was there a particular moment where things clicked for you, either personally, or just seeing a team go through this? Help make it concrete.

Bill: Okay. This is a long story; I apologize upfront, but it’s an illuminating story to me. It’s around something we call an envisioning workshop. That is, before we ever do our data science work, before we really start putting [unintelligible 00:05:52] the data, we bring in all the key business stakeholders and we run an envisioning workshop to identify the variables and metrics that might be better predictors of performance. And of course, the key phrase there is the word ‘Might.’

Because if you as an organization don’t have enough ‘Might’ moments, you’ll never have any breakthrough moments. And so we go through this process; we’re doing a project for a casino, a large casino. And they’re trying to figure out how to optimize the comps they give. Most casinos have a twenty percent rule: if you lose, like, $1,000, they’re going to give you $200 back in comps because they want you to come back. So, generally what casinos did is they gave everybody twenty percent of what they lost.

And that was a total waste because some people, you were wasting your money, they were never going to come back anyway. Some people, maybe, needed more, some people needed less. And so they wanted to create a very focused calculation on not only understanding what the customer’s lifetime value was, but they wanted to create a prediction of what that lifetime value could be. So we’re running this visioning workshop and we’re trying to brainstorm these variables and metrics, and there’s this woman in the back-I don’t remember her name, I’m going to call her Mary-Mary’s in the back, and in the casino, her team-she’s the cashier-sits behind the bars. And they’re the ones extending credit to different players.

And she shares-so we had interviewed her, and I knew she had some good insights, so I said, “Hey, Mary, tell us what information and metrics you might know of, or some data you might know of, that might really help us figure out who our highest-potential customers are?” And she says, “Well, every night, the thirty casinos down in Southern California, they fax each other all the players who got a line of credit. And the reason why they fax this is, they don’t want somebody bouncing from casino to casino, running up a line of credit, and then bolting down to Mexico.” Now, it makes total sense. So, every night, they fax all this information.

And she says this, and there’s a guy in the front of the room-let me call him Buddy. He runs the slots. And in these casinos, the guy-or gal-who runs slots, they’re the king of the casino. And I remember, Buddy hears this, he pivots and he looks, he goes, “Wait, Mary. Are you telling me that every night we tell every other casino in our area who our biggest players are from a line-of-credit perspective?” She goes, “Yes.”

He goes, “And,” he says, “every other casino is telling us who their biggest players are from a line-of-credit perspective.” And she said, “Yes.” And then she goes, “but it’s in a PDF, so, you know, you can’t get to it.” And of course, our data science team is all drooling. “PDF, [drooling sound] let me at that.” And it was at that moment that Buddy realized that, oh, my gosh, this is a very valuable piece of information, especially in combination with all the other data brought together.

So, what happened in that moment: you could literally see, in this room of 25 casino executives, the light bulb go off all at once when they realized, oh my gosh, we’re all sitting on these little pieces of data that individually may not sound important, but when you bring them all together, it gives you invaluable insights into who your most important customers might be.

So, that was one of those moments when that happened, you could literally see the whole room, look at each other and go, “Oh, my gosh, I didn’t know we had that.” And then all [unintelligible 00:09:00] talking, “Well, what other data do we have? What else do we know about our players?” And such. So, it was-anyway, long story.

Brian: Do you know how that turned out-

Bill: Oh, yeah.

Brian: -or what they ended up-

Bill: Oh, yeah. [laugh]. I can’t go into details, but let me tell you, the payback, the ROI on that project was measured in weeks, not years. It worked out very well.

Brian: That’s great. I love it. So, getting to the point, though, where some people that come from math, or statistics, engineering, some of these technology upbringings and they end up in management positions, how does someone decide that, “Oh, we need a design thinking workshop,” or, “We need”-that all of a sudden this matters, that we should spend the time to go and do something like this before we start deploying machine learning or data science on our problem? Is this something where you’re like, “Well, hold the phone here. If we’re going to do this, we’re going to do it my way,” and you kind of have your thing, or was this something where they knew that they were going to go through this process and they wanted that? What’s that experience like, and what had to change to allow that to happen? Because this doesn’t happen in a lot of places, in my experience.

Bill: You’ve got to allow them to experience the power of empowerment. You have to help them-the envisioning workshop is all designed to let people realize that there are valuable assets-people-all across the organization who bring very different perspectives to a problem, and when you combine those perspectives, you have an illuminating thing. Now, let’s be real honest: many large organizations don’t do this well at all. Horribly. And the reason why is not because they’re not smart, it’s because in many cases, senior executives aren’t willing to let go.

Design thinking-you think about the empowerment-isn’t empowering the senior executives. In many cases, it’s about empowering those frontline employees. Think about the COVID situation. Who knows the conditions around COVID better than a nurse or a doctor? It isn’t the chief hospital administrator; he doesn’t know anything. I mean, it’s the people at the front line who really know what’s going on. So, if you have a culture where the senior executives have to be the smartest people in the room, design is doomed.

Brian: This last mile, you know, we talked about all the amazing things we can do with the technology, and it always comes back to this last mile. There’s going to be some human touch-point at the end of this that’s ultimately going to determine what’s going to happen, how much-someone is going to hand a wad of cash or tokens to that gambler: that is the last mile. How much is that wad of cash? Well, she’s the last person-or whoever the woman was, Mary behind the bars or whatever it was-how do we help Mary know how much to give out? What is she empowered to do?

Does she have any personal judgment over that? If you don’t understand that last mile there, or if you bury it in some application, she’s like, “I deal with the cash. And I talk to the customers. I’m not going to go open up a 500-page PDF to look up someone’s name to see what their-you know, how much cash they’re allowed to take out on a credit line, or whatever it is.” I’m guessing. You speak for-tell me if I’m wrong here. But it’s really important to talk about this last mile stuff at the beginning of the project, and then cover ‘How might we’ questions to get what you mentioned.

Bill: Well, there’s a couple of points in there, Brian. Number one is that last mile is where you really are turning your analytic outputs into business outcomes. And so we didn’t expect Mary to go and pull up a record that showed everything this person hit. We created a score, a series of scores, that said how important is this person to us? How much are we willing to give them?

And within the guidelines-it isn’t like there was an AI robot saying, “You get $2,000.” She had the ability to make a judgment. Now, what we also did is when she made a judgment as far as how much to give somebody, we recorded it because we wanted to learn if that was a good decision or not. So, we want to use AI and machine learning to give recommendations, and scores, and guidance to the frontline employees, and technicians, and people giving out cash, the cashiers, but we also want to empower the humans to override that.

Let me give you a really cool story. So, we know that most automobile loans today are driven by AI models. And they’ve got a model that works, and if somebody walks in, and their past payment and credit history doesn’t look worthy, they’re out the door. They’re automatically rejected. Well, they did this once, and one of the women who was at the bank who was giving out the loans, she said, “Well, tell me why do you want the money?”

And the person said, “Well, I’ve had a rough life. I’ve been in and out of prison, and I’ve had some problems, and I’ve decided I’m going to become an Uber driver, and I need to buy a car.” Now, think about it. Now, all of a sudden, you realize that this person would have been denied a car loan, but this person is going to get a car loan in order to make money; they’re going to put it to use. And she asked a few more questions to understand what his plans were, his process, kind of did a sanity check on the business model, and gave him the loan.

Now, what she wants to do is she wants to immediately tag that and say, “Was that a good loan? Did that person pay it back, or did the person not pay it back?” Well, in this case, the person bought the car, became an Uber driver, made lots of money, bought another car, right? Pretty soon he had a series of cars and people he was working with that were doing this. So, you need to empower the human at this last mile to override what the model might say, but you want to measure how effective it was because if you don’t do that, what happens is these AI models suffer from what’s called confirmation bias.

They keep making the same decisions over and over again, and they don’t look at the outliers-the false positives and the false negatives [unintelligible 00:14:39]-that could dramatically improve not only the quality of the AI model’s decisions but also the quality of an organization’s total addressable market.

Brian: Do you find that this kind of squishy human part of it and allowing some of this human touchpoint, is this an uncomfortable thing that an organization needs to get past, whether it’s the data scientists themselves who are thinking, often, about model accuracy as being the ultimate determinant of their worth in the organization, or the management to understand, wait a second, we’re really going to let Mary or John decide how much money to be handing out at the front gate? Is that a tough thing to swallow, or do you think by the time they’ve gone through a design-driven method for building a solution like this, it’s not a difficult thing to get to?

Bill: Oh, it’s very difficult. It’s very difficult because we have senior executives who went to school, learned certain management techniques, have been in business-I always get very frustrated when I see an organization-organizational charts are the great destroyer of creativity because you put people in boxes and then you almost forbid the people in marketing to talk to the people in sales-I mean, you put them in boxes. And they’re like silos. We talk about data silos, but we create these human silos where people can’t go out. And so, most organizations operate around boxes, organizational charts, and whenever they bring in, by the way, an expensive management consulting firm to do some analysis, the management consulting firm always comes back with a new set of boxes; here’s the box you need to be in.

You know, screw boxes. We want to create swirls, we want to create empowered teams. In fact, the most powerful teams are the ones who can embrace design thinking to create what I call organizational improvisation. That is, you have the ability to mix and match people across the organization based on their skill sets for the problem at hand, dissipate them when the problem is gone, and reconstitute them around a different problem. It’s like watching a great soccer team play.

It’s not like the coach is standing up above there yelling, “Bill, you go there.” “Max, you go there.” And, “Alec, you kick the ball over here.” No. These players have been trained and conditioned, they make their own decisions on the field, they interact with each other. Watching a good soccer team is like ballet because they’ve all been empowered to make decisions.

But the minute we get into a business world, it’s like, the senior guy at the top knows all the answers, and everybody else is a friggin’ bunch of robots, and just nod your head. That way of running an organization is friggin’ dead. It is dead, and there are up-and-coming organizations that are going to knock those people on their butts because they’re going to empower all the creativity, they’re going to unleash the greatness in each of their employees by employing not only design thinking, but integrating design thinking with data science to really help to identify, codify, and operationalize all those sources of value.

Brian: Sure. Yeah, I’m totally with you. I think that the idea of owning the problem and not the-I’m going to call it “solution” in quotes, but what often is the output, not the outcome. But if you can allow the team to own the problem, it really changes that dynamic now because it opens up other possibilities for doing things, and not every solution that we come out with needs to be hit with this hammer-the machine learning hammer is my favorite one-you know, “No matter what it is, let’s hit it with that.” And so part of what I do in my seminars is to really-by framing the problem correctly, we might find out we don’t need machine learning for it, and if we can let go that yes, that is your core technical skill, but maybe you find out-what, you don’t need me on this project.

You need a more basic analytical technique here; we don’t need to build out a giant infrastructure to do this. We can get something done in three weeks, using a more elementary technique here, and that’s going to solve the problem, now that we know what it is. How does a team, though, get-you talked about the-I understand the management and some of these larger changes that aren’t going to change on a dime, but if someone was feeling like, “I get this. I’ve been through the pain. This makes sense to me.”

What’s the zero to one step? We’re not doing any of this now. What’s the first step to getting into this world? Is it to deploy design on a small project? Or, like how would you recommend an organization take a first step into doing things a different way like this? Where does it begin?

Bill: Where I’ve seen this be successful is-like you said here, Brian-you pick a use case that has meaningful business value. And in fact, what we do is we go through-in our visioning process, we go through the prioritization process, we identify use cases, and we go through a process of prioritizing based on value and feasibility over a nine-month period. We’re not going to cure world hunger. That’s a project that’s doomed from the start. So, we find a use case we can go after, we find a friendly on the business side who wants to engage with us, who sees either a growth opportunity for how using data science can help them change a thing, or somebody who’s in trouble mode, who knows, “I got to change.”

So, we find a friendly, we do a proof of value. And a proof of value can be four to six months. Pretty straightforward. And we did one of these when I was at Hitachi Vantara. Our CIO, she realized that their data lake was basically a data swamp and wasn’t generating any value. So, she wanted to try something different. She partnered with our chief marketing officer, they picked a use case, we applied this process. The proof of value generated $28 million in additional revenue. $28 million in the proof of value. Guess-

Brian: Drain the swamp.

Bill: Guess who our biggest supporter was going forward? The CIO now understood. She goes, “Oh, my gosh, I see-this is like printing money.” So, use case by use case-here she is, the data lake no longer has 25 data sources, it has the three most important, and then we add one more. She saw the light and became our strongest proponent. It delivered value.

So, what happens-and I know this is a hard concept to grok, but you have to basically build it brick-by-brick, and as a design team, as a data science team, you have to prove yourself at every step. Because the minute you make a misstep, you go back to zero. And so you may have gotten to the 20, 30, 40-yard line, but you screw up, you’re back at the goal line again. Off you go again.

Brian: How do you handle that, though? Because I see a lot of the point of design is to move quickly and to accelerate learning as fast as possible, working in low fidelity, getting ideas going visually, fast. And this includes things like journey mapping, and not just the final-as you said, you can apply this to things where there’s not a heavy user interface element; a lot of this is about the problem, the way we approach problem-solving. But to me, that failure part is very much a part of this. Not everything is going to have a Eureka moment, there’s not always going to be a success, and not everything is quantifiable. So, how do you handle that?

I don’t think every time we do this, it’s necessarily going to yield a substantially different type of outcome. I think overall on the trend, it’s a better way of designing solutions. It’s human-centered in every way and you’ll get the business value if you get the people part. But tell me about that. Maybe I misunderstood you, but it sounded like you’re saying if you don’t show a success every time, you don’t get another swing at it. That sounds like a really risky-especially if a team is trying to do this, and maybe they don’t have internal knowledge on how to do it. It sounds really risky.

Bill: It is. But here’s what you do. You cheat. You cheat by-you identify the problems ahead of time, through this envisioning process that you can be successful at. When we do an envisioning workshop, the data science team is right there with us.

It’s led by a design thinker. We have data engineers here, we have business subject matter experts. And so we cheat by making sure that we’ve invested enough time upfront to minimize the chances of failure. Because what you don’t want to have happen-what can easily happen-is you do one of these things, you have success. Everybody in the organization sees it, right?

Now, everybody wants it. The worst thing you can do is to open the floodgates and let everybody do it. Because then you’ll have everybody fail. So, what you have to do-you said a very important word here, Brian. Learning.

It’s a learning process, and so how does an organization exploit the economies of learning? Because in knowledge-based industries, the economies of learning are more powerful than economies of scale. So, you have to put in a governance process-a governance process with teeth-that says, “We’re going to review the use cases, we’re going to go use case by use case. And oh, by the way, if you do this right”-like my latest book says-“You can exploit the economics of data and analytics to drive material impact in the organization.” It’s around the economies of learning.

And it isn’t necessarily human-centered as much as it’s human empowered to help you figure out how you learn more quickly. So, you will have success. Every time we’ve done one of these things-and I’ve been doing this for 20-some years-every time we do this, we have a success. The challenge isn’t the first one. The challenge is the second and third one.

So, you don’t basically open the floodgates. And I’m serious about this governance process; you need to have a very rigid governance process that both blesses the use cases you go after and ensures that the organization is leveraging the learning from the data and analytics, use case by use case.

Brian: Mm-hm. Does that require a lot of either specialized roles or a lot of additional-not necessarily technology that needs to be enabled, but a responsibility that someone has to own in the business? Is it this cost of doing design that has to be built in along the way to keep the thing on the right track? Is that kind of what you’re saying?

Bill: I think the reason why most companies are really poor at monetizing their data is because nobody in the organization owns it. It’s spread across the chief data officer, chief information officer, chief analytics officer. You have all these people who own pieces of it. Unless you have one throat to choke who is that point of governance, collaborating with other executives to make sure you have this approach, this thing can go-well, for one thing, it probably never gets off the ground. But if you don’t have that role-I call it the chief data monetization officer, who is basically responsible for trying to drive the governance and the re-application of data and analytics across the organization-then this thing can-yeah, you’ll have your first success, but it’s not the first one that matters. It’s the third, and the fifth, and the seventh. It’s all these other ones that come after that where you get a really big impact.

Brian: Who are the right types of people to do this kind of work in an organization, especially if they don’t have designers who are trained in doing this type of facilitation, or work? Is there a personality type or something? A trend you’ve seen? I tend to feel like design thinkers can come out of almost anywhere-even certain types of very, very technical people can be really good at it; a lot of it is around the types of questions they ask. I find it’s people who have built enough bad stuff, and they’re tired of doing it that way, and they’re just like, “I’m at the point in my career where I want to work on good stuff. My job prospects are good. I’m tired of building stuff I don’t like that doesn’t go anywhere.” And those people sometimes can become really good at this. But can you tell me about your experience? If I was to tap four people, and run some small squads, and try to bring some of this into my organization, who would I look for as a leader in the data space?

Bill: The best design thinkers and the best data scientists share one common trait: they’re humble. They have the ability to ask questions, to learn. They don’t walk in with an answer, they walk in trying to seek an answer. And that’s a very different process. And here’s the beauty of this design thinking kind of approach.

Anybody can do it. Anybody. But you have to be humble. If you already know the answer, then you’re never going to be a good designer. Never. Once you have put together this team of people who are intrinsically humble, and willing to ask questions, and learn from each other, it creates this synergy around creativity.

And I would argue that we’re all born naturally with creativity. As little kids, you know, we took things apart, and we put them together. And I took things apart and put them back together and there were always extra parts left over, which always drove my dad nuts. “Oh, there goes that radio. That one’s no good anymore.”

But what happens is-and it starts in school, through things like standardized testing, and such, where we work really hard in schools, starting in grade school, and middle school, et cetera, to wipe creativity out. “Oh, that kid, he’s a troublemaker. He’s an outlier. He can’t sit there and be still.” And then we get standardized testing in college, and everything else, and standardized curriculums.

And so we wipe creativity out. The best design thinkers and the best data scientists are, by their very nature, very creative. They have a strong curiosity about what variables and metrics might be important. They leverage that curiosity to explore, and exploring is about failing. You don’t learn if you’re not failing.

So, you have to embrace-and the data science development methodology is full of failures. You are failing all the time. You’re constantly trying different combinations of variables, and metrics, and different algorithms, and different transformations and enrichments. You’re trying all these things, just to see if you can get a little bit better. It’s built on failure.

Brian: I would agree-that ability to ask questions and putting aside some of our assumptions about being the smartest one in the room, or “I know the domain the best,” or whatever, I would agree those are critical skills. Do you find that the people that end up being really good at this, does this take away from or complement the data science work they do? Do they tend to then kind of migrate out of the hands-on data science work if they do come from that area? Or is this just a different way for them to even do their technical work and everything? It just becomes part of the way they do that technical work as well?

Is it a different-do they kind of evolve into a different role within the organization? How do you see that because I could see some-just devil’s advocate, I could see some people saying, “Well, I really need those kinds of people on that really hard modeling stuff that we pay them really well to do, and we don’t have a lot of that resource, so I don’t want to have them doing this other stuff, this design stuff.” Play the other side of that argument for me.

Bill: So, imagine you’re a data scientist, and you’ve got an infinite number of ways to solve a problem. Truly infinite. There’s almost an infinite number of data combinations, of data enrichment techniques and transformations and algorithms. It’s almost an impossible job. How do you take the impossible job and make it manageable?

Well, you put guardrails around it. What we do in this envisioning process, using these design techniques, is to really understand what are the variables and metrics we’re trying to optimize against? What are the decisions we’re trying to make? What metrics and variables might help us be better predictors of those decisions? And so, we automatically start putting some guardrails in place that help the data scientist.

They’re still going to bounce around between those guardrails, but they’re not off in Etherland. They now have a concrete idea of what they need to do. It’s really this key about how do we transition data science-modern data science, data science 2.0-from outputs to outcomes? How do we transition that discipline, that practice from focusing on analytic outputs using ML and AI techniques to delivering business and operational outcomes that have meaningful financial value attached to them?

I think it’s all about the maturation of data science as a discipline. We’re not focused on the activities; we’re focused on the outcomes. And so I think what you’re seeing is that data scientists, they love this because now they know that their work has meaning, they know who their customers are-if you do the design part right, they can even envision the customer, they can walk in their shoes, they can go to the store, and see what the customer is going through, and experience it firsthand. So, my experience with data scientists-and by the way, it’s not all data scientists; I’ve had a couple of data scientists who couldn’t get this. I had to let them go. But I needed them to be able to think, and act, and talk to the customers. They needed to be a part of this process in order to be an effective data scientist.

Brian: Amen. I’m totally with you there. Jared Spool, in the design [unintelligible 00:31:37], talks a lot about exposure hours. How much exposure are the people that are making the decisions getting-and if the data scientist is doing the model, and the model is part of the solution, then they’re effectively one of the designers of the solution-we got to get more exposure time to all of the people, not just the researchers, and designers, or whoever. It’s the team that’s responsible to make the decisions; it’s really important to have that exposure: you develop the empathy; you start to foresee solutions; instead of just being reactive, you can start to be proactive and say, “Wait a second. Why aren’t we doing this here? We could so easily do X, Y, and Z over here to help this, and maybe no one’s asking about it because they didn’t know it was possible. But I know it’s possible because I’ve been trained in this, and this is a very easy thing that we could get a win over here.” I’m with you on all of that. Where did someone like you-you have a computer science background, a math background, as I understand-how the heck did you get into this? Where did you get exposed to this as someone at your level?

Bill: So I’ve always been fascinated with data and analytics because of what I can do with it. And it started probably at an early age when I was in middle school and we played this board game called Strat-O-Matic Baseball, which was a kind of precursor to baseball sabermetrics. And I quickly realized, because I knew more about math and stats, that I knew which players were more valuable than other players and I had an unfair advantage in trades and amassing a team that was pretty-you know, Murderers’ Row had a whole new definition. So I’ve always had this fascination, knowing that if you could leverage data and analytics, you could drive outcomes. The real place of indoctrination for me, though, was when I was working with Procter & Gamble in the 1980s.

Yeah, I am that old. And Procter & Gamble was moving towards a data-based decision-making culture. And we built, in ’87 and ’88, one of the very first-and maybe it was the first-data warehouse and BI environments, with Procter & Gamble’s data combined with Walmart’s point-of-sale data. And the kind of insights we were able to gain on marketing programs, and pricing, and promotions, and all these other things were illuminating. We were printing money. It was staggering.

And I remember walking out of there-and we’d all been trained in Six Sigma as a methodology-I remember walking out of there thinking, “There’s a better way to do this.” Procter & Gamble sort of got my appetite whetted on this. And so all through my life, I’ve been on this goal, Brian, to really try to understand: what is the value of data, and how do I help organizations leverage data to make better decisions? And so it just went on and on. I’ve had a lot of Forrest Gump moments in my life.

When I went to become vice president of advertising analytics at Yahoo, that was one of those Forrest Gump moments where everything I’d learned about BI and data warehousing, I had to unlearn because the way that we did analytics at Yahoo was very different than how we were doing analytics at other places I’d been. So I’ve always sort of been on this journey. And not to bore you, but it was a research project-I teach at the University of San Francisco-it was a research project we did. I’d always been fascinated with trying to understand the value of data, and so when I was at USF-I’m the executive fellow there-I was able to do a research project. I had lots of really bright, really motivated research assistants who were free, and I turned them loose on this problem. And the epiphany moment in that-when I went into this conversation, I was thinking about, “How do I show data on the books? If data is truly an asset, you’ve got to find a way to put it onto a company’s balance sheet.” And so we’re doing this project, and I asked my team to go out, I said, “Find me an asset that sits on the balance sheet that looks like data.”

And so off they go. They do their work and do their brainstorming, and one of the research assistants, she comes back to me, says, “Professor Schmarzo,” she says, “I got to be honest, I can’t find anything.” She says, “Data isn’t like anything we have on the balance sheet.” She said, “Think about it. It never wears out. It never depletes. The same data set can be used across an unlimited number-an infinite number of use cases at zero marginal cost.” And that’s when I realized, “Oh, my gosh, I’ve been thinking about this entirely wrong.”

That zero marginal cost comment reflected back to the marginal propensity to consume and the economic multiplier effect, and I realized that my approach all along had been wrong in how I viewed data as a standalone asset. But when I took a look at it from an economic perspective, from this economic multiplier effect, I realized the value of data isn’t in having it. The value in data is how you use it to generate more value. And that just launched everything I’ve been doing on economics. I’m now working on a concept around nanoeconomics.

Like I said, I mentioned in my book-the book’s called The Economics of Data, Analytics, and Digital Transformation, probably the most boring title one could ever think of. But it speaks to the heart of the opportunity: this whole conversation is around economics. And I would argue, in the same way that design thinking is learning how to speak the language of the customer, economics is about learning how to speak the language of the business. And when you bring those concepts together around data science, that’s a blend that is truly a game-changer.

Brian: Who would get the most out of your new text?

Bill: I think anybody: students, professionals, retirees, anybody who’s trying to understand, “How do I advance my career by understanding more about how one exploits the value of data and analytics?” would benefit from this. I did a keynote at a large industrial company about two weeks ago, and after the keynote, one of the vice presidents said, “Your book is going to be mandatory reading for all of our leaders because our leaders need to transform how they think about data and analytics.” And it’s not just a technology conversation, it’s how do we leverage design and human empowerment in order to create a culture and a company of continuous learning and adapting? So, I think anybody can benefit from it.

But it’s not a fun read. It’s a horrible read; it’s a boring read because it was written as a textbook. It’s deep. It makes you do homework assignments at the end of each chapter. But if you’re really serious about understanding why data and analytics is such a unique asset, and how you personally and professionally can advance your career with it-I’ve gotten a lot of very positive feedback on the book as far as changing how people think about their careers, whether they’re a nurse, whether they’re a data scientist, whether they’re a teacher, whether they’re a technician. Anybody whose career can benefit from data and analytics and making better decisions, I think, will enjoy the-well, honestly, enjoy is the wrong term-I think they’ll get value out of the book.

Brian: Yeah, that’s great. That’s great. This has been a great conversation. I really appreciate you sharing all these insights. Just kind of in closing, is there one particular message that you would send out to the leadership community in the data science and analytics and product space here about all the things that you’ve learned, putting together the economic side of data, your use of design as a strategic way of problem-solving within businesses to create better solutions? Is there one message you’d like to kind of leave them with?

Bill: Yeah. Here’s the message I leave them with. I believe in knowledge-based industries, economies of learning are more powerful than economies of scale. And organizations need to work hard to create both a technology and a cultural environment of continuous learning and adapting. In my book, chapter nine-which I think is the most powerful chapter in my book-it has nothing to do with technology or economics, it has everything to do with team empowerment.

If organizations are going to truly create a culture of continuous learning and adapting to learn faster than the competition and to adapt more quickly, then you have to empower your frontline people. You have to empower the frontline people because that’s the point where machine learning and human collaboration is going to drive new sources of customer, product, and operational value.

Brian: Love it, love it. Love it. This is so good. Where can people follow you or get more insights? Do you have a mailing list or something like that? What’s the best place to follow?

Bill: On LinkedIn. I try to post about one blog a week. My goal is to continue to write new chapters for the book. I mentioned this concept around nanoeconomics, which I think is very much a game-changer. It’s a new concept, and I’ll create the equivalent of a chapter that would go in the book [unintelligible 00:40:02] actually won’t.

I’m also working a lot right now on ethical AI and how do organizations create a culture that enables you to overcome the confirmation bias that drives AI to do unethical things? So, LinkedIn is the place to find me. Come hang out on LinkedIn.

Brian: Awesome, awesome. Well, we’ll definitely link that up. The Economics of Data, Analytics, and Digital Transformation if you’re interested. Bill Schmarzo, this was such an awesome conversation. Thank you so much for coming on the show.

Bill: Thanks, Brian, for having me. It was a lot of fun.
