Experiencing Data with Brian T. O'Neill

Aug 27, 2019 | 00:45:03

Show Notes

Ahmer Inam considers himself an evangelist of data science who’s been “doing data science since before it was called data science.” With more than 20 years of leadership experience in the data science and analytics field at companies including Nike and Cambia Health Solutions, Ahmer knows a thing or two about what makes data science projects succeed—and what makes them fail.

In today’s episode, Ahmer and I discuss his experiences using design thinking and his “human-centered AI” process to ensure that internal analytics and data science initiatives actually produce usable, useful outputs that turn into business value. Much of this was formed while Ahmer was a Senior Director and Head of Advanced Analytics at Nike, a company that is known as a design-mature organization. We covered:

Resources and Links

How Analytics Are Informing Change At Nike

LinkedIn

Quotes from Today’s Episode

“Build data products with the people, for the people…and bring a sense of vulnerability to the table.” — Ahmer

“What I have seen is that a lot of times we can build models, we can bring the best of the technologies and the most optimal platforms, but in the end, if the business process and the people are not ready to take it and use it, that’s where it fails.” — Ahmer

“If we don’t understand the people and the process, the adoption is not going to work. In the end, when it comes to a lot of these data science exercises or projects or development of data products, we have to really think about it as a change management exercise and nothing short of that.” — Ahmer

“Putting humans at the center of these initiatives drives better value and it actually makes sure that these tools and data products that we’re making actually get used, which is what ultimately is going to determine whether or not there’s any business value—because the data itself doesn’t have any value until it’s acted upon.” — Brian

“One of these ideas that’s been stuck in my ear like an earworm is that a lot of the models still fail to get to production. And so this is the ongoing theme of basically large analytics projects: whether you call it big data analytics or AI, it’s the same thing. We’re throwing a lot of money at these problems, and we’re still creating poor solutions that end up not doing anything.” — Brian

“I think the really important point here is that early on with these initiatives, it’s important to figure out: what is going to stop this person from potentially engaging with my service?” — Brian

Transcript

Brian: All right. Welcome back to Experiencing Data. Today I’m happy to have Ahmer Inam on the phone to talk about data science, design, advanced analytics, and a bunch of other good things. We recently met on LinkedIn, and I really liked some of the stuff Ahmer was talking about when we had just kind of an ad hoc call about some research that I’m doing. And there was a nice click, and so I wanted to have him come on this show and repeat some of the cool things he said.

So Ahmer, welcome.

Ahmer: Thank you, Brian. Thank you for having me on your show.

Brian: Yeah. So tell my audience a little bit about who you are and where you’ve been and where you’re going.

Ahmer: Yeah, I’ll take just a couple of minutes. I consider myself an evangelist of data science; that’s kind of how I describe myself. I have been doing data science since before it was called data science, for almost 20 years at this point. I’ve worked in many different industries, everything from financial services to healthcare. My journey has taken me to Wells Fargo, PwC, Nike, and various other places.

Brian: That’s great. And so, can you tell people a little bit about your particular role and your expertise? Are you building the models yourself? Are you leading a team? What’s your kind of meat and potatoes?

Ahmer: My journey really started back in the day with institutional research: looking at psychometric evaluations, understanding human behavior, attitudes, and things like that. Really hands-on work. I moved from there to more statistical modeling in financial services, and over time the trajectory of my career slowly transitioned into building and developing teams.

That meant more and more advanced work with machine learning, along with building bigger and bigger teams and taking on larger responsibilities. Most recently, in the last three places that I’ve been, I headed up the data science, machine learning, and advanced analytics functions at some of the firms I mentioned.

It’s been fun. I mean, what excites me is how transferable data science is. When I said I consider myself an evangelist of data science, one of the things that keeps me going is these different industries and companies that are looking toward transformation and strongly believe that data science has to be a part of that equation.

I love being part of that journey: coming in, starting things from the ground up, setting strategies and roadmaps, executing, and then solving interesting, challenging problems.

Brian: Can you give me an example of a … if you’re coming in clean to a company, and they’re trying to do something new and innovative with data, for example. Well, first of all, is that the kind of mandate you’d get? And secondly, how do you take a broad mandate, to perhaps use AI or something like that, and get to the first something that’s valuable? How do you do it?

Ahmer: Yeah. It’s fascinating sometimes, and that’s a good question, because at a lot of the companies that I’ve been in, the mandate has been, “Do something with AI. AI is this thing we’re hearing about and we want to talk about it, so just go do something with AI. Give me something,” right? So it can be something as vague as that, to “Our competitiveness in the marketplace is not as strong as it used to be; others have leapfrogged us or are leapfrogging us; we need to do something about it. Can analytics help with that?”

So coming in, usually what I do is take more of a management consulting approach and dig in. I sit down, starting with senior leadership, and try to understand where they’re coming from and what they are trying to do: understanding the why before we can understand the what and the how.

Understanding the why in the sense of understanding the pain points. Looking into the business’s strategies, for example: “Hey, what is on your mind? What is your charter for the year? What KPIs are you utilizing to measure the effectiveness of the business, the profitability, the growth, and the problem areas?”

Through that initial dialogue and discovery, we start to narrow the focus a little bit and go deeper into some of the functions. It depends on the type of company; for example, when I was doing this work in automotive retail, inventory and pricing are essentially what drive the company.

So essentially I’m thinking about, “What is the currency of that particular organization?” and then digging deeper into that core function of the firm. Through that dialogue and discovery, I’m understanding and evaluating what needs to be done and what can be done. Along with that is an assessment of the quality and health of the data, the quality and health of the platform infrastructure, and the quality and health of the business processes.

What I have seen is that a lot of times we can build models, we can bring the best of the technologies and the most optimal platforms, but in the end, if the business process and the people are not ready to take it and use it, that’s where it fails. I’ve seen a statistic that something like 87% of data science models don’t make it into production, and I have strongly seen that. One of the things around that is that the work really needs to get out of that experimentation phase, and one of the approaches I have taken is more of a rapid prototyping, incremental development, and continuous testing approach.

So throughout that process, and through my experience, what I have seen is that human-centering is very critical to success. Unless we augment human decisions, with the human at the center of it, any kind of production, development, or utilization is going to be hard.

So again, going back to your question about how I come in: first, really, it’s a very comprehensive assessment of readiness and maturity. Along with that readiness and maturity is also trying to understand who is going to be an advocate of the work, who is going to be a detractor, who is persuadable, and who I should not even bother talking to.

From there, that assessment usually leads to a document that lists out what is possible and doable, and then working with the leadership that supports analytics to figure out the prioritization of what to do next and where to go from there. That step is very essential, because that’s where the buy-in comes in. And after that, once the budget comes through, I usually don’t try to bring in a lot of folks and start to build out an empire or something.

The approach usually is to get a couple of folks in, perhaps some augmented workforce, either through a partnership with an analytics consulting firm or outsourced resources, just to get some momentum with experimentation to prove out the value. That helps justify requesting the budget to start building out the team. But that experimentation phase is really, really critical in legitimizing this new way of thinking about the business.

Brian: That’s great information. And as you know, part of the reason why I felt we were like-minded is because, as you said, putting humans at the center of these initiatives drives better value, and it actually makes sure that these tools and data products we’re making actually get used, which is what ultimately is going to determine whether or not there’s any business value, because the data itself doesn’t have any value until it’s acted upon.

Ahmer: No.

Brian: One of these ideas that’s been stuck in my ear like an earworm, or just in my head, is that, as you mentioned, a lot of the models still fail to get to production. And so this is the ongoing theme of basically large analytics projects: whether you call it big data analytics or AI, it’s the same thing. We’re throwing a lot of money at these problems, and we’re still creating poor solutions that end up not doing anything.

Brian: And I kind of wonder, putting aside the technical reasons why things can fail, like, “Well, it’s a great model, but it’s too complicated,” or, “We don’t even have the machine power to use it.” Okay, that’s a technical problem. But when we say, “Oh, we couldn’t get the cross-departmental thing, or the right integration,” do you agree that ultimately the finance department or the sales team are still people, even though they’re business departments?

Part of human-centered design thinking is that these are actually people. Even though they’re business departments, and businesses have their own kind of vibe, they’re like an organic creature, and I get that, there are still people there. And I wonder if the reason these things get lost is because we think of them as, “I don’t know, we gave it to IT and IT didn’t do what we asked,” and we treat the department like this behemoth thing as opposed to people. Do you agree with that?

Ahmer: No, I 100% wholeheartedly agree with that, and I’ve seen that so much. Of course, I’ve had failures too, right? The first instinct usually is, “My team and I are going to come in, and we’re data scientists. We are going to build this amazing model that’s going to help everybody, so why won’t they just love it and use it?” And we would build something, take it out, get head nods and “Yeah, looks good,” and then try to get it into the hands of the people, and the utilization is just not there. There’s no trust. People say, “Yeah, there’s another one.” What I’ve seen is, and I don’t know whether this sounds negative, but let me think about how to phrase it.

What I sometimes run into is that business organizations, especially the end users, respond to it as another IT-type thing: “They’re going to come give us some sort of new software or new tool; why won’t they just let us be?” I’ve seen that a lot. Or: “We know our business better. We know the ground realities better.”

“Here comes management with the new flavor of the month; surely this will disappear.” I’ve actually heard that type of language very openly. So more recently, I began to sit down with end users to try to truly understand the mindset, again going back to human-centering. The third thing I also hear is this bundling of data science as an IT initiative, because a lot of end users have a hard time differentiating, and a lot of these enterprises have gone through ERP implementations and things like that, and all the pain that comes with them.

So it all kind of gets bundled in. And when that happens, there’s a built-in expectation of complexity and a lack of awareness, understanding, and transparency; when those expectations are built in, getting over that hump is hard.

So to make these data science projects more successful, we have to understand that reality, these learned behaviors at the enterprise level among the people and processes. Without truly acknowledging and understanding that, and without tackling these data science implementations as essentially change management, it’s going to be really hard to actually get successful implementations of data science-based data products into the hands of people.

Brian: Do you have any concrete examples of how you worked with an internal customer who’s a recipient of one of these “IT tools” that are going to be thrown upon them, and maybe you did it a different way? I don’t want to feed you what I might do, but something with a before and after, where “let’s try it a different way” meant you actually involved that person in the creation of a service that’s going to help them, as opposed to delivering them something that we made in the closet over here. Do you have a story like that, and what was that experience like?

Ahmer: Yeah, I actually have several examples of that. That really has been the approach I have taken for the last seven years or so in developing, building, and implementing data products. Going back to the failure first, because we all learn from our mistakes: the failure really was that approach where we built some models. I was at an automotive retailer at that point, and we came in like, “Yeah, we’re going to do this pricing optimization for vehicles, new-car pricing.” We gave it to all of these general managers in different stores, their new-car pricing managers would use our pricing, and things were going to be better.

And it failed, because we focused only on a very typical process: data. Like, let’s use the data, understand from the data, talk about and understand the business, look at the breadth and depth of the business, and try to understand all of the complexities. We did the best we could from a data science perspective. But we truly didn’t hone in on, or even involve, the end users at all, assuming that, “Yeah, if it’s wrong, they can always override it.” We built that override functionality into it.

But when we brought that pricing to the end users, which were the new-car sales managers, I mean, they hated it. They didn’t like it. The first reaction really was this extreme negativity, where they were just picking it apart: “Why would you price it like this? Why did you do that? Why would you do it this way? This price is way off; you’re going to lose so much money on this one.”

So after that initial setback, we said, “There’s a lot of actual value in this feedback. Let’s listen to it.” And then the introspection started and the dialogue started. The dialogue would be: let’s take a look at this particular car. Why do you think this is highly desirable at retail? This is a BMW something-something that typically sells at this much margin on average, and the sell-through usually is this much.

So based on that, if we want to optimize between the sell-through and the margin, the model is predicting that it should be priced this way. And the car sales folks would say, “Well, there’s no way I can sell this car for that much, because this car is brown.” Like, “Oh, uh huh.” Or, “This particular truck, we modeled it for this much.” And it’s like, “No, you guys have modeled it $10,000 lower than it should be.” Okay, “So what are we missing?” “Well, we put an aftermarket lift kit on this, which is highly desirable in this particular market, in South Carolina.”

So that, they said, is an issue of data not being available, and of desirability at that local-marketplace level, things like that. There was also an aspect of competitiveness that people were talking about. So again, through that dialogue and introspection, we acknowledged, “No, you guys are right. We are basically a bunch of geeks who played with data, and without your help and participation, we can’t get it right.”
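(For readers who want to make the sell-through-versus-margin trade-off concrete, here is a minimal Python sketch. Everything in it is invented for illustration: the logistic demand curve, the dollar figures, and the `color_penalty` feature standing in for the missing data Ahmer describes. It is not the retailer's actual model.)

```python
# A toy version of the trade-off: a higher price raises margin but lowers
# the probability of a timely sale; pick the price maximizing expected profit.
import numpy as np

def sell_probability(price, base_price, color_penalty=0.0):
    """Toy logistic curve: demand falls as price rises above the base price.
    `color_penalty` stands in for features the first model missed (e.g. an
    unpopular brown exterior; a desirable lift kit would be a negative value)."""
    return 1.0 / (1.0 + np.exp((price - base_price) / 1500.0 + color_penalty))

def best_price(cost, base_price, color_penalty=0.0):
    prices = np.linspace(cost, cost + 15000, 200)
    expected_profit = (prices - cost) * sell_probability(prices, base_price, color_penalty)
    return prices[np.argmax(expected_profit)]

# Ignoring color, the model prices higher than the market will bear; once the
# "brown car" penalty is known, the optimal price drops.
print(best_price(cost=25000, base_price=30000))                     # naive model
print(best_price(cost=25000, base_price=30000, color_penalty=1.0))  # with color info
```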

So that acknowledgement and understanding, again, being a little bit vulnerable, is a good thing with your end user, at least in my opinion. Many others may disagree with that, but-

Brian: Could you just explain? I think you just said something really important. Share with my audience what it means to be vulnerable in that context.

Ahmer: Yeah. Vulnerability in this sense is a perspective that I’ve found to be very successful. We’re all scientists in the end, and science is about learning: being open and introspective, and being able to understand where things work and don’t work.

So if we come to the table with the posture that we can do everything and anything, we’re not going to learn. Vulnerability in this sense means coming to the table with a real sense of empathy and respect for the others as well.

Maybe the person on the other side of the table doesn’t have a college degree, but they’re doing a job that they’re good at, and we must learn from that, because we are trying to build a system that is supposed to augment their world. We have to earn their … again, the empathy component is critical, because without that empathy and transparency, number one, we can’t build that human connection. And without that human connection, we can’t build a solution that they would essentially accept.

And by acceptance here, I mean … I’m kind of using the term loosely there, because sometimes you just have to have a management mandate for the tools we put out there to be used. I think what I’m trying to get to is acceptance of a product, versus something that is forced on them. When you have acceptance, you actually have people who will wholeheartedly use it and participate with you. They make the product better, and as part of that, we learn a lot too.

So again, coming to the table with that sense of respect and empathy is very critical, and then a little bit of vulnerability in the sense of, “We don’t have all the solutions. Let’s work on it together.” That brings transparency to the process.

And transparency, again, as you mentioned: there’s a lot of distrust of black boxes, and without that sense of transparency and understanding, people won’t trust them. A lot of people do see AI and things like that as potentially a replacement for humans. We have to get through that barrier, and one of the ways to do that is really understanding people and helping them understand that this is not a replacement but an augmentation, and it cannot be an augmentation unless you participate in it.

Brian: Yeah, I love what you’re saying here, and I think one of the really important points is that there’s very impressive technical work that data scientists and people working in advanced analytics are doing. It’s very impressive on an academic level, but it may not be impressive on a real-world business level to someone who doesn’t understand how it’s going to make their life better, or how they’re going to look better to their boss, or whatever it may be.

So I think the really important point here is that early on with these initiatives, it’s important to figure out, “What is going to stop this person from potentially engaging with my service?” Ultimately you want to know what will help them, but you definitely also want to keep an ear out for what would make them disengage. As soon as they see, “There’s no fricking way I’m going to sell this truck with this lift kit priced like this in South Carolina,” or whatever. As you said, you have no idea what their life is like, you know nothing about that market, but you just assume that because your model’s 96-point-whatever percent predictive, of course they’ll use the recommendation you gave them.

No, they won’t. Now, their opinion might be different if they had participated along the way, and maybe talked about some of these things, and learned, “Oh, I know this model may get things wrong, because right now there’s no color information.” But they know that, they know the team is working on it, and they know we’ve said, “Color is way more important than we thought.” And I’ll bet a lot of the data people never thought color was really that relevant to the overall pricing. They’re like, “Are you kidding? Does it have a cup holder?” We all know these stories: “Well, I bought this one because I liked the feel of the seat and the cup holder.” And this is why they always want you to take a test drive, right?

All the qualitative, mushy kinds of things that determine value to somebody. The same thing applies to the guy selling the car too, right? “Are you going to help me make more money? Get my commission up? Make me look good to my boss?” Those are the things you want to uncover as part of your mission to provide data. It’s not just about the software; it’s about making the software engaging.

Ahmer: Yes, exactly. And the human-centering that I’ve been speaking of is not just putting your end user at the center of it; your end users are the last mile that connects to the consumers, and they are the best in terms of having that pulse on the consumer.

So if the ultimate goal of human-centering really is to center it on the consumer, these folks on the front line really are the ones providing some of the best insights on what the consumers actually and really want, because they have those daily interactions.

So through that interaction and their participation, we can build products that are closer and closer to what consumers want.

Brian: Sure. So you told us that before story, and some learning happened there. Can you go down a level into how you do it now, really tactically speaking? We all talk about being human-centered, like, “Oh, I involve my stakeholder,” but it’s so general. What does it really mean? What are you doing all day to get them involved? Could you comment on that?

Ahmer: Yeah, that’s a good next step in the dialogue we’re having. Like all science, I’ve gone through a process of evolution, understanding from the mistakes and the failures what we need to do to get to success. So digging in a little bit here: my process now, I roughly call human-centered data science, or human-centered AI, because I want to make sure that human-centering is part of the process.

The way the process works now is this: the first phase is still a very management-consulting-type approach, with a deep-dive discovery and lots of interviews up and down the ladder, from senior management to end users, to truly try to understand the depth and the breadth of the problem.

As part of that, you’re not only understanding what we are trying to solve, but what type of data we would need to bring in: internal, external, different factors. And with that, we’re continuously testing and checking against the readiness of the data and technology platform.

The other critical element of the discovery is that I’ve started engaging lean coaches as part of the initial discovery, and the lean coaches truly get embedded into the evaluation of the people and the process aspects. Because in the end, if we don’t understand the people and the process, the adoption is not going to work.

In the end, as I mentioned, with a lot of these data science exercises or projects or development of data products, we have to really think about it as a change management exercise, and nothing short of that.

So having that change-management mentality from the get-go is very critical. Understanding the people and the process also helps you understand who is going to be a promoter versus a detractor, who we need to engage, who the experts are, things like that.

As part of this deep discovery, which takes about two weeks, sometimes we would also do these gamestorming sessions where we bring in, let’s say, three different layers of folks from the function: some sort of middle-management layer, not the senior-most leaders, all the way down to the end users.

We try to do some brainstorming, gamestorming-type sessions, very design-thinking-like, just to get people out of their shells and start thinking about business problems without the burden of whether something is solvable or not. Like, “Leave that burden to us.”

These are open dialogues, taking design-thinking-based approaches to brainstorming: lots and lots of sticky notes on the walls, letting people’s thinking run a little wild. Then we bring all of that in, start to consolidate, and put everything into different buckets, doing a kind of manual classification or segmentation of what we have heard. It’s our topic modeling, in this case.
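(A quick illustration of the “bucketing” step Ahmer mentions: a minimal topic-modeling sketch in Python over invented sticky-note text. The episode doesn’t specify his team’s actual tooling; off-the-shelf LDA from scikit-learn is used here purely as an example, and the notes below are made up.)

```python
# Cluster brainstormed problem statements into rough topic "buckets".
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

sticky_notes = [
    "pricing feels inconsistent across stores",
    "inventory sits too long on the lot",
    "we cannot see demand by local market",
    "price overrides take too many approvals",
    "aged inventory ties up working capital",
]

counts = CountVectorizer(stop_words="english").fit(sticky_notes)
X = counts.transform(sticky_notes)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# Print the top words per bucket, mirroring the manual grouping on the wall.
vocab = counts.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top_words = [vocab[i] for i in topic.argsort()[-3:]]
    print(f"bucket {k}: {top_words}")
```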

With that bucketing of topics, we start to truly understand the breadth of the problem and what we need to bring in. For example, when I was at Nike, we built an optimization tool for Nike’s high-heat footwear. The high-heat business model is really about high-demand, scarce products in the marketplace that create buzz for the brand.

With that buzz comes a halo effect that pulls the rest of the inline products along with it. So it’s actually a very critical product strategy for Nike, from a brand perspective and in what it does for the rest of the business, but it’s also a very complex problem to solve, because so much art goes into it.

The art being human decisioning, understanding, and awareness, because we are really trying to hone in on and understand this sneakerhead culture: the subculture of sneakerheads, what they want and don’t want, and what drives them.

What are all the factors that impact the desirability of a certain product, so that we can optimize the volume we put into the marketplace to keep that buzz alive and keep people pining for more?

Without truly understanding those end users, we can’t really build the model. If you just use a typical data science approach to building a demand forecast, you’re going to look at previous sales, maybe some product features, and some external macroeconomic factors; build a model; test it; back-test it; try to get some sense of accuracy; play with hyperparameters so much that you finally get a good model; and then put it out there. But it just wouldn’t work in situations like these.
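(To make that “typical approach” concrete, here is a bare-bones sketch of the lagged-sales demand forecast Ahmer says falls short for high-heat product. The file name, column names, and model choice are all hypothetical, chosen only to show the shape of the baseline.)

```python
# Baseline demand forecast: lagged sales + product/external features,
# fit and back-tested on rolling time splits.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import TimeSeriesSplit
from sklearn.metrics import mean_absolute_error

df = pd.read_csv("weekly_sales.csv")  # hypothetical weekly sales by SKU
df["sales_lag_1"] = df.groupby("sku")["units_sold"].shift(1)
df["sales_lag_4"] = df.groupby("sku")["units_sold"].shift(4)
df = df.dropna()

features = ["sales_lag_1", "sales_lag_4", "price", "consumer_confidence"]
X, y = df[features], df["units_sold"]

# Back-testing, as described in the episode: train on the past, score the future.
for train_idx, test_idx in TimeSeriesSplit(n_splits=3).split(X):
    model = GradientBoostingRegressor().fit(X.iloc[train_idx], y.iloc[train_idx])
    preds = model.predict(X.iloc[test_idx])
    print(mean_absolute_error(y.iloc[test_idx], preds))
```

Notice what the feature list leaves out: the product’s story, KOL associations, and desirability signals, which is exactly Ahmer’s point.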

So that evaluation means understanding that what matters here is the desirability of the product: the story each product is telling to the consumer, the look and the feel of the product, how much value the brand is putting on it. Are there any KOLs, key opinion leaders, associated with this particular product in the marketplace? How did similar products do? Even factors like sports: a certain athlete’s performance over a season, or week-to-week performance, and many other factors, like competitiveness.

There’s so much breadth to the things that impact a product strategy that understanding and honing in on them is very critical. In many cases, the data may not even be available to just put into a model, so coming up with creative ways of developing and adding that data to the model is critical.

That deep-dive discovery really helps us understand at least an initial set, and it’s a highly iterative process. From that discovery, it then goes into the exploration and ideation phase.

The exploration and ideation phase is essentially the tried-and-tested exploratory data analysis, along with a data-gap assessment and a technology assessment: the existing tools in the business versus, “Do we need to bring in something new? How would it potentially scale? What would IT be able to support? And if they are not able to support it, what can we build, knowing we would end up taking on the tech debt of supporting it?” A lot of that assessment comes in this phase.

Coming out of exploration and ideation, one of the most critical outputs is strategic planning and a strategic finance evaluation. We actually do some finance modeling as well, to get an initial sense of the value the initiative might provide. There may actually be several competing or complementary initiatives that come out of that, because we’re trying to funnel down from maybe a hundred-plus initial ideas to maybe 10.
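(One way to picture that strategic-finance screen, as a minimal sketch: score each candidate initiative by a rough expected net value and keep the top few. The idea names, probabilities, and dollar figures below are invented; the episode doesn’t detail the actual finance model.)

```python
# Funnel candidate initiatives by a crude expected-net-value score.
ideas = [
    {"name": "pricing optimization",  "annual_value": 2_000_000, "p_success": 0.5, "cost": 400_000},
    {"name": "inventory rebalancing", "annual_value": 1_200_000, "p_success": 0.7, "cost": 300_000},
    {"name": "churn early warning",   "annual_value":   600_000, "p_success": 0.6, "cost": 250_000},
]

def expected_net_value(idea):
    # Probability-weighted annual value minus the cost to build it.
    return idea["p_success"] * idea["annual_value"] - idea["cost"]

# Keep the best-scoring initiatives for rapid prototyping.
for idea in sorted(ideas, key=expected_net_value, reverse=True)[:2]:
    print(idea["name"], expected_net_value(idea))
```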

Those 10 then go through rapid prototyping. This is where we start to create the scrum teams that include the lean coaches, as well as the end users who are the domain experts, as part of the scrum team itself.

So when I talked about bringing transparency, empathy, and all that into the process: in the end, we’re all humans. It’s our human nature that we all want to be respected and valued. That means including the end users from the beginning. In fact, one of my key checkpoints now is a request to the business leaders that they make certain members of their team available 100% to the initiative.

And unless that sign-on comes in, I don’t proceed with that particular project. That also ensures there is buy-in from the top, involvement from the top, and a willingness to put in their own resources as well.

So that sense of buy-in is secured a little bit earlier, in the earlier phases. This rapid prototyping phase is really a fail-fast, fail-forward approach to building several naive models, a lot of exploration. And of course the models suck at that phase. Many of them will be really bad and awful, but that’s the vulnerability component I spoke about earlier.

So when we show … and it’s just been fun. We would get the results and sit down with either the pricing analysts or merchandising planners or demand planners or sales planners, or whatever function you’re working with, and we laugh about the results. Like, “Wow, look at how bad the results from the machines are.” It’s part of building trust, and that level of transparency really, really brings trust into the process.

People also feel that analytics gets a little bit demystified for them. An example I’ll give you: let’s say we built a model for certain shoes, and some of our forecasts are better than what the demand planners may have forecast before, and some of our forecasts are really bad, right? There’s a sense of vulnerability that we bring to the table, and then our end users bring it to the table too, because they see that you’re really building a bond of trust here.

So then they start opening up, saying, “Maybe your model doesn’t know XYZ,” or, “There was a certain additional brand push behind this product,” or, “There’s a certain new technology in this particular footwear that was desirable to runners,” for example. Or, “This particular product has a really compelling story, because LeBron did something in his past that’s associated with this particular thing that LeBron’s fans really love.” So there’s that part of the story. On the other hand, they might say, “Well, we just planned this really high because we were trying to chase some revenue targets.” Stuff like that gives us additional data points. So again, that vulnerability-based dialogue brings all of these additional factors into the exercise that we would otherwise just not get.

Brian: See if you agree with this or not: by having these downstream customers, or internal customers, who are going to use this tool involved early, it’s no longer about just coming up with a super-accurate predictive model. It’s about them trusting the direction.

So even with your idea of prototyping early, where the model is going to suck and give bad information: if it’s presented in the right way, and this customer was involved in the design process of creating the tool, now you’re going to have a dialogue. The numbers suck, but they can see the mountaintop now, and they’re on the climb with you. You laugh together about how stupid it is, but they can see where it’s going, as opposed to it landing in their laps out of nowhere.

Otherwise it’s, “I don’t trust this thing. You don’t know what my job is like. You have no idea what goes into pricing this; you just took a bunch of database fields and put them together.” Whereas with this approach, they see the mountaintop with you-

Ahmer: They do.

Brian: And this is why it doesn’t have to be super accurate to provide value and get engagement.

Ahmer: And it’s a very good point, because, Brian, one thing I forgot to mention is that we run that prototyping in two-week sprints, and at the end of the two weeks is the demo with the end users: sitting down and having these deep dives, a “murder board” we may call it, to dig through the output line by line.

As part of the output, we always have a dashboard that shows, based on the accuracy of the model, the business KPI we can likely impact. So even with a sucky model, it starts to show positivity, right? They really start to see that, “Hey, as we improve this, there is a lot we can gain for the business from here.”
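(A toy illustration of that dashboard idea: translating model accuracy into a likely business-KPI impact, so even an early, weak model shows the upside of improving it. Every constant below is invented for the sake of the example.)

```python
# Map forecast accuracy (MAPE) to an estimated annual KPI impact versus the
# baseline planning process; a falling error translates into rising savings.
def kpi_impact(mape, units_per_year=100_000, cost_per_unit_error=12.0,
               baseline_mape=0.40):
    """Estimated annual savings relative to the baseline process."""
    error_reduction = max(baseline_mape - mape, 0.0)
    return error_reduction * units_per_year * cost_per_unit_error

# A prototype improving sprint over sprint: the dashboard shows the trend.
for mape in (0.38, 0.30, 0.20):
    print(f"MAPE {mape:.0%}: ~${kpi_impact(mape):,.0f}/yr")
```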

Brian: That’s great feedback. Can you tell us what this murder board is? Because I don’t think everyone’s going to know what that is.

Ahmer: Sorry, yeah. A murder board is another concept from management consulting. The idea, in innovation or prototyping, is that you put an idea onto the board. There may be many different variations of what I’m calling a murder board, but the way we use it is: you put multiple ideas or outputs on a board, and you do your best to kill them. The idea is to not let them survive. If an idea survives that level of scrutiny, then it is good. Survival of the fittest.

Brian: This is great, man. It’s been great chatting with you. We’re running up against our time, but I wanted to ask: do you have any closing words? Is there one thing, some kind of change, for someone maybe in your position working with a data science team, a piece of advice or something that you’ve learned? Like, “Before, I did this; now I do things this way.” A learning that you can share with our audience?

Ahmer: Yeah. At least for me, the formula for success has been: build data products with the people, for the people. Bring a sense of transparency and vulnerability to the table, and build in increments. Science is in the title of data science, and we should approach it that way, with continuous evaluation. That evaluation and evolution will lead to products that go into production, get adopted and utilized, and actually start to drive business value that is measurable.

Brian: Great. I think that’s great advice, and thanks for sharing.

So lastly, if someone wants to get in touch with you or learn more … I know you do some speaking and some writing, so how does someone check in with you online?

Ahmer: LinkedIn really is the best medium to connect with me.

Brian: Okay, great. I will put your LinkedIn link in the show notes. Ahmer, it has been great to talk with you about your work. I love what you’re doing, I love the centering around humans and people, so we’re not putting crap out there that no one uses. So yeah, thank you for coming on the show.

Ahmer: Thank you for giving me the opportunity, Brian.
