Kathy Koontz is the Executive Director of the Analytics Leadership Consortium at the International Institute for Analytics and my guest for today’s episode. The International Institute for Analytics is a research and advisory firm that tracks the latest trends and best practices within the analytics field. We touch on how these strategies are used to build accurate and useful custom data products for businesses.
Kathy breaks down the steps of making analytics more accessible, especially since data products and analytics applications are increasingly being used by front-line workers rather than PhDs and analytics experts. She uses her experience with a large property and casualty insurance company to illustrate her point about shifting your company’s approach to analytics to make it more accessible. Small adjustments to a data application can make the whole process more effective and comprehensible.
Kathy brings some great insights to today’s show about incorporating analytic techniques and user feedback to get the most value from your analytics and the data products you build from that information.
Thank you for joining us for today’s episode of Experiencing Data. Keep coming back for more episodes with great conversations about the world of analytics and data.
“Oftentimes data scientists see the world through data and algorithms and predictions and they get enamored with the complexity of the model and the strength of its predictions and not so much with how easy it is for somebody to use it.” — Kathy Koontz
“You are not fully deployed until after you have received this [user] feedback and implemented the needed changes in the application.” — Kathy Koontz
“Analytics, especially when it’s being deployed pervasively, is maybe not a project but more of a transformation program.” — Kathy Koontz
“Go out and watch the user group that you want to utilize this data or these analytics to improve their performance.” — Kathy Koontz
“Obviously, it’s always cheaper to adjust things in pixels and pencils than it is to adjust them in working code.” — Kathy Koontz
Brian: I’m excited to have Kathy Koontz on the line. She is the Executive Director of the Analytics Leadership Consortium at the International Institute for Analytics. That’s quite a mouthful. Did I get that totally right?
Kathy: Yeah. You did. I tell people, “Yes, it is a real job.” Yeah, you got it right, you nailed it.
Brian: Tell our listeners, what does that mean? What do you do?
Kathy: The International Institute for Analytics is an organization that was founded by Tom Davenport, one of the early leaders in using analytics for business performance. We are a research and advisory firm that helps companies realize value from analytics.
Brian: How do you work with them in your leadership capacity there? What’s your specific role there?
Kathy: I lead the product line that’s called the Analytics Leadership Consortium. What that is, is a group of analytics executives from different companies in non-competing industries who meet on a regular basis to better understand trends and best practices in analytics, to vet ideas with one another, and to ensure that they’re doing the best that they can to deliver analytics value for their organization.
It’s a really great opportunity for leaders in this field of analytics, which is changing a lot and has a lot of emerging practices, to have regular time to get together in a confidential setting to understand what’s working and what’s not, and how they can improve analytics value for their company.
Brian: You mentioned trends. I obviously try to stay on top of what’s going on in that industry, and that’s actually how I came across you originally: you had put on a webinar about five trends in analytics going on right now. At the end of that, you mentioned that one of the things starting to change now is the importance of design and user experience as we move beyond designing reports, one of the most difficult deliverables, so to speak, of analytics. As we move into the user experience, it’s becoming more important. That’s when I thought, “Oh, this could be really interesting to hear what you have to say about what’s changing. Why is UX now relevant? How is capital-D Design relevant to the world of analytics?” That’s what I was curious to learn about. Can you talk about that a little bit?
Kathy: I think as analytics matures in organizations, the need for design is what’s going to drive adoption and utilization of those analytics. In companies that are just starting their analytics journey, it’s a little bit easier to realize analytic value by doing a couple of really big data science projects that don’t really require a lot of design thinking. It’s a PowerPoint to an executive group, with some higher level of organizational thinking; the executives lean in, understand the analytics, understand the value of making their decisions in line with the analytics, and then move on and do their regular roles.
But as organizations try to make analytics more pervasive, particularly among front-line associates or individual contributors who are using analytics to make a lot of small decisions within the execution of their work, or as they try to integrate analytics into processes that are monitored, that design thinking, taking a user experience approach, can really break down a lot of the barriers organizations encounter when trying to get people who are not necessarily used to using analytics in their decision-making process to use them, so that they can make better decisions for the organization.
Brian: Got it. Is the trend that people are recognizing the problem because they went through some pain, maybe they delivered some big multimillion-dollar platform and, like you said, the front-line associates didn’t use it? As I always use the example: they’re out driving a truck. They don’t have a laptop. They’re not going to go download a report and change the columns or whatever. It’s the wrong mechanism; you didn’t fit it into their job, you tried to get them to change their job to accommodate your tool. Is the awareness there because they went through a failure, or is it more, “We already know this isn’t going to work if we don’t get the UX right,” just because companies are a little bit more design-aware these days? What drove that to change? Why is it now?
Kathy: I think there are two things. First of all, they’ve failed. They invested millions of dollars in some sort of decision-support application, with perhaps millions of dollars of data integration work done to build it, and purchased a lot of really advanced software that can help users slice and dice the data and dive in and understand it. But they really took more of a data-centric approach rather than a user-centric approach.
They really didn’t take that extra step to say, “How does this person go through their normal day in their decision making? Where do they need this information in that decision making, and how should it be best presented so that they don’t have to do any cognitive task switching, so it just fits into how they go about making the decisions?” I think those failures are one big driver.
But then I also think, as organizations move up the analytics maturity curve and move from BI reports to predictive analytics to prescriptive analytics, those prescriptive analytics are going to be much more pervasive across the organization. I think it’s this analytics maturity that’s also driving the need to put more design thinking into the creation of analytic products.
Brian: Do you have any specific examples of a company that may have like, “Version one was X, and we didn’t get what we wanted. Version Y or X.2—we then went back and tried to fit this in better with that employee, and we saw some kind of change of result.” Can you cite any examples of how design allowed the data to actually be insightful and create a meaningful change?
Kathy: One was a project that I was involved in at a large property and casualty insurance company. We were trying to alert our franchise insurance agents if they had somebody in their book of business who was likely not to renew. It was based on really good data science, and model scores with different, clear […] that showed a significant likelihood to attrite.
It was originally deployed in a separate application with some general groupings of how likely they were to leave. The utilization of it was more of a curiosity at that point. There were some adopters, but not as broad as we would have hoped in order to really substantially move the needle. Then the design was redone and integrated into their overall CRM system; it was right there on their CRM screen, coming up when they logged in in the morning, so they didn’t have to go look for it. And we moved from groupings and scores to one star through five stars, with five stars meaning they’re likely to go, you really need to work those. Just very simple little changes drove a significant change in the utilization of that capability.
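To make that concrete, here is a minimal, hypothetical sketch of the kind of translation Kathy describes: taking a retention model’s raw output and presenting it as a simple star rating an agent can act on. The thresholds, field names, and function are illustrative assumptions, not details from the episode.

```python
# Hypothetical sketch: map a retention model's non-renewal probability
# to the 1-5 star rating an agent would see in their CRM.
# The cut points below are illustrative only, not from the episode.

def churn_stars(non_renewal_probability: float) -> int:
    """Map a probability (0.0-1.0) to 1-5 stars, where 5 stars means
    'likely to leave; work this account first'."""
    thresholds = [0.2, 0.4, 0.6, 0.8]  # assumed, evenly spaced cut points
    stars = 1
    for cutoff in thresholds:
        if non_renewal_probability >= cutoff:
            stars += 1
    return stars

# Example: a policyholder scored at 0.83 would surface as a 5-star alert
# on the agent's CRM home screen when they log in.
print(churn_stars(0.83))  # -> 5
```

The point of the exercise is not the binning itself but the presentation: the same model score, shown as a star rating in the tool the agent already opens every morning, drove adoption where a separate application did not.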
Brian: Do you know what the blockers were such that it took a redesign? If someone was listening to this and they’re like, “I’m that person right now. We don’t want to go through the learning experience that your company went through.” What do you do to prevent that? Like, “Here are some roadblocks to watch out for.”
Kathy: The thing is, these products are usually developed by folks within a data science group: we have the data, we know this is a business problem, and we’ve done the analytics. That is necessary but not sufficient. The next step is to really use some basic research techniques from consumer companies. Go out and watch the user group that you want to utilize this data or these analytics to improve their performance. See how they do their job: what tools do they use, where would this information be most relevant, how can it be presented in the context of their general activities, so that it’s not a separate thing, so that it’s integrated into the stuff that shows up in their performance evaluation.
That’s the way to avoid those deployments that may have really great data and science behind them but don’t get the user adoption that’s needed. Just take that extra step of really understanding the user and where this information is going to be most relevant for them.
Brian: Do you think the appetite from executives and people at the top of the reporting chain supports the time and effort to go out and do that type of research, to try to fit it in, and not focus so much on, “When are we releasing code? Show me some progress”? They want to see results quickly: “Just show me some proof that these millions of dollars are doing something,” versus this kind of squishy work. It can be squishy, at least in my experience with certain executives, especially the qualitative stuff. “How many people did you talk to?” “Oh, we talked to eight so far.” “We have 10,000 employees.” It’s hard for certain ones to understand the value of qualitative research in these things. Do you have any experience or thoughts about that?
Kathy: It can be squishy, but I think analytics, especially when it’s being deployed pervasively, is maybe not a project but more of a transformation program, and you have to take a transformation-program perspective to it, which includes such squishy things as change management and business process redesign.
Somebody using those analytics within their decision-making process is really where the organization gets value. The code that gets released and deployed is an interim step, but it is not the final step to the organization getting value. Maybe you’re just a data scientist at the company trying to deploy a great app that could help a group of marketing folks better invest marketing dollars, or supply chain folks better manage costs with suppliers.
It doesn’t have to be this big concerted professional effort. One way to do it in a very agile, low-cost way is to find a couple of folks who seem really jazzed about getting this and use them as beta testers. Maybe you start out with just the information on a spreadsheet and say, “Hey, does this make sense, just from this information, that you would use it?” And then have them walk you through: “Where would you use this information? How would it be best presented to you?”
And then think about working within the constraints that you have. Maybe you can’t change the screen for this digital marketing […] that somebody uses, but how can you make that information as accessible as possible, where it really is more of a push of information at the right time, as opposed to a pull of information that a human being has to remember to go get when they’re in the process of executing their “day job”?
Brian: You hit on some great stuff there. I talk about this to my list frequently: understanding tasks and workflows and people’s day-to-day jobs, because the goal is to fit your solution into their existing behavior as much as possible. It’s really hard to change behavior. “Oh, I’ve got to remember to load this other screen, pull those numbers off page two, paste them into the other screen, then hit enter and it does some analytics.” That’s the kind of stuff that makes people not bother to do it. It’s like, “Well, my guess is good enough. I’ve been doing this for 20 years. I’m this good. Whatever. No one’s going to know if I did that or not. They don’t know where that number came from.” They don’t want to do that.
First of all, it’s to understand their pain. Another good technique, if you’re going to improve a process and you want people to use this tool in order to realize new value, is to first understand the benchmark of their existing workflow, their playbook, and then ask them to run the same playbook using the new platform. This is a great way to uncover the things that you don’t necessarily know to ask about, because if you first get that model of how they do it now, it’ll help elucidate the gaps that we can’t see.
I just read an article with an example like, “Let’s imagine designing a new hotel room,” and you might take for granted that the hotel room has a bathroom. […] is talking about this. You’d be really surprised if you walked in and there was no bathroom, because no one stated that was a requirement and everyone just took it for granted. It’s those kinds of things that you won’t see as a data scientist or product manager, perhaps, because you don’t know what these front-line workers are necessarily doing all day. But I love that you talked about getting into the minds of what people are doing and fitting your solution into them.
One problem I see is that in some places it’s really hard to get access to customers. I think this is more the case with enterprise companies that are selling a product with a data-specific part of it; it can be hard to access them. You would think that for companies doing internal analytics this would be easier. Do you think that comes from leadership? Do you think that comes from the ground up? Any comments on how you get access to the right people? What if they’re like, “Hey, I’m in a call center. I get paid hourly by the number of calls I do, and you want me to go work on your new software?” How do you incentivize that so they become a design partner? Ultimately, that’s really where you want to go: to have a team of partners, from subject matter experts to product managers to the data science people, whoever it is that is going to affect the solution that comes out. What needs to change in order to incentivize that participation? It takes time to get these things right.
Kathy: Yeah, for sure. We see that a lot, just because, as you noted, of the breadth of the organization and the different incentives and priorities that other groups might have. I hear it from companies all the time, from senior leaders: “We are going to be an analytics competitor; we’re going to do analytics.” They think if they hire 200 PhD data scientists, they are now an analytics company.
I do think you’ve got to have somebody in a leadership role, at least over the data science group, who says, “Look, this company gets value when we are making better decisions because of these analytics.” And those better decisions are going to come from people accessing the analytics when they’re in their decision-making process, and from really working more leader-to-leader. But that can’t be where it stops. You’ve got to create a peer relationship at all levels. That’s number one.
And then, number two, if you get the leadership of the other group engaged early on, so this is a problem that they feel they have ownership in and that they are a co-creator with you, and it starts at the leader level and then works its way down, then I think you’re going to have greater access and more ability to do that. Finally, in the absence of that, if you don’t get it right, then I would at least, as you’re deploying the new app, say, “Our next step is to watch the staff members use it and identify how to better integrate it.” That way you’re at least showing some leadership in your thinking: “I know this is going to be a problem because I didn’t have access to them, so I’m already teeing it up. We’re going to have to come back and see how people are using it and how we can utilize some user experience and design thinking in the subsequent phase after […] is reached.”
Brian: Sure, sure. I think some of my clients in the past had to go through a failure first in order to decide, “We don’t want to do the build-first, design-second process on the next one.” There’s a certain level of convincing that can happen, and then at some point you’ve got to move forward, because typically IT or the business has been tasked with, “Deploy this new model and this software into the service,” and they’re going to do that no matter what. They’re not going to stop and wait for design if the organization doesn’t have a mature design practice. So you might have to go through that.
I would say, obviously, it’s always cheaper to adjust things in pixels and pencils than it is to adjust them in working code. The more you can get in front of these decisions and inform the engineering and the data science prior to deploying a large application, the cheaper and easier it is, and you don’t have all the change costs associated with that: time, money, labor. No one likes to do it twice; most people want to work on new problems, they don’t want to redo the same ones. Engineers usually don’t like doing this, so getting in front of it is good.
Kathy: I think what you’re touching on there is, maybe if you haven’t done your user design work, then you think about this release as a beta release. You plan this release, and you plan this app, and you manage your code knowing it’s going to change at some point, and knowing it’s going to change soon.
The second key point is to decouple the analytic insight from the application. The analytic insight, whether it’s a number, or a recommendation, or some score, or something else, if it is just loosely coupled to the delivery mechanism, then it is easier from a code engineering perspective to have it delivered in a different way.
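Here is a minimal sketch of the loose coupling Kathy describes, under assumed, hypothetical names: the insight (a score plus a recommendation) is produced once, independent of any particular screen, and each delivery surface only renders it. Nothing here is from the episode beyond the general idea.

```python
# Hypothetical sketch: separate the analytic insight from its delivery.
# All class, function, and field names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class RetentionInsight:
    account_id: str
    stars: int              # 1-5 rating derived from the model score
    recommendation: str     # short, action-oriented guidance for the user

def build_insight(account_id: str, model_score: float) -> RetentionInsight:
    """Produce the insight once, with no knowledge of how it will be shown."""
    stars = min(5, 1 + int(model_score * 5))
    action = "Reach out this week" if stars >= 4 else "Monitor at next touchpoint"
    return RetentionInsight(account_id, stars, action)

def render_crm_line(insight: RetentionInsight) -> str:
    """One delivery mechanism: a line on the CRM home screen."""
    return f"{insight.account_id}: {'*' * insight.stars} - {insight.recommendation}"

def render_email_digest(insights: list[RetentionInsight]) -> str:
    """Another delivery mechanism, reusing the same insight unchanged."""
    ranked = sorted(insights, key=lambda i: -i.stars)
    return "\n".join(render_crm_line(i) for i in ranked)
```

Because the delivery functions only consume the insight object, swapping the separate application for a CRM widget (as in Kathy’s insurance example) is a presentation change, not a rework of the analytics.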
Brian: Right. No, that’s great. Part of that is the attitude and culture of change, and not falling in love with our first versions of stuff. To me, that has to be ingrained in engineering and in all of the teams that are touching the product or the service that’s being put out there.
A lot of companies say they’re doing agile, but a lot of them are skipping one of the most important parts, which is getting some customer representation involved and actually iterating, not just doing incremental design where you keep adding more: “Add another feature, add another data point.” That’s incremental design; that’s not iterating and changing as you get feedback. That’s something to watch out for as well.
At some point, you need to get some stuff out there, and the costs have come down, obviously, to deploy software. It’s overall cheaper to get something out there quickly and to start getting feedback on it, but it’s very easy to not get the feedback and to let the working code feel like success, until someone at the top asks, “Where is all the bang we’re supposed to get for this?”
Kathy: I think the approach there is: get it deployed, so that’s great; it’s out there, and it’s working. But again, build into your deployment plan that you’re going to use that feedback and change from there. If you are not fully deployed until after you have received this feedback and implemented the needed changes within the application, then I think that’s another way to prevent falling in love with your first design. You know that this is just deployed so that people can use it, and that getting user feedback and making those needed changes is an expected part of the deployment process, not a failure that the application was insufficient.
Brian: I think that should be part of any software development process: a loop of test, design, refine, deploy, in a circle. For the most part, for larger applications, you’re never really done with it. It’s a process of getting better and figuring out the ROI: “Have we hit the mark? Is it worth spending more time and money on this?” But yeah, those are great insights.
I’m curious about the roles. I feel like someone at the top of the responsibility chain here needs to have a healthy dose of skepticism about their own stuff, especially when it comes to prescriptive and predictive analytics, or any service where custom software is being deployed into the organization: a healthy dose of skepticism about how great it really is. Maybe you deployed on time, bug-free, and you can see the stats and all of that. But is that responsibility primarily falling into the data science realm because companies are investing in that area right now, so they become the de facto, what we would call in the SaaS world, product manager? Do they become that, and is that the right place for that responsibility to be?
Kathy: I see that happen, really, not necessarily through design and intention, but just because if a data scientist wants to take this great science that he’s discovered and make it accessible, in a lot of organizations they have to do it themselves, and so it’s put upon them when they’re probably not well prepared for that. I do think that’s a problem.
And then, as a lot of different tools and capabilities make it easier for data scientists to deploy an application, if there’s not a really good user design construct within whatever application they’re deploying, then they need to be the ones to take the extra step. As I said earlier, oftentimes the data scientist sees the world through data and algorithms and predictions. They get enamored with the complexity of the model and the strength of its predictions, and not so much with how easy it is for somebody to use it.
I think there’s probably a reorientation, with training, that should happen within data science groups around design best practices and user experience design. They’re not going to evolve into great user experience designers, but they are going to be able to recognize something that’s really bad and perhaps ask for help and guidance to make it better.
Brian: Do you think that will stick, then, that that role will continue to live there? This person needs to understand the business value that needs to be obtained, the user experience side of it, and the technology side; it’s a trifecta. Do you think that’ll stay in the data science world?
Kathy: I don’t think so. I do think there will be a separation between the data application design and the creation of the data science that informs that […] in that application. I think, as we talked about, as some of these companies get more mature and need to have more pervasive deployment of data science insights, they are going to realize that they need to take a different approach.
I’ve often said that with data and analytics, I see the industry maturing along the lines of software development. That design focus was not a big part of software development early on. I think it’s just going to have to happen within data science.
Brian: I look at it this way: product owners can take many different titles. I’ve had all kinds of different clients, but ultimately, the buck stops with everybody that’s working on it. Still, it helps to have someone at that intersection of, “What do we need to do? What’s the overall picture of this?” who understands the tech, the business, and the user experience side. Whatever the title of that person is, that role to me is really critical, so that technology doesn’t run with everything. You have to have all three of those ingredients to deploy successfully, at least in my experience. It seems pretty critical to have that.
Kathy: Yeah, for sure. A number of organizations ask us, “How do we show the value of our data science investments? How do we demonstrate our data science ROI?” I think if more data science groups really looked at how their data science products were being consumed and started quantifying that, and then used some of the business metrics involved with that business process, whether it’s optimizing a spend or reducing average handle time at the call center, that would give them a great way to validate their ROI to the organization. But I don’t see a lot of data science groups doing that.
Brian: I’m curious, who asks that question? Is it the data scientists themselves or the business stakeholder who’s hiring the data scientists? What role asks that?
Kathy: Usually, it’s a data science leader who has a large organization within a large enterprise. If you have a large organization with a lot of expensive resources, there is that continuing need to show the value that the organization brings to the overall company, especially when the output of that organization is not well understood or has not been a traditional part of that company.
A lot of senior executives, C-suite executives at large organizations, didn’t have data science when they were coming up through the ranks. This is something new that they don’t really understand, but they know how much money they spend on it. A lot of data science organizations will need to demonstrate, “Here’s the hard benefit that this company has gotten from investing in data science capabilities.”
Brian: This blends nicely into my next question, which is about AI and machine learning, obviously hot topics right now in technology. I hear this frequently from people I talk to: “Oh, the board knows we’re supposed to be doing some AI.” They’re asking, “How many sensors do we have installed? Do we have digital transformation?” They ask these really high-level questions, and they want to go spend some money on it because they’re so afraid they’re going to miss the boat.
From a design standpoint, we would say that smell reeks of possibly putting the cart before the horse. The tool comes out: “We’ve got to go buy this hammer because everyone else is buying this hammer. We have no idea what we’ll hit with it, but we’ve got to have it. We’ve got to go spend some money on it.” How do you ensure that you don’t waste money? You want to invest in this, you don’t want to miss the boat, and maybe there’s potential for a project to deploy machine learning, which is pretty much what a lot of these companies are doing in terms of AI right now. How do you make sure that the desired investment from the business, which has heard this tool is hot, is actually going to have some ROI?
Kathy: It’s what I call the hype-cycle mandate. Whatever the new thing is, whether it was big data, or AI and machine learning, or data scientists: we have to have them, we have to tell the board we’re doing this. I think that is where executives earn their money, being able to manage the message to the senior leaders, who may not understand what’s needed and how you use it, so that they can say, “Yes, we’re using it.” But the executive leadership are the ones who have to go out and figure out where the business problems are and what is actually needed within our operating environment and our company to really deliver value from this capability.
I will say, oftentimes I see executives making the right call in that way. But I have seen cases where folks have gone out and bought a lot of software and hardware and stuff that they have no idea how they’re going to use, and that’s a shame.
Brian: Do you think the right step there is to take on a small project, find a small win, show some small value, and at least satisfy the “Are we doing something?” question with “Yes, we’re doing some machine learning” or whatever? Do you think that’s the way it starts?
Kathy: In any data […] investment, I am a big fan of use-case-based development. Come up with a use case that requires this type of capability to execute it at the scale and precision that’s needed, and then do some pilots to prove out the value, show the value, and that then builds up the business case for the larger deployment. Yeah, I totally agree with that, Brian.
Start with a use case, start small, have an eye towards scaling as you start small, but get a small win and show that you’ve done it, so the CEO can, in all honesty, say, “Yeah, we’re doing machine learning, and we’re going about it in a way that is fiscally sound but will also put us in a position to be able to compete with this capability in a quick amount of time.”
Brian: That’s great. I don’t know if I’ve said it here, but there’s this concept of falling in love with the problem. If you and your team, the people who work for you, or whoever it may be can fall in love with the problem and then weaponize your machine learning against it, that’s always a great thing. It’s about getting everyone jazzed about the problem, especially if you can line it up with that technology; not that the goal is to do that, but that’s when big wins happen, at least in my experience.
This has been awesome. Do you have any advice, overall, related to what we talked about, for data product managers, data science managers, data science leaders, and analytics leaders in terms of design and experience? What should they be looking for going forward to, in general, bring more value to their customers? Is there a theme or something on your mind right now that needs to get communicated to them?
Kathy: My theme is: data science is about big, complex data, and a lot of technology, and really advanced math, but there are still human beings who have to use it. Don’t forget the humans. You can focus on how great this data set is that you created or how advanced this analytics technique is that you’re using, but remember, the humans are your last mile to realizing value from all of the stuff you’ve done before. Make sure you keep them in mind as you go through all of that other stuff as well.
Brian: I think that’s great advice. And it sucks, those pesky humans.
Kathy: I know, I know! They just get in the way of all that greatness!
Brian: This is awesome. I have one last question for you and that’s have you ever surfed on a river?
Kathy: I have not.
Brian: I’m just curious. That came up in the webinar. I just saw this article in The Times about river surfing and I thought, “I’ve got to ask Kathy about this.”
Kathy: Yeah. I’ve seen a lot of videos around that. I think it’s like this bore tide, where the tidal action creates this perfect wave that you can surf practically forever. There are some really good videos of folks doing that down in Brazil, on some of the rivers that drain out into the Amazon, and on other rivers as well. It looks really awesome. There’s also this surf ranch in California that has a man-made wave; Kelly Slater, I think, worked to create the engineering for that technology, and it looks exactly like those river-bore waves. But I’ve actually seen somebody in Munich surfing one of those […]. It’s on my bucket list, Brian.
Brian: Sweet. I’m going to put a link to the surf camp, but where can we put some links to you? Where can people find you on the interwebs?
Kathy: You can find me on LinkedIn; I am Kathy Koontz. You can also find me at the International Institute for Analytics; our website is iianalytics.com. My LinkedIn name is customerjourneykoontz, because that’s always been my passion: using data and analytics for the customer journey. That’s where you can find me on LinkedIn.
Brian: Cool. I will put those links in the show notes. This was super awesome. Thanks for coming and talking to me today. I’m sure people are going to enjoy listening to this. Thank you.
Kathy: Thanks for having me, Brian. Have a good day.
Brian: Alright, see you.