009 - Nancy Hensley (Chief Digital Officer, IBM Analytics) on the role of design and UX in modernizing analytics tools as old as 50 years

Experiencing Data with Brian T. O'Neill

March 26, 2019 | 00:49:44
Show Notes

Nancy Hensley is the Chief Digital Officer for IBM Analytics, a multi-billion dollar IBM software business focused on helping customers transform their companies with data science and analytics. Nancy has over 20 years of experience in the data business, in many facets from development and product management to sales and marketing.

Today’s episode is probably going to appeal to those of you in product management or working on SaaS/cloud analytics tools. It is a bit different than our previous episodes in that we focused a lot on what “big blue” is doing to simplify its analytics suite, as well as on facilitating access to those tools. IBM has many different analytics-related products, and it relies on good design to ensure a consistent feel and experience across the suite, whether it’s Watson, Statistics, or modeling tools. Nancy also talked about how central user experience is to making IBM’s tools more cloud-like (try/buy online) vs. forcing customers to go through a traditional enterprise salesperson.

If you’ve got a “dated” analytics product or service that is hard to use or feels very “enterprisey” (in that not-so-good way), then I think you’ll enjoy the “modernization” theme of this episode. We covered:

Resources and Links:

Nancy Hensley on Twitter

Nancy Hensley on LinkedIn

Quotes:

“It’s really never about whether it’s a great product. It’s about whether the client thinks it’s great when they start using it.” –Nancy

“Every time we add to the tool, we’re effectively reducing the simplicity of everything else around it.” –Brian

“The design part of it for us is so eye-opening, because again, we’ve built a lot of best in class enterprise products for years and as we shift into this digital go-to-market, it is all about the experience…” –Nancy

“Filling in that ‘why’ piece is really important if you’re going to start changing design because you may not really understand the reasons someone’s abandoning.” –Brian

“Because a lot of our products weren’t born in the cloud originally, they weren’t born to be digital originally, doesn’t mean they can’t be digitally consumed. We just have to really focus on the experience, and one of those things is onboarding.” –Nancy

“If they [users] can’t figure out how to jump in and use the product, we’re not nailing it. It doesn’t matter how great the product is, if they can’t figure out how to effectively interact with it.” –Nancy

Episode Transcript

Brian: Today on Experiencing Data, I talked to Nancy Hensley, the Chief Digital Officer of IBM Analytics. Nancy brings a lot of experience and has a lot to say about how user experience and design have become integral to IBM’s success, especially as they move their applications into the cloud space. They are really trying to bring the price point down and make their services and applications much more low-touch in order to reach a new base of subscribers and users.

I really enjoyed this talk with her about what the designers and people focused on the product experience have been doing at IBM to keep their company relevant and keep them pushing forward in terms of delivering really good experiences to their customers. I hope you enjoy this episode with Nancy Hensley.

Hello everybody. I’m super stoked to have Nancy Hensley, the Chief Digital Officer of IBM Analytics. How’s it going, Nancy?

Nancy: Good. I’m happy to be here. Happy Friday.

Brian: Yeah. It’s getting cold here in Cambridge, Mass.; you’re in the Chicago area, if I remember correctly.

Nancy: Yeah, it’s a little bit chilly here as well.

Brian: Nice. So it begins. You’ve done quite a bit of stuff at IBM. When we had our little pre-planning call, you talked a lot about growth that’s been happening over at IBM. I wanted to talk to you specifically about the role that design and experience have played, how you guys have changed some of your products, how you’re talking to new customers, and that type of thing. Can you tell people, first of all, just a little bit about your background and what you’re currently doing, and then we could maybe […] some of those things.

Nancy: Sure, happy to. Thank you for having me again. I think I’m one of those people that doesn’t fit nicely into a box of, “Are you product? Are you marketing?” I am a little bit of both. Most of my IBM career, I have moved in between product marketing and product management. That’s why I love digital so much because it really is a nice mixture. And in particular, growth hacking because it combines all the things I love, including data as a part of what we do.

What I’m doing right now as Chief Digital Officer in the Analytics division and Hybrid Cloud is figuring out how we transform our products to make them more consumable, more accessible. We have best-in-class products in data science, in unified governance and integration, in hybrid data management, but our products and our business were built on a traditional face-to-face model. There is even a perception that we’re not as accessible, and that’s what we’re looking to change.

Creating those lower entry points, making it easier for people who didn’t have access to us before to start small and grow through a digital channel, through a lower entry-point product, and then scale up from there. That’s really what we’re trying to do, and it’s part of a bigger mission to really democratize data science—I kind of cringe when I say that word—because I think it’s really important for more clients to be able to be more data-driven, to have tools that are easy to use, and to leverage data science to optimize their business. Part of the way we’re doing that is to develop a digital route to market. We’re pretty excited about it.

Brian: I think a lot of our listeners probably come from internal roles at companies. They might be someone that’s purchasing vendor software, as opposed to being at a SaaS company where they may have a closer role to marketing and all that. Can you tell me what you guys are doing there?

Part of the thing with my experience is that some of the legacy companies, the older companies that are out there, tend to get associated with big, giant enterprise installations and really crappy user experience. “It’s just so powerful, you have to put up with all this stuff.” People’s tendency these days to accept that poor experience as just the status quo is changing.

What have you guys done? Not that you’re to blame, but I’m sure that opinion exists. How do you guys adapt to that? And I wonder, with upstart analytics companies coming out with other things, what do you guys do to address the experience?

Nancy: There’s certainly a perception that IBM is that big, complicated, enterprise-focused product out there. We see the data. There are a lot of articles, a lot of feedback, endless reports that all validate that clients are trading off complexity, or features and functions, for consumability, because they’ve got to get things done and they have fewer people to do it.

We fully recognize that. Where we started was to make things much more accessible—not just our cloud products, because that’s pretty easy if you have stuff in the cloud; it’s pretty accessible—but our on-prem products as well. So, for clients that are running analytics behind a private cloud, whether it’s a statistical product, or a predictive analytics product, or a data science project, or even what they’re doing on their data catalog, all of that was not something people would go to the cloud to look for. There are some things they need to keep behind the firewall, especially in financial services and health care, and there are large and small companies on both sides.

One of the things we set out to do was to create that cloud-like experience for clients that are running things behind their firewall. We started a project about a year ago to look at some of our on-prem products and create that experience where literally, within a couple of clicks, you could download, try, and be using a product within 15 minutes. That was our goal. As opposed to before, where you would have to contact an IBM salesperson, get them to come out and meet with you, and then set up a trial.

That’s what we started to change: at least make it accessible. As we progressed that capability, we started changing our pricing and packaging to be appropriate, to create that entry-level point, to create a shift to subscription. Everyone wants to buy everything on subscription these days, I think.

The last part of that shift for us has been to really focus on the experience, because a lot of these products were not born digital. We really needed to make sure that when clients were coming through that channel, it was a great experience. That’s really where design and experience came into play for us.

Brian: How did you know what was wrong, beyond broad surveys or just that general feeling of, “Oh, it’s the big, giant, bloated software…”—the stereotype, right? How do you guys get into the meat and potatoes of it? Like you said, it sounds like there’s a benchmark there—15 minutes on that first onboarding experience—but can you tell us a little bit, maybe with a specific example, about how you figured it out? What did you need to change about the software application to make it easier to get value out of the analytics and the data that’s there?

Nancy: I’ve got lots of examples. We’ll start with one that clients are actually very familiar with, which is SPSS Statistics, which a lot of us used back in college. That’s a product that actually turns 50 years old this year. It’s been out a while, a lot of people are still using it a lot, and if you look at the demographics of our base of users for Statistics, over 60% are under the age of 25. So their buying preferences are very different than they were when the product started out in 1968.

We looked at the verbatims from our NPS feedback, and it was clear that clients really wanted a much more simplified and flexible experience for buying SPSS Statistics and having access to it. A lot of times, students have to get it really quickly for a project because they might have waited until the last minute, and they wanted a much more flexible subscription-based program. They might only use it for a few months and then come back to it.

One of the first things we implemented was to change the buying experience and the consumption model. We didn’t actually change the product at that point. We just changed the consumption model to see if that would in fact help us get some growth on that product, and it absolutely did.

Since then, we’ve actually gone back and changed the product as well. It’s got a whole new UI for its 50th anniversary. We joke around that it got a facelift for its 50th anniversary.

Brian: Does it have a green screen mode?

Nancy: It is a completely different experience, not just from a buying perspective but from a UI perspective as well. We have other products, too, that have been around maybe not 50 years but have been very popular, like our DB2 Warehouse on Cloud and our DB2 database, which clients have been buying for years to run their enterprises.

As we created SaaS alternatives of these products, we wanted to make sure they were extremely consumable. So we’ve been looking specifically at: is it easy to figure out which version to buy? How much to buy? What it’s going to do for you? How do I calculate things? We’ve really been looking at the experience as if there were no salesperson at all: how do we help clients through that buying experience?

Brian: I’m curious. When you decided on helping them through the buying experience, does any of that thinking, or that strategy around hand-holding someone through the experience, happen in the product itself? I’m guessing you’re downloading a package at some point and running an installer, and at that point, did you continue that hand-holding process to get them out of the weeds of the installation and onboarding, and on to the actual question of, “Is this tool right for what I needed to do?” Everything up until the point where you’re actually working with your data being friction, did you guys carry that through? Can you talk about that?

Nancy: You’re hitting one of my favorite topics, which is onboarding. Because a lot of our products weren’t born in the cloud originally—they weren’t born to be digital originally—doesn’t mean they can’t be digitally consumed. We just have to really focus on the experience, and one of those things is onboarding.

Take DB2 in particular, where we went through the process of creating an onboarding experience for DB2 Warehouse on Cloud. For anybody who’s used DB2—and we do have an updated UI for it—they can jump in and start using it. But that’s not everyone; there are people who haven’t used it before.

So, we started working with a couple of different onboarding tools to create these experiences. Our goal was to be able—at least on the offering management side, alongside our partners in design—to create these experiences in a very agile way and make them measurable—my second favorite topic, which is instrumentation—but not put a burden on development, because the fact is, in almost any organization, development wants to build features and functions.

Whenever we talked about this, these things were prioritized lower because development wants to build new capabilities. They’re less enthusiastic about building in things like onboarding experiences. What some of the tools like […] give us is a way to make it codeless for us. We can create these experiences, then pass the code snippet, and then measure whether they’re effective or not, because we actually see those events flowing through Segment into Amplitude as part of the funnel.

We’ve gotten some great feedback as to whether they’re working or where they’re falling down. We can create checklists of things that we want clients to do that we know make the product sticky, and see if they actually complete that checklist. It’s giving us a much better view, because before, what we would see with a client is: they registered for a trial, they downloaded the trial, they created their instance, and then boom, they fall off the cliff. What happened?

Now we’re getting a much better view of what’s actually going on, both from the products that have been instrumented and from the onboarding experiences.
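[Editor’s note: As a rough sketch of the kind of trial-funnel instrumentation Nancy describes—events flowing through Segment into a destination like Amplitude, where they can be charted as a funnel—the snippet below uses Segment’s browser analytics library. The event names, property, and write key are hypothetical illustrations, not IBM’s actual schema.]

```typescript
// Hypothetical trial-funnel events, sent via Segment's analytics.js
// (@segment/analytics-next) and forwarded to a destination such as
// Amplitude, where drop-off between steps can be charted.
import { AnalyticsBrowser } from '@segment/analytics-next';

const analytics = AnalyticsBrowser.load({ writeKey: 'YOUR_WRITE_KEY' });

// One event per onboarding milestone makes the funnel measurable.
analytics.track('Trial Registered', { product: 'db2-warehouse' });
analytics.track('Trial Downloaded', { product: 'db2-warehouse' });
analytics.track('Instance Created', { product: 'db2-warehouse' });
// The step most users never reach is the one to investigate first.
analytics.track('First Query Run', { product: 'db2-warehouse' });
```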

Brian: For every one of these applications that you’re trying to move into a cloud model or simplify—to me the deployment model doesn’t matter; it’s really about removing the friction points, whether it’s on-premise software or not. I think we all tend to use the word “cloud” to mean, “Oh, is this a browser-based thing? There are no fat clients? There’s no running scripts at the terminal and all that kind of stuff?” Do you guys have a set of benchmarks or something that you establish for every one of these products that goes through a redesign?

Nancy: We do. We’ve got a set of criteria; it’s really broken down into two pieces. Whether it’s going to be a cloud product or an on-premise product—we actually have a mix of both—there is what we call the MVP side, which might be something that’s not born in the cloud, not a new product, where we’re looking to create a lower entry point, a really good trial experience, a very optimized journey. We’re even doing things like taking some of the capabilities that we used to deliver from a technical-sales perspective and making those more digitally available.

Online proofs of concept; hands-on labs that you do online instead of waiting for a technical salesperson to come out to see you; chat bots that can answer your questions faster, even before you talk to a sales rep. All of that is included in what we call the MVP portion of the criteria that we look at. Pricing and packaging have got to be right for the product, for the marketplace. You’ve got to have the right product-market fit: a good, valuable product but a low enough entry point that somebody can start small and scale up.

The second part of the criteria is where the growth magic happens, where we’re doubling down a lot more on the experimentation, where we’re making sure that we’ve got the onboarding and instrumentation we want done. In the MVP phase, we don’t always get it, but our development partners really understand the value of that now, which is great.

More often now, we’re getting into that second phase, where we’re doing more of the transformation. Through that, we’re getting a lot more feedback, where we can create the onboarding experience. We can do even more on the optimized journey. We’re doing a lot of growth hacking in terms of optimizing.

Things like: how clear is the information on the pricing page? Is it easy for the customer to figure out what they need to buy and what the pricing is for that? Can they get their questions answered quickly? Can we create a deeper technical experience for them, even outside of the trial itself? Like I mentioned, there are things we’re doing with our digital technical engagement, taking what used to be our tech sales model and making it more digital.

Brian: That’s cool. When you guys go through this process of testing, are you primarily looking at quantitative metrics coming back from the software that you’re building, or are you doing any type of qualitative research to validate, “Hey, is the onboarding working well?” Obviously, the quantitative can tell you what happened. It doesn’t tell you why someone might have abandoned at a given point. Do you guys do any research there?

Nancy: We do. It happens in a couple of places. We run squads that are cross-functional across marketing, product, development, and design for each product. Then every Monday we have this thing called Metrics Monday, where we get the cross-functional teams together and share the insights around the metrics. If we had a big spike or a big decrease, or a change in engagement, or if we did some experimentation that came out with a very interesting result, we share that across teams.

We really focus on why things happened. We have a dashboard that everybody is religious about using on a daily basis; it tracks all of our key metrics, whether it’s visits, engaged visits, trials, trial-to-win conversions, number of orders, things like that. But we also want to dive deeper into the ebbs and flows of the business itself, why things are happening, and whether the experimentation we’re doing is helping or not. We’ve got a lot of focus on that on a daily and weekly basis.

Brian: Do you have any way to access the trial users and do a one-on-one usability study or a follow-up with them that’s not so much quantitative?

Nancy: Our research team in design will do that, and they’ll take a very thorough approach, both recording users using the product and getting their feedback. It’s pretty thorough and gives us great input. We usually don’t do that until the product’s been in the market for a little bit longer and we’ve got some hypotheses about how we think it’s doing; then the research team will spend a couple of weeks diving a lot deeper into it.

We get some great feedback from that. Honestly, as a product person, as much as I’d like to think I’m focused on a beautiful experience, my lens versus our designers’ lens is completely different, and they just see things we don’t.

Brian: Yeah, the friction points and filling in the whys—it takes time to go and do that, but it can tell you things, and it helps you qualify the data. I’m sure at the level that you guys are collecting, you have a lot of inbound analytics coming back on what’s happening. But it’s really filling in that “why” piece that is important if you’re going to start changing design, because you may not really understand the reasons someone’s abandoning.

Maybe it’s like, “I couldn’t find the installer. I don’t know where the URL is. I ended up launching the server on my machine and went to localhost, but I forgot the port number,” and the whole product is not getting accessed because they don’t know the port number for the server they installed, or whatever the heck it is. And it’s like, “Oh, they dropped off. They couldn’t figure out how to turn it on, like load the browser…”

Nancy: Right, and even behavioral things that we don’t always think of, like putting a really cool graphic in the lead space that actually takes attention away from the call to action. We’re all proud of it: “Hey, look at this cool graphic we built.”

One of our designers uses a tool that tracks eye movements, and it was, “Wait a second, we’re losing the focus here.” But again, you don’t always see it from that lens. The design part of it has been so eye-opening for us because, again, we’ve built a lot of best-in-class enterprise products for years. As we shift into this digital go-to-market, it is all about the experience. It’s all about how good the experience is, how easy it is, how frictionless it is, and it’s also about how consumable and accessible the product is in the marketplace.

Brian: You mentioned earlier that it sounded like engineering doesn’t necessarily want to go back and add onboarding to all of this. This gets into the company culture of who’s running the ship, so to speak. Is it engineering-driven in your area? How do you guys get aligned around those objectives?

I’ve seen this before with larger enterprise clients where engineering is the most dominant force, and sprints are often set up around developing a feature and all the plumbing and functionality required to get that feature done, but there’s not necessarily a collective understanding of, “Hey, if someone can’t get from step A to step G, horizontally across time, then all that stuff’s a failure. Step F, which you guys went deep on, is great, but no one can get from E to F, so they definitely can’t get to G.”

So, that’s your qualifier of success. How do you guys balance that? Who’s running the ship? Does your product management oversee the engineering? Can you talk a little bit about that structure?

Nancy: We call it offering management rather than product management for a reason, because we really do want the offering managers to feel like they’re the CEO of their business and run the ship. Of course, development has a big say at the table, but they have a natural tendency to want to build capabilities. That’s never going to go away. It’s been that way for ages. We just don’t want to fight that tendency. We want them to focus on building—not take six months to build an onboarding experience when they could build really valuable functionality in that six months instead.

So, we really run it as a squad, just like many other companies. Offering management does lead a lot of the strategy with our products and development, but I would say that design also really has a seat at the table, for sure, absolutely.

Brian: Tell us a little bit about your squads. Is it primarily a designer or a UX professional paired up with your offering manager? Are they a team, and then you pull in the engineering representatives as you strategize?

Nancy: My team is digital offering management. We’re a subset of offering management, better known as product management. We run the squads, and the squads are a cross-function of our product marketing team; our performance marketing team, which is demand-gen-type marketing—they run the campaigns; design; development; and the core product managers. We’re the digital product managers; the core product managers have all routes to market, and we’re just focused on the digital ones.

That’s the cross-functional squad that gets together on a weekly basis, and they run as a team. From a digital perspective, it’s led by the digital OM for our route to market there.

Brian: That’s interesting. How do you ensure that there is some kind of IBMness to all these offerings? Your UX practice and offering managers sound like they’re part of one organization, but I imagine with some of these tools, you might be crossing boundaries as you go from tool X to tool Y. Maybe you need to send data over, like, “Oh, I have this package of stuff and I need to deploy this model,” and then there’s a different tool for putting the model into production, so there’s some cross-tool user experience there. Can you talk about that?

Nancy: That’s really why design has been key, because their job is to keep us honest, making sure that the experience is somewhat consistent across the tools so they seem familiar, especially within a segment like data science. Someone might be using our Watson Studio tool and then move to our Statistics or Modeler tool. There should be a very familiar experience across those. That’s why design really leads the experience part of it.

On pricing and packaging, we try to maintain consistency as much as possible across all the products, again so that whatever level of familiarity you have with how we price and package things carries across the entire segment. We strive for that as well.

On the digital side, in terms of the experience on the actual web, we partner with a team called the Digital Business Group. They are basically the host of our digital platform, and they maintain a level of consistency worldwide across all the products in terms of the digital journey itself.

Brian: That’s cool that you guys are keeping these checkpoints, so to speak, as stuff goes out the door. You’ve got different lenses on it, looking at it from different quality perspectives, I guess you could say.

Earlier, you mentioned democratizing data science, and we hear this a lot. Are we talking about democratizing the results of the data science, so at some point there’s maybe a decision support tool or some kind of outcome coming from the data science? Is that what you’re talking about democratizing? Or are you saying it’s for data scientists of all levels of ability—more for the tool users as opposed to the [consumers..]?

Nancy: It’s about the capability. The ability to put these products in the hands of people who thought they might have been out of their reach, or too enterprisey, or just for big companies. That’s one of the key things that we want to do.

When you look at some of our products, they start really, really low. Cognos Analytics is another great example, where people might have had a perception that it’s really expensive, but we just introduced a new version of it, and it’s less than $100 a month. You can get these powerful tools for analysis for a lot less than you’d think.

Statistics is $99 a month, and some of our other products are significantly less, and it allows these companies that might not have considered doing business with us to start small and build up. That’s one of the key things we noticed as we shifted to a subscription model. With that, we started to see double-digit increases in the number of clients that were new to products.

Just because we opened up this new route to market doesn’t mean we didn’t maintain our enterprise face-to-face relationships—of course we did—but this allowed us to open up relationships with clients we might not have gotten to before.

Brian: How are the changes affecting the legacy users that you have? I imagine you probably have some people who are used to, “Don’t change my toolset,” like, “I’ve been using DB2 for 25 years.” How are they reacting to some of the changes? I imagine at some point maybe you have some fat clients that turn into browser-based interfaces and undergo some redesign. Is there friction between the legacy experience and the new one? Do you employ a slow-change mentality, or do you say, “Nope, we’re going to cut it here. We’re jumping to the new one and we’re not going to let the legacy drag us back”? Can you talk about how you guys make those changes?

Nancy: We’re shifting towards the subscription model, and our clients are, too. We have clients who are demanding that a subscription model is the only way they want to buy software. So it’s changing for them as well. I think in many ways, it’s a welcome change across the board. I can’t think of any negativity we’ve had with either the change to the consumption models on the subscription side or the new UI changes and things we’re doing to the products to really update them and give them a modern feel.

I know a lot of the onboarding is a welcome change, even for clients that are familiar with us. It helps them because they have to do less training internally to help people use the tool, because now we’re building it into the product.

Brian: How do you measure whether they’re accepting it? Do you wait for inbound feedback? Do you see if there’s attrition and then go talk to them? I imagine some attrition happens when you make a large tooling change. Is there a way to validate that, or why it happened? Was it a result of changing too quickly? Any comment on that?

Nancy: I think it’s a couple of things. We’re constantly monitoring the flow of MRR and the contraction of revenue, or the attrition that we get through some of our subscriptions, to see if there are any anomalies there. But we’re also always very in tune with NPS. A lot of our product managers live and die in the verbatims, and with the integration with Slack, they get a lot of them. The verbatims are coming right at them constantly, and they respond to them.

We are very, very in tune with NPS and the feedback we’re getting there. We’re also getting a lot of reviews now on our software through tools like G2 Crowd, and we keep an eye on that. The feedback doesn’t just come from one place. We’ll also look at things like the flow through Amplitude.

When our clients are coming in and during the trial, are they getting stuck someplace? Are they falling off someplace, either at a specific page like the pricing page, or as soon as they get the trial, because they don’t know what to do with it?

We look at things like that. We look at NPS, in particular after we’ve introduced new capabilities. Did our NPS go up? What’s the feedback? Are our clients truly embracing this? I think it’s a combination of things. There’s a lot of information, a lot of data, that we just need to stay in tune with. We’ve got a couple of dashboards that I know my team and the product team wake up with every day and take a look at. The core product managers stay very focused on NPS.

Brian: Do you have a way of collecting end-user feedback directly? I would imagine in your newer tools it’s easier to build some of that in, but is there any way for customers to provide feedback, or something like chat, or any type of interactivity directly in the tools you’re creating these days?

Nancy: Sure. We are rolling out more in-product nurture capability than we’ve ever had before. That gives them the ability to chat directly within the product, as well as schedule time with an expert. We’re working on making that even easier through a chat bot. So if you do get stuck and you’re chatting with that bot, you can schedule the appointment with an expert right there.

I think there are lots of ways to do that. Sometimes I worry that there’s too much data coming at us, but we [didn’t have enough..] before, so I’m not going to do that.

Brian: Right. It’s not about data, right? It’s, do we have information?

Nancy: Do we have information? Exactly. I would say my team spends a lot of time going through that: looking at Amplitude, analyzing the flows, looking at the patterns in the orders, in the data, and in the revenue. With the NPS feedback, it’s a combination of all of that that really gives us a good view.

We also look at the chat data and analyze some of the keywords coming across in chat. The Watson bots are constantly learning, which is great. We’re using machine learning to get smarter about what people ask about, and that’s giving us good insight into the questions they ask and the patterns of information they’re searching for, by product.

Brian: In terms of the net promoter score that you talked about, tell me how you interpret that information when not everybody is going to provide a score. You have nulls, right?

Nancy: Right.

Brian: How do you factor that in? That’s the argument against NPS as a leading indicator: sometimes it’s the absence of information. You may not be collecting positive or potentially negative feedback because people don’t even want to take the time to respond. Do you have comments on how you guys interpret that?

Nancy: I think you also have to accept that NPS is going to go up and down. If you have a client who has a particularly bad experience—it’s the week of Thanksgiving, there were only X number of surveys, and one of them had a bad experience—that could make your NPS score look like it dropped like a rock. You’ve got to look at it like the stock market. It’s more about the patterns over the long haul, and what’s coming across within those patterns of information and feedback the clients are giving you.

We react, but you have to look at the data set, you have to look at the environmental things that are happening, and take that all into consideration from an NPS perspective. We’re very driven by that, and it comes down from our CEO. She’s very cognizant of making sure that the product teams and the development teams are getting that feedback directly from clients.
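[Editor’s note: A quick sketch of why small weekly samples make NPS jumpy, as Nancy describes. NPS is the percentage of promoters (scores 9–10) minus the percentage of detractors (scores 0–6); the survey numbers below are hypothetical.]

```typescript
// NPS = %promoters (9-10) - %detractors (0-6), on a -100..100 scale.
function nps(scores: number[]): number {
  const promoters = scores.filter((s) => s >= 9).length;
  const detractors = scores.filter((s) => s <= 6).length;
  return Math.round(((promoters - detractors) / scores.length) * 100);
}

// A quiet holiday week with only 10 survey responses:
const week1 = [9, 9, 10, 8, 8, 9, 10, 7, 9, 10];
// The same week, with one bad experience swapped in for the last response:
const week2 = [9, 9, 10, 8, 8, 9, 10, 7, 9, 2];

console.log(nps(week1)); // 70
console.log(nps(week2)); // 50 -- a single response moved the score 20 points
```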

As an organization—and we’re a few years into this—the way we used to do that is we would have these client advisory boards. It was a small number of clients who would give us feedback on our products, our roadmap, and usability. The reality is that you end up building the product for 10% of your clients.

It’s been eye-opening for us as we’ve really opened that up. Obviously, we’re still getting feedback from a larger community, and from client advisory boards still, but NPS comments and feedback have really widened the aperture of the feedback we get from a broader scope of clients.

Brian: You brought up a good point. I had a client who, luckily, was cognizant of this. They did the same thing, where they flew their clients in, did two-day workshops, and gathered feedback from them. I was doing some consulting there, and he said, “Brian, I’d like you to just go walk around, drop in on some of the conversations, and just listen, but take it with a grain of salt, because I hate these freaking things. All we do is invite people that are willing to come for two or three days and tell us how much they love our stuff. It’s a free trip. We’re not getting to the people that don’t like our stuff…”

Nancy: Or don’t use it.

Brian: Or don’t use it at all. I love the concept of design partners, which is newer, where you might have a stable of customers who are highly engaged—but the good ones are the ones who will pummel you when your stuff is not working. They will come down on you and they will let you know.

So it’s really about finding highly communicative people who are willing to tell it like it is. It’s not about going out and finding a rah-rah, cheerleading crowd. Did that inspire the changes?

Nancy: Even in the client advisory councils that we had—I ran a couple of them for products like Netezza for a while—we started to change the way we ran those. I remember the biggest aha moment: we had a client advisory board for Netezza one year, not too long ago, and we decided to run a design thinking camp as part of the agenda, so that they would actually drive what they wanted from a requirements perspective, going through the design thinking process.

What came out of it was truly eye-opening. You know how a design thinking process progresses; I think even they were surprised at what they ended up prioritizing across the group of requirements. I think we’re really starting to think differently about that feedback from clients. I do remember that day, when we were looking at those things, and that was not where we thought we would end up.

Brian: Do you have a specific memory of something that was surprising to the group? Something you guys learned in particular that stuck with you?

Nancy: I think we were focused a lot on security at that point. At the time, there were a lot of issues around security, and that was one of our leading items going into the next version. What clients were actually not as vocal about was that, as they were using these appliances and the appliances were becoming more mission-critical, they were running more mixed workloads.

Yes, security was still incredibly important, but what was emerging beyond that for them was workload management, because they had this mixed workload emerging. So many different groups were jumping in with different types of workloads they had not anticipated on their data warehouse appliance. It was something that came out of the design thinking process that was important to them, and that they actually hadn’t been able to verbalize to us.

Out of that process, which was really, really interesting to us, we found we were on track with the requirements we had, but beyond that there were requirements we just hadn’t thought of and, quite honestly, they hadn’t verbalized.

Brian: You make a good point there. Part of the job of design is to get really good clarity on what the problems are, and they’re not always going to be voiced to you in words or in direct statements. It’s your job to uncover the latent problems that are already there and crystallize them, so that ideally your product manager and your leadership can understand them and make them concrete, because then you can go and solve them. Otherwise they stay vague, like, “We need better security.”

But what does that mean specifically? If you start there when really the problem had to do with mixed workloads and managing all of that, you can go down a completely different path. You can still write a lot of code, build a lot of stuff, and do a lot of releases, but if you don’t really know what problem you’re solving, then you’re just going through activity. You’re building more technical debt, you’re wasting money and time for everybody, and you’re not really making the experience better for the customer.

I think you made a good point that design thinking helps uncover the reality of what’s there when it’s not being explicitly stated. Support requests are not going to surface that type of information. They tend to be much more tactical. You’re not going to get a, “Hey, strategically I think the project needs to go in this direction.”

Nancy: Right, and if you had asked us an open-ended question, you would have gotten an answer that could have been interpreted slightly differently. I think this was when I became the biggest fan of design: there was this magical person running this design camp for me who got information that I didn’t think I could get to. And she knew nothing about the product. It was pretty amazing.

Brian: That can happen when you get that fresh lens on things, even when the person is not a domain expert. You get used to seeing the friction points that people have, and you can ask questions in a way that extracts information without bias. You’re not biased by the legacy that might come along with that product or even that domain space. It’s sometimes about having that almost first-grade lens: “Tell it to me like I’m your grandfather,” or, “Explain that to me this way,” and then you can start to see where some of those friction points are and make them real.

I always enjoyed that process of being really fresh. Maybe this happens for other people, but especially as a designer and consultant, coming into a product and a new domain and having that first-grader lens on it: “Hey, could you unpack that for me? What is the workload in there like?” They look at you like, “What?” and you make them unpack it, and you bring that full honesty to really get them to extract it out of their heads into words that you and everyone else can understand.

That’s where some of those magical things happen, like, “Oh my gosh. We had no idea that this was a problem,” because he or she thought it was so obvious, like, “Of course they know this,” and it’s like, “No. No one’s ever said that.”

Nancy: Right. We’re experiencing that now. We have a designer embedded in our team who’s focused on our growth products. Again, she’s coming in with a completely fresh set of eyes, and the perspective she brings on the experience is just so different—not completely different, but there are things that she flushes out that we would never have seen.

It’s really helping, because a lot of times, when you’re focused on the experience as opposed to the features and functions, you come at it from a different perspective. I don’t want to go to development and tell them this, because it’s like calling their baby ugly. But at the end of the day, the client needs to have a great experience. They need to see the value. If, when they’re just trying the product out, they don’t get to that aha experience—“I know how this will help me”—within 15 minutes, we’re just not nailing it. If they can’t figure out how to jump in and use the product, we’re not nailing it. It doesn’t matter how great the product is if they can’t figure out how to effectively interact with it.

Brian: Effectively, none of that stuff exists in their world. It just doesn’t exist, because they can’t get to it. So, effectively, it’s totally worthless. Whatever you have on that island, if there’s no bridge to get there, it doesn’t matter, because it’s just totally inaccessible.

Nancy: Right, and it’s sometimes harder for even the product managers to see it. I was sitting in a demo of a product that we’re going to be releasing, and the guy was cruising through the demo, my eyes glazed over, and I just looked and thought, “Boy, we’re going to need some onboarding with that.”

Great product, amazing capabilities, very complex and dense in its capability. It’s never really about whether it’s a great product. It’s about whether the client understands that it’s great when they start using it.

Brian: Yeah, and I think especially for analytics tools—highly technical tools used by engineers and other people who are used to working in this kind of domain—sometimes we gloss over stuff that seems like it would be totally easy, or just not important.

I have a specific example. I was working on a storage application, a tool for migrating storage between an old appliance and a new appliance. At some point during that migration, you need something as simple as, “Oh, I need a list of these host names and these IP addresses,” and some other information that’s basically setup-related stuff. All the tool needed to do was offer a CSV download of a bunch of numbers to be piped into the other system so the two could talk to each other. It’s not sexy. It’s literally a CSV. It was the only technical lift required, but it was not seen as engineering work: “It’s not part of the product. That has to do with some other product that you have to go type it into.” It’s like, yes, but then that bridge is never going to happen. It takes users forever to figure out where all these IP addresses, domain names, and all that kind of stuff are listed. It’s not sexy, but if you look at the big picture, the full end-to-end arc, and if we’re all aligned around what that A-to-G workflow is, there are six steps that have to happen there. This is not sexy, and it’s not a new feature, but it’s the blocker to getting from B to E. They never get to A, which is where the product begins.

Nancy: We definitely had those discussions in the early days about making it more consumable instead of giving it more features and functions, and whether we couldn’t really hack growth that way. That’s a mind shift: if you are a design-led organization, you get it, and we believe with every part of our being that we are.

Sometimes we still have that natural resistance—that we really need to add more features and functions to make this product grow—but I think we’ve really turned the corner on that. Digital really has been the catalyst for us, because we build the experience in the products as if there were no IBM sales team that’s going to surround you and help make you a success. That’s very different from the way we’ve done things for so many years, and the only way you can do it is by focusing on experience.

Brian: You bring up a good point, and I think it’s worth reiterating to listeners: you can add these features, but they come at a cost. The cognitive load goes up. Every time we add to the tool, we’re effectively reducing the simplicity of everything else around it. Typically, as a general rule, removing choice simplifies, because you’re reducing the number of things that someone has to think about.

So those features don’t really come for free. It’s almost like you take on a debt as soon as you add the feature, and then you hope you recoup it: “Oh, there’s high engagement. People are really using that,” so it was a win. If there’s low engagement with it, you’ve just added debt. It’s like Microsoft Word 10 years ago: you just added another menu bar and another thing that no one’s going to use, and now it’s even worse. The pile continues to grow, and it’s so hard to take stuff out of software once it’s in there, because you’re going to find, “You know what? IBM’s our client, and they’re using it. IBM pays us $3 million a year. We’re not taking that button out of the tool. End of story.” And now you have that short-term thinking: “We can’t take that out because Nancy’s group uses it.”

Nancy: That’s right, and you can’t point that out exactly. I think my favorite story when it comes to that is the Instagram story people talk about, where it launched as a product called Burbn. It had all of these great capabilities, and it was going nowhere. So they dug into the usability side of things and asked, “Well, what are people actually using?”—which is what we do as well from an instrumentation perspective—and found that people were really only using a couple of things. They wanted to post a picture, comment on a picture, and like a picture. So they said, “Let’s just do three or four things, do them really great, and relaunch the product,” and of course the rest is history.

I think that’s a great illustration that more features and functions, if they’re not important, relevant, and consumable—all three of those things—are not going to give you growth. It comes down to: is it easy to use? Can I get value out of it? Do I immediately see that I can get value out of it? That’s all product-market fit. That’s where we shifted our focus, and digital has helped us do that. That’s why my job is so cool.

Brian: Cool. This has been super fun. Can you leave us with an anecdote, a big lesson learned, or something you might recommend to people who are building internal tools, internal enterprise software, or even SaaS products? Something like, “Hey, if I was starting fresh today, I might do this instead of that.” Anything from your experience you could share?

Nancy: For me, the biggest thing is really focusing on product-market fit, because we sometimes build something to be competitively great, but not necessarily competitively great and competitively different. You have to understand that you not only have something that solves somebody’s problem, but that it does so in a way that’s unique and so valuable that they’ll pay the price that’s appropriate for it.

You’ve got to start thinking about that upfront, because oftentimes we’ll build something we see a market opportunity for, but we may not truly understand product-market fit—where we know who the target is, what they’ll pay for it, what the value is, and how to get to them. You’ve got to start with that upfront. You really have to understand product-market fit, or you’ll never be able to grow the product.

I’ve got a lot of religion around that, and we try very, very hard to create pricing and packaging around making sure we hit it, but the product has to have that value. It can’t be too overwhelming, it can’t be too underwhelming; it’s got to hit that great value spot.

Brian: Fully agree on getting that fit upfront. You save a lot of time and avoid a lot of technical debt, instead of jumping in on projects that you’re going to have to change immediately because you find out after the fact, and now you’re starting out like…

Nancy: Be an Instagram, not a Burbn, right?

Brian: Exactly. Tell us, where can people find you on the interwebs out there?

Nancy: I probably spend a lot of time on Twitter—maybe not so much lately; it’s been a little bit crazy—but you can find me on Twitter @nancykoppdw […] or on LinkedIn. I’m also on Medium. I haven’t done as well with blogging, but getting back to it is one of my goals. I’m usually out there on Medium or Twitter talking about growth hacking and digital transformation. I do podcasts as well.

Brian: Cool. I will put those links in the show notes. Thanks for coming on to talk with us, Nancy. It’s been fun. This has been Nancy Hensley, Chief Digital Officer of IBM Analytics. Thanks again for coming on the show, and I hope we get the chance to catch up again.

Nancy: Thank you.
