Dean Malmgren cut his teeth as a chemical and biological engineer. In grad school, he studied complex systems and began telling stories about them through the lens of data algorithms. That led him to co-found Datascope Analytics, a data science consulting company which was purchased by IDEO, a global design firm. Today, Dean is an executive director at IDEO and helps teams use data to build delightful products and experiences.
Join Dean and me as we explore the intersection of data science and design and discuss:
“One of the things that we learned very, very quickly, and very early on, was that designing algorithms that are useful for people involves a lot more than just data and code.” — Dean
“In the projects that we do at IDEO, we are designing new things that don’t yet exist in the world. Designing things that are new to the world is pretty different than optimizing existing processes or business units or operations, which tends to be the focus of a lot of data science teams.” — Dean
“The reality is that designing new-to-the-world things often involves a different mindset than optimizing the existing things.” — Dean
“You know if somebody rates a movie incorrectly, it’s not like you’d throw out Siskel and Ebert’s recommendations for the rest of your life. You just might not pay as much attention to them. But that’s very different when it comes to algorithmic recommendations. We have a lot less tolerance for machines making mistakes.” — Dean
“The key benefit here is the culture that design brings in terms of creating early and getting feedback early in the process, as opposed to waiting you know three, five, six, seven months working on some model, getting it 97% accurate but 10% utilized.” — Brian
“You can do all the best work in the world. But at the end of the day, if there’s a human in the loop, it’s that last mile or last hundred feet, whatever you want to call it, where you make it or break it.” — Brian
“Study after study shows that 10 to 20% of big data analytics projects and AI projects succeed. I’ve actually been collecting them as a hobby in a single article, because they keep coming out.” — Brian
Brian: Hey everyone. This is Brian, and I’m really happy to share my conversation with Dean Malmgren today. Dean was the co-founder of Datascope, a data science consulting firm that was acquired by IDEO, and he’s now an executive director at IDEO, which, if you’re not familiar with it, IDEO is a global design consultancy, and they create positive impact through design.
Brian: And the reason I wanted to talk to Dean and share this conversation with you is that I think Dean has some really great perspectives about how human-centered design can have really positive impacts on the work that data science groups are doing.
Brian: We talked a lot about talking to customers early. We talked a lot about prototyping and working in really low fidelity in order to get feedback early on solutions, so that we don’t spend a lot of time building things that people don’t want to use. So it was really fascinating to hear Dean’s perspective, since he comes more from the world of data, but he’s now working inside one of the world’s largest design consulting companies. So without further ado, here’s my conversation with Dean.
Brian: Welcome back to Experiencing Data. This is Brian, and I’m happy to have Dean Malmgren on the line. How are you, Dean?
Dean: Good. How are you doing, Brian?
Brian: I’m doing great. It’s fall, and it’s starting to cool down, but we’ve got the awesome foliage outside, so as they say around here, the leaf peepers are coming up to see the leaves change in New England. But yeah, Dean, you are the executive director at IDEO, and the co-founder of Datascope, which was acquired by IDEO.
Brian: And I’m super happy to talk to you, because you’re sitting in a really interesting place. At least, I think it’s interesting, and I think the listeners here on the show are going to be interested too, because you have a data science background, you’ve run a company there, and you got sucked into a giant design company.
Brian: And so these two worlds are coming together, and that’s what I wanted to talk to you today about. So can you tell people a little bit about your background, and what you’re doing at IDEO now, and yeah, give us a little overview?
Dean: Yeah, sure. So, just by way of background, I was formally trained in chemical and biological engineering, and I was in a research group in grad school that was studying complex systems. And as I reflect back on that experience today, what that largely involved was taking these systems that had various statistical properties associated with them and telling stories about them through the lens of data and algorithms.
Dean: And so the inspiration for starting Datascope really came from wanting to scale that expertise more broadly into the commercial space, and finding ways to design algorithms that would be useful for people. I think, I mean one of the things that we learned very, very quickly, very early on was that designing algorithms that are useful for people involves a lot more than just data and code.
Dean: It involves thinking about interfaces, how you can establish trust with people, and thinking about services and lots of other things that can bring those recommendations or personalizations to life in a meaningful way. And so along the way, we took a lot of inspiration from the agile development community, the lean startup community, and in particular, the design thinking community.
Dean: And as you might imagine, along the way, we became pretty good friends with IDEO. They were initially informal advisers, so we’d get together with them for coffee from time to time, shoot the breeze about the intersection of data and design. We’d pick their brain about scaling a creative practice.
Dean: So from those conversations, I ended up doing an externship swap at IDEO where I went and did a couple of weeks worth of work on a project. We had one of their designers coming to work with us for a couple of weeks. We collaborated on projects. Then we were partnering on things. As we reflected on that experience, what we realized is that we did some of our best work when we treated data science as a discipline of design.
Dean: It allowed us to prototype things much more authentically in a real context, solicit meaningful input from users, and ultimately have better impact for our clients and their customers. So that was the thesis behind IDEO acquiring Datascope almost two years ago to the day.
Dean: And since then, we’ve been pushing the edge of what it means to practice human-centered data science, and also what it means to be an algorithm designer at IDEO. So it’s been a really interesting journey, and pleased to be here to share whatever might be useful for you and the listeners.
Brian: That’s awesome. Yeah, no, I have tons of questions on this. So, my first one goes back to, at some point, somebody, I’m assuming, picked up the phone and called someone else because they had a challenge here, and if I remember correctly, it was you probably inquiring with IDEO, is that correct?
Dean: No. Well, we were actually introduced through a mutual acquaintance.
Brian: Oh, okay.
Dean: Yeah.
Brian: How did you get to that point? Like, what was the symptom or feeling, or some impetus, where someone felt like there’s something to explore here? Either a particular project or something else. Was something not going well on an engagement, on one of your ends? Or was it more like, “We have no idea what would happen if we put these together?” What was the driver?
Dean: You know that’s a good question. I mean I suppose the way that I tell this story is that I was first introduced to who was then the managing director at IDEO Chicago. His name is Ian Roberts. He’s now the chief operating officer at IDEO. And he was particularly interested in better understanding all the hullabaloo around big data, AI, machine learning.
Dean: It’s a field that’s full of jargon, and not terribly human-centered.
Brian: Yeah.
Dean: I think he was particularly interested in better understanding the nature of data as a medium for design. Through our collaborations over the years, I think we’ve fleshed out what that actually means in practice, and it’s given us a unique point of view on how you can design algorithms in parallel with services and products and interfaces and everything else. Does that make sense?
Brian: Yes. Yes, it does. So now that you’re inside IDEO, what is it like compared to before you were working with IDEO? Can you paint a before-and-after picture in terms of how you work on building data products, or solutions with data? What’s that before and after?
Brian: I’m trying to paint a picture for the listeners of what it means to integrate design with data science, because there may not always be a huge interface output, you know, like a software application that’s completely based on the data science, or whatever. It could just be a small part of something. So can you help us understand what that before and after is like?
Dean: The way that I would characterize it is that today, we treat data science as being embedded in design teams. Projects at IDEO typically have, you know, three to five people on them, and almost everyone on that team will come from a different background: interaction design, design research, software design, data science, you know, the list goes on.
Dean: But it will involve these different perspectives that allow us to explore an idea from lots of different angles. And so, treating data science as a discipline of design means that it is very embedded into the core team. Whereas before joining IDEO, we were in one of three buckets.
Dean: We would either be doing data science to lead to designs, so that’s particularly valuable when you’re, there’s a big question around technical feasibility, and you just need to be careful not to over-engineer solutions before they’ve actually seen the eyes of customers. We also practice data science when it follows design.
Dean: So actually, one of the projects that IDEO and Datascope collaborated on early on was something that a design team at IDEO developed and then handed over to us to refine and pilot with the client to improve the algorithms. I can talk more about that later. And the third way that we would often collaborate was actually in parallel to design teams.
Dean: So having a data science team separate from a design team. And I see that pattern pretty often, especially at our clients where they’ll have data science teams that are working in parallel with their design teams, and that allows you to keep separate cultures and move relatively fast in a deep way.
Dean: But there is the risk of miscommunication across those teams. So anyway, those are all really interesting modes of working, and I’ve kind of worked in all of them. But the biggest difference is sort of treating data sciences as a discipline of design today.
Brian: I’m going to throw a scenario at you that maybe is typical or not, I don’t know, for an ideal project. Maybe you come into a, you know, brick-and-mortar, non-digital company, and they have a data science team. Maybe you guys are providing some expertise there or something. Maybe they have a product design or UX design group that perhaps works on software applications, or the eCommerce website, or something like that.
Brian: Is part of your work to bring those worlds together, and you interface with that? Or do you necessarily try to glom those two together, or is it really based on the project? I’m curious: is it an essential ingredient in most of the projects, or does it really depend on what the work is?
Brian: Because I’m assuming if you come in, right, they’re coming to you, there’s probably some assumption there’s an intersection of design and data science that’s required, or at least they’re interested in that. Like could you paint that for us?
Dean: In the projects that we do at IDEO, we’re, I mean generally speaking, as a design firm, we are designing new things that don’t yet exist in the world. Designing things that are new to the world is pretty different than optimizing existing processes or business units or operations, which tends to be the focus of a lot of data science teams.
Brian: Mmhmm.
Dean: That’s not strictly true, for sure, and for anybody that I just offended, please forgive me.
Brian: Our listeners have no way to throw rocks at you on this show, ’cause…
Dean: They’re making fun of me, though. That’s the problem.
Brian: That’s right.
Dean: And anyway, when we get started with a client, almost always they have a data science team today, and almost always they have a design team, too. The reality is that designing new to the world things often involves a different mindset than optimizing the existing things.
Dean: And so sometimes, it will involve working alongside those data science teams and encouraging more glue between data science teams and design teams. But other times, it will mean helping those firms think about how they can bring in different types of skills to accelerate the process by which they design things.
Dean: So it’s a bit of a … I mean there’s always a balance there, for sure, but we’ve definitely noticed that, in some circumstances, existing data science teams are not a great fit for creating new stuff. They tend to be dogmatic about particular tool sets, or perhaps not really thinking holistically about the entire experience, and overly focused on the technical side of things. And so prototyping involves figuring out what the biggest risk factor is and addressing that in a head on way.
Brian: And I assume that that blocker can be a people thing, as opposed to a, “We don’t have the data for this,” or “The data is dirty,” or some technical impediment. Am I correct about that?
Dean: Yeah. That’s my experience, anyway. People are really tricky. We have all kinds of nuance, we have very distinct preferences and tastes, and algorithms really don’t get the benefit of the doubt when you get it wrong. Whereas people do, actually. You know if somebody rates a movie incorrectly, it’s not like you’d throw out Siskel and Ebert’s recommendations for the rest of your life.
Dean: You just might not pay as much attention to them. But that’s very different when it comes to algorithmic recommendations. We have a lot less tolerance for machines making mistakes,
Brian: Mmhmm.
Dean: So designing those interfaces and services in a way that makes that a little bit more palatable is really important.
Brian: What I’m hearing from Dean is that there’s some benefit that my team could get. Maybe I’m a CEO, or CIO, or a director, and there’s apparently some benefit I can get from integrating with design that’s going to improve the work my team is doing. And let’s assume it’s not just some group working on, you know, optimization of internal processes or something like you were talking about.
Brian: What would they do when they pick up the phone and call the, you know, design department, or call some external design firm? What could they get out of that, and what is the opportunity there for them?
Dean: I regularly recommend with clients and anybody that I speak with that it’s so important to prototype as quickly as possible. I think we tend to over index on using the latest and greatest technology, when some of the simplest stuff will do. And more often than not, the the biggest point of friction
Brian: Mmhmm.
Dean: Isn’t the prediction that we’re making or the recommendation or how it’s personalized, but rather in how that information is communicated.
Dean: And so, working side by side with people that are experts in interface design, or service design, or anything else, allows you to think about both sides of that equation at the same time rather than over indexing on the technology.
Dean: I can’t tell you the number of projects that we’ve done where we’ll start out with the simplest possible thing, knowing that the data was really, really dirty, and it had all these different problems, but that, at the end of the day, that didn’t matter nearly as much as making sure that we had the visual right, or that we were communicating our confidence in an appropriate way to build trust with the users.
Dean: I tend to feel like prototyping rapidly is a skill set that sometimes doesn’t come naturally to data scientists. And by rapidly, I mean if you had a day to make a thing, what would you leave work with that day? Or an afternoon, or maybe it’s two hours, or whatever it is. But giving yourself those artificial time constraints to force things to get into the hands of people as early as possible is really important.
Brian: So the key benefit here is the culture that design brings in terms of creating early and getting feedback early in the process, as opposed to waiting you know three, five, six, seven months working on some model, getting it 97% accurate but 10% utilized.
Dean: Yes, exactly.
Brian: Got it, got it. Are there any other things that you think, you know, data science and analytics people can learn from designers to improve the work that they do? We talk a lot on this show about the non-technical aspects, and this people aspect, right? You can do all the best work in the world, but at the end of the day, if there’s a human in the loop, it’s that last mile or last hundred feet, whatever you want to call it, where you make it or break it. Right?
Brian: Because if they don’t use your service, your application, your solution, whatever it may be, then it doesn’t matter what model you used, how great it was, or what paper you wrote about it, from a business perspective. You know, you got some technical exercise in, but there wasn’t a value outcome. So, can you talk to us about that?
Dean: One thing that I get really fired up about is the importance of data scientists talking to the users for whom they’re designing.
Brian: Mmhmm.
Dean: And that is not something that I think happens very regularly in practice. All too often, as data scientists, and I’m guilty of this earlier in my career for sure, would tend to analyze data versus like do some exploratory data analysis on a thing first, before you would make something, versus talking to the people that you’re designing for, before you even analyze the data and do anything else.
Dean: And I think that, by developing pretty deep empathy for the people that you’re trying to design something for, gives you a much clearer intuition for the types of things that they would find useful. One of my favorite stories along these lines was actually the first project that we did for Procter & Gamble.
Dean: They were interested in rolling out a new technology across their R&D department, which, at that time, consisted of about 2,500 people in that community of practice. What they would do is, they had had a lot of success training people, like a small set of ambassadors, they called them, at a training summit for about a week.
Dean: And then those ambassadors were tasked with going out into the world and training their peers on this new technology. We got really inspired by this. I guess just to finish that thought, the challenges that they … While the training had been very successful the, generally, these things had not caught on, and they had limited change impact.
Dean: And the reason was because individuals that were selected were not fundamentally trusted by the average Joe employee at P and G. And so what we did is we designed a way to identify the organically trusted people within the organization by doing a couple of things. So we did a social network survey to understand how people were connected across the organization.
Dean: So we asked them things like, questions like, who do you go to for advice when you’re having a bad day? Who do you seek out? Who comes to you with their questions? As a way to start to understand how trust flowed in the system. And then based on that, we ran simulations, epidemiological simulations, to try to simulate the spread of this new technology.
Dean: And normally, when you use epidemiological simulations, you’re trying to suppress the spread of a disease. In this case, we were trying to create a pandemic as quickly as possible. And so it became a matter of optimizing the group of people that would maximally spread this new technology as quickly as possible.
Dean: This is the broad context for what I’m about to share, which is that our first deliverable in this particular case was a spreadsheet of a list of names. And I mean we walked our client through the process that we did, and how rigorous it was, and cited all the academic research that had been done in some of our spaces. We were really proud of the technology.
Dean: But we had actually totally over engineered the solution, to be totally honest with you. But the reality was that when we shared a spreadsheet of names, it didn’t give our client an opportunity to really trust the results that we’d come up with. And so we ended up having to build out like a web interface that allowed you to see those those results in real time, and also to edit the people that you might recommend for that training summit.
Dean: And only by doing that last piece did the rest of the managers at P and G come to trust the results and end up using the vast majority of the people that we recommended for that that ambassador summit. When you talk about the need to prototype things, and make sure that we are getting trust with the people involved, people are really tricky.
Dean: They have nuance perspective, they have very particular tastes, and without giving them a view into the black box to develop trust for what they’re seeing, it’s just very hard for people to interact with that in a meaningful way.
Brian: That’s a great story. I want to try to summarize something I heard, and you tell me if this is accurate or not. I’m guessing that the client did not hand you a well-formed problem statement that said, “And by the way, we probably won’t believe your spreadsheet when it comes up, unless it allows us to edit the names, so please make sure there’s an edit-person functionality.” That was probably never stated, right?
Dean: No, no, it wasn’t. It was, “Help us find the right set of people to invite to this training summit.”
Brian: Right. So yeah, I think that’s something that happens frequently, right? The problems need to be discovered and found, and talking to the customers and the users and getting that feedback early helps you uncover some of these latent problems that will simply not be stated, because the client, or the customer, may not understand that that’s actually an issue for them.
Brian: Because they don’t have the context to announce that. They can’t see it yet. Right? They need to see the solution to understand what they don’t like about it, and to provide more feedback to you.
Dean: Uh-huh.
Brian: So I imagine this is part of your regular process, and many designers go through this, right? You want to figure out that make-it-or-break-it part and understand it early, so that you can start to address it earlier on the technology and solution side.
Dean: Right. I mean, if I were to do that project all over again, the first thing I would’ve done was talk to all those managers to learn about how they choose people today, and what they would like to see differently in the future.
Dean: And that would have uncovered all kinds of things that we had to figure out after the fact. Which was fine, we were able to adjust, and you know, we’re here today telling the story, so it’s all good, but it would definitely be something I would do differently today if given the chance.
Brian: Mmhmm. Do you think these behaviors and activities that we’re talking about are simply part of what it’s going to mean to be a successful data scientist going forward? Or is it really a discipline that needs to live in a separate body, a team, staff, you know, or an external partner, or whatever it may be?
Brian: Talk to me about this. Is it part of the job description to be going out and talking to customers and prototyping? How much of it is that? I could hear some data scientists say, “That’s not my job. I was hired to do the math part, and the modeling.” Talk to me about that a little bit.
Dean: Yeah. And I think there are data scientists for whom that job is appropriate, too. I mean, I really do think that there are lots of different … The title data scientist means so many things,
Brian: Yeah.
Dean: And simultaneously nothing at the same time, so that’s one part of the challenge. I’ve been leaning towards using the title of algorithm designer recently
Brian: Mmhmm.
Dean: To describe what we do, which is slightly different than what I see out in the world for most data scientists.
Dean: At the end of the day, it doesn’t really matter what you call it, it involves using data to make predictions or recommendations, personalizations, whatever, and do that in a context that’s relevant for people. Sometimes you should be …
Dean: Like once you know that an algorithm is really valuable and improving its accuracy has a meaningful business impact or impact for people, then it does make sense to eke out those last few decimal points, and you see massive teams at Google and Facebook that do exactly that.
Brian: Mmhmm.
Dean: For a lot of things that you’re just getting off the ground, I do think you’d benefit from a much more agile mindset and a willingness to experiment rapidly and broadly, just to see what works and what doesn’t with different people. And that that’s yet another skill set.
Dean: Another skill set that I see out in the world is analyzing data that already exists to try to prove out hypotheses that we observe in the world. That sort of analyst or business intelligence role is also very important, and has a very similar skill set as well.
Dean: So I feel like there are these different phenotypes that exist out in the world. I don’t think everybody is necessarily, fits into one bucket, but that, rather, that spectrum of things does exist, and they’re all important in their own right.
Brian: Mmhmm, Mmhmm. When someone picks up the phone and calls you about a project, is it typically because they’ve gone through a process where they had an outcome they didn’t like? They built the wrong solution, or people aren’t using the interfaces, they’re complaining about usability? Is that the impetus for calling sometimes, that something is off and the human part’s not connecting? Or is it usually something simpler?
Dean: Fortunately for us, that does not tend to be the case.
Brian: Mmhmm.
Dean: It tends to be more of a strategic decision,
Brian: Mmhmm.
Dean: Wanting to take advantage of the data assets that they might have, or looking to grow top line revenue in a way that they’d might not otherwise expect.
Dean: We’ll get started projects with them by exploring what those those opportunities might look like, usually without necessarily knowing for sure whether it’s going to be a data solution or not. But in 2019, when you’re designing things, more often than not, it tends to be the case that people find a lot of value in having intelligent products and services, and so it tends to be more a part of the mix more than it doesn’t.
Brian: You’ve talked a lot here about what the practice of data science, what algorithm designers as people, can learn from designers and the design behaviors and activities that we do. Is there something that the design world needs to learn from data science?
Dean: Hmm. I think one of the things that a lot of the designers at IDEO have appreciated from working with data scientists is the mode of thinking quantitatively yet creatively.
Brian: Mmhmm.
Dean: Oftentimes, we find ourselves designing systems that are pretty complicated, and you could think about designing one part of that system in perfection, but it might actually destroy the rest of it.
Dean: It’s kind of like the butterfly effect. So one of the things that we find ourselves doing pretty frequently actually is simulating how all these pieces fit together, so that you can see how a different service might feel.
Dean: So for as one example, we are doing this project for a major research hospital in the U.S., and one of the things, the brief was all around improving the patient experience for their hospital. They’re in the middle of a hospital redesign.
Dean: And so in this particular case, we took their existing data and kind of modeled out the process of going to the pharmacy, waiting in line for your meds, then waiting in line to see a doctor, then waiting in line to fill your prescription, and then finally leaving.
Dean: And because the way that inventory was managed and the number of people that were working at various hours of the day, you could see those individuals, the patients that were going through that experience, had a horrible, horrible time of it. And you could just see that their wait times would explode when certain conditions happen.
Brian: Mmhmm.
Dean: And it turns out that those conditions happen very frequently. And so designing systems that actually decoupled a lot of these serialized processes that are maybe optimal for businesses, but definitely suboptimal for patients, is a really interesting lens into how we think about all of those things.
Brian: As a data scientist, then, in that type of setting, when you’re talking about this hospital experience, in terms of the roles and responsibilities, I’m always curious how much you model the data scientist as not just there to do the technical math part. You’re there also to help figure out what the problem is that we’re trying to solve with this patient experience thing. It’s kind of like a broadening of the roles.
Brian: Is that typically how you shape things at IDEO, where everyone’s wearing a bit of a wider hat, at least in parts of the project before they go kind of deep and narrow on a particular area?
Dean: Yeah, absolutely. I mean I think the idea for creating that simulation in the first place came from the team, including a data scientist, being out in the field and observing people going through this experience and realizing that people were starting to get really agitated. In fact, you can see it on their faces.
Dean: One of the things, the clever things that they did with this simulation, is that they had these circles that would kind of go through the system. You could imagine them sitting in fake chairs, and then moving onto the next room to sit in another fake chair, and so on and so forth.
Dean: When they had been waiting in any particular space for more than, I don’t know, an hour or whatever, they would start to get angry and vibrate. We lovingly refer to that as the angry balls simulation, because it really brings to life the sentiment that people have in the room, and of course, the staff at this hospital don’t feel great about it, either.
Dean: You know they would love for it to be different, but that’s kind of outside of their purview. So this gave everybody the opportunity to have a common language to share, and in creating these, the next generation of designs for this building. And that came to life because there was a data scientist on that project that was working alongside environments designers and interaction designers, and so on and so forth, that could bring to life a broader experience.
Dean: But this particular piece was really important for getting alignment, and making sure that everybody understood what they were aiming for.
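The “angry balls” idea Dean describes can be sketched in a few lines of code. This is a hypothetical toy version, not IDEO’s actual simulation: the room names, wait-time distributions, and the roughly one-hour anger threshold are all invented here for illustration.

```python
import random

# Toy sketch of the "angry balls" simulation: agents move through a
# sequence of rooms, accumulate wait time, and any agent who waits
# past a threshold in one space is flagged as "angry."
ANGER_THRESHOLD_MIN = 60  # minutes; invented threshold

def simulate_patient(rooms, rng):
    """Return (total wait in minutes, whether the patient turned angry)."""
    total_wait = 0.0
    angry = False
    for room, max_wait in rooms:
        wait = rng.uniform(0, max_wait)  # minutes waiting in this room
        total_wait += wait
        if wait > ANGER_THRESHOLD_MIN:
            angry = True
    return total_wait, angry

rng = random.Random(42)
# Hypothetical rooms with their maximum plausible waits, in minutes.
rooms = [("registration", 30), ("waiting room", 90), ("exam room", 45)]
results = [simulate_patient(rooms, rng) for _ in range(1000)]
angry_share = sum(1 for _, a in results if a) / len(results)
print(f"share of simulated patients who turned angry: {angry_share:.0%}")
```

Even a crude sketch like this makes the sentiment in the room visible to everyone on the team, which is the point Dean is making about shared language.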
Brian: Mmhmm, Mmhmm. So Dean, the success rate for big data analytics projects, now it’s AI projects and solutions. Study after study shows that we’re looking at 10 to 20% on average from the studies that I’ve read. I’ve actually been collecting them as a hobby in a single article, because they keep coming out.
Brian: Why are we batting such a low average here? Can you talk a little bit about your experience with why that is? And not so much the technical issues, like the data wasn’t available, these kinds of things. What are the other reasons that we bat so low here?
Dean: Yeah, I think fundamentally, it boils down to an issue of trust. When we are designing algorithms for people, at the end of the day, algorithms make mistakes. They always do. If they didn’t make mistakes, they would be perfect, and that is very rare. If you can find one of those, you should use it on the lottery or sports bets.
Dean: But I think when people interface with the algorithms, they just tend to question the results pretty scrupulously. And in that example with the Procter & Gamble thing, it really required people to get comfortable with what this algorithm was doing for them in a way that was difficult to anticipate beforehand.
Dean: They had to play with it, they had to interrogate the results. They had to wrap their heads around what it was telling them that was different from what they believed. And exposing people to those new thoughts is often challenging, particularly when they’re uncertain.
Dean: So that’s one of the reasons why I find it to be really important for data scientists to work side by side with designers, to make sure that we’re thinking really carefully about the way we communicate uncertainty and the degree of confidence with which we express our results. One example I like to share: I’m a big Spotify Discover Weekly fan, and among the things I think are brilliant about that service is the naming.
Dean: It’s Discover Weekly. It’s not your favorite music list for 2019. You know if they called it your favorite music list for 2019, you probably wouldn’t go back to it time and again. It would get old, it would get stale. The likelihood that you might actually find music that’s new to you would be totally different.
Dean: In the same way, when Google serves up search results, they don’t give you one hit, although there is the “I’m Feeling Lucky” button. They show you a page of 10 hits, and if you click on any of those, they call that a win. But they’re in fact returning millions of results.
Dean: By communicating things in a way that allows people to understand how trustworthy something is, in the case of Spotify, it’s that, “Look, we’re going to refresh this list every week. If you don’t like it, just let us know.” In the case of Google, it’s, “Hey, here’s the things that we think are most relevant. Either go to the next page or try a different query, because we’re doing the best that we can here.”
Dean: But those interfaces actually give you the ability to interrogate them in a way that builds trust with users. I think that’s the name of the game.
Brian: Yeah. I would add to that that it’s, it’s really hard to develop trust with users if you’ve never spoken to one as well.
Dean: Also true. Absolutely.
Brian: Yeah. So, slightly technical question here. You know I’m a designer by trade, and not a data scientist, but I do know that certain tools and techniques and models can be augmented with explainable AI packages, so that you understand which features are correlating with why a model is predicting what it does.
Brian: So I’m curious, do you tend to avoid certain models? If I understand correctly, with things like deep learning, you can’t use an XAI package that’s going to explain how the model came up with its predictions. You’re basically into black box territory.
Brian: So if you find that, well, this is the right technical tool, because perhaps there’s a level of accuracy that the client needs, do you then look at design as a way to either build trust or control in other ways, since the technology itself can’t provide the transparency? Or do you just tend not to use that tool unless it’s absolutely necessary?
Dean: That depends on the use case.
Brian: Mmhmm.
Dean: Most use cases that I’ve seen tend to benefit from some degree of explainability to build trust, but there are certainly situations for which that doesn’t matter. I think it really depends on the relative importance of getting it right
Brian: Mmhmm.
Dean: With the relative importance of understanding why something is making a recommendation that it’s making.
Brian: Mmhmm.
Dean: For example, autocorrect texts. What do you call that?
Brian: Auto complete, or whatever. Or the suggestion words, grammar suggestion.
Dean: Yeah. Look at me. I’m super tech savvy. So those don’t have any degree of explainability.
Brian: Mmhmm.
Dean: I have no idea why it recommends certain things
Brian: Mmhmm.
Dean: For me other than they’re words that I use in combination relatively frequently,
Brian: Right.
Dean: But I appreciate that that works as a service when I’m on the train and I just have one hand,
Brian: Mmhmm.
Dean: And can barely send a text message or something.
Dean: Whereas other things, perhaps balancing a portfolio, or making decisions about where to live, those do matter a lot,
Brian: Mmhmm.
Dean: So I think it just depends on that use case.
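The kind of model-agnostic explainability Brian asks about can be illustrated with permutation importance: shuffle one feature, measure how much the model’s accuracy drops, and you learn how much the model relies on that feature, even when the model itself is a black box. This is a minimal toy sketch; the “model,” feature count, and data below are all invented for illustration.

```python
import random

def black_box(row):
    # Stand-in for any opaque model; by construction it depends only
    # on feature 0, so feature 1 should come out with zero importance.
    return 1 if row[0] > 0.5 else 0

rng = random.Random(0)
X = [[rng.random(), rng.random()] for _ in range(500)]
y = [black_box(row) for row in X]  # labels the model matches perfectly

def accuracy(rows):
    return sum(black_box(r) == label for r, label in zip(rows, y)) / len(y)

baseline = accuracy(X)

# Permutation importance: shuffle one column at a time and record the
# drop in accuracy relative to the baseline.
importances = []
for j in range(2):
    shuffled_col = [row[j] for row in X]
    rng.shuffle(shuffled_col)
    X_perm = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, shuffled_col)]
    importances.append(baseline - accuracy(X_perm))

print("importance per feature:", importances)
```

Because the technique only needs predictions, not model internals, it is one way to offer some explainability even in the black-box territory Brian describes; libraries like scikit-learn ship a production version of this idea.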
Brian: Right, right. Cool. We’re almost up at our mark here, but I did want to ask you, do you have any closing advice for data scientists listening to this show, or analytics leaders, people in this space who are trying to figure out: how do I deliver better value with the work that we’re doing with data?
Dean: With the data science community in particular, I like to encourage folks to be as low tech as possible, as fast as possible.
Brian: Mmhmm.
Dean: We tend to work in code, and with algorithms, and create plots for ourselves, and that’s a really valuable skill set once you know what you should be making in the first place.
Dean: And I’ve always found it to be really productive and helpful to work backwards from a problem: to think about the interfaces that you might show to someone that could help them make a better decision, or that they might want to see, and then work backwards from that interface to think about the data sets and the algorithms you would need to support it.
Dean: It’s just way too easy to over engineer a solution without making sure it’s anchored in people’s needs. So that would be the advice that I would give to folks. Start with Crayola tech.
Brian: Love it. That’s awesome. Awesome. So Dean, where can people find out more about you? This has been a great talk, and I imagine you might have some people pinging you to learn more. So what’s the best place to do that?
Dean: Yeah, yeah, certainly. You can find me on Twitter, my handle’s @DeanMalmgren, and there’s more information on the IDEO website, and certainly on the Datascope website as well.
Brian: Cool. Yeah, this was a great chat. I’m happy to link those up in the show notes. So by the way, what are you listening to this week? What did it say you should like and that you did like? Anything come to mind?
Dean: I have been on a random Muse kick, recently. So it has me going deep in Muse right now.
Brian: All right. Awesome. Well, cool. It’s been great. Great to talk to you about these two worlds coming together, being together, and I appreciate you coming on Experiencing Data.
Dean: Thanks Brian. Thanks for having me.