060 - How NPR Uses Data to Drive Editorial Decisions in the Newsroom with Sr. Dir. of Audience Insights Steve Mulder

Experiencing Data with Brian T. O'Neill
March 09, 2021 | 00:36:52

Show Notes
Journalism is one of the keystones of American democracy. For centuries, reporters and editors have kept those in power accountable by seeking out the truth and reporting it.

However, the art of newsgathering has changed dramatically in the digital age. Just take it from NPR Senior Director of Audience Insights Steve Mulder – whose team is helping change the way NPR makes editorial decisions by introducing a streamlined and accessible platform for data analytics and insights.

Steve and I go way, way back (Lycos anyone!?) – and I’m so excited to welcome him on this episode of Experiencing Data! We talked a lot about the Story Analytics and Data Insights (SANDI) dashboard for NPR content creators that Steve’s team just recently launched, and dove into:

- Why a small audience insights team invests in qualitative research alongside analytics
- How internal user research and lean prototyping shaped the design of SANDI
- Bringing data into NPR's morning editorial planning meeting
- Why NPR talks about "data-informed" rather than "data-driven" decisions

Quotes from Today’s Episode

People with backgrounds in UX and design end up everywhere. And I think it’s because we have a couple of things going for us. We are user-centered in our hearts. Our goal is to understand people and what they need – regardless of what space we’re talking about. We are grounded in research and getting to the underlying motivations of people and what they need. We’re focused on good communication and interpretation and putting knowledge into action – we’re generalists. – Steve (1:44)

The familiar trope is that quantitative research tells you what is going on, and qualitative research tells you why. Qualitative research gets underneath the surface to answer why people feel the way they do. Why are they motivated? Why are they describing their needs in a certain way? – Steve (6:32)

The more we work with people and develop relationships – and build that deeper sense of trust as an organization with each other – the more openness there is to having a real conversation. – Steve (9:06)

I’ve been reading a book by Nancy Duarte called DataStory (see Episode 32 of this show), and in the book she talks about this model of the career growth […] that is really in sync with how I’ve been thinking about it. […] you begin as an explorer of data – you’re swimming in the data and finding insights from the data-first perspective. Over time in your career, you become an explainer. And an explainer is all about creating meaning: what is the context and interpretation that I can bring to this insight that makes it important, that answers the question, “So what?” And then the final step is to inspire, to actually inspire action and inspire new ways of looking at business problems or whatever you’re looking at. – Steve (25:50)

I think that carving things down to what’s the simplest is always a big challenge, just because those of us drowning in data are always tempted to expose more of it than we should. – Steve (29:30)

There’s a healthy skepticism in some parts of NPR around data and around the fact that ‘I don’t want data to limit what I do with my job. I don’t want it to tell me what to do.’ We spend a lot of time reassuring people that data is never going to make decisions for you – it’s just the foundation that you can stand on to better make your own decision. … We don’t use data-driven decisions. At NPR, we talk about data-informed decisions because that better reflects the fact that it is data and expertise together that make things magic. – Steve (34:34)


Transcript

Brian: Welcome back to Experiencing Data. This is Brian T. O’Neill. Today I have my friend Steve Mulder from, like, the naughts. We met in the naughts, didn’t we?

 

Steve: Long, long ago in a galaxy far away.

 

Brian: [laugh]. Called Lycos. Some of our listeners probably don’t know what Lycos is. So, Steve and I met at lycos.com, which was like… the number two Yahoo. I guess we were always number two, except for that one little time when we were ahead of Yahoo, and it was such a big deal-

 

Steve: That’s right.

 

Brian: -in the search engine race. [laugh].

 

Steve: [laugh]. Wow, that feels like a million years ago, doesn’t it?

 

Brian: Oh man, it does. And somehow, you found your way to NPR as a Senior Director of Audience Insights. I don’t even know what-were you an Information Architect at Lycos? I was a designer when I first came in there. What was your title and how the hell did you end up at NPR as the Senior Director of Audience Insights?

 

Steve: That’s an excellent question. I have yet to figure that out. Somehow it just happened. So yeah, you and I are both from the worlds of UX, obviously. And I’ve been thinking about this, it’s kind of amazing that people with backgrounds in user experience and design, we end up everywhere. 

 

And I think it’s because we have a couple of things going for us, collectively, in this space. We are user-centered in our hearts, so we under-our goal is to understand people and what they need, regardless of what space we’re talking about; we are grounded in research and getting to underlying motivators of people and what they need; we’re focused on good communication and interpretation and putting knowledge into action, and we’re generalists. I mean, I know you feel this way. We both acquired a ton of different kinds of skills over the years, and I have a feeling that because of that we have an easier time moving across disciplines. And in my case, moving from user experience information architecture, like you said, through user experience design, and usability testing, and user research, and strategy, into analytics, they’re all sort of small moves in their own right, and yet over the course of a career, it definitely adds up to a real shift. And I’m totally loving it; the idea of applying data and making it actionable at a place like NPR is so rewarding.

 

Brian: Yeah, yeah. Can you give the audience a little bit of background-and this is part of the reason I wanted Steve on here, besides the fact that he’s just a great guy, and super sharp, and I always looked up to you in the Lycos days, and all of that-but I wanted to know… really, what I wanted to understand today is how do you see that world differently than a lot of the traditional data science and analytics leaders that I have on the show. I have a feeling-and you kind of went over some of this, but what do you do at NPR with this data? How do you guys make it actionable? I know you have-there’s a newsroom project that just came out; I’d love for you to kind of unpack that. So I’ll let you start where you want, but I really want the Steve lens on this data world; that would be really fascinating to share, I think.

 

Steve: Yeah, of course. So, let me start with our team. So, we are the Audience Insights Team at NPR, and we work across the organization, and with NPR stations across the country, providing a holistic understanding of our audiences-plural-and applying that knowledge; bringing it to life and creating an audience-centered community that is using data to make better decisions. Data-informed decision-making, that’s what we are all about. So, we work across, obviously, content creation because NPR is all about creating content, whether that’s radio stories, or text stories, or podcasts, or what have you, but also working to influence product development within NPR, and technology choices and evolution, to influence what we do with marketing, and sponsorship, and revenue, and really, across the board, trying to influence how we use our knowledge of audience most effectively. That’s what it’s all about.

 

So we’re a team of researchers and analysts that come together and really dive deep into the great ocean of all this data we have at our disposal, and research that we’re conducting ourselves, of course, to get that full 360-degree view of our audiences. And whatever tools, whatever process that takes, we will go there. So, we are nimble, and we really think carefully about how can we use-because we’re a small team-how can we use our time most effectively to make a difference towards NPR’s overall goals?

 

Brian: And does that re-when you say research, are you talking about qual and quant, or what kind of-

 

Steve: Yeah. Yeah, absolutely. So, everything from qualitative research, testing new shows and learning directly through one-on-one interviews, going deep and understanding, for example, how do younger and diverse audiences respond to traditional NPR content that has historically been a little too white, shall we say, to doing quantitative validation and deep quantitative research directly with our audiences, surveys, and segmentation analyses, and so forth. And then, of course, looking at analytics data, looking at behavioral data in every way we can to really understand reach, engagement, and the trends that tell us where to invest our time. Because a company like NPR is a mission-based nonprofit, we don’t have the kinds of resources that a New York Times or CNN does, so we have to be really careful about how we invest and feel a level of confidence that the research and the data are pointing us to investing in these shows, or these platforms, or these marketing campaigns.

 

Brian: Right, right. Unpack a little bit the qualitative research part here. The fact that you even started with that, that is not a typical role that you hear. Analyst, yes. A qualitative researcher on a data team is not something-and for someone that has limited resources, the fact that that’s important says something. Why is that important? What do you get out of having that? Why should someone who doesn’t come out of our world think that that matters? You chose to spend your budget that way for a reason.

 

Steve: Yeah, yeah. The familiar trope is that quantitative tells you what is going on, and qualitative tells you why. You know, getting underneath the surface to why do people feel the way they do? Why are they motivated? Why are they describing their needs in a certain way? 

 

So we’ll unleash qualitative research with our audiences, of course, but also internally. As we talk about, for example, what kinds of analyses and dashboards we want to build with our analyst team, every single time we’ll do assessments and deep interviews to make sure that we are as audience-centered in our data tool creation as we are in our audience-facing work. If you’re going to be user-centered, you’ve got to be user-centered everywhere: with your end-users, with your prospects, and internally with the people that you work with day-to-day. That’s super important, so we have an assessment process that we use whenever we are going to go talk with a division or department about some kind of new tool like a dashboard: every time, we will absolutely spend a lot of time upfront deeply understanding not just verbatim what they’re asking for, but the underlying needs and motivations they’re trying to get to. Because sometimes people, if they’re asking for an analysis, or a quick answer, or a dashboard, don’t always know exactly what they need, the problem they’re trying to solve, the job they’re trying to get done.

 

They’ll come to you with a question, but it’s your job to play consultant and get to the underlying problem they’re trying to solve. So that’s where qualitative research really shines because you’re just not going to learn that in a survey of a couple of stakeholders. You need to spend time and dive deep.

 

Brian: How do they respond to that? The common objection that I’ll hear is, like, “They tell me they don’t have time for this, and they just want this thing.” And there’s a lot more of the, what I call the order fulfillment mentality of analytics. Which is, “This spreadsheet, I need these columns pulled from this data source. Do you have that over time? With fries?” 

 

How do you get to that point where the person that’s asking for it isn’t just, “Can I just have this dashboard, please?” Did they see it as like-it took a while, and now they’re like, I know why you’re interviewing. Let’s do it because I know I get something better, or is it still a tax for that? How do your internal stakeholders look at that?

 

Steve: Yeah. Yeah, yeah. Well, look, there are times when we all respond to data questions with data, and it’s over and done. And when the CEO calls and he needs a number, you give it. [laugh]. 

 

But what we find is that the more we work with people and develop relationships, and that deeper sense of trust as an organization with each other, and how we work, the more openness there is to having a real conversation. So, it might begin short and sweet, with just, “Hey, can we just get on a Slack call for five minutes and kind of talk about what you’re looking for? I want to make sure I’m tuned in to exactly what you need.” And just sort of progressively get in there and learn more and more about what’s underlying their request. Then over time, they’re more and more open to having deep hour-long conversations where you really can spend the time. But you have to earn it. You have to earn it through trust.

 

Brian: Do you involve them in the prototyping, or low-fidelity sketches, or whatever, like, some kind of design prototyping process before you get to final digital comp or a working data-driven prototype or something? How much do they get involved with that?

 

Steve: Yeah, absolutely. The earlier you learn-to make sure you’re on the same page and to uncover any sort of hidden obstacles-the better, right? So, absolutely. And this is where the old skills of information architecture and user experience come into play because we know how to prototype in a lean way and we know how to conceptualize ideas and start making them real for people without investing in queries, without investing in data infrastructure. So, good old-fashioned wireframes, whiteboards, they go a long way.

 

We’re doing that right now as we revisit some of the ways we display podcast metrics internally. And absolutely, we are in the mode of good old sketching and wireframing to bring that to life as early as possible.

 

Brian: That’s great. Tell me about this recent project. So, you guys just rolled something out to the newsroom. Can you help us visualize, literally, what that looks like, what kind of content on it? And then kind of tell us, who’s it for? And then how did you guys come up with that particular design in that whole process?

 

Steve: Yes, let me introduce you to SANDI. SANDI is the Story Analytics and Data Insights dashboard for content creators within NPR. And for a long time, in working with our newsroom and our programming department, they had access to a lot of data, but it was siloed. So, we had a dashboard for podcasts, we had a dashboard for npr.org, and for both of our apps.

 

And we had all this data, but it was so compartmentalized and separated out. And on top of that, we are an organization that is, well, heavily invested in audio, no surprise, but we have audio listening going on, and reading of stories, and video consumption as well. It was well past time that we needed a holistic view for content creators where we could bring all that together in one place. So we’ve worked for quite a while on this notion of where we wanted to go with this.

 

To get there was a journey, both in terms of the culture within the newsroom and making sure that we built something that would actually get used-because otherwise what’s the point?-but also building the backend, the infrastructure needed to support this, so that we could finally break down those silos and get all this data together-in BigQuery, in this case-all in one place, which gives us just tons of flexibility now about what to do with it. So, at the core of this dashboard is the concept of the story. A story at NPR can be something that you might hear on All Things Considered on the radio, or listen to on our website, or app, or on a smart speaker, or read on any NPR property, or on Apple News, or wherever else it might be distributed as a written story. So, being holistic and thinking about it that way is just incredibly important to us.

 

So, the dashboard itself is organized around that principle. So you’ve got, what are the stories? By story, I can now look at its reach in reading and listening, and audience engagement, both in reading and listening. And of course, I can filter by all sorts of aspects of the metadata that we have in there. So, I can look at certain topics, or bylines, or reporters, or keywords within the piece-any number of ways I can drill in as a reporter or as an editor. So that’s what it is in a nutshell.
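
[Editor's note: NPR hasn't published SANDI's internals, so the following is only a minimal sketch of the kind of story-level unification Steve describes. All BigQuery table and column names here (stories, listening_events, reading_events, and so on) are hypothetical assumptions for illustration, not NPR's actual schema.]

# Hypothetical sketch: unify siloed listening and reading metrics
# into one story-level view, keyed by a shared story ID.
from google.cloud import bigquery

UNIFY_STORIES = """
WITH listening AS (
  SELECT story_id,
         SUM(listeners) AS listening_reach,          -- podcasts, apps, smart speakers
         AVG(completion_rate) AS avg_completion
  FROM listening_events
  GROUP BY story_id
),
reading AS (
  SELECT story_id,
         SUM(pageviews) AS reading_reach,            -- npr.org, Apple News, etc.
         AVG(engaged_seconds) AS avg_engaged_seconds
  FROM reading_events
  GROUP BY story_id
)
SELECT s.story_id, s.title, s.byline, s.topic,
       l.listening_reach, l.avg_completion,
       r.reading_reach, r.avg_engaged_seconds
FROM stories AS s
LEFT JOIN listening AS l USING (story_id)
LEFT JOIN reading   AS r USING (story_id)
WHERE s.publish_date = @report_date
"""

def story_metrics(report_date: str):
    """Return one row per story with reading and listening rolled up."""
    client = bigquery.Client()  # uses default GCP credentials
    config = bigquery.QueryJobConfig(query_parameters=[
        bigquery.ScalarQueryParameter("report_date", "DATE", report_date)
    ])
    return client.query(UNIFY_STORIES, job_config=config).to_dataframe()

The design point Steve makes holds regardless of schema: once every silo is keyed by a common story ID in one warehouse, a single query can answer cross-platform questions that previously required visiting several tools.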

 

Brian: So, is the idea here, there’s a topic, which is, like, impeachment number two, and then I can see all the material we created on that, and it’s supposed to drive-do we need more on impeachment number two? And that’s like-and then you have all those stories related to that? Or it’s down at the specific one, which is, like, the Capitol riot video, like, down at that level, and then it’s-talk to me about, kind of, the different tiers because you have topics, right? Broad topic, narrow topic, story-level topic, how do you guys navigate that space? That would seem difficult.

 

Steve: Well, one thing we talk about a lot is where is a dashboard important-where daily or weekly data is really driving decision-making-and where do we just need to do ad hoc or occasional analyses that answer core questions? So, where we are beginning with this work is focusing the dashboard at the story level, but knowing that we’re going to carry forward with topic analyses on a regular basis, outside the dashboard at first, and then eventually that will be in the dashboard as well. So, right now you can see all the stories around something like inauguration and see how they did individually. Eventually, we’d like to roll that up so you can look at the entire topic as a whole, as a series of rolled-up numbers or trends, not just at the story level.
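
[Editor's note: the topic-level rollup Steve describes could be sketched roughly like this, aggregating hypothetical story-level rows (such as those returned by the story_metrics() sketch above) by topic, with a simple day-over-day trend. Column names are assumptions.]

import pandas as pd

def topic_rollup(stories: pd.DataFrame) -> pd.DataFrame:
    """Roll story-level reach up to one row per topic per day."""
    daily = (
        stories
        .groupby(["topic", "publish_date"], as_index=False)
        .agg(story_count=("story_id", "nunique"),
             listening_reach=("listening_reach", "sum"),
             reading_reach=("reading_reach", "sum"))
        .sort_values(["topic", "publish_date"])
    )
    # day-over-day change in combined reach, computed within each topic
    total = daily["listening_reach"] + daily["reading_reach"]
    daily["reach_trend"] = total.groupby(daily["topic"]).pct_change()
    return daily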

 

Brian: What are some of the other use cases? Like, is this a-9 a.m., the head of the editorial team comes in and checks some numbers? Or what are some of the scenarios that it was designed to solve for?

 

Steve: Yeah, we actually-we spent a lot of time upfront through research, as we talked about, diving into that. We talked with a lot of people over what turned out to be months, really, of getting down to the use cases-both current uses of data within a daily newsroom workflow, but also what we want to be true. So, leadership had a vision for how to better use data within the org, and we vetted some of those ideas and also just learned other new ideas along the way, as the generative process that it is. The way we ended up crystallizing all of this is we captured, for each role within the newsroom-like reporters, and editors, and managers, and so on-a list of the core questions that we would want any new dashboard to answer. In very simple ways, right?

 

So I’m a reporter coming in in the morning, and what do I want to know? I want to know, how did my story that was on the air or online yesterday perform, in terms of audience reach and engagement? How did that story compare with my other stories? How did it compare with other stories on the same topic-around inauguration or whatever? With all the other stories that ran yesterday? Was yesterday a good day, for context? How did people get to my story? How did they learn about it? Does that tell me anything about how I’m reaching or not reaching the audiences that I want to?

 

The overarching question for the reporter is, how is my work doing and how can I get even better at my job? I mean, this is-at the highest level, that’s what it comes down to. But coming up with that list, the list of questions that we wanted this tool to answer, was a great way of getting on the same page with our stakeholders, and within our team, and with management to make sure that this thing was going to do what we wanted it to. And I’m so glad we spent the time upfront in actually doing that. But it took lots of feet on the ground, and lots of conversations-over Zoom [laugh] because of the year we’ve had-to actually make that happen.

 

But it’s paid off in spades because if we hadn’t done that, I guarantee you, with the large and diverse kind of newsroom that NPR has, we would have failed to seize-we’d have had gaps. We would have had blind spots where we wouldn’t have seen certain needs, or at the very end, the newsroom leadership could have come in and said, “Oh, but you guys, the dashboard doesn’t do this, doesn’t do that.”

 

Brian: [crosstalk 00:17:16].

 

Steve: But because we spent the time upfront-yeah, exactly. But because we spent all that time upfront, we looped them in on: what questions do we want the dashboard to answer? What are the KPIs for the newsroom that are most critical to tie all this work to? Leadership, what kind of behavior do you want to encourage in the newsroom? Bringing those two things together, then we could get serious and talk about the user experience of it, and the data infrastructure, and get into the details of it. But I’m so glad we didn’t rush into queries [laugh] and rush into the data.

 

We spent the time upfront. And yeah, it took a long time, but for a project of this scope, we knew if we didn’t get it right, we would just miss the opportunity of having real impact. We had to take the time; it was totally worth it.

 

Brian: I know this is pretty new. Have you had any anecdotal stories about how this has changed anything, or just feedback yet, or is it still too early to know?

 

Steve: Yeah. Yeah, already we’ve been seeing it. So, we did a soft launch right around the election, and then had a much bigger launch just in the last couple of weeks here, and we’re already seeing it take hold, both in ways that are sort of explicit and implicit. And what I mean by that is we see people using the dashboard directly and finding their own insights, for example, after the election, seeing certain themes, and, “Hey, I’m looking at the written stories that have the most user engagement after the election, and I noticed a trend. Let’s talk about that. Let’s act on that. What can we learn going forward?”

 

So, seeing those sparks of insights happen organically, which is great. The other thing that’s happened is, like many newsrooms, there is a morning editorial planning meeting. It’s the huddle where all the folks get together and talk about what’s on tap for today, what are the key areas of coverage, who’s got what, how are we executing stories? That morning meeting was, I would say not data-free a year ago, but kind of close to data-free.

 

Brian: Data loose. [laugh].

 

Steve: Data loose. Data casual. So-but-with this new-

 

Brian: [laugh]. With jeans. Like more jeans than slacks. [laugh].

 

Steve: Yeah. That’s totally fair. Some loungewear. [laugh]. So we’ve really reinvented that meeting. Working with some just great allies in the newsroom, we took a fresh look: five minutes every morning, what can we share in terms of data, and insights on that data, that will really drive the conversation in an even more audience-centered way than ever before?

 

And without this dashboard, there was no way to get a holistic view; it would have taken you a long time to go to all these different tools with all these different questions, and the data is presented differently-there’s not a consistent way of looking at engagement, for example. Without this new environment, that prep for that important five minutes just couldn’t have happened. So we’re already seeing that make a big difference. That meeting has transformed in terms of being grounded in what just happened and what is performing well with audiences. And people talking about it, right?

 

The importance of data and insights percolating in just the daily language people use. That’s it. I mean, that’s the magic. And that’s what we’re focused on here. So that’s the more implicit way, right, where we’re seeing it become kind of an ongoing engine of insights that doesn’t even need us to be there to help operate that engine. We’re a small team, so building those kinds of engines of data can really pay off.

 

Brian: During your research phase, did you know that this was going to be part of the huddle? Or did that organically emerge, with the editorial team just deciding that we should be having SANDI at the meeting?

 

Steve: It was absolutely on our list, yes. That was in our sights, that particular meeting because we knew how much of that drove daily editorial decision-making. And we knew that to bring audience insights into the newsroom on a daily basis, we had to be present-well, our impact had to be present in that daily meeting. Yeah.

 

Brian: Got it. So, they had seen-these teams had seen what this was going to look like before it ever was real? They knew they were going to have SANDI there and it was going to look like this, and it was going to provide this kind of stuff? They had some idea what was coming?

 

Steve: Yeah, yeah. And we had allies in the newsroom for years who have been asking for that, and have been kind of operating in pockets of, like, “We need better use of data, and better insights, and better aggregations.” And so finding those allies and finding the sort of executive authority, too, to make this happen was a critical first step in all of this. It wouldn’t have happened if we didn’t have those allies already present, and we didn’t work hard to deepen those relationships and make that work. So yeah, we had some folks raring to go, once we had the right tools and processes in place. And then for a lot of folks in the newsroom, it’s going to be a much longer process to get them just used to living with this kind of data on a daily basis.

 

Brian: One of the things that I’ve noticed is that people coming, usually, from more technical backgrounds-from data science, or engineering, or analytics, or statistics-tend to look at the world from the data first, and it’s like, “Well, this is what’s been collected. Maybe it got collected before I was ever here. What can we do with it?” And then the human factors-oriented person is like, “We’re trying to find a need first, and then we’ll go back to what’s available.” Did you find, when you were doing the research-I do think there’s an argument, I’ve changed my opinion on this, that both of these matter-because what happens sometimes, especially with internal stakeholders, and this is happening a lot with machine learning and AI right now, is the business thinks they want it, but they don’t really understand it enough to even know what questions it can answer, and they’re relying on the experts, usually data scientists, to do a job that they didn’t think they were there to do, which is, “How could you help me with this new tool that you have?”

 

And they’re more like, “Hand me a problem that’s well-defined, and I will go solve it. I’ll build a model, I’ll build a forecast, I’ll do whatever.” But the infrastructure is not there to do it. And they don’t see their role as being to go and figure out what should we be doing? Did you have that challenge, too, which is people saying like, “Well, I don’t know. What podcast analytics can you tell me? What could I get out of that data?” 

 

Because they don’t even know what’s being collected, let alone what questions to ask, it’s just so abstract. Did you have to work from both ends? Or did you only work from the user experience lens, and just really, out of what they asked for and not presenting options? Like, “Well, we could tell you listenership over the week compared to other stories of the same topic,” or these data-generated use cases. Is this resonating at all? I hear this a lot.

 

Steve: Oh, yeah. Absolutely. The folks that are leading the business, or product lines, or creating podcasts-their job is not to necessarily know what kinds of data exist, and what is possible, and what’s the latest in the technical evolution of podcast measurement based on log file parsing. [laugh]. Like, they’re not going to know that stuff, right?

 

Our job is to be-I like to talk about it as, like, I’m a committed generalist. I work really hard to, one, speak business, so I understand the world they live in, what’s motivating them, and what’s the value to the business I’m trying to connect to; speaking users, obviously, that’s my gig; speaking data and technology to make it all function; speaking product, because if we can’t express what we are doing in a product that scales and endures, it’ll die; and we speak marketing. It’s really critical to be able to market the work to see it have real influence and life. And we are the interpreters in many ways. Analysts have to be interpreters to be able to have impact. We can’t bury ourselves in the data. I’ve been reading a book recently. I don’t know if Nancy Duarte-

 

Brian: [crosstalk 00:25:56]-

 

Steve: She wrote this book called DataStory-oh, excellent, yeah. So I’m reading through her book, DataStory, and it’s fantastic. And she really captures-I think, she has this model of the career growth of an analyst that is really in sync with how I’ve been thinking about it, too, which is, early in your career as an analyst, you begin as an explorer of data. You’re sort of swimming in the data, starting with the data-to your earlier point-and finding insights from the data first perspective. And then you become an explainer over time in your career. 

 

And an explainer is all about creating meaning: what is the context and interpretation that I can bring to this insight that makes it important, that answers the question, “So what?” And then the final step is to inspire, to actually inspire action and inspire new ways of looking at business problems or whatever you’re looking at. And that career progression is ringing so true. We have a mantra on our team that we follow that’s very similar, which is moving from “what?” to “so what?” and then “now what?”-moving from simply reporting on the past (what happened?) to talking about, so what? What does it matter? Why should I care? And then getting to, now what? What does the organization do next?

 

Getting comfortable with making real recommendations based on analysis we do, or dashboards, or whatever it is, but getting more comfortable with that approach. So, we can evolve our own skill sets, as analysts and consultants, and evolve what the organization needs to be to actually use that data. So, our job as translators going along this journey of ‘what’ to ‘so what’ to ‘now what’ is absolutely all about taking stakeholders along with us, who may not know how to speak the language of data like we do. How could we expect them to? We spend so much time immersed in it, it’s our job to be translators, and bring them along, and understand them so well that to them, it feels painless. 

 

Not always easy, but a critical piece of it is making them feel like we’re doing this for them, not for us, and that it’s going to have an impact-because we understand what matters to them underneath the surface so well that we can tailor what we’re doing for them. So, talking to a reporter: all this deep research that we did in our newsroom, we then knew how to translate into a dashboard that would just feel very intuitive to them, because it was answering questions the way they thought of the questions, not the way that we as analysts think about those questions.

 

Brian: Was there one main risk or hurdle that you’re-not technical. Unless the only ones were technical. Usually, it’s not. But was there one particular thing you’re like, “Team, we got to get this right. If we don’t get this right, we’re done.” Was there something like that, or what was just the hardest part of making-getting-some places, it’s simply getting people to use the dashboard, or whatever the-

 

Steve: Yeah. [laugh].

 

Brian: -tool is. It’s usually because it’s not helping anything or it was a one-hit wonder and answered one question one time, and then it’s done. Was your concern that it wouldn’t get used, or-

 

Steve: I would say for this year, now, that’s our main challenge is making it part of regular life in the newsroom. That is our big ongoing challenge. And it’s going to be-and it’s ongoing. This is a whole process. The dashboard launch is not an end of anything; it’s the beginning of the hard work. 

 

But looking back at what we’ve done so far, I mean, there’s a lot of things I could talk about, but I think that carving things down to what’s simplest is always a big challenge, just because those of us drowning in data are always tempted to expose more of it than we should. We’ve got a million ways of looking at the reach of a given story, and engagement-because engagement metrics vary so much, whether you’re talking about audio or text, or by platform, the data that’s available-and our instinct is to expose it all. You give them, like, this calculated engagement score to the 100th decimal point, and all of its components, and just giant tables and giant line graphs. And honestly, again, it was all about this process of divergence and convergence.

 

So, diverge and go big and think of all the things that you might provide to answer the questions that the dashboard needs to answer, and then converge on the things that matter most, and test, and iterate, and gradually take more and more away. Because you realize, you know what, most people are never going to care to know the number. They just want to know, was it above or below average in terms of engagement compared to other stories that are like it? That’s what they need to know, so why are we showing a number to the 100th decimal? Let’s be real: for the default view, and the daily email that they get that’s part of this ecosystem now, why don’t we just make that the simplest possible view, and for the people who want to go deeper, we will provide pathways to do so. But let’s not optimize it for the advanced person. Let’s optimize it for the most common usage, and for the people that aren’t using it at all yet, and we need to guide them along the way, step by step.
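
[Editor's note: a minimal sketch of the simplification Steve describes-show whether a story sits above or below the average of comparable stories, rather than a raw score to many decimal places-might look like this. The engagement_score column, the peer grouping by topic, and the bucket thresholds are all assumptions, not NPR's actual metric.]

import pandas as pd

def engagement_vs_peers(stories: pd.DataFrame) -> pd.DataFrame:
    """Label each story relative to the average of stories like it."""
    # average engagement of each story's peer group (same topic, here)
    peer_avg = stories.groupby("topic")["engagement_score"].transform("mean")
    ratio = stories["engagement_score"] / peer_avg
    out = stories[["story_id", "title"]].copy()
    # bucket into three coarse labels instead of exposing the raw score
    out["engagement"] = pd.cut(
        ratio,
        bins=[0.0, 0.9, 1.1, float("inf")],
        labels=["Below average", "About average", "Above average"],
    )
    return out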

 

Brian: Yeah, that’s excellent. This has been a great conversation. Thanks for sharing. It’s great to hear what’s going on there. And congrats on the project, too. I know it’s nice to [laugh] get one of those things out.

 

Steve: Yeah. Big milestone. 

 

Brian: Out the door. 

 

Steve: Yeah.

 

Brian: That’s where the learning really begins, right?

 

Steve: Yes.

 

Brian: It’s kind of like, really the beginning. It feels like the end, but it’s actually the beginning of the project. [laugh]. Do you see the natural evolution of what you’re doing as moving into forecasting or predictive analytics? My understanding is-I think I read something-that a lot of the content at even Netflix and Apple and some of these places is literally driven by what people watch.

 

Like, plotlines, genre, all this stuff is heavily guided by the data. They’re designing shows loosely based on, or somewhat-I don’t actually know if it’s loose or not, but it is data-informed how they come up with new shows. Do you see that happening the-I don’t know if the scale of the data is there, but do you see that as the future iteration will be going into more machine-based forecasting, or predicting what needs to be done based on audience consumption?

 

Steve: Yeah. Yeah, we’re already making forays into that, although it’s more done through ad hoc analysis, rather than deep machine learning and algorithms that are continuously running. So, analyses where we can go, for example, into podcast listening and understand deeply the audiences that are listening to podcasts: what do we know about them? And what kind of cross-listening behaviors do we know? And so applying that deep knowledge forward to future strategies for podcast development and new shows. So it’s forward-looking and it’s based in data, but it’s not the algorithmic approach that the Netflixes of the world are doing.

 

And this is, again, where the NPR nonprofit budget [laugh] and resources hold us back a little bit from going there. But we’re definitely exploring that. Some folks might know about the NPR One mobile app, which is a personalized mix of news, and information, and local and national stories, and podcasts-a custom flow that absolutely responds to what people listen to and what they need. And it’s early days even for that, for sure, in terms of what’s possible. I’m super excited about what that all means down the road, but at the same time, I feel like-no surprise coming from a guy who works for NPR, but I think there’s always going to be power in expert curation around something like the news, where a lot of our job is not just writing stories, but on the radio, or on smart speakers, or on our website, curating what are the stories that matter most right now?

 

And that active curation is super critical. So, how do we get the best of both worlds, as a universe? How do we get the best of both worlds of, kind of, responsiveness to what matters to me personally, and having that sense of communal and expert curation that gives meaning to the insane amount of news happening on any given day?
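
[Editor's note: the cross-listening analysis Steve mentions could be sketched roughly like this, starting from hypothetical (listener_id, show) listening records and computing how much of one show's audience also listens to another-the kind of input a team might use when planning new shows.]

import pandas as pd

def cross_listening(events: pd.DataFrame) -> pd.DataFrame:
    """For each pair of shows, the share of A's listeners who also listen to B."""
    # deduplicate to one row per listener/show pair
    pairs = events[["listener_id", "show"]].drop_duplicates()
    # 0/1 listener-by-show incidence matrix
    incidence = pd.crosstab(pairs["listener_id"], pairs["show"]).clip(upper=1)
    overlap = incidence.T @ incidence    # show-by-show co-listener counts
    audience = incidence.sum(axis=0)     # unique listeners per show
    # row A, column B: fraction of A's audience that also listens to B
    return overlap.div(audience, axis=0)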

 

Brian: Right. Yeah, I can imagine. This is great. Do you have any closing thoughts for our listeners? Any general takeaways, or a learning experience you’ve had at NPR about working with data where you’re, like, “Don’t do that again. I learned this.” Or something. Just any advice. It sounds like you’ve done some great work there.

 

Steve: As we were talking with the newsroom about this whole project and, kind of, ongoing, how do we use data-there’s a healthy skepticism in some parts of our organization-and probably yours, too-around data, and around the feeling of, “I don’t want data to limit what I do with my job. I don’t want it to tell me what to do.” And we spend a lot of time reassuring people that data is never going to make decisions for you. It’s just the foundation that you can stand on to better make your own decision. It’s like if you build a solid platform in a field, and that platform is based on data, and insights driven by that data, and you get up on that platform-it’s a solid platform from which you, as a reporter or a manager, can see the horizon better, you can see what’s happening better, and you can jump better because you’re jumping off a firm platform. But you’re still the one jumping, and deciding where to jump.

 

You’re still making the decisions. So, for example, we don’t use data-driven decisions. At NPR, we talk about data-informed decisions because that better reflects the fact that it is data and expertise together that make things magic. So that’s a theme that we talk a lot about.

 

Brian: Awesome. I love it. Steve, where can people find more? Where do you hang out? Is LinkedIn, or tell our listeners how they can follow your work?

 

Steve: Yeah, generally all those places, but mostly Twitter: @muldermedia. You can find me at-

 

Brian: @muldermedia? Okay, you’re still th-okay.

 

Steve: Still-

 

Brian: Yeah. I remember that.

 

Steve: -still, after all these years.

 

Brian: [laugh]. And now you’re in a media company. [laugh].

 

Steve: All of a sudden, my Twitter handle makes sense, like, you know, two decades later. Who knew? Who knew?

 

Brian: [laugh] predictive analytics. [laugh].

 

Steve: Just got lucky. And otherwise, while you’ll never hear me on NPR, please tune in to NPR and give to your local station.

 

Brian: Ah. That’s a good place to leave it right there. Well, thank you, Steve. This is so great. Thank you for coming on. It’s been great to catch up with you, and we’ll hopefully see each other soon. 

 

Steve: Great to join you.

 
