Experiencing Data with Brian T. O'Neill

August 13, 2019 | 00:38:23
Show Notes

Dr. Bob Hayes will be the first to tell you that he’s a dataphile. Ever since he took a stats course in college in the 80s, Bob has been hooked on data. Currently, Bob is the Research and Analytics Director at Indigo Slate. He’s also the president of Business Over Broadway, a consultancy he founded in 2007. In a past life, Bob served as Chief Research Officer at Appuri and AnalyticsWeek, Chief Customer Officer at TCELab, and a contributing analyst at Gleanster, among many other roles.

In today’s episode, Bob and I discuss a recent Kaggle survey that highlighted several key non-technical impediments to effective data science projects. We outline what those challenges are, explore potential solutions to them, and cover several related topics.

Resources and Links:

Dr. Bob Hayes on LinkedIn

Seeing Theory

Calling Bullshit

Dr. Bob Hayes on Twitter

Business Over Broadway

Indigo Slate

Quotes from Today’s Episode

“I’ve always loved data. I took my first stats course in college over 30 years ago and I was hooked immediately. I love data. Sometimes I introduce myself as a dataholic. I love it.” — Bob

“I’m a big fan of just kind of analyzing data, just getting my hands on data, just exploring it. But that can lead you down a path of no return where you’re just analyzing data just to analyze it. What I try to tell my clients is that when you approach a data set, have a problem that you’re trying to solve. The challenge there, I think, stems from the fact that a lot of data science teams don’t have a subject matter expert on the team to pose the right questions.” — Bob

“The three findings that I found pretty interesting were, number one, a lack of a clear question to be answering or a clear direction to go in with the available data. The second one was that data science results were not used by the business decision makers. And the third one was an inability to integrate findings into the organization’s decision making processes.” — Brian

“It makes you wonder, ‘If you didn’t have a good problem to solve, maybe that’s why [the findings] didn’t get used in the first place.’” — Brian

“That part isn’t so much the math and the science. That’s more the psychology and knowing how people react. Because you’re going to have certain business stakeholders that still want to kind of shoot from the hip based on their experience. Their gut tells them something. And sometimes that gut is really informed.” — Brian

“If executives are looking at data science and AI as a strategic initiative, it seems really funny to me that someone wouldn’t be saying, ‘What do we get out of this? What are the next steps?’ when the data team gets to the end of a project and just moves on to the next one.” — Brian

Transcript

Brian: Dr. Bob Hayes is a data scientist, blogger, and recently named a top-50 digital influencer for 2019, who I discovered while researching the types of nontechnical challenges that data scientists face.

Bob wrote a great blog post summarizing the findings of this survey by Kaggle, and I wanted him to come on the show to specifically discuss three of the nontechnical challenges that the survey uncovered. These three challenges are similar to the types of problems that the field of service design and product management often address. And on this episode we explore some of these parallels from Bob’s perspective. So if you work in data science or analytics or you’re a product manager working with data scientists, I think you’ll find today’s episode exploring these challenges interesting. And here we go with Bob Hayes.

All right. We’re back with Bob Hayes, Dr. Bob Hayes actually. You are a data scientist. Is that correct?

Bob: People have called me that, yes.

Brian: Not only that. You’re a top 50, was it an AI and analytics influencer? Is that what I recently saw come across my LinkedIn feed?

Bob: That sounds about right.

Brian: Yeah. I thought that was really cool to see some names I know and including yours.

Bob: Thank you.

Brian: I forget how we met exactly, but I’ve been enjoying your posts. You’re pretty active on LinkedIn, posting what’s going on in kind of the world of data and data science and analytics. So I thought you might be an interesting guest to come and talk specifically about a survey. You had written an article for your business. You used to be 100% consulting oriented, correct, with Business Over Broadway? Is that-

Bob: That’s correct. That’s correct. Now I work for a company called Indigo Slate. I’m the research director there. And we want to infuse research and analytics into everything that we do. I got hired on there probably a couple of months ago and I love it. I love it there.

Brian: Awesome. Awesome. Well, previously, I think about a year ago, you had written an article summarizing some of the findings that were in a Kaggle survey of challenges for data scientists. In particular, I wanted to talk to you a little bit about some of the nontechnical ones there, because a lot of these remind me of activities that product and service designers do. They’re similar challenges that usually result from, at least in my experience, a lack of cross-team collaboration, not getting the right stakeholders involved, or not understanding goals and metrics. So today that’s what we wanted to get your expertise and your thoughts on: some of those findings.

So can you talk? First, maybe give a little bit of background on just yourself, and then we can go into some of those findings. But what kinds of work are you doing today and what are your current … what’s your current opinion on what the nontechnical challenges are in doing effective data science and analytics today?

Bob: Okay, sure. My background, I have a PhD in industrial organizational psychology, and that’s simply psychology applied to the world of work. My consulting research has revolved around the areas of customer experience management, customer loyalty measurement, and just the practice of data science. I want to understand how teams can be more effective in applying the principles of data science in their jobs. So my research kind of revolves around those key areas.

I’ve always loved data. I took my first stats course in college over 30 years ago and I was hooked immediately. I love data. Sometimes I introduce myself as a dataholic. I love it.

Brian: That’s a new one.

Bob: Yeah. So Kaggle does an annual study. I think this is their third year coming up this year, and they survey a bunch of data pros from around the world on various topics around data science: the data they use, the challenges they face, the skills they have and so forth. And in the blog you were referring to, they had a list of what the top challenges of applying data science at work are. There were a lot of things in there that, I’m not surprised, you probably find in other kinds of work as well.

A lot of the challenges revolve around nontechnical things like just working together, having the right skill sets and so forth. I’ve found the same kind of, or similar, results in my research on best practices in customer experience management programs. You get pretty much the same roadblocks: the results aren’t applied to business decision making, teams don’t work together effectively. So you have these common themes that kind of cover the culture, the social aspect of the workplace. And those seem to have a big impact on whether or not a team can be successful. It’s not just about skills or the technology. It’s about how the people work together.

Brian: Yeah. Let me summarize what these kind of three topics were, and maybe we can pick them apart separately.

Bob: Sure.

Brian: The three findings that I found pretty interesting were, number one, a lack of a clear question to be answering or a clear direction to go in with the available data. The second one was that data science results were not used by the business decision makers. And the third one was an inability to integrate findings into the organization’s decision making processes.

Maybe if we go in order, can you speak a little bit to maybe your experiences and what you think might be going on with the first of these? So that was the lack of a clear question to be answering or a clear direction to go in.

Bob: Yeah. I’m a big fan of just kind of analyzing data, just getting my hands on data, just exploring it. But that can lead you down a path of no return where you’re just analyzing data just to analyze it. What I try to tell my clients is that when you approach a data set, have a problem that you’re trying to solve. And that roadblock or that challenge, I think, stems from the fact that a lot of data science teams don’t have a subject matter expert on the team to kind of pose the right questions. For example, if you’re trying to study cancer treatments, you should have an oncologist on your team, because that oncologist knows all the variables at play, knows about treatment options and so forth. And if you don’t, you’re going to miss out on opportunities to ask questions that you don’t even know to ask.

So first of all, have somebody on the team who has good content domain knowledge. If you want to talk about improving the marketing process, have a marketer on your team. If you’re talking about improving call support, then have a support person on your team. I think the key there is to have the right people on the team to ask the right questions. And not only that, it’s good to be precise with your language.

In my work, I try to improve customer loyalty for my clients. You can say that one of the problems can be like, “Oh, I want to increase loyalty because I want to have my business grow.” Now a lot of people equate customer loyalty with just recommendation behavior like how likely are you to recommend this company to your family and friends. But there are other kinds of loyalty metrics out there, and if you just stick with one, the recommend question or the recommend component of loyalty, then you’re missing out on trying to retain your customers or trying to upsell and cross-sell other products to them.

So to be precise, if you’re talking about customer loyalty, it’s not customer loyalty you’re trying to optimize. You see, you’re trying to optimize retention rates. You’re trying to optimize how many new customers you get. Or you’re trying to optimize how many products or services you can upsell and cross-sell to your existing customers.

So be precise with your measurement and your outcomes. I think that’ll go a long way in helping you ask the right questions to be precise about the things you’re trying to optimize in your modeling.
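Bob’s point about precision lends itself to a concrete illustration. Below is a minimal sketch in Python of how those three distinct loyalty outcomes might be computed from a customer table; the table and its column names are hypothetical assumptions made up purely for illustration.

```python
# A minimal, hypothetical sketch: three distinct loyalty outcomes instead of
# one vague "loyalty" score. The customer table and column names are made up
# for illustration only.
import pandas as pd

customers = pd.DataFrame({
    "customer_id":     [1, 2, 3, 4, 5],
    "active_last_q":   [True, True, False, True, False],
    "active_this_q":   [True, False, False, True, False],
    "n_products":      [1, 2, 0, 3, 0],
    "would_recommend": [True, True, False, True, False],
})

# Retention: of customers active last quarter, how many are still active?
was_active = customers[customers["active_last_q"]]
retention_rate = was_active["active_this_q"].mean()

# Expansion (upsell/cross-sell): products held per currently active customer.
active_now = customers[customers["active_this_q"]]
products_per_customer = active_now["n_products"].mean()

# Advocacy: the "would you recommend us?" component of loyalty.
recommend_rate = customers["would_recommend"].mean()

print(f"retention: {retention_rate:.0%}, "
      f"products per active customer: {products_per_customer:.1f}, "
      f"recommend rate: {recommend_rate:.0%}")
```

Each metric answers a different business question, which is exactly why optimizing “loyalty” as a single number can hide what you actually want to move.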

Brian: It was interesting to me that this was the fourth, I think, biggest challenge cited, with 22% of respondents saying this was an issue. So it suggested to me that respondents actually proceeded with these data science efforts despite not having a stated question or a problem to work on.

Do you think this is an equal problem between the data team (the data scientist at the top of that team, I’m not sure how they’re all structured) not having the soft skills to help the non-data stakeholders surface their true objectives? Or is it more that you feel like the business doesn’t know how to pose questions, and they’re asking for, like, AI and data science solutions without really having any type of problem? Where does that responsibility sit? Do you think it’s a training issue for nontechnical skills that the data team needs to have? Or is it more that the business needs to work on being able to ask better questions with some understanding of how they might tie back to actionable data? Like, where does that fall?

Bob: Right. I don’t want to blame any one particular job role or anything like that. I think it’s, again, about working together. I did a study a few years ago on the data science skills needed to be successful in your work, and I found that data science skills fall into three broad buckets that we’ve always talked about. You’ve got the subject matter expert skills, whether it be business or whatever you study. The second set of skills is around math and stats: how do you analyze data? And the third skill is technology and programming: how do you get access to the data you need? How do you program machine learning algorithms and so forth?

So if you understand that, there’s three skillsets, and typically people fall into one of those buckets. You’re either an expert in the content domain or stats math or technology programming. It’s hard to find somebody who’s skilled at everything.

What I try to do is show them the scientific method. It’s a pretty straightforward process from data to insights, and it’s been used by scientists for hundreds of years. The five steps are: first, you identify your problem statement. The second step is you state some hypotheses, or some hunches about what you might find. The third step is you gather or generate the data. The fourth step is you analyze that data. And the fifth step is you take action or communicate the results to the powers that be.

So if you lay out this five-step process, you can see where those three job roles, the SME, the stats person, and the tech person, might fit into this whole scientific approach. For example, to pose the problem, the problem statement, you need somebody who knows what they’re talking about. So you need the subject matter expert. That’s where that person comes in: here’s my problem, here’s what I want to do.

The next step is you state some hypotheses, and that could involve the stats and math person, to make sure you’re pretty precise in your language about what you’re trying to study and what outcomes you’re trying to optimize or maximize. And then you gather the data, and that might require the technologist or programmer to get access to that data and hand it to the stats person, who analyzes it for the business person.

So if you kind of look at the whole lifecycle from data to insights and you show them these are the kinds of steps you go through, then I think it’ll help them see where they can contribute the most with the skills that they have. I think that’s a good first step: knowing where your limits are and what other kinds of team members you’re going to have to have on your team in order to be successful in applying analytics to solve a particular problem.
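As a rough aside, Bob’s five steps and the role he pairs with each can be restated in a few lines of Python. The mapping below simply summarizes the conversation above; the structure itself is an illustrative assumption, not anything from the Kaggle survey.

```python
# A sketch restating Bob's five-step "data to insights" process and the role
# best placed to lead each step, per the conversation above.
steps_to_roles = [
    ("1. Identify the problem statement", "subject matter expert"),
    ("2. State hypotheses / hunches",     "stats and math person"),
    ("3. Gather or generate the data",    "technologist / programmer"),
    ("4. Analyze the data",               "stats and math person"),
    ("5. Act on / communicate results",   "business decision makers"),
]

for step, role in steps_to_roles:
    print(f"{step:<38} led by: {role}")
```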

Brian: Are there particular activities or tactics or things that these data teams should be doing to get better involvement from their subject matter expert colleagues for example, or the other departments? Do you think there’s a gap there or something in terms of how they’re supposed to integrate and work with those people or it’s …

It seems interesting to me that the survey stated this as a challenge from the perspective of the data scientists, and again, this isn’t about blaming anybody, right? But it’s kind of like, are they stating this as a challenge because they think it’s someone else’s responsibility, or because they’re having trouble getting those team members to participate? Or those team members don’t know how to participate? Like, they don’t know how to be useful to the data team? Do you have any feelings on that? I know there are several questions there.

Bob: Yeah, that’s a loaded question. Well, first of all, I think, in the study I looked at, I found that business people who have really good quant skills are happier with the outcomes of their data science projects compared to business people who lack those quant skills. So I think a good first step would be to get a good baseline of who knows what about data and analytics.

I tell this to everybody I meet: take an intro class to statistics, or just a basic stats course, just to give you an idea about how data can be used, the kinds of things you can do with it. You can predict stuff. You can look at historical data, things like that. And if you give people that kind of base knowledge, I think it’ll help them understand the value of data. And they may want to be more involved with these kinds of data projects; at least they’ll have some sort of common language with the stats person who wants to convey a maybe complex idea to them. So I recommend everybody take just a basic stats course, just so you know kind of the language of data and analytics.

Brian: So moving on to the second one, and it’s kind of interesting to think about whether or not the first challenge fed into the second one: data science results were not used by the decision makers. And it makes you wonder, “Well, if you didn’t have a good problem to solve, maybe that’s why they didn’t get used in the first place.”

Bob: Yes, exactly.

Brian: Have you experienced or kind of overheard examples of this happening, and do you know why they happened?

Bob: Yeah. Yeah, I’ve run into this in my own consulting career. I’ve had one example, or maybe a handful of examples, where they have a question that they pose. It’s pretty clear. You analyze the data and you show them the results, and they don’t follow your recommendations because they don’t believe what you’re showing them. And I think that’s due to confirmation bias, which is just the tendency to seek out information that supports your prior beliefs. So if they already go into a study with a belief system and you show them something that counteracts their beliefs, they’re less apt to follow that advice than if it supports their current beliefs.

And I’ve seen that a few times, and, I mean, that’s human nature, right? You just have to tell them why the study is valid, and hopefully they’ll have people in the room who can help argue your point: listen to this study because it’s valid, it’s reliable, and it makes sense.

Again, another reason why I think business decision makers don’t take heed is that they lack basic knowledge of stats, like I said earlier. If a business person has a good understanding of quantitative methods, they are more successful on their data science projects than business leaders who don’t have that quant skill. And that makes sense, because looking at data is a mathematical endeavor. If you understand the fundamentals of how to analyze data, you’re more apt to believe the studies that you’re doing. Are you still there?

Brian: That’s good feedback. I appreciate that. The not-using-results part. So do you ever hear about, or maybe in your own practice, prototyping out … As a matter of course, a lot of times when we’re designing products and services, we are prototyping and we’re creating mockups of stuff before we code it. And the reason why is to try to understand how it would be used in the real world before we invest all this technology in it.

For example, if you’re building a predictive model for something, is there a common practice of prototyping the findings, essentially making up some findings and saying, “Here are some ideas of what might happen. How would you, Mr. and Mrs. Business Stakeholder, react to this?” “What we found is that actually Florida is the number one state for,” whatever, “loan requests that are not approved,” whatever. I can’t even think of a scenario, but the point being some kind of unexpected finding, and understanding how they would react to that prior to the data actually supporting it.

Do you ever do any type of prototyping like that to actually investigate how it will be adopted before the experiment happens? Or is it too hard to know what the possible outcomes might be?

Bob: Well, I kind of always think ahead about how I want to present the data. I do surveys for a living, so I’m always kind of thinking ahead about how I’m going to report these results: a spider chart, a bar graph, like, 3D, how am I going to do this? I’m always kind of thinking about those things.

But I haven’t done what you recommend, and I think that’s a really good idea: showing, or just telling, the recipient of this information some hypothetical and seeing how they react. I’ve never done that. And that sounds kind of useful actually, because you’ll get a sense of maybe their mental blocks about what you’re trying to do and what you’re trying to show them. And by talking to them and giving them hypothetical situations, if the results were like this, what would you say, I think that can go a long way in helping you design a better study, or at least present the results in a way that’s more effective, that kind of resonates with them and their mindset. I want to try that next time.

Brian: Well, cool. Let me know how it goes. Yeah, it seemed interesting to me that I hadn’t heard about this before, because it’s kind of like the world of usability testing, which is stuff you do on the back end of a design: the actual process of testing the design to see if it was usable and accomplished the goals that you set out for.

As you were talking about the believability part, one of the common questions you get is, “Well, how many people did you ask?” “Well, actually four out of five people couldn’t actually finish the checkout process. That’s how bad it was.” “Well, you only talked to five people.” “So how many do we need to talk to?” And the answer that you sometimes hear recommended for designers is, “How many would it take for you to believe the data?” And you forget about p values and statistical significance and all of this and you focus on what that stakeholder actually needs to believe. And you can tell them like, “Statistically or from our practice, we don’t need to keep hearing the same thing over and over. Four out of five is enough. And plus, our experience says we have a problem with this, and we can make a change now. We can keep testing it. So you tell us how much you need to believe it.”
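For what it’s worth, Brian’s four-out-of-five example can be put on a quantitative footing without much work. Here is a minimal sketch, assuming SciPy is available: an exact binomial confidence interval on the true failure rate shows why even five users can be persuasive evidence. The numbers are the hypothetical ones from the conversation.

```python
# A minimal sketch of the "4 out of 5 users failed checkout" example: an
# exact (Clopper-Pearson) confidence interval on the true failure rate.
from scipy.stats import binomtest

failures, n = 4, 5  # hypothetical usability-test numbers from the conversation

result = binomtest(failures, n)
ci = result.proportion_ci(confidence_level=0.95)  # roughly 28% to 99%

print(f"observed failure rate: {failures / n:.0%}")
print(f"95% CI on the true failure rate: {ci.low:.0%} to {ci.high:.0%}")
```

Even with only five users, a failure rate much below roughly a quarter is implausible, which supports the “how many would it take for you to believe it?” framing.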

I wonder if the same thing could happen, if that practice could happen too with these data science projects, where if you find some wonky result like that, that you’re just not expecting, what do we need to do to back it up so you believe it’s not correlation, for example, it’s actually causal, or whatever it may be? It kind of gets to that, the non … That part isn’t so much the math and the science. That’s more the psychology and knowing how people react, because you’re going to have certain business stakeholders that still want to kind of shoot from the hip based on experience. Their gut tells them something. And sometimes that gut is really informed.

Bob: Totally.

Brian: They may not be able to explain how it works, but we can’t always explain our models either, right? So.

Bob: Exactly. Exactly. Exactly. Yeah. So when I try to convince somebody of a certain point of view, I try to use multiple lines of evidence. I don’t just rely on a single study if I can. I mean, look at any journal. There’s tons of studies that have been done over and over and over again, and if they find the same results, then that’s pretty good evidence to me that, okay, this is a reliable result. I kind of take that approach. It’s not just showing them a single study, but here’s a handful of studies that show the same thing.

Brian: Yep. Yep. Yep. The third topic, to restate, from this Kaggle survey was the inability to integrate findings into the organization’s decision making process. I didn’t see the format of the question on the survey, but I’m assuming this kind of suggests that there was some type of successful project or product or something that occurred, but then it failed to get into the organization’s decision making process.

This makes me wonder if, again, this comes from what I perceive to be a lack of what I would call a product management skill set, internally at some non-tech companies for example: not having that person that looks beyond the findings. It’s like, “Yep, we’re able to predict x,” and then it’s off to the next project, and it’s like, “Well, wait a second. There’s a whole bunch of stuff we need to do to take advantage of this, making the sales campaign target this region,” or whatever it is. Can you speak to this third one?

Bob: Yeah. I think it still revolves around the fact that maybe they don’t have the right team member as part of the data science team. If you’re doing a study trying to understand call center support and you don’t have a call center support person on your team, then who are you going to give the results to, to make changes happen? Or if you’re trying to develop a product based on some data insights and you don’t have a product person to think about it with that mindset, then it’s just going to die right there. Again, you have to have the right team members with you in order to move those results to the next level.

In my line of work, I consult. I don’t have the authority to dictate to my clients that you have to do this. I just say, “Here are the results,” and I’m hoping I’m talking to the right person who can actually implement those recommendations that I’m suggesting. And if that person is not in the room, then those words may fall on deaf ears and nothing may ever get done, which is what I suspect happens a lot, based on the results that you see here.

Brian: Why do you think that happens so much, though? It seems like, especially if executives are looking at data science and AI as a strategic initiative that we need to invest in, and you get to the end of this project and then it’s just like, “All right, that was the deck, and on to the next thing.” It seems really funny to me that someone wouldn’t be saying, “What do we get out of this? What are the next steps?” Like, now that we have this model, or now that we have this feature, or whatever the heck it is, it seems really weird to me that this just gets dropped. Like it doesn’t get integrated properly. I don’t know.

Bob: It could be a lot of things. I mean, you think about the timeframe with which things get done and things can change in a company. Maybe you say do this, or they do that, and you don’t see changes for like six months, but you’re onto the next thing within two months. So how do you keep a team focused on the metrics and not be distracted by the next shiny thing that’s coming down the pipe? That’s tough. That’s a tough one. If you have an answer, let me know.

Brian: Well, again, I think this gets back to the beginning of the project. If we don’t understand how our work fits into the bigger picture and how it’s going to be integrated, like, that has to be part of the project, and a lot of this is almost consultative work. It’s, okay, let’s say that we’re able to come up with a predictive model for x, like where the next sales increase is going to be. What would we need to do to make sure that the sales organization is able to leverage this data?

And this gets, again, into service design and thinking about the end customer experience, the employee experience, the business processes behind the scenes, and not just looking at it like, at the time that we put together the PowerPoint deck to communicate the data science findings, we’re done. There is some follow-on work. That’s never the actual … That’s not the end. In a lot of ways that’s the beginning of something. Just like design is the beginning of the journey. When you finish the design, you’ve just put a concrete hypothesis into place. You haven’t actually built it and put it into practice with customers. It’s a design. It’s a hypothesis.

But knowing what those follow-on steps are, to me, is really important in figuring out who the team members are that are required, and what departments need to be involved to make sure this happens. And are they aware of the work we’re doing? Do they know that they’re going to have some responsibility to change some business processes, or whatever it may be, to adopt this work?

This actually sort of leads into my next topic, which is, have you been … I’ve been reading a fair amount about this growing role of what’s called an analytics translator. I don’t like the title. It sounds very low level, and it sounds like something you do after something else has been done, like we take really bad and hard-to-understand stuff and try to make it useful to some other people. And that’s not what I understand the role is really about. But are you familiar with this growing role? It’s kind of …

Bob: No, I haven’t, but go on. I’m sure…

Brian: Well, effectively my understanding is it’s someone that’s actually upstream in the process, and they’re interfacing between the business and the analytics team. So they’re effectively playing, again, what I would think of as what to me should just be called a product manager, even if they’re not actually building a commercial product, like if you’re working at a farming company, John Deere or something like that. You may not be creating any commercial software products, but the role and the skill set of the product management person is effectively what I see as very strongly tied to what is called an analytics translator.

This gets into whether or not there’s a distinct role. Do you need a distinct role and person for this, or do you need to get the data teams and the analytics leaders and the data science leaders more skilled in some of these kinds of soft skills, the psychology pieces, working with teams, and looking at services holistically, not just the data science part, but understanding how our work fits into the overall business so that it actually delivers some type of value? I’m sure in some places you do need a dedicated body for it, and in other places you can stay lean and just improve the quality of the work that the existing team is doing.

Bob: Right. The whole notion of an analytics translator to me sounds like it’s somebody, and it could be a product manager, who’s trying to translate these quantitative results into some easy-to-understand words for the rest of the team, which is awesome. I think that’s a good thing to do. But instead of focusing on trying to get the best translator, I would try to raise the level of the team’s knowledge of math and statistics, because it would be easier to communicate results to somebody who has a basic level of understanding of data and analytics compared to somebody who has no idea what a mean is or what a variance is.

And I agree with you that you need to increase maybe the soft skills, the communication skills, of the data experts, but also, by the same token, increase the tech skills or the math skills of business people, so we’re all playing on a level playing field where we can talk and we don’t need an analytics translator, because when I talk to you, you understand me and vice-versa. I don’t want to have to go to some intermediary to translate what I’m saying into something that’s meaningful to you.

Brian: Yeah, I hear what you’re saying. I wonder about the practicality of that, though, especially as you go up the business responsibility chain and you get into more senior level management, people whose time is pretty bogged down to begin with: how much of it is the skill of the translation … If you want to call it translation, fine. To me it feels like, can we also help the data and analytics people to provide better visualization, better storytelling, better communication of the information in a way that their stakeholders can understand it, whatever their particular …

And you may have different audiences. One audience may want to see, “Let me see your data. Let me see your findings.” Another person really wants, “Give me the conclusion.” You can pepper in the supporting evidence, but I trust you. This is why we hired you. I don’t need to know what the p value was. I just want to know, can we sell more here or not, and what do you guys think we can do? Like, how many more sales can we do, and what would it take to make this happen in the business? They’re looking at it from that perspective. And to me, I think there’s some shared responsibility there. You have to know something about technology if you’re going to work on software. And if you want to work on data, you probably do need to have some understanding. But realistically, I would find it hard to believe that most senior level decision makers are going to take a stats course and all that to understand it. Could be wrong. Could be … I don’t know.

Bob: Well, in the new company I’m working with now, my goal is to educate everybody on just basic stats. We have these things called lunch and learns where you can present a topic, so my goal this year is to do like two classes per quarter on just basic stats. Nothing sophisticated. Throw up some charts of frequencies, correlations, t-tests, just basic stuff, so they can start seeing the value of data and what it is.
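A lunch and learn like the one Bob describes could be driven by just a few lines of code. Here is a minimal sketch, with simulated data invented purely for illustration, of the three basics he names: frequencies, a correlation, and a t-test.

```python
# A minimal sketch of the basics Bob lists for a lunch and learn: a frequency
# table, a correlation, and a t-test. All data here is simulated purely for
# illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Frequencies: how often does each response category occur?
responses = rng.choice(["detractor", "passive", "promoter"], size=100)
categories, counts = np.unique(responses, return_counts=True)
print(dict(zip(categories, counts)))

# Correlation: do two numeric measures move together?
tenure = rng.normal(5, 2, size=100)
spend = 3 * tenure + rng.normal(0, 2, size=100)
r, p = stats.pearsonr(tenure, spend)
print(f"correlation r={r:.2f} (p={p:.3g})")

# t-test: do two groups differ on average?
group_a = rng.normal(7.0, 1.5, size=50)
group_b = rng.normal(7.5, 1.5, size=50)
t, p = stats.ttest_ind(group_a, group_b)
print(f"t-test t={t:.2f} (p={p:.3g})")
```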

In fact, going forward in our company, we have to include key performance indicators in all of our projects, and if we don’t have any, the project just stops until we get the KPIs in place. We can get them from the client, or we can make some ourselves, to make sure we’re on task and on track to being successful for our client.

Maybe that’s just a matter of being in the field for many, many years. You just know more stuff. And maybe, if you’re a senior level executive who’s been around the block 30 years, you just know more and you’re easier to talk to, because maybe you get numbers, because you deal with numbers on a daily basis with finances and so forth.

Brian: Got it. By the way, you reminded me when you were talking about the training work that you’re doing. I don’t know if you’re familiar with this website called Seeing Theory. It’s hosted at Brown University, at seeing-theory.brown.edu, and it’s a visual introduction to probability and statistics. I visit it every now and then, because my background is not in math or data science or statistics. But it’s a really fun, well-designed website that has lots of visuals to go along with different statistics terms and how they work. I’ll put a link to it in the show notes, but if you’re doing that kind of work, you might check it out. It’s really well-designed.

Bob: That’s cool. I will.

Brian: At least the last time I looked at it was.

Bob: There’s also a course taught at the University of Washington by two professors called Calling Bullshit. It’s great. All the classes are online. You can watch them on YouTube. And it’s excellent. They teach kids, or students, how to think critically about the information they’re given. It’s wonderful.

Brian: Got it. Bob, do you have any other closing advice for data scientists, or for people working with them who are trying to create data products and better analytics tools? We’ve talked a lot about maybe getting some training on statistics in order to better understand kind of the world of data scientists, but any other general things that can help?

Bob: Yeah. Yeah, I’ve got some advice for either your aspiring data scientists or just data scientists in general: you can’t know everything. So focus on improving skills that are part of your core strengths. And I found [inaudible 00:35:27] data science skills kind of tend to occur together. If you’re a quant person, you know a lot about quantitative methods, so maybe you should dive deep into that and become an expert in that field. If you’re into programming and technology, focus on that. And if you’re a subject matter expert, dive deep into learning about that field.

So just know your lane and become an expert at that. The more you know, the more successful you’ll be at your projects. And also, make sure you work with other people who complement your skills. Again, you don’t know everything and you can’t know everything, so bring other people on board. I know, for me, I’m much more successful when I have other people working with me who have skills that I don’t have. It just makes life easier.

Now, if you’re an executive, you need to understand that data science is a team sport. So you can’t just hire a single data scientist and say, “Oh, I got my data scientist. We’re all set to go.” You need a team of people with a diverse skill set who work together effectively. So just keep that in mind if you’re an executive trying to instill data science in your company. You need to hire a team of people instead of just one.

Brian: Yeah, I think that’s really great advice. It’s so similar to doing effective product design work as well. You can’t really throw something over the wall to a designer and expect them to understand what the problems are, how we’re going to measure the stuff, what a successful task completion looks like in an application, why people come to the site, the software, all that stuff. It requires cross-cultural or cross-departmental skill sets, and having the right stakeholders in the room, and all that kind of stuff. So I find a lot of parallels with you on that.

Cool, man. Well, this has been a great conversation. Where can people find you? What’s the best or is LinkedIn the best? I’m going to put your LinkedIn in the show links.

Bob: LinkedIn, Twitter. My Twitter account is Bob E. Hayes.

Brian: Bob E. Hayes.

Bob: And my personal site, Business Over Broadway. Now I’m a senior research director at indigoslate.com. So you can find me in any of those places.

Brian: Got it. Awesome. Well I will definitely put those links in the show notes, and man, this has been a really fun conversation to have with you about this today.

Bob: Great. I appreciate it.

Brian: Yeah. Cool.

Bob: Yes. Awesome.

Brian: Well, keep me posted on things, especially if you do some prototyping. I’d love to hear how that goes, and we’ll stay in touch.

Bob: All right.
