All About Data: Interview with Data Expert Alice-Anne Harwood Sherrill

If you prefer to read the interview, here it is:

Melissa: Hi, Alice! We are recording now, and I'm excited to get to know you a little bit better, so tell everybody about yourself and your organization.

Alice-Anne: Sure, my name is Alice-Anne Harwood. I am the Co-founder of Carnelian. We are a newly launched company. We're actually right in the midst of redoing our website as we speak.

I’ve spent about 20 years as a practitioner in the nonprofit sector. I have a specialization in the arts, but my expertise is program design and evaluation, fund development, and organizational design and development as well. My main focus now is really centered around data and all things data, so Carnelian is our new company and we're moving forward.

What we do is build tools and resources that help individuals and organizations operationalize change. There are a lot of folks talking about the theory behind where we need to go. What we do is figure out the tools you need to get there.

Her Background

Melissa: Sounds great. Tell me a little bit more about how you got into data, because you are really into data, from what you told me.

Alice-Anne: Yes, that is a thing. I've always been interested. I grew up in an environment that was very focused on exploring, focused on discovery, and so it was sort of built into me. But I didn't fully understand that until I worked with a team of researchers at Yale for a little while, and I had never taken a stats class. I had zero formal training.

And here I am, thrown into the midst of all these postdocs, asked to support them in both creative work and in development work.

And I had no idea what I was doing, and I had to learn on the fly. I had to suddenly be able to analyze data in a heartbeat in order to do my job, to serve them in the way that I wanted to serve them. So it was this trial by fire.

It is actually hysterical. There were some moments when I didn't fully understand who I was working with at the time, some very, very important people, but I was so young and so green, I had no clue. No clue. The president of the APA was my boss, and I didn't get that. It didn't click with me. I was so green, right? You know, he came to me one day: “So everybody's busy. I need some slides. Can you do some data visualizations for me for my presentation tomorrow morning at nine?” Okay, yeah, sure.

I had no idea. I went home and spent the whole night, up all night, trying to figure it out.

The next morning I brought in the slides, and he did his presentation. I got called into his office later that afternoon and thought, I am fired. I'm dying, right? I'm done. And instead, it was, “Thank you. You clearly stayed up all night to do this, and I want to ask your permission to use these in my book.”

Melissa: Wow!

Alice-Anne: And it wasn't because they were these mind-blowing interpretations. It was because his audience was me. His audience was the everyday person who might not be completely immersed in data.

And my interpretation was very straightforward, very streamlined and clear. So it was the start of the realization that data does not have to be this scary, complicated thing. It can be something that becomes part of your process and part of your everyday.

So that was sort of the start of it, and then I kept using it, especially with an interest in program evaluation and program theory, along with fund development. I think fund development folks were the original UX designers, because when you start looking at moves management programs and how you work with a donor to engage them, that is the foundation of UX as well. The data feeds that moves management process. It also feeds the quality and impact of program outcomes.

So I just kept layering it into my work more and more and more, until it actually took over and became my life's work. So that's how I came to be where I am.

Melissa: That is a great story, thinking you're going to be fired and then actually...

Alice-Anne: “Can I put you in my book and I’ll cite you?”  I'm like, “What?! I'm going to be in your book?”

Data Is Qualitative and Quantitative

Melissa: When you use the word data, I start thinking quantitative. Is that kind of the world you run in? Like, are we saying the same thing?

Alice-Anne: I actually work in both worlds.

I love that my work is actually about combining your quantitative data and your qualitative data. My work is about layering your data so that it's a learning tool.

So when I think “data,” it's not just about counting. My research centered around intrinsic and contextual data. Intrinsic is counting beans, right? It's the basics; it's counting heads.

Whereas contextual data is experiences, behaviors, environmental factors. What are those intersections that also inform the bean counting? That allows you to look at it and find anomalies, find discoveries, find gaps: that moment of discovery that can happen when you overlay information.
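
To make the layering idea concrete, here is a minimal sketch, not from the interview, of overlaying intrinsic counts with contextual survey scores in Python with pandas; the programs, column names, and numbers are all hypothetical:

```python
import pandas as pd

# Intrinsic data: the "counting beans" numbers (hypothetical attendance counts).
intrinsic = pd.DataFrame({
    "program": ["A", "A", "B", "B"],
    "year": [2022, 2023, 2022, 2023],
    "attendees": [120, 95, 80, 140],
})

# Contextual data: experiences and environmental factors (hypothetical survey scores).
contextual = pd.DataFrame({
    "program": ["A", "A", "B", "B"],
    "year": [2022, 2023, 2022, 2023],
    "avg_satisfaction": [4.6, 4.5, 3.1, 4.4],
})

# Overlay the two sources so anomalies surface: here, program A's attendance
# dropped even though satisfaction held steady, which is a gap worth investigating.
layered = intrinsic.merge(contextual, on=["program", "year"])
layered["attendee_change"] = layered.groupby("program")["attendees"].diff()
print(layered)
```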

Melissa: That sounds great. Yeah, that sounds a lot like mixed-methods UX, and when you were talking about the donors, I was thinking it's like the user journey of the donors and that sort of thing, and how it's qualitative and quantitative combined. In my mind, I do go to quantitative when I hear the word “data.”

Alice-Anne: Yeah, I think most people do, and I think that is probably one of the traps, right? That's one of the little traps; there are land mines all over this world.

Especially in the sector we work in, that's one of the big land mines to step right on: data is just counting, data is quantitative.

Melissa: Great, well, I'm glad we got that cleared up right away: even when we're using our data, there are lots of different kinds.


But Where to Start? 

Melissa: If I am part of an organization, or I'm an individual, and I know I need to start paying attention to the data that's coming in, the data we're all surrounded by, where would somebody start?

Alice-Anne: I think if I were to say where to begin, it's three steps.

First, understanding what data you actually collect and from where. Then looking at whether or not you use it. And do you have a process for maintaining its integrity? What is your process for making sure that data is not flawed, that that data is accurate, that that data is consistent and has a consistent way of being looked at, apples to apples, year over year?
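
As one concrete illustration of that integrity step, here is a minimal sketch of automated checks in Python with pandas; the file name, column names, and rules are assumptions for illustration, not something Alice-Anne prescribes:

```python
import pandas as pd

def check_integrity(df: pd.DataFrame) -> list[str]:
    """Run a few basic checks on a (hypothetical) donations export."""
    problems = []
    # Accuracy: no missing or negative gift amounts.
    if df["gift_amount"].isna().any():
        problems.append("missing gift amounts")
    if (df["gift_amount"] < 0).any():
        problems.append("negative gift amounts")
    # Consistency: dates must parse, and every record needs a campaign code,
    # so year-over-year comparisons stay apples to apples.
    if pd.to_datetime(df["gift_date"], errors="coerce").isna().any():
        problems.append("unparseable gift dates")
    if df["campaign_code"].isna().any():
        problems.append("records with no campaign code")
    return problems

donations = pd.read_csv("donations.csv")  # hypothetical CRM export
for issue in check_integrity(donations):
    print("Integrity issue:", issue)
```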

I would say the second step comes out of something in one of my research studies that blew my mind, actually: this huge percentage, 70-plus percent of participants in the study, had a very high self-perception of their ability to analyze data.

Yet when those same participants were asked what tools they use to actually analyze their data, less than 10% of the available tools were put into use. So the perceived ability to interpret data and the actual know-how to use the tools to interpret it for learning are completely out of alignment; there's a total gap between the two. So I'd say the second step is looking at skills, developing data analysis skills, and simply learning how to use some of the tools that exist.

I am by no means a data scientist. I know what I don't know how to do, but I know what's possible and I know what tools can make it possible, and if somebody who is a data scientist is working with me, I can talk them through it, so I don't need to know how to do it myself.

But as the LEADER, I need to hire someone smarter than me who does know how to do it, and I need to at least understand:

  • what they need

  • what professional development they need to continue learning

  • how to use those tools

  • what investment I need to make in my infrastructure, and what tools are necessary, to actually utilize the data we have

I’d say that's where to start.

No More Excuses 

Melissa: Okay, that sounds pretty good. Now I'm going to bring up another objection that people might have in general, which would be, "That sounds really expensive and like a ton of work." Do you bump into that?

Alice-Anne: All the time! I think that's the belief: data is hard, data is expensive, data is beyond our capacity, and I don't have time on top of what I'm already doing.

You don't have time not to pay attention to the data, because the data should be informing every decision you are making. I'm speaking now in terms of the nonprofit world: if you're not using your data to inform organizational decisions, why are you doing any of it?

You're working completely in a vacuum. So that objection, that “that's not us,” I just won't accept, because it is affordable. I mean, there are free courses everywhere.

Google is your friend. Anytime I need to know how to do something, I can Google it and get a step-by-step. Just follow the steps; if you're an analog person, print them out and follow them. And there are courses, Lynda courses, that are free, or were, until it was bought out, I believe.

Melissa: It’s LinkedIn Learning now.

Alice-Anne: You know, and simple things. I mean, “Oh, I don't have the tools to get data.” “Do you have a Gmail account?”

“Yes? Okay, what I would like you to do is set up Google Alerts on this topic, on this person, on whatever you need to learn more about.” You can tap into other people's data sources and start layering them into your own, finding out who of my audience actually fits this data that someone else is pulling already. What does that mean? How does that inform me? So it doesn't have to be expensive. Go take an Excel course.

Power Query in Excel is a basic, simple way to learn how to analyze data. You can find step-by-step guides by Googling. You can also take a lot of free courses, or just find someone who can teach you those queries. They're actually quite user-friendly and very intuitive; it's easy to use them.
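
For readers who prefer code to spreadsheets, the same group-and-summarize step that Power Query makes point-and-click can be sketched in Python with pandas; this is an illustrative example only, and the file and column names are hypothetical:

```python
import pandas as pd

# A hypothetical export of donation records, the kind of sheet
# you might otherwise load into Excel's Power Query.
gifts = pd.read_csv("gifts.csv")  # assumed columns: gift_date, gift_amount, campaign_code

# Group and summarize: the bread-and-butter analysis step.
gifts["year"] = pd.to_datetime(gifts["gift_date"]).dt.year
summary = (
    gifts.groupby(["year", "campaign_code"])["gift_amount"]
         .agg(total="sum", count="count", average="mean")
         .reset_index()
)
print(summary)
```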

Melissa: Yeah, it's also just great that we have so many resources available to us. I often tell people to go check out Google Analytics Academy, because oftentimes we have Google Analytics on our website but don't actually know how to do much with it, and we're not excluding bots, or there are just kind of weird things going on.

It’s Not Just Counting

Alice-Anne: Yeah. I think there's also that perception that data is counting, and so people look at analytics and don't understand: how do I count? Where's the counting in that? Well, it might not be a counting thing. You might actually need to layer two different things together.

To find out something new based on the information that's coming out, how are you synthesizing the different pieces of information together to tell the full story?

Melissa: Yeah, that's what it's all about, right? The full picture of what's really going on, so you can make great decisions.

Alice-Anne: Yeah, yeah. And I think, for me, where I rely on data the most is confirming my assumptions, from the most basic assumption possible to things I may not have ever even thought about.

So I'm looking at a program theory, I'm looking at my data, and I can't figure out why it's not coming out as the theory said it would. Where is it broken? Usually it's an assumption. Usually, there's an assumption. The same thing happens in the donor realm, in the data and donor work of your fund development program. Usually, you've made a wrong assumption: you decided that the husband was the decision-maker when, in all actuality, it's the daughter.

And you know, there are so many assumptions that we make, so the data qualifies our assumptions and helps us determine what might be broken. I think that's where I use data the most: helping me understand where I didn't see something, or where something I assumed was actually inaccurate. And so the data helps beef those thoughts up.

Melissa: Yeah, that's great. That is definitely a big part of user experience: getting the assumptions out into explicit knowledge so we can take a peek at them and ask, is this really true, or is it just what all the stakeholders think, something that sends us down a really wacky path with a website or whatever other digital tool we're typically working with?

Other Pitfalls

Melissa: Can you tell us other pitfalls that you see people fall into around data?

Alice-Anne: Oh sure. Manipulating it instead of learning from it.

So I have a colleague who calls me on a regular basis. He used to work for me, and now he's at another organization, a huge one, a giant organization. His frustration is that they don't actually want to know the real results. They want certain results.

It's like a researcher going in with a thesis and disregarding the fact that it's okay to disprove it, that disproving your thesis is just as valuable and just as quality work as proving it. But we are programmed for accountability, so we're always pushing what's right, how we're proving that what we're doing works, instead of asking how we find out it doesn't and making changes based on what we're learning.

And I think the biggest pitfall, other than thinking you have the skills when you really don't, and thinking data is just counting, is trying to manipulate it to tell the story you want to tell versus letting it narrate the story itself.

Becoming More Open-Minded

Melissa: Yeah, we've certainly seen that in many different cases, of course, around user experience types of research and qualitative data and confirmation bias and those sorts of things. How do you try to get people into the mindset of being more open-minded, like a true scientist would be?

Alice-Anne: You have to be open to failure. Your risk threshold needs to be set. I think part of your assumptions has to include how you articulate your risk threshold, so that you know how much risk you can take toward failure in order to learn. Without that risk of failure, there is no learning. We learn through our failures.

And if we're always only saying, “We're doing it, we're doing it, we're doing it,” when we're not actually doing it, eventually that will come to the surface. Others will know that you're not doing it. You're spinning your wheels in the mud.

So I think it's creating a culture that's not risk-averse, one that encourages learning. You know, I'm very much in support of those environments that have creativity time, like where Google gives you 20% of your time just for exploring, figuring things out, failing.

3M, I mean, 3M is amazing, and some of their biggest products, Post-it notes, came out of employees' free creativity time: explore, figure something out. They invest what I think is 3 billion a year, though I'd have to look at the stats to know for sure. They invest so much money in R&D (research and development). So what is the R&D risk threshold of your organization? Most organizations don't even have an R&D aspect to their work. It's succeed or else; there's no room for failure.

That's perpetuated by the funders, by the way that we're structured as a sector, of course, but you don't have to live within that. You can work to build in opportunity, and funders actually are very supportive of failure.

I find that my relationships with funders are better when I’ve failed and worked through the failures with them to overcome the challenges and find new solutions and actually learn something together.

They're more apt to give me more funding to continue working than to the one who kept saying, “It's working, it's working, it's working!” when suddenly it comes to the surface that it's not working. You are never going to see funding from them again.

So really, creating a culture that is accepting of failure is the biggest step. That takes a lot of influence. Slow wins. Little tiny wins: fail, and show how that failure helped move something forward. Work in those little tiny increments to overcome resistance, bring people on board, and help them see the value in risk-taking.

Melissa: That was a great answer. Thanks for sharing that. I think that's a real challenge. There's certainly a lot of desire; I even see it in the startup world and other areas beyond nonprofits, where we say we want to fail, we say that we're willing to take risk. But then, when it comes down to it, people are punished for ideas, for doing something that didn't work out, that sort of thing. We've got to get away from that if we're going to actually learn, right?

Alice-Anne: Absolutely, absolutely, and that's all on the leader. That's 100 percent on the leadership of organizations working to create that culture. It can't come from staff. The leader needs to embrace that risk and be okay with it.

Dark Data and Your CRM

Melissa: That makes sense. Are there other pitfalls that we should be aware of?

Alice-Anne: Understanding why you're collecting data. There's a lot of dark data in the nonprofit world. It's just accumulating and accumulating and accumulating, which creates a risk that could be mitigated. You're putting data into the system, but you're never using it.

For example, if you're thinking of the donor world, you've got your CRM, and you're entering it all, saying, “Look, I've got the best CRM ever. It takes everything that you put in, all these notes. I've got total histories. I can add anything.”

But guess what? You can only query on gift date, gift amount, campaign code. You can't actually query any of those other contextual things you're adding into your system. That requires a custom report, which means paying for more, or the tech team can't even build it because they just don't have the capacity.

You know, think about this when you're purchasing a system: does it do what you need it to do to help you learn? And many of the CRMs that are touted as amazing are not. All they can do is count heads; they count beans.

They can't actually analyze or layer in anything contextual, so I think that's a major place where organizations get stuck without knowing what's happening. You think you have something. You think you have the tool that's going to work. But it's actually not working for you.
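
One common workaround, sketched here as a hypothetical example in Python with pandas (the export file names and columns are assumptions, not any particular CRM's schema), is to export both the queryable gift fields and the contextual notes, then layer them together outside the CRM:

```python
import pandas as pd

# Two hypothetical CRM exports: the fields the stock reports can query on...
gifts = pd.read_csv("gifts_export.csv")          # donor_id, gift_date, gift_amount, campaign_code
# ...and the contextual notes those reports ignore.
notes = pd.read_csv("contact_notes_export.csv")  # donor_id, note_text

# Find each donor's most recent gift, then layer the contextual notes back on,
# so you can ask questions the built-in reports can't answer.
gifts["gift_date"] = pd.to_datetime(gifts["gift_date"])
last_gift = gifts.sort_values("gift_date").groupby("donor_id").tail(1)
layered = last_gift.merge(notes, on="donor_id", how="left")

# Example: donors with contextual notes whose last gift is more than a year old.
lapsed = layered[layered["gift_date"] < pd.Timestamp.now() - pd.DateOffset(years=1)]
print(lapsed[["donor_id", "gift_date", "note_text"]])
```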

You can drive the simple systems just as well as you can drive the Cadillacs and Lexuses of CRMs. You just have to make sure that those reports can be built in, that you can tap into that information, and that you're actually using the information.

It doesn't do any good to have a crackerjack development person entering data about your donors if no one ever utilizes it. It doesn't do any good to have a program person gathering all of this information if you completely disregard it, and it goes into a file folder that's put into a drawer.

One of the things that blew my mind in a research project was seeing where the data goes. It's put in a file and in a drawer. Not even a file somewhere on a network! It's in a folder in a filing cabinet. What?! Nothing is being done with that data. It's only using what's mandated, only looking at data as a requirement and an added thing, not looking at data as part of what you do every day, in every decision you make.

You Need a Data Agenda 

I think the other major, major pitfall, the major land mine, is not having a data agenda, not having a path for your data. If the board has never seen data that would inform decision-making, and the board makes decisions...how are you making decisions that are going to accurately reflect the outcomes and move your organization forward?

Or if the board is mandating data and the frontline people gathering it have no idea why, it will be flawed; it will have issues in it such that you can't interpret it the way you need. So there has to be an understanding at all levels, across the entire organization, that the data isn't just that program person's data or just that development person's data. This is the organization's data. It's all of ours, and we all need to understand how it comes in, how it flows, how it's managed, and how we layer it into each piece of our work to make organizational decisions.

Melissa: Okay, is what you just said right there the data agenda? How we use it, why we use it?

Alice-Anne: All of that: how we gather it, who sees it, how it's stored, when it's purged, how long it can stay in a system. You need to know all of those really important pieces. Risk threshold, too: your risk policy needs to be reflected in your data agenda. Who determines what data is gathered and why? What is the process for using it? Are you following ethical standards?

There are no standards for the nonprofit world. We don't follow the same research standards that a researcher would have to, yet our subjects are, more often than not, human subjects. We have zero training, zero awareness, and no regulation around what that data capture is, or whether we are ethically gathering that data.

So part of your data agenda is adopting ethical processes. Even if you work with animals, there is a whole set of ethical standards. So learn IRB processes and make them part of your data agenda. It's bigger because there's a risk to it; there is always a risk. But the benefits of having it well planned, having it mapped out, understanding where it's going and how you're using it: that mitigates legal issues.

It removes that barrier of people saying, “We can't do that because it's a legal issue.” No, you've got yourself covered. You've got a data agenda that maps all of this out, so you're protected. You're making the right decisions. Your risk policy is up to date and includes your data agenda. But then it's making sure that data doesn't sit and become dark data, just sitting there, contributing to that huge pool of data that someone could tap into at some point. That's a higher risk than data that's actively being used.

Melissa: Thanks for explaining all that! So you would want to have your data agenda explicitly written out, policies decided on, that sort of thing. It's interesting, when you were talking about data gathering: a lot of the conversations I've been having lately are about being trauma-informed in our gathering, being sure in our research methods and our process of talking to people that we're being mindful we're not making things worse, the same way the IRB is there to protect subjects in the university realm. A lot of things to think about.

Don’t Just Check Off The Data Box

Alice-Anne: Yeah, and I think sometimes organizations get stuck in, “I only collect what the funder tells me I need to.”

Melissa: Yeah, what the funders want.

Alice-Anne: How is that informing your organization? How is that making you a data-driven organization that's making the best decisions and proving to your donors that you are the best investment?

“I'm just doing what I'm told and checking off the box.”

So that is probably the other pitfall — checking off the box.

Melissa: Checking off the boxes, yes. And also, when you are collecting that data, maybe just for funders, and you're asking a certain question, it's probably about explaining to the person you're getting it from why: why do you need it, why do you need to know that?

Alice-Anne: And it's having the audacity, which is wonderful. I don't look at audacity as a negative word; one of my favorite things to be is completely audacious. It's going back to a funder and saying, “This data point you're asking me to gather is a hindrance to my trust-building; it's not in alignment with the values of our program. Could we consider a different data point? Here's what we capture. Will this work?”

Funders love that! Yet most organizations think, "I just have to do what the funder wants, or I won't get funded."

Well, no, it's a dialogue. It's a communication. There's an ecosystem you're building here.

Part of that ecosystem is collaboratively coming up with the answer to, “What are those data points?” If the data point they're asking you to collect is completely counterintuitive to the work that you do, there's nothing wrong with reaching out to the funder and saying, “Can we work through this? Because this isn't serving our program. Here's what we want to collect, and why, and what we're going to learn from that data.”

That funder will…their mind will be blown! And you will get multi-year support, and they will love you forever. You will now be the spokesperson for modeling how to do data capture and analysis; your organization and you will be their poster child. So do not fear the foundation.

They have good intentions, and a lot of the problem is not the foundation. I hear a lot of people blaming the foundations, but it's also our inability as a sector to have a dialogue, to communicate, to build those personal relationships, and to collaborate in a real way.

Melissa: That is a great point. I love this idea of pushing back on requested data that maybe isn't the right data, of really getting to have a real conversation with the people who are funding you. I love it.

Alice-Anne: Yeah, they're not the enemy. You're on the front lines. You're there. You're doing the work. And who better to inform and have that conversation? That's setting up the model: you've got your data agenda incorporating all levels of the organization into the process.

And then you're bringing in the next level of your ecosystem and saying, okay, funders, be a part of this conversation too, and understand the why, because collecting data that you're not going to use is unethical.

Melissa: That's a great point too. Thank you. This is chock-full; I love it. If people could only remember one thing about everything you've talked about, or one thing about data, what is important for them to walk away with?

Alice-Anne: It's not hard. It doesn't have to be hard. Start from where you are. Look at what you already have. What is it telling you? Then learn, find ways to grow from there, and make it a part of... the biggest thing, actually, is making it a part of what you do.

Everything you do should involve data, everything. It's not an extra. It's not an add-on. It's not, “I wish we could, if we had the money, if I had the human resources, if I had the tech infrastructure, I would.” You can do it simply and start there. But do it, and use it as a part of every single thing that you do.

Melissa: Thank you, I really appreciate you sharing all that. This has been really helpful!

Alice-Anne: Thank you, and thank you for all you do too!