Ep 160 | Survey Your Audience to Increase Engagement with Peter Schafer
Surveying your audience is a great way to learn more about them and inform your marketing decisions. You can learn more about what your donors really care about, how they think you're doing, and what would help them be more engaged. You can learn from your community about the services you're providing and what is making the biggest impact. Peter Shafer is here to help you craft that perfect survey so you can take in the information, get out of your own head, and make choices from the perspective of those you serve.
What you'll learn:
→ sample questions you'll want to include.
→ how to go into the survey with the right mindset.
→ unique ways to get people to fill out your survey.
Key Takeaways
[10:13] Be Outcome-Driven. Each poll should be about a specific topic. Having too many questions or topics can deter your audience from completing the survey. Use “I” statements as questions so the audience feels more connected to the survey.
[17:04] Reverse Engineering. Doing a general poll every year may not provide valuable information that you can use. Think about what results you want to get from polling your audience and build your survey from there. You can poll a smaller, more targeted group of people and compare that data to the larger audience to find anomalies in your data.
[26:53] Scale Questions. Questions on a 1–5 or 1–10 scale are the easiest and fastest for your audience to complete, and they allow you to ask more questions. Limit the survey to one to three open-ended questions at the end to avoid longer completion times.
[32:05] Incentivization. Rather than offering an individual incentive to every participant, offer a raffle or sweepstakes prize to one winner at the end. Or find a sponsor that will make a donation for every completed survey so participants feel that they are making a donation.
Resources

Peter Shafer
VP of Sales and Marketing at Everest Communications
Peter Shafer is the Vice President of Sales and Marketing at Everest Communications. Everest is a digital communications firm that provides counsel and program execution support to companies in the areas of analytics, social media strategy, and digital reputation repair.
Having worked for prestigious polling organizations such as Gallup and Harris, as well as large global PR firms, Peter is the ideal person to shed light on how using data effectively can vastly improve your digital marketing campaigns.
Your audience is going to love his advice – whether it is made up of high-level executives, marketing experts, researchers, or business owners yearning to effectively communicate their messages and build long-term engagement with their brand.
Peter believes strongly that context, content, and collaboration are the keys to success in today's competitive digital environment. He wants to share insights with listeners to enable them to start building better digital strategies that will not only work… they will exceed their expectations! Learn more at https://everestcomms.com/
We love creating the podcast. If you like what you learned here please give us a tip and help us offset our production costs.
When you leave a review it helps this podcast get in front of other nonprofits that could use the support. If you liked what you heard here, please leave us a review.
Full Transcript
[Sami Bedell-Mulhern] It never fails: there is a topic that I think my audience is going to go gangbusters over and it flops. But then something that I'm really not as excited about does amazing. So that's proof that we don't always know what our audience wants, the experience that they want, what they're most excited about that we do, unless we ask. And so today I am talking with Peter Schafer about how to create that perfect poll and survey so that you can learn more about what your audience is looking for, so that you can show up for them and get better quality donors, volunteers, and participants, and so I think you're really gonna love this episode. I could not stop talking to him. And I think we're gonna have him back for another episode later this year, which I'm very excited about. But Peter Schafer is the Vice President of Sales and Marketing at Everest Communications. Everest is a digital communications firm that provides counsel and program execution support to companies in the areas of analytics, social media strategy and digital reputation repair. Having worked for prestigious polling organizations such as Gallup and Harris, as well as large global PR firms, Peter is the ideal person to shed light on how using data effectively can vastly improve your digital marketing campaigns. I am so excited about this, because Peter believes strongly that context, content and collaboration are the keys to success in today's competitive digital environment. And he wants to share his insight with you all to enable you to start building better digital strategies that will not only work but will exceed your expectations. And here's the thing, we are talking about polls and strategies around them in this episode, but this can be used for so many things outside of just your digital marketing strategy. So I hope that you'll take a listen and really think about maybe one or two times that you can survey your audience. Or maybe it's multiple times that you're surveying your audience, but one is only your donors, then one is only your volunteers, or your event attendees, or just your general email list. What information can you glean? But what I also love about this is really getting real with ourselves. So when the results come in, how do we interpret them? And how do we use them? And how do we trust them? Those are all kind of the topics that we talk about in this episode. So make sure you give it a listen and let me know if you're going to run a poll or survey your audience, because I really think you should. But before we get into it, this episode is brought to you by our digital marketing therapy sessions. I hope that you'll head on over to thefirstclick.net/officehours. I am here for you if you are thinking about digital marketing strategies going into the second half of the year, and you just want to know if it's kind of in line with what you're thinking, or you want to do a survey but you have so many things you want to do and you're just not sure how to go about it. That's what these sessions are here for: thefirstclick.net/officehours. Book your 30-minute session today. Let's get into the episode.
[INTRO] You're listening to the digital marketing therapy podcast. I'm your host, Sami Bedell-Mulhern. And each week, I bring you tips from myself and other experts, as well as hot seats with small business owners and entrepreneurs to demystify digital marketing, and get you on your way to generating more leads and growing your business.
[Sami Bedell-Mulhern] Everybody, please join me in welcoming Peter Schafer to the podcast. Peter, thanks for joining me today.
[Peter Schafer] Sami, it's great to join you. Thank you so much.
[Sami Bedell-Mulhern] So we talk a lot about data and like how to find data when we're trying to make marketing decisions like analytics, or social media demographics. But there's something that's just super special about, like, just talking to our existing audience. So why is polling our audience or pulling data from our existing customers and donors something that you are so passionate about?
[Peter Schafer] Well, for two particular reasons. One is I think more and more leaders are becoming comfortable and are embracing data as an asset. So they're looking at it differently. They're talking about it differently. They're digging into it deeper, more deeply. And I think that just the fondness of seeing what comes out of that has really invigorated this, this whole area of polling and research and even, you know, the reemergence of focus groups, nobody would have thought that they would have come back and now that they are. The second is that I think the information sharing among recipients or donors or whatever, has become so much more robust. It used to be kind of we’ll ask a question, you give us an answer. And it was a one way transaction. And now with both the way questions are asked, but also the technology, you can have a discussion even within a poll where you wouldn't have been able to do that 10 years ago, and I think that's allowed for a lot more conversational information sharing but also higher quality of data being shared. And so that's exciting. That's been the fun part, is that there's been this upgrade and quality of the data.
[Sami Bedell-Mulhern] Well, and I think there's two things that people say often if you go to your leadership and you say, okay, I want to poll our donors. I feel like there's two points of pushback. First one being, oh no, if we poll our audience, we're going to look like we don't know what we're doing if we're asking them for their opinion. And then number two is, well, we don't have a big enough audience to get enough information that would be valuable, or we have too big of an audience and it would be way too much work. So two questions on that. One, how would you contradict that leadership concern that our audience is going to think we're incapable, and two, how many people do you really need to poll to get good numbers?
[Peter Schafer] Yeah, that's the second one's a great question. And we'll get to that in a second. What I tell most of the leadership teams that I work with on this particular topic, is that number one, their audiences are changing almost every day, they may not necessarily change their values, they may not necessarily change, you know, their philosophy or approach to life. But there's so many stimulus that there are that they're, you know, hitting or getting every day that they are adapting and changing their opinions or minds about things so that a snapshot in time is not a movie, let's say. So I say that, you know, your audience is changing. And the fact that you recognize that their needs are changing, actually shows that you're ahead of things versus behind things. The second that I've noticed, and I talked about this with leadership teams all the time, is that you can ask about the same topic using different types of questions. So you can, you know, so that you're, you're able to gather and build out the data by asking those different questions. So it's not like you're going back and asking the same thing over and over and over. And I think audiences, especially donor audiences, that I've seen, really, really appreciate that, because they feel like they're included in something and that they're building on something versus, you know, are you going to give to us this month or next month? Or does this theme work for the campaign, that kind of thing.
[Sami Bedell-Mulhern] That's part of the project.
[Peter Schafer] That's exactly right, they now feel like they have a different level of ownership. I was working with a small liberal arts college a couple months ago on this very thing. And they really, you know, they took their statistics, one on one class way, way too seriously, and started doing all this thing, you know, and tried to do a lot of cutting and pasting. And I said, No, I said, this really is simple. What you just need to do is engage, use interesting questions and probing questions to engage them, and then go back a second time and ask for clarification. And you'll get even better responses. To your point about the number of people that you need to poll if you're doing a finite set of people. So let's say that you only have 200 donors, or that your group is, the typical rule of thumb is that if you're able to get at least 25% of that 200. So if you're able to get 50, that's a reasonable set of data to use and work with. And it may not necessarily be representative, but it's actually usable for what your research is supposed to do. In public polls in large public polls, the number of people who have been participating in those has been dropping. In private polling, or something that's much more targeted, you know, those levels have stayed pretty much the same, if not increased a little bit. So if you're looking at something that the Wall Street Journal does, or Gallup Poll those who I used to work with, or Harris, you know, they're looking at sample sizes of 750 to 1000 to, you know, even bigger than that. But they're also contacting almost 50 to 60,000 people to just get those 1000. So it's, it's, you know, the numbers begin to get exponential and it becomes cost in, you know, cost ineffective to do.
[Sami Bedell-Mulhern] Well I might be completely wrong. So since you work for Gallup, you have to let me know, but I want to say like sometimes Gallup polls will even run off of 100 people, and that's enough for them to extrapolate kind of what the general public is thinking. 100 people versus the population of the US like, that's a very small sample size.
[Peter Schafer] Yeah. And one of the things that has happened in the evolution of this, Sami, is that we now have more data points on people to start with before we even go into the field. So we don't have to ask all those demographic questions. We don't have to ask a lot of the things that you used to, but we're able to then take that and, you know, there's a term in survey and polling work that's about incidence in the population. So for example, you know, moms represent 23% of the overall population. We can narrow down that field just by some of the data that we already have and not have to ask the question again or, you know, bug them about those types of things. That allows us to narrow down the sample set so that we do have, really, a very tight, but also a very representative audience because of that.
[Sami Bedell-Mulhern] So the most important part before you poll, I would guess, is understanding what the goal is of that poll, right? Because we want to make sure that what we're looking for, we're not giving them 100 questions to try to get all the data, right, we might poll our audience two or three times a year, but each one has that very specific topic, right?
[Peter Schafer] Yes, yep. Yes. And you know, you nailed it in that the poll has to be outcomes driven. Meaning number one is, what do we really want to find out? You know, is it directional data? Is it specific transactional data? Is it, you know, intention data? You know, are you going to give or not? That kind of thing. But the second is that, how are we going to use the data? What is it that we're going to make? What decisions do we want to make off of this? And I think that's where we kind of get stumbled around. And then to your point, because everybody has a different opinion about it, we now end up with a salad bowl poll, which has got every vegetable and everything and you'd mix it all up. And it's a you know, now it's become a 100 question survey, and nobody wants to take it. Interestingly enough, one of the things that has changed in the polling community, because of the technology, is that a lot of surveys are now less than 12 minutes versus even 10 or 12 years ago, the standard was about 18 minutes. So we know now that’s the time it takes you to complete. The second dynamic is this, and about 50% of all polls, online polls, are taken on a mobile device, not your PC. So you need to make sure that the questions are recognizable enough on a small screen, because you can just point so you've got this, you've got this kind of dual dynamic of compression. And also, you know, just, we need to speed through it. And I think that has intimidated a number of people from asking, they want to ask the academic question versus the question they really need the answer to, if that makes sense.
[Sami Bedell-Mulhern] Yes, it does. And it's, it kind of is taking me back to like, we coach people on when they're writing their blog content when they're writing their social media content, like you want to write at a fourth grade level, not because that is the academic level of the people or the cognitive ability of the people that are doing it. But like you said, we want to get them through quickly. And we want them to be able to scan, process, and answer quickly. So would you say something like that is similar when you're crafting your surveys?
[Peter Schafer] Yeah, exactly right. And the two things I would do, or I would say to everybody, is this. Number one is that conversational questions are more effective than the, you know, on a scale of one to 10 with this and that, and, you know, if Mars happened to align with Earth at the same time. Don't put too many conditions into the question, just ask it as quickly as possible. The other thing too, and this is a little bit of a difference, is that if you form the question as an I statement, like I support this, versus do you support this. Do you support, that's binary. If you say I support this, you can scale that to be zero to five or zero to 10, and get a measure of intensity or even a measure of frequency about it. And then we've seen it, and this was in research that Gallup did, and some work that Harris Poll did as well, that those I statements create an ownership from a respondent or a donor that a question does not. So if you say, you know, please rate the following, like, yes, I plan to give in 2022, yes, you are in my estate planning process, you know, that kind of thing. Those are statements that people can own versus, you know, would you be interested in estate planning, that kind of thing.
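As a rough illustration of what those scaled I statements might look like once they are programmed into a survey tool, here is a small Python sketch. The item wording, field names, and the zero-to-ten range are assumptions for the example, not Peter's exact questions.

i_statement_items = [
    {"id": "plan_to_give", "statement": "I plan to give this year.", "scale": (0, 10)},
    {"id": "estate_plans", "statement": "This organization is in my estate plans.", "scale": (0, 10)},
]

def mean_intensity(responses):
    """Average intensity for one scaled I statement; higher means stronger ownership."""
    return sum(responses) / len(responses) if responses else 0.0

# Example: ratings collected for the "plan_to_give" statement.
print(mean_intensity([8, 9, 6, 10, 7]))   # 8.0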
[Sami Bedell-Mulhern] And wouldn’t a better question be more, because you're pulling on the emotional piece, right? Like, I am now invested in what it is you're talking about. So wouldn't better questions be something like, I would be interested in giving to the after school program, or I would be interested in giving to the nutritional education program? So that you can get more data on specific things your organization is doing versus kind of those broad, like, how specific do you want your questions to be?
[Peter Schafer] I think you want to kind of think about it in a pyramid or at least a funnel is that if you were going to ask about, let's say programmatic things, you might want to start out with a broad general statement about the programmatic things that the organization wants to do and are you generally supportive of that are you know, yes, these are important to me, you know, you know, there's that the important scale of somewhat important to very important, you know, is always one of the better, is a better metric. And then you might ask an intermediary question and then say, Here are a list of programs that we now have. And we're currently considering, would you, you know, are these of interest to you, and then you can kind of get that narrowed down. That three pronged approach typically helps if you are working on either a marketing or a campaign, a donation campaign, capital campaign, because it allows you to start at the broad and then get down to the specific. When you're looking at the results, though, you flip it and you look at the specific down to the broad, and does it connect, so it's almost like you turn the funnel upside down in terms of your data analysis. But it's a very easy way to funnel people's interest into those topics pretty quickly. One note on this, though, and I get this question a lot. The inherent bias in these questions is a lot less of an issue than it was 20 years ago, because number one, almost everybody has taken surveys, they understand kind of the dynamic, they're exposed to this all the time. So there's a lot less inherent bias in it simply because people are much more aware of it. The second is that, especially for marketing and for donor campaigns and other, you know, not for profit type research, it is much better to be, to have a little bias in it simply because you already know that that group is predisposed to be part of your audience. It's a warm audience. So you actually want, because they're captive, you want to ask them more specific questions, because they will give you much more specific answers, which will help create, create better programming, create, you know, better results for the survey. Will give you better directional data in regard to what's the next step strategically. So there's, you know, I wouldn't be concerned about that.
[Sami Bedell-Mulhern] That's a really good point. And I want to jump back to kind of what you said at the beginning about, like, having goals, knowing what your goals are, and all of that. So when you come to crafting your survey, do you think it's best to start kind of at the result side, and then reverse engineer your survey to like the questions that you're asking, so that you make sure everything is leading back to that end goal?
[Peter Schafer] Yeah, there's two there. I mean, ultimately, a lot of methods. But yes, what you want to do in one regard is reverse engineer and one exercise that we used to have people go through a lot when I was working with on the Gallup poll, and also the Harris Poll, was to have the client, think about what headline they would like to see on the front page of The New York Times or the Wall Street Journal or some major news outlet, you know, was it going to be on time magazine, when it when it was still in print? That kind of thing. But that helped kind of start to define, and then we would say, okay, you know, such and such has cured hunger. You know, do you want a specific percentage that says something like that, that 62% of the American population supports our effort to cure hunger or, you know, so you begin to kind of piece it together to say, Okay, here's generally what we'd like to see, here's what we would specifically like to say, and then we build the questions around to get to those answers. And that's, I think, it's a good creative method that almost everybody can employ, and not have to hire, you know, consultants and things like that. Because it's, it's something that is, it just resonates because everybody's looking at media all the time. And you could, but the other thing too, and to your point about the, you know, just being the reverse engineering, part of it, is just being aware of the barriers, if there, if you get to a point where you really don't know what the outcome is, and we're just doing this because we've done it every year or something like that, that's kind of that red flag to challenge yourself to either say, okay, is this really going to be valuable? Do we really need to do this? Or if we did it differently, what would we hope to get the outcome to be? And I worked with a, this happened, I guess probably about six months ago, I worked with a not for profit, that they have to report to the board the survey results. So it's become kind of one of this very tedious, and oh, we don't like to do it and blah, blah, blah. And I said, why don't we just do this? Let's do 80% for what the board wants, but use the other 20% for your strategic directional look forward, that kind of thing. And everybody's like, well, we can do that? And I'm like, Well, unless you're asking it like, you know, 25 minute survey, sure, why not? And but, you know, again, just don't get locked into that. Oh, We've got to do this again, or we have this trend question that we've used for 20 years. You know, I tell people, a lot of times, you know, think about who was president 20 years ago. I mean, think about, you know, think about the audiences that, you know, that you have added or subtracted in that amount of time. So, you know, don't be beholden to a trend that, you know, is not going to necessarily give you the information you need.
[Sami Bedell-Mulhern] Yeah, well, and I think also, so crafting the questions to get the results, not I mean, you're not like trying to manipulate the results by any means. But what happens? And how do you handle when those results come back, because I think a lot of times with nonprofit organizations, we are so emotionally invested in them ourselves. We are so emotionally invested in the product and the project, we think everybody else cares about it. We think everybody cares about it in the way that we care about it. Sometimes those surveys go out and the feedback we get is maybe not what we were expecting and or wanting. So kind of any thoughts around how to wrap your head around results that you've gotten that maybe weren't ideal? And how to kind of use those to, I mean, because it could be a whole host of things, right? It could be you're talking to the wrong people, it could be that you just frame the questions wrong, like how do you go about processing all of that?
[Peter Schafer] Yeah. So it's, it's a really good point that you mentioned about kind of the forensic process of did this go the way we wanted it to go? You know, was the fielding time enough? Did we get to talk to the right people? Was it the flamethrowers, the ones that always give us the, you know, so certainly going through that forensic process. The two things that I usually tell clients to do, number one, is pull together four or five people that are not as attached to it, but are still, you know, engaged and emotional about it, and just run the results by them, and usually that little focus group or that little triad or whatever will give you some good feedback that you hadn't been able to pick up in the data itself. One of the things that, you know, and this is an old methodology, but it's still pretty relevant today, is that when big surveys used to get done, it was qualitative, quantitative, qualitative, and I think a lot of people have dropped off that qualitative part, you know, and so I recommend it, because it's just a good, you know, gut check. The second, besides doing the forensic process, would be to go back and then look at demographic groups to see if the data is consistent with those demographic groups, or I should say, psychographic groups. So I'll give you an example. If you have within that set of data donors that have given for 10 plus years, you know, that might be a good group to isolate, to see if their results are different than the rest of the results. Because you would assume that they would be engaged at a higher level, right? So there are groupings of your donor base that probably are better, or more representative of the group, and just pull those people out, look at that group, and compare them to the rest of the data. And that usually will give you some sense of how far away or how close you are, actually, to the data point you want to be. I mean, especially with some of the software today, it's not that difficult to do. But it is an exercise that I think is a good test, to see or sort out what these anomalies in the data would be. I would say this, let's say that you're, you know, sending a survey to 500 people, and you get 200 to 250 back. If you have a particular question that has less than 40 responses, I would probably kick that data out. Just, you know, because at that point, you only have 20% of the actual participants, but realistically, you only have 8% of the entire universe. And that's just not enough. You know, and so I would say anything that is below, you know, probably a 10 to 15% threshold, I would just kind of, it might be interesting to know, but it wouldn't be, you know, worth spending a million dollars on.
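If you want to apply that low-response screen to your own results, a minimal sketch follows. The counts, question names, and the 10% floor are illustrative; Peter frames the cutoff as roughly 10 to 15% of everyone you invited.

invited = 500
response_counts = {"q1_overall": 230, "q2_programs": 195, "q3_estate": 38}

MIN_SHARE_OF_INVITED = 0.10   # roughly the 10-15% floor discussed above

for question, count in response_counts.items():
    share = count / invited
    verdict = "keep" if share >= MIN_SHARE_OF_INVITED else "interesting, but don't act on it"
    print(f"{question}: {count} responses ({share:.0%}) -> {verdict}")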
[Sami Bedell-Mulhern] Basing your entire marketing plan off of those 15 people.
[Peter Schafer] That's exactly right. Yes.
[Sami Bedell-Mulhern] I love that you said that because I think that that's a decision that I have been a part of so many organizations, where the one board member who happens to give a lot of money shows up and says, Hey, we got to do this because I heard that this is the thing to do, and then you shift your whole strategy, but it doesn't make sense to the goals of what you're trying to accomplish, and the survey results are almost similar, right? Like, you want to make sure that you're not just making decisions based off of one handful, small handful of people.
[Peter Schafer] Yeah. And you know, that's a great, great point, Sami. And so many people have different interpretations of what the numbers mean. And I think sometimes we get either too lost in the significance of what the number we think is saying, you know, so for example, we would do a lot of political polling, a lot of issue polling. And I tell this story. When I was working at Gallup right after the Bush Gore election of 2000. We did subsequent polls on trust in the Supreme Court. And at the beginning of that process is 61% trust. At the end of that process with the hanging chads in the Florida election, the recount, and they decided it was still 61%. Underneath it, all of the data had changed, but it was still you know, at the aggregate it was 61. So to your point, sometimes you need to go back to that board member and say, Well, you know what, yeah, it says 62%. But underneath it, this, this, this and this. I think, you know, was it the focus group of one? You have to account for it to a certain degree, but you're right, I mean, it's easy to toss out data results if you don't like them. Because it's, you know, and I think having that discipline, having that rigor, having that, you know, open minded approach at the outset to say, hey, look, we know that we're asking some tough questions. We know, we're asking some things that we may not want to actually find out. But that's okay. And we're comfortable with that, because this is what we need to do.
[Sami Bedell-Mulhern] Well, I was going to ask you kind of what questions we should ask and what questions we shouldn't ask. But I feel like that's so dependent on the purpose behind your poll. So instead, I want to flip that and ask like, how should we ask the question? So you already mentioned, I statements, like making people emotional, but is there kind of that mix of the scale questions versus open ended versus multiple choice? Like, kind of how might we want to craft this in order to make sure we get the most impact? And still make it short and sweet?
[Peter Schafer] Yeah, that's a great question. So these are not hard and fast, but they are at least guidelines for what, you know, we know in terms of market research and what helps completion rates go up. For the most part, if you can offer scaled questions, or scaled statements, on either a one to five or one to 10 scale, that's probably the best way to go. Most people are used to it, you know, so it's very easy, they can do it. If you have a complex question where you need multiple answers, so I'll use this one, you know, which subscription streaming services do you have? You know, just list them out, and usually 12 is where you need to stop.
[Sami Bedell-Mulhern] Oh, wow, that’s more than I would have expected
[Peter Schafer] Yeah, yeah. 12 is usually the limit. But again, if you have a list like that, you know, either put it in a grid, or make it, but 12 is usually, and of course have others so they can fill it in. One of the things for open ends is this. Typically, we recommend one to three open ends per survey, unless you tell the respondent or the donor or whomever you're serving upfront, that we want you to elaborate on each of the points that we do. And at that point, then you keep the survey to about eight to 10 questions, but you know that you're going to get an open end at the end of each question. So actually, I now have 20 questions, but the 20 questions are 10 rate. And then why did you rate it this way? 10 rate, why did you rate it this way? So that I mean, and again, that 20 questions would be about a six minute survey with them typing in.
[Sami Bedell-Mulhern] I feel like that's two sided. Like, number one, you're asking the survey taker to do more work when you have more open ended questions, but you're also then requiring yourself to do more work, because to process all of that information is going to take you and your team a whole lot longer.
[Peter Schafer] Yeah, it is. I mean, there's good news there, I mean, there are software packages like word clouding things that you can drop the verbatims into, and it spits out some really nice graphics and kind of gives you, you know, how many, what the frequency and the variance. Some packages have a little bit of sentiment analysis in it so that you know, they can, but you know, it's still very rudimentary science. So I would caution everybody before, you know, but on those particular questions, like I said, on our normal survey, 1 to 3 is usually the limit. And typically, they're used at the end, not the beginning, you know, obviously, but and that's the, you know, that's usually the kind of the cadence is that, you know, you can, you can get through about 30 questions in about seven to eight minutes with pretty much ease. The one thing that I would also suggest to you, in terms of when you're using scaling, if you just need kind of, you know, an agree or disagree scale, a five point scale is perfectly fine. If you need to look at intensity, you know, how people are really, you know, usually one to 10 scales, or one to seven scales are better at that, because you can have larger extremes. So it is, for example, it's easier to say somebody who rated, or we got an 8.2 on this question. That's different than a 6.8. There's a statistical difference there, it's a little bit harder to say 3.2 versus 3.6, if that makes sense.
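For open ends, you do not need a full sentiment package to get a first read. Here is a minimal stand-in for the word-cloud style tools Peter mentions: a simple frequency count over the verbatims. The sample answers and stop-word list are made up for the example.

from collections import Counter
import re

verbatims = [
    "Loved the after school program, the staff were amazing",
    "More communication about how donations are used",
    "The after school program changed our family",
]

STOP_WORDS = {"the", "a", "an", "and", "were", "are", "our", "about", "how", "more"}

words = [
    word
    for text in verbatims
    for word in re.findall(r"[a-z']+", text.lower())
    if word not in STOP_WORDS
]
print(Counter(words).most_common(5))   # e.g. [('after', 2), ('school', 2), ('program', 2), ...]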
[Sami Bedell-Mulhern] So, would it be kind of like, hey, how would you rate our customer service on a scale of one to five, versus how likely are you to send your child to our after school program, that might be a one to 10? Because that's a statistic you're probably going to want to share a little bit more publicly with your audience?
[Peter Schafer] That's right. That's right. You know, and the other thing too, is that one to 10 scale gives you a little more flexibility in regards to the buckets that you're looking at. So, you know, like you mentioned, likelihood is always, you know, it's a typical question. Once you get above six on likelihood, that's a pretty good indicator of intent. That's, you know, if they're, you know, in that three to four to five range, that's kind of like, oh, well, maybe but once, you know, and I think that that's helpful for you all, because then it's saying, Okay, we probably have a little bit more distance to convince them, or a little bit more distance to sweeten the offer or change the, you know, the marketing a little bit, but at least you know that there's an intent there that didn't exist.
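Here is one way to turn that reading of a one-to-ten likelihood question into buckets you can report on. The exact cut points are my paraphrase of the conversation (above six signals intent, three to six is a maybe), not an industry standard.

def likelihood_bucket(score):
    """Bucket a 1-10 likelihood rating into a rough intent category."""
    if score > 6:
        return "intent"
    if 3 <= score <= 6:
        return "maybe - sweeten the offer or adjust the message"
    return "unlikely"

scores = [9, 7, 5, 3, 2]
print([likelihood_bucket(s) for s in scores])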
[Sami Bedell-Mulhern] Yeah. Okay, so I have two questions left, because I know we've gone through a lot of things and given people a lot to think about. The second-to-last question I want to ask you is about anonymity of surveys and polls versus asking people for their contact information, and then kind of pairing that with incentivizing people for completing the survey in order to get more results. Can you talk a little bit more about that?
[Peter Schafer] Sure, absolutely. Let me do the anonymous first. Privacy is a big issue for most people. I find that though in not for profit settings, it's a little less intrusive than in other places, and there's usually a more gentle or gentler ask than in, you know, kind of getting an anonymous email or something like that. It's always a good practice to say, if you have any questions about this, here's a, you know, here's a contact person, they can clear up any questions, that kind of thing. But it is perfectly fine to ask for a recontact opportunity at some point.
[Sami Bedell-Mulhern] Would you recommend that just be an optional thing?
[Peter Schafer] Yeah, it's totally optional. Usually, it's at the end, usually, it's not, you know, it's either, you know, name, email, phone, whatever, or whatever the best way is to reach them. Typically, interestingly enough, about 40% of people will say they want to be contacted at the end. So I think, you know, again, that practice is fine. If there is something that is sensitive that, for example, you would have some issue in regard to privacy, you know, or that you're going to such a small donor audience, that it would be, it could be possible to figure out who's saying what, I would say make it as confidential, as you can, you know, and but the larger you get, the better, the better option is. Incentivization is really, it's a tough question to ask these days, or it's a tough question to address these days. Number one is nonprofits have very limited budgets, so they're not going to be able to do elaborate, you know, incentives and sometimes even offering the incentive sends a mixed signal like, oh, well, if you can do this, you know, typically, for those types of surveys, either we will make a donation in on, you know, or we will, you know, give you an option to make a donation or have your name associated with this list of people. Or it would be some special incentive related to the not for profit, like, you know, you would be eligible to win two tickets for our event. And typically, that can either be done by a sweepstakes or done by a raffle. So when all the participants are finished, you randomize them, you pick one and that's the winner.
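The sweepstakes-style incentive is simple to run once the survey closes: randomize the completed respondents and draw one winner. A minimal sketch, with made-up respondent IDs:

import random

completed_respondents = ["donor_017", "donor_102", "donor_233", "donor_418"]

winner = random.choice(completed_respondents)   # every completed survey gets one entry
print(f"Raffle winner: {winner}")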
[Sami Bedell-Mulhern] Well you can even find somebody who's as data driven, that's a sponsor and say, hey, what if we say for every survey result that comes in, you'll match $1, you'll give a $1 donation. So then if they turn it in, they're going to automatically make a contribution to it.
[Peter Schafer] Exactly right, and those are totally within reason, they're certainly totally you know, within common practice. Where the incentive process gets a little dicey is most people want an immediate response for that incentive. So if you complete the survey, they want their gift card within, you know, 24 hours of that kind of thing. So, you know, just have to be careful about it. I won't name the organization, but they are not for profit. And they asked me to do something for them, and it was a $10 Amazon card. It took me an hour just to activate the code and all this other stuff. It became a distance that now I still am giving to the organization, but you want to make sure that, but I, you know, what I have found is that the incentives, they help, but they are not, they're not the main driver for why somebody participates. So you know, it's a help, but it's not a requirement.
[Sami Bedell-Mulhern] Well, I just want to throw out there, if you're asking for testimonials on, like, Google or any of the other platforms as part of it, and you incentivize it, that can also get you in trouble. So just bear in mind, if that's part of your survey process, don't do it. Okay, the last question that I have for you is software. So people might be listening to this, and they might be like, okay, this all sounds well and good, but we have no way of pulling this together. There are so many tools out there that are free or cost effective. So kind of what are your favorite survey tools in the digital space? And I can share mine if you don't want to get yourself into trouble.
[Peter Schafer] Yeah. Well, you know, so two things on that, before I start naming names. Number one is that you're right, there is a ton of software out there. And there's actually even a subset of software, if you have your list, and that's all you're going to use, you have your email list, and you can use it for free, and it's, you know, great. I don't have a particular favorite, I work with clients who work with SurveyMonkey, work with, you know, SurveyGizmo, work with Toluna, work with Suzy, work with, you know, I mean, you kind of I hate to say swinging, you know, you could swing a bat and hit a survey.
[Sami Bedell-Mulhern] You are naming ones I've never even heard of.
[Peter Schafer] Yeah. But the one thing that I would say is this, and this is especially true in some of the education clients that I've worked with: number one, some of the software that they already use has survey modules built into it. So it's kind of one of those toss-ins that, if you did a little bit of digging, you might be able to get some good value out of it. But the second thing that strikes me is this: there are a lot of software packages that are specifically for not for profits that I think, in one regard, make it harder for you all to complete surveys, because they don't give you the flexibility, especially on the analytics package, to do some other things around it. So I would just say, when you're selecting it, there are two criteria. One is how easy it is to actually program the survey and launch it. But the second is what kind of analytics package is on the back end. You don't need to have this, you know, gargantuan MIT-built system. So that's what I would look at too. The ones that I just mentioned, you know, from SurveyMonkey to SurveyGizmo, whatever, they all have statistical packages built into them that are used by market researchers and Fortune 500 companies and other groups.
[Sami Bedell-Mulhern] Yeah, well, and I would throw out there too, a lot of times nonprofits have a CRM that they're using that has a ton of stuff in it that they're not even taking advantage of. So that's always my go-to: talk to your CRM, even if it's not something you can see on your dashboard. They might have that as an easy add-on. If you're doing very simple surveys, I think Google Forms are great. They're easy and they will export to an Excel document. To your point, they don't have the back-end machine learning kind of, or not machine learning, but, like, data pull. But I think it's important to take a look at the goal of your survey and then kind of do the research on the platform that makes the most sense. Because, like you said, when you start to Google how to take surveys, it can be overwhelming, and if you don't know what you need it to do, then you're gonna get lost in a sea of features and benefits and companies.
[Peter Schafer] Yeah, and you're right. It's so fragmented these days, that it's not easy to kind of navigate. And each of these companies is also transforming right now into different types of things. So when Qualtrics, which used to be, you know, a very, very great platform to use, now, they've changed their business model because they're public. And you know, this, it's not, they don't have as many free services or free tools. So it has changed a lot.
[Sami Bedell-Mulhern] Yeah, well, as you know, nonprofits, TechSoup is a great place to go to find those tech tools that have nonprofit discounts as well. So make sure you check that out. I feel like we could talk about this forever, and we may need to have you come back for another episode to talk more about the psychology of questions.
[Peter Schafer] I would love that.
[Sami Bedell-Mulhern] That is something we didn't dive too deep into today, but I think is super helpful. But is there any kind of last takeaway, Peter, that you might want to share with folks as they're kind of starting to think about and craft their polls for their audience?
[Peter Schafer] The only takeaway that I would have is this. And this has been an evolution just across the last five years. I have noticed that conversational questions, almost like what we're talking about here, are actually one of the best ways to start building your question for a survey. And that if you start writing those down, it's almost as if you can start building what you think the responses are, in a way. And I would challenge everybody to, to not over engineer these surveys and not make them more complex than they need to be. And to your point is just to say we're committed to using the outcomes, regardless of whether it's good, bad or ugly. But we're open to it, at least accepting that that's the case. And I think, you know, most people, they hear survey, and it's like, oh, my gosh, we got to do this, and we got to do this, and it's got to be this pristine. And, you know, in a lot of cases, it doesn't, it just has to be good enough to make a good decision.
[Sami Bedell-Mulhern] That's really, that's a really great spot to end this. Peter, how do people find out more about you and learn more from you?
[Peter Schafer] The best way is to connect with me on LinkedIn, Peter Schafer on LinkedIn, and my email is Peter at Everest, E-V-E-R-E-S-T-C-O-M-M-S, dot com (peter@everestcomms.com).
[Sami Bedell-Mulhern] Awesome. Well, I will link all of this up in the show notes. Thefirstclick.net/160. So we'll have all of the resources in there as well. You can check that out. Peter, thank you so much for joining me.
[Peter Schafer] It's been a real pleasure. Thank you so much. I really had a great time.
[Sami Bedell-Mulhern] Yeah, thank you so much to Peter again for joining me on this episode, you can find all the resources and things that we chatted about in the show notes at thefirstclick.net/160. But I hope that you will like and subscribe wherever you listen, even if that's on YouTube, because yes, we are on YouTube at thefirstclick.net/YouTube. But thank you so much for listening, and I cannot wait to see the surveys that you create. Let me know how the results come in and if we can support you and your digital marketing strategies to support that, but for now, I'll see you in the next one.