Most surveys are terrible. Poorly designed questions, incoherent survey flow, and useless results explain why data-driven organizations have so little faith in survey research. But this isn't the fault of the tool: most surveys are simply built without adhering to some basic best practices, and fixing those can transform any survey from a zero to a hero. This lecture will show you how to create data-science-quality surveys that provide unique, immediately actionable insight about your customers, competitors, and marketplace.
This Lecture Will:
-EXPLAIN THE DATA SCIENCE APPROACH TO SURVEY LAYOUT AND QUESTION DESIGN.
-SHOW HOW TO INCREASE RESPONSE AND COMPLETION RATES THROUGH ITERATIVE TESTING.
-LINK SURVEY RESULTS TO OTHER DATA SOURCES TO ENRICH YOUR ANALYSIS.
You can watch this lecture here: https://youtu.be/WuBenXuVzqc
Step Up Your Survey Research - Dawn of the Data Age Lecture Series
1. Dawn of the Data Age Lecture Series
Interpreting Data Like a Pro
2. Hi. I’m Luciano Pesci…
Co-Founder & CEO, EMPERITAS
● Team of economists and data scientists delivering bi-weekly Customer Lifetime Value intelligence so
our clients can beat their competitors for the most profitable customers.
Founder & Director, Utah Community Research Group, Univ. of Utah
● Teach microeconomics, data science, applied research, & American economic history.
3. Today’s Lecture Outline
● Teach you the data science approach to surveys.
● Show you how to increase survey engagement.
● Explain how to blend survey data with other sources.
5. Why Surveys Can Suck
● The common causes are your “opportunities”:
○ No game plan for the resulting data.
○ Survey appears illegitimate and/or cryptic.
○ One-size-fits-all unengaging user experience.
○ It wasn’t pretested, and pretested, and pretested.
● Great tool for data science when trying to
answer the “why” behind observed patterns.
6. Opportunity #1: Create a Game Plan
● The primary problem with surveys is not
understanding the connection between the
information you need and how you plan to use it.
○ This should be done BEFORE the survey is written.
● Using S.M.A.R.T. goals can easily fix this.
○ See our “Getting To Quick Data Wins” lecture.
7. Opportunity #2: Express Legitimacy
● The survey needs to immediately express a
legitimate purpose.
○ Be as transparent as you can without
compromising the research design.
● People judge everything about the survey
in a (calculable) instant.
○ This includes language and images, but especially flow.
8. Opportunity #3: Focus on User Experience
● Survey takers are unique individuals; customize their
experience with text-piping, adaptive questions, etc.
○ This gives you pre-segmented data & allows for more questions.
● Any path through the survey must be logical.
○ Don’t leave anything to the respondent’s imagination
unless it’s a required part of the survey design.
9. ...Side Note On Wording
● Avoid being too highbrow or too lowbrow.
○ Use language that sounds natural when spoken aloud.
● Know your audience and speak in their terms.
○ Do qualitative research before the survey.
■ See our “Killing It With Qualitative Research” lecture.
10. ...Side Note On Questions
● Watch out for double-barreled questions.
○ They are questions which ask “this” AND “that.”
○ They make analysis impossible.
● Group questions by a logical theme.
○ Arrange individual questions in logical order too.
○ Ask if a question is “absolutely needed.” If not, cut it.
○ Ask sensitive questions at the end, after building trust.
11. Opportunity #4: Pretest, Pretest, Pretest
● 99% of survey problems can be caught by pretesting.
○ The rest can be caught in the soft-launch.
■ The difference being, your team takes the survey for the
pretest, but respondents take it during the soft launch.
● Use a checklist before launching (like a pilot).
○ I’ll offer you one at the end of lecture.
12. Data Collection (Fieldwork)
● Once the pretest ends, you enter a new phase
in your survey project: fieldwork.
● You only get one chance to survey someone.
○ The key to success is running a series of small-batch
engagement tests before fully launching into the field.
■ Test, iterate, test, iterate, repeat.
14. Soft Launching
● The first step of fieldwork is to soft launch.
○ These are controlled, small batch experimental
distributions of the survey done iteratively.
● You’re not the target respondent; get out of
the building and talk to them directly.
○ This can be done digitally; it just has to involve them.
15. Rates To Guide By
● Two metrics to track during your soft launch are:
○ Response rate = (# opened) / (# people contacted)*
○ Completion rate = (# surveys finished) / (# surveys started)
● One measures initial engagement success;
the other measures ultimate engagement success.
*See surveymonkey's definition of response rate, https://goo.gl/VgV1zD
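As a concrete sketch, the two rates above can be computed like this (all counts are hypothetical):

```python
# The two soft-launch metrics, exactly as defined on the slide.
def response_rate(opened: int, contacted: int) -> float:
    """Share of contacted people who opened the survey."""
    return opened / contacted

def completion_rate(finished: int, started: int) -> float:
    """Share of started surveys that were finished."""
    return finished / started

# Hypothetical soft-launch counts:
print(response_rate(120, 1000))   # 0.12
print(completion_rate(75, 120))   # 0.625
```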
16. Response Rates Explained
● Solicitation channel mix is very important.
○ Calculate response rates for each channel by setting
up a system of link tracking and distribution control.
● Optimize during the soft launch.
○ A/B test the solicitation channels, message timing,
calls to action, images, and text copy.
○ Only test one dimension at a time.
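A minimal sketch of per-channel response-rate tracking, assuming you already tally contacts and opens per tracked link (the channel names and counts below are made up):

```python
# Hypothetical link-tracking tallies per solicitation channel.
tracked = {
    "email":  {"contacted": 2000, "opened": 260},
    "sms":    {"contacted": 500,  "opened": 95},
    "social": {"contacted": 1500, "opened": 90},
}

# Response rate per channel, per the slide's definition.
rates = {ch: t["opened"] / t["contacted"] for ch, t in tracked.items()}
best = max(rates, key=rates.get)

for ch, r in sorted(rates.items(), key=lambda kv: -kv[1]):
    print(f"{ch:7s} {r:.1%}")
# In this made-up data, "sms" has the best response rate (19.0%).
```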
17. Completion Rates Explained
● This is where you learn if your survey works.
○ Study survey length (avg, min, max).
■ You can ask 2-3 questions per minute; open-ended questions take longer.
○ An acceptable survey length is determined by the incentive.
● Look for patterns in question drop off.
○ Read open-ended questions. If the survey has problems, this
is where people will tell you.
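One way to spot the drop-off patterns mentioned above, sketched on a made-up response matrix where None marks an unanswered question:

```python
# Each row is one respondent's answers; None = not answered (hypothetical data).
responses = [
    ["a", "b", "c", "d"],
    ["a", "b", None, None],  # quit after Q2
    ["a", "b", None, None],  # quit after Q2
    ["a", "b", "c", None],   # quit after Q3
]

n_questions = 4
# How many respondents answered each question.
answered = [sum(1 for r in responses if r[q] is not None) for q in range(n_questions)]
# The biggest drop between consecutive questions flags the problem spot.
drops = [answered[q] - answered[q + 1] for q in range(n_questions - 1)]
worst = drops.index(max(drops))  # most respondents quit after question worst+1

print(answered)                             # [4, 4, 2, 1]
print("most respondents quit after Q", worst + 1)  # Q2
```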
18. The Point of No Return
● With your soft launch complete, it’s time to fully launch the survey.
○ Once the survey is fully launched you should only make minimal changes (if needed).
○ Still track performance metrics & read feedback.
■ Set up a real-time report and share access with stakeholders.
20. Don’t Skip Data Prep
● Before you blend your survey data with any
other sources, make sure it’s analysis ready.
○ Always keep a backup of your raw data.
● Inspect any survey takers who sped through,
straight-lined, gave incomplete open-ended
responses, or took the survey multiple times.
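A rough sketch of the speeding and straight-lining checks; the duration threshold and sample records are assumptions for illustration, not actual screening rules:

```python
# Hypothetical minimum plausible completion time, in seconds.
SPEED_FLOOR_SEC = 60

# Made-up respondent records: total duration plus answers to scale questions.
takers = [
    {"id": 1, "duration": 300, "scale_answers": [4, 2, 5, 3]},
    {"id": 2, "duration": 35,  "scale_answers": [3, 3, 3, 3]},  # sped + straight-lined
    {"id": 3, "duration": 240, "scale_answers": [5, 5, 5, 5]},  # straight-lined
]

def flags(t):
    out = []
    if t["duration"] < SPEED_FLOOR_SEC:
        out.append("speeder")
    if len(set(t["scale_answers"])) == 1:  # same answer to every scale question
        out.append("straight-liner")
    return out

suspect = {}
for t in takers:
    f = flags(t)
    if f:
        suspect[t["id"]] = f

print(suspect)  # {2: ['speeder', 'straight-liner'], 3: ['straight-liner']}
```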
21. Precode Your Survey Data
● Some survey tools (like Qualtrics) allow you to
precode the labels and values of your survey
data before (or after) fieldwork is completed.
● Always include a copy of the survey (with coded
values) with your analysis-ready survey data set.
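A minimal illustration of precoding outside a survey tool: keep a codebook of value-to-label mappings next to the data set (the question name and labels below are invented):

```python
# Hypothetical codebook: numeric codes stored in the data, labels kept alongside.
codebook = {
    "q3_satisfaction": {1: "Very dissatisfied", 2: "Dissatisfied",
                        3: "Neutral", 4: "Satisfied", 5: "Very satisfied"},
}

# Raw rows as exported, with coded values.
raw = [{"respondent": "r1", "q3_satisfaction": 4},
       {"respondent": "r2", "q3_satisfaction": 2}]

# Swap codes for labels when building the analysis-ready set.
labeled = [{**row, "q3_satisfaction": codebook["q3_satisfaction"][row["q3_satisfaction"]]}
           for row in raw]

print(labeled[0]["q3_satisfaction"])  # Satisfied
```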
22. Overlaying Other Data
● Survey data’s power increases dramatically if
you can tie specific survey responses to other
data you have about the same individuals.
● For example: analyzing sentiment from survey
data with observed product usage data.
23. The Missing Link
● Your survey data will have individual people as the rows,
and their responses to questions as the columns.
● That means to link this data to other sources,
you must be able to attribute things to individuals.
○ You can use one source of data as the link, or you can validate
across multiple sources: unique ID (user, respondent), name, email.
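A sketch of that row-level linkage using email as the key (the field names and records are hypothetical):

```python
# Survey rows: one respondent per row, responses as columns.
survey = [{"email": "a@x.com", "nps": 9},
          {"email": "b@x.com", "nps": 4}]

# Other data about the same individuals, keyed by the linking field.
usage = {"a@x.com": {"logins_30d": 22},
         "b@x.com": {"logins_30d": 3}}

# Attach usage data to each survey row; unmatched emails simply gain nothing.
merged = [{**row, **usage.get(row["email"], {})} for row in survey]

print(merged)
# [{'email': 'a@x.com', 'nps': 9, 'logins_30d': 22},
#  {'email': 'b@x.com', 'nps': 4, 'logins_30d': 3}]
```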
24. Removing Human Error
● 99% of mess-ups in the data (those that break
the 1-for-1 nature of rows & columns) are
the result of human error.
● Tools (like Qualtrics) allow for automated
merging through embedded data fields.
26. What We Covered Today...
● The data science approach to surveys.
● How to increase survey engagement.
● Blending survey data with other sources.
27. Additional Resource: Survey Checklist
● For a copy of Emperitas’ survey
checklist, just send me an email:
○ luciano@emperitas.com
28. Additional Resource: Survey Timeline (in weeks)
[Gantt chart spanning weeks W1 through W12, with overlapping phases in order: S.M.A.R.T. Goals, Qual Research, Program Survey, Pretest, Soft Launch, Fieldwork, Data Prep]
29. JOIN US FOR THE NEXT LECTURE
Data Drive Your Content Creation, Thursday January 18th 2018
emperitas.com/lecture