Up: presentation-increasing-response-rates-incentives
Evaluating the Effect of Monetary Incentives on Web Survey Response Rates in the UK Millennium Cohort Study
Reading: Booth, Charlotte, Erica Wong, Matt Brown, and Emla Fitzsimons. 2024. “Evaluating the Effect of Monetary Incentives on Web Survey Response Rates in the UK Millennium Cohort Study.” Survey Research Methods 18(1): 47–58. doi:10.18148/SRM/2024.V18I1.8210.
KEYWORDS: incentive experiment; longitudinal survey; data quality; response rates
Overview
- This study tests whether a £10 shopping voucher increases participation in a web survey, and improves answer quality, for the UK Millennium Cohort Study (MCS), which follows nearly 19,000 young adults born in 2000–02.
- Conducted during the COVID-19 pandemic (February–March 2021).
- Compares a voucher group (75% of the sample) to a no-voucher group (25%) in the third COVID-19 web survey.
In brief: 13,351 cohort members (aged 19–21) were invited, and the £10 voucher (offered after completion) lifted the response rate by 6 percentage points, modestly improved some data-quality indicators, but did little to bring back people who had skipped past surveys.
What They Did
- Background: The MCS follows 18,818 young UK adults to learn about their lives. Earlier sweeps were face-to-face (72% response at the start, 74% at age 17), but COVID-19 forced three online surveys in 2020–21. The first two drew low responses (27% and 24%), so the team tested whether a reward could lift the third.
- The Plan:
- Who: 13,351 MCS participants (aged 19–21) invited to a web survey about COVID-19 effects in February–March 2021.
- Groups: Randomly split:
- Voucher Group (75%, ~10,000 people): Offered a £10 shopping voucher (like Amazon or love2shop) after finishing the survey.
- No-Reward Group (25%, 3,328 people): No voucher, as usual.
- Survey Process:
- Sent invitations by email and post, with a web link. Non-responders got three email reminders (or one postal), plus two texts.
- Some non-responders were called for a phone survey, but these were counted as non-responders for the web study since phone invites weren’t random.
- Voucher group was told about the reward in the invite; after finishing, they got instructions to claim it (online or by mail).
- What They Checked:
- Response Rate: How many completed the survey.
- Who Responded: If the voucher helped get more men, ethnic minorities, or others who usually skip surveys.
- Survey Quality: Indicators like quitting early (break-off), skipping questions (e.g., income), giving identical answers down a grid (straight-lining), skipping a free-text question, and time spent; see the first sketch after this list.
- Future Impact: If the voucher made people more likely to do a survey six months later (when everyone got a £10 offer).
- How They Analyzed:
- Used regression to compare response rates, controlling for traits like gender, ethnicity, poverty, and past survey non-response (see the second sketch after this list).
- Checked if the voucher worked better for some groups (e.g., men vs. women).
- Looked at survey quality for the 3,601 who did the web survey.
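To make the quality checks concrete, here is a minimal sketch of how break-off, straight-lining, and free-text skipping could be flagged in respondent-level data. The toy data and every column name are hypothetical, not the MCS variable names; the paper's actual coding scheme isn't reproduced in these notes.

```python
import pandas as pd

# Toy respondent-level data; all columns and values are hypothetical.
df = pd.DataFrame({
    "reached_end":    [True, False, True, True],   # finished the questionnaire?
    "income_missing": [0, 1, 0, 1],                # 1 = skipped the income question
    "grid_answers":   [[3, 1, 4, 2], [2, 2, 2, 2], [1, 3, 2, 4], [5, 5, 5, 5]],
    "free_text":      ["felt isolated", "", "missed friends", ""],
    "minutes_spent":  [31.0, 12.5, 29.3, 27.8],
})

# Break-off: started the survey but quit before the end.
df["break_off"] = ~df["reached_end"]

# Straight-lining: the same answer given to every item in a grid battery.
df["straight_lined"] = df["grid_answers"].apply(lambda xs: len(set(xs)) == 1)

# Item nonresponse on the open-ended question.
df["skipped_free_text"] = df["free_text"].str.strip().eq("")

# Share of respondents flagged on each indicator.
print(df[["break_off", "straight_lined", "skipped_free_text", "income_missing"]].mean())
```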
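And a sketch of the kind of model the analysis describes, fit here as a logistic regression on synthetic data. The logit choice, the variable names, and the simulated data are all assumptions for illustration; the authors' exact specification may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 13_351  # issued sample size reported in the paper

# Synthetic stand-in data; in the study these traits come from the MCS frame.
survey = pd.DataFrame({
    "voucher":           rng.choice([1, 0], size=n, p=[0.75, 0.25]),  # 75/25 random split
    "female":            rng.integers(0, 2, size=n),
    "ethnic_minority":   rng.integers(0, 2, size=n),
    "income_poverty":    rng.integers(0, 2, size=n),
    "prior_nonresponse": rng.integers(0, 2, size=n),
})
# Simulated outcome with a built-in voucher effect, purely for illustration.
p_respond = 0.22 + 0.06 * survey["voucher"]
survey["responded"] = (rng.random(n) < p_respond).astype(int)

# Main model: does the voucher offer predict response, holding traits fixed?
main = smf.logit(
    "responded ~ voucher + female + ethnic_minority"
    " + income_poverty + prior_nonresponse",
    data=survey,
).fit(disp=False)

# Moderation: did the voucher work differently for past non-responders?
moderation = smf.logit(
    "responded ~ voucher * prior_nonresponse + female"
    " + ethnic_minority + income_poverty",
    data=survey,
).fit(disp=False)

print(main.params)
print(moderation.params["voucher:prior_nonresponse"])
```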
What Happened
- Response Rate:
- No-Reward Group: 22% (741/3,328) responded.
- Voucher Group: 28% (2,861/~10,000) responded, a 6 percentage point increase (p < 0.001); a quick check from these counts appears after this list.
- That's roughly 840 extra people, a big win for a study that depends on keeping response numbers up.
- Who Responded:
- The voucher didn’t work better for specific groups like men, ethnic minorities, or poorer families; the boost was about the same for everyone.
- But it worked less for people who skipped the last big survey at age 17. For them, response rose from 4% to 7% (vs. 26% to 34% for regular responders).
- Women, White participants, those with educated parents, and non-poor families were more likely to respond overall.
- Survey Quality (for 3,601 web responders):
- Good Stuff:
- Less Quitting: Voucher group had lower break-off (3% vs. 6%, p < 0.001).
- More Time: They spent ~2.5 minutes longer (total ~30 minutes, p < 0.001), maybe feeling rewarded for effort.
- Mixed Stuff:
- Free Text: Voucher group skipped the free-text question slightly more often (63% vs. 60%, p < 0.001), possibly because they had already put in more time.
- No Change:
- Skipping Questions: 20% skipped the income question in both groups.
- Lazy Answers: 7% gave the same answer repeatedly (straight-lining) in both.
- Future Impact:
- Six months later, everyone was offered a £10 voucher for another web survey. Both groups responded at 33% (p = 0.589), so the earlier voucher didn’t make people more likely to join later.
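The headline response-rate gap can be sanity-checked from the reported counts alone. A minimal unadjusted two-proportion test (the paper's own estimates come from regression with controls, so this is only a back-of-envelope replication):

```python
from statsmodels.stats.proportion import proportions_ztest

# Reported counts: 741 of 3,328 responded without the voucher,
# 2,861 of ~10,000 with it (the voucher group size is rounded in these notes).
count = [2861, 741]
nobs = [10_000, 3_328]

z, p = proportions_ztest(count, nobs)
voucher_rate, control_rate = count[0] / nobs[0], count[1] / nobs[1]
print(f"voucher {voucher_rate:.1%} vs. control {control_rate:.1%}; "
      f"difference {voucher_rate - control_rate:.1%} (z = {z:.1f}, p = {p:.2g})")
```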
What It Means
- Key Points:
- A £10 voucher got 6 percentage points more young adults to do the online survey, adding ~840 responders.
- It slightly improved quality (fewer quit, more time spent) but didn’t fix everything (e.g., skipping free-text).
- It didn’t help much with people who often skip surveys, a tough group to reach.
- The voucher didn’t make people more likely to do future surveys.
- Why It Worked:
- £10 is decent money for a survey of this length (roughly half an hour), motivating more people, consistent with leverage-salience theory: an incentive makes the request attractive to people the topic alone wouldn't move.
- Vouchers (Amazon, love2shop) were easy to use online or in stores, fitting young adults.
- Multiple reminders (email, text, post) kept people engaged.
- Why It Didn’t Do More:
- £10 might be too small for hard-to-reach people (past non-responders). Bigger rewards (like $40 in a US study) might work better.
- Online surveys get lower responses than face-to-face (24–27% vs. 74%), and COVID-19 stress might’ve made people less interested.
- Skipping free-text could mean people felt they’d done enough after 30 minutes.
- Tips for Online Surveys:
- Offer £10 vouchers to boost responses, especially for young adults.
- Use multiple reminders (email, text, post) to keep people engaged.
- Try bigger rewards or special outreach for people who often skip surveys.
- Watch survey length: at around 30 minutes, people may be too tired for extras like free-text questions.