"We designed the questions carefully, yet 30–40% of people who hit Start drop off midway." Anyone running surveys for a year will hit this wall. The cause is usually not the questions — it's the copy that surrounds them. Respondents read the intro before the first question, see the progress indicator midway through, and receive a thank-you page and follow-up email after they finish. These four pieces of copy directly drive completion rate, data quality, and re-participation rate, yet design priority for them is conspicuously low in practice.
This article organizes the role and required elements of each of the four pieces, drawing on design principles grounded in Dillman's Social Exchange Theory, Conrad et al. (2010) on progress bars, the informed-consent structure from IRB practice, and the editorial team's working templates. Read it alongside 10 Practical Techniques to Lift Response Rate, Reminder Email Best Practices, and Mobile Survey Design Guide; this piece focuses on lifting completion rate.
1. Why Copy Design Drives Completion Rate
Before answering any question, respondents enter a mode of judging "should I cooperate with this survey?" Dillman, Smyth & Christian (2014) Internet, Phone, Mail, and Mixed-Mode Surveys explains response behavior through Social Exchange Theory. Unconsciously, respondents weigh whether "the benefit of cooperating ≥ the cost + risk of cooperating," and that judgment is formed by the few lines of the intro.
Three Judgments Decided by Your Copy
From the intro, progress indicator, thank-you page, and follow-up email, respondents make three judgments:
- Trust: Who is running this survey, and for what purpose?
- Cost: How much time and effort will this take?
- Meaning: What will my response be used for, and what value does it create?
When respondents enter the questions with these judgments still ambiguous, they drop off the moment they feel "this is too much trouble" mid-flow. Cook, Heath & Thompson (2000) A Meta-analysis of Response Rates in Web- or Internet-based Surveys similarly identified prenotification, quality of the request copy, and expression of gratitude as top variables for response rate.
Time Allocation Between Question Design and Copy Design
In practice, teams often spend "90% on question design, 10% on copy." The editorial team recommends 70/30. Question improvements ratchet up a few percentage points at a time, but copy design improvements can move completion rate by 10–20 points.
2. Map of the Four Pieces of Copy
The survey response experience is made up of four pieces of copy.
The Four Pieces and Their Roles
- Intro: Read before the first question. Establishes trust, cost, and meaning so respondents start in a cooperative state.
- Progress Indicator: Seen mid-flow. Sustains a sense of progress and suppresses dropout.
- Thank-You Page: Shown right after completion. Makes the value of cooperating visible.
- Follow-up Email: Arrives within 24–72 hours. Converts one-time cooperation into willingness to respond again.
Designing these four as one continuous experience is the key to balancing completion rate and re-participation rate.
3. Designing the Intro — The Required Five Elements
The intro is where the biggest gaps open up. In academic research, the IRB's Informed Consent standard provides a structure that translates directly to commercial surveys. The five elements rooted in the Belmont Report (1979) are equally applicable outside the academy.
The Required Five
- ① Purpose: What is this survey trying to learn? (1–2 sentences, plain)
- ② Time: How many minutes does it take? (Never understate. Display the measured value + 20%.)
- ③ Anonymity / Privacy: Are individuals identified? What's the scope of data use?
- ④ Use of Data: What are results used for? (Product improvement / academic publication / any third-party sharing)
- ⑤ Contact: How to reach out with questions
BAD vs GOOD
BAD example (a typical failure):
"Please help us improve our service. Takes only a few minutes."
"We" and "improve our service" are vague, time is fuzzy, anonymity / use of data / contact are all missing. Respondents enter the questions with insufficient information to make a decision, and dropout mid-flow spikes.
GOOD example:
"Purpose: Help us improve onboarding by sharing your experience during your first week. Time: About 5 minutes (10 questions) Privacy: We collect no personally identifying information; responses are aggregated as statistical data. Use of data: For product improvement only; no third-party sharing. Contact: research@example.com"
Respondents complete all three judgments (trust, cost, meaning) in under 30 seconds and enter the questions in a focused state.
The Trap of Understating Time
Saying "3 minutes" when it actually takes 8 makes respondents feel "bait-and-switched," and response quality drops too. Galesic & Bosnjak (2009) Effects of questionnaire length on participation and indicators of response quality demonstrated that understating time depresses both completion rate and quality. Show the measured median from pilot, plus 20% as a safe operating rule.
4. The Psychology of Progress Indicators — Bad Design Backfires
The progress indicator (progress bar / "n of m") is a powerful dropout-suppression device, but research has shown that poor design backfires.
Conrad et al. (2010) Experiment
Conrad, Couper, Tourangeau & Peytchev (2010) The impact of progress indicators on task completion compared three bar designs (fast first half / constant / fast second half):
- Fast first half: highest completion rate (respondents feel "almost done" in the second half)
- Constant: middle
- Fast second half: lowest completion rate (respondents feel "this is never ending" in the first half)
In other words, progress bars work when the design creates a sense of progress. With long surveys, placing simple items (demographics, frequency) early and heavy items (open-text, complex selection) later creates a "fast first half" experience.
Implementation Patterns
- Progress bar (visual): Shows e.g. "50% done" graphically. Most common.
- "n of m" indicator: States numbers explicitly. Suited to academic surveys.
- Section indicator: "Section 2 of 4" by chapter. Good for long-form.
- Hide: Surveys of 5 or fewer questions can omit it.
For surveys of 8–10 questions or fewer, deliberately hiding the progress bar can yield higher completion. When respondents know the survey is short, there's no benefit to belaboring progress.
5. Thank-You Page Design — Beyond "Thanks for Your Response"
The thank-you page appears for a few seconds right after completion. Most surveys end with a single "Thank you for your response," but adding three elements here meaningfully changes re-participation rate and the ongoing relationship.
Three Elements on the Thank-You Page
- ① Concrete Gratitude: For how many minutes' help, for what improvement
- ② Preview of Result Sharing: When and where results will be published (or whether individual feedback is available)
- ③ Next Action: Optional links to related resources, product pages, communities
Example
BAD example:
"Thank you for your response."
GOOD example:
"Thank you for your response.
Your 5 minutes gave us valuable input for our onboarding improvements. We'll publish the aggregate results on our blog in mid-June (email research@example.com if you'd like individual feedback).
If you have ideas for "I wish you had…" features, send them anytime via our feedback form."
When the value of cooperation becomes visible and there's a next touchpoint, respondents leave with a sense of "this was worth it." That's the foundation for future cooperation.
Honor Your Result-Sharing Promise
Once you've written "we'll publish results," you must actually publish. Breaking the promise costs you trust on every subsequent survey. The same applies to internal surveys: if you've previewed an internal summary, deliver it.
6. Follow-up Email Design
Sending a follow-up email within 24–72 hours of completion, on top of the thank-you page, lifts re-participation rate substantially. Singer & Ye (2013) The use and effects of incentives in surveys showed that beyond monetary incentives, expressed gratitude and result feedback have statistically significant effects on willingness to cooperate again.
Four Elements in a Follow-up Email
- ① Personal Thanks: Named gratitude (use a merge variable)
- ② What's the Next Step: When and where results will be shared
- ③ Preview of Improvement Action: What you plan to do with the feedback
- ④ Opt-out Path: A way to stop receiving future surveys
Template
Subject: [Onboarding Survey] Thank you for your response
Hi [First Name],
Thank you for completing our onboarding survey.
We deeply appreciate the 5 minutes you gave us.
[What's Next]
We'll publish the aggregate results on our blog in mid-June.
URL: https://example.com/blog/onboarding-survey-2026
[Preview of Improvement Action]
The most common feedback — that initial setup is confusing —
will be addressed in our June release with a setup wizard redesign.
If you'd prefer not to receive future surveys, you can opt out here:
[Unsubscribe Link]
Questions: research@example.com
Timing
- Within 24 hours: Express thanks while the experience is fresh. Highest effect.
- 48–72 hours: Normal operating range. For B2B, 48 hours is safe given business-day patterns.
- A week or later: Effect fades and the email shifts in purpose toward a reminder.
Don't conflate with reminders (resends to non-respondents). For reminder design, see Reminder Email Best Practices.
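The template above, with its merge variables, can be rendered per respondent before handing off to your delivery tool. This sketch uses the standard library's `string.Template`; the field names and unsubscribe URL pattern are assumptions for illustration, so adapt them to whatever your email tool expects.

```python
from string import Template

# Merge-variable template mirroring the follow-up email in section 6.
# Field names ($first_name etc.) are illustrative, not a fixed schema.
FOLLOW_UP = Template("""\
Subject: [Onboarding Survey] Thank you for your response

Hi $first_name,

Thank you for completing our onboarding survey.
We deeply appreciate the $minutes minutes you gave us.

[What's Next]
We'll publish the aggregate results on our blog in mid-June.
URL: $results_url

If you'd prefer not to receive future surveys, you can opt out here:
$unsubscribe_url

Questions: research@example.com
""")

def render_follow_up(respondent: dict) -> str:
    """Fill the merge variables for one respondent; raises KeyError if a field is missing."""
    return FOLLOW_UP.substitute(respondent)

email = render_follow_up({
    "first_name": "Alex",
    "minutes": 5,
    "results_url": "https://example.com/blog/onboarding-survey-2026",
    "unsubscribe_url": "https://example.com/unsubscribe?token=abc123",
})
print(email.splitlines()[2])  # → "Hi Alex,"
```

Using `substitute` rather than `safe_substitute` is deliberate: a missing merge field fails loudly instead of sending an email with a literal "$first_name" in it.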
7. Editorial Team's Review Checklist
Finally, the checklist the editorial team uses when reviewing copy design.
Intro Checks
- Purpose stated in 1–2 sentences
- Time displayed as measured value + 20%
- Anonymity / privacy handling stated explicitly
- Scope of result use stated
- Contact email listed
Progress Indicator Checks
- For 8 questions or fewer, considered hiding the bar
- Light items placed early, creating "fast first half"
- No designs that walk progress backward (via conditional branching)
Thank-You Page Checks
- Concrete gratitude (minutes, improvement use case)
- Result-sharing preview (when, where)
- Next action link (optional)
Follow-up Email Checks
- Sent within 24–72 hours of completion
- Personal thanks with merge variable
- Preview of improvement action
- Working opt-out path
8. Implementation in Kicue
Kicue provides the collection, aggregation, and export foundation for surveys. In the copy-design layer, the realistic implementation pattern is as follows.
Intro and Thank-You Page
Intro and thank-you page customization is available from the Kicue form settings. Prepare templates that satisfy the five and three required elements outlined above, and swap them in per survey.
Progress Indicator
Toggle the progress bar on or off in settings. For short surveys (8 questions or fewer), consider deliberately hiding it.
Follow-up Email Delivery
Kicue itself does not include an email delivery feature, so follow-up email sends are operated in combination with an external email delivery tool (Mailchimp / SendGrid / your own SMTP). Export the respondent list from Kicue (where you've captured email addresses), feed it into a CRM or email delivery tool, and control timing and merge variables there.
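One piece of this pipeline — selecting who is due for a follow-up from the exported list — can be sketched with the standard library. The column names (`email`, `completed_at`) are assumptions about the export format, so match them to the actual headers in your CSV; the inline string stands in for a file read.

```python
import csv
import io
from datetime import datetime, timedelta, timezone

def due_for_follow_up(csv_text: str, now: datetime,
                      min_h: int = 24, max_h: int = 72) -> list[str]:
    """Return emails of respondents who completed the survey 24-72 hours ago."""
    due = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        completed = datetime.fromisoformat(row["completed_at"])
        age = now - completed
        if timedelta(hours=min_h) <= age <= timedelta(hours=max_h):
            due.append(row["email"])
    return due

# Illustrative export: column names are assumed, not a documented schema.
export = """email,completed_at
a@example.com,2026-06-01T09:00:00+00:00
b@example.com,2026-06-03T09:00:00+00:00
"""
now = datetime(2026, 6, 3, 12, 0, tzinfo=timezone.utc)
print(due_for_follow_up(export, now))  # → ['a@example.com'] (51h old; b@ is only 3h old)
```

The filtered list is what you would feed into the delivery tool's audience import, with timing and merge variables controlled there.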
Result Sharing Operations
For the result sharing you previewed on the thank-you page, export aggregates from Kicue and shape them into a blog post, internal document, or PDF report. The basic statistics in Kicue's aggregation view (crosstabs, distributions) are usable as the primary data for external publication.
Copy design tends to get less attention than question design. Yet for completion rate and re-participation rate, its impact is often larger — research and practice agree. Designing the intro, progress indicator, thank-you page, and follow-up email as one continuous experience lets respondents leave with a sense of "this was worth it," and that becomes the foundation for cooperation going forward.
When you hit the wall of "I designed the questions carefully but completion rate isn't moving," start by revisiting the five required elements of the intro. One hour of improvement compounds across months of operation.
References (6)
- Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method (4th ed.). Wiley. https://www.wiley.com/en-us/Internet%2C+Phone%2C+Mail%2C+and+Mixed+Mode+Surveys%3A+The+Tailored+Design+Method%2C+4th+Edition-p-9781118456149
- Conrad, F. G., Couper, M. P., Tourangeau, R., & Peytchev, A. (2010). The impact of progress indicators on task completion. Interacting with Computers, 22(5), 417-427. https://doi.org/10.1016/j.intcom.2010.03.001
- Cook, C., Heath, F., & Thompson, R. L. (2000). A meta-analysis of response rates in web- or internet-based surveys. Educational and Psychological Measurement, 60(6), 821-836. https://doi.org/10.1177/00131640021970934
- Galesic, M., & Bosnjak, M. (2009). Effects of questionnaire length on participation and indicators of response quality in a web survey. Public Opinion Quarterly, 73(2), 349-360. https://doi.org/10.1093/poq/nfp031
- Singer, E., & Ye, C. (2013). The use and effects of incentives in surveys. The ANNALS of the American Academy of Political and Social Science, 645(1), 112-141. https://doi.org/10.1177/0002716212458082
- The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. (1979). The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research. U.S. Department of Health and Human Services. https://www.hhs.gov/ohrp/regulations-and-policy/belmont-report/index.html
If you want to set up the foundation for completion-driving copy design, try the free survey tool Kicue. From intro and thank-you page customization, to progress bar display control, to follow-up email operation in combination with an external email tool via raw CSV export — design and operate all four pieces of the response experience from a single account.
