Below are some tips, based on empirical research, that can boost survey response rates.
Use multiple contacts
- To optimize this approach, include a pre-notice, an invitation, at least two reminders, and a thank-you.
Keep it short
- The shorter the better.
- If the survey will take less than 20 minutes to complete, tell your respondents so.
Incentives
- The research around incentives is mixed.
- Research has not supported the use of post-paid incentives (i.e., those given after completing a survey).
- However, if you do use post-paid incentives, they should be enticing and something everyone can use.
- An iPod, for example, may not be an incentive for someone who already has one.
- It may also help to offer prizes with both long and short odds: some people are enticed by a small prize they have a better chance of winning, while others are enticed by a large prize they have less chance of winning.
Make the survey salient
- People are more likely to complete a survey if the topic is important to them. A parking survey will likely have a higher response rate than a health survey.
- If the topic itself isn’t salient, salience can be increased by telling students that they are part of a selected sample or that they are one of the few asked for their opinion.
Blank subject line
- Using a blank subject line can increase response rates, but only for topics that respondents do not find interesting.
Ensure confidentiality
- While the research isn’t clear on how much ensuring confidentiality affects responses, when confidentiality can be ensured it seems worth telling respondents, as it may increase response rates.
Requests for help
- Just say, “we need your help.”
Official sponsorship
- While most of the research on official sponsorship concerns sponsorship by state and federal governments, it seems logical that the effect would also apply to surveys sponsored by a college or university.
- Sponsorship can be indicated by the use of a logo, letterhead, or the signature of a high-ranking college official such as a president or vice president.
Provide explicit deadlines
- There have been mixed results regarding the impact of providing deadlines for response.
- Since deadlines are a simple thing to employ, it would probably be advantageous to include them.
Personalize correspondence
- There have been mixed results regarding whether personalized correspondence increases response rates.
- Given the ease of personalizing correspondence, it seems worth doing.
Specific institutional characteristics
- Each institution is different, so what works on one campus may not work on another. It is important to know your students and what will work for them. If you don’t know, ask them.
References
Crawford, S. D., Couper, M. P., and Lamias, M. J. (2001). Web surveys: Perceptions of burden. Social Science Computer Review, 19(2), 146-162.
Dillman, D. A. (2000). Mail and Internet surveys: A tailored design method. New York: John Wiley and Sons.
Fowler, F., Jr. (1995). Improving survey questions: Design and evaluation (Applied Social Research Methods Series, Volume 38). Thousand Oaks: SAGE Publications.
Fraenkel, J. R., and Wallen, N. E. (2003). How to design and evaluate research in education (5th ed.). New York: McGraw-Hill.
Groves, R. M., Singer, E., and Corning, A. (2000). Leverage-saliency theory of survey participation. Public Opinion Quarterly, 64, 299-308.
Porter, S. R. and Whitcomb, M. E. (2005). E-mail subject lines and their effect on web survey viewing and response. Social Science Computer Review, 23(3), 380-387.
Porter, S. R. (Ed.). (2004). Overcoming survey problems. New Directions for Institutional Research, 121.
Sax, L. J., Gilmartin, S. K., and Bryant, A. N. (2003). Assessing response rates and nonresponse bias in web and paper surveys. Research in Higher Education, 44(4), 409-432.
Singer, E. (2002). The use of incentives to reduce nonresponse in household surveys. In Groves, Dillman, Eltinge, and Little (Eds.), Survey Nonresponse. New York: Wiley.
Umbach, P. D. (Ed.). (2005). Survey research: Emerging issues. New Directions for Institutional Research, 127.