Survey Fatigue—Literature Search and Analysis of Implications for Student Affairs Policies and Practices

Authors

  • Barry Fass-Holmes, University of California, San Diego

Keywords

non-response, participation, response rate, student affairs, students, survey, survey fatigue, undergraduates

Abstract

Undergraduates’ low participation/response rates on interdisciplinary (campus-wide) and disciplinary (department-specific) surveys have been attributed to survey fatigue. To investigate the merits of this attribution, the present study conducted a systematic literature search (five electronic databases plus one search engine) and critiqued the findings of the relevant publications it returned. This study found that (a) survey fatigue has not been rigorously defined, (b) the number of relevant peer-reviewed publications is unexpectedly small, and (c) their findings are contradictory. These results have implications for policies and practices that restrict undergraduate survey administrations in an effort to minimize survey fatigue and boost participation/response rates. The present report recommends improving undergraduates’ participation/response rates by requiring instruction about surveys combined with assessments of student learning outcomes.


Author Biography

  • Barry Fass-Holmes, University of California, San Diego

    BARRY FASS-HOLMES, PhD, is the Analytical Studies Coordinator for the International Students & Programs Office at the University of California, San Diego. His research interests include international students’ academic achievement, academic integrity, and survey participation. Email: bfholmes@ucsd.edu

Published

2022-03-28

How to Cite

Survey Fatigue—Literature Search and Analysis of Implications for Student Affairs Policies and Practices. (2022). Journal of Interdisciplinary Studies in Education, 11(1), 56–73. https://ojed.org/jise/article/view/3262