Drop out Factors in Data Literacy and Research Data Management Survey: Experiences from Lithuania and Finland
Jurgita Rudžionienė
Vilnius University
Vincas Grigas
Vilnius University
https://orcid.org/0000-0003-2414-6277
Heidi Enwald
University of Oulu
https://orcid.org/0000-0003-1953-2157
Terttu Kortelainen
University of Oulu
https://orcid.org/0000-0001-8407-1380
Published 2018-12-28
https://doi.org/10.15388/Im.2018.82.8

Keywords

survey
research data management
information literacy
Vilnius University (Lithuania)
University of Oulu (Finland)

How to Cite

Rudžionienė, J., Grigas, V., Enwald, H. and Kortelainen, T. (2018) “Drop out Factors in Data Literacy and Research Data Management Survey: Experiences from Lithuania and Finland”, Informacijos mokslai, 82, pp. 115-130. doi: 10.15388/Im.2018.82.8.

Abstract

[full article, abstract in English; abstract in Lithuanian]

The purpose of this paper is to develop an understanding of the factors that lead respondents to drop out of a research data management survey they have already started. As a case study, we took a questionnaire-based survey on research data management conducted at Vilnius University and the University of Oulu in 2017. The data for the analysis were collected using the questionnaire employed in the multinational Data Literacy and Research Data Management study, carried out by a group of researchers in more than ten countries and initiated by Serap Kurbanoğlu and Joumana Boustany. This paper analyzes 1 185 survey responses from both universities, of which 515 were unfinished and 670 finished. The analysis draws on the Framework for Web Survey Participation created by Andy Peytchev (2009). The collected data were analyzed using IBM SPSS Statistics ver. 19 with descriptive and inferential statistical tests. The most significant factors in the decision not to finish the survey were its length, the respondent's scientific field, experience and age, and the topic of the survey. No statistically significant difference between those who finished the survey and those who did not was found for gender or job position. The design of the survey was also an important factor in non-completion.
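The kind of inferential test described above can be illustrated with a short sketch. The counts below are hypothetical (chosen only to sum to the paper's 670 finished and 515 unfinished responses), and the paper's actual analysis was done in SPSS; this is merely a minimal chi-square test of independence between completion status and a respondent attribute such as gender.

```python
# Illustrative sketch, NOT the authors' SPSS output: a Pearson chi-square
# test of independence between survey completion and gender.

def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: female, male; columns: finished, dropped out (hypothetical counts
# summing to the paper's totals of 670 finished and 515 unfinished)
table = [[340, 260],
         [330, 255]]

stat = chi_square_2x2(table)
# Compare against the 5% critical value for 1 degree of freedom (3.841):
print(f"chi2 = {stat:.3f}; significant at 0.05: {stat > 3.841}")
```

With counts this evenly distributed the statistic stays far below the critical value, which is consistent with the paper's finding of no significant completion difference by gender.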


References

ARMSTRONG, J. Scott and OVERTON, Terry S., 1977. Estimating Nonresponse Bias in Mail Surveys. Journal of Marketing Research [online]. August 1977. Vol. 14, no. 3, p. 396. [Accessed 24 July 2018]. DOI 10.2307/3150783. Available from: https://www.jstor.org/stable/3150783?origin=crossref
BANKS, Marcus and ZEITLYN, David, 2015. Visual methods in social research. 2nd ed. Los Angeles, Calif.: SAGE. ISBN 9781446269756.
BARUCH, Yehuda, 1999. Response Rate in Academic Studies: A Comparative Analysis. Human Relations. 1999. DOI 10.1177/001872679905200401.
BOSNJAK, M., POGGIO, T., BECKER, K. R., FUNKE, F., WACHENFELD, A., FISCHER, B., 2013. Online survey participation via mobile devices. In: The American Association for Public Opinion Research (AAPOR) 68th Annual Conference. 2013.
BOSNJAK, Michael and TUTEN, Tracy L., 2001. Classifying Response Behaviors in Web-based Surveys. Journal of Computer-Mediated Communication [online]. 23 June 2001. Vol. 6, no. 3, p. 0–0. [Accessed 24 July 2018]. DOI 10.1111/j.1083-6101.2001.tb00124.x. Available from: https://academic.oup.com/jcmc/article/4584257
CHAIKEN, Shelly, 1980. Heuristic versus systematic information processing and the use of source versus message cues in persuasion. Journal of Personality and Social Psychology [online]. 1980. Vol. 39, no. 5, p. 752–766. [Accessed 24 July 2018]. DOI 10.1037/0022-3514.39.5.752. Available from: http://doi.apa.org/getdoi.cfm?doi=10.1037/0022-3514.39.5.752
CHUDOBA, Brent, 2018. How much time are respondents willing to spend on your survey? | SurveyMonkey. [online]. 2018. [Accessed 29 July 2018]. Available from: https://www.surveymonkey.com/curiosity/survey_completion_times/
COUPER, Mick P. and KREUTER, Frauke, 2013. Using paradata to explore item level response times in surveys. Journal of the Royal Statistical Society: Series A (Statistics in Society) [online]. January 2013. Vol. 176, no. 1, p. 271–286. [Accessed 26 July 2018]. DOI 10.1111/j.1467-985X.2012.01041.x. Available from: http://doi.wiley.com/10.1111/j.1467-985X.2012.01041.x
DE BRUIJNE, Marika and WIJNANT, Arnaud, 2014. Improving Response Rates and Questionnaire Design for Mobile Web Surveys. Public Opinion Quarterly [online]. 1 January 2014. Vol. 78, no. 4, p. 951–962. [Accessed 26 July 2018]. DOI 10.1093/poq/nfu046. Available from: https://academic.oup.com/poq/article-lookup/doi/10.1093/poq/nfu046
DILLMAN, Don A., SMYTH, Jolene D. and CHRISTIAN, Leah Melani, 2009. Internet, mail, and mixed-mode surveys: the tailored design method. 3rd ed. [online]. Wiley & Sons. [Accessed 27 July 2018]. ISBN 9780471698685. Available from: https://www.wiley.com/en-lt/Internet,+Mail,+and+Mixed+Mode+Surveys:+The+Tailored+Design+Method,+3rd+Edition-p-9780471698685
DING, Dan, POQUET, Oleksandra, WILLIAMS, Joseph Jay, NIKAM, Radhika and COX, Samuel Rhys, 2018. Increasing Response Rates to Email Surveys in MOOCs. In: Adjunct Publication of the 26th Conference on User Modeling, Adaptation and Personalization – UMAP ’18 [online]. New York, New York, USA: ACM Press. 2018. p. 203–206. [Accessed 24 July 2018]. ISBN 9781450357845. Available from: http://dl.acm.org/citation.cfm?doid=3213586.3225247
EDWARDS, Phil, ROBERTS, Ian, CLARKE, Mike, DIGUISEPPI, Carolyn, PRATAP, Sarah, WENTZ, Reinhard and KWAN, Irene, 2002. Increasing response rates to postal questionnaires: systematic review. BMJ (Clinical research ed.) [online]. 18 May 2002. Vol. 324, no. 7347, p. 1183. [Accessed 27 July 2018]. DOI 10.1136/BMJ.324.7347.1183. Available from: http://www.ncbi.nlm.nih.gov/pubmed/12016181
FAN, Weimiao and YAN, Zheng, 2010. Factors affecting response rates of the web survey: A systematic review. Computers in Human Behavior [online]. 1 March 2010. Vol. 26, no. 2, p. 132–139. [Accessed 5 April 2018]. DOI 10.1016/J.CHB.2009.10.015. Available from: https://www.sciencedirect.com/science/article/pii/S0747563209001708
FRICK, A., BÄCHTIGER, M. T. and REIPS, U.-D., 1999. Financial incentives, personal information and drop-out rate in online studies. In: Current Internet science. Trends, techniques, results.
FRICK, A., BÄCHTIGER, M. T. and REIPS, U.-D., 2001. Dimensions of Internet Science [online]. Lengerich: Pabst Science Publishers. [Accessed 27 July 2018]. ISBN 3935357524. Available from: https://www.researchgate.net/profile/Andrea_Frick/publication/290392761_Financial_incentives_personal_information_and_drop-out_in_online_studies/links/569e352d08ae950bd7a9479a.pdf
GALESIC, M. and BOSNJAK, M., 2009. Effects of Questionnaire Length on Participation and Indicators of Response Quality in a Web Survey. Public Opinion Quarterly [online]. 1 June 2009. Vol. 73, no. 2, p. 349–360. [Accessed 29 July 2018]. DOI 10.1093/poq/nfp031. Available from: https://academic.oup.com/poq/article-lookup/doi/10.1093/poq/nfp031
GALESIC, Mirta, 2006. Dropouts on the Web: Effects of Interest and Burden Experienced During an Online Survey. Journal of Official Statistics [online]. 2006. Vol. 22, no. 2, p. 313–328. [Accessed 27 July 2018]. Available from: http://www.scb.se/contentassets/ca21efb41fee47d293bbee5bf7be7fb3/dropouts-on-the-web-effects-of-interest-and-burden-experienced-during-an-online-survey.pdf
GANASSALI, Stéphane, 2008. The Influence of the Design of Web Survey Questionnaires on the Quality of Responses. Survey Research Methods [online]. 30 March 2008. Vol. 2, no. 1, p. 21–32. [Accessed 29 July 2018]. DOI 10.18148/srm/2008.v2i1.598. Available from: https://ojs.ub.uni-konstanz.de/srm/article/view/598
HEERWEGH, Dirk and LOOSVELDT, Geert, 2002. An Evaluation of the Effect of Response Formats on Data Quality in Web Surveys. Social Science Computer Review [online]. 18 November 2002. Vol. 20, no. 4, p. 471–484. [Accessed 28 July 2018]. DOI 10.1177/089443902237323. Available from: http://journals.sagepub.com/doi/10.1177/089443902237323
JOINSON, Adam N., WOODLEY, Alan and REIPS, Ulf-Dietrich, 2007. Personalization, authentication and self-disclosure in self-administered Internet surveys. Computers in Human Behavior [online]. 1 January 2007. Vol. 23, no. 1, p. 275–285. [Accessed 24 July 2018]. DOI 10.1016/J.CHB.2004.10.012. Available from: https://www.sciencedirect.com/science/article/pii/S0747563204001682
JONES, Sarah, PRYOR, Graham and WHYTE, Angus, 2013. How to Develop Research Data Management Services – a guide for HEIs. DCC How-to Guides [online]. 2013. Available from: http://www.dcc.ac.uk/resources/how-guides
KEUSCH, Florian, 2015. Why do people participate in Web surveys? Applying survey participation theory to Internet survey data collection. Management Review Quarterly [online]. 9 June 2015. Vol. 65, no. 3, p. 183–216. [Accessed 27 July 2018]. DOI 10.1007/s11301-014-0111-y. Available from: http://link.springer.com/10.1007/s11301-014-0111-y
KLOFSTAD, Casey A., BOULIANNE, Shelley and BASSON, Danna, 2008. Matching the Message to the Medium. Social Science Computer Review [online]. 18 November 2008. Vol. 26, no. 4, p. 498–509. [Accessed 27 July 2018]. DOI 10.1177/0894439308314145. Available from: http://journals.sagepub.com/doi/10.1177/0894439308314145
LENZNER, Timo, KACZMIREK, Lars and LENZNER, Alwine, 2010. Cognitive burden of survey questions and response times: A psycholinguistic experiment. Applied Cognitive Psychology [online]. October 2010. Vol. 24, no. 7, p. 1003–1020. [Accessed 27 July 2018]. DOI 10.1002/acp.1602. Available from: http://doi.wiley.com/10.1002/acp.1602
LIU, Mingnan and WRONSKI, Laura, 2018. Examining Completion Rates in Web Surveys via Over 25,000 Real-World Surveys. Social Science Computer Review [online]. 23 February 2018. Vol. 36, no. 1, p. 116–124. [Accessed 29 July 2018]. DOI 10.1177/0894439317695581. Available from: http://journals.sagepub.com/doi/10.1177/0894439317695581
LUGTIG, Peter and TOEPOEL, Vera, 2016. The Use of PCs, Smartphones, and Tablets in a Probability-Based Panel Survey. Social Science Computer Review [online]. 26 February 2016. Vol. 34, no. 1, p. 78–94. [Accessed 26 July 2018]. DOI 10.1177/0894439315574248. Available from: http://journals.sagepub.com/doi/10.1177/0894439315574248
MALHOTRA, Neil, 2008. Completion Time and Response Order Effects in Web Surveys [online]. 2008. Oxford University Press / American Association for Public Opinion Research. [Accessed 24 July 2018]. Available from: https://www.jstor.org/stable/25548051
MANFREDA, Katja Lozar, BERZELAK, Jernej, VEHOVAR, Vasja, BOSNJAK, Michael and HAAS, Iris, 2008. Web Surveys versus other Survey Modes: A Meta-Analysis Comparing Response Rates. International Journal of Market Research [online]. 1 January 2008. Vol. 50, no. 1, p. 79–104. [Accessed 26 July 2018]. DOI 10.1177/147078530805000107. Available from: http://journals.sagepub.com/doi/10.1177/147078530805000107
MATZAT, Uwe, SNIJDERS, Chris and VAN DER HORST, Wouter, 2009. Effects of Different Types of Progress Indicators on Drop-Out Rates in Web Surveys. Social Psychology [online]. 10 January 2009. Vol. 40, no. 1, p. 43–52. [Accessed 28 July 2018]. DOI 10.1027/1864-9335.40.1.43. Available from: http://econtent.hogrefe.com/doi/abs/10.1027/1864-9335.40.1.43
O’NEIL, Kevin M. and PENROD, Steven D., 2001. Methodological variables in Web-based research that may affect results: Sample type, monetary incentives, and personal information. Behavior Research Methods, Instruments, & Computers [online]. May 2001. Vol. 33, no. 2, p. 226–233. [Accessed 27 July 2018]. DOI 10.3758/BF03195369. Available from: http://www.springerlink.com/index/10.3758/BF03195369
O’NEIL, Kevin M., PENROD, Steven D. and BORNSTEIN, Brian H., 2003. Web-based research: Methodological variables’ effects on dropout and sample characteristics. Behavior Research Methods, Instruments, & Computers [online]. May 2003. Vol. 35, no. 2, p. 217–226. [Accessed 24 July 2018]. DOI 10.3758/BF03202544. Available from: http://www.springerlink.com/index/10.3758/BF03202544
OPEN SCIENCE AND RESEARCH INITIATIVE, 2018. Open Science and Research. [online]. 2018. Available from: https://openscience.fi/
PETTY, Richard E. and CACIOPPO, John T., 1984. The effects of involvement on responses to argument quantity and quality: Central and peripheral routes to persuasion. Journal of Personality and Social Psychology [online]. 1984. Vol. 46, no. 1, p. 69–81. [Accessed 24 July 2018]. DOI 10.1037/0022-3514.46.1.69. Available from: http://content.apa.org/journals/psp/46/1/69
PEYTCHEV, Andy, 2009. Survey Breakoff [online]. 2009. Oxford University Press / American Association for Public Opinion Research. [Accessed 24 July 2018]. Available from: https://www.jstor.org/stable/25548063
PORTER, Stephen R. and WHITCOMB, Michael E., 2003. The Impact of Contact Type on Web Survey Response Rates [online]. 2003. Oxford University Press / American Association for Public Opinion Research. [Accessed 24 July 2018]. Available from: https://www.jstor.org/stable/3521694
RATH, Jessica M., WILLIAMS, Valerie F., VILLANTI, Andrea C., GREEN, Molly P., MOWERY, Paul D. and VALLONE, Donna M., 2017. Boosting Online Response Rates Among Nonresponders. Social Science Computer Review [online]. 14 October 2017. Vol. 35, no. 5, p. 619–632. [Accessed 27 July 2018]. DOI 10.1177/0894439316656151. Available from: http://journals.sagepub.com/doi/10.1177/0894439316656151
SCHLOSSER, Stephan and MAYS, Anja, 2018. Mobile and Dirty. Social Science Computer Review [online]. 27 April 2018. Vol. 36, no. 2, p. 212–230. [Accessed 26 July 2018]. DOI 10.1177/0894439317698437. Available from: http://journals.sagepub.com/doi/10.1177/0894439317698437
SCHNEIDER, René, 2018. Training Trainers for Research Data Literacy: A Content- and Method-Oriented Approach. In: [online]. Springer, Cham. p. 139–147. [Accessed 26 July 2018]. Available from: http://link.springer.com/10.1007/978-3-319-74334-9_15
ŠPIRANEC, Sonja and KOS, Denis, 2018. Data Literacy and Research Data Management: The Croatian State of Affairs. In: [online]. Springer, Cham. p. 148–157. [Accessed 26 July 2018]. Available from: http://link.springer.com/10.1007/978-3-319-74334-9_16
STRASSER C., 2015. Research Data Management. A Primer Publication of the National Information Standards Organization [online]. 2015. Available from: http://wiki.lib.sun.ac.za/images/2/24/PrimerRDM-2015-0727.pdf
TUTEN, Tracy L., BOSNJAK, Michael and BANDILLA, Wolfgang, 1999. Banner-advertised Web surveys. Marketing Research [online]. 1999. Vol. 11, no. 4, p. 16. [Accessed 27 July 2018]. Available from: http://search.proquest.com/openview/d6285b57d5de719d0df82495ab456b84/1?pq-origsite=gscholar&cbl=31079
UNIVERSITY OF OULU, 2018. Research data policy at the University of Oulu. [online]. 2018. Available from: http://www.oulu.fi/university/node/44529
VINCENT, Clark E., 1964. Socioeconomic Status and Familial Variables in Mail Questionnaire Responses. American Journal of Sociology [online]. May 1964. Vol. 69, no. 6, p. 647–653. [Accessed 24 July 2018]. DOI 10.1086/223695. Available from: https://www.journals.uchicago.edu/doi/10.1086/223695
WELLS, Tom, BAILEY, Justin T. and LINK, Michael W., 2014. Comparison of Smartphone and Online Computer Survey Administration. Social Science Computer Review [online]. 7 April 2014. Vol. 32, no. 2, p. 238–255. [Accessed 26 July 2018]. DOI 10.1177/0894439313505829. Available from: http://journals.sagepub.com/doi/10.1177/0894439313505829
YAN, Ting, RYAN, Lindsay, BECKER, Sandra E and SMITH, Jacqui, 2015. Assessing Quality of Answers to a Global Subjective Well-being Question Through Response Times. Survey research methods [online]. 2015. Vol. 9, no. 2, p. 101–109. [Accessed 26 July 2018]. Available from: http://www.ncbi.nlm.nih.gov/pubmed/27398099
YAN, Ting and TOURANGEAU, Roger, 2008. Fast times and easy questions: the effects of age, experience and question complexity on web survey response times. Applied Cognitive Psychology [online]. January 2008. Vol. 22, no. 1, p. 51–68. [Accessed 27 July 2018]. DOI 10.1002/acp.1331. Available from: http://doi.wiley.com/10.1002/acp.1331
ZHANG, Chan and CONRAD, Frederick, 2014. Speeding in Web Surveys: The tendency to answer very fast and its association with straightlining. Survey Research Methods [online]. 23 July 2014. Vol. 8, no. 2, p. 127–135. [Accessed 24 July 2018]. DOI 10.18148/srm/2014.v8i2.5453. Available from: https://ojs.ub.uni-konstanz.de/srm/article/view/5453

This work is licensed under a Creative Commons Attribution 4.0 International License.

