Journal ArticleDOI

Data Quality in PC and Mobile Web Surveys

TL;DR: This article compares data quality between two self-administered web survey modes, surveys completed on a personal computer and surveys completed on a mobile phone, and finds that mobile web was associated with a lower completion rate, shorter open answers, and similar levels of socially undesirable and non-substantive responses.
Abstract
The considerable growth in the number of smart mobile devices with a fast Internet connection provides new challenges for survey researchers. In this article, I compare the data quality between two survey modes: self-administered web surveys conducted via personal computer and those conducted via mobile phones. Data quality is compared based on five indicators: (a) completion rates, (b) response order effects, (c) social desirability, (d) non-substantive responses, and (e) length of open answers. I hypothesized that mobile web surveys would result in lower completion rates, stronger response order effects, and less elaborate answers to open-ended questions. No difference was expected in the level of reporting in sensitive items and in the rate of non-substantive responses. To test the assumptions, an experiment with two survey modes was conducted using a volunteer online access panel in Russia. As expected, mobile web was associated with a lower completion rate, shorter length of open answers, and similar level of socially undesirable and non-substantive responses. However, no stronger primacy effects in mobile web survey mode were found.


Citations
Journal ArticleDOI

Improving response rates and evaluating nonresponse bias in surveys: AMEE Guide No. 102

TL;DR: This AMEE Guide explains response rate calculations and discusses methods for improving response rates to surveys as a whole and to questions within a survey (item nonresponse).
Journal ArticleDOI

Comparison of self-administered survey questionnaire responses collected using mobile apps versus other methods

TL;DR: To assess the impact of smartphone and tablet apps as a delivery mode on the quality of survey questionnaire responses, 14 studies comparing the electronic delivery of self-administered questionnaires via a smartphone or tablet app with any other delivery mode are reviewed.
Journal ArticleDOI

The Use of PCs, Smartphones, and Tablets in a Probability-Based Panel Survey

TL;DR: Measurement error is higher on tablets and smartphones than on PCs, and this difference is associated with self-selection of the sample into using a particular device.
Journal ArticleDOI

Comparison of Smartphone and Online Computer Survey Administration

TL;DR: Mobile survey responses are sensitive to the presentation of frequency scales and the size of open-ended text boxes, just as responses in other survey modes are, which may open the possibility for multimode (mobile and online computer) surveys.
Journal ArticleDOI

Why Do Web Surveys Take Longer on Smartphones?

TL;DR: Using multilevel models, a secondary analysis of student surveys finds that much of the time difference can be accounted for by the additional scrolling required on mobile devices, especially for grid questions.
References
Journal ArticleDOI

A Meta-Analysis of Response Rates in Web- or Internet-Based Surveys

TL;DR: This meta-analysis explores factors associated with higher response rates in electronic surveys, drawing on both published and unpublished research, and concludes that response representativeness is more important than response rate in survey research.
Journal ArticleDOI

Sensitive questions in surveys

TL;DR: The article reviews the research done by survey methodologists on reporting errors in surveys on sensitive topics, noting parallels and differences from the psychological literature on social desirability.
Journal ArticleDOI

Asking sensitive questions: The impact of data collection mode, question format, and question context

TL;DR: The authors compared three methods of collecting data about sexual behaviors and other sensitive topics: computer-assisted personal interviewing (CAPI), computer-assisted self-administered interviewing (CASI), and audio computer-assisted self-interviewing (ACASI), using an area probability sample of more than 300 adults in Cook County, Illinois.
Journal ArticleDOI

Social Desirability Bias in CATI, IVR, and Web Surveys: The Effects of Mode and Question Sensitivity

TL;DR: The authors examined the effect of different modes of self-administration on the reporting of potentially sensitive information by a sample of university graduates, and found that both the mode of data collection and the respondent's actual status influenced whether respondents found an item sensitive.
Journal ArticleDOI

Web Surveys versus Other Survey Modes: A Meta-Analysis Comparing Response Rates

TL;DR: The authors conducted a meta-analysis of 45 published and unpublished experimental comparisons between web and other survey modes and found that, on average, web surveys yield an 11% lower response rate than other modes (the 95% confidence interval ranges from 6% to 15% to the disadvantage of the web mode).