Mixed methods data collection using simulated Google results: reflections on the methods of a point-of-selection behaviour study
01 Jan 2020-Information Research: An International Electronic Journal (University of Boras, Faculty of Librarianship, Information, Education and IT)-Vol. 25, Iss: 4
TL;DR: This paper reflects on the data collection methods and highlights opportunities for data analysis, combining data on participants' behaviour, thoughts and characteristics to provide a more complete picture of the factors influencing online resource selection.
Abstract: Introduction. A multi-institutional, grant-funded project employed mixed methods to study 175 fourth-grade through graduate school students' point-of-selection behaviour. The method features the use of simulated search engine results pages to facilitate data collection. Method. Student participants used simulated Google results pages to select resources for a hypothetical school project. Quantitative data on participants' selection behaviour and qualitative data from their think-aloud protocols were collected. A questionnaire and interviews were used to collect data on participants' backgrounds and online research experiences. Analysis. This paper reflects on the data collection methods and highlights opportunities for data analysis. The ability to analyse data both qualitatively and quantitatively increases the rigour and depth of findings. Results. The simulation created a realistic yet controlled environment that ensures the comparability of data within and across a wide range of educational stages. Combining data on participants' behaviour, thoughts and characteristics provides a more complete picture of factors influencing online resource selection. Conclusions. Using simulated results pages in combination with multiple data collection methods enables analyses that create deeper knowledge of participants' information behaviour. Such a complicated research design requires extensive time, expertise and coordination to execute.
Citations
04 Nov 2021
TL;DR: This paper showed that students cannot accurately identify information containers when they rely on heuristics such as the URL and the Google snippet, and that accurate container identification requires thoughtful engagement with information resources and critical evaluation of the sources that produced them.
Abstract: To combat misinformation, librarians can teach students to evaluate containers and the indicators of credibility that they provide. Information containers can be evaluated prior to consuming information within a resource, while fact-checking only can happen after. Because of this, container evaluation can help prevent misinformation from being encoded. Our research demonstrates that this requires thoughtful engagement with the information resources and critical evaluation of the sources that produced them, and that students cannot accurately identify containers when they rely on heuristics like the URL and Google snippet.
1 citation
TL;DR: This paper assesses the realism of a behavioural simulation used to study the evaluation behaviour of 175 students from fourth grade through graduate school, and finds that a thoughtfully designed simulation can elicit naturalistic behaviour when the controlled environment is designed to be realistic in meaningful ways.
Abstract: A challenge of studying information-seeking behaviour in open web systems is the unpredictability of those systems. One solution to this issue is employing a simulation to ensure experimental control. However, concerns arise over the realism of such an environment. This paper assesses the realism of a behavioural simulation used to study the evaluation behaviour of 175 students from fourth grade through graduate school. We assess realism by examining targeted participant feedback about what would have made the simulated environment and tasks more realistic to these participants. Based on this feedback, we reflect on decisions made in designing the simulation and offer recommendations for future studies interested in incorporating behavioural simulation in their research design. We find that a thoughtfully designed simulation can elicit naturalistic behaviour when the controlled environment is designed to be realistic in meaningful ways. Because the simulation does not have to match reality perfectly to elicit these behaviours, designing a simulation that is real enough is an effective method for studying information-seeking behaviour.