Journal Article DOI: 10.1080/13658816.2020.1802032

Reproducibility and replicability: opportunities and challenges for geospatial research

04 Mar 2021, International Journal of Geographical Information Science (Taylor & Francis), Vol. 35, Iss. 3, pp. 427-445
Abstract: A cornerstone of the scientific method, the ability to reproduce and replicate the results of research has gained widespread attention across the sciences in recent years. A corresponding burst of ...


Topics: Geospatial analysis (55%)
Citations

10 results found


Journal Article DOI: 10.1080/15230406.2020.1830856
Abstract: This paper proposes elevation models to promote, evaluate, and compare various terrain representation techniques. Our goal is to increase the reproducibility of terrain rendering algorithms and tec...


Topics: Terrain rendering (66%), Terrain (61%), Digital elevation model (61%), ...

8 Citations


Open access Journal Article DOI: 10.1371/JOURNAL.PONE.0255259
Zhenlong Li, Xiao Huang, Tao Hu, Huan Ning, +3 more (5 institutions)
05 Aug 2021, PLOS ONE
Abstract: In response to the soaring needs of human mobility data, especially during disaster events such as the COVID-19 pandemic, and the associated big data challenges, we develop a scalable online platform for extracting, analyzing, and sharing multi-source multi-scale human mobility flows. Within the platform, an origin-destination-time (ODT) data model is proposed to work with scalable query engines to handle heterogenous mobility data in large volumes with extensive spatial coverage, which allows for efficient extraction, query, and aggregation of billion-level origin-destination (OD) flows in parallel at the server-side. An interactive spatial web portal, ODT Flow Explorer, is developed to allow users to explore multi-source mobility datasets with user-defined spatiotemporal scales. To promote reproducibility and replicability, we further develop ODT Flow REST APIs that provide researchers with the flexibility to access the data programmatically via workflows, codes, and programs. Demonstrations are provided to illustrate the potential of the APIs integrating with scientific workflows and with the Jupyter Notebook environment. We believe the platform coupled with the derived multi-scale mobility data can assist human mobility monitoring and analysis during disaster events such as the ongoing COVID-19 pandemic and benefit both scientific communities and the general public in understanding human mobility dynamics.


Topics: Big data (51%)

2 Citations
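The entry above describes programmatic access to origin-destination (OD) flows through the ODT Flow REST APIs. The sketch below shows how such an API might be queried from Python and the returned records aggregated client-side; the base URL, endpoint path, parameter names, and response fields are placeholders chosen for illustration, not the platform's actual interface.

```python
# A minimal sketch of querying an OD-flow REST endpoint from Python and
# aggregating the result. The base URL, endpoint path, parameter names, and
# response fields below are placeholders for illustration only; they are not
# the actual ODT Flow API specification.
import requests
from collections import defaultdict

BASE_URL = "https://example.org/odtflow/api"  # placeholder, not the real service


def fetch_od_flows(origin_scale, dest_scale, start, end):
    """Request origin-destination-time (ODT) records for a date range."""
    params = {
        "o_scale": origin_scale,  # assumed parameter name, e.g. "county"
        "d_scale": dest_scale,    # assumed parameter name, e.g. "county"
        "start": start,           # ISO date string, e.g. "2020-03-01"
        "end": end,
    }
    resp = requests.get(f"{BASE_URL}/flows", params=params, timeout=60)
    resp.raise_for_status()
    # Assume a JSON list of {"origin", "dest", "date", "count"} records.
    return resp.json()


def total_outflow_by_origin(records):
    """Sum the returned OD counts into total outflow per origin unit."""
    totals = defaultdict(int)
    for rec in records:
        totals[rec["origin"]] += rec["count"]
    return dict(totals)


if __name__ == "__main__":
    records = fetch_od_flows("county", "county", "2020-03-01", "2020-03-07")
    print(total_outflow_by_origin(records))
```

The same requests could equally be issued from a Jupyter Notebook cell or wrapped in a workflow step, which is the integration the paper demonstrates.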


Open access Journal Article DOI: 10.1080/19475683.2021.1903996
Abstract: We discuss the nature of processes relating to human behaviour and how to model such processes when they vary over space. In so doing, we describe the role of local modelling and how the bandwidth ...


Topics: Bandwidth (computing) (56%), Human geography (51%)

2 Citations
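The bandwidth discussed in the entry above is the parameter that controls how local a spatially varying model is. As a generic illustration, not code from the cited paper, a Gaussian distance-decay kernel of the kind used in geographically weighted models might look like this:

```python
# Illustration only: a Gaussian distance-decay kernel of the kind used in local
# (e.g. geographically weighted) models. The bandwidth b controls how quickly an
# observation's influence fades with distance; this is a generic textbook form,
# not code from the cited paper.
import numpy as np


def gaussian_weights(distances, bandwidth):
    """Weight each observation by exp(-0.5 * (d / b)^2)."""
    d = np.asarray(distances, dtype=float)
    return np.exp(-0.5 * (d / bandwidth) ** 2)


# A small bandwidth makes the fit highly local; a very large one approaches a
# global (non-spatially-varying) model.
print(gaussian_weights([0.0, 1.0, 5.0], bandwidth=2.0))
```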


Journal Article DOI: 10.1073/PNAS.2015759118
Michael F. Goodchild, Wenwen Li (2 institutions)
Abstract: Replicability takes on special meaning when researching phenomena that are embedded in space and time, including phenomena distributed on the surface and near surface of the Earth. Two principles, spatial dependence and spatial heterogeneity, are generally characteristic of such phenomena. Various practices have evolved in dealing with spatial heterogeneity, including the use of place-based models. We review the rapidly emerging applications of artificial intelligence to phenomena distributed in space and time and speculate on how the principle of spatial heterogeneity might be addressed. We introduce a concept of weak replicability and discuss possible approaches to its measurement.


1 Citation
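Spatial dependence, one of the two principles named in the abstract above, is commonly quantified with Moran's I. The snippet below is a generic, illustrative implementation and is not drawn from the cited paper.

```python
# Generic, illustrative implementation of Moran's I, a standard index of the
# spatial dependence referred to in the abstract; it is not drawn from the
# cited paper.
import numpy as np


def morans_i(values, weights):
    """Moran's I for a 1-D array of values and an n x n spatial weights matrix."""
    x = np.asarray(values, dtype=float)
    w = np.asarray(weights, dtype=float)
    n = x.size
    z = x - x.mean()                 # deviations from the mean
    numerator = n * (z @ w @ z)      # n * sum_i sum_j w_ij * z_i * z_j
    denominator = w.sum() * (z @ z)  # S0 * sum_i z_i^2
    return numerator / denominator


# Example: four locations on a line with binary contiguity ("rook") neighbours.
vals = [1.0, 2.0, 2.5, 4.0]
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
print(morans_i(vals, W))  # positive value: similar values cluster in space
```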



References

60 results found


Journal Article DOI: 10.1177/030631289019003001
Abstract: Scientific work is heterogeneous, requiring many different actors and viewpoints. It also requires cooperation. The two create tension between divergent viewpoints and the need for generalizable findings. We present a model of how one group of actors managed this tension. It draws on the work of amateurs, professionals, administrators and others connected to the Museum of Vertebrate Zoology at the University of California, Berkeley, during its early years. Extending the Latour-Callon model of interessement, two major activities are central for translating between viewpoints: standardization of methods, and the development of 'boundary objects'. Boundary objects are both adaptable to different viewpoints and robust enough to maintain identity across them. We distinguish four types of boundary objects: repositories, ideal types, coincident boundaries and standardized forms.


6,999 Citations


Journal Article DOI: 10.1037/0033-2909.86.3.638
Robert Rosenthal (1 institution)
Abstract: For any given research area, one cannot tell how many studies have been conducted but never reported. The extreme view of the "file drawer problem" is that journals are filled with the 5% of the studies that show Type I errors, while the file drawers are filled with the 95% of the studies that show nonsignificant results. Quantitative procedures for computing the tolerance for filed and future null results are reported and illustrated, and the implications are discussed.


Topics: Poison control (51%)

6,443 Citations
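The "quantitative procedures for computing the tolerance for filed and future null results" mentioned above are usually summarized as the fail-safe N: the number of unpublished null studies (Z = 0) that would pull a Stouffer-combined Z below the one-tailed p = .05 threshold. A minimal sketch, using the formula commonly attributed to this paper:

```python
# A minimal sketch of the "fail-safe N" idea: how many unpublished null results
# (Z = 0) could sit in file drawers before a Stouffer-combined Z drops below the
# one-tailed p = .05 threshold (Z = 1.645). The formula X = (sum of Z)^2 / 2.706 - k
# is the one commonly attributed to this paper.


def fail_safe_n(z_scores, z_crit=1.645):
    """Tolerable number of additional null studies before significance is lost."""
    k = len(z_scores)
    z_sum = sum(z_scores)
    return (z_sum ** 2) / (z_crit ** 2) - k


# Example: five studies, each with Z = 2.0.
print(fail_safe_n([2.0] * 5))  # roughly 32 additional filed null studies
```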


Open access
15 Aug 2006
Abstract: There is increasing concern that most current published research findings are false. The probability that a research claim is true may depend on study power and bias, the number of other studies on the same question, and, importantly, the ratio of true to no relationships among the relationships probed in each scientific field. In this framework, a research finding is less likely to be true when the studies conducted in a field are smaller; when effect sizes are smaller; when there is a greater number and lesser pre-selection of tested relationships; where there is greater flexibility in designs, definitions, outcomes, and analytical modes; when there is greater financial and other interest and prejudice; and when more teams are involved in a scientific field in chase of statistical significance. Simulations show that for most study designs and settings, it is more likely for a research claim to be false than true. Moreover, for many current scientific fields, claimed research findings may often be simply accurate measures of the prevailing bias. In this essay, I discuss the implications of these problems for the conduct and interpretation of research.


5,003 Citations
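The framework the abstract above summarizes is usually expressed through the positive predictive value (PPV) of a claimed finding, written in terms of the prior odds R of a true relationship, the Type I error rate alpha, and the Type II error rate beta. A worked, no-bias version of that calculation; bias and multiple competing teams extend the expression and are omitted here:

```python
# A worked, no-bias version of the framework the abstract summarizes: the
# post-study probability that a claimed finding is true (its positive predictive
# value, PPV) as a function of the prior odds R of a true relationship, the
# Type I error rate alpha, and the Type II error rate beta. Bias and multiple
# competing teams extend this expression; they are omitted here.


def positive_predictive_value(R, alpha=0.05, beta=0.20):
    """PPV = (1 - beta) * R / (R - beta * R + alpha)."""
    return (1 - beta) * R / (R - beta * R + alpha)


# Example: a field where roughly 1 in 10 probed relationships is true (R = 1/9),
# tested at alpha = 0.05 with 80% power.
print(positive_predictive_value(R=1 / 9))  # approximately 0.64
```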


Open access Journal Article DOI: 10.1038/SDATA.2016.18
15 Mar 2016, Scientific Data
Abstract: There is an urgent need to improve the infrastructure supporting the reuse of scholarly data. A diverse set of stakeholders—representing academia, industry, funding agencies, and scholarly publishers—have come together to design and jointly endorse a concise and measureable set of principles that we refer to as the FAIR Data Principles. The intent is that these may act as a guideline for those wishing to enhance the reusability of their data holdings. Distinct from peer initiatives that focus on the human scholar, the FAIR Principles put specific emphasis on enhancing the ability of machines to automatically find and use the data, in addition to supporting its reuse by individuals. This Comment is the first formal publication of the FAIR Principles, and includes the rationale behind them, and some exemplar implementations in the community.


Topics: Guiding Principles (57%), Data curation (52%), Stewardship (51%)

4,666 Citations


Performance Metrics
No. of citations received by the paper in previous years:
Year    Citations
2021    10
Network Information
Related Papers (5)
The New Reality of Reproducibility: The Role of Data Work in Scientific Research (28 May 2020)

Melanie Feinberg, Will Sutherland +3 more

70% related
Leveraging Semantics to Improve Reproducibility in Scientific Workflows (01 Jan 2014)

Idafen Santana-Perez, Rafael Ferreira da Silva +4 more

65% related
#EEGManyLabs: investigating the replicability of influential EEG experiments (02 Apr 2021, Cortex)

Yuri G. Pavlov, Nika Adamian +57 more

61% related
The Road Towards Reproducibility in Science: The Case of Data Citation (26 Jan 2017)

Nicola Ferro, Gianmaria Silvello

61% related