Author

Tristan Glatard

Bio: Tristan Glatard is an academic researcher from Concordia University. The author has contributed to research on the topics of Grid and Workflow. The author has an h-index of 30 and has co-authored 182 publications receiving 4,010 citations. Previous affiliations of Tristan Glatard include Argonne National Laboratory and Claude Bernard University Lyon 1.


Papers
Journal Article · DOI
TL;DR: The Brain Imaging Data Structure (BIDS) is a standard for organizing and describing MRI datasets; it uses file formats compatible with existing software, unifies the majority of practices already common in the field, and captures the metadata necessary for most common data processing operations.
Abstract: The development of magnetic resonance imaging (MRI) techniques has defined modern neuroimaging. Since its inception, tens of thousands of studies using techniques such as functional MRI and diffusion weighted imaging have allowed for the non-invasive study of the brain. Despite the fact that MRI is routinely used to obtain data for neuroscience research, there has been no widely adopted standard for organizing and describing the data collected in an imaging experiment. This renders sharing and reusing data (within or between labs) difficult if not impossible and unnecessarily complicates the application of automatic pipelines and quality assurance protocols. To solve this problem, we have developed the Brain Imaging Data Structure (BIDS), a standard for organizing and describing MRI datasets. The BIDS standard uses file formats compatible with existing software, unifies the majority of practices already common in the field, and captures the metadata necessary for most common data processing operations.

1,037 citations
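
As a rough illustration of the kind of layout the BIDS entry above describes, the sketch below builds a minimal BIDS-style dataset (one subject, one anatomical scan with a JSON sidecar) and reads the metadata back. The file names follow common BIDS conventions, but the directory name and field values are made up for the example; the specification itself is the authoritative reference.

```python
# Minimal sketch of a BIDS-style layout: one subject with a T1-weighted image
# and its JSON sidecar. Names follow common BIDS conventions
# (sub-<label>/anat/sub-<label>_T1w.nii.gz); values are illustrative only.
import json
from pathlib import Path

root = Path("my_bids_dataset")  # hypothetical dataset directory
root.mkdir(parents=True, exist_ok=True)

# Dataset-level metadata lives in dataset_description.json at the root.
(root / "dataset_description.json").write_text(
    json.dumps({"Name": "Example dataset", "BIDSVersion": "1.8.0"}, indent=2)
)

# One subject, one anatomical scan, plus a sidecar holding acquisition metadata.
anat = root / "sub-01" / "anat"
anat.mkdir(parents=True, exist_ok=True)
(anat / "sub-01_T1w.nii.gz").touch()  # placeholder for the actual NIfTI image
(anat / "sub-01_T1w.json").write_text(
    json.dumps({"RepetitionTime": 2.3, "Manufacturer": "ExampleVendor"}, indent=2)
)

# Because the layout is standardized, tools can locate and read the metadata
# generically instead of relying on per-lab conventions.
meta = json.loads((anat / "sub-01_T1w.json").read_text())
print(meta["RepetitionTime"])
```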

Journal Article · DOI
04 Jun 2020 · Nature
TL;DR: The results obtained by seventy different teams analysing the same functional magnetic resonance imaging dataset show substantial variation, highlighting the influence of analytical choices and the importance of sharing workflows publicly and performing multiple analyses.
Abstract: Data analysis workflows in many scientific domains have become increasingly complex and flexible. Here we assess the effect of this flexibility on the results of functional magnetic resonance imaging by asking 70 independent teams to analyse the same dataset, testing the same 9 ex-ante hypotheses1. The flexibility of analytical approaches is exemplified by the fact that no two teams chose identical workflows to analyse the data. This flexibility resulted in sizeable variation in the results of hypothesis tests, even for teams whose statistical maps were highly correlated at intermediate stages of the analysis pipeline. Variation in reported results was related to several aspects of analysis methodology. Notably, a meta-analytical approach that aggregated information across teams yielded a significant consensus in activated regions. Furthermore, prediction markets of researchers in the field revealed an overestimation of the likelihood of significant findings, even by researchers with direct knowledge of the dataset2-5. Our findings show that analytical flexibility can have substantial effects on scientific conclusions, and identify factors that may be related to variability in the analysis of functional magnetic resonance imaging. The results emphasize the importance of validating and sharing complex analysis workflows, and demonstrate the need for performing and reporting multiple analyses of the same data. Potential approaches that could be used to mitigate issues related to analytical variability are discussed.

551 citations
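
One way to see why aggregating across teams recovers a consensus, as the paper above reports, is a simple combined-z scheme. The sketch below applies Stouffer's method to simulated team z-maps; it only illustrates the aggregation idea, not the meta-analytic model used in the study, and all numbers are synthetic.

```python
# Illustrative consensus aggregation across teams' statistical maps using
# Stouffer's combined z (a simple fixed-effect scheme; the paper's own
# meta-analysis is more elaborate).
import numpy as np

rng = np.random.default_rng(0)
n_teams, n_voxels = 70, 1000

# Simulated z-maps: a shared weak signal in the first 100 voxels plus
# team-specific analytic noise.
signal = np.zeros(n_voxels)
signal[:100] = 0.5
z_maps = signal + rng.normal(size=(n_teams, n_voxels))

# Stouffer's method: the sum of k independent z-scores divided by sqrt(k) is
# again standard normal under the null, so weak but consistent effects become
# detectable once aggregated across teams.
z_consensus = z_maps.sum(axis=0) / np.sqrt(n_teams)
print("consensus z, signal voxels:", z_consensus[:100].mean())
print("consensus z, null voxels  :", z_consensus[100:].mean())
```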

Journal Article · DOI
TL;DR: Insights from developing a set of recommendations on behalf of the Organization for Human Brain Mapping are described, and barriers that impede these practices are identified, including how the discipline must change to fully exploit the potential of the world's neuroimaging data.
Abstract: Given concerns about the reproducibility of scientific findings, neuroimaging must define best practices for data analysis, results reporting, and algorithm and data sharing to promote transparency, reliability and collaboration. We describe insights from developing a set of recommendations on behalf of the Organization for Human Brain Mapping and identify barriers that impede these practices, including how the discipline must change to fully exploit the potential of the world's neuroimaging data.

544 citations

Posted Content · DOI
05 May 2020
TL;DR: In this paper, the same dataset was independently analyzed by 70 teams, testing nine ex-ante hypotheses; the results show that analytic flexibility can have substantial effects on scientific conclusions and demonstrate factors related to variability in fMRI analysis.
Abstract: Summary Data analysis workflows in many scientific domains have become increasingly complex and flexible. To assess the impact of this flexibility on functional magnetic resonance imaging (fMRI) results, the same dataset was independently analyzed by 70 teams, testing nine ex-ante hypotheses. The flexibility of analytic approaches is exemplified by the fact that no two teams chose identical workflows to analyze the data. This flexibility resulted in sizeable variation in hypothesis test results, even for teams whose statistical maps were highly correlated at intermediate stages of their analysis pipeline. Variation in reported results was related to several aspects of analysis methodology. Importantly, meta-analytic approaches that aggregated information across teams yielded significant consensus in activated regions across teams. Furthermore, prediction markets of researchers in the field revealed an overestimation of the likelihood of significant findings, even by researchers with direct knowledge of the dataset. Our findings show that analytic flexibility can have substantial effects on scientific conclusions, and demonstrate factors related to variability in fMRI. The results emphasize the importance of validating and sharing complex analysis workflows, and demonstrate the need for multiple analyses of the same data. Potential approaches to mitigate issues related to analytical variability are discussed.

286 citations

Journal Article · DOI
01 Aug 2008
TL;DR: This paper describes the design and implementation of MOTEUR, a workflow engine that fulfills the need for well-defined data composition strategies on the one hand and for a fully parallel execution on the other.
Abstract: Workflows offer a powerful way to describe and deploy applications on grid infrastructures. Many workflow management systems have been proposed but there is still a lack of a system that would allow both a simple description of the dataflow of the application and an efficient execution on a grid platform. In this paper, we study the requirements of such a system, underlining the need for well-defined data composition strategies on the one hand and for a fully parallel execution on the other. As combining those features is not straightforward, we then propose algorithms to do so and we describe the design and implementation of MOTEUR, a workflow engine that fulfills those requirements. Performance results and overhead quantification are shown to evaluate MOTEUR with respect to existing comparable workflow systems on a production grid.

208 citations
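
The MOTEUR paper above hinges on two ingredients: well-defined data composition strategies (pairing input sets one-to-one or all-to-all) and fully parallel execution of the resulting tasks. The sketch below illustrates both ideas with Python's standard library only; it is not MOTEUR's actual API, and the task function is a stand-in for a grid job.

```python
# Illustrative sketch (not MOTEUR's API) of data composition strategies plus
# parallel execution of the task instances they generate.
from concurrent.futures import ThreadPoolExecutor
from itertools import product


def dot(xs, ys):
    """Pair inputs one-to-one: (x1, y1), (x2, y2), ..."""
    return list(zip(xs, ys))


def cross(xs, ys):
    """Pair inputs all-to-all: every x with every y."""
    return list(product(xs, ys))


def process(pair):
    image, parameters = pair
    return f"ran step on {image} with {parameters}"  # stand-in for a grid job


images = ["img1.nii", "img2.nii", "img3.nii"]
params = ["p=1", "p=2", "p=3"]

# The composition strategy determines how many task instances are created;
# the executor then runs them concurrently, as a workflow engine would on a grid.
for strategy in (dot, cross):
    tasks = strategy(images, params)
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(process, tasks))
    print(strategy.__name__, "->", len(results), "tasks")
```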


Cited by
Journal Article · DOI
01 May 1975
TL;DR: Fundamentals of Queueing Theory, Fourth Edition provides a comprehensive overview of simple and more advanced queueing models, with a self-contained presentation of key concepts and formulae.
Abstract: Praise for the Third Edition: "This is one of the best books available. Its excellent organizational structure allows quick reference to specific models and its clear presentation . . . solidifies the understanding of the concepts being presented." (IIE Transactions on Operations Engineering) Thoroughly revised and expanded to reflect the latest developments in the field, Fundamentals of Queueing Theory, Fourth Edition continues to present the basic statistical principles that are necessary to analyze the probabilistic nature of queues. Rather than presenting a narrow focus on the subject, this update illustrates the wide-reaching, fundamental concepts in queueing theory and its applications to diverse areas such as computer science, engineering, business, and operations research. This update takes a numerical approach to understanding and making probable estimations relating to queues, with a comprehensive outline of simple and more advanced queueing models. Newly featured topics of the Fourth Edition include: retrial queues; approximations for queueing networks; numerical inversion of transforms; and determining the appropriate number of servers to balance quality and cost of service. Each chapter provides a self-contained presentation of key concepts and formulae, allowing readers to work with each section independently, while a summary table at the end of the book outlines the types of queues that have been discussed and their results. In addition, two new appendices have been added, discussing transforms and generating functions as well as the fundamentals of differential and difference equations. New examples are now included along with problems that incorporate QtsPlus software, which is freely available via the book's related Web site. With its accessible style and wealth of real-world examples, Fundamentals of Queueing Theory, Fourth Edition is an ideal book for courses on queueing theory at the upper-undergraduate and graduate levels. It is also a valuable resource for researchers and practitioners who analyze congestion in the fields of telecommunications, transportation, aviation, and management science.

2,562 citations
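
As a worked micro-example of the kind of model the book covers, the snippet below evaluates the textbook M/M/1 formulas: utilisation rho = lambda/mu, mean number in system L = rho/(1 - rho), and mean time in system W = 1/(mu - lambda), tied together by Little's law L = lambda * W. It is independent of the book's QtsPlus software, and the arrival and service rates are illustrative.

```python
# Textbook M/M/1 formulas (single server, Poisson arrivals, exponential service).
def mm1_metrics(lam: float, mu: float) -> dict:
    if lam >= mu:
        raise ValueError("queue is unstable unless lambda < mu")
    rho = lam / mu               # server utilisation
    L = rho / (1 - rho)          # mean number of customers in the system
    W = 1 / (mu - lam)           # mean time a customer spends in the system
    Lq = rho**2 / (1 - rho)      # mean number waiting in queue
    Wq = rho / (mu - lam)        # mean waiting time in queue
    return {"rho": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}


# Example: 8 arrivals/hour served at 10/hour -> 80% utilisation, 4 customers
# in the system on average, and 0.5 h mean time in system (L = lam * W holds).
print(mm1_metrics(lam=8, mu=10))
```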

Journal Article
TL;DR: Definition: To what extent does the study allow us to draw conclusions about a causal effect between two or more constructs?
Abstract: Definition: To what extent does the study allow us to draw conclusions about a causal effect between two or more constructs? Issues: selection, maturation, history, mortality, testing, regression toward the mean, selection by maturation, treatment by mortality, treatment by testing, and measured treatment variables. Increase: eliminate the threats; above all, use experimental manipulations, random assignment, and counterbalancing.

2,006 citations

01 Jan 2016
Statistical Parametric Mapping: The Analysis of Functional Brain Images (no abstract available).

1,719 citations

Journal Article · DOI
TL;DR: fMRIPrep is a robust and easy-to-use pipeline for preprocessing diverse fMRI data that dispenses with manual intervention, thereby ensuring the reproducibility of the results.
Abstract: Preprocessing of functional magnetic resonance imaging (fMRI) involves numerous steps to clean and standardize the data before statistical analysis. Generally, researchers create ad hoc preprocessing workflows for each dataset, building upon a large inventory of available tools. The complexity of these workflows has snowballed with rapid advances in acquisition and processing. We introduce fMRIPrep, an analysis-agnostic tool that addresses the challenge of robust and reproducible preprocessing for fMRI data. fMRIPrep automatically adapts a best-in-breed workflow to the idiosyncrasies of virtually any dataset, ensuring high-quality preprocessing without manual intervention. By introducing visual assessment checkpoints into an iterative integration framework for software testing, we show that fMRIPrep robustly produces high-quality results on a diverse fMRI data collection. Additionally, fMRIPrep introduces less uncontrolled spatial smoothness than observed with commonly used preprocessing tools. fMRIPrep equips neuroscientists with an easy-to-use and transparent preprocessing workflow, which can help ensure the validity of inference and the interpretability of results.

1,465 citations
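
For orientation, the sketch below shows how a preprocessing run of the kind described above might be launched from Python, assuming fMRIPrep's command-line interface following the BIDS-Apps convention (input BIDS directory, output directory, analysis level, participant label). The paths and subject label are hypothetical, and flag names should be checked against the documentation for the installed version.

```python
# Hedged sketch: launching fMRIPrep on a BIDS dataset via its CLI, which
# follows the BIDS-Apps convention. Paths and the subject label are examples.
import subprocess

cmd = [
    "fmriprep",
    "/data/my_bids_dataset",        # BIDS-formatted input directory (example path)
    "/data/derivatives/fmriprep",   # output directory for preprocessed derivatives
    "participant",                  # analysis level, per the BIDS-Apps convention
    "--participant-label", "01",    # restrict the run to subject sub-01
]
subprocess.run(cmd, check=True)     # raises if the pipeline exits with an error
```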

Journal Article · DOI
TL;DR: This paper gives a high-level overview of the features of the MRtrix3 framework and of the general-purpose image processing applications provided with the software.

1,228 citations