Journal ArticleDOI

Multisensor data fusion: A review of the state-of-the-art

01 Jan 2013 · Information Fusion (Elsevier) · Vol. 14, Iss. 1, pp. 28–44
TL;DR: A comprehensive review of the data fusion state of the art is proposed, exploring its conceptualizations, benefits, and challenging aspects, as well as existing methodologies.
About: This article is published in Information Fusion. The article was published on 2013-01-01 and has received 1,684 citations to date. The article focuses on the topics: Domain (software engineering) & Sensor fusion.
Citations
Journal ArticleDOI
TL;DR: This paper attempts to present a comprehensive review of AI algorithms in rotating machinery fault diagnosis, covering both theoretical background and industrial applications.

1,287 citations

Journal ArticleDOI
TL;DR: A novel fusion algorithm named Gradient Transfer Fusion (GTF), based on gradient transfer and total variation (TV) minimization, is proposed, which can preserve both the thermal radiation and the appearance information in the source images.

729 citations

Journal ArticleDOI
TL;DR: This survey discusses clear motivations and advantages of multi-sensor data fusion and particularly focuses on physical activity recognition, aiming at providing a systematic categorization and common comparison framework of the literature, by identifying distinctive properties and parameters affecting data fusion design choices at different levels.

680 citations

Journal ArticleDOI
20 Aug 2015
TL;DR: In this paper, "diversity" is introduced as a key concept, and a number of data-driven solutions based on matrix and tensor decompositions are discussed, emphasizing how they account for diversity across the data sets.
Abstract: In various disciplines, information about the same phenomenon can be acquired from different types of detectors, at different conditions, in multiple experiments or subjects, among others. We use the term “modality” for each such acquisition framework. Due to the rich characteristics of natural phenomena, it is rare that a single modality provides complete knowledge of the phenomenon of interest. The increasing availability of several modalities reporting on the same system introduces new degrees of freedom, which raise questions beyond those related to exploiting each modality separately. As we argue, many of these questions, or “challenges,” are common to multiple domains. This paper deals with two key issues: “why we need data fusion” and “how we perform it.” The first issue is motivated by numerous examples in science and technology, followed by a mathematical framework that showcases some of the benefits that data fusion provides. In order to address the second issue, “diversity” is introduced as a key concept, and a number of data-driven solutions based on matrix and tensor decompositions are discussed, emphasizing how they account for diversity across the data sets. The aim of this paper is to provide the reader, regardless of his or her community of origin, with a taste of the vastness of the field, the prospects, and the opportunities that it holds.
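The decompositions referred to in the abstract are described only at a high level here. As a minimal illustrative sketch (not the paper's own algorithms), the following code jointly factorizes two data matrices that share a common factor, a simple coupled matrix factorization via alternating least squares; the function name, the ALS scheme, and all model parameters are assumptions chosen for illustration.

```python
import numpy as np

def coupled_mf(X1, X2, rank, n_iter=200, seed=0):
    """Jointly factorize X1 ~ A @ B1.T and X2 ~ A @ B2.T with a shared
    factor A, via alternating least squares. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    assert X1.shape[0] == X2.shape[0], "both modalities must describe the same samples"
    B1 = rng.standard_normal((X1.shape[1], rank))
    B2 = rng.standard_normal((X2.shape[1], rank))
    for _ in range(n_iter):
        # Update the shared factor A against both modalities at once.
        B = np.vstack([B1, B2])                      # stacked loadings
        X = np.hstack([X1, X2])                      # stacked data
        A = X @ B @ np.linalg.pinv(B.T @ B)
        # Update each modality-specific factor separately.
        B1 = X1.T @ A @ np.linalg.pinv(A.T @ A)
        B2 = X2.T @ A @ np.linalg.pinv(A.T @ A)
    return A, B1, B2

# Two "modalities" observing the same 100 underlying samples.
rng = np.random.default_rng(1)
A_true = rng.standard_normal((100, 3))
X1 = A_true @ rng.standard_normal((3, 20)) + 0.01 * rng.standard_normal((100, 20))
X2 = A_true @ rng.standard_normal((3, 30)) + 0.01 * rng.standard_normal((100, 30))
A, B1, B2 = coupled_mf(X1, X2, rank=3)
print(np.linalg.norm(X1 - A @ B1.T) / np.linalg.norm(X1))  # small residual
```

The shared factor A is where the two modalities "meet": each data set constrains it jointly, which is one simple way a decomposition can exploit diversity across data sets.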

673 citations

Journal ArticleDOI
TL;DR: This paper aims to provide an overview of four emerging unobtrusive and wearable technologies, which are essential to the realization of pervasive health information acquisition, including: 1) unobtrusive sensing methods, 2) smart textile technology, 3) flexible-stretchable-printable electronics, and 4) sensor fusion.
Abstract: The aging population, prevalence of chronic diseases, and outbreaks of infectious diseases are some of the major challenges of our present-day society. To address these unmet healthcare needs, especially for the early prediction and treatment of major diseases, health informatics, which deals with the acquisition, transmission, processing, storage, retrieval, and use of health information, has emerged as an active area of interdisciplinary research. In particular, acquisition of health-related information by unobtrusive sensing and wearable technologies is considered as a cornerstone in health informatics. Sensors can be weaved or integrated into clothing, accessories, and the living environment, such that health information can be acquired seamlessly and pervasively in daily living. Sensors can even be designed as stick-on electronic tattoos or directly printed onto human skin to enable long-term health monitoring. This paper aims to provide an overview of four emerging unobtrusive and wearable technologies, which are essential to the realization of pervasive health information acquisition, including: 1) unobtrusive sensing methods, 2) smart textile technology, 3) flexible-stretchable-printable electronics, and 4) sensor fusion, and then to identify some future directions of research.

647 citations

References
Book
01 Aug 1996
TL;DR: A separation theorem for convex fuzzy sets is proved without requiring that the fuzzy sets be disjoint.
Abstract: A fuzzy set is a class of objects with a continuum of grades of membership. Such a set is characterized by a membership (characteristic) function which assigns to each object a grade of membership ranging between zero and one. The notions of inclusion, union, intersection, complement, relation, convexity, etc., are extended to such sets, and various properties of these notions in the context of fuzzy sets are established. In particular, a separation theorem for convex fuzzy sets is proved without requiring that the fuzzy sets be disjoint.
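As a minimal sketch of the extended set operations described in the abstract, using Zadeh's standard max/min definitions for union and intersection; the triangular membership function and temperature framing are arbitrary illustrative choices.

```python
import numpy as np

# A fuzzy set is characterized by a membership function mapping each
# object to a grade in [0, 1]; here, triangular membership over the reals.
def triangular(a, b, c):
    """Membership function rising on [a, b], falling on [b, c], zero outside."""
    def mu(x):
        x = np.asarray(x, dtype=float)
        left = np.clip((x - a) / (b - a), 0.0, 1.0)
        right = np.clip((c - x) / (c - b), 0.0, 1.0)
        return np.minimum(left, right)
    return mu

cool = triangular(10.0, 15.0, 20.0)   # "cool temperature"
warm = triangular(18.0, 25.0, 32.0)   # "warm temperature"

x = np.linspace(5, 35, 7)
union = np.maximum(cool(x), warm(x))          # grade in "cool OR warm"
intersection = np.minimum(cool(x), warm(x))   # grade in "cool AND warm"
complement = 1.0 - cool(x)                    # grade in "NOT cool"
print(union, intersection, complement)
```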

52,705 citations

Journal ArticleDOI
TL;DR: A generalization of the sampling method introduced by Metropolis et al. (1953) is presented, along with an exposition of the relevant theory, techniques of application, and methods and difficulties of assessing the error in Monte Carlo estimates.
Abstract: A generalization of the sampling method introduced by Metropolis et al. (1953) is presented along with an exposition of the relevant theory, techniques of application and methods and difficulties of assessing the error in Monte Carlo estimates. Examples of the methods, including the generation of random orthogonal matrices and potential applications of the methods to numerical problems arising in statistics, are discussed. For numerical problems in a large number of dimensions, Monte Carlo methods are often more efficient than conventional numerical methods. However, implementation of the Monte Carlo methods requires sampling from high dimensional probability distributions and this may be very difficult and expensive in analysis and computer time. General methods for sampling from, or estimating expectations with respect to, such distributions are as follows. (i) If possible, factorize the distribution into the product of one-dimensional conditional distributions from which samples may be obtained. (ii) Use importance sampling, which may also be used for variance reduction. That is, in order to evaluate the integral J = ∫ f(x) p(x) dx = E_p(f), where p(x) is a probability density function, instead of obtaining independent samples x_1, ..., x_N from p(x) and using the estimate J_1 = Σ f(x_i)/N, we instead obtain the sample from a distribution with density q(x) and use the estimate J_2 = Σ {f(x_i) p(x_i)}/{q(x_i) N}. This may be advantageous if it is easier to sample from q(x) than p(x), but it is a difficult method to use in a large number of dimensions, since the values of the weights w(x_i) = p(x_i)/q(x_i) for reasonable values of N may all be extremely small, or a few may be extremely large. In estimating the probability of an event A, however, these difficulties may not be as serious since the only values of w(x) which are important are those for which x ∈ A. Since the methods proposed by Trotter & Tukey (1956) for the estimation of conditional expectations require the use of importance sampling, the same difficulties may be encountered in their use. (iii) Use a simulation technique; that is, if it is difficult to sample directly from p(x) or if p(x) is unknown, sample from some distribution q(y) and obtain the sample x values as some function of the corresponding y values. If we want samples from the conditional distribution...
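As a minimal sketch of the importance-sampling estimate J_2 from point (ii) of the abstract; the target p and proposal q below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Goal: estimate J = ∫ f(x) p(x) dx with p a standard normal density,
# sampling instead from a wider proposal q (normal with sd 2).
f = lambda x: x**2
log_p = lambda x: -0.5 * x**2 - 0.5 * np.log(2 * np.pi)
log_q = lambda x: -0.5 * (x / 2.0)**2 - np.log(2.0) - 0.5 * np.log(2 * np.pi)

x = rng.normal(0.0, 2.0, size=N)     # samples from q, not from p
w = np.exp(log_p(x) - log_q(x))      # importance weights w(x_i) = p(x_i)/q(x_i)
J2 = np.mean(f(x) * w)               # the estimate J_2 from the abstract
print(J2)                            # ≈ 1.0, the true E_p[x^2]
```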

14,965 citations


"Multisensor data fusion: A review o..." refers methods in this paper

  • ...Their method was later extended by Hastings [49] and is referred to as the Metropolis–Hastings algorithm....

    [...]

  • ...The popular Gibbs sampler is a special case of the Metropolis–Hastings algorithm where the candidate point is always accepted....

    [...]

  • ...The Metropolis–Hastings algorithm is sensitive to the sample initialization and the choice of jumping distribution....

    [...]
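The excerpts above concern the Metropolis–Hastings algorithm, its sensitivity to initialization, and the choice of jumping distribution. A minimal random-walk sketch with a symmetric Gaussian jumping distribution, assuming an arbitrary one-dimensional target density for illustration:

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis–Hastings with a symmetric Gaussian jumping
    distribution; 'step' is the proposal width the excerpts above note
    the sampler is sensitive to."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        candidate = x + step * rng.standard_normal()
        # With a symmetric proposal the Hastings correction cancels, so
        # acceptance depends only on the ratio of target densities.
        if np.log(rng.uniform()) < log_target(candidate) - log_target(x):
            x = candidate
        samples[i] = x
    return samples

# Example: sample a standard normal target from a deliberately bad start.
samples = metropolis_hastings(lambda x: -0.5 * x**2, x0=5.0, n_samples=20_000)
burned = samples[2_000:]   # discard burn-in to mitigate initialization bias
print(burned.mean(), burned.std())   # ≈ 0, ≈ 1
```

Discarding an initial burn-in segment, as above, is one common way to reduce the initialization sensitivity the excerpts mention.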

Book
01 Jan 1976
TL;DR: This book develops an alternative to the additive set functions and the rule of conditioning of the Bayesian theory: set functions that need only be what Choquet called "monotone of order of infinity," and Dempster's rule for combining such set functions.
Abstract: Both in science and in practical affairs we reason by combining facts only inconclusively supported by evidence. Building on an abstract understanding of this process of combination, this book constructs a new theory of epistemic probability. The theory draws on the work of A. P. Dempster but diverges from Dempster's viewpoint by identifying his "lower probabilities" as epistemic probabilities and taking his rule for combining "upper and lower probabilities" as fundamental. The book opens with a critique of the well-known Bayesian theory of epistemic probability. It then proceeds to develop an alternative to the additive set functions and the rule of conditioning of the Bayesian theory: set functions that need only be what Choquet called "monotone of order of infinity," and Dempster's rule for combining such set functions. This rule, together with the idea of "weights of evidence," leads to both an extensive new theory and a better understanding of the Bayesian theory. The book concludes with a brief treatment of statistical inference and a discussion of the limitations of epistemic probability. Appendices contain mathematical proofs, which are relatively elementary and seldom depend on mathematics more advanced than the binomial theorem.

14,565 citations


"Multisensor data fusion: A review o..." refers background in this paper

  • ...Examples of such hybrid frameworks are fuzzy rough set theory (FRST) [37] and fuzzy Dempster–Shafer theory (Fuzzy DSET) [38]....

    [...]

  • ...Unlike Bayesian inference, the Dempster–Shafer theory allows each source to contribute information at different levels of detail....

    [...]

  • ...In contrast to probability theory, which assigns a probability mass to each element of X, Dempster–Shafer theory assigns a belief mass m to each element E of 2^X, representing the possible propositions regarding the system state x. The function m has two properties: 1. m(∅) = 0; 2. Σ_{E ∈ 2^X} m(E) = 1....

    [...]

  • ...Shenoy and Shafer [63] demonstrated the applicability of this local computing method to Bayesian probabilities and fuzzy logics....

    [...]

  • ...The theory of belief functions initiated from Dempster's work [53] in understanding and perfecting Fisher's approach to probability inference, and was then mathematically formalized by Shafer [36] toward a general theory of reasoning based on evidence....

    [...]
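The excerpts above quote the defining properties of a mass function m over 2^X and refer to Dempster's rule for combining evidence from independent sources. A minimal sketch, representing subsets of the frame X as frozensets; the weather frame and the mass values are illustrative assumptions:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for mass functions given as
    {frozenset: mass} dicts over subsets of the frame X, each satisfying
    m(∅) = 0 and masses summing to 1."""
    combined, conflict = {}, 0.0
    for (A, mA), (B, mB) in product(m1.items(), m2.items()):
        C = A & B
        if C:
            combined[C] = combined.get(C, 0.0) + mA * mB
        else:
            conflict += mA * mB            # mass that lands on ∅
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    # Normalize the surviving mass by the non-conflicting proportion.
    return {C: m / (1.0 - conflict) for C, m in combined.items()}

X = frozenset({"rain", "sun"})
m1 = {frozenset({"rain"}): 0.6, X: 0.4}   # source 1: 0.6 on rain, rest unknown
m2 = {frozenset({"sun"}): 0.3, X: 0.7}    # source 2: 0.3 on sun, rest unknown
print(dempster_combine(m1, m2))           # rain ≈ 0.51, sun ≈ 0.15, X ≈ 0.34
```

Note how each source contributes at its own level of detail: mass placed on the whole frame X expresses ignorance rather than being forced onto singletons.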

Journal ArticleDOI
TL;DR: The theory of possibility described in this paper is related to the theory of fuzzy sets by defining the concept of a possibility distribution as a fuzzy restriction which acts as an elastic constraint on the values that may be assigned to a variable.

8,918 citations


"Multisensor data fusion: A review o..." refers background or methods in this paper

  • ...As shown in a famous counterexample by Zadeh [135], naive application of Dempster's rule of combination to the fusion of highly conflicting data yields counterintuitive results....

    [...]

  • ...There are a number of mathematical theories available to represent data imperfection [31], such as probability theory [32], fuzzy set theory [33], possibility theory [34], rough set theory [35], and Dempster–Shafer evidence theory (DSET) [36]....

    [...]

  • ...Possibility theory was founded by Zadeh [34] and later extended by Dubois and Prade [68,69]....

    [...]
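Zadeh's counterexample referenced in the first excerpt can be reproduced with a few lines of arithmetic; the medical-diagnosis framing below is the commonly cited rendering, assuming singleton hypotheses only:

```python
# Zadeh's counterexample: two sources with highly conflicting masses.
# Source 1: 0.99 meningitis, 0.01 tumor. Source 2: 0.99 concussion, 0.01 tumor.
m1 = {"meningitis": 0.99, "tumor": 0.01}
m2 = {"concussion": 0.99, "tumor": 0.01}

# Naive Dempster combination over singletons: only identical hypotheses
# intersect, so everything except "tumor" becomes conflicting mass.
joint = {h: m1[h] * m2[h] for h in m1.keys() & m2.keys()}  # {'tumor': 0.0001}
conflict = 1.0 - sum(joint.values())                       # 0.9999
combined = {h: m / (1.0 - conflict) for h, m in joint.items()}
print(combined)   # ≈ {'tumor': 1.0}: full belief in the least-supported cause
```

Normalization discards the 0.9999 of conflicting mass, so the hypothesis both sources considered nearly impossible ends up with certainty, which is the counterintuitive outcome Zadeh highlighted.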

Journal ArticleDOI
01 Apr 1993
TL;DR: An algorithm, the bootstrap filter, is proposed for implementing recursive Bayesian filters; the required density of the state vector is represented as a set of random samples, which are updated and propagated by the algorithm.
Abstract: An algorithm, the bootstrap filter, is proposed for implementing recursive Bayesian filters. The required density of the state vector is represented as a set of random samples, which are updated and propagated by the algorithm. The method is not restricted by assumptions of linearity or Gaussian noise: it may be applied to any state transition or measurement model. A simulation example of the bearings-only tracking problem is presented. This simulation includes schemes for improving the efficiency of the basic algorithm. For this example, the performance of the bootstrap filter is greatly superior to the standard extended Kalman filter.
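A minimal sketch of one predict-update-resample cycle of the sample-based recursion described in the abstract, applied to a scalar Gaussian random walk; the model and all parameters are arbitrary illustrative choices, not the paper's bearings-only example:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1_000                      # number of random samples ("particles")

def bootstrap_filter_step(particles, y, process_sd=1.0, obs_sd=1.0):
    """One predict-update-resample cycle of the bootstrap filter for the
    scalar model x_t = x_{t-1} + process noise, y_t = x_t + obs noise."""
    # Propagate each sample through the state-transition model.
    particles = particles + process_sd * rng.standard_normal(N)
    # Weight by the measurement likelihood (no linearity or Gaussianity
    # restriction in general; Gaussian here only for simplicity).
    w = np.exp(-0.5 * ((y - particles) / obs_sd) ** 2)
    w /= w.sum()
    # Resample in proportion to the weights (the SIR step referenced
    # in the citation context below).
    return rng.choice(particles, size=N, p=w)

# Track a hidden random walk from noisy observations.
x, particles = 0.0, rng.standard_normal(N)
for _ in range(50):
    x += rng.standard_normal()            # true state evolves
    y = x + rng.standard_normal()         # noisy measurement arrives
    particles = bootstrap_filter_step(particles, y)
print(x, particles.mean())                # the estimate tracks the state
```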

8,018 citations


"Multisensor data fusion: A review o..." refers background in this paper

  • ...This step is included in the original proposal of the particle filter [46], which is called sequential importance resampling (SIR)....

    [...]