Journal ArticleDOI

Sensor fusion in anti-personnel mine detection using a two-level belief function model

01 May 2003 - Vol. 33, Iss. 2, pp. 269-283
TL;DR: A two-level approach for modeling and fusion of antipersonnel mine detection sensors in terms of belief functions within the Dempster-Shafer framework is presented and an original decision rule adapted to this type of application is proposed.
Abstract: A two-level approach for modeling and fusion of antipersonnel mine detection sensors in terms of belief functions within the Dempster-Shafer framework is presented. Three promising and complementary sensors are considered: a metal detector, an infrared camera, and a ground-penetrating radar. Since the metal detector, the most often used mine detection sensor, provides measures that have different behaviors depending on the metal content of the observed object, the first level aims at identifying this content and at providing a classification into three classes. Depending on the metal content, the object is further analyzed at the second level toward deciding the final object identity. This process can be applied to any problem where one piece of information induces different reasoning schemes depending on its value. A way to include influence of various factors on sensors in the model is also presented, as well as a possibility that not all sensors refer to the same object. An original decision rule adapted to this type of application is proposed, as well as a way for estimating confidence degrees. More generally, this decision rule can be used in any situation where the different types of errors do not have the same importance. Some examples of obtained results are shown on synthetic data mimicking reality and with increasing complexity. Finally, applications on real data show promising results.
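The fusion step the abstract refers to is built on the Dempster-Shafer framework, whose core operation is Dempster's rule of combination. The sketch below shows that rule in a minimal form; the frame of discernment and the example masses are illustrative and are not taken from the paper's actual sensor models.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset -> mass)
    with Dempster's rule: conjunctive combination, then normalization
    by the mass assigned to the empty intersection (the conflict)."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("sources are totally conflicting")
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# Illustrative frame {M (mine), F (friendly object)} -- not the paper's classes.
m1 = {frozenset({"M"}): 0.6, frozenset({"M", "F"}): 0.4}
m2 = {frozenset({"F"}): 0.5, frozenset({"M", "F"}): 0.5}
fused = dempster_combine(m1, m2)
```

Here the conflict is 0.6 x 0.5 = 0.3, so the surviving masses are renormalized by 0.7, yielding m({M}) = 3/7, m({F}) = 2/7, m({M, F}) = 2/7.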
Citations
Journal ArticleDOI
TL;DR: An extension of the discounting operation is proposed, allowing one to use more detailed information regarding the reliability of the source in different contexts, i.e., conditionally on different hypotheses regarding the variable of interest.

153 citations


Cites background from "Sensor fusion in anti-personnel min..."

  • ...In belief function theory, the discounting operation allows one to combine information provided by a source in the form of a belief function with meta-knowledge regarding the reliability of that source, resulting in a “weakened”, less informative belief function....

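The discounting operation described in the excerpt above has a standard classical form: scale every mass by the source's reliability and transfer the remainder to the whole frame (total ignorance). A minimal sketch, with an illustrative frame and masses:

```python
def discount(m, reliability, frame):
    """Classical discounting: with reliability r in [0, 1], each mass is
    multiplied by r and the remaining 1 - r is assigned to the full frame,
    yielding a weakened, less informative belief function."""
    frame = frozenset(frame)
    out = {h: reliability * v for h, v in m.items()}
    out[frame] = out.get(frame, 0.0) + (1.0 - reliability)
    return out

# Illustrative example: a source judged 90% reliable.
m = {frozenset({"M"}): 0.8, frozenset({"M", "F"}): 0.2}
weakened = discount(m, 0.9, {"M", "F"})
```

With reliability 0.9 this gives m({M}) = 0.72 and m({M, F}) = 0.28; the extension proposed in the cited paper refines this by letting the reliability depend on the hypothesis, which the sketch above does not model.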

Journal ArticleDOI
TL;DR: The problem is the sequential association of combat ID declarations in a multi-target environment; the solution is provided in the framework of "object to ID declaration" association based on assignment techniques, deriving the global cost of assignment from the plausibility of the global assignment.

98 citations

Journal ArticleDOI
TL;DR: The theoretical basis of data fusion for the purpose of target identification using the belief function theory is presented, which allows the knowledge sources to supply their information in the form of uncertain implication rules.
Abstract: Presented here is the theoretical basis of data fusion for the purpose of target identification using the belief function theory. The key feature is that we allow the knowledge sources to supply their information in the form of uncertain implication rules. How these rules can be elegantly handled within the framework of the belief function theory is described. A small scale, practical example for target identification is worked through in detail to clarify the theory for future users.

77 citations

Journal ArticleDOI
01 May 2006
TL;DR: This case study shows that belief function theory may be considered as a valuable framework for risk analysis studies in ill-structured or poorly informed application domains.
Abstract: Whereas probability theory has been very successful as a conceptual framework for risk analysis in many areas where a lot of experimental data and expert knowledge are available, it presents certain limitations in applications where only weak information can be obtained. One such application investigated in this paper is water treatment, a domain in which key information such as input water characteristics and failure rates of various chemical processes is often lacking. An approach to handle such problems is proposed, based on the Dempster-Shafer theory of belief functions. Belief functions are used to describe expert knowledge of treatment process efficiency, failure rates, and latency times, as well as statistical data regarding input water quality. Evidential reasoning provides mechanisms to combine this information and assess the plausibility of various noncompliance scenarios. This methodology is shown to boil down to the probabilistic one where data of sufficient quality are available. This case study shows that belief function theory may be considered as a valuable framework for risk analysis studies in ill-structured or poorly informed application domains.

54 citations


Cites background from "Sensor fusion in anti-personnel min..."

  • ...pattern recognition [5], [6], and data fusion [12], [19]....


Journal ArticleDOI
TL;DR: A support vector machine algorithm for online abrupt change detection is implemented and proves to be efficient in detecting buried landmines from B-scan data; its performance is evaluated using simulated and real data.
Abstract: Ground-penetrating radars (GPRs) are very promising sensors for landmine detection, as they are capable of detecting landmines with low metal contents. GPRs deliver so-called B-scan data, which are, roughly, vertical slice images of the ground. However, due to the high dielectric permittivity contrast at the air-ground interface, a strong response is recorded at early time by GPRs. This response is the main component of the so-called clutter noise, and it blurs the responses of landmines buried at shallow depths. The landmine detection task is therefore quite difficult. This paper proposes a new method for automated detection and localization of buried objects from B-scan records. A support vector machine algorithm for online abrupt change detection is implemented and proves to be efficient in detecting buried landmines from B-scan data. The performance of the proposed procedure is evaluated using simulated and real data.
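The air-ground reflection described above is roughly constant across scan positions, which is why a common baseline preprocessing step (not the SVM detector of the cited paper) is to subtract the mean trace from each row of the B-scan. A minimal sketch:

```python
import numpy as np

def remove_mean_clutter(bscan):
    """Baseline clutter reduction for a B-scan (rows = time samples,
    columns = scan positions): subtract each row's mean across scan
    positions, suppressing the horizontally coherent air-ground
    reflection while leaving localized target responses largely intact."""
    bscan = np.asarray(bscan, dtype=float)
    return bscan - bscan.mean(axis=1, keepdims=True)

# A perfectly coherent (clutter-only) row is cancelled entirely;
# a localized response survives, shifted by its row mean.
clutter_only = remove_mean_clutter([[1.0, 1.0, 1.0], [2.0, 2.0, 2.0]])
with_target = remove_mean_clutter([[0.0, 6.0, 0.0]])
```

This mean-subtraction step is only one of several standard clutter-removal baselines; the paper's contribution is the change-detection stage applied afterwards.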

52 citations

References
Book
01 Jan 1976
TL;DR: This book develops an alternative to the additive set functions and the rule of conditioning of the Bayesian theory: set functions that need only be what Choquet called "monotone of order infinity," and Dempster's rule for combining such set functions.
Abstract: Both in science and in practical affairs we reason by combining facts only inconclusively supported by evidence. Building on an abstract understanding of this process of combination, this book constructs a new theory of epistemic probability. The theory draws on the work of A. P. Dempster but diverges from Dempster's viewpoint by identifying his "lower probabilities" as epistemic probabilities and taking his rule for combining "upper and lower probabilities" as fundamental. The book opens with a critique of the well-known Bayesian theory of epistemic probability. It then proceeds to develop an alternative to the additive set functions and the rule of conditioning of the Bayesian theory: set functions that need only be what Choquet called "monotone of order infinity," and Dempster's rule for combining such set functions. This rule, together with the idea of "weights of evidence," leads to both an extensive new theory and a better understanding of the Bayesian theory. The book concludes with a brief treatment of statistical inference and a discussion of the limitations of epistemic probability. Appendices contain mathematical proofs, which are relatively elementary and seldom depend on mathematics more advanced than the binomial theorem.
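The "upper and lower probabilities" the abstract identifies correspond to the belief and plausibility functions induced by a mass assignment. A minimal sketch of both, using an illustrative three-element frame:

```python
def belief(m, hypothesis):
    """Bel(A): total mass committed to subsets of A, i.e. evidence
    that directly supports A (Dempster's lower probability)."""
    A = frozenset(hypothesis)
    return sum(v for h, v in m.items() if h <= A)

def plausibility(m, hypothesis):
    """Pl(A): total mass of focal elements intersecting A, i.e.
    evidence not contradicting A (Dempster's upper probability).
    Bel(A) <= Pl(A) always holds."""
    A = frozenset(hypothesis)
    return sum(v for h, v in m.items() if h & A)

# Illustrative mass function on the frame {a, b, c}.
m = {frozenset({"a"}): 0.5,
     frozenset({"a", "b"}): 0.3,
     frozenset({"b", "c"}): 0.2}
```

For this m, Bel({a}) = 0.5 while Pl({a}) = 0.8: the 0.3 assigned to {a, b} does not directly support {a} but does not contradict it either.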

14,565 citations


"Sensor fusion in anti-personnel min..." refers to background or methods from this work

  • ...Masses assigned for each of the measures are shown in Table II, as well as the masses after combination....


  • ...Indeed, the masses are rounded to two digits, and small values are not truncated but preserved in order to avoid multiplications by zero during combination....


  • ...In addition, it is impossible to model every object (whether mines or objects that could be confused with them)....


  • ...Masses assigned by MD area measure....


  • ...The deminer is included in the reasoning process by expressing his opinion regarding the sensors and, accordingly, the importance of each of the four measures via discounting factors [16], [21], [22]....


Journal ArticleDOI
01 Jan 1996
TL;DR: A classification of the operators arising from the different data fusion theories, with respect to their behavior, provides a guide for choosing an operator in a given problem; the choice can be refined from the desired properties of the operators, from their decisiveness, and by examining how they deal with conflicting situations.
Abstract: In most data fusion systems, the information extracted from each sensor (either numerical or symbolic) is represented as a degree of belief in an event with real values, thereby taking into account the imprecise, uncertain, and incomplete nature of the information. The combination of such degrees of belief is performed through numerical fusion operators. A very large variety of such operators has been proposed in the literature. We propose in this paper a classification of these operators, arising from the different data fusion theories, with respect to their behavior. Three classes are thus defined. This classification provides a guide for choosing an operator in a given problem. This choice can then be refined from the desired properties of the operators, from their decisiveness, and by examining how they deal with conflicting situations.
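The behavioral classes the abstract mentions are commonly illustrated with three representative operators on degrees of belief in [0, 1]: a severe conjunctive operator (a t-norm such as min), an indulgent disjunctive operator (a t-conorm such as max), and a compromise operator (a mean). The sketch below uses these representatives only; the paper's full classification covers many more operators and context-dependent behaviors.

```python
def conjunctive(a, b):
    """Severe behavior: the result is no larger than either input
    (min is the largest t-norm)."""
    return min(a, b)

def disjunctive(a, b):
    """Indulgent behavior: the result is no smaller than either input
    (max is the smallest t-conorm)."""
    return max(a, b)

def compromise(a, b):
    """Trade-off behavior: the result lies between the inputs."""
    return (a + b) / 2.0

# Two conflicting sources: one confident (0.9), one skeptical (0.2).
severe = conjunctive(0.9, 0.2)      # 0.2  -- dominated by the skeptic
indulgent = disjunctive(0.9, 0.2)   # 0.9  -- dominated by the believer
cautious = compromise(0.9, 0.2)     # 0.55 -- splits the difference
```

The conflicting-input example shows why the classification matters: the three classes give very different answers precisely when the sources disagree.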

719 citations

Journal ArticleDOI
TL;DR: Bayes' theorem is generalized within the transferable belief model framework, and the disjunctive rule of combination (DRC) and the generalized Bayesian theorem (GBT), together with their uses for belief propagation in directed belief networks, are analysed.

692 citations

Book ChapterDOI
01 Jan 1990
TL;DR: The probability function that should be used to make decisions, given any form of underlying uncertainty, is derived axiomatically.
Abstract: Summary We derive axiomatically the probability function that should be used to make decisions given any form of underlying uncertainty.

511 citations

01 Jan 1997
TL;DR: An algorithm for the detection of ellipse shapes in images using the Randomized Hough Transform is described; it was found to give improvements in accuracy and reductions in computation time and in the number of false alarms detected.
Abstract: We describe an algorithm for the detection of ellipse shapes in images, using the Randomized Hough Transform. The algorithm is compared to a standard implementation of the Hough Transform and to the Probabilistic Hough Transform. Tests are performed using both noise-free and noisy images, as well as several real-world images. The algorithm was found to give improvements in accuracy and reductions in computation time, memory requirements, and the number of false alarms detected.

266 citations