Book

Beyond the Kalman Filter: Particle Filters for Tracking Applications

TL;DR: Part I, Theoretical concepts: introduction; suboptimal nonlinear filters; a tutorial on particle filters; Cramer-Rao bounds for nonlinear filtering. Part II, Tracking applications: tracking a ballistic object; bearings-only tracking; range-only tracking; bistatic radar tracking; tracking targets through blind Doppler; terrain-aided tracking; detection and tracking of stealthy targets; group and extended object tracking.
Abstract: Part I, Theoretical concepts: introduction; suboptimal nonlinear filters; a tutorial on particle filters; Cramer-Rao bounds for nonlinear filtering. Part II, Tracking applications: tracking a ballistic object; bearings-only tracking; range-only tracking; bistatic radar tracking; tracking targets through blind Doppler; terrain-aided tracking; detection and tracking of stealthy targets; group and extended object tracking.
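
For readers skimming this entry, a minimal Python sketch of the bootstrap (sampling importance resampling) particle filter covered by the tutorial part of the book is given below; the scalar random-walk dynamics, Gaussian likelihood, and function names are illustrative assumptions rather than the book's own notation or examples.

```python
import numpy as np

def bootstrap_particle_filter(measurements, n_particles=500,
                              process_std=1.0, meas_std=2.0):
    """Minimal bootstrap (SIR) particle filter for a scalar random-walk state.

    The random-walk dynamics and Gaussian likelihood are toy assumptions
    chosen for illustration only.
    """
    rng = np.random.default_rng(0)
    particles = rng.normal(0.0, 5.0, n_particles)   # samples from an assumed prior
    estimates = []
    for z in measurements:
        # 1. Propagate each particle through the (assumed) dynamics model.
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # 2. Weight particles by the (assumed Gaussian) measurement likelihood.
        weights = np.exp(-0.5 * ((z - particles) / meas_std) ** 2)
        weights /= weights.sum()
        # 3. Point estimate: weighted posterior mean.
        estimates.append(np.sum(weights * particles))
        # 4. Resample to counter weight degeneracy (multinomial resampling).
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
    return np.array(estimates)

# Usage: noisy observations of a slowly drifting state.
true_state = np.cumsum(np.random.default_rng(1).normal(0, 1.0, 50))
obs = true_state + np.random.default_rng(2).normal(0, 2.0, 50)
print(bootstrap_particle_filter(obs)[:5])
```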
Citations
Book
21 May 2004
TL;DR: The authors draw upon extensive industry and classroom experience to introduce today's most advanced and effective chip design practices, present extensively updated coverage of every key element of VLSI design, and illuminate the latest design challenges with 65 nm process examples.
Abstract: For both introductory and advanced courses in VLSI design, this authoritative, comprehensive textbook is highly accessible to beginners, yet offers unparalleled breadth and depth for more experienced readers. The Fourth Edition of CMOS VLSI Design: A Circuits and Systems Perspective presents broad and in-depth coverage of the entire field of modern CMOS VLSI design. The authors draw upon extensive industry and classroom experience to introduce today's most advanced and effective chip design practices. They present extensively updated coverage of every key element of VLSI design, and illuminate the latest design challenges with 65 nm process examples. This book contains unsurpassed circuit-level coverage, as well as a rich set of problems and worked examples that provide deep practical insight to readers at all levels.

2,355 citations

Journal ArticleDOI
TL;DR: Using a deep learning approach to track user-defined body parts during various behaviors across multiple species, the authors show that their toolbox, called DeepLabCut, can achieve human accuracy with only a few hundred frames of training data.
Abstract: Quantifying behavior is crucial for many applications in neuroscience. Videography provides easy methods for the observation and recording of animal behavior in diverse settings, yet extracting particular aspects of a behavior for further analysis can be highly time consuming. In motor control studies, humans or other animals are often marked with reflective markers to assist with computer-based tracking, but markers are intrusive, and the number and location of the markers must be determined a priori. Here we present an efficient method for markerless pose estimation based on transfer learning with deep neural networks that achieves excellent results with minimal training data. We demonstrate the versatility of this framework by tracking various body parts in multiple species across a broad collection of behaviors. Remarkably, even when only a small number of frames are labeled (~200), the algorithm achieves excellent tracking performance on test frames that is comparable to human accuracy.

2,303 citations

Journal ArticleDOI
TL;DR: Under linear, Gaussian assumptions on the target dynamics and birth process, the posterior intensity at any time step is a Gaussian mixture, and closed-form recursions for propagating the means, covariances, and weights of the constituent Gaussian components of the posterior intensity are derived.
Abstract: A new recursive algorithm is proposed for jointly estimating the time-varying number of targets and their states from a sequence of observation sets in the presence of data association uncertainty, detection uncertainty, noise, and false alarms. The approach involves modelling the respective collections of targets and measurements as random finite sets and applying the probability hypothesis density (PHD) recursion to propagate the posterior intensity, which is a first-order statistic of the random finite set of targets, in time. At present, there is no closed-form solution to the PHD recursion. This paper shows that under linear, Gaussian assumptions on the target dynamics and birth process, the posterior intensity at any time step is a Gaussian mixture. More importantly, closed-form recursions for propagating the means, covariances, and weights of the constituent Gaussian components of the posterior intensity are derived. The proposed algorithm combines these recursions with a strategy for managing the number of Gaussian components to increase efficiency. This algorithm is extended to accommodate mildly nonlinear target dynamics using approximation strategies from the extended and unscented Kalman filters.
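
The abstract above describes the Gaussian-mixture PHD recursion: the posterior intensity is kept as a weighted set of Gaussian components whose means and covariances are propagated with Kalman-style formulas. The sketch below shows one prediction/update cycle under assumed linear-Gaussian model matrices; the component pruning and merging strategy mentioned in the abstract is omitted, and the parameter names and values are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def gm_phd_step(components, measurements, F, Q, H, R,
                p_survive=0.99, p_detect=0.9, clutter_intensity=1e-3,
                birth_components=()):
    """One prediction/update cycle of a Gaussian-mixture PHD recursion.

    `components` is a list of (weight, mean, covariance) tuples representing
    the posterior intensity; the model matrices and scalar parameters are
    illustrative assumptions.
    """
    # Prediction: propagate surviving components, then add birth terms.
    predicted = [(p_survive * w, F @ m, F @ P @ F.T + Q)
                 for w, m, P in components]
    predicted += list(birth_components)

    # Update: missed-detection terms keep the predicted components,
    # scaled by the probability of not being detected.
    updated = [((1.0 - p_detect) * w, m, P) for w, m, P in predicted]

    # For each measurement, form a Kalman-updated copy of every component.
    for z in measurements:
        new_terms = []
        for w, m, P in predicted:
            S = H @ P @ H.T + R                     # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
            z_pred = H @ m
            lik = gaussian_density(z, z_pred, S)
            new_terms.append((p_detect * w * lik,
                              m + K @ (z - z_pred),
                              (np.eye(len(m)) - K @ H) @ P))
        # Normalise weights against clutter plus all detection hypotheses.
        denom = clutter_intensity + sum(w for w, _, _ in new_terms)
        updated += [(w / denom, m, P) for w, m, P in new_terms]
    return updated

def gaussian_density(x, mean, cov):
    """Multivariate Gaussian density N(x; mean, cov)."""
    d = x - mean
    k = len(x)
    return np.exp(-0.5 * d @ np.linalg.solve(cov, d)) / \
        np.sqrt((2 * np.pi) ** k * np.linalg.det(cov))

# Example (illustrative values): 2-D state observed directly.
# F, Q, H, R = np.eye(2), 0.1 * np.eye(2), np.eye(2), 0.5 * np.eye(2)
# comps = [(1.0, np.zeros(2), np.eye(2))]
# comps = gm_phd_step(comps, [np.array([0.3, -0.2])], F, Q, H, R)
```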

1,805 citations


Cites background from "Beyond the Kalman Filter: Particle ..."

  • ..., [44] and [46]–[49]), the developments for the EK-PHD and UK-PHD filters are conceptually straightforward, though notationally cumbersome, and will be omitted....

    [...]

Journal ArticleDOI
TL;DR: A dynamic factor model is estimated to solve the problems of endogeneity of inputs and multiplicity of inputs relative to instruments, and the role of family environments in shaping cognitive and noncognitive skills at different stages of the child's life cycle is explored.
Abstract: This paper estimates models of the evolution of cognitive and noncognitive skills and explores the role of family environments in shaping these skills at different stages of the life cycle of the child. Central to this analysis is identification of the technology of skill formation. We estimate a dynamic factor model to solve the problem of endogeneity of inputs and multiplicity of inputs relative to instruments. We identify the scale of the factors by estimating their effects on adult outcomes. In this fashion we avoid reliance on test scores and changes in test scores that have no natural metric. Parental investments are generally more effective in raising noncognitive skills. Noncognitive skills promote the formation of cognitive skills but, in most specifications of our model, cognitive skills do not promote the formation of noncognitive skills. Parental inputs have different effects at different stages of the child’s life cycle with cognitive skills affected more at early ages and noncognitive skills affected more at later ages.

1,636 citations

Journal ArticleDOI
TL;DR: It is demonstrated that trackers can be evaluated objectively by survival curves, Kaplan-Meier statistics, and Grubbs testing, and it is found that in evaluation practice the F-score is as effective as the object tracking accuracy (OTA) score.
Abstract: A large variety of trackers has been proposed in the literature during the last two decades, with mixed success. Object tracking in realistic scenarios is a difficult problem; therefore, it remains a most active area of research in computer vision. A good tracker should perform well in a large number of videos involving illumination changes, occlusion, clutter, camera motion, low contrast, specularities, and at least six more aspects. However, the performance of proposed trackers has typically been evaluated on fewer than ten videos, or on special-purpose datasets. In this paper, we aim to evaluate trackers systematically and experimentally on 315 video fragments covering the above aspects. We selected a set of nineteen trackers to include a wide variety of algorithms often cited in the literature, supplemented with trackers appearing in 2010 and 2011 for which the code was publicly available. We demonstrate that trackers can be evaluated objectively by survival curves, Kaplan-Meier statistics, and Grubbs testing. We find that in evaluation practice the F-score is as effective as the object tracking accuracy (OTA) score. The analysis under a large variety of circumstances provides objective insight into the strengths and weaknesses of trackers.
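
Since the abstract refers to the F-score and the object tracking accuracy (OTA) score, the following Python sketch shows one common way such per-video scores can be computed from per-frame tracker output; the overlap threshold and the exact counting conventions are assumptions for illustration, not definitions quoted from the paper.

```python
def tracking_scores(frames, iou_threshold=0.5):
    """Compute an F-score and an OTA-style score from per-frame results.

    `frames` is a list of dicts with keys 'iou' (overlap between the tracker
    box and ground truth, or None when the tracker reports no target) and
    'has_target' (whether ground truth contains the target). The counting
    conventions below are common-usage assumptions, not taken verbatim from
    the paper.
    """
    tp = fp = fn = n_gt = 0
    for f in frames:
        if f['has_target']:
            n_gt += 1
            if f['iou'] is not None and f['iou'] >= iou_threshold:
                tp += 1            # correctly tracked frame
            else:
                fn += 1            # target missed
                if f['iou'] is not None:
                    fp += 1        # a box was reported, but off-target
        elif f['iou'] is not None:
            fp += 1                # box reported while no target is present
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f_score = (2 * precision * recall / (precision + recall)
               if precision + recall else 0.0)
    ota = 1.0 - (fn + fp) / n_gt if n_gt else 0.0
    return f_score, ota

# Usage: three frames, one miss.
print(tracking_scores([{'iou': 0.8, 'has_target': True},
                       {'iou': 0.2, 'has_target': True},
                       {'iou': 0.7, 'has_target': True}]))
```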

1,604 citations


Cites methods from "Beyond the Kalman Filter: Particle ..."

  • ...To reduce the search space further motion prediction is done by a linear motion model, which can be described using Kalman filter [92]–[94]....

    [...]

References
Book
01 Jan 1994
TL;DR: The methods of Weyl, van der Corput, and Vinogradov for estimating exponential sums, as discussed by the authors, are presented, along with an introduction to Turan's method and related topics in uniform distribution and the zeros of $L$-functions.
Abstract: Uniform distribution; van der Corput sets; Exponential sums I: the methods of Weyl and van der Corput; Exponential sums II: Vinogradov's method; An introduction to Turan's method; Irregularities of distribution; Mean and large values of Dirichlet polynomials; Distribution of reduced residue classes in short intervals; Zeros of $L$-functions; Small polynomials with integral coefficients; Some unsolved problems; Index.

713 citations

Book
01 Jan 2005
TL;DR: Whether you're a digital forensics specialist, incident response team member, law enforcement officer, corporate security specialist, or auditor, this book will become an indispensable resource for forensic investigations, no matter what analysis tools you use.
Abstract: The Definitive Guide to File System Analysis: Key Concepts and Hands-on Techniques. Most digital evidence is stored within the computer's file system, but understanding how file systems work is one of the most technically challenging concepts for a digital investigator because there exists little documentation. Now, security expert Brian Carrier has written the definitive reference for everyone who wants to understand and be able to testify about how file system analysis is performed. Carrier begins with an overview of investigation and computer foundations and then gives an authoritative, comprehensive, and illustrated overview of contemporary volume and file systems: crucial information for discovering hidden evidence, recovering deleted data, and validating your tools. Along the way, he describes data structures, analyzes example disk images, provides advanced investigation scenarios, and uses today's most valuable open source file system analysis tools, including tools he personally developed. Coverage includes:
  • Preserving the digital crime scene and duplicating hard disks for "dead analysis"
  • Identifying hidden data on a disk's Host Protected Area (HPA)
  • Reading source data: direct versus BIOS access, dead versus live acquisition, error handling, and more
  • Analyzing DOS, Apple, and GPT partitions; BSD disk labels; and Sun Volume Table of Contents using key concepts, data structures, and specific techniques
  • Analyzing the contents of multiple disk volumes, such as RAID and disk spanning
  • Analyzing FAT, NTFS, Ext2, Ext3, UFS1, and UFS2 file systems using key concepts, data structures, and specific techniques
  • Finding evidence: file metadata, recovery of deleted files, data hiding locations, and more
  • Using The Sleuth Kit (TSK), Autopsy Forensic Browser, and related open source tools
When it comes to file system analysis, no other book offers this much detail or expertise. Whether you're a digital forensics specialist, incident response team member, law enforcement officer, corporate security specialist, or auditor, this book will become an indispensable resource for forensic investigations, no matter what analysis tools you use. Brian Carrier has authored several leading computer forensic tools, including The Sleuth Kit (formerly The @stake Sleuth Kit) and the Autopsy Forensic Browser. He has authored several peer-reviewed conference and journal papers and has created publicly available testing images for forensic tools. Currently pursuing a Ph.D. in Computer Science and Digital Forensics at Purdue University, he is also a research assistant at the Center for Education and Research in Information Assurance and Security (CERIAS) there. He formerly served as a research scientist at @stake and as the lead for the @stake Response Team and Digital Forensic Labs. Carrier has taught forensics, incident response, and file systems at SANS, FIRST, the @stake Academy, and SEARCH. Brian Carrier's http://www.digital-evidence.org contains book updates and up-to-date URLs from the book's references. © Copyright Pearson Education. All rights reserved.
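
As a small illustration of the kind of on-disk structure the book analyzes, the sketch below parses the four primary entries of a DOS/MBR partition table from the first sector of a disk image; the offsets follow the standard MBR layout, and the code is a generic illustration rather than an excerpt from the book or from The Sleuth Kit.

```python
import struct

def parse_dos_partition_table(mbr_bytes):
    """Parse the four primary entries of a DOS/MBR partition table.

    Standard MBR layout assumed: the table starts at byte 446, holds
    four 16-byte entries, and the sector ends with the 0x55AA signature
    at byte 510. Extended partitions and sanity checks are omitted.
    """
    if mbr_bytes[510:512] != b'\x55\xaa':
        raise ValueError('missing MBR boot signature')
    partitions = []
    for i in range(4):
        entry = mbr_bytes[446 + 16 * i: 446 + 16 * (i + 1)]
        boot_flag, ptype = entry[0], entry[4]
        start_lba, num_sectors = struct.unpack('<II', entry[8:16])
        if ptype != 0:                       # type 0 means the slot is unused
            partitions.append({
                'bootable': boot_flag == 0x80,
                'type': hex(ptype),
                'start_sector': start_lba,
                'sector_count': num_sectors,
            })
    return partitions

# Usage: read the first 512 bytes of a disk image acquired for dead analysis.
# with open('disk.img', 'rb') as fh:
#     print(parse_dos_partition_table(fh.read(512)))
```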

536 citations

Journal ArticleDOI
TL;DR: The role of self-determination in the recent break-up of the former Soviet Union and Yugoslavia is discussed, and the authors compare self-determination as an international political postulate with traditional international law.
Abstract: 1. Introduction Part I. The Historical Background: 2. Self-determination as an international political postulate Part II. Self-Determination Becomes an International Legal Standard: 3. Treaty law 4. The emergence of customary rules: external self-determination 5. The emergence of customary rules: internal self-determination 6. The holders of the right of self-determination and the means of ensuring observance of the right 7. Comparing customary and treaty law Part III. The Right to Self-Determination in Operation: 8. The impact of self-determination on traditional international law 9. Testing international law - some particularly controversial issues 10. The role of self-determination in the recent break-up of the Soviet Union and Yugoslavia Part IV. The New Trends Emerging in the World Community: 11. Attempts at expanding self-determination Part V. General Stocktaking: 12. Recapitulation and conclusion.

372 citations

Book
08 Sep 2005
TL;DR: This book presents the basic theory of interpolation and approximation for curves and surfaces, covering linear, polynomial, Hermite, and spline interpolation, Bezier and B-spline approximation, subdivision methods, and sweep surfaces.
Abstract: Basic Theory; Linear Interpolation; Polynomial Interpolation; Hermite Interpolation; Spline Interpolation; Bezier Approximation; B-Spline Approximation; Subdivision Methods; Sweep Surfaces.
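
As a concrete taste of the Bezier approximation chapter listed above, here is a generic textbook sketch of de Casteljau's algorithm for evaluating a Bezier curve from its control points; it is offered as an illustration of the topic, not as code from the book.

```python
def de_casteljau(control_points, t):
    """Evaluate a 2-D Bezier curve at parameter t via de Casteljau's algorithm.

    `control_points` is a list of (x, y) tuples; repeated linear interpolation
    between neighbouring points collapses the polygon to a single curve point.
    """
    pts = [tuple(p) for p in control_points]
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

# Usage: midpoint of a quadratic Bezier arch.
print(de_casteljau([(0, 0), (1, 2), (2, 0)], 0.5))   # -> (1.0, 1.0)
```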

317 citations

BookDOI
01 Dec 1995
TL;DR: The troposphere and stratosphere are emphasized, and a review of state-of-the-art work in atmospheric chemistry is presented, with a focus on gas phase, condensed phase, and heterogeneous chemistry.
Abstract: Atmospheric chemistry is central to understanding global changes: ozone depletion, the appearance of the polar ozone holes, and compositional changes which worsen the greenhouse effect. Because of its importance, work is progressing on many fronts. This volume emphasizes the troposphere and stratosphere and has chapters on gas phase, condensed phase, and heterogeneous chemistry. Present progress is emphasized, and important future directions are also described. This book fills a need not satisfied by any others and will be popular for some years to come. It informs students and newcomers to the field of the many facets of atmospheric chemistry and can be used as a text for advanced students. It is also a valuable desk reference summarizing activities by quite a number of the most active research groups. Chapter 18 by Kolb et al. on heterogeneous chemistry is especially noteworthy because it represents a unique joint effort by several groups working on a very timely subject; they describe a conceptual framework and establish conventions which will be standard in future papers on this subject.

150 citations