
Showing papers in "IEEE Signal Processing Magazine in 2006"


Journal ArticleDOI
TL;DR: The future possibilities of radar are focused on with particular emphasis on the issue of cognition, and the problem of radar surveillance applied to an ocean environment is considered.
Abstract: This article discusses a new idea called cognitive radar. Three ingredients are basic to the constitution of cognitive radar: 1) intelligent signal processing, which builds on learning through interactions of the radar with the surrounding environment; 2) feedback from the receiver to the transmitter, which is a facilitator of intelligence; and 3) preservation of the information content of radar returns, which is realized by the Bayesian approach to target detection through tracking. All three of these ingredients feature in the echo-location system of a bat, which may be viewed as a physical realization (albeit in neurobiological terms) of cognitive radar. Radar is a remote-sensing system that is widely used for surveillance, tracking, and imaging applications, for both civilian and military needs. In this article, we focus on future possibilities of radar with particular emphasis on the issue of cognition. As an illustrative case study along the way, we consider the problem of radar surveillance applied to an ocean environment.

1,022 citations


Journal ArticleDOI
TL;DR: This paper presents an overview of various deconvolution techniques of 3D fluorescence microscopy images and provides a summary of the microscope point-spread function (PSF), which often creates the most severe distortion in the acquired 3D image.
Abstract: This paper presents an overview of various deconvolution techniques of 3D fluorescence microscopy images. It describes the subject of image deconvolution for 3D fluorescence microscopy images and provides an overview of the distortion issues in different areas. The paper presents a brief schematic description of fluorescence microscope systems and provides a summary of the microscope point-spread function (PSF), which often creates the most severe distortion in the acquired 3D image. Finally, it discusses the ongoing research work in the area and provides a brief review of performance measures of 3D deconvolution microscopy techniques. It also provides a summary of the numerical results using simulated data and presents the results obtained from the real data.

490 citations
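One classical member of the deconvolution family surveyed above is the Richardson-Lucy iteration. The 1-D toy below is a hedged sketch of that idea (the signal, PSF, and iteration count are illustrative, not drawn from the paper): the estimate is repeatedly multiplied by the back-projected ratio of observed to re-blurred data.

```python
# Minimal 1-D Richardson-Lucy deconvolution sketch (toy setup, not the
# paper's algorithms). Assumes a known, normalized PSF of odd length.

def convolve(signal, psf):
    """Linear convolution, 'same' size, zero-padded at the borders."""
    half = len(psf) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for k, p in enumerate(psf):
            j = i + k - half
            if 0 <= j < len(signal):
                acc += signal[j] * p
        out.append(acc)
    return out

def richardson_lucy(observed, psf, iterations=50):
    psf_flipped = psf[::-1]                    # adjoint of the blur operator
    estimate = [1.0] * len(observed)           # flat non-negative start
    for _ in range(iterations):
        blurred = convolve(estimate, psf)
        ratio = [o / max(b, 1e-12) for o, b in zip(observed, blurred)]
        correction = convolve(ratio, psf_flipped)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate

# Toy experiment: a single bright point blurred by a 3-tap PSF
psf = [0.25, 0.5, 0.25]
truth = [0.0] * 10
truth[5] = 10.0
observed = convolve(truth, psf)
restored = richardson_lucy(observed, psf, iterations=200)
```

On noiseless data the iteration re-concentrates the blurred energy back onto the original point; real microscopy data would of course need regularization and a measured 3-D PSF.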


Journal ArticleDOI
TL;DR: In this article, the authors review nonparametric distributed learning in wireless sensor networks (WSNs), discuss the challenges that WSNs pose for distributed learning, and survey research aimed at addressing those challenges.
Abstract: This paper discusses nonparametric distributed learning. After reviewing the classical learning model and highlighting the success of machine learning in centralized settings, the challenges that wireless sensor networks (WSN) pose for distributed learning are discussed, and research aimed at addressing these challenges is surveyed.

396 citations


Journal ArticleDOI
TL;DR: It is proved theoretically, and corroborated with examples, that when the noise distributions are either completely known, partially known or completely unknown, distributed estimation is possible with minimal bandwidth requirements which can achieve the same order of mean square error (MSE) performance as the corresponding centralized clairvoyant estimators.
Abstract: This paper provides an overview of distributed estimation-compression problems encountered in wireless sensor networks (WSNs). A general formulation of distributed compression-estimation under rate constraints is introduced, pertinent signal processing algorithms are developed, and emerging tradeoffs are delineated from an information-theoretic perspective. Specifically, we design rate-constrained distributed estimators for various signal models with varying degrees of knowledge of the underlying data distributions. We prove theoretically, and corroborate with examples, that whether the noise distributions are completely known, partially known, or completely unknown, distributed estimation is possible with minimal bandwidth requirements and can achieve the same order of mean square error (MSE) performance as the corresponding centralized clairvoyant estimators.

362 citations
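The flavor of bandwidth-constrained estimation described above can be sketched in a few lines: each sensor transmits a single bit (its noisy sample thresholded against a known level), and the fusion center inverts the Gaussian CDF to recover the unknown mean. This is a generic illustration under assumed Gaussian noise and an assumed threshold tau, not the paper's exact estimators.

```python
# One-bit distributed estimation of a mean (illustrative sketch).
# Assumptions: i.i.d. Gaussian sensor noise with known sigma, known
# threshold tau shared by all sensors.
import random
from statistics import NormalDist

def one_bit_estimate(theta, sigma, tau, n_sensors, rng):
    # Each sensor i observes x_i = theta + noise and transmits 1{x_i > tau}.
    bits = [1 if theta + rng.gauss(0.0, sigma) > tau else 0
            for _ in range(n_sensors)]
    p_hat = sum(bits) / n_sensors              # fraction of ones received
    p_hat = min(max(p_hat, 1e-6), 1 - 1e-6)    # keep inv_cdf finite
    # P(x > tau) = 1 - Phi((tau - theta)/sigma)  =>  theta = tau + sigma * Phi^{-1}(p)
    return tau + sigma * NormalDist().inv_cdf(p_hat)

rng = random.Random(7)
est = one_bit_estimate(theta=1.0, sigma=1.0, tau=0.8, n_sensors=50000, rng=rng)
```

Despite the extreme one-bit-per-sensor rate, the estimator is consistent; placing tau near the true mean keeps its variance within a small constant factor of the clairvoyant sample-mean estimator, which is the kind of MSE-order result the paper establishes.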


Journal ArticleDOI
TL;DR: The goals of this article are to survey basic cooperative communications and outline two potential architectures for cooperative MANETs, which provide modified wireless link abstractions and suggest tradeoffs in complexity at the physical and higher layers.
Abstract: The motivation of this article is to clarify and help resolve the gap between the link abstraction used in traditional wireless networking and its much broader definition used in the context of cooperative communications, which has received significant interest as an untapped means for improving performance of relay transmission systems operating over the ever-challenging wireless medium. The common theme of most research in this area is to optimize physical-layer performance measures without considering in much detail how cooperation interacts with higher layers and improves network performance measures. Because these issues are important for bringing cooperative communications into practice in real-world networks, especially for the increasingly important class of mobile ad hoc networks (MANETs), the goals of this article are to survey basic cooperative communications and outline two potential architectures for cooperative MANETs. The first architecture relies on an existing clustered infrastructure: cooperative relays are centrally controlled by cluster heads. In the second architecture, without explicit clustering, cooperative links are formed at the request of a source node in an ad hoc, decentralized fashion. In either case, cooperative communication considerably improves network connectivity. Although far from a complete study, these architectures provide modified wireless link abstractions and suggest tradeoffs in complexity at the physical and higher layers. Many opportunities and challenges remain, including distributed synchronization, coding, and signal processing among multiple radios; modeling of new link abstractions at higher layers; and multiaccess and routing protocols for networks of cooperative links.

334 citations


Journal ArticleDOI
TL;DR: The state-of-the-art in automatic genre classification of music collections through three main paradigms: expert systems, unsupervised classification, and supervised classification is reviewed.
Abstract: This paper reviews the state-of-the-art in automatic genre classification of music collections through three main paradigms: expert systems, unsupervised classification, and supervised classification. The paper discusses the importance of music genres with their definitions and hierarchies. It also presents techniques to extract meaningful information from audio data to characterize musical excerpts. The paper also presents the results of emerging research fields and techniques that investigate the proximity of music genres.

327 citations


Journal ArticleDOI
TL;DR: This paper reviews the classical decentralized decision theory in the light of new constraints and requirements and concludes that an integrated channel-aware approach needs to be taken for optimal detection performance given the available resources.
Abstract: This paper reviews the classical decentralized decision theory in the light of new constraints and requirements. The central theme that transcends various aspects of signal processing design is that an integrated channel-aware approach needs to be taken for optimal detection performance given the available resources.

306 citations


Journal ArticleDOI
TL;DR: This paper focuses not on the high-level video analysis tasks themselves but on the common basic techniques that have been developed to facilitate them, including shot boundary detection and condensed video representation.
Abstract: There is an urgent need to develop techniques that organize video data into more compact forms or extract semantically meaningful information. Such operations can serve as a first step for a number of different data access tasks such as browsing, retrieval, genre classification, and event detection. In this paper, we focus not on the high-level video analysis tasks themselves but on the common basic techniques that have been developed to facilitate them. These basic tasks are shot boundary detection and condensed video representation.

282 citations
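The simplest of the basic techniques mentioned above, hard-cut detection, can be sketched with a per-frame histogram difference. The frame format, bin count, and threshold below are illustrative choices, not the paper's.

```python
# Toy histogram-difference cut detector. A "frame" here is a flat list of
# 8-bit gray values; real detectors add motion compensation and adaptive
# thresholds to tell cuts from gradual transitions.

def gray_histogram(frame, bins=8):
    """Histogram of 8-bit gray values, normalized to sum to 1."""
    hist = [0] * bins
    for v in frame:
        hist[v * bins // 256] += 1
    total = len(frame)
    return [h / total for h in hist]

def detect_cuts(frames, threshold=0.5):
    """Flag a shot boundary where consecutive histograms differ strongly (L1)."""
    cuts = []
    prev = gray_histogram(frames[0])
    for i in range(1, len(frames)):
        cur = gray_histogram(frames[i])
        if sum(abs(a - b) for a, b in zip(prev, cur)) > threshold:
            cuts.append(i)
        prev = cur
    return cuts

# Synthetic "video": five dark frames, then five bright frames
dark = [20] * 64
bright = [230] * 64
cuts = detect_cuts([dark] * 5 + [bright] * 5)
# cuts == [5]
```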


Journal ArticleDOI
Joseph R. Guerci1, E.J. Baranoski1
TL;DR: An overview of the KASSPER program is provided, highlighting the benefits of KA adaptive radar, key algorithmic concepts, and the breakthrough look-ahead radar scheduling approach that is the keystone of the KASSPER HPEC architecture.
Abstract: For the past several years, the Defense Advanced Research Projects Agency (DARPA) has been pioneering the development of the first ever real-time knowledge-aided (KA) adaptive radar architecture. The impetus for this program is the increasingly complex missions and operational environments encountered by modern radars and the inability of traditional adaptation methods to address rapidly varying interference environments. The DARPA KA sensor signal processing and expert reasoning (KASSPER) program has as its goal the demonstration of a high performance embedded computing (HPEC) architecture capable of integrating high-fidelity environmental knowledge (i.e., priors) into the most computationally demanding subsystem of a modern radar: the adaptive space-time beamformer. This is no mean feat, as environmental knowledge is a memory quantity that is inherently difficult (if not impossible) to access at the rates required to meet radar front-end throughput requirements. In this article, we provide an overview of the KASSPER program, highlighting the benefits of KA adaptive radar, key algorithmic concepts, and the breakthrough look-ahead radar scheduling approach that is the keystone of the KASSPER HPEC architecture.

248 citations


Journal ArticleDOI
TL;DR: The paper discusses the main ingredients of the commonly used tracking paradigm and subsequently reconsiders its competence by comparing it to certain aspects of visual motion perception in human beings, keeping in mind the complexity and variability of biological image data.
Abstract: This paper aims to stimulate the application of more advanced computer vision techniques to tracking in biological molecular imaging by surveying the literature and sketching the current state of affairs in the field for a signal and image processing audience. After describing the basic principles of visualizing molecular dynamics in living cells and giving some examples of biological molecular dynamics studies, the paper summarizes the problems and limitations intrinsic to imaging at this scale. The paper then discusses the main ingredients of the commonly used tracking paradigm and subsequently reconsiders its competence by comparing it to certain aspects of visual motion perception in human beings, keeping in mind the complexity and variability of biological image data. Finally, it summarizes the main points of attention for future research and the challenges that lie ahead.

240 citations


Journal ArticleDOI
TL;DR: The high potential of the affective video content analysis for enhancing the content recommendation functionalities of the future PVRs and VOD systems is shown.
Abstract: This paper considers how we feel about the content we see or hear. As opposed to the cognitive content information composed of the facts about the genre, temporal content structures, and spatiotemporal content elements, we are interested in obtaining information about the feelings, emotions, and moods evoked by a speech, audio, or video clip. We refer to the latter as the affective content, and to terms such as happy or exciting as the affective labels of an audiovisual signal. In the first part of the paper, we explore the possibilities for representing and modeling the affective content of an audiovisual signal to effectively bridge the affective gap. Without losing generality, we refer to this signal simply as video, which we see as an image sequence with an accompanying soundtrack. Then, we show the high potential of affective video content analysis for enhancing the content recommendation functionalities of future PVRs and VOD systems. We conclude this paper by outlining some interesting research challenges in the field.

Journal ArticleDOI
TL;DR: This paper presents an overview of research conducted to bridge the rich field of graphical models with the emerging field of data fusion for sensor networks.
Abstract: This paper presents an overview of research conducted to bridge the rich field of graphical models with the emerging field of data fusion for sensor networks. Both theoretical issues and prototyping applications are discussed in addition to suggesting new lines of reasoning.

Journal ArticleDOI
TL;DR: This article provides a brief review of radar space-time adaptive processing from its inception to state-of-the art developments, focusing on signal processing for radar systems using multiple antenna elements that coherently process multiple pulses.
Abstract: This article provides a brief review of radar space-time adaptive processing (STAP) from its inception to state-of-the art developments. The topic is treated from both intuitive and theoretical aspects. A key requirement of STAP is knowledge of the spectral characteristics underlying the interference scenario of interest. Additional issues of importance in STAP include the computational cost of the adaptive algorithm as well as the ability to maintain a constant false alarm rate (CFAR) over widely varying interference statistics. This article addresses these topics, developing the need for a knowledge-based (KB) perspective. The focus here is on signal processing for radar systems using multiple antenna elements that coherently process multiple pulses. An adaptive array of spatially distributed sensors, which processes multiple temporal snapshots, overcomes the directivity and resolution limitations of a single sensor.

Journal ArticleDOI
TL;DR: A tutorial on the existing abstraction work for generic videos and state-of-the-art techniques for feature film skimming are presented and the authors' recent work on movie skimming using audiovisual tempo analysis and specific cinematic rules are described.
Abstract: With the proliferation of digital video, video summarization and skimming has become an indispensable tool of any practical video content management system. This paper provides a tutorial on the existing abstraction work for generic videos and presents state-of-the-art techniques for feature film skimming. The paper also describes the authors' recent work on movie skimming using audiovisual tempo analysis and specific cinematic rules. With the maturing of movie genre classification, content understanding, and video abstraction techniques, an automatic movie content analysis system that facilitates navigation, browsing, and search of desired movie content will be possible in the near future.

Journal ArticleDOI
TL;DR: An overview of the main aspects of modern fluorescence microscopy is provided, which covers the principles of fluorescence and highlights the key discoveries in the history of fluorescent microscopy.
Abstract: This paper provides an overview of the main aspects of modern fluorescence microscopy. It covers the principles of fluorescence and highlights the key discoveries in the history of fluorescence microscopy. The paper also discusses the optics of fluorescence microscopes and examines the various types of detectors. It also discusses the signal and image processing challenges in fluorescence microscopy and highlights some of the present developments and future trends in the field.

Journal ArticleDOI
TL;DR: The Viterbi algorithm is now used in most digital cellular phones and digital satellite receivers, as well as in such diverse fields as magnetic recording, voice recognition, and DNA sequence analysis.
Abstract: This paper describes how Andrew J. Viterbi developed a non-sequential decoding algorithm which proved useful in showing the superiority of convolutional codes over block codes for a given degree of decoding complexity. The Viterbi algorithm is now used in most digital cellular phones and digital satellite receivers, as well as in such diverse fields as magnetic recording, voice recognition, and DNA sequence analysis.
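The dynamic-programming idea behind the algorithm is easy to sketch on a toy hidden Markov model; the weather/umbrella example below is a generic illustration, not Viterbi's original convolutional-code setting.

```python
# Minimal Viterbi decoder for a discrete HMM (max-product dynamic programming).

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely hidden-state path for the observation sequence."""
    # best[s] = probability of the best path ending in state s
    best = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    back = []                                  # backpointers, one dict per step
    for o in obs[1:]:
        ptr, nxt = {}, {}
        for s in states:
            # Best predecessor of s, maximizing the path probability so far
            prev = max(states, key=lambda r: best[r] * trans_p[r][s])
            ptr[s] = prev
            nxt[s] = best[prev] * trans_p[prev][s] * emit_p[s][o]
        back.append(ptr)
        best = nxt
    last = max(best, key=best.get)             # best final state
    path = [last]
    for ptr in reversed(back):                 # trace the backpointers
        last = ptr[last]
        path.append(last)
    return path[::-1]

states = ("rain", "dry")
start = {"rain": 0.5, "dry": 0.5}
trans = {"rain": {"rain": 0.7, "dry": 0.3}, "dry": {"rain": 0.3, "dry": 0.7}}
emit = {"rain": {"umbrella": 0.9, "none": 0.1},
        "dry": {"umbrella": 0.2, "none": 0.8}}
path = viterbi(["umbrella", "umbrella", "none"], states, start, trans, emit)
# path == ["rain", "rain", "dry"]
```

In a convolutional decoder the "states" are shift-register contents and the "emissions" are branch metrics, but the survivor-path recursion is identical; production decoders work in the log domain to avoid underflow.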

Journal ArticleDOI
TL;DR: A new closed-form approximate solution is introduced for the problem of locating a radiating source from range-difference observations and briefly comments on the related problem of source localization from energy or range measurements.
Abstract: This paper considers the problem of locating a radiating source from range-difference observations. This specific source localization problem has received significant attention for at least 20 years, and several solutions have been proposed to solve it either approximately or exactly. However, some of these solutions have not been described clearly, and confusion seems to persist. This paper aims to clarify and streamline the most successful solutions. It introduces a new closed-form approximate solution and briefly comments on the related problem of source localization from energy or range measurements.
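One classical linearized closed-form approach of the kind the paper streamlines can be sketched as follows (the details are illustrative, not the paper's new solution). Writing d_i = |x - s_i| - |x - s_0| for reference sensor s_0, squaring |x - s_i| = r_0 + d_i and subtracting the reference equation gives, per sensor i, 2(s_i - s_0)·x + 2 d_i r_0 = |s_i|^2 - |s_0|^2 - d_i^2, which is linear in the unknowns (x, r_0) and solvable by least squares.

```python
# Linearized closed-form TDOA localization sketch (2-D, pure Python).
import math

def solve3(A, b):
    """Gauss-Jordan elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(3):
        pivot = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[pivot] = M[pivot], M[c]
        for r in range(3):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * p for a, p in zip(M[r], M[c])]
    return [M[i][3] / M[i][i] for i in range(3)]

def locate(sensors, rdiffs):
    """2-D source position from range differences d_i = |x-s_i| - |x-s0|."""
    s0 = sensors[0]
    rows, rhs = [], []
    for (sx, sy), d in zip(sensors[1:], rdiffs):
        rows.append([2 * (sx - s0[0]), 2 * (sy - s0[1]), 2 * d])
        rhs.append(sx * sx + sy * sy - s0[0] ** 2 - s0[1] ** 2 - d * d)
    # Normal equations A^T A z = A^T b for the unknowns z = (x, y, r0)
    AtA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    Atb = [sum(r[i] * v for r, v in zip(rows, rhs)) for i in range(3)]
    x, y, _ = solve3(AtA, Atb)
    return x, y

src = (3.0, 4.0)
sensors = [(0, 0), (10, 0), (0, 10), (10, 10), (5, -5)]
r0 = math.dist(src, sensors[0])
rdiffs = [math.dist(src, s) - r0 for s in sensors[1:]]
# locate(sensors, rdiffs) recovers (3.0, 4.0) on noiseless data
```

With noisy range differences the same least-squares structure applies, though treating r_0 as a free variable ignores its dependence on x; the more refined closed forms in the literature address exactly that.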

Journal ArticleDOI
TL;DR: This paper addresses the important aspect of compressing and transmitting video signals generated by wireless broadband networks while heeding the architectural demands imposed by these networks in terms of energy constraints as well as the channel uncertainty related to the wireless communication medium.
Abstract: This paper addresses the important aspect of compressing and transmitting video signals generated by wireless broadband networks while heeding the architectural demands imposed by these networks in terms of energy constraints as well as the channel uncertainty related to the wireless communication medium. Driven by the need to develop light, robust, energy-efficient, and low-delay video delivery schemes, a distributed video coding based framework dubbed PRISM is introduced. PRISM addresses the wireless video sensor network requirements far more effectively than current state-of-the-art standards like MPEG. This paper focuses on the case of a single video camera and uses it as a platform to describe the theoretical principles and practical aspects underlying distributed video coding.

Journal ArticleDOI
TL;DR: Knowledge-based (KB) signal and data processing techniques offer the promise of significantly improved performance for radar systems, including in positioning, waveform selection, and modes of operation.
Abstract: Radar systems are an important component in military operations. In response to increasingly severe threats from military targets with reduced radar cross sections (RCSs), slow-moving and low-flying aircraft hidden in foliage, and in environments with large numbers of targets, knowledge-based (KB) signal and data processing techniques offer the promise of significantly improved performance of all radar systems. Radars under KB control can be deployed to utilize valuable resources such as airspace or runways more effectively and to aid human operators in carrying out their missions. As battlefield scenarios become more complex with increasing numbers of sensors and weapon systems, the challenge will be to use already available information effectively to enhance radar performance, including positioning, waveform selection, and modes of operation. KB processing fills this need and helps meet the challenge.

Journal ArticleDOI
TL;DR: This paper recollects the events that led to proposing the linear predictive coding (LPC) method, then multipulse LPC and code-excited LPC.
Abstract: This paper recollects the events that led to proposing the linear predictive coding (LPC) method, then multipulse LPC and code-excited LPC.

Journal ArticleDOI
TL;DR: This article motivates the signal processing community to address the challenging data modeling and other informatics issues of high-throughput screening (HTS) using automated fluorescence microscopy, otherwise known as high-content screening (HCS) in the pharmaceutical industry.
Abstract: In this article, we discuss the emerging informatics issues of high-throughput screening (HTS) using automated fluorescence microscopy technology, otherwise known as high-content screening (HCS) in the pharmaceutical industry. Optimal methods of scoring biomarkers and identifying candidate hits have been actively studied in academia and industry, with the exception of data modeling topics. To find candidate hits, we need to score the images associated with different compound interventions. In the application example of RNAi genome-wide screening, we aim to find the candidate effectors or genes corresponding to the images acquired in the three channels. Scoring the effectors is equivalent to scoring the images based on the number of phenotypes present in those images. Our ultimate objective in studying HTS is to model the relationship between gene networks and cellular phenotypes, investigate cellular communication via protein interactions, and study disease mechanisms beyond prediction based on the molecular structure of the compound. Finally, computational image analysis has become a powerful tool in cellular and molecular biology studies. Signal processing and modeling for high-throughput image screening is an emerging field that requires novel algorithms for dynamical system analysis, image processing, and statistical modeling. We hope that this article will motivate the signal processing community to address the challenging data modeling and other informatics issues of HTS.

Journal ArticleDOI
TL;DR: Some of the main difficulties posed by cellular imaging are pointed out, and a selection of current efforts on tracking moving cells are reviewed, with an emphasis on deformable model approaches.
Abstract: Cell migration is a field of intense current research, where biologists increasingly rely on methods and expertise from physics and engineering. Signal processing approaches can contribute significantly to this research, most notably in helping to analyze the exploding quantity of imaging data produced with standard and new microscopy techniques. In this article, we first provide a brief background on the importance of understanding cell movements, then review a selection of current efforts on tracking moving cells, with an emphasis on deformable model approaches (some ideas expressed in this article have been discussed previously). We point out some of the main difficulties posed by cellular imaging, discuss the advantages and limitations of different tracking techniques, and suggest a few directions for future advances.


Journal ArticleDOI
TL;DR: The latest advances made in determining the theoretical capacity bounds are surveyed and the best practical code designs reported so far are described, predicting that cooperative communication can provide increased capacity and power savings in ad hoc networks.
Abstract: Cooperative diversity is a novel technique proposed for conveying information in wireless ad hoc networks, where closely located single-antenna network nodes cooperatively transmit and/or receive by forming virtual antenna arrays. For its building blocks, the relay channel and the two-transmitter, two-receiver cooperative channel, we survey the latest advances made in determining the theoretical capacity bounds and describe the best practical code designs reported so far. Both theory and practice predict that cooperative communication can provide increased capacity and power savings in ad hoc networks.

Journal ArticleDOI
TL;DR: It is shown how the structure of the distributed sensing and communication problem dictates new processing architectures, and the key challenge lies in the discretization of space, time and amplitude.
Abstract: To illustrate the conceptual issues related to sampling, source representation/coding, and communication in sensor networks, we review the underlying theory and discuss specific examples. We show how the structure of the distributed sensing and communication problem dictates new processing architectures. The key challenge lies in the discretization of space, time, and amplitude, since most advanced signal processing systems operate in the discrete domain.


Journal ArticleDOI
TL;DR: Theoretical background on the necessary information-theoretic concepts is provided, nonparametric sample estimators for these quantities are derived and discussed, and the use of these estimators in various statistical signal processing problems is illustrated.
Abstract: Recent advances in computing capabilities and interest in new, challenging signal processing problems that cannot be solved successfully using traditional techniques have sparked interest in information-theoretic signal processing techniques. Adaptive nonlinear filters that process signals based on their information content have become a major focus of interest. The design and analysis of such nonlinear information processing systems is demonstrated in this paper. Theoretical background on the necessary information-theoretic concepts is provided, nonparametric sample estimators for these quantities are derived and discussed, and the use of these estimators in various statistical signal processing problems is illustrated. These include data density modeling, system identification, blind source separation, dimensionality reduction, image registration, and data clustering.
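One concrete instance of such a nonparametric sample estimator is Renyi's quadratic entropy computed through a Gaussian Parzen window, often called the information potential in this literature. The kernel width below is an illustrative choice, not one prescribed by the paper.

```python
# Renyi quadratic entropy from samples via a Gaussian Parzen window.
# H2 = -log V, where V is the pairwise "information potential".
import math

def information_potential(samples, sigma=0.5):
    """V = (1/N^2) * sum_ij G(x_i - x_j; 2*sigma^2) for a Gaussian kernel G."""
    n = len(samples)
    var = 2.0 * sigma * sigma                  # kernel variances add pairwise
    norm = 1.0 / math.sqrt(2.0 * math.pi * var)
    total = sum(norm * math.exp(-(a - b) ** 2 / (2.0 * var))
                for a in samples for b in samples)
    return total / (n * n)

def renyi_quadratic_entropy(samples, sigma=0.5):
    return -math.log(information_potential(samples, sigma))

tight = [0.0, 0.05, -0.05, 0.02, -0.02]        # tightly clustered samples
spread = [-2.0, -1.0, 0.0, 1.0, 2.0]           # widely spread samples
```

Because every pairwise kernel evaluation is large when samples cluster, the tight set yields a high information potential and hence a low entropy estimate, while the spread set yields a higher one; gradients of V with respect to system parameters are what drive the adaptive filters the paper describes.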

Journal ArticleDOI
TL;DR: Results suggest that the fuzzy approach is a valid means of evaluating the relative importance of the radar tasks; the resulting priorities have been adapted by the fuzzy logic prioritization method, according to how the radar system perceived the surrounding environment.
Abstract: In this article, we consider two related aspects of radar resource management: scheduling and task prioritization. Two different methods of scheduling are examined and compared, and their differences and similarities highlighted. The comparison suggests that prioritization of tasks plays a dominant role in determining performance. A prioritization scheme based on fuzzy logic is subsequently contrasted and compared with a hard-logic approach as a basis for task prioritization. The setting of priorities is shown to be critically dependent on prior expert knowledge. By assessing the priorities of targets and sectors of surveillance according to a set of rules, the scheme attempts to imitate the human decision-making process so that the resource manager can distribute radar resources more effectively. Results suggest that the fuzzy approach is a valid means of evaluating the relative importance of radar tasks; the resulting priorities are adapted by the fuzzy logic prioritization method according to how the radar system perceives the surrounding environment.

Journal ArticleDOI
TL;DR: This paper provides several efficient approximations for the arctangent function using Lagrange interpolation and minimax optimization techniques, and extends them to all four quadrants.
Abstract: This paper provides several efficient approximations for the arctangent function using Lagrange interpolation and minimax optimization techniques. These approximations are particularly useful when processing power, memory, and power consumption are important issues. In addition to comparing the errors and the computational workload of these approximations, we also extend them to all four quadrants.
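One approximation of this flavor is a simple second-order formula on [-1, 1] with a coefficient chosen to shrink the maximum error, extended to all four quadrants with the usual arctangent identities. The exact polynomials and coefficients in the paper may differ; the sketch below only illustrates the structure.

```python
# Low-cost arctangent approximation sketch (illustrative coefficients).
import math

def atan_approx(x):
    """Approximate atan(x) on [-1, 1]; max error on the order of 0.004 rad."""
    return math.pi / 4 * x + 0.273 * x * (1.0 - abs(x))

def atan2_approx(y, x):
    """Four-quadrant extension via atan(1/z) = pi/2 - atan(z) and sign fixes."""
    if x == 0.0 and y == 0.0:
        return 0.0
    if abs(y) <= abs(x):                       # octants around the x-axis
        angle = atan_approx(y / x)
        if x < 0.0:                            # fold into quadrants II / III
            angle += math.pi if y >= 0.0 else -math.pi
        return angle
    angle = math.pi / 2 - atan_approx(x / y)   # octants around the y-axis
    return angle if y > 0.0 else angle - math.pi
```

Restricting the polynomial to one octant keeps its degree (and hence the multiply count) low, which is the point when processing power and memory are scarce; the division in the octant fold is typically the dominant cost on small fixed-point processors.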

Journal ArticleDOI
TL;DR: It is only through the use of automated content-based analysis that sports viewers will be given a chance to manipulate content at a much deeper level than that intended by broadcasters, and hence put true meaning into interactivity.
Abstract: This paper aims to identify the current trends in sports-based indexing and retrieval work. It discusses the essential building blocks for any semantic-level retrieval system and acts as a case study in content analysis system design. While one of the major benefits of digital media, and of digital television in particular, has been to provide users with more choices and a more interactive viewing experience, the freedom to choose has in fact manifested as the freedom to choose from the options the broadcaster provides. It is only through the use of automated content-based analysis that sports viewers will be given a chance to manipulate content at a much deeper level than that intended by broadcasters, and hence put true meaning into interactivity.