
Showing papers in "IEEE Signal Processing Magazine in 2009"


Journal ArticleDOI
TL;DR: This article reviews the reasons why people want to love or leave the venerable (but perhaps hoary) MSE, surveys emerging alternative signal fidelity measures, and discusses their potential application to a wide variety of problems.
Abstract: In this article, we have reviewed the reasons why we (collectively) want to love or leave the venerable (but perhaps hoary) MSE. We have also reviewed emerging alternative signal fidelity measures and discussed their potential application to a wide variety of problems. The message we are trying to send here is not that one should abandon use of the MSE, nor that one should blindly switch to any other particular signal fidelity measure. Rather, we hope to make the point that there are powerful, easy-to-use, and easy-to-understand alternatives that might be deployed depending on the application environment and needs. While we expect (and indeed, hope) that the MSE will continue to be widely used as a signal fidelity measure, it is our greater desire to see more advanced signal fidelity measures being used, especially in applications where perceptual criteria might be relevant. Ideally, the performance of a new signal processing algorithm might be compared to other algorithms using several fidelity criteria. Lastly, we hope that we have given further motivation to the community to consider recent advanced signal fidelity measures as design criteria for optimizing signal processing algorithms and systems. It is in this direction that we believe that the greatest benefit eventually lies.
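As a concrete illustration of the kind of comparison the article advocates, the short sketch below computes both the MSE and one widely used perceptual alternative (SSIM, via scikit-image) for a synthetically distorted test image. The test image, noise level, and the choice of SSIM as the alternative measure are illustrative assumptions, not taken from the article.

```python
import numpy as np
from skimage import data
from skimage.metrics import structural_similarity

ref = data.camera().astype(np.float64)                 # reference image, 0..255
rng = np.random.default_rng(0)
noisy = np.clip(ref + 15.0 * rng.standard_normal(ref.shape), 0, 255)  # hypothetical distortion

mse = np.mean((ref - noisy) ** 2)                      # the venerable fidelity measure
ssim = structural_similarity(ref, noisy, data_range=255.0)  # one perceptual alternative

print(f"MSE  = {mse:.1f}")
print(f"SSIM = {ssim:.3f}")
```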

2,601 citations


MonographDOI
TL;DR: A broad survey of models and efficient algorithms for nonnegative matrix factorization (NMF) can be found in this paper, where the authors focus on the algorithms that are most useful in practice, looking at the fastest and most robust algorithms and at those most suitable for large-scale models.
Abstract: This book provides a broad survey of models and efficient algorithms for Nonnegative Matrix Factorization (NMF). This includes NMF's various extensions and modifications, especially Nonnegative Tensor Factorizations (NTF) and Nonnegative Tucker Decompositions (NTD). NMF/NTF and their extensions are increasingly used as tools in signal and image processing and data analysis, having garnered interest due to their capability to provide new insights and relevant information about the complex latent relationships in experimental data sets. It is suggested that NMF can provide meaningful components with physical interpretations; for example, in bioinformatics, NMF and its extensions have been successfully applied to gene expression, sequence analysis, the functional characterization of genes, clustering, and text mining. As such, the authors focus on the algorithms that are most useful in practice, looking at the fastest and most robust algorithms and at those most suitable for large-scale models. Key features: Acts as a single-source reference guide to NMF, collating information that is widely dispersed in the current literature, including the authors' own recently developed techniques in the subject area. Uses generalized cost functions such as Bregman, Alpha and Beta divergences to present practical implementations of several types of robust algorithms, in particular Multiplicative, Alternating Least Squares, Projected Gradient and Quasi-Newton algorithms. Provides a comparative analysis of the different methods in order to identify approximation error and complexity. Includes pseudocode and optimized MATLAB source code for almost all algorithms presented in the book. The increasing interest in nonnegative matrix and tensor factorizations, as well as decompositions and sparse representation of data, will ensure that this book is essential reading for engineers, scientists, researchers, industry practitioners, and graduate students across signal and image processing; neuroscience; data mining and data analysis; computer science; bioinformatics; speech processing; biomedical engineering; and multimedia.
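For readers who want a feel for one of the algorithm families surveyed, here is a minimal NumPy sketch of the classical multiplicative-update NMF iteration for the Frobenius-norm cost. The matrix size, rank, and iteration count are arbitrary choices; the book's own MATLAB implementations are the authoritative reference.

```python
import numpy as np

def nmf_multiplicative(V, rank, n_iter=200, eps=1e-9, seed=0):
    """Factor a nonnegative matrix V (m x n) as V ~= W @ H with W, H >= 0
    using Lee-Seung style multiplicative updates."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update H; products stay nonnegative
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update W
    return W, H

V = np.random.default_rng(1).random((60, 40))  # synthetic nonnegative data
W, H = nmf_multiplicative(V, rank=5)
print("reconstruction error:", np.linalg.norm(V - W @ H, "fro"))
```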

1,142 citations


Journal ArticleDOI
TL;DR: In this article, the authors provide a comprehensive overview of coalitional game theory and its usage in wireless and communication networks, along with an in-depth analysis of the methodologies and approaches for using these games in both game-theoretic and communication applications.
Abstract: In this tutorial, we provided a comprehensive overview of coalitional game theory and its usage in wireless and communication networks. For this purpose, we introduced a novel classification of coalitional games by grouping the sparse literature into three distinct classes of games: canonical coalitional games, coalition formation games, and coalitional graph games. For each class, we explained in detail the fundamental properties, discussed the main solution concepts, and provided an in-depth analysis of the methodologies and approaches for using these games in both game theory and communication applications. The presented applications have been carefully selected from a broad range of areas spanning a diverse number of research problems. The tutorial also sheds light on future opportunities for using the strong analytical tool of coalitional games in a number of applications. In a nutshell, this article fills a void in the existing communications literature by providing a novel tutorial on applying coalitional game theory in communication networks through comprehensive theory and technical details as well as through practical examples drawn from both game theory and communication applications.
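As a toy illustration of a coalitional game in characteristic form, the sketch below computes the Shapley value, one standard solution concept for such games, for a hypothetical three-player game. The characteristic function is made up for the example and does not come from the tutorial.

```python
from itertools import permutations

players = [0, 1, 2]
# Hypothetical characteristic function v(S): value created by each coalition S.
v = {frozenset(): 0, frozenset({0}): 1, frozenset({1}): 1, frozenset({2}): 1,
     frozenset({0, 1}): 3, frozenset({0, 2}): 3, frozenset({1, 2}): 3,
     frozenset({0, 1, 2}): 6}

# Shapley value: average marginal contribution over all joining orders.
shapley = {p: 0.0 for p in players}
orders = list(permutations(players))
for order in orders:
    coalition = set()
    for p in order:
        marginal = v[frozenset(coalition | {p})] - v[frozenset(coalition)]
        shapley[p] += marginal / len(orders)
        coalition.add(p)

print(shapley)   # symmetric game, so each player receives 2.0
```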

892 citations


Journal ArticleDOI
Hany Farid
TL;DR: The field of digital forensics has emerged to help restore some trust to digital images and the author reviews the state of the art in this new and exciting field.
Abstract: We are undoubtedly living in an age where we are exposed to a remarkable array of visual imagery. While we may have historically had confidence in the integrity of this imagery, today's digital technology has begun to erode this trust. From the tabloid magazines to the fashion industry and in mainstream media outlets, scientific journals, political campaigns, courtrooms, and the photo hoaxes that land in our e-mail in-boxes, doctored photographs are appearing with a growing frequency and sophistication. Over the past five years, the field of digital forensics has emerged to help restore some trust to digital images. The author reviews the state of the art in this new and exciting field.

825 citations


Journal ArticleDOI
TL;DR: How photo-response nonuniformity (PRNU) of imaging sensors can be used for a variety of important digital forensic tasks, such as device identification, device linking, recovery of processing history, and detection of digital forgeries is explained.
Abstract: The article explains how photo-response nonuniformity (PRNU) of imaging sensors can be used for a variety of important digital forensic tasks, such as device identification, device linking, recovery of processing history, and detection of digital forgeries. The PRNU is an intrinsic property of all digital imaging sensors due to slight variations among individual pixels in their ability to convert photons to electrons. Consequently, every sensor casts a weak noise-like pattern onto every image it takes. This pattern, which plays the role of a sensor fingerprint, is essentially an unintentional stochastic spread-spectrum watermark that survives processing, such as lossy compression or filtering. This tutorial explains how this fingerprint can be estimated from images taken by the camera and later detected in a given image to establish image origin and integrity. Various forensic tasks are formulated as a two-channel hypothesis testing problem approached using the generalized likelihood ratio test. The performance of the introduced forensic methods is briefly illustrated on examples to give the reader a sense of the performance.
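The following simplified sketch illustrates the workflow described above on synthetic data: a PRNU-like pattern is estimated from several images of one simulated "camera" and then detected in a query image by correlating its noise residual with the estimated fingerprint. A Gaussian filter stands in for the wavelet-based denoiser normally used, and all signal levels are made-up values.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def residual(img):
    """Noise residual W = I - F(I); Gaussian smoothing stands in for the denoiser."""
    return img - gaussian_filter(img, sigma=2)

rng = np.random.default_rng(0)
K_true = 0.02 * rng.standard_normal((128, 128))       # hypothetical PRNU pattern

def shoot(scene):                                      # toy camera: I = scene*(1 + K) + noise
    return scene * (1.0 + K_true) + rng.standard_normal(scene.shape)

scenes = [100 + 50 * gaussian_filter(rng.random((128, 128)), 8) for _ in range(20)]
imgs = [shoot(s) for s in scenes]

# Fingerprint estimate in the maximum-likelihood style: K_hat = sum(W_i I_i) / sum(I_i^2)
K_hat = sum(residual(I) * I for I in imgs) / sum(I * I for I in imgs)

query = shoot(100 + 50 * gaussian_filter(rng.random((128, 128)), 8))  # same "camera"
corr = np.corrcoef((K_hat * query).ravel(), residual(query).ravel())[0, 1]
print("normalized correlation:", corr)                 # large -> fingerprint present
```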

326 citations


Journal ArticleDOI
TL;DR: Attributes common to all multicore processor implementations are covered, including application domain, power/performance, processing elements, memory system, and accelerators/integrated peripherals.
Abstract: General-purpose multicore processors are being accepted in all segments of the industry, including signal processing and embedded space, as the need for more performance and general-purpose programmability has grown. Parallel processing increases performance by adding more parallel resources while maintaining manageable power characteristics. The implementations of multicore processors are numerous and diverse. Designs range from conventional multiprocessor machines to designs that consist of a "sea" of programmable arithmetic logic units (ALUs). In this article, we cover some of the attributes common to all multicore processor implementations and illustrate these attributes with current and future commercial multicore designs. The characteristics we focus on are application domain, power/performance, processing elements, memory system, and accelerators/integrated peripherals.

299 citations


Journal ArticleDOI
TL;DR: An overview of the state of the art of recent research activities in noncoherent ultra-wideband receiver front ends is provided with a focus on architectures that perform the initial signal processing tasks in the analog domain, such that the receiver does not need to sample the UWB received signals at Nyquist rate.
Abstract: The need for low-complexity devices with low-power consumption motivates the application of suboptimal noncoherent ultra-wideband (UWB) receivers. This article provides an overview of the state of the art of recent research activities in this field. It introduces energy detection and autocorrelation receiver front ends with a focus on architectures that perform the initial signal processing tasks in the analog domain, such that the receiver does not need to sample the UWB received signals at Nyquist rate. Common signaling and multiple access schemes are reviewed for both front ends. An elaborate section illustrates various performance tradeoffs to highlight preferred system choices. Practical issues are discussed, including, for low-data-rate schemes, the allowed power allocation per pulse according to the regulator's ruling and the estimated power consumption of a receiver chip. A large part is devoted to signal processing steps needed in a digital receiver. It starts with synchronization and time-of-arrival estimation schemes, introduces studies about the narrowband interference problem, and describes solutions for high-data-rate and multiple access communications. Drastic advantages concerning complexity and robustness justify the application of noncoherent UWB systems, particularly for low-data-rate systems.
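As a rough illustration of the energy-detection idea, the sketch below simulates a 2-PPM UWB-like link and decides each bit by comparing the energy collected in the two candidate slots, so no Nyquist-rate digital processing of the pulse shape is required. The pulse shape, symbol rate, and noise level are illustrative assumptions rather than values from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 2e9                                   # simulation rate standing in for "analog"
n_sym = int(200e-9 * fs)                   # 200 ns symbol, split into two PPM slots
n_slot = n_sym // 2

def tx_symbol(bit):
    s = np.zeros(n_sym)
    pos = 0 if bit == 0 else n_slot        # pulse in first or second slot (2-PPM)
    s[pos:pos + 8] = 1.0                   # short, idealized UWB pulse
    return s

bits = rng.integers(0, 2, 1000)
rx = np.concatenate([tx_symbol(b) for b in bits])
rx += 0.3 * rng.standard_normal(rx.size)   # additive noise

# Energy detector: only per-slot energies are needed, not the pulse waveform itself.
energy = np.sum(rx.reshape(len(bits), 2, n_slot) ** 2, axis=2)
decisions = (energy[:, 1] > energy[:, 0]).astype(int)
print("bit error rate:", np.mean(decisions != bits))
```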

278 citations


Journal ArticleDOI
TL;DR: If experiments are performed on a large data set, the algorithm is compared to state-of-the-art methods, and the code and/or data are well documented and available online, the community will benefit and it will be easier to build upon each other's work.
Abstract: What should we do to raise the quality of signal processing publications to an even higher level? We believe it to be crucial to maintain precision in describing our work in publications, ensured through a high-quality reviewing process. We also believe that if the experiments are performed on a large data set, the algorithm is compared to the state-of-the-art methods, and the code and/or data are well documented and available online, we will all benefit and it will be easier to build upon each other's work. It is a clear win-win situation for our community: we will have access to more and more algorithms and can spend time inventing new things rather than recreating existing ones.

273 citations


Journal ArticleDOI
TL;DR: The working group producing this article was charged to elicit from the human language technology community a set of well-considered directions or rich areas for future research that could lead to major paradigm shifts in the field of automatic speech recognition (ASR) and understanding.
Abstract: To advance research, it is important to identify promising future research directions, especially those that have not been adequately pursued or funded in the past. The working group producing this article was charged to elicit from the human language technology (HLT) community a set of well-considered directions or rich areas for future research that could lead to major paradigm shifts in the field of automatic speech recognition (ASR) and understanding. ASR has been an area of great interest and activity to the signal processing and HLT communities over the past several decades. As a first step, this group reviewed major developments in the field and the circumstances that led to their success and then focused on areas it deemed especially fertile for future research. Part 1 of this article will focus on historically significant developments in the ASR area, including several major research efforts that were guided by different funding agencies, and suggest general areas in which to focus research.

244 citations


Journal ArticleDOI
TL;DR: An overview of approaches to MIMO detection in the communications receiver context, noting that notions that are important in slow fading are less important in fast fading, where diversity is provided anyway by time variations.
Abstract: The goal of this lecture has been to provide an overview of MIMO detection approaches in the communications receiver context. Which method is the best in practice? This depends much on the purpose of solving the detection problem: what error rate can be tolerated, what is the ultimate measure of performance (e.g., frame-error rate, worst-case complexity, or average complexity), and what computational platform is used. Additionally, the bits in s may be part of a larger code word, and different s vectors in that code word may either see the same H (slow fading) or many different realizations of H (fast fading). This complicates the picture, because notions that are important in slow fading (such as spatial diversity) are less important in fast fading, where diversity is provided anyway by time variations. Detection for MIMO has been an active field for more than ten years, and this research will probably continue for some time.
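To make the trade-off concrete, here is a small NumPy sketch contrasting a simple linear detector (zero-forcing) with exhaustive maximum-likelihood detection for y = Hs + n with QPSK symbols. The antenna counts and noise level are arbitrary, and the exponential cost of the ML search is exactly what motivates the reduced-complexity methods the lecture surveys.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
nt = nr = 4
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
s = rng.choice(qpsk, nt)
y = H @ s + 0.1 * (rng.standard_normal(nr) + 1j * rng.standard_normal(nr))

# Zero-forcing: invert the channel, then slice each entry to the constellation.
z = np.linalg.pinv(H) @ y
s_zf = qpsk[np.argmin(np.abs(z[:, None] - qpsk[None, :]), axis=1)]

# Maximum likelihood: search all 4**nt candidates (the exponential cost that
# sphere decoding and other methods discussed in the lecture try to avoid).
cands = np.array(list(product(qpsk, repeat=nt)))
s_ml = cands[np.argmin(np.linalg.norm(y - cands @ H.T, axis=1))]

print("ZF correct:", np.allclose(s_zf, s), " ML correct:", np.allclose(s_ml, s))
```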

231 citations


Journal ArticleDOI
TL;DR: The paper will focus on technical problems arising at the physical and medium access layers of a wireless network, and not on economic aspects related to it, like the auction problem for spectrum, even though it is also an important scenario where game theory is used.
Abstract: Nonexhaustive methodologies for characterizing equilibria in wireless games in terms of existence, uniqueness, selection, and efficiency are provided. The paper focuses on technical problems arising at the physical and medium access layers of a wireless network, and not on related economic aspects, such as the spectrum auction problem, even though that is also an important scenario where game theory is used.

Journal ArticleDOI
TL;DR: Distributed resource allocation schemes in which each transmitter determines its allocation autonomously, based on the exchange of interference prices, can be adapted according to the size of the network.
Abstract: In this article, we discuss distributed resource allocation schemes in which each transmitter determines its allocation autonomously, based on the exchange of interference prices. These schemes have been primarily motivated by the common model for spectrum sharing in which a user or service provider may transmit in a designated band provided that they abide by certain rules (e.g., a standard such as 802.11). An attractive property of these schemes is that they are scalable, i.e., the information exchange and overhead can be adapted according to the size of the network.
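A toy version of the pricing mechanism might look like the sketch below: each receiver announces an interference price (the marginal loss of its log-rate utility per unit interference), and each transmitter re-optimizes its own power given the others' prices. The channel gains, noise level, and power cap are made-up values, and this update rule is only one possible instantiation of the general scheme described above.

```python
import numpy as np

G = np.array([[1.0, 0.1, 0.2],
              [0.2, 1.0, 0.1],
              [0.1, 0.2, 1.0]])          # G[i, j]: gain from transmitter j to receiver i
noise, p_max = 0.1, 1.0
p = np.full(3, 0.5)                      # initial powers

for _ in range(50):
    interf = G @ p - np.diag(G) * p + noise             # interference + noise at each rx
    # price: marginal decrease of u_i = log(1 + SINR_i) per unit extra interference
    price = np.diag(G) * p / (interf * (interf + np.diag(G) * p))
    for i in range(3):
        # user i maximizes log(1 + g_ii p / interf_i) - p * sum_{j != i} price_j g_ji
        cost = sum(price[j] * G[j, i] for j in range(3) if j != i)
        p_new = 1.0 / cost - interf[i] / G[i, i] if cost > 0 else p_max
        p[i] = np.clip(p_new, 0.0, p_max)

sinr = np.diag(G) * p / (G @ p - np.diag(G) * p + noise)
print("powers:", p, " SINRs:", sinr)
```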

Journal ArticleDOI
TL;DR: JPEG XR is the newest image coding standard from the JPEG committee and achieves high image quality, on par with JPEG 2000, while requiring low computational resources and storage capacity.
Abstract: JPEG XR is the newest image coding standard from the JPEG committee. It primarily targets the representation of continuous-tone still images such as photographic images and achieves high image quality, on par with JPEG 2000, while requiring low computational resources and storage capacity. Moreover, it effectively addresses the needs of emerging high dynamic range imagery applications by including support for a wide range of image representation formats.

Journal ArticleDOI
TL;DR: Examples include sampling rate conversion for software radio and between audio formats, biomedical imaging, lens distortion correction and the formation of image mosaics, and super-resolution of image sequences.
Abstract: Digital applications have developed rapidly over the last few decades. Since many sources of information are of analog or continuous-time nature, discrete-time signal processing (DSP) inherently relies on sampling a continuous-time signal to obtain a discrete-time representation. Consequently, sampling theories lie at the heart of signal processing devices and communication systems. Examples include sampling rate conversion for software radio and between audio formats, biomedical imaging, lens distortion correction and the formation of image mosaics, and super-resolution of image sequences.
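For example, sampling rate conversion between two common audio rates can be done with a polyphase resampler; the sketch below uses SciPy's resample_poly to convert a test tone from 44.1 kHz to 48 kHz. The test signal is an illustrative assumption.

```python
import numpy as np
from scipy.signal import resample_poly

fs_in, fs_out = 44100, 48000
t = np.arange(0, 0.1, 1 / fs_in)
x = np.sin(2 * np.pi * 1000 * t)              # 1 kHz tone sampled at 44.1 kHz

# 48000/44100 reduces to 160/147: upsample by 160, filter, downsample by 147.
y = resample_poly(x, up=160, down=147)
print(len(x), "->", len(y), "samples; ratio", len(y) / len(x))
```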

Journal ArticleDOI
TL;DR: This article describes some basic concepts from noncooperative and cooperative game theory and illustrates them by three examples using the interference channel model, and also discusses correlated equilibria, where a certain type of common randomness can be exploited to increase the utility region.
Abstract: In this article, we described some basic concepts from noncooperative and cooperative game theory and illustrated them by three examples using the interference channel model, namely, the power allocation game for SISO IFC, the beamforming game for MISO IFC, and the transmit covariance game for MIMO IFC. In noncooperative game theory, we restricted ourselves to discussing the NE and PoA and their interpretations in the context of our application. Extensions to other noncooperative approaches include Stackelberg equilibria and the corresponding question "Who will go first?" We also discussed correlated equilibria, where a certain type of common randomness can be exploited to increase the utility region. We leave the large area of coalitional game theory open.
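As a hedged sketch of the power allocation game for the SISO IFC, the code below runs best-response dynamics in which each user water-fills its power budget against the interference created by the other user (iterative water-filling). The channel gains, budgets, and resulting operating point are toy assumptions, not results from the article.

```python
import numpy as np

K, N = 2, 4                                        # users, parallel subchannels
rng = np.random.default_rng(0)
g = rng.random((K, K, N)) * 0.2 + np.eye(K)[:, :, None]   # g[i, j, n]: tx j -> rx i
noise, P_budget = 0.1, np.array([1.0, 1.0])
p = np.zeros((K, N))

def waterfill(inv_gain, budget):
    """Water-filling: p_n = max(mu - inv_gain_n, 0) with sum_n p_n = budget."""
    mu_lo, mu_hi = 0.0, budget + inv_gain.max()
    for _ in range(60):                            # bisection on the water level mu
        mu = 0.5 * (mu_lo + mu_hi)
        if np.maximum(mu - inv_gain, 0).sum() > budget:
            mu_hi = mu
        else:
            mu_lo = mu
    return np.maximum(mu - inv_gain, 0)

for _ in range(30):                                # best-response (Gauss-Seidel) rounds
    for i in range(K):
        interf = noise + sum(g[i, j] * p[j] for j in range(K) if j != i)
        p[i] = waterfill(interf / g[i, i], P_budget[i])

rates = [np.log2(1 + g[i, i] * p[i] /
                 (noise + sum(g[i, j] * p[j] for j in range(K) if j != i))).sum()
         for i in range(K)]
print("NE powers:\n", p, "\nrates:", rates)
```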

Journal ArticleDOI
TL;DR: Connected operators are filtering tools that act by merging elementary regions called flat zones; they have very good contour preservation properties and are capable of both low-level filtering and higher-level object recognition.
Abstract: Connected operators are filtering tools that act by merging elementary regions called flat zones. Connected operators cannot create new contours nor modify their position. Therefore, they have very good contour preservation properties and are capable of both low-level filtering and higher-level object recognition. This article gives an overview of connected operators and their application to image and video filtering. There are two popular techniques used to create connected operators. The first one relies on a reconstruction process. The operator involves first a simplification step based on a "classical" filter and then a reconstruction process. In fact, the reconstruction can be seen as a way to create a connected version of an arbitrary operator. The simplification effect is defined and limited by the first step. The examples we show include simplification in terms of size or contrast. The second strategy to define connected operators relies on a hierarchical region-based representation of the input image, i.e., a tree, computed in an initial step. Then, the simplification is obtained by pruning the tree, and, finally, the output image is constructed from the pruned tree. This article presents the most important trees that have been used to create connected operators and also discusses important families of simplification or pruning criteria. We also give a brief overview of efficient implementations of the reconstruction process and of tree construction. Finally, the possibility to define and to use nonclassical notions of connectivity is discussed and illustrated.
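The first strategy is easy to try out with standard tools: the sketch below builds an "opening by reconstruction" connected operator (erosion as the simplification step, followed by morphological reconstruction under the original image) using scikit-image. The test image and structuring-element size are arbitrary choices.

```python
import numpy as np
from skimage import data
from skimage.morphology import disk, erosion, reconstruction

img = data.coins()                            # grayscale test image
seed = erosion(img, disk(9))                  # "classical" simplification (size criterion)
simplified = reconstruction(seed, img, method="dilation")

# Bright structures smaller than the disk are removed, while the contours of the
# surviving flat zones remain exactly those of the original image.
print("pixels modified:", np.count_nonzero(simplified != img), "of", img.size)
```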

Journal ArticleDOI
TL;DR: While Pal et al.'s techniques are useful for recovering text and images, new weighting techniques need to be created for video, audio, executable, and other file formats, thus allowing recovery to extend to those formats.
Abstract: This article presents the evolution of file carving and describes in detail the techniques that are now being used to recover files without using any file system metadata. We show the benefits and problems that exist with current techniques. In the future, solid-state devices (SSDs) will become much more prevalent. SSDs will incorporate wear-leveling, which results in files being moved around so as not to allow some clusters to be written to more than others. This is done because after a certain number of writes a cluster will fail; therefore, the SSD controller will attempt to spread the write load across all clusters on the disk. As a result, SSDs will be naturally fragmented, and should the disk controller fail, the clusters on the disk will require file carving techniques to recover. There is a lot of research yet to be done in this area for data recovery. Finally, while Pal et al.'s techniques are useful for recovering text and images, new weighting techniques need to be created for video, audio, executable, and other file formats, thus allowing recovery to extend to those formats.
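For orientation, the sketch below shows the most basic form of header/footer carving: scanning a raw image for JPEG start/end markers and extracting the bytes in between, with no file system metadata involved. The disk-image path is a placeholder, and fragmented files would defeat this naive approach, which is precisely what the more advanced techniques surveyed in the article address.

```python
JPEG_SOI, JPEG_EOI = b"\xff\xd8\xff", b"\xff\xd9"   # JPEG header and footer signatures

def carve_jpegs(raw: bytes):
    """Naive header/footer carving: return every byte run between SOI and EOI."""
    carved, pos = [], 0
    while True:
        start = raw.find(JPEG_SOI, pos)
        if start == -1:
            break
        end = raw.find(JPEG_EOI, start)
        if end == -1:
            break
        carved.append(raw[start:end + len(JPEG_EOI)])
        pos = end + len(JPEG_EOI)
    return carved

with open("disk.img", "rb") as f:                    # hypothetical raw disk image
    for i, blob in enumerate(carve_jpegs(f.read())):
        with open(f"carved_{i}.jpg", "wb") as out:
            out.write(blob)
```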

Journal ArticleDOI
TL;DR: The embedded DSP market has been swept up by the general increase in interest in multicore that has been driven by companies such as Intel and Sun, but it is too early to say in what way this industry-wide effort will have an effect on the way multicore DSPs are programmed and perhaps architected.
Abstract: In the last two years, the embedded DSP market has been swept up by the general increase in interest in multicore that has been driven by companies such as Intel and Sun. One reason for this is that there is now a lot of focus on tooling in academia and also a willingness on the part of users to accept new programming paradigms. This industry-wide effort will have an effect on the way multicore DSPs are programmed and perhaps architected. But it is too early to say in what way this will occur. Programming multicore DSPs remains very challenging. The problem of how to take a piece of sequential code and optimally partition it across multiple cores remains unsolved. Hence, there will naturally be a lot of variations in the approaches taken. Equally important is the issue of debugging and visibility. Developing effective and easy-to-use code development and real-time debug tools is tremendously important as the opportunity for bugs goes up significantly when one starts to deal with both time and space. The markets that DSP plays in have unique features in their desire for low power, low cost, and hard real-time processing, with an emphasis on mathematical computation. How well the multicore research being performed presently in academia will address these concerns remains to be seen.

Journal ArticleDOI
P.E. Hart
TL;DR: The previously untold history of how the familiar sinusoidal transform came about illustrates how important advances sometimes come from combining not-obviously-related ideas.
Abstract: Where did this transform come from? You may vaguely recall learning that it goes back to a 1962 patent by P.V.C. Hough, though the author suspects very few readers have actually looked at that patent beyond the title page. If you do, you may be surprised to find that the popular transform used today is not described there. Indeed, today's transform was not a single-step invention but instead took several steps that resulted in Hough's initial idea being combined with an idea from an obscure branch of late 19th-century mathematics to produce the familiar sinusoidal transform. The previously untold history of how this came about illustrates how important advances sometimes come from combining not-obviously-related ideas. The history perhaps also illustrates that the observation of Louis Pasteur, "Chance favors the prepared mind," remains as apt in the 20th and 21st centuries as it was in the 19th.

Journal ArticleDOI
TL;DR: Looking at the different points highlighted in this article, the authors affirm that forensic applications of speaker recognition should still be approached with a necessary degree of caution.
Abstract: Looking at the different points highlighted in this article, we affirm that forensic applications of speaker recognition should still be approached with a necessary degree of caution. Disseminating this message remains one of the most important responsibilities of speaker recognition researchers.

Journal ArticleDOI
TL;DR: This article focuses on problems and issues related to PQ and power system diagnostics, in particular those where signal processing techniques are extremely important.
Abstract: This article focuses on problems and issues related to power quality (PQ) and power system diagnostics, in particular those where signal processing techniques are extremely important. PQ is a general term that describes the quality of voltage and current waveforms. PQ problems include all electric power problems or disturbances in the supply system that prevent end-user equipment from operating properly.

Journal ArticleDOI
TL;DR: The paper provided an overview of research work on waveform-agile target tracking and found that waveforms can be selected to optimize a tracking performance criterion such as minimizing the tracking MSE or maximizing target information retrieval.
Abstract: Waveform-agile sensing is fast becoming an important technique for improving sensor performance in applications such as radar, sonar, biomedicine, and communications. The paper provided an overview of research work on waveform-agile target tracking. From both control theoretic and information theoretic perspectives, waveforms can be selected to optimize a tracking performance criterion such as minimizing the tracking MSE or maximizing target information retrieval. The waveforms can be designed directly based on their estimation resolution properties, selected from a class of waveforms with varying parameter values over a feasible sampling grid in the time-frequency plane, or obtained from different waveform libraries.

Journal ArticleDOI
TL;DR: The DSP audience is given some insight into the types of problems and challenges that face practitioners in audio forensic laboratories, and several of the frustrations and pitfalls encountered by signal processing experts when dealing with typical forensic material due to the standards and practices of the legal system.
Abstract: The field of audio forensics involves many topics familiar to the general audio digital signal processing (DSP) community, such as speech recognition, talker identification, and signal quality enhancement. There is potentially much to be gained by applying modern DSP theory to problems of interest to the forensics community, and this article is written to give the DSP audience some insight into the types of problems and challenges that face practitioners in audio forensic laboratories. However, this article must also present several of the frustrations and pitfalls encountered by signal processing experts when dealing with typical forensic material due to the standards and practices of the legal system.

Journal ArticleDOI
TL;DR: It is shown that the frequency selective interference channel has many intriguing aspects from a game theoretic point of view as well, and that various levels of interference admit different types of game theory techniques.
Abstract: As discussed in this paper, the frequency selective interference channel is important both from a practical and from an information theoretic point of view. We show that it has many intriguing aspects from a game theoretic point of view as well, and that various levels of interference admit different types of game theoretic techniques.

Journal ArticleDOI
TL;DR: This article gives an overview on the techniques needed to implement the discrete Fourier transform (DFT) efficiently on current multicore systems and shows and analyzes DFT benchmarks of the fastest libraries available for the considered platforms.
Abstract: This article gives an overview on the techniques needed to implement the discrete Fourier transform (DFT) efficiently on current multicore systems. The focus is on Intel-compatible multicores, but we also discuss the IBM Cell and, briefly, graphics processing units (GPUs). The performance optimization is broken down into three key challenges: parallelization, vectorization, and memory hierarchy optimization. In each case, we use the Kronecker product formalism to formally derive the necessary algorithmic transformations based on a few hardware parameters. Further code-level optimizations are discussed. The rigorous nature of this framework enables the complete automation of the implementation task as shown by the program generator Spiral. Finally, we show and analyze DFT benchmarks of the fastest libraries available for the considered platforms.
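The core idea behind the Kronecker-product (Cooley-Tukey) factorization can be illustrated in a few lines of NumPy rather than generated C code: a length-n DFT splits into smaller DFTs along the columns and rows of a reshaped input, with twiddle factors in between. The sizes below are arbitrary, and this sketch deliberately ignores the vectorization and memory-hierarchy optimizations that the article and Spiral actually address.

```python
import numpy as np

def ct_dft(x, k, m):
    """One Cooley-Tukey step for n = k*m, mirroring the Kronecker factorization
    DFT_n = (DFT_k x I_m) . twiddles . (I_k x DFT_m) . stride permutation."""
    n = k * m
    A = x.reshape(k, m)                                # input index i = i1*m + i2
    B = np.fft.fft(A, axis=0)                          # k-point DFTs down the columns
    twiddle = np.exp(-2j * np.pi * np.outer(np.arange(k), np.arange(m)) / n)
    C = B * twiddle                                    # diagonal twiddle factors
    D = np.fft.fft(C, axis=1)                          # m-point DFTs along the rows
    return D.T.reshape(n)                              # output index j = j1 + k*j2

x = np.random.default_rng(0).standard_normal(1024) + 0j
print(np.allclose(ct_dft(x, 32, 32), np.fft.fft(x)))   # True
```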

Journal ArticleDOI
TL;DR: In the 50 or so years since P.M. Woodward's book was published, radar signal processing has used the ambiguity function as an intricate and flexible tool in the design of waveforms to solve diverse problems in radar.
Abstract: The design of radar waveforms has received considerable attention since the 1950s. In 1953, P.M. Woodward (1953; 1953) defined the narrowband radar ambiguity function or, simply, ambiguity function. It is a device formulated to describe the effects of range and Doppler on matched filter receivers. Woodward acknowledged the influence that Shannon's communication theory from 1948 had on his ideas; and he explained the relevance of ambiguity in radar signal processing, perhaps best conceived in terms of a form of the uncertainty principle (see the sections "Motivation" and "Ambiguity Functions"). However, in the 50 or so years since Woodward's book was published, radar signal processing has used the ambiguity function as an intricate and flexible tool in the design of waveforms to solve diverse problems in radar. In the process, substantial connections were established in mathematics, physics, and other areas of signal processing. As such, we are introducing two new methods, discussed in sections "CAZAC Sequences" and "Aperiodic Simulations".
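To give a concrete handle on the quantity being discussed, the sketch below evaluates a discrete approximation of the narrowband ambiguity surface |A(tau, nu)| for a linear-FM pulse. The pulse length and chirp rate are illustrative assumptions; only the defining delay-correlation-plus-Doppler structure is shown.

```python
import numpy as np

N = 128
t = np.arange(N)
s = np.exp(1j * np.pi * 0.25 * t**2 / N)             # unit-amplitude LFM chirp

def ambiguity(s):
    """|A(tau, nu)| = |sum_t s[t] conj(s[t - tau]) exp(-j 2 pi nu t / N)|."""
    N = len(s)
    A = np.zeros((N, N))
    for i, tau in enumerate(range(-N // 2, N // 2)):
        shifted = np.zeros(N, complex)                 # s[t - tau], zero off the ends
        src = s[max(0, -tau): N - max(0, tau)]
        shifted[max(0, tau): max(0, tau) + len(src)] = src
        A[i] = np.abs(np.fft.fftshift(np.fft.fft(s * np.conj(shifted))))
    return A

A = ambiguity(s)
peak = np.unravel_index(np.argmax(A), A.shape)
print("peak at (tau=0, nu=0)?", peak == (N // 2, N // 2), " value:", A[peak])
```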

Journal ArticleDOI
TL;DR: This article shows that, suitably transmitted and processed, radar waveforms based on Golay sequences provide new primitives for adaptive transmission that enable better detection and finer resolution, while managing computational complexity at the receiver.
Abstract: This article shows that, suitably transmitted and processed, radar waveforms based on Golay sequences provide new primitives for adaptive transmission that enable better detection and finer resolution, while managing computational complexity at the receiver. The ability to exploit space-time adaptive processing is limited by the computational power available at the receiver, and increased flexibility on transmission only exacerbates this problem unless the waveforms are properly designed to simplify processing at the receiver.
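The property that makes Golay sequences attractive can be checked directly: for a complementary pair, the aperiodic autocorrelations sum to a perfect delta, so adding the matched-filter outputs of the two transmissions leaves zero range sidelobes. The sketch below builds a pair by the standard doubling construction and verifies this; the sequence length is an arbitrary choice.

```python
import numpy as np

def golay_pair(n_doublings):
    """Standard recursive construction: (a, b) -> (a|b, a|-b)."""
    a, b = np.array([1.0]), np.array([1.0])
    for _ in range(n_doublings):
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

def acorr(x):
    return np.correlate(x, x, mode="full")            # aperiodic autocorrelation

a, b = golay_pair(6)                                  # length-64 complementary pair
combined = acorr(a) + acorr(b)
off_peak = np.delete(np.round(combined), len(a) - 1)  # drop the zero-lag sample
print("nonzero off-peak lags:", np.count_nonzero(off_peak))      # 0: perfect delta
print("peak value (should be 2N =", 2 * len(a), "):", combined[len(a) - 1])
```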

Journal ArticleDOI
TL;DR: The main needs for signal processing development are surveyed, and a sensor fusion approach in which all tasks are considered jointly is argued for; a limited number of sensors can be sufficient to implement a variety of safety systems.
Abstract: In this article, we have surveyed the main needs for signal processing development and argued for a sensor fusion approach where all tasks are considered jointly. First, the section "automotive safety systems" summarized a number of safety systems, and it was pointed out that a limited number of sensors can be sufficient to implement a variety of safety systems. Second, the active development of improved communication networks enables new sensor fusion strategies.

Journal ArticleDOI
TL;DR: The role of paper has been transformed from the archival record of a document to a convenient and aesthetically appealing graphical user interface and the use of paper is now intimately linked to the electronic systems that capture, process, transmit, generate, and reproduce textual and graphic content.
Abstract: Contrary to popular opinion, the use of paper in our society will not disappear during the foreseeable future. In fact, paper use continues to grow rather than decline. It is certainly true that as individuals, we may be printing less than we used to. And the role of paper has been transformed from the archival record of a document to a convenient and aesthetically appealing graphical user interface. The use of paper is now intimately linked to the electronic systems that capture, process, transmit, generate, and reproduce textual and graphic content. Paper can be thought of as an interface between humans and the digital world. If this interface is not secure, the entire system becomes vulnerable to attack and abuse. Although paper is read by humans in the same way that it has been for millennia and has had the same fundamental form and composition for almost that long, the technologies for printing and scanning documents and capturing their content have evolved tremendously, especially during the last 20 years. This has moved the capability to generate printed documents from the hands of a select few to anyone with access to low-cost scanners, printers, and personal computers. It has greatly broadened the opportunities for abuse of trust through the generation of fallacious documents and the tampering with existing documents, including the embedding of messages in these documents.

Journal ArticleDOI
TL;DR: The goal of this article is to show how many challenging unsolved resource allocation problems in the emerging field of cognitive radio networks fit naturally either in the game theoretical paradigm or in the more general theory of variational inequalities (VI).
Abstract: The goal of this article is to show how many challenging unsolved resource allocation problems in the emerging field of cognitive radio (CR) networks fit naturally either in the game theoretical paradigm or in the more general theory of variational inequalities (VI). This provides us with all the mathematical tools necessary to analyze the proposed equilibrium problems for CR systems (e.g., existence and uniqueness of the solution) and to devise distributed algorithms along with their convergence properties.