
Showing papers in "IEEE Signal Processing Magazine in 1995"


Journal ArticleDOI
H.-L. Lou
TL;DR: The article explains the basics of the Viterbi algorithm as applied to digital communication systems and to speech and character recognition, and focuses on the operations and the practical memory requirements needed to implement the Viterbi algorithm in real time.
Abstract: The Viterbi algorithm, an application of dynamic programming, is widely used for estimation and detection problems in digital communications and signal processing. It is used to detect signals in communication channels with memory, and to decode sequential error-control codes that are used to enhance the performance of digital communication systems. The Viterbi algorithm is also used in speech and character recognition tasks where the speech signals or characters are modeled by hidden Markov models. The article explains the basics of the Viterbi algorithm as applied to digital communication systems and to speech and character recognition. It also focuses on the operations and the practical memory requirements needed to implement the Viterbi algorithm in real time.

253 citations
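The dynamic-programming recursion at the heart of the Viterbi algorithm can be sketched as follows. This is a minimal illustrative implementation over an HMM trellis, not the memory-optimized real-time version the article discusses; the model parameters in the test are made up.

```python
import numpy as np

def viterbi(obs, trans, emit, init):
    """Most likely state sequence for an observation sequence.

    obs   : list of observation indices
    trans : trans[i][j] = P(next state j | state i)
    emit  : emit[i][o]  = P(observation o | state i)
    init  : init[i]     = P(initial state i)
    Log probabilities are used to avoid underflow on long sequences.
    """
    n_states = len(init)
    T = len(obs)
    lt = np.log(np.array(trans))
    le = np.log(np.array(emit))
    logp = np.log(np.array(init)) + le[:, obs[0]]   # best log-prob per state
    back = np.zeros((T, n_states), dtype=int)       # survivor pointers
    for t in range(1, T):
        scores = logp[:, None] + lt                 # scores[i, j]: from i to j
        back[t] = np.argmax(scores, axis=0)         # best predecessor of each j
        logp = scores[back[t], np.arange(n_states)] + le[:, obs[t]]
    # Trace back along the survivor pointers from the best final state
    path = [int(np.argmax(logp))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

The add-compare-select step (`scores`, `argmax`) and the traceback memory (`back`) are exactly the two components whose hardware cost the article analyzes.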


Journal ArticleDOI
TL;DR: The authors focus on a tutorial description of the hybrid HMM/ANN method, which provides a mechanism for incorporating a range of sources of evidence without strong assumptions about their joint statistics, and may have applicability to much more complex systems that can incorporate deep acoustic and linguistic context.
Abstract: The authors focus on a tutorial description of the hybrid HMM/ANN method. The approach has been applied to large vocabulary continuous speech recognition, and variants are in use by many researchers. The method provides a mechanism for incorporating a range of sources of evidence without strong assumptions about their joint statistics, and may have applicability to much more complex systems that can incorporate deep acoustic and linguistic context. The method is inherently discriminant and conservative of parameters. Despite these potential advantages, work on the hybrid method has focused on implementing fairly simple systems, which do surprisingly well on large continuous speech recognition tasks. Researchers are only beginning to explore the use of more complex structures with this paradigm. In particular, they are just beginning to look at the connectionist inference of language models (including phonology) from data, which may be required in order to take advantage of locally discriminant probabilities rather than simply translating to likelihoods. Finally, the authors' current intuition is that more advanced versions of the hybrid method can greatly benefit from a perceptual perspective.

230 citations
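The "translating to likelihoods" mentioned above is commonly done by scaling: the ANN estimates state posteriors, and dividing by the state priors yields scaled likelihoods that can replace the usual emission densities in the HMM decoder. A minimal sketch, with made-up posterior and prior values:

```python
import numpy as np

# Hypothetical ANN outputs: P(state | acoustic frame), one row per frame.
posteriors = np.array([[0.7, 0.2, 0.1],
                       [0.1, 0.8, 0.1],
                       [0.2, 0.3, 0.5]])

# State priors P(state), e.g. estimated from state occupancy in training data.
priors = np.array([0.5, 0.3, 0.2])

# Scaled likelihoods: p(frame | state) is proportional to
# P(state | frame) / P(state). These feed the HMM decoder in place of
# Gaussian-mixture emission likelihoods.
scaled_likelihoods = posteriors / priors
```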


Journal ArticleDOI
TL;DR: An implementable three-dimensional, terrain-adaptive, transform-based bandwidth compression technique for multispectral imagery, based on a Karhunen-Loeve transformation for spectral decorrelation followed by the standard JPEG algorithm for coding the resulting decorrelated eigenimages.
Abstract: We present an implementable three-dimensional, terrain-adaptive, transform-based bandwidth compression technique for multispectral imagery. The algorithm exploits the inherent spectral and spatial correlations in the data. The compression technique is based on a Karhunen-Loeve transformation for spectral decorrelation followed by the standard JPEG algorithm for coding the resulting spectrally decorrelated eigenimages. The algorithm is conveniently parameterized to accommodate reconstructed image fidelities ranging from near-lossless at about 5:1 CR to visually lossy beginning at about 30:1 CR. The novelty of this technique lies in its unique capability to adaptively vary the characteristics of the spectral correlation transformation as a function of the variation of the local terrain. The spectral and spatial modularity of the algorithm architecture allows the JPEG coder to be replaced by an alternate spatial coding procedure. The significant practical advantage of the proposed approach is that it is based on the standard and highly developed JPEG compression technology.

176 citations
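The spectral-decorrelation stage can be sketched as follows. This is an illustrative Karhunen-Loeve transform across bands only; the terrain-adaptive variation and the JPEG spatial coder described in the article are not reproduced here.

```python
import numpy as np

def spectral_klt(cube):
    """Decorrelate the spectral bands of a multispectral cube (bands, H, W).

    Returns the eigenimages (strongest component first), the transform
    matrix, and the per-band means. Each eigenimage would then be coded
    spatially, e.g. with JPEG, as in the article.
    """
    bands, h, w = cube.shape
    X = cube.reshape(bands, -1).astype(float)
    mean = X.mean(axis=1, keepdims=True)
    Xc = X - mean
    cov = Xc @ Xc.T / Xc.shape[1]            # inter-band covariance
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending order
    order = np.argsort(eigvals)[::-1]        # reorder: largest variance first
    eigvecs = eigvecs[:, order]
    eigenimages = (eigvecs.T @ Xc).reshape(bands, h, w)
    return eigenimages, eigvecs, mean
```

After the transform, the cross-band covariance of the eigenimages is (numerically) diagonal, which is what lets each eigenimage be coded independently.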


Journal ArticleDOI
TL;DR: The author addresses the problem of detecting and identifying stationary and moving targets with foliage-penetrating UHF synthetic aperture radar (SAR), using a signal-theoretic analysis of the SAR signal via the Fourier transform.
Abstract: The author addresses the problem of detecting and identifying stationary and moving targets with foliage-penetrating UHF synthetic aperture radar (SAR). The role of a target's coherent SAR signature, which varies with the radar's frequency and aspect angle, in forming the Fourier space of the SAR signal is analyzed. The resultant relationship is the basis of an algorithm which, after extracting (digital spotlighting) the target's coherent SAR signature in the reconstruction domain, could be used to differentiate man-made structures from foliage. Methods for blind-velocity moving target indication are discussed. The main tool of the work is a signal-theoretic analysis of the SAR signal via the Fourier transform. However, the theory is at most as good as the collected SAR data.

138 citations


Journal ArticleDOI
TL;DR: An introduction to the experience gained from the Advanced Land Remote Sensing System (ALRSS) compression development, and insight into the challenges of MSI and space-based compression algorithm design.
Abstract: Multispectral image (MSI) compression has evolved into a viable solution for band-limited communications problems in current and future remote sensing systems. MSI compression technology continues to mature as research identifies the interaction of compression distortion and typical multispectral exploitation tasks. Understanding of both compression artifacts and exploitation techniques must proceed in parallel, because sensitivity to errors (distortion) must be addressed for a much larger usage base. This article provides an introduction to the experience gained from the Advanced Land Remote Sensing System (ALRSS) compression development, and insight into the challenges of MSI and space-based compression algorithm design. The ALRSS studies provide an initial look at the challenges of designing and evaluating MSI compression systems. The results of these studies have shown that compression rates between 2.2 and 1.3 bpp are feasible for space-based applications today. MSI systems can be designed to include compression without changing the significance of the final image product.

55 citations


Journal ArticleDOI
TL;DR: A distributed, team-oriented, senior-level curriculum in digital signal processing (DSP) and an experiment in collaborative education using distributed teams of students from two universities are described.
Abstract: Discusses the development of a distributed, team-oriented, senior-level curriculum in digital signal processing (DSP). The authors describe an experiment in collaborative education using distributed teams of students from two universities. They also discuss plans for expanded collaboration with additional schools, and present pedagogical objectives. The two courses considered are 'Introduction to signal processing' and 'Advanced topics in signal processing'. The authors discuss computational and Internet tools and evaluate the project.

46 citations


Journal ArticleDOI
TL;DR: The article adds to the evidence of the need for interaction between the fields of signal processing and of control and estimation, focusing on the interacting specialties of (i) adaptive filtering and (ii) adaptive identification and control.
Abstract: The aim of the article is to add to the evidence of the need for interaction between the two fields of (i) signal processing and (ii) control and estimation. The focus is on the interacting specialties: from (i), adaptive filtering, and from (ii), adaptive identification and control. For evidence, three basic communication-systems applications of adaptive filtering are used to evoke system identification (and adaptive control) style problems amenable to analytical tools. The applications raise a number of unresolved issues that challenge existing theory. The connections drawn to adaptive identification and control problems (and relevant analytical tools) suggest approaches to these challenges.

28 citations


Journal ArticleDOI
TL;DR: The visualization process is explained, two signal processing issues, sampling and color space selection, are described, and a survey of some of the various visualization techniques is provided, emphasizing the difference in visualizing time-invariant and time-variant data.
Abstract: The goal of this article is to encourage the signal processing community to address the needs of the scientific visualization community. To aid in this effort, we first explain the visualization process. Then we describe two signal processing issues, sampling and color space selection, that arise in various visualization techniques. Next, we provide a survey of some of the various visualization techniques, emphasizing the difference in visualizing time-invariant and time-variant data. Finally, two visualization applications are described in detail to exemplify the signal processing aspects of scientific visualization.

25 citations


Journal ArticleDOI
TL;DR: The article highlights the roles played by software in DSP education, including symbolic and numeric processing as well as high-level programming of DSPs.
Abstract: Global competition is forcing universities to rethink their approach to undergraduate education. Advanced hardware and software tools make it possible for students to grasp fundamental engineering concepts more quickly, ultimately enhancing the undergraduate educational experience. The article highlights the roles played by software in DSP education, including symbolic and numeric processing as well as high-level programming of DSPs. The authors include a survey of some of the more popular software environments and identify important logistical issues.

12 citations


Journal ArticleDOI
TL;DR: The article discusses the technological elements of this mix, which include low-cost DSP software, a low-cost TI TMS320C5x DSP starter kit, and a generic soundboard.
Abstract: The University of Florida is developing a technology-based comprehensive engineering educational delivery system for the study of digital signal processing (DSP). The DSP curriculum, which covers both the undergraduate and graduate programs, exhibits a commitment to cooperative learning and personal computer-based instructional (CBI) tools. The article discusses the technological elements of this mix, which include low-cost DSP software, a low-cost TI TMS320C5x DSP starter kit, and a generic soundboard.

6 citations


Journal ArticleDOI
Don H. Johnson
TL;DR: The rudiments of accessing the Web and how to create your own information resources are described, focusing on signal processing resources and how the Web catalyzes signal processing research and development.
Abstract: The World Wide Web (WWW) offers much information useful to the signal processing community. Using the Web, information having a variety of different forms can be transferred in a cohesive fashion. The article describes the rudiments of accessing the Web and how to create your own information resources. The authors focus on signal processing resources and how the Web catalyzes signal processing research and development.

Journal ArticleDOI
TL;DR: A simple signal processing scheme that extends resolution to nearly twice the Nyquist limit without aliasing, built by placing one-dimensional signals into a simulated sample-and-hold process in a Mathcad application.
Abstract: The author examines the notion that data sampled beyond the Nyquist frequency is meaningless. A simple signal processing scheme emerged that extends resolution to nearly twice the Nyquist limit without aliasing. It starts by placing one-dimensional signals into a simulated sample-and-hold process in a Mathcad application.
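The simulated sample-and-hold step can be sketched outside Mathcad as a zero-order hold. This is illustrative only; the hold `factor` and the test signal are assumptions, and the article's full resolution-extension scheme is not reproduced.

```python
import numpy as np

def sample_and_hold(signal, factor):
    """Zero-order hold: repeat each sample `factor` times, producing the
    staircase waveform a sample-and-hold circuit would output at a rate
    `factor` times the original sampling rate."""
    return np.repeat(np.asarray(signal, dtype=float), factor)

x = np.array([0.0, 1.0, 0.0, -1.0])   # made-up one-dimensional signal
y = sample_and_hold(x, 4)             # staircase version of x at 4x the rate
```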

Journal ArticleDOI
J.L. Smith
TL;DR: The author discusses how one can describe a measure of the effect of the BCA on the observation, given that the usual resolution measures (power spectrum, Bode diagram, or modulation transfer function) of the transmitted, reconstructed signal fail to capture it.
Abstract: Discusses image compression and the block compression algorithm (BCA). The power spectrum, Bode diagram, or modulation transfer function (MTF) of the transmitted, reconstructed signal are the usual measures of system resolution. Any one of these resolution measures is often nearly identical for the ordinary signal and for the signal processed with the BCA. Yet, observers of those same transmitted signals perceive a much more profound difference than the resolution measure indicates. At high compression, they describe the dynamic features as choppy and disturbing. The author discusses how one can describe a measure of the effect of the BCA on the observation.

Proceedings ArticleDOI
TL;DR: This work introduces various Web mining algorithms that efficiently reduce the computing cost of Web search, and proposes more intelligent search systems based on data mining technologies.
Abstract: We have developed the Japanese Web search engine "Mondou (RCAAU)", which is based on emerging data mining technologies. Our search engine provides associative keywords that are tightly related to the Web pages in focus. We also implemented a visual interface based on information visualization technology. To improve the performance of various search strategies using the characteristics of Web systems, we are implementing advanced Web information systems with data mining and information technologies. First, we introduce various Web mining algorithms that efficiently reduce the computing cost of Web search; by focusing effectively on the subset of useful pages, our proposed algorithms improve search performance. Second, to preserve the huge volume of born-digital information on the Internet, we focus on Web archiving technologies such as WARP. To handle monotonically increasing digital information, we must resolve many difficult problems of long-term data preservation by improving Web search techniques. Our experience with the Mondou Web search engine and with cooperative distributed Web robots has been very useful and effective. Finally, P2P (peer-to-peer) distributed search technologies are rapidly becoming important. For example, it is very hard to discover appropriate information resources through the simple queries of Gnutella, Freenet, and so on. Therefore, to realize topic-driven search, we propose more intelligent search systems based on data mining technologies.
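Associative-keyword extraction of the kind described can be illustrated with a simple co-occurrence count. This is a toy sketch only: the Mondou algorithm's details are not given in the abstract, and the `pages` keyword sets are made-up data.

```python
from collections import Counter

# Hypothetical keyword sets, one per Web page.
pages = [
    {"signal", "processing", "dsp"},
    {"signal", "processing", "filter"},
    {"web", "mining", "search"},
]

def associated(query, pages, min_count=2):
    """Keywords co-occurring with `query` in at least `min_count` pages,
    a crude stand-in for 'associative keywords tightly related to the
    pages in focus'."""
    counts = Counter()
    for kw_set in pages:
        if query in kw_set:
            counts.update(kw_set - {query})
    return {kw for kw, c in counts.items() if c >= min_count}
```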