
Showing papers by "Sos S. Agaian published in 2005"


Proceedings ArticleDOI
18 Mar 2005
TL;DR: An enhancement technique based upon a new application of histograms on transform domain coefficients called logarithmic transform coefficient histogram shifting (LTHS) is presented and its performance is compared quantitatively to classical histogram equalization using a measure of enhancement based on contrast entropy.
Abstract: This paper presents an enhancement technique based upon a new application of histograms on transform domain coefficients called logarithmic transform coefficient histogram shifting (LTHS). A measure of enhancement based on contrast entropy is used as a tool for evaluating the performance of the proposed enhancement technique and for finding optimal values for variables contained in the enhancement. The algorithm's performance is compared quantitatively to classical histogram equalization using the aforementioned measure of enhancement. Experimental results are presented to show the performance of the proposed algorithm alongside classical histogram equalization.
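For intuition, here is a minimal sketch of the general idea behind logarithmic transform coefficient histogram shifting: move an image's transform coefficients into the log domain, shift their histogram, and invert. The choice of DCT, the uniform shift rule, and all names and parameters below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np
from scipy.fft import dctn, idctn

def lths_enhance(image, shift=0.15):
    """Hypothetical sketch of log-domain coefficient histogram shifting.

    The paper's actual shifting rule and optimal-parameter search are not
    reproduced; this simply shifts the log-magnitude histogram uniformly.
    """
    coeffs = dctn(image.astype(float), norm='ortho')    # transform domain
    sign = np.sign(coeffs)
    logmag = np.log1p(np.abs(coeffs))                   # log-domain coefficients
    logmag += shift * logmag.std()                      # shift the histogram
    out = idctn(sign * np.expm1(logmag), norm='ortho')  # back to the image
    return np.clip(out, 0, 255)
```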

35 citations


Proceedings ArticleDOI
01 Jan 2005
TL;DR: A lossless adaptive digital audio steganographic technique based on reversible two- and higher-dimensional integer transforms, which chooses the best blocks for embedding perceptually inaudible stego information and selects block sizes that maximize the number of blocks/capacity.
Abstract: This paper presents a lossless adaptive digital audio steganographic technique based on reversible two and higher dimensional integer transform. The adaptive technique is used to choose the best blocks for embedding perceptually inaudible stego information, and to select the best block sizes to maximize the number of blocks/capacity. The stego information is embedded in the integer domain by bit manipulation. In addition, we introduce a capacity measure to select audio carriers that introduce minimum distortion after embedding. The above technique is also applicable to compression based audio formats, such as MPEG audio layer-3 (mp3).
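As a concrete illustration of embedding "in the integer domain by bit manipulation," here is a sketch using the classic reversible S-transform (integer average/difference via lifting) on sample pairs. The pairing, adaptive block selection, and reversibility bookkeeping of the paper are omitted; function names are hypothetical.

```python
def s_transform(x, y):
    """Reversible integer (S-)transform of a sample pair."""
    avg = (int(x) + int(y)) // 2    # integer average (lifting step)
    diff = int(x) - int(y)          # integer difference
    return avg, diff

def inverse_s_transform(avg, diff):
    x = avg + (diff + 1) // 2       # exactly inverts the lifting step
    y = x - diff
    return x, y

def embed_bit(samples, i, bit):
    """Hide one bit in the LSB of the difference coefficient of the pair
    (samples[i], samples[i+1]). Note: truly lossless recovery requires
    saving the overwritten LSBs, which this sketch does not show."""
    avg, diff = s_transform(samples[i], samples[i + 1])
    diff = (diff & ~1) | bit        # bit manipulation in the integer domain
    samples[i], samples[i + 1] = inverse_s_transform(avg, diff)
    return samples
```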

34 citations


Proceedings ArticleDOI
25 May 2005
TL;DR: An image enhancement algorithm based on histogram data gathered from transform domain coefficients that improves on the limitations of the histogram equalization method and achieves a much more balanced enhancement, outperforming classical histogram equalization.
Abstract: In this paper we propose an image enhancement algorithm that is based on utilizing histogram data gathered from transform domain coefficients and that improves on the limitations of the histogram equalization method. Traditionally, classical histogram equalization has had some problems due to its inherent dynamic range expansion. Many images with data tightly clustered around certain intensity values can be over-enhanced by standard histogram equalization, leading to artifacts and an overall tonal change of the image. In the transform domain, one has control over subtle image properties such as low and high frequency content with their respective magnitudes and phases. However, due to the nature of many of these transforms, the coefficients' histograms may be so tightly packed that distinguishing them from one another may be impossible. By placing the transform coefficients in the logarithmic transform domain, it is easy to see the difference between different quality levels of images based upon their logarithmic transform coefficient histograms. Our results demonstrate that combining the spatial method of histogram equalization with logarithmic transform domain coefficient histograms achieves a much more balanced enhancement that outperforms classical histogram equalization.
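A minimal sketch of the combination described above, assuming a DCT and equalizing the log-magnitude coefficient histogram through its empirical CDF; this is an illustration of the idea, not the paper's exact procedure.

```python
import numpy as np
from scipy.fft import dctn, idctn

def equalize_log_coefficients(image, bins=1024):
    """Histogram-equalize the log-magnitudes of DCT coefficients.

    Operates on the well-spread log-coefficient histogram rather than the
    tightly packed linear one; parameters are illustrative assumptions.
    """
    c = dctn(image.astype(float), norm='ortho')
    sign, mag = np.sign(c), np.log1p(np.abs(c))
    hist, edges = np.histogram(mag, bins=bins)
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]
    # map each log-magnitude through the empirical CDF (equalization)
    eq = np.interp(mag, edges[:-1], cdf) * mag.max()
    out = idctn(sign * np.expm1(eq), norm='ortho')
    return np.clip(out, 0, 255)
```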

33 citations


Proceedings ArticleDOI
21 Mar 2005
TL;DR: In this paper, a local universal steganalysis technique is presented that combines the advantages of global wavelet-based higher-order statistics methods and local DCT-based targeted methods, and can detect hidden messages without using prior information about the steganographic system.
Abstract: Universal blind steganalysis can detect hidden messages without using prior information about the steganographic system. Recently, Farid developed a wavelet coefficient, higher-order statistics based, universal blind steganalysis method. This is a global method which has demonstrated high performance. Fridrich and Goljan also presented a DCT based local targeted steganalysis method to break the F5 algorithm. However, both Farid's and Fridrich and Goljan's methods have some limitations. This paper presents a local universal steganalysis technique combining the advantages of both methods. The basic components of the presented method are: a novel DCT multilevel decomposition with wavelet structure; a new set of feature vectors; and a modified kernel function in the Kernel Fisher Discriminant. Experimental results show the presented method offers better performance than commonly used schemes. Inherently, the presented method has the ability to localize the hidden information: it can capture stego information in small blocks, and it is functional using only a small training set.
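The flavor of the feature extraction can be sketched as block-DCT coefficients summarized by higher-order statistics; the paper's wavelet-structured multilevel DCT decomposition and modified KFD kernel are more elaborate and are not reproduced here.

```python
import numpy as np
from scipy.fft import dctn

def block_dct_features(image, block=8):
    """Hedged sketch: a higher-order-statistics feature vector computed
    over block-DCT coefficient magnitudes, per frequency across blocks."""
    h, w = image.shape
    coeffs = []
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            blk = image[r:r + block, c:c + block].astype(float)
            coeffs.append(dctn(blk, norm='ortho'))
    coeffs = np.abs(np.array(coeffs)).reshape(len(coeffs), -1)
    mean = coeffs.mean(axis=0)
    var = coeffs.var(axis=0)
    centered = coeffs - mean
    skew = (centered ** 3).mean(axis=0) / np.maximum(var, 1e-12) ** 1.5
    kurt = (centered ** 4).mean(axis=0) / np.maximum(var, 1e-12) ** 2
    return np.concatenate([mean, var, skew, kurt])
```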

9 citations


Proceedings ArticleDOI
TL;DR: This work proposes a new adaptive technique that is able to overcome embedding capacity limitations and reduce the revealing artifacts that are customarily introduced when applying other embedding methods.
Abstract: Adaptive steganography is a statistical approach for hiding digital information within another form of digital media. The goal is to ensure the changes introduced into the cover image remain consistent with the natural noise model associated with digital images. There are generally two classes of steganography: global and local. The global class encompasses all non-adaptive techniques and is the simplest to apply and easiest to detect. The local class defines most of the present adaptive techniques. We propose a new adaptive technique that is able to overcome embedding capacity limitations and reduce the revealing artifacts that are customarily introduced when applying other embedding methods. To achieve these objectives, we introduce a third class: pixel-focused steganography. Applying a new adaptive T-order statistical local characterization, the proposed algorithm is able to adaptively select the number of bits to embed per pixel. Additionally, a histogram retention process, an evaluation measure based on the cover image, and statistical analysis allow for the embedding of information in a manner which ensures soundness from multiple statistical aspects. Based on the results of simulated experiments, our method is shown to securely allow an increased amount of embedding capacity while avoiding detection by various steganalysis techniques.
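To illustrate per-pixel adaptive bit allocation, here is a hedged sketch driven by a trimmed local order-statistic spread; the paper's actual T-order characterization, thresholds, and histogram retention are not shown, and all names and parameters below are assumptions.

```python
import numpy as np

def bits_per_pixel(image, t=4, max_bits=3):
    """Choose how many LSBs each pixel can carry from a local order
    statistic: the range between the t-th smallest and t-th largest
    values in a 3x3 neighborhood (a trimmed local range)."""
    padded = np.pad(image, 1, mode='edge')
    h, w = image.shape
    plan = np.zeros((h, w), dtype=int)
    for r in range(h):
        for c in range(w):
            win = np.sort(padded[r:r + 3, c:c + 3].ravel())
            spread = int(win[-t]) - int(win[t - 1])
            # busier (higher-spread) regions tolerate more embedded bits
            plan[r, c] = min(max_bits, spread // 32)
    return plan
```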

9 citations


Proceedings ArticleDOI
25 May 2005
TL;DR: In this paper, an algorithm for digital filtering using the logical transform is proposed, which is able to achieve mean-squared-error results similar to median type filters while maintaining image details.
Abstract: Median filters excel at removing impulse noise from digital signals, with high accuracy and quick running times. However, they have limitations if the resulting signals are to be used for feature recognition purposes, as they often remove crucial details and add unwanted noise. In this paper, an algorithm for digital filtering using the logical transform is proposed. This method is able to achieve mean-squared-error results similar to median-type filters while maintaining image details. Variations of the algorithm allow for greater noise reduction, but at the cost of increased computation.

Keywords: impulse noise, digital filtering, Histogram of Primary Implicants, Boolean Minimization, logical transform
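For context, the median-type baseline such a method is compared against can be sketched as a switching median filter, which replaces only pixels flagged as impulses; the logical-transform algorithm itself is not reproduced here, and the threshold below is an assumption.

```python
import numpy as np

def switching_median(image, thresh=40):
    """Detail-preserving impulse-filter baseline: replace a pixel with the
    local median only when it deviates strongly from that median, leaving
    fine detail untouched."""
    padded = np.pad(image, 1, mode='edge')
    out = image.copy()
    h, w = image.shape
    for r in range(h):
        for c in range(w):
            med = np.median(padded[r:r + 3, c:c + 3])
            if abs(int(image[r, c]) - med) > thresh:  # flagged as impulse
                out[r, c] = med
    return out
```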

8 citations


Book ChapterDOI
20 Dec 2005
TL;DR: A run-length-based steganography algorithm for embedding secured data into binary images, using a variable embedding rate for each block, which enhances both the security of the embedded data and the capacity of the embedding method.
Abstract: Binary images (like cartoons, text documents, signatures captured by signing pads, and other 2-color imagery) are very commonly used in our daily life. Changing the pixel values in these images to hide data may produce a noticeable change in the cover media. The primary problems are the capacity of the embedding technique and keeping the cover image free of visible artifacts even after embedding the secured data. In this paper, we present a run-length-based steganography algorithm for embedding secured data into binary images. The proposed algorithm alters pixels of the embeddable blocks of the cover image depending on their run-length characteristics and the characteristic values of the block. In addition, the new algorithm uses a variable embedding rate for each block, which enhances both the security of the embedded data and the capacity of the embedding method. We test the performance of the algorithm over 50 binary cover images of various sizes, embedding secured data of various sizes. Comparisons with existing algorithms are presented.

Keywords: run length, steganography, binary image.
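One way to picture run-length-based embedding in a binary image is to encode a bit as the parity of a selected run, moving the run boundary by a single pixel when the parity disagrees. This is an illustrative sketch only; the paper's block-level embeddability tests and variable rates are omitted.

```python
import numpy as np

def run_lengths(row):
    """Return (value, length) runs of a 1-D binary (0/1) array."""
    edges = np.flatnonzero(np.diff(row)) + 1
    starts = np.r_[0, edges]
    ends = np.r_[edges, len(row)]
    return [(row[s], e - s) for s, e in zip(starts, ends)]

def embed_bit_in_run(row, run_index, bit):
    """Encode one bit as the parity of the chosen run's length by flipping
    the run's last pixel (assumes the run is longer than one pixel so it
    is not annihilated)."""
    runs = run_lengths(row)
    value, length = runs[run_index]
    if length % 2 != bit:
        start = sum(l for _, l in runs[:run_index])
        row[start + length - 1] = 1 - value  # shorten the run by one pixel
    return row
```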

5 citations


Proceedings ArticleDOI
TL;DR: An original algorithm and two variations, all using the logical transform, that strive to do this accurately and with low levels of computation are presented.
Abstract: During critical situations, the precise digital processing of medical signals such as heartbeats is essential. Outside noise introduced into this data can lead to misinterpretation. Thus, it is important to be able to detect and correct the signal quickly and efficiently using digital filtering algorithms. With filtering, the goal is to remove noise by correctly identifying the corrupted data points and replacing them with acceptable estimations of the original values. However, one has to be careful throughout the filtering process not to also eliminate other important detailed information from the original signal. If the filtered output is to be analyzed post-filtering, say for feature recognition, it is important that both the structure and details of the original clean signal remain. This paper presents an original algorithm and two variations, all using the logical transform, that strive to do this accurately and with low levels of computation. Using real heartbeat signals as test sets, the output is compared to that produced by median-type filters, and results are demonstrated over a variety of noise levels.
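A detect-then-correct scheme of the kind described can be sketched in 1-D with a robust median/MAD outlier test standing in for the logical-transform detector; the window size, threshold, and names are illustrative assumptions.

```python
import numpy as np

def filter_signal(signal, window=5, thresh=3.5):
    """Flag samples that deviate from the local median by many MADs, and
    correct only the flagged samples, preserving clean detail."""
    half = window // 2
    padded = np.pad(signal.astype(float), half, mode='edge')
    out = signal.astype(float).copy()
    for i in range(len(signal)):
        win = padded[i:i + window]
        med = np.median(win)
        mad = np.median(np.abs(win - med)) + 1e-9
        if abs(signal[i] - med) / mad > thresh:  # flagged as impulse
            out[i] = med                         # replace only this sample
    return out
```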

4 citations


Proceedings ArticleDOI
17 Oct 2005
TL;DR: This paper suggests an approach to obtaining a position fix without time-stamp recovery, based on an iterated least-squares method that uses an excess number of satellites to perform positioning with partially available measurements.
Abstract: The GPS receiver operation in a jamming environment brings several challenging tasks. If the signal is lost for a certain period of time, the reacquisition delay can be undesirable for time-critical applications. The delay is due to the necessity of finding the lost signal again and determining the time stamp on this signal, which is contained in the navigation message. While the signal can be found in sub-second intervals with state-of-the-art prediction and searching strategies, reading the time stamp from the navigation message may take up to six seconds even under good signal conditions. This paper suggests an approach to obtaining a position fix without this time-stamp recovery. It is based on an iterated least-squares method and uses an excess number of satellites to perform positioning with partially available measurements. It was suggested initially for indoor positioning but is equally valuable for jamming environments. The method is simulated with GPS L1 C/A signals, which provided evidence of its potential.
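The iterated least-squares core is standard and can be sketched as follows: linearize the pseudorange equations around the current estimate and solve for position and receiver clock-bias corrections using all (possibly redundant) satellites. The paper's treatment of the missing time stamp is not reproduced; names are illustrative.

```python
import numpy as np

def ils_position(sat_pos, pseudoranges, x0=np.zeros(3), iters=10):
    """Iterated least-squares GPS fix.

    sat_pos: (m, 3) satellite positions; pseudoranges: (m,) measurements.
    Solves for [x, y, z, clock bias] by repeated linearization."""
    x = np.append(np.asarray(x0, dtype=float), 0.0)  # position + clock bias
    for _ in range(iters):
        vec = sat_pos - x[:3]
        rho = np.linalg.norm(vec, axis=1)            # predicted ranges
        # geometry matrix: unit line-of-sight vectors plus clock column
        H = np.hstack([-vec / rho[:, None], np.ones((len(rho), 1))])
        dz = pseudoranges - (rho + x[3])             # measurement residuals
        dx, *_ = np.linalg.lstsq(H, dz, rcond=None)  # overdetermined solve
        x += dx
    return x[:3], x[3]
```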

4 citations



Proceedings ArticleDOI
TL;DR: This paper proposes a new adaptive embedding technique which alters the least significant bit layers of an image and introduces a new stego capacity measure which will offer a means of comparing steganography methods applied across different formats of images.
Abstract: Adaptive steganographic techniques have become a standard direction taken when striving to complicate the detection of secret communication. Considering cover image features when embedding information is an effort to insert digital media while keeping the visual and statistical properties of the cover image intact. There are several such embedding methods in existence today, applicable to different image formats and with contrasting approaches. In this paper, we propose a new adaptive embedding technique which alters the least significant bit layers of an image. This technique is driven by three separate functions: (1) adaptive selection of locations to embed, (2) adaptive selection of the number of bits per pixel to embed, and (3) adaptive selection of the manner in which the information is inserted. Through the application of sensitive median-based statistical estimation and a recorded account of actions taken, the proposed algorithms are able to provide the desired level of security, both visually and statistically. In comparison with other methods offering the same level of security, the new technique offers a greater embedding capacity. In addition, for the sake of thorough investigation and fair comparison, we introduce a new stego capacity measure which offers a means of comparing steganography methods applied across different image formats. Finally, this new algorithm is created with the intention of implementing a new visual communication system for devices with low energy consumption and limited computational resources.
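Function (1), adaptive location selection, might look like the following hedged sketch, where a median-based spread statistic (MAD) decides whether a pixel's neighborhood is busy enough to mask LSB changes; the threshold and names are assumptions, and functions (2) and (3) are not shown.

```python
import numpy as np

def select_locations(image, mad_thresh=8):
    """Mark a pixel embeddable when its 3x3 neighborhood shows enough
    median-based spread (median absolute deviation) to hide LSB changes."""
    padded = np.pad(image.astype(float), 1, mode='edge')
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)
    for r in range(h):
        for c in range(w):
            win = padded[r:r + 3, c:c + 3]
            mad = np.median(np.abs(win - np.median(win)))
            mask[r, c] = mad >= mad_thresh
    return mask
```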

Journal ArticleDOI
TL;DR: A general and efficient algorithm for decomposition of binary matrices and the corresponding fast transform is developed and it is demonstrated that, in typical applications, the proposed algorithm is significantly more efficient than the conventional Walsh-Hadamard transform.
Abstract: Binary matrices or (±1)-matrices have numerous applications in coding, signal processing, and communications. In this paper, a general and efficient algorithm for decomposition of binary matrices and the corresponding fast transform is developed. As a special case, Hadamard matrices are considered. The difficulties of constructing 4n-point Hadamard transforms are related to the Hadamard problem: the question of the existence of Hadamard matrices. (It is not known whether for every integer n there is an orthogonal 4n × 4n matrix with elements ±1.) In the derived fast algorithms, the number of real operations is reduced from O(N²) to O(N log N) compared to direct computation. The proposed scheme requires no zero padding of the input data. Comparisons revealing the efficiency of the proposed algorithms with respect to the known ones are given. In particular, it is demonstrated that, in typical applications, the proposed algorithm is significantly more efficient than the conventional Walsh-Hadamard transform. Note that for Hadamard matrices of orders ≥ 96 the general algorithm is more efficient than the classical Walsh-Hadamard transform, whose order is a power of 2. The algorithm has a simple and symmetric structure. The results of numerical examples are presented.
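As a point of reference, the classical radix-2 fast Walsh-Hadamard transform the paper compares against uses O(N log N) additions and subtractions; the paper's general 4n-order decomposition (which avoids zero padding) is more involved and is not shown here.

```python
import numpy as np

def fwht(x):
    """Classical in-place fast Walsh-Hadamard transform (unnormalized),
    for input length a power of two."""
    x = np.asarray(x, dtype=float).copy()
    n = len(x)
    assert n > 0 and (n & (n - 1)) == 0, "length must be a power of two"
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b           # butterfly: sum
            x[i + h:i + 2 * h] = a - b   # butterfly: difference
        h *= 2
    return x
```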

Proceedings ArticleDOI
TL;DR: A reduced-complexity frequency domain convolution approach for searching over a limited number of code phases in embedded algorithms for spread spectrum communication receivers that use the FFT as an engine to compute convolutions.
Abstract: Most current wireless communication devices use embedded processors for performing different tasks such as physical layer signal processing and multimedia applications. Embedded processors provide a reasonable trade-off between application-specific implementation and hardware sharing by different algorithms, allowing more optimal design and flexibility. At the same time, the widespread popularity of these processors drives the development of algorithms specifically tailored for embedded environments. The Fast Fourier Transform (FFT) is a universal tool which has found many applications in communications, and many application-specific architectures and Digital Signal Processor (DSP) implementations are available for it. In this paper our focus is on embedded algorithms for spread spectrum communication receivers, which use the FFT as an engine to compute convolutions. Using FFT-based correlators, one can search over all possible so-called code phases of a direct sequence spread spectrum (DS-SS) signal in parallel with fewer operations than conventional correlators require. However, in many real-life scenarios the receiver is provided with timing assistance which confines the code-phase uncertainty to a limited region. The full FFT-based search then becomes redundant, and a reasonable strategy is to modify the FFT-based methods for better utilization of embedded processor resources. In this paper we suggest a reduced-complexity frequency domain convolution approach for the search over a limited number of code phases.
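The baseline the reduced-complexity method improves on is full-parallel FFT-based circular correlation over every code phase, sketched below; when timing assistance narrows the search to a few phases, much of this computation is wasted.

```python
import numpy as np

def code_phase_search(received, code):
    """Circular correlation of the received signal against a local code
    replica, computed for all code phases at once via the FFT."""
    R = np.fft.fft(received)
    C = np.fft.fft(code)
    corr = np.fft.ifft(R * np.conj(C))  # correlation at every phase
    mag = np.abs(corr)
    return mag, int(np.argmax(mag))     # correlation profile, best phase
```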

Proceedings ArticleDOI
25 May 2005
TL;DR: An alternative capacity for steganographic systems is investigated, defined as the maximum number of embeddable bits within a digital signal while maintaining imperceptibility requirements.
Abstract: The goal of this article is to investigate an alternative capacity for steganographic systems. We define steganographic capacity as the maximum number of embeddable bits within a digital signal while maintaining imperceptibility requirements. This capacity helps address two fundamental steganographic problems: first, how to choose the best cover image among classes of images, and second, which embedding method may be employed to reduce the detection of hidden information within the embedded areas. In addition, the new capacity may be used for 1) the separation of an analyzed image into embeddable areas, 2) the identification of maximum embedding capacities within a cover digital image, and 3) estimating the length in bits of the information embedded within the identified regions.
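A toy version of such a capacity map might assign more embeddable bits to busier blocks, as in this sketch; the scaling rule and parameter names are purely illustrative assumptions, not the paper's measure.

```python
import numpy as np

def block_capacity(image, block=8, bits_per_level=0.5):
    """Per-block capacity estimate: blocks with higher standard deviation
    are assumed to hide more bits imperceptibly. Total estimated capacity
    is cap.sum() bits."""
    h, w = image.shape
    cap = np.zeros((h // block, w // block), dtype=int)
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            blk = image[r:r + block, c:c + block].astype(float)
            cap[r // block, c // block] = int(bits_per_level * blk.std())
    return cap
```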

Proceedings ArticleDOI
25 May 2005
TL;DR: This paper introduces two new compression techniques that compress random, noise-like data with reference to a known pseudo-noise sequence generated using a key, and develops a representation model for digital media using pseudo-noise signals.
Abstract: Compression is a technique used to encode data so that it needs less storage/memory space. Compression of random data is vital when we need to preserve data that has low redundancy and whose power spectrum is close to that of noise. Noisy signals used in various data hiding schemes have low redundancy and a low energy spectrum, so upon compressing with lossy algorithms the low-energy spectral content might get lost. Since LSB-plane data has low redundancy, lossless compression algorithms like run-length, Huffman, and arithmetic coding are ineffective at providing a good compression ratio. These problems motivated the development of a new class of compression algorithms for compressing noisy signals. In this paper, we introduce two new compression techniques that compress random, noise-like data with reference to a known pseudo-noise sequence generated using a key. In addition, we develop a representation model for digital media using pseudo-noise signals. In simulations, we compare our methods with existing compression techniques such as run-length coding, showing that run-length coding cannot compress random data while the proposed algorithms can. Furthermore, the proposed algorithms can be extended to all kinds of random data used in various applications.
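The reference-based idea can be sketched as follows, under the assumption (ours, not the paper's stated construction) that compression amounts to XOR-ing the data with the known key-seeded PN sequence and run-length coding the sparse residual.

```python
import numpy as np

def pn_sequence(key, n):
    """Key-seeded pseudo-noise bit sequence. NumPy's PRNG stands in here
    for whatever PN generator the paper actually uses."""
    return np.random.default_rng(key).integers(0, 2, n, dtype=np.uint8)

def compress_against_pn(bits, key):
    """If the data is close to the known PN sequence, the XOR residual is
    sparse and run-length codes well, where the raw data would not."""
    residual = bits ^ pn_sequence(key, len(bits))
    # run-length encode the (ideally sparse) residual
    edges = np.flatnonzero(np.diff(residual)) + 1
    starts = np.r_[0, edges]
    ends = np.r_[edges, len(residual)]
    return residual[starts], ends - starts  # (run values, run lengths)
```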