SciSpace (formerly Typeset)
Author

Sos S. Agaian

Bio: Sos S. Agaian is an academic researcher from the City University of New York. The author has contributed to research in topics including image processing and steganography. The author has an h-index of 38 and has co-authored 532 publications receiving 8,216 citations. Previous affiliations of Sos S. Agaian include the College of Staten Island and the University of Texas System.


Papers
01 Jan 2011
TL;DR: The question of whether a given NPCR/UACI score is sufficiently high such that it is not discernible from ideally encrypted images is answered by comparing actual NPCR and UACI scores with corresponding critical values.
Abstract: The number of changing pixel rate (NPCR) and the unified averaged changed intensity (UACI) are the two most common quantities used to evaluate the strength of image encryption algorithms/ciphers with respect to differential attacks. Conventionally, a high NPCR/UACI score is interpreted as high resistance to differential attacks. However, it is not clear how high an NPCR/UACI score must be for an image cipher to indeed have a high security level. In this paper, we approach this problem by establishing a mathematical model for ideally encrypted images and then deriving the expectations and variances of NPCR and UACI under this model. These theoretical values are then used to form statistical hypothesis tests for NPCR and UACI. Critical values of the tests are consequently derived and calculated both symbolically and numerically. As a result, the question of whether a given NPCR/UACI score is sufficiently high that it is not discernible from ideally encrypted images is answered by comparing actual NPCR/UACI scores with the corresponding critical values. Experimental results using the NPCR and UACI randomness tests show that many existing image encryption methods are not as good as they are purported to be, although some methods do pass these randomness tests.

857 citations
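The two scores themselves are straightforward to compute. Below is a minimal NumPy sketch for a pair of 8-bit cipher images; it is an illustration only and does not reproduce the paper's hypothesis tests or critical values.

```python
import numpy as np

def npcr_uaci(c1, c2, levels=256):
    """NPCR and UACI (in percent) between two equal-sized 8-bit cipher images."""
    c1 = c1.astype(np.int64)
    c2 = c2.astype(np.int64)
    npcr = (c1 != c2).mean() * 100.0                          # changed-pixel rate
    uaci = (np.abs(c1 - c2) / (levels - 1)).mean() * 100.0    # average intensity change
    return npcr, uaci

# Two unrelated random "cipher images" land near the ideal expectations
# (about 99.61% NPCR and 33.46% UACI for 8-bit images).
rng = np.random.default_rng(0)
a = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
b = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
print(npcr_uaci(a, b))
```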

Journal ArticleDOI
TL;DR: A new nonreference underwater image quality measure (UIQM) is presented. It comprises three underwater image attribute measures, each selected to evaluate one aspect of underwater image degradation and each inspired by the properties of human visual systems (HVSs).
Abstract: Underwater images suffer from blurring, low contrast, and grayed-out colors due to absorption and scattering effects under water. Many image enhancement algorithms have been developed to improve the visual quality of underwater images. Unfortunately, no well-accepted objective measure exists that can evaluate the quality of underwater images in a way that matches human perception. Predominant underwater image processing algorithms use either subjective evaluation, which is time consuming and biased, or a generic image quality measure, which fails to consider the properties of underwater images. To address this problem, a new nonreference underwater image quality measure (UIQM) is presented in this paper. The UIQM comprises three underwater image attribute measures: the underwater image colorfulness measure (UICM), the underwater image sharpness measure (UISM), and the underwater image contrast measure (UIConM). Each attribute is selected to evaluate one aspect of underwater image degradation, and each presented attribute measure is inspired by the properties of human visual systems (HVSs). The experimental results demonstrate that the measures effectively evaluate underwater image quality in accordance with human perception. These measures are also applied to the AirAsia 8501 wreckage images to show their importance in practical applications.

671 citations
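As a rough illustration of how a composite, no-reference score of this kind can be assembled, the sketch below combines a simple colorfulness term (opponent-channel statistics), a sharpness term (mean gradient magnitude), and a block-wise contrast term with equal, hypothetical weights. The actual UICM, UISM, and UIConM formulations and the published UIQM weights are not reproduced here.

```python
import numpy as np

def toy_underwater_score(rgb, w_color=1.0, w_sharp=1.0, w_contrast=1.0):
    """Toy colorfulness + sharpness + contrast score for an RGB image in [0, 255].
    Simplified stand-ins for UICM/UISM/UIConM; the weights are hypothetical."""
    img = rgb.astype(np.float64) / 255.0
    r, g, b = img[..., 0], img[..., 1], img[..., 2]

    # Colorfulness from the RG and YB opponent channels.
    rg, yb = r - g, 0.5 * (r + g) - b
    color = np.hypot(rg.std(), yb.std()) + 0.3 * np.hypot(rg.mean(), yb.mean())

    # Sharpness as the mean gradient magnitude of luminance.
    lum = 0.299 * r + 0.587 * g + 0.114 * b
    gy, gx = np.gradient(lum)
    sharp = np.hypot(gx, gy).mean()

    # Contrast as the average Michelson contrast over 8x8 blocks.
    k = 8
    h, w = (lum.shape[0] // k) * k, (lum.shape[1] // k) * k
    blocks = lum[:h, :w].reshape(h // k, k, w // k, k)
    bmax, bmin = blocks.max(axis=(1, 3)), blocks.min(axis=(1, 3))
    contrast = np.mean((bmax - bmin) / (bmax + bmin + 1e-12))

    return w_color * color + w_sharp * sharp + w_contrast * contrast
```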

Journal ArticleDOI
TL;DR: The presented algorithms use the fact that the relationship between stimulus and perception is logarithmic and afford a marriage between enhancement qualities and computational efficiency to choose the best parameters and transform for each enhancement.
Abstract: Many applications of histograms for the purposes of image processing are well known. However, applying this process to the transform domain by way of a transform coefficient histogram has not yet been fully explored. This paper proposes three methods of image enhancement: a) logarithmic transform histogram matching, b) logarithmic transform histogram shifting, and c) logarithmic transform histogram shaping using Gaussian distributions. They are based on the properties of the logarithmic transform domain histogram and histogram equalization. The presented algorithms use the fact that the relationship between stimulus and perception is logarithmic, and they afford a marriage between enhancement quality and computational efficiency. A human visual system-based quantitative measurement of image contrast improvement is also defined; this helps choose the best parameters and transform for each enhancement. A number of experimental results are presented to illustrate the performance of the proposed algorithms.

527 citations
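The logarithmic idea can be illustrated with a very small experiment: equalize the histogram of log(1 + I) rather than of I itself, then map back. This is only a sketch of the log-domain principle; the paper's three methods operate on transform-coefficient histograms (matching, shifting, and Gaussian shaping) and are not reproduced here.

```python
import numpy as np

def log_domain_equalize(gray):
    """Histogram equalization applied to log(1 + I) instead of I directly.
    Illustrates the logarithmic (Weber-Fechner) perception idea only."""
    img = gray.astype(np.float64)
    log_img = np.log1p(img)                               # compress dynamic range

    hist, edges = np.histogram(log_img, bins=256)
    cdf = hist.cumsum().astype(np.float64)
    cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1e-12)
    centers = 0.5 * (edges[:-1] + edges[1:])
    eq_log = np.interp(log_img, centers, cdf) * np.log1p(255.0)

    out = np.expm1(eq_log)                                # back to intensities
    return np.clip(out, 0, 255).astype(np.uint8)
```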

Journal ArticleDOI
TL;DR: The proposed local Shannon entropy measure overcomes several weaknesses of the conventional global Shannon entropy measure, including unfair randomness comparisons between images of different sizes, failure to discern image randomness before and after image shuffling, and possibly inaccurate scores for synthesized images.

476 citations
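A minimal sketch of the local-entropy idea: sample several non-overlapping blocks, compute the Shannon entropy of each, and average. The block count and size below are illustrative choices only, and the paper's statistical acceptance intervals are not reproduced.

```python
import numpy as np

def local_shannon_entropy(gray, k=30, block=44, seed=None):
    """Mean Shannon entropy (bits) over k randomly chosen non-overlapping blocks
    of an 8-bit grayscale image. Parameters here are illustrative only."""
    rng = np.random.default_rng(seed)
    rows, cols = gray.shape[0] // block, gray.shape[1] // block
    if rows * cols < k:
        raise ValueError("image too small for the requested number of blocks")
    entropies = []
    for i in rng.choice(rows * cols, size=k, replace=False):
        r, c = divmod(int(i), cols)
        tile = gray[r * block:(r + 1) * block, c * block:(c + 1) * block]
        p = np.bincount(tile.ravel(), minlength=256) / tile.size
        p = p[p > 0]
        entropies.append(-(p * np.log2(p)).sum())
    return float(np.mean(entropies))
```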

Journal ArticleDOI
TL;DR: A new class of the "frequency domain"-based signal/image enhancement algorithms including magnitude reduction, log-magnitude reduction, iterative magnitude and a log-reduction zonal magnitude technique, based on the so-called sequency ordered orthogonal transforms, which include the well-known Fourier, Hartley, cosine, and Hadamard transforms.
Abstract: This paper presents a new class of the "frequency domain"-based signal/image enhancement algorithms including magnitude reduction, log-magnitude reduction, iterative magnitude and a log-reduction zonal magnitude technique. These algorithms are described and applied for detection and visualization of objects within an image. The new technique is based on the so-called sequency ordered orthogonal transforms, which include the well-known Fourier, Hartley, cosine, and Hadamard transforms, as well as new enhancement parametric operators. A wide range of image characteristics can be obtained from a single transform, by varying the parameters of the operators. We also introduce a quantifying method to measure signal/image enhancement called EME. This helps choose the best parameters and transform for each enhancement. A number of experimental results are presented to illustrate the performance of the proposed algorithms.

373 citations
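One widely cited form of the EME measure averages the log-ratio of block maxima to block minima over a grid; the sketch below uses that form with a small epsilon to guard against zero minima. The grid size and epsilon are arbitrary choices, and published variants of EME differ in these details.

```python
import numpy as np

def eme(gray, k1=8, k2=8, eps=1e-4):
    """Measure of enhancement: mean of 20*log10(block_max / block_min)
    over a k1 x k2 grid of blocks (one common formulation of EME)."""
    img = gray.astype(np.float64)
    bh, bw = img.shape[0] // k1, img.shape[1] // k2
    total = 0.0
    for i in range(k1):
        for j in range(k2):
            blk = img[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            total += 20.0 * np.log10((blk.max() + eps) / (blk.min() + eps))
    return total / (k1 * k2)
```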


Cited by
Journal ArticleDOI


08 Dec 2001 - BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one; it seemed an odd beast at the time, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i, the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time, an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Journal ArticleDOI
TL;DR: The 11th edition of Harrison's Principles of Internal Medicine welcomes Anthony Fauci to its editorial staff, in addition to more than 85 new contributors.
Abstract: The 11th edition of Harrison's Principles of Internal Medicine welcomes Anthony Fauci to its editorial staff, in addition to more than 85 new contributors. While the organization of the book is similar to previous editions, major emphasis has been placed on disorders that affect multiple organ systems. Important advances in genetics, immunology, and oncology are emphasized. Many chapters of the book have been rewritten and describe major advances in internal medicine. Subjects that received only a paragraph or two of attention in previous editions are now covered in entire chapters. Among the chapters that have been extensively revised are the chapters on infections in the compromised host, on skin rashes in infections, on many of the viral infections, including cytomegalovirus and Epstein-Barr virus, on sexually transmitted diseases, on diabetes mellitus, on disorders of bone and mineral metabolism, and on lymphadenopathy and splenomegaly. The major revisions in these chapters and many

6,968 citations

01 Apr 1997
TL;DR: The objective of this paper is to give a comprehensive introduction to applied cryptography with an engineer or computer scientist in mind, emphasizing the knowledge needed to create practical systems that support integrity, confidentiality, or authenticity.
Abstract: The objective of this paper is to give a comprehensive introduction to applied cryptography with an engineer or computer scientist in mind. The emphasis is on the knowledge needed to create practical systems that support integrity, confidentiality, or authenticity. Topics covered include an introduction to the concepts in cryptography, attacks against cryptographic systems, key use and handling, random bit generation, encryption modes, and message authentication codes. Recommendations on algorithms and further reading are given at the end of the paper. This paper should enable the reader to build, understand, and evaluate system descriptions and designs based on the cryptographic components described in the paper.

2,188 citations
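As a small illustration of one of the listed topics, message authentication codes, the snippet below uses Python's standard-library HMAC; it is not code from the surveyed paper.

```python
import hashlib
import hmac
import secrets

key = secrets.token_bytes(32)                     # shared 256-bit MAC key
message = b"transfer 100 units to account 42"

# Sender computes the tag; receiver recomputes it and compares in constant time.
tag = hmac.new(key, message, hashlib.sha256).digest()
ok = hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).digest())
print(ok)  # True only if both key and message are unchanged
```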

Proceedings Article
01 Jan 1994
TL;DR: The main focus in MUCKE is on cleaning large scale Web image corpora and on proposing image representations which are closer to the human interpretation of images.
Abstract: MUCKE aims to mine a large volume of images, to structure them conceptually, and to use this conceptual structuring to improve large-scale image retrieval. The last decade witnessed important progress concerning low-level image representations. However, a number of problems need to be solved in order to unleash the full potential of image mining in applications. The central problem with low-level representations is the mismatch between them and the human interpretation of image content. This problem can be instantiated, for instance, by the inability of existing descriptors to capture spatial relationships between the concepts represented, or by their inability to convey an explanation of why two images are similar in a content-based image retrieval framework. We start by assessing existing local descriptors for image classification and by proposing to use co-occurrence matrices to better capture spatial relationships in images. The main focus in MUCKE is on cleaning large-scale Web image corpora and on proposing image representations which are closer to the human interpretation of images. Consequently, we introduce methods which tackle these two problems and compare results to state-of-the-art methods. Note: some aspects of this deliverable are withheld at this time as they are pending review. Please contact the authors for a preview.

2,134 citations
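Co-occurrence statistics of the kind mentioned above can be illustrated at the pixel level with a gray-level co-occurrence matrix for a single displacement. This is a generic sketch, not MUCKE's concept-level representation.

```python
import numpy as np

def cooccurrence_matrix(gray, dr=0, dc=1, levels=256):
    """Normalized gray-level co-occurrence matrix for displacement (dr, dc),
    with dr, dc >= 0. A generic illustration of co-occurrence features."""
    h, w = gray.shape
    a = gray[:h - dr, :w - dc].ravel().astype(np.intp)
    b = gray[dr:, dc:].ravel().astype(np.intp)
    glcm = np.zeros((levels, levels), dtype=np.float64)
    np.add.at(glcm, (a, b), 1.0)                 # count co-occurring pairs
    return glcm / glcm.sum()                     # joint probabilities
```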

Journal Article
TL;DR: In this article, the authors explore the effect of dimensionality on the nearest neighbor problem and show that under a broad set of conditions (much broader than independent and identically distributed dimensions), as dimensionality increases, the distance to the nearest data point approaches the distance of the farthest data point.
Abstract: We explore the effect of dimensionality on the nearest neighbor problem. We show that under a broad set of conditions (much broader than independent and identically distributed dimensions), as dimensionality increases, the distance to the nearest data point approaches the distance to the farthest data point. To provide a practical perspective, we present empirical results on both real and synthetic data sets that demonstrate that this effect can occur for as few as 10-15 dimensions. These results should not be interpreted to mean that high-dimensional indexing is never meaningful; we illustrate this point by identifying some high-dimensional workloads for which this effect does not occur. However, our results do emphasize that the methodology used almost universally in the database literature to evaluate high-dimensional indexing techniques is flawed and should be modified. In particular, most such techniques proposed in the literature are not evaluated against a simple linear scan, and are evaluated over workloads for which nearest neighbor is not meaningful. Often, even the reported experiments, when analyzed carefully, show that linear scan would outperform the techniques being proposed on the workloads studied in high (10-15) dimensionality!

1,992 citations
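The distance-concentration effect described above is easy to reproduce empirically; the sketch below compares the farthest and nearest Euclidean distances from a random query to i.i.d. uniform points as the dimensionality grows. The point counts and dimensions are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
for d in (2, 10, 15, 100, 1000):
    data = rng.random((n, d))                    # n uniform points in [0, 1]^d
    query = rng.random(d)
    dist = np.linalg.norm(data - query, axis=1)
    print(f"d={d:5d}  farthest/nearest = {dist.max() / dist.min():.2f}")
```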