Journal Article • DOI

Image quality measures and their performance

01 Dec 1995 - IEEE Transactions on Communications (IEEE) - Vol. 43, Iss. 12, pp. 2959-2965
TL;DR: Although some numerical measures correlate well with the observers' response for a given compression technique, they are not reliable for an evaluation across different techniques, and a graphical measure called Hosaka plots can be used to appropriately specify not only the amount, but also the type of degradation in reconstructed images.
Abstract: A number of quality measures are evaluated for gray scale image compression. They are all bivariate, exploiting the differences between corresponding pixels in the original and degraded images. It is shown that although some numerical measures correlate well with the observers' response for a given compression technique, they are not reliable for an evaluation across different techniques. A graphical measure called Hosaka plots, however, can be used to appropriately specify not only the amount, but also the type of degradation in reconstructed images.
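Most of the measures evaluated in the paper are bivariate pixel-difference statistics of this kind. As a minimal, hedged sketch (function names and toy data are illustrative, not taken from the paper), MSE and PSNR for 8-bit grayscale images can be computed as follows.

```python
# Minimal sketch of two common bivariate pixel-difference measures (MSE, PSNR)
# for 8-bit grayscale images; names and toy data are illustrative only.
import numpy as np

def mse(original: np.ndarray, degraded: np.ndarray) -> float:
    """Mean squared error between two same-sized grayscale images."""
    diff = original.astype(np.float64) - degraded.astype(np.float64)
    return float(np.mean(diff ** 2))

def psnr(original: np.ndarray, degraded: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB (peak = 255 for 8-bit images)."""
    err = mse(original, degraded)
    if err == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak ** 2 / err)

# Toy usage: compare a random image against a noisy copy of itself.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64)).astype(np.float64)
dist = np.clip(ref + rng.normal(0, 5, size=ref.shape), 0, 255)
print(f"MSE={mse(ref, dist):.2f}  PSNR={psnr(ref, dist):.2f} dB")
```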
Citations
Journal Article • DOI
TL;DR: In this article, a structural similarity index is proposed for image quality assessment based on the degradation of structural information; its promise is demonstrated through intuitive examples and through comparison with subjective ratings and state-of-the-art objective methods on a database of images compressed with JPEG and JPEG2000.
Abstract: Objective methods for assessing perceptual image quality traditionally attempted to quantify the visibility of errors (differences) between a distorted image and a reference image using a variety of known properties of the human visual system. Under the assumption that human visual perception is highly adapted for extracting structural information from a scene, we introduce an alternative complementary framework for quality assessment based on the degradation of structural information. As a specific example of this concept, we develop a structural similarity index and demonstrate its promise through a set of intuitive examples, as well as comparison to both subjective ratings and state-of-the-art objective methods on a database of images compressed with JPEG and JPEG2000. A MATLAB implementation of the proposed algorithm is available online at http://www.cns.nyu.edu/~lcv/ssim/.
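As a rough illustration of the structural similarity idea, the following sketch computes a single-window SSIM-style score from local means, variances, and covariance; the published index uses a local (e.g. 11x11 Gaussian-weighted) sliding window and averages the resulting quality map, so this simplified global version is indicative only.

```python
# Single-window SSIM-style score; the published SSIM averages this quantity
# over a sliding local window rather than computing it once globally.
import numpy as np

def ssim_global(x: np.ndarray, y: np.ndarray, L: float = 255.0,
                K1: float = 0.01, K2: float = 0.03) -> float:
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    C1, C2 = (K1 * L) ** 2, (K2 * L) ** 2      # stabilizing constants
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    # Luminance, contrast, and structure comparisons combined in one expression.
    return float(((2 * mu_x * mu_y + C1) * (2 * cov_xy + C2)) /
                 ((mu_x ** 2 + mu_y ** 2 + C1) * (var_x + var_y + C2)))
```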

40,609 citations

Journal Article • DOI
TL;DR: Although the new index is mathematically defined and no human visual system model is explicitly employed, experiments on various image distortion types indicate that it performs significantly better than the widely used distortion metric mean squared error.
Abstract: We propose a new universal objective image quality index, which is easy to calculate and applicable to various image processing applications. Instead of using traditional error summation methods, the proposed index is designed by modeling any image distortion as a combination of three factors: loss of correlation, luminance distortion, and contrast distortion. Although the new index is mathematically defined and no human visual system model is explicitly employed, our experiments on various image distortion types indicate that it performs significantly better than the widely used distortion metric mean squared error. Demonstrative images and an efficient MATLAB implementation of the algorithm are available online at http://anchovy.ece.utexas.edu/~zwang/research/quality_index/demo.html.
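The three factors named in the abstract can be made explicit in a short sketch; the published index computes this product over small sliding windows and averages the results, whereas the version below evaluates it once over the whole image.

```python
# Universal quality index Q as the product of three factors: loss of correlation,
# luminance distortion, and contrast distortion. Computed globally here for
# brevity; the published index averages local window scores instead.
import numpy as np

def universal_quality_index(x: np.ndarray, y: np.ndarray) -> float:
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    mu_x, mu_y = x.mean(), y.mean()
    sig_x, sig_y = x.std(), y.std()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    # Constant images would need a guard against division by zero.
    correlation = cov_xy / (sig_x * sig_y)                      # loss of correlation
    luminance = 2 * mu_x * mu_y / (mu_x ** 2 + mu_y ** 2)       # luminance distortion
    contrast = 2 * sig_x * sig_y / (sig_x ** 2 + sig_y ** 2)    # contrast distortion
    return float(correlation * luminance * contrast)
```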

5,285 citations


Cites background from "Image quality measures and their performance"

  • ...Unfortunately, none of these complicated objective metrics in the literature has shown any clear advantage over simple mathematical measures such as RMSE and PSNR under strict testing conditions and different image distortion environments [3]–[5]....

Proceedings Article • DOI
09 Nov 2003
TL;DR: This paper proposes a multiscale structural similarity method, which supplies more flexibility than previous single-scale methods in incorporating the variations of viewing conditions, and develops an image synthesis method to calibrate the parameters that define the relative importance of different scales.
Abstract: The structural similarity image quality paradigm is based on the assumption that the human visual system is highly adapted for extracting structural information from the scene, and therefore a measure of structural similarity can provide a good approximation to perceived image quality. This paper proposes a multiscale structural similarity method, which supplies more flexibility than previous single-scale methods in incorporating the variations of viewing conditions. We develop an image synthesis method to calibrate the parameters that define the relative importance of different scales. Experimental comparisons demonstrate the effectiveness of the proposed method.
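A hedged sketch of the multiscale idea is given below: contrast/structure comparisons are made at several dyadic scales obtained by 2x2 average pooling, with the luminance comparison applied only at the coarsest scale. The equal scale weights are placeholders; the paper calibrates the actual weights with an image synthesis experiment.

```python
# Simplified multiscale SSIM sketch with placeholder (equal) scale weights;
# the published method calibrates these weights via image synthesis.
import numpy as np

def _downsample2(img: np.ndarray) -> np.ndarray:
    """2x2 average pooling (a crude stand-in for low-pass filtering + decimation)."""
    h, w = (img.shape[0] // 2) * 2, (img.shape[1] // 2) * 2
    img = img[:h, :w]
    return (img[0::2, 0::2] + img[1::2, 0::2] + img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

def ms_ssim_sketch(x: np.ndarray, y: np.ndarray, scales: int = 5,
                   L: float = 255.0, K1: float = 0.01, K2: float = 0.03) -> float:
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    C1, C2 = (K1 * L) ** 2, (K2 * L) ** 2
    weights = np.full(scales, 1.0 / scales)   # placeholder weights
    score = 1.0
    for s in range(scales):
        mu_x, mu_y = x.mean(), y.mean()
        var_x, var_y = x.var(), y.var()
        cov = ((x - mu_x) * (y - mu_y)).mean()
        cs = max((2 * cov + C2) / (var_x + var_y + C2), 0.0)   # contrast/structure term
        if s == scales - 1:
            # Luminance comparison only at the coarsest scale.
            lum = (2 * mu_x * mu_y + C1) / (mu_x ** 2 + mu_y ** 2 + C1)
            score *= (lum * cs) ** weights[s]
        else:
            score *= cs ** weights[s]
            x, y = _downsample2(x), _downsample2(y)
    return float(score)
```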

4,333 citations

Journal Article • DOI
TL;DR: An image information measure is proposed that quantifies the information present in the reference image and how much of that reference information can be extracted from the distorted image; combined, these two quantities form a visual information fidelity measure for image QA.
Abstract: Measurement of visual quality is of fundamental importance to numerous image and video processing applications. The goal of quality assessment (QA) research is to design algorithms that can automatically assess the quality of images or videos in a perceptually consistent manner. Image QA algorithms generally interpret image quality as fidelity or similarity with a "reference" or "perfect" image in some perceptual space. Such "full-reference" QA methods attempt to achieve consistency in quality prediction by modeling salient physiological and psychovisual features of the human visual system (HVS), or by signal fidelity measures. In this paper, we approach the image QA problem as an information fidelity problem. Specifically, we propose to quantify the loss of image information to the distortion process and explore the relationship between image information and visual quality. QA systems are invariably involved with judging the visual quality of "natural" images and videos that are meant for "human consumption." Researchers have developed sophisticated models to capture the statistics of such natural signals. Using these models, we previously presented an information fidelity criterion for image QA that related image quality with the amount of information shared between a reference and a distorted image. In this paper, we propose an image information measure that quantifies the information that is present in the reference image and how much of this reference information can be extracted from the distorted image. Combining these two quantities, we propose a visual information fidelity measure for image QA. We validate the performance of our algorithm with an extensive subjective study involving 779 images and show that our method outperforms recent state-of-the-art image QA algorithms by a sizeable margin in our simulations. The code and the data from the subjective study are available at the LIVE website.
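The information-ratio idea can be illustrated with a toy pixel-domain sketch: per local block, estimate a gain-plus-noise distortion channel and compare how much "information" survives it with how much the reference contains. This is only a conceptual illustration; the published VIF measure operates in a wavelet domain with a Gaussian scale mixture source model, and the visual-noise variance used here is an arbitrary assumption.

```python
# Toy pixel-domain illustration of the information-fidelity ratio; not the
# published wavelet-domain, GSM-based VIF. sigma_n2 is an assumed visual-noise
# variance, and the block-wise gain/noise estimates are simple moment fits.
import numpy as np

def vif_like_ratio(ref: np.ndarray, dist: np.ndarray,
                   block: int = 8, sigma_n2: float = 2.0) -> float:
    ref = ref.astype(np.float64)
    dist = dist.astype(np.float64)
    num = den = 0.0
    h, w = ref.shape
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            r = ref[i:i + block, j:j + block]
            d = dist[i:i + block, j:j + block]
            var_r = r.var()
            cov = ((r - r.mean()) * (d - d.mean())).mean()
            g = cov / (var_r + 1e-10)                 # estimated channel gain
            sv2 = max(d.var() - g * cov, 0.0)         # residual distortion noise
            num += np.log2(1.0 + g * g * var_r / (sv2 + sigma_n2))  # info surviving distortion
            den += np.log2(1.0 + var_r / sigma_n2)                  # info in the reference
    return float(num / den) if den > 0 else 1.0
```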

3,146 citations


Cites background from "Image quality measures and their performance"

  • ...In [15] and [16], a number of these were evaluated for the purpose of quality assessment....

Journal Article • DOI
TL;DR: This paper presents results of an extensive subjective quality assessment study in which a total of 779 distorted images were evaluated by about two dozen human subjects; in terms of number of images, distortion types, and number of human judgments per image, it is the largest subjective image quality study in the literature.
Abstract: Measurement of visual quality is of fundamental importance for numerous image and video processing applications, where the goal of quality assessment (QA) algorithms is to automatically assess the quality of images or videos in agreement with human quality judgments. Over the years, many researchers have taken different approaches to the problem and have contributed significant research in this area and claim to have made progress in their respective domains. It is important to evaluate the performance of these algorithms in a comparative setting and analyze the strengths and weaknesses of these methods. In this paper, we present results of an extensive subjective quality assessment study in which a total of 779 distorted images were evaluated by about two dozen human subjects. The "ground truth" image quality data obtained from about 25 000 individual human quality judgments is used to evaluate the performance of several prominent full-reference image quality assessment algorithms. To the best of our knowledge, apart from video quality studies conducted by the Video Quality Experts Group, the study presented in this paper is the largest subjective image quality study in the literature in terms of number of images, distortion types, and number of human judgments per image. Moreover, we have made the data from the study freely available to the research community . This would allow other researchers to easily report comparative results in the future
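The kind of evaluation this database enables can be sketched in a few lines: correlate an objective metric's scores with the subjective (D)MOS values over the distorted images, reporting rank-order correlation for monotonicity and linear correlation for prediction accuracy (usually after a fitted nonlinear mapping, omitted here). The arrays below are hypothetical values, not data from the study.

```python
# Hedged sketch: correlating an objective metric with subjective scores.
# The numbers are made up for illustration; higher PSNR should track lower DMOS,
# so a strong *negative* correlation is the expected outcome here.
import numpy as np
from scipy.stats import pearsonr, spearmanr

dmos = np.array([12.3, 40.1, 55.7, 23.8, 67.2])     # hypothetical subjective scores
psnr = np.array([42.0, 31.5, 27.2, 38.8, 24.9])     # hypothetical PSNR values (dB)

plcc, _ = pearsonr(psnr, dmos)      # prediction accuracy (linear correlation)
srocc, _ = spearmanr(psnr, dmos)    # prediction monotonicity (rank correlation)
print(f"PLCC={plcc:.3f}  SROCC={srocc:.3f}")
```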

2,598 citations


Cites background or methods from "Image quality measures and their performance"

  • ...In [3], the entire dataset was derived from only three reference images and distorted by compression distortion only, with a total of 84 distorted images....

  • ...In [3], [4], and [5], a number of mathematical measures of quality have been evaluated against subjective quality data....

References
Journal Article • DOI
N. Nill
TL;DR: A new analytical solution, taking the form of a straightforward multiplicative weighting function, is developed which is readily applicable to image compression and quality assessment in conjunction with a visual model and the image cosine transform.
Abstract: Utilizing a cosine transform in image compression has several recognized performance benefits, resulting in the ability to attain large compression ratios with small quality loss. Also, incorporation of a model of the human visual system into an image compression or quality assessment technique intuitively should (and has often proven to) improve performance. Clearly, then, it should prove highly beneficial to combine the image cosine transform with a visual model. In the past, combining these two has been hindered by a fundamental problem resulting from the scene alteration that is necessary for proper cosine transform utilization. A new analytical solution to this problem, taking the form of a straightforward multiplicative weighting function, is developed in this paper. This solution is readily applicable to image compression and quality assessment in conjunction with a visual model and the image cosine transform. In the development, relevant aspects of a human visual system model are discussed, and a refined version of the mean square error quality assessment measure is given which should increase this measure's utility.
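The general recipe this reference describes, weighting squared errors in the cosine-transform domain by a visual-sensitivity function before summing, can be sketched as below. The weighting curve used here is a generic band-pass placeholder, not the specific multiplicative function derived in the paper, and a real implementation would convert DCT indices to cycles per degree using the viewing conditions.

```python
# Visually weighted MSE in the DCT domain; the sensitivity curve is a generic
# placeholder, not the multiplicative weighting function derived in this paper.
import numpy as np
from scipy.fft import dctn

def weighted_dct_mse(ref: np.ndarray, dist: np.ndarray) -> float:
    ref = ref.astype(np.float64)
    dist = dist.astype(np.float64)
    R = dctn(ref, norm="ortho")
    D = dctn(dist, norm="ortho")
    u = np.arange(ref.shape[0])[:, None]
    v = np.arange(ref.shape[1])[None, :]
    f = np.hypot(u, v)                        # crude radial frequency index
    w = (0.2 + f) * np.exp(-0.05 * f)         # placeholder band-pass sensitivity
    return float(np.sum(w * (R - D) ** 2) / np.sum(w))
```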

297 citations


"Image quality measures and their pe..." refers methods in this paper

  • ...The function for the model is defined as [2] and the latter a wavelet...

Proceedings Article • DOI
01 Apr 1993
TL;DR: In this paper, the authors survey and give a classification of the criteria for the evaluation of monochrome image quality, noting that the most common objective criterion, the mean square error (MSE), does not correlate well with the viewer's response.
Abstract: Although a variety of techniques are available today for gray-scale image compression, a complete evaluation of these techniques cannot be made as there is no single reliable objective criterion for measuring the error in compressed images. The traditional subjective criteria are burdensome, and usually inaccurate or inconsistent. On the other hand, being the most common objective criterion, the mean square error (MSE) does not have a good correlation with the viewer's response. It is now understood that in order to have a reliable quality measure, a representative model of the complex human visual system is required. In this paper, we survey and give a classification of the criteria for the evaluation of monochrome image quality.

66 citations

Book
06 Aug 1990
TL;DR: Recursive Block Coding, a new image data compression technique that has its roots in noncausal models for 1-D and 2-D signals, is the subject of this book; the basic result shows that a random field can be coded block by block such that the interblock redundancy can be completely removed while the individual blocks are transform coded.
Abstract: Recursive Block Coding, a new image data compression technique that has its roots in noncausal models for 1-D and 2-D signals, is the subject of this book. The underlying theory provides a multitude of compression algorithms that encompass two-source coding, quadtree coding, hybrid coding, and so on. Since the noncausal models provide a fundamentally different image representation, they lead to new approaches to many existing algorithms, including useful approaches for asymmetric, progressive, and adaptive coding techniques. On the theoretical front, the basic result shows that a random field (an ensemble of images) can be coded block by block such that the interblock redundancy can be completely removed while the individual blocks are transform coded. On the practical side, the artifact of tiling, a block boundary effect present in conventional block-by-block transform coding techniques, has been greatly suppressed. This book contains not only a theoretical discussion of the algorithms but also exhaustive simulation and suggested methodologies for ensemble design techniques. Each of the resulting algorithms has been applied to twelve images over a wide range of image data rates, and the results are reported using subjective descriptions, photographs, mathematical MSE values, and h-plots, a recently proposed graphical representation showing a high level of agreement with image quality as judged subjectively.

46 citations


"Image quality measures and their pe..." refers methods in this paper

  • ...Three controlled examples are presented in [9] to illustrate the use of h-plots....

  • ...For the construction of the h-plots in Fig. 4, the two parameters, the initial block size N and the variance threshold T, were chosen as 16 and 10, respectively, as in Hosaka's or Farrelle's work [8], [9]....

Dissertation
01 Dec 1993
TL;DR: Deep lossy compression of still and moving grayscale pictures that preserves their fidelity is presented, with the specific goal of creating a working prototype of a software system for low-bandwidth transmission of still satellite imagery and weather briefings while best preserving the features considered important by the end user.
Abstract: The scope of the present dissertation is a deep lossy compression of still and moving grayscale pictures while maintaining their fidelity, with a specific goal of creating a working prototype of a software system for use in low bandwidth transmission of still satellite imagery and weather briefings with the best preservation of features considered important by the end user.

2 citations