Performance evaluation of objective quality metrics for HDR image compression
Citations
HDR-VDP-2.2: a calibrated method for objective quality prediction of high-dynamic range and standard images
HDR-VQM
Benchmarking of objective quality metrics for HDR image quality assessment
Overview and evaluation of the JPEG XT HDR image compression standard
Perception-driven Accelerated Rendering
References
Image quality assessment: from error visibility to structural similarity
Multiscale structural similarity for image quality assessment
Image information and visual quality
A Statistical Evaluation of Recent Full Reference Image Quality Assessment Algorithms
Frequently Asked Questions (12)
Q2. What are the common terms used for quality assessment of HDR images?
Since PSNR and SSIM are widely used for quality assessment of LDR images, in the following, the authors will refer to them as LDR metrics.
Q3. What is the common encoding of HDR to perceptually linear values?
Typical encodings from HDR to perceptually linear values include the simple logarithm, based on the Weber-Fechner law, or more sophisticated transfer functions such as the PU encoding.
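A minimal sketch of the logarithmic option mentioned above, under the Weber-Fechner assumption that equal luminance *ratios* are equally perceptible. The function name and the clamping range are illustrative assumptions; the PU encoding itself uses a more elaborate, contrast-sensitivity-derived curve.

```python
import numpy as np

def log_encode(luminance, l_min=1e-3, l_max=1e4):
    """Map absolute luminance (cd/m^2) to [0, 1] via a simple logarithm.

    The range bounds l_min/l_max are illustrative assumptions, not
    values from the paper.
    """
    l = np.clip(luminance, l_min, l_max)
    return (np.log10(l) - np.log10(l_min)) / (np.log10(l_max) - np.log10(l_min))

# Equal luminance ratios (1 -> 10 -> 100 cd/m^2) map to equal encoded steps.
v = log_encode(np.array([1.0, 10.0, 100.0]))
```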
Q4. How many pixels per degree were used to measure the quality of the stimulus?
Viewers participated individually in test sessions, sitting at a distance of approximately 1 meter, which corresponds to an angular resolution of about 40 pixels per degree.
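The angular resolution quoted above follows from simple trigonometry on the viewing distance and the physical size of a pixel. The helper below is a sketch; the function name and the example pixel pitch are illustrative assumptions, not values taken from the paper.

```python
import math

def pixels_per_degree(viewing_distance_m, pixel_pitch_m):
    """Pixels subtended by one degree of visual angle.

    pixel_pitch_m is the physical size of one pixel (screen width divided
    by horizontal resolution). Both arguments are hypothetical inputs.
    """
    # Visual angle of a single pixel, in degrees, then invert.
    deg_per_pixel = 2 * math.degrees(math.atan(pixel_pitch_m / (2 * viewing_distance_m)))
    return 1.0 / deg_per_pixel

# A display with ~0.436 mm pixel pitch viewed at 1 m yields roughly 40 ppd.
ppd = pixels_per_degree(1.0, 0.000436)
```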
Q5. How many image quality factors are used in the coding of HDR?
The authors coded each content with a JPEG quality factor QF ranging from 20 to 100 in steps of 5, producing a total of 17 rate points × 5 contents = 85 images.
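The arithmetic behind the image count can be checked in a couple of lines (variable names here are illustrative):

```python
# JPEG quality factors 20, 25, ..., 100 (step 5) as stated above.
qf_values = list(range(20, 101, 5))
num_contents = 5

# 17 rate points per content, 5 contents -> 85 coded images.
total_images = len(qf_values) * num_contents
```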
Q6. Why are there so many LDR image coding techniques?
Given the vast number of existing LDR images, the most promising HDR image coding techniques are those that offer backward compatibility with legacy LDR pictures.
Q7. What is the definition of spatial information for an LDR image?
For an LDR image, spatial information is defined as the standard deviation of the output of a Sobel operator applied to the image.
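The definition above can be sketched directly in NumPy. The small hand-rolled convolution below is an assumption about implementation details (the paper only specifies "Sobel operator"); in practice one would typically use a library routine such as `scipy.ndimage.sobel`.

```python
import numpy as np

def spatial_information(img):
    """SI of a grayscale LDR image: standard deviation of the Sobel
    gradient magnitude, computed on the valid (border-free) region."""
    img = np.asarray(img, dtype=np.float64)
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    # Explicit 3x3 correlation over the valid region, one tap at a time.
    for i in range(3):
        for j in range(3):
            patch = img[i:i + h - 2, j:j + w - 2]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    return np.hypot(gx, gy).std()

# A flat image has zero spatial information; any edge raises it.
flat_si = spatial_information(np.zeros((8, 8)))
edge_img = np.zeros((8, 8)); edge_img[:, 4:] = 255.0
edge_si = spatial_information(edge_img)
```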
Q8. What are the main characteristics of the metrics used in the literature?
These metrics can provide very good approximations of human perception, but they generally require delicate tuning of several parameters, which limits their use in many practical applications.
Q9. What are the two quality factors that control the base and enhancement layer quality?
The base and enhancement layer quality is controlled by two quality factors, which take values in [0, 100] and which the authors varied as QFb ∈ {40, 70, 90, 100} and QFe ∈ {50, 75, 80, 90, 95}, respectively.
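Pairing every base-layer factor with every enhancement-layer factor gives a full-factorial sweep of 4 × 5 = 20 coding configurations per content; a quick sketch (variable names are illustrative):

```python
from itertools import product

qf_base = [40, 70, 90, 100]       # base-layer quality factors
qf_enh = [50, 75, 80, 90, 95]     # enhancement-layer quality factors

# Cartesian product: every (QFb, QFe) combination in the sweep.
configs = list(product(qf_base, qf_enh))
```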
Q10. How many images were retained for the test?
As a result of the screening phase, the authors retained a set of 50 images to use for the test (details about the exact coding parameters of the test dataset, as well as coded images, are available as supplementary material on the reference author’s website).
Q11. Why are LDR pixel values gamma-corrected?
This is partially due to the fact that LDR pixel values are gamma-corrected in the sRGB color space,6 which not only compensates for the non-linear luminance response of legacy CRT displays, but also partially accounts for the lower contrast sensitivity of the human visual system (HVS) at dark luminance levels.
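The sRGB transfer function referred to above is the standard piecewise curve from IEC 61966-2-1: a short linear segment near black followed by a power law with exponent 1/2.4. A minimal sketch of both directions:

```python
def srgb_encode(linear):
    """sRGB gamma encoding (IEC 61966-2-1): expands dark tones so that
    uniform quantization steps roughly track the HVS's higher contrast
    sensitivity at low luminance. Input/output in [0, 1]."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def srgb_decode(encoded):
    """Inverse transfer function: encoded sRGB value back to linear light."""
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4
```

Note how mid-gray linear light (0.5) encodes to a value well above 0.5: the curve allocates more code values to dark tones, which is the perceptual compensation the answer describes.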
Q12. What has motivated research towards HDR processing algorithms?
This has motivated research towards novel HDR processing algorithms, including acquisition/generation1 and compression2,3 and, consequently, towards methods for assessing the quality of the processed results.