Lehrstuhl für Bildverarbeitung
Institute of Imaging & Computer Vision
An Evaluation Framework for the Accuracy
of Camera Transfer Functions Estimated
from Differently Exposed Images
André A. Bell, Jens N. Kaftan, Dietrich Meyer-Ebrecht, and Til Aach
Institute of Imaging and Computer Vision
RWTH Aachen University, 52056 Aachen, Germany
tel: +49 241 80 27860, fax: +49 241 80 22200
web: www.lfb.rwth-aachen.de
in: 7th IEEE Southwest Symposium on Image Analysis and Interpretation, SSIAI 2006. See
also BibTeX entry below.
BibTeX:
@inproceedings{BEL06a,
author = {Andr\'{e} A. Bell and Jens N. Kaftan and Dietrich Meyer-Ebrecht and Til Aach},
title = {{A}n {E}valuation {F}ramework for the {A}ccuracy of {C}amera
{T}ransfer {F}unctions {E}stimated from {D}ifferently {E}xposed {I}mages},
booktitle = {7th IEEE Southwest Symposium on Image Analysis and Interpretation. SSIAI 2006},
publisher = {IEEE},
year = {2006},
pages = {168--172},
}
© 2006 IEEE. Personal use of this material is permitted. However, permission to reprint/republish
this material for advertising or promotional purposes or for creating new collective works for
resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in
other works must be obtained from the IEEE.
document created on: May 19, 2006
created from file: hdrprecission.tex
cover page automatically created with CoverPage.sty
(available at your favourite CTAN mirror)

An Evaluation Framework for the Accuracy of Camera Transfer Functions
Estimated from Differently Exposed Images
André A. Bell, Jens N. Kaftan, Dietrich Meyer-Ebrecht, Til Aach
Institute of Imaging and Computer Vision, RWTH Aachen University, Germany
{bell,kaftan,meyer-ebrecht,aach}@lfb.rwth-aachen.de
Abstract
Intensity values read from CCD- or CMOS-cameras are usually not proportional to the irradiance acquired by the sensor, but are mapped by a mostly nonlinear camera transfer function (CTF). This CTF can be measured using a test chart. This, however, is afflicted with the difficulty of ensuring uniform illumination of the chart. An alternative to chart-based measurements of the CTF is to use differently exposed images of the same scene. In this paper, we describe a radiometry-based experimental setup to directly measure the CTF. We furthermore show how to obtain image pairs of known exposure ratios from the same experiment, i.e., under identical environmental circumstances (light, temperature, camera settings). We use these images to estimate the CTF on differently exposed images, thus enabling a quantitative comparison of estimated and measured CTF.
1. Estimating and Measuring the Camera Transfer Function

Many algorithms in computer vision assume that the radiance of the scene is known. For instance, changes in scene radiance between images can be used to determine scene structure and illumination [12]. Also, the orientation of visible surfaces of the scene can be obtained from the radiance by shape from shading algorithms [13]. Radiance maps were moreover used to render synthetic objects more realistically into the scene [3].

In our application, we seek to tonally register cytopathological images taken at different exposures to generate a high dynamic range image. Image acquisition is based on a microscope equipped with a three-chip RGB color camera. Towards this end, it is crucial to determine the irradiance values in the image plane from the recorded intensity values. Unfortunately, the intensity values read from CCD- or CMOS-cameras are usually not proportional to the irradiance acquired by the sensor, but are mapped by the mostly nonlinear camera transfer function (CTF) f. In other words, we seek to apply the inverse f^-1 of the CTF f.

This CTF can be measured using a test chart like the Macbeth or the CamAlign-CGH chart, which consist of patches of known reflectance. Measuring the CTF using charts, however, is afflicted with the difficulty of ensuring uniform illumination of the chart. Furthermore, it might not always be practicable, since the CTF depends on parameter settings of the camera and on the environment (e.g., temperature).

An alternative to chart-based measurements of the CTF is to use differently exposed images of the same scene [10, 8, 9, 4, 11, 5, 6, 1, 2]. Assuming that the exposure ratios of image pairs are known and the CTF can be modeled by, e.g., a γ-function

I := f(q) = α + βq^γ,   (1)

the parameters α and γ can be estimated by comparing the intensity values f(q) of corresponding pixels in differently exposed images [10, 8, 9]. The scaling factor β cannot be recovered from these exposure sets; the parameter q, named the photoquantity in [8], denotes the amount of light received by a sensor element. Such a parametric model can also be replaced by a smoothness constraint [4]. Alternatively, a polynomial [11] or a constrained piecewise linear model has been used [1, 2]. It has been shown that one needs to either know the exposure ratios or make an assumption about the CTF [5, 6].

These methods offer appealing ways to recover the CTF, even in a non-laboratory environment. So far, the accuracy of the estimated CTFs has been evaluated qualitatively, with CamAlign-CGH measurements plotted into the curve of the recovered CTF [9] or using a Macbeth chart instead [5]. The influence of noisy measurements has been shown to be less than 2.7% on synthetic data [11].

In this paper, we describe a radiometry-based experimental setup to directly measure the CTF. We furthermore show how to obtain image pairs of known exposure ratios from the same experiment, viz. under identical environmental circumstances (light, temperature, camera settings). We use these images to estimate the CTF by one of the above methods, thus enabling a quantitative comparison of estimated and measured CTF.
2. Experimental Setup

Our CTF measurements are based essentially on a homogeneous light source realized by an integrating sphere (Ulbricht sphere). An integrating sphere provides an isotropic and homogeneous light output in terms of radiance L (measured in W/(sr·m²)) at its opening of diameter r [7]. We illuminate the camera sensor directly by this light source, with all optics removed to ensure homogeneous illumination over the entire sensor area and to avoid distortions introduced by the optics. The irradiance impinging on the sensor is then given by [7]

E = π · r² / (r² + x²) · L,   (2)

where x is the distance between the sensor and the exit pupil of the light source. The image acquisition process integrates this irradiance E (measured in W/m²) over the exposure time t and the area A of a sensor element, yielding the energy

Q = A·E·t

detected by that sensor element. Q corresponds to

N_p = A·E·t · λ/(hc)

detected photons, where h is Planck's constant and c is the speed of light. N_p maps to the sensor signal q via the quantum efficiency η of a sensor element according to

q = η·A·E·t · λ/(hc).   (3)

For non-monochromatic light, η and E may depend on the wavelength λ; the total response q is then given by integrating (3) over λ [7]. Additionally, in the case of a color camera, the irradiance incident on the sensor will first be filtered by color filters (e.g., a Bayer color-filter array). The quantity of light q undergoes further, mostly nonlinear, transformations described by the CTF f, such as, e.g., dynamic range compression and quantization.

To measure the CTF in our experimental setup, we record the camera response f(q) for different known distances x. Samples of the CTF are then straightforwardly obtained from (2) and (3).
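As a sketch, the mapping from sphere radiance to sensor signal given by (2) and (3) can be written down directly. All numeric values below (radiance, aperture size, distance, exposure time, pixel geometry, quantum efficiency, wavelength) are hypothetical placeholders for illustration, not the values used in the experiment.

```python
import math

H = 6.62607015e-34  # Planck's constant [J*s]
C = 2.99792458e8    # speed of light [m/s]

def irradiance(L, r, x):
    """Eq. (2): irradiance E [W/m^2] on the sensor at distance x [m]
    from the integrating-sphere opening (aperture parameter r [m]),
    for output radiance L [W/(sr*m^2)]."""
    return math.pi * r**2 / (r**2 + x**2) * L

def sensor_signal(E, t, A, eta, lam):
    """Eq. (3): photoelectron count q = eta * A * E * t * lambda / (h*c)
    for monochromatic light of wavelength lam [m]."""
    return eta * A * E * t * lam / (H * C)

# Hypothetical example values
L   = 10.0           # radiance [W/(sr*m^2)]
r   = 0.05           # sphere opening [m]
x   = 0.30           # sensor distance [m]
t   = 0.01           # exposure time [s]
A   = (6.45e-6)**2   # sensor-element area [m^2]
eta = 0.4            # quantum efficiency
lam = 550e-9         # wavelength [m]

E = irradiance(L, r, x)
q = sensor_signal(E, t, A, eta, lam)
```

Recording the camera response f(q) at several known distances x then yields sample pairs (q, f(q)) of the CTF.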
Additionally, for every image taken at a distance x_1, we acquire a second image at position

x_2 = sqrt(r² + 2·x_1²),   (4)

which, according to (2), receives half of the irradiance in the image plane. The image pairs of exposure ratio k = 2 recorded this way are used to estimate the CTF as in [10, 8].
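Under the model of Eq. (2), the half-irradiance position (4) can be checked numerically; the radiance, aperture, and measurement positions below are hypothetical.

```python
import math

def irradiance(L, r, x):
    # Eq. (2): irradiance on the sensor at distance x
    return math.pi * r**2 / (r**2 + x**2) * L

def half_irradiance_distance(r, x1):
    # Eq. (4): distance x2 at which the sensor receives half
    # of the irradiance it receives at x1
    return math.sqrt(r**2 + 2.0 * x1**2)

L, r = 10.0, 0.05            # hypothetical radiance and aperture
for x1 in (0.1, 0.2, 0.4):   # hypothetical measurement positions
    x2 = half_irradiance_distance(r, x1)
    assert abs(irradiance(L, r, x2) - 0.5 * irradiance(L, r, x1)) < 1e-12
```

Repeating this for every measurement position yields image pairs of exposure ratio k = 2 without changing the exposure time or any camera setting.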
Figure 1. Joint histogram example of the JAI CV-M90 red channel (f(q) on the horizontal axis, f(kq) on the vertical axis, each ranging over 0–250).
3. Estimating the Camera Transfer Function

Let us consider two images f_1 and f_2 with exposure ratio k = 2, such that f_2 receives half of the irradiance of f_1. This is equivalent to observing the quantities of light q and kq through the same CTF f. Based on such an image pair, the joint histogram between these two images [10, 8] can be computed. If more image pairs of equal exposure ratio k are available, these can be added to the joint histogram. Afterwards, the function g(f), defined by

g(f(q)) = f(kq),

has to be estimated from this joint histogram. Choosing the camera model (1) gives

g(f) = f(kq)
     = α + β(kq)^γ
     = α − k^γ·α + k^γ·α + k^γ·βq^γ
     = α(1 − k^γ) + k^γ·(α + βq^γ)
     = α(1 − k^γ) + k^γ·f,

which is a straight line in the joint histogram between the toe and shoulder region [8]. The slope m and intercept b are therefore given by m = k^γ and b = α(1 − k^γ). After fitting a line into the joint histogram, it is possible to determine the parameters α and γ of the camera model by

Figure 2. Measured data points and fitted CTF for a JAI CV-M90 3-chip-CCD (8 bit) camera (q [arb. u.] on the horizontal axis, f(q) on the vertical axis).
γ = log m / log k,   (5)

α = b / (1 − m).   (6)

Figure 1 shows a plot of a joint histogram generated from 60 image pairs of exposure ratio k = 2, obtained as described in section 2. The line has been fitted to the data points between the toe and shoulder region using a linear-regression χ²-fit. From the estimated values m = 1.9109 and b = −14.207 and by using (5) and (6), one obtains γ = 0.9343 and α = 15.5967 as parameters for the camera model.
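The estimation chain of this section can be sketched on synthetic data: simulate corresponding pixel pairs (f(q), f(kq)) under the model (1), fit a line, and recover α and γ via (5) and (6). The camera parameters below are made up for illustration; in practice the (x, y) samples would come from the joint histogram between toe and shoulder.

```python
import math
import random

ALPHA, BETA, GAMMA, K = 15.0, 240.0, 0.93, 2.0  # hypothetical camera parameters

def f(q):
    # Camera model, Eq. (1); range limits (toe/shoulder) ignored
    return ALPHA + BETA * q**GAMMA

random.seed(0)
qs = [random.uniform(0.05, 0.45) for _ in range(1000)]  # photoquantities
xs = [f(q) for q in qs]        # intensities in the shorter exposure
ys = [f(K * q) for q in qs]    # intensities in the longer exposure

# Least-squares line fit f(kq) = m * f(q) + b (unit-weight chi^2 fit)
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
    sum((x - mx)**2 for x in xs)
b = my - m * mx

gamma_est = math.log(m) / math.log(K)  # Eq. (5)
alpha_est = b / (1.0 - m)              # Eq. (6)
```

On noiseless model data the relation f(kq) = m·f(q) + b holds exactly, so both parameters are recovered to floating-point precision; on a real joint histogram the fit must be restricted to the region between the toe and shoulder.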
4. Results

Based on (1) as a model for the 3-chip RGB camera JAI CV-M90, we estimated the parameters α and γ. Range limits are not accounted for in this model. Therefore, α corresponds to the dark current for positive values only; for negative α, the CTF is truncated to zero. Towards higher values, the CTF exhibits the expected saturation due to the limited maximum ("full-well") charge generation capacity of the sensor. The dynamic compression of the camera is captured by the parameter γ. For the measurements, the parameter β is determined by the radiance L of the light source. Since L cannot be recovered by estimation from sets of differently exposed images, we performed a least-squares fit of the estimated CTF to the measured CTF with respect to this parameter for comparison. Results are shown in figure 2. Since γ ≠ 1, the CTFs are not linear functions. Evidently, as can be seen from figure 2, the fitted CTF represents the measured data of a JAI CV-M90 3-chip-CCD camera very precisely. The mean absolute difference between these data points and the fitted model of the measured CTF is μ_red = 0.4973, μ_green = 0.1870, and μ_blue = 0.4705 intensity values for the red, green, and blue channel, respectively, which indicates that the model assumption is valid for this camera. Table 1 shows the measured and estimated parameters for each color channel of the JAI CV-M90 3-chip-CCD camera. The mean absolute error between measured and estimated CTF is μ_red = 0.4641, μ_green = 0.4969, and μ_blue = 0.9856 for the red, green, and blue channel, respectively.
Table 1. Comparison of measured and estimated parameters α and γ for each color channel of the JAI CV-M90 3-chip-CCD camera.

            Measured CTF   Estimated CTF
α_red          14.5919        15.5967
γ_red           0.9210         0.9343
α_green        25.9173        24.6797
γ_green         0.9723         0.9706
α_blue         27.9872        23.3515
γ_blue          0.8300         0.8817

Figure 3. Measured data points and fitted CTF for an AVT Dolphin F145C single-chip Bayer color-filter-array (10 bit) camera (q [arb. u.] on the horizontal axis, f(q) on the vertical axis).
Alternatively, we have chosen the same camera model (1) for the AVT Dolphin F145C single-chip Bayer color-filter-array camera. We estimated α and γ and measured the CTF accordingly. As can be seen in figure 3, the camera model (1) does not fit as well as for the 3-chip camera. The displayed curves in figure 3 represent the color channels corresponding to the rasters of the Bayer pattern; we therefore have two green channels. The mean absolute difference between the measured data points and the fitted model of the CTF is μ_red = 0.6591, μ_green1 = 1.5541, μ_green2 = 0.6728, and μ_blue = 5.4489 intensity values for the red, green1, green2, and blue channel, respectively. Note that the two green channels diverge as soon as red begins to saturate, and that the blue channel is distorted as soon as the second green channel reaches its saturation point (see figure 3). This effect may be due to blooming from one color channel into another, an effect that cannot occur for a 3-chip camera. Table 2 shows the measured and estimated parameters for each color channel of the AVT Dolphin F145C single-chip Bayer color-filter-array camera. For comparison, we have carried out a least-squares fit of the estimated CTF to the measured CTF with respect to the parameter β, as for the other camera. The mean absolute error between measured and estimated CTF is μ_red = 0.6555, μ_green1 = 1.8828, μ_green2 = 0.7635, and μ_blue = 6.509 intensity values for the red, green1, green2, and blue channel, respectively.
Table 2. Comparison of measured and estimated parameters α and γ for each color channel of the AVT Dolphin F145C single-chip Bayer color-filter-array camera.

             Measured CTF   Estimated CTF
α_red           34.6273        34.1294
γ_red            0.9654         0.9648
α_green1         7.9983         0.7144
γ_green1         0.8618         0.8822
α_green2        29.5660        29.8274
γ_green2         0.9628         0.9615
α_blue          46.6789        27.7228
γ_blue           1.0037         0.9130
5. Discussion

We have shown that it is possible to directly measure the CTF and, under the same environmental circumstances, to acquire image pairs with known exposure ratio. The directly measured CTF can be used to verify the accuracy of the assumed camera model or to find a camera model for a specific camera. From the simultaneously acquired image pairs of known exposure ratio, one can estimate the CTF using the same algorithms that are available for non-laboratory conditions. Both measured and estimated CTF can thus be compared quantitatively. We intend to evaluate further estimation algorithms and a larger variety of cameras in the near future.

References (selection)
- Shape from interreflections (journal article).
- Comparametric equations with practical applications in quantigraphic image processing (journal article).
- B. Jähne, Practical Handbook on Image Processing for Scientific and Technical Applications (book).
- What Can Be Known about the Radiometric Response from Images (book chapter).
- Quantigraphic Imaging: Estimating the camera response and exposures from differently exposed images (proceedings article).