Direct method for restoration of
motion-blurred images
Y. Yitzhaky, I. Mor, A. Lantzman, and N. S. Kopeika
Department of Electrical and Computer Engineering, Ben-Gurion University of the Negev, P.O. Box 653,
Beer Sheva, 84105 Israel
Received July 25, 1997; revised manuscript received January 5, 1998; accepted January 7, 1998
We deal with the problem of restoration of images blurred by relative motion between the camera and the
object of interest. This problem is common when the imaging system is in moving vehicles or held by human
hands, and in robot vision. For correct restoration of the degraded image, it is useful to know the point-spread
function (PSF) of the blurring system. We propose a straightforward method to restore motion-blurred im-
ages given only the blurred image itself. The method first identifies the PSF of the blur and then uses it to
restore the blurred image. The blur identification here is based on the concept that image characteristics
along the direction of motion are affected mostly by the blur and are different from the characteristics in other
directions. By filtering the blurred image, we emphasize the PSF correlation properties at the expense of
those of the original image. Experimental results for image restoration are presented for both synthetic and
real motion blur. © 1998 Optical Society of America [S0740-3232(98)01406-9]
OCIS codes: 100.3020, 100.0100, 100.2000, 100.1830.
1. INTRODUCTION
Image restoration methods can be considered as direct techniques when their results are produced in a simple one-step fashion [1]. Equivalently, indirect techniques can be considered as those in which restoration results are obtained after a number of iterations. Known restoration techniques such as inverse filtering and Wiener filtering [2,3] can be considered as simple direct restoration techniques. The problem with such methods is that they require knowledge of the blur function [i.e., the point-spread function (PSF)], which is, unfortunately, usually not available when dealing with images blurred by motion.
The method proposed in this paper deals with applying
direct image restoration techniques even though the blur
function is unknown. It is therefore concerned with direct identification of the blur function, separate from and before the restoration operation. The quality and reliability of the image restoration process are usually determined by the accuracy of the information concerning the degradation process.
For a given digital picture of the original scene f(i, j), a common practical model [2,4] of the corresponding degraded picture g(i, j) is

    g(i, j) = Σ_m Σ_n h(i − m, j − n) f(m, n) + n(i, j),    (1)

where h(i, j) is a linear shift-invariant PSF and n(i, j) is random noise.
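As a concrete illustration of the degradation model in Eq. (1), the numpy sketch below blurs an image with a horizontal uniform-motion PSF and optionally adds noise. The function name and parameters are ours, not the paper's; only the horizontal-motion special case is shown.

```python
import numpy as np

def motion_blur(f, psf_1d, noise_sigma=0.0, seed=0):
    """Degrade image f per Eq. (1): convolve each row with a 1-D
    horizontal motion PSF h, then add random noise n(i, j)."""
    g = np.apply_along_axis(
        lambda row: np.convolve(row, psf_1d, mode="same"), 1, f)
    if noise_sigma > 0.0:
        g = g + np.random.default_rng(seed).normal(0.0, noise_sigma, g.shape)
    return g

# Uniform (square-pulse) 20-pixel blur, as in the paper's experiments:
psf = np.ones(20) / 20.0
f = np.zeros((64, 64))
f[32, 32] = 1.0                  # a single point source
g = motion_blur(f, psf)          # the point is smeared into a horizontal line
```

Because the PSF is normalized to unit sum, total image intensity is preserved; the point source spreads into a 20-pixel horizontal line of height 1/20.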
Early approaches for identification of the blur [2,4] involve methods in which identification is performed separately from the restoration process. These approaches are usually rather simple and have fewer computational requirements. A possible case for such approaches occurs when it is known a priori that a certain portion of the degraded picture is the image of a point, a line, or an edge in the original picture, but these cases are often not applicable to real-life images. The early method for blur identification [5], where no specific knowledge about the original image was assumed, dealt with the case of uniform linear motion blur, which is described by a square-pulse PSF, and used its property of periodic zeros in the spectral domain of the blurred image. These zeros were emphasized in the spectral domain, and the blur extent was estimated by measuring the separations between these zeros. The assumption of zeros in the spectral domain is not satisfied in various cases of motion degradation such as accelerated motion [6,7] and low-frequency vibrations [8].
More recent developments in blur identification [9–11] relate the identification process to the restoration process. These methods are more complicated and require more computation. Restoration results are criterion-based, and the blur parameters can be corrected until each criterion is satisfied. Therefore more types of blur can be
considered. The success of these methods depends on the
reliability of the original image model. Recent important
developments are the maximum-likelihood image and
blur identification methods. These methods model the
original image, the blur, and the noise process. The
original image is modeled as a two-dimensional autore-
gressive process, and the blur is modeled as a two-
dimensional linear system with finite impulse response.
A maximum-likelihood estimation is used for identifica-
tion of the image and blur parameters. The identifica-
tion of the blur model parameters is incorporated into the
restoration algorithm and requires many computations.
Another new blur identification method [12] uses an estimation of the original image power spectrum (an expected
value). The PSF estimate is chosen from a collection of
candidate PSF’s to provide the best match between the restoration residual power spectrum and the expected residual spectrum given that the candidate PSF is the true PSF.

1512 J. Opt. Soc. Am. A/Vol. 15, No. 6/June 1998 Yitzhaky et al.
0740-3232/98/061512-08$15.00 © 1998 Optical Society of America
In this paper we propose a new method to estimate the blur function given only the motion-blurred image. Previous work [13], summarized in Section 2, investigated the motion-blurring effects on an image and established the basic concepts with which blur characteristics such as direction and extent were extracted from the blurred image. Based on these concepts, a method to identify the blur function is proposed here. The identified function is then used to restore the blurred image by using a Wiener filter. The method addresses one-dimensional blur types, which are common in the case of motion degradation, and we assume the blur effect to be linear and space invariant and the original image to be a stationary random process. These assumptions are common when dealing with practical image restoration algorithms [1,2,4].
2. IDENTIFICATION OF THE MOTION BLUR FUNCTION
The blur function needed for direct restoration of the blurred image can be completely described by the PSF or by the optical transfer function (OTF), which is the Fourier transform of the PSF. The OTF can be formulated as

    OTF = MTF exp(j PTF),    (2)

where the modulation transfer function (MTF) is the absolute value of the OTF and the phase transfer function (PTF) is its angle.
The first step of the method is to identify the blur di-
rection. Given the blur direction, correlation properties
of the blur function are then identified. This is per-
formed by filtering the blurred image so that correlation
properties stemming from the original image are sup-
pressed, and the filtered version is characterized mostly
by the blur function correlation properties. This leads to
identification of the blurring MTF. For causal blur the
PTF can then be extracted directly from the MTF. Using
the OTF, we then employ a simple Wiener filter to restore
the blurred image. The method will be presented step by
step in the following subsections. The formulation will
be presented in Section 3.
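The final restoration step mentioned above is a standard Wiener filter. A minimal frequency-domain sketch, assuming the OTF has already been identified and using a scalar noise-to-signal ratio (`wiener_restore` and its parameters are our illustrative names, not the paper's):

```python
import numpy as np

def wiener_restore(g, otf, nsr=0.01):
    """Wiener deconvolution: F_hat = G * conj(H) / (|H|^2 + NSR)."""
    G = np.fft.fft2(g)
    W = np.conj(otf) / (np.abs(otf) ** 2 + nsr)   # Wiener filter in frequency
    return np.real(np.fft.ifft2(G * W))

# Demo with a known 3-tap horizontal blur (circular convolution via FFT):
rng = np.random.default_rng(2)
f = rng.random((64, 64))
h = np.zeros((64, 64))
h[0, :3] = [0.5, 0.3, 0.2]        # a short PSF with no spectral zeros
H = np.fft.fft2(h)                # its OTF
g = np.real(np.fft.ifft2(np.fft.fft2(f) * H))   # blurred image
f_hat = wiener_restore(g, H, nsr=1e-6)          # near-perfect recovery
```

With a very small NSR and a zero-free OTF the restoration is essentially exact; in practice the NSR term keeps the filter stable where |H| is small.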
A. Motion Blur Phenomena
As a result of relative motion between the camera and the
object of interest, adjacent points in the image plane are
exposed to the same point in the object plane during the
exposure time. The intensity of an image of an original
point is shared between these image plane points accord-
ing to the relative duration in which each point is exposed
to light from the original point. The smearing tracks of
the points determine the PSF in the blurred image. Con-
trary to other blur causes such as atmospheric or out-of-
focus effects, motion blur is usually considered as one di-
mensional, since during exposure time that is relatively
short (in real-time imaging, approximately 1/30 s), motion
direction does not change. This smearing effect in the
motion direction acts as a low-pass filter in the spatial-
frequency domain.
B. Identification of the Blur Direction
The first necessary step of the method should be identifi-
cation of the motion direction relative to the image axis.
Extensive studies of image power spectra show that an excellent simple model for imagery statistics is that of a spatially isotropic first-order Markov process [1]. Hence the autocorrelation of the original image and its power spectrum are assumed to be approximately isotropic. As a consequence of motion, image resolution is decreased mostly in the motion direction. Therefore applying a high-pass filter (such as a simple image derivative) to the blurred image in this direction should suppress more of the image intensity than applying it in other directions. Motion direction is therefore identified by finding the direction in which the power spectrum of the image derivative is lowest.
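A rough numpy sketch of this direction test, checking only the four axis/diagonal directions (the paper scans directions finely with the rotated derivative of Eq. (3); the function and names here are ours):

```python
import numpy as np

def blur_direction_energy(g, angles_deg=(0, 45, 90, 135)):
    """Total intensity of the image derivative in candidate directions;
    the motion direction is where this measure is LOWEST, since blur
    removes high frequencies along the motion path."""
    energies = {}
    for a in angles_deg:
        if a == 0:
            d = np.abs(np.diff(g, axis=1))           # horizontal difference
        elif a == 90:
            d = np.abs(np.diff(g, axis=0))           # vertical difference
        elif a == 45:
            d = np.abs(g[1:, 1:] - g[:-1, :-1])      # one diagonal
        else:
            d = np.abs(g[1:, :-1] - g[:-1, 1:])      # the other diagonal
        energies[a] = float(d.sum())                 # sizes differ by one
                                                     # row/column; negligible
    return min(energies, key=energies.get), energies
```

For an image blurred horizontally, the horizontal-derivative energy is markedly lower than the others, so the minimum correctly picks 0 degrees.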
C. Decorrelating Real Images
Real images are characterized by high spatial correlation.
A simple decorrelation (whitening) filter can be a derivative operation. In a digital image this is approximated by a differencing operation, whereby each pixel in the filtered image is the difference between two adjacent pixels in the original image. This operation has been found to be an effective decorrelating filter.
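A tiny check of this claim, using the lag-1 correlation coefficient before and after differencing (the helper names are ours):

```python
import numpy as np

def whiten(line):
    """Simple decorrelating filter: first difference of adjacent pixels."""
    return np.diff(line)

def lag1_corr(x):
    """Normalized lag-1 autocorrelation, a quick decorrelation check."""
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))
```

A highly correlated signal (e.g., a random walk, mimicking the strong spatial correlation of real image lines) has lag-1 correlation near 1; after differencing it drops to near 0.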
D. Extracting Motion Blur Correlation Properties
The effect of motion blur on real images was analyzed in
detail in Ref. 13. Since the motion blur is usually one di-
mensional, its effect varies according to the direction in
the blurred image relative to the motion direction. Since the PSF varies in the motion direction, it is not correlated perpendicularly to the motion direction. Therefore a whitening filter implemented perpendicularly to the motion direction (i.e., a filter that does not vary in the motion direction) will not affect the PSF correlation properties. However, such a filter will significantly suppress
correlation properties stemming from the original image,
which is highly correlated in all directions. On the other
hand, implementation of a whitening filter in the motion
direction will have a different effect. The PSF has the
same effect on all the image points. The points of the
original image will become PSF patterns that merge into
each other, forming the blurred image. A whitening de-
rivative filter in this direction will form patterns similar
to that of the PSF derivative. Such a filter implemented
in both directions will form such patterns surrounded by
extremely suppressed decorrelated regions. Therefore
these patterns can be evaluated by performing an auto-
correlation operation on the blurred image derivative.
Since these patterns are in the motion direction, the au-
tocorrelation should be performed in this direction.
Applying the autocorrelation function (ACF) to all the image derivative lines in the motion direction, and then averaging them [14], will suppress the noise stimulated by the whitening operations. Furthermore, such averaging will cause cancellation of correlation properties left over from the original image, which can differ from one line to another. This is especially true since the assumption of stationarity of the original image is often not a very good one.
The final conclusion here is that the average of the
ACF’s of the blurred image derivative lines in the motion
direction is similar to the ACF of the PSF derivative.
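For horizontal motion, the averaged line-ACF of this conclusion can be computed as follows (a sketch; `avg_line_acf` is our name):

```python
import numpy as np

def avg_line_acf(g):
    """Average, over all rows, of the autocorrelation of each row's
    derivative; per the paper this approximates the ACF of the PSF
    derivative (motion is assumed horizontal here)."""
    d = np.diff(g, axis=1)                          # whitening derivative per line
    acfs = [np.correlate(r, r, mode="full") for r in d]
    return np.mean(acfs, axis=0)                    # averaging suppresses noise
```

The result is a symmetric sequence peaked at zero lag (the center of the `mode="full"` output); its shape near the center carries the PSF-derivative correlation structure.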
E. Identification of the Motion Function
The average spectral density of the image derivative lines
(in the motion direction) can be obtained by Fourier-
transforming the averaged ACF. Given the similarity
concluded in Subsection 2.D, the shape of this spectral
density should be similar to that of the PSF derivative
power spectrum. Dividing it by the power spectrum of
the derivative filter (performed in the motion direction)
will yield the power spectrum of the PSF itself. The
whitening filter performed perpendicularly to the motion
direction is not considered here, since it does not affect
the PSF correlation properties as discussed in Subsection
2.D. The MTF of the blur is then the square root of its
power spectrum. If the blur is causal, the PTF can be
straightforwardly extracted from the MTF by using the
Hilbert transform as described in Section 3. The motion
function (OTF) is then obtained from Eq. (2).
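A sketch of this identification for horizontal motion (our function names; the division by the derivative-filter power spectrum is regularized with a small epsilon, an implementation detail not in the paper):

```python
import numpy as np

def identify_mtf(g, eps=1e-8):
    """Estimate the blur MTF from the blurred image alone (Subsection 2.E):
    Fourier-transform the averaged row-derivative ACF to get its power
    spectrum, divide by the power spectrum of the derivative filter,
    and take the square root."""
    d = np.diff(g, axis=1)
    acf = np.mean([np.correlate(r, r, mode="full") for r in d], axis=0)
    S = np.abs(np.fft.fft(acf))                 # avg power spectrum of the lines
    w = 2.0 * np.pi * np.fft.fftfreq(len(acf))
    D2 = np.abs(1.0 - np.exp(-1j * w)) ** 2     # |DTFT of the [1, -1] filter|^2
    mtf = np.sqrt(S / np.maximum(D2, eps))      # PSF power spectrum -> MTF
    return mtf / mtf.max()                      # normalize (MTF(0) should be 1)
```

In practice the DC bin, where the derivative filter has an exact zero, needs special handling; here it is merely regularized, so only the shape away from DC is meaningful.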
The reliability of the blur function estimate depends on
the success of the original image whitening operation.
When the whitening is imperfect, the ACF of the PSF de-
rivative will be also influenced by the correlation proper-
ties of the original image. In this case the image deriva-
tive will have more low-frequency content stemming from
the original image, and therefore the identified ACF will
usually have higher values close to its center. The identified MTF will then show more modulation transfer at the lower frequencies.

Fig. 1. Image of the Earth horizontally blurred by accelerated motion with 20-pixel blur extent and different values of R.
Fig. 2. Average of the autocorrelation functions of the blurred Earth image derivative lines in the motion direction for different values of R.
3. FORMULATION OF THE METHOD
A discrete derivative of the blurred image f(i, j), where i and j are the horizontal and vertical directions, respectively, can be approximated, for example, by [13]

    [Df(i, j)]_k° = f(i, j) * d(i, j),

    d(i, j) = [ −1   1 − tan(k) ]
              [  0       tan(k) ]    (3)

for 0° ≥ k ≥ −45° relative to the positive horizontal direction, where * is the convolution operator.
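The kernel of Eq. (3) can be written out directly (a sketch; note that its entries sum to zero for any k, so it has no DC response):

```python
import numpy as np

def derivative_kernel(k_deg):
    """2x2 directional-derivative kernel d(i, j) of Eq. (3),
    for 0 >= k >= -45 degrees from the positive horizontal axis."""
    t = np.tan(np.radians(k_deg))
    return np.array([[-1.0, 1.0 - t],
                     [0.0,  t]])
```

At k = 0 the kernel reduces to a plain horizontal difference [[−1, 1], [0, 0]]; for intermediate angles the weight shifts toward the diagonal neighbor.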
The motion direction is identified by employing a
simple high-pass filter such as a derivative operation [Eq.
(3)] in all the directions and measuring the total intensity
in each direction. The motion direction will then be the
direction in which the total intensity is the lowest. The
total intensity of the image derivative I(Dg) in direction k
is
    [I(Dg)]_k° = | Σ_{i=1}^{N−1} Σ_{j=1}^{M−1} [Dg(i, j)]_k° |,    (4)
where M and N are the number of rows and columns, re-
spectively, in the image derivative Dg(i, j).
A digital ACF of each image derivative line in the motion direction is then performed, and the average of the ACF's of these lines, R̄_Df, is calculated.
Fig. 3. Comparison of the identified and true power spectra of acceleration motion blur PSF’s: (a) identified power spectra obtained by
Fourier-transforming the ACF’s of Fig. 2, (b) true power spectra.
Fig. 4. Blur function identification: (a) original image, (b) image blurred by accelerated motion with R = 10 and 20-pixel blur extent, (c) true versus identified MTF, (d) true versus identified phase.

An ACF R_l(j) of an M-pixel image line l is defined as

    R_l(j) = Σ_{i=−M}^{M} l(i + j) l(i),    integer j ∈ [−M, M],    (5)

where l(i) = 0 for i ∉ [0, M]. The computation of the digital ACF's in the motion direction k (relative to the positive horizontal direction) is performed by rotating the image itself −k° with use of the two-dimensional interpolation technique and then performing the autocorrelation operation [Eq. (5)] on the horizontal lines of the rotated image.
Fig. 5. (a) True versus identified PSF, (b) restored image with use of the identified OTF.
Fig. 6. Blur function identification from a noisy blurred image: (a) original image, (b) image blurred by accelerated motion with R = 10 and 20-pixel blur extent and additive noise forming a 30-dB signal-to-noise ratio, (c) true versus identified MTF, (d) true versus identified phase.
Fig. 7. (a) True versus identified PSF, (b) restored image with use of the identified OTF.

References
- Fundamentals of Digital Image Processing (book)
- Digital Picture Processing (book)
- Digital Image Processing and Computer Vision (book)
- "Blind deconvolution of spatially invariant image blurs with phase" (journal article)