Color-Decoupled Photo Response Non-Uniformity for Digital Image Forensics
Chang-Tsun Li, Member, IEEE, and Yue Li
IEEE Transactions on Circuits and Systems for Video Technology, 22(2), pp. 260-271. DOI: 10.1109/TCSVT.2011.2160750


Abstract: The last few years have seen the use of photo response non-uniformity noise (PRNU), a unique fingerprint of imaging sensors, in various digital forensic applications such as source device identification, content integrity verification and authentication. However, the use of a colour filter array for capturing only one of the three colour components per pixel introduces colour interpolation noise, while the existing methods for extracting PRNU provide no effective means for addressing this issue. Because the artificial colours obtained through the colour interpolation process are not directly acquired from the scene by physical hardware, we expect that the PRNU extracted from the physical components, which are free from interpolation noise, should be more reliable than that from the artificial channels, which carry interpolation noise. Based on this assumption we propose a colour-decoupled PRNU (CD-PRNU) extraction method, which first decomposes each colour channel into four sub-images and then extracts the PRNU noise from each sub-image. The PRNU noise patterns of the sub-images are then assembled to get the CD-PRNU. This new method can prevent the interpolation noise from propagating into the physical components, thus improving the accuracy of device identification and image content integrity verification.
Index Terms: Image forensics, colour filter array, photo response non-uniformity noise, demosaicing, image authentication, source device identification
I. INTRODUCTION
As digital multimedia processing hardware and software
become more affordable and their functionalities become
more versatile, their use in our everyday life becomes
ubiquitous. However, while most of us enjoy the benefits these
technologies have to offer, the very same set of technologies
can also be exploited to manipulate contents for malicious
purposes. Consequently, the credibility of digital multimedia
when used as evidence in legal and security domains will be
constantly challenged and has to be scientifically proved. After
over 15 years of intensive research, digital watermarking [1, 2,
3, 4, 5, 6, 7] has been accepted as an effective way of verifying
content integrity in a wide variety of applications and will
continue to play an important role in multimedia protection and
security. However, because of the need of proactive
embedding of extra information in the host media, digital
watermarking is only applicable when such a data embedding
mechanism is available and the application standards/protocols
are followed. Given this limitation, there is no doubt that unwatermarked multimedia will continue to be produced. Moreover, an enormous amount of unwatermarked media is already in circulation, and digital watermarking cannot help in digital investigations when such unwatermarked multimedia is the object in question. In the light of these issues, recent years have seen an increasing number of publications on digital forensics [8, 9, 10, 11, 12, 13], which rely on extracting the device signatures left in images/videos by acquisition devices [14, 15, 16, 17] to verify the credibility and identify the source of digital images/videos. A device signature may take the form
of sensor pattern noise (SPN) [11, 14, 18, 19, 20], camera
response function [21], re-sampling artefacts [22], colour filter
array (CFA) interpolation artefacts [12, 16], JPEG
compression [13, 23], lens aberration [24, 25], sensor dust [8],
etc. Other device and image attributes such as binary similarity
measures, image quality measures and higher order wavelet
statistics have also been exploited to identify and classify
source devices [17, 26, 27].
A. Photo Response Non-Uniformity (PRNU)
Among the many types of intrinsic device signatures, sensor pattern noise [11, 14, 19, 20, 28] has drawn much attention due to its capability of identifying not only device models of the same make, but also individual devices of the same model [9, 11, 14]. Sensor pattern noise is mainly caused by imperfections in the manufacturing process of semiconductor wafers and slight variations in the way individual sensor pixels convert light to electrical signals [29]. It is this uniqueness of manufacturing
imperfections and non-uniformity of photo-electronic
conversion that makes sensor pattern noise capable of
identifying imaging sources to the accuracy of individual
devices. The reader is referred to [29] for more details in
relation to sensor pattern noise.
The dominating component of sensor pattern noise is photo
response non-uniformity (PRNU) [14, 29]. However, the
PRNU can be contaminated by various types of noise
introduced at different stages of the image acquisition process.
Figure 1 demonstrates the image acquisition process [30]. A
colour photo is represented in three colour components (i.e., R,
G, and B). For most digital cameras, during the image
acquisition process, the lenses let through the rays of the three
colour components of the scene, but for every pixel only the rays of one colour component are passed through the CFA and
subsequently converted into electronic signals by the sensor.
This colour filtering is determined by the CFA. After the
conversion, a colour interpolation function generates the
electronic signals of the other two colour components for
every pixel according to the colour intensities of the
neighbouring pixels. This colour interpolation process is
commonly known as demosaicking [31, 32, 33]. The signals
then undergo additional signal processing such as white
balance, gamma correction and image enhancement. Finally,
these signals are stored in the camera’s memory in a
customised format, primarily the JPEG format.
In acquiring an image, the signal will inevitably be distorted
when passing through each process and these distortions result
in slight differences between the scene and the camera-
captured image [14]. As formulated in [11], a camera output
model can be expressed as

$$I = g^{\gamma}\,[(1+K)\,Y + \Lambda + \Theta_s + \Theta_r]^{\gamma} + \Theta_q \qquad (1)$$

where I is the output image, Y is the input signal of the scene, g is the colour channel gain, γ (= 0.455) is the gamma correction factor, K is the zero-mean multiplicative factor responsible for the PRNU, and Λ, Θ_s, Θ_r and Θ_q stand for dark current, shot noise, read-out noise and quantisation (lossy compression) noise, respectively. In Eq. (1), Θ_s and Θ_r are random noise and Λ is the fixed pattern noise (FPN) that is associated with every camera and can be removed by subtracting a dark frame from the image taken by the same camera [34]. Since Y is the dominating term in Eq. (1), after applying Taylor expansion to Eq. (1) and keeping the first two terms of the expansion,

$$I = I^{(0)} + \gamma I^{(0)} K + \Theta \qquad (2)$$

where I^(0) is the denoised image and Θ is the ensemble of the noises, including Λ, Θ_s, Θ_r and Θ_q. The PRNU pattern noise K can then be formulated as

$$K = \frac{W}{\gamma\, I^{(0)}} \qquad (3)$$

where

$$W = I - I^{(0)} \qquad (4)$$

is the noise residual obtained by applying a denoising filter on image I. Although various denoising filters can be used, the wavelet-based denoising process (i.e., the discrete wavelet transform followed by a Wiener filtering operation), as described in Appendix A of [14], has been reported as effective in producing good results.
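As a concrete illustration of Eq. (4), the following minimal Python sketch (our own naming, assuming NumPy, PyWavelets and SciPy are available) extracts a noise residual from one colour plane. It approximates the wavelet-and-Wiener denoising filter described in Appendix A of [14] with a generic Wiener filtering of the detail subbands, so it is a sketch of the idea rather than the exact filter used in the paper.

import numpy as np
import pywt
from scipy.signal import wiener

def noise_residual(img, wavelet="db8", levels=4):
    """Eq. (4): W = I - I^(0), where I^(0) is a wavelet/Wiener-denoised version of I.
    `img` is a single-channel array (one colour plane)."""
    img = np.asarray(img, dtype=np.float64)
    coeffs = pywt.wavedec2(img, wavelet, level=levels)
    filtered = [coeffs[0]]                       # keep the approximation subband
    for cH, cV, cD in coeffs[1:]:
        # Wiener-filter each detail subband to suppress scene content, so the
        # reconstruction approximates the noise-free image I^(0).
        filtered.append((wiener(cH, 5), wiener(cV, 5), wiener(cD, 5)))
    i0 = pywt.waverec2(filtered, wavelet)
    i0 = i0[:img.shape[0], :img.shape[1]]        # waverec2 may pad odd-sized inputs
    return img - i0                              # noise residual W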
B. Use of PRNU in Device Identification
The basic idea of using the PRNU noise pattern in device
identification can be described as follows.
1) First, for each imaging device d, the noise residual patterns are extracted using Eq. (4) from a number of low-contrast images taken by device d, and then the PRNU is estimated using the ML estimation procedure adopted by Chen et al. [11], i.e.,

$$K_d = \frac{1}{\gamma}\cdot\frac{\sum_{s=1}^{S} W_{d,s}\, I_{d,s}}{\sum_{s=1}^{S} (I_{d,s})^2} \qquad (5)$$

where S is the number of images involved in the calculation, γ is the gamma correction factor (γ = 0.455), I_{d,s} is the s-th image taken by device d, and W_{d,s} is the noise residual extracted from I_{d,s}. Note that the multiplication operation in Eq. (5) is element-wise.
2) Secondly, the noise residual W_I of the image I under investigation is extracted using Eq. (4) and compared against the reference PRNU K_d of each device d available to the investigator, in the hope that it will match one of the reference fingerprints, thus identifying the source device that has taken the image under investigation [14]. The normalised cross-correlation

$$\rho(I K_d, W_I) = \frac{(I K_d - \overline{I K_d})\cdot(W_I - \overline{W_I})}{\lVert I K_d - \overline{I K_d}\rVert \cdot \lVert W_I - \overline{W_I}\rVert} \qquad (6)$$

is used to compare the noise residual W_I against the reference fingerprint K_d, where the overline denotes the mean. Note that in Eq. (6), instead of using K_d, we use I K_d, as suggested in [11]. Again, the multiplication operation in Eq. (6) is element-wise.
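As a rough sketch of steps 1) and 2) above, the following NumPy code implements the ML estimate of Eq. (5) and the normalised cross-correlation of Eq. (6). The function and variable names are ours, the residuals are assumed to come from a routine such as the noise_residual() sketch given earlier, and the element-wise operations mirror the notes after the two equations.

import numpy as np

GAMMA = 0.455  # gamma correction factor used in Eq. (5)

def estimate_prnu(images, residuals, gamma=GAMMA):
    # ML estimate of the reference PRNU K_d of one device, Eq. (5).
    # `images` and `residuals` are lists of equally sized arrays; all
    # products, squares and the final division are element-wise.
    num = np.zeros(np.shape(images[0]), dtype=np.float64)
    den = np.zeros_like(num)
    for I, W in zip(images, residuals):
        I = np.asarray(I, dtype=np.float64)
        num += W * I
        den += I ** 2
    return num / (gamma * den)

def correlation(reference, residual):
    # Normalised cross-correlation of Eq. (6).
    a = reference - reference.mean()
    b = residual - residual.mean()
    return float(np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Identification: compare the residual W_I of a query image I against
# I * K_d (element-wise), as suggested in [11], and attribute I to the
# device d with the highest correlation:
#   rho_d = correlation(I * K_d, W_I)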
Given the potential of PRNU-based approaches in resolving the device identification problem to the accuracy of individual devices, it is important that the extracted PRNU is as close as possible to the genuine pattern noise due to the sensor. Since, for most cameras, only one of the three colours of each pixel is physically captured by the sensor while the other two are artificially interpolated by the demosaicking process, demosaicking inevitably introduces noise with power stronger than that of the genuine PRNU. We can see from Eq. (2), (3) and (4) that the accuracy of both the PRNU K and the noise residual W depends on the denoising operation applied to I in obtaining I^(0). However, as mentioned earlier, the most common method [11, 14, 15, 18] of obtaining I^(0) is to apply the discrete wavelet transform followed by a Wiener filtering operation directly to the entire image I, without differentiating physical components from artificial components, thus allowing the interpolation noise in the artificial components to contaminate the real PRNU in the physical components. Addressing this shortcoming is the motivation of this work. We will look at the impact of demosaicking on PRNU fidelity in Section II and propose an improved formula for extracting the PRNU in Section III. In Section IV, we present experiments on device identification and image content integrity verification to validate the proposed PRNU extraction formula. Section V concludes this work. Because the PRNU is formulated in Eq. (3) and (5) as a function of the noise residual W (i.e., Eq. (4)), in the rest of this work we will use the two terms, PRNU and noise residual, interchangeably whenever there is no need to differentiate them.
II. DEMOSAICKING IMPACT ON PRNU FIDELITY
In this work, we call the colour components physically captured by the sensor physical colours, and the ones artificially interpolated by the demosaicking function artificial colours. Because demosaicking is a key deterministic process that affects the quality of colour images taken by many digital devices, it has been rigorously investigated [31, 32, 33, 35, 36]. Most
demosaicking approaches group the missing colours before
applying an interpolation function. The grouping process is
usually content-dependent, e.g., edge-adaptive or non-
adaptive; hence the accuracy of the colour interpolation result is also content-dependent [37]. For example, in a homogeneous
area, because of the low variation of the colour intensities of
neighbouring pixels, the interpolation function can more
accurately generate artificial components [30]. Conversely, in
inhomogeneous areas, the colour variation between
neighbouring pixels is greater, thus the interpolation noise is
also more significant.
This indicates that the PRNU in physical colour components
is more reliable than that in the artificial components.
However, the existing method for extracting PRNU as
formulated in Eq. (4) and (5) based on the definition of the
output image model in Eq. (1) does not take this into account
[11]. To extract the PRNU using Eq. (4) and (5), the discrete
wavelet transform followed by a Wiener filtering operation is
applied. The main problem inherent to Eq. (4) is that it involves
the whole image plane, which contains both artificial and
physical components, in one noise residual extraction process.
Moreover, each coefficient of the wavelet transform used in the
noise residual extraction process involves multiple pixels and
thus both artificial and physical components. As a result the
interpolation noise gets diffused from the artificial components
into the physical ones. For example, in the red colour component/plane of an image taken by a camera with a Bayer CFA, only one quarter of the pixels' red colours are physical, and for each pixel with a physical red colour all of its 8-neighbours' red colours are artificial. When the wavelet transform is applied during the noise residual extraction process, the interpolation noise residing in the artificial components propagates into the physical components. Therefore it is desirable to devise a noise
residual extraction method that can prevent the artificial
components from contaminating the reliable PRNU residing in
the physical components with the interpolation noise.
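To make the physical/artificial split concrete, the toy NumPy sketch below marks which red samples are physically captured under a 2 × 2 Bayer CFA; the red site offset is an assumption made for illustration, since the actual layout varies between cameras and is usually undisclosed.

import numpy as np

def red_physical_mask(height, width, red_offset=(0, 0)):
    # True where the red value is physically captured under a 2x2 Bayer CFA
    # whose red site sits at `red_offset` within each 2x2 cell (assumed layout).
    mask = np.zeros((height, width), dtype=bool)
    mask[red_offset[0]::2, red_offset[1]::2] = True
    return mask

mask = red_physical_mask(4, 4)
print(mask.mean())   # 0.25: only one quarter of the red samples are physical
# Every 8-neighbour of a physical red sample is artificial, which is why a
# whole-plane wavelet transform lets interpolation noise leak into the
# physical PRNU.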
III. FORMULATION OF COLOUR-DECOUPLED PRNU (CD-PRNU)
In this section, we will discuss the formulation and extraction
of CD-PRNU. First, a mathematical model for the CD-PRNU
is derived and then an extraction algorithm is proposed to
extract the noise residual that is to be used for estimating the
final CD-PRNU, without prior knowledge about the CFA.
A. Mathematical Model of CD-PRNU
A generic demosaicking process is to convolve an
interpolation matrix with an image block of the same size
centred at the pixel where the artificial colour is to be
calculated [10, 16, 31]. Although the 2×2 Bayer CFA is the
most common CFA pattern, to make the proposed CD-PRNU
versatile and applicable to cameras adopting different CFA
patterns, we make no assumption about the CFA pattern, F, except that it is a 2 × 2 square array. Let Φ be an interpolation matrix with (2N+1) × (2N+1) coefficients and {Y_c | c ∈ {R, G, B}} be an X × Y-pixel input signal from the scene, consisting of the three colour components R (red), G (green) and B (blue), before colour interpolation. That is to say, for each pixel (x, y), only one of the three colour components takes a value physically captured by the sensor, and this colour is determined by the colour configuration of the CFA pattern F. The other two colour components are determined by the demosaicking process. Each colour component of a pixel, Y_c(x, y), c ∈ {R, G, B}, can be determined according to

$$Y_c(x, y) = \begin{cases} Y_c(x, y), & \text{if } F(x \bmod 2,\, y \bmod 2) = c \\[2pt] \sum_{u,v=-N}^{N} \Phi_c(u, v)\, Y_c(x+u,\, y+v), & \text{otherwise} \end{cases} \qquad (7)$$

The first part of Eq. (7) means that if the colour component c is the same as the colour that the CFA pattern F allows to pass at (x, y), i.e., F(x mod 2, y mod 2) = c, then no demosaicking is needed because c has been physically captured by the sensor. Otherwise, the second part of Eq. (7) is applied to calculate the colour artificially. According to Eq. (7), the image output model of Eq. (1) proposed in [11] can be re-formulated as

$$I = \begin{cases} g^{\gamma}\,[(1+K)\,Y + \Lambda + \Theta_s + \Theta_r]^{\gamma} + \Theta_q, & \text{for physical colour} \\[2pt] g^{\gamma}\,[\Phi\,(1+K)\,Y + \Lambda + \Theta_s + \Theta_r]^{\gamma} + \Theta_q, & \text{for artificial colour} \end{cases} \qquad (8)$$
According to Eq. (3), we know that K ≪ 1 because W ≪ I^(0). Therefore (1+K) ≈ 1 and, if we define the interpolation noise P through Φ = 1 + P, the second part of Eq. (8) becomes g^γ [(1+P)(1+K) Y + Λ + Θ_s + Θ_r]^γ + Θ_q, which is virtually equal to g^γ [(1+P) Y + Λ + Θ_s + Θ_r]^γ + Θ_q. This is because, for the artificial components, the interpolation noise P is many orders of magnitude greater than the PRNU K and K ≪ 1, so (1+P)(1+K) is virtually equal to (1+P). As a result, Eq. (8) can be re-formulated as

$$I = \begin{cases} g^{\gamma}\,[(1+K)\,Y + \Lambda + \Theta_s + \Theta_r]^{\gamma} + \Theta_q, & \text{for physical colour} \\[2pt] g^{\gamma}\,[(1+P)\,Y + \Lambda + \Theta_s + \Theta_r]^{\gamma} + \Theta_q, & \text{for artificial colour} \end{cases} \qquad (9)$$

Eq. (9) suggests that in the artificial components the 'PRNU' is actually the interpolation noise P while, in the physical
components, the PRNU remains unaffected by the interpolation
noise.
It can also be seen from Eq. (9) that the physical
components and artificial components have similar mathematical expressions. Hence, if the physical and artificial colour components can be separated/decoupled, P can be extracted in the same way as the sensor pattern noise K is extracted (i.e., Eq. (3)). That is,

$$P = \frac{W^{a}}{\gamma\, I^{(0),a}} \qquad (10)$$

where I^(0),a is a low-pass filtered version of the artificial components I^a and W^a is the corresponding 'sensor pattern noise', which is actually the interpolation noise. We can also use the same ML estimate as in Eq. (5) to extract the reference interpolation noise P_d for a particular device d from S low-variation images taken by d, such that

$$P_d = \frac{1}{\gamma}\cdot\frac{\sum_{s=1}^{S} W^{a}_{d,s}\, I^{a}_{d,s}}{\sum_{s=1}^{S} (I^{a}_{d,s})^2} \qquad (11)$$

where I^a_{d,s} is the artificial colour components of the s-th low-contrast image taken by device d and W^a_{d,s} is the interpolation noise extracted from I^a_{d,s}. We will discuss how the physical and artificial colour components can be decoupled in a simple manner, without a priori knowledge about the CFA pattern, in Section III.B.
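Because Eq. (11) has exactly the same form as Eq. (5), the reference interpolation noise can be obtained by reusing the estimate_prnu() routine sketched earlier, fed only with the artificial components; the list names below are hypothetical placeholders.

# Reference interpolation noise of device d (Eq. (11)): reuse the Eq. (5)
# estimator from the earlier sketch on the artificial colour components and
# their residuals, assembled per device as described in Section III.B.
P_d = estimate_prnu(artificial_images_d, artificial_residuals_d)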
B. CD-PRNU Extraction Algorithm
According to Eq. (10) and (11), we can extract the sensor
pattern noise and interpolation noise, respectively, from the
physical and artificial components if the CFA is known.
However, manufacturers usually do not provide information
about the CFA used by their cameras [30]. Therefore, several
methods have been proposed to estimate the CFA [10, 12, 16,
38]. Unfortunately, these methods have to exhaust all of the
possible CFA patterns in order to infer/estimate the
‘real’/optimal CFA. However, exhaustive search is by no
means acceptable. In this work, to extract the CD-PRNU, we first separate the three colour channels I_c, c ∈ {R, G, B}, of a colour image I of X × Y pixels. Most CFA patterns are of 2 × 2 elements and are periodically mapped to the sensors. We know that, for each pixel of I, only one of the three colour components is physical and the other two are artificial, so the second step is, for each channel I_c, to perform a 2:1 down-sampling across both the horizontal and vertical dimensions to get four sub-images, I_{c,i,j}, i, j ∈ {0, 1}, such that

$$I_{c,i,j}(x, y) = I_c(2x + i,\, 2y + j) \qquad (12)$$

where x ∈ [0, X/2 − 1] and y ∈ [0, Y/2 − 1].
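A minimal NumPy sketch of this decomposition (Eq. (12)); the function name is our own choice:

import numpy as np

def decompose(channel):
    # Eq. (12): split one colour channel into four 2:1 down-sampled sub-images,
    # indexed by (i, j) with i, j in {0, 1}.
    channel = np.asarray(channel)
    return {(i, j): channel[i::2, j::2] for i in (0, 1) for j in (0, 1)}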
For each colour channel I_c, without knowing the CFA pattern used by the manufacturer, we do not know (and, in fact, do not have to know) which pixels carry the colour captured physically by the hardware and which do not. But by decomposing I_c into four sub-images, I_{c,i,j}, we know that each of the four sub-images contains either only the physical colour or only the artificial colours. By decoupling the physical and artificial colour components in this fashion before extracting the noise residual, we can prevent the artificial components from contaminating the physical components during the DWT process. Eq. (4) is then used to obtain a noise residual W_{c,i,j} from each sub-image I_{c,i,j}, i, j ∈ {0, 1}. Finally, the CD-PRNU W_c of each colour channel c is formed by combining the four sub-noise residuals W_{c,i,j}, i, j ∈ {0, 1}, such that

$$W_c(x, y) = W_{c,\, x \bmod 2,\, y \bmod 2}\big(\lfloor x/2 \rfloor,\, \lfloor y/2 \rfloor\big) \qquad (13)$$

where x ∈ [0, X − 1], y ∈ [0, Y − 1] and mod is the modulo operation. The framework of the colour-decoupled noise residual extraction process is shown in Figure 2 and the procedure is listed in Algorithm 1. Note that Algorithm 1 is for extracting the noise residual pattern W from an image I. To estimate the CD-PRNU P_d of a particular device d and use it as the reference signature of d, Eq. (11) is applied.
Algorithm 1. Noise residual extraction algorithm.
Input: original image I
Output: colour-decoupled noise residual W
1) Decompose image I into its R, G and B components, I_R, I_G and I_B.
2) ∀c ∈ {R, G, B}, decompose I_c into four sub-images, I_{c,0,0}, I_{c,0,1}, I_{c,1,0} and I_{c,1,1}, by using Eq. (12).
3) ∀c ∈ {R, G, B}, extract W_{c,0,0}, W_{c,0,1}, W_{c,1,0} and W_{c,1,1} from I_{c,0,0}, I_{c,0,1}, I_{c,1,0} and I_{c,1,1} by using Eq. (4).
4) ∀c ∈ {R, G, B}, generate the colour-decoupled noise residual W_c by combining W_{c,0,0}, W_{c,0,1}, W_{c,1,0} and W_{c,1,1} according to Eq. (13).
5) Combine the colour-decoupled noise residuals W_R, W_G and W_B to form the final noise residual W.
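Putting Algorithm 1 together, the sketch below chains the decompose() and noise_residual() helpers shown earlier; it is a compact illustration of the procedure under our own naming, not the authors' implementation, and step 5 here simply stacks the three per-channel residuals.

import numpy as np

def cd_noise_residual(rgb_image):
    # Algorithm 1: colour-decoupled noise residual W of an X x Y RGB image.
    rgb_image = np.asarray(rgb_image, dtype=np.float64)
    residual = np.zeros_like(rgb_image)
    for c in range(3):                       # step 1: process R, G, B separately
        channel = rgb_image[:, :, c]
        sub_images = decompose(channel)      # step 2: Eq. (12)
        W_c = np.zeros_like(channel)
        for (i, j), sub in sub_images.items():
            W_sub = noise_residual(sub)      # step 3: Eq. (4) on each sub-image
            W_c[i::2, j::2] = W_sub          # step 4: reassemble via Eq. (13)
        residual[:, :, c] = W_c              # step 5: combine the three channels
    return residual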
IV. EXPERIMENTAL RESULTS
In this section, we carry out experiments on source camera
identification and image content integrity verification to
validate the feasibility of the proposed CD-PRNU in a
comparative manner.
A. Source Camera Identification
We have carried out source camera identification tests on 300
2048×1536-pixel photos of natural scenes taken by six cameras
