Proceedings ArticleDOI

Fetal ECG Separation from Abdominal ECG Recordings Using Compressive Sensing Approach

11 Jul 2018 - pp. 831-834
TL;DR: This paper presents a framework for separating the fetal ECG from the maternal ECG using a sparse binary matrix within a Compressive Sensing formulation, together with a preprocessing algorithm for the raw fetal ECG data that removes noise such as impulsive artifacts and applies notch filtering for baseline removal.
Abstract: Analysis of fetal electrocardiogram (f-ECG) beats and interpretation of the heart rate from raw ECG signals help assess the state of the fetus during pregnancy. Continuous monitoring of fetal ECGs is important for timely detection of fetal arrhythmias. In this paper, we present a framework for separating the fetal ECG from the maternal ECG using a sparse binary matrix within a Compressive Sensing formulation. Additionally, we present a preprocessing algorithm for the raw fetal ECG data that removes noise such as impulsive artifacts and applies notch filtering for baseline removal. The proposed method is based on a sparse representation of the components acquired by Independent Component Analysis (ICA), designed for direct application in the compressed domain. Fetal ECG detection is performed on the basis of activated atoms in a specially designed Gaussian dictionary. The proposed framework has been verified on ten samples of Challenge dataset A by determining QRS detection parameters: sensitivity S = 90.62% and positive predictivity P+ = 99.15%.
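The compressed-acquisition step described above can be sketched as follows. This is an illustrative sketch only: the matrix dimensions, column density, and the synthetic test signal are assumptions, not the authors' exact parameters.

```python
# Sketch: compressing one abdominal-ECG window with a sparse binary
# sensing matrix (each column holds a few ones at random rows), as in
# the CS front end described in the abstract. Sizes are illustrative.
import numpy as np

rng = np.random.default_rng(0)

n, m = 512, 128          # window length, number of compressed measurements
d = 4                    # nonzeros per column of the sensing matrix

# Sparse binary matrix: each column has exactly d ones at random rows.
Phi = np.zeros((m, n))
for col in range(n):
    rows = rng.choice(m, size=d, replace=False)
    Phi[rows, col] = 1.0

x = np.sin(2 * np.pi * 1.2 * np.arange(n) / 250)  # stand-in for one ECG window
y = Phi @ x               # compressed measurements: 4x fewer samples than x

print(Phi.shape, y.shape)  # (128, 512) (128,)
```

Sparse binary matrices are attractive here because the matrix-vector product reduces to a handful of additions per measurement, which suits low-power acquisition hardware.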
Citations

Proceedings ArticleDOI
18 Jul 2020
TL;DR: The presented scheme uses one-dimensional (1D) convolution with a wavelet kernel to extract time-domain features from subjects with normal fetal ECG and fetal arrhythmia ECG, with the aim of developing an intelligent system for portable embedded applications.
Abstract: This paper aims to present an intelligent system for autonomous diagnosis of fetal arrhythmia based on fetal ECG recordings. The present scheme uses one-dimensional (1D) convolution with a wavelet kernel to extract time-domain features from subjects possessing normal fetal ECG and fetal arrhythmia ECG. Time-domain features obtained from the convoluted signals are fed to a trained artificial neural network (ANN) with gradient descent learning to identify and classify fetal ECG signals. The experimental evaluation of the proposed scheme has been tested with a six-channel fetal ECG signal, available in the NIFEADB database. An overall accuracy of 96% is obtained by evaluating standard performance metrics. The use of 1D convolution not only reduces the computational burden but also helps to specify the feature space to develop an intelligent system for portable embedded system applications.
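The wavelet-kernel convolution step can be sketched as below. This is not the authors' code: the Mexican-hat (Ricker) kernel choice, kernel width, sampling rate, and synthetic signal are all assumptions used to illustrate how a 1D convolution emphasizes QRS-like transients.

```python
# Illustrative sketch: convolving a 1-D signal with a Mexican-hat
# (Ricker) wavelet kernel, a classic QRS-shaped kernel, to highlight
# sharp transients before feature extraction.
import numpy as np

def ricker(points, a):
    """Mexican-hat wavelet of given length and width parameter a."""
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return amp * (1 - (t / a) ** 2) * np.exp(-(t ** 2) / (2 * a ** 2))

fs = 250                                   # assumed sampling rate, Hz
t = np.arange(2 * fs) / fs
sig = np.sin(2 * np.pi * 1.5 * t)          # slow background rhythm
sig[125] += 3.0                            # a sharp "R-peak-like" spike

feat = np.convolve(sig, ricker(31, 4.0), mode="same")  # 1-D conv features
print(int(np.argmax(np.abs(feat))))        # peak localized at the spike
```

Because the Ricker kernel has zero mean, the slowly varying background is suppressed while the spike produces a strong response, so the convolution output localizes the R-peak-like event.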

3 citations


Additional excerpts

  • ...ECG signal-based arrhythmia detection and classification has been reported in several literatures [5, 7-16]....



Patent
19 Apr 2019
Abstract: The invention provides a method for detecting maternal electrocardiogram R peaks in a single-channel abdominal-wall myoelectric signal of a pregnant woman. The method comprises the following steps: (1) a one-channel abdominal-wall myoelectric signal of the pregnant woman is read; (2) the maternal electrocardiogram R peaks in the abdominal-wall signal are initially detected (pre-detected); (3) according to the initial detection result, an adaptive Gaussian dictionary is constructed, and based on this dictionary the maternal electrocardiogram R peaks are enhanced through sparse representation; (4) the R peaks are detected on the enhanced signal, and their positions are output. The method is characterized in that the Gaussian dictionary obtained in step 3 is composed of Gaussian atoms corresponding to maternal electrocardiogram components, Gaussian atoms corresponding to fetal electrocardiogram components, and Gaussian atoms corresponding to noise components, where the atoms corresponding to the maternal electrocardiogram components involve only one scale, obtained through optimization according to the maternal R peaks found by the initial detection of the abdominal-wall signal.

Book ChapterDOI
01 Jan 2021
Abstract: The infant mortality rate is the number of newborn deaths under one year of age occurring among the live births in a given region during a given year. The electrocardiogram is generally used for finding cardiovascular variation. The infant mortality rate can be drastically reduced by adopting the proposed diagnostic technique for the fetus. The proposed work provides an indication of fetal health and heart information. Newborn babies are sometimes affected by heart diseases such as tachycardia and bradycardia; by diagnosing these diseases, we could reduce the death rate of newborns. This method proposes early detection of the fetal ECG and of fetal arrhythmia, which can dynamically reduce the infant mortality rate. The main aim of this work is to detect variations in the heart rate. These variations are detected using a feature vector extracted from the signal, and a classification method is used to classify the signal as abnormal or normal. This work shows an accuracy of about 94.11% and a sensitivity of 88.88%.

References

Book
D.L. Donoho
01 Jan 2004
TL;DR: It is possible to design n = O(N log(m)) nonadaptive measurements allowing reconstruction with accuracy comparable to that attainable with direct knowledge of the N most important coefficients, and a good approximation to those N important coefficients is extracted from the n measurements by solving a linear program (Basis Pursuit in signal processing).
Abstract: Suppose x is an unknown vector in R^m (a digital image or signal); we plan to measure n general linear functionals of x and then reconstruct. If x is known to be compressible by transform coding with a known transform, and we reconstruct via the nonlinear procedure defined here, the number of measurements n can be dramatically smaller than the size m. Thus, certain natural classes of images with m pixels need only n = O(m^(1/4) log^(5/2)(m)) nonadaptive nonpixel samples for faithful recovery, as opposed to the usual m pixel samples. More specifically, suppose x has a sparse representation in some orthonormal basis (e.g., wavelet, Fourier) or tight frame (e.g., curvelet, Gabor), so the coefficients belong to an l^p ball for 0 < p <= 1.
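The Basis Pursuit recovery mentioned in the TL;DR can be sketched as a small linear program: minimize the l1 norm of x subject to Ax = y, via the standard split x = u - v with u, v >= 0. The problem sizes, sparsity level, and Gaussian measurement matrix below are illustrative assumptions.

```python
# Minimal Basis Pursuit sketch: recover a sparse vector from a few
# random linear measurements by l1 minimization, posed as the LP
#   min 1'(u + v)  s.t.  A(u - v) = y,  u, v >= 0.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
m_dim, n_dim = 30, 80                  # measurements << ambient dimension
A = rng.standard_normal((m_dim, n_dim)) / np.sqrt(m_dim)

x_true = np.zeros(n_dim)
x_true[[5, 23, 61]] = [1.5, -2.0, 0.7]  # 3-sparse signal
y = A @ x_true

c = np.ones(2 * n_dim)                  # objective: sum(u) + sum(v) = ||x||_1
A_eq = np.hstack([A, -A])               # A u - A v = y
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
x_hat = res.x[:n_dim] - res.x[n_dim:]

# For a 3-sparse signal and 30 Gaussian measurements, l1 minimization
# recovers x exactly (up to solver tolerance) with high probability.
print(np.max(np.abs(x_hat - x_true)))
```

Note that only 30 measurements suffice for an 80-dimensional signal here because the signal is 3-sparse; this is the "dramatically smaller than m" regime the abstract describes.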

18,593 citations


Journal ArticleDOI
TL;DR: The theory of compressive sampling, also known as compressed sensing or CS, is surveyed, a novel sensing/sampling paradigm that goes against the common wisdom in data acquisition.
Abstract: Conventional approaches to sampling signals or images follow Shannon's theorem: the sampling rate must be at least twice the maximum frequency present in the signal (Nyquist rate). In the field of data conversion, standard analog-to-digital converter (ADC) technology implements the usual quantized Shannon representation - the signal is uniformly sampled at or above the Nyquist rate. This article surveys the theory of compressive sampling, also known as compressed sensing or CS, a novel sensing/sampling paradigm that goes against the common wisdom in data acquisition. CS theory asserts that one can recover certain signals and images from far fewer samples or measurements than traditional methods use.

8,847 citations


"Fetal ECG Separation from Abdominal..." refers background or methods in this paper

  • ...Compressive Sensing is not a conventional approach for sampling signals [7]....


  • ...Compressive Sensing [5][6][7] is a newly introduced sampling technique that uses far fewer samples than Shannon's theorem may require....



Journal ArticleDOI
01 May 2000 - Neural Networks
TL;DR: The basic theory and applications of ICA are presented, and the goal is to find a linear representation of non-Gaussian data so that the components are statistically independent, or as independent as possible.
Abstract: A fundamental problem in neural network research, as well as in many other disciplines, is finding a suitable representation of multivariate data, i.e. random vectors. For reasons of computational and conceptual simplicity, the representation is often sought as a linear transformation of the original data. In other words, each component of the representation is a linear combination of the original variables. Well-known linear transformation methods include principal component analysis, factor analysis, and projection pursuit. Independent component analysis (ICA) is a recently developed method in which the goal is to find a linear representation of non-Gaussian data so that the components are statistically independent, or as independent as possible. Such a representation seems to capture the essential structure of the data in many applications, including feature extraction and signal separation. In this paper, we present the basic theory and applications of ICA, and our recent work on the subject.
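The ICA model the abstract describes (observations x = As, recovered sources s = Wx with an estimated unmixing matrix W) can be sketched with a compact symmetric FastICA loop. This is a generic illustration, not the paper's implementation: the two synthetic sources, mixing matrix, and iteration count are assumptions.

```python
# Sketch: blind source separation with a minimal symmetric FastICA.
# Model: X = A S (unknown mixing A); estimate W so that W Z ~ S after
# whitening Z of X. Sources are deliberately non-Gaussian.
import numpy as np

rng = np.random.default_rng(2)
n = 5000
t = np.arange(n) / 500.0

# Two non-Gaussian sources: a spiky "ECG-like" peak train and a square wave.
s1 = np.exp(-((t % 0.8) - 0.1) ** 2 / 0.001)
s2 = np.sign(np.sin(2 * np.pi * 3.0 * t))
S = np.vstack([s1, s2])
S = S - S.mean(axis=1, keepdims=True)

A = np.array([[1.0, 0.6], [0.4, 1.0]])   # unknown mixing matrix
X = A @ S                                 # observed mixtures x = A s

# Whitening: Z = E D^{-1/2} E^T (X - mean), so cov(Z) = I.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(Xc))
Z = E @ np.diag(d ** -0.5) @ E.T @ Xc

# Symmetric FastICA with tanh nonlinearity:
#   W_new = E{g(WZ) Z^T} - diag(E{g'(WZ)}) W, then decorrelate rows.
W = rng.standard_normal((2, 2))
for _ in range(200):
    G = np.tanh(W @ Z)
    W_new = (G @ Z.T) / n - np.diag((1 - G ** 2).mean(axis=1)) @ W
    u, _, vt = np.linalg.svd(W_new)       # (W W^T)^{-1/2} W = U V^T
    W = u @ vt

s_hat = W @ Z                             # recovered components s = W z
C = np.corrcoef(np.vstack([S, s_hat]))[:2, 2:]  # corr(true, recovered)
print(np.max(np.abs(C), axis=1))          # near 1: recovery up to sign/order
```

ICA recovers sources only up to permutation and sign, which is why the check looks for the best-correlated recovered component per true source rather than a fixed pairing.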

7,434 citations




"Fetal ECG Separation from Abdominal..." refers background or methods in this paper

  • ...B. Independent Component Analysis for the ECG signal: ICA is one of the most frequently used computational methods for separating a multivariate signal into subcomponents that are additive by nature....


  • ...The ICA algorithm computes the unmixing matrix W = A^(-1) and then finds the independent components as s = Wx. ICA rests on the central limit theorem, which implies that a linear combination of independent components tends toward a Gaussian distribution [8]....


  • ...There are two common source separation techniques, as mentioned above: PCA (Principal Component Analysis) and ICA (Independent Component Analysis) [8]....


  • ...C. Independent Component Analysis for the Compressed Domain As we intend to directly apply the ICA in compressed domain we here define = [ ( ), … ....


  • ...This can be carried out under the presumption that the additive subcomponents are always non-Gaussian and statistically independent of each other [8]....



Proceedings Article
01 Mar 2008
TL;DR: This paper overviews the recent work on compressive sensing, a new approach to data acquisition in which analog signals are digitized for processing not via uniform sampling but via measurements using more general, even random, test functions.
Abstract: This paper overviews the recent work on compressive sensing, a new approach to data acquisition in which analog signals are digitized for processing not via uniform sampling but via measurements using more general, even random, test functions. In stark contrast with conventional wisdom, the new theory asserts that one can combine "low-rate sampling" with digital computational power for efficient and accurate signal acquisition. Compressive sensing systems directly translate analog data into a compressed digital form; all we need to do is "decompress" the measured data through an optimization on a digital computer. The implications of compressive sensing are promising for many applications and enable the design of new kinds of analog-to-digital converters, cameras, and imaging systems.

1,537 citations


"Fetal ECG Separation from Abdominal..." refers methods in this paper

  • ...Compressive Sensing [5][6][7] is a newly introduced sampling technique that uses far fewer samples than Shannon's theorem may require....



Journal ArticleDOI
TL;DR: This paper quantifies the potential of the emerging compressed sensing (CS) signal acquisition/compression paradigm for low-complexity, energy-efficient ECG compression on the state-of-the-art Shimmer WBSN mote, and shows that CS represents a competitive alternative to state-of-the-art digital wavelet transform (DWT)-based ECG compression solutions in the context of WBSN-based ECG monitoring systems.
Abstract: Wireless body sensor networks (WBSN) hold the promise to be a key enabling information and communications technology for next-generation patient-centric telecardiology or mobile cardiology solutions. Through enabling continuous remote cardiac monitoring, they have the potential to achieve improved personalization and quality of care, increased ability of prevention and early diagnosis, and enhanced patient autonomy, mobility, and safety. However, state-of-the-art WBSN-enabled ECG monitors still fall short of the required functionality, miniaturization, and energy efficiency. Among others, energy efficiency can be improved through embedded ECG compression, in order to reduce airtime over energy-hungry wireless links. In this paper, we quantify the potential of the emerging compressed sensing (CS) signal acquisition/compression paradigm for low-complexity energy-efficient ECG compression on the state-of-the-art Shimmer WBSN mote. Interestingly, our results show that CS represents a competitive alternative to state-of-the-art digital wavelet transform (DWT)-based ECG compression solutions in the context of WBSN-based ECG monitoring systems. More specifically, while expectedly exhibiting inferior compression performance than its DWT-based counterpart for a given reconstructed signal quality, its substantially lower complexity and CPU execution time enables it to ultimately outperform DWT-based ECG compression in terms of overall energy efficiency. CS-based ECG compression is accordingly shown to achieve a 37.1% extension in node lifetime relative to its DWT-based counterpart for “good” reconstruction quality.

648 citations


Performance Metrics
Citations received by the paper in previous years:
Year  Citations
2021  1
2020  1
2019  1