
Showing papers on "Artifact (error)" published in 2000


Journal ArticleDOI
TL;DR: The results on EEG data collected from normal and autistic subjects show that ICA can effectively detect, separate, and remove contamination from a wide variety of artifactual sources in EEG records with results comparing favorably with those obtained using regression and PCA methods.
Abstract: Eye movements, eye blinks, cardiac signals, muscle noise, and line noise present serious problems for electroencephalographic (EEG) interpretation and analysis when rejecting contaminated EEG segments results in an unacceptable data loss. Many methods have been proposed to remove artifacts from EEG recordings, especially those arising from eye movements and blinks. Often regression in the time or frequency domain is performed on parallel EEG and electrooculographic (EOG) recordings to derive parameters characterizing the appearance and spread of EOG artifacts in the EEG channels. Because EEG and ocular activity mix bidirectionally, regressing out eye artifacts inevitably involves subtracting relevant EEG signals from each record as well. Regression methods become even more problematic when a good regressing channel is not available for each artifact source, as in the case of muscle artifacts. Use of principal component analysis (PCA) has been proposed to remove eye artifacts from multichannel EEG. However, PCA cannot completely separate eye artifacts from brain signals, especially when they have comparable amplitudes. Here, we propose a new and generally applicable method for removing a wide variety of artifacts from EEG records based on blind source separation by independent component analysis (ICA). Our results on EEG data collected from normal and autistic subjects show that ICA can effectively detect, separate, and remove contamination from a wide variety of artifactual sources in EEG records with results comparing favorably with those obtained using regression and PCA methods. ICA can also be used to analyze blink-related brain activity.
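The ICA-based cleaning pipeline described above can be sketched with scikit-learn's FastICA (a different ICA algorithm than the infomax variant typically used in this literature; the simulated signals and the kurtosis-based artifact picker below are illustrative assumptions, not the authors' method):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)

# Three latent sources: two "brain" rhythms and one sparse blink-like artifact.
brain1 = np.sin(2 * np.pi * 10 * t)
brain2 = np.sign(np.sin(2 * np.pi * 3 * t))
blink = (np.abs(t % 2 - 1) < 0.05).astype(float) * 5.0
S = np.c_[brain1, brain2, blink]

A = rng.normal(size=(3, 3))   # unknown mixing matrix -> 3 scalp "channels"
X = S @ A.T                   # observed multichannel recording

ica = FastICA(n_components=3, random_state=0)
components = ica.fit_transform(X)        # unmix into independent components

# Pick the artifact component by its spikiness (highest kurtosis) and zero it.
c = components - components.mean(axis=0)
kurt = (c ** 4).mean(axis=0) / (c ** 2).mean(axis=0) ** 2
components[:, np.argmax(kurt)] = 0.0
X_clean = ica.inverse_transform(components)  # back-project without the artifact
```

In practice the artifactual components are usually identified by visual inspection of their waveforms and scalp maps rather than by an automatic statistic.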

2,944 citations


Journal ArticleDOI
TL;DR: It is demonstrated that simultaneous EEG/fMRI recordings are possible for the first time, extending the scope of EEG/fMRI studies considerably.

1,285 citations


Journal ArticleDOI
TL;DR: ICA has been shown to be an efficient tool for artifact identification and extraction from electroencephalographic and magnetoencephalographic recordings and has been applied to the analysis of brain signals evoked by sensory stimuli.
Abstract: Multichannel recordings of the electromagnetic fields emerging from neural currents in the brain generate large amounts of data. Suitable feature extraction methods are, therefore, useful to facilitate the representation and interpretation of the data. Recently developed independent component analysis (ICA) has been shown to be an efficient tool for artifact identification and extraction from electroencephalographic (EEG) and magnetoencephalographic (MEG) recordings. In addition, ICA has been applied to the analysis of brain signals evoked by sensory stimuli. This paper reviews our recent results in this field.

789 citations


Journal ArticleDOI
TL;DR: In this article, the relative merits of a variety of EOG correction procedures are discussed, including the distinction between frequency and time domain approaches, the number of EOG channels required for adequate correction, estimating correction coefficients from raw versus averaged data, differential correction of different types of eye movement, the most suitable statistical procedure for estimating correction coefficients, the use of calibration trials for the estimation of correction coefficients, and the distinction between 'coefficient estimation' and 'correction phase' error.
Abstract: Eye movements cause changes to the electric fields around the eyes, and consequently over the scalp. As a result, EEG recordings are often significantly distorted, and their interpretation problematic. A number of methods have been proposed to overcome this problem, ranging from the rejection of data corresponding temporally to large eye movements, to the removal of the estimated effect of ocular activity from the EEG (EOG correction). This paper reviews a number of such methods of dealing with ocular artifact in the EEG, focusing on the relative merits of a variety of EOG correction procedures. Issues discussed include the distinction between frequency and time domain approaches, the number of EOG channels required for adequate correction, estimating correction coefficients from raw versus averaged data, differential correction of different types of eye movement, the most suitable statistical procedure for estimating correction coefficients, the use of calibration trials for the estimation of correction coefficients, and the distinction between 'coefficient estimation' and 'correction phase' error. A suggested EOG correction algorithm is also described.
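A minimal time-domain version of the EOG correction procedures reviewed above might look as follows (single EEG and EOG channel, simulated signals; because the reference here is purely ocular, this idealized sketch sidesteps the problem that real EOG channels also contain brain activity):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
eog = rng.normal(size=n).cumsum() / 10   # simulated ocular reference channel
brain = np.sin(np.linspace(0, 60, n))    # "true" EEG activity at the scalp site
b_true = 0.4                             # unknown EOG-to-EEG propagation factor
eeg = brain + b_true * eog               # contaminated scalp channel

# Least-squares estimate of the propagation coefficient, then subtraction of
# the estimated ocular contribution (time-domain EOG correction).
b_hat = np.dot(eog, eeg) / np.dot(eog, eog)
eeg_corrected = eeg - b_hat * eog
```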

664 citations


Journal ArticleDOI
TL;DR: This work proposes a procedure for statistical correction of artifacts in dense array studies (SCADS), which detects individual channel artifacts using the recording reference, detects global artifacts using the average reference, replaces artifact-contaminated sensors with spherical interpolation statistically weighted on the basis of all sensors, and computes the variance of the signal across trials to document the stability of the averaged waveform.
Abstract: With the advent of dense sensor arrays (64-256 channels) in electroencephalography and magnetoencephalography studies, the probability increases that some recording channels are contaminated by artifact. If all channels are required to be artifact free, the number of acceptable trials may be unacceptably low. Precise artifact screening is necessary for accurate spatial mapping, for current density measures, for source analysis, and for accurate temporal analysis based on single-trial methods. Precise screening presents a number of problems given the large datasets. We propose a procedure for statistical correction of artifacts in dense array studies (SCADS), which (1) detects individual channel artifacts using the recording reference, (2) detects global artifacts using the average reference, (3) replaces artifact-contaminated sensors with spherical interpolation statistically weighted on the basis of all sensors, and (4) computes the variance of the signal across trials to document the stability of the averaged waveform. Examples from 128-channel recordings and from numerical simulations illustrate the importance of careful artifact review in the avoidance of analysis errors.
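The four SCADS steps can be caricatured in a few lines (the amplitude-threshold detector and the good-channel mean used in place of statistically weighted spherical interpolation are simplifying stand-ins, not the published procedure):

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials, n_channels, n_samples = 50, 16, 100
data = rng.normal(0.0, 1.0, size=(n_trials, n_channels, n_samples))
data[:, 5] += rng.normal(0.0, 8.0, size=(n_trials, n_samples))  # contaminated channel

# (1)-(2) Flag channels whose mean peak-to-peak amplitude is an outlier.
amp = np.ptp(data, axis=2).mean(axis=0)      # mean range per channel
bad = amp > amp.mean() + 2 * amp.std()

# (3) Replace flagged channels; the statistically weighted spherical
# interpolation is approximated here by the mean of the good channels.
data[:, bad] = data[:, ~bad].mean(axis=1, keepdims=True)

# (4) Trial-to-trial variance documents the stability of the averaged waveform.
erp = data.mean(axis=0)
stability = data.var(axis=0)
```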

517 citations


Journal ArticleDOI
TL;DR: Quality EEG may be recorded simultaneously with fMRI, and activation maps can be made of any relevant changes in the EEG, such as inter-ictal spikes or spectral variations, or of evoked response potentials (ERPs).

295 citations


Journal ArticleDOI
TL;DR: It was concluded that concurrent EEG and fMRI could be performed without compromising the image quality significantly if suitable equipment is used and the image noise originating from the EEG recording equipment was identified as coherent noise and could be eliminated by appropriate shielding of the EEG equipment.
Abstract: Electroencephalographic (EEG) monitoring during functional magnetic resonance imaging (fMRI) experiments is increasingly applied for studying physiological and pathological brain function. However, the quality of the fMRI data can be significantly compromised by the EEG recording due to the magnetic susceptibility of the EEG electrode assemblies and electromagnetic noise emitted by the EEG recording equipment. We therefore investigated the effect of individual components of the EEG recording equipment on the quality of echo planar images. The artifact associated with each component was measured and compared to the minimum scalp-cortex distance measured in normal controls. The image noise originating from the EEG recording equipment was identified as coherent noise and could be eliminated by appropriate shielding of the EEG equipment. It was concluded that concurrent EEG and fMRI could be performed without compromising the image quality significantly if suitable equipment is used. The methods described and the results of this study should be useful to other researchers as a framework for testing of their own equipment and for the selection of appropriate equipment for EEG recording inside an MR scanner. Hum. Brain Mapping 10:10-15, 2000. (C) 2000 Wiley-Liss, Inc.

153 citations


Journal ArticleDOI
TL;DR: A method for studying the geometric relations among responses generated by mathematical models is introduced that shows the artifact is a result of the combined contributions of three factors: arithmetic averaging of data, a nonlinear generating model, and the presence of individual differences.
Abstract: The power law (y = ax^(-b)) has been shown to provide a good description of data collected in a wide range of fields in psychology. R. B. Anderson and Tweney (1997) suggested that the model’s data-fitting success may in part be artifactual, caused by a number of factors, one of which is the use of improper data averaging methods. The present paper follows up on their work and explains causes of the power law artifact. A method for studying the geometric relations among responses generated by mathematical models is introduced that shows the artifact is a result of the combined contributions of three factors: arithmetic averaging of data, a nonlinear generating model, and the presence of individual differences.
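The averaging artifact is easy to reproduce numerically: in the toy simulation below (parameter ranges are arbitrary assumptions), every simulated subject follows an exact power law, yet the arithmetic group average has a drifting log-log slope and is therefore no longer a power function:

```python
import numpy as np

rng = np.random.default_rng(3)
x = np.array([1.0, 10.0, 100.0])

# Every simulated subject follows an exact power law y = a * x**(-b),
# with individual differences in both parameters.
a = rng.uniform(0.5, 2.0, size=200)
b = rng.uniform(0.2, 1.5, size=200)
y = a[:, None] * x[None, :] ** (-b[:, None])

y_avg = y.mean(axis=0)   # arithmetic group average

# A true power function has a constant slope in log-log coordinates;
# the arithmetic average of power functions does not.
slopes = np.diff(np.log(y_avg)) / np.diff(np.log(x))
```

The average curve flattens at large x because subjects with small exponents come to dominate the mean, which is exactly the kind of distortion the paper analyzes.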

122 citations


Journal ArticleDOI
TL;DR: Artifacts and their sources arising in three-dimensional ultrasound (3D US) in clinical practice are investigated, identified, and discussed in order to increase the awareness of clinicians and sonographers with respect to common 3D US artifacts, and to use this increased awareness to avoid or reduce the occurrence of misdiagnosis in 3D US studies.
Abstract: The purpose of this paper is to investigate, identify and discuss artifacts and their sources arising in three-dimensional ultrasound (3D US) in clinical practice in order to increase the awareness of clinicians and sonographers with respect to common 3D US artifacts and to use this increased awareness to avoid or reduce the occurrence of misdiagnosis in 3D US studies. Patient 3D US data were acquired using several different scanners and reviewed interactively on the scanner and graphics workstations. Artifacts were catalogued according to artifact origin. Two-dimensional ultrasound (2D US) artifacts were classified whether they were of a B-mode or color/power Doppler origin and their presentation in the original scan planes and the resulting volume re-sliced planes and rendered images was identified. Artifacts unique to 3D US were observed, noted and catalogued on the basis of whether they arose during acquisition, rendering or volume editing operations. Acoustic artifacts identified included drop-out, shadowing, etc. whose presentation depended on the relationship between slice and imaging plane orientation. Color/power Doppler artifacts were related to gain, aliasing, and flash which could add apparent structure or confusion to the volume images. Rendered images also demonstrated artifacts due to shadowing and motion of adjacent structures, cardiac motion or pulsatility of the cardiac septum or vessel walls. Editing artifacts potentially removed important structures. Three-dimensional ultrasound is prone to the same types of artifacts encountered in 2D US imaging plus others unique to volume acquisition and visualization. The consequences of these diagnostically significant artifacts include mimicking of abnormal development, masses, or missing structures thus requiring careful study before reaching a diagnosis.

117 citations


Journal ArticleDOI
TL;DR: A list of bibliographic references on Early Neolithic sites in the Jordan Valley, including the Khiamian, Ghwair 1, Gilgal, and Netiv Hagdud.
Abstract: … lithic village in the southern Jordan Valley. Journal of Field Archaeology 25:153–61. Nadel, D. 1990. The Khiamian as a case of Sultanian intersite variability. Journal of the Israel Prehistoric Society 23:86–99. Simmons, A. H., and M. al-Najjar. 1996. Current investigation at Ghwair 1, a Neolithic settlement in southern Jordan. Neolithics 2(96):6–7. Tchernov, E. 1980. The faunal remains from the Gilgal site. Israel Exploration Journal 30:73–82. ———. 1994. An Early Neolithic village in the Jordan Valley. Pt. 2. The fauna of Netiv Hagdud. Peabody Museum, Harvard University, American School of Prehistoric Research Bulletin 44.

115 citations


Proceedings ArticleDOI
24 Sep 2000
TL;DR: A portable ECG recorder is designed and implemented using 120 Hz impedance-based motion artifact removal to investigate the utility of motion artifact removal for Holter and stress recording.
Abstract: Motion artifact is a significant source of noise in an ambulatory ECG monitor and can occur frequently during Holter recording and stress testing. We have investigated methods for removing motion artifact from ECG signals using adaptive noise removal. We have investigated both the use of an electrode/skin impedance signal and a signal from a physical sensor (Measurand Shape Sensor (TM)) mounted on the electrode for adaptively modeling and removing motion artifact. The skin/electrode signal and physical sensor signals can both be used to produce equivalent noise reduction, but, in addition to requiring the sensors, higher order adaptive filters (5th order vs. 3rd order) were required when the physical sensor signal was used. Guided by the results of these tests, we have designed and implemented a portable ECG recorder using 120 Hz impedance-based motion artifact removal to investigate the utility of motion artifact removal for Holter and stress recording.
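Adaptive noise removal of this kind is commonly implemented with an LMS filter; the sketch below is our own construction, not the authors' recorder: the motion artifact is modeled as an impedance-like reference passed through an unknown 3rd-order channel, and a 3rd-order adaptive filter cancels it:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 20000
ecg = np.sin(2 * np.pi * np.arange(n) / 250)   # toy cardiac signal
ref = rng.normal(size=n)                       # impedance-like motion reference
h = np.array([0.5, -0.3, 0.2])                 # unknown 3rd-order artifact channel
artifact = np.convolve(ref, h)[:n]
measured = ecg + artifact

# LMS adaptive filter: model the artifact from the reference and subtract it.
order, mu = 3, 0.005
w = np.zeros(order)
out = np.zeros(n)
for i in range(order - 1, n):
    u = ref[i - order + 1:i + 1][::-1]   # most recent reference samples
    e = measured[i] - w @ u              # error = artifact-cancelled output
    w += 2 * mu * e * u                  # stochastic-gradient weight update
    out[i] = e
```

Because the cardiac signal is uncorrelated with the reference, the filter converges so that the error output retains the ECG while the correlated artifact is removed.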

Journal ArticleDOI
TL;DR: Whether the artifacts presented by precordial compressions during cardiopulmonary resuscitation could be removed from the human electrocardiogram (ECG) using a filtering approach is assessed and the success of the proposed method is demonstrated through graphic examples, SNR, and rhythm classification evaluations.
Abstract: The purpose of this study was to assess whether the artifacts presented by precordial compressions during cardiopulmonary resuscitation could be removed from the human electrocardiogram (ECG) using a filtering approach. This would allow analysis and defibrillator charging during ongoing precordial compressions yielding a very important clinical improvement to the treatment of cardiac arrest patients. In this investigation the authors started with noise-free human ECGs with ventricular fibrillation (VF) and ventricular tachycardia (VT) records. To simulate a realistic resuscitation situation, they added a weighted artifact signal to the human ECG, where the weight factor was chosen to provide the desired signal-to-noise ratio (SNR) level. As artifact signals the authors used ECGs recorded from animals in asystole during precordial compressions at rates 60, 90, and 120 compressions/min. The compression depth and the thorax impedance was also recorded. In a real-life situation such reference signals are available and, using an adaptive multichannel Wiener filter, the authors construct an estimate of the artifact signal, which subsequently can be subtracted from the noisy human ECG signal. The success of the proposed method is demonstrated through graphic examples, SNR, and rhythm classification evaluations.
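A zero-lag, batch version of the reference-channel idea can be written as ordinary least squares (the authors' filter is an adaptive multichannel Wiener filter with memory; the synthetic signals and coefficients below are stand-ins):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4000
vf = rng.normal(size=n)   # stand-in for the underlying human VF waveform
phase = 2 * np.pi * 1.5 * np.arange(n) / 500
depth = np.sin(phase)             # compression-depth reference signal
imped = np.cos(phase + 0.7)       # thorax-impedance reference signal
artifact = 2.0 * depth + 1.2 * imped
noisy = vf + artifact

# Zero-lag multichannel least-squares (Wiener-like) solution: model the
# artifact as a linear combination of the reference channels and subtract.
R = np.c_[depth, imped]
coef, *_ = np.linalg.lstsq(R, noisy, rcond=None)
cleaned = noisy - R @ coef
```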


Journal ArticleDOI
TL;DR: The high true classification rate reported previously is believed to be an artifact arising from erroneous data preparation and off-line validation, and is unlikely to be achievable with current computer technology.

Journal ArticleDOI
TL;DR: Artifact is a common finding in patients requiring evaluation and monitoring in the prehospital, emergency department, or intensive care unit settings and may also produce electrocardiographic signals which mimic disease.
Abstract: Electrocardiographic artifact is a common finding in patients requiring evaluation and monitoring in the prehospital, emergency department, or intensive care unit settings. Artifact results from both internal (physiological) and external (nonphysiological) sources. In most instances, artifact is recognized as an incorrect electrocardiographic signal, its only impact being interference with electrocardiogram interpretation; artifact may also produce electrocardiographic signals which mimic disease, and these signals the physician must recognize as artifact.

Journal ArticleDOI
TL;DR: The results are very promising, indicating that integration of multiple signals by applying a classification system to sets of values derived from physiologic data streams may be a viable approach to detecting artifacts in neonatal ICU data.

Patent
18 Oct 2000
TL;DR: In this paper, a process for masking a scanning artifact within image data representing a document is described: pixel classification tags are generated, a window associated with the scanning artifact is identified using the tags, and the affected image data are replaced with a derived replacement video value.
Abstract: A first aspect of the present invention is a process for masking a scanning artifact within image data representing a document. The process includes generating pixel classification tags for the image data; identifying a window within the image data associated with a scanning artifact using the pixel classification tags; analyzing the image data to derive a replacement video value; and replacing image data associated with a scanning artifact with the replacement video value.
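The four steps of the claimed process might be sketched as follows (the darkness-based tag classifier, the column-window heuristic, and the median replacement value are all illustrative assumptions, not the patent's actual logic):

```python
import numpy as np

# Toy 8-bit "document" scan with a dark vertical streak as the scanning artifact.
img = np.full((6, 8), 240, dtype=np.uint8)
img[:, 3] = 40

# (1) Pixel classification tags (a simple darkness test stands in for the
# patent's classifier).
tags = img < 128

# (2) Window associated with the artifact: columns where most pixels are tagged.
artifact_cols = tags.mean(axis=0) > 0.5

# (3) Replacement video value derived from the untagged image data.
replacement = int(np.median(img[:, ~artifact_cols]))

# (4) Replace the image data associated with the artifact.
img[:, artifact_cols] = replacement
```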

Journal ArticleDOI
TL;DR: Nuclear magnetic resonance spectroscopy is a useful analytical tool for the examination of archaeological artifacts: it can identify sources of raw materials, verify artifact authenticity, delineate ancient technology, and specify ancient diet.
Abstract: Nuclear magnetic resonance spectroscopy is a useful analytical tool for the examination of archaeological artifacts. Both organic and inorganic materials have been examined in solution and in the solid. NMR can identify sources of raw materials, verify artifact authenticity, delineate ancient technology, and specify ancient diet.

Journal Article
TL;DR: The increase in the number of projections is likely a necessary but not a sufficient condition to reduce the streak artifact: if not corrected, the attenuation could be a limiting factor in the removal of this artifact when the number of projections increases.
Abstract: Because of the limited number of projections, the mathematic reconstruction formula of the filtered backprojection (FBP) algorithm may create an artifact that streaks reconstructed images. This artifact can be imperfectly removed by replacing the ramp filter of the FBP with an ad hoc low-pass filter, the cost being the loss of contrast and definition. In this study, a solution was proposed to increase, by computational means, the number of projections to reduce the artifact at a lower cost. The cost was a postacquisition process, which was reasonably time consuming. Methods: The process was called interpolation of projections by contouring (IPC). First, level lines were plotted on the sinogram to delimit isocount regions; then, the regions containing the interpolated points were found, and to each point was assigned the intensity of its isocount region. Using this process, the data could be resampled, allowing an increase in the number of projections or the number of pixels by projections. A phantom study of bone scintigraphy was performed to compare the slices obtained with and without the IPC process with the true image. A clinical case was also presented. Results: The phantom study showed that with the IPC process, the reconstructed slice was closer to the model, inside and outside the body, when the sinogram was resampled to multiply by 2 or 3 the number of projections, with the same number of pixels per projection. In the clinical study, the streak artifact was reduced, especially outside the body, although only a ramp filter was used. Conclusion: The IPC process succeeded in reducing the streak artifact. This process did not require any modification in acquisition and was not operator dependent. 
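The simplest way to increase the number of projections by computational means is linear interpolation in the angular direction; the sketch below doubles the projection count of a synthetic sinogram (the IPC method itself interpolates via isocount contours on the sinogram, which is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(6)
n_angles, n_bins = 32, 64
sinogram = rng.random((n_angles, n_bins))   # one row per measured projection

# Double the number of projections: between each pair of measured angles,
# insert the average of its neighbours (linear interpolation in angle; the
# wrap-around via roll assumes a full 360-degree acquisition).
mid = 0.5 * (sinogram + np.roll(sinogram, -1, axis=0))
resampled = np.empty((2 * n_angles, n_bins))
resampled[0::2] = sinogram
resampled[1::2] = mid
```

The resampled sinogram can then be fed to an unmodified FBP reconstruction, which is the appeal of post-acquisition approaches of this kind.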

Journal ArticleDOI
TL;DR: Analysis of random and systematic errors in nominal and ratio data recorded by observers on stone artifacts as part of a distributional study indicates where care must be taken in analyzing artifact variation as a reflection of past human behavior.
Abstract: Variation in artifact recording introduced through the use of multiple observers is common in many archaeological projects. We report a study designed to assess random and systematic errors in nominal and ratio data recorded by observers on stone artifacts as part of a distributional study. A random sample of artifacts was selected and double analyzed, once by the regular observers and once by the project director. Random and systematic differences between the two sets of observations are assessed statistically. Analysis of these errors either permits corrections to be applied or indicates where care must be taken in analyzing artifact variation as a reflection of past human behavior.
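The double-analysis design separates systematic from random inter-observer error on the ratio-scale variables; a toy version (simulated measurements, with an assumed observer bias and noise level) is:

```python
import numpy as np

rng = np.random.default_rng(7)
true_len = rng.uniform(20, 60, size=100)   # "true" artifact measurements (mm)

# Double analysis: each artifact measured by a regular observer and by the
# project director; the observer has a bias and larger random error.
observer = true_len + rng.normal(1.5, 2.0, size=100)
director = true_len + rng.normal(0.0, 1.0, size=100)

diff = observer - director
systematic = diff.mean()        # systematic (correctable) error
random_err = diff.std(ddof=1)   # random error across artifacts

# The systematic component can be corrected for:
observer_corrected = observer - systematic
```

The random component cannot be removed after the fact; it only indicates how much caution is needed when interpreting artifact variation.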

Proceedings ArticleDOI
15 Oct 2000
TL;DR: A two-pass cone beam reconstruction scheme is proposed, based on the observation that, to a first order approximation, the high-density object reconstructed with the Feldkamp algorithm is accurate.
Abstract: Cone beam reconstruction has been the focus of many studies. One of the most widely referenced and used algorithms for a circular trajectory is the Feldkamp algorithm. The advantage of the algorithm is its simplicity of implementation, efficiency in computation, and close resemblance to the well-known filtered backprojection algorithm for fan beam and parallel beam reconstruction. The algorithm is effective in terms of combating some of the cone beam artifacts. However, when high-density objects are placed off the center plane (the fan beam plane), severe shading artifact will result. In the paper, we propose a two-pass cone beam reconstruction scheme. The algorithm is based on the observation that to a first order approximation, the high-density object reconstructed with the Feldkamp algorithm is accurate. The shading and streaking artifacts near the high-density objects are caused mainly by the incomplete sampling of the circular trajectory. Therefore, we can use the reconstructed images with Feldkamp algorithm as the basis for error estimation. The final images are produced by removing error images from the first-pass images.


PatentDOI
G.C. Ng, James Jago
TL;DR: In this paper, a method for reducing the flash artifacts in ultrasonic harmonic images is described, where the amount of motion in the image is detected and the flash artifact is reduced in accordance with the detected motion.
Abstract: Ultrasonic imaging apparatus and method are described for reducing the flash artifact in ultrasonic harmonic images. Harmonic signals are separated by pulse inversion separation which uses multiple transmit pulses which may be subject to motion artifacts. The motion artifacts are detected and subtracted from the harmonic signals to produce harmonic images with reduced flash artifacts. The motion artifacts may also be reduced by notch filtering. In another embodiment the amount of motion in the image is detected and the flash artifact is reduced in accordance with the detected motion. The amount of artifact signal which is removed is variable in accordance with anticipated image motion or clinical application.


Journal ArticleDOI
TL;DR: Computer simulation results agreed well with observed production of water saturation by means of nominal fat suppression in MR imaging of phantoms and a representative clinical example.
Abstract: Artifactual water signal intensity loss can be observed on fat-saturation magnetic resonance (MR) images of inhomogeneous regions such as the thorax. Magnetic effects of air inclusions on fat-saturation pulses were investigated as the possible origin of this artifact. Computer simulation results agreed well with observed production of water saturation by means of nominal fat suppression in MR imaging of phantoms and a representative clinical example.


Patent
28 Sep 2000
TL;DR: In this paper, the authors detect comb artifacts by comparing the differences in pixel values between adjacent rows with the differences in pixel values of alternate rows, calculating comb artifact factors based on these differences, and comparing the median value for a group of pixels with a threshold to determine if there is a comb artifact at the pixel.
Abstract: Correcting deinterlaced video by determining whether the deinterlaced video has comb artifact areas, and correcting the comb artifact areas. The detection of comb artifacts includes comparing the differences in pixel values between adjacent rows with the differences in pixel values of alternate rows. The detection includes calculating comb artifact factors based on the differences, and comparing the median value for a group of pixels with a threshold to determine if there is a comb artifact at the pixel.
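The adjacent-versus-alternate-row comparison can be sketched directly in NumPy (the factor definition, threshold, and neighbourhood choices below are arbitrary illustrations, not the patent's values):

```python
import numpy as np

def comb_factor(img):
    """Per-pixel comb factor: adjacent-row differences minus the
    alternate-row (same-field) difference; large values indicate combing."""
    adjacent = np.abs(img[1:-1] - img[:-2]) + np.abs(img[1:-1] - img[2:])
    alternate = np.abs(img[2:] - img[:-2])
    return adjacent - alternate

# Interlace-style comb: rows alternate between two displaced fields.
comb = np.zeros((10, 10))
comb[0::2] = 100.0
smooth = np.full((10, 10), 50.0)

# Compare the median factor over a pixel neighbourhood with a threshold.
is_comb = np.median(comb_factor(comb)) > 50
is_smooth_comb = np.median(comb_factor(smooth)) > 50
```

Using alternate rows as the baseline keeps genuine vertical detail (present in both fields) from being mistaken for combing.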

Proceedings ArticleDOI
02 Jun 2000
TL;DR: An experiment is designed to measure the actual detection threshold of some MPEG-2 artifacts in typical video sequences, and a perceptually weighted error metric is computed and compared to the output of a commercial fidelity metric.
Abstract: Many approaches have been proposed recently for estimating the perceived fidelity of digitally compressed and reconstructed video clips. Several metrics rely on the accurate estimation of detection thresholds, basically assuming that perceived quality depends significantly on the error threshold. To test that assumption, we designed an experiment to measure the actual detection threshold of some MPEG-2 artifacts in typical video sequences. In each of the test clips, we briefly replaced a region with a test stimulus created using an MPEG-2 encoder. The location, time, and strength of the artifact were varied between test videos. At the end of each clip, each subject provided a detection response and, for the detected artifacts, location and annoyance information. From the data, we determined the detection threshold for each artifact. Using the thresholds, we computed a perceptually weighted error metric. In this paper, we describe the experiment in more detail, summarize the experimental results including the threshold calculations, and compare the weighted error measure to the output of a commercial fidelity metric. Finally, we discuss our conclusions on the validity of predicting quality from threshold measurements.

Journal ArticleDOI
TL;DR: Examination of the posterior fossa by magnetic resonance imaging is discussed with respect to modern techniques and equipment, including recent results of non-conventional studies in multiple sclerosis.

Journal ArticleDOI
TL;DR: The correction method was applied to velocity data along a streamline parallel to the frequency encoding direction and the result after correction was a new location of the peak velocity and improved estimates of the velocity gradients.
Abstract: The acceleration-induced displacement artifact impairs the accuracy of MR velocity measurements. This study proposes a post processing method for correction of this artifact. Velocity measurements were performed in a flow phantom containing a constriction. Velocity curves were obtained from streamlines parallel to the frequency, phase, and slice directions, respectively. The acceleration-induced displacement artifact was most prominent when the frequency encoding direction was aligned with the flow direction. After correction, velocity assignment improved and a more accurate description of the flow was obtained. In vivo measurements were performed in the aorta in a patient with a repaired aortic coarctation. The correction method was applied to velocity data along a streamline parallel to the frequency encoding direction. The result after correction was a new location of the peak velocity and improved estimates of the velocity gradients.