
Showing papers in "IEEE Engineering in Medicine and Biology Magazine in 1996"


Journal Article
TL;DR: In this paper, the authors show how proper choice of sampling density for a charge-coupled device (CCD) camera, together with procedures based on the coefficient of variation (CV), can be used to evaluate the quality of quantitative microscope systems and to identify which components are the "weakest link".
Abstract: While light microscopy is almost 400 years old, developments of the past decade have offered a variety of new mechanisms for examination of biological and material samples. These developments include exploitation of techniques such as confocal microscopy, scanning near field microscopy, standing wave microscopy, fluorescence lifetime microscopy, and two-photon microscopy. In biology, advances in molecular biology and biochemistry have made it possible to selectively tag (and thus make visible) specific parts of cells, such as actin molecules, or sequences of DNA of 1000 base pairs or longer. In sensor technology, modern charge-coupled device (CCD) cameras are capable of achieving high spatial resolution and high sensitivity measurements of signals in the optical microscope. Modern CCD camera systems are limited by the fundamental quantum fluctuations of photons, which cannot be eliminated by "better" design. Further, proper choice of the sampling density involves not only an understanding of classic linear system theory-the Nyquist theorem-but also the equally stringent requirements of digital measurement theory. Experimental procedures that rely on the CV can be used to evaluate the quality of one's quantitative microscope systems and to identify which components are the "weakest link". Typical values of relatively straightforward parameters such as size can easily be measured to CVs around 1%.
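The coefficient of variation (CV) referred to above is simply the standard deviation of repeated measurements divided by their mean. As an illustration only (the measurement values and protocol below are hypothetical, not taken from the article), a minimal sketch:

```python
import numpy as np

def coefficient_of_variation(measurements):
    """CV = standard deviation / mean, expressed as a percentage."""
    values = np.asarray(measurements, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical repeated size measurements (micrometers) of the same calibration bead
sizes_um = [10.02, 9.98, 10.05, 9.95, 10.01, 10.03, 9.97, 10.00]
print(f"CV = {coefficient_of_variation(sizes_um):.2f}%")
```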

668 citations


Journal ArticleDOI
TL;DR: Approaches that have been attempted in the development of 3-D ultrasound imaging, such as 3-D B-mode, color Doppler, and power Doppler systems, are reviewed.
Abstract: The development of 3-D ultrasound imaging is a way to address the disadvantages of conventional ultrasound imaging. In this article the authors review approaches that have been attempted in the development of 3-D ultrasound imaging such as 3-D B-mode, color Doppler, and power Doppler systems. Acquisition, reconstruction, and rendering techniques for 3-D imaging are discussed, as well as applications and limitations.

428 citations


Journal ArticleDOI
TL;DR: The authors introduce the basic principles of high-frequency ultrasound imaging and discuss six applications of this new technology: eye imaging, skin imaging, catheter-based intravascular imaging, intra-articular imaging, high-frequency flow imaging, and in-vivo imaging of mouse embryonic development.
Abstract: Most medical ultrasound imaging systems operate in the frequency range from 3 to 10 MHz and can resolve objects approximately 1 mm in size. In the mid 1980s, new transducer materials led to the development of the first transducers suitable for high-frequency (30-100 MHz) clinical imaging. These high-frequency transducers can provide images of subsurface structures with microscopic resolution. In this article, the authors introduce the basic principles of high-frequency ultrasound imaging and discuss six applications of this new technology: eye imaging, skin imaging, catheter-based intravascular imaging, intra-articular imaging, high-frequency flow imaging, and in-vivo imaging of mouse embryonic development. These examples illustrate a few of the potential applications of high-frequency ultrasound in medicine and biology.
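The link between frequency and resolution follows from the wavelength of sound in soft tissue, lambda = c/f with c of roughly 1540 m/s; achievable resolution is on the order of one to a few wavelengths. A minimal sketch (the assumed sound speed and the list of frequencies are illustrative, not values from the article):

```python
# Wavelength in soft tissue for conventional vs. high-frequency transducers.
c = 1540.0  # assumed speed of sound in soft tissue, m/s
for f_mhz in (3, 10, 30, 100):
    wavelength_um = c / (f_mhz * 1e6) * 1e6
    print(f"{f_mhz:4d} MHz -> wavelength ~ {wavelength_um:6.1f} um")
```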

275 citations


Journal ArticleDOI
TL;DR: In this article, the authors reviewed past achievements and current developments in the technology, including piezoelectric materials, transducer/array fabrication and design, and modeling.
Abstract: Ultrasonic imaging is one of the most important and still growing diagnostic tools in use today. To better understand transducer/array performance and some of the factors that prevent ultrasonic imagers from reaching higher resolution, this article reviews past achievements and current developments in the technology, including piezoelectric materials, transducer/array fabrication and design, and modeling. It is concluded that the array or transducer is a crucial part of an ultrasonic imaging system. Although much progress has been made in recent years to improve performance, it is still a limiting factor in preventing ultrasonic imaging systems from reaching their theoretical resolution. Investigations into novel piezoelectric materials, array stack architecture design, and modeling are being pursued. In the future, it is likely that multidimensional arrays will gradually replace linear arrays as the industry standard.

232 citations


Journal Article
TL;DR: It is concluded that Doppler ultrasound has progressed over the last 30 years from a simple audio signal to a predominantly subjective image format that still requires an operator and/or interpreter skilled in the art.
Abstract: The use of Doppler today ranges from assessing blood flow in the fetus and umbilical cord, to flow patterns through valves in the heart or monitoring of blood flow to the brain. This article looks at how the Doppler effect is applied in commercial systems, its clinical uses, and research developments. It is concluded that Doppler ultrasound has progressed over the last 30 years from a simple audio signal to a predominantly subjective image format that still requires an operator and/or interpreter skilled in the art. The improvements in velocity estimation methods and gradual changes in the use of Doppler information suggest that this diagnostic modality will continue to evolve. The most likely directions for this evolution appear to be physiological quantification and reduction in dependence on the user through automation of both the system parameters and measurements.
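For reference, commercial systems estimate velocity from the classic Doppler shift relation f_d = 2 f0 v cos(theta) / c. A minimal sketch with illustrative numbers (the 5 MHz carrier, 0.5 m/s flow speed, and 60-degree insonation angle are assumptions, not values from the article):

```python
import math

def doppler_shift_hz(f0_hz, velocity_m_s, angle_deg, c_m_s=1540.0):
    """Classic Doppler shift: f_d = 2 * f0 * v * cos(theta) / c."""
    return 2.0 * f0_hz * velocity_m_s * math.cos(math.radians(angle_deg)) / c_m_s

# ~1.6 kHz shift: conveniently in the audible range, hence the original audio-only systems
print(f"{doppler_shift_hz(5e6, 0.5, 60):.0f} Hz")
```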

221 citations


Journal ArticleDOI
TL;DR: The authors compared 6 color segmentation methods and their effectiveness as part of an overall border-finding algorithm and found the PCT/median cut and adaptive thresholding algorithms provided the lowest average error and show the most promise for further individual algorithm development.
Abstract: The images used in this research were digitized from 35mm color photographic slides obtained from a private dermatology practice and from New York University. The authors compared 6 color segmentation methods and their effectiveness as part of an overall border-finding algorithm. The PCT/median cut and adaptive thresholding algorithms provided the lowest average error and show the most promise for further individual algorithm development. Combining the different methods resulted in further improvement in the number of correctly identified tumor borders, and by incorporating additional heuristics in merging the segmented object information, one could potentially further increase the success rate. The algorithm is broad-based and suggests several areas for further research. One possible area of exploration is to incorporate an intelligent decision making process as to the number of colors that should be used for segmentation in the PCT/median cut and adaptive thresholding algorithms. For comparison purposes, the number of colors was kept constant at three in the authors' application. Other areas that can be explored are noise removal and object classification to determine the correct tumor object.
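As background on the PCT step only, the sketch below is a generic principal component transform of RGB pixels, not the authors' exact implementation or their median-cut stage:

```python
import numpy as np

def principal_component_transform(rgb_image):
    """Project RGB pixels onto their principal axes (PCT / Karhunen-Loeve transform)."""
    pixels = rgb_image.reshape(-1, 3).astype(float)
    centered = pixels - pixels.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)     # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]          # largest-variance axis first
    return (centered @ eigvecs[:, order]).reshape(rgb_image.shape)

# The first PCT channel carries most of the image variance and is a common input
# to median-cut quantization or thresholding when locating lesion borders.
```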

204 citations


Journal ArticleDOI
TL;DR: It appears that measuring the degree of variability is a more useful measure of chaos, as demonstrated by the application of this work to the analysis of congestive heart failure patients as compared to normal controls.
Abstract: In the last decade, chaos theory has become a popular method for approaching the analysis of nonlinear data for which most mathematical models produce intractable solutions. The concept of chaos was first introduced with applications in meteorology. Since then, considerable work has been done in the theoretical aspects of chaos. Applications have abounded, especially in medicine and biology. A particularly active area for the application of chaos theory has been cardiology. Many aspects of heart disease have been addressed, including whether chaos represents the healthy or diseased state. Most approaches to chaotic modeling rely on discrete models of continuous problems, which are represented by computer algorithms. Due to the nature of chaotic models, both the discretization and the computer simulation can lead to propagation of errors that may overtake the actual solution. This article describes an approach to chaotic modeling in which a continuous model is developed based on a conjectured solution to the logistic equation. As a result of this approach, two practical methods for quantifying variability in data sets have been derived. The first is a graphical representation obtained by using second-order difference plots of time series data. The second is a central tendency measure (CTM) that quantifies this degree of variability. The CTM can then be used as a parameter in decision models, such as neural networks. It appears that measuring the degree of variability is a more useful measure of chaos, as demonstrated by the application of this work to the analysis of congestive heart failure patients as compared to normal controls.
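A minimal sketch of the two quantities described above, assuming the CTM is taken as the fraction of second-order difference points that fall within a chosen radius of the origin; the radius and the synthetic RR-interval series are illustrative, not the authors' data:

```python
import numpy as np

def central_tendency_measure(x, radius):
    """Fraction of second-order difference points lying within `radius` of the origin.

    Second-order difference plot: (x[n+1] - x[n]) versus (x[n+2] - x[n+1]).
    """
    x = np.asarray(x, dtype=float)
    d1 = x[1:-1] - x[:-2]
    d2 = x[2:] - x[1:-1]
    return float(np.mean(np.hypot(d1, d2) < radius))

# A series with low beat-to-beat variability yields a CTM near 1
rr = np.random.normal(0.8, 0.02, 500)
print(central_tendency_measure(rr, radius=0.1))
```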

179 citations


Journal ArticleDOI
TL;DR: In this article, the authors review imaging techniques for assessing the elastic properties of tissue, briefly cover the fundamental techniques in this area, and then present elastography techniques developed by current research groups.
Abstract: In this article the authors review imaging techniques for assessing the elastic properties of tissue. They briefly look at the fundamental techniques in this area and then present elastography techniques developed by current research groups. The authors' review is focused on published papers in the archival literature.

148 citations


Journal ArticleDOI
TL;DR: The weighted-frequency Fourier linear combiner (WFLC) is an adaptive noise canceller that precisely models tremor with zero phase lag; this article presents the WFLC algorithm and describes its application to computer input filtering, clinical tremor quantification, and active tremor cancellation for microsurgery.
Abstract: Zero-phase modeling and canceling of tremor can improve precision in human-machine control applications. Past methods of tremor suppression have been hindered by feedback delays due to phase lag and by the inability to track tremor frequency over time. The weighted-frequency Fourier linear combiner (WFLC) is an adaptive noise canceller that precisely models tremor with zero phase lag. This article briefly presents the WFLC algorithm and describes its application to computer input filtering, clinical tremor quantification, and active tremor canceling for microsurgery.
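A minimal sketch of the WFLC update as it is commonly described in the adaptive-filtering literature; the harmonic count, step sizes, and initial frequency below are illustrative assumptions, not the article's settings:

```python
import numpy as np

def wflc(signal, mu_freq=1e-5, mu_amp=0.01, n_harm=1, w0_init=2 * np.pi * 8 / 1000):
    """Minimal weighted-frequency Fourier linear combiner (WFLC) sketch.

    Adapts both the fundamental frequency w0 (rad/sample; initialized here for an
    assumed 8 Hz tremor at 1 kHz sampling) and the harmonic amplitudes, producing a
    zero-phase-lag estimate of the tremor component.
    """
    w0 = w0_init
    phase = 0.0
    r = np.arange(1, n_harm + 1)
    weights = np.zeros(2 * n_harm)
    estimate = np.zeros(len(signal))
    for k, s in enumerate(signal):
        phase += w0
        x = np.concatenate([np.sin(r * phase), np.cos(r * phase)])
        y = weights @ x                      # current tremor estimate
        err = s - y
        # frequency update (LMS on the fundamental)
        w0 += 2 * mu_freq * err * np.sum(
            r * (weights[:n_harm] * x[n_harm:] - weights[n_harm:] * x[:n_harm]))
        # amplitude update (standard LMS)
        weights += 2 * mu_amp * err * x
        estimate[k] = y
    return estimate
```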

130 citations


Journal ArticleDOI
TL;DR: In this article, the authors present the theoretical considerations that justify the choice of specific time-frequency transforms for processing nonstationary myoelectric signals as a method of studying fatigue prior to the failure point.
Abstract: This article presents the theoretical considerations that justify the choice of specific time-frequency transforms for processing nonstationary myoelectric signals as a method of studying fatigue prior to the failure point. It shows some preliminary results obtained by applying these techniques to computer-synthesized realizations of stochastic processes, as well as to real signals detected during different types of dynamic contractions of healthy human volunteers. Five different time-frequency transforms were applied in this study (the Wigner-Ville, the smoothed Wigner-Ville, the Cone kernel, the reduced interference, and the Choi-Williams), but for the sake of brevity, this article reports only the results obtained by applying the Choi-Williams transform, because the authors found it to be the most suitable for processing these specific signals.
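For reference, the Choi-Williams (exponential-kernel) distribution used above is commonly written in its time-lag form as follows; the parameter sigma trades cross-term suppression against time-frequency resolution. This is the textbook definition, not a detail taken from the article:

```latex
% Choi-Williams distribution, common time-lag form
CW_x(t,\omega) = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty}
  \sqrt{\frac{\sigma}{4\pi\tau^{2}}}\,
  \exp\!\left(-\frac{\sigma (u-t)^{2}}{4\tau^{2}}\right)
  x\!\left(u+\tfrac{\tau}{2}\right)\, x^{*}\!\left(u-\tfrac{\tau}{2}\right)
  e^{-j\omega\tau}\, du\, d\tau
```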

120 citations


Journal ArticleDOI
TL;DR: The concept of wavelet networks as a means of biosignal classification is presented, along with an example in which this approach was used to classify preprocessed ECG signals to identify patients at high risk of developing ventricular tachycardia (VT).
Abstract: In recent years, a particular challenge has arisen in noninvasive medical diagnostic procedures. Because biosignals recorded on the body surface reflect the internal behaviour and status of the organism or its parts, they are ideally suited to provide essential information about these organs to the clinician without any invasive measures. But how are the recorded time courses of the signals to be interpreted with regard to a diagnostic decision? What are the essential features, and in what code is the information hidden in the signals? These questions are typical of so-called pattern-recognition tasks. This article reviews pattern recognition as it applies to medical diagnostics and discusses the concept of wavelet networks as a means of biosignal classification. An example is presented in which this approach was used for classifying preprocessed ECG signals to identify patients who were at high risk of developing ventricular tachycardia (VT).

Journal ArticleDOI
TL;DR: Some of the published ECG enhancement techniques for overcoming these noise problems are reviewed, their performance on stress ECG signals under adverse noise scenarios is compared, and the filter-bank-based ECG enhancement algorithm is described.
Abstract: There are two predominant types of noise that contaminate the electrocardiogram (ECG) acquired during a stress test: baseline wander (BW) noise together with electrode motion artifact, and electromyogram-induced (EMG) noise. BW noise is at a lower frequency, caused by respiration and motion of the subject or the leads. The frequency components of BW noise are usually below 0.5 Hz, and extend into the frequency range of the ST segment during a stress test. EMG noise, on the other hand, is predominantly at higher frequencies, caused by increased muscle activity and by mechanical forces acting on the electrodes. The frequency spectrum of the EMG noise overlaps that of the ECG signal and extends even higher in the frequency domain. In this article, the authors review some of the published ECG enhancement techniques to overcome these noise problems and compare their performance on stress ECG signals under adverse noise scenarios. They also describe a filter bank-based ECG enhancement algorithm.
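The filter-bank algorithm itself is not reproduced here; as context for the baseline-wander problem, a conventional zero-phase high-pass filter with a cutoff near 0.5 Hz is a common baseline reference. The cutoff, filter order, and sampling rate below are assumptions, and the abstract notes why such a fixed cutoff is delicate during stress tests (BW energy reaches into the ST-segment band):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def remove_baseline_wander(ecg, fs, cutoff_hz=0.5, order=4):
    """Zero-phase high-pass filter for baseline wander (components below ~0.5 Hz)."""
    b, a = butter(order, cutoff_hz / (fs / 2.0), btype="highpass")
    return filtfilt(b, a, ecg)   # forward-backward filtering avoids phase distortion

# Illustrative use on a synthetic trace sampled at 500 Hz
fs = 500
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.sin(2 * np.pi * 0.3 * t)  # "beats" + drift
clean = remove_baseline_wander(ecg, fs)
```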

Journal ArticleDOI
TL;DR: The proof of concept and initial DNA hybridization results indicate great potential for Nanogen's active device technology, and the multiplex hybridization array formats now being developed are directed at providing rapid simultaneous testing from a microliter volume of patient samples, with better sensitivity and specificity than current assays.
Abstract: Nanogen's approach to DNA analysis involves the development of a unique active microelectronic device that provides electronic control over a variety of molecular biological affinity reactions. The core technology is an automated programmable electronic matrix (APEX), which has the ability to transport, bind and separate charged molecules in an electric field generated on the surface of the device. This broad-based platform technology is potentially applicable for multiplexed DNA hybridizations, immunoassays, receptor binding assays, cell typing assays, enzyme assays, combinatorial synthesis of oligonucleotides and peptides, and nanoparticle manipulations. Nanogen's initial developmental focus is in the area of DNA probe diagnostics. Our proof of concept and initial DNA hybridization results indicate great potential for Nanogen's active device technology. The multiplex hybridization array formats now being developed are directed at providing rapid simultaneous testing from a microliter volume of patient samples, with better sensitivity and specificity than current assays. Nanogen active device technology offers a number of distinct advantages versus current DNA diagnostic technology and differentiates itself from other chip technologies in the following way: (1) electronic addressing transports DNA molecules by charge; (2) the electronic hybridization's concentration effect improves the DNA hybridization rate; and (3) electronic stringency control improves selectivity and discrimination of hybrids.

Journal ArticleDOI
TL;DR: The authors draw on point-process theory and wavelet analysis to examine the variability and correlation properties of spike trains from single neurons in the cat striate cortex, under conditions of spontaneous and stimulated (driven) firing.
Abstract: The authors draw on point-process theory and wavelet analysis to examine the variability and correlation properties of spike trains from single neurons in the cat striate cortex, under conditions of spontaneous and stimulated (driven) firing. It is not possible to infer the long-term correlation properties of a spike train from measures that reset at short times; thus, often-used spike-train measures, such as the interevent-interval histogram (IIH) and post-stimulus time (PST) histogram, cannot serve this purpose. Rather, the authors make use of the event-number histogram (ENH), also called the spike-number or spike-count distribution. This measure affords the experimenter the opportunity of externally controlling the counting time, T, and therefore the duration over which spike correlations can be viewed. A useful and relatively simple gauge of the correlation properties is obtained from the first two moments of the ENH. In particular, the Fano factor (FF), defined as the ratio of the spike-count variance to the spike-count mean, F(T) = var(N)/mean(N), plotted as a function of the counting time, T, serves this purpose quite well. The FF is a special case of the wavelet Fano factor (WFF) implemented using the Haar wavelet basis.
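A minimal sketch of estimating F(T) from a list of spike times over consecutive counting windows; the synthetic Poisson train is used only to show the expected F(T) of about 1 for an uncorrelated baseline, and is not data from the article:

```python
import numpy as np

def fano_factor(spike_times, T, duration):
    """F(T) = var(N) / mean(N), where N counts spikes in consecutive windows of length T."""
    edges = np.arange(0.0, duration, T)        # full windows only
    counts, _ = np.histogram(spike_times, bins=edges)
    return counts.var(ddof=1) / counts.mean()

# Homogeneous Poisson train (~20 spikes/s): F(T) stays near 1 at all counting times
rng = np.random.default_rng(0)
spikes = np.cumsum(rng.exponential(1 / 20.0, 2000))
for T in (0.05, 0.5, 5.0):
    print(T, fano_factor(spikes, T, spikes[-1]))
```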

Journal ArticleDOI
TL;DR: This article highlights the current state of ultrasound contrast imaging, including recent improvements in ultrasound contrast agents that allow the agents to pass through pulmonary circulation and even to circulate throughout the human body.
Abstract: This article highlights the current state of ultrasound contrast imaging, including recent improvements in ultrasound contrast agents that allow the agents to pass through pulmonary circulation and even to circulate throughout the human body. A brief history of the field is presented, and current contrast agents and their properties are discussed. Clinical applications in cardiology and potential applications in other areas are also detailed.

Journal ArticleDOI
TL;DR: Although original applications in virtual reality (VR) for medicine pertained to the planning of surgeries, efforts have now shifted to the use of data fusion, i.e. to fuse virtual patients onto real patients as a navigational aid in surgery.
Abstract: Although original applications in virtual reality (VR) for medicine pertained to the planning of surgeries, efforts have now shifted to the use of data fusion, i.e. to fuse virtual patients onto real patients as a navigational aid in surgery. Eventually, medical care with multiple professionals will be provided in a shared virtual environment that incorporates shared decision making for an actual surgical intervention or a rehearsal. The major applications of virtual reality in surgery can be divided into three areas: virtual humans for training, the fusion of virtual humans with real humans for performing surgery, and virtual telemedicine shared decision environments for training of multiple players. The applications pertaining to the realisation of virtual reality in medicine can be categorised into two areas: generic models and patient specific models.

Journal ArticleDOI
TL;DR: It is concluded that advances and contributions have been made with the introduction of two novel feature extraction methods for breast cancer diagnosis, wavelets and eigenmasses.
Abstract: This study focuses on improving microcalcification classification by establishing an efficient computer-aided diagnosis system that extracts Daubechies-4 and biorthogonal wavelet features. These wavelets were chosen because they have been used in military target recognition and fingerprint recognition research with images characterized by low contrast, similar to mammography. Feature selection techniques are employed to further increase classification performance. The artificial neural network feature selection techniques are complemented by a conventional decision boundary-based feature selection method. The results using the wavelet features are compared to more conventional measures of image texture, angular second moment, and Karhunen Loeve coefficients. The use of alternative signal processing to compare wavelet and neural techniques allows for a measure of the problem difficulty. It is concluded that advances and contributions have been made with the introduction of two novel feature extraction methods for breast cancer diagnosis, wavelets and eigenmasses. Additionally, feature selection techniques are demonstrated, compared, and validated, transforming adequate discrimination power into promising classification results.

Journal ArticleDOI
TL;DR: Much of what has been learned through studies using the most modern equipment and methodologies is in agreement with meridian theory dating from 100 B.C.E. and earlier.
Abstract: An overview of the electrodermal screening test is given. An existing model of the electrical properties of the skin has been the accepted scientific standard for decades. But this model is based entirely on mechanistic principles and it fails to explain many biological phenomena, particularly those relating to acupuncture points and meridians. The author has developed a new model which, unlike the standard model, includes an active biological response and the fact that the electricity passes through different types of tissue, not just skin. This model not only explains much of acupuncture phenomena, in general, but can also be used to explain all possible EDST readings. The author has reviewed studies of electrodermal properties with studies of qualities specific to meridians. The author discovered that meridians have higher conductance, faster electromagnetic wave propagation, and patterns of preferential direction. Because of these factors, the meridian system acts as a particularly good network for the communication of bioinformation and thus plays an essential role in biological function. It is very interesting that much of what has been learned through studies using the most modern equipment and methodologies is in agreement with meridian theory dating from 100 B.C.E. and earlier.

Journal ArticleDOI
TL;DR: A new ultrasonic image analysis system that can be utilized as an effective tool in classifying liver states as normal, hepatitis, or liver cirrhosis is proposed.
Abstract: The authors propose a new ultrasonic image analysis system that can be utilized as an effective tool in classifying liver states as normal, hepatitis, or liver cirrhosis. In this system, the authors first define suitable settings for the ultrasonic device, then remove the inhomogeneous structures from the area of interest in the image, and then, by using the forward sequential search method, look for the useful texture parameters from the co-occurrence matrix, the statistical feature matrix, the texture spectrum, and the fractal dimension descriptors. Finally, the selected parameters are fed into a probabilistic neural network for the classification of liver disease. Experimental results are presented that show the classification rate with and without the inclusion of the inhomogeneous structures.
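As an illustration of the co-occurrence-matrix stage only (not the authors' full feature set, which also includes the statistical feature matrix, texture spectrum, and fractal dimension, nor their device settings or selection procedure), a sketch using scikit-image; the quantization level, pixel offsets, and property list are assumptions:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # 'greycomatrix' in older scikit-image

def cooccurrence_features(roi, levels=32):
    """A few co-occurrence (GLCM) texture parameters from an 8-bit ultrasound ROI."""
    quantized = (roi.astype(float) / 256 * levels).astype(np.uint8)
    glcm = graycomatrix(quantized, distances=[1], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    return {name: graycoprops(glcm, name).mean()
            for name in ("contrast", "homogeneity", "energy", "correlation")}
```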

Journal ArticleDOI
TL;DR: Because the EDST makes use of the body's meridian system, it can map and help analyze the body's own signals, making it particularly useful in early diagnosis, and is perhaps the most "modern" medical methodology available today.
Abstract: Acupuncture has been used for thousands of years and is effective in a wide range of situations. It has not been integrated into modern health care primarily because of lingering suspicions that it is not scientific. A bioenergetic model has been developed to explain nearly all aspects of acupuncture and meridian theory, but there remains a definite prejudice against human energetic theories in the medical-scientific community, which must be overcome before integration can take place. The electrodermal screening test (EDST) and electrodermal screening device (EDSD) are outgrowths of the scientific, electromagnetic understanding of meridian theory. The EDST may appear similar to other modern diagnostic techniques such as MRT, but there are important differences. The EDST is also based on ancient practices and is safer and more holistic, versatile, and cost effective. The device is elegantly simple and not extremely expensive. Hopefully, it will help free medical progress from its dependence on ever more expensive and specialized medical instrumentation. This alone would have a profound effect on health care cost and accessibility. The quality of health care will also improve with integration of the EDST into modern medical practice. Because the EDST makes use of the body's meridian system, it can map and help analyze the body's own signals, making it particularly useful in early diagnosis. With its solid theoretical foundation in modern physics and quantum mechanics, it is perhaps the most "modern" medical methodology available today.

Journal ArticleDOI
TL;DR: The EM-MLE algorithm outperformed ICTM on the authors' test objects, but the restoration results on the cylindrical objects show that it tends to reconstruct an image that is sharper and smaller than the original object; this aspect, as well as methods to speed up both algorithms, should be investigated further.
Abstract: The authors have compared the performance of the EM-MLE and ICTM restorations applied to confocal images. Both methods greatly reduce diffraction-induced distortions of confocal images. Due to their nonlinearity, both are able (partially) to restore data of missing frequencies. From the authors' experiments, it is clear that for their test objects, the EM-MLE algorithm performs much better than ICTM. The EM-MLE algorithm produces better results under all the conditions the authors tested, and with respect to all 3 performance measures (I-divergence, MSE, GDT) the authors used. Only under high SNR conditions does the MSE performance of ICTM approach the EM-MLE results. It must be noted that this conclusion is only valid for the type of objects the authors used in their experiments (sparse objects); it may well be that for more dense objects, the situation is different. The poor ICTM performance shows that its functional is not well suited for images distorted with Poisson noise. The authors did not find artifacts such as ringing in the results of either algorithm. The restoration results on the cylindrical objects show, however, that the EM-MLE algorithm has a tendency to reconstruct an image that is sharper and smaller than the original object. This aspect of EM-MLE should be investigated thoroughly. Grenander's method of sieves (1991) seems promising for regularizing the EM-MLE algorithm. Finally, to reduce the computational burden of ICTM and EM-MLE, methods to speed up these algorithms should be investigated more fully.
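For Poisson-distorted data, the EM-MLE iteration coincides with the familiar Richardson-Lucy update; a minimal 2-D sketch is given below. Real confocal restoration is 3-D and typically regularized (for example by the method of sieves mentioned above), so this is only a schematic of the core iteration:

```python
import numpy as np
from scipy.signal import fftconvolve

def em_mle_restore(image, psf, n_iter=50):
    """EM-MLE (Richardson-Lucy) iteration for Poisson-noise-limited imaging."""
    psf = psf / psf.sum()
    psf_flipped = psf[::-1, ::-1]
    estimate = np.full(image.shape, image.mean(), dtype=float)
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)
        estimate *= fftconvolve(ratio, psf_flipped, mode="same")
    return estimate
```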

Journal ArticleDOI
TL;DR: This article discusses the principles of ECT as a method of treating cancer and the requirements and development of electronic and electromechanical hardware for ECT, and presents data from both in-vivo animal studies and clinical applications.
Abstract: Although electroporation in the past has mainly been used as a research tool, recent work has demonstrated its potential for clinical applications. Some of the areas explored include electrochemotherapy (ECT), which utilizes electroporation as a means for delivering chemotherapeutic agents directly into tumor cells, encapsulation of drugs or genes into cells for their use as carrier systems, transdermal delivery of drugs or genes, gene therapy, and delivery of drugs or genes with an electroporation catheter. This article discusses the principles of ECT as a method of treating cancer, the requirements and development of electronic and electromechanical hardware for ECT, and it presents data for both in-vivo animal studies and clinical applications, especially for subcutaneous tumors. It is concluded that ECT has shown promise in treating a variety of cancers in humans. The basic principles are reasonably well understood. A good start has been made in the development of the necessary hardware to generate and apply the needed electric fields. As the human genome project progresses in identifying gene-based diseases and their possible cures, the same hardware system used for ECT can also be used for electrogene therapy. As ECT-based therapy becomes more widely recognized, it will offer an additional treatment modality and increased hope for cancer patients.

Journal ArticleDOI
TL;DR: This handbook contains a great deal of valuable and interesting information for anyone who is involved with electrical systems and devices and an abundance of illustrative examples for the reader who may not be concerned with the mathematics behind the effects.
Abstract: Handbook of Electrical Hazards and Accidents, Leslie A. Geddes (ed.), CRC Press, Inc., Boca Raton, Florida, 1995, 204 pages, ISBN 0-8493-9431-7, $59.95. This handbook contains a great deal of valuable and interesting information for anyone who is involved with electrical systems and devices. It will interest biomedical and clinical engineers and be a useful tool for investigators of electrical injury who may not be familiar with electrical concepts. In the first chapter, common electrical accidents and their causes are reviewed. Case histories are also provided to demonstrate through example how these incidents occur and how their symptoms might appear to an investigator. These examples cover a wide variety of hazards, from the well known (inserting a knife into a toaster to dislodge a slice of bread), to the unusual (ignition of bowel gas by an electrosurgical device). In chapter two, intentional applications of electrical stimulation are reviewed and explained. The author includes a brief review of cell stimulation and a clear explanation of how this stimulation affects cardiac and skeletal muscle. Other applications include iontophoresis, tasers, and electric fences. Chapter three covers the many effects of low frequency current, including an interesting discussion of stray voltage effects on milk cows. This is followed by a brief chapter on lightning injuries. Chapter five focuses on the effects of high frequency current. There is a discussion of several studies that address the physiologic effects of MRI. The final chapter is provided as a reference for the electrical properties of various body tissues. Overall, the handbook provides sufficient technical background for the interested engineer and an abundance of illustrative examples for the reader who may not be concerned with the mathematics behind the effects. It is valuable reading for designers of electrical devices, as well as those who treat and investigate the harmful effects when the use of such devices goes awry. -Barbara Donohue, Biomedical Engineering Department, Columbia-Presbyterian Medical Center

Journal ArticleDOI
TL;DR: A review of selective research and developments during the last decade in the field of ultrasound hyperthermia and non-invasive ultrasound surgery can be found in this paper, where some relevant clinical results are summarized as a measure of progress.
Abstract: The main purpose of this article is to review selective research and developments during the last decade in the field. There has been tremendous progress in engineering and science coupled with ultrasound transducer technology and imaging modalities during the last 10 years. These new R and D efforts have enhanced the capabilities of ultrasound therapy, and it now appears that the ultrasound therapy base will be expanded for clinical applications, particularly as noninvasive surgery becomes a more acceptable method of treatment. This article limits its review to novel developments in ultrasound hyperthermia and noninvasive ultrasound surgery. Along with descriptions of ultrasound devices and new accompanying techniques such as magnetic resonance image (MRI) guided noninvasive surgery, some relevant clinical results are summarized as a measure of progress.

Journal ArticleDOI
TL;DR: The techniques described here, and similar ones, will become a routine part of research and clinical practice as the use of FISH techniques expands, and one can expect digital image processing to become an indispensable part of the activity.
Abstract: Fluorescence in situ hybridization (FISH) is a rapidly expanding imaging technique in medical research and clinical diagnosis. Both researchers and clinicians find it helpful to employ quantitative digital imaging techniques with FISH images. This technique is of particular interest for multi-probe mixtures and for the automated analysis of large numbers of specimens. In the preparation of FISH specimens, multiple probes, each tagged with a different fluorophore, are often used in combination. This permits simultaneous visualization of several different molecular components of the cell. Usually, the relative positions of these components within the specimen are of scientific or clinical interest. The authors discuss these techniques and their applications. FISH dot counting is increasingly used in research and clinical studies. Research procedures and clinical tests using FISH almost certainly have an increasingly significant role to play in the future of biology and medicine. In much the same way as cytogenetics has adopted digital imaging, the techniques described here, and similar ones, will become a routine part of research and clinical practice as the use of FISH techniques expands. As in radiology, one can expect digital image processing to become an indispensable part of the activity.
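As a toy illustration of the dot-counting step only (real FISH analysis also requires background correction, nucleus segmentation, and splitting of touching dots; the threshold and minimum dot size below are assumptions, not the authors' parameters):

```python
import numpy as np
from scipy import ndimage

def count_fish_dots(channel, threshold, min_pixels=4):
    """Count hybridization dots in one fluorophore channel by thresholding and labeling."""
    mask = channel > threshold
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    return int(np.sum(sizes >= min_pixels))   # discard specks smaller than min_pixels
```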

Journal ArticleDOI
TL;DR: In this article, a wavelet-based method for reducing correlated noise in noisy speech signals was proposed, which could be used to improve intelligibility in a wide variety of applications, with special attention given to digital hearing aids and other portable communication systems.
Abstract: The intelligibility of speech in communication systems is generally reduced by interfering noise. This interference, which can take the form of environmental noise, reverberation, competing speech, or electronic channel noise, reduces intelligibility by masking the signal of interest. The reduction in intelligibility is particularly troublesome for listeners with hearing impairments, who have greater difficulty understanding speech in the presence of noise than do normal-hearing listeners. Numerous digital signal processing (DSP)-based speech enhancement systems have been proposed to improve intelligibility in the presence of noise. Several of these systems have difficulty distinguishing between noise and consonants, and consequently attenuate both. Other methods, which use imprecise estimates of the noise, create audible artifacts that further mask consonants. The objective of the present study is to develop a new noise-reduction method that can reduce additive noise without impairing intelligibility. The new method could be used to improve intelligibility in a wide variety of applications, with special attention given to digital hearing aids and other portable communication systems (e.g., cellular telephones). Here, the authors present a new wavelet-based method for reducing correlated noise in noisy speech signals. The authors provide background information on the intelligibility problem and on previous attempts to address it. A theoretical framework is then proposed for reduction of correlated noise, along with some preliminary experimental results.
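The authors' correlated-noise method is not reproduced here; for context, a standard wavelet soft-thresholding scheme with the universal threshold, which targets additive and largely uncorrelated noise, looks like the sketch below (the wavelet choice and decomposition level are assumptions):

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Standard wavelet soft-thresholding (universal threshold) for additive noise."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # noise estimate from finest scale
    threshold = sigma * np.sqrt(2 * np.log(len(signal)))  # universal threshold
    denoised = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)
```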


Journal ArticleDOI
TL;DR: The tetrapolar method, in which an outer pair of electrodes injects current and an inner pair measures potential, was introduced by Lippmann to measure the resistivity of an electrolyte free of electrode/electrolyte impedance errors, well before Kohlrausch's platinum-black electrodes, which are still used in conductivity cells.
Abstract: The technique of using an outer pair of electrodes to inject current and an inner pair of electrodes to measure potential eliminates electrode/electrolyte impedance errors when measuring the resistivity of a conducting substance. A pressing problem of the late 1800s was the accurate measurement of the resistivity of an electrolyte that was placed between a pair of electrodes in a container. Kohlrausch (1897) working in Charlottenburg, Germany, solved the problem in a limited way by 1) using 1000 Hz current and 2) creating the platinum-black electrode. By using what was then a high frequency, the electrode impedance became low. By placing a velvety-like deposit of platinum on a platinum electrode, the effective surface area was increased, thereby reducing the electrode impedance. Such blackened platinum electrodes are still used in conductivity cells designed to permit measurement of electrolytic resistivity. Working in Paris, France, well before Kohlrausch, Lippmann had solved the problem of measuring electrolytic resistivity in a simple and elegant way by introducing the tetrapolar method in conjunction with his capillary electrometer, a sensitive and rapidly responding potential indicator. Lippmann (1873) had become interested in the shape of a drop of mercury in a solution of sulfuric acid. When an iron wire was placed into the sulfuric acid and then contacted the mercury, the contour of the mercury changed. The combination of the iron wire, mercury, and electrolyte caused the charge distribution on the surface of the mercury to change, thereby altering the surface tension and, hence, the contour. Lippmann used the phenomenon of electrocapillarity, as it became known, to construct a sensitive and rapidly responding electrometer.
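Assuming a uniform column of electrolyte of cross-section A between potential-sensing points separated by length L, the tetrapolar estimate is rho = (V/I) * A / L, where V is sensed at essentially zero current so the electrode/electrolyte interface impedance does not bias it. A minimal sketch with illustrative numbers:

```python
def resistivity_four_electrode(v_inner, i_outer, area_m2, length_m):
    """Tetrapolar estimate: rho = (V/I) * A / L for a uniform column of electrolyte.

    V is measured by the inner (potential) electrodes while current is injected by
    the outer pair, so electrode polarization impedance drops out of V/I.
    """
    return (v_inner / i_outer) * area_m2 / length_m

# Illustrative numbers only: 10 mV across a 2 cm spacing, 1 mA injected, 1 cm^2 cross-section
print(resistivity_four_electrode(10e-3, 1e-3, 1e-4, 2e-2), "ohm*m")
```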

Journal ArticleDOI
TL;DR: In this article, the authors describe the two main sensory systems that perceive motion, the visual and the vestibular, and discuss visual stabilization, adaptation of vestibular reflexes, simulator sickness and sickness in virtual environments, and vestibular adaptation.
Abstract: The defining feature of a virtual reality system is the sensation of presence. There are two main sensory systems that perceive motion: the visual and the vestibular. Their characteristics are described in the paper. Issues discussed include: visual stabilization; adaptation of vestibular reflexes; simulator sickness and sickness in virtual environments; and vestibular adaptation.

Journal ArticleDOI
TL;DR: It is concluded that evidence of dose-response exists for childhood leukemia and that causality remains a scientifically defensible explanation for the epidemiologic results produced thus far on this disease in relation to EMF exposure.
Abstract: We present an examination of available dose-response data on EMF and childhood cancer in the epidemiologic literature. We conclude that evidence of dose-response exists for childhood leukemia and that causality remains a scientifically defensible explanation for the epidemiologic results produced thus far on this disease in relation to EMF exposure.