
Showing papers on "Data acquisition published in 2008"


Proceedings Article
01 Mar 2008
TL;DR: This paper overviews the recent work on compressive sensing, a new approach to data acquisition in which analog signals are digitized for processing not via uniform sampling but via measurements using more general, even random, test functions.
Abstract: This paper overviews the recent work on compressive sensing, a new approach to data acquisition in which analog signals are digitized for processing not via uniform sampling but via measurements using more general, even random, test functions. In stark contrast with conventional wisdom, the new theory asserts that one can combine "low-rate sampling" with digital computational power for efficient and accurate signal acquisition. Compressive sensing systems directly translate analog data into a compressed digital form; all we need to do is "decompress" the measured data through an optimization on a digital computer. The implications of compressive sensing are promising for many applications and enable the design of new kinds of analog-to-digital converters, cameras, and imaging systems.

1,537 citations
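The recovery step the abstract describes — "decompressing" random measurements through an optimization — can be sketched with a toy sparse-recovery routine. The following uses orthogonal matching pursuit as a greedy stand-in for the convex solvers usually discussed in compressive sensing; all dimensions, the Gaussian measurement matrix, and the use of NumPy are assumptions for the demo, not details from the paper.

```python
import numpy as np

def omp(Phi, y, k):
    """Greedy orthogonal matching pursuit: recover a k-sparse x from y = Phi @ x."""
    residual = y.copy()
    support = []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        support.append(j)
        # Re-fit coefficients on the selected support by least squares.
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(0)
n, m, k = 50, 30, 3                                # signal length, measurements, sparsity
Phi = rng.standard_normal((m, n)) / np.sqrt(m)     # random "test functions"
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = Phi @ x                                        # m < n: fewer measurements than samples
x_hat = omp(Phi, y, k)
```

The point of the sketch is the dimension count: 30 random measurements suffice to recover a 3-sparse length-50 signal exactly, which uniform sampling at this rate could not do.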


Journal ArticleDOI
TL;DR: In this article, the authors describe the workflow for using lidar data, from the choice of field area and survey planning, to acquiring and processing data and, finally, extracting geologically useful data.
Abstract: Terrestrial laser scanning, or lidar, is a recent innovation in spatial information data acquisition, which allows geological outcrops to be digitally captured with unprecedented resolution and accuracy. With point precisions and spacing of the order of a few centimetres, an enhanced quantitative element can now be added to geological fieldwork and analysis, opening up new lines of investigation at a variety of scales in all areas of field-based geology. Integration with metric imagery allows 3D photorealistic models to be created for interpretation, visualization and education. However, gaining meaningful results from lidar scans requires more than simply acquiring raw point data. Surveys require planning and, typically, a large amount of post-processing time. The contribution of this paper is to provide a more detailed insight into the technology, data collection and utilization techniques than is currently available. The paper focuses on the workflow for using lidar data, from the choice of field area and survey planning, to acquiring and processing data and, finally, extracting geologically useful data. Because manufacturer specifications for point precision are often optimistic when applied to real-world outcrops, the error sources associated with lidar data, and the implications of them propagating through the processing chain, are also discussed.

412 citations
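The paper's closing point — that error sources propagate through the processing chain — is commonly made concrete by combining independent 1-sigma uncertainties in quadrature. The per-stage figures below are invented for illustration, not taken from the paper.

```python
import math

def combined_error(*sigmas):
    """Root-sum-of-squares of independent 1-sigma error sources (metres)."""
    return math.sqrt(sum(s * s for s in sigmas))

# Hypothetical per-stage uncertainties for an outcrop survey (metres):
point_precision = 0.02   # scanner spec applied to a real-world surface
registration    = 0.03   # scan-to-scan alignment
georeferencing  = 0.05   # GPS control of the scan origin
total = combined_error(point_precision, registration, georeferencing)
# The chain, not the manufacturer's point-precision spec, sets the usable accuracy.
```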


Journal ArticleDOI
TL;DR: An idea for real-time acquisition of 3D surface data by a specially coded vision system for fast 3D data acquisition is presented and a principle of uniquely color-encoded pattern projection is proposed to design a color matrix for improving the reconstruction efficiency.
Abstract: Structured light vision systems have been successfully used for accurate measurement of 3D surfaces in computer vision. However, their applications are mainly limited to scanning stationary objects so far since tens of images have to be captured for recovering one 3D scene. This paper presents an idea for real-time acquisition of 3D surface data by a specially coded vision system. To achieve 3D measurement for a dynamic scene, the data acquisition must be performed with only a single image. A principle of uniquely color-encoded pattern projection is proposed to design a color matrix for improving the reconstruction efficiency. The matrix is produced by a special code sequence and a number of state transitions. A color projector is controlled by a computer to generate the desired color patterns in the scene. The unique indexing of the light codes is crucial here for color projection since it is essential that each light grid be uniquely identified by incorporating local neighborhoods so that 3D reconstruction can be performed with only local analysis of a single image. A scheme is presented to describe such a vision processing method for fast 3D data acquisition. Practical experimental performance is provided to analyze the efficiency of the proposed methods.

195 citations
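The uniqueness requirement — each light grid identified by its local neighbourhood alone — is the defining property of a De Bruijn sequence. The sketch below generates one over three colours with window length 3; it is a generic stand-in, not the paper's actual code sequence or state-transition scheme.

```python
def de_bruijn(k, n):
    """Cyclic sequence over k symbols in which every length-n window occurs exactly once."""
    a = [0] * (k * n)
    seq = []
    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)
    db(1, 1)
    return seq

colors = "RGB"
code = de_bruijn(3, 3)                         # 27 stripes, every 3-stripe window unique
stripes = "".join(colors[c] for c in code)
doubled = stripes + stripes[:2]                # wrap around for cyclic windows
windows = {doubled[i:i + 3] for i in range(len(stripes))}
```

Because every window of three adjacent colours occurs exactly once, a decoder can identify a stripe's position from a single image using only local analysis.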


Journal ArticleDOI
TL;DR: The hardware and software of the proposed DFD‐FLIM method simplifies the process of data acquisition for FLIM, presents a new interface for data display and interpretation, and optimizes the accuracy of lifetime determination.
Abstract: Fluorescence lifetime imaging (FLIM) is a powerful microscopy technique for providing contrast of biological and other systems by differences in molecular species or their environments. However, the cost of equipment and the complexity of data analysis have limited the application of FLIM. We present a mathematical model and physical implementation for a low cost digital frequency domain FLIM (DFD-FLIM) system, which can provide lifetime resolution with quality comparable to time-correlated single photon counting methods. Our implementation provides data natively in the form of phasors. On the basis of the mathematical model, we present an error analysis that shows the precise parameters for maximizing the quality of lifetime acquisition, as well as data to support this conclusion. The hardware and software of the proposed DFD-FLIM method simplifies the process of data acquisition for FLIM, presents a new interface for data display and interpretation, and optimizes the accuracy of lifetime determination.

188 citations
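Data "natively in the form of phasors" refers to mapping each decay to the coordinates g = ⟨cos ωt⟩ and s = ⟨sin ωt⟩; a single-exponential lifetime then lands on the universal semicircle. A minimal numerical sketch, assuming a single-exponential decay and an arbitrary 80 MHz modulation frequency (both assumptions, not the paper's parameters):

```python
import numpy as np

def phasor(intensity, t, omega):
    """Intensity-weighted phasor coordinates (g, s) at angular frequency omega."""
    norm = np.sum(intensity)
    g = np.sum(intensity * np.cos(omega * t)) / norm
    s = np.sum(intensity * np.sin(omega * t)) / norm
    return g, s

tau = 2.5e-9                          # assumed fluorescence lifetime, seconds
omega = 2 * np.pi * 80e6              # assumed modulation frequency
t = np.linspace(0, 40 * tau, 100_000)
decay = np.exp(-t / tau)
g, s = phasor(decay, t, omega)
# Analytic values for a single exponential:
#   g = 1 / (1 + (omega*tau)^2),  s = omega*tau / (1 + (omega*tau)^2)
```

Checking (g - 1/2)² + s² = 1/4 verifies the point lies on the universal semicircle, the usual FLIM-phasor sanity test.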


Journal ArticleDOI
Craig J. Beasley1
TL;DR: This article discusses a field experiment carried out to test the feasibility of employing marine sources activated simultaneously, which does not require source-signature encoding, but relies on spatial-source positioning to allow for separation of the signa...
Abstract: Cost is one of the fundamental factors that determines where and how a seismic survey will be conducted. Moreover, the cost of 3D seismic acquisition and processing often plays a significant role in determining whether or not a prospect is economic. Unit costs of seismic data acquisition and processing have dropped dramatically as the technology has matured; however, these economies have raised demand for larger and more complex acquisition plans. More than ever, there is a great need to gain efficiency. In this article, I discuss a field experiment carried out to test the feasibility of employing marine sources activated simultaneously. Simultaneous source firing has long been recognized as a possible strategy for achieving dramatic cost reductions in seismic data acquisition. This approach is novel in that it does not require source-signature encoding (although such encoding combined with this approach is beneficial), but, rather, relies on spatial-source positioning to allow for separation of the signa...

187 citations


Proceedings ArticleDOI
Ian Moore1, Bill Dragoset1, Tor Ommundsen1, David M. Wilson1, Daniel Eke1, Camille Ward1 
TL;DR: In conventional data acquisition, the delay time between the firing of one source and the next is such that the energy from the previous source has decayed to an acceptable level before data associated with the following source arrives, which imposes constraints on the data acquisition rate.
Abstract: In conventional data acquisition, the delay time between the firing of one source and the next is such that the energy from the previous source has decayed to an acceptable level before data associated with the following source arrives. This minimum delay time imposes constraints on the data acquisition rate. For marine data, the minimum delay time also implies a minimum inline shot interval, because the vessel’s minimum speed is limited.

184 citations
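The constraint described above can be written down directly: with a minimum delay T between shots and a vessel that cannot travel slower than v_min, the inline shot spacing cannot fall below v_min · T. A toy calculation (the numbers are illustrative, not from the paper):

```python
def min_inline_shot_interval(min_delay_s, min_speed_ms):
    """Smallest achievable inline shot spacing (metres) in conventional acquisition."""
    return min_delay_s * min_speed_ms

# Assumed example: 10 s between shots, vessel minimum speed ~2.5 m/s (~5 knots).
spacing = min_inline_shot_interval(10.0, 2.5)   # denser shooting requires simultaneous sources
```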


Journal ArticleDOI
17 Nov 2008-Sensors
TL;DR: The results suggest that the use of unprocessed image data did not improve the results of image analyses, that vignetting had a significant effect (especially for the modified camera), and that normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions.
Abstract: The use of consumer digital cameras or webcams to characterize and monitor different features has become prevalent in various domains, especially in environmental applications. Despite some promising results, such digital camera systems generally suffer from signal aberrations due to the on-board image processing systems and thus offer limited quantitative data acquisition capability. The objective of this study was to test a series of radiometric corrections having the potential to reduce radiometric distortions linked to camera optics and environmental conditions, and to quantify the effects of these corrections on our ability to monitor crop variables. In 2007, we conducted a five-month experiment on sugarcane trial plots using original RGB and modified RGB (Red-Edge and NIR) cameras fitted onto a light aircraft. The camera settings were kept unchanged throughout the acquisition period and the images were recorded in JPEG and RAW formats. These images were corrected to eliminate the vignetting effect, and normalized between acquisition dates. Our results suggest that 1) the use of unprocessed image data did not improve the results of image analyses; 2) vignetting had a significant effect, especially for the modified camera, and 3) normalized vegetation indices calculated with vignetting-corrected images were sufficient to correct for scene illumination conditions. These results are discussed in the light of the experimental protocol and recommendations are made for the use of these versatile systems for quantitative remote sensing of terrestrial surfaces.

178 citations
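Vignetting correction of the kind applied in this study is, at its core, a per-pixel division by a flat-field gain map. A minimal sketch with a synthetic radial-falloff model (the gain model and coefficient are assumptions, not the study's calibration):

```python
import numpy as np

def radial_gain(h, w, a=0.3):
    """Synthetic vignetting gain map: brightness falls off with normalized squared radius."""
    y, x = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2, (w - 1) / 2
    r2 = ((y - cy) ** 2 + (x - cx) ** 2) / (cy ** 2 + cx ** 2)
    return 1.0 / (1.0 + a * r2)          # 1.0 at the centre, darker toward the corners

h, w = 64, 96
gain = radial_gain(h, w)
scene = np.full((h, w), 100.0)           # a uniform target
observed = scene * gain                  # what the camera records: darkened corners
corrected = observed / gain              # flat-field (vignetting) correction
```

In practice the gain map is measured from images of a uniform target rather than modelled, but the correction step is the same division.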


Patent
15 May 2008
TL;DR: In this article, a system for intelligent monitoring and management of an electrical system is described, which includes a data acquisition component, a power analytics server, and a client terminal, including a virtual system modeling engine and a machine learning engine.
Abstract: A system for intelligent monitoring and management of an electrical system is disclosed. The system includes a data acquisition component, a power analytics server and a client terminal. The data acquisition component acquires real-time data output from the electrical system. The power analytics server is comprised of a real-time energy pricing engine, virtual system modeling engine, an analytics engine, a machine learning engine and a schematic user interface creator engine. The real-time energy pricing engine generates real-time utility power pricing data. The virtual system modeling engine generates predicted data output for the electrical system. The analytics engine monitors real-time data output and predicted data output of the electrical system. The machine learning engine stores and processes patterns observed from the real-time data output and the predicted data output to forecast an aspect of the electrical system.

175 citations
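The analytics engine's core comparison — monitoring real-time output against the virtual model's predicted output — can be sketched as a simple residual check. The threshold and data below are invented for illustration.

```python
def flag_deviations(real, predicted, tol=0.05):
    """Indices where real-time output deviates from the model prediction by more than tol (fractional)."""
    flags = []
    for i, (r, p) in enumerate(zip(real, predicted)):
        if abs(r - p) > tol * abs(p):
            flags.append(i)
    return flags

# Hypothetical power readings (kW) vs. virtual-model predictions:
real_kw      = [100.2, 99.8, 112.0, 100.1]
predicted_kw = [100.0, 100.0, 100.0, 100.0]
alarms = flag_deviations(real_kw, predicted_kw)   # sample 2 deviates by 12%
```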


Journal ArticleDOI
TL;DR: A new system is designed in detail to perform micro-environmental monitoring taking advantage of the WSN, and the system platform for data acquisition, validation, processing and visualization is systematically presented.
Abstract: Wireless Sensor Network (WSN) is increasingly popular in the field of micro-environmental monitoring due to its promising capability. However, most systems using WSN for environmental monitoring reported in the literature are developed for specific applications without functions for exploiting users' data processing methods. In this paper, a new system is designed in detail to perform micro-environmental monitoring taking advantage of the WSN. The application-oriented hardware working style is designed, and the system platform for data acquisition, validation, processing and visualization is systematically presented. Several strategies are proposed to guarantee the system's capability for extracting useful information and visualizing events at their authentic time. Moreover, a web-based surveillance subsystem is presented for remote control and monitoring. In addition, the system is extensible, allowing engineers to add their own data analysis algorithms. Experimental results show the path reliability and real-time characteristics, and demonstrate the feasibility and applicability of the developed system in practical deployment.

171 citations


Journal ArticleDOI
TL;DR: The Digital Fields Board (DFB) performs data acquisition and signal processing for the Electric Fields Instrument and Search Coil Magnetometer on each of the THEMIS (Time History of Events and Macroscale Interactions during Substorms) satellites as mentioned in this paper.
Abstract: The Digital Fields Board (DFB) performs the data acquisition and signal processing for the Electric Fields Instrument and Search Coil Magnetometer on each of the THEMIS (Time History of Events and Macroscale Interactions during Substorms) satellites. The processing is highly flexible and low-power (∼1.1 watt orbit-averaged). The primary data products are time series waveforms and wave power spectra of the electric and magnetic fields. The power spectra can be computed either on the raw signals (i.e. in a system co-rotating with the spacecraft) or in a coordinate system aligned with the local DC magnetic field. Other data products include spectral power from multiple passbands (filter banks) and electric power in the 30–500 kHz band. The DFBs on all five spacecraft have been turned on and checked out in-flight, and are functioning as designed.

139 citations
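Computing spectra "in a coordinate system aligned with the local DC magnetic field" amounts to rotating each measured field vector into a basis built from the DC field direction. A generic sketch of that rotation (not the DFB's on-board implementation):

```python
import numpy as np

def field_aligned_basis(b_dc):
    """Orthonormal basis (e_par, e_perp1, e_perp2) with e_par along the DC magnetic field."""
    e_par = b_dc / np.linalg.norm(b_dc)
    # Any reference axis not parallel to e_par works; pick the least-aligned one.
    ref = np.eye(3)[np.argmin(np.abs(e_par))]
    e_perp1 = np.cross(e_par, ref)
    e_perp1 /= np.linalg.norm(e_perp1)
    e_perp2 = np.cross(e_par, e_perp1)
    return np.vstack([e_par, e_perp1, e_perp2])

b_dc = np.array([3.0, 4.0, 0.0])          # assumed DC field, e.g. from a low-pass filter
R = field_aligned_basis(b_dc)
wave = 0.1 * b_dc / np.linalg.norm(b_dc)  # a fluctuation purely along the field
fac = R @ wave                            # field-aligned components (par, perp1, perp2)
```

Power spectra computed on the rotated components then separate compressional (parallel) from transverse (perpendicular) wave power.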


BookDOI
01 Jan 2008
TL;DR: In this chapter, hardware and software for RE are presented, and the four RE phases used in an RE data processing chain are highlighted, in which the fundamental RE operations that are necessary for completing the RE data processing chain are presented and discussed in detail.
Abstract: Reverse engineering (RE) is generally defined as a process of analysing an object or existing system (hardware and software) to identify its components and their interrelationships, and investigate how it works in order to redesign or produce a copy without access to the design from which it was originally produced [87,88]. In areas related to 3D graphics and modelling, RE technology is used for reconstructing 3D models of an object in different geometrical formats. RE hardware is used for RE data acquisition, which in the case of 3D modelling is the collection of geometrical data that represent a physical object. There are three main technologies for RE data acquisition: Contact, Non-Contact and Destructive. Outputs of the RE data acquisition process are 2D cross-sectional images and point clouds that define the geometry of an object. RE software is employed to transform the RE data produced by RE hardware into 3D geometrical models. The final outputs of the RE data processing chain can be one of two types of 3D data: (i) Polygons or (ii) NURBS (Non-Uniform Rational B-Splines). Polygon models, which are normally in the STL, VRML or DXF format, are commonly used for rapid prototyping, laser milling, 3D graphics, simulation, and animation applications. NURBS surfaces or solids are frequently used in Computer Aided Design, Manufacturing and Engineering (CAD-CAM-CAE) applications. In this chapter, hardware and software for RE are presented. Commercially available RE hardware based on different 3D data collection techniques is briefly introduced. The advantages and disadvantages of various data acquisition methods are outlined to help the selection of the right RE hardware for specific applications. In the RE software section, end-use RE applications are classified and typical commercialised RE packages are reviewed.
The four RE phases used in an RE data processing chain are highlighted, in which the fundamental RE operations that are necessary for completing the RE data processing chain are presented and discussed in detail.

Patent
23 Sep 2008
TL;DR: In this article, a system and method for intelligent monitoring and management of an electrical system is described, which includes a data acquisition component, a power analytics server and a client terminal.
Abstract: A system and method for intelligent monitoring and management of an electrical system is disclosed. The system includes a data acquisition component, a power analytics server and a client terminal. The data acquisition component acquires real-time data output from the electrical system. The power analytics server is comprised of a real-time electrical system security index engine that calculates real-time system security index values from stability indices data generated from a virtual system model of the electrical system. The client terminal displays the system security index values to assess the security and stability of the electrical system.

Proceedings ArticleDOI
10 Mar 2008
TL;DR: An automated method is introduced for improving the utilization of the on-chip storage, by identifying a small set of trace signals from which a large number of states can be restored using a compute-efficient algorithm.
Abstract: Embedded logic analysis has emerged as a powerful technique for identifying functional bugs during post-silicon validation, as it enables at-speed acquisition of data from the circuit nodes in real-time. Nonetheless, the amount of data that is observed is limited by the capacity of the on-chip trace buffers. This paper introduces an automated method for improving the utilization of the on-chip storage, by identifying a small set of trace signals from which a large number of states can be restored using a compute-efficient algorithm. This enlarged set of data can then be used to aid the search of functional bugs in the fabricated circuit.
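State restoration of this kind works by implying unknown signal values from traced ones through the circuit's logic: for c = AND(a, b), tracing c = 1 restores both inputs, and a traced 0 on either input restores the output. A toy sketch of forward/backward implication on a single gate (the paper's trace-signal selection algorithm itself is not reproduced here):

```python
def restore_and_gate(known):
    """Imply unknown values of a, b, c across c = AND(a, b). known: dict of 0/1 values."""
    v = dict(known)
    changed = True
    while changed:
        changed = False
        # Forward implication: known inputs determine the output.
        if 'a' in v and 'b' in v and 'c' not in v:
            v['c'] = v['a'] & v['b']; changed = True
        if (v.get('a') == 0 or v.get('b') == 0) and 'c' not in v:
            v['c'] = 0; changed = True
        # Backward implication: output 1 forces both inputs to 1.
        if v.get('c') == 1:
            for s in ('a', 'b'):
                if s not in v:
                    v[s] = 1; changed = True
    return v

# Tracing only c restores two extra states whenever c = 1 is observed:
restored = restore_and_gate({'c': 1})
```

Iterating such implications over many gates and cycles is what lets a small set of traced signals reconstruct a much larger set of internal states.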

Journal ArticleDOI
TL;DR: Grain is a data analysis system developed to be used with the novel Total Data Readout data acquisition system; the event parser and the accompanying software system have been written entirely in Java.
Abstract: Grain is a data analysis system developed to be used with the novel Total Data Readout data acquisition system. In Total Data Readout, all the electronics channels are read out asynchronously in singles mode and each data item is timestamped. Event building and analysis has to be done entirely in software by post-processing the data stream. A flexible and efficient event parser and the accompanying software system have been written entirely in Java. The design and implementation of the software are discussed, along with experiences gained in running real-life experiments.
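Event building from a timestamped singles stream is, at its core, grouping items whose timestamps fall within a coincidence window. A minimal sketch (the window value and data are invented; Grain's parser is far more elaborate):

```python
def build_events(timestamps, window):
    """Group sorted timestamps into events: a gap larger than `window` starts a new event."""
    events = []
    for t in sorted(timestamps):
        if events and t - events[-1][-1] <= window:
            events[-1].append(t)
        else:
            events.append([t])
    return events

# Timestamps (arbitrary ticks) arriving out of order from asynchronous channels:
hits = [103, 0, 5, 8, 100]
events = build_events(hits, window=10)   # two events: [0, 5, 8] and [100, 103]
```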

Journal ArticleDOI
TL;DR: In this paper, a low-cost, microcontroller-based data acquisition system has been built through interfacing a microcontroller with a signal transducer for collecting cutting vibration.
Abstract: Machine condition plays an important role in machining performance. A machine condition monitoring system will provide significant economic benefits when applied to machine tools and machining processes. Development of such a system requires reliable machining data that can reflect machining processes. This study demonstrates a tool condition monitoring approach in an end-milling operation based on the vibration signal collected through a low-cost, microcontroller-based data acquisition system. A data acquisition system has been built through interfacing a microcontroller with a signal transducer for collecting cutting vibration. The examination tests of this developed system have been carried out on a CNC milling machine. Experimental studies and data analysis have been performed to validate the proposed system. The onsite tests show the developed system can perform properly as proposed.
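A monitoring system of this kind typically reduces each vibration record to scalar features, such as RMS and peak level, before thresholding or classification. A sketch with a synthetic signal (the sampling parameters are assumptions, not the study's setup):

```python
import math

def rms(samples):
    """Root-mean-square level of a vibration record."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

# Synthetic 'cutting vibration': a pure tone sampled over exactly 5 periods.
n, periods, amp = 1000, 5, 2.0
signal = [amp * math.sin(2 * math.pi * periods * i / n) for i in range(n)]
level = rms(signal)                     # equals amp / sqrt(2) for a sine
peak = max(abs(x) for x in signal)
```

A worn tool typically raises both features; trending them against baseline values is the usual first step in condition monitoring.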

Patent
07 Nov 2008
TL;DR: In this paper, a system for utilizing a neural network to make real-time predictions about the health, reliability, and performance of a monitored system is described, which includes a data acquisition component, a power analytics server and a client terminal.
Abstract: A system for utilizing a neural network to make real-time predictions about the health, reliability, and performance of a monitored system is disclosed. The system includes a data acquisition component, a power analytics server and a client terminal. The data acquisition component acquires real-time data output from the electrical system. The power analytics server is comprised of a virtual system modeling engine, an analytics engine, and an adaptive prediction engine. The virtual system modeling engine generates predicted data output for the electrical system. The analytics engine monitors real-time data output and predicted data output of the electrical system. The adaptive prediction engine can be configured to forecast an aspect of the monitored system using a neural network algorithm. The adaptive prediction engine is further configured to process the real-time data output and automatically optimize the neural network algorithm by minimizing a measure of error between the real-time data output and an estimated data output predicted by the neural network algorithm.

Patent
James Jiang1, Scott Barry1, Alex Cable1
18 Jan 2008
TL;DR: In this paper, an optical imaging system includes an optical radiation source ( 410, 510 ), a frequency clock module outputting frequency clock signals ( 420), an optical interferometer ( 430), a data acquisition (DAQ) device ( 440 ) triggered by the frequency clock signal, and a computer ( 450 ) to perform multi-dimensional optical imaging of the samples.
Abstract: An optical imaging system includes an optical radiation source ( 410, 510 ), a frequency clock module outputting frequency clock signals ( 420 ), an optical interferometer ( 430 ), a data acquisition (DAQ) device ( 440 ) triggered by the frequency clock signals, and a computer ( 450 ) to perform multi-dimensional optical imaging of the samples. The frequency clock signals are processed by software or hardware to produce a record containing frequency-time relationship of the optical radiation source ( 410, 510 ) to externally clock the sampling process of the DAQ device ( 440 ). The system may employ over-sampling and various digital signal processing methods to improve image quality. The system further includes multiple stages of routers ( 1418, 1425 ) connecting the light source ( 1410 ) with a plurality of interferometers ( 1420 a- 1420 n) and a DAQ system ( 1450 ) externally clocked by frequency clock signals to perform high-speed multi-channel optical imaging of samples.
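External clocking by a frequency clock serves to make samples uniform in optical frequency rather than in time; the same effect can be approximated in software by resampling a uniformly-time-sampled fringe onto a uniform frequency grid. A sketch with a synthetic nonlinear sweep (all waveform parameters are invented for illustration):

```python
import numpy as np

n = 2048
t = np.linspace(0.0, 1.0, n)
k = 1000.0 + 200.0 * t + 100.0 * t**2        # nonlinear frequency sweep k(t)
z = 0.05                                      # single reflector: fringe ~ cos(2*pi*z*k)
fringe = np.cos(2 * np.pi * z * k)            # chirped when sampled uniformly in time

# Resample onto a grid uniform in k, as an external frequency clock would enforce:
k_uniform = np.linspace(k[0], k[-1], n)
fringe_k = np.interp(k_uniform, k, fringe)

peak_raw = np.abs(np.fft.rfft(fringe)).max()
peak_k   = np.abs(np.fft.rfft(fringe_k)).max()   # energy collapses into a single tone
```

After resampling, the Fourier transform of the fringe concentrates at the reflector depth, which is why frequency-clocked acquisition sharpens the image.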

Patent
15 Feb 2008
TL;DR: Methods and devices for providing diabetes management, including an automatic time acquisition protocol, are provided.
Abstract: Methods and devices for providing diabetes management including automatic time acquisition protocol is provided.

Journal ArticleDOI
TL;DR: A network of software applications and application servers, collectively known as Web-Ice, is introduced to facilitate diffraction experiments involving large numbers of crystals by automating initial data processing steps such as lattice indexing, Bragg spot integration, and symmetry determination.
Abstract: New software tools are introduced to facilitate diffraction experiments involving large numbers of crystals. While existing programs have long provided a framework for lattice indexing, Bragg spot integration, and symmetry determination, these initial data processing steps often require significant manual effort. This limits the timely availability of data analysis needed for high-throughput procedures, including the selection of the best crystals from a large sample pool, and the calculation of optimal data collection parameters to assure complete spot coverage with minimal radiation damage. To make these protocols more efficient, a network of software applications and application servers has been developed, collectively known as Web-Ice. When the package is installed at a crystallography beamline, a programming interface allows the beamline control software (e.g. Blu-Ice, DCSS) to trigger data analysis automatically. Results are organized based on a list of samples that the user provides, and are examined within a Web page, accessible both locally at the beamline and remotely. Optional programming interfaces permit the user to control data acquisition through the Web browser. The system as a whole is implemented to support multiple users and multiple processors, and can be expanded to provide additional scientific functionality. Web-Ice has a distributed architecture consisting of several stand-alone software components working together via a well defined interface. Other synchrotrons or institutions may integrate selected components or the whole of Web-Ice with their own data acquisition software. Updated information about current developments may be obtained at http://smb.slac.stanford.edu/research/developments/webice.

Journal ArticleDOI
TL;DR: The ATLAS experiment is one of the experiments at the Large Hadron Collider, constructed to study elementary particle interactions in collisions of high-energy proton beams, and special emphasis was put on the use of standardized hardware and software components enabling efficient development and long-term maintainability of the DCS over the lifetime of the experiment.
Abstract: The ATLAS experiment is one of the experiments at the Large Hadron Collider, constructed to study elementary particle interactions in collisions of high-energy proton beams. The individual detector components as well as the common experimental infrastructure are supervised by the Detector Control System (DCS). The DCS enables equipment supervision using operator commands, reads, processes and archives the operational parameters of the detector, allows for error recognition and handling, manages the communication with external control systems, and provides a synchronization mechanism with the physics data acquisition system. Given the enormous size and complexity of ATLAS, special emphasis was put on the use of standardized hardware and software components enabling efficient development and long-term maintainability of the DCS over the lifetime of the experiment. Currently, the DCS is being used successfully during the experiment commissioning phase.

Journal ArticleDOI
TL;DR: A field data acquisition system (herein referred to as Meteologger), based on an ATmega16 microcontroller, which scans 8 sensors together at any programmable interval, is presented along with some main characteristics of the prototype system and its program.

Patent
12 Dec 2008
TL;DR: In this article, a photoacoustic imaging apparatus is provided for medical or other imaging applications and also a method for calibrating this apparatus is also provided, which employs a sparse array of transducer elements and a reconstruction algorithm.
Abstract: A photoacoustic imaging apparatus is provided for medical or other imaging applications, and a method for calibrating this apparatus is also provided. The apparatus employs a sparse array of transducer elements and a reconstruction algorithm. Spatial calibration maps of the sparse array are used to optimize the reconstruction algorithm. The apparatus includes a laser producing a pulsed laser beam to illuminate a subject for imaging and generate photoacoustic waves. The transducers are fixedly mounted on a holder so as to form the sparse array. Photoacoustic (PA) waves are received by each transducer. The resultant analog signals from each transducer are amplified, filtered, and converted to digital signals in parallel by a data acquisition system which is operatively connected to a computer. The computer receives the digital signals and processes them with an algorithm based on iterative forward projection and back-projection in order to provide the image.
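Back-projection reconstruction delays each transducer signal by the acoustic time of flight to a candidate pixel and sums the delayed samples. A toy delay-and-sum sketch with idealized Gaussian pulses (the geometry, pulse shape, and sound speed are invented; the patent's iterative forward/back-projection algorithm is more elaborate):

```python
import numpy as np

c = 1.0                                   # assumed sound speed (normalized units)
sensors = [(10 * np.cos(a), 10 * np.sin(a))
           for a in np.linspace(0, 2 * np.pi, 8, endpoint=False)]
source = (1.0, 0.5)                       # true photoacoustic source position

def tof(p, q):
    """Acoustic time of flight between two points."""
    return np.hypot(p[0] - q[0], p[1] - q[1]) / c

def record(sensor, t, sigma=0.1):
    """Idealized sensor record: a Gaussian pulse centred at the source's time of flight."""
    return np.exp(-((t - tof(source, sensor)) / sigma) ** 2)

# Delay-and-sum: evaluate each sensor's record at the candidate pixel's time of flight.
xs = np.arange(-2.0, 2.01, 0.25)
ys = np.arange(-2.0, 2.01, 0.25)
image = np.array([[sum(record(s, tof((x, y), s)) for s in sensors) for x in xs]
                  for y in ys])
iy, ix = np.unravel_index(np.argmax(image), image.shape)
estimate = (xs[ix], ys[iy])               # brightest pixel coincides with the source
```

Only the true source position is consistent with all eight times of flight simultaneously, so the summed image peaks there; with a sparse array, calibration of the actual sensor positions (as in the patent) is what keeps those delays consistent.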

Journal ArticleDOI
TL;DR: The present study suggests that the proposed method could correct for magnetic field distortion inside the patient's abdomen during a laparoscopic procedure within a clinically permissible period of time, as well as enabling an accurate 3D US reconstruction to be obtained that can be superimposed onto live endoscopic images.
Abstract: This paper describes a three-dimensional ultrasound (3D US) system that aims to achieve augmented reality (AR) visualization during laparoscopic surgery, especially for the liver. To acquire 3D US data of the liver, the tip of a laparoscopic ultrasound probe is tracked inside the abdominal cavity using a magnetic tracker. The accuracy of magnetic trackers, however, is greatly affected by magnetic field distortion that results from the close proximity of metal objects and electronic equipment, which is usually unavoidable in the operating room. In this paper, we describe a calibration method for intraoperative magnetic distortion that can be applied to laparoscopic 3D US data acquisition; we evaluate the accuracy and feasibility of the method by in vitro and in vivo experiments. Although calibration data can be acquired freehand using a magneto-optic hybrid tracker, there are two problems associated with this method: error caused by the time delay between measurements of the optical and magnetic trackers, and instability of the calibration accuracy that results from the uniformity and density of calibration data. A temporal calibration procedure is developed to estimate the time delay, which is then integrated into the calibration, and a distortion model is formulated by zeroth-degree to fourth-degree polynomial fitting to the calibration data. In the in vivo experiment using a pig, the positional error caused by magnetic distortion was reduced from 44.1 to 2.9 mm. The standard deviation of corrected target positions was less than 1.0 mm. Freehand acquisition of calibration data was performed smoothly using a magneto-optic hybrid sampling tool through a trocar under guidance by real-time 3D monitoring of the tool trajectory; data acquisition time was less than 2 min.
The present study suggests that our proposed method could correct for magnetic field distortion inside the patient's abdomen during a laparoscopic procedure within a clinically permissible period of time, as well as enabling an accurate 3D US reconstruction to be obtained that can be superimposed onto live endoscopic images.
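Distortion calibration by polynomial fitting, as used here, can be illustrated in one dimension: fit a polynomial mapping measured (distorted) positions to true positions from calibration pairs, then apply it to new measurements. The distortion model below is synthetic, not the in vivo field map:

```python
import numpy as np

# Synthetic calibration data: a smooth magnetic distortion added to true positions (mm).
true_pos = np.linspace(0.0, 100.0, 200)
measured = true_pos + 0.002 * true_pos**2 - 0.05 * true_pos   # distorted tracker readings

# Fit a polynomial correction measured -> true (cf. the paper's 0th- to 4th-degree models).
coeffs = np.polyfit(measured, true_pos, 4)
corrected = np.polyval(coeffs, measured)

error_before = np.max(np.abs(measured - true_pos))    # uncorrected positional error
error_after = np.max(np.abs(corrected - true_pos))    # residual after calibration
```

The large-to-small error reduction mirrors, in miniature, the paper's 44.1 mm to 2.9 mm improvement; the real method fits a 3D field over freehand-acquired samples rather than a 1D curve.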

Journal ArticleDOI
TL;DR: In this paper, a wireless sensor prototype capable of data acquisition, computational analysis and actuation is proposed for use in a real-time structural control system, which is illustrated using a full-scale structure controlled by a semi-active magnetorheological (MR) damper and a network of wireless sensors.
Abstract: Wireless sensor networks have rapidly matured in recent years to offer data acquisition capabilities on par with those of traditional tethered data acquisition systems. Entire structural monitoring systems assembled from wireless sensors have proven to be low cost, easy to install, and accurate. However, the functionality of wireless sensors can be further extended to include actuation capabilities. Wireless sensors capable of actuating a structure could serve as building blocks of future generations of structural control systems. In this study, a wireless sensor prototype capable of data acquisition, computational analysis and actuation is proposed for use in a real-time structural control system. The performance of a wireless control system is illustrated using a full-scale structure controlled by a semi-active magnetorheological (MR) damper and a network of wireless sensors. One wireless sensor designated as a controller automates the task of collecting state data, calculating control forces, and issuing commands to the MR damper, all in real time. Additional wireless sensors are installed to measure the acceleration and velocity response of each system degree of freedom. Base motion is applied to the structure to simulate seismic excitations while the wireless control system mitigates inter-storey drift response of the structure. An optimal linear quadratic regulation solution is formulated for embedment within the computational cores of the wireless sensors. Copyright © 2007 John Wiley & Sons, Ltd.
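The "optimal linear quadratic regulation solution" embedded in the wireless sensors can be sketched by iterating the discrete-time Riccati recursion to a steady-state feedback gain. The two-state model below is a generic discretized integrator chain, not the paper's structural model:

```python
import numpy as np

def dlqr_gain(A, B, Q, R, iters=500):
    """Steady-state discrete LQR gain K (u = -K x) via the Riccati recursion."""
    P = Q.copy()
    for _ in range(iters):
        BT_P = B.T @ P
        K = np.linalg.solve(R + BT_P @ B, BT_P @ A)   # K = (R + B'PB)^-1 B'PA
        P = Q + A.T @ P @ (A - B @ K)                  # Riccati update
    return K

dt = 0.1
A = np.array([[1.0, dt], [0.0, 1.0]])     # discretized double integrator (drift, velocity)
B = np.array([[0.0], [dt]])               # control enters through the actuator (e.g. damper)
Q = np.eye(2)                             # weight on drift and velocity response
R = np.array([[0.1]])                     # weight on control effort
K = dlqr_gain(A, B, Q, R)
closed_loop_poles = np.linalg.eigvals(A - B @ K)
```

In the paper's architecture, a gain of this form is precomputed and embedded in the controller node, which then only multiplies collected state data by K each time step, a computation small enough for a wireless sensor's processor.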

Patent
13 May 2008
TL;DR: In this paper, a network real-time kinematic (RTK) subsystem is used to generate correction data associated with the data acquisition period and to correct the position and orientation data based on that correction data.
Abstract: A method of generating post-mission position and orientation data comprises generating position and orientation data representing positions and orientations of a mobile platform, based on global navigation satellite system (GNSS) data and inertial navigation system (INS) data acquired during a data acquisition period by the mobile platform, using a network real-time kinematic (RTK) subsystem to generate correction data associated with the data acquisition period, and correcting the position and orientation data based on the correction data. The RTK subsystem may implement a virtual reference station (VRS) technique to generate the correction data.
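The post-mission correction step amounts to interpolating the network-derived correction data onto the platform's acquisition epochs and applying it to the raw positions. A minimal sketch, with wholly hypothetical timestamps, positions, and correction vectors (the patent does not specify this data layout):

```python
import numpy as np

# Raw GNSS/INS positions recorded during the data acquisition period
# (all values illustrative).
traj_t = np.array([0.0, 1.0, 2.0, 3.0])        # s, acquisition epochs
traj_pos = np.array([[100.0, 200.0, 50.0],
                     [101.0, 201.0, 50.1],
                     [102.0, 202.0, 50.2],
                     [103.0, 203.0, 50.3]])    # m

# Correction vectors produced by the network RTK / VRS subsystem at
# its own (sparser) epochs.
corr_t = np.array([0.0, 2.0, 4.0])             # s
corr = np.array([[0.05, -0.02, 0.01],
                 [0.04, -0.03, 0.02],
                 [0.03, -0.04, 0.03]])         # m

# Interpolate each correction component onto the acquisition epochs
# and apply it to the raw trajectory.
corr_interp = np.column_stack([
    np.interp(traj_t, corr_t, corr[:, i]) for i in range(3)
])
corrected = traj_pos + corr_interp
print(corrected[0])
```

The real method operates on full position-and-orientation states and uses a virtual reference station model rather than simple linear interpolation; the sketch only shows the "correct after the fact" structure of the workflow.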

Patent
07 May 2008
TL;DR: In this paper, a system for automatically generating a schematic user interface of an electrical system is presented, which includes a data acquisition component, a power analytics server and a client terminal.
Abstract: A system for automatically generating a schematic user interface of an electrical system is disclosed. The system includes a data acquisition component, a power analytics server and a client terminal. The data acquisition component acquires real-time data output from the electrical system. The power analytics server is comprised of a virtual system modeling engine, an analytics engine, a machine learning engine and a schematic user interface creator engine. The virtual system modeling engine generates predicted data output for the electrical system. The analytics engine monitors real-time data output and predicted data output of the electrical system. The machine learning engine stores and processes patterns observed from the real-time data output and the predicted data output to forecast an aspect of the electrical system. The schematic user interface creator engine is configured to create a schematic user interface that is representative of the virtual system model and link the schematic user interface to the data acquisition component.
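The analytics engine's core comparison of real-time data output against the virtual model's predicted output can be illustrated with a toy deviation check. The readings and the 5 % alarm threshold below are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Predicted output from the virtual system model vs. real-time data
# acquired from the electrical system (illustrative bus voltages).
predicted = np.array([480.0, 480.0, 480.0, 480.0])   # volts
realtime  = np.array([479.0, 481.0, 455.0, 480.5])   # volts

# Flag points where the acquired data deviates from the model by more
# than an assumed 5 % threshold.
deviation = np.abs(realtime - predicted) / predicted
alarm = deviation > 0.05
print(alarm)
```

In the disclosed system, such deviations (and the patterns the machine learning engine extracts from them) would drive forecasts and be surfaced through the automatically generated schematic user interface.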

Proceedings ArticleDOI
12 May 2008
TL;DR: The results show that the devices' color interpolation coefficients and noise statistics can jointly serve as good forensic features to help accurately trace the origin of the input image to its production process and to differentiate between images produced by cameras, cell phone cameras, scanners, and computer graphics.
Abstract: With the widespread availability of digital images and easy-to-use image editing software, the origin and integrity of digital images have become a serious concern. This paper introduces the problem of image acquisition forensics and proposes a fusion of a set of signal processing features to identify the source of digital images. Our results show that the devices' color interpolation coefficients and noise statistics can jointly serve as good forensic features to help accurately trace the origin of the input image to its production process and to differentiate between images produced by cameras, cell phone cameras, scanners, and computer graphics. Further, the proposed features can also be extended to determining the brand and model of the device. Thus, the techniques introduced in this work provide a unified framework for image acquisition forensics.
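One feature family in the spirit of this paper can be sketched as a least-squares fit of linear interpolation coefficients that predict each pixel from its neighbours, with the residual variance acting as a crude noise statistic. The 3x3 neighbourhood, single-channel synthetic image, and feature layout are assumptions for the demo, not the paper's exact estimator:

```python
import numpy as np

# Synthetic single-channel image with some spatial correlation,
# standing in for one color plane of a test image.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
img = (img + np.roll(img, 1, 0) + np.roll(img, 1, 1)) / 3.0

# Build a linear system: predict each interior pixel from its 8
# neighbours in a 3x3 window.
rows, cols = img.shape
neigh, center = [], []
for r in range(1, rows - 1):
    for c in range(1, cols - 1):
        patch = img[r - 1:r + 2, c - 1:c + 2].ravel()
        neigh.append(np.delete(patch, 4))   # the 8 neighbours
        center.append(patch[4])             # pixel being predicted
A = np.array(neigh)
b = np.array(center)

# Interpolation coefficients (8) plus residual noise variance (1)
# form a 9-dimensional forensic feature vector for a classifier.
coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
residual = b - A @ coeffs
features = np.concatenate([coeffs, [residual.var()]])
print(features.shape)
```

Feature vectors of this kind, computed per device class, would then feed a classifier that separates cameras, cell phone cameras, scanners, and computer graphics.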

Journal ArticleDOI
TL;DR: This work introduces 'k-Space tutorial', a MATLAB-based educational environment to learn how the image and the k-space are related, andHow the image can be affected through k- space modifications.
Abstract: A main difference between Magnetic Resonance (MR) imaging and other medical imaging modalities is the control over the data acquisition and how it can be managed to finally show the adequate reconstructed image. With some basic programming adjustments, the user can modify the spatial resolution, field of view (FOV), image contrast, acquisition velocity, artifacts and many other parameters that contribute to the final image. The central element of all this control is the k-space, the matrix where the MR data are stored prior to the Fourier transformation that yields the desired image. This work introduces 'k-Space tutorial', a MATLAB-based educational environment to learn how the image and the k-space are related, and how the image can be affected through k-space modifications. This MR imaging educational environment provides learning facilities on the basic acceleration strategies found in almost all MR scanners: scan percentage, rectangular FOV and partial Fourier imaging. It also permits one to apply low- and high-pass filtering to the k-space, and to observe how the contrast or the details are selected in the reconstructed image. Finally, it allows one to modify the signal-to-noise ratio of the acquisition and to create artifacts on the image, such as simulated patient movement with a variable intensity level and electromagnetic spikes on k-space occurring during data acquisition.
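The image/k-space relationship the tutorial teaches can be demonstrated in a few lines: Fourier-transform a synthetic image, keep only the central (low-frequency) k-space samples, and reconstruct. The disc phantom and the size of the low-pass window are illustrative choices, not taken from the tutorial itself:

```python
import numpy as np

# Simple disc phantom standing in for an MR image.
N = 128
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
phantom = ((x**2 + y**2) < (N // 4) ** 2).astype(float)

# Image -> k-space (centred so low frequencies sit in the middle).
kspace = np.fft.fftshift(np.fft.fft2(phantom))

# Keep only a central 31x31 block of k-space: a low-pass filter.
mask = (np.abs(x) < 16) & (np.abs(y) < 16)
recon = np.fft.ifft2(np.fft.ifftshift(kspace * mask))

# The low-pass reconstruction preserves overall contrast but blurs
# edges and shows Gibbs ringing, as the tutorial illustrates.
print(np.abs(recon).max())
```

Keeping only the outer (high-frequency) samples instead inverts the effect: edges and fine detail survive while the bulk contrast disappears.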

Journal ArticleDOI
TL;DR: In this paper, the effect of the seismic instruments on the horizontal-to-vertical spectral ratio (H/V) using seismic noise for frequencies less than 1 Hz has been evaluated.
Abstract: Using three different short-period electromagnetic sensors with resonance frequencies of 1 Hz (Mark L4C-3D), 2 Hz (Mark L-22D), and 4.5 Hz (I/O SM-6), coupled with three digital acquisition systems, the portable data acquisition system (PDAS) Teledyne Geotech, the Refraction Technology (REFTEK) 72A, and the Earth Data Logger PR6-24 (EDL), the effect of the seismic instruments on the horizontal-to-vertical spectral ratio (H/V) using seismic noise for frequencies less than 1 Hz has been evaluated. For all possible sensor–acquisition system pairs, the background seismic signal and instrumental self-noise power spectral densities have been calculated and compared. The results obtained when coupling the short-period sensors with different acquisition systems show that the performance of the considered instruments at frequencies <1 Hz strongly depends upon the sensor–acquisition system combination and the gain used, with the best performance obtained for sensors with the lowest resonance frequency. For all acquisition systems, it was possible to correctly retrieve the H/V peak down to 0.1–0.2 Hz by using a high gain and a 1-Hz sensor. In contrast, biased H/V spectral ratios were retrieved when low-gain values were considered. Particular care is required when using 4.5-Hz sensors, because they may not even allow the fundamental resonance frequency peak to be reproduced.
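The H/V computation itself can be sketched as a ratio of power spectral densities estimated from the three noise components. The synthetic records (horizontals twice as strong as the vertical) and the segment-averaged periodogram below are stand-ins for real data and the authors' processing:

```python
import numpy as np

# Synthetic three-component noise records: horizontals are set twice
# as strong as the vertical, so the expected H/V is about 2.
fs, n = 100.0, 2**14                      # 100 Hz sampling, ~164 s
rng = np.random.default_rng(1)
ns_comp = rng.normal(size=n)              # north-south
ew_comp = rng.normal(size=n)              # east-west
z_comp = 0.5 * rng.normal(size=n)         # vertical

def psd(x, fs, nseg=1024):
    """Segment-averaged (Welch-style) periodogram with a Hann window."""
    win = np.hanning(nseg)
    segs = [x[i:i + nseg] * win
            for i in range(0, len(x) - nseg + 1, nseg // 2)]
    spec = np.mean([np.abs(np.fft.rfft(s))**2 for s in segs], axis=0)
    return np.fft.rfftfreq(nseg, 1 / fs), spec / (fs * np.sum(win**2))

f, p_ns = psd(ns_comp, fs)
_, p_ew = psd(ew_comp, fs)
_, p_z = psd(z_comp, fs)

# H/V: quadratic mean of the horizontal PSDs over the vertical PSD.
hv = np.sqrt((p_ns + p_ew) / (2.0 * p_z))
print(hv[(f > 0.5) & (f < 10)].mean())
```

On real records, the paper's point is precisely that this ratio is only trustworthy below 1 Hz when the sensor–acquisition system pair and gain keep the instrumental self-noise below the background seismic signal.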

Proceedings ArticleDOI
07 Jul 2008
TL;DR: It is shown that the unrecorded satellite jitter during image acquisition, and the uncertainties on the CCD arrays geometry are the current major limiting factors for applications requiring high accuracy.
Abstract: Applications such as change detection and digital elevation model extraction from optical images require a rigorous modeling of the acquisition geometry. We show that the unrecorded satellite jitter during image acquisition, and the uncertainties on the CCD arrays geometry are the current major limiting factors for applications requiring high accuracy. These artifacts are identified and quantified on several optical satellites, i.e., SPOT, ASTER, QuickBird, and HiRISE.