
Showing papers on "Data acquisition published in 2018"


Journal ArticleDOI
TL;DR: This tutorial provides guidelines on how to set up and plan a SWATH‐MS experiment, how to perform the mass spectrometric measurement and how to analyse SWATH‐MS data using peptide‐centric scoring.
Abstract: Many research questions in fields such as personalized medicine, drug screens or systems biology depend on obtaining consistent and quantitatively accurate proteomics data from many samples. SWATH-MS is a specific variant of data-independent acquisition (DIA) methods and is emerging as a technology that combines deep proteome coverage capabilities with quantitative consistency and accuracy. In a SWATH-MS measurement, all ionized peptides of a given sample that fall within a specified mass range are fragmented in a systematic and unbiased fashion using rather large precursor isolation windows. To analyse SWATH-MS data, a strategy based on peptide-centric scoring has been established, which typically requires prior knowledge about the chromatographic and mass spectrometric behaviour of peptides of interest in the form of spectral libraries and peptide query parameters. This tutorial provides guidelines on how to set up and plan a SWATH-MS experiment, how to perform the mass spectrometric measurement and how to analyse SWATH-MS data using peptide-centric scoring. Furthermore, concepts on how to improve SWATH-MS data acquisition, potential trade-offs of parameter settings and alternative data analysis strategies are discussed.

613 citations


Journal ArticleDOI
TL;DR: Two methods of obtaining k-space mapping and real-space imaging in high-resolution ARPES microscopy are presented, which clearly indicate higher accuracy in k-space mapping as well as higher efficiency in real-space imaging, and thus improved throughput of high-resolution ARPES microscopy.
Abstract: Angle-resolved photoemission spectroscopy (ARPES) is a powerful experimental technique in materials science, as it can directly probe electronic states inside solids in energy (E) and momentum (k) space. As an advanced technique, spatially-resolved ARPES using a well-focused light source (high-resolution ARPES microscopy) has recently attracted growing interest because of its capability to obtain local electronic information at micro- or nano-metric length scales. However, there exist several technical challenges to guarantee high precision in determining translational and rotational positions in a reasonable measurement time. Here we present two methods of obtaining k-space mapping and real-space imaging in high-resolution ARPES microscopy. One method is for k-space mapping measurements that enables us to keep a target position on a sample surface during sample rotation by compensating rotation-induced displacements (tracing acquisition method). Another method is for real-space imaging measurements that significantly reduces total acquisition time (scanning acquisition method). We provide several examples of these methods that clearly indicate higher accuracy in k-space mapping as well as higher efficiency in real-space imaging, and thus improved throughput of high-resolution ARPES microscopy.

362 citations


Journal ArticleDOI
TL;DR: A nonlinear projection is applied to achieve the compressed acquisition, which not only reduces the amount of measured data that contained all the information of faults but also realizes the automatic feature extraction in transform domain.
Abstract: Effective intelligent fault diagnosis has long been a research focus on the condition monitoring of rotary machinery systems. Traditionally, time-domain vibration-based fault diagnosis has some deficiencies, such as complex computation of feature vectors, excessive dependence on prior knowledge and diagnostic expertise, and limited capacity for learning complex relationships in fault signals. Furthermore, following the increase in condition data, how to promptly process the massive fault data and automatically provide accurate diagnosis has become an urgent problem to solve. Inspired by the ideas of compressed sensing and deep learning, a novel intelligent diagnosis method is proposed for fault identification of rotating machines. In this paper, a nonlinear projection is applied to achieve the compressed acquisition, which not only reduces the amount of measured data while retaining all the fault information but also realizes automatic feature extraction in the transform domain. For exploring the discrimination hidden in the acquired data, a stacked sparse autoencoder-based deep neural network is established and trained with an unsupervised learning procedure followed by a supervised fine-tuning process. We studied the significance of compressed acquisition and provide the effects of key factors and a comparison with traditional methods. The effectiveness of the proposed method is validated using data sets from rolling element bearings, and the analysis shows that it is able to obtain high diagnostic accuracy and is superior to existing methods. The proposed method reduces the need for human labor and expertise and provides a new strategy for handling massive data more easily.
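The compressed-acquisition idea can be illustrated with a minimal sketch: a random Gaussian projection (a common linear stand-in; the paper uses a nonlinear projection) reduces an n-sample vibration record to m ≪ n measurements. All names, sizes and signal parameters below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def compressed_acquire(signal, m):
    """Project an n-sample vibration record onto m << n random measurements."""
    n = signal.size
    phi = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian measurement matrix
    return phi @ signal

# Synthetic vibration record: a tone plus noise, 1024 samples
n = 1024
t = np.arange(n)
signal = np.sin(2 * np.pi * t / 64) + 0.1 * rng.standard_normal(n)
y = compressed_acquire(signal, m=128)
print(y.shape)  # (128,) -- an 8x reduction in stored data
```

The compressed vector y would then feed the autoencoder network directly, with no reconstruction step in between.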

283 citations


Journal ArticleDOI
TL;DR: NeuroMatic is an open-source software toolkit that performs data acquisition, data analysis and simulations of electrophysiological properties of the nervous system, and has the advantage of working within Igor Pro, a platform-independent environment that includes an extensive library of built-in functions.
Abstract: Acquisition, analysis and simulation of electrophysiological properties of the nervous system require multiple software packages. This makes it difficult to conserve experimental metadata and track the analysis performed. It also complicates certain experimental approaches such as online analysis. To address this, we developed NeuroMatic, an open-source software toolkit that performs data acquisition (episodic, continuous and triggered recordings), data analysis (spike rasters, spontaneous event detection, curve fitting, stationarity) and simulations (stochastic synaptic transmission, synaptic short-term plasticity, integrate-and-fire and Hodgkin-Huxley-like single-compartment models). The merging of a wide range of tools into a single package facilitates a more integrated style of research, from the development of online analysis functions during data acquisition, to the simulation of synaptic conductance trains during dynamic-clamp experiments. Moreover, NeuroMatic has the advantage of working within Igor Pro, a platform-independent environment that includes an extensive library of built-in functions, a history window for reviewing the user's workflow and the ability to produce publication-quality graphics. Since its original release, NeuroMatic has been used in a wide range of scientific studies and its user base has grown considerably. NeuroMatic version 3.0 can be found at http://www.neuromatic.thinkrandom.com and https://github.com/SilverLabUCL/NeuroMatic.

190 citations


Patent
07 May 2018
TL;DR: In this paper, a monitoring apparatus, systems and methods for data collection in an industrial environment are disclosed, which include a data collector communicatively coupled to a plurality of input channels and to a network infrastructure, wherein the data collector collects data based on a selected data collection routine.
Abstract: A monitoring apparatus, systems and methods for data collection in an industrial environment are disclosed. A system may include a data collector communicatively coupled to a plurality of input channels and to a network infrastructure, wherein the data collector collects data based on a selected data collection routine, a data storage structured to store a plurality of collector routes and collected data, a data acquisition circuit structured to interpret a plurality of detection values from the collected data, and a data analysis circuit structured to analyze the collected data and determine an aggregate rate of data being collected from the plurality of input channels, wherein if the aggregate rate exceeds a throughput parameter of the network infrastructure, then the data analysis circuit alters the data collection to reduce the amount of data collected.
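The rate-limiting behaviour described in the claim can be sketched as follows. This is a hypothetical illustration of throughput-based throttling, not the patent's actual circuit; the class and parameter names are invented for the example.

```python
class DataCollector:
    """Reduce collection when the aggregate input rate exceeds a network limit."""

    def __init__(self, throughput_limit):
        self.throughput_limit = throughput_limit  # e.g. bytes/s the network can carry

    def plan(self, channel_rates):
        aggregate = sum(channel_rates)
        if aggregate <= self.throughput_limit:
            return channel_rates  # network can carry everything: collect in full
        scale = self.throughput_limit / aggregate
        return [r * scale for r in channel_rates]  # proportionally down-sample

collector = DataCollector(throughput_limit=1000.0)
print(collector.plan([400.0, 400.0, 400.0]))  # scaled so the total fits 1000.0
```

Proportional scaling is only one possible policy; the claim leaves open how the data collection is altered, e.g. dropping low-priority channels instead.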

109 citations


Journal ArticleDOI
TL;DR: A semi-automated routine for continuous rotation electron diffraction has been developed, enabling high-throughput data collection, with serial electron crystallography combined with a deep convolutional neural network used to screen for suitable crystals.
Abstract: Single-crystal electron diffraction (SCED) is emerging as an effective technique to determine and refine the structures of unknown nano-sized crystals. In this work, the implementation of the continuous rotation electron diffraction (cRED) method for high-throughput data collection is described. This is achieved through dedicated software that controls the transmission electron microscope and the camera. Crystal tracking can be performed by defocusing every nth diffraction pattern while the crystal rotates, which addresses the problem of the crystal moving out of view of the selected area aperture during rotation. This has greatly increased the number of successful experiments with larger rotation ranges and turned cRED data collection into a high-throughput method. The experimental parameters are logged, and input files for data processing software are written automatically. This reduces the risk of human error, and makes data collection more reproducible and accessible for novice and irregular users. In addition, it is demonstrated how data from the recently developed serial electron diffraction technique can be used to supplement the cRED data collection by automatic screening for suitable crystals using a deep convolutional neural network that can identify promising crystals through the corresponding diffraction data. The screening routine and cRED data collection are demonstrated using a sample of the zeolite mordenite, and the quality of the cRED data is assessed on the basis of the refined crystal structure.

107 citations


Journal ArticleDOI
TL;DR: Signac as discussed by the authors is a framework designed to assist in the integration of various specialized data formats, tools and workflows, simplifying data access and modification through a homogeneous data interface that is largely agnostic to the data source.

92 citations


Journal ArticleDOI
TL;DR: A low-cost and confidentiality-preserving data acquisition framework for IoMT that harnesses chaotic convolution and random subsampling to capture multiple image signals, assembles them into a master image, and encrypts it based on the Arnold transform and single value diffusion.
Abstract: Internet of Multimedia Things (IoMT) faces the challenge of how to realize low-cost data acquisition while still preserving data confidentiality. In this paper, we present a low-cost and confidentiality-preserving data acquisition framework for IoMT. First, we harness chaotic convolution and random subsampling to capture multiple image signals. The measurement matrix is under the control of chaos, ensuring the security of the sampling process. Next, we assemble these sampled images into a big master image, and then encrypt this master image based on the Arnold transform and single value diffusion. The computation of these two transforms only requires some low-complexity operations. Finally, the encrypted image is delivered to cloud servers for storage and decryption service. Experimental results demonstrate the security and effectiveness of the proposed framework.
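The Arnold-transform stage of such a pipeline can be sketched in a few lines. This is the standard Arnold cat map on a square image, shown only to illustrate the scrambling/unscrambling idea; the chaotic convolution, subsampling and diffusion stages of the paper are omitted.

```python
import numpy as np

def arnold(img):
    """One round of the Arnold cat map: scramble pixels of a square image."""
    n = img.shape[0]
    out = np.empty_like(img)
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    out[(x + y) % n, (x + 2 * y) % n] = img[x, y]  # (x,y) -> (x+y, x+2y) mod n
    return out

def arnold_inverse(img):
    """Exact inverse round: (u,v) -> (2u-v, v-u) mod n."""
    n = img.shape[0]
    out = np.empty_like(img)
    u, v = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    out[(2 * u - v) % n, (v - u) % n] = img[u, v]
    return out

img = np.arange(64).reshape(8, 8)
scrambled = arnold(arnold(img))                      # two scrambling rounds
restored = arnold_inverse(arnold_inverse(scrambled)) # two inverse rounds
print(np.array_equal(restored, img))  # True
```

Because the map matrix has determinant 1, every round is an exact permutation of pixels, so decryption simply applies the inverse map the same number of times.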

88 citations


Journal ArticleDOI
TL;DR: A portable continuous measurement toolbox which provides a robust, easily extendable, and low-cost setup for indoor environmental quality monitoring and performance assessment, utilizing the open-source, agent-based software platform VOLTTRON for data communication and analysis.
Abstract: Building performance monitoring could be limited due to the cost and inflexibility of hardware and software platforms for data acquisition. This paper describes a portable continuous measurement toolbox which provides a robust, easily extendable, and low-cost setup for indoor environmental quality (IEQ) monitoring and performance assessment. Various sensors—temperature, relative humidity, illuminance, CO2, VOC, PM2.5, and occupancy—for IEQ performance measurement are included in this toolbox. Arduino Uno boards were connected to the sensors for data acquisition. ZigBee communication protocol was established between an XBee device for each Arduino board and an XBee receiver connected to a computer. The toolbox utilized the open source, agent-based software platform VOLTTRON for data communication and analysis. The data collection system was calibrated against an accurate data acquisition card. Experiments have been conducted using the toolbox for assessing IEQ performance in an open computer lab within a commercial building. Thermal comfort, indoor air quality, and lighting performance have been analyzed based on collected data. The study demonstrated reliability and robustness of the toolbox for continuous monitoring of indoor environmental quality.

88 citations


Patent
07 May 2018
TL;DR: In this paper, a monitoring system for data collection in an industrial drilling environment comprising a data collector communicatively coupled to a plurality of input channels and to network infrastructure, wherein the data collector is sensitive to a change to a parameter of the network infrastructure within the environment, is presented.
Abstract: A monitoring system for data collection in an industrial drilling environment comprising a data collector communicatively coupled to a plurality of input channels and to network infrastructure, wherein the data collector is sensitive to a change to a parameter of the network infrastructure within the environment; a data storage structured to store a plurality of collector routes, each comprising a different data collection routine, and collected data that corresponds to the input channels; a data acquisition circuit structured to interpret a plurality of detection values from the collected data corresponding to one of the input channels; and a data analysis circuit structured to analyze the collected data from the plurality of input channels and evaluate a selected collection routine of the data collector, wherein the selected collection routine is switched to a second collection routine due to the data analysis circuit detecting a change to a network infrastructure parameter.

78 citations


Journal ArticleDOI
06 Aug 2018
TL;DR: This paper surveys methodologies for performing distributed PCA on different data sets, their performance, and their applications in the context of distributed data acquisition systems.
Abstract: Principal component analysis (PCA) is a fundamental primitive of many data analysis, array processing, and machine learning methods. In applications where extremely large arrays of data are involved, particularly in distributed data acquisition systems, distributed PCA algorithms can harness local communications and network connectivity to overcome the need of communicating and accessing the entire array locally. A key feature of distributed PCA algorithms is that they defy the conventional notion that the first step toward computing the principal vectors is to form a sample covariance. This paper is a survey of the methodologies to perform distributed PCA on different data sets, of their performance, and of their applications in the context of distributed data acquisition systems.
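The covariance-free idea can be illustrated with a toy power iteration: each node computes only local products X_i v and X_i^T (X_i v), and a coordinator aggregates them, so the d × d sample covariance is never materialized. The partitioning and names below are assumptions for the sketch, with network communication replaced by a plain Python sum.

```python
import numpy as np

rng = np.random.default_rng(1)

def distributed_top_pc(node_data, iters=200):
    """Top principal direction via power iteration on row-partitioned data.

    The product C v = (sum_i X_i.T X_i) v is computed as sum_i X_i.T (X_i v),
    so no node ever forms the full d x d covariance matrix.
    """
    d = node_data[0].shape[1]
    v = rng.standard_normal(d)
    for _ in range(iters):
        cv = sum(x.T @ (x @ v) for x in node_data)  # local products, then aggregate
        v = cv / np.linalg.norm(cv)
    return v

# Synthetic zero-mean data split across three "nodes"; variance dominates axis 0
direction = np.array([3.0, 1.0, 0.5])
full = rng.standard_normal((300, 3)) * direction
nodes = list(np.split(full, 3))
v = distributed_top_pc(nodes)
print(np.round(np.abs(v), 2))
```

Each iteration costs one length-d vector broadcast and one length-d vector reduction per node, which is the communication pattern the surveyed algorithms refine.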

Journal ArticleDOI
TL;DR: A data-driven coherency identification methodology is proposed based on the spectral clustering algorithm to identify the co herency of synchronous generators using real-time signals from phasor measurement units (PMUs).
Abstract: The wide-area measurement system provides a new data acquisition and supervisory control tool for a power system, and the data acquisition level is increased dramatically with its development in the smart grid environment. Huge data associated with the power system operation are acquired, which are beneficial for enhancing situational awareness of a power system concerned. Identifying the coherency among synchronous generators using real-time signals from phasor measurement units (PMUs) is one of the major tasks of situational awareness in power system operation. Given this background, a data-driven coherency identification methodology is proposed based on the spectral clustering algorithm. First, several trajectory dissimilarity indices for the rotor angle and rotor speed trajectories of generators as measured by PMUs are presented based on the trajectory similarity theory. Second, a decision-making method based on the Gini coefficient and Kendall rank correlation coefficient is presented for integrating multiple indices describing trajectory dissimilarities. Third, the spectral clustering algorithm is presented to identify the coherency of synchronous generators, and silhouette is presented for determining a reasonable number of coherent groups. Finally, oscillation events happened/simulated in two actual power systems, i.e., Guangdong power system in China and Western Interconnection power system in North America, are utilized to demonstrate the effectiveness of the proposed data-driven coherency identification methodology.
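A minimal sketch of the spectral-clustering step, assuming a Gaussian affinity built from pairwise trajectory distances and a two-group split by the sign of the Fiedler vector; the paper's dissimilarity indices, Gini/Kendall index fusion and silhouette-based selection of the group count are omitted.

```python
import numpy as np

def coherent_groups(trajectories, sigma=1.0):
    """Split generators into two coherent groups by spectral clustering.

    Affinity comes from pairwise trajectory distance; the group label is the
    sign of the Fiedler vector (2nd eigenvector of the graph Laplacian).
    """
    d = np.array([[np.linalg.norm(a - b) for b in trajectories] for a in trajectories])
    w = np.exp(-(d ** 2) / (2 * sigma ** 2))          # Gaussian affinity matrix
    lap = np.diag(w.sum(axis=1)) - w                   # unnormalized Laplacian
    _, vecs = np.linalg.eigh(lap)                      # eigenvalues ascending
    return (vecs[:, 1] > 0).astype(int)

# Synthetic rotor-angle swings: two groups of generators oscillating in anti-phase
t = np.linspace(0, 1, 100)
group_a = [np.sin(2 * np.pi * t) + 0.05 * np.random.default_rng(i).standard_normal(100)
           for i in range(3)]
group_b = [-np.sin(2 * np.pi * t) + 0.05 * np.random.default_rng(i + 10).standard_normal(100)
           for i in range(3)]
labels = coherent_groups(group_a + group_b, sigma=5.0)
print(labels)
```

For more than two coherent groups one would k-means-cluster the first k Laplacian eigenvectors instead of thresholding a single one.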

Journal ArticleDOI
TL;DR: The Backscatter Working Group (BSWG), as discussed by the authors, was established by the Marine Geological and Biological Habitat Mapping group (Geohab.org) in May 2013.
Abstract: Multibeam echosounders are becoming widespread for the purposes of seafloor bathymetry mapping, but the acquisition and the use of seafloor backscatter measurements, acquired simultaneously with the bathymetric data, are still insufficiently understood, controlled and standardized. This presents an obstacle to well-accepted, standardized analysis and application by end users. The Marine Geological and Biological Habitat Mapping group (Geohab.org) has long recognized the need for better coherence and common agreement on acquisition, processing and interpretation of seafloor backscatter data, and established the Backscatter Working Group (BSWG) in May 2013. This paper presents an overview of this initiative, the mandate, structure and program of the working group, and a synopsis of the BSWG Guidelines and Recommendations to date. The paper includes (1) an overview of the current status in sensors and techniques available in seafloor backscatter data from multibeam sonars; (2) the presentation of the BSWG structure and results; (3) recommendations to operators, end-users, sonar manufacturers, and software developers using sonar backscatter for seafloor-mapping applications, for best practice methods and approaches for data acquisition and processing; and (4) a discussion on the development needs for future systems and data processing. We propose for the first time a nomenclature of backscatter processing levels that affords a means to accurately and efficiently describe the data processing status, and to facilitate comparisons of final products from various origins.

Journal ArticleDOI
TL;DR: In this paper, a high-precision digital synthesis method and a digital demodulation technique with high noise immunity were used to eliminate random errors in electrical impedance tomography (EIT) data acquisition.
Abstract: In electrical impedance tomography (EIT), it is difficult to obtain the intracranial impedance due to the highly resistive skull enclosing the brain. Therefore, a high-precision data acquisition system is required for brain EIT. In this paper, we used a high-precision digital synthesis method and a digital demodulation technique with high noise immunity to eliminate random errors. Moreover, we focused on two problems encountered during EIT data acquisition: 1) the shunt effect on the excitation current due to the distributed capacitance between electrodes and ground and 2) high common-mode voltages in the boundary measurements. We designed a new electrode interface to reduce the influence of the distributed capacitance and a programmable current source to accurately compensate for the excited current. We also proposed a new voltmeter circuit with improved CMRR. Overall, this EIT data acquisition system can produce a programmable current with SNR greater than 89 dB. It can also measure the voltage difference precisely with CMRR higher than 75 dB with a 1-kΩ impedance imbalance. The results on a calibration model show that this system has a high SNR of 83 dB and a low reciprocity error of 0.125%. In addition, EIT imaging results were acquired using a brain physical phantom. The system can detect small disturbances of 0.35% in volume (1.99% of the cross-sectional area) and 17% in resistivity. Experiments on healthy volunteers also suggest that small intracranial impedance variations due to temporary occlusion and reperfusion of the unilateral carotid artery may be monitored by the system.

Journal ArticleDOI
TL;DR: Two new PIGs (pipeline inspection gauges) are designed to investigate the speed variable while detecting defects in pipelines, and the usability of these designs in determining pipeline defects is examined through an example experiment analysed with the Origin program.

Journal ArticleDOI
07 Feb 2018
TL;DR: The aspects of data acquisition and signal conditioning are briefly covered, and various methods of feature extraction and classification are discussed.
Abstract: In the past few years the utilization of biological signals as a method of interfacing with a robotic device has become increasingly prominent, with many of these systems being based on EEG and EMG. EMG-based control has five main parts: data acquisition, signal conditioning, feature extraction, classification, and control. This paper seeks to briefly cover the aspects of data acquisition and signal conditioning. After which, various methods of feature extraction and classification are discussed.

Journal ArticleDOI
TL;DR: An approach termed Tomosaic is demonstrated for tomographic imaging of large samples that extend beyond the illumination field of view of an X-ray imaging system.
Abstract: X-rays offer high penetration with the potential for tomography of centimetre-sized specimens, but synchrotron beamlines often provide illumination that is only millimetres wide. Here an approach termed Tomosaic is demonstrated for tomographic imaging of large samples that extend beyond the illumination field of view of an X-ray imaging system. This includes software modules for image stitching and calibration, while making use of existing modules available in other packages for alignment and reconstruction. The approach is compatible with conventional beamline hardware, while providing a dose-efficient method of data acquisition. By using parallelization on a distributed computing system, it provides a solution for handling teravoxel-sized or larger datasets that cannot be processed on a single workstation in a reasonable time. Using experimental data, the package is shown to provide good quality three-dimensional reconstruction for centimetre-sized samples with sub-micrometre pixel size.

Journal ArticleDOI
TL;DR: This low-cost virtual instrumentation solution, which provides a new technique for real-time measurement of PV panel characteristics such as voltage, current and power, is found to offer several benefits over the traditional approach, such as the ability to present the data in graphical form in real time.
Abstract: This paper presents a low-cost solution of virtual instrumentation to provide a new technique for real-time instrumentation of the PV panel characteristics such as voltage, current and power. The system design is based on a low-cost Arduino acquisition board. The acquisition is made through low-cost current and voltage sensors, and data are presented in Excel by using the PLX-DAQ data acquisition Excel Macro, which allows communication between the ATMega328 microcontroller of an Arduino UNO board and the computer over the UART bus. Hence, the I–V and P–V characteristics, which are processed under real-time conditions, can be obtained directly and plotted on an Excel spreadsheet without needing to reprogram the microcontroller. A comparison between this low-cost virtual instrumentation and the traditional instrumentation is drawn in this work. It is found that our solution presents several benefits compared to the traditional solution, such as the ability to present the data in graphical form in real time. Thus, several experimental tests to confirm the effectiveness of the developed virtual instrumentation system are presented in this study.
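Once I–V pairs have been logged, the P–V curve and its maximum power point follow directly. A hypothetical sketch with made-up sample values (not the paper's measurements):

```python
def max_power_point(voltages, currents):
    """Locate the maximum power point from sampled I-V pairs."""
    powers = [v * i for v, i in zip(voltages, currents)]
    k = max(range(len(powers)), key=powers.__getitem__)  # index of peak power
    return voltages[k], powers[k]

# Hypothetical sweep of a small PV panel: voltage (V) and current (A) samples
v = [0.0, 5.0, 10.0, 15.0, 17.0, 19.0, 20.0]
i = [1.20, 1.19, 1.17, 1.10, 1.00, 0.60, 0.00]
vmpp, pmax = max_power_point(v, i)
print(vmpp, pmax)  # 17.0 17.0
```

On the real system this computation would live in the spreadsheet fed by PLX-DAQ rather than in host-side code.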

Journal ArticleDOI
TL;DR: Comparison results demonstrate that the proposed route planning methodology is able to greatly enhance the efficiency of 3D reconstruction by improving the data collection speed while minimizing redundant image datasets, as well as to provide a normalized approach to assign the single or multi-UAV data acquisition tasks.
Abstract: In order to provide a fast multi-UAV cooperative data acquisition approach for 3D building model reconstruction in the emergency management domain, a route planning methodology is proposed. A minimum image set including camera shooting positions and attitudes is firstly obtained, given parameters describing the target building, UAVs, cameras, and image overlap requirements. A specific flight route network is then determined, and the optimal solution for multi-UAV data capture route planning is computed on the basis of constraint conditions such as the time frame, UAV battery endurance, and take-off and landing positions. Furthermore, field experiments with manual UAV operating mode, single-UAV mode, and multi-UAV mode were conducted to compare the data collection and processing runtimes, as well as the quality of the created 3D building models. According to the five LoDs defined by the OGC CityGML 2.0 standard, the fine 3D building models conform to LoD3. Comparison results demonstrate that our method is able to greatly enhance the efficiency of 3D reconstruction by improving the data collection speed while minimizing redundant image datasets, as well as to provide a normalized approach to assigning single- or multi-UAV data acquisition tasks. The quality analysis of the 3D models shows that the metric difference is less than 20 cm mean error with a standard deviation of 11 cm, which is fairly acceptable in the emergency management field. A 3D GIS-based software demo was also implemented to enable route planning, flight simulation, and data collection visualization.
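The minimum-image-set computation reduces, in one dimension, to choosing trigger positions so that neighbouring image footprints overlap by the required fraction. A simplified sketch under assumed parameter names (the paper's full 3D formulation with camera attitudes is not reproduced):

```python
import math

def shot_positions(length, footprint, overlap):
    """Evenly spaced camera trigger positions along a facade of given length (m).

    footprint: ground coverage of one image (m); overlap: required fractional
    overlap between neighbouring images. Returns positions of image centres.
    """
    start = footprint / 2.0          # first centre: footprint flush with the start
    end = length - footprint / 2.0   # last centre: footprint flush with the end
    max_step = footprint * (1.0 - overlap)
    # number of equal intervals so the spacing never exceeds max_step
    # (the 1e-9 guards against float rounding pushing ceil one interval too high)
    n = int(math.ceil((end - start) / max_step - 1e-9))
    return [start + k * (end - start) / n for k in range(n + 1)]

positions = shot_positions(length=100.0, footprint=20.0, overlap=0.8)
print(len(positions), positions[0], positions[-1])  # 21 10.0 90.0
```

The same spacing rule applied along a second axis (flight lines) yields the 2D image grid from which redundant shots are pruned.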

Journal ArticleDOI
TL;DR: The implementation of DA+ data acquisition and analysis software at Swiss Light Source macromolecular crystallography beamlines is presented and details of the user interface, acquisition engine, online processing and database are given.
Abstract: Data acquisition software is an essential component of modern macromolecular crystallography (MX) beamlines, enabling efficient use of beam time at synchrotron facilities. Developed at the Paul Scherrer Institute, the DA+ data acquisition software is implemented at all three Swiss Light Source (SLS) MX beamlines. DA+ consists of distributed services and components written in Python and Java, which communicate via messaging and streaming technologies. The major components of DA+ are the user interface, acquisition engine, online processing and database. Immediate data quality feedback is achieved with distributed automatic data analysis routines. The software architecture enables exploration of the full potential of the latest instrumentation at the SLS MX beamlines, such as the SmarGon goniometer and the EIGER X 16M detector, and development of new data collection methods.

Journal ArticleDOI
TL;DR: In this paper, a new data processing method for deriving parameters at fast resolution and with reliable accuracy is presented. Based on tests in the field and in the laboratory, the limitations and verifiability of the integrated sensors are discussed.
Abstract: The unmanned research aircraft ALADINA (Application of Light-weight Aircraft for Detecting in situ Aerosols) has been established as an important tool for boundary layer research. For simplified integration of additional sensor payload, a flexible and reliable data acquisition system was developed at the Institute of Flight Guidance, Technische Universität (TU) Braunschweig. The instrumentation consists of sensors for temperature, humidity, three-dimensional wind vector, position, black carbon, irradiance and atmospheric particles in the diameter range from ultra-fine particles up to the accumulation mode. The modular concept allows for straightforward integration and exchange of sensors. So far, more than 200 measurement flights have been performed with the robustly-engineered system ALADINA at different locations. The obtained datasets are unique in the field of atmospheric boundary layer research. In this study, a new data processing method for deriving parameters at fast resolution and with reliable accuracy is presented. Based on tests in the field and in the laboratory, the limitations and verifiability of the integrated sensors are discussed.

Journal ArticleDOI
27 Jan 2018-Sensors
TL;DR: It is concluded that these technologies are applicable to the automobile industry, thereby allowing the project costs to be reduced and thus facilitating access to this kind of research that requires limited resources.
Abstract: This project addresses the need for the implementation of low-cost acquisition technology in the field of vehicle engineering: the design, development, manufacture, and verification of a low-cost Arduino-based data acquisition platform to be used in <80 Hz data acquisition in vehicle dynamics, using low-cost accelerometers. In addition to this, a comparative study is carried out of professional vibration acquisition technologies and low-cost systems, obtaining optimum results for low- and medium-frequency operations with an error of 2.19% on road tests. It is therefore concluded that these technologies are applicable to the automobile industry, thereby allowing the project costs to be reduced and thus facilitating access to this kind of research that requires limited resources.

Journal ArticleDOI
TL;DR: The SerialEM package as discussed by the authors provides a comprehensive interface for microscope control, image acquisition and display, and automated data acquisition of various kinds, including tilt series acquisition for electron tomography; it has also become widely used for acquisition of images for single-particle reconstruction.
Abstract: The SerialEM package provides a comprehensive interface for microscope control, image acquisition and display, and automated data acquisition of various kinds. Although its main focus has been tilt series acquisition for electron tomography, it has also become widely used for acquisition of images for single-particle reconstruction. Some of the key features and recent developments that have contributed to the program's popularity are briefly described below.

Journal ArticleDOI
TL;DR: In this article, the dependence of photovoltaic (PV) module performance on real operating conditions, such as solar irradiance, ambient temperature, and wind speed, as well as on the solar module technology, is considered.

Journal ArticleDOI
TL;DR: Constant linear velocity spiral scanning (CLV-SC) is introduced as a novel beam scanning method to maximize the data acquisition efficiency of ultrahigh speed 4D OCT systems and achieves more uniform transverse sampling compared to raster scanning.
Abstract: Ultrahigh speed optical coherence tomography (OCT) systems with >100 kHz A-scan rates can generate volumes rapidly with minimal motion artifacts and are well suited for 4D imaging (volumes through time) applications such as intra-operative imaging. In such systems, high OCT data acquisition efficiency (defined as the fraction of usable A-scans generated during the total acquisition time) is desired to maximize the volumetric frame rate and sampling pitch. However, current methods for beam scanning using non-resonant and resonant mirror scanners can result in severe scan distortion and transverse oversampling as well as require acquisition dead times, which limit the acquisition efficiency and performance of ultrahigh speed 4D OCT. We introduce constant linear velocity spiral scanning (CLV-SC) as a novel beam scanning method to maximize the data acquisition efficiency of ultrahigh speed 4D OCT systems. We demonstrate that CLV-SC does not require acquisition dead times and achieves more uniform transverse sampling compared to raster scanning. To assess its clinical utility, we implement CLV-SC with a 400 kHz OCT system and image the anterior eye and retina of healthy adults at up to 10 volumes per second with isotropic transverse sampling, allowing B-scans with equal sampling pitch to be extracted from arbitrary locations within a single volume. The feasibility of CLV-SC for intra-operative imaging is also demonstrated using a 800 kHz OCT system to image simulated retinal surgery at 15 volumes per second with isotropic transverse sampling, resulting in high quality volume renders that enable clear visualization of surgical instruments and manipulation of tissue.
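The constant-linear-velocity property can be sketched for an Archimedean spiral: stepping the angle by ds divided by the local arc-length derivative keeps consecutive samples approximately equidistant, matching a fixed A-scan clock. This is an illustrative reconstruction under assumed parameters, not the authors' scan-driver code.

```python
import numpy as np

def clv_spiral(n_points, pitch=1.0, ds=0.5):
    """Sample positions on an Archimedean spiral r = pitch * theta at
    (approximately) constant arc-length spacing ds per sample."""
    theta = 1e-6
    xs, ys = [], []
    for _ in range(n_points):
        r = pitch * theta
        xs.append(r * np.cos(theta))
        ys.append(r * np.sin(theta))
        # advance theta so arc length grows by ds:
        # ds/dtheta = pitch * sqrt(1 + theta^2) for r = pitch * theta
        theta += ds / (pitch * np.sqrt(1.0 + theta ** 2))
    return np.array(xs), np.array(ys)

xs, ys = clv_spiral(500)
gaps = np.hypot(np.diff(xs), np.diff(ys))
print(gaps.min(), gaps.max())  # chord lengths stay close to ds = 0.5
```

Unlike a raster scan, every sample in the spiral is usable: there are no turnaround dead times, which is the acquisition-efficiency argument made above.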

Journal ArticleDOI
TL;DR: The uniformity of the threshold values obtained from the IoT-based model, compared with those from analysis carried out locally on the machines using myRIO for data acquisition, confirms the integrity of the proposed statistical classification algorithm and the reliability of the IoT model for condition monitoring with assured scalability.
Abstract: The aim of this paper is to propose an IoT-based model for real-time condition monitoring of electrical machines that addresses the challenges of data storage and scalability. The proposed model is developed around an experimental setup with two sets of dc motors coupled to ac generators and an IoT device, illustrating integrated monitoring and decision making. This IoT-based vibration analytics model uses an IoT2040 Gateway with a custom Linux OS image built for acquisition and streaming of vibration signals. The Python target application acquires the dc motors' shaft vibration using vibration sensors and communicates the data as events to the cloud through a serial device driver interface. The IoT service running in the cloud receives the data from multiple machines through lightweight RESTful HTTP and records it for retrieval, analysis, and algorithm development on any platform. The retrieved data have been analyzed using the proposed statistical classification-based signal decomposition algorithm as well as time-frequency analysis to estimate the vibration thresholds of every machine connected to the IoT cloud. The estimated thresholds, corresponding to different operating and environmental conditions and maintained in the cloud, are used to build a repository of context-specific solutions for machine conditions, leading to improved maintenance decisions. The uniformity of the threshold values obtained from the IoT-based model, compared with those from analysis carried out locally on the machines using myRIO for data acquisition, confirms the integrity of the proposed statistical classification algorithm and the reliability of the IoT model for condition monitoring with assured scalability.
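The threshold-estimation step can be illustrated with a simple statistical baseline rule: window the healthy-machine vibration signal, take the RMS of each window, and set the alarm threshold a few standard deviations above the mean. This is a sketch only — the function name, the windowing, and the mean-plus-k-sigma rule are our illustration, not the paper's published classification algorithm:

```python
import numpy as np

def vibration_threshold(signal, fs, window_s=1.0, k=3.0):
    """Estimate a per-machine vibration alarm threshold from healthy
    baseline data: split the signal into fixed-length windows, compute
    each window's RMS, and place the threshold k standard deviations
    above the mean window RMS."""
    n = int(fs * window_s)               # samples per window
    n_windows = len(signal) // n
    windows = np.asarray(signal[: n_windows * n]).reshape(n_windows, n)
    rms = np.sqrt(np.mean(windows ** 2, axis=1))
    return float(rms.mean() + k * rms.std())
```

Thresholds learned this way for each operating condition could then be stored in the cloud repository and compared against live window RMS values to raise maintenance alerts.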

Journal ArticleDOI
TL;DR: The design, development, and validation of a modular photoplethysmography (PPG) system called ZenPPG is presented, which can produce “raw” PPG signals at two different wavelengths using commercial and/or custom-made PPG sensors.
Abstract: In this paper, we present the design, development, and validation of a modular photoplethysmography (PPG) system called ZenPPG. This portable, dual-channel system can produce “raw” PPG signals at two different wavelengths using commercial and/or custom-made PPG sensors. The system consists of five modules, each containing the circuitry required for a specific task, all interconnected by a system bus. The ZenPPG system also facilitates on-demand acquisition of other physiological signals, including electrocardiogram (ECG), respiration, and temperature signals. This report describes the technical details and evaluation of the ZenPPG, along with results from a pilot in vivo study on healthy volunteers. The results of the technical evaluation demonstrate the flexibility and performance of the system. The system's compatibility with commercial pulse oximetry sensors such as the Masimo reusable sensors was also demonstrated, with good quality raw PPG signals recorded at a signal-to-noise ratio (SNR) of 50.65 dB. The arterial oxygen saturation (SpO2) values estimated by the system were in close agreement with commercial pulse oximeters, although the accuracy of the reported SpO2 value depends on the calibration function used. Future work targets the development of variations of each module, including laser driver and fiber optic modules, and onboard data acquisition and signal processing modules. The availability of this system will help researchers from a wide range of disciplines to customize and integrate the ZenPPG system to their research needs and will enhance research in related fields.
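Estimating SpO2 from raw two-wavelength PPG, as ZenPPG enables, conventionally uses the ratio-of-ratios method. A minimal sketch follows; the linear calibration SpO2 ≈ 110 − 25R is a textbook approximation (not ZenPPG's calibration function, which the paper does not disclose), and the function names are ours:

```python
import numpy as np

def spo2_from_raw(red, ir):
    """Estimate SpO2 from raw red and infrared PPG traces using the
    classic ratio-of-ratios method: R = (AC/DC)_red / (AC/DC)_ir,
    then an empirical linear calibration maps R to saturation."""
    def perfusion(x):
        x = np.asarray(x, dtype=float)
        ac = x.max() - x.min()       # pulsatile (AC) component, peak-to-peak
        dc = x.mean()                # baseline (DC) component
        return ac / dc
    r = perfusion(red) / perfusion(ir)
    return 110.0 - 25.0 * r          # illustrative textbook calibration
```

This is exactly why access to the raw signals matters: commercial oximeters apply a device-specific calibration curve internally, whereas a raw-signal system lets researchers substitute their own.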

Journal ArticleDOI
06 Dec 2018
TL;DR: This paper discusses and evaluates how various low-level compression algorithms could be used in automotive LiDAR sensors to optimize on-chip storage capacity and link bandwidth, and identifies several promising directions for future research.
Abstract: Due to the specific dynamics of the operating environment and the required safety regulations, the amount of data acquired by an automotive LiDAR sensor that has to be processed is reaching several Gbit/s. Data compression is therefore much needed to enable future multi-sensor automated vehicles. Numerous techniques have been developed to compress LiDAR raw data; however, these techniques primarily target compression of the 3D point cloud, while the way data is captured and transferred from the sensor to an electronic computing unit (ECU) has been left out. The purpose of this paper is to discuss and evaluate how various low-level compression algorithms could be used in the automotive LiDAR sensor to optimize on-chip storage capacity and link bandwidth. We also discuss the relevant parameters that affect the amount of data collected per second and the associated issues. After analyzing compression approaches and identifying their limitations, we identify several promising directions for future research.
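As an example of the kind of low-level, point-cloud-agnostic scheme in scope here: successive range samples along a scan line are strongly correlated, so delta coding followed by zigzag variable-length integer packing (as used in Protocol Buffers) compresses them well at negligible on-chip cost. This is our illustration of the technique class, not an algorithm from the paper:

```python
def delta_varint_encode(samples):
    """Delta-encode integer range samples, then pack the residuals as
    zigzag varints (LEB128-style) so small deltas use few bytes."""
    out = bytearray()
    prev = 0
    for s in samples:
        d = s - prev
        prev = s
        z = (d << 1) ^ (d >> 63)         # zigzag: small |d| -> small code
        while z >= 0x80:
            out.append((z & 0x7F) | 0x80)
            z >>= 7
        out.append(z)
    return bytes(out)

def delta_varint_decode(data):
    """Inverse of delta_varint_encode."""
    samples, prev, i = [], 0, 0
    while i < len(data):
        z, shift = 0, 0
        while True:
            b = data[i]; i += 1
            z |= (b & 0x7F) << shift
            shift += 7
            if b < 0x80:
                break
        prev += (z >> 1) ^ -(z & 1)      # undo zigzag, accumulate delta
        samples.append(prev)
    return samples
```

On smooth surfaces the deltas are small and each sample shrinks to one or two bytes, versus three or four bytes raw, directly reducing the sensor-to-ECU link bandwidth.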

Journal ArticleDOI
TL;DR: The CUORE experiment, an array of 988 TeO$_2$ bolometers that has been taking data since April 2017 at the Laboratori Nazionali del Gran Sasso (Italy), successfully exploits the large mass, low background, good energy resolution, and low energy threshold of these detectors.
Abstract: Over the last couple of decades, arrays of bolometers have been one of the leading techniques in the search for rare events. CUORE, an array of 988 TeO$_2$ bolometers that has been taking data since April 2017 at the Laboratori Nazionali del Gran Sasso (Italy), successfully exploits the large mass, low background, good energy resolution, and low energy threshold of these detectors. Thanks to these characteristics, the detectors could also be sensitive to other low-energy rare processes, such as galactic dark matter interactions. In this paper we describe the data acquisition system developed for the CUORE experiment. Thanks to its high modularity, the data acquisition system described here has been used in different setups with similar requirements, including the pilot experiment CUORE-0 and the demonstrator for the next phase of the project, CUPID-0, also taking data at LNGS.

Journal ArticleDOI
TL;DR: In this article, a hybrid state estimator based on supervisory control and data acquisition (SCADA) and phasor measurement unit (PMU) measurements is proposed, consisting of improved robust estimation and linear state estimation.
Abstract: With the increasing importance of renewable energy and flexible loads, the operation of the distribution system is becoming more stochastic and complex, making real-time monitoring of the power system necessary. Considering the gradual deployment of intelligent electronic devices in distribution systems, a hybrid state estimator based on supervisory control and data acquisition (SCADA) and phasor measurement unit (PMU) measurements is proposed in this paper, consisting of improved robust estimation and linear state estimation. At the time of SCADA data acquisition, improved robust estimation combining the SCADA measurements with PMU measurements is performed. To eliminate the effect of bad data, the internally studentized residual method is introduced, and the robust thresholds are adjusted adaptively. Linear state estimation is then performed at the time of PMU data acquisition, based on the results of the previous estimation step and the PMU measurements, which quickly corrects the robust estimation results and tracks changes in the distribution system. Finally, the effectiveness and performance of the proposed method are verified on a modified IEEE 33-bus distribution system and a real distribution system in China.
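The bad-data step can be illustrated with the classic normalized (studentized) residual test on a linear weighted-least-squares estimator. This is a simplified sketch of the underlying idea only; the paper's adaptive thresholds and SCADA/PMU hybrid timing are not reproduced:

```python
import numpy as np

def wls_with_bad_data_check(H, z, sigma, threshold=3.0):
    """Linear WLS state estimation for z = H x + e with bad-data
    detection via normalized residuals: residuals are divided by
    their own standard deviation, and measurements exceeding the
    threshold are flagged as suspect."""
    W = np.diag(1.0 / sigma**2)                    # measurement weights
    G = H.T @ W @ H                                # gain matrix
    x = np.linalg.solve(G, H.T @ W @ z)            # WLS state estimate
    r = z - H @ x                                  # measurement residuals
    R = np.diag(sigma**2)
    Omega = R - H @ np.linalg.solve(G, H.T)        # residual covariance
    rn = np.abs(r) / np.sqrt(np.diag(Omega))       # normalized residuals
    return x, rn, rn > threshold                   # flag suspect measurements
```

With sufficient measurement redundancy, the largest normalized residual points to the single gross error, which can then be removed or down-weighted before re-estimating — the role the adaptive robust step plays in the proposed hybrid estimator.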