
Showing papers on "Data acquisition published in 2022"


Journal ArticleDOI
01 Jan 2022-Sensors
TL;DR: A low-cost multi-sensor data acquisition (DAQ) system for detecting various faults in 3D-printed products is proposed, using an Arduino microcontroller that collects real-time signals from vibration, current, and sound sensors.
Abstract: Fused deposition modelling (FDM)-based 3D printing is a trending technology in the era of Industry 4.0 that manufactures products layer by layer. It offers remarkable benefits such as rapid prototyping, cost-effectiveness, flexibility, and a sustainable manufacturing approach. Alongside these advantages, a few defects occur in FDM products during the printing stage, and diagnosing them is a challenging task. Proper data acquisition and monitoring systems need to be developed for effective fault diagnosis. In this paper, the authors propose a low-cost multi-sensor data acquisition (DAQ) system for detecting various faults in 3D-printed products. The system was developed using an Arduino microcontroller that collects real-time signals from vibration, current, and sound sensors. Different fault conditions were introduced to create various defects in the 3D-printed products and to analyze their effect on the captured sensor data. Time- and frequency-domain analyses were performed on the captured data to create feature vectors, and the most significant features were selected with the chi-square method to train a CNN model. The K-means algorithm was used for data clustering, and the normal (bell) distribution curve was used to define individual sensor threshold values under normal conditions. The CNN model classified normal and fault condition data with an accuracy of around 94%, with model performance evaluated using recall, precision, and F1 score.
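The normal-distribution thresholding step described above can be sketched with synthetic data; the sensor layout, baseline statistics, and the 3-sigma band below are illustrative assumptions, not values from the study:

```python
import numpy as np

# Hypothetical baseline readings from three sensors (vibration, current, sound)
rng = np.random.default_rng(1)
baseline = rng.normal(loc=[0.5, 1.2, 0.3], scale=[0.05, 0.10, 0.02], size=(500, 3))

# Per-sensor thresholds from the fitted normal distribution: mean +/- 3 sigma
mu, sigma = baseline.mean(axis=0), baseline.std(axis=0)
lower, upper = mu - 3 * sigma, mu + 3 * sigma

def is_fault(sample):
    """Flag a reading that falls outside any sensor's normal band."""
    return bool(np.any((sample < lower) | (sample > upper)))

print(is_fault(np.array([0.5, 1.2, 0.3])))   # nominal reading
print(is_fault(np.array([0.9, 1.2, 0.3])))   # vibration far outside its band
```

In the paper the thresholds complement the CNN classifier; here they stand alone as a minimal anomaly gate.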

18 citations


Journal ArticleDOI
TL;DR: This work demonstrates that library-free BoxCarDIA acquisition, combining MS1-level BoxCar acquisition with MS2-level data-independent acquisition (DIA) analysis, outperforms conventional DDA and other library-free DIA (directDIA) approaches, and establishes it as the new method of choice for label-free quantitative proteomics across diverse sample types.
Abstract: Data-dependent acquisition (DDA) methods are the current standard for quantitative proteomics in many biological systems. However, DDA preferentially measures highly abundant proteins and generates data that is plagued with missing values, requiring extensive imputation. Here, we demonstrate that library-free BoxCarDIA acquisition, combining MS1-level BoxCar acquisition with MS2-level data-independent acquisition (DIA) analysis, outperforms conventional DDA and other library-free DIA (directDIA) approaches. Using a combination of low- (HeLa cells) and high- (Arabidopsis thaliana cell culture) dynamic range sample types, we demonstrate that BoxCarDIA can achieve a 40% increase in protein quantification over DDA without offline fractionation or an increase in mass-spectrometer acquisition time. Further, we provide empirical evidence for substantial gains in dynamic range sampling that translates to deeper quantification of low-abundance protein classes under-represented in DDA and directDIA data. Unlike both DDA and directDIA, our new BoxCarDIA method does not require full MS1 scans while offering reproducible protein quantification between replicate injections and providing more robust biological inferences. Overall, our results advance the BoxCarDIA technique and establish it as the new method of choice for label-free quantitative proteomics across diverse sample types.

17 citations


Journal ArticleDOI
TL;DR: The digitization and demodulation of the multibeam signals in the proposed quasi-parallel sensing technique are multiplexed over the high-frequency modulation within a wavelength scan to maintain the temporal response of the fully parallel sensing scheme and facilitate the cost-effective implementation of industrial CST.
Abstract: Chemical species tomography (CST) has been widely applied for the imaging of critical gas-phase parameters in industrial processes. To acquire high-fidelity images, CST is typically implemented by line-of-sight wavelength modulation spectroscopy measurements from multiple laser beams. In this article, we present a novel quasi-parallel sensing technique and electronic circuits for industrial CST. Although the acquisition and processing of these multiple beams using a fully parallel data acquisition and signal processing system can achieve maximized temporal response in CST, it leads to highly complex and power-consuming instrumentation with electronics-caused inconsistency between the sampled beams, in addition to a significant burden on data transfer infrastructure. To address these issues, the digitization and demodulation of the multibeam signals in the proposed quasi-parallel sensing technique are multiplexed over the high-frequency modulation within a wavelength scan. Our development not only maintains the temporal response of the fully parallel sensing scheme but also facilitates the cost-effective implementation of industrial CST with very low complexity and reduced load on data transfer compared with the fully parallel sensing technique. The proposed technique was analytically proven and then numerically examined by noise-contaminated CST simulations. Finally, the designed electronics was experimentally validated using a lab-scale CST system with 32 laser beams.
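The multiplexing idea — one shared digitizer visiting each beam in turn during successive modulation periods of a scan — can be illustrated with a toy round-robin schedule; beam and period counts are arbitrary, and this is not the authors' circuit design:

```python
def multiplex(n_beams, n_periods):
    """Map each modulation period within a scan to the beam digitized then.

    A single shared digitizer services all beams in round-robin order,
    instead of one dedicated acquisition channel per beam.
    """
    return {p: p % n_beams for p in range(n_periods)}

schedule = multiplex(4, 8)
print([schedule[p] for p in range(8)])  # [0, 1, 2, 3, 0, 1, 2, 3]
```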

12 citations


Journal ArticleDOI
TL;DR: This study evaluated the potential of smartphones as data acquisition tools in comparison with compact cameras based on the quality and accuracy of their photogrammetric results in extracting geometrical measurements and found that the smartphone data required less processing time and memory usage with higher applicability compared with the compact camera.
Abstract: Close-range photogrammetry (CRP) has proven to be a remarkable and affordable technique for data modeling and measurement extraction in construction management applications. Nevertheless, it is important to aim for making CRP more accessible by using smartphones on-site directly, without a pre-calibration procedure. This study evaluated the potential of smartphones as data acquisition tools in comparison with compact cameras, based on the quality and accuracy of their photogrammetric results in extracting geometrical measurements (i.e., surface area and volume). Two concrete specimens of regular shapes (i.e., a beam and a cylinder), along with an irregular-shaped sand pile, were used to conduct this study. The datasets of both cameras were analyzed and compared based on lens distortions, image residuals, and projection multiplicity. Furthermore, the photogrammetric models were compared according to various quality criteria, processing time, and memory utilization. Although neither camera was pre-calibrated, both provided highly accurate geometrical estimations. The volumetric estimation error ranged from 0.37% to 2.33% for the compact camera and 0.67% to 3.19% for the smartphone. For surface area estimations, the error ranged from 0.44% to 0.91% for the compact camera and 0.50% to 1.89% for the smartphone. Additionally, the smartphone data required less processing time and memory usage with higher applicability compared with the compact camera. The implication of these findings is that they provide professionals in construction management with an assessment of a more direct and cost-effective 3D data acquisition tool, together with a good understanding of its reliability. Moreover, the assessment methodology and comparison criteria presented in this study can assist future research in conducting similar studies for different capturing devices in construction management applications. The findings of this study are limited to small quantification applications.
Therefore, it is recommended to conduct further research that assesses smartphones as a photogrammetric data acquisition tool for larger construction elements or tracking ongoing construction activities that involve measurements estimation.
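As a minimal illustration of how such volumetric and surface-area error percentages are computed (the numbers below are hypothetical, not the study's measurements):

```python
def percent_error(measured, reference):
    """Relative error of a photogrammetric estimate, as a percentage."""
    return abs(measured - reference) / reference * 100

# Hypothetical cylinder volume: photogrammetric model vs reference measurement
print(round(percent_error(measured=10230.0, reference=10000.0), 2))  # 2.3
```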

11 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a comprehensive Data-Dependent and Data-Independent Acquisition (DDA/DIA) dataset acquired using several of the most commonly used current-day instrumental platforms.
Abstract: In the last decade, a revolution in liquid chromatography-mass spectrometry (LC-MS) based proteomics unfolded with the introduction of dozens of novel instruments that incorporate additional data dimensions through innovative acquisition methodologies, in turn inspiring specialized data analysis pipelines. Simultaneously, a growing number of proteomics datasets have been made publicly available through data repositories such as ProteomeXchange, Zenodo and Skyline Panorama. However, developing algorithms to mine these data and assessing performance on different platforms is currently hampered by the lack of a single benchmark experimental design. Therefore, we acquired a hybrid proteome mixture on different instrument platforms and in all currently available families of data acquisition. Here, we present a comprehensive Data-Dependent and Data-Independent Acquisition (DDA/DIA) dataset acquired using several of the most commonly used current-day instrumental platforms. The dataset consists of over 700 LC-MS runs, including adequate replicates allowing robust statistics and covering nearly 10 different data formats, including scanning quadrupole and ion mobility enabled acquisitions. Datasets are available via ProteomeXchange (PXD028735).

10 citations


Journal ArticleDOI
21 Mar 2022
TL;DR: In this article, a wireless Internet-based low-cost data acquisition system consisting of a Raspberry Pi and several Arduinos as signal conditioners is introduced, aiming to improve the overall accuracy of several sensors with an unknown accuracy range.
Abstract: Today, low-cost sensors in various civil engineering sectors are gaining the attention of researchers due to their reduced production cost and their applicability to multiple nodes. Low-cost sensors also have the advantage of easily connecting to low-cost microcontrollers such as Arduino. A low-cost, reliable acquisition system based on Arduino technology can further reduce the price of data acquisition and monitoring, which can make long-term monitoring possible. This paper introduces a wireless Internet-based low-cost data acquisition system consisting of Raspberry Pi and several Arduinos as signal conditioners. This study investigates the beneficial impact of similar sensor combinations, aiming to improve the overall accuracy of several sensors with an unknown accuracy range. The paper then describes an experiment that gives valuable information about the standard deviation, distribution functions, and error level of various individual low-cost sensors under different environmental circumstances. Unfortunately, these data are usually missing and sometimes assumed in numerical studies targeting the development of structural system identification methods. A measuring device consisting of a total of 75 contactless ranging sensors connected to two microcontrollers (Arduinos) was designed to study the similar sensor combination theory and present the standard deviation and distribution functions. The 75 sensors include: 25 units of HC-SR04 (analog), 25 units of VL53L0X, and 25 units of VL53L1X (digital).

10 citations




Journal ArticleDOI
TL;DR: In this article, a wireless data acquisition system and a method of self-cleaning PV panels are developed and tested; the proposed cleaning system not only cleans the PV system but also protects it from hailstorms.
Abstract: Solar photovoltaic (PV) technology can be considered a suitable alternative to fossil fuels because of its free availability and ease of use. The loss of power generation from PV systems due to environmental factors is a major flaw of solar PV systems; as a result, they are unreliable in deserts or remote locations. The accumulation of dust on solar PV systems is a major problem. Solar PV energy prediction is a critical factor in future ecological and reliable energy sources for system stability. Real-time observing systems are essential in a remote PV system for collecting all the parameters needed to evaluate and optimize system performance. Many existing studies use costly and difficult-to-use wired data acquisition systems that run on licensed LabVIEW software. PV panels must be cleaned on a regular basis to achieve maximum efficiency, and most existing cleaning methods require water. In this study, a wireless data acquisition system and a method of self-cleaning the PV panels are developed and tested. The proposed cleaning system not only cleans the PV system but also protects it from hailstorms. We investigate the performance of a 106 W PV system under Jaipur weather conditions over a one-year period using the proposed wireless data acquisition and monitoring system. The results revealed that 12 months of exposure of the 106 W PV panels across the seasons in Jaipur reduced the PV system's efficiency by 24.5% in summer, 15.6% in winter, 5.14% in post-monsoon and 1.95% in monsoon. The PV panels' maximum efficiency is reached at a panel temperature of 41°C in the summer and 48°C in the winter. We observed that the proposed data acquisition system is applicable, durable, efficient, and appropriate for severe outdoor conditions for observing and collecting operational information about the PV system.
The efficiency of a fixed PV system with daily manual cleaning was compared to that of a proposed cleaning PV system for a month and the proposed cleaning PV system’s efficiency was only 1.13% lower. The result shows that the proposed cleaning PV system performs well even in semi-arid environments.
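A minimal sketch of the efficiency figure underlying such comparisons; the panel area, irradiance, and output below are hypothetical values, not data from the 106 W system:

```python
def pv_efficiency(p_out_w, irradiance_w_m2, area_m2):
    """Instantaneous PV conversion efficiency in percent:
    electrical output divided by the solar power incident on the panel."""
    return p_out_w / (irradiance_w_m2 * area_m2) * 100

# Hypothetical reading: 80 W output at 1000 W/m^2 on a 0.65 m^2 panel
print(round(pv_efficiency(80.0, 1000.0, 0.65), 1))  # 12.3
```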

10 citations


Journal ArticleDOI
TL;DR: In this article, a deep learning approach for dynamic sampling (DLADS) was employed to reduce the number of required measurements, thereby improving the throughput of mass spectrometry imaging experiments in comparison with conventional methods.
Abstract: Mass spectrometry imaging (MSI) enables label-free mapping of hundreds of molecules in biological samples with high sensitivity and unprecedented specificity. Conventional MSI experiments are relatively slow, limiting their utility for applications requiring rapid data acquisition, such as intraoperative tissue analysis or 3D imaging. Recent advances in MSI technology focus on improving the spatial resolution and molecular coverage, further increasing the acquisition time. Herein, a deep learning approach for dynamic sampling (DLADS) was employed to reduce the number of required measurements, thereby improving the throughput of MSI experiments in comparison with conventional methods. DLADS trains a deep learning model to dynamically predict molecularly informative tissue locations for active mass spectra sampling and reconstructs high-fidelity molecular images using only the sparsely sampled information. Experimental hardware and software integration of DLADS with nanospray desorption electrospray ionization (nano-DESI) MSI is reported for the first time, which demonstrates a 2.3-fold improvement in throughput for a linewise acquisition mode. Meanwhile, simulations indicate that a 5-10-fold throughput improvement may be achieved using the pointwise acquisition mode.

8 citations


Proceedings ArticleDOI
29 Aug 2022
TL;DR: The ASTRI Mini-Array is an international project led by the Italian National Institute for Astrophysics (INAF) to build and operate an array of nine 4-m class Imaging Atmospheric Cherenkov Telescopes (IACTs) at the Observatorio del Teide (Tenerife, Spain).
Abstract: The ASTRI Mini-Array is an international project led by the Italian National Institute for Astrophysics (INAF) to build and operate an array of nine 4-m class Imaging Atmospheric Cherenkov Telescopes (IACTs) at the Observatorio del Teide (Tenerife, Spain). The system is designed to perform deep observations of the galactic and extragalactic gamma-ray sky in the TeV and multi-TeV energy band, with important synergies with other ground-based gamma-ray facilities in the Northern Hemisphere and space-borne telescopes. As part of the overall software system, the ASTRI (Astrofisica con Specchi a Tecnologia Replicante Italiana) Team is developing dedicated systems for Data Processing, Simulation, and Archive to achieve effective handling, dissemination, and scientific exploitation of the ASTRI Mini-Array data. Thanks to the high-speed network connection available between Canary Islands and Italy, data acquired on-site will be delivered to the ASTRI Data Center in Rome immediately after acquisition. The raw data will be then reduced and analyzed by the Data Processing System up to the generation of the final scientific products. Detailed Monte Carlo simulated data will be produced by the Simulation System and exploited in several data processing steps in order to achieve precise reconstruction of the physical characteristics of the detected gamma rays and to reject the overwhelming background due to charged cosmic rays. The data access at different user levels and for different use cases, each one with a customized data organization, will be provided by the Archive System. In this contribution we present these three ASTRI Mini-Array software systems, focusing on their main functionalities, components, and interfaces.

8 citations


Journal ArticleDOI
TL;DR: In this paper, the main robust data acquisition and processing tools for EIT proposed in the scientific literature are analysed, in order to assess the feasibility of a robust EIT tool capable of providing resistivity or difference-of-resistivity mapping in a wide range of applications.
Abstract: Electrical impedance tomography (EIT) is a medical imaging technique with many advantages and great potential for development in the coming years. Currently, some limitations of EIT are related to the ill-posed nature of the problem; on a practical level, these limitations translate into a lack of genericity in the developed tools. In this paper, the main robust data acquisition and processing tools for EIT proposed in the scientific literature are presented. Their relevance and potential to improve the robustness of EIT are analysed, in order to assess the feasibility of a robust EIT tool capable of providing resistivity or difference-of-resistivity mapping in a wide range of applications. In particular, it is shown that certain measurement acquisition tools and algorithms, such as faulty-electrode detection algorithms or particular electrode designs, can ensure the quality of the acquisition in many circumstances. Many algorithms aiming to process the acquired data are also described; they make it possible to overcome certain difficulties, such as imperfect knowledge of the boundary position or the poor conditioning of the inverse problem, and have strong potential to faithfully reconstruct a quality image in the presence of disturbances such as noise or boundary modelling error.

Journal ArticleDOI
TL;DR: In this paper, the authors present an automated open-source workflow for high-throughput metabolomics that combines data-dependent and data-independent acquisition for library generation, analysis, and statistical validation, with rigorous control of the false-discovery rate while matching manual analysis in quantification accuracy.
Abstract: The extraction of meaningful biological knowledge from high-throughput mass spectrometry data relies on limiting false discoveries to a manageable amount. For targeted approaches in metabolomics a main challenge is the detection of false positive metabolic features in the low signal-to-noise ranges of data-independent acquisition results and their filtering. Another factor is that the creation of assay libraries for data-independent acquisition analysis and the processing of extracted ion chromatograms have not been automated in metabolomics. Here we present a fully automated open-source workflow for high-throughput metabolomics that combines data-dependent and data-independent acquisition for library generation, analysis, and statistical validation, with rigorous control of the false-discovery rate while matching manual analysis regarding quantification accuracy. Using an experimentally specific data-dependent acquisition library based on reference substances allows for accurate identification of compounds and markers from data-independent acquisition data in low concentrations, facilitating biomarker quantification.
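False-discovery-rate control of this kind is often implemented with the Benjamini-Hochberg procedure; a minimal sketch with invented p-values (the workflow's actual statistical machinery may differ):

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Indices of discoveries under Benjamini-Hochberg FDR control."""
    p = np.asarray(pvals, dtype=float)
    order = np.argsort(p)
    m = len(p)
    # Compare sorted p-values against the BH step-up thresholds i*alpha/m
    passed = p[order] <= alpha * np.arange(1, m + 1) / m
    k = int(np.nonzero(passed)[0].max()) + 1 if passed.any() else 0
    return sorted(order[:k].tolist())

# Hypothetical feature p-values: three real signals plus one noise feature
print(benjamini_hochberg([0.01, 0.02, 0.03, 0.50]))  # [0, 1, 2]
```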

Journal ArticleDOI
TL;DR: In this article, an intelligent All-In-One Spectral Imaging (ASI) laboratory system allowing standardised automated data acquisition and real-time spectral model deployment is presented, and is benchmarked in performance against current commercially available portable as well as high-end laboratory spectrometers.

Journal ArticleDOI
TL;DR: In this article, an internal array of 7 thermistors was constructed; in conjunction with cell current measurements via bus-bar-mounted sensors and voltage sensor measurements, and with power line communication circuitry, these form smart cells.
Abstract: The internal core temperature of cells is required to create accurate cell models and understand cell performance within a module. Pack cooling concepts often trade off temperature uniformity against cost/weight and complexity, and poor thermal management systems can lead to accelerated cell degradation and unbalanced ageing. To provide the core temperature, an internal array of 7 thermistors was constructed; combining these with cell current measurements, via bus-bar-mounted sensors, and voltage sensor measurements, we have developed instrumented cells. These cells are also equipped with power line communication (PLC) circuitry, forming smart cells. We report data from these miniature sensors during cell cycling, demonstrating successful operation of the PLC system (zero errors compared to a reference wired connection) during typical cell cycling (C/2 discharge, C/3 charge) and the application of an automotive drive cycle providing a transient current test profile. Temperature gradients within the cell of approximately 1.2 °C, and variation of >2.8 °C during just 30 min of 2C discharging, demonstrate the need for internal sensing and monitoring throughout the lifetime of a cell. Our cycling experimental data, along with thorough cell performance tracking, where typically <0.5% degradation was found following the instrumentation process, demonstrate the success of our novel prototype smart cells.
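A trivial sketch of the gradient figure reported from the thermistor array; the readings are invented, chosen only to match the ~1.2 °C scale mentioned in the abstract:

```python
# Hypothetical snapshot from the seven internal thermistors (degrees C)
temps = [24.8, 25.1, 25.6, 26.0, 25.9, 25.4, 25.0]

# Internal temperature gradient: spread between hottest and coldest points
gradient = max(temps) - min(temps)
mean_temp = sum(temps) / len(temps)
print(round(gradient, 1), round(mean_temp, 1))
```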

Journal ArticleDOI
TL;DR: In this article, the suitability of the Machine Learning approach for vibration-based on-board supervision of tyre pressure in two-wheeled vehicles is evaluated using a data acquisition system (DAQ) and a decision tree.
Abstract: The regulation of tyre pressure is treated as a significant aspect of 'tyre maintenance' in the domain of autotronics. Manual supervision of tyre pressure is a task typically ignored by most users. The existing instrumental scheme incorporates stand-alone monitoring with pressure and/or temperature sensors and requires regular manual checks; such schemes therefore prove incompatible with on-board supervision and automated prediction of tyre condition. In this perspective, the Machine Learning (ML) approach is appropriate, as it compares past performance with the present in order to predict behaviour in the near future. The current investigation experimentally assesses the suitability of the ML scheme for vibration-based on-board supervision of the tyre pressure of a two-wheeled vehicle. To examine the vibration response of a wheel hub, the in-house design and development of a DAQ (Data Acquisition System) is described. A Micro-Electro-Mechanical Systems (MEMS) accelerometer is incorporated with open-source hardware and software to collect and store the data. This framework is easy to develop and monitor and can be retrofitted to a two-wheeled vehicle. For various pressure conditions, the change in the wheel hub's vibration response over time is collected. The statistical parameters describing these vibration signals are determined, and a decision tree is applied to select the most distinguishing of the extracted parameters. The classification of different tyre pressure conditions is then carried out using ML classifiers.
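The statistical parameters describing a vibration window might look like the following sketch; these features are common choices for vibration analysis, not necessarily the paper's exact set:

```python
import numpy as np

def vibration_features(signal):
    """Common statistical descriptors of a vibration window."""
    s = np.asarray(signal, dtype=float)
    rms = float(np.sqrt(np.mean(s**2)))
    peak = float(np.max(np.abs(s)))
    return {
        "rms": rms,
        "peak": peak,
        "std": float(np.std(s)),
        "crest": peak / rms,  # crest factor: peak relative to RMS level
    }

# A pure sine has crest factor ~ sqrt(2) ~ 1.41
feats = vibration_features(np.sin(np.linspace(0, 20 * np.pi, 1000)))
print(round(feats["crest"], 2))
```

Feature dictionaries like this would feed the decision-tree selection step described in the abstract.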


Proceedings ArticleDOI
17 Jun 2022
TL;DR: In this article, a flexible and extensible architecture to integrate WSN and IoT is presented, in which REST-based Internet services are used as an interoperating application layer that can be integrated directly into other application domains to remotely monitor smart homes, VANs (Vehicular Area Networks), or healthcare services.
Abstract: There is an increased use of WSNs (Wireless Sensor Networks) in our daily lives, with WSNs finding application in different areas such as health maintenance, quality-of-life scenarios, production monitoring in industries, traffic control, and various other fields. WSNs have scope for being incorporated into the IoT (Internet of Things), which is beneficial for Web-based applications with specific storage and computation requirements. This paper gives a flexible and extensible architecture to integrate WSN and IoT. REST-based Internet services are used as an interoperating application layer that can be integrated directly into other application domains to remotely monitor smart homes, VANs (Vehicular Area Networks), or healthcare services.

Journal ArticleDOI
TL;DR: Results of the on-beam validation of the Jefferson Lab SRO framework are reported, demonstrating that the SRO performs as expected and providing evidence of its superiority in implementing sophisticated AI-supported algorithms for real-time data analysis and reconstruction.


Journal ArticleDOI
TL;DR: In this paper, a non-intrusive and scalable robot signal extraction architecture is proposed for industrial robot data acquisition and predictive maintenance in real manufacturing assembly lines, which can be used to detect robot joint failures in real-world scenarios.
Abstract: This manuscript presents a methodology and a practical implementation of a network architecture for industrial robot data acquisition and predictive maintenance. We propose a non-intrusive and scalable robot signal extraction architecture, easily applicable in real manufacturing assembly lines. The novelty of the paper lies in the fact that it is the first proposal of a network architecture specially designed to address the predictive maintenance of industrial robots in real production environments. All the infrastructure needed for the implementation of the architecture consists of traditional, well-known industrial assets. We synchronize the data acquisition with the execution of robot routines using common Programmable Logic Controllers (PLCs) to obtain comparable data batches. A network architecture that acquires comparable and structured data over time is a crucial step towards effective predictive maintenance of these complex systems, in terms of effectively detecting time-dependent degradation. We implement the architecture in a real automotive manufacturing assembly line and show the potential of the solution to detect robot joint failures in real-world scenarios. The architecture is therefore especially interesting for industrial practitioners and maintenance personnel. Finally, we test the feasibility of using one-class novelty detection models for robot health status degradation assessment using data from a real robot failure. To the best of our knowledge, this is the first contribution that uses robot torque signals from a real production line failure to train one-class models.
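The paper trains one-class novelty models on torque signals; as a stand-in, a minimal distance-based one-class detector (Mahalanobis distance with an empirical threshold — an assumption, not the authors' model) can be sketched on synthetic features:

```python
import numpy as np

# Hypothetical torque-derived features from 500 healthy robot cycles
rng = np.random.default_rng(4)
healthy = rng.normal(0.0, 1.0, size=(500, 3))

mu = healthy.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(healthy.T))

def novelty_score(x):
    """Mahalanobis distance of a feature vector from the healthy cluster."""
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

# Threshold at the 99th percentile of scores seen during healthy operation
threshold = np.quantile([novelty_score(h) for h in healthy], 0.99)

print(novelty_score(np.array([0.1, -0.2, 0.0])) < threshold)  # healthy-like
print(novelty_score(np.array([6.0, 6.0, 6.0])) > threshold)   # degraded-like
```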

Journal ArticleDOI
TL;DR: In this paper , the Monte-Carlo-plus-detector effect (MCDE) model was used to simulate the delivery of a 150 MeV clinical proton pencil beam to a tissue-equivalent plastic phantom.
Abstract: The purpose of this study was to determine how the characteristics of the data acquisition (DAQ) electronics of a Compton camera (CC) affect the quality of the recorded prompt gamma (PG) interaction data and the reconstructed images, during clinical proton beam delivery. We used the Monte-Carlo-plus-Detector-Effect (MCDE) model to simulate the delivery of a 150 MeV clinical proton pencil beam to a tissue-equivalent plastic phantom. With the MCDE model we analyzed how the recorded PG interaction data changed as two characteristics of the DAQ electronics of a CC were changed: (1) the number of data readout channels; and (2) the active charge collection, readout, and reset time. As the proton beam dose rate increased, the number of recorded PG single-, double-, and triple-scatter events decreased by a factor of 60× for the current DAQ configuration of the CC. However, as the DAQ readout channels were increased and the readout/reset timing decreased, the number of recorded events decreased by <5× at the highest clinical dose rate. The increased number of readout channels and reduced readout/reset timing also resulted in higher quality recorded data. That is, a higher percentage of the recorded double- and triple-scatters were "true" events (caused by a single incident gamma) and not "false" events (caused by multiple incident gammas). The increase in the number and the quality of recorded data allowed higher quality PG images to be reconstructed even at the highest clinical dose rates.
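The trade-off between readout channels and recorded events can be approximated with a textbook non-paralyzable dead-time model — an assumption for illustration, as the paper relies on its MCDE simulation rather than this formula:

```python
def live_fraction(event_rate_hz, dead_time_s, n_channels):
    """Non-paralyzable dead-time estimate of the fraction of events recorded,
    assuming the total event rate is shared evenly across readout channels."""
    rate_per_channel = event_rate_hz / n_channels
    return 1.0 / (1.0 + rate_per_channel * dead_time_s)

# Hypothetical numbers: 1 MHz total gamma interaction rate, 5 us dead time
print(round(live_fraction(1e6, 5e-6, 1), 3))    # one shared channel: ~0.167
print(round(live_fraction(1e6, 5e-6, 64), 3))   # 64 channels: ~0.928
```

More channels divide the per-channel rate, which is the same qualitative effect the study reports: fewer lost events and higher-quality recorded data at clinical dose rates.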

Journal ArticleDOI
TL;DR: A review of predictive monitoring of incipient-stage faults covering the whole workflow, from data acquisition to artificial intelligence implementation; artificial-intelligence-based fault prediction approaches are discussed.

Proceedings ArticleDOI
29 Aug 2022
TL;DR: The ASTRI-Horn telescope as mentioned in this paper is an end-to-end prototype of the Small-Sized Telescope (SST) of the Cherenkov Telescope Array (CTA) in a dual-mirror (2M) configuration.
Abstract: The ASTRI (Astrofisica con Specchi a Tecnologia Replicante Italiana) Project was born as a collaborative international effort led by the Italian National Institute for Astrophysics (INAF) to design and realize an end-to-end prototype of the Small-Sized Telescope (SST) of the Cherenkov Telescope Array (CTA) in a dual-mirror configuration (2M). The prototype, named ASTRI-Horn, has been operational since 2014 at the INAF observing station located on Mt. Etna (Italy). The ASTRI Project is now building the ASTRI Mini-Array consisting of nine ASTRI-Horn-like telescopes to be installed and operated at the Teide Observatory (Spain). The ASTRI software is aimed at supporting the Assembly, Integration and Verification (AIV) and the operations of the ASTRI Mini-Array. The Array Data Acquisition System (ADAS) includes all hardware, software and communication infrastructure required to gather the bulk data of the Cherenkov Cameras and the Intensity Interferometers installed on the telescopes, and make these data available to the Online Observation Quality System (OOQS) for the on-site quick look, and to the Data Processing System (DPS) for the off-site scientific pipeline. This contribution presents the ADAS software architecture according to the use cases and requirement specifications, with particular emphasis on the interfaces with the Back End Electronics (BEE) of the instruments, the array central control, the OOQS, and the DPS.
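The ADAS fan-out described above, with bulk data made available both to the OOQS quick look and to the DPS pipeline, can be sketched as one producer feeding two consumer queues. The component names mirror the text, but the logic is a hypothetical illustration, not the actual ADAS implementation:

```python
import queue
import threading

# Toy fan-out of acquired camera data batches to a quick-look consumer (OOQS)
# and an offline-pipeline consumer (DPS).
ooqs_q, dps_q = queue.Queue(), queue.Queue()

def acquire(n_batches):
    """Producer: publish each acquired batch to both downstream systems."""
    for i in range(n_batches):
        batch = {"telescope": i % 9, "payload": f"camera-bulk-{i}"}
        ooqs_q.put(batch)   # on-site quick look
        dps_q.put(batch)    # off-site scientific pipeline
    ooqs_q.put(None)        # sentinel: acquisition finished
    dps_q.put(None)

def consume(q, seen):
    """Consumer: drain its queue until the sentinel arrives."""
    while (item := q.get()) is not None:
        seen.append(item["payload"])

quicklook, pipeline = [], []
threads = [threading.Thread(target=acquire, args=(4,)),
           threading.Thread(target=consume, args=(ooqs_q, quicklook)),
           threading.Thread(target=consume, args=(dps_q, pipeline))]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(quicklook == pipeline)  # True: both consumers received every batch, in order
```

Decoupling the producer from the two consumers with independent queues lets a slow offline pipeline lag behind without stalling the on-site quick look, which is the usual motivation for this pattern.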

Journal ArticleDOI
TL;DR: In this article , micro-electromechanical systems (MEMS) accelerometers combined with an Arduino-based data acquisition system, were used to acquire vibration data of a reinforced concrete beam at various damage levels.
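A minimal sketch of how a natural frequency might be extracted from such accelerometer data via an FFT peak search. The sampling rate, mode frequency, and noise level below are assumed for illustration and are not taken from the study:

```python
import numpy as np

fs = 200.0                       # Hz, assumed low-cost MEMS sampling rate
t = np.arange(0, 10, 1 / fs)
# Synthetic beam vibration: one dominant mode plus sensor noise
f_mode = 12.5
rng = np.random.default_rng(3)
accel = np.sin(2 * np.pi * f_mode * t) + 0.3 * rng.normal(size=t.size)

# Estimate the natural frequency from the acceleration amplitude spectrum
spectrum = np.abs(np.fft.rfft(accel))
freqs = np.fft.rfftfreq(accel.size, 1 / fs)
f_est = freqs[spectrum.argmax()]
print(f"estimated natural frequency: {f_est:.1f} Hz")
```

In vibration-based damage assessment, a drop of this estimated frequency across measurement campaigns is a common indicator of stiffness loss, i.e. accumulating damage.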

Journal ArticleDOI
TL;DR: Wang et al. proposed wave-encoded model-based deep learning (wave-MoDL) for high-fidelity 3D image reconstruction using wave-controlled aliasing in parallel imaging (CAIPI).
Abstract: A recently introduced model-based deep learning (MoDL) technique successfully incorporates convolutional neural network (CNN)-based regularizers into physics-based parallel imaging reconstruction using a small number of network parameters. Wave-controlled aliasing in parallel imaging (CAIPI) is an emerging parallel imaging method that accelerates imaging acquisition by employing sinusoidal gradients in the phase- and slice/partition-encoding directions during the readout to take better advantage of 3D coil sensitivity profiles. We propose wave-encoded MoDL (wave-MoDL) combining the wave-encoding strategy with unrolled network constraints for highly accelerated 3D imaging while enforcing data consistency. We extend wave-MoDL to reconstruct multicontrast data with CAIPI sampling patterns to leverage similarity between multiple images to improve the reconstruction quality. We further exploit this to enable rapid quantitative imaging using an interleaved look-locker acquisition sequence with T2 preparation pulse (3D-QALAS). Wave-MoDL enables a 40 s MPRAGE acquisition at 1 mm resolution at 16-fold acceleration. For quantitative imaging, wave-MoDL permits a 1:50 min acquisition for T1, T2, and proton density mapping at 1 mm resolution at 12-fold acceleration, from which contrast-weighted images can be synthesized as well. In conclusion, wave-MoDL allows rapid MR acquisition and high-fidelity image reconstruction and may facilitate clinical and neuroscientific applications by incorporating unrolled neural networks into wave-CAIPI reconstruction.
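As a rough real-valued sketch of the unrolled MoDL recursion, which alternates a regularizer step with a data-consistency solve, using a toy dense forward operator and a simple shrinkage stand-in for the CNN denoiser (both are illustrative substitutes, not the paper's wave-encoding operator or trained network):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 64, 48                               # image size, number of measurements
A = rng.normal(size=(m, n)) / np.sqrt(m)    # toy undersampled forward operator
x_true = rng.normal(size=n)
y = A @ x_true                              # noiseless measurements

lam = 1.0
def denoise(x):
    return 0.9 * x                          # shrinkage stand-in for the CNN D(.)

# Unrolled MoDL iterations:
#   x_{k+1} = argmin_x ||A x - y||^2 + lam * ||x - D(x_k)||^2
# solved exactly via the normal equations at each unroll step.
AHA = A.T @ A
x = np.zeros(n)
for _ in range(10):
    z = denoise(x)
    x = np.linalg.solve(AHA + lam * np.eye(n), A.T @ y + lam * z)

res = np.linalg.norm(A @ x - y) / np.linalg.norm(y)
print(f"relative data residual: {res:.3f}")
```

The data-consistency solve is what keeps the unrolled network's output tied to the measured k-space data; in the actual method the CNN replaces the shrinkage and the solve is done matrix-free with conjugate gradients.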

Journal ArticleDOI
TL;DR: In this article , an entire operation structure covering PV data acquisition, PV power forecasting, and coordinated dispatch of power systems with large-scale behind-the-meter distributed PV units is established.

Journal ArticleDOI
TL;DR: In this article, a novel approach for material appearance acquisition using hyperspectral data is proposed, where a dense 3D point cloud filled with spectral data was generated from the images obtained by a UAV equipped with an RGB camera and a hyperspectral sensor, and a parametrisation of the Bidirectional Reflectance Distribution Function (BRDF) was carried out by sampling the BRDF space for each material.
Abstract: Modelling of material appearance from reflectance measurements has become increasingly prevalent due to the development of novel methodologies in Computer Graphics. In the last few years, some advances have been made in measuring the light-material interactions, by employing goniometers/reflectometers under specific laboratory constraints. A wide range of applications benefit from data-driven appearance modelling techniques and material databases to create photorealistic scenarios and physically based simulations. However, important limitations arise from the current material scanning process, mostly related to the high diversity of existing materials in the real world, the tedious process for material scanning and the spectral characterisation behaviour. Consequently, new approaches are required both for the automatic material acquisition process and for the generation of measured material databases. In this study, a novel approach for material appearance acquisition using hyperspectral data is proposed. A dense 3D point cloud filled with spectral data was generated from the images obtained by an unmanned aerial vehicle (UAV) equipped with an RGB camera and a hyperspectral sensor. The observed hyperspectral signatures were used to recognise natural and artificial materials in the 3D point cloud according to spectral similarity. Then, a parametrisation of the Bidirectional Reflectance Distribution Function (BRDF) was carried out by sampling the BRDF space for each material. Consequently, each material is characterised by multiple samples with different incoming and outgoing angles. Finally, an analysis of BRDF sample completeness is performed considering four sunlight positions and 16×16 resolution for each material. The results demonstrated the capability of the technology used and the effectiveness of our method to be used in applications such as spectral rendering and real-world material acquisition and classification.
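Spectral-similarity material recognition of the kind described above is often done with a spectral angle measure; here is a minimal sketch with hypothetical 4-band signatures (real UAV hyperspectral data would have many more bands, and the paper's actual similarity criterion may differ):

```python
import numpy as np

def spectral_angle(s, ref):
    """Angle (radians) between a measured spectrum and a reference signature;
    a smaller angle means a more similar material."""
    cos = (s @ ref) / (np.linalg.norm(s) * np.linalg.norm(ref))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def classify(points_spectra, library):
    """Assign each point the library material with the smallest spectral angle."""
    names = list(library)
    refs = [library[n] for n in names]
    angles = np.array([[spectral_angle(s, r) for r in refs] for s in points_spectra])
    return [names[i] for i in angles.argmin(axis=1)]

# Hypothetical 4-band reflectance signatures (vegetation: strong NIR response)
library = {"vegetation": np.array([0.05, 0.08, 0.06, 0.60]),
           "asphalt":    np.array([0.10, 0.11, 0.12, 0.13])}
cloud = np.array([[0.04, 0.09, 0.05, 0.55],   # vegetation-like point
                  [0.11, 0.12, 0.12, 0.14]])  # asphalt-like point
print(classify(cloud, library))  # ['vegetation', 'asphalt']
```

Because the spectral angle ignores overall magnitude, it is relatively robust to illumination differences between points, which matters for outdoor UAV surveys.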

Journal ArticleDOI
TL;DR: In this article, the obtained frequencies were compared with the theoretical values extracted from the theoretical equations, and the method proved its effectiveness in detecting the fault generated in the system.
Abstract: Finally, the obtained frequencies were compared with the theoretical values extracted from the theoretical equations, and the method proved its effectiveness in detecting the fault generated in the system.

Proceedings ArticleDOI
01 Jan 2022
TL;DR: This paper presents a wearable system that integrates the signal acquisition and the electrostimulation using dry thin-film titanium-based electrodes and employs artificial intelligence algorithms to provide customised treatments for each patient profile and type of pathology.
Abstract: Data acquisition by electromyography, as well as muscle stimulation, has become more accessible with the new developments in wearable technology and medicine. In fact, for treatments, games or sports, it is possible to find examples of the use of muscle signals to analyse specific aspects related, e.g., to disease, injuries or movement impulses. However, these systems are usually expensive, do not integrate data acquisition with muscle stimulation, and do not exhibit an adaptive control behaviour that considers the pathology and the patient's response. This paper presents a wearable system that integrates signal acquisition and electrostimulation using dry thin-film titanium-based electrodes. The acquired data is transmitted to a mobile application running on a smartphone using Bluetooth Low Energy (BLE) technology, where it is analysed by artificial intelligence algorithms to provide customised treatments for each patient profile and type of pathology, taking into consideration the feedback of the acquired electromyography signal. The acquired patient data is also stored in a secure cloud database to support the physician in analysing and following up the clinical results of the rehabilitation process.
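A sketch of the kind of EMG feature extraction such a system might run before the AI stage. The window length, sampling rate, and the simple RMS threshold rule below are illustrative assumptions, not the paper's actual algorithm:

```python
import numpy as np

def emg_features(signal, fs, window_s=0.2):
    """Per-window RMS and zero-crossing rate, two common surface-EMG features."""
    win = int(fs * window_s)
    feats = []
    for i in range(len(signal) // win):
        w = signal[i * win:(i + 1) * win]
        rms = np.sqrt(np.mean(w ** 2))
        zc = np.mean(np.abs(np.diff(np.sign(w)))) / 2  # crossings per sample
        feats.append((rms, zc))
    return np.array(feats)

rng = np.random.default_rng(2)
fs = 1000  # Hz, assumed sampling rate
rest = 0.01 * rng.normal(size=fs)         # 1 s low-amplitude baseline
contraction = 0.2 * rng.normal(size=fs)   # 1 s higher-amplitude burst
feats = emg_features(np.concatenate([rest, contraction]), fs)

# Simple activity rule: RMS well above the resting baseline (first 5 windows)
active = feats[:, 0] > 5 * feats[:5, 0].mean()
print(active)
```

Window-level features like these keep the BLE payload small, since only a handful of numbers per window, rather than the raw waveform, needs to reach the smartphone application.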

Journal ArticleDOI
TL;DR: In this paper , a chest-scale epidermal electronic system (EES) was developed for standard precordial-lead ECG and hydration monitoring, including the only μm-thick substrate-free Epidermal sensing module and the soft wireless DAQ module.
Abstract: Six chest leads are the standardized clinical devices for diagnosing cardiac diseases. Emerging epidermal electronics technology replaces dangling wires and bulky devices with imperceptible wearables, achieving both a comfortable experience and high-fidelity measurement. Extending the small areas of current epidermal electronics to the chest scale requires eliminating interference from long epidermal interconnects and rendering the data acquisition (DAQ) portable. Herein, we developed a chest-scale epidermal electronic system (EES) for standard precordial-lead ECG and hydration monitoring, including the only μm-thick substrate-free epidermal sensing module and a soft wireless DAQ module. An electrical compensation strategy using double channels within the DAQ module and epidermal compensated branches (ECB) is proposed to eliminate unwanted signals from the long epidermal interconnects and to achieve the desired ECG. In this way, the EES works stably and precisely under different levels of exercise. Patients with sinus arrhythmias have been tested, demonstrating the prospect of the EES in cardiac diseases.
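The double-channel compensation idea, subtracting the interconnect-only signal seen by the compensated branch from the lead channel, can be illustrated schematically. The waveforms and amplitudes are made up, and this idealised subtraction greatly simplifies the actual electrical compensation:

```python
import numpy as np

fs = 500                                    # Hz, assumed sampling rate
t = np.arange(0, 2, 1 / fs)
ecg = 0.8 * np.sin(2 * np.pi * 1.2 * t) ** 31       # crude ECG-like spikes
interference = 0.3 * np.sin(2 * np.pi * 50 * t)     # pickup on the long interconnect

measurement = ecg + interference    # lead channel: ECG plus interconnect pickup
compensation = interference.copy()  # compensated branch sees the pickup only
recovered = measurement - compensation   # double-channel subtraction

print(f"residual interference: {np.max(np.abs(recovered - ecg)):.1e}")
```

In this idealised sketch the subtraction removes the pickup entirely; in practice the compensated branch only approximates the interference on the signal path, so the residual is small rather than zero.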