
Showing papers on "Data acquisition published in 2016"


Journal ArticleDOI
TL;DR: The free and open‐source R package camtrapR is described, a new toolbox for flexible and efficient management of data generated in camera trap‐based wildlife studies and should be most useful to researchers and practitioners who regularly handle large amounts of camera trapping data.
Abstract: Camera trapping is a widely applied method to study mammalian biodiversity and is still gaining popularity. It can quickly generate large amounts of data which need to be managed in an efficient and transparent way that links data acquisition with analytical tools. We describe the free and open-source R package camtrapR, a new toolbox for flexible and efficient management of data generated in camera trap-based wildlife studies. The package implements a complete workflow for processing camera trapping data. It assists in image organization, species and individual identification, data extraction from images, tabulation and visualization of results and export of data for subsequent analyses. There is no limitation to the number of images stored in this data management system; the system is portable and compatible across operating systems. The functions provide extensive automation to minimize data entry mistakes and, apart from species and individual identification, require minimal manual user input. Species and individual identification are performed outside the R environment, either via tags assigned in dedicated image management software or by moving images into species directories. Input for occupancy and (spatial) capture–recapture analyses for density and abundance estimation, for example in the R packages unmarked or secr, is computed in a flexible and reproducible manner. In addition, survey summary reports can be generated, spatial distributions of records can be plotted and exported to GIS software, and single- and two-species activity patterns can be visualized. camtrapR allows for streamlined and flexible camera trap data management and should be most useful to researchers and practitioners who regularly handle large amounts of camera trapping data.
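camtrapR itself is an R package, but the identification-by-directory convention it supports is easy to illustrate. The sketch below (in Python, with invented field names rather than camtrapR's actual API) tabulates one record per image from a station/species folder tree:

```python
from pathlib import Path

def tabulate_records(base_dir):
    """Walk a <station>/<species>/<image> directory tree and return one
    record per image -- a minimal stand-in for camtrapR's record tables."""
    records = []
    for img in sorted(Path(base_dir).glob("*/*/*")):
        if img.suffix.lower() in {".jpg", ".jpeg", ".png"}:
            records.append({"station": img.parts[-3],
                            "species": img.parts[-2],
                            "file": img.name})
    return records
```

A table of this shape is the kind of input that occupancy or capture–recapture packages expect, which is what camtrapR computes in a reproducible way.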

255 citations


Journal ArticleDOI
TL;DR: An integral sliding-mode controller is proposed to supplement the conventional TDC, improving control precision even while the DVL navigation system is in operation; the combined controller is computationally simple and robust to unmodeled dynamics and disturbances.
Abstract: This paper presents an enhanced time-delay controller (TDC) for the position control of an autonomous underwater vehicle (AUV) under disturbances. A conventional TDC performs well when the involved data acquisition rate is fast. However, in AUV control applications that use a Doppler velocity log (DVL) navigation system, we cannot keep the data acquisition rate sufficiently fast because a DVL sensor generally supplies data at a slow acquisition rate, which degrades the performance of the TDC. To overcome this problem, we propose an integral sliding-mode controller to be supplemented to the conventional TDC to improve the control precision even if the DVL navigation system is in operation. The proposed controller is computationally simple and robust to unmodeled dynamics and disturbances. We performed computer simulations and experiments with the Cyclops AUV to demonstrate the validity of the proposed controller.

192 citations


Journal ArticleDOI
07 Mar 2016-eLife
TL;DR: A method for in-focus data acquisition with a phase plate that enables near-atomic resolution single particle reconstructions and could enable single particle analysis of challenging samples in terms of small size, heterogeneity and flexibility that are difficult to solve by the conventional defocus approach.
Abstract: We present a method for in-focus data acquisition with a phase plate that enables near-atomic resolution single particle reconstructions. Accurate focusing is the determining factor for obtaining high quality data. A double-area focusing strategy was implemented in order to achieve the required precision. With this approach we obtained a 3.2 Å resolution reconstruction of the Thermoplasma acidophilum 20S proteasome. The phase plate matches or slightly exceeds the performance of the conventional defocus approach. Spherical aberration becomes a limiting factor for achieving resolutions below 3 Å with in-focus phase plate images. The phase plate could enable single particle analysis of challenging samples in terms of small size, heterogeneity and flexibility that are difficult to solve by the conventional defocus approach.

150 citations


Journal ArticleDOI
TL;DR: The proposed architecture provides a guideline for constructing a CPS system, from hardware interconnection through data acquisition, processing, and visualization to the final knowledge acquisition and learning.

93 citations


Journal ArticleDOI
TL;DR: In this article, the authors presented the results of an ongoing research project conducted by the U.S. Federal Highway Administration (FHWA) on developing an intelligent approach for structural damage detection.

91 citations


Journal ArticleDOI
TL;DR: According to the numerical analysis, the prediction accuracy of the presented BDA improves clearly as data size increases and exceeds that of linear regression and a back-propagation network for CT forecasting on the large-scale data-set.
Abstract: In order to improve the prompt delivery reliability of the semiconductor wafer fabrication system, a big data analytics (BDA) approach is designed to predict wafer lots’ cycle time (CT); it is composed of four parts: data acquisition, data pre-processing, data analysing and data prediction. Firstly, the candidate feature set is constructed by collecting all features through analysing the material flow of the wafer foundry. Subsequently, a data pre-processing technique is designed to extract, transform and load data from the wafer lot transactions data-set. In addition, a conditional mutual information-based feature selection process is proposed to select a key feature subset and so reduce the dimension of the data-set through data analysing without pre-knowledge. To handle the large volumes of data, a concurrent forecasting model is designed to predict the CT of wafer lots in parallel as well. According to the numerical analysis, the prediction accuracy of the presented BDA improves clearly with the increase in data size. And, in the la...

85 citations


Journal ArticleDOI
TL;DR: In this paper, the authors developed a virtual simulation environment that can be used to gain fundamental understanding regarding internal processes and operational responses of a cone crusher, which can not only be used for understanding but also for development of new crushers and for optimisation purposes.

85 citations


Journal ArticleDOI
TL;DR: The results lend support to shorter, more convenient ECG recording procedures for lnRMSSD assessment in athletes by reducing the prerecording stabilization period to 1 min.
Abstract: Resting heart rate variability (HRV) is a potentially useful marker to consider for monitoring training status in athletes. However, traditional HRV data collection methodology requires a 5-min recording period preceded by a 5-min stabilization period. This lengthy process may limit HRV monitoring in the field due to time constraints and high compliance demands of athletes. Investigation into more practical methodology for HRV data acquisitions is required. The aim of this study was to determine the time course for stabilization of ECG-derived lnRMSSD from traditional HRV recordings. Ten-minute supine ECG measures were obtained in ten male and ten female collegiate cross-country athletes. The first 5 min for each ECG was separately analysed in successive 1-min intervals as follows: minutes 0-1 (lnRMSSD0-1), 1-2 (lnRMSSD1-2), 2-3 (lnRMSSD2-3), 3-4 (lnRMSSD3-4) and 4-5 (lnRMSSD4-5). Each 1-min lnRMSSD segment was then sequentially compared to lnRMSSD of the 5- to 10-min ECG segment, which was considered the criterion (lnRMSSDCriterion). There were no significant differences between each 1-min lnRMSSD segment and lnRMSSDCriterion, and the effect sizes were considered trivial (ES ranged from 0.07 to 0.12). In addition, the ICC for each 1-min segment compared to the criterion was near perfect (ICC values ranged from 0.92 to 0.97). The limits of agreement between the prerecording values and lnRMSSDCriterion ranged from ±0.28 to ±0.45 ms. These results lend support to shorter, more convenient ECG recording procedures for lnRMSSD assessment in athletes by reducing the prerecording stabilization period to 1 min.
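The lnRMSSD statistic compared above is straightforward to compute from a series of RR intervals; a minimal sketch (intervals in milliseconds, illustrative data only):

```python
import math

def ln_rmssd(rr_ms):
    """Natural log of the root mean square of successive differences
    between adjacent RR intervals (in ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return math.log(rmssd)
```

A 1-min recording at a resting heart rate near 60 beats per minute supplies roughly 60 such intervals, which is the practical appeal of the shortened protocol.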

83 citations


Journal ArticleDOI
TL;DR: Methods to detect outliers in network flow measurements that may be due to pipe bursts or unusual consumptions are fundamental to improve water distribution system on-line operation and management, and to ensure reliable historical data for sustainable planning and design of these systems.
Abstract: Methods to detect outliers in network flow measurements that may be due to pipe bursts or unusual consumptions are fundamental to improve water distribution system on-line operation and management, and to ensure reliable historical data for sustainable planning and design of these systems. To detect and classify anomalous events in flow data from district metering areas a four-step methodology was adopted, implemented and tested: i) data acquisition, ii) data validation and normalization, iii) anomalous observation detection, iv) anomalous event detection and characterization. This approach is based on the renewed concept of outlier regions and depends on a reduced number of configuration parameters: the number of past observations, the true positive rate and the false positive rate. Results indicate that this approach is flexible and applicable to the detection of different types of events (e.g., pipe burst, unusual consumption) and to different flow time series (e.g., instantaneous, minimum night flow).
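Step iii (anomalous observation detection) can be caricatured with a rolling window: flag any observation lying far outside the spread of recent past values. This is a simplified stand-in for the paper's outlier-region formulation, with the window length and threshold k as assumed configuration parameters:

```python
from statistics import mean, stdev

def flag_outliers(flow, window=8, k=3.0):
    """Indices of observations outside mean +/- k*stdev of the previous
    `window` values (requires at least 3 past points)."""
    flags = []
    for i, x in enumerate(flow):
        past = flow[max(0, i - window):i]
        if len(past) >= 3 and abs(x - mean(past)) > k * stdev(past):
            flags.append(i)
    return flags
```

In the paper the thresholds are instead derived from target true- and false-positive rates, which is what makes the approach portable across district metering areas and event types.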

78 citations


Journal ArticleDOI
TL;DR: The development and use of an open-source USB data acquisition device (with 16-bit acquisition resolution) built using simple electronic components and an Arduino Uno that costs under $50 is described.
Abstract: Many research and teaching laboratories rely on USB data acquisition devices to collect voltage signals from instrumentation. However, these devices can be cost-prohibitive (especially when large numbers are needed for teaching laboratories) and require software to be developed for operation. In this article, we describe the development and use of an open-source USB data acquisition device (with 16-bit acquisition resolution) built using simple electronic components and an Arduino Uno that costs under $50. Additionally, open-source software written in Python is included so that data can be acquired using nearly any PC or Mac computer with a simple USB connection. Use of the device was demonstrated for a sophomore-level analytical experiment using gas chromatography and a capillary electrophoresis-UV separation on an instrument used for research purposes.
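The host-side logic of such a device is mostly line-based serial reading plus an ADC-counts-to-volts conversion. A hedged sketch using only the standard library (the 2.048 V reference and the newline-delimited protocol are assumptions, not the article's actual firmware; a real port would be opened with pyserial):

```python
def counts_to_volts(raw, vref=2.048, bits=16):
    """Convert a raw ADC reading to volts (vref and bit depth assumed)."""
    return raw * vref / (2 ** bits - 1)

def acquire(stream, n):
    """Read up to n newline-terminated integer samples from a file-like
    stream (e.g. a pyserial Serial object) and return voltages."""
    out = []
    while len(out) < n:
        line = stream.readline()
        if not line:          # EOF or timeout
            break
        line = line.strip()
        if line:
            out.append(counts_to_volts(int(line)))
    return out
```

Keeping the parsing separate from the transport is what lets the same code run against a mock stream in teaching-lab exercises and a live USB port in the instrument room.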

76 citations


Journal ArticleDOI
TL;DR: This paper proposes a process with a generic data mining model that can be used for developing acoustic signal-based fault diagnosis systems for reciprocating air compressors, and thorough analysis has been presented where performance of the system is compared.
Abstract: Intelligent fault diagnosis of machines for early recognition of faults saves industry from heavy losses occurring due to machine breakdowns. This paper proposes a process with a generic data mining model that can be used for developing acoustic signal-based fault diagnosis systems for reciprocating air compressors. The process includes details of data acquisition, sensitive position analysis for deciding suitable sensor locations, signal pre-processing, feature extraction, feature selection, and a classification approach. This process was validated by developing a real time fault diagnosis system on a reciprocating type air compressor having 8 designated states, including one healthy state, and 7 faulty states. The system was able to accurately detect all the faults by analyzing acoustic recordings taken from just a single position. Additionally, thorough analysis has been presented where performance of the system is compared while varying feature selection techniques, the number of selected features, and multiclass decomposition algorithms meant for binary classifiers.
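The feature-extraction stage of such a pipeline reduces each acoustic frame to a handful of numbers before selection and classification. Two common time-domain features serve as an illustration (the paper's actual feature set is richer and is not specified here):

```python
import math

def features(window):
    """RMS energy and zero-crossing rate of one signal frame."""
    rms = math.sqrt(sum(x * x for x in window) / len(window))
    zcr = sum(1 for a, b in zip(window, window[1:]) if a * b < 0) / (len(window) - 1)
    return {"rms": rms, "zcr": zcr}
```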

Journal ArticleDOI
TL;DR: An architecture based on a high-level control system that manages low-level data acquisition, data processing and device changes is described, suitable for routine as well as prototypical experiments, and provides specialized building blocks to conduct four-dimensional in situ, in vivo and operando tomography and laminography.
Abstract: Real-time processing of X-ray image data acquired at synchrotron radiation facilities allows for smart high-speed experiments. This includes workflows covering parameterized and image-based feedback-driven control up to the final storage of raw and processed data. Nevertheless, there is presently no system that supports an efficient construction of such experiment workflows in a scalable way. Thus, here an architecture based on a high-level control system that manages low-level data acquisition, data processing and device changes is described. This system is suitable for routine as well as prototypical experiments, and provides specialized building blocks to conduct four-dimensional in situ, in vivo and operando tomography and laminography.

Journal ArticleDOI
TL;DR: A novel real-time voltage sag and swell detection and classification scheme using an artificial neural network is presented, and its suitability, robustness and adaptability for monitoring power quality issues are claimed.

Patent
04 Apr 2016
TL;DR: In this article, an analytics server is communicatively connected to a data acquisition component and a virtual system model database to generate predicted data based on the virtual system models of the DR power network and update the model in real time based on a difference between the predicted data and the real-time data.
Abstract: Systems and methods for model-based demand response are disclosed. An analytics server is communicatively connected to a data acquisition component and a virtual system model database. The data acquisition component is operable to acquire and transmit real-time data from a demand response (DR) power network to the analytics server. The virtual system model database is operable to provide a virtual system model of the DR power network. The analytics server is operable to generate predicted data based on the virtual system model of the DR power network and update the virtual system model in real time based on a difference between the predicted data and the real-time data. The analytics server is further operable to optimize DR output of the DR power network to a power grid.

Journal ArticleDOI
TL;DR: An open-source data acquisition system based on the Simple Network Management Protocol (SNMP) that is able to record transmitted and received signal levels of a large number of CMLs simultaneously with a temporal resolution of up to 1 s.
Abstract: The usage of data from commercial microwave link (CML) networks for scientific purposes is becoming increasingly popular, in particular for rain rate estimation. However, data acquisition and availability remain a crucial problem and limit research possibilities. To overcome this issue, we have developed an open-source data acquisition system based on the Simple Network Management Protocol (SNMP). It is able to record transmitted and received signal levels of a large number of CMLs simultaneously with a temporal resolution of up to 1 s. We operate this system at Ericsson Germany, acquiring data from 450 CMLs with minutely real-time transfer to our database. Our data acquisition system is not limited to a particular CML hardware model or manufacturer, though. We demonstrate this by running the same system for CMLs of a different manufacturer, operated by an alpine ski resort in Germany. There, the data acquisition is running simultaneously for four CMLs with a temporal resolution of 1 s. We present an overview of our system, describe the details of the necessary SNMP requests and show results from its operational application.
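One acquisition cycle of such a system reduces to issuing SNMP GET requests for the signal-level OIDs of each link and timestamping the answers. A sketch with a pluggable `snmp_get` callable (the OIDs below are placeholders, not real ones; actual OIDs are vendor-specific and come from the device MIB, and a production version would issue the GET over UDP with an SNMP library such as pysnmp):

```python
import time

# Placeholder OIDs for transmitted/received signal level -- NOT real values.
OIDS = {"tsl": "1.3.6.1.4.1.99999.1.1", "rsl": "1.3.6.1.4.1.99999.1.2"}

def poll_once(snmp_get, link_id):
    """One polling cycle for a single CML; snmp_get(link_id, oid) stands in
    for a real SNMP GET request."""
    row = {"time": time.time(), "link": link_id}
    for name, oid in OIDS.items():
        row[name] = snmp_get(link_id, oid)
    return row
```

Running this once per second per link reproduces the 1 s temporal resolution mentioned above; the resulting rows then stream to the database.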

Journal ArticleDOI
Quansheng Chen1, Weiwei Hu1, Jie Su1, Huanhuan Li1, Qin Ouyang1, Jiewen Zhao1 
TL;DR: This work demonstrates that the artificial olfactory technique based on a colorimetric sensor array, as a nondestructive sensing tool, has high potential to quantify total viable count (TVC) in chicken.

Journal ArticleDOI
M. Abolins1, Ricardo Abreu2, R. Achenbach3, M. Aharrouche4, and 661 more authors (96 institutions)
TL;DR: The data acquisition and high level trigger system of the ATLAS experiment at the Large Hadron Collider at CERN, as deployed during Run 1, is described.
Abstract: This paper describes the data acquisition and high level trigger system of the ATLAS experiment at the Large Hadron Collider at CERN, as deployed during Run 1. Data flow as well as control, configuration and monitoring aspects are addressed. An overview of the functionality of the system and of its performance is presented and design choices are discussed.

Journal ArticleDOI
TL;DR: In this paper, a home-built data acquisition unit (DAQ) specifically tailored to the needs of single-particle ICP-MS applications was developed to study and alleviate some of these limitations.
Abstract: In inductively coupled plasma mass spectrometry (ICP-MS), short transient signals originating from individual nanoparticles are typically recorded in a time-resolved measurement with reduced dwell times in the millisecond time regime. This approach was termed single-particle ICP-MS in the past and used for particle counting and sizing but is not without limitations. In this work, a home-built data acquisition unit (DAQ) specifically tailored to the needs of single-particle ICP-MS applications was developed to study and alleviate some of these limitations. For best comparison, data were acquired simultaneously with both techniques. Each experiment was carried out as a conventional time-resolved measurement, while the DAQ directly probed the instrument's detection circuitry. Our DAQ features dwell times as low as 5 μs during continuous data acquisition and can be operated for virtually unlimited measurement time. Using a time resolution much higher than the typical duration of a particle-related ion cloud, the probability of measurement artifacts due to particle coincidence could be significantly reduced and the occurrence of split-particle events in fact was almost eliminated. Moreover, a duty cycle of 100% of the counting electronics improves the method's accuracy compared to the acquisition system of currently available ICP-Q-MS instruments. Fully time-resolved temporal profiles of transient signals originating from single gold nanoparticles as small as 10 nm are presented. The advantages and disadvantages of millisecond versus microsecond dwell times are critically discussed including measurement artifacts due to particle coincidence, split-particle events, and particle number concentration.
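Why microsecond dwell times reduce particle coincidence follows from Poisson statistics: among dwell windows that contain at least one particle, the fraction containing two or more shrinks roughly linearly with the dwell time. A worked check (the arrival rate is chosen purely for illustration):

```python
import math

def coincidence_fraction(rate_hz, dwell_s):
    """Fraction of occupied dwell windows holding >= 2 particles,
    assuming Poisson arrivals with mean rate rate_hz."""
    mu = rate_hz * dwell_s
    p0 = math.exp(-mu)   # window is empty
    p1 = mu * p0         # window holds exactly one particle
    return (1 - p0 - p1) / (1 - p0)
```

At 1000 particles per second, a 10 ms dwell makes nearly every occupied window a coincidence, while a 5 µs dwell keeps the fraction around 0.25%.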

Journal ArticleDOI
TL;DR: The purpose of this review is to introduce the reader to the current state-of-the-art of open-source packages for (semi)automated processing and analysis of multichannel extracellular neuronal signals, and the existing Neuroinformatics infrastructure for tool and data sharing.
Abstract: In recent years, multichannel neuronal signal acquisition systems have allowed scientists to focus on research questions which were otherwise impossible. They act as a powerful means to study brain (dys)functions in in-vivo and in-vitro animal models. Typically, each session of electrophysiological experiments with multichannel data acquisition systems generates a large amount of raw data. For example, a 128 channel signal acquisition system with 16 bits A/D conversion and 20 kHz sampling rate will generate approximately 17 GB data per hour (uncompressed). This poses an important and challenging problem of inferring conclusions from the large amounts of acquired data. Thus, automated signal processing and analysis tools are becoming a key component in neuroscience research, facilitating extraction of relevant information from neuronal recordings in a reasonable time. The purpose of this review is to introduce the reader to the current state-of-the-art of open-source packages for (semi)automated processing and analysis of multichannel extracellular neuronal signals (i.e., neuronal spikes, local field potentials, electroencephalogram, etc.), and the existing Neuroinformatics infrastructure for tool and data sharing. The review is concluded by pinpointing some major challenges that are being faced, which include the development of novel benchmarking techniques, cloud-based distributed processing and analysis tools, as well as defining novel means to share and standardize data.
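The roughly 17 GB per hour figure quoted above is simple arithmetic, and the number matches when read as gibibytes:

```python
def raw_rate_bytes_per_hour(channels, bits, sample_hz):
    """Uncompressed acquisition volume for one hour of recording."""
    return channels * (bits / 8) * sample_hz * 3600

b = raw_rate_bytes_per_hour(128, 16, 20_000)
# 18_432_000_000 bytes, i.e. 18.4 GB or about 17.2 GiB per hour
```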

Journal ArticleDOI
TL;DR: In this article, the authors apply comprehensive testing methods to determine the reliability of the data and its suitability as a supplement to geophone data or to gain access to wells where it would be difficult to deploy geophones.
Abstract: Great advances have been made in distributed acoustic sensing (DAS) vertical seismic profile (VSP) data acquisition hardware and software. Here, we capture a quantitative assessment of the quality of DAS data at a single point in time. We apply comprehensive testing methods to determine the reliability of the data and its suitability as a supplement to geophone data or to gain access to wells where it would be difficult to deploy geophones. The test measurements are made on DAS and geophone data, which were collected at the same time and in the same well. We analyze the first breaks for waveform consistency, signal-to-noise (S/N) ratio, and slowness. Then, we examine the corridor stacks for waveform consistency and S/N ratio. Finally, we test the properties of the measurement, including linearity, repeatability, reliability, and response, as a function of the angle of incidence of a seismic wave to the fiber. The results show that the DAS VSP data provide accurate formation slowness logs and reliable ampl...

Patent
16 Mar 2016
TL;DR: In this paper, a network teaching method and a system consisting of multiple user terminals and a server is presented, which consists of a data acquisition step, a synchronous display step, and a real-time recording step and a broadcasting and replaying step.
Abstract: The invention provides a network teaching method and system. The system comprises multiple user terminals and a server; the user terminals communicate with the server via the network. The method comprises a data acquisition step, a synchronous display step, a real-time recording step and a broadcasting and replaying step. In the data acquisition step, image data, application data and/or audio data generated during network teaching are acquired and used for synchronous display and real-time recording. In the synchronous display step, the acquired data are automatically and synchronously displayed on, or transmitted to, the student user terminals. In the real-time recording step, the acquired data are separately stored in a database so that users can download them. In the broadcasting and replaying step, upon users' requests, the recorded data are transmitted to the users and combined to redisplay the teaching process on the user terminals.

Journal ArticleDOI
TL;DR: In this paper, the authors developed a low-cost data acquisition platform prototype for automotive telemetry applications such as driving style analysis, fleet management and fault detection, which can contribute significantly to road safety, to defining insurance premiums, to engaging users in saving fuel and money, and to correlating car faults with driving style.

Journal ArticleDOI
25 Oct 2016-JAMA
TL;DR: This publication is part of the National Academy of Medicine’s Vital Directions for Health and Health Care Initiative, which called on more than 150 leading researchers, scientists, and policy makers from across the United States to assess and provide expert guidance on 19 priority issues for U.S. health policy.
Abstract: This publication is part of the National Academy of Medicine’s Vital Directions for Health and Health Care Initiative, which called on more than 150 leading researchers, scientists, and policy makers from across the United States to assess and provide expert guidance on 19 priority issues for U.S. health policy. The views presented in this publication and others in the series are those of the authors and do not represent formal consensus positions of the NAM, the National Academies of Sciences, Engineering, and Medicine, or the authors’ organizations. Learn more: nam.edu/VitalDirections.

Journal ArticleDOI
01 Dec 2016
TL;DR: ICE as mentioned in this paper is a hardware and software framework that implements large arrays of interconnected field-programmable gate array (FPGA)-based data acquisition, signal processing and networking nodes economically for radio, millimeter and sub-millimeter telescope readout systems that have requirements beyond typical off-the-shelf processing systems.
Abstract: We present an overview of the ‘ICE’ hardware and software framework that implements large arrays of interconnected field-programmable gate array (FPGA)-based data acquisition, signal processing and networking nodes economically. The system was conceived for application to radio, millimeter and sub-millimeter telescope readout systems that have requirements beyond typical off-the-shelf processing systems, such as careful control of interference signals produced by the digital electronics, and clocking of all elements in the system from a single precise observatory-derived oscillator. A new generation of telescopes operating at these frequency bands and designed with a vastly increased emphasis on digital signal processing to support their detector multiplexing technology or high-bandwidth correlators — data rates exceeding a terabyte per second — are becoming common. The ICE system is built around a custom FPGA motherboard that makes use of an Xilinx Kintex-7 FPGA and ARM-based co-processor. The system is ...

Journal ArticleDOI
TL;DR: In this article, the authors proposed the use of three-dimensional building information model (BIM) objects to integrate schedule and cost, using the powerful BIM for data acquisition and storage. BIM objects are incorporated into a proposed four-step model to establish construction progress curves.

Book ChapterDOI
01 Jan 2016
TL;DR: The goals of this chapter are to identify the current requirements for data acquisition by presenting open state-of-the-art frameworks and protocols for big data acquisition for companies, and to unveil the current approaches used for data acquisition in the different sectors.
Abstract: Different data processing architectures for big data have been proposed to address the different characteristics of big data. Data acquisition has been understood as the process of gathering, filtering, and cleaning data before the data is put in a data warehouse or any other storage solution. The acquisition of big data is most commonly governed by four of the Vs: volume, velocity, variety, and value. Most data acquisition scenarios assume high-volume, high-velocity, high-variety, but low-value data, making it important to have adaptable and time-efficient gathering, filtering, and cleaning algorithms that ensure that only the high-value fragments of the data are actually processed by the data-warehouse analysis. The goals of this chapter are threefold: First, it aims to identify the current requirements for data acquisition by presenting open state-of-the-art frameworks and protocols for big data acquisition for companies. The second goal is to unveil the current approaches used for data acquisition in the different sectors. Finally, it discusses how the requirements of data acquisition are met by current approaches as well as possible future developments in the same area.
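The gather-filter-clean shape described above fits in a one-line pipeline; a toy sketch with user-supplied predicates (all names are illustrative):

```python
def ingest(stream, is_valuable, clean):
    """Gather -> filter -> clean: keep only high-value records,
    normalised, before they reach the warehouse."""
    return [clean(r) for r in stream if is_valuable(r)]

rows = ingest(
    [{"v": 3, "ok": True}, {"v": 9, "ok": False}, {"v": 7, "ok": True}],
    is_valuable=lambda r: r["ok"],
    clean=lambda r: r["v"],
)
# rows == [3, 7]
```

Filtering before storage is exactly the high-volume/low-value trade the chapter describes: only the valuable fragments ever reach the warehouse analysis.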

Proceedings ArticleDOI
01 Dec 2016
TL;DR: This paper considers the architecture implemented to stream marine data from instrument to end user and offers suggestions on how to standardise these data streams.
Abstract: In August 2015, a new seafloor observatory was deployed in Galway Bay, Ireland. The sensors on the observatory platform are connected by fibre-optic cable to a shore station, where a broadband connection allows data transfer to the Marine Institute's data centre. This setup involved the development of a new data acquisition system which takes advantage of open source streaming data solutions developed in response to the Big Data paradigm, in particular the Velocity aspect. This activity merges concepts from the arenas of both Big Data and Internet of Things where data standardisation is not normally considered. This paper considers the architecture implemented to stream marine data from instrument to end user and offers suggestions on how to standardise these data streams.

Proceedings ArticleDOI
01 Nov 2016
TL;DR: In this article, the authors deploy a low-cost sensor system, gather field data, and display the data through a graphical user interface (GUI) to improve the productivity and efficiency of limited agricultural resources.
Abstract: Precision Agriculture is utilized to improve the productivity and efficiency of limited agricultural resources by monitoring the relevant data in the field. The main objective of this study is to deploy a low-cost sensor system, gather field data, and display the data through a graphical user interface (GUI). Sensors for humidity, temperature, moisture, luminosity, electrical conductivity, and pH were used for data acquisition, and a Raspberry Pi, acting as a local server, was used for data processing and transfer. The data sent were stored in a main server and organized using SQL. A GUI was developed to provide visualization of the data gathered. The trends of the data gathered revealed patterns such as the occurrence of a local maximum for humidity right after dawn and the inverse relationship of humidity and temperature. The whole system was tested and proven to work by applying fertilizer to the soil and observing its response in the GUI.
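The storage leg of such a system (Raspberry Pi to SQL server) can be sketched with SQLite standing in for the main server's database; the schema and sensor names are illustrative, not the study's actual ones:

```python
import sqlite3
import time

def store_reading(conn, sensor, value):
    """Append one timestamped measurement to the readings table."""
    conn.execute("INSERT INTO readings (ts, sensor, value) VALUES (?, ?, ?)",
                 (time.time(), sensor, value))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (ts REAL, sensor TEXT, value REAL)")
for name, val in [("humidity", 78.2), ("temperature", 24.6), ("ph", 6.4)]:
    store_reading(conn, name, val)
```

A GUI would then query this table to plot the humidity and temperature trends discussed above.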

Journal ArticleDOI
TL;DR: In this article, the authors present a complete Data Acquisition System (DAQ) together with the readout mechanisms for the J-PET tomography scanner, which is capable of maintaining continuous readout of digitized data without preliminary selection.
Abstract: In this paper, we present a complete Data Acquisition System (DAQ) together with the readout mechanisms for the J-PET tomography scanner. In general, the detector readout chain is constructed from Front-End Electronics (FEE), measurement devices such as Time-to-Digital or Analog-to-Digital Converters (TDCs or ADCs), data collectors and storage. We have developed a system capable of maintaining continuous readout of digitized data without preliminary selection. Such an operation mode results in a data stream of up to 8 Gbps; it is therefore necessary to introduce a dedicated module for online event building and feature extraction. The Central Controller Module, equipped with a Xilinx Zynq SoC and 16 optical transceivers, serves as such a true real-time computing facility. Our solution for continuous (trigger-less) data recording is a novel approach in such detector systems and ensures that most of the information is preserved in storage for further, high-level processing. Signal discrimination applies a unique method of using LVDS buffers located in the FPGA fabric.

Journal ArticleDOI
TL;DR: A secure cloud-based framework for privacy-aware healthcare monitoring systems, which allows fast data acquisition and indexing with strong privacy assurance, and a novel encrypted index with high-performance customization that achieves memory efficiency, provable security, as well as greatly improved building speed with nontrivial multithread support.
Abstract: As e-health technology continues to advance, health related multimedia data is being exponentially generated from healthcare monitoring devices and sensors. Coming with it are the challenges of how to efficiently acquire, index, and process such a huge amount of data for effective healthcare and related decision making, while respecting users' data privacy. In this paper, we propose a secure cloud-based framework for privacy-aware healthcare monitoring systems, which allows fast data acquisition and indexing with strong privacy assurance. For efficient data acquisition, we adopt compressive sensing for easy data sampling, compression, and recovery. We then focus on how to securely and rapidly index the resulting large amount of continuously generated compressed samples, with the goal of achieving secure selective retrieval over compressed storage. Among others, one particular challenge is the practical demand to cope with incoming data samples at high acquisition rates. For that problem, we carefully exploit recent efforts on encrypted search, efficient content-based indexing techniques, and fine-grained locking algorithms, to design a novel encrypted index with high-performance customization. It achieves memory efficiency, provable security, as well as greatly improved building speed with nontrivial multithread support. Comprehensive evaluations on Amazon Cloud show that our encrypted design can securely index 1 billion compressed data samples within only 12 min, achieving a throughput of indexing almost 1.4 million encrypted samples per second. Accuracy and visual evaluation on a real healthcare dataset shows good quality of high-value retrieval and recovery over encrypted data samples.