
Showing papers on "Personal computer" published in 2018


Proceedings Article
15 Aug 2018
TL;DR: It is shown that the KAISER defense mechanism for KASLR has the important (but inadvertent) side effect of impeding Meltdown, an attack that breaks all security guarantees provided by address space isolation as well as paravirtualized environments.
Abstract: The security of computer systems fundamentally relies on memory isolation, e.g., kernel address ranges are marked as non-accessible and are protected from user access. In this paper, we present Meltdown. Meltdown exploits side effects of out-of-order execution on modern processors to read arbitrary kernel-memory locations including personal data and passwords. Out-of-order execution is an indispensable performance feature and present in a wide range of modern processors. The attack is independent of the operating system, and it does not rely on any software vulnerabilities. Meltdown breaks all security guarantees provided by address space isolation as well as paravirtualized environments and, thus, every security mechanism building upon this foundation. On affected systems, Meltdown enables an adversary to read memory of other processes or virtual machines in the cloud without any permissions or privileges, affecting millions of customers and virtually every user of a personal computer. We show that the KAISER defense mechanism for KASLR has the important (but inadvertent) side effect of impeding Meltdown. We stress that KAISER must be deployed immediately to prevent large-scale exploitation of this severe information leakage.

777 citations


Posted ContentDOI
04 Nov 2018-bioRxiv
TL;DR: Harmony is a fast and flexible general purpose integration algorithm that enables the identification of shared fine-grained subpopulations across a variety of experimental and biological conditions.
Abstract: The rapidly emerging diversity of single cell RNAseq datasets allows us to characterize the transcriptional behavior of cell types across a wide variety of biological and clinical conditions. With this comprehensive breadth comes a major analytical challenge. The same cell type across tissues, from different donors, or in different disease states, may appear to express different genes. A joint analysis of multiple datasets requires the integration of cells across diverse conditions. This is particularly challenging when datasets are assayed with different technologies in which real biological differences are interspersed with technical differences. We present Harmony, an algorithm that projects cells into a shared embedding in which cells group by cell type rather than dataset-specific conditions. Unlike available single-cell integration methods, Harmony can simultaneously account for multiple experimental and biological factors. We develop objective metrics to evaluate the quality of data integration. In four separate analyses, we demonstrate the superior performance of Harmony to four single-cell-specific integration algorithms. Moreover, we show that Harmony requires dramatically fewer computational resources. It is the only available algorithm that makes the integration of ~1 million cells feasible on a personal computer. We demonstrate that Harmony identifies both broad populations and fine-grained subpopulations of PBMCs from datasets with large experimental differences. In a meta-analysis of 14,746 cells from 5 studies of human pancreatic islet cells, Harmony accounts for variation among technologies and donors to successfully align several rare subpopulations. In the resulting integrated embedding, we identify a previously unidentified population of potentially dysfunctional alpha islet cells, enriched for genes active in the Endoplasmic Reticulum (ER) stress response. The abundance of these alpha cells correlates across donors with the proportion of dysfunctional beta cells also enriched in ER stress response genes. Harmony is a fast and flexible general purpose integration algorithm that enables the identification of shared fine-grained subpopulations across a variety of experimental and biological conditions.
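At its core, Harmony alternates a soft clustering step with a per-cluster linear correction that removes dataset-specific offsets. The following is a minimal conceptual sketch of that loop, assuming a PCA embedding as input; the function name, the plain soft k-means weighting, and the simple mean-shift correction are illustrative simplifications, not the authors' implementation (which adds a diversity penalty and a ridge-regularized mixture model).

```python
import numpy as np

def harmony_sketch(X, batch, n_clusters=5, n_iters=10, seed=0):
    """X: (cells, dims) embedding (e.g., PCA); batch: per-cell batch labels."""
    rng = np.random.default_rng(seed)
    Z = X.astype(float).copy()
    centroids = Z[rng.choice(len(Z), n_clusters, replace=False)]
    for _ in range(n_iters):
        # Soft k-means assignment from negative squared distances.
        d = ((Z[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        d -= d.min(axis=1, keepdims=True)           # numerical stability
        R = np.exp(-d)
        R /= R.sum(axis=1, keepdims=True)
        centroids = (R.T @ Z) / R.sum(axis=0)[:, None]
        # Within each cluster, shift every batch's weighted mean toward
        # the cluster mean so that batches mix inside clusters.
        for k in range(n_clusters):
            w = R[:, k:k + 1]
            mu_k = (w * Z).sum(axis=0) / w.sum()
            for b in np.unique(batch):
                m = np.asarray(batch) == b
                wb = w[m]
                mu_kb = (wb * Z[m]).sum(axis=0) / max(wb.sum(), 1e-12)
                Z[m] = Z[m] - wb * (mu_kb - mu_k)
    return Z
```

Run on a cells-by-PCs matrix with a per-cell batch vector, cells from different batches end up interleaved within clusters while cluster structure is preserved.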

148 citations


Journal ArticleDOI
TL;DR: A new fully automated deep learning framework for EAT and thoracic adipose tissue quantification from non-contrast coronary artery calcium computed tomography (CT) scans is proposed; it may improve cardiovascular risk stratification in patients referred for routine CT calcium scans.
Abstract: Epicardial adipose tissue (EAT) is a visceral fat deposit related to coronary artery disease. Fully automated quantification of EAT volume in clinical routine could be a timesaving and reliable tool for cardiovascular risk assessment. We propose a new fully automated deep learning framework for EAT and thoracic adipose tissue (TAT) quantification from non-contrast coronary artery calcium computed tomography (CT) scans. The first multi-task convolutional neural network (ConvNet) is used to determine heart limits and perform segmentation of heart and adipose tissues. The second ConvNet, combined with a statistical shape model, allows for pericardium detection. EAT and TAT segmentations are then obtained from outputs of both ConvNets. We evaluate the performance of the method on CT data sets from 250 asymptomatic individuals. Strong agreement between automatic and expert manual quantification is obtained for both EAT and TAT with median Dice score coefficients of 0.823 (inter-quartile range (IQR): 0.779–0.860) and 0.905 (IQR: 0.862–0.928), respectively; with excellent correlations of 0.924 and 0.945 for EAT and TAT volumes. Computations are performed in <26 s on a standard personal computer for one CT scan. Therefore, the proposed method represents a tool for rapid fully automated quantification of adipose tissue and may improve cardiovascular risk stratification in patients referred for routine CT calcium scans.
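The agreement metric reported above, the Dice score coefficient, is simple to compute from two binary segmentation masks; a minimal NumPy version (not the authors' code) is:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice score coefficient between two binary segmentation masks."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 1.0 if denom == 0 else 2.0 * np.logical_and(a, b).sum() / denom
```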

122 citations


Journal ArticleDOI
TL;DR: This research focuses on the tree canopy and background sky of an orchard row, using the contrast between canopy and sky to guide small utility vehicles in the orchard in the future.

74 citations


Journal ArticleDOI
Binsen Peng, Hong Xia, Yong-kuo Liu, Bo Yang, Dan Guo, Shaomin Zhu
TL;DR: The results show that the proposed method has clear advantages over other methods and would provide valuable guidance for fault diagnosis in nuclear power plants (NPPs).

72 citations


Journal ArticleDOI
TL;DR: These results verified that the developed portable EIT system with Red Pitaya STEMlab can measure biological tissue with high accuracy at low cost.
Abstract: A portable electrical impedance tomography (EIT) system has been developed with Red Pitaya STEMlab for biomedical applications. The Red Pitaya STEMlab is a portable device that realizes voltage generation and data acquisition for the EIT system. The EIT system includes a modified Howland circuit as a voltage-controlled current source, a high-speed analog multiplexer module, an 8-electrode array, and a personal computer. The generalized vector sampled pattern matching algorithm and the Tikhonov regularization algorithm are used to reconstruct the image generated by the EIT system. The images reconstructed using Red Pitaya STEMlab are compared with those from a commercial impedance analyzer (IM3570) at a frequency of f = 100 kHz. The results show that the maximum difference in image correlation between Red Pitaya STEMlab and IM3570 is 5.36%. Finally, the EIT system is used to image the conductivity of eggs during the heating process. These results verified that the developed portable EIT system with Red Pitaya STEMlab can measure biological tissue with high accuracy at low cost.
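Of the two reconstruction methods named, Tikhonov regularization has a compact closed form: given a linearized sensitivity (Jacobian) matrix J and a vector of boundary-voltage changes v, the update solves min_x ||Jx − v||² + λ||x||². A minimal NumPy sketch, with an illustrative hand-picked λ:

```python
import numpy as np

def tikhonov_reconstruct(J, v, lam=1e-2):
    """Linearized EIT step: argmin_x ||J x - v||^2 + lam * ||x||^2."""
    A = J.T @ J + lam * np.eye(J.shape[1])
    return np.linalg.solve(A, J.T @ v)
```

In practice λ is tuned (e.g., via the L-curve), and J comes from the forward model of the 8-electrode array.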

71 citations


Journal ArticleDOI
TL;DR: Results indicate that strategies to reduce personal and family-related risk factors have to be developed in order to reduce the odds of AR expression, and suggest that AR manifestation and presentation may be strongly affected by various personal and family factors.

67 citations


Journal ArticleDOI
06 Jun 2018-Sensors
TL;DR: Evaluating a rapid prototyping solution for information merging based on five health sensors and two low-cost ubiquitous computing components, Arduino and Raspberry Pi, confirms that portable devices are suitable to support the transmission and analysis of biometric signals in scalable telemedicine systems.
Abstract: Health and sociological indicators alert that life expectancy is increasing, hence so are the years that patients have to live with chronic diseases and co-morbidities. With the advancement in ICT, new tools and paradigms are being explored to provide effective and efficient health care. Telemedicine and health sensors stand as indispensable tools for promoting patient engagement, self-management of diseases and assisting doctors to remotely follow up patients. In this paper, we evaluate a rapid prototyping solution for information merging based on five health sensors and two low-cost ubiquitous computing components: Arduino and Raspberry Pi. Our study, which is entirely described with the purpose of reproducibility, aimed to evaluate the extent to which portable technologies are capable of integrating wearable sensors by comparing two deployment scenarios: Raspberry Pi 3 and Personal Computer. The integration is implemented using a choreography engine to transmit data from sensors to a display unit using web services and a simple communication protocol with two modes of data retrieval. Performance of the two set-ups is compared by means of the latency in the wearable data transmission and data loss. The PC has a delay of 0.051 ± 0.0035 s (max = 0.2504 s), whereas the Raspberry Pi yields a delay of 0.0175 ± 0.149 s (max = 0.294 s) for N = 300. Our analysis confirms that portable devices (p << 0.01) are suitable to support the transmission and analysis of biometric signals into scalable telemedicine systems.
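The latency statistics quoted above (mean ± standard deviation and maximum over N = 300 transmissions) can be reproduced from a log of per-message delays with a few lines of NumPy; the log file name here is hypothetical:

```python
import numpy as np

# Hypothetical log: one per-message delay in seconds per line.
lat = np.loadtxt("latencies_rpi3.txt")
print(f"N = {lat.size}: {lat.mean():.4f} ± {lat.std(ddof=1):.4f} s "
      f"(max = {lat.max():.4f} s)")
```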

65 citations


Journal ArticleDOI
30 May 2018
TL;DR: The study modeled the system and implemented and tested it to explore the relation between security, capacity and data dependency; the main outcome showed the 3-LSB approach to be the preferred choice, giving acceptable security with practical capacity.
Abstract: This paper proposes an enhanced system for securing sensitive text data on a personal computer that benefits from the combination of two techniques: cryptography and steganography. System security is provided by RSA cryptography followed by video-based steganography, applied as two sequential layers to ensure the best possible security and to gain the advantages of both. The study modeled the system and implemented and tested it to explore the relation between security, capacity and data dependency. The experiments covered hiding data within 15 videos of different sizes, showing interesting results. The research characterizes the enforced, unavoidable trade-off between capacity and security. The work's uniqueness lies in presenting different measures that allow the user and application, as decision makers, to choose. The tests covered all the possibilities of accepting the security of 1-LSB, 2-LSB, and 3-LSB methods, detailing their effects on the cover video. The main outcome showed the 3-LSB approach to be the preferred choice among 1-LSB and 2-LSB techniques, giving acceptable security with practical capacity.
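The k-LSB substitution behind this trade-off replaces the k lowest bits of each cover sample with payload bits, so capacity grows linearly with k while worst-case distortion grows as 2^k − 1. A minimal NumPy sketch of embedding into one frame follows; it is illustrative only and omits the RSA encryption layer the paper applies before embedding:

```python
import numpy as np

def embed_lsb(frame, payload_bits, n_lsb=3):
    """Embed payload bits into the n_lsb least significant bits of a
    uint8 cover frame (one value per pixel channel)."""
    flat = frame.flatten().astype(np.uint8)
    bits = np.asarray(payload_bits, dtype=np.uint8)
    pad = (-len(bits)) % n_lsb                    # pad to a multiple of n_lsb
    bits = np.concatenate([bits, np.zeros(pad, np.uint8)])
    weights = (1 << np.arange(n_lsb - 1, -1, -1)).astype(np.uint8)
    vals = bits.reshape(-1, n_lsb) @ weights      # pack bit groups into values
    if len(vals) > len(flat):
        raise ValueError("payload exceeds cover capacity")
    keep = np.uint8(0xFF ^ ((1 << n_lsb) - 1))    # clear the n_lsb low bits
    flat[: len(vals)] = (flat[: len(vals)] & keep) | vals
    return flat.reshape(frame.shape)
```

With n_lsb=3 the capacity triples relative to 1-LSB while the worst-case per-sample change grows from 1 to 7 grey levels, which is exactly the capacity-versus-security tension the paper measures.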

56 citations


Journal ArticleDOI
TL;DR: A low-cost wearable wireless system specifically designed to acquire surface electromyography (sEMG) and accelerometer signals for monitoring the human activity when performing sport and fitness activities, as well as in healthcare applications is presented.
Abstract: Human activity monitoring technology is one of the most important technologies for ambient assisted living, surveillance-based security, sport and fitness activities, and healthcare of elderly people. Activity monitoring is performed in two steps: the acquisition of body signals and the classification of the activities being performed. This paper presents a low-cost wearable wireless system specifically designed to acquire surface electromyography (sEMG) and accelerometer signals for monitoring human activity when performing sport and fitness activities, as well as in healthcare applications. The proposed system consists of several ultralight wireless sensing nodes that are able to acquire, process and efficiently transmit the motion-related (biological and accelerometer) body signals to one or more base stations through a 2.4 GHz radio link using an ad-hoc communication protocol designed on top of the IEEE 802.15.4 physical layer. User interface software for viewing, recording, and analysing the data was implemented on a control personal computer connected through a USB link to the base stations. To demonstrate the capability of the system to detect the user's activity, data recorded from a few subjects were used to train and test an automatic classifier for recognizing the type of exercise being performed. The system was tested on four different exercises performed by three people; combining the features extracted from acceleration and sEMG signals, the automatic classifier achieved an overall accuracy of 85.7%. A low-cost wireless system for the acquisition of sEMG and accelerometer signals has been presented for healthcare and fitness applications. The system consists of wearable sensing nodes that wirelessly transmit the biological and accelerometer signals to one or more base stations. The signals so acquired will be combined and processed in order to detect, monitor and recognize human activities.
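To make the classification stage concrete, a sketch of combining simple time-domain features from sEMG and accelerometer windows with an off-the-shelf classifier is shown below; the window features and the random-forest choice are assumptions, not the paper's pipeline:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(emg, acc):
    """Basic per-window features: sEMG RMS and accelerometer statistics."""
    return np.hstack([
        np.sqrt(np.mean(emg ** 2, axis=0)),  # RMS per sEMG channel
        acc.mean(axis=0), acc.std(axis=0),   # mean/std per accelerometer axis
    ])

def train_activity_classifier(emg_windows, acc_windows, labels):
    """emg_windows: list of (samples, channels) arrays; acc_windows: list of
    (samples, 3) arrays; labels: exercise type per window (all hypothetical)."""
    X = np.array([window_features(e, a)
                  for e, a in zip(emg_windows, acc_windows)])
    return RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
```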

56 citations


Journal ArticleDOI
TL;DR: This study shows that a fast 2-dimensional landmark search can be useful for 3D localization, which could save computational time compared with a full-volume analysis, and confirms that by using CBCT for cephalometry there are no projection distortions, and the full structure information of a virtual patient is manageable on a personal computer.

Journal ArticleDOI
TL;DR: If these associations are causal, the greatest benefits from health promotion interventions to reduce discretionary screen time may be seen in those with low levels of strength, fitness and physical activity.
Abstract: Discretionary screen time (time spent viewing a television or computer screen during leisure time) is an important contributor to total sedentary behaviour, which is associated with increased risk of mortality and cardiovascular disease (CVD). The aim of this study was to determine whether the associations of screen time with cardiovascular disease and all-cause mortality were modified by levels of cardiorespiratory fitness, grip strength or physical activity. In total, 390,089 participants (54% women) from the UK Biobank were included in this study. All-cause mortality, CVD and cancer incidence and mortality were the main outcomes. Discretionary television (TV) viewing, personal computer (PC) screen time and overall screen time (TV + PC time) were the exposure variables. Grip strength, fitness and physical activity were treated as potential effect modifiers. Altogether, 7420 participants died, and there were 22,210 CVD events, over a median of 5.0 years follow-up (interquartile range 4.3 to 5.7; after exclusion of the first 2 years from baseline in the landmark analysis). All discretionary screen-time exposures were significantly associated with all health outcomes. The associations of overall discretionary screen time with all-cause mortality and incidence of CVD and cancer were strongest amongst participants in the lowest tertile for grip strength (all-cause mortality hazard ratio per 2-h increase in screen time 1.31 [95% confidence interval: 1.22–1.43], p < 0.0001; CVD 1.21 [1.13–1.30], p = 0.0001; cancer incidence 1.14 [1.10–1.19], p < 0.0001) and weakest amongst those in the highest grip-strength tertile (all-cause mortality 1.04 [0.95–1.14], p = 0.198; CVD 1.05 [0.99–1.11], p = 0.070; cancer 0.98 [0.93–1.05], p = 0.771). Similar trends were found for fitness (lowest fitness tertile: all-cause mortality 1.23 [1.13–1.34], p = 0.002 and CVD 1.10 [1.02–1.22], p = 0.010; highest fitness tertile: all-cause mortality 1.12 [0.96–1.28], p = 0.848 and CVD 1.01 [0.96–1.07], p = 0.570). Similar findings were found for physical activity for all-cause mortality and cancer incidence. The associations between discretionary screen time and adverse health outcomes were strongest in those with low grip strength, fitness and physical activity and markedly attenuated in those with the highest levels of grip strength, fitness and physical activity. Thus, if these associations are causal, the greatest benefits from health promotion interventions to reduce discretionary screen time may be seen in those with low levels of strength, fitness and physical activity.

Journal ArticleDOI
TL;DR: In this article, a new technique is proposed for reformulating the rank constraints using both principal and non-principal 2-by-2 minors of the involved Hermitian matrix variable, and all such minors are characterized into three types.
Abstract: Alternating current optimal power flow (AC OPF) is one of the most fundamental optimization problems in electrical power systems. It can be formulated as a semidefinite program (SDP) with rank constraints. Solving AC OPF, that is, obtaining near optimal primal solutions as well as high quality dual bounds for this non-convex program, presents a major computational challenge to today’s power industry for the real-time operation of large-scale power grids. In this paper, we propose a new technique for reformulation of the rank constraints using both principal and non-principal 2-by-2 minors of the involved Hermitian matrix variable and characterize all such minors into three types. We show the equivalence of these minor constraints to the physical constraints of voltage angle differences summing to zero over three- and four-cycles in the power network. We study second-order conic programming (SOCP) relaxations of this minor reformulation and propose strong cutting planes, convex envelopes, and bound tightening techniques to strengthen the resulting SOCP relaxations. We then propose an SOCP-based spatial branch-and-cut method to obtain the global optimum of AC OPF. Extensive computational experiments show that the proposed algorithm significantly outperforms the state-of-the-art SDP-based OPF solver and on a simple personal computer is able to obtain on average a 0.71% optimality gap in no more than 720 s for the most challenging power system instances in the literature.
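The minor reformulation rests on a standard linear-algebra fact: a nonzero Hermitian positive semidefinite matrix W has rank one exactly when all of its 2-by-2 minors vanish. In generic notation (not necessarily the paper's), the principal minors tie magnitudes together, and over network cycles the non-principal minors yield the angle conditions mentioned in the abstract:

```latex
% Rank-one characterization via 2-by-2 minors (generic notation).
W \succeq 0,\; W \neq 0,\; \operatorname{rank}(W)=1
\;\Longleftrightarrow\;
W_{ii}\,W_{jj} = |W_{ij}|^2
\;\;\text{and}\;\;
W_{ij}\,W_{kl} = W_{il}\,W_{kj} \quad \forall\, i,j,k,l.

% With W = v v^{H} and v_i = |v_i| e^{\mathrm{i}\theta_i}, the minors over a
% three-cycle (i, j, k) reduce to the angle condition
\angle W_{ij} + \angle W_{jk} + \angle W_{ki} = 0 .
```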

Journal ArticleDOI
TL;DR: An IoT-based wireless polysomnography system for sleep monitoring is presented, utilizing a battery-powered, miniature, wireless, portable, and multipurpose recorder; the system can facilitate the long-term tracing and research of personal sleep monitoring at home and can be applied in practice.
Abstract: Polysomnography (PSG) is considered the gold standard in the diagnosis of obstructive sleep apnea (OSA). The diagnosis of OSA requires an overnight sleep experiment in a laboratory. However, due to limitations in the number of labs and beds available, patients often need to wait a long time before being diagnosed and eventually treated. In addition, the unfamiliar environment and restricted mobility when a patient is being tested with a polysomnogram may disturb their sleep, resulting in an incomplete or corrupted test. It is therefore posited that a PSG conducted in the patient’s home would be more reliable and convenient. The Internet of Things (IoT) plays a vital role in the e-Health system. In this paper, we implement an IoT-based wireless polysomnography system for sleep monitoring, which utilizes a battery-powered, miniature, wireless, portable, and multipurpose recorder. A Java-based PSG recording program on the personal computer is designed to save several bio-signals and transfer them into the European Data Format (EDF). These PSG records can be used to determine a patient’s sleep stages and diagnose OSA. This system is portable, lightweight, and has low power consumption. To demonstrate the feasibility of the proposed PSG system, a comparison was made between the standard PSG-Alice 5 Diagnostic Sleep System and the proposed system. Several healthy volunteers participated in the PSG experiment and were monitored by both the standard PSG-Alice 5 Diagnostic Sleep System and the proposed system simultaneously, under the supervision of specialists at the Sleep Laboratory in Taipei Veteran General Hospital. A comparison of the time-domain waveforms and sleep stages from the two systems shows that the proposed system is reliable and can be applied in practice. The proposed system can facilitate the long-term tracing and research of personal sleep monitoring at home.
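Because the recorder writes the European Data Format (EDF), recordings can be inspected with standard open-source tooling. A hedged sketch using the MNE-Python package (an assumption, not part of the paper's Java program; the file name is hypothetical):

```python
import mne  # assumes the mne package is installed

# Load an EDF polysomnogram and list basic properties.
raw = mne.io.read_raw_edf("overnight_psg.edf", preload=True)
print(raw.info["sfreq"], raw.ch_names[:5])  # sampling rate, first channels
```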

Journal ArticleDOI
01 Jan 2018
TL;DR: In this paper, a developed system was designed to work on a mobile device with the ability to detect four levels of ripeness of tomato and chili fruits: unripe 1, unripe 2, medium, and ripe.
Abstract: Manual fruit ripeness classification is highly influenced by operator subjectivity, so the classification process is inconsistent over some periods. Developments in information technology allow fruit identification based on color characteristics with computer aids. A developed system was designed to work on a mobile device with the ability to detect four levels of ripeness of tomato and chili fruits. The acquisition of training data is done with a new approach: training data came from objective observation of the same tomato and chili fruits, captured from one month before harvesting until the harvesting period. Image segmentation uses the K-Means clustering method, while ripeness detection uses fuzzy logic. The system output consists of the type and level of ripeness grouped into four categories: unripe 1, unripe 2, medium, and ripe. This article explains preliminary results of testing the system in a static and partial condition using a personal computer before it is applied in a mobile-based integrated system. The results showed a fruit segmentation success rate of 80% for tomato and 100% for chili. The errors are due to similarity in fruit sample size. The success rate for detecting fruit ripeness is 80% for tomato and 90% for chili. With 10 training samples of each, the overall average ripeness-detection accuracy is 85%.
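As a flavor of the segmentation step, K-Means clustering of pixel colors can be written in a few lines with scikit-learn; this stands in for the authors' mobile implementation, and the cluster count and RGB color space are illustrative assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans

def segment_colors(image_rgb, n_clusters=3, seed=0):
    """Cluster pixels by color and return a per-pixel label map."""
    h, w, _ = image_rgb.shape
    pixels = image_rgb.reshape(-1, 3).astype(float)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
    labels = km.fit_predict(pixels)
    return labels.reshape(h, w)
```

The ripeness decision would then map cluster color statistics through fuzzy membership functions, which are not sketched here.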

DOI
24 Apr 2018
TL;DR: The study involved several testing scenarios exploring possible increases in capacity and ambiguity within steganography, adopting 1-, 2-, 3- and 4-least-significant-bit stego-system choices.
Abstract: A security system for hiding sensitive text data on a personal computer (PC) is implemented. The system uses normal multimedia image-based steganography, replacing the pixels' least significant bits with the text to be hidden. The study involved several testing scenarios exploring possible increases in capacity and ambiguity within steganography, adopting 1-, 2-, 3- and 4-least-significant-bit stego-system choices. The design novelty is in providing full security information to the user, who selects an appropriate cover image from the PC based on the security priority. The technique allows the PC user to test multi-bit steganography on several images for hiding the same sensitive texts. The user can then prefer one image to be used as the cover image, knowing the desired capacity and security priority. The study demonstrates data dependency and its security property through experiments on 35 fixed-size PC images, showing remarkable results.
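A standard way to quantify the cover-image distortion that grows with the number of substituted bits is peak signal-to-noise ratio (PSNR); whether this is the paper's exact measure is an assumption, but a minimal NumPy version is:

```python
import numpy as np

def psnr(cover, stego):
    """Peak signal-to-noise ratio (dB) between uint8 cover and stego images."""
    mse = np.mean((cover.astype(float) - stego.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(255.0 ** 2 / mse)
```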

Journal ArticleDOI
TL;DR: This study illustrates how user-centered design and service design can be applied to identify and incorporate essential stakeholder aspects throughout the design and development process, and how this facilitated development of a stress management intervention truly designed for the end users, in this case cancer survivors.
Abstract: Background: Distress is prevalent in cancer survivors. Stress management interventions can reduce distress and improve quality of life for cancer patients, but many people with cancer are unfortunately not offered or able to attend such in-person stress management interventions. Objective: The objective of this study was to develop an evidence-based stress management intervention for patients living with cancer that can be delivered electronically with wide reach and dissemination. This paper describes the design and development process of a technology-based stress management intervention for cancer survivors, including the exploration phase, intervention content development, iterative software development (including design, development, and formative evaluation of low- and high-level prototypes), and security and privacy considerations. Methods: Design and development processes were iterative and performed in close collaboration with key stakeholders (N=48). In the exploration phase, identifying needs and requirements for the intervention, 28 participants gave input, including male and female cancer survivors (n=11) representing a wide age range (31-81 years) and cancer diagnoses, healthcare providers (n=8) including psychosocial oncology experts, and eHealth experts (n=9) including information technology designers and developers. To ensure user involvement in each phase, various user-centered design and service design methods were included, such as interviews, usability testing, and think-aloud processes. Overall, participants were involved in usability testing in the software development and formative evaluation phase, including cancer survivors (n=6), healthy volunteers (n=7), health care providers (n=2), and eHealth experts (n=5). Intervention content was developed by stress management experts based on well-known cognitive behavioral stress management strategies and adjusted to electronic format through multiple iterations with stakeholders. Privacy and security issues were considered throughout. Results: The design and development process identified a variety of stakeholder requirements. Cancer survivors preferred stress management through a mobile app rather than through a personal computer (PC) and identified usefulness, easy access, user friendliness, use of easily understandable language, and many brief sections rather than longer ones as important components of the intervention. These requirements were also supported by recommendations from health care providers and eHealth experts. The final intervention was named StressProffen, and the hospital Privacy and Security Protection Committee was part of the final intervention approval, to ensure anchoring in the hospital organization. Conclusions: Interventions, even evidence-based ones, have little impact if not actively used. This study illustrates how user-centered design and service design can be applied to identify and incorporate essential stakeholder aspects throughout the design and development process. In combination with evidence-based concepts, this process facilitated development of a stress management intervention truly designed for the end users, in this case, cancer survivors. Trial Registration: ClinicalTrials.gov NCT02939612; https://clinicaltrials.gov/ct2/show/NCT02939612 (Archived at WebCite at http://www.webcitation.org/71l9HcfcB)

Proceedings ArticleDOI
13 Mar 2018
TL;DR: This system helps the user to control the sources of energy manually and remotely using a smartphone or personal computer, and it is efficient, cheap and flexible in operation.
Abstract: In this paper, the authors focus on controlling a hybrid energy system using IoT. There are various energy sources that serve as alternatives to one another, such as solar energy, wind energy, biofuel, and fuel cells. The need to control a hybrid energy system arises when it is installed for domestic or commercial purposes, and at this point IoT plays an important role in the control system. The main task is switching between the two sources of energy, i.e., solar and wind energy, without any inconvenience, through a website using an ESP8266 Wi-Fi module. The data is transmitted wirelessly through the website to the ESP8266 module, which controls the sources of energy. The transmitted data is controlled remotely using IoT. This enables the user to have a flexible control mechanism remotely through a secured internet web connection. This system helps the user to control the sources of energy manually and remotely using a smartphone or personal computer. The system is efficient, cheap and flexible in operation.

Journal ArticleDOI
TL;DR: The proposed hybrid algorithm shows that a fast initial 2-dimensional landmark search can be useful for more accurate 3D annotation and could save computational time compared with a full-volume analysis, and demonstrates that full bone structures from CBCT are manageable on a personal computer for modern 3D cephalometry.

Journal ArticleDOI
TL;DR: A globally optimal solution to an important problem: given a real-world route, what is the most energy-efficient way to drive a vehicle from the origin to the destination within a certain period of time?
Abstract: This paper provides a globally optimal solution to an important problem: given a real-world route, what is the most energy-efficient way to drive a vehicle from the origin to the destination within a certain period of time? Along the route, there may be multiple stop signs, traffic lights, turns and curved segments, roads with different grades and speed limits, and even leading vehicles with speed profiles known in advance. Most such route information and features are actually constraints in the optimal vehicle speed control problem, but these constraints are described in two different domains. The most important concept in solving this problem is to convert the distance-domain route constraints into time-domain state and input constraints that can be handled by optimization methods such as dynamic programming (DP). Multiple techniques, including cost-to-go function interpolation and parallel computing, are used to reduce the computation of DP and make the problem solvable within a reasonable amount of time on a personal computer.
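The conversion the abstract describes can be illustrated with a toy position-grid DP: a speed limit defined over distance becomes a constraint on admissible position transitions per time step, which is exactly a time-domain constraint. Everything below (the grid sizes, the quadratic energy proxy, the function name) is an illustrative assumption, not the paper's vehicle model:

```python
import numpy as np

def min_energy_profile(D, T, speed_limit, dx=5.0, dt=1.0):
    """Reach distance D in T steps of dt while respecting a
    distance-domain limit speed_limit(x); cost is a v^2 energy proxy."""
    xs = np.arange(0.0, D + dx, dx)
    n = len(xs)
    INF = float("inf")
    cost = np.full((T + 1, n), INF)
    cost[0, 0] = 0.0                      # start at the origin
    prev = np.zeros((T + 1, n), dtype=int)
    for t in range(T):
        for i in range(n):
            if cost[t, i] == INF:
                continue
            for j in range(i, n):
                v = (xs[j] - xs[i]) / dt
                if v > speed_limit(xs[i]):
                    break                 # distance-domain limit, mapped
                c = cost[t, i] + v ** 2 * dt
                if c < cost[t + 1, j]:
                    cost[t + 1, j] = c
                    prev[t + 1, j] = i
    # Backtrack the position trajectory ending at D (assumes feasibility).
    j = n - 1
    traj = [xs[j]]
    for t in range(T, 0, -1):
        j = prev[t, j]
        traj.append(xs[j])
    return traj[::-1], cost[T, n - 1]
```

For example, min_energy_profile(100.0, 30, lambda x: 10.0 if x < 50.0 else 20.0) yields a trajectory that stays slow through the 10 m/s zone and spends its speed where the limit allows.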

Journal ArticleDOI
TL;DR: In this paper, an artificial neural network (ANN)-based formula was proposed to precisely compute the critical elastic buckling load of simply supported cellular beams under uniformly distributed vertical loads, and the maximum and average relative errors among the 3645 data points were found to be 3.7% and 0.4%, respectively, whereas the average computing time per data point is smaller than a millisecond for any current personal computer.
Abstract: Cellular beams are an attractive option for the steel construction industry due to their versatility in terms of strength, size, and weight. Further benefits are the integration of services thereby reducing ceiling-to-floor depth (thus, building’s height), which has a great economic impact. Moreover, the complex localized and global failures characterizing those members have led several scientists to focus their research on the development of more efficient design guidelines. This paper aims to propose an artificial neural network (ANN)-based formula to precisely compute the critical elastic buckling load of simply supported cellular beams under uniformly distributed vertical loads. The 3645-point dataset used in ANN design was obtained from an extensive parametric finite element analysis performed in ABAQUS. The independent variables adopted as ANN inputs are the following: beam’s length, opening diameter, web-post width, cross-section height, web thickness, flange width, flange thickness, and the distance between the last opening edge and the end support. The proposed model shows a strong potential as an effective design tool. The maximum and average relative errors among the 3645 data points were found to be 3.7% and 0.4%, respectively, whereas the average computing time per data point is smaller than a millisecond for any current personal computer.
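In the same spirit as the paper's ANN-based formula, a small feed-forward network can serve as a fast surrogate once a dataset of (geometry, critical load) pairs exists. The sketch below uses scikit-learn on synthetic stand-in data; the architecture, scaling, and fake target are assumptions, not the paper's fitted formula or its ABAQUS dataset:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Stand-in inputs for: length, opening diameter, web-post width, section
# height, web thickness, flange width, flange thickness, end distance.
X = rng.uniform(size=(1000, 8))
y = X @ rng.uniform(size=8) + 0.1 * rng.standard_normal(1000)  # fake target

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
)
model.fit(X, y)
print(model.predict(X[:3]))  # inference is sub-millisecond per point
```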

Book ChapterDOI
01 Jan 2018
TL;DR: This introduction is a friendly guide to quickly speaking the R language, covering topics from local installation of R on a personal computer to script writing for batch processing.
Abstract: The use of the software R is introduced such that the R code shown in the following chapters can be understood and repeated. This introduction is a friendly guide to quickly speaking the R language, covering topics from local installation of R on a personal computer to script writing for batch processing. More specifically, objects, operators, functions, indexes, conditions, graphics, and packages dedicated to sound are treated.

Journal ArticleDOI
TL;DR: In this paper, the authors provide quantitative evidence of natural disasters' (NDs) effect on corporate performance and study the mechanisms through which the supply chain moderates and mediates the link.
Abstract: Purpose: The purpose of this paper is to provide quantitative evidence of natural disasters' (NDs) effect on corporate performance and to study the mechanisms through which the supply chain moderates and mediates the link. Design/methodology/approach: Using two major NDs as quasi-experiments, namely the 2011 Japanese earthquake-tsunami (JET) and Thai flood (TF), and data over the period 2010Q1-2013Q4, the effect of these events on end assemblers’ performance is studied, with a focus on the personal computer (PC) supply chain. The moderating influence of delivery and sourcing – as supply chain flexibility and agility – is examined through end assemblers’ and suppliers’ inventory. The suppliers’ mediating role is captured as disruption in obtaining PC components through their sales. Findings: Only JET had any negative effect, further quantified as short-term and long-term. The TF instead portrays an insignificant but positive aftermath, which is construed as showing learning from experience and adaptability following JET. Inventory matters, but differently for the two events, and suppliers only exhibit a moderating influence on the assemblers’ disaster-performance link. Originality/value: NDs, as catastrophic vulnerabilities, are distinct from other vulnerabilities in that they are hard to predict and have significant impact. Since little is known about the impact of NDs on firm performance and how supply chain mechanisms moderate or mediate that impact, NDs should be modelled and empirically studied distinctly from other vulnerabilities. This paper sheds light on supply chain resilience to such events and the role of dynamic capabilities.

Journal ArticleDOI
TL;DR: A number of simple, yet effective, techniques that improve GA performance for TCT problems are demonstrated, the most effective of which is a novel problem encoding, based on weighted graphs, that enables the critical path problem to be partially solved for all candidate solutions a priori, thus significantly speeding up fitness evaluation.
Abstract: The Time/Cost Trade-off (TCT) problem has long been a popular optimization question for construction engineering and management researchers. The problem manifests itself as the optimization of total costs of construction projects that consist of indirect project costs and individual activity costs. The trade-off occurs as project duration and, as a result, indirect project costs decrease with reduced individual activity duration. This reduction in individual activity duration is achieved by increasing resource allocation to individual activities, which increases their costs to completion. Historically, metaheuristic solutions have been applied to small scale problems due to the computational complexities and requirements of larger networks. In this paper, we demonstrate that the metaheuristic approach is highly effective for solving large scale construction TCT problems. A custom Genetic Algorithm (GA) is developed and used to solve large benchmark networks of up to 630 variables with high levels of accuracy (<3% deviation) consistently, using the computational power of a personal computer in under ten minutes. The same method can also be used to solve larger networks of up to 6,300 variables with reasonable accuracy (∼7% deviation) at the expense of longer processing times. A number of simple, yet effective, techniques that improve GA performance for TCT problems are demonstrated, the most effective of which is a novel problem encoding, based on weighted graphs, that enables the critical path problem to be partially solved for all candidate solutions a priori, thus significantly speeding up fitness evaluation. Other improvements include parallel fitness evaluations, optimal algorithm parameters, and the addition of a stagnation criterion. We also present some guidelines for optimal algorithm parameter selection through a comprehensive parameter sweep and a computational demand profile analysis. Moreover, the methods proposed in this article are based on open source development projects that enable scalable solutions without significant development effort. This information will be beneficial for other researchers in improving the computational efficiency of their solutions for addressing TCT problems.
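To make the GA structure concrete, here is a minimal skeleton in which each gene selects a (duration, cost) crashing option per activity; a serial activity chain replaces a real network so no critical-path computation is needed, and the paper's weighted-graph encoding, parallel evaluation, and stagnation criterion are deliberately not reproduced:

```python
import random

def ga_tct(options, deadline, indirect_rate,
           pop=50, gens=200, pmut=0.05, seed=0):
    """options[i]: list of (duration, cost) choices for activity i.
    Activities are assumed to run in series (no network)."""
    random.seed(seed)
    n = len(options)  # assumes at least two activities

    def fitness(ind):
        dur = sum(options[i][g][0] for i, g in enumerate(ind))
        cost = sum(options[i][g][1] for i, g in enumerate(ind))
        total = cost + indirect_rate * dur
        if dur > deadline:                 # penalize late schedules
            total += 1e6 * (dur - deadline)
        return total

    popn = [[random.randrange(len(o)) for o in options] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=fitness)
        elite = popn[: pop // 2]           # truncation selection
        children = []
        while len(children) < pop - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n)   # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n):             # per-gene mutation
                if random.random() < pmut:
                    child[i] = random.randrange(len(options[i]))
            children.append(child)
        popn = elite + children
    best = min(popn, key=fitness)
    return best, fitness(best)
```

A toy call such as ga_tct([[(5, 100), (3, 180)]] * 10, deadline=40, indirect_rate=20) returns the option vector that balances crashing costs against indirect costs.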

Journal ArticleDOI
TL;DR: A large number of commercially available m-health applications may facilitate self-management of hypertension by enhancing medication adherence, maintaining a log of blood pressure measurements, and facilitating physician-patient communication.
Abstract: Mobile-health technology, frequently referred to as m-health, encompasses smartphone, tablet, or personal computer use in the management of chronic disease. There has been a rise in the number of commercially available smartphone applications and website-based platforms which claim to help patients manage hypertension. Very little research has been performed confirming whether or not use of these applications results in improved blood pressure (BP) outcomes. In this paper, we review existing literature on m-health systems and how m-health can affect hypertension management. M-health systems help patients manage hypertension in the following ways: (1) setting alarms and reminders for patients to take their medications, (2) linking patients’ BP reports to their electronic medical record for their physicians to review, (3) providing feedback to patients about their BP trends, and (4) functioning as point-of-care BP sensors. M-health applications with alarms and reminders can increase medication compliance while applications that share ambulatory BP data with patients’ physicians can foster improved patient-physician dialog. However, the most influential tool for achieving positive BP outcomes appears to be patient-directed feedback about BP trends. A large number of commercially available m-health applications may facilitate self-management of hypertension by enhancing medication adherence, maintaining a log of blood pressure measurements, and facilitating physician-patient communication. A small number of applications function as BP sensors, thereby transforming the smartphone into a medical device. Such BP sensors often generate unreliable recordings. Patients must be cautioned regarding the use of smartphones for BP measurement at least until these applications have been more extensively validated.

Proceedings ArticleDOI
01 Aug 2018
TL;DR: IoT devices such as low-power sensors will be used to collect data from patients; the data will be displayed on an LCD and stored on a personal computer and in the cloud so that any actor in the system can refer to it.
Abstract: Health has become one of the global challenges for humanity. Cardiac diseases, lung failures and heart-related diseases are increasing at a rapid rate. Monitoring the health of elderly people at home or patients at hospitals is necessary, but it requires constant observation by practitioners and doctors. Information technology (IT) and its growing applications are playing a major role in making human life easier. The Internet of Things (IoT) is transforming healthcare and the role of IT in healthcare. IoT consists of physical devices, such as sensors and monitoring devices for patients (glucose, blood pressure, heart rate and activity monitoring, etc.), that connect to the internet and transform information from the physical world into the digital world. The proposed system, with the help of such IoT features, will help to keep the necessary details and reports of a patient organized and available to all actors in the system. IoT devices such as low-power sensors will be used to collect data from patients; the data will be displayed on an LCD and stored on a personal computer and also in the cloud so that any actor in the system can refer to it.

Journal ArticleDOI
Xiaopeng Gong, Shengfeng Gu, Yidong Lou, Fu Zheng, Maorong Ge, Jingnan Liu
TL;DR: Blocked QR factorization of the simulated matrix can greatly improve processing efficiency, by nearly two orders of magnitude, on a personal computer with four 3.30 GHz cores.
Abstract: Global navigation satellite systems (GNSS) are acting as an indispensable tool for geodetic research and global monitoring of the Earth, and they have been rapidly developed over the past few years with abundant GNSS networks, modern constellations, and significant improvement in mathematical models of data processing. However, due to the increasing number of satellites and stations, computational efficiency becomes a key issue, and it could hamper the further development of GNSS applications. In this contribution, this problem is overcome from the aspects of both dense linear algebra algorithms and GNSS processing strategy. First, in order to fully explore the power of modern microprocessors, a square root information filter solution based on blocked QR factorization, employing as many matrix–matrix operations as possible, is introduced. In addition, the algorithmic complexity of GNSS data processing is further decreased by centralizing the carrier-phase observations and ambiguity parameters, as well as performing real-time ambiguity resolution and elimination. Based on the QR factorization of a simulated matrix, we can conclude that compared to unblocked QR factorization, blocked QR factorization can greatly improve processing efficiency, by nearly two orders of magnitude, on a personal computer with four 3.30 GHz cores. Then, with 82 globally distributed stations, the processing efficiency is further validated in multi-GNSS (GPS/BDS/Galileo) satellite clock estimation. The results suggest that the unblocked method takes about 31.38 s per epoch, while, without any loss of accuracy, our new algorithm takes only 0.50 and 0.31 s per epoch for float and fixed clock solutions, respectively.
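The linear-algebra core, factorizing the information array with a blocked Householder QR (Level-3 BLAS matrix–matrix kernels, as in LAPACK's dgeqrf) and back-substituting instead of forming normal equations, can be demonstrated with SciPy, whose qr routine calls such a blocked factorization. The sizes are illustrative, and this shows the numerics only, not the authors' GNSS filter:

```python
import numpy as np
from scipy.linalg import qr, solve_triangular

rng = np.random.default_rng(0)
J = rng.standard_normal((2000, 200))   # stand-in design/information matrix
y = rng.standard_normal(2000)          # stand-in observations

# Economy QR via LAPACK's blocked Householder factorization.
Q, R = qr(J, mode="economic")
x = solve_triangular(R, Q.T @ y)       # back-substitution on the square root R

# Agrees with the normal-equation least-squares solution.
x_ref, *_ = np.linalg.lstsq(J, y, rcond=None)
print(np.allclose(x, x_ref))
```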

Journal ArticleDOI
TL;DR: In this paper, a micro paper-based analytical device (μPAD) for the simultaneous determination of fluoride and nitrite in real water samples was developed; a wax ink printer was used to create the hydrophilic and hydrophobic zones on the laboratory filter paper device.

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the benefits of introducing heating, CO2 supply, ventilation and LED lighting in a Chinese solar greenhouse and proposed a two-time-scale receding horizon optimal control system.

Book
18 Apr 2018
TL;DR: From the punch card calculating machine to the personal computer to the iPhone and more, this book offers a comprehensive introduction to digital media history for students and scholars across media and communication studies, providing an overview of the main turning points in digital media and highlighting the interactions between political, business, technical, social, and cultural elements throughout history.
Abstract: From the punch card calculating machine to the personal computer to the iPhone and more, this in-depth text offers a comprehensive introduction to digital media history for students and scholars across media and communication studies, providing an overview of the main turning points in digital media and highlighting the interactions between political, business, technical, social, and cultural elements throughout history. With a global scope and an intermedia focus, this book enables students and scholars alike to deepen their critical understanding of digital communication, adding an understudied historical layer to the examination of digital media and societies. Discussion questions, a timeline, and previously unpublished tables and maps are included to guide readers as they learn to contextualize and critically analyze the digital technologies we use every day.