scispace - formally typeset

Showing papers on "Personal computer published in 2013"


Journal ArticleDOI
TL;DR: The code SusHi is described, which calculates the cross sections p p / p p ¯ → ϕ + X in gluon fusion and bottom-quark annihilation in the SM and the MSSM, where ϕ is any of the neutral Higgs bosons within these models.

443 citations


Book ChapterDOI
01 Jan 2013
TL;DR: PUMA, a new method for detecting malicious Android applications through machine-learning techniques by analysing the extracted permissions from the application itself, is presented.
Abstract: The presence of mobile devices has increased in our lives, offering almost the same functionality as a personal computer. Android devices have appeared lately and, since then, the number of applications available for this operating system has increased exponentially. Google already has its Android Market where applications are offered and, as happens with every popular medium, it is prone to misuse. In fact, malware writers insert malicious applications into this market, but also into other alternative markets. Therefore, in this paper, we present PUMA, a new method for detecting malicious Android applications through machine-learning techniques by analysing the extracted permissions from the application itself.
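As a rough illustration of the permission-based approach PUMA takes, the sketch below turns requested permissions into binary feature vectors and classifies with a nearest-centroid rule. The permission list, training profiles, and classifier here are hypothetical stand-ins, not the authors' feature set or learning algorithm (the paper trains standard machine-learning classifiers over extracted permissions):

```python
# Toy sketch of permission-based malware detection in the spirit of PUMA.
# The permissions, training data, and nearest-centroid rule are illustrative
# assumptions, not the paper's actual classifiers.

PERMISSIONS = ["INTERNET", "SEND_SMS", "READ_CONTACTS", "ACCESS_FINE_LOCATION"]

def to_vector(app_permissions):
    """Binary feature vector: 1 if the app requests the permission."""
    return [1 if p in app_permissions else 0 for p in PERMISSIONS]

def centroid(vectors):
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

def classify(vec, benign_centroid, malware_centroid):
    """Nearest-centroid decision by squared Euclidean distance."""
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return "malware" if d2(vec, malware_centroid) < d2(vec, benign_centroid) else "benign"

# Tiny hand-made training set (hypothetical permission profiles).
benign = [to_vector({"INTERNET"}), to_vector({"INTERNET", "ACCESS_FINE_LOCATION"})]
malware = [to_vector({"INTERNET", "SEND_SMS", "READ_CONTACTS"}),
           to_vector({"SEND_SMS", "READ_CONTACTS"})]

bc, mc = centroid(benign), centroid(malware)
print(classify(to_vector({"INTERNET", "SEND_SMS", "READ_CONTACTS"}), bc, mc))  # -> malware
```

A real system would extract the permission list from each APK's manifest and use a trained classifier rather than hand-picked centroids.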

261 citations


Journal ArticleDOI
TL;DR: While sociodemographic differences are more influential, device type can increase the likelihood of use for some “capital enhancing” activities, but only for a computer; thus, although mobile Internet is available for those on the wrong side of the digital divide, these users do not engage in many activities, decreasing potential benefits.
Abstract: Digital inequality can take many forms. Four forms studied here are access to Internet, use of different devices, extent of usage, and engagement in different Internet activities. However, it is not clear whether sociodemographic factors, or devices, are more influential in usage and activities. Results from an unfamiliar context show that there are significant sociodemographic influences on access, device, usage, and activities, and differences in activities by device type and usage. While sociodemographic differences are more influential, device type can increase likelihood of use for some “capital enhancing” activities, but only for a computer. Thus, although mobile Internet is available for those on the wrong side of the digital divide, these users do not engage in many activities, decreasing potential benefits.

256 citations


Journal ArticleDOI
TL;DR: Quokka as discussed by the authors is a freely available, fast 3-D solar cell simulation tool; simplifications to the full set of charge carrier transport equations, i.e., quasi-neutrality and conductive boundaries, make the model computationally inexpensive without a loss of generality.
Abstract: Details of Quokka, which is a freely available fast 3-D solar cell simulation tool, are presented. Simplifications to the full set of charge carrier transport equations, i.e., quasi-neutrality and conductive boundaries, result in a model that is computationally inexpensive without a loss of generality. Details on the freely available finite volume implementation in MATLAB are given, which shows computation times on the order of seconds to minutes for a full I-V curve sweep on a conventional personal computer. As an application example, the validity of popular analytical models of partial rear contact cells is verified under varying conditions. Consequently, it is observed that significant errors can occur if these analytical models are used to derive local recombination properties from effective lifetime measurements of test structures.
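For intuition about the finite-volume machinery behind tools like Quokka, here is a minimal 1-D sketch: a steady diffusion equation discretized into a tridiagonal linear system and solved with the Thomas algorithm. Quokka itself solves the full 3-D charge-carrier problem in MATLAB; the grid, unit diffusivity, and Dirichlet boundaries below are purely illustrative assumptions (and the interior stencil is the same for a uniform finite-volume or finite-difference discretization):

```python
# 1-D steady diffusion d/dx(D du/dx) = 0 with unit D and fixed boundary
# values, discretized on a uniform grid into a tridiagonal system.
# Illustrative only; not Quokka's 3-D model.

def solve_tridiagonal(a, b, c, d):
    """Thomas algorithm: a = sub-, b = main, c = super-diagonal, d = rhs."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def steady_diffusion(n, u_left, u_right):
    """n interior nodes; stencil -u[i-1] + 2u[i] - u[i+1] = 0."""
    a, b, c, d = [-1.0] * n, [2.0] * n, [-1.0] * n, [0.0] * n
    d[0] += u_left        # Dirichlet boundary values folded into the RHS
    d[-1] += u_right
    return solve_tridiagonal(a, b, c, d)

u = steady_diffusion(5, 1.0, 0.0)   # exact solution is linear in x
```

The exact solution is a straight line between the boundary values, which makes the solver easy to verify.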

227 citations


Journal ArticleDOI
TL;DR: miRDeep*, an integrated miRNA identification tool modeled on miRDeep, improves the precision of detecting novel miRNAs by introducing new strategies to identify precursor miRNAs; it has a user-friendly graphical interface and accepts raw data in FastQ and Sequence Alignment Map (SAM) or the binary equivalent (BAM) format.
Abstract: miRDeep and its varieties are widely used to quantify known and novel micro RNA (miRNA) from small RNA sequencing (RNAseq). This article describes miRDeep*, our integrated miRNA identification tool, which is modeled off miRDeep, but the precision of detecting novel miRNAs is improved by introducing new strategies to identify precursor miRNAs. miRDeep* has a user-friendly graphic interface and accepts raw data in FastQ and Sequence Alignment Map (SAM) or the binary equivalent (BAM) format. Known and novel miRNA expression levels, as measured by the number of reads, are displayed in an interface, which shows each RNAseq read relative to the pre-miRNA hairpin. The secondary pre-miRNA structure and read locations for each predicted miRNA are shown and kept in a separate figure file. Moreover, the target genes of known and novel miRNAs are predicted using the TargetScan algorithm, and the targets are ranked according to the confidence score. miRDeep* is an integrated standalone application where sequence alignment, pre-miRNA secondary structure calculation and graphical display are purely Java coded. This application tool can be executed using a normal personal computer with 1.5 GB of memory. Further, we show that miRDeep* outperformed existing miRNA prediction tools using our LNCaP and other small RNAseq datasets. miRDeep* is freely available online at http://www.australianprostatecentre.org/research/software/mirdeep-star.

208 citations


Journal ArticleDOI
TL;DR: This article compares the data quality between two survey modes: self-administered web surveys conducted via personal computer and those conducted via mobile phones, and finds mobile web was associated with a lower completion rate, shorter length of open answers, and similar level of socially undesirable and non-substantive responses.
Abstract: The considerable growth in the number of smart mobile devices with a fast Internet connection provides new challenges for survey researchers. In this article, I compare the data quality between two survey modes: self-administered web surveys conducted via personal computer and those conducted via mobile phones. Data quality is compared based on five indicators: (a) completion rates, (b) response order effects, (c) social desirability, (d) non-substantive responses, and (e) length of open answers. I hypothesized that mobile web surveys would result in lower completion rates, stronger response order effects, and less elaborate answers to open-ended questions. No difference was expected in the level of reporting in sensitive items and in the rate of non-substantive responses. To test the assumptions, an experiment with two survey modes was conducted using a volunteer online access panel in Russia. As expected, mobile web was associated with a lower completion rate, shorter length of open answers, and similar level of socially undesirable and non-substantive responses. However, no stronger primacy effects in mobile web survey mode were found.

166 citations


Journal ArticleDOI
TL;DR: This study sought to confirm the test–retest reliability and validity of the National Center for Geriatrics and Gerontology functional assessment tool (NCGG‐FAT), a newly developed assessment of multidimensional neurocognitive function using a tablet personal computer (PC).
Abstract: Aim This study sought to confirm the test–retest reliability and validity of the National Center for Geriatrics and Gerontology functional assessment tool (NCGG-FAT), a newly developed assessment of multidimensional neurocognitive function using a tablet personal computer (PC). Methods This study included 20 community-dwelling older adults (9 females, aged 65–81 years). Participants were administered the NCGG-FAT twice, separated by approximately 30 days to determine test–retest reliability. To test the validity of the measure, participants underwent established neurocognitive measurements, including memory, attention, executive function, processing speed and visuospatial function within a week from the first administration of the NCGG-FAT. Results Test–retest reliability was in an acceptable range for each component of the NCGG-FAT, with intraclass correlation coefficients ranging from 0.764 to 0.942. Each task in the NCGG-FAT showed a moderate to high correlation with scores on widely-used conventional neurocognitive tests (r = 0.496 to 0.842). Conclusion We found that the NCGG-FAT using a tablet PC was reliable in a sample of community-dwelling older adults. The NCGG-FAT might be useful for cognitive screening in population-based samples and outcomes, enabling assessment of the effects of intervention on multidimensional cognitive function among older adults. Geriatr Gerontol Int 2013; 13: 860–866.
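The test-retest analysis above hinges on the intraclass correlation coefficient. As a sketch of how such ICCs are computed, the snippet below implements ICC(2,1) (two-way random effects, absolute agreement) from the standard ANOVA mean squares; the paper reports ICCs of 0.764 to 0.942 but does not state which ICC form it used, so this choice is an assumption:

```python
# ICC(2,1) from two-way ANOVA mean squares: one row per subject, one
# column per test session. The (2,1) form is an assumption; the paper
# does not specify which ICC variant was computed.

def icc_21(scores):
    n, k = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(row[j] for row in scores) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)     # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)     # between sessions
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ms_r = ss_rows / (n - 1)
    ms_c = ss_cols / (k - 1)
    ms_e = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Hypothetical scores for 4 subjects measured in 2 sessions.
print(round(icc_21([[1, 2], [2, 2], [3, 4], [4, 4]]), 3))  # -> 0.842
```

Perfect agreement across sessions yields an ICC of exactly 1.0, which is a convenient sanity check.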

154 citations


Journal ArticleDOI
TL;DR: The final goal of this work is to realize an upgraded application-specific integrated circuit that controls the microelectromechanical systems (MEMS) sensor and integrates the ASIP, allowing the MEMS gyro-plus-accelerometer sensor and the angular estimation system to be contained in a single package.
Abstract: This paper presents an application-specific integrated processor for an angular estimation system that works with 9-D inertial measurement units. The application-specific instruction-set processor (ASIP) was implemented on a field-programmable gate array and interfaced with a gyro-plus-accelerometer 6-D sensor and with a magnetic compass. Output data were recorded on a personal computer and also used to perform a live demo. During system modeling and design, it was chosen to represent angular position data with a quaternion and to use an extended Kalman filter as the sensor fusion algorithm. For this purpose, a novel two-stage filter was designed: The first stage uses accelerometer data, and the second one uses magnetic compass data for angular position correction. This allows flexibility, lower computational requirements, and robustness to magnetic field anomalies. The final goal of this work is to realize an upgraded application-specific integrated circuit that controls the microelectromechanical systems (MEMS) sensor and integrates the ASIP. This will allow the MEMS gyro-plus-accelerometer sensor and the angular estimation system to be contained in a single package; this system might optionally work with an external magnetic compass.
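The prediction step of such a quaternion-based filter integrates gyro rates into the attitude quaternion. Below is a minimal gyro-only propagation sketch (first-order update of q̇ = ½ q ⊗ ω with renormalization); the paper's two-stage EKF additionally corrects with accelerometer and compass data, which is omitted here, and the sample rate is an assumption:

```python
# Gyro-only quaternion propagation: q <- normalize(q + 0.5 * q*(0,w) * dt).
# This is only the prediction step; the paper's EKF correction stages
# (accelerometer, then magnetic compass) are not shown.
import math

def quat_mult(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def normalize(q):
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def propagate(q, omega, dt):
    """First-order attitude update from body angular rate omega (rad/s)."""
    dq = quat_mult(q, (0.0,) + tuple(omega))
    return normalize(tuple(qi + 0.5 * dqi * dt for qi, dqi in zip(q, dq)))

q = (1.0, 0.0, 0.0, 0.0)                    # identity attitude
for _ in range(1000):                       # 1 s of gyro samples at 1 kHz
    q = propagate(q, (0.0, 0.0, math.pi / 2), 0.001)
# After 1 s at pi/2 rad/s about z: a 90-degree rotation about the z axis.
```

The renormalization after each step keeps the quaternion on the unit sphere despite the first-order integration error.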

140 citations


Journal ArticleDOI
TL;DR: Turkish EFL teachers have little knowledge of certain software, experience difficulties using software programs, and suffer from a lack of technical and instructional support, although they have positive perceptions of computer integration and attitudes toward computer use.
Abstract: While research mainly focuses on the effectiveness of computer use and its contribution to the teaching of English as a foreign language (EFL), EFL teachers have received scant attention. Only a limited number of studies have been conducted regarding Turkish EFL teachers' perceptions of computer use for teaching EFL. Therefore, in the context of Turkish EFL teachers' perceptions of computer usage in learning and teaching, the current study aims to examine Turkish EFL teachers' knowledge of software and their reasons for personal computer use, including the attitudes and perceptions of self-confidence among teachers in integrating computers and the school climate and support with respect to the use of computers for teaching EFL. The sample group in the study consisted of 157 EFL teachers. Before descriptives were computed, a background questionnaire and survey that assessed the degree of knowledge about the computer software, the frequency of software use for personal purposes, the teachers' attitudes towa...

114 citations


Journal ArticleDOI
TL;DR: Studies of hypertension and other chronic conditions have shown improved health outcomes using mHealth applications that have undergone rigorous usability testing. Nonetheless, the inability of most electronic medical record systems to receive and process information from mobile devices continues to be a major impediment to realizing the full potential of mHealth technology.

109 citations


Patent
20 Nov 2013
TL;DR: In this paper, an information pushing system of a WiFi (wireless fidelity) terminal and an implementation method thereof are described; the system is suitable for all terminals with the WiFi function, such as mobile phones, PADs (personal access devices), and PCs (personal computers).
Abstract: The invention discloses an information pushing system of a WiFi (wireless fidelity) terminal and an implementation method thereof. The system comprises a WiFi signal collector, a WiFi signal processing module, a WiFi indoor positioning module, a customer activity analyzing and data mining module, a user personality file database, an advertisement information storage database, an advertisement putting and matching engine, and a WiFi logging portal module. The system is suitable for all terminals with the WiFi function, such as mobile phones, PADs (personal access devices) and PCs (personal computers), and different software platforms, such as Android, WP, iOS, Windows, Mac OS and Linux; the WiFi terminals do not need any software modification or updating, and the applicable range is wide. The likely preferences of a consumer are judged through data analysis and mining, and real-time, on-site precise marketing is realized for target users, so that, on one hand, the experience of the user consuming in business places is improved, and on the other hand, the commercial value for sellers and factories is improved.

Journal ArticleDOI
12 Jul 2013-PLOS ONE
TL;DR: It is demonstrated that the proposed algorithm, called SMETANA, outperforms many state-of-the-art network alignment techniques, in terms of computational efficiency, alignment accuracy, and scalability.
Abstract: In this paper we introduce an efficient algorithm for alignment of multiple large-scale biological networks. In this scheme, we first compute a probabilistic similarity measure between nodes that belong to different networks using a semi-Markov random walk model. The estimated probabilities are further enhanced by incorporating the local and the cross-species network similarity information through the use of two different types of probabilistic consistency transformations. The transformed alignment probabilities are used to predict the alignment of multiple networks based on a greedy approach. We demonstrate that the proposed algorithm, called SMETANA, outperforms many state-of-the-art network alignment techniques in terms of computational efficiency, alignment accuracy, and scalability. Our experiments show that SMETANA can easily align tens of genome-scale networks with thousands of nodes on a personal computer without any difficulty. The source code of SMETANA is available upon request and can be downloaded from http://www.ece.tamu.edu/~bjyoon/SMETANA/.
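To give a feel for cross-network node-similarity scores of the kind SMETANA starts from, the toy sketch below iteratively averages a seed similarity over neighbor pairs (an IsoRank-style propagation). This is much simpler than the paper's semi-Markov random walk and consistency transformations; the graphs, seed, and damping factor are all hypothetical:

```python
# Toy cross-network node-similarity propagation (IsoRank-style averaging),
# NOT the semi-Markov random walk model used by SMETANA. Two nodes are
# similar if their neighbors are similar, anchored by a seed matrix s0.

def neighbors(adj, u):
    return [v for v, e in enumerate(adj[u]) if e]

def propagate_similarity(adj1, adj2, s0, alpha=0.8, iters=50):
    n1, n2 = len(adj1), len(adj2)
    s = [row[:] for row in s0]
    for _ in range(iters):
        new = [[0.0] * n2 for _ in range(n1)]
        for u in range(n1):
            for v in range(n2):
                nu, nv = neighbors(adj1, u), neighbors(adj2, v)
                if nu and nv:
                    avg = sum(s[a][b] for a in nu for b in nv) / (len(nu) * len(nv))
                else:
                    avg = 0.0
                new[u][v] = (1 - alpha) * s0[u][v] + alpha * avg
        s = new
    return s

path = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]   # 3-node path graph, both networks
seed = [[1.0 if (i, j) == (0, 0) else 0.0 for j in range(3)] for i in range(3)]
s = propagate_similarity(path, path, seed)
```

With a seed anchoring node 0 to node 0, the propagation concentrates similarity on structurally consistent pairs while pairs with no supporting seed decay toward zero.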

Journal ArticleDOI
06 Mar 2013
TL;DR: Real-time CKC identified slightly fewer MUs than its batch version (experimental EMG, 4 MUs versus 5 MUs identified by batch CKC, on average), but required only 0.6 s of processing time on a regular personal computer for each second of multichannel surface EMG.
Abstract: This study addresses online decomposition of high-density surface electromyograms (EMG) in real time. The proposed method is based on the previously published Convolution Kernel Compensation (CKC) technique and shares the same decomposition paradigm, i.e., compensation of motor unit action potentials and direct identification of motor unit (MU) discharges. In contrast to the previously published version of CKC, which operates in batch mode and requires ~10 s of EMG signal, the real-time implementation begins with batch processing of ~3 s of the EMG signal in the initialization stage and continues with iterative updating of the estimators of MU discharges as blocks of new EMG samples become available. Its detailed comparison to the previously validated batch version of CKC and the asymptotically Bayesian optimal linear minimum mean square error (LMMSE) estimator demonstrates high agreement in identified MU discharges among all three techniques. In the case of synthetic surface EMG with a 20 dB signal-to-noise ratio, MU discharges were identified with an average sensitivity of 98%. In the case of experimental EMG, real-time CKC fully converged after the initial 5 s of EMG recordings, and real-time and batch CKC agreed on 90% of MU discharges, on average. The real-time CKC identified slightly fewer MUs than its batch version (experimental EMG, 4 MUs versus 5 MUs identified by batch CKC, on average), but required only 0.6 s of processing time on a regular personal computer for each second of multichannel surface EMG.

Journal ArticleDOI
TL;DR: Manifest analysis for malware detection in Android (MAMA) is presented, a new method that extracts several features from the Android manifest of applications to build machine-learning classifiers and detect malware.
Abstract: The use of mobile phones has increased because they offer nearly the same functionality as a personal computer. In addition, the number of applications available for Android-based mobile devices has increased. Google offers programmers the opportunity to upload and sell applications in the Android Market, but malware writers upload their malicious code there as well. In light of this background, we present here manifest analysis for malware detection in Android (MAMA), a new method that extracts several features from the Android manifest of the applications to build machine learning classifiers and detect malware.

Patent
07 Nov 2013
TL;DR: In this article, a method and system for crowdsourcing and managing contact and profile information of a user's contacts, and exchanging business and personal contact information through a mobile device, personal computer, or a web application is presented.
Abstract: A method and system for crowdsourcing and managing contact and profile information of a user's contacts, and exchanging business and personal contact information through a mobile device, personal computer, or a web application. The system comprises a crowdsourcing intelligence module that provides the software, analysis, and algorithms for automatically populating and updating an individual's contact information in a user's address book based on contributed information and changes made to the individual's profile by a large community of users. The module also automatically populates and updates business profile, captures business's external social and business profiles, and analyzes demographic information which can then be transmitted to users. Users may also search for job opportunities, review and purchase products and services, review the location of contacts in proximity to the user, and manage sales and account activities including lead generation, lead qualification, and better understanding their customer base.

Journal ArticleDOI
TL;DR: The extensive use of HCD and ETD spectral information and the pDAG algorithm make pNovo+ an excellent de novo sequencing tool, and it is verified that the antisymmetry restriction is unnecessary for high-resolution, high-mass-accuracy data.
Abstract: De novo peptide sequencing is the only tool for extracting peptide sequences directly from tandem mass spectrometry (MS) data without any protein database. However, neither the accuracy nor the efficiency of de novo sequencing has been satisfactory, mainly due to incomplete fragmentation information in experimental spectra. Recent advancement in MS technology has enabled acquisition of higher energy collisional dissociation (HCD) and electron transfer dissociation (ETD) spectra of the same precursor. These spectra contain complementary fragmentation information and can be collected with high resolution and high mass accuracy. Taking these advantages, we have developed a new algorithm called pNovo+, which greatly improves the accuracy and speed of de novo sequencing. On tryptic peptides, 86% of the topmost candidate sequences deduced by pNovo+ from HCD + ETD spectral pairs matched the database search results, and the success rate reached 95% if the top three candidates were included, which was much higher than using only HCD (87%) or only ETD spectra (57%). On Asp-N, Glu-C, or Elastase digested peptides, 69−87% of the HCD + ETD spectral pairs were correctly identified by pNovo+ among the topmost candidates, or 84−95% among the top three. On average, it takes pNovo+ only 0.018 s to extract the sequence from a spectrum or spectral pair on a common personal computer. This is more than three times as fast as other de novo sequencing programs. The increase of speed is mainly due to pDAG, a component algorithm of pNovo+. pDAG finds the k longest paths in a directed acyclic graph without the antisymmetry restriction. We have verified that the antisymmetry restriction is unnecessary for high resolution, high mass accuracy data. The extensive use of HCD and ETD spectral information and the pDAG algorithm make pNovo+ an excellent de novo sequencing tool.
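The core of pDAG is a longest-path search in a directed acyclic graph. The sketch below shows the k = 1 case: dynamic programming over a topological order (pNovo+ keeps the k best partial paths per node instead of one). The tiny graph standing in for a spectrum graph is hypothetical:

```python
# Longest path in a weighted DAG by dynamic programming over a topological
# order -- the core idea behind pDAG's k-longest-path search, shown here
# for k = 1 only.
from collections import defaultdict

def longest_path(n, edges):
    """n nodes 0..n-1; edges = [(u, v, weight)] with u < v (topological)."""
    adj = defaultdict(list)
    for u, v, w in edges:
        adj[u].append((v, w))
    best = [float("-inf")] * n
    pred = [None] * n
    best[0] = 0.0
    for u in range(n):            # nodes are already in topological order
        if best[u] == float("-inf"):
            continue
        for v, w in adj[u]:
            if best[u] + w > best[v]:
                best[v] = best[u] + w
                pred[v] = u
    path, node = [], n - 1        # backtrack from the sink
    while node is not None:
        path.append(node)
        node = pred[node]
    return best[n - 1], path[::-1]

# Tiny spectrum-graph stand-in: node 0 = start, node 4 = end.
score, path = longest_path(5, [(0, 1, 2.0), (0, 2, 1.0), (1, 3, 2.0),
                               (2, 3, 4.0), (3, 4, 1.0)])
# score = 6.0, path = [0, 2, 3, 4]
```

Extending this to the k longest paths (as pDAG does, without the antisymmetry restriction) amounts to keeping a sorted list of the k best scores and predecessors at each node.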

Journal ArticleDOI
TL;DR: In this article, a two-layer sediment biogeochemical model (aerobic and anaerobic) was developed to understand the processes regulating porewater profiles and sediment-water exchanges.
Abstract: Sediment-water exchanges of nutrients and oxygen play an important role in the biogeochemistry of shallow coastal environments. Sediments process, store, and release particulate and dissolved forms of carbon and nutrients, and sediment-water solute fluxes are significant components of nutrient, carbon, and oxygen cycles. Consequently, sediment biogeochemical models of varying complexity have been developed to understand the processes regulating porewater profiles and sediment-water exchanges. We have calibrated and validated a two-layer (aerobic and anaerobic) sediment biogeochemical model that is suitable for application as a stand-alone tool or coupled to water-column biogeochemical models. We calibrated and tested a stand-alone version of the model against observations of sediment-water flux, porewater concentrations, and process rates at 12 stations in Chesapeake Bay during a 4-17 year period. The model successfully reproduced sediment-water fluxes of ammonium (NH4+), nitrate (NO3−), phosphate (PO43−), and dissolved silica (Si(OH)4 or DSi) for diverse chemical and physical environments. A root mean square error (RMSE)-minimizing optimization routine was used to identify best-fit values for many kinetic parameters. The resulting simulations improved the performance of the model in Chesapeake Bay and revealed (1) the need for an aerobic-layer denitrification formulation to account for NO3− reduction in this zone, (2) regional variability in denitrification that depends on oxygen levels in the overlying water, (3) a regionally-dependent solid-solute PO43− partitioning that accounts for patterns in Fe availability, and (4) a simplified model formulation for DSi, including limited sorption of DSi onto iron oxyhydroxides. This new calibration balances the need for a universal set of parameters that remain true to biogeochemical processes with site-specificity that represents differences in physical conditions.
This stand-alone model can be rapidly executed on a personal computer and is well-suited to complement observational studies in a wide range of environments.

Patent
14 Mar 2013
TL;DR: In this article, a method, system and computer program product for managing transportation and storage of goods, including a personal computer device; and a package control device associated with a package, is presented.
Abstract: A method, system and computer program product for managing transportation and storage of goods, including a personal computer device; and a package control device associated with a package. The personal computer device is configured to store transport and storage operations information regarding the package in the package control device. The package control device is configured for identifying an operator responsible for at least one of transport and storage of the package based on the transport and storage operations information. The package control device is configured to allow the operator to have access to the package for at least one of transport and storage of the package based on the transport and storage operations information.

Journal ArticleDOI
TL;DR: An alternative compressed-sensing algorithm, L1-Homotopy (L1H), is demonstrated that can generate super-resolution image reconstructions essentially identical to those derived using interior point methods, in one to two orders of magnitude less time depending on the emitter density.
Abstract: In super-resolution imaging techniques based on single-molecule switching and localization, the time to acquire a super-resolution image is limited by the maximum density of fluorescent emitters that can be accurately localized per imaging frame. In order to increase the imaging rate, several methods have been recently developed to analyze images with higher emitter densities. One powerful approach uses methods based on compressed sensing to increase the analyzable emitter density per imaging frame by several-fold compared to other reported approaches. However, the computational cost of this approach, which uses interior point methods, is high, and analysis of a typical 40 µm x 40 µm field-of-view super-resolution movie requires thousands of hours on a high-end desktop personal computer. Here, we demonstrate an alternative compressed-sensing algorithm, L1-Homotopy (L1H), which can generate super-resolution image reconstructions that are essentially identical to those derived using interior point methods in one to two orders of magnitude less time depending on the emitter density. Moreover, for an experimental data set with varying emitter density, L1H analysis is ~300-fold faster than interior point methods. This drastic reduction in computational time should allow the compressed sensing approach to be routinely applied to super-resolution image analysis.
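The compressed-sensing step at the heart of both approaches solves a sparsity-penalized least-squares problem. The sketch below uses iterative soft-thresholding (ISTA), a much simpler solver than either the interior point methods or the L1-Homotopy algorithm in the paper, to illustrate the objective being solved; the matrix, data, and parameters are toy assumptions:

```python
# Minimal ISTA sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
# The paper's L1-Homotopy solver is a different (and far faster) algorithm;
# this only illustrates the sparse-recovery objective it optimizes.

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def matvec_t(A, y):
    return [sum(A[i][j] * y[i] for i in range(len(A))) for j in range(len(A[0]))]

def soft(v, t):
    """Soft-thresholding: shrink each component toward zero by t."""
    return [max(abs(vi) - t, 0.0) * (1 if vi > 0 else -1) for vi in v]

def ista(A, b, lam, step, iters=500):
    x = [0.0] * len(A[0])
    for _ in range(iters):
        r = [yi - bi for yi, bi in zip(matvec(A, x), b)]   # residual Ax - b
        g = matvec_t(A, r)                                 # gradient A^T r
        x = soft([xi - step * gi for xi, gi in zip(x, g)], step * lam)
    return x

A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [1.0, 0.0, 1.0]          # generated from the sparse signal x = [1, 0]
x = ista(A, b, lam=0.01, step=0.3)
```

The recovered x is approximately [0.995, 0]: the L1 penalty keeps the second component exactly zero while introducing a small shrinkage bias (λ/2) on the first.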

Journal ArticleDOI
Mike Sharples1
TL;DR: With the introduction of personal computer devices such as smartphones and tablets into school classrooms, teachers have the added task of orchestrating complex dynamic systems of students interacting with networked technologies.
Abstract: With the introduction of personal computer devices such as smartphones and tablets into school classrooms, teachers have the added task of orchestrating complex dynamic systems of students interacting with networked technologies. One approach to managing the interactions is to add an orchestration system that enables a teacher to view and control each student device. An alternative is to share responsibility for orchestration between the teacher, the students, and the technology. In this form of orchestration, the teacher and all the students have similar computer toolkits designed to guide the students through a productive learning activity. An advantage is that the teacher can hand over control to the students to continue the learning activity outside the classroom, assisted by the Activity Guide software.

Journal ArticleDOI
TL;DR: The CAALYX system can have a clear impact in increasing older persons' autonomy, by ensuring that they do not need to leave their preferred environment in order to be properly monitored and taken care of.

Journal ArticleDOI
TL;DR: Wearable technology has the potential to enhance medical education and patient safety once widely available and medical institutions should work on policies regarding the use of such technologies to enhancemedical care without compromising patient privacy.
Abstract: Graduate medical education (GME) is a balance between providing optimal patient care and ensuring that trainees (residents and fellows) develop independent medical decision-making skills as well as the ability to manage serious medical conditions. We used one form of wearable technology (“Google Glass”) to explore different scenarios in cardiovascular practice where fellows can better their education. We specified different scenarios encountered during routine clinical care in the month of July 2013. These scenarios were chosen based on their clinical significance, the difficulty posed to early-stage trainees, and the possibly deleterious effects of misdiagnosis or treatment. A mock trainee wearing Google Glass enacted each scenario. The live video stream from the Glass was transmitted via Wi-Fi or Bluetooth and could be received by a smartphone, tablet, or personal computer. In conclusion, wearable technology has the potential to enhance medical education and patient safety once widely available. Medical institutions should work on policies regarding the use of such technologies to enhance medical care without compromising patient privacy.

Journal ArticleDOI
TL;DR: In this paper, the authors assess changes in tear film stability caused by incomplete blinking and show that even if the total blink rate decreases, the tear film remains stable so long as almost all blinks are complete.
Abstract: PURPOSE The purpose of this study is to assess changes in tear film stability caused by incomplete blinking. METHODS Eleven subjects (mean age, 21.3 years) participated in this study. All subjects had a visual acuity of 20/20 or better and normal ocular health. The subjects were asked to play a game for 60 min on a personal computer as part of a visual display terminal (VDT) experiment. Each subject's blinking was observed by a Web camera that was attached to the top of the display. Every 15 min, the VDT experiment was interrupted for measurement. An RT-7000 (Tomey Co., Ltd., Nagoya, Japan) was used to measure ring breakup time as a parameter of tear film stability. An OPD-Scan II ARK-10000 (NIDEK Co., Ltd., Aichi, Japan) was used to measure corneal aberrations. RESULTS Although the total blink rate changed very little, the complete and incomplete blink rates fluctuated during the VDT experiment. Both types were plotted along symmetrical cubic approximation curves. Noninvasive (ring) breakup time at 30 min (4.33 ± 2.57 s) was significantly shorter (p < 0.01) than that at baseline before the VDT experiment (8.62 ± 1.54 s). After 30 min, the incomplete blink rate began decreasing (fewer incomplete blinks), whereas the complete blink rate began increasing. Ring breakup time increased (improved) after 45 min; however, the incomplete blink rate began to increase again after approximately 50 min. CONCLUSIONS Even if the total blink rate decreases, the tear film remains stable so long as almost all blinks are complete. The incomplete blinking contributes to tear film instability and is variable with prolonged VDT exposure. Our study indicated that the tear film stability was determined by blinking quality, and the predominance of blinking type relates to tear film stability.

Journal ArticleDOI
TL;DR: A simple, yet efficient, parallel disk-based algorithm for counting k-mers, called KMC, which is capable of counting the statistics for short-read human genome data, in input gzipped FASTQ file, in less than 40 minutes on a PC with 16 GB of RAM and 6 CPU cores.
Abstract: The k-mer counting problem, which is to build the histogram of occurrences of every k-symbol long substring in a given text, is important for many bioinformatics applications. They include developing de Bruijn graph genome assemblers, fast multiple sequence alignment and repeat detection. We propose a simple, yet efficient, parallel disk-based algorithm for counting k-mers. Experiments show that it usually offers the fastest solution to the considered problem, while demanding a relatively small amount of memory. In particular, it is capable of counting the statistics for short-read human genome data, in input gzipped FASTQ file, in less than 40 minutes on a PC with 16 GB of RAM and 6 CPU cores, and for long-read human genome data in less than 70 minutes. On a more powerful machine, using 32 GB of RAM and 32 CPU cores, the tasks are accomplished in less than half the time. No other algorithm for most tested settings of this problem and mammalian-size data can accomplish this task in comparable time. Our solution also belongs to memory-frugal ones; most competitive algorithms cannot efficiently work on a PC with 16 GB of memory for such massive data. By making use of cheap disk space and exploiting CPU and I/O parallelism we propose a very competitive k-mer counting procedure, called KMC. Our results suggest that judicious resource management may allow to solve at least some bioinformatics problems with massive data on a commodity personal computer.
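What KMC counts can be stated in a few lines. The sketch below builds an in-memory k-mer histogram for illustration only; KMC itself is a parallel, disk-based tool precisely because genome-scale read sets do not fit in a simple in-memory table like this:

```python
# Minimal in-memory k-mer histogram. Illustrates what is being counted;
# KMC achieves the same result out-of-core, in parallel, on massive data.
from collections import Counter

def count_kmers(reads, k):
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            if "N" not in kmer:      # skip windows with ambiguous bases
                counts[kmer] += 1
    return counts

counts = count_kmers(["ACGTACGT", "CGTNAC"], 3)   # toy reads
```

For the toy reads above, "CGT" occurs three times (twice in the first read, once in the second) and any window containing "N" is discarded.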

Journal ArticleDOI
TL;DR: The analyzed apps reflected a variety of approaches to recording food intake and nutrition using different terminals: mostly mobile phones, followed by PCs and PDAs in older studies. They were designed mainly for users with obesity, overweight or diabetes mellitus, or for people who want to stay healthy.

Journal ArticleDOI
TL;DR: In this paper, an integrated dielectric sensor with a read-out circuit in an unmodified SiGe BiCMOS technology at 125 GHz is presented, where the readout is obtained by reflection coefficient measurement with an integrated reflectometer and a signal source.
Abstract: In this paper, an integrated dielectric sensor with a read-out circuit in an unmodified SiGe BiCMOS technology at 125 GHz is presented. The sensor consists of a 500-μm shorted half-wave coplanar-waveguide transmission line in the uppermost metal layer of the silicon process, while the read-out is obtained by reflection-coefficient measurement with an integrated reflectometer and a signal source. The reflectometer is verified with a circuit breakout including an integrated dummy sensor. The reflectometer is able to measure the phase of the reflection coefficient from 117 to 134 GHz with a resolution of 0.1° and a standard deviation of 0.082°. The integrated sensor with the reflectometer circuit has been fabricated in a 190-GHz f_T SiGe:C BiCMOS technology. It spans an area of 1.4 mm² and consumes 75 mA from a 3.3-V supply. The circuit has been assembled on a printed circuit board for characterization by immersion into test liquids. The sensor is controlled by a controller board and a personal computer, enabling a measurement time of up to 1 ms per frequency point. Functionality of the sensor is demonstrated from 118 to 133 GHz with immersion of the sensor into different binary methanol–ethanol mixtures, showing good correlation between theory and measurement. The sensor shows a standard deviation of the measured phase of 0.220° and is able to detect a difference in ε′_r of 0.0125.
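The sensing principle, measuring how the phase of the input reflection coefficient of a shorted line shifts with the permittivity of the surrounding liquid, can be sketched under a strongly simplified model. The sketch below assumes an ideal lossless TEM line whose effective permittivity equals the liquid's ε′_r; the real CPW sensor has losses and a geometry-dependent effective permittivity, and the function name is illustrative only.

```python
import cmath
import math

C0 = 299_792_458.0  # speed of light in vacuum (m/s)

def shorted_line_reflection_phase(freq_hz, length_m, eps_r):
    """Phase (degrees) of the input reflection coefficient of an
    ideal lossless shorted transmission line with effective
    relative permittivity eps_r (simplified TEM model)."""
    beta = 2 * math.pi * freq_hz * math.sqrt(eps_r) / C0  # phase constant (rad/m)
    gamma_in = -cmath.exp(-2j * beta * length_m)          # short: Gamma_L = -1
    return math.degrees(cmath.phase(gamma_in))

# A small change in permittivity shifts the measured phase,
# which is what the integrated reflectometer resolves:
p1 = shorted_line_reflection_phase(125e9, 500e-6, 5.0)
p2 = shorted_line_reflection_phase(125e9, 500e-6, 5.5)
```

Inverting this phase-versus-permittivity relation (calibrated against known liquids such as the methanol–ethanol mixtures) yields the dielectric read-out.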

Journal ArticleDOI
TL;DR: Though communities have endeavored to further enhance participation by persons with disabilities in many aspects of mainstream society, there is a scarcity of research pertaining to how adults with intellectual disabilities can access a computer, especially those with severe impairments.
Abstract: Purpose: This article was written to summarize current efforts in the research community in regards to assisting adults with severe developmental and intellectual disabilities to access a computer. Method: A literature search was conducted to determine contemporary research that has been conducted to enable computer use in persons with significant developmental disabilities utilizing databases such as ERIC or PubMed. Results: Although various assistive technology devices and interventions have been developed for persons with all types of disabilities, a lack of research into methods to help persons with severe developmental disabilities access a computer is evident. This perpetuates the underutilization of computers in this population such as those attending day programs or residing in residential facilities. Conclusions: Persons with developmental disabilities, particularly adults, are often overlooked and are not thought to be capable of using a personal computer. Though communities have endeavored to further enhance participation by persons with disabilities in many aspects of mainstream society, there remains a scarcity of research pertaining to how adults with intellectual disabilities can access a computer, especially those with severe impairments.

Journal ArticleDOI
TL;DR: The preliminary results are promising, with a mean absolute error of less than 2 mmHg in all the patients, and the proposed CFD-based algorithm is fully automatic, requiring no iterative tuning procedures for matching the computed results to observed patient data, thus making it feasible for use in a clinical setting.
Abstract: We propose a CFD-based approach for the non-invasive hemodynamic assessment of pre- and post-operative coarctation of aorta (CoA) patients. Under our approach, the pressure gradient across the coarctation is determined from computational modeling based on physiological principles, medical imaging data, and routine non-invasive clinical measurements. The main constituents of our approach are a reduced-order model for computing blood flow in patient-specific aortic geometries, a parameter estimation procedure for determining patient-specific boundary conditions and vessel wall parameters from non-invasive measurements, and a comprehensive pressure-drop formulation coupled with the overall reduced-order model. The proposed CFD-based algorithm is fully automatic, requiring no iterative tuning procedures for matching the computed results to observed patient data, and requires approximately 6-8 min of computation time on a standard personal computer (Intel Core2 Duo CPU, 3.06 GHz), thus making it feasible for use in a clinical setting. The initial validation studies for the pressure-drop computations have been performed on four patient datasets with native or recurrent coarctation, by comparing the results with the invasively measured peak pressure gradients recorded during routine cardiac catheterization procedure. The preliminary results are promising, with a mean absolute error of less than 2 mmHg in all the patients.
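The pressure-drop formulation mentioned above is typically built from a viscous loss term linear in flow and a turbulent loss term quadratic in flow. The sketch below uses that generic textbook-style decomposition, not the paper's exact comprehensive formulation; the coefficients k_v and k_t are illustrative placeholders that would, in a patient-specific model, be derived from the imaged aortic geometry.

```python
def coarctation_pressure_drop(q, k_v, k_t):
    """Quasi-steady, semi-empirical pressure drop (Pa) across a
    narrowed vessel segment for volumetric flow rate q (m^3/s):
    a viscous loss linear in q plus a turbulent loss quadratic in q.
    Generic illustration only; k_v (Pa.s/m^3) and k_t (Pa.s^2/m^6)
    are hypothetical coefficients, not values from the paper."""
    return k_v * q + k_t * q * abs(q)

# With purely illustrative coefficients, doubling the flow more
# than doubles the pressure drop because of the quadratic term:
dp_low = coarctation_pressure_drop(1.0e-4, 5.0e7, 2.0e12)
dp_high = coarctation_pressure_drop(2.0e-4, 5.0e7, 2.0e12)
```

This super-linear growth of the gradient with flow is why the peak systolic flow largely determines the clinically reported peak pressure gradient.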

Journal ArticleDOI
TL;DR: Should Hong Kong establish a formal recycling network with tight regulatory control on imports and exports, the potential risks that current e-waste recycling practices pose to e-waste recycling workers, local residents and the environment can be greatly reduced.

Journal ArticleDOI
TL;DR: This article presents the results of an investigation of the thermal performance of a vapor chamber for cooling the hard disk drive of a personal computer, a topic of technological importance for the efficient design of cooling systems for personal computers and other electronic devices.
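Cooling performance in studies like this one is commonly summarized by an overall thermal resistance. The sketch below shows that standard figure of merit; the numeric values are hypothetical and are not taken from the paper.

```python
def thermal_resistance(t_source_c, t_ambient_c, heat_load_w):
    """Overall thermal resistance (K/W) of a cooling device such
    as a vapor chamber: temperature rise of the heat source above
    ambient divided by the dissipated power. Lower is better."""
    return (t_source_c - t_ambient_c) / heat_load_w

# Hypothetical example: a 30 K rise at a 15 W heat load.
r_th = thermal_resistance(55.0, 25.0, 15.0)  # -> 2.0 K/W
```

Comparing r_th across operating conditions (heat load, orientation, coolant charge) is the usual way such vapor-chamber experiments report enhancement in cooling performance.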