
Showing papers by "University of Colorado Colorado Springs" published in 2013


Journal ArticleDOI
TL;DR: The origins of the errors in NDDO methods have been examined, and were found to be attributable to inadequate and inaccurate reference data.
Abstract: Modern semiempirical methods are of sufficient accuracy when used in the modeling of molecules of the same type as used as reference data in the parameterization. Outside that subset, however, there is an abundance of evidence that these methods are of very limited utility. In an attempt to expand the range of applicability, a new method called PM7 has been developed. PM7 was parameterized using experimental and high-level ab initio reference data, augmented by a new type of reference data intended to better define the structure of parameter space. The resulting method was tested by modeling crystal structures and heats of formation of solids. Two changes were made to the set of approximations: a modification was made to improve the description of noncovalent interactions, and two minor errors in the NDDO formalism were rectified. Average unsigned errors (AUEs) in geometry and ΔHf for PM7 were reduced relative to PM6; for simple gas-phase organic systems, the AUE in bond lengths decreased by about 5% and the AUE in ΔHf decreased by about 10%; for organic solids, the AUE in ΔHf dropped by 60% and the reduction was 33.3% for geometries. A two-step process (PM7-TS) for calculating the heights of activation barriers has been developed. Using PM7-TS, the AUE in the barrier heights for simple organic reactions was decreased from values of 12.6 kcal/mol in PM6 and 10.8 kcal/mol in PM7 to 3.8 kcal/mol. The origins of the errors in NDDO methods have been examined, and were found to be attributable to inadequate and inaccurate reference data. This conclusion provides insight into how these methods can be improved.
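
As a quick illustration of the average unsigned error (AUE) metric quoted above, the Python sketch below computes an AUE from paired predicted and reference values; the function name and the example numbers are placeholders, not data from the paper.

```python
# Illustrative sketch (not from the paper): how an average unsigned error (AUE)
# of the kind reported for PM7 vs. PM6 could be computed from paired values.
import numpy as np

def average_unsigned_error(predicted, reference):
    """Mean absolute deviation between predicted and reference values."""
    predicted = np.asarray(predicted, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return np.mean(np.abs(predicted - reference))

# Hypothetical heats of formation (kcal/mol) for a few gas-phase molecules.
pm7_hof = [-17.9, 12.5, -94.1]
expt_hof = [-17.8, 12.0, -94.0]
print(f"AUE in ΔHf: {average_unsigned_error(pm7_hof, expt_hof):.2f} kcal/mol")
```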

1,447 citations


Journal ArticleDOI
TL;DR: This paper explores the nature of open set recognition and formalizes its definition as a constrained minimization problem, and introduces a novel “1-vs-set machine,” which sculpts a decision space from the marginal distances of a 1-class or binary SVM with a linear kernel.
Abstract: To date, almost all experimental evaluations of machine learning-based recognition algorithms in computer vision have taken the form of “closed set” recognition, whereby all testing classes are known at training time. A more realistic scenario for vision applications is “open set” recognition, where incomplete knowledge of the world is present at training time, and unknown classes can be submitted to an algorithm during testing. This paper explores the nature of open set recognition and formalizes its definition as a constrained minimization problem. The open set recognition problem is not well addressed by existing algorithms because it requires strong generalization. As a step toward a solution, we introduce a novel “1-vs-set machine,” which sculpts a decision space from the marginal distances of a 1-class or binary SVM with a linear kernel. This methodology applies to several different applications in computer vision where open set recognition is a challenging problem, including object recognition and face verification. We consider both in this work, with large scale cross-dataset experiments performed over the Caltech 256 and ImageNet sets, as well as face matching experiments performed over the Labeled Faces in the Wild set. The experiments highlight the effectiveness of machines adapted for open set evaluation compared to existing 1-class and binary SVMs for the same tasks.
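
The Python sketch below illustrates the general flavor of open set recognition with a linear SVM: decision scores that fall far outside a band learned from training data are rejected as "unknown". It is a simplified stand-in using scikit-learn, not the authors' 1-vs-set optimization; the thresholds and data are invented for illustration.

```python
# Minimal sketch of open-set-style rejection with a linear SVM: samples whose
# decision score falls outside a band near the positive class are labeled
# "unknown". This illustrates the idea of bounding the decision space with two
# thresholds; it is not the paper's exact 1-vs-set formulation.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X_known = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
y_known = np.array([0] * 50 + [1] * 50)

clf = LinearSVC(C=1.0).fit(X_known, y_known)
scores = clf.decision_function(X_known)

# Two thresholds ("near" and "far" planes) chosen from training-score quantiles.
lo, hi = np.quantile(scores[y_known == 1], [0.05, 0.95])

def predict_open_set(x):
    s = clf.decision_function(np.atleast_2d(x))[0]
    return 1 if lo <= s <= hi else -1  # -1 denotes "unknown"/rejected

print(predict_open_set([4, 4]), predict_open_set([40, 40]))
```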

1,029 citations


Journal ArticleDOI
TL;DR: In this paper, the authors report the light-to-heat energy transfer efficiencies of gold nanoparticles with variable sizes by assessing the temperature profiles of laser-activated particle suspensions in water.
Abstract: We report the light-to-heat energy transfer efficiencies of gold nanoparticles with variable sizes by assessing the temperature profiles of laser-activated particle suspensions in water. Gold nanoparticles with sizes ranging from 5 to 50 nm were synthesized by chemical reduction methods using sodium borohydride, sodium citrate, or hydroquinone as reducing agents. As-synthesized gold nanoparticle solution (1 mL) was loaded into a quartz cuvette and exposed to a CW green laser (532 nm). Heat input into the system by energy transfer from nanoparticles equals heat dissipation at thermal equilibrium. The transducing efficiency was then determined by plotting temperature increase as a function of laser power extinction. The efficiency increases from 0.650 ± 0.012 to 0.803 ± 0.008 as the particle size decreases from 50.09 ± 2.34 nm to 4.98 ± 0.59 nm. The results indicate that the photothermal properties of gold nanoparticles are size-tunable, and the variation of efficiency can be correlated to the ab...
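
A rough energy-balance sketch of how such a transduction efficiency can be estimated at thermal equilibrium is shown below; all numerical values (lumped heat-transfer coefficient, temperature rise, laser power, extinction) are assumed placeholders, not measurements from the study.

```python
# Rough energy-balance sketch (placeholder numbers, not the paper's data):
# at thermal equilibrium the heat dissipated by the cuvette equals the heat
# generated by the nanoparticles, so the transduction efficiency can be
# estimated from the equilibrium temperature rise and the absorbed laser power.
h_A = 0.0045          # lumped heat-transfer coefficient x surface area (W/K), assumed
delta_T_eq = 12.0     # equilibrium temperature rise of the suspension (K), assumed
P_laser = 0.10        # incident CW laser power at 532 nm (W), assumed
A_532 = 0.8           # extinction (absorbance) of the suspension at 532 nm, assumed

P_absorbed = P_laser * (1 - 10 ** (-A_532))   # power removed from the beam
efficiency = (h_A * delta_T_eq) / P_absorbed  # fraction converted to heat
print(f"estimated photothermal efficiency: {efficiency:.2f}")
```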

288 citations


Journal ArticleDOI
TL;DR: It was found that the [Bint.–O–Ag] units could trap the photoinduced electron to form a unique intermediate structure in the (B, Ag)-codoped TiO2 during the irradiation, which is responsible for the photoinduced shifts of the B 1s and Ag 3d peaks observed in the in situ XPS spectra.
Abstract: The origin of the exceptionally high activity of (B, Ag)-codoped TiO2 catalysts under solar-light irradiation has been investigated by XPS and 11B solid-state NMR spectroscopy in conjunction with density functional theory (DFT) calculations. XPS experimental results demonstrated that a portion of the dopant Ag (Ag3+) ions were implanted into the crystalline lattice of (B, Ag)-codoped TiO2 and were in close proximity to the interstitial B (Bint.) sites, forming [Bint.–O–Ag] structural units. In situ XPS experiments were employed to follow the evolution of the chemical states of the B and Ag dopants during UV–vis irradiation. It was found that the [Bint.–O–Ag] units could trap the photoinduced electron to form a unique intermediate structure in the (B, Ag)-codoped TiO2 during the irradiation, which is responsible for the photoinduced shifts of the B 1s and Ag 3d peaks observed in the in situ XPS spectra. Solid-state NMR experiments including 11B triple-quantum and double-quantum magic angle spinning (MAS) N...

220 citations


Journal ArticleDOI
TL;DR: In this article, the authors examined the dynamics of consumer-brand identification and its antecedents in the context of the launch of a new brand and found that on average, CBI growth trajectories initially rise after the introduction but eventually decline, following an inverted-U shape.
Abstract: This study examines the dynamics of consumer–brand identification (CBI) and its antecedents in the context of the launch of a new brand. Three focal drivers of CBI with a new brand are examined, namely: perceived quality (the instrumental driver), self–brand congruity (the symbolic driver), and consumer innate innovativeness (a trait-based driver). Using longitudinal survey data, the authors find that on average, CBI growth trajectories initially rise after the introduction but eventually decline, following an inverted-U shape. More importantly, the longitudinal effects of the antecedents suggest that CBI can take different paths. Consumer innovativeness creates a fleeting identification with the brand that dissipates over time. On the other hand, company-controlled drivers of CBI—such as brand positioning—can contribute to the build-up of deep-structure CBI that grows stronger over time. Based on these findings, the authors offer normative guidelines to managers on consumer–brand relationship investment.

218 citations


Proceedings ArticleDOI
17 Nov 2013
TL;DR: A template-based optimization framework, AUGEM, is presented, which can automatically generate fully optimized assembly code for several dense linear algebra kernels, such as GEMM, GEMV, AXPY and DOT, on varying multi-core CPUs without requiring any manual interference from developers.
Abstract: Basic Linear Algebra Subprograms (BLAS) is a fundamental library in scientific computing. In this paper, we present a template-based optimization framework, AUGEM, which can automatically generate fully optimized assembly code for several dense linear algebra (DLA) kernels, such as GEMM, GEMV, AXPY and DOT, on varying multi-core CPUs without requiring any manual interference from developers. In particular, based on domain-specific knowledge about algorithms of the DLA kernels, we use a collection of parameterized code templates to formulate a number of commonly occurring instruction sequences within the optimized low-level C code of these DLA kernels. Then, our framework uses a specialized low-level C optimizer to identify instruction sequences that match the pre-defined code templates and thereby translates them into extremely efficient SSE/AVX instructions. The DLA kernels generated by our template-based approach surpass the implementations of Intel MKL and AMD ACML BLAS libraries, on both Intel Sandy Bridge and AMD Piledriver processors.
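
To make the template idea concrete, the toy Python sketch below matches one recurring C-level idiom with a pattern and rewrites it into an SSE-style sequence; the pattern, the emitted intrinsics, and the function names are illustrative assumptions, not part of AUGEM.

```python
# Toy illustration (not AUGEM itself) of template-based rewriting: a recurring
# low-level C idiom is matched with a pattern and replaced by an "optimized"
# sequence, here a hypothetical SSE-style expansion of an AXPY-like statement.
import re

# Template: c[i] += a[i] * scalar;  ->  emit a packed SSE sequence as text.
TEMPLATE = re.compile(r"(\w+)\[(\w+)\]\s*\+=\s*(\w+)\[\2\]\s*\*\s*(\w+);")

def rewrite(c_code: str) -> str:
    def emit_sse(m):
        dst, idx, src, scalar = m.groups()
        return (f"/* matched AXPY-like template */\n"
                f"__m128d v{src} = _mm_loadu_pd(&{src}[{idx}]);\n"
                f"__m128d v{dst} = _mm_loadu_pd(&{dst}[{idx}]);\n"
                f"v{dst} = _mm_add_pd(v{dst}, _mm_mul_pd(v{src}, _mm_set1_pd({scalar})));\n"
                f"_mm_storeu_pd(&{dst}[{idx}], v{dst});")
    return TEMPLATE.sub(emit_sse, c_code)

print(rewrite("y[i] += x[i] * alpha;"))
```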

203 citations


Journal ArticleDOI
TL;DR: This paper presents a meta-analyses of the chiral stationary phase transition of Na6(CO3)(SO4)2, Na2SO4, and Na2CO3 of the Na2O2-CO3 mixtures obtained at the resolution of 2D/cm2 (“2D-cm2”) using a diamond-magnifying lens.
Abstract: Sergey V. Krivovichev,*,†,‡ Olivier Mentré, Oleg I. Siidra,† Marie Colmont, and Stanislav K. Filatov† †St. Petersburg State University, Department of Crystallography, University Emb. 7/9, 199034 St. Petersburg, Russia ‡Institute of Silicate Chemistry, Russian Academy of Sciences, Makarova Emb. 6, 199034 St. Petersburg, Russia UCCS, Equipe de Chimie du Solide, UMR CNRS 8181, ENSC Lille – UST Lille, BP 90108, 59652 Villeneuve d'Ascq Cedex, France

200 citations


Journal ArticleDOI
01 Jan 2013
TL;DR: Methodologies of detecting and identifying trending topics from streaming data from Twitter's streaming API were outlined, and term frequency-inverse document frequency analysis identified unigrams, bigrams, and trigrams as trending topics.
Abstract: As social media continue to grow, the zeitgeist of society is increasingly found not in the headlines of traditional media institutions, but in the activity of ordinary individuals. The identification of trending topics utilises social media (such as Twitter) to provide an overview of the topics and issues that are currently popular within the online community. In this paper, we outline methodologies of detecting and identifying trending topics from streaming data. Data from Twitter's streaming API was collected and put into documents of equal duration using data collection procedures that allow for analysis over multiple timespans, including those not currently associated with Twitter-identified trending topics. Term frequency-inverse document frequency analysis and relative normalised term frequency analysis were performed on the documents to identify the trending topics. Relative normalised term frequency analysis identified unigrams, bigrams, and trigrams as trending topics, while term frequency-inverse document frequency analysis identified unigrams as trending topics. Application of these methodologies to streaming data resulted in F-measures ranging from 0.1468 to 0.7508.
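
A minimal sketch of the TF-IDF step, assuming tweets have already been grouped into equal-duration documents, might look like the following (scikit-learn based and simplified relative to the paper's pipeline; the sample data are invented):

```python
# Simplified sketch (not the authors' pipeline): score terms in the most recent
# time bucket of tweets with TF-IDF against earlier buckets and report the top
# terms as candidate trending topics.
from sklearn.feature_extraction.text import TfidfVectorizer

# Each "document" is all tweet text collected in one equal-duration window.
time_buckets = [
    "weather traffic coffee monday commute",
    "coffee lunch meeting weather email",
    "earthquake downtown earthquake shaking alert earthquake",
]

vectorizer = TfidfVectorizer(ngram_range=(1, 2))  # unigrams and bigrams
tfidf = vectorizer.fit_transform(time_buckets)

latest = tfidf[len(time_buckets) - 1].toarray().ravel()
terms = vectorizer.get_feature_names_out()
top = sorted(zip(terms, latest), key=lambda t: t[1], reverse=True)[:3]
print("candidate trending topics:", top)
```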

183 citations


Journal ArticleDOI
TL;DR: The purpose of this article is to provide a synopsis of the state of the art in coding for secrecy, and discusses the importance of a nested code structure and stochastic encoding, which allow for both data reliability and security.
Abstract: While secrecy in communication systems has historically been obtained through cryptographic means in the upper layers, recent research efforts have focused on the physical layer and have unveiled ample opportunities for security design. In particular, the combination of signal processing techniques with channel coding for secrecy has been central to the development of physical-layer security efforts. Although implicit coding techniques for secrecy have been known since the 1970s, explicit code constructions have only been discovered within the last decade. The purpose of this article is to provide a synopsis of the state of the art in coding for secrecy. We discuss the general principles of coding, and we illustrate them with several examples. In particular, we discuss the importance of a nested code structure and stochastic encoding, which allow for both data reliability and security.
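
A toy example of the nested (coset) code structure with stochastic encoding is sketched below: the message selects a coset and fresh random bits select the codeword within it. The generator matrices are invented for illustration, and the construction is not a secrecy-capacity-achieving wiretap code.

```python
# Toy illustration of nested (coset) coding with stochastic encoding: the secret
# message selects a coset of a small linear code and fresh random bits select a
# codeword inside that coset. This shows the structure only.
import numpy as np

rng = np.random.default_rng(1)

# Generator of the "fine" code split into randomization and message parts.
G_random = np.array([[1, 0, 0, 1],
                     [0, 1, 0, 1]])          # spans the "coarse" code (randomness)
G_message = np.array([[0, 0, 1, 1]])         # coset selector carries the message

def encode(message_bit: int) -> np.ndarray:
    random_bits = rng.integers(0, 2, size=2)          # fresh randomness per use
    word = random_bits @ G_random + message_bit * G_message[0]
    return word % 2

print(encode(0), encode(1))  # different transmissions even for the same message
```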

170 citations


Journal ArticleDOI
TL;DR: In this paper, a qualitative study of how community members reenergized a valued community identity following years of decline was carried out, and their findings suggest a recursive model of identity resurrection, in which community leaders marshal tangible resources such as money and human talent to orchestrate experiences and community members authenticate the experiences by judging them resonant with memories and existing identity symbols.
Abstract: We build theory on the process of collective identity resurrection through a qualitative study investigating how community members reenergized a valued community identity following years of decline. Our findings suggest a recursive model of identity resurrection, in which community leaders marshal tangible resources such as money and human talent to orchestrate experiences and community members authenticate the experiences by judging them resonant with memories and existing identity symbols. This model draws attention to the role of experience and emotion in identity processes, extending theory that has tended to focus narrowly on cognitive aspects of collective identity. We discuss implications for processes of identity reproduction and resurrection in organizational settings, and for interdependencies between community and organizational identities.

148 citations


Journal ArticleDOI
TL;DR: In this article, the authors provide a review of the current state-of-the-art of complex oxide and multiferroic thin film materials and devices, identify technical issues and technical challenges that need to be overcome for successful insertion of the technology for both military and commercial applications, and provide mitigation strategies to address these technical challenges.
Abstract: There has been significant progress on the fundamental science and technological applications of complex oxides and multiferroics. Among complex oxide thin films, barium strontium titanate (BST) has become the material of choice for room-temperature-based voltage-tunable dielectric thin films, due to its large dielectric tunability and low microwave loss at room temperature. BST thin film varactor technology based reconfigurable radio frequency (RF)/microwave components have been demonstrated with the potential to lower the size, weight, and power needs of a future generation of communication and radar systems. Low-power multiferroic devices have also been recently demonstrated. Strong magneto-electric coupling has also been demonstrated in different multiferroic heterostructures, which show giant voltage control of the ferromagnetic resonance frequency of more than two octaves. This manuscript reviews recent advances in the processing and application development for the complex oxides and multiferroics, with the focus on voltage tunable RF/microwave components. The over-arching goal of this review is to provide a synopsis of the current state-of-the-art of complex oxide and multiferroic thin film materials and devices, identify technical issues and technical challenges that need to be overcome for successful insertion of the technology for both military and commercial applications, and provide mitigation strategies to address these technical challenges.

Journal ArticleDOI
TL;DR: This study assessed the degree to which anxiety and depression symptoms are associated with memory and executive functioning among community-dwelling older adults and suggested that anxiety and depression have unique relationships with cognitive functioning in community-dwelling older adults.

Journal ArticleDOI
TL;DR: Extended transrectal ultrasound guided biopsies of the prostate may not accurately convey true morphometric information and Gleason score of prostate cancer (PCa) and the clinical use of template-guided (5-mm grid) transperineal mapping biopsies (TPMBs) remains controversial.
Abstract: BACKGROUND. Extended transrectal ultrasound guided biopsies (TRUSB) of the prostate may not accurately convey true morphometric information and Gleason score (GS) of prostate cancer (PCa) and the clinical use of template-guided (5-mm grid) transperineal mapping biopsies (TPMBs) remains controversial. METHODS. We correlated the clinical-pathologic results of 1,403 TPMB cores obtained from 25 men diagnosed with PCa with 64 cancer lesions found in their corresponding radical prostatectomy (RP) specimens. Special computer models of three-dimensional, whole-mounted radical prostatectomy (3D-WMRP) specimens were generated and used as gold standard to determine tumor morphometric data. Between-sample rates of upgrade and downgrade (highest GS and a novel cumulative GS) and upstage and downstage (laterality) were determined. Lesions ≥ 0.5 cm³ or GS ≥ 7 were considered clinically significant. RESULTS. From 64 separate 3D-WMRP lesions, 25 had significant volume (mean 1.13 cm³) and 39 were insignificant (mean 0.09 cm³) (P < 0.0001); 18/64 lesions were missed by TPMB, but only one was clinically significant with GS-8 (0.02 cm³). When comparing the cumulative GS of TPMB versus RP, 72% (n = 18) had identical scores, 12% (n = 3) were upgraded, and only 16% (n = 4) were downgraded. Laterality of TPMB and RP was strongly correlated, 80% same laterality, 4% were up-staged, and 16% down-staged. CONCLUSIONS. Our clinical-pathology correlation showed very high accuracy of TPMB with a 5-mm grid template to detect clinically significant PCa lesions as compared with 3D-WMRP, providing physicians and patients with a reliable assessment of grade and stage of disease and the opportunity to choose the most appropriate therapeutic options.

Proceedings ArticleDOI
17 Jun 2013
TL;DR: This paper implements an interference-aware scheduling policy, based on a task performance prediction model, and an adaptive delay scheduling algorithm for data locality improvement that is effective and efficient on a 72-node Xen-based virtual cluster.
Abstract: MapReduce emerges as an important distributed programming paradigm for large-scale applications. Running MapReduce applications in clouds presents an attractive usage model for enterprises. In a virtual MapReduce cluster, the interference between virtual machines (VMs) causes performance degradation of map and reduce tasks and renders existing data locality-aware task scheduling policy, like delay scheduling, no longer effective. On the other hand, virtualization offers an extra opportunity of data locality for co-hosted VMs. In this paper, we present a task scheduling strategy to mitigate interference while preserving task data locality for MapReduce applications. The strategy includes an interference-aware scheduling policy, based on a task performance prediction model, and an adaptive delay scheduling algorithm for data locality improvement. We implement the interference and locality-aware (ILA) scheduling strategy in a virtual MapReduce framework. We evaluated its effectiveness and efficiency on a 72-node Xen-based virtual cluster. Experimental results with 10 representative CPU and IO-intensive applications show that ILA is able to achieve a speedup of 1.5 to 6.5 times for individual jobs and yield an improvement of up to 1.9 times in system throughput in comparison with four other MapReduce schedulers.
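
The sketch below captures the general shape of such a policy, combining a predicted-interference check with a bounded delay for data locality; the function names, thresholds, and slowdown model are hypothetical and not taken from the ILA implementation.

```python
# Sketch of the general idea behind interference- and locality-aware scheduling
# (names and thresholds are hypothetical, not the paper's ILA implementation):
# prefer a node holding the task's data, skip nodes whose predicted interference
# is too high, and fall back to any acceptable node after a bounded delay.
def pick_node(task, nodes, predict_slowdown, max_delay, waited):
    """Return a node for `task`, or None to delay scheduling one more round."""
    local = [n for n in nodes if task["block"] in n["local_blocks"]]
    # First choice: a data-local node with acceptable predicted interference.
    candidates = [n for n in local if predict_slowdown(task, n) < 1.3]
    if candidates:
        return min(candidates, key=lambda n: predict_slowdown(task, n))
    # Adaptive delay: keep waiting for locality until the delay budget is spent.
    if waited < max_delay:
        return None
    # Otherwise take the least-interfered node anywhere.
    return min(nodes, key=lambda n: predict_slowdown(task, n))

# Example usage with a trivial slowdown model based on co-located VM load.
nodes = [{"id": 0, "local_blocks": {"b1"}, "load": 0.2},
         {"id": 1, "local_blocks": set(), "load": 0.1}]
slowdown = lambda task, n: 1.0 + n["load"]
print(pick_node({"block": "b1"}, nodes, slowdown, max_delay=3, waited=0))
```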

Journal ArticleDOI
TL;DR: Flaxseed intake decreased glucose and insulin and improved insulin sensitivity as part of a habitual diet in overweight or obese individuals with pre-diabetes.

Journal ArticleDOI
TL;DR: In this article, a rigorous theory of the inverse scattering transform for the defocusing nonlinear Schrodinger equation with nonvanishing boundary values is presented, where the direct problem is well posed for potentials q such that, for which analyticity properties of eigenfunctions and scattering data are established.
Abstract: A rigorous theory of the inverse scattering transform for the defocusing nonlinear Schrödinger equation with nonvanishing boundary values as x → ±∞ is presented. The direct problem is shown to be well posed for potentials q in a suitable class, for which analyticity properties of eigenfunctions and scattering data are established. The inverse scattering problem is formulated and solved both via Marchenko integral equations, and as a Riemann-Hilbert problem in terms of a suitable uniform variable. The asymptotic behavior of the scattering data is determined and shown to ensure the linear system solving the inverse problem is well defined. Finally, the triplet method is developed as a tool to obtain explicit multisoliton solutions by solving the Marchenko integral equation via separation of variables.
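
For reference, a common normalization of the defocusing NLS with nonvanishing boundary conditions is written below; sign and scaling conventions vary between references, so this should be read as a generic statement of the problem rather than the paper's exact setup.

```latex
% Defocusing NLS with nonvanishing boundary conditions, in a common normalization
% (sign and scaling conventions vary between references):
\begin{align}
  i\,q_t &= q_{xx} - 2\,|q|^{2} q, \\
  q(x,t) &\to q_{\pm}, \qquad |q_{\pm}| = q_0 > 0, \quad \text{as } x \to \pm\infty .
\end{align}
```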

Journal ArticleDOI
TL;DR: Results show high use of BC assessment but also a lack of standardisation and widespread perception of problems related to BM and BC in sport, which should emphasise standardisation with appropriate training opportunities and more research on BC and performance.
Abstract: Background Successful performers in weight-sensitive sports are characterised by low body mass (BM) and fat content. This often requires chronic energy restriction and acute weight loss practices. Aim To evaluate current use of body composition (BC) assessment methods and identify problems and solutions with current BC approaches. Methods A 40-item survey was developed, including demographic and content questions related to BC assessment. The survey was electronically distributed among international sporting organisations. Frequencies and χ 2 analyses were computed. Results 216 responses were received, from 33 countries, representing various institutions, sports and competitive levels. Of the sample, 86% of respondents currently assess BC, most frequently using skinfolds (International Society for the Advancement of Kinanthropometry (ISAK): 50%; non-ISAK, conventional: 40%; both: 28%), dual energy X-ray absorptiometry (38%), bioelectrical impedance (29%), air displacement plethysmography (17%) and hydrostatic weighing (10%). Of those using skinfolds, more at the international level used ISAK, whereas conventional approaches were more reported at regional/national level (p=0.006). The sport dietitian/nutritionist (57%) and physiologist/sports scientist (54%) were most frequently the professionals assessing BC, followed by MDs and athletic trainers, with some reporting coaches (5%). 36% of 116 respondents assessed hydration status and more (64%) did so at international than regional/national level (36%, p=0.028). Of 125 participants answering the question of whether they thought that BC assessment raised problems, 69% said ‘yes’, with most providing ideas for solutions. Conclusions Results show high use of BC assessment but also a lack of standardisation and widespread perception of problems related to BM and BC in sport. Future work should emphasise standardisation with appropriate training opportunities and more research on BC and performance.

Journal ArticleDOI
TL;DR: Catalysts based on ultra‐thin cobalt shells surrounding cheap iron oxide cores (see picture) are developed, an approach previously optimized for preparing magnetic tape for audio cassettes, giving good diesel fractions.
Abstract: Audio cassettes hold the key to enhancing Fischer–Tropsch catalysis. Catalysts based on ultra‐thin cobalt shells surrounding cheap iron oxide cores (see picture) are developed, an approach previously optimized for preparing magnetic tape for audio cassettes. These particles are easily made on a large scale, and are excellent Fischer–Tropsch catalysts, giving good diesel fractions.

Journal ArticleDOI
TL;DR: This article found that the proportion of fair-valued assets held by banks is positively associated with audit fees and that bank expert auditors charge more for auditing the proportions of total assets that are fair valued.
Abstract: Using publicly traded bank holding company data from 2008 through 2011, this paper documents that the proportions of fair-valued assets held by banks are positively associated with audit fees. The positive association between audit fees and the proportions of total assets that are fair-valued using Level 3 inputs is greater than its positive association with the proportions of total assets that are fair-valued using Level 1 or Level 2 inputs. These results are consistent with a hypothesized scenario in which audit effort increases in the difficulty of verifying asset fair values. We also document that bank specialist auditors, defined as in Behn et al. (2008), charge lower audit fees to bank clients on average, suggesting cost efficiencies passed to clients as lower fees. However, bank expert auditors charge more for auditing the proportions of total assets that are fair-valued. Overall, the results support concerns expressed by some observers that greater use of fair value measurements for financial instruments will trigger increased audit fees.

Journal ArticleDOI
TL;DR: The patterns of relationships between different social support facets and sources and QOL aspects including emotional, physical symptoms, functional, and social as well as the global QOL index were investigated.
Abstract: Objective: This systematic review analyzed the relationships between social support and quality of life (QOL) indicators among lung cancer patients. In particular, the patterns of relationships between different social support facets and sources (received and perceived support from healthcare professionals, family, and friends) and QOL aspects (emotional, physical symptoms, functional, and social) as well as the global QOL index were investigated. Methods: The review yielded 14 original studies (57% applying cross-sectional designs) analyzing data from a total of 2759 patients. Results: Regarding healthcare professionals as support source, corroborating evidence was found for associations between received support (as well as need for and satisfaction with received support) and all aspects of QOL, except for social ones. Overall, significant relations between support from healthcare personnel and QOL were observed more frequently (67% of analyzed associations), compared with support from families and friends (53% of analyzed associations). Corroborating evidence was found for the associations between perceived and received support from family and friends and emotional aspects of QOL. Research investigating perceived social support from unspecified sources indicated few significant relationships (25% of analyzed associations) and only for the global QOL index. Conclusions: Quantitative and qualitative differences in the associations between social support and QOL are observed, depending on the source and type of support. Psychosocial interventions may aim at enabling provision of social support from healthcare personnel in order to promote emotional, functional, and physical QOL among lung cancer patients.

Journal ArticleDOI
TL;DR: The capability of the Sparkle Model for the prediction of geometries of compounds containing lanthanide trications within the PM7 semiempirical model is emphasized, as well as the usefulness of such semiempirical calculations for materials modeling.
Abstract: The recently published Parametric Method number 7, PM7, is the first semiempirical method to be successfully tested by modeling crystal structures and heats of formation of solids. PM7 is thus also capable of producing results of useful accuracy for materials science and constitutes a great improvement over its predecessor, PM6. In this article, we present Sparkle model parameters to be used with PM7 that allow the prediction of geometries of metal complexes and materials which contain lanthanide trications. Accordingly, we considered the geometries of 224 high-quality crystallographic structures of complexes for the parametrization set and 395 more for the validation of the parametrization for the whole lanthanide series, from La(III) to Lu(III). The average unsigned error for Sparkle/PM7 for the distances between the metal ion and its coordinating atoms is 0.063 Å for all lanthanides, ranging from a minimum of 0.052 Å for Tb(III) to 0.088 Å for Ce(III), comparable to the equivalent errors in the distanc...

Journal ArticleDOI
TL;DR: Different patterns of treatment discontinuation reasons are important to consider when developing public policy and evidence-based treatment approaches to improve successful long-term psoriasis control.
Abstract: Background Despite widespread dissatisfaction and low treatment persistence in moderate to severe psoriasis, patients' reasons behind treatment discontinuation remain poorly understood. Objectives We sought to characterize patient-reported reasons for discontinuing commonly used treatments for moderate to severe psoriasis in real-world clinical practice. Methods A total of 1095 patients with moderate to severe plaque psoriasis from 10 dermatology practices who received systemic treatments completed a structured interview. Eleven reasons for treatment discontinuation were assessed for all past treatments. Results A total of 2231 past treatments were reported. Median treatment duration varied by treatment, ranging from 6.0 to 20.5 months. Limitations The study is limited by its reliance on patient recall. Conclusions Different patterns of treatment discontinuation reasons are important to consider when developing public policy and evidence-based treatment approaches to improve successful long-term psoriasis control.

Proceedings ArticleDOI
23 Feb 2013
TL;DR: A Bias Random vCPU Migration (BRM) algorithm that dynamically migrates vCPUs to minimize the system-wide uncore penalty is proposed and NUMA awareness is added to the virtual machine scheduling.
Abstract: An increasing number of new multicore systems use the Non-Uniform Memory Access architecture due to its scalable memory performance. However, the complex interplay among data locality, contention on shared on-chip memory resources, and cross-node data sharing overhead, makes the delivery of an optimal and predictable program performance difficult. Virtualization further complicates the scheduling problem. Due to abstract and inaccurate mappings from virtual hardware to machine hardware, program and system-level optimizations are often not effective within virtual machines. We find that the penalty to access the “uncore” memory subsystem is an effective metric to predict program performance in NUMA multicore systems. Based on this metric, we add NUMA awareness to the virtual machine scheduling. We propose a Bias Random vCPU Migration (BRM) algorithm that dynamically migrates vCPUs to minimize the system-wide uncore penalty. We have implemented the scheme in the Xen virtual machine monitor. Experiment results on a two-way Intel NUMA multicore system with various workloads show that BRM is able to improve application performance by up to 31.7% compared with the default Xen credit scheduler. Moreover, BRM achieves predictable performance with, on average, no more than 2% runtime variations.
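
A minimal sketch of a biased-random migration step, under an assumed per-vCPU uncore-penalty model, is given below; it illustrates the idea only and is not the Xen-level BRM implementation.

```python
# Sketch of a "biased random" vCPU migration step (hypothetical model, not the
# paper's Xen implementation): vCPUs with higher uncore-access penalty are more
# likely to be picked, and the chosen vCPU moves to the node that minimizes the
# predicted system-wide penalty.
import random

def brm_step(vcpus, nodes, penalty):
    """vcpus: list of {'id', 'node'}; penalty(vcpu, node) -> predicted uncore cost."""
    weights = [penalty(v, v["node"]) for v in vcpus]
    victim = random.choices(vcpus, weights=weights, k=1)[0]   # bias toward high penalty

    def system_penalty(placement_node):
        return sum(penalty(v, placement_node if v is victim else v["node"]) for v in vcpus)

    best = min(nodes, key=system_penalty)
    if best != victim["node"]:
        victim["node"] = best          # migrate the victim vCPU
    return victim

vcpus = [{"id": 0, "node": 0}, {"id": 1, "node": 0}, {"id": 2, "node": 1}]
# Toy penalty: accessing memory away from a vCPU's "home" node costs more.
home = {0: 0, 1: 1, 2: 1}
penalty = lambda v, node: 1.0 if node == home[v["id"]] else 2.5
print(brm_step(vcpus, nodes=[0, 1], penalty=penalty))
```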

Journal ArticleDOI
TL;DR: The hydrological significance of water repellency in the natural environment remains an underexplored research topic as mentioned in this paper: although the hydrological significance of soil water repellency has become well established in the ecohydrology and water resources literature, fewer studies have examined the significance of leaf water repellency.
Abstract: Numerous studies in materials science and chemistry have expanded our understanding of the repellency of water droplets from surfaces. Much of the inspiration for the development of synthetic water-repellent materials came from the examination of water-repellent properties of animals and plants in the natural environment. The hydrological significance of water repellency in the natural environment remains an underexplored research topic. Although the hydrological significance of soil water repellency has become well established in the ecohydrology and water resources literature, fewer studies have examined the significance of leaf water repellency. This review examines the properties of leaf water repellency, the methodologies used to calculate leaf water repellency, the leaf surface properties that promote leaf water repellency, and the significance of leaf water repellency in ecohydrological research. The repellency of a water droplet by a leaf surface is functionally important among plant species and may reflect selective strategies that either favour leaf water uptake in drought-prone environments or promote higher photosynthetic efficiency during prolonged periods of precipitation through higher repellency. Information on the functional significance of leaf water repellency in hydrologic models could enhance our understanding of the delivery of water resources to municipal reservoirs and fill in a missing gap in our understanding of leaf water repellency as a process that influences discharge, and by extension, water resources. Copyright © 2012 John Wiley & Sons, Ltd.
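
Leaf water repellency is often quantified by the droplet contact angle; the sketch below estimates a contact angle from droplet height and base diameter under a spherical-cap assumption. This is a generic geometric method with made-up measurements, not a protocol taken from the review.

```python
# Contact angle is a common way to quantify leaf water repellency: higher angles
# mean stronger repellency. This sketch uses the spherical-cap approximation
# (theta = 2*arctan(2h/d)) from a droplet's height h and contact diameter d;
# it is a generic method, not a procedure taken from the review.
import math

def contact_angle_deg(height_mm: float, base_diameter_mm: float) -> float:
    """Spherical-cap estimate of the droplet contact angle in degrees."""
    return math.degrees(2.0 * math.atan(2.0 * height_mm / base_diameter_mm))

# Hypothetical droplet measurements on two leaf surfaces.
print(f"waxy leaf:     {contact_angle_deg(1.4, 2.0):.1f} deg")  # strongly repellent
print(f"wettable leaf: {contact_angle_deg(0.4, 2.4):.1f} deg")  # weakly repellent
```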

Journal ArticleDOI
TL;DR: Confirmatory factor analyses indicated that a six-factor structure yielded the best fit for scores and that the scores were invariant across samples.
Abstract: In this study, the authors report on the development of English and German versions of the Adolescent Time Attitude Scale (ATAS). The ATAS consists of six subscales assessing Past Positive, Past Negative, Present Positive, Present Negative, Future Positive, and Future Negative time attitudes. The authors describe the development of the scales and present data on the reliability and structural validity of ATAS scores in samples of American (N = 300) and German (N = 316) adolescents. Internal consistency estimates for scores on the English and German versions of the ATAS were in the .70 to .80 range. Confirmatory factor analyses indicated that a six-factor structure yielded the best fit for scores and that the scores were invariant across samples.

Journal ArticleDOI
TL;DR: This article used information from a rich administrative panel dataset following the universe of test-taking public school students in Florida over a period of five years to estimate the relationship between same-gender teacher assignment and student achievement.

Journal ArticleDOI
TL;DR: A cut-point score of 21 or less can be used to identify patients who are experiencing rhinitis symptom control problems, and the tool's brevity supports its usefulness in clinical care.
Abstract: Background The Rhinitis Control Assessment Test (RCAT) is a brief, patient-completed tool to evaluate rhinitis symptom control. Objective We sought to test the reliability, validity, and responsiveness of RCAT and to estimate a cut-point score and minimal important difference (MID). Methods A total of 402 patients 12 years of age and older with allergic or nonallergic rhinitis were enrolled in a noninterventional study. Patients completed the RCAT (6 items; score range, 6-30) and had Total Nasal Symptom Scores (TNSSs) measured at baseline and 2 weeks later. Physicians completed a global assessment of rhinitis symptom control (Physician's Global Assessment) and disease severity. Internal consistency, test-retest reliability, convergent validity, known-groups validity, and responsiveness were evaluated. The MID was determined by using distribution- and anchor-based methods. Content validity of the RCAT was assessed in individual interviews with a separate group of 58 adult patients. Results Internal consistency and test-retest reliability of RCAT scores were 0.77 and 0.78, respectively. Convergent validity correlation between RCAT and TNSS scores was 0.57, and that between RCAT and Physician's Global Assessment scores was 0.34. Mean RCAT scores differed significantly between known groups. Conclusion The RCAT demonstrated adequate reliability, validity, and responsiveness and was deemed acceptable and appropriate by patients. This tool can facilitate the detection of rhinitis symptom control problems, and its brevity supports its usefulness in clinical care.

Journal ArticleDOI
TL;DR: BARREL as discussed by the authors is a multiple-balloon investigation designed to study electron losses from Earth's Radiation Belts, which augments the Radiation Belt Storm Probes mission by providing measurements of relativistic electron precipitation with a pair of Antarctic balloon campaigns.
Abstract: BARREL is a multiple-balloon investigation designed to study electron losses from Earth’s Radiation Belts. Selected as a NASA Living with a Star Mission of Opportunity, BARREL augments the Radiation Belt Storm Probes mission by providing measurements of relativistic electron precipitation with a pair of Antarctic balloon campaigns that will be conducted during the Austral summers (January-February) of 2013 and 2014. During each campaign, a total of 20 small (∼20 kg) stratospheric balloons will be successively launched to maintain an array of ∼5 payloads spread across ∼6 hours of magnetic local time in the region that magnetically maps to the radiation belts. Each balloon carries an X-ray spectrometer to measure the bremsstrahlung X-rays produced by precipitating relativistic electrons as they collide with neutrals in the atmosphere, and a DC magnetometer to measure ULF-timescale variations of the magnetic field. BARREL will provide the first balloon measurements of relativistic electron precipitation while comprehensive in situ measurements of both plasma waves and energetic particles are available, and will characterize the spatial scale of precipitation at relativistic energies. All data and analysis software will be made freely available to the scientific community.

Journal ArticleDOI
TL;DR: Associations between telomere length (TL) and TL-related genes and breast cancer risk in an admixed population of US non-Hispanic white and Hispanic and Mexican women are evaluated, and support for an association is provided.
Abstract: Telomeres are involved in maintaining genomic stability. Previous studies have linked both telomere length (TL) and telomere-related genes with cancer. We evaluated associations between telomere-related genes, TL, and breast cancer risk in an admixed population of US non-Hispanic white (1,481 cases, 1,586 controls) and US Hispanic and Mexican women (2,111 cases, 2,597 controls) from the Breast Cancer Health Disparities Study. TL was assessed in 1,500 women based on their genetic ancestry. TL-related genes assessed were MEN1, MRE11A, RECQL5, TEP1, TERC, TERF2, TERT, TNKS, and TNKS2. Longer TL was associated with increased breast cancer risk [odds ratio (OR) 1.87, 95% confidence interval (CI) 1.38, 2.55], with the highest risk (OR 3.11, 95% CI 1.74, 5.67; p interaction = 0.02) among women with high Indigenous American ancestry. Several TL-related single nucleotide polymorphisms had modest association with breast cancer risk overall, including TEP1 rs93886 (OR 0.82, 95% CI 0.70, 0.95); TERF2 rs3785074 (OR 1.13, 95% CI 1.03, 1.24); TERT rs4246742 (OR 0.85, 95% CI 0.77, 0.93); TERT rs10069690 (OR 1.13, 95% CI 1.03, 1.24); TERT rs2242652 (OR 1.51, 95% CI 1.11, 2.04); and TNKS rs6990300 (OR 0.89, 95% CI 0.81, 0.97). Several differences in association were detected by hormone receptor status of tumors. Most notable were associations with TERT rs2736118 (ORadj 6.18, 95% CI 2.90, 13.19) with estrogen receptor negative/progesterone receptor positive (ER-/PR+) tumors and TERT rs2735940 (ORadj 0.73, 95% CI 0.59, 0.91) with ER-/PR- tumors. These data provide support for an association between TL and TL-related genes and risk of breast cancer. The association may be modified by hormone receptor status and genetic ancestry.

Journal ArticleDOI
TL;DR: The tunable properties of thermoresponsive physical hydrogels recently found application in catalysis and the most relevant examples are described in this perspective article.
Abstract: The tunable properties of thermoresponsive physical hydrogels recently found application in catalysis. The most relevant examples are described in this perspective article. Novel concepts are especially highlighted through the beneficial effects of thermoresponsive hydrogels on the catalytic performance. Their scope and future developments are also addressed.