
Showing papers by "University of Minnesota" published in 2006


Journal ArticleDOI
Hui Zou1
TL;DR: A new version of the lasso is proposed, called the adaptive lasso, where adaptive weights are used for penalizing different coefficients in the ℓ1 penalty, and the nonnegative garotte is shown to be consistent for variable selection.
Abstract: The lasso is a popular technique for simultaneous estimation and variable selection. Lasso variable selection has been shown to be consistent under certain conditions. In this work we derive a necessary condition for the lasso variable selection to be consistent. Consequently, there exist certain scenarios where the lasso is inconsistent for variable selection. We then propose a new version of the lasso, called the adaptive lasso, where adaptive weights are used for penalizing different coefficients in the ℓ1 penalty. We show that the adaptive lasso enjoys the oracle properties; namely, it performs as well as if the true underlying model were given in advance. Similar to the lasso, the adaptive lasso is shown to be near-minimax optimal. Furthermore, the adaptive lasso can be solved by the same efficient algorithm for solving the lasso. We also discuss the extension of the adaptive lasso in generalized linear models and show that the oracle properties still hold under mild regularity conditions. As a byproduct of our theory, the nonnegative garotte is shown to be consistent for variable selection.
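
Because the weighted ℓ1 penalty can be absorbed into a column rescaling, any ordinary lasso solver computes the adaptive lasso. Below is a minimal Python sketch using scikit-learn, with OLS coefficients as the initial estimate and illustrative choices of gamma and the regularization strength (the paper permits any root-n-consistent initial estimator):

```python
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

# Toy data: only the first two coefficients are truly nonzero.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = X @ np.array([3.0, 1.5, 0, 0, 0, 0]) + rng.normal(size=200)

# Step 1: initial root-n-consistent estimate (OLS here).
beta_ols = LinearRegression().fit(X, y).coef_

# Step 2: adaptive weights w_j = 1 / |beta_ols_j|^gamma, with gamma = 1.
w = 1.0 / np.abs(beta_ols)

# Step 3: solve the weighted-l1 problem by rescaling columns, running an
# ordinary lasso, then undoing the scaling.
Xw = X / w                       # column j multiplied by 1/w_j
lasso = Lasso(alpha=0.1).fit(Xw, y)
beta_adaptive = lasso.coef_ / w  # map back to the original scale
print(np.round(beta_adaptive, 2))
```

In this sketch the spurious coefficients receive large weights and are driven exactly to zero, while the strong signals are penalized lightly, which is the intuition behind the oracle property.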

6,765 citations


Journal ArticleDOI
TL;DR: The new local density functional, called M06-L, is designed to capture the main dependence of the exchange-correlation energy on local spin density, spin density gradient, and spin kinetic energy density, and it is parametrized to satisfy the uniform-electron-gas limit.
Abstract: We present a new local density functional, called M06-L, for main-group and transition element thermochemistry, thermochemical kinetics, and noncovalent interactions. The functional is designed to capture the main dependence of the exchange-correlation energy on local spin density, spin density gradient, and spin kinetic energy density, and it is parametrized to satisfy the uniform-electron-gas limit and to have good performance for both main-group chemistry and transition metal chemistry. The M06-L functional and 14 other functionals have been comparatively assessed against 22 energetic databases. Among the tested functionals, which include the popular B3LYP, BLYP, and BP86 functionals as well as our previous M05 functional, the M06-L functional gives the best overall performance for a combination of main-group thermochemistry, thermochemical kinetics, and organometallic, inorganometallic, biological, and noncovalent interactions. It also does very well for predicting geometries and vibrational frequencies. Because of the computational advantages of local functionals, the present functional should be very useful for many applications in chemistry, especially for simulations on moderate-sized and large systems and when long time scales must be addressed. © 2006 American Institute of Physics. DOI: 10.1063/1.2370993

4,154 citations


Journal ArticleDOI
TL;DR: The M05-2X functional has the best performance for thermochemical kinetics, noncovalent interactions (especially weak interactions, hydrogen bonding, π···π stacking, and interaction energies of nucleobases), and alkyl bond dissociation energies and the best composite results for energetics, excluding metals.
Abstract: We present a new hybrid meta exchange-correlation functional, called M05-2X, for thermochemistry, thermochemical kinetics, and noncovalent interactions. We also provide a full discussion of the new M05 functional, previously presented in a short communication. The M05 functional was parametrized including both metals and nonmetals, whereas M05-2X is a high-nonlocality functional with double the amount of nonlocal exchange (2X) that is parametrized only for nonmetals. In particular, M05 was parametrized against 35 data values, and M05-2X is parametrized against 34 data values. Both functionals, along with 28 other functionals, have been comparatively assessed against 234 data values: the MGAE109/3 main-group atomization energy database, the IP13/3 ionization potential database, the EA13/3 electron affinity database, the HTBH38/4 database of barrier heights for hydrogen-transfer reactions, five noncovalent databases, two databases involving metal−metal and metal−ligand bond energies, a dipole moment database...

3,246 citations


Journal ArticleDOI
TL;DR: This work introduces a new method called sparse principal component analysis (SPCA) using the lasso (elastic net) to produce modified principal components with sparse loadings and shows that PCA can be formulated as a regression-type optimization problem.
Abstract: Principal component analysis (PCA) is widely used in data processing and dimensionality reduction. However, PCA suffers from the fact that each principal component is a linear combination of all the original variables, thus it is often difficult to interpret the results. We introduce a new method called sparse principal component analysis (SPCA) using the lasso (elastic net) to produce modified principal components with sparse loadings. We first show that PCA can be formulated as a regression-type optimization problem; sparse loadings are then obtained by imposing the lasso (elastic net) constraint on the regression coefficients. Efficient algorithms are proposed to fit our SPCA models for both regular multivariate data and gene expression arrays. We also give a new formula to compute the total variance of modified principal components. As illustrations, SPCA is applied to real and simulated data with encouraging results.
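
scikit-learn ships an elastic-net-based sparse PCA in the same spirit as SPCA; the sketch below (not the authors' code, and with an arbitrary penalty level) contrasts dense PCA loadings with sparse ones:

```python
import numpy as np
from sklearn.decomposition import PCA, SparsePCA

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))

# Ordinary PCA: essentially every loading is nonzero, so each component
# mixes all ten original variables and is hard to interpret.
pca = PCA(n_components=3).fit(X)
print(np.sum(np.abs(pca.components_) > 1e-8))   # typically 30 nonzeros

# Sparse PCA: the l1 penalty zeroes out many loadings, so each component
# involves only a few variables.
spca = SparsePCA(n_components=3, alpha=1.0, random_state=0).fit(X)
print(np.sum(np.abs(spca.components_) > 1e-8))  # far fewer nonzeros
```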

3,102 citations


Journal ArticleDOI
TL;DR: The Meaning in Life Questionnaire (MLQ) is a new 10-item measure of the presence of, and the search for, meaning in life, developed for use in counseling research on client well-being.
Abstract: Counseling psychologists often work with clients to increase their well-being as well as to decrease their distress. One important aspect of well-being, highlighted particularly in humanistic theories of the counseling process, is perceived meaning in life. However, poor measurement has hampered research on meaning in life. In 3 studies, evidence is provided for the internal consistency, temporal stability, factor structure, and validity of the Meaning in Life Questionnaire (MLQ), a new 10-item measure of the presence of, and the search for, meaning in life. A multitrait-multimethod matrix demonstrates the convergent and discriminant validity of the MLQ subscales across time and informants, in comparison with 2 other meaning scales. The MLQ offers several improvements over current meaning in life measures, including no item overlap with distress measures, a stable factor structure, better discriminant validity, a briefer format, and the ability to measure the search for meaning.

3,066 citations


Journal ArticleDOI
TL;DR: It is shown that increased lipopolysaccharide is bioactive in vivo and correlates with measures of innate and adaptive immune activation; these findings establish a mechanism for chronic immune activation in the context of a compromised gastrointestinal mucosal surface and provide new directions for therapeutic interventions that modify the consequences of acute HIV infection.
Abstract: Chronic activation of the immune system is a hallmark of progressive HIV infection and better predicts disease outcome than plasma viral load, yet its etiology remains obscure. Here we show that circulating microbial products, probably derived from the gastrointestinal tract, are a cause of HIV-related systemic immune activation. Circulating lipopolysaccharide, which we used as an indicator of microbial translocation, was significantly increased in chronically HIV-infected individuals and in simian immunodeficiency virus (SIV)-infected rhesus macaques (P

3,049 citations


Journal ArticleDOI
TL;DR: Survivors of childhood cancer have a high rate of illness owing to chronic health conditions, including severe, disabling, or life-threatening conditions or death due to a chronic condition.
Abstract: Background Only a few small studies have assessed the long-term morbidity that follows the treatment of childhood cancer. We determined the incidence and severity of chronic health conditions in adult survivors. Methods The Childhood Cancer Survivor Study is a retrospective cohort study that tracks the health status of adults who received a diagnosis of childhood cancer between 1970 and 1986 and compares the results with those of siblings. We calculated the frequencies of chronic conditions in 10,397 survivors and 3034 siblings. A severity score (grades 1 through 4, ranging from mild to life-threatening or disabling) was assigned to each condition. Cox proportional-hazards models were used to estimate hazard ratios, reported as relative risks and 95% confidence intervals (CIs), for a chronic condition. Results Survivors and siblings had mean ages of 26.6 years (range, 18.0 to 48.0) and 29.2 years (range, 18.0 to 56.0), respectively, at the time of the study. Among 10,397 survivors, 62.3% had at least one chronic condition; 27.5% had a severe or life-threatening condition (grade 3 or 4). The adjusted relative risk of a chronic condition in a survivor, as compared with siblings, was 3.3 (95% CI, 3.0 to 3.5); for a severe or life-threatening condition, the risk was 8.2 (95% CI, 6.9 to 9.7). Among survivors, the cumulative incidence of a chronic health condition reached 73.4% (95% CI, 69.0 to 77.9) 30 years after the cancer diagnosis, with a cumulative incidence of 42.4% (95% CI, 33.7 to 51.2) for severe, disabling, or life-threatening conditions or death due to a chronic condition. Conclusions Survivors of childhood cancer have a high rate of illness owing to chronic health conditions.
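
For readers unfamiliar with the method, a Cox proportional-hazards comparison of this kind can be sketched with the lifelines package; the tiny data frame below is hypothetical and only illustrates the survivor-versus-sibling structure of the analysis:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical data: one row per subject, with follow-up time in years,
# an event indicator (1 = chronic condition), and a survivor flag.
df = pd.DataFrame({
    "years":    [5.0, 12.3, 30.0, 8.7, 22.1, 30.0],
    "event":    [1, 1, 0, 1, 1, 0],
    "survivor": [1, 1, 1, 0, 0, 0],   # 1 = cancer survivor, 0 = sibling
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="event")
# exp(coef) for "survivor" plays the role of the relative risk of a
# chronic condition in survivors vs. siblings (3.3 in the study).
cph.print_summary()
```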

2,897 citations


Journal ArticleDOI
TL;DR: Transportation biofuels such as synfuel hydrocarbons or cellulosic ethanol, if produced from low-input biomass grown on agriculturally marginal land or from waste biomass, could provide much greater supplies and environmental benefits than food-basedBiofuels.
Abstract: Negative environmental consequences of fossil fuels and concerns about petroleum supplies have spurred the search for renewable transportation biofuels. To be a viable alternative, a biofuel should provide a net energy gain, have environmental benefits, be economically competitive, and be producible in large quantities without reducing food supplies. We use these criteria to evaluate, through life-cycle accounting, ethanol from corn grain and biodiesel from soybeans. Ethanol yields 25% more energy than the energy invested in its production, whereas biodiesel yields 93% more. Compared with ethanol, biodiesel releases just 1.0%, 8.3%, and 13% of the agricultural nitrogen, phosphorus, and pesticide pollutants, respectively, per net energy gain. Relative to the fossil fuels they displace, greenhouse gas emissions are reduced 12% by the production and combustion of ethanol and 41% by biodiesel. Biodiesel also releases less air pollutants per net energy gain than ethanol. These advantages of biodiesel over ethanol come from lower agricultural inputs and more efficient conversion of feedstocks to fuel. Neither biofuel can replace much petroleum without impacting food supplies. Even dedicating all U.S. corn and soybean production to biofuels would meet only 12% of gasoline demand and 6% of diesel demand. Until recent increases in petroleum prices, high production costs made biofuels unprofitable without subsidies. Biodiesel provides sufficient environmental advantages to merit subsidy. Transportation biofuels such as synfuel hydrocarbons or cellulosic ethanol, if produced from low-input biomass grown on agriculturally marginal land or from waste biomass, could provide much greater supplies and environmental benefits than food-based biofuels.

2,841 citations


Journal ArticleDOI
16 Mar 2006-Nature
TL;DR: It is found that memory deficits in middle-aged Tg2576 mice are caused by the extracellular accumulation of a 56-kDa soluble amyloid-β assembly, termed Aβ*56 (Aβ star 56), which may contribute to cognitive deficits associated with Alzheimer's disease.
Abstract: Memory function often declines with age, and is believed to deteriorate initially because of changes in synaptic function rather than loss of neurons. Some individuals then go on to develop Alzheimer's disease with neurodegeneration. Here we use Tg2576 mice, which express a human amyloid-beta precursor protein (APP) variant linked to Alzheimer's disease, to investigate the cause of memory decline in the absence of neurodegeneration or amyloid-beta protein amyloidosis. Young Tg2576 mice (<6 months old) have normal memory and lack neuropathology, middle-aged mice (6-14 months old) develop memory deficits without neuronal loss, and old mice (>14 months old) form abundant neuritic plaques containing amyloid-beta (refs 3-6). We found that memory deficits in middle-aged Tg2576 mice are caused by the extracellular accumulation of a 56-kDa soluble amyloid-beta assembly, which we term Abeta*56 (Abeta star 56). Abeta*56 purified from the brains of impaired Tg2576 mice disrupts memory when administered to young rats. We propose that Abeta*56 impairs memory independently of plaques or neuronal loss, and may contribute to cognitive deficits associated with Alzheimer's disease.

2,693 citations


Journal ArticleDOI
TL;DR: In this paper, a definition of dynamic capabilities, separating them from substantive capabilities as well as from their antecedents and consequences, is proposed, and a set of propositions that outline how substantive capabilities and dynamic capabilities are related to one another, how this relationship is moderated by organizational knowledge and skills, and how organizational age affects the speed of utilization of dynamic capabilities and the learning mode used in organizational change.
Abstract: The emergent literature on dynamic capabilities and their role in value creation is riddled with inconsistencies, overlapping definitions, and outright contradictions. Yet, the theoretical and practical importance of developing and applying dynamic capabilities to sustain a firm's competitive advantage in complex and volatile external environments has catapulted this issue to the forefront of the research agendas of many scholars. In this paper, we offer a definition of dynamic capabilities, separating them from substantive capabilities as well as from their antecedents and consequences. We also present a set of propositions that outline (1) how substantive capabilities and dynamic capabilities are related to one another, (2) how this relationship is moderated by organizational knowledge and skills, (3) how organizational age affects the speed of utilization of dynamic capabilities and the learning mode used in organizational change, and (4) how organizational knowledge and market dynamism affect the likely value of dynamic capabilities. Our discussion and model help to delineate key differences in the dynamic capabilities that new ventures and established companies have, revealing a key source of strategic heterogeneity between these firms.

2,546 citations



Journal ArticleDOI
TL;DR: It is shown that in a system consisting of two additive components which are mixed but whose densities are known, determining the density of the whole system allows one to calculate the proportional masses of the two components.
Abstract: One can trace to Archimedes the idea that in a system consisting of two additive components which are mixed but the densities of which are known (d1, d2), the determination of the density of the system (D) allows one to calculate the proportional masses of the two components. Let's denote these components as W1 and W2. Then, in a system with total weight W = W1 + W2, the general equation for calculating component W1 expressed as a fraction (w1) of the total body weight is:
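
The closing equation is cut off in this excerpt. Under the stated assumptions it follows by equating the system's volume to the sum of the component volumes; a standard reconstruction (not verbatim from the paper):

```latex
% Total volume is the sum of component volumes, with W2 = W - W1:
\[
  \frac{W}{D} = \frac{W_1}{d_1} + \frac{W - W_1}{d_2}
  \quad\Longrightarrow\quad
  w_1 = \frac{W_1}{W}
      = \frac{\dfrac{1}{D} - \dfrac{1}{d_2}}{\dfrac{1}{d_1} - \dfrac{1}{d_2}}.
\]
```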

Journal ArticleDOI
TL;DR: In this article, the authors presented the final report from a series of precision measurements of the muon anomalous magnetic moment, a(mu)=(g-2)/2.54 ppm, which represents a 14-fold improvement compared to previous measurements at CERN.
Abstract: We present the final report from a series of precision measurements of the muon anomalous magnetic moment, a(mu) = (g-2)/2. The details of the experimental method, apparatus, data taking, and analysis are summarized. Data obtained at Brookhaven National Laboratory, using nearly equal samples of positive and negative muons, were used to deduce a(mu)(Expt) = 11659208.0(5.4)(3.3) x 10^-10, where the statistical and systematic uncertainties are given, respectively. The combined uncertainty of 0.54 ppm represents a 14-fold improvement compared to previous measurements at CERN. The standard model value for a(mu) includes contributions from virtual QED, weak, and hadronic processes. While the QED processes account for most of the anomaly, the largest theoretical uncertainty, approximately 0.55 ppm, is associated with first-order hadronic vacuum polarization. Present standard model evaluations, based on e+e- hadronic cross sections, lie 2.2-2.7 standard deviations below the experimental result.
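
A quick arithmetic check that the quoted statistical and systematic uncertainties, combined in quadrature, reproduce the 0.54 ppm figure:

```python
import math

a_mu = 11659208.0e-10   # measured anomaly
stat = 5.4e-10          # statistical uncertainty
syst = 3.3e-10          # systematic uncertainty

total = math.hypot(stat, syst)   # quadrature sum, about 6.3e-10
ppm = total / a_mu * 1e6         # relative uncertainty in parts per million
print(f"{ppm:.2f} ppm")          # ~0.54 ppm, matching the paper
```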

Journal ArticleDOI
TL;DR: Episodic antiretroviral therapy guided by the CD4+ count significantly increased the risk of opportunistic disease or death from any cause, as compared with continuous antiretroviral therapy, largely as a consequence of lowering the CD4+ cell count and increasing the viral load.
Abstract: Methods We randomly assigned persons infected with HIV who had a CD4+ cell count of more than 350 per cubic millimeter to the continuous use of antiretroviral therapy (the viral suppression group) or the episodic use of antiretroviral therapy (the drug conservation group). Episodic use involved the deferral of therapy until the CD4+ count decreased to less than 250 per cubic millimeter and then the use of therapy until the CD4+ count increased to more than 350 per cubic millimeter. The primary end point was the development of an opportunistic disease or death from any cause. An important secondary end point was major cardiovascular, renal, or hepatic disease. Results A total of 5472 participants (2720 assigned to drug conservation and 2752 to viral suppression) were followed for an average of 16 months before the protocol was modified for the drug conservation group. At baseline, the median and nadir CD4+ counts were 597 per cubic millimeter and 250 per cubic millimeter, respectively, and 71.7% of participants had plasma HIV RNA levels of 400 copies or less per milliliter. Opportunistic disease or death from any cause occurred in 120 participants (3.3 events per 100 person-years) in the drug conservation group and 47 participants (1.3 per 100 person-years) in the viral suppression group (hazard ratio for the drug conservation group vs. the viral suppression group, 2.6; 95% confidence interval [CI], 1.9 to 3.7; P<0.001). Hazard ratios for death from any cause and for major cardiovascular, renal, and hepatic disease were 1.8 (95% CI, 1.2 to 2.9; P = 0.007) and 1.7 (95% CI, 1.1 to 2.5; P = 0.009), respectively. Adjustment for the latest CD4+ count and HIV RNA level (as time-updated covariates) reduced the hazard ratio for the primary end point from 2.6 to 1.5 (95% CI, 1.0 to 2.1). Conclusions Episodic antiretroviral therapy guided by the CD4+ count, as used in our study, significantly increased the risk of opportunistic disease or death from any cause, as compared with continuous antiretroviral therapy, largely as a consequence of lowering the CD4+ cell count and increasing the viral load. Episodic antiretroviral therapy does not reduce the risk of adverse events that have been associated with antiretroviral therapy. (ClinicalTrials.gov number, NCT00027352.)
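
As a plausibility check, the crude event rates reported above nearly reproduce the adjusted hazard ratio (assuming, for this sketch, that the two rates are directly comparable):

```python
# Crude incidence rates from the abstract (events per 100 person-years).
rate_conservation = 3.3   # 120 events in the drug conservation group
rate_suppression = 1.3    # 47 events in the viral suppression group

# Implied person-years of follow-up in each group:
print(120 / rate_conservation * 100)   # ~3636 person-years
print(47 / rate_suppression * 100)     # ~3615 person-years

# Crude rate ratio, close to the reported hazard ratio of 2.6:
print(rate_conservation / rate_suppression)   # ~2.54
```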

Journal ArticleDOI
TL;DR: In this paper, the authors present a propositional inventory organized around the initial conditions affecting collaboration formation, process, structural and governance components, constraints and contingencies, outcomes, and accountability issues.
Abstract: People who want to tackle tough social problems and achieve beneficial community outcomes are beginning to understand that multiple sectors of a democratic society—business, nonprofits and philanthropies, the media, the community, and government—must collaborate to deal effectively and humanely with the challenges. This article focuses on cross-sector collaboration that is required to remedy complex public problems. Based on an extensive review of the literature on collaboration, the article presents a propositional inventory organized around the initial conditions affecting collaboration formation, process, structural and governance components, constraints and contingencies, outcomes, and accountability issues.

Journal ArticleDOI
TL;DR: These guidelines are developed under the auspices of the American College of Gastroenterology and its practice parameters committee and may be updated with pertinent scientific developments at a later time.

Journal ArticleDOI
TL;DR: Islet transplantation with the use of the Edmonton protocol can successfully restore long-term endogenous insulin production and glycemic stability in subjects with type 1 diabetes mellitus and unstable control, but insulin independence is usually not sustainable.
Abstract: Background Islet transplantation offers the potential to improve glycemic control in a subgroup of patients with type 1 diabetes mellitus who are disabled by refractory hypoglycemia. We conducted an international, multicenter trial to explore the feasibility and reproducibility of islet transplantation with the use of a single common protocol (the Edmonton protocol). Methods We enrolled 36 subjects with type 1 diabetes mellitus, who underwent islet transplantation at nine international sites. Islets were prepared from pancreases of deceased donors and were transplanted within 2 hours after purification, without culture. The primary end point was defined as insulin independence with adequate glycemic control 1 year after the final transplantation. Results Of the 36 subjects, 16 (44%) met the primary end point, 10 (28%) had partial function, and 10 (28%) had complete graft loss 1 year after the final transplantation. A total of 21 subjects (58%) attained insulin independence with good glycemic control at any point throughout the trial. Of these subjects, 16 (76%) required insulin again at 2 years; 5 of the 16 subjects who reached the primary end point (31%) remained insulin-independent at 2 years. Conclusions Islet transplantation with the use of the Edmonton protocol can successfully restore long-term endogenous insulin production and glycemic stability in subjects with type 1 diabetes mellitus and unstable control, but insulin independence is usually not sustainable. Persistent islet function even without insulin independence provides both protection from severe hypoglycemia and improved levels of glycated hemoglobin. (ClinicalTrials.gov number, NCT00014911.)

Journal ArticleDOI
TL;DR: The Seattle Heart Failure Model provides an accurate estimate of 1-, 2-, and 3-year survival with the use of easily obtained clinical, pharmacological, device, and laboratory characteristics.
Abstract: Background— Heart failure has an annual mortality rate ranging from 5% to 75%. The purpose of the study was to develop and validate a multivariate risk model to predict 1-, 2-, and 3-year survival in heart failure patients with the use of easily obtainable characteristics relating to clinical status, therapy (pharmacological as well as devices), and laboratory parameters. Methods and Results— The Seattle Heart Failure Model was derived in a cohort of 1125 heart failure patients with the use of a multivariate Cox model. For medications and devices not available in the derivation database, hazard ratios were estimated from published literature. The model was prospectively validated in 5 additional cohorts totaling 9942 heart failure patients and 17 307 person-years of follow-up. The accuracy of the model was excellent, with predicted versus actual 1-year survival rates of 73.4% versus 74.3% in the derivation cohort and 90.5% versus 88.5%, 86.5% versus 86.5%, 83.8% versus 83.3%, 90.9% versus 91.0%, and 89.6%...
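
Risk models of this kind typically predict survival via the Cox form S(t | x) = S0(t)^exp(beta . x). The sketch below illustrates that mechanism only; the baseline survival values and coefficients are invented for illustration and are NOT the published Seattle Heart Failure Model parameters:

```python
import math

# Hypothetical baseline survival at 1, 2, and 3 years.
S0 = {1: 0.93, 2: 0.87, 3: 0.81}
# Hypothetical coefficients and one patient's covariates.
beta = {"age_per_10y": 0.15, "ef_per_5pct": -0.20, "beta_blocker": -0.25}
patient = {"age_per_10y": 6.5, "ef_per_5pct": 5.0, "beta_blocker": 1.0}

# Relative risk multiplier exp(beta . x), then S(t | x) = S0(t) ** risk.
risk = math.exp(sum(beta[k] * patient[k] for k in beta))
for year, s0 in S0.items():
    print(year, round(s0 ** risk, 3))   # predicted 1-, 2-, 3-year survival
```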

Journal ArticleDOI
08 Dec 2006-Science
TL;DR: Low-input high-diversity mixtures of native grassland perennials can provide more usable energy, greater greenhouse gas reductions, and less agrichemical pollution per hectare than can corn grain ethanol or soybean biodiesel.
Abstract: Biofuels derived from low-input high-diversity (LIHD) mixtures of native grassland perennials can provide more usable energy, greater greenhouse gas reductions, and less agrichemical pollution per hectare than can corn grain ethanol or soybean biodiesel. High-diversity grasslands had increasingly higher bioenergy yields that were 238% greater than monoculture yields after a decade. LIHD biofuels are carbon negative because net ecosystem carbon dioxide sequestration (4.4 megagram hectare(-1) year(-1) of carbon dioxide in soil and roots) exceeds fossil carbon dioxide release during biofuel production (0.32 megagram hectare(-1) year(-1)). Moreover, LIHD biofuels can be produced on agriculturally degraded lands and thus need to neither displace food production nor cause loss of biodiversity via habitat destruction.

Journal ArticleDOI
TL;DR: Among healthy postmenopausal women, calcium with vitamin D supplementation resulted in a small but significant improvement in hip bone density, did not significantly reduce hip fracture, and increased the risk of kidney stones.
Abstract: Background The efficacy of calcium with vitamin D supplementation for preventing hip and other fractures in healthy postmenopausal women remains equivocal. Methods We recruited 36,282 postmenopausal women, 50 to 79 years of age, who were already enrolled in a Women's Health Initiative (WHI) clinical trial. We randomly assigned participants to receive 1000 mg of elemental calcium as calcium carbonate with 400 IU of vitamin D3 daily or placebo. Fractures were ascertained for an average follow-up period of 7.0 years. Bone density was measured at three WHI centers. Results Hip bone density was 1.06 percent higher in the calcium plus vitamin D group than in the placebo group (P<0.01). Intention-to-treat analysis indicated that participants receiving calcium plus vitamin D supplementation had a hazard ratio of 0.88 for hip fracture (95 percent confidence interval, 0.72 to 1.08), 0.90 for clinical spine fracture (0.74 to 1.10), and 0.96 for total fractures (0.91 to 1.02). The risk of renal calculi increased with calcium plus vitamin D supplementation.

Journal ArticleDOI
01 Jun 2006-Nature
TL;DR: It is found that greater numbers of plant species led to greater temporal stability of ecosystem annual aboveground plant production and the reliable, efficient and sustainable supply of some foods, fuels and ecosystem services can be enhanced by the use of biodiversity.
Abstract: Human-driven ecosystem simplification has highlighted questions about how the number of species in an ecosystem influences its functioning. Although biodiversity is now known to affect ecosystem productivity, its effects on stability are debated. Here we present a long-term experimental field test of the diversity-stability hypothesis. During a decade of data collection in an experiment that directly controlled the number of perennial prairie species, growing-season climate varied considerably, causing year-to-year variation in abundances of plant species and in ecosystem productivity. We found that greater numbers of plant species led to greater temporal stability of ecosystem annual aboveground plant production. In particular, the decadal temporal stability of the ecosystem, whether measured with intervals of two, five or ten years, was significantly greater at higher plant diversity and tended to increase as plots matured. Ecosystem stability was also positively dependent on root mass, which is a measure of perenniating biomass. Temporal stability of the ecosystem increased with diversity, despite a lower temporal stability of individual species, because of both portfolio (statistical averaging) and overyielding effects. However, we found no evidence of a covariance effect. Our results indicate that the reliable, efficient and sustainable supply of some foods (for example, livestock fodder), biofuels and ecosystem services can be enhanced by the use of biodiversity.
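
The portfolio (statistical averaging) effect mentioned above is purely statistical and easy to simulate. A toy sketch, assuming species fluctuate independently with identical means and variances (a simplification the experiment itself does not require):

```python
import numpy as np

rng = np.random.default_rng(0)
years = 10

for n_species in (1, 4, 16):
    # Each species' annual biomass fluctuates independently (toy assumption).
    biomass = rng.normal(loc=100.0, scale=30.0,
                         size=(10_000, years, n_species))
    total = biomass.sum(axis=2)                         # community production
    stability = total.mean(axis=1) / total.std(axis=1)  # temporal mean/SD
    print(n_species, round(stability.mean(), 2))
# Stability grows roughly like sqrt(n): averaging over more independently
# fluctuating species damps year-to-year variation in the total.
```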

Journal ArticleDOI
TL;DR: This study assessed the geographic and social distribution of physical activity (PA) facilities and how disparities in access might underlie population-level PA and overweight patterns in US adolescents, finding that inequality in the availability of PA facilities may contribute to ethnic and SES disparities in PA and overweight patterns.
Abstract: CONTEXT. Environmental factors are suggested to play a major role in physical activity (PA) and other obesity-related behaviors, yet there is no national research on the relationship between disparity in access to recreational facilities and additional impact on PA and overweight patterns in US adolescents. OBJECTIVE. In a nationally representative cohort, we sought to assess the geographic and social distribution of PA facilities and how disparity in access might underlie population-level PA and overweight patterns. DESIGN, SETTING, AND PARTICIPANTS. Residential locations of US adolescents in wave I (1994–1995) of the National Longitudinal Study of Adolescent Health (N = 20745) were geocoded, and an 8.05-km buffer around each residence was drawn (N = 42857 census-block groups [19% of US block groups]). PA facilities, measured by national databases and satellite data, were linked with Geographic Information Systems technology to each respondent. Logistic-regression analyses tested the relationship of PA-related facilities with block-group socioeconomic status (SES) (at the community level) and the subsequent association of facilities with overweight and PA (at the individual level), controlling for population density. MAIN OUTCOME MEASURES. Outcome measures were overweight (BMI ≥ 95th percentile of the Centers for Disease Control and Prevention/National Center for Health Statistics growth curves) and achievement of ≥5 bouts per week of moderate-vigorous PA. RESULTS. Higher-SES block groups had a significantly greater relative odds of having 1 or more facilities. Low-SES and high-minority block groups were less likely to have facilities. Relative to zero facilities per block group, an increasing number of facilities was associated with decreased overweight and increased relative odds of achieving ≥5 bouts per week of moderate-vigorous PA. CONCLUSIONS. Lower-SES and high-minority block groups had reduced access to facilities, which in turn was associated with decreased PA and increased overweight. Inequality in availability of PA facilities may contribute to ethnic and SES disparities in PA and overweight patterns.
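
A minimal sketch of the block-group-level logistic regression described above, with synthetic data and hypothetical variable names (not the study's dataset):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic block-group data: an SES score, population density, and
# whether the block group contains at least one PA facility.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "ses": rng.normal(size=500),
    "pop_density": rng.normal(size=500),
})
logit_p = -0.5 + 0.8 * df["ses"] + 0.3 * df["pop_density"]
df["has_facility"] = rng.random(500) < 1 / (1 + np.exp(-logit_p))

# Logistic regression of facility presence on SES, controlling for density.
X = sm.add_constant(df[["ses", "pop_density"]])
model = sm.Logit(df["has_facility"].astype(float), X).fit(disp=0)
print(np.exp(model.params))   # odds ratios: a value > 1 for SES means
                              # higher-SES block groups are more likely
                              # to have a facility
```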

Journal ArticleDOI
TL;DR: A key finding is that the feedback rate per mobile must be increased linearly with the signal-to-noise ratio (SNR) (in decibels) in order to achieve the full multiplexing gain.
Abstract: Multiple transmit antennas in a downlink channel can provide tremendous capacity (i.e., multiplexing) gains, even when receivers have only single antennas. However, receiver and transmitter channel state information is generally required. In this correspondence, a system where each receiver has perfect channel knowledge, but the transmitter only receives quantized information regarding the channel instantiation is analyzed. The well-known zero-forcing transmission technique is considered, and simple expressions for the throughput degradation due to finite-rate feedback are derived. A key finding is that the feedback rate per mobile must be increased linearly with the signal-to-noise ratio (SNR) (in decibels) in order to achieve the full multiplexing gain. This is in sharp contrast to point-to-point multiple-input multiple-output (MIMO) systems, in which it is not necessary to increase the feedback rate as a function of the SNR.
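
A commonly cited form of this scaling law sets the per-user feedback near B = (M - 1) * log2(SNR), which is linear in the SNR expressed in dB; the helper below is a sketch under that reading, with constants that depend on the precise setup:

```python
import math

def feedback_bits(num_tx_antennas: int, snr_db: float) -> float:
    """Approximate per-user feedback (bits) for the zero-forcing downlink,
    using B = (M - 1) * log2(SNR) = (M - 1) * SNR_dB / (10 * log10(2))."""
    return (num_tx_antennas - 1) * snr_db / (10 * math.log10(2))

for snr_db in (5, 10, 20, 30):
    print(snr_db, round(feedback_bits(4, snr_db), 1))
# Bits grow linearly with SNR in dB, unlike point-to-point MIMO, where a
# fixed feedback rate suffices.
```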

Journal ArticleDOI
19 Oct 2006-Nature
TL;DR: It is indicated that there may have been at least four independent losses of the flagellum in the kingdom Fungi, and the enigmatic microsporidia seem to be derived from an endoparasitic chytrid ancestor similar to Rozella allomycis, on the earliest diverging branch of the fungal phylogenetic tree.
Abstract: The ancestors of fungi are believed to be simple aquatic forms with flagellated spores, similar to members of the extant phylum Chytridiomycota (chytrids). Current classifications assume that chytrids form an early-diverging clade within the kingdom Fungi and imply a single loss of the spore flagellum, leading to the diversification of terrestrial fungi. Here we develop phylogenetic hypotheses for Fungi using data from six gene regions and nearly 200 species. Our results indicate that there may have been at least four independent losses of the flagellum in the kingdom Fungi. These losses of swimming spores coincided with the evolution of new mechanisms of spore dispersal, such as aerial dispersal in mycelial groups and polar tube eversion in the microsporidia (unicellular forms that lack mitochondria). The enigmatic microsporidia seem to be derived from an endoparasitic chytrid ancestor similar to Rozella allomycis, on the earliest diverging branch of the fungal phylogenetic tree.

Journal ArticleDOI
TL;DR: In this article, a method of engaged scholarship is proposed to address the knowledge gap between theory and practice, arguing that engaged scholarship not only enhances the relevance of research for practice but also contributes significantly to advancing research knowledge in a given domain.
Abstract: We examine three related ways in which the gap between theory and practice has been framed. One approach views it as a knowledge transfer problem, a second argues that theory and practice represent distinct kinds of knowledge, and a third incorporates a strategy of arbitrage--leading to the view that the gap is a knowledge production problem. We propose a method of engaged scholarship for addressing the knowledge production problem, arguing that engaged scholarship not only enhances the relevance of research for practice but also contributes significantly to advancing research knowledge in a given domain.

Journal ArticleDOI
TL;DR: In this paper, a self-report instrument designed to measure two subtypes of student engagement with school, cognitive and psychological engagement, was proposed, based on responses of an ethnically and economically diverse urban sample of 1931 ninth grade students.

Journal ArticleDOI
TL;DR: This paper considers the problem of downlink transmit beamforming for wireless transmission and downstream precoding for digital subscriber wireline transmission, in the context of common information broadcasting or multicasting applications wherein channel state information (CSI) is available at the transmitter.
Abstract: This paper considers the problem of downlink transmit beamforming for wireless transmission and downstream precoding for digital subscriber wireline transmission, in the context of common information broadcasting or multicasting applications wherein channel state information (CSI) is available at the transmitter. Unlike the usual "blind" isotropic broadcasting scenario, the availability of CSI allows transmit optimization. A minimum transmission power criterion is adopted, subject to prescribed minimum received signal-to-noise ratios (SNRs) at each of the intended receivers. A related max-min SNR "fair" problem formulation is also considered subject to a transmitted power constraint. It is proven that both problems are NP-hard; however, suitable reformulation allows the successful application of semidefinite relaxation (SDR) techniques. SDR yields an approximate solution plus a bound on the optimum value of the associated cost/reward. SDR is motivated from a Lagrangian duality perspective, and its performance is assessed via pertinent simulations for the case of Rayleigh fading wireless channels. We find that SDR typically yields solutions that are within 3-4 dB of the optimum, which is often good enough in practice. In several scenarios, SDR generates exact solutions that meet the associated bound on the optimum value. This is illustrated using measured very-high-bit-rate digital subscriber line (VDSL) channel data, and far-field beamforming for a uniform linear transmit antenna array.
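
The SDR step for the minimum-power formulation can be sketched with cvxpy: replace the rank-one matrix w w^H by a PSD variable X and drop the rank constraint. The channel values below are synthetic, and the randomization step for extracting a beamformer is only noted in a comment:

```python
import numpy as np
import cvxpy as cp

# Toy multicast setting: N transmit antennas, K single-antenna receivers.
rng = np.random.default_rng(0)
N, K = 4, 3
H = (rng.normal(size=(K, N)) + 1j * rng.normal(size=(K, N))) / np.sqrt(2)
gamma, sigma2 = 1.0, 1.0   # target SNR and noise power

# SDR: minimize transmit power trace(X) subject to per-receiver SNR
# constraints, with X PSD standing in for w w^H (rank-1 dropped).
X = cp.Variable((N, N), hermitian=True)
constraints = [X >> 0]
for k in range(K):
    hk = H[k][:, None]
    constraints.append(
        cp.real(cp.trace((hk @ hk.conj().T) @ X)) >= gamma * sigma2)

prob = cp.Problem(cp.Minimize(cp.real(cp.trace(X))), constraints)
prob.solve()
print("lower bound on transmit power:", prob.value)
# If X is numerically rank one, its principal eigenvector is the optimal
# beamformer; otherwise Gaussian randomization extracts an approximate one.
```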

Journal ArticleDOI
TL;DR: Phylogenetic analyses, comparison of gene content across the group, and reconstruction of ancestral gene sets indicate a combination of extensive gene loss and key gene acquisitions via horizontal gene transfer during the coevolution of lactic acid bacteria with their habitats.
Abstract: Lactic acid-producing bacteria are associated with various plant and animal niches and play a key role in the production of fermented foods and beverages. We report nine genome sequences representing the phylogenetic and functional diversity of these bacteria. The small genomes of lactic acid bacteria encode a broad repertoire of transporters for efficient carbon and nitrogen acquisition from the nutritionally rich environments they inhabit and reflect a limited range of biosynthetic capabilities that indicate both prototrophic and auxotrophic strains. Phylogenetic analyses, comparison of gene content across the group, and reconstruction of ancestral gene sets indicate a combination of extensive gene loss and key gene acquisitions via horizontal gene transfer during the coevolution of lactic acid bacteria with their habitats.

Proceedings ArticleDOI
01 Sep 2006
TL;DR: This paper presents Casper, a new framework in which mobile and stationary users can entertain location-based services without revealing their location information; it consists of two main components, the location anonymizer and the privacy-aware query processor.
Abstract: This paper tackles a major privacy concern in current location-based services where users have to continuously report their locations to the database server in order to obtain the service. For example, a user asking about the nearest gas station has to report her exact location. With untrusted servers, reporting the location information may lead to several privacy threats. In this paper, we present Casper, a new framework in which mobile and stationary users can entertain location-based services without revealing their location information. Casper consists of two main components, the location anonymizer and the privacy-aware query processor. The location anonymizer blurs the users' exact location information into cloaked spatial regions based on user-specified privacy requirements. The privacy-aware query processor is embedded inside the location-based database server in order to deal with the cloaked spatial areas rather than the exact location information. Experimental results show that Casper achieves high quality location-based services while providing anonymity for both data and queries.
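
A toy illustration of grid-based cloaking in the spirit of the location anonymizer (hypothetical and heavily simplified; Casper itself maintains a pyramid/quadtree structure and also honors a minimum-area requirement):

```python
def cloak(x: float, y: float, users: list[tuple[float, float]],
          k: int, world: float = 1.0) -> tuple[float, float, float, float]:
    """Return the smallest power-of-two grid cell around (x, y) that
    contains at least k users, coarsening until the requirement is met."""
    size = world / 1024   # start from a fine grid and coarsen
    while size <= world:
        x0, y0 = (x // size) * size, (y // size) * size
        inside = sum(1 for ux, uy in users
                     if x0 <= ux < x0 + size and y0 <= uy < y0 + size)
        if inside >= k:
            return (x0, y0, x0 + size, y0 + size)   # cloaked rectangle
        size *= 2
    return (0.0, 0.0, world, world)                 # fall back to whole space

# The server sees only the cloaked rectangle, never the exact point.
users = [(0.11, 0.12), (0.13, 0.10), (0.52, 0.58), (0.14, 0.16)]
print(cloak(0.12, 0.11, users, k=3))   # -> (0.0, 0.0, 0.25, 0.25)
```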

Posted Content
TL;DR: A review of the literature suggests that the evidence for the incumbent's curse is based on anecdotes and scattered case studies of highly specialized innovations, and it is not clear whether it applies widely across product categories.
Abstract: A common perception in the field of innovation is that large, incumbent firms rarely introduce radical product innovations. Such firms tend to solidify their market positions with relatively incremental innovations. They may even turn away entrepreneurs who come up with radical innovations, though they themselves had such entrepreneurial roots. As a result, radical innovations tend to come from small firms, the outsiders. This thesis, which we term the incumbent's curse, is commonly accepted in academic and popular accounts of radical innovation. This topic is important, because radical product innovation is an engine of economic growth that has created entire industries and brought down giants while catapulting small firms to market leadership. Yet a review of the literature suggests that the evidence for the incumbent's curse is based on anecdotes and scattered case studies of highly specialized innovations. It is not clear if it applies widely across several product categories. The authors reexamine the incumbent's curse using a historical analysis of a relatively large number of radical innovations in the consumer durables and office products categories. In particular, the authors seek to answer the following questions: (1) How prevalent is this phenomenon? What percentage of radical innovations do incumbents versus nonincumbents introduce? What percentage of radical innovations do small firms versus large firms introduce? (2) Is the phenomenon a curse that invariably afflicts large incumbents in current industries? Is it driven by incumbency or size? and (3) How consistent is the phenomenon? Has the increasing size and complexity of firms over time accentuated it? Does it vary across national boundaries? Results from the study suggest that conventional wisdom about the incumbent's curse may not always be valid.