
Showing papers by "Yale University" published in 2008


Journal ArticleDOI
20 Mar 2008-Nature
TL;DR: Some of the science and technology being developed to improve the disinfection and decontamination of water, as well as efforts to increase water supplies through the safe re-use of wastewater and efficient desalination of sea and brackish water are highlighted.
Abstract: One of the most pervasive problems afflicting people throughout the world is inadequate access to clean water and sanitation. Problems with water are expected to grow worse in the coming decades, with water scarcity occurring globally, even in regions currently considered water-rich. Addressing these problems calls out for a tremendous amount of research to be conducted to identify robust new methods of purifying water at lower cost and with less energy, while at the same time minimizing the use of chemicals and impact on the environment. Here we highlight some of the science and technology being developed to improve the disinfection and decontamination of water, as well as efforts to increase water supplies through the safe re-use of wastewater and efficient desalination of sea and brackish water.

6,967 citations


Journal ArticleDOI
TL;DR: The Compact Muon Solenoid (CMS) detector at the Large Hadron Collider (LHC) at CERN was designed to study proton-proton (and lead-lead) collisions at a centre-of-mass energy of 14 TeV (5.5 TeV nucleon-nucleon) and at luminosities up to 10³⁴ cm⁻² s⁻¹
Abstract: The Compact Muon Solenoid (CMS) detector is described. The detector operates at the Large Hadron Collider (LHC) at CERN. It was conceived to study proton-proton (and lead-lead) collisions at a centre-of-mass energy of 14 TeV (5.5 TeV nucleon-nucleon) and at luminosities up to 10³⁴ cm⁻² s⁻¹ (10²⁷ cm⁻² s⁻¹). At the core of the CMS detector sits a high-magnetic-field and large-bore superconducting solenoid surrounding an all-silicon pixel and strip tracker, a lead-tungstate scintillating-crystals electromagnetic calorimeter, and a brass-scintillator sampling hadron calorimeter. The iron yoke of the flux-return is instrumented with four stations of muon detectors covering most of the 4π solid angle. Forward sampling calorimeters extend the pseudo-rapidity coverage to high values (|η| ≤ 5), assuring very good hermeticity. The overall dimensions of the CMS detector are a length of 21.6 m, a diameter of 14.6 m and a total weight of 12 500 t.

5,193 citations


Journal ArticleDOI
Ruslan Medzhitov1
23 Jul 2008-Nature
TL;DR: This work has shown that tissue stress or malfunction induces an adaptive response that is intermediate between the basal homeostatic state and a classic inflammatory response, which is referred to here as para-inflammation.
Abstract: Inflammation underlies a wide variety of physiological and pathological processes. Although the pathological aspects of many types of inflammation are well appreciated, their physiological functions are mostly unknown. The classic instigators of inflammation - infection and tissue injury - are at one end of a large range of adverse conditions that induce inflammation, and they trigger the recruitment of leukocytes and plasma proteins to the affected tissue site. Tissue stress or malfunction similarly induces an adaptive response, which is referred to here as para-inflammation. This response relies mainly on tissue-resident macrophages and is intermediate between the basal homeostatic state and a classic inflammatory response. Para-inflammation is probably responsible for the chronic inflammatory conditions that are associated with modern human diseases.

4,832 citations


Journal ArticleDOI
08 Feb 2008-Cell
TL;DR: The authors synthesize some of the basic principles that have emerged from studies of NF-kappaB, and aim to generate a more unified view of the regulation of the transcription factor.

3,996 citations


Journal ArticleDOI
TL;DR: AMPK directly phosphorylates the mTOR binding partner raptor on two well-conserved serine residues, and this phosphorylation induces 14-3-3 binding to raptor, uncovering a conserved effector of AMPK that mediates its role as a metabolic checkpoint coordinating cell growth with energy status.

3,328 citations


Journal ArticleDOI
TL;DR: The results strongly confirm 11 previously reported loci and provide genome-wide significant evidence for 21 additional loci, including the regions containing STAT3, JAK2, ICOSLG, CDKAL1 and ITLN1, which offer promise for informed therapeutic development.
Abstract: Several risk factors for Crohn's disease have been identified in recent genome-wide association studies. To advance gene discovery further, we combined data from three studies on Crohn's disease (a total of 3,230 cases and 4,829 controls) and carried out replication in 3,664 independent cases with a mixture of population-based and family-based controls. The results strongly confirm 11 previously reported loci and provide genome-wide significant evidence for 21 additional loci, including the regions containing STAT3, JAK2, ICOSLG, CDKAL1 and ITLN1. The expanded molecular understanding of the basis of this disease offers promise for informed therapeutic development.

2,584 citations


Journal ArticleDOI
TL;DR: A consensus meeting was convened by the Initiative on Methods, Measurement, and Pain Assessment in Clinical Trials (IMMPACT) to provide recommendations for interpreting the clinical importance of treatment outcomes in clinical trials of the efficacy and effectiveness of chronic pain treatments.

2,581 citations


Journal ArticleDOI
06 Jun 2008-Science
TL;DR: A quantitative sequencing-based method is developed for mapping transcribed regions, in which complementary DNA fragments are subjected to high-throughput sequencing and mapped to the genome, and it is demonstrated that most (74.5%) of the nonrepetitive sequence of the yeast genome is transcribed.
Abstract: The identification of untranslated regions, introns, and coding regions within an organism remains challenging. We developed a quantitative sequencing-based method called RNA-Seq for mapping transcribed regions, in which complementary DNA fragments are subjected to high-throughput sequencing and mapped to the genome. We applied RNA-Seq to generate a high-resolution transcriptome map of the yeast genome and demonstrated that most (74.5%) of the nonrepetitive sequence of the yeast genome is transcribed. We confirmed many known and predicted introns and demonstrated that others are not actively used. Alternative initiation codons and upstream open reading frames also were identified for many yeast genes. We also found unexpected 3'-end heterogeneity and the presence of many overlapping genes. These results indicate that the yeast transcriptome is more complex than previously appreciated.
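The paper's headline figure (74.5% of the nonrepetitive yeast genome transcribed) reduces to counting bases covered by at least one mapped cDNA fragment. A minimal sketch over a toy "genome" (the coordinates and read intervals below are invented for illustration, not data from the study):

```python
# Toy sketch: fraction of a sequence covered by mapped read intervals.
genome_length = 1000
reads = [(0, 100), (50, 200), (400, 450), (900, 1000)]  # (start, end) of mapped fragments

covered = [False] * genome_length
for start, end in reads:
    for pos in range(start, end):
        covered[pos] = True

fraction_transcribed = sum(covered) / genome_length
print(fraction_transcribed)  # 0.35: the union of the intervals spans 350 of 1000 bases
```

Real pipelines work with per-chromosome coverage arrays (or interval trees) over mapped alignments, but the covered-base bookkeeping is the same idea.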

2,506 citations


Journal ArticleDOI
TL;DR: A set of guidelines for the selection and interpretation of the methods that can be used by investigators who are attempting to examine macroautophagy and related processes, as well as by reviewers who need to provide realistic and reasonable critiques of papers that investigate these processes are presented.
Abstract: Research in autophagy continues to accelerate,(1) and as a result many new scientists are entering the field. Accordingly, it is important to establish a standard set of criteria for monitoring macroautophagy in different organisms. Recent reviews have described the range of assays that have been used for this purpose.(2,3) There are many useful and convenient methods that can be used to monitor macroautophagy in yeast, but relatively few in other model systems, and there is much confusion regarding acceptable methods to measure macroautophagy in higher eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers of autophagosomes versus those that measure flux through the autophagy pathway; thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from fully functional autophagy that includes delivery to, and degradation within, lysosomes (in most higher eukaryotes) or the vacuole (in plants and fungi). Here, we present a set of guidelines for the selection and interpretation of the methods that can be used by investigators who are attempting to examine macroautophagy and related processes, as well as by reviewers who need to provide realistic and reasonable critiques of papers that investigate these processes. This set of guidelines is not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to verify an autophagic response.

2,310 citations


Journal ArticleDOI
TL;DR: A systematic process for creating a frailty index, which relates deficit accumulation to the individual risk of death, showed reproducible properties in the Yale Precipitating Events Project cohort study.
Abstract: Background Frailty can be measured in relation to the accumulation of deficits using a frailty index. A frailty index can be developed from most ageing databases. Our objective is to systematically describe a standard procedure for constructing a frailty index.
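At its core, a deficit-accumulation frailty index is a simple proportion: the sum of an individual's deficit scores divided by the number of deficits considered. A minimal sketch (the 0/0.5/1 coding and the example counts are illustrative assumptions, not the paper's exact procedure):

```python
def frailty_index(deficits):
    """Frailty index = (sum of deficit scores) / (number of deficits assessed).

    Each deficit is coded on [0, 1]: 0 = absent, 1 = fully present,
    with intermediate grades (e.g. 0.5) allowed for ordinal items.
    Items not assessed (None) are excluded from numerator and denominator.
    """
    scored = [d for d in deficits if d is not None]
    if not scored:
        raise ValueError("no scored deficits")
    return sum(scored) / len(scored)

# A hypothetical person assessed on 40 deficits: 10 present, 2 half-present.
person = [1.0] * 10 + [0.5] * 2 + [0.0] * 28
print(frailty_index(person))  # 11/40 = 0.275
```

The standard procedure the paper describes also covers which variables qualify as deficits (age-associated, health-related, not saturating too early); the arithmetic itself stays this simple.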

2,149 citations


Journal ArticleDOI
TL;DR: Providing specific definitions for compliance and persistence is important for sound quantitative expressions of patients' drug dosing histories and their explanatory power for clinical and economic events; adoption by health outcomes researchers will provide a consistent framework and lexicon for research.

Journal ArticleDOI
TL;DR: The DEC model is sufficiently similar to character models that it might serve as a gateway through which many existing comparative methods for characters could be imported into the realm of historical biogeography; moreover, it might inspire the conceptual expansion of character models toward inclusion of evolutionary change as directly coincident with cladogenesis events.
Abstract: In historical biogeography, model-based inference methods for reconstructing the evolution of geographic ranges on phylogenetic trees are poorly developed relative to the diversity of analogous methods available for inferring character evolution. We attempt to rectify this deficiency by constructing a dispersal-extinction-cladogenesis (DEC) model for geographic range evolution that specifies instantaneous transition rates between discrete states (ranges) along phylogenetic branches and apply it to estimating likelihoods of ancestral states (range inheritance scenarios) at cladogenesis events. Unlike an earlier version of this approach, the present model allows for an analytical solution to probabilities of range transitions as a function of time, enabling free parameters in the model, rates of dispersal, and local extinction to be estimated by maximum likelihood. Simulation results indicate that accurate parameter estimates may be difficult to obtain in practice but also show that ancestral range inheritance scenarios nevertheless can be correctly recovered with high success if rates of range evolution are low relative to the rate of cladogenesis. We apply the DEC model to a previously published, exemplary case study of island biogeography involving Hawaiian endemic angiosperms in Psychotria (Rubiaceae), showing how the DEC model can be iteratively refined from inspecting inferences of range evolution and also how geological constraints involving times of island origin may be imposed on the likelihood function. The DEC model is sufficiently similar to character models that it might serve as a gateway through which many existing comparative methods for characters could be imported into the realm of historical biogeography; moreover, it might also inspire the conceptual expansion of character models toward inclusion of evolutionary change as directly coincident, either as cause or consequence, with cladogenesis events. 
The DEC model is thus an incremental advance that highlights considerable potential in the nascent field of model-based historical biogeographic inference.
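The "analytical solution to probabilities of range transitions as a function of time" mentioned above is the standard continuous-time Markov chain result P(t) = exp(Qt). A toy sketch with a hypothetical three-state rate matrix (the states and rate values are invented for illustration; the actual DEC rate matrix is built from dispersal and local-extinction parameters over geographic ranges):

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical instantaneous rate matrix Q over three ranges {A}, {B}, {A,B}.
# Off-diagonal entries are transition rates; each row sums to zero.
d, e = 0.1, 0.05   # illustrative dispersal and local-extinction rates
Q = np.array([
    [-d,   0.0,  d     ],   # {A}   -> {A,B} by dispersal
    [0.0,  -d,   d     ],   # {B}   -> {A,B} by dispersal
    [e,    e,    -2 * e],   # {A,B} -> {A} or {B} by local extinction
])

t = 5.0             # branch length (time)
P = expm(Q * t)     # P[i, j] = Prob(state j at time t | state i at time 0)
print(P.sum(axis=1))  # each row sums to 1: a proper probability distribution
```

Because P(t) is available in closed form for any t, the dispersal and extinction rates can be optimized by maximum likelihood over the branch lengths of the tree, which is exactly the property the abstract highlights.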

Journal ArticleDOI
23 Oct 2008-Nature
TL;DR: It is found that MyD88 deficiency changes the composition of the distal gut microbiota, and that exposure to the microbiota of specific pathogen-free MyD 88-negative NOD donors attenuates T1D in germ-free NOD recipients.
Abstract: Type 1 diabetes (T1D) is a debilitating autoimmune disease that results from T-cell-mediated destruction of insulin-producing beta-cells. Its incidence has increased during the past several decades in developed countries, suggesting that changes in the environment (including the human microbial environment) may influence disease pathogenesis. The incidence of spontaneous T1D in non-obese diabetic (NOD) mice can be affected by the microbial environment in the animal housing facility or by exposure to microbial stimuli, such as injection with mycobacteria or various microbial products. Here we show that specific pathogen-free NOD mice lacking MyD88 protein (an adaptor for multiple innate immune receptors that recognize microbial stimuli) do not develop T1D. The effect is dependent on commensal microbes because germ-free MyD88-negative NOD mice develop robust diabetes, whereas colonization of these germ-free MyD88-negative NOD mice with a defined microbial consortium (representing bacterial phyla normally present in human gut) attenuates T1D. We also find that MyD88 deficiency changes the composition of the distal gut microbiota, and that exposure to the microbiota of specific pathogen-free MyD88-negative NOD donors attenuates T1D in germ-free NOD recipients. Together, these findings indicate that interaction of the intestinal microbes with the innate immune system is a critical epigenetic factor modifying T1D predisposition.

Journal ArticleDOI
10 Apr 2008-Nature
TL;DR: These data reinforce several previously identified clades that split deeply in the animal tree, unambiguously resolve multiple long-standing issues for which there was strong conflicting support in earlier studies with less data, and provide molecular support for the monophyly of molluscs, a group long recognized by morphologists.
Abstract: Long-held ideas regarding the evolutionary relationships among animals have recently been upended by sometimes controversial hypotheses based largely on insights from molecular data. These new hypotheses include a clade of moulting animals (Ecdysozoa) and the close relationship of the lophophorates to molluscs and annelids (Lophotrochozoa). Many relationships remain disputed, including those that are required to polarize key features of character evolution, and support for deep nodes is often low. Phylogenomic approaches, which use data from many genes, have shown promise for resolving deep animal relationships, but are hindered by a lack of data from many important groups. Here we report a total of 39.9 Mb of expressed sequence tags from 29 animals belonging to 21 phyla, including 11 phyla previously lacking genomic or expressed-sequence-tag data. Analysed in combination with existing sequences, our data reinforce several previously identified clades that split deeply in the animal tree (including Protostomia, Ecdysozoa and Lophotrochozoa), unambiguously resolve multiple long-standing issues for which there was strong conflicting support in earlier studies with less data (such as velvet worms rather than tardigrades as the sister group of arthropods), and provide molecular support for the monophyly of molluscs, a group long recognized by morphologists. In addition, we find strong support for several new hypotheses. These include a clade that unites annelids (including sipunculans and echiurans) with nemerteans, phoronids and brachiopods, molluscs as sister to that assemblage, and the placement of ctenophores as the earliest diverging extant multicellular animals. A single origin of spiral cleavage (with subsequent losses) is inferred from well-supported nodes. Many relationships between a stable subset of taxa find strong support, and a diminishing number of lineages remain recalcitrant to placement on the tree.

Posted Content
TL;DR: This work adapts an algorithm and uses it to implement a general-purpose, multiple imputation model for missing data that is considerably faster and easier to use than the leading method recommended in the statistics literature.
Abstract: We propose a remedy for the discrepancy between the way political scientists analyze data with missing values and the recommendations of the statistics community. Methodologists and statisticians agree that "multiple imputation" is a superior approach to the problem of missing data scattered through one's explanatory and dependent variables than the methods currently used in applied data analysis. The discrepancy occurs because the computational algorithms used to apply the best multiple imputation models have been slow, difficult to implement, impossible to run with existing commercial statistical packages, and have demanded considerable expertise. We adapt an algorithm and use it to implement a general-purpose, multiple imputation model for missing data. This algorithm is considerably easier to use than the leading method recommended in the statistics literature. We also quantify the risks of current missing data practices, illustrate how to use the new procedure, and evaluate this alternative through simulated data as well as actual empirical examples. Finally, we offer easy-to-use software that implements our suggested methods.
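The multiple-imputation workflow the abstract describes can be sketched in a few lines: create m completed datasets by drawing plausible values for the missing cells, analyze each, then pool with Rubin's combining rules. The imputation model below (drawing from a normal fitted to the observed values) is a deliberately crude stand-in for the EM-based algorithm the paper adapts, which conditions on the other variables:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: one variable with ~20% of values missing (NaN).
x = rng.normal(loc=10.0, scale=2.0, size=200)
x[rng.random(200) < 0.2] = np.nan
observed = x[~np.isnan(x)]

m = 20                                   # number of imputed datasets
estimates, variances = [], []
for _ in range(m):
    filled = x.copy()
    n_miss = int(np.isnan(filled).sum())
    # Crude illustrative imputation: draw missing values from a normal
    # fitted to the observed data (real MI conditions on covariates).
    filled[np.isnan(filled)] = rng.normal(observed.mean(), observed.std(), n_miss)
    estimates.append(filled.mean())                      # per-dataset estimate
    variances.append(filled.var(ddof=1) / len(filled))   # its sampling variance

# Rubin's rules: pooled estimate and total variance.
q_bar = np.mean(estimates)               # pooled point estimate
u_bar = np.mean(variances)               # within-imputation variance
b = np.var(estimates, ddof=1)            # between-imputation variance
total_var = u_bar + (1 + 1 / m) * b
print(q_bar, total_var)
```

The key point mirrored here is that the between-imputation spread b propagates the extra uncertainty due to missingness into the pooled variance, which single imputation silently discards.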

Journal ArticleDOI
TL;DR: Greater appreciation of the convergence of mechanisms between stress, depression, and neuroplasticity is likely to lead to the identification of novel targets for more efficacious treatments.

Journal ArticleDOI
TL;DR: HR accessory factors that facilitate other stages of the Rad51- and Dmc1-catalyzed homologous DNA pairing and strand exchange reaction have also been identified.
Abstract: Homologous recombination (HR) serves to eliminate deleterious lesions, such as double-stranded breaks and interstrand crosslinks, from chromosomes. HR is also critical for the preservation of replication forks, for telomere maintenance, and chromosome segregation in meiosis I. As such, HR is indispensable for the maintenance of genome integrity and the avoidance of cancers in humans. The HR reaction is mediated by a conserved class of enzymes termed recombinases. Two recombinases, Rad51 and Dmc1, catalyze the pairing and shuffling of homologous DNA sequences in eukaryotic cells via a filamentous intermediate on ssDNA called the presynaptic filament. The assembly of the presynaptic filament is a rate-limiting process that is enhanced by recombination mediators, such as the breast tumor suppressor BRCA2. HR accessory factors that facilitate other stages of the Rad51- and Dmc1-catalyzed homologous DNA pairing and strand exchange reaction have also been identified. Recent progress on elucidating the mechanisms of action of Rad51 and Dmc1 and their cohorts of ancillary factors is reviewed here.

Journal ArticleDOI
TL;DR: To improve outcomes from GEP NETs, a better understanding of their biology is needed, with emphasis on molecular genetics and disease modeling; more-reliable serum markers, better tumour localisation and identification of small lesions, and histological grading systems and classifications with prognostic application are also needed.
Abstract: Gastroenteropancreatic (GEP) neuroendocrine tumours (NETs) are fairly rare neoplasms that present many clinical challenges. They secrete peptides and neuroamines that cause distinct clinical syndromes, including carcinoid syndrome. However, many are clinically silent until late presentation with mass effects. Investigation and management should be highly individualised for a patient, taking into consideration the likely natural history of the tumour and general health of the patient. Management strategies include surgery for cure (which is achieved rarely) or for cytoreduction, radiological intervention (by chemoembolisation and radiofrequency ablation), chemotherapy, and somatostatin analogues to control symptoms that result from release of peptides and neuroamines. New biological agents and somatostatin-tagged radionuclides are under investigation. The complexity, heterogeneity, and rarity of GEP NETs have contributed to a paucity of relevant randomised trials and little or no survival increase over the past 30 years. To improve outcome from GEP NETs, a better understanding of their biology is needed, with emphasis on molecular genetics and disease modeling. More-reliable serum markers, better tumour localisation and identification of small lesions, and histological grading systems and classifications with prognostic application are needed. Comparison between treatments is currently very difficult. Progress is unlikely to occur without development of centers of excellence, with dedicated combined clinical teams to coordinate multicentre studies, maintain clinical and tissue databases, and refine molecularly targeted therapeutics.

Journal ArticleDOI
TL;DR: A redshift quality parameter, Q_z, is introduced, which provides a robust estimate of the reliability of the photometric redshift estimate, and updated photometric redshift catalogs are provided for the FIRES, MUSYC, and FIREWORKS surveys.
Abstract: We describe a new program for determining photometric redshifts, dubbed EAZY. The program is optimized for cases where spectroscopic redshifts are not available, or are only available for a biased subset of the galaxies. The code combines features from various existing codes: it can fit linear combinations of templates, it includes optional flux- and redshift-based priors, and its user interface is modeled on the popular HYPERZ code. A novel feature is that the default template set, as well as the default functional forms of the priors, are not based on (usually highly biased) spectroscopic samples, but on semianalytical models. Furthermore, template mismatch is addressed by a novel rest-frame template error function. This function gives different wavelength regions different weights, and ensures that the formal redshift uncertainties are realistic. We introduce a redshift quality parameter, Q_z, which provides a robust estimate of the reliability of the photometric redshift estimate. Despite the fact that EAZY is not trained on spectroscopic samples, the code (with default parameters) performs very well on existing public data sets. For K-selected samples in CDF-South and other deep fields, we find a 1σ scatter in Δz/(1 + z) of 0.034, and we provide updated photometric redshift catalogs for the FIRES, MUSYC, and FIREWORKS surveys.
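At its core, template-based photometric redshift estimation minimizes chi-square between observed broadband fluxes and redshifted template fluxes over a grid of trial redshifts. A heavily simplified sketch (the toy template, bands, and fluxes below are invented; EAZY additionally fits linear template combinations and applies priors and a template error function):

```python
import numpy as np

def photo_z(obs_flux, obs_err, template, z_grid):
    """Return the trial redshift minimizing chi^2, solving the template's
    amplitude analytically at each z (standard least-squares scaling)."""
    chi2 = []
    for z in z_grid:
        model = template(z)                      # model fluxes at trial z
        # Amplitude a minimizing sum(((obs - a*model) / err)^2):
        a = np.sum(obs_flux * model / obs_err**2) / np.sum(model**2 / obs_err**2)
        chi2.append(np.sum(((obs_flux - a * model) / obs_err) ** 2))
    return z_grid[int(np.argmin(chi2))]

# Invented toy "template": flux in 5 bands shifting with (1 + z).
bands = np.linspace(400, 900, 5)                 # central wavelengths, nm
def template(z):
    return np.exp(-((bands / (1 + z) - 400) / 150) ** 2)

z_true = 0.8
obs = 2.5 * template(z_true)                     # noiseless toy observation
err = np.full_like(obs, 0.05)
z_grid = np.linspace(0.0, 2.0, 201)
print(photo_z(obs, err, template, z_grid))       # recovers z near 0.8
```

With noiseless data the minimum sits at the true redshift; with real photometry the chi-square surface broadens, which is where the priors and the Q_z reliability parameter described in the abstract come in.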

Journal ArticleDOI
03 Oct 2008-Science
TL;DR: A comparative quality assessment of current yeast interactome data sets is carried out, demonstrating that high-throughput yeast two-hybrid (Y2H) screening provides high-quality binary interaction information.
Abstract: Current yeast interactome network maps contain several hundred molecular complexes with limited and somewhat controversial representation of direct binary interactions. We carried out a comparative quality assessment of current yeast interactome data sets, demonstrating that high-throughput yeast two-hybrid (Y2H) screening provides high-quality binary interaction information. Because a large fraction of the yeast binary interactome remains to be mapped, we developed an empirically controlled mapping framework to produce a "second-generation" high-quality, high-throughput Y2H data set covering approximately 20% of all yeast binary interactions. Both Y2H and affinity purification followed by mass spectrometry (AP/MS) data are of equally high quality but of a fundamentally different and complementary nature, resulting in networks with different topological and biological properties. Compared to co-complex interactome models, this binary map is enriched for transient signaling interactions and intercomplex connections with a highly significant clustering between essential proteins. Rather than correlating with essentiality, protein connectivity correlates with genetic pleiotropy.

Journal ArticleDOI
Rajita Sinha1
TL;DR: The effects of regular and chronic drug use on alterations in these stress and motivational systems are reviewed, with specific attention to the impact of these adaptations on stress regulation, impulse control, and perpetuation of compulsive drug seeking and relapse susceptibility.
Abstract: Stress is a well-known risk factor in the development of addiction and in addiction relapse vulnerability. A series of population-based and epidemiological studies have identified specific stressors and individual-level variables that are predictive of substance use and abuse. Preclinical research also shows that stress exposure enhances drug self-administration and reinstates drug seeking in drug-experienced animals. The deleterious effects of early life stress, child maltreatment, and accumulated adversity on alterations in the corticotropin releasing factor and hypothalamic-pituitary-adrenal axis (CRF/HPA), the extrahypothalamic CRF, the autonomic arousal, and the central noradrenergic systems are also presented. The effects of these alterations on the corticostriatal-limbic motivational, learning, and adaptation systems that include mesolimbic dopamine, glutamate, and gamma-amino-butyric acid (GABA) pathways are discussed as the underlying pathophysiology associated with stress-related risk of addiction. The effects of regular and chronic drug use on alterations in these stress and motivational systems are also reviewed, with specific attention to the impact of these adaptations on stress regulation, impulse control, and perpetuation of compulsive drug seeking and relapse susceptibility. Finally, research gaps in furthering our understanding of the association between stress and addiction are presented, with the hope that addressing these unanswered questions will significantly influence new prevention and treatment strategies to address vulnerability to addiction.

Journal ArticleDOI
19 Jun 2008-Nature
TL;DR: It is shown that aluminium adjuvants activate an intracellular innate immune response system called the Nalp3 (also known as cryopyrin, CIAS1 or NLRP3) inflammasome, which is a crucial element in the adjuvant effect of aluminium adjuvants.
Abstract: Aluminium adjuvants, typically referred to as 'alum', are the most commonly used adjuvants in human and animal vaccines worldwide, yet the mechanism underlying the stimulation of the immune system by alum remains unknown. Toll-like receptors are critical in sensing infections and are therefore common targets of various adjuvants used in immunological studies. Although alum is known to induce the production of proinflammatory cytokines in vitro, it has been repeatedly demonstrated that alum does not require intact Toll-like receptor signalling to activate the immune system. Here we show that aluminium adjuvants activate an intracellular innate immune response system called the Nalp3 (also known as cryopyrin, CIAS1 or NLRP3) inflammasome. Production of the pro-inflammatory cytokines interleukin-1beta and interleukin-18 by macrophages in response to alum in vitro required intact inflammasome signalling. Furthermore, in vivo, mice deficient in Nalp3, ASC (apoptosis-associated speck-like protein containing a caspase recruitment domain) or caspase-1 failed to mount a significant antibody response to an antigen administered with aluminium adjuvants, whereas the response to complete Freund's adjuvant remained intact. We identify the Nalp3 inflammasome as a crucial element in the adjuvant effect of aluminium adjuvants; in addition, we show that the innate inflammasome pathway can direct a humoral adaptive immune response. This is likely to affect how we design effective, but safe, adjuvants in the future.

Journal ArticleDOI
TL;DR: Today, chemotherapy has changed as important molecular abnormalities are being used to screen for potential new drugs as well as for targeted treatments.
Abstract: The use of chemotherapy to treat cancer began at the start of the 20th century with attempts to narrow the universe of chemicals that might affect the disease by developing methods to screen chemicals using transplantable tumors in rodents. It was, however, four World War II-related programs, and the effects of drugs that evolved from them, that provided the impetus to establish in 1955 the national drug development effort known as the Cancer Chemotherapy National Service Center. The ability of combination chemotherapy to cure acute childhood leukemia and advanced Hodgkin's disease in the 1960s and early 1970s overcame the prevailing pessimism about the ability of drugs to cure advanced cancers, facilitated the study of adjuvant chemotherapy, and helped foster the national cancer program. Today, chemotherapy has changed as important molecular abnormalities are being used to screen for potential new drugs as well as for targeted treatments.

Journal ArticleDOI
TL;DR: This review critically assesses the contributions of carbon-based nanomaterials to a broad range of environmental applications: sorbents, high-flux membranes, depth filters, antimicrobial agents, environmental sensors, renewable energy technologies, and pollution prevention strategies.
Abstract: The unique and tunable properties of carbon-based nanomaterials enable new technologies for identifying and addressing environmental challenges. This review critically assesses the contributions of carbon-based nanomaterials to a broad range of environmental applications: sorbents, high-flux membranes, depth filters, antimicrobial agents, environmental sensors, renewable energy technologies, and pollution prevention strategies. In linking technological advance back to the physical, chemical, and electronic properties of carbonaceous nanomaterials, this article also outlines future opportunities for nanomaterial application in environmental systems.

Journal ArticleDOI
TL;DR: EI--conceptualized as an ability--is an important variable both conceptually and empirically, and it shows incremental validity for predicting socially relevant outcomes.
Abstract: Some individuals have a greater capacity than others to carry out sophisticated information processing about emotions and emotion-relevant stimuli and to use this information as a guide to thinking and behavior. The authors have termed this set of abilities emotional intelligence (EI). Since the introduction of the concept, however, a schism has developed in which some researchers focus on EI as a distinct group of mental abilities, and other researchers instead study an eclectic mix of positive traits such as happiness, self-esteem, and optimism. Clarifying what EI is and is not can help the field by better distinguishing research that is truly pertinent to EI from research that is not. EI--conceptualized as an ability--is an important variable both conceptually and empirically, and it shows incremental validity for predicting socially relevant outcomes.

Journal ArticleDOI
06 Aug 2008-JAMA
TL;DR: This study provides the first direct estimates of HIV incidence in the United States using laboratory technologies previously implemented only in clinic-based settings and indicated that HIV incidence increased in the mid-1990s, then slightly declined after 1999 and has been stable thereafter.
Abstract: Context Incidence of human immunodeficiency virus (HIV) in the United States has not been directly measured. New assays that differentiate recent vs long-standing HIV infections allow improved estimation of HIV incidence. Objective To estimate HIV incidence in the United States. Design, Setting, and Patients Remnant diagnostic serum specimens from patients 13 years or older and newly diagnosed with HIV during 2006 in 22 states were tested with the BED HIV-1 capture enzyme immunoassay to classify infections as recent or long-standing. Information on HIV cases was reported to the Centers for Disease Control and Prevention through June 2007. Incidence of HIV in the 22 states during 2006 was estimated using a statistical approach with adjustment for testing frequency and extrapolated to the United States. Results were corroborated with back-calculation of HIV incidence for 1977-2006 based on HIV diagnoses from 40 states and AIDS incidence from 50 states and the District of Columbia. Main Outcome Measure Estimated HIV incidence. Results An estimated 39 400 persons were diagnosed with HIV in 2006 in the 22 states. Of 6864 diagnostic specimens tested using the BED assay, 2133 (31%) were classified as recent infections. Based on extrapolations from these data, the estimated number of new infections for the United States in 2006 was 56 300 (95% confidence interval [CI], 48 200-64 500); the estimated incidence rate was 22.8 per 100 000 population (95% CI, 19.5-26.1). Forty-five percent of infections were among black individuals and 53% among men who have sex with men. The back-calculation (n = 1.230 million HIV/AIDS cases reported by the end of 2006) yielded an estimate of 55 400 (95% CI, 50 000-60 800) new infections per year for 2003-2006 and indicated that HIV incidence increased in the mid-1990s, then slightly declined after 1999 and has been stable thereafter. 
Conclusions This study provides the first direct estimates of HIV incidence in the United States using laboratory technologies previously implemented only in clinic-based settings. New HIV infections in the United States remain concentrated among men who have sex with men and among black individuals.
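The incidence figures in the abstract above can be checked with simple arithmetic. The sketch below reproduces the 31% recent-infection fraction (2133 of 6864 specimens) and the 22.8 per 100 000 incidence rate. Note that the rate only works out if the denominator is the US population aged 13 or older (roughly 247 million in 2006) rather than the total population; that denominator is an inference here, not stated in the abstract.

```python
# Back-of-the-envelope check of the figures quoted in the abstract.
# Assumption: the incidence-rate denominator is the US population aged
# 13 or older (~246.9 million in 2006), not the total US population.

recent = 2133              # specimens classified as recent by the BED assay
tested = 6864              # diagnostic specimens tested
new_infections = 56_300    # extrapolated new infections for 2006
population_13plus = 246.9e6  # implied denominator (assumption)

recent_fraction = recent / tested
rate_per_100k = new_infections / population_13plus * 100_000

print(f"recent fraction: {recent_fraction:.0%}")           # ~31%
print(f"incidence rate: {rate_per_100k:.1f} per 100 000")  # ~22.8
```

Both computed values match the abstract, which supports the reading that the reported rate is relative to the population eligible for the surveillance system (persons 13 years or older).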

Journal ArticleDOI
TL;DR: Improved understanding of the biological functions of small non-coding RNAs has been fostered by the analysis of genetic deletions of individual miRNAs in mammals; these studies show that miRNAs are key regulators of animal development and are potential human disease loci.
Abstract: Our understanding of the biological functions of small non-coding RNAs has been fostered by the analysis of genetic deletions of individual microRNAs (miRNAs) in mammals. These studies show that miRNAs are key regulators of animal development and are potential human disease loci.

Journal ArticleDOI
TL;DR: Continuous glucose monitoring can be associated with improved glycemic control in adults with type 1 diabetes and further work is needed to identify barriers to effectiveness of continuous monitoring in children and adolescents.
Abstract: BACKGROUND The value of continuous glucose monitoring in the management of type 1 diabetes mellitus has not been determined. METHODS In a multicenter clinical trial, we randomly assigned 322 adults and children who were already receiving intensive therapy for type 1 diabetes to a group with continuous glucose monitoring or to a control group performing home monitoring with a blood glucose meter. All the patients were stratified into three groups according to age and had a glycated hemoglobin level of 7.0 to 10.0%. The primary outcome was the change in the glycated hemoglobin level at 26 weeks. RESULTS The changes in glycated hemoglobin levels in the two study groups varied markedly according to age group (P=0.003), with a significant difference among patients 25 years of age or older that favored the continuous-monitoring group (mean difference in change, -0.53%; 95% confidence interval [CI], -0.71 to -0.35; P<0.001). The between-group difference was not significant among those who were 15 to 24 years of age (mean difference, 0.08; 95% CI, -0.17 to 0.33; P=0.52) or among those who were 8 to 14 years of age (mean difference, -0.13; 95% CI, -0.38 to 0.11; P=0.29). Secondary glycated hemoglobin outcomes were better in the continuous-monitoring group than in the control group among the oldest and youngest patients but not among those who were 15 to 24 years of age. The use of continuous glucose monitoring averaged 6.0 or more days per week for 83% of patients 25 years of age or older, 30% of those 15 to 24 years of age, and 50% of those 8 to 14 years of age. The rate of severe hypoglycemia was low and did not differ between the two study groups; however, the trial was not powered to detect such a difference. CONCLUSIONS Continuous glucose monitoring can be associated with improved glycemic control in adults with type 1 diabetes. Further work is needed to identify barriers to effectiveness of continuous monitoring in children and adolescents. 
(ClinicalTrials.gov number, NCT00406133.)

Journal ArticleDOI
TL;DR: These findings demonstrate that prospective ascertainment of individuals at risk for psychosis is feasible, with a level of predictive accuracy comparable to that in other areas of preventive medicine.
Abstract: Context Early detection and prospective evaluation of individuals who will develop schizophrenia or other psychotic disorders are critical to efforts to isolate mechanisms underlying psychosis onset and to the testing of preventive interventions, but existing risk prediction approaches have achieved only modest predictive accuracy. Objectives To determine the risk of conversion to psychosis and to evaluate a set of prediction algorithms maximizing positive predictive power in a clinical high-risk sample. Design, Setting, and Participants Longitudinal study with a 2½-year follow-up of 291 prospectively identified treatment-seeking patients meeting Structured Interview for Prodromal Syndromes criteria. The patients were recruited and underwent evaluation across 8 clinical research centers as part of the North American Prodrome Longitudinal Study. Main Outcome Measure Time to conversion to a fully psychotic form of mental illness. Results The risk of conversion to psychosis was 35%, with a decelerating rate of transition during the 2½-year follow-up. Five features assessed at baseline contributed uniquely to the prediction of psychosis: a genetic risk for schizophrenia with recent deterioration in functioning, higher levels of unusual thought content, higher levels of suspicion/paranoia, greater social impairment, and a history of substance abuse. Prediction algorithms combining 2 or 3 of these variables resulted in dramatic increases in positive predictive power (ie, 68%-80%) compared with the prodromal criteria alone. Conclusions These findings demonstrate that prospective ascertainment of individuals at risk for psychosis is feasible, with a level of predictive accuracy comparable to that in other areas of preventive medicine. They provide a benchmark for the rate and shape of the psychosis risk function against which standardized preventive intervention programs can be compared.
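The abstract above reports that combining 2 or 3 baseline features raised positive predictive power from the ~35% base rate to 68%-80%. The sketch below is a generic illustration of why that happens, using Bayes' rule with hypothetical sensitivity and specificity values (not the study's actual data): requiring multiple features makes a predictor more specific, and at a fixed base rate higher specificity drives PPV up sharply.

```python
# Generic illustration of positive predictive value (PPV) — hypothetical
# numbers, not data from the North American Prodrome Longitudinal Study.

def ppv(sensitivity: float, specificity: float, base_rate: float) -> float:
    """PPV from sensitivity, specificity, and prevalence via Bayes' rule."""
    true_pos = sensitivity * base_rate
    false_pos = (1 - specificity) * (1 - base_rate)
    return true_pos / (true_pos + false_pos)

BASE_RATE = 0.35  # conversion risk reported in the abstract

# A single loose criterion (modest specificity) vs. a stricter algorithm
# requiring several features (high specificity, lower sensitivity):
print(round(ppv(sensitivity=0.80, specificity=0.50, base_rate=BASE_RATE), 2))
print(round(ppv(sensitivity=0.30, specificity=0.95, base_rate=BASE_RATE), 2))
```

With these illustrative inputs, the stricter predictor's PPV lands in the 70%-80% range, mirroring the qualitative pattern the study reports for its combined algorithms.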

Journal ArticleDOI
Gina Novick1
TL;DR: Research is needed that compares these modalities and examines their impact on data quality and their suitability for studying varying topics and populations; such studies could contribute evidence-based guidelines for optimizing interview data.
Abstract: Telephone interviews are largely neglected in the qualitative research literature and, when discussed, they are often depicted as a less attractive alternative to face-to-face interviewing. The absence of visual cues via telephone is thought to result in loss of contextual and nonverbal data and to compromise rapport, probing, and interpretation of responses. Yet, telephones may allow respondents to feel relaxed and able to disclose sensitive information, and evidence is lacking that they produce lower quality data. This apparent bias against telephone interviews contrasts with a growing interest in electronic qualitative interviews. Research is needed comparing these modalities, and examining their impact on data quality and their use for studying varying topics and populations. Such studies could contribute evidence-based guidelines for optimizing interview data.