scispace - formally typeset

Showing papers from "University of California, San Diego" published in 2008


Journal ArticleDOI
16 May 2008-Cell
TL;DR: It is reported that the induction of an EMT in immortalized human mammary epithelial cells (HMLEs) results in the acquisition of mesenchymal traits and in the expression of stem-cell markers, and it is shown that those cells have an increased ability to form mammospheres, a property associated with mammary epithelial stem cells.

8,052 citations


Journal ArticleDOI
TL;DR: The Compact Muon Solenoid (CMS) detector at the Large Hadron Collider (LHC) at CERN was designed to study proton-proton (and lead-lead) collisions at a centre-of-mass energy of 14 TeV (5.5 TeV nucleon-nucleon) and at luminosities up to 10^34 cm^-2 s^-1.
Abstract: The Compact Muon Solenoid (CMS) detector is described. The detector operates at the Large Hadron Collider (LHC) at CERN. It was conceived to study proton-proton (and lead-lead) collisions at a centre-of-mass energy of 14 TeV (5.5 TeV nucleon-nucleon) and at luminosities up to 10^34 cm^-2 s^-1 (10^27 cm^-2 s^-1). At the core of the CMS detector sits a high-magnetic-field and large-bore superconducting solenoid surrounding an all-silicon pixel and strip tracker, a lead-tungstate scintillating-crystals electromagnetic calorimeter, and a brass-scintillator sampling hadron calorimeter. The iron yoke of the flux-return is instrumented with four stations of muon detectors covering most of the 4π solid angle. Forward sampling calorimeters extend the pseudo-rapidity coverage to high values (|η| ≤ 5), assuring very good hermeticity. The overall dimensions of the CMS detector are a length of 21.6 m, a diameter of 14.6 m and a total weight of 12,500 t.

5,193 citations


Journal ArticleDOI
Jean Bousquet, N. Khaltaev, Alvaro A. Cruz1, Judah A. Denburg2, W. J. Fokkens3, Alkis Togias4, T. Zuberbier5, Carlos E. Baena-Cagnani6, Giorgio Walter Canonica7, C. van Weel8, Ioana Agache9, Nadia Aït-Khaled, Claus Bachert10, Michael S. Blaiss11, Sergio Bonini12, L.-P. Boulet13, Philippe-Jean Bousquet, Paulo Augusto Moreira Camargos14, K-H. Carlsen15, Y. Z. Chen, Adnan Custovic16, Ronald Dahl17, Pascal Demoly, H. Douagui, Stephen R. Durham18, R. Gerth van Wijk19, O. Kalayci19, Michael A. Kaliner20, You Young Kim21, Marek L. Kowalski, Piotr Kuna22, L. T. T. Le23, Catherine Lemière24, Jing Li25, Richard F. Lockey26, S. Mavale-Manuel26, Eli O. Meltzer27, Y. Mohammad28, J Mullol, Robert M. Naclerio29, Robyn E O'Hehir30, K. Ohta31, S. Ouedraogo31, S. Palkonen, Nikolaos G. Papadopoulos32, Gianni Passalacqua7, Ruby Pawankar33, Todor A. Popov34, Klaus F. Rabe35, J Rosado-Pinto36, G. K. Scadding37, F. E. R. Simons38, Elina Toskala39, E. Valovirta40, P. Van Cauwenberge10, De Yun Wang41, Magnus Wickman42, Barbara P. Yawn43, Arzu Yorgancioglu44, Osman M. Yusuf, H. J. Zar45, Isabella Annesi-Maesano46, E.D. Bateman45, A. Ben Kheder47, Daniel A. Boakye48, J. Bouchard, Peter Burney18, William W. Busse49, Moira Chan-Yeung50, Niels H. Chavannes35, A.G. Chuchalin, William K. Dolen51, R. Emuzyte52, Lawrence Grouse53, Marc Humbert, C. M. Jackson54, Sebastian L. Johnston18, Paul K. Keith2, James P. Kemp27, J. M. Klossek55, Désirée Larenas-Linnemann55, Brian J. Lipworth54, Jean-Luc Malo24, Gailen D. Marshall56, Charles K. Naspitz57, K. Nekam, Bodo Niggemann58, Ewa Nizankowska-Mogilnicka59, Yoshitaka Okamoto60, M. P. Orru61, Paul Potter45, David Price62, Stuart W. Stoloff63, Olivier Vandenplas, Giovanni Viegi, Dennis M. Williams64 
Federal University of Bahia1, McMaster University2, University of Amsterdam3, National Institutes of Health4, Charité5, Catholic University of Cordoba6, University of Genoa7, Radboud University Nijmegen8, Transilvania University of Brașov9, Ghent University10, University of Tennessee Health Science Center11, University of Naples Federico II12, Laval University13, Universidade Federal de Minas Gerais14, University of Oslo15, University of Manchester16, Aarhus University17, Imperial College London18, Erasmus University Rotterdam19, George Washington University20, Seoul National University21, Medical University of Łódź22, Hai phong University Of Medicine and Pharmacy23, Université de Montréal24, Guangzhou Medical University25, University of South Florida26, University of California, San Diego27, University of California28, University of Chicago29, Monash University30, Teikyo University31, National and Kapodistrian University of Athens32, Nippon Medical School33, Sofia Medical University34, Leiden University35, Leiden University Medical Center36, University College London37, University of Manitoba38, University of Helsinki39, Finnish Institute of Occupational Health40, National University of Singapore41, Karolinska Institutet42, University of Minnesota43, Celal Bayar University44, University of Cape Town45, Pierre-and-Marie-Curie University46, Tunis University47, University of Ghana48, University of Wisconsin-Madison49, University of British Columbia50, Georgia Regents University51, Vilnius University52, University of Washington53, University of Dundee54, University of Poitiers55, University of Mississippi56, Federal University of São Paulo57, German Red Cross58, Jagiellonian University Medical College59, Chiba University60, American Pharmacists Association61, University of Aberdeen62, University of Nevada, Reno63, University of North Carolina at Chapel Hill64
01 Apr 2008-Allergy
TL;DR: The ARIA recommendations for the management of allergic rhinitis and asthma are similar in the 1999 ARIA workshop report and the 2008 Update; in the future the GRADE approach will be used, but it is not yet available.
Abstract: Allergic rhinitis is a symptomatic disorder of the nose induced after allergen exposure by an IgE-mediated inflammation of the membranes lining the nose. It is a global health problem that causes major illness and disability worldwide. Over 600 million patients from all countries, all ethnic groups and of all ages suffer from allergic rhinitis. It affects social life, sleep, school and work and its economic impact is substantial. Risk factors for allergic rhinitis are well identified. Indoor and outdoor allergens as well as occupational agents cause rhinitis and other allergic diseases. The role of indoor and outdoor pollution is probably very important, but has yet to be fully understood both for the occurrence of the disease and its manifestations. In 1999, during the Allergic Rhinitis and its Impact on Asthma (ARIA) WHO workshop, the expert panel proposed a new classification for allergic rhinitis which was subdivided into 'intermittent' or 'persistent' disease. This classification is now validated. The diagnosis of allergic rhinitis is often quite easy, but in some cases it may cause problems and many patients are still under-diagnosed, often because they do not perceive the symptoms of rhinitis as a disease impairing their social life, school and work. The management of allergic rhinitis is well established and the ARIA expert panel based its recommendations on evidence using an extensive review of the literature available up to December 1999. The statements of evidence for the development of these guidelines followed WHO rules and were based on those of Shekelle et al. A large number of papers have been published since 2000 and are extensively reviewed in the 2008 Update using the same evidence-based system. Recommendations for the management of allergic rhinitis are similar in both the ARIA workshop report and the 2008 Update. In the future, the GRADE approach will be used, but is not yet available. 
Another important aspect of the ARIA guidelines was to consider co-morbidities. Both allergic rhinitis and asthma are systemic inflammatory conditions and often co-exist in the same patients. In the 2008 Update, these links have been confirmed. The ARIA document is not intended to be a standard-of-care document for individual countries. It is provided as a basis for physicians, health care professionals and organizations involved in the treatment of allergic rhinitis and asthma in various countries to facilitate the development of relevant local standard-of-care documents for patients.

3,769 citations


Journal ArticleDOI
TL;DR: The approach taken in ADNI to standardization across sites and platforms of the MRI protocol, postacquisition corrections, and phantom‐based monitoring of all scanners could be used as a model for other multisite trials.
Abstract: Dementia, one of the most feared associates of increasing longevity, represents a pressing public health problem and major research priority. Alzheimer's disease (AD) is the most common form of dementia, affecting many millions around the world. There is currently no cure for AD, but large numbers of novel compounds are currently under development that have the potential to modify the course of the disease and slow its progression. There is a pressing need for imaging biomarkers to improve understanding of the disease and to assess the efficacy of these proposed treatments. Structural magnetic resonance imaging (MRI) has already been shown to be sensitive to presymptomatic disease (1-10) and has the potential to provide such a biomarker. For use in large-scale multicenter studies, however, standardized methods that produce stable results across scanners and over time are needed. The Alzheimer's Disease Neuroimaging Initiative (ADNI) study is a longitudinal multisite observational study of elderly individuals with normal cognition, mild cognitive impairment (MCI), or AD (11,12). It is jointly funded by the National Institutes of Health (NIH) and industry via the Foundation for the NIH. The study will assess how well information (alone or in combination) obtained from MRI, 18F-fluorodeoxyglucose positron emission tomography (FDG PET), urine, serum, and cerebrospinal fluid (CSF) biomarkers, as well as clinical and neuropsychometric assessments, can measure disease progression in the three groups of elderly subjects mentioned above. At the 55 participating sites in North America, imaging, clinical, and biologic samples will be collected at multiple time points in 200 elderly cognitively normal, 400 MCI, and 200 AD subjects. All subjects will be scanned with 1.5 T MRI at each time point, and half of these will also be scanned with FDG PET. Subjects not assigned to the PET arm of the study will be eligible for 3 T MRI scanning. 
The goal is to acquire both 1.5 T and 3 T MRI studies at multiple time points in 25% of the subjects who do not undergo PET scanning. CSF collection at both baseline and 12 months is targeted for 50% of the subjects. Sampling varies by clinical group. Healthy elderly controls will be sampled at 0, 6, 12, 24, and 36 months. Subjects with MCI will be sampled at 0, 6, 12, 18, 24, and 36 months. AD subjects will be sampled at 0, 6, 12, and 24 months. Major goals of the ADNI study are: to link all of these data at each time point and make this repository available to the general scientific community; to develop technical standards for imaging in longitudinal studies; to determine the optimum methods for acquiring and analyzing images; to validate imaging and biomarker data by correlating these with concurrent psychometric and clinical assessments; and to improve methods for clinical trials in MCI and AD. The ADNI study overall is divided into cores, with each core managing ADNI-related activities within its sphere of expertise: clinical, informatics, biostatistics, biomarkers, and imaging. The purpose of this report is to describe the MRI methods and decision-making process underlying the selection of the MRI protocol employed in the ADNI study.

3,611 citations


Journal ArticleDOI
17 Aug 2008
TL;DR: This paper shows how to leverage largely commodity Ethernet switches to support the full aggregate bandwidth of clusters consisting of tens of thousands of elements and argues that appropriately architected and interconnected commodity switches may deliver more performance at less cost than available from today's higher-end solutions.
Abstract: Today's data centers may contain tens of thousands of computers with significant aggregate bandwidth requirements. The network architecture typically consists of a tree of routing and switching elements with progressively more specialized and expensive equipment moving up the network hierarchy. Unfortunately, even when deploying the highest-end IP switches/routers, resulting topologies may only support 50% of the aggregate bandwidth available at the edge of the network, while still incurring tremendous cost. Non-uniform bandwidth among data center nodes complicates application design and limits overall system performance.In this paper, we show how to leverage largely commodity Ethernet switches to support the full aggregate bandwidth of clusters consisting of tens of thousands of elements. Similar to how clusters of commodity computers have largely replaced more specialized SMPs and MPPs, we argue that appropriately architected and interconnected commodity switches may deliver more performance at less cost than available from today's higher-end solutions. Our approach requires no modifications to the end host network interface, operating system, or applications; critically, it is fully backward compatible with Ethernet, IP, and TCP.
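The full paper builds its interconnect as a fat-tree (folded-Clos) topology out of identical k-port commodity switches; the abstract omits the arithmetic, but the standard fat-tree bookkeeping can be sketched as follows. This is an illustrative reconstruction, not code from the paper, and the function name is ours:

```python
def fat_tree_capacity(k):
    """Element counts for a k-ary fat tree built entirely from k-port switches.

    There are k pods, each with k/2 edge and k/2 aggregation switches,
    plus (k/2)^2 core switches; each edge switch serves k/2 hosts, so the
    topology supports k^3/4 hosts with full aggregate (bisection) bandwidth.
    """
    assert k % 2 == 0, "fat trees use an even port count"
    edge = agg = k * (k // 2)      # k pods, k/2 switches per layer per pod
    core = (k // 2) ** 2
    return {
        "hosts": k ** 3 // 4,
        "edge_switches": edge,
        "agg_switches": agg,
        "core_switches": core,
        "total_switches": edge + agg + core,
    }

# With 48-port switches (commodity hardware at the time):
print(fat_tree_capacity(48))
```

With k = 48 this yields 27,648 hosts from 2,880 identical switches, the "tens of thousands of elements" scale the abstract refers to.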

3,549 citations


Book
07 Mar 2008
TL;DR: Applied Survival Analysis, Second Edition is an ideal book for graduate-level courses in biostatistics, statistics, and epidemiologic methods and serves as a valuable reference for practitioners and researchers in any health-related field or for professionals in insurance and government.
Abstract: THE MOST PRACTICAL, UP-TO-DATE GUIDE TO MODELING AND ANALYZING TIME-TO-EVENT DATA, NOW IN A VALUABLE NEW EDITION Since publication of the first edition nearly a decade ago, analyses using time-to-event methods have increased considerably in all areas of scientific inquiry, mainly as a result of model-building methods available in modern statistical software packages. However, there has been minimal coverage in the available literature to guide researchers, practitioners, and students who wish to apply these methods to health-related areas of study. Applied Survival Analysis, Second Edition provides a comprehensive and up-to-date introduction to regression modeling for time-to-event data in medical, epidemiological, biostatistical, and other health-related research. This book places a unique emphasis on the practical and contemporary applications of regression modeling rather than the mathematical theory. It offers a clear and accessible presentation of modern modeling techniques, supplemented with real-world examples and case studies. Key topics covered include: variable selection, identification of the scale of continuous covariates, the role of interactions in the model, assessment of fit and model assumptions, regression diagnostics, recurrent event models, frailty models, additive models, competing risk models, and missing data. 
Features of the Second Edition include: expanded coverage of interactions and the covariate-adjusted survival functions; the use of the Worcester Heart Attack Study as the main modeling data set for illustrating discussed concepts and techniques; new discussion of variable selection with multivariable fractional polynomials; further exploration of time-varying covariates, complete with examples; additional treatment of the exponential, Weibull, and log-logistic parametric regression models; increased emphasis on interpreting and using results, as well as utilizing multiple imputation methods to analyze data with missing values; and new examples and exercises at the end of each chapter. Analyses throughout the text are performed using Stata Version 9, and an accompanying FTP site contains the data sets used in the book. Applied Survival Analysis, Second Edition is an ideal book for graduate-level courses in biostatistics, statistics, and epidemiologic methods. It also serves as a valuable reference for practitioners and researchers in any health-related field or for professionals in insurance and government.
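The book's analyses are performed in Stata, but the central object of time-to-event analysis, the Kaplan-Meier product-limit estimate of the survival function, can be sketched from scratch. This is a minimal illustrative implementation, not code from the book:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.

    times:  observed time for each subject
    events: 1 if the event was observed, 0 if the subject was censored
    Returns a list of (t, S(t)) pairs at each distinct event time.
    """
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    surv, out, i = 1.0, [], 0
    while i < len(pairs):
        t = pairs[i][0]
        d = n = 0                      # deaths and departures at time t
        while i < len(pairs) and pairs[i][0] == t:
            n += 1
            d += pairs[i][1]
            i += 1
        if d:                          # survival only drops at event times
            surv *= 1 - d / n_at_risk
            out.append((t, surv))
        n_at_risk -= n                 # censored subjects leave the risk set
    return out
```

For subjects with times [1, 2, 3, 4] and event flags [1, 1, 0, 1] (the third censored), the estimate drops to 0.75 at t=1 and 0.5 at t=2, and reaches 0.0 at t=4; the censored subject at t=3 only shrinks the risk set.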

3,507 citations


Posted Content
TL;DR: It is shown that more efficient sampling designs exist for making valid inferences, such as sampling all available events and a tiny fraction of nonevents, which enables scholars to save as much as 99% of their (nonfixed) data collection costs or to collect much more meaningful explanatory variables.
Abstract: We study rare events data, binary dependent variables with dozens to thousands of times fewer ones (events, such as wars, vetoes, cases of political activism, or epidemiological infections) than zeros ("nonevents"). In many literatures, these variables have proven difficult to explain and predict, a problem that seems to have at least two sources. First, popular statistical procedures, such as logistic regression, can sharply underestimate the probability of rare events. We recommend corrections that outperform existing methods and change the estimates of absolute and relative risks by as much as some estimated effects reported in the literature. Second, commonly used data collection strategies are grossly inefficient for rare events data. The fear of collecting data with too few events has led to data collections with huge numbers of observations but relatively few, and poorly measured, explanatory variables, such as in international conflict data with more than a quarter-million dyads, only a few of which are at war. As it turns out, more efficient sampling designs exist for making valid inferences, such as sampling all available events (e.g., wars) and a tiny fraction of nonevents (peace). This enables scholars to save as much as 99% of their (nonfixed) data collection costs or to collect much more meaningful explanatory variables. We provide methods that link these two results, enabling both types of corrections to work simultaneously, and software that implements the methods developed.
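One correction that makes event-enriched sampling valid for logistic regression is prior correction of the intercept: after sampling all events and only a fraction of nonevents, the slope coefficients remain consistent but the intercept must be mapped back to the population event rate. A minimal sketch of the standard formula follows (illustrative; the function name is ours, and the authors' own software implements more, including weighting and small-sample bias corrections):

```python
import math

def prior_correct_intercept(beta0_hat, sample_event_rate, pop_event_rate):
    """Map a logit intercept fitted on an event-enriched sample back to
    the population.

    beta0_hat:          intercept estimated on the enriched sample
    sample_event_rate:  fraction of ones in the sample (y-bar)
    pop_event_rate:     fraction of ones in the population (tau)
    """
    tau, ybar = pop_event_rate, sample_event_rate
    # Standard prior-correction term for choice-based (case-control) sampling
    return beta0_hat - math.log(((1 - tau) / tau) * (ybar / (1 - ybar)))
```

For example, a model fit on a 50/50 case-control sample (y-bar = 0.5) of a population where events occur at rate tau = 0.005 shifts the intercept down by ln(199) ≈ 5.29; if the sample rate equals the population rate, the correction vanishes.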

3,170 citations


Journal ArticleDOI
TL;DR: Black carbon in soot is an efficient absorber of solar irradiation that is preferentially emitted in the tropics and can form atmospheric brown clouds in mixture with other aerosols; these factors make black carbon emissions the second most important contribution to anthropogenic climate warming, after carbon dioxide emissions.
Abstract: Black carbon in soot is an efficient absorbing agent of solar irradiation that is preferentially emitted in the tropics and can form atmospheric brown clouds in mixture with other aerosols. These factors combine to make black carbon emissions the second most important contribution to anthropogenic climate warming, after carbon dioxide emissions.

3,060 citations



MonographDOI
06 Nov 2008
TL;DR: With a balanced mechanics-materials approach and coverage of the latest developments in biomaterials and electronic materials, the new edition of this popular text is the most thorough and modern book available for upper-level undergraduate courses on the mechanical behavior of materials.
Abstract: With a balanced mechanics-materials approach and coverage of the latest developments in biomaterials and electronic materials, the new edition of this popular text is the most thorough and modern book available for upper-level undergraduate courses on the mechanical behavior of materials. To ensure that the student gains a thorough understanding, the authors present the fundamental mechanisms that operate at micro- and nanometer levels across a wide range of materials, in a way that is mathematically simple and requires no extensive knowledge of materials. This integrated approach provides a conceptual presentation that shows how the microstructure of a material controls its mechanical behavior, and this is reinforced through extensive use of micrographs and illustrations. New worked examples and exercises help the student test their understanding. Further resources for this title, including lecture slides of select illustrations and solutions for exercises, are available online at www.cambridge.org/9780521866758

2,905 citations


Journal ArticleDOI
TL;DR: This review summarizes and compares major signaling pathways that regulate the epithelial-mesenchymal transitions during both development and tumor metastasis and examines their role in carcinoma invasion and metastasis.

Journal ArticleDOI
TL;DR: A nanopore-based device provides single-molecule detection and analytical capabilities that are achieved by electrophoretically driving molecules in solution through a nano-scale pore, a unique analytical capability that makes inexpensive, rapid DNA sequencing a possibility.
Abstract: A nanopore-based device provides single-molecule detection and analytical capabilities that are achieved by electrophoretically driving molecules in solution through a nano-scale pore. The nanopore provides a highly confined space within which single nucleic acid polymers can be analyzed at high throughput by one of a variety of means, and the perfect processivity that can be enforced in a narrow pore ensures that the native order of the nucleobases in a polynucleotide is reflected in the sequence of signals that is detected. Kilobase length polymers (single-stranded genomic DNA or RNA) or small molecules (e.g., nucleosides) can be identified and characterized without amplification or labeling, a unique analytical capability that makes inexpensive, rapid DNA sequencing a possibility. Further research and development to overcome current challenges to nanopore identification of each successive nucleotide in a DNA strand offers the prospect of 'third generation' instruments that will sequence a diploid mammalian genome for ∼$1,000 in ∼24 h.

Journal ArticleDOI
06 Aug 2008-JAMA
TL;DR: This report provides guidelines for when to initiate antiretroviral therapy, selection of appropriate initial regimens, patient monitoring, when to change therapy, and what regimens to use when changing.
Abstract: Context New trial data and drug regimens that have become available in the last 2 years warrant an update to guidelines for antiretroviral therapy (ART) in human immunodeficiency virus (HIV)–infected adults in resource-rich settings. Objective To provide current recommendations for the treatment of adult HIV infection with ART and use of laboratory-monitoring tools. Guidelines include when to start therapy and with what drugs, monitoring for response and toxic effects, special considerations in therapy, and managing antiretroviral failure. Data Sources, Study Selection, and Data Extraction Data that had been published or presented in abstract form at scientific conferences in the past 2 years were systematically searched and reviewed by an International Antiviral Society–USA panel. The panel reviewed available evidence and formed recommendations by full panel consensus. Data Synthesis Treatment is recommended for all adults with HIV infection; the strength of the recommendation and the quality of the evidence increase with decreasing CD4 cell count and the presence of certain concurrent conditions. Recommended initial regimens include 2 nucleoside reverse transcriptase inhibitors (tenofovir/emtricitabine or abacavir/lamivudine) plus a nonnucleoside reverse transcriptase inhibitor (efavirenz), a ritonavir-boosted protease inhibitor (atazanavir or darunavir), or an integrase strand transfer inhibitor (raltegravir). Alternatives in each class are recommended for patients with or at risk of certain concurrent conditions. CD4 cell count and HIV-1 RNA level should be monitored, as should engagement in care, ART adherence, HIV drug resistance, and quality-of-care indicators. Reasons for regimen switching include virologic, immunologic, or clinical failure and drug toxicity or intolerance. Confirmed treatment failure should be addressed promptly and multiple factors considered. 
Conclusion New recommendations for HIV patient care include offering ART to all patients regardless of CD4 cell count, changes in therapeutic options, and modifications in the timing and choice of ART in the setting of opportunistic illnesses such as cryptococcal disease and tuberculosis.

Journal ArticleDOI
02 May 2008-Cell
TL;DR: Deep sequencing of smRNAs revealed a direct relationship between the location of smRNAs and DNA methylation, perturbation of smRNA biogenesis upon loss of CpG DNA methylation, and a tendency for smRNAs to direct strand-specific DNA methylation in regions of RNA-DNA homology.

Journal ArticleDOI
TL;DR: Despite the decrease in smoking in the overall population, the size of the clusters of smokers remained the same across time, suggesting that whole groups of people were quitting in concert.
Abstract: BACKGROUND The prevalence of smoking has decreased substantially in the United States over the past 30 years. We examined the extent of the person-to-person spread of smoking behavior and the extent to which groups of widely connected people quit together. METHODS We studied a densely interconnected social network of 12,067 people assessed repeatedly from 1971 to 2003 as part of the Framingham Heart Study. We used network analytic methods and longitudinal statistical models. RESULTS Discernible clusters of smokers and nonsmokers were present in the network, and the clusters extended to three degrees of separation. Despite the decrease in smoking in the overall population, the size of the clusters of smokers remained the same across time, suggesting that whole groups of people were quitting in concert. Smokers were also progressively found in the periphery of the social network. Smoking cessation by a spouse decreased a person's chances of smoking by 67% (95% confidence interval [CI], 59 to 73). Smoking cessation by a sibling decreased the chances by 25% (95% CI, 14 to 35). Smoking cessation by a friend decreased the chances by 36% (95% CI, 12 to 55 ). Among persons working in small firms, smoking cessation by a coworker decreased the chances by 34% (95% CI, 5 to 56). Friends with more education influenced one another more than those with less education. These effects were not seen among neighbors in the immediate geographic area. CONCLUSIONS Network phenomena appear to be relevant to smoking cessation. Smoking behavior spreads through close and distant social ties, groups of interconnected people stop smoking in concert, and smokers are increasingly marginalized socially. These findings have implications for clinical and public health interventions to reduce and prevent smoking.

Journal ArticleDOI
TL;DR: In this article, the basic building blocks are described, starting with the 20 amino acids and proceeding to polypeptides, polysaccharides, and polyprotein-saccharide.

Posted Content
TL;DR: The authors investigate how external and internal rewards work in concert to produce (dis)honesty and suggest that dishonesty governed by self-concept maintenance is likely to be prevalent in the economy; understanding it has important implications for designing effective methods to curb dishonesty.
Abstract: Dishonesty plays a large role in the economy. Causes for (dis)honest behavior seem to be based partially on external rewards, and partially on internal rewards. Here, we investigate how such external and internal rewards work in concert to produce (dis)honesty. We propose and test a theory of self-concept maintenance that allows people to engage to some level in dishonest behavior, thereby benefiting from external benefits of dishonesty, while maintaining their positive view about themselves in terms of being honest individuals. The results show that (1) given the opportunity to engage in beneficial dishonesty, people will engage in such behaviors; (2) the amount of dishonesty is largely insensitive to either the expected external benefits or the costs associated with the deceptive acts; (3) people know about their actions but do not update their self-concepts; (4) causing people to become more aware of their internal standards for honesty decreases their tendency for deception; and (5) increasing the "degrees of freedom" that people have to interpret their actions increases their tendency for deception. We suggest that dishonesty governed by self-concept maintenance is likely to be prevalent in the economy, and understanding it has important implications for designing effective methods to curb dishonesty. Former working paper titles: "(Dis)Honesty: A Combination of Internal and External Rewards" and "Almost Honest: Internal and External Motives for Honesty".

Journal ArticleDOI
TL;DR: TEAD is revealed as a new component in the Hippo pathway playing essential roles in mediating biological functions of YAP, and is required for YAP-induced cell growth, oncogenic transformation, and epithelial-mesenchymal transition.
Abstract: The YAP transcription coactivator has been implicated as an oncogene and is amplified in human cancers. Recent studies have established that YAP is phosphorylated and inhibited by the Hippo tumor suppressor pathway. Here we demonstrate that the TEAD family transcription factors are essential in mediating YAP-dependent gene expression. TEAD is also required for YAP-induced cell growth, oncogenic transformation, and epithelial-mesenchymal transition. CTGF is identified as a direct YAP target gene important for cell growth. Moreover, the functional relationship between YAP and TEAD is conserved in Drosophila Yki (the YAP homolog) and Scalloped (the TEAD homolog). Our study reveals TEAD as a new component in the Hippo pathway playing essential roles in mediating biological functions of YAP.

Journal ArticleDOI
24 Sep 2008-JAMA
TL;DR: In this study involving 10 geographic regions in North America, there were significant and important regional differences in out-of-hospital cardiac arrest incidence and outcome.
Abstract: Context The health and policy implications of regional variation in incidence and outcome of out-of-hospital cardiac arrest remain to be determined. Objective To evaluate whether cardiac arrest incidence and outcome differ across geographic regions. Design, Setting, and Patients Prospective observational study (the Resuscitation Outcomes Consortium) of all out-of-hospital cardiac arrests in 10 North American sites (8 US and 2 Canadian) from May 1, 2006, to April 30, 2007, followed up to hospital discharge, and including data available as of June 28, 2008. Cases (aged 0-108 years) were assessed by organized emergency medical services (EMS) personnel, did not have traumatic injury, and received attempts at external defibrillation or chest compressions or resuscitation was not attempted. Census data were used to determine rates adjusted for age and sex. Main Outcome Measures Incidence rate, mortality rate, case-fatality rate, and survival to discharge for patients assessed or treated by EMS personnel or with an initial rhythm of ventricular fibrillation. Results Among the 10 sites, the total catchment population was 21.4 million, and there were 20 520 cardiac arrests. A total of 11 898 (58.0%) had resuscitation attempted; 2729 (22.9% of treated) had initial rhythm of ventricular fibrillation or ventricular tachycardia or rhythms that were shockable by an automated external defibrillator; and 954 (4.6% of total) were discharged alive. The median incidence of EMS-treated cardiac arrest across sites was 52.1 (interquartile range [IQR], 48.0-70.1) per 100 000 population; survival ranged from 3.0% to 16.3%, with a median of 8.4% (IQR, 5.4%-10.4%). 
Median ventricular fibrillation incidence was 12.6 (IQR, 10.6-5.2) per 100 000 population; survival ranged from 7.7% to 39.9%, with a median of 22.0% (IQR, 15.0%-24.4%), with significant differences across sites for both incidence and survival. Conclusion In this study involving 10 geographic regions in North America, there were significant and important regional differences in out-of-hospital cardiac arrest incidence and outcome.

Journal ArticleDOI
10 Apr 2008-Nature
TL;DR: These data reinforce several previously identified clades that split deeply in the animal tree, unambiguously resolve multiple long-standing issues for which there was strong conflicting support in earlier studies with less data, and provide molecular support for the monophyly of molluscs, a group long recognized by morphologists.
Abstract: Long-held ideas regarding the evolutionary relationships among animals have recently been upended by sometimes controversial hypotheses based largely on insights from molecular data. These new hypotheses include a clade of moulting animals (Ecdysozoa) and the close relationship of the lophophorates to molluscs and annelids (Lophotrochozoa). Many relationships remain disputed, including those that are required to polarize key features of character evolution, and support for deep nodes is often low. Phylogenomic approaches, which use data from many genes, have shown promise for resolving deep animal relationships, but are hindered by a lack of data from many important groups. Here we report a total of 39.9 Mb of expressed sequence tags from 29 animals belonging to 21 phyla, including 11 phyla previously lacking genomic or expressed-sequence-tag data. Analysed in combination with existing sequences, our data reinforce several previously identified clades that split deeply in the animal tree (including Protostomia, Ecdysozoa and Lophotrochozoa), unambiguously resolve multiple long-standing issues for which there was strong conflicting support in earlier studies with less data (such as velvet worms rather than tardigrades as the sister group of arthropods), and provide molecular support for the monophyly of molluscs, a group long recognized by morphologists. In addition, we find strong support for several new hypotheses. These include a clade that unites annelids (including sipunculans and echiurans) with nemerteans, phoronids and brachiopods, molluscs as sister to that assemblage, and the placement of ctenophores as the earliest diverging extant multicellular animals. A single origin of spiral cleavage (with subsequent losses) is inferred from well-supported nodes. Many relationships between a stable subset of taxa find strong support, and a diminishing number of lineages remain recalcitrant to placement on the tree.

Journal ArticleDOI
01 Mar 2008-Genetics
TL;DR: A new method, efficient mixed-model association (EMMA), corrects for population structure and genetic relatedness in model organism association mapping; it takes advantage of the specific nature of the optimization problem in applying mixed models for association mapping, substantially increasing the computational speed and reliability of the results.
Abstract: Genomewide association mapping in model organisms such as inbred mouse strains is a promising approach for the identification of risk factors related to human diseases. However, genetic association studies in inbred model organisms are confronted by the problem of complex population structure among strains. This induces inflated false positive rates, which cannot be corrected using standard approaches applied in human association studies such as genomic control or structured association. Recent studies demonstrated that mixed models successfully correct for the genetic relatedness in association mapping in maize and Arabidopsis panel data sets. However, the currently available mixed-model methods suffer from computational inefficiency. In this article, we propose a new method, efficient mixed-model association (EMMA), which corrects for population structure and genetic relatedness in model organism association mapping. Our method takes advantage of the specific nature of the optimization problem in applying mixed models for association mapping, which allows us to substantially increase the computational speed and reliability of the results. We applied EMMA to in silico whole-genome association mapping of inbred mouse strains involving hundreds of thousands of SNPs, in addition to Arabidopsis and maize data sets. We also performed extensive simulation studies to estimate the statistical power of EMMA under various SNP effects, varying degrees of population structure, and differing numbers of multiple measurements per strain. Despite the limited power of inbred mouse association mapping due to the limited number of available inbred strains, we are able to identify significantly associated SNPs, which fall into known QTL or genes identified through previous studies while avoiding an inflation of false positives. An R package implementation and webserver of our EMMA method are publicly available.
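The core of the mixed-model correction can be sketched as generalized least squares after rotating the data by the eigenvectors of the kinship matrix, which diagonalizes the phenotype covariance so each SNP test becomes a cheap weighted regression — the structural property EMMA exploits. The sketch below is a simplified illustration under stated assumptions, not the EMMA implementation: the variance ratio `delta` is fixed by hand rather than estimated by (restricted) maximum likelihood, and the kinship matrix and toy data are invented.

```python
import numpy as np

def mixed_model_assoc(y, x, K, delta):
    """Single-SNP GLS test under y = mu + x*beta + u + e, with
    cov(u) = sigma_g^2 * K and delta = sigma_e^2 / sigma_g^2.
    Rotating by the eigenvectors of K diagonalizes cov(y), so the
    test reduces to a weighted least-squares fit."""
    n = len(y)
    vals, U = np.linalg.eigh(K)          # K = U diag(vals) U'
    w = 1.0 / (vals + delta)             # inverse-variance weights
    X = np.column_stack([np.ones(n), x])
    Xr, yr = U.T @ X, U.T @ y            # rotated design and phenotype
    XtWX = Xr.T @ (w[:, None] * Xr)
    beta = np.linalg.solve(XtWX, Xr.T @ (w * yr))
    resid = yr - Xr @ beta
    sigma2 = (w * resid ** 2).sum() / (n - 2)
    se = np.sqrt(sigma2 * np.linalg.inv(XtWX)[1, 1])
    return beta[1], beta[1] / se         # SNP effect and its t-statistic

# Toy data: two related groups sharing a polygenic background.
rng = np.random.default_rng(0)
n = 60
group = np.repeat([0, 1], n // 2)
K = 0.5 * (group[:, None] == group[None, :]) + 0.5 * np.eye(n)
snp = rng.binomial(1, 0.5, n).astype(float)
u = rng.multivariate_normal(np.zeros(n), 0.8 * K)  # polygenic effect
y = 1.0 * snp + u + rng.normal(0.0, 0.5, n)        # true effect = 1.0
beta, t = mixed_model_assoc(y, snp, K, delta=0.5)
```

Because the rotation is computed once per phenotype and reused for every SNP, genome-wide scans over hundreds of thousands of markers reduce to one eigendecomposition plus cheap per-SNP regressions.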

Journal ArticleDOI
TL;DR: The authors show that people behave dishonestly enough to profit but honestly enough to delude themselves of their own integrity, and that a little bit of dishonesty gives a taste of profit without spoiling a positive self-view.
Abstract: People like to think of themselves as honest. However, dishonesty pays—and it often pays well. How do people resolve this tension? This research shows that people behave dishonestly enough to profit but honestly enough to delude themselves of their own integrity. A little bit of dishonesty gives a taste of profit without spoiling a positive self-view. Two mechanisms allow for such self-concept maintenance: inattention to moral standards and categorization malleability. Six experiments support the authors' theory of self-concept maintenance and offer practical applications for curbing dishonesty in everyday life.

Journal ArticleDOI
TL;DR: Necroptosis is a cellular mechanism of necrotic cell death induced by apoptotic stimuli in the form of death domain receptor engagement by their respective ligands under conditions where apoptotic execution is prevented; necrostatins are established as the first-in-class inhibitors of RIP1 kinase, the key upstream kinase involved in the activation of necroptosis.
Abstract: Necroptosis is a cellular mechanism of necrotic cell death induced by apoptotic stimuli in the form of death domain receptor engagement by their respective ligands under conditions where apoptotic execution is prevented. Although it occurs under regulated conditions, necroptotic cell death is characterized by the same morphological features as unregulated necrotic death. Here we report that necrostatin-1, a previously identified small-molecule inhibitor of necroptosis, is a selective allosteric inhibitor of the death domain receptor-associated adaptor kinase RIP1 in vitro. We show that RIP1 is the primary cellular target responsible for the antinecroptosis activity of necrostatin-1. In addition, we show that two other necrostatins, necrostatin-3 and necrostatin-5, also target the RIP1 kinase step in the necroptosis pathway, but through mechanisms distinct from that of necrostatin-1. Overall, our data establish necrostatins as the first-in-class inhibitors of RIP1 kinase, the key upstream kinase involved in the activation of necroptosis.

Journal ArticleDOI
05 Dec 2008-BMJ
TL;DR: People’s happiness depends on the happiness of others with whom they are connected, providing further justification for seeing happiness, like health, as a collective phenomenon.
Abstract: Objectives To evaluate whether happiness can spread from person to person and whether niches of happiness form within social networks. Design Longitudinal social network analysis. Setting Framingham Heart Study social network. Participants 4739 individuals followed from 1983 to 2003. Main outcome measures Happiness measured with validated four item scale; broad array of attributes of social networks and diverse social ties. Results Clusters of happy and unhappy people are visible in the network, and the relationship between people's happiness extends up to three degrees of separation (for example, to the friends of one's friends' friends). People who are surrounded by many happy people and those who are central in the network are more likely to become happy in the future. Longitudinal statistical models suggest that clusters of happiness result from the spread of happiness and not just a tendency for people to associate with similar individuals. A friend who lives within a mile (about 1.6 km) and who becomes happy increases the probability that a person is happy by 25% (95% confidence interval 1% to 57%). Similar effects are seen in coresident spouses (8%, 0.2% to 16%), siblings who live within a mile (14%, 1% to 28%), and next door neighbours (34%, 7% to 70%). Effects are not seen between coworkers.

Journal ArticleDOI
TL;DR: It is concluded that at present, there is no adequate evidence base to justify incorporating learning-styles assessments into general educational practice and limited education resources would better be devoted to adopting other educational practices that have a strong evidence base.
Abstract: The term “learning styles” refers to the concept that individuals differ in regard to what mode of instruction or study is most effective for them. Proponents of learning-style assessment contend that optimal instruction requires diagnosing individuals' learning style and tailoring instruction accordingly. Assessments of learning style typically ask people to evaluate what sort of information presentation they prefer (e.g., words versus pictures versus speech) and/or what kind of mental activity they find most engaging or congenial (e.g., analysis versus listening), although assessment instruments are extremely diverse. The most common—but not the only—hypothesis about the instructional relevance of learning styles is the meshing hypothesis, according to which instruction is best provided in a format that matches the preferences of the learner (e.g., for a “visual learner,” emphasizing visual presentation of information).The learning-styles view has acquired great influence within the education field, and...

Journal ArticleDOI
TL;DR: Addressing the antimicrobial resistance crisis will require a concerted, grassroots effort led by the medical community; failure to act could mean a literal return to the preantibiotic era for many types of infections.
Abstract: The ongoing explosion of antibiotic-resistant infections continues to plague global and US health care. Meanwhile, an equally alarming decline has occurred in the research and development of new antibiotics to deal with the threat. In response to this microbial “perfect storm,” in 2001, the federal Interagency Task Force on Antimicrobial Resistance released the “Action Plan to Combat Antimicrobial Resistance; Part 1: Domestic” to strengthen the response in the United States. The Infectious Diseases Society of America (IDSA) followed in 2004 with its own report, “Bad Bugs, No Drugs: As Antibiotic Discovery Stagnates, A Public Health Crisis Brews,” which proposed incentives to reinvigorate pharmaceutical investment in antibiotic research and development. The IDSA’s subsequent lobbying efforts led to the introduction of promising legislation in the 109th US Congress (January 2005–December 2006). Unfortunately, the legislation was not enacted. During the 110th Congress, the IDSA has continued to work with congressional leaders on promising legislation to address antibiotic-resistant infection. Nevertheless, despite intensive public relations and lobbying efforts, it remains unclear whether sufficiently robust legislation will be enacted. In the meantime, microbes continue to become more resistant, the antibiotic pipeline continues to diminish, and the majority of the public remains unaware of this critical situation. The result of insufficient federal funding; insufficient surveillance, prevention, and control; insufficient research and development activities; misguided regulation of antibiotics in agriculture and, in particular, for food animals; and insufficient overall coordination of US (and international) efforts could mean a literal return to the preantibiotic era for many types of infections. If we are to address the antimicrobial resistance crisis, a concerted, grassroots effort led by the medical community will be required.

Journal ArticleDOI
TL;DR: A simple information-theoretic characterization of processing difficulty as the work incurred by resource reallocation during parallel, incremental, probabilistic disambiguation in sentence comprehension is proposed, and its equivalence to the theory of Hale is demonstrated.
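The difficulty measure at issue here is surprisal: the processing cost of a word is -log P(word | preceding context) under a probabilistic language model, so unexpected words incur more "work" than predictable ones. A minimal sketch using an add-alpha-smoothed bigram model (the corpus and sentences are toy assumptions; any conditional language model could supply the probabilities):

```python
import math
from collections import Counter

def bigram_surprisals(corpus, sentence, alpha=1.0):
    """Per-word surprisal -log2 P(w_i | w_{i-1}) under an
    add-alpha-smoothed bigram model, the quantity proposed as a
    measure of incremental processing difficulty."""
    vocab = {w for s in corpus for w in s} | set(sentence)
    bigrams, contexts = Counter(), Counter()
    for s in corpus:
        for prev, cur in zip(["<s>"] + s, s):
            bigrams[(prev, cur)] += 1
            contexts[prev] += 1
    V = len(vocab)
    out = []
    for prev, cur in zip(["<s>"] + sentence, sentence):
        p = (bigrams[(prev, cur)] + alpha) / (contexts[prev] + alpha * V)
        out.append((cur, -math.log2(p)))
    return out

# Toy corpus: "dog" is a likelier continuation of "the" than "cat",
# so "dog" should carry lower surprisal at the same position.
corpus = [["the", "dog", "barked"]] * 5 + [["the", "cat", "meowed"]]
easy = bigram_surprisals(corpus, ["the", "dog", "barked"])
hard = bigram_surprisals(corpus, ["the", "cat", "meowed"])
```

In this framing, the "work incurred by resource reallocation" at each word corresponds to how far the new word forces the comprehender's probability distribution over analyses to shift, which equals the word's surprisal.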

Journal ArticleDOI
19 Jun 2008-Nature
TL;DR: Whole-genome comparisons illuminate the murky relationships among the three chordate groups (tunicates, lancelets and vertebrates), and allow not only reconstruction of the gene complement of the last common chordate ancestor but also partial reconstruction of its genomic organization.
Abstract: Lancelets ('amphioxus') are the modern survivors of an ancient chordate lineage, with a fossil record dating back to the Cambrian period. Here we describe the structure and gene content of the highly polymorphic approximately 520-megabase genome of the Florida lancelet Branchiostoma floridae, and analyse it in the context of chordate evolution. Whole-genome comparisons illuminate the murky relationships among the three chordate groups (tunicates, lancelets and vertebrates), and allow not only reconstruction of the gene complement of the last common chordate ancestor but also partial reconstruction of its genomic organization, as well as a description of two genome-wide duplications and subsequent reorganizations in the vertebrate lineage. These genome-scale events shaped the vertebrate genome and provided additional genetic variation for exploitation during vertebrate evolution.

Journal ArticleDOI
TL;DR: The Immunological Genome Project combines immunology and computational biology laboratories in an effort to establish a complete 'road map' of gene-expression and regulatory networks in all immune cells.
Abstract: Immunology is an ideal field for the application of systems approaches, with its detailed descriptions of cell types (over 200 immune cell types are defined in the scope of the Immunological Genome Project (ImmGen)), wealth of reagents and easy access to cells. Thanks to the broad and robust approaches allowed by gene-expression microarrays and related techniques, the transcriptome is probably the only ‘-ome’ that can be reliably tackled in its entirety. Generating a complete perspective of gene expression in the immune system

Journal ArticleDOI
TL;DR: In this paper, the E-cadherin binding partner beta-catenin was found to be necessary, but not sufficient, for the formation of anoikis resistance.
Abstract: Loss of the epithelial adhesion molecule E-cadherin is thought to enable metastasis by disrupting intercellular contacts, an early step in metastatic dissemination. To further investigate the molecular basis of this notion, we use two methods to inhibit E-cadherin function that distinguish between E-cadherin's cell-cell adhesion and intracellular signaling functions. Whereas the disruption of cell-cell contacts alone does not enable metastasis, the loss of E-cadherin protein does, through induction of an epithelial-to-mesenchymal transition, invasiveness, and anoikis resistance. We find the E-cadherin binding partner beta-catenin to be necessary, but not sufficient, for induction of these phenotypes. In addition, gene expression analysis shows that E-cadherin loss results in the induction of multiple transcription factors, at least one of which, Twist, is necessary for E-cadherin loss-induced metastasis. These findings indicate that E-cadherin loss in tumors contributes to metastatic dissemination by inducing wide-ranging transcriptional and functional changes.