
Showing papers by "University of Southern California" published in 2004


Journal ArticleDOI
TL;DR: In this paper, the authors tested the hypothesis that prophylactic cardiac-resynchronization therapy in the form of biventricular stimulation with a pacemaker with or without a defibrillator would reduce the risk of death and hospitalization among patients with advanced chronic heart failure and intraventricular conduction delays.
Abstract: Background: We tested the hypothesis that prophylactic cardiac-resynchronization therapy in the form of biventricular stimulation with a pacemaker with or without a defibrillator would reduce the risk of death and hospitalization among patients with advanced chronic heart failure and intraventricular conduction delays. Methods: A total of 1520 patients who had advanced heart failure (New York Heart Association class III or IV) due to ischemic or nonischemic cardiomyopathies and a QRS interval of at least 120 msec were randomly assigned in a 1:2:2 ratio to receive optimal pharmacologic therapy (diuretics, angiotensin-converting–enzyme inhibitors, beta-blockers, and spironolactone) alone or in combination with cardiac-resynchronization therapy with either a pacemaker or a pacemaker–defibrillator. The primary composite end point was the time to death from or hospitalization for any cause. Results: As compared with optimal pharmacologic therapy alone, cardiac-resynchronization therapy with a pacemaker decreased the risk of the primary end point (hazard ratio, 0.81; P=0.014), as did cardiac-resynchronization therapy with a pacemaker–defibrillator (hazard ratio, 0.80; P=0.01). The risk of the combined end point of death from or hospitalization for heart failure was reduced by 34 percent in the pacemaker group (P<0.002) and by 40 percent in the pacemaker–defibrillator group (P<0.001 for the comparison with the pharmacologic-therapy group). A pacemaker reduced the risk of the secondary end point of death from any cause by 24 percent (P=0.059), and a pacemaker–defibrillator reduced the risk by 36 percent (P=0.003). Conclusions: In patients with advanced heart failure and a prolonged QRS interval, cardiac-resynchronization therapy decreases the combined risk of death from any cause or first hospitalization and, when combined with an implantable defibrillator, significantly reduces mortality.

5,132 citations


Posted Content
TL;DR: In this paper, the authors proposed simple tests of error cross section dependence which are applicable to a variety of panel data models, including stationary and unit root dynamic heterogeneous panels with short T and large N.
Abstract: This paper proposes simple tests of error cross-section dependence which are applicable to a variety of panel data models, including stationary and unit root dynamic heterogeneous panels with short T and large N. The proposed tests are based on the average of pair-wise correlation coefficients of the OLS residuals from the individual regressions in the panel, and can be used to test for cross-section dependence of any fixed order p, as well as the case where no a priori ordering of the cross-section units is assumed, referred to as the CD(p) and CD tests, respectively. The asymptotic distributions of these tests are derived and their power functions analyzed under different alternatives. It is shown that these tests are correctly centred for fixed N and T, and are robust to single or multiple breaks in the slope coefficients and/or error variances. The small sample properties of the tests are investigated and compared to the Lagrange multiplier test of Breusch and Pagan using Monte Carlo experiments. It is shown that the tests have the correct size in very small samples and satisfactory power and, as predicted by the theory, are quite robust to the presence of unit roots and structural breaks. The use of the CD test is illustrated by applying it to study the degree of dependence in per capita output innovations across countries within a given region and across countries in different regions. The results show significant evidence of cross dependence in output innovations across many countries and regions in the world.
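
For a balanced panel, the CD statistic described above has a simple closed form; the sketch below restates it from the abstract's description (a scaled average of pair-wise residual correlations), with notation that is mine rather than the paper's.

```latex
% CD statistic for a balanced panel of N units observed over T periods;
% \hat{\rho}_{ij} is the pairwise sample correlation of the OLS residuals
% from the individual regressions of units i and j.
\mathrm{CD} \;=\; \sqrt{\frac{2T}{N(N-1)}}\;
\sum_{i=1}^{N-1}\sum_{j=i+1}^{N}\hat{\rho}_{ij}
\;\xrightarrow{\;d\;}\; \mathcal{N}(0,1)
\quad\text{under the null of no cross-section dependence.}
```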

4,991 citations


Journal ArticleDOI
TL;DR: In this paper, the authors compared the natural and anthropogenic controls on the conversion of unreactive N2 to more reactive forms of nitrogen (Nr) and found that human activities increasingly dominate the N budget at the global and at most regional scales, and the terrestrial and open ocean N budgets are essentially disconnected.
Abstract: This paper contrasts the natural and anthropogenic controls on the conversion of unreactive N2 to more reactive forms of nitrogen (Nr). A variety of data sets are used to construct global N budgets for 1860 and the early 1990s and to make projections for the global N budget in 2050. Regional N budgets for Asia, North America, and other major regions for the early 1990s, as well as the marine N budget, are presented to highlight the dominant fluxes of nitrogen in each region. Important findings are that human activities increasingly dominate the N budget at the global and at most regional scales, the terrestrial and open ocean N budgets are essentially disconnected, and the fixed forms of N are accumulating in most environmental reservoirs. The largest uncertainties in our understanding of the N budget at most scales are the rates of natural biological nitrogen fixation, the amount of Nr storage in most environmental reservoirs, and the production rates of N2 by denitrification.

4,555 citations


Journal ArticleDOI
02 Apr 2004-Science
TL;DR: Over 1.2 million previously unknown genes represented in these samples are identified, including more than 782 new rhodopsin-like photoreceptors, suggesting substantial oceanic microbial diversity.
Abstract: We have applied “whole-genome shotgun sequencing” to microbial populations collected en masse on tangential flow and impact filters from seawater samples collected from the Sargasso Sea near Bermuda. A total of 1.045 billion base pairs of nonredundant sequence was generated, annotated, and analyzed to elucidate the gene content, diversity, and relative abundance of the organisms within these environmental samples. These data are estimated to derive from at least 1800 genomic species based on sequence relatedness, including 148 previously unknown bacterial phylotypes. We have identified over 1.2 million previously unknown genes represented in these samples, including more than 782 new rhodopsin-like photoreceptors. Variation in species present and stoichiometry suggests substantial oceanic microbial diversity. Microorganisms are responsible for most of the biogeochemical cycles that shape the environment of Earth and its oceans. Yet, these organisms are the least well understood on Earth, as the ability to study and understand the metabolic potential of microorganisms has been hampered by the inability to generate pure cultures. Recent studies have begun to explore environ…

4,210 citations


Posted Content
TL;DR: In this article, the authors proposed a new approach to estimation and inference in panel data models with a multifactor error structure where the unobserved common factors are correlated with exogenously given individual-specific regressors, and the factor loadings differ over the cross-section units.
Abstract: This paper presents a new approach to estimation and inference in panel data models with a multifactor error structure where the unobserved common factors are (possibly) correlated with exogenously given individual-specific regressors, and the factor loadings differ over the cross-section units. The basic idea behind the proposed estimation procedure is to filter the individual-specific regressors by means of (weighted) cross-section aggregates such that asymptotically, as the cross-section dimension (N) tends to infinity, the differential effects of unobserved common factors are eliminated. The estimation procedure has the advantage that it can be computed by OLS applied to an auxiliary regression where the observed regressors are augmented by (weighted) cross-sectional averages of the dependent variable and the individual-specific regressors. Two different but related problems are addressed: one that concerns the coefficients of the individual-specific regressors, and the other that focuses on the mean of the individual coefficients assumed random. In both cases appropriate estimators, referred to as common correlated effects (CCE) estimators, are proposed and their asymptotic distributions as N → ∞ with T (the time-series dimension) fixed, or as N and T → ∞ (jointly), are derived under different regularity conditions. One important feature of the proposed CCE mean group (CCEMG) estimator is its invariance to the (unknown but fixed) number of unobserved common factors as N and T → ∞ (jointly). The small sample properties of the various pooled estimators are investigated by Monte Carlo experiments that confirm the theoretical derivations and show that the pooled estimators have generally satisfactory small sample properties even for relatively small values of N and T.
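
A minimal numerical sketch of the pooled version of the augmented regression described above, assuming a balanced panel: the common factors are filtered by projecting out a constant and the cross-sectional averages of the dependent variable and regressors. Array shapes and names are illustrative, not from the paper.

```python
import numpy as np

def cce_pooled(y, X):
    """CCE pooled estimator, minimal sketch for a balanced panel.

    y : (N, T) array, dependent variable
    X : (N, T, k) array, individual-specific regressors
    Returns the k pooled slope estimates.
    """
    N, T, k = X.shape
    # Observable proxies for the common factors: a constant plus the
    # cross-sectional averages of y and of each regressor.
    H = np.column_stack([np.ones(T), y.mean(axis=0), X.mean(axis=0)])
    M = np.eye(T) - H @ np.linalg.pinv(H)   # annihilates the span of H
    A = np.zeros((k, k))
    b = np.zeros(k)
    for i in range(N):
        Xi = X[i]
        A += Xi.T @ M @ Xi
        b += Xi.T @ M @ y[i]
    return np.linalg.solve(A, b)
```

The CCEMG variant would instead run the augmented regression unit by unit and average the resulting slope estimates.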

3,170 citations


Journal ArticleDOI
27 May 2004-Nature
TL;DR: Great potential lies in the development of ‘epigenetic therapies’ — several inhibitors of enzymes controlling epigenetic modifications, specifically DNA methyltransferases and histone deacetylases, have shown promising anti-tumorigenic effects for some malignancies.
Abstract: Epigenetic mechanisms, which involve DNA and histone modifications, result in the heritable silencing of genes without a change in their coding sequence. The study of human disease has focused on genetic mechanisms, but disruption of the balance of epigenetic networks can cause several major pathologies, including cancer, syndromes involving chromosomal instabilities, and mental retardation. The development of new diagnostic tools might reveal other diseases that are caused by epigenetic alterations. Great potential lies in the development of ‘epigenetic therapies’ — several inhibitors of enzymes controlling epigenetic modifications, specifically DNA methyltransferases and histone deacetylases, have shown promising anti-tumorigenic effects for some malignancies.

3,051 citations


Journal ArticleDOI
TL;DR: This paper proposes S-MAC, a medium access control (MAC) protocol designed for wireless sensor networks that enables low-duty-cycle operation in a multihop network and reveals fundamental tradeoffs on energy, latency and throughput.
Abstract: This paper proposes S-MAC, a medium access control (MAC) protocol designed for wireless sensor networks. Wireless sensor networks use battery-operated computing and sensing devices. A network of these devices will collaborate for a common application such as environmental monitoring. We expect sensor networks to be deployed in an ad hoc fashion, with nodes remaining largely inactive for long periods, but becoming suddenly active when something is detected. These characteristics of sensor networks and applications motivate a MAC that is different from traditional wireless MACs such as IEEE 802.11 in several ways: energy conservation and self-configuration are primary goals, while per-node fairness and latency are less important. S-MAC uses a few novel techniques to reduce energy consumption and support self-configuration. It enables low-duty-cycle operation in a multihop network. Nodes form virtual clusters based on common sleep schedules to reduce control overhead and enable traffic-adaptive wake-up. S-MAC uses in-channel signaling to avoid overhearing unnecessary traffic. Finally, S-MAC applies message passing to reduce contention latency for applications that require in-network data processing. The paper presents measurement results of S-MAC performance on a sample sensor node, the UC Berkeley Mote, and reveals fundamental tradeoffs among energy, latency, and throughput. Results show that S-MAC obtains significant energy savings compared with an 802.11-like MAC without sleeping.
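
The scale of the savings from duty cycling is easy to see with a back-of-the-envelope model; the power draws below are illustrative placeholders, not the paper's measurements on the UC Berkeley Mote.

```python
# Back-of-the-envelope duty-cycle model: a node listens for `listen_ms` out of
# every `frame_ms` and sleeps the rest (power draws are illustrative only).
def avg_power_mw(listen_ms, frame_ms, p_listen_mw=15.0, p_sleep_mw=0.03):
    duty = listen_ms / frame_ms
    return duty * p_listen_mw + (1 - duty) * p_sleep_mw

always_on = avg_power_mw(1000, 1000)   # 802.11-style idle listening
duty_cycled = avg_power_mw(100, 1000)  # 10% duty cycle on a shared sleep schedule
print(f"{always_on:.2f} mW vs {duty_cycled:.2f} mW")  # roughly a 10x reduction
```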

2,843 citations


Proceedings ArticleDOI
01 Sep 2004
TL;DR: Gazebo is designed to fill this niche by creating a 3D dynamic multi-robot environment capable of recreating the complex worlds that would be encountered by the next generation of mobile robots.
Abstract: Simulators have played a critical role in robotics research as tools for quick and efficient testing of new concepts, strategies, and algorithms. To date, most simulators have been restricted to 2D worlds, and few have matured to the point where they are both highly capable and easily adaptable. Gazebo is designed to fill this niche by creating a 3D dynamic multi-robot environment capable of recreating the complex worlds that would be encountered by the next generation of mobile robots. Its open source status, fine grained control, and high fidelity place Gazebo in a unique position to become more than just a stepping stone between the drawing board and real hardware: data visualization, simulation of remote environments, and even reverse engineering of blackbox systems are all possible applications. Gazebo is developed in cooperation with the Player and Stage projects (Gerkey, B. P., et al., July 2003), (Gerkey, B. P., et al., May 2001), (Vaughan, R. T., et al., Oct. 2003), and is available from http://playerstage.sourceforge.net/gazebo/gazebo.html.

2,824 citations


Journal ArticleDOI
16 Jul 2004-Science
TL;DR: The in situ CaCO3 dissolution rates for the global oceans from total alkalinity and chlorofluorocarbon data are estimated, and the future impacts of anthropogenic CO2 on Ca CO3 shell–forming species are discussed.
Abstract: Rising atmospheric carbon dioxide (CO2) concentrations over the past two centuries have led to greater CO2 uptake by the oceans. This acidification process has changed the saturation state of the oceans with respect to calcium carbonate (CaCO3) particles. Here we estimate the in situ CaCO3 dissolution rates for the global oceans from total alkalinity and chlorofluorocarbon data, and we also discuss the future impacts of anthropogenic CO2 on CaCO3 shell–forming species. CaCO3 dissolution rates, ranging from 0.003 to 1.2 micromoles per kilogram per year, are observed beginning near the aragonite saturation horizon. The total water column CaCO3 dissolution rate for the global oceans is approximately 0.5 ± 0.2 petagrams of CaCO3-C per year, which is approximately 45 to 65% of the export production of CaCO3. Atmospheric CO2 concentrations oscillated be…
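
For context, the saturation state that defines the dissolution horizon has a standard form; the expression below is textbook carbonate chemistry rather than anything specific to this paper.

```latex
% Saturation state of seawater with respect to a CaCO3 phase (e.g., aragonite):
\Omega \;=\; \frac{[\mathrm{Ca}^{2+}]\,[\mathrm{CO}_3^{2-}]}{K'_{\mathrm{sp}}}
% \Omega > 1: supersaturated (precipitation favored); \Omega < 1: undersaturated
% (dissolution favored). Anthropogenic CO2 lowers [CO3^{2-}], shoaling the
% saturation horizon below which dissolution is observed.
```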

2,140 citations



Journal ArticleDOI
14 Jan 2004-JAMA
TL;DR: Data support the hypothesis that high CACS can modify predicted risk obtained from FRS alone, especially among patients in the intermediate-risk category in whom clinical decision making is most uncertain.
Abstract: Context: Guidelines advise that all adults undergo coronary heart disease (CHD) risk assessment to guide preventive treatment intensity. Although the Framingham Risk Score (FRS) is often recommended for this, it has been suggested that risk assessment may be improved by additional tests such as coronary artery calcium scoring (CACS). Objectives: To determine whether CACS assessment combined with FRS in asymptomatic adults provides prognostic information superior to either method alone and whether the combined approach can more accurately guide primary preventive strategies in patients with CHD risk factors. Design, Setting, and Participants: Prospective observational population-based study of 1461 asymptomatic adults with coronary risk factors. Participants with at least 1 coronary risk factor (>45 years) underwent computed tomography (CT) examination, were screened between 1990-1992, were contacted yearly for up to 8.5 years after CT scan, and were assessed for CHD. This analysis included 1312 participants with CACS results; excluded were 269 participants with diabetes and 14 participants with either missing data or a coronary event before CACS was performed. Main Outcome Measure: Nonfatal myocardial infarction (MI) or CHD death. Results: During a median of 7.0 years of follow-up, 84 patients experienced MI or CHD death; 70 patients died of any cause. There were 291 (28%) participants with an FRS of more than 20% and 221 (21%) with a CACS of more than 300. Compared with an FRS of less than 10%, an FRS of more than 20% predicted the risk of MI or CHD death (hazard ratio [HR], 14.3; 95% confidence interval [CI], 2.0-104; P = .009). Compared with a CACS of zero, a CACS of more than 300 was predictive (HR, 3.9; 95% CI, 2.1-7.3; P<.001). Across categories of FRS, CACS was predictive of risk among patients with an FRS higher than 10% (P<.001) but not with an FRS less than 10%. Conclusion: These data support the hypothesis that high CACS can modify predicted risk obtained from FRS alone, especially among patients in the intermediate-risk category in whom clinical decision making is most uncertain.

Journal ArticleDOI
01 Oct 2004-Ecology
TL;DR: In this paper, a binomial mixture model is proposed for the species accumulation function based on presence-absence (incidence) of species in a sample of quadrats or other sampling units, which covers interpolation between zero and the observed number of samples, as well as extrapolation beyond the observed sample set.
Abstract: A general binomial mixture model is proposed for the species accumulation function based on presence-absence (incidence) of species in a sample of quadrats or other sampling units. The model covers interpolation between zero and the observed number of samples, as well as extrapolation beyond the observed sample set. For interpolation (sample-based rarefaction), easily calculated, closed-form expressions for both expected richness and its confidence limits are developed (using the method of moments) that completely eliminate the need for resampling methods and permit direct statistical comparison of richness between sample sets. An incidence-based form of the Coleman (random-placement) model is developed and compared with the moment-based interpolation method. For extrapolation beyond the empirical sample set (and simultaneously, as an alternative method of interpolation), a likelihood-based estimator with a bootstrap confidence interval is described that relies on a sequential, AIC-guided algorithm to fit the mixture model parameters. Both the moment-based and likelihood-based estimators are illustrated with data sets for temperate birds and tropical seeds, ants, and trees. The moment-based estimator is confidently recommended for interpolation (sample-based rarefaction). For extrapolation, the likelihood-based estimator performs well for doubling or tripling the number of empirical samples, but it is not reliable for estimating the richness asymptote. The sensitivity of individual-based and sample-based rarefaction to spatial (or temporal) patchiness is discussed.
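
The moment-based interpolation admits the closed form below, consistent with the "easily calculated, closed-form expressions" the abstract mentions; the notation is mine, and the derivation is a standard hypergeometric argument: the probability that a species occurring in Y_i of the T pooled samples is missed by all t drawn samples is a ratio of binomial coefficients.

```latex
% Expected richness when t of the T pooled samples are drawn (0 <= t <= T);
% S_obs = observed richness, Y_i = number of samples containing species i.
E[S(t)] \;=\; S_{\mathrm{obs}} \;-\; \sum_{i=1}^{S_{\mathrm{obs}}}
\frac{\binom{T-Y_i}{t}}{\binom{T}{t}}
% The ratio of binomial coefficients is the hypergeometric probability that
% species i is absent from every one of the t samples drawn without replacement.
```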

Proceedings ArticleDOI
23 Aug 2004
TL;DR: A system is presented that, given a topic, automatically finds the people who hold opinions about that topic and the sentiment of each opinion; it contains one module for determining word sentiment and another for combining sentiments within a sentence.
Abstract: Identifying sentiments (the affective parts of opinions) is a challenging problem. We present a system that, given a topic, automatically finds the people who hold opinions about that topic and the sentiment of each opinion. The system contains a module for determining word sentiment and another for combining sentiments within a sentence. We experiment with various models of classifying and combining sentiment at word and sentence levels, with promising results.
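
A toy illustration of the two-module structure (a word-sentiment lookup feeding a sentence-level combination rule); the lexicon and the sign-product rule are simple stand-ins for the paper's trained models.

```python
# Toy word-sentiment lexicon and a sign-product combination rule
# (stand-ins for the paper's trained models, for illustration only).
LEXICON = {"good": 1, "great": 1, "bad": -1, "awful": -1, "not": -1}

def sentence_sentiment(tokens):
    """Multiply the polarities of sentiment-bearing words, so that
    negation flips the sign ('not good' -> negative)."""
    polarity, seen = 1, False
    for tok in tokens:
        if tok in LEXICON:
            polarity *= LEXICON[tok]
            seen = True
    return polarity if seen else 0   # 0 = no sentiment detected

print(sentence_sentiment("the plan is not good".split()))   # -1 (negative)
print(sentence_sentiment("a great result".split()))         # +1 (positive)
```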

Journal ArticleDOI
TL;DR: A domain-independent taxonomy of MRTA problems is given, and it is shown how many such problems can be viewed as instances of other, well-studied, optimization problems.
Abstract: Despite more than a decade of experimental work in multi-robot systems, important theoretical aspects of multi-robot coordination mechanisms have, to date, been largely untreated. To address this issue, we focus on the problem of multi-robot task allocation (MRTA). Most work on MRTA has been ad hoc and empirical, with many coordination architectures having been proposed and validated in a proof-of-concept fashion, but infrequently analyzed. With the goal of bringing objective grounding to this important area of research, we present a formal study of MRTA problems. A domain-independent taxonomy of MRTA problems is given, and it is shown how many such problems can be viewed as instances of other, well-studied, optimization problems. We demonstrate how relevant theory from operations research and combinatorial optimization can be used for analysis and greater understanding of existing approaches to task allocation, and show how the same theory can be used in the synthesis of new approaches.
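
As one concrete instance of such a reduction: single-task robots with instantaneous assignment can be cast as the classic optimal assignment problem and solved exactly in polynomial time. A sketch using SciPy's Hungarian-algorithm solver, with a made-up utility matrix:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# utility[i, j]: estimated utility of robot i performing task j (values made up)
utility = np.array([[9, 2, 5],
                    [4, 8, 1],
                    [6, 3, 7]])

# The Hungarian algorithm finds the utility-maximizing assignment in O(n^3).
robots, tasks = linear_sum_assignment(utility, maximize=True)
for r, t in zip(robots, tasks):
    print(f"robot {r} -> task {t} (utility {utility[r, t]})")
print("total utility:", utility[robots, tasks].sum())
```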

Journal ArticleDOI
TL;DR: Current levels of air pollution have chronic, adverse effects on lung development in children from the age of 10 to 18 years, leading to clinically significant deficits in attained FEV1 as children reach adulthood.
Abstract: Background: Whether exposure to air pollution adversely affects the growth of lung function during the period of rapid lung development that occurs between the ages of 10 and 18 years is unknown. Methods: In this prospective study, we recruited 1759 children (average age, 10 years) from schools in 12 southern California communities and measured lung function annually for eight years. The rate of attrition was approximately 10 percent per year. The communities represented a wide range of ambient exposures to ozone, acid vapor, nitrogen dioxide, and particulate matter. Linear regression was used to examine the relationship of air pollution to the forced expiratory volume in one second (FEV1) and other spirometric measures. Results: Over the eight-year period, deficits in the growth of FEV1 were associated with exposure to nitrogen dioxide (P=0.005), acid vapor (P=0.004), particulate matter with an aerodynamic diameter of less than 2.5 µm (PM2.5) (P=0.04), and elemental carbon (P=0.007), even after adjustment for several potential confounders and effect modifiers. Associations were also observed for other spirometric measures. Exposure to pollutants was associated with clinically and statistically significant deficits in the FEV1 attained at the age of 18 years. For example, the estimated proportion of 18-year-old subjects with a low FEV1 (defined as a ratio of observed to expected FEV1 of less than 80 percent) was 4.9 times as great at the highest level of exposure to PM2.5 as at the lowest level of exposure (7.9 percent vs. 1.6 percent, P=0.002). Conclusions: The results of this study indicate that current levels of air pollution have chronic, adverse effects on lung development in children from the age of 10 to 18 years, leading to clinically significant deficits in attained FEV1 as children reach adulthood.

Proceedings ArticleDOI
03 Nov 2004
TL;DR: Wisden incorporates two novel mechanisms, reliable data transport using a hybrid of end-to-end and hop-by-hop recovery, and low-overhead data time-stamping that does not require global clock synchronization.
Abstract: Structural monitoring---the collection and analysis of structural response to ambient or forced excitation---is an important application of networked embedded sensing with significant commercial potential. The first generation of sensor networks for structural monitoring is likely to consist of data acquisition systems that collect data at a single node for centralized processing. In this paper, we discuss the design and evaluation of a wireless sensor network system (called Wisden) for structural data acquisition. Wisden incorporates two novel mechanisms: reliable data transport using a hybrid of end-to-end and hop-by-hop recovery, and low-overhead data time-stamping that does not require global clock synchronization. We also study the applicability of wavelet-based compression techniques to overcome the bandwidth limitations imposed by low-power wireless radios. We describe our implementation of these mechanisms on the Mica-2 motes and evaluate the performance of our implementation. We also report experiences from deploying Wisden on a large structure.
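
A schematic of the hop-by-hop half of that hybrid recovery idea, in which each forwarder keeps a bounded cache of recent packets so that losses can be repaired locally rather than end-to-end; class and method names are illustrative, not Wisden's implementation.

```python
from collections import deque

class Forwarder:
    """Schematic hop-by-hop recovery: cache recent packets and answer
    NACKs locally (illustrative, not Wisden's actual code)."""

    def __init__(self, cache_size=32):
        self.cache_size = cache_size
        self.cache = {}        # seq -> packet payload
        self.order = deque()   # eviction order for the bounded cache

    def forward(self, seq, packet):
        # Cache before forwarding so a downstream NACK can be served locally.
        self.cache[seq] = packet
        self.order.append(seq)
        if len(self.order) > self.cache_size:
            self.cache.pop(self.order.popleft(), None)
        return packet          # stand-in for transmitting to the next hop

    def on_nack(self, missing_seqs):
        # Local retransmission; anything no longer cached must fall back
        # to the end-to-end recovery path.
        return [self.cache[s] for s in missing_seqs if s in self.cache]
```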

Journal ArticleDOI
TL;DR: It is demonstrated that changes in SA in a number of type 2 diabetic cohorts serve as a quantitative indicator of improvements in insulin sensitivity obtained during TZD treatment, whereas changes in total serum adiponectin levels do not correlate well at the individual level.

Posted Content
TL;DR: In this paper, the authors extend the work of Sloan (1996) by linking accrual reliability to earnings persistence and construct a model showing that less reliable accruals lead to lower earnings persistence.
Abstract: This paper extends the work of Sloan (1996) by linking accrual reliability to earnings persistence. We construct a model showing that less reliable accruals lead to lower earnings persistence. We then develop a comprehensive balance sheet categorization of accruals and rate each category according to the reliability of the underlying accruals. Empirical tests generally confirm that less reliable categories of accruals lead to lower earnings persistence and that investors do not fully anticipate the lower earnings persistence, leading to significant security mispricing. We conclude that there are significant costs associated with the recognition of unreliable information in financial statements.

Proceedings ArticleDOI
26 Apr 2004
TL;DR: DMAC is designed to solve the interruption problem and allow continuous packet forwarding by giving the sleep schedule of a node an offset that depends upon its depth in the tree, and it adjusts the duty cycles adaptively according to the traffic load in the network.
Abstract: In many sensor network applications the major traffic pattern consists of data collected from several source nodes to a sink through a unidirectional tree. We propose DMAC, an energy-efficient and low-latency MAC that is designed and optimized for such data gathering trees in wireless sensor networks. We first show that previously proposed MAC protocols for sensor networks that utilize activation/sleep duty cycles suffer from a data forwarding interruption problem, whereby not all nodes on a multihop path to the sink are notified of data delivery in progress, resulting in significant sleep delay. DMAC is designed to solve the interruption problem and allow continuous packet forwarding by giving the sleep schedule of a node an offset that depends upon its depth in the tree. DMAC also adjusts the duty cycles adaptively according to the traffic load in the network. We further propose a data prediction mechanism and the use of more-to-send (MTS) packets in order to alleviate problems pertaining to channel contention and collisions. Our simulation results show that by exploiting the application-specific structure of data gathering trees in sensor networks, DMAC provides significant energy savings and latency reduction while ensuring high data reliability.
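
The staggering idea can be sketched in a few lines: offsetting each node's active slots by its tree depth lets a packet advance one hop per slot and reach the sink within a single frame. Slot lengths and the depth bound below are placeholders, not DMAC's parameters.

```python
SLOT_MS = 10        # placeholder slot length (not DMAC's measured value)

def rx_slot_start(depth, deepest=5):
    """A node at depth d wakes to receive one slot earlier than its parent,
    so a packet can climb one hop per slot toward the sink (depth 0)."""
    return (deepest - depth) * SLOT_MS

def tx_slot_start(depth, deepest=5):
    # Transmit exactly when the parent is listening.
    return rx_slot_start(depth - 1, deepest)

for d in range(5, 0, -1):
    print(f"depth {d}: rx at {rx_slot_start(d):>2} ms, tx at {tx_slot_start(d):>2} ms")
```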

Journal ArticleDOI
Gary A. Churchill, David C. Airey1, Hooman Allayee2, Joe M. Angel3, Alan D. Attie4, Jackson Beatty5, Willam D. Beavis6, John K. Belknap7, Beth Bennett8, Wade H. Berrettini9, André Bleich10, Molly A. Bogue, Karl W. Broman11, Kari J. Buck12, Edward S. Buckler13, Margit Burmeister14, Elissa J. Chesler15, James M. Cheverud16, Steven J. Clapcote17, Melloni N. Cook18, Roger D. Cox19, John C. Crabbe12, Wim E. Crusio20, Ariel Darvasi21, Christian F. Deschepper22, Rebecca W. Doerge23, Charles R. Farber24, Jiri Forejt25, Daniel Gaile26, Steven J. Garlow27, Hartmut Geiger28, Howard K. Gershenfeld29, Terry Gordon30, Jing Gu15, Weikuan Gu15, Gerald de Haan31, Nancy L. Hayes32, Craig Heller33, Heinz Himmelbauer34, Robert Hitzemann12, Kent W. Hunter35, Hui-Chen Hsu36, Fuad A. Iraqi37, Boris Ivandic38, Howard J. Jacob39, Ritsert C. Jansen31, Karl J. Jepsen40, Dabney K. Johnson41, Thomas E. Johnson8, Gerd Kempermann42, Christina Kendziorski4, Malak Kotb15, R. Frank Kooy43, Bastien Llamas22, Frank Lammert44, J. M. Lassalle45, Pedro R. Lowenstein5, Lu Lu15, Aldons J. Lusis5, Kenneth F. Manly15, Ralph S. Marcucio46, Doug Matthews18, Juan F. Medrano24, Darla R. Miller41, Guy Mittleman18, Beverly A. Mock35, Jeffrey S. Mogil47, Xavier Montagutelli48, Grant Morahan49, David G. Morris50, Richard Mott51, Joseph H. Nadeau52, Hiroki Nagase53, Richard S. Nowakowski32, Bruce F. O'Hara54, Alexander V. Osadchuk, Grier P. Page36, Beverly Paigen, Kenneth Paigen, Abraham A. Palmer, Huei Ju Pan, Leena Peltonen-Palotie55, Leena Peltonen-Palotie5, Jeremy L. Peirce15, Daniel Pomp56, Michal Pravenec25, Daniel R. Prows28, Zonghua Qi1, Roger H. Reeves11, John C. Roder17, Glenn D. Rosen57, Eric E. Schadt58, Leonard C. Schalkwyk59, Ze'ev Seltzer17, Kazuhiro Shimomura60, Siming Shou61, Mikko J. Sillanpää55, Linda D. Siracusa62, Hans-Willem Snoeck40, Jimmy L. Spearow24, Karen L. Svenson, Lisa M. Tarantino63, David W. Threadgill64, Linda A. Toth65, William Valdar51, Fernando Pardo-Manuel de Villena64, Craig H Warden24, Steve Whatley59, Robert W. Williams15, Tom Wiltshire63, Nengjun Yi36, Dabao Zhang66, Min Zhang13, Fei Zou64 
Vanderbilt University1, University of Southern California2, University of Texas MD Anderson Cancer Center3, University of Wisconsin-Madison4, University of California, Los Angeles5, National Center for Genome Resources6, Portland VA Medical Center7, University of Colorado Boulder8, University of Pennsylvania9, Hannover Medical School10, Johns Hopkins University11, Oregon Health & Science University12, Cornell University13, University of Michigan14, University of Tennessee Health Science Center15, Washington University in St. Louis16, University of Toronto17, University of Memphis18, Medical Research Council19, University of Massachusetts Medical School20, Hebrew University of Jerusalem21, Université de Montréal22, Purdue University23, University of California, Davis24, Academy of Sciences of the Czech Republic25, University at Buffalo26, Emory University27, University of Cincinnati28, University of Texas Southwestern Medical Center29, New York University30, University of Groningen31, Rutgers University32, Stanford University33, Max Planck Society34, National Institutes of Health35, University of Alabama at Birmingham36, International Livestock Research Institute37, Heidelberg University38, Medical College of Wisconsin39, Icahn School of Medicine at Mount Sinai40, Oak Ridge National Laboratory41, Charité42, University of Antwerp43, RWTH Aachen University44, Paul Sabatier University45, University of California, San Francisco46, McGill University47, Pasteur Institute48, University of Western Australia49, Yale University50, University of Oxford51, Case Western Reserve University52, Roswell Park Cancer Institute53, University of Kentucky54, University of Helsinki55, University of Nebraska–Lincoln56, Harvard University57, Merck & Co.58, King's College London59, Northwestern University60, Shriners Hospitals for Children61, Thomas Jefferson University62, Novartis63, University of North Carolina at Chapel Hill64, Southern Illinois University Carbondale65, University of Rochester66
TL;DR: The Collaborative Cross will provide a common reference panel specifically designed for the integrative analysis of complex systems and will change the way the authors approach human health and disease.
Abstract: The goal of the Complex Trait Consortium is to promote the development of resources that can be used to understand, treat and ultimately prevent pervasive human diseases. Existing and proposed mouse resources that are optimized to study the actions of isolated genetic loci on a fixed background are less effective for studying intact polygenic networks and interactions among genes, environments, pathogens and other factors. The Collaborative Cross will provide a common reference panel specifically designed for the integrative analysis of complex systems and will change the way we approach human health and disease.

Journal ArticleDOI
TL;DR: The helium-droplet technique combines the benefits of both the gas phase and the classical matrix-isolation techniques, and can be viewed as an isothermal nanoscopic reactor, which isolates single molecules, clusters, or even a single reactive encounter at ultralow temperatures.
Abstract: Herein, recent experiments on the spectroscopy and chemical reactions of molecules and complexes embedded in helium droplets are reviewed. In the droplets, a high spectroscopic resolution, comparable to the gas phase, is achieved, while an isothermal low-temperature environment is maintained by evaporative cooling at T = 0.37 K (4He droplets) or 0.15 K (3He droplets), lower than possible in most solid matrices. Thus the helium-droplet technique combines the benefits of both the gas phase and the classical matrix-isolation techniques. Most important, the superfluid helium facilitates binary encounters, and absorbs the released binding energy upon recombination. Thus the droplet can be viewed as an isothermal nanoscopic reactor, which isolates single molecules, clusters, or even a single reactive encounter at ultralow temperatures.

Proceedings ArticleDOI
04 Oct 2004
TL;DR: A key finding is that for radios using narrow-band modulation, the transitional region is not an artifact of the radio non-ideality, as it would exist even with perfect-threshold receivers because of multi-path fading.
Abstract: The wireless sensor networks community now has an increased understanding of the need for realistic link layer models. Recent experimental studies have shown that real deployments have a "transitional region" with highly unreliable links, and that therefore the idealized perfect-reception-within-range models used in common network simulation tools can be very misleading. In this paper, we use mathematical techniques from communication theory to model and analyze low-power wireless links. The primary contribution of this work is the identification of the causes of the transitional region, and a quantification of their influence. Specifically, we derive expressions for the packet reception rate as a function of distance, and for the width of the transitional region. These expressions incorporate important channel and radio parameters such as the path loss exponent and shadowing variance of the channel, and the modulation and encoding of the radio. A key finding is that for radios using narrow-band modulation, the transitional region is not an artifact of the radio non-ideality, as it would exist even with perfect-threshold receivers because of multi-path fading. However, we hypothesize that radios with mechanisms to combat multi-path effects, such as spread-spectrum and diversity techniques, can reduce the transitional region.
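
The flavor of that derivation can be reproduced from textbook pieces: log-normal shadowing gives a random SNR at each distance, a modulation-specific BER curve maps SNR to bit errors, and the packet reception rate follows for a given frame length. The sketch below uses the standard non-coherent FSK BER with made-up channel parameters, and omits the encoding term the paper includes.

```python
import math, random

# Illustrative channel/radio parameters (made up, not the paper's fitted values)
PL_D0 = 55.0      # path loss at the reference distance d0 = 1 m, in dB
ETA = 3.0         # path-loss exponent
SIGMA = 4.0       # shadowing standard deviation, in dB
P_TX = 0.0        # transmit power, dBm
NOISE = -105.0    # noise floor, dBm

def snr_db(d):
    """Log-normal shadowing: deterministic path loss plus a Gaussian term."""
    path_loss = PL_D0 + 10 * ETA * math.log10(d) + random.gauss(0, SIGMA)
    return P_TX - path_loss - NOISE

def prr(d, frame_bytes=50):
    """Packet reception rate for one shadowing realization at distance d."""
    snr = 10 ** (snr_db(d) / 10)        # linear SNR (treated as per-bit here)
    ber = 0.5 * math.exp(-snr / 2)      # textbook non-coherent FSK bit error rate
    return (1 - ber) ** (8 * frame_bytes)

# Shadowing alone yields a band of distances where PRR swings between ~0 and ~1:
# the transitional region, present even for a perfect-threshold receiver model.
for d in (5, 15, 25, 35, 50):
    print(d, round(prr(d), 3))
```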

Journal ArticleDOI
TL;DR: The hypothesis that targeting and functional disruption of particular synapses by Aβ oligomers may provide a molecular basis for the specific loss of memory function in early Alzheimer's disease is suggested.
Abstract: The cognitive hallmark of early Alzheimer's disease (AD) is an extraordinary inability to form new memories. For many years, this dementia was attributed to nerve-cell death induced by deposits of fibrillar amyloid β (Aβ). A newer hypothesis has emerged, however, in which early memory loss is considered a synapse failure caused by soluble Aβ oligomers. Such oligomers rapidly block long-term potentiation, a classic experimental paradigm for synaptic plasticity, and they are strikingly elevated in AD brain tissue and transgenic-mouse AD models. The current work characterizes the manner in which Aβ oligomers attack neurons. Antibodies raised against synthetic oligomers applied to AD brain sections were found to give diffuse stain around neuronal cell bodies, suggestive of a dendritic pattern, whereas soluble brain extracts showed robust AD-dependent reactivity in dot immunoblots. Antigens in unfractionated AD extracts attached with specificity to cultured rat hippocampal neurons, binding within dendritic arbors at discrete puncta. Crude fractionation showed ligand size to be between 10 and 100 kDa. Synthetic Aβ oligomers of the same size gave identical punctate binding, which was highly selective for particular neurons. Image analysis by confocal double-label immunofluorescence established that >90% of the punctate oligomer binding sites colocalized with the synaptic marker PSD-95 (postsynaptic density protein 95). Synaptic binding was accompanied by ectopic induction of Arc, a synaptic immediate-early gene, the overexpression of which has been linked to dysfunctional learning. Results suggest the hypothesis that targeting and functional disruption of particular synapses by Aβ oligomers may provide a molecular basis for the specific loss of memory function in early AD.

Journal ArticleDOI
TL;DR: The authors investigated the effect of scale on performance in the active money management industry and found that fund returns decline with lagged fund size, even after accounting for various performance benchmarks, suggesting that these adverse scale effects are related to liquidity.
Abstract: We investigate the effect of scale on performance in the active money management industry. We first document that fund returns, both before and after fees and expenses, decline with lagged fund size, even after accounting for various performance benchmarks. We then explore a number of potential explanations for this relationship. This association is most pronounced among funds that have to invest in small and illiquid stocks, suggesting that these adverse scale effects are related to liquidity. Controlling for its size, a fund's return does not deteriorate with the size of the family that it belongs to, indicating that scale need not be bad for performance depending on how the fund is organized. Finally, using data on whether funds are solo-managed or team-managed and the composition of fund investments, we explore the idea that scale erodes fund performance because of the interaction of liquidity and organizational diseconomies.

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the relationship between tax services fees and restatements and found a significant negative association, consistent with net benefits from acquiring tax services from a registrant's audit firm.
Abstract: Do fees for non-audit services compromise auditors' independence and result in reduced quality of financial reporting? The Sarbanes-Oxley Act of 2002 presumes that some fees do and bans these services for audit clients. Also, some registrants voluntarily restrict their audit firms from providing legally permitted non-audit services. Assuming that restatements of previously issued financial statements reflect low-quality financial reporting, we investigate detailed fees for restating registrants for 1995 to 2000 and for similar nonrestating registrants. We do not find a statistically significant positive association between fees for either financial information systems design and implementation or internal audit services and restatements, but we do find some such association for unspecified non-audit services and restatements. We find a significant negative association between tax services fees and restatements, consistent with net benefits from acquiring tax services from a registrant's audit firm. The significant associations are driven primarily by larger registrants.

Journal ArticleDOI
TL;DR: A detailed case study of a time series of 11 contracts concluded during 1989-1997 between the same two partners, both of whom participate in the personal computer industry, to explore whether and how firms learn to contract.
Abstract: Organizational forms involving more detailed contracts than are found in traditional spot market exchanges appear to be increasingly prevalent. There has been relatively little analysis, however, of the extent to which firms learn how to use contracts to manage their interfirm relationships over time. In this paper, we conduct a detailed case study of a time series of 11 contracts concluded during 1989-1997 between the same two partners, both of whom participate in the personal computer industry, to explore whether and how firms learn to contract. We find many changes to the structure of the contracts that cannot be fully explained by changes in the assets at risk in the relationship, and evidence that these changes are largely the result of processes in which the firms were learning how to work together, including learning how to contract with each other. The nature of this learning appears to have been quite incremental and local, that is, not very far sighted. We suggest how and when contracts might serve as repositories for knowledge about how to govern collaborations, and suggest some boundary conditions for this phenomenon. Our findings also provide implications for the debate about whether contracts have a positive or negative effect on interorganizational trust. We conclude with suggestions for future research.

Proceedings ArticleDOI
13 Oct 2004
TL;DR: Results reveal that the system based on facial expression gave better performance than the system based on just acoustic information for the emotions considered, and that when these two modalities are fused, the performance and the robustness of the emotion recognition system improve measurably.
Abstract: The interaction between human beings and computers will be more natural if computers are able to perceive and respond to human non-verbal communication such as emotions. Although several approaches have been proposed to recognize human emotions based on facial expressions or speech, relatively limited work has been done to fuse these two, and other, modalities to improve the accuracy and robustness of the emotion recognition system. This paper analyzes the strengths and the limitations of systems based only on facial expressions or acoustic information. It also discusses two approaches used to fuse these two modalities: decision level and feature level integration. Using a database recorded from an actress, four emotions were classified: sadness, anger, happiness, and neutral state. By the use of markers on her face, detailed facial motions were captured with motion capture, in conjunction with simultaneous speech recordings. The results reveal that the system based on facial expression gave better performance than the system based on just acoustic information for the emotions considered. Results also show the complementarity of the two modalities and that when these two modalities are fused, the performance and the robustness of the emotion recognition system improve measurably.
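
The two fusion schemes the paper compares differ only in where the modalities meet, as the sketch below illustrates; the classifiers and feature arrays are stand-ins, not the system actually evaluated.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def feature_level_fusion(face_feats, speech_feats, labels):
    """Feature-level integration: concatenate the modality features
    and train a single classifier on the fused vectors."""
    fused = np.hstack([face_feats, speech_feats])
    return LogisticRegression(max_iter=1000).fit(fused, labels)

def decision_level_fusion(face_clf, speech_clf, face_x, speech_x):
    """Decision-level integration: combine per-modality posteriors
    (here by product) and pick the most likely emotion."""
    p = face_clf.predict_proba(face_x) * speech_clf.predict_proba(speech_x)
    return face_clf.classes_[p.argmax(axis=1)]
```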

Journal ArticleDOI
01 Feb 2004-Diabetes
TL;DR: The data support the hypothesis that patients with Alzheimer disease are more vulnerable to type 2 diabetes, and suggest a possible linkage between the processes responsible for loss of brain cells and beta-cells in these diseases.
Abstract: Alzheimer disease and type 2 diabetes are characterized by increased prevalence with aging, a genetic predisposition, and comparable pathological features in the islet and brain (amyloid derived from amyloid β protein in the brain in Alzheimer disease and islet amyloid derived from islet amyloid polypeptide in the pancreas in type 2 diabetes). Evidence is growing to link precursors of amyloid deposition in the brain and pancreas with the pathogenesis of Alzheimer disease and type 2 diabetes, respectively. Given these similarities, we questioned whether there may be a common underlying mechanism predisposing to islet and cerebral amyloid. To address this, we first examined the prevalence of type 2 diabetes in a community-based controlled study, the Mayo Clinic Alzheimer Disease Patient Registry (ADPR), which follows patients with Alzheimer disease versus control subjects without Alzheimer disease. In addition to this clinical study, we performed a pathological study of autopsy cases from this same community to determine whether there is an increased prevalence of islet amyloid in patients with Alzheimer disease and increased prevalence of cerebral amyloid in patients with type 2 diabetes. Patients who were enrolled in the ADPR (Alzheimer disease n = 100, non–Alzheimer disease control subjects n = 138) were classified according to fasting plasma glucose concentration (FPG) as nondiabetic, impaired fasting glucose (IFG), or type 2 diabetic (FPG ≥126 mg/dl). The mean slope of FPG over 10 years in each case was also compared between Alzheimer disease and non–Alzheimer disease control subjects. Pancreas and brain were examined from autopsy specimens obtained from 105 humans (first, 28 cases of Alzheimer disease vs. 21 non–Alzheimer disease control subjects and, second, 35 subjects with type 2 diabetes vs. 21 non–type 2 diabetes control subjects) for the presence of islet and brain amyloid. Both type 2 diabetes (35% vs. 18%; P < 0.05) and IFG (46% vs. 24%; P < 0.01) were more prevalent in Alzheimer disease versus non–Alzheimer disease control subjects, so 81% of cases of Alzheimer disease had either type 2 diabetes or IFG. The slope of increase of FPG with age over 10 years was also greater in Alzheimer disease than non–Alzheimer disease control subjects (P < 0.01). Islet amyloid was more frequent (P < 0.05) and extensive (P < 0.05) in patients with Alzheimer disease than in non–Alzheimer disease control subjects. However, diffuse and neuritic plaques were not more common in type 2 diabetes than in control subjects. In cases of type 2 diabetes when they were present, the duration of type 2 diabetes correlated with the density of diffuse (P < 0.001) and neuritic plaques (P < 0.01). In this community cohort from southeast Minnesota, type 2 diabetes and IFG are more common in patients with Alzheimer disease than in control subjects, as is the pathological hallmark of type 2 diabetes, islet amyloid. However, there was no increase in brain plaque formation in cases of type 2 diabetes, although when it was present, it correlated in extent with duration of diabetes. These data support the hypothesis that patients with Alzheimer disease are more vulnerable to type 2 diabetes and the possibility of linkage between the processes responsible for loss of brain cells and β-cells in these diseases. Diabetes 53:474–481, 2004

Journal ArticleDOI
TL;DR: In this paper, the authors demonstrate detection of NO2 down to ppb levels using transistors based on both single and multiple In2O3 nanowires operating at room temperature.
Abstract: We demonstrate detection of NO2 down to ppb levels using transistors based on both single and multiple In2O3 nanowires operating at room temperature. This represents orders-of-magnitude improvement over previously reported metal oxide film or nanowire/nanobelt sensors. A comparison between the single and multiple nanowire sensors reveals that the latter have numerous advantages in terms of great reliability, high sensitivity, and simplicity in fabrication. Furthermore, selective detection of NO2 can be readily achieved with multiple-nanowire sensors even in the presence of other common chemicals such as NH3, O2, CO, and H2.

Journal ArticleDOI
TL;DR: This article examined the market reaction to a sample of 403 restatements announced from 1995 to 1999 and found that more negative returns are associated with restatements involving fraud, affecting more accounts, decreasing reported income, and attributed to auditors or management.