
Showing papers by "University of Chicago" published in 2003


Journal ArticleDOI
TL;DR: In this article, the authors find that the emerging standard model of cosmology, a flat Λ-dominated universe seeded by nearly scale-invariant adiabatic Gaussian fluctuations, fits the WMAP data.
Abstract: WMAP precision data enable accurate testing of cosmological models. We find that the emerging standard model of cosmology, a flat Λ-dominated universe seeded by nearly scale-invariant adiabatic Gaussian fluctuations, fits the WMAP data. For the WMAP data only, the best-fit parameters are h = 0.72 ± 0.05, Ω_b h² = 0.024 ± 0.001, Ω_m h² = 0.14 ± 0.02, τ = 0.166 +0.076/−0.071, n_s = 0.99 ± 0.04, and σ_8 = 0.9 ± 0.1. With parameters fixed only by WMAP data, we can fit finer scale cosmic microwave background (CMB) measurements and measurements of large-scale structure (galaxy surveys and the Lyα forest). This simple model is also consistent with a host of other astronomical measurements: its inferred age of the universe is consistent with stellar ages, the baryon/photon ratio is consistent with measurements of the (D/H) ratio, and the inferred Hubble constant is consistent with local observations of the expansion rate. We then fit the model parameters to a combination of WMAP data with other finer scale CMB experiments (ACBAR and CBI), 2dFGRS measurements, and Lyα forest data to find the model's best-fit cosmological parameters: h = 0.71 +0.04/−0.03, Ω_b h² = 0.0224 ± 0.0009, Ω_m h² = 0.135 +0.008/−0.009, τ = 0.17 ± 0.06, n_s(0.05 Mpc⁻¹) = 0.93 ± 0.03, and σ_8 = 0.84 ± 0.04. WMAP's best determination of τ = 0.17 ± 0.04 arises directly from the temperature-polarization (TE) data and not from this model fit, but they are consistent. These parameters imply that the age of the universe is 13.7 ± 0.2 Gyr. With the Lyα forest data, the model favors but does not require a slowly varying spectral index. The significance of this running index is sensitive to the uncertainties in the Lyα forest. By combining WMAP data with other astronomical data, we constrain the geometry of the universe, Ω_tot = 1.02 ± 0.02, and the equation of state of the dark energy, w < −0.78 (95% confidence limit assuming w ≥ −1). The combination of WMAP and 2dFGRS data constrains the energy density in stable neutrinos: Ω_ν h² < 0.0072 (95% confidence limit). For three degenerate neutrino species, this limit implies that their mass is less than 0.23 eV (95% confidence limit). The WMAP detection of early reionization rules out warm dark matter. Subject headings: cosmic microwave background — cosmological parameters — cosmology: observations — early universe. On-line material: color figure
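As a quick check of the quoted neutrino bound, one can use the standard relation between the relic neutrino density and the summed neutrino mass (our own worked step; the conversion factor does not appear in the abstract):

```latex
\sum m_\nu \;\simeq\; 94\,\mathrm{eV}\times\Omega_\nu h^2
\;<\; 94\,\mathrm{eV}\times 0.0072 \;\approx\; 0.68\,\mathrm{eV},
\qquad
m_\nu \;\lesssim\; \tfrac{1}{3}\times 0.68\,\mathrm{eV} \;\approx\; 0.23\,\mathrm{eV}
```

for three degenerate species, consistent with the 0.23 eV per-species limit quoted above.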

10,650 citations


Journal ArticleDOI
01 Aug 2003-Genetics
TL;DR: Extensions to the method of Pritchard et al. for inferring population structure from multilocus genotype data are described and methods that allow for linkage between loci are developed, which allows identification of subtle population subdivisions that were not detectable using the existing method.
Abstract: We describe extensions to the method of Pritchard et al. for inferring population structure from multilocus genotype data. Most importantly, we develop methods that allow for linkage between loci. The new model accounts for the correlations between linked loci that arise in admixed populations (“admixture linkage disequilibrium”). This modification has several advantages, allowing (1) detection of admixture events farther back into the past, (2) inference of the population of origin of chromosomal regions, and (3) more accurate estimates of statistical uncertainty when linked loci are used. It is also of potential use for admixture mapping. In addition, we describe a new prior model for the allele frequencies within each population, which allows identification of subtle population subdivisions that were not detectable using the existing method. We present results applying the new methods to study admixture in African-Americans, recombination in Helicobacter pylori, and drift in populations of Drosophila melanogaster. The methods are implemented in a program, structure, version 2.0, which is available at http://pritch.bsd.uchicago.edu.

7,615 citations


Journal ArticleDOI
TL;DR: In this article, the authors proposed a geometrically motivated algorithm for representing high-dimensional data, based on the correspondence between the graph Laplacian, the Laplace Beltrami operator on the manifold and the connections to the heat equation.
Abstract: One of the central problems in machine learning and pattern recognition is to develop appropriate representations for complex data. We consider the problem of constructing a representation for data lying on a low-dimensional manifold embedded in a high-dimensional space. Drawing on the correspondence between the graph Laplacian, the Laplace Beltrami operator on the manifold, and the connections to the heat equation, we propose a geometrically motivated algorithm for representing the high-dimensional data. The algorithm provides a computationally efficient approach to nonlinear dimensionality reduction that has locality-preserving properties and a natural connection to clustering. Some potential applications and illustrative examples are discussed.
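A minimal sketch of the kind of algorithm described here, assuming numpy/scipy and using our own variable names (an illustration, not the authors' reference implementation): build a k-nearest-neighbor graph with heat-kernel weights, form the graph Laplacian L = D − W, and embed each point using the lowest non-trivial generalized eigenvectors of L y = λ D y.

```python
import numpy as np
from scipy.linalg import eigh

def laplacian_eigenmaps(X, n_components=2, k=10, t=1.0):
    """Sketch of a Laplacian-eigenmaps-style embedding (illustrative only)."""
    n = X.shape[0]
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # squared distances
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]          # k nearest neighbors, excluding self
        W[i, nbrs] = np.exp(-d2[i, nbrs] / t)      # heat-kernel weights
    W = np.maximum(W, W.T)                         # symmetrize the graph
    D = np.diag(W.sum(axis=1))
    L = D - W                                      # graph Laplacian
    vals, vecs = eigh(L, D)                        # generalized eigenproblem L y = lam D y
    return vecs[:, 1:n_components + 1]             # drop the trivial constant eigenvector

# Toy usage: noisy points on a curve in 3-D, mapped down to 2-D.
rng = np.random.default_rng(0)
theta = np.sort(rng.uniform(0, 3 * np.pi, 200))
X = np.c_[np.cos(theta), np.sin(theta), 0.1 * theta] + 0.02 * rng.normal(size=(200, 3))
Y = laplacian_eigenmaps(X, n_components=2, k=8)
print(Y.shape)  # (200, 2)
```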

7,210 citations


Journal ArticleDOI
John W. Belmont1, Paul Hardenbol, Thomas D. Willis, Fuli Yu1, Huanming Yang2, Lan Yang Ch'Ang, Wei Huang3, Bin Liu2, Yan Shen3, Paul K.H. Tam4, Lap-Chee Tsui4, Mary M.Y. Waye5, Jeffrey Tze Fei Wong6, Changqing Zeng2, Qingrun Zhang2, Mark S. Chee7, Luana Galver7, Semyon Kruglyak7, Sarah S. Murray7, Arnold Oliphant7, Alexandre Montpetit8, Fanny Chagnon8, Vincent Ferretti8, Martin Leboeuf8, Michael S. Phillips8, Andrei Verner8, Shenghui Duan9, Denise L. Lind10, Raymond D. Miller9, John P. Rice9, Nancy L. Saccone9, Patricia Taillon-Miller9, Ming Xiao10, Akihiro Sekine, Koki Sorimachi, Yoichi Tanaka, Tatsuhiko Tsunoda, Eiji Yoshino, David R. Bentley11, Sarah E. Hunt11, Don Powell11, Houcan Zhang12, Ichiro Matsuda13, Yoshimitsu Fukushima14, Darryl Macer15, Eiko Suda15, Charles N. Rotimi16, Clement Adebamowo17, Toyin Aniagwu17, Patricia A. Marshall18, Olayemi Matthew17, Chibuzor Nkwodimmah17, Charmaine D.M. Royal16, Mark Leppert19, Missy Dixon19, Fiona Cunningham20, Ardavan Kanani20, Gudmundur A. Thorisson20, Peter E. Chen21, David J. Cutler21, Carl S. Kashuk21, Peter Donnelly22, Jonathan Marchini22, Gilean McVean22, Simon Myers22, Lon R. Cardon22, Andrew P. Morris22, Bruce S. Weir23, James C. Mullikin24, Michael Feolo24, Mark J. Daly25, Renzong Qiu26, Alastair Kent, Georgia M. Dunston16, Kazuto Kato27, Norio Niikawa28, Jessica Watkin29, Richard A. Gibbs1, Erica Sodergren1, George M. Weinstock1, Richard K. Wilson9, Lucinda Fulton9, Jane Rogers11, Bruce W. Birren25, Hua Han2, Hongguang Wang, Martin Godbout30, John C. Wallenburg8, Paul L'Archevêque, Guy Bellemare, Kazuo Todani, Takashi Fujita, Satoshi Tanaka, Arthur L. Holden, Francis S. Collins24, Lisa D. Brooks24, Jean E. McEwen24, Mark S. Guyer24, Elke Jordan31, Jane Peterson24, Jack Spiegel24, Lawrence M. Sung32, Lynn F. Zacharia24, Karen Kennedy29, Michael Dunn29, Richard Seabrook29, Mark Shillito, Barbara Skene29, John Stewart29, David Valle21, Ellen Wright Clayton33, Lynn B. Jorde19, Aravinda Chakravarti21, Mildred K. Cho34, Troy Duster35, Troy Duster36, Morris W. Foster37, Maria Jasperse38, Bartha Maria Knoppers39, Pui-Yan Kwok10, Julio Licinio40, Jeffrey C. Long41, Pilar N. Ossorio42, Vivian Ota Wang33, Charles N. Rotimi16, Patricia Spallone29, Patricia Spallone43, Sharon F. Terry44, Eric S. Lander25, Eric H. Lai45, Deborah A. Nickerson46, Gonçalo R. Abecasis41, David Altshuler47, Michael Boehnke41, Panos Deloukas11, Julie A. Douglas41, Stacey Gabriel25, Richard R. Hudson48, Thomas J. Hudson8, Leonid Kruglyak49, Yusuke Nakamura50, Robert L. Nussbaum24, Stephen F. Schaffner25, Stephen T. Sherry24, Lincoln Stein20, Toshihiro Tanaka 
18 Dec 2003-Nature
TL;DR: The HapMap will allow the discovery of sequence variants that affect common disease, will facilitate development of diagnostic tools, and will enhance the ability to choose targets for therapeutic intervention.
Abstract: The goal of the International HapMap Project is to determine the common patterns of DNA sequence variation in the human genome and to make this information freely available in the public domain. An international consortium is developing a map of these patterns across the genome by determining the genotypes of one million or more sequence variants, their frequencies and the degree of association between them, in DNA samples from populations with ancestry from parts of Africa, Asia and Europe. The HapMap will allow the discovery of sequence variants that affect common disease, will facilitate development of diagnostic tools, and will enhance our ability to choose targets for therapeutic intervention.

5,926 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present full sky microwave maps in five frequency bands (23 to 94 GHz) from the WMAP first year sky survey, which are consistent with the 7° full-width at half-maximum (FWHM) Cosmic Background Explorer (COBE) maps.
Abstract: We present full sky microwave maps in five frequency bands (23 to 94 GHz) from the WMAP first year sky survey. Calibration errors are less than 0.5% and the low systematic error level is well specified. The cosmic microwave background (CMB) is separated from the foregrounds using multifrequency data. The sky maps are consistent with the 7° full-width at half-maximum (FWHM) Cosmic Background Explorer (COBE) maps. We report more precise, but consistent, dipole and quadrupole values. The CMB anisotropy obeys Gaussian statistics with -58 < f_NL < 134 (95% CL). The 2 <= l <= 900 anisotropy power spectrum is cosmic variance limited for l < 354 with a signal-to-noise ratio greater than 1 per mode to l = 658. The temperature-polarization cross-power spectrum reveals both acoustic features and a large angle correlation from reionization. The optical depth of reionization is tau = 0.17 +/- 0.04, which implies a reionization epoch of t_r = 180 +220/-80 Myr (95% CL) after the Big Bang at a redshift of z_r = 20 +10/-9 (95% CL) for a range of ionization scenarios. This early reionization is incompatible with the presence of a significant warm dark matter density. A best-fit cosmological model to the CMB and other measures of large scale structure works remarkably well with only a few parameters. The age of the best-fit universe is t_0 = 13.7 +/- 0.2 Gyr. Decoupling was t_dec = 379 +8/-7 kyr after the Big Bang at a redshift of z_dec = 1089 +/- 1. The thickness of the decoupling surface was Delta z_dec = 195 +/- 2. The matter density of the universe is Omega_m h^2 = 0.135 +0.008/-0.009, the baryon density is Omega_b h^2 = 0.0224 +/- 0.0009, and the total mass-energy of the universe is Omega_tot = 1.02 +/- 0.02. There is progressively less fluctuation power on smaller scales, from WMAP to fine scale CMB measurements to galaxies and finally to the Ly-alpha forest. This is accounted for with a running spectral index, significant at the approximately 2 sigma level. The spectral index of scalar fluctuations is fit as n_s = 0.93 +/- 0.03 at wavenumber k_0 = 0.05 Mpc^-1 (l_eff approximately 700), with a slope of dn_s/d ln k = -0.031 +0.016/-0.018 in the best-fit model.
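For context on "cosmic variance limited": with only one sky to observe, each multipole C_l can be measured no better than the sampling error set by its 2l+1 available modes. The standard full-sky relation (our addition, not stated in the abstract) is

```latex
\frac{\Delta C_\ell}{C_\ell} \;=\; \sqrt{\frac{2}{2\ell + 1}}
```

so the statement above means that for l < 354 the per-multipole instrument noise is already below this irreducible floor.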

4,821 citations


Journal ArticleDOI
14 Aug 2003-Nature
TL;DR: This research presents the next generation of single-beam optical traps, which promise to take optical tweezers out of the laboratory and into the mainstream of manufacturing and diagnostics and even become consumer products.
Abstract: Optical tweezers use the forces exerted by a strongly focused beam of light to trap and move objects ranging in size from tens of nanometres to tens of micrometres. Since their introduction in 1986, the optical tweezer has become an important tool for research in the fields of biology, physical chemistry and soft condensed matter physics. Recent advances promise to take optical tweezers out of the laboratory and into the mainstream of manufacturing and diagnostics; they may even become consumer products. The next generation of single-beam optical traps offers revolutionary new opportunities for fundamental and applied research.
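The trapping mechanism summarized above can be stated compactly: for a particle small compared with the wavelength, the time-averaged gradient force scales with the particle polarizability α and the gradient of the beam intensity I (a textbook relation added here for orientation; it is not spelled out in the abstract):

```latex
\mathbf{F}_{\mathrm{grad}} \;\propto\; \alpha\,\nabla I(\mathbf{r})
```

so a dielectric particle with positive polarizability is drawn toward the intensity maximum at the focus of the strongly focused beam.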

4,647 citations


Proceedings Article
09 Dec 2003
TL;DR: Locality Preserving Projections (LPP) are linear projective maps that arise by solving a variational problem that optimally preserves the neighborhood structure of the data set; they are obtained by finding the optimal linear approximations to the eigenfunctions of the Laplace Beltrami operator on the manifold.
Abstract: Many problems in information processing involve some form of dimensionality reduction. In this paper, we introduce Locality Preserving Projections (LPP). These are linear projective maps that arise by solving a variational problem that optimally preserves the neighborhood structure of the data set. LPP should be seen as an alternative to Principal Component Analysis (PCA) – a classical linear technique that projects the data along the directions of maximal variance. When the high dimensional data lies on a low dimensional manifold embedded in the ambient space, the Locality Preserving Projections are obtained by finding the optimal linear approximations to the eigenfunctions of the Laplace Beltrami operator on the manifold. As a result, LPP shares many of the data representation properties of nonlinear techniques such as Laplacian Eigenmaps or Locally Linear Embedding. Yet LPP is linear and more crucially is defined everywhere in ambient space rather than just on the training data points. This is borne out by illustrative examples on some high dimensional data sets.
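A sketch of the linear analogue described here, under the same caveats as the Laplacian-eigenmaps sketch above (our reading of the method, not the authors' code): the projection directions solve the generalized eigenproblem X^T L X a = λ X^T D X a, and, unlike a purely nonlinear embedding, the resulting matrix can be applied to unseen points.

```python
import numpy as np
from scipy.linalg import eigh

def lpp(X, n_components=2, k=10, t=1.0, reg=1e-6):
    """Sketch of a Locality-Preserving-Projections-style linear map (illustrative only)."""
    n, d = X.shape
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]
        W[i, nbrs] = np.exp(-d2[i, nbrs] / t)
    W = np.maximum(W, W.T)
    D = np.diag(W.sum(axis=1))
    L = D - W
    A = X.T @ L @ X                                # d x d
    B = X.T @ D @ X + reg * np.eye(d)              # small ridge keeps B positive definite
    vals, vecs = eigh(A, B)                        # eigenvalues in ascending order
    return vecs[:, :n_components]                  # columns are projection directions

# Usage: fit on training data, then project any point with the same linear map.
rng = np.random.default_rng(1)
X_train = rng.normal(size=(300, 20))
P = lpp(X_train, n_components=3)
Z_new = rng.normal(size=(5, 20)) @ P               # defined everywhere in ambient space
print(P.shape, Z_new.shape)  # (20, 3) (5, 3)
```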

4,318 citations


Journal ArticleDOI
TL;DR: Extending Olley and Pakes, the authors show when intermediate inputs (i.e., those inputs which are typically subtracted out in a value-added production function) can also solve the simultaneity problem between input choices and unobserved productivity shocks, and discuss some potential benefits of expanding the choice set of proxies to include these inputs.
Abstract: Economists began relating output to inputs in the early 1800's. A large literature on estimating production functions has followed, in part because much of economic theory yields testable implications that are related to the technology and optimizing behaviour.1 Since at least as early as Marschak and Andrews (1944), applied researchers have worried about the potential correlation between input levels and the unobserved firm-specific productivity shocks in the estimation of production function parameters. The economics underlying this concern are intuitive. Firms that have a large positive productivity shock may respond by using more inputs. To the extent that this is true, ordinary least squares (OLS) estimates of production functions will yield biased parameter estimates, and, by implication, biased estimates of productivity. Many alternatives to OLS have been proposed, and we add to this set by extending Olley and Pakes (1996). They show the conditions under which an investment proxy controls for correlation between input levels and the unobserved productivity shock. Their approach has the advantage that, for many questions, it is no more difficult to implement than OLS. We show when intermediate inputs (those inputs which are typically subtracted out in a value-added production function) can also solve this simultaneity problem. We discuss some potential benefits of expanding the choice set of proxies to include these inputs.
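The simultaneity problem described above is easy to reproduce in a few lines. The toy simulation below (entirely ours; not the authors' estimator or data) generates a log-linear production function in which labor responds to an unobserved productivity shock, so OLS overstates the labor coefficient:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000
beta_l, beta_k = 0.6, 0.3                      # true elasticities

omega = rng.normal(0.0, 0.5, n)                # unobserved productivity shock
k = rng.normal(0.0, 1.0, n)                    # log capital (quasi-fixed)
l = 0.5 * k + 0.8 * omega + rng.normal(0.0, 0.3, n)            # labor responds to omega
y = beta_l * l + beta_k * k + omega + rng.normal(0.0, 0.1, n)  # log output

X = np.c_[np.ones(n), l, k]
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"true beta_l = {beta_l}, OLS estimate = {b_ols[1]:.3f}")  # biased upward
```

The paper's remedy, loosely put, is to use an observed intermediate input (such as materials or electricity) whose demand is monotonic in the productivity shock as a proxy, inverting that demand in a first stage before estimating the remaining coefficients.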

3,901 citations


Journal ArticleDOI
Abstract: We present full sky microwave maps in five bands (23 to 94 GHz) from the WMAP first year sky survey. Calibration errors are less than 0.5%, and the anisotropy power spectrum is cosmic variance limited for l < 354 with a signal-to-noise ratio greater than 1 per mode to l = 658. The temperature-polarization cross-power spectrum reveals both acoustic features and a large angle correlation from reionization. The optical depth of reionization is 0.17 +/- 0.04, which implies a reionization epoch of 180+220-80 Myr (95% CL) after the Big Bang at a redshift of 20+10-9 (95% CL) for a range of ionization scenarios. This early reionization is incompatible with the presence of a significant warm dark matter density. The age of the best-fit universe is 13.7 +/- 0.2 Gyr. Decoupling was 379+8-7 kyr after the Big Bang at a redshift of 1089 +/- 1. The thickness of the decoupling surface was dz=195 +/- 2. The matter density is Omega_m h^2 = 0.135 +0.008 -0.009, the baryon density is Omega_b h^2 = 0.0224 +/- 0.0009, and the total mass-energy of the universe is Omega_tot = 1.02 +/- 0.02. The spectral index of scalar fluctuations is fit as n_s = 0.93 +/- 0.03 at wavenumber k_0 = 0.05 Mpc^-1, with a running index slope of dn_s/d ln k = -0.031 +0.016 -0.018 in the best-fit model. This flat universe model is composed of 4.4% baryons, 22% dark matter and 73% dark energy. The dark energy equation of state is limited to w < -0.78 (95% CL). Inflation theory is supported with n_s ~ 1, Omega_tot ~ 1, Gaussian random phases of the CMB anisotropy, and superhorizon fluctuations. An admixture of isocurvature modes does not improve the fit. The tensor-to-scalar ratio is r(k_0=0.002 Mpc^-1) < 0.90 (95% CL).
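The 4.4% / 22% / 73% composition quoted above follows directly from the fitted densities once the Hubble constant from the companion parameter paper (h ≈ 0.71, first entry in this list) is inserted:

```latex
\Omega_b = \frac{\Omega_b h^2}{h^2} = \frac{0.0224}{0.71^2} \approx 0.044,
\qquad
\Omega_m = \frac{0.135}{0.71^2} \approx 0.27,
```

so the cold dark matter fraction is Ω_m − Ω_b ≈ 0.22 and, for a flat universe, Ω_Λ ≈ 1 − 0.27 ≈ 0.73.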

3,868 citations


Journal ArticleDOI
TL;DR: Imatinib was superior to interferon alfa plus low-dose cytarabine as first-line therapy in newly diagnosed chronic-phase CML and was better tolerated than combination therapy.
Abstract: Background Imatinib, a selective inhibitor of the BCR-ABL tyrosine kinase, produces high response rates in patients with chronic-phase chronic myeloid leukemia (CML) who have had no response to interferon alfa. We compared the efficacy of imatinib with that of interferon alfa combined with low-dose cytarabine in newly diagnosed chronic-phase CML. Methods We randomly assigned 1106 patients to receive imatinib (553 patients) or interferon alfa plus low-dose cytarabine (553 patients). Crossover to the alternative group was allowed if stringent criteria defining treatment failure or intolerance were met. Patients were evaluated for hematologic and cytogenetic responses, toxic effects, and rates of progression. Results After a median follow-up of 19 months, the estimated rate of a major cytogenetic response (0 to 35 percent of cells in metaphase positive for the Philadelphia chromosome) at 18 months was 87.1 percent (95 percent confidence interval, 84.1 to 90.0) in the imatinib group and 34.7 percent (95 perce...

3,399 citations


Journal ArticleDOI
TL;DR: In this paper, the authors investigate whether and how individual managers affect corporate behavior and performance and show that managers with higher performance effects receive higher compensation and are more likely to be found in better governed environments.
Abstract: This paper investigates whether and how individual managers affect corporate behavior and performance. We construct a manager-firm matched panel data set which enables us to track the top managers across different firms over time. We find that manager fixed effects matter for a wide range of corporate decisions. A significant extent of the heterogeneity in investment, financial, and organizational practices of firms can be explained by the presence of manager fixed effects. We identify specific patterns in managerial decision-making that appear to indicate general differences in “style” across managers. Moreover, we show that management style is significantly related to manager fixed effects in performance and that managers with higher performance fixed effects receive higher compensation and are more likely to be found in better governed firms. In a final step, we tie back these findings to observable managerial characteristics. We find that executives from earlier birth cohorts appear on average to be more conservative; on the other hand, managers who hold an MBA degree seem to follow on average more aggressive strategies.
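A toy version of the fixed-effects comparison described above (entirely our construction, with made-up data): simulate a manager-firm matched panel and ask how much explanatory power manager dummies add once firm dummies are already included.

```python
import numpy as np

rng = np.random.default_rng(0)
n_firms, n_mgrs, n_obs = 50, 80, 2000
firm = rng.integers(0, n_firms, n_obs)
mgr = rng.integers(0, n_mgrs, n_obs)
firm_fe = rng.normal(0.0, 1.0, n_firms)
mgr_fe = rng.normal(0.0, 0.5, n_mgrs)                 # manager "style" effects
y = firm_fe[firm] + mgr_fe[mgr] + rng.normal(0.0, 1.0, n_obs)   # e.g. an investment rate

def r_squared(X, y):
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return 1.0 - resid.var() / y.var()

D_firm = np.eye(n_firms)[firm]                        # firm dummies
D_mgr = np.eye(n_mgrs)[mgr]                           # manager dummies
print("R^2, firm effects only:      ", round(r_squared(D_firm, y), 3))
print("R^2, firm + manager effects: ", round(r_squared(np.hstack([D_firm, D_mgr]), y), 3))
```

In the paper, separating manager from firm effects relies on tracking managers who move across firms, which is what the matched panel described above makes possible.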

Journal ArticleDOI
TL;DR: A signaling pathway initiated by eIF2alpha phosphorylation protects cells against metabolic consequences of ER oxidation by promoting the linked processes of amino acid sufficiency and resistance to oxidative stress.

Journal ArticleDOI
TL;DR: The term "vulnerable patient" may be more appropriate and is proposed for the identification of subjects with a high likelihood of developing cardiac events in the near future; a quantitative method for cumulative risk assessment of vulnerable patients also needs to be developed.
Abstract: Atherosclerotic cardiovascular disease results in >19 million deaths annually, and coronary heart disease accounts for the majority of this toll. Despite major advances in treatment of coronary heart disease patients, a large number of victims of the disease who are apparently healthy die suddenly without prior symptoms. Available screening and diagnostic methods are insufficient to identify the victims before the event occurs. The recognition of the role of the vulnerable plaque has opened new avenues of opportunity in the field of cardiovascular medicine. This consensus document concludes the following. (1) Rupture-prone plaques are not the only vulnerable plaques. All types of atherosclerotic plaques with high likelihood of thrombotic complications and rapid progression should be considered as vulnerable plaques. We propose a classification for clinical as well as pathological evaluation of vulnerable plaques. (2) Vulnerable plaques are not the only culprit factors for the development of acute coronary syndromes, myocardial infarction, and sudden cardiac death. Vulnerable blood (prone to thrombosis) and vulnerable myocardium (prone to fatal arrhythmia) play an important role in the outcome. Therefore, the term "vulnerable patient" may be more appropriate and is proposed now for the identification of subjects with high likelihood of developing cardiac events in the near future. (3) A quantitative method for cumulative risk assessment of vulnerable patients needs to be developed that may include variables based on plaque, blood, and myocardial vulnerability. In Part I of this consensus document, we cover the new definition of vulnerable plaque and its relationship with vulnerable patients. Part II of this consensus document focuses on vulnerable blood and vulnerable myocardium and provides an outline of overall risk assessment of vulnerable patients. Parts I and II are meant to provide a general consensus and an overview of the new field of the vulnerable patient. Recently developed assays (eg, C-reactive protein), imaging techniques (eg, CT and MRI), noninvasive electrophysiological tests (for vulnerable myocardium), and emerging catheters (to localize and characterize vulnerable plaque) in combination with future genomic and proteomic techniques will guide us in the search for vulnerable patients. They will also lead to the development and deployment of new therapies and ultimately reduce the incidence of acute coronary syndromes and sudden cardiac death. We encourage healthcare policy makers to promote translational research for screening and treatment of vulnerable patients.

Journal ArticleDOI
TL;DR: Treatment with pemetrexed plus cisplatin and vitamin supplementation resulted in superior survival time, time to progression, and response rates compared with treatment with cisplatin alone in patients with malignant pleural mesothelioma.
Abstract: Purpose: Patients with malignant pleural mesothelioma, a rapidly progressing malignancy with a median survival time of 6 to 9 months, have previously responded poorly to chemotherapy. We conducted a phase III trial to determine whether treatment with pemetrexed and cisplatin results in survival time superior to that achieved with cisplatin alone. Patients and Methods: Chemotherapy-naive patients who were not eligible for curative surgery were randomly assigned to receive pemetrexed 500 mg/m2 and cisplatin 75 mg/m2 on day 1, or cisplatin 75 mg/m2 on day 1. Both regimens were given intravenously every 21 days. Results: A total of 456 patients were assigned: 226 received pemetrexed and cisplatin, 222 received cisplatin alone, and eight never received therapy. Median survival time in the pemetrexed/cisplatin arm was 12.1 months versus 9.3 months in the control arm (P = .020, two-sided log-rank test). The hazard ratio for death of patients in the pemetrexed/cisplatin arm versus those in the control arm was 0.7...

Journal ArticleDOI
TL;DR: In this paper, the authors used Monte Carlo realizations of different star formation histories, including starbursts of varying strength and a range of metallicities, to constrain the mean stellar ages of galaxies and the fractional stellar mass formed in bursts over the past few Gyr.
Abstract: We develop a new method to constrain the star formation histories, dust attenuation and stellar masses of galaxies. It is based on two stellar absorption-line indices, the 4000-Å break strength and the Balmer absorption-line index HδA. Together, these indices allow us to constrain the mean stellar ages of galaxies and the fractional stellar mass formed in bursts over the past few Gyr. A comparison with broad-band photometry then yields estimates of dust attenuation and of stellar mass. We generate a large library of Monte Carlo realizations of different star formation histories, including starbursts of varying strength and a range of metallicities. We use this library to generate median likelihood estimates of burst mass fractions, dust attenuation strengths, stellar masses and stellar mass-to-light ratios for a sample of 122 808 galaxies drawn from the Sloan Digital Sky Survey. The typical 95 per cent confidence range in our estimated stellar masses is ±40 per cent. We study how the stellar mass-to-light ratios of galaxies vary as a function of absolute magnitude, concentration index and photometric passband and how dust attenuation varies as a function of absolute magnitude and 4000-Å break strength. We also calculate how the total stellar mass of the present Universe is distributed over galaxies as a function of their mass, size, concentration, colour, burst mass fraction and surface mass density. We find that most of the stellar mass in the local Universe resides in galaxies that have, to within a factor of approximately 2, stellar masses ∼5 × 10^10 M⊙, half-light radii ∼3 kpc and half-light surface mass densities ∼10^9 M⊙ kpc^−2. The distribution of D_n(4000) is strongly bimodal, showing a clear division between galaxies dominated by old stellar populations and galaxies with more recent star formation.
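For reference, the narrow 4000-Å break index referred to here is conventionally defined (in the Balogh et al. narrow-band convention, added here for clarity) as the ratio of the average flux density F_ν in two bands straddling the break:

```latex
D_n(4000) \;=\;
\frac{\langle F_\nu \rangle_{\text{4000--4100 Å}}}
     {\langle F_\nu \rangle_{\text{3850--3950 Å}}}
```

Old stellar populations produce a strong break (large D_n(4000)), while recent star formation fills it in, which is why the bimodality noted above separates old from star-forming galaxies.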

Journal ArticleDOI
TL;DR: In this paper, the authors propose an interest group theory of financial development in which incumbents oppose financial development because it breeds competition; the theory predicts that incumbents' opposition will be weaker when an economy allows both cross-border trade and capital flows.

Journal ArticleDOI
TL;DR: Using neoadjuvant methotrexate, vinblastine, doxorubicin, and cisplatin followed by radical cystectomy increases the likelihood of eliminating residual cancer in the cystectomy specimen and is associated with improved survival among patients with locally advanced bladder cancer.
Abstract: BACKGROUND Despite aggressive local therapy, patients with locally advanced bladder cancer are at significant risk for metastases. We evaluated the ability of neoadjuvant chemotherapy to improve the outcome in patients with locally advanced bladder cancer who were treated with radical cystectomy. METHODS Patients were enrolled if they had muscle-invasive bladder cancer (stage T2 to T4a) and were to be treated with radical cystectomy. They were stratified according to age (less than 65 years vs. 65 years or older) and stage (superficial muscle invasion vs. more extensive disease) and were randomly assigned to radical cystectomy alone or three cycles of methotrexate, vinblastine, doxorubicin, and cisplatin followed by radical cystectomy. RESULTS We enrolled 317 patients over an 11-year period, 10 of whom were found to be ineligible; thus, 154 were assigned to receive surgery alone and 153 to receive combination therapy. According to an intention-to-treat analysis, the median survival among patients assigned to surgery alone was 46 months, as compared with 77 months among patients assigned to combination therapy (P=0.06 by a two-sided stratified log-rank test). In both groups, improved survival was associated with the absence of residual cancer in the cystectomy specimen. Significantly more patients in the combination-therapy group had no residual disease than patients in the cystectomy group (38 percent vs. 15 percent, P<0.001). CONCLUSIONS As compared with radical cystectomy alone, the use of neoadjuvant methotrexate, vinblastine, doxorubicin, and cisplatin followed by radical cystectomy increases the likelihood of eliminating residual cancer in the cystectomy specimen and is associated with improved survival among patients with locally advanced bladder cancer.

Journal ArticleDOI
TL;DR: The concept of indexical order is introduced in this article to analyze how semiotic agents access macro-sociological plane categories and concepts as values in the indexable realm of the micro-contextual.

Journal ArticleDOI
TL;DR: In this article, the authors show that Hong Kong, Malaysia, Singapore and Thailand provide rare insight into the interaction between accounting standards and the incentives of managers and auditors: financial reporting quality in these countries is not higher than under code law, with quality operationalized as timely recognition of economic income.

Journal ArticleDOI
TL;DR: The authors argue that capabilities can help us to construct a normative conception of social justice, with critical potential for gender issues, only if we specify a definite set of capabilities as the most important ones to protect.
Abstract: Amartya Sen has made a major contribution to the theory of social justice, and of gender justice, by arguing that capabilities are the relevant space of comparison when justice-related issues are considered. This article supports Sen's idea, arguing that capabilities supply guidance superior to that of utility and resources (the view's familiar opponents), but also to that of the social contract tradition, and at least some accounts of human rights. But I argue that capabilities can help us to construct a normative conception of social justice, with critical potential for gender issues, only if we specify a definite set of capabilities as the most important ones to protect. Sen's "perspective of freedom" is too vague. Some freedoms limit others; some freedoms are important, some trivial, some good, and some positively bad. Before the approach can offer a valuable normative gender perspective, we must make commitments about substance.

Journal ArticleDOI
TL;DR: In this article, the authors discuss how metallicity affects the evolution and final fate of massive stars, and derive the relative populations of the resulting remnants and supernova types as a function of metallicity.
Abstract: How massive stars die-what sort of explosion and remnant each produces-depends chiefly on the masses of their helium cores and hydrogen envelopes at death. For single stars, stellar winds are the only means of mass loss, and these are a function of the metallicity of the star. We discuss how metallicity, and a simplified prescription for its effect on mass loss, affects the evolution and final fate of massive stars. We map, as a function of mass and metallicity, where black holes and neutron stars are likely to form and where different types of supernovae are produced. Integrating over an initial mass function, we derive the relative populations as a function of metallicity. Provided that single stars rotate rapidly enough at death, we speculate on stellar populations that might produce gamma-ray bursts and jet-driven supernovae.

Journal ArticleDOI
TL;DR: In this article, the authors present experimental evidence in support of an additional factor: women may be less effective than men in competitive environments, even if they are able to perform similarly in non-competitive environments.
Abstract: Even though the provision of equal opportunities for men and women has been a priority in many countries, large gender differences prevail in competitive high-ranking positions. Suggested explanations include discrimination and differences in preferences and human capital. In this paper we present experimental evidence in support of an additional factor: women may be less effective than men in competitive environments, even if they are able to perform similarly in non-competitive environments. In a laboratory experiment we observe, as we increase the competitiveness of the environment, a significant increase in performance for men, but not for women. This results in a significant gender gap in performance in tournaments, while there is no gap when participants are paid according to piece rate. This effect is stronger when women have to compete against men than in single-sex competitive environments: this suggests that women may be able to perform in competitive environments per se.

Book
01 Jan 2003
TL;DR: In this article, Carles Boix offers a complete theory of political transitions, in which political regimes ultimately hinge on the nature of economic assets, their distribution among individuals, and the balance of power among different social groups.
Abstract: When do countries democratize? What facilitates the survival of authoritarian regimes? What determines the occurrence of revolutions, often leading to left-wing dictatorships, such as the Soviet regime? Although a large literature has developed from Aristotle through contemporary political science to answer these questions, we still lack a convincing understanding of the process of political development. Employing analytical tools borrowed from game theory, Carles Boix offers a complete theory of political transitions, in which political regimes ultimately hinge on the nature of economic assets, their distribution among individuals, and the balance of power among different social groups. Backed up by detailed historical work and extensive statistical analysis that goes back to the mid-nineteenth century, this 2003 book explains why democracy emerged in classical Athens. It also discusses the early triumph of democracy in nineteenth-century agrarian Norway, Switzerland and northeastern America, and its failure in countries with a powerful landowning class.

Journal ArticleDOI
TL;DR: Despite an increased focus on pain management programs and the development of new standards for pain management, many patients continue to experience intense pain after surgery and additional efforts are required to improve patients’ postoperative pain experience.
Abstract: Postoperative pain can have a significant effect on patient recovery. An understanding of patient attitudes and concerns about postoperative pain is important for identifying ways health care professionals can improve postoperative care. To assess patients’ postoperative pain experience and the stat

Journal ArticleDOI
TL;DR: The results suggest that CD occurs frequently not only in patients with gastrointestinal symptoms, but also in first- and second-degree relatives and patients with numerous common disorders even in the absence of gastrointestinal symptoms.
Abstract: Background Celiac disease (CD) is an immune-mediated enteropathic condition triggered in genetically susceptible individuals by the ingestion of gluten. Although common in Europe, CD is thought to be rare in the United States, where there are no large epidemiologic studies of its prevalence. The aim of this study was to determine the prevalence of CD in at-risk and not-at-risk groups in the United States. Methods Serum antigliadin antibodies and anti–endomysial antibodies (EMA) were measured. In EMA-positive subjects, human tissue transglutaminase IgA antibodies and CD-associated human leukocyte antigen DQ2/DQ8 haplotypes were determined. Intestinal biopsy was recommended and performed whenever possible for all EMA-positive subjects. A total of 13 145 subjects were screened: 4508 first-degree and 1275 second-degree relatives of patients with biopsy-proven CD, 3236 symptomatic patients (with either gastrointestinal symptoms or a disorder associated with CD), and 4126 not-at-risk individuals. Results In at-risk groups, the prevalence of CD was 1:22 in first-degree relatives, 1:39 in second-degree relatives, and 1:56 in symptomatic patients. The overall prevalence of CD in not-at-risk groups was 1:133. All the EMA-positive subjects who underwent intestinal biopsy had lesions consistent with CD. Conclusions Our results suggest that CD occurs frequently not only in patients with gastrointestinal symptoms, but also in first- and second-degree relatives and patients with numerous common disorders even in the absence of gastrointestinal symptoms. The prevalence of CD in symptomatic patients and not-at-risk subjects was similar to that reported in Europe. Celiac disease appears to be a more common but neglected disorder than has generally been recognized in the United States.

Journal ArticleDOI
TL;DR: In this article, the authors present a model of mergers and acquisitions based on stock market misvaluations of the combining firms and the market's perception of the synergies from the combination.

Journal ArticleDOI
TL;DR: Corporate transparency is defined as the availability of firm-specific information to those outside publicly traded firms, and viewed as the joint output of multi-faceted systems whose components collectively produce, gather, validate and disseminate information to market participants as discussed by the authors.
Abstract: We investigate corporate transparency, defined as the availability of firm-specific information to those outside publicly traded firms, and viewed as the joint output of multi-faceted systems whose components collectively produce, gather, validate and disseminate information to market participants. We factor analyze an extensive range of measures capturing countries' firm-specific information environments, and isolate two factors interpreted as financial transparency and governance transparency. We investigate whether these factors vary with the countries' legal/judicial regimes and political economies. Our main multivariate result is that the governance transparency factor is primarily related to a country's legal/judicial regime, while the financial transparency factor is primarily related to political economy.

Journal ArticleDOI
TL;DR: The data derived from studies using the reinstatement model suggest that the neuronal events that mediate drug-, cue- and stress-induced reinstatement of drug seeking are not identical, and that the duration of the withdrawal period following cocaine and heroin self-administration has a profound effect on reinstatement induced by drug cues and stress.
Abstract: The reinstatement model is currently used in many laboratories to investigate mechanisms underlying relapse to drug seeking. Here, we review briefly the history of the model and describe the different procedures that have been used to study the phenomenon of reinstatement of drug seeking. The results from studies using pharmacological and neuroanatomical techniques to determine the neuronal events that mediate reinstatement of heroin, cocaine and alcohol seeking by acute priming injections of drugs, drug-associated cues and environmental stressors are summarized. In addition, several issues are discussed, including (1) the concordance between the neuronal mechanisms involved in drug-induced reinstatement and those involved in drug reward and discrimination, (2) the role of drug withdrawal states and periods in reinstatement of drug seeking, (3) the role of neuronal adaptations induced by exposure to drugs in relapse, and (4) the degree to which the rat reinstatement model provides a suitable preclinical model of relapse to drug taking. The data derived from studies using the reinstatement model suggest that the neuronal events that mediate drug-, cue- and stress-induced reinstatement of drug seeking are not identical, that the mechanisms underlying drug-induced reinstatement are to some degree different from those mediating drug discrimination or reward, and that the duration of the withdrawal period following cocaine and heroin self-administration has a profound effect on reinstatement induced by drug cues and stress. Finally, there appears to be a good correspondence between the events that induce reinstatement in laboratory animals and those that provoke relapse in humans.

Journal ArticleDOI
27 Mar 2003-Neuron
TL;DR: It is shown that neuronal activity modulates the formation and secretion of Abeta peptides in hippocampal slice neurons that overexpress APP, and it is proposed that activity-dependent modulation of endogenous Abeta production may normally participate in a negative feedback that could keep neuronal hyperactivity in check.

Journal ArticleDOI
19 Nov 2003-JAMA
TL;DR: This study suggests that players with a history of previous concussions are more likely to have future concussive injuries than those with no history; 1 in 15 players with a concussion may have additional concussions in the same playing season; and previous concussions may be associated with slower recovery of neurological function.
Abstract: Context: Approximately 300 000 sport-related concussions occur annually in the United States, and the likelihood of serious sequelae may increase with repeated head injury. Objective: To estimate the incidence of concussion and time to recovery after concussion in collegiate football players. Design, Setting, and Participants: Prospective cohort study of 2905 football players from 25 US colleges, tested at preseason baseline in 1999, 2000, and 2001 on a variety of measures and followed up prospectively to ascertain concussion occurrence. Players injured with a concussion were monitored until their concussion symptoms resolved and were followed up for repeat concussions until completion of their collegiate football career or until the end of the 2001 football season. Main Outcome Measures: Incidence of concussion and repeat concussion; type and duration of symptoms and course of recovery among players who were injured with a concussion during the seasons. Results: During follow-up of 4251 player-seasons, 184 players (6.3%) had a concussion, and 12 (6.5%) of these players had a repeat concussion within the same season. There was an association between reported number of previous concussions and likelihood of incident concussion. Players reporting a history of 3 or more previous concussions were 3.0 (95% confidence interval, 1.6-5.6) times more likely to have an incident concussion than players with no concussion history. Headache was the most commonly reported symptom at the time of injury (85.2%), and mean overall symptom duration was 82 hours. Slowed recovery was associated with a history of multiple previous concussions (30.0% of those with ≥3 previous concussions had symptoms lasting >1 week compared with 14.6% of those with 1 previous concussion). Of the 12 incident within-season repeat concussions, 11 (91.7%) occurred within 10 days of the first injury, and 9 (75.0%) occurred within 7 days of the first injury. Conclusions: Our study suggests that players with a history of previous concussions are more likely to have future concussive injuries than those with no history; 1 in 15 players with a concussion may have additional concussions in the same playing season; and previous concussions may be associated with slower recovery of neurological function.