
Showing papers by "University of Chicago", published in 2005


Journal ArticleDOI
TL;DR: A software package for the analysis of X-ray absorption spectroscopy (XAS) data is presented, based on the IFEFFIT library of numerical and XAS algorithms and is written in the Perl programming language using the Perl/Tk graphics toolkit.
Abstract: A software package for the analysis of X-ray absorption spectroscopy (XAS) data is presented. This package is based on the IFEFFIT library of numerical and XAS algorithms and is written in the Perl programming language using the Perl/Tk graphics toolkit. The programs described here are: (i) ATHENA, a program for XAS data processing, (ii) ARTEMIS, a program for EXAFS data analysis using theoretical standards from FEFF and (iii) HEPHAESTUS, a collection of beamline utilities based on tables of atomic absorption data. These programs enable high-quality data analysis that is accessible to novices while still powerful enough to meet the demands of an expert practitioner. The programs run on all major computer platforms and are freely available under the terms of a free software license.

12,505 citations


Journal ArticleDOI
TL;DR: Members of the Chamber Quantification Writing Group are: Roberto M. Lang, MD, FASE, Michelle Bierig, MPH, RDCS, FASE, Richard B. Devereux, MD, Frank A. Flachskampf, MD, and Elyse Foster, MD.
Abstract: Members of the Chamber Quantification Writing Group are: Roberto M. Lang, MD, FASE, Michelle Bierig, MPH, RDCS, FASE, Richard B. Devereux, MD, Frank A. Flachskampf, MD, Elyse Foster, MD, Patricia A. Pellikka, MD, Michael H. Picard, MD, Mary J. Roman, MD, James Seward, MD, Jack S. Shanewise, MD, FASE, Scott D. Solomon, MD, Kirk T. Spencer, MD, FASE, Martin St John Sutton, MD, FASE, and William J. Stewart, MD

10,834 citations


Journal ArticleDOI
TL;DR: In this paper, a large-scale correlation function measured from a spectroscopic sample of 46,748 luminous red galaxies from the Sloan Digital Sky Survey is presented, which demonstrates the linear growth of structure by gravitational instability between z ≈ 1000 and the present and confirms a firm prediction of the standard cosmological theory.
Abstract: We present the large-scale correlation function measured from a spectroscopic sample of 46,748 luminous red galaxies from the Sloan Digital Sky Survey. The survey region covers 0.72 h^-3 Gpc^3 over 3816 square degrees and 0.16 < z < 0.47, making it the best sample yet for the study of large-scale structure. We find a well-detected peak in the correlation function at 100 h^-1 Mpc separation that is an excellent match to the predicted shape and location of the imprint of the recombination-epoch acoustic oscillations on the low-redshift clustering of matter. This detection demonstrates the linear growth of structure by gravitational instability between z ≈ 1000 and the present and confirms a firm prediction of the standard cosmological theory. The acoustic peak provides a standard ruler by which we can measure the ratio of the distances to z = 0.35 and z = 1089 to 4% fractional accuracy and the absolute distance to z = 0.35 to 5% accuracy. From the overall shape of the correlation function, we measure the matter density Ω_m h^2 to 8% and find agreement with the value from cosmic microwave background (CMB) anisotropies. Independent of the constraints provided by the CMB acoustic scale, we find Ω_m = 0.273 ± 0.025 + 0.123(1 + w_0) + 0.137 Ω_K. Including the CMB acoustic scale, we find that the spatial curvature is Ω_K = −0.010 ± 0.009 if the dark energy is a cosmological constant. More generally, our results provide a measurement of cosmological distance, and hence an argument for dark energy, based on a geometric method with the same simple physics as the microwave background anisotropies. The standard cosmological model convincingly passes these new and robust tests of its fundamental properties. Subject headings: cosmology: observations — large-scale structure of the universe — distance scale — cosmological parameters — cosmic microwave background — galaxies: elliptical and lenticular, cD
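
The degeneracy constraint quoted in the abstract can be evaluated directly; the sketch below (function name and argument defaults are illustrative, not from the paper) shows that the central value reduces to Ω_m = 0.273 for a flat ΛCDM cosmology (w_0 = −1, Ω_K = 0), and how the estimate shifts for other dark-energy equations of state.

```python
# Central value of the Omega_m constraint from the abstract:
# Omega_m = 0.273 + 0.123 * (1 + w0) + 0.137 * Omega_K
def omega_m_central(w0=-1.0, omega_k=0.0):
    return 0.273 + 0.123 * (1.0 + w0) + 0.137 * omega_k

print(omega_m_central())                     # 0.273 for flat LambdaCDM
print(round(omega_m_central(w0=-0.8), 4))    # 0.2976: shifts by 0.123 * 0.2
```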

4,428 citations


Journal ArticleDOI
TL;DR: Patients with moderate-to-severe active ulcerative colitis treated with infliximab at weeks 0, 2, and 6 and every eight weeks thereafter were more likely to have a clinical response at weeks 8, 30, and 54 than were those receiving placebo.
Abstract: Background Infliximab, a chimeric monoclonal antibody directed against tumor necrosis factor α, is an established treatment for Crohn's disease but not ulcerative colitis. Methods Two randomized, double-blind, placebo-controlled studies — the Active Ulcerative Colitis Trials 1 and 2 (ACT 1 and ACT 2, respectively) — evaluated the efficacy of infliximab for induction and maintenance therapy in adults with ulcerative colitis. In each study, 364 patients with moderate-to-severe active ulcerative colitis despite treatment with concurrent medications received placebo or infliximab (5 mg or 10 mg per kilogram of body weight) intravenously at weeks 0, 2, and 6 and then every eight weeks through week 46 (in ACT 1) or week 22 (in ACT 2). Patients were followed for 54 weeks in ACT 1 and 30 weeks in ACT 2. Results In ACT 1, 69 percent of patients who received 5 mg of infliximab and 61 percent of those who received 10 mg had a clinical response at week 8, as compared with 37 percent of those who received placebo (P<0...

3,345 citations


Journal ArticleDOI
TL;DR: Experimental results suggest that the proposed Laplacianface approach provides a better representation and achieves lower error rates in face recognition.
Abstract: We propose an appearance-based face recognition method called the Laplacianface approach. By using locality preserving projections (LPP), the face images are mapped into a face subspace for analysis. Different from principal component analysis (PCA) and linear discriminant analysis (LDA) which effectively see only the Euclidean structure of face space, LPP finds an embedding that preserves local information, and obtains a face subspace that best detects the essential face manifold structure. The Laplacianfaces are the optimal linear approximations to the eigenfunctions of the Laplace Beltrami operator on the face manifold. In this way, the unwanted variations resulting from changes in lighting, facial expression, and pose may be eliminated or reduced. Theoretical analysis shows that PCA, LDA, and LPP can be obtained from different graph models. We compare the proposed Laplacianface approach with Eigenface and Fisherface methods on three different face data sets. Experimental results suggest that the proposed Laplacianface approach provides a better representation and achieves lower error rates in face recognition.
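
The locality preserving projections (LPP) step at the core of the Laplacianface approach can be sketched as below. This is a minimal illustration, assuming a heat-kernel weighted nearest-neighbour graph; the function name `lpp` and all parameter defaults are illustrative rather than taken from the paper.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def lpp(X, n_neighbors=5, t=1.0, n_components=2):
    """X: (n_samples, n_features). Returns a (n_features, n_components) projection."""
    n = X.shape[0]
    D2 = cdist(X, X, "sqeuclidean")
    S = np.exp(-D2 / t)                       # heat-kernel similarities
    idx = np.argsort(D2, axis=1)[:, 1:n_neighbors + 1]
    A = np.zeros((n, n))
    for i in range(n):
        A[i, idx[i]] = S[i, idx[i]]           # weights on k nearest neighbours
    A = np.maximum(A, A.T)                    # symmetrize the adjacency graph
    Dm = np.diag(A.sum(axis=1))               # degree matrix
    L = Dm - A                                # graph Laplacian
    # Generalized eigenproblem  X^T L X w = lambda X^T Dm X w;
    # the eigenvectors with the smallest eigenvalues preserve locality.
    M1 = X.T @ L @ X
    M2 = X.T @ Dm @ X + 1e-9 * np.eye(X.shape[1])  # small ridge for stability
    vals, vecs = eigh(M1, M2)                 # ascending eigenvalues
    return vecs[:, :n_components]

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 10))                 # stand-in for face image vectors
W = lpp(X)
Y = X @ W                                     # embedded coordinates, shape (60, 2)
```

In practice face images are first reduced with PCA before solving the eigenproblem, since the pixel dimension usually exceeds the number of samples.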

3,314 citations


Journal ArticleDOI
TL;DR: This study details the 2009 recommendations of the NCCD on the use of cell death-related terminology including 'entosis', 'mitotic catastrophe', 'necrosis', 'necroptosis' and 'pyroptosis'.
Abstract: Different types of cell death are often defined by morphological criteria, without a clear reference to precise biochemical mechanisms. The Nomenclature Committee on Cell Death (NCCD) proposes unified criteria for the definition of cell death and of its different morphologies, while formulating several caveats against the misuse of words and concepts that slow down progress in the area of cell death research. Authors, reviewers and editors of scientific periodicals are invited to abandon expressions like 'percentage apoptosis' and to replace them with more accurate descriptions of the biochemical and cellular parameters that are actually measured. Moreover, at the present stage, it should be accepted that caspase-independent mechanisms can cooperate with (or substitute for) caspases in the execution of lethal signaling pathways and that 'autophagic cell death' is a type of cell death occurring together with (but not necessarily by) autophagic vacuolization. This study details the 2009 recommendations of the NCCD on the use of cell death-related terminology including 'entosis', 'mitotic catastrophe', 'necrosis', 'necroptosis' and 'pyroptosis'.

3,005 citations


Journal ArticleDOI
TL;DR: In this article, the authors argue that the textbook search and matching model cannot generate the observed business-cycle-frequency fluctuations in unemployment and job vacancies in response to shocks of a plausible magnitude.
Abstract: This paper argues that the textbook search and matching model cannot generate the observed business-cycle-frequency fluctuations in unemployment and job vacancies in response to shocks of a plausible magnitude. In the United States, the standard deviation of the vacancy-unemployment ratio is almost 20 times as large as the standard deviation of average labor productivity, while the search model predicts that the two variables should have nearly the same volatility. A shock that changes average labor productivity primarily alters the present value of wages, generating only a small movement along a downward-sloping Beveridge curve (unemploymentvacancy locus). A shock to the separation rate generates a counterfactually positive correlation between unemployment and vacancies. In both cases, the model exhibits virtually no propagation. (JEL E24, E32, J41, J63, J64)

2,672 citations


Journal ArticleDOI
TL;DR: This manuscript summarizes the proceedings of the ISUP consensus meeting for grading of prostatic carcinoma held in September 2019, in Nice, France, where topics brought to consensus included approaches to reporting of Gleason patterns 4 and 5 quantities, and minor/tertiary patterns.
Abstract: Five years after the last prostatic carcinoma grading consensus conference of the International Society of Urological Pathology (ISUP), accrual of new data and modification of clinical practice require an update of current pathologic grading guidelines. This manuscript summarizes the proceedings of the ISUP consensus meeting for grading of prostatic carcinoma held in September 2019, in Nice, France. Topics brought to consensus included the following: (1) approaches to reporting of Gleason patterns 4 and 5 quantities, and minor/tertiary patterns, (2) an agreement to report the presence of invasive cribriform carcinoma, (3) an agreement to incorporate intraductal carcinoma into grading, and (4) individual versus aggregate grading of systematic and multiparametric magnetic resonance imaging-targeted biopsies. Finally, developments in the field of artificial intelligence in the grading of prostatic carcinoma and future research perspectives were discussed.

2,636 citations


Journal ArticleDOI
TL;DR: In this article, the authors hypothesize that private company financial reporting nevertheless is of lower quality due to different market demand, regulation notwithstanding, and a large UK sample supports this hypothesis, using Basu's (1997) measure of timely loss recognition and a new accruals-based method.

2,183 citations


Journal ArticleDOI
TL;DR: The subsystem approach is described, the first release of the growing library of populated subsystems is offered, and the SEED is the first annotation environment that supports this model of annotation.
Abstract: The release of the 1000th complete microbial genome will occur in the next two to three years. In anticipation of this milestone, the Fellowship for Interpretation of Genomes (FIG) launched the Project to Annotate 1000 Genomes. The project is built around the principle that the key to improved accuracy in high-throughput annotation technology is to have experts annotate single subsystems over the complete collection of genomes, rather than having an annotation expert attempt to annotate all of the genes in a single genome. Using the subsystems approach, all of the genes implementing the subsystem are analyzed by an expert in that subsystem. An annotation environment was created where populated subsystems are curated and projected to new genomes. A portable notion of a populated subsystem was defined, and tools developed for exchanging and curating these objects. Tools were also developed to resolve conflicts between populated subsystems. The SEED is the first annotation environment that supports this model of annotation. Here, we describe the subsystem approach, and offer the first release of our growing library of populated subsystems. The initial release of data includes 180 177 distinct proteins with 2133 distinct functional roles. This data comes from 173 subsystems and 383 different organisms.

1,896 citations


Proceedings Article
05 Dec 2005
TL;DR: This paper proposes a "filter" method for feature selection which is independent of any learning algorithm, based on the observation that, in many real world classification problems, data from the same class are often close to each other.
Abstract: In supervised learning scenarios, feature selection has been studied widely in the literature. Selecting features in unsupervised learning scenarios is a much harder problem, due to the absence of class labels that would guide the search for relevant information. And, almost all of previous unsupervised feature selection methods are "wrapper" techniques that require a learning algorithm to evaluate the candidate feature subsets. In this paper, we propose a "filter" method for feature selection which is independent of any learning algorithm. Our method can be performed in either supervised or unsupervised fashion. The proposed method is based on the observation that, in many real world classification problems, data from the same class are often close to each other. The importance of a feature is evaluated by its power of locality preserving, or, Laplacian Score. We compare our method with data variance (unsupervised) and Fisher score (supervised) on two data sets. Experimental results demonstrate the effectiveness and efficiency of our algorithm.
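
The Laplacian Score filter described above can be sketched in a few lines: features whose values vary smoothly over a nearest-neighbour graph of the data score lower, and lower scores indicate stronger locality preserving power. The function name and defaults below are illustrative, not from the paper.

```python
import numpy as np
from scipy.spatial.distance import cdist

def laplacian_score(X, n_neighbors=5, t=1.0):
    """X: (n_samples, n_features). Returns one score per feature (lower = better)."""
    n = X.shape[0]
    D2 = cdist(X, X, "sqeuclidean")
    S = np.exp(-D2 / t)                       # heat-kernel similarities
    idx = np.argsort(D2, axis=1)[:, 1:n_neighbors + 1]
    A = np.zeros((n, n))
    for i in range(n):
        A[i, idx[i]] = S[i, idx[i]]
    A = np.maximum(A, A.T)                    # symmetric neighbourhood graph
    d = A.sum(axis=1)                         # vertex degrees
    L = np.diag(d) - A                        # graph Laplacian
    scores = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        f = X[:, j]
        f = f - (f @ d) / d.sum()             # remove the degree-weighted mean
        scores[j] = (f @ L @ f) / (f @ (d * f) + 1e-12)
    return scores

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 6))
print(laplacian_score(X).shape)               # (6,)
```

Because no learning algorithm is invoked, the same routine applies unchanged in supervised and unsupervised settings, which is the defining property of a "filter" method.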

Journal ArticleDOI
TL;DR: This review addresses current concepts regarding the diagnosis, cause, and treatment of the polycystic ovary syndrome.
Abstract: The polycystic ovary syndrome is one of the most common hormonal disorders affecting women. It has multiple components — reproductive, metabolic, and cardiovascular — with health implications for the patient's entire life span. This review addresses current concepts regarding the diagnosis, cause, and treatment of the condition.

Book
01 Jan 2005
TL;DR: McNeill as mentioned in this paper argued that gestures are key ingredients in an Imagery-Language dialectic that fuels speech and thought; gestures are the "imagery" and also the components of language, rather than mere consequences.
Abstract: Gesturing is such an integral yet unconscious part of communication that we are mostly oblivious to it. But if you observe anyone in conversation, you are likely to see his or her fingers, hands, and arms in some form of spontaneous motion. Why? David McNeill, a pioneer in the ongoing study of the relationship between gesture and language, set about answering this question in "Gesture and Thought" with an unlikely accomplice - Tweety Bird. McNeill argues that gestures are active participants in both speaking and thinking. He posits that gestures are key ingredients in an "imagery-language dialectic" that fuels speech and thought; gestures are the "imagery" and also the components of "language," rather than mere consequences. The smallest unit of this dialectic is the "growth point," a snapshot of an utterance at its beginning psychological stage. Enter Tweety Bird. In "Gesture and Thought", the central growth point comes from a cartoon. In his quest to eat Tweety Bird, Sylvester the cat first scales the outside of a rain gutter to reach his prey. Unsuccessful, he makes a second attempt by climbing up the inside of the gutter. Tweety, however, drops a bowling ball down the gutter; Sylvester swallows the ball. Over the course of twenty-five years, McNeill showed this cartoon to numerous subjects who spoke a variety of languages. A fascinating pattern emerged. Those who remembered the exact sequence of the cartoon while retelling it all used the same gesture to describe Sylvester's position inside the gutter. Those who forgot, in the retelling, that Sylvester had first climbed the outside of the gutter did not use this gesture at all. Thus that gesture becomes part of the "growth point" - the building block of language and thought. 
An ambitious project in the ongoing study of the relationship of how we communicate and its connection to thought, "Gesture and Thought" is a work of such consequence that it will influence all subsequent linguistic and evolutionary theory on the subject.

ReportDOI
TL;DR: In this paper, the authors formalize the concepts of self-productivity and complementarity of human capital investments and use them to explain the evidence on skill formation, and provide a theoretical framework for interpreting the evidence from a vast empirical literature, for guiding the next generation of empirical studies, and for formulating policy.
Abstract: This paper presents economic models of child development that capture the essence of recent findings from the empirical literature on skill formation. The goal of this essay is to provide a theoretical framework for interpreting the evidence from a vast empirical literature, for guiding the next generation of empirical studies, and for formulating policy. Central to our analysis is the concept that childhood has more than one stage. We formalize the concepts of self-productivity and complementarity of human capital investments and use them to explain the evidence on skill formation. Together, they explain why skill begets skill through a multiplier process. Skill formation is a life cycle process. It starts in the womb and goes on throughout life. Families play a role in this process that is far more important than the role of schools. There are multiple skills and multiple abilities that are important for adult success. Abilities are both inherited and created, and the traditional debate about nature versus nurture is scientifically obsolete. Human capital investment exhibits both self-productivity and complementarity. Skill attainment at one stage of the life cycle raises skill attainment at later stages of the life cycle (self-productivity). Early investment facilitates the productivity of later investment (complementarity). Early investments are not productive if they are not followed up by later investments (another aspect of complementarity). This complementarity explains why there is no equity-efficiency trade-off for early investment. The returns to investing early in the life cycle are high. Remediation of inadequate early investments is difficult and very costly as a consequence of both self-productivity and complementarity.

Posted Content
TL;DR: In this article, the authors investigate the nature of selection and productivity growth using data from industries where they observe producer-level quantities and prices separately, and show that there are important differences between revenue and physical productivity.
Abstract: There is considerable evidence that producer-level churning contributes substantially to aggregate (industry) productivity growth, as more productive businesses displace less productive ones. However, this research has been limited by the fact that producer-level prices are typically unobserved; thus within-industry price differences are embodied in productivity measures. If prices reflect idiosyncratic demand or market power shifts, high "productivity" businesses may not be particularly efficient, and the literature's findings might be better interpreted as evidence of entering businesses displacing less profitable, but not necessarily less productive, exiting businesses. In this paper, we investigate the nature of selection and productivity growth using data from industries where we observe producer-level quantities and prices separately. We show there are important differences between revenue and physical productivity. A key dissimilarity is that physical productivity is inversely correlated with plant-level prices while revenue productivity is positively correlated with prices. This implies that previous work linking (revenue-based) productivity to survival has confounded the separate and opposing effects of technical efficiency and demand on survival, understating the true impacts of both. We further show that young producers charge lower prices than incumbents, and as such the literature understates the productivity advantage of new producers and the contribution of entry to aggregate productivity growth.

Journal ArticleDOI
TL;DR: In this paper, the authors examined the extent, nature, and economic costs of political rent provision in government banks and found that political firms borrow 45 percent more and have 50 percent higher default rates than private banks.
Abstract: Corruption by the politically connected is often blamed for economic ills, particularly in less developed economies. Using a loan-level data set of more than 90,000 firms that represents the universe of corporate lending in Pakistan between 1996 and 2002, we investigate rents to politically connected firms in banking. Classifying a firm as “political” if its director participates in an election, we examine the extent, nature, and economic costs of political rent provision. We find that political firms borrow 45 percent more and have 50 percent higher default rates. Such preferential treatment occurs exclusively in government banks - private banks provide no

Proceedings ArticleDOI
17 Oct 2005
TL;DR: This paper proposes a novel subspace learning algorithm called neighborhood preserving embedding (NPE), which aims at preserving the local neighborhood structure on the data manifold and is less sensitive to outliers than principal component analysis (PCA).
Abstract: Recently there has been a lot of interest in geometrically motivated approaches to data analysis in high dimensional spaces. We consider the case where data is drawn from sampling a probability distribution that has support on or near a submanifold of Euclidean space. In this paper, we propose a novel subspace learning algorithm called neighborhood preserving embedding (NPE). Different from principal component analysis (PCA) which aims at preserving the global Euclidean structure, NPE aims at preserving the local neighborhood structure on the data manifold. Therefore, NPE is less sensitive to outliers than PCA. Also, comparing to the recently proposed manifold learning algorithms such as Isomap and locally linear embedding, NPE is defined everywhere, rather than only on the training data points. Furthermore, NPE may be conducted in the original space or in the reproducing kernel Hilbert space into which data points are mapped. This gives rise to kernel NPE. Several experiments on face database demonstrate the effectiveness of our algorithm
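
The two stages of NPE described above — LLE-style reconstruction weights followed by a linear projection that preserves them — can be sketched as follows. This is a minimal illustration; the function name `npe`, the regularization choice, and all defaults are illustrative rather than taken from the paper.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def npe(X, n_neighbors=5, n_components=2, reg=1e-3):
    """X: (n_samples, n_features). Returns a (n_features, n_components) projection."""
    n, d = X.shape
    idx = np.argsort(cdist(X, X, "sqeuclidean"), axis=1)[:, 1:n_neighbors + 1]
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[idx[i]] - X[i]                  # neighbours centred on x_i
        G = Z @ Z.T + reg * np.trace(Z @ Z.T) * np.eye(n_neighbors)
        w = np.linalg.solve(G, np.ones(n_neighbors))
        W[i, idx[i]] = w / w.sum()            # reconstruction weights sum to 1
    IW = np.eye(n) - W
    M = IW.T @ IW
    # Generalized eigenproblem  X^T M X a = lambda X^T X a;
    # the smallest eigenvectors give the neighborhood-preserving directions.
    vals, vecs = eigh(X.T @ M @ X, X.T @ X + 1e-9 * np.eye(d))
    return vecs[:, :n_components]

rng = np.random.default_rng(1)
X = rng.normal(size=(70, 8))
A = npe(X)
Y = X @ A                                     # embedded coordinates, shape (70, 2)
```

Unlike Isomap or LLE, the learned map is the explicit linear transform `A`, so new points are embedded by a single matrix product rather than being restricted to the training set.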

Journal ArticleDOI
TL;DR: The authors believe that the time-honored requirement to follow every small indeterminate nodule with serial CT should be revised and new guidelines are proposed for follow-up and management of small pulmonary nodules detected on CT scans.
Abstract: Lung nodules are detected very commonly on computed tomographic (CT) scans of the chest, and the ability to detect very small nodules improves with each new generation of CT scanner. In reported studies, up to 51% of smokers aged 50 years or older have pulmonary nodules on CT scans. However, the existing guidelines for follow-up and management of noncalcified nodules detected on nonscreening CT scans were developed before widespread use of multi–detector row CT and still indicate that every indeterminate nodule should be followed with serial CT for a minimum of 2 years. This policy, which requires large numbers of studies to be performed at considerable expense and with substantial radiation exposure for the affected population, has not proved to be beneficial or cost-effective. During the past 5 years, new information regarding prevalence, biologic characteristics, and growth rates of small lung cancers has become available; thus, the authors believe that the time-honored requirement to follow every smal...

Journal ArticleDOI
19 Jan 2005-JAMA
TL;DR: In this study of CPR during out-of-hospital cardiac arrest, chest compressions were not delivered half of the time, and most compressions were too shallow.
Abstract: ContextCardiopulmonary resuscitation (CPR) guidelines recommend target values for compressions, ventilations, and CPR-free intervals allowed for rhythm analysis and defibrillation. There is little information on adherence to these guidelines during advanced cardiac life support in the field.ObjectiveTo measure the quality of out-of-hospital CPR performed by ambulance personnel, as measured by adherence to CPR guidelines.Design and SettingCase series of 176 adult patients with out-of-hospital cardiac arrest treated by paramedics and nurse anesthetists in Stockholm, Sweden, London, England, and Akershus, Norway, between March 2002 and October 2003. The defibrillators recorded chest compressions via a sternal pad fitted with an accelerometer and ventilations by changes in thoracic impedance between the defibrillator pads, in addition to standard event and electrocardiographic recordings.Main Outcome MeasureAdherence to international guidelines for CPR.ResultsChest compressions were not given 48% (95% CI, 45%-51%) of the time without spontaneous circulation; this percentage was 38% (95% CI, 36%-41%) when subtracting the time necessary for electrocardiographic analysis and defibrillation. Combining these data with a mean compression rate of 121/min (95% CI, 118-124/min) when compressions were given resulted in a mean compression rate of 64/min (95% CI, 61-67/min). Mean compression depth was 34 mm (95% CI, 33-35 mm), 28% (95% CI, 24%-32%) of the compressions had a depth of 38 mm to 51 mm (guidelines recommendation), and the compression part of the duty cycle was 42% (95% CI, 41%-42%). A mean of 11 (95% CI, 11-12) ventilations were given per minute. Sixty-one patients (35%) had return of spontaneous circulation, and 5 of 6 patients discharged alive from the hospital had normal neurological outcomes.ConclusionsIn this study of CPR during out-of-hospital cardiac arrest, chest compressions were not delivered half of the time, and most compressions were too shallow. 
Electrocardiographic analysis and defibrillation accounted for only small parts of intervals without chest compressions.
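
The headline compression rates in the abstract can be cross-checked with one line of arithmetic: the overall rate is the rate while compressing multiplied by the fraction of time compressions were actually given. The small gap to the reported 64/min plausibly reflects rounding in the published point estimates.

```python
# Consistency check of the rates reported above.
rate_while_compressing = 121      # per minute, when compressions were given
fraction_without = 0.48           # fraction of time with no compressions
overall = rate_while_compressing * (1 - fraction_without)
print(round(overall, 1))          # ~63/min, close to the reported 64/min
```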

Journal ArticleDOI
TL;DR: This article found evidence consistent with small banks being better able to collect and act on soft information than large banks, and that large banks are less willing to lend to informationally "difficult" credits, such as firms with no financial records.

Journal ArticleDOI
TL;DR: It is demonstrated that mitochondria function as O2 sensors and signal hypoxic HIF-1α and HIF-2α stabilization by releasing ROS to the cytosol.

Journal ArticleDOI
TL;DR: It is concluded that sexual difficulties are relatively common among mature adults throughout the world, and tend to be more associated with physical health and aging among men than women.
Abstract: The Global Study of Sexual Attitudes and Behaviors (GSSAB) is an international survey of various aspects of sex and relationships among adults aged 40–80 y. An analysis of GSSAB data was performed to estimate the prevalence and correlates of sexual problems in 13 882 women and 13 618 men from 29 countries. The overall response rate was modest; however, the estimates of prevalence of sexual problems are comparable with published values. Several factors consistently elevated the likelihood of sexual problems. Age was an important correlate of lubrication difficulties among women and of several sexual problems, including a lack of interest in sex, the inability to reach orgasm, and erectile difficulties among men. We conclude that sexual difficulties are relatively common among mature adults throughout the world. Sexual problems tend to be more associated with physical health and aging among men than women.

Journal ArticleDOI
17 Feb 2005-Nature
TL;DR: An anatomically distinct population of ‘giant’, melanopsin-expressing ganglion cells in the primate retina that, in addition to being intrinsically photosensitive, are strongly activated by rods and cones, and display a rare, S-Off, (L + M)-On type of colour-opponent receptive field.
Abstract: Human vision starts with the activation of rod photoreceptors in dim light and short (S)-, medium (M)-, and long (L)- wavelength-sensitive cone photoreceptors in daylight. Recently a parallel, non-rod, non-cone photoreceptive pathway, arising from a population of retinal ganglion cells, was discovered in nocturnal rodents. These ganglion cells express the putative photopigment melanopsin and by signalling gross changes in light intensity serve the subconscious, 'non-image-forming' functions of circadian photoentrainment and pupil constriction. Here we show an anatomically distinct population of 'giant', melanopsin-expressing ganglion cells in the primate retina that, in addition to being intrinsically photosensitive, are strongly activated by rods and cones, and display a rare, S-Off, (L + M)-On type of colour-opponent receptive field. The intrinsic, rod and (L + M) cone-derived light responses combine in these giant cells to signal irradiance over the full dynamic range of human vision. In accordance with cone-based colour opponency, the giant cells project to the lateral geniculate nucleus, the thalamic relay to primary visual cortex. Thus, in the diurnal trichromatic primate, 'non-image-forming' and conventional 'image-forming' retinal pathways are merged, and the melanopsin-based signal might contribute to conscious visual perception.

ReportDOI
TL;DR: In this paper, the authors discuss the implications of monetary policy and prudential supervision on financial intermediaries and suggest market-friendly policies that would reduce the incentive of intermediary managers to take excessive risk.
Abstract: Developments in the financial sector have led to an expansion in its ability to spread risks. The increase in the risk bearing capacity of economies, as well as in actual risk taking, has led to a range of financial transactions that hitherto were not possible, and has created much greater access to finance for firms and households. On net, this has made the world much better off. Concurrently, however, we have also seen the emergence of a whole range of intermediaries, whose size and appetite for risk may expand over the cycle. Not only can these intermediaries accentuate real fluctuations, they can also leave themselves exposed to certain small probability risks that their own collective behavior makes more likely. As a result, under some conditions, economies may be more exposed to financial-sector-induced turmoil than in the past. The paper discusses the implications for monetary policy and prudential supervision. In particular, it suggests market-friendly policies that would reduce the incentive of intermediary managers to take excessive risk. 1 The author is the Economic Counselor and Director of Research of the International Monetary Fund. This paper reflects the author’s views and not necessarily those of the International Monetary Fund, its management, or its Board. I thank Laura Kodres for extremely useful conversations and suggestions, Sergei Antoshin for valuable research assistance, and Douglas Diamond, Jonathan Fiechter, Laura Kodres, Donald Kohn, Hyun Shin, Jeremy Stein, and Hung Tran for valuable comments on a previous draft.

Journal ArticleDOI
TL;DR: In this paper, the performance and capital inflows of private equity partnerships were investigated and the results showed that average fund returns (net of fees) approximately equal the SP however, established funds are less sensitive to cycles than new entrants.
Abstract: This paper investigates the performance and capital inflows of private equity partnerships. Average fund returns (net of fees) approximately equal the S&P 500; however, established funds are less sensitive to cycles than new entrants. Several of these results differ markedly from those for mutual funds.

Journal ArticleDOI
24 Mar 2005-Nature
TL;DR: Evidence is reported for microbial, antigen-specific activation of NKT cells against Gram-negative, lipopolysaccharide (LPS)-negative alpha-Proteobacteria such as Ehrlichia muris and Sphingomonas capsulata and shows that glycosylceramides are an alternative to LPS for innate recognition of the Gram- negative, LPS-negative bacterial cell wall.
Abstract: CD1d-restricted natural killer T (NKT) cells are innate-like lymphocytes that express a conserved T-cell receptor and contribute to host defence against various microbial pathogens. However, their target lipid antigens have remained elusive. Here we report evidence for microbial, antigen-specific activation of NKT cells against Gram-negative, lipopolysaccharide (LPS)-negative alpha-Proteobacteria such as Ehrlichia muris and Sphingomonas capsulata. We have identified glycosylceramides from the cell wall of Sphingomonas that serve as direct targets for mouse and human NKT cells, controlling both septic shock reaction and bacterial clearance in infected mice. In contrast, Gram-negative, LPS-positive Salmonella typhimurium activates NKT cells through the recognition of an endogenous lysosomal glycosphingolipid, iGb3, presented by LPS-activated dendritic cells. These findings identify two novel antigenic targets of NKT cells in antimicrobial defence, and show that glycosylceramides are an alternative to LPS for innate recognition of the Gram-negative, LPS-negative bacterial cell wall.

Journal ArticleDOI
TL;DR: Although the incidence of gout flares diminished with continued treatment, the overall incidence during weeks 9 through 52 was similar in all groups, and febuxostat, at a daily dose of 80 mg or 120 mg, was more effective than allopurinol at the commonly used fixed daily dose of 300 mg in lowering serum urate.
Abstract: background Febuxostat, a novel nonpurine selective inhibitor of xanthine oxidase, is a potential alternative to allopurinol for patients with hyperuricemia and gout. methods We randomly assigned 762 patients with gout and with serum urate concentrations of at least 8.0 mg per deciliter (480 µmol per liter) to receive either febuxostat (80 mg or 120 mg) or allopurinol (300 mg) once daily for 52 weeks; 760 received the study drug. Prophylaxis against gout flares with naproxen or colchicine was provided during weeks 1 through 8. The primary end point was a serum urate concentration of less than 6.0 mg per deciliter (360 µmol per liter) at the last three monthly measurements. The secondary end points included reduction in the incidence of gout flares and in tophus area. results The primary end point was reached in 53 percent of patients receiving 80 mg of febuxostat, 62 percent of those receiving 120 mg of febuxostat, and 21 percent of those receiving allopurinol (P<0.001 for the comparison of each febuxostat group with the allopurinol group). Although the incidence of gout flares diminished with continued treatment, the overall incidence during weeks 9 through 52 was similar in all groups: 64 percent of patients receiving 80 mg of febuxostat, 70 percent of those receiving 120 mg of febuxostat, and 64 percent of those receiving allopurinol (P=0.99 for 80 mg of febuxostat vs. allopurinol; P = 0.23 for 120 mg of febuxostat vs. allopurinol). The median reduction in tophus area was 83 percent in patients receiving 80 mg of febuxostat and 66 percent in those receiving 120 mg of febuxostat, as compared with 50 percent in those receiving allopurinol (P=0.08 for 80 mg of febuxostat vs. allopurinol; P=0.16 for 120 mg of febuxostat vs. allopurinol). More patients in the high-dose febuxostat group than in the allopurinol group (P=0.003) or the low-dose febuxostat group discontinued the study. 
Four of the 507 patients in the two febuxostat groups (0.8 percent) and none of the 253 patients in the allopurinol group died; all deaths were from causes that the investigators (while still blinded to treatment) judged to be unrelated to the study drugs (P=0.31 for the comparison between the combined febuxostat groups and the allopurinol group). conclusions Febuxostat, at a daily dose of 80 mg or 120 mg, was more effective than allopurinol at the commonly used fixed daily dose of 300 mg in lowering serum urate. Similar reductions in gout flares and tophus area occurred in all treatment groups.
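The primary-endpoint comparison reported above (53 percent of patients on 80 mg of febuxostat vs. 21 percent on allopurinol reaching the urate target, P<0.001) can be sketched with a pooled two-proportion z-test. This is an illustration only: the per-group sizes below (roughly 254 and 253, inferred from the 762 randomized patients) are approximations, not exact figures from the paper, and the trial's actual analysis may have differed.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Pooled two-proportion z-test (normal approximation).

    Returns the z statistic and the two-sided p-value.
    """
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value: 2 * (1 - Phi(|z|)) = erfc(|z| / sqrt(2)),
    # using erfc to avoid catastrophic cancellation for large z.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Approximate arm sizes: 762 randomized, 253 to allopurinol,
# the remainder split between the two febuxostat arms.
z, p = two_proportion_z(0.53, 254, 0.21, 253)
print(f"z = {z:.2f}, p = {p:.1e}")
```

With a difference of 32 percentage points on these sample sizes, the z statistic comes out above 7, consistent with the reported P<0.001.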

Book
09 Dec 2005
TL;DR: Bayesian methods have become widespread in the marketing literature, as mentioned in this paper; they are particularly useful in situations in which there is limited information about a large number of units or where the information comes from different sources.
Abstract: Bayesian methods have become widespread in marketing literature. We review the essence of the Bayesian approach and explain why it is particularly useful for marketing problems. While the appeal of the Bayesian approach has long been noted by researchers, recent developments in computational methods and expanded availability of detailed marketplace data have fueled the growth in application of Bayesian methods in marketing. We emphasize the modularity and flexibility of modern Bayesian approaches. The usefulness of Bayesian methods in situations in which there is limited information about a large number of units or where the information comes from different sources is noted. We include an extensive discussion of open issues and directions for future research.
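The abstract's point about many units with limited information each can be sketched with the simplest partial-pooling device: a normal-normal posterior mean that shrinks a unit's sample mean toward a population mean, more strongly the fewer observations the unit has. The per-store data and hyperparameters below are hypothetical, chosen only to illustrate the shrinkage behavior, not taken from the book.

```python
def posterior_mean(unit_obs, prior_mean, prior_var, obs_var):
    """Normal-normal posterior mean for one unit.

    Shrinks the unit's sample mean toward the population (prior) mean;
    the shrinkage is stronger when the unit has few observations.
    """
    n = len(unit_obs)
    sample_mean = sum(unit_obs) / n
    precision = n / obs_var + 1 / prior_var
    return (n / obs_var * sample_mean + prior_mean / prior_var) / precision

# Hypothetical per-store promotion effects: one store observed many
# times, another observed only once, both with sample mean near 2.0.
prior_mean, prior_var, obs_var = 0.0, 1.0, 4.0
rich = [2.1, 1.9, 2.0, 2.2, 1.8, 2.0, 2.1, 1.9]   # 8 observations
sparse = [2.0]                                     # 1 observation

rich_est = posterior_mean(rich, prior_mean, prior_var, obs_var)
sparse_est = posterior_mean(sparse, prior_mean, prior_var, obs_var)
```

Both estimates land between the prior mean (0.0) and the sample mean (2.0), but the single-observation store is pulled much closer to the prior, which is exactly why such models remain usable when each unit contributes only a handful of data points.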

Journal ArticleDOI
TL;DR: Chronic sleep loss, behavioral or sleep disorder related, may represent a novel risk factor for weight gain, insulin resistance, and Type 2 diabetes.
Abstract: Chronic sleep loss as a consequence of voluntary bedtime restriction is an endemic condition in modern society. Although sleep exerts marked modulatory effects on glucose metabolism, and molecular mechanisms for the interaction between sleeping and feeding have been documented, the potential impact of recurrent sleep curtailment on the risk for diabetes and obesity has only recently been investigated. In laboratory studies of healthy young adults submitted to recurrent partial sleep restriction, marked alterations in glucose metabolism including decreased glucose tolerance and insulin sensitivity have been demonstrated. The neuroendocrine regulation of appetite was also affected as the levels of the anorexigenic hormone leptin were decreased, whereas the levels of the orexigenic factor ghrelin were increased. Importantly, these neuroendocrine abnormalities were correlated with increased hunger and appetite, which may lead to overeating and weight gain. Consistent with these laboratory findings, a growing body of epidemiological evidence supports an association between short sleep duration and the risk for obesity and diabetes. Chronic sleep loss may also be the consequence of pathological conditions such as sleep-disordered breathing. In this increasingly prevalent syndrome, a feedforward cascade of negative events generated by sleep loss, sleep fragmentation, and hypoxia are likely to exacerbate the severity of metabolic disturbances. In conclusion, chronic sleep loss, behavioral or sleep disorder related, may represent a novel risk factor for weight gain, insulin resistance, and Type 2 diabetes.

Journal ArticleDOI
TL;DR: In this article, the authors combine constraints from the recent Lyα forest analysis of the Sloan Digital Sky Survey (SDSS) and the SDSS galaxy bias analysis with previous constraints from SDSS galaxy clustering, the latest supernovae, and first-year WMAP cosmic microwave background anisotropies, finding significant improvements on all of the cosmological parameters compared to previous constraints.
Abstract: We combine the constraints from the recent Lyα forest analysis of the Sloan Digital Sky Survey (SDSS) and the SDSS galaxy bias analysis with previous constraints from SDSS galaxy clustering, the latest supernovae, and first-year WMAP cosmic microwave background anisotropies. We find significant improvements on all of the cosmological parameters compared to previous constraints, which highlights the importance of combining Lyα forest constraints with other probes. Combining WMAP and the Lyα forest we find for the primordial slope n_s = 0.98 ± 0.02. We see no evidence of running, dn/d ln k = −0.003 ± 0.010, a factor of 3 improvement over previous constraints. We also find no evidence of tensors, r < 0.36 (95% c.l.). Inflationary models predict the absence of running and many among them satisfy these constraints, particularly negative-curvature models such as those based on spontaneous symmetry breaking. A positive correlation between tensors and primordial slope disfavors chaotic-inflation-type models with steep slopes: while the V ∝ φ² model is within the 2-sigma contour, V ∝ φ⁴ is outside the 3-sigma contour. For the amplitude we find σ₈ = 0.90 ± 0.03 from the Lyα forest and WMAP alone. We find no evidence of neutrino mass: for the case of 3 massive neutrino families with an inflationary prior, Σ m_ν < 0.42 eV and the mass of the lightest neutrino is m₁ < 0.13 eV at 95% c.l. For the 3 massless + 1 massive neutrino case we find m_ν < 0.79 eV for the massive neutrino, excluding at 95% c.l. all neutrino mass solutions compatible with the LSND results.
We explore dark energy constraints in models with a fairly general time dependence of the dark energy equation of state, finding Ω_Λ = 0.72 ± 0.02 and w(z=0.3) = −0.98 (+0.10/−0.12), the latter changing to w(z=0.3) = −0.92 (+0.09/−0.10) if tensors are allowed. We find no evidence for variation of the equation of state with redshift, w(z=1) = −1.03 (+0.21/−0.28). These results rely on the current understanding of the Lyα forest and other probes, which need to be explored further both observationally and theoretically, but extensive tests reveal no evidence of inconsistency among the different data sets used here.
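The gain from combining independent probes, as the abstract emphasizes for the Lyα forest plus WMAP, can be illustrated with textbook inverse-variance weighting of independent Gaussian constraints. The two measurements of the primordial slope n_s below are hypothetical numbers for illustration, not the paper's actual likelihoods, which are non-Gaussian and multi-dimensional.

```python
import math

def combine_gaussians(measurements):
    """Inverse-variance-weighted combination of independent Gaussian
    measurements, each given as a (mean, sigma) pair.

    Returns the combined mean and combined sigma; the combined sigma
    is always smaller than any individual sigma.
    """
    weights = [1.0 / sigma ** 2 for _, sigma in measurements]
    total = sum(weights)
    mean = sum(w * m for w, (m, _) in zip(weights, measurements)) / total
    return mean, math.sqrt(1.0 / total)

# Two hypothetical, independent constraints on n_s.
combined_mean, combined_sigma = combine_gaussians([(0.97, 0.04), (0.99, 0.03)])
```

The combined error bar (about 0.024 here) beats either input (0.04 and 0.03), and the combined mean sits between the two inputs, weighted toward the tighter one; this is the simplest version of why joint analyses shrink parameter contours.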