
Showing papers by "Pompeu Fabra University" published in 2007


Journal ArticleDOI
14 Jun 2007-Nature
TL;DR: Functional data from multiple, diverse experiments performed on a targeted 1% of the human genome as part of the pilot phase of the ENCODE Project are reported, providing convincing evidence that the genome is pervasively transcribed, such that the majority of its bases can be found in primary transcripts.
Abstract: We report the generation and analysis of functional data from multiple, diverse experiments performed on a targeted 1% of the human genome as part of the pilot phase of the ENCODE Project. These data have been further integrated and augmented by a number of evolutionary and computational analyses. Together, our results advance the collective knowledge about human genome function in several major areas. First, our studies provide convincing evidence that the genome is pervasively transcribed, such that the majority of its bases can be found in primary transcripts, including non-protein-coding transcripts, and those that extensively overlap one another. Second, systematic examination of transcriptional regulation has yielded new understanding about transcription start sites, including their relationship to specific regulatory sequences and features of chromatin accessibility and histone modification. Third, a more sophisticated view of chromatin structure has emerged, including its inter-relationship with DNA replication and transcriptional regulation. Finally, integration of these new sources of information, in particular with respect to mammalian evolution based on inter- and intra-species sequence comparisons, has yielded new mechanistic and evolutionary insights concerning the functional landscape of the human genome. Together, these studies are defining a path for pursuit of a more comprehensive characterization of human genome function.

5,091 citations


Journal ArticleDOI
Andrew G. Clark, Michael B. Eisen, Douglas Smith +426 more (70 institutions)
08 Nov 2007-Nature
TL;DR: These genome sequences augment the formidable genetic tools that have made Drosophila melanogaster a pre-eminent model for animal genetics, and will further catalyse fundamental research on mechanisms of development, cell biology, genetics, disease, neurobiology, behaviour, physiology and evolution.
Abstract: Comparative analysis of multiple genomes in a phylogenetic framework dramatically improves the precision and sensitivity of evolutionary inference, producing more robust results than single-genome analyses can provide. The genomes of 12 Drosophila species, ten of which are presented here for the first time (sechellia, simulans, yakuba, erecta, ananassae, persimilis, willistoni, mojavensis, virilis and grimshawi), illustrate how rates and patterns of sequence divergence across taxa can illuminate evolutionary processes on a genomic scale. These genome sequences augment the formidable genetic tools that have made Drosophila melanogaster a pre-eminent model for animal genetics, and will further catalyse fundamental research on mechanisms of development, cell biology, genetics, disease, neurobiology, behaviour, physiology and evolution. Despite remarkable similarities among these Drosophila species, we identified many putatively non-neutral changes in protein-coding genes, non-coding RNA genes, and cis-regulatory regions. These may prove to underlie differences in the ecology and behaviour of these diverse species.

2,057 citations


Journal ArticleDOI
TL;DR: Interestingly, this work functionally links the epigenetic loss of miRNA-124a with the activation of cyclin D kinase 6, a bona fide oncogenic factor, and the phosphorylation of the retinoblastoma, a tumor suppressor gene.
Abstract: The mechanisms underlying microRNA (miRNA) disruption in human disease are poorly understood. In cancer cells, the transcriptional silencing of tumor suppressor genes by CpG island promoter hypermethylation has emerged as a common hallmark. We wondered if the same epigenetic disruption can "hit" miRNAs in transformed cells. To address this issue, we have used cancer cells genetically deficient for the DNA methyltransferase enzymes in combination with a miRNA expression profiling. We have observed that DNA hypomethylation induces a release of miRNA silencing in cancer cells. One of the main targets is miRNA-124a, which undergoes transcriptional inactivation by CpG island hypermethylation in human tumors from different cell types. Interestingly, we functionally link the epigenetic loss of miRNA-124a with the activation of cyclin D kinase 6, a bona fide oncogenic factor, and the phosphorylation of the retinoblastoma, a tumor suppressor gene.

924 citations


Book
01 Jan 2007
TL;DR: In this book, a detailed introduction to the analysis and design of multiple-input multiple-output (MIMO) wireless systems is presented, and the fundamental capacity limits of MIMO systems are examined.
Abstract: Multiple-input multiple-output (MIMO) technology constitutes a breakthrough in the design of wireless communications systems, and is already at the core of several wireless standards. Exploiting multipath scattering, MIMO techniques deliver significant performance enhancements in terms of data transmission rate and interference reduction. This book is a detailed introduction to the analysis and design of MIMO wireless systems. Beginning with an overview of MIMO technology, the authors then examine the fundamental capacity limits of MIMO systems. Transmitter design, including precoding and space-time coding, is then treated in depth, and the book closes with two chapters devoted to receiver design. Written by a team of leading experts, the book blends theoretical analysis with physical insights, and highlights a range of key design challenges. It can be used as a textbook for advanced courses on wireless communications, and will also appeal to researchers and practitioners working on MIMO wireless systems.

721 citations
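
A sense of the capacity limits the book examines can be had from the standard log-det formula for an i.i.d. Rayleigh-fading channel with equal power allocation across transmit antennas. The sketch below is a minimal Monte Carlo illustration of that textbook formula (due to Telatar), not code from the book.

```python
# Monte Carlo estimate of ergodic MIMO capacity under i.i.d. Rayleigh fading
# with equal power split across Nt transmit antennas:
#   C = E[ log2 det( I + (SNR/Nt) * H * H^H ) ]
import numpy as np

def ergodic_capacity(nt, nr, snr_db, trials=10_000, seed=0):
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10)
    caps = np.empty(trials)
    for k in range(trials):
        # channel matrix with unit-variance circularly symmetric complex Gaussian entries
        h = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
        caps[k] = np.log2(np.linalg.det(np.eye(nr) + (snr / nt) * h @ h.conj().T).real)
    return caps.mean()

print(ergodic_capacity(1, 1, 10))  # ~2.9 bit/s/Hz for a single-antenna link at 10 dB SNR
print(ergodic_capacity(4, 4, 10))  # ~11 bit/s/Hz: capacity grows roughly with min(Nt, Nr)
```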


Proceedings ArticleDOI
15 Feb 2007
TL;DR: The reacTable is presented, a musical instrument based on a tabletop interface that exemplifies several of the reasons for which live music performance and HCI in general, and musical instruments and tabletop interfaces in particular, can lead to a fertile two-way cross-pollination that can equally benefit both fields.
Abstract: In recent years we have seen a proliferation of musical tables. Believing that this is not just the result of a tabletop trend, in this paper we first discuss several of the reasons for which live music performance and HCI in general, and musical instruments and tabletop interfaces in particular, can lead to a fertile two-way cross-pollination that can equally benefit both fields. After that, we present the reacTable, a musical instrument based on a tabletop interface that exemplifies several of these potential achievements.

626 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examine how suppliers may have a comparative advantage over banks in lending to customers because they are able to stop the supply of intermediate goods and act as liquidity providers.
Abstract: This article examines how, in a context of limited enforceability of contracts, suppliers may have a comparative advantage over banks in lending to customers because they are able to stop the supply of intermediate goods. Suppliers may also act as liquidity providers, insuring against liquidity shocks that could endanger the survival of their customer relationships. The relatively high implicit interest rates of trade credit are the result of insurance and default premiums that are amplified whenever suppliers face a relatively high cost of funds. I explore these effects empirically for a panel of UK firms.

565 citations


Journal ArticleDOI
TL;DR: Some of the in silico methods for pharmacology that are used in drug discovery are illustrated; applications of these methods to specific targets and their limitations will be discussed in the second, accompanying part of this review.
Abstract: Pharmacology over the past 100 years has had a rich tradition of scientists with the ability to form qualitative or semi-quantitative relations between molecular structure and activity in cerebro. To test these hypotheses they have consistently used traditional pharmacology tools such as in vivo and in vitro models. Increasingly over the last decade however we have seen that computational (in silico) methods have been developed and applied to pharmacology hypothesis development and testing. These in silico methods include databases, quantitative structure-activity relationships, pharmacophores, homology models and other molecular modeling approaches, machine learning, data mining, network analysis tools and data analysis tools that use a computer. In silico methods are primarily used alongside the generation of in vitro data both to create the model and to test it. Such models have seen frequent use in the discovery and optimization of novel molecules with affinity to a target, the clarification of absorption, distribution, metabolism, excretion and toxicity properties as well as physicochemical characterization. The aim of this review is to illustrate some of the in silico methods for pharmacology that are used in drug discovery. Further applications of these methods to specific targets and their limitations will be discussed in the second accompanying part of this review.

533 citations
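
Among the methods the review lists, QSAR reduces, at its simplest, to regressing measured activity on molecular descriptors and validating on held-out compounds. The sketch below illustrates that idea only; the descriptors, weights and noise level are synthetic inventions, not data from the review.

```python
# Minimal QSAR-style workflow: fit a linear model from molecular descriptors
# to activity, then check generalization on held-out compounds.
# All numbers below are synthetic, for illustration only.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 3))                     # 60 hypothetical compounds, 3 descriptors
true_w = np.array([1.5, -0.8, 0.3])              # assumed structure-activity relation
y = X @ true_w + rng.normal(scale=0.2, size=60)  # "measured" activity, e.g. a pIC50-like score

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = Ridge(alpha=1.0).fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 3))  # near 1 when the relation is linear
```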


Journal ArticleDOI
TL;DR: In this paper, a structural nonequilibrium model of initial responses to incomplete-information games based on "level-k" thinking is proposed, which generalizes many insights from equilibrium auction theory.
Abstract: This paper proposes a structural nonequilibrium model of initial responses to incomplete-information games based on “level-k” thinking, which describes behavior in many experiments with complete-information games. We derive the model's implications in first- and second-price auctions with general information structures, compare them to equilibrium and Eyster and Rabin's (2005) “cursed equilibrium,” and evaluate the model's potential to explain nonequilibrium bidding in auction experiments. The level-k model generalizes many insights from equilibrium auction theory. It allows a unified explanation of the winner's curse in common-value auctions and overbidding in those independent-private-value auctions without the uniform value distributions used in most experiments.

523 citations
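
The model's central step is that a level-1 bidder best-responds to a nonstrategic level-0. The sketch below works this out numerically for one common specification, assumed here purely for illustration: two bidders, private values and "random" level-0 bids both uniform on [0, 1], first-price rules; the paper itself treats general information structures.

```python
# Level-1 best response to a uniform-random level-0 opponent in a
# first-price independent-private-values auction (2 bidders, values on [0,1]).
# Expected payoff of bidding b with value v is (v - b) * P(win) = (v - b) * b,
# which is maximized at b = v / 2.
import numpy as np

bids = np.linspace(0, 1, 1001)

def level1_best_response(v):
    expected_payoff = (v - bids) * bids  # P(uniform level-0 bid < b) = b
    return bids[np.argmax(expected_payoff)]

for v in (0.2, 0.5, 0.8):
    print(f"value {v}: level-1 bid {level1_best_response(v):.2f}")  # prints v/2 each time
```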


Proceedings ArticleDOI
15 Feb 2007
TL;DR: An introductory overview to first-time users of the reacTIVision framework -- an open-source cross-platform computer-vision framework primarily designed for the construction of table-based tangible user interfaces.
Abstract: This article provides an introductory overview to first-time users of the reacTIVision framework -- an open-source cross-platform computer-vision framework primarily designed for the construction of table-based tangible user interfaces. The central component of the framework is a standalone application for fast and robust tracking of fiducial markers in a real-time video stream. The framework also defines a transport protocol for efficient and reliable transmission of object states via a local or wide area network. In addition, the distribution includes a collection of client example projects for various programming environments that allow the rapid development of unique tangible user interfaces. This article also provides a discussion of key points relevant to the construction of the necessary table hardware and surveys some projects that have been based on this technology.

501 citations
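
reacTIVision transmits object states using the TUIO protocol, which is carried over OSC/UDP. Below is a minimal listener sketch built on the third-party python-osc package; the default port 3333 and the /tuio/2Dobj argument layout ("set", session id, marker id, x, y, angle, ...) follow the published TUIO 1.x specification but should be verified against the version in use.

```python
# Minimal TUIO listener sketch (assumes `pip install python-osc`).
# reacTIVision publishes fiducial-marker states as OSC messages on UDP port 3333.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_2dobj(address, *args):
    # TUIO 1.x /tuio/2Dobj "set" messages carry: session id, marker id, x, y, angle, ...
    if args and args[0] == "set":
        session_id, marker_id, x, y, angle = args[1:6]
        print(f"marker {marker_id}: position ({x:.2f}, {y:.2f}), angle {angle:.2f} rad")

dispatcher = Dispatcher()
dispatcher.map("/tuio/2Dobj", on_2dobj)
BlockingOSCUDPServer(("0.0.0.0", 3333), dispatcher).serve_forever()
```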


Journal ArticleDOI
TL;DR: The Neandertals, modern humans' closest extinct relatives, share with modern humans two evolutionary changes in FOXP2, a gene that has been implicated in the development of speech and language, and these changes lie on the common modern human haplotype, which previously was shown to have been subject to a selective sweep.

495 citations


Journal ArticleDOI
22 Nov 2007-Oncogene
TL;DR: It is shown that Snail activation and consequent repression of E-cadherin may depend on AKT-mediated nuclear factor-κB activation, and that NF-κB induces Snail expression, which is a potential target for antimetastatic therapeutics.
Abstract: Carcinoma progression is associated with the loss of epithelial features, and the acquisition of mesenchymal characteristics and invasive properties by tumour cells. The loss of cell-cell contacts may be the first step of the epithelium mesenchyme transition (EMT) and involves the functional inactivation of the cell-cell adhesion molecule E-cadherin. Repression of E-cadherin expression by the transcription factor Snail is a central event during the loss of epithelial phenotype. Akt kinase activation is frequent in human carcinomas, and Akt regulates various cellular mechanisms including EMT. Here, we show that Snail activation and consequent repression of E-cadherin may depend on AKT-mediated nuclear factor-kappaB (NF-kappaB) activation, and that NF-kappaB induces Snail expression. Expression of the NF-kappaB subunit p65 is sufficient for EMT induction, validating this signalling module during EMT. NF-kappaB pathway activation is associated with tumour progression and metastasis of several human tumour types; E-cadherin acts as a metastasis suppressor protein. Thus, this signalling and transcriptional network linking AKT, NF-kappaB, Snail and E-cadherin during EMT is a potential target for antimetastatic therapeutics.

Journal ArticleDOI
TL;DR: An effort should be made to discover all common and rare copy number variants (CNVs) in the human population to enable systematic exploration of both SNPs and CNVs in association studies to identify the genomic contributors to the common disorders and complex traits.
Abstract: A considerable and unanticipated plasticity of the human genome, manifested as inter-individual copy number variation, has been discovered. These structural changes constitute a major source of inter-individual genetic variation that could explain variable penetrance of inherited (Mendelian and polygenic) diseases and variation in the phenotypic expression of aneuploidies and sporadic traits, and might represent a major factor in the aetiology of complex, multifactorial traits. For these reasons, an effort should be made to discover all common and rare copy number variants (CNVs) in the human population. This will also enable systematic exploration of both SNPs and CNVs in association studies to identify the genomic contributors to the common disorders and complex traits.

Journal ArticleDOI
TL;DR: The study confirms the association of Val66Met with substance-related disorders, eating disorders, and schizophrenia and investigates whether other variants in tight linkage disequilibrium with Val66Met could configure an extended functional haplotype that would explain observed discrepancies in risk estimations across studies.

Journal ArticleDOI
TL;DR: In this paper, a survey of the inventors of 9,017 European patented inventions is presented, providing new information about the characteristics of European inventors, the sources of their knowledge, the importance of formal and informal collaborations, the motivations to invent, and the actual use and economic value of the patents.


Journal ArticleDOI
TL;DR: Although they may be uncommon on a percentage basis, given the vast predominance of RCC in adults compared with children, adult Xp11 translocation RCC may well outnumber their pediatric counterparts.
Abstract: The recently recognized Xp11 translocation renal cell carcinomas (RCCs), all of which bear gene fusions involving the TFE3 transcription factor gene, comprise at least one-third of pediatric RCC. Only rare adult cases have been reported, without detailed pathologic analysis. We identified and analyzed 28 Xp11 translocation RCC in patients over the age of 20 years. All cases were confirmed by TFE3 immunohistochemistry, a sensitive and specific marker of neoplasms with TFE3 gene fusions, which can be applied to archival material. Three cases were also confirmed genetically. Patients ranged from ages 22 to 78 years, with a strong female predominance (F:M=22:6). These cancers tended to present at advanced stage; 14 of 28 presented at stage 4, whereas lymph nodes were involved by metastatic carcinoma in 11 of 13 cases in which they were resected. Previously not described and distinctive clinical presentations included dense tumor calcifications such that the tumor mimicked renal lithiasis, and obstruction of the renal pelvis promoting extensive obscuring xanthogranulomatous pyelonephritis. Previously unreported morphologic variants included tumor giant cells, fascicles of spindle cells, and a biphasic appearance that simulated the RCC characterized by a t(6;11)(p21;q12) chromosome translocation. One case harbored a novel variant translocation, t(X;3)(p11;q23). Five of 6 patients with 1 or more years of follow-up developed hematogenous metastases, with 2 dying within 1 year of diagnosis. Xp11 translocation RCC can occur in adults, and may be aggressive cancers that require morphologic distinction from clear cell and papillary RCC. Although they may be uncommon on a percentage basis, given the vast predominance of RCC in adults compared with children, adult Xp11 translocation RCC may well outnumber their pediatric counterparts.

Journal ArticleDOI
TL;DR: There are a number of reasons why public health researchers should be concerned about the growth of non-standard employment relationships, many of which are characterised by variable work schedules, reduced job security, lower wages, hazards at the workplace and stressful psychosocial working conditions.
Abstract: New types of work arrangements can be as dangerous as traditional unemployment for workers’ health. Since at least the late 1970s, “flexible production” has commonly been considered as a positive and necessary innovation to ensure sustainable economic growth.1,2 The need to be “flexible” has been proposed for workplace technical systems, schedules and salaries, and “flexibility” has even been recognised as a positive feature of a worker’s personality. Increasing labour flexibility means reducing the constraints on the movement of workers into and out of jobs previously constrained by labour laws, union agreements, training systems or labour markets that protect workers’ income and job security.3 Within this context, one of the best-known outcomes of labour market flexibility has been the growth of “atypical” forms of employment and the decline of the “standard” full-time, permanent jobs. Thus, the standard full-time permanent job with benefits is now often replaced with different forms of non-standard work arrangements such as contingent, part-time contract, unregulated underground work or home-based work, many of which are characterised by variable work schedules, reduced job security, lower wages, hazards at the workplace and stressful psychosocial working conditions.4 There are a number of reasons why public health researchers should be concerned about the growth of non-standard employment relationships.5 Workers in flexible jobs share many labour market characteristics (eg, lower credentials, low income, women, immigrant and non-white) with the unemployed, while themselves experiencing bouts of unemployment, a factor strongly associated with adverse health outcomes.6,7 In addition, evidence suggests that these …

Journal ArticleDOI
TL;DR: In this paper, the authors studied the duration pattern of fixed-term contracts and the determinants of their conversion into permanent ones in Spain, where the share of fixedterm employment is the highest in Europe.

Journal ArticleDOI
TL;DR: With the ever-growing demand for power and reliability, actual planning strategies to increase transmission systems would have to take into account this relative increase in vulnerability with size, in order to facilitate and improve the power grid design and functioning.
Abstract: We present an analysis of the topological structure and static tolerance to errors and attacks of the September 2003 actualization of the Union for the Coordination of Transport of Electricity (UCTE) power grid, involving thirty-three different networks. Though every power grid studied has exponential degree distribution and most of them lack typical small-world topology, they display patterns of reaction to node loss similar to those observed in scale-free networks. We have found that the node removal behavior can be logarithmically related to the power grid size. This logarithmic behavior would suggest that, though size favors fragility, growth can reduce it. We conclude that, with the ever-growing demand for power and reliability, actual planning strategies to increase transmission systems would have to take into account this relative increase in vulnerability with size, in order to facilitate and improve the power grid design and functioning.
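
The error/attack analysis described can be reproduced in miniature: remove nodes at random ("errors") or highest-degree first ("attacks") and track the size of the giant connected component. The sketch below uses a random graph as a stand-in and does not reproduce the UCTE data.

```python
# Toy error-vs-attack tolerance experiment on a stand-in network.
import random
import networkx as nx

def attack_curve(G, targeted, steps=20):
    G = G.copy()
    n0 = G.number_of_nodes()
    fractions = []
    for _ in range(steps):
        if targeted:
            node = max(G.degree, key=lambda pair: pair[1])[0]  # highest-degree node first
        else:
            node = random.choice(list(G.nodes))                # random failure
        G.remove_node(node)
        giant = max(len(c) for c in nx.connected_components(G))
        fractions.append(giant / n0)                           # giant component, as share of original size
    return fractions

random.seed(0)
G = nx.gnm_random_graph(200, 300, seed=0)
print("random failures:", [round(f, 2) for f in attack_curve(G, targeted=False)])
print("degree attacks: ", [round(f, 2) for f in attack_curve(G, targeted=True)])
# Targeted removal degrades the giant component markedly faster than random failure.
```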

Journal ArticleDOI
TL;DR: In this paper, the properties of G-7 cycles are examined using a multicountry Bayesian panel VAR model with time variations, unit-specific dynamics and cross-country interdependences.
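
Schematically, the class of model described lets each country's equation carry its own time-varying coefficients while lags of every country's variables enter everywhere. The notation below is generic, a sketch of the model class rather than the paper's own formulation.

```latex
% Schematic multicountry panel VAR (generic notation, not the paper's own):
% y_{it} stacks country i's variables; Y_{t-1} stacks lags of ALL countries,
% so cross-country interdependences enter through A_{it}(L).
y_{it} = A_{it}(L)\, Y_{t-1} + e_{it}, \qquad i = 1,\dots,N
% The unit-specific, time-varying coefficients, collected in \delta_t, are
% typically given a parsimonious law of motion such as a random walk:
\delta_t = \delta_{t-1} + \eta_t
```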

Journal ArticleDOI
TL;DR: In this paper, the authors study optimal Taylor-type interest rate rules in an economy with credit market imperfections and find that a strong anti-inflationary stance always attains the highest level of welfare.
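
For reference, a Taylor-type rule sets the policy rate as a function of inflation and the output gap. The form below is the generic textbook one, given in illustrative notation rather than as the paper's exact specification.

```latex
% Generic Taylor-type rule: the nominal rate responds to inflation and the output gap.
i_t = \rho + \phi_{\pi}\,\pi_t + \phi_{y}\,\hat{y}_t
% A "strong anti-inflationary stance" corresponds to a large \phi_{\pi}.
```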

Journal ArticleDOI
TL;DR: In this article, a brief overview of the recent literature on macroeconomic volatility in developing countries, highlighting its causes, consequences, and possible remedies is provided, as well as contributions of a recent conference on the subject, sponsored by the World Bank and Pompeu Fabra University, Barcelona.
Abstract: Macroeconomic volatility, both a source and a reflection of underdevelopment, is a fundamental concern for developing countries. Their high aggregate instability results from a combination of large external shocks, volatile macroeconomic policies, microeconomic rigidities, and weak institutions. Volatility entails a direct welfare cost for risk-averse individuals, as well as an indirect one through its adverse effect on income growth and development. This article provides a brief overview of the recent literature on macroeconomic volatility in developing countries, highlighting its causes, consequences, and possible remedies. It then introduces the contributions of a recent conference on the subject, sponsored by the World Bank and Pompeu Fabra University, Barcelona.

Journal ArticleDOI
TL;DR: The conclusion is that the in silico pharmacology paradigm is ongoing and presents a rich array of opportunities that will assist in expediting the discovery of new targets, and ultimately lead to compounds with predicted biological activity for these novel targets.
Abstract: Computational (in silico) methods have been developed and widely applied to pharmacology hypothesis development and testing. These in silico methods include databases, quantitative structure-activity relationships, similarity searching, pharmacophores, homology models and other molecular modeling, machine learning, data mining, network analysis tools and data analysis tools that use a computer. Such methods have seen frequent use in the discovery and optimization of novel molecules with affinity to a target, the clarification of absorption, distribution, metabolism, excretion and toxicity properties as well as physicochemical characterization. The first part of this review discussed the methods that have been used for virtual ligand and target-based screening and profiling to predict biological activity. The aim of this second part of the review is to illustrate some of the varied applications of in silico methods for pharmacology in terms of the targets addressed. We will also discuss some of the advantages and disadvantages of in silico methods with respect to in vitro and in vivo methods for pharmacology research. Our conclusion is that the in silico pharmacology paradigm is ongoing and presents a rich array of opportunities that will assist in expediting the discovery of new targets, and ultimately lead to compounds with predicted biological activity for these novel targets.

Journal ArticleDOI
TL;DR: In this paper, the authors combine the time needed to comply with government entry procedures in 45 countries with industry-level data on employment growth and growth in the number of establishments during the 1980s, finding that countries where it takes less time to register new businesses have seen more entry in industries that experienced expansionary global demand and technology shifts.
Abstract: Does cutting red tape foster entrepreneurship in industries with the potential to expand? We address this question by combining the time needed to comply with government entry procedures in 45 countries with industry-level data on employment growth and growth in the number of establishments during the 1980s. Our main empirical finding is that countries where it takes less time to register new businesses have seen more entry in industries that experienced expansionary global demand and technology shifts. Our estimates take into account that proxying global industry shifts using data from only one country (or group of countries with similar entry regulations) will in general yield biased results.
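
The identification strategy the abstract sketches is a difference-in-differences interaction in the spirit of Rajan and Zingales. Schematically, and in illustrative notation rather than the paper's own:

```latex
% Schematic specification: entry/employment growth in industry i, country c,
% with industry and country fixed effects absorbing level differences.
\mathrm{Growth}_{ic} = \alpha_i + \alpha_c
  + \beta\,\big(\mathrm{EntryTime}_c \times \mathrm{GlobalShift}_i\big)
  + \varepsilon_{ic}
% A negative \beta means that where registration takes longer, entry responds
% less to expansionary global demand and technology shifts.
```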

Journal ArticleDOI
TL;DR: Six of 7 European asthmatic adults using ICSs in the last year did not achieve good disease control and greater attention should be paid to asthma management and to the implementation of the GINA guidelines.
Abstract: BACKGROUND: Epidemiologic evidence related to asthma control in patients from the general population is scanty. OBJECTIVES: We sought to assess asthma control in several European centers according to the Global Initiative for Asthma (GINA) guidelines and to investigate its determinants. METHODS: In the European Community Respiratory Health Survey II (1999-2002), 1241 adults with asthma were identified and classified into inhaled corticosteroid (ICS) users and non-ICS users in the last year. Control was assessed in both groups by using the GINA proposal (controlled, partly controlled, and uncontrolled asthma), and it was related to potential determinants. RESULTS: Only 15% (95% CI, 12% to 19%) of subjects who had used ICSs in the last year and 45% (95% CI, 41% to 50%) of non-ICS users had their asthma under control; individuals with uncontrolled asthma accounted for 49% (95% CI, 44% to 53%) and 18% (95% CI, 15% to 21%), respectively. Among ICS users, the prevalence of uncontrolled asthma showed great variability across Europe, ranging from 20% (95% CI, 7% to 41%; Iceland) to 67% (95% CI, 35% to 90%; Italy). Overweight status, chronic cough and phlegm, and sensitization to Cladosporium species were associated with poor control in ICS users. About 65% and 87% of ICS users with uncontrolled and partly controlled asthma, respectively, were on a medication regimen that was less than recommended by the GINA guidelines. CONCLUSION: Six of 7 European asthmatic adults using ICSs in the last year did not achieve good disease control. The large majority of subjects with poorly controlled asthma were using antiasthma drugs in a suboptimal way. A wide variability in asthma control emerged across Europe. CLINICAL IMPLICATIONS: Greater attention should be paid to asthma management and to the implementation of the GINA guidelines.
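
The interval estimates quoted above (for example, 15% with 95% CI 12% to 19%) are standard confidence intervals for proportions. The sketch below shows how such an interval arises from a Wilson score calculation; the subgroup size is assumed purely for illustration and is not taken from the study.

```python
# Wilson score interval for a proportion; the sample size below is hypothetical,
# chosen only to show how an estimate like "15% (95% CI, 12% to 19%)" arises.
from math import sqrt

def wilson_ci(p_hat, n, z=1.96):
    denom = 1 + z**2 / n
    centre = (p_hat + z**2 / (2 * n)) / denom
    half = z * sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

lo, hi = wilson_ci(0.15, 430)   # hypothetical n of ~430 ICS users
print(f"{lo:.1%} to {hi:.1%}")  # approximately 12% to 19%
```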

Journal ArticleDOI
TL;DR: This study aimed to implement innovative teaching methods (blended learning strategies that include the use of new information technologies) in the teaching of human anatomy and to analyse both the impact of these strategies on academic performance, and the degree of user satisfaction.
Abstract: Objectives This study aimed to implement innovative teaching methods (blended learning strategies that include the use of new information technologies) in the teaching of human anatomy and to analyse both the impact of these strategies on academic performance, and the degree of user satisfaction. Methods The study was carried out among students in Year 1 of the biology degree curriculum (human biology profile) at Pompeu Fabra University, Barcelona. Two groups of students were tested on knowledge of the anatomy of the locomotor system and results compared between groups. Blended learning strategies were employed in 1 group (BL group, n = 69); the other (TT group, n = 65) received traditional teaching aided by complementary material that could be accessed on the Internet. Both groups were evaluated using the same types of examination. Results The average marks presented statistically significant differences (BL 6.3 versus TT 5.0; P < 0.0001). The percentage pass rate for the subject in the first call was higher in the BL group (87.9% versus 71.4%; P = 0.02), reflecting a lower incidence of students who failed to sit the examination (BL 4.3% versus TT 13.8%; P = 0.05). There were no differences regarding overall satisfaction with the teaching received. Conclusions Blended learning was more effective than traditional teaching for teaching human anatomy.

Journal ArticleDOI
TL;DR: In this paper, a generalized theoretical approach to studying imitation is introduced and subjected to rigorous experimental testing; the authors find that the differing predictions of previous imitation models are explained mainly by different informational assumptions, and to a lesser extent by different behavioral rules.

Posted Content
TL;DR: In this article, the authors show that sovereign risk neither constrains credit nor lowers welfare, but it creates some additional trade in secondary markets, and they suggest a change in perspective regarding the origins of sovereign risk and its remedies.
Abstract: Conventional wisdom says that, in the absence of sufficient default penalties, sovereign risk constrains credit and lowers welfare. We show that this conventional wisdom rests on one implicit assumption: that assets cannot be retraded in secondary markets. Once this assumption is relaxed, there is always an equilibrium in which sovereign risk is stripped of its conventional effects. In such an equilibrium, foreigners hold domestic debts and resell them to domestic residents before enforcement. In the presence of (even arbitrarily small) default penalties, this equilibrium is shown to be unique. As a result, sovereign risk neither constrains credit nor lowers welfare. At most, it creates some additional trade in secondary markets. The results presented here suggest a change in perspective regarding the origins of sovereign risk and its remedies. To argue that sovereign risk constrains credit, one must show both the insufficiency of default penalties and the imperfect workings of secondary markets. To relax credit constraints created by sovereign risk, one can either increase default penalties or improve the workings of secondary markets.

Journal ArticleDOI
TL;DR: The authors use statistical tools to model how the performance of heuristic rules varies as a function of environmental characteristics and highlight the trade-off between using linear models and heuristics.
Abstract: Much research has highlighted incoherent implications of judgmental heuristics, yet other findings have demonstrated high correspondence between predictions and outcomes. At the same time, judgment has been well modeled in the form of “as if” linear models. Accepting the probabilistic nature of the environment, the authors use statistical tools to model how the performance of heuristic rules varies as a function of environmental characteristics. They further characterize the human use of linear models by exploring effects of different levels of cognitive ability. They illustrate with both theoretical analyses and simulations. Results are linked to the empirical literature by a meta-analysis of lens model studies. Using the same tasks, the authors estimate the performance of both heuristics and humans where the latter are assumed to use linear models. Their results emphasize that judgmental accuracy depends on matching characteristics of rules and environments and highlight the trade-off between using linear models and heuristics. Whereas the former can be cognitively demanding, the latter are simple to implement. However, heuristics require knowledge to indicate when they should be used.
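
A toy version of the rule-versus-environment simulations described: an equal-weight heuristic is pitted against a linear model fitted on a small learning sample, in an environment whose cue weights and noise level are assumptions of this sketch, not values from the paper.

```python
# Equal-weight heuristic vs fitted linear model in a simulated linear environment.
import numpy as np

rng = np.random.default_rng(0)
weights = np.array([0.6, 0.3, 0.1])   # assumed (unequal) cue validities

def environment(n):
    cues = rng.normal(size=(n, 3))
    criterion = cues @ weights + rng.normal(scale=0.5, size=n)
    return cues, criterion

train_x, train_y = environment(20)    # small learning sample
test_x, test_y = environment(2000)

beta = np.linalg.lstsq(train_x, train_y, rcond=None)[0]  # "as if" linear model, fitted
fitted_pred = test_x @ beta
equal_pred = test_x.sum(axis=1)       # equal-weight heuristic: no estimation at all

print("fitted model r:", round(np.corrcoef(fitted_pred, test_y)[0, 1], 3))
print("equal weights r:", round(np.corrcoef(equal_pred, test_y)[0, 1], 3))
# With small samples the simple rule can rival the fitted model; accuracy
# depends on how well the rule's structure matches the environment.
```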

Journal ArticleDOI
TL;DR: In this article, the authors present a simple, theory-based measure of the variations in aggregate economic efficiency associated with business fluctuations, which they refer to as "the gap" and decompose into two constituent parts: a price markup and a wage markup.
Abstract: In this paper we present a simple, theory-based measure of the variations in aggregate economic efficiency associated with business fluctuations. We decompose this indicator, which we refer to as “the gap”, into two constituent parts: a price markup and a wage markup, and show that the latter accounts for the bulk of the fluctuations in our gap measure. We also demonstrate the connection between our gap measure and the gap between output and its natural level, a more traditional indicator of aggregate inefficiency. Finally, we derive a measure of the welfare costs of business cycles that is directly related to our gap variable. Our welfare measure corresponds to the inefficient component of economic fluctuations, and should thus be interpreted as a lower bound to the costs of the latter. When applied to postwar U.S. data, for some plausible parametrizations, our measure indicates non-negligible welfare losses of gap fluctuations. The results, however, hinge critically on some key parameters, including the intertemporal elasticity of labor supply.
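
In logs, the decomposition described can be written as follows. The notation is standard; this is a reconstruction consistent with the abstract, not a quotation of the paper's equations.

```latex
% The gap as the wedge between the marginal rate of substitution (mrs) and
% the marginal product of labour (mpn), equal to minus the sum of the log
% price and wage markups:
gap_t \;\equiv\; mrs_t - mpn_t \;=\; -\left(\mu^{p}_{t} + \mu^{w}_{t}\right),
\qquad
\mu^{p}_{t} = (p_t + mpn_t) - w_t,
\qquad
\mu^{w}_{t} = (w_t - p_t) - mrs_t
% Adding the two markups gives mpn_t - mrs_t, confirming the identity above.
```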