
Showing papers in "Research Papers in Economics in 2013"


Posted Content
TL;DR: Three distinct weighting motives are discussed: to achieve precise estimates by correcting for heteroskedasticity; to achieve consistent estimates by correcting for endogenous sampling; and to identify average partial effects in the presence of unmodeled heterogeneity of effects.
Abstract: The purpose of this paper is to help empirical economists think through when and how to weight the data used in estimation. We start by distinguishing two purposes of estimation: to estimate population descriptive statistics and to estimate causal effects. In the former type of research, weighting is called for when it is needed to make the analysis sample representative of the target population. In the latter type, the weighting issue is more nuanced. We discuss three distinct potential motives for weighting when estimating causal effects: (1) to achieve precise estimates by correcting for heteroskedasticity, (2) to achieve consistent estimates by correcting for endogenous sampling, and (3) to identify average partial effects in the presence of unmodeled heterogeneity of effects. In each case, we find that the motive sometimes does not apply in situations where practitioners often assume it does. We recommend diagnostics for assessing the advisability of weighting, and we suggest methods for appropriate inference.
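A minimal sketch of motive (1), assuming a grouped-data setting in which each observation is a cell average with a known cell size; the data, variable names, and weights below are invented for illustration and are not from the paper.

```python
# Hypothetical example: compare unweighted OLS with robust standard errors
# against WLS that weights by cell size (the inverse of the error variance).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000
x = rng.normal(size=n)
cell_size = rng.integers(1, 50, size=n)            # observations behind each cell average
eps = rng.normal(scale=1 / np.sqrt(cell_size))     # heteroskedasticity: variance falls with cell size
y = 1.0 + 2.0 * x + eps

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit(cov_type="HC1")             # unweighted, heteroskedasticity-robust inference
wls = sm.WLS(y, X, weights=cell_size).fit()        # weighted for precision

print("OLS:", ols.params, ols.bse)
print("WLS:", wls.params, wls.bse)
```

In this stylized setting both estimators are consistent; weighting only buys precision, which is the sense in which weighting for motive (1) is a choice rather than a requirement.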

1,006 citations



Posted Content
TL;DR: This paper reports the first randomized evaluation of the impact of introducing the standard microcredit group-based lending product in a new market: half of 104 slums in Hyderabad, India were randomly selected for the opening of a branch of a particular microfinance institution (Spandana), while the remainder were not, although other MFIs were free to enter those slums.
Abstract: This paper reports on the first randomized evaluation of the impact of introducing the standard microcredit group-based lending product in a new market. In 2005, half of 104 slums in Hyderabad, India were randomly selected for opening of a branch of a particular microfinance institution (Spandana) while the remainder were not, although other MFIs were free to enter those slums. Fifteen to 18 months after Spandana began lending in treated areas, households were 8.8 percentage points more likely to have a microcredit loan. They were no more likely to start any new business, although they were more likely to start several at once, and they invested more in their existing businesses. There was no effect on average monthly expenditure per capita. Expenditure on durable goods increased in treated areas, while expenditures on “temptation goods” declined. Three to four years after the initial expansion (after many of the control slums had started getting credit from Spandana and other MFIs), the probability of borrowing from an MFI in treatment and comparison slums was the same, but on average households in treatment slums had been borrowing for longer and in larger amounts. Consumption was still no different in treatment areas, and the average business was still no more profitable, although we find an increase in profits at the top end. We found no changes in any of the development outcomes that are often believed to be affected by microfinance, including health, education, and women’s empowerment. The results of this study are largely consistent with those of four other evaluations of similar programs in different contexts.
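For the mechanics behind estimates like the 8.8 percentage point effect, a hedged sketch of an intention-to-treat regression with slum-level clustering; the file and column names are placeholders, not the study's data.

```python
# Hypothetical household-level endline file with a slum-level treatment dummy.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hyderabad_endline.csv")   # assumed columns: has_mfi_loan, treated_slum, slum_id

itt = smf.ols("has_mfi_loan ~ treated_slum", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["slum_id"]}
)
print(itt.params["treated_slum"], itt.bse["treated_slum"])
```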

879 citations



OtherDOI
TL;DR: Final version published in Handbook of Research Methods and Applications in Empirical Macroeconomics; edited by Nigar Hashimzade and Michael A. Thornton.
Abstract: Final version published in Handbook of Research Methods and Applications in Empirical Macroeconomics; edited by Nigar Hashimzade and Michael A. Thornton (Handbooks of Research Methods and Applications series) Edward Elgar, 2013 ISBN 9780857931016

652 citations


Posted Content
TL;DR: In this paper, the authors present estimates of monetary non-neutrality based on evidence from high-frequency responses of real interest rates, expected inflation, and expected output growth, and build a model in which Fed announcements affect beliefs not only about monetary policy but also about other economic fundamentals.
Abstract: We present estimates of monetary non-neutrality based on evidence from high-frequency responses of real interest rates, expected inflation, and expected output growth. Our identifying assumption is that unexpected changes in interest rates in a 30-minute window surrounding scheduled Federal Reserve announcements arise from news about monetary policy. In response to an interest rate hike, nominal and real interest rates increase roughly one-for-one, several years out into the term structure, while the response of expected inflation is small. At the same time, forecasts about output growth also increase, the opposite of what standard models imply about a monetary tightening. To explain these facts, we build a model in which Fed announcements affect beliefs not only about monetary policy but also about other economic fundamentals. Our model implies that these information effects play an important role in the overall causal effect of monetary policy shocks on output.
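A rough sketch of the kind of event-study regression this identification strategy implies, assuming a per-announcement dataset of 30-minute rate surprises and same-day changes in market-based expectations; the file and column names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

events = pd.read_csv("fomc_windows.csv")
# assumed columns: mp_surprise (30-minute futures-rate change around the announcement),
# d_real_2y, d_exp_inflation, d_exp_gdp (daily changes in market-based expectations)

for outcome in ["d_real_2y", "d_exp_inflation", "d_exp_gdp"]:
    fit = smf.ols(f"{outcome} ~ mp_surprise", data=events).fit(cov_type="HC1")
    print(outcome, round(fit.params["mp_surprise"], 3), round(fit.bse["mp_surprise"], 3))
```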

595 citations


Posted Content
TL;DR: The authors find a link between the sharp drop in U.S. manufacturing employment beginning in 2001 and a change in trade policy that eliminated potential tariff increases on Chinese imports: industries where the threat of tariff hikes declined the most experienced more severe employment losses, along with larger increases in the value of imports from China and in the number of firms engaged in China-U.S. trade.
Abstract: This paper finds a link between the sharp drop in U.S. manufacturing employment beginning in 2001 and a change in U.S. trade policy that eliminated potential tariff increases on Chinese imports. Industries where the threat of tariff hikes declines the most experience more severe employment losses along with larger increases in the value of imports from China and the number of firms engaged in China-U.S. trade. These results are robust to other potential explanations of the employment loss, and we show that the U.S. employment trends differ from those in the EU, where there was no change in policy.

530 citations


OtherDOI
Abstract: The ability to determine the scale of innovation activities, the characteristics of innovating firms, and the internal and systemic factors that can influence innovation is a prerequisite for the pursuit and analysis of policies aimed at fostering innovation. The Oslo Manual is the foremost international source of guidelines for the collection and use of data on innovation activities in industry. This third edition has been updated to take account of the progress made in understanding the innovation process and its economic impact, and the experience gained from recent rounds of innovation surveys in OECD and non-member countries. For the first time, the Manual investigates the field of non-technological innovation and the linkages between different innovation types. It also includes an annex on the implementation of innovation surveys in developing countries.

524 citations


Posted Content
TL;DR: In this article, the authors examined the historical impact of railroads on the American economy and found that the total impact on each county is captured by changes in that county's "market access," a reduced-form expression derived from general equilibrium trade theory.
Abstract: This paper examines the historical impact of railroads on the American economy. Expansion of the railroad network may have affected all counties directly or indirectly - an econometric challenge that arises in many empirical settings. However, the total impact on each county is captured by changes in that county's "market access," a reduced-form expression derived from general equilibrium trade theory. We measure counties' market access by constructing a network database of railroads and waterways and calculating lowest-cost county-to-county freight routes. As the railroad network expanded from 1870 to 1890, changes in market access were capitalized into county agricultural land values with an estimated elasticity of 1.1. County-level declines in market access associated with removing all railroads in 1890 are estimated to decrease the total value of US agricultural land by 64%. Feasible extensions to internal waterways or improvements in country roads would have mitigated 13% or 20% of the losses from removing railroads.
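A toy sketch of the two computational steps described above, lowest-cost routes and a market-access index of the form MA_i = sum_j pop_j * cost_ij^(-theta); the network, populations, and theta below are placeholders, not the paper's database or estimates.

```python
import networkx as nx

G = nx.Graph()
# edges: (county_a, county_b, freight cost along the cheapest mode)
G.add_weighted_edges_from([
    ("A", "B", 1.0), ("B", "C", 0.5), ("A", "C", 2.5), ("C", "D", 1.2),
])
population = {"A": 100, "B": 250, "C": 80, "D": 40}
theta = 3.0   # trade-cost elasticity, placeholder value

costs = dict(nx.all_pairs_dijkstra_path_length(G, weight="weight"))   # county-to-county lowest costs

market_access = {
    i: sum(population[j] * costs[i][j] ** (-theta) for j in G if j != i)
    for i in G
}
print(market_access)
```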

512 citations


Posted Content
TL;DR: The authors analyze the role of income distribution in macroeconomic analysis and demonstrate that the long-run equilibrium depends on the initial distribution of income, and that an economy characterized by a relatively equal distribution of wealth is likely to be wealthier.
Abstract: This paper analyzes the role of income distribution in macroeconomic analysis. The study demonstrates that the long-run equilibrium depends on the initial distribution of income. In accordance with empirical evidence concerning the correlation between income distribution and output, an economy that is characterized by a relatively equal distribution of wealth is likely to be wealthier in the long run. The study may, therefore, provide an additional explanation for the persistent differences in per-capita output across countries. Furthermore, the paper may shed light on cross-country differences in macroeconomic adjustment to aggregate shocks.

501 citations


Posted Content
TL;DR: This article attempts to provide a comprehensive comparison of Web of Science and Scopus to answer frequent questions that researchers ask, such as: How do the two databases differ?
Abstract: Nowadays, the world’s scientific community publishes an enormous number of papers across scientific fields. In such an environment, it is essential to know which databases are equally efficient and objective for literature searches. The two most extensive databases appear to be Web of Science and Scopus. Besides literature searching, these two databases are used to rank journals in terms of their productivity and the total citations received, as indicators of a journal's impact, prestige, or influence. This article attempts to provide a comprehensive comparison of these databases to answer frequent questions that researchers ask, such as: How are Web of Science and Scopus different? In which aspects are these two databases similar? And, if researchers are forced to choose one of them, which should they prefer? To answer these questions, the two databases are compared on their qualitative and quantitative characteristics.

Posted Content
TL;DR: In this paper, the authors developed and tested a moderated mediation model linking perceived human resource management practices to organisational citizenship behaviour and turnover intentions, and found that the effect of perceived HRM practices on both outcome variables is mediated by levels of employee engagement, while the relationship between employee engagement and both outcomes is moderated by perceived organisational support and leader-member exchange.
Abstract: This study contributes to our understanding of the mediating and moderating processes through which human resource management (HRM) practices are linked with behavioural outcomes. We developed and tested a moderated mediation model linking perceived HRM practices to organisational citizenship behaviour and turnover intentions. Drawing on social exchange theory, our model posits that the effect of perceived HRM practices on both outcome variables is mediated by levels of employee engagement, while the relationship between employee engagement and both outcome variables is moderated by perceived organisational support and leader–member exchange. Overall, data from 297 employees in a service sector organisation in the UK support this model. This suggests that the enactment of positive behavioural outcomes, as a consequence of engagement, largely depends on the wider organisational climate and employees' relationship with their line manager. Implications for practice and directions for future research are discussed.

Report SeriesDOI
TL;DR: In this paper, the authors highlight the main principles, concepts and criteria framing open government data initiatives and the issues challenging their implementation, while providing a note of caution on the challenges this agenda poses for the public sector.
Abstract: Open Government Data (OGD) initiatives, and in particular the development of OGD portals, have proliferated since the mid-2000s at both central and local government levels in OECD and non-OECD countries. Understanding the preconditions that enable the efficient and effective implementation of these initiatives is essential for achieving their overall objectives. This is especially true in terms of the role played by OGD in relation to Open Government policies in general. This paper highlights the main principles, concepts and criteria framing open government data initiatives and the issues challenging their implementation. It underlines the opportunities that OGD and data analytics may offer policy makers, while providing a note of caution on the challenges this agenda poses for the public sector. Finally, the overall analysis of key concepts and issues aims to pave the way for an empirical analysis of OGD initiatives. So far, little has been done to analyse and prove the impact and accrued value of these initiatives. The paper suggests a methodology comprising an analytical framework for OGD initiatives (to be applied to ex post and ex ante analysis of initiatives) and a related set of data to be collected across OECD countries. The application of the analytical framework and the collection of data would enable the acquisition of a solid body of evidence that could ultimately lead to mapping initiatives across OECD countries (i.e. a typology of initiatives) and developing a common set of metrics to consistently assess impact and value creation within and across countries.

Posted Content
TL;DR: In this paper, the authors discuss the different types of instruments of innovation policy, examine how governments and public agencies in different countries and different times have used these instruments differently, explore the political nature of instrument choice and design, and elaborate a set of criteria for the selection and design of the instruments in relation to the formulation of the innovation policy.
Abstract: The purpose of this article is to discuss the different types of instruments of innovation policy, to examine how governments and public agencies in different countries and at different times have used these instruments differently, to explore the political nature of instrument choice and design (and associated issues), and to elaborate a set of criteria for the selection and design of the instruments in relation to the formulation of innovation policy. The article argues that innovation policy instruments must be designed and combined into mixes in ways that address the problems of the innovation system. Such mixes are often referred to as a “policy mix”. The problem-oriented nature of the design of instrument mixes is what makes innovation policy instruments ‘systemic’.

Book ChapterDOI
TL;DR: In this paper, the authors survey the literature on stock return forecasting, highlighting the challenges faced by forecasters as well as strategies for improving return forecasts and illustrate key issues via an empirical application based on updated data.
Abstract: We survey the literature on stock return forecasting, highlighting the challenges faced by forecasters as well as strategies for improving return forecasts. We focus on U.S. equity premium forecastability and illustrate key issues via an empirical application based on updated data. Some studies argue that, despite extensive in-sample evidence of equity premium predictability, popular predictors from the literature fail to outperform the simple historical average benchmark forecast in out-of-sample tests. Recent studies, however, provide improved forecasting strategies that deliver statistically and economically significant out-of-sample gains relative to the historical average benchmark. These strategies – including economically motivated model restrictions, forecast combination, diffusion indices, and regime shifts – improve forecasting performance by addressing the substantial model uncertainty and parameter instability surrounding the data-generating process for stock returns. In addition to the U.S. equity premium, we succinctly survey out-of-sample evidence supporting U.S. cross-sectional and international stock return forecastability. The significant evidence of stock return forecastability worldwide has important implications for the development of both asset pricing models and investment management strategies.
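A minimal sketch of the out-of-sample exercise described above: expanding-window predictive regressions against the historical-average benchmark, summarized by an out-of-sample R-squared. The data file, the single predictor (a dividend-price ratio), and the burn-in length are assumptions, not the chapter's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("equity_premium_monthly.csv")     # hypothetical: columns 'ret' (excess return), 'dp'
ret, dp = df["ret"].to_numpy(), df["dp"].to_numpy()

start = 120                                        # first 10 years reserved for estimation
pred_model, pred_mean = [], []
for t in range(start, len(ret)):
    X = sm.add_constant(dp[: t - 1])
    fit = sm.OLS(ret[1:t], X).fit()                # regress r_{s+1} on dp_s using data before t
    pred_model.append(fit.params[0] + fit.params[1] * dp[t - 1])
    pred_mean.append(ret[:t].mean())               # historical-average benchmark

actual = ret[start:]
r2_oos = 1 - np.sum((actual - np.asarray(pred_model)) ** 2) / np.sum((actual - np.asarray(pred_mean)) ** 2)
print(f"Out-of-sample R^2 vs historical average: {r2_oos:.4f}")
```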

Posted Content
TL;DR: In this paper, a new measure of time-varying tail risk is proposed, which is directly estimable from the cross-section of returns and has strong predictive power for aggregate market returns: a one standard deviation increase in tail risk forecasts an increase in excess market returns of 4.5% over the following year.
Abstract: We propose a new measure of time-varying tail risk that is directly estimable from the cross section of returns. We exploit firm-level price crashes every month to identify common fluctuations in tail risk across stocks. Our tail measure is significantly correlated with tail risk measures extracted from S&P 500 index options, but is available for a longer sample since it is calculated from equity data. We show that tail risk has strong predictive power for aggregate market returns: A one standard deviation increase in tail risk forecasts an increase in excess market returns of 4.5% over the following year. Cross-sectionally, stocks with high loadings on past tail risk earn an annual three-factor alpha 5.4% higher than stocks with low tail risk loadings. These findings are consistent with asset pricing theories that relate equity risk premia to rare disasters or other forms of tail risk.
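A rough sketch of a cross-sectional, Hill-type tail estimate in the spirit of the measure described above: each month, pool firm-level returns, take the 5th percentile as the threshold, and average log exceedances beyond it. The panel is simulated; this is not the authors' exact estimator.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
panel = pd.DataFrame({
    "month": np.repeat(np.arange(24), 2000),                  # 24 months x 2000 firms
    "ret": rng.standard_t(df=3, size=24 * 2000) * 0.02,       # fat-tailed simulated returns
})

def hill_tail(returns, q=0.05):
    u = returns.quantile(q)                 # lower-tail threshold (a negative return)
    exceed = returns[returns < u]           # returns beyond the threshold
    return np.mean(np.log(exceed / u))      # average log exceedance (tail-exponent estimate)

tail_series = panel.groupby("month")["ret"].apply(hill_tail)
print(tail_series.head())
```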

Posted Content
TL;DR: Admati and Hellwig argue that banks are as fragile as they are not because they must be, but because they want to be, and they get away with it.
Abstract: What is wrong with today's banking system? The past few years have shown that risks in banking can impose significant costs on the economy. Many claim, however, that a safer banking system would require sacrificing lending and economic growth. The Bankers' New Clothes examines this claim and the narratives used by bankers, politicians, and regulators to rationalize the lack of reform, exposing them as invalid. Admati and Hellwig argue we can have a safer and healthier banking system without sacrificing any of the benefits of the system, and at essentially no cost to society. They show that banks are as fragile as they are not because they must be, but because they want to be--and they get away with it. Whereas this situation benefits bankers, it distorts the economy and exposes the public to unnecessary risks. Weak regulation and ineffective enforcement allowed the buildup of risks that ushered in the financial crisis of 2007-2009. Much can be done to create a better system and prevent crises. Yet the lessons from the crisis have not been learned. Admati and Hellwig seek to engage the broader public in the debate by cutting through the jargon of banking, clearing the fog of confusion, and presenting the issues in simple and accessible terms. The Bankers' New Clothes calls for ambitious reform and outlines specific and highly beneficial steps that can be taken immediately.

Posted Content
TL;DR: This article showed that microeconomic uncertainty is robustly countercyclical, rising sharply during recessions, particularly during the Great Recession of 2007-2009, and found that reasonably calibrated uncertainty shocks can explain drops and rebounds in GDP of around 3%.
Abstract: We propose uncertainty shocks as a new shock that drives business cycles. First, we demonstrate that microeconomic uncertainty is robustly countercyclical, rising sharply during recessions, particularly during the Great Recession of 2007-2009. Second, we quantify the impact of time-varying uncertainty on the economy in a dynamic stochastic general equilibrium model with heterogeneous firms. We find that reasonably calibrated uncertainty shocks can explain drops and rebounds in GDP of around 3%. Moreover, we show that increased uncertainty alters the relative impact of government policies, making them initially less effective and then subsequently more effective.
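A back-of-the-envelope version of the countercyclicality fact, assuming a firm panel and an aggregate series that are placeholders rather than the Census microdata used in the paper: measure micro uncertainty as the cross-sectional interquartile range of firm sales growth each year and correlate it with GDP growth.

```python
import pandas as pd

firms = pd.read_csv("firm_panel.csv")       # hypothetical: columns firm_id, year, sales_growth
macro = pd.read_csv("gdp_growth.csv")       # hypothetical: columns year, gdp_growth

q = firms.groupby("year")["sales_growth"].quantile([0.75, 0.25]).unstack()
dispersion = (q[0.75] - q[0.25]).rename("iqr_sales_growth")   # micro-uncertainty proxy

merged = macro.merge(dispersion.reset_index(), on="year")
print(merged[["gdp_growth", "iqr_sales_growth"]].corr())      # countercyclical => negative correlation
```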

Posted Content
TL;DR: It is shown that differences in countrywide institutional structures across the national border do not explain within-ethnicity differences in economic performance, as captured by satellite images of light density.
Abstract: We investigate the role of national institutions on subnational African development in a novel framework that accounts both for local geography and cultural-genetic traits. We exploit the fact that the political boundaries in the eve of African independence partitioned more than two hundred ethnic groups across adjacent countries subjecting similar cultures, residing in homogeneous geographic areas, to different formal institutions. Using both a matching-type and a spatial regression discontinuity approach we show that differences in countrywide institutional structures across the national border do not explain within-ethnicity differences in economic performance, as captured by satellite images of light density. The average non-effect of national institutions on ethnic development masks considerable heterogeneity partially driven by the diminishing role of national institutions in areas further from the capital cities.
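A loose sketch of the border-discontinuity comparison, assuming a pixel-level file for partitioned homelands with invented column names: compare log light density across the border while controlling for a polynomial in distance to the border and clustering by ethnicity.

```python
import pandas as pd
import statsmodels.formula.api as smf

px = pd.read_csv("partitioned_homelands.csv")
# assumed columns: log_light, high_inst_country (0/1), dist_to_border (km), ethnicity_id

rd = smf.ols(
    "log_light ~ high_inst_country + dist_to_border + I(dist_to_border ** 2)",
    data=px,
).fit(cov_type="cluster", cov_kwds={"groups": px["ethnicity_id"]})
print(rd.params["high_inst_country"], rd.bse["high_inst_country"])
```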

Posted Content
TL;DR: The validity of network meta-analysis is based on the underlying assumption that there is no imbalance in the distribution of effect modifiers across the different types of direct treatment comparisons, regardless of the structure of the evidence network.
Abstract: Background In the last decade, network meta-analysis of randomized controlled trials has been introduced as an extension of pairwise meta-analysis. The advantage of network meta-analysis over standard pairwise meta-analysis is that it facilitates indirect comparisons of multiple interventions that have not been studied in a head-to-head fashion. Although assumptions underlying pairwise meta-analyses are well understood, those concerning network meta-analyses are perceived to be more complex and prone to misinterpretation. Discussion In this paper, we aim to provide a basic explanation when network meta-analysis is as valid as pairwise meta-analysis. We focus on the primary role of effect modifiers, which are study and patient characteristics associated with treatment effects. Because network meta-analysis includes different trials comparing different interventions, the distribution of effect modifiers cannot only vary across studies for a particular comparison (as with standard pairwise meta-analysis, causing heterogeneity), but also between comparisons (causing inconsistency). If there is an imbalance in the distribution of effect modifiers between different types of direct comparisons, the related indirect comparisons will be biased. If it can be assumed that this is not the case, network meta-analysis is as valid as pairwise meta-analysis. Summary The validity of network meta-analysis is based on the underlying assumption that there is no imbalance in the distribution of effect modifiers across the different types of direct treatment comparisons, regardless of the structure of the evidence network.
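A toy illustration of the indirect comparison that network meta-analysis rests on (a Bucher-style calculation): an A-versus-C effect inferred from A-versus-B and C-versus-B trials. The log odds ratios and standard errors are made up; the calculation is only unbiased if effect modifiers are balanced across the two direct comparisons, which is exactly the assumption discussed above.

```python
import numpy as np

d_AB, se_AB = -0.40, 0.12    # pooled log odds ratio, A vs B (hypothetical)
d_CB, se_CB = -0.15, 0.10    # pooled log odds ratio, C vs B (hypothetical)

d_AC = d_AB - d_CB                        # indirect A-vs-C effect
se_AC = np.sqrt(se_AB**2 + se_CB**2)      # variances add for independent comparisons
ci = (d_AC - 1.96 * se_AC, d_AC + 1.96 * se_AC)
print(d_AC, se_AC, ci)
```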

Posted ContentDOI
TL;DR: In this article, the authors describe the state-of-play of short food supply chains (SFSC) in the EU understood as being the chains in which foods involved are identified by, and traceable to a farmer and for which the number of intermediaries between farmer and consumer should be minimal or ideally nil.
Abstract: The present study aims at describing the state-of-play of short food supply chains (SFSC) in the EU understood as being the chains in which foods involved are identified by, and traceable to a farmer and for which the number of intermediaries between farmer and consumer should be minimal or ideally nil. Several types of SFSCs can be identified, for example CSAs (Community-Supported Agriculture), on-farm sales, off-farm schemes (farmers markets, delivery schemes), collective sales in particular towards public institutions, being mostly local / proximity sales and in some cases distance sales. Such type of food chain has specific social impacts, economic impacts at regional and farm level as well as environmental impacts translating themselves into a clear interest of consumers. SFSCs are present throughout the EU, although there are some differences in the different MS in terms of dominating types of SFSCs. In general, they are dominantly small or micro-enterprises, composed of small-scale producers, often coupled to organic farming practices. Social values (quality products to consumers and direct contact with the producer) are the values usually highlighted by SFSCs before environmental or economic values. In terms of policy tools, there are pros and cons in developing a specific EU labelling scheme which could bring more recognition, clarity, protection and value added to SFSCs, while potential costs might be an obstacle. Anyhow, a possible labelling scheme should take into account the current different stages and situations of development of SFSCs in the EU and be flexible enough accommodate these differences. Other policy tools, in particular training and knowledge exchange in marketing and communication are considered important and should continue to be funded by Rural Development programmes, as well as possibly other EU funds in view of the positive social and not specifically rural impacts.

Posted Content
TL;DR: This paper provides an overview of the panel VAR models used in macroeconomics and finance, shows how structural time variation can be dealt with, and illustrates the challenges these models present to researchers interested in studying cross-unit dynamic interdependences in heterogeneous setups.
Abstract: This paper provides an overview of the panel VAR models used in macroeconomics and finance. It discusses their distinctive features, what they are used for, and how they can be derived from economic theory. It also describes how they are estimated and how shock identification is performed, and compares panel VARs to other approaches used in the literature to deal with dynamic models involving heterogeneous units. Finally, it shows how structural time variation can be dealt with and illustrates the challenges that panel VARs present to researchers interested in studying cross-unit dynamic interdependences in heterogeneous setups.
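A bare-bones sketch of a first-order panel VAR estimated equation by equation with unit fixed effects (within transformation), leaving aside the identification, dynamic-panel bias, and time-variation issues the paper covers; the file and variable names are placeholders.

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("panel.csv")                         # hypothetical: columns unit, time, y1, y2
df = df.sort_values(["unit", "time"])
for v in ["y1", "y2"]:
    df[v + "_lag"] = df.groupby("unit")[v].shift(1)
df = df.dropna()

cols = ["y1", "y2", "y1_lag", "y2_lag"]
within = df[cols] - df.groupby("unit")[cols].transform("mean")   # strip unit fixed effects

X = sm.add_constant(within[["y1_lag", "y2_lag"]])
coefs = {v: sm.OLS(within[v], X).fit().params for v in ["y1", "y2"]}
print(pd.DataFrame(coefs))                            # one column of VAR(1) coefficients per equation
```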

ReportDOI
TL;DR: In this article, the authors generalize the gross exports accounting framework at the country level to one that decomposes gross trade flows (for both exports and imports) at the sector, bilateral, or bilateral sector level.
Abstract: This paper generalizes the gross exports accounting framework at the country level, recently proposed by Koopman, Wang, and Wei (2014), to one that decomposes gross trade flows (for both exports and imports) at the sector, bilateral, or bilateral sector level. We overcome major technical challenges for such generalization by allocating bilateral intermediate trade flows into their final destination of absorption. We also point out two major shortcomings associated with the VAX ratio concept often used in the literature and ways to overcome them. We present the disaggregated decomposition results for bilateral sector level gross trade flows among 40 trading nations in 35 sectors from 1995 to 2011 based on the WIOD database.
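A stylized sketch of the input-output accounting such decompositions build on: value added embodied in final demand equals v_hat * (I - A)^(-1) * F. The 2x2 "world" below is invented for illustration and is not the WIOD data or the authors' full bilateral-sector decomposition.

```python
import numpy as np

A = np.array([[0.20, 0.10],        # intermediate-input coefficients for two country-sectors
              [0.05, 0.25]])
F = np.array([[300.0,  50.0],      # final-demand shipments from each producer, by destination
              [ 40.0, 400.0]])

B = np.linalg.inv(np.eye(2) - A)   # Leontief inverse
va_coef = 1.0 - A.sum(axis=0)      # value added per unit of gross output

# value added originating in the row country-sector, absorbed in the column destination
va_by_destination = np.diag(va_coef) @ B @ F
print(va_by_destination)
```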

Posted Content
TL;DR: This article measures the macroeconomic consequences of the occupational convergence between white men, women, and blacks over the last 50 years through the prism of a Roy model of occupational choice in which women and blacks face frictions in the labor market and in the accumulation of human capital.
Abstract: Over the last 50 years, there has been a remarkable convergence in the occupational distribution between white men, women, and blacks. We measure the macroeconomic consequences of this convergence through the prism of a Roy model of occupational choice in which women and blacks face frictions in the labor market and in the accumulation of human capital. The changing frictions implied by the observed occupational convergence account for 15 to 20 percent of growth in aggregate output per worker since 1960.

Posted Content
TL;DR: Building on the growth framework of Barro and Sala-i-Martin, this article studies the two-way interplay between health and economic growth, noting empirical evidence that initial health status is a better predictor than initial education of subsequent economic growth.
Abstract: Since the mid 1980s, research on economic growth has experienced a boom, beginning with the work of Romer (1986). The new “endogenous growth” theories have focused on productivity advances that derive from technological progress and increased human capital in the form of education. Barro and Sala-i-Martin (1995) explore these theories and also discuss extensions to allow for open economies, diffusion of technology, migration of persons, fertility choice, and variable labor supply. The government can be important in the models in terms of its policies on maintenance of property rights, encouragement of free markets, taxation, education, and public infrastructure. One area that has received little attention in the recent literature on growth theory is the two-way interplay between health and economic growth. Two preliminary efforts in this direction are Ehrlich and Lui (1991) and Meltzer (1995). Also, the empirical work of Barro (1996) and others suggests that health status, as measured by life expectancy or analogous aggregate indicators, is an important contributor to subsequent growth. In fact, initial health seems to be a better predictor than initial education of subsequent economic growth. The main purpose of this study is to apply the spirit and apparatus of the recent advances in growth theory to the interaction between health and growth. The analysis is conceptual and is intended to form the basis for further theorizing and for empirical analyses of the joint determination of health and growth. The discussion begins with a survey of existing theories and empirical evidence on the determinants of economic growth. Then the paper develops models of the interplay between health and growth.

Posted Content
TL;DR: In this article, the authors show that once pre-determined peer characteristics are introduced as covariates in a model linking individual outcomes with group averages, the question of whether peer effects or social spillovers exist is econometrically identical to whether a 2SLS estimator using group dummies to instrument individual characteristics differs from OLS estimates of the effect of those characteristics.
Abstract: Individual outcomes are highly correlated with group average outcomes, a fact often interpreted as a causal peer effect. Without covariates, however, outcome-on-outcome peer effects are vacuous, either unity or, if the average is defined as leave-out, determined by a generic intraclass correlation coefficient. When pre-determined peer characteristics are introduced as covariates in a model linking individual outcomes with group averages, the question of whether peer effects or social spillovers exist is econometrically identical to that of whether a 2SLS estimator using group dummies to instrument individual characteristics differs from OLS estimates of the effect of these characteristics. The interpretation of results from models that rely solely on chance variation in peer groups is therefore complicated by bias from weak instruments. With systematic variation in group composition, the weak IV issue falls away, but the resulting 2SLS estimates can be expected to exceed the corresponding OLS estimates as a result of measurement error and other reasons unrelated to social effects. Randomized and quasi-experimental research designs that manipulate peer characteristics in a manner unrelated to individual characteristics provide the strongest evidence on the nature of social spillovers. As an empirical matter, designs of this sort have uncovered little in the way of socially significant causal effects.
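A numerical illustration of the equivalence stressed above, with simulated data and no social spillover built in: 2SLS that instruments an individual characteristic with group dummies is the same as regressing the outcome on the group mean of that characteristic, and with purely chance variation in peer groups it is a weak-instrument design.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n_groups, n_per = 200, 20
g = np.repeat(np.arange(n_groups), n_per)
x = rng.normal(size=n_groups * n_per)
y = 1.0 + 0.5 * x + rng.normal(size=x.size)                  # no peer effect in the DGP

df = pd.DataFrame({"g": g, "x": x, "y": y})
df["x_bar"] = df.groupby("g")["x"].transform("mean")         # first-stage fitted values from group dummies

ols = sm.OLS(df["y"], sm.add_constant(df["x"])).fit()
tsls = sm.OLS(df["y"], sm.add_constant(df["x_bar"])).fit()   # manual 2SLS second stage (point estimate only)

print(ols.params["x"], tsls.params["x_bar"])                 # both near 0.5; the 2SLS estimate is far noisier
```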

Posted Content
TL;DR: The QALY gain by patients using telehealth in addition to usual care was similar to that by patients receiving usual care only, and total costs associated with the telehealth intervention were higher, compared with standard support and treatment.
Abstract: Objective: To examine the costs and cost effectiveness of telehealth in addition to standard support and treatment, compared with standard support and treatment. Design: Economic evaluation nested in a pragmatic, cluster randomised controlled trial. Setting: Community based telehealth intervention in three local authority areas in England. Participants: 3230 people with a long term condition (heart failure, chronic obstructive pulmonary disease, or diabetes) were recruited into the Whole Systems Demonstrator telehealth trial between May 2008 and December 2009. Of participants taking part in the Whole Systems Demonstrator telehealth questionnaire study examining acceptability, effectiveness, and cost effectiveness, 845 were randomised to telehealth and 728 to usual care. Interventions: Intervention participants received a package of telehealth equipment and monitoring services for 12 months, in addition to the standard health and social care services available in their area. Controls received usual health and social care. Main outcome measure: Primary outcome for the cost effectiveness analysis was incremental cost per quality adjusted life year (QALY) gained. Results: We undertook net benefit analyses of costs and outcomes for 965 patients (534 receiving telehealth; 431 usual care). The adjusted mean difference in QALY gain between groups at 12 months was 0.012. Total health and social care costs (including direct costs of the intervention) for the three months before 12 month interview were £1390 (€1610; $2150) and £1596 for the usual care and telehealth groups, respectively. Cost effectiveness acceptability curves were generated to examine decision uncertainty in the analysis surrounding the value of the cost effectiveness threshold. The incremental cost per QALY of telehealth when added to usual care was £92 000. With this amount, the probability of cost effectiveness was low (11% at willingness to pay threshold of £30 000; >50% only if the threshold exceeded about £90 000). In sensitivity analyses, telehealth costs remained slightly (non-significantly) higher than usual care costs, even after assuming that equipment prices fell by 80% or telehealth services operated at maximum capacity. However, the most optimistic scenario (combining reduced equipment prices with maximum operating capacity) eliminated this group difference (cost effectiveness ratio £12 000 per QALY). Conclusions: The QALY gain by patients using telehealth in addition to usual care was similar to that by patients receiving usual care only, and total costs associated with the telehealth intervention were higher. Telehealth does not seem to be a cost effective addition to standard support and treatment.
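A generic sketch of the decision quantities reported above (the ICER and a cost-effectiveness acceptability curve), using made-up bootstrap draws of incremental costs and QALYs rather than the trial's patient-level data; it is not intended to reproduce the £92 000 figure.

```python
import numpy as np

rng = np.random.default_rng(3)
d_cost = rng.normal(loc=1100.0, scale=400.0, size=5000)    # incremental cost draws (hypothetical)
d_qaly = rng.normal(loc=0.012, scale=0.020, size=5000)     # incremental QALY draws (hypothetical)

icer = d_cost.mean() / d_qaly.mean()                       # incremental cost per QALY gained
thresholds = np.arange(0, 100_001, 10_000)
ceac = [float((lam * d_qaly - d_cost > 0).mean()) for lam in thresholds]   # P(cost effective | threshold)

print(f"ICER: {icer:,.0f} per QALY")
print(dict(zip(thresholds.tolist(), ceac)))
```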

Posted Content
TL;DR: A review of the theoretical and empirical literature on the different channels through which large shareholders engage in corporate governance can be found in this paper, where the authors highlight the empirical challenges in identifying causal effects of and on blockholders and the typical strategies attempted to achieve identification.
Abstract: This paper reviews the theoretical and empirical literature on the different channels through which blockholders (large shareholders) engage in corporate governance. In classical models, blockholders exert governance through direct intervention in a firm’s operations, otherwise known as “voice.” These theories have motivated empirical research on the determinants and consequences of activism. More recent models show that blockholders can govern through the alternative mechanism of “exit” – selling their shares if the manager underperforms. These theories give rise to new empirical studies on the two-way relationship between blockholders and financial markets, linking corporate finance with asset pricing. Blockholders may also worsen governance by extracting private benefits of control or pursuing objectives other than firm value maximization. I highlight the empirical challenges in identifying causal effects of and on blockholders, and the typical strategies attempted to achieve identification. I close with directions for future research.

Posted Content
TL;DR: In this article, the authors provide a model-free test for asymmetric correlations in which stocks move more often with the market when the market goes down than when it goes up.
Abstract: We provide a model-free test for asymmetric correlations in which stocks move more often with the market when the market goes down than when it goes up, and also provide such tests for asymmetric betas and covariances. When stocks are sorted by size, book-to-market, and momentum, we find strong evidence of asymmetries for both size and momentum portfolios, but no evidence for book-to-market portfolios. Moreover, we evaluate the economic significance of incorporating asymmetries into investment decisions, and find that they can be of substantial economic importance for an investor with a disappointment aversion (DA) preference as described by Ang, Bekaert, and Liu (2005).
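A sketch of the exceedance-correlation idea behind such tests: compare the correlation of a portfolio with the market conditional on both being below versus above zero (their means here). The data are simulated from a symmetric distribution, so the two conditional correlations should be close; the paper's model-free test formalizes this comparison for real portfolios.

```python
import numpy as np

rng = np.random.default_rng(4)
cov = [[1.0, 0.6], [0.6, 1.0]]
market, port = rng.multivariate_normal([0.0, 0.0], cov, size=100_000).T

down = (market < 0) & (port < 0)
up = (market > 0) & (port > 0)
rho_down = np.corrcoef(market[down], port[down])[0, 1]
rho_up = np.corrcoef(market[up], port[up])[0, 1]
print(rho_down, rho_up)    # asymmetric correlations would show rho_down > rho_up
```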

Posted Content
TL;DR: In this paper, the causal relationship between economic growth and renewable energy consumption in the BRICS countries over the period 1971-2010 is investigated within a multivariate framework; based on the ARDL estimates, there exist long-run equilibrium relationships among the competing variables.
Abstract: The current study investigates the causal relationship between economic growth and renewable energy consumption in the BRICS countries over the period 1971-2010 within a multivariate framework. The ARDL bounds testing approach to cointegration and vector error correction model (VECM) are used to examine the long-run and causal relationships between economic growth, renewable energy consumption, trade openness and carbon dioxide emissions. Empirical evidence shows that, based on the ARDL estimates, there exist long-run equilibrium relationships among the competing variables. Regarding the VECM results, bi-directional Granger causality exists between economic growth and renewable energy consumption, suggesting the feedback hypothesis, which can explain the role of renewable energy in stimulating economic growth in BRICS countries.
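A simplified sketch of the causality step for a single country: pairwise Granger tests between GDP growth and renewable energy consumption. The full paper uses the ARDL bounds test plus a VECM; the file and column names here are placeholders.

```python
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

df = pd.read_csv("brics_country.csv").dropna()    # hypothetical: columns gdp_growth, renew_cons

# does renewable energy consumption Granger-cause growth? (second column -> first column)
res_fwd = grangercausalitytests(df[["gdp_growth", "renew_cons"]], maxlag=2)
# and the reverse direction, to check for the feedback hypothesis
res_rev = grangercausalitytests(df[["renew_cons", "gdp_growth"]], maxlag=2)

print({lag: round(out[0]["ssr_ftest"][1], 4) for lag, out in res_fwd.items()})   # p-values by lag
print({lag: round(out[0]["ssr_ftest"][1], 4) for lag, out in res_rev.items()})
```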