Showing papers in "Social Science Research Network in 2009"
TL;DR: A review of double-blind peer-reviewed journals, conducted through major academic publishing databases, revealed that more than 30 academic articles in the domain of international marketing (broadly defined) used PLS path modeling as a means of statistical analysis.
Abstract: Purpose: This paper discusses partial least squares path modeling (PLS), a powerful structural equation modeling technique for research on international marketing. While a significant body of research provides guidance for the use of covariance-based structural equation modeling (CBSEM) in international marketing, there are no subject-specific guidelines for the use of PLS so far. Methodology/approach: A literature review of the use of PLS in international marketing reveals the increasing application of this methodology. Findings: This paper reveals the strengths and weaknesses of PLS in the context of research on international marketing, and provides guidance for multi-group analysis. Originality/value of paper: The paper assists researchers in making well-grounded decisions regarding the application of PLS in certain research situations and provides specific implications for an appropriate application of the methodology.
TL;DR: In this paper, the authors identify inefficient institutions as the root cause of economic differences between societies, analyze how these institutions change, and suggest how lessons from this framework can be applied to improve the economic well-being of countries.
Abstract: Why are some countries much richer than others? This technical note proposes a framework to begin answering this question. The first part identifies inefficient institutions as the root cause of the economic differences between societies. The second part analyzes how these institutions change. And the final part suggests how lessons from this institutional framework can be applied.
TL;DR: In this article, the authors provide an introduction and user guide to regression discontinuity (RD) designs for empirical researchers, presenting the basic theory behind the design and detailing when RD is likely to be valid or invalid given economic incentives.
Abstract: This paper provides an introduction and "user guide" to Regression Discontinuity (RD) designs for empirical researchers. It presents the basic theory behind the research design, details when RD is likely to be valid or invalid given economic incentives, explains why it is considered a "quasi-experimental" design, and summarizes different ways (with their advantages and disadvantages) of estimating RD designs and the limitations of interpreting these estimates. Concepts are discussed using examples drawn from the growing body of empirical research using RD.
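A minimal sketch of one estimation approach the guide surveys, local linear regression on either side of the cutoff, applied to simulated data. The cutoff, bandwidth, and jump size here are illustrative assumptions, not values from the paper:

```python
import numpy as np

def rd_estimate(x, y, cutoff=0.0, bandwidth=0.5):
    """Local linear RD estimate: fit a line on each side of the cutoff
    within the bandwidth, then take the difference of the two fitted
    values at the cutoff itself."""
    def fit_at_cutoff(mask):
        slope, intercept = np.polyfit(x[mask], y[mask], 1)
        return intercept + slope * cutoff
    left = (x >= cutoff - bandwidth) & (x < cutoff)
    right = (x >= cutoff) & (x <= cutoff + bandwidth)
    return fit_at_cutoff(right) - fit_at_cutoff(left)

# Simulated running variable with a true discontinuity of 2.0 at x = 0
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 5000)
y = 1.0 + 0.5 * x + 2.0 * (x >= 0) + rng.normal(0, 0.3, x.size)
effect = rd_estimate(x, y)
```

The bandwidth choice trades bias against variance, which is one of the estimation trade-offs the paper discusses.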
TL;DR: Brand experience is conceptualized as sensations, feelings, cognitions, and behavioral responses evoked by brand-related stimuli that are part of a brand's design and identity, packaging, communications, and environments.
Abstract: Brand experience is conceptualized as sensations, feelings, cognitions, and behavioral responses evoked by brand-related stimuli that are part of a brand's design and identity, packaging, communications, and environments. The authors distinguish several experience dimensions and construct a brand experience scale that includes four dimensions: sensory, affective, intellectual, and behavioral. In six studies, the authors show that the scale is reliable, valid, and distinct from other brand measures, including brand evaluations, brand involvement, brand attachment, customer delight, and brand personality. Moreover, brand experience affects consumer satisfaction and loyalty directly and indirectly through brand personality associations.
TL;DR: This paper proposed the instructional manipulation check (IMC), a new tool for detecting participants who are not following instructions, and demonstrated how the inclusion of an IMC can increase the statistical power and reliability of a dataset.
Abstract: Participants are not always as diligent in reading and following instructions as experimenters would like them to be. When participants fail to follow instructions, this increases noise and decreases the validity of their data. This paper presents and validates a new tool for detecting participants who are not following instructions – the Instructional manipulation check (IMC). We demonstrate how the inclusion of an IMC can increase statistical power and reliability of a dataset.
TL;DR: In this paper, the authors collected detailed qualitative information from financial filings to categorize financial constraints for a random sample of firms from 1995 to 2004, and used ordered logit models predicting constraints as a function of different quantitative factors.
Abstract: We collect detailed qualitative information from financial filings to categorize financial constraints for a random sample of firms from 1995 to 2004. Using this categorization, we estimate ordered logit models predicting constraints as a function of different quantitative factors. Our findings cast serious doubt on the validity of the KZ index as a measure of financial constraints, while offering mixed evidence on the validity of other common measures of constraints. We find that firm size and age are particularly useful predictors of financial constraint levels, and we propose a measure of financial constraints that is based solely on these firm characteristics.
TL;DR: This article showed that new loans to large borrowers fell by 47% during the peak period of the financial crisis (fourth quarter of 2008) relative to the prior quarter and by 79% relative to the peak of the credit boom (second quarter of 2007).
Abstract: This paper documents that new loans to large borrowers fell by 47% during the peak period of the financial crisis (fourth quarter of 2008) relative to the prior quarter and by 79% relative to the peak of the credit boom (second quarter of 2007). New lending for real investment (such as working capital and capital expenditures) fell by only 14% in the last quarter of 2008, but contracted nearly as much as new lending for restructuring (LBOs, M&A, share repurchases) relative to the peak of the credit boom. After the failure of Lehman Brothers in September 2008 there was a run by short-term bank creditors, making it difficult for banks to roll over their short-term debt. We document that there was a simultaneous run by borrowers who drew down their credit lines, leading to a spike in commercial and industrial loans reported on bank balance sheets. We examine whether these two stresses on bank liquidity led them to cut lending. In particular, we show that banks cut their lending less if they had better access to deposit financing and thus they were not as reliant on short-term debt. We also show that banks that were more vulnerable to credit line drawdowns because they co-syndicated more of their credit lines with Lehman Brothers reduced their lending to a greater extent.
TL;DR: This paper studied the behavior of money, credit, and macroeconomic indicators over the long run based on a newly constructed historical dataset for 12 developed countries over the years 1870-2008, utilizing the data to study rare events associated with financial crisis episodes.
Abstract: The crisis of 2008-09 has focused attention on money and credit fluctuations, financial crises, and policy responses. In this paper we study the behavior of money, credit, and macroeconomic indicators over the long run based on a newly constructed historical dataset for 12 developed countries over the years 1870-2008, utilizing the data to study rare events associated with financial crisis episodes. We present new evidence that leverage in the financial sector has increased strongly in the second half of the twentieth century as shown by a decoupling of money and credit aggregates, and we also find a decline in safe assets on banks' balance sheets. We also show for the first time how monetary policy responses to financial crises have been more aggressive post-1945, but how despite these policies the output costs of crises have remained large. Importantly, we can also show that credit growth is a powerful predictor of financial crises, suggesting that such crises are "credit booms gone wrong."
TL;DR: In a financial system in which balance sheets are continuously marked to market, asset price changes appear immediately as changes in net worth, eliciting responses from financial intermediaries who adjust the size of their balance sheets, as this paper documents.
Abstract: In a financial system in which balance sheets are continuously marked to market, asset price changes appear immediately as changes in net worth, eliciting responses from financial intermediaries who adjust the size of their balance sheets. We document evidence that marked-to-market leverage is strongly procyclical. Such behavior has aggregate consequences. Changes in dealer repos—the primary margin of adjustment for the aggregate balance sheets of intermediaries—forecast changes in financial market risk as measured by the innovations in the Chicago Board Options Exchange Volatility Index (VIX). Aggregate liquidity can be seen as the rate of change of the aggregate balance sheet of the financial intermediaries.
TL;DR: It is proposed that open source software development is an exemplar of a compound "private-collective" model of innovation that contains elements of both the private investment and the collective action models and can offer society the "best of both worlds" under many conditions.
Abstract: Currently two models of innovation are prevalent in organization science. The "private investment" model assumes returns to the innovator result from private goods and efficient regimes of intellectual property protection. The "collective action" model assumes that under conditions of market failure, innovators collaborate in order to produce a public good. The phenomenon of open source software development shows that users program to solve their own as well as shared technical problems, and freely reveal their innovations without appropriating private returns from selling the software. In this paper we propose that open source software development is an exemplar of a compound model of innovation that contains elements of both the private investment and the collective action models. We describe a new set of research questions this model raises for scholars in organization science. We offer some details regarding the types of data available for open source projects in order to ease access for researchers who are unfamiliar with these, and also offer some advice on conducting empirical studies on open source software development processes.
TL;DR: It is proposed that the effectiveness of vertical integration as a strategy to manage ecosystem interdependence increases over the course of the technology life cycle.
Abstract: The success of an innovating firm often depends on the efforts of other innovators in its environment. How do the challenges faced by external innovators affect the focal firm's outcomes? To address this question we first characterize the external environment according to the structure of interdependence. We follow the flow of inputs and outputs in the ecosystem to distinguish between upstream components that are bundled by the focal firm, and downstream complements that are bundled by the firm's customers. We argue that the effect of external innovation challenges depends not only on their magnitude, but also on their location in the ecosystem relative to the focal firm - whereas greater innovation challenges in components enhance the benefits that accrue to technology leaders, greater innovation challenges in complements erode these benefits. We further argue that the effectiveness of vertical integration as a strategy to manage ecosystem interdependence increases over the course of the technology life cycle. We explore these arguments in the context of the global semiconductor lithography industry from its emergence in 1962 to 2005 across nine distinct technology generations. We find strong support for our arguments.
TL;DR: In this article, the authors characterize dynamic tax policies that achieve sustainable growth or maximize intertemporal welfare, as a function of the degree of substitutability between clean and dirty inputs, environmental and resource stocks, and cross-country technological spillovers.
Abstract: This paper introduces endogenous and directed technical change in a growth model with environmental constraints and limited resources. A unique final good is produced by combining inputs from two sectors. One of these sectors uses "dirty" machines and thus creates environmental degradation. Research can be directed to improving the technology of machines in either sector. We characterize dynamic tax policies that achieve sustainable growth or maximize intertemporal welfare, as a function of the degree of substitutability between clean and dirty inputs, environmental and resource stocks, and cross-country technological spillovers. We show that: (i) in the case where the inputs are sufficiently substitutable, sustainable long-run growth can be achieved with temporary taxation of dirty innovation and production; (ii) optimal policy involves both "carbon taxes" and research subsidies, so that excessive use of carbon taxes is avoided; (iii) delay in intervention is costly: the sooner and the stronger is the policy response, the shorter is the slow growth transition phase; (iv) the use of an exhaustible resource in dirty input production helps the switch to clean innovation under laissez-faire when the two inputs are substitutes. Under reasonable parameter values (corresponding to those used in existing models with exogenous technology) and with sufficient substitutability between inputs, it is optimal to redirect technical change towards clean technologies immediately and optimal environmental regulation need not reduce long-run growth. We also show that in a two-country extension, even though optimal environmental policy involves global policy coordination, when the two inputs are sufficiently substitutable environmental regulation only in the North may be sufficient to avoid a global disaster.
TL;DR: In this paper, the authors propose a conceptual framework and sketch out a strategic plan for delivering on the promise of ecosystem services, drawing on emerging examples from Hawai'i, and describe key advances in the science and practice of accounting for natural capital in the decisions of individuals, communities, corporations, and governments.
Abstract: Over the past decade, efforts to value and protect ecosystem services have been promoted by many as the last, best hope for making conservation mainstream - attractive and commonplace worldwide. In theory, if we can help individuals and institutions to recognize the value of nature, then this should greatly increase investments in conservation, while at the same time fostering human well-being. In practice, however, we have not yet developed the scientific basis, nor the policy and finance mechanisms, for incorporating natural capital into resource- and land-use decisions on a large scale. Here, we propose a conceptual framework and sketch out a strategic plan for delivering on the promise of ecosystem services, drawing on emerging examples from Hawai‘i. We describe key advances in the science and practice of accounting for natural capital in the decisions of individuals, communities, corporations, and governments.
TL;DR: In this paper, the authors used Google Trends and Google Insights for Search data to predict economic activity and found that these data can help improve forecasts of the current level of activity for a number of economic time series, including automobile sales, home sales, retail sales, and travel behavior.
Abstract: Can Google queries help predict economic activity? The answer depends on what you mean by "predict." Google Trends and Google Insights for Search provide a real-time report on query volume, while economic data is typically released several days after the close of the month. Given this time lag, it is not implausible that Google queries in a category like "Automotive/Vehicle Shopping" during the first few weeks of March may help predict what actual March automotive sales will be like when the official data is released halfway through April. That famous economist Yogi Berra once said, "It's tough to make predictions, especially about the future." This inspired our approach: let us lower the bar and just try to predict the present. Our work to date is summarized in a paper called Predicting the Present with Google Trends. We find that Google Trends data can help improve forecasts of the current level of activity for a number of different economic time series, including automobile sales, home sales, retail sales, and travel behavior. Even predicting the present is useful, since it may help identify "turning points" in economic time series. If people start doing significantly more searches for "Real Estate Agents" in a certain location, it is tempting to think that house sales might increase in that area in the near future. Our paper outlines one approach to short-term economic prediction, but we expect that there are several other interesting ideas out there. So we suggest that forecasting wannabes download some Google Trends data and try to relate it to other economic time series. If you find an interesting pattern, post your findings on a website and send a link to firstname.lastname@example.org. We'll report on the most interesting results in a later blog post. It has been said that if you put a million monkeys in front of a million computers, you would eventually produce an accurate economic forecast. Let's see how well that theory works.
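The nowcasting idea above, augmenting a simple autoregressive baseline with contemporaneous query volume, can be sketched on simulated data. All series, coefficients, and variable names here are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Simulate a monthly sales series driven partly by a search-volume index
rng = np.random.default_rng(1)
n = 200
queries = rng.normal(0, 1, n)
sales = np.zeros(n)
for t in range(1, n):
    sales[t] = 0.6 * sales[t - 1] + 0.8 * queries[t] + rng.normal(0, 0.5)

def ols_r2(X, y):
    """In-sample R-squared of an OLS fit of y on the columns of X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

y = sales[1:]
const = np.ones(n - 1)
baseline = np.column_stack([const, sales[:-1]])            # AR(1) only
augmented = np.column_stack([const, sales[:-1], queries[1:]])  # plus queries
gain = ols_r2(augmented, y) - ols_r2(baseline, y)  # fit improvement from queries
```

The point of the comparison is the one the post makes: query volume is observable in real time, so even when it only improves the fit of the current month, it arrives before the official release.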
TL;DR: In this article, the authors overview and synthesize extant word of mouth theory and present a study of a marketing campaign in which mobile phones were seeded with prominent bloggers, revealing the complex cultural conditions through which marketing "hype" is transformed by consumers into the "honey" of relevant, shared communications.
Abstract: Word of mouth marketing — the intentional influencing of consumer-to-consumer communications — is an increasingly important technique. The authors overview and synthesize extant word of mouth theory and present a study of a marketing campaign in which mobile phones were seeded with prominent bloggers. Eighty-three blogs were followed for six months. Findings reveal the complex cultural conditions through which marketing “hype” is transformed by consumers into the “honey” of relevant, shared communications. Four word of mouth communication strategies are identified — evaluation, embracing, endorsement and explanation. Each is influenced by communicator narrative, communications forum, communal norms and the nature of the marketing promotion. An intrinsic tension between commercial and communal interests plays a prominent, normative role in message formation and reception. This “hype-to-honey” theory shows that communal word of mouth does not simply increase or amplify marketing messages. Rather, marketing messages and meanings are systematically altered in the process of embedding them. The theory has implications for how marketers should plan, target and benefit from word of mouth and how scholars should understand word of mouth in a networked world.
TL;DR: In this article, the authors argue that some types of well-being are consistent across cultures, whereas there are also unique patterns of well-being in societies that are not comparable across cultures.
Abstract: Subjective well-being (SWB) is composed of people’s evaluations of their lives, including pleasant affect, infrequent unpleasant affect, and life satisfaction (LS). We review the research literature concerning the influence of culture on SWB. We argue that some types of well-being, as well as their causes, are consistent across cultures, whereas there are also unique patterns of well-being in societies that are not comparable across cultures. Thus, well-being can be understood to some degree in universal terms, but must also be understood within the framework of each culture. We review the methodological challenges to assessing SWB in different cultures. One important question for future research is the degree to which feelings of well-being lead to the same outcomes in different cultures. The overarching theme of the paper is that there are pancultural experiences of SWB that can be compared across cultures, but that there are also culture-specific patterns that make cultures unique in their experience of well-being.
TL;DR: The authors survey 1,050 CFOs in the US, Europe, and Asia to assess whether their firms are credit constrained during the global financial crisis of 2008 and find that constrained firms planned deeper cuts in tech spending, employment, and capital spending.
Abstract: We survey 1,050 CFOs in the US, Europe, and Asia to directly assess whether their firms are credit constrained during the global financial crisis of 2008. We study whether corporate spending plans differ conditional on this survey-based measure of financial constraint. Our evidence indicates that constrained firms planned deeper cuts in tech spending, employment, and capital spending. Constrained firms also burned through more cash, drew more heavily on lines of credit for fear banks would restrict access in the future, and sold more assets to fund their operations. We also find that the inability to borrow externally caused many firms to bypass attractive investment opportunities, with 86% of constrained US CFOs saying their investment in attractive projects was restricted during the credit crisis of 2008. More than half of the respondents said they canceled or postponed their planned investments. Our results also hold in Europe and Asia, and in many cases are stronger in those economies. Our analysis adds to the portfolio of approaches and knowledge about the impact of credit constraints on real firm behavior.
TL;DR: A statistical framework is developed that uses satellite data on lights growth to augment existing income growth measures, under the assumption that measurement error in using observed light as an indicator of income is uncorrelated with measurement error in national income accounts.
Abstract: GDP growth is often measured poorly for countries and rarely measured at all for cities or subnational regions. We propose a readily available proxy: satellite data on lights at night. We develop a statistical framework that uses lights growth to augment existing income growth measures, under the assumption that measurement error in using observed light as an indicator of income is uncorrelated with measurement error in national income accounts. For countries with good national income accounts data, information on growth of lights is of marginal value in estimating the true growth rate of income, while for countries with the worst national income accounts, the optimal estimate of true income growth is a composite with roughly equal weights. Among poor-data countries, our new estimate of average annual growth differs by as much as 3 percentage points from official data. Lights data also allow for measurement of income growth in sub- and supranational regions. As an application, we examine growth in Sub-Saharan African regions over the last 17 years. We find that real incomes in non-coastal areas have grown faster than coastal areas by 1/3 of an annual percentage point; non-malarial areas have grown faster than malarial ones by 1/3 to 2/3 of an annual percentage point; and primate city regions have grown no faster than hinterland areas. Such applications point toward a research program in which "empirical growth" need no longer be synonymous with "national income accounts."
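The composite estimator described above weights the two noisy growth measures by their precision; the logic can be sketched as follows. This is a simplified stand-in for the paper's framework: it assumes the lights series has already been converted into an income-growth proxy, and all numbers are illustrative:

```python
def composite_growth(gdp_growth, lights_growth, var_gdp, var_lights):
    """Precision-weighted combination of two noisy measures of true growth.
    The weight on lights rises with the error variance of the national
    income accounts, approaching one half for the worst-data countries
    when the two error variances are comparable."""
    w_lights = var_gdp / (var_gdp + var_lights)
    estimate = (1 - w_lights) * gdp_growth + w_lights * lights_growth
    return estimate, w_lights

# Good-data country: lights growth gets little weight
good_est, good_w = composite_growth(2.0, 4.0, var_gdp=0.1, var_lights=1.0)
# Bad-data country: roughly equal weights, so the composite moves
# substantially toward the lights-based figure
bad_est, bad_w = composite_growth(2.0, 4.0, var_gdp=1.0, var_lights=1.0)
```

This mirrors the abstract's claim that lights are of marginal value for good-data countries but receive roughly equal weight for the worst-data countries.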
TL;DR: In this article, a large-scale Monte-Carlo simulation was conducted to compare the performance of covariance-based SEM (CBSEM) with that of partial least squares (PLS) analysis.
Abstract: Variance-based SEM, also known under the term partial least squares (PLS) analysis, is an approach that has gained increasing interest among marketing researchers in recent years. During the last 25 years, more than 30 articles have been published in leading marketing journals that have applied this approach instead of the more traditional alternative of covariance-based SEM (CBSEM). However, although an analysis of these previous publications shows that there seems to be at least an implicit agreement about the factors that should drive the choice between PLS analysis and CBSEM, no research has until now empirically compared the performance of these approaches given a set of different conditions. Our study addresses this open question by conducting a large-scale Monte-Carlo simulation. We show that justifying the choice of PLS due to a lack of assumptions regarding indicator distribution and measurement scale is often inappropriate, as CBSEM proves extremely robust with respect to violations of its underlying distributional assumptions. Additionally, CBSEM clearly outperforms PLS in terms of parameter consistency and is preferable in terms of parameter accuracy as long as the sample size exceeds a certain threshold (250 observations). Nevertheless, PLS analysis should be preferred when the emphasis is on prediction and theory development, as the statistical power of PLS is always larger than or equal to that of CBSEM; already, 100 observations can be sufficient to achieve acceptable levels of statistical power given a certain quality of the measurement model.
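The power-versus-sample-size comparison reported above (acceptable power at roughly 100 observations, a threshold near 250 for parameter accuracy) can be illustrated with a much simpler Monte-Carlo sketch for a single regression coefficient. This is a toy stand-in for the paper's PLS/CBSEM design, with an arbitrary effect size:

```python
import numpy as np

def power_sim(n, beta=0.25, reps=500, alpha_crit=1.96, seed=4):
    """Monte-Carlo power estimate: the share of simulated samples in
    which a t-test detects a nonzero slope (no-intercept regression)."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        x = rng.normal(0, 1, n)
        y = beta * x + rng.normal(0, 1, n)
        bhat = (x @ y) / (x @ x)
        resid = y - bhat * x
        se = np.sqrt((resid @ resid) / (n - 1) / (x @ x))
        hits += abs(bhat / se) > alpha_crit
    return hits / reps

power_100 = power_sim(100)
power_250 = power_sim(250)
```

As in any such simulation, power rises with the sample size, which is the basic mechanism behind the paper's sample-size thresholds.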
TL;DR: In this paper, the authors examined image motivation as a driver of prosocial behavior and asked whether extrinsic monetary incentives (doing well) have a detrimental effect on prosocial behavior due to crowding out of image motivation.
Abstract: This paper examines image motivation - the desire to be liked and well-regarded by others - as a driver in prosocial behavior (doing good), and asks whether extrinsic monetary incentives (doing well) have a detrimental effect on prosocial behavior due to crowding out of image motivation. By definition, image depends on one's behavior being visible to other people. Using this unique property we show that image is indeed an important part of the motivation to behave prosocially. Moreover, we show that extrinsic incentives interact with image motivation and are therefore less effective in public than in private. Together, these results imply that image motivation is crowded out by monetary incentives; this means that monetary incentives are more likely to be counterproductive for public prosocial activities than for private ones.
TL;DR: In this paper, the authors investigated the relationship between stocks, bonds and gold and found that gold is a hedge against stocks on average and a safe haven in extreme stock market conditions.
Abstract: Is gold a hedge against sudden changes in stock and bond returns, or does it instead have a subtly different property, that of being a safe haven? This paper addresses these two interlinked questions. A safe haven is defined as a security that is uncorrelated with stocks and bonds in case of a market crash. This is counterpoised against a hedge, defined as a security that is uncorrelated with stocks or bonds on average. We study constant and time-varying relationships between stocks, bonds and gold in order to investigate the existence of a hedge and a safe haven. The empirical analysis examines US, UK and German stock and bond returns and their relationship with gold returns. We find that gold is a hedge against stocks on average and a safe haven in extreme stock market conditions. This finding suggests that the existence of a safe haven enhances the stability and resiliency of financial markets since it reduces investors' losses at times when a reduction is needed the most. A portfolio analysis further shows that the safe haven property is extremely short-lived so that an investor buying gold one day after a shock loses money.
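The hedge/safe-haven distinction defined above is operational: a hedge is judged by the unconditional gold-stock correlation, a safe haven by the correlation conditional on extreme negative stock returns. A minimal sketch on simulated returns (the factor loading, noise level, and 5% crash threshold are illustrative assumptions, not the paper's specification):

```python
import numpy as np

# Simulated daily returns: gold mildly negatively related to stocks
rng = np.random.default_rng(2)
n = 10_000
stocks = rng.normal(0, 1, n)
gold = -0.2 * stocks + rng.normal(0, 1, n)

# Hedge property: correlation on average, over the full sample
hedge_corr = np.corrcoef(stocks, gold)[0, 1]

# Safe-haven property: correlation conditional on a market crash,
# here defined as the worst 5% of stock-return days
crash = stocks <= np.quantile(stocks, 0.05)
haven_corr = np.corrcoef(stocks[crash], gold[crash])[0, 1]
```

The paper's time-varying analysis is richer than this two-number summary, but the conditional-versus-unconditional contrast is the core of the definitions.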
TL;DR: This paper found that family firms are less tax aggressive than their non-family counterparts, ceteris paribus, and that family owners are willing to forgo tax benefits in order to avoid the non-tax cost of a potential price discount, which can arise from minority shareholders' concern with family rent-seeking masked by tax avoidance activities.
Abstract: Taxes represent a significant cost to the firm and shareholders, and it is generally expected that shareholders prefer tax aggressiveness. However, this argument ignores potential non-tax costs that can accompany tax aggressiveness, especially those arising from agency problems. Firms owned/run by founding family members are characterized by a unique agency conflict between dominant and small shareholders. Using multiple measures to capture tax aggressiveness and founding family presence, we find that family firms are less tax aggressive than their non-family counterparts, ceteris paribus. This result suggests that family owners are willing to forgo tax benefits in order to avoid the non-tax cost of a potential price discount, which can arise from minority shareholders' concern with family rent-seeking masked by tax avoidance activities (Desai and Dharmapala 2006). This inference is further strengthened by our finding that family firms without long-term institutional investors (as outside monitors) and family firms expecting to raise capital exhibit even lower tax aggressiveness. Our result is also consistent with family owners being more concerned with the potential penalty and reputation damage from an IRS audit than non-family firms. We obtain similar inferences when using a small sample of tax shelter cases.
TL;DR: In this article, it is shown that it is easy to calculate standard errors that are robust to simultaneous correlation across both firms and time, and that any statistical package with a clustering command can be used to easily calculate these standard errors.
Abstract: When estimating finance panel regressions, it is common practice to adjust standard errors for correlation either across firms or across time. These procedures are valid only if the residuals are correlated either across time or across firms, but not across both. This note shows that it is very easy to calculate standard errors that are robust to simultaneous correlation across both firms and time. The covariance estimator is equal to the estimator that clusters by firm, plus the estimator that clusters by time, minus the usual heteroskedasticity-robust OLS covariance matrix. Any statistical package with a clustering command can be used to easily calculate these standard errors.
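The covariance formula stated above (cluster by firm, plus cluster by time, minus the heteroskedasticity-robust matrix) is mechanical enough to sketch directly. A minimal numpy implementation on a simulated panel, not production code:

```python
import numpy as np

def cluster_cov(X, resid, clusters):
    """Cluster-robust OLS covariance: (X'X)^-1 [sum_g X_g'e_g e_g'X_g] (X'X)^-1."""
    XtX_inv = np.linalg.inv(X.T @ X)
    meat = np.zeros((X.shape[1], X.shape[1]))
    for g in np.unique(clusters):
        score = X[clusters == g].T @ resid[clusters == g]
        meat += np.outer(score, score)
    return XtX_inv @ meat @ XtX_inv

def twoway_cov(X, resid, firm, time):
    """Two-way clustered covariance: cluster-by-firm plus cluster-by-time
    minus the White estimator (each observation its own cluster)."""
    obs = np.arange(len(resid))
    return (cluster_cov(X, resid, firm)
            + cluster_cov(X, resid, time)
            - cluster_cov(X, resid, obs))

# Usage on a simulated balanced panel: 50 firms, 10 years
rng = np.random.default_rng(3)
n_firms, n_years = 50, 10
firm = np.repeat(np.arange(n_firms), n_years)
year = np.tile(np.arange(n_years), n_firms)
x = rng.normal(0, 1, n_firms * n_years)
y = 1.0 + 2.0 * x + rng.normal(0, 1, x.size)
X = np.column_stack([np.ones(x.size), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
se = np.sqrt(np.diag(twoway_cov(X, resid, firm, year)))
```

With i.i.d. errors, as simulated here, the two-way standard errors should be close to the usual robust ones; the estimator matters when residuals are correlated both within firms and within time periods.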
TL;DR: The 2009 Human Development Report (HDR09) investigates migration in the context of demographic changes and trends in both growth and inequality, and explores less visible movements typically pursued by disadvantaged groups, such as short-term and seasonal migration.
Abstract: Migration, both within and beyond borders, has become an increasingly prominent theme in domestic and international debates, and is the topic of the 2009 Human Development Report (HDR09). The starting point is that the global distribution of capabilities is extraordinarily unequal, and that this is a major driver for movement of people. Migration can expand their choices — in terms of incomes, accessing services and participation, for example — but the opportunities open to people vary from those who are best endowed to those with limited skills and assets. These underlying inequalities, which can be compounded by policy distortions, are a theme of the report. The report investigates migration in the context of demographic changes and trends in both growth and inequality. It also presents more detailed and nuanced individual, family and village experiences, and explores less visible movements typically pursued by disadvantaged groups, such as short-term and seasonal migration. There is a range of evidence about the positive impacts of migration on human development, through such avenues as increased household incomes and improved access to education and health services. There is further evidence that migration can empower traditionally disadvantaged groups, in particular women. At the same time, risks to human development are also present where migration is a reaction to threats and denial of choice, and where regular opportunities for movement are constrained. National and local policies play a critical role in enabling better human development outcomes for both those who choose to move in order to improve their circumstances, and those forced to relocate due to conflict, environmental degradation, or other reasons. Host country restrictions can raise both the costs and the risks of migration.
Similarly, negative outcomes can arise at the country level where basic civic rights, like voting, schooling and health care, are denied to those who have moved across provincial lines to work and live. HDR09 shows how a human development approach can be a means to redress some of the underlying issues that erode the potential benefits of mobility and/or force migration.
TL;DR: An overview of the meaning and measurement of financial literacy is presented to highlight current limitations and assist researchers in establishing standardized, commonly accepted financial literacy instruments as mentioned in this paper; such measurement is essential to understanding educational impact as well as barriers to effective financial choice.
Abstract: Financial literacy (or financial knowledge) is typically an input to model the need for financial education and explain variation in financial outcomes. Defining and appropriately measuring financial literacy is essential to understand educational impact as well as barriers to effective financial choice. This article summarizes the broad range of financial literacy measures used in research over the last decade. An overview of the meaning and measurement of financial literacy is presented to highlight current limitations and assist researchers in establishing standardized, commonly accepted financial literacy instruments.
TL;DR: This paper examined the role of gold in the global financial system and found that gold is both a hedge and a safe haven for major European stock markets and the US but not for Australia, Canada, Japan and large emerging markets such as the BRIC countries.
Abstract: The aim of this paper is to examine the role of gold in the global financial system. We test the hypothesis that gold represents a safe haven against stocks of major emerging and developing countries. A descriptive and econometric analysis for a sample spanning the 30-year period 1979-2009 shows that gold is both a hedge and a safe haven for major European stock markets and the US but not for Australia, Canada, Japan and large emerging markets such as the BRIC countries. We also distinguish between a weak and strong form of the safe haven and argue that gold may act as a stabilizing force for the financial system by reducing losses in the face of extreme negative market shocks. Looking at specific crisis periods, we find that gold was a strong safe haven for most developed markets during the peak of the recent financial crisis.
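The hedge/safe-haven distinction in the abstract above can be illustrated with a toy calculation: a hedge requires gold's beta to stocks to be non-positive on average, while a safe haven requires it to be non-positive on the worst market days. A minimal sketch on synthetic returns (the series, the flight-to-safety effect, and the 5% crisis cutoff are illustrative assumptions; the paper's actual econometric model is richer):

```python
import random

def beta(x, y):
    """OLS slope of y on x, i.e. cov(x, y) / var(x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

random.seed(0)
n = 2000
stock = [random.gauss(0.0, 1.0) for _ in range(n)]
# Toy gold series: independent noise plus a flight-to-safety bump
# on bad equity days, so gold tends to rise when stocks crash.
gold = [random.gauss(0.0, 1.0) - (1.0 * s if s < -1.5 else 0.0)
        for s in stock]

# Hedge test: beta over the full sample.
beta_all = beta(stock, gold)

# Safe-haven test: beta on the worst 5% of equity days only.
cutoff = sorted(stock)[int(0.05 * n)]
crisis = [t for t in range(n) if stock[t] < cutoff]
beta_crisis = beta([stock[t] for t in crisis],
                   [gold[t] for t in crisis])
```

Both betas come out negative on this synthetic data; in the paper's taxonomy a non-positive `beta_all` corresponds to a hedge and a non-positive `beta_crisis` to a safe haven, with the strong form requiring strictly negative comovement in the crisis tail.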
TL;DR: The authors review and evaluate the methods commonly used in the accounting literature to correct for cross-sectional and time-series dependence and find that the extant methods are not robust to both forms of dependence.
Abstract: We review and evaluate the methods commonly used in the accounting literature to correct for cross-sectional and time-series dependence. While much of the accounting literature studies settings where variables are cross-sectionally and serially correlated, we find that the extant methods are not robust to both forms of dependence. Contrary to claims in the literature, we find that the Z2-statistic and Newey-West corrected Fama-MacBeth do not correct for both cross-sectional and time-series dependence. We show that extant methods produce misspecified test statistics in common accounting research settings, and that correcting for both forms of dependence substantially alters inferences reported in the literature. Specifically, several findings in the cost of equity capital literature, the cost of debt literature, and the conservatism literature appear not to be robust to the use of well-specified test statistics.
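The correction the abstract argues for can be sketched for a one-regressor panel: compute cluster-robust variances by firm and by year separately, then combine them as V_firm + V_year - V_white, the standard two-way clustering formula. A toy implementation on simulated data (the simulated firm/year effects and variable names are illustrative, not the paper's setting):

```python
import random
from collections import defaultdict

def two_way_cluster_se(x, y, firms, years):
    """Slope of y on x (intercept handled by demeaning) and its
    two-way cluster-robust standard error: V_firm + V_year - V_white."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    xd = [a - mx for a in x]
    sxx = sum(a * a for a in xd)
    b = sum(a * (c - my) for a, c in zip(xd, y)) / sxx
    resid = [(c - my) - b * a for a, c in zip(xd, y)]

    def cluster_var(labels):
        # Sum of squared within-cluster score sums, scaled by Sxx^2.
        scores = defaultdict(float)
        for a, e, g in zip(xd, resid, labels):
            scores[g] += a * e
        return sum(s * s for s in scores.values()) / sxx ** 2

    v_firm = cluster_var(firms)
    v_year = cluster_var(years)
    v_white = cluster_var(range(n))          # one observation per "cluster"
    v = max(v_firm + v_year - v_white, 0.0)  # guard against a negative estimate
    return b, v ** 0.5

random.seed(1)
n_firms, n_years = 200, 10
f_x = [random.gauss(0, 1) for _ in range(n_firms)]    # firm component of x
f_eff = [random.gauss(0, 1) for _ in range(n_firms)]  # firm effect in residual
t_eff = [random.gauss(0, 1) for _ in range(n_years)]  # year effect in residual
x, y, fid, tid = [], [], [], []
for i in range(n_firms):
    for t in range(n_years):
        xv = 0.7 * f_x[i] + random.gauss(0, 1)
        # True slope 0.5; residual correlated within firm and within year.
        yv = 0.5 * xv + f_eff[i] + t_eff[t] + random.gauss(0, 1)
        x.append(xv); y.append(yv); fid.append(i); tid.append(t)

b, se = two_way_cluster_se(x, y, fid, tid)
```

With both the regressor and the residual correlated within firm, the firm-clustered variance exceeds the White variance, which is exactly the situation in which one-way or uncorrected test statistics become misspecified.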
TL;DR: This article conducted a meta-analysis of 251 studies presented in 214 manuscripts and found that the overall effect is positive but small (mean r = .13, median r = .09, weighted r = .11), and results for the 106 studies from the past decade are even smaller.
Abstract: In an era of rising concern about financial performance and social ills, companies’ economic achievements and negative externalities prompt a common question: Does it pay to be good? For thirty-five years, researchers have been investigating the empirical link between corporate social performance (CSP) and corporate financial performance (CFP). In the most comprehensive review of this research to date, we conduct a meta-analysis of 251 studies presented in 214 manuscripts. The overall effect is positive but small (mean r = .13, median r = .09, weighted r = .11), and results for the 106 studies from the past decade are even smaller. We also conduct sensitivity analyses to determine whether or not the relationship is stronger under certain conditions. Except for the effect of revealed misdeeds on financial performance, none of the many contingencies examined in the literature markedly affects the results. Therefore, we conclude by considering whether, aside from striving to do no harm, companies have grounds for doing good - and whether researchers have grounds for continuing to look for an empirical link between CSP and CFP.
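The three summary statistics quoted above (mean, median, and sample-size-weighted mean correlation) are standard meta-analytic aggregates and are simple to compute. A minimal sketch with made-up study values (the `(r, n)` pairs below are illustrative, not the paper's data):

```python
from statistics import mean, median

# Each study contributes an effect size r and a sample size n.
studies = [(0.20, 100), (0.10, 300), (0.05, 100)]  # illustrative values

rs = [r for r, _ in studies]
mean_r = mean(rs)      # unweighted mean correlation
median_r = median(rs)  # median correlation
# Sample-size-weighted mean gives larger studies more influence.
weighted_r = sum(r * n for r, n in studies) / sum(n for _, n in studies)
```

Published meta-analyses often average Fisher z-transformed correlations instead; the n-weighted raw mean shown here follows the common Hunter-Schmidt convention.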
TL;DR: In this article, a growth model that is consistent with salient features of the Chinese growth experience since 1992 is presented, which includes high output growth, sustained returns on capital investments, extensive reallocation within the manufacturing sector, falling labor share and accumulation of a large foreign surplus.
Abstract: This paper constructs a growth model that is consistent with salient features of the Chinese growth experience since 1992: high output growth, sustained returns on capital investments, extensive reallocation within the manufacturing sector, falling labor share and accumulation of a large foreign surplus. The theory makes only minimal deviations from a neoclassical growth model. Its building blocks are financial imperfections and reallocation among firms with heterogeneous productivity. Some firms use more productive technologies than others, but low-productivity firms survive because of better access to credit markets. Due to the financial imperfections, high-productivity firms - which are run by entrepreneurs - must be financed out of internal savings. If these savings are sufficiently large, the high-productivity sector outgrows the low-productivity sector, and attracts an increasing employment share. During the transition, low wage growth sustains the return to capital. The downsizing of the financially integrated sector forces a growing share of domestic savings to be invested in foreign assets, generating a foreign surplus. We test some auxiliary implications of the theory and find robust empirical support.
TL;DR: In this article, the authors argue that substantial model uncertainty and instability seriously impair the forecasting ability of individual predictive regression models, and they recommend combining individual model forecasts to improve out-of-sample equity premium prediction.
Abstract: While a host of economic variables have been identified in the literature with the apparent in-sample ability to predict the equity premium, Goyal and Welch (2008) find that these variables fail to deliver consistent out-of-sample forecasting gains relative to the historical average. Arguing that substantial model uncertainty and instability seriously impair the forecasting ability of individual predictive regression models, we recommend combining individual model forecasts to improve out-of-sample equity premium prediction. Combining delivers statistically and economically significant out-of-sample gains relative to the historical average on a consistent basis over time. We provide two empirical explanations for the benefits of the forecast combination approach: (i) combining forecasts incorporates information from numerous economic variables while substantially reducing forecast volatility; (ii) combination forecasts of the equity premium are linked to the real economy.
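Mechanically, the combination approach is simple: each period, average the forecasts from the individual predictive regressions and compare squared errors against the historical-average benchmark via an out-of-sample R-squared. A toy sketch with synthetic forecasts (the noise scales and the three "models" are illustrative assumptions; the paper estimates real predictive regressions recursively):

```python
import random

def oos_r2(forecast, benchmark, actual):
    """Out-of-sample R^2: positive when `forecast` beats `benchmark`."""
    sse_f = sum((f - a) ** 2 for f, a in zip(forecast, actual))
    sse_b = sum((b - a) ** 2 for b, a in zip(benchmark, actual))
    return 1.0 - sse_f / sse_b

random.seed(2)
periods, models = 240, 3
# Latent expected premium varies over time; realizations are noisy.
signal = [random.gauss(0.004, 0.01) for _ in range(periods)]
actual = [s + random.gauss(0.0, 0.04) for s in signal]

# Each individual "model" sees the signal through its own noise.
individual = [[s + random.gauss(0.0, 0.003) for s in signal]
              for _ in range(models)]
# Equal-weight combination forecast.
combo = [sum(col) / models for col in zip(*individual)]

# Historical-average benchmark: expanding mean of past realizations.
# Evaluation starts at t = 1 so the benchmark always has some history.
bench = [sum(actual[:t]) / t for t in range(1, periods)]
r2_combo = oos_r2(combo[1:], bench, actual[1:])
```

A useful property of the equal-weight combination: because squared error is convex in the forecast, the combination's mean squared error can never exceed the average of the individual models' mean squared errors, which is one mechanical source of the volatility reduction the abstract points to.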