
Showing papers in "Social Science Research Network in 2017"


Book ChapterDOI
TL;DR: Partial least squares structural equation modeling (PLS-SEM) has become a popular method for estimating path models with latent variables and their relationships as discussed by the authors, and a common goal of PLS-SEM analyses is to identify key success factors and sources of competitive advantage for important target constructs such as customer satisfaction, customer loyalty, behavioral intentions, and user behavior.
Abstract: Partial least squares structural equation modeling (PLS-SEM) has become a popular method for estimating path models with latent variables and their relationships. A common goal of PLS-SEM analyses is to identify key success factors and sources of competitive advantage for important target constructs such as customer satisfaction, customer loyalty, behavioral intentions, and user behavior. Building on an introduction of the fundamentals of measurement and structural theory, this chapter explains how to specify and estimate path models using PLS-SEM. Complementing the introduction of the PLS-SEM method and the description of how to evaluate analysis results, the chapter also offers an overview of complementary analytical techniques. A PLS-SEM application of the widely recognized corporate reputation model illustrates the method.
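The path model described here has two parts: a measurement (outer) model linking observed indicators to latent constructs, and a structural (inner) model linking the constructs to one another. A minimal sketch of both, with illustrative notation not taken from the chapter:

```latex
% Measurement (outer) model: indicator x_k reflects latent construct \xi with loading \lambda_k
x_k = \lambda_k \, \xi + \varepsilon_k
% Structural (inner) model: endogenous construct \eta depends on predecessor constructs \xi_j
\eta = \sum_{j} \beta_j \, \xi_j + \zeta
```

PLS-SEM estimates the construct scores, loadings, and path coefficients iteratively, alternating between outer and inner approximation steps.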

1,842 citations


Journal ArticleDOI
TL;DR: The findings indicate that the circular economy is most frequently depicted as a combination of reduce, reuse and recycle activities, whereas the need for a systemic shift is oftentimes not highlighted; the authors caution that significantly varying definitions may eventually result in the collapse of the concept.
Abstract: The circular economy concept has gained momentum both among scholars and practitioners. However, critics claim that it means many different things to different people. This paper provides further evidence for these critics. The aim of this paper is to create transparency regarding the current understandings of the circular economy concept. For this purpose, we have gathered 114 circular economy definitions which were coded on 17 dimensions. Our findings indicate that the circular economy is most frequently depicted as a combination of reduce, reuse and recycle activities, whereas it is oftentimes not highlighted that CE necessitates a systemic shift. We further find that the definitions show few explicit linkages of the circular economy concept to sustainable development. The main aim of the circular economy is considered to be economic prosperity, followed by environmental quality; its impact on social equity and future generations is barely mentioned. Furthermore, neither business models nor consumers are frequently outlined as enablers of the circular economy. We critically discuss the various circular economy conceptualizations throughout this paper. Overall, we hope to contribute via this study towards the coherence of the circular economy concept; we presume that significantly varying circular economy definitions may eventually result in the collapse of the concept.

1,381 citations


Journal ArticleDOI
TL;DR: It is suggested that data controllers should offer a particular type of explanation, unconditional counterfactual explanations, to support these three aims; such explanations describe the smallest change to the world that can be made to obtain a desirable outcome, or to arrive at the closest possible world, without needing to explain the internal logic of the system.
Abstract: There has been much discussion of the “right to explanation” in the EU General Data Protection Regulation, and its existence, merits, and disadvantages. Implementing a right to explanation that opens the ‘black box’ of algorithmic decision-making faces major legal and technical barriers. Explaining the functionality of complex algorithmic decision-making systems and their rationale in specific cases is a technically challenging problem. Some explanations may offer little meaningful information to data subjects, raising questions around their value. Data controllers have an interest to not disclose information about their algorithms that contains trade secrets, violates the rights and freedoms of others (e.g. privacy), or allows data subjects to game or manipulate decision-making. Explanations of automated decisions need not hinge on the general public understanding how algorithmic systems function. Even though such interpretability is of great importance and should be pursued, explanations can, in principle, be offered without opening the black box. Looking at explanations as a means to help a data subject act rather than merely understand, one could gauge the scope and content of explanations according to the specific goal or action they are intended to support. From the perspective of individuals affected by automated decision-making, we propose three aims for explanations: (1) to inform and help the individual understand why a particular decision was reached, (2) to provide grounds to contest the decision if the outcome is undesired, and (3) to understand what would need to change in order to receive a desired result in the future, based on the current decision-making model. We assess how each of these goals finds support in the GDPR, and the extent to which they hinge on opening the ‘black box’. We suggest data controllers should offer a particular type of explanation, ‘unconditional counterfactual explanations’, to support these three aims. These counterfactual explanations describe the smallest change to the world that can be made to obtain a desirable outcome, or to arrive at the “closest possible world.” As multiple variables or sets of variables can lead to one or more desirable outcomes, multiple counterfactual explanations can be provided, corresponding to different choices of nearby possible worlds for which the counterfactual holds. Counterfactuals describe a dependency on the external facts that lead to that decision without the need to convey the internal state or logic of an algorithm. As a result, counterfactuals serve as a minimal solution that bypasses the current technical limitations of interpretability, while striking a balance between transparency and the rights and freedoms of others (e.g. privacy, trade secrets).
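Viewed as an optimization problem, an unconditional counterfactual explanation of the kind proposed here can be sketched as follows (a generic formulation in my own notation, not necessarily the paper's exact objective): find the point x' closest to the original input x, under a chosen distance d, for which the model f returns the desired outcome y'.

```latex
x'^{*} \;=\; \arg\min_{x'} \; d(x, x') \quad \text{subject to} \quad f(x') = y'
```

Different choices of d or of the target y' yield different nearby "possible worlds," which is why several counterfactuals can be reported for a single decision.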

1,167 citations


Journal ArticleDOI
TL;DR: Self-Determination Theory (SDT) is a macro theory of human motivation that evolved from research on intrinsic and extrinsic motivations and expanded to include research on work organizations and other domains of life.
Abstract: Self-determination theory (SDT) is a macro theory of human motivation that evolved from research on intrinsic and extrinsic motivations and expanded to include research on work organizations and other domains of life. We discuss SDT research relevant to the workplace, focusing on (a) the distinction between autonomous motivation (i.e., intrinsic motivation and fully internalized extrinsic motivation) and controlled motivation (i.e., externally and internally controlled extrinsic motivation), as well as (b) the postulate that all employees have three basic psychological needs—for competence, autonomy, and relatedness—the satisfaction of which promotes autonomous motivation, high-quality performance, and wellness. Research in work organizations has tended to take the perspectives of either the employees (i.e., their well-being) or the owners (i.e., their profits). SDT provides the concepts that guide the creation of policies, practices, and environments that promote both wellness and high-quality performance.

1,089 citations


Book ChapterDOI
TL;DR: The basic law of corporate governance has achieved a high degree of uniformity across developed market jurisdictions, and continuing convergence toward a single, standard model is likely, as discussed by the authors; it is sometimes said that the shareholder-oriented model of corporate law is well suited only to those jurisdictions in which one finds large numbers of firms with widely dispersed share ownership, such as the United States and the United Kingdom.
Abstract: The basic law of corporate governance—indeed, most of corporate law—has achieved a high degree of uniformity across developed market jurisdictions, and continuing convergence toward a single, standard model is likely. It is sometimes said that the shareholder-oriented model of corporate law is well suited only to those jurisdictions in which one finds large numbers of firms with widely dispersed share ownership, such as the United States and the United Kingdom. The core legal features of the corporate form were already well established in advanced jurisdictions one hundred years ago, at the turn of the twentieth century. Thus, just as there was rapid crystallization of the core features of the corporate form in the late nineteenth century, at the beginning of the twenty-first century we are witnessing rapid convergence on the standard shareholder-oriented model as a normative view of corporate structure and governance.

1,080 citations


Book ChapterDOI
TL;DR: In this paper, the authors outline the development of the idea of "stakeholder management" as it has come to be applied in strategic management and suggest several related characteristics that serve as distinguishing features.
Abstract: The purpose of this chapter is to outline the development of the idea of "stakeholder management" as it has come to be applied in strategic management. We begin by developing a brief history of the concept. We then suggest that traditionally the stakeholder approach to strategic management has several related characteristics that serve as distinguishing features. We review recent work on stakeholder theory and suggest how stakeholder management has affected the practice of management. We end by suggesting further research questions.

1,066 citations


Book ChapterDOI
TL;DR: In this article, international law is described as a social historical legal tradition that emerged and spread over time to deal with matters between and across polities; the article also sketches the history of interaction between and across polities and how it has been managed.
Abstract: International law is a social historical legal tradition that emerged and spread over time to deal with matters between and across polities. This statement may appear obvious, but its full implications point to a thorough reconstruction of theoretical accounts of international law. Part I recounts how Bentham inadvertently created an enduring set of theoretical problems for international law. Part II describes international law as a social historical legal tradition, showing its European origins and diffusion with imperialism, and exposing three slants in international law. Part III broadens the lens to sketch the history of interaction between and across polities and how this has been managed. Part IV details contemporary efforts to deal with this interaction through organizations and transnational law and regulation. With this background in place, Part V elaborates a series of theoretical clarifications. First I unravel several confusions that result from construing state law and international law as parallel categories and conflating system with category. Then I explain why international law is a form of law, although not a unified hierarchical system. Contrary to common perceptions, furthermore, I show that state law and international law are not and have never been separate systems. Finally, I clarify the relationship between international law and transnational law and regulation. Aspects of this theoretical reconstruction may initially appear surprising, but they follow from the insight that international law is a social historical tradition.

696 citations


OtherDOI
TL;DR: In this article, the authors compared Jaffe's work on the use of patents as a measure of the spillover of university research with the work of Acs and Audretsch, in which innovative activity is measured by the number of innovations.
Abstract: Compares Jaffe's work on the use of patents as a measure of the spillover of university research with the work of Acs and Audretsch, in which innovative activity is measured by the number of innovations. Jaffe's work, which modified the knowledge production function proposed by Griliches, showed a positive relationship between corporate patent activity and commercial spillovers from university research. This research approach was criticized by many. In 1987, Acs and Audretsch proposed measuring innovative activity by the number of innovations recorded in 1982 by the U.S. Small Business Administration. The innovation count was believed to provide a more direct measure than Jaffe's patent counts, because inventions that were not patented but were introduced into the market were counted, while inventions that were patented but never introduced were not. This analysis seeks to compare the two works. Jaffe used a pool of data that spanned an eight-year period, while Acs and Audretsch considered a single year, 1982. It is shown that using a single-year sample in Jaffe's model does not greatly alter the results: both private corporate expenditures on R&D and university expenditures on research positively and significantly influence patent activity. The impact of university spillovers is greater on innovations than on patents using Jaffe's model. By directly substituting the innovation measure for the patent measure, this research approach shows further support for Jaffe's findings and arguments.
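Both studies estimate variants of a Griliches-type knowledge production function. In stylized form (variable names are mine, for illustration), an innovation-output measure for observation i is regressed on corporate R&D and university research spending:

```latex
\log O_i = \alpha + \beta_1 \log RD_i + \beta_2 \log UR_i + \varepsilon_i
```

In Jaffe's specification O_i is a count of corporate patents; Acs and Audretsch substitute the SBA innovation count, and the comparison above asks whether the university-research coefficient remains positive and significant under that substitution.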

675 citations


Journal ArticleDOI
TL;DR: It is shown that CSR decreases systematic risk and increases firm value and these effects are stronger for firms producing differentiated goods and when consumers' expenditure share on CSR goods is small.
Abstract: This paper presents an industry equilibrium model where firms have a choice to engage in corporate social responsibility (CSR) activities. We model CSR activities as a product differentiation strategy allowing firms to benefit from higher profit margins. The model predicts that CSR decreases systematic risk and increases firm value and that these effects are stronger for firms operating in differentiated goods industries and when consumers' expenditure share on CSR goods is small. We find supporting evidence for our predictions. We address a potential endogeneity problem by instrumenting CSR using data on the political affiliation of the firm's home state.

553 citations


Journal ArticleDOI
TL;DR: The first-order positive psychological resources that make up PsyCap include hope, efficacy, resilience, and optimism, or the HERO within, as discussed in this article; these four best meet the inclusion criteria of being theory- and research-based, positive, validly measurable, state-like, and having impact on attitudes, behaviors, performance and well-being.
Abstract: The now recognized core construct of psychological capital, or simply PsyCap, draws from positive psychology in general and positive organizational behavior (POB) in particular. The first-order positive psychological resources that make up PsyCap include hope, efficacy, resilience, and optimism, or the HERO within. These four best meet the inclusion criteria of being theory- and research-based, positive, validly measurable, state-like, and having impact on attitudes, behaviors, performance and well-being. The article first provides the background and precise meaning of PsyCap and then comprehensively reviews its measures, theoretical mechanisms, antecedents and outcomes, levels of analysis, current status and needed research, and finally application. Particular emphasis is given to practical implications, which focus on PsyCap development, positive leadership, and novel applications such as the use of video games and gamification techniques. The overriding theme throughout is that PsyCap has both scientific rigor and practical relevance.

551 citations


Journal ArticleDOI
TL;DR: Integrated nested Laplace approximations (INLA), as discussed in this paper, build a nested version of the classical Laplace method, which approximates the integrand with a second-order Taylor expansion around the mode and computes the integral analytically, to perform approximate Bayesian inference for latent Gaussian models.
Abstract: The key operation in Bayesian inference is to compute high-dimensional integrals. An old approximate technique is the Laplace method or approximation, which dates back to Pierre-Simon Laplace (1774). This simple idea approximates the integrand with a second-order Taylor expansion around the mode and computes the integral analytically. By developing a nested version of this classical idea, combined with modern numerical techniques for sparse matrices, we obtain the approach of integrated nested Laplace approximations (INLA) to do approximate Bayesian inference for latent Gaussian models (LGMs). LGMs represent an important model abstraction for Bayesian inference and include a large proportion of the statistical models used today. In this review, we discuss the reasons for the success of the INLA approach, the R-INLA package, why it is so accurate, why the approximations are very quick to compute, and why LGMs make such a useful concept for Bayesian computing.
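The classical Laplace idea described above is easy to demonstrate numerically. The toy sketch below (my own Python example, unrelated to the R-INLA package) approximates a one-dimensional integral of exp(g(x)) using a second-order expansion of g around its mode and compares the result with quadrature.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.integrate import quad

# Toy integrand exp(g(x)) with a unimodal log-density g.
def g(x):
    return -0.5 * x**2 - x**4

# 1. Locate the mode of g.
x0 = minimize_scalar(lambda x: -g(x)).x

# 2. Second derivative of g at the mode (central difference).
h = 1e-4
g2 = (g(x0 + h) - 2.0 * g(x0) + g(x0 - h)) / h**2

# 3. Laplace approximation: integral of exp(g) ~ exp(g(x0)) * sqrt(2*pi / -g''(x0)).
laplace = np.exp(g(x0)) * np.sqrt(2.0 * np.pi / -g2)

# 4. Reference value from numerical quadrature.
exact, _ = quad(lambda x: np.exp(g(x)), -np.inf, np.inf)
print(f"Laplace approximation: {laplace:.4f}   quadrature: {exact:.4f}")
```

INLA nests approximations of this type within the latent Gaussian model structure, combined with sparse-matrix numerics, rather than applying a single global expansion.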

Book ChapterDOI
TL;DR: The notion of legal pluralism is gaining momentum across a range of law-related fields, as mentioned in this paper, which portrays its rich history from the medieval period up to the present.
Abstract: The notion of legal pluralism is gaining momentum across a range of law-related fields. Part 1 of this article will portray the rich history of legal pluralism, from the medieval period up to the present. Part II will explain why current theoretical efforts to formulate legal pluralism are plagued by the difficulty of defining “law.” Finally, Part III will articulate an approach to contemporary legal pluralism that avoids the conceptual problems suffered by most current approaches, while framing the salient features of legal pluralism.

Journal ArticleDOI
TL;DR: In this paper, a task-based framework is proposed to characterize the equilibrium in a dynamic setting where tasks previously performed by labor can be automated and more complex versions of existing tasks, in which labor has a comparative advantage, can be created.
Abstract: The advent of automation and the simultaneous decline in the labor share and employment among advanced economies raise concerns that labor will be marginalized and made redundant by new technologies. We examine this proposition using a task-based framework in which tasks previously performed by labor can be automated and more complex versions of existing tasks, in which labor has a comparative advantage, can be created. We characterize the equilibrium in this model and establish how the available technologies and the choices of firms between producing with capital or labor determine factor prices and the allocation of factors to tasks. In a static version of our model where capital is fixed and technology is exogenous, automation reduces employment and the share of labor in national income and may even reduce wages, while the creation of more complex tasks has the opposite effects. Our full model endogenizes capital accumulation and the direction of research towards automation and the creation of new complex tasks. Under reasonable conditions, there exists a stable balanced growth path in which the two types of innovations go hand-in-hand. An increase in automation reduces the cost of producing using labor, and thus discourages further automation and encourages the faster creation of new complex tasks. The endogenous response of technology restores the labor share and employment back to their initial level. Although the economy contains powerful self correcting forces, the equilibrium generates too much automation. Finally, we extend the model to include workers of different skills. We find that inequality increases during transitions, but the self-correcting forces in our model also limit the increase in inequality over the long-run.
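A stylized rendering of the task-based structure (my notation; the paper's full dynamic model is richer): output aggregates a continuum of tasks indexed on [N-1, N], and tasks at or below the automation threshold I can be produced with capital or labor, while tasks above I require labor.

```latex
Y = \left( \int_{N-1}^{N} y(i)^{\frac{\sigma-1}{\sigma}} \, di \right)^{\frac{\sigma}{\sigma-1}},
\qquad
y(i) =
\begin{cases}
\gamma_K(i)\, k(i) + \gamma_L(i)\, l(i), & i \le I \\
\gamma_L(i)\, l(i), & i > I
\end{cases}
```

In this notation, automation raises I and the creation of new complex tasks raises N, the margin on which labor retains a comparative advantage.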

Journal ArticleDOI
TL;DR: The results indicate that escapism, acquiring knowledge about the games being played, novelty, and eSports athlete aggressiveness positively predict eSports spectating frequency.
Abstract: Purpose: In this study we investigate why people spectate eSports on the internet. We define eSports (electronic sports) as a form of sports where the primary aspects of the sport are facilitated by electronic systems; the input of players and teams as well as the output of the eSports system are mediated by human-computer interfaces. In more practical terms, eSports refers to competitive video gaming (broadcasted on the internet). Methodology: We employed the MSSC (Motivations Scale for Sports Consumption), which is one of the most widely applied measurement instruments for sports consumption in general. The questionnaire was designed and pre-tested before being distributed to target respondents (N=888). The reliability and validity of the instrument both met the commonly accepted guidelines. The model was assessed first by examining its measurement model and then the structural model. Findings: The results indicate that escapism, acquiring knowledge about the games being played, novelty, and eSports athlete aggressiveness positively predict eSports spectating frequency. Originality: During recent years, eSports (electronic sports) and video game streaming have become rapidly growing forms of new media on the internet, driven by the growing prevalence of (online) games and online broadcasting technologies. Today, hundreds of millions of people spectate eSports. The present investigation reports a large study on gratification-related determinants of why people spectate eSports on the internet. Moreover, the study proposes a definition for eSports and further discusses how eSports can be seen as a form of sports.

Posted Content
TL;DR: The authors summarize the major themes and contributions driving the empirical literature since their 2011 reviews, and try to interpret the literature in light of an overarching conceptual framework about how human capital is produced early in life.
Abstract: That prenatal events can have life-long consequences is now well established. Nevertheless, research on the Fetal Origins Hypothesis is flourishing and has expanded to include the early childhood (postnatal) environment. Why does this literature have a “second act”? We summarize the major themes and contributions driving the empirical literature since our 2011 reviews, and try to interpret the literature in light of an overarching conceptual framework about how human capital is produced early in life. One major finding is that relatively mild shocks in early life can have substantial negative impacts, but that the effects are often heterogeneous, reflecting differences in child endowments, budget constraints, and production technologies. Moreover, shocks, investments, and interventions can interact in complex ways that are only beginning to be understood. Many advances in our knowledge are due to increasing accessibility of comprehensive administrative data that allow events in early life to be linked to long-term outcomes. Yet we still know relatively little about the interval between early childhood and adulthood, and thus about whether it would be feasible to identify and intervene with affected individuals at some point between early life and adulthood. We do know enough, however, to be able to identify some interventions that hold promise for improving child outcomes in early life and throughout the life course.

Journal ArticleDOI
TL;DR: It is found that inclusion of widely used content related to brand personality is associated with higher levels of consumer engagement (Likes, comments, shares) with a message, and that certain directly informative content, such as deals and promotions, drives consumers’ path to conversion (click-throughs).
Abstract: We describe the effects of social media advertising content on customer engagement using Facebook data. We content-code more than 100,000 messages across 800 companies using a combination of Amazon Mechanical Turk and state-of-the-art Natural Language Processing and machine learning algorithms. We use this large-scale dataset of content attributes to describe the association of various kinds of social media marketing content with user engagement - defined as Likes, comments, shares, and click-throughs - with the messages. We find that inclusion of widely used content related to brand-personality - like humor, emotion and brand’s philanthropic positioning - is associated with higher levels of consumer engagement (like, comment, share) with a message. We find that directly informative content - like mentions of prices and availability - is associated with lower levels of engagement when included in messages in isolation, but higher engagement levels when provided in combination with brand-personality content. We also find certain directly informative content such as the mention of deals and promotions drive consumers’ path-to-conversion (click-throughs). These results hold after correcting for the non-random targeting of Facebook’s EdgeRank (News Feed) algorithm, so reflect more closely user reaction to content, rather than Facebook’s behavioral targeting. Our results suggest therefore that there may be substantial gains from content engineering by combining informative characteristics associated with immediate leads (via improved click-throughs) with brand-personality related content that help maintain future reach and branding on the social media site (via improved engagement). These results inform content design strategies in social media. Separately, the methodology we apply to content-code large-scale textual data provides a framework for future studies on unstructured data such as advertising content or product reviews.
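As a rough illustration of the association being described, the sketch below regresses an engagement count on binary content attributes, including an interaction between informative and brand-personality content. The data and column names are hypothetical, and the paper's actual estimation (including the EdgeRank correction) is far more involved.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical message-level data: one row per post, binary content codes,
# and an engagement count (the study codes over 100,000 real messages).
posts = pd.DataFrame({
    "likes":      [120, 45, 300, 10, 80, 220, 15, 95],
    "humor":      [1, 0, 1, 0, 0, 1, 0, 1],
    "price_info": [0, 1, 0, 1, 0, 1, 1, 0],
})

# Poisson regression of engagement on content attributes and their interaction,
# mirroring the idea that informative content performs better when combined
# with brand-personality content.
fit = smf.poisson("likes ~ humor + price_info + humor:price_info", data=posts).fit()
print(fit.summary())
```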

Journal ArticleDOI
TL;DR: NewGene is software designed to eliminate many of the difficulties commonly involved in constructing large international relations data sets by providing a highly flexible platform on which users can construct datasets for international relations research using pre-loaded data or by incorporating their own data.
Abstract: This paper introduces a complete redesign of the popular EUGene software, called NewGene. Like EUGene, NewGene is software designed to eliminate many of the difficulties commonly involved in constructing large international relations data sets. NewGene is a stand-alone Microsoft Windows and OS X based program for the construction of annual, monthly, and daily data sets for the variety of decision-making units (e.g. countries, leaders, organizations, etc.) used in quantitative studies of international relations. It also provides users the ability to construct units of analysis ranging from monads (e.g. country-year), to dyads (e.g. country1-country2-year), to extra-dyadic observations called k-ads (e.g. country1-country2-…-countryk-year). NewGene’s purpose is to provide a highly flexible platform on which users can construct datasets for international relations research using pre-loaded data or by incorporating their own data.
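The units of analysis described here are easy to picture. The short Python sketch below (independent of NewGene itself, with illustrative country codes) builds country-year monads, directed dyad-years, and 3-ad observations from a country list:

```python
from itertools import combinations, permutations

countries = ["USA", "CHN", "RUS"]      # illustrative country codes
years = range(2000, 2003)

# Monads: one observation per country-year.
monads = [(c, y) for c in countries for y in years]

# Directed dyads: one observation per ordered country pair per year.
dyad_years = [(c1, c2, y) for (c1, c2) in permutations(countries, 2) for y in years]

# k-ads with k = 3: one observation per unordered country triple per year.
k_ads = [(*group, y) for group in combinations(countries, 3) for y in years]

print(len(monads), len(dyad_years), len(k_ads))   # 9 18 3
```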

Journal ArticleDOI
TL;DR: The authors showed that when the Fed funds rate increases, banks widen the interest spreads they charge on deposits and deposits flow out of the banking system, with both effects stronger in areas with less deposit competition.
Abstract: We propose and test a new channel for the transmission of monetary policy. We show that when the Fed funds rate increases, banks widen the interest spreads they charge on deposits, and deposits flow out of the banking system. We present a model in which imperfect competition among banks gives rise to these relationships. An increase in the nominal interest rate increases banks' effective market power, inducing them to increase deposit spreads. Households respond by substituting away from deposits into less liquid but higher-yielding assets. Using branch-level data on all U.S. banks, we show that following an increase in the Fed funds rate, deposit spreads increase by more, and deposit supply falls by more, in areas with less deposit competition. We control for changes in banks' lending opportunities by comparing branches of the same bank. We control for changes in macroeconomic conditions by showing that deposit spreads widen immediately after a rate change, even if it is fully expected. Our results imply that monetary policy has a significant impact on how the financial system is funded, on the quantity of safe and liquid assets it produces, and on its lending to the real economy.

Journal ArticleDOI
TL;DR: This article found that an increase in the household debt to GDP ratio predicts lower subsequent GDP growth and higher unemployment in an unbalanced panel of 30 countries from 1960 to 2012, and uncover a global household debt cycle that partly predicts the severity of the global growth slowdown after 2007.
Abstract: An increase in the household debt to GDP ratio predicts lower subsequent GDP growth and higher unemployment in an unbalanced panel of 30 countries from 1960 to 2012. Low mortgage spreads are associated with an increase in the household debt to GDP ratio and a decline in subsequent GDP growth, highlighting the importance of credit supply shocks. Economic forecasters systematically over-predict GDP growth at the end of household debt booms, suggesting an important role of flawed expectations formation. The negative relation between the change in household debt to GDP and subsequent output growth is stronger for countries with less flexible exchange rate regimes and those closer to the zero lower bound on nominal interest rates. We also uncover a global household debt cycle that partly predicts the severity of the global growth slowdown after 2007. Countries with a household debt cycle more correlated with the global household debt cycle experience a sharper decline in growth after an increase in domestic household debt.
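The headline predictive relation can be written as a simple panel regression. The specification below is a stylized sketch with country fixed effects (notation and lag structure are illustrative, not the paper's exact estimating equation):

```latex
\Delta_3 \log y_{i,t+3} = \alpha_i + \beta \, \Delta_3 d^{HH}_{i,t} + \varepsilon_{i,t+3}
```

Here d^{HH}_{i,t} is country i's household debt-to-GDP ratio, and a negative estimate of \beta corresponds to the finding that household debt expansions predict lower subsequent growth.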

Journal ArticleDOI
TL;DR: The authors examined the existence and dates of pricing bubbles in Bitcoin and Ethereum, two popular cryptocurrencies, using the Phillips et al. (2011) methodology and concluded that Bitcoin is now almost certainly in a bubble phase.
Abstract: We examine the existence and dates of pricing bubbles in Bitcoin and Ethereum, two popular cryptocurrencies, using the Phillips et al. (2011) methodology. In contrast to previous papers, we examine the fundamental drivers of the price. Having derived ratios that are economically and computationally sensible, we use these variables to detect and datestamp bubbles. Our conclusion is that there are periods of clear bubble behaviour, with Bitcoin now almost certainly in a bubble phase.
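The Phillips et al. (2011) procedure referenced here is a right-tailed, recursively estimated unit-root test. In stylized form it computes ADF statistics over forward-expanding sample windows and takes their supremum, flagging and dating a bubble episode when the statistic exceeds its right-tail critical value:

```latex
SADF(r_0) = \sup_{r_2 \in [r_0,\, 1]} ADF_{0}^{\,r_2}
```

where ADF_0^{r_2} is the ADF statistic estimated on the sample fraction [0, r_2] and r_0 is the minimum window size; this is a stylized statement of the test rather than a quotation from the paper.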

Journal ArticleDOI
TL;DR: In this article, the effects of diversity in the board of directors on corporate policies and risk were examined using a multi-dimensional measure, and it was found that greater board diversity leads to lower volatility and better performance.
Abstract: We examine the effects of diversity in the board of directors on corporate policies and risk. Using a multi-dimensional measure, we find that greater board diversity leads to lower volatility and better performance. Lower risk levels are largely due to diverse boards adopting less risky financial policies. However, consistent with diversity fostering more efficient (real) risk-taking, firms with greater board diversity invest more in R&D and have more efficient innovation processes. Instrumental variable tests that exploit exogenous variation in firm access to the supply of diverse nonlocal directors indicate that these relations are causal.

Journal ArticleDOI
TL;DR: The authors empirically measured changes in the constituency of an organizational field centered around the issue of corporate environmentalism from 1960 to 1993, and correlated those changes with the evolving institutions adopted by the US chemical industry to interpret the issue.
Abstract: This paper empirically measures changes in the constituency of an organizational field centered around the issue of corporate environmentalism from 1960 to 1993, and correlates those changes with the evolving institutions adopted by the US chemical industry to interpret the issue. Four stages are identified, each representing a different field membership, interaction pattern and set of dominant institutions. The beginning of each stage is marked by the emergence of a triggering event. The article develops the ideas that: fields form around central issues, not markets or technologies; within fields, competing institutions may simultaneously exist within individual populations (or classes of constituencies); as institutions evolve, inter-connections between their regulative, normative and cognitive aspects can be detected; and field-level analyses can reveal the cultural and institutional origins of organizational impacts on the natural environment. The article concludes with future research challenges in understanding the dynamics by which events influence institutional change processes and the role of institutional entrepreneurs in channeling that influence.

Journal ArticleDOI
TL;DR: This paper reviewed the literature on alternative work arrangements published since the most recent major review of nonstandard work by Ashford et al. (2007) and identified three dimensions of flexibility that undergird alternative work arrangements: flexibility in the employment relationship, flexibility in the scheduling of work, and flexibility in where work is accomplished.
Abstract: Alternative work arrangements continue to increase in number and variety. We review the literature on alternative work arrangements published since the most recent major review of nonstandard work by Ashford et al. (2007). We look across the research findings to identify three dimensions of flexibility that undergird alternative work arrangements: (a) flexibility in the employment relationship, (b) flexibility in the scheduling of work, and (c) flexibility in where work is accomplished. We identify two images of the new world of work—one for high-skill workers who choose alternative work arrangements and the other for low-skill workers who struggle to make a living and are beholden to the needs of the organization. We close with future directions for research and practice for tending to the first image and moving away from the second image of the new world of work.

Journal ArticleDOI
TL;DR: In this paper, the authors show that in the presence of unit and time fixed effects, it is impossible to identify the linear component of the path of pre-trends and dynamic treatment effects.
Abstract: A broad empirical literature uses "event study" research designs for treatment effect estimation, a setting in which all units in the panel receive treatment but at random times. We make four novel points about identification and estimation of causal effects in this setting and show their practical relevance. First, we show that in the presence of unit and time fixed effects, it is impossible to identify the linear component of the path of pre-trends and dynamic treatment effects. Second, we propose graphical and statistical tests for pre-trends. Third, we consider commonly-used "static" regressions, with a treatment dummy instead of a full set of leads and lags around the treatment event, and we show that OLS does not recover a weighted average of the treatment effects: long-term effects are weighted negatively, and we introduce a different estimator that is robust to this issue. Fourth, we show that equivalent problems of under-identification and negative weighting arise in difference-in-differences settings when the control group is allowed to be on a different time trend or in the presence of unit-specific time trends. Finally, we show the practical relevance of these issues in a series of examples from the existing literature, with a focus on the estimation of the marginal propensity to consume out of tax rebates.
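The "event study" regressions discussed here are typically specified with unit and time fixed effects plus a full set of leads and lags around each unit's treatment date E_i; a generic version (notation mine) is:

```latex
y_{it} = \alpha_i + \gamma_t + \sum_{k \neq -1} \beta_k \, \mathbf{1}\{\, t - E_i = k \,\} + \varepsilon_{it}
```

When every unit is eventually treated, any linear function of k added to the \beta_k path can be absorbed into \alpha_i and \gamma_t, which is the under-identification of the linear component of pre-trends and dynamic effects stated above.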

Journal ArticleDOI
TL;DR: The results suggest that belief in fake news may be driven, to some extent, by a general tendency to be overly accepting of weak claims, which may be partly responsible for the prevalence of epistemically suspect beliefs writ large.
Abstract: Objective: Fake news represents a particularly egregious and direct avenue by which inaccurate beliefs have been propagated via social media. We investigate the psychological profile of individuals who fall prey to fake news. Method: We recruited 1,606 participants from Amazon’s Mechanical Turk for three online surveys. Results: The tendency to ascribe profundity to randomly generated sentences – pseudo-profound bullshit receptivity – correlates positively with perceptions of fake news accuracy, and negatively with the ability to differentiate between fake and real news (media truth discernment). Relatedly, individuals who overclaim their level of knowledge also judge fake news to be more accurate. We also extend previous research indicating that analytic thinking correlates negatively with perceived accuracy by showing that this relationship is not moderated by the presence/absence of the headline’s source (which has no effect on accuracy), or by familiarity with the headlines (which correlates positively with perceived accuracy of fake and real news). Conclusion: Our results suggest that belief in fake news may be driven, to some extent, by a general tendency to be overly accepting of weak claims. This tendency, which we refer to as reflexive open-mindedness, may be partly responsible for the prevalence of epistemically suspect beliefs writ large.

Journal ArticleDOI
TL;DR: In this article, the authors identify high quality liquidity proxies based on low-frequency (daily) data, which provide 1,000X to 10,000X computational savings compared to computing high-frequency (intraday) liquidity measures.
Abstract: Liquidity plays an important role in global research. We identify high quality liquidity proxies based on low-frequency (daily) data, which provide 1,000X to 10,000X computational savings compared to computing high-frequency (intraday) liquidity measures. We find that: (1) Closing Percent Quoted Spread is the best monthly percent-cost proxy when available, (2) Amihud, Closing Percent Quoted Spread Impact, LOT Mixed Impact, High-Low Impact, and FHT Impact are tied as the best monthly cost-per-dollar-volume proxy, (3) the daily version of Closing Percent Quoted Spread is the best daily percent-cost proxy, and (4) the daily version of Amihud is the best daily cost-per-dollar-volume proxy.
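For reference, the two headline proxies have simple closed forms, stated here from their standard definitions rather than quoted from the paper. The Closing Percent Quoted Spread for day d uses closing bid and ask quotes, and the Amihud measure averages absolute return per unit of dollar volume over the D trading days in the month:

```latex
\text{ClosingPctQuotedSpread}_d = \frac{Ask_d - Bid_d}{(Ask_d + Bid_d)/2},
\qquad
\text{Amihud} = \frac{1}{D} \sum_{d=1}^{D} \frac{|r_d|}{DollarVolume_d}
```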

Posted Content
Lina Khan
TL;DR: In this article, the authors argue that the current framework in antitrust, specifically its pegging competition to "consumer welfare", defined as short-term price effects, is unequipped to capture the architecture of market power in the modern economy.
Abstract: Amazon is the titan of twenty-first century commerce. In addition to being a retailer, it is now a marketing platform, a delivery and logistics network, a payment service, a credit lender, an auction house, a major book publisher, a producer of television and films, a fashion designer, a hardware manufacturer, and a leading host of cloud server space. Although Amazon has clocked staggering growth, it generates meager profits, choosing to price below-cost and expand widely instead. Through this strategy, the company has positioned itself at the center of e-commerce and now serves as essential infrastructure for a host of other businesses that depend upon it. Elements of the firm’s structure and conduct pose anticompetitive concerns—yet it has escaped antitrust scrutiny. This Note argues that the current framework in antitrust—specifically its pegging competition to “consumer welfare,” defined as short-term price effects—is unequipped to capture the architecture of market power in the modern economy. We cannot cognize the potential harms to competition posed by Amazon’s dominance if we measure competition primarily through price and output. Specifically, current doctrine underappreciates the risk of predatory pricing and how integration across distinct business lines may prove anticompetitive. These concerns are heightened in the context of online platforms for two reasons. First, the economics of platform markets create incentives for a company to pursue growth over profits, a strategy that investors have rewarded. Under these conditions, predatory pricing becomes highly rational—even as existing doctrine treats it as irrational and therefore implausible. Second, because online platforms serve as critical intermediaries, integrating across business lines positions these platforms to control the essential infrastructure on which their rivals depend. This dual role also enables a platform to exploit information collected on companies using its services to undermine them as competitors. This Note maps out facets of Amazon’s dominance. Doing so enables us to make sense of its business strategy, illuminates anticompetitive aspects of Amazon’s structure and conduct, and underscores deficiencies in current doctrine. The Note closes by considering two potential regimes for addressing Amazon’s power: restoring traditional antitrust and competition policy principles or applying common carrier obligations and duties.

Journal ArticleDOI
TL;DR: It is shown that even a single exposure increases subsequent perceptions of accuracy, both within the same session and after a week, and that social media platforms help to incubate belief in blatantly false news stories and that tagging such stories as disputed is not an effective solution to this problem.
Abstract: The 2016 US Presidential Election brought considerable attention to the phenomenon of “fake news”: entirely fabricated and often partisan content that is presented as factual. Here we demonstrate one mechanism that contributes to the believability of fake news: fluency via prior exposure. Using actual fake news headlines presented as they were seen on Facebook, we show that even a single exposure increases subsequent perceptions of accuracy, both within the same session and after a week. Moreover, this “illusory truth effect” for fake news headlines occurs despite a low level of overall believability, and even when the stories are labeled as contested by fact-checkers or are inconsistent with the reader’s political ideology. These results suggest that social media platforms help to incubate belief in blatantly false news stories, and that tagging such stories as disputed is not an effective solution to this problem. Interestingly, however, we also find that prior exposure does not impact entirely implausible statements (e.g., “The Earth is a perfect square”). These observations indicate that although extreme implausibility is a boundary condition of the illusory truth effect, only a small degree of potential plausibility is sufficient for repetition to increase perceived accuracy. As a consequence, the scope and impact of repetition on beliefs is greater than previously assumed.

Posted Content
Lani Guinier
TL;DR: Brown v. Board of Education no longer enjoys the unbridled admiration it once earned from academic commentators as discussed by the authors, and the will to support public education from kindergarten through twelfth grade appears to be eroding despite growing awareness of education's importance in a knowledge-based society.
Abstract: On its fiftieth anniversary, Brown v. Board of Education no longer enjoys the unbridled admiration it once earned from academic commentators. Early on, the conventional wisdom was that the courageous social engineers from the National Association for the Advancement of Colored People Legal Defense and Educational Fund (NAACP LDEF), whose inventive lawyering brought the case to fruition, had caused a social revolution. Legal academics and lawyers still widely acclaim the Brown decision as one of the most important Supreme Court cases in the twentieth century, if not since the founding of our constitutional republic. Brown's exalted status in the constitutional canon is unimpeachable, yet over time its legacy has become complicated and ambiguous. The fact is that fifty years later, many of the social, political, and economic problems that the legally trained social engineers thought the Court had addressed through Brown are still deeply embedded in our society. Blacks lag behind whites in multiple measures of educational achievement, and within the black community, boys are falling further behind than girls. In addition, the will to support public education from kindergarten through twelfth grade appears to be eroding despite growing awareness of education's importance in a knowledge-based society. In the Boston metropolitan area in 2003, poor people of color were at least three times more likely than poor whites to live in severely distressed, racially stratified urban neighborhoods. Whereas poor, working-class, and middle-income whites often lived together in economically stable suburban communities, black families with incomes above $50,000 were twice as likely as white households earning less than $20,000 to live in neighborhoods with high rates of crime and concentrations of poverty. Even in the so-called liberal North, race still segregates more than class.

Journal ArticleDOI
TL;DR: For instance, this paper found that adults view Black girls as less innocent and more adult-like than their white peers, especially in the age range of 5-14, which may contribute to harsher punishment by educators and school resource officers.
Abstract: This study by the Georgetown Law Center on Poverty and Inequality provides—for the first time—data showing that adults view Black girls as less innocent and more adult-like than their white peers, especially in the age range of 5–14. The perception of Black girls as less innocent may contribute to harsher punishment by educators and school resource officers. Furthermore, the view that Black girls need less nurturing, protection, and support and are more independent may translate into fewer leadership and mentorship opportunities in schools. The perception of Black girls as less innocent and more adult-like may contribute to more punitive exercise of discretion by those in positions of authority, greater use of force, and harsher penalties.