
Showing papers in "Social Science Research Network in 2019"


Book ChapterDOI
TL;DR: This paper argued that the well-worn constructs of firm performance or success and failure of the individual entrepreneur do not provide the field the clarity of purpose and unique domain it desires, and that the context of small business is not what will bring singular clarity for the field.
Abstract: In this chapter an argument is made for a clear articulation for the exclusive domain of entrepreneurship research. To date, the entrepreneurship academic community has neglected to define clear boundaries as to what distinguishes entrepreneurship scholarship from other closely related fields. It is argued here that the well-worn constructs of firm performance or success and failure of the individual entrepreneur do not provide the field the clarity of purpose and unique domain it desires. Similarly, the context of small business is not what will bring singular clarity for the field. Instead, this chapter argues that entrepreneurship research should be focused upon understanding how opportunities to bring future goods and services into existence occur.

1,373 citations


Journal ArticleDOI
TL;DR: The authors survey reinforcement learning from the perspective of optimization and control, with a focus on continuous control applications, and review the general formulation, terminology, and techniques of reinforcement learning for continuous control.
Abstract: This article surveys reinforcement learning from the perspective of optimization and control, with a focus on continuous control applications. It reviews the general formulation, terminology, and techniques of reinforcement learning for continuous control.
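This optimization-and-control framing is easiest to ground in the linear quadratic regulator (LQR), the canonical continuous-control benchmark. The following minimal sketch (mine, not the article's code; the system matrices are illustrative assumptions) computes the optimal state-feedback gain by iterating the discrete-time Riccati equation:

```python
import numpy as np

# Illustrative double-integrator dynamics x_{t+1} = A x_t + B u_t
# (matrices are assumptions for this sketch, not from the article)
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])
Q = np.eye(2)          # state cost
R = np.array([[1.0]])  # control cost

# Fixed-point iteration on the discrete-time Riccati equation:
# P <- Q + A'PA - A'PB (R + B'PB)^{-1} B'PA
P = Q.copy()
for _ in range(500):
    BtP = B.T @ P
    P = Q + A.T @ P @ A - A.T @ P @ B @ np.linalg.solve(R + BtP @ B, BtP @ A)

K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)  # optimal control u = -K x
print("LQR gain K:", K)
```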

496 citations


Journal ArticleDOI
F. Kyle Satterstrom, Jack A. Kosmicki, Jiebiao Wang, Michael S. Breen, +150 more (45 institutions)
TL;DR: Using an enhanced Bayesian framework to integrate de novo and case-control rare variation, 102 risk genes are identified at a false discovery rate of ≤ 0.1, consistent with multiple paths to an excitatory/inhibitory imbalance underlying ASD.
Abstract: We present the largest exome sequencing study of autism spectrum disorder (ASD) to date (n=35,584 total samples, 11,986 with ASD). Using an enhanced Bayesian framework to integrate de novo and case-control rare variation, we identify 102 risk genes at a false discovery rate ≤ 0.1. Of these genes, 49 show higher frequencies of disruptive de novo variants in individuals ascertained for severe neurodevelopmental delay, while 53 show higher frequencies in individuals ascertained for ASD; comparing ASD cases with mutations in these groups reveals phenotypic differences. Expressed early in brain development, most of the risk genes have roles in regulation of gene expression or neuronal communication (i.e., mutations effect neurodevelopmental and neurophysiological changes), and 13 fall within loci recurrently hit by copy number variants. In human cortex single-cell gene expression data, expression of risk genes is enriched in both excitatory and inhibitory neuronal lineages, consistent with multiple paths to an excitatory/inhibitory imbalance underlying ASD.
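The study's enhanced Bayesian framework cannot be reconstructed from the abstract, but the meaning of the reported "false discovery rate ≤ 0.1" threshold can be illustrated with the simpler frequentist Benjamini-Hochberg procedure on hypothetical per-gene p-values; this is an analogue for intuition only, not the authors' method:

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.1):
    """Return a boolean mask of discoveries at FDR level q (BH procedure)."""
    p = np.asarray(pvals)
    n = len(p)
    order = np.argsort(p)
    ranked = p[order]
    # Find the largest k with p_(k) <= q * k / n
    thresh = q * (np.arange(1, n + 1) / n)
    below = ranked <= thresh
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    mask = np.zeros(n, dtype=bool)
    mask[order[:k]] = True
    return mask

# Hypothetical per-gene p-values (illustrative only)
pvals = [0.0001, 0.002, 0.03, 0.04, 0.2, 0.5, 0.8]
print(benjamini_hochberg(pvals, q=0.1))
```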

461 citations


Reference EntryDOI
TL;DR: In this paper, the authors examined the effect of sustainability disclosure regulations on firms' disclosure practices and valuations, finding that treated firms were more likely to voluntarily obtain assurance to enhance disclosure credibility and to voluntarily adopt reporting guidelines that enhance disclosure comparability.
Abstract: A key aspect of the governance process inside organizations and markets is the measurement and disclosure of important metrics and information. In this chapter, we examine the effect of sustainability disclosure regulations on firms’ disclosure practices and valuations. Specifically, we explore the implications of regulations mandating the disclosure of environmental, social, and governance (ESG) information in China, Denmark, Malaysia, and South Africa using differences-in-differences estimation with propensity score matched samples. We find that relative to propensity score matched control firms, treated firms significantly increased disclosure following the regulations. We also find that treated firms were more likely to voluntarily receive assurance to enhance disclosure credibility and to voluntarily adopt reporting guidelines that enhance disclosure comparability. These results suggest that even in the absence of a regulation that mandates the adoption of assurance or specific guidelines, firms seek the qualitative properties of comparability and credibility. Instrumental variables analysis suggests that increases in sustainability disclosure driven by the regulation are associated with increases in firm valuations, as reflected in Tobin’s Q. Collectively, the evidence suggests that current efforts to increase transparency around organizations’ impact on society are effective at improving disclosure quantity and quality as well as corporate value.
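As a rough illustration of the paper's design (not the authors' code), a differences-in-differences estimate on a propensity-score-matched sample can be sketched as follows; the data frame, variable names, and effect sizes are all hypothetical:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical firm-year data: 'disclosure' outcome, 'treated' (subject to
# an ESG mandate), 'post' (after the regulation), 'size' as a covariate.
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),
    "post": rng.integers(0, 2, n),
    "size": rng.normal(size=n),
})
df["disclosure"] = (0.5 * df.treated * df.post + 0.3 * df.size
                    + rng.normal(scale=0.5, size=n))

# 1) Propensity score for treatment from observed covariates
ps_model = smf.logit("treated ~ size", data=df).fit(disp=0)
df["pscore"] = ps_model.predict(df)

# 2) Nearest-neighbour match each treated firm to a control on the score
treated = df[df.treated == 1]
controls = df[df.treated == 0].reset_index(drop=True)
idx = [(controls.pscore - p).abs().idxmin() for p in treated.pscore]
matched = pd.concat([treated, controls.loc[idx]])

# 3) Differences-in-differences on the matched sample
did = smf.ols("disclosure ~ treated * post", data=matched).fit()
print(did.params["treated:post"])  # the DiD estimate
```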

396 citations


Journal ArticleDOI
TL;DR: Evidence for accelerated development in amygdala-mPFC circuits was limited but emerged in other metrics of neurodevelopment, and progress in charting neurodevelopmental consequences of adversity requires larger samples, longitudinal designs, and more precise assessments of adversity.
Abstract: An extensive literature on childhood adversity and neurodevelopment has emerged over the past decade. We evaluate two conceptual models of adversity and neurodevelopment-the dimensional model of adversity and stress acceleration model-in a systematic review of 109 studies using MRI-based measures of neural structure and function in children and adolescents. Consistent with the dimensional model, children exposed to threat had reduced amygdala, medial prefrontal cortex (mPFC), and hippocampal volume and heightened amygdala activation to threat in a majority of studies; these patterns were not observed consistently in children exposed to deprivation. In contrast, reduced volume and altered function in frontoparietal regions were observed consistently in children exposed to deprivation but not children exposed to threat. Evidence for accelerated development in amygdala-mPFC circuits was limited but emerged in other metrics of neurodevelopment. Progress in charting neurodevelopmental consequences of adversity requires larger samples, longitudinal designs, and more precise assessments of adversity.

298 citations


Journal ArticleDOI
TL;DR: In this paper, the authors identify two key costs that are affected by distributed ledger technology: 1) the cost of verification; and 2) the cost of networking, and they discuss how blockchain technology and cryptocurrencies will influence the rate and direction of innovation.
Abstract: We rely on economic theory to discuss how blockchain technology and cryptocurrencies will influence the rate and direction of innovation. We identify two key costs that are affected by distributed ledger technology: 1) the cost of verification; and 2) the cost of networking. Markets facilitate the voluntary exchange of goods and services between buyers and sellers. For an exchange to be executed, key attributes of a transaction need to be verified by the parties involved at multiple points in time. Blockchain technology, by allowing market participants to perform costless verification, lowers the costs of auditing transaction information, and allows new marketplaces to emerge. Furthermore, when a distributed ledger is combined with a native cryptographic token (as in Bitcoin), marketplaces can be bootstrapped without the need for traditional trusted intermediaries, lowering the cost of networking. This challenges existing revenue models and incumbents' market power, and opens opportunities for novel approaches to regulation, auctions and the provision of public goods, software, identity and reputation systems.

294 citations


Posted Content
TL;DR: This paper found that public concern about misinformation is making some people more careful about the brands they choose and the content they share online, and that changing behaviour is most apparent among those who are younger and better educated, rather than among older or less privileged groups.
Abstract: The eighth Digital News Report from the Reuters Institute for the Study of Journalism at the University of Oxford reveals that public concern about misinformation is making some people more careful about the brands they choose and the content they share online. The report, which is based on a YouGov survey conducted with 75,000 people in 38 markets, says that changing behaviour is most apparent among those who are younger and better educated, rather than among older or less privileged groups. Across countries over a quarter (26%) say they are relying on more ‘reputable’ sources of news than this time last year – rising to 40% in the US. A further quarter (24%) said they had stopped using sources with a dubious reputation. The report also brings new comparative data on changing online business models, trust, misinformation, the rise of messaging apps and the impact of populism on media usage.

291 citations


Book ChapterDOI
TL;DR: As argued in this paper, the emerging global (not inter-national!) law is a legal order in its own right which should not be measured against the standards of national legal systems; it is not, as is usually understood, an underdeveloped body of law with structural deficiencies in comparison to national law.
Abstract: There are a number of inchoate forms of global law, none of which are the creations of states. In relation to them I wish to develop three arguments:
1. Global law can only be adequately explained by a theory of legal pluralism which turned from the law of colonial societies to the laws of diverse ethnic, cultural and religious communities in modern nation-states. It needs to make another turn - from groups to discourses. It should focus its attention on a new body of law that emerges from various globalization processes in multiple sectors of civil society independently of the laws of the nation states.
2. The emerging global (not inter-national!) law is a legal order in its own right which should not be measured against the standards of national legal systems. It is not - as is usually understood - an underdeveloped body of law which has certain structural deficiencies in comparison to national law. Rather, its peculiar characteristics as fully fledged law distinguish it from the traditional law of the nation states. These characteristics can be explained by differentiation within world society itself. While global law lacks political and institutional support on the global level, it is closely coupled with globalized socio-economic processes.
3. Its relative distance from international politics will not protect global law from re-politicization. On the contrary, the very reconstruction of social and economic transactions as a global legal process undermines its non-political character and is the basis of its repoliticization. Yet this will occur in new and unexpected ways. We can expect global law to become politicized not via traditional political institutions but within the various processes under which law engages in 'structural coupling' with highly specialized discourses.

282 citations


Journal ArticleDOI
TL;DR: The authors examined which issuer and ICO characteristics predict successful real outcomes (increasing issuer employment and avoiding enterprise failure). Success is associated with disclosure, credible commitment to the project, and quality signals.
Abstract: Initial coin offerings (ICOs) have emerged as a new mechanism for entrepreneurial finance, with parallels to initial public offerings, venture capital, and pre-sale crowdfunding. In a sample of more than 1,500 ICOs that collectively raise $12.9 billion, we examine which issuer and ICO characteristics predict successful real outcomes (increasing issuer employment and avoiding enterprise failure). Success is associated with disclosure, credible commitment to the project, and quality signals. An instrumental variables analysis finds that ICO token exchange listing causes higher future employment, indicating that access to token liquidity has important real consequences for the enterprise.
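To make the instrumental-variables logic concrete, here is a hedged two-stage least squares sketch on simulated data; the instrument z and all coefficients are invented for illustration and do not come from the paper:

```python
import numpy as np

# Hypothetical data: does exchange listing (x) raise future employment (y)?
# z is an instrument assumed to shift listing but not employment directly.
rng = np.random.default_rng(1)
n = 1000
z = rng.integers(0, 2, n).astype(float)
u = rng.normal(size=n)                 # unobserved confounder
x = (0.8 * z + 0.5 * u + rng.normal(size=n) > 0.5).astype(float)
y = 1.0 * x + 0.5 * u + rng.normal(size=n)

# Two-stage least squares by hand
Z = np.column_stack([np.ones(n), z])
x_hat = Z @ np.linalg.lstsq(Z, x, rcond=None)[0]   # first stage
X_hat = np.column_stack([np.ones(n), x_hat])
beta = np.linalg.lstsq(X_hat, y, rcond=None)[0]    # second stage
print("IV estimate of listing effect:", beta[1])
```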

236 citations


Posted Content
TL;DR: In this paper, the authors examine the link between unlawful and biased police practices and the data available to train or implement predictive policing systems and highlight three case studies: (1) Chicago, an example where dirty data was ingested directly into the city's predictive system; (2) New Orleans, a case where the extensive evidence of dirty policing practices and recent litigation suggests an extremely high risk that dirty data could be used in predictive policing; and (3) Maricopa County, where despite extensive evidence of dirty policing practices, a lack of public transparency about the details of various predictive policing systems restricts a proper assessment of the risks.
Abstract: Law enforcement agencies are increasingly using predictive policing systems to forecast criminal activity and allocate police resources. Yet in numerous jurisdictions, these systems are built on data produced during documented periods of flawed, racially biased, and sometimes unlawful practices and policies (“dirty policing”). These policing practices and policies shape the environment and the methodology by which data is created, which raises the risk of creating inaccurate, skewed, or systemically biased data (“dirty data”). If predictive policing systems are informed by such data, they cannot escape the legacies of the unlawful or biased policing practices that they are built on. Nor do current claims by predictive policing vendors provide sufficient assurances that their systems adequately mitigate or segregate this data. In our research, we analyze thirteen jurisdictions that have used or developed predictive policing tools while under government commission investigations or federal court monitored settlements, consent decrees, or memoranda of agreement stemming from corrupt, racially biased, or otherwise illegal policing practices. In particular, we examine the link between unlawful and biased police practices and the data available to train or implement these systems. We highlight three case studies: (1) Chicago, an example of where dirty data was ingested directly into the city’s predictive system; (2) New Orleans, an example where the extensive evidence of dirty policing practices and recent litigation suggests an extremely high risk that dirty data was or could be used in predictive policing; and (3) Maricopa County, where despite extensive evidence of dirty policing practices, a lack of public transparency about the details of various predictive policing systems restricts a proper assessment of the risks. The implications of these findings have widespread ramifications for predictive policing writ large. Deploying predictive policing systems in jurisdictions with extensive histories of unlawful police practices presents elevated risks that dirty data will lead to flawed or unlawful predictions, which in turn risk perpetuating additional harm via feedback loops throughout the criminal justice system. The use of predictive policing must be treated with high levels of caution and mechanisms for the public to know, assess, and reject such systems are imperative.

213 citations


Journal ArticleDOI
TL;DR: A dynamic asset-pricing model of (crypto-)tokens on (blockchain-based) platforms and their role in endogenous user adoption is provided; the model produces explosive growth of the user base after an initial period of dormant adoption, accompanied by a run-up of token price volatility.
Abstract: We develop a dynamic asset-pricing model of cryptocurrencies/tokens that allow users to conduct peer-to-peer transactions on digital platforms. The equilibrium value of tokens is determined by aggregating heterogeneous users' transactional demand rather than discounting cashflows as in standard valuation models. Endogenous platform adoption builds upon user network externality and exhibits an S-curve — it starts slow, becomes volatile, and eventually tapers off. Introducing tokens lowers users' transaction costs on the platform by allowing users to capitalize on platform growth. The resulting intertemporal feedback between user adoption and token price accelerates adoption and dampens user-base volatility.

ReportDOI
TL;DR: The authors study how different forms of communication influence the inflation expectations of individuals in a randomized controlled trial and find that reading the actual Federal Open Market Committee (FOMC) statement has about the same average effect on expectations as simply being told about the Federal Reserve's inflation target.
Abstract: We study how different forms of communication influence the inflation expectations of individuals in a randomized controlled trial. We first solicit individuals’ inflation expectations in the Nielsen Homescan panel and then provide eight different forms of information regarding inflation. Reading the actual Federal Open Market Committee (FOMC) statement has about the same average effect on expectations as simply being told about the Federal Reserve’s inflation target. Reading a news article about the most recent FOMC meetings results in a forecast revision which is smaller by half. Our results have implications for how central banks should communicate to the broader public.

Journal ArticleDOI
TL;DR: In this paper, the authors explore how entrepreneurs can use initial coin offerings to fund venture start-up costs and find that venture returns are independent of any committed growth in the supply of tokens over time, but that initial funds raised are maximized by setting that growth to zero to encourage saving by early participants.
Abstract: This paper explores how entrepreneurs can use initial coin offerings - whereby they issue crypto tokens and commit to only accept those tokens as payment for their products - to fund venture start-up costs. We show that the ICO mechanism allows entrepreneurs to generate buyer competition for the token, giving it value. We also find that venture returns are independent of any committed growth in the supply of tokens over time, but that initial funds raised are maximized by setting that growth to zero to encourage saving by early participants. Nonetheless, since the value of the tokens depends on a single period of demand, the ability to raise funds is more limited than in traditional equity finance. Furthermore, a lack of commitment in monetary policy undermines saving behavior, hence the cost of using tokens to fund start-up costs is inflexibility in future capital raises. Crypto tokens can also facilitate coordination among stakeholders within digital ecosystems when network effects are present.

Journal ArticleDOI
TL;DR: This article reports on the state of the art of artificial hands, discusses some of the field's most important trends, suggests directions for future research, and reviews and groups the most important developments in the field.
Abstract: This article reports on the state of the art of artificial hands, discussing some of the field's most important trends and suggesting directions for future research. We review and group the most important developments in the field.

Posted Content
TL;DR: This article recounts the first history of sex in public accommodations law, showing how over the course of the 1970s the feminist movement protested and litigated against sex discrimination in public and secured state laws opening up commerce and leisure for "full and equal enjoyment" by both sexes.
Abstract: This Article recounts the first history of sex in public accommodations law—a history essential to debates that rage today over gender and sexuality in public. Just fifty years ago, not only LGBTQ people but also cisgender women were the subject of discrimination in public. Restaurants and bars displayed “men-only” signs. Women held secondary status in civic organizations, such as Rotary and Jaycees, and were excluded altogether from many professional bodies, such as press clubs. Sports—from the Little League to the golf club—kept girls and women from achieving athletic excellence. Financial institutions subsumed married women’s identities within those of their husbands. Over the course of the 1970s, the feminist movement protested and litigated against sex discrimination in public accommodations. They secured state laws opening up commerce and leisure for “full and equal enjoyment” by both sexes. When “sex” was added to state public accommodations laws, feminists, their opponents, and government actors understood sex equality in public to signify more than equal access to public spaces. It also implicated freedom from the regulation of sexuality and gender performance and held the potential to transform institutions central to dominant masculinity, like baseball fields and bathrooms. This history informs the interpretation of public accommodations laws in controversies from same-sex couples’ wedding cakes to transgender people’s restroom access.

Journal ArticleDOI
TL;DR: Although bilingualism was once thought to be associated with cognitive disadvantages, the authors show that experience with two or more languages confers a bilingual advantage in English-to-Spanish learning.
Abstract: Bilingualism was once thought to result in cognitive disadvantages, but research in recent decades has demonstrated that experience with two (or more) languages confers a bilingual advantage in executive functions.

Book ChapterDOI
TL;DR: In this paper, the authors study capacity management when workers self-schedule: agents have the flexibility to choose when they will or will not work, and they optimize their schedules based on the compensation offered and their individual availability.
Abstract: Motivated by recent innovations in service delivery such as ride-sharing services and work-from-home call centers, we study capacity management when workers self-schedule. Our service provider chooses capacity to maximize its profit (revenue from served customers minus capacity costs) over a horizon. Because demand varies over the horizon, the provider benefits from flexibility to adjust its capacity from period to period. However, the firm controls its capacity only indirectly through compensation. The agents have the flexibility to choose when they will or will not work and they optimize their schedules based on the compensation offered and their individual availability. To guarantee adequate capacity, the firm must offer sufficiently high compensation. An augmented newsvendor formula captures the tradeoffs for the firm and the agents. If the firm could keep the flexibility but summon as many agents as it wants (i.e., have direct control) for the same wages it would not only generate higher profit, as is expected, but would also provide better service levels to its customers. If the agents require a “minimum wage” to remain in the agent pool they will have to relinquish some of their flexibility. To pay a minimum wage the firm must restrict the number of agents that can work in some time intervals. The costs to the firm are countered by the self-scheduling firm’s flexibility to match supply to varying demand. If the pool of agents is sufficiently large relative to peak demand, the firm earns more than it would if it had control of agents’ schedules but had to maintain a fixed staffing level over the horizon.
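The paper's augmented formula is not reproduced in the abstract; for intuition, the classic newsvendor critical fractile that it builds on, with underage cost \(c_u\), overage cost \(c_o\), and demand distribution \(F\) (notation assumed here, not the paper's), is

\[
q^{*} = F^{-1}\!\left(\frac{c_u}{c_u + c_o}\right),
\]

i.e., capacity is set at the demand quantile where the marginal costs of under- and over-capacity balance; the paper's augmented version additionally prices in the compensation needed to summon self-scheduling agents.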

Posted ContentDOI
TL;DR: The optimal design of cryptocurrencies is studied, how well such currencies can support bilateral trade is assessed quantitatively, and it is pointed out that cryptocurrencies can potentially challenge retail payment systems provided scaling limitations can be addressed.
Abstract: Since the creation of Bitcoin in 2009, more than 2,000 cryptocurrencies have been issued. In this study, we assess the success of a cryptocurrency as a payment system.

Journal ArticleDOI
TL;DR: In this article, the authors revisited the relationship between firm performance and CEO turnover and introduced the concept of performance-induced turnover, defined as turnover that would not have occurred had performance been "good".
Abstract: This paper revisits the relationship between firm performance and CEO turnover. We drop the distinction between forced and voluntary turnovers and introduce the concept of performance-induced turnover, defined as turnover that would not have occurred had performance been "good". We document a close link between performance and CEO turnover and estimate that between 38% and 55% of all turnovers are performance induced, with an even higher percentage early in tenure. This is significantly more than the number of forced turnovers identified in prior studies. We contrast the empirical properties of performance-induced turnovers with the predictions of Bayesian learning models of CEO turnover. Learning by boards about CEO ability appears to be slow, and boards act as if CEO ability (or match quality) was subject to frequent and sizeable shocks.

Journal ArticleDOI
TL;DR: This article considers identification and estimation of treatment effect parameters using DID with (i) multiple time periods, (ii) variation in treatment timing, and (iii) when the "parallel trends assumption" holds potentially only after conditioning on observed covariates.
Abstract: In this article, we consider identification, estimation, and inference procedures for treatment effect parameters using Difference-in-Differences (DID) with (i) multiple time periods, (ii) variation in treatment timing, and (iii) when the "parallel trends assumption" holds potentially only after conditioning on observed covariates. We show that a family of causal effect parameters are identified in staggered DID setups, even if differences in observed characteristics create non-parallel outcome dynamics between groups. Our identification results allow one to use outcome regression, inverse probability weighting, or doubly-robust estimands. We also propose different aggregation schemes that can be used to highlight treatment effect heterogeneity across different dimensions as well as to summarize the overall effect of participating in the treatment. We establish the asymptotic properties of the proposed estimators and prove the validity of a computationally convenient bootstrap procedure to conduct asymptotically valid simultaneous (instead of pointwise) inference. Finally, we illustrate the relevance of our proposed tools by analyzing the effect of the minimum wage on teen employment from 2001-2007. Open-source software is available for implementing the proposed methods.
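In this framework the building block is the group-time average treatment effect ATT(g, t): the effect at period t for the group first treated in period g, identified by comparing outcome changes against never-treated units. A minimal unconditional sketch is below (illustrative column names and toy data; the authors' open-source software implements the full method):

```python
import pandas as pd

# Sketch of a group-time ATT(g, t) in the spirit of the article
# (unconditional parallel trends, never-treated comparison group).
def att_gt(df, g, t):
    """DiD of outcome changes from period g-1 to t: units first treated
    in period g versus never-treated units (g coded 0)."""
    base, cur = df[df.period == g - 1], df[df.period == t]
    wide = base.merge(cur, on="unit", suffixes=("_0", "_1"))
    dy = wide.y_1 - wide.y_0
    return dy[wide.g_0 == g].mean() - dy[wide.g_0 == 0].mean()

# Tiny illustrative panel: unit, period, first-treatment period g, outcome y
df = pd.DataFrame({
    "unit":   [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "period": [1, 2, 3] * 3,
    "g":      [2, 2, 2, 3, 3, 3, 0, 0, 0],
    "y":      [1.0, 2.5, 3.0, 1.0, 1.2, 2.8, 1.0, 1.1, 1.2],
})
print(att_gt(df, g=2, t=2))  # ATT for the group first treated in period 2
```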

Journal ArticleDOI
TL;DR: Algorithms are not only a threat to be regulated; with the right safeguards in place, they have the potential to be a positive force for equity.
Abstract: The law forbids discrimination. But the ambiguity of human decision-making often makes it extraordinarily hard for the legal system to know whether anyone has actually discriminated. To understand how algorithms affect discrimination, we must therefore also understand how they affect the problem of detecting discrimination. By one measure, algorithms are fundamentally opaque, not just cognitively but even mathematically. Yet for the task of proving discrimination, processes involving algorithms can provide crucial forms of transparency that are otherwise unavailable. These benefits do not happen automatically. But with appropriate requirements in place, the use of algorithms will make it possible to more easily examine and interrogate the entire decision process, thereby making it far easier to know whether discrimination has occurred. By forcing a new level of specificity, the use of algorithms also highlights, and makes transparent, central tradeoffs among competing values. Algorithms are not only a threat to be regulated; with the right safeguards in place, they have the potential to be a positive force for equity.

Journal ArticleDOI
TL;DR: In this article, a large literature on persistence finds that many modern outcomes strongly reflect characteristics of the same places in the distant past, and the purpose of this paper is to examine whether these two properties might be connected, and find that even for modest ranges of spatial correlation between points, t statistics become severely inflated leading to significance levels that are in error by several orders of magnitude.
Abstract: A large literature on persistence finds that many modern outcomes strongly reflect characteristics of the same places in the distant past. However, alongside unusually high t statistics, these regressions display severe spatial autocorrelation in residuals, and the purpose of this paper is to examine whether these two properties might be connected. We start by running artificial regressions where both variables are spatial noise and find that, even for modest ranges of spatial correlation between points, t statistics become severely inflated leading to significance levels that are in error by several orders of magnitude. We analyse 27 persistence studies in leading journals and find that in most cases if we replace the main explanatory variable with spatial noise the fit of the regression commonly improves; and if we replace the dependent variable with spatial noise, the persistence variable can still explain it at high significance levels. We can predict in advance which persistence results might be the outcome of fitting spatial noise from the degree of spatial autocorrelation in their residuals measured by a standard Moran statistic. Our findings suggest that the results of persistence studies, and of spatial regressions more generally, might be treated with some caution in the absence of reported Moran statistics and noise simulations.
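For readers unfamiliar with the Moran statistic the paper leans on, a minimal sketch of Moran's I is below; the inverse-distance weighting scheme and one-dimensional layout are illustrative assumptions, not the paper's specification. Run on a random walk (spatially correlated "noise") and on independent noise, the first scores far higher:

```python
import numpy as np

def morans_i(x, W):
    """Moran's I statistic for values x under spatial weight matrix W."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    n, s0 = len(x), W.sum()
    return (n / s0) * (z @ W @ z) / (z @ z)

# Illustrative: 20 points on a line, weights = inverse distance
rng = np.random.default_rng(0)
coords = np.arange(20, dtype=float)
d = np.abs(coords[:, None] - coords[None, :])
W = np.where(d > 0, 1.0 / np.maximum(d, 1e-9), 0.0)

smooth = np.cumsum(rng.normal(size=20))   # spatially correlated series
white = rng.normal(size=20)               # independent noise
print(morans_i(smooth, W), morans_i(white, W))
```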

Journal ArticleDOI
TL;DR: The authors presented a broad look at the American public's attitudes toward artificial intelligence (AI) and AI governance, based on findings from a nationally representative survey of 2,000 American adults.
Abstract: This report presents a broad look at the American public’s attitudes toward artificial intelligence (AI) and AI governance, based on findings from a nationally representative survey of 2,000 American adults. As the study of the public opinion toward AI is relatively new, we aimed for breadth over depth, with our questions touching on: workplace automation; attitudes regarding international cooperation; the public’s trust in various actors to develop and regulate AI; views about the importance and likely impact of different AI governance challenges; and historical and cross-national trends in public opinion regarding AI. Our results provide preliminary insights into the character of US public opinion regarding AI.

Journal ArticleDOI
Michael Webb
TL;DR: Under the assumption that the historical pattern of long-run substitution will continue, it is estimated that AI will reduce 90:10 wage inequality but will not affect the top 1%.
Abstract: I develop a new method to predict the impacts of a technology on occupations. I use the overlap between the text of job task descriptions and the text of patents to construct a measure of the exposure of tasks to automation. I first apply the method to historical cases such as software and industrial robots. I establish that occupations I measure as highly exposed to previous automation technologies saw declines in employment and wages over the relevant periods. I use the fitted parameters from the case studies to predict the impacts of artificial intelligence. I find that, in contrast to software and robots, AI is directed at high-skilled tasks. Under the assumption that the historical pattern of long-run substitution will continue, I estimate that AI will reduce 90:10 wage inequality, but will not affect the top 1%.
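As a toy version of the text-overlap idea (a crude stand-in for the paper's matching of job-task descriptions to patent text, not its actual measure), one can score exposure by the cosine similarity of word counts:

```python
from collections import Counter
import math

def cosine_overlap(task_text, patent_text):
    """Toy exposure score: cosine similarity of word counts between a
    job-task description and patent text."""
    a = Counter(task_text.lower().split())
    b = Counter(patent_text.lower().split())
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Hypothetical task description and patent snippet
task = "analyze data to detect fraudulent transactions"
patent = "a system to analyze transaction data and detect fraud automatically"
print(round(cosine_overlap(task, patent), 3))
```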

Journal ArticleDOI
TL;DR: This article provides a snapshot of the main concepts involved in Wasserstein distances and optimal transportation, and a succinct overview of some of their many statistical aspects.
Abstract: Wasserstein distances are metrics on probability distributions inspired by the problem of optimal mass transportation. Roughly speaking, they measure the minimal effort required to reconfigure the probability mass of one distribution in order to recover the other.
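For a concrete one-dimensional example, the first Wasserstein distance between two empirical samples can be computed directly with SciPy; the distributions below are illustrative:

```python
import numpy as np
from scipy.stats import wasserstein_distance

# W1 between two empirical samples: the average distance probability mass
# must be moved to turn one distribution into the other.
rng = np.random.default_rng(0)
a = rng.normal(loc=0.0, scale=1.0, size=5000)
b = rng.normal(loc=2.0, scale=1.0, size=5000)
print(wasserstein_distance(a, b))  # ~2.0: mass moves 2 units on average
```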

ReportDOI
TL;DR: In this article, the authors conducted a randomized controlled trial with housing voucher recipients in Seattle and King County, providing services to reduce barriers to moving to high-upward-mobility neighborhoods: customized search assistance, landlord engagement, and short-term financial assistance.
Abstract: Low-income families in the United States tend to live in neighborhoods that offer limited opportunities for upward income mobility. One potential explanation for this pattern is that families prefer such neighborhoods for other reasons, such as affordability or proximity to family and jobs. An alternative explanation is that they do not move to high-opportunity areas because of barriers that prevent them from making such moves. We test between these two explanations using a randomized controlled trial with housing voucher recipients in Seattle and King County. We provided services to reduce barriers to moving to high-upward-mobility neighborhoods: customized search assistance, landlord engagement, and short-term financial assistance. Unlike many previous housing mobility programs, families using vouchers were not required to move to a high-opportunity neighborhood to receive a voucher. The intervention increased the fraction of families who moved to high-upward-mobility areas from 15% in the control group to 53% in the treatment group. Families induced to move to higher opportunity areas by the treatment do not make sacrifices on other aspects of neighborhood quality, tend to stay in their new neighborhoods when their leases come up for renewal, and report higher levels of neighborhood satisfaction after moving. These findings imply that most low-income families do not have a strong preference to stay in low-opportunity areas; instead, barriers in the housing search process are a central driver of residential segregation by income. Interviews with families reveal that the capacity to address each family's needs in a specific manner — from emotional support to brokering with landlords to customized financial assistance — was critical to the program's success. Using quasi-experimental analyses and comparisons to other studies, we show that more standardized policies — increasing voucher payment standards in high-opportunity areas or informational interventions — have much smaller impacts. We conclude that redesigning affordable housing policies to provide customized assistance in housing search could reduce residential segregation and increase upward mobility substantially.

Journal ArticleDOI
TL;DR: In this article, the authors explored the extent to which the Western concept of the rule of law impacts systematic violence against Indigenous girls and women in Australia and post-war Liberia, and found that although the principle of the Rule of Law is an emancipatory tool for justice and redress generally, it can also be an apparatus for persistent systemic violence against women.
Abstract: The gender-agenda is borderless. Arguably, legal justice for Indigenous girls and women survivors of violence is unfair, inequitable, and sometimes arbitrary. Systematic violence against girls and women pervades cultures and societies; operates at three main levels: institution and state, structural and cultural, and community and individual; and manifests in myriad shapes, forms and categories. Systematic violence in this research comprises historical, colonial and contemporary aspects of violence and its impact on Indigenous girls and women. Unlike comparative studies, this research is founded on heuristic arguments derived from validating the formation, establishment and continuity of the voices of Indigenous peoples in Liberia and Australia. While many studies isolate ‘gender-based violence’ and the ‘rule of law’ in separate contexts, none has explored the extent to which the Western concept of the rule of law impacts systematic violence against Indigenous girls and women in Australia and post-war Liberia. The research assesses the efficacy of the ‘rule of law’ in dispensing justice to Indigenous girls and women who have suffered systematic gender-based violence. The scope of the research demands a comprehensive and complex systematic empirical approach that draws on the principles of phenomenology, community-based participatory research, and feminist and Indigenous methods. The study adopts an interdisciplinary mixed-methods approach informed by theories of decolonization, feminist jurisprudence, intersectionality, critical legal/race studies, and social determinants of health. Data is drawn from case law, secondary data, empirical evidence, textual/content analysis, electronic mailing and informal participant observation. Over a period of two years, a survey of 231 social service providers working with Indigenous girls and women; in-depth interviews with 29 Indigenous Women Advocates; and 22 informal email exchanges with male colleagues were conducted in both Australia and Liberia. Statistical analyses were carried out on records of 127 708 convicts to Australia; 14 996 former slave returnees to Liberia; 2701 sexual and gender-based violence cases reported to the Ministry of Gender, Children and Social Protection in Liberia; seven case files from the Sexual and Gender-based Crimes Unit in Liberia; and 1200 interview entries from the Longitudinal Study of Indigenous Children in Australia. This analysis of historical documents, jurisprudence and case studies triangulates a philosophical inquiry intended to migrate issues of violence against Indigenous girls and women from the margins of complex socio-legal structures towards the core of Western-centric perspectives, such as the rule of law. Situated between dominant academic conventions and resistance, the research provokes readers to consider ontological, epistemological and ethical arguments regarding access to justice outcomes for Indigenous girls and women. Contrary to the research hypothesis and despite socioeconomic differences between Australia and Liberia, findings show that: although the principle of the rule of law is an emancipatory tool for justice and redress generally, it can also be an apparatus for persistent systematic violence against Indigenous girls and women. Furthermore, the intersection of colonial history, race, gender, class and social status exacerbates the ongoing perpetration of institutional/state, structural/cultural and interpersonal/community violence against Indigenous girls and women. 
In conclusion, the research recommends adopting a holistic approach to educating girls and women and encouraging boys and men to participate equally in the gender justice agenda, to ensure justice for Indigenous girls and women. The research also suggests incorporating diverse and comprehensive conceptual and methodological frameworks into further research. Finally, throughout the work, this dissertation attempts to give agency to Indigenous ways of being, knowing and doing justice.

Journal ArticleDOI
TL;DR: The authors show that linear regressions with period and group fixed effects identify weighted sums of average treatment effects in which some weights may be negative (in two articles that have used those regressions, half of the weights are negative), and they propose another estimator that solves this issue.
Abstract: Linear regressions with period and group fixed effects are widely used to estimate treatment effects. We show that they identify weighted sums of the average treatment effects (ATE) in each group and period, with weights that may be negative. Due to the negative weights, the linear regression estimand may for instance be negative while all the ATEs are positive. In two articles that have used those regressions, half of the weights are negative. We propose another estimator that solves this issue. In one of the articles we revisit, it is of a different sign than the linear regression estimator.
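The negative-weights point can be reproduced in a toy example (my numbers, chosen to be consistent with the paper's argument): two units adopt treatment at different times, every group-period treatment effect is positive, yet the two-way fixed-effects coefficient is negative:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Two units, three periods, staggered adoption, no noise. Effects are
# +1 and +4 for the early unit and +1 for the late unit (all positive),
# yet the two-way fixed-effects coefficient on D comes out at -0.5.
df = pd.DataFrame({
    "unit":   ["E", "E", "E", "L", "L", "L"],
    "period": [1, 2, 3, 1, 2, 3],
    "D":      [0, 1, 1, 0, 0, 1],
    "y":      [0.0, 1.0, 4.0, 0.0, 0.0, 1.0],  # y = treatment effect only
})
twfe = smf.ols("y ~ D + C(unit) + C(period)", data=df).fit()
print(twfe.params["D"])  # -0.5 despite all ATEs being positive
```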

Journal ArticleDOI
TL;DR: This work reviews the computational roles played by internal models of the motor system and environmental dynamics, and the neural and behavioral evidence for their implementation in the brain, in the context of this theoretic formalism.
Abstract: Rationality principles such as optimal feedback control and Bayesian inference underpin a probabilistic framework that has accounted for a range of empirical phenomena in biological sensorimotor control. To facilitate the optimization of flexible and robust behaviors consistent with these theories, the ability to construct internal models of the motor system and environmental dynamics can be crucial. In the context of this theoretic formalism, we review the computational roles played by such internal models and the neural and behavioral evidence for their implementation in the brain.
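Such internal forward models are commonly formalized as Kalman filters that fuse an efference-copy prediction with noisy sensory feedback; the following one-dimensional sketch uses invented dynamics and noise levels purely for illustration:

```python
import numpy as np

# Minimal 1-D Kalman filter as a stand-in for an internal forward model:
# the estimator predicts the hand's position from its motor command, then
# corrects the prediction with noisy sensory feedback.
rng = np.random.default_rng(0)
a, b = 1.0, 0.5          # dynamics: x' = a*x + b*u + motor noise
q, r = 0.01, 0.25        # motor (process) and sensory (observation) noise
x, x_hat, p = 0.0, 0.0, 1.0

for t in range(50):
    u = 0.1                                    # motor command
    x = a * x + b * u + rng.normal(0, q**0.5)  # true state
    y = x + rng.normal(0, r**0.5)              # noisy sensory feedback
    # Predict (efference-copy-based forward model)
    x_hat = a * x_hat + b * u
    p = a * p * a + q
    # Update (Bayesian combination of prediction and feedback)
    k = p / (p + r)
    x_hat = x_hat + k * (y - x_hat)
    p = (1 - k) * p

print(f"true {x:.3f}  estimate {x_hat:.3f}")
```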

Journal ArticleDOI
TL;DR: The authors investigated whether Tether, a digital currency pegged to the U.S. dollar, influenced Bitcoin and other cryptocurrency prices during the 2017 boom and found that purchases with Tether are timed following market downturns and result in sizable increases in Bitcoin prices.
Abstract: This paper investigates whether Tether, a digital currency pegged to the U.S. dollar, influenced Bitcoin and other cryptocurrency prices during the 2017 boom. Using algorithms to analyze blockchain data, we find that purchases with Tether are timed following market downturns and result in sizable increases in Bitcoin prices. The flow is attributable to one entity, clusters below round prices, induces asymmetric autocorrelations in Bitcoin, and suggests insufficient Tether reserves before month-ends. Rather than demand from cash investors, these patterns are most consistent with the supply-based hypothesis of unbacked digital money inflating cryptocurrency prices.