
Showing papers in "Social Science Research Network in 2016"


Journal ArticleDOI
TL;DR: In this article, the authors study the impact of the short-term accommodation market on the hotel industry and find that the impact is non-uniformly distributed, with lower-priced hotels and those hotels not catering to business travelers being the most affected.
Abstract: Peer-to-peer markets, collectively known as the sharing economy, have emerged as alternative suppliers of goods and services traditionally provided by long-established industries. A central question regards the impact of these sharing economy platforms on incumbent firms. We study the case of Airbnb, specifically analyzing Airbnb’s entry into the short-term accommodation market in Texas and its impact on the incumbent hotel industry. We first explore Airbnb’s impact on hotel revenue, by using a difference-in-differences empirical strategy that exploits the significant spatiotemporal variation in the patterns of Airbnb adoption across city-level markets. We estimate that in Austin, where Airbnb supply is highest, the causal impact on hotel revenue is in the 8-10% range; moreover, the impact is non-uniformly distributed, with lower-priced hotels and those hotels not catering to business travelers being the most affected. We find that this impact materializes through less aggressive hotel room pricing, an impact that benefits all consumers, not just participants in the sharing economy. The impact on hotel prices is especially pronounced during periods of peak demand, such as SXSW. We find that by enabling supply to scale – a differentiating feature of peer-to-peer platforms – Airbnb has significantly crimped hotels’ ability to raise prices during periods of peak demand. Our work provides empirical evidence that the sharing economy is making inroads by successfully competing with, differentiating from, and acquiring market share from incumbent firms.

1,519 citations
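
The difference-in-differences strategy described above can be illustrated with a minimal sketch. The panel below is simulated, and the variable names (log_revenue, log_airbnb_supply, city, month) are hypothetical placeholders, not the paper's actual data or specification.

```python
# Minimal difference-in-differences sketch (simulated data, not the paper's).
# A panel of city-month hotel revenue is regressed on a measure of Airbnb
# supply, with city and month fixed effects absorbing level differences.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for c in range(10):                                   # hypothetical cities
    for m in range(60):                               # hypothetical months
        airbnb = max(0.0, (m - 20) * 0.1 * (c / 10))  # supply grows unevenly across cities
        log_rev = 10 + 0.05 * c + 0.01 * m - 0.08 * airbnb + rng.normal(0, 0.05)
        rows.append({"city": c, "month": m,
                     "log_airbnb_supply": airbnb, "log_revenue": log_rev})
panel = pd.DataFrame(rows)

# Two-way fixed effects regression: the coefficient on log_airbnb_supply is the
# difference-in-differences style estimate of Airbnb's impact on hotel revenue.
model = smf.ols("log_revenue ~ log_airbnb_supply + C(city) + C(month)", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["city"]}
)
print(model.params["log_airbnb_supply"])
```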


Journal ArticleDOI
TL;DR: In the absence of a demonstrable intent to discriminate, the best doctrinal hope for data mining's victims would seem to lie in disparate impact doctrine, as discussed by the authors; however, case law and the EEOC's Uniform Guidelines hold that a practice can be justified as a business necessity when its outcomes are predictive of future employment outcomes, and data mining is specifically designed to find such statistical correlations.
Abstract: Advocates of algorithmic techniques like data mining argue that these techniques eliminate human biases from the decision-making process. But an algorithm is only as good as the data it works with. Data is frequently imperfect in ways that allow these algorithms to inherit the prejudices of prior decision makers. In other cases, data may simply reflect the widespread biases that persist in society at large. In still others, data mining can discover surprisingly useful regularities that are really just preexisting patterns of exclusion and inequality. Unthinking reliance on data mining can deny historically disadvantaged and vulnerable groups full participation in society. Worse still, because the resulting discrimination is almost always an unintentional emergent property of the algorithm’s use rather than a conscious choice by its programmers, it can be unusually hard to identify the source of the problem or to explain it to a court. This Essay examines these concerns through the lens of American antidiscrimination law — more particularly, through Title VII’s prohibition of discrimination in employment. In the absence of a demonstrable intent to discriminate, the best doctrinal hope for data mining’s victims would seem to lie in disparate impact doctrine. Case law and the Equal Employment Opportunity Commission’s Uniform Guidelines, though, hold that a practice can be justified as a business necessity when its outcomes are predictive of future employment outcomes, and data mining is specifically designed to find such statistical correlations. Unless there is a reasonably practical way to demonstrate that these discoveries are spurious, Title VII would appear to bless its use, even though the correlations it discovers will often reflect historic patterns of prejudice, others’ discrimination against members of protected groups, or flaws in the underlying data. Addressing the sources of this unintentional discrimination and remedying the corresponding deficiencies in the law will be difficult technically, difficult legally, and difficult politically. There are a number of practical limits to what can be accomplished computationally. For example, when discrimination occurs because the data being mined is itself a result of past intentional discrimination, there is frequently no obvious method to adjust historical data to rid it of this taint. Corrective measures that alter the results of the data mining after it is complete would tread on legally and politically disputed terrain. These challenges for reform throw into stark relief the tension between the two major theories underlying antidiscrimination law: anticlassification and antisubordination. Finding a solution to big data’s disparate impact will require more than best efforts to stamp out prejudice and bias; it will require a wholesale reexamination of the meanings of “discrimination” and “fairness.”

1,504 citations
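
The EEOC Uniform Guidelines cited in the abstract are also the source of the well-known "four-fifths rule" used as a first screen for disparate impact. The sketch below works through that rule on made-up selection counts; it illustrates the screening arithmetic only and is not part of the essay's legal argument.

```python
# Hypothetical illustration of the EEOC Uniform Guidelines' "four-fifths rule",
# a common first screen for disparate impact (all numbers are invented).
def selection_rate(selected, applicants):
    return selected / applicants

groups = {
    "group_a": selection_rate(selected=48, applicants=100),   # 48% selected
    "group_b": selection_rate(selected=30, applicants=100),   # 30% selected
}

highest = max(groups.values())
for name, rate in groups.items():
    impact_ratio = rate / highest
    # A ratio below 0.8 is conventionally treated as evidence of adverse impact.
    flag = "potential adverse impact" if impact_ratio < 0.8 else "ok"
    print(f"{name}: selection rate {rate:.2f}, impact ratio {impact_ratio:.2f} -> {flag}")
```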


Journal ArticleDOI
TL;DR: In this article, the authors examine the impact of economic insecurity and cultural values as predictors of voting for populist parties in 31 European countries and find the most consistent evidence supporting the cultural backlash thesis.
Abstract: Rising support for populist parties has disrupted the politics of many Western societies. What explains this phenomenon? Two theories are examined here. Perhaps the most widely-held view of mass support for populism -- the economic insecurity perspective -- emphasizes the consequences of profound changes transforming the workforce and society in post-industrial economies. Alternatively, the cultural backlash thesis suggests that support can be explained as a retro reaction by once-predominant sectors of the population to progressive value change. To consider these arguments, Part I develops the conceptual and theoretical framework. Part II of the study uses the 2014 Chapel Hill Expert Survey (CHES) to identify the ideological location of 268 political parties in 31 European countries. Part III compares the pattern of European party competition at the national level. Part IV uses the pooled European Social Survey 1-6 (2002-2014) to examine the cross-national evidence at the individual level for the impact of economic insecurity and cultural values as predictors of voting for populist parties. Part V summarizes the key findings and considers their implications. Overall, we find the most consistent evidence supporting the cultural backlash thesis.

1,133 citations


OtherDOI
TL;DR: The main principles behind blockchain technology are expounded and the core concepts at the heart of the blockchain are presented, and the main features of decentralized public ledger platforms are exposed.
Abstract: This paper expounds the main principles behind blockchain technology and some of its cutting-edge applications. Firstly, we present the core concepts at the heart of the blockchain, and we discuss the potential risks and drawbacks of public distributed ledgers, and the shift toward hybrid solutions. Secondly, we expose the main features of decentralized public ledger platforms. Thirdly, we show why the blockchain is a disruptive and foundational technology, and fourthly, we sketch out a list of important applications, bearing in mind the most recent evolutions.

1,009 citations
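
The core concept the paper starts from, records chained together by cryptographic hashes so that earlier entries cannot be altered without breaking every later link, can be sketched in a few lines. This is a toy illustration, not any production ledger's data structure.

```python
# Toy hash-chained ledger: each block commits to its predecessor's hash, so
# altering an earlier record invalidates every later link. Illustrative only.
import hashlib, json

def make_block(prev_hash, payload):
    body = {"prev_hash": prev_hash, "payload": payload}
    block_hash = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": block_hash}

def verify(chain):
    for prev, curr in zip(chain, chain[1:]):
        recomputed = hashlib.sha256(
            json.dumps({"prev_hash": curr["prev_hash"], "payload": curr["payload"]},
                       sort_keys=True).encode()
        ).hexdigest()
        if curr["prev_hash"] != prev["hash"] or curr["hash"] != recomputed:
            return False
    return True

chain = [make_block("0" * 64, "genesis")]
chain.append(make_block(chain[-1]["hash"], "alice pays bob 5"))
chain.append(make_block(chain[-1]["hash"], "bob pays carol 2"))
print(verify(chain))                       # True
chain[1]["payload"] = "alice pays bob 500"
print(verify(chain))                       # False: tampering breaks the chain
```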


Posted ContentDOI
TL;DR: In this article, the authors provide new empirical evidence on the extent of and trends in the gender wage gap, using PSID microdata over 1980-2010, and show that women's work force interruptions and shorter hours remain significant in high skilled occupations, possibly due to compensating differentials.
Abstract: Using PSID microdata over the 1980-2010 period, we provide new empirical evidence on the extent of and trends in the gender wage gap, which declined considerably over this period. By 2010, conventional human capital variables taken together explained little of the gender wage gap, while gender differences in occupation and industry continued to be important. Moreover, the gender pay gap declined much more slowly at the top of the wage distribution than at the middle or the bottom and by 2010 was noticeably higher at the top. We then survey the literature to identify what has been learned about the explanations for the gap. We conclude that many of the traditional explanations continue to have salience. Although human capital factors are now relatively unimportant in the aggregate, women’s work force interruptions and shorter hours remain significant in high skilled occupations, possibly due to compensating differentials. Gender differences in occupations and industries, as well as differences in gender roles and the gender division of labor remain important, and research based on experimental evidence strongly suggests that discrimination cannot be discounted. Psychological attributes or noncognitive skills comprise one of the newer explanations for gender differences in outcomes. Our effort to assess the quantitative evidence on the importance of these factors suggests that they account for a small to moderate portion of the gender pay gap, considerably smaller than, say, occupation and industry effects, though they appear to modestly contribute to these differences.

984 citations
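
The claim that human capital variables "explain little" of the gap rests on decomposition methods. The sketch below shows a standard Oaxaca-Blinder style decomposition on simulated data; it is one common implementation, not necessarily the authors' exact PSID specification.

```python
# Oaxaca-Blinder style decomposition of a mean log-wage gap into an "explained"
# part (differences in characteristics) and an "unexplained" part.
# Hypothetical data; not the paper's PSID sample or specification.
import numpy as np

rng = np.random.default_rng(1)

def simulate(n, beta, mean_x):
    X = np.column_stack([np.ones(n), rng.normal(mean_x, 1.0, size=(n, 2))])
    y = X @ beta + rng.normal(0, 0.1, n)          # log wages
    return X, y

X_m, y_m = simulate(2000, beta=np.array([2.0, 0.10, 0.05]), mean_x=1.0)  # group M
X_f, y_f = simulate(2000, beta=np.array([1.9, 0.08, 0.05]), mean_x=0.8)  # group F

b_m, *_ = np.linalg.lstsq(X_m, y_m, rcond=None)
b_f, *_ = np.linalg.lstsq(X_f, y_f, rcond=None)

gap = y_m.mean() - y_f.mean()
explained = (X_m.mean(axis=0) - X_f.mean(axis=0)) @ b_m   # valued at group-M coefficients
unexplained = X_f.mean(axis=0) @ (b_m - b_f)
print(f"gap={gap:.3f}, explained={explained:.3f}, unexplained={unexplained:.3f}")
```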


Journal ArticleDOI
TL;DR: In this article, the authors provide an overview of FDA, starting with simple statistical notions such as mean and covariance functions, then covering some core techniques, the most popular of which is functional principal component analysis (FPCA).
Abstract: With the advance of modern technology, more and more data are being recorded continuously during a time interval or intermittently at several discrete time points. These are both examples of functional data, which has become a commonly encountered type of data. Functional data analysis (FDA) encompasses the statistical methodology for such data. Broadly interpreted, FDA deals with the analysis and theory of data that are in the form of functions. This paper provides an overview of FDA, starting with simple statistical notions such as mean and covariance functions, then covering some core techniques, the most popular of which is functional principal component analysis (FPCA). FPCA is an important dimension reduction tool, and in sparse data situations it can be used to impute functional data that are sparsely observed. Other dimension reduction approaches are also discussed. In addition, we review another core technique, functional linear regression, as well as clustering and classification of functional data.

963 citations
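
For densely observed curves, the FPCA highlighted in the overview amounts to an eigendecomposition of the sample covariance function. The curves below are simulated and the discretized implementation is a minimal sketch, not a full FDA toolkit.

```python
# Minimal functional PCA sketch: curves observed on a common dense grid are
# centered, and the eigenvectors of the discretized sample covariance play the
# role of eigenfunctions. Simulated curves; illustrative only.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 101)                       # common observation grid
n = 200
scores = rng.normal(0, [2.0, 0.7], size=(n, 2))  # random scores on two modes
curves = (np.sin(2 * np.pi * t)                  # mean function
          + np.outer(scores[:, 0], np.sqrt(2) * np.cos(2 * np.pi * t))
          + np.outer(scores[:, 1], np.sqrt(2) * np.sin(4 * np.pi * t))
          + rng.normal(0, 0.1, size=(n, t.size)))

mean_curve = curves.mean(axis=0)
centered = curves - mean_curve
cov = centered.T @ centered / (n - 1)            # discretized covariance function
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
print("variance explained by first two FPCs:", explained[:2].round(3))
fpc_scores = centered @ eigvecs[:, :2]           # projections onto leading eigenfunctions
```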


Posted Content
TL;DR: In this paper, the authors provide some tentative answers to the following basic questions regarding the dynamic processes governing an industry's structure: 'What are the quantitative effects of various factors on the rates of entry and exit? How well can the growth of firms be represented by Gibrat's law of proportionate effect?' and 'What have been the effects of successful innovations on a firm's growth rate? What determines the amount of mobility within an industry's size structure?'
Abstract: and death of firms, we lack even crude answers to the following basic questions regarding the dynamic processes governing an industry's structure. What are the quantitative effects of various factors on the rates of entry and exit? How well can the growth of firms be represented by Gibrat's law of proportionate effect? What have been the effects of successful innovations on a firm's growth rate? What determines the amount of mobility within an industry's size structure? This paper provides some tentative answers to these questions. First, it constructs some simple models to estimate the effects of an industry's capital requirements, profitability, and other such factors on its entry and exit rates. Second, it investigates how well Gibrat's law of proportionate effect can represent the growth of firms in each of the industries for which we have appropriate data. Although this law has played a prominent role in models designed to explain the size distribution of firms, it has been tested only a few times against data for very large firms. Third, we estimate the difference in growth rate between firms that carried out significant innovations and other firms of comparable initial size. The results help to measure the importance of successful innovation as a cause of interfirm differences in growth rates, and they shed new light on the rewards for such innovations. Fourth, the paper presents and tests a simple model to explain interindustry and ...

832 citations
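
Gibrat's law of proportionate effect, which the paper tests, says that firm growth rates are independent of firm size. A standard check regresses log size in one period on log size in the previous period and looks for a slope near one; the sketch below uses hypothetical firm sizes.

```python
# Simple test of Gibrat's law of proportionate effect: regress log size in
# period t+1 on log size in period t. A slope close to 1 is consistent with
# growth being independent of size. Hypothetical data, not the paper's sample.
import numpy as np

rng = np.random.default_rng(3)
log_size_t = rng.normal(4.0, 1.0, 500)        # log firm sizes in period t
growth = rng.normal(0.05, 0.2, 500)           # growth drawn independently of size
log_size_t1 = log_size_t + growth

X = np.column_stack([np.ones_like(log_size_t), log_size_t])
(intercept, slope), *_ = np.linalg.lstsq(X, log_size_t1, rcond=None)
print(f"slope = {slope:.3f}  (Gibrat's law predicts a slope of about 1)")
```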


Journal ArticleDOI
TL;DR: Some major analytical building blocks for the development of a theory of organizations are outlined and discussed in this article, and two literatures of agency theory are briefly discussed in light of these issues.
Abstract: The foundations are being put in place for a revolution in the science of organizations. Some major analytical building blocks for the development of a theory of organizations are outlined and discussed in this paper. This development of organization theory will be hastened by increased understanding of the importance of the choice of definitions, tautologies, analytical techniques, and types of evidence. The two literatures of agency theory are briefly discussed in light of these issues. Because accounting is an integral part of the structure of every organization, the development of a theory of organizations will be closely associated with the development of a theory of accounting. This theory will explain why organizations take the form they do, why they behave as they do, and why accounting practices take the form they do. Because such positive theories as these are required for purposeful decision making, their development will provide a better scientific basis for the decisions of managers, standard-setting boards, and government regulatory bodies.

700 citations


Journal ArticleDOI
TL;DR: As there are different types of sampling techniques/methods, the researcher needs to understand the differences in order to select the proper sampling method for the research.
Abstract: In order to answer the research questions, it is seldom possible for a researcher to collect data from all cases. Thus, there is a need to select a sample. This paper presents the steps to go through to conduct sampling. Furthermore, as there are different types of sampling techniques/methods, the researcher needs to understand the differences in order to select the proper sampling method for the research. In this regard, this paper also presents the different types of sampling techniques and methods.

685 citations
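
As a small illustration of two of the probability sampling methods such a paper typically covers, the sketch below draws a simple random sample and a proportionate stratified sample from a hypothetical population frame (the strata and sizes are invented for the example).

```python
# Sketch of two probability sampling methods: simple random sampling and
# proportionate stratified sampling. Hypothetical population frame.
import pandas as pd

population = pd.DataFrame({
    "id": range(1000),
    "stratum": ["urban"] * 700 + ["rural"] * 300,
})

# Simple random sample: every case has the same selection probability.
srs = population.sample(n=100, random_state=0)

# Proportionate stratified sample: sample within each stratum in proportion
# to its share of the population (here 70 urban, 30 rural).
stratified = population.groupby("stratum").sample(frac=0.1, random_state=0)

print(srs["stratum"].value_counts())
print(stratified["stratum"].value_counts())
```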


Journal ArticleDOI
TL;DR: In this article, the authors identify the impact of unobservable FCF conflicts on firm policy using a structural approach and find that firms with large institutional holdings or better-aligned executive compensation suffer less from FCF agency conflicts.
Abstract: Free Cash Flow (FCF) agency conflicts exist when managers divert cash flow for private benefits. We identify the impact of unobservable FCF conflicts on firm policy using a structural approach. Measurement equations are constructed based on observable managerial choices: payout policy changes and personal portfolio decisions around exogenous tax rate changes. We find that FCF agency conflicts cause (i) under-leverage, leading to higher corporate taxes and (ii) under-investment in PP&E, leading to slower firm growth. Capital markets recognize FCF conflicts and discount such firms. Finally, firms with (i) large institutional holdings or (ii) better-aligned executive compensation, suffer less from FCF agency conflicts.

677 citations


Journal ArticleDOI
TL;DR: Propensity score matching (PSM) has become a popular technique for estimating average treatment effects (ATEs) in accounting research, but studies often oversell the capabilities of PSM, fail to disclose important design choices, and/or implement PSM in a theoretically inconsistent manner.
Abstract: Propensity score matching (PSM) has become a popular technique for estimating average treatment effects (ATEs) in accounting research. In this study, we discuss the usefulness and limitations of PSM relative to more traditional multiple regression (MR) analysis. We discuss several PSM design choices and review the use of PSM in 86 articles in leading accounting journals from 2008-2014. We document a significant increase in the use of PSM from 0 studies in 2008 to 26 studies in 2014. However, studies often oversell the capabilities of PSM, fail to disclose important design choices, and/or implement PSM in a theoretically inconsistent manner. We then empirically illustrate complications associated with PSM in three accounting research settings. We first demonstrate that when the treatment is not binary, PSM tends to confine analyses to a subsample of observations where the effect size is likely to be smallest. We also show that seemingly innocuous design choices greatly influence sample composition and estimates of the ATE. We conclude with suggestions for future research considering the use of matching methods.
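
A bare-bones version of the PSM workflow discussed above (estimate propensity scores with a logistic regression, match treated units to nearest-neighbor controls on the score, and compare matched outcomes) looks like the sketch below. The data are simulated, and the design choices shown here (1:1 matching, no caliper) are illustrative; these are exactly the kinds of choices the paper argues should be disclosed.

```python
# Propensity score matching sketch on simulated data: logistic propensity
# model, 1:1 nearest-neighbor matching on the score, then a matched-pair
# comparison of outcomes. One of many possible PSM designs.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(4)
n = 2000
X = rng.normal(size=(n, 3))                               # observed covariates
p_treat = 1 / (1 + np.exp(-(0.5 * X[:, 0] - 0.3 * X[:, 1])))
treated = rng.uniform(size=n) < p_treat
y = 1.0 * treated + X @ np.array([0.8, -0.5, 0.2]) + rng.normal(0, 1, n)  # true effect = 1

# Step 1: propensity scores
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: match each treated unit to the nearest control on the score
nn = NearestNeighbors(n_neighbors=1).fit(ps[~treated].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
matched_controls = y[~treated][idx.ravel()]

# Step 3: average treatment effect on the treated from matched pairs
att = (y[treated] - matched_controls).mean()
print(f"matched ATT estimate: {att:.2f}")
```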

Journal ArticleDOI
TL;DR: The authors summarizes and draws connections among diverse streams of theoretical and empirical research on the economics of privacy, focusing on the economic value and consequences of protecting and disclosing personal information, and on consumers' understanding and decisions regarding the tradeoffs associated with the privacy and the sharing of personal data.
Abstract: This article summarizes and draws connections among diverse streams of theoretical and empirical research on the economics of privacy. We focus on the economic value and consequences of protecting and disclosing personal information, and on consumers' understanding and decisions regarding the trade-offs associated with the privacy and the sharing of personal data. We highlight how the economic analysis of privacy evolved over time, as advancements in information technology raised increasingly nuanced and complex issues associated with the protection and sharing of personal information. We find and highlight three themes that connect diverse insights from the literature. First, characterizing a single unifying economic theory of privacy is hard, because privacy issues of economic relevance arise in widely diverse contexts. Second, there are theoretical and empirical situations where the protection of privacy can both enhance, and detract from, individual and societal welfare. Third, in digital economies, consumers' ability to make informed decisions about their privacy is severely hindered, because consumers are often in a position of imperfect or asymmetric information regarding when their data is collected, for what purposes, and with what consequences. We conclude the article by highlighting some of the ongoing issues in the privacy debate of interest to economists.

Posted Content
TL;DR: In this paper, the authors examine cross-sectional differences in the optimistic behavior of financial analysts and investigate whether the predictive accuracy of past information (e.g., time-series of earnings, past returns, etc.) is associated with the magnitude of the bias in analysts' earnings forecasts.
Abstract: This paper examines cross-sectional differences in the optimistic behavior of financial analysts. Specifically, we investigate whether the predictive accuracy of past information (e.g., time-series of earnings, past returns, etc.) is associated with the magnitude of the bias in analysts' earnings forecasts. We posit that there is higher demand for non-public information for firms whose earnings are difficult to accurately predict than for firms whose earnings can be accurately forecasted using public information. Assuming that optimism facilitates access to management's non-public information, we hypothesize that analysts will issue more optimistic forecasts for low predictability firms than for high predictability firms. Our results support this hypothesis.

Journal ArticleDOI
TL;DR: In this article, the role of institutional investors in firms' corporate social responsibility choices and the impact of social norms on these investors was explored using a decade of firm-level environmental and social (E&S) performance data from 41 countries.
Abstract: This paper explores both the role of institutional investors in firms’ corporate social responsibility choices and the impact of social norms on these investors. Using a decade of firm-level environmental and social (E&S) performance data from 41 countries, we find that institutional ownership is positively associated with firm-level E&S performance, with multiple tests suggesting a causal relationship. The impact of institutional investors on firms’ E&S commitments is greatest for foreign investors based in countries with strong social norms regarding E&S, which are predominantly European. Tests that segment by investor type show that these social norm effects hold even for institutional investor types that are subject to market discipline, such as investment advisors. Overall, our results indicate that institutional investors transplant their social norms into the firms they hold around the world.

Journal ArticleDOI
TL;DR: This review article explores and describes the validity and reliability of a questionnaire/survey and also discusses various forms of validity and reliability tests for social science research instruments.
Abstract: The questionnaire is one of the most widely used tools for collecting data, especially in social science research. The main objective of a questionnaire in research is to obtain relevant information in the most reliable and valid manner. Thus the accuracy and consistency of a survey/questionnaire form a significant aspect of research methodology, known as validity and reliability. New researchers are often confused about selecting and conducting the proper type of validity test for their research instrument (questionnaire/survey). This review article explores and describes the validity and reliability of a questionnaire/survey and also discusses various forms of validity and reliability tests.
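
The abstract discusses reliability tests in general terms. One widely used internal-consistency statistic is Cronbach's alpha; the sketch below computes it for hypothetical multi-item responses. Alpha is chosen here as an illustrative example, not necessarily a statistic the article singles out.

```python
# Cronbach's alpha for internal-consistency reliability of a multi-item scale.
# Hypothetical 5-item Likert responses; alpha above ~0.7 is a common rule of thumb.
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

rng = np.random.default_rng(5)
latent = rng.normal(size=(200, 1))                       # one underlying trait
responses = np.clip(np.round(3 + latent + rng.normal(0, 0.8, size=(200, 5))), 1, 5)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```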

Book ChapterDOI
TL;DR: This chapter provides a brief overview of the core aspects of blockchain technology, as well as the second-generation contract-based developments, and discusses key issues that must be considered in developing ledger based technologies in a banking context.
Abstract: In this chapter we provide an overview of the concept of blockchain technology and its potential to disrupt the world of banking through facilitating global money remittance, smart contracts, automated banking ledgers and digital assets. In this regard, we first provide a brief overview of the core aspects of this technology, as well as the second-generation contract-based developments. From there we discuss key issues that must be considered in developing such ledger based technologies in a banking context.

Journal ArticleDOI
TL;DR: This paper finds that guests with distinctively African-American names are 16% less likely to be accepted relative to identical guests with distinctively White names on the same platform, though the results suggest that only a subset of hosts discriminate.
Abstract: In an experiment on Airbnb, we find that applications from guests with distinctively African-American names are 16% less likely to be accepted relative to identical guests with distinctively White names. Discrimination occurs among landlords of all sizes, including small landlords sharing the property and larger landlords with multiple properties. It is most pronounced among hosts who have never had an African-American guest, suggesting only a subset of hosts discriminate. While rental markets have achieved significant reductions in discrimination in recent decades, our results suggest that Airbnb’s current design choices facilitate discrimination and raise the possibility of erasing some of these civil rights gains.

Journal ArticleDOI
TL;DR: This survey describes the nuances of textual analysis and some of the tripwires in its implementation, and reviews the contemporary textual analysis literature to highlight areas of future research.
Abstract: Relative to quantitative methods traditionally used in accounting and finance, textual analysis is substantially less precise. Thus, understanding the art is of equal importance to understanding the science. In this survey we describe the nuances of the method and, as users of textual analysis, some of the tripwires in implementation. We also review the contemporary textual analysis literature and highlight areas of future research.
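
A minimal example of the dictionary-based counting that much of this literature builds on: tokenize a document and report the share of words appearing in a negative word list. The word list and the sample sentence below are placeholders, not the actual dictionaries or filings used in the literature.

```python
# Tiny dictionary-based textual analysis sketch: tokenize a document and report
# the fraction of tokens appearing in a (placeholder) negative word list.
import re

NEGATIVE_WORDS = {"loss", "impairment", "litigation", "decline", "adverse"}  # placeholder

def negative_tone(text):
    tokens = re.findall(r"[a-z]+", text.lower())
    if not tokens:
        return 0.0
    negatives = sum(token in NEGATIVE_WORDS for token in tokens)
    return negatives / len(tokens)

filing = "The company recorded an impairment loss and faces ongoing litigation."
print(f"negative tone: {negative_tone(filing):.2%}")
```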

Journal ArticleDOI
TL;DR: In this paper, the authors extend the literature on how managerial traits relate to corporate choices by documenting that firms run by female CEOs have lower leverage, less volatile earnings, and a higher chance of survival than otherwise similar firms run by male CEOs, and that transitions from male to female CEOs are associated with economically and statistically significant reductions in corporate risk-taking.
Abstract: We extend the literature on how managerial traits relate to corporate choices by documenting that firms run by female CEOs have lower leverage, less volatile earnings, and a higher chance of survival than otherwise similar firms run by male CEOs. Additionally, transitions from male to female CEOs (or vice-versa) are associated with economically and statistically significant reductions (increases) in corporate risk-taking. The results are robust to controlling for the endogenous matching between firms and CEOs using a variety of econometric techniques. We further document that this risk-avoidance behavior appears to lead to distortions in the capital allocation process. These results potentially have important macroeconomic implications for long-term economic growth.

Posted Content
TL;DR: A multi-level framework that integrates the notion of research context and cross-context theorizing with the theory evaluation framework to synthesize the existing UTAUT extensions across both the dimensions and the levels of the research context is proposed.
Abstract: The unified theory of acceptance and use of technology (UTAUT) is a little over a decade old and has been used extensively in information systems (IS) and other fields, as the large number of citations to the original paper that introduced the theory evidences. In this paper, we review and synthesize the IS literature on UTAUT from September 2003 until December 2014, perform a theoretical analysis of UTAUT and its extensions, and chart an agenda for research going forward. Based on Weber’s (2012) framework of theory evaluation, we examined UTAUT and its extensions along two sets of quality dimensions; namely, the parts of a theory and the theory as a whole. While our review identifies many merits to UTAUT, we also found that the progress related to this theory has hampered further theoretical development in research into technology acceptance and use. To chart an agenda for research that will enable significant future work, we analyze the theoretical contributions of UTAUT using Whetten’s (2009) notion of cross-context theorizing. Our analysis reveals several limitations that lead us to propose a multi-level framework that can serve as the theoretical foundation for future research. Specifically, this framework integrates the notion of research context and cross-context theorizing with the theory evaluation framework to: (1) synthesize the existing UTAUT extensions across both the dimensions and the levels of the research context and (2) highlight promising research directions. We conclude with recommendations for future UTAUT-related research using the proposed framework.

Journal ArticleDOI
TL;DR: In this article, the authors developed a novel dataset by hand-mapping sustainability investments classified as material for each industry into firm-specific sustainability ratings and found that firms with good ratings on material sustainability issues significantly outperform firms with poor ratings on these issues.
Abstract: Using newly-available materiality classifications of sustainability topics, we develop a novel dataset by hand-mapping sustainability investments classified as material for each industry into firm-specific sustainability ratings. This allows us to present new evidence on the value implications of sustainability investments. Using both calendar-time portfolio stock return regressions and firm-level panel regressions we find that firms with good ratings on material sustainability issues significantly outperform firms with poor ratings on these issues. In contrast, firms with good ratings on immaterial sustainability issues do not significantly outperform firms with poor ratings on the same issues. These results are confirmed when we analyze future changes in accounting performance. The results have implications for asset managers who have committed to the integration of sustainability factors in their capital allocation decisions.

Journal ArticleDOI
TL;DR: The problems show that the GDPR lacks precise language as well as explicit and well-defined rights and safeguards against automated decision-making, and therefore runs the risk of being toothless.
Abstract: Since approval of the EU General Data Protection Regulation (GDPR) in 2016, it has been widely and repeatedly claimed that a ‘right to explanation’ of decisions made by automated or artificially intelligent algorithmic systems will be legally mandated by the GDPR. This right to explanation is viewed as an ideal mechanism to enhance the accountability and transparency of automated decision-making. However, there are several reasons to doubt both the legal existence and the feasibility of such a right. In contrast to the right to explanation of specific automated decisions claimed elsewhere, the GDPR only mandates that data subjects receive limited information (Articles 13-15) about the logic involved, as well as the significance and the envisaged consequences of automated decision-making systems, what we term a ‘right to be informed’. Further, the ambiguity and limited scope of the ‘right not to be subject to automated decision-making’ contained in Article 22 (from which the alleged ‘right to explanation’ stems) raises questions over the protection actually afforded to data subjects. These problems show that the GDPR lacks precise language as well as explicit and well-defined rights and safeguards against automated decision-making, and therefore runs the risk of being toothless. We propose a number of legislative steps that, if taken, may improve the transparency and accountability of automated decision-making when the GDPR comes into force in 2018.

Journal ArticleDOI
TL;DR: In this article, the Tobin's q is used to explain both physical and intangible investment, and the authors show that it is a better proxy for both physical investment and intangible capital.
Abstract: The neoclassical theory of investment has mainly been tested with physical investment, but we show it also helps explain intangible investment. At the firm level, Tobin's q explains physical and intangible investment roughly equally well, and it explains total investment even better. Compared to physical capital, intangible capital adjusts more slowly to changes in investment opportunities. The classic q theory performs better in firms and years with more intangible capital: Total and even physical investment are better explained by Tobin's q and are less sensitive to cash flow. At the macro level, Tobin's q explains intangible investment many times better than physical investment. We propose a simple, new Tobin's q proxy that accounts for intangible capital, and we show that it is a superior proxy for both physical and intangible investment opportunities.
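
The firm-level regressions the abstract describes have the familiar q-theory form: investment rates regressed on Tobin's q and cash flow. The sketch below runs that regression on simulated firm-year data; it does not construct the paper's intangible-capital-adjusted q proxy.

```python
# Sketch of a q-theory investment regression: investment rates regressed on
# Tobin's q and cash flow. Simulated firm-year data; illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 3000
q = rng.lognormal(mean=0.2, sigma=0.4, size=n)            # Tobin's q
cash_flow = rng.normal(0.1, 0.05, size=n)
investment = 0.02 + 0.05 * q + 0.10 * cash_flow + rng.normal(0, 0.02, n)
data = pd.DataFrame({"investment": investment, "q": q, "cash_flow": cash_flow})

fit = smf.ols("investment ~ q + cash_flow", data=data).fit()
# A larger q coefficient and smaller cash-flow sensitivity is the pattern the
# paper reports when q measures investment opportunities well.
print(fit.params.round(3))
```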

Journal ArticleDOI
TL;DR: In this paper, the authors examine whether and by which mechanisms passive investors influence firms' governance, exploiting variation in ownership by passive mutual funds associated with stock assignments to the Russell 1000 and 2000 indexes.
Abstract: Passive institutional investors are an increasingly important component of U.S. stock ownership. To examine whether and by which mechanisms passive investors influence firms’ governance, we exploit variation in ownership by passive mutual funds associated with stock assignments to the Russell 1000 and 2000 indexes. Our findings suggest that passive mutual funds influence firms’ governance choices, resulting in more independent directors, removal of takeover defenses, and more equal voting rights. Passive investors appear to exert influence through their large voting blocs, and consistent with the observed governance differences increasing firm value, passive ownership is associated with improvements in firms’ longer-term performance.

Posted Content
TL;DR: In this paper, a taxonomy of underground economies is elaborated based on the new institutional approach to economic development, which distinguishes illegal, unreported, unrecorded and informal economies and examines the conceptual and empirical linkages among them.
Abstract: A taxonomy of underground economies is elaborated based on the new institutional approach to economic development. Members of formal sectors confront different sets of transformation and transaction costs than do members of informal sectors and these differences are regarded as crucial to the development process. The paper distinguishes illegal, unreported, unrecorded and informal economies and examines the conceptual and empirical linkages among them. Alternative micro and macro methodologies for measuring underground activities are reviewed and evaluated including census and survey procedures, discrepancies and monetary methods.

Journal ArticleDOI
TL;DR: In this paper, the authors find that corporate social responsibility is more strongly and consistently related to legal origins than to doing good by doing well, and most firm and country characteristics such as ownership concentration, political institutions, and degree of globalization.
Abstract: A firm’s corporate social responsibility (CSR) practice and its country’s legal origin are strongly correlated. This relation is valid for various CSR ratings coming from several large datasets that comprise more than 23,000 large companies from 114 countries. We find that CSR is more strongly and consistently related to legal origins than to “doing good by doing well” factors and to most firm and country characteristics such as ownership concentration, political institutions, and degree of globalization. In particular, companies from common law countries have a lower level of CSR than companies from civil law countries, and Scandinavian civil law firms assume the highest level of CSR. This link between legal origins and CSR seems to be explained by differences in ex post shareholder litigation risk as well as in stakeholder regulations and state involvement in the economy. Evidence from quasi-natural experiments such as scandals and natural disasters suggests that civil law firms are more responsive to CSR shocks than common law firms, and such responsiveness is not likely driven by declining market shares following the shock.

Journal ArticleDOI
TL;DR: In this paper, the authors present four popular technologies: electronic monitoring systems, robots, teleconferencing, and wearable computing devices to illustrate technology's impact on work, work systems, and organizations.
Abstract: Given the rapid advances and the increased reliance on technology, the question of how it is changing work and employment is highly salient for scholars of organizational psychology and organizational behavior (OP/OB). This article attempts to interpret the progress, direction, and purpose of current research on the effects of technology on work and organizations. After a review of key breakthroughs in the evolution of technology, we consider the disruptive effects of emerging information and communication technologies. We then examine numbers and types of jobs affected by developments in technology, and how this will lead to significant worker dislocation. To illustrate technology's impact on work, work systems, and organizations, we present four popular technologies: electronic monitoring systems, robots, teleconferencing, and wearable computing devices. To provide insights regarding what we know about the effects of technology for OP/OB scholars, we consider the results of research conducted from four ...

Journal ArticleDOI
TL;DR: The organizational field is defined as "a community of organizations that partakes of a common meaning system and whose participants interact more frequently and fatefully with one another than with actors outside the field".
Abstract: The central construct of neo-institutional theory has been the organizational field. Strictly speaking, the field is 'a community of organizations that partakes of a common meaning system and whose participants interact more frequently and fatefully with one another than with actors outside the field.' It may include constituents such as the government, critical exchange partners, sources of funding, professional and trade associations, special interest groups, and the general public – any constituent which imposes a coercive, normative or mimetic influence on the organization. But the concept of the organizational field encompasses much more than simply a discrete list of constituents; and the ways in which the institutional literature has sought to capture this complexity has evolved over the past decades, and continues to evolve. In this chapter, we present this evolution, discussing the past, present and future of this important construct. We illustrate its early conceptualization and present its progression in a way that invites scholars to both consider their work within this historical trajectory and contribute to its further development.

Journal ArticleDOI
TL;DR: In this article, the authors proposed a new variant of the traveling salesman problem (TSP) called the TSP with drone, and formulated this problem as an MIP model and developed several fast route first-cluster second heuristics based on local search and dynamic programming.
Abstract: The fast and cost-effcient home delivery of goods ordered online is logistically challenging. Many companies are looking for new ways to cross the last-mile to their customers. One technology-enabled opportunity that recently has received much attention is the use of a drone to support deliveries. An innovative last-mile delivery concept in which a truck collaborates with a drone to make deliveries gives rise to a new variant of the traveling salesman problem (TSP) that we call the TSP with drone. In this paper, we formulate this problem as an MIP model and develop several fast route first-cluster second heuristics based on local search and dynamic programming. We prove worst-case approximation ratios for the heuristics and test their performance by comparing the solutions to the optimal solutions for small instances. In addition, we apply our heuristics to several artificial instances with different characteristics and sizes. Our numerical analysis shows that substantial savings are possible with this concept in comparison to truck-only delivery.
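
A highly simplified "route first, cluster second" sketch conveys the idea behind such heuristics: build a truck tour with a nearest-neighbor rule, then greedily move customers onto drone sorties launched from the preceding truck stop and recovered at the following one. The distances, range limit, and greedy rule below are toy assumptions, not the paper's MIP model or its dynamic-programming heuristics.

```python
# Toy "route first, cluster second" sketch for a truck-and-drone delivery tour.
# Not the paper's formulation; a greedy illustration of the idea only.
import math
import random

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbour_tour(depot, customers):
    """Route first: build a truck tour with a nearest-neighbour heuristic."""
    tour, remaining, current = [depot], list(customers), depot
    while remaining:
        nxt = min(remaining, key=lambda c: dist(current, c))
        tour.append(nxt)
        remaining.remove(nxt)
        current = nxt
    tour.append(depot)
    return tour

def assign_drone(tour, drone_range):
    """Cluster second (greedy): offload a customer to a drone sortie when that
    shortens the truck route and the out-and-back leg fits the drone's range."""
    truck, drone_sorties = list(tour), []
    i = 1
    while i < len(truck) - 1:
        prev_stop, cust, next_stop = truck[i - 1], truck[i], truck[i + 1]
        sortie_length = dist(prev_stop, cust) + dist(cust, next_stop)
        saving = sortie_length - dist(prev_stop, next_stop)
        if saving > 0 and sortie_length <= drone_range:
            drone_sorties.append((prev_stop, cust, next_stop))
            del truck[i]   # the drone serves this customer; the recovery stop
                           # (now at index i) stays on the truck route
        i += 1
    return truck, drone_sorties

random.seed(0)
depot = (0.0, 0.0)
customers = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(12)]
truck_route, sorties = assign_drone(nearest_neighbour_tour(depot, customers), drone_range=6.0)
truck_length = sum(dist(a, b) for a, b in zip(truck_route, truck_route[1:]))
print(f"truck stops: {len(truck_route) - 2}, drone sorties: {len(sorties)}, "
      f"truck route length: {truck_length:.1f}")
```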

Journal ArticleDOI
TL;DR: This paper conducts a large-sample, quasi-experimental evaluation of R&D subsidies and finds that an award approximately doubles the probability that a firm receives subsequent venture capital and has large, positive impacts on patenting and commercialization.
Abstract: Governments regularly subsidize new ventures to spur innovation. This paper conducts the first large-sample, quasi-experimental evaluation of R&D subsidies. I use data on ranked applicants to the U.S. Department of Energy’s SBIR grant program. An award approximately doubles the probability that a firm receives subsequent venture capital and has large, positive impacts on patenting and commercialization. These effects are stronger for more financially constrained firms. Certification, where the award contains information about firm quality, likely does not explain the grant effect on funding. Instead, the grants seem to reduce investor uncertainty by funding technology prototyping.
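
The ranked-applicant design can be illustrated with a regression-discontinuity style comparison around the award cutoff. The data below are simulated, and the local-window linear specification is a generic sketch, not the paper's exact SBIR estimation.

```python
# Sketch of a regression-discontinuity style comparison around an award cutoff:
# outcomes just above and just below the cutoff are compared, allowing a linear
# trend in rank on each side. Simulated data, not the paper's SBIR sample.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 1000
rank = rng.uniform(-10, 10, n)             # centered rank: >= 0 means above the cutoff
awarded = (rank >= 0).astype(int)
# Simulated outcome: probability of later VC funding jumps by 0.15 at the cutoff
p = 0.20 + 0.15 * awarded + 0.005 * rank
vc = rng.binomial(1, p)

data = pd.DataFrame({"vc": vc, "rank": rank, "awarded": awarded})
window = data[data["rank"].abs() <= 5]     # local window around the cutoff
fit = smf.ols("vc ~ awarded + rank + awarded:rank", data=window).fit()
print(f"estimated jump at the cutoff: {fit.params['awarded']:.3f}")
```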