
Showing papers in "Organizational Research Methods in 2012"


Journal ArticleDOI
TL;DR: In this article, the authors extend previous recommendations for improved control variable (CV) practice in management research by mapping the objectives for using statistical control to recommendations for researc...
Abstract: The authors extend previous recommendations for improved control variable (CV) practice in management research by mapping the objectives for using statistical control to recommendations for researc...

453 citations


Journal ArticleDOI
TL;DR: The use of Bayesian methods for data analysis is creating a revolution in fields ranging from genetics to marketing as mentioned in this paper, yet, results of a literature review, including more than 10,000 articles publi...
Abstract: The use of Bayesian methods for data analysis is creating a revolution in fields ranging from genetics to marketing. Yet, results of our literature review, including more than 10,000 articles publi...

446 citations


Journal ArticleDOI
TL;DR: In this paper, the authors show that even modest departures from perfect convergent validity can result in substantial differences in the magnitudes of findings, creating challenges for the accumulation and interpretation of research.
Abstract: Using different measures of constructs in research to develop robust evidence of relationships and effects is seen as good methodological practice. This assumes these measures possess high convergent validity. However, proxies—alternative measures of the same construct—are rarely perfectly convergent. Although some convergence is preferred to none, this study demonstrates that even modest departures from perfect convergent validity can result in substantial differences in the magnitudes of findings, creating challenges for the accumulation and interpretation of research. Using data from published research, the authors find that substantial differences in findings between studies using desired and proxy variables occur even at levels of convergent validity as high as r = .85. Implications of using measures with less-than-ideal convergent validity for the interpretability of research results are examined. Convergent validities above r = .70 are recommended, whereas those below r = .50 should be avoided. Res...

375 citations
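The attenuation described in the abstract above is easy to reproduce in a small simulation. The sketch below is illustrative only — the linear model, effect sizes, and sample size are assumptions, not the authors' data. Under this simple model, substituting a proxy whose convergent validity is r multiplies the observed correlation with the outcome by roughly r.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
r_c = 0.85     # assumed convergent validity between desired measure and proxy
r_xy = 0.40    # assumed true correlation of desired measure with outcome

x = rng.standard_normal(n)                                     # desired measure
p = r_c * x + np.sqrt(1 - r_c**2) * rng.standard_normal(n)     # proxy
y = r_xy * x + np.sqrt(1 - r_xy**2) * rng.standard_normal(n)   # outcome

obs_desired = np.corrcoef(x, y)[0, 1]   # close to 0.40
obs_proxy = np.corrcoef(p, y)[0, 1]     # attenuated: corr(P, Y) = r_c * corr(X, Y)
```

Even at the recommended r = .85 convergent validity, the proxy study reports roughly .34 instead of .40, which is the kind of between-study discrepancy the article documents.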


Journal ArticleDOI
TL;DR: Additive transformations are often offered as a remedy for the common problem of collinearity in moderated regression and polynomial regression analysis, as discussed by the authors.
Abstract: Additive transformations are often offered as a remedy for the common problem of collinearity in moderated regression and polynomial regression analysis. As the authors demonstrate in this article,...

353 citations
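The most common additive transformation in this setting is mean-centering. The hedged sketch below (invented data, not the article's demonstration) shows the standard result: centering lowers the correlation between a predictor and its product term, yet the interaction coefficient itself is algebraically unchanged.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
x = rng.normal(3.0, 1.0, n)   # nonzero means make the collinearity visible
z = rng.normal(2.0, 1.0, n)
y = 0.5 * x + 0.3 * z + 0.2 * x * z + rng.standard_normal(n)

def fit(x, z, y):
    """OLS for y = b0 + b1*x + b2*z + b3*x*z; returns the coefficient vector."""
    X = np.column_stack([np.ones(len(y)), x, z, x * z])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

b_raw = fit(x, z, y)
b_ctr = fit(x - x.mean(), z - z.mean(), y)

r_raw = np.corrcoef(x, x * z)[0, 1]       # sizeable with raw scores
xc, zc = x - x.mean(), z - z.mean()
r_ctr = np.corrcoef(xc, xc * zc)[0, 1]    # near zero after centering
# b_raw[3] and b_ctr[3] are the same interaction estimate under both codings.
```

The collinearity "remedy" changes the lower-order coefficients and the predictor intercorrelations, but not the highest-order term — the models span the same column space.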


Journal ArticleDOI
TL;DR: In this article, the authors introduce advances in meta-analytic techniques from the medical and related sciences for a comprehensive assessment and evaluation of publication bias in employment interview validities, using multiple methods, including contour-enhanced funnel plots, trim and fill, Egger's test of the intercept, Begg and Mazumdar's rank correlation, meta-regression, cumulative meta-analysis, and selection models.
Abstract: Publication bias poses multiple threats to the accuracy of meta-analytically derived effect sizes and related statistics. Unfortunately, a review of the literature indicates that unlike meta-analytic reviews in medicine, research in the organizational sciences tends to pay little attention to this issue. In this article, the authors introduce advances in meta-analytic techniques from the medical and related sciences for a comprehensive assessment and evaluation of publication bias. The authors illustrate their use on a data set on employment interview validities. Using multiple methods, including contour-enhanced funnel plots, trim and fill, Egger’s test of the intercept, Begg and Mazumdar’s rank correlation, meta-regression, cumulative meta-analysis, and selection models, the authors find limited evidence of publication bias in the studied data.

322 citations
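Of the methods listed, Egger's test of the intercept is the most straightforward to sketch. The following is a minimal illustration on simulated, bias-free data — the data and the plain-OLS formulation are assumptions; applied meta-analyses would normally use a dedicated package.

```python
import numpy as np

def eggers_test(effects, ses):
    """Egger's regression test for funnel-plot asymmetry: regress the
    standardized effect (effect/SE) on precision (1/SE). An intercept far
    from zero signals small-study effects consistent with publication bias."""
    effects, ses = np.asarray(effects, float), np.asarray(ses, float)
    z = effects / ses
    X = np.column_stack([np.ones_like(ses), 1.0 / ses])
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ beta
    s2 = resid @ resid / (len(z) - 2)
    se_intercept = np.sqrt(s2 * np.linalg.inv(X.T @ X)[0, 0])
    return beta[0], beta[0] / se_intercept

# Simulated symmetric funnel: effects scatter around a common mean of 0.3
# with no relation between effect size and precision (no publication bias).
rng = np.random.default_rng(2)
ses = rng.uniform(0.05, 0.5, 60)
effects = 0.3 + rng.standard_normal(60) * ses
intercept, t_stat = eggers_test(effects, ses)   # t near zero: no asymmetry
```

With bias-free data the intercept's t statistic stays small; censoring small, nonsignificant effects before running the test would push it away from zero.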


Journal ArticleDOI
TL;DR: In this paper, a bias-corrected bootstrap confidence interval for testing a specific mediation effect in a complex latent variable model is presented, and the procedure is extended to construct a BC bootstrap interval for the difference between two specific mediation effects.
Abstract: This teaching note starts with a demonstration of a straightforward procedure using Mplus Version 6 to produce a bias-corrected (BC) bootstrap confidence interval for testing a specific mediation effect in a complex latent variable model. The procedure is extended to constructing a BC bootstrap confidence interval for the difference between two specific mediation effects. The extended procedure not only tells whether the strengths of any direct and mediation effects or any two specific mediation effects in a latent variable model are significantly different but also provides an estimate and a confidence interval for the difference. However, the Mplus procedures do not allow the estimation of a BC bootstrap confidence interval for the difference between two standardized mediation effects. This teaching note thus demonstrates the LISREL procedures for constructing BC confidence intervals for specific standardized mediation effects and for comparing two standardized mediation effects. Finally, procedures com...

299 citations
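The Mplus and LISREL procedures themselves cannot be reproduced here, but the underlying bias-corrected (BC) bootstrap logic can be sketched for a simple observed-variable mediation model. All data and model choices below are invented for illustration; latent-variable models would require an SEM package.

```python
import numpy as np
from statistics import NormalDist

def indirect_effect(x, m, y):
    """a*b indirect effect from two OLS fits: a from M ~ X, b from Y ~ X + M."""
    a = np.polyfit(x, m, 1)[0]
    X2 = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X2, y, rcond=None)[0][2]
    return a * b

def bc_bootstrap_ci(x, m, y, n_boot=2000, alpha=0.05, seed=0):
    """Bias-corrected percentile bootstrap CI for the indirect effect."""
    rng = np.random.default_rng(seed)
    est = indirect_effect(x, m, y)
    boots = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, len(x), len(x))
        boots[i] = indirect_effect(x[idx], m[idx], y[idx])
    nd = NormalDist()
    z0 = nd.inv_cdf((boots < est).mean())            # bias-correction constant
    lo_p = nd.cdf(2 * z0 + nd.inv_cdf(alpha / 2))    # shifted percentiles
    hi_p = nd.cdf(2 * z0 + nd.inv_cdf(1 - alpha / 2))
    return est, np.quantile(boots, lo_p), np.quantile(boots, hi_p)

rng = np.random.default_rng(3)
n = 300
x = rng.standard_normal(n)
m = 0.5 * x + rng.standard_normal(n)   # true a = 0.5
y = 0.4 * m + rng.standard_normal(n)   # true b = 0.4, indirect effect = 0.20
est, ci_lo, ci_hi = bc_bootstrap_ci(x, m, y)
```

The same machinery extends to the difference between two indirect effects: bootstrap the difference instead of a single product and apply the identical BC percentile adjustment.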


Journal ArticleDOI
TL;DR: This article proposes eight steps of synthesizing existing qualitative case study findings to build theory by drawing on an understanding of research synthesis as the interpretation of qualitative evidence from a postpositivistic perspective.
Abstract: The purpose of this article is to provide the research design of a meta-synthesis of qualitative case studies. The meta-synthesis aims at building theory out of primary qualitative case studies that have not been planned as part of a unified multisite effect. By drawing on an understanding of research synthesis as the interpretation of qualitative evidence from a postpositivistic perspective, this article proposes eight steps of synthesizing existing qualitative case study findings to build theory. An illustration of the application of this method in the field of dynamic capabilities is provided. After enumerating the options available to meta-synthesis researchers, the potential challenges as well as the prospects of this research design are discussed.

278 citations


Journal ArticleDOI
TL;DR: In this paper, the authors demystify the key tenets of GT, discuss the problematic impacts of adopting an a la carte approach to GT, and draw attention to GT as a rigorous method for business research.
Abstract: The grounded theory method (GT) remains elusive and misunderstood by many—even those who advocate its use. In practice, many research studies cite the use of GT but merely apply certain a la carte aspects or jargon of the method while not actually incorporating the fundamental principles of the methodology. Consequently, the purpose of this article is fourfold: (a) to demystify the key tenets of GT, (b) to discuss the problematic impacts of adopting an a la carte approach to GT, (c) to draw attention to GT as a rigorous method for business research, and (d) to advocate for the increased use of GT by more researchers where appropriate. Throughout the article, the authors use the example of a recently completed GT study by the lead author to highlight the multiple dimensions of GT and how they all work together.

204 citations


Journal ArticleDOI
TL;DR: The authors found that mixed methods articles tend to receive more citations than monomethod articles do; the average citations received per year and the cumulative sum of citations are both higher for articles reporting studies using mixed methods than for monomethod research designs.
Abstract: Mixed methods research is becoming an increasingly popular approach in several areas, and it has long been called for as an approach for providing a better understanding of research problems. However, there have been no assessments as to whether such research, which may be timely and expensive, has more impact on the field. The main purpose of this article is to determine whether the use of a mixed methods approach is a predictor of article impact. The analysis is based on articles published in the Strategic Management Journal from 1980 to 2006. The findings show that mixed methods articles tend to receive more citations than monomethod articles do. The average citations received per year and the cumulating sum of citations are both higher for articles reporting studies using mixed methods than for monomethod research designs. Furthermore, a content analysis of the mixed methods articles identified shows that there are different types of studies based on several characteristics (purpose, priority, impleme...

195 citations


Journal ArticleDOI
TL;DR: In this article, the authors explore the use of control variables in management research, as reflected in both macro and micro management studies published in four leading management journals and find that it is not at all uncommon for the control variables included in studies to account for more variance than the main effects.
Abstract: This study explores the use of control variables in management research, as reflected in both macro and micro management studies published in four leading management journals. Based on a review of 812 empirical articles published from 2005 to 2009—a much larger sample than was employed by earlier studies of control variables—the authors make several important observations. One key finding is that, given it is not at all uncommon for the control variables included in studies to account for more variance than the main effects, it is surprising how infrequently adequate justification for inclusion is provided. In addition, even when justification is provided, often no expectation of the nature of the relationship between control and dependent variables is offered. The authors also make several recommendations for both authors and reviewers. The most important may be to avoid simple mimicry of others and think more deeply about the theoretical foundation for the control variables included in empirical studies.

194 citations
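The finding that control variables can account for more variance than the main effects is easy to see with a hierarchical-regression sketch. Everything below — the data, the coefficients, and the variable labels — is an invented illustration, not the authors' review data.

```python
import numpy as np

def r_squared(X, y):
    """R-squared from an OLS fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(4)
n = 1000
controls = rng.standard_normal((n, 3))   # e.g., age, tenure, firm size (hypothetical)
focal = rng.standard_normal(n)           # the theorized predictor
y = controls @ np.array([0.4, 0.3, 0.3]) + 0.15 * focal + rng.standard_normal(n)

r2_controls = r_squared(controls, y)                            # step 1: controls only
r2_full = r_squared(np.column_stack([controls, focal]), y)      # step 2: add focal effect
delta_r2 = r2_full - r2_controls   # incremental variance from the focal predictor
```

Here the controls explain roughly 25% of the variance while the focal effect adds under 2% — exactly the pattern that makes unjustified control inclusion so consequential for interpretation.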


Journal ArticleDOI
TL;DR: Measurement equivalence/invariance (ME/I) is a condition that should be met before meaningful comparisons of survey results across groups can be made, as discussed by the authors.
Abstract: Measurement equivalence/invariance (ME/I) is a condition that should be met before meaningful comparisons of survey results across groups can be made. As an alternative to the likelihood ratio test...

Journal ArticleDOI
TL;DR: Accumulated evidence suggests that many behaviors are driven by processes operating outside of awareness, and an array of implicit measures to capture such processes can be found in the social and cognitive psychology literature.
Abstract: Accumulated evidence from social and cognitive psychology suggests that many behaviors are driven by processes operating outside of awareness, and an array of implicit measures to capture such proc...

Journal ArticleDOI
Irene Pollach
TL;DR: The article demonstrates how these techniques can be applied to compare letters to shareholders from two different years and discusses the strengths and limitations of corpus linguistics for management and organization studies.
Abstract: Corpus linguistics studies real-life language use on the basis of a text corpus, drawing on both quantitative and qualitative text analysis techniques. This article seeks to bridge the gap between the social sciences and linguistics by introducing the techniques of corpus linguistics to the field of computer-aided text analysis. The article first discusses the differences between corpus linguistics and computer-aided text analysis, which is divided into computer-aided content analysis and computer-aided interpretive textual analysis. It then outlines the techniques of corpus linguistics for exploring textual data. In an exemplary analysis of letters to shareholders, the article demonstrates how these techniques can be applied to compare letters to shareholders from two different years. The article concludes with a discussion of the strengths and limitations of corpus linguistics for management and organization studies.
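A core corpus-linguistics technique for comparing two corpora — such as letters to shareholders from two different years — is a keyness statistic. The sketch below uses Dunning's log-likelihood, a standard choice for this task, though it is not necessarily the article's exact procedure; the two toy "letters" are invented.

```python
import math
from collections import Counter

def keyness(word, corpus_a, corpus_b):
    """Dunning log-likelihood (G2) for a word's frequency difference between
    two tokenized corpora; a larger G2 means stronger keyness."""
    fa, fb = Counter(corpus_a), Counter(corpus_b)
    a, b = fa[word], fb[word]
    na, nb = len(corpus_a), len(corpus_b)
    e_a = na * (a + b) / (na + nb)   # expected counts if usage did not differ
    e_b = nb * (a + b) / (na + nb)
    g2 = 0.0
    for obs, exp in ((a, e_a), (b, e_b)):
        if obs > 0:
            g2 += 2 * obs * math.log(obs / exp)
    return g2

# Invented toy tokens standing in for two years of shareholder letters
letter_2008 = "risk risk crisis liquidity capital risk shareholders".split()
letter_2005 = "growth growth shareholders capital dividend growth value".split()
g2_risk = keyness("risk", letter_2008, letter_2005)       # high keyness
g2_capital = keyness("capital", letter_2008, letter_2005) # equal usage, G2 = 0
```

Ranking all words by G2 surfaces the vocabulary that distinguishes the two years, which is the quantitative half of the mixed quantitative/qualitative workflow the article describes.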

Journal ArticleDOI
TL;DR: In this article, the authors identify researcher choices related to the use of photographs in organizational research, clarify the advantages and disadvantages of these choices, and discuss ethical and other special considerations of photographs.
Abstract: Despite calls for more visual methodologies in organizational research, the use of photographs remains sparse. Organizational research could benefit from the inclusion of photographs to track contemporary change processes in an organization and change processes over time, as well as to incorporate diverse voices within organizations, to name a few advantages. To further understanding, the authors identify researcher choices related to the use of photographs in organizational research, clarify the advantages and disadvantages of these choices, and discuss ethical and other special considerations of the use of photographs. They highlight several organizational areas of research, primarily related to the management discipline, that could benefit from the inclusion of photographs. Finally, the authors describe how they used photographs in a study of one organization and specifically how their intended research design with photographs changed over the course of the study as well as how photographs helped to de...

Journal ArticleDOI
TL;DR: In this paper, the authors examined the degree to which meta-analyses in the organizational sciences transparently report procedures, decisions, and judgment calls by systematically reviewing all meta-analyses published between 1995 and 2008 in 11 top journals that publish meta-analyses in industrial and organizational psychology and organizational behavior.
Abstract: The authors examined the degree to which meta-analyses in the organizational sciences transparently report procedures, decisions, and judgment calls by systematically reviewing all (198) meta-analyses published between 1995 and 2008 in 11 top journals that publish meta-analyses in industrial and organizational psychology and organizational behavior. The authors extracted information on 54 features of each meta-analysis. On average, the meta-analyses in the sample provided 52.8% of the information needed to replicate the meta-analysis or to assess its validity and 67.6% of the information considered to be most important according to expert meta-analysts. More recently published meta-analyses exhibited somewhat more transparent reporting practices than older ones did. Overall transparency of reporting (but not reporting of the most important items) was associated with higher ranked journals; transparency was not significantly related to number of citations. The authors discuss the implications of inadequate...

Journal ArticleDOI
TL;DR: Assessment of noncognitive constructs in organizational research and practice is challenging because of response biases that can distort test scores; researchers must also deal with time constraints.
Abstract: Assessment of noncognitive constructs in organizational research and practice is challenging because of response biases that can distort test scores. Researchers must also deal with time constraint...

Journal ArticleDOI
TL;DR: New stochastic actor-based models for two-mode networks that may be adopted to redress the limitations of current analytical strategies are introduced.
Abstract: Two-mode networks are used to describe dual patterns of association between distinct social entities through their joint involvement in categories, activities, issues, and events. In empirical organizational research the analysis of two-mode networks is typically accomplished either by (i) decomposition of the dual structure into its two unimodal components defined in terms of indirect relations between entities of the same kind, or (ii) direct statistical analysis of individual two-mode dyads. Both strategies are useful, but neither is fully satisfactory. In this paper we introduce newly developed stochastic actor-based models for two-mode networks that may be adopted to redress the limitations of current analytical strategies. We specify and estimate the model in the context of data we have collected on the dual association between software developers and software problems observed during a complete release cycle of an open source software project. We discuss the general methodological implications of our models for organizational research based on the empirical analysis of two-mode networks.
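Strategy (i) above — decomposing a two-mode network into its unimodal projections — can be sketched in a few lines; the toy developer-by-problem matrix below is invented. The closing comment notes why the paper treats this strategy as not fully satisfactory.

```python
import numpy as np

# Rows: developers; columns: software problems they worked on (toy data)
A = np.array([
    [1, 1, 0, 0],   # developer 0 worked on problems 0 and 1
    [0, 1, 1, 0],   # developer 1 worked on problems 1 and 2
    [0, 0, 1, 1],   # developer 2 worked on problems 2 and 3
])

dev_projection = A @ A.T        # developers linked via shared problems
problem_projection = A.T @ A    # problems linked via shared developers
# Off-diagonal entries count shared affiliations; the diagonal holds degrees.
# The projection is lossy: distinct two-mode matrices can share the same
# unimodal projections, which is one limitation stochastic actor-based
# models of the full two-mode structure avoid.
```

For example, `dev_projection[0, 1] == 1` records the single problem developers 0 and 1 share, while developers 0 and 2 are unconnected in the projection despite both being active in the underlying two-mode data.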

Journal ArticleDOI
TL;DR: This article argues that organizational scholars, who strive to understand dynamic behavior in a complex context, are particularly in need of the support computational models offer, and provides a tutorial in model building and simulation.
Abstract: Theorists in management and organizational science rarely use computational modeling to support theoretical development or refinement, particularly at the micro level of analysis. This article argues that organizational scholars, who strive to understand dynamic behavior in a complex context, are particularly in need of the support computational models offer. Moreover, organizational scholars can build on (a) the plethora of informal theories extant in the literature and (b) the computational architectures and model building platforms developed in recent years. To increase the number of organizational scholars building and evaluating computational models, the article provides a tutorial in model building and simulation. Specifically, a new computational model is built and assessed. Surprising realizations emerge in the process. There is also an extensive section on model evaluation involving empirical observations.

Journal ArticleDOI
TL;DR: In this paper, the authors argue that minor methodological changes within an unreformed epistemology will be as unhelpful as emotive exaggerations of the ill effects of null hypothesis significance testing.
Abstract: The purpose of this article is to propose possible solutions to the methodological problem of null hypothesis significance testing (NHST), which is framed as deeply embedded in the institutional structure of the social and organizational sciences. The core argument is that, for the deinstitutionalization of statistical significance tests, minor methodological changes within an unreformed epistemology will be as unhelpful as emotive exaggerations of the ill effects of NHST. Instead, several institutional-epistemological reforms affecting cultural-cognitive, normative, and regulative processes and structures in the social sciences are necessary and proposed in this article. In the conclusion, the suggested research reforms, ranging from greater emphasis on inductive and abductive reasoning to statistical modeling and Bayesian epistemology, are classified according to their practical importance and the time horizon expected for their implementation. Individual-level change in researchers' use of NHST is unli...

Journal ArticleDOI
TL;DR: This paper showed that principal components analysis (PCA) produces an additional spurious dimension despite Likert scaling procedures (i.e., reverse scoring and excluding items with low item-total correlations to improve scale reliability).
Abstract: Can one accurately infer the dimensionality of constructs such as emotions (i.e., happy–sad), work–family spillover (i.e., positive–negative), or job performance (i.e., organizational citizenship behaviors and counterproductive work behaviors) with commonly used methods? In this article, the authors show how the misapplication of commonly used methods (e.g., factor analysis [FA]) to data originating from an ideal point response process (i.e., self-reported typical behaviors: attitudes, personality, emotions, or interests) can lead to incorrect theoretical and statistical inferences. The authors demonstrate that principal components analysis (PCA) produces an additional spurious dimension despite Likert scaling procedures (i.e., reverse scoring and excluding items with low item-total correlations to improve scale reliability). This incorrectly leads to a conclusion against bipolarity. The authors illustrate the substantive implications for organizational research with emotions data showing that the misappl...

Journal ArticleDOI
TL;DR: For nearly three decades, the predominant approach to modeling the latent structure of multitrait-multimethod (MTMM) data in organizational research has involved confirmatory factor analysis (CFA).
Abstract: For nearly three decades, the predominant approach to modeling the latent structure of multitrait–multimethod (MTMM) data in organizational research has involved confirmatory factor analysis (CFA)....

Journal ArticleDOI
TL;DR: In this article, the authors present and illustrate the research staff ride, the re-creation of a historical event for the purpose of understanding organizational phenomena through observation, reflection, and discussion.
Abstract: The authors present and illustrate the research staff ride—the re-creation of a historical event for the purpose of understanding organizational phenomena through observation, reflection, and discussion. Staff rides make unique contributions to research through the independent analysis of events outside organizations by content experts who collectively and concurrently reflect on retrospective data while experiencing context. Staff rides involve the examination of ordered sequences of contextually bound events and, thus, promote participants' understanding of the dependence between past and future observations. In this article, the authors elaborate on the types of data, data collection procedures, and data analyses for research staff rides. Importantly, they discuss potential strengths and challenges associated with staff rides in qualitative research, along with ways to address these challenges.

Journal ArticleDOI
TL;DR: In this paper, the most important neuroeconomics techniques are described, along with four specific examples of how these methods can greatly benefit theory development, testing, and pruning in the organizational sciences.
Abstract: Organizational research has seen several calls for the incorporation of neuroscience techniques. The aim of this article is to describe the methods of neuroeconomics and the promises of applying these methods to organizational research problems. To this end, the most important neuroeconomics techniques will be described, along with four specific examples of how these methods can greatly benefit theory development, testing, and pruning in the organizational sciences. The article concludes by contrasting the benefits and limitations of neuroeconomics and by discussing implications for future research.

Journal ArticleDOI
TL;DR: Earlier research using confirmatory factor analysis (CFA) suggests that most variance in job performance ratings is not attributable to ratee main effects, a conclusion the authors revisit in this article.
Abstract: Earlier research using confirmatory factor analysis (CFA) suggests that most variance in job performance ratings is not attributable to ratee main effects. In this article, the authors point out se...

Journal ArticleDOI
TL;DR: This article introduces functional data analysis (FDA), a set of statistical tools developed to study information on curves or functions to highlight the potential of FDA to managerial science.
Abstract: In this article, we introduce functional data analysis (FDA), a set of statistical tools developed to study information on curves or functions. We review fundamentals of the methodology along with previous applications in other business disciplines to highlight the potential of FDA to managerial science. We provide details of the three most commonly used FDA techniques, including functional principal component analysis, functional regression, and functional clustering, and demonstrate each by investigating measures of firm financial performance from a panel data set of the 1,000 largest U.S. firms by revenues from 1992 to 2008. We compare results obtained from FDA with hierarchical linear modeling and conclude by outlining ideas for future micro- and macro-level organizational research incorporating this methodology.
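When curves are observed on a common grid, functional principal component analysis reduces to an SVD of the centered curve matrix. The sketch below uses invented performance trajectories, not the Fortune 1,000 panel from the article, and ignores the basis-smoothing step a full FDA treatment would include.

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 50)   # common evaluation grid
n_firms = 200
# Toy trajectories: a shared trend plus firm-specific scores on two
# smooth modes of variation (assumed data-generating process).
scores = rng.standard_normal((n_firms, 2))
curves = (0.5 * t
          + np.outer(scores[:, 0], np.sin(np.pi * t))
          + 0.3 * np.outer(scores[:, 1], np.cos(2 * np.pi * t))
          + 0.05 * rng.standard_normal((n_firms, len(t))))

centered = curves - curves.mean(axis=0)
# SVD of the centered sample = discretized functional PCA
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)     # variance share per functional component
fpc1 = Vt[0]                        # first functional principal component (a curve)
fpc_scores = centered @ Vt[:2].T    # firm-level scores on the first two FPCs
```

The two planted modes dominate the spectrum, and the recovered `fpc_scores` are the low-dimensional firm summaries that functional regression or clustering would then operate on.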

Journal ArticleDOI
TL;DR: In this article, the authors examined two methods for detecting differential item functioning (DIF): Raju, van der Linden, and Fleer's 1995 differential functioning of items and tests (DFIT) procedure and Thissen, Steinberg, and Wainer's 1988 likelihood ratio test (LRT).
Abstract: This study examined two methods for detecting differential item functioning (DIF): Raju, van der Linden, and Fleer’s 1995 differential functioning of items and tests (DFIT) procedure and Thissen, Steinberg, and Wainer’s 1988 likelihood ratio test (LRT). The major research questions concerned which test provides the best balance of Type I errors and power and if the tests differ in terms of detecting different types of DIF. Monte Carlo simulations were conducted to address these questions. Equal and unequal sample size conditions were fully crossed with test lengths of 10 and 20 items. In addition, α and β parameters were manipulated in order to simulate DIF. Findings indicate that DFIT and LRT both had acceptable Type I error rates when sample sizes were equal but that DFIT produced too many Type I errors when sample sizes were unequal. Overall, the LRT exhibited greater power to detect both α and β parameter DIF than did DFIT. However, DFIT was more powerful than LRT when the last two β parameters had DI...

Journal ArticleDOI
TL;DR: In this article, a web search engine-based method, called retrospective relatedness reconstruction (3R), is proposed for collecting approximated historical data of temporally changing adaptive social networks.
Abstract: Examination of temporally changing adaptive social networks has been difficult given the need for extensive and usually real-time data collection. Building from interdisciplinary advances, the authors propose a web search engine–based method (called retrospective relatedness reconstruction or 3R) for collecting approximated historical data of temporally changing adaptive social networks. As quantifying relatedness among people in social networks leads to difficulty in assigning proper weights to relationship ties, 3R offers a means for assessing relatedness between people over time. Additionally, 3R can be applied beyond people relatedness to include word associations. To illustrate these two novel contributions, the authors reconstructed the temporal evolution of a social network from 2005 to 2009 of 92 individuals (key leaders) related to the U.S. financial crisis and also examined the temporal evolution of social sentiment (i.e., fear, shame, blame, confidence) related to the same 92 individuals. We fo...
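The article's exact 3R scoring is not reproduced here, but a well-known way to turn search hit counts into a relatedness measure is the Normalized Google Distance, which the sketch below applies to hypothetical hit counts (all numbers are invented, not real search data).

```python
import math

def normalized_distance(f_x, f_y, f_xy, n_total):
    """Normalized Google Distance-style relatedness from search hit counts:
    smaller distance = more related. f_x, f_y are hits for each term alone,
    f_xy is hits for both terms together, n_total is the index size."""
    lx, ly, lxy, ln = (math.log(v) for v in (f_x, f_y, f_xy, n_total))
    return (max(lx, ly) - lxy) / (ln - min(lx, ly))

# Hypothetical hit counts for two pairs of names (illustrative only)
d_related = normalized_distance(5_000, 8_000, 3_000, 10**10)   # frequent co-mentions
d_unrelated = normalized_distance(5_000, 8_000, 10, 10**10)    # rare co-mentions
```

Repeating such queries with date-restricted searches per year is one way a retrospective relatedness series could be approximated, which is the spirit of the 3R method; edge weights in the reconstructed network would then come from these distances.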

Journal ArticleDOI
TL;DR: In this article, the authors extend prior invariance work to demonstrate how a lack of invariance can obscure the effect size, direction, and statistical significance of mean differences, and how manifest mean differences can be exaggerated, reduced, or even switch in direction relative to latent mean differences.
Abstract: Many studies in the social and organizational sciences are concerned with estimating and testing between-group mean differences. Typically, responses to these scales are summed, and the summed score or the mean of the summed score is then compared across groups using standard statistical tests ( t tests) and described using d values. Even though there is a call for establishing measurement invariance prior to examining mean differences, very few studies seem to follow this advice. In this study, the authors extend prior invariance work to demonstrate how a lack of invariance can obscure the effect size, direction, and statistical significance of mean differences. In particular, the authors show how manifest mean differences can be exaggerated, reduced, or even switch in direction relative to latent mean differences. Thus, they emphasize the statistical and substantive consequences that may occur when the scales used to perform mean comparisons do not possess measurement invariance.
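How noninvariance can mask a real latent difference is easy to simulate. In the sketch below, the loadings, the latent group difference, and the intercept shift are all assumed values: a latent mean difference of 0.3 nearly disappears from the manifest scale means once two item intercepts differ across groups.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 50_000
latent_g1 = rng.standard_normal(n)         # group 1 latent mean = 0
latent_g2 = rng.standard_normal(n) + 0.3   # group 2 latent mean = +0.3
loadings = np.array([0.8, 0.7, 0.6, 0.7])  # invariant loadings, 4 items

def observed(latent, intercepts):
    """Simulate item responses: intercept + loading * latent + noise."""
    noise = rng.standard_normal((len(latent), 4)) * 0.5
    return intercepts + np.outer(latent, loadings) + noise

scale_g1 = observed(latent_g1, np.zeros(4)).sum(axis=1)
scale_g2_inv = observed(latent_g2, np.zeros(4)).sum(axis=1)
# Intercept noninvariance: group 2 scores two items 0.4 lower at equal latent levels
scale_g2_biased = observed(latent_g2, np.array([-0.4, -0.4, 0.0, 0.0])).sum(axis=1)

diff_invariant = scale_g2_inv.mean() - scale_g1.mean()   # ~0.3 * sum(loadings) = 0.84
diff_biased = scale_g2_biased.mean() - scale_g1.mean()   # shrunk toward zero
```

A larger intercept shift would flip the sign of the manifest difference outright — the "switch in direction" case the abstract warns about — which is why invariance testing should precede any t test or d value on summed scores.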

Journal ArticleDOI
TL;DR: To reconcile selection quality and diversity in simple selection decisions, De Corte, Sackett, and Lievens (2011) proposed Pareto-optimal predictor composites, which yield trade-offs between selection quality and diversity levels that cannot be improved simultaneously by any other composite; in this article, the approach is extended to complex selection situations.
Abstract: The current practice of personnel selection faces the challenge of reconciling the often competing goals of obtaining a high-quality as well as a diverse workforce. To address this challenge for simple selections, De Corte, Sackett, and Lievens (2011) propose using Pareto-optimal predictor composites. These composites yield trade-offs between selection quality and selection diversity levels that cannot be improved simultaneously by any other composite. The current article describes how these Pareto-optimal composites and trade-offs can be developed in complex selection situations, which are characterized by vacancies for at least two different positions and applicants that apply either for one or several of these open positions simultaneously. An analytic method that estimates the selection quality and adverse impact of these complex selection decisions is presented and implemented in a multi-objective optimization program, so as to obtain Pareto-optimal predictor composites. The resulting decision aid is...
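A minimal version of the quality-diversity Pareto trade-off can be sketched by sweeping composite weights over two predictors. The applicant pool, validities, and subgroup gaps below are invented, and the cited method's analytic estimation is replaced here by brute-force simulation over a weight grid.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 5000
group = rng.random(n) < 0.3                  # minority-group indicator (assumed 30%)
p1 = rng.standard_normal(n) - 0.8 * group    # higher validity, larger subgroup gap
p2 = rng.standard_normal(n) - 0.1 * group    # lower validity, smaller subgroup gap
perf = 0.5 * p1 + 0.2 * p2 + rng.standard_normal(n)   # assumed job performance

results = []
for w in np.linspace(0, 1, 21):              # composite weight on predictor 1
    score = w * p1 + (1 - w) * p2
    selected = score >= np.quantile(score, 0.8)        # top 20% selected
    quality = perf[selected].mean()                    # selection quality
    ai_ratio = selected[group].mean() / selected[~group].mean()  # adverse impact
    results.append((w, quality, ai_ratio))

# Pareto-optimal composites: no other weight gives both higher quality
# and a higher (better) adverse-impact ratio.
pareto = [r for r in results
          if not any(o[1] > r[1] and o[2] > r[2] for o in results)]
```

Sweeping the weight traces the frontier the article describes: shifting toward the high-validity predictor raises expected performance but depresses the adverse-impact ratio, and the `pareto` set holds the composites no other weighting dominates.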