Taylorizing business school research: On the ‘one best way’ performative effects of journal ranking lists
Summary
Introduction
- Examining the use and effects of journal lists is therefore important not simply for better understanding, or refining, how such metrics are devised (see Truex et al, 2011 for a critical review) but also, and more significantly, for appreciating and questioning their constitutive role in defining and policing the focus and direction of research activity.
- Detailed consideration is given to the establishment and use of the ABS ‘Quality Guide’ before the authors assess claims that its use brings cultural and economic benefits.
Heterogeneity at risk
- The image of sporting league tables, with their organization into divisions and top clubs, is frequently invoked to characterize and legitimize journal rankings.
- The aptness of the parallel is, however, limited and misleading.
- In competitive sports, team managers have broadly the same objective: in many ball sports, for example, the aim is to score goals without conceding.
- At the same time, it would be naïve to deny that editors are subject to reputational and commercial pressures, intensified by the competitive use of citation scores and JIFs as performance indicators, which may induce them inter alia to emulate the genre of scholarship published in top-tier journals.
- These can then be “cashed in” in the promotion and transfer markets, and in the selection process for submission of staff/outputs to research evaluation exercises.
Marginalizing innovation
- It would be surprising if many journal editors were able to resist all temptation, including material incentives, to game the system in order to increase their JIFs – for example, by favouring articles that are assessed to have “shooting star” potential (see above), or by inviting authors to consider including references to articles that have been published in the journal.
- Well-documented examples include the areas of sustainability (see Wells, 2010) and innovation studies (see Rafols et al, 2011).
- When challenged by a letter in OR/MS Today (Ackerman and 48 others, 2009), signed by academics from around the world, the response from the Editor was that, as far as Operations Research was concerned, non-mathematical OR was not OR and therefore not publishable.
- To the extent that a journal list broadly reflects, endorses and reproduces the hegemony of this neo-positivist tradition of scholarship, its most potent effect is to devalue and marginalize, if not exclude, heterodox forms of scholarly contribution, and thereby induce a homogenization of research activity.
- For a majority of senior managers – from research directors, through deans, to vice-chancellors – the ranking of their university’s departments, including its business school, in national evaluation exercises is the key indicator of its status in relation to other universities.
Performance anxiety
- Determining whether the outputs of staff are of adequate quality to be submitted to research evaluation exercises and, if so, which publications to select, presents university managers with a major challenge, especially when the reputation of the university and the business school is research-based, and so is heavily dependent upon the outcome.
- When faced with such troublesome decisions, the availability of a journal list is a seductive decision-making aid as it purports to provide an impersonal and objective basis for assessing the quality of research published by staff, and thus offers a basis for making and justifying difficult and divisive decisions.
- The appeal of a list is especially strong in contexts of diversity where there are multiple paradigmatic differences over what counts as quality - differences that can be as acute and delicate within business schools as they are between schools.
[Exhibit 2 about here]
- Some indications of how the 2008 Panel would operate were provided in its statement of Working Methods, published in January 2006 (see Exhibit 2).
- A lingering question remained, however, about how this undertaking could practically be fulfilled in the time available (a few months) by a Panel comprising fewer than twenty members.
- It could be inferred from its Working Methods statement that the Panel would rely upon its “professional judgment” to assess the remaining outputs; and that it would do this without reading the outputs “in detail”.
- For members of research committees, research directors and deans of business schools, a possible remedy for their performance anxieties (see above) presented itself in the form of the indicator of publication quality provided by a journal rankings list.
- Given this application of the ABS list, and its continuing use in preparations for the 2014 evaluation exercise, it is relevant to consider further how its construction and justification has contributed to exerting an influence over the range and direction of business and management research.
Managing by numbers: The ABS ‘Journal Quality Guide’
- In the run-up to the 2008 evaluation exercise, the ABS “Journal Quality Guide” became adopted as the “de facto standard” across UK business schools (Mingers, Watson and Scaparra, 2012: 3).
- Its relevance for this purpose was crudely signalled by Version 1 of the ABS list in which journals were grouped using a five point scale that directly mimicked the scale used by the Panel.
- In place of any principled articulation or defence of the ABS list – for example, by indicating how it might contribute to promoting more innovative research and scholarship - its architects simply anticipate and openly commend its managerial use.
- In other words, the list is commended as a handy, expedient tool for those charged with making onerous decisions about their colleagues’ careers, yet who are disinclined to prioritize time for reading and assessing the work itself, or seeking advice from subject specialists.
Performative Effects of the ABS List as a Policy Tool
- Their analysis necessarily relies upon available, aggregated information on the outcomes of the 2008 exercise profiled as the percentage of outputs awarded a particular grade from 4 to 1 and ungraded (see Exhibit 1).
- The implication is that academics should direct their research activity primarily to what is publishable in journals rather than through other media (e.g. monographs).
- This shows the vast range of research carried out within business and management that does not have the “ABS stamp” of recognition.
- There is evidence that the ABS list was used in making submissions: for instance, comparing the ABS-listed journals that were submitted with those that were not, 45% of those not submitted were ranked ABS 1*, while only 4% were ABS 4*.
Discussion: The ABS List and the Taylorization of Business School Research
- Rather like Taylor, any problems associated with using the list are attributed to its imperfect or misdirected application or, more recently, to “predilections and prejudices” amongst journal editors and referees that the ABS list merely “reflects” and renders more “visible” and available to “challenge”, thereby ignoring its performative effects (Morris et al, 2011: 563).
- In order to create a metric with general applicability or universality, its particularity must be obscured – a particularity that unavoidably privileges the values of certain research traditions while it marginalizes others.
- But a moratorium on the use of journal lists for making decisions on recruitment, promotion and research evaluation submission is more consistent with their analysis, and so is the preferred policy recommendation.
Frequently Asked Questions
Q2. What are the future works in this paper?
In the UK context, the authors suggest that the Business and Management Panel for any future evaluation exercises might: (1) reiterate the exclusion of all use of journal lists from the evaluation process. It is hoped that the evidence and arguments presented in this paper will stimulate further discussion of the pros and cons of the use of journal lists.