Showing papers in "Journal of Economic Methodology in 2000"


Journal ArticleDOI
TL;DR: The authors argue that economists' abstract theoretical models are not abstractions from, or simplifications of, the real world. They describe counterfactual worlds which the modeller has constructed. The gap between model world and real world can be filled only by inductive inference, and we can have more confidence in such inferences, the more credible the model is as an account of what could have been true.
Abstract: Using as examples Akerlof's 'market for "lemons"' and Schelling's 'checkerboard' model of racial segregation, this paper asks how economists' abstract theoretical models can explain features of the real world. It argues that such models are not abstractions from, or simplifications of, the real world. They describe counterfactual worlds which the modeller has constructed. The gap between model world and real world can be filled only by inductive inference, and we can have more confidence in such inferences, the more credible the model is as an account of what could have been true.

465 citations
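As a concrete reminder of what the 'checkerboard' model world looks like, here is a minimal sketch of a Schelling-style segregation simulation in Python. The grid size, group shares, tolerance threshold and relocation rule are illustrative choices, not Schelling's original specification.

```python
# Minimal Schelling-style checkerboard sketch (illustrative parameters).
import random

SIZE, THRESHOLD = 20, 0.3   # grid side; minimum acceptable share of like neighbours

def cell():
    r = random.random()
    return 0 if r < 0.1 else (1 if r < 0.55 else 2)   # 0 = empty, 1/2 = two groups

grid = [[cell() for _ in range(SIZE)] for _ in range(SIZE)]

def unhappy(r, c):
    """True if the agent at (r, c) has occupied neighbours and fewer than
    THRESHOLD of them belong to its own group (torus wrap-around)."""
    me = grid[r][c]
    if me == 0:
        return False
    same = total = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            neighbour = grid[(r + dr) % SIZE][(c + dc) % SIZE]
            if neighbour != 0:
                total += 1
                same += neighbour == me
    return total > 0 and same / total < THRESHOLD

for _ in range(100):   # move unhappy agents to random empty cells until settled
    movers = [(r, c) for r in range(SIZE) for c in range(SIZE) if unhappy(r, c)]
    if not movers:
        break
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE) if grid[r][c] == 0]
    for r, c in movers:
        if not empties:
            break
        er, ec = empties.pop(random.randrange(len(empties)))
        grid[er][ec], grid[r][c] = grid[r][c], 0
        empties.append((r, c))
```

Even with agents content to live as a one-third local minority, runs of this kind typically settle into visibly segregated clusters - exactly the counterfactual-world result whose bearing on the real world the paper examines.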


Journal ArticleDOI
TL;DR: The primary objective of this paper is to revisit a number of empirical modelling activities which are often characterized as data mining, in an attempt to distinguish between the problematic and the non-problematic cases, using the notion of error-statistical severity.
Abstract: The primary objective of this paper is to revisit a number of empirical modelling activities which are often characterized as data mining, in an attempt to distinguish between the problematic and the non-problematic cases. The key for this distinction is provided by the notion of error-statistical severity. It is argued that many unwarranted data mining activities often arise because of inherent weaknesses in the Traditional Textbook (TT) methodology. Using the Probabilistic Reduction (PR) approach to empirical modelling, it is argued that the unwarranted cases of data mining can often be avoided by dealing directly with the weaknesses of the TT approach. Moreover, certain empirical modelling activities, such as diagnostic testing and data snooping, constitute legitimate procedures in the context of the PR approach.

93 citations
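For readers new to it, the notion of error-statistical severity can be made concrete with a small numerical sketch. This is my own illustration of the severity calculation for a one-sided test on a normal mean with known variance; the numbers are invented, and none of this code comes from the paper.

```python
# Post-data severity for the test H0: mu <= mu0 vs H1: mu > mu0 (illustrative).
from math import erf, sqrt

def ncdf(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

mu0, sigma, n = 0.0, 2.0, 100       # null value, known sd, sample size
xbar = 0.4                          # observed sample mean (invented)
se = sigma / sqrt(n)                # standard error = 0.2
d_obs = (xbar - mu0) / se           # observed statistic = 2.0, rejects H0 at 5%

def severity(mu1):
    """SEV(mu > mu1): the probability, computed under mu = mu1, of a result
    according less well with 'mu > mu1' than the one actually observed."""
    return ncdf(d_obs - (mu1 - mu0) / se)

for mu1 in (0.0, 0.1, 0.2, 0.3, 0.4):
    print(f"SEV(mu > {mu1:.1f}) = {severity(mu1):.3f}")
```

The rejection licenses 'mu > 0' with severity about 0.977, but only 'mu > 0.4' with severity 0.5: the same data warrant some claims severely and others hardly at all, which is the kind of discrimination the severity criterion supplies.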


Journal ArticleDOI
TL;DR: In this article, the authors investigated how normative considerations influenced the development of the theory of individual decision-making under risk and how informal notions of rationality were among the tacit heuristic principles that led to the discovery of generalized models of decision put forward in the early eighties to replace the received model.
Abstract: The paper investigates how normative considerations influenced the development of the theory of individual decision-making under risk. In the first part, the debate between Maurice Allais and the 'Neo-Bernoullians' (supporting the Expected Utility model) is reconstructed, in order to show that a controversy on the definition of rational decision and on the methodology of normative justification played a crucial role in legitimizing the Allais-paradox as genuinely refuting evidence. In the second part, it is shown how informal notions of rationality were among the tacit heuristic principles that led to the discovery of generalized models of decision put forward in the early eighties to replace the received model.

74 citations
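The force of the Allais paradox as refuting evidence is easy to verify numerically. The sketch below uses the standard textbook version of the two gamble pairs (payoffs in millions); it is an illustration of the independence violation, not code or data from the paper.

```python
# The classic Allais gamble pairs (standard textbook payoffs, in millions).
# Situation 1: A = 1.0 for sure          vs B = (0.10, 5.0; 0.89, 1.0; 0.01, 0)
# Situation 2: C = (0.11, 1.0; 0.89, 0)  vs D = (0.10, 5.0; 0.90, 0)
import math

def eu(lottery, u):
    """Expected utility of [(probability, payoff), ...] under utility u."""
    return sum(p * u(x) for p, x in lottery)

A = [(1.00, 1.0)]
B = [(0.10, 5.0), (0.89, 1.0), (0.01, 0.0)]
C = [(0.11, 1.0), (0.89, 0.0)]
D = [(0.10, 5.0), (0.90, 0.0)]

# For ANY utility u: EU(A) - EU(B) == EU(C) - EU(D)
#                 == 0.11*u(1) - 0.10*u(5) - 0.01*u(0),
# so an expected-utility maximizer must rank A vs B and C vs D the same way.
for name, u in [("linear", lambda x: x),
                ("sqrt", math.sqrt),
                ("log1p", math.log1p)]:
    print(f"{name:6s}  A>B: {eu(A, u) > eu(B, u)}   C>D: {eu(C, u) > eu(D, u)}")
```

Whatever utility function is tried, the two rankings always agree; the commonly observed pattern of choosing A over B but D over C therefore cannot be rationalized within the Expected Utility model, which is why the status of this evidence turned on the normative debate the paper reconstructs.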


Journal ArticleDOI
TL;DR: Data mining refers to a broad class of activities that have in common a search over different ways to process or package data statistically or econometrically with the purpose of making the final presentation meet certain design criteria.
Abstract: 'Data mining' refers to a broad class of activities that have in common a search over different ways to process or package data statistically or econometrically with the purpose of making the final presentation meet certain design criteria. We characterize three attitudes toward data mining: first, that it is to be avoided and, if it is engaged in, that statistical inferences must be adjusted to account for it; second, that it is inevitable and that the only results of any interest are those that transcend the variety of alternative data-mined specifications (a view associated with Leamer's extreme-bounds analysis); and third, that it is essential and that the only hope we have of using econometrics to uncover true economic relationships is to be found in the intelligent mining of data. The first approach confuses considerations of sampling distribution and considerations of epistemic warrant and reaches an unnecessarily hostile attitude toward data mining. The second approach relies on a notion of robustness ...

56 citations
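The second attitude can be made concrete. In Leamer-style extreme-bounds analysis, the outcome is regressed on a focus variable together with every subset of the 'doubtful' controls, and the range of the focus coefficient across specifications is reported. A minimal sketch on simulated data follows; the variable names and data-generating process are invented for illustration and are not from the paper.

```python
# Extreme-bounds-style sensitivity check over all control subsets (illustrative).
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)                    # focus variable
Z = rng.normal(size=(n, 4))               # doubtful conditioning variables
y = 0.5 * x + Z[:, 0] - 0.5 * Z[:, 1] + rng.normal(size=n)

betas = []
for k in range(Z.shape[1] + 1):
    for cols in itertools.combinations(range(Z.shape[1]), k):
        X = np.column_stack([np.ones(n), x] + [Z[:, c] for c in cols])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        betas.append(coef[1])             # coefficient on the focus variable

print(f"extreme bounds on beta_x: [{min(betas):.3f}, {max(betas):.3f}]")
```

If the interval stays on one side of zero across all specifications, the finding 'transcends' the specification search in the second attitude's sense; if it straddles zero, the result is declared fragile.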


Journal ArticleDOI
TL;DR: In this paper, the authors discuss contract theory from a combined Austrian/new institutional view and illustrate this by means of the treatment of asymmetric information under complete contracting and the notion of control rights under incomplete contracting.
Abstract: We discuss contract theory from a combined Austrian/new institutional view. In the latter view, the world is seen as shot through with ignorance and transaction costs, but, as a tendency, entrepreneurial activity responds to the problems caused by these. All modelling must critically reflect this. This ontological commitment is contrasted to various isolations characteristic of contract theory, specifically the modelling strategy of introducing often ad hoc and unexplained constraints that suppress margins and possibilities of entrepreneurial actions that would be open to real-world decision-makers. We illustrate this by means of, for example, the treatment of asymmetric information under complete contracting and the notion of control rights under incomplete contracting.

37 citations


Journal ArticleDOI
TL;DR: The authors make explicit the rhetoric of optimization and examine various arguments to determine whether we should retain optimization theory or assume bounded rationality; they show that the standard methodological defences of optimization theory either involve logical defects or rest upon a conceptual gap.
Abstract: This paper makes explicit the rhetoric of optimization. Various arguments are examined, in order to determine whether we should retain optimization theory or assume bounded rationality. Empirical evidence confounds optimization theory; in the face of experimental studies, an empirical dilemma emerges, according to which we should discard either the theory of expected utility or the criterion of empirical refutation. Methodological criticisms attack optimization theory's epistemological status; together, they give rise to a methodological trilemma, according to which optimization theory is indeterminate, unfalsifiable or tautological. Methodological defences seem to protect optimization theory against criticism; but a more careful examination shows that either they involve logical defects or they rest upon a conceptual gap. Theoretical difficulties plague optimization theory; though various extensions have been proposed, optimization theory entails a theoretical dilemma, according to which one must choose ...

36 citations


Journal ArticleDOI
TL;DR: The author argues for reclaiming a relevant form of realism in economic methodology.
Abstract: (2000). Reclaiming relevant realism. Journal of Economic Methodology: Vol. 7, No. 1, pp. 109-125.

31 citations


Journal ArticleDOI
TL;DR: In this article, the authors recognize the legitimacy of explaining the resilience of certain patterns of behaviour: that is, explaining, not necessarily why they emerged or have been sustained, but why they are robust and reliable.
Abstract: In order to vindicate rational-choice theory as a mode of explaining social patterns in general - social patterns beyond the narrow range of economic behaviour - we have to recognize the legitimacy of explaining the resilience of certain patterns of behaviour: that is, explaining, not necessarily why they emerged or have been sustained, but why they are robust and reliable. And once we allow the legitimacy of explaining resilience, then we can see how functionalist theory may also serve us well in social science; we lose the basis - the empty black box argument - on which the rational-choice critique of the theory has mostly been grounded.

28 citations


Journal ArticleDOI
TL;DR: In this paper, a survey of the symposium papers argues that the problem of data mining should be of interest to both practicing econometricians and specialists in economic methodology, and draws on recent work in the philosophy of science to point to parallels between data mining and practices engaged in routinely by experimental scientists.
Abstract: This survey of the symposium papers argues that the problem of data mining should be of interest to both practicing econometricians and specialists in economic methodology. After summarizing some of the main points to arise in the symposium, it draws on recent work in the philosophy of science to point to parallels between data mining and practices engaged in routinely by experimental scientists. These suggest that data mining might be seen in a more positive light than conventional doubts about it imply.

25 citations


Journal ArticleDOI
TL;DR: In this paper, the authors argue that modern economic theory is essentially utilitarian with one significant exception: its abandonment of a multi-dimensional conception of utility. They review three alternative methods by which utility can be portrayed as a one-dimensional, hence determinate, index of desire, while suggesting that none of them can command empirical support.
Abstract: This paper argues that modern economic theory is essentially utilitarian with one significant exception: its abandonment of a multi-dimensional conception of utility. The paper reviews three alternative methods by which utility can be portrayed as a one-dimensional, hence determinate, index of desire, while suggesting that none of them can command empirical support. A second theme of the paper is that classical utilitarianism, by denying the ontological existence of intrinsic worth, implies the coincidence of economic and ethical aggregate optimality: those choices that maximize the self-perceived happiness of rational agents are also the right choices. Non-utilitarian ethics, by contrast, attains determinate optima by means of an a priori designation of intrinsic worth. It is argued that most philosophers, following G. E. Moore, have missed the true issue that divides utilitarian and non-utilitarian ethics, for they have presumed that all ethical systems presuppose intrinsic worth.

17 citations


Journal ArticleDOI
TL;DR: The authors deal with a practical problem confronting a researcher who wants to persuade his readers but does not want to deceive them: what to report when, say, five of the eight specifications tested are consistent with the hypothesis and three are not.
Abstract: Data mining occurs because most economic hypotheses do not have a unique empirical interpretation but allow the econometrician much leeway in selecting conditioning variables, lags, functional forms, and sometimes the sample. The resulting problems are of interest not only to methodologists and philosophers concerned with how hypotheses are validated in the presence of some inevitable ad hocery but also to readers of economics journals who have no interest in methodology but need to know whether to believe what they read. Since I focus on such mundane problems I make no claim of contributing to the deeper epistemological problems of relating empirical evidence to theory, and to the meaning of confirmation and disconfirmation when, say, five of the eight specifications tested are consistent with the hypothesis and three are not. Instead, I deal with a practical problem confronting a researcher who wants to persuade his readers but does not want to deceive them. He has fitted many regressions with varying ...
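The practical stakes are easy to demonstrate with a simulation (entirely illustrative, not from the paper): when a researcher searches twenty candidate regressors that are in fact unrelated to the outcome and reports only the best t-statistic, a nominally 5% test 'finds' significance in well over half of all searches.

```python
# Selective reporting inflates significance: a pure-noise specification search.
import numpy as np

rng = np.random.default_rng(1)
n, n_candidates, n_trials = 100, 20, 500
hits = 0
for _ in range(n_trials):
    y = rng.normal(size=n)
    X = rng.normal(size=(n, n_candidates))   # every candidate is irrelevant
    best_t = 0.0
    for j in range(n_candidates):
        A = np.column_stack([np.ones(n), X[:, j]])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ coef
        s2 = resid @ resid / (n - 2)
        t = abs(coef[1]) / np.sqrt(s2 * np.linalg.inv(A.T @ A)[1, 1])
        best_t = max(best_t, t)
    hits += best_t > 1.96                    # nominal two-sided 5% cutoff
print(f"searches reporting a 'significant' regressor: {hits / n_trials:.0%}")
```

With twenty independent tries, roughly 1 - 0.95**20, or about 64%, of searches produce at least one nominally significant coefficient - which is exactly why a reader needs to know how many regressions were fitted.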

Journal ArticleDOI
TL;DR: The authors argue that there is a risk of selective reporting, as Mayer indicates, but that other researchers (competition) will ensure that the sensitivity of truly important findings is checked.
Abstract: We maintain that the actions of researchers show that data mining is a necessary part of econometric inquiry. We analyse this phenomenon using the analogy of an industry producing a product (econometric analyses). There is a risk of selective reporting as Mayer indicates but we argue that other researchers (competition) will ensure that the sensitivity of truly important findings is checked. Hence, initial researchers have an incentive to analyse sensitivity from the beginning and so produce a quality product. Some suggestions are made towards encouraging this process. The 'general to specific' approach to data mining as promoted by Hoover and Perez can be valuable but it is premature to eliminate other strategies.

Journal ArticleDOI
TL;DR: In this paper, it is argued that there are compelling reasons to use the data for instrument selection, but that it is desirable to ensure the resulting estimator still behaves in the way predicted by standard textbook theory.
Abstract: Instrumental variables estimation is widely applied in econometrics. To implement the method, it is necessary to specify a vector of instruments. In this paper, it is argued that there are compelling reasons to use the data for instrument selection, but that it is desirable to ensure the resulting estimator still behaves in the way predicted by standard textbook theory. These arguments lead to the proposal of three criteria for data-based instrument selection. The remainder of the paper assesses the extent to which these criteria are met by two algorithms for data-based instrument selection. The first algorithm is the method of structurally ordered instrumental variables, proposed in the context of economy-wide linear simultaneous equation models. The second algorithm is proposed in the context of the method of generalized instrumental variables, which is commonly used to estimate the parameters of Euler equation models.
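As background, instrumental variables estimation replaces an endogenous regressor with its projection onto the instrument space. A minimal two-stage least squares sketch on simulated data is given below; the data-generating process and the hand-picked instrument set are illustrative, and neither of the paper's two selection algorithms is implemented here.

```python
# Two-stage least squares with a fixed instrument vector (illustrative DGP).
import numpy as np

rng = np.random.default_rng(2)
n = 500
z = rng.normal(size=(n, 2))                      # instruments
u = rng.normal(size=n)                           # structural error
x = z @ np.array([1.0, 0.5]) + 0.8 * u + rng.normal(size=n)  # endogenous regressor
y = 2.0 * x + u                                  # true coefficient is 2.0

X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])

# OLS is inconsistent here because x is correlated with u.
b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# 2SLS: first stage projects X onto Z; second stage regresses y on the projection.
Xhat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
b_2sls, *_ = np.linalg.lstsq(Xhat, y, rcond=None)

print(f"OLS: {b_ols[1]:.3f}   2SLS: {b_2sls[1]:.3f}   (true: 2.000)")
```

Everything above takes the instrument vector as given; the paper's question is what happens to this textbook behaviour once the instruments themselves are chosen by looking at the data.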

Journal ArticleDOI
TL;DR: This paper argues that classical statistics and standard econometrics are based on a desire to meet scientific standards for accumulating reliable knowledge, but that economists collectively neglect the importance of out-of-sample testing in the production of reliable knowledge.
Abstract: This paper argues classical statistics and standard econometrics are based on a desire to meet scientific standards for accumulating reliable knowledge. Science requires two inputs: mining of existing data for inspiration, and new or 'out-of-sample' data for predictive testing. Avoidance of data-mining is neither possible nor desirable. In economics out-of-sample data is relatively scarce, so the production process should intensively exploit the existing data. But the two inputs should be thought of as complements rather than substitutes, and we neglect the importance of out-of-sample testing in the production of reliable knowledge. Avoidance of data-mining is not a substitute for tests conducted in new samples. The problem is not that data-mining corrupts the process; the problem is our collective neglect of out-of-sample encompassing, stability and forecast tests. So the data-mining issue diverts us from the crucial margin.
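A minimal illustration of the complementarity the paper has in mind (invented code and data, not the author's): mine the first part of a sample for the best-fitting specification, then check whether the mined model actually forecasts held-out data better than a naive benchmark.

```python
# Mine in-sample, then test out-of-sample against a naive benchmark (illustrative).
import numpy as np

rng = np.random.default_rng(3)
n, split = 300, 200
X = rng.normal(size=(n, 10))                # candidate predictors
y = 0.6 * X[:, 0] + rng.normal(size=n)      # only the first one matters

def ols(x, y):
    """Fit y = a + b*x; return coefficients and fitted values."""
    A = np.column_stack([np.ones(len(y)), x])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef, A @ coef

# 'Mining': keep the single predictor with the best in-sample fit.
best_j = min(range(X.shape[1]),
             key=lambda j: np.sum((y[:split] - ols(X[:split, j], y[:split])[1]) ** 2))

# Out-of-sample test: forecast the held-out observations with the mined model.
coef, _ = ols(X[:split, best_j], y[:split])
forecast = coef[0] + coef[1] * X[split:, best_j]
mse_model = np.mean((y[split:] - forecast) ** 2)
mse_naive = np.mean((y[split:] - y[:split].mean()) ** 2)   # mean-only benchmark
print(f"out-of-sample MSE: mined model {mse_model:.3f} vs naive {mse_naive:.3f}")
```

If mining had merely overfitted noise, the mined model would do no better than the benchmark out of sample; surviving the new-sample test, not abstaining from mining, is what certifies the finding.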


Journal ArticleDOI
TL;DR: The New Institutional Economics (NIE) occupies an important space in the rapidly expanding theory of organization, yet traditional testing techniques have been applied only to its less complex parts.
Abstract: The New Institutional Economics (NIE) occupies an important space in the rapidly expanding theory of organization. Traditional testing techniques have only been applied to less complex parts of the NIE. A rich body of evidence generated by the experiences of firms and other organizations lies fallow. The limited domain of traditional testing will persist because of the nature of the central concepts of the NIE, the difficulty that self-reference poses for integrating transaction costs into an optimizing framework, and the particularly wicked manifestations of the Duhem-Quine problem. Experimental economics may add credibility to some components of the NIE but is unlikely to fill the current void. The catchy title, the telling anecdote, and the creative 'fact' have played a disproportionate role in determining the composition of the canonical literature in the NIE. Catalytic steps for developing professional norms governing the marshalling of data generated by different organizations and institutions are discussed ...

Journal ArticleDOI
TL;DR: This paper surveys the work of Martin Hollis as a philosopher of social science.
Abstract: (2000). Martin Hollis: philosopher of social science. Journal of Economic Methodology: Vol. 7, No. 3, pp. 427-445.