
Showing papers in "The British Journal for the Philosophy of Science in 2001"


Journal ArticleDOI
TL;DR: The authors survey work in epistemology since the mid-1980s, focusing on contextualism about knowledge attributions, modest forms of foundationalism, and the internalism/externalism debate and its connections to the ethics of belief.
Abstract: This article surveys work in epistemology since the mid-1980s. It focuses on (i) contextualism about knowledge attributions, (ii) modest forms of foundationalism, and (iii) the internalism/externalism debate and its connections to the ethics of belief.

214 citations


Journal ArticleDOI
TL;DR: Noam Chomsky's Poverty of the Stimulus Argument is one of the most famous and controversial arguments in the study of language and the mind, and it has been widely endorsed by linguists.
Abstract: Noam Chomsky's Poverty of the Stimulus Argument is one of the most famous and controversial arguments in the study of language and the mind. Though widely endorsed by linguists, the argument ...

190 citations


Journal ArticleDOI
TL;DR: This article argued that the kernel of truth in the principle of the common cause is to be found by separating metaphysical and epistemological issues; as far as the epistemology is concerned, the Likelihood Principle is central.
Abstract: When two causally independent processes each have a quantity that increases monotonically (either deterministically or in probabilistic expectation), the two quantities will be correlated, thus providing a counterexample to Reichenbach's principle of the common cause. Several philosophers have denied this, but I argue that their efforts to save the principle are unsuccessful. Still, one salvage attempt does suggest a weaker principle that avoids the initial counterexample. However, even this weakened principle is mistaken, as can be seen by exploring the concepts of homology and homoplasy used in evolutionary biology. I argue that the kernel of truth in the principle of the common cause is to be found by separating metaphysical and epistemological issues; as far as the epistemology is concerned, the Likelihood Principle is central.
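The opening observation lends itself to a quick simulation. The sketch below is not from the paper; the drift values, seed, and series are illustrative choices. It generates two causally independent quantities, each increasing monotonically in probabilistic expectation, and shows that their sample correlation is nonetheless very high.

```python
# Illustrative sketch: two causally independent quantities that each drift
# upward over time end up strongly correlated, the kind of case offered as
# a counterexample to Reichenbach's principle of the common cause.
import numpy as np

rng = np.random.default_rng(0)
n_steps = 200

# Two independent random walks with positive drift, sampled at the same times
# (any two unrelated quantities that both tend to increase would do).
x = np.cumsum(rng.normal(loc=0.5, scale=1.0, size=n_steps))
y = np.cumsum(rng.normal(loc=0.3, scale=1.0, size=n_steps))

corr = np.corrcoef(x, y)[0, 1]
print(f"sample correlation of two causally independent series: {corr:.2f}")
# Typically prints a value close to 1, with no common cause anywhere in sight.
```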

130 citations


Journal ArticleDOI
TL;DR: This article argued that the only laws that biology has or needs are the laws of natural selection, and that the problem of biological explanation can be assimilated to the parallel problem in the philosophy of history.
Abstract: That biology provides explanations is not open to doubt. But how it does so must be a vexed question for those who deny that biology embodies laws or other generalizations with the sort of explanatory force that the philosophy of science recognizes. The most common response to this problem has involved redefining law so that those grammatically general statements which biologists invoke in explanations can be counted as laws. But this terminological innovation cannot identify the source of biology's explanatory power. I argue that because biological science is historical, the problem of biological explanation can be assimilated to the parallel problem in the philosophy of history, and that the problem was solved by Carl Hempel. All we need to do is recognize that the only laws that biology-in all its compartments from the molecular onward-has or needs are the laws of natural selection.

81 citations


Journal ArticleDOI
TL;DR: In this paper, the authors discuss the threat posed by the possibility of inequivalent quantizations of a classical field theory, i.e., inequivalent representations of the algebra of observables of the field in terms of operators on a Hilbert space.
Abstract: Philosophical reflection on quantum field theory has tended to focus on how it revises our conception of what a particle is. However, there has been relatively little discussion of the threat to the "reality" of particles posed by the possibility of inequivalent quantizations of a classical field theory, i.e., inequivalent representations of the algebra of observables of the field in terms of operators on a Hilbert space. The threat is that each representation embodies its own distinctive conception of what a particle is, and how a "particle" will respond to a suitably operated detector. Our main goal is to clarify the subtle relationship between inequivalent representations of a field theory and their associated particle concepts. We also have a particular interest in the Minkowski versus Rindler quantizations of a free Boson field, because they respectively entail two radically different descriptions of the particle content of the field in the *very same* region of spacetime. We shall defend the idea that these representations provide *complementary descriptions* of the same state of the field against the claim that they embody completely *incommensurable theories* of the field.

76 citations


Journal ArticleDOI
TL;DR: The issue of how to apply the principle of natural selection at different levels of the biological hierarchy, which underlies the dispute between Sober and Wilson and Maynard Smith, is examined.
Abstract: The group selection controversy is about whether natural selection ever operates at the level of groups, rather than at the level of individual organisms. Traditionally, group selection has been invoked to explain the existence of altruistic behaviour in nature. However, most contemporary evolutionary biologists are highly sceptical of the hypothesis of group selection, which they regard as biologically implausible and not needed to explain the evolution of altruism anyway. But in their recent book, Elliott Sober and David Sloan Wilson [1998] argue that the widespread opposition to group selection is founded on conceptual confusion. The theories that have been propounded as alternatives to group selection are actually group selection in disguise, they maintain. I examine their arguments for this claim, and John Maynard Smith's arguments against it. I argue that Sober and Wilson arrive at a correct position by faulty reasoning. In the final section, I examine the issue of how to apply the principle of natural selection at different levels of the biological hierarchy, which underlies the dispute between Sober and Wilson and Maynard Smith.

67 citations


Journal ArticleDOI
TL;DR: In this article, the authors describe in detail how to build an infinite computing machine within a continuous Newtonian universe, and the relevance of their construction to the Church-Turing thesis is discussed.
Abstract: We describe in some detail how to build an infinite computing machine within a continuous Newtonian universe. The relevance of our construction to the Church-Turing thesis is also discussed.

62 citations


Journal ArticleDOI
TL;DR: The authors examine the standard Bayesian solution to the Quine-Duhem problem, the problem of distributing blame between a theory and its auxiliary hypotheses in the aftermath of a failed prediction.
Abstract: This paper examines the standard Bayesian solution to the Quine-Duhem problem, the problem of distributing blame between a theory and its auxiliary hypotheses in the aftermath of a failed prediction. The standard solution, I argue, begs the question against those who claim that the problem has no solution. I then provide an alternative Bayesian solution that is not question-begging and that turns out to have some interesting and desirable properties not possessed by the standard solution. This solution opens the way to a satisfying treatment of a problem concerning ad hoc auxiliary hypotheses.
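As a rough numerical illustration of the kind of calculation at issue (not the paper's own proposal; the priors, likelihoods, and the assumption that T and A are independent in the prior are all invented for the example), a Bayesian distributes blame by conditionalizing on the failed prediction and comparing the posteriors of the core theory T and the auxiliary hypothesis A:

```python
# Hedged illustration: Bayesian blame-distribution after a failed prediction.
# T = core theory, A = auxiliary hypothesis, e = the predicted outcome.
# Priors and likelihoods below are invented for illustration only.

p_T, p_A = 0.9, 0.6            # prior probabilities (T and A assumed independent)

# Probability of observing the predicted outcome e under each combination.
likelihood_e = {
    (True, True): 1.00,        # T & A jointly entail the prediction
    (True, False): 0.50,
    (False, True): 0.05,
    (False, False): 0.50,
}

def posterior(failed=True):
    """Posterior of T and of A given that the prediction failed (not-e)."""
    joint = {}
    for t in (True, False):
        for a in (True, False):
            prior = (p_T if t else 1 - p_T) * (p_A if a else 1 - p_A)
            like = 1 - likelihood_e[(t, a)] if failed else likelihood_e[(t, a)]
            joint[(t, a)] = prior * like
    total = sum(joint.values())
    post_T = (joint[(True, True)] + joint[(True, False)]) / total
    post_A = (joint[(True, True)] + joint[(False, True)]) / total
    return post_T, post_A

post_T, post_A = posterior(failed=True)
print(f"P(T | prediction failed) = {post_T:.2f}")  # drops modestly from 0.9
print(f"P(A | prediction failed) = {post_A:.2f}")  # drops sharply from 0.6
```

With these made-up numbers, the auxiliary hypothesis absorbs most of the blame while the theory survives largely intact; different priors and likelihoods distribute the blame differently, which is what the debate over the "standard solution" turns on.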

46 citations


Journal ArticleDOI
TL;DR: In this article, the question of whether Pauli's Exclusion Principle (EP) vindicates the contingent truth of Leibniz's Principle of the Identity of Indiscernibles (PII) for fermions was investigated.
Abstract: This paper concerns the question of whether Pauli's Exclusion Principle (EP) vindicates the contingent truth of Leibniz's Principle of the Identity of Indiscernibles (PII) for fermions as H. Weyl first suggested with the nomenclature 'Pauli-Leibniz principle'. This claim has been challenged by a time-honoured argument, originally due to H. Margenau and further articulated and championed by other authors. According to this argument, the Exclusion Principle-far from vindicating Leibniz's principle-would refute it, since the same reduced state, viz. an improper mixture, can be assigned as a separate state to each fermion of a composite system in antisymmetric state. As a result, the two fermions do have the same monadic state-dependent properties and hence are indiscernibles. PII would then be refuted in its strong version (viz. for monadic properties). I shall argue that a misleading assumption underlies Margenau's argument: in the case of two fermions in antisymmetric state, no separate states should be invoked since the states of the two particles are entangled and the improper mixture-assigned to each fermion by reduction-cannot be taken as an ontologically separate state nor consequently as encoding monadic properties. I shall then conclude that the notion of monadic properties together with the strong version of PII are inapplicable to fermions in antisymmetric state and this undercuts Margenau's argument.

39 citations


Journal ArticleDOI
TL;DR: In this article, it is shown that Pietroski and Rey's reconstruction of CP-laws cannot avoid the result that CP-laws are almost vacuous, and that their reconstruction presupposes the assumption of determinism.
Abstract: In Brit. J. Phil. Sci., 46, Pietroski and Rey ([1995]) suggested a reconstruction of ceteris paribus (CP)-laws, which - as they claim - saves CP-laws from vacuity. This discussion note is intended to show that, although Pietroski and Rey's reconstruction is an improvement in comparison to previous suggestions, it cannot avoid the result that CP-laws are almost vacuous. It is proved that if Cx is an arbitrary (nomological) event-type which has independently identifiable deterministic causes, then for every other (nomological) event-type Ax which is not strictly connected with Cx or with -Cx, 'CP if Ax then Cx' satisfies the conditions of Pietroski and Rey for CP-laws. It is also shown that Pietroski and Rey's reconstruction presupposes the assumption of determinism. The conclusion points towards some alternatives to Pietroski and Rey's reconstruction.

32 citations


Journal ArticleDOI
TL;DR: In this article, an interpretation of Poincaré's conventionalism, distinguishing it from the Duhem-Quine thesis, and from the logical positivist understanding of conventionalism as a general account of necessary truth, is presented.
Abstract: This paper offers an interpretation of Poincaré's conventionalism, distinguishing it from the Duhem-Quine thesis, on the one hand, and, on the other, from the logical positivist understanding of conventionalism as a general account of necessary truth. It also confronts Poincaré's conventionalism with some counter-arguments that have been influential: Einstein's (general) relativistic argument, and the linguistic rejoinders of Quine and Davidson. In the first section, the distinct roles played by the intertranslatability of different geometries, the inaccessibility of space to direct observation, and general holistic considerations are identified. Together, they form a constructive argument for conventionalism that underscores the impact of fact on convention. The second section traces Poincaré's influence on the general theory of relativity and Einstein's ensuing ambivalence toward Poincaré. Lastly, it is argued that neither Quine nor Davidson has met the conventionalist challenge.


Journal ArticleDOI
TL;DR: In the face of argument to the contrary, this paper shows that there is defensible middle ground available for entity realism, between the extremes of scientific realism and empiricist antirealism.
Abstract: In the face of argument to the contrary, it is shown that there is defensible middle ground available for entity realism, between the extremes of scientific realism and empiricist antirealism. Cartwright's ([1983]) earlier argument for defensible middle ground between these extremes, which depended crucially on the viability of an underdeveloped distinction between inference to the best explanation (IBE) and inference to the most probable cause (IPC), is examined and its defects are identified. The relationship between IBE and IPC is clarified and a revised version of Cartwright's argument for defensible middle ground, which is free of the identified defects, is presented.

Journal ArticleDOI
TL;DR: Mohan Matthen argues that when reproduction is sexual, natural selection can explain why individual organisms possess the traits they do; this paper argues that the conception of individual organisms as receptacles for collections of genes, on which Matthen's argument relies, cannot do the work he requires of it.
Abstract: Mohan Matthen ([1999]) argues that when reproduction is sexual, natural selection can explain why individual organisms possess the traits they do. In stating his argument Matthen makes use of a conception of individual organisms as receptacles for collections of genes--a conception that cannot do the work Matthen requires of it. Either these receptacles are abstract objects, such as bare possibilities for organisms, or they are concrete. The first reading is too weak, since it allows selection to explain individual traits in both sexual and asexual contexts. The only concrete entities we might think of as receptacles for collections of genes are male or female gametes. It is true that in the sexual context selection explains why an individual gamete combines with a second gamete of one type rather than another; however, this is not to say that selection explains why an individual organism has the traits it does.

Journal ArticleDOI
TL;DR: It is shown in the first half of this paper that the evolutionary game-theoretic models are often highly sensitive to the specific processes that they are intended to simulate, and the positive proposal is made that we may none the less obtain robust results by simulating the population structures that existed among our evolutionary ancestors.
Abstract: Brian Skyrms has argued that the evolution of the social contract may be explained using the tools of evolutionary game theory. I show in the first half of this paper that the evolutionary game-theoretic models are often highly sensitive to the specific processes that they are intended to simulate. This sensitivity represents an important robustness failure that complicates Skyrms's project. But I go on to make the positive proposal that we may none the less obtain robust results by simulating the population structures that existed among our evolutionary ancestors. It is by extending the evolutionary models in this way that we should pursue the project of explaining the evolution of the social contract.
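For readers unfamiliar with the models at issue, here is a minimal replicator-dynamics sketch of a Skyrms-style divide-the-cake game; the game, strategy set, time step, and starting frequencies are my illustrative choices, not the specific models the paper criticizes. It shows how the evolutionary outcome (fair division versus an unfair polymorphism) depends on where the population starts; the paper's robustness worry concerns a similar dependence on the modelling assumptions themselves.

```python
# Replicator dynamics for an illustrative divide-the-cake game.
# Strategies demand 4, 5, or 6 of 10 slices; a pair gets its demands only
# if the two demands sum to at most 10.
import numpy as np

demands = np.array([4, 5, 6])
payoff = np.array([[d_i if d_i + d_j <= 10 else 0 for d_j in demands]
                   for d_i in demands], dtype=float)

def evolve(x, steps=2000, dt=0.01):
    """Discrete-time replicator dynamics from initial strategy frequencies x."""
    x = np.array(x, dtype=float)
    for _ in range(steps):
        fitness = payoff @ x               # expected payoff of each strategy
        mean_fitness = x @ fitness
        x = x + dt * x * (fitness - mean_fitness)
        x = np.clip(x, 0.0, None)
        x /= x.sum()
    return x

print(evolve([0.20, 0.60, 0.20]))  # converges toward everyone demanding 5 (fair split)
print(evolve([0.45, 0.10, 0.45]))  # converges toward a demand-4 / demand-6 polymorphism
```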

Journal ArticleDOI
TL;DR: In this paper, the authors discuss the philosophical significance of the statistical model selection criteria, in particular their relevance for philosophical problems of underdetermination, and present an easily comprehensible account of their simplest possible application.
Abstract: I discuss the philosophical significance of the statistical model selection criteria, in particular their relevance for philosophical problems of underdetermination. I present an easily comprehensible account of their simplest possible application and contrast it with their application to curve-fitting problems. I embed philosophers' earlier discussion concerning the situations in which the criteria yield implausible results into a more general framework. Among other things, I discuss a difficulty which is related to the so-called subfamily problem, and I show that it has analogies in all legitimate applications of the model selection criteria, and that an analogy of Goodman's new riddle of induction can be formulated in only some of their applications.
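A hedged sketch of the simplest sort of application: using an information criterion (here AIC, one of the standard model selection criteria) to trade off goodness of fit against the number of parameters when choosing among polynomial models in a curve-fitting problem. The data and candidate models below are invented for illustration, not drawn from the paper.

```python
# Illustrative model selection by AIC for a curve-fitting problem.
# For least-squares fits, AIC = 2k + n*ln(RSS/n), where k counts the
# fitted parameters and RSS is the residual sum of squares.
import numpy as np

rng = np.random.default_rng(1)
n = 30
x = np.linspace(0, 1, n)
y = 2.0 + 3.0 * x + rng.normal(scale=0.2, size=n)   # "true" curve is linear

for degree in range(0, 6):
    coeffs = np.polyfit(x, y, degree)
    rss = np.sum((np.polyval(coeffs, x) - y) ** 2)
    k = degree + 1                                   # fitted coefficients
    aic = 2 * k + n * np.log(rss / n)
    print(f"degree {degree}: RSS = {rss:6.3f}, AIC = {aic:7.2f}")

# Higher-degree polynomials always reduce RSS, but the penalty for extra
# parameters means AIC typically selects the low-degree (here linear) model.
```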

Journal ArticleDOI
TL;DR: A more precise understanding of cognitive systems will make it possible to articulate in some detail an alternative to the Fodorian doctrine of modularity, but it will also provide a better understanding of what a module is (since all modules are cognitive systems).
Abstract: The cognitive-neuropsychological understanding of a cognitive system is roughly that of a ‘mental organ’, which is independent of other systems, specializes in some cognitive task, and exhibits a certain kind of internal cohesiveness. This is all quite vague, and I try to make it more precise. A more precise understanding of cognitive systems will make it possible to articulate in some detail an alternative to the Fodorian doctrine of modularity (since not all cognitive systems are modules), but it will also provide a better understanding of what a module is (since all modules are cognitive systems).

Journal ArticleDOI
TL;DR: A theory of sets and classes, a slight extension of Ackermann's theory of 1956, is proposed in this paper as an adequate founding theory of mathematics and, by implication, of category-theory.
Abstract: Critique of set-theory as a founding theory of category-theory. Proposal of a theory of sets and classes as an adequate founding theory of mathematics and by implication of category-theory. This theory is a slight extension of Ackermann's theory of 1956.

Journal ArticleDOI
TL;DR: This article pointed out that the apparent superiority of prediction to accommodation is actually a side effect of an important difference between the hypotheses that tend to arise in each case, and that once the example is tweaked, the intuitive difference there between prediction and accommodation disappears.
Abstract: Maher ([1990], [1993]) has offered a lovely example to motivate the intuition that a successful prediction has a kind of confirmatory significance that an accommodation lacks. This paper scrutinizes Maher's example. It argues that once the example is tweaked, the intuitive difference there between prediction and accommodation disappears. This suggests that the apparent superiority of prediction to accommodation is actually a side effect of an important difference between the hypotheses that tend to arise in each case.

Journal Article
TL;DR: Julian Barbour's Machian theories of dynamics, and his proposal that a Machian perspective solves the problem of time in quantum geometrodynamics, are discussed in this paper, with a focus on his recent book, The End of Time (1999).
Abstract: I discuss Julian Barbour's Machian theories of dynamics, and his proposal that a Machian perspective enables one to solve the problem of time in quantum geometrodynamics (by saying that there is no time!). I concentrate on his recent book, The End of Time (1999). A shortened version will appear in The British Journal for the Philosophy of Science.

Journal ArticleDOI
TL;DR: In this paper, the authors consider the uniqueness of certain simultaneity structures in flat spacetime, where relative simultaneity with respect to an additional structure X on spacetime is a non-trivial equivalence relation invariant under the subgroup in Aut that stabilises X, and prove the uniqueness of standard Einstein simultaneity when X is an inertial frame.
Abstract: We consider the problem of uniqueness of certain simultaneity structures in flat spacetime. Absolute simultaneity is specified to be a non-trivial equivalence relation which is invariant under the automorphism group Aut of spacetime. Aut is taken to be the identity-component of either the inhomogeneous Galilei group or the inhomogeneous Lorentz group. Uniqueness of standard simultaneity in the first, and absence of any absolute simultaneity in the second case are demonstrated and related to certain group theoretic properties. Relative simultaneity with respect to an additional structure X on spacetime is specified to be a non-trivial equivalence relation which is invariant under the subgroup in Aut that stabilises X. Uniqueness of standard Einstein simultaneity is proven in the Lorentzian case when X is an inertial frame. We end by discussing the relation to previous work of others.


Journal ArticleDOI
TL;DR: Whereas one can conceive of a relational classical mechanics in which absolute space and time do not play a fundamental role, quantum mechanics does not readily admit any such relational formulation.
Abstract: Whereas one can conceive of a relational classical mechanics in which absolute space and time do not play a fundamental role, quantum mechanics does not readily admit any such relational formulation.

Journal ArticleDOI
TL;DR: In this article, a model of the dichotomy paradox is presented in Newtonian collision mechanics, where the base rules permit only spatial contact interactions, and the mechanical emergence of action-at-a-distance effects is found.
Abstract: A model of Zeno's dichotomy paradox is presented in Newtonian collision mechanics. One of several resolutions of the paradox illustrates the point that even in Newtonian ontology there is a spacetime weave. In a Newtonian system in which the base rules permit only spatial contact interactions, we find the mechanical emergence of action-at-a-distance effects.

Journal ArticleDOI
TL;DR: The authors argue that if dispositions are properties of individuals, we cannot give a complete account of ceteris paribus laws, whereas if they are properties of kinds, any reductive analysis of laws would require an extension of the notion of the dispositional beyond its usual meaning, so that in effect there can be no reduction of laws to dispositions as traditionally understood.
Abstract: This paper discusses the relationship between dispositions and laws and the prospects for any analysis of talk of laws in terms of talk of dispositions. Recent attempts at such a reduction have often been motivated by the desire to give an account of ceteris paribus laws and in this they have had some success. However, such accounts differ as to whether they view dispositions as properties fundamentally of individuals or of kinds. I argue that if dispositions are properties of individuals, we cannot give a complete account of ceteris paribus laws. Alternatively, if dispositions are properties of kinds, any reductive analysis of laws would require an extension of the notion of the dispositional beyond its usual meaning so that in effect there can be no reduction of laws to dispositions as traditionally understood. An attempt to reduce the nomological to the dispositional is therefore not the way to provide a unified account of traditional and ceteris paribus laws.

Journal ArticleDOI
TL;DR: In this paper, the authors argue that Kuhn and Musgrave arrive at their view because they lack a substantive account of how well discoverers must be able to conceptualize discovered objects.
Abstract: Thomas Kuhn (in The Structure of Scientific Revolutions) and Alan Musgrave (in 'Why Did Oxygen Supplant Phlogiston?') argue that it is impossible to precisely date discovery events and precisely identify discoverers. They defend this claim mainly on the grounds that so-called discoverers have in many cases misconceived the objects of discovery. In this paper, I argue that Kuhn and Musgrave arrive at their view because they lack a substantive account of how well discoverers must be able to conceptualize discovered objects. I remedy this deficiency by providing just such an account, and with this account I delineate how one can secure precision regarding the identity of discoverers and the times of discoveries. Near the end of my paper I bring my target of criticism up-to-date; it turns out that Steve Woolgar adopts an approach to discovery kindred to those of Kuhn and Musgrave and I close the paper by discussing what is at stake in rebutting him.


Journal ArticleDOI
TL;DR: This paper argued that Smith's analysis undermines much of the explanatory power of chaos theory and proposed a better approach by drawing analogies from the models found in continuum mechanics, where the infinite intricacy found in strange attractors can be explained.
Abstract: In his recent book, Explaining Chaos, Peter Smith presents a new problem in the foundations of chaos theory. Specifically, he argues that the standard ways of justifying idealizations in mathematical models fail when it comes to the infinite intricacy found in strange attractors. I argue that Smith's analysis undermines much of the explanatory power of chaos theory. A better approach is developed by drawing analogies from the models found in continuum mechanics.


Journal ArticleDOI
TL;DR: In this paper, a seemingly plausible application of Bayesian decision-theoretic reasoning to determine one's rational degrees of belief yields a paradoxical conclusion: one ought to jettison one's intermediate credences in favour of more extreme (opinionated) ones.
Abstract: A seemingly plausible application of Bayesian decision-theoretic reasoning to determine one's rational degrees of belief yields a paradoxical conclusion: one ought to jettison one's intermediate credences in favour of more extreme (opinionated) ones. I discuss various attempts to solve the paradox, those involving the acceptance of the paradoxical conclusion, and those which attempt to block its derivation.
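One well-known way such a conclusion can be generated is sketched below for illustration only; the paper's own setup may differ, and the scoring rules and numbers here are my assumptions. If epistemic value is measured by a linear (absolute-value) scoring rule, then by the agent's own lights the expected value of adopting an extreme credence exceeds that of keeping an intermediate one, whereas a proper rule such as the Brier score recommends sticking with the current credence.

```python
# Expected inaccuracy, by the agent's own lights (credence p in H), of
# adopting a new credence q, under two scoring rules.  Parameters illustrative.

def expected_linear_loss(p, q):
    """Linear (absolute-value) rule: improper, minimized at q = 0 or q = 1."""
    return p * (1 - q) + (1 - p) * q

def expected_brier_loss(p, q):
    """Brier (quadratic) rule: proper, minimized at q = p."""
    return p * (1 - q) ** 2 + (1 - p) * q ** 2

p = 0.7  # current, intermediate credence
for q in (0.0, 0.5, 0.7, 1.0):
    print(f"q = {q:3.1f}: linear loss = {expected_linear_loss(p, q):.3f}, "
          f"brier loss = {expected_brier_loss(p, q):.3f}")

# Under the linear rule, q = 1.0 has lower expected loss than q = 0.7, so a
# decision-theoretic calculation built on it tells the agent to opinionate;
# the Brier rule instead vindicates keeping the intermediate credence.
```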