
Showing papers in "Philosophy of Science in 1988"


Journal Article
TL;DR: In this article, the author considers Churchland's arguments that the cognitive impenetrability of perception does not establish a theory-neutral foundation for knowledge and that the account of perceptual encapsulation set forth in The Modularity of Mind is almost certainly false, and dismisses those arguments in detail.
Abstract: Churchland's paper "Perceptual Plasticity and Theoretical Neutrality" offers empirical, semantical and epistemological arguments intended to show that the cognitive impenetrability of perception "does not establish a theory-neutral foundation for knowledge" and that the psychological account of perceptual encapsulation that I set forth in The Modularity of Mind "[is] almost certainly false". The present paper considers these arguments in detail and dismisses them.

176 citations


Journal Article
TL;DR: It is argued that the delay of the Darwinian Revolution in biological taxonomy has resulted partly from a failure to distinguish between two fundamentally different ways of ordering identified by Griffiths (1974): classification and systematization.
Abstract: Taxonomies of living things and the methods used to produce them changed little with the institutionalization of evolutionary thinking in biology. Instead, the relationships expressed in existing taxonomies were merely reinterpreted as the result of evolution, and evolutionary concepts were developed to justify existing methods. I argue that the delay of the Darwinian Revolution in biological taxonomy has resulted partly from a failure to distinguish between two fundamentally different ways of ordering identified by Griffiths (1974): classification and systematization. Classification consists of ordering entities into classes, groups defined by the attributes of their members; in contrast, systematization consists of ordering entities into systems, more inclusive entities whose existence depends on some natural process through which their parts are related. Evolutionary, or phylogenetic, systematics takes evolutionary descent to be the natural process of interest in biological taxonomy. I outline a genera...

170 citations


Journal Article
TL;DR: In this article, the author considers the implications of Marr's theory of vision for some currently popular philosophies of psychology, specifically the "hegemony of neurophysiology view", the theories of Jerry Fodor, Daniel Dennett, and Stephen Stich, and the view that perception is permeated by belief.
Abstract: David Marr's theory of vision has been widely cited by philosophers and psychologists. I have three projects in this paper. First, I try to offer a perspicuous characterization of Marr's theory. Next, I consider the implications of Marr's work for some currently popular philosophies of psychology, specifically, the "hegemony of neurophysiology view", the theories of Jerry Fodor, Daniel Dennett, and Stephen Stich, and the view that perception is permeated by belief. In the last section, I consider what the phenomenon of vision must be like for Marr's project to succeed.

82 citations


Journal Article
TL;DR: It is argued that the explanation of an individual's traits involves us in a description of the individual's ancestry, and in an explanation of the distribution of traits in that ancestral population.
Abstract: In this paper I argue against Sober's claim that natural selection does not explain the traits of individuals. Sober argues that natural selection only explains the distribution of traits in a population. My point is that the explanation of an individual's traits involves us in a description of the individual's ancestry, and in an explanation of the distribution of traits in that ancestral population. Thus Sober is wrong: natural selection is part of the explanation of the traits of individuals.

53 citations


Journal Article
TL;DR: In this note, it is shown that there is a straightforward relativistic generalization of probabilism, and that Maxwell's conclusion that the special theory of relativity should be amended is therefore unwarranted.
Abstract: N. Maxwell (1985) has claimed that special relativity and "probabilism" are incompatible; "probabilism" he defines as the doctrine that "the universe is such that, at any instant, there is only one past but many alternative possible futures". Thus defined, the doctrine is evidently prerelativistic as it depends on the notion of a universal instant of the universe. In this note I show, however, that there is a straightforward relativistic generalization, and that therefore Maxwell's conclusion that the special theory of relativity should be amended is unwarranted. I leave open the question whether or not probabilism (or the related doctrine of the flow of time) is true, but argue that the special theory of relativity has no fundamental significance for this question.
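For concreteness, the relativistic generalization can be sketched in light-cone terms (a schematic rendering, not necessarily the author's own construction): relative to any spacetime point \(p\), the fixed past is the causal past
\[
J^{-}(p) = \{\, q : q \preceq p \,\},
\]
the set of points from which \(p\) can be causally reached, and probabilism becomes the point-relative claim that the events in \(J^{-}(p)\) are settled while many alternative possible extensions of the world beyond \(J^{-}(p)\) remain open. No universal instant is needed.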

42 citations


Journal Article
TL;DR: This paper argues that the contrast theory, which reads why-questions as having the form "Why P (rather than Q)?" where Q is a contrasting alternative, and the propositional approach can and should give equivalent readings to why-questions.
Abstract: Classic studies of explanation, such as those of Hempel and Bromberger, took it for granted that an explanation-seeking question of the form "Why P?" should be understood as asking about the proposition P. This view has been recently challenged by Bas van Fraassen and Alan Garfinkel. They acknowledge that some questions have the surface form "Why P?", but they hold that a correct reading for why-questions should take the form "Why P (rather than Q)?", where Q is a contrasting alternative. This contrast theory is discussed here. It is argued that, properly understood, the contrast theory and the propositional approach can and should give equivalent readings to why-questions.
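The claimed equivalence can be displayed schematically (one standard way of putting it, which may differ in detail from the paper's own construction): in one direction, a plain why-question carries a default contrast,
\[
\text{Why } P? \;\approx\; \text{Why } P \text{ rather than } \neg P?
\]
and in the other, a contrastive question reduces to a propositional one,
\[
\text{Why } P \text{ rather than } Q? \;\approx\; \text{Why } (P \wedge \neg Q)?
\]
so that each reading can be recovered from the other.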

38 citations


Journal Article
TL;DR: This paper argues that there is sufficient evidence at present to justify the belief that the universe began to exist without being caused to do so, citing the Hawking-Penrose singularity theorems based on Einstein's General Theory of Relativity and the recently introduced Quantum Cosmological Models of the early universe.
Abstract: There is sufficient evidence at present to justify the belief that the universe began to exist without being caused to do so. This evidence includes the Hawking-Penrose singularity theorems that are based on Einstein's General Theory of Relativity, and the recently introduced Quantum Cosmological Models of the early universe. The singularity theorems lead to an explication of the beginning of the universe that involves the notion of a Big Bang singularity, and the Quantum Cosmological Models represent the beginning largely in terms of the notion of a vacuum fluctuation. Theories that represent the universe as infinitely old or as caused to begin are shown to be at odds with or at least unsupported by these and other current cosmological notions.

37 citations


Journal Article
TL;DR: The author argues that it is unnecessary for behavior to proceed from beliefs and desires according to the principles of logic and decision theory, or even from principles that generally get things right, and also denies that behavior must proceed from principles that, though perhaps subrational, are similar to those that we ourselves use.
Abstract: This paper challenges some leading views about the conditions under which the ascription of beliefs and desires can make sense of, or provide reasons for, a creature's behavior. I argue that it is unnecessary for behavior to proceed from beliefs and desires according to the principles of logic and decision theory, or even from principles that generally get things right. I also deny that it is necessary for behavior to proceed from principles that, though perhaps subrational, are similar to those that we ourselves use. I then propose some conditions that are considerably weaker, and argue that they fulfill the descriptive and explanatory requirements of intentional ascription.

32 citations


Journal Article
TL;DR: In this article, the authors defend the Causal Theory of Reference against the recent criticism that it imposes a priori constraints on the aims and practices of science, and show that the metaphysical essentialism of this theory is compatible with the requirements of naturalistic epistemology.
Abstract: This paper defends the Causal Theory of Reference against the recent criticism that it imposes a priori constraints on the aims and practices of science. The metaphysical essentialism of this theory is shown to be compatible with the requirements of naturalistic epistemology. The theory is nevertheless unable to forestall the problem of incommensurability for scientific terms, because it misrepresents the conditions under which their reference is fixed. The resources of the Causal Theory of Reference and of the traditional cluster or "network" theory of meaning for handling problems of commensurability are compared, and an alternative approach is recommended.

28 citations


Journal Article
TL;DR: In this article, it is argued, contra Sober, that common cause hypotheses explain statistical correlations and not matchings between event tokens.
Abstract: Sober (1984) has considered the problem of determining the evidential support, in terms of likelihood, for a hypothesis that is incomplete in the sense of not providing a unique probability function over the event space in its domain. Causal hypotheses are typically like this because they do not specify the probability of their initial conditions. Sober's (1984) solution to this problem does not work, as will be shown by examining his own biological examples of common cause explanation. The proposed solution will lead to the conclusion, contra Sober, that common cause hypotheses explain statistical correlations and not matchings between event tokens.
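The likelihood framework at issue is the standard one (stated here in its usual form, which the paper may refine): the likelihood of a hypothesis \(H\) on evidence \(E\) is
\[
\mathcal{L}(H \mid E) = P(E \mid H),
\]
and \(E\) favors \(H_1\) over \(H_2\) just in case \(P(E \mid H_1) > P(E \mid H_2)\). The incompleteness problem arises because a causal hypothesis that does not specify the probability of its initial conditions fails to determine a unique value for \(P(E \mid H)\), so its likelihood is undefined without further assumptions.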

23 citations


Journal Article
TL;DR: In this paper, it is shown that if probabilities are interpreted in the von Mises-Church sense of relative frequencies on random sequences, a proof of the Bell inequality is still possible in which joint probabilities for pairs of noncommuting observables are assumed not to exist.
Abstract: Fine has recently proved the surprising result that satisfaction of the Bell inequality in a Clauser-Horne experiment implies the existence of joint probabilities for pairs of noncommuting observables in the experiment. In this paper we show that if probabilities are interpreted in the von Mises-Church sense of relative frequencies on random sequences, a proof of the Bell inequality is nonetheless possible in which such joint probabilities are assumed not to exist. We also argue that Fine's theorem and related results do not impugn the common view that local realists are committed to the Bell inequality.
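For reference, the Clauser-Horne inequality in its standard form (the paper's own notation may differ) is
\[
-1 \;\le\; p(a,b) + p(a,b') + p(a',b) - p(a',b') - p(a) - p(b) \;\le\; 0,
\]
where \(p(a,b)\) is the probability of a joint detection with analyzer settings \(a\) and \(b\), and \(p(a)\), \(p(b)\) are the single-detection probabilities. Fine's theorem says that satisfaction of these inequalities for all settings is equivalent to the existence of a single joint probability distribution for the noncommuting observables that returns the observed pair distributions as marginals.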

Journal Article
TL;DR: In this article, the authors test the contemporary concept of biological species against some of the problems caused by treating species as spatiotemporally extended entities governed by criteria of persistence, identity, etc.
Abstract: The purpose of this paper is to test the contemporary concept of biological species against some of the problems caused by treating species as spatiotemporally extended entities governed by criteria of persistence, identity, etc. After outlining the general problem of symmetric division in natural objects, I set out some useful distinctions (section 1) and confirm that species are not natural kinds (section 2). Section 3 takes up the separate issue of species definition, focusing on the Biological Species Concept (BSC). Sections 4 and 5 examine the matter of species identity over space and time respectively, as determined by the BSC. Both gradualistic and punctuated equilibrium models of speciation are discussed. In section 6 I argue that the BSC fails to determine adequate criteria for dealing with certain kinds of speciation. Section 7 moves speculatively beyond the BSC to a brief examination of alternatives.

Book Chapter
TL;DR: In this paper, the author defines the commensurability of two theories as the ratio of the total information of their shared answers to the total information of the answers yielded by the two theories combined.
Abstract: The commensurability of two theories can be defined (relative to a given set of questions) as the ratio of the total information of their shared answers to the total information of the answers yielded by the two theories combined. Answers should be understood here as model consequences (in the sense of the author’s earlier papers), not deductive consequences. This definition is relative to a given model of the joint language of the theories, but can be generalized to sets of models. It turns out to capture also the idea of incommensurability as conceptual alienation. Incommensurability so defined does not imply incomparability.
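In symbols (a compact restatement under assumed notation): if \(A(T_i)\) is the set of answers, in the sense of model consequences, that theory \(T_i\) gives to the question set \(Q\), and \(I(\cdot)\) measures total information relative to a model of the joint language, then
\[
\mathrm{comm}(T_1, T_2 \mid Q) \;=\; \frac{I\big(A(T_1) \cap A(T_2)\big)}{I\big(A(T_1) \cup A(T_2)\big)},
\]
a ratio that is 1 when the theories answer alike and falls toward 0 as their shared information shrinks, which is how incommensurability as conceptual alienation gets a degree.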

Journal Article
TL;DR: The author argues that even if it is advantageous to use rational strategies, it does not follow that we actually use them, and that natural selection need not favor only or even primarily reliable belief-forming strategies.
Abstract: A tempting argument for human rationality goes like this: it is more conducive to survival to have true beliefs than false beliefs, so it is more conducive to survival to use reliable belief-forming strategies than unreliable ones. But reliable strategies are rational strategies, so there is a selective advantage to using rational strategies. Since we have evolved, we must use rational strategies. In this paper I argue that some criticisms of this argument offered by Stephen Stich fail because they rely on unsubstantiated interpretations of some results from experimental psychology. I raise two objections to the argument: (i) even if it is advantageous to use rational strategies, it does not follow that we actually use them; and (ii) natural selection need not favor only or even primarily reliable belief-forming strategies.

Journal Article
TL;DR: In this paper, the formal time symmetry of the quantum measurement process is investigated, and it is shown that a sequence of statements bearing on quantum measurements may display intrinsically asymmetric properties, irrespective of the location of the corresponding measurements in the time t of the Schrödinger equation; the situation of an observer performing two measurements in two opposite directions of t is also investigated.
Abstract: The formal time symmetry of the quantum measurement process is extensively discussed. Then, the origin of the alleged association between a fixed temporal direction and quantum measurements is investigated. It is shown that some features of such an association might arise from epistemological rather than purely physical assumptions. In particular, it is brought out that a sequence of statements bearing on quantum measurements may display intrinsic asymmetric properties, irrespective of the location of corresponding measurements in time t of the Schrödinger equation. The situation of an observer performing two measurements in two opposite directions of t is eventually investigated. Essential differences are found between two descriptions of this situation: the internal one (taking only into account what is recorded in the observer's memory) and the external one (whereby the observer is considered as a quantum system ruled by the Schrödinger equation). Finally, a method allowing several observers to establi...
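The formal time symmetry in the background here is the standard one (stated for a real, time-independent Hamiltonian; the paper's setting may be more general): if \(\psi(t)\) satisfies the Schrödinger equation, so does its time-reversed conjugate,
\[
i\hbar\,\partial_t \psi(t) = H\psi(t) \;\Longrightarrow\; i\hbar\,\partial_t \psi^{*}(-t) = H\psi^{*}(-t),
\]
so unitary evolution by itself picks out no direction of \(t\); any asymmetry must be introduced with measurement or, as argued above, with epistemological assumptions about what is recorded.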

Journal Article
TL;DR: The author argues that the resources made available within the constraints of the vector function approach are sufficient, indeed apt, for the physicalist enterprise, by offering a vector functional theory of the percept: the perceptual experience itself, a paradigm of phenomenally immediate, introspectively accessible consciousness.
Abstract: Physicalism is an empirical theory of the mind and its place in nature. So the physicalist must show that current neuroscience does not falsify physicalism, but instead supports it. Current neuroscience shows that a nervous system is what I call a vector function system. I provide a brief outline of the resources that empirical research has made available within the constraints of the vector function approach. Then I argue that these resources are sufficient, indeed apt, for the physicalist enterprise, by offering a vector functional, hence physicalist, theory of the percept--the perceptual experience itself, a paradigm of phenomenally immediate, introspectively accessible consciousness.

Journal Article
TL;DR: The authors argue that Shapere's view that there are typically good reasons for scientific change ultimately presupposes the requirement of universal standards of scientific reasoning, and that the good reasons established by his account underdetermine the rationality of scientific change and allow that other changes would have been equally or even more rational.
Abstract: I argue that post-Kuhnian approaches to rational scientific change fail to appreciate several distinct philosophical requirements and relativist challenges that have been assumed to be, and may in fact be essential to any adequate conception of scientific rationality. These separate requirements and relativist challenges are clearly distinguished and motivated. My argument then focuses on Shapere's view that there are typically good reasons for scientific change. I argue: (1) that contrary to his central aim, his account of good reasons ultimately presupposes the requirement of universal standards of scientific reasoning; (2) that the good reasons established by his account underdetermine the rationality of scientific change and allow that other changes would have been equally or even more rational; (3) that as a result, Shapere's approach fails to meet what I characterize as the challenges of moderate, sociological, and cognitive relativism.

Journal Article
TL;DR: In this paper, a hierarchical maximization of (1) conditional causal expected utility and (2) a new backtracking expected utility is proposed, with the backtracking utility maximized among the options that are causally ratifiable.
Abstract: Causal decision theory produces decision instability in cases such as Death in Damascus where a decision itself provides evidence concerning the utility of options. Several authors have proposed ways of handling this instability. William Harper (1985 and 1986) advances one of the most elegant proposals. He recommends maximizing causal expected utility among the options that are causally ratifiable. Unfortunately, Harper's proposal imposes certain restrictions; for instance, the restriction that mixed strategies are freely available. To obtain a completely general method of handling decision instability, I step outside the confines of pure causal decision theory. I introduce a new kind of backtracking expected utility and propose maximizing it among the options that are causally ratifiable. In other words, I propose a hierarchical maximization of (1) conditional causal expected utility and (2) the new backtracking expected utility. I support this proposal with some intuitive considerations concerning the d...
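In Gibbard-Harper style notation (an assumed formulation; the paper's own definitions are not reproduced here), the causal expected utility of an option \(A\) is
\[
U(A) \;=\; \sum_i P(A \,\square\!\!\rightarrow S_i)\,u(A \wedge S_i),
\]
where \(A \,\square\!\!\rightarrow S_i\) is the counterfactual that state \(S_i\) would obtain if \(A\) were performed, and \(A\) is causally ratifiable when, computed on the supposition that \(A\) is chosen, no alternative \(B\) has higher causal expected utility. The proposal is then two-tiered: restrict attention to the ratifiable options, and select among them by maximizing the new backtracking expected utility.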

Journal Article
TL;DR: Glymour's Theory and Evidence contains an attempt to give a formal characterization of the relationships involved for theories with a first-order formalization; Glymour called the process of testing a hypothesis by using that same hypothesis, or others in the same theory, "bootstrap testing".
Abstract: A familiar fact is that we use our background knowledge and some of our hypotheses in arguing for or against other hypotheses. But the structure of such arguments is difficult to capture. Glymour's Theory and Evidence contains an attempt to give a formal characterization of the relationships involved for theories with a first-order formalization, and Glymour called the process of testing a hypothesis in the theory by using that same hypothesis, or others in the same theory, "bootstrap testing". However, the original conditions for bootstrap testing were not strong enough, as shown by a series of counterexamples by Christensen (1983). Glymour (1983) proposed to strengthen the account by adding an extra requirement, condition (R). Zytkow (1986) claimed that (R) is too strong. As an alternative to Glymour (1983), Zytkow provided his own version of bootstrapping which, he claimed, avoided both the Christensen counterexamples and the objection to (R). We will show that Zytkow's version of bootstrapping is unacceptable and that his worry about condition (R) is unfounded. However, we will also argue that Zytkow's remarks point to what is, perhaps, a more satisfying version of (R).

Journal Article
TL;DR: In this paper, the authors argue that convergence is not a necessary property of an inference rule or estimator, and present an example in which a rule of inference has a likelihood rationale but is not convergent.
Abstract: A common view among statisticians is that convergence (which statisticians call consistency) is a necessary property of an inference rule or estimator. In this paper, this view is challenged by appeal to an example in which a rule of inference has a likelihood rationale but is not convergent. The example helps clarify the significance of the likelihood concept in statistical inference.
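The property under discussion has a standard definition (textbook statistics, not specific to this paper): an estimator \(T_n\) of a parameter \(\theta\) is consistent, i.e., convergent, when it converges in probability to \(\theta\),
\[
\lim_{n \to \infty} P\big(\lvert T_n - \theta \rvert > \epsilon\big) = 0 \quad \text{for every } \epsilon > 0.
\]
The paper's example is a rule of inference with a likelihood rationale that nonetheless lacks this property, which is what puts pressure on the view that consistency is necessary.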

Journal Article
TL;DR: This paper considers the epistemic status of the auxiliaries involved in the confirmation of a hypothesis relative to auxiliary assumptions or background theory, and shows that if the auxiliaries are not themselves credible, confirmation relative to them will not increase the credibility of the hypothesis thus confirmed.
Abstract: Recent work on the logical theory of confirmation has centered on accounts of the confirmation of hypotheses relative to auxiliary assumptions or background theory. Whether such relative confirmation actually increases the credibility of the (relatively) confirmed hypothesis will depend in various ways on the epistemic status of the auxiliaries involved. Most obviously, if the auxiliaries are not themselves credible, confirmation relative to them will not increase the credibility of the hypothesis thus confirmed. A complete theory of confirmation must thus combine an account of relative confirmation with an account of the route from relative confirmation to real confirmation. Some recent criticisms of hypothetico-deductive and bootstrapping accounts of relative confirmation are undermined by failure to appreciate the limitations of relative confirmation.
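In hypothetico-deductive form the dependence on auxiliaries is easy to display (a standard schema, not the authors' own formalism): evidence \(E\) confirms hypothesis \(H\) relative to auxiliaries \(A\) when
\[
(H \wedge A) \vDash E \qquad \text{while} \qquad A \nvDash E,
\]
and the step from this relative confirmation to real confirmation, an actual increase in the credibility of \(H\), is only as good as the independent credibility of \(A\).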

Journal Article
TL;DR: This paper shows that the modern theory of measure is not capable of refuting Zeno's metrical paradox in the sense of showing his error, and argues that the paradox is consequently more than a mere sophism.
Abstract: Professor Grünbaum's much-discussed refutation of Zeno's metrical paradox turns out to be ad hoc upon close examination of the relevant portion of measure theory. Although the modern theory of measure is able to defuse Zeno's reasoning, it is not capable of refuting Zeno in the sense of showing his error. I explain why the paradox is not refutable and argue that it is consequently more than a mere sophism.
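The measure-theoretic point at issue is the restriction of additivity to countable families: Lebesgue measure satisfies
\[
\mu\Big(\bigcup_{n=1}^{\infty} A_n\Big) = \sum_{n=1}^{\infty} \mu(A_n)
\]
for pairwise disjoint \(A_1, A_2, \ldots\), while an interval such as \([0,1]\), with \(\mu([0,1]) = 1\), is an uncountable union of singletons each of measure zero. Since additivity is stipulated only for countably many pieces, Zeno's decomposition generates no contradiction; the paper's charge is that invoking this restriction defuses the paradox without showing where Zeno's reasoning goes wrong.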

Book Chapter
TL;DR: In this article, the authors analyze the conditions that make it possible to refer to the same mathematical item, such as the real line, the triangle, sets, or the natural numbers, through a variety of axiomatic presentations.
Abstract: The items of mathematics, such as the real line, the triangle, sets, and the natural numbers, share the property of retaining their identity while receiving axiomatic presentations which may vary radically. Mathematicians have axiomatized the real line as a one-dimensional continuum, as a complete Archimedean ordered field, as a real closed field, or as a system of binary decimals on which arithmetical operations are performed in a certain way. Each of these axiomatizations is tacitly understood by mathematicians as an axiomatization of the same real line. That is, the mathematical item thereby axiomatized is presumed to be the same in each case, and such an identity is not questioned. We wish to analyze the conditions that make it possible to refer to the same mathematical item through a variety of axiomatic presentations.

Journal Article
TL;DR: It is argued that Galileo's theory of justification was a version of explanationism, and that if Galileo was a realist, his realism was so highly constrained as to be irrelevant.
Abstract: It is argued that Galileo's theory of justification was a version of explanationism. Galileo's Dialogue on the Two Chief World Systems is to be read as primarily a defense of his theory of the tides. He shows how, by assuming Copernican motions, he can explain the tides, thereby justifying the endorsement of Copernicus. The crux of the argument rests on Galileo's account of explanation, which is novel in its reliance on the use of geometry. Finally, the consequences of his use of geometry, and his views on the limits of knowledge, force us to conclude that if Galileo was a realist, his realism was so highly constrained as to be irrelevant.


Journal Article
TL;DR: In this article, the authors reconstruct two different representation-theoretic or embedding accounts of space-time relationalism, involving two different conditions on embeddings: uniqueness up to symmetry and uniqueness up to indistinguishability.
Abstract: From recent writings of Brent Mundy and Michael Friedman we reconstruct two different representation-theoretic or embedding accounts of space-time relationalism, involving two different conditions on embeddings: respectively, uniqueness up to symmetry and uniqueness up to indistinguishability. We discuss the properties of these two accounts, and, with respect specifically to Friedman's projects, assess their merits and demerits.

Journal Article
TL;DR: In this paper, the author defends Paul Feyerabend's claim that there are some scientific theories that cannot be refuted unless one of their rivals is first confirmed, by criticizing Ronald Laymon's well-known attack on Feyerabend's claim, and argues both that the Second Law of Thermodynamics was not refuted before the Kinetic Theory's predictions were confirmed, and that it could not have been refuted without the confirmation of the remarkable predictions of some rival theory.
Abstract: In this paper, I will defend Paul Feyerabend's claim--that there are some scientific theories that cannot be refuted unless one of their rivals is first confirmed--by criticizing Ronald Laymon's well-known attack on Feyerabend's claim. In particular, I will argue both that the Second Law of Thermodynamics was not refuted before the Kinetic Theory's predictions were confirmed, and that it could not have been refuted without the confirmation of the remarkable predictions of some rival theory.

Journal Article
TL;DR: Siegel's account is in fact compatible with virtually any formulation of evidential support, which runs afoul of his claim that scientific beliefs must be evaluated with respect to their rationality.
Abstract: Harvey Siegel (1985) attempts to revive the traditional epistemological formulation of the rationality of science. Contending that "a general commitment to evidence" is constitutive of method and rationality in science, Siegel advances its compatibility with specific, historically attuned formulations of principles of evidential support as a virtue of his aprioristic candidate for science's rationality. In point of fact, this account is compatible with virtually any formulation of evidential support, which runs afoul of Siegel's claim that scientific beliefs must be evaluated with respect to their rationality. The unwelcome consequence of Siegel's view is that most any belief, scientific or pseudoscientific, can be defended as rational. Indeed, if we want to furnish a warrant for rational choice, we must turn to the very historically informed principles of evidential support that are dismissed by Siegel as providing a misleading portrait of science's rationality.

Journal Article
TL;DR: This paper argues that the EPR correlations cannot be explained by signals transmitted from one component of an EPR compound to the other, and that there is, in the EPR situation, no empirically verifiable action at a distance.
Abstract: (1) The EPR correlations cannot be explained by signals being transmitted from one component of an EPR compound to the other (pp. 114-115). (2) There is, in the EPR situation, no empirically verifiable action at a distance (pp. 124-126). (3) The demand for an explanation of the EPR correlations is similar to the Aristotelian demand of the post-Newtonian proponents of the law of inertia to explain what keeps a body moving if there are no forces impressed on it (pp. 126-128).

Journal Article
TL;DR: This article argued that if justification is seen from a naturalized standpoint, more attention to the actual process of epistemic justification might be in order and that the justificatory set might come to be seen more descriptively and less normatively.
Abstract: The current project of "naturalizing" epistemology has left epistemologists with a plethora of theories alleged to fall under that rubric. Recent epistemic justification theorists have seemed to want to focus on theories of epistemic justification that are more contextualized (naturalized) and less normatively global than those of the past. This paper has two central arguments: (i) that if justification is seen from a naturalized standpoint, more attention to the actual process of epistemic justification might be in order (and, hence, that the justificatory set might come to be seen more descriptively and less normatively), and (ii) that if any theory of epistemic justification were to be normatively accurate, regardless of the size of its justificatory set, then one of the requirements upon it might well be that key terms in the set would refer, in the spirit of the new scientific realism. The central thesis of the paper relies on the normative/descriptive distinction as applicable to epistemic justifica...