
Showing papers in "Philosophy of Science in 2001"


Journal ArticleDOI
TL;DR: The synthesis of Cummins' account with recent work on mechanisms and causal/mechanical explanation produces an analysis of specifically mechanistic role functions, one that uses the characteristic active, spatial, temporal, and hierarchical organization of mechanisms to add precision and content to Cummins' original suggestion.
Abstract: Many areas of science develop by discovering mechanisms and role functions. Cummins' (1975) analysis of role functions-according to which an item's role function is a capacity of that item that appears in an analytic explanation of the capacity of some containing system-captures one important sense of "function" in the biological sciences and elsewhere. Here I synthesize Cummins' account with recent work on mechanisms and causal/mechanical explanation. The synthesis produces an analysis of specifically mechanistic role functions, one that uses the characteristic active, spatial, temporal, and hierarchical organization of mechanisms to add precision and content to Cummins' original suggestion. This synthesis also shows why the discovery of role functions is a scientific achievement. Discovering a role function (i) contributes to the interlevel integration of multilevel mechanisms, and (ii) provides a unique, contextual variety of causal/mechanical explanation.

399 citations


Journal ArticleDOI
TL;DR: It is argued that on any definition of information the view that development is the expression of genetic information is misleading, and that Maynard Smith fails to show that developmental biology is concerned with teleosemantic information.
Abstract: John Maynard Smith has defended against philosophical criticism the view that developmental biology is the study of the expression of information encoded in the genes by natural selection. However, like other naturalistic concepts of information, this 'teleosemantic' information applies to many non-genetic factors in development. Maynard Smith also fails to show that developmental biology is concerned with teleosemantic information. Some other ways to support Maynard Smith's conclusion are considered. It is argued that on any definition of information the view that development is the expression of genetic information is misleading. Some reasons for the popularity of that view are suggested.

297 citations


Journal ArticleDOI
TL;DR: This paper argues that simulation is a rich inferential process, and not simply a "number crunching" technique, and places more emphasis on the role of theory in guiding (rather than determining) the construction of models.
Abstract: Using an example of a computer simulation of the convective structure of a red giant star, this paper argues that simulation is a rich inferential process, and not simply a "number crunching" technique. The scientific practice of simulation, moreover, poses some interesting and challenging epistemological and methodological issues for the philosophy of science. I will also argue that these challenges would be best addressed by a philosophy of science that places less emphasis on the representational capacity of theories (and ascribes that capacity instead to models) and more emphasis on the role of theory in guiding (rather than determining) the construction of models.

126 citations


Journal ArticleDOI
TL;DR: In this paper, the authors use evidence from cognitive psychology and the history of science to examine the issue of the theory-ladenness of perceptual observation, and they conclude that the evidence for theory-ladenness does not lead to a relativist account of scientific knowledge.
Abstract: We use evidence from cognitive psychology and the history of science to examine the issue of the theory-ladenness of perceptual observation. This evidence shows that perception is theory-laden, but that it is only strongly theory-laden when the perceptual evidence is ambiguous or degraded, or when it requires a difficult perceptual judgment. We argue that debates about the theory-ladenness issue have focused too narrowly on the issue of perceptual experience, and that a full account of the scientific process requires an examination of theory-ladenness in attention, perception, data interpretation, data production, memory, and scientific communication. We conclude that the evidence for theory-ladenness does not lead to a relativist account of scientific knowledge.

116 citations


Journal ArticleDOI
TL;DR: The authors show that underdetermination does not depend on empirical equivalents: our warrant for current theories is equally undermined by presently unconceived alternatives that are just as well confirmed by the existing evidence, so long as this transient predicament recurs for each theory and body of evidence we consider.
Abstract: Advocates have sought to prove that underdetermination obtains because all theories have empirical equivalents. But algorithms for generating empirical equivalents simply exchange underdetermination for familiar philosophical chestnuts, while the few convincing examples of empirical equivalents will not support the desired sweeping conclusions. Nonetheless, underdetermination does not depend on empirical equivalents: our warrant for current theories is equally undermined by presently unconceived alternatives that are just as well confirmed by the existing evidence, so long as this transient predicament recurs for each theory and body of evidence we consider. The historical record supports the claim that this recurrent, transient underdetermination predicament is our own.

98 citations


Journal ArticleDOI
TL;DR: The potential for contemporary research in human population genetics to contribute to racism needs to be considered with respect to the ability of the typological-population distinction to arbitrate boundaries between racist society and nonracist, even anti-racist, science.
Abstract: This paper questions the prevailing historical understanding that scientific racism "retreated" in the 1950s when anthropology adopted the concepts and methods of population genetics and race was recognized to be a social construct and replaced by the concept of population. More accurately, a "populational" concept of race was substituted for a "typological one"--this is demonstrated by looking at the work of Theodosius Dobzhansky circa 1950. The potential for contemporary research in human population genetics to contribute to racism needs to be considered with respect to the ability of the typological-population distinction to arbitrate boundaries between racist society and nonracist, even anti-racist, science. I point out some ethical limits of "population thinking" in doing so.

98 citations


Book ChapterDOI
TL;DR: Structural Realism (SR) as discussed by the authors is a philosophical position concerning what there is in the world and what can be known of it, and it is realist because it asserts the existence of a mind-independent world, and structural because what is knowable of the world is its structure only.
Abstract: Structural Realism (SR) is meant to be a substantive philosophical position concerning what there is in the world and what can be known of it. It is realist because it asserts the existence of a mind-independent world, and it is structural because what is knowable of the world is said to be its structure only. As a slogan, the thesis is that knowledge can reach only up to the structural features of the world. This chapter unravels and criticises the metaphysical presuppositions of SR. It questions its very possibility as a substantive — and viable — realist thesis.

98 citations


Journal ArticleDOI
TL;DR: In this article, the authors argue that careful attention to the dynamical properties of thermodynamically irreversible processes shows that in many ordinary cases, Lewis's analysis fails to yield this asymmetry.
Abstract: In "Counterfactual Dependence and Time's Arrow", David Lewis defends an analysis of counterfactuals intended to yield the asymmetry of counterfactual dependence: that later affairs depend counterfactually on earlier ones, and not the other way around. I argue that careful attention to the dynamical properties of thermodynamically irreversible processes shows that in many ordinary cases, Lewis's analysis fails to yield this asymmetry. Furthermore, the analysis fails in an instructive way: it teaches us something about the connection between the asymmetry of overdetermination and the asymmetry of entropy.

96 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examine the following problems: How many concepts of function are there in biology, social science, and technology? Are they logically related and if so, how? Which of these function concepts effect a functional explanation as opposed to a mere functional account? What are the consequences of a pluralist view of functions for functionalism?
Abstract: In this paper we examine the following problems: How many concepts of function are there in biology, social science, and technology? Are they logically related and if so, how? Which of these function concepts effect a functional explanation as opposed to a mere functional account? What are the consequences of a pluralist view of functions for functionalism? We submit that there are five concepts of function in biology, which are logically related in a particular way, and six function concepts in social science and technology. Only two of them may help effect a genuine functional explanation. Finally, our synthetic approach allows us to distinguish four different varieties of functionalism in biology, psychology, social science, and technology: formalist, black boxist, adaptationist, and teleological. And only one of them is explanatory in the strong sense defended here.

94 citations


Journal ArticleDOI
TL;DR: In this paper, a Bayesian account of independent evidential support, partly inspired by the work of C. S. Peirce, is outlined, and it is shown that a large class of quantitative Bayesian measures of confirmation satisfy some basic desiderata for adequate accounts of independent evidence.
Abstract: A Bayesian account of independent evidential support is outlined. This account is partly inspired by the work of C. S. Peirce. I show that a large class of quantitative Bayesian measures of confirmation satisfy some basic desiderata suggested by Peirce for adequate accounts of independent evidence. I argue that, by considering further natural constraints on a probabilistic account of independent evidence, all but a very small class of Bayesian measures of confirmation can be ruled out. In closing, another application of my account to the problem of evidential diversity is also discussed.

92 citations
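As a concrete illustration of what a quantitative Bayesian measure of confirmation is, the sketch below computes two textbook measures (the difference measure and the log-likelihood-ratio measure) for a toy hypothesis and body of evidence. The probabilities and function names are assumptions made for illustration; this is not the paper's own formalism or its Peircean desiderata.

```python
# Illustrative sketch: two standard Bayesian confirmation measures,
# evaluated on made-up probabilities for a hypothesis H and evidence E.
import math

def difference_measure(p_h, p_h_given_e):
    """d(H, E) = P(H|E) - P(H); positive exactly when E confirms H."""
    return p_h_given_e - p_h

def log_likelihood_ratio(p_e_given_h, p_e_given_not_h):
    """l(H, E) = log[P(E|H) / P(E|~H)]; positive exactly when E confirms H."""
    return math.log(p_e_given_h / p_e_given_not_h)

# Toy probabilities (assumed purely for illustration)
p_h = 0.3                     # prior P(H)
p_e_given_h = 0.8             # likelihood P(E|H)
p_e_given_not_h = 0.2         # likelihood P(E|~H)

p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)  # law of total probability
p_h_given_e = p_e_given_h * p_h / p_e                  # Bayes' theorem

print(difference_measure(p_h, p_h_given_e))                 # ~0.33
print(log_likelihood_ratio(p_e_given_h, p_e_given_not_h))   # ~1.39
```

Measures of this kind can disagree in how they rank bodies of evidence, which is part of what makes the choice among them philosophically substantive.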


Journal ArticleDOI
TL;DR: The authors developed an account of explanation in biology which does not involve appeal to laws of nature, at least as traditionally conceived, and explored the differences between invariance and the related notions of stability and resiliency, due respectively to Sandra Mitchell and Brian Skyrms.
Abstract: This paper develops an account of explanation in biology which does not involve appeal to laws of nature, at least as traditionally conceived. Explanatory generalizations in biology must satisfy a requirement that I call invariance, but need not satisfy most of the other standard criteria for lawfulness. Once this point is recognized, there is little motivation for regarding such generalizations as laws of nature. Some of the differences between invariance and the related notions of stability and resiliency, due respectively to Sandra Mitchell and Brian Skyrms, are explored.

Journal ArticleDOI
TL;DR: It is argued that the epistemic basis of the initial model organism programs is not best understood as reasoning via causal analog models (CAMs), and it is suggested that a more valid characterization of the reasoning structure at work here is a form of case-based reasoning.
Abstract: Through an examination of the actual research strategies and assumptions underlying the Human Genome Project (HGP), it is argued that the epistemic basis of the initial model organism programs is not best understood as reasoning via causal analog models (CAMs). In order to answer a series of questions about what is being modeled and what claims about the models are warranted, a descriptive epistemological method is employed that uses historical techniques to develop detailed accounts which, in turn, help to reveal forms of reasoning that are explicit, or more often implicit, in the practice of a particular field of scientific study. It is suggested that a more valid characterization of the reasoning structure at work here is a form of case-based reasoning. This conceptualization of the role of model organisms can guide our understanding and assessment of these research programs, their knowledge claims and progress, and their limitations, as well as how we educate the public about this type of biomedical research.

Journal ArticleDOI
TL;DR: In this article, a taxonomy of higher-order properties is presented that distinguishes three classes of emergent properties: (1) ontologically basic properties of complex entities, such as the mythical vital properties, (2) fully configurational properties, and (3) highly configurational/holistic properties, such as the higher-level patterns characteristic of complex dynamical systems.
Abstract: A variety of recent philosophical discussions, particularly on topics relating to complexity, have begun to reemploy the concept of 'emergence'. Although multiple concepts of 'emergence' are available, little effort has been made to systematically distinguish them. In this paper, I provide a taxonomy of higher-order properties that (inter alia) distinguishes three classes of emergent properties: (1) ontologically basic properties of complex entities, such as the mythical vital properties, (2) fully configurational properties, such as mental properties as they are conceived of by functionalists and computationalists, and (3) highly configurational/holistic properties, such as the higher-level patterns characteristic of complex dynamical systems. Or more simply: emergence as ontological liberality, emergence as multiple realizability, and emergence as interactive complexity.

Journal ArticleDOI
TL;DR: The quantum gravity program seeks a theory that handles quantum matter fields and gravity consistently, but is such a theory really required, and must it involve quantizing the gravitational field? The authors give reasons for a positive answer to the first question, but dispute a widespread contention that it is inconsistent for the gravitational field to be classical while matter is quantum.
Abstract: The quantum gravity program seeks a theory that handles quantum matter fields and gravity consistently. But is such a theory really required and must it involve quantizing the gravitational field? We give reasons for a positive answer to the first question, but dispute a widespread contention that it is inconsistent for the gravitational field to be classical while matter is quantum. In particular, we show how a popular argument (Eppley and Hannah 1977) falls short of a no-go theorem, and discuss possible counterexamples. Important issues in the foundations of physics are shown to bear crucially on all these considerations.

Journal ArticleDOI
TL;DR: "Theory and Method in the Neurosciences" surveys the nature and structure of theories in contemporary neuroscience, exploring many of its methodological techniques and problems.
Abstract: "Theory and Method in the Neurosciences" surveys the nature and structure of theories in contemporary neuroscience, exploring many of its methodological techniques and problems. The essays explore basic questions about how to relate theories of neuroscience and cognition, the multilevel character of such theories, and their experimental bases. Philosophers and scientists (and some who are both) examine the topics of explanation and mechanisms, simulation and computation, imaging and animal models that raise questions about the forefront of research in cognitive neuroscience. Their work will stimulate new thinking in anyone interested in the mind or brain and in recent theories of their connections.

Journal ArticleDOI
TL;DR: In this article, the authors reconstruct the Newton-Wigner localization scheme and clarify the limited extent to which it avoids the counterintuitive consequences of the Reeh-Schlieder theorem.
Abstract: Many of the "counterintuitive" features of relativistic quantum field theory have their formal root in the Reeh-Schlieder theorem, which in particular entails that local operations applied to the vacuum state can produce any state of the entire field. It is of great interest then that I.E. Segal and, more recently, G. Fleming (in a paper entitled "Reeh-Schlieder meets Newton-Wigner") have proposed an alternative "Newton-Wigner" localization scheme that avoids the Reeh-Schlieder theorem. In this paper, I reconstruct the Newton-Wigner localization scheme and clarify the limited extent to which it avoids the counterintuitive consequences of the Reeh-Schlieder theorem. I also argue that there is no coherent interpretation of the Newton-Wigner localization scheme that renders it free from act-outcome correlations at spacelike separation.

Journal ArticleDOI
TL;DR: In this article, it is argued that because normic laws are a product of evolution, there is a systematic connection between prototypical and statistical normality, and this evolutionary foundation explains their omnipresence, lawlikeness, and reliability.
Abstract: Normic laws have the form "if A, then normally B." They are omnipresent in everyday life and non-physical 'life' sciences such as biology, psychology, social sciences, and humanities. They differ significantly from ceteris-paribus laws in physics. While several authors have doubted that normic laws are genuine laws at all, others have argued that normic laws express a certain kind of prototypical normality which is independent of statistical majority. This paper presents a foundation for normic laws which is based on generalized evolution theory and explains their omnipresence, lawlikeness, and reliability. It is argued that the fact that normic laws are a product of evolution must establish a systematic connection between prototypical and statistical normality.

Journal ArticleDOI
TL;DR: In this paper, the authors use a vector model of least squares estimation to show that degrees of freedom, the difference between the observed parameters fit by the model and the number of explanatory parameters estimated, indicate the disconfirmability of the model.
Abstract: Model simplicity in curve fitting is the fewness of parameters estimated. I use a vector model of least squares estimation to show that degrees of freedom, the difference between the number of observed parameters fit by the model and the number of explanatory parameters estimated, are the number of potential dimensions in which data are free to differ from a model and indicate the disconfirmability of the model. Though often thought to control for parameter estimation, the AIC and similar indices do not do so for all model applications, while goodness of fit indices like chi-square, which explicitly take into account degrees of freedom, do. Hypothesis testing with prespecified values for parameters is based on a metaphoric regulative subject/object schema taken from object perception and has as its goal the accumulation of objective knowledge.
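As a minimal numerical illustration of the degrees-of-freedom bookkeeping described above (an assumed example, not drawn from the paper): fitting a two-parameter straight line to ten observations leaves 10 - 2 = 8 dimensions in which the data remain free to differ from the model.

```python
# Illustrative sketch: degrees of freedom for an ordinary least-squares line fit.
import numpy as np

rng = np.random.default_rng(0)
x = np.arange(10.0)                          # 10 observed data points
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, x.size)

k = 2                                        # parameters estimated: slope, intercept
slope, intercept = np.polyfit(x, y, deg=1)   # least-squares estimates
residuals = y - (slope * x + intercept)

df = x.size - k                              # degrees of freedom: 10 - 2 = 8
rss = float(np.sum(residuals ** 2))          # residual sum of squares

# A Gaussian-likelihood AIC for comparison: it penalizes k directly, but does
# not itself express the number of dimensions in which the data could have
# disagreed with the model, which is what df records.
aic = x.size * np.log(rss / x.size) + 2 * k
print(df, rss, aic)
```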

Journal ArticleDOI
TL;DR: In this article, the authors argue that the gauge potential is best understood as a feature of the physical situation whose global character is most naturally represented by the holonomies of closed curves in space-time.
Abstract: Classically, a gauge potential was merely a convenient device for generating a corresponding gauge field. Quantum-mechanically, a gauge potential lays claim to independent status as a further feature of the physical situation. But whether this is a local or a global feature is not made any clearer by the variety of mathematical structures used to represent it. I argue that in the theory of electromagnetism (or a non-Abelian generalization) that describes quantum particles subject to a classical interaction, the gauge potential is best understood as a feature of the physical situation whose global character is most naturally represented by the holonomies of closed curves in space-time.
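For orientation, the holonomy the abstract appeals to is, in the Abelian (electromagnetic) case, the familiar Dirac/Aharonov-Bohm phase factor. The expression below is a textbook formula included for context, not one drawn from the paper:

\[
\exp\!\Big(\tfrac{iq}{\hbar}\oint_{C}\mathbf{A}\cdot d\mathbf{x}\Big)
\;=\;\exp\!\Big(\tfrac{iq}{\hbar}\,\Phi_{C}\Big),
\qquad \Phi_{C}=\text{magnetic flux enclosed by the closed curve } C,
\]

which is gauge invariant even though the potential \(\mathbf{A}\) itself is not; this is the sense in which holonomies of closed curves capture the physically relevant, global content of the potential.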

Journal ArticleDOI
TL;DR: The role of molecular biology and of functional biology in the methodological debate between reductionism and antireductionism is discussed in this article.
Abstract: Within the methodological debate between reductionism and antireductionism, the author assesses the role of molecular biology and of functional biology in explaining evolutionary mechanisms and genetic combinations. Raising the problem of generalization and of the existence of scientific laws, the author defines biology as a historical discipline founded on natural selection.

Journal ArticleDOI
TL;DR: The main thrust of the paper is a theoretical and philosophical analysis of the claim made in September 1999 that atomic orbitals had been directly imaged for the first time; the conclusion is that, contrary to these claims, atomic orbitals have not in fact been observed.
Abstract: The main thrust of the paper involves a theoretical and philosophical analysis of the claim made in September 1999 that atomic orbitals have been directly imaged for the first time. After a brief account of the recent claims the paper reviews the development of the orbit and later orbital concepts and analyzes the theoretical status of atomic orbitals. The conclusion is that contrary to these claims, atomic orbitals have not in fact been observed. The non-referring nature of modern atomic orbitals is discussed in the context of Laudan's writings on realism, the success of theories, and whether or not scientific terms refer. I conclude that the failure to observe orbitals is a good prima facie case for divorcing the success of theories from the question of whether their central terms refer. The added relevance of this case is that it concerns a current and highly successful theory. Finally, the relevance of this 'floating model' to contemporary discussions on scientific models is briefly considered.

Journal ArticleDOI
TL;DR: As the seventeenth century progressed, there was a growing realization among those who reflected on the kind of knowledge the new sciences could afford (among them Kepler, Bacon, Descartes, Boyle, Huygens) that hypothesis would have to be conceded a much more significant place in natural philosophy than the earlier ideal of demonstration allowed.
Abstract: As the seventeenth century progressed, there was a growing realization among those who reflected on the kind of knowledge the new sciences could afford (among them Kepler, Bacon, Descartes, Boyle, Huygens) that hypothesis would have to be conceded a much more significant place in natural philosophy than the earlier ideal of demonstration allowed. Then came the mechanics of Newton's Principia, which seemed to manage quite well without appealing to hypothesis (though much would depend on how exactly terms like "force" and "attraction" were construed). If the science of motion could dispense with causal hypothesis and the attendant uncertainty, why should this not serve as the goal of natural philosophy generally? The apparent absence of causal hypothesis from the highly successful new science of motion went far towards shaping, in different ways, the account of scientific knowledge given by many of the philosophers of the century following, notable among them Berkeley, Hume, Reid, and Kant. This "Newtonian"...

Journal ArticleDOI
TL;DR: In this paper, the authors show how quantum indeterminism hooks up to point mutations (via tautomeric shifts, proton tunneling, and aqueous thermal motion) and conclude with a few thoughts on some of the wider implications of this topic.
Abstract: In "The Indeterministic Character of Evolutionary Theory: No 'Hidden Variables Proof' But No Room for Determinism Either," Brandon and Carson (1996) argue that evolutionary theory is statistical because the processes it describes are fundamentally statistical. In "Is Indeterminism the Source of the Statistical Character of Evolutionary Theory?" Graves, Horan, and Rosenberg (1999) argue in reply that the processes of evolutionary biology are fundamentally deterministic and that the statistical character of evolutionary theory is explained by epistemological rather than ontological considerations. In this paper I focus on the topic of mutation. By focusing on some of the theory and research on this topic from early to late, I show how quantum indeterminism hooks up to point mutations (via tautomeric shifts, proton tunneling, and aqueous thermal motion). I conclude with a few thoughts on some of the wider implications of this topic.

Journal ArticleDOI
TL;DR: In this paper, the cognitive impenetrability of perception is argued by undermining the argument from reentrant pathways, and the role of descending pathways is to enable the early-vision processing modules to participate in higher-level visual or cognitive functions.
Abstract: In this paper I argue for the cognitive impenetrability of perception by undermining the argument from reentrant pathways. To do that I will adduce psychological and neuropsychological evidence showing that (a) early vision processing is not affected by our knowledge about specific objects and events, and (b) that the role of the descending pathways is to enable the early-vision processing modules to participate in higher-level visual or cognitive functions. My thesis is that a part of observation, which I will call perception, is bottom-up and theory neutral. As such, perception could play the role of common ground on which a naturalized epistemology can be built and relativism avoided.

Journal ArticleDOI
TL;DR: The author distinguishes between the claim that groups have a psychology and the social manifestation thesis--a thesis about the psychology of individuals--and points out the connection to externalist conceptions of the mind familiar since the work of Putnam and Burge.
Abstract: David Sloan Wilson has recently revived the idea of a group mind as an application of group selectionist thinking to cognition. Central to my discussion of this idea is the distinction between the claim that groups have a psychology and what I call the social manifestation thesis-a thesis about the psychology of individuals. Contemporary work on this topic has confused these two theses. My discussion also points to research questions and issues that Wilson's work raises, as well as their connection to externalist conceptions of the mind familiar since the work of Putnam and Burge.


Journal ArticleDOI
TL;DR: In this paper, it is argued that nothing of global skeptical or agnostic significance follows from the kind of underdetermination presently encountered in fundamental quantum theory; the case is instructive, however, for what it shows about the characteristics and prospects of scientific realism as a perspective in contemporary philosophy of science.
Abstract: Recent attempts to turn Standard Quantum Theory into a coherent representational system have improved markedly over previous offerings. Important questions about the nature of material systems remain open, however, as current theorizing effectively resolves into a multiplicity of incompatible statements about the nature of physical systems. Specifically, the most cogent proposals to date land in effective empirical equivalence, reviving old anti-realist fears about quantum physics. In this paper such fears are discussed and found unsound. It is argued that nothing of global skeptical or agnostic significance follows from the kind of underdetermination presently encountered in fundamental quantum theory. The case is instructive, however, for what it shows about the characteristics and prospects of scientific realism as a perspective in contemporary philosophy of science.

Journal ArticleDOI
Marcel Weber
TL;DR: The author points out some internal problems in these positions and shows that the relationship between determinism, eliminability, realism, and the interpretation of probability is more complex than previously assumed in this debate.
Abstract: Recent discussion of the statistical character of evolutionary theory has centered around two positions: (1) Determinism combined with the claim that the statistical character is eliminable, a subjective interpretation of probability, and instrumentalism; (2) Indeterminism combined with the claim that the statistical character is ineliminable, a propensity interpretation of probability, and realism. I point out some internal problems in these positions and show that the relationship between determinism, eliminability, realism, and the interpretation of probability is more complex than previously assumed in this debate. Furthermore, I take some initial steps towards a more adequate account of the statistical character of evolutionary theory.

Journal ArticleDOI
TL;DR: Two Bayesian approaches to choosing between statistical models are contrasted and it is pointed out that the former approach establishes a new, philosophically interesting connection between the notions of simplicity and informativeness.
Abstract: Two Bayesian approaches to choosing between statistical models are contrasted. One of these is an approach which Bayesian statisticians regularly use for motivating the use of AIC, BIC, and other similar model selection criteria, and the other one is a new approach which has recently been proposed by Bandyopadhayay, Boik, and Basu. The latter approach is criticized, and the basic ideas of the former approach are presented in a way that makes them accessible to a philosophical audience. It is also pointed out that the former approach establishes a new, philosophically interesting connection between the notions of simplicity and informativeness.
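For reference, the standard textbook definitions of the two model selection criteria named above, where \(\hat{L}\) is the maximized likelihood of a model, \(k\) the number of estimated parameters, and \(n\) the sample size (given for orientation only, not as the paper's own derivation):

\[
\mathrm{AIC} \;=\; -2\ln\hat{L} + 2k,
\qquad
\mathrm{BIC} \;=\; -2\ln\hat{L} + k\ln n .
\]

Lower scores are preferred on either criterion; the BIC's \(k\ln n\) term penalizes additional parameters more heavily than the AIC's \(2k\) once \(n > e^{2} \approx 7.4\).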

Journal ArticleDOI
TL;DR: In this article, the authors analyze the consequences of recent theories of perception and vision developed within Cognitive Science for classical epistemological theses about observation, taking into account the empirical results of Cognitive Psychology (theories of perception).
Abstract: The aim of this paper is to analyze a philosophical question (neutrality vs. theory-ladenness of observation) taking into consideration the empirical results of Cognitive Psychology (theories of perception). This is an important debate because the objectivity of science is at stake. In the Philosophy of Science there are two main positions with regard to observation, those of C. Hempel and N. R. Hanson. In the Philosophy of Mind there are also two important contrasting positions, those of J. Fodor and Paul M. Churchland. I will analyze the consequences of recent theories of perception and vision developed within Cognitive Science for classical epistemological theses about observation.