
Showing papers in "Journal for General Philosophy of Science in 2017"


Journal ArticleDOI
TL;DR: The authors apply Benacerraf's distinction between mathematical ontology and mathematical practice to examine contrasting interpretations of infinitesimal mathematics of the seventeenth and eighteenth century, in the work of Bos, Ferraro, Laugwitz, and others.
Abstract: We apply Benacerraf’s distinction between mathematical ontology and mathematical practice (or the structures mathematicians use in practice) to examine contrasting interpretations of infinitesimal mathematics of the seventeenth and eighteenth century, in the work of Bos, Ferraro, Laugwitz, and others. We detect Weierstrass’s ghost behind some of the received historiography on Euler’s infinitesimal mathematics, as when Ferraro proposes to understand Euler in terms of a Weierstrassian notion of limit and Fraser declares classical analysis to be a “primary point of reference for understanding the eighteenth-century theories.” Meanwhile, scholars like Bos and Laugwitz seek to explore Eulerian methodology, practice, and procedures in a way more faithful to Euler’s own. Euler’s use of infinite integers and the associated infinite products are analyzed in the context of his infinite product decomposition for the sine function. Euler’s principle of cancellation is compared to the Leibnizian transcendental law of homogeneity. The Leibnizian law of continuity similarly finds echoes in Euler. We argue that Ferraro’s assumption that Euler worked with a classical notion of quantity is symptomatic of a post-Weierstrassian placement of Euler in the Archimedean track for the development of analysis, as well as a blurring of the distinction between the dual tracks noted by Bos. Interpreting Euler in an Archimedean conceptual framework obscures important aspects of Euler’s work. Such a framework is profitably replaced by a syntactically more versatile modern infinitesimal framework that provides better proxies for his inferential moves.

29 citations


Journal ArticleDOI
TL;DR: The authors argue that the traditional distinction between basic and applied science is at odds with the way the distinction was understood by its supporters in debates on science education and science policy in the nineteenth and twentieth centuries, and show how a distinction that refers to difference on several epistemic and social dimensions makes good sense of representative historical cases.
Abstract: The traditional distinction between basic (“pure”) and applied science has been much criticized in recent decades. The criticism is based on a combination of historical and systematic epistemic argument. The present paper is mostly concerned with the historical aspect. I argue that the critics impose an understanding at odds with the way the distinction was understood by its supporters in debates on science education and science policy in the nineteenth and twentieth centuries. And I show how a distinction that refers to difference on several epistemic and social dimensions makes good sense of representative historical cases. If this argument is tenable it suggests more continuity in the epistemology and politics of science than has been claimed by a new paradigm of science studies and politics during recent decades.

27 citations


Journal ArticleDOI
TL;DR: In this article, the authors show that interlevel causation in mechanisms is indeed possible, if we take seriously the idea that the relata of the mechanistic level relation are acting entities and accept a slightly modified notion of a mechanistic level that is highly plausible in the light of the first clarification.
Abstract: According to the new mechanistic approach, an acting entity is at a lower mechanistic level than another acting entity if and only if the former is a component in the mechanism for the latter. Craver and Bechtel (Biol Philos 22(4):547–563, 2007. doi: 10.1007/s10539-006-9028-8 ) argue that a consequence of this view is that there cannot be causal interactions between acting entities at different mechanistic levels. Their main reason seems to be what I will call the Metaphysical Argument: things at different levels of a mechanism are related as part and whole; wholes and their parts cannot be related as cause and effect; hence, interlevel causation in mechanisms is impossible. I will analyze this argument in more detail and show under which conditions it is valid. This analysis will reveal that interlevel causation in mechanisms is indeed possible, if we take seriously the idea that the relata of the mechanistic level relation are acting entities and accept a slightly modified notion of a mechanistic level that is highly plausible in the light of the first clarification.

24 citations


Journal ArticleDOI
TL;DR: In this paper, the authors argue that renormalization group explanations are non-causal explanations because their explanatory power is due to the application of mathematical operations, which do not serve the purpose of representing causal relations.
Abstract: Renormalization group (RG) methods are an established strategy to explain how it is possible that microscopically different systems exhibit virtually the same macro behavior when undergoing phase-transitions. I argue – in agreement with Robert Batterman – that RG explanations are non-causal explanations. However, Batterman misidentifies the reason why RG explanations are non-causal: it is not the case that an explanation is non-causal if it ignores causal details. I propose an alternative argument, according to which RG explanations are non-causal explanations because their explanatory power is due to the application of mathematical operations, which do not serve the purpose of representing causal relations.

22 citations


Journal ArticleDOI
TL;DR: It is argued that Dellsén's criticisms against Bird’s view fail, and that increasing understanding cannot account for scientific progress, if acceptance, as opposed to belief, is required for scientific understanding.
Abstract: Bird (2007) argues that scientific progress consists in increasing knowledge. Dellsen (2016a) objects that increasing knowledge is neither necessary nor sufficient for scientific progress, and argues that scientific progress rather consists in increasing understanding. Dellsen also contends that unlike Bird’s view, his view can account for the scientific practices of using idealizations and of choosing simple theories over complex ones. I argue that Dellsen’s criticisms against Bird’s view fail, and that increasing understanding cannot account for scientific progress, if acceptance, as opposed to belief, is required for scientific understanding.

20 citations


Journal ArticleDOI
TL;DR: This review article illuminates the positions in the debate over whether simulations ultimately qualify as experiments or as thought experiments, evaluates the discourse, and gives an outlook on questions that have not yet been addressed.
Abstract: Where should computer simulations be located on the ‘usual methodological map’ (Galison 1996, 120) which distinguishes experiment from (mathematical) theory? Specifically, do simulations ultimately qualify as experiments or as thought experiments? Ever since Galison raised that question, a passionate debate has developed, pushing many issues to the forefront of discussions concerning the epistemology and methodology of computer simulation. This review article illuminates the positions in that debate, evaluates the discourse and gives an outlook on questions that have not yet been addressed.

13 citations


Journal ArticleDOI
TL;DR: In this article, the authors discuss three interrelated questions: first, is explanation in mathematics a topic that philosophers of mathematics can legitimately investigate? Second, are the specific aims that philosophers of mathematical explanation set themselves legitimate? And third, are the models of explanation developed by philosophers of science useful tools for philosophers of mathematical explanation?
Abstract: In this paper we discuss three interrelated questions. First: is explanation in mathematics a topic that philosophers of mathematics can legitimately investigate? Second: are the specific aims that philosophers of mathematical explanation set themselves legitimate? Finally: are the models of explanation developed by philosophers of science useful tools for philosophers of mathematical explanation? We argue that the answer to all these questions is positive. Our views are completely opposite to the views that Mark Zelcer has put forward recently. Throughout this paper, we show why Zelcer’s arguments fail.

12 citations


Journal ArticleDOI
TL;DR: The authors describe an error type they call the naturalizing error: an appeal to nature as a self-justified description dictating or limiting our choices in moral, economic, political, and other social contexts.
Abstract: We describe an error type that we call the naturalizing error: an appeal to nature as a self-justified description dictating or limiting our choices in moral, economic, political, and other social contexts. Normative cultural perspectives may be subtly and subconsciously inscribed into purportedly objective descriptions of nature, often with the apparent warrant and authority of science, yet not be fully warranted by a systematic or complete consideration of the evidence. Cognitive processes may contribute further to a failure to notice the lapses in scientific reasoning and justificatory warrant. By articulating this error type at a general level, we hope to raise awareness of this pervasive error type and to facilitate critiques of claims that appeal to what is “natural” as inevitable or unchangeable.

12 citations


Journal ArticleDOI
TL;DR: The social sciences need to take seriously their status as divisions of biology and recognize the central role of Darwinian processes in all the phenomena they seek to explain, the authors argue, and the analytical taxonomies of all the social sciences are shown to require a Darwinian approach to human affairs.
Abstract: The social sciences need to take seriously their status as divisions of biology. As such they need to recognize the central role of Darwinian processes in all the phenomena they seek to explain. An argument for this claim is formulated in terms of a small number of relatively precise premises that focus on the nature of the kinds and taxonomies of all the social sciences. The analytical taxonomies of all the social sciences are shown to require a Darwinian approach to human affairs, though not a nativist or genetically driven theory by any means. Non-genetic Darwinian processes have the fundamental role in all human affairs. I expound a general account of how Darwinian processes operate in human affairs by selecting for strategies and sets of strategies that individuals and groups employ. I conclude by showing how a great deal of social science can be organized in accordance with Tinbergen’s approach to biological inquiry, an approach required by the fact that the social sciences are all divisions of biology, and in particular the studies of one particular biological species.

10 citations


Journal ArticleDOI
TL;DR: In this article, the authors analyze the philosophical consequences of the recent discovery of direct violations of the time-reversal symmetry of weak interactions and the consequences of this discovery for the general problem of the possible connections between direction (arrow) of time and time-asymmetric laws of nature.
Abstract: The paper analyzes the philosophical consequences of the recent discovery of direct violations of the time–reversal symmetry of weak interactions. It shows that although we have here an important case of the time asymmetry of one of the fundamental physical forces which could have had a great impact on the form of our world with an excess of matter over antimatter, this asymmetry cannot be treated as the asymmetry of time itself but rather as an asymmetry of some specific physical process in time. The paper also analyzes the consequences of the new discovery for the general problem of the possible connections between direction (arrow) of time and time-asymmetric laws of nature. These problems are analyzed in the context of Horwich’s argumentation in Asymmetries in Time: Problems in the Philosophy of Science (1987), which tries to show that the existence of a time–asymmetric law of nature is a sufficient condition for time to be anisotropic. Instead of Horwich’s sufficient condition for the anisotropy of time, it is stressed that for a theory of the asymmetry of time to be acceptable it should explain all fundamental time asymmetries: the asymmetry of traces, the asymmetry of causation (which holds although the electrodynamic, strong and gravitational interactions are invariant under time reversal), and the asymmetry between the fixed past and open future. This is because the problem of the direction of time originated from our attempts to understand these asymmetries, and every plausible theory of the direction of time should explain them.

9 citations


Journal ArticleDOI
TL;DR: The idea of an extended mechanistic explanation, which makes explicit room for the role of environment in explanation, is introduced, which is believed to allow for mechanistic explanations regarding a broader group of scientific phenomena.
Abstract: Mechanistic accounts of explanation have recently found popularity within philosophy of science. Presently, we introduce the idea of an extended mechanistic explanation, which makes explicit room for the role of environment in explanation. After delineating Craver and Bechtel’s (2007) account, we argue this suggestion is not sufficiently robust when we take seriously the mechanistic environment and modeling practices involved in studying contemporary complex biological systems. Our goal is to extend the already profitable mechanistic picture by pointing out the importance of the mechanistic environment. It is our belief that extended mechanistic explanations, or mechanisms that take into consideration the temporal sequencing of the interplay between the mechanism and the environment, allow for mechanistic explanations regarding a broader group of scientific phenomena.

Journal ArticleDOI
TL;DR: In this article, a Bayesian analysis of the data directly obtained from studying members of the narrowest reference class is proposed, based on the ideas developed by Paul Thorn and John Pollock.
Abstract: The conflict of narrowness and precision in direct inference occurs if a body of evidence contains estimates for frequencies in a certain reference class and less precise estimates for frequencies in a narrower reference class. To develop a solution to this conflict, I draw on ideas developed by Paul Thorn and John Pollock. First, I argue that Kyburg and Teng’s solution to the conflict of narrowness and precision leads to unreasonable direct inference probabilities. I then show that Thorn’s recent solution to the conflict leads to unreasonable direct inference probabilities. Based on my analysis of Thorn’s approach, I propose a natural distribution for a Bayesian analysis of the data directly obtained from studying members of the narrowest reference class.

Journal ArticleDOI
TL;DR: In this paper, a detailed reconstruction and a critical analysis of Abraham Maslow's neglected psychological reading of Thomas Kuhn's famous dichotomy between "normal" and "revolutionary" science, which Maslow briefly expounded four years after the first edition of Kuhn’s The Structure of Scientific Revolutions, in his small book The Psychology of Science: A Reconnaissance (1966), and which relies heavily on his extensive earlier general writing in the motivational and personality psychology.
Abstract: In this paper, I offer a detailed reconstruction and a critical analysis of Abraham Maslow’s neglected psychological reading of Thomas Kuhn’s famous dichotomy between ‘normal’ and ‘revolutionary’ science, which Maslow briefly expounded four years after the first edition of Kuhn’s The Structure of Scientific Revolutions, in his small book The Psychology of Science: A Reconnaissance (1966), and which relies heavily on his extensive earlier general writing in motivational and personality psychology. Maslow’s Kuhnian ideas, put forward as part of a larger program for the psychology of science, outlined already in his 1954 magnum opus Motivation and Personality, are analyzed not only in the context of Kuhn’s original ‘psychologizing’ attitude toward understanding the nature and development of science, but also in a broader historical, intellectual and social context.

Journal ArticleDOI
TL;DR: This paper discusses an example of the suppression of medical evidence in recent influenza research and develops a conceptual framework for the description and assessment of questionable research practices applied in research and publication processes to suppress evidence.
Abstract: Financial conflicts of interest in medical research foster deviations from research standards and evidently lead to the suppression of research findings that are at odds with commercial interests of pharmaceutical companies. Questionable research practices prevent data from being created, made available, or given suitable recognition. They run counter to codified principles of responsible conduct of research, such as honesty, openness or respect for the law. Resulting in ignorance, misrepresentation and suspension of scientific self-correction, suppression of medical evidence in its various forms is a threat to both the epistemic and the moral integrity of medical science. This paper discusses an example of the suppression of medical evidence in recent influenza research and develops a conceptual framework for the description and assessment of questionable research practices applied in research and publication processes to suppress evidence.

Journal ArticleDOI
TL;DR: In this article, the authors reply to Climenhaga's arguments against their screening-off thesis (SOT) and suggest that SOT provides a criticism of the widely held theory of inference called "inference to the best explanation".
Abstract: We (2013, 2014) argued that explanatoriness is evidentially irrelevant in the following sense: Let H be a hypothesis, O an observation, and E the proposition that H would explain O if H and O were true. Then our claim is that Pr(H | O & E) = Pr(H | O). We defended this screening-off thesis (SOT) by discussing an example concerning smoking and cancer. Climenhaga (Philos Sci, forthcoming) argues that SOT is mistaken because it delivers the wrong verdict about a slightly different smoking-and-cancer case. He also considers a variant of SOT, called “SOT*”, and contends that it too gives the wrong result. We here reply to Climenhaga’s arguments and suggest that SOT provides a criticism of the widely held theory of inference called “inference to the best explanation”.
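The screening-off identity Pr(H | O & E) = Pr(H | O) can be made concrete with a small numerical sketch. The toy joint distribution below is a hypothetical construction of ours, not an example from the paper: it is built so that, conditional on O, the explanatoriness proposition E carries no further information about H, so the identity holds by design.

```python
# Toy illustration of screening off (our own construction, not from the paper):
# a joint distribution over three binary variables H, O, E in which E is
# conditionally independent of H given O, so Pr(H | O & E) = Pr(H | O).

# Hypothetical marginal Pr(H, O) and conditional Pr(E = 1 | O).
pr_ho = {(1, 1): 0.3, (1, 0): 0.1, (0, 1): 0.2, (0, 0): 0.4}
pr_e1_given_o = {1: 0.9, 0: 0.2}

# Full joint: Pr(h, o, e) = Pr(h, o) * Pr(e | o).
joint = {}
for (h, o), p in pr_ho.items():
    for e in (1, 0):
        pe = pr_e1_given_o[o] if e == 1 else 1.0 - pr_e1_given_o[o]
        joint[(h, o, e)] = p * pe

def pr(pred):
    """Probability of the event picked out by pred(h, o, e)."""
    return sum(p for (h, o, e), p in joint.items() if pred(h, o, e))

p_h_given_oe = pr(lambda h, o, e: h and o and e) / pr(lambda h, o, e: o and e)
p_h_given_o = pr(lambda h, o, e: h and o) / pr(lambda h, o, e: o)
print(p_h_given_oe, p_h_given_o)  # both equal 0.6 in this construction
```

Climenhaga's objection, on this way of putting things, amounts to denying that realistic cases have a joint distribution of this screened-off form; the construction only illustrates what the thesis asserts, not that it holds in general.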

Journal ArticleDOI
Saira Malik1
TL;DR: In this article, the authors present a detailed analysis of Hacking's observation versus experiment account, and argue that the Hacking approach is not an adequate framework for delineating scientific experimentation in a systematic manner.
Abstract: Observation and experiment as categories for analysing scientific practice have a long pedigree in writings on science. There has, however, been little attempt to delineate observation and experiment with respect to analysing scientific practice; in particular, scientific experimentation, in a systematic manner. Someone who has presented a systematic account of observation and experiment as categories for analysing scientific experimentation is Ian Hacking. In this paper, I present a detailed analysis of Hacking’s observation versus experiment account. Using a range of cases from various fields of scientific enquiry, I argue that the observation versus experiment account is not an adequate framework for delineating scientific experimentation in a systematic manner.

Journal ArticleDOI
TL;DR: In this article, a meta-representational model for interdisciplinary scientific activities is presented, in which interdisciplinarity is viewed in part as a process of integrating distinct scientific representational approaches.
Abstract: In this paper, I present a philosophical analysis of interdisciplinary scientific activities. I suggest that it is a fruitful approach to view interdisciplinarity in light of the recent literature on scientific representations. For this purpose I develop a meta-representational model in which interdisciplinarity is viewed in part as a process of integrating distinct scientific representational approaches. The analysis suggests that present methods for the evaluation of interdisciplinary projects place too much emphasis on non-epistemic aspects of disciplinary integrations while more or less ignoring whether specific interdisciplinary collaborations put us in a better, or worse, epistemic position. This leads to the conclusion that there are very good reasons for recommending a more cautious, systematic, and stringent approach to the development, evaluation, and execution of interdisciplinary science.

Journal ArticleDOI
TL;DR: In this paper, the authors seek to answer two new questions about truth and scientific change: (a) What lessons does the phenomenon of scientific change teach us about the nature of truth? (b) What light do recent developments in the theory of truth, incorporating these lessons, throw on problems arising from the prevalence of scientific changes, specifically, the problem of pessimistic meta-induction.
Abstract: The paper seeks to answer two new questions about truth and scientific change: (a) What lessons does the phenomenon of scientific change teach us about the nature of truth? (b) What light do recent developments in the theory of truth, incorporating these lessons, throw on problems arising from the prevalence of scientific change, specifically, the problem of pessimistic meta-induction?

Journal ArticleDOI
TL;DR: The notion of truthlikeness (verisimilitude, approximate truth), coined by Karl Popper, has very much fallen into oblivion, but the paper defends it.
Abstract: The notion of truthlikeness (verisimilitude, approximate truth), coined by Karl Popper, has very much fallen into oblivion, but the paper defends it. It can be regarded in two different ways. Either as a notion that is meaningful only if some formal measure of degree of truthlikeness can be constructed; or as a merely non-formal comparative notion that nonetheless has important functions to fulfill. It is the latter notion that is defended; it is claimed that such a notion is needed for both a reasonable backward-looking and a reasonable forward-looking view of science. On the one hand, it is needed in order to make sense of the history of science as containing a development; on the other, it is needed in order to understand present-day sciences as containing knowledge-seeking activities. The defense of truthlikeness requires also a defense of two other notions: quasi-comparisons and regulative ideas, which is supplied in this paper as well.

Journal ArticleDOI
TL;DR: In this article, an emergence relation between special science laws is proposed, which preserves the autonomy or novelty of each special science's laws but also shows their dependence: the autonomy of each level's generalisations is given because nomicity is conferred on them system-intrinsically; their dependence is established via their supervenience on lower-level laws.
Abstract: The better best system account, BBSA for short, is a variation on Lewis’s theory of laws. The difference to the latter is that the BBSA suggests that best system analyses can be executed for any fixed set of properties (instead of perfectly natural properties only). This affords the possibility to launch system analyses separately for the set of biological properties yielding the set of biological laws, chemical properties yielding chemical laws, and so on for the other special sciences. As such, the BBSA remains silent about possible interrelations between these freestanding sets of laws. In this paper, I explicate an emergence relation between them which preserves the autonomy or novelty of each special science’s laws but also shows their dependence: the autonomy of each level’s generalisations is given because nomicity is conferred on them system-intrinsically; their dependence is established via their supervenience on lower-level laws. As will be shown, the autonomy of special science laws is further strengthened by their ceteris paribus character.


Journal ArticleDOI
TL;DR: In this article, the authors offer a comprehensive refutation of the arguments necessitarians use to show that if natural necessities are posited, then the problem of induction is solved.
Abstract: Some philosophers who believe that there are necessary connections in nature take it that an advantage of their commitment is that the problem of induction is solved. This paper aims to offer a comprehensive refutation of the arguments necessitarians use to show that if natural necessities are posited, then there is no problem of induction. In section 2, two models of natural necessity are presented. The “Contingent Natural Necessity” section examines David Armstrong’s explanationist ‘solution’ to the problem of induction. The “Natural Necessity and IBE” section looks in detail into the claim that natural necessity is the best explanation of observed regularity. The “Dispositional Essentialism to the Rescue?” section moves on to Brian Ellis’s dispositional essentialist ‘solution’. The “Sankey’s Helping Hand” section examines Howard Sankey’s attempt to blend dispositional essentialism and explanationism.

Journal ArticleDOI
TL;DR: In this article, the joint derivation from a small set of elementary and ontologically neutral assumptions of both the Galilei and the Lorentz transformation exemplifies the virtues of structural approaches to the foundations of physical theories and the common origination of the resulting two relativistic frameworks sheds light on both the successes and the limitations of correspondence claims.
Abstract: Retention of structure across theory change has been invoked in support of a ‘structural’ alternative to more traditional entity-based scientific realism. In that context the transition from Newtonian mechanics to the Special Theory of Relativity is often regarded as a very significant instance of structural preservation, or retention, associated with correspondence-based recovery. The joint derivation, from a small set of elementary and ontologically neutral assumptions, of both the Galilei and the Lorentz transformation exemplifies the virtues of structural approaches to the foundations of physical theories. The common origination of the resulting two relativistic frameworks sheds light on both the successes and the limitations of correspondence claims. However, the cognitive-operational character of the basic assumptions lends no support to the structural realist’s ‘inference to the best explanation’.

Journal ArticleDOI
TL;DR: In this article, a case study of Waddington's epigenetic landscape images in biology is used to develop a descriptive framework applicable to heuristic roles of various visual metaphors in the sciences.
Abstract: Recent philosophical analyses of the epistemic dimension of images in the sciences show a certain trend in acknowledging potential roles of these images beyond their merely decorative or pedagogical functions. We argue, however, that this new debate has yet paid little attention to a special type of pictures, we call ‘visual metaphor’, and its versatile heuristic potential in organizing data, supporting communication, and guiding research, modeling, and theory formation. Based on a case study of Conrad Hal Waddington’s epigenetic landscape images in biology, we develop a descriptive framework applicable to heuristic roles of various visual metaphors in the sciences.

Journal ArticleDOI
TL;DR: This article argued that the Carroll-Chen cosmogonic model does not provide a plausible scientific explanation of the past hypothesis (the thesis that our universe began in an extremely low-entropy state).
Abstract: I argue that the Carroll–Chen cosmogonic model does not provide a plausible scientific explanation of the past hypothesis (the thesis that our universe began in an extremely low-entropy state). I suggest that this counts as a welcome result for those who adopt a Mill–Ramsey–Lewis best systems account of laws and maintain that the past hypothesis is a brute fact that is a non-dynamical law.



Journal ArticleDOI
TL;DR: In this article, the authors provide a characterization of ability theories of practice and, in this process, defend Pierre Bourdieu's ability theory against Stephen Turner's objections, and show that despite Turner's claims to the contrary, his arguments do not refute Bourdieseu's positive account.
Abstract: The aim of this paper is to provide a characterization of ability theories of practice and, in this process, to defend Pierre Bourdieu’s ability theory against Stephen Turner’s objections. In part I, I outline ability theorists’ conception of practices together with their objections to claims about rule following and rule explanations. In part II, I turn to the question of what ability theorists take to be the alternative to rule following and rule explanations. Ability theorists have offered, and been ascribed, somewhat different answers to this question, just as their replies, or positive accounts, have been heavily criticized by Turner. Due to this state of the debate, I focus on the positive account advanced by a single—and highly famous—ability theorist of practice, Pierre Bourdieu. Moreover, I show that despite Turner’s claims to the contrary, his arguments do not refute Bourdieu’s positive account.


Journal ArticleDOI
TL;DR: Putnam has been one of the greatest thinkers of our time, a philosopher who was able to propose groundbreaking ideas in virtually every area of philosophy. The changes some of his positions underwent, far from being a point of weakness, reveal the freshness and genuineness of Putnam's way of philosophising and at the same time the essence of philosophical discussion itself.
Abstract: Two convictions underlie the following article. The first is that Hilary Putnam has been one of the greatest thinkers of our time, a philosopher who was able to propose groundbreaking ideas in virtually every area of philosophy. As the reader will see, the topics he tackled in his writings included questions of philosophy of science, philosophy of language, philosophy of mathematics and logic, philosophy of mind, metaethics, the fact-value dichotomy, the interpretation of Wittgenstein’s later thought, the question of relativism, the analysis of rationality, the analysis of religious experience, the character of Jewish philosophy, the interpretation of pragmatism, the elucidation of the concept of truth, the question of realism, the relationship between mind and the world. The second is that the changes some of his positions underwent, far from being a point of weakness—as some critics have sometimes felt compelled to claim—reveal the freshness and genuineness of Putnam’s way of philosophising and at the same time the essence of philosophical discussion itself.