Showing papers in "Philosophy of Science in 1978"


Journal Article•DOI•
TL;DR: This paper proposes to reinterpret biological species as historical entities, a move which solves several important anomalies in biology, in philosophy of biology, and within philosophy itself, and which has important implications for any attempt to present an "evolutionary" analysis of science and for sciences such as anthropology which are devoted to the study of single species.
Abstract: Biological species have been treated traditionally as spatiotemporally unrestricted classes. If they are to perform the function which they do in the evolutionary process, they must be spatiotemporally localized individuals, historical entities. Reinterpreting biological species as historical entities solves several important anomalies in biology, in philosophy of biology, and within philosophy itself. It also has important implications for any attempt to present an "evolutionary" analysis of science and for sciences such as anthropology which are devoted to the study of single species.

640 citations


Journal Article•DOI•
TL;DR: This paper argues that the intuitively sanctioned distinction between beliefs and non-belief states that play a role in the proximate causal history of beliefs is a distinction worth preserving in cognitive psychology.
Abstract: It is argued that the intuitively sanctioned distinction between beliefs and non-belief states that play a role in the proximate causal history of beliefs is a distinction worth preserving in cognitive psychology. The intuitive distinction is argued to rest on a pair of features exhibited by beliefs but not by subdoxastic states. These are access to consciousness and inferential integration. Harman's view, which denies the distinction between beliefs and subdoxastic states, is discussed and criticized.

354 citations


Journal Article•DOI•
TL;DR: The aim of probabilistic explanation is not to demonstrate that the explanandum fact was nomically expectable, but to give an account of the chance mechanism(s) responsible for it; to this end, a deductive-nomological model of probabilistic explanation is developed and defended.
Abstract: It has been the dominant view that probabilistic explanations of particular facts must be inductive in character. I argue here that this view is mistaken, and that the aim of probabilistic explanation is not to demonstrate that the explanandum fact was nomically expectable, but to give an account of the chance mechanism(s) responsible for it. To this end, a deductive-nomological model of probabilistic explanation is developed and defended. Such a model has application only when the probabilities occurring in covering laws can be interpreted as measures of objective chance, expressing the strength of physical propensities. Unlike inductive models of probabilistic explanation, this deductive model stands in no need of troublesome requirements of maximal specificity or epistemic relativization.

196 citations


Journal Article•DOI•
TL;DR: The notion of supervenience is employed to provide a new statement of the relation of Mendelian predicates to molecular ones in order to provide for the commensurability and potential reducibility of Mendelian to molecular genetics in a way that circumvents the theoretical complications which appear to stand in the way of such a reduction.
Abstract: In this paper the concept of supervenience is employed to explain the relationship between fitness as employed in the theory of natural selection and population biology and the physical, behavioral and ecological properties of organisms that are the subjects of lower level theories in the life sciences. The aim of this analysis is to account simultaneously for the fact that the theory of natural selection is a synthetic body of empirical claims, and for the fact that it continues to be misconstrued, even by biologists, for a tautological system. The notion of supervenience is then employed to provide a new statement of the relation of Mendelian predicates to molecular ones in order to provide for the commensurability and potential reducibility of Mendelian to molecular genetics in a way that circumvents the theoretical complications which appear to stand in the way of such a reduction.
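
For readers unfamiliar with the notion, the formal core of supervenience that this kind of analysis relies on can be sketched as follows (a generic statement of the standard definition, not the author's own formulation): fitness properties F supervene on the lower-level properties G just in case

\[
\forall x\,\forall y\,\bigl[\,\forall G_i\,(G_i x \leftrightarrow G_i y)\ \rightarrow\ \forall F_j\,(F_j x \leftrightarrow F_j y)\,\bigr],
\]

i.e., organisms that agree in all their physical, behavioral, and ecological properties cannot differ in fitness, even though fitness is not definable in terms of any one such property.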

123 citations


Journal Article•DOI•
TL;DR: This paper reformulates Jeffrey conditionalization so that the problem of stating input laws disappears, and argues that the reformulated version is more natural and easier to work with on independent grounds.
Abstract: Bayesian decision theory can be viewed as the core of psychological theory for idealized agents. To get a complete psychological theory for such agents, you have to supplement it with input and output laws. On a Bayesian theory that employs strict conditionalization, the input laws are easy to give. On a Bayesian theory that employs Jeffrey conditionalization, there appears to be a considerable problem with giving the input laws. However, Jeffrey conditionalization can be reformulated so that the problem disappears, and in fact the reformulated version is more natural and easier to work with on independent grounds.
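
For reference, Jeffrey conditionalization in its usual form (the paper's reformulation is not reproduced here) updates belief in a proposition A when experience shifts the probabilities over a partition \(\{E_1,\dots,E_n\}\) without driving any cell to certainty:

\[
P_{\text{new}}(A)\;=\;\sum_{i=1}^{n} P_{\text{old}}(A \mid E_i)\,P_{\text{new}}(E_i).
\]

Strict conditionalization is the limiting case in which some \(E_k\) receives probability 1, so that \(P_{\text{new}}(A)=P_{\text{old}}(A \mid E_k)\).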

100 citations


Journal Article•DOI•
TL;DR: The distinction between discovery and justification is ambiguous, and this ambiguity obscures the debate over a logic of discovery; the proper distinctions are three: initial thinking, plausibility, and acceptability.
Abstract: The distinction between discovery and justification is ambiguous. This obscures the debate over a logic of discovery. For the debate presupposes the distinction. Real discoveries are well established. What is well established is justified. The proper distinctions are three: initial thinking, plausibility, and acceptability. Logic is not essential to initial thinking. We do not need good supporting reasons to initially think of an hypothesis. Initial thoughts need be neither plausible nor acceptable. Logic is essential, as Hanson noted, to both plausibility and acceptability. An hypothesis needs good supporting reasons to be either plausible or acceptable. Such reasons need not be relative to the particular scientific theory undergoing test at the time. There is no fundamental difference between reasons relevant to plausibility and acceptability. The difference is one of degree. Acceptability requires more than plausibility.

91 citations


Journal Article•DOI•
TL;DR: In this paper, a suitably generalized concept of meaningfulness is studied in spaces constructed from a number of conjoint and extensive structures, some of which are suitably interrelated by distribution laws, and is shown to correspond exactly with the numerical concept of dimensionally invariant laws of physics.
Abstract: In formal theories of measurement meaningfulness is usually formulated in terms of numerical statements that are invariant under admissible transformations of the numerical representation. This is equivalent to qualitative relations that are invariant under automorphisms of the measurement structure. This concept of meaningfulness, appropriately generalized, is studied in spaces constructed from a number of conjoint and extensive structures some of which are suitably interrelated by distribution laws. Such spaces model the dimensional structures of classical physics. It is shown that this qualitative concept corresponds exactly with the numerical concept of dimensionally invariant laws of physics.

65 citations


Journal Article•DOI•
TL;DR: Taking reduction in the traditional deductive sense, the programmatic claim that most of genetics can be reduced by molecular genetics is defended as feasible and significant.
Abstract: Taking reduction in the traditional deductive sense, the programmatic claim that most of genetics can be reduced by molecular genetics is defended as feasible and significant. Arguments by Ruse and Hull that either the relationship is replacement or at best a weaker form of reduction are shown to rest on a mixture of historical and logical confusions about the nature of the theories involved.

51 citations


Journal Article•DOI•
TL;DR: In this paper, an attempt to get around certain well-known criticisms of the trace theory of memory is discussed and an account of the so-called "logical" notion of a memory trace is given.
Abstract: This paper consists of two parts. In Part I, an attempt to get around certain well-known criticisms of the trace theory of memory is discussed. Part II consists of an account of the so-called "logical" notion of a memory trace. Trace theories are sometimes thought to be empirical hypotheses about the functioning of memory. That this is not the case, that trace theories are in fact philosophical theories, is shown, I believe, in the arguments which follow. If this is so, one may well wonder about psychologists' insistence that any empirical theory of memory must involve the postulation of traces (or trace-like entities: engrams, schemata, etc.).

50 citations


Journal Article•DOI•
TL;DR: This paper presents a study of the group selection controversy, with special emphasis on the period from 1962 to the present and the rise of inclusive fitness theory, focusing on the relations between individual fitness theory and other fitness theories and on the methodological imperatives used in the controversy over the status of these theories.
Abstract: This article is primarily a study of the group selection controversy, with special emphasis on the period from 1962 to the present, and the rise of inclusive fitness theory. Interest is focused on the relations between individual fitness theory and other fitness theories and on the methodological imperatives used in the controversy over the status of these theories. An appendix formalizes the notion of "assertive part" which is used in the informal discussion of the methodological imperatives elicited from the controversy.

50 citations


Journal Article•DOI•
TL;DR: In this paper, the author argues that Suppes' probabilistic causal calculus is free of the two problems Hesslow raises and, moreover, that several broader issues raised by Hesslow's discussion tend to support a probabilistic conception of causes.
Abstract: Germund Hesslow has argued recently [2] that a probabilistic theory of causality as advocated by Patrick Suppes [4] has two problems that a deterministic theory avoids. In this paper, I argue that Suppes' probabilistic causal calculus is free of each of these problems and, moreover, that several broader issues raised by Hesslow's discussion tend to support a probabilistic conception of causes. Hesslow begins his paper with definitions of the rival theories. A deterministic theory is taken to be a theory which requires that any event having a cause have a sufficient or complete cause. In characterizing a probabilistic theory, Hesslow turns to Suppes' 1970 monograph where, among other things, a cause is viewed as raising the probability of its effect. Hesslow uses Suppes' definition of a primafacie cause to represent Suppes' probabilistic theory. Specifically:
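
The abstract is cut off at this point. The definition it appears to be introducing, Suppes' (1970) notion of a prima facie cause, is standardly stated along the following lines (notation assumed here, not quoted from the paper): an event \(B_{t'}\) is a prima facie cause of an event \(A_t\) if and only if

\[
t' < t,\qquad P(B_{t'}) > 0,\qquad P(A_t \mid B_{t'}) > P(A_t),
\]

i.e., the putative cause occurs earlier, has positive probability, and raises the probability of its effect.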

Journal Article•DOI•
TL;DR: In this article, the authors argue that the attractiveness of methodological individualism as a regulative principle depends on two independent confusions, the conflation of an agent's reasons for action with the beliefs, needs, desires, or goals which are the reasons why he acted as he did, and the identification of explaining a phenomenon and describing its causes.
Abstract: Past criticisms to the contrary, methodological individualism in the social sciences is neither trivial nor obviously false. In the style of Weber's sociology, it restricts the ultimate explanatory repertoire of social science to agents' reasons for action. Although this restriction is not obviously false, it ought not to be accepted, at present, as a regulative principle. It excludes, as too far-fetched to merit investigation, certain hypotheses concerning the influence of objective interests on large-scale social phenomena. And these hypotheses, in fact, merit empirical consideration. The attractiveness of methodological individualism as a regulative principle depends on two independent confusions, the conflation of an agent's reasons for action with the beliefs, needs, desires, or goals which are the reasons why he acted as he did, and the identification of explaining a phenomenon and describing its causes.

Journal Article•DOI•
TL;DR: Gibson's theory of perception answers the questions Hamlyn raises, albeit in a new and radical way, which suggests that the alleged distinction between psychology and epistemology is suspect and that the theory is a valuable source of epistemological as well as psychological ideas.
Abstract: Hintikka has criticized psychologists for "hasty epistemologizing," which he takes to be an unwarranted transfer of ideas from psychology (a discipline dealing with questions of fact) into epistemology (a discipline dealing with questions of method and theory). Hamlyn argues, following Hintikka, that Gibson's theory of perception is an example of such an inappropriate transfer, especially insofar as Hamlyn feels Gibson does not answer several important questions. However, Gibson's theory does answer the relevant questions, albeit in a new and radical way, which suggests that the alleged distinction between psychology and epistemology is suspect. In fact, contrary to Hintikka and Hamlyn's claims, Gibson's theory of perception appears to be a valuable source of epistemological as well as psychological ideas.

Journal Article•DOI•
TL;DR: In this paper, various senses in which laws of nature are supposed to be "universal" are distinguished, conditions designed to capture the content of the more important of these senses are proposed, the relations among these conditions are examined, and the status of universality requirements is briefly discussed.
Abstract: Various senses in which laws of nature are supposed to be "universal" are distinguished. Conditions designed to capture the content of the more important of these senses are proposed and the relations among these conditions are examined. The status of universality requirements is briefly discussed.

Journal Article•DOI•
TL;DR: The Conventionality of Simultaneity espoused by Reichenbach, Grunbaum, Edwards, and Winnie is extended to mechanics and electrodynamics in this article, which is seen to be a special case of a generally covariant formulation of physics, and therefore consistent with Special Relativity as the geometry of flat space-time.
Abstract: The Conventionality of Simultaneity espoused by Reichenbach, Grunbaum, Edwards, and Winnie is herein extended to mechanics and electrodynamics. The extension is seen to be a special case of a generally covariant formulation of physics, and therefore consistent with Special Relativity as the geometry of flat space-time. Many of the quantities of classical physics, such as mass, charge density, and force, are found to be synchronization dependent in this formulation and, therefore, in Reichenbach's terminology, "metrogenic." The relationship of these quantities to 4-vectors and their physical significance is discussed.
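
The conventionality thesis being extended here is standardly expressed through Reichenbach's \(\varepsilon\)-definition of distant simultaneity (recalled for orientation, not quoted from the paper): if a light signal leaves A at \(t_1\), is reflected at B, and returns to A at \(t_3\), the clock at B is set at the reflection event to

\[
t_2 \;=\; t_1 + \varepsilon\,(t_3 - t_1), \qquad 0 < \varepsilon < 1,
\]

with standard (Einstein) synchrony corresponding to the conventional choice \(\varepsilon = \tfrac{1}{2}\). The synchronization dependence of mass, charge density, and force noted above amounts to dependence on this choice of \(\varepsilon\).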

Journal Article•DOI•
TL;DR: This paper examines Alberto Cortes' claims that Leibniz's Principle of the Identity of Indiscernibles is a principle restricted to individuals (as distinct from a class of entities Cortes takes to be non-individuals) and that photons appear to violate the principle, and states the principle in second-order quantification theory with identity.
Abstract: Alberto Cortes, in [1], attempts to show (a) that Leibniz's Principle of The Identity of Indiscernibles (L) is a principle restricted to individuals (as distinct from a class of entities Cortes takes to be non-individuals), and (b) that photons (light quanta) appear to violate L (since they don't obey Pauli's Exclusion Principle). L is stated by Leibniz as "no two substances are completely similar, or differ solo numero." In second-order quantification theory with identity L becomes:
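
The formula itself is missing from this excerpt; the standard second-order rendering of the Identity of Indiscernibles, which the final sentence introduces, is

\[
\forall x\,\forall y\,\bigl[\,\forall F\,(Fx \leftrightarrow Fy)\ \rightarrow\ x = y\,\bigr],
\]

read: if x and y share all their properties, they are identical (the exact formulation used in the paper may differ, for example in how the range of the property variable F is restricted).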

Journal Article•DOI•
TL;DR: In this paper, the conceptual machinery of contemporary possible-world semantics is invoked to provide an account of the metaphysical status of "bridge laws" in intertheoretic reductions, and it is argued that although bridge laws are not definitions, and although they do not necessarily reflect attribute-identities, they are supervenient.
Abstract: I invoke the conceptual machinery of contemporary possible-world semantics to provide an account of the metaphysical status of "bridge laws" in intertheoretic reductions. I argue that although bridge laws are not definitions, and although they do not necessarily reflect attribute-identities, they are supervenient. I.e., they are true in all possible worlds in which the reducing theory is true.
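
The supervenience claim in the last sentence can be put in the possible-worlds notation the paper invokes (a restatement of the abstract's own gloss, not a quotation): where \(T_R\) is the reducing theory and \(B\) a bridge law,

\[
\forall w\,\bigl(w \models T_R \;\rightarrow\; w \models B\bigr),
\]

i.e., \(B\) holds at every possible world at which \(T_R\) holds, even though \(B\) is neither a definition nor, necessarily, an attribute-identity.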

Journal Article•DOI•
TL;DR: It is proved that Greeno's measure of transmitted information is a limiting special case of the E-I model, but that the former, unlike the latter, makes no distinction between explanatory power and descriptive power.
Abstract: This paper contrasts two information-theoretic approaches to statistical explanation: namely, (1) an analysis, which originated in my earlier research on problems of testing stochastic models of learning, based on an entropy-like measure of expected transmitted-information (and here referred to as the Expected-Information Model), and (2) the analysis, which was proposed by James Greeno (and which is closely related to Wesley Salmon's Statistical Relevance Model), based on the information-transmitted-by-a-system. The substantial differences between these analyses can be traced to the following basic difference. On Greeno's view, the essence of explanation lies in the relevance relations expressed by the conditional probabilities that relate the explanans variables to the explanandum variables; on my view, in contrast, the essence of explanation lies in theories viewed as hypothetical structures which deductively entail conditional probability distributions linking the explanans variables and the explanandum variables.
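
For orientation, the generic notion of transmitted information that both measures build on is, in standard information-theoretic notation (not the paper's own symbols), the mutual information between an explanans variable X and an explanandum variable Y:

\[
I(X;Y)\;=\;H(Y)-H(Y\mid X)\;=\;\sum_{x,y} P(x,y)\,\log\frac{P(x,y)}{P(x)\,P(y)},
\]

where H denotes Shannon entropy. The contrast drawn above is over whether explanatory power resides in the relevance relations (the conditional probabilities) themselves or in the theory that deductively entails them.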


Journal Article•
Stephen Jay Gould•
TL;DR: In this book, the author traces the history of the idea of recapitulation from Anaximander through Haeckel to its decline, and develops an account of heterochrony (acceleration and retardation of development), including a clock model for representing the dissociation of developmental processes, together with its ecological and evolutionary significance and its bearing on retardation and neoteny in human evolution.
Abstract: * *1. Prospectus * Part I: Recapitulation *2. The Analogistic Tradition from Anaximander to Bonnet * The Seeds of Recapitulation in Greek Science? * Ontogeny and Phylogeny in the Conflict of "Evolution" and Epigenesis: The Idyll of Charles Bonnet * Appendix: The Revolution in "Evolution" *3. Transcendental Origins, 1793--1860 * Naturphilosophie: An Expression of Developmentalism * Two Leading Recapitulationists among the Naturphilosophen: Oken and Meckel * Oken's Classification of Animals Linear Additions of Organs * J. F. Meckel's Sober Statement of the Same Principles * Serres and the French Transcendentalists * Recapitulation and the Theory of Developmental Arrests * Von Baer's Critique of Recapitulation * The Direction of Development and Classification of Animals * Von Baer and Naturphilosophie: What Is the Universal Direction of Development? * Louis Agassiz and the Threefold Parallelism *4. Evolutionary Triumph, 1859--1900 * Evolutionary Theory and Zoological Practice * Darwin and the Evolution of von Baer's Laws * Evolution and the Mechanics of Recapitulation * Ernst Haeckel: Phylogeny as the Mechanical Cause of Ontogeny * The Mechanism of Recapitulation * The American Neo-Lamarckians: The Law of Acceleration as Evolution's Motor * Progressive Evolution by Acceleration * The Extent of Parallelism * Why Does Recapitulation Dominate the History of Life? * Alpheus Hyatt and Universal Acceleration * Lamarckism and the Memory Analogy * Recapitulation and Darwinism * Appendix: The Evolutionary Translation of von Baer's Laws *5. Pervasive Influence * Criminal Anthropology * Racism * Child Development * Primary Education * Freudian Psychoanalysis * Epilogue *6. Decline, Fall, and Generalization * A Clever Argument * An Empirical Critique * Organs or Ancestors: The Transformation of Haeckel's Heterochrony * Interpolations into Juvenile Stages * Introduction of Juvenile Features into the Adults of Descendants * What Had Become of von Baer's Critique? * Benign Neglect: Recapitulation and the Rise of Experimental Embryology * The Prior Assumptions of Recapitulation * Wilhelm His and His Physiological Embryology: A Preliminary Skirmish * Roux's Entwicklungsmechanik and the Biogenetic Law * Recapitulation and Substantive Issues in Experimental Embryology: The New Preformationism * Mendel's Resurrection, Haeckel's Fall, and the Generalization of Recapitulation * Part II: Heterochrony and Paedomorphosis *7. Heterochrony and the Parallel of Ontogeny and Phylogeny * Acceleration and Retardation * Confusion in and after Haeckel's Wake * Guidelines for a Resolution * The Reduction of de Beer's Categories of Heterochrony to Acceleration and Retardation * A Historical Paradox: The Supposed Dominance of Recapitulation * Dissociability and Heterochrony * Correlation and Dissociability * Dissociation of the Three Processes * A Metric for Dissociation * Temporal Shift as a Mechanism of Dissociation * A Clock Model of Heterochrony * Appendix: A Note on the Multivariate Representation of Dissociation *8. The Ecological and Evolutionary Significance of Heterochrony * The Argument from Frequency * The Importance of Recapitulation * The Importance of Heterochronic Change: Selected Cases * Frequency of Paedomorphosis in the Origin of Higher Taxa * A Critique of the Classical Significance of Heterochrony * The Classical Arguments * Retrospective and Immediate Significance * Heterochrony, Ecology, and Life-History Strategies * The Potential Ease and Rapidity of Heterochronic Change * The Control of Metamorphosis in Insects * Amphibian Paedomorphosis and the Thyroid Gland *9. Progenesis and Neoteny * Insect Progenesis * Prothetely and Metathetely * Paedogenesis (Parthenogenetic Progenesis) in Gall Midges and Beetles * Progenesis in Wingless, Parthenogenetic Aphids * Additional Cases of Progenesis with a Similar Ecological Basis * Neotenic Solitary Locusts: Are They an Exception to the Rule? * Amphibian Neoteny * The Ecological Determinants of Progenesis * Unstable Environments * Colonization * Parasites * Male Dispersal * Progenesis as an Adaptive Response to Pressures for Small Size * The Role of Heterochrony in Macroevolution: Contrasting Flexibilities for Progenesis and Neoteny * Progenesis * Neoteny * The Social Correlates of Neoteny in Higher Vertebrates *10. Retardation and Neoteny in Human Evolution * The Seeds of Neoteny * The Fetalization Theory of Louis Bolk * Bolk's Data * Bolk's Interpretation * Bolk's Evolutionary Theory * A Tradition of Argument * Retardation in Human Evolution * Morphology in the Matrix of Retardation * Of Enumeration * Of Prototypes * Of Correlation * The Adaptive Significance of Retarded Development *11. Epilogue * Notes * Bibliography * Glossary * Index

Journal Article•DOI•
TL;DR: The standard construal, developed by Braithwaite, Goodman, Hempel, Nagel, and others, states that laws are true or approximately true lawlike statements.
Abstract: According to the standard construal, developed by Braithwaite, Goodman, Hempel, Nagel, and others, laws are true or approximately true lawlike statements. Lawlike statements (A) are essentially universal in their syntactic form, (B) express objective regularities, (C) can be confirmed by their instances, (D) have explanatory power, (E) have predictive power, (F) support counterfactual conditionals, and (G) have systematic connections with broader theories. Conversely, a generalization satisfying (A) can express a regularity, can be confirmed by its instances, has explanatory power, etc., only if it is lawlike. Therefore, among generalizations satisfying condition (A), lawlikeness is an attribute which is coextensional with each of the attributes (B)-(G). (Alternatively, one might claim that lawlikeness is equal to some suitable combination of (B)-(G).)

Journal Article•DOI•
TL;DR: In this article, Quinn's characterization of the logical relations between two of the central Duhemian theses is shown to be erroneous, and hence cannot be regarded as an accurate account of what Duhem really meant.
Abstract: In recent years there has been a rebirth of interest in the philosophy of Pierre Duhem. Although I applaud the spirit of this movement, one finds the critics of Duhem frequently lacking in a basic understanding of Duhem's tenets, sometimes to the extent that one doubts a familiarity with the Duhemian text. One of the few papers which is designed to remedy this state of affairs is that of Philip Quinn entitled "What Duhem Really Meant." Quinn is to be applauded for his meticulous and rigorous exegetical work on the Duhemian text. Unfortunately, Quinn's characterization of the logical relations of two of the central Duhemian theses is erroneous. I shall endeavor

Journal Article•DOI•
TL;DR: The correspondence theory of truth is one of the most venerable philosophical theses, yet it was known almost from the beginning to entail the antinomy of the liar, an antinomy which affects any use of the word "true," not only the use which the correspondence theorist intends.
Abstract: The correspondence theory of truth is one of the most venerable philosophical theses. Yet it was known almost from the beginning to entail the antinomy of the liar. This antinomy affects any use of the word "true," not only the use which the correspondence theorist intends. A definition of truth for formalized languages which avoids the antinomy was not formulated before 1931. Tarski's proposal of that year uses an ingenious device. His definition concerns the truth or falsity of the sentences of one language, the object language, but it is given in a different language, the metalanguage. Adapting this definition to natural languages poses some problems, but let us assume they do not exist. Then we may freely apply the word "true" even to sentences of natural languages. We need have no fear of getting involved in antinomies. The use of the word "true" has been rehabilitated. But what about the correspondence theory? Has it too been rehabilitated? Many philosophers claim it has, in particular Karl Popper and his followers. But whether this is so depends of course on what exactly the correspondence theory asserts. Tarski's definition, or rather a definition of truth for natural languages along Tarski's lines, entails that the sentence "Schnee ist weiss" is true if, and only if, snow is white, and so does the correspondence theory. Here German is our object language and English is the metalanguage. But for reasons which Tarski indicated himself, the coherence theory and the pragmatist theory of truth entail the same consequence. Hence the correspondence theory has to assert something more if there is to be any difference between the theories of truth. According to Popper it asserts "that truth is correspondence with the facts (or with reality); or, more precisely, that a theory is true if and only if it corresponds to the facts" ([6], p. 44). The last
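
The instance cited ("Schnee ist weiss" is true if, and only if, snow is white) fits Tarski's schema T, which any materially adequate definition of truth must entail for every sentence of the object language (stated here in its usual schematic form, not quoted from the paper):

\[
X \text{ is true if, and only if, } p,
\]

where "X" is to be replaced by a metalanguage name of an object-language sentence and "p" by that sentence's translation into the metalanguage. As the passage notes, satisfying this schema does not by itself distinguish the correspondence theory from its rivals.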

Journal Article•DOI•
TL;DR: Theoretical simplicity is difficult to characterize and evidently can depend upon a number of distinct factors; one desirable characteristic is that the laws of a theory have relatively few "counterinstances" whose accommodation requires the invocation of a ceteris paribus condition and ancillary explanation.
Abstract: Theoretical simplicity is difficult to characterize, and evidently can depend upon a number of distinct factors. One such desirable characteristic is that the laws of a theory have relatively few "counterinstances" whose accommodation requires the invocation of a ceteris paribus condition and ancillary explanation. It is argued that, when one theory is reduced to another, such that the laws of the second govern the behavior of the parts of the entities in the domain of the first, there is a characteristic gain in simplicity of the sort mentioned: while I see no way of quantitatively measuring the "amount" of defeasibility of the laws of a theory, microreduction can be shown to decrease that "amount".


Journal Article•DOI•
TL;DR: The theory of conclusions presented here satisfies Tukey's desiderata, specifically: conclusions are statements which are accepted on the basis of unusually strong evidence, and conclusions are subject to future rejection, when and if the evidence against them becomes strong enough.
Abstract: This paper presents a theory of conclusions based upon the suggestions of Tukey [21]. The logic offered here is based upon two rules of detachment that occur naturally in probabilistic inference, a traditional rule of acceptance, and a rule of rejection. The rules of detachment provide flexibility: the theory of conclusions can account for both statistical and deductive arguments. The rule of acceptance governs the acceptance of new conclusions, is a variant of the rule of high probability, and is a limiting case of a decision-theoretic rule of acceptance. The rule of rejection governs the removal of previously accepted conclusions on the basis of new evidence. The resulting theory of conclusions is not a decision-theoretic logic but does, through the aforementioned limiting property, provide a line of demarcation between decision and conclusion (i.e., nondecision) logics of acceptance. The theory of conclusions therefore complements decision-theoretic inference. The theory of conclusions presented here s...

Journal Article•DOI•
TL;DR: The main thrust of this paper is to show that the standing questions about idealizations in science, including the ones that Barr attempts to answer, need to be exchanged for different questions, questions relating to the notion of approximation.
Abstract: While the use of so-called idealizations in science has been widely recognized for many years, the philosophical problems that arise from this use have received relatively little attention. Even a cursory reading of the philosophical literature devoted to these problems ([1]; [2]; [4], pp. 160-171; [5]; [7], pp. 54-63) reveals that the following questions remain unanswered: In general, what, if any, are the distinguishing characteristics of idealizations? More specifically, do idealizations have any distinguishing syntactic or semantic characteristics? In addition to these questions there exist the following pragmatic questions, questions relating to the ways in which idealizations are used in science: How are idealizations used in explanations? (see [2]). Do these explanations have any peculiar characteristics, characteristics not shared by deductive-nomological (D-N) explanations? If we assume that (at least some types of) idealizations are false (or contain false components) or "do not obtain," how is it that they can have any explanatory power? (see [7], p. 58). Further, there are questions of more general philosophic concern. How do the problems of idealizations relate to those of simplicity, for an idealization seems to be, in some sense, a type of simplification? How do the problems of ideal laws and theories relate to the general problems of scientific laws and theories? In this paper I examine the recent attempts of William Barr ([1], [2]) to answer some of these questions. I discuss Barr's syntactic and semantic, and pragmatic analyses, pointing to the ways in which they successfully begin to answer some of our questions and to the ways in which they fail in this endeavor. The main thrust of this paper consists in showing that the above mentioned questions, including the ones that Barr attempts to answer, need to be exchanged for different questions, questions relating to the notion of approximation.

Journal Article•DOI•
TL;DR: Extensive measurement is called weak if the axioms allow two objects to have the same scale value without being indifferent with respect to the order; necessary and/or sufficient conditions for such representations are given.
Abstract: Extensive measurement is called weak if the axioms allow two objects to have the same scale value without being indifferent with respect to the order. Necessary and/or sufficient conditions for such representations are given. The Archimedean and the non-Archimedean case are dealt with separately.
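
In the usual representational setting (notation assumed here, not taken from the paper), an extensive structure \((A, \succsim, \circ)\) is mapped into the reals by \(\phi\) so that concatenation goes to addition; the weak case described above requires only one direction of the order condition:

\[
a \succsim b \;\Rightarrow\; \phi(a) \ge \phi(b), \qquad \phi(a \circ b) = \phi(a) + \phi(b),
\]

whereas ordinary (strong) extensive measurement demands the biconditional \(a \succsim b \Leftrightarrow \phi(a) \ge \phi(b)\), so that equal scale values would force indifference. Dropping the converse direction is exactly what allows two non-indifferent objects to receive the same value.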

Journal Article•DOI•
TL;DR: The Relevance Criterion of confirmation gained prominence as the underlying principle of the class-size approach (CSA) to Hempel's paradoxes of confirmation; this paper attempts to rescue the criterion by incorporating it into a new resolution of the paradoxes.
Abstract: The Relevance Criterion of confirmation gained prominence as the underlying principle of the class-size approach (CSA) to Hempel's paradoxes of confirmation. The CSA, however, yields counter-intuitive results for (c) instances, and this failing cast serious doubt on the acceptability of the Relevance Criterion. In this paper an attempt is made to rescue the Relevance Criterion from this embarrassment. This is done by incorporating that criterion into a new resolution of the paradoxes, a resolution based on a theory of selective confirmation and a distinction between mere confirmation in principle and evaluative confirmation (E-confirmation).

Journal Article•DOI•
TL;DR: It is shown that when the empirical growth is faster than the theoretical growth the posterior probability of the theoretical component increases, and empirical progressiveness of a research program, as explicated in this model, is accompanied by an increase in the degree of confirmation.
Abstract: In this paper a model is presented for the growth of knowledge in a dynamic scientific system, a system which is in some respects an idealization of a Lakatosian research program. The kinematics of the system is described in terms of two probabilistic variables, one of which is related to the evolution of its theoretical component and the other to the growth of the empirical component. It is shown that when the empirical growth is faster than the theoretical growth, the posterior probability of the theoretical component increases. Thus, empirical progressiveness of a research program, as explicated in this model, is accompanied by an increase in the degree of confirmation. In such a case the system grows in a Popperian-like spirit, while learning from experience in a Bayesian manner.
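
The qualitative claim can be illustrated with ordinary Bayesian updating (a generic sketch, not the paper's actual two-variable model): the posterior probability of the theoretical component T rises on evidence E precisely when E is more likely given T than it is on average,

\[
P(T \mid E) \;=\; \frac{P(E \mid T)}{P(E)}\,P(T) \;>\; P(T) \quad\text{iff}\quad P(E \mid T) > P(E),
\]

which gives the generic sense in which accumulating favorable evidence raises the degree of confirmation of the theoretical component.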