
Showing papers in "Studies in History and Philosophy of Science" in 2022


Journal Article
TL;DR: The author proposes a framework for identifying the contexts in which pursuitworthiness criteria may promote the efficiency of scientific inquiry, and spells out some implications this framework has for values and pursuit.
Abstract: Recent philosophical literature has turned its attention towards assessments of how to judge scientific proposals as worthy of further inquiry. Previous work, as well as papers contained within this special issue, propose criteria for pursuitworthiness (Achinstein, 1993; Whitt, 1992; DiMarco & Khalifa, 2019; Laudan, 1977; Shan, 2020; Šešelja et al., 2012). The purpose of this paper is to assess the grounds on which pursuitworthiness demands can be legitimately made. To do this, I propose a challenge to the possibility of even minimal criteria of pursuitworthiness, inspired by Paul Feyerabend. I go on to provide a framework for identifying the contexts in which pursuitworthiness criteria may promote the efficiency of scientific inquiry. I then spell out some implications this framework has for values and pursuit.

14 citations


Journal Article
TL;DR: The authors distinguish two ways in which Kant's ideas concerning the relation between teleology and biological organization have been taken up in contemporary philosophy of biology and theoretical biology, naming the two approaches heuristic and naturalistic, respectively.
Abstract: This paper distinguishes two ways in which Kant's ideas concerning the relation between teleology and biological organization have been taken up in contemporary philosophy of biology and theoretical biology. The first sees his account as the first instance of the modern understanding of teleology as a heuristic tool aimed at producing mechanistic explanations of organismal form and function. The second sees in Kant's concept of intrinsic purposiveness the seed of a radically new way of thinking about biological systems that should be developed by turning teleology into a legitimate concept of natural science. We name the two approaches heuristic and naturalistic, respectively. Our aim is to critically evaluate these approaches and suggest that the naturalistic option, which remains a minority position, deserves to be taken more seriously than it currently is in contemporary biological theory. While evolution by natural selection closes the case on intelligent design, it does not close the case on teleology in general. In fact, the current return of the organism and the recent calls for an agential perspective in evolutionary biology point out that we still have some thinking to do concerning this side of Kant's legacy.

10 citations


Journal Article
TL;DR: The author provides a detailed framework for analysing the subsystem structure of physical theories and applying it to the interpretation of their symmetries, centred on subsystem recursivity, whereby interpretative conclusions about a sector of a theory can be deduced from considering subsystems of other models of the same theory.
Abstract: Physical theories, for the most part, should be understood as modelling isolated subsystems of a larger universe; doing so, among other benefits, greatly clarifies the interpretation of the dynamical symmetries of those theories. I provide a detailed framework for analysing the subsystem structure of physical theories and applying it to the interpretation of their symmetries: the core concept is subsystem recursivity, whereby interpretative conclusions about a sector of a theory can be deduced from considering subsystems of other models of the same theory. I illustrate the framework by extensive examples from nonrelativistic particle mechanics, and in particular from Newtonian theories of gravity. A sequel to the paper will apply the framework to the local and global symmetries of classical field theory.

10 citations


Journal Article
TL;DR: The authors dub the task of delineating legitimate from illegitimate influences of values in science "The New Demarcation Problem," propose a set of desiderata for a satisfactory solution, and present a case study where conflicting sets of values clearly impinge on science but the legitimacy of their influence is ambiguous.
Abstract: There is now a general consensus amongst philosophers in the values in science literature that values necessarily play a role in core areas of scientific inquiry. We argue that attention should now be turned from debating the value-free ideal to delineating legitimate from illegitimate influences of values in science, a project we dub "The New Demarcation Problem." First, we review past attempts to demarcate the uses of values and propose a categorization of the strategies by where they seek to draw legitimacy from. Next, we propose a set of desiderata for what we take to be a satisfactory solution and present a case study where conflicting sets of values clearly impinge on science, but where the legitimacy of their influence is ambiguous. We use these desiderata and the case study to illustrate what we take to be the strengths and weaknesses of current strategies. To be clear, our goal is not to answer the question we pose, but to articulate a framework within which a solution can be judged.

10 citations


Journal Article
TL;DR: The author applies the general framework for the symmetries of isolated systems developed in the prequel to this paper to field theory; the resulting analysis provides a general basis for interpreting the physical significance of symmetries according to their topological and asymptotic features.
Abstract: Physical theories, for the most part, should be understood as modelling isolated subsystems of a larger universe; doing so, among other benefits, greatly clarifies the interpretation of the dynamical symmetries of those theories. Building on a general framework for the symmetries of isolated systems developed in the prequel to this paper, I apply that framework to field theory. The resultant analysis provides a general basis for interpreting the physical significance of symmetries according to their topological and asymptotic features: global symmetries in general, and local symmetries insofar as they preserve boundary conditions and are asymptotically nonvanishing and/or topologically nontrivial, can be understood as physical transformations of an isolated system against the assumed background of other systems. Other symmetries in general must be understood as mere redescription, though in certain circumstances even non-boundary-preserving local symmetries can be afforded physical significance. The analysis largely reproduces - and so can be seen as a theoretical justification for - general practice in contemporary physics.

9 citations


Journal Article
TL;DR: The authors explore causal explanations in a traditional fishing community in Brazil that provide resources for transdisciplinary collaboration, without neglecting differences between Indigenous and academic experts, and find that community members often rely on causal explanations for local ecological phenomena with different degrees of complexity.
Abstract: Transdisciplinary research challenges the divide between Indigenous and academic knowledge by bringing together epistemic resources of heterogeneous stakeholders. The aim of this article is to explore causal explanations in a traditional fishing community in Brazil that provide resources for transdisciplinary collaboration, without neglecting differences between Indigenous and academic experts. Semi-structured interviews were carried out in a fishing village in the North shore of Bahia and our findings show that community members often rely on causal explanations for local ecological phenomena with different degrees of complexity. While these results demonstrate the ecological expertise of local community members, we also argue that recognition of local expertise needs to reflect on differences between epistemic communities by developing a culturally sensitive model of transdisciplinary knowledge negotiation.

9 citations


Journal Article
TL;DR: A historico-critical analysis of the stages, between 1963 and 1978, through which the locality condition of the original Bell theorem almost imperceptibly turned into a "local realism" condition, a narrative according to which a major target of the theorem would be the very possibility of conceiving quantum theory as a theory concerning "real" stuff in the world out there.
Abstract: The history of the debates on the foundational implications of the Bell non-locality theorem displayed very soon a tendency to put the theorem in a perspective that was not entirely motivated by its very assumptions, in particular in terms of a 'local-realistic' narrative, according to which a major target of the theorem would be the very possibility to conceive quantum theory as a theory concerning 'real' stuff in the world out-there. I present here a historico-critical analysis of the stages, between 1963 and 1978, through which the locality condition of the original Bell theorem almost undiscernibly turned into a 'local realism' condition, a circumstance which too often has affected the analysis of how serious the consequences of the Bell theorem turn out to be. In particular, the analysis puts into focus the interpretive oscillations and inconsistencies surrounding 'local realism', that emerge in the very descriptions that many leading figures provided themselves of the deep work they devoted to the theorem and its consequences.

7 citations


Journal Article
TL;DR: The authors propose a new conceptual framework of the missing heritability problem, which comprises three independent methodological and explanatory challenges: the numerical gap, the prediction gap, and the mechanism gap.
Abstract: The so-called ‘missing heritability problem’ is often characterized by behavior geneticists as a numerical discrepancy between alternative kinds of heritability. For example, while ‘traditional heritability’ derived from twin and family studies indicates that approximately ∼50% of variation in intelligence is attributable to genetics, ‘SNP heritability’ derived from genome-wide association studies indicates that only ∼10% of variation in intelligence is attributable to genetics. This 40% gap in variance accounted for by alternative kinds of heritability is frequently referred to as what's “missing.” Philosophers have picked up on this reading, suggesting that “dissolving” the missing heritability problem is merely a matter of closing the numerical gap between traditional and molecular kinds of heritability. We argue that this framing of the problem undervalues the severity of the many challenges to scientific understanding of the “heritability” of human behavior. On our view, resolving the numerical discrepancies between alternative kinds of heritability will do little to advance scientific explanation and understanding of behavior genetics. Thus, we propose a new conceptual framework of the missing heritability problem that comprises three independent methodological and explanatory challenges: the numerical gap, the prediction gap, and the mechanism gap.

6 citations


Journal Article
TL;DR: The authors analyse the relation between the use of environmental data in contemporary health sciences and related conceptualisations and operationalisations of the notion of environment, arguing that the diversification of data sources, their increase in scale and scope, and the application of novel analytic tools have brought about three significant conceptual shifts.
Abstract: In this paper, we analyse the relation between the use of environmental data in contemporary health sciences and related conceptualisations and operationalisations of the notion of environment. We consider three case studies that exemplify a different selection of environmental data and mode of data integration in data-intensive epidemiology. We argue that the diversification of data sources, their increase in scale and scope, and the application of novel analytic tools have brought about three significant conceptual shifts. First, we discuss the EXPOsOMICS project, an attempt to integrate genomic and environmental data which suggests a reframing of the boundaries between external and internal environments. Second, we explore the MEDMI platform, whose efforts to combine health, environmental and climate data instantiate a reframing and expansion of environmental exposure. Third, we illustrate how extracting epidemiological insights from extensive social data collected by the CIDACS institute yields innovative attributions of causal power to environmental factors. Identifying these shifts highlights the benefits and opportunities of new environmental data, as well as the challenges that such tools bring to understanding and fostering health. It also emphasises the constraints that data selection and accessibility pose to scientific imagination, including how researchers frame key concepts in health-related research.

6 citations


Journal Article
TL;DR: Literature-based meta-analysis has been criticized for being too malleable to constrain results, averaging incomparable values, lacking a measure of evidence's strength, and suffering from the systematic biases of individual studies; the authors argue against using it to assess treatment efficacy.
Abstract: Literature-based meta-analysis is a standard technique applied to pool results of individual studies used in medicine and social sciences. It has been criticized for being too malleable to constrain results, averaging incomparable values, lacking a measure of evidence's strength, and problems with a systematic bias of individual studies. We argue against using literature-based meta-analysis of RCTs to assess treatment efficacy and show that therapeutic decisions based on meta-analytic average are not optimal given the full scope of existing evidence. The argument proceeds with discussing examples and analyzing the properties of some standard meta-analytic techniques. First, we demonstrate that meta-analysis can lead to reporting statistically significant results despite the treatment's limited efficacy. Second, we show that meta-analytic confidence intervals are too narrow compared to the variability of treatment outcomes reported by individual studies. Third, we argue that literature-based meta-analysis is not a reliable measurement instrument. Finally, we show that meta-analysis averages out the differences among studies and leads to a loss of information. Despite these problems, literature-based meta-analysis is useful for the assessment of harms. We support two alternative approaches to evidence amalgamation: meta-analysis of individual patient data (IPD) and qualitative review employing mechanistic evidence.

6 citations
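The abstract's second point, that meta-analytic confidence intervals are too narrow compared to the variability of outcomes across individual studies, can be illustrated with a minimal sketch of standard fixed-effect (inverse-variance) pooling. The effect sizes and standard errors below are hypothetical, not taken from the paper:

```python
import math

def fixed_effect_pool(estimates, std_errors):
    """Inverse-variance (fixed-effect) pooling: each study is weighted by
    1/SE^2, and the pooled standard error shrinks as studies accumulate,
    regardless of how much the study estimates disagree."""
    weights = [1.0 / se**2 for se in std_errors]
    pooled = sum(w * y for w, y in zip(weights, estimates)) / sum(weights)
    pooled_se = 1.0 / math.sqrt(sum(weights))
    return pooled, pooled_se

# Five hypothetical trials with visibly heterogeneous effect sizes.
effects = [0.8, 0.1, -0.2, 0.5, 0.3]
ses = [0.15, 0.15, 0.15, 0.15, 0.15]

pooled, pooled_se = fixed_effect_pool(effects, ses)
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

# The pooled 95% CI is far narrower than the spread of the study
# estimates themselves (-0.2 to 0.8): the averaging step discards the
# between-study variability that the paper argues is informative.
```

Here the pooled interval spans roughly a quarter of the range covered by the individual study estimates, which is the loss-of-information effect the authors describe.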


Journal Article
TL;DR: The notion that Mendel's numbers were, in statistical terms, too good to be true was well understood almost immediately after the famous "rediscovery" of his work in 1900, but it became widely discussed only from the 1960s, for reasons having as much to do with Cold War geopolitics as with traditional concerns about the objectivity of science.
Abstract: Two things about Gregor Mendel are common knowledge: first, that he was the "monk in the garden" whose experiments with peas in mid-nineteenth-century Moravia became the starting point for genetics; second, that, despite that exalted status, there is something fishy, maybe even fraudulent, about the data that Mendel reported. Although the notion that Mendel's numbers were, in statistical terms, too good to be true was well understood almost immediately after the famous "rediscovery" of his work in 1900, the problem became widely discussed and agonized over only from the 1960s, for reasons having as much to do with Cold War geopolitics as with traditional concerns about the objectivity of science. Appreciating the historical origins of the problem as we have inherited it can be a helpful step in shifting the discussion in more productive directions, scientific as well as historiographic.

Journal Article
TL;DR: The Mass Observation Project (MOP), a national life-writing project in the UK, offers lessons for approaching views towards animal research and the role of publics in dialogue around the practice, through its efforts to generate writing that is typically reflexive, its recognition of the plurality and performativity of identity, and its embrace of knowledge as situated yet fluid.
Abstract: With an established history of controversy in the UK, the use of animals in science continues to generate significant socio-ethical discussion. Here, the figure of 'the public' plays a key role. However, dominant imaginaries of 'the public' have significant methodological and ethical problems. Examining these, this paper critiques three ways in which 'the public' is currently constructed in relation to animal research; namely as un- or mis-informed; homogenous; and holding fixed and extractable views. In considering an alternative to such imaginaries, we turn to the Mass Observation Project (MOP), a national life-writing project in the UK. In its efforts to generate writing which is typically reflexive, its recognition of the plurality and performativity of identity, and embrace of knowledge as situated yet fluid, the MOP offers lessons for approaching views towards animal research and the role of publics in dialogue around the practice. In considering the MOP, we underline the need to acknowledge the role of method in shaping both what publics are able to articulate, and which positions they are able to articulate from. Finally, we stress the need for future dialogue around animal research to involve publics beyond one-way measurements of 'public opinion' and instead work to foster a reciprocity which enables them to act as collaborators in and coproducers of animal research policy, practice, and dialogue.

Journal Article
TL;DR: The author summarizes and critically analyzes some of the most significant accounts of contextuality in quantum mechanics and subsumes them under two categories: simultaneous noncontextuality and measurement noncontextuality.
Abstract: There are two different and logically independent concepts of noncontextuality in quantum mechanics. First, an ontological (hidden variable) model for quantum mechanics is called noncontextual if every ontic (hidden) state determines the probability of the outcomes of every measurement independently of what other measurements are simultaneously performed. Second, an ontological model is noncontextual if any two measurements which are represented by the same self-adjoint operator or, equivalently, which have the same probability distribution of outcomes in every quantum state also have the same probability distribution of outcomes in every ontic state. I will call the first concept simultaneous noncontextuality, the second measurement noncontextuality. In the paper I will overview and critically analyze some of the most significant accounts of contextuality in the literature and subsume them under these two categories.

Journal Article
TL;DR: In the 1970s, Lewontin raised a problem of locality, arguing that any given heritability estimate is local to the original population and environment studied and cannot be generalized; the author compares this with the new problem of portability facing polygenic scores, whose predictive accuracy diminishes when applied to populations different from the original sample.
Abstract: In the 1970s, Lewontin sparked a debate about a problem of locality, by making the case that any given heritability estimate is local to the original population and environment studied, and could not be generalized to other populations and environments. Nearly 50 years later, a new problem of portability has emerged: the predictive accuracy of polygenic scores diminishes when applied to populations whose characteristics are different from the original population sample. This paper briefly reviews the nature of each problem and analyzes their similarities and differences in three areas: 1) conceptual underpinnings, 2) causal explanations, and 3) practical, social, and political implications. Although conceptually and methodologically different from the problem of locality in important respects, the problem of portability facing contemporary genomics today should come as no surprise, as it is an inevitable outcome of the kinds of problematic inferences detailed by Lewontin nearly half a century ago.

Journal Article
TL;DR: The author develops a concept of behavioural ecological individuality, defined as phenotypic and ecological uniqueness and operationalised in terms of individual differences such as animal personality and individual specialisation.
Abstract: In this paper I develop a concept of behavioural ecological individuality. Using findings from a case study which employed qualitative methods, I argue that individuality in behavioural ecology should be defined as phenotypic and ecological uniqueness, a concept that is operationalised in terms of individual differences such as animal personality and individual specialisation. This account makes sense of how the term "individuality" is used in relation to intrapopulation variation in behavioural ecology. The concept of behavioural ecological individuality can sometimes be used to identify individuals. It also shapes research agendas and methodological choices in behavioural ecology, leading researchers to account for individuals as sources of variation. Overall, this paper draws attention to a field that has been largely overlooked in philosophical discussions of biological individuality and highlights the importance of individual differences and uniqueness for individuality in behavioural ecology.

Journal Article
TL;DR: The author proposes a new category of normative reasons, inquisitive reasons, which contains both promise reasons and social inquisitive reasons, and argues that pursuitworthiness considerations are epistemic reasons of this non-evidential kind.
Abstract: Sometimes inquirers may rationally pursue a theory even when the available evidence does not favor that theory over others. Features of a theory that favor pursuing it are known as considerations of promise or pursuitworthiness. Examples of such reasons include that a theory is testable, that it has a useful associated analogy, and that it suggests new research and experiments. These reasons need not be evidence in favor of the theory. This raises the question: what kinds of reasons are provided by pursuitworthiness considerations? Are they epistemic reasons or practical reasons? I argue that pursuitworthiness considerations are a kind of non-evidential epistemic reason, which I call an inquisitive reason. In support of this, I first point out two important similarities between the traditional pursuitworthiness considerations discussed in philosophy of science, which I call promise reasons, and certain social epistemic reasons that I call social inquisitive reasons. Specifically, both kinds of reason (1) favor pursuing a theory in a non-evidential way, and (2) concern promoting successful inquiry. I then propose recognition of a new category of normative reason: inquisitive reasons. This category contains both promise and social inquisitive reasons. Finally, I argue that inquisitive reasons share three essential features with previously recognized epistemic reasons: a connection to epistemic aims, explanatory independence, and the presence of a specific right-kind/wrong-kind reasons distinction. Each of these features have been used to argue that evidence should be treated as part of a distinct, independent domain of epistemic normativity. Since inquisitive reasons share these features, they too should be considered part of this independent epistemic domain. Thus, inquisitive reasons, including pursuitworthiness considerations, are epistemic reasons.

Journal Article
TL;DR: Kant develops a rationalist theory of testimony and practices it in his own work, especially while theorizing about race as a subject of natural philosophy, treating race from the standpoint of a philosophical investigator of nature (Naturforscher) who, as Kant puts it in the first Critique, learns from nature "like an appointed judge who compels witnesses to answer the questions he puts to them."
Abstract: A testimony is somebody else's reported experience of what has happened. It is an indispensable source of knowledge. It only gives us historical cognition, however, which stands in a complex relation to rational or philosophical cognition: while the latter presupposes historical cognition as its matter, one needs the architectonic "eye of a philosopher" to select, interpret, and organize historical cognition. Kant develops this rationalist theory of testimony. He also practices it in his own work, especially while theorizing about race as a subject of natural philosophy. In three dedicated essays on this subject, he treats race from the standpoint of a philosophical investigator of nature (Naturforscher), who (as Kant puts it in the first Critique) learns from nature "like an appointed judge who compels witnesses to answer the questions he puts to them." This view underwrites Kant's use of travel reports (a type of testimony) in developing and defending his theory of race.

Journal Article
TL;DR: The authors reconstruct the epistemic significance of research on operational theories in quantum foundations and argue that operational axiomatisations of quantum mechanics may be interpreted as a novel form of structural realism.
Abstract: We undertake a reconstruction of the epistemic significance of research on operational theories in quantum foundations. We suggest that the space of operational theories is analogous to the space of possible worlds employed in the possible world semantics for modal logic, so research of this sort can be understood as probing modal structure. Thus we argue that operational axiomatisations of quantum mechanics may be interpreted as a novel form of structural realism; we discuss the consequences of this interpretation for the philosophy of structural realism and the future of operational theories.

Journal Article
TL;DR: The authors argue that the new demarcation problem need not be framed as the problem of defining a set of necessary and jointly sufficient criteria for distinguishing between acceptable and unacceptable roles that non-epistemic values can play in science, and defend an open-ended list of criteria instead.
Abstract: In this paper, we argue that the new demarcation problem does not need to be framed as the problem of defining a set of necessary and jointly sufficient criteria for distinguishing between acceptable and unacceptable roles that non-epistemic values can play in science. We introduce an alternative way of framing the problem and defend an open-ended list of criteria that can be used in demarcation. Applying such criteria requires context-specific work that clarifies which principles should be used, and possibly leads to the identification of new principles - which then can be added to the open-ended list. We illustrate our approach by examining a context where distinguishing between acceptable and unacceptable value influences in science is both needed and tricky: transdisciplinary research.

Journal Article
TL;DR: The research field of Developmental Origins of Health and Disease (DOHaD) provides a framework for understanding how a wide range of environmental factors, such as deprivation, nutrition and stress, shape individual and population health over the course of a lifetime; based on ethnographic research, the author shows how DOHaD researchers struggle with simplistic articulations of the environment.
Abstract: The research field of Developmental Origins of Health and Disease (DOHaD) provides a framework for understanding how a wide range of environmental factors, such as deprivation, nutrition and stress, shape individual and population health over the course of a lifetime. DOHaD researchers face the challenge of how to conceptualize and measure ontologically diverse environments and their interactions with the developing organism over extended periods of time. Based on ethnographic research, I show how DOHaD researchers are often eager to capture what they regard as more 'complex' understandings of the environment in their work. At the same time, they are confronted with established methodological tools, disciplinary infrastructures and institutional contexts that favor simplistic articulations of the environment as distinct and mainly individual-level variables. I show how researchers struggle with these simplistic articulations of nutrition, maternal bodies and social determinants as relevant environments, which are sometimes at odds with the researchers' own normative commitments and aspirations.

Journal Article
TL;DR: The author argues that if scientists are taken to trust each other to be reliable informants, such trust is not epistemically justified, certainly for the biomedical and social sciences; on a different construal, scientists trust each other only to testify to claims backed by evidence gathered in accordance with prevailing methodological standards, and on this construal the trust is justified.
Abstract: Epistemic trust among scientists is inevitable. There are two questions about this: (1) What is the content of this trust, what do scientists trust each other for? (2) Is such trust epistemically justified? I argue that if we assume a traditional answer to (1), namely that scientists trust each other to be reliable informants, then the answer to question (2) is negative, certainly for the biomedical and social sciences. This motivates a different construal of trust among scientists and therefore a different answer to (1): scientists trust each other to only testify to claims that are backed by evidence gathered in accordance with prevailing methodological standards. On this answer, trust among scientists is epistemically justified.

Journal Article
TL;DR: The authors investigate whether there are any noteworthy network effects in a version of the bounded confidence model augmented with communication networks, and in particular whether a much-discussed result from network epistemology, that densely connected networks are not always preferable to sparser ones, can be replicated in that version.
Abstract: The bounded confidence model has become a popular tool for studying communities of epistemically interacting agents. The model makes the idealizing assumption that all agents always have access to all other agents’ belief states. We draw on resources from network epistemology to do away with this assumption. In the model to be proposed, we impose an explicit communication network on a community, due to which each agent has access to the beliefs of only a selection of other agents. A much-discussed result from network epistemology shows that densely connected communication networks are not always preferable to sparser networks. The aim of this paper is to investigate whether there are any noteworthy network effects in a version of the bounded confidence model augmented with communication networks, and in particular whether the aforementioned result from network epistemology can be replicated in that version.
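The model the abstract describes can be sketched in a few lines. The update rule below is the standard bounded confidence dynamic, restricted so that each agent only consults its neighbours in an explicit communication network; the network choices and parameter values are illustrative assumptions, not taken from the paper:

```python
def bounded_confidence_step(beliefs, neighbors, eps):
    """One synchronous update of the bounded confidence model on a
    communication network: each agent moves to the average belief of those
    network neighbours (plus itself) whose belief lies within eps of its own."""
    new = []
    for i, b in enumerate(beliefs):
        peers = [j for j in neighbors[i] | {i} if abs(beliefs[j] - b) <= eps]
        new.append(sum(beliefs[j] for j in peers) / len(peers))
    return new

# Classical model (complete graph: everyone sees everyone) versus a sparse
# ring network, starting from the same evenly spaced belief profile.
n = 10
beliefs = [i / (n - 1) for i in range(n)]
complete = {i: set(range(n)) - {i} for i in range(n)}
ring = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}

b_complete, b_ring = beliefs[:], beliefs[:]
for _ in range(50):
    b_complete = bounded_confidence_step(b_complete, complete, eps=0.3)
    b_ring = bounded_confidence_step(b_ring, ring, eps=0.3)
```

Because each update is a convex average, beliefs never leave the initial interval and the spread never grows; comparing `b_complete` and `b_ring` after many steps is the kind of experiment the paper's question about network effects amounts to.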

Journal Article
TL;DR: The authors offer an account of scientific pursuitworthiness that takes criticizability as its starting point, the apokritic model of pursuit, and show that it can unify several indices of criticizability.
Abstract: Criticism is a staple of the scientific enterprise and of the social epistemology of science. Philosophical discussions of criticism have traditionally focused on its roles in relation to objectivity, confirmation, and theory choice. However, attention to criticism and to criticizability should also inform our thinking about scientific pursuits: the allocation of resources with the aim of developing scientific tools and ideas. In this paper, we offer an account of scientific pursuitworthiness which takes criticizability as its starting point. We call this the apokritic model of pursuit. Its core ideas are that pursuits are practices governed by norms for asking and answering questions, and that criticism arises from the breach of these norms. We illustrate and advertise our approach using examples from institutional grant review, neuroscience, and sociology. We show that the apokritic model can unify several indices of criticizability, that it can account for the importance of criticizing pursuits in scientific practice, and that it can offer ameliorative advice to erstwhile pursuers.

Journal ArticleDOI
TL;DR: In this article, the authors explain the concept of genetic gain, the conditions for its emerging status as an indicator of agricultural development, and the broader implications of this move, with particular emphasis on the changing knowledge-control regimes of plant breeding and the social and political consequences for smallholder farmers and climate-adaptive agriculture.
Abstract: Accelerating the rate of genetic gain has in recent years become a key objective in plant breeding for the Global South, building on the availability of new data technologies and bridging biological interest in crop improvement with economic interest in enhancing the cost efficiency of breeding programs. This paper explains the concept of genetic gain, the conditions for its emerging status as an indicator of agricultural development and the broader implications of this move, with particular emphasis on the changing knowledge-control regimes of plant breeding and the social and political consequences for smallholder farmers and climate-adaptive agriculture. We analyse how prioritising the variables used to derive the indicator when deciding on agricultural policies affects the relationship between development goals and practice. We conclude that genetic gain should not be considered as a primary indicator of agricultural development in the absence of information on other key areas (including agrobiodiversity, seed systems and the differential impact of climate change on soil, crops and communities), as well as tools to evaluate the pros and cons of the acceleration in seed selection, management and evaluation fostered by the adoption of genetic gain as a key indicator.

Journal ArticleDOI
TL;DR: In this paper, the history of quantum mechanics is viewed from the perspective of radiation, and it is argued that wave and matrix mechanics emerged as spin-offs from nascent, ultimately unsuccessful quantum theories of radiation.
Abstract: This paper starts from the observation that the major hallmarks in the history of quantum mechanics centrally involved light and radiation. Consequently, we ask how precisely did concepts and ideas from radiation theory define the creation of quantum mechanics? And why was it a quantum theory of mechanics and not a quantum theory of electrodynamics that was crafted in the years 1925/26? Following these questions, this paper provides a reading of the history of the old quantum theory from the perspective of radiation. We argue that wave and matrix mechanics, while they developed along widely different trajectories, both emerged as spin-offs from nascent, ultimately unsuccessful quantum theories of radiation. We discuss the epistemological implications of this development for the historical and philosophical analysis of quantum mechanics and quantum field theory.

Journal ArticleDOI
TL;DR: The authors argue that the success of Bayesian approaches hinges on computational methods that make a class of models predictive that would otherwise lack practical relevance, and that the new computational approaches change Bayesian rationality in an important way.
Abstract: Bayesian approaches long occupied a minority position in scientific practice but have gained rapidly in popularity since the 1990s. This paper describes and analyzes this turn. I argue that the success of Bayesian approaches hinges on computational methods that make a class of models predictive that would otherwise lack practical relevance. Philosophically, however, this orientation toward prediction comes at a price. The new computational approaches change Bayesian rationality in an important way. Namely, they undercut the interpretation of priors, turning them from an expression of beliefs held prior to new evidence into an adjustable parameter that can be manipulated flexibly by computational machinery. Thus, in the case of Bayes, one can see a coevolution of computing technology, an exploratory-iterative mode of prediction, and the conception of rationality.

Journal ArticleDOI
TL;DR: In this article, the authors present an integrated account of why it is rational for scientists to pursue both complementary and competing models of the same mechanism in scientific practice, using visual processing as a case study.
Abstract: Why is it rational for scientists to pursue multiple models of a phenomenon at the same time? The literatures on mechanistic inquiry and scientific pursuit each develop answers to a version of this question which is rarely discussed by the other. The mechanistic literature suggests that scientists pursue different complementary models because each model provides detailed insights into different aspects of the phenomenon under investigation. The pursuit literature suggests that scientists pursue competing models because alternative models promise to solve outstanding empirical and conceptual problems. Looking into research on visual processing as a case study, we suggest an integrated account of why it is rational for scientists to pursue both complementary and competing models of the same mechanism in scientific practice.

Journal ArticleDOI
TL;DR: In this article, the authors revisit conventionalism about the geometry of classical and relativistic spacetimes, clarifying key themes of, and rectifying common misunderstandings about, conventionalism.
Abstract: The present paper revisits conventionalism about the geometry of classical and relativistic spacetimes. By means of critically examining a recent evaluation of conventionalism, we clarify key themes of, and rectify common misunderstandings about, conventionalism. Reichenbach's variant is demarcated from conventionalism simpliciter, associated primarily with Poincaré. We carefully outline the latter's core tenets—as a selective anti-realist response to a particular form of theory underdetermination. A subsequent double defence of geometric conventionalism is proffered: one line of defence employs (and thereby, to some extent, rehabilitates) a plausible reading of Reichenbach's idea of universal forces; another consists in independent support for conventionalism, unrelated to Reichenbach. Conventionalism, we maintain, remains a live option in contemporary philosophy of spacetime physics, worthy of serious consideration.

Journal ArticleDOI
Timothy Lau1
TL;DR: In this article, the author gives a conceptually focused presentation of low-energy quantum gravity (LEQG), the effective quantum field theory obtained from general relativity, which provides a well-defined theory of quantum gravity at energies well below the Planck scale.
Abstract: I provide a conceptually-focused presentation of 'low-energy quantum gravity' (LEQG), the effective quantum field theory obtained from general relativity and which provides a well-defined theory of quantum gravity at energies well below the Planck scale. I emphasize the extent to which some such theory is required by the abundant observational evidence in astrophysics and cosmology for situations which require a simultaneous treatment of quantum-mechanical and gravitational effects, contra the often-heard claim that all observed phenomena can be accounted for either by classical gravity or by non-gravitational quantum mechanics, and I give a detailed account of the way in which a treatment of the theory as fluctuations on a classical background emerges as an approximation to the underlying theory rather than being put in by hand. I discuss the search for a Planck-scale quantum-gravity theory from the perspective of LEQG and give an introduction to the Cosmological Constant problem as it arises within LEQG.

Journal ArticleDOI
TL;DR: In this article, the authors examine why early twenty-first century animal research governance in Britain foregrounds the "culture of care" as its key problem, and adopt a historical perspective to understand why the regulation of animal research became primarily a problem of "culture" at this time and not before.
Abstract: This article examines why early twenty-first century animal research governance in Britain foregrounds the ‘culture of care’ as its key problem. It adopts a historical perspective to understand why the regulation of animal research became primarily a problem of ‘culture’, a term firmly associated with the social relations of animal research, at this time and not before. Drawing on the theoretical insights of Sheila Jasanoff, Stephen Hilgartner and others, we contrast the British regulatory framework under the Cruelty to Animals Act (1876), which established statutory regulation of animal research for the first time in the world, with its successor the Animals (Scientific Procedures) Act 1986 (ASPA), in an attempt to chart two closely related yet distinct ‘constitutions’ of animal research each shaped by a historically situated sociotechnical imaginary. Across this longue durée, many concerns remained consistent yet inevitably, as the biomedical sciences transformed in scale and scope, new concerns emerged. Animal care, at least as far as it entailed a commitment to the prevention of animal suffering, was a prominent feature of animal research governance across the period. However, a concern for the culture and social relations of animal research emerged only in the latter half of the twentieth century. We account for this change primarily through a gradual distribution of responsibility for animal research from a single coherent community with broadly shared expertise (‘scientists’ with experience of animal research) to a diversified community of multiple experience and skillsets which included, importantly, a more equitable inclusion of animal welfare as a form of expertise with direct relevance to animal research. We conclude that animal research governance could only become conceived as a problem of ‘culture’ and thus social relations when responsibility for care and animal welfare was distributed across a differentiated community, in which diverse forms of expertise were required for the practice of humane animal research.