
Showing papers in "Foundations of Science in 2020"


Journal ArticleDOI
TL;DR: In this paper, it was shown that the comportment of quantum and relativistic entities is not that strange after all, if we only consider what their nature might possibly be: not an objectual one, but a conceptual one.
Abstract: How can we explain the strange behavior of quantum and relativistic entities? Why do they behave in ways that defy our intuition about how physical entities should behave, considering our ordinary experience of the world around us? In this article, we address these questions by showing that the comportment of quantum and relativistic entities is not that strange after all, if we only consider what their nature might possibly be: not an objectual one, but a conceptual one. This not in the sense that quantum and relativistic entities would be human concepts, but in the sense that they would share with the latter a same conceptual nature, similarly to how electromagnetic and sound waves, although very different entities, can share a same undulatory nature. When this hypothesis is adopted, i.e., when a conceptuality interpretation about the deep nature of physical entities is taken seriously, many of the interpretational difficulties disappear and our physical world is back making sense, though our view of it becomes radically different from what our classical prejudice made us believe in the first place.

23 citations


Journal ArticleDOI
TL;DR: This research introduces the pioneering work ULTIMATE, which uses a novel tree structure together with support bound and distance bound computations for pruning temporal patterns.
Abstract: Discovery of temporal association patterns and temporal association rules from temporal databases has been extensively studied by the academic research community and applied in various industrial applications. Temporal association pattern discovery was extended to similarity-based temporal association pattern discovery from time-stamped transaction datasets by the researchers Yoo and Shekhar. They introduced methods for pruning through distance bounds, and also introduced the SEQUENTIAL and SPAMINE algorithms for pattern mining, based on snapshot data scan and lattice data scan strategies respectively. Our previous research introduced the algorithms G-SPAMINE, MASTER and Z-SPAMINE for time-profiled association pattern discovery. These algorithms applied the distance measures SRIHASS, ASTRA and KRISHNA SUDARSANA for similarity computations. The SEQUENTIAL, SPAMINE, G-SPAMINE, MASTER and Z-SPAMINE approaches are all based on snapshot and lattice database scan strategies and prune temporal itemsets by making use of lower-bound and upper-bound support time sequences and lower-bound and upper-bound distance values. The major limitation of all these algorithms is their inability to eliminate the dataset scanning process required to obtain the true supports of itemsets, and the essential need to keep the dataset available in memory. To eliminate the requirement of retaining the dataset in main memory, the algorithms VRKSHA and GANDIVA are two pioneering research contributions that introduced a tree structure for time-profiled temporal association mining. VRKSHA is based on a snapshot tree scan technique while GANDIVA is a lattice tree scan based approach. VRKSHA and GANDIVA both apply the Euclidean distance function, but they do not estimate support and distance bounds. This research introduces the pioneering work ULTIMATE, which uses a novel tree structure. The tree is generated using the similarity measure ASTRA. ULTIMATE uses support bound and distance bound computations for pruning temporal patterns. Experiment results showed that ULTIMATE outperforms the SEQUENTIAL, SPAMINE, G-SPAMINE, MASTER, VRKSHA and GANDIVA algorithms.

22 citations


Journal ArticleDOI
TL;DR: This research proposes a novel z-space based interest measure named Krishna Sudarsana for time-stamped transaction databases, extending the interest measure Srihass proposed in previous research, and shows that the performance of the proposed approach is better than that of the Sequential approach, which uses a snapshot database scan strategy, and the Spamine approach, which uses a lattice-based database scan strategy.
Abstract: Similarity profiled association mining from time-stamped transaction databases is an important topic of research that is relatively less addressed in the field of temporal data mining. Mining temporal patterns from these time series databases requires choosing and applying a similarity measure for similarity computations and subsequently pruning temporal patterns. This research proposes a novel z-space based interest measure named Krishna Sudarsana for time-stamped transaction databases by extending the interest measure Srihass proposed in previous research. Krishna Sudarsana is designed using the product-based fuzzy Gaussian membership function and performs similarity computations in z-space to determine the similarity degree between any two temporal patterns. The interest measure is designed by considering z-values between z = 0 and z = 3.09. Applying Krishna Sudarsana requires moving the threshold value given by the user to a different transformation space (z-space), which is defined as a function of the standard deviation. In addition to proposing the interest measure, new expressions for the standard deviation and the equivalent z-space threshold are derived for similarity computations. For experimental evaluation, we considered the Naive, Sequential and Spamine algorithms, which apply the Euclidean distance function, and compared the performance of these three approaches to the Z-Spamine algorithm, which uses Krishna Sudarsana, across various test cases. Experiment results proved that the performance of the proposed approach is better than that of the Sequential approach, which uses a snapshot database scan strategy, and the Spamine approach, which uses a lattice-based database scan strategy.
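The overall recipe described above (a product-based fuzzy Gaussian membership degree mapped into a z-space capped at z = 3.09, with the user threshold moved into that space) can be sketched as follows. This is only an illustrative reading of the abstract: the function names, the membership-to-z mapping and the example sequences are assumptions, not the paper's actual Krishna Sudarsana expressions.

import numpy as np

def gaussian_membership(x, y, sigma):
    """Hypothetical product-based fuzzy Gaussian membership degree
    between two support time sequences x and y."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.prod(np.exp(-((x - y) ** 2) / (2.0 * sigma ** 2))))

def to_z(membership):
    """Map a membership degree in (0, 1] to a z-value, truncated to the
    interval [0, 3.09] mentioned in the abstract (z = 0 means identical)."""
    z = np.sqrt(-2.0 * np.log(max(membership, 1e-18)))
    return float(min(z, 3.09))

if __name__ == "__main__":
    ref = [0.40, 0.50, 0.60]      # reference support time sequence
    cand = [0.35, 0.55, 0.58]     # candidate pattern's support time sequence
    z = to_z(gaussian_membership(ref, cand, sigma=0.1))
    z_threshold = to_z(0.9)       # user similarity threshold moved to z-space
    print(f"z = {z:.3f}, prune: {z > z_threshold}")

Under this toy mapping a pattern is pruned when its z-value exceeds the z-space image of the user threshold, which mirrors the kind of pruning decision described above.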

21 citations


Journal ArticleDOI
TL;DR: The importance of the proposed approach is that its accuracy outperforms that of CLAPP, CANN, SVM, KNN and other existing classifiers.
Abstract: Detecting intrusions and anomalies is becoming much more challenging as new attacks emerge over time. Achieving better accuracies by applying benchmark classifier algorithms for identifying intrusions and anomalies involves several hidden data mining challenges. Although neglected by many research findings, one of the most important and biggest challenges is the similarity or membership computation. Another challenge that cannot simply be neglected is the number of features, which contributes to dimensionality. This research aims to come up with a new membership function to carry out similarity computation that can be helpful for addressing feature dimensionality issues. In principle, this work is aimed at introducing a novel membership function that can help achieve better classification accuracies and eventually lead to better intrusion and anomaly detection. Experiments are performed on the KDD dataset with 41 attributes and also on the KDD dataset with 19 attributes. The recent approaches CANN and CLAPP have shown new directions for intrusion detection. The proposed classifier is named UTTAMA. UTTAMA performed better than both the CANN and CLAPP approaches with respect to overall classifier accuracy. Another promising outcome achieved using UTTAMA is the U2R and R2L attack accuracies. The importance of the proposed approach is that its accuracy outperforms that of CLAPP, CANN, SVM, KNN and other existing classifiers.

20 citations


Journal ArticleDOI
TL;DR: This paper surveyed the literature on stigmatization related to fetal alcohol spectrum disorders (FASD) and found that public stigma appears to be the most common form of stigma studied, while less is known about FASD-related self-stigma, stigma by association and structural stigma.
Abstract: Alcohol consumption during pregnancy can lead to fetal alcohol spectrum disorders (FASD). FASD is a spectrum of structural, functional, and neurodevelopmental problems with often lifelong implications, affecting communities worldwide. It is a leading preventable form of intellectual disability and therefore warrants effective prevention approaches. However, well-intended FASD prevention can increase stigmatization of individuals with FASD, women who consume or have consumed alcohol during pregnancy, and non-biological parents and guardians of individuals with FASD. This narrative review surveyed the literature on stigmatization related to FASD. Public stigma appears to be the most common form of stigma studied. Less is known about FASD-related self-stigma, stigma by association, and structural stigma. Accordingly, the current literature on FASD-related stigma does not appear to provide sufficient guidance for effectively reducing FASD-related stigma. However, lessons can be learned from other related health topics and the use of a systematic approach for the development of health promotion programs, namely Intervention Mapping.

18 citations


Journal ArticleDOI
TL;DR: The authors argue that English, despite all appearances, is no Lingua Franca, give reasons why epistemic diversity is also deeply hindered in monolingual contexts, and sketch a proposal for a multilingual academia in which epistemic diversity is fostered.
Abstract: Epistemic diversity is the ability or possibility of producing diverse and rich epistemic apparati to make sense of the world around us. In this paper we discuss whether, and to what extent, different conceptions of knowledge—notably as ‘justified true belief’ and as ‘distributed and embodied cognition’—hinder or foster epistemic diversity. We then link this discussion to the widespread move in science and philosophy towards monolingual disciplinary environments. We argue that English, despite all appearances, is no Lingua Franca, and we give reasons why epistemic diversity is also deeply hindered in monolingual contexts. Finally, we sketch a proposal for a multilingual academia where epistemic diversity is thereby fostered.

18 citations


Journal ArticleDOI
TL;DR: It is conjectured that the way one can easily understand how two of ‘the same concepts’ are ‘absolutely identical and indistinguishable’ in human language is also the way in which quantum particles are absolutely identical in physical reality, providing new evidence for the conceptuality interpretation of quantum theory.
Abstract: We model a piece of text of human language telling a story by means of the quantum structure describing a Bose gas in a state close to a Bose–Einstein condensate near absolute zero temperature. For this we introduce energy levels for the words (concepts) used in the story and we also introduce the new notion of ‘cogniton’ as the quantum of human thought. Words (concepts) are then cognitons in different energy states, as is the case for photons in different energy states, or states of different radiative frequency, when the considered boson gas is that of the quanta of the electromagnetic field. We show that Bose–Einstein statistics delivers a very good model for these pieces of texts telling stories, both for short stories and for long stories of the size of novels. We analyze an unexpected connection with Zipf’s law in human language, the Zipf ranking relating to the energy levels of the words, and the Bose–Einstein graph coinciding with the Zipf graph. We investigate the issue of ‘identity and indistinguishability’ from this new perspective and conjecture that the way one can easily understand how two of ‘the same concepts’ are ‘absolutely identical and indistinguishable’ in human language is also the way in which quantum particles are absolutely identical and indistinguishable in physical reality, providing in this way new evidence for our conceptuality interpretation of quantum theory.
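A minimal sketch of the kind of fit the abstract describes: give the i-th most frequent word (the Zipf ranking) the energy level E_i = i and fit a Bose-Einstein occupation law to the rank-ordered word counts. The functional form, the parameter names A and B, and the tiny example story are illustrative assumptions; the paper's actual fitting procedure may differ.

from collections import Counter
import numpy as np
from scipy.optimize import curve_fit

def bose_einstein(E, A, B):
    """Occupation number of energy level E for a boson gas:
    N(E) = 1 / (exp((E + B) / A) - 1), with A and B to be fitted."""
    return 1.0 / (np.exp((E + B) / A) - 1.0)

def fit_story(text):
    """Rank-order the word counts (Zipf ranking) and fit the Bose-Einstein law."""
    counts = np.array(sorted(Counter(text.lower().split()).values(),
                             reverse=True), dtype=float)
    E = np.arange(len(counts), dtype=float)   # energy level = Zipf rank
    (A, B), _ = curve_fit(bose_einstein, E, counts, p0=(2.0, 0.5),
                          bounds=([1e-6, 1e-6], [np.inf, np.inf]))
    return E, counts, A, B

if __name__ == "__main__":
    story = "the cat saw the dog and the dog saw the cat near the old tree"
    E, counts, A, B = fit_story(story)
    print("observed counts  :", counts)
    print("Bose-Einstein fit:", np.round(bose_einstein(E, A, B), 2))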

16 citations


Journal ArticleDOI
TL;DR: A novel similarity function is proposed for feature pattern clustering and high dimensional text classification; it achieves dimensionality reduction, retains the word distribution and obtains better classification accuracies compared to other measures.
Abstract: Text document classification and clustering is an important learning task that fits into both the data mining and machine learning areas. The learning task poses several challenges when high dimensional text documents have to be processed. Word distribution in text documents plays a key role in the learning process. Research related to high dimensional text document classification and clustering is usually limited to the application of traditional distance functions, and most of the research contributions in the existing literature did not consider the word distribution in documents. In this research, we propose a novel similarity function for feature pattern clustering and high dimensional text classification. The proposed similarity function is used to carry out supervised-learning-based dimensionality reduction. An important feature of this work is that the word distribution before and after dimensionality reduction is the same. Experiment results prove that the proposed approach achieves dimensionality reduction, retains the word distribution and obtains better classification accuracies compared to other measures.

16 citations


Journal ArticleDOI
TL;DR: In this paper, it was shown that a naive mixing of arithmetics leads to contradictions at a much more elementary level than the Clauser-Horne-Shimony-Holt inequality.
Abstract: Bell’s theorem cannot be proved if complementary measurements have to be represented by random variables which cannot be added or multiplied. One such case occurs if their domains are not identical. The case more directly related to the Einstein–Rosen–Podolsky argument occurs if there exists an ‘element of reality’ but nevertheless addition of complementary results is impossible because they are represented by elements from different arithmetics. A naive mixing of arithmetics leads to contradictions at a much more elementary level than the Clauser–Horne–Shimony–Holt inequality.

15 citations


Journal ArticleDOI
TL;DR: The study addresses the greyness levels and systems levels and explains why the world cannot be perceived as a purely white or black structure and clarifies why human knowledge of systems always remains grey.
Abstract: The main purpose of this study is to probe into the human capacity for understanding systems and the defects in human knowledge of the world. The study addresses greyness levels and systems levels and explains why the world cannot be perceived as a purely white or black structure. It also clarifies why human knowledge of systems always remains grey. The investigation relies on logical and deductive reasoning and uses the theoretical foundations of systems thinking and Boulding’s systems hierarchy. The most important argument that this study advances is that human knowledge, in any form or under any circumstances, is grey and incomplete and will remain grey. Because the notion of “perfect knowledge” is ambiguous given human epistemic limits, any proportion of knowledge is incomplete and prone to change. Less complexity could lead to more accurate predictions, but even in the simplest forms of systems, reaching perfect knowledge seems to be an unwarranted claim. Furthermore, because our perception of past events is incomplete, we cannot predict the future with certainty, as a result of which both the past and the future appear grey to us. The world, as an integrated system, is neither black nor white but remains grey, and the systems partially recognized by humans are part of the grey world. Gaining knowledge and increasing discoveries only contribute to the grey systems that are already known.

15 citations


Journal ArticleDOI
TL;DR: In this paper, the basic philosophical underpinnings of grey systems theory (GST), as well as the paradigm governing its postulates are examined from the perspective of postmodern philosophy.
Abstract: Every scientific or intellectual movement is founded upon basic assumptions and hypotheses that shape its specifically formulated philosophy. This study seeks to explore and explicate the basic philosophical underpinnings of grey systems theory (GST), as well as the paradigm governing its postulates. The study, more specifically, scrutinizes the underlying principles of GST from the perspective of postmodern philosophy. To accomplish this, the epistemology, ontology, human nature, and methodology of GST are substantially investigated in the light of postmodern philosophy. The study draws on Burrell and Morgan’s framework to reveal the paradigm underlying the philosophy of GST. Results demonstrate that GST is an anti-realistic, anti-positivistic, and non-deterministic theory which is inherently pluralistic and ideographic. Based on the principles of GST, change is an indispensable dimension of human speculation about the world and systems, and knowledge is ceaselessly reproduced as new information is collected. As a result, knowledge, narratives, theories and scientific laws are dynamically changed. GST, then, is remarkably compatible with the foundations of postmodern thought and it could be regarded as a postmodern theory governed by a humanistic paradigm.

Journal ArticleDOI
TL;DR: In this paper, a degree of scientificity using fuzzy sets is proposed; under it, cosmologies with a contraction phase before the current expansion phase are potentially more scientific than the standard cosmological model.
Abstract: In spite of successful tests, the standard cosmological model, the $$\varLambda$$CDM model, possesses a most problematic concept: the initial singularity, also known as the big bang. In this paper, by adopting the Kantian difference between thinking of an object and cognizing an object, a degree of scientificity using fuzzy sets is proposed. Thus, the notion of the initial singularity will not be conceived of as a scientific issue because it does not belong to the fuzzy set of what is known. Indeed, the problematic concept of singularity is a sort of what Kant called the noumenon, whereas science is constructed in the phenomenon. By applying the fuzzy degree of scientificity to cosmological models, one concludes that cosmologies with a contraction phase before the current expansion phase are potentially more scientific than the standard model. At the end of this article, it is shown that Kant's first antinomy of pure reason indicates a limit to our cosmological models.

Journal ArticleDOI
TL;DR: It was discovered that searching for harmful nodes with GA-ANFIS using weighted trust evaluation significantly increased the lifespan of WSNs, and the proposed method achieves a longer lifetime than the other methods.
Abstract: Detecting harmful nodes and diminishing the energy waste in sensor nodes can prolong the lifespan of wireless sensor networks (WSNs). In this study, a genetic algorithm (GA) and an adaptive neuro-fuzzy inference system (ANFIS) were used to diminish the energy waste of sensors. Weighted trust evaluation was applied to search for harmful nodes in the network in order to prolong the lifespan of WSNs. A low-energy adaptive clustering hierarchy (LEACH) method was used to analyze the results. It was discovered that searching for harmful nodes with GA-ANFIS using weighted trust evaluation significantly increased the lifespan of WSNs. For evaluation of the proposed method we used the mean energy of all sensors versus rounds, data packets received at the base station, minimum energy versus rounds, and the number of alive sensors versus rounds. We also compared the results of the proposed method with the LEACH, LEACH-DT, Random, SIF and GA-Fuzzy methods. The results show that the proposed method has a longer lifetime than the other methods. A representation of the overall system was implemented using MATLAB software.
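The weighted trust evaluation step can be illustrated with a generic sketch: each node's reading is compared with a trust-weighted consensus, and nodes that repeatedly deviate lose trust until they are flagged as harmful. This is not the paper's GA-ANFIS pipeline; the update rule, learning rate and threshold below are assumptions for illustration only.

import numpy as np

def weighted_trust_round(readings, trust, learning_rate=0.3):
    """One round of weighted trust evaluation: nodes whose readings deviate
    from the trust-weighted consensus lose trust, the others regain it."""
    consensus = np.average(readings, weights=trust)
    deviation = np.abs(readings - consensus)
    penalty = deviation / (deviation.max() + 1e-12)   # 0 = agrees, 1 = worst
    trust = np.clip(trust + learning_rate * (1.0 - 2.0 * penalty), 0.0, 1.0)
    return trust, consensus

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    honest = 25.0 + rng.normal(0.0, 0.5, size=8)   # 8 honest temperature sensors
    harmful = np.array([60.0, -10.0])              # 2 harmful (faulty or malicious) nodes
    readings = np.concatenate([honest, harmful])
    trust = np.ones_like(readings)                 # every node starts fully trusted
    for _ in range(10):                            # repeated sensing rounds
        trust, consensus = weighted_trust_round(readings, trust)
    print("consensus reading:", round(consensus, 2))
    print("nodes flagged as harmful:", np.where(trust < 0.5)[0])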

Journal ArticleDOI
TL;DR: In this paper, the authors argue that there exists a general confusion within the foundational literature arising from the improper scrambling of two different meanings of quantum contextuality: epistemic interpretation of contextuality and purely formal interpretation.
Abstract: In this paper we attempt to analyze the physical and philosophical meaning of quantum contextuality. We will argue that there exists a general confusion within the foundational literature arising from the improper “scrambling” of two different meanings of quantum contextuality. While the first one, introduced by Bohr, is related to an epistemic interpretation of contextuality which stresses the incompatibility (or complementarity) of measurement situations described in classical terms; the second meaning of contextuality is related to a purely formal understanding of contextuality as exposed by the Kochen–Specker (KS) theorem which focuses instead on the constraints of the orthodox quantum formalism in order to interpret projection operators as preexistent or actual (definite valued) properties. We will show how these two notions have been scrambled together creating an “omelette of contextuality” which has been widely spread through a popularized “epistemic explanation” of the KS theorem according to which: the measurement outcome of the observable A when measured together with B or together with C will necessarily differ in case $$[A, B] = [A, C] = 0$$ and $$[B, C] \ne 0$$. We will show why this statement is not only improperly scrambling epistemic and formal perspectives, but is also physically and philosophically meaningless. Finally, we analyze the consequences of such a widespread epistemic reading of the KS theorem as related to statistical statements of measurement outcomes.

Journal ArticleDOI
TL;DR: In this article, the authors compare some sophisticated ideas and approaches for the treatment of the problem of time and its asymmetry by thoroughly considering various aspects of the second law of thermodynamics, nonequilibrium entropy, entropy production, and irreversibility.
Abstract: In this survey, we discuss and analyze foundational issues of the problem of time and its asymmetry from a unified standpoint. Our aim is to discuss concisely the current theories and underlying notions, including interdisciplinary aspects, such as the role of time and temporality in quantum and statistical physics, biology, and cosmology. We compare some sophisticated ideas and approaches for the treatment of the problem of time and its asymmetry by thoroughly considering various aspects of the second law of thermodynamics, nonequilibrium entropy, entropy production, and irreversibility. The concept of irreversibility is discussed carefully and reanalyzed in this connection to clarify the concept of entropy production, which is a marked characteristic of irreversibility. The role of boundary conditions in the distinction between past and future is discussed with attention in this context. The paper also includes a synthesis of past and present research and a survey of methodology. It also analyzes some open questions in the field from a critical perspective.

Journal ArticleDOI
TL;DR: A set of hybrid and efficient genetic algorithms is proposed to solve the feature selection problem when the handled data has a large feature size; the algorithms can outperform the other feature selection algorithms and effectively enhance the classification performance over the tested datasets.
Abstract: Due to the huge amount of data being generated from different sources, analyzing and extracting useful information from these data becomes a very complex task. The difficulty of dealing with big data optimization problems comes from many factors, such as the high number of features and the existence of missing data. The feature selection process becomes an important step in many data mining and machine learning algorithms to reduce the dimensionality of the optimization problems and increase the performance of the classification or clustering algorithms. In this paper, a set of hybrid and efficient genetic algorithms is proposed to solve the feature selection problem when the handled data has a large feature size. The proposed algorithms use a new gene-weighted mechanism that can adaptively classify the features into strong relative features, weak or redundant features, and unstable features during the evolution of the algorithm. Based on this classification, the proposed algorithm gives the strong features high priority and the weak features less priority when generating new candidate solutions. At the same time, the proposed algorithm tries to concentrate more on unstable features that sometimes appear in and sometimes disappear from the best solutions of the population. The performance of the proposed algorithms is investigated by using different datasets and feature selection algorithms. The results show that our proposed algorithms can outperform the other feature selection algorithms and effectively enhance the classification performance over the tested datasets.
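The gene-weighted mechanism can be sketched with a toy genetic algorithm: each feature's weight tracks how often it appears in the elite (best) solutions, strong features are then kept with high probability, weak ones are mostly dropped, and unstable features (weights near 0.5) receive the most exploration. The fitness function below is a synthetic proxy with a known set of relevant features, not the classifiers used in the paper, and all numeric settings are assumptions.

import numpy as np

rng = np.random.default_rng(42)
N_FEATURES, POP, GENS, ELITE = 30, 40, 60, 8
RELEVANT = {2, 5, 11, 17, 23}            # synthetic ground truth (toy fitness)

def fitness(mask):
    """Toy stand-in for classification accuracy: reward the relevant features,
    slightly penalise every extra selected feature."""
    hits = sum(mask[i] for i in RELEVANT)
    return hits - 0.1 * (mask.sum() - hits)

def update_weights(weights, elite_masks, lr=0.2):
    """Gene-weighted mechanism (illustrative): move each feature's weight toward
    its frequency in the elite solutions. High weight ~ strong feature,
    low ~ weak or redundant, mid-range ~ unstable."""
    return (1 - lr) * weights + lr * elite_masks.mean(axis=0)

pop = rng.integers(0, 2, size=(POP, N_FEATURES))
weights = np.full(N_FEATURES, 0.5)

for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    elite = pop[np.argsort(scores)[-ELITE:]]
    weights = update_weights(weights, elite)
    # Strong features are kept with high probability, weak ones dropped;
    # unstable features (weights near 0.5) get the widest exploration noise.
    explore = 0.15 * (1.0 - 2.0 * np.abs(weights - 0.5))
    probs = np.clip(weights + rng.uniform(-1, 1, (POP, N_FEATURES)) * explore, 0, 1)
    pop = (rng.random((POP, N_FEATURES)) < probs).astype(int)
    pop[0] = elite[-1]                   # elitism: keep the best solution found

print("selected features:", sorted(np.flatnonzero(pop[0])))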

Journal ArticleDOI
TL;DR: This paper explores OSNs’ threats and challenges, categorizes them into account-based, URL-based and content-based threats, and proposes a comprehensive, user-level, proactive and real-time OSN protection system called Hybrid Real-time Social Networks Protector (HRSP).
Abstract: The impact of Online Social Networks (OSNs) on human lives is foreseen to be very large, with unprecedented amounts of data and users. OSN users share their ideas, photos, daily life events, feelings and news. Since OSNs’ security and privacy challenges are greater than ever before, it is necessary to enhance the protection and filtering approaches for OSN content. This paper explores OSNs’ threats and challenges and categorizes them into account-based, URL-based and content-based threats. In addition, we analyze the existing protection methods and highlight their limitations and weaknesses. Based on that, we propose a comprehensive, user-level, proactive and real-time OSN protection system called Hybrid Real-time Social Networks Protector (HRSP). HRSP has three components: a user-level security protocol and two classification models. The protocol defines a structure for OSN cryptographic services, including encryption, access control and user authentication. The classification models employ machine learning, black lists, white lists and users’ feedback in order to classify URLs into Benign, Risk and Inappropriate classes, and contents into Benign, Hate speech and Inappropriate classes. We constructed two data sets of 150,000 URLs and 22,000 tweets to build and test the two classification models. Results show an overall accuracy of 93.2% for the URL model and 84.4% for the content model, while the protocol implementation produces compatible size and time overhead. The components of HRSP are integrated and have a design compatible with OSN platforms.
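A minimal sketch of how the URL classification component might combine white lists, black lists and a learned model, in the spirit of the description above. The lists, keywords and the keyword fallback are illustrative placeholders standing in for HRSP's trained machine-learning classifier and its 150,000-URL data set.

from urllib.parse import urlparse

# Illustrative placeholder lists; HRSP builds its models from real data sets.
WHITE_LIST = {"example.org", "wikipedia.org"}
BLACK_LIST = {"malware-download.example", "phishy-login.example"}
RISK_KEYWORDS = ("login", "verify", "free-prize", "download.exe")
INAPPROPRIATE_KEYWORDS = ("adult", "gamble")

def classify_url(url):
    """Classify a URL into Benign / Risk / Inappropriate: consult the white
    and black lists first, then fall back to a simple keyword heuristic
    (a stand-in for the trained classifier)."""
    host = urlparse(url).hostname or ""
    if host in WHITE_LIST:
        return "Benign"
    if host in BLACK_LIST:
        return "Risk"
    text = url.lower()
    if any(k in text for k in INAPPROPRIATE_KEYWORDS):
        return "Inappropriate"
    if any(k in text for k in RISK_KEYWORDS):
        return "Risk"
    return "Benign"

if __name__ == "__main__":
    for u in ("https://wikipedia.org/wiki/OSN",
              "http://phishy-login.example/verify",
              "http://news.example.com/free-prize"):
        print(u, "->", classify_url(u))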

Journal ArticleDOI
TL;DR: Transnational legal communication seeks to identify transnational legal regimes and attempts to establish channels and techniques for comprehensible communication of legal information to specified groups of recipients, as mentioned in this paper; it also strives to draw conclusions about possible inconsistencies in law.
Abstract: Transnational legal communication seeks to identify transnational legal regimes and attempts to establish channels and techniques for comprehensible communication of legal information to specified groups of recipients. It also strives to draw conclusions about possible inconsistencies in law. The approach is based on the cooperation of scientists within the areas of law and applied linguistics and the coordination of their efforts, in order to conduct research from various perspectives, share conclusions, develop more complete approaches, and achieve and mutually use more multilateral research results. It strives to reconcile legal research and linguistic research despite their very different paradigms. The paper aims to explain the nature of legal communication and to establish its general research questions and objectives. The study seeks to answer the question of what methods are to be used to communicate law comprehensibly to its recipients and to draw conclusions on the consistency of the legal regimes to be communicated. It accentuates that the solidarity necessary to achieve the objective of comprehensible and consistent law goes beyond the particular interests of individual sciences and is the foundation of the existence of the transnational legal communication community, independently of place of residence and scope of practical knowledge.

Journal ArticleDOI
TL;DR: In this article, local selective realism is elaborated in view of the shift from classical to quantum electrodynamics, in the context of the transition from one theory to its successor.
Abstract: This article elaborates local selective realism in view of the shifting from classical to quantum electrodynamics. After some introductory remarks, we critically address what we call global selective realism, hence setting forth the background for outlining local selective realism. When examining the transition from classical to quantum electrodynamics, we evaluate both continuities and discontinuities in observational features, mathematical structures, and ontological presuppositions. Our argument leads us to criticise the narrow understanding of limiting-case strategies, and to reject the claim that we need a fully coherent theoretical framework to account for the transition from one theory to its successor in the case of electrodynamics. We close with a few remarks on the scope of local selective realism.

Journal ArticleDOI
TL;DR: A dualistic model of human time is proposed in which each component has both an illusory and non-illusory (‘real’) aspect and provides experimental evidence for various spacetime cosmological assertions regarding human time.
Abstract: There is a long standing debate as to whether or not time is ‘real’ or illusory, and whether or not human time (the flow/passage of time) is a direct reflection of physical time. Differing spacetime cosmologies have opposing views. Exactly what human time entails has, in our opinion, led to the failure to resolve this ‘two times’ problem. To help resolve this issue we propose a dualistic model of human time in which each component (e.g. change, motion, and temporality) has both an illusory and non-illusory (‘real’) aspect. With the dualistic model we are able to provide experimental tests for all of the human time assertions of 10 chosen spacetime cosmologies. The illusory aspect of the ‘present,’ i.e. a ‘unique present’ was confirmed. An information gathering and utilizing system (IGUS) was constructed using a virtual reality (VR) apparatus allowing the observer to experientially roam back and forth along the worldline ad lib. The phenomenon of ‘change’ was experimentally found to be illusory at high frequency observation and non-illusory (‘real’) at low frequency observation, the latter phenomenon coinciding with ‘change’ referred to in the ‘Order of Time’ and ‘Relativity Refounded’ views. Additional experiments are presented indicating that both motion and temporality are dualistic. In sum, the dualistic model of human time allows for the existence of both illusory and non-illusory (‘real’) aspects of human time that are not in conflict with one another. It also provides experimental evidence for various spacetime cosmological assertions regarding human time.

Journal ArticleDOI
TL;DR: Two theorems showing that a conscious agent can consistently see geometric and probabilistic structures of space that are not necessarily in the world per se but are properties of the conscious agent itself are discussed, suggesting the need for a new theory which resolves the reverse mind-body problem.
Abstract: The Interface Theory of Perception, as stated by D. Hoffman, says that perceptual experiences do not approximate properties of an “objective” world; instead, they have evolved to provide a simplified, species-specific, user interface to the world. Conscious Realism states that the objective world consists of ‘conscious agents’ and their experiences. Under these two theses, consciousness creates all objects and properties of the physical world: the problem of explaining this process reverses the mind-body problem. In support of the interface theory I propose that our perceptions have evolved, not to report the truth, but to guide adaptive behaviors. Using evolutionary game theory, I state a theorem asserting that perceptual strategies that see the truth will, under natural selection, be driven to extinction by perceptual strategies of equal complexity but tuned instead to fitness. I then give a minimal mathematical definition of the essential elements of a “conscious agent.” Under the conscious realism thesis, this leads to a non-dualistic, dynamical theory of conscious process in which both observer and observed have the same mathematical structure. The dynamics raises the possibility of emergence of combinations of conscious agents, in whose experiences those of the component agents are entangled. In support of conscious realism, I discuss two more theorems showing that a conscious agent can consistently see geometric and probabilistic structures of space that are not necessarily in the world per se but are properties of the conscious agent itself. The world simply has to be amenable to such a construction on the part of the agent; and different agents may construct different (even incompatible) structures as seeming to belong to the world. This again supports the idea that any true structure of the world is likely quite different from what we see. I conclude by observing that these theorems suggest the need for a new theory which resolves the reverse mind-body problem, a good candidate for which is conscious agent theory.
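The evolutionary claim, that perceptions tuned to fitness outcompete perceptions tuned to truth, can be illustrated with a toy Monte Carlo comparison in the spirit of the Fitness-Beats-Truth theorem. The payoff function, the two-territory choice setting and all numbers are assumptions made for illustration; the theorem itself is established game-theoretically in the work described above.

import numpy as np

rng = np.random.default_rng(1)

def payoff(quantity):
    """Non-monotonic fitness: a middling resource quantity is best
    (too little starves, too much attracts predators or toxicity)."""
    return np.exp(-((quantity - 0.5) ** 2) / 0.02)

def run(trials=100_000):
    q = rng.random((trials, 2))                # two territories per trial
    truth_choice = q.argmax(axis=1)            # "truth" strategy sees quantities
    fitness_choice = payoff(q).argmax(axis=1)  # "fitness" strategy sees payoffs
    rows = np.arange(trials)
    return (payoff(q)[rows, truth_choice].mean(),
            payoff(q)[rows, fitness_choice].mean())

truth_avg, fitness_avg = run()
print(f"average payoff, truth-tuned  : {truth_avg:.3f}")
print(f"average payoff, fitness-tuned: {fitness_avg:.3f}")   # strictly higher here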

Journal ArticleDOI
TL;DR: A concrete avenue to de-stigmatization of mothers impacted by albinism exists by the application of principles of human rights, particularly equality and non-discrimination; contextual analysis of cultural dynamics including relevant ontology; meaningful participation of rights-claimants, such as peer groups of mothers; and accountability of governments and their obligation to ensure access to health information as a key social determinant of the right to health.
Abstract: In many parts of sub-Saharan Africa, mothers impacted by the genetic condition of albinism, whether as mothers of children with albinism or themselves with albinism, are disproportionately impacted by a constellation of health-related stigma, social determinants of health (SDH), and human rights violations. In a critical ethnographic study in Tanzania, we engaged with the voices of mothers impacted by albinism and key stakeholders to elucidate experiences of stigma. Their narratives revealed internalized subjective stigma, social stigma such as being ostracized by family and community, and structural stigma on account of lack of access to SDH. An analysis of health systems as SDH revealed stigmatizing attitudes and behaviours of healthcare providers, especially at the time of birth; a lack of access to timely quality health services, in particular skin and eye care; and a lack of health-related education about the cause and care of albinism. Gender inequality as another SDH featured prominently as an amplifier of stigma. The findings pose implications for research, policy, and practice. A concrete avenue to de-stigmatization of mothers impacted by albinism exists by the application of principles of human rights, particularly equality and non-discrimination; contextual analysis of cultural dynamics including relevant ontology; meaningful participation of rights-claimants, such as peer groups of mothers; and accountability of governments and their obligation to ensure access to health information as a key social determinant of the right to health.


Journal ArticleDOI
TL;DR: In this article, the authors introduce Boolean-like algebras of dimension n ($$n\mathrm{BA}$$s) having n constants and an operation q (generalised if-then-else) that induces a decomposition of the algebra into n factors through the so-called n-central elements.
Abstract: We introduce Boolean-like algebras of dimension n ($$n\mathrm{BA}$$s) having n constants $$\mathsf{e}_1,\ldots,\mathsf{e}_n$$, and an $$(n+1)$$-ary operation q (a “generalised if-then-else”) that induces a decomposition of the algebra into n factors through the so-called n-central elements. Varieties of $$n\mathrm{BA}$$s share many remarkable properties with the variety of Boolean algebras and with primal varieties. The $$n\mathrm{BA}$$s provide the algebraic framework for generalising the classical propositional calculus to the case of n perfectly symmetric truth-values. Every finite-valued tabular logic can be embedded into such an n-valued propositional logic, $$n\mathrm{CL}$$, and this embedding preserves validity. We define a confluent and terminating first-order rewriting system for deciding validity in $$n\mathrm{CL}$$, and, via the embeddings, in all the finite tabular logics.
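The (n+1)-ary operation q can be read concretely as a generalised if-then-else: q(x, y_1, ..., y_n) returns y_i when x equals the i-th constant e_i. In the sketch below the constants are encoded as the integers 0..n-1 (an assumption made only for illustration); for n = 2 the operation reduces to the familiar if-then-else of Boolean algebra.

def q(x, *branches):
    """Generalised if-then-else: with the n constants e_1..e_n encoded as
    0..n-1, q(x, y_1, ..., y_n) returns the branch selected by x."""
    if not 0 <= x < len(branches):
        raise ValueError("x must be one of the n designated constants")
    return branches[x]

# n = 2 recovers the classical if-then-else (here 0 selects the first branch).
assert q(0, "first-branch", "second-branch") == "first-branch"
assert q(1, "first-branch", "second-branch") == "second-branch"

# n = 3: a three-valued "case" over the truth-values {0, 1, 2}.
for x in range(3):
    print(x, "->", q(x, "low", "mid", "high"))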

Journal ArticleDOI
TL;DR: In this article, the authors present a historical-philological-foundational hypothesis on Tartaglia's fortifications corpus, as exposed in Book VI and La Gionta del sesto libro, with respect to the dates of the editions of the Quesiti (1546, 1554).
Abstract: Forums, I extensively analysed Tartaglia's corpus: the science of weights, geometry, arithmetic, mathematics and physics (trajectories of projectiles, fortifications), including its intelligibility science in military architecture. The latter is exposed in Book VI of the Quesiti et inventioni diverse (hereafter Quesiti). In the Quesiti there is La Gionta del sesto libro, a kind of appendix to Book VI containing drawings of the geometric shapes of the Italian fortifications. It is based on Euclidean geometry and other figures where a scale is displayed. The interest, which includes intellectual history and the cultural foundations of science, is: what role is played by La Gionta del sesto libro in the Quesiti? Is it an independent booklet/speech? If so, when did Tartaglia write it? In this paper, I present a historical-philological-foundational hypothesis on Tartaglia's fortifications corpus, as exposed in Book VI and La Gionta del sesto libro, with respect to the dates of the editions of the Quesiti (1546, 1554). This goal is important to clarify historically both the editorial role played by Curzio Troiano Navo (fl. 16th century) in Tartaglia's corpus before and after Tartaglia's death (1557) and Tartaglia's particular interest in and development of the subject of fortifications between 1537–1546 and 1554.

Journal ArticleDOI
TL;DR: Approaches to the problem of software verification are surveyed and a new proof for why there can be no general solution is offered.
Abstract: How can we be certain that software is reliable? Is there any method that can verify the correctness of software for all cases of interest? Computer scientists and software engineers have informally assumed that there is no fully general solution to the verification problem. In this paper, we survey approaches to the problem of software verification and offer a new proof for why there can be no general solution.
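For orientation, the classical diagonal argument against a fully general verifier can be replayed in code, assuming a hypothetical total procedure verifies(src, spec). This is only the textbook reduction, not the new proof offered in the paper itself.

# Suppose, for contradiction, that a total function `verifies(src, spec)` existed
# which decides, for any program source and any specification, whether the
# program meets the specification (here: "halts and returns 42 on empty input").

def verifies(src: str, spec: str) -> bool:    # hypothetical general verifier
    raise NotImplementedError("no such total, fully general procedure exists")

TROUBLE = '''
def trouble():
    # Ask the verifier about *this very program*:
    if verifies(TROUBLE, "halts and returns 42 on empty input"):
        while True:        # declared 'correct'  -> misbehave (never return)
            pass
    return 42              # declared 'incorrect' -> behave correctly
'''

# Whatever answer `verifies` gives about TROUBLE is wrong:
#  - if it answers True, trouble() loops forever and never returns 42;
#  - if it answers False, trouble() returns 42 and satisfies the spec.
# Hence no total, fully general `verifies` can exist.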

Journal ArticleDOI
TL;DR: The ‘experiment paradox’, as discussed by the authors, arises because any experiment performed on a physical system is invasive and thus establishes inevitable limits to the accuracy of any mathematical model.
Abstract: Modern physics is founded on two mainstays: mathematical modelling and empirical verification. These two assumptions are prerequisite for the objectivity of scientific discourse. Here we show, however, that they are contradictory, leading to the ‘experiment paradox’. We reveal that any experiment performed on a physical system is—by necessity—invasive and thus establishes inevitable limits to the accuracy of any mathematical model. We track its manifestations in both classical and quantum physics and show how it is overcome ‘in practice’ via the concept of environment. We argue that the unravelled paradox induces a new type of ‘ontic’ underdetermination, which has deep consequences for the methodological foundations of physics.

Journal ArticleDOI
TL;DR: This paper formally establishes the fundamental elements and postulates making up a first attempt at a theory of software engineering and knowledge engineering, which related disciplines can particularise and extend in order to benefit from it.
Abstract: The state of computing science and, particularly, software engineering and knowledge engineering is generally considered immature. The best starting point for achieving a mature engineering discipline is a solid scientific theory, and the primary reason behind the immaturity in these fields is precisely that computing science still has no such agreed-upon underlying theory. As is done for theories in other fields of science, this paper formally establishes the fundamental elements and postulates making up a first attempt at a theory in this field, considering the features and peculiarities of computing science. The fundamental elements of this approach are informons and holons, and the result is a general and comprehensive theory of software engineering and knowledge engineering that related disciplines (e.g., information systems) can particularise and/or extend in order to benefit from it (in the sense of Lakatos' concepts of a core theory and protective-belt theories).

Journal ArticleDOI
TL;DR: In this paper, it was shown that quantum probability can be interpreted as epistemic, despite its non-Kolmogorovian structure, by considering the macroscopic contexts associated with measurement procedures and the microscopic contexts underlying them.
Abstract: According to a standard view, quantum mechanics (QM) is a contextual theory and quantum probability does not satisfy Kolmogorov’s axioms. We show, by considering the macroscopic contexts associated with measurement procedures and the microscopic contexts (μ-contexts) underlying them, that one can interpret quantum probability as epistemic, despite its non-Kolmogorovian structure. To attain this result we introduce a predicate language L(x), a classical probability measure on it and a family of classical probability measures on sets of μ-contexts, each element of the family corresponding to a (macroscopic) measurement procedure. By using only Kolmogorovian probability measures we can thus define mean conditional probabilities on the set of properties of any quantum system that admit an epistemic interpretation but are not bound to satisfy Kolmogorov’s axioms. The generalized probability measures associated with states in QM can then be seen as special cases of these mean probabilities, which explains how they can be non-classical and provides them with an epistemic interpretation. Moreover, the distinction between compatible and incompatible properties is explained in a natural way, and purely theoretical classical conditional probabilities coexist with empirically testable quantum conditional probabilities.

Journal ArticleDOI
TL;DR: In this paper, the authors present an approach to represent ecological systems using reaction networks, and show how a particular framework called chemical organization theory (COT) sheds new light on the longstanding complexity-stability debate.
Abstract: We present a novel approach to represent ecological systems using reaction networks, and show how a particular framework called chemical organization theory (COT) sheds new light on the longstanding complexity–stability debate. Namely, COT provides a novel conceptual landscape plenty of analytic tools to explore the interplay between structure and stability of ecological systems. Given a large set of species and their interactions, COT identifies, in a computationally feasible way, each and every sub-collection of species that is closed and self-maintaining. These sub-collections, called organizations, correspond to the groups of species that can survive together (co-exist) in the long-term. Thus, the set of organizations contains all the stable regimes that can possibly happen in the dynamics of the ecological system. From here, we propose to conceive the notion of stability from the properties of the organizations, and thus apply the vast knowledge on the stability of reaction networks to the complexity–stability debate. As an example of the potential of COT to introduce new mathematical tools, we show that the set of organizations can be equipped with suitable joint and meet operators, and that for certain ecological systems the organizational structure is a non-boolean lattice, providing in this way an unexpected connection between logico-algebraic structures, popular in the foundations of quantum theory, and ecology.