
Showing papers in "Information - An International Interdisciplinary Journal" in 2014


Journal ArticleDOI
TL;DR: An approach drawn from the ideas of computer systems modelling is used to produce a model for information itself, which includes what information technology practitioners call “non-functional” attributes, which, for information, include information quality and information friction.
Abstract: This paper uses an approach drawn from the ideas of computer systems modelling to produce a model for information itself. The model integrates evolutionary, static and dynamic views of information and highlights the relationship between symbolic content and the physical world. The model includes what information technology practitioners call “non-functional” attributes, which, for information, include information quality and information friction. The concepts developed in the model enable a richer understanding of Floridi’s questions “what is information?” and “the informational circle: how can information be assessed?” (which he numbers P1 and P12).

91 citations


Journal ArticleDOI
TL;DR: This paper develops the case that there is a plausible underlying reality: one actual spacetime-based history, although with behavior that appears strange when analyzed dynamically (one time-slice at a time), by using a simple model with no dynamical laws.
Abstract: Despite various parallels between quantum states and ordinary information, quantum no-go-theorems have convinced many that there is no realistic framework that might underly quantum theory, no reality that quantum states can represent knowledge about. This paper develops the case that there is a plausible underlying reality: one actual spacetime-based history, although with behavior that appears strange when analyzed dynamically (one time-slice at a time). By using a simple model with no dynamical laws, it becomes evident that this behavior is actually quite natural when analyzed "all-at-once" (as in classical action principles). From this perspective, traditional quantum states would represent incomplete information about possible spacetime histories, conditional on the future measurement geometry. Without dynamical laws imposing additional restrictions, those histories can have a classical probability distribution, where exactly one history can be said to represent an underlying reality.

54 citations


Journal ArticleDOI
TL;DR: An emerging conceptual framework suggesting that the origin of life may be identified as a transition in causal structure and information flow is reviewed, and some of the implications for understanding the early stages of chemical evolution are detailed.
Abstract: Biological systems represent a unique class of physical systems in how they process and manage information. This suggests that changes in the flow and distribution of information played a prominent role in the origin of life. Here I review and expand on an emerging conceptual framework suggesting that the origin of life may be identified as a transition in causal structure and information flow, and detail some of the implications for understanding the early stages of chemical evolution.

51 citations


Journal ArticleDOI
TL;DR: A new intuitionistic fuzzy (IF) entropy is put forward that considers both the uncertainty and the hesitancy degree of IF sets; based on the new entropy measure, a new method for multi-attribute decision making problems, in which attribute values are expressed with IF values, is subsequently put forward.
Abstract: In this paper, firstly, a new intuitionistic fuzzy (IF) entropy has been put forward, which considers both the uncertainty and the hesitancy degree of IF sets. Through comparison with other entropy measures, the advantage of the new entropy measure is obvious. Secondly, based on the new entropy measure, a new decision making method for multi-attribute decision making problems, in which attribute values are expressed with IF values, was subsequently put forward. For the cases in which the attribute weights are completely unknown and in which they are partially known, two methods were constructed to determine them. One is an extension of the ordinary entropy weight method, and the other constructs an optimal model according to the minimum entropy principle. Finally, two practical examples are given to illustrate the effectiveness and practicability of the proposed method.
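To make the entropy-weight step concrete, here is a minimal sketch, not the authors' formulas: an illustrative intuitionistic fuzzy entropy that grows with both fuzziness and hesitancy is averaged per attribute, and attribute weights follow from the standard entropy weight rule. The decision matrix and the entropy expression are assumptions.

```python
import numpy as np

# Hypothetical decision matrix: rows = alternatives, columns = attributes,
# each entry an intuitionistic fuzzy value (membership mu, non-membership nu).
ifs_matrix = np.array([
    [(0.6, 0.2), (0.5, 0.3), (0.7, 0.1)],
    [(0.4, 0.4), (0.6, 0.2), (0.5, 0.4)],
    [(0.7, 0.1), (0.3, 0.5), (0.6, 0.3)],
])  # shape: (alternatives, attributes, 2)

def if_entropy(mu, nu):
    """Illustrative IF entropy: larger when mu and nu are close and hesitancy is high.
    (Placeholder formula, not the entropy proposed in the paper.)"""
    pi = 1.0 - mu - nu  # hesitancy degree
    return (min(mu, nu) + pi) / (max(mu, nu) + pi)

# Average entropy of each attribute (column).
col_entropy = np.array([
    np.mean([if_entropy(mu, nu) for mu, nu in ifs_matrix[:, j]])
    for j in range(ifs_matrix.shape[1])
])

# Entropy weight rule: lower entropy -> more discriminative -> higher weight.
weights = (1.0 - col_entropy) / np.sum(1.0 - col_entropy)
print("attribute entropies:", np.round(col_entropy, 3))
print("attribute weights:  ", np.round(weights, 3))
```

Attributes whose IF values separate the alternatives more sharply end up with lower entropy and therefore larger weights, which is the intuition the entropy weight method formalizes.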

42 citations


Journal ArticleDOI
TL;DR: Gödel's incompleteness argument is explored here to devise an engine where an astronomically large number of "if-then" arguments are allowed to grow by self-assembly, based on the basic set of arguments written in the system, thus exploring a beyond-Turing path of computing that follows a fundamentally different route from the non-Turing adventures of the last half century.
Abstract: Here, we introduce a new class of computer which does not use any circuit or logic gate. In fact, no program needs to be written: it learns by itself and writes its own program to solve a problem. Gödel's incompleteness argument is explored here to devise an engine where an astronomically large number of "if-then" arguments are allowed to grow by self-assembly, based on the basic set of arguments written in the system; thus, we explore the beyond-Turing path of computing, but following a fundamentally different route from the non-Turing adventures of the last half century. Our hardware is a multilayered seed structure. If we open the largest seed, which is the final hardware, we find several computing seed structures inside; if we take any of them and open it, there are several computing seeds inside. We design and synthesize the smallest seed, and the entire multilayered architecture grows by itself. The electromagnetic resonance band of each seed looks similar, but the seeds of any layer share a common region in their resonance band with the inner and upper layers; hence, a chain of resonance bands is formed (frequency fractal) connecting the smallest to the largest seed (hence the name invincible rhythm or Ajeya Chhandam in Sanskrit). The computer solves an intractable pattern search (Clique) problem without searching, since the right pattern written in it spontaneously replies back to the questioner. To learn, the hardware filters any kind of sensory input image into several layers of images, each containing basic geometric polygons (fractal decomposition), and builds a network among all layers; multi-sensory images are connected in all possible ways to generate "if" and "then" arguments. Several such arguments and decisions (phase transitions from "if" to "then") self-assemble and form the two giant columns of arguments and rules of phase transition. Any input question is converted into a pattern as noted above, and these two astronomically large columns project a solution. The driving principle of computing is synchronization and de-synchronization of network paths; the system drives towards the highest density of coupled arguments for maximum matching. Memory is located at all layers of the hardware. Learning and computing occur everywhere simultaneously. Since the resonance chain connects all computing seeds, wireless processing is feasible without a screening effect. The computing power is increased by maximizing the density of resonance states and the bandwidth of the resonance chain together. We discovered this remarkable computing while studying the human brain, so we present a new model of the human brain in terms of an experimentally determined resonance chain with bandwidth from 10^-15 Hz (complete brain with all sensors) to 10^+15 Hz (DNA), along with its implementation using a pure organic synthesis of the entire computer (brain jelly) in our lab, a software prototype as proof of concept and, finally, a new fourth-circuit-element (Hinductor)-based beyond-CMOS (complementary metal-oxide semiconductor) hardware.

38 citations


Journal ArticleDOI
TL;DR: This work proposes a measure of the complexity of a system that is largely orthogonal to computational, information theoretic, or thermodynamic conceptions of structural complexity, which is a separate dimension of system complexity that measures the degree to which it exhibits discrete levels of nonlinear dynamical organization.
Abstract: We argue that a critical difference distinguishing machines from organisms and computers from brains is not complexity in a structural sense, but a difference in dynamical organization that is not well accounted for by current complexity measures. We propose a measure of the complexity of a system that is largely orthogonal to computational, information theoretic, or thermodynamic conceptions of structural complexity. What we call a system's dynamical depth is a separate dimension of system complexity that measures the degree to which it exhibits discrete levels of nonlinear dynamical organization in which successive levels are distinguished by local entropy reduction and constraint generation. A system with greater dynamical depth than another consists of a greater number of such nested dynamical levels. Thus, a mechanical or linear thermodynamic system has less dynamical depth than an inorganic self-organized system, which has less dynamical depth than a living system. Including an assessment of dynamical depth can provide a more precise and systematic account of the fundamental difference between inorganic systems (low dynamical depth) and living systems (high dynamical depth), irrespective of the number of their parts and the causal relations between them.

37 citations


Journal ArticleDOI
TL;DR: The paper's findings may raise awareness of the areas where existing knowledge is prone to "leakage" and help managers of SMEs better cope with risks related to knowledge leakage and better exploit the (limited) knowledge base available.
Abstract: In this paper, we look into knowledge leakages and ways to address them. The study is conducted from the point of view of small and medium-sized enterprises (SMEs), as their specific attributes create unique challenges. Based on a discussion of the relevant fields, ways to reduce the danger of knowledge leakages are presented. For practitioners, the paper's findings may raise awareness of the areas where existing knowledge is prone to "leakage". This can assist managers of SMEs to better cope with risks related to knowledge leakage and, therefore, better exploit the (limited) knowledge base available.

33 citations


Journal ArticleDOI
TL;DR: A model that characterizes CSFs for the identification activity and highlights the CSFs' contribution to knowledge retention is proposed.
Abstract: In this paper, the authors demonstrate the suitability of IT-supported knowledge repositories for knowledge retention. Successful knowledge retention is dependent on what is stored in a repository and, hence, possible to share. Accordingly, the ability to capture the right (relevant) knowledge is a key aspect. Therefore, to increase the quality in an IT-supported knowledge repository, the identification activity, which starts the capture process, must be successfully performed. While critical success factors (CSFs) for knowledge retention and knowledge management are frequently discussed in the literature, there is a knowledge gap concerning CSFs for this specific knowledge capture activity. From a knowledge retention perspective, this paper proposes a model that characterizes CSFs for the identification activity and highlights the CSFs' contribution to knowledge retention.

28 citations


Journal ArticleDOI
TL;DR: The Shelf Detector project aims to address the lack of data and knowledge surrounding the shelf-out-of-stock problem, providing large amounts of data and interesting insights for store and marketing teams.
Abstract: Shelf-out-of-stock is one of the leading motivations for technology innovation in the shelf of the future. The Shelf Detector project described in this paper aims to address the lack of data and knowledge surrounding the shelf-out-of-stock problem. The paper focuses mainly on the information layer of the system; the main novelties illustrated in this work lie in the information field, demonstrating the large number of insights that can be derived from a tool able to gather data in real time from the store. The tool presented is the first to be installed over a long period in a large number of stores and across many products, demonstrating the ability to gather data and extract interesting insights. This paper aims to demonstrate the feasibility and scalability of our system in providing large amounts of data and interesting insights for store and marketing teams. The cloud-based architecture developed and tested in this project is a key feature of our system, together with the ability to collect data from a distributed sensor network.
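The abstract stays at the level of the information layer, so the following is only a guess at the simplest insight such a tool could surface: given per-shelf item counts over time, flag out-of-stock intervals and report per-product out-of-stock hours. The sensor semantics, sampling rate and threshold are invented.

```python
# Invented readings from shelf sensors: items detected on each shelf, sampled hourly.
shelf_readings = {
    "cereal_A": [12, 9, 5, 2, 0, 0, 0, 14, 13],
    "cereal_B": [8, 8, 7, 7, 6, 5, 5, 4, 4],
}
SAMPLE_HOURS = 1

def out_of_stock_hours(readings, threshold=0):
    """Total hours a shelf spends at or below the out-of-stock threshold."""
    return sum(SAMPLE_HOURS for count in readings if count <= threshold)

# One of the insights a store or marketing team could consume.
report = {product: out_of_stock_hours(series) for product, series in shelf_readings.items()}
print("out-of-stock hours per product:", report)
```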

26 citations


Journal ArticleDOI
TL;DR: This review examines some particular, but important and basic aspects of information, and the close connection of information algebras to logic and domain theory will be exhibited.
Abstract: This review examines some particular, but important and basic aspects of information: Information is related to questions and should provide at least partial answers. Information comes in pieces, and it should be possible to aggregate these pieces. Finally, it should be possible to extract that part of a piece of information which relates to a given question. Modeling these concepts leads to an algebraic theory of information. This theory centers around two different but closely related types of information algebras, each containing operations for aggregation or combination of information and for extracting information relevant to a given question. Generic constructions of instances of such algebras are presented. In particular, the close connection of information algebras to logic and domain theory will be exhibited.
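As a concrete toy instance of the two operations the review centres on, the sketch below treats pieces of information as relations (sets of tuples over variables), with combination as natural join and extraction as projection onto the variables of a question. Relational databases are a standard example of an information algebra, but the tables here are invented for illustration.

```python
# Toy information algebra: pieces of information are relations over variables,
# combination is natural join, extraction (focusing) is projection.

def combine(r1, vars1, r2, vars2):
    """Natural join of two relations given as lists of dicts over variable names."""
    common = [v for v in vars1 if v in vars2]
    joined_vars = list(dict.fromkeys(vars1 + vars2))
    joined = [{**t1, **t2} for t1 in r1 for t2 in r2
              if all(t1[v] == t2[v] for v in common)]
    return joined, joined_vars

def focus(rel, variables, question_vars):
    """Extract the part of a relation relevant to a given question (projection)."""
    keep = [v for v in variables if v in question_vars]
    seen, out = set(), []
    for t in rel:
        key = tuple(t[v] for v in keep)
        if key not in seen:
            seen.add(key)
            out.append({v: t[v] for v in keep})
    return out, keep

# Two invented pieces of information.
weather = [{"city": "Bern", "sky": "rain"}, {"city": "Basel", "sky": "sun"}]
advice = [{"sky": "rain", "gear": "umbrella"}, {"sky": "sun", "gear": "hat"}]

both, both_vars = combine(weather, ["city", "sky"], advice, ["sky", "gear"])
answer, _ = focus(both, both_vars, {"city", "gear"})
print(answer)  # partial answer to the question "which gear for which city?"
```

Combination aggregates pieces of information and focusing extracts what is relevant to a question; these are precisely the two operations the algebraic theory axiomatizes.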

21 citations


Journal ArticleDOI
TL;DR: A novel approach to BPS is presented that exploits interval-valued data to represent model parameters, in place of conventional single-valued or probability-valued parameters, and is developed as an extension of a publicly available simulation engine based on the Business Process Model and Notation standard.
Abstract: Simulating organizational processes characterized by interacting human activities, resources, business rules and constraints is a challenging task because of the inherent uncertainty, inaccuracy, variability and dynamicity. With regard to this problem, currently available business process simulation (BPS) methods and tools are unable to efficiently capture the process behavior along its lifecycle. In this paper, a novel approach to BPS is presented. To build and manage simulation models according to the proposed approach, a simulation system is designed, developed and tested on pilot scenarios, as well as on real-world processes. The proposed approach exploits interval-valued data to represent model parameters, in place of conventional single-valued or probability-valued parameters. Indeed, an interval-valued parameter is comprehensive; it is the easiest to understand and express and the simplest to process among multi-valued representations. In order to compute the interval-valued output of the system, a genetic algorithm is used. The resulting process model allows forming mappings at different levels of detail and, therefore, at different model resolutions. The system has been developed as an extension of a publicly available simulation engine, based on the Business Process Model and Notation (BPMN) standard.
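The paper does not spell out its encoding, so the following is only a rough sketch of the interval-valued idea: each model parameter is an interval, a genetic algorithm searches point values inside those intervals for the extremes of a process metric, and the two extremes form the interval-valued output. The toy process model, fitness and GA settings are assumptions.

```python
import random

# Hypothetical interval-valued parameters of a toy two-step process (minutes).
intervals = {"task_a": (3.0, 6.0), "task_b": (2.0, 8.0), "review_prob": (0.1, 0.4)}

def cycle_time(p):
    """Toy process model: expected cycle time with an optional review loop."""
    return p["task_a"] + p["task_b"] + p["review_prob"] * (p["task_a"] + p["task_b"])

def genetic_extreme(objective, sign, pop=40, gens=60, mut=0.2):
    """Simple GA pushing the objective toward its minimum (sign=+1) or maximum (sign=-1)."""
    keys = list(intervals)
    random_individual = lambda: {k: random.uniform(*intervals[k]) for k in keys}
    population = [random_individual() for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda ind: sign * objective(ind))
        parents = population[: pop // 2]
        children = []
        while len(children) < pop - len(parents):
            a, b = random.sample(parents, 2)
            child = {k: random.choice((a[k], b[k])) for k in keys}   # uniform crossover
            if random.random() < mut:                                # mutate within bounds
                k = random.choice(keys)
                child[k] = random.uniform(*intervals[k])
            children.append(child)
        population = parents + children
    best = min(population, key=lambda ind: sign * objective(ind))
    return objective(best)

low = genetic_extreme(cycle_time, +1)    # lower bound of the output interval
high = genetic_extreme(cycle_time, -1)   # upper bound of the output interval
print(f"interval-valued cycle time estimate: [{low:.2f}, {high:.2f}] minutes")
```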

Journal ArticleDOI
TL;DR: This article is an attempt to capture some of the major developments and currents of thought in information theory and the relations between them, and has related key concepts in each domain to my non-standard extension of logic to real processes that I call Logic in Reality (LIR).
Abstract: This article is an attempt to capture, in a reasonable space, some of the major developments and currents of thought in information theory and the relations between them. I have particularly tried to include changes in the views of key authors in the field. The domains addressed range from mathematical-categorial, philosophical and computational approaches to systems, causal-compositional, biological and religious approaches and messaging theory. I have related key concepts in each domain to my non-standard extension of logic to real processes that I call Logic in Reality (LIR). The result is not another attempt at a General Theory of Information such as that of Burgin, or a Unified Theory of Information like that of Hofkirchner. It is not a compendium of papers presented at a conference, more or less unified around a particular theme. It is rather a highly personal, limited synthesis which nonetheless may facilitate comparison of insights, including contradictory ones, from different lines of inquiry. As such, it may be an example of the concept proposed by Marijuan, still little developed, of the recombination of knowledge. Like the best of the work to which it refers, the finality of this synthesis is the possible contribution that an improved understanding of the nature and dynamics of information may make to the ethical development of the information society.

Journal ArticleDOI
TL;DR: A broader picture is presented in which information is associated with epistemic structures, which form cognitive infological systems as basic recipients and creators of cognitive information.
Abstract: Information is usually related to knowledge. Here, we present a broader picture in which information is associated with epistemic structures, which form cognitive infological systems as basic recipients and creators of cognitive information. Infological systems are modeled by epistemic spaces, while operators in these spaces are mathematical models of information. Information that acts on epistemic structures is called cognitive information, while information that acts on knowledge structures is called epistemic information. The latter brings new and updates existing knowledge, being of primary importance to people. In this paper, both types of information are studied as operators in epistemic spaces based on the general theory of information. As a synthetic approach, which reveals the essence of information, organizing and encompassing all main directions in information theory, the general theory of information provides efficient means for such a study. Different types of information dynamics representation use tools from various mathematical disciplines, such as the theory of categories, functional analysis, mathematical logic and algebra. In this paper, we base our exploration of information and knowledge dynamics on functional analysis further developing the mathematical stratum of the general theory of information.

Journal ArticleDOI
TL;DR: The main objective of this treatise is to outline another conceptual step forward by employing Grothendieck’s dessins d’enfants to reveal the topological and (non)algebraic machinery underlying the measurement acts and their information content.
Abstract: Wheeler's observer-participancy and the related it from bit credo refer to quantum non-locality and contextuality. The mystery of these concepts begins to unveil if one encodes the (in)compatibilities between qubit observables in the relevant finite geometries. The main objective of this treatise is to outline another conceptual step forward by employing Grothendieck's dessins d'enfants to reveal the topological and (non)algebraic machinery underlying the measurement acts and their information content.

Journal ArticleDOI
TL;DR: In this article, the authors proposed a method that protects the data transmitted between ADS-B sensors and Air Traffic Control (ATC) using Simple Public Key Infrastructure (SPKI) certificates and symmetric cryptography.
Abstract: Communications, Navigation, Surveillance/Air Traffic Management (CNS/ATM) systems utilize digital technologies, satellite systems, and various levels of automation to facilitate seamless global air traffic management. Automatic Dependent Surveillance-Broadcast (ADS-B), the core component of CNS/ATM, broadcasts important monitoring information, such as the location, altitude, and direction of aircraft, to the ground. However, ADS-B data are transmitted in an unencrypted (or unprotected) communication channel between ADS-B sensors and Air Traffic Control (ATC). Consequently, these data are vulnerable to security threats, such as spoofing, eavesdropping, and data modification. In this paper, we propose a method that protects the ADS-B data transmitted between ADS-B sensors and ATC using Simple Public Key Infrastructure (SPKI) certificates and symmetric cryptography. The SPKI certificates are used to grant transmission authorization to the ADS-B sensors, while symmetric cryptography is used to encrypt/decrypt the ADS-B data transmitted between the ADS-B sensors and ATC. The proposed security framework comprises an ADS-B sensor authentication module, an encrypted data processing module, and an ADS-B sensor information management module. We believe that application of the proposed security framework to CNS/ATM will enable it to effectively obviate security threats, such as ground station flood denial, ground station target ghost injection, and ADS-B data modification.
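The framework pairs SPKI certificates for sensor authorization with symmetric cryptography on the sensor-to-ATC link; the sketch below shows only the symmetric half, using a standard authenticated scheme (Fernet from the Python `cryptography` package) on an invented ADS-B-like report. Key distribution via SPKI certificates and the real message format are not modelled here and are assumptions.

```python
from cryptography.fernet import Fernet

# Assumed to be shared with an authorized ADS-B sensor out of band
# (in the paper, authorization is established via SPKI certificates).
shared_key = Fernet.generate_key()
channel = Fernet(shared_key)

# Invented ADS-B-like surveillance report: ICAO address, position, altitude, heading.
plaintext_report = b"ICAO=71BC01;lat=37.5665;lon=126.9780;alt=35000ft;hdg=270"

# Sensor side: encrypt (and authenticate) the report before sending it to ATC.
ciphertext = channel.encrypt(plaintext_report)

# ATC side: decrypt; a tampered token or a wrong key raises an exception.
recovered = channel.decrypt(ciphertext)
assert recovered == plaintext_report
print("ATC received:", recovered.decode())
```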

Journal ArticleDOI
TL;DR: It is shown that with reasonable physical assumptions the symbol grounding problem and the quantum system identification problem are equivalent and demonstrably unsolvable.
Abstract: The symbol grounding problem is the problem of specifying a semantics for the representations employed by a physical symbol system in a way that is neither circular nor regressive. The quantum system identification problem is the problem of relating observational outcomes to specific collections of physical degrees of freedom, i.e., to specific Hilbert spaces. It is shown that with reasonable physical assumptions these problems are equivalent. As the quantum system identification problem is demonstrably unsolvable by finite means, the symbol grounding problem is similarly unsolvable.

Journal ArticleDOI
TL;DR: How a new understanding of the “natural information flows” as they prototypically occur in living beings—even in the simplest cells—could provide a sound basis for reappraising fundamental problems of the new science is discussed.
Abstract: The extraordinary scientific-technical, economic, and social transformations related to the widespread use of computers and to the whole information and communication technologies have not been accompanied by the development of a scientific “informational” perspective helping make a coherent sense of the spectacular changes occurring. Like in other industrial revolutions of the past, technical praxis antedates the emergence of theoretical disciplines. Apart from the difficulties in handling new empirical domains and in framing new ways of thinking, the case of information science implies the difficult re-evaluation of important bodies of knowledge already well accommodated in specific disciplines. Herein, we will discuss how a new understanding of the “natural information flows” as they prototypically occur in living beings—even in the simplest cells—could provide a sound basis for reappraising fundamental problems of the new science. The role of a renewed information science, multidisciplinarily conceived and empirically grounded, widely transcends the limited “library” and knowledge-repositories mission into which classical information science was cajoled during past decades. Paraphrasing the Spanish philosopher J. Ortega y Gasset, the overhaul of information science itself becomes “the challenge of our time”.

Journal ArticleDOI
TL;DR: This paper is an initial attempt to present the fundamental physics of non-quantum information in terms of a novel non-linguistic logic, applying LIR as a critique of current approaches to the physical grounding of information, focusing on its qualitative dualistic aspects at non-Quantum levels as a set of physical processes embedded in a physical world.
Abstract: A consensus is emerging that the multiple forms, functions and properties of information cannot be captured by a simple categorization into classical and quantum information. Similarly, it is unlikely that the applicable physics of information is a single classical discipline, completely expressible in mathematical terms, but rather a complex, multi- and trans-disciplinary field involving deep philosophical questions about the underlying structure of the universe. This paper is an initial attempt to present the fundamental physics of non-quantum information in terms of a novel non-linguistic logic. Originally proposed by the Franco-Romanian thinker Stephane Lupasco (1900–1988), this logic, grounded in quantum mechanics, can reflect the dual aspects of real processes and their evolution at biological, cognitive and social levels of reality. In my update of this logical system—Logic in Reality (LIR)—a change in perspective is required on the familiar notions in science and philosophy of causality, continuity and discontinuity, time and space. I apply LIR as a critique of current approaches to the physical grounding of information, focusing on its qualitative dualistic aspects at non-quantum levels as a set of physical processes embedded in a physical world.

Journal ArticleDOI
TL;DR: The analysis of context-aware Apps presented in this paper shows that developing low-power methods based on it is feasible and can serve as a foundation for energy-efficient context-aware services in mobile environments.
Abstract: In recent years, a large portion of smartphone applications (Apps) has targeted context-aware services. They aim to perceive users' real-time context, such as their location, actions, or even emotions, and to provide various customized services based on the inferred context. However, context-awareness in mobile environments faces some challenging issues due to the limitations of the devices themselves. Limited power is regarded as the most critical problem in context-awareness on smartphones. Many studies have tried to develop low-power methods, but most of them have focused on the power consumption of H/W modules of smartphones such as the CPU and LCD. Only a few research papers have recently started to present S/W-based approaches to improve power consumption. That is, previous works did not consider the energy consumed by the context-awareness of Apps. Therefore, in this paper, we focus on the power consumption of context-aware Apps. We analyze the characteristics of context-aware Apps from the perspective of power consumption, and then define two main factors which significantly influence it: the sort of context that context-aware Apps require for their services and the way a user uses them. The experimental results show that it is reasonable and feasible to develop low-power methods based on our analysis. Thus, the analysis presented in this paper can serve as a foundation for energy-efficient context-aware services in mobile environments.
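To make the two factors tangible, here is a purely illustrative energy model in which the context source an App requires (e.g., GPS versus accelerometer) and the way the user runs it (sampling interval, hours of use) determine the daily sensing energy; the power figures and the model itself are assumptions for illustration, not measurements from the paper.

```python
# Illustrative per-sample energy cost of common context sources (joules, assumed values).
ENERGY_PER_SAMPLE_J = {"gps": 1.5, "wifi_scan": 0.6, "accelerometer": 0.02}

def daily_sensing_energy(context_source, sample_interval_s, active_hours):
    """Energy one App spends on context sensing per day (toy model)."""
    samples = active_hours * 3600 / sample_interval_s
    return samples * ENERGY_PER_SAMPLE_J[context_source]

# The same location-aware App under two usage patterns.
always_on = daily_sensing_energy("gps", sample_interval_s=5, active_hours=12)
duty_cycled = daily_sensing_energy("gps", sample_interval_s=60, active_hours=12)
print(f"always-on GPS:   {always_on / 3600:.2f} Wh/day")
print(f"duty-cycled GPS: {duty_cycled / 3600:.2f} Wh/day")
```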

Journal ArticleDOI
TL;DR: A novel method of facial expression recognition via non-negative least squares (NNLS) sparse coding is presented, and results indicate that the presented NNLS method performs better than other compared methods on facial expression recognition tasks.
Abstract: Sparse coding is an active research subject in signal processing, computer vision, and pattern recognition. A novel method of facial expression recognition via non-negative least squares (NNLS) sparse coding is presented in this paper. The NNLS sparse coding is used to form a facial expression classifier. To test the performance of the presented method, local binary patterns (LBP) and the raw pixels are extracted for facial feature representation. Facial expression recognition experiments are conducted on the Japanese Female Facial Expression (JAFFE) database. Compared with other widely used methods such as linear support vector machines (SVM), the sparse representation-based classifier (SRC), the nearest subspace classifier (NSC), K-nearest neighbors (KNN) and radial basis function neural networks (RBFNN), the experimental results indicate that the presented NNLS method performs better than the other methods on facial expression recognition tasks.
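The sketch below reconstructs the general NNLS sparse-coding classification idea on synthetic data: a test sample is coded as a non-negative combination of training samples (the dictionary), and the class whose columns give the smallest reconstruction residual wins. The synthetic vectors stand in for LBP or raw-pixel features; dimensions and data are invented, and this is not the authors' exact classifier.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

# Synthetic stand-ins for facial features (e.g., LBP histograms):
# 3 expression classes, 20 training samples each, 50-dimensional features.
n_per_class, n_classes, dim = 20, 3, 50
centers = rng.normal(size=(n_classes, dim))
train = np.vstack([centers[c] + 0.3 * rng.normal(size=(n_per_class, dim))
                   for c in range(n_classes)])            # shape (60, 50)
labels = np.repeat(np.arange(n_classes), n_per_class)

def nnls_classify(x, dictionary, labels):
    """Code x over the training dictionary with non-negative least squares,
    then assign the class with the smallest class-wise reconstruction residual."""
    coef, _ = nnls(dictionary.T, x)                        # columns = training samples
    residuals = []
    for c in np.unique(labels):
        coef_c = np.where(labels == c, coef, 0.0)          # keep only this class's atoms
        residuals.append(np.linalg.norm(x - dictionary.T @ coef_c))
    return int(np.argmin(residuals))

test = centers[1] + 0.3 * rng.normal(size=dim)             # a sample drawn near class 1
print("predicted expression class:", nnls_classify(test, train, labels))
```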

Journal ArticleDOI
TL;DR: A solution to the problem of accurately forecasting the incoming flow of Vietnam's Hoabinh Reservoir, based on a neural network trained with the Cuckoo Search algorithm, is presented, and it is expected that this work may be useful for hydrographic forecasting.
Abstract: The accuracy of reservoir flow forecasting has the most significant influence on the assurance of stability and annual operations of hydro-constructions. For instance, accurate forecasting of the ebb and flow of Vietnam's Hoabinh Reservoir can aid in the preparation and prevention of lowland flooding and drought, as well as regulating electric energy. This raises the need to propose a model that accurately forecasts the incoming flow of the Hoabinh Reservoir. In this study, a solution to this problem based on a neural network with the Cuckoo Search (CS) algorithm is presented. In particular, we used hydrographic data and predicted total incoming flows of the Hoabinh Reservoir over a period of 10 days. The Cuckoo Search algorithm was utilized to train the feedforward neural network (FNN) for prediction. The algorithm optimized the weights between layers and the biases of the neural network. Forecasting models for three different scenarios were developed. The constructed models have shown high forecasting performance based on the calculated performance indices. These results were also compared with those obtained from neural networks trained by particle swarm optimization (PSO) and back-propagation (BP), indicating that the proposed approach performed more effectively. Based on the experimental results, the scenario using the rainfall and the flow as input yielded the highest forecasting accuracy when compared with other scenarios. The performance criteria RMSE, MAPE, and R obtained by the CS-FNN in this scenario were 48.7161, 0.067268 and 0.8965, respectively. These results were highly correlated to actual values. It is expected that this work may be useful for hydrographic forecasting.
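A compressed sketch of the training idea rather than the authors' implementation: the weights and biases of a one-hidden-layer feedforward network are flattened into a vector, and a simplified Cuckoo Search (heavy-tailed random flights plus abandonment of poor nests) minimizes the forecasting error on invented rainfall/flow data. Network size, rates and data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented daily series: inputs = (rainfall, previous flow), target = next flow.
X = rng.random((200, 2))
y = 0.6 * X[:, 0] + 0.4 * X[:, 1] + 0.05 * rng.standard_normal(200)

n_in, n_hid = 2, 5
n_params = n_in * n_hid + n_hid + n_hid + 1                 # weights and biases

def fnn_forecast(params, X):
    """One-hidden-layer feedforward network with parameters packed in a flat vector."""
    W1 = params[:n_in * n_hid].reshape(n_in, n_hid)
    b1 = params[n_in * n_hid:n_in * n_hid + n_hid]
    w2 = params[n_in * n_hid + n_hid:-1]
    b2 = params[-1]
    return np.tanh(X @ W1 + b1) @ w2 + b2

def rmse(params):
    return np.sqrt(np.mean((fnn_forecast(params, X) - y) ** 2))

# Simplified Cuckoo Search: heavy-tailed flights plus abandoning a fraction of poor nests.
n_nests, pa, iterations = 15, 0.25, 300
nests = rng.normal(size=(n_nests, n_params))
fitness = np.array([rmse(n) for n in nests])
for _ in range(iterations):
    flights = 0.05 * rng.standard_cauchy(size=(n_nests, n_params))   # Levy-like steps
    candidates = nests + flights
    cand_fit = np.array([rmse(c) for c in candidates])
    better = cand_fit < fitness
    nests[better], fitness[better] = candidates[better], cand_fit[better]
    worst = np.argsort(fitness)[-int(pa * n_nests):]                  # abandon poor nests
    nests[worst] = rng.normal(size=(len(worst), n_params))
    fitness[worst] = np.array([rmse(n) for n in nests[worst]])

print(f"training RMSE of the CS-trained FNN: {fitness.min():.4f}")
```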

Journal ArticleDOI
TL;DR: A new multivariable self-tuning proportional-integral-derivative (PID) controller tuned optimally by an improved particle swarm optimization (IPSO) algorithm is proposed to control the two-input/two-output (TITO) ESR process.
Abstract: A mathematical model of the electroslag remelting (ESR) process is established based on its technical features and dynamic characteristics. A new multivariable self-tuning proportional-integral-derivative (PID) controller, tuned optimally by an improved particle swarm optimization (IPSO) algorithm, is proposed to control the two-input/two-output (TITO) ESR process. An adaptive chaotic migration mutation operator is used to handle particles trapped in the clustering field, in order to enhance the diversity of the particles in the population, prevent premature convergence and improve the search efficiency of the PSO algorithm. The simulation results show the feasibility and effectiveness of the proposed control method. The new method can overcome dynamic working conditions and the coupling features of the system over a wide range, and it has strong robustness and adaptability.
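To illustrate the optimization loop, though on a far simpler plant than the coupled two-input/two-output ESR model and without the paper's chaotic migration mutation, the sketch below uses a plain particle swarm to tune PID gains that minimize the tracking error of a first-order process. The plant, cost function and PSO settings are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def step_response_cost(gains, dt=0.01, t_end=5.0):
    """Integral of absolute error for a PID loop around a first-order plant
    y' = (-y + u) / tau tracking a unit step (a toy stand-in for the ESR model)."""
    kp, ki, kd = gains
    tau, y, integral, prev_err, cost = 0.8, 0.0, 0.0, 1.0, 0.0
    for _ in range(int(t_end / dt)):
        err = 1.0 - y
        integral += err * dt
        derivative = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * derivative
        y += dt * (-y + u) / tau
        prev_err = err
        cost += abs(err) * dt
    return cost

# Standard PSO over (Kp, Ki, Kd), bounded to [0, 10].
n_particles, iterations, w, c1, c2 = 20, 60, 0.7, 1.5, 1.5
pos = rng.uniform(0, 10, size=(n_particles, 3))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([step_response_cost(p) for p in pos])
gbest = pbest[np.argmin(pbest_cost)]
for _ in range(iterations):
    r1, r2 = rng.random((2, n_particles, 3))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 10)
    cost = np.array([step_response_cost(p) for p in pos])
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
    gbest = pbest[np.argmin(pbest_cost)]

print("tuned (Kp, Ki, Kd):", np.round(gbest, 2), " IAE:", round(pbest_cost.min(), 3))
```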

Journal ArticleDOI
TL;DR: A developed reading of Roederer's interpretation of pragmatic information is introduced as a good candidate for a Unifying Information Concept required for an as-yet-unavailable Science of Information.
Abstract: This paper aims to introduce a developed reading of Roederer's interpretation of pragmatic information as a good candidate for a Unifying Information Concept required for an as-yet-unavailable Science of Information. According to pragmatic information, information and information processing are exclusive attributes of biological systems related to the very definition of life. I will apply the notion to give new accounts in the following areas: (1) quantum interpretation: based on a modified version of David Bohm's interpretation of quantum mechanics, I propose an ontological, information-based interpretation of quantum mechanics which, unlike Roederer's interpretation, satisfies all conditions of pragmatic information; (2) artificial intelligence: the notion successfully distinguishes natural living systems from artifacts and natural non-living systems, providing a context to pose an information-based argument against the thesis of Strong Artificial Intelligence; (3) phenomenal consciousness: I will use pragmatic information to modify and update Chalmers's Double-aspect Theory of Information to be explanatorily more powerful regarding the physical aspect of his theory; (4) causation: based on pragmatic information, I pose a new account of causation which differentiates causation in biology from causation in the natural abiotic world.

Journal ArticleDOI
TL;DR: It is argued that the order-theoretic sense of contextuality is analogous to the sense embodied in the topos- theoretic statement of the Kochen–Specker theorem and leads to a relation between the entropy associated with measurements on quantum systems and the second law of thermodynamics.
Abstract: In this essay, I develop order-theoretic notions of determinism and contextuality on domains and topoi. In the process, I develop a method for quantifying contextuality and show that the order-theoretic sense of contextuality is analogous to the sense embodied in the topos-theoretic statement of the Kochen–Specker theorem. Additionally, I argue that this leads to a relation between the entropy associated with measurements on quantum systems and the second law of thermodynamics. The idea that the second law has its origin in the ordering of quantum states and processes dates to at least 1958 and possibly earlier. The suggestion that the mechanism behind this relation is contextuality, is made here for the first time.

Journal ArticleDOI
TL;DR: Experimental results show that the dynamic position of a pedestrian can be tracked in a straight-line movement model as well as in two-dimensional models, and that higher accuracy and dynamic tracking performance in a real indoor environment can be achieved without other devices.
Abstract: The Internet of Things (IoT) for Smart Environments (SE) is a new scenario that collects useful information and provides convenient services to humans via sensing and wireless communications. Infra-Red (IR) motion sensors have recently been widely used for indoor lighting because they allow the system to detect whether a human is inside or outside the sensors' range. In this paper, the performance of a position estimation scheme based on IR motion sensors is evaluated in an indoor SE. The experimental results show that we can track the dynamic position of a pedestrian in a straight-line movement model as well as in two-dimensional models. Experimental results also show that higher accuracy and dynamic tracking performance in a real indoor environment can be achieved without other devices.
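The abstract does not give the estimator, so here is only a toy version of the underlying idea: IR motion sensors with known positions report binary detections, and the pedestrian position is estimated as the centroid of the triggered sensors. Sensor layout, detection range and the walking path are invented.

```python
import numpy as np

# Invented grid of ceiling-mounted IR motion sensors (x, y in meters), 1.5 m detection range.
sensor_xy = np.array([(x, y) for x in range(0, 6, 2) for y in range(0, 6, 2)], dtype=float)
SENSE_RANGE_M = 1.5

def triggered(sensors, person_xy):
    """Binary detection vector: which sensors currently see the pedestrian."""
    return np.linalg.norm(sensors - person_xy, axis=1) <= SENSE_RANGE_M

def estimate_position(sensors, detections, previous_estimate):
    """Centroid of the triggered sensors; hold the last estimate if nothing fires."""
    if not detections.any():
        return previous_estimate
    return sensors[detections].mean(axis=0)

# Straight-line walk across the room, estimated step by step.
estimate = np.array([0.0, 0.0])
for true_xy in np.linspace([0.0, 1.0], [5.0, 1.0], num=6):
    estimate = estimate_position(sensor_xy, triggered(sensor_xy, true_xy), estimate)
    error = np.linalg.norm(estimate - true_xy)
    print(f"true {true_xy}, estimate {estimate}, error {error:.2f} m")
```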

Journal ArticleDOI
TL;DR: The present review seeks to take stock of South Korean publication activity in the field of chemistry by systematically analyzing all chemistry-related scholarly communications collected in the Web of Science (WOS) database published by at least one Korean author or Korean institute- or university-affiliated author from 1993 to 2012.
Abstract: The present review seeks to take stock of South Korean publication activity in the field of chemistry by systematically analyzing all chemistry-related scholarly communications collected in the Web of Science (WOS) database published by at least one Korean author or Korean institute- or university-affiliated author from 1993 to 2012. The studied parameters included the growth in the number of communications, as well as the language-, document-, category-, source-, organization-, and collaboration-wise distribution of the South Korean communications. A total of 5660 communications on chemistry were found to be published by South Korean researchers during the aforementioned period of time, and South Korea was the 15th country (1.77%) in the world in terms of informational communication activity in chemistry.
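As a sketch of the kind of tabulation such a bibliometric review performs, the snippet below groups a handful of invented Web of Science-style records by year, document type and source; the field names and records are illustrative only, not data from the study.

```python
from collections import Counter

# Invented WOS-style records: (publication year, document type, source journal).
records = [
    (1995, "Article", "Bull. Korean Chem. Soc."),
    (2004, "Article", "J. Am. Chem. Soc."),
    (2004, "Review", "Chem. Rev."),
    (2012, "Article", "Angew. Chem. Int. Ed."),
    (2012, "Article", "Bull. Korean Chem. Soc."),
]

growth_by_year = Counter(year for year, _, _ in records)
by_doc_type = Counter(doc_type for _, doc_type, _ in records)
by_source = Counter(source for _, _, source in records)

print("growth by year:", dict(sorted(growth_by_year.items())))
print("document types:", dict(by_doc_type))
print("top sources:   ", by_source.most_common(2))
```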

Journal ArticleDOI
TL;DR: This research presents a semi-supervised co-training ensemble learning approach using both neural networks and decision trees to deal with the search interface identification problem and shows that the proposed model outperforms previous methods using only labeled data.
Abstract: To surface the Deep Web, one crucial task is to predict whether a given web page has a search interface (searchable HyperText Markup Language (HTML) form) or not. Previous studies have focused on supervised classification with labeled examples. However, labeled data are scarce, hard to get and require tedious manual work, while unlabeled HTML forms are abundant and easy to obtain. In this research, we consider the plausibility of using both labeled and unlabeled data to train better models to identify search interfaces more effectively. We present a semi-supervised co-training ensemble learning approach using both neural networks and decision trees to deal with the search interface identification problem. We show that the proposed model outperforms previous methods using only labeled data. We also show that adding unlabeled data improves the effectiveness of the proposed model.
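A minimal sketch of the co-training idea with an assumed scikit-learn setup: a neural network and a decision tree are trained on the small labeled set, and in each round they add the unlabeled form descriptors they are most confident about to the shared pool of pseudo-labeled examples. The synthetic features stand in for HTML-form descriptors, the parameters are invented, and the paper's actual ensemble and feature views are not reproduced.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-ins for HTML-form feature vectors (searchable vs. non-searchable).
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
labeled = np.zeros(len(y), dtype=bool)
labeled[:40] = True                                   # scarce labeled examples
pseudo_y = y.copy()                                   # holds true labels plus pseudo-labels

nn = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
tree = DecisionTreeClassifier(max_depth=6, random_state=0)

for _ in range(5):                                    # co-training rounds
    for teacher in (nn, tree):
        teacher.fit(X[labeled], pseudo_y[labeled])
        unlabeled_idx = np.flatnonzero(~labeled)
        if len(unlabeled_idx) == 0:
            break
        proba = teacher.predict_proba(X[unlabeled_idx])
        confident = np.max(proba, axis=1) > 0.95
        pick = unlabeled_idx[confident][:20]          # most confident unlabeled forms
        pseudo_y[pick] = np.argmax(proba, axis=1)[confident][:20]
        labeled[pick] = True

# Simple two-view ensemble: average the predicted probabilities.
avg_proba = (nn.predict_proba(X) + tree.predict_proba(X)) / 2
print("ensemble accuracy on all data:", np.mean(np.argmax(avg_proba, axis=1) == y))
```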

Journal ArticleDOI
TL;DR: An analysis of “information” from an evolutionary view can be helpful even for information sciences: there are gaps which cannot be bridged sufficiently, especially between the different evolutionary levels up to the “hierarchical structure” of a person as a social being.
Abstract: "Information" (=information including its processing, communication, etc.) is indispensable for the modern understanding of processes within cells, tissues, organs, and the organism, but also between individuals and social structures. Is "information" the mathematically applicable substitute for the omnipotent Vis Vitalis, identical in all living entities and applicable also to machines? Vis Vitalis was falsified by evolutionary theory. Its explanatory power was not "saved" with an alternative hypothesis. So the causal explanation of what could previously be handled with Vis Vitalis remains a "grey area" in the landscape of sciences. "Information" seems to fill the gap between, e.g., body and mind. Therefore, an analysis of "information" from an evolutionary view can be helpful even for the information sciences: there are gaps which cannot be bridged sufficiently, especially between the different evolutionary levels up to the "hierarchical structure" of a person as a social being. An analysis is presented: the meaning and the indispensable carriers of "information" have changed within the evolutionary processes. Options and restrictions for an evolution-oriented use of "information" are discussed and applied. In doing so, it seems possible to bridge not only the gap between the layers within the biological, emotional, cognitive and intellectual hierarchical levels within a person, but between persons and machines too.

Journal ArticleDOI
TL;DR: The proposed vehicular antenna for WAVE communication systems shows an improvement of approximately 4.77 dB in the return loss, as compared with a conventional antenna system.
Abstract: This paper describes the design of a high-efficiency vehicular roof-mounted antenna for wireless access in vehicular environments (WAVE) communication systems used for ubiquitous intelligent systems. The main objective of the ubiquitous intelligent system's automotive IT technology is to enhance the connectivity among vehicles to ensure seamless communication and to reduce the initial access time using high-performance antenna systems. The efficiency of WAVE communication systems used for ubiquitous intelligent systems depends on the antenna efficiency. The proposed vehicular antenna for WAVE communication systems shows an improvement of approximately 4.77 dB in the return loss, as compared with a conventional antenna system.
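As a rough sense of scale for the reported figure (assuming, purely for illustration, a 10 dB baseline return loss that the abstract does not give), a 4.77 dB improvement in return loss cuts the reflected power roughly threefold, since 10^(4.77/10) is approximately 3.

```python
baseline_rl_db = 10.0                      # assumed baseline return loss, not from the paper
improved_rl_db = baseline_rl_db + 4.77     # improvement reported in the abstract

reflected = lambda rl_db: 10 ** (-rl_db / 10)   # fraction of incident power reflected
print(f"reflected power: {reflected(baseline_rl_db):.1%} -> {reflected(improved_rl_db):.1%}")
# Roughly 10% -> 3.3%: the 4.77 dB gain reduces reflected power by about a factor of three.
```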