
Showing papers in "New Mathematics and Natural Computation in 2012"


Journal ArticleDOI
TL;DR: This paper considers the arguments for such "crown jewels" of mathematical economics as the existence of general equilibrium and the second welfare theorem in reverse order, moving from the matters of economics applications to the broader issue of constructivist mathematics.
Abstract: Kumaraswamy Vela Velupillai [74] presents a constructivist perspective on the foundations of mathematical economics, praising the views of Feynman in developing path integrals and Dirac in developing the delta function. He sees their approach as consistent with Bishop's constructive mathematics and considers its view of the Bolzano-Weierstrass, Hahn-Banach, and intermediate value theorems, and then the implications of these arguments for such "crown jewels" of mathematical economics as the existence of general equilibrium and the second welfare theorem. He also relates these ideas to the weakening of certain assumptions to allow for more general results, as shown by Rosser [51] in his extension of Gödel's incompleteness theorem, discussed in his opening section. This paper considers these arguments in reverse order, moving from the matters of economic application to the broader issue of constructivist mathematics, concluding by considering the views of Rosser on these matters, drawing both on his writings and on personal conversations with him.

41 citations


Journal ArticleDOI
TL;DR: SN PA systems are used to perform arithmetic operations such as 2's complement, addition, and subtraction of binary numbers, and to simulate NAND and NOR gates.
Abstract: Spiking neural P systems with anti-spikes (for short, SN PA systems) can encode binary digits in a natural way using two types of objects called spikes and anti-spikes. In this paper, we use SN PA systems to perform arithmetic operations such as 2's complement, addition and subtraction of binary numbers. They are also used to simulate NAND and NOR gates.
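The binary operations described can be sketched in plain Python as a reference for what the spike/anti-spike encoding computes. This is an illustrative stand-in, not a simulation of an SN PA system; the function names and bit-list representation are my assumptions:

```python
def add_binary(a, b):
    """Ripple-carry addition of two equal-length bit lists, MSB first.
    The final carry-out is dropped, i.e. fixed-width arithmetic."""
    out, carry = [], 0
    for x, y in zip(reversed(a), reversed(b)):
        s = x + y + carry
        out.append(s % 2)
        carry = s // 2
    return list(reversed(out))

def twos_complement(bits):
    """2's complement: invert every bit, then add 1."""
    inverted = [1 - b for b in bits]
    one = [0] * (len(bits) - 1) + [1]
    return add_binary(inverted, one)

def sub_binary(a, b):
    """a - b computed as a + (2's complement of b)."""
    return add_binary(a, twos_complement(b))

def nand(x, y):
    return 1 - (x & y)

def nor(x, y):
    return 1 - (x | y)
```

For example, `sub_binary([0, 1, 1, 1], [0, 0, 1, 1])` (7 − 3) yields `[0, 1, 0, 0]` (4).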

16 citations


Journal ArticleDOI
TL;DR: The emergence of non-constructivities in economics is attributed entirely to the unnecessary and inappropriate formalization of economics by means of 'classical' mathematics, which the author has also identified as the source of uncomputabilities and undecidabilities in economics.
Abstract: The emergence of non-constructivities in economics is entirely due to the unnecessary and inappropriate formalization of economics by means of 'classical' mathematics. I have made similar claims for the emergence of uncomputabilities and undecidabilities in economics in earlier writings. Here, on the other hand, I want to suggest a way of confronting uncomputabilities, and remedying non-constructivities, in economics, and turning them into a positive force for modeling, for example, endogenous growth, as suggested by Stefano Zambelli [107, 108]. In between, a case is made for economics to take seriously the kind of mathematical methodology fostered by Feynman and Dirac, in particular the way they developed the path integral and the δ-function, respectively. A sketch of a "research program" in mathematical economics, analogous to the way Gödel thought incompleteness and its perplexities should be interpreted and resolved, is also outlined, albeit briefly, in the concluding section.

14 citations


Journal ArticleDOI
TL;DR: A weak fuzzy T-congruence axiom is introduced, and it is proved that G-rationality with a transitive rationalization, together with the fuzzy Chernoff axiom, characterizes the full rationality of fuzzy choice functions.
Abstract: The aim of this paper is to discuss the full rationality of fuzzy choice functions defined on a base domain. For this purpose, we introduce a weak fuzzy T-congruence axiom. We characterize the full rationality of fuzzy choice functions in terms of this axiom and the fuzzy Chernoff axiom. We also prove that G-rationality with a transitive rationalization, together with the fuzzy Chernoff axiom, characterizes full rationality.

7 citations


Journal ArticleDOI
TL;DR: Dezert-Smarandache Theory (DSmT) as mentioned in this paper is a theory of plausible and paradoxical reasoning, developed to deal with imprecise, uncertain and conflicting sources of information.
Abstract: The management and combination of uncertain, imprecise, fuzzy and even paradoxical or highly conflicting sources of information has always been, and still remains today, of primal importance for the development of reliable modern information systems involving artificial and approximate reasoning. In this short paper, we present an introduction of our recent theory of plausible and paradoxical reasoning, known as Dezert-Smarandache Theory (DSmT), developed to deal with imprecise, uncertain and conflicting sources of information. We focus our presentation on the foundations of DSmT and on its most important rules of combination, rather than on browsing specific applications of DSmT available in literature. Several simple examples are given throughout this presentation to show the efficiency and the generality of this new theory.
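As a rough illustration of how DSmT handles conflict, the classic DSm rule of combination for a two-hypothesis frame Θ = {A, B} transfers conflicting mass to the paradoxical element A ∩ B instead of renormalizing it away. In the sketch below, the string encoding of the hyper-power set and the example masses are my own assumptions:

```python
# Hyper-power set of Θ = {A, B}: 'A', 'B', 'AuB' (A ∪ B), 'AnB' (A ∩ B).
# Intersection table, keyed by lexicographically sorted pairs of elements.
INTERSECT = {
    ('A', 'A'): 'A', ('A', 'AnB'): 'AnB', ('A', 'AuB'): 'A',
    ('A', 'B'): 'AnB', ('AnB', 'AnB'): 'AnB', ('AnB', 'AuB'): 'AnB',
    ('AnB', 'B'): 'AnB', ('AuB', 'AuB'): 'AuB', ('AuB', 'B'): 'B',
    ('B', 'B'): 'B',
}

def dsm_classic(m1, m2):
    """Classic DSm combination: m(C) = sum of m1(X) * m2(Y) over X ∩ Y = C.
    No normalization step: conflict mass lands on intersections like 'AnB'."""
    out = {}
    for x, wx in m1.items():
        for y, wy in m2.items():
            c = INTERSECT[tuple(sorted((x, y)))]
            out[c] = out.get(c, 0.0) + wx * wy
    return out
```

Two partially conflicting sources, m1 = {A: 0.6, B: 0.4} and m2 = {A: 0.5, B: 0.5}, combine to m(A) = 0.3, m(B) = 0.2 and m(A ∩ B) = 0.5, with no mass discarded.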

6 citations


Journal ArticleDOI
TL;DR: Methods of existence proofs, used by the "classical" mathematician — even if not invoking the axiom of choice — cannot be shown to be equivalent to the exhibition of an instance in the sense of a constructive proof.
Abstract: Non-standard analysis can be harnessed by the recursion theorist. But for a computable economist, the conundrums of the Löwenheim-Skolem theorem and the associated Skolem paradox seem to pose insurmountable epistemological difficulties against the use of algorithmic non-standard analysis. Discontinuities can be tamed by recursive analysis. This particular kind of taming may be a way out of the formidable obstacles created by the difficulties of Diophantine decision problems. Methods of existence proofs, used by the "classical" mathematician — even if not invoking the axiom of choice — cannot be shown to be equivalent to the exhibition of an instance in the sense of a constructive proof. These issues were prompted by the fertile and critical contributions to this special issue.

6 citations


Journal ArticleDOI
TL;DR: It is claimed that the methodological approach and results of classical recursion theory and constructive mathematics should be at the foundation of theorizing in economics and Computable Economics provides the proper foundation for a formally rigorous and meaningful definition of rationality.
Abstract: In this paper, a case for Velupillai's Computable Economics is made. It is claimed that the methodological approach and results (theorems) of classical recursion theory and constructive mathematics should be at the foundation of theorizing in economics. The major point is that Turing machine equivalent computations set an upper bound to rational choice that cannot be circumvented with a non-rigorous and teleological "as if" assumption. Clearly, given that optimal choices are almost always not constructible, the paradigm of rational behavior as optimizing behavior is deprived of any practical, operational or theoretical meaning. It is Computable Economics that provides the proper foundation for a formally rigorous and meaningful definition of rationality.

5 citations


Journal ArticleDOI
TL;DR: In this paper, the authors introduce three types of independence of irrelevant alternative conditions and show that they can be profitably used in the examination of Arrow's theorem and generalize some known nondictatorship results.
Abstract: The literature involving fuzzy Arrow results uses the same independence of irrelevant alternatives condition. We introduce three other types of independence of irrelevant alternatives conditions and show that they can be profitably used in the examination of Arrow's theorem. We also generalize some known nondictatorship results. One known fuzzy aggregation rule that is nondictatorial is the average of the individual preferences. We show that a weighted average is also nondictatorial. Moreover, it is not an automorphic image of the ordinary average, which demonstrates that our framework is distinct from presently known results.
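The weighted-average rule mentioned above has a direct sketch if individual fuzzy preference relations are represented as dicts from ordered pairs of alternatives to degrees in [0, 1]. The representation and function name are my assumptions, not the paper's notation:

```python
def weighted_average_rule(profiles, weights):
    """Aggregate individual fuzzy preference relations by a weighted average.

    profiles: one dict per voter, mapping (x, y) pairs to degrees in [0, 1]
    weights: non-negative voter weights summing to 1
    """
    pairs = profiles[0].keys()
    return {p: sum(w * r[p] for w, r in zip(weights, profiles)) for p in pairs}
```

With two voters at degrees 1.0 and 0.0 on a pair and interior weights (0.7, 0.3), the social degree is 0.7, so no single voter's preference is reproduced verbatim, which is the intuition behind nondictatorship here.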

5 citations


Journal ArticleDOI
TL;DR: The first part of a general study of the structure of nine preference-based choice functions introduced by Barrett et al. [2] is carried out, and the consistency of these choice functions when preferences are strongly complete fuzzy pre-orders is investigated.
Abstract: This paper presents the first part of a general study of the structure of nine preference-based choice functions introduced by Barrett et al. [2] More precisely, we show that, as for crisp total pre-orders, first and last alternatives exist in a finite set of alternatives equipped with a strongly complete fuzzy pre-order. We use that result to characterize each of those crisp choice functions for crisp total pre-orders and strongly complete fuzzy pre-orders. We study, by means of those characterizations, the consistency of those preference-based choice functions when preferences are strongly complete fuzzy pre-orders (and thereby crisp total pre-orders); that is, we check whether each choice function satisfies or violates each of six consistency conditions introduced by Sen [11].

4 citations


Journal ArticleDOI
TL;DR: In this article, the authors extend the analysis of voting and simple rules to the fuzzy framework and show that Black's median voter theorem does not hold under all conceptualizations of the fuzzy maximal set.
Abstract: Under certain aggregation rules, particular subsets of the voting population fully characterize the social preference relation, and the preferences of the remaining voters become irrelevant. In the traditional literature, these types of rules, i.e. voting and simple rules, have received considerable attention because they produce non-empty social maximal sets under single-peaked preference profiles but are particularly poorly behaved in multi-dimensional space. However, the effects of fuzzy preference relations on these types of rules are largely unexplored. This paper extends the analysis of voting and simple rules to the fuzzy framework. In doing so, we contribute to this literature by relaxing previous assumptions about strict preference and by illustrating that Black's median voter theorem does not hold under all conceptualizations of the fuzzy maximal set.

4 citations


Journal ArticleDOI
TL;DR: For this to be possible, the nature of realism in economics has to be delineated since the way in which mathematics is used in economics seems to be in the "reductive" mode instead of the "applied" mode.
Abstract: The use of mathematics in physics is not restricted to non-formal use with emphasis on approximation and computability. There is also a domain of creative use of symbolic manipulation as well as of metaphoric imagination that makes applicability in the sciences so effective. The challenge to mathematical use in economics is to find ways to use it in these creative ways, including finding new discursive and writing strategies. For this to be possible, the nature of realism in economics has to be delineated since the way in which mathematics is used in economics seems to be in the "reductive" mode instead of the "applied" mode.

Journal ArticleDOI
TL;DR: Intuitionistic fuzzy derivatives are introduced, and the relations between the intuitionistic fuzzy Gâteaux derivative and the intuitionistic fuzzy Fréchet derivative are studied.
Abstract: In this paper, we introduce the intuitionistic fuzzy derivative, the intuitionistic fuzzy Gâteaux derivative and the intuitionistic fuzzy Fréchet derivative, and study some of their properties, including the relations between the intuitionistic fuzzy Gâteaux derivative and the intuitionistic fuzzy Fréchet derivative.

Journal ArticleDOI
TL;DR: In this paper, interrelations are established between the fuzzy direct revelation axiom (FDRA), fuzzy transitive-closure coherence axiom (FTCCA), fuzzy consistent-closure coherence axiom (FCCCA), and fuzzy intermediate congruence axiom (FICA), together with their relationships to the weak fuzzy congruence axiom (WFCA), strong fuzzy congruence axiom (SFCA), and weak axiom of fuzzy revealed preference (WAFRP).
Abstract: In this paper we establish interrelations between fuzzy direct revelation axiom (FDRA), fuzzy transitive-closure coherence axiom (FTCCA), fuzzy consistent-closure coherence axiom (FCCCA) and fuzzy intermediate congruence axiom (FICA). We also establish their relationships with weak fuzzy congruence axiom (WFCA), strong fuzzy congruence axiom (SFCA) and weak axiom of fuzzy revealed preference (WAFRP). Condition for equivalence of fuzzy Arrow axiom (FAA) and weak fuzzy congruence axiom (WFCA) on arbitrary domain is also given.

Journal ArticleDOI
TL;DR: This paper exploits a particular difunctional relation embedded in any binary relation, its fringe, to find an approximate conceptual coverage of the relation.
Abstract: Extracting knowledge from huge data in a reasonable time is still a challenging problem. Most real data (structured or not) can be mapped to an equivalent binary context, with or without using a scaling method, as for extracting associations between words in a text, or in machine learning systems. In this paper, our objective is to find a minimal coverage of a relation with formal concepts. The problem is known to be NP-complete [1]. We exploit a particular difunctional relation embedded in any binary relation, its fringe, to find an approximate conceptual coverage. We use formal properties of fringes to derive better algorithms for computing the minimal rectangular coverage of a binary relation. Here, a formal context is considered as a binary relation. By exploiting some background on relational algebra in the present work, we merge results of Belohlavek and Vychodil [2], using formal concept analysis, with previous results obtained by Kcherif et al. [3] using relational algebra. We finally propose decomposition algorithms based on the relational formalization and fringe relations.
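The objects involved can be made concrete with the standard derivation operators of formal concept analysis: a formal concept is a maximal full rectangle (extent × intent) inside the relation, and a coverage is a set of such rectangles whose union is the whole relation. The greedy heuristic below is only a generic illustration of "coverage by concepts", not the fringe-based algorithm of the paper:

```python
def up(objs, rel, attrs_all):
    """Attributes common to every object in objs."""
    return {a for a in attrs_all if all((o, a) in rel for o in objs)}

def down(attrs, rel, objs_all):
    """Objects possessing every attribute in attrs."""
    return {o for o in objs_all if all((o, a) in rel for a in attrs)}

def concept_of(pair, rel, objs_all, attrs_all):
    """Close a single incidence pair into a formal concept (full rectangle)."""
    o, _ = pair
    intent = up({o}, rel, attrs_all)
    extent = down(intent, rel, objs_all)
    return frozenset(extent), frozenset(up(extent, rel, attrs_all))

def greedy_coverage(rel):
    """Cover rel greedily with concepts, largest uncovered rectangle first."""
    objs_all = {o for o, _ in rel}
    attrs_all = {a for _, a in rel}
    uncovered, cover = set(rel), []
    while uncovered:
        best = max((concept_of(p, rel, objs_all, attrs_all) for p in uncovered),
                   key=lambda c: len({(o, a) for o in c[0] for a in c[1]}
                                     & uncovered))
        cover.append(best)
        uncovered -= {(o, a) for o in best[0] for a in best[1]}
    return cover
```

Each loop iteration covers at least the pair that generated the chosen concept, so the greedy pass always terminates; minimality, as the abstract notes, is NP-complete and only approximated here.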

Journal ArticleDOI
TL;DR: The work of K. 'Vela' Velupillai has illuminated the debate on the mathematization of economics by providing a broader view of the universe of mathematics and its possible applications in economics.
Abstract: The work of K. 'Vela' Velupillai has illuminated the debate on the mathematization of economics by providing a broader view of the universe of mathematics and its possible applications in economics. The theoretical and policy consequences of the peculiar mode of mathematization in economics are another important theme in Vela's work. Alternative modes of mathematization are offered, with a call for an "Algorithmic Economics" in the future.

Journal ArticleDOI
TL;DR: In this article, the new revealed preference indicators WAFRP°(C) and HAFRP(C) are defined, and it is proved that WAFRP°(C) = WFCA(C) and HAFRP(C) = SFCA(C), extending two theorems of Hansson and Suzumura in terms of numerical indicators.
Abstract: The axioms WAFRP and SAFRP (resp. WFCA and SFCA) are fuzzy versions of the axioms of revealed preference WARP and SARP (resp. of the congruence axioms WCA and SCA) of the classical theory of revealed preference. The revealed preference indicators WAFRP(C), SAFRP(C), WFCA(C) and SFCA(C) of a fuzzy choice function C were introduced in a previous paper in order to express the degree to which C verifies the axioms WAFRP, SAFRP, WFCA and SFCA. In this paper, we define the new revealed preference indicators WAFRP°(C) and HAFRP(C) corresponding to the axioms WAFRP° and HAFRP of fuzzy revealed preference theory. We prove two main results: (1) WAFRP°(C) = WFCA(C); (2) HAFRP(C) = SFCA(C). They extend, in terms of numerical indicators, two known theorems of Hansson and Suzumura from the classical theory of choice functions.

Journal ArticleDOI
TL;DR: For the problem of aggregating individual two-set classifications of the elements of X, it is shown that the only aggregation rules that are neither imposed nor manipulable are dictatorships.
Abstract: Consider the following social choice problem. A group of individuals seek to classify the elements of X as belonging in one of two sets. The individuals may disagree as to how the elements of X should be classified, and so an aggregation rule is applied to determine a compromise outcome. We require that the social classification should not be imposed, nor should it be manipulable. We prove that the only aggregation rules satisfying these properties are dictatorships.

Journal ArticleDOI
TL;DR: In this paper, the authors define prime, strongly prime and semiprime k-bi-ideals of a hemiring and also define their fuzzy versions and characterize hemirings by the properties of these k-bideals.
Abstract: In this paper we define prime, strongly prime and semiprime k-bi-ideals of a hemiring. We also define their fuzzy versions and characterize hemirings by the properties of these k-bi-ideals.

Journal ArticleDOI
TL;DR: The aim of this article is to introduce the notion of a stratified lattice-valued balanced neighborhood topological group, along with the notions of equicontinuity in stratified lattice-valued neighborhood topological spaces and uniform equicontinuity in stratified lattice-valued uniform spaces, which are used to characterize stratified lattice-valued neighborhood topological groups.
Abstract: The aim of this article is to introduce the notion of stratified lattice-valued balanced neighborhood topological group. We also introduce the notions of equicontinuity in stratified lattice-valued neighborhood topological space, and uniform equicontinuity in stratified lattice-valued uniform space. We use these notions to characterize stratified lattice-valued neighborhood topological groups. Moreover, introducing the notions of lattice-valued neighborhood open function and lattice-valued uniformly open function, we show that in a stratified lattice-valued neighborhood topological group these notions are equivalent. Finally, we conclude with a characterization of balanced stratified lattice-valued neighborhood topological group in terms of uniform continuity of binary group operation.

Journal ArticleDOI
TL;DR: An algorithmic economics would allow mathematical economics to prove theorems relating to economic problems, such as the existence of equilibria defined on some metric space, with embedded mechanisms for getting to the equilibria of these problems.
Abstract: Algorithmic economics helps us stipulate, formulate, and resolve economic problems in a more precise manner than mainstream mathematical economics. It does so by aligning theorizing about an economic problem with both the data generated by the real world and the computers used to manipulate that data. Theoretically coherent, numerically meaningful, and therefore policy relevant, answers to economic problems can be extrapolated more readily using algorithmic economics than present day mathematical economics. An algorithmic economics would allow mathematical economics to prove theorems relating to economic problems, such as the existence of equilibria defined on some metric space, with embedded mechanisms for getting to the equilibria of these problems. A blueprint for such an economics is given and discussed with an example.

Journal ArticleDOI
TL;DR: The non-computability of Walrasian competitive equilibria is studied from the point of view of concepts of approximate equilibrium, and it is shown that neither the market-clearing nor the Negishi approach to the proof of existence of competitive equilibrium gives rise to adequately robust notions of approximate equilibria.
Abstract: The problem of the computability of Walrasian competitive equilibrium is considered from the point of view of concepts of approximate equilibrium. Neither the market-clearing nor Negishi approaches to the proof of existence of Walrasian competitive equilibrium give rise to adequately robust notions of approximate equilibrium. This explains the non-computability of Walrasian competitive equilibrium. The problem lies in the economic conception of markets, in particular the inconsistent treatment of information underlying the Walrasian definition. When trade takes place at disequilibrium prices decentralized market exchange redistributes income and economic welfare, and its equilibrium is path-dependent. The set of such equilibrium outcomes, however, in contrast to the Walrasian competitive equilibrium, is constructive and computable.

Journal ArticleDOI
TL;DR: In this article, the authors investigate the relationship between lattice-valued topological groups and their uniformities, and investigate the connection between stratified latticevalued neighborhood topological group and its level spaces.
Abstract: The purpose of this article is to investigate the relationships between some of the lattice-valued topological groups, and the lattice-valued uniformities that they inherit. In so doing, we look at the relationship between (a) crisp sets of lattice-valued neighborhood groups and lattice-valued neighborhood topological groups, and their uniformities; (b) lattice-valued topological groups of ordinary subsets and fuzzy neighborhood groups, and their uniformities. We also investigate the connection between stratified lattice-valued neighborhood topological group and its level spaces.

Journal ArticleDOI
TL;DR: In this article, a three-stage computational intelligence strategy is used to forecast the unsmoothed monthly sunspot number, which employs agents that use two computational techniques, genetic programming (GP) and neural networks (NN), in a sequence of three stages.
Abstract: A three-stage computational intelligence strategy is used to forecast the unsmoothed monthly sunspot number. The strategy employs agents that use two computational techniques, genetic programming (GP) and neural networks (NN), in a sequence of three stages. In the first, two agents fit the same set of observed monthly data. One employs GP, while the other employs NN. In the second, residuals (= differences between observed and solution values) from the first stage are fitted employing a different technique. The NN fitted-residuals are added to the GP first-stage solution while the GP fitted-residuals are added to the NN first-stage solution. In the third, outputs from the first and second stages become inputs to use in producing two new solutions that reconcile differences. The fittest third stage solution is then used to forecast 48 monthly sunspot numbers (September 2009 through August 2013). This modeling scheme delivered lower estimation errors at each stage. The next sunspot number peak is predicted to be around the middle of 2012.
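The residual-correction idea behind the first two stages can be shown with simple stand-in models: below, a constant fit plays the role of the first agent and a least-squares line fit the role of the second, in place of the paper's GP and NN agents (the stand-ins are my simplification):

```python
def fit_constant(y):
    """Stage-1 stand-in: fit the series with its mean."""
    m = sum(y) / len(y)
    return [m] * len(y)

def fit_line(t, y):
    """Stage-2 stand-in: ordinary least-squares line fit."""
    n = len(t)
    tbar, ybar = sum(t) / n, sum(y) / n
    slope = (sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y))
             / sum((ti - tbar) ** 2 for ti in t))
    intercept = ybar - slope * tbar
    return [intercept + slope * ti for ti in t]

def two_stage_fit(t, y):
    """Stage 1 fits the series; stage 2 fits the stage-1 residuals
    (observed minus stage-1 values) with a different model; the
    combined solution is the sum of the two fits."""
    stage1 = fit_constant(y)
    resid = [yi - s for yi, s in zip(y, stage1)]
    stage2 = fit_line(t, resid)
    return [s1 + s2 for s1, s2 in zip(stage1, stage2)]
```

On the toy series y = 2t the two stages jointly recover the data exactly, while either stand-in alone would not; the paper's third stage then reconciles two such combined solutions before forecasting.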

Journal ArticleDOI
TL;DR: In this article, the authors consider exotic spacetimes in the sense of differential geometry, and then present undecidability and incompleteness results about general relativity, with the nature of time and the existence of some kind of "cosmic time" as central questions.
Abstract: In order to ponder the question of (space) time, we consider exotic spacetimes in the sense of differential geometry, and then present undecidability and incompleteness results about general relativity, with the nature of time and the existence of some kind of "cosmic time" as central questions. We conclude with a discussion on the possible interpretation of our results.