
International Journal of General Systems
Publication details, including instructions for authors and subscription information:
http://www.informaworld.com/smpp/title~content=t713642931
Anticipation and dynamics: Rosen's anticipation in the perspective of time
Mihai Nadin
antÉ - Institute for Research in Anticipatory Systems, University of Texas at Dallas, Richardson, TX, USA
Online publication date: 09 December 2009
To cite this Article: Nadin, Mihai (2010) 'Anticipation and dynamics: Rosen's anticipation in the perspective of time', International Journal of General Systems, 39:1, 3-33
To link to this Article: DOI: 10.1080/03081070903453685
URL: http://dx.doi.org/10.1080/03081070903453685

Anticipation and dynamics: Rosen’s anticipation in the perspective
of time
Mihai Nadin*
antÉ - Institute for Research in Anticipatory Systems, University of Texas at Dallas, AT 10, 800 West Campbell Road, Richardson, TX 75080-3021, USA
*Email: nadin@utdallas.edu
(Received 8 April 2009; final version received 19 October 2009)
Anticipation relates to the perception of change. Therefore, dynamics is the context for
defining anticipation processes. Since preoccupation with change is as old as science
itself, anticipation-related questions go back to the first attempts to explain why and
how things change. However, as a specific concept, anticipation insinuates itself in the
language of science in the writings of Whitehead, Burgers, Bennett, Feynman,
Svoboda, Rosen, Nadin and Dubois, i.e. since 1929. While Robert Rosen’s work is the
main focus of this article, an attempt is made to advance a perspective for the broad
field of studies that developed around the notion of anticipation. Of particular interest
are the circumstances of epistemological and gnoseological significance, leading to the
articulation of the early hypotheses regarding anticipatory processes. Of no less interest
to the scientific community are questions pertinent to complexity, adaptivity,
purposiveness, time and computability as they relate to our understanding of
anticipation.
Keywords: anticipatory systems; ambiguity; causality; complexity; impredicativity;
non-determinism
Introduction
Those familiar with the history of science are aware of the fact that, in retrospect, some
theories, new concepts and new methods seemed to have been so much ahead of their time
that they were either ignored or declared unviable. A very telling example in this sense is
Leibniz and his digital notation. Leibniz’s attempt at a universal language deserves to be
celebrated as the precursor of the computer age (Nadin 1982, 1996a, 1996b, 1996c). His
vision of an ideal language (characteristica universalis) and his calculus ratiocinator
effectively anticipated computation as the foundation of what today is called the
information society. De progressione dyadica, dated March 1679, is a text on binary
representation. His letter of 2 January 1697 shows how, in the form of a memorial coin
(Gedenkmünze), one can establish a record of accomplishments (those of his patron, the
Herzog Rudolph August) using the very precise language of only two symbols: 0 and 1
(Leibniz 1965, 1968, 1986). He used pictorial elements to compensate for the lack of
expressiveness.
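As a purely illustrative aside (added for this edition, not part of Nadin's text), the dyadic notation Leibniz described can be sketched in a few lines of Python; the function name to_dyadic and the printed range are assumptions of this example.

    # Minimal sketch, assuming only that 'binary representation' means writing
    # ordinary numbers with the two symbols 0 and 1 (Leibniz's dyadic progression).
    def to_dyadic(n: int) -> str:
        """Return the binary (dyadic) representation of a non-negative integer."""
        if n == 0:
            return "0"
        digits = []
        while n > 0:
            digits.append(str(n % 2))  # the remainder is the next binary digit
            n //= 2
        return "".join(reversed(digits))

    # The first terms of the dyadic progression, as discussed in De progressione dyadica:
    for k in range(9):
        print(k, "->", to_dyadic(k))  # 0 -> 0, 1 -> 1, 2 -> 10, 3 -> 11, 4 -> 100, ...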
This example is quite clear with regard to what is needed for an idea to percolate; moreover, how often the deep roots of an idea are ignored. The practitioners of information technology (so many disciplines are based on it) will definitely not relate
their work and expertise to Leibniz. What led to the assertion that Leibniz is the father of
the digital is the powerful representation he chose. His revolutionary thought was not
intended to please a patron or even to open access to Chinese writing (Leibniz 1968), in particular the Book of Changes (I-ching), but to overcome the ambiguities of natural
language. This goal will prove to be related to the foundations of a science of anticipation,
and therefore, we shall return to it.
The opening sentence alluded to discoveries initially resisted, ignored or simply
rejected. Leibniz's digital machine was what we today call a 'premature birth'. It was
difficult to connect its implications to the given state of affairs in science. In other cases,
the new ideas challenge the accepted understanding of reality. Darwin’s observations in
the Galapagos Islands explained variety (of finches, for instance) in terms of a process: evolution. But it took almost a century for biologists to understand his view. The fact that
evolution involves anticipation justifies mentioning Darwin’s case in an article focused on
processes involving or resulting in foresight. More examples can be given, mainly in order
to realise that delayed recognition is part of the contradictory dynamics of science.
(The fact that in our days we want recognition immediately, and often get it, does not
affect the argument.)
Those currently involved in the study of anticipation, or those just discovering the
subject, might discard this attempt to suggest a frame of reference. The argument could be
as simple as: so long as our understanding of anticipation does not continue that of the
precursors, regardless of how qualified and creative they were, why bother? Indeed, if the
past does not constitute a reference, the effort to reconstitute it remains, at best, of
documentary significance. Leibniz’s work on binary representations and on a machine
capable of processing them is still practically ignored by the community of scientists and
practitioners of computation. That a mathematician, Gregory Chaitin (inspired by
Hermann Weyl, himself a mathematician and philosopher of science), brought Leibniz
into current scientific discourse is pertinent to our subject insofar as Leibniz is identified as one of the first to dedicate his thoughts to the subject of complexity. Is Chaitin's algorithmic information theory (for which Stephen Wolfram presented him with a medal that replicates the thought of Leibniz's medallion) yet another example of a science ahead of its time? Is Wolfram's A New Kind of Science (2002) in the same category?
Let us take note of the following: Once a branch of science becomes successful, some
of its practitioners look for precursors. And often they realise that the ‘wheel’ to which
they attached their names was invented well before. They also have the opportunity to
discover in the original thoughts many paths to be pursued. To a certain extent, this holds
true for the works of scholars who set forth the initial systematic considerations on
anticipation. Let us mention them in these preliminary lines: Alfred North Whitehead,
J.M. Burgers, J.W. Bennett, Buckminster Fuller, A. Svoboda, R. Feynman, Robert Rosen,
Mihai Nadin and D.M. Dubois (the list is open and definitely subject to comment).
Interestingly enough, reading the initial contributions made by such authors makes it evident
that representations, models, evolution, complexity and dynamics, in addition to
purposiveness, causality and time, are part of the vocabulary deployed to make the
argument in favour of a distinct scientific subject.
The annotated bibliography associated with this article documents the emergence of a 'data-rich' but 'theory-poor' field of inquiry. By no means exhaustive, the bibliography (listing publications up to the beginning of 2009) shows how far across academic disciplines anticipation extends.

Context
Some interesting developments recommend themselves to our attention. I will introduce
them almost in the style of headlines, with the intention to reconnect to them as the line of
argument requires:
• 'Bacteria Can Plan Ahead, Israeli Scientists Say'. Bacteria can anticipate and prepare for future events, according to new research from the Weizmann Institute of Science, which appeared in Nature. Researchers from the Institute's Molecular Genetics Department discovered that the genetic networks of micro-organisms could predict what comes next in a sequence of events and begin responding before its onset (Mitchell et al. 2009).
• Brain imaging: 'Wave of Brain Activity Linked to Anticipation Captured' (Science Daily, 2009), reporting on 'Brain Activation during Anticipation of Sound Sequences' (Leaver et al. 2009). Neuroscientists at Georgetown University Medical Center have, for the first time, shown what brain activity looks like when someone anticipates an action or sensory input which soon follows (see also Soon et al. (2008)).
• Insensitivity to future consequences, after damage to the human prefrontal cortex, affects the anticipation of risk (Fukui and Murai 2005, Nadin 2009a).
• With the aim of fluency and efficiency in human-robot teams, a cognitive architecture based on the neuro-psychological principles of anticipation and perceptual simulation through top-down biasing was developed (Hoffman and Breazeal 2007, 2008a, 2008b).
This article is written in a context that can be characterised as one of missed
anticipations. In this vein, Rosen should be quoted: ‘I think it is fair to say that the mood of
those concerned with the problems of contemporary society is apocalyptic’. Scheduled for
Tuesday, 16 May 1972, Rosen's presentation, 'Planning, management, policies and strategies: four fuzzy concepts', at the Center for the Study of Democratic Institutions,
started with the sentence quoted above, and it applies to our times without any amendment.
(We shall return to Rosen’s work and to this particular article.) In this respect, let us make
note of the fact that economists, as well as process control scientists, recognised early on
that anticipation deserves their attention (W.I. King, W.T. Powers and G.L.S. Shackle).
Their attempts can inform our current preoccupation with the broad subject of
anticipation.
Almost 25 years ago, Robert Rosen’s book on anticipation, Anticipatory Systems
(1985), first reached readers. Another book (Nadin 1991) introducing the concept of
anticipation was published 6 years later. The perspective of time and the evidence of
increasing interest from the scientific community in understanding anticipatory processes
speak in favour of describing the premises for the initial definition of anticipation. The
work of Alfred North Whitehead (1929) advanced the idea that every process involves the
past and the anticipation of future possibilities. This thought is part of a larger philosophic
tradition sketched out (Nadin 2000) in the attempt to identify early considerations on the
subject. Indeed, let us be aware of the variety of understandings associated with the
concept because otherwise there is a real risk of trivialising anticipation before we know
what it is. Burgers (1975) was inspired by Whitehead. Although he came from Physics,
Burgers brought up choice and the preservation of freedom as coextensive with
anticipation. Bennett (1976a, p. 847), an anthropologist, saw anticipation as ‘the basis for
adaptation’. In his book (Bennett 1976b), the same broad considerations made the subject

of an entire chapter (VII), in which Whitehead’s notion of anticipation, extended to the
entire realm of reality, is limited to living systems. Both Burgers and Bennett are part of
the broader context in which anticipation, usually used as a name holder in psychology,
slowly became part of the vocabulary of science and philosophy at the end of the last
century.
Feynman, famous for his contributions to quantum electrodynamics (which earned
him a Nobel Prize, in 1965, shared with Julian Schwinger and Sin-Itiro Tomonaga), is
probably, more by intuition than anything else, a part of the scientific story of anticipation.
Feynman's own involvement with computers dated back to Los Alamos (the Manhattan Project, 1943-1945); in his biographical notes, the subject is dealt with among so many others. However, one is surprised, as he himself was, at finding out that some digitally computed data were quite different from the results produced when computing the same data in his mind; he somehow anticipated the results. He did not specifically bring up the difference made by the medium of computation, but awareness of this difference
cannot be ignored. In the early 1980s, Feynman, John Hopfield and Carver Mead offered
‘The Physics of Computation’ course at the California Institute of Technology. Later on,
interaction with Gerry Sussman (on sabbatical from the Massachusetts Institute of
Technology) helped him develop 'Potentialities and Limitations of Computing Machines'.
Another interaction, with Ed Fredkin, allowed him to understand the problem of reversible
computation; and yet another interaction, with Danny Hillis, gave him the opportunity to
become involved in parallel computing. These biographical details (and there are so many more relevant to the depth of Feynman's involvement with the subject of computation) are significant here because he, as opposed to everyone else, brought up anticipation, however indirectly, not from biology but from computation.
In an article entitled ‘Simulating Physics with Computers’, Feynman (1982) made
relatively clear that he was aware of the distinction between what is represented (Nature, his spelling with a capital N, and nothing else, since Physics always laid claim upon it) and the representation (computation). The physical system can be simulated by a machine, but that does not make the machine the same as what it simulates. Not unlike other scientists, Feynman focused on states: the space-time view, 'imagining that the points of space and time are all laid out, so to speak, ahead of time'. The computer operation would be to see how changes in the space-time view take place. This is what dynamics is. His drawing is very intuitive (Figure 1):
The state s_i at space-time coordinate i is described through a function F (Feynman did not discuss the nature of the function): s_i = F_i(s_j, s_k, ...). The deterministic view (i.e. the past affects the present) would result, as he noticed, in the cellular automaton: 'the value of the function at i only involves the points behind in time, earlier than this time i'.
Figure 1. Feynman’s [1999] original state diagram.
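To make this deterministic reading concrete, here is a minimal sketch (an illustration added for this edition, not code by Feynman, Nadin or Rosen) of a one-dimensional cellular automaton: each state at the next time step is a fixed function F of states at the previous step only, i.e. of 'the points behind in time'. The rule number (110) and the helper names are assumptions of this example.

    # Minimal sketch of the deterministic view: the future state s_i depends only on
    # earlier states, here via an elementary cellular automaton update rule.
    def step(states, rule=110):
        """Advance one time step; each new cell is a function of its past neighbourhood."""
        n = len(states)
        nxt = []
        for i in range(n):
            left, centre, right = states[(i - 1) % n], states[i], states[(i + 1) % n]
            index = (left << 2) | (centre << 1) | right  # 3-bit neighbourhood -> 0..7
            nxt.append((rule >> index) & 1)              # look up the rule table
        return nxt

    # Evolve a single 'on' cell: every new row is computed strictly from the previous one.
    row = [0] * 31
    row[15] = 1
    for _ in range(8):
        print("".join("#" if c else "." for c in row))
        row = step(row)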

Citations
Book ChapterDOI
01 Jan 1984
TL;DR: This paper introduced the notion of rational expectations into the familiar context of a simple cobweb model, and showed how this modification radically alters the results obtained, and highlighted the role of expectations and gave some practice in utilizing expectations-based models.
Abstract: The major theme of this book is the theory of rational expectations and its role in modern macroeconomics. In this chapter, we depart somewhat from that theme and introduce rational expectations in a microeconomic context, where the impact of expectations is more readily apparent. In this way, we hope to highlight the role of expectations and give some practice in utilising expectations-based models. We briefly consider the role of expectations in economics, and outline the historical antecedents of rational expectations. Then we introduce the notion of rational expectations into the familiar context of a simple cobweb model and show how this modification radically alters the results obtained.

97 citations

Journal ArticleDOI
TL;DR: The Gaia hypothesis about the Earth climate system is formalized using advances in theoretical biology based on the minimization of variational free energy; this formalization underwrites climatic non-equilibrium steady-state through free energy minimization and thus a form of planetary autopoiesis.
Abstract: We formalize the Gaia hypothesis about the Earth climate system using advances in theoretical biology based on the minimization of variational free energy. This amounts to the claim that non-equilibrium steady-state dynamics-that underwrite our climate-depend on the Earth system possessing a Markov blanket. Our formalization rests on how the metabolic rates of the biosphere (understood as Markov blanket's internal states) change with respect to solar radiation at the Earth's surface (i.e. external states), through the changes in greenhouse and albedo effects (i.e. active states) and ocean-driven global temperature changes (i.e. sensory states). Describing the interaction between the metabolic rates and solar radiation as climatic states-in a Markov blanket-amounts to describing the dynamics of the internal states as actively inferring external states. This underwrites climatic non-equilibrium steady-state through free energy minimization and thus a form of planetary autopoiesis.

33 citations

Journal ArticleDOI
TL;DR: The Synergism Hypothesis, originally proposed in 1983, addresses the evolution of “cooperation” in nature and why there has been a secular trend over time toward increased complexity in living systems.
Abstract: Living systems theory and other theory and research in the systems sciences and complexity science has illuminated many aspects of how living systems work—their mechanisms, processes and relationships. The Synergism Hypothesis, originally proposed in 1983, addresses the evolution of cooperative phenomena in nature and why there has been a secular trend over time toward increased complexity in living systems. This theory highlights the role of functional synergy—adaptively significant combined effects that are interdependent and otherwise unattainable—in shaping the ‘progressive’ emergence of complex living systems. This approach is entirely consistent with modern evolutionary biology and natural selection theory and is thus radically opposed to various orthogenetic/deterministic theories of complexity that have been proposed over the years. The Synergism Hypothesis has recently gained scientific support, and there is growing appreciation for the role of various kinds of synergy as an influence in the evolutionary process. Copyright © 2013 John Wiley & Sons, Ltd.

26 citations


Cites background from "Anticipation and dynamics: Rosen's ..."

  • ...…important new dimension to cybernetics, control theory, artificial intelligence, and information theory (see especially Rosen 1985, Louie 2009, 2012; Nadin 2010a,b, 2012; Heylighen 2012).2 Special note should also be made of the prolific and important work of physicist Herman Haken in…...


01 Jan 2014
TL;DR: This study introduces the undecidable as a criterion for characterizing a particular type of complexity, defined as G-complexity, and provides a context for understanding how experimental evidence can be accumulated and what the characteristics of scientific work are at this juncture in the development of science.
Abstract: Computation is the medium of contemporary science. To understand the consequences of this gnoseological and epistemological revolution, one has to evaluate the outcome. As sciences become computational, difficulties concerning data processing associated with knowledge acquisition and dissemination are reduced. The focus on data afforded a quantum leap in many domains, including computation itself. The word complexity became part of the modern scientific discourse as a result of our ability to capture more data, and to associate it with interactions characterized quantitatively. In the process, the notion of complexity itself lost its resolution. This study introduces the undecidable as a criterion for characterizing a particular type of complexity. Defined as G-complexity, it allows for the understanding of questions pertinent to knowledge about the world, in particular, the living. With decidability as a well-defined criterion for complexity, we provide a context for understanding how experimental evidence—the hallmark of science in our days—can be accumulated, and what the characteristics of scientific work are at this juncture in the development of science.

25 citations

DOI
27 Apr 2018
TL;DR: The Futures Literacy Framework (FLF) described in this chapter is used to situate and design Futures Literacy Laboratories (FLL), a general-purpose tool that reveals anticipatory assumptions (AA), and a task-oriented sub-category of the FLL developed for this project's research agenda regarding novelty, the FLL-Novelty (FLL-N).
Abstract: This chapter presents a case study in order to introduce the key concepts of the Futures Literacy Framework (FLF). It describes the FLF in detail, explaining the different ontological and epistemological categories that are used to map Futures Literacy (FL). The chapter then provides two illustrations of how the FLF can be used. The first explains how the FLF can be used to situate and design Futures Literacy Laboratories (FLL), a general-purpose tool that reveals anticipatory assumptions (AA), and then a more specific task-oriented sub-category of FLL designed specifically for the research agenda of this project regarding novelty, the FLL-Novelty (FLL-N). The second discusses how the FLF can be used to situate the theory and practice of Future Studies (FS) in ways that clarify why particular tools are more or less appropriate for specific tasks as well as pointing to the potential to both deepen and enlarge the discipline beyond the boundaries of currently dominant theory and practice.

25 citations


Cites result from "Anticipation and dynamics: Rosen's ..."

  • ...In particular, the research results reported here have benefitted significantly from the work done on: anticipatory systems (Rosen, 1985; Nadin, 2010a, 2010b; Rossel, 2010; Tuomi, 2012; Miller and Poli, 2010); complexity (Ulanowicz, 1979; Rosen, 1986; Ehresmann and Vanbremeersch, 1987; Kauffman,…...


References
Journal ArticleDOI
TL;DR: This final installment of the paper considers the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now.
Abstract: In this final installment of the paper we consider the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now. To a considerable extent the continuous case can be obtained through a limiting process from the discrete case by dividing the continuum of messages and signals into a large but finite number of small regions and calculating the various parameters involved on a discrete basis. As the size of the regions is decreased these parameters in general approach as limits the proper values for the continuous case. There are, however, a few new effects that appear and also a general change of emphasis in the direction of specialization of the general results to particular cases.

65,425 citations

Journal ArticleDOI
TL;DR: In this article, it is shown that many particular choices among possible neurophysiological assumptions are equivalent, in the sense that for every net behaving under one assumption, there exists another net which behaves under another and gives the same results, although perhaps not in the same time.

14,937 citations

Book
01 Jan 1955
TL;DR: A reissue of George Kelly's classic work on personal construct psychology (PCP) is presented, and the implications of PCP for clinical practice are discussed.
Abstract: First published in 1992. Unavailable for many years this is a reissue of George Kelly's classic work. It is the bible of personal construct psychology written by its founder. The first volume presents the theory of personal construct psychology and the second volume shows the implications for clinical practice.

9,065 citations

Journal ArticleDOI
TL;DR: The theory of possibility described in this paper is related to the theory of fuzzy sets by defining the concept of a possibility distribution as a fuzzy restriction which acts as an elastic constraint on the values that may be assigned to a variable.

8,918 citations


"Anticipation and dynamics: Rosen's ..." refers background in this paper

  • ...1961), will have to wait for a more comprehensive approach until Zadeh (1978), and subsequently many distinguished followers, gave it a foundation....


  • ...1961), will have to wait for a more comprehensive approach until Zadeh (1978), and subsequently many distinguished followers, gave it a foundation. Zadeh himself arrived at possibility via fuzzy sets. As recently as June 2009, Zadeh, continuing his tireless investigation of the realm of knowledge he opened when introducing fuzzy sets, made note of the fact that judgement, perception and emotions play a prominent role in what we call economic, legal and political systems. Many years ago, Zadeh (1979, republished 1996) invoked the views of Shackle, among others, as an argument in introducing information granularity. This time, acknowledging complexity – which, as we shall see, is the threshold above which anticipatory behaviour becomes possible – Zadeh took a look at a world represented not with the sharp pen of illusory precision but with the spray can (spray pen geometry). Where others look for precision, Zadeh, in the spirit in which Shackle articulated his possibilistic views, wants to capture processes unfolding under uncertainty. We realise, at least intuitively, that anticipations (like imagination) are always of a fuzzy nature, and it seems to me that Zadeh’s new work will make the scientific community even more aware of this condition. It is very significant that economics prompts the type of questions that unite the early considerations of King (1938) and Shackle (1938) with Klir’s considerations (2002a) and Zadeh’s (2009) very recent attempts to extend fuzzy logic....


  • ...Possibility and its relation to probability, which was of interest to Shackle (cf. 1961), will have to wait for a more comprehensive approach until Zadeh (1978), and subsequently many distinguished followers, gave it a foundation....


Journal ArticleDOI
TL;DR: In this paper, the authors describe the possibility of simulating physics in the classical approximation, a thing which is usually described by local differential equations, and the possibility that there is to be an exact simulation, that the computer will do exactly the same as nature.
Abstract: This chapter describes the possibility of simulating physics in the classical approximation, a thing which is usually described by local differential equations. But the physical world is quantum mechanical, and therefore the proper problem is the simulation of quantum physics. A computer which will give the same probabilities as the quantum system does. The present theory of physics allows space to go down into infinitesimal distances, wavelengths to get infinitely great, terms to be summed in infinite order, and so forth; and therefore, if this proposition is right, physical law is wrong. Quantum theory and quantizing is a very specific type of theory. The chapter talks about the possibility that there is to be an exact simulation, that the computer will do exactly the same as nature. There are interesting philosophical questions about reasoning, and relationship, observation, and measurement and so on, which computers have stimulated people to think about anew, with new types of thinking.

7,202 citations


"Anticipation and dynamics: Rosen's ..." refers background in this paper

  • ...Svoboda was more focused on probabilistic-based predictions; Feynman dwelled on the same. Buckminster Fuller focused on the understanding of the future as a pre-requisite for design work. Prediction and anticipation are not interchangeable. Predictions are expressions of probabilities, i.e. description based on statistical data and on generalisations (that we call scientific laws). While not unrelated to probabilities, anticipations involve possibilities, such as those a design project involves. Zadeh’s genius in defining possibility is expressed in the accepted dicta: nothing is probable unless it is possible. Not everything possible is probable. The model of itself, that unfolds in faster than real time, in Rosen’s definition (1985a) is driven by both probability realisations and possibility projections....


  • ...Svoboda was more focused on probabilistic-based predictions; Feynman dwelled on the same. Buckminster Fuller focused on the understanding of the future as a pre-requisite for design work. Prediction and anticipation are not interchangeable. Predictions are expressions of probabilities, i.e. description based on statistical data and on generalisations (that we call scientific laws). While not unrelated to probabilities, anticipations involve possibilities, such as those a design project involves. Zadeh’s genius in defining possibility is expressed in the accepted dicta: nothing is probable unless it is possible. Not everything possible is probable. The model of itself, that unfolds in faster than real time, in Rosen’s definition (1985a) is driven by both probability realisations and possibility projections. It is with respect to this fundamental distinction that Nadin submitted the thesis according to which the complementary nature of the living – physical substratum and specific irreducible dynamics – is expressed in the complementary nature of anticipatory processes (Nadin 2009c). Moreover, his attempts to quantify anticipatory processes (through the AnticipationScopee, in the framework of Project Seneludens, Nadin 2004) guided my continuous attempts to seek mathematical descriptions that transcend classical measurement (attaching numbers to variables). So far, a good candidate for this attempt proved to be Goldfarb’s Evolving Transformational System – ETS (Goldfarb et al. 2007). This note cannot end without explicitly acknowledging Zadeh’s attempt (2003) to describe anticipation as a particular form of perception-driven computation....


  • ...Finally, Feynman’s understanding (1982) of the integration of past, present and future in the computation (meant to simulate Nature) is probably closer to Rosen’s understanding of anticipation....


  • ...Finally, Feynman’s understanding (1982) of the integration of past, present and future in the computation (meant to simulate Nature) is probably closer to Rosen’s understanding of anticipation. With all these considerations in mind, the reader should now be in a better position to understand that at the level of simple machines, anticipation is not possible. Such simple machines operate in the interval domain of causes and effects, in a non-ambiguous manner. Once we reach the threshold of complexity at which causality itself is no longer reducible to determinism, and the condition of the living integrates past, present and future, a new form of adaptive behaviour and of finality ( purposiveness) emerges that makes anticipatory processes possible, although only as non-deterministic processes (after all, anticipation is often wrong). Life is process (to recall Whitehead, among others), more precisely, non-deterministic process. This makes the role of the physician, and of the economist for that matter, so difficult. Therefore, in addressing causality with respect to the living (a person’s health, the state of the economy), we need to consider past and present (cause–effect, and the associated reaction), both well defined, in conjunction with a possible future realisation, ill defined, ambiguous. When we have to account for higher complexity – the threshold beyond which reaction alone can no longer explain the dynamics – the anticipatory component must be integrated in our understanding. In logic (Kleene 1950), an impredicative definition is one in which the definition of an entity depends on some of the properties of the entities described. The definition of life is an example of impredicativity; that is, it is characterised by complexity which in turn is understood as a threshold for the living. Impredicative definitions are circular. Kercel (2007) noticed that ambiguity is an observable signature of complexity....


  • ...These, again, are Feynman’s words, his own questions. To make it crystal clear: the questions Feynman posed fit the framework of anticipation and computing. However, Feynman was not even alluding to a characteristic of a part of Nature – the living – to be affected not only by its past, but also by a possible future realisation. Feynman’s focus was on quantum computation, and therefore the argument developed around probability configurations. When he wrote about simulating a probabilistic Nature by using a probabilistic computer, he realised that the output of such a machine ‘is not a unique function of the input’, that is, he realised the non-deterministic nature of the computation. As we shall see, where Feynman’s model and considerations on anticipation and computing, related to the work of Rosen and Nadin, diverge is not difficult to define. For him, as for all those – from Aristotle to Newton (Philosophiæ Naturalis Principia Mathematica) to Einstein – who made Physics the fundamental science it is, there is an all-encompassing Nature, and Physics is the science of Nature. In other words, Physics would suffice in explaining anticipatory processes or in computationally simulating them. Svoboda (1960, cf. Klir 2002) published a ‘model of the instinct of self-preservation’ in which the subject is a computer itself. Its own functioning models self-preservation under external disturbances. A probabilistic description based on inferences from past experiences quantifies its predictive capability. Pelikan (1964) further elaborated Svoboda’s original idea....


Frequently Asked Questions (10)
Q1. What are the contributions mentioned in the paper "Anticipation and dynamics: Rosen's anticipation in the perspective of time"?

In this paper, Nadin points out that the deep roots of an idea are often ignored or declared unviable.

Subjects such as planning, management, political change and stable and reliable institutions informed Rosen’s presentations at the Center during the 1971–1972 academic year, when he was a Visiting Fellow. 

Research in dynamic systems and mind functioning at Stanford University led Nadin to search more deeply into anticipatory systems. 

The Institute became part of the University of Texas at Dallas in 2004, when Dr Nadin accepted the university’s invitation to become an Ashbel Smith University Professor. 

It was unfair of many of the commentators of his work to see in him a rather esoteric researcher, disconnected from reality, only because his arguments were articulated in the extremely abstract language of mathematics, in particular, category theory. 

The very encouraging aspect here is that measurements of trigger-based experiments reveal what happens before the trigger; in other words, in anticipation of stimuli, not as a result of them. 

At the time when systems theory emerged, a great number of high-quality scientific events made possible the meeting of scientists who usually do not meet: physicists, biologists, mathematicians and engineers. 

Later on, interaction with Gerry Sussman (on sabbatical from the Massachusetts Institute of Technology) helped him develop ‘Potentialities and Limitations of Computing Machines’. 

The American economist Willford Isbell King (incidentally, at one time Chairman of the Committee for Constitutional Government) published The Causes of Economic Fluctuations: Possibilities of Anticipation and Control (1938). 

Elsasser worked on a new foundation of biology because the accepted view of considering Physics as its foundation no longer did justice to the complexity specific to the living.