
Alife Digest Number 049

ALIFE LIST: Artificial Life Research List Number 49 Friday, November 23rd 1990

ARTIFICIAL LIFE RESEARCH ELECTRONIC MAILING LIST
Maintained by the Indiana University Artificial Life Research Group

Contents:

References on Emergence
More Emergence
alife progs in basic
ML91 Call for papers: Eighth International Machine Learning Workshop

----------------------------------------------------------------------

Date: Fri, 9 Nov 90 00:20:37 est
From: "Peter Cariani" <peterc@chaos.cs.brandeis.edu>
Subject: References on Emergence

References on emergence (Cariani)
=========================================================================

Here are most of the references I have which directly and explicitly deal
with the problem of emergence. As far as I know the only clear definitions
of emergence are those which examine the behavior of a material system
and whether that behavior deviates from some specified model of the material
system. This definition has been elaborated by theoretical biologist
Robert Rosen (1978, pp. 90-93, 112; 1985, pp.19, 294, 305) using the
descriptive formalism of dynamical systems theory. I have utilized his
concept to formulate operational, empirically-grounded criteria for
deciding whether a given material system's behavior is emergent with
respect to a given, fixed set of observables on that system.
I also find it useful to divide the emergence literature into 3
largely separate discourses: functional emergence (including the
emergence of modelling relations in material systems), thermodynamic
emergence, and computational emergence. I also call functional emergence
"emergence-relative-to-a-model", to emphasize the relativity of the
emergent event to the observer's model (set of observables). Perhaps
I should add another category to encompass the questions asked by
Nagel (and Churchland, it seems) regarding the logical reducibility/
irreducibility of one theory to another. Although there are relations
between these discourses, it's helpful to keep in mind which sense
of the concept of "emergence" is being used, and to recognize that
there are widely different meanings associated with the term.

Highly recommended references are asterisked.
Discussions of the original concept of emergence:
Bergson, Morgan, Nagel, Denbigh, Pepper, Klee
Computational emergence
Langton, Hillis, Forrest
Functional emergence (emergence of modelling relations)
Rosen, Pattee, Cariani (see also Nagel & Klee)
Critiques of emergence in computer simulations
Rosen (1973, 1985), Pattee (1989), Cariani (1987-90),
Carello et al (1984)

I'd appreciate any additional references which are primarily focussed on
fundamental issues raised by the problem of emergence. Thanks.
-- Peter Cariani (peterc@chaos.cs.brandeis.edu)

Some references on the problem of emergence
==============================================================================
Bergson, Henri (1911) Creative Evolution. (New York: Random House), 1946.
[one of the original expositors of the concept of Emergent Evolution]
Cariani, Peter (1988) Why Artificial Life Needs Evolutionary Robotics.
unpublished manuscript, available upon request.
Cariani, Peter (1989) On the Design of Devices with Emergent Semantic
Functions. Ph.D. dissertation, Dept. of Systems Science, State University of
New York at Binghamton. [the problem of emergence as it relates to
AI and evolutionary robotics; limitations of purely simulational
approaches; relation between types of adaptivity and emergence]
Cariani, P. (1989) "Adaptivity, emergence and machine-environment
dependencies."
Proc, 33rd Annual Meeting, Int. Soc. for Systems Sciences
(formerly ISGSR), III:31-37.
Cariani, P. (1990) " Implications from structural evolution:
semantic adaptation."
Proceedings of the International Joint
Conference on Neural Networks, Washington, D.C., January, 1990, I: 47-50.
Cariani, P. (1990) "Emergence and Artificial Life." manuscript submitted
to Proceedings, Second Workshop on Artificial Life, available upon request.
Cariani, P. (1990) "Adaptation and Emergence in Organisms and Devices." in
press, Journal of General Evolution.
Carello, C., M.T. Turvey, P. N. Kugler, and R. E. Shaw (1984), "Inadequacies
of the computer metaphor."
Handbook of Cognitive Neuroscience. Ed., M.
Gazzaniga (New York: Plenum Press). [functional limitations of
computing devices: good general discussion]
Denbigh, KG (1975) An Inventive Universe. (New York: George Braziller).
[thermodynamics, determinism and emergence, good general discussion]
Forrest, S. (1989) "Emergent Computation: Self-organizing, Collective,
and Cooperative Phenomena in Natural and Artificial Computing Networks,"
Proceedings of the Ninth Annual Center for Nonlinear Studies and Computing
Division Conference, April, 1989.
[a serious attempt to formulate a definition of emergent computation]
Hillis, Daniel (1988), "Intelligence as an emergent behavior; or the songs
of Eden," Daedalus (AI issue), Winter 1988, 175-189 (reprinted by MIT Press).
Klee, R. L. (1984), "Micro-determinism and concepts of emergence,"
Philosophy of Science 51: 44-63. [excellent paper discussing clashes between
differing conceptions of hierarchies and emergence]
Langton, C. (1986), "Studying artificial life with cellular automata,"
Physica D 22:120-149.
Langton, C. (1989), "Artificial life," Artificial Life, SFI Studies in the
Sciences of Complexity. Ed. C. Langton (Reading, MA: Addison-Wesley).
Morgan, C.L. (1927), Emergent Evolution. (London: Northgate and Williams).
[one of the original expositors of the concept of emergent evolution]
Nagel, E. (1961), The Structure of Science. (New York: Harcourt, Brace & World)
(see pp.367-380 and Chapter 12; an excellent discussion of the problem)
Pattee, H. H. (1972), "The nature of hierarchical controls in living matter,"
Foundations of Mathematical Biology, Vol. I, R. Rosen, ed.
(New York: Academic Press).
Pattee, H. H. (1973), "Physical problems in the origin of natural controls,"
Biogenesis, Evolution, Homeostasis. Ed., A. Locker (New York: Pergamon).
[emergence and the formation of hierarchical constraints]
Pattee, H. H. (1973), "The physical basis of the origin of hierarchical
control," Hierarchy Theory: The Challenge of Complex Systems.
Ed., H. Pattee (New York: George Braziller).
Pattee, H. H. (1989), "Simulations, Realizations, and Theories of Life,"
Artificial Life, SFI Studies in the Sciences of Complexity (Reading, MA:
Addison-Wesley).
Pepper, Stephen (1926), "Emergence," Journal of Philosophy 23: 241-245.
Rosen, R. (1973), "On the generation of metabolic novelties in evolution,"
Biogenesis, Evolution, Homeostasis. Ed., Alfred Locker (New York: Pergamon).
[must reading for Alife simulators; a discussion of the concept of
organism and emergent functional properties in simulations]
Rosen, R. (1978) Fundamentals of Measurement and Representation of
Natural Systems. (New York: North Holland).
Rosen, R. (1985), Anticipatory Systems. (New York: Pergamon Press).
[A very dense, but very important book; lays out an entire theory of
modelling relations in organisms and devices]
Rosen, R. (1987) On the scope of syntactics in Mathematics and Science:
The Machine Metaphor. In: J Casti, ed. Real Brains, Artificial Minds.
North-Holland, New York.



------------------------------

Date: Wed, 14 Nov 90 16:16:38 +0100
From: Mike Manthey <manthey@iesd.auc.dk>
Subject: More Emergence

More Emergence

The last time the issue of 'what is an emergent phenomenon' came up here
in the Alife forum was last spring, and it inspired my contribution to the
Alife II proceedings. The longish contribution below is drawn from there.
If it doesn't end up in the proceedings, I guess I'll send it
somewhere else (suggestions?!), and in the meantime, those who just can't
wait can write/email me, and I'll send a copy of the tech report (#R 90-25).

From Section 1 of "Hierarchy and Emergence: A Computational View":

The computer programs that we all write are what are called 'sequential'
programs because they express a sequence of operations which are to be
carried out by some computational engine. This engine can quite accurately
be said to take the static and finite program/algorithm and 'unroll' it
into the time dimension. This unrolling creates the dynamic, active, constantly
changing entity we call a 'process'. The processes so created may or may not
be bounded, i.e., may or may not ever halt.

Let us look a little more closely now. Suppose, for the sake of discussion,
that we wished to increment the memory cell a by the value 1. The program
which we write to do this might look like this:

load R,a {move the value in cell a to register R}
add R,1 {increment R by 1}
store R,a {put the new value back into cell a}

The computer would of course simply execute these three instructions in
order, and we would find the result of our computation in cell a when it
finishes.

This sort of thing appears to be quite different from mathematics as
understood at the time when digital computers first appeared in the 1940's,
but in fact it is quite simple to re-express this calculation as

store(add(load(R,a), 1), a)

The point is that an ordinary sequential computation can, very generally,
be viewed as a function composition.
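
For concreteness, here is a rough sketch of the same correspondence in
present-day Python; the dictionary standing in for memory and the particular
function names are assumptions of the sketch, not anything from the paper.

# A toy machine: 'memory' is a table of named cells, R plays the register.
memory = {'a': 41}

def load(cell):              # load R,a : read a cell into the register
    return memory[cell]

def add(r, k):               # add R,1  : increment the register value
    return r + k

def store(r, cell):          # store R,a: write the register back to the cell
    memory[cell] = r

# Sequential version: three instructions executed in order.
R = load('a')
R = add(R, 1)
store(R, 'a')

# The same computation re-expressed as a function composition.
memory['a'] = 41
store(add(load('a'), 1), 'a')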

It is important to notice that this second representation of the computation
elegantly side-steps the memory cells which are so prominent in the original
version, which are replaced by parameters bound to values residing in some
Platonic space. In point of fact, the replacing of memory with a function
call is an endemic practice in many well-accepted formalizations of
computation. This reflects a fundamental difference between traditional
mathematics and the computational realm, in that the former is a static
(read: timeless) description whereas the latter is dynamic (read: time
-creating). Said in a different way, the symbol "=" expresses a timeless
relationship between two entities - they are/were/always will be equal;
in contrast, the symbol ":=" [replaces] explicitly recognizes the existence
of memory, and memory implies state, which implies a sequence of states,
which implies time.

While the trick of using functions to finesse memory away works very well for
formal descriptions of sequential processes, it runs into trouble in a
context where there is more than one process, and the multiple processes in
question interact with each other. It is this context which I denote by the
term 'concurrent'. It turns out that there are only two basic ways that two
processes can interact with each other: one can pass a bit to the other, or
one can wait for the other. These are called, respectively, 'to communicate'
and 'to synchronize'; I note that these two aspects are frequently combined
and not distinguished in the standard practice-oriented computing literature.

Communication can only take place using a memory cell. The bit to be passed
is written into the cell by the 'Writer' process and later read from the
cell by the 'Reader' process. This is literally what happens when the
processes in question reside in the same computer. Processes residing on
distinct computers can send messages to each other, and it might at first
seem that such message passing is a new and different way for processes to
communicate. This is in fact not true - both in principle and if you look
at the actual implementations - for one can simply consider the message
medium to be a memory which holds the value until it is 'read' by the receiver.
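
As a sketch of that claim (Python, purely illustrative), the 'message medium'
below is literally a memory that holds the value until the Reader takes it out.

import threading
import queue

channel = queue.Queue()    # the message medium: a memory holding values until read

def writer():
    channel.put(1)         # the Writer deposits the bit into the cell

def reader():
    bit = channel.get()    # the Reader later retrieves it; the medium held it meanwhile
    print("received", bit)

r = threading.Thread(target=reader)
w = threading.Thread(target=writer)
r.start(); w.start()
w.join(); r.join()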

Synchronization is, as mentioned, the other way that processes can interact.
Whereas with interaction over memories we spoke of the Writer and Reader
processes, we now speak of the Waiting and Signalling processes (there is
no implied "respectively" correspondence between the two pairs of operations).
I should perhaps state at the outset that, for me and many other computer
scientists, synchronization is a truly profound subject, one which cuts
very deeply indeed into our view of reality and hence has numerous facets.

One way to view synchronization is that it simply expresses a programmer's
intent that some given operation (to be performed eventually by the Waiting
process) must not take place before certain other operations have been
performed by the other process, which Signals the Waiting process when this
has occurred. In this view, synchronization is an operation which establishes
a total ordering (before-after) between two events, the Signal of the one
process and the Wait of the other, no more and no less. This is very often
enough for solving typical problems.
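
A minimal sketch of this reading of synchronization (Python; the Event object
standing in for the Signal/Wait pair is an assumption of the sketch):

import threading

prerequisite_done = threading.Event()     # the synchronizing element

def signalling_process():
    print("perform the 'certain other operations' first")
    prerequisite_done.set()               # Signal: the prerequisite work is done

def waiting_process():
    prerequisite_done.wait()              # Wait: do not proceed before the Signal
    print("the guarded operation runs strictly after the Signal")

w = threading.Thread(target=waiting_process)
s = threading.Thread(target=signalling_process)
w.start(); s.start()
w.join(); s.join()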

Another way to view synchronization is that it creates and protects a sequence
of operations which may only be executed by a single process at a time, a
so-called critical region. The classical example is a memory cell which is
incremented by two processes, i.e., the previous example. If a second process
(e.g.) executes this sequence on the heels of the first (i.e., is exactly
one instruction behind), the final contents of the cell will have missed
one of the increments (recommended exercise!).
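
For those who want the recommended exercise without hand-simulating machine
instructions, a rough Python equivalent follows; the lost increments are only
probable, not guaranteed, on any particular run.

import threading

cell = {'a': 0}                  # the shared memory cell

def increment_many(n):
    for _ in range(n):
        r = cell['a']            # load R,a
        r = r + 1                # add R,1
        cell['a'] = r            # store R,a (another process may have stored in between)

n = 100000
p1 = threading.Thread(target=increment_many, args=(n,))
p2 = threading.Thread(target=increment_many, args=(n,))
p1.start(); p2.start()
p1.join(); p2.join()
print(cell['a'], "of an intended", 2 * n)   # typically less than 2*n: increments were lost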

In general, such violations of a critical region lead directly to non-
deterministic behavior. This non-determinism can be avoided by insisting
that any processes wishing to use a given critical region synchronize before
entering. Hence the deterministic version of our example would read
"
Wait, read cell, increment register, write register back to cell, Signal",
with the initial condition that the synchronizing element itself is initially
'Signalled', and that it otherwise only lets one process through per Signal.
Hence synchronization is the tool by which processes can be made to mutually
exclude each other from a critical region, in addition to simply coordinating
their activities. Subsequent sections show that synchronization
very literally induces a structure on a concurrent computation, corresponding
to the various critical regions and their relationships to each other.
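
A sketch of that deterministic recipe, again in Python; the binary semaphore
plays the role of the synchronizing element that starts out 'Signalled' and
admits one process per Signal.

import threading

cell = {'a': 0}
sem = threading.Semaphore(1)     # initially 'Signalled': exactly one process may enter

def increment_many(n):
    for _ in range(n):
        sem.acquire()            # Wait
        r = cell['a']            # read cell
        r = r + 1                # increment register
        cell['a'] = r            # write register back to cell
        sem.release()            # Signal: let the next process through

n = 100000
p1 = threading.Thread(target=increment_many, args=(n,))
p2 = threading.Thread(target=increment_many, args=(n,))
p1.start(); p2.start()
p1.join(); p2.join()
print(cell['a'])                 # always exactly 2*n: the critical region is respected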

We can now glimpse one of the central points of the paper. Notice that
in principle NO information is communicated between Waiting and Signalling
processes, as opposed to Writing and Reading. Hence the processes involved
in a synchronization act are completely anonymous - process Z cannot tell
whether it was process X or process Y that Signalled it. This is manifestly
not true of communication, where not only is information passed, but also is
automatically incorporated into the state of the receiving process. This
leads to an important distinction: whereas memory is a content-oriented
concept, synchronization is a structure-oriented concept. Said in a
slightly different way, the sequential world - and with it, function
composition - cannot capture structure, since synchronization - which is
irretrievably a creature of the concurrent world - is what induces structure.
Said in yet another way, structure is a property of the relationships between
processes, rather than of any individual process in and of itself. A final
paraphrase: when two processes synchronize they create something - structure
- which neither possesses (nor can possess) individually. Thus structure is
an emergent property. I venture then the following definition of

an emergent phenomenon:

Restriction: The setting is computational, concurrent, asynchronous, distributed

Necessary conditions:
1. There are at least two processes.
2. These processes interact, via either memory or synchronization.
3. The putative emergent phenomenon cannot - even in principle - be
expressed by a single process.

Sufficient condition:
1. The phenomenon occurs. [Ignore the delicate issue of observation.]

Example 1. A single process can synchronize with itself, but this has absolutely
no effect, since the process - being sequential - cannot help but carry out
one event before proceeding to the next. Rather, the emergent phenomenon first
appears when two processes together create an 'object', i.e., something that
preserves some invariant as a whole, and where neither process by itself can
reasonably be said to express this same invariance on its own. A traffic
"gridlock" is an example, wherein none of the cars involved can be viewed,
individually, as being 'the source' of the deadlock: it is a group phenomenon.
The concept and formation of "objects as invariances" is discussed at length
in Section 3.
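
A computational caricature of the gridlock (a sketch, not from the paper):
each 'car' below behaves perfectly reasonably on its own, yet together they
produce a deadlock that belongs to neither individually. The timeouts are
only there so the demonstration terminates.

import threading
import time

pavement = [threading.Lock(), threading.Lock()]   # two squares of pavement

def car(mine, theirs):
    pavement[mine].acquire()                      # occupy my own square
    time.sleep(0.1)                               # give the other car time to do the same
    if pavement[theirs].acquire(timeout=1.0):     # try to move onto the next square
        pavement[theirs].release()
    else:
        print("car", mine, "is stuck: gridlock")  # on a typical run both cars report this
    pavement[mine].release()

a = threading.Thread(target=car, args=(0, 1))
b = threading.Thread(target=car, args=(1, 0))
a.start(); b.start()
a.join(); b.join()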

Example 2. The non-determinism introduced by unsynchronized communication over
a shared memory cell, cf. the increment a example. A single process that
Writes and Reads to itself is an ordinary sequential and deterministic
computation: the phenomenon disappears. But allow these individually
deterministic processes to interact over an unsynchronized memory, and the
non-determinism appears. Again, it takes two.

I view Example 2 as being more phenomenal than structural in character, but
there is nothing which says emergence may only be structural. This observation
also reinforces the earlier distinction between content and structure.

This is not to say that sequential computations do not possess structure,
merely that we have in the case of synchronization a clearly new kind of
structure, and one which happens to capture emergence. I would claim however
that viewing an organism (of whatever kind) as a sequential process
is - at best - a vast distortion of its true composition, and therefore that
not only should we talk and think in concurrent terms when describing
organisms, but also that the synchronizational structure is what we should
be interested in. This is the structure which easily allows us to understand,
for example, how it is that an organism today and the organism tomorrow are
the same organism, even if their constituent molecules are not the same,
or indeed have been completely replaced: the (possible) chemical pathways
(i.e., the constituent processes) and their inter-relationships remain the
same.

The next section describes synchronization in greater detail. Section 3,
building on Section 2, treats the issues of object formation and hierarchy.
Section 4 gives some examples of using the hierarchy discussed in Section 3
to analyze systems, and Section 5 concludes.

Before proceeding however, there are a couple of little items which did not
fit into the above exposition. The first is that I distinguish strongly
between concurrency and parallelism,
from the standpoint that parallelism refers to a computation which can be
described as a sequential process, but which is faster, relative to the
clock on the wall. If you will, concurrency is a concept, whereas parallelism
is a technique aimed at a particular end. The other item which didn't fit
is that the reader should beware reading this paper as a way to describe
cellular automata. Cellular automata typically change state synchronously,
side-stepping synchronizational issues via global (i.e., Newtonian) time,
whereas the components of the systems described herein are asynchronous
with respect to each other. Hence the former is a subset of the latter.
Nevertheless, to the extent to which critical regions and other
synchronizational issues are recognized when they are present, the
discussion which follows is valid for cellular automata.

---This is the end of the first section of the paper. I note that I
cannot discuss 'emergence' without implicitly invoking some
notion of hierarchy, which topic is not included in the above quotation,
but rather appears in Section 3 of the paper. Thus - with reference to Nicol
Schraudolph's item in #46, wherein she cites Paul Churchland's definition
of emergence as a relationship between two theories T1 and T2, where a
putative emergent phenomenon is emergent in T1 (embedding a reductionistic
theory T2 which in principle cannot express the phenomenon), and Marek's
concern with relativism - T1 embeds T2 BY VIRTUE OF ascension to a higher
level of abstraction, i.e., an ascension to the next level of conceptual
hierarchy. An emergent phenomenon is a product of a hierarchy relation between
conceptual levels, and hence in this view the appearance of an emergent
phenomenon is an all or nothing proposition: you are either at the one level
(conceptually speaking) or the other. In my experience, 'endless' beery
late-night discussions typically spring from an unacknowledged lack of prior
agreement of which level is to provide the framework for the discussion.

Andy Holger's item (also in #46) about the "epiphenomenal" character of
emergent phenomena touches a slightly different aspect, an aspect which
(for me) is related to both the causal relationship between activities at
different levels and to the suspicion with which 'emergence' as a valid
scientific concept is met (Marek's questions seeming to come from this quarter,
though with an apparent open-mindedness which is often lacking). I cannot
subscribe to a concept of emergence in which the putative emergent phenomenon
is not causally grounded at the next level down. So my concept of emergence is
reductionistic in this respect. On the other hand, computer scientists
seem to have a particularly hard time imagining how a 'whole'
could be greater than its parts because the endemic hierarchy concept in the
field - subroutine 'who calls whom', which is a bastardized version of
function composition, which is inherently sequential in concept - leaves no
room for it. Briefly put, my basic criterion for emergence is the existence
of a resource invariant, which is a creature of the concurrent world. A
simple example is a 'gridlock': the number of 'resources' (=squares of pavement)
is fixed, and none of the processes involved (represented by the cars)
can legitimately singly bear responsibility for the emergent GROUP phenomenon.

Recalling Holger's (citing Hofstadter's) example of emergence in
the existence of a threshold for thrashing in a paged computer system,
again the importance of concurrency per se appears. Emergence as
epiphenomenon is consistent with a reductionistic view such as I take,
but at the same time the central role played by concurrency
(and in particular synchronization) de-trivializes the characterization,
removing the usually pejorative tone of classifying something as 'a mere
epiphenomenon'. I (like to) think that most people who 'like' the concept
of emergence will find sufficient richness in the concurrency-based proposal
being advanced here.

In a way, I find the almost instinctive resistance to the concept of emergence
among my CS colleagues a little hard to understand, since it would seem that
the well respected mathematical distinction between local and global properties
(especially in topological contexts, eg knot theory) is an obvious analog. I
would also claim that non-linearity (eg x^2 +xy + y^2) smacks of the same, tho
this is rank intuition at work! If I were to seek a distinction between these
two examples, it would be in the structural (space-like, to which
synchronization is inevitably related) versus the content ("value"-orientation)
domains (and it is the former that intrigues me). It is precisely the
structural or space-like aspect that traditional views of the theory of
computation seem to allow to slip through the cracks - there is, so far as
I know, as yet no 'geometry' in the theory of computation, and hence arguments
based on universal Turing machines' computational generality re function
composition (cf above, re the distinction between the sequential and the
concurrent) would be flawed.

In closing, a non-alife discussion of the hierarchy being proposed
can be found in Jamieson, Gannon, & Douglass Eds, Characteristics of
Parallel Algorithms (MIT Press, 1987), although the
reader should replace the "formal" definition of the hierarchy in that
paper with the considerably more rigorous one in the paper quoted above.

Mike Manthey
Institute for Electronic Systems
Aalborg University Center
Fr. Bajersvej 7
9220 Aalborg DENMARK

manthey@iesd.auc.dk



------------------------------

Date: Wed, 21 Nov 90 10:46:20 EST
From: Liane Gabora <liane@cogsci1.cogsci.indiana.edu>
Subject: alife progs in basic

Prof. Craig Davis at San Diego State University is planning to offer a
course in Artificial Life in which the computer language BASIC will be
used. If anyone has or knows about any programs that he might be able to
use in his course, he would greatly appreciate your getting in touch with
him. He can be reached at 619-265-6767, or at the following address:

Craig Davis
Department of Biology
College of Sciences
San Diego State University
San Diego, CA
92182



------------------------------

Date: Wed, 21 Nov 90 15:31:40 CST
From: birnbaum@fido.ils.nwu.edu (Lawrence Birnbaum)
Subject: ML91 Call for papers: Eighth International Machine Learning Workshop

ML91 The Eighth International Workshop on Machine Learning

Call for Papers

The organizing committee is pleased to announce that ML91 will include the
following workshop topics:

Automated Knowledge Acquisition
Computational Models of Human Learning
Learning Relations
Machine Learning in Engineering Automation
Learning to React/in Complex Environments
Constructive Induction
Learning in Intelligent Information Retrieval
Learning from Theory and Data

Papers must be submitted to one of these workshops for consideration. The
provisional deadline for submission is February 1, 1991. Papers to appear
in the Proceedings must fit within 4 pages in double-column format.

More details about the constituent workshops, including submission
procedures, contact points, and reviewing committees, will be forthcoming
shortly.

ML91 will be held at Northwestern University, Evanston, Illinois, USA (just
north of Chicago), June 27-29, 1991.

On behalf of the organizing committee,

Larry Birnbaum and Gregg Collins

------------------------------
End of ALife Digest
********************************
