DEAD AND LIVING SYSTEMS
Their Relation to Formal Logical Descriptions

E. von Goldammer, J. Paul and C. Kennedy*)
Institut für Kybernetik & Systemtheorie - ICS
Harpener Hellweg 532, D-44388 Dortmund
*) Fakultät für Informatik, TU Dresden

 

Abstract

In the theory of living systems, any description of self-organizing processes is confronted by the central problem of self-generated system/environment boundaries. This contrasts with non-living systems, where the boundary is always drawn by an observer. A formal representation of the simultaneous existence of the inside and the outside of the system is required, i.e. the distinctions which the system itself makes (in order to generate its own boundary) and the distinctions which the observer makes must be incorporated together without contradiction. If, in contrast, a theory of self-organization is restricted to changes within a system (for example, physical systems, which may be defined by some physical state variables), the question of the boundary is eliminated and the distinction between a system and its environment (its inside and its outside) is generally interpreted as an information transfer between the two. This assumption, however, contradicts the 'closure thesis' postulated for cognitive and living systems by modern biology, in particular by the theory of autopoietic systems, which requires self-generated boundaries.

 

1. Basic Situation

The attempt to develop a scientific description of living systems in the sense of a 'holistic', i.e. non-reductionist "theory of life" in the 1970s led to a fundamental change in the scientific paradigm of strict separation of observer from observed. Instead the observer became part of the system to be described [von Foerster, 1985]:

"A living organism is an independent autonomous organizationally closed being, and [1]

a living organism is itself part, participant and contributor of its world of observation."

These two complementary statements necessarily assume 'autonomy', i.e. 'self-control' for living systems. The term 'self-control' here is synonymous with the expression 'control of control'. Within the terminology of cybernetics it means:

"A living system (self)-controls its control." [2]

The acceptance of such a statement leads to radical consequences for the (cybernetic) description of autonomous systems: the organizational closure of autonomous systems in the sense of the "closure thesis" is required [Varela, 1979]:

Closure thesis: [3]

"Every autonomous system is organizationally closed.

... organizational closure is to describe a system with no input and no output ..."

That idea is incompatible with Wiener's term "feedback". Here a transition becomes apparent between classical (1st-order) cybernetics, which treats only input/output-systems, and "second order cybernetics", which describes (operationally) closed, i.e. autonomous systems. The epistemologically crucial point is that both 'operational closure' and 'autonomy' are incompatible with a system description from the viewpoint of an observer-determined system-environment relationship. The observer-defined boundary between system and environment (given by an input/output relationship) is always different from the system-defined boundary between itself and all other systems, created through its operational closure. This becomes clear when one considers the incompatibility between the two viewpoints:

i) firstly, a description of a system and its environment in which the system is defined by an external observer,

and [4]

ii) secondly, a description based on the viewpoint of the autonomous (living) system itself, in which the observer is included, i.e. in which the boundary between system and environment is created by the autonomous system.

Maturana and Varela's concept of 'autopoiesis' plays a central role in the development of a 'theory of living systems' [Maturana and Varela, 1972]. It represents an attempt to develop a semantic, i.e. non-formal, theory of living systems with the declared aim of establishing a biological, non-physicalist terminology for living systems. In spite of this achievement, the problem remains that a symbiosis of computer and life sciences (in the sense of the simulation of biological systems and the consequent construction of technical artefacts) will never succeed on the basis of a purely semantic theory.

 

2. Problem Representation

In contrast to all currently known neural network models, which describe only classical input/output-systems (i.e. open systems), the models of biological and cognitive networks postulated by 2nd-order cybernetics represent closed systems. Clearly there is a contradiction between the notions of 'open' and 'closed' systems as networks or models of description. The important point is that only a closed system can "own" an environment, whereas an open system in principle does not own an environment.

If one looks for a model of the description of cognitive processes, the model must include the aspect of 'closure'. Thus cognition implies the capability of a system to draw a distinction between itself and its environment. The basis for that distinction is determined by the system itself. Again, from the viewpoint of the system itself, the existence of an environment must be presumed (see statement [4]). It should be mentioned that in this definition one must distinguish between 'cognition' and 'consciousness'. A system with 'consciousness' must have cognitive capabilities, but the converse of the statement does not hold. This issue must be clearly emphasised if cognition is to be interpreted as a characteristic of living systems which distinguishes them from the dead objects of the world of physics.

Definitions of 'environment' and 'cognition' are mandatory for a serious scientific approach. Their significance for technical developments may easily be visualized by the example of a robot:

Consider a robot in a car factory whose task is to fasten screws to car bodies. Clearly this is not a cognitive system. An independent observer of such a robot will easily draw a distinction between the robot and its environment, i.e. the screws, car bodies etc. From the viewpoint of the robot the situation is completely different: the robot does not own an environment. The 'screws', their path from the shelf to the car body, etc. are parts of the robot's internal program. Effectively they are preprogrammed as objects by the constructor and belong to the computer program which drives the robot's actions. For the robot, these objects have the same significance as the visible steel construction has for the observer. On the other hand, a robot equipped with cognitive capability should be able to distinguish between itself and its environment in order to avoid unscrewing itself. Up to now there are no robots with cognitive abilities, and no further development in that direction is gained by introducing or implementing neural networks, fuzzy controllers or a combination of both - the so-called neuro-fuzzy systems.
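To make this concrete, the following fragment is a minimal, purely hypothetical sketch (not taken from the original); names and values such as SCREW_SHELF or CAR_BODY_POINTS are illustrative assumptions. The point is that everything an observer would call the robot's 'environment' appears only as constants and routines inside the robot's own program:

```python
# Minimal sketch (hypothetical): a conventional assembly robot whose whole
# "world" has been hard-coded by its constructor.  The screw shelf and the
# car-body positions are not an environment which the robot distinguishes;
# they are constants inside its own program, on the same footing for the
# robot as its steel frame is for the external observer.

SCREW_SHELF = (0.50, 1.20, 0.30)            # metres, fixed by the constructor
CAR_BODY_POINTS = [(2.10, 0.80, 1.05),      # screwing positions, also fixed
                   (2.10, 0.80, 1.35)]
TORQUE_NM = 25.0

def move_arm_to(pos):                       # stand-ins for the motion primitives
    print(f"moving arm to {pos}")

def pick_screw():
    print("picking screw")

def drive_screw(torque):
    print(f"driving screw at {torque} Nm")

def fasten_all():
    """Run the preprogrammed cycle; no robot/environment distinction is made."""
    for target in CAR_BODY_POINTS:
        move_arm_to(SCREW_SHELF)            # the 'shelf' is just a constant
        pick_screw()
        move_arm_to(target)                 # the 'car body' is just another constant
        drive_screw(TORQUE_NM)

if __name__ == "__main__":
    fasten_all()        # nothing in this loop could classify the robot's own
                        # parts as "not to be unscrewed"; that distinction
                        # exists only for the external observer
```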

 

2.1 Physical Systems are 'Open' Systems

To repeat, the definitions of 'environment' and 'cognition' given above postulate the assumption of 'closed' systems. To visualize the consequences of the postulate of 'closure' of living systems, it is fruitful to re-think the corresponding formation of concepts and terminology within the natural sciences. In physics and chemistry we are not accustomed to think much about the notion of systems. Only within thermodynamics are 'open' and 'closed' systems described, and there they are usually understood in terms of geometrical boundaries, i.e. systems are defined as a partition or part of a metric space. In physics, the difference between 'closed' and 'open' is related to exchanges of matter between different spatial areas. Within this framework, a physical system defined as 'closed' does not exchange matter across its boundaries, in contrast to an 'open' one. If such boundaries are impermeable not only to matter fluxes but also to energy fluxes, the system is described as 'enclosed' or 'isolated'. One recognizes that these definitions of systems rely on spatial imagination; they can be visualized at first glance. However, from the viewpoints of both mathematics and physics, they are not only inexpedient but to a large extent scientifically inconsistent. They were derived at a time when the quantity of matter (measured in 'mol') was not commonly accepted as a physical measure and 'chemical energy' as an energy form was ignored by physicists.

However, physical systems always have one common feature: they exchange different forms of energy with other (physical) systems. Here, the physical state of an observed system changes from, let's say, a state 1 to a state 2, or expressed in other words, from an initial state to a final state. Physics measures the changes of the physical variables (observables) which describe the system. If the state of a system remains unchanged then nothing is measurable, i.e. if the system does not exchange energy with another system, it remains constant and consequently nothing can be measured. However, within the notion of 'open' and 'closed', this means that it makes no sense to observe systems which do not exchange energy. Consequently, physics (and chemistry) only know systems which are 'open', i.e. they allow an exchange of energy with other systems. For a formal description of physical systems, terms like 'open' and 'closed' are completely unnecessary [Falk, 1990].

A system definition in physics and chemistry requires that the different energy forms undergoing exchange are balanced as a sum. One obtains a differential equation (the so-called Gibbs function) which defines the physical system completely. Indeed this is the generally valid definition of a physical system, requiring no geometrical boundary (see [Falk, 1990] for details):

dE = Σi ξi dXi        [5]

The left hand side of equation (5) gives the change of the total energy E of the system, which equals the sum of the single energy forms - mechanical, heat, chemical energy, etc. - which the system exchanges with other systems and which also describe the system. Of importance is the change of energy from a state 1 (given by a constant value E = E1 = const) to a state 2 whose energy is again constant (E = E2). If the system does not exchange energy (closure), then all values dXi are equal to zero and the system does not exist in the sense of a physical description. It is easy to see that geometrical boundaries are completely unnecessary. Within physics, a system definition is given by an abstract mathematical description in which terms like 'open' or 'closed' make no sense. Either the system exists, i.e. the right side of eq. (5) is unequal to zero, or the system does not exist, i.e. the right side equals zero and no energy is exchanged; even sophisticated philosophical arguments do not change this circumstance. The obsolescence of spatial boundaries in physics is already acknowledged by atomic physics (Heisenberg's uncertainty principle).
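For illustration only (the expansion is not spelled out in the original text), the familiar textbook form of this Gibbs fundamental equation writes each exchanged energy form as an intensive variable ξi multiplying the change of its extensive variable Xi:

```latex
% Standard expanded form of the Gibbs fundamental equation (illustrative only;
% the particular energy forms written out are textbook examples and are not
% spelled out in the original text).  Requires the amsmath package.
%   intensive variables  \xi_i : T, -p, \mu_k, ...
%   extensive variables  X_i   : S, V, n_k (amount of substance in mol), ...
\begin{equation*}
  dE \;=\; \sum_i \xi_i \, dX_i
     \;=\; \underbrace{T\,dS}_{\text{heat}}
     \;-\; \underbrace{p\,dV}_{\text{mechanical}}
     \;+\; \underbrace{\sum_k \mu_k\, dn_k}_{\text{chemical}}
     \;+\; \dots
\end{equation*}
```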

 

2.2 The Reduction of Biological Systems to 'Open Systems' or Reductionism in Biology

Since every measurement determines the change between an initial and a final state, every experimental science in which measurement plays the leading role reduces a system to an open (partial) system in which a 'beginning' and an 'end', an 'input' and an 'output' exist. These terms only make sense for open systems. Now it becomes clear why in the classical natural sciences the term 'system' plays a subordinate role; to talk about 'closure' in that context is senseless. This also holds for biology which, when talking about systems, usually inherits the formally obsolete spatial-geometrical conceptions from physics and chemistry.

In contrast, from the viewpoint of contemporary cybernetics, the 'closure' of a system is postulated for the existence of an 'environment', which in turn is postulated for the description of 'cognitive' processes. It is exactly this cognitive ability which distinguishes living systems significantly from dead matter. It follows that this issue is crucial not only for engineering and computer science (for the modelling and simulation of cognitive processes) but also for the life sciences. From a conceptual scientific point of view, the system 'ape', where for example an experimenter measures brain activity with electrodes as a function of predefined external optical stimuli, is reduced to a living non-trivial signal or data filter equipped with a living neural network. The experimental setup reduces the system 'ape', i.e. its brain, to an open system (for the experimenter). Such experiments, or similar ones, yield no knowledge about the (visual) perception or cognitive processes which take place within the system 'ape' during the experimental situation. This applies even if it were possible to measure the activity of every single neuron within the ape's brain. Such experiments are more successful within investigations concerning metabolism; in that case, however, one is clearly working in the field of physics and chemistry. The domain of cognitive processes is radically different.

· What epistemological difficulties occur in the measurement of brain activity in the context of a scientific description of cognitive processes?

a) First, there is the hitherto common but scientifically biased usage of the term 'information' as it was introduced by Shannon in 1948. According to Shannon's theory, the information content of signals (analogous to physical objects) is defined as a measurable variable. In telecommunication techniques the usefulness of this exclusively object-oriented conception of information cannot be denied. However, from the viewpoint of contemporary cybernetics, whose leading task is the description of living systems, this concept is insufficient, because a perceived signal only becomes information within the perceiving system and within a context defined by that system itself, i.e. 'information' does not exist sui generis. As long as biologists and (neuro-)information scientists fail to recognize this situation (which is actually easy to see), they will have only limited success in their efforts to develop a theory of cognitive processes.

b) Naturally the 'definition' of physical systems as 'open systems' is not an invention of the physicists, but derives from the wish to describe physical systems and processes mathematically. Here mathematics as a formal language represents an extremely efficient tool for scientific communication. However, the formal description of a closed system in the sense of the 'closure thesis' (see statement [3]) is not possible on the basis of classical mathematics [von Goldammer and Kaehr, 1989].

The question immediately arises of how technical cognitive systems are to be realized if cognitive processes cannot be described mathematically. A problem becomes apparent which touches not only the foundations of AI research and the neuroinformation sciences but also the field of advanced robotics. The tendency to ignore this problem must be counteracted if progress is to be made. Up to now, the non-formalisability of cognitive processes within classical mathematics, along with the resulting scientific and technological consequences, is not recognized (even to a minimal extent) by the traditional disciplines. As a result there exists an unquestioned belief that cognitive and autonomous technical systems can be achieved with the help of neural network models and/or their combination with methods of fuzzy logic by merely increasing their computational power (massively parallel systems). From the point of view of cybernetics and the emerging transdisciplinary systems sciences this belief appears increasingly naive. The models of neuroinformation science, with their input and output layers, are open systems par excellence, and therefore they are NON-COGNITIVE.

 

3. Autonomous Systems: Description and Construction

The modeling and simulation of autonomous systems, and the more general formulation of a theory of living systems, represent a highly complex interdisciplinary task. This task, along with the construction of autonomous vehicles, i.e. technical artefacts with cognitive capabilities, presupposes a (formal) language as a communication tool and as the basis of construction. As outlined in section 2, an adequate formal language must be available which allows different domains of description to be interwoven with each other by means of special operators; they should not simply exist alongside each other with no relation to one another. Since all scientific statements should be founded on logic, a formal language is postulated in the form of a logical calculus in which different logical domains are connected by means of special operators, so that a description (i.e. modeling and simulation) of simultaneously active physical, chemical and cognitive processes becomes possible.

Such a calculus of parallel networks is already provided by polycontextural logic, founded on the theory of position values (Stellenwerttheorie). The problem of a formal, scientific description of living systems in a "holistic" sense represents the task of filling this calculus with adequate semantics. This is a complex interdisciplinary project for which no historical prototype exists. Regarded from the viewpoint of the history of science, it represents a totally new situation: in the past, formalisms were developed in parallel with the terminological form of a theory derived from experimental observations, and further mathematical development then led to a reduction of the variance of meaning of the concepts and terminology developed.

Another basic difficulty in the development of a 'theory of living systems' results from the fact that we do not have the ability to think simultaneously. Moreover, such a parallelism of processes cannot be measured or perceived in a direct way. However, this does not mean (or prove) that such processes do not exist; for the understanding of living systems they are of fundamental importance. Only the theoretical development of a terminology for such processes will lead to a theory of living systems, in which problems such as 'Mehrzeitigkeit' (multi-temporality), polyrhythmics, etc. appear and must be tackled in a formal mathematical way.

All currently known parallel computer architectures, and the algorithms running on them, possess in principle a sequential equivalent which does not modify the mapped process; only the performance of the algorithm changes. The simultaneity we describe here does not refer to this kind of parallelism. Due to the sequential nature of human thinking and the non-measurability of simultaneously interacting processes, the computer is a crucial modeling tool and simulation platform. The path to a theory of living systems must be developed methodologically via computer science, in the sense of a symbiosis of computer science and the life sciences.

 

3.1 Supervised & Unsupervised Learning

In control engineering, one problem is the design of stable and robust general-purpose controllers. Nowadays neural networks (NN), fuzzy controllers (FC) and combinations of both - the so-called neuro-fuzzy systems (NFS) - are applied. The advantage of these 'unconventional' methods lies in the fact that a precise mathematical model of the control path is not necessary. With the help of FC and/or NN techniques a control path is recorded in order to create a look-up-table for real-time applications.

Example: "Supervised and Unsupervised Learning"

i) A finite number of balls with different radii must be sorted into corresponding storage locations. The different values of their radii have been collected and stored in a look-up-table. This problem of classification and control belongs to a class of processes where classification occurs within a given, non-variable context, i.e. adaptive control in the sense of supervised learning is required.

ii) Adding sets of balls with radii different from those stored in the look-up-table results in a slightly new situation where, in addition to the adaptive control system, an adaptive classifier device (in the sense of unsupervised learning) becomes necessary for the design of an automated sorting machine.

The example holds for many control tasks where the classical methods of control engineering find their applications. The method to be applied, whether classical control systems, neural networks, fuzzy controllers or combinations of both, is only a question of suitability, i.e. the solution which is most flexible, most robust or easiest to realize is selected. In the first step (i) the sorting machine can be adapted to the task by a teach-in process in which a look-up-table is generated. This "learning" process will be called LEARNING_0; it corresponds conceptually to the connectionist models of 'supervised learning'. In order to solve task (ii) the system must be able to adapt to a slightly new situation, which means that the context of control (sorting of balls of different radii) still exists; the system must adapt the input/output values stored in its look-up-table. Such a process will be called LEARNING_I or 'unsupervised learning'.

Expressed in other words, processes labeled as LEARNING_0 and LEARNING_I can be realized with classical methods or alternatively with NN-, FC- or NFS-techniques. For a further discussion of LEARNING_0 and LEARNING_I processes and their realization, cf. ref. [von Goldammer et al., 1996].
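A minimal sketch (purely illustrative, not from the original text) may make the distinction concrete: the teach-in step that fills the look-up-table corresponds to LEARNING_0, the adaptation of that table to slightly new radii corresponds to LEARNING_I. Names such as BallSorter are assumptions of this sketch; note that both processes change only data, while the sorting algorithm and the context remain fixed.

```python
# Illustrative sketch (hypothetical) of LEARNING_0 and LEARNING_I for the
# ball-sorting machine.  Both processes change only the look-up-table;
# the algorithm and the context "sort balls by radius" remain fixed.

class BallSorter:
    def __init__(self):
        self.lookup = {}                    # radius -> storage location

    # LEARNING_0 ("teach-in", supervised): the constructor records the
    # radius/location pairs; the machine merely stores them.
    def teach_in(self, radius, location):
        self.lookup[radius] = location

    # Normal operation: classification within the fixed, pre-given context.
    def sort(self, radius):
        if radius in self.lookup:
            return self.lookup[radius]
        return self.adapt(radius)           # LEARNING_I takes over

    # LEARNING_I (unsupervised adaptation): a slightly new radius is mapped
    # to the nearest known class and the table is extended - new data,
    # same algorithm, same context.
    def adapt(self, radius):
        nearest = min(self.lookup, key=lambda r: abs(r - radius))
        location = self.lookup[nearest]
        self.lookup[radius] = location
        return location

sorter = BallSorter()
sorter.teach_in(5.0, "bin A")
sorter.teach_in(10.0, "bin B")
print(sorter.sort(5.0))     # known radius        -> bin A
print(sorter.sort(9.7))     # slightly new radius -> adapted into bin B
```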

 

3.2 Learning to learn: LEARNING_II

Instead of adding balls with radii different from those stored in the system's look-up-table, we now consider a situation which differs completely from the original context of control (sorting of balls with different radii):

iii) If an object is added which has never before been classified by the system, and we insist that the system itself must (autonomously) find a solution for the new and unpredicted situation (from the viewpoint of the system), then the object must not only be detected but also classified. However, classification is only possible within a certain context. This means that the (autonomous) system must choose a new context by itself, within a volitive process. The context corresponds to an interpretation of the classified object by the system itself. If the system is again a sorting machine, then it may decide either to ignore the new object, to stop sorting, or to sort it into a special location. The decision depends on the chosen context, i.e. on the interpretation of the new situation.

If we do not want to pre-program such a situation then we must postulate that for such an unexpected event the system must create an extension of the context on its own. Here the implication is that the system should be able to alter its own algorithm and not only the data set (within its look-up-table) as in LEARNING_I. This is exactly what is understood as 'learning' in everyday life. During this new kind of learning, the relationship between the learning system and its environment changes; technically a change must take place in the algorithm which characterizes the system.

A system with this capability must be equipped with cognitive skills. It must perceive and reflect upon a situation (naturally, the term 'reflection' as used here is something quite different from the 'reflection' of human self-consciousness). In other words, a system with cognitive abilities must be able to create a representation of itself and its environment autonomously [von Goldammer and Kaehr, 1989].

The formal description of such cognitive processes therefore requires a logical distinction between an object (i.e. the concrete object) and the representation of the object. A table standing in front of us is logically of a different type from the concept of a table, i.e. its representation. Expressed in the terminology of set theory, a set is of a logically higher type than its elements; the same holds for the relationship between an operator and an operand.

A system which learns how it learns, i.e. the process of 'learning to learn', is labeled here as LEARNING_II (cf. ref. 7). The formal description of such a process, in which the system should both reflect the complete situation and alter its own algorithm, requires an interchange of operator and operand, i.e. what from one viewpoint was an operator becomes an operand during the process of reflection, and vice versa. This process of interchange has a simultaneous nature. This is precisely the problem to be solved if processes like LEARNING_II are to be realized technically. Although each process of cognition is a '2nd-order process' in the sense of LEARNING_II, and although living systems are significantly distinguished from dead objects by their cognitive ability, there exists up to now neither a technical realization of LEARNING_II nor a technical model of cognitive processes. The models of the neuroinformation sciences belong to the categories of learning processes of first and zeroth order. Regarded conceptually, and as depicted in section 2, they represent digital data filters and therefore must be labeled as non-cognitive networks.
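The difference between LEARNING_I and LEARNING_II can be indicated, though not realized, in a short sketch (purely illustrative, not from the original). LEARNING_I only changes data inside a fixed operator; LEARNING_II would require the operator itself to become an operand which the system rewrites while it is operating. Written out sequentially, as below, the interchange loses exactly the simultaneity the text insists on, which is why such code describes the requirement rather than providing a technical realization:

```python
# Illustrative contrast (hypothetical), NOT a realization of LEARNING_II.
# LEARNING_I changes only data inside a fixed algorithm; LEARNING_II would
# require the classifying procedure itself - the operator - to be treated as
# an operand and rewritten by the system while it is still in use.

def classify_by_radius(lookup, obj):
    """Fixed operator: classification within the pre-given context 'radius'."""
    return lookup.get(obj, "unknown")

def learning_I(lookup, radius, location):
    """Data change only: extend the table; operator and context stay fixed."""
    lookup[radius] = location
    return lookup

def learning_II_would_require(current_operator, unclassifiable_object):
    """
    Sketch of the requirement, not of a solution: faced with an object that
    fits no existing context, the system would have to turn its operator into
    an operand, choose a new context (a volitive act), and return a rewritten
    operator - and do so while that operator is operating.  Here the new
    context is hard-wired by the programmer, which is precisely what
    LEARNING_II rules out.
    """
    def new_operator(lookup, obj):
        if obj == unclassifiable_object:
            return "special bin"            # pre-programmed 'interpretation'
        return current_operator(lookup, obj)
    return new_operator

lookup = {5.0: "bin A", 10.0: "bin B"}
op = classify_by_radius
op = learning_II_would_require(op, "cube")  # operator handled as operand
print(op(lookup, 5.0))      # -> bin A (old context unchanged)
print(op(lookup, "cube"))   # -> special bin (context chosen by the programmer,
                            #    not autonomously by the system)
```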

 

4. Summary

Currently, autonomy, cognition, learning, etc. are talked about in the context of both biological and technical systems, but the problem domain of closure is not even recognized. Moreover, it is commonly not noticed that autonomous systems must have not only cognitive but also volitive capabilities. Accordingly, the complexity of the problem of a (formal) description is increased still further.

Concepts such as 'openness' and 'closure' do not result from experimental measurements but instead are like 'left' and 'right', i.e. they are viewpoint-dependent categories of description which achieve their fundamental meaning only within a formal description. It follows that for the formulation of a theory of autonomous systems, all spatial-geometrical conceptions are completely obsolete.

Every experimental measurement reduces the investigated object to a system with an initial and a final state, i.e. to an 'open system', where generally only physical and chemical observables are measurable. Cognitive processes are inconceivable within the above stated research strategy, i.e. the issues of 'cognition and autonomy' are lost. What is required is a suitable calculus for the development of a model of biological cognitive networks which allows a simultaneous formal representation of open and closed networks. Three different positions of description must be mediated with each other. In this way, models of biological cognitive networks (systems) can be viewed as:

- open networks (systems),

- closed networks (systems),

- a relationship between open and closed networks (systems).

These different views must be incorporated without the appearance of ambiguities. The scientific implementation of this postulate primarily represents a scientific, logical and interdisciplinary problem which has been pointed out several times in the past [von Goldammer and Kaehr, 1989; 1990; Kaehr and von Goldammer, 1988; Kaehr and Mahler, 1996] and which at the same time acts as one formal foundation for emerging systems theories.

 

References

G. Falk, Physik - Zahl und Realität, Birkhäuser Verlag, 1990

H. von Foerster, Kybernetik einer Erkenntnistheorie, in: Sicht und Einsicht, Vieweg Verlag, Braunschweig, 1985

E. von Goldammer and R. Kaehr, Poly-contextural modelling of heterarchies in brain functions, in: Models of Brain Functions (Cotterill, R. M. J., ed.), Cambridge University Press, 1989, p. 483-497

E. von Goldammer and R. Kaehr, Problems of Autonomy and Discontexturality in the Theory of Living Systems, in: Analyse dynamischer Systeme in Medizin, Biologie und Oekologie, Informatik-Fachberichte (Moeller, D. P. F., Richter, O., eds.), Springer Verlag, 1990, p. 3-12

E. von Goldammer, C. Kennedy, J. Paul, H. Lerchner, and R. Swik, Autonomous Systems: Description and Construction, this volume, 1996

R. Kaehr and E. von Goldammer, Again Computers and the Brain, Journal of Molecular Electronics, Vol. 4, 1988, p. S31-S37

R. Kaehr and Th. Mahler, Introducing and Modeling Polycontextural Logics, this volume, 1996

H. Maturana and F. Varela, Autopoiesis: the Organization of the Living, in: Autopoiesis and Cognition, Boston Studies in the Philosophy of Science, Vol. 42, p. 63-134 (R. S. Cohen, M. W. Wartofsky, eds.), D. Reidel Publ., Dordrecht, 1972

F. Varela, Principles of Biological Autonomy, in: General Systems Research (Klir, G., ed.), Vol. II, North Holland Publ., Amsterdam, 1979

