Gotthard Günther 
2000 Special

 

GÜNTHER - 2000

A contribution in four parts 
to the 100th anniversary of Gotthard Günther

Topic of part 4: "Freiheit und Notwendigkeit - FREEDOM AND NECESSITY" 
(See also Introductions to part_1, part_2 and part_3)

by Eberhard von Goldammer

The appearance in the 1970s of Monod's book Chance and Necessity and of the Laws of the Game by Eigen and Winkler can be considered the beginning of a mental occupation of biology by statistical physics, i.e., by the idea of "chance and necessity" as the scientific basis of the modern life sciences. Prigogine's book The End of Certainty: Time, Chaos, and the New Laws of Nature, which explores deterministic chaos, completed this process of mental occupation. Prigogine claims that it is time's arrow that finally makes clear how probabilities become actualities and how "becoming" becomes "being". However, Prigogine's ideas of deterministic chaos are absolutely unsuited as a basis for any theoretical approach toward a theory of the living. The reason is quite simple: all physical models or systems, including those of irreversible thermodynamics, are deterministic models and therefore never create anything new; otherwise they would be non-deterministic systems, which cannot be described by means of common mathematical methods. Living systems, on the other hand, are characterized by their autonomy. An autonomous system must be able to control its own control. In other words, autonomy comprises at least two simultaneously interacting processes [1]:

Within complex I both processes (a1) and (b1) are complementary to each other, i.e., neither of the two can be considered or described separately. This is similar to the situation given by the following simple example:

The circularity in both complexes (I) and (II) is typical of all self-referential processes. In figure_1 the circularity of a self-referential process is visualized, with O representing an operator and o an operand. The figure shows the interplay between operator and operand: the operand becomes an operator and vice versa.

Figure_1
circularity of a self-referential process with
"->" as symbol for an order relation

 

Self-reference is a precondition for life, because living as a process is characterized essentially by cognition, and cognition is a self-referential process. This is valid for all organisms, with or without a nervous system. The nervous system not only expands the cognitive domain; it is also an essential condition for the volitive processes of living systems. Both cognition and volition are self-referential processes and are related in a way that Gestalt psychologists have called circular causality [2]. If "life" is the scientific object of investigation, self-reference cannot be neglected. The majority of today's life scientists still ignore self-reference. They obviously believe, in a somewhat primitive materialistic sense, that the observation of inanimate matter will one day lead them to the secret of life. This, however, is on the same scientific level as the alchemy of the Middle Ages in comparison with modern chemistry. In other words, life is not the primordial object of today's life sciences; life is presupposed, and the object of research is still physical matter or, in a somewhat philosophical diction, "being" and not "becoming".

Within physical and chemical systems or models, self-reference can never appear. In physics it is the physical state (and not the process) which is of importance, and it is the difference between two (physical) states that results from every physical measurement. A physical state is described by physical variables that have constant values. In other words, a physical state is by definition a situation where nothing happens, where nothing changes; it is the opposite of any process. This also holds for irreversible thermodynamics, where entropy changes occur only during a transition from one state A to another state B. As far as living systems are concerned, Prigogine's concept of time may be a useful model for the description of the metabolism within a living system, but this is the chemistry of inanimate matter. Living as a process cannot be modeled by the theory of deterministic chaos.

There is another point of interest: the structure of physical models or descriptions. Every experimental science necessarily yields a formal description or a model which is hierarchically structured, i.e., in which the transitivity relation strictly holds. If a physical or chemical process is considered, Prigogine's concept of time's arrow can be formulated as follows:

if  t1 -> t2  and  t2 -> t3  then  t1 -> t3        (1)

with "ti -> tj" read as: the event measured at ti happened earlier than the event measured at tj.

The transitivity relation (1) represents the ordinary folk philosophy of time: if there exists an event_1 of a process, measured at time t1, which happened earlier than a second measured event_2 of the same process at t2, and if event_2 at t2 happened earlier than a third measured event_3 of the considered process at t3, then event_1 at t1 was earlier than event_3 at t3. In physics temporal sequencing is crucial; no effect can precede or occur simultaneously with its cause.
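Relation (1) can be made concrete with a small Python sketch (a hypothetical illustration, not part of the original argument): the "earlier than" relation on measured time points, given as a set of ordered pairs, passes a transitivity test, and removing the implied pair makes it fail.

```python
# Illustrative only: transitivity of the "earlier than" order of relation (1).

def is_transitive(pairs):
    """Check transitivity of a relation given as a set of (a, b) pairs."""
    rel = set(pairs)
    return all((a, c) in rel
               for (a, b) in rel
               for (b2, c) in rel
               if b == b2)

# Events measured at t1, t2, t3 with all implied "earlier than" pairs:
earlier_than = {("t1", "t2"), ("t2", "t3"), ("t1", "t3")}
print(is_transitive(earlier_than))                    # True

# Omitting ("t1", "t3") would contradict relation (1):
print(is_transitive({("t1", "t2"), ("t2", "t3")}))    # False
```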

Relation (1) implies not only the principle of linear causality but also the hierarchical structure of the description of the underlying measured process. If the description of such a process is modeled by an algorithm, it can always be represented by a Turing machine. In other words, such an algorithm can be realized as a temporal sequence of steps or actions. It can be shown that, in general, all physical descriptions correspond to hierarchically structured models or sequentially structured algorithms. This results directly from the metricity relation, which by definition holds for all physical relations [3].
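As a minimal sketch of this point (our own example, not taken from reference [3]), consider a discretized physical law: its simulation is a strictly sequential algorithm in which each state uniquely determines its successor, exactly the kind of hierarchical step-by-step structure a Turing machine executes.

```python
# Illustrative only: a deterministic physical description as a sequential algorithm.

def cooling_step(temperature, ambient=20.0, k=0.1):
    """One step of Newton's law of cooling, discretized: the next state
    follows uniquely from the present one; nothing new can appear."""
    return temperature + k * (ambient - temperature)

state = 100.0
for step in range(5):
    state = cooling_step(state)   # one state, exactly one successor
    print(f"t{step + 1}: {state:.2f}")
```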

On the other hand, self-reference implies heterarchically structured processes and non-transitivity. This insight was introduced into science by Warren S. McCulloch [4] about 50 years ago, but it has been ignored by the mainstream of the scientific community until today. It was the philosopher and logician Gotthard Günther who introduced the theory of polycontexturality, which provides the formal basis for any modeling of self-referential processes, and it is not surprising that Günther's work has also been ignored by the scientific mainstream, in the life sciences as well as in philosophy.

Self-reference can never be measured; this is a trivial statement. Thus self-reference cannot be detected simply by looking into the matter or by measuring the activity of neurons or whatever else. It is also impossible to model self-referential processes within the framework of classical, i.e., monocontextural, scientific conceptions. If it is attempted nevertheless, the well-known circularities of second-order cybernetics or of Gestalt psychology are the result. Such circularities always unmask themselves as logical antinomies and ambiguities. Self-reference or "becoming" as a process, i.e., the appearance of the new, can only be modeled or re-constructed algorithmically within the framework of a scientific theory based on polycontextural logical conceptions.
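A toy example may show why such circularities are fatal within a single contexture (the code is ours and purely illustrative): a definition whose value is the negation of its own value cannot be assigned any value and never terminates.

```python
# Illustrative only: the liar antinomy as a non-terminating definition.
import sys
sys.setrecursionlimit(100)    # keep the unavoidable failure short

def liar():
    """'This statement is false' as a function: its value is defined
    as the negation of its own value."""
    return not liar()

try:
    liar()
except RecursionError:
    print("no value can be assigned within a single contexture")
```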

From a scientific point of view, it is naive to assume that any probabilistic theory or the classical monocontextural artificial neural net models can be used for modeling self-reference or other mental processes such as cognition and volition, learning, consciousness, etc. On the basis of any monocontextural scientific approach, it is possible to model metabolic processes in living systems or to use neural networks as non-linear adaptive data filters. However, "becoming" in the sense of the "appearance of the new" cannot be modeled in this way. It was Hegel who already stated that there is nothing new in nature, by which he certainly meant inanimate nature: "The changes in nature, however infinitely manifold they are, describe but a circle that repeats itself ever and again; in nature there is nothing new under the sun, and to that extent the play of her forms has a certain boredom to it. Only in the changes that come forth on the ground of spirit does anything new appear."

Here we can only comment: Hegel was right!

It is the last sentence of Hegel's statement which is of interest in the light of the present discussion, namely: "Only in the changes that come forth on the ground of spirit does anything new appear."

In order to understand the meaning of Hegel's statement concerning the appearance of the new, we will give a short description of McCulloch's lecture Toward some circuitry of ethical robots... In this paper McCulloch discusses the structural differences between moral and ethical actions, using the picture of two groups of robots: the first group has to be constructed in such a way that it plays a game according to moral principles; we will call this group MG. The second group, EG, has to be constructed in such a way that its robots are able to play a game according to ethical principles.

The engineering of the first group (MG) can be realised, at least in principle. The rules of the game are part of the computer program which controls the playing robots. The structure of the program is hierarchical and can be represented by a Turing machine; the robots are determined machines.
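A deliberately trivial sketch of group MG (all names hypothetical): since the complete rule table is fixed inside the program, every response of the robot is determined in advance.

```python
# Illustrative only: a "moral" robot as a determined machine.

MORAL_RULES = {                        # the complete, fixed rule table
    "opponent_cooperates": "cooperate",
    "opponent_defects": "defect",
}

def mg_robot(situation):
    """The robot can only look its action up; it can never add a rule
    that is not already contained in MORAL_RULES."""
    return MORAL_RULES[situation]

print(mg_robot("opponent_defects"))    # defect
```

Group EG, by contrast, would have to extend MORAL_RULES while the game is running, and which extension to choose is precisely what no entry of the table determines.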

The design of the second group (EG), however, reveals some fundamental difficulties. This group has to be constructed in such a way that the playing robots are able to introduce new strategies and, if necessary, to invent new rules which are not part of the original computer program. It is important to realise that these robots should not simply stop playing. In other words, this group has to play and to invent new strategies simultaneously. At this point it should become clear that probabilistic methods are not very helpful for solving this engineering problem.

This is what we mean when we talk about the "appearance of the new". From a classical point of view we are only familiar with deterministic and non-deterministic algorithms; any third is excluded (tertium non datur). In other words, the computer program that has to be designed for the construction of group EG cannot be a determined algorithm in the classical sense. It is obvious that this problem can never be solved on the basis of any monocontextural logical conception.

From the classical (monocontextural) point of view it is impossible to seriously imagine "freedom and necessity" instead of "chance and necessity" as a scientific conception. On the basis of a polycontextural logic the situation is completely different. The parallel interwoven calculi give the necessary degrees of freedom to design a mechanical brain in such a way that it is able to reflect and to interpret a situation in a certain context of control (depending on the design of the machine) and to change, for example, its algorithm while it is still playing the game. These are the processes that have to be designed in order to construct McCulloch's group EG.

Without going into details, we should mention two things: first, such a machine can never be represented as a Turing machine; second, it was McCulloch who realised that the corresponding algorithms are no longer exclusively hierarchically structured. Instead, the engineer has to design an interplay of heterarchically and hierarchically structured algorithms. We should remember that any algorithm represented by a Turing machine is exclusively hierarchically structured, i.e., it is a sequence of single steps or actions.

Within a polycontextural calculus, heterarchy results from inter-contextural transitions, while hierarchy is an intra-contexturally determined structure of processes. The concept of "heterarchy" and non-transitivity was introduced by McCulloch in 1945. At that time polycontexturality was not known, and McCulloch therefore used a topo-logical method in order to demonstrate the heterarchical, non-transitive structure of the function (not the morphology!) of neural networks.

In analogy to relation (1), non-transitivity means:

if  t1 -> t2  and  t2 -> t3  then  t3 -> t1        (2)

i.e., the order relation closes into a circle instead of propagating down a hierarchy.

Within the context of classical (monocontextural) logic, relation (2) turns out to be pure nonsense. Nevertheless, according to McCulloch, heterarchically structured processes have to be postulated in order to understand the function of neural network activities. Although McCulloch does not use the term "self-reference" in his paper, the processes which he identified as non-transitive are self-referential processes. A re-interpretation of McCulloch's A heterarchy of values... in the context of a polycontextural calculus has been published elsewhere [5].
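Applied to McCulloch's heterarchy of values, relation (2) can be made concrete with a small sketch (illustrative values of our own): a circular preference structure fails the same transitivity test used for relation (1), so no consistent ranking, i.e., no hierarchy of values, exists.

```python
# Illustrative only: McCulloch's circular (heterarchical) preference structure.

def is_transitive(rel):
    """Same transitivity test as used for relation (1)."""
    return all((a, c) in rel
               for (a, b) in rel
               for (b2, c) in rel
               if b == b2)

# A is preferred to B, B to C, and nevertheless C to A:
prefers = {("A", "B"), ("B", "C"), ("C", "A")}
print(is_transitive(prefers))   # False: no consistent ranking exists
```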

Even without any knowledge of polycontexturality, one can easily deduce an interesting consequence from relation (2): in studying neural processes one should expect temporal anomalies such as, for example, an inversion of the (linear) causality represented by the temporal sequencing of relation (1). And indeed, several temporal anomalies that contradict common sense have been observed and reported. There exists an article by Dennett and Kinsbourne [6] about these phenomena, followed by a long (50 pages) discussion. It is astonishing that neither in the article nor in the discussion are the fundamental structural problems of heterarchy and non-transitivity, which McCulloch pointed out about 50 years ago, discussed. The terms "heterarchy", "transitivity" and "non-transitivity" do not even occur in the article or the discussion, and there is not a single reference to McCulloch's paper. Instead of using formal logic as a tool for a scientific analysis of brain functions, Dennett and Kinsbourne discuss several extremely naive and obscure models based on different kinds of theaters.

 

Part_4 offers the following contributions (as PDF files):

 

G. Günther
Die historische Kategorie des Neuen
(The Historical Category of the New)

This contribution was published in 1970 in the Hegel Jahrbuch, p. 34-61. It is presented here in a bilingual (German/English) form; it was originally published in German and was later translated by R. H. Howe and E. von Goldammer.
The Historical Category of the New is a logical re-interpretation of Hegel's theory of development and leads to a new and more precise view of Hegel's category of the new, which is understood as discontexturality, i.e., as a transition between different logical contextures. A contexture is a logical domain within which all logical rules strictly hold. In the notion of Günther's polycontexturality, classical logic is mono-contextural, i.e., there is only one contexture. Polycontextural logic is a many-placed logic with parallel and interwoven logical domains (contextures). The different contextures are mediated by new operators such as transjunction or the negation operators. There exists a manifold of negation operators that manage the interplay, the discontexturality, between different contextures. This corresponds (on a much higher theoretical level) to Hegel's idea of the "negation of negation".

 

G. Günther
Cybernetic Ontology and Transjunctional Operations
This contribution was first published in Self-Organizing Systems (M. C. Yovits, G. T. Jacobi, G. D. Goldstein, eds.), Spartan Books, Washington, 1962, p. 313-392. In this contribution Günther does not yet use the term "contexture"; he introduced and used it from about 1970 on.

 

G. Günther
Kybernetik und Dialektik - der Materialismus von Marx und Lenin
(Cybernetics and Dialectics - the Materialism of Marx and Lenin)

This contribution is an unpublished manuscript of a lecture given by Günther in July 1964 at the University of Cologne. The lecture is interesting insofar as it is one of the very rare efforts at an East-West dialogue in Germany in the time before the fall of the Iron Curtain. At that time Günther was already an American citizen. He postulates not only the end of the materialism of Marx and Lenin but also the death of idealism.

 

W. S. McCulloch
Toward some circuitry of ethical robots or an observational science
of the genesis of social evaluation in the mind-like behavior of artifacts

A lecture by McCulloch, read to the 13th Conference on Science, Philosophy, and Religion, New York, September 1952, and to a meeting under the auspices of the Department of Experimental Psychiatry, University of Birmingham, England, 1953.

 

References:


[1] R. Kaehr & E. von Goldammer, "Poly-contextural modeling of heterarchies in brain functions", in: Models of Brain Functions (R. M. J. Cotterill, ed.), Cambridge University Press, 1989, p. 483-497. See also: http://www.vordenker.de

[2] See for example: W. J. Freeman, "Consciousness, Intentionality, and Causality", Journal of Consciousness Studies, 6 (Nov/Dec 1999), p. 143-172; http://sulcus.berkeley.edu/FreemanWWW/manuscripts/IF8/99.htm

[3] See Ref. [1] and: "Theory of Polycontexturality - An Introduction to the Work of Gotthard Günther", Vol. 1, in preparation, 2001.

[4] See part_3 of this series.

[5] R. Kaehr & E. von Goldammer, "Again Computers and the Brain", Journal of Molecular Electronics, vol. 4, 1988, S31-S37. See also: http://www.vordenker.de/ics/cbrain.htm

[6] D. C. Dennett & M. Kinsbourne, "Time and the Observer: The where and when of consciousness in the brain", Behavioral and Brain Sciences, 15 (1992), p. 183-247.