Symbiosis in LOs LO11386

Thu, 12 Dec 1996 12:06:16 GMT+2

Replying to LO11294 and LO11298

Thanks Durval and Lon for commenting on my remarks about systems. I
appreciate your valuable insights.

I wish to pursue the Symbiosis thread a little further with a view to
getting a better handle on systemic properties of symbiosis, and how a
systems engineer may be able to exploit these aspects of a system.

The manner in which we divide systems into component subsystems depends on
our world view. For example, consider a car engine. Using a simple
mechanical engineering mental model we would quickly identify the cooling
system and lubrication system as being two subsystems of the many that
constitute the whole. But assume now that you have just arrived here from
Mars and have no knowledge of engines at all. Assume also that I start an
engine from cold and provide you with the following data at short time
intervals: Oil temperature, air temperature from car heater, oil pressure,
water temperature, radiator pressure. We will simply refer to these as x1,
x2, x3, x4, x5. Because entropy is universal, you (a Martian) would in
all probability (like us Earthlings) define the strength of the
relationship between two variables x1 and x2 as T(x1,x2) = H(x1) + H(x2) -
H(x1,x2), where H(x1) is the entropy of the variable x1, H(x1,x2) is the
joint entropy, and so on. T is called the transmission, and the
normalised transmission is given by t(x1,x2) = T(x1,x2)/H(x2).
Transmission is similar to the well-known correlation coefficient
usually denoted by 'r' and the correlation ratio usually denoted by 'R',
but does not require that the variables being studied be numerical.
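For the curious, here is a minimal sketch of these definitions in Python
(my own illustration; the function names and the little category example
are mine, not part of any standard package):

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy H (in bits) of a discrete sequence."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def transmission(x1, x2):
    """T(x1,x2) = H(x1) + H(x2) - H(x1,x2)."""
    return entropy(x1) + entropy(x2) - entropy(list(zip(x1, x2)))

def normalised_transmission(x1, x2):
    """t(x1,x2) = T(x1,x2) / H(x2)."""
    h2 = entropy(x2)
    return transmission(x1, x2) / h2 if h2 > 0 else 0.0

# Note the variables need not be numerical: categories work just as well.
x1 = ['hot', 'hot', 'cold', 'cold']
x2 = ['high', 'high', 'low', 'low']     # perfectly predictable from x1
print(transmission(x1, x2))             # 1.0 bit shared
print(normalised_transmission(x1, x2))  # 1.0: x2 fully determined by x1
```

If x2 were statistically independent of x1 instead, T would come out
zero, which is the other extreme of the scale.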

Furthermore, if for argument's sake we let x1 = sin t and x2 = |sin t|
(the modulus, i.e. absolute value, of sin t), then whereas the
correlation coefficient r(x1,x2) would be zero, T will disclose the fact
that x1 and x2 are highly related. These definitions
can be extended to refer to systems instead of variables.
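This claim is easy to check numerically. A Python sketch (my own; the
ten-bin equal-width discretization is an arbitrary choice of mine, made
so that entropies can be computed from the continuous data):

```python
import math
from collections import Counter

def entropy(values):
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def transmission(x1, x2):
    return entropy(x1) + entropy(x2) - entropy(list(zip(x1, x2)))

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def discretize(xs, bins=10):
    """Bin real values into equal-width intervals for the entropy calc."""
    lo, hi = min(xs), max(xs)
    return [min(int((v - lo) / (hi - lo) * bins), bins - 1) for v in xs]

ts = [2 * math.pi * i / 1000 for i in range(1000)]
x1 = [math.sin(t) for t in ts]
x2 = [abs(v) for v in x1]        # x2 = |sin t|

r = pearson_r(x1, x2)                              # near zero: no linear link
T = transmission(discretize(x1), discretize(x2))   # clearly positive
```

Over whole periods the positive and negative halves of sin t cancel, so
r vanishes even though x2 is completely determined by x1; the
transmission, being blind to sign and shape, picks the dependence up.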

Getting back to the car engine, from the data I have supplied, you could
set up a matrix showing the variation in the response variables (x1...x5)
over time, and then calculate a normalised transmission matrix
illustrating the interactions between them. You will find something very
interesting. Namely, that x1 and x3 have a high t(x1,x3) value and that x3
has a strong internal interaction t(x3,x3). The variables x4, x2, and x5
interact strongly with one another, and x5 has noticeable internal
interaction. Finally, there is a (relatively) mild interaction between x1
and x4. These results may enable you to conclude that there are two
subsystems. The first is composed of x1 and x3 and the second is composed
of x4, x2, x5. You will then have isolated the lubrication system (x1,x3)
and the cooling system(x2,x4,x5) and be aware of some transmission between
them. The strong internal interaction for x3 makes sense because oil
pressure (x3) quickly settles down to a constant value. The reason for
the mild interaction between x1 and x4 (it is this relatively weak
relationship that suggests the existence of two subsystems) is that the
water temperature and oil temperature increase from cold at different
rates. This little
example (maybe it is not a very good one, but hopefully it illustrates the
point) shows that by examining entropy (and hence transmission) IMHO we
may be able to study symbiosis. Some time ago work was done in the area of
system decomposition using transmission by R. Conant. (Of course, John
Warfield has for quite some time used binary matrices to study the
structure of systems.)
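To make the decomposition step concrete, here is a toy sketch (entirely
my own construction, not real engine data): two hidden random factors
stand in for the lubrication and cooling dynamics, with a weak
cross-coupling from cooling into oil temperature, and the normalised
transmission matrix then recovers the two-subsystem structure. The
variable names follow the post; everything else is an assumption of the
sketch:

```python
import math
import random
from collections import Counter

def entropy(values):
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def transmission(a, b):
    return entropy(a) + entropy(b) - entropy(list(zip(a, b)))

def normalised_transmission(a, b):
    return transmission(a, b) / entropy(b)

def discretize(xs, bins=4):
    lo, hi = min(xs), max(xs)
    return [min(int((v - lo) / (hi - lo) * bins), bins - 1) for v in xs]

random.seed(1)
n = 2000
# Two hidden "subsystem" factors: cooling (f) and lubrication (g).
f = [random.gauss(0, 1) for _ in range(n)]
g = [random.gauss(0, 1) for _ in range(n)]
def noise():
    return random.gauss(0, 0.2)

data = {}
data['x1'] = [gi + 0.3 * fi + noise() for gi, fi in zip(g, f)]  # oil temp
data['x2'] = [fi + noise() for fi in f]   # heater air temp
data['x3'] = [gi + noise() for gi in g]   # oil pressure
data['x4'] = [fi + noise() for fi in f]   # water temp
data['x5'] = [fi + noise() for fi in f]   # radiator pressure

disc = {k: discretize(v) for k, v in data.items()}
t = {(a, b): normalised_transmission(disc[a], disc[b])
     for a in disc for b in disc}

# Within-subsystem links dominate; the x1-x4 cross link is mild.
print(round(t[('x1', 'x3')], 2), round(t[('x4', 'x2')], 2),
      round(t[('x1', 'x4')], 2))
```

Reading the t matrix off this sketch gives exactly the picture described
above: a (x1,x3) block, a (x2,x4,x5) block, and a weak transmission
between them via x1 and x4.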

Now all this ties into the importance of rich picture formulation
mentioned in a previous post, and relates specifically to the question of
symbiosis (currently in vogue on this list) which is an aspect of the rich
picture. Quantifying the effects of symbiosis may be very valuable (I am
thinking of Lord Kelvin's famous remark right now). Studying entropy and
transmission could be a means to this end.


Keith Sandrock Systems
FAX 27-11-339-7997



Learning-org -- An Internet Dialog on Learning Organizations For info: <> -or- <>