More annals of entropy:

> |MB:Complexity/diversity is exactly the same measure of a system as
> |entropy.
>
> SG:Wrong. Entropy cannot distinguish between different hierarchies or
> levels of organisation in a system. Entropy will only measure the material
> entropy and the material structure, not the functional relationships
> between constituents of a system. Thus, it cannot measure complexity. As
> an example, the difference in entropy between a human being and the heap
> of chemicals it is composed of is only marginal; besides, it is almost
> unmeasurable. So how can entropy measure complexity if you can't
> actually 'measure' it? It is true that a complex dissipative structure will
> have a lower entropy than the simple mix of components it is made of, but,
> as said above, this difference is not a valid measure of its complexity.
>

MB:One measure of diversity is to determine the relative
concentrations/percentages/frequencies/probabilities and calculate the sum
of -p ln p over all the elements of the system.  This measure is maximized
when the species are equally distributed and is minimized when one species
dominates.  One measure of statistical entropy is also the sum of -p ln p,
and it is maximized when the energy or matter is spread out equally
throughout the system.  I believe, as do you, that a system evolves toward
maximum complexity, not maximum disorder as the second law is usually
stated.  Of course, it depends whether you are looking at a closed system or
an open system.  But, just as there are no still-lifes, there are no closed
systems; everything leaks.  Nature does not abhor a vacuum; she abhors a
closed system.
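
To make the -p ln p measure concrete, here is a quick sketch in Python. The
species frequencies are purely hypothetical and not from this exchange; the
point is only to show the "equally distributed vs. one species dominates"
behaviour described above.

    import math

    def shannon_entropy(proportions):
        # Shannon diversity / statistical entropy: -sum(p * ln p).
        # Terms with p == 0 are skipped, since p ln p -> 0 as p -> 0.
        return -sum(p * math.log(p) for p in proportions if p > 0)

    # Hypothetical community of four species, equally distributed:
    even = [0.25, 0.25, 0.25, 0.25]
    # Same four species with one dominating:
    skewed = [0.97, 0.01, 0.01, 0.01]

    print(shannon_entropy(even))    # ln(4), about 1.386 (the maximum for 4 species)
    print(shannon_entropy(skewed))  # about 0.168 (near the minimum of 0)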

> |MB:Helmholtz called the different microstates of a system its
> |"complexions."
> |Complexity/diversity is good for the biosphere,
>
> SG:Complexity is not 'good' for a biosphere, it is merely a consequence of
> a well adapted living system. When the system can evolve without
> disturbance, it will naturally evolve into its most complex form under the
> given circumstances. A neighbouring system might have the same tendencies,
> and thus disturb the first system, initiating a competition for sources of
> free energy. The outcome of such a quarrel is dependent on many factors,
> but if the outer conditions for both systems don't change drastically,
> they will both have to compromise between adaptation and dominance. As was
> once said, "there is no good or bad, just consequences."
>
MB:Complexity is good for any level of biological entity, from cell to
organism to biosphere, because it allows multiple strategies of adaptation
to internal or external perturbations.  Mono-cropping puts farmers at
extreme risk of being wiped out by insects, for example.  Don't put all your
eggs in one basket.  Fungi have evolved enzymes for digesting wood, which is
composed of lignins.  Starches and other carbohydrate polymers are easy to
digest because they are composed of repeating units.  The trees responded by
evolving randomly connected lignins, which are hard to digest because of
their complexity.  HIV has a reverse transcriptase which makes mistakes at a
rate 10,000 times that of standard DNA polymerases.  Thus, its coat proteins
change constantly, making it more difficult for the immune system to
recognize and destroy it.  This is good for the virus, but bad for us.


> |MB:but it [complexity] is hell for the
> |exploiter of resources.  The exploiter desires simplicity, and complexity
> |frustrates attempts at exploitation.
>
> SG: Granted, for a very simplistic view of 'exploiters' this is true. The
> magic word is, again, adaptation. An exploiter which is not capable of
> extracting the resources in a well-adapted manner, so as not to destroy too
> much of the nourishing systems of the environment, will inevitably suffer.
>
MB:Look at it this way:  If you were a gold miner, would you rather find a
nugget weighing 1000 lbs., or would you rather extract 1000 lbs. of gold from
a million lbs. of ore?  It seems pretty obvious to me that the exploiter
desires the simplest, lowest-entropy resource, to minimize the energy
required to exploit that resource.  A well-adapted exploiter either has a
way of finding and consuming low-entropy chunks (like a fish swallowing
another fish) or has an efficient mechanism for extracting a dispersed
resource from its matrix.  This all depends on scale:  bacteria can more
efficiently extract dispersed resources than can larger organisms.
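
To put rough numbers on the gold-miner picture, here is a back-of-envelope
sketch in Python of the ideal entropy of mixing, -sum(x ln x), treating mass
fractions as if they were mole fractions. It is purely illustrative; the
only point is that the dispersed resource carries far more mixing entropy
than the nugget, which is what makes it costly to extract.

    import math

    def mixing_entropy(fractions):
        # Ideal entropy of mixing per unit of material, in units of k
        # (or R per mole): -sum(x * ln x). A pure lump is mixed with
        # nothing, so its mixing entropy is zero.
        return -sum(x * math.log(x) for x in fractions if x > 0)

    nugget = [1.0]            # 1000 lbs of pure gold, nothing to unmix
    ore = [0.001, 0.999]      # 1000 lbs of gold dispersed in a million lbs of ore

    print(mixing_entropy(nugget))  # 0.0
    print(mixing_entropy(ore))     # about 0.0079 per unit, over a million lbs of material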

> |MB:That is why entropy is thought to be
> |"bad" by physical scientists, because the entropy law poses limits on the
> |exploitation of nature.
>
> SG:Hell, no! I don't know what kind of physicists you know, but here you
> have one that would never call a physical quantity "bad", or any other
> name for that matter. Well, I have cursed entropy once in a while, but
> only because I couldn't quite understand the concept when I first learned
> about it. But that is a different matter. Apart from that, entropy
> production measures irreversibility. Irreversibility is the key to the
> destruction of complexity, just as it is the key to its creation. Read any
> textbook on dissipative and/or evolutionary systems and you will find that
> many physicists actually praise the concept of entropy for its explanatory
> power.
>
MB:Entropy has been traditionally viewed by scientists as a measure or
source of disorder, decay, and degradation.  This is a negative judgment
based, I believe, on several factors:  (1) the fact that entropy limits the
extent to which humans can exploit nature; (2) an attachment to form and a
resistance to transformation (entropy comes from a Greek word meaning
transformation); and (3) our capitalist, closed-system thinking, which
values accumulation of wealth over giving it away.  I know that scientists
are becoming more enlightened with the paradigm shift toward chaos and
dissipative systems, but I don't think the negative view of entropy is
changing very fast across all the disciplines of science.

>
> |MB:Entropy is neither bad nor good, it just facilitates dispersion of
> |resources.
>
> SG:Not quite, entropy doesn't facilitate anything. It just measures. Does
> a meter facilitate the moving of an object? There is a second law of
> thermodynamics that describes irreversible processes, like dispersion. But
> it only states that dispersion, and thus entropy production, IS happening
> and that it can't be avoided. It doesn't say WHY it happens, but it states
> how it can be measured.
>
MB:There are two opposing tendencies in nature: entropy, which tends toward
dispersion and the breaking of connections, and entrainment, which tends
toward coherence and the making of connections.  Actually, coherence is
self-entrainment and entropy is entrainment with the surroundings, so
entrainment is the ur-process.  This is the paradigm of dissipative
entrainment, which is being developed by John Collier and me, so you
probably haven't heard of it yet.

> |MB:This is good for a cell that relies on
> |diffusion of nutrients, but it is bad for capitalists who want to
> |accumulate capital.  For the capitalist, entropy must appear as a
> |communist conspiracy.  Apply the entropy equation to the distribution of
> |wealth and you will understand what I am saying.
>
> SG: No comment. But remember: there are only two things on Earth that grow
> the more you try to give them away: love and entropy.
>
MB:That's because Entropy is Love, baby: give more and take less.  By the
way, if you think entropy has nothing to do with the political economy,
think again.  I recommend you read "The Entropy Law and the Economic
Process" by Georgescu-Roegen.
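
Taking the quoted suggestion literally: the same -sum(p ln p) formula from
the diversity sketch above can be applied to shares of total wealth. The
figures below are hypothetical five-person economies, nothing more.

    import math

    def wealth_entropy(shares):
        # Entropy of a wealth distribution: -sum(s * ln s) over each holder's
        # share of the total. Equal shares maximize it; concentration minimizes it.
        return -sum(s * math.log(s) for s in shares if s > 0)

    egalitarian = [0.2] * 5                          # wealth spread evenly
    concentrated = [0.92, 0.02, 0.02, 0.02, 0.02]    # one holder owns nearly everything

    print(wealth_entropy(egalitarian))   # ln(5), about 1.609 (the maximum)
    print(wealth_entropy(concentrated))  # about 0.390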

Mark Burch, CEO (Chief Entropic Officer), Institute for Entropic
Consciousness