Calvin, W.H. (1997). The Six Essentials? Minimal Requirements for the Darwinian
Bootstrapping of Quality.
Journal of Memetics - Evolutionary Models of Information Transmission, 1.
http://www.fmb.mmu.ac.uk/jom-emit/1997/vol1/calvin_wh.html
This is the author's original formatting; if you prefer
the numbered paragraphs of technical manuals,
switch to reading the official version from the
Journal of Memetics web site.
THE SIX ESSENTIALS?
Minimal Requirements for the
Darwinian Bootstrapping of Quality
William H. Calvin
University of Washington
Seattle WA 98195-1800 USA
WCalvin@U.Washington.edu
http://www.WilliamCalvin.com
Abstract: Selectionism emphasizes carving patterns, and memes remind us of minimal replicable patterns, but a full-fledged Darwinian process needs six essential ingredients to keep going, to recursively bootstrap quality from rude beginnings. There may be situations ("sparse Darwinism") in which a reduced number suffice; another five ingredients, while not essential, greatly enhance the speed and stability of a Darwinian process. While our best examples are drawn from species evolution, the immune response, and evolutionary epistemology, the Darwinian process may well be a major law of the universe, right up there with chemical bonds as a prime generator of interesting combinations that discover stratified stabilities.
Since Richard Dawkins' The
Extended Phenotype got me to thinking about copying units in the
mid-1980s, I have been trying to define a cerebral code (the spatiotemporal
firing pattern that represents a word, image, metaphor, or even a sentence)
by searching for what can be successfully replicated in the brain's neural
circuitry, a minimum replicable unit.
I indeed found such circuitry (it implies that the firing pattern within several hundred minicolumns of
neocortex, contained in a 0.5 mm hexagon, is such a copying unit). But to explore creativity in higher intellectual function, I wanted
to see if the resulting copies could compete in a Darwinian manner, the
process shaping up quality as it goes. And that forced me to try to boil
down a lot of evolutionary biology, abstracting the features
that were essential (for what I came to call "the full-fledged Darwinian
process") from those that merely contributed to speed or stability.
This isn't the place to describe the neural outcome -- it's in
The Cerebral Code
and, more briefly, in the seventh chapter of my other 1996 book, How
Brains Think -- but this does seem an appropriate place to review
what I started calling "The Six Essentials." They seem applicable
to a wide range of problems within memetics [9]
as the field attempts to cope with evolutionary models of information transmission. For a more general history of memetics, see the useful bibliographies [22] of
McMullin, Speel, and Wilkins; I will only mention a few (mostly cautionary!) contributions from neuroscience along the way.
Selectionism
Looking back into the history of biology, it appears that wherever a phenomenon resembles learning, an instructive theory was first proposed to account for the underlying mechanisms. In every case, this was later replaced by a selective theory. Thus the species were thought to have developed by learning or by adaptation of individuals to the environment, until Darwin showed this to have been a selective process. Resistance of bacteria to antibacterial agents was thought to be acquired by adaptation, until Luria and Delbrück showed the mechanism to be a selective one. Adaptive enzymes were shown by Monod and his school to be inducible enzymes arising through the selection of preexisting genes. Finally, antibody formation that was thought to be based on instruction by the antigen is now found to result from the selection of already existing patterns. It thus remains to be asked if learning by the central nervous system might not also be a selective process; i.e., perhaps learning is not learning either.
Niels K. Jerne, 1967
The term "selectionism" covers a wide range of cases, from fancy biology with sexual selection to examples that are called "selective survival" because they lack any notion of replication. Brain development offers many examples of this simple end of selectionism.
Sparse Darwinian Possibilities
There are two "halfway houses" which may prove to be more
interesting than environmental carving of patterns. First, since brain
development (to continue the earlier story) is never really over (it just
slows down, and the gene repertoire may shift), and since new synapses
may form during adulthood, one is initially reminded of a biological population
with replacement and growth -- and Darwinian shaping up. But observe that
there isn't a pattern being replicated with variations; there isn't a population
of such patterns competing with other patterns, etc. -- which is what population
usually means, not merely a number of constituents that make up the pattern
being carved.
While Gerald Edelman (in his 1987 book, Neural Darwinism; see
my book review in Science [7]) has such a
population lacking patterned individuals, he goes beyond selectionist carving
in an interesting, nontraditional way: he has a notion of interacting maps,
that shape up one another in a manner rather like the sometimes creative back-and-forth interactions
between author and editor (my analogy, not his -- as is my perhaps shopworn
name for it, revisionism). I have a difficult time identifying either
an individual unit, or a distinctive copying mechanism for it, in Edelman's
lots-of-neurons notion of a "population," even if his re-entrant
loops are reminiscent of generations. His differential amplification via
re-entrant loops, however, is undoubtedly an important process (I particularly
like it for the consolidation of episodic memories [20]).
On closer inspection, neither developmental patterning nor Edelman's
reentry fits my concept of Darwin's process. Populations -- in ecology
and evolutionary biology and immunology -- usually involve lots of patterned
individuals somehow making near copies of themselves, all present at the
same time, interacting with one another and with the environment.
Yet analogies always leave something out; we don't expect them to be
perfect fits, exactly the same thing. As the poet Robert Frost once said,
we have to know how far we can ride a metaphor, judge when it's safe. That's
exactly our problem in memetics, and why Edelman's notions have proved
controversial. When, then, are we forced to ascribe, to a candidate such as developmental
patterning or reentry, the potential for the recursive bootstrapping
of quality that we associate with Darwin's process, which we regularly
see operating on the biological species and the antibody?
To approach an answer to that question, it will be useful to enumerate what has contributed
to Darwin's process, while trying to strip it of the biological particulars
-- and then ask how well it could limp along with a reduced number of components
(what I've started calling "sparse Darwinism").
The Full-fledged Darwinian Process
The six essentials aren't a settled issue. What I was aiming for, however, were the essential ingredients of an algorithmic quality-improvement process [20], stated in a way that didn't impose a lot of biological preconditions. I
wanted, for example, to avoid making use of the genotype-phenotype [21] distinction, or a universal translation table like the genetic code;
I wanted to describe a process, not make an analogy. John Holland's computational technique [10] known as the "genetic algorithm" comes close to what I had in mind, but Holland was trying to mimic recombination genetics in a computational procedure for discovering solutions, and I wanted to abstract more general principles
that avoided the presumption of recombination.
Since many of us think that (properly defined) the Darwinian process is a major law of the universe, right up there with chemical bonds as
a prime generator of interesting combinations that discover stratified
stabilities [2], we want it to be able to run on different substrates, each
with their own distinctive properties that may, or may not, correspond
to those seen elsewhere. So our abstraction should fit the species evolution
problem, as well as the immune response, but also be independent of media
and time scale. Here, paraphrased from The Cerebral
Code, is what I ended up with:
1. There must be a pattern involved.
2. The pattern must be copied somehow (indeed, that which is copied may serve to define the pattern). [Together, 1 and 2 are the minimum replicable unit -- so, in a sense, we could reduce six essentials to five. But I'm splitting rather than lumping here because so many "sparse Darwinian" processes exhibit a pattern without replication.]
3. Variant patterns must sometimes be produced by chance -- though it need not be purely random, as another process could well bias the directionality of the small sidesteps that result. Superpositions and recombinations will also suffice.
4. The pattern and its variant must compete with one another for occupation of a limited work space. For example, bluegrass and crab grass compete for back yards. Limited means the workspace forces choices, unlike a wide-open niche with enough resources for all to survive. Observe that we're now talking about populations of a pattern, not one at a time.
5. The competition is biased by a multifaceted environment: for example, how often the grass is watered, cut, fertilized, and frozen, giving one pattern more of the lawn than another. That's Darwin's natural selection.
6. New variants always preferentially occur around the more successful of the current patterns. In biology, there is a skewed survival to reproductive maturity (environmental selection is mostly juvenile mortality) or a skewed distribution of those adults who successfully mate (sexual selection). This is what Darwin later called an inheritance principle. Variations are not just random jumps from some standard starting position; rather, they are usually little sidesteps from a pretty-good solution (most variants are worse than a parent, but a few may be even better, and become the preferred source of further variants).
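For readers who think better in code, here is a minimal sketch of the six essentials at work. It is my illustration only (not something from The Cerebral Code), written in Python with an invented scoring function standing in for the multifaceted environment; the names, numbers, and target string are all assumptions made for the sake of a runnable toy.

    import random

    WORKSPACE = 30                      # essential 4: a limited work space forces competition
    TARGET = "stratified stability"     # stand-in for the multifaceted environment (essential 5)
    ALPHABET = "abcdefghijklmnopqrstuvwxyz "

    def fitness(pattern):
        """Environmental bias: how well a pattern fits its surroundings (essential 5)."""
        return sum(a == b for a, b in zip(pattern, TARGET))

    def copy_with_variation(pattern):
        """Essentials 2 and 3: copying, with occasional chance variation (small sidesteps)."""
        chars = list(pattern)
        if random.random() < 0.3:                    # most copies are faithful
            chars[random.randrange(len(chars))] = random.choice(ALPHABET)
        return "".join(chars)

    # Essential 1: a pattern -- here, a population of random strings fills the workspace.
    population = ["".join(random.choice(ALPHABET) for _ in TARGET)
                  for _ in range(WORKSPACE)]

    for generation in range(500):
        # Essential 6: new variants are made preferentially from the more successful patterns.
        parents = sorted(population, key=fitness, reverse=True)[:WORKSPACE // 3]
        # Essential 4: the offspring must compete for the same limited workspace.
        population = [copy_with_variation(random.choice(parents))
                      for _ in range(WORKSPACE)]

    print(max(population, key=fitness))

Run as written, the loop usually bootstraps gibberish toward the target within a few hundred generations; remove the preferential copying of essential 6 (pick parents at random instead), and it degenerates into a random walk -- which is the point of insisting on all six.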
Neural patterning in development is a sparse case: just a pattern and a multifaceted environment. There is no replication of the pattern, no variation, no population of the pattern to compete with a variant's population, and there's nothing recursive about achieving quality because there's no inheritance principle.
History qua history -- what it includes, what it leaves out, and how these change over time -- provides us with a memetic example of these six essentials at work. Of the many happenings, some are captured in patterned sentences that describe who did what to whom, why, and with what means.
Some of these patterns are retold (copied), often with little confusions (variation) and conflations (superpositions). Alternative versions of stories compete for the limited space of bookstore shelves or the limited time of campfire storytelling. There is a multifaceted environment that affects their success, the association of the described events with those of everyday life. In particular, the environment contains mental schemas and scripts; as Aristotle noted and all four-year-olds demanding bedtime stories seem to know, a proper narrative has a beginning, middle, and end -- and so "good stories" fare much better in the memorized environment. (Especially those conveyed by historical novels that strengthen the narrative aspects!) Finally, because historians rewrite earlier historians, we see Darwin's inheritance principle in action: new variations are preferentially based on the more successfully copied of the current generation of historical stories, and so history drifts toward better and better fits to language instincts (such as chunking and narratives) because current relevance is shifting and ephemeral. After many generations, only those stories of timeless relevance are left alongside the likely-ephemeral contemporary ones.
Nonessentials: Catalysts and Stabilizers
There are another five features that, while not essential, do notably influence the rate of evolutionary change:
7. Stability may occur, as in getting stuck in a rut (a local minimum -- or maximum -- in the adaptational landscape). Variants happen, but they're either nonviable or backslide easily.
8. Systematic recombination (crossing over, sex) generates many more variants than do copying errors and the far-rarer point mutations. Or, for that matter, nonsystematic recombination such as bacterial conjugation or the conflation of ideas.
9. Fluctuating environments (seasons, climate changes, diseases) change the name of the game, shaping up more complex patterns capable of doing well in several environments. For such jack-of-all-trades selection to occur, the climate must change much faster than efficiency adaptations can track it (more in a minute).
10. Parcellation (as when rising sea level converts the hilltops of one large island into an archipelago of small islands) typically speeds evolution. It raises the surface-to-volume ratio (or perimeter-to-area ratio) and exposes a higher percentage of the population to the marginal conditions on the margins.
11. Local extinctions (as when an island population becomes too small to sustain itself) speed evolution because they create empty niches. The pioneers that rediscover the niche get a series of generations with no competition, enough resources even for the odder variants that would never grow up to reproduce under any competition. For a novel pattern, that could represent the chance to "establish itself" before the next climate change, for which it might prove better suited than the others.

There are also catalysts acting at several removes, as in Darwin's example of how the introduction of cats to an English village could improve the clover in the surrounding countryside: The (i) cats would (ii) eat the mice that (iii) attack the bumble bee nests and thus (iv) allow more flowers to be cross pollinated. (You can see why I call these the "Rube Goldberg Variations.")
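To show how the catalysts change the tempo rather than the logic, the earlier sketch can be extended -- again only as my own illustrative Python toy, with invented islands, rates, and targets -- so that the one workspace becomes an archipelago (#10), the scoring target flips periodically (#9), and islands occasionally go extinct and are re-pioneered from a neighbour (#11).

    import random

    ISLANDS = 4          # parcellation (#10): several small workspaces instead of one large one
    SLOTS = 8            # each island is small, so odd variants are more exposed to the margins
    TARGETS = ["stratified stability", "stability stratified"]   # a fluctuating environment (#9)
    ALPHABET = "abcdefghijklmnopqrstuvwxyz "

    def fitness(pattern, target):
        return sum(a == b for a, b in zip(pattern, target))

    def copy_with_variation(pattern):
        chars = list(pattern)
        if random.random() < 0.3:
            chars[random.randrange(len(chars))] = random.choice(ALPHABET)
        return "".join(chars)

    islands = [["".join(random.choice(ALPHABET) for _ in TARGETS[0]) for _ in range(SLOTS)]
               for _ in range(ISLANDS)]

    for generation in range(600):
        target = TARGETS[(generation // 50) % 2]      # the climate flips every 50 generations
        for i in range(ISLANDS):
            if random.random() < 0.01:                # local extinction (#11): the niche empties...
                pioneers = islands[(i + 1) % ISLANDS] # ...and is recolonized from a neighbour,
                islands[i] = [copy_with_variation(random.choice(pioneers))
                              for _ in range(SLOTS)]  # with no competition for this round
                continue
            ranked = sorted(islands[i], key=lambda p: fitness(p, target), reverse=True)
            islands[i] = [copy_with_variation(random.choice(ranked[:SLOTS // 2]))
                          for _ in range(SLOTS)]

    best = max((p for island in islands for p in island), key=lambda p: fitness(p, target))
    print(best)

Which catalysts matter, and how much, is an empirical question for any given substrate; the toy only shows that they slot into the same six-ingredient loop without altering it.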
The Augmented Darwinian Set
Although a Darwinian process will run without these catalysts, using
Darwinian creativity often requires some optimization for speed. In the
behavioral setting I analyze in my two 1996 books, quality must be achieved
within the time span of thought and action.
Accelerating factors are the problem in what the French call avoir
l'esprit de l'escalier -- finally thinking of a witty reply, but only
after leaving the party. Some accelerating factors are almost essential
in mental darwinism, simply because of the time windows created by fleeting
opportunities, and so this "augmented Darwinian set" may also prove
to be important for other memetic applications of the universal Darwinian
process.
I am proposing a standard Darwinian set (six ingredients,
in my formulation), with nonstandard cases often described via the sparse
and augmented sets. As with Edelman's reentry, some cases may be both sparse
and have a novel feature like revisionism (mixed cases). I was delighted to discover that my (neocortical circuitry) candidate process
was not only capable of implementing all six essentials, but stability
and the four catalysts as well.
At what point can we carry over the traditional implications of the best-studied case, the species-evolution Darwinian process, to a candidate
process? My present answer would be: When the six essentials are present,
and no obvious stability or relative-rate issue seems to be precluding
"progress," we are then entitled to predict that our candidate
process is capable of repeatedly bootstrapping quality.
The extent of "coverage" of memetic theories varies widely.
For example, I was able to spend much of my last chapter of The Cerebral Code discussing the Darwinian implications of minor circuit malfunctions for a broad range of pathophysiology such as
seizures, hallucinations, delusions, amnesia, déjà vu, and
so forth. My point is that candidate processes in other memetic fields are also likely to
be judged by similar nonevolutionary considerations, so we must remember
that possessing the six essentials is only a "threshold" consideration,
mostly relevant to the sorts of quality that can be bootstrapped -- and
for how long that improvement can continue.
Stratified Stability and Relative Rates
One coverage issue that seems relevant, however, is whether new levels
of organization emerge from the candidate evolutionary process. Can, for
example, a candidate process form categories? Can it progress to evolving
analogies [4] or metaphors? Are these new levels ephemeral, or stable for a while?
Jacob Bronowski spoke of "stratified
stability" and observed [2], "The stable units that compose one level
or stratum are the raw material for random encounters which produce higher
configurations, some of which will chance to be stable.... Evolution is
the climbing of a ladder from simple to complex by steps, each of which
is stable in itself." Does the process self-limit when reaching an
angle of repose [18], so that piling on another layer
is self-defeating? Does the process backslide in a catastrophic manner,
requiring something like the Weismannian barrier between genotype
and phenotype [21] to provide a ratchet?
Relative rates play an important role in any process involving change,
one that can trivialize or magnify. Relative rates of expansion are the
major principle underlying most household bimetallic-layer thermostats,
and it is a familiar principle in development (the way curved surfaces
are made is to have two sheets of cells in contact, one growing faster
than the other). And we've already seen two examples here: the history
example of episodic memories that faded quickly when compared to the generation
time and lifespan, and again in #9 where climate changes had to be much
faster than adaptations could track them, if jack-of-all-trades abilities were
to accumulate in the face of competition from lean, mean machines.
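A toy calculation, with payoff numbers invented purely for illustration, makes the relative-rates point concrete: a specialist that needs twenty generations to re-tune after each climate flip is compared with a jack-of-all-trades that never excels but never has to re-tune.

    # A hypothetical comparison of relative rates. The specialist scores 1.0 once re-adapted
    # to the current climate but only 0.2 during the ADAPT_TIME generations it spends
    # re-tuning after each flip; the jack-of-all-trades scores a steady 0.7 throughout.
    ADAPT_TIME = 20

    def average_scores(flip_every, generations=1000):
        specialist = generalist = 0.0
        for g in range(generations):
            since_flip = g % flip_every
            specialist += 1.0 if since_flip >= ADAPT_TIME else 0.2
            generalist += 0.7
        return specialist / generations, generalist / generations

    for flip_every in (10, 50, 200):
        s, j = average_scores(flip_every)
        print(f"climate flips every {flip_every:3d} generations: "
              f"specialist {s:.2f} vs jack-of-all-trades {j:.2f}")

With these made-up numbers, the generalist comes out ahead only when the climate flips about as fast as, or faster than, the specialist can re-adapt (the 10- and 50-generation cases); when the climate holds steady for 200 generations, the lean, mean specialist wins again.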
Repackaging the Essentials
Someone will surely try to condense my six essentials to a phrase more
memorable than "a pattern that copies with occasional
variation, where populations of the variants compete for
a limited workspace, biased by a multifaceted environment, and with
the next round of variations preferentially done from the more successful of the
current generation." Indeed, Alfred Russel
Wallace did a pretty good job back in 1875 ("...the known laws
of variation, multiplication, and heredity... have probably sufficed....") [14].
It's just that I make explicit the pattern, the work space competition
between populations, and the environmental biases. As noted in #2, I'm
trying to avoid lumping where I know that splitting is going to be required
later, to deal with some important partial cases. Wallace shows us that
only three items cover a lot of essential ground, and there are surely
other profitable ways to split and lump, if context allows the inference
of other factors. A list of essentials -- at least one that aspires to
some universality -- can't omit the context, can't skip over the potential
confusions. Bronowski once observed [1] that,
even if six sentences might serve to sum up one of his lectures, the rest
of the hour was really essential for disambiguating the meaning of those
summary sentences. The name of the game here isn't compression but abstraction,
an abstraction that is just general enough to cover a number of situations
that differ widely in media and time scale -- but not so abstract as to
lose important associations.
Of course, all the definition in the world can be upset by one little
existence proof, a simulation of a self-organizing quality bootstrap that
runs on a different set of principles. Until then, we are simply trying
to clarify our thinking about the creation-of-quality process we know best, the one
first stumbled upon by Charles Darwin.