Tag: Biology

Frozen Accidents: Why the Future Is So Unpredictable

“Each of us human beings, for example, is the product of an enormously long
sequence of accidents,
any of which could have turned out differently.”
— Murray Gell-Mann

***

What parts of reality are the product of an accident? The physicist Murray Gell-Mann thought the answer was “just about everything.” And to Gell-Mann, understanding this idea was the key to understanding how complex systems work.

Gell-Mann believed two things caused what we see in the world:

  1. A set of fundamental laws
  2. Random “accidents” — the little blips that could have gone either way and that, had they gone differently, would have produced a very different kind of world.

Gell-Mann pulled the second part from Francis Crick, co-discoverer of the structure of DNA, who argued that the genetic code itself may well have been an “accident” of physical history rather than a uniquely necessary arrangement.

These accidents become “frozen” in time and have a great effect on all subsequent developments. Complex life itself is an example: it did happen a certain way, but as far as the physics is concerned, it probably could have happened in other ways.

This idea of fundamental laws plus accidents, and the non-linear second-order effects those accidents produce, grew into the modern sciences of complexity and chaos.

Gell-Mann discussed the fascinating idea further in a 1996 essay on Edge:

Each of us human beings, for example, is the product of an enormously long sequence of accidents, any of which could have turned out differently. Think of the fluctuations that produced our galaxy, the accidents that led to the formation of the solar system, including the condensation of dust and gas that produced Earth, the accidents that helped to determine the particular way that life began to evolve on Earth, and the accidents that contributed to the evolution of particular species with particular characteristics, including the special features of the human species. Each of us individuals has genes that result from a long sequence of accidental mutations and chance matings, as well as natural selection.

Now, most single accidents make very little difference to the future, but others may have widespread ramifications, many diverse consequences all traceable to one chance event that could have turned out differently. Those we call frozen accidents.

These “frozen accidents” occur at every nested level of the world: as Gell-Mann points out, they show up in physics (the physical laws we observe may be accidents of history), in biology (our genetic code is largely a byproduct of “advantageous accidents,” as discussed by Crick), and in human history, as we’ll discuss. In other words, the phenomenon hits all three buckets of knowledge.

Gell-Mann gives a great example of how this plays out on the human scale:

For instance, Henry VIII became king of England because his older brother Arthur died. From the accident of that death flowed all the coins, all the charters, all the other records, all the history books mentioning Henry VIII; all the different events of his reign, including the manner of separation of the Church of England from the Roman Catholic Church; and of course the whole succession of subsequent monarchs of England and of Great Britain, to say nothing of the antics of Charles and Diana. The accumulation of frozen accidents is what gives the world its effective complexity.

The most important idea here is that the frozen accidents of history have a nonlinear effect on everything that comes after. The complexity we see comes from simple rules and many, many “bounces” that could have gone in any direction. Once they go a certain way, there is no return.

This principle is illustrated wonderfully in the book The Origin of Wealth by Eric Beinhocker. The first example comes from 19th-century history:

In the late 1800s, “Buffalo Bill” Cody created a show called Buffalo Bill’s Wild West Show, which toured the United States, putting on exhibitions of gun fighting, horsemanship, and other cowboy skills. One of the show’s most popular acts was a woman named Phoebe Moses, nicknamed Annie Oakley. Annie was reputed to have been able to shoot the head off of a running quail by age twelve, and in Buffalo Bill’s show, she put on a demonstration of marksmanship that included shooting flames off candles, and corks out of bottles. For her grand finale, Annie would announce that she would shoot the end off a lit cigarette held in a man’s mouth, and ask for a brave volunteer from the audience. Since no one was ever courageous enough to come forward, Annie hid her husband, Frank, in the audience. He would “volunteer,” and they would complete the trick together. In 1880, when the Wild West Show was touring Europe, a young crown prince (and later, kaiser), Wilhelm, was in the audience. When the grand finale came, much to Annie’s surprise, the macho crown prince stood up and volunteered. The future German kaiser strode into the ring, placed the cigarette in his mouth, and stood ready. Annie, who had been up late the night before in the local beer garden, was unnerved by this unexpected development. She lined the cigarette up in her sights, squeezed…and hit it right on the target.

Many people have speculated that if at that moment, there had been a slight tremor in Annie’s hand, then World War I might never have happened. If World War I had not happened, 8.5 million soldiers and 13 million civilian lives would have been saved. Furthermore, if Annie’s hand had trembled and World War I had not happened, Hitler would not have risen from the ashes of a defeated Germany, and Lenin would not have overthrown a demoralized Russian government. The entire course of twentieth-century history might have been changed by the merest quiver of a hand at a critical moment. Yet, at the time, there was no way anyone could have known the momentous nature of the event.

This isn’t to say that other big events, many of them bad, would not have occurred in the 20th century. Almost certainly, there would have been wars and upheavals.

But the actual course of history was, in some part, determined by a small chance event that seemed unimportant when it happened. The impact of Wilhelm being alive rather than dead was totally non-linear: a small non-event had a massively disproportionate effect on everything that came later.

This is why predicting the future, even with immense computing power, is an impossible task. The chaotic effects of randomness, in which small inputs have massive and disproportionate effects, make reliable prediction intractable. That’s why we must appreciate the role of randomness in the world and seek to protect against it.
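The dynamic is easy to demonstrate. Below is a minimal sketch (ours, not Gell-Mann’s) using the logistic map, a textbook chaotic system: two runs whose starting points differ by one part in a billion become completely uncorrelated within a few dozen steps, which is the problem of prediction in miniature.

```python
# Minimal sketch: sensitive dependence on initial conditions in the
# logistic map x_{n+1} = r * x_n * (1 - x_n), a textbook chaotic system.
# Two trajectories that start a hair apart end up nowhere near each other.

def logistic_trajectory(x0, r=4.0, steps=40):
    """Iterate the logistic map from x0 and return the trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000000)   # the "accident" goes one way...
b = logistic_trajectory(0.400000001)   # ...or a billionth differently

for n in (0, 10, 20, 30, 40):
    print(f"step {n:2d}: {a[n]:.6f} vs {b[n]:.6f}   gap {abs(a[n] - b[n]):.2e}")
# The gap roughly doubles each step, so by around step 30 it is of order 1:
# the tiny initial difference has "frozen in" and the two futures are
# effectively unrelated. More measurement precision only buys a few
# extra steps of foresight.
```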

Another great illustration from The Origin of Wealth is a famous story in the world of technology:

[In 1980] IBM approached a small company with forty employees in Bellevue, Washington. The company, called Microsoft, was run by a Harvard dropout named Bill Gates and his friend Paul Allen. IBM wanted to talk to the small company about creating a version of the programming language BASIC for the new PC. At their meeting, IBM asked Gates for his advice on what operating systems (OS) the new machine should run. Gates suggested that IBM talk to Gary Kildall of Digital Research, whose CP/M operating system had become the standard in the hobbyist world of microcomputers. But Kildall was suspicious of the blue suits from IBM and when IBM tried to meet him, he went hot-air ballooning, leaving his wife and lawyer to talk to the bewildered executives, along with instructions not to sign even a confidentiality agreement. The frustrated IBM executives returned to Gates and asked if he would be interested in the OS project. Despite never having written an OS, Gates said yes. He then turned around and licensed a product appropriately named Quick and Dirty Operating System, or Q-DOS, from a small company called Seattle Computer Products for $50,000, modified it, and then relicensed it to IBM as PC-DOS. As IBM and Microsoft were going through the final language for the agreement, Gates asked for a small change. He wanted to retain the rights to sell his DOS on non-IBM machines in a version called MS-DOS. Gates was giving the company a good price, and IBM was more interested in PC hardware than software sales, so it agreed. The contract was signed on August 12, 1981. The rest, as they say, is history. Today, Microsoft is a company worth $270 billion while IBM is worth $140 billion.

At any point in that story, business history could have gone a very different way: Kildall could have skipped the hot-air ballooning, IBM could have refused Gates’ offer, Microsoft could have failed to get the license for Q-DOS. Yet this little episode resulted in massive wealth for Gates and a long period of trouble for IBM.

Predicting the outcomes of a complex system must clear a pretty major hurdle: The prediction must be robust to non-linear “accidents” with a chain of unforeseen causation. In some situations, this is doable: We can confidently predict that Microsoft will not go broke in the next 12 months; the probability of any chain of events taking it under that quickly is so low as to be negligible, no matter how you compute it. (Even IBM made it through the above scenario, although not unscathed.)

But as history rolls on and more “accidents” accumulate year by year, a “Fog of the Future” rolls in to obscure our view. To operate in such a world, we must learn that predicting is inferior to building systems that don’t require prediction, as Mother Nature does. And if we must predict, we must confine our predictions to areas with few variables that lie in our circle of competence, and understand the consequences if we’re wrong.

If this topic is interesting to you, try exploring the rest of The Origin of Wealth, which discusses complexity in the economic realm in great (but readable) detail; also check out the rest of Murray Gell-Mann’s essay on Edge. Gell-Mann also wrote a book on the topic, The Quark and the Jaguar, which is also worth reading. The best writer on randomness and robustness in the face of an uncertain future is, of course, Nassim Taleb, whom we have written about many times.

What Biology Enables, Culture Forbids

We get a little confused when deciding if a particular human behavior is cultural or biological. Is homosexuality a natural act or unnatural? How about Facebook? Is it unnatural human behavior? Abortion? Non-procreative sex? Slavery? Mixing of races?

Many of these are explicitly taboo subjects, or at least border on it: they may not be discussed in polite company, even where open discussion is otherwise encouraged.

Yet for those of us seeking to understand reality as it is, to deeply understand the most important buckets of knowledge, taboo is no reason to avoid the hard subjects.

So how should we think about this?

“From a biological perspective, nothing is unnatural. Whatever is possible is by definition also natural.”

— Yuval Harari

Professor Yuval Harari, who has previously taught us why humans dominate the earth and about the false “natural state” of man, has an interesting take, discussed in his book Sapiens: A Brief History of Humankind. The chapter is aptly titled “There is No Justice in History.”

Professor Harari’s well-informed heuristic boils down to: Biology Enables. Culture Forbids.

How can we distinguish what is biologically determined from what people merely try to justify through biological myths? A good rule of thumb is ‘Biology enables, culture forbids.’ Biology is willing to tolerate a very wide spectrum of possibilities. It’s culture that obligates people to realize some possibilities while forbidding others. Biology enables women to have children — some cultures oblige women to realize this possibility. Biology enables men to enjoy sex with one another — some cultures forbid them to realize this possibility.

Culture tends to argue that it forbids only that which is unnatural. But from a biological perspective, nothing is unnatural. Whatever is possible is by definition also natural. A truly unnatural behavior, one that goes against the laws of nature, simply cannot exist, so it would need no prohibition.

[…]

…Evolution has no purpose. Organs have not evolved with a purpose, and the way they are used is in constant flux. There is not a single organ in the human body that only does the job its prototype did when it first appeared hundreds of millions of years ago. Organs evolve to perform a particular function, but once they exist, they can be adapted for other usages as well. Mouths, for example, appeared because the earliest multicellular organisms needed a way to take nutrients into their bodies. We still use our mouths for that purpose, but we also use them to kiss, speak, and, if we are Rambo, to pull the pins out of hand grenades. Are any of these uses unnatural simply because our worm-like ancestors 600 million years ago didn’t do those things with their mouths?

Our biology gives us a very wide playground and a lot of leeway. We’re capable of a wide variety of activities and forms of organization, while other species generally fall into far more fixed and predictable hierarchies.

Over the course of history, humans have taken advantage of this wide range in a variety of positive and negative ways by creating and sustaining myths not supported by biological reality.

Take slavery, once a common practice throughout the world and now thankfully outlawed and considered a scourge everywhere on the planet. Or the caste system, still in place in certain areas of the world, although perhaps less strictly than in the past.

Both slavery and the caste system were carried out through a series of pseudoscientific rationalizations about the “natural order” of things, stories strong enough to be believed (in part) by all constituents of the hierarchy. This “forbidding” aspect of culture was not supported by biological differences, but that didn’t make the stories any less powerful or believable.

Even the American political system, ostensibly founded on a bedrock of “liberty and equality”, only provided those things to certain small groups. The Founders used cultural myths to rationalize a deeply divided society in which men had dominion over women, European whites had dominion over blacks and the native people, and the historically rich had dominion over the historically poor. Any other order would have been “unnatural”:

The American order consecrated the hierarchy between the rich and poor. Most Americans at that time had little problem with the inequality caused by wealthy parents passing their money and businesses onto their children. In their view, equality meant simply that the same laws applied to rich and poor. It had nothing to do with unemployment benefits, integrated education or health insurance. Liberty, too, carried very different connotations than it does today. In 1776, it did not mean that the disempowered (certainly not blacks or Indians or, God forbid, women) could gain and exercise power. It meant simply that the state could not, except in unusual circumstances, confiscate a citizen’s private property or tell him what to do with it. The American order thereby upheld the hierarchy of wealth, which some thought was mandated by God and others viewed as representing the immutable laws of nature. Nature, it was claimed, rewarded merit with wealth while penalizing indolence.

All the above-mentioned distinctions — between free persons and slaves, between whites and blacks, between rich and poor — are rooted in fictions…Yet it is an iron rule of history that every imagined hierarchy disavows its fictional origins and claims to be natural and inevitable. For instance, many people who have viewed the hierarchy of free persons and slaves as natural and correct have argued that slavery is not a human invention. Hammurabi saw it as ordained by the gods. Aristotle argued that slaves have a ‘slavish nature’ whereas free people have a ‘free nature’. Their status in society is merely a reflection of their innate nature.

This isn’t to argue that there aren’t biological differences between certain groups of people, including men and women. There are. But history has shown our tendency to exaggerate those differences and to create stories around our exaggerations, stories that uphold a certain desired hierarchy. These stories have a way of creating their own reality.

Just as frequently, we commit the opposite sin by restricting certain behavior based on some idea of what’s “natural” or “unnatural”, confusing biology with religious or cultural taboos. (And these myths die hard: It’s hard to fathom, but homosexuality was still a crime in England and Wales until 1967.) As Harari rightly points out, anything we can do is perfectly natural in the biological sense. We come well-equipped for a variety of behavior.

And this certainly isn’t to argue that all behavior is equally acceptable: We put bumpers on society to reduce murder, rape, slavery, and other vile behavior that is perfectly biologically natural to us, and we should.

But unless we recognize the difference between biology and cultural myth and seek to reduce our unfair taboos wherever possible, we fail to see the world through the eyes of others, and to see that our imagined order is not always a fair or just one, a natural or inevitable one. Some of the things we see around us may be historical accidents, if we look closely enough.

Even more than that, examining the relationship between biological reality and cultural myth allows us to appreciate our basic storytelling instincts. Human beings are wired for narrative: we’ve been called the Storytelling Animal, and for good reason. Our thirst for narrative, and our ready acceptance of it, is a basic part of our existence; it’s hard-wired into our genes.

Much of our narrative superpower can be observed in the structure of human language, which is unique among species in its infinite flexibility and adaptability. It makes us capable of great cooperative accomplishments, but also great evils.

Fortunately, the modern world has done a pretty good job steadily loosening the grip of mythical “natural” realities that only exist in our heads. But a fair inquiry remains: What sustaining myths still exist? Are they for good or for evil?

We leave that for you to ponder.

Check out Harari’s book Sapiens or his new book, Homo Deus.

***

If you liked this, you’ll love:

Why Humans Dominate the Earth: Myth-Making — It is our collected fictions that define us.

Religion and History: Will Durant on the Role of Religion and Morality — Religion’s ability to shape cultural behavior.

The False Allure of a “Natural State” of Man — The heated debate about Sapiens’ “natural way of life” is missing the point. Ever since the Cognitive Revolution, there hasn’t been a natural way of life for Sapiens.

Moving the Finish Line: The Goal Gradient Hypothesis

Imagine a middle-distance runner in a championship race. He’s competing in the 1600-meter run.

The first two laps he runs at a steady but hard pace, trying to keep himself consistently near the head, or at least the middle, of the pack, hoping not to fall too far behind while also conserving energy for the whole race.

About 800 meters in, he feels himself start to fatigue and slow. At 1000 meters, he feels himself consciously expending less energy. At 1200, he’s convinced that he didn’t train enough.

Now watch him approach the last 100 meters, the “mad dash” for the finish. He’s been running what would be an all-out sprint to us mortals for 1500 meters, and yet what happens now, as he feels himself neck and neck with his competitors, the finish line in sight?

He speeds up. The fatigue drops away. The goal is right there, and all he needs is one last push. So he pushes.

This is called the Goal Gradient Effect, or more precisely, the Goal Gradient Hypothesis. Its effect on biological creatures is not just a feeling, but a real and measurable thing.

The Math of Human Behavior

The first person to try explaining the goal gradient hypothesis was an early behavioral psychologist named Clark L. Hull.

Hull was a pretty hardcore behaviorist: he thought that human behavior, like that of other animals, could eventually be reduced to mathematical prediction based on rewards and conditioning. As insane as this sounds now, he had a neat mathematical formula for behavior:

[Image: Hull’s mathematical formula for behavior]

Some of his ideas eventually came to be seen as extremely limiting, Procrustean-bed-type models of human behavior, but the Goal Gradient Hypothesis was replicated many times over the years.

Hull himself wrote papers with titles like The Goal-Gradient Hypothesis and Maze Learning to explore the effect of the idea in rats. As Hull put it, “...animals in traversing a maze will move at a progressively more rapid pace as the goal is approached.” Just like the runner above.

Most of Hull’s work focused on animals rather than humans, showing fairly unequivocally that, when approaching a reward, animals did speed up as the goal neared, enticed by the end of the maze. The idea was, however, resurrected in the human realm in 2006 with a paper entitled The Goal-Gradient Hypothesis Resurrected: Purchase Acceleration, Illusionary Goal Progress, and Customer Retention.

The paper examined consumer behavior through the “goal gradient” lens and found that it isn’t just rats that feel the tug of the “end of the race” — we do too. Examining a few different measurable areas of human behavior, the researchers found that consumers would work harder to earn incentives as the goal came within sight, and that after the reward was earned, they’d slow down their efforts:

We found that members of a café RP accelerated their coffee purchases as they progressed toward earning a free coffee. The goal-gradient effect also generalized to a very different incentive system, in which shorter goal distance led members to visit a song-rating Web site more frequently, rate more songs during each visit, and persist longer in the rating effort. Importantly, in both incentive systems, we observed the phenomenon of post-reward resetting, whereby customers who accelerated toward their first reward exhibited a slowdown in their efforts when they began work (and subsequently accelerated) toward their second reward. To the best of our knowledge, this article is the first to demonstrate unequivocal, systematic behavioural goal gradients in the context of the human psychology of rewards.

Fascinating.
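To make the café result concrete, here is a hypothetical sketch of the kind of measurement involved. The data and names below are invented for illustration, not the paper’s: on a ten-stamp coffee card, a goal gradient shows up as shrinking gaps between purchases as the card fills.

```python
# Hypothetical goal-gradient check on a ten-stamp coffee card: if the
# hypothesis holds, the days between purchases shrink as the customer
# nears the free-coffee reward. All data below is invented.

from statistics import mean

# Days between successive purchases for a few made-up customers;
# index 0 = gap before the 2nd stamp, index 8 = gap before the 10th.
customers = [
    [6, 6, 5, 5, 4, 4, 3, 2, 2],
    [8, 7, 7, 6, 5, 5, 4, 3, 2],
    [5, 5, 5, 4, 4, 3, 3, 2, 1],
]

avg_gaps = [mean(c[i] for c in customers) for i in range(9)]
for stamp, gap in enumerate(avg_gaps, start=2):
    print(f"average gap before stamp {stamp:2d}: {gap:.1f} days")

# A goal gradient appears as a downward trend: purchases accelerate as
# the reward nears. The paper also reports "post-reward resetting" --
# the first gap on a fresh card jumps back up before shrinking again.
```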

Putting The Goal Gradient Hypothesis to Work

If we’re to take the idea seriously, the Goal Gradient Hypothesis has some interesting implications for leaders and decision-makers.

The first and most important is probably that incentive structures should take the idea into account. This is a fairly intuitive (but often unrecognized) idea: Far-away rewards are much less motivating than near-term ones. Given a chance to earn $1,000 at the end of this month, and each month after that, or $12,000 at the end of the year, which would you be more likely to work hard for?

What if I pushed it back even more but gave you some “interest” to compensate: Would you work harder for the potential to earn $90,000 five years from now, or to earn $1,000 this month, followed by $1,000 the following month, and so on, every single month during that five-year period?
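One way to put rough numbers on that intuition is hyperbolic discounting, the standard behavioral-economics model in which a reward of size A delayed by D periods feels worth roughly A / (1 + kD). The sketch below is ours, not from the research above, and the discount rate k is purely illustrative:

```python
# Sketch of why a stream of near-term rewards can beat a larger distant
# one, using the hyperbolic discounting model V = A / (1 + k*D). The
# discount rate k here is a made-up illustrative value, not an estimate.

def felt_value(amount, delay_months, k=0.5):
    """Subjective present value of `amount` received after a delay."""
    return amount / (1 + k * delay_months)

# Option A: $1,000 at the end of each month for five years ($60,000 total).
option_a = sum(felt_value(1_000, m) for m in range(1, 61))

# Option B: $90,000 in one lump sum five years (60 months) from now.
option_b = felt_value(90_000, 60)

print(f"felt value of $1,000/month for 5 years: ${option_a:,.0f}")
print(f"felt value of $90,000 at year 5:        ${option_b:,.0f}")
# With k = 0.5 the monthly stream feels worth more than twice the lump
# sum, even though the lump sum's face value is 50% larger. Felt value
# falls off steeply with delay, so frequent small rewards dominate.
```

The exact numbers depend entirely on k; the point is the shape of the curve.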

Companies like Nucor take the idea seriously: They pay bonuses to lower-level employees based on monthly production, not letting it wait until the end of the year. Essentially, the end of the maze happens every 30 days rather than once per year. The time between doing the work and the reward is shortened.

The other takeaway concerns consumer behavior, as referenced in the marketing paper. If you’re offering rewards for a specific action from your customers, do you reward them sooner, or later?

The answer is almost always going to be “sooner.” In fact, the effect may be strong enough that you can get away with a smaller total reward by increasing its velocity.

Lastly, we might be able to harness the Hypothesis in our personal lives.

Let’s say we want to start reading more. Do we set a goal to read 52 books this year and hold ourselves accountable, or to read 1 book a week? What about 25 pages per day?

Not only does moving the finish line closer tend to increase our motivation, but we repeatedly prove to ourselves that we’re capable of accomplishing our goals. This is classic behavioral psychology: instant rewards rather than delayed ones, even if they’re only psychological. It also forces us to avoid procrastination — leaving 35 books to be read in the last two months of the year, for example.

Those three seem like useful lessons, but here’s a challenge: Try synthesizing a new rule or idea of your own, combining the Goal Gradient Effect with at least one other psychological principle from The Psychology of Human Misjudgment, and start testing it out in your personal life or in your organization. Don’t let useful nuggets sit around; instead, start eating the broccoli.

How To Mentally Overachieve — Charles Darwin’s Reflections On His Own Mind

We’ve written quite a bit about the marvelous British naturalist Charles Darwin, who with his Origin of Species set off perhaps the most intense intellectual debate in human history, one which continues to this day.

Darwin’s Origin was a courageous and detailed thought piece on the nature and development of biological species. It’s the starting point for nearly all of modern biology.

But, as we’ve noted before, Darwin was not a man of pure IQ. He was not Isaac Newton, or Richard Feynman, or Albert Einstein — breezing through complex mathematical physics at a young age.

Charlie Munger thinks Darwin would have placed somewhere in the middle of a good private high school class. He was also in notoriously bad health for most of his adult life and, by his son’s estimation, a terrible sleeper. He really only worked a few hours a day in the many years leading up to the Origin of Species.

Yet his “thinking work” outclassed almost everyone. An incredible story.

In his autobiography, Darwin reflected on this peculiar state of affairs. What was he good at that led to the result? What was he so weak at? Why did he achieve better thinking outcomes? As he put it, his goal was to:

“Try to analyse the mental qualities and the conditions on which my success has depended; though I am aware that no man can do this correctly.”

In studying Darwin ourselves, we hope to better appreciate our own strengths and weaknesses, not to mention understand the working methods of a “mental overachiever.”

Let’s explore what Darwin saw in himself.

***

1. He did not have a quick intellect or an ability to follow long, complex, or mathematical reasoning. He may have been a bit hard on himself, but Darwin realized that he wasn’t a “5 second insight” type of guy (and let’s face it, most of us aren’t). His life also proves how little that trait matters if you’re aware of it and counter-weight it with other methods.

I have no great quickness of apprehension or wit which is so remarkable in some clever men, for instance, Huxley. I am therefore a poor critic: a paper or book, when first read, generally excites my admiration, and it is only after considerable reflection that I perceive the weak points. My power to follow a long and purely abstract train of thought is very limited; and therefore I could never have succeeded with metaphysics or mathematics. My memory is extensive, yet hazy: it suffices to make me cautious by vaguely telling me that I have observed or read something opposed to the conclusion which I am drawing, or on the other hand in favour of it; and after a time I can generally recollect where to search for my authority. So poor in one sense is my memory, that I have never been able to remember for more than a few days a single date or a line of poetry.

2. He did not feel easily able to write clearly and concisely. He compensated by getting things down quickly and then coming back to them later, thinking them through again and again. Slow, methodical… and ridiculously effective: For those who haven’t read it, the Origin of Species is extremely readable and clear, even now, 150 years later.

I have as much difficulty as ever in expressing myself clearly and concisely; and this difficulty has caused me a very great loss of time; but it has had the compensating advantage of forcing me to think long and intently about every sentence, and thus I have been led to see errors in reasoning and in my own observations or those of others.

There seems to be a sort of fatality in my mind leading me to put at first my statement or proposition in a wrong or awkward form. Formerly I used to think about my sentences before writing them down; but for several years I have found that it saves time to scribble in a vile hand whole pages as quickly as I possibly can, contracting half the words; and then correct deliberately. Sentences thus scribbled down are often better ones than I could have written deliberately.

3. He forced himself to be an incredibly effective and organized collector of information. Darwin’s system of reading and indexing facts in large portfolios is worth emulating, as is the habit of taking down conflicting ideas immediately.

As in several of my books facts observed by others have been very extensively used, and as I have always had several quite distinct subjects in hand at the same time, I may mention that I keep from thirty to forty large portfolios, in cabinets with labelled shelves, into which I can at once put a detached reference or memorandum. I have bought many books, and at their ends I make an index of all the facts that concern my work; or, if the book is not my own, write out a separate abstract, and of such abstracts I have a large drawer full. Before beginning on any subject I look to all the short indexes and make a general and classified index, and by taking the one or more proper portfolios I have all the information collected during my life ready for use.

4. He had possibly the most valuable trait in any sort of thinker: A passionate interest in understanding reality and putting it in useful order in his head. This “Reality Orientation” is hard to measure and certainly does not show up on IQ tests, but probably determines, to some extent, success in life.

On the favourable side of the balance, I think that I am superior to the common run of men in noticing things which easily escape attention, and in observing them carefully. My industry has been nearly as great as it could have been in the observation and collection of facts. What is far more important, my love of natural science has been steady and ardent.

This pure love has, however, been much aided by the ambition to be esteemed by my fellow naturalists. From my early youth I have had the strongest desire to understand or explain whatever I observed,–that is, to group all facts under some general laws. These causes combined have given me the patience to reflect or ponder for any number of years over any unexplained problem. As far as I can judge, I am not apt to follow blindly the lead of other men. I have steadily endeavoured to keep my mind free so as to give up any hypothesis, however much beloved (and I cannot resist forming one on every subject), as soon as facts are shown to be opposed to it.

Indeed, I have had no choice but to act in this manner, for with the exception of the Coral Reefs, I cannot remember a single first-formed hypothesis which had not after a time to be given up or greatly modified. This has naturally led me to distrust greatly deductive reasoning in the mixed sciences. On the other hand, I am not very sceptical—a frame of mind which I believe to be injurious to the progress of science. A good deal of scepticism in a scientific man is advisable to avoid much loss of time, but I have met with not a few men, who, I feel sure, have often thus been deterred from experiment or observations, which would have proved directly or indirectly serviceable.

[…]

Therefore my success as a man of science, whatever this may have amounted to, has been determined, as far as I can judge, by complex and diversified mental qualities and conditions. Of these, the most important have been—the love of science—unbounded patience in long reflecting over any subject—industry in observing and collecting facts—and a fair share of invention as well as of common sense.

5. Most inspirational to us of average intellect, he outperformed his own mental aptitude with these good habits, surprising even himself with the results.

With such moderate abilities as I possess, it is truly surprising that I should have influenced to a considerable extent the belief of scientific men on some important points.

***

Still Interested? Read his autobiography or The Origin of Species, or check out David Quammen’s wonderful short biography of the most important period of Darwin’s life. Also, if you missed it, check out our prior post on Darwin’s Golden Rule.

The Need for Biological Thinking to Solve Complex Problems

“Biological thinking and physics thinking are distinct, and often complementary, approaches to the world, and ones that are appropriate for different kinds of systems.”

***

How should we think about complexity? Should we use biological thinking or physics thinking? The answer, of course, is that it depends. It’s important to have both tools at your disposal.

These are the questions that Samuel Arbesman explores in his fascinating book Overcomplicated: Technology at the Limits of Comprehension.

[B]iological systems are generally more complicated than those in physics. In physics, the components are often identical—think of a system of nothing but gas particles, for example, or a single monolithic material, like a diamond. Beyond that, the types of interactions can often be uniform throughout an entire system, such as satellites orbiting a planet.

Biology is different, and there is something meaningful to be learned from a biological approach to thinking.

In biology, there are a huge number of types of components, such as the diversity of proteins in a cell or the distinct types of tissues within a single creature; when studying, say, the mating behavior of blue whales, marine biologists may have to consider everything from their DNA to the temperature of the oceans. Not only is each component in a biological system distinctive, but it is also a lot harder to disentangle from the whole. For example, you can look at the nucleus of an amoeba and try to understand it on its own, but you generally need the rest of the organism to have a sense of how the nucleus fits into the operation of the amoeba, how it provides the core genetic information involved in the many functions of the entire cell.

Arbesman makes an interesting point here about how we should look at technology: as the interconnections and complexity of technology increase, it increasingly resembles a biological system rather than a physical one. There is another difference.

[B]iological systems are distinct from many physical systems in that they have a history. Living things evolve over time. While the objects of physics clearly do not emerge from thin air—astrophysicists even talk about the evolution of stars—biological systems are especially subject to evolutionary pressures; in fact, that is one of their defining features. The complicated structures of biology have the forms they do because of these complex historical paths, ones that have been affected by numerous factors over huge amounts of time. And often, because of the complex forms of living things, where any small change can create unexpected effects, the changes that have happened over time have been through tinkering: modifying a system in small ways to adapt to a new environment.

Biological systems are generally hacks that evolved to be good enough for a certain environment. They are far from neat, top-down-designed systems. And to accommodate an ever-changing environment, they are rarely optimal at the micro-level, preferring to optimize for survival over any one particular attribute. And it’s not the survival of the individual that’s optimized, it’s the survival of the species.

Technologies can appear robust until they are confronted with some minor disturbance, causing a catastrophe. The same thing can happen to living things. For example, humans can adapt incredibly well to a large array of environments, but a tiny change in a person’s genome can cause dwarfism, and two copies of that mutation invariably cause death. We are of a different scale and material from a particle accelerator or a computer network, and yet these systems have profound similarities in their complexity and fragility.

Biological thinking, with a focus on details and diversity, is a necessary tool to deal with complexity.

The way biologists, particularly field biologists, study the massively complex diversity of organisms, taking into account their evolutionary trajectories, is therefore particularly appropriate for understanding our technologies. Field biologists often act as naturalists— collecting, recording, and cataloging what they find around them—but even more than that, when confronted with an enormously complex ecosystem, they don’t immediately try to understand it all in its totality. Instead, they recognize that they can study only a tiny part of such a system at a time, even if imperfectly. They’ll look at the interactions of a handful of species, for example, rather than examine the complete web of species within a single region. Field biologists are supremely aware of the assumptions they are making, and know they are looking at only a sliver of the complexity around them at any one moment.

[…]

When we’re dealing with different interacting levels of a system, seemingly minor details can rise to the top and become important to the system as a whole. We need “Field biologists” to catalog and study details and portions of our complex systems, including their failures and bugs. This kind of biological thinking not only leads to new insights, but might also be the primary way forward in a world of increasingly interconnected and incomprehensible technologies.

Waiting and observing isn’t enough.

Biologists will often be proactive, and inject the unexpected into a system to see how it reacts. For example, when biologists are trying to grow a specific type of bacteria, such as a variant that might produce a particular chemical, they will resort to a process known as mutagenesis. Mutagenesis is what it sounds like: actively trying to generate mutations, for example by irradiating the organisms or exposing them to toxic chemicals.

When systems are too complex for human understanding, often we need to insert randomness to discover the tolerances and limits of the system. One plus one doesn’t always equal two when you’re dealing with non-linear systems. For biologists, tinkering is the way to go.

As Stewart Brand noted about legacy systems, “Teasing a new function out of a legacy system is not done by command but by conducting a series of cautious experiments that with luck might converge toward the desired outcome.”
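Software engineering has a direct analogue of mutagenesis in fuzz testing: bombard a program with randomly mutated inputs and catalog the failures. Here is a minimal sketch; the toy parser and every name in it are ours, invented for illustration, not Arbesman’s.

```python
# Mutagenesis for software, in miniature: fuzz testing. We "irradiate" a
# function with random byte-level mutations of a valid input and record
# which mutants make it fail. parse_record is a toy stand-in.

import random

def parse_record(data: bytes):
    """Toy parser: expects b'name:age' where age is a decimal integer."""
    name, age = data.split(b":")            # fails unless exactly one colon
    return name.decode("ascii"), int(age)   # fails on non-ASCII / non-digit

def mutate(seed: bytes) -> bytes:
    """Flip, insert, or delete one random byte of the seed input."""
    data = bytearray(seed)
    i = random.randrange(len(data))
    op = random.choice(("flip", "insert", "delete"))
    if op == "flip":
        data[i] = random.randrange(256)
    elif op == "insert":
        data.insert(i, random.randrange(256))
    else:
        del data[i]
    return bytes(data)

random.seed(42)
failures = {}
for _ in range(10_000):
    mutant = mutate(b"ada:36")
    try:
        parse_record(mutant)
    except Exception as exc:                # catalog each distinct failure mode
        failures.setdefault(type(exc).__name__, mutant)

for kind, example in failures.items():
    print(f"{kind}: first triggered by {example!r}")
# Like a field biologist's catalog, the failure list maps the system's
# tolerances far faster than top-down reasoning about it would.
```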

When Physics and Biology Meet

This doesn’t mean we should abandon the physics approach, searching for underlying regularities in complexity. The two systems complement one another rather than compete.

Arbesman recommends asking the following questions:

When attempting to understand a complex system, we must determine the proper resolution, or level of detail, at which to look at it. How fine-grained a level of detail are we focusing on? Do we focus on the individual enzyme molecules in a cell of a large organism, or do we focus on the organs and blood vessels? Do we focus on the binary signals winding their way through circuitry, or do we examine the overall shape and function of a computer program? At a larger scale, do we look at the general properties of a computer network, and ignore the individual machines and decisions that make up this structure?

Sometimes we need to abstract away a lot of the details and lean more heavily on physics thinking.

Think about it from an organizational perspective. The new employee at the lowest level is focused on the specific details of their job, whereas the executive is focused on systems, strategy, culture, and flow — how things interact and reinforce one another. The details of the new employee’s job are lost on the executive.

We can’t use one system, whether biological or physics, exclusively. That’s fragile thinking. Instead, we need to combine both systems.

In his novel Cryptonomicon, Neal Stephenson makes exactly this point while discussing the structure of the pantheon of Greek gods:

And yet there is something about the motley asymmetry of this pantheon that makes it more credible. Like the Periodic Table of the Elements or the family tree of the elementary particles, or just about any anatomical structure that you might pull up out of a cadaver, it has enough of a pattern to give our minds something to work on and yet an irregularity that indicates some kind of organic provenance—you have a sun god and a moon goddess, for example, which is all clean and symmetrical, and yet over here is Hera, who has no role whatsoever except to be a literal bitch goddess, and then there is Dionysus who isn’t even fully a god—he’s half human—but gets to be in the Pantheon anyway and sit on Olympus with the Gods, as if you went to the Supreme Court and found Bozo the Clown planted among the justices.

There is a balance, and we need to find it.

Lewis Thomas on our Social Nature and “Getting the Air Right”


“What it needs is for the air to be made right. If you want a bee to make honey, you do not issue protocols on solar navigation or carbohydrate chemistry, you put him together with other bees (and you’d better do this quickly, for solitary bees do not stay alive) and you do what you can to arrange the general environment around the hive. If the air is right, the science will come in its own season, like pure honey.”
— Lewis Thomas

***

In his wonderful collection of essays, The Lives of a Cell, the biologist Lewis Thomas displays a fairly pronounced tendency to compare humans to the “social insects” — primarily bees and ants. It’s not unfair to wonder: Looked at from a properly high perch, are humans simply doing the equivalent of hive-building and colony-building?

In a manner Yuval Harari would later echo in his book Sapiens, Thomas concludes that, while we’re similar, there are some pretty essential differences. He wonders aloud in the essay titled Social Talk:

Nobody wants to think that the rapidly expanding mass of mankind, spreading out over the surface of the earth, blackening the ground, bears any meaningful resemblance to the life of an anthill or a hive. Who would consider for a moment that the more than 3 billion of us are a sort of stupendous animal when we become linked together? We are not mindless, nor is our day-to-day behavior coded out to the last detail by our genomes, nor do we seem to be engaged together, compulsively, in any single, universal, stereotyped task analogous to the construction of a nest. If we were ever to put all our brains together in fact, to make a common mind the way the ants do, it would be an unthinkable thought, way over our heads.

Social animals tend to keep at a particular thing, generally something huge for their size; they work at it ceaselessly under genetic instructions and genetic compulsion, using it to house the species and protect it, assuring permanence.

There are, to be sure, superficial resemblances in some of the things we do together, like building glass and plastic cities on all the land and farming under the sea, or assembling in armies, or landing samples of ourselves on the moon, or sending memoranda into the next galaxy. We do these together without being quite sure why, but we can stop doing one thing and move to another whenever we like. We are not committed or bound by our genes to stick to one activity forever, like the wasps.

Today’s behavior is no more fixed than when we tumbled out over Europe to build cathedrals in the twelfth century. At that time we were convinced that it would go on forever, that this was the way to live, but it was not; indeed, most of us have already forgotten what it was all about. Anything we do in this transient, secondary social way, compulsively and with all our energies but only for a brief period of our history, cannot be counted as social behavior in the biological sense. If we can turn it on and off, on whims, it isn’t likely that our genes are providing the detailed instructions. Constructing Chartres was good for our minds, but we found that our lives went on, and it is no more likely that we will find survival in Rome plows or laser bombs, or rapid mass transport or a Mars lander, or solar power, or even synthetic protein. We do tend to improvise things like this as we go along, but it is clear that we can pick and choose.

With our basic nature as a backdrop, human beings “pick and choose,” in Thomas’s words, among the possible activities we might engage in. These can range from pyramid building to art and music, from group campfire songs to extreme and brutal war. The wide range, the ability to decide to be a warring society sometimes and a peaceful society sometimes, might be seen as evidence that there are major qualitative differences between what humans do as a group and what the social insects are up to. Maybe we’re not just hive-builders after all.

What causes the difference then? Thomas thought it might well be our innate capacity for language, and the information it allows us to share:

It begins to look, more and more disturbingly, as if the gift of language is the single human trait that marks us all genetically, setting us apart from all the rest of life. Language is, like nest building or hive making, the universal and biologically specific activity of human beings. We engage in it communally, compulsively, and automatically. We cannot be human without it; if we were to be separated from it our minds would die, as surely as bees lost from the hive.

We are born knowing how to use language. The capacity to recognize syntax, to organize and deploy words into intelligible sentences, is innate in the human mind. We are programmed to identify patterns and generate grammar. There are invariant and variable structures in speech that are common to all of us. As chicks are endowed with an innate capacity to read information in the shapes of overhanging shadows, telling hawk from other birds, we can identify the meaning of grammar in a string of words, and we are born this way. According to Chomsky, who has examined it as a biologist looks at live tissue, language “must simply be a biological property of the human mind.” The universal attributes of language are genetically set; we do not learn them, or make them up as we go along.

We work at this all our lives, and collectively we give it life, but we do not exert the least control over language, not as individuals or committees or academies or governments. Language, once it comes alive, behaves like an active, motile organism. Parts of it are always being changed, by a ceaseless activity to which all of us are committed; new words are invented and inserted, old ones have their meaning altered or abandoned. New ways of stringing words and sentences together come into fashion and vanish again, but the underlying structure simply grows, enriches itself, and expands. Individual languages age away and seem to die, but they leave progeny all over the place. Separate languages can exist side by side for centuries without touching each other, maintaining their integrity with the vigor of incompatible tissues. At other times, two languages may come together, fuse, replicate, and give rise to nests of new tongues.

The thing about the development of language is its unplannedness. There’s no language committee directing the whole enterprise. Not only is language innate, as Noam Chomsky and later Steven Pinker have so persuasively argued, but it’s extremely flexible based on the needs of its users. All the strange things about our language that seem so poorly drawn up were never drawn up at all. (What kind of masochist would put an “s” in the word lisp?)

***

One commonality to the social insects that Thomas does see is something he calls Getting the Air Right – his description of a really productive human group as a direct reflection of a really productive bee colony. In this case, he’s talking about getting great science done, but the application to other human endeavors seems clear.

The following piece, pulled from his essay titled Natural Science, is worth reading and re-reading closely when you’re tempted to “command and control” others around you.

I don’t know of any other human occupation, even including what I have seen of art, in which the people engaged in it are so caught up, so totally preoccupied, so driven beyond their strength and resources.

Scientists at work have the look of creatures following genetic instructions; they seem to be under the influence of a deeply placed human instinct. They are, despite their efforts at dignity, rather like young animals engaged in savage play. When they are near to an answer their hair stands on end, they sweat, they are awash in their own adrenalin. To grab the answer, and grab it first, is for them a more powerful drive than feeding or breeding or protecting themselves against the elements.

It sometimes looks like a lonely activity, but it is as much the opposite of lonely as human behavior can be. There is nothing so social, so communal, and so interdependent. An active field of science is like an immense intellectual anthill; the individual almost vanishes into the mass of minds tumbling over each other, carrying information from place to place, passing it around at the speed of light.

There are special kinds of information that seem to be chemotactic. As soon as a trace is released, receptors at the back of the neck are caused to tremble, there is a massive convergence of motile minds flying upwind on a gradient of surprise, crowding around the source. It is an infiltration of intellects, an inflammation.

There is nothing to touch the spectacle. In the midst of what seems a collective derangement of minds in total disorder, with bits of information being scattered about, torn to shreds, disintegrated, deconstituted, engulfed, in a kind of activity that seems as random and agitated as that of bees in a disturbed part of the hive, there suddenly emerges, with the purity of a slow phrase of music, a single new piece of truth about nature.

In short, it works. It is the most powerful and productive of the things human beings have learned to do together in many centuries, more effective than farming, or hunting and fishing, or building cathedrals, or making money. It is instinctive behavior, in my view, and I do not understand how it works.

It cannot be prearranged in any precise way; the minds cannot be lined up in tidy rows and given directions from printed sheets. You cannot get it done by instructing each mind to make this or that piece, for central committees to fit with the pieces made by the other instructed minds. It does not work this way.

What it needs is for the air to be made right. If you want a bee to make honey, you do not issue protocols on solar navigation or carbohydrate chemistry, you put him together with other bees (and you’d better do this quickly, for solitary bees do not stay alive) and you do what you can to arrange the general environment around the hive. If the air is right, the science will come in its own season, like pure honey.

Still Interested? Check out another great biologist, E.O. Wilson, writing about his experiences in science, or yet another, Richard Dawkins, writing about why chain letters work as a method for understanding natural selection.