Category: People

Isaac Watts and the Improvement of the Mind

What did an 18th-century hymn writer have to contribute to the modern understanding of the world? As it turns out, a lot. Sometimes we forget how useful the old wisdom can be.

***

One of the most popular and prolific Christian hymn writers of all time — the author of Joy to the World, among many others — was a man named Isaac Watts, who lived in England in the late 17th and early 18th centuries. Watts was a well-educated Nonconformist (in the religious sense, not the modern one) who, alongside his hymn writing, published a number of books on logic, science, and the learning process, at a time when these concepts were only just starting to take hold as a dominant ideology, displacing the central role of religious teaching.

Watts’s book The Improvement of the Mind was an important contribution to the growing body of work emphasizing critical thinking and rational, balanced inquiry over adherence to centuries of dogma. If, as Alfred North Whitehead once pronounced, modernity’s progress was due to the “invention of the method of invention,” Watts and his books (which became textbooks in English schools and universities, including Oxford) can easily be credited with helping push the world along.

One Nonconformist who would later come to be deeply influenced by Watts was the great scientist Michael Faraday. Faraday grew up poor in late 18th- and early 19th-century England and received a fairly crude education, yet would go on to become the Father of Electromagnetism. How?

In part, Faraday credits his own “inventing the method of invention” to reading Watts’s books, particularly The Improvement of the Mind — a self-improvement guide written a few centuries before the internet. Watts recommended keeping a commonplace book to record facts, and Faraday did. Watts recommended being guided by observed facts, and Faraday was. Watts recommended finding a great teacher, and Faraday started attending lectures.

In Watts’s book, Faraday had found a guiding ethos for sorting truth from fiction, what we now call the scientific method. And, given his tremendous achievements from a limited starting point, it’s worth asking: what did Faraday find?

***

We needn’t search far to figure it out. Smack dab in Chapter One of the book, Watts lays out his General Rules for the Improvement of Knowledge.

Watts first lays out the goal of the whole enterprise. The idea is a pretty awesome one, the same ethos we promote constantly here: We all need to make decisions constantly, so why not figure out how to make better ones? You don’t have to be an intellectual to pursue this goal. Everybody has a mind worth cultivating in order to improve the practical outcome of their lives:

No man is obliged to learn and know every thing; this can neither be sought nor required, for it is utterly impossible: yet all persons are under some obligation to improve their own understanding; otherwise it will be a barren desert, or a forest overgrown with weeds and brambles. Universal ignorance or infinite errors will overspread the mind, which is utterly neglected, and lies without any cultivation.

Skill in the sciences is indeed the business and profession but of a small part of mankind; but there are many others placed in such an exalted rank in the world, as allows them much leisure and large opportunities to cultivate their reason, and to beautify and enrich their minds with various knowledge. Even the lower orders of men have particular callings in life, wherein they ought to acquire a just degree of skill; and this is not to be done well, without thinking and reasoning about them.

The common duties and benefits of society, which belong to every man living, as we are social creatures, and even our native and necessary relations to a family, a neighbourhood, or government, oblige all persons whatsoever to use their reasoning powers upon a thousand occasions; every hour of life calls for some regular exercise of our judgment, as to time and things, persons and actions; without a prudent and discreet determination in matters before us, we shall be plunged into perpetual errors in our conduct. Now that which should always be practised, must at some time be learnt.

We then get into the Rules themselves, an 18th-century guide to becoming smarter, better, and more useful that remains just as valuable three hundred years later. In the Rules, Watts promotes the idea of becoming wiser, more humble, more hungry, and more broad-thinking. These are as good a guide to improving your mind as you’ll find.

Below is an abridged version of the Rules. Check them all out here or get it in book form here. Watts had a bit of a bent towards solemnity and godliness that need not be emulated (unless you’d like to, of course), but most of the Rules are as useful today as the day they were written.

***

Rule I. DEEPLY possess your mind with the vast importance of a good judgment, and the rich and inestimable advantage of right reasoning.

Review the instances of your own misconduct in life; think seriously with yourselves how many follies and sorrows you had escaped, and how much guilt and misery you had prevented, if from your early years you had but taken due pains to judge aright concerning persons, times, and things. This will awaken you with lively vigour to address yourselves to the work of improving your reasoning powers, and seizing every opportunity and advantage for that end.

Rule II. Consider the weaknesses, frailties, and mistakes of human nature in general, which arise from the very constitution of a soul united to an animal body, and subjected to many inconveniences thereby.

Consider the many additional weaknesses, mistakes, and frailties, which are derived from our original apostasy and fall from a state of innocence; how much our powers of understanding are yet more darkened, enfeebled, and imposed upon by our senses, our fancies, and our unruly passions, &c.

Consider the depth and difficulty of many truths, and the flattering appearances of falsehood, whence arises an infinite variety of dangers to which we are exposed in our judgment of things.

Read with greediness those authors that treat of the doctrine of prejudices, prepossessions, and springs of error, on purpose to make your soul watchful on all sides, that it suffer itself, as far as possible, to be imposed upon by none of them.

Rule III. A slight view of things so momentous is not sufficient.

You should therefore contrive and practise some proper methods to acquaint yourself with your own ignorance, and to impress your mind with a deep and painful sense of the low and imperfect degrees of your present knowledge, that you may be incited with labour and activity to pursue after greater measures. Among others, you may find some such methods as these successful.

1. Take a wide survey now and then of the vast and unlimited regions of learning. […] The worlds of science are immense and endless.

2. Think what a numberless variety of questions and difficulties there are belonging even to that particular science in which you have made the greatest progress, and how few of them there are in which you have arrived at a final and undoubted certainty; excepting only those questions in the pure and simple mathematics, whose theorems are demonstrable, and leave scarce any doubt; and yet, even in the pursuit of some few of these, mankind have been strangely bewildered.

3. Spend a few thoughts sometimes on the puzzling enquiries concerning vacuums and atoms, the doctrine of infinites, indivisibles, and incommensurables in geometry, wherein there appear some insolvable difficulties: do this on purpose to give you a more sensible impression of the poverty of your understanding, and the imperfection of your knowledge. This will teach you what a vain thing it is to fancy that you know all things, and will instruct you to think modestly of your present attainments […]

4. Read the accounts of those vast treasures of knowledge which some of the dead have possessed, and some of the living do possess. Read and be astonished at the almost incredible advances which have been made in science. Acquaint yourself with some persons of great learning, that by converse among them, and comparing yourself with them, you may acquire a mean opinion of your own attainments, and may thereby be animated with new zeal, to equal them as far as possible, or to exceed: thus let your diligence be quickened by a generous and laudable emulation.

Rule IV. Presume not too much upon a bright genius, a ready wit, and good parts; for this, without labour and study, will never make a man of knowledge and wisdom.

This has been an unhappy temptation to persons of a vigorous and gay fancy, to despise learning and study. They have been acknowledged to shine in an assembly, and sparkle in a discourse on common topics, and thence they took it into their heads to abandon reading and labour, and grow old in ignorance; but when they had lost their vivacity of animal nature and youth, they became stupid and sottish even to contempt and ridicule. Lucidas and Scintillo are young men of this stamp; they shine in conversation; they spread their native riches before the ignorant; they pride themselves in their own lively images of fancy, and imagine themselves wise and learned; but they had best avoid the presence of the skilful, and the test of reasoning; and I would advise them once a day to think forward a little, what a contemptible figure they will make in age.

The witty men sometimes have sense enough to know their own foible; and therefore they craftily shun the attacks of argument, or boldly pretend to despise and renounce them, because they are conscious of their own ignorance, and inwardly confess their want of acquaintance with the skill of reasoning.

Rule V. As you are not to fancy yourself a learned man because you are blessed with a ready wit; so neither must you imagine that large and laborious reading, and a strong memory, can denominate you truly wise.

What that excellent critic has determined when he decided the question, whether wit or study makes the best poet, may well be applied to every sort of learning:

“Concerning poets there has been contest,
Whether they’re made by art, or nature best;
But if I may presume in this affair,
Among the rest my judgment to declare,
No art without a genius will avail,
And parts without the help of art will fail:
But both ingredients jointly must unite,
Or verse will never shine with a transcendent light.”
– Oldham.

It is meditation and studious thought, it is the exercise of your own reason and judgment upon all you read, that gives good sense even to the best genius, and affords your understanding the truest improvement. A boy of a strong memory may repeat a whole book of Euclid, yet be no geometrician; for he may not be able perhaps to demonstrate one single theorem. Memorino has learnt half the Bible by heart, and is become a living concordance, and a speaking index to theological folios, and yet he understands little of divinity. […]

Rule VII. Let the hope of new discoveries, as well as the satisfaction and pleasure of known truths, animate your daily industry.

Do not think learning in general is arrived at its perfection, or that the knowledge of any particular subject in any science cannot be improved, merely because it has lain five hundred or a thousand years without improvement. The present age, by the blessing of God on the ingenuity and diligence of men, has brought to light such truths in natural philosophy, and such discoveries in the heavens and the earth, as seemed to be beyond the reach of man. But may there not be Sir Isaac Newtons in every science? You should never despair therefore of finding out that which has never yet been found, unless you see something in the nature of it which renders it unsearchable, and above the reach of our faculties. […]

Rule VIII. Do not hover always on the surface of things, nor take up suddenly with mere appearances; but penetrate into the depth of matters, as far as your time and circumstances allow, especially in those things which relate to your own profession.

Do not indulge yourselves to judge of things by the first glimpse, or a short and superficial view of them; for this will fill the mind with errors and prejudices, and give it a wrong turn and ill habit of thinking, and make much work for retractation. Subito is carried away with title pages, so that he ventures to pronounce upon a large octavo at once, and to recommend it wonderfully when he had read half the preface. Another volume of controversies, of equal size, was discarded by him at once, because it pretended to treat of the Trinity, and yet he could neither find the word essence nor subsistences in the twelve first pages; but Subito changes his opinions of men and books and things so often, that nobody regards him.

As for those sciences, or those parts of knowledge, which either your profession, your leisure, your inclination, or your incapacity, forbid you to pursue with much application, or to search far into them, you must be contented with an historical and superficial knowledge of them, and not pretend to form any judgments of your own on those subjects which you understand very imperfectly.

Rule IX. Once a day, especially in the early years of life and study, call yourselves to an account what new ideas, what new proposition or truth you have gained, what further confirmation of known truths, and what advances you have made in any part of knowledge;

And let no day, if possible, pass away without some intellectual gain: such a course, well pursued, must certainly advance us in useful knowledge. It is a wise proverb among the learned, borrowed from the lips and practice of a celebrated painter,

“Let no day pass without one line at least.”

…and it was a sacred rule among the Pythagoreans, That they should every evening thrice run over the actions and affairs of the day, and examine what their conduct had been, what they had done, or what they had neglected: and they assured their pupils, that by this method they would make a noble progress on the path of virtue.

Rule X. Maintain a constant watch at all times against a dogmatical spirit;

Fix not your assent to any proposition in a firm and unalterable manner, till you have some firm and unalterable ground for it, and till you have arrived at some clear and sure evidence; till you have turned the proposition on all sides, and searched the matter through and through, so that you cannot be mistaken.

And even where you may think you have full grounds of assurance, be not too early, nor too frequent, in expressing this assurance in too peremptory and positive a manner, remembering that human nature is always liable to mistake in this corrupt and feeble state. A dogmatical spirit has many inconveniences attending it: as

1. It stops the ear against all further reasoning upon that subject, and shuts up the mind from all farther improvements of knowledge. If you have resolutely fixed your opinion, though it be upon too slight and insufficient grounds, yet you will stand determined to renounce the strongest reason brought for the contrary opinion, and grow obstinate against the force of the clearest argument. Positive is a man of this character; and has often pronounced his assurance of the Cartesian vortexes: last year some further light broke in upon his understanding, with uncontrollable force, by reading something of mathematical philosophy; yet having asserted his former opinions in a most confident manner, he is tempted now to wink a little against the truth, or to prevaricate in his discourse upon that subject, lest by admitting conviction, he should expose himself to the necessity of confessing his former folly and mistake: and he has not humility enough for that.

2. A dogmatical spirit naturally leads us to arrogance of mind, and gives a man some airs in conversation which are too haughty and assuming. Audens is a man of learning, and very good company; but his infallible assurance renders his carriage sometimes insupportable.

[…]

Rule XI. Though caution and slow assent will guard you against frequent mistakes and retractions; yet you should get humility and courage enough to retract any mistake, and confess an error.

Frequent changes are tokens of levity in our first determinations; yet you should never be too proud to change your opinion, nor frighted at the name of a changeling. Learn to scorn those vulgar bugbears, which confirm foolish man in his old mistakes, for fear of being charged with inconstancy. I confess it is better not to judge, than judge falsely; it is wiser to withhold our assent till we see complete evidence; but if we have too suddenly given up our assent, as the wisest man does sometimes, if we have professed what we find afterwards to be false, we should never be ashamed nor afraid to renounce a mistake. That is a noble essay which is found among the occasional papers, ‘to encourage the world to practise retractations;’ and I would recommend it to the perusal of every scholar and every Christian.

Rule XV. Watch against the pride of your own reason, and a vain conceit of your own intellectual powers, with the neglect of divine aid and blessing.

Presume not upon great attainments in knowledge by your own self-sufficiency: those who trust to their own understandings entirely, are pronounced fools in the word of God; and it is the wisest of men gives them this character,

‘He that trusteth in his own heart is a fool,’ Prov. xxviii. 26. And the same divine writer advises us to ‘trust in the Lord with all our heart, and not to lean to our understandings, nor to be wise in our own eyes,’ chap. iii. 5, 7.

Those who, with a neglect of religion and dependence on God, apply themselves to search out every article in the things of God by the mere dint of their own reason, have been suffered to run into wild excesses of foolery, and strange extravagance of opinions. Every one who pursues this vain course, and will not ask for the conduct of God in the study of religion, has just reason to fear he shall be left of God, and given up a prey to a thousand prejudices; that he shall be consigned over to the follies of his own heart, and pursue his own temporal and eternal ruin. And even in common studies we should, by humility and dependence, engage the God of truth on our side. (Transcriber’s Note: This talk of God, pure nonsense that it is, does not diminish the value of his other rules.)


Andy Grove and the Value of Facing Reality

“People who have no emotional stake in a decision
can see what needs to be done sooner.”
— Andy Grove

***

What do you do when you wake up one day and realize that reality has changed, and you will either change with it or perish? Here’s one story of someone who did it successfully: Andy Grove, the former CEO of Intel Corp.

Here’s the long and short: As late as 1981, Intel Corp had massive dominance of the worldwide semiconductor business. They made memory chips (RAM), owning about 60% of the global trade in a business that was growing in leaps and bounds. The personal computer revolution was taking off and the world was going digital slowly, year by year. It was the right business to be in, and Intel owned it. They got designed into the IBM PC, one of the first popular personal computers, in 1981. Life was good.

The problem was that everyone else wanted into the same business. New companies were popping up every day in the United States, and in the late ’70s and throughout the ’80s, Japanese semiconductor manufacturers started nipping at Intel’s heels. They were competing on price and fast availability. Slowly, Intel realized its products were becoming commodities. By 1988, Japanese manufacturers had over 50% of the global market.

What did Intel do in response?

At first, as almost all of us do, they tried to cope with the old reality. They tried running faster on a treadmill to nowhere. This is the first true difficulty of facing a new reality: seeing the world as it truly is. The temptation is always to stick to the old paradigm.

What Intel really wanted was to be able to stay in the old business and make money at it. Andy Grove describes some of the tactics they tried to this end in his great book Only the Paranoid Survive, written in 1996:

We tried a lot of things. We tried to focus on a niche of the memory market segment, we tried to invent special-purpose memories called valued-added designs, we introduced more advanced technologies and built memories with them. What we were desperately trying to do was earn a premium for our product in the marketplace as we couldn’t match the Japanese downward pricing spiral. There was a saying at Intel at that time: “If we do well we get ‘2x’ [twice] the price of Japanese memories, but what good does it do if ‘X’ gets smaller and smaller?”

[…]

We had meetings and more meetings, bickering and arguments, resulting in nothing but conflicting proposals. There were those who proposed what they called a “go for it” strategy: “Let’s build a gigantic factory dedicated to producing memories and nothing but memories, and let’s take on the Japanese.” Others proposed that we should get really clever and use an avant-garde technology, “go for it” but in a technological rather than a manufacturing sense and build something the Japanese producers couldn’t build. Others were still clinging to the idea that we could come up with special-purpose memories, an increasingly unlikely possibility as memories became a uniform worldwide commodity. Meanwhile, as the debates raged, we just went on losing more and more money.

As Grove started waking up to the reality that the old way of doing business wasn’t going to work anymore, he allowed himself the thought that Intel would leave the business that had buttered its bread for so long.

And with this came the second difficulty of facing a new reality: Being the first to see it means you’ll face tremendous resistance from those who are not there yet. 

Of course, Grove faced this in spades at Intel. Notice how he describes the ties to the old reality: Religious conviction.

The company had a couple of beliefs that were as strong as religious dogmas. Both of them had to do with the importance of memories as the backbone of our manufacturing and sales activities. One was that memories were our “technology drivers.” What this phrase meant was that we always developed and refined our technologies on our memory products first because they were easier to test. Once the technology had been debugged on memories, we would apply it to microprocessors and other products. The other belief was the “full product-line” dogma. According to this, our salesmen needed a full product line to do a good job in front of our customers; if they didn’t have a full product line, the customer would prefer to do business with our competitors who did.

Given the strength of these beliefs, an open-minded, rational discussion about getting out of memories was practically impossible. What were we going to use for technology drivers? How were our salespeople going to do their jobs when they had an incomplete product family?

Eventually, after taking half-measures and facing all kinds of resistance from the homeostatic system that is a large organization, Grove was able to convince the executive team it was time to move on from the memory business and go whole-hog into microprocessors, a business where Intel could truly differentiate themselves and build a formidable competitive position.

It’s here that Grove hits on a very humbling point about facing reality: we’re often the last ones to see things the way they truly are! We’re sitting on a train oblivious to the fact that it’s moving at 80 miles per hour, but anyone standing outside the train sees it whiz right by! This is the value of learning to see the world through the eyes of others.

After all manner of gnashing of teeth, we told our sales force to notify our memory customers. This was one of the great bugaboos: How would our customers react? Would they stop doing business with us altogether now that we were letting them down? In fact, the reaction was, for all practical purposes, a big yawn. Our customers knew that we were not a very large factor in the market and they had half figured that we would get out; most of them had already made arrangements with other suppliers.

In fact, when we informed them of the decision, some of them reacted with the comment, “It sure took you a long time.” People who have no emotional stake in a decision can see what needs to be done sooner. 

This is where the rubber meets the road. As Grove mentions regarding Intel, you must train yourself to see your situation from the perspective of an outsider.

This is why companies often bring outside management or consulting organizations in to help them — they feel only someone sitting outside the train can see how fast it’s moving! But what if you could have for yourself that kind of objectivity? It takes a passionate interest in reality and a commitment to being open to change. In business especially, the Red Queen effect means that change is a constant, not a variable.

And the story of Andy Grove shows that it can be done. Despite the myriad problems discussed above, not only did Grove realize how fast the train was moving, but he got all of his people off, and onto a new and better train! By the late ’80s Intel had pushed into microprocessors and out of memories, and became one of the great growth companies of the 1990s in a brand new business. (And he did it without bringing in outside help.)

What it took was the courage to face facts and act on them: As hard as it must have been, the alternative was death.

Here’s what Grove took from the experience:

Change is pain

I learned how small and helpless you feel when facing a force that’s “10X” larger than what you are accustomed to. I experienced the confusion that engulfs you when something fundamental changes in the business, and I felt the frustration that comes when the things that worked for you in the past no longer do any good. I learned how desperately you want to run from dealing with even describing a new reality to close associates. And I experienced the exhilaration that comes from a set-jawed commitment to a new direction, unsure as that may be.

A new reality doesn’t happen overnight

In this case, the Japanese started beating us in the memory business in the early eighties. Intel’s performance started to slump when the entire industry weakened in mid-1984. The conversation with Gordon Moore that I described occurred in mid-1985. It took until mid-1986 to implement our exit from memories. Then it took another year before we returned to profitability. Going through the whole strategic inflection point took us a total of three years.

The new reality may be preferable to the old one

I also learned that strategic inflection points, painful as they are for all participants, provide an opportunity to break out of a plateau and catapult to a higher level of achievement. Had we not changed our business strategy, we would have been relegated to an immensely tough economic existence and, for sure, a relatively insignificant role in our industry. By making a forceful move, things turned out far better for us.

So here is your opportunity: When a new reality awaits, don’t go at it timidly. Take it head on and make it not only as good, but better than the old reality. Don’t be the boy in the well, looking up and seeing only the sides of the well. Take the time to see the world around you as it truly is.

***

Still Interested? Check out Grove’s classic book on strategic inflection points, Only the Paranoid Survive. For another business case study, read the story of how IBM first built its monster 20th-century competitive advantage.

Henry Ford and the Actual Value of Education

“The object of education is not to fill a man’s mind with facts;
it is to teach him how to use his mind in thinking.”
— Henry Ford

***

In his memoir My Life and Work, written in 1922, the brilliant (but flawed) Henry Ford (1863-1947) offers perhaps the best definition you’ll find of the value of an education, and a useful warning against the mere accumulation of information for its own sake. A devotee of lifelong learning need not be a Jeopardy contestant, accumulating trivia to spit back as needed. In the Age of Google, that sort of knowledge is increasingly irrelevant.

A real life-long learner seeks to learn and apply the world’s best knowledge to create a more constructive and more useful life for themselves and those around them. And to do that, you have to learn how to think on your feet. The world does not offer up no-brainers every day; more frequently, we’re presented with a lot of grey options. Unless your studies are improving your ability to handle reality as it is and get a fair result, you’re probably wasting your time.

From Ford’s memoir:

An educated man is not one whose memory is trained to carry a few dates in history—he is one who can accomplish things. A man who cannot think is not an educated man however many college degrees he may have acquired. Thinking is the hardest work anyone can do—which is probably the reason why we have so few thinkers. There are two extremes to be avoided: one is the attitude of contempt toward education, the other is the tragic snobbery of assuming that marching through an educational system is a sure cure for ignorance and mediocrity. You cannot learn in any school what the world is going to do next year, but you can learn some of the things which the world has tried to do in former years, and where it failed and where it succeeded. If education consisted in warning the young student away from some of the false theories on which men have tried to build, so that he may be saved the loss of the time in finding out by bitter experience, its good would be unquestioned.

An education which consists of signposts indicating the failure and the fallacies of the past doubtless would be very useful. It is not education just to possess the theories of a lot of professors. Speculation is very interesting, and sometimes profitable, but it is not education. To be learned in science today is merely to be aware of a hundred theories that have not been proved. And not to know what those theories are is to be “uneducated,” “ignorant,” and so forth. If knowledge of guesses is learning, then one may become learned by the simple expedient of making his own guesses. And by the same token he can dub the rest of the world “ignorant” because it does not know what his guesses are.

But the best that education can do for a man is to put him in possession of his powers, give him control of the tools with which destiny has endowed him, and teach him how to think. The college renders its best service as an intellectual gymnasium, in which mental muscle is developed and the student strengthened to do what he can. To say, however, that mental gymnastics can be had only in college is not true, as every educator knows. A man’s real education begins after he has left school. True education is gained through the discipline of life.

[…]

Men satisfy their minds more by finding out things for themselves than by heaping together the things which somebody else has found out. You can go out and gather knowledge all your life, and with all your gathering you will not catch up even with your own times. You may fill your head with all the “facts” of all the ages, and your head may be just an overloaded fact−box when you get through. The point is this: Great piles of knowledge in the head are not the same as mental activity. A man may be very learned and very useless. And then again, a man may be unlearned and very useful.

The object of education is not to fill a man’s mind with facts; it is to teach him how to use his mind in thinking. And it often happens that a man can think better if he is not hampered by the knowledge of the past.

Ford is probably wrong in his very last statement (study of the past is crucial to understanding the human condition), but the sentiment offered in the rest of the piece should be read and re-read frequently.

This brings to mind a debate that almost all debaters get wrong: Which is more valuable, an education in the school of life, or in the school of books?

It’s both!

This is what we call a false dichotomy: there is absolutely no reason to choose between the two. We're all familiar with the algebra: if A and B each have positive value, then A+B must be greater than either alone. You must learn from your own life as it goes along, but since we have the option to augment that by studying the lives of others, why would we not take advantage? All it takes is the will and the attitude to study the successes and failures of history, add them to your own experience, and get the algebra-style A+B result.

So, resolve to use your studies to learn to think, to learn to handle the world better, to be more useful to those around you. Don’t worry about the facts and figures for their own sake. We don’t need another human encyclopedia.

***

Still Interested? Check out Ford’s interesting memoir, or try reading up on what a broad education should contain.

Hares, Tortoises, and the Trouble with Genius

“Geniuses are dangerous.”
— James March

How many organizations would deny that they want more creativity, more genius, and more divergent thinking among their constituents? The great genius leaders of the world are fawned over breathlessly and a great amount of lip service is given to innovation; given the choice between “mediocrity” and “innovation,” we all choose innovation hands-down.

So why do we act the opposite way?

Stanford’s James March might have some insight. His book On Leadership (see our earlier notes here) is a collection of insights derived mostly from the study of great literature, from Don Quixote to Saint Joan to War & Peace. In March’s estimation, we can learn more about human nature (of which leadership is merely a subset) from studying literature than we can from studying leadership literature.

March discusses the nature of divergent thinking and “genius” in a way that rings true: we don’t seek to cultivate genius, especially in a mature organization, because we’re more afraid of the risks than appreciative of the benefits. A classic case of loss aversion. Tolerating genius means tolerating a certain amount of disruption; the upside of genius sounds pretty good until we start understanding its dark side:

Most original ideas are bad ones. Those that are good, moreover, are only seen as such after a long learning period; they rarely are impressive when first tried out. As a result, an organization is likely to discourage both experimentation with deviant ideas and the people who come up with them, thereby depriving itself, in the name of efficient operation, of its main source of innovation.

[…]

Geniuses are dangerous. Othello’s instinctive action makes him commit an appalling crime, the fine sentiments of Pierre Bezukhov bring little comfort to the Russian peasants, and Don Quixote treats innocent people badly over and over again. A genius combines the characteristics that produce resounding failures (stubbornness, lack of discipline, ignorance), a few ingredients of success (elements of intelligence, a capacity to put mistakes behind him or her, unquenchable motivation), and exceptional good luck. Genius therefore only appears as a sub-product of a great tolerance for heresy and apparent craziness, which is often the result of particular circumstances (over-abundant resources, managerial ideology, promotional systems) rather than deliberate intention. “Intelligent” organizations will therefore try to create an environment that allows genius to flourish by accepting the risks of inefficiency or crushing failures…within the limits of the risks that they can afford to take.

Note an important component: exceptional good luck. The kind of genius that rarely surfaces but that we desperately pursue needs great luck to make an impact. Truthfully, genius is always recognized in hindsight, with the benefit of positive results in mind. We cherry-pick the good results of divergent thinkers, but forget that we use those results to decide who’s a genius and who isn’t. Thus, tolerating divergent, genius-level thinking requires an ability to tolerate failure, loss, and change if it’s to be applied prospectively.

Sounds easy enough, in theory. But as Daniel Kahneman and Charlie Munger have so brilliantly pointed out, we become very risk averse when we possess anything, including success; we feel loss more acutely than gain, and we seek to keep the status quo intact. (And it’s probably good that we do, on average.)

Compounding the problem, when we do recognize and promote genius, some of our exalting is likely to be based on false confidence, almost by definition:

Individuals who are frequently promoted because they have been successful will have confidence in their own abilities to beat the odds. Since in a selective, and therefore increasingly homogenous, management group the differences in performance that are observed are likely to be more often due to chance events than to any particular individual capacity, the confidence is likely to be misplaced. Thus, the process of selecting on performance results in exaggerated self-confidence and exaggerated risk-taking.

Let’s use a current example: Elon Musk. Elon is (justifiably) recognized as a modern genius, leaping tall buildings in a single bound. Yet as Ashlee Vance makes clear in his biography, Musk teetered on the brink several times. It’s a near miracle that his businesses have survived (and thrived) to reach where they are today. The press would read very differently if SpaceX or Tesla had gone under; he might be considered a brilliant but fatally flawed eccentric rather than a genius. Luck played a fair part in that outcome (which is not to take away from Musk’s incredible work).

***

Getting back to organizations, the failure to appropriately tolerate genius is also a problem of homeostasis: The tendency of systems to “stay in place” and avoid disruption of strongly formed past habits. Would an Elon Musk be able to rise in a homeostatic organization? It generally does not happen.

James March has a solution, though, and it’s one we’ve heard echoed by other thinkers like Nassim Taleb, and one that seems to be used fairly well in some modern technology organizations. As with most organizational solutions, it requires realigning incentives, which is the job of a strong and selfless leader.

An analogy of the hare and the tortoise illustrates the solution:

Although one particular hare (who runs fast but sleeps too long) has every chance of being beaten by one particular tortoise, an army of hares in competition with an army of tortoises will almost certainly result in one of the hares crossing the finish line first. The choices of an organization therefore depend on the respective importance that it attaches to its mean performance (in which case it should recruit tortoises) and the achievement of a few dazzling successes (an army of hares, which is inefficient as a whole, but contains some outstanding individuals.)

[…]

In a simple model, a tortoise advances with a constant speed of 1 mile/hour while a hare runs at 5 miles/hour, but in each given 5-minute period a hare has a 90 percent chance of sleeping rather than running. A tortoise will cover the mile of the test in one hour exactly and a hare will have only about an 11 percent chance of arriving faster (the probability that he will be awake for at least three of the 5-minute periods.) If there is a race between the tortoise and one hare, the probability that the hare will win is only 0.11. However, if there are 100 tortoises and 100 hares in the race, the probability that at least one hare will arrive before any tortoise (and thus the race will be won by a hare) is 1– ((0.89)^100), or greater than 0.9999.
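March’s arithmetic can be checked directly. A minimal sketch (standard library only, variable names our own) that reproduces the ~11% single-hare figure and the near-certain army-of-hares result:

```python
from math import comb

# March's model: the hour is split into 12 five-minute periods.
# A hare runs at 5 mph, so it needs at least 12 minutes of running
# (i.e., at least 3 of the 12 periods awake) to cover the mile
# within the hour. In each period it is awake with probability 0.1.
p_awake = 0.1
periods = 12

# P(awake in at least 3 of 12 periods), via the binomial distribution
p_hare_finishes = 1 - sum(
    comb(periods, k) * p_awake**k * (1 - p_awake) ** (periods - k)
    for k in range(3)
)
print(round(p_hare_finishes, 2))  # 0.11, matching March's figure

# With 100 independent hares, the chance that at least one finishes
# inside the hour (and so beats every tortoise) is overwhelming.
p_some_hare_wins = 1 - (1 - p_hare_finishes) ** 100
print(p_some_hare_wins > 0.9999)  # True
```

The exact single-hare probability works out to about 0.111, so March’s rounded 0.11 and his 1 − (0.89)^100 bound both hold up.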

The analogy holds up well in the business world. Any one young, aggressive “hare” is unlikely to beat the lumbering, reigning “tortoise,” but put 100 hares out against 100 tortoises and the result is much different.

This means that any organization must conduct itself in such a way that hares have a chance to succeed internally. It means becoming open to divergence and allowing erratic genius to rise, while keeping the costs of failure manageable. It means having the courage to create an “army of hares” inside of your own organization rather than letting tortoises have their way, as they will if given the opportunity.

For a small young organization, the cost of failure isn’t all that high, comparatively speaking — you can’t fall far off a pancake. So hares tend to get a lot more leash. But for a large organization, the cost of failure tends to increase to such a pain point that it stops being tolerated. At this point, real innovation ceases.

But if we have the will and ability to create small teams and projects with “hare-like” qualities, in ways that allow the “talent + luck” equation to surface truly better and different work, while necessarily tolerating (and encouraging) failure and disruption, then we might have a shot at overcoming homeostasis — in the same way that a specific combination of engineering and fuel allows a rocket to overcome the equally strong force of gravity.

***

Still Interested? Check out our notes on James March’s books On Leadership and The Ambiguities of Experience, and an interview March did on the topic of leadership.

Our Genes and Our Behavior

“But now we are starting to show genetic influence on individual differences using DNA. DNA is a game changer; it’s a lot harder to argue with DNA than it is with a twin study or an adoption study.”
— Robert Plomin

***

It’s not controversial to say that our genetics help explain our physical traits. Tall parents will, on average, have tall children. Overweight parents will, on average, have overweight children. Irish parents have Irish-looking kids. This is true to the point of banality, and only the committedly ignorant would dispute it.

It’s slightly more controversial to talk about genes influencing behavior. For a long time, it was denied entirely. For most of the 20th century, the “experts” in human behavior had decided that “nurture” beat “nature” with a score of 100-0. Particularly influential was the child’s early life: the way their parents treated them, from the womb through early childhood. (Thanks, Freud!)

So, where are we at now?

Genes and Behavior

Developmental scientists and behavioral scientists eventually got to work with twin studies and adoption studies, which tended to show that certain traits were almost certainly heritable and not reliant on environment, thanks to the natural controlled experiments of twins separated at birth. (This eventually provided fodder for Judith Rich Harris’s wonderful work on development and personality.)

All throughout, the geneticists, starting with Gregor Mendel and his peas, kept on working. As behavioral geneticist Robert Plomin explains, the genetic camp split early on. Some people wanted to understand the gene itself in detail, using very simple traits to figure it out (eye color, long or short wings, etc.) and others wanted to study the effect of genes on complex behavior, generally:

People realized these two views of genetics could come together. Nonetheless, the two worlds split apart because Mendelians became geneticists who were interested in understanding genes. They would take a convenient phenotype, a dependent measure, like eye color in flies, just something that was easy to measure. They weren’t interested in the measure, they were interested in how genes work. They wanted a simple way of seeing how genes work.

By contrast, the geneticists studying complex traits—the Galtonians—became quantitative geneticists. They were interested in agricultural traits or human traits, like cardiovascular disease or reading ability, and would use genetics only insofar as it helped them understand that trait. They were behavior centered, while the molecular geneticists were gene centered. The molecular geneticists wanted to know everything about how a gene worked. For almost a century these two worlds of genetics diverged.

Eventually, the two began to converge. One camp (the gene people) figured out that once we could sequence the genome, they might be able to understand more complicated behavior by looking directly at genes in specific people with unique DNA, and contrasting them against one another.

The reason this whole gene-behavior game is hard is that, as Plomin makes clear, complex traits like intelligence are not like eye color. There’s no “smart gene”; intelligence comes from the interaction of thousands of different genes and can occur in a variety of combinations. Basic Mendel-style counting (the sort of dominant/recessive eye-color exercise you learned in high school biology) doesn’t work in analyzing the influence of genes on complex traits:

The word gene wasn’t invented until 1903. Mendel did his work in the mid-19th century. In the early 1900s, when Mendel was rediscovered, people finally realized the impact of what he did, which was to show the laws of inheritance of a single gene. At that time, these Mendelians went around looking for Mendelian 3:1 segregation ratios, which was the essence of what Mendel showed, that inheritance was discrete. Most of the socially, behaviorally, or agriculturally important traits aren’t either/or traits, like a single-gene disorder. Huntington’s disease, for example, is a single-gene dominant disorder, which means that if you have that mutant form of the Huntington’s gene, you will have Huntington’s disease. It’s necessary and sufficient. But that’s not the way complex traits work.

The importance of genetics is hard to overstate, but until the right technology came along, we could only observe it indirectly. A study might have shown that 50% of the variance in cognitive ability was due to genetics, but we had no idea which specific genes, in which combinations, actually produced smarter people.

But the Moore’s law-style improvement in genetic sequencing means that we can now map entire genomes quickly and at very low cost. And with that, the geneticists have a lot of data to work with, a lot of correlations to begin sussing out. The good thing about finding strong correlations between genes and human traits is that we know which one is causative: the gene! Obviously, your reading ability doesn’t cause you to have certain DNA; it must be the other way around. So “Big Data”-style screening is extremely useful, once we get a little better at it.

***

The problem is that, so far, the successes have been modest. There are millions of “ATCG” base pairs to check on. As Plomin points out, we can only pinpoint about 20% of the specific genetic influence for something simple like height, which we know is about 90% heritable. Complex traits like schizophrenia are going to take a lot of work:

We’ve got to be able to figure out where the so-called missing heritability is, that is, the gap between the DNA variants that we are able to identify and the estimates we have from twin and adoption studies. For example, height is about 90 percent heritable, meaning, of the differences between people in height, about 90 percent of those differences can be explained by genetic differences. With genome-wide association studies, we can account for 20 percent of the variance of height, or a quarter of the heritability of height. That’s still a lot of missing heritability, but 20 percent of the variance is impressive.

With schizophrenia, for example, people say they can explain 15 percent of the genetic liability. The jury is still out on how that translates into the real world. What you want to be able to do is get this polygenic score for schizophrenia that would allow you to look at the entire population and predict who’s going to become schizophrenic. That’s tricky because the studies are case-control studies based on extreme, well-diagnosed schizophrenics, versus clean controls who have no known psychopathology. We’ll know soon how this polygenic score translates to predicting who will become schizophrenic or not.

It brings up an interesting question that gets us back to the beginning of the piece: If we know that genetics influence some complex behavioral traits (and we do), and if, with the continuing progress of science and technology, we can sequence a baby’s genome and predict to a certain extent their reading level, facility with math, facility with social interaction, and so on, do we do it?

Well, we can’t until we get a general recognition that genes do indeed influence behavior and do have predictive power as far as how children perform. So far, the track record on getting educators to see that it’s all quite real is pretty bad. As with the Freudians before, there’s a resistance to the “nature” side of the debate, probably influenced by some strong ideologies:

If you look at the books and the training that teachers get, genetics doesn’t get a look-in. Yet if you ask teachers, as I’ve done, about why they think children are so different in their ability to learn to read, they know that genetics is important. When it comes to governments and educational policymakers, the knee-jerk reaction is that if kids aren’t doing well, you blame the teachers and the schools; if that doesn’t work, you blame the parents; if that doesn’t work, you blame the kids because they’re just not trying hard enough. An important message for genetics is that you’ve got to recognize that children are different in their ability to learn. We need to respect those differences because they’re genetic. Not that we can’t do anything about it.

It’s like obesity. The NHS is thinking about charging people to be fat because, like smoking, they say it’s your fault. Weight is not as heritable as height, but it’s highly heritable. Maybe 60 percent of the differences in weight are heritable. That doesn’t mean you can’t do anything about it. If you stop eating, you won’t gain weight, but given the normal life in a fast-food culture, with our Stone Age brains that want to eat fat and sugar, it’s much harder for some people.

We need to respect the fact that genetic differences are important, not just for body mass index and weight, but also for things like reading disability. I know personally how difficult it is for some children to learn to read. Genetics suggests that we need to have more recognition that children differ genetically, and to respect those differences. My grandson, for example, had a great deal of difficulty learning to read. His parents put a lot of energy into helping him learn to read. We also have a granddaughter who taught herself to read. Both of them now are not just learning to read but reading to learn.

Genetic influence is just influence; it’s not deterministic like a single gene. At government levels—I’ve consulted with the Department for Education—I don’t think they’re as hostile to genetics as I had feared, they’re just ignorant of it. Education just doesn’t consider genetics, whereas teachers on the ground can’t ignore it. I never get static from them because they know that these children are different when they start. Some just go off on very steep trajectories, while others struggle all the way along the line. When the government sees that, they tend to blame the teachers, the schools, or the parents, or the kids. The teachers know. They’re not ignoring this one child. If anything, they’re putting more energy into that child.

It’s frustrating for Plomin because he knows that eventually DNA mapping will get good enough that real, and helpful, predictions will be possible. We’ll be able to target kids early enough to make real differences — earlier than problems actually manifest — and hopefully change the course of their lives for the better. But so far, no dice.

Education is the last backwater of anti-genetic thinking. It’s not even anti-genetic. It’s as if genetics doesn’t even exist. I want to get people in education talking about genetics because the evidence for genetic influence is overwhelming. The things that interest them—learning abilities, cognitive abilities, behavior problems in childhood—are the most heritable things in the behavioral domain. Yet it’s like Alice in Wonderland. You go to educational conferences and it’s as if genetics does not exist.

I’m wondering about where the DNA revolution will take us. If we are explaining 10 percent of the variance of GCSE scores with a DNA chip, it becomes real. People will begin to use it. It’s important that we begin to have this conversation. I’m frustrated at having so little success in convincing people in education of the possibility of genetic influence. It is ignorance as much as it is antagonism.

Here’s one call for more reality recognition.

***

Still Interested? Check out a book by John Brockman of Edge.org with a curated collection of articles published on genetics.

J.K. Rowling On People’s Intolerance of Alternative Viewpoints

At the PEN America Literary Gala & Free Expression Awards, J.K. Rowling, of Harry Potter fame, received the 2016 PEN/Allen Foundation Literary Service Award. Embedded in her acceptance speech is some timeless wisdom on tolerance and acceptance:

Intolerance of alternative viewpoints is spreading to places that make me, a moderate and a liberal, most uncomfortable. Only last year, we saw an online petition to ban Donald Trump from entry to the U.K. It garnered half a million signatures.

Just a moment.

I find almost everything that Mr. Trump says objectionable. I consider him offensive and bigoted. But he has my full support to come to my country and be offensive and bigoted there. His freedom to speak protects my freedom to call him a bigot. His freedom guarantees mine. Unless we take that absolute position without caveats or apologies, we have set foot upon a road with only one destination. If my offended feelings can justify a travel ban on Donald Trump, I have no moral ground on which to argue that those offended by feminism or the fight for transgender rights or universal suffrage should not oppress campaigners for those causes. If you seek the removal of freedoms from an opponent simply on the grounds that they have offended you, you have crossed the line to stand alongside tyrants who imprison, torture and kill on exactly the same justification.

Too often we look at the world through our own eyes and fail to acknowledge the eyes of others. In so doing we often lose touch with reality.

The quick reaction our brains have to people who disagree with us is often that they are idiots. They shouldn’t be allowed to talk or have a platform. They should lose.

This reminds me of Kathryn Schulz’s insightful view on what we do when someone disagrees with us.

As a result, we dismiss the views of others, failing even to consider that our own view of the world might be wrong.

It’s easy to be dismissive and intolerant of others. It’s easy to say they’re idiots and wish they didn’t have the same rights you have. It’s harder to map that to the very freedoms we enjoy and relate it to the world we want to live in.