Category: People

12 Things Lee Kuan Yew Taught Me About the World

“It’s no accident that Singapore has a much better record, given where it started, than the United States. There, power was concentrated in one enormously talented person, Lee Kuan Yew, who was the Warren Buffett of Singapore.”
— Charlie Munger

***

Singapore seemed destined for failure or subservience to a more powerful neighbor. The country is by far the smallest in Southeast Asia and was not gifted with many natural resources. Lee Kuan Yew thought otherwise. “His vision,” wrote Henry Kissinger, “was of a state that would not simply survive, but prevail by excelling. Superior intelligence, discipline, and ingenuity would substitute for resources.”

To give you an idea of the magnitude of success that Lee Kuan Yew achieved, when he took over, per capita income was about $400 and now, in only about two generations, it exceeds $50,000.
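To get a feel for what that climb implies, here is a quick back-of-the-envelope sketch. It uses the two figures quoted above and treats “about two generations” as roughly 50 years (an assumption for illustration, not a figure from the book).

```python
# Back-of-the-envelope only: $400 and $50,000 are the per capita figures quoted
# above; the 50-year span is an assumed stand-in for "about two generations".
start, end, years = 400, 50_000, 50

# Implied compound annual growth rate
cagr = (end / start) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")  # roughly 10% per year

# Sanity check: compounding $400 at that rate for 50 years recovers ~$50,000
print(f"Check: ${start * (1 + cagr) ** years:,.0f}")
```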

Here are 12 things I learned about the world, and about the source of many of our present ills, from reading Lee Kuan Yew: The Grand Master’s Insights on China, the United States, and the World.

  1. You need a free exchange of ideas. “China will inevitably catch up to the U.S. in absolute GDP. But its creativity may never match America’s, because its culture does not permit a free exchange and contest of ideas.”
  2. Technology will change how governance operates. “Technology is going to make (China’s) system of governance obsolete. By 2030, 70% or maybe 75% of their people will be in cities, small towns, big towns, mega big towns. They are going to have cell phones, Internet, satellite TV. They are going to be well-informed; they can organize themselves. You cannot govern them the way you are governing them now, where you just placate and monitor a few people, because the numbers will be so large.”
  3. Don’t try to install a democracy in a country that has never had one. “I do not believe you can impose on other countries standards which are alien and totally disconnected with their past. So to ask China to become a democracy, when in its 5,000 years of recorded history it never counted heads; all rulers ruled by right of being the emperor, and if you disagree, you chop off heads, not count heads.”
  4. Welcome the best the world has to offer. “Throughout history, all empires that succeeded have embraced and included in their midst people of other races, languages, religions, and cultures.”
  5. It’s about results, not promises. “When you have a popular democracy, to win votes you have to give more and more. And to beat your opponent in the next election, you have to promise to give more away. So it is a never-ending process of auctions—and the cost, the debt being paid for by the next generation. Presidents do not get reelected if they give a hard dose of medicine to their people. So, there is a tendency to procrastinate, to postpone unpopular policies in order to win elections. So problems such as budget deficits, debt, and high unemployment have been carried forward from one administration to the next.”
  6. Governments shouldn’t have an easy way out. “American and European governments believed that they could always afford to support the poor and the needy: widows, orphans, the old and homeless, disadvantaged minorities, unwed mothers. Their sociologists expounded the theory that hardship and failure were due not to the individual person’s character, but to flaws in the economic system. So charity became “entitlement,” and the stigma of living on charity disappeared. Unfortunately, welfare costs grew faster than the government’s ability to raise taxes to pay for it. The political cost of tax increases is high. Governments took the easy way out by borrowing to give higher benefits to the current generation of voters and passing the costs on to the future generations who were not yet voters. This resulted in persistent government budget deficits and high public debt.”
  7. What goes into a standard of living? “A people’s standard of living depends on a number of basic factors: first, the resources it has in relation to its population . . .; second, its level of technological competence and standards of industrial development; third, its educational and training standards; and fourth, the culture, the discipline and drive in the workforce.”
  8. The single most important factor to national competitiveness … “The quality of a nation’s manpower resources is the single most important factor determining national competitiveness. It is a people’s innovativeness, entrepreneurship, team work, and their work ethic that give them the sharp keen edge in competitiveness. Three attributes are vital in this competition—entrepreneurship to seek out new opportunities and to take calculated risks. Standing still is a sure way to extinction. . . . The second attribute, innovation, is what creates new products and processes that add value. . . . The third factor is good management. To grow, company managements have to open up new markets and create new distribution channels. The economy is driven by the new knowledge, new discoveries in science and technology, innovations that are taken to the market by entrepreneurs. So while the scholar is still the greatest factor in economic progress, he will be so only if he uses his brains—not in studying the great books, classical texts, and poetry, but in capturing and discovering new knowledge, applying himself in research and development, management and marketing, banking and finance, and the myriad of new subjects that need to be mastered.”
  9. Earning your place in history … “A nation is great not by its size alone. It is the will, the cohesion, the stamina, the discipline of its people, and the quality of their leaders which ensure it an honorable place in history.”
  10. Weak leaders rely on opinion polls. “I have never been overconcerned or obsessed with opinion polls or popularity polls. I think a leader who is, is a weak leader. If you are concerned with whether your rating will go up or down, then you are not a leader. You are just catching the wind … you will go where the wind is blowing. . . . Between being loved and feared, I have always believed Machiavelli was right. If nobody is afraid of me, I am meaningless. When I say something … I have to be taken very seriously.”
  11. We are fundamentally competitive. “Human beings are not born equal. They are highly competitive. Systems like Soviet and Chinese communism have failed, because they tried to equalize benefits. Then nobody works hard enough, but everyone wants to get as much as, if not more than, the other person.”
  12. The value of history: “If you do not know history, you think short term. If you know history, you think medium and long term.”

 

***

Lee Kuan Yew: The Grand Master’s Insights on China, the United States, and the World offers more of Lee Kuan Yew’s timeless wisdom.

Towards a Greater Synthesis: Steven Pinker on How to Apply Science to the Humanities

The fundamental idea behind Farnam Street is to learn to think across disciplines and synthesize, using ideas in combination to solve problems in novel ways.

An easy example would be to take a fundamental idea of psychology like the concept of a near-miss (deprival super-reaction) and use it to help explain the success of a gambling enterprise. Or, similarly, using the idea of the endowment effect to help explain why lotteries are a lot more successful if you allow people to choose their own numbers. Sometimes we take ideas from hard science, like the idea of runaway feedback (think of a nuclear reaction gaining steam), to explain why small problems can become large problems or small advantages can become large ones.

This kind of reductionism and synthesis helps one understand the world at a fundamental level and solve new problems.

We’re sometimes asked about untapped ways this thinking can be applied. The question occasionally suggests that people have fallen into the trap of believing all of the great cross-disciplinary thinking has been done. Or maybe even that all of the great thinking has been done, period.


Harvard psychologist Steven Pinker is here to say we have a long way to go.

We’ve written before about Pinker’s ideas on a broad education and on writing, but he’s also got a great essay on Edge.org called Writing in the 21st Century wherein he addresses some of the central concepts of his book on writing — The Sense of Style. While the book’s ideas are wonderful, later in the article he moves to a more general point useful for our purposes: Systematic application of the “harder” sciences to the humanities is a huge untapped source of knowledge.

He provides some examples that are fascinating in their potential:

This combination of science and letters is emblematic of what I hope to be a larger trend we spoke of earlier, namely the application of science, particularly psychology and cognitive science, to the traditional domains of humanities. There’s no aspect of human communication and cultural creation that can’t benefit from a greater application of psychology and the other sciences of mind. We would have an exciting addition to literary studies, for example, if literary critics knew more about linguistics. Poetry analysts could apply phonology (the study of sound structure) and the cognitive psychology of metaphor. An analysis of plot in fiction could benefit from a greater understanding of the conflicts and confluences of ultimate interests in human social relationships. The genre of biography would be deepened by an understanding of the nature of human memory, particularly autobiographical memory. How much of the memory of our childhood is confabulated? Memory scientists have a lot to say about that. How much do we polish our image of ourselves in describing ourselves to others, and more importantly, recollecting our own histories? Do we edit our memories in an Orwellian manner to make ourselves more coherent in retrospect? Syntax and semantics are relevant as well. How does a writer use the tense system of English to convey a sense of immediacy or historical distance?

In music the sciences of auditory and speech perception have much to contribute to understanding how musicians accomplish their effects. The visual arts could revive an old method of analysis going back to Ernst Gombrich and Rudolf Arnheim in collaboration with the psychologist Richard Gregory. Indeed, even the art itself in the 1920s was influenced by psychology, thanks in part to Gertrude Stein, who as an undergraduate student of William James did a wonderful thesis on divided attention, and then went to Paris and brought the psychology of perception to the attention of artists like Picasso and Braque. Gestalt psychology may have influenced Paul Klee and the expressionists. Since then we have lost that wonderful synergy between the science of visual perception and the creation of visual art.

Going beyond the arts, the social sciences, such as political science could benefit from a greater understanding of human moral and social instincts, such as the psychology of dominance, the psychology of revenge and forgiveness, and the psychology of gratitude and social competition. All of them are relevant, for example, to international negotiations. We talk about one country being friendly to another or allying or competing, but countries themselves don’t have feelings. It’s the elites and leaders who do, and a lot of international politics is driven by the psychology of its leaders.

In this short section alone, Pinker offers realistically that we can apply:

  • Linguistics to literature
  • Phonology and psychology to poetry
  • The biology of groups to understand fiction
  • The biology of memory to understand biography
  • Semantics to understand historical writing
  • Psychology and biology to understand art and music
  • Psychology and biology to understand politics

Turns out, there’s a huge amount of thinking left to be done. Effectively, Pinker is asking us to imitate the scientist Linus Pauling, who sought to systematically understand chemistry by using the next most fundamental discipline, physics. That approach led to great breakthroughs and to a consilience between the two fields that is now taken for granted in modern science.

Towards a Greater Synthesis

Even if we’re not trying to make great scientific advances, think about how we could apply this idea to all of our lives. Fields like basic mathematics, statistics, biology, physics, and psychology provide deep insight into the “higher level” functions of humanity like law, medicine, politics, business, and social groups. Or, as Munger has put it, “When you get down to it, you’ll find worldly wisdom looks pretty darn academic.” And it isn’t as hard as it sounds: We don’t need to understand the deep math of relativity to grasp the idea that two observers can see the same event in a different way depending on perspective. The rest of the world’s models are similar, although having some mathematical fluency is necessary.

Pinker, like Munger, doesn’t stop there. He also believes in what Munger calls the ethos of hard science, which is a way of rigorously considering the problems of the practical world.

Even beyond applying the findings of psychology and cognitive science and social and affective neuroscience, it’s the mindset of science that ought to be exported to cultural and intellectual life as a whole. That consists in increased skepticism and scrutiny about factual conventional wisdom: How much of what you think is true really is true if you go to the numbers? For me this has been a salient issue in analyzing violence, because the conventional wisdom is that we’re living in extraordinarily violent times.

But if you take into account the psychology of risk perception, as pioneered by Daniel Kahneman, Amos Tversky, Paul Slovic, Gerd Gigerenzer, and others, you realize that the conventional wisdom is systematically distorted by the source of our information about the world, namely the news. News is about the stuff that happens; it’s not about the stuff that doesn’t happen. Human risk perception is affected by memorable examples, according to Tversky and Kahneman’s availability heuristic. No matter what the rate of violence is objectively, there are always enough examples to fill the news. And since our perception of risk is influenced by memorable examples, we’ll always think we’re living in violent times. It’s only when you apply the scientific mindset to world events, to political science and history, and try to count how many people are killed now as opposed to ten years ago, a hundred years ago, or a thousand years ago that you get an accurate picture about the state of the world and the direction that it’s going, which is largely downward. That conclusion only came from applying an empirical mindset to the traditional subject matter of history and political science.

Nassim Taleb has been on a similar hunt for a long time (although, amusingly, he doesn’t like Pinker’s book on violence at all). The question is relatively straightforward: How do we know what we know? Traditionally, what we know has simply been based on what we can see, something now called the availability bias. In other words, because we see our grandmother live to 95 years old while eating carrots every day, we think carrots prevent cancer. (A conflation of correlation and causation.)

But Pinker and Taleb call for a higher standard called empiricism, which requires pushing beyond anecdote into an accumulation of sound data to support a theory, with disconfirming examples weighted as heavily as confirming ones. This shift from anecdote to empiricism led humanity to make some of its greatest leaps of understanding, yet we’re still falling into the trap regularly, an outcome which itself can be explained by evolutionary biology and modern psychology. (Hint: It’s in the deep structure of our minds to extrapolate.)
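To make the counting that Pinker describes concrete, here is a minimal sketch of the normalization involved: comparing eras by the rate of violent deaths per 100,000 people rather than by the raw count of memorable events. The numbers below are invented placeholders for illustration only, not real data.

```python
# Invented, illustrative numbers only (not real data). The point: absolute
# deaths can rise while the per capita rate, which is what actually answers
# "are we more violent?", falls, because population grows far faster.
eras = {
    # era: (violent deaths per year, population)
    "earlier era": (50_000, 100_000_000),
    "today": (80_000, 1_000_000_000),
}

for era, (deaths, population) in eras.items():
    rate = deaths / population * 100_000  # deaths per 100,000 people per year
    print(f"{era}: {deaths:,} deaths in absolute terms, "
          f"but {rate:.1f} per 100,000 people")
```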

Learning to Ask Why

Pinker continues with a claim that Munger would dearly appreciate: The search for explanations is how we push into new ideas. The deeper we push, the better we understand.

The other aspect of the scientific mindset that ought to be exported to the rest of intellectual life is the search for explanations. That is, not to just say that history is one damn thing after another, that stuff happens, and there’s nothing we can do to explain why, but to relate phenomena to more basic or general phenomena … and to try to explain those phenomena with still more basic phenomena. We’ve repeatedly seen that happen in the sciences, where, for example, biological phenomena were explained in part at the level of molecules, which were explained by chemistry, which was explained by physics.

There’s no reason that this process of explanation can’t continue. Biology gives us a grasp of the brain, and human nature is a product of the organization of the brain, and societies unfold as they do because they consist of brains interacting with other brains and negotiating arrangements to coordinate their behavior, and so on.

This idea certainly takes heat. The biologist E.O. Wilson calls it Consilience, and has gone as far as saying that all human knowledge can eventually be reduced to extreme fundamentals like mathematics and particle physics. (Leading to something like The Atomic Explanation of the Civil War.)

Whether or not you take it to such an extreme depends on your boldness and your confidence in the mental acuity of human beings. But even if you think Wilson is crazy, you can still learn deeply from the more fundamental knowledge in the world. This push to reduce things to their simplest explanations (but not simpler) is how we array all new knowledge and experience on a latticework of mental models.

For example, instead of taking Warren Buffett’s dictum that markets are irrational on its face, try to understand why. What about human nature and the dynamics of human groups leads to that outcome? What about biology itself leads to human nature? And so on. You’ll eventually hit a wall; that’s a certainty. But the further you push, the more fundamentally you understand the world. Elon Musk calls this first principles thinking and credits it with helping him do things in engineering and business that almost everyone considered impossible.

***

From there, Pinker concludes with a thought that hits near and dear to our hearts:

There is no “conflict between the sciences and humanities,” or at least there shouldn’t be. There should be no turf battle as to who gets to speak about what matters. What matters are ideas. We should seek the ideas that give us the deepest, richest, best-informed understanding of the human condition, regardless of which people or what discipline originates them. That has to include the sciences, but it can’t come only from the sciences. The focus should be on ideas, not on people, disciplines, or academic traditions.


Still Interested?
Start building your mental models and read some more Pinker for more goodness.

Elon Musk and the Question of Overconfidence

Ashlee Vance’s book on Elon Musk is widely read for a good reason: It’s a fascinating look at a fascinating person.

You can interpret the book however you like. It’s a tale of genius. It’s a tale of someone driven beyond all reason to succeed. It’s a tale of a brilliant, talented engineer/entrepreneur. It’s the tale of someone trying to overcome a difficult childhood by setting audacious goals for himself and accomplishing them. It’s the tale of someone creating the future instead of waiting for it. It’s the tale of a deluded, arrogant, thrice-divorced jerk. It’s the tale of a guy with an IQ of 190 who thinks it’s 250. (Munger made that claim a few years ago — we were there.)

Frankly, it doesn’t matter how you see Musk — he is who he is (which brings to mind Eminem’s lyrics: I am, whatever you say I am. If I wasn’t, why would you say I am). But however you choose to read the book, and read it you should, there’s one part of the tale that it would be hard to disagree with: The guy is chasing larger goals than essentially anyone else, and he’s made a surprising amount of progress towards achieving them in a very short time.

His mind is different than yours or mine.

Which raises a good question: Why would Charlie Munger, an admitted science/engineering nut, the guy with a fanatical devotion to the Chinese firm BYD for its engineering culture and its aggressively entrepreneurial CEO, accuse Musk of attempting too much? (“Personally, I’m scared of the guy,” Munger added.)

***

As Vance describes in the book, in 2001 Musk came to aerospace engineer Jim Cantrell with his most audacious question to date: How do we become a multi-planetary species? 

Musk wanted to know how we could create a sustainable colony on Mars, a sort of “backup plan” for humanity.

That launched (puns!) Musk’s now 14-year-old venture SpaceX, a private business dedicated to putting a sustainable human colony on Mars. (Yes, really.) That would require first figuring out a low-cost method of launching rockets into space, to get us and our supplies to the colony.

Prior to SpaceX, Musk was best known for co-founding PayPal — he was considered a bright software engineer and an up-and-coming entrepreneur, but Rocket Man? Not so much. To that end, someone recently asked a good question on Quora: How did Elon Musk learn enough about rockets to run SpaceX?

Luckily, Musk’s friend and SpaceX co-founder Jim Cantrell took note and left a response that is interesting because Cantrell doesn’t really answer the stated question so much as (what seems to us) a better one:


What allows Musk to attempt and complete projects everyone else considers impossible? 

Those projects would include designing rockets from scratch, creating a successful private company to put them into orbit (it hadn’t been done), starting a car company from scratch (it hadn’t been done in the U.S. since 1925), designing a fully electric vehicle that was also considered cool and desirable, and selling cars directly to consumers, among other projects.

We reprint Cantrell’s Quora answer, and recommend you take a minute to consider the merits and demerits of his approach to life.

What I found from working with Elon is that he starts by defining a goal and he puts a lot of effort into understanding what that goal is and why it is a good and valid goal.  His goal, as I see it, has not changed from the day he first called me in August of 2001.  I still hear it in his speeches.  His goal was to make mankind a multi planetary species and to do that he had to first solve the transportation problem.

Once he has a goal, his next step is to learn as much about the topic at hand as possible from as many sources as possible. He is by far the single smartest person that I have ever worked with … period. I can’t estimate his IQ but he is very very intelligent. And not the typical egg head kind of smart. He has a real applied mind. He literally sucks the knowledge and experience out of people that he is around. He borrowed all of my college texts on rocket propulsion when we first started working together in 2001. We also hired as many of my colleagues in the rocket and spacecraft business that were willing to consult with him. It was like a gigantic spaceapalooza. At that point we were not talking about building a rocket ourselves, only launching a privately funded mission to Mars. I found out later that he was talking to a bunch of other people about rocket designs and collaborating on some spreadsheet level systems designs for launchers. Once our dealings with the Russians fell apart, he decided to build his own rocket and this was the genesis of SpaceX.

So I am going to suggest that he is successful not because his visions are grand, not because he is extraordinarily smart and not because he works incredibly hard. All of those things are true. The one major important distinction that sets him apart is his inability to consider failure. It simply is not even in his thought process. He cannot conceive of failure and that is truly remarkable. It doesn’t matter if it’s going up against the banking system (Paypal), going up against the entire aerospace industry (SpaceX) or going up against the US auto industry (Tesla). He can’t imagine NOT succeeding and that is a very critical trait that leads him ultimately to success. He and I had very similar upbringings, very similar interests and very similar early histories. He was a bit of a loner and so was I. He decided to start a software company at age 13. I decided to design and build my own stereo amplifier system at age 13. Both of us succeeded at it. We both had engineers for fathers and were extremely driven kids. What separated us, I believe, was his lack of even being able to conceive failure. I know this because this is where we parted ways at SpaceX. We got to a point where I could not see it succeeding and walked away. He didn’t and succeeded. I have 25 years experience building space hardware and he had none at the time. So much for experience.

I recently wrote an op-ed piece for Space News where I also suggest that his ruthlessly efficient way to deploy capital is another great reason for his success. He can almost smell the right way through a problem and he drives his staff and his organization hard to achieve it. The results speak for themselves. The article is here: End of WWII Model Shakes Up Aerospace Industry.

In the end I think that we are seeing a very fundamental shift in the way our world takes on the big challenges facing humanity and Elon’s Way as I call it will be considered the tip of the spear.  My hat’s off to the man.

Our hats off to him too. For sure. But this first-hand account does solve the Munger puzzle to an extent.

Here’s Munger in 1998, in a speech to a group of foundation CIOs, including representatives from the Hilton Foundation and the Getty Trust.

Similarly, the hedge fund known as ‘Long-Term Capital Management’ recently collapsed, through overconfidence in its highly leveraged methods, despite I.Q.’s of its principals that must have averaged 160. Smart, hard-working people aren’t exempted from professional disasters from overconfidence. Often, they just go aground in the more difficult voyages they choose, relying on their self-appraisals that they have superior talents and methods.

We’ll leave it up to you to judge whether Musk may go aground in his various ventures — SpaceX, Tesla, and SolarCity among them — but it’s hard not to be impressed at the work completed so far. The world needs more people like him, not fewer.
***

Still Interested? 
Check out Musk’s thoughts on regulators, the 12 books he recommended in 2014, or his system of first-principles thinking.

What Can We Learn From the Prolific Mr. Asimov?

To learn is to broaden, to experience more, to snatch new aspects of life for yourself. To refuse to learn or to be relieved at not having to learn is to commit a form of suicide; in the long run, a more meaningful type of suicide than the mere ending of physical life. 

Knowledge is not only power; it is happiness, and being taught is the intellectual analog of being loved.

— Isaac Asimov, Yours, Isaac Asimov: A Life in Letters

 

Fans estimate that the erudite polymath Isaac Asimov authored nearly 500 full-length books during his life. Even if some that “don’t count” are removed from the list — anthologies he edited, short science books he wrote for young people and so on — Asimov’s output still reaches into the many hundreds of titles.  Starting with a spate of science-fiction novels in the 1950s, including the now-classic Foundation series, Asimov’s writing eventually ranged into non-fiction with works of popular science, Big History, and even annotated guides to classic novels like Paradise Lost and Gulliver’s Travels.

Among his works was a 1,200-page Guide to the Bible; he also wrote books on Greece, Rome, Egypt, and the Middle East; he wrote a wonderful Guide to Shakespeare and a comprehensive Chronology of the World; he wrote books on Carbon, Nitrogen, Photosynthesis, The Moon, The Sun, and the Human Body, along with many more scientific topics. He coined the term “robotics” and his stories led to modern movies like I, Robot and Bicentennial Man. He wrote one of the most popular stories of all time: The Last Question. He even wrote a few joke books and a book of limericks.

His Intelligent Man’s Guide to Science, a 500,000-word epic written in a mad dash of eight months, was nominated for a National Book Award in 1961, losing only to William Shirer’s bestselling history of Nazi Germany, The Rise and Fall of the Third Reich.

His science-fiction books continue to sell to this day and are considered foundational works of the genre. He won more than a dozen book awards. His science and history books were considered some of the best published for lay audiences — the only real complaint we can make is that a few of them are outdated now. (We’ll give Asimov a pass for not updating them, since he’s been dead for almost 25 years.)

In his free time, he was reputed to have written over 90,000 letters while keeping a monthly column in the Magazine of Fantasy and Science Fiction for 33 years between 1958 and 1991. Between the Magazine and numerous other outlets, Asimov compiled somewhere near 1,600 essays throughout his life.

In other words, the man was a writer through and through, leading to a question that begs to be asked:

What can we mortals learn from the Prolific Mr. Asimov?

Make the Time — No Excuses

Many people complain that they don’t have time for their passions because of the unavoidable duties which suck up every free moment. Well, Asimov had duties too, but he got his writing career started anyway. From 1949 until 1958, Asimov doubled as a professor of biochemistry at Boston University, a period during which he completed 28 novels and a list of short stories long enough to fill most writers’ entire career. He simply made the time to write.

In a posthumously published memoir, Asimov reflects on the “candy store” schedule instilled in him by his father, who’d worked long hours running a candy store in New York after emigrating from Russia. As Asimov became a professional writer, he kept the heroic schedule for himself:

I wake at five in the morning. I get to work as early as I can. I work as long as I can. I do this every day of the week, including holidays. I don’t take vacations voluntarily and I try to do my work even when I’m on vacation. (And even when I’m in the hospital.)

In other words, I am still and forever in the candy store. Of course, I’m not waiting on customers; I’m not taking money and making change; I’m not forced to be polite to everyone who comes in (in actual fact, I was never good at that). I am, instead, doing things I very much want to do — but the schedule is there; the schedule that was ground into me; the schedule you would think I would have rebelled against once I had the chance.

Know your Spots, and Stick to those Spots 

“I’m no genius, but I’m smart in spots, and I stay around those spots.”
—Thomas Watson, Sr., Founder of IBM

Even though he’d been writing in his spare time as a professor, Asimov was not doing any academic research, which did not go unnoticed by his superiors at Boston University. Asimov’s success as an author combined with his dedication to his craft had forced him into a decision: Be an academic or be a popular writer. The decision needed no fretting — he was making so much money and such a large impact as a writer, he knew he’d be a fool to give it up. His rationalization to the school was wise and instructive:

I finally felt angry enough to say, “…as a science writer, I am extraordinary. I plan to be the best science writer in the world and I will shed luster on the medical school [at BU]. As a researcher, I am simply mediocre and…if there’s one thing this school does not need, it is one more merely mediocre researcher.”

[One faculty member complimented him on his bravery in fighting for academic freedom.] I shrugged, “There’s no bravery about it. I have academic freedom and I can give it to you in two words:

“What’s that?” he said.

“Outside income,” I said.

In other words, Asimov knew his circle of competence and knew himself. He made that clear again in a 1988 interview, when he was asked about a number of other projects and interests outside of writing. He demurred on all of them:

SW: Do you have any time left for other things besides writing?

IA: All I do is write. I do practically nothing else, except eat, sleep and talk to my wife.

[…]

SW: Have you ever written any screenplays for SF movies?

IA: No, I’ve no talent for that and I don’t want to get mixed up with Hollywood. If they are going to do something of mine, they will have to find someone else to write the screenplays.

[…]

SW: Do you like the covers of your books? Do you have any input in their design?

IA: No, I don’t have any input into that. Publishers take care of that entirely. They never ask any questions and I never offer any advice, because my artistic talent is zero.

[…]

SW: Do you have a favorite SF painter?

IA: Well, there are a number of painters that I like very much. To name just a few: Michael Whelan and Boris Vallejo are among my favorites. I’m impressed by them, but that doesn’t necessarily mean anything – I don’t know that I have any taste in art.

[…]

SW: Have you ever tried to paint something yourself?

IA: No, I can’t even draw a straight line with a ruler.

[…]

SW: Do you have any favorite SF writers?

IA: My favorite is Arthur Clarke. I also like people like Fred Pohl or Larry Niven and others who know their science. I like Harlan Ellison, too, although his stories are terribly emotional. But I don’t consider myself a judge of good science-fiction – not even my own.

Asimov knew and recognized his own constitution at a fairly early age, smartly seizing opportunities to build his life around that self-awareness in the way Hunter S. Thompson would advise young people to do years later.

In a separate posthumously published autobiography, Asimov reflected on his highly independent nature:

I never found true peace till I turned my whole working life into self-employment. I was not made to be an employee.

For that matter, I strongly suspect I was not made to be an employer either. At least I have never had the urge to have a secretary or helper of any kind. My instinct tells me that there would surely be interactions that would slow me down. Better to be a one-man operation, which I eventually became and remained.

Find What you Love, and Work Like Hell

To be prolific, he warns, one must be a “single-minded, driven, non-stop person.”
— Interview with Isaac Asimov, 1979

Although Asimov was working the “candy store” hours and producing more output than nearly anyone of his generation, it was clear that he did it out of love.  The only reason he was able to write so much, he said, was “pure hedonism.”  He simply couldn’t not write. That would have been unfathomable.

One admission from his autobiography tells the tale best:

One of the few depressing lunches I have had with Austin Olney [Houghton Mifflin editor] came on July 7, 1959. I incautiously told him of the various books I had in progress, and he advised me strongly not to write so busily. He said my books would compete with each other, interfere with each other’s sales, and do less well per book if there were many.

The one thing I had learned in my ill-fated class in economics in high school was “the law of diminishing returns,” whereby working ten times as hard or investing ten times as much or producing ten times the quantity does not yield ten times the return.

I was rather glum after that meal and gave the matter much thought afterward.

What I decided was that I wasn’t writing ten times as many books in order to get ten times the monetary returns, but in order to have ten times the pleasure.

One of Asimov’s best methods to keep the work flowing was to have more than one project going at a time. If he got writer’s block or got bored with one project, he simply switched to another project, a tactic which kept him from stopping work to agonize and procrastinate. By the time he came back to the first project, he found the writing flowed easily once again.

This sort of “switching” is a hugely useful method to improve your overall level of productivity and avoid major hair-pulling roadblocks. You can also use this tactic with books to improve your overall reading yield, switching between them as your mood and energy dictate.

Never Stop Learning

If anything besides sheer productivity defined Asimov, it was a thirst for knowledge. He simply never stopped learning, and with that attitude, he grew into a mental giant who was more than once accused of “knowing everything”:

Nothing goes to waste, if you’re determined to learn. I had already learned, for instance, that although I was one of the most overeducated people I knew, I couldn’t possibly write the variety of books I manage to do out of the knowledge I had gained in school alone. I had to keep a program of self-education in process. 

[…]

And, as I went on to discover, each time I wrote a book on some subject outside my immediate field it gave me courage and incentive to do another one that was perhaps even farther outside the narrow range of my training…I advanced from chemical writer to science writer, and, eventually, I took all of my learning for my subject (or at least all that I could cram into my head — which, alas, had a sharply limited capacity despite all I could do).

As I did so, of course, I found that I had to educate myself. I had to read books on physics to reverse my unhappy experiences in school on the subject and to learn at home what I had failed to learn in the classroom — at least up to the point where my limited knowledge of mathematics prevented me from going farther.

When the time came, I read biology, medicine, and geology. I collected commentaries on the Bible and on Shakespeare. I read history books. Everything led to something else. I became a generalist by encouraging myself to be generally interested in all matters.

[…]

As I look back on it, it seems quite possible that none of this would have happened if I had stayed at school and had continued to think of myself as, primarily, a biochemist…[so] I was forced along the path I ought to have taken of my own accord if I had had the necessary insight into my own character and abilities.

(Source: It’s Been a Good Life)

Still interested? Check out Asimov’s memoir I, Asimov, his collection of stories I, Robot, or his collection of letters, Yours, Isaac Asimov: A Life in Letters.

When Breath Becomes Air: What Makes Life Worth Living in the Face of Death?

“When you come to one of the many moments in life where you must give an account of yourself, provide a ledger of what you have been, and done, and meant to the world, do not, I pray, discount that you filled a dying man’s days with a sated joy, a joy unknown to me in all my prior years, a joy that does not hunger for more and more but rests, satisfied. In this time, right now, that is an enormous thing.”

***

Dr. Paul Kalanithi was 36 years old and in his final year as a neurosurgical resident when he was diagnosed with terminal cancer. His beautifully written memoir, When Breath Becomes Air, published posthumously, chronicles his lifelong quest to learn what gives life meaning.

Kalanithi’s wife Lucy, also a doctor, explains in the epilogue why he chose to write about his experience.

Paul confronted death – examined it, wrestled with it, accepted it – as a physician and a patient. He wanted to help people understand death and face their mortality. Dying in one’s fourth decade is unusual now, but dying is not.

In a letter to a friend, he writes, “That’s what I’m aiming for, I think. Not the sensationalism of dying, and not exhortations to gather rosebuds, but: Here’s what lies up ahead on the road.”

In When Breath Becomes Air, Kalanithi shares his journey along that road as he transitions from doctor to patient and comes face-to-face with his own mortality.

As a student

Before studying medicine at Yale, Kalanithi had earned a BA and an MA in English literature, a BA in biology and an MPhil in the history and philosophy of science and medicine. He was interested in discovering where “biology, morality, literature and philosophy intersect”.

I was driven less by achievement than by trying to understand, in earnest: What makes human life meaningful? I still felt literature provided the best account of the life of the mind, while neuroscience laid down the most elegant rules of the brain.

Throughout college, my monastic, scholarly study of human meaning would conflict with my urge to forge and strengthen the human relationships that formed that meaning. If the unexamined life was not worth living, was the unlived life worth examining?

After years of theoretical discussions about mortality and the meaning of life, he came to the conclusion that “direct experience of life-and-death questions was essential to generating substantial moral opinions about them”. And so, he chose to study medicine.

As a physician

In Being Mortal: Medicine and What Matters in the End, Dr. Atul Gawande calls for change in the way medical professionals deal with illness. While medical science has given us the ability to extend life, it does not ask – or answer – the question of when life still has meaning.

The problem with medicine and the institutions it has spawned for the care of the sick and the old is not that they have had an incorrect view of what makes life significant. The problem is that they have had almost no view at all. Medicine’s focus is narrow. Medical professionals concentrate on repair of health, not sustenance of the soul. Yet – and this is the painful paradox – we have decided that they should be the ones who largely define how we live in our waning days.

As a neurosurgical resident, Kalanithi was well aware of this paradox and the interplay between our medical choices and the things that give our lives meaning.

While all doctors treat diseases, neurosurgeons work in the crucible of identity: every operation on the brain is, by necessity, a manipulation of the substance of our selves, and every conversation with a patient undergoing brain surgery cannot help but confront this fact…At those critical junctures, the question is not simply whether to live or die but what kind of life is worth living. Would you trade your ability – or your mother’s – to talk for a few extra months of mute life? The expansion of your visual blind spot in exchange for eliminating the small possibility of a fatal brain hemorrhage? Your right hand’s function to stop seizures? How much neurologic suffering would you let your child endure before saying that death is preferable? Because the brain mediates our experience of the world, any neurosurgical problem forces a patient and family, ideally with a doctor as a guide, to answer this question: What makes life meaningful enough to go on living?

Both Gawande and Kalanithi help us recognize that knowing what we – and our loved ones – value in life will inform the choices we make about death when that time comes.

As a patient

What happens to your identity and sense of purpose when your plan for the next 40 years is suddenly wiped off the table?

My brother Jeevan had arrived at my bedside. “You’ve accomplished so much,” he said. “You know that, don’t you?”

I sighed. He meant well, but the words rang hollow. My life had been building potential, potential that would now go unrealized. I had planned to do so much, and I had come so close. I was physically debilitated, my imagined future and my personal identity collapsed, and I faced the same existential quandaries my patients faced. The lung cancer was confirmed. My carefully planned and hard-won future no longer existed.

After the diagnosis, Kalanithi was forced to re-evaluate what was most valuable to him.

While being trained as a physician and scientist had helped me process the data and accept the limits of what that data could reveal about my prognosis, it didn’t help me as a patient. It didn’t tell Lucy and me whether we should go ahead and have a child, or what it meant to nurture a new life while mine faded. Nor did it tell me whether to fight for my career, to reclaim the ambitions I had single-mindedly pursued for so long, but without the surety of the time to complete them.

Like my own patients, I had to face my mortality and try to understand what made my life worth living…

The old adage to ‘live each day as if it were your last’ loses strength under scrutiny. What gives our lives meaning on any given day depends to some extent on how imminent we believe death is.

Grand illnesses are supposed to be life-clarifying. Instead, I knew I was going to die – but I’d known that before. My state of knowledge was the same, but my ability to make lunch plans had been shot to hell. The way forward would seem obvious, if only I knew how many months or years I had left. Tell me three months, I’d spend time with family. Tell me one year, I’d write a book. Give me ten years, I’d go back to treating diseases. The truth that you live one day at a time didn’t help: What was I supposed to do with that day?

In searching for solace, Kalanithi returned to his love of literature.

And so it was literature that brought me back to life during this time. The monolithic uncertainty of my future was deadening; everywhere I turned, the shadow of death obscured the meaning of any action. I remember the moment my overwhelming unease yielded, when that seemingly impassable sea of uncertainty parted. I woke up in pain, facing another day – no project beyond breakfast seemed tenable. I can’t go on, I thought, and immediately, its antiphon responded, completing Samuel Beckett’s seven words, words I had learned long ago as an undergraduate: I’ll go on. I got out of bed and took a step forward, repeating the phrase over and over: “I can’t go on. I’ll go on.”

That morning, I made a decision: I would push myself to return to the OR. Why? Because I could. Because that’s who I was. Because I would have to learn to live in a different way, seeing death as an imposing itinerant visitor but knowing that even if I’m dying, until I actually die, I am still living.

In one of the most profound passages of the book, Lucy and Paul discuss whether to have a child. “Don’t you think saying goodbye to your child will make your death more painful?” she asks, and he responds, simply, “Wouldn’t it be great if it did?”

Kalanithi comes to believe that life is about striving, not about avoiding suffering.

Years ago, it had occurred to me that Darwin and Nietzsche agreed on one thing: the defining characteristic of the organism is striving. Describing life otherwise was like painting a tiger without stripes. After so many years of living with death, I’d come to understand that the easiest death wasn’t necessarily the best. We talked it over. Our families gave their blessing. We decided to have a child. We would carry on living instead of dying.

He leaves behind this impassioned message for his daughter, Cady, eight months old at the time of his death.

When you come to one of the many moments in life where you must give an account of yourself, provide a ledger of what you have been, and done, and meant to the world, do not, I pray, discount that you filled a dying man’s days with a sated joy, a joy unknown to me in all my prior years, a joy that does not hunger for more and more but rests, satisfied. In this time, right now, that is an enormous thing.

When Breath Becomes Air, paired with Being Mortal, will get you thinking about what matters in your life and about ‘what lies up ahead on the road’.

***

Two related Farnam Street Posts:

Tiny Beautiful Things. A famous advice columnist operates under a pen name allowing her to be intimate and frank — dispensing advice built on a foundation of deep personal experience.

Richard Feynman’s Love Letter to His Wife Sixteen Months After Her Death. The famous physicist understood more than physics; he also understood what it means to live a meaningful life.


James Cash Penney and the Golden Rule


It is then we must remember that all good days in human life come from the mastery of the days of trouble that are forever recurrent.
— J.C. Penney

Many are unaware that the department store J.C. Penney was originally the work of a man named, appropriately, James Cash Penney. Penney was raised in Missouri by a father who doubled as a preacher and a farmer. After a career full of turbulence, James became manager of a Golden Rule store in Evanston, Wyoming. The stores traded in dry goods — fabrics, clothing, and the like; staples that wouldn’t spoil on the shelf. After a few years of success, Penney was offered his own store in Kemmerer, Wyoming, and took his shot.

Penney turned the venture into a great success, and by 1913 had ownership of 34 Golden Rule stores, which were renamed J.C. Penney. He’d go on to expand it into a dominant national department store chain that continues on today, albeit in a less prosperous form.

In 1949, Penney published a very slim book called My Experience with the Golden Rule — it didn’t describe in detail his retail experience, but instead his thoughts on the Rule itself. The little volume has some beautiful passages worth sharing. He speaks with a bit of a Southern Baptist tone, but whether you are a spiritual person or not, the lessons hold.

On the Teachings of Life

He starts with a refrain which echoes our favorite from Joseph Tussman:

As I look back over the entire range of my life’s journey I would say that discipline has never let up. I seem to have moved from one contest to another—from one hard situation to another. As I try to read it all now it seems to me as if life has been trying to make me understand that a man has only to work with the universal law and purpose and they, in turn, will work for him. But if he decides to work, trusting wholly to his own judgment, ignoring all wiser leadership, he will get hurt.

On The Challenge of Ethical Principles

I want to show that we build lasting values for later life precisely as we are motivated in our youth. For what we do in the beginning of our careers capitalizes us all the way along. If it happens that one is challenged in youth by ethical principles and if one is led, or even compelled, to adopt them, he will begin to have high altitude experiences. In other words, it is well for every one of us to be forced by whatever circumstances to work righteously. This is the supreme benefit even if, in the doing it, it seems to “go against the grain.” We learn, slowly, that all accomplishment and advancement are gained only by a contest—a fight with the circumstances and conditions through which we pass.

And we gradually begin to see that great principles have it in them to make the going rough, hard and foot-wearying. To seek to do best for ourselves by doing right for all concerned is by no means an easy proposition. It is, in fact, infinitely hard. But at the same time it guarantees us safety and security as a dividend on the investment of the effort we make. It does not keep us from attaining material success, hard as the going may be. But it makes the way safe and the method effective.

[…]

There were a lot of cowboys in Kemmerer in my time. One of them put the fact I have in mind in better words than I can command. Speaking of a rodeo he said: “There is one thing about a bronco that is always true. With never an exception. He is full of unexpectedness.”

The point is this, any ethical principle that one may adopt is just as full of unexpectedness. I mean, in the challenge it puts up to one. The problem with the bronco is to get on and stay on.

On the Value of Having One’s Principles Tested

Penney proceeds to tell the story of his father forcing him to start earning his own money if he wants new clothes. Upon earning a small amount, he buys a pig and uses the earnings from the first pig to bankroll the purchase of many more, until he’s got about a dozen. It’s then that his father tells him he must get rid of the pigs due to complaints, even though James would have to take a loss on the sale.

It was a long time after the event that I was able to look back upon the experience as one might look back upon a mountain range where he has been tramping and see its skyline. Let it suffice for me to say that what I learned out of that business experience and the three parties concerned, namely, myself, my father, and the neighbors, proved to be a treasure worthy to possess.

My father knew that if I was compelled to clothe myself it would make me think and search and find ways of earning the money to do it. And furthermore he knew that I would learn this important fact:

We do not meet the demands of life with money. But with the imagination, forethought, plans and energy that earn the money.

Through life we learn many principles of business operation. But this one is of high rating among them all. Then, one thing further, my father took pains to make me understand:

–That I would not exercise my ingenuity to get money if by so doing I caused distress to other people.

–That any effort is worth only what can be gotten out of it by the action of a fair deal.

[…]

My father said to me one day: “We would resent it if a neighbor distressed and discomforted us in any way. Therefore, you see that a neighbor will resent distress and discomfort if we cause it. This means,” he said, “we must do to everyone as we wish to be done by.”

And so there emerged into my youthful experience the Golden Rule.

On Holding Fast to Ethics When You’re Riding High

Penney spends some time discussing his relationship with employees and his hope to develop them to their highest purpose. His basic idea is that “…to hire a man and literally leave him as is, is the beginning of a degree of human dissatisfaction that can go to any length.”

And then he returns to the Golden Rule:

So I come back again to the condition that the Golden Rule, if one adopts it, is a difficult master to serve. The ship’s captain will not throw the compass overboard because the wind blows fair and the day is sunny. For he knows, from the experiences of the ocean’s instability, that the danger days of storm are always “just ahead.” So the compass must always be handy and obedience to it must always be loyal. And so with the Golden Rule—the compass must be ever at hand through life’s journey. It will see us through trying times. And perhaps the most trying of all times comes when success is riding high and we may be tempted to “throw the compass overboard.” It is then we must remember that all good days in human life come from the mastery of the days of trouble that are forever recurrent.

On Advice to Young Men in Finding Their Purpose and Career

Penney closes with an admonition that the purpose of life is to find the calling to which you can devote your time and energy and feel fulfilled. As he makes clear, devoting your energy to something which you care about and feel fulfilled by is the highest purpose you can achieve:

To young men my advice is as simple and distinct as my own experience. It runs like this:

Take time to discover what you would prefer above all else to make your life work. You may have to do a lot of temporary jobs before you reach the one your ambition places above all others. But if your idea is clear and your determination firm, you will surely reach it.

Remember that it is often necessary in life to learn to work hard at many things before you arrive at the very great privilege of working hard at the one thing you prize the most.

Think persistently into great principles. That many have persisted for thousands of years simply because their truth is unassailable, applies to all of us in all situations and problems. Hence the great Proverbs, the Golden Rule, the Decalogue, the Sermon on the Mount, and, along with these, the testimony of men who have sought their way to the rare privilege of doing what they most wanted to do.

Remember that all this effort to reach your preference in life work and your further effort to perfect it within the scope of your ability is for just one purpose. And the purpose is to give service to the utmost of your ability.

Still Interested? You can find the book here, but it’s out of print and a bit pricey for such a short volume. Another way to learn more about J.C. Penney is a book about his life and career, Main Street Merchant. If you want more on ethics and wisdom, try some of our posts on Seneca.