Category: People

Aim For What’s Reasonable: Leadership Lessons From Director Jean Renoir

Directing a film involves getting an enormous group of people to work together on turning the image inside your head into a reality. In this 1970 interview, director Jean Renoir dispenses time-tested wisdom for leaders everywhere on humility, accountability, goal-setting, and more.

***

Many of us end up in leadership roles at some point in our careers. Most of us, however, never get any training or instruction on how to actually be a good leader. Yet whether our leadership is formal or informal, at some point we need to inspire and motivate people toward accomplishing a shared vision.

Directors are the leaders of movie productions. They assemble their team, they communicate their vision, and they manage the ups and downs of the filming process. Thus the experience of a successful director offers great insight into the qualities of a good leader. In 1970, film director Jean Renoir gave an interview to George Stevens Jr. of the American Film Institute in which he discussed the leadership aspects of directing. His insights illustrate some important lessons. Renoir started out making silent films, and he continued filmmaking through to the 1960s. His two greatest cinematic achievements were the films The Grand Illusion (1937) and The Rules of the Game (1939). He received a Lifetime Achievement Academy Award in 1975 for his contribution to the motion picture industry.

In the interview, Renoir speaks to humility in leadership when he says, “I’m a director who has spent his life suggesting stories that nobody wanted. It’s still going on. But I’m used to it and I’m not complaining, because the ideas which were forced on me were often better than my own ideas.”

Leadership is not necessarily about coming up with all the answers; it’s also about putting aside your own ego to cultivate and support the contributions of your team. Sometimes leaders have the best ideas. But often people on their team have excellent ones as well.

Renoir suggests that the role of a director is to have a clear enough vision that you can work through the imperfections involved in executing it. “A picture, often when it is good, is the result of some inner belief which is so strong that you have to show what you want, in spite of a stupid story or difficulties about the commercial side of the film.”

Good leaders don’t require perfection to achieve results. They work with what they have, often using creativity and ingenuity to fill in when reality doesn’t conform to the ideal image in their head. Having a vision is not about achieving exactly that vision. It’s about doing the best you can once you come into contact with reality.

When Renoir says, “We directors are simply midwives,” he implies that effective leadership is about giving shape to the talents and capabilities that already exist. Excellent leaders find a way to challenge and develop those on their team. In explaining how he works with actors, he says, “You must not ask an actor to do what he cannot do.” Rather, you need to work with what you have, using clear feedback and communication to find a way to bring out the best in people. Sometimes getting out of people’s way and letting their natural abilities come out is the most important thing to do.

Although Renoir says, “When I can, I shoot my scenes only once. I like to be committed, to be a slave to my decision,” he further explains, “I don’t like to make the important decisions alone.” Good leaders know when to consult others. They know to take in information from those who know more than they do and to respect different forms of expertise. But they still take accountability for their decisions because they made the final choice.

Good leaders are also mindful of the world outside the group or organization they are leading. They don’t lead in a vacuum but are sensitive to all those involved in achieving the results they are trying to deliver. For a director, it makes no sense to conceive of a film without considering the audience. Renoir explains, “I believe that the work of art where the spectator does not collaborate is not a work of art.” Similarly, we all have groups that we interact with outside of our organization, like clients or customers. We too need to run our teams with an understanding of that outside world.

No one can be good at everything, and thus effective leadership involves knowing when to ask for help. Renoir admits, “That’s where I like to have my friends help me, because I am very bad at casting.” Knowing your weaknesses is vital, because then you can find people who have strengths in those areas to assist you.

Additionally, most organizations are too complex for any one person to be an expert at all of the roles. Leaders show hubris when they assume they can do the jobs of everyone else well. Renoir explains this notion of knowing your role as a leader: “Too many directors work like this. They tell the actor, ‘Sit down, my dear friends, and look at me. I am going to act a scene, and you are going to repeat what I just did.’ He acts a scene and he acts it badly, because if he is a director instead of an actor, it’s probably because he’s a bad actor.”

***

Although leadership can be all-encompassing, we shouldn’t be intimidated by the ideal list of qualities and behaviors a good leader displays. Focus on how you can improve. Set goals. Reflect on your failures, and recognize your successes.

“You know, there is an old slogan, very popular in our occidental civilization: you must look to an end higher than normal, and that way you will achieve something. Your aim must be very, very high. Myself, I am absolutely convinced that it is mere stupidity. The aim must be easy to reach, and by reaching it, you achieve more.”

Job Interviews Don’t Work

Better hiring leads to better work environments, less turnover, and more innovation and productivity. When you understand the limitations and pitfalls of the job interview, you improve your chances of hiring the best possible person for your needs.

***

The job interview is a ritual just about every adult goes through at least once, and it’s a ubiquitous part of most hiring processes. The funny thing about interviews, however, is that they take up time and resources without actually helping to select the best people to hire. Instead, they promote a homogeneous workforce where everyone thinks the same.

If you have any doubt about how much you can get from an interview, think of what’s involved for the person being interviewed. We’ve all been there. The night before, you dig out your smartest outfit, iron it, and hope your hair lies flat for once. You frantically research the company, reading every last news article based on a formulaic press release, every blog post by the CEO, and every review by a disgruntled former employee.

After a sleepless night, you trek to their office, make awkward small talk, then answer a set of predictable questions. What’s your biggest weakness? Where do you see yourself in five years? Why do you want this job? Why are you leaving your current job? You reel off the answers you prepared the night before, highlighting the best of the best. All the while, you’re reminding yourself to sit up straight, not bite your nails, and keep smiling.

It’s not much better on the employer’s side of the table. When you have a role to fill, you select a list of promising candidates and invite them for an interview. Then you pull together a set of standard questions to riff off, doing a little improvising as you hear their responses. At the end of it all, you make some kind of gut judgment about the person who felt right—likely the one you connected with the most in the short time you were together.

Is it any surprise that job interviews don’t work when the whole process is based on subjective feelings? They are in no way the most effective means of deciding who to hire because they maximize the role of bias and minimize the role of evaluating competency.

What is a job interview?

“In most cases, the best strategy for a job interview is to be fairly honest, because the worst thing that can happen is that you won’t get the job and will spend the rest of your life foraging for food in the wilderness and seeking shelter underneath a tree or the awning of a bowling alley that has gone out of business.”

— Lemony Snicket, Horseradish

When we say “job interviews” throughout this post, we’re talking about the type of interview that has become standard in many industries and even in universities: free-form interviews in which candidates sit in a room with one or more people from a prospective employer (often people they might end up working with) and answer unstructured questions. Such interviews tend to focus on how a candidate behaves generally, emphasizing factors like whether they arrive on time or if they researched the company in advance. While questions may ostensibly be about predicting job performance, they tend to better select for traits like charisma rather than actual competence.

Unstructured interviews can make sense for certain roles. The ability to give a good first impression and be charming matters for a salesperson. But not all roles need charm, and just because you don’t want to hang out with someone after an interview doesn’t mean they won’t be an amazing software engineer. In a small startup with a handful of employees, someone being “one of the gang” might matter because close-knit friendships are a strong motivator when work is hard and pay is bad. But that group mentality may be less important in a larger company in need of diversity.

Considering the importance of hiring and how much harm getting it wrong can cause, it makes sense for companies to study and understand the most effective interview methods. Let’s take a look at why job interviews don’t work and what we can do instead.

Why job interviews are ineffective

Discrimination and bias

Information like someone’s age, gender, race, appearance, or social class shouldn’t dictate if they get a job or not—their competence should. But that’s unfortunately not always the case. Interviewers can end up picking the people they like the most, which often means those who are most similar to them. This ultimately means a narrower range of competencies is available to the organization.

In The Best Place to Work: The Art and Science of Creating an Extraordinary Workplace, psychologist Ron Friedman explains some of the unconscious biases that can impact hiring. We tend to rate attractive people as more competent, intelligent, and qualified. We consider tall people to be better leaders, particularly when evaluating men. We view people with deep voices as more trustworthy than those with higher voices.

Implicit bias is pernicious because it’s challenging to spot the ways it influences interviews. Once an interviewer judges someone, they may ask questions that nudge the interviewee towards fitting that perception. For instance, if they perceive someone to be less intelligent, they may ask basic questions that don’t allow the candidate to display their expertise. Having confirmed their bias, the interviewer has no reason to question it or even notice it in the future.

Hiring often comes down to how much an interviewer likes a candidate as a person. This means that we can be manipulated by manufactured charm. If someone’s charisma is faked for an interview, an organization can be left dealing with the fallout for ages.

The map is not the territory

The representation of something is not the thing itself. A job interview is meant to be a quick snapshot that tells a company how a candidate would perform at a job. However, it’s not a representative situation: it doesn’t replicate how the person will perform in the actual work environment.

For instance, people can lie during job interviews. Indeed, the situation practically encourages it. While most people feel uncomfortable telling outright lies (and know they would face serious consequences later on for a serious fabrication), bending the truth is common. Ron Friedman writes, “Research suggests that outright lying generates too much psychological discomfort for people to do it very often. More common during interviews are more nuanced forms of deception which include embellishment (in which we take credit for things we haven’t done), tailoring (in which we adapt our answers to fit the job requirements), and constructing (in which we piece together elements from different experiences to provide better answers.)” An interviewer can’t know if someone is deceiving them in any of these ways. So they can’t know if they’re hearing the truth.

One reason why we think job interviews are representative is the fundamental attribution error. This is a cognitive bias that leads us to believe that the way people behave in one situation carries over to how they will behave in other situations. We view people’s behaviors as the visible outcome of innate characteristics, and we undervalue the impact of circumstances.

Some employers report using one single detail they consider representative to make hiring decisions, such as whether a candidate sends a thank-you note after the interview or if their LinkedIn picture is a selfie. Sending a thank-you note shows manners and conscientiousness. Having a selfie on LinkedIn shows unprofessionalism. But is that really true? Can one thing carry across to every area of job performance? It’s worth debating.

Gut feelings aren’t accurate

We all like to think we can trust our intuition. The problem is that intuitive judgments tend to only work in areas where feedback is fast and cause and effect clear. Job interviews don’t fall into that category. Feedback is slow. The link between a hiring decision and a company’s success is unclear.

Overwhelmed by candidates and the pressure of choosing, interviewers may resort to making snap judgments based on limited information. And interviews introduce a lot of noise, which can dilute relevant information while leading to overconfidence. In a study entitled Belief in the Unstructured Interview: The Persistence of an Illusion, participants predicted the future GPA of a set of students. They received either biographical information about the students alone or biographical information plus an interview. In some cases, the interview responses were entirely random, meaning they shouldn’t have conveyed any genuinely useful information.

Before the participants made their predictions, the researchers informed them that the strongest predictor of a student’s future GPA is their past GPA. Seeing as all participants had access to past GPA information, they should have factored it heavily into their predictions.

In the end, participants who were able to interview the students made worse predictions than those who only had access to biographical information. Why? Because the interviews introduced too much noise. They distracted participants with irrelevant information, making them forget the most significant predictive factor: past GPA. Of course, we do not have clear metrics like GPA for jobs. But this study indicates that interviews do not automatically lead to better judgments about a person.

We tend to think human gut judgments are superior, even when evidence doesn’t support this. We are quick to discard information that should shape our judgments in favor of less robust intuitions that we latch onto because they feel good. The less challenging information is to process, the better it feels. And we tend to associate good feelings with ‘rightness’.

Experience ≠ expertise in interviewing

In 1979, the University of Texas Medical School at Houston suddenly had to increase its incoming class size by 50 students because of a legal change requiring larger classes. Without time to conduct more interviews, the school admitted candidates it had already interviewed but rejected as unsuitable for admission. Since those candidates had made it to the interview stage, they were among the best applicants; they just hadn’t previously been considered good enough to admit.

When researchers later studied the result of this unusual situation, they found that the students whom the school first rejected performed no better or worse academically than the ones they first accepted. In short, interviewing students did nothing to help select for the highest performers.

Studying the efficacy of interviews is complicated and hard to manage from an ethical standpoint. We can’t exactly give different people the same real-world job in the same conditions. We can take clues from fortuitous occurrences, like the University of Texas Medical School’s change in class size and the subsequent lessons learned. Without the legal change, the interviewers would never have known that the students they rejected were of equal competence to the ones they accepted. This is why building up experience in this arena is difficult. Even if someone has a lot of experience conducting interviews, it’s not straightforward to translate that into expertise. Expertise is about having a predictive model of something, not just knowing a lot about it.

Furthermore, the feedback from hiring decisions tends to be slow. An interviewer cannot know what would have happened had they hired an alternate candidate. If a new hire doesn’t work out, the blame tends to fall on the hire, not on the person who chose them. There are so many factors involved that the situation isn’t terribly conducive to learning from experience.

Making interviews more effective

It’s easy to see why job interviews are so common. People want to work with people they like, so interviews allow them to scope out possible future coworkers. Candidates expect interviews, as well—wouldn’t you feel a bit peeved if a company offered you a job without the requisite “casual chat” beforehand? Going through a grueling interview can make candidates more invested in the position and likely to accept an offer. And it can be hard to imagine viable alternatives to interviews.

But it is possible to make job interviews more effective or make them the final step in the hiring process after using other techniques to gauge a potential hire’s abilities. Doing what works should take priority over what looks right or what has always been done.

Structured interviews

While unstructured interviews don’t work, structured ones can be excellent. In Thinking, Fast and Slow, Daniel Kahneman describes how he redesigned the Israel Defense Forces’ interviewing process as a young psychology graduate. At the time, recruiting a new soldier involved a series of psychometric tests followed by an interview to assess their personality. Interviewers then based their decision on their intuitive sense of a candidate’s fitness for a particular role. It was very similar to the method of hiring most companies use today—and it proved to be useless.

Kahneman introduced a new interviewing style in which candidates answered a predefined series of questions that were intended to measure relevant personality traits for the role (for example, responsibility and sociability). He then asked interviewers to give candidates a score for how well they seemed to exhibit each trait based on their responses. Kahneman explained that “by focusing on standardized, factual questions I hoped to combat the halo effect, where favorable first impressions influence later judgments.” He tasked interviewers only with providing these numbers, not with making a final decision.

Although interviewers at first disliked Kahneman’s system, structured interviews proved far more effective and soon became the standard for the IDF. In general, they are often the most useful way to hire. The key is to decide in advance on a list of questions, specifically designed to test job-specific skills, then ask them to all the candidates. In a structured interview, everyone gets the same questions with the same wording, and the interviewer doesn’t improvise.
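To make the mechanics concrete, here is a minimal sketch in Python of how aggregated trait scoring might work. The traits, candidates, and 1-5 scale are hypothetical placeholders, not Kahneman’s actual rubric or the IDF’s; the point is simply that every interviewer rates every candidate on the same predefined traits, and the ranking falls out of the combined numbers rather than anyone’s overall impression.

```python
# A minimal sketch of structured-interview scoring (hypothetical traits,
# candidates, and 1-5 scale; not Kahneman's or the IDF's actual rubric).
# Each interviewer rates every candidate on the same predefined traits;
# the ranking comes from the aggregated numbers, not an overall impression.

from statistics import mean

TRAITS = ["responsibility", "sociability", "attention_to_detail"]

# scores[interviewer][candidate][trait] = rating on a 1-5 scale
scores = {
    "interviewer_a": {
        "candidate_1": {"responsibility": 4, "sociability": 3, "attention_to_detail": 5},
        "candidate_2": {"responsibility": 3, "sociability": 5, "attention_to_detail": 2},
    },
    "interviewer_b": {
        "candidate_1": {"responsibility": 5, "sociability": 3, "attention_to_detail": 4},
        "candidate_2": {"responsibility": 3, "sociability": 4, "attention_to_detail": 3},
    },
}

def aggregate(scores):
    """Average each candidate's trait ratings across all interviewers."""
    per_candidate = {}
    for ratings_by_candidate in scores.values():
        for candidate, ratings in ratings_by_candidate.items():
            per_candidate.setdefault(candidate, []).append(
                mean(ratings[trait] for trait in TRAITS)
            )
    return {c: round(mean(vals), 2) for c, vals in per_candidate.items()}

if __name__ == "__main__":
    # Highest aggregate score first; the decision rule is applied to numbers,
    # not to anyone's gut feeling about a candidate.
    for candidate, score in sorted(aggregate(scores).items(), key=lambda kv: -kv[1]):
        print(candidate, score)
```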

Tomas Chamorro-Premuzic writes in The Talent Delusion:

There are at least 15 different meta-analytic syntheses on the validity of job interviews published in academic research journals. These studies show that structured interviews are very useful to predict future job performance. . . . In comparison, unstructured interviews, which do not have a set of predefined rules for scoring or classifying answers and observations in a reliable and standardized manner, are considerably less accurate.

Why does it help if everyone hears the same questions? Because, as we learned previously, interviewers can make unconscious judgments about candidates, then ask questions intended to confirm their assumptions. Structured interviews help measure competency, not irrelevant factors. Ron Friedman explains this further:

It’s also worth having interviewers develop questions ahead of time so that: 1) each candidate receives the same questions, and 2) they are worded the same way. The more you do to standardize your interviews, providing the same experience to every candidate, the less influence you wield on their performance.

What, then, is an employer to do with the answers? Friedman says you must then create clear criteria for evaluating them.

Another step to help minimize your interviewing blind spots: include multiple interviewers and give them each specific criteria upon which to evaluate the candidate. Without a predefined framework for evaluating applicants—which may include relevant experience, communication skills, attention to detail—it’s hard for interviewers to know where to focus. And when this happens, fuzzy interpersonal factors hold greater weight, biasing assessments. Far better to channel interviewers’ attention in specific ways, so that the feedback they provide is precise.

Blind auditions

One way to make job interviews more effective is to find ways to “blind” the process—to disguise key information that may lead to biased judgments. Blinded interviews focus on skills alone, not who a candidate is as a person. Orchestras offer a remarkable case study in the benefits of blinding.

In the 1970s, orchestras had a gender bias problem. A mere 5% of their members were women, on average. Orchestras knew they were missing out on potential talent, but the audition process seemed to favor men over women. Those who were carrying out auditions couldn’t sidestep their unconscious tendency to favor men.

Instead of throwing up their hands in despair and letting this inequality stand, orchestras began carrying out blind auditions. During these, candidates would play their instruments behind a screen while a panel listened and assessed their performance. The panel received no identifying information about candidates. The idea was that orchestras would be able to hire without room for bias. It took a bit of tweaking to make it work: at first, panel members could discern a candidate’s gender from the sound of their shoes, so candidates were asked to audition without them.

The results? By 1997, up to 25% of orchestra members were women. Today, the figure is closer to 30%.

Although this approach is sometimes difficult to replicate for other types of work, blind auditions can inspire other industries to find ways to make interviews more about a person’s abilities than their identity.

Competency-related evaluations

What’s the best way to test if someone can do a particular job well? Get them to carry out tasks that are part of the job. See if they can do what they say they can do. It’s much harder for someone to lie and mislead an interviewer during actual work than during an interview. Using competency tests for a blinded interview process is also possible—interviewers could look at depersonalized test results to make unbiased judgments.

Tomas Chamorro-Premuzic writes in The Talent Delusion: Why Data, Not Intuition, Is the Key to Unlocking Human Potential, “The science of personnel selection is over a hundred years old yet decision-makers still tend to play it by ear or believe in tools that have little academic rigor. . . . An important reason why talent isn’t measured more scientifically is the belief that rigorous tests are difficult and time-consuming to administer, and that subjective evaluations seem to do the job ‘just fine.’”

Competency tests are already quite common in many fields. But interviewers tend not to accord them sufficient importance. They come after an interview, or they’re considered secondary to it. A bad interview can override a good competency test. At best, interviewers accord them equal importance to interviews. Yet they should consider them far more important.

Ron Friedman writes, “Extraneous data such as a candidate’s appearance or charisma lose their influence when you can see the way an applicant actually performs. It’s also a better predictor of their future contributions because unlike traditional in-person interviews, it evaluates job-relevant criteria. Including an assignment can help you better identify the true winners in your applicant pool while simultaneously making them more invested in the position.”

Conclusion

If a company relies on traditional job interviews as its sole or main means of choosing employees, it simply won’t get the best people. And getting hiring right is paramount to the success of any organization. A driven team of people passionate about what they do can trump one with better funding and resources. The key to finding those people is using hiring techniques that truly work.

Standing on the Shoulders of Giants

Innovation doesn’t occur in a vacuum. Doers and thinkers from Shakespeare to Jobs liberally “stole” inspiration from the doers and thinkers who came before. Here’s how to do it right.

***

“If I have seen further,” Isaac Newton wrote in a 1675 letter to fellow scientist Robert Hooke, “it is by standing on the shoulders of giants.”

It can be easy to look at great geniuses like Newton and imagine that their ideas and work came solely out of their minds, that they spun them from their own thoughts—that they were true originals. But that is rarely the case.

Innovative ideas have to come from somewhere. No matter how unique or unprecedented a work seems, dig a little deeper and you will always find that the creator stood on someone else’s shoulders. They mastered the best of what other people had already figured out, then made that expertise their own. With each iteration, they could see a little further, and they were content in the knowledge that future generations would, in turn, stand on their shoulders.

Standing on the shoulders of giants is a necessary part of creativity, innovation, and development. It doesn’t make what you do less valuable. Embrace it.

Everyone gets a lift up

Ironically, Newton’s turn of phrase wasn’t even entirely his own. The phrase can be traced back to the twelfth century, when the author John of Salisbury wrote that philosopher Bernard of Chartres compared people to dwarves perched on the shoulders of giants and said that “we see more and farther than our predecessors, not because we have keener vision or greater height, but because we are lifted up and borne aloft on their gigantic stature.”

Mary Shelley put it this way in the nineteenth century, in a preface for Frankenstein: “Invention, it must be humbly admitted, does not consist in creating out of void but out of chaos.”

There are giants in every field. Don’t be intimidated by them. They offer an exciting perspective. As the film director Jim Jarmusch advised, “Nothing is original. Steal from anywhere that resonates with inspiration or fuels your imagination. Devour old films, new films, music, books, paintings, photographs, poems, dreams, random conversations, architecture, bridges, street signs, trees, clouds, bodies of water, light, and shadows. Select only things to steal from that speak directly to your soul. If you do this, your work (and theft) will be authentic. Authenticity is invaluable; originality is non-existent. And don’t bother concealing your thievery—celebrate it if you feel like it. In any case, always remember what Jean-Luc Godard said: ‘It’s not where you take things from—it’s where you take them to.’”

That might sound demoralizing. Some might think, “My song, my book, my blog post, my startup, my app, my creation—surely they are original? Surely no one has done this before!” But that’s likely not the case. It’s also not a bad thing. Filmmaker Kirby Ferguson states in his TED Talk: “Admitting this to ourselves is not an embrace of mediocrity and derivativeness—it’s a liberation from our misconceptions, and it’s an incentive to not expect so much from ourselves and to simply begin.”

There lies the important fact. Standing on the shoulders of giants enables us to see further, not merely as far as before. When we build upon prior work, we often improve upon it and take humanity in new directions. However original your work seems to be, the influences are there—they might just be uncredited or not obvious. As we know from social proof, copying is a natural human tendency. It’s how we learn and figure out how to behave.

In Antifragile: Things That Gain from Disorder, Nassim Taleb describes the type of antifragile inventions and ideas that have lasted throughout history. He describes himself heading to a restaurant (the likes of which have been around for at least 2,500 years), in shoes similar to those worn at least 5,300 years ago, to use silverware designed by the Mesopotamians. During the evening, he drinks wine based on a 6,000-year-old recipe, from glasses invented 2,900 years ago, followed by cheese unchanged through the centuries. The dinner is prepared with one of our oldest tools, fire, and using utensils much like those the Romans developed.

Much about our societies and cultures has undeniably changed and continues to change at an ever-faster rate. But we continue to stand on the shoulders of those who came before in our everyday life, using their inventions and ideas, and sometimes building upon them.

Not invented here syndrome

When we discredit what came before or try to reinvent the wheel or refuse to learn from history, we hold ourselves back. After all, many of the best ideas are the oldest. “Not Invented Here Syndrome” is a term for situations when we avoid using ideas, products, or data created by someone else, preferring instead to develop our own (even if it is more expensive, time-consuming, and of lower quality).

The syndrome can also manifest as reluctance to outsource or delegate work. People might think their output is intrinsically better if they do it themselves, becoming overconfident in their own abilities. After all, who likes getting told what to do, even by someone who knows better? Who wouldn’t want to be known as the genius who (re)invented the wheel?

Developing a new solution for a problem is more exciting than using someone else’s ideas. But new solutions, in turn, create new problems. Some people joke that, for example, the largest Silicon Valley companies are in fact just impromptu incubators for people who will eventually set up their own business, firm in the belief that what they create themselves will be better.

The syndrome is also a case of the sunk cost fallacy. If a company has spent a lot of time and money getting a square wheel to work, they may be resistant to buying the round ones that someone else comes out with. The opportunity costs can be tremendous. Not Invented Here Syndrome detracts from an organization or individual’s core competency, and results in wasting time and talent on what are ultimately distractions. Better to use someone else’s idea and be a giant for someone else.

Why Steve Jobs stole his ideas

“Creativity is just connecting things. When you ask creative people how they did something, they feel a little guilty because they didn’t really do it. They just saw something. It seemed obvious to them after a while; that’s because they were able to connect experiences they’ve had and synthesize new things.” 

— Steve Jobs

In The Runaway Species: How Human Creativity Remakes the World, Anthony Brandt and David Eagleman trace the path that led to the creation of the iPhone and track down the giants upon whose shoulders Steve Jobs perched. We often hail Jobs as a revolutionary figure who changed how we use technology. Few who were around in 2007 could have failed to notice the buzz created by the release of the iPhone. It seemed so new, a total departure from anything that had come before. The truth is a little messier.

The first touchscreen came about almost half a century before the iPhone, developed by E.A. Johnson for air traffic control. Other engineers built upon his work and developed usable models, filing a patent in 1975. Around the same time, the University of Illinois was developing touchscreen terminals for students. Prior to touchscreens, light pens used similar technology. The first commercial touchscreen computer came out in 1983, soon followed by graphics boards, tablets, watches, and video game consoles. Casio released a touchscreen pocket computer in 1987 (remember, this is still a full twenty years before the iPhone).

However, early touchscreen devices were frustrating to use, with very limited functionality, often short battery lives, and minimal use cases for the average person. As touchscreen devices developed in complexity and usability, they laid down the groundwork for the iPhone.

Likewise, the iPod built upon the work of Kane Kramer, who took inspiration from the Sony Walkman. Kramer designed a small portable music player in the 1970s. The IXI, as he called it, looked similar to the iPod but arrived too early for a market to exist, and Kramer lacked the marketing skills to create one. When pitching to investors, Kramer described the potential for immediate delivery, digital inventory, taped live performances, back catalog availability, and the promotion of new artists and microtransactions. Sound familiar?

Steve Jobs stood on the shoulders of the many unseen engineers, students, and scientists who worked for decades to build the technology he drew upon. Although Apple has a long history of merciless lawsuits against those they consider to have stolen their ideas, many were not truly their own in the first place. Brandt and Eagleman conclude that “human creativity does not emerge from a vacuum. We draw on our experience and the raw materials around us to refashion the world. Knowing where we’ve been, and where we are, points the way to the next big industries.”

How Shakespeare got his ideas

“Nothing will come of nothing.”

— William Shakespeare, King Lear

Most, if not all, of Shakespeare’s plays draw heavily upon prior works—so much so that some question whether he would have survived today’s copyright laws.

Hamlet took inspiration from Gesta Danorum, a twelfth-century work on Danish history by Saxo Grammaticus, consisting of sixteen Latin books. Although it is doubtful whether Shakespeare had access to the original text, scholars find the parallels undeniable and believe he may have read another play based on it, from which he drew inspiration. In particular, the account of the plight of Prince Amleth (whose name contains the same letters as Hamlet) involves similar events.

Holinshed’s Chronicles, a co-authored account of British history from the late sixteenth century, tells stories that mimic the plot of Macbeth, including the three witches. Holinshed’s Chronicles itself was a mélange of earlier texts, which transferred their biases and fabrications to Shakespeare. It also likely inspired King Lear.

Parts of Antony and Cleopatra are copied verbatim from Plutarch’s Life of Mark Antony. Arthur Brooke’s 1562 poem The Tragicall Historye of Romeus and Juliet was an undisguised template for Romeo and Juliet. Once again, there are more giants behind the scenes—Brooke copied a 1559 poem by Pierre Boaistuau, who in turn drew from a 1554 story by Matteo Bandello, who in turn drew inspiration from a 1530 work by Luigi da Porto. The list continues, with Plutarch, Chaucer, and the Bible acting as inspirations for many major literary, theatrical, and cultural works.

Yet what Shakespeare did with the works he sometimes copied, sometimes learned from, is remarkable. Take a look at any of the original texts and, despite the mimicry, you will find that they cannot compare to his plays. Many of the originals were dry, unengaging, and lacking any sort of poetic language. J.J. Munro wrote in 1908 that The Tragicall Historye of Romeus and Juliet “meanders on like a listless stream in a strange and impossible land; Shakespeare’s sweeps on like a broad and rushing river, singing and foaming, flashing in sunlight and darkening in cloud, carrying all things irresistibly to where it plunges over the precipice into a waste of waters below.”

Despite bordering on plagiarism at times, he overhauled the stories with an exceptional use of the English language, bringing drama and emotion to dreary chronicles or poems. He had a keen sense for the changes required to restructure plots, creating suspense and intensity in their stories. Shakespeare saw far further than those who wrote before him, and with their help, he ushered in a new era of the English language.

Of course, it’s not just Newton, Jobs, and Shakespeare who found a (sometimes willing, sometimes not) shoulder to stand upon. Facebook is presumed to have built upon Friendster. Cormac McCarthy’s books often replicate older history texts, with one character coming straight from Samuel Chamberlain’s My Confession. John Lennon borrowed from diverse musicians, once writing in a letter to the New York Times that though the Beatles copied black musicians, “it wasn’t a rip off. It was a love in.”

In The Ecstasy of Influence, Jonathan Lethem points to many other instances of influences in classic works. In 1916, journalist Heinz von Lichberg published a story of a man who falls in love with his landlady’s daughter and begins a love affair, culminating in her death and his lasting loneliness. The title? Lolita. It’s hard to imagine that Nabokov never read it, but aside from the plot and the name, the original shares nothing of the style and language of Nabokov’s version.

The list continues. The point is not to be flippant about plagiarism but to cultivate sensitivity to the elements of value in a previous work, as well as the ability to build upon those elements. If we restrict the flow of ideas, everyone loses out.

The adjacent possible

What’s this about? Why can’t people come up with their own ideas? Why do so many people come up with a brilliant idea but never profit from it? The answer lies in what scientist Stuart Kauffman calls “the adjacent possible.” Quite simply, each new innovation or idea opens up the possibility of additional innovations and ideas. At any time, there are limits to what is possible, yet those limits are constantly expanding.

In Where Good Ideas Come From: The Natural History of Innovation, Steven Johnson compares this process to being in a house where opening a door creates new rooms. Each time we open the door to a new room, new doors appear and the house grows. Johnson compares it to the formation of life, beginning with basic fatty acids. The first fatty acids to form were not capable of turning into living creatures. When they self-organized into spheres, the groundwork formed for cell membranes, and a new door opened to genetic codes, chloroplasts, and mitochondria. When dinosaurs evolved a new bone that meant they had more manual dexterity, they opened a new door to flight. When our distant ancestors evolved opposable thumbs, dozens of new doors opened to the use of tools, writing, and warfare. According to Johnson, the history of innovation has been about exploring new wings of the adjacent possible and expanding what we are capable of.

A new idea—like those of Newton, Jobs, and Shakespeare—is only possible because a previous giant opened a new door and made their work possible. They in turn opened new doors and expanded the realm of possibility. Technology, art, and other advances are only possible if someone else has laid the groundwork; nothing comes from nothing. Shakespeare could write his plays because other people had developed the structures and language that formed his tools. Newton could advance science because of the preliminary discoveries that others had made. Jobs built Apple out of the debris of many prior devices and technological advances.

The questions we all have to ask ourselves are these: What new doors can I open, based on the work of the giants that came before me? What opportunities can I spot that they couldn’t? Where can I take the adjacent possible? If you think all the good ideas have already been found, you are very wrong. Other people’s good ideas open new possibilities, rather than restricting them.

As time passes, the giants just keep getting taller and more willing to let us hop onto their shoulders. Their expertise is out there in books and blog posts, open-source software and TED talks, podcast interviews, and academic papers. Whatever we are trying to do, we have the option to find a suitable giant and see what can be learned from them. In the process, knowledge compounds, and everyone gets to see further as we open new doors to the adjacent possible.

What You Truly Value

Our devotion to our values gets tested in the face of a true crisis. But it’s also an opportunity to reconnect, recommit, and sometimes, bake some bread.

***

The recent outbreak of the coronavirus is impacting people all over the world — not just in terms of physical health, but financially, emotionally, and even socially. As we struggle to adapt to our new circumstances, it can be tempting to bury our heads and wait for it all to blow over so we can just get back to normal. Or we can see this as an incredible opportunity to figure out who we are.

What many of us are discovering right now is that the things we valued a few months ago don’t actually matter: our cars, the titles on our business cards, our privileged neighborhoods. What is coming to the forefront instead is figuring out what we find intrinsically rewarding.

When everything is easy, it can seem like you have life figured out. It’s another matter when things change and you’re called to put what you believe into practice. It’s one thing to say you are stoic when your coffee spills and another entirely when you’re watching your community collapse. When life changes and gets hard, you realize you’ve never had to put into practice what you thought you knew about coping with disaster.

But when a crisis hits, everything is put to the real test.

The challenge then becomes bringing our values into our struggles, because what we value only has meaning if it still matters when life is hard. To prove their worth, your values need to help you move forward when you can barely crawl and the obstacles in your way seem insurmountable.

In the face of a crisis, what is important to us becomes evident when we give ourselves the space to reflect on what is going to get us through the hard times. And so we find renewed commitment to get back to core priorities. What seemed important before falls apart to reveal what really matters: family, love, community, health.

“I was 32 when I started cooking; up until then, I just ate.” 

— Julia Child

One unexpected activity that many people are turning to now that they have time and are more introspective is baking. In fact, this week Google searches for bread recipes hit a noticeable high.

Baking is a very physical experience: kneading dough, tasting batter, smelling the results of the ingredients coming together. It’s an activity that requires patience. Bread has to rise. Pies have to cook. Cakes have to cool before they can be covered with icing. And, as prescriptive as baking seems on its surface, it’s something that facilitates creativity as we improvise our ingredients based on what we have in the cupboard. We discover new flavors, and we comfort ourselves and others with the results. Baked goods are often something we share, and in doing so we are providing for those we care about.

Why might baking be useful in times of stress? In Overcoming Anxiety, Dennis Tirch explains, “research has demonstrated that when people engage more fully in behaviors that give them a sense of pleasure and mastery, they can begin to overcome negative emotions.”

At home with their loved ones, people can reconsider what they value, one muffin at a time. Creating with the people we love instead of consuming on our own allows us to focus on what we value as the world changes around us. With more time, slow, seemingly unproductive pursuits have new appeal because they help us reorient to the qualities in life that matter most.

Giving yourself the space to tune in to your values doesn’t have to come through baking. What’s important is that you find an activity that lets you move past fear and panic, to reconnect with what gives your life meaning. When you engage with an activity that gives you pleasure and releases negative emotions, it allows you to rediscover what is important to you.

Change is stressful. But neither stress nor change has to be scary. If you think about it, you undergo moments of change every day because nothing in life is ever static. Our lives are a constant adaptation to a world that is always in motion.

All change brings opportunity. Some change gives us the opportunity to pause and ask what we can do better. How can we better connect to what has proven to be important? Connection is not an abstract intellectual exercise, but an experience that orients us to the values that provide us direction. If you look for opportunities in line with your values, you will be able to see a path through the fear and uncertainty guided by the light that is hope.

Seduced by Logic: Émilie du Châtelet and the Struggles to Create the Newtonian Revolution

Against great odds, Émilie du Châtelet (1706–1749) taught herself mathematics and became a world authority on Newtonian mathematical physics.

I say against great odds because being a woman at the time meant she was ineligible for the same formal and informal opportunities available to others. Seduced by Logic, by Robyn Arianrhod, tells her story with captivating color.

Émilie and her lover and collaborator Voltaire realized that Newton’s Principia changed not only our view of the world but also the way we do science.

“Newton,” writes Arianrhod, “had created a method for constructing and then testing theories, so the Principia provided the first truly modern blueprint for theoretical science as both a predictive, quantitative discipline—Newton eschewed qualitative, unproven, metaphysical speculations—and a secular discipline, separate from religion, although by no means inherently opposed to it.”

This, of course, has impacted the way we live and see ourselves. While Newton is relatively well known today, his theories were not easily accepted at the time. Émilie was one of the first to realize his impact and promote his thinking. In the late 1740s, she created what is still, to this day, the authoritative French translation of Newton’s masterpiece, complete with detailed commentary. Voltaire considered du Châtelet “a genius worthy of Horace and Newton.”

Émilie du Châtelet didn’t limit herself to commenting on Newton. The reason her translation still stands today is that she added a lot of original thought.

***

How did Émilie du Châtelet come to learn so much in a world that overtly limited her opportunities? This is where her character shines.

While her brothers were sent to the most prestigious Jesuit secondary schools, Émilie was left to fend for herself and acquired much of her knowledge through reading. And while her brothers could attend university, “such a thing was unthinkable for a girl.”

Luckily her family environment was conducive to self-education. Émilie’s parents “were rather unorthodox in the intellectual freedom they allowed in their children: both parents allowed Émilie to argue with them and express opinions, and from the time they were about ten years old, the children had permission to browse freely through the library.”


***

Émilie entered an arranged marriage at eighteen with the thirty-year-old Florent-Claude, marquis du Châtelet and count of Lomont. Less than a year later she gave birth to their first child, Gabrielle-Pauline, who was followed seventeen months later by their son, Florent-Louis. Another child, a boy, arrived six years later, only to die within two years. His death caused her to remark, in her grief, that the ‘sentiments of nature must exist in us without us suspecting.’

“Sometime around 1732, she experienced a true intellectual epiphany,” Arianrhod writes. As a result, Émilie would come to see herself as a ‘thinking creature.’

“At first, she only caught a glimpse of this new possibility, and she continued to allow her time to be wasted by superficial society life and its dissipation, ‘which was all I had felt myself born for.’ Fortunately, her ongoing friendship with these ‘people who think’—including another mathematically inclined woman, Marie de Thil, who would remain her lifelong friend—led Émilie to the liberating realisation that it was not too late to begin cultivating her mind seriously.”

It would be a difficult journey. “I feel,” Émilie wrote, “all the weight of the prejudice that universally excludes [women] from the sciences. It is one of the contradictions of this world that has always astonished me, that there are great countries whose destiny the law permits us to rule, and yet there is no place where we are taught to think.”

To become a person who thinks, she became a person who reads.

“Presumably,” Arianrhod writes, “she studied Descartes, Newton, and the great English philosopher of liberty, John Locke, because when she met Voltaire a year after her epiphany, he was immediately captivated by her mind as well as her other charms.”

In an early love letter, Voltaire would write to her: “Ah! What happiness to see you, to hear you … and what pleasures I taste in your arms! I am so fortunate to love the one I admire … you are the idol of my heart, you make all my happiness.”

“When Émilie and Voltaire began their courtship in 1733,” Arianrhod writes, “she was twenty-six, and he was thirty-eight (the same age as her husband, with whom Voltaire would eventually become good friends, thanks to Émilie’s encouragement and her efforts as a diplomatic go-between).”

***

Arianrhod writes of Émilie’s struggles to learn:

Émilie’s plan to become a mathematician would require all her courage and determination. Firstly, envious acquaintances like Madame du Deffand would try to cast her as a dry and ugly ‘learned woman’ or femme savante, despite the fact that she had such appeal and charisma that the handsome duc de Richelieu, one of the most sought-after men in Paris, was rumoured to have once been her lover, while the celebrated Voltaire adored her. Of course, some of her female contemporaries admired her scholarship: Madame de Graffigny would later say, ‘Our sex ought to erect altars to her!’ But many were irritated by, or envious of, her liberated commitment to an intellectual life, because Émilie was very different from the glamorous women who ran many of Paris’s legendary literary salons. It was acceptable, even admirable, for such women to know enough of languages and philosophy to be good conversationalists with the learned men who dominated salon gatherings, but it was expected that women be modest about their knowledge. By contrast, Émilie would become famous as a scholar in her own right, thus angering the likes of Madame du Deffand, a powerful salonnière who claimed Émilie’s interest in science was all for show.

There were few truly learned women of the time, the belief being they were “either pretentious or ugly,” something that lingered “for the next three centuries.”

If you’re going to blaze the trail, you really have to blaze it.

At thirty-five, [Pierre-Louis Moreau de] Maupertuis was both ambitious and charming. When he agreed to tutor Émilie, he probably expected her to be a dilettante like his other female students: he had quite a following among society ladies. But her first known letter to him, written in January 1734, is both deferential and eager: ‘I spent all yesterday evening working on your lessons. I would like to make myself worthy of them. I fear, I confess to you, losing the good opinion you have of me.’ Perhaps he still doubted her commitment, because a week or two later she wrote, ‘I spent the evening with binomials and trinomials, [but] I am no longer able to study if you do not give me a task, and I have an extreme desire for one.’ Over the next few months, she sent him a stream of notes, trying to arrange lessons, asking him to come to her house for a couple of hours, or offering to meet him outside the Academy of Sciences – women were allowed inside only for the twice-yearly public lectures – or outside Gradot’s, one of the favourite cafés of the intellectual set.

[…]

It was this kind of intensity – as expressed in this multitude of requests for rendezvous – that fuelled gossip among her peers, and jealousy from Voltaire. Until the late twentieth century, most historians, too, seemed unable to imagine a woman like Émilie could be seduced only by mathematics – after all, until then, few women had actually become mathematicians. But it is true that many of Émilie’s letters to Maupertuis have a very flirtatious style – it was, after all, an era that revelled in the game of seduction. There is no evidence to prove whether or not they ever became lovers in those early months, before she and Voltaire had fully committed themselves to each other, but her letters certainly prove that all her life she would continue to hold a deep affection and respect for Maupertuis. In late April 1734, Émilie wrote to Maupertuis: ‘I hope I will render myself less unworthy of your lessons by telling you that it is not for myself that I want to become a mathematician, but because I am ashamed of making such mediocre progress under such a master as you.’ It was, indeed, an era of flattery! (Voltaire was quite adept at it – as a mere bourgeois, he often needed to flatter important people to help advance his literary career.) Although this letter suggests Émilie was simply using flattery to extract more lessons from her mathematical ‘master’, she always did have genuine doubts about her ability, which is not surprising given her lack of formal education and the assumed intellectual inferiority of her gender. She would later write, ‘If I were king … I would reform an abuse which cuts back, as it were, half of humanity. I would have women participate in all human rights, and above all those of the mind.’

***

In a translator’s preface from the late 1730s, collected in Selected Philosophical and Scientific Writings, du Châtelet highlights a few of the traits that helped her overcome so much.

You must know what you want:

[Knowledge] can never be acquired unless one has chosen a goal for one’s studies. One must conduct oneself as in everyday life; one must know what one wants to be. In the latter endeavors irresolution produces false steps, and in the life of the mind confused ideas.

She considered herself a member of the ordinary class, and she wrote about how regular people can come to acquire talent.

It sometimes happens that work and study force genius to declare itself, like the fruits that art produces in a soil where nature did not intend it, but these efforts of art are nearly as rare as natural genius itself. The vast majority of thinking men — the others, the geniuses, are in a class of their own — need to search within themselves for their talent. They know the difficulties of each art, and the mistakes of those who engage in each one, but they lack the courage that is not disheartened by such reflections, and the superiority that would enable them to overcome such difficulties. Mediocrity is, even among the elect, the lot of the greatest number.

Seduced by Logic is worth reading in its entirety. Du Châtelet’s story is as fascinating as it is informative.

Using Multidisciplinary Thinking to Approach Problems in a Complex World

Complex outcomes in human systems are a tough nut to crack when it comes to deciding what’s really true. Any phenomenon we might try to explain will have a host of competing theories, many of them seemingly plausible.

So how do we know what to go with?

One idea is to take a cue from the best. One of the most successful “explainers” of human behavior has been the cognitive psychologist Steven Pinker. His books have been massively influential, in part because they combine scientific rigor, explanatory power, and plainly excellent writing.

What’s unique about Pinker is the range of sources he draws on. His book The Better Angels of Our Nature, a cogitation on the decline in relative violence in recent human history, draws on ideas from evolutionary psychology, forensic anthropology, statistics, social history, criminology, and a host of other fields. Pinker, like Vaclav Smil and Jared Diamond, is the opposite of the man with a hammer, ranging over much material to come to his conclusions.

In fact, when asked about the progress of social science as an explanatory arena over time, Pinker credited this cross-disciplinary focus:

Because of the unification with the sciences, there are more genuinely explanatory theories, and there’s a sense of progress, with more non-obvious things being discovered that have profound implications.

But, even better, Pinker offers an outline of how a multidisciplinary thinker should approach problems in a complex world.

***

Here’s the issue at stake: When we’re viewing a complex phenomenon, say, the decline in certain forms of violence in human history, it can be hard to come up with a rigorous explanation. We can’t just set up repeated lab experiments and vary the conditions of human history to see what pops out, as with physics or chemistry.

So out of necessity, we must approach the problem in a different way.

In the interview referenced above, Pinker gives a wonderful example of how to do it. Note how he carefully “cross-checks” against a variety of sources of data, developing a 3D view of the landscape he’s trying to assess:

Pinker: Absolutely, I think most philosophers of science would say that all scientific generalizations are probabilistic rather than logically certain, more so for the social sciences because the systems you are studying are more complex than, say, molecules, and because there are fewer opportunities to intervene experimentally and to control every variable. But the existence of the social sciences, including psychology, to the extent that they have discovered anything, shows that, despite the uncontrollability of human behavior, you can make some progress: you can do your best to control the nuisance variables that are not literally in your control; you can have analogues in a laboratory that simulate what you’re interested in and impose an experimental manipulation.

You can be clever about squeezing the last drop of causal information out of a correlational data set, and you can use converging evidence, the qualitative narratives of traditional history in combination with quantitative data sets and regression analyses that try to find patterns in them. But I also go to traditional historical narratives, partly as a sanity check. If you’re just manipulating numbers, you never know whether you’ve wandered into some preposterous conclusion by taking numbers too seriously that couldn’t possibly reflect reality. Also, it’s the narrative history that provides hypotheses that can then be tested. Very often a historian comes up with some plausible causal story, and that gives the social scientists something to do in squeezing a story out of the numbers.

Warburton: I wonder if you’ve got an example of just that, where you’ve combined the history and the social science?

Pinker: One example is the hypothesis that the Humanitarian Revolution during the Enlightenment, that is, the abolition of slavery, torture, cruel punishments, religious persecution, and so on, was a product of an expansion of empathy, which in turn was fueled by literacy and the consumption of novels and journalistic accounts. People read what life was like in other times and places, and then applied their sense of empathy more broadly, which gave them second thoughts about whether it’s a good idea to disembowel someone as a form of criminal punishment. So that’s a historical hypothesis. Lynn Hunt, a historian at the University of California–Berkeley, proposed it, and there are some psychological studies that show that, indeed, if people read a first-person account by someone unlike them, they will become more sympathetic to that individual, and also to the category of people that that individual represents.

So now we have a bit of experimental psychology supporting the historical qualitative narrative. And, in addition, one can go to economic historians and see that, indeed, there was first a massive increase in the economic efficiency of manufacturing a book, then there was a massive increase in the number of books published, and finally there was a massive increase in the rate of literacy. So you’ve got a story that has at least three vertices: the historian’s hypothesis; the economic historians identifying exogenous variables that changed prior to the phenomenon we’re trying to explain, so the putative cause occurs before the putative effect; and then you have the experimental manipulation in a laboratory, showing that the intervening link is indeed plausible.

Pinker is saying: Look, we can’t just rely on “plausible narratives” generated by folks like the historians. There are too many competing possibilities that could be correct.

Nor can we rely purely on correlations (i.e., the rise in literacy statistically tracking the decline in violence); correlations don’t necessarily offer us a causal explanation. (Does the rise in literacy cause less violence, or is it vice versa? Or does a third factor cause both?)
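To make that third-factor worry concrete, here is a minimal sketch of my own (not from Pinker or the interview), using Python with NumPy and entirely made-up variables: a hidden “development” factor drives both a literacy measure and a non-violence measure, so the two correlate strongly even though neither causes the other. Once we control for the hidden factor, the apparent relationship largely disappears.

```python
# Illustrative simulation only: hypothetical data in which a hidden confounder
# produces a strong literacy/non-violence correlation with no direct causal link.
import numpy as np

rng = np.random.default_rng(0)
n = 5_000

# Hypothetical confounder, e.g. some broad index of "societal development".
development = rng.normal(size=n)

# Both observed variables depend on the confounder, plus independent noise.
literacy = 0.8 * development + rng.normal(scale=0.6, size=n)
nonviolence = 0.8 * development + rng.normal(scale=0.6, size=n)

# The raw correlation looks impressive even though literacy has no direct effect here.
print("corr(literacy, nonviolence):", round(np.corrcoef(literacy, nonviolence)[0, 1], 2))

# "Controlling" for the confounder: regress each variable on development,
# then correlate the residuals. The apparent relationship mostly vanishes.
def residuals(y, x):
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

r_lit = residuals(literacy, development)
r_nv = residuals(nonviolence, development)
print("partial corr, given development:", round(np.corrcoef(r_lit, r_nv)[0, 1], 2))
```

With real historical data, of course, we usually can’t measure or randomize the confounder so cleanly, which is exactly why Pinker leans on experimental psychology and the timing evidence from economic historians to pin down the causal direction.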

However, if we layer in other known facts from areas we can experiment on, say, psychology or cognitive neuroscience, we can sometimes establish the causal link we need or, at worst, arrive at a better hypothesis about reality.

In this case, it is the finding from psychology that reading first-person accounts by people unlike ourselves does indeed increase empathy.

Does this method give us absolute proof? No. However, it does allow us to propose and then test, re-test, alter, and strengthen or ultimately reject a hypothesis. (In other words, rigorous thinking.)

We can’t stop here, though. We have to take time to examine competing hypotheses; there may be a better fit. The interviewer continues, asking Pinker about this part of the methodology:

Warburton: And so you conclude that the de-centering that occurs through novel-reading and first-person accounts probably did have a causal impact on the willingness of people to be violent to their peers?

Pinker: That’s right. And, of course, one has to rule out alternative hypotheses. One of them could be the growth of affluence: perhaps it’s simply a question of how pleasant your life is. If you live a longer and healthier and more enjoyable life, maybe you place a higher value on life in general, and, by extension, the lives of others. That would be an alternative hypothesis to the idea that there was an expansion of empathy fueled by greater literacy. But that can be ruled out by data from economic historians that show there was little increase in affluence during the time of the Humanitarian Revolution. The increase in affluence really came later, in the 19th century, with the advent of the Industrial Revolution.

***

Let’s review the process that Pinker has laid out, one that we might think about emulating as we examine the causes of complex phenomena in human systems:

  1. We observe an interesting phenomenon in need of explanation, one we feel capable of exploring.
  2. We propose and examine competing hypotheses that would explain the phenomenon (set up in a falsifiable way, in harmony with the divide between science and pseudoscience laid out for us by the great Karl Popper).
  3. We examine a cross-section of: empirical data relating to the phenomenon; sensible qualitative inference from multiple fields and disciplines (the more fundamental, the better); and, finally, “demonstrable” aspects of nature we are nearly certain about, arising from controlled experiment or other rigorous sources of knowledge ranging from engineering to biology to cognitive neuroscience.

What we end up with is not necessarily a bulletproof explanation, but it is probably the best we can do if we think carefully. A good cross-disciplinary examination, with quantitative and qualitative sources coming into equal play and a good dose of judgment, can be far more rigorous than the gut-instinct or plausible-sounding-nonsense stories that many of us lazily spout.

A Word of Caution

Although Pinker’s “multiple vertices” approach to problem solving in complex domains can be powerful, we always have to be on guard for phenomena that we simply cannot explain at our current level of competence: We must have a “too hard” pile when competing explanations come out “too close to call” or we otherwise feel we’re outside of our circle of competence. Always tread carefully and be sure to follow Darwin’s Golden Rule: Contrary facts are more important than confirming ones. Be ready to change your mind, like Darwin, when the facts don’t go your way.

***

Still Interested? For more Pinker goodness, check out our prior posts on his work, or pick up a few of his books, like How the Mind Works or The Blank Slate: The Modern Denial of Human Nature.