
The Art of Being Alone

Loneliness has more to do with our perceptions than how much company we have. It’s just as possible to be painfully lonely surrounded by people as it is to be content with little social contact. Some people need extended periods of time alone to recharge; others would rather give themselves electric shocks than spend a few minutes with their thoughts. Here’s how we can change our perceptions by making and experiencing art.

***

At a time when many people are facing unprecedented amounts of time alone, it’s a good idea for us to pause and consider what it takes to turn difficult loneliness into enriching solitude. We are social creatures, and a sustained lack of satisfying relationships carries heavy costs for our mental and physical health. But when we are forced to spend more time alone than we might wish, there are ways we can compensate and find a fruitful sense of connection and fulfillment. One way to achieve this is by using our loneliness as a springboard for creativity.

“Loneliness, longing, does not mean one has failed but simply that one is alive.”

— Olivia Laing

Loneliness as connection

One way people have always coped with loneliness is through creativity. By transmuting their experience into something beautiful, isolated individuals throughout history have managed to replace the sense of community they might otherwise have found in relationships with their creative output.

In The Lonely City: Adventures in the Art of Being Alone, Olivia Laing tells the stories of a number of artists who led isolated lives and found meaning in their work even if their relationships couldn’t fulfill them. While she focuses specifically on visual artists in New York over the last seventy years, their methods of using their loneliness and translating it into their art carry wide resonance. These particular artists tapped into sentiments many of us will experience at least once in our lives. They found beauty in loneliness and showed it to be something worth considering, not just something to run from.

The artist Edward Hopper (1882–1967) is known for his paintings of American cityscapes inhabited by closed-off figures who seem to embody a vision of modern loneliness. Laing found herself drawn to his signature images of uneasy individuals in sparse surroundings, often separated from the viewer by a window or some other barrier.

Why, then, do we persist in ascribing loneliness to his work? The obvious answer is that his paintings tend to be populated by people alone, or in uneasy, uncommunicative groupings of twos and threes, fastened into poses that seem indicative of distress. But there’s something else too; something about the way he contrives his city streets . . . This viewpoint is often described as voyeuristic, but what Hopper’s urban scenes also replicate is one of the central experiences of being lonely: the way a feeling of separation, of being walled off or penned in, combines with a sense of near unbearable exposure.

While Hopper intermittently denied that his paintings were about loneliness, he certainly experienced the sense of being walled off in a city. In 1910 he moved to Manhattan, after a few years spent mostly in Europe, and found himself struggling to get by. Not only were his paintings not selling, he also felt alienated by the city. Hopper worked on commissions and had few close relationships. Only in his forties did he marry, well past the window of acceptability for the time. Laing writes of his early time in New York:

This sense of separation, of being alone in a big city, soon began to surface in his art . . . He was determined to articulate the day-to-day experience of inhabiting the modern, electric city of New York. Working first with etchings and then in paint, Hopper began to produce a distinctive body of images that captured the cramped, sometimes alluring experience of urban living.

Hopper roamed the city at night, sketching scenes that caught his eye. This perspective meant that the viewer of his paintings finds themselves most often in the position of an observer detached from the scene in front of them. If loneliness can feel like being separated from the world, the windows Hopper painted are perhaps a physical manifestation of this.

By Laing’s description, Hopper transformed the isolation he may have experienced by depicting the experience of loneliness as a place in itself, inhabited by the many people sharing it despite their differences. She elaborates: “They aren’t sentimental, his pictures, but there is an extraordinary attentiveness to them. As if what he saw was as interesting as he kept insisting he needed it to be: worth the labor, the miserable effort of setting it down. As if loneliness was something worth looking at. More than that, as if looking itself was an antidote, a way to defeat loneliness’ strange, estranging spell.”

Hopper’s work shows us that one way to make friends with loneliness is to create work that explores and examines it. This not only offers a way to connect with those enduring the same experience but also turns isolation into creative material and robs it of some of its sting.

Loneliness as inspiration

A second figure Laing considers is Andy Warhol (1928–1987). Born Andrew Warhola, the artist has become an icon, his work widely known, someone whose fame renders him hard to relate to. When she began exploring his body of work, Laing found that “one of the interesting things about his work, once you stop to look, is the way the real, vulnerable human self remains stubbornly visible, exerting its own submerged pressure, its own mute appeal to the viewer.”

In particular, much of Warhol’s work pertains to the loneliness he felt throughout his life, no matter how surrounded he was by glittering friends and admirers.

Throughout Warhol’s oeuvre, we see his efforts to turn his own sense of being on the outside into art. A persistent theme in his work was speech. He made thousands of tapes of conversations, often using them as the basis for other works of art. For instance, Warhol’s book, a, A Novel, consists of transcribed tapes from between 1965 and 1967. The tape recorder was such an important part of his life, both a way of connecting with people and keeping them at a distance, that he referred to it as his wife. By listening to others and documenting the oddities of their speech, Warhol coped with feeling he couldn’t be heard. Laing writes, “he retained a typically perverse fondness for language errors. He was fascinated by empty or deformed language, by chatter and trash, by glitches and botches in conversation.” In his work, all speech mattered regardless of its content.

Warhol himself often struggled with speech, mumbling in interviews and being embarrassed by his heavy Pittsburgh accent, which rendered him easily misunderstood in school. Speech was just one factor that left him isolated at times. At age seven, Warhol was confined to his bed by illness for several months. He withdrew from his peers, focusing on making art with his mother, and never quite integrated into school again. After graduating from Carnegie Mellon University in 1949, Warhol moved to New York and sought his footing in the art world. Despite his rapid rise to success and fame, he remained held back by an unshakeable belief in his own inferiority and exclusion from existing social circles. Laing writes of how he filled that gap with machines:

Becoming a machine also meant having relationships with machines, using physical devices as a way of filling the uncomfortable, sometimes unbearable space between self and world. Warhol could not have achieved his blankness, his enviable detachment, without the use of these charismatic substitutes for intimacy and love.

Later in the book, Laing visits the Warhol museum to see his Time Capsules, 610 cardboard boxes filled with objects collected over the course of thirteen years: “postcards, letters, newspapers, magazines, photographs, invoices, slices of pizza, a piece of chocolate cake, even a mummified human foot.” He added objects until each box was full, then transferred them to a storage unit. Some objects have obvious value, while others seem like trash. There is no particular discernible order to the collection, yet Laing saw in the Time Capsules much the same impulse reflected in Warhol’s tape recordings:

What were the Capsules, really? Trash cans, coffins, vitrines, safes; ways of keeping the loved together, ways of never having to admit to loss or feel the pain of loneliness . . . What is left after the essence has departed? Rind and skin, things you want to throw away but can’t.

The loneliness Warhol felt when he created works like the Time Capsules was more a psychological one than a practical one. He was no longer alone, but his early experiences of feeling like an outsider, and the things he felt set him apart from others, like his speech, marred his ability to connect. Loneliness, for Warhol, was perhaps more a part of his personality than something he could overcome through relationships. Even so, he was able to turn it into fodder for the groundbreaking art we remember him for. Warhol’s art communicated what he struggled to say outright. It was also a way of him listening to and seeing other people—by photographing friends, taping them sleeping, or recording their conversations—when he perhaps felt he couldn’t be heard or seen.

Where creativity takes us

Towards the end of the book, Laing writes:

There are so many things that art can’t do. It can’t bring the dead back to life, it can’t mend arguments between friends, or cure AIDS, or halt the pace of climate change. All the same, it does have some extraordinary functions, some odd negotiating ability between people, including people who have never met and yet who infiltrate and enrich each other’s lives. It does have a capacity to create intimacy; it does have a way of healing wounds, and better yet of making it apparent that not all wounds need healing and not all scars are ugly.

When we face loneliness in our lives, it is not always possible or even appropriate to deal with it by rushing to fill our lives with people. Sometimes we do not have that option; sometimes we’re not in the right space to connect deeply; sometimes we first just need to work through that feeling. One way we can embrace our loneliness is by turning to the art of others who have inhabited that same lonely city, drawing solace and inspiration from their creations. We can also use their work as inspiration for our own creative pursuits, which can help us work through difficult, and lonely, times.

Standing on the Shoulders of Giants

Innovation doesn’t occur in a vacuum. Doers and thinkers from Shakespeare to Jobs liberally “stole” inspiration from the doers and thinkers who came before. Here’s how to do it right.

***

“If I have seen further,” Isaac Newton wrote in a 1675 letter to fellow scientist Robert Hooke, “it is by standing on the shoulders of giants.”

It can be easy to look at great geniuses like Newton and imagine that their ideas and work came solely out of their minds, that they spun them from their own thoughts—that they were true originals. But that is rarely the case.

Innovative ideas have to come from somewhere. No matter how unique or unprecedented a work seems, dig a little deeper and you will always find that the creator stood on someone else’s shoulders. They mastered the best of what other people had already figured out, then made that expertise their own. With each iteration, they could see a little further, and they were content in the knowledge that future generations would, in turn, stand on their shoulders.

Standing on the shoulders of giants is a necessary part of creativity, innovation, and development. It doesn’t make what you do less valuable. Embrace it.

Everyone gets a lift up

Ironically, Newton’s turn of phrase wasn’t even entirely his own. The phrase can be traced back to the twelfth century, when the author John of Salisbury wrote that philosopher Bernard of Chartres compared people to dwarves perched on the shoulders of giants and said that “we see more and farther than our predecessors, not because we have keener vision or greater height, but because we are lifted up and borne aloft on their gigantic stature.”

Mary Shelley put it this way in the nineteenth century, in a preface for Frankenstein: “Invention, it must be humbly admitted, does not consist in creating out of void but out of chaos.”

There are giants in every field. Don’t be intimidated by them. They offer an exciting perspective. As the film director Jim Jarmusch advised, “Nothing is original. Steal from anywhere that resonates with inspiration or fuels your imagination. Devour old films, new films, music, books, paintings, photographs, poems, dreams, random conversations, architecture, bridges, street signs, trees, clouds, bodies of water, light, and shadows. Select only things to steal from that speak directly to your soul. If you do this, your work (and theft) will be authentic. Authenticity is invaluable; originality is non-existent. And don’t bother concealing your thievery—celebrate it if you feel like it. In any case, always remember what Jean-Luc Godard said: ‘It’s not where you take things from—it’s where you take them to.’”

That might sound demoralizing. Some might think, “My song, my book, my blog post, my startup, my app, my creation—surely they are original? Surely no one has done this before!” But that’s likely not the case. It’s also not a bad thing. Filmmaker Kirby Ferguson states in his TED Talk: “Admitting this to ourselves is not an embrace of mediocrity and derivativeness—it’s a liberation from our misconceptions, and it’s an incentive to not expect so much from ourselves and to simply begin.”

Therein lies the important fact. Standing on the shoulders of giants enables us to see further, not merely as far as before. When we build upon prior work, we often improve upon it and take humanity in new directions. However original your work seems to be, the influences are there—they might just be uncredited or not obvious. As we know from social proof, copying is a natural human tendency. It’s how we learn and figure out how to behave.

In Antifragile: Things That Gain from Disorder, Nassim Taleb describes the type of antifragile inventions and ideas that have lasted throughout history. He describes himself heading to a restaurant (the likes of which have been around for at least 2,500 years), in shoes similar to those worn at least 5,300 years ago, to use silverware designed by the Mesopotamians. During the evening, he drinks wine based on a 6,000-year-old recipe, from glasses invented 2,900 years ago, followed by cheese unchanged through the centuries. The dinner is prepared with one of our oldest tools, fire, and using utensils much like those the Romans developed.

Much about our societies and cultures has undeniably changed and continues to change at an ever-faster rate. But we continue to stand on the shoulders of those who came before in our everyday life, using their inventions and ideas, and sometimes building upon them.

Not invented here syndrome

When we discredit what came before or try to reinvent the wheel or refuse to learn from history, we hold ourselves back. After all, many of the best ideas are the oldest. “Not Invented Here Syndrome” is a term for situations when we avoid using ideas, products, or data created by someone else, preferring instead to develop our own (even if it is more expensive, time-consuming, and of lower quality.)

The syndrome can also manifest as reluctance to outsource or delegate work. People might think their output is intrinsically better if they do it themselves, becoming overconfident in their own abilities. After all, who likes getting told what to do, even by someone who knows better? Who wouldn’t want to be known as the genius who (re)invented the wheel?

Developing a new solution for a problem is more exciting than using someone else’s ideas. But new solutions, in turn, create new problems. Some people joke that, for example, the largest Silicon Valley companies are in fact just impromptu incubators for people who will eventually set up their own business, firm in the belief that what they create themselves will be better.

The syndrome is also a case of the sunk cost fallacy. If a company has spent a lot of time and money getting a square wheel to work, they may be resistant to buying the round ones that someone else comes out with. The opportunity costs can be tremendous. Not Invented Here Syndrome detracts from an organization or individual’s core competency, and results in wasting time and talent on what are ultimately distractions. Better to use someone else’s idea and be a giant for someone else.

Why Steve Jobs stole his ideas

“Creativity is just connecting things. When you ask creative people how they did something, they feel a little guilty because they didn’t really do it. They just saw something. It seemed obvious to them after a while; that’s because they were able to connect experiences they’ve had and synthesize new things.” 

— Steve Jobs

In The Runaway Species: How Human Creativity Remakes the World, Anthony Brandt and David Eagleman trace the path that led to the creation of the iPhone and track down the giants upon whose shoulders Steve Jobs perched. We often hail Jobs as a revolutionary figure who changed how we use technology. Few who were around in 2007 could have failed to notice the buzz created by the release of the iPhone. It seemed so new, a total departure from anything that had come before. The truth is a little messier.

The first touchscreen came about almost half a century before the iPhone, developed by E.A. Johnson for air traffic control. Other engineers built upon his work and developed usable models, filing a patent in 1975. Around the same time, the University of Illinois was developing touchscreen terminals for students. Prior to touchscreens, light pens used similar technology. The first commercial touchscreen computer came out in 1983, soon followed by graphics boards, tablets, watches, and video game consoles. Casio released a touchscreen pocket computer in 1987 (remember, this is still a full twenty years before the iPhone.)

However, early touchscreen devices were frustrating to use, with very limited functionality, often short battery lives, and minimal use cases for the average person. As touchscreen devices developed in complexity and usability, they laid down the groundwork for the iPhone.

Likewise, the iPod built upon the work of Kane Kramer, who took inspiration from the Sony Walkman. Kramer designed a small portable music player in the 1970s. The IXI, as he called it, looked similar to the iPod but arrived too early for a market to exist, and Kramer lacked the marketing skills to create one. When pitching to investors, Kramer described the potential for immediate delivery, digital inventory, taped live performances, back catalog availability, and the promotion of new artists and microtransactions. Sound familiar?

Steve Jobs stood on the shoulders of the many unseen engineers, students, and scientists who worked for decades to build the technology he drew upon. Although Apple has a long history of merciless lawsuits against those they consider to have stolen their ideas, many were not truly their own in the first place. Brandt and Eagleman conclude that “human creativity does not emerge from a vacuum. We draw on our experience and the raw materials around us to refashion the world. Knowing where we’ve been, and where we are, points the way to the next big industries.”

How Shakespeare got his ideas

“Nothing will come of nothing.”

— William Shakespeare, King Lear

Most, if not all, of Shakespeare’s plays draw heavily upon prior works—so much so that some question whether he would have survived today’s copyright laws.

Hamlet took inspiration from Gesta Danorum, a twelfth-century work on Danish history by Saxo Grammaticus, consisting of sixteen Latin books. Although it is doubtful whether Shakespeare had access to the original text, scholars find the parallels undeniable and believe he may have read another play based on it, from which he drew inspiration. In particular, the account of the plight of Prince Amleth (whose name contains the same letters as Hamlet) involves similar events.

Holinshed’s Chronicles, a co-authored account of British history from the late sixteenth century, tells stories that mimic the plot of Macbeth, including the three witches. Holinshed’s Chronicles itself was a mélange of earlier texts, which transferred their biases and fabrications to Shakespeare. It also likely inspired King Lear.

Parts of Antony and Cleopatra are copied verbatim from Plutarch’s Life of Mark Antony. Arthur Brooke’s 1562 poem The Tragicall Historye of Romeus and Juliet was an undisguised template for Romeo and Juliet. Once again, there are more giants behind the scenes—Brooke copied a 1559 poem by Pierre Boaistuau, who in turn drew from a 1554 story by Matteo Bandello, who in turn drew inspiration from a 1530 work by Luigi da Porto. The list continues, with Plutarch, Chaucer, and the Bible acting as inspirations for many major literary, theatrical, and cultural works.

Yet what Shakespeare did with the works he sometimes copied, sometimes learned from, is remarkable. Take a look at any of the original texts and, despite the mimicry, you will find that they cannot compare to his plays. Many of the originals were dry, unengaging, and lacking any sort of poetic language. J.J. Munro wrote in 1908 that The Tragicall Historye of Romeus and Juliet “meanders on like a listless stream in a strange and impossible land; Shakespeare’s sweeps on like a broad and rushing river, singing and foaming, flashing in sunlight and darkening in cloud, carrying all things irresistibly to where it plunges over the precipice into a waste of waters below.”

Despite bordering on plagiarism at times, he overhauled the stories with an exceptional use of the English language, bringing drama and emotion to dreary chronicles or poems. He had a keen sense for the changes required to restructure plots, creating suspense and intensity in their stories. Shakespeare saw far further than those who wrote before him, and with their help, he ushered in a new era of the English language.

Of course, it’s not just Newton, Jobs, and Shakespeare who found a (sometimes willing, sometimes not) shoulder to stand upon. Facebook is presumed to have built upon Friendster. Cormac McCarthy’s books often replicate older history texts, with one character coming straight from Samuel Chamberlain’s My Confession. John Lennon borrowed from diverse musicians, once writing in a letter to the New York Times that though the Beatles copied black musicians, “it wasn’t a rip off. It was a love in.”

In The Ecstasy of Influence, Jonathan Lethem points to many other instances of influence in classic works. In 1916, journalist Heinz von Lichberg published a story of a man who falls in love with his landlady’s daughter and begins a love affair, culminating in her death and his lasting loneliness. The title? Lolita. It’s hard to imagine that Nabokov hadn’t read it, yet aside from the plot and the name, the style of language in his version is nowhere to be found in the original.

The list continues. The point is not to be flippant about plagiarism but to cultivate sensitivity to the elements of value in a previous work, as well as the ability to build upon those elements. If we restrict the flow of ideas, everyone loses out.

The adjacent possible

What’s this about? Why can’t people come up with their own ideas? Why do so many people come up with a brilliant idea but never profit from it? The answer lies in what scientist Stuart Kauffman calls “the adjacent possible.” Quite simply, each new innovation or idea opens up the possibility of additional innovations and ideas. At any time, there are limits to what is possible, yet those limits are constantly expanding.

In Where Good Ideas Come From: The Natural History of Innovation, Steven Johnson compares this process to being in a house where opening a door creates new rooms. Each time we open the door to a new room, new doors appear and the house grows. Johnson compares it to the formation of life, beginning with basic fatty acids. The first fatty acids to form were not capable of turning into living creatures. When they self-organized into spheres, the groundwork formed for cell membranes, and a new door opened to genetic codes, chloroplasts, and mitochondria. When dinosaurs evolved a new bone that meant they had more manual dexterity, they opened a new door to flight. When our distant ancestors evolved opposable thumbs, dozens of new doors opened to the use of tools, writing, and warfare. According to Johnson, the history of innovation has been about exploring new wings of the adjacent possible and expanding what we are capable of.

A new idea—like those of Newton, Jobs, and Shakespeare—is only possible because a previous giant opened a new door and made their work possible. They in turn opened new doors and expanded the realm of possibility. Technology, art, and other advances are only possible if someone else has laid the groundwork; nothing comes from nothing. Shakespeare could write his plays because other people had developed the structures and language that formed his tools. Newton could advance science because of the preliminary discoveries that others had made. Jobs built Apple out of the debris of many prior devices and technological advances.

The questions we all have to ask ourselves are these: What new doors can I open, based on the work of the giants that came before me? What opportunities can I spot that they couldn’t? Where can I take the adjacent possible? If you think all the good ideas have already been found, you are very wrong. Other people’s good ideas open new possibilities, rather than restricting them.

As time passes, the giants just keep getting taller and more willing to let us hop onto their shoulders. Their expertise is out there in books and blog posts, open-source software and TED talks, podcast interviews, and academic papers. Whatever we are trying to do, we have the option to find a suitable giant and see what can be learned from them. In the process, knowledge compounds, and everyone gets to see further as we open new doors to the adjacent possible.

Embrace the Mess: The Upside of Disorder

“We often succumb to the temptation of a tidy-minded approach when we would be better served by embracing a degree of mess.”
— Tim Harford

***

The breadth and depth of products and services that promise to help us stay organized is almost overwhelming. Indeed, it would seem that to be messy is almost universally shunned, considered a sign of not being “put together,” while being tidy and neat is venerated to the nth degree.

Tim Harford has a different take. In his book Messy: The Power of Disorder to Transform Our Lives, he flips this notion around, showing us that there are situations in which disorder is beneficial, or at the very least that order has been oversold. (Tim previously introduced us to another counterintuitive thought with Adapt.)

***

One of the reasons why we put so much time and effort into being organized and tidy is because we make assumptions about what this will do for our productivity. If all our papers are neatly filed and email is neatly sorted, it will be easier to retrieve anything that’s important, right? Maybe not.

Harford cites a paper by Steve Whittaker and researchers at IBM called “Am I Wasting My Time Organizing Email?” to illustrate the fallacy.

Whittaker and his colleagues got permission to install logging software on the computers of several hundred office workers, and tracked around 85,000 attempts to find e-mail by clicking through folders, or by using ad hoc methods—scrolling through the inbox, clicking on a header to sort by (for example) the sender, or using the search function. Whittaker found that clicking through a folder tree took almost a minute, while simply searching took just 17 seconds. People who relied on folders took longer to find what they were looking for, but their hunts for the right e-mail were no more or less successful. In other words, if you just dump all your e-mail into a folder called “archive,” you will find your e-mails more quickly than if you hide them in a tidy structure of folders.

Okay, so taking the time to organize your email may not be as useful as we thought. Computers, after all, are designed as tools to help us work better and faster, so it makes sense that the simple search function would outperform us. But physical filing and keeping our workspace neat make us more productive, right?

Once again, maybe not.

Quite a bit of research has been done on people’s working environments, and it would seem that those with big piles of paper and/or clutter on their desks may be just as effective as (and sometimes more effective than) those pedantic ‘filers.’

This is not to argue that a big pile of paper is the best possible filing system. But despite appearances, it’s very far from being a random assortment. A messy desk isn’t nearly as chaotic as it at first seems. There’s a natural tendency toward a very pragmatic system of organization based simply on the fact that the useful stuff keeps on getting picked up and left on the top of the pile.

David Kirsh, a cognitive scientist at the University of California, San Diego, studies the differences between the working habits of the tidy types (he calls them ‘neats’) and the messy types (he calls them ‘scruffies’). Let’s look at what he found.

…how do people orient themselves after arriving at the office or finishing a phone call? Kirsh finds that “neats” orient themselves with to-do lists and calendars, while “scruffies” orient themselves using physical cues—the report that they were working on is lying on the desk, as is a letter that needs a reply, and receipts that must be submitted for expenses. A messy desk is full of such cues. A tidy desk conveys no information at all, and it must be bolstered with the prompt of a to-do list. Both systems can work, so we should hesitate before judging other people based on their messy desks.

So if both systems work, are there times when it’s actually more advantageous to embrace messiness?

Here Harford hits upon an interesting hypothesis: Messiness may enhance certain types of creativity. In fact, creativity itself may systematically benefit from a certain amount of disorder.

When things are too neat and tidy, it’s easy for boredom to set in and creativity to suffer. We feel stifled.

A messy environment offers disruptions that seem to act as a catalyst for new ideas and creations. If you think about it, we try to avoid these same disruptions when we focus on being more “organized.” But, if you sometimes embrace a little mess, you may be opening yourself up to more creative serendipity:

Messy disruptions will be most powerful when combined with creative skill. The disruption puts an artist, scientist, or engineer in unpromising territory—a deep valley rather than a familiar hilltop. But then expertise kicks in and finds ways to move upward again: the climb finishes at a new peak, perhaps lower than the old one, but perhaps unexpectedly higher.

Think about an “inefficiently” designed office plan that looks wasteful on the surface: What’s lost in efficiency (say, putting two departments that need to talk to each other in separated areas) can be more than made up for in serendipitous encounters.

Brian Eno, considered one of the most influential and innovative figures in music over the last five decades, describes it like this:

“The enemy of creative work is boredom, actually,” he says. “And the friend is alertness. Now I think what makes you alert is to be faced with a situation that is beyond your control so you have to be watching it very carefully to see how it unfolds, to be able to stay on top of it. That kind of alertness is exciting.”

Eno created an amazing system for pushing people into ‘alertness.’ He came up with something he called “Oblique Strategies” cards. He would show up at the recording studio with a handful of cards and bring them out whenever it seemed that the group needed a nudge.

Each had a different instruction, often a gnomic one. Whenever the studio sessions were running aground, Eno would draw a card at random and relay its strange orders.

Be the first not to do what has never not been done before
Emphasize the flaws
Only a part, not the whole
Twist the spine
Look at the order in which you do things
Change instrument roles

Can you imagine asking the guitarist of a group to sit behind the drums on a track? These are the types of suggestions that Eno is famous for, and they seem to be serving him well; at age sixty-eight he has a new album coming out in January of 2017, and some variation of his cards has been available for purchase since first appearing for public consumption in 1975.

Not all of us will be able to embrace a card from Eno’s deck. Some people do well in tidy environments/situations and some do well in messy ones — it’s probably contingent on what you’re trying to achieve. (We wouldn’t go so far as recommending a CEO be disorganized.)

Reading through the book, it would seem that the key is, like most things, to give it a try. A little “intentional messiness” could go a long way towards helping you climb out of a rut. And, if you are the tidy type through and through, it’s important not to try to force that on others — you just might be taking away a good thing.

If you like the ideas in Messy, check out Harford’s other book Adapt: Why Success Always Starts With Failure, or check out another important book on things that gain from disorder, Antifragile.

Why Great Writers Write

“Books are never finished. They are merely abandoned.”
— Oscar Wilde

***

Why do great writers write?

The question will never be answered, only explored — in the context of culture, time, and ourselves. This is exactly what Meredith Maran probes in Why We Write: 20 Acclaimed Authors on How and Why They Do What They Do. The powerful range of answers reveals quite a bit about the nature of drive, passion, and the creative process.

Maran starts by introducing us to what authors have historically said on the subject. One of the most well-known is from George Orwell, who eloquently answered the question in 1946 in an essay called Why I Write:

From a very early age, perhaps the age of five or six, I knew that when I grew up I should be a writer. Between the ages of about seventeen and twenty-four I tried to abandon this idea, but I did so with the consciousness that I was outraging my true nature and that sooner or later I should have to settle down and write books.

Throughout the book, we encounter this idea that writing is so intrinsically a part of most writers that they couldn’t imagine being anything else. David Baldacci, the writer behind Absolute Power, claims, “If writing were illegal, I’d be in prison. I can’t not write. It’s a compulsion.”

Can you claim that about your own work?

Baldacci notes that even before he was a full-time author, he was a storyteller in law, his other profession.

Some of the best fiction I ever came up with was as a lawyer.

You know who wins in court? The client whose lawyer tells better stories than the other lawyer does. When you’re making a legal case, you can’t change the facts. You can only rearrange them to make a story that better enhances your client’s position, emphasizing certain things, deemphasizing others. You make sure the facts that you want people to believe are the most compelling ones. The facts that hurt your case are the ones you either explain away or hide away. That’s telling a story.

Baldacci describes the intrinsic fears he feels as an author, where every new project starts with a blank slate and infinite possibilities. (Something Steven Pressfield explores in The War of Art.)

Every time I start a project, I sit down scared to death that I won’t be able to bring the magic again.

You’d never want to be on the operating table with a right-handed surgeon who says, ‘Today I’m going to try operating with my left hand.’ But writing is like that. The way you get better is by pushing yourself to do things differently each time. As a writer you’re not constrained by mechanical devices or technology or anything else. You get to play. Which is terrifying.


It’s interesting to note that most of the authors in Maran’s book, despite being successful, had a huge fear of not being able to produce again. We assume that success brings a certain amount of confidence, but it also brings expectations. Nothing is scarier to a musician than a sophomore album, or to an author than starting the next book.

One way to solve the problem is to attempt to be ridiculously prolific, like Isaac Asimov. But even prolific authors seem to suffer from trepidation. Sue Grafton, the author of A is for Alibi, has written a mystery novel for every letter of the alphabet up to X, and this is what she had to say:

Most days when I sit down at my computer, I’m scared half out of my mind. I’m always convinced that my last book was my last book, that my career is at an end, that I’ll never be able to pull off another, that my success was a fleeting illusion, and my hopes for the future are already dead. Dang! All this drama and it’s not even nine a.m.

For many this fear seems ubiquitous, a shadow that follows them through the process. This is evidenced in Isabel Allende’s description of her writing methodology. (Allende wrote The House of the Spirits.)

I start all my books on January eighth. Can you imagine January seventh? It’s hell.

Every year on January seventh, I prepare my physical space. I clean up everything from my other books. I just leave my dictionaries, and my first editions, and the research materials for the new one. And then on January eighth I walk seventeen steps from the kitchen to the little pool house that is my office. It’s like a journey to another world. It’s winter, it’s raining usually. I go with my umbrella and the dog following me. From those seventeen steps on, I am in another world and I am another person.

I go there scared. And excited. And disappointed – because I have a sort of idea that isn’t really an idea. The first two, three, four weeks are wasted. I just show up in front of the computer. Show up, show up, show up, and after a while the muse shows up, too. If she doesn’t show up invited, eventually she just shows up.

Experience has shown these seasoned authors that this fear is to be embraced: If the muse isn’t here right now, she will come eventually, because she always does. That time between then and now is an experience that can be used in their writing.

So why write? Why punish yourself by putting yourself through such a difficult process? Mary Karr (author of The Liars’ Club) had an almost poetic answer.

I write to dream; to connect with other human beings; to record; to clarify; to visit the dead. I have a kind of primitive need to leave a mark on the world. Also, I have a need for money.

I’m almost always anxious when I’m writing. There are those great moments when you forget where you are, when you get your hands on the keys, and you don’t feel anything because you’re somewhere else. But that very rarely happens. Mostly I’m pounding my hands on the corpse’s chest.

With all that said, Mr. Orwell might have summed it up best. In his essay, he listed what he believed to be the four great motives for writing:

Sheer egoism. To be talked about, to be remembered after death, to get your own back on grown-ups in childhood, etc.

Aesthetic enthusiasm. To take pleasure in the impact of one sound on another, in the firmness of good prose or the rhythm of a good story.

Historical impulse. The desire to see things as they are, to find out true facts and store them up for the use of posterity.

Political purposes. The opinion that art should have nothing to do with politics is itself a political attitude.

While every author seemed to have a slightly different motive for writing, they all appear compelled to tell us stories, a burning desire to get something out and share it with the world. Joan Didion once said, “I write entirely to find out what I’m thinking…”

Sometimes we can’t learn without writing. Sometimes we can’t make sense of our feelings unless we talk about them, and for writers that conversation happens in their books.

Whether you are an avid reader or a writer, Why We Write is an insightful work that allows you the chance to visit the minds of some of the most successful authors of our time. Complement this with Maran’s other book Why We Write About Ourselves: Twenty Memoirists On Why They Expose Themselves (and others) in the Name of Literature.

Also check out our post on the extremely prolific writer Isaac Asimov and maybe improve your professional writing by taking the advice of Steven Pinker.

Hares, Tortoises, and the Trouble with Genius

“Geniuses are dangerous.”
— James March

How many organizations would deny that they want more creativity, more genius, and more divergent thinking among their constituents? The great genius leaders of the world are fawned over breathlessly and a great amount of lip service is given to innovation; given the choice between “mediocrity” and “innovation,” we all choose innovation hands-down.

So why do we act the opposite way?

Stanford’s James March might have some insight. His book On Leadership (see our earlier notes here) is a collection of insights derived mostly from the study of great literature, from Don Quixote to Saint Joan to War & Peace. In March’s estimation, we can learn more about human nature (of which leadership is merely a subset) from studying literature than we can from studying leadership literature.

March discusses the nature of divergent thinking and “genius” in a way that seems to reflect reality. We don’t seek to cultivate genius, especially in a mature organization, because we’re more afraid of the risks than appreciative of the benefits. A classic case of loss aversion. Tolerating genius means tolerating a certain amount of disruption; the upside of genius sounds pretty good until we start understanding its dark side:

Most original ideas are bad ones. Those that are good, moreover, are only seen as such after a long learning period; they rarely are impressive when first tried out. As a result, an organization is likely to discourage both experimentation with deviant ideas and the people who come up with them, thereby depriving itself, in the name of efficient operation, of its main source of innovation.

[…]

Geniuses are dangerous. Othello’s instinctive action makes him commit an appalling crime, the fine sentiments of Pierre Bezukhov bring little comfort to the Russian peasants, and Don Quixote treats innocent people badly over and over again. A genius combines the characteristics that produce resounding failures (stubbornness, lack of discipline, ignorance), a few ingredients of success (elements of intelligence, a capacity to put mistakes behind him or her, unquenchable motivation), and exceptional good luck. Genius therefore only appears as a sub-product of a great tolerance for heresy and apparent craziness, which is often the result of particular circumstances (over-abundant resources, managerial ideology, promotional systems) rather than deliberate intention. “Intelligent” organizations will therefore try to create an environment that allows genius to flourish by accepting the risks of inefficiency or crushing failures…within the limits of the risks that they can afford to take.

Note the important component: exceptional good luck. The kind of genius that rarely surfaces but we desperately pursue needs great luck to make an impact. Truthfully, genius is always recognized in hindsight, with the benefit of positive results in mind. We “cherry-pick” the good results of divergent thinkers, but forget that we use the results to decide who’s a genius and who isn’t. Thus, tolerating divergent, genius-level thinking requires an ability to tolerate failure, loss, and change if it’s to be applied prospectively.

Sounds easy enough, in theory. But as Daniel Kahneman and Charlie Munger have so brilliantly pointed out, we become very risk averse when we possess anything, including success; we feel loss more acutely than gain, and we seek to keep the status quo intact. (And it’s probably good that we do, on average.)

Compounding the problem, when we do recognize and promote genius, some of our exalting is likely to be based on false confidence, almost by definition:

Individuals who are frequently promoted because they have been successful will have confidence in their own abilities to beat the odds. Since in a selective, and therefore increasingly homogenous, management group the differences in performance that are observed are likely to be more often due to chance events than to any particular individual capacity, the confidence is likely to be misplaced. Thus, the process of selecting on performance results in exaggerated self-confidence and exaggerated risk-taking.

Let’s use a current example: Elon Musk. Elon is (justifiably) recognized as a modern genius, leaping tall buildings in a single bound. Yet as Ashlee Vance makes clear in his biography, Musk teetered on the brink several times. It’s a near miracle that his businesses have survived (and thrived) to where they are today. The press would read much differently if SpaceX or Tesla had gone under — he might be considered a brilliant but fatally flawed eccentric rather than a genius. Luck played a fair part in that outcome (which is not to take away from Musk’s incredible work).

***

Getting back to organizations, the failure to appropriately tolerate genius is also a problem of homeostasis: The tendency of systems to “stay in place” and avoid disruption of strongly formed past habits. Would an Elon Musk be able to rise in a homeostatic organization? It generally does not happen.

James March has a solution, though, and it’s one we’ve heard echoed by other thinkers like Nassim Taleb and seems to be used fairly well in some modern technology organizations. As with most organizational solutions, it requires realigning incentives, which is the job of a strong and selfless leader.

An analogy of the hare and the tortoise illustrates the solution:

Although one particular hare (who runs fast but sleeps too long) has every chance of being beaten by one particular tortoise, an army of hares in competition with an army of tortoises will almost certainly result in one of the hares crossing the finish line first. The choices of an organization therefore depend on the respective importance that it attaches to its mean performance (in which case it should recruit tortoises) and the achievement of a few dazzling successes (an army of hares, which is inefficient as a whole, but contains some outstanding individuals).

[…]

In a simple model, a tortoise advances with a constant speed of 1 mile/hour while a hare runs at 5 miles/hour, but in each given 5-minute period a hare has a 90 percent chance of sleeping rather than running. A tortoise will cover the mile of the test in one hour exactly and a hare will have only about an 11 percent chance of arriving faster (the probability that he will be awake for at least three of the 5-minute periods.) If there is a race between the tortoise and one hare, the probability that the hare will win is only 0.11. However, if there are 100 tortoises and 100 hares in the race, the probability that at least one hare will arrive before any tortoise (and thus the race will be won by a hare) is 1– ((0.89)^100), or greater than 0.9999.
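
As a quick sanity check on March’s arithmetic, here is a minimal Python sketch of the same model (the twelve five-minute periods, the 0.1 chance of a hare being awake in any period, and the threshold of three awake periods all come straight from the quoted passage; the code is illustrative only):

```python
import math

# A hare beats the tortoise only if it is awake for at least 3 of the
# 12 five-minute periods in the hour (it needs 12 minutes of running).
def p_single_hare_wins(periods=12, p_awake=0.1, needed=3):
    # Binomial probability of at least `needed` awake periods
    return sum(
        math.comb(periods, k) * p_awake**k * (1 - p_awake) ** (periods - k)
        for k in range(needed, periods + 1)
    )

p_one = p_single_hare_wins()           # ~0.11, as March states
p_any_of_100 = 1 - (1 - p_one) ** 100  # ~1 - 0.89^100, effectively certain

print(f"P(one hare beats one tortoise): {p_one:.3f}")
print(f"P(at least one of 100 hares wins): {p_any_of_100:.6f}")
```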

The analogy holds up well in the business world. Any one young, aggressive “hare” is unlikely to beat the lumbering “tortoise” that reigns king, but put 100 hares out against 100 tortoises and the result is much different.

This means that any organization must conduct itself in such a way that hares have a chance to succeed internally. It means becoming open to divergence and allowing erratic genius to rise, while keeping the costs of failure manageable. It means having the courage to create an “army of hares” inside of your own organization rather than letting tortoises have their way, as they will if given the opportunity.

For a small young organization, the cost of failure isn’t all that high, comparatively speaking — you can’t fall far off a pancake. So hares tend to get a lot more leash. But for a large organization, the cost of failure tends to increase to such a pain point that it stops being tolerated! At this point, real innovation ceases.

But, if we have the will and ability to create small teams and projects with “hare-like” qualities, in ways that allow the “talent + luck” equation to surface truly better and different work, necessarily tolerating (and encouraging) failure and disruption, then we might have a shot at overcoming homeostasis in the same way that a specific combination of engineering and fuel allows rockets to overcome the equally strong force of gravity.

***

Still Interested? Check out our notes on James March’s books On Leadership and The Ambiguities of Experience, and an interview March did on the topic of leadership.

Warren Berger’s Three-Part Method for More Creativity

“A problem well stated is a problem half-solved.”
— Charles “Boss” Kettering

***

The whole scientific method is built on a very simple structure: If I do this, then what will happen? That’s the basic question on which more complicated, intricate, and targeted lines of inquiry are built, across a wide variety of subjects. This simple form helps us push deeper and deeper into knowledge of the world. (On a sidenote, science has become such a loaded, political word that this basic truth of how it works frequently seems to be lost!)

Individuals learn this way too. From the time you were a child, you were asking why (maybe even too much), trying to figure out all the right questions to ask to get better information about how the world works and what to do about it.

Because question-asking is such an integral part of how we know things about the world, both institutionally and individually, it seems worthwhile to understand how creative inquiry works, no? If we want to do things that haven’t been done or learn things that have never been learned — in short, be more creative — we must learn to ask the right questions, ones so good that they’re half-answered in the asking. And to do that, it might help to understand the process, no?

Warren Berger proposes a simple method in his book A More Beautiful Question: an interesting three-part system to help (partially) solve the problem of inquiry. He calls it The Why, What If, and How of Innovative Questioning, and reminds us why it’s worth learning about.

Each stage of the problem solving process has distinct challenges and issues–requiring a different mind-set, along with different types of questions. Expertise is helpful at certain points, not so helpful at others; wide-open, unfettered divergent thinking is critical at one stage, discipline and focus is called for at another. By thinking of questioning and problem solving in a more structured way, we can remind ourselves to shift approaches, change tools, and adjust our questions according to which stage we’re entering.

Three-Part Method for More Creativity

Why?

It starts with the Why?

A good Why? seeks true understanding. Why are things the way they are currently? Why do we do it that way? Why do we believe what we believe?

This start is essential because it gives us permission to continue down a line of inquiry fully equipped. Although we may think we have a brilliant idea in our heads for a new product, or a new answer to an old question, or a new way of doing an old thing, unless we understand why things are the way they are, we’re not yet on solid ground. We never want to operate from a position of ignorance, wasting our time on an idea that hasn’t been pushed and fleshed out. Before we say “I already know” the answer, maybe we need to step back and look for the truth.

At the same time, starting with a strong Why also opens up the idea that the current way (whether it’s our way or someone else’s) might be wrong, or at least inefficient. Let’s say a friend proposes you go to the same restaurant you’ve been to a thousand times. It might be a little agitating, but a simple “Why do we always go there?” allows two things to happen:

A. Your friend can explain why, and this gives him/her a legitimate chance at persuasion. (If you’re open minded.)

B. The two of you may agree you only go there out of habit, and might like to go somewhere else.

This whole Why? business is the realm of contrarian thinking, which not everyone enjoys doing. But Berger cites the case of George Lois:

George Lois, the renowned designer of iconic magazine covers and celebrated advertising campaigns, was also known for being a disruptive force in business meetings. It wasn’t just that he was passionate in arguing for his ideas; the real issue, Lois recalls, was that often he was the only person in the meeting willing to ask why. The gathered business executives would be anxious to proceed on a course of action assumed to be sensible. While everyone else nodded in agreement, “I would be the only guy raising his hand to say, ‘Wait a minute, this thing you want to do doesn’t make any sense. Why the hell are you doing it this way?’”

Others in the room saw Lois as slowing the meeting and stopping the group from moving forward. But Lois understood that the group was apt to be operating on habit–trotting out an idea or approach similar to what had been done in similar situations before, without questioning whether it was the best idea or the right approach in this instance. The group needed to be challenged to “step back” by someone like Lois–who had a healthy enough ego to withstand being the lone questioner in the room.

The truth is that a really good Why? type question tends to be threatening. That’s also what makes it useful. It challenges us to step back and stop thinking on autopilot. It also requires what Berger calls a step back from knowing — that recognizable feeling of knowing something but not knowing how you know it. Forcing this shift in perspective is, of course, one of the most valuable things you can do.

Berger describes a valuable exercise that’s sometimes used to force perspective on people who think they already have a complete answer. After showing a drawing of a large square (seemingly) divided into 16 smaller squares, the questioner (a facilitator Berger calls Srinivas) asks the audience, “How many squares do you see?”

The easy answer is sixteen. But the more observant people in the group are apt to notice–especially after Srinivas allows them to have a second, longer, look–that you can find additional squares by configuring them differently. In addition to the sixteen single squares, there are nine two-by-two squares, four three-by-three squares, and one large four-by-four square, which brings the total to thirty squares.

“The squares were always there, but you didn’t find them until you looked for them.”

Point being, until you step back, re-examine, and look a little harder, you might not have seen all the damn squares yet!
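
For the skeptical, the counting generalizes neatly: an n-by-n grid contains (n - k + 1)^2 squares of side k for each k from 1 to n. A throwaway Python sketch (purely illustrative, not part of Berger’s exercise) confirms the total of thirty:

```python
# Count all axis-aligned squares in an n x n grid of unit cells:
# there are (n - k + 1)^2 squares of side k, for k = 1..n.
def count_squares(n: int) -> int:
    return sum((n - k + 1) ** 2 for k in range(1, n + 1))

print(count_squares(4))  # 16 + 9 + 4 + 1 = 30, matching the exercise
```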

What If?

The second part is where a good questioner, after using Why? to understand as deeply as possible and open a new line of inquiry, proposes a new type of solution, usually an audacious one — all great ideas tend to be, almost by definition — by asking What If…?

Berger illustrates this one well with the story of Pandora Music. The founder Tim Westergren wanted to know why good music wasn’t making it out to the masses. His search didn’t lead to a satisfactory answer, so he eventually asked himself, What if we could map the DNA of music? The result has been pretty darn good, with something close to 80 million listeners at present:

The Pandora story, like many stories of inquiry-driven startups, started with someone’s wondering about an unmet need. It concluded with the questioner, Westergren, figuring out how to bring a fully realized version of the answer into the world.

But what happened in between? That’s when the lightning struck. In Westergren’s case, ideas and influences began to come together; he combined what he knew about music with what he was learning about technology. Inspiration was drawn from a magazine article, and from a seemingly unrelated world (biology). A vision of the new possibility began to form in the mind. It all resulted in an audacious hypothetical question that might or might not have been feasible–but was exciting enough to rally people to the challenge of trying to make it work.

The What If stage is the blue-sky moment of questioning, when anything is possible. Those possibilities may not survive the more practical How stage; but it’s critical to innovation that there be time for wild, improbable ideas to surface and to inspire.

If the word Why has penetrative power, enabling the questioner to get past assumptions and dig deep into problems, the words What if have a more expansive effect–allowing us to think without limits or constraints, firing the imagination.

Clearly, Westergren had engaged in serious combinatorial creativity, pulling from multiple disciplines, which led him to ask the right kind of questions. This seems to be a pretty common feature at this stage of the game, and an extremely common feature of all new ideas:

Smart recombinations are all around us. Pandora, for example, is a combination of a radio station and search engine; it also takes the biological method of genetic coding and transfers it to the domain of music […] In today’s tech world, many of the most successful products–Apple’s iPhone being just one notable example–are hybrids, melding functions and features in new ways.

Companies, too, can be smart recombinations. Netflix was started as a video-rental business that operated like a monthly membership health club (and now it has added “TV production studio” to the mix). Airbnb is a combination of an online travel agency, a social media platform, and a good old-fashioned bed-and-breakfast (the B&B itself is a smart combination from way back).

It may be that the Why? –> What if? line of inquiry is common to all types of innovative thinking because it engages the part of our brain that starts turning over old ideas in new ways by combining them with other unrelated ideas, many of them previously sitting idle in our subconscious. That churning is where new ideas really arise.

The idea then has to be “reality-tested”, and that’s where the last major question comes in.

How?

Once we think we’ve hit on a brilliant new idea, it’s time to see if the thing actually works. Usually and most frequently, the answer is no. But enough times to make it worth our while, we discover that the new idea has legs.

The most common problem here is that we try to perfect a new idea all at once, leading to stagnation and paralysis. That’s usually the wrong approach.

Another, often better, way is to try the idea quickly and start getting feedback. As much as possible. In the book, Berger describes a fun little experiment that drives home the point, and serves as a fairly useful business metaphor besides:

A software designer shared a story about an interesting experiment in which the organizers brought together a group of kindergarten children who were divided into small teams and given a challenge: Using uncooked spaghetti sticks, string, tape, and a marshmallow, they had to assemble the tallest structure they could, within a time limit (the marshmallow was supposed to be placed on top of the completed structure.)

Then, in a second phase of the experiment, the organizers added a new wrinkle. They brought in teams of Harvard MBA grad students to compete in the challenge against the kindergartners. The grad students, I’m told, took it seriously. They brought a highly analytical approach to the challenge, debating among themselves about how best to combine the sticks, the string, and the tape to achieve maximum altitude.

Perhaps you’ll have guessed this already, but the MBA students were no match for the kindergartners. For all their planning and discussion, the structures they carefully conceived invariably fell apart–and then they were out of time before they could get in more attempts.

The kids used their time much more efficiently by constructing right away. They tried one way of building, and if it didn’t work, they quickly tried another. They got in a lot more tries. They learned from their mistakes as they went along, instead of attempting to figure out everything in advance.

This little experiment gets run in the real world all the time by startups looking to outcompete ponderous old bureaucracies. They simply substitute velocity for scale and see what happens — it often works well.

The point is to move along the axis of Why?–>What If–>How? without too much self-censoring in the last phase. Being afraid to fail can often mean a great What If? proposition gets stuck there forever. Analysis paralysis, as it’s sometimes called. But if you can instead enter the testing of the How? stage quickly, even by showing that an idea won’t work, then you can start the loop over again, either asking a new Why? or proposing a new What If? to an existing Why?

Thus moving your creative engine forward.

***

Berger’s point is that there is an intense practical end to understanding productive inquiry. Just like “If I do this, then what will happen?” is a basic structure on which all manner of complex scientific questioning and testing is built, so can a simple Why, What If, and How structure catalyze a litany of new ideas.

Still Interested? Check out the book, or check out some related posts: Steve Jobs on Creativity, Seneca on Gathering Ideas And Combinatorial Creativity, or for some fun with question-asking, What If? Serious Scientific Answers to Absurd Hypothetical Questions.