
The Improbable Story of the Online Encyclopedia

More important than determining who deserved credit is appreciating the dynamics that occur when people share ideas.

 

Walter Isaacson is the rare sort of writer whose books, if you’re like me, you pre-order on sight. The first thing of his I read was the Einstein biography, then the Steve Jobs biography; then I went back and ordered everything else. He’s out with a new book, The Innovators, which recounts the story of the people who created the Internet. From Ada Lovelace, Lord Byron’s daughter, who pioneered computer programming in the 1840s, long before anyone else, through to Steve Jobs, Tim Berners-Lee, and Larry Page, Isaacson shows not only the people but how their minds worked.

Below is an excerpt from The Innovators, recounting the improbable story of Wikipedia.

When he launched the Web in 1991, Tim Berners-Lee intended it to be used as a collaboration tool, which is why he was dismayed that the Mosaic browser did not give users the ability to edit the Web pages they were viewing. It turned Web surfers into passive consumers of published content. That lapse was partly mitigated by the rise of blogging, which encouraged user-generated content. In 1995 another medium was invented that went further toward facilitating collaboration on the Web. It was called a wiki, and it worked by allowing users to modify Web pages—not by having an editing tool in their browser but by clicking and typing directly onto Web pages that ran wiki software.

The application was developed by Ward Cunningham, another of those congenial Midwest natives (Indiana, in his case) who grew up making ham radios and getting turned on by the global communities they fostered. After graduating from Purdue, he got a job at an electronic equipment company, Tektronix, where he was assigned to keep track of projects, a task similar to what Berners-Lee faced when he went to CERN.

To do this he modified a superb software product developed by one of Apple’s most enchanting innovators, Bill Atkinson. It was called HyperCard, and it allowed users to make their own hyper-linked cards and documents on their computers. Apple had little idea what to do with the software, so at Atkinson’s insistence Apple gave it away free with its computers. It was easy to use, and even kids—especially kids—found ways to make HyperCard stacks of linked pictures and games.

Cunningham was blown away by HyperCard when he first saw it, but he found it cumbersome. So he created a super simple way of creating new cards and links: a blank box on each card in which you could type a title or word or phrase. If you wanted to make a link to Jane Doe or Harry’s Video Project or anything else, you simply typed those words in the box. “It was fun to do,” he said.

Then he created an Internet version of his HyperText program, writing it in just a few hundred lines of Perl code. The result was a new content management application that allowed users to edit and contribute to a Web page. Cunningham used the application to build a service, called the Portland Pattern Repository, that allowed software developers to exchange programming ideas and improve on the patterns that others had posted. “The plan is to have interested parties write web pages about the People, Projects and Patterns that have changed the way they program,” he wrote in an announcement posted in May 1995. “The writing style is casual, like email . . . Think of it as a moderated list where anyone can be moderator and everything is archived. It’s not quite a chat, still, conversation is possible.”

Now he needed a name. What he had created was a quick Web tool, but QuickWeb sounded lame, as if conjured up by a committee at Microsoft. Fortunately, there was another word for quick that popped from the recesses of his memory. When he was on his honeymoon in Hawaii thirteen years earlier, he remembered, “the airport counter agent directed me to take the wiki wiki bus between terminals.” When he asked what it meant, he was told that wiki was the Hawaiian word for quick, and wiki wiki meant superquick. So he named his Web pages and the software that ran them WikiWikiWeb, wiki for short.

In his original version, the syntax Cunningham used for creating links in a text was to smash words together so that there would be two or more capital letters—as in CapitalLetters—in a term. It became known as CamelCase, and its resonance would later be seen in scores of Internet brands such as AltaVista, MySpace, and YouTube.
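To make the convention concrete, here is a minimal sketch, in Python rather than Cunningham’s Perl, of how a wiki engine might turn CamelCase terms into links. The regex, the URL scheme, and the trailing “?” link for pages that do not yet exist are illustrative assumptions, not his actual code.

```python
import re

# A term qualifies as a wiki link when words are smashed together with
# two or more capitals, e.g. "CapitalLetters" or "JaneDoe".
CAMELCASE = re.compile(r"\b([A-Z][a-z]+(?:[A-Z][a-z]+)+)\b")

def render(text, existing_pages):
    """Replace each CamelCase term with a hyperlink to that page.

    Terms with no page yet get a "?" link inviting someone to create
    one (a convention of early wikis; the URL scheme is invented).
    """
    def linkify(match):
        term = match.group(1)
        if term in existing_pages:
            return f'<a href="/wiki/{term}">{term}</a>'
        return f'{term}<a href="/wiki/{term}?edit=1">?</a>'

    return CAMELCASE.sub(linkify, text)

print(render("See JaneDoe about HarrysVideoProject.", {"JaneDoe"}))
```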

WardsWiki (as it became known) allowed anyone to edit and contribute, without even needing a password. Previous versions of each page would be stored, in case someone botched one up, and there would be a “Recent Changes” page so that Cunningham and others could keep track of the edits. But there would be no supervisor or gatekeeper preapproving the changes. It would work, he said with cheery midwestern optimism, because “people are generally good.” It was just what Berners-Lee had envisioned, a Web that was read-write rather than read-only. “Wikis were one of the things that allowed collaboration,” Berners-Lee said. “Blogs were another.”
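The mechanics described here, storing every version, logging recent changes, and letting anyone revert, are simple enough to sketch. Below is a toy illustration in Python; the class and its details are my own invention for clarity, not Cunningham’s implementation.

```python
import datetime

class WikiPage:
    """Toy wiki page store: saves are appended, never overwritten."""

    recent_changes = []  # shared site-wide log, like the "Recent Changes" page

    def __init__(self, title):
        self.title = title
        self.versions = []

    def save(self, text, author):
        self.versions.append(text)
        WikiPage.recent_changes.append(
            (datetime.datetime.now(), self.title, author))

    def current(self):
        return self.versions[-1]

    def revert(self, version_index, author):
        # A revert is just another edit whose content is an old version,
        # so no gatekeeper is needed: anyone can repair a botched page.
        self.save(self.versions[version_index], author)

page = WikiPage("PortlandPatternRepository")
page.save("Patterns change the way we program.", "ward")
page.save("VANDALIZED", "anonymous")
page.revert(0, "any-passerby")
assert page.current() == "Patterns change the way we program."
```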

Like Berners-Lee, Cunningham made his basic software available for anyone to modify and use. Consequently, there were soon scores of wiki sites as well as open-source improvements to his software. But the wiki concept was not widely known beyond software engineers until January 2001, when it was adopted by a struggling Internet entrepreneur who was trying, without much success, to build a free, online encyclopedia.

***

Jimmy Wales was born in 1966 in Huntsville, Alabama, a town of rednecks and rocket scientists. Six years earlier, in the wake of Sputnik, President Eisenhower had personally gone there to open the Marshall Space Flight Center. “Growing up in Huntsville during the height of the space program kind of gave you an optimistic view of the future,” Wales observed. “An early memory was of the windows in our house rattling when they were testing the rockets. The space program was basically our hometown sports team, so it was exciting and you felt it was a town of technology and science.”

Wales, whose father was a grocery store manager, went to a one-room private school that was started by his mother and grandmother, who taught music. When he was three, his mother bought a World Book Encyclopedia from a door-to-door salesman; as he learned to read, it became an object of veneration. It put at his fingertips a cornucopia of knowledge along with maps and illustrations and even a few cellophane layers of transparencies you could lift to explore such things as the muscles, arteries, and digestive system of a dissected frog. But Wales soon discovered that the World Book had shortcomings: no matter how much was in it, there were many more things that weren’t. And this became more so with time. After a few years, there were all sorts of topics—moon landings and rock festivals and protest marches, Kennedys and kings—that were not included. World Book sent out stickers for owners to paste on the pages in order to update the encyclopedia, and Wales was fastidious about doing so. “I joke that I started as a kid revising the encyclopedia by stickering the one my mother bought.”

After graduating from Auburn and a halfhearted stab at graduate school, Wales took a job as a research director for a Chicago financial trading firm. But it did not fully engage him. His scholarly attitude was combined with a love for the Internet that had been honed by playing Multi-User Dungeons fantasies, which were essentially crowdsourced games. He founded and moderated an Internet mailing list discussion on Ayn Rand, the Russian-born American writer who espoused an objectivist and libertarian philosophy. He was very open about who could join the discussion forum, frowned on rants and the personal attack known as flaming, and managed comportment with a gentle hand. “I have chosen a ‘middle-ground’ method of moderation, a sort of behind-the-scenes prodding,” he wrote in a posting.

Before the rise of search engines, among the hottest Internet services were Web directories, which featured human-assembled lists and categories of cool sites, and Web rings, which created through a common navigation bar a circle of related sites that were linked to one another. Jumping on these bandwagons, Wales and two friends in 1996 started a venture that they dubbed BOMIS, for Bitter Old Men in Suits, and began casting around for ideas. They launched a panoply of startups that were typical of the dotcom boom of the late ’90s: a used-car ring and directory with pictures, a food-ordering service, a business directory for Chicago, and a sports ring. After Wales relocated to San Diego, he launched a directory and ring that served as “kind of a guy-oriented search engine,” featuring pictures of scantily clad women.

The rings showed Wales the value of having users help generate the content, a concept that was reinforced as he watched how the crowds of sports bettors on his site provided a more accurate morning line than any single expert could. He also was impressed by Eric Raymond’s The Cathedral and the Bazaar, which explained why an open and crowd-generated bazaar was a better model for a website than the carefully controlled top-down construction of a cathedral.

Wales next tried an idea that reflected his childhood love of the World Book: an online encyclopedia. He dubbed it Nupedia, and it had two attributes: it would be written by volunteers, and it would be free. It was an idea that had been proposed in 1999 by Richard Stallman, the pioneering advocate of free software. Wales hoped eventually to make money by selling ads. To help develop it, he hired a doctoral student in philosophy, Larry Sanger, whom he first met in online discussion groups. “He was specifically interested in finding a philosopher to lead the project,” Sanger recalled.

Sanger and Wales developed a rigorous, seven-step process for creating and approving articles, which included assigning topics to proven experts, whose credentials had been vetted, and then putting the drafts through outside expert reviews, public reviews, professional copy editing, and public copy editing. “We wish editors to be true experts in their fields and (with few exceptions) possess Ph.Ds.,” the Nupedia policy guidelines stipulated. “Larry’s view was that if we didn’t make it more academic than a traditional encyclopedia, people wouldn’t believe in it and respect it,” Wales explained. “He was wrong, but his view made sense given what we knew at the time.” The first article, published in March 2000, was on atonality by a scholar at the Johannes Gutenberg University in Mainz, Germany.

***

It was a painfully slow process and, worse yet, not a lot of fun. The whole point of writing for free online, as Justin Hall had shown, was that it produced a jolt of joy. After a year, Nupedia had only about a dozen articles published, making it useless as an encyclopedia, and 150 that were still in draft stage, which indicated how unpleasant the process had become. It had been rigorously engineered not to scale.

This hit home to Wales when he decided that he would personally write an article on Robert Merton, an economist who had won the Nobel Prize for creating a mathematical model for markets containing derivatives. Wales had published a paper on option pricing theory, so he was very familiar with Merton’s work. “I started to try to write the article and it was very intimidating, because I knew they were going to send my draft out to the most prestigious finance professors they could find,” Wales said. “Suddenly I felt like I was back in grad school, and it was very stressful. I realized that the way we had set things up was not going to work.”

That was when Wales and Sanger discovered Ward Cunningham’s wiki software. Like many digital-age innovations, the application of wiki software to Nupedia in order to create Wikipedia—combining two ideas to create an innovation—was a collaborative process involving thoughts that were already in the air. But in this case a very non-wiki-like dispute erupted over who deserved the most credit.

The way Sanger remembered the story, he was having lunch in early January 2001 at a roadside taco stand near San Diego with a friend named Ben Kovitz, a computer engineer. Kovitz had been using Cunningham’s wiki and described it at length. It then dawned on Sanger, he claimed, that a wiki could be used to help solve the problems he was having with Nupedia. “Instantly I was considering whether wiki would work as a more open and simple editorial system for a free, collaborative encyclopedia,” Sanger later recounted. “The more I thought about it, without even having seen a wiki, the more it seemed obviously right.” In his version of the story, he then convinced Wales to try the wiki approach.

Kovitz, for his part, contended that he was the one who came up with the idea of using wiki software for a crowdsourced encyclopedia and that he had trouble convincing Sanger. “I suggested that instead of just using the wiki with Nupedia’s approved staff, he open it up to the general public and let each edit appear on the site immediately, with no review process,” Kovitz recounted. “My exact words were to allow ‘any fool in the world with Internet access’ to freely modify any page on the site.” Sanger raised some objections: “Couldn’t total idiots put up blatantly false or biased descriptions of things?” Kovitz replied, “Yes, and other idiots could delete those changes or edit them into something better.”

As for Wales’s version of the story, he later claimed that he had heard about wikis a month before Sanger’s lunch with Kovitz. Wikis had, after all, been around for more than four years and were a topic of discussion among programmers, including one who worked at BOMIS, Jeremy Rosenfeld, a big kid with a bigger grin. “Jeremy showed me Ward’s wiki in December 2000 and said it might solve our problem,” Wales recalled, adding that when Sanger showed him the same thing, he responded, “Oh, yes, wiki, Jeremy showed me this last month.” Sanger challenged that recollection, and a nasty crossfire ensued on Wikipedia’s discussion boards. Wales finally tried to de-escalate the sniping with a post telling Sanger, “Gee, settle down,” but Sanger continued his battle against Wales in a variety of forums.

The dispute presented a classic case of a historian’s challenge when writing about collaborative creativity: each player has a different recollection of who made which contribution, with a natural tendency to inflate his own. We’ve all seen this propensity many times in our friends, and perhaps even once or twice in ourselves. But it is ironic that such a dispute attended the birth of one of history’s most collaborative creations, a site that was founded on the faith that people are willing to contribute without requiring credit. (Tellingly, and laudably, Wikipedia’s entries on its own history and the roles of Wales and Sanger have turned out, after much fighting on the discussion boards, to be balanced and objective.)

More important than determining who deserved credit is appreciating the dynamics that occur when people share ideas. Ben Kovitz, for one, understood this. He was the player who had the most insightful view—call it the “bumblebee at the right time” theory—on the collaborative way that Wikipedia was created. “Some folks, aiming to criticize or belittle Jimmy Wales, have taken to calling me one of the founders of Wikipedia, or even ‘the true founder,’” he said. “I suggested the idea, but I was not one of the founders. I was only the bumblebee. I had buzzed around the wiki flower for a while, and then pollinated the free-encyclopedia flower. I have talked with many others who had the same idea, just not in times or places where it could take root.”

That is the way that good ideas often blossom: a bumblebee brings half an idea from one realm, and pollinates another fertile realm filled with half-formed innovations. This is why Web tools are valuable, as are lunches at taco stands.

***

Cunningham was supportive, indeed delighted when Wales called him up in January 2001 to say he planned to use the wiki software to juice up his encyclopedia project. Cunningham had not sought to patent or copyright either the software or the wiki name, and he was one of those innovators who was happy to see his products become tools that anyone could use or adapt.

At first Wales and Sanger conceived of Wikipedia merely as an adjunct to Nupedia, sort of like a feeder product or farm team. The wiki articles, Sanger assured Nupedia’s expert editors, would be relegated to a separate section of the website and not be listed with the regular Nupedia pages. “If a wiki article got to a high level it could be put into the regular Nupedia editorial process,” he wrote in a post. Nevertheless, the Nupedia purists pushed back, insisting that Wikipedia be kept completely segregated, so as not to contaminate the wisdom of the experts. The Nupedia Advisory Board tersely declared on its website, “Please note: the editorial processes and policies of Wikipedia and Nupedia are totally separate; Nupedia editors and peer reviewers do not necessarily endorse the Wikipedia project, and Wikipedia contributors do not necessarily endorse the Nupedia project.” Though they didn’t know it, the pedants of the Nupedia priesthood were doing Wikipedia a huge favor by cutting the cord.

Unfettered, Wikipedia took off. It became to Web content what GNU/Linux was to software: a peer-to-peer commons collaboratively created and maintained by volunteers who worked for the civic satisfactions they found. It was a delightful, counterintuitive concept, perfectly suited to the philosophy, attitude, and technology of the Internet. Anyone could edit a page, and the results would show up instantly. You didn’t have to be an expert. You didn’t have to fax in a copy of your diploma. You didn’t have to be authorized by the Powers That Be. You didn’t even have to be registered or use your real name. Sure, that meant vandals could mess up pages. So could idiots or ideologues. But the software kept track of every version. If a bad edit appeared, the community could simply get rid of it by clicking on a “revert” link. “Imagine a wall where it was easier to remove graffiti than add it” is the way the media scholar Clay Shirky explained the process. “The amount of graffiti on such a wall would depend on the commitment of its defenders.” In the case of Wikipedia, its defenders were fiercely committed. Wars have been fought with less intensity than the reversion battles on Wikipedia. And somewhat amazingly, the forces of reason regularly triumphed.

One month after Wikipedia’s launch, it had a thousand articles, approximately seventy times the number that Nupedia had after a full year. By September 2001, after eight months in existence, it had ten thousand articles. That month, when the September 11 attacks occurred, Wikipedia showed its nimbleness and usefulness; contributors scrambled to create new pieces on such topics as the World Trade Center and its architect. A year after that, the article total reached forty thousand, more than were in the World Book that Wales’s mother had bought. By March 2003 the number of articles in the English-language edition had reached 100,000, with close to five hundred active editors working almost every day. At that point, Wales decided to shut Nupedia down.

By then Sanger had been gone for a year. Wales had let him go. They had increasingly clashed on fundamental issues, such as Sanger’s desire to give more deference to experts and scholars. In Wales’s view, “people who expect deference because they have a Ph.D. and don’t want to deal with ordinary people tend to be annoying.” Sanger felt, to the contrary, that it was the nonacademic masses who tended to be annoying. “As a community, Wikipedia lacks the habit or tradition of respect for expertise,” he wrote in a New Year’s Eve 2004 manifesto that was one of many attacks he leveled after he left. “A policy that I attempted to institute in Wikipedia’s first year, but for which I did not muster adequate support, was the policy of respecting and deferring politely to experts.” Sanger’s elitism was rejected not only by Wales but by the Wikipedia community. “Consequently, nearly everyone with much expertise but little patience will avoid editing Wikipedia,” Sanger lamented.

Sanger turned out to be wrong. The uncredentialed crowd did not run off the experts. Instead the crowd itself became the expert, and the experts became part of the crowd. Early in Wikipedia’s development, I was researching a book about Albert Einstein and I noticed that the Wikipedia entry on him claimed that he had traveled to Albania in 1935 so that King Zog could help him escape the Nazis by getting him a visa to the United States. This was completely untrue, even though the passage included citations to obscure Albanian websites where this was proudly proclaimed, usually based on some third-hand series of recollections about what someone’s uncle once said a friend had told him. Using both my real name and a Wikipedia handle, I deleted the assertion from the article, only to watch it reappear. On the discussion page, I provided sources for where Einstein actually was during the time in question (Princeton) and what passport he was using (Swiss). But tenacious Albanian partisans kept reinserting the claim. The Einstein-in-Albania tug-of-war lasted weeks. I became worried that the obstinacy of a few passionate advocates could undermine Wikipedia’s reliance on the wisdom of crowds. But after a while, the edit wars ended, and the article no longer had Einstein going to Albania. At first I didn’t credit that success to the wisdom of crowds, since the push for a fix had come from me and not from the crowd. Then I realized that I, like thousands of others, was in fact a part of the crowd, occasionally adding a tiny bit to its wisdom.

A key principle of Wikipedia was that articles should have a neutral point of view. This succeeded in producing articles that were generally straightforward, even on controversial topics such as global warming and abortion. It also made it easier for people of different viewpoints to collaborate. “Because of the neutrality policy, we have partisans working together on the same articles,” Sanger explained. “It’s quite remarkable.” The community was usually able to use the lodestar of the neutral point of view to create a consensus article offering competing views in a neutral way. It became a model, rarely emulated, of how digital tools can be used to find common ground in a contentious society.

Not only were Wikipedia’s articles created collaboratively by the community; so were its operating practices. Wales fostered a loose system of collective management, in which he played guide and gentle prodder but not boss. There were wiki pages where users could jointly formulate and debate the rules. Through this mechanism, guidelines were evolved to deal with such matters as reversion practices, mediation of disputes, the blocking of individual users, and the elevation of a select few to administrator status. All of these rules grew organically from the community rather than being dictated downward by a central authority. Like the Internet itself, power was distributed. “I can’t imagine who could have written such detailed guidelines other than a bunch of people working together,” Wales reflected. “It’s common in Wikipedia that we’ll come to a solution that’s really well thought out because so many minds have had a crack at improving it.”

As it grew organically, with both its content and its governance sprouting from its grassroots, Wikipedia was able to spread like kudzu. At the beginning of 2014, there were editions in 287 languages, ranging from Afrikaans to Žemaitėška. The total number of articles was 30 million, with 4.4 million in the English-language edition. In contrast, the Encyclopedia Britannica, which quit publishing a print edition in 2010, had eighty thousand articles in its electronic edition, less than 2 percent of the number in Wikipedia. “The cumulative effort of Wikipedia’s millions of contributors means you are a click away from figuring out what a myocardial infarction is, or the cause of the Agacher Strip War, or who Spangles Muldoon was,” Clay Shirky has written. “This is an unplanned miracle, like ‘the market’ deciding how much bread goes in the store. Wikipedia, though, is even odder than the market: not only is all that material contributed for free, it is available to you free.” The result has been the greatest collaborative knowledge project in history.

***


So why do people contribute? Harvard Professor Yochai Benkler dubbed Wikipedia, along with open-source software and other free collaborative projects, examples of “commons-based peer production.” He explained, “Its central characteristic is that groups of individuals successfully collaborate on large-scale projects following a diverse cluster of motivational drives and social signals, rather than either market prices or managerial commands.” These motivations include the psychological reward of interacting with others and the personal gratification of doing a useful task. We all have our little joys, such as collecting stamps or being a stickler for good grammar, knowing Jeff Torborg’s college batting average or the order of battle at Trafalgar. These all find a home on Wikipedia.

There is something fundamental, almost primordial at work. Some Wikipedians refer to it as “wiki-crack.” It’s the rush of dopamine that seems to hit the brain’s pleasure center when you make a smart edit and it appears instantly in a Wikipedia article. Until recently, being published was a pleasure afforded only to a select few. Most of us in that category can remember the thrill of seeing our words appear in public for the first time. Wikipedia, like blogs, made that treat available to anyone. You didn’t have to be credentialed or anointed by the media elite.

For example, many of Wikipedia’s articles on the British aristocracy were largely written by a user known as Lord Emsworth. They were so insightful about the intricacies of the peerage system that some were featured as the “Article of the Day,” and Lord Emsworth rose to become a Wikipedia administrator. It turned out that Lord Emsworth, a name taken from P. G. Wodehouse’s novels, was actually a 16-year-old schoolboy in South Brunswick, New Jersey. On Wikipedia, nobody knows you’re a commoner.

Connected to that is the even deeper satisfaction that comes from helping to create the information that we use rather than just passively receiving it. “Involvement of people in the information they read,” wrote the Harvard professor Jonathan Zittrain, “is an important end itself.” A Wikipedia that we create in common is more meaningful than would be the same Wikipedia handed to us on a platter. Peer production allows people to be engaged.

Jimmy Wales often repeated a simple, inspiring mission for Wikipedia: “Imagine a world in which every single person on the planet is given free access to the sum of all human knowledge. That’s what we’re doing.” It was a huge, audacious, and worthy goal. But it badly understated what Wikipedia did. It was about more than people being “given” free access to knowledge; it was also about empowering them, in a way not seen before in history, to be part of the process of creating and distributing knowledge. Wales came to realize that. “Wikipedia allows people not merely to access other people’s knowledge but to share their own,” he said. “When you help build something, you own it, you’re vested in it. That’s far more rewarding than having it handed down to you.”

Wikipedia took the world another step closer to the vision propounded by Vannevar Bush in his 1945 essay, “As We May Think,” which predicted, “Wholly new forms of encyclopedias will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified.” It also harkened back to Ada Lovelace, who asserted that machines would be able to do almost anything, except think on their own. Wikipedia was not about building a machine that could think on its own. It was instead a dazzling example of human-machine symbiosis, the wisdom of humans and the processing power of computers being woven together like a tapestry. When Wales and his new wife had a daughter in 2011, they named her Ada, after Lady Lovelace.

The Innovators is a must-read for anyone looking to better understand the creative mind.

(h/t The Daily Beast)

Google and Combinatorial Innovation

In his new book, How Google Works, Eric Schmidt argues that “we are entering … a new period of combinatorial innovation.” This happens, he says, when “there is a great availability of different component parts that can be combined or recombined to create new inventions.”

For example, in the 1800s, the standardization of design of mechanical devices such as gears, pulleys, chains, and cams led to a manufacturing boom. In the 1900s, the gasoline engine led to innovations in automobiles, motorcycles, and airplanes. By the 1950s, it was the integrated circuit proliferating in numerous applications. In each of these cases, the development of complementary components led to a wave of inventions.

Today’s components are often about information, technology, and computing.

Would-be inventors have all the world’s information, global reach, and practically infinite computing power. They have open-source software and abundant APIs that allow them to build easily on each other’s work. They can use standard protocols and languages. They can access information platforms with data about things ranging from traffic to weather to economic transactions to human genetics to who is socially connected with whom, either on an aggregate or (with permission) individual basis. So one way of developing technical insights is to use some of these accessible technologies and data and apply them in an industry to solve an existing problem in a new way.

Regardless of your business, there is a core of knowledge and conventional wisdom that your industry is based upon. Maybe it’s logistics; maybe it’s biology, chemistry, or storytelling. Whatever that core is, “that’s your technology. Find the geeks, find the stuff, and that’s where you’ll find the technical insights you need to drive success.”

That’s also the area to look for — where conventional wisdom might be wrong. What was once common sense becomes common practice. When everyone agrees on some fundamental assumption about how the industry works, the opposite point of view can lead toward disruption.

Another possible source of innovation is to start with a solution to one problem and then look at ways to use the same solution on other problems.

New technologies tend to come into the world in a very primitive condition, often designed for very specific problems. The steam engine was used as a nifty way to pump water out of mines long before it found its calling powering locomotives. Marconi sold radio as a means of ship-to-shore communications, not as a place to hear phrases like “Baba Booey!” and “all the children are above average.” Bell Labs was so underwhelmed by the commercial potential of the laser when it was invented in the ’60s that it initially put off patenting it. Even the Internet was initially conceived as a way for scientists and academics to share research. As smart as its creators were, they could never have imagined its future functionality as a place to share pictures and videos, stay in touch with friends, learn anything about anything, or do the other amazing things we use it for today.

Schmidt gives his favorite example of building upon a solution developed for a narrow problem.

When Google search started to ramp up, some of our most popular queries were related to adult-oriented topics. Porn filters at the time were notoriously ineffective, so we put a small team of engineers on the problem of algorithmically capturing Supreme Court Justice Potter Stewart’s definition of porn, “I know it when I see it.” They were successful by combining a couple of technical insights: They got very good at understanding the content of an image (aka skin), and could judge its context by seeing how users interacted with it. (When someone searches for a pornography-related term and the image is from a medical textbook, they are unlikely to click on it, and if they do they won’t stay on the site for long.) Soon we had a filter called SafeSearch that was far more effective in blocking inappropriate images than anything else on the web—a solution (SafeSearch) to a narrow problem (filtering adult content).

But why stop there? Over the next couple of years we took the technology that had been developed to address the porn problem and used it to serve broader purposes. We improved our ability to rate the relevance of images (any images, not just porn) to search queries by using the millions of content-based models (the models of how users react to different images) that we had developed for SafeSearch. Then we added features that let users search for images similar to the ones they find in their search results (“I like that shot of Yosemite—go find more that look just like that”). Finally, we developed the ability to start a search not with a written query (“half dome, yosemite”), but a photograph (that snapshot you took of Half Dome when you visited Yosemite). All of these features evolved from technology that had initially been developed for the SafeSearch porn filter. So when you are looking at screen upon screen of Yosemite photos that are nearly identical to the ones you took, you can thank the adult entertainment industry for helping launch the technology that is bringing them to you.
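As a rough illustration of the combination Schmidt describes, an image-content score plus the context of how users behave toward the image, here is a toy filter in Python. Every signal name and threshold is an assumption invented for illustration; the real SafeSearch models are far more sophisticated and not public.

```python
def looks_like_porn(skin_score, clickthrough_rate, avg_dwell_seconds):
    """Toy fusion of the two signals in the excerpt.

    skin_score: 0..1 output of an (assumed) image-content model.
    clickthrough_rate / avg_dwell_seconds: how users behave when the
    image appears for adult-intent queries. A medical-textbook image
    gets few clicks and short visits, so it survives the filter even
    with high skin content.
    """
    behavior_says_porn = clickthrough_rate > 0.3 and avg_dwell_seconds > 20
    # Require both signals to agree before blocking an image.
    return skin_score > 0.8 and behavior_says_porn

print(looks_like_porn(0.9, 0.45, 60))  # True: likely filtered
print(looks_like_porn(0.9, 0.02, 5))   # False: textbook-style image kept
```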

How Google Works is full of interesting insights into the inner workings of a company we’re all fascinated with.

Eight Things I Learned from Peter Thiel’s Zero To One

Peter Thiel is an entrepreneur and investor. He co-founded PayPal and Palantir. He also made the first outside investment in Facebook and was an early investor in companies like SpaceX and LinkedIn. And now he’s written a book, Zero to One: Notes on Startups, or How to Build the Future, with the goal of helping us “see beyond the tracks laid down” to the “broader future that there is to create.”

Zero To One is an exercise in thinking — about questioning and rethinking received wisdom in order to create the future. And thinking about thinking is what we’re all about.

Here are eight lessons I took away from the book.

1. Each Moment Happens Once

Like Heraclitus, who said that you can only step into the same river once, Thiel believes that each moment in business happens only once.

The next Bill Gates will not build an operating system. The next Larry Page or Sergey Brin won’t make a search engine. And the next Mark Zuckerberg won’t create a social network. If you are copying these guys, you aren’t learning from them.

Of course, it’s easier to copy a model than to make something new. Doing what we already know how to do takes the world from 1 to n, adding more of something familiar. But every time we create something new, we go from 0 to 1. The act of creation is singular, as is the moment of creation, and the result is something fresh and strange.

2. There is no Formula

The paradox of teaching entrepreneurship is that such a formula (for innovation) cannot exist; because every innovation is new and unique, no authority can prescribe in concrete terms how to be more innovative. Indeed, the single most powerful pattern I have noticed is that successful people find value in unexpected places, and they do this by thinking about business from first principles instead of formulas.

3. The Best Interview Question

Whenever I interview someone for a job, I like to ask this question: “What important truth do very few people agree with you on?”

This is a question that sounds easy because it’s straightforward. Actually, it’s very hard to answer. It’s intellectually difficult because the knowledge that everyone is taught in school is by definition agreed upon. And it’s psychologically difficult because anyone trying to answer must say something she knows to be unpopular. Brilliant thinking is rare, but courage is in even shorter supply than genius.

Most commonly, I hear answers like the following:

“Our educational system is broken and urgently needs to be fixed.”

“America is exceptional.”

“There is no God.”

These are bad answers. The first and the second statements might be true, but many people already agree with them. The third statement simply takes one side in a familiar debate. A good answer takes the following form: “Most people believe in x, but the truth is the opposite of x.”

What does this have to do with the future?

In the most minimal sense, the future is simply the set of all moments yet to come. But what makes the future distinctive and important isn’t that it hasn’t happened yet, but rather that it will be a time when the world looks different from today. … Most answers to the contrarian questions are different ways of seeing the present; good answers are as close as we can come to looking into the future.

4. A Company’s Most Important Strength

Properly defined, a startup is the largest group of people you can convince of a plan to build a different future. A new company’s most important strength is new thinking: even more important than nimbleness, small size affords space to think.

“Madness is rare in individuals—but in groups, parties, nations and ages it is the rule.”

— Nietzsche

5. The Contrarian Question

The question “What important truth do very few people agree with you on?” is hard to answer at first. It’s better to start with “what does everybody agree on?”

If you can identify a delusional popular belief, you can find what lies hidden behind it: the contrarian truth.

[…]

Conventional beliefs only ever come to appear arbitrary and wrong in retrospect; whenever one collapses we call the old belief a bubble, but the distortions caused by bubbles don’t disappear when they pop. The internet bubble of the ‘90s was the biggest of the last two decades, and the lessons learned afterward define and distort almost all thinking about technology today. The first step to thinking clearly is to question what we think we know about the past.

Here is an example Thiel gives to help illuminate this idea.

The entrepreneurs who stuck with Silicon Valley learned four big lessons from the dot-com crash that still guide business thinking today:

1. Make incremental advances — “Grand visions inflated the bubble, so they should not be indulged. Anyone who claims to be able to do something great is suspect, and anyone who wants to change the world should be more humble. Small, incremental steps are the only safe path forward.”

2. Stay lean and flexible — “All companies must be lean, which is code for unplanned. You should not know what your business will do; planning is arrogant and inflexible. Instead you should try things out, iterate, and treat entrepreneurship as agnostic experimentation.”

3. Improve on the competition — “Don’t try to create a new market prematurely. The only way to know that you have a real business is to start with an already existing customer, so you should build your company by improving on recognizable products already offered by successful competitors.”

4. Focus on product, not sales — “If your product requires advertising or salespeople to sell it, it’s not good enough: technology is primarily about product development, not distribution. Bubble-era advertising was obviously wasteful, so the only sustainable growth is viral growth.”

These lessons have become dogma in the startup world; those who would ignore them are presumed to invite the justified doom visited upon technology in the great crash of 2000. And yet the opposite principles are probably more correct.

1. It is better to risk boldness than triviality.
2. A bad plan is better than no plan.
3. Competitive markets destroy profits.
4. Sales matters just as much as product.

To build the future we need to challenge the dogmas that shape our view of the past. That doesn’t mean the opposite of what is believed is necessarily true; it means that you need to rethink what is and is not true and determine how that shapes how we see the world today. As Thiel says, “The most contrarian thing of all is not to oppose the crowd but to think for yourself.”

6. Progress Comes From Monopoly, not Competition

The problem with a competitive business goes beyond lack of profits. Imagine you’re running one of those restaurants in Mountain View. You’re not that different from dozens of your competitors, so you’ve got to fight hard to survive. If you offer affordable food with low margins, you can probably pay employees only minimum wage. And you’ll need to squeeze out every efficiency: That is why small restaurants put Grandma to work at the register and make the kids wash dishes in the back.

A monopoly like Google is different. Since it doesn’t have to worry about competing with anyone, it has wider latitude to care about its workers, its products and its impact on the wider world. Google’s motto—”Don’t be evil”—is in part a branding ploy, but it is also characteristic of a kind of business that is successful enough to take ethics seriously without jeopardizing its own existence. In business, money is either an important thing or it is everything. Monopolists can afford to think about things other than making money; non-monopolists can’t. In perfect competition, a business is so focused on today’s margins that it can’t possibly plan for a long-term future. Only one thing can allow a business to transcend the daily brute struggle for survival: monopoly profits.

So a monopoly is good for everyone on the inside, but what about everyone on the outside? Do outsize profits come at the expense of the rest of society? Actually, yes: Profits come out of customers’ wallets, and monopolies deserve their bad reputation—but only in a world where nothing changes.

In a static world, a monopolist is just a rent collector. If you corner the market for something, you can jack up the price; others will have no choice but to buy from you. Think of the famous board game: Deeds are shuffled around from player to player, but the board never changes. There is no way to win by inventing a better kind of real-estate development. The relative values of the properties are fixed for all time, so all you can do is try to buy them up.

But the world we live in is dynamic: We can invent new and better things. Creative monopolists give customers more choices by adding entirely new categories of abundance to the world. Creative monopolies aren’t just good for the rest of society; they’re powerful engines for making it better.

7. Rivalry Causes us to Copy the Past

Marx and Shakespeare provide two models that we can use to understand almost every kind of conflict.

According to Marx, people fight because they are different. The proletariat fights the bourgeoisie because they have completely different ideas and goals (generated, for Marx, by their very different material circumstances). The greater the difference, the greater the conflict.

To Shakespeare, by contrast, all combatants look more or less alike. It’s not at all clear why they should be fighting since they have nothing to fight about. Consider the opening to Romeo and Juliet: “Two households, both alike in dignity.” The two houses are alike, yet they hate each other. They grow even more similar as the feud escalates. Eventually, they lose sight of why they started fighting in the first place.

In the world of business, at least, Shakespeare proves the superior guide. Inside a firm, people become obsessed with their competitors for career advancement. Then the firms themselves become obsessed with their competitors in the marketplace. Amid all the human drama, people lose sight of what matters and focus on their rivals instead.

[…]

Rivalry causes us to overemphasize old opportunities and slavishly copy what has worked in the past.

8. Last can be First

You’ve probably heard about “first mover advantage”: if you’re the first entrant into a market, you can capture significant market share while competitors scramble to get started. That can work, but moving first is a tactic, not a goal. What really matters is generating cash flows in the future, so being the first mover doesn’t do you any good if someone else comes along and unseats you. It’s much better to be the last mover – that is, to make the last great development in a specific market and enjoy years or even decades of monopoly profits.

Chess grandmaster José Raúl Capablanca put it well: to succeed, “you must study the endgame before everything else.”

Zero to One is full of counter-intuitive insights that will help your thinking and ignite possibility.

Steve Jobs on Creativity

“Originality depends on new and striking combinations of ideas.”
— Rosamund Harding


In a beautiful article for The Atlantic, Nancy Andreasen, a neuroscientist who has spent decades studying creativity, writes:

[C]reative people are better at recognizing relationships, making associations and connections, and seeing things in an original way—seeing things that others cannot see. … Having too many ideas can be dangerous. Part of what comes with seeing connections no one else sees is that not all of these connections actually exist.

The same point of view is offered by James Webb Young, who, many years earlier, wrote:

An idea is nothing more nor less than a new combination of old elements [and] the capacity to bring old elements into new combinations depends largely on the ability to see relationships.

A lot of creative luminaries think about creativity in the same way. Steve Jobs had a lot to say about creativity.

In I, Steve: Steve Jobs in His Own Words, editor George Beahm draws on more than 30 years of media coverage of Steve Jobs in order to find Jobs’ most thought-provoking insights on many aspects of life and creativity.

In one particularly notable excerpt Jobs says:

Creativity is just connecting things. When you ask creative people how they did something, they feel a little guilty because they didn’t really do it, they just saw something. It seemed obvious to them after a while. That’s because they were able to connect experiences they’ve had and synthesize new things. And the reason they were able to do that was that they’ve had more experiences or they have thought more about their experiences than other people. Unfortunately, that’s too rare a commodity. A lot of people in our industry haven’t had very diverse experiences. So they don’t have enough dots to connect, and they end up with very linear solutions without a broad perspective on the problem. The broader one’s understanding of the human experience, the better design we will have.

The more you learn, the more connections you can make. This becomes an argument for a broad-based education. In his 2005 Stanford commencement address, Jobs makes the case for learning things that, at the time, may not offer the most practical benefit. Over time, however, these things add up to give you a broader base of knowledge from which to connect ideas:

Throughout the campus every poster, every label on every drawer, was beautifully hand calligraphed. Because I had dropped out and didn’t have to take the normal classes, I decided to take a calligraphy class to learn how to do this. I learned about serif and sans serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great. It was beautiful, historical, artistically subtle in a way that science can’t capture, and I found it fascinating.

None of this had even a hope of any practical application in my life. But ten years later, when we were designing the first Macintosh computer, it all came back to me.

While education is important for building up a repository from which you can connect things, it’s not enough. You need broad life experiences as well.

I, Steve: Steve Jobs in His Own Words is full of things that will make you think.

Paula Scher on Process versus Outcome

Iconic typography designer Paula Scher discusses her creative process, including the famous Citi logo. Interestingly, the idea came to her in seconds, and that presented a problem for the client: they wanted to buy a process, not an outcome. Scher’s process is very much one of combinatory creativity, whereby she combines existing things in new ways.

A lot of clients like to buy process. It’s like they think they are not getting their money’s worth because I solved it too fast.

[…]

How can it be that you talk to someone and it’s done in a second? But it IS done in a second — it’s done in a second and 34 years. It’s done in a second and every experience, and every movie, and everything in my life that’s in my head.

This reminds me of an old story with many variations. Here is one version.

A giant ship engine failed. The ship’s owners tried one expert after another, but none of them could figure out how to fix the engine.

Then they brought in an old man who had been fixing ships since he was a young [boy]. He carried a large bag of tools with him, and when he arrived, he immediately went to work. He inspected the engine very carefully, top to bottom.

Two of the ship’s owners were there, watching this man, hoping he would know what to do. After looking things over, the old man reached into his bag and pulled out a small hammer. He gently tapped something. Instantly, the engine lurched into life. He carefully put his hammer away. The engine was fixed!

A week later, the owners received a bill from the old man for ten thousand dollars.

“What?!” the owners exclaimed. “He hardly did anything!” So they wrote the old man a note saying, “Please send us an itemised bill.”

The man sent a bill that read:
Tapping with a hammer………………….. $ 2.00
Knowing where to tap…………………….. $ 9,998.00

Effort is important, but knowing where to make an effort makes all the difference!

This mini interview prompted me to order a copy of Scher’s Make It Bigger, which looks at graphic design from the vantage point of business.
