A great short video on the science of procrastination and the role of hyperbolic discounting.
Basically, when we procrastinate, we often choose things like video games, Facebook, Twitter, and even email. These options are attractive because they provide small, quick dopamine rewards, unlike the task we're avoiding, which likely offers only a one-time reward far in the future.
Human motivation is highly influenced by how imminent a reward is perceived to be: the further away the reward, the more we discount its value. This is often referred to as present bias, or hyperbolic discounting.
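The mechanics can be made concrete with the standard hyperbolic model, in which a reward's perceived value decays with delay as V = A / (1 + kD). A minimal sketch; the discount rate k and the amounts are illustrative assumptions, not figures from the video:

```python
# Hyperbolic discounting: perceived value falls with delay as V = A / (1 + k*D).
# The discount rate k and the reward amounts below are made-up illustrative values.

def hyperbolic_value(amount, delay_days, k=0.1):
    """Perceived present value of a reward received after `delay_days`."""
    return amount / (1 + k * delay_days)

# A small reward now vs. a larger reward months away:
now = hyperbolic_value(10, 0)        # the quick dopamine hit, at full value
later = hyperbolic_value(100, 180)   # a big but distant payoff, heavily discounted
print(now, later)
```

Under these toy numbers the distant $100 "feels" worth about $5, less than the immediate $10, which is exactly the preference reversal that makes procrastination so easy.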
What makes a genius is a story that never gets told, argues Adam Westbrook, the creative mind behind The Man Who Turned Paper Into Pixels. Is there a single thread that connects history's greatest achievers? "Well," Westbrook argues, "actually it's pretty simple, but it's the complete opposite to how we think today."
It’s about the difficult years. In his book Mastery, which studies the patterns of history’s greatest achievers, Robert Greene describes this period as:
A largely self-directed apprenticeship that lasts some five to ten years [and] receives little attention because it does not contain stories of great achievement or discovery.
Michael Faraday, who was brought to our attention in Faraday, Maxwell, and the Electromagnetic Field, a recent book recommendation from Charlie Munger, worked as a lab assistant for seven years before he was even allowed to do his own experiments. “Stephen King wrote every day for nine years before he even sold his first novel. And John Coltrane practiced the saxophone every day for 17 years before he got his first big hit.”
This sounds an awful lot like hard work. Today people think that genius and success are instantaneous and easy, yet this is the case only when extreme luck is involved. In the vast majority of cases it involves a struggle that we never see — the hours of deliberate practice, the failed businesses, the long nights writing, the paintings that failed to please clients. We never read about the struggle. And while there are no assurances that with struggle comes reward, without it the odds are lower.
Iconic designer Paula Scher discusses her creative process, including the famous Citi logo. Interestingly, the idea came to her in seconds, and that presented a problem for the client: they wanted to buy a process, not an outcome. Scher's process is very much one of combinatory creativity, whereby she combines existing things in new ways.
A lot of clients like to buy process. It’s like they think they are not getting their money’s worth because I solved it too fast.
How can it be that you talk to someone and it’s done in a second? But it IS done in a second — it’s done in a second and 34 years. It’s done in a second and every experience, and every movie, and everything in my life that’s in my head.
This reminds me of an old story with many variations. Here is one version.
A giant ship's engine failed. The ship's owners tried one expert after another, but none of them could figure out how to fix the engine.
Then they brought in an old man who had been fixing ships since he was a young [boy]. He carried a large bag of tools with him, and when he arrived, he immediately went to work. He inspected the engine very carefully, top to bottom.
Two of the ship’s owners were there, watching this man, hoping he would know what to do. After looking things over, the old man reached into his bag and pulled out a small hammer. He gently tapped something. Instantly, the engine lurched into life. He carefully put his hammer away. The engine was fixed!
A week later, the owners received a bill from the old man for ten thousand dollars.
“What?!” the owners exclaimed. “He hardly did anything!” So they wrote the old man a note saying, “Please send us an itemised bill.”
The man sent a bill that read:
Tapping with a hammer ………………………… $2.00
Knowing where to tap ……………………………. $9,998.00
*Effort is important, but knowing where to make an effort makes all the difference!*
This mini interview prompted me to order a copy of Scher’s Make It Bigger, which looks at graphic design from the vantage point of business.
Claude Shannon is the most important man you’ve probably never heard of. If Alan Turing is to be considered the father of modern computing, then the American mathematician Claude Shannon is the architect of the Information Age.
The video, created by the British filmmaker Adam Westbrook, echoes Nassim Taleb's point that boosting the signal does not remove the noise; in fact, quite the opposite: you amplify it.
Any time you try to send a message from one place to another, something always gets in the way. The original signal is always distorted. Wherever there is signal there is also noise.
So what do you do? Well, the best anyone could do back then was to boost the signal. But then all you do is boost the noise.
The thing is, we were thinking about information all wrong. We were obsessed with what a message meant.
A Renoir and a receipt? They’re different, right? Was there a way to think of them in the same way? Like so many breakthroughs the answer came from an unexpected place. A brilliant mathematician with a flair for blackjack.
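The amplification problem Westbrook describes has a simple numeric form: an analog repeater scales signal and noise by the same gain, so the signal-to-noise ratio never improves. A toy illustration with made-up numbers:

```python
# Analog amplification scales signal and noise by the same gain, so the
# signal-to-noise ratio (SNR) is unchanged -- "all you do is boost the noise."
# These values are invented purely for illustration.
signal, noise = 1.0, 0.25
gain = 10

snr_before = signal / noise
snr_after = (gain * signal) / (gain * noise)
print(snr_before, snr_after)  # identical: amplification bought us nothing
```

Shannon's insight was to sidestep the ratio entirely by recoding the message into discrete symbols that can be regenerated exactly, rather than amplified approximately.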
The transistor was invented in 1948 at Bell Telephone Laboratories. This remarkable achievement, however, “was only the second most significant development of that year,” writes James Gleick in his fascinating book The Information: A History, a Theory, a Flood. The most important development of 1948, and the one that still underpins modern technology, is the bit.
An invention even more profound and more fundamental came in a monograph spread across seventy-nine pages of The Bell System Technical Journal in July and October. No one bothered with a press release. It carried a title both simple and grand, “A Mathematical Theory of Communication,” and the message was hard to summarize. But it was a fulcrum around which the world began to turn. Like the transistor, this development also involved a neologism: the word bit, chosen in this case not by committee but by the lone author, a thirty-two-year-old named Claude Shannon. The bit now joined the inch, the pound, the quart, and the minute as a determinate quantity—a fundamental unit of measure.
But measuring what? “A unit for measuring information,” Shannon wrote, as though there were such a thing, measurable and quantifiable, as information.
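Shannon's unit can be made concrete: the information content of a message source is its entropy, H = -Σ p·log2(p), measured in bits. A minimal sketch of that measure:

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: each toss carries exactly one bit.
print(entropy_bits([0.5, 0.5]))
# A heavily biased coin is nearly predictable, so each toss carries far less.
print(entropy_bits([0.99, 0.01]))
```

This is the sense in which a Renoir and a receipt can be treated alike: both are just sources of symbols with probabilities, and the bit measures the uncertainty they resolve, regardless of meaning.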
Shannon’s theory made a bridge between information and uncertainty; between information and entropy; and between information and chaos. It led to compact discs and fax machines, computers and cyberspace, Moore’s law and all the world’s Silicon Alleys. Information processing was born, along with information storage and information retrieval. People began to name a successor to the Iron Age and the Steam Age.
Gleick also recounts the relationship between Turing and Shannon:
In 1943 the English mathematician and code breaker Alan Turing visited Bell Labs on a cryptographic mission and met Shannon, sometimes over lunch, where they traded speculation on the future of artificial thinking machines. (“Shannon wants to feed not just data to a Brain, but cultural things!” Turing exclaimed. “He wants to play music to it!”)
Commenting on vitality of information, Gleick writes:
[Information] pervades the sciences from top to bottom, transforming every branch of knowledge. Information theory began as a bridge from mathematics to electrical engineering and from there to computing. … Now even biology has become an information science, a subject of messages, instructions, and code. Genes encapsulate information and enable procedures for reading it in and writing it out. Life spreads by networking. The body itself is an information processor. Memory resides not just in brains but in every cell. No wonder genetics bloomed along with information theory. DNA is the quintessential information molecule, the most advanced message processor at the cellular level—an alphabet and a code, 6 billion bits to form a human being. “What lies at the heart of every living thing is not a fire, not warm breath, not a ‘spark of life,’” declares the evolutionary theorist Richard Dawkins. “It is information, words, instructions. … If you want to understand life, don’t think about vibrant, throbbing gels and oozes, think about information technology.” The cells of an organism are nodes in a richly interwoven communications network, transmitting and receiving, coding and decoding. Evolution itself embodies an ongoing exchange of information between organism and environment.
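Gleick's "6 billion bits" follows from treating DNA as a four-letter alphabet: each base is one of {A, C, G, T}, so it carries log2(4) = 2 bits, and the human genome runs to roughly 3 billion base pairs. A back-of-the-envelope check:

```python
import math

bits_per_base = math.log2(4)        # four possible bases (A, C, G, T) -> 2 bits each
genome_base_pairs = 3_000_000_000   # ~3 billion base pairs in the human genome
total_bits = bits_per_base * genome_base_pairs
print(total_bits)                   # roughly the "6 billion bits" in the quote
```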
The bit is the very core of the information age.
The bit is a fundamental particle of a different sort: not just tiny but abstract—a binary digit, a flip-flop, a yes-or-no. It is insubstantial, yet as scientists finally come to understand information, they wonder whether it may be primary: more fundamental than matter itself. They suggest that the bit is the irreducible kernel and that information forms the very core of existence.
In the words of John Archibald Wheeler, the last surviving collaborator of both Einstein and Bohr, information gives rise to “every it—every particle, every field of force, even the spacetime continuum itself.”
This is another way of fathoming the paradox of the observer: that the outcome of an experiment is affected, or even determined, when it is observed. Not only is the observer observing, she is asking questions and making statements that must ultimately be expressed in discrete bits. “What we call reality,” Wheeler wrote coyly, “arises in the last analysis from the posing of yes-no questions.” He added: “All things physical are information-theoretic in origin, and this is a participatory universe.” The whole universe is thus seen as a computer—a cosmic information-processing machine.
The greatest gift of Prometheus to humanity was not fire after all: “Numbers, too, chiefest of sciences, I invented for them, and the combining of letters, creative mother of the Muses’ arts, with which to hold all things in memory.”
Information technologies are relative to the time in which they were created, yet absolute in their significance. Gleick writes:
The alphabet was a founding technology of information. The telephone, the fax machine, the calculator, and, ultimately, the computer are only the latest innovations devised for saving, manipulating, and communicating knowledge. Our culture has absorbed a working vocabulary for these useful inventions. We speak of compressing data, aware that this is quite different from compressing a gas. We know about streaming information, parsing it, sorting it, matching it, and filtering it. Our furniture includes iPods and plasma displays, our skills include texting and Googling, we are endowed, we are expert, so we see information in the foreground. But it has always been there. It pervaded our ancestors’ world, too, taking forms from solid to ethereal, granite gravestones and the whispers of courtiers. The punched card, the cash register, the nineteenth-century Difference Engine, the wires of telegraphy all played their parts in weaving the spiderweb of information to which we cling. Each new information technology, in its own time, set off blooms in storage and transmission. From the printing press came new species of information organizers: dictionaries, cyclopaedias, almanacs—compendiums of words, classifiers of facts, trees of knowledge. Hardly any information technology goes obsolete. Each new one throws its predecessors into relief. Thus Thomas Hobbes, in the seventeenth century, resisted his era’s new-media hype: “The invention of printing, though ingenious, compared with the invention of letters is no great matter.” Up to a point, he was right. Every new medium transforms the nature of human thought. In the long run, history is the story of information becoming aware of itself.
Here he talks to Bill Moyers about our mysterious universe and whether faith and science can be reconciled. This is part three of a three-part series: the first part is on the new cosmos and the second on science literacy.
In an interesting moment he touches on why, despite Google Street View and online tours, there is no substitute for the real thing.
If you tour the Air and Space Museum in Washington, which has the history of flight, including space flight … (The) museum people could have made an exact replica of the Apollo 11 command module that went to the moon. And then we’d say, here is an exact replica … so that’s OK. But if I now say this actual thing went to the moon, intellectually that means something different to you. Your eyes see exactly the same thing. You can make a replica … with all the blemishes and the heat shield damage, but if you know it is the real thing, the meaning is magnified.