Cal Newport's Blog
July 29, 2020
The Bit Player Who Changed the World
In 1937, at the precocious age of 21, an MIT graduate student named Claude Shannon had one of the most important scientific epiphanies of the century. To explain it requires some brief background.
Before coming to MIT, Shannon earned two bachelor’s degrees at the University of Michigan: one in mathematics and one in electrical engineering. The former degree exposed him to Boolean algebra, a somewhat obscure branch of mathematical logic, developed in the mid-nineteenth century by a self-taught English mathematician named George Boole. This new algebra took propositional logic, a fuzzy-edged field of rhetorical inquiry that dated back to the Stoic logicians of the 3rd century BC, and cast it into clean equations that could be mechanically optimized using the tools of modern mathematics.
Shannon’s degree in electrical engineering, by contrast, exposed him to the design of electrical circuits — an endeavor that in the 1930s still required a healthy dollop of intuition and art. Given a specification for a circuit, the engineer would tinker until he got something that worked. (Thomas Edison, for example, was particularly gifted at this type of intuitive electrical construction.)
In 1937, in the brain of this 21-year-old, these two ideas came together.
Boolean logic, Shannon realized, could be used to transform the art of designing electrical circuits into something more formal. Instead of starting from a qualitative description of what a circuit needed to accomplish, and then tinkering until you came up with a workable solution, you could instead capture the goal as a logic equation, and then apply algebraic rules to improve it, before finally translating your abstract symbols back into concrete wires and resistors.
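To make the move concrete for modern readers, here is a rough sketch in Python (my illustration, not Shannon’s notation): a redundant two-relay design reduces, by Boolean algebra, to a single switch, and an exhaustive truth-table check confirms the two circuits are interchangeable.

```python
from itertools import product

def redundant_circuit(a: bool, b: bool) -> bool:
    # Two relay paths: one closes when A and B are both on,
    # the other when A is on and B is off.
    return (a and b) or (a and not b)

def simplified_circuit(a: bool, b: bool) -> bool:
    # Boolean algebra collapses the expression:
    # (A AND B) OR (A AND NOT B) = A AND (B OR NOT B) = A
    return a

# Check every possible switch setting: the two designs behave
# identically, so the cheaper one can replace the original.
assert all(
    redundant_circuit(a, b) == simplified_circuit(a, b)
    for a, b in product([False, True], repeat=2)
)
print("Both circuit descriptions compute the same function.")
```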
This insight was more than just a parlor trick. As Jimmy Soni and Rob Goodman note in their fantastic 2017 biography, A Mind at Play: How Claude Shannon Invented the Information Age, “circuit design was, for the first time, a science.” As Soni and Goodman elaborate, Shannon had done more than just simplify the job of wire-soldering engineers. He had also introduced a breakthrough idea: that metal and electron circuits could implement arbitrary logic. As Walter Isaacson summarized in The Innovators, “[this became] the basic concept underlying all digital computers.”
Shannon published these leaps in his master’s thesis, which he gave an unassuming title, “A Symbolic Analysis of Relay and Switching Circuits.” Nearly seventy years later, as I was writing my own master’s thesis at MIT, Shannon’s shadow still loomed large.
I’m recounting this story for two reasons. First, I’m a fan of Shannon, and think he doesn’t get enough credit. His contributions arguably dwarf those of his contemporary, Alan Turing, whom Shannon later briefly met when their wartime cryptanalysis efforts overlapped.
Second, and more specifically, I bring him up because a brand new documentary about Shannon, called The Bit Player, was just released on Amazon Prime. It’s directed by Mark Levinson, whose work I admire, and is based in part on the book by Soni and Goodman that I also admire. Needless to say, I’m excited to watch it, and thought many of you might be as well.


July 23, 2020
On Confronting the Productivity Dragon (take 2)
On a recent episode of my podcast, Deep Questions, a listener asked me what to do when one feels overwhelmed with incoming tasks, requests, and ambiguous obligations — a problem that has become unfortunately common in our current period of largely remote and persistently frenzied work.
The temptation in such moments is to curl up as the onslaught engulfs you: perhaps answering the most recent emails to arrive, or tackling a sampling of tasks that seem particularly urgent, but otherwise just hoping the rest will dissipate.
In the mythology of your professional life, in other words, you decline to confront the dragon, and instead put up a half-hearted warning sign, or rage to anyone in earshot about the unfairness of the dragon’s existence in the first place.
My advice was to resist this temptation.
I told the listener to instead confront the dragon. Jot down every loop that opens, whether it comes via email, or a phone call, or a Zoom meeting, or Slack. Because these loops might emerge rapidly, use a minimalist tool with incredibly low friction. I recommended a simple plain text file on your computer in which you can record incoming obligations at the speed of typing (a strategy I elaborate in this vintage post).
Then, at the beginning of each day, before the next onslaught begins, process these tasks into your permanent system. In doing so, as David Allen recommends, clarify them: what exactly is the “next action” this task requires? Stare at this collection before getting started with your work.
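For illustration only (the post prescribes a plain text file, not any particular script, so the file name and helper below are hypothetical), the capture step can be as frictionless as appending a timestamped line:

```python
from datetime import datetime
from pathlib import Path

# Hypothetical location for the plain text capture file.
CAPTURE_FILE = Path.home() / "capture.txt"

def capture(obligation: str) -> None:
    """Append one incoming obligation to the capture file."""
    stamp = datetime.now().strftime("%Y-%m-%d %H:%M")
    with CAPTURE_FILE.open("a", encoding="utf-8") as f:
        f.write(f"{stamp}  {obligation}\n")

# Example: jot down a loop the moment it opens, next action included.
capture("Reply to budget email -- next action: draft the numbers")
```

The details don’t matter; what matters is that recording an obligation takes seconds, so the capture habit can keep pace with the onslaught.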
It’s quite possible that the list will be terrifying — way more assignments and activities than you can ever hope to accomplish in time. But you should still confront it. Quantify the impossibility of your load. Visualize its contours. Walk into the cave, shield raised, prepared to face what lurks.
I can offer three justifications for this recommendation:
1. As David Allen argues, obligations that are kept only in your head cause stress and drain mental resources. An overwhelming number of tasks captured in a system that you regularly review will generate a fraction of the angst spawned by trying to instead pretend that those same tasks don’t exist.
2. Quantifying the impossibility of your assignments makes it much easier to argue for change. When you instead just battle your inbox all day, switching haphazardly between the easy and unavoidably urgent, you can convince yourself that you’re simply busy and need to hustle harder. Enumerating the absurd quantity of these demands will sharpen your conviction that something has to give.
3. You can optimize. If you have 400 tasks on your list, there’s no way you can accomplish them all in a single day. But if you can see all 400 obligations in one place, then you can choose the five or six that will have the biggest impact. This is almost certainly better than just jumping on whatever caught your attention most recently.
In summary, I told this podcast listener not to mistake the systems with which he organizes his work for the actual quantity of work with which he has been burdened. Abandoning the former won’t reduce the latter; it will only make its metaphorical fiery breath burn all the hotter.
#####
Speaking of productivity, one of my favorite productivity writers, Laura Vanderkam, just published a new ebook original titled The New Corner Office: How the Most Successful People Work from Home. I couldn’t think of a more relevant book for the frustrated many, like my overwhelmed podcast listener, currently struggling with the “new normal” of the remote workplace.


July 15, 2020
On Deep Work Tents and the Struggle for Focus in an Age of Social Distance
Jessica Murnane is a wellness advocate, writer, and podcaster who interviewed me on her show not long ago. Earlier this year she signed a deal with Penguin Random House to write a new book. This was great news, except for one wrinkle: the coronavirus.
“Writing a book during a pandemic was one of the most challenging things I have ever done,” she told me. Like many working parents during the past few months, she was trying to balance homeschool with the need to accomplish serious, mind-stretching deep work, all without any easy means of finding some peace and quiet.
So Jessica went to an extreme: she set up a beach tent in her backyard, so she could work outside without the sun glaring on her laptop screen. She’s not alone in this innovation: I can think of at least two other people I personally know well who deployed similar tent setups in their yards for similar purposes.
I mention this story to emphasize a point that sometimes gets lost in our technical discussions, both here and on my podcast, about neuro-productivity, workflows, and the deep life: it’s been really, really hard to get things done recently. To the point where we’re setting up tents in our yards.
I strongly believe, however, that it’s still important to push back on these challenges and do our best to structure our obligations, and build our time blocks, and prioritize our deep work to the extent possible. From both a mental health and professional longevity perspective, straining to optimize a hard situation is still better than just giving in to chaos. This commitment might also set you up well to sprint ahead when things eventually, inevitably return to normal, as has happened after every pandemic in the history of humankind.
But we should also all give ourselves a break. This sucks. It will get better. We should keep striving to do our best until then.
In the meantime, if you need me, I’m trying to find an extension cord long enough to plug in the margarita maker I just hauled out to the tent I set up behind my backyard azaleas.
July 7, 2020
Has the Shift Toward Neuro-Productivity Already Begun?
A reader recently pointed me toward an interesting new feature Microsoft added to its widely used Outlook email and calendar software: support for deep work.
Outlook users can now create a personal “focus plan” that measures how many hours they’re dedicating to undistracted work, and can automatically schedule these blocks. Though the tool uses the term “focus time” to label these efforts on your calendar, it also directly uses the term “deep work” in its interface when describing what it’s helping you accomplish.
This is an important shift.
In the first decades of digital knowledge work, human productivity was often viewed through a computer processor metaphor. People were understood as unbounded processors and the goal was to leverage technology to get them as much useful information as possible, with the least amount of friction. In this metaphor, getting more done meant getting more information through the pipeline.
As I’ve been arguing since at least 2016, this is not a useful frame. Humans are not unbounded computer processors; we are instead quirky bundles of neurons with operational properties much different from silicon. If you really want to get more done, you have to understand how human brains actually function, and then arrange for work processes that complement these realities.
Once you start such examinations, one of the most obvious findings is that human brains are not great at context switching. If you want to perform cognitively demanding work, you need to arrange a setting in which your brain can spend time focused on that one task without needing to consider emails, or Slack messages, or online news, or Zoom calls, at the same time. An office that segregates deep from shallow work, therefore, should produce more high-value output in the same number of total hours.
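Microsoft doesn’t document how its focus plan picks time slots, but a toy sketch (the workday bounds and minimum block length below are my assumptions) shows the basic idea of protecting whatever gaps remain between meetings for deep work:

```python
from datetime import datetime, time, timedelta

WORKDAY_START, WORKDAY_END = time(9, 0), time(17, 0)
MIN_FOCUS = timedelta(hours=1)  # ignore gaps too short for deep work

def focus_blocks(meetings, day=datetime(2020, 7, 15)):
    """Given (start, end) meeting times, return the open gaps worth protecting."""
    cursor = datetime.combine(day.date(), WORKDAY_START)
    end_of_day = datetime.combine(day.date(), WORKDAY_END)
    blocks = []
    for start, end in sorted(meetings):
        start_dt = datetime.combine(day.date(), start)
        end_dt = datetime.combine(day.date(), end)
        if start_dt - cursor >= MIN_FOCUS:
            blocks.append((cursor.time(), start))
        cursor = max(cursor, end_dt)
    if end_of_day - cursor >= MIN_FOCUS:
        blocks.append((cursor.time(), WORKDAY_END))
    return blocks

# A day with two meetings leaves three blocks that can be reserved for focus.
print(focus_blocks([(time(11, 0), time(12, 0)), (time(14, 30), time(15, 0))]))
```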
Given the huge potential productivity gains inherent in a shift toward this neuro-productivity approach, I’ve been convinced that it’s only a matter of time before large knowledge work players begin to formalize these ideas. Microsoft’s addition of the focus plan feature to Outlook implies that this shift may very well have already begun.
July 2, 2020
A Deliberate Tribute
I was saddened to learn earlier today that Anders Ericsson, creator of deliberate practice theory, recently passed away. Longtime readers of mine know that his work greatly influenced me. I never met Anders in person, but we shared a sporadic correspondence that I cherished. I thought it appropriate to offer a brief personal tribute to his powerful ideas.
Anders tackled the fundamental question of how experts get really good at what they do. The framework he proposed, which clarified a lot of confusion in the field at the time, introduced these two big ideas (among others):
1. When trying to get better at a skill, an effort called “deliberate practice” is most effective. Deliberate practice, which aims to isolate areas that need improvement and then stretch you past your comfort zone to induce growth, is the critical activity that helps individuals move past amateur status in many endeavors, both physical and cognitive.
2. To reach an expert level often requires a lot of deliberate practice. In some of Anders’s more engaging studies, he would sift through accounts of so-called “prodigies”, and identify, time and again, prodigious quantities of deliberate practice surreptitiously squeezed into their early childhood years. As his New York Times obituary recalls, Anders once summarized this finding as follows in an interview: “This idea that somebody more or less discovers, suddenly, that they’re extremely good at something, I’ve yet to find even a single example of that type of phenomenon.”
I first came across Anders’s work in Geoff Colvin’s 2008 book, Talent Is Overrated, which blew my mind and led to a deep dive into deliberate practice theory. It provided an antidote to an increasingly frenetic, digitally mediated world, where everyone was trying to find their passion or somehow transmute social media busyness into accomplishment. It explained a lot about what seemed to resonate for me when I reflected on my own life, or surveyed those I admired around me at MIT or in the biographies of big thinkers I was devouring at the time.
The theory laid the foundations in my own writing for the idea that the type of work you’re doing matters (elaborated in Deep Work), and that meaningful accomplishment often requires the diligent application of such efforts over a long period of time (elaborated in So Good They Can’t Ignore You).
As with many big theories, the implications of Anders’s ideas were sometimes pushed to unsustainable extremes. In Outliers, for example, Malcolm Gladwell deployed these concepts to argue for an over-simplified egalitarian utopia in which all significant achievements are due to incidental environmental factors that enable rapid deliberate practice acquisition. (Anders may have egged Gladwell on, as Anders was known to enjoy advancing extreme versions of his theories; though given the good-natured manner with which he approached subsequent debate, I always suspected that this tendency was in part driven by a Socratic impulse to generate progress through the dialectical clash of opposing conceptions.)
In recent years, deliberate practice theory has continued to evolve. Most contemporary thinking on expert performance looks beyond practice accumulation alone to explain high achievement, drawing on factors such as trainability, innate physiological advantage, and the complex and murky psychological cocktail we often summarize as “drive.”
There’s also an increased consideration of the type of skill being mastered. If there are clear-cut rules and feedback, like when learning chess or golf, the application and advantages of deliberate practice are clear. In other pursuits, however, such as the ambiguous, semi-creative, semi-administrative efforts that define knowledge work, designing appropriate practice can be maddeningly difficult and its rewards less immediately obvious. (Though, as I argue in So Good, this is a challenge worth undertaking, as its difficulty scares most people off, leaving a huge competitive advantage for the few willing to apply a deliberate approach to their office-bound skills.)
In the end, however, Anders’s key ideas — that the type and quantity of practice matters a lot — remain widely accepted. He transformed our understanding of the world from a frustratingly unobtainable vision in which people stumbled into their prodigious talent and lived happily ever after, into a more democratized and tractable reality; one in which your abilities are mutable, and disciplined diligence — though perhaps unable to transform you into the next Tiger Woods — will almost always push your skills to a place where they can do you some real good.
Anders will be missed. His ideas will not be forgotten.
June 26, 2020
On the Exceptionalism of Books in an Age of Tweets
Early in his 1994 essay collection, The Gutenberg Elegies, literary critic Sven Birkerts tells a story about his experience teaching an undergraduate course on short stories. He started his students easy, with some Washington Irving, then moved on to Hawthorne and Poe before arriving at Henry James.
It was here that his class “derailed.”
He tried to solicit opinions on the story he’d selected, but came up short. “My students could barely muster the energy for a thumbs-up or -down,” he writes. “It was as though some pneumatic pump had sucked out the last dregs of their spirits.”
As he probed, Birkerts realized the issue wasn’t localized; it wasn’t just the vocabulary, or the diction, or the specific references. The root went deeper:
“They were not, with few exceptions, readers — never had been; that they had always occupied themselves with music, TV, and videos; they had difficulty slowing down enough to concentrate on prose of any density.”
As he reflected on this reality he came to realize that the implications were “staggering.” This was not just a “generational disability,” but instead a “permanent turn” in the human endeavor.
It’s easy to dismiss such sentiments as nostalgia: everything changes; it’s reactionary to become too enthralled with any particular aging technology. But Birkerts convincingly argues for a literary exceptionalism of sorts:
“For in fact, our entire collective subjective history — the soul of our societal body, is encoded in print. Is encoded, and has for countless generations been passed along by way of the word, mainly through books.”
As I elaborated in last week’s episode of my podcast, Neil Postman argues that it was the introduction of mass-produced longform writing that really unleashed human potential — ushering in the modes of critical, analytical understanding that birthed both the Enlightenment and the Scientific Revolution, the foundations of modernity. It allowed us to efficiently capture complex thought in all its nuance, then build on it, layer after layer, nudging forward human intellectual endeavor.
Writing was not just another technology, in other words, but the cognitive lodestone that attracted all advances that followed.
Which is why Birkerts was troubled in the early 1990s to see an emergent electronic culture destabilize this medium.
It’s also why in 1985, Neil Postman described a similar ominous premonition as he surveyed the impact of television.
And it’s why today, as I see more of our political and philosophical discourse mediated through Tweets, I despair, but as I also see the emergence of longform audio and the resurgence of audio books, I feel hope.
As I elaborated in my podcast, the medium through which you mediate the world matters. An app on your phone can offer you diversion or fleeting catharsis. On the other hand, something more textually substantial — though perhaps, as Birkerts’s students discovered, more difficult to consume — can often offer true progress.
June 19, 2020
On Social Media and Character
Madison Fischer, a professional sport climber, recently pointed me toward an insightful essay she published on her blog about her battle with social media.
Early in her climbing career, Madison was exposed to Instagram. At first she posted pictures of her cat; then pictures of competitions; then her training; then she had a professional account where she could carefully track the demographics of her viewers, optimize when she posted, and synchronize her online behavior with a carefully calibrated content calendar.
This sudden influencer status was impossibly appealing:
“I wanted the congratulations. I wanted admiration. I wanted my follower count to grow. I wanted everyone to envy my life and achievements. I wanted, no, needed people to tell me I was going places…But you can’t blame me. It’s so easy, so stimulating. It’s not even a statement that you have Instagram, it’s assumed. Everyone’s doing it.”
But something didn’t feel quite right about the increasingly artificial life she was constructing online. Beyond the “obvious egotism” issues, she began to lose touch with her true self: “I started believing this narrative of a girl…living the dream,” she writes, “traveling around the world to compete while finding the time for school, work, and a relationship.”
This became a problem:
“This story blinded me to the many mistakes I had along the way. I couldn’t step out of the reputation… Pride in my accomplishments made me content, and contentedness is poison to a young athlete who has to stay hungry if she wants to stay competitive.”
Madison eventually made a bold decision: she would quit Instagram. As she elaborates, it actually took her months of false starts and failed attempts to get to this place. At first, she tried partial solutions. She would delete the app, but it was still too easy to just Google “Instagram” and log in using her phone’s browser. She unfollowed everyone to empty her feed, but she still felt compelled to compulsively document her life.
So she finally had to get rid of her account altogether. “My exit from social media was a quiet one,” she writes. No big post announcing her decision. No warnings. Just silence. She was free.
It was then that Madison’s athletic career moved to the next level. “There’s nobody I’m here to perform for,” she writes. “I just train and silently work on achieving my own definition of success.”
Without the need to document and promote her daily activities, Madison regained a sense of self-motivation. She was honing her craft for her own reasons.
Three months after going off the grid, she traveled to the biggest event in her competition calendar, the Canadian Open Boulder Nationals. She wasn’t looking at posts showing her competitors preparing, and she wasn’t thinking about how her performance would play online. As a result, she “felt overwhelming ease” and was “able to perform at my capacity.”
She won second place.
What struck me about Madison’s story was not the impact her decision had on her training, or focus, or performance, but instead the way it transformed her character.
Here’s how she describes her mindset heading into Nationals:
“I wanted to see what I could do. Nothing to do with you, or your friends, or the neighbors, or the members at my gym, or my competitors, or family. It was all within, as it should be, and as it has to be.”
This issue is often overlooked when people consider the role of tools like social media in their lives. They consider factors like audience-building, entertainment, discovery and connection, and weigh them against obvious costs like distraction and privacy. But a deeper question lurks beneath this debate: are these services making you a better or worse version of yourself?
######
I want to thank everyone who took on my challenge from earlier this week to donate to a pair of excellent organizations working on police reform. Over 70 of you stepped up and made donations that added up to roughly $7,000. As promised, I matched every dollar.
June 16, 2020
Facebook’s Fatal Flaw?
In Episode 4 of my Deep Questions podcast (posted Monday), a reader named Jessica asked my opinion about the future of social media. I have a lot of thoughts on this issue, but in my response I focused on one point in particular that I’ve been toying with recently: Facebook may have accidentally developed a fatal flaw.
To understand this claim, we have to rewind to the early days of this social platform. The original pitch for Facebook was that it made it easier to connect online with people you knew. The content model was simple: you set up a profile, people you knew set up profiles, and everyone could then check each other’s vacation pictures and relationship statuses.
For this model to be valuable, the people you knew had to also use the service. This is why Mark Zuckerberg focused at first on college campuses. These were closed communities in which it was easy to build up the critical mass of users needed to make Facebook fun.
Once Facebook moved into the range of hundreds of millions of users, competition became difficult. The value of a network with a hundred million users was far more than a hundred times that of a network with a million, as the former was much more likely to connect you with the people you cared about. It was on the strength of this model that Facebook emerged as a powerful social internet monopoly.
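A back-of-the-envelope illustration (the Metcalfe-style framing is mine, not the post’s): the number of possible pairwise connections grows with roughly the square of the user count, so a hundredfold increase in users yields something like a ten-thousandfold increase in potential connections.

```python
def potential_connections(users: int) -> int:
    # Every distinct pair of users is one possible connection: n * (n - 1) / 2.
    return users * (users - 1) // 2

small, large = 1_000_000, 100_000_000
print(f"{potential_connections(small):.2e}")  # roughly 5e+11
print(f"{potential_connections(large):.2e}")  # roughly 5e+15
print(potential_connections(large) // potential_connections(small))  # about 10,000x
```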
The problem, however, was that they weren’t making enough money.
As their IPO loomed, Facebook executives feared that the appeal of checking the profiles of friends and family wasn’t strong enough to get people to use the service all day long. It was an activity you would occasionally do when bored; they needed to find a way to make their platform stickier.
So Facebook did something radical: it blew up its original content model and replaced it with the bottomless scrolling newsfeed. Instead of checking the profiles of friends and family, you now encountered a stream of articles sourced from all over the network, handpicked by optimized statistical algorithms to push your buttons and stoke the fires of the elusive quality known as engagement.
Facebook shifted from connection to distraction; an entertainment giant built on content its users produced for free.
This shift was massively profitable because it significantly increased the time Facebook’s gigantic user base spent on the platform each day. Tapping that blue and gray icon on your phone now promised instant satisfaction, and our days are filled with endless moments where such appeasement is welcome.
The thought that keeps capturing my attention, however, is that perhaps in making this short-term move toward increased profit, Facebook set itself up for long-term trouble.
When this platform shifted from connection to distraction, it abdicated its greatest advantage: network effects. If Facebook’s main pitch is that it’s entertaining, it must then compete with everything else that’s entertaining. This includes podcasts, and YouTube, and streaming video services, not to mention niche long-tail social media platforms that can’t offer you access to your old roommate, but can connect you with a small number of people who are interested in the same things as you. Meanwhile, the social interactions that used to occur on these platforms have moved to more flexible and simpler mediums, like group text messages. Facebook used to be the place where grandparents sought new baby pictures. Today, these images are just as likely to be spread in a nondescript iMessage thread, with no creepy data mining or malicious attention engineering required.
I’m not so sure that a newsfeed made up of posts and links generated by random social media users can compete with this increasingly optimized world of targeted entertainment and streamlined digital socialization. Facebook found a way to grow to a market capitalization of $600 billion, but may have accidentally crippled itself in the process.
Or not. But one thing I know for sure is that it would be myopic to believe that the future of social media is going to look just like it does today.
June 15, 2020
Small Steps
Last week, I asked for your help in identifying organizations that have had some success working on issues surrounding police violence.
My instinct when facing an overwhelming problem is to find at least one place where some improvement is possible, find people who are having success with these improvements, then give them support to help them keep going. Such steps are small in the short term, but they have a way of breaking the complacency of standing still, which in the long term can end up making the difference between transformation and frustration.
Over 60 of you sent me notes, pointed me toward organizations, and provided reading lists. This was massively helpful. I have sorted through this information to identify two organizations in particular (among many) that seem to be having success in this policy area, and that apply the type of data-driven approach I thought might appeal to my audience here:
Campaign Zero
Center for Policing Equity
I encourage you to donate if you can. If you do, forward me the receipt. To the extent I’m able, I’ll match these contributions dollar for dollar.
June 12, 2020
Ancient Complications to Modern Career Advice
In 2012, I published a book titled So Good They Can’t Ignore You. It argued that “follow your passion” was bad career advice. I didn’t claim that passion was a problem, but instead argued that it was too simplistic to assume that the key to career satisfaction was as easy as matching your job to a pre-existing inclination. For many people, this slogan might actually impede their progress down the more complicated path that leads to true satisfaction.
One of the interesting things I uncovered in my research was that the term “follow your passion” didn’t really emerge in the context of career advice until the 1980s. Where did it come from? I argued that two critical trends converged during this period.
First, the unionized industrial work that characterized mid-century American economic growth gave way to a less rooted knowledge sector. Workers who might have previously taken a job at whatever factory happened to be located in their hometown might now be forced to travel cross-country in search of a suitable office position.
For the first time, the question of what you wanted to do for a living became pervasive — a shift captured well by the emergence in the early 1970s of Richard Bolles’ seminal career guide, What Color Is Your Parachute?, one of the original books to help readers identify which professions suit their personality and interests. It’s important to remember that this was a radical notion. “[At the time,] the idea of doing a lot of pen-and-paper exercises in order to take control of your career was regarded as a dilettante’s exercise,” Bolles later explained.
The second force at play was Joseph Campbell, the polymath literature professor who was heavily influenced by Carl Jung and popularized the hero’s journey as a foundational mythology that emerges in many cultures. In 1988, PBS aired a multi-part interview with Campbell hosted by Bill Moyers. This wildly popular series introduced the concept of following your bliss, which Campbell, who read Sanskrit, had adapted from the ancient Hindu notion of ananda, or rapture.
Combine these two forces — a sudden need to figure out what you wanted to do for a living, and Campbell’s mantra — and a strange, secularized, bastardized hybrid emerges: the key to career happiness, we decided as the 80s gave way to the 90s, was to follow your passion.
I was reminded of this history as I recently began reading Stephen Cope’s engaging treatise, The Great Work of Your Life (hat tip to Brett McKay). In this book, Cope dives deep into the classic Hindu scripture, the Bhagavad Gita, which tackles the centrality of ananda, and was almost certainly an influence on Campbell’s blissful bromide.
Cope notes that this text indeed argues for the importance of discovering one’s dharma (calling), and giving it full commitment. What caught my attention, however, was the complexity Cope ascribes to this notion. In the ancient context in which the Bhagavad Gita was first composed, dharma was not something you identified through soul searching about what really lights your fire. As Cope writes:
“In the caste system of ancient India, dharmas were prescribed at birth. Arjuna [one of the two main characters of the Gita] was born into the warrior class. So, he was destined to be a warrior. It was his sacred duty to fight a just war. He never had any choice in the matter, nor was his dharma based on any particular personal qualities.”
As Cope then elaborates, in the traditional culture where this story was told, the very concept of an autonomous “personal self” didn’t exist.
This idea of dharma — or its equivalent — manifesting as a burden or responsibility that one takes on, not an energizing inclination, is common in the many cultural interpretations of the hero’s journey monomyth. Ancient wisdom, in other words, doesn’t so much prescribe that we follow our passion, as it does that we approach with passion the trials and responsibilities placed before us.
Modern career advice may be based on an incomplete translation of the underlying philosophies that sparked its emergence four decades ago. For many, recognizing this reality is empowering. The belief that the world owes you the perfect role for your special unique personality is myopically self-focused and ill-suited to hard times. The alternative notion that the world needs you to offer all that you can is, by comparison, liberating.