Cal Newport's Blog
September 23, 2017
Spend More Time Alone
A Lonely Binge
I recently read three books on the topic of solitude. Two were actually titled Solitude, while the third, and most recently published, was titled Lead Yourself First — which is pitched as a leadership guide, but is actually a meditation on the value of being alone with your thoughts.
This last book resonated with me in part because it was co-authored by a former Army officer and a well-respected federal appellate judge, meaning it’s written with the type of exacting logic and ontological clarity that warms my overly-technical nerd heart.
Style aside, Lead Yourself First makes many interesting points, but there were two lessons in particular that struck me as relevant to the types of things we talk about here. So I thought I would share them:
Lesson #1: The right way to define “solitude” is as a subjective state in which you’re isolated from input from other minds.
When we think of solitude, we typically imagine physical isolation (a remote cabin or mountain top), making it a concept that we can easily push aside as romantic and impractical. But as this book makes clear, the real key to solitude is to step away from reacting to the output of other minds: be it listening to a podcast, scanning social media, reading a book, watching TV or holding an actual conversation. It’s time for your mind to be alone with your mind — regardless of what’s going on around you.
Lesson #2: Regular doses of solitude are crucial for the effective and resilient functioning of your brain.
Spending time isolated from other minds is what allows you to process and regulate complex emotions. It’s the only time you can refine the principles on which you can build a life of character. It’s what allows you to crack hard problems, and is often necessary for creative insight. If you avoid time alone with your brain, your mental life will be much more fragile and much less productive.
Among other impacts, these ideas provide an interesting new perspective on one of my favorite topics: deep work. Not all types of deep work satisfy this definition of solitude, as it’s possible to deeply react to inputs from other minds, such as when you’re trying to make sense of a tough piece of writing or lock into a complicated lecture.
But in general, deep thinking is time spent alone with your mind, and as such it’s just one of many different flavors of solitude — all of which aid human flourishing.
I ended my last book by claiming: “a deep life is a good life.” The authors of Lead Yourself First would rework that claim to read something like: “a life rich in solitude (both at work and at home) is a good life.” In an age where persistent reactivity is possible from the moment you wake up to the moment you fall asleep, this latter formulation is probably one worth spreading.


September 17, 2017
Approach Technology Like the Amish
Kevin Kelly and the Amish
Eight years after dropping out of college to wander Asia, Kevin Kelly returned home to America, bought an inexpensive bike, and made a meandering 5,000-mile journey across the country. As he recalls in his original and insightful 2010 book, What Technology Wants, the “highlight” of the bike tour was “gliding through the tidy farmland of the Amish in eastern Pennsylvania.”
Kelly ended up returning to the Amish on multiple occasions during the years that followed his first encounter, allowing him to develop a nuanced understanding of how these communities approach technology. As he reveals in Chapter 11 of his book, the common idea that the Amish reject all modern technology is a myth. The reality is not only more interesting, but it also has important implications for our current culture.
As Kelly puts it: “In any discussion about the merits of avoiding the addictive grasp of technology, the Amish stand out as offering an honorable alternative.”
Given such a strong endorsement, it seems worthwhile to briefly summarize what Kelly uncovered during these visits to rural Pennsylvania…
The Amish and Technology
“Amish lives are anything but anti-technological,” Kelly writes. “I have found them to be ingenious hackers and tinkers, the ultimate makers and do-it-yourselvers. They are often, surprisingly, pro-technology.”
He explains that the simple notion of the Amish as Luddites vanishes as soon as you approach a standard Amish farm. “Cruising down the road you may see an Amish kid in a straw hat and suspenders zipping by on Rollerblades.”
Some Amish communities use tractors, but only with metal wheels so they cannot drive on roads like cars. Some allow a gas-powered wheat thresher but require horses to pull the “smoking contraption.” Personal phones (cellular or household) are almost always prohibited, but many communities maintain a community phone booth.
Almost no Amish communities allow automobile ownership, but it’s common for Amish to travel in cars driven by others.
Kelly reports that both solar panels and diesel electric generators are common, but it’s usually forbidden to connect to the larger municipal power grid.
Disposable diapers are popular as are chemical fertilizers.
In one memorable passage, Kelly talks about visiting a family that uses a $400,000 computer-controlled precision milling machine to produce pneumatic parts needed by the community. The machine is run by the family’s bonnet-wearing, 10-year-old daughter. It’s housed behind their horse stable.
These observations dispel the common belief that the Amish reject any technology invented after the 19th century. So what’s really going on here?
The Amish, it turns out, do something that’s both shockingly radical and simple in our age of impulsive and complicated consumerism: they start with the things they value most, then work backwards to ask whether a given technology does more harm than good with respect to these values.
As Kelly explains, when a new technology rolls around, there’s typically an “alpha geek” in any given Amish community who will ask the parish bishops for permission to try it out. Usually the bishops will agree. The whole community will then observe this first adopter “intently,” trying to discern the ultimate impact of the technology on the things the community values most.
If this impact is deemed more negative than helpful, the technology is prohibited. Otherwise it’s allowed, but usually with caveats on its use that maximize its positives and minimize its negatives.
The reason most Amish are prohibited from owning cars, for example, has to do with their impact on the social fabric of the community. As Kelly explains:
“When cars first appeared at the turn of the last century, the Amish noticed that drivers would leave the community to go picnicking or sightseeing in other towns, instead of visiting family or the sick on Sundays, or patronizing local shops on Saturday. Therefore the ban on unbridled mobility was intended to make it hard to travel far and to keep energy focused in the local community. Some parishes did this with more strictness than others.”
This also explains why an Amish farmer can own a solar panel but not connect to the power grid. The problem is not electricity; it’s the fact that the grid connects them too strongly to the world outside of their local community, violating the Amish commandment to “be in the world, but not of it.”
The Original Digital Minimalists
I titled this post: Approach Technology Like the Amish. To be clear, I don’t mean that you should adopt the specific values of Amish life, as these are based primarily on their often illiberal and admittedly esoteric religious beliefs.
What I do mean, however, is that you should consider adopting their same thoughtfulness in approaching technology. The Amish are clear about what they value, and new technologies are evaluated by their impact on these values. The key is building a good life — not fretting about missing out on some minor short term pleasure or interesting diversion.
(If we held ourselves to this same standard, I suspect, many fewer people would own Apple Watches.)
Later in this chapter, Kelly asks the key question: “This method works for the Amish, but can it work for the rest of us?”
He then answers: “I don’t know.”
I’m more confident than Kelly. I think something like this method can work for the rest of us, especially once you replace Amish values with your personal values, and the decree of your parish bishops with your own honest self-assessment.
In fact, I even have a name for such a philosophy: digital minimalism.
(Photo by frankieleon)


September 10, 2017
Franklin Foer on Technology’s Surprising Threat to Humanity
Contemplating the Importance of Contemplation
Franklin Foer has a new book coming out this week. It’s titled, World Without Mind: The Existential Threat of Big Tech.
I haven’t read it yet, but this morning, on returning from a family camping trip, I read Foer’s essay in today’s Washington Post and a recent interview with The Verge (as, of course, there’s no better time to contemplate the existential threat of technology than right after a weekend in the woods).
According to the interview in The Verge, Foer writes in the book: “the tech companies are destroying the possibility of contemplation.”
This premise is one I obviously support, having written an entire book on why we should fight to retain our diminishing ability for sustained attention.
But whereas my main concern with digital distraction was limited to issues of personal satisfaction and productivity, Foer, in elaborating on his contemplation quote, goes much broader:
“We’re being dinged, notified, and clickbaited, which interrupts any sort of possibility for contemplation. To me, the destruction of contemplation is the existential threat to our humanity.” [emphasis mine]
In using this strong language, Foer is hitting on an increasingly urgent point that I’ve also seen fruitfully explored in Matt Crawford and Jaron Lanier’s humanist critiques of the attention economy.
Whereas I’m often focused on the immediate practical concerns of new technologies, an increasing number of thinkers like Foer, Crawford and Lanier are exploring a bigger point: when we allow ourselves to be washed away by the latest gadget or app designed to extract some more dollars from our attention, we’re not just losing some time, we’re actually losing something more fundamental about what it means to be an autonomous human.
When you hear an argument enough times, it probably makes sense to start taking it seriously.
(Photo by Alyson Hurt; this is where I was camping this weekend.)
August 31, 2017
Apple’s New Open Office Sparks Revolt
Not Open to Openness
Apple’s new Cupertino headquarters cost $5 billion. One of its prominent features is a massive open office space in which many Apple engineers sit on benches at long shared work tables.
As Apple aficionado John Gruber revealed in a recent episode of his podcast, not everyone is happy with this decision.
“I heard that when floor plans were announced, that there was some meeting with Johny Srouji’s team,” said Gruber, before explaining that Srouji is an important senior vice president in charge of Apple’s custom silicon chips.
Srouji, to put it politely, was not pleased with the idea of moving his team to a cacophonous, distracting, cavernous open office.
As Gruber tells it:
“When he [Srouji] was shown the floor plans, he was more or less just ‘Fuck that, fuck you, fuck this, this is bullshit.’ And they built his team their own building, off to the side on the campus … My understanding is that that building was built because Srouji was like, ‘Fuck this, my team isn’t working like this.’”
To be clear, this story is just a rumor. But it smells right.
Designing silicon is a complicated, painstaking process that requires copious amounts of deep work. Nothing about it is helped by surrounding yourself with unrelenting disruption.
True or not, I like the broader point underscored by this rumor. In knowledge work, your primary capital investment is in human brains. If you’re not careful about the environment you set up for these brains to function in, you cannot expect a good return on investment.
To date, Silicon Valley has tried hard to ignore this reality, chasing vague trends and embracing tired signifiers of innovation instead. But if more and more senior people like Srouji react by saying “Fuck this,” things will change.
#####
(Hat tip to Mike B. who pointed me toward this interview via this Silicon Valley Business Journal article.)
August 26, 2017
Toward a Deeper Vocabulary
When Writing is More than Writing
As a professor who also happens to opine publicly about productivity, I’m often invited to stop by dissertation bootcamps — a semi-annual ritual at many universities where doctoral students gather to hear advice and work long hours on their theses in an atmosphere of communal diligence.
Something that strikes me about these events is the extensive use of the term “writing” to capture the variety of different mental efforts that go into producing a doctoral dissertation; e.g., “make sure you write every day” or “don’t get too distracted from your writing by other obligations.”
The actual act of writing words on paper, of course, is necessary to finish a thesis, but it’s far from the only part of this process. The term “writing,” in this context, is being used as a stand-in for the many different cognitive efforts required to create something worthy of inclusion in the intellectual firmament of your discipline.
In my own academic work, for example, these efforts include the general synthesis of trends in search of new openings, the struggle to read and understand existing papers, probing for a fresh attack on a problem, trying to work through the technical details needed to pull an argument together, and, of course, the careful grind required to write up the results clearly — each of which presents a unique mental experience and its own set of challenges.
The tendency for bootcamp attendees to sweep such varied activities together into a generic term like “writing” is a minor linguistic quirk, but I’m beginning to believe that it points to a potentially broader problem: our culture lacks a sufficiently nuanced vocabulary for discussing rigorous cognitive efforts.
I guess what I’m trying to say is, if, as Boas famously claimed, the Eskimos have dozens of words for “snow,” then in an emerging knowledge work society, we should have more than a handful of words to describe the mental efforts on which, more and more, our livelihoods depend.
August 19, 2017
Are We Going to Allow Smartphones to Destroy a Generation?
The iGen Problem
Many people recently sent me the same article from the current issue of The Atlantic. It’s titled, “Have Smartphones Destroyed a Generation?”, and it’s written by Jean Twenge, a psychology professor at San Diego State University.
The article describes Twenge’s research on iGen, her name for kids born between 1995 and 2012 — the first generation to grow up with smartphones. Here’s a short summary of her alarming conclusions:
“It’s not an exaggeration to describe iGen as being on the brink of the worst mental-health crisis in decades. Much of this deterioration can be traced to their phones.”
I won’t bother describing all of Twenge’s findings here. If you’re interested, read the original article, or her new book on the topic, which comes out this week.
The point I want to make instead is that in my position as someone who researches and writes on related topics, I’ve started to hear this same note of serious alarm from multiple different reputable sources — including the head of a well-known university’s mental health program, and a reporter currently bird-dogging the topic for a major national publication.
In other words, I don’t think this growing concern about the mental health impact of smartphones on young people is simply nostalgia-tinged, inter-generational ribbing.
Something really scary is probably going on.
My prediction is that we’re going to see a change in the next 2 to 5 years surrounding how parents think about the role of smartphones in their kids’ lives. There will be a shift from shrugging our shoulders and saying “what can we do?”, to squaring our shoulders and asking with more authority, “what are we going to do?”
(Photo by Pabak Sarkar)


August 13, 2017
How I Read When Researching a Book
The Reading Writer
As a writer I’m required to read lots of books, especially when ramping up a new project, as I am now. The picture above, for example, shows the books I’ve purchased in the past two days alone.
I’ve already finished one of them.
My approach to the books I process in my professional life is quite different than my approach to the books I savor in my personal life. The former requires the ruthlessly efficient extraction of key ideas and citations, while the latter unfolds as a slower, more romantic endeavor.
I thought it might be interesting to briefly reveal the method I’ve honed over the years for my professional reading. It’s simple, and the basics should sound familiar to any serious nonfiction reader, but it has served me well.
Here’s the strategy:
I read with pencil in hand. Recently I’ve been using Ticonderoga #2 soft lead pencils as their footprint on the page is pleasingly gentle. But the writing implement doesn’t really matter, and I’ll fall back on a brutish ballpoint Bic if that’s all that happens to be available.
When I find a passage I want to remember, or an allusion or citation I might need, or a stylistic approach that catches my attention: I mark it in the margin.
Sometimes I scribble a few notations if I want to capture a non-obvious observation or insight about what I’m marking.
Then — and this is the key optimization — I cross the corner of the page with a clear line.
Here, for example, is a recently read page from Michael Harris’s book Solitude (which, interestingly enough, is different from Anthony Storr’s 1988 classic of the same title, which I just acquired today). Notice the mark in the upper right corner indicating that a few interesting citations below have been identified for potential further review.
Here’s another example from Laurence Scott’s The Four-Dimensional Human. Once again, a mark in the corner identifies the page as relevant, and a scratched line below highlights a passage that captures an idea useful to my purposes.
The key to my system is the pencil mark in the page corner. This allows me later to quickly leaf through a book and immediately identify the small but crucial subset of pages that contain passages that relate to whatever project I happen to be working on.
My copy of Scott’s book, for example, has around 30 pages marked (I just counted). It will take me less than 10 minutes to review in totality the elements of this treatise that are potentially relevant.
(To emphasize the obvious, this doesn’t mean that Scott’s book contained only 30 pages that interested me: it’s a complicated and interesting work of literary techno-social criticism — which came close to winning the Samuel Johnson Prize two years ago — that I found thought-provoking throughout. These are just the pages that happened to be relevant to what I’m working on at the moment.)
When it comes to my research process, this is about as complicated as it gets — at least with respect to how I process relevant books. As I work on a new writing project, a growing number of volumes fill the shelf next to my desk, each marred with intentional pencil scratches. When I think I need a particular source, I pull it from the shelf and, after a brief review of my ad hoc annotations, find myself fully engaged with what it has to offer.
Simple. But effective.


August 8, 2017
Aziz Ansari Quit the Internet
The Disconnected Life
Aziz Ansari recently deleted the web browser from his phone and laptop. He also stopped using email, Twitter and Instagram.
As he explained in an interview with GQ, when he gets into a cab, he now leaves his phone in his pocket and simply sits there and thinks; when he gets home, instead of “looking at websites for an hour and a half, checking to see if there’s a new thing,” he reads a book.
Here’s how he explains his motivation:
“Whenever you check for a new post on Instagram or whenever you go on The New York Times to see if there’s a new thing, it’s not even about the content. It’s just about seeing a new thing. You get addicted to that feeling. You’re not going to be able to control yourself. So the only way to fight that is to take yourself out of the equation and remove all these things.”
When he first deleted his web browsers, he worried that he would suffer from not being able to look things up. He soon stopped caring.
“Most of the shit you look up, it’s not stuff you need to know,” he explains.
The journalist interviewing Ansari for GQ reacts to this answer with incredulity. “What about important news and politics?” he asks.
“Guess what?” Ansari replies. “Everything is fine! I’m not out of the loop on anything. Like, if something real is going down, I’ll find out about it.”
Later in the interview, however, after covering a variety of topics, the interviewer makes a harsh observation:
“I have to be honest, my man. I’m surprised at how sad you sound…You don’t seem like someone who has the world by the balls, you know?”
Ansari provides a surprisingly honest (if perhaps excessively testicular) reply:
“I got the world by the balls professionally. Personally, I’m alone right now…So right now, I have it by the balls, but I’m feeling it slowly going away and I’m worried about finding new balls.”
Highlighted in this conversation is a fundamental complexity of our current moment.
Escaping the fizzy chatter of the online world can support deep insight and creative achievement (cf. the reviews for the second season of Master of None).
But life without persistent digital distraction can also be lonely, and stark, and, frankly, require a lot more work to satisfy the human need for novelty and connection.
Ansari, in other words, perhaps encapsulates both the highs and lows of committing to a deep life in a distracted world.
(Photo by Vincent Anderlucci)
#####
On a related note, I just finished reading Michael Harris’s new book, Solitude: In Pursuit of a Singular Life in a Crowded World. It provides a thoughtful investigation of several similar issues (I was particularly taken by Harris’s final chapter, which was beautifully written).


July 24, 2017
Top Performer is Now Open
Top Performer is an eight-week online career mastery course that I developed with my friend and longtime collaborator Scott Young. It helps you develop a deep understanding of how your career works, and then apply the principles of deliberate practice to efficiently master the skills you identify as mattering most. Over the past four years we’ve had over two thousand professionals go through this course, representing a wide variety of fields, backgrounds, and career stages.
We open the course infrequently for new registrations (usually twice a year). It’s that time again: the course is open for registration this week (registration closes Friday at midnight Pacific time).
If you’d like to learn more about the course, how it works or whether it’s right for you,* see the registration page here.
If you have any questions about the course, Scott’s team will be happy to answer them here: support@scotthyoung.zendesk.com
#####
* To emphasize the obvious: the course is definitely not for everyone. It’s expensive and targeted at those at a stage in their career where they’re able and willing to invest more seriously in advancement. I might send one or two additional notes about the course this week, but will then return to my regularly scheduled programming.
July 20, 2017
On Claude Shannon’s Deliberate Depth
An Insightful Life
Claude Shannon is one of my intellectual heroes.
His MIT master’s thesis, submitted in 1936, laid the foundation for digital circuit design. (My MIT master’s thesis, submitted 70 years later, has so far proven somewhat less influential.)
His insight was simple. The wires, relays and switches that made up the types of complex circuits he encountered at AT&T could be understood as the terms and operators of logic statements expressed in the Boolean algebra he encountered as a math major at the University of Michigan.
Though simple, this insight had huge impact. It meant that circuits could be designed and optimized in the abstract and precise language of mathematics, and then transformed back to soldered wires and finicky magnetic coils only at the last step — enabling staggering leaps in circuit complexity.
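To make the mapping concrete, here is a minimal sketch of the idea in modern code (my own illustration; the example circuit and function names are hypothetical, not Shannon’s notation): switches wired in series behave like a Boolean AND, switches wired in parallel like a Boolean OR, so a whole network can be written as an expression and simplified algebraically before anything is built.

```python
# A toy sketch of Shannon's thesis idea (illustrative only, not code from
# any source): relays in series act like Boolean AND, relays in parallel
# act like Boolean OR, so a switching network reduces to algebra.
from itertools import product

def series(a, b):
    # Two switches in series conduct only if both are closed: AND.
    return a and b

def parallel(a, b):
    # Two switches in parallel conduct if either is closed: OR.
    return a or b

def original_circuit(x, y, z):
    # (x AND y) OR (x AND z): two series branches joined in parallel,
    # using the switch controlled by x twice.
    return parallel(series(x, y), series(x, z))

def simplified_circuit(x, y, z):
    # Boolean algebra factors this to x AND (y OR z),
    # an equivalent circuit that needs one fewer switch.
    return series(x, parallel(y, z))

# Check that both designs compute the same switching function.
for x, y, z in product([False, True], repeat=3):
    assert original_circuit(x, y, z) == simplified_circuit(x, y, z)
print("Equivalent circuits, simpler wiring.")
```

The factored form does the same job with less hardware, which is exactly the kind of optimization that working in the algebra, rather than in the wiring, makes cheap.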
But he wasn’t done. A decade later, inspired in part by his wartime research efforts, Shannon developed information theory: a mathematical framework that formalizes both techniques and fundamental limits for reliably transmitting information over noisy channels.
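To give one concrete sense of what such a “fundamental limit” looks like, a standard result of the theory (stated here for the idealized case of a bandlimited channel with Gaussian noise) says that no scheme, however clever, can reliably transmit faster than the channel capacity:

```latex
% Shannon–Hartley capacity: B is bandwidth in hertz, S/N is the
% signal-to-noise power ratio, and C is the maximum reliable rate.
C = B \log_2\!\left(1 + \frac{S}{N}\right) \ \text{bits per second}
```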
(For a popular treatment of this theory, see this or this; for a technical introduction, I recommend this guide).
Put another way, Shannon’s master’s thesis laid the foundation for digital computers, while his information theory paper laid the foundation for digital communication.
Not a bad legacy.
Decoding Shannon’s Work Habits
This is all to say that I was, quite naturally, excited to learn that my friend Jimmy Soni was co-authoring a big new biography of Shannon.
The resulting book came out earlier this week (I read a review copy — it’s great). As part of the publicity surrounding the release, Soni wrote an epic article on the twelve lessons he learned from the years he spent researching Shannon. The title of the first lesson caught my attention: “cull your inputs.”
To quote Soni:
“[D]istractions are a permanent feature of life, in any era, and Shannon shows us that shutting them out isn’t just a matter of achieving random bursts of focus. It’s about consciously designing one’s life and work habits to minimize them.”
Shannon, we learn, often worked with his door shut at Bell Labs to ward off distraction.
“None of Shannon’s colleagues, to our knowledge, remembered him as rude or unfriendly,” Soni writes, “but they do remember him as someone who valued his privacy and quiet time for thinking.”
It’s not that Shannon avoided collaboration. If anything, he was known for his ability to maintain stimulating conversation for hours when the topic was right. But he was wary of less fruitful digressions.
Shannon also tossed much of his voluminous incoming correspondence and invitations into a box labeled: “Letters I’ve Procrastinated On For Too Long.” When Soni and his co-author studied Shannon’s correspondence at the Library of Congress, they found “far more incoming letters than outgoing ones.”
To summarize these observations somewhat flippantly, while it’s absolutely true that Shannon’s breakthroughs ultimately enabled Facebook (which, of course, depends on computers and networks), if he were alive today, he’d almost certainly not use it.
