Kindle Notes & Highlights
Read between November 20 - December 17, 2022
we work more effectively, scientists have found, when we continually alter our study routines and abandon any “dedicated space” in favor of varied locations.
The left hemisphere was the intellectual, the wordsmith, and it could be severed from the right without any significant loss of IQ. The right side was the artist, the visual-spatial expert. The two worked together, like copilots.
The left hemisphere takes whatever information it gets and tells a tale to conscious awareness. It does this continually in daily life, and we’ve all caught it in the act—overhearing our name being whispered, for example, and filling in the blanks with assumptions about what people are gossiping about. The brain’s cacophony of voices feels coherent because some module or network is providing a running narration.
The brain does not store facts, ideas, and experiences like a computer does, as a file that is clicked open, always displaying the identical image. It embeds them in networks of perceptions, facts, and thoughts, slightly different combinations of which bubble up each time. And that just-retrieved memory does not overwrite the previous one but intertwines and overlaps with it. Nothing is completely lost, but the memory trace is altered, and for good. As scientists put it, using our memories changes our memories.
If learning is building up skills and knowledge, then forgetting is losing some of what was gained. It seems like the enemy of learning. It’s not. The truth is nearly the opposite.
the New Theory of Disuse, to distinguish it from an older, outdated principle stating, simply, that memories evaporate entirely from the brain over time if they’re not used. The new theory is far more than an updating, though. It’s an overhaul, recasting forgetting as the best friend of learning, rather than its rival. A better name for it, then, might be the Forget to Learn theory.
Ebbinghaus’s Forgetting Curve captured the minds of many theorists and would not let go. In 1914, the influential American education researcher Edward Thorndike turned Ebbinghaus’s curve into a “law” of learning. He called it the Law of Disuse, which asserted that learned information, without continued use, decays from memory entirely—i.e., use it or lose it.
Ballard doubted what he was seeing and ran hundreds of additional tests, with more than ten thousand subjects, over the next several years. The results were the same: Memory improved in the first few days without any further study, and only began to taper off after day four or so, on average. Ballard reported his findings in 1913, in a paper that seems to have caused mostly confusion. Few scientists appreciated what he’d done, and even today he is little more than a footnote in psychology, a far more obscure figure than Ebbinghaus. Still, Ballard knew what he had. “We not only tend to forget
...more
The American psychologist B. F. Skinner showed how rewards and punishments could alter behavior, and accelerate learning in many circumstances. Skinner tested various reward schedules against one another and got striking results: An automatic reward for a correct answer leads to little learning; occasional, periodic rewards are much more effective. Skinner’s work, which was enormously influential among educators, focused on improving teaching, rather than on the peculiarities of memory.
There’s no spontaneous improvement on test scores after a day or two. By contrast, reminiscence is strong for imagery, for photographs, drawings, paintings—and poetry, with its word-pictures. And it takes time to happen.
The scientists who have shepherded the theory along and characterized it most clearly are Robert Bjork of UCLA and his wife, Elizabeth Ligon Bjork, also at UCLA. The new theory of disuse (“Forget to Learn,” as we’re calling it) is largely their baby. The theory’s first principle is this: Any memory has two strengths, a storage strength and a retrieval strength.
Storage strength is just that, a measure of how well learned something is. It builds up steadily with studying, and more sharply with use. The multiplication table is a good example.
That is, no memory is ever “lost” in the sense that it’s faded away, that it’s gone. Rather, it is not currently accessible. Its retrieval strength is low, or near zero.
Retrieval strength, on the other hand, is a measure of how easily a nugget of information comes to mind. It, too, increases with studying, and with use. Without reinforcement, however, retrieval strength drops off quickly, and its capacity is relatively small (compared to storage). At any given time, we can pull up only a limited number of items in connection with any given cue or reminder.
The harder we have to work to retrieve a memory, the greater the subsequent spike in retrieval and storage strength (learning). The Bjorks call this principle desirable difficulty.
Using memory changes memory—and for the better. Forgetting enables and deepens learning, by filtering out distracting information and by allowing some breakdown that, after reuse, drives retrieval and storage strength higher than they were originally. Those are the basic principles that emerge from brain biology and cognitive science.
generally speaking, we perform better on tests when in the same state of mind as when we studied—and, yes, that includes mild states of intoxication from alcohol or pot, as well as arousal from stimulants. Moods, preoccupations, and perceptions matter, too: how we feel while studying, where we are, what we see and hear.
this research establishes a couple of points that are valuable in developing a study strategy. The first is that our assumptions about learning are suspect, if not wrong. Having something going on in the study environment, like music, is better than nothing (so much for the sanctity of the quiet study room). The second point is that the experience of studying has more dimensions than we notice, some of which can have an impact on retention. The contextual cues scientists describe—music, light, background colors—are annoyingly ephemeral, it’s true. They’re subconscious, usually untraceable.
...more
the larger message of context research is that, in the end, it doesn’t much matter which aspects of the environment you vary, so long as you vary what you can.
Since we cannot predict the context in which we’ll have to perform, we’re better off varying the circumstances in which we prepare. We need to handle life’s pop quizzes, its spontaneous pickup games and jam sessions, and the traditional advice to establish a strict practice routine is no way to do so. On the contrary: Try another room altogether. Another time of day. Take the guitar outside, into the park, into the woods. Change cafés. Switch practice courts. Put on blues instead of classical. Each alteration of the routine further enriches the skills being rehearsed, making them sharper and
...more
The technique is called distributed learning or, more commonly, the spacing effect. People learn at least as much, and retain it much longer, when they distribute—or “space”—their study time than when they concentrate it. Mom’s right, it is better to do a little today and a little tomorrow rather than everything at once. Not just better, a lot better. Distributed learning, in certain situations, can double the amount we remember later on.
In effect, Wozniak had reinvented Ebbinghaus for the digital age. His algorithm answered a crucial question about the timing of intervals. To build and retain foreign vocabulary, scientific definitions, or other factual information, it’s best to review the material one or two days after initial study; then a week later; then about a month later. After that, the intervals are longer.
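The expanding schedule described here can be sketched in code. The following is an illustrative Python sketch, not Wozniak's actual algorithm: the specific gap lengths (one day, a week, a month, then longer) are assumptions based only on the pattern the passage describes.

```python
from datetime import date, timedelta

# Illustrative expanding review schedule: review one day after initial
# study, then about a week later, then about a month later, then at a
# longer interval. The gap lengths are assumptions, not Wozniak's
# published algorithm.
INTERVALS = [1, 7, 30, 90]  # days between successive reviews

def review_dates(first_study, n_reviews=4):
    """Return the dates on which material first studied on
    `first_study` is due for review."""
    dates = []
    current = first_study
    for gap in INTERVALS[:n_reviews]:
        current = current + timedelta(days=gap)
        dates.append(current)
    return dates

for d in review_dates(date(2022, 11, 20)):
    print(d.isoformat())
```

Each completed review pushes the next one further out, matching the idea that harder, partially forgotten retrievals build more durable storage strength.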
By the 1990s, after its long incubation period in the lab, the spacing effect had grown legs and filled out—and in the process showed that it had real muscle. Results from classroom studies continued to roll in: Spaced review improves test scores for multiplication tables, for scientific definitions, for vocabulary. The truth is, nothing in learning science comes close in terms of immediate, significant, and reliable improvements to learning.
“To put it simply, if you want to know the optimal distribution of your study time, you need to decide how long you wish to remember something,” Wiseheart and Pashler’s group wrote. The optimal interval ranges can be read off a simple chart:
If the test is in a week, and you want to split your study time in two, then do a session today and tomorrow, or today and the day after tomorrow. If you want to add a third, study the day before the test (just under a week later). If the test is a month away, then the best option is today, a week from today (for two sessions); for a third, wait three more weeks or so, until a day before the test. The further away the exam—that is, the more time you have to prepare—the larger the optimal interval between sessions one and two. That optimal first interval declines as a proportion of the
...more
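The scheduling rule described above can be turned into a small sketch. A minimal Python version, covering the two cases the passage spells out (a test within a week, and a test about a month out); the gap lengths are illustrative assumptions, not the published chart:

```python
from datetime import date, timedelta

def study_sessions(today, test_day):
    """Pick two or three study dates for a test on `test_day`,
    following the rule of thumb described: the further off the test,
    the wider the gap between sessions one and two."""
    days_out = (test_day - today).days
    # Gap choices are illustrative assumptions, not the published chart.
    gap = 1 if days_out <= 7 else 7
    sessions = [today, today + timedelta(days=gap)]
    final = test_day - timedelta(days=1)  # optional last pass, day before the test
    if final > sessions[-1]:
        sessions.append(final)
    return sessions

print(study_sessions(date(2022, 12, 1), date(2022, 12, 8)))   # test in a week
print(study_sessions(date(2022, 12, 1), date(2022, 12, 31)))  # test in a month
```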
I was duped by what psychologists call fluency, the belief that because facts or formulas or arguments are easy to remember right now, they’ll remain that way tomorrow or the next day. The fluency illusion is so strong that, once we feel we’ve nailed some topic or assignment, we assume that further study won’t help. We forget that we forget. Any number of study “aids” can create fluency illusions, including (yes) highlighting, making a study guide, and even chapter outlines provided by a teacher or a textbook. Fluency misperceptions are automatic. They form subconsciously and make us poor
...more
Let’s recall the Bjorks’ “desirable difficulty” principle: The harder your brain has to work to dig out a memory, the greater the increase in learning (retrieval and storage strength). Fluency, then, is the flipside of that equation. The easier it is to call a fact to mind, the smaller the increase in learning. Repeating facts right after you’ve studied them gives you nothing, no added memory benefit.
And in the end, Gates had his ratio. “In general,” he concluded, “the best results are obtained by introducing recitation after devoting about 40 percent of the time to reading. Introducing recitation too early or too late leads to poorer results,” Gates wrote. In the older grades, the percentage was even smaller, closer to a third. “The superiority of optimal reading and recitation over reading alone is about 30 percent.” The quickest way to download that St. Crispin’s Day speech, in other words, is to spend the first third of your time memorizing it, and the remaining two thirds reciting from
...more
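Gates's ratio is simple arithmetic, and can be sketched as follows; the helper name and the rounding to whole minutes are my own illustration, not anything Gates specified:

```python
def split_session(total_minutes, reading_fraction=0.40):
    """Split a study session per Gates's finding: roughly the first
    40 percent of the time on reading, the rest on reciting from
    memory. For older students, pass a fraction closer to one third."""
    reading = round(total_minutes * reading_fraction)  # minutes spent reading
    return reading, total_minutes - reading            # (reading, reciting)

print(split_session(60))        # hour-long session, younger students
print(split_session(60, 1 / 3)) # older students: closer to a third reading
```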
Spitzer showed not only that testing is a powerful study technique, he showed it’s one that should be deployed sooner rather than later. “Immediate recall in the form of a test is an effective method of aiding the retention of learning and should, therefore, be employed more frequently,” he concluded. “Achievement tests or examinations are learning devices and should not be considered only as tools for measuring achievement of pupils.”
If self-examination is more effective than straight studying (once we’re familiar with the material), there must be reasons for it. One follows directly from the Bjorks’ desirable difficulty principle. When the brain is retrieving studied text, names, formulas, skills, or anything else, it’s doing something different, and harder, than when it sees the information again, or restudies. That extra effort deepens the resulting storage and retrieval strength. We know the facts or skills better because we retrieved them ourselves; we didn’t merely review them.
In the jargon of the field, your “unsuccessful retrieval attempts potentiated learning, increasing successful retrieval attempts on subsequent tests.” In plain English: The act of guessing engaged your mind in a different and more demanding way than straight memorization did, deepening the imprint of the correct answers. In even plainer English, the pretest drove home the information in a way that studying-as-usual did not.
Testing—recitation, self-examination, pretesting, call it what you like—is an enormously powerful technique capable of much more than simply measuring knowledge. It vanquishes the fluency trap that causes so many of us to think that we’re poor test takers. It amplifies the value of our study time. And it gives us—in the case of pretesting—a detailed, specific preview of how we should begin to think about approaching a topic.
Jorge Luis Borges once said about his craft: “Writing long books is a laborious and impoverishing act of foolishness: expanding in five hundred pages an idea that could be perfectly explained in a few minutes. A better procedure is to pretend that those books already exist and to offer a summary, a commentary.” Pretend that the book already exists. Pretend you already know. Pretend you already can play something by Sabicas, that you already inhaled the St. Crispin’s Day speech, that you have philosophy logic nailed to the door. Pretend you already are an expert and give a summary, a
...more
An insight problem, by definition, is one that requires a person to shift his or her perspective and view the problem in a novel way.
To Plato, thinking was a dynamic interaction between observation and argument, which produced “forms,” or ideas, that are closer to reality than the ever-changing things we see, hear, and perceive. To this, Aristotle added the language of logic, a system for moving from one proposition to another—the jay is a bird, and birds have feathers; thus, the jay must have feathers—to discover the essential definitions of things and how they relate. He supplied the vocabulary for what we now call deduction (top-down reasoning, from first principles) and induction (bottom-up, making generalizations based
...more
Graham Wallas was known primarily for his theories about social advancement, and for cofounding the London School of Economics. In 1926, at the end of his career, he published The Art of Thought, a rambling meditation on learning and education that’s part memoir, part manifesto.
Wallas also quotes the German physicist Hermann von Helmholtz, who described how new ideas would bubble up after he’d worked hard on a problem and hit a wall: “Happy ideas come unexpectedly, without effort, like an inspiration,” he wrote. “So far as I am concerned, they have never come to me when my mind was fatigued, or when I was at my working table … they came particularly readily during the slow ascent of wooded hills on a sunny day.”
Wallas saw, however, that the descriptions had an underlying structure. The thinkers had stalled on a particular problem and walked away. They could not see an opening. They had run out of ideas. The crucial insights came after the person had abandoned the work and was deliberately not thinking about it.
While Maier was conducting his experiments, Duncker was studying in Berlin under Max Wertheimer, one of the founders of the Gestalt school of psychology. Gestalt—“shape” or “form” in German—theory held that people perceive objects, ideas, and patterns whole, before summing their component parts.
Creative leaps often come during downtime that follows a period of immersion in a story or topic, and they often come piecemeal, not in any particular order, and in varying size and importance. The creative leap can be a large, organizing idea, or a small, incremental step, like finding a verse, recasting a line, perhaps changing a single word. This is true not just for writers but for designers, architects, composers, mechanics—anyone trying to find a workaround, or to turn a flaw into a flourish.
Zeigarnik’s studies on interruption revealed a couple of the mind’s intrinsic biases, or built-in instincts, when it comes to goals. The first is that the act of starting work on an assignment often gives that job the psychological weight of a goal, even if it’s meaningless.
interrupting yourself when absorbed in an assignment extends its life in memory and—according to her experiments—pushes it to the top of your mental to-do list.
The first element of percolation, then, is that supposed enemy of learning—interruption.
Having a goal foremost in mind (in this case, a drink) tunes our perceptions to fulfilling it. And that tuning determines, to some extent, where we look and what we notice.
As the French microbiologist Louis Pasteur famously put it, “Chance favors the prepared mind.” Seeing that quote always made me think, Okay, but how does one prepare for chance? I have a better idea now, thanks to social psychology. I’d put it differently than Pasteur, if less poetically: Chance feeds the tuned mind.
Those are the first two elements of percolation: interruption, and the tuned, scavenging mind that follows. The journal entries provided the third element, conscious reflection.
Percolation is a matter of vigilance, of finding ways to tune the mind so that it collects a mix of external perceptions and internal thoughts that are relevant to the project at hand.
Transfer is what learning is all about, really. It’s the ability to extract the essence of a skill or a formula or word problem and apply it in another context, to another problem that may not look the same, at least superficially. If you’ve truly mastered a skill, you “carry it with you,” so to speak.
Whenever researchers scrambled practice sessions, in one form or another, people improved more over time than if their practice was focused and uninterrupted.
It’s not that repetitive practice is bad. We all need a certain amount of it to become familiar with any new skill or material. But repetition creates a powerful illusion. Skills improve quickly and then plateau. By contrast, varied practice produces a slower apparent rate of improvement in each single practice session but a greater accumulation of skill and learning over time. In the long term, repeated practice on one skill slows us down.