Kindle Notes & Highlights
Read between August 28 - October 14, 2018
I never let go of my studies—just allowed them to become part of my life, rather than its central purpose.
And the deeper I looked, the more odd results I found. Distractions can aid learning. Napping does, too. Quitting before a project is done: not all bad, as an almost done project lingers in memory far longer than one that is completed. Taking a test on a subject before you know anything about it improves subsequent learning.
Yet we work more effectively, scientists have found, when we continually alter our study routines and abandon any “dedicated space” in favor of varied locations. Sticking to one learning ritual, in other words, slows us down.
integrating learning into the more random demands of life can improve recall in many circumstances—and that what looks like rank procrastination or distraction often is nothing of the kind.
Yet there are large upsides to forgetting, too. One is that it is nature’s most sophisticated spam filter. It’s what allows the brain to focus, enabling sought-after facts to pop to mind.
If recollecting is just that—a re-collection of perceptions, facts, and ideas scattered in intertwining neural networks in the dark storm of the brain—then forgetting acts to block the background noise, the static, so that the right signals stand out. The sharpness of the one depends on the strength of the other.
Hermann Ebbinghaus,
He tested himself at various intervals: Twenty minutes after studying. An hour. A day later, then a week. He varied the duration of his practice sessions, too, and found (surprise) that more practice sessions generally resulted in higher test scores and a slower rate of forgetting.
Ebbinghaus’s Forgetting Curve
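Ebbinghaus's curve is commonly summarized as exponential decay of retention over time. A minimal sketch of that summary (the model form is the standard textbook simplification, and the stability values are illustrative assumptions, not Ebbinghaus's fitted data):

```python
import math

def retention(t_hours, stability):
    """Fraction of studied material still recallable after t_hours,
    under the simple exponential forgetting model R = exp(-t / s).
    Larger `stability` means slower forgetting."""
    return math.exp(-t_hours / stability)

# More practice sessions -> higher stability -> a flatter curve,
# matching Ebbinghaus's finding that practice slows forgetting.
for s in (2.0, 10.0):  # illustrative stability values
    print([round(retention(t, s), 2) for t in (0.33, 1, 24)])
```

With the larger stability, retention after a day stays well above the low-stability curve, which is the whole point of the "slower rate of forgetting" result.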
Law of Disuse, which asserted that learned information, without continued use, decays from memory entirely—i.e., use it or lose it.
Ballard knew what he had. “We not only tend to forget what we have once remembered,” he wrote, “but we also tend to remember what we have once forgotten.”
nonsense syllables
For nonsense syllables, and for most lists of vocabulary words or random sentences, it’s zero: There’s no spontaneous improvement on test scores after a day or two. By contrast, reminiscence is strong for imagery, for photographs, drawings, paintings—and poetry, with its word-pictures. And it takes time to happen. Ballard had identified the “bubbling up” of new verse in the first few days after study, when it’s strongest. Other researchers had looked for it too early, minutes afterward, or too late, after a week or more.
The first principle of the theory is this: Any memory has two strengths, a storage strength and a retrieval strength. Storage strength is just that, a measure of how well learned something is. It builds up steadily with studying, and more sharply with use. The multiplication table is a good example.
More than 99 percent of experience is fleeting, here and gone. The brain holds on to only what’s relevant, useful, or interesting—or may be so in the future.
Retrieval strength, on the other hand, is a measure of how easily a nugget of information comes to mind. It, too, increases with studying, and with use. Without reinforcement, however, retrieval strength drops off quickly, and its capacity is relatively small (compared to storage). At any given time, we can pull up only a limited number of items in connection with any given cue or reminder.
Compared to storage, retrieval strength is fickle. It can build quickly but also weaken quickly.
This is due to the passive side of forgetting, the fading of retrieval strength over time. The theory says that this drop facilitates deeper learning once the fact or memory is found again. Again, think of this aspect of the Forget to Learn theory in terms of building muscle. Doing pull-ups induces tissue breakdown in muscles that, after a day's rest, leads to more strength the next time you do the exercise.
The harder we have to work to retrieve a memory, the greater the subsequent spike in retrieval and storage strength (learning).
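The two-strength idea can be captured in a toy numerical model. This is a hedged illustration of the logic only; the boost formula and all numbers below are my assumptions, not the Bjorks' actual theory:

```python
def restudy(storage, retrieval, effort_gain=0.5):
    """Toy model of one successful retrieval: both strengths rise,
    and the rise is LARGER when retrieval strength had dropped
    lower (harder retrieval -> bigger spike in learning).
    All parameters are illustrative, not from the real theory."""
    boost = effort_gain * (1.0 - retrieval)  # harder recall, bigger gain
    new_storage = storage + boost
    new_retrieval = min(1.0, retrieval + 2 * boost)
    return new_storage, new_retrieval

# An easy, immediate review (retrieval still high) adds little...
print(restudy(storage=1.0, retrieval=0.9))
# ...while a delayed, effortful review adds much more.
print(restudy(storage=1.0, retrieval=0.3))
```

The design choice worth noticing is that the gain is a function of how far retrieval strength has fallen, which is exactly the "harder we have to work, the greater the spike" claim restated as arithmetic.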
Thus, forgetting is critical to the learning of new skills and to the preservation and reacquisition of old ones.
Yes, the Hesperus will eventually sink if the brain stops thinking about it, and its retrieval strength will inch toward zero. But a third test, and a fourth, would anchor the poem in memory more richly still, as the brain—now being called on to use the poem regularly—would continue its search for patterns within the poem, perhaps pulling up another half line or two with each exam. Will it all come back, with enough testing, even if only half was remembered the first time? Not likely. You get something back, not everything.
Forgetting enables and deepens learning, by filtering out distracting information and by allowing some breakdown that, after reuse, drives retrieval and storage strength higher than they were originally. Those are the basic principles that emerge from brain biology and cognitive science, and they underlie—and will help us understand—the various learning techniques yet to come.
Of those who studied and tested in the same condition, the silence-silence group did the worst.
The background music weaves itself subconsciously into the fabric of stored memory.
Having something going on in the study environment, like music, is better than nothing (so much for the sanctity of the quiet study room).
the experience of studying has more dimensions than we notice, some of which can have an impact on retention.
“associations or episodic events … can be regenerated more completely in a similar mood state than they can in a different mood state.”
barbiturates and alcohol could, in modest amounts, produce so-called state-dependent learning
The participants’ memories functioned best when their brain was in the same state during study as during testing, high or not high.
That “retrieval key” comes back most clearly when the brain is in the same state, stoned or sober.
Internal and external cues can be good reminders, but they pale next to strong hints.
The strong hint provided by a partial drawing trumps the weaker ones provided by reinstating my learning environment.
We can easily multiply the number of perceptions connected to a given memory—most simply, by varying where we study.
A simple change in venue improved retrieval strength (memory) by 40 percent.
The definition of “context,” for that matter, is a moving target. If it includes moods, movement, and background music, it could by extension mean any change in the way we engage our vocabulary lists, history chapters, or Spanish homework. Think about it. Writing notes by hand is one kind of activity; typing them using a keyboard is another. The same goes for studying while standing up versus sitting down, versus running on a treadmill.
it doesn’t much matter which aspects of the environment you vary, so long as you vary what you can.
Since we cannot predict the context in which we’ll have to perform, we’re better off varying the circumstances in which we prepare.
Each alteration of the routine further enriches the skills being rehearsed, making them sharper and more accessible for a longer period of time. This kind of experimenting itself reinforces learning, and makes what you know increasingly independent of your surroundings.
distributed learning or, more commonly, the spacing effect.
Distributed learning, in certain situations, can double the amount we remember later on.
Studying a new concept right after you learn it doesn’t deepen the memory much, if at all; studying it an hour later, or a day later, does.
SuperMemo.
One group had shown that teaching third graders addition once a day for ten days was far more effective than twice a day for five days.
And ever-expanding intervals—as per SuperMemo—indeed appeared to be the most effective way to build a knowledge base, making the spacing effect “one of the most remarkable phenomena to emerge from laboratory research on learning,” one reviewer, psychologist Frank N. Dempster, of the University of Nevada, Las Vegas, wrote.
By the end, the two-month interval improved performance by 50 percent.
The optimal interval ranges can be read off a simple chart.
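An ever-expanding schedule of the kind SuperMemo popularized can be sketched in a few lines. The starting interval and multiplier here are illustrative assumptions, not SuperMemo's actual SM-2 parameters:

```python
def review_days(first_interval=1.0, factor=2.5, reviews=6):
    """Ever-expanding review schedule: each gap between reviews is
    `factor` times the previous gap (illustrative numbers, not SM-2).
    Returns the day on which each review falls, counting from study."""
    days = []
    day, interval = 0.0, first_interval
    for _ in range(reviews):
        day += interval
        days.append(round(day, 1))
        interval *= factor
    return days

# Early reviews cluster near the study session; later ones
# stretch out toward months, matching the spacing-effect pattern.
print(review_days())
```

The point of the expanding shape is the one the surrounding passage makes: a review is most useful just as retrieval is getting hard, and each successful review pushes that moment further out.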
The fluency illusion is so strong that, once we feel we’ve nailed some topic or assignment, we assume that further study won’t help.
Let’s recall the Bjorks’ “desirable difficulty” principle: The harder your brain has to work to dig out a memory, the greater the increase in learning (retrieval and storage strength). Fluency, then, is the flipside of that equation. The easier it is to call a fact to mind, the smaller the increase in learning. Repeating facts right after you’ve studied them gives you nothing, no added memory benefit.
Here’s the philosopher Francis Bacon, spelling it out in 1620: “If you read a piece of text through twenty times, you will not learn it by heart so easily as if you read it ten times while attempting to recite it from time to time and consulting the text when your memory fails.”
“A curious peculiarity of our memory is that things are impressed better by active than by passive repetition. I mean that in learning—by heart, for example—when we almost know the piece, it pays better to wait and recollect by an effort from within, than to look at the book again. If we recover the words in the former way, we shall probably know them the next time; if in the latter way, we shall very likely need the book once more.”