How We Learn: The Surprising Truth About When, Where, and Why It Happens
2%
I never let go of my studies—just allowed them to become part of my life, rather than its central purpose.
2%
And the deeper I looked, the more odd results I found. Distractions can aid learning. Napping does, too. Quitting before a project is done: not all bad, as an almost done project lingers in memory far longer than one that is completed. Taking a test on a subject before you know anything about it improves subsequent learning.
3%
Yet we work more effectively, scientists have found, when we continually alter our study routines and abandon any “dedicated space” in favor of varied locations. Sticking to one learning ritual, in other words, slows us down.
3%
integrating learning into the more random demands of life can improve recall in many circumstances—and that what looks like rank procrastination or distraction often is nothing of the kind.
12%
Yet there are large upsides to forgetting, too. One is that it is nature’s most sophisticated spam filter. It’s what allows the brain to focus, enabling sought-after facts to pop to mind.
13%
If recollecting is just that—a re-collection of perceptions, facts, and ideas scattered in intertwining neural networks in the dark storm of the brain—then forgetting acts to block the background noise, the static, so that the right signals stand out. The sharpness of the one depends on the strength of the other.
14%
Hermann Ebbinghaus,
15%
He tested himself at various intervals: Twenty minutes after studying. An hour. A day later, then a week. He varied the duration of his practice sessions, too, and found (surprise) that more practice sessions generally resulted in higher test scores and a slower rate of forgetting.
15%
Ebbinghaus’s Forgetting Curve
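The curve itself isn't reproduced in this excerpt; a common formalization (a later approximation, not an equation Ebbinghaus himself wrote down) models retention as exponential decay:

    R(t) = e^{-t / S}

where R(t) is the chance of recalling an item a time t after study and S is the memory's stability: more practice means a larger S and a flatter curve.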
15%
Law of Disuse, which asserted that learned information, without continued use, decays from memory entirely—i.e., use it or lose it.
16%
Ballard knew what he had. “We not only tend to forget what we have once remembered,” he wrote, “but we also tend to remember what we have once forgotten.”
16%
nonsense syllables
17%
For nonsense syllables, and for most lists of vocabulary words or random sentences, it’s zero: There’s no spontaneous improvement on test scores after a day or two. By contrast, reminiscence is strong for imagery, for photographs, drawings, paintings—and poetry, with its word-pictures. And it takes time to happen. Ballard had identified the “bubbling up” of new verse in the first few days after study, when it’s strongest. Other researchers had looked for it too early, minutes afterward, or too late, after a week or more.
18%
The first principle of the theory is this: Any memory has two strengths, a storage strength and a retrieval strength. Storage strength is just that, a measure of how well learned something is. It builds up steadily with studying, and more sharply with use. The multiplication table is a good example.
18%
More than 99 percent of experience is fleeting, here and gone. The brain holds on to only what’s relevant, useful, or interesting—or may be so in the future.
18%
Retrieval strength, on the other hand, is a measure of how easily a nugget of information comes to mind. It, too, increases with studying, and with use. Without reinforcement, however, retrieval strength drops off quickly, and its capacity is relatively small (compared to storage). At any given time, we can pull up only a limited number of items in connection with any given cue or reminder.
18%
Compared to storage, retrieval strength is fickle. It can build quickly but also weaken quickly.
19%
This is due to the passive side of forgetting, the fading of retrieval strength over time. The theory says that that drop facilitates deeper learning once the fact or memory is found again. Again, think of this aspect of the Forget to Learn theory in terms of building muscle. Doing pull-ups induces tissue breakdown in muscles that, after a day’s rest, leads to more strength the next time you do the exercise.
19%
The harder we have to work to retrieve a memory, the greater the subsequent spike in retrieval and storage strength (learning).
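These two strengths and the "harder retrieval, bigger payoff" rule can be shown in a toy numerical sketch (an illustration only, not the Bjorks' actual model; the decay rate and boost sizes below are invented numbers):

    import math

    class Memory:
        def __init__(self):
            self.storage = 1.0    # how well learned the item is; never decreases
            self.retrieval = 1.0  # how easily it comes to mind; fades with time

        def wait(self, days):
            # passive forgetting: retrieval strength decays, storage stays put
            self.retrieval *= math.exp(-days / 5.0)

        def recall(self):
            # the lower retrieval strength has fallen, the harder the dig-out,
            # and the bigger the spike in both strengths afterward
            effort = 1.0 - self.retrieval
            self.storage += 1.0 + effort
            self.retrieval = 1.0

    m = Memory()
    m.wait(7)    # a week of forgetting makes the next retrieval effortful
    m.recall()   # ...and therefore worth more than an immediate repeat
    print(round(m.storage, 2), round(m.retrieval, 2))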
19%
Thus, forgetting is critical to the learning of new skills and to the preservation and reacquisition of old ones.
20%
Yes, the Hesperus will eventually sink if the brain stops thinking about it, and its retrieval strength will inch toward zero. But a third test, and a fourth, would anchor the poem in memory more richly still, as the brain—now being called on to use the poem regularly—would continue its search for patterns within the poem, perhaps pulling up another half line or two with each exam. Will it all come back, with enough testing, even if only half was remembered the first time? Not likely. You get something back, not everything.
20%
Forgetting enables and deepens learning, by filtering out distracting information and by allowing some breakdown that, after reuse, drives retrieval and storage strength higher than they were originally. Those are the basic principles that emerge from brain biology and cognitive science, and they underlie—and will help us understand—the various learning techniques yet to come.
22%
Of those who studied and tested in the same condition, the silence-silence group did the worst.
23%
The background music weaves itself subconsciously into the fabric of stored memory.
23%
Having something going on in the study environment, like music, is better than nothing (so much for sanctity of the quiet study room).
23%
the experience of studying has more dimensions than we notice, some of which can have an impact on retention.
24%
“associations or episodic events … can be regenerated more completely in a similar mood state than they can in a different mood state.”
24%
barbiturates and alcohol, could produce so-called state-dependent learning in modest amounts
25%
The participants’ memories functioned best when their brain was in the same state during study as during testing, high or not high.
25%
That “retrieval key” comes back most clearly when the brain is in the same state, stoned or sober.
25%
Internal and external cues can be good reminders, but they pale next to strong hints.
26%
The strong hint provided by a partial drawing trumps the weaker ones provided by reinstating my learning environment.
27%
We can easily multiply the number of perceptions connected to a given memory—most simply, by varying where we study.
27%
A simple change in venue improved retrieval strength (memory) by 40 percent.
28%
The definition of “context,” for that matter, is a moving target. If it includes moods, movement, and background music, it could by extension mean any change in the way we engage our vocabulary lists, history chapters, or Spanish homework. Think about it. Writing notes by hand is one kind of activity; typing them using a keyboard is another. The same goes for studying while standing up versus sitting down, versus running on a treadmill.
28%
it doesn’t much matter which aspects of the environment you vary, so long as you vary what you can.
28%
Since we cannot predict the context in which we’ll have to perform, we’re better off varying the circumstances in which we prepare.
28%
Each alteration of the routine further enriches the skills being rehearsed, making them sharper and more accessible for a longer period of time. This kind of experimenting itself reinforces learning, and makes what you know increasingly independent of your surroundings.
29%
distributed learning or, more commonly, the spacing effect.
29%
Distributed learning, in certain situations, can double the amount we remember later on.
30%
Studying a new concept right after you learn it doesn’t deepen the memory much, if at all; studying it an hour later, or a day later, does.
32%
SuperMemo.
32%
One group had shown that teaching third graders addition once a day for ten days was far more effective than twice a day for five days.
32%
And ever-expanding intervals—as per SuperMemo—indeed appeared to be the most effective way to build a knowledge base, making the spacing effect “one of the most remarkable phenomena to emerge from laboratory research on learning,” one reviewer, psychologist Frank N. Dempster, of the University of Nevada, Las Vegas, wrote.
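As a concrete sketch of ever-expanding intervals, here is a minimal review scheduler loosely patterned on the published SM-2 algorithm behind early versions of SuperMemo (a simplified sketch under that assumption, not the program's actual code):

    def next_review(reps, interval, ease, quality):
        """quality: self-graded recall from 0 (blank) to 5 (effortless and correct)."""
        if quality < 3:
            # failed recall: restart the expansion with a short gap
            return 0, 1, ease
        if reps == 0:
            interval = 1
        elif reps == 1:
            interval = 6
        else:
            interval = round(interval * ease)
        # items recalled easily earn a bigger multiplier for the next gap
        ease = max(1.3, ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
        return reps + 1, interval, ease

    reps, interval, ease = 0, 0, 2.5
    for quality in (5, 4, 5, 3):                 # four successful self-tests
        reps, interval, ease = next_review(reps, interval, ease, quality)
        print("next review in", interval, "day(s)")

Run on four successful reviews, the printed gaps grow from one day to six days to several weeks, which is the expanding schedule the passage describes.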
32%
By the end, the two-month interval improved performance by 50 percent.
34%
The optimal interval ranges can be read off a simple chart:
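The chart itself does not appear in this excerpt. As a rough rule of thumb drawn from the spacing research the chapter draws on (Cepeda and colleagues), rather than from the chart, the best first study gap scales with the time remaining before the test, shrinking as a proportion the further away the test is:

    gap ≈ 0.1–0.3 × (time until test)

so roughly a day or two of spacing for a test a week away, and a few weeks for a test several months out.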
36%
The fluency illusion is so strong that, once we feel we’ve nailed some topic or assignment, we assume that further study won’t help.
36%
Let’s recall the Bjorks’ “desirable difficulty” principle: The harder your brain has to work to dig out a memory, the greater the increase in learning (retrieval and storage strength). Fluency, then, is the flipside of that equation. The easier it is to call a fact to mind, the smaller the increase in learning. Repeating facts right after you’ve studied them gives you nothing, no added memory benefit.
36%
Here’s the philosopher Francis Bacon, spelling it out in 1620: “If you read a piece of text through twenty times, you will not learn it by heart so easily as if you read it ten times while attempting to recite it from time to time and consulting the text when your memory fails.”
36%
“A curious peculiarity of our memory is that things are impressed better by active than by passive repetition. I mean that in learning—by heart, for example—when we almost know the piece, it pays better to wait and recollect by an effort from within, than to look at the book again. If we recover the words in the former way, we shall probably know them the next time; if in the latter way, we shall very likely need the book once more.”