“It’s true, you know. In space, no one can hear you scream like a little girl.”
I have a bit of a lover's quarrel with this one. The plot (lone crew member gets stranded on Mars), setting (Mars) and scientific integrity (there's a lot of good science here) would seem to be the perfect blend for a target audience that is me. But then there's Mark Watney, or as I like to call him, "the teenager in an EVA suit." Crafted of equal parts cocky and corny, The Martian's main character makes the male cohort on The Big Bang Theory seem downright intoxicating. Each time you're about to settle into the sci-fi goodness unfolding on the blood-red planet, Watney's juvenility and hackneyed attempts at humor rear up to depressurize the drama and poison the narrative atmosphere. I did not connect with this character, at all.
Here's an exchange between Watney and NASA Mission Control in which Watney can't help but lay on the prepubescent charm:
[11:49] JPL: What we can see of your planned cut looks good. We’re assuming the other side is identical. You’re cleared to start drilling.
[12:07] Watney: That’s what she said.
[12:04] JPL: We’ll get botanists in to ask detailed questions and double-check your work. Your life is at stake, so we want to be sure. Also, please watch your language. Everything you type is being broadcast live all over the world.
[12:15] WATNEY: Look! A pair of boobs! -> (.Y.)
What am I reading...? Is this sci-fi or middle school? I'm all for bucking stereotypes—like the urbane, strait-laced NASA astronaut Weir apparently had in mind—but Watney is a stride too far in the opposite direction. On rare occasions the dullish teen-speak gives way to genuine wit, but these instances are so few and far between that the bad taste in my mouth never left. That said, I do expect reader mileage to vary on this score.
I could probably look the other way if the supporting cast were infused with greater dimensionality, but that's hardly the case. The crew deliver dialogue every bit as stilted and clichéd, their interactions adding nothing of substance to the narrative. Here's one crew member chatting with a loved one back home:
Martinez: "So, you're pissed."
Marissa: "I have to wait another 533 days to get laid!"
Martinez: "So do I," he said defensively.
A World Away
But not even Watney's itchy tongue and forgettable dialogue are enough to dash an epic quest on a foreign world. This is Mars after all, our second closest neighbor and perennial sci-fi favorite. In this outing a crew of six travel to Mars for NASA's third manned mission, known as Ares 3. While out on expedition, a nasty storm sweeps up and amid the chaos one crew member is struck by a wayward antenna carried by the high-powered surface winds. With his comms no longer transmitting, the crew is unable to locate the downed engineer. Fearing the destruction of their return vehicle, the crew abandon the search and conclude that Mars has claimed its first human casualty.
Except Ares 3 leaves behind more than an unforgiving environment. They leave one of their own, bruised and battered, but not exactly dead. It's now Watney vs. the Red Planet, a match less lopsided than one might think. Mars' razor-thin atmosphere, brutal cold, active weather and craggy terrain all serve as redoubtable antagonists Watney must overcome to secure a return trip home. Imagine being all alone on a planet climatically hostile to your kind of life with dwindling resources, no return vessel and no contact with the only people who can bring you one. Even the best odds of survival would be Planck-length low.
Fortunately, our deserted soul is no slouch. What Watney lacks in charisma he more than makes up for in sheer intelligence and technical brilliance. Mars' first "colonizer" wears the hats of botanist and mechanical engineer, and is a person for whom "asleep at the wheel" would be a most inapt descriptor. If MacGyver, Rube Goldberg and Robinson Crusoe were to have some kind of hybrid child, Watney would be it. The man's a dynamo, as pragmatically minded, resourceful and resilient as they come. It's probably why he was chosen for a NASA mission.
He's also utterly determined to make it back to Earth. As Watney awakes groggy-eyed and the true extent of his plight comes into focus, his indomitable survivalism takes over and doesn't let up. He quickly realizes it will require every ounce of his scientific acumen to hold out until the next scheduled NASA mission, at which time an aghast Ares 4 crew would set eyes on one weary astronaut. He puts his botany training to immediate use, creating a renewable source of food from little more than potato seeds and "homegrown" fertilizer. He employs some fancy chemistry in order to maintain a breathable atmosphere and reliable (though radiatively unstable) heat source. And every whit of Watney's engineering know-how is spent on preparing the rover for a transplanetary jaunt over Mars' surly, rough-and-tumble terrain.
Watney's time on Mars is relayed through daily first-person logs that record his progress in addition to a few clunky transitions to third-person omniscient. Provided you don't mind being submerged in technical detail, these logs may just win you over as they did me. This is science at its most raw and ad hoc. The meticulous cataloging succeeds in connecting you to the action as Watney slaps together one near-suicidal scheme after another. Just as we might expect of someone marooned 140 million miles (annual average) from all of civilization, our hero is never allowed too much comfort. Part of the allure is seeing what hellish scenario presents itself next and how Watney's ingenuity and moxie will combine to solve it away. Better yet, all of the science here is kosher, otherwise known as "hard" sci-fi. Watney won't run into any boogeymen or Martian monsters in this one, but the trials he does chance upon are every bit as deadly. With each setback and triumph, no specifics are spared the reader, as complex concepts are unspooled with ease and clarity.
Andy Weir, something of a prodigy himself, started out as a computer programmer at age 15. For him, science is both a hobby and a narrative device. But Weir's goal was not just to use science to drive the story forward, but to make Watney's exploits as scientifically plausible as possible. He released some early chapters online as a free serial novel, which quickly garnered interest from fans and scientists alike. Weir incorporated their technical feedback for the final print edition, making The Martian a kind of collective effort by science enthusiasts.
What results is a unique blend of survivalist sci-fi and problem-solving escapades told through excruciatingly detailed science. Could one human really survive on Mars with standard NASA equipage? The answer is surely yes, if Watney has anything to say about it. All of his interdisciplinary expertise is on display for the reader to either absorb, deconstruct and debunk, or skim over until the next existential disaster strikes. Technical readers will fall head over heels working through the minutiae, while the less initiated may find their eyes glazing over, but both audiences will come away having learned something new. The thoroughness of it all is really what pulled me in and lent the story its strong scent of credibility. There's no deus ex machina here. If Watney didn't die in the previous chapter, it's because he used science to decatastrophize the latest curveball Mars threw his way. It's satisfying in a way that "softer" sci-fi plot tropes aren't.
Don't Leave Home Without Them
Before wrapping up the review, I thought I'd briefly walk through a few pieces of equipment that recur throughout the story. These are absolutely vital to Watney's survival, and given how often they're mentioned it might be helpful to have a quick reference here for those looking to embark on Weir's planetary safari. The "Big Three" are:
Oxygenator. A machine that strips apart the carbon atoms from the CO2 that Watney exhales and retains the oxygen atoms. Relies on the atmospheric regulator for the CO2; worthless without it.
Atmospheric regulator. A machine that monitors the molecular gas concentrations in the air, removing and resupplying CO2 and O2 as necessary. Too much oxygen (oxygen toxicity) is just as dangerous as too much carbon dioxide (hypercapnia).
Water reclaimer. A machine that salvages and purifies water from virtually anything that gives off moisture, including humidity from the air when Watney exhales or sweats in the pressurized environments, wastewater from the Hab's fuel cells, and even Watney's urine. If this sounds disgusting, it's worth noting that the reclaimers NASA employs on their manned missions use three-step purification.
In The Martian, science is front and center. It assumes the roles of protagonist and antagonist and is the driving mechanism that allows forward progress for the hero. If chemistry, biology and physics aren't your speed, you won't last long on this cerebral joyride. Much of the book hovers just on the edge of possible, and Weir's technical accuracy and attention to detail were more than enough to keep me glued, even if Watney's unsavory personality and the stilted character interactions left me out in the cold. Were the grade-school script and throwaway dialogue intentional juxtapositions to compensate for the technical nature of much of the rest of the book—a lighthearted, expletive-suffused respite to allow your brain a cooldown period from the stress and heavy lifting? Perhaps, but my gripe is that they could have been handled much better, as I found the contrast jarring, often piercing the tension at several inopportune moments. I also simply found his attempts at humor nonfunctional and rarely clever, though I acknowledge the subjectivity on this account. Quibbles aside, The Martian is well researched space fiction that manages to capture mankind's relentless will to survive and demonstrates that with limited resources and unlimited creativity, we can even face down Mars.
Note: This review is republished from my official website. Click through for additional footnotes and imagery.
I know what you're thinking. Which book isn't like the others? And so it would seem a preface is very much in order.
It all started with a lighthearted conversation with a family friend that morphed spontaneously into a much deeper and more philosophically tinted exchange, touching upon everything from karma to palmistry (palm-reading) and eventually to something called 'the secret'. Now, I believe it my rational duty to adopt a posture of skepticism when anyone claims to have the secret to success and the workings of the universe, especially when said person is not a scientist and has no formal scientific training. I had heard only vague murmurings about Rhonda Byrne's 2006 book (and film of the same name) prior to this conversation and knew only of its fashionable reputation in the self-help genre. Previous associations notwithstanding, when her "Law of Attraction" was verbalized by this family friend, all of my antennae were telling me to doubt, doubt, doubt. But in a gesture of acte gratuit I accepted the invitation to read the book that had just been placed in my hands.
When I brought it home, I quite honestly did not intend to read it. There are just too many other books peering out from my shelf and vying for attention. But curiosity won out and I found myself leafing through it a little every day. Oy vey. For someone allegiant to the scientific method, slogging through The Secret is what I imagine Ayn Rand must have felt reading the New Testament gospels. I now understand why the self-help market is such a target of high dudgeon. With insufferable twaddle like this clogging the shelves, is it worth our time sifting through the dunghills to see if any gems might be buried there? Not sure I wish to find out.
Page after page I encountered something guaranteed to set off alarm tones in the skeptic's acropolis. I jotted down the most troubling passages specifically for this review. These are the ones that most raised my hackles:
“You will attract everything that you require. If it’s money you need you will attract it. If it’s people you need you’ll attract it. You've got to pay attention to what you’re attracted to, because as you hold images of what you want, you’re going to be attracted to things and they’re going to be attracted to you. But it literally moves into physical reality with and through you. And it does that by law."
“Thoughts are magnetic, and thoughts have a frequency. As you think thoughts, they are sent out into the Universe, and they magnetically attract all like things that are on the same frequency. Everything sent out returns to the source - you.”
“Don’t become mesmerized by the pictures that have appeared if they are not what you want. Take responsibility for them, make light of them if you can, and let them go. Then think new thoughts of what you want, feel them, and be grateful that it is done.”
“Thought = creation. If these thoughts are attached to powerful emotions (good or bad) that speeds the creation."
“The law of attraction is a law of nature. It is as impartial and impersonal as the law of gravity is. It is precise, and it is exact.”
“Remember that your thoughts are the primary cause of everything.”
"Nothing can come into your experience unless you summon it through persistent thoughts."
“You cannot “catch” anything unless you think you can, and thinking you can is inviting it to you with your thought. You are also inviting illness if you are listening to people talking about their illness. As you listen, you are giving all your thought and focus to illness, and when you give all of your thought to something, you are asking for it.”
“Laughter attracts joy, releases negativity, and leads to miraculous cures.”
“Ask once, believe you have received, and all you have to do to receive is feel good.”
“Many people who order their lives rightly in all other ways are kept in poverty by their lack of gratitude.”
“The only reason any person does not have enough money is because they are blocking money from coming to them with their thoughts.”
“Food cannot cause you to put on weight, unless you think it can.”
Bonus: In this interview, Byrne told ABC that she wouldn't even get a flu shot because, "If you're feeling good, how can you attract any illness to you?"
If you're still with me, that means you haven't croaked from exasperation. Were I able to scribble in the margins, I'd have copy-pasted the same incredulous meme ad infinitum.
See, I have a fundamental problem with every single one of the messages above: they are a castle of lies based on no evidence at all. Thoughts are not "magnetic", they don't "move into physical reality", the "law of attraction" is not a "law of nature", our thoughts do not "invite illness", laughter does not "lead to miraculous cures", people are not kept in poverty because of "their lack of gratitude", and thoughts by themselves cannot "cause you to put on weight" because our thoughts are not "the primary cause of everything". (Tell that last one to the Jews. Better yet, don't.)
A reasonable question at this juncture might be to inquire of Byrne's background. I am actually unable to find any record of her educational history or whether she even attended college. It is as though the internet has been suspiciously wiped of anything prior to her jaunts with Australian television. Regardless, the preceding excerpts practically drip with scientific illiteracy. I knew I was in for some unsolicited arrhythmia when I read the following just minutes after picking it up: "Quantum physicists tell us that the entire Universe emerged from thought" (p. 15). No, they don't. Whether she has a formal education or not, it is clearly not in STEM (as if titles like The Magic and The Power didn't tip you off already).
Take a look once more at her first quote above, if you dare. To speak about phenomena that "literally moves into physical reality" is to speak about scientific phenomena. Byrne's "Law of Attraction", in brief, is the idea that thoughts bend nature to their will—that renovating one's inner state of mind feeds into tangible effects outside of that mind. In case it needed spelling out, there is no known physical mechanism by which this can occur. This is rubbish mixed with high-octane nonsense. Are we to suppose Byrne has stumbled onto something hitherto uncharted by the communities of academics who have made careers out of studying nature? Her infirm speculations do not belong in an analogy with the law of gravity any more than screen doors belong in submarines.
And so here I am, forced to parade a criticism I've leveled at no other book to date. Emotionally appealing though they may be, there is not a shred of support for any of the grandiose claims in The Secret. As I wrote in my review of Carl Sagan's penultimate tome, science relies on the principles of skepticism and trained observation fueled by an overarching preference for the truth, however inconvenient, over the psychologically comfortable. These may indeed be comforting, compelling messages Byrne is putting out there. But the more we want something to be true the greater skepticism we should wield in its vicinity.
Mentality and Meditation
So what can be said about the ground Byrne covers in The Secret? As psychologist Christopher Chabris notes in his dual review in the NY Times, the "Law of Attraction" isn't novel. Some pre-Socratics meddled in it, it's what some medieval natural philosophers thought, and indeed, Byrne cites several of these figures in between her scattered logical dead ends. But that was then and this is now. We've learned much since Plato. We wouldn't, by analogy, adopt a particular view of cosmology because elements of it are found in Giordano Bruno's musings on nature. In science, the litmus test is evidence, not ancient writings or naked intuition. Chabris also has some juicy bits about the web of ways our mind can deceive us into ascertaining nonexistent patterns, e.g. "illusory correlation" and the related phenomenon of "playing up the hits and downplaying the misses", all of which trespass directly on Byrne's magical mind slaughter.
Byrne cites early on the inspiration for her movement: a 100-year-old book called The Science of Getting Rich. "Wealth attraction", as the author called it, was bunk then and it's bunk now. Byrne has simply repackaged this and other superstitious fluff for the modern uncritical listener. But why entertain the fact-free notion of metaphysical causal arrows between happy thoughts and the universe when the empirical literature is brimming with data of great import for the self-help audience? Cultivating a positive inner state can bring about very real psychological benefits, from helping you cope with difficult situations to helping you interact effectively in social environments. Byrne is also in favor of meditation. Here, too, the benefits are studied and can be palpable. Introspection in its various forms has proved effective at achieving a desired conscious state. These practices are well established, and we've been engaging in them for thousands of years, since the earliest Eastern contemplatives. No additional "laws of nature" required. No voodoo woo woo.
The Dangers of Self-Help "Wisdom"
It is hardly a "secret" that positive thinking can reap appreciable benefit to the psychological well-being of an individual. But extrapolating from here by telling people that those thoughts animate and somehow reach out into the world like magic tendrils and cause physical transformation is entirely another matter. I'm all for positive thinking as a means for improving one's outlook. But I'm also for communicating accurate science rather than distorting it to prey on the needs of the under-equipped.
And this is why I think books like The Secret signal a broader cause for concern. Tied up in these messages is essentially an "easy" button to life, the idea that if we only adopt a new mindset the ship will right itself. After all, Byrne's 'law' is cast as a law of nature, and laws are nothing if not invariant. I have no doubt that these messages do prevent actual, necessary, useful action from being taken. When you are resigned to the idea that your thoughts enact change all by themselves, where is the incentive to take real action, to expend the effort necessary to achieve the things you need or desire?
People suffering from depression, suicidal ideation or clinical mental illness need professional attention, not self-help half-truths coated in meaningless, contextless jargon. No licensed clinician would advocate what Byrne advocates in this book. Someone overweight or obese may require a change to their diet or may benefit from prescription medication. What a doctor would absolutely not do is tell the patient they are obese because their thoughts are not in the right place. Likewise, having a less poverty-stricken vision of reality will not return prosperity, happiness and riches to you as if the universe were some conscious gumball machine and obliging the downtrodden its occupation. So unless we're willing to say that rich white people have been in on the "secret" to the exclusion of the rest of the planet, the universe doesn't give you what you want. No amount of positive thinking can pick up the slack between coincidence and hard work.
The oft-quoted maxim by Carl Sagan continues to hold sway: "Extraordinary claims require extraordinary evidence." These are certainly some extraordinary claims being sent out by Ms. Byrne, but the only effects they are having in the real world are on her bank account. Other than false hope and dreamed-up physics, there are no secrets to be found in Byrne's self-help manifesto. This is pure, distilled pseudoscience concentrate at its most disingenuous, a manufactured word tangle that is the intellectual equivalent of canned Spam and a Jason Statham flick. She works in equivocal quotes from past intellectuals, many of whom would no doubt call high treason on their conscription here, and dresses up her conjectural detritus in sciencey-sounding language ripped straight from a Deepak Chopra meditation tape. As it turns out, weight management, food security, and global conflict and disease do not all and sundry yield before Byrne's transfinite law. Millionaires are not in tune with the universe. Negative thoughts don't land you in the unemployment line. And honesty has always been better medicine than bullshit.
Note: This review is republished from my official website. Click through for additional footnotes and imagery.
Probably the best-written book I have on my shelf to date. That there's an absorbing and urgent narrative here as well is icing on the Orwellian cake. The lessons of Oceania transcend era and have lasting value for those working to organize and uplift society. It's as if Orwell has reached into the future and touched the mood of the present. One can easily connect Orwell's cautionary dystopia to the depredations of American privacy that have surfaced in just the last couple of years. Among the rarefied collective of the greatest writers in the English language, George Orwell has painstakingly crafted a penetrating, prophetic tale, a world within a world, a spiraling ideascape whose tendrils integrate seamlessly with the realities of modern life.
Here's a small sampling of the winged brilliance flanking the reader at every turn:
“He was a fattish but active man of paralyzing stupidity, a mass of imbecile enthusiasms – one of those completely unquestioning, devoted drudges on whom, more even than on the Thought Police, the stability of the Party depended.” (p. 22)
"She had a bold, aquiline face, a face that one might have called noble until one discovered that there was as nearly as possible nothing behind it." (p. 66)
"Until they become conscious they will never rebel, and until after they have rebelled they cannot become conscious." (p. 70)
"But there was still that memory moving round the edges of his consciousness, something strongly felt but not reducible to definite shape, like an object seen out of the corner of one's eye." (p. 122)
"To hang on from day to day and from week to week, spinning out a present that had no future, seemed an unconquerable instinct, just as one's lungs will always draw the next breath so long as there is air available." (p. 152)
"What opinions the masses hold, or do not hold, is looked on as a matter of indifference. They can be granted intellectual liberty because they have no intellect." (p. 210)
Recommended companion reading:
- Brave New World by Aldous Huxley (1932)
- Amusing Ourselves to Death: Public Discourse in the Age of Show Business by Neil Postman (1985)
"Studying mathematics in order to understand the laws of physics is not unlike learning enough of a foreign language to capture some of the special flavor and beauty of prose or poetry written in that language. In the process, one may well become fascinated by the language itself." (pp. 169-170)
Mathematics has always occupied the mythical verge between reality and abstraction, between beauty and physics. For those acquainted with its rhythms, descending into the mathematical realm is like peeking behind the cosmic curtain and seeing how nature is choreographed. In this brief volume Robert Osserman opens up the aesthetic space as he volleys philosophy in between sets of mathematical exposition. Do numbers, geometric patterns, proofs and equations inhabit a reality independent of the mind? Were they sealed off in their own ontological antechamber just waiting to be discovered? Or is mathematics an uncannily useful quirk of cognition gradually refined by human ingenuity? Such questions may leap to the fore as you make your way through Poetry of the Universe: A Mathematical Exploration of the Cosmos (1995).
The idea for this book, Osserman tells us, began as a course at Stanford. A colleague of his once posed the question, 'How is it that mathematics is such a beautiful subject, yet students can go through four years of college and never find out?' This fed into a focused course created to give aesthetic attention to the symbiotic nature of math and science. A number of subtopics–from geometry to topology to cartography to cosmology–are emotively presented throughout the book.
There is certainly something amazing about the ability of mathematics to describe this vast, wild universe. Of course, its secrets were not passed stamped and sealed through the veil of heaven to enlighten us mortals on what we could never achieve by ourselves. To the contrary, mathematics became a collective enterprise, with each successive generation adding a bit more to the knowledge of the previous. Modern mathematics owes a great deal to ancient Greece. In particular, the legendary triumvirate of Pythagoras, Euclid and Archimedes were among the first to traffic in theorems and proofs, rendering viable such feats as determining the shape and circumference of the Earth. (Osserman explodes the myth about Medieval sophisticates thinking the Earth was flat; they didn’t.) Extraordinarily, much of their work has stood the test of time and continues to form the foundation of several fields of study today.
When the curtain fell on ancient Greece, the adventure was just beginning. Mathematicians in the Middle Ages would later recover the Greek classics and inaugurate a whole new era of esoterica. Indeed, the story of mathematics is littered with more abstract theory than anything else. Though we use it to model and describe our universe to an “unreasonably effective” degree (per Eugene Wigner), much of it has no connection whatever to anything we find in nature and operates quite independently of the physical sciences.
Osserman revisits some of the mighty moments, linking the efforts of Euler, Gauss, Lobachevsky, Bolyai, Fermat, Riemann, Minkowski and Einstein, whose intrepid excursions into the arcane would occasionally reap massive payouts on the practical side of things. Bernhard Riemann's contributions to differential geometry laid the groundwork for Einstein's general relativity. Standing on the shoulders of their predecessors, Maxwell and his equations ushered in radar, radio and television, while Newton’s and Einstein’s completely revolutionized our picture of physics and cosmology. Our modern understanding of the observable universe is indebted to mathematics in the same way Homo sapiens and its common ancestors are beholden to the Chicxulub impact.
Mathematics as a Lingua Franca
It is not easy for some people to think mathematically; it has its own structure, its own grammar and its own jargon. Much of the book touches on conceptually very difficult areas, such as curvature and geodesy, and it takes a skillful communicator to convey them to the nonspecialist without devolving into indecipherable froth. Unfortunately, Osserman is uneven in this regard. He rushes through too many topics, which is doubtless a symptom of the extreme brevity of the book but isn’t alleviated by his roughshod presentation.
While the many illustrations are handy, they won’t do much without a solid background in abstract, non-applied mathematics, in particular geometry and topology which absorb roughly three quarters of the book. This is ultimately a flaw fatal to the book’s theme, as a true appreciation of mathematical elegance requires at minimum an understanding of the underlying ideas. For those lacking firm footing in these areas, expect to do a lot of companion reading to resolve your inevitably many clarifying questions.
Expansion and Contraction
His tie-ins with cosmology in the latter sections of the book fare better. The excitement level trebles as he undresses the Big Bang and the interplay of intelligence that led to its formulation. The canonical astronomers of the early 20th century, Edwin Hubble and Georges Lemaître, paired their pioneering spirit with Einstein’s relativity equations to derive the redshift-distance relation and hence the basis of the Big Bang model of the universe. This was a lively time for astronomy and for anyone interested in deep time, and the book would have benefited from giving more space to this era and its many achievements.
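For readers who haven't encountered it, the redshift-distance relation the review alludes to (now commonly called Hubble's law, or the Hubble–Lemaître law) is strikingly simple in form:

```latex
v = H_0 \, D
```

Here \(v\) is a galaxy's recession velocity (inferred from its redshift), \(D\) is its distance from us, and \(H_0\) is the Hubble constant. The proportionality itself is the profound part: every sufficiently distant galaxy recedes at a speed proportional to its distance, which is exactly what a uniformly expanding universe predicts, and what Planck and other missions have since refined by pinning down the value of \(H_0\).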
Given the publication date of 1995, there is also a good amount here that is outdated. The book was released three years before the Nobel Prize-winning discovery of the accelerating universe and eighteen years before the landmark release of ESA’s Planck data, which revised the Hubble constant as well as the age and overall composition of the universe. And there is curiously no mention of cosmic inflation, a key model within Big Bang cosmology that recently received powerful confirmation from a discovery at the South Pole. (NB: Corroboration still pending) No mention of Alan Guth and eternal inflation; no mention of Andrei Linde and chaotic inflation, which is made all the more curious considering the book lingers so long in geometry territory and inflationary expansion certainly has some rather remarkable geometrical ramifications.
It’s also interesting to hear his skepticism on how far back to the beginning of time we will be able to reach, in which he notes that our curtain suddenly drops in the vicinity of the Big Bang. If the 2014 announcement of gravitational waves detected in the cosmic microwave background holds up, portions of this book would benefit from an update.
Osserman’s Poetry of the Universe is the story of man’s obsessive affair with the mathematical and the riches of possibility. Explorations in mathematical space subsidize our inquest of the cosmos, sparking new opportunities in the physical as well as the mental realm. The book shines when profiling the key players along the road to us and contextualizing their breakthroughs. Where it falls short is in its explanation and presentation, which are too technical, too terse and too scattershot for introductory readers to piece together. Even with the supplemental notes provided in the back, this is a challenging read not recommended for the mathematical neophyte. For better and more up-to-date treatments on the intersection of science, math and beauty, see Brian Greene’s The Fabric of the Cosmos and Hawking’s A Brief History of Time.
Note: This review is republished from my official website. Click through for additional footnotes and imagery.
"Humans have brains that are built to work on anecdote rather than real data." [Jeffrey P. Utz, MD]
Man's relationship with drugs has been one long and turbulent ride. Since first contact they've been consumed, smoked, inhaled and injected by humans for everything from their mood-altering effects to their analgesic properties. Over the years and across many cultures, the legal status of nicotine, ethanol (the ingredient that gives alcohol its intoxicating highs), cannabis and a congeries of other psychoactive substances has oscillated from complete prohibition to medicinal use only to unrestricted recreational use, with some blurry shades in between. In the United States, the posture on narcotics is one of quick suspicion and reproach, tilting lopsidedly toward criminalization. Columnist Jacob Sullum explores this general attitude in Saying Yes: In Defense of Drug Use with a data-centric focus, asking whether existing drug policies and popular perception are sensible or whether the conversation has been hijacked by false prophets and disinformation.
After canvassing the world religions for their perspectives on various euphoriants, Sullum transitions to the pith of his ensuing narrative: the statistics on drug usage. Any attempt to dissect the war on drugs must at minimum involve a familiarity with the hard data on psychotropic substances both licit and illicit. Sullum provides this in spades, chronicling usage statistics and dozens of studies on marijuana, methamphetamine, LSD, MDMA (ecstasy), PCP, cocaine, crack cocaine and heroin.
Anti-drug activists rely primarily on a two-pronged prosecution: 1) an inordinate amount of crime is caused by drug use, and 2) moderation is impossible, or nearly so. But are these persevering assumptions tenable? What does the research show?
The literature simply does not support these claims, no matter how loudly the media and abstinence hounds continue to repeat them. Above all, Saying Yes takes aim at the myth of "voodoo pharmacology", which Sullum defines as the notion "that drugs control people rather than the other way around" (p. 268). The idea that drugs in some tangible or abstract sense rewire the brain and neutralize free will is certainly a valuable defense for a user who stands convicted of a crime, but it reflects a misimpression of the particulars of substance dependence (the medically preferred term for 'addiction') as well as a failure to zoom out and examine precursor effects.
"...the conventional understanding of addiction, which portrays it as a kind of chemical slavery in which the user's values and wishes do not matter, is fundamentally misleading." (p. 27)
What the data indicate is that actions and behaviors while under the influence of narcotics, and the level of substance dependence achieved, have more to do with the user's preexisting psyche and his or her sociocultural circumstances than anything inherent to the drug itself. Someone down on their luck not only may be more likely to use in the first place, but their personality, ongoing situation and expectations of taking the drug tend to steer the attendant psychoactive effects in consistent, if not altogether predictable directions.
"...it's clear that happy, well-adjusted people are less likely to get into trouble with drugs." (p. 280)
When we take a look at crimes alleged to be the result of drug use, we consistently find that the perpetrator had a history of crime, violence and/or antisocial tendencies. Rarely do we find a pattern of nonviolence abruptly interrupted by a drug-fueled spree of immoderate behavior and crime. If any one drug had a direct tendency to warp human behavior in ways that enact unprecedented transformations on the part of the user, then we should expect to find a distribution of users with dissimilar backgrounds exhibiting similar behaviors and committing similar acts while under the influence of said drug. We do not find this: the crimes and abuses of drugs tend to have more in common with preexisting conditions and proximate circumstances. Thus contrary to the spurious connections found in mainstream rhetoric, recreational drugs do not regularly turn docile, reticent individuals into unfeeling, reckless, sex-crazed brutes, or vice versa.
The Golden Mean
On the second point almost universal to drug prohibitionist rhetoric, the copious data indicate that most narcotics users (the statistics are more or less consistent across all drugs surveyed) partake in moderation and lead functional, productive lives. Questions over how to define 'moderation' can be disambiguated by looking to alcohol as a proxy. All of us know people who consume alcohol more than occasionally without manifesting life-destructive patterns, or do so ourselves. Just as most are not burdened by their forays with alcohol, the data-driven profile of the typical narcotics user is someone who indulges their desire for certain substances while remaining fully moored as a functioning unit of society.
"The portrait of predictable escalation from experimentation to an unbreakable habit was wrong when it was applied to drinking, and it is no less mistaken as a description of illegal drug use; users of even such reputedly powerful substances as heroin, crack, and methamphetamine do not typically become addicts." (p. 27)
To understand why moderation is the rule and not the rarefied exception requires nothing like the bad-faith moral authority deployed by anti-drug lobbyists. It merely requires we approach the question in terms of benefits and risks. Often enough, one's desire to return to the drug is overridden by one's commitments to family, friends and other social and occupational obligations. Conversely, those with low economic security and no social safety net are more likely to develop drug dependency issues. Much as with alcoholism, without a résumé of responsibilities to attend to and other societal pressures in place to curb excess, it becomes decidedly easier to fall prey to overuse. Yet for the vast majority of more than one-time users, their drug of choice yields temporary highs, in between which they are free to pursue the activities that fill up the remainder of their day without negative interference. While the war on drugs tends to dichotomize usage in terms of extremes, there is clearly ample space between abstention and enslavement.
Deconstructing the Dysmorphia
Given the limpidity of the data on narcotics usage, what can explain the abiding cultural stigma? A lot of it, Sullum suggests, is grounded in misjudgments by people who have never used the drug in question (or recreational drugs in general) and who jump to hasty conclusions based on superficial connections. 'Since Tom is a junkie and has done nothing with his life, drugs must be to blame', or 'I see what drugs have done to this person and I don't want it to happen to me.'
But it is the media that have reinforced these latent preconceptions by playing up anecdotes and downplaying the (admittedly less captivating) empirical side of the equation. Sullum lays special emphasis on alcohol throughout, one of humanity's eldest drugs, keeping it close at hand as a benchmark while assessing the risk of currently targeted illicit substances. After all, alcohol went through its own prohibition period, and over the decades precisely the same arguments once leveled at alcohol to keep it off the shelves have been redirected at the menace of the week. Indeed, the statistics on crime and violence can be easily torqued to incriminate alcohol, too. Just as the media played central roles in both the denunciation and the legalization of alcohol, so it plays equivalently critical roles in the domain of nonalcoholic substances. The more of the silent majority that opens up about its drug habits, the quicker entrenched stereotypes will dissolve.
Another key to changing public perception is attention from the medical establishment. There are a number of accepted therapeutic treatments that overlap significantly with narcotics in terms of the ailments they target as well as their pharmacological effects. One prodigious class of drugs, for example, is prescribed exclusively to treat mental disorders. On a conceptual level, the disorders many of these drugs purport to treat are only hazily distinguished from the reasons users turn to illicit substances with similar pharmacological profiles. Ecstasy, for example, is often taken to improve one's experience in social settings, and there is no evidence that MDMA carries greater substance dependence risks than commonly prescribed remedies.
"Taking MDMA to overcome shyness is drug abuse, but prescribing Paxil to treat "social anxiety disorder" is good medicine. Legally, the distinction between medical and nonmedical is clear. Conceptually, it has never been blurrier." (pp. 252-253)
Marijuana, once characterized as a gateway drug on the road to ruin, is of course well on its way to mass legalization. Its medicinal value as an antiemetic and antispasmodic and its subtle analgesic effects have led to relaxed legal controls in the U.S. As of this writing, marijuana possession and sale is legal in twenty states and the District of Columbia.
The Exacerbations of Embargo
Toward the close of the book, Sullum spends some time analyzing how usage patterns might shift following a removal of the drug ban. For one, any time a product is sequestered to the black market, the opportunities for violence multiply. A move toward decriminalization would reduce crime associated with sales, acquisition and distribution. Second, unregulated markets translate to heavily inflated prices. Heroin, for example, can cost forty to fifty times more than a legal dose would, making illicitly obtained substances more likely to drive users into economic instability and social and physical degeneration. Third, absent regulation, a whole miscellany of hazards comes into play, including dummy doses, poisoned doses and overdoses. Fourth, due to the high costs, heroin and other opiate habitués are more likely to inject the drug—a more efficient and reliable method of uptake—raising the potential for transmission of diseases through shared injection materials.
In a post-criminalized world, the aforementioned societal influences would remain intact, likely ensuring that current patterns of use would continue. And on a more philosophical level, the repeal of our regulatory apparatus would force us to come to grips with the dark side of human nature that feeds the pernicious behaviors for which drugs have so often been scapegoated.
Saying Yes isn't the lengthiest book on the topic, clocking in at just 284 pages, but Sullum manages to cram a Titanic's worth of useful information into its slender form. It is a balanced, data-sensitive look at an uneasy and highly controversial topic. In navigating the divide between use and abuse, Sullum forthrightly avoids the monochrome portrayals of drug users prevalent in media caricatures in favor of thorough, wide-angle analysis. Many of my own preconceptions were challenged and tossed aside. Read the hard statistics on "hard" drugs and decide for yourself whether the criminalization paradigm is worthwhile, whether it creates more harm than good, and whether psychoactive effects reflect more the properties of the drug or the individual.
As I progressed and read about how (and how quickly!) public perception has changed with respect to certain drugs, how stereotypes and stigmas have rotated among different drugs at different times, in some cases even different classes of drugs, and how media and social cues completely parameterized the debate and national discourse, the more I recalled the quote preceding this review. We are wooed by anecdotes and tranquilized by data. Efforts like Saying Yes have the potential to reverse this asymmetry.
Perhaps the only knock against the book I could muster is Sullum's early fixation on religion's history with forbidden substances, which read like an extended preamble and seemed far from the regime of relevance to the war on drugs. Quibbles aside, Saying Yes should be requisite reading for an informed discussion on the longstanding war on drugs, for relating to friends, family and relatives who may be recreationally friendly with narcotics, and for anyone considering incorporating them into their own regimen.
Note: This review is republished from my official website. Click through for additional footnotes and imagery.
"An excellent quest it mostly turns out to be. It’s no spoiler to report that the author doesn’t return, like Ernest Hemingway with a marlin, with a unified theory of everything. “Why Does the World Exist?” is more about the nuances of the intellectual and moral hunt."
Ken Follett spares not an ounce of genius in bringing his characters to life and weaving them together in electrifying narrative. His artistry is one that burdens the reader with sorting the protagonists from the antagonists, enriching each character's complexion and back story with such brio that you may just end up pulling for the whole lot to triumph as the novel winds to a close.
The Man From St. Petersburg is of course no exception, with Follett's tried and true, World War era-themed cat and mouse thriller once again taking center stage. Ever the epicure of historical fiction, Follett treats his settings with care, honoring the historical minutiae and injecting them with multiple shots of hair-raising drama. The global tensions and unease surrounding the two largest global engagements to date provide blueprints aplenty for building an engrossing alternate timeline. Situated in the lead-up to the First World War in 1914 London, we find Britain pushing to secure an alliance with the Russian Empire. War seems all but inevitable, and intel indicates a low chance of Allied success unless the whole of the Triple Entente is prepared to throw their martial weight against Germany.
The man chosen to represent Russia in the negotiations is the courtly admiral Prince Orlov, nephew to a British aristocrat named Lord Walden. Importuned by no less a figure than Winston Churchill, Walden is tasked with brokering the secret bond and saving his nation from impending defeat. A delicate assignment, no doubt, but one made all the more perilous by a shrewdly intelligent and combat-adept anarchist, whose life is interwoven with the Walden household's in variously surprising ways.
Enter Feliks Kschessinsky, who might just be the most unforgettable covert agent this side of Jason Bourne. The Russian idealist is fed up with his mother country's penchant for embroiling its citizens in wars they have no choice but to fight, and vows to sever the alliance talks with Great Britain by assassinating the admiral. Fearless yet stringently cautious, unflinchingly determined, almost too capable of evading his pesky pursuers, and ornamented with the occasional flash of charisma and sensuality, Feliks is the cloak-and-dagger character you just can't help but cheer for. (If you're a pacifist at heart, you may have all the more reason to get behind him.) His frequent bouts with Walden and the full armada of the British police force ratchet up the intensity as the walls close in around the Muscovite assassin. But Feliks finds help in the most unexpected of places...
Beyond the instant allure of Feliks and his skirmishes with Walden and company, Follett has also arranged equally enticing female leads who are not subordinately tossed in but who command central roles in the narrative. Walden's wife, Lydia, whose Russian past is dredged up in plot-twisting fashion, and their daughter, Charlotte, with her closeted upbringing and later affinity with the suffragette movement underway in Britain at the time, round out the exquisite cast. There isn't too much that can be shared about these two characters without giving major plot shifts away, but their presence is integral to the whole, and they compete with Feliks on every page for rights to the most memorable character.
Follett's 1982 thriller has a lot to offer: the international intrigue of anarchist subversion hurled against the British secret police, an endearing and dynamic cast, Ludlum-esque chase scenes, sensual but not at all gratuitous sex, and masterful pacing and pitch-perfect dialogue, all encased in a historical backdrop that will lend the reader an osmotic familiarity with prewar London. Sure, a few of the plot turns are a bit too sharp and the escape sequences faintly implausible, but the gripping prose and fluorescent cast are more than adequate to keep you anchored firmly to your seat. It may not be as polished around the edges as Eye of the Needle, or as seductive as his massively medieval opus, Pillars of the Earth, but Follett's The Man From St. Petersburg is surely just as absorbing, insisting you delay that next meal just a little while longer so you can see how the current scene plays out. This is smooth escapism, clothed in classic Follett garb.
The only question that remains: which character will you root for?
“We’ve arranged a global civilization in which most crucial elements profoundly depend on science and technology. We have also arranged things so that almost no one understands science and technology. This is a prescription for disaster. We might get away with it for a while, but sooner or later this combustible mixture of ignorance and power is going to blow up in our faces.” (p. 26)
The omen above was put to print in 1995 and echoed throughout Carl Sagan’s prolific career as both practitioner and communicator of science. Immersed in a world so joined at the hip to science and technology, Sagan saw denial and ignorance of science as the greatest risks to human well-being and continuity. Is the past here to stay?
In the United States at least, conditions are none too sunny. Nearly 7 in 10 believe that angels and demons are active in the world. 61% and 48% believe in ghosts and UFOs of extraterrestrial origin, respectively. More than half (56%) deny the scientific consensus on climate change. One-third of the public still waffles on the science of evolution. And over half believe that God influences the outcome of sporting events. Dr. Sagan passed away the year after releasing The Demon-Haunted World: Science as a Candle in the Dark, and in the decades that have come and gone since his oracular swan song, the American electorate seems as awash as ever in pseudoscience and superstition. As momentous, relevant and urgent as Sagan’s message was, its infiltration remains woefully incomplete.
The venerated astronomer, astrophysicist and cosmologist regularly popularized his lifelong passion for replacing delusion with fact-sensitive grandeur. His 1980 docuseries Cosmos: A Personal Voyage was such a groundbreaking moment in broadcasting because it showcased the degree to which science, if presented properly, could warm hearts and inspire minds. In The Demon-Haunted World, Sagan continued this saga, with his inimitable style intact, but with a dire focus on communicating how science undergirds the modern world, how it co-depends with democracy, and how, amid the enduring struggle for progress and survival, it is so often overshadowed by uncritical thinking and politicized agendas.
The uninitiated often maintain a warped view of science, seeing it as an arcane discipline requiring superheroic intellect, out to devour devout beliefs. But as Sagan spent a lifetime making clear, science isn’t just for scientists. Every one of us can revel in its fruits and be won over by its infectious appetite for discovery. Most important, we can all benefit from applying the philosophical principles on which it rests to our everyday lives. What are those principles? The twinship of skepticism and trained observation, fueled by an overarching preference for the truth, however inconvenient, over the psychologically comfortable. Science is far more than the cold collection and interpretation of data; it is a way of thinking, an approach to the world that values the questions as much as the answers and has built-in tools for prioritizing both.
“Some may consider this an overbroad characterization, but to me every time we exercise self-criticism, every time we test our ideas against the outside world, we are doing science. When we are self-indulgent and uncritical, when we confuse hopes and facts, we slide into pseudoscience and superstition.” (p. 27)
Donning this intellectual apparatus full-tilt may occasion us to revisit ideas we once accepted without any skeptical filter or require we discard some beliefs long held dear. And in taking the plunge, Sagan encourages, we often find that nature is far more clever, subtle and adept at inspiring wonder than our fallible pattern-seeking devices can imagine. “Better the hard truth than the comforting fantasy. And in the final tolling it often turns out that the facts are more comforting than the fantasy…There are wonders enough out there without our inventing any." (pp. 59, 204) While this may be new cognitive territory for some of us, the benefits are too vast to pass up. A sharp mind keeps the charlatans at bay.
Key to how science delivers the goods is its unmasking of natural processes to arrive at natural explanations. We may recall how our ancestors ascribed various features and bugs of our existence to supernatural causality: witches inflicted sickness with their spells; rain was a divine reward, drought a divine punishment; earthquakes were just the local god(s) stomping around in fits of rage; the “rising” and “setting” of the sun was controlled by the whims of the neighborhood deity; short-period comets presaged the fall of state empires.
The advent of science severed the agency-focused paradigm. We learned that the ebbs and flows of celestial bodies mind predictable, calculable patterns. We learned that weather events are beholden to entirely terrestrial phenomena. We learned that transmissible disease is carried by microbes and other agents in our environment. We learned that the right medicine can cure an illness. The implications were radical: if an illness were truly caused by the spell of a witch, there would be no reason to expect a natural cause for it, nor any reason to expect a natural cure. But in fact, it turned out that the right remedy could always overcome the power of “magic spells”. In a one-way phase shift, super- and non-natural explanations were rendered obsolete, buoyed by an acute awareness of our propensity to overinterpret reality.
“For much of our history, we were so fearful of the outside world, with its unpredictable dangers, that we gladly embraced anything that promised to soften or explain away the terror. Science is an attempt, largely successful, to understand the world, to get a grip on things, to get hold of ourselves, to steer a safe course. Microbiology and meteorology now explain what only a few centuries ago was considered sufficient cause to burn women to death." (p. 26)
Respect for this approach has not been universal, as a handful of minutes with mainstream media will avouch. In a world overflowing with pseudoscientific madness, Sagan divides his time between conveying the method and blitzing specific manifestations of the irrational. He casts his gaze on a whole armamentarium of woo, including creationism, crop circles, faith healing, astrology, psychics, UFOs and alien encounters. Is there anything at all behind these claims that can connect them to reality? Not if skeptical inquiry has anything to say; such notions find a vacuum of support inside, as Sagan wittily remarks, “any universe burdened by rules of evidence." (p. 58)
We learn of how two enterprising hoaxsters from Southampton fooled millions of credulous onlookers into believing that patterns in cornfields were cryptic messages from off-world. We listen in on the exploits of James Randi, who once outfoxed Australian media with video documentation of a “channeler”. Our talent for deceiving ourselves is on full display as Sagan recounts the initial frisson of seeing “faces” on Mars and assesses the merits of UFO claims from perhaps every conceivable angle. (As a pioneer of exobiological research, it’s no surprise Sagan devoted such sizable chunks to debunking UFO conspiracy tales, but he could have dialed it down a notch or two.) In turn, astrology and biblical creationism sport the same empirical garb as alchemy and witchcraft. (Quickly! Someone get Answers in Genesis on the phone.) From séance mediumship to 'spirit photography', the counterfeit carousel requires similar ingredients to survive: “what they need is darkness and gullibility." (p. 241)
Democracy and the Future
Why, Sagan asks, haven't the contrails of science seeped into the inner recesses of society and taken hold of our discourse and policy? A look to the past tells us that commitment to these ideals has waxed and waned over time, surfacing first and most clearly in ancient Greece in the form of natural philosophy. Greek antiquity’s mental preoccupation with nature was distinguished by an express concern with natural cause-and-effect explanation, checked against homegrown rules of logic and deduction. This marriage of reasoning and observation nourished some extraordinarily precocious activities. Sagan charts the achievements of early polymaths like Eratosthenes, who in the 3rd century BCE measured the circumference of the earth, its axial tilt, and its distance from both the sun and the moon, all with remarkable accuracy; Aristarchus, who presented the first known model of a sun-centered cosmos; and Democritus, who was the first to offer an atomic theory of the universe and is often considered the “father of modern science.”
Later societies yielded intermittent deviation from the systematic acme of Athens as triumphs gave way to enshrined overindulgence of superstition and as nationalistic fervor billowed to abnormally toxic levels. Beyond our undersized prefrontal cortex and the diversiform predispositions underwritten by our evolutionary heritage, at the heart of these setbacks lay the institution and its doctrinaire approach to knowledge. Both religious and secular governance can boast of choking free inquiry, stamping out critical investigation of the cosmos, and cultivating an infrastructural incapacity for nurturing the open exchange of ideas. Whenever and wherever this happens, humanity falters, the mind capsized under the crushing weight of tyranny. And like a derailed traincar, we inevitably throw ourselves headlong into state-sanctioned superstition and unreason.
Science cannot prosper under these conditions. It stultifies and stagnates. Democracy ensures the efficacy of science insofar as it ensures all voices are heard. Science and democracy reinforce one another in this way; science depends on democratic values to function, while democracy depends sensitively on science to maintain its selected way of life, in everything from informing policy to keeping infrastructure in motion.
After spending ample time surveying the overwhelming science illiteracy and innumeracy in the States, again and again Sagan returns to the point that democracy is unworkable in this environment. Uninformed citizens cannot cast informed votes. The shrieks of the ignorant become the shrieks of the next generation, who often adhere to the ideological persuasions impressed by their sheltered upbringing. So before we rebuff the allegation that beliefs in pseudoscience are harmless, we must be open to recognizing how they are emblematic of a larger infirmity. We need open-minded, critically thinking, intellectually equipped individuals exercising their constitutional duty and voting on the policies that will give shape to the parameters under which future generations may thrive or fall.
“A proclivity for science is embedded deeply within us, in all times, places and cultures. It has been the means for our survival. It is our birthright. When, through indifference, inattention, incompetence, or fear of skepticism, we discourage children from science, we are disenfranchising them, taking from them the tools needed to manage their future.” (p. 317)
Sagan’s penultimate work is packed with diverse subject matter. Much more than an impassioned defense of science, The Demon-Haunted World meanders through philosophy, history, politics, religion and grin-inducing exposés of claims to reality that just aren’t so. Sagan acknowledges the imperfections of science, along with those of every other human endeavor, while urging that it is the best we have: when it comes to understanding how the world works, science seizes the epistemological crown. It is also a clarion call to the coming generations: we disregard science at our own peril. With mounting concerns over a warming planet, alternative energy sources, overpopulation, and the most forward-focused way to preserve our pale blue dot, we cannot afford to treat its revelations with insouciance. Every human should read this book.
Sagan also holds a special place in my own intellectual journey, reviving a pulse which continues to reverberate in my life, something subliminal yet concrete. His books unshackled my imagination. His words spoke for me. He gave me a voice. A man of great passion and fierce intellect, he had the uncanny ability to ambush the heart with an equal measure of poetry and humble curiosity. His words can be understood by anyone who takes the time to read them. Carl synthesized my deepest thoughts and pointed me toward new horizons. He opened my eyes to a post-religious ethos and, more than anyone else, inspired me to abandon the intellectual celibacy of my youth and secure a personal relationship with reality and the cosmos. If Sagan communicated anything, it’s that science is a unifying enterprise, something in which all of us can partake. Together with reason it is among the greatest tools in our survival kit. Let’s keep them burning brightly.
“I worry that, especially as the Millennium edges nearer, pseudoscience and superstition will seem year by year more tempting, the siren song of unreason more sonorous and attractive. Where have we heard it before? Whenever our ethnic or national prejudices are aroused, in times of scarcity, during challenges to national self-esteem or nerve, when we agonize about our diminished cosmic place and purpose, or when fanaticism is bubbling up around us – then, habits of thought familiar from ages past reach for the controls.
The candle flame gutters. Its little pool of light trembles. Darkness gathers. The demons begin to stir.”
Note: This review is republished from my official website. Click through for additional footnotes and imagery.
“Giants are not what we think they are. The same qualities that appear to give them strength are often the sources of great weakness.”
Big insights are rare commodities. That is, unless you happen to be cycling through Gladwell territory, where tucked away inside every myth, anecdote or counterintuitive result is a profound lesson about the human condition. This is harmless enough when confined to the fiction aisles of your local library, but Gladwell presents his ideas as scientifically respectable, even moving (well) beyond the academic literature and into high-concept generalization. And this is what gets him in the hot seat.
By now we're all familiar with Gladwell's tried-and-true formula: packaging a mixed bag of vignettes that loosely revolve around a common theme. His latest book, eponymous with the biblical tale implanted in every child's vocabulary, serves as the hub around which his assorted case studies can network. The humble stone-slinger slays the mighty warrior–the archetypal underdog story. Except Gladwell upends this classic tale and contends that any ostensible disadvantages on the part of David were actually aptitudes, and vice versa for Goliath. After some offbeat meta-analysis of the Hebrew account, he intones the book's central theme:
“There is a set of advantages that have to do with material resources, and there is a set that have to do with the absence of material resources – and the reason underdogs win as often as they do is that the latter is sometimes every bit the equal of the former.” (pp. 24-25)
Now there's a heartwarming idea giftwrapped in persuasive prose. Named one of Time's most influential people of 2005, Gladwell merits the lavish acclaim from the popular press for his ability to craft absorbing narrative, but the accolades stop there. His cultivated habit of extrapolating grand truths from flimsy research has proved time and again his Achilles' heel. The anecdotes he recounts stand by themselves and gain nothing by being tangled in with pop-psychology, especially when there is actual peer-reviewed research into many of the areas Gladwell touches.
Gladwell leans all of his weight on a small-sampled (read: underpowered) study which found that less legible fonts activated higher performance in students, presumably because the students worked through the problems more slowly in order to decode the typeface. Yet, as psychology professor Christopher F. Chabris points out, Gladwell conveniently omits any reference to the replication studies (of superior statistical power) that failed to reproduce this result. That the study Gladwell makes such heavy weather of is plagued with sampling, selection effect and other confounds does not bode well for his thesis.
Pesky details are of trifling importance to Gladwell, who steers the data in directions both cliched and maudlin. Were we to find life on the moons of Saturn, he would undoubtedly apply his guiding principles there as well; so sure is Gladwell that this obscure study holds the key to overcoming life’s obstacles that he applies it to dyslexia, claiming that inbuilt disadvantages generate asymmetrical success stories. Replication results notwithstanding, are we to really believe that the authors of the typeface study suppose their findings reveal some deep lesson about turning adversity into triumph? Gladwell may be a lot of things but mantic isn’t one of them. You could fill a supertanker with all of the sociological and psychological layers pushed to the periphery in favor of his handpicked fiction.
The balance of his thesis rests on self-reported anecdote by real-life Davids who faced down their own Goliaths. There is nothing so prescient as hindsight, especially when other factors stab at the provocative narrative hanging in the balance. Complications be damned: it can only be adversity that produces successful outcomes, which, curiously, seems to knock against his thesis in Outliers (2008), which replaces adversity with caprice and right-place right-timeness. How many rabbits will he pull out of the same hat before even his unsuspecting readership cries foul?
And it gets worse for Gladwell. One need not inventory his entire catalog, as his scuffles with internal contradiction can be found in adjoining chapters of David and Goliath. After tagging dyslexia as a "desirable difficulty" that breeds success, he juxtaposes a story about a would-be scientist who dropped out of school because she encountered undesirable difficulties at a renowned university. By relying on Gladwell's vacuum-packed logic of the preceding chapters, one is left scratching his head as to why the "small-fish-big-pond" schematic wasn't parlayed into the successes afforded the world's dyslexics. For Gladwell, it seems, difficulty is a good thing until it isn't, which smells as fallacy-laden as the aphorism 'If you're not early you're late.'
Or as Christopher Chabris puts it in his blue-chip review in the Wall Street Journal:
"The idea that difficulty is good when it helps you and bad when it doesn't is no great insight."
Even his deconstruction of the David and Goliath bout diverges from the core theme he attempts to build throughout the book. Gladwell tells us plainly that David actually had the sizable advantage since lithe marksmen were always favored over cumbrous infantrymen clad in heavy armor. Within the conventional confines of ancient warfare, it was Goliath who was the decided underdog. Why then did this strategic imbalance, along with his hypothesized acromegaly condition, not spur Goliath to victory as it did the dyslexic CEOs or Vivek Ranadivé’s middle school basketball team, or steel his resolve as with the near-miss survivors during the Battle of Britain?
For a more hyperbolic but just as on-target commentary, see Daniel Engber's review in Slate:
"The notion that a rule holds true except for when it doesn’t runs through David and Goliath, and insulates its arguments from deep interrogation. Is it really advantageous to have severe dyslexia? Yes, and certainly not. Are children better off without their parents? Don’t be silly, but it could be so. These non-answers rub the dazzle from Gladwell’s clever thesis statements, until they all begin to look like dullish intuition. We don’t need another book to tell us that adversity can lead to greatness (see: memoirs by CEOs, episodes of The Moth, every college essay ever written), just as we don’t need another book to say that adversity really, really sucks (see: the world outside your window). But couched in the golden armor of anecdote, Gladwell’s overgrown ideas seem powerful and new."
What's perhaps most unforgivable about Gladwell's tactics is that he, whether wittingly or no, ignores the relevant research into the very topics he covers. For example, why not mention the ongoing research on the comparative value of attending elite vs. non-elite schools? Why not discuss the actual research on age-specific impairments of dyslexia? Or why not, even if only in one of his many footnotes, source the relevant literature for the science behind "desirable difficulty"? Probably because this complicates the narrative. But drawing on unrelated research that supports your thesis at a minimally conceptual level while ignoring the more relevant research extant is what we call cherry-picking. Subsequently presenting this research as though it models macroscale principles about the nature of reality is shifty behavior, to put it mildly.
Even with the outpouring of negative reaction from various quarters of the blogosphere, Gladwell's literary talents remain as agile as ever. His ability to locate and draw out the most moving aspects of the human stories he collects is on full display here. But this savoriness is perpetually undermined by his messy forays into social science, in which he scavenges the academic literature for nuggets that can be spun into a thematic web. In David and Goliath, this web comes unraveled into muddled contradiction. The quickening stories he inflects through fluid prose work well on their own, even in their romanticized state, but Gladwell seems unable to fight the compulsion to reach for obscure, underpowered, unreplicated studies to which he can fasten his larger than life lessons. Despite this awkward matrimony often borne out in the pages of his books, I suspect most readers are none too interested in Gladwell's fidelity to scientific wisdom but rather are lulled by the melody around which all good storytelling harmonizes.
Note: This review is republished from my official website. Click through for additional footnotes and imagery.
"Between language and the theory of nature there exists therefore a relation that is of a critical type; to know nature is, in fact, to build upon the basis of language a true language, one that will reveal the conditions in which all language is possible and the limits within which it can have a domain of validity." (p. 161)
There's no need to beat around the bush: The Order of Things is, bar none, the densest read on my shelf to date. Philosophy tyros steer clear; an entry-level text this is not. To say that this was as difficult to read as it was to understand would be a heavy understatement. Snippets patterned after the one above would frequently invite two- and three-peat readings to absorb before moving on to the next, equally demanding line of Foucaultian esoterica. Michel Foucault, writing in the French philosophy tradition, is touted as a librarian of ideas, and his works demonstrate such canonical breadth that they are surely not intended to be consumed in isolation. Indeed, you had better have a working understanding of the systems of knowledge throughout Western history if you stand any chance of deconstructing this significant opus.
Foucault's acumen and seemingly bottomless knack for depth are on full display in this, his most ambitious work and the one that propelled him to stardom. However, even with a solid grasp of philosophy and the pivotal shifts in Western thought, you must then also place these insights within the tramlines of the baroque prose Foucault has prepared. Similitudes, resemblances, representation, significations, character, the analytic of finitude, empirico-transcendental: familiarity with this recurring terminology will be critical if one is to grok the landscape Foucault has delicately painted.
The Order of Things: An Archaeology of the Human Sciences (1966) is nothing less than a genealogy of ideas, an intellectual ancestry of the Western mind. Along the way, Foucault somehow manages to retrace the entire development of science, restricting his analysis to a specific slice of spacetime: European culture since the 16th century. It is a work so daunting in scope, and so winged in its execution, that it seems to relish keeping the mind in a perpetual state of entanglement, sputtering, caroming as you eagerly await a resting point to collect your wits and proceed further into the well. He blinds you with brilliance, and insists that you see. Foucault ricochets between the intellectual giants of the Western world in rapid-fire fashion, traipsing from Spinoza to Descartes to Kant to Marx, Freud and Adam Smith, to Nietzsche, seemingly all while assuming on the part of the reader a dissertation-level of intimacy with each. Come prepared.
As I understand it–and I am most emphatically not claiming that I do–Foucault is demonstrating that there do exist traceable patterns in the great developments of Western thought in terms of limits, possibilities and approaches to new and old knowledge, but also discontinuities and breaks from old ways of thinking. How "clean" these breaks were is of course a matter of debate. He focuses in on three domains–linguistics and philology (language), biology (life) and economics (labor)–emphasizing how the intellectual boundaries present in each historical era shaped how man thought about these venues and how they approached and reflected on new developments and discoveries that pervaded our consciousness. Whether we were categorizing or taxonomizing, articulating or deconstructing, we operated in the epistemes confined to our period of history, but also turned toward new modes of discourse as ideas emerged out of the Western world's interminable, civilizational march.
There is also the niggling question of "man" and how and where s/he figures into the whole grandiose state of affairs. Foucault seems to be arguing that man, like everything else, is a historical construct, and its relation to the order of nature pivots according to developments in each area of inquiry, including but not limited to, the human sciences. That is, man's interpretation of man is a product of the historical development of the spaces that have most dominated the human intellect, viz the human sciences of (proto-)biology, anthropology and psychoanalysis, the social sciences of economics and labor, and, most intricately, the all-enveloping force of language, which is coextensive with every sphere with which we make contact. Certainly, man's shifting coordinates within the grid of knowledge and human inquiry is of special emphasis here in Foucault's sweeping manifesto.
In the closing sections, Foucault hints toward a new episteme, something that is ill-defined, turbid, hazy but which carries all the signs of a break from what came before. He doesn't specify with any precision what this branching episteme consists of, or which domain(s) has largely catalyzed its brachiation, but he seems to think it is imminent as a reflection of the mid-20th century moment Foucault occupied.
A work like this is one which eludes classification, much as the centerpiece of the book itself–man–resists arrangement in its relation to human knowledge. The Order of Things is simply, and not so simply, sui generis, transcending the common boundaries of empirical disciplines and even philosophy. Foucault's writing is ornate, painstakingly precise in places yet frustratingly ambiguous in others (so much so that, like me, you might desire the opportunity to stop every now and again and ask questions). I wish I could say that I grasped the book in its overarching messages as well as its more subtle analyses, but this will require subsequent readings, likely several more. If you've previously been introduced to Foucault or his French antecedents, you may be in a better position to follow along. But if you're like me, this will be a humbling read, an intellectual tour de force that incessantly reminds you how much more there is yet to learn.
For a more informed and capable post-book analysis, I recommend this page as a good starting point.
"History shows that everything that has been thought will be thought again by a thought that does not yet exist." (p. 372)
"According to Luther, reason had given Christian teaching "the French pox" (i.e., syphilis) and Aristotle was the pimp who had arranged the tryst." (p. 197)
A capable and impressively comprehensive treatment of the Protestant Reformation, including the historical movements perched above and below its moment in Western development. James Payton, Jr. is a professor of history who specializes in Reformation era scholarship. In this handy distillation Payton addresses some of the common misreadings of the Reformation–its ideas, its context, its trajectories–that so abundantly reside along the shoals of modern Protestantism. In doing so, he strays far from overlaying any private credo onto its retelling, choosing to maintain a stringently historical focus throughout. That is to say, even though his tome is published under the evangelical churning mill, InterVarsity Press, Payton pulls no punches and does not confine his inquiry to what today's Protestants might wish to hear.
This is the Reformation laid bare, warts and all, its triumphs as well as its tragedies. Indeed, a deeper excavation of this historical era may generate friction for those operating under the illusion that the Reformation was a monolithic, amicable waltz of Christian revelation and renewal. Getting the Reformation Wrong knocks down this wobbly edifice and erects in its place the multivalent presentation uncovered by historical scholarship in recent decades. It isn't until the final chapter that Payton swaps his historian hat for his self-confessed Protestant chapeau and offers some speculative wisdom on how Protestant Christendom may move forward in the face of some 30,000+ denominations and splinter sects extant.
The first misconception to which he lays waste is the notion that Martin Luther's ideas fell from heaven, as it were, sans historical context and ideological precedence. In fact, the clarion call for reformation within Western Christendom had been sounded for decades leading up to Luther's decisive act at Wittenberg. Longstanding corruption and profligacy within the Roman papacy and enfolding dioceses during the Late Middle Ages had provoked deep resentment from the outlying communities. The rancor reached a fever pitch during the late 14th through the early 15th centuries in what became known as the Western Schism: Church governance devolved to an embarrassing state of disarray as quarrels over who had rights to the Roman throne only brought ignominy to the institution. Moreover, the Medieval Period's Black Death, which rent the peoples of Europe and shaved its population by some 60-75 percent between the years 1347 and 1450, combined with religion's complete impotence to abate this paroxysm, further depleted the waning confidence in the Church. The siren of reformatio in capite et membris - "reform in head and members" - blared loudly, with anticlericalism a prevalent current at the front of European consciousness.
Mind you this was not an ideological opposition to religion writ large but rather a vehement disapproval of the rank venality in which the Roman communion was now awash. Christians of all stripes, from the Late Middle Ages to the Renaissance into which it bled, yearned for an overhaul of the Christian faith–a return to form. The groundwork for the Reformation had been laid atop a societal substrate eager to receive new ideas.
The Renaissance Revisited
A second misconstruction which Payton targets is the idea that the Renaissance and the Protestant Reformation were antithetical responses to the shared grievances toward the Church. It is often supposed in Protestant chatter that the Renaissance was fundamentally irreligious and, ergo, anticipated the Enlightenment that was to come, while the Reformation was by contrast a recrudescence of spiritual wisdom. Payton traces these misconceptions largely to the 19th century Swiss historian Jacob Burckhardt, who published a number of works characterizing the Renaissance as a humanistic, or human-centered, movement as opposed to a religious one. This view carried weight for well over a century and was eventually revealed by later scholarship as a false dichotomy.
Scholarship has now shown that the term "humanist" as used in Renaissance literature carried no philosophical implications whatsoever but simply meant someone who taught the liberal arts or "humanities". It was in fact anachronistic of Burckhardt and his confreres to retroject their contemporary understanding of humanism onto a cultural epoch with a different idiom. The prevailing view today is one that sees the Renaissance as foregoing scholasticism and dialectic theological training for more individual concerns, such as poetry, grammar, history and art. While anti-papal sentiments were indeed widespread in Renaissance rhetoric, it was not a development centered on skepticism of religion as later interpreters have often presumed. Renaissance awareness aspired to a different balance between religion and culture, but embodied no hard-lined philosophical valence at its root. In this way, the Renaissance had far greater familial connection to the coming Reformation than to the Enlightenment of the 17th and 18th centuries.
The Question of Unity
Perhaps the gravest of misconstruals about the Reformation is that Luther's thinking followed a fixed, undeviating path from the moment he proclaimed his new way forward to the moment he died, or likewise, that the Reformers were all united in their ideas for reform. Like everyone else, Martin Luther was human and, as such, saw a shift in his thinking over time. Furthermore, not all of the Reformers shared similar backgrounds; some of the major players were versed in Northern Christian humanism, while others were steeped in scholastic pedagogy, and this led to different emphases, foci and clarities. Trained in the scholastic mode of theology, Luther initially hit upon his revelatory approach to the Christian faith after some emotionally trying times during his monastic life. Even so, he did not anticipate or envisage a worldwide reformation to spring out of his ideas. They did, of course, and once his ideas were released into the wild, others latched onto them, sometimes taking them in directions he vehemently opposed.
The ferocity and zeal with which Luther and his Reformer compeers and detractors quarreled over various tenets and doctrines must have been something to behold. It was a common tactic of Luther to declaim his opponents as minions of Satan who wished to distort the truth in service to the wicked one. Here is one instance of the depths of Luther's discourtesy:
"You run against God with the horns of your pride up in the air and thus plunge into the abyss of hell. Woe unto you, Antichrist!" [From Defense and Explanation of All the Articles, pg. 87 of Luther's Works, Vol. 32]
Or the inimitably endearing:
"You are dumber than Seriphian frogs and fishes." [From The Bondage of the Will, pg. 77 of Luther's Works, Vol. 33]
One of the moving targets of these colorful debates was Luther's core idea: justification sola fide. Through his many devoted years in service to the monastic order without any emotional relief to show for it, Luther had touched upon the doctrine that salvation is grasped by faith alone. This reduction of the Christian scheme of salvation was deemed heretical by the Roman Church, which contended that such a radical view would lead to indifference or apathy toward good behavior. In defense, the Reformers were more or less united in their ecclesiastical response. They countered that faith is never solitary and that good works and virtuous deportment are a natural outworking of the belief in God's unconditional mercy and plan for redemption. Faith is never alone but a package deal, they declared.
The second delineating doctrine propounded by Luther and his contemporaries vectored around sola scriptura, which is often just as amenable to misinterpretation as that above. Echoes of the "Scripture is good, tradition is bad" dichotomy are still to be found in modern Protestant thinking. A glance at the Reformers' own words reveals this to be an unacceptably narrow contrast. What Luther and company advocated was ad fontes theology, a return "to the sources", viz apostolic teaching and the writings of the Church fathers, something that had been muddled and all but smothered in the Christian life and teaching of the day. Luther believed that much of the Christian doctrine of his time was unfaithful to Scripture as well as to the Church fathers' interpretation of Scripture. He sought to revive these ancient sources and privilege their position in the hierarchy of religious authority over against Catholicism's primary emphasis on tradition. While Luther and many of his loyalists argued that the Bible was the most important touchstone of Christian understanding, they did not insist it was the only hammer in the believers' toolkit.
I was overall very pleased with this book. Payton delivers on the task he set out to accomplish. He provides a historically focused synopsis of the Reformation, allowing the major voices to speak for themselves, and corrects along the way the sundry misconceptions operative among mainline Protestants today. The book is also fastidiously referenced, allowing readers to drill further into the source literature for clarification. I recommend this book to anyone interested in Church history and the roots of Protestantism.
Note: This review is republished from my official website. Click through for additional footnotes and imagery.
Gatekeeping in Dictionary Form
A decent reference if you're looking for an alphabetized listing of common theological parlance. Not so good if you're looking for one not colored by denominational agenda. IVP’s Pocket Dictionary of Theological Terms is exactly as the title suggests, but unfortunately its use as an educational tool is compromised by a pervasive gatekeeping mentality that is so prevalent in evangelical circles.
And then I came across this nacre of doctrinaire clumsiness:
atheism. A system of belief that categorically asserts that there is no God. Atheism usually affirms as well that the only form of existence is the material universe and that the universe is merely the product of chance or fate.
If this is the kind of willful distortion coming down from the top in evangelical academia, it's no wonder interfaith discourse is so heavily deformed in this country. The trinity of authors here have of course misdefined atheism.
Very few atheists say, “God definitely does not exist.” The vast majority say, “It’s unlikely that gods exist, and I see no good reason to believe that they do.” In keeping with the philosophical distinction between knowledge and belief, the former is what we might call a gnostic atheist (or, alternatively, a positive or “strong” atheist), the latter an agnostic atheist (alternatively, a negative or “weak” atheist). Just as most Christians don’t sashay around claiming Amun-Ra, Hermes, Zeus, Quetzalcoatl or unicorns don’t exist, neither are most atheists in the business of making the positive claim that no gods exist. It’s just not something they concern themselves with, just as most people don’t concern themselves with belief in unicorns or other cryptids.
As I am wont to emphasize, positive disaffirmation is a spectrum's length away from nonbelief. Most inclusively, then, atheism is simply a linguistic placeholder we use to denote the nonbelief in personal deities. Often enough, it is a conscious conclusion based on the available evidence.
Likewise, atheists do not have an ideological bias toward materialism. It's just that a material universe is all that can be supported by the evidence. To persuade a materialist to accept some form of dualism, supernaturalism or paraphysical causality, the advocates of those views would need to produce probative evidence (or at least a soundly reasoned case) in their favor. The burden of proof lies with those positing alternate dimensions of reality. At any rate, atheists are usually not in the habit of making universal or absolutist claims, but of simply voicing skepticism in the face of unchecked fanaticism.
Another area in which the authors' doctrinal commitments seep through is in the various definitions connected to Christology (the nature of the Jesus of Scripture). One example is adoptionism:
adoptionism. The theory that asserts that God adopted Jesus as his Son...This theory fails to reflect scriptural texts that point to Jesus' eternal relationship with the Father (e.g., Jn 17:5).
If only it were so simple. Of course, in order to defend your favorite theology as “biblical” or “scriptural”, you have to advertise a univocal, monolithic view running throughout the Christian New Testament, a view which fails to hold up under any modicum of scrutiny or grasp of Christian history.
Examination of early Christian documents reveals that as stories about the historical Jesus developed, a diverse spectrum of thought began to take shape. The surviving exchanges and the manuscript tradition of the canonical gospels and other New Testament texts provide a window into these 1st-4th century conversations. The gospel narratives, for example, originated in different communities from different authors speaking to different issues to address different needs. These men had their own perspectives, their own beliefs, their own needs, their own concerns, their own desires, their own theologies. And this kaleidoscope of inspirations is what we see preserved in the Christian New Testament.
It should also be emphasized here that none of the Greek writers thought they were writing (what was later to become known as) ‘Scripture’ or imagined that their writings would one day be canonized and subsequently compared, contrasted and hyper-scrutinized alongside other period texts. How could they? Such foresight was alien to them. As we might expect, once these disparate texts were smashed together and consolidated many centuries later, the multivocality came along for the ride. Given this scenario, it should not be surprising in the least that the gospel writers, in several respects, did not agree with each other; they expressed different views about Jesus, God, and the linkages therein.
As a result, adoptionist Christologies, widespread in early Christian thought, along with docetic and separationist Christologies and others, all made it into the eventual Bible. Moreover, when we compare later manuscripts with earlier manuscripts, we find dozens of examples of where those holding anti-adoptionist, anti-docetic, anti-separationist perspectives, and everything in between, altered the words in an effort to bludgeon the texts into an artificial conformity. (Ostensibly, antiquity’s concern for internal harmony was anticipatory of modern day evangelicals.) If there were not this diversity of voices, there would have been no motivation to amend the texts.
To recap, where did this mishmash of views come from? They originated with the texts (and any associated oral tradition from which they derived), ideologically dissimilar as they were. Because the New Testament documents, taken together, are inconsistent, conflicting and contradictory on several matters of theological importance, of course there are passages in one book which suggest against adoptionism, just as there are passages in others which gesture toward adoptionism. This is what happens when you consolidate texts from different authors. Ultimately, doctrine is best organized by text, not by denomination.
This is also in general why "proof-texting"–mining for verses in an effort to extrapolate a biblical-wide perspective–is irretrievably flawed in approach. Pointing to passages like 2 Timothy 3:16 or 2 Peter 1:20-21 as denoting biblical 'infallibility' or 'divine inspiration' is a naive way of using the Bible to inform theological beliefs. How could the authors of these texts claim a property or attribute for texts that had not yet been written and for texts they had no clue would one day accompany their own? Nowhere in the Bible does it mention which books should be included in the Bible as its authors had no conception of 'canon'. "Proof-texting" fails as a hermeneutical device, not least because you are using the words of one author to interpret the words of another, while papering over the local context within the text itself (i.e., the specific needs, concerns, and issues the author is addressing), all while ignoring the complex, arduous history of the formation of the biblical canon, itself the product of a long line of human decisions.
There can be no substitute for, and no escape from, working out meaning for oneself.
Instead of suppressing these facts or deeming them a problem, those in thrall to evangelical tradition might try accepting the Bible for what it is instead of forcing it to be something it isn't. The Bible isn't a book; it's a library (the very word derives from the Greek biblia, "the books"). And hence, contrary to the reflexively tendentious language plastered up and down this handy dictionary, the Bible is not an ideological monolith.
To push against this fact is like a library patron complaining that something she read in a book from one part of the library contradicts something she read in a book from another part of the library. We would probably question this person's mental maturity. Just as we expect different perspectives from different books in a library, so we should not be surprised or otherwise disturbed by the presence of alternative or divergent views in the biblical texts. Like so much of evangelical scholarship, this resource is contaminated with theological insularity. Gatekeeping in dictionary form.
Note: This review is republished from my official website. Click through for additional footnotes and imagery.
This is the book that cemented my move to atheism (along with other writing by Enns).
As a placeholder for a more fully fleshed review, here is one conclusion I drew from Enns' great book:
Proving the historical accuracy of Genesis (or other fable in the biblical texts) is analogous to proving the historical accuracy of "The Boy Who Cried Wolf". What really matters is how we apply its life lessons and that we do so in an appropriate context....more
From science writer and TED-Ed speaker Garth Sundem comes a Pandora’s box of brain taffy to stuff somewhere in your hippocampus, the bulk of which you’ll probably lose all recollection of shortly after consumption. In a quixotic attempt to ward off this neurological misdemeanor, I’ll take some moments to queue up any residual anamnesis I still have of Garth Sundem’s vat of brain confectionery.
In some ways, I enjoyed Brain Candy, but not in the ways I normally enjoy a book. Indeed, 'book' is perhaps not the proper term (something that would not have come as a surprise had I leafed through it pre-purchase rather than spasmodically tossing it into my Amazon shopping cart). Brain Candy is a mixed bag of trivia, brain trials and interactive questionnaires, spiked with adages about the brain and the latest research, all divvied up into small chunks in more or less random sequence. As such, it's more of an entertaining coffee table accoutrement, or something you'd place within short reach of the lavatory, its viscid morsels laced in such a way that you don't linger too long.
As someone who has scarfed down every crumb tucked between the bindings, I can say it’s meant to be approached in episodes, committed to near-term memory and put aside until a later occasion warrants its re-attention. The pages lurch from one topic to the next in such rapid-fire fashion that you’re forced to switch up your train of thought just as it is building speed. No need to overload your cerebrum; I’d hate to see you hopped up on all those sweets anyway. But you want more.
Surface treatment is the enemy of all enemies. Indeed, many topics simply do not sit well with this format. Dissecting the extent to which human and non-human behavior is genetically determined or environmentally conditioned requires more than a family of paragraphs (p. 90). Probing how human empathy is continuous with that of other primates and prosocial animals is not an area you can properly canvass in a sub-two-page serving (p. 149). Deconstructing addictive behavior and its underlying neural substrates does not go hand in hand with brevity (p. 125). The differential brain states of those with and without religious convictions are simply not something that can be paraphrased in a sectioned-off word salad (p. 142). Neuroethics? Free will? Don’t even bother.
Alas, this is not recommended for obsessive-compulsives or for those magnetized to detail. Yes, people like yours truly. The narrow snippets prefacing each study are embarrassingly, tantalizingly terse and frequently left me with more questions than answers. Are some people better at thought suppression than others? When interrogating a potential suspect, how does one determine what is and is not an extraneous detail? Is Mr. Sundem available for a Skype Q&A?
A soothing reprieve from the madcap formula served up here is that full citations for many of the excerpts are provided in the back of the book. If one of the topics in particular seizes your interest, you can always follow up with the original research paper. Brain Candy hence registers as a depthless collation priming you for the main entree, if you’re so inclined. Of course, there’s no reason the ADHD types won’t gobble this up. If hastily digestible chestnuts are more your speed, Brain Candy‘s a fine recipe.
There are other treats Sundem crams onto the dinner table. The book is chock full of brain teasers, psychometric and other pop-personality tests, and further topped with visual illusions and “eye hacks” that will infuriate your occipital lobe. Some of the brain teasers are worthwhile, while others barely rise to the level of time-wasters. And missing entirely is the crowning glory of all brain twisters: four men in hats. No excuse for its absence.
The optical legerdemain, moreover, consists of illusions you’ve probably seen before if you’ve had any exposure to the internet. Along the way you’ll also be forced to slurp down a glossary of phobias that I have my doubts are even real. (Automatonophobia – who knew we had created a word for people frightened by ventriloquist dummies?) Some might call much of this filler. Those people would be right. Any more, and I would have unhesitatingly slapped on two stars.
Lastly, I cannot close out this review without mentioning the most savory bite in the book. Sundem enumerates a few hapless men who found themselves with nails (yes, nails) and other pointy objects lodged in their brains (p. 60). One man was found to have had a two-inch nail embedded in his skull for twelve years before symptoms prompted him to consult a doctor. Another’s suicidal run-in with a nail gun backfired after twelve nails were plunged into his skull. What did he do except stroll into his local hospital complaining of a "mild headache"? Yet another nail gun incident involved a construction worker unknowingly firing a four-inch nail through the roof of his mouth, where it penetrated clean through to the brain, stopping directly behind his right eye. The man gathered he had a toothache, which he sat on for six days before seeking help from one profoundly bemused dentist. And many of these men recovered with no neurological deficits whatsoever. Extraordinary.
With neuroscience some twenty years or more behind genetics and other adjacent disciplines, it takes some effort to enlist the reader on an empirically sound, up-to-date voyage through the field. Brain Candy may well have been formed with other goals or target audiences in mind, and that’s fine, too. As a mere intellectual stimulant, it gets the job done. I’m hesitant to label this ‘pop’-neuroscience, given the aforementioned citations and scholarly references abutting many of the synopses. But it sure reads like it. Sundem’s an entertaining writer, to be sure, but he leaves too much to the imagination, culminating in a book that’s long on fun and short on substance, like a hyper-condensed Radiolab podcast. There’s also a lot of dull filler that could have, to great benefit, been replaced by meatier exposés on the less permeable topics. I’m just one taste-tester, however. If a heavily staccatoed collection of brain facts and toothsome studies, logic puzzles and neurovisual tricks appeals to you, Brain Candy might be the perfect complement to your reading or living room décor.
Note: This review is republished from my official website. Click through for additional footnotes and imagery.
“Everybody in the world got her cells, only thing we got of our mother is just them medical records and her Bible.”
In 1951 doctors at Johns Hopkins in Baltimore diagnosed an unusually extreme case of cervical cancer. The breakneck growth rate and resistance to common remedies were unlike anything previously seen. Without informed consent, the doctor scraped a clump of cells from the cancerous tumor during one of the 30-year-old woman’s visits and deposited them with a nearby cell culture lab operated by George Gey. There, in the air-controlled space of Gey’s culture facility, something happened that researchers had long suspected was out of reach. Where countless other cells before them had died, hers survived, and multiplied indefinitely. The immortality of this one cell line would prove instrumental in treating a number of the world’s most heavy-hitting diseases and set the stage for an all-new era in cell research, medical diagnosis, drug development and bioethics. But this is a story not just about bundles of cells, but about the humanity behind them, and in particular the family with which they would eventually intersect.
Henrietta Lacks passed away just nine months after diagnosis, but it would be more than two decades before her family received word of the namesake cell line which survived her—HeLa. Its value as a scientific and commercial commodity was paramount, getting bogged down in the thicket of rights and privacy less so. Once researchers discovered that Henrietta’s cells did not die but instead bred a new generation every twenty-four hours, HeLa quickly became the tireless workhorse of the culture community. Vials were trucked to research facilities around the world, where they were regularly exposed to a pavilion of infectious diseases, merged with non-human DNA, and even shot into space in an effort to study the influence of zero gravity.
Over time, details about the parent of the eponymous cells funneled out of view. Henrietta’s own kin were oblivious to the fact that her cells lived, were being studied and tested in dozens of countries, and were being exchanged at great profit. Once the unwitting donor’s official name was released, journalists descended in full fervor, leading to a lifetime of turmoil and unrest for the Lacks household. Set against the rich legacy of the family’s cells, their life of poverty and unpaid medical bills seemed untoward by comparison, and this sense of injustice only mounted as more details filtered in from the press.
Rebecca Skloot’s segue into this lush narrative was by and large a product of serendipitous circumstance, having first heard of HeLa in an intro to biology course at her local community college. After learning about the remarkable advances the cell line has gifted to science, Skloot shot up her hand and innocently asked, “Who was she?” The professor’s reply proved unsatisfying: “An African American woman.” The seed of curiosity had been planted, and by the time Skloot graduated she’d decided to write a book to recover the disremembered woman whose cells had lived outside of her body for 60 years.
Who Was Henrietta Lacks?
For more than a decade, Rebecca Skloot layered herself into the Lacks’ story, befriending and forming deep bonds with Henrietta’s children and cousinry. She forges an especially strong connection with Deborah Lacks, Henrietta’s youngest daughter, who was unfortunately too young to remember much about her mother. With a mix of patience and persistence the two work together to dig up Henrietta’s heritage and bring to light her hitherto uncelebrated legacy. It is Skloot’s at times unstable relationship and camaraderie with Deborah that comprise the bulk of the book’s runtime, as they probe ever deeper into the mystery surrounding her mother.
Tracing the family’s roots places Henrietta’s childhood at the tail end of the Jim Crow era, a time when segregation meant a great deal more than which bathroom and water fountain one was required to use. Rampant legal disadvantages in the South translated to decades of institutionalized disadvantage for Black Americans, including limited access to medical care. Having spent her youth on a former slave plantation in Clover, Virginia, Henrietta joined in the Great Migration at the ripened age of twenty-one, exchanging her familiarity with tobacco fields for a new life in Baltimore. Nine years later, she would be diagnosed with the malignancy that led to her iconic cell line. And it would be another fifty years before Deborah, with the help of Skloot, laid eyes on her mother’s original medical records.
Johns Hopkins Hospital was founded for the express purpose of treating Baltimore's poor and, unlike today, informed consent upon providing blood or tissue samples was neither required by law nor common practice (the term did not even appear in a court document until 1957). Like everyone at the time, Henrietta hadn't a clue about what would become of her excised genetic material and the boons it would lavish upon the scientific community. This became a flashpoint issue in the years that followed as the occasional patient attempted to turn a gray area into a lucrative venture. While we learned more about DNA and transmissible disease, Henrietta's resilient cells inaugurated an international conversation on the commercialization of biological materials and where exactly the donors fit within that ecosystem. (Skloot includes a superb summary in the afterword which draws together the various threads of this nuanced debate.) In many ways, the conversation is ongoing.
The HeLa Legacy
To properly gauge the vastitude of the legacy tied up in Henrietta’s cells, it’s important to understand 20th century cell culture. Until Henrietta came along, cell immortality was a pipe dream. While standard, non-cancerous cells had been grown in vitro since 1907, none of them survived long enough to endure important testing, petering out after 50 divisions, on average. Even cancer cells, like those common to cervical cancers, died shortly after relocation to the culture environment. This meant not only was there a brief window for experimentation but that a continuous supply of fresh cells was needed to sustain research. Many in the field yearned for a better way.
HeLa keynoted a paradigm shift. The cells were robust enough to survive in the bricolage of materials used in Dr. Gey’s culture medium, and they were susceptible to the same range of infectious diseases as normal cells. This allowed researchers to expose the living cells to diseases as well as candidate cures. Not merely a boon to science, this revolutionized the field, so much so that one of the researchers Skloot interviews in the book, when asked what would happen if HeLa was pulled from research use, replied, “Restricting HeLa cell use would be disastrous. The impact that would have on science is inconceivable.” (p. 328)
Thanks to Dr. Gey’s partnering efforts with labs around the globe, the benefits of HeLa manifested seemingly overnight. Within a year of HeLa’s genesis, Jonas Salk and his team were using HeLa cells to test his polio vaccine, followed shortly by advances against HPV, herpes, leukemia, influenza, hemophilia and Parkinson’s disease. In 1953 HeLa became the first cells to be cloned successfully. In total, some 60,000 papers have been published on HeLa, and cell culturists around the world still use her cancer cells as a model for human biology.
What was it exactly that gave HeLa cells their added dose of moxie? Even today, we can only speculate. We now know that Henrietta was infected with both HPV and syphilis, so one theory is that this virulent combination helped suppress programmed cell death (PCD). A rival theory suggests she had highly atypical genes to begin with, perhaps a rare mutation, that enabled the cancer to spiral in unchecked directions. We still aren't certain. HeLa has also been the source of much frustration over the years, as it has even contaminated other cell lines, periodically halting research until the contamination is sorted out.
Rebecca Skloot’s ten-year effort is a literary and cultural marvel balanced between unsheathing the posthumous success and humanity behind one of the richest narratives in science and laying bare the bioethical implications to which it is attached. It succeeds on both fronts. Skloot is at her most animating when channeling the lens of Deborah Lacks as they band together to reach some much-needed closure to the saga thrust upon the Lacks family. With her first book, Skloot has demonstrated in equal measure her facility for conveying scientific nuance while weaving meaning and poetic force into a cohesive whole. Yes, this is first and foremost a nonfiction book, but its pages are emblazoned with enough touches of novel-like charisma to beckon both crowds. The Immortal Life of Henrietta Lacks is a timeless story delivered by a gifted writer and is the definitive presentation of the legacy Henrietta never lived to see. Highly recommended.
Skloot also founded The Henrietta Lacks Foundation - click here for more - to provide support for the Lacks family as well as assistance to African Americans pursuing education in science and medicine. A portion of her book’s proceeds is donated to the Foundation.
Note: This review is republished from my official website. Click through for additional footnotes and imagery.