Hmmm, at first the New York Times review, Marshall McLuhan: Media Savant, made me want to read this, but the last few paragraphs kinda dampened my enthusiasm. Still, a two-hundred-page quick read can’t hurt too much.
The New York Times review (see The Final Conflict, by Orville Schell) of this epic work includes this paragraph in describing the book’s conclusion:
The competition that East and West have been pursuing for so long, Morris warns, is about to be disrupted by some powerful forces. Nuclear proliferation, population growth, global epidemics and climate change are in the process of radically altering old historical patterns. “We are approaching the greatest discontinuity in history,” he says.
After hundreds of pages of description, the author apparently surprises the reader by ending with heartfelt prescription: our global civilization is caught between the dream of transformation away from all of the problems that have historically beset mankind, and the nightmare of collapse. The “Singularity” is apparently emblematic of the former, and “Nightfall” represents the latter. Unfortunately,
For the Singularity to win out, “everything has to go right,” Morris says. “For Nightfall to win only one thing needs to go wrong. The odds look bad.”
But I started out skeptical. I’m fairly optimistic that in the long term humans are pretty good at ratcheting up to a better future, but my gut reaction to the wide array of problems facing today’s civilization is that the cumulative effect might trigger a global “reset button” handing us a new Dark Age, relatively speaking, within a few generations.
I haven’t seen any writer examine the totality of these problems and address how difficult things might get if we’re hit by them all at once, more or less. The review in the Economist made me think Ridley is fairly dismissive of some problems, so my expectation that he’s got something fundamentally new to say is pretty low. Still, I wanted to give him a chance.
But the book — even though I couldn’t be bothered to finish it — was worse.
I had several big complaints.
Style and Attitude
The first thing that began to exasperate me within a few pages was his attitude. Frankly, he’s arrogant, and in my book arrogance is only barely tolerated when you’ve satisfied one huge condition: you are absolutely correct, no ifs, ands, or buts. Even then, it’s nice if you show a little humility. And Ridley fails on that score in many ways.
Ridley hasn’t made anywhere near a strong enough argument to dismiss contrary opinions so casually. As an example which specifically grated, in the chapter on climate change he mentions that a few decades ago it was “fashionable” to talk of global cooling, and now it is “fashionable” to speak of global warming. He contrasts two paragraphs from then-contemporary news articles to show how similar such prognostications can sound.
Now, to anyone with a scientific approach, being accused of pursuing your research because it is “fashionable” is, frankly, a vicious insult. Fashion is about what people will find appealing, and is typically ephemeral and often superficial. In contrast, scientists are doing their best to seek “eternal truths.” Or, as Jared Diamond puts it, they’re engaged in a methodological search for reliable information. Ridley’s choice of words was an insult, and this is not only a lack of civility, it is also very poor reasoning. The Latin name for what Ridley is doing here is the old ad hominem attack: Ridley doesn’t waste one word examining why scientists were worried about climatic cooling in the ‘80s, and instead he trivializes that investigation as “fashionable.” To a logician, he has attacked the arguer, not the argument. Then, by drawing such a strong parallel between the earlier fruitless investigation and the current one, he is also dismissive of the latter. An attack by association: by linking the two arguments on the basis of cursory similarity, he is again ignoring the argument whilst attacking irrelevancies. So even before he goes into any details of evidence, he has primed his readers’ expectations in a logically illegitimate way — and yet his book is supposed to be about rationality?
And while he does go into a bit more discussion of climatic warming, he remains dismissive of opposing arguments. For example, he rails against one IPCC scenario because the “world population reaches fifteen billion by 2100, nearly double what demographers expect.” Well, duh — they used multiple scenarios. If you check his endnotes and look at what he cites, the first sentence of the first paragraph of the portion of the report he’s citing states: “Three different population trajectories were chosen for SRES scenarios to reflect future demographic uncertainties based on published population projections.” Those trajectories were a high one at 15 billion, a mid-range (the UN’s most recent estimate) of 10 billion, and a low estimate of 7 billion. Note how he cherry-picks the data that strengthen his argument. This is a very bad sign if you are hoping for balanced reasoning.
Once I grew suspicious of this tendency, I saw further hints all over the place. I’m not surprised — if the blurb on the back is any indication, Ridley takes pride in being “provocative,” which I’m pretty sure doesn’t play well with “balanced.” For example, on the next page (p. 333) he starts off saying he’ll look into the IPCC’s more likely case of a 3°C rise by 2100. He then tosses in an unsubstantiated complaint in parentheses, noting that this scenario still requires a rate of temperature increase double that of the 1980s and 1990s, whereas “the rate has been decelerating, not accelerating.” Well, yes, climate change is expected to accelerate because of a wide variety of positive feedbacks, such as the earth’s decreased albedo as white ice melts and is replaced with dark sea water; permafrost melting and decaying, releasing methane and carbon; droughts causing deforestation that reduces carbon sequestration; or even the melting of methane clathrates as the oceans warm. And that “recent” trend? I’m pretty sure Mr. Ridley is latching onto noise due to ENSO and other chaotic factors. But since he selectively neglects to footnote his cause for complaint, we have no way of checking his assertion regarding “decelerating.”
Blindness to Societal Collapse
Ridley starts off by looking at the big picture of human evolution and our ever-increasing trend towards prosperity. But he really doesn’t like annoying details, such as the many civilizations that have collapsed during that stretch of time. Having recently finished Jared Diamond’s Collapse: How Societies Choose to Fail or Succeed, I was watching for Ridley’s argument about why our current global civilization couldn’t collapse, paving the way in a few centuries for another to continue his teleology. But he conflates any mention of “catastrophe” with the extinction of the human race. As far as I know, neither the Romans nor the Maya nor the Khmer Empire were struck down by a meteor. None of those societies are mentioned in the index — even though this book was published five years after Diamond’s Collapse, Ridley apparently didn’t think that topic was worth his research. Diamond is only mentioned in the book so Ridley can dismiss him as “the otherwise excellent scientist and writer,” some of whose 1995 predictions are at least a little off, according to Ridley’s assurances (Diamond even gets laughably smeared with Ridley’s favorite dismissal as being “fashionable”).
If someone is going to write a book about how everything is going to be wonderful forever and ever, I would think he’d at least explore some of the more notorious alternatives, not just those that he can easily make fun of.
But I had anticipated this oversight before even opening this book. What still surprised me was that he never even spotted the best reason for “rational” pessimism.
Ridley points out that humans have evolved into incredibly efficient organisms at solving the problems our paleolithic ancestors faced. Most humans alive today have access to food, health and a length of life that would astonish even our great-grandparents.
And given how important those things are in our life, I’m also optimistic that we’re going to keep getting better at them. Given the staggering amount of research that’s going on, it would be very surprising if the coming decades don’t provide continuing delights at keeping people healthier and living longer.
But here’s the problem: when I look around me, most of the people I see are already pretty satisfied on those counts. Sure, it’ll be really sweet when we finally cure cancer, and when we can reliably prevent Alzheimer’s, etc., etc. But the existential threats that drove paleolithic existence aren’t on most folks’ day-to-day anxiety list, are they?
The upshot of this is a little tricky: if the existential threats present during evolutionary time aren’t what drives us today… what does? Something I think is important to realize is that no matter what the answer is to that question, it isn’t embedded in our nature, at least certainly not in the same way as the old threats. Which means it is a very flexible thing, informed by culture, preference and contingency. And that means individual and societal choices will vary widely, and might often contradict each other. I can easily imagine some of those drives being cause for pessimism — whether they be the growth-for-growth’s-sake of the capitalist, or the holy wars of various religious extremists. Those memetic constructs could, in turn, put a damper on the pollyannaish future presented here.
Since Ridley merely examines how good we are at meeting the materialistic goals of cavemen, he really never gets it. The pessimism of the post-modern isn’t about Malthusian crises, but about the lack of focused direction for our post-materialist civilization to take.
Ridley doesn’t see that problem, and so this book is fundamentally flawed.
Some books get all the luck. When a reader is first exposed to a perspective never before seen, or an effort of creation never imagined, the book that triggered this will loom larger, regardless of its merits.
Wood's book is the first litcrit book I've ever read; or at least that I can recall (there are plenty of books I read twenty or thirty years ago that would surprise me now).
I got lucky, since this is an engagingly written and passionate work of a bibliophile, but what earned it that extra star was that I hadn't studied the craft of writing before, so it hadn't occurred to me that it would refine my craft of reading as well. As others here have complained, this makes pedestrian prose a bit harder to absorb, but Wood also reminds us that there is probably still plenty of excellent fiction to turn to instead.
The overwhelming majority of books I read come from the public library -- San Francisco's main branch is only a ten-minute walk away. This will be one of the very rare books that makes it to my 'buy' shelf. I think it will also be that even rarer book, one that I'll hope to re-read often -- although my infatuation may lessen if and when I find other (perhaps better?) litcrit books.
I just took a look at that shelf, and it reminded me that Wood's frequent references to books I haven't yet read, or to books I read as a less enlightened reader, brought back to mind Helene Hanff's 84, Charing Cross Road. I don't recall anything about Hanff's skill as a writer, but she must be one of the most delightful readers of the past century. If you haven't read her short, epistolary memoir, then you are missing out on a classic. (The movie is a conceptual sacrilege: a story about readers should be read, not watched!)
P.S.: Take a look at the moderately glowing review from The Economist, and an article on and interview with the author from the Harvard Crimson (2003).
(Mentioned in V.S. Ramachandran's Phantoms in the Brain; apparently it includes an explanation from Richard Feynman regarding why we perceive mirrors "flipping" images from left to right, but not top to bottom. "How does the mirror know?" asks the child to the discomfited parent. "And while we're at it, why is the sky blue?")
This is, loosely, a follow-on from A Big Boy Did It and Ran Away. Same locale, and the focus here is on the wonderfully named Angelique de Xavia, who played a large part in the previous book.
Do you need to have read Big Boy to get everything here? No, you'd be able to piece together the necessary inferences about what happened, but it would somewhat spoil things should you then want to go back and read the prequel.
Anyway, what I wrote in my review of that book still stands.
Quite the high adventure. Plenty of profanity, satire and characters spouting misanthropic vitriol. Took quite a bit of effort to parse the Scots lingo at times, and there was some I still couldn't get. The sports references sailed right by — I don't even follow American stuff.
The cultural mismatch slowed down my reading so the hook didn't really sink in until one third of the book was gone, but eventually the shenanigans should grab anyone. Can't say that I think Brookmyre is the next greatest thing, but still a fun read.
Yup, pretty much more of the screwball same. This sequel did have quite a bit more salaciousness, and while it has been a while since I read the first, I believe he did a better job of developing his characters there. Oddly, the only scene and relationship that might merit titillating content got a soft-focus, soft-core treatment. The naughty bits were tangential to the story, and intended more to make the reader gawp and snicker.
Certainly good enough that I'm buying the next (and last) to feature this cast of characters.
Oh, dear — I wish I could say I liked this. But this is a slow-moving, quietly speaking novel whose main attraction is the allegory behind the words. The story in front is a somnolent narrative about nothing much happening for twenty years. I have enough trouble with allegory when it's hidden behind an interesting story; I couldn't handle this combination. Ah, well...
I hadn't realized this was a collection of Gladwell's essays, many (most? all?) of which have seen publication in the New Yorker. I found this out while reading the New York Times essay on the book, Malcolm Gladwell, Eclectic Detective, by none other than Steven Pinker. His evaluation of What the Dog Saw is pretty hostile*, and he takes the opportunity to get a dig in at Outliers:
The reasoning in “Outliers,” which consists of cherry-picked anecdotes, post-hoc sophistry and false dichotomies, had me gnawing on my Kindle. Fortunately for “What the Dog Saw,” the essay format is a better showcase for Gladwell’s talents, because the constraints of length and editors yield a higher ratio of fact to fancy.
And just for the sake of completeness, the New York Times published a "profile" of Gladwell, back in 2006, entitled The Gladwell Effect, by Rachel Donadio, which includes two pictures showing Gladwell's exuberant hair at two degrees of shornness. Bonus! (Although not quite as luxurious as Pinker's tresses...)
* Review edited for accuracy after I was prompted to read it more carefully!