The start of this book feels like being punched in the face with adjectives and being expected to weave together a narrative from the bruises. And then, slowly, the narrative begins to weave itself for you, and you begin to cherish the aches and pains it accords you in its wake. Being embarrassingly late to the Faulkner party-- and doubly ashamed for nevertheless being such an avid McCarthy fan-- I hardly need to expound upon how wonderfully complex a writer Faulkner is. This is Southern gothic at its absolute best: an ephemerally beautiful darkness that unravels itself in a slow, dense, meandering poetry that evokes the heaviness of the Mississippi heat and the captivating allure of hidden drama and private tragedies. I was transfixed by every page; I sense a Faulkner binge coming on.
Astonishingly good art carries the weight of this comic, which suffers from woefully dull writing and adherence so firm to genre stereotypes that the plot becomes a predictable backdrop to the unpredictable beauty of the illustrations. And frankly, the art is enough to make it worth the quick read. Gorgeous, gorgeous, gorgeous.
A short and accessible historical gem that explores, well before its time, points of convergence and divergence between natural and artificial intelligence systems. It's particularly interesting as a meter stick of progress between the 1950s and the current day: whereas the computer side of the question is in some ways quaintly outdated, von Neumann's outline of the human nervous system and the functioning of individual neurons remains near the limit of what we understand even today. Not quite exactly at the limit, but certainly not as dreadfully short of it as 1950s computer science is of our current level of technological understanding. Psychologists often point to the youth of our field as if it were an apology for slow progress-- still waiting for our Newton or our Einstein-- but to a modern reader, the contrasts laid out in this book call the validity of that excuse into question. Computer science is younger still than psychology, yet its speed of knowledge acquisition dramatically outpaces psychology's.
Nevertheless, as von Neumann presciently outlines, the points of divergence may overwhelm the points of convergence, rendering the comparison moot. The essay stands as a reminder of the limits of artificial intelligence when compared to biological intelligence: we may never be able to build real brains from artificial components running current computer architectures. My interest was especially piqued by his closing remark that the nervous system may run on a different mathematics than the one we currently understand. He compares our known mathematics to assembly language and the hypothesized neural mathematics to higher-level programming languages, suggesting that we may simply not have found the right method for interpreting the latter.
I really enjoyed the quick read and recommend it to anyone who works with computers, brains, or-- most especially-- both, as an important historical document.
A wonderful philosophical exploration of how embodied cognition must naturally extend to the question of consciousness. It veered a bit too far into polemic at times, as philosophers are wont to do, I suppose. For example, whereas I'm fully on board with the notion that cognitive science should reevaluate its stance on computational cognition, given various mismatches between its underlying assumptions and neuroscience, I don't think computational modelling should be thrown out entirely. The perspective pushed by Noe is in some ways an idealistic one; as important as it is to get the science to that bright shining point, we won't get there without trudging through the messy, non-ideal approaches as well. As it stands, current technologies and scientific understanding simply aren't far enough along to fully prove Noe's point, even if it ultimately may be correct. Philosophical over-idealizing notwithstanding, Noe does provide some valuable food for thought and some pause to reconsider the broader extensions of one's scientific viewpoints.
I was recommended this book by a person whom I consider to be extremely intelligent. Having read it, I'm not sure why.
It might have hit a stronger chord in the '60s, before the idea of creating a new religion to shine light on the perversities of organized religion and human culture more broadly became relatively commonplace. I expect it *was* a shiny new notion back then, and I bet this book *did* ripple in the imaginations of the disenfranchised youth of the time. But to me, now, it read like the effortfully nonsensical ramblings of stoned college students, disillusioned by the wreck that is the world and wanting to escape it-- like a group of guys got together and riffed on one single interesting, though not particularly challenging, idea while high, expanding upon it later to exaggerate the wackiness and insightfulness of their trip.
Chaos is valuable and under-appreciated. Got it. Good. Next idea? Oh, you don't have any? Well, why don't you just expound on that one in as silly a way as you possibly can and call it philosophical. Done.
I have to start by pointing out a serious rift between why I was recommended this book and why I kept reading it: a philosophy major drunkenly trying to defend free will told me I might be impressed by the book's arguments, and I took it as a challenge. I'd love to see an intelligent response to determinism, a sharp knife to poke at the thick skin of my convictions.
To be frank, I did not find that in this book.
Koestler's arguments against determinism are, from my perspective, peripheral to the bulk of his thesis, which is that evolution has landed man in a blind alley by giving him an exponential growth of cortex that he hasn't yet figured out how to use in concert with the older and more slowly developed subcortical structures. Yet the determinism arguments are treated as foundational in light of the strongly behaviourist tendencies of the scientific era against which he was battling. The result is that he ends up debating a macro-level stimulus-response caricature of determinism that few determinists today would hold as sufficient-- a straw man by current standards but, sure, admirable and fairly novel for psychological science circa 1960. It could even be that such relentless (and at the least witty and delightful to read) attacks on behaviourism helped to open bright minds to the cognitive revolution, so we won't hold that against him. But we will hold it against that philosophy major for bringing a knife to a gunfight, and encourage him to read some newer books....
Much of the remainder was a fascinating read, however. Koestler takes us through evolutionary theory and hierarchical systems theory, tying the two together to look at the larger whole of humanity in terms of some evolutionary phenomena.
One of his focuses is on the "blind alley" in evolution, in which a species becomes overspecialized for its environment and then can't adapt properly to new changes in that environment. In the history of evolution, this has happened a number of times, but nature has a way of getting out of it: paedomorphism, essentially extending the growth phase of development and evolving from younger forms of the species. This is a process that Koestler calls reculer pour mieux sauter (roughly: taking a step back to jump better), and it's one of the more interesting of the ideas he applies to other levels of the human hierarchical system. For example, he cites this process as the means to creativity in both art and science. Breakthroughs are obtained by breaking down the existing state of thought and working from older ideas in combination with new knowledge (and thus we hit upon the very reason I read old and outdated books like this one).
In a later thread of his theoretical mixology, Koestler takes a closer look at the structure of hierarchical systems, in which each element is both a whole on its level and a part of the level above it-- a duality which he calls a holon, a term that is probably this book's most enduring contribution. A holon in a hierarchical system embodies a fundamental tension between integrative (i.e., parts of a whole) and self-assertive (i.e., each part is a whole in itself) tendencies. He not only explains humour, art, and science as lying along a continuous spectrum on this dimension, but also proposes that most of human woe can be explained by the dichotomy: whereas overexpression of the self-assertive tendency can lead to small-scale violence, overexpression of the integrative tendency moves the behaviour one step up in the hierarchy and leads to large-scale violence. So nationalism, religion, cults, etc., are a submission of the "wholeness" of the part to the benefit of the larger whole, and lead to destruction on a broader scale. I'm not generally one for such broad theorizing, but I love this idea.
After hitting that broad and impressive peak, he reels the magnifying glass back down to the level of the individual human and argues that we've evolved into a sick blind alley that makes us prone to the delusions inherent in closed systems. A closed system doesn't behave hierarchically: it locks its parts into the part role, leaves the larger closed system as the only whole, rejects opinions from outside the system, and so on. Whether these delusions are expressed in terms of mental illness or social illnesses like nationalism and religion, the result is the same, and it is not good. I'm also generally on board with this idea, and at this point I began to develop expectations about where he was going with it...
And then suddenly the message begins a glorious spiral of WTF, so far off the mark of the natural extension I'd extrapolated from the book's brilliant middle portion that it took me and my incredulity an entire two weeks to read the last few chapters. Basically, Koestler sketches the need for a drug to "unlock the potentials of the underused cortex" by somehow allowing more distributed communication with subcortical structures, and thereby evading the closed system that amplifies the part/whole tension and leads to our madness as a species-- a madness defined by what he calls the absolute certainty of self-destruction by nuclear war. If the leaps in this synopsis are hard to follow, I'm sorry, but his elaboration doesn't do much better. The 1960s come through loud and clear in these pages, and it's such a pity that he ends on this note, to the tune of my repeating the word NOPE. Not to mention that in the same breath as he expounds on the virtues of such a drug, he deeply misunderstands Huxley's proposal for the virtues of hallucinogens. If there's an afterlife, I hope Mr. Huxley and Mr. Koestler have by now discussed, over magical heaven tea, that they completely agree about what drugs can and cannot do with the contents of a human brain, because wow, what a misreading of Huxley. BUT I DIGRESS.
Take the last bit with a grain of salt and forgive Koestler the hubris of assigning a much-deplored relic of psychology's past as his arch-nemesis, and this is a profoundly interesting extrapolation of known scientific processes to new milieus, with at worst thought-provoking and at best insightful results. With the reality of those elements, however, it's hard not to take the rest with at least a half-grain of the same salt. If one's foundational assumptions are outdated, it's difficult not to question the validity of anything built on them. Nevertheless, it was delicious and surprisingly far-ranging food for thought, and I'm glad I got into a drunken debate with a philosophy major. Even though he's wrong.
Chilling. This was the first Poe I have ever read, and I find myself immensely impressed by his skill with words. At a time before thesauruses, he knew how to turn words to his needs, reshaping each one to pull a horrified fascination out of what is ultimately a simple, predictable, and oft-told narrative.
Despite its hyperbolizing (and a few factual inaccuracies that I recognized only because I've studied neurophysiology), this was some fun food for thought. A quick read offering up plenty of laughs, much in the style of Cracked.com, for which the author is a contributor and editor. It was moreover a wonderful and accessible way to learn about a whole host of recent scientific advances, both terrifying and incredible.
Wow.
Dave McKean, you need to do more solo work, because this was spectacular. As an artist, McKean has a keen (no pun intended) eye for composition and a skilled hand for form; every page is beautifully crafted and stylistically evocative, seamlessly blending a number of techniques, materials, and moods to tell an intricate visual story. His linework strikes a difficult and haunting balance between fine weightlessness and heavy schizophrenic uncertainty, while his broader brush strokes serve alternately to lighten, to darken, to evoke a sense of blank canvas, to suffocate, to give motion to the lines, and so on. I really could go on forever talking about the art in this book, and forever and a day talking about how perfectly the art amplifies the story and each individual character within. But my words would fall short of how stunning it is to feast your brain on, so you should probably just feed it to your own brain instead of listening to me.
As a writer, he is perhaps a touch over-explicit. Whereas the first 90% of the book is a beautiful enigma, he spells out a bit too plainly what his objective was by the book's conclusion. Nevertheless, his ideas are demonstrably ambitious and his execution remarkably thoughtful; he explores every angle of the book's themes, of which I can distinctly count about five and none of which were skimped on in the slightest. The story-within-a-story motif is pushed to its limits, creating an onion of narrative that echoes and adds to itself with every layer, often switching between layers like orchestral threads.... the musical analogy maintained throughout the story is, I think, particularly apt for describing the overall shape of it, and you could easily argue that meta-analysis itself is a sixth theme if you wanted to. And I think I do want to argue that, now that I think about it.
The book is ostensibly about God and creation, but I think reading too literally into that idea would be a misguided attribution, and scientists and atheists and other godless types should not shy away as a result. What McKean has created is, I think, a universally appreciable hymn to the intricate mess that is humanity and the impossibly recursive ways that our serendipitously meta-capable minds are equipped for thinking about it.
This was an extremely fun and informative read. Ronson's self-deprecating style incorporates honesty with entertaining gonzo journalism and a willingness to do almost anything and trust almost anyone in the name of a story. If he ever fears for himself in his increasingly unusual adventures, he writes about it, and if he sees that he is becoming as paranoid as his conspiracy theorist subjects, he admits it freely.
The subject matter was completely unknown to me, but Ronson delivered it helpfully, slowly introducing new characters and scenarios and occasionally reinforcing the reader's understanding, without coming across as pandering. That's a hard line to walk in investigative journalism, especially when the subject matter touches equally on niche and (what should be) general knowledge-- but I think he walks it wonderfully throughout. I learned a lot about conspiracies, world leaders, extremist figures, and atrocities about which I knew nothing, like the battle for Ruby Ridge. I have no idea how I could have been so completely unaware of such a monumentally disturbing event in modern history, but I'm grateful to have learned about it from so honest a writer. I think it takes great skill and character to be able to look at a massacre, relay its horrors openly and honestly, and find some modicum of true humor within it-- finding the seeds of irony that make the situation possible to digest.
Above all, I think this book made me aware, for the first time, of the strange subculture of conspiracy theorists and counter-conspiracy agents, extremists and coalitions against extremism, racists and anti-racists, and how the two sides of the coin are almost indistinguishable, as if they've built up to an arms race of absurdity for media attention. Fascinating stuff, and a thankfully fast read for the amount of information contained within. I will definitely be checking out more of Ronson's work, namely The Psychopath Test: A Journey Through the Madness Industry. Because psychology.
Despite its age (which becomes apparent only in a select few chapters that focus on the Internet and neuroscience), and despite that I disagree with a number of the author's contentions, I really enjoyed this book. Blackmore presents a comprehensive understanding of memes, those cultural self-replicators that drive much of our behaviour in our modern social world. She makes use of a host of well-articulated descriptions, examples, and scientific narratives, offering fairly weighted arguments for most of the (many) topics she touches on, and explores the full extent of her theory as any rigorous scientist should. I was even appreciative to see a number of "I have no idea"s and "I'm not sure"s throughout; the honesty of a researcher who recognizes the need for further exploration shines infinitely brighter, to me, than does the stark certainty of a popular science writer... (the old adage applies: the more intelligent you are, the more you realize how little you know)
My primary complaint about the book was that it retraced its steps a lot, reiterating basic points and redrawing key analogies a number of times. I imagine this was to really drive the point home for those less familiar with biological evolution or memetic theory, which is entirely fair given the book's publication date. Still, I found myself skimming lightly over some of the middle sections, having déjà vu moments and wondering when new ideas would be presented.
A secondary complaint is at once more fundamental to the content of the book and less critical to my enjoyment of it. And it's simply this: I'm not entirely convinced by some of the author's proposed extensions of memetic theory, which seem in spots to move the explanation back one rung without really hitting at the core "why" or "how" of the question. Meanwhile, she argues that this is the problem with using consciousness as an explanation for anything (and I agree whole-heartedly): it is an epiphenomenon that itself explains nothing, but instead demands further explanation. It's unfortunate that she didn't acknowledge the similar limitations of memetic theory in this respect, especially as it pertains to the chapter on "self" and the "selfplex." As a cognitive scientist deeply interested in the nature of memory, I don't think Blackmore steps back far enough to reach the core of the "self" issue.
Regardless, the notion of memetic selection in this area has added an additional layer to my own theories of self (which rest, naturally, on a foundation of memory). I'm grateful to Blackmore for outlining these levels of analysis for me in such a thorough manner. It's certainly the task of any scientist to run with their pet theories as far as they can, and wait for evidence to contradict them; the contradictions serve as motivation to refine and move one step closer to "truth," with the understanding that capital T truth is an illusion in science and that your ego is thereby best left at the door.
And on that note, I want to underline how much I enjoyed the final chapter, which flirts much more closely with philosophy than the rest of the book. I love this way of thinking, without a self, and I want to practice it and see where it takes my thoughts; I feel it could have tremendous benefits for my work (or my memes, if you will... it's come full circle!) and my ability to think clearly about theory and computational modelling. As a prototypical grad-student sufferer of impostor syndrome, I can only help my science by letting go of such concerns and letting the ideas drive themselves.
This was a great review of research into all forms of memory, impressive more for its breadth and depth than its current relevance, as it's about ten years out of date. Neath and Surprenant are at the forefront of memory research, and this book really highlights why: they consider all angles of a problem, are impressive scientific thinkers, capable computational modellers, and highly accessible writers-- and, importantly, they don't hold to the status quo when new insight suggests the truth may be otherwise. Often, they are the ones leading the charge with new insight. Indeed, this book almost serves as a historical primer to the ideas they develop in more detail in their "Principles of Memory."
The chapters on modal models, sensory memory, and working memory served as excellent reminders of everything I'd forgotten since second year of undergrad, on account of my work not touching those areas at all. The chapters on processing, implicit memory, and forgetting were particularly memorable (pun not intended) and offered some angles on those problems that I'd never considered, via classic research I'd never gotten around to reading. And their optional chapter on computational modelling presents the most easily comprehensible explanations I've seen of a number of tricky models: a great benefit for the mathematically uninclined but hopeful modellers out there.
I'm really glad I read this book, and I hope to explore some of the research ideas their writing inspired (assuming nobody has beaten me to these questions in the last ten years...). I recommend this to any cognitive scientist or memory researcher; it's an invaluable historical resource that summarizes the development of most areas of the field succinctly and thoroughly.
A whirlwind tour of cognition as it applies to decision-making, delivered by one of the founding minds of the related discipline of behavioural economics. Having been a student of cognitive science for more than six years, I can't pretend that I learned much from the book; most of it is a re-telling of the empirical stories that made Kahneman and Tversky household (laboratory-hold?) names and a re-framing of age-old dual-systems conceptions of mind in the most literal terms possible (System 1, System 2). The renaming adds little to the theoretical perspective, but the book as a whole is a helpful refresher on a wide sweep of research from numerous fields of psychology. At times repetitive and oversimplified, and at other times flippantly overoptimistic about a general audience's mathematical capacities, Kahneman nevertheless succeeds at penning an accessible and engaging introduction to decision-making theory and cognitive biases. That he ties the empirical work tightly to considerations of personal betterment, happiness, and policy is moreover laudable. The man has had a truly impressive career, and it's inspiring to see that excelling so broadly in empirical explorations has not diminished his view of the bigger picture.
That said, I just don't think I was the intended audience for this book. To me, having already been familiar with the majority of the experiments and findings discussed (sometimes to a painful degree), the reading was a bit like exercise: rewarding but tedious and over-familiar, like reading an introductory textbook start to finish years after finishing a course. I was not motivated or compelled to open the book often, or to keep it open very long when I did, and it took me far longer to read than it should have as a result.
I nevertheless strongly recommend the book to a lay audience that is wholly unfamiliar with the field of decision-making or with the notion of cognitive biases. I remember learning about these phenomena for the first time in my second year of university and being blown away by how truly un-objective my perception of reality was. I remember subsequently finding myself more cognizant of my flaws and more careful in my analysis of a given situation. That many of the book's demonstrations of cognitive bias did not "work" on me is proof that mere exposure can teach one to question such automatic responses, and it speaks clearly to the importance of this book getting into the hands of the general public. We could all benefit from a world full of more careful and attentive decision makers.