R. Scott Bakker's Blog

March 9, 2012

Caution Flag

Aphorism of the Day: The first thing to go when you turn your back on philosophy is your Ancient Greek. The next is your formal logic. Then you lose your ability to masturbate in good conscience, which tends to dwindle in direct proportion to your ability to read German.


So I'm still reading Kahneman's Thinking, Fast and Slow here and there between several other works. One of the things I'm enjoying about the book is the significance he attributes to what he calls (rather cumbersomely) WYSIATI – or 'What You See Is All There Is.'


For years I've referred to it as the Invisibility of Ignorance, or the 'unknown unknown' (of Donald Rumsfeld fame), but lately I've started to call it 'sufficiency.' I'm also beginning to think it's the most profound and pervasive cognitive illusion of them all.


Consider the following one sentence story about Johnny:


S1: Johnny went to the corner store, grabbed some milk, then came home to watch Bill Maher.


This is innocuous enough in isolation, until you begin packing in some modifiers:


S2: Johnny went to the corner store, stepped over the blood running into the aisle, grabbed some milk, then came home to smoke a joint and watch that idiot Bill Maher.


Pack in some more:


S3: Rather than take his medication, Johnny went to the corner store, shot the guy at the till in the face, stepped over the blood running into the aisle, grabbed some milk, then came home to smoke a joint and watch that liberal scumbag idiot Bill Maher with his neighbour's corpse.


Oof. That Johnny's fucking crazy, man.


The point here has nothing to do with 'what really happened' with Johnny. The extra modifiers are additions, not revelations. The lesson lies in the way each formulation strikes us as complete – or in other words, sufficient. This is one of those hard-won nuggets of wisdom that most writers, I think, learn without realizing: reading fiction is pretty much a long-drawn-out exercise in sufficiency (WYSIATI). What they don't know about your story and your world literally does not exist for them, not even as an absence. Going back to Johnny, you can see this quite clearly: it wasn't as if anything in the meaning of the prior sentences required anything whatsoever from the subsequent ones…


Well, not quite. S2, you might have noticed, contained an incongruous detail, 'the blood running into the aisle,' that pointed to the existence of something more, something crucial that had been left unsaid. Let's call this a flag.


A flag is simply information that cuts against the invisibility of ignorance, a detail that explicitly begs other details. You might say that the key to effective writing lies in balancing sufficiency against 'flag play.' One of my biggest weaknesses as a writer before becoming a professional was the way I tried to turn everything into a flag. I made the mistake of thinking the reader's intrigue was directly related to the quantity of flags in my prose, not realizing that the fine line between narrative confusion and mystery was a much more complicated matter of balancing sufficiency against the strategic deployment of flags. Roger's piece, I think, can be used as a case study in just how well it can be done.


Flags also help us understand the first problem I mentioned, the way novice writers often have difficulty trusting the sufficiency of their prose, and so think they need to exhaust scenes with details that readers already assume, such as the fact that rooms have walls, homes have windows, and so on. The fact is, the apparent sufficiency of anything can always be flagged. All you have to do is ask the right questions, and what seems sufficient will suddenly sport gaping holes. This is why learning to write requires learning to anticipate the kinds of questions the bulk of your readers will be prone to ask, the kinds of things they may gloss while reading, but flag when reflecting on the story in retrospect.


This, by the way, explains why stories that strike some as pitch perfect will strike others as ridiculously flawed: different expectations means different flags means different estimations of sufficiency.


This also explains why criticism is such a delicate art, and why writers have to be exceedingly critical of the critiques they receive: since anything can be flagged, so much depends on the mindset of the reader. So many critiques you encounter as a writer turn on individual readers asking atypical questions. 'Finding' problems in a text is literally indistinguishable from 'making' problems for a text, so when you read looking for problems, you will invariably find them. Anything can be flagged. All you have to do is find the right question.


This also explains the 'poisoning the well' effect, the way simply broadcasting certain questions can have the effect of ruining the illusion of sufficiency (for as should be apparent, sufficiency is always an illusion) for other readers. You could say that fiction is like religion this way: it requires that some questions go unasked to maintain its sufficiency. In other words, ignorance underwrites narrative bliss as much as spiritual bliss.


And this explains how different books sort readers in different ways, and why so many people are inclined to judge the intelligence and character of others on the basis of what they read: pretentious, stupid, what have you. You can tell as much about a person by the things they're prone to find sufficient as you can by the things they're prone to flag.


Moreover, since we seem to be hardwired to flag the other guy, we generally (mistakenly) assume that our judgments are sufficient. One of the things that makes Johnny crazy, you might assume, is the fact that he thinks S1 is an honest characterization of S3. We literally have systems in our brain dedicated to editing S3, the ugly truth of our character as others see it, into the streamlined and blameless S1, which then becomes the very gospel of sufficiency. Our memories are edited and rewritten. Our attention is prone to overlook potential flags, and cherry-pick anything that coheres with whatever 'sufficiency hypothesis' we happen to need.


There's a reason you bristle every time your spouse flags something you do.


And things go deeper still. Wank deep.


You could say, for instance, that sufficiency lies at the heart of what Martin Heidegger and Jacques Derrida call the 'Metaphysics of Presence,' and that deconstruction, for example, is simply a regimented way to flag apparently sufficient texts.


You could also say the same about 'essentialism,' the philosophical whipping boy of pragmatism and contextualism more generally. Or Adorno's 'Identity Thinking.'


In fact, so much of contemporary philosophy and cultural critique can be seen as attempts to read S3 into S1, crazy into innocuous – raising all the same flag finding/making problems pertaining to reader critiques I mentioned above. What does wholesale cultural critique mean when it's so bloody easy? All you have to do is begin inserting the right modifiers or asking the right questions.


And deeper still, you have science, whose claims we take as sufficient, often despite the best efforts of its flag-waving critics, primarily because nuclear explosions, cell phones, and octogenarian life-expectancies are so damn impressive.


Science, as it turns out, is the greatest flag machine in human history. Only those claims that survive its interpolative and interrogative digestive tract are taken as sufficient. And now, after centuries of development and refinement, it finally possesses the tools and techniques required to read the brain into S1, to show that innocuous Johnny, when viewed through the same lens that makes nuclear explosions, cell phones, and octogenarian life-expectancies possible, is in fact a crazy ass biomechanism. Just a more complicated version of his neighbour's corpse.


A bundle of flags, pretending to be sufficient.



Published on March 09, 2012 12:07

March 6, 2012

In Praise and Dread of Crazy

Aphorism of the Day: When the Real goes mad, sanity can only cling to delusion.


 


What if sane is stupid and smart is crazy?


I've been thinking about Neuropath a lot lately. I'm doing an interview on the book with Peter Wolfendale (whom I've invited to rebut Roger's excellent post on ancient skepticism), as well as discussing it with Frank Cameron, a friend of mine who has it assigned for one of his philosophy classes. At the same time I happened to bump into a paper by Eric Schwitzgebel, entitled "The Crazyist Metaphysics of Mind."


The question, 'What if sane is stupid and smart is crazy?' is pretty much Neuropath in a nutshell. It depicts a world where the cracks between human intuition–sanity–and scientific knowledge–craziness–have yawned into a chasm, an anomie that either drives us further and further into fantasy or swallows us whole.


Of course not all 'sanes' and 'crazies' are equal. At the very least we need to distinguish stupid sane from smart sane, and likewise smart crazy from stupid crazy. So how do they rank? Like 'chaotic evil' in the moral metaphysics of the old Advanced Dungeons and Dragons, you might think stupid crazy would be the worst kind of crazy. I mean, what could be worse than a crazy idiot?


Well, it depends on your time-frame. The thing about stupid crazy is that it tends to be self-defeating–not unlike chaotic evil. Think of the difference between Hannibal Lecter and Buffalo Bill: Buffalo Bill did a lot of damage assembling his wardrobe, sure, but there was no question of him getting the best of the delicious Agent Starling. No. That entrée (and the sequels) was reserved for Hannibal the Cannibal.


It's the smart crazy that we need to worry about, the lawful evil. Like the line from the old Tragically Hip tune says, "The smarter it is, the further it's gonna go…" Consider the Standard Model of Particle Physics, which is not only the gold standard of scientific theory, but the very definition of smart crazy. Think chalk, blackboard, Niels Bohr, Hiroshima…


"We are all agreed that your theory is crazy," Bohr once famously said to Wolfgang Pauli. "The question which divides us is whether it is crazy enough to have a chance of being correct."


And now, as Schwitzgebel argues, we have cognitive neuroscience nipping at our mental health.


The problem, it seems, is that human evolution only really cared about good enough, that the cognitive tools required to reproduce human children who reproduce only needed to land a glancing blow on the way things really are. And as a result, we find ourselves stranded with a near miss as the very yardstick (criterion) for what we call 'sane' – simply because it's the only yardstick we got. The problem, in other words, is that the world really is crazy.


A psychopath, as Neuropath would have it.


So I thought I would pitch this as my counter-argument to throw into Roger's wheelhouse (one that sidesteps all the tiresome charges of self-refutation philosophers typically use to stuff wax into their dogmatic ears): "Yes-yes. Your logic is as impeccable as always, Sextus, my dear man. That has to be the smartest defence of stupid sanity I have ever encountered."


So the very game of giving and asking for theoretical reasons loses the game of giving and asking for theoretical reasons–fair enough. The powder is wet. The very inference structure of philosophy is self-defeating…


But who said all defeats are equal?



Published on March 06, 2012 10:38

March 2, 2012

To Know Our Unknowing

Aphorism of the Day 1:


"Nothing becomes a man, even the most zealous, more perfectly in learning than to be found very learned in ignorance itself, which is his characteristic.  The more he knows that he is unknowing, the more learned he will be."


– Nicholas of Cusa, On Learned Ignorance


Aphorism of the Day 2:


"There are some things we now know too well, we knowing ones: oh, how we nowadays learn as artists to forget well, to be good at not knowing!"


– Nietzsche, preface to The Gay Science


———————————————–


Welcome to the first post by a guest-blogger here at the TPB!  My name's Roger Eichorn.  I'm a friend of Scott's, an aspiring fantasy novelist, and a Ph.D. student in philosophy at the University of Chicago.  My primary area of specialization is ancient skepticism, particularly the Pyrrhonism of Sextus Empiricus.


In this post, I'd like to discuss one of Scott's favorite themes—human stupidity—in relation to Pyrrhonism.


Scott focuses, and for good reason, on the growing scientific (that is, empirical) evidence to the effect that humans are stupid, stupid creatures.  Much of this work is cutting-edge stuff, largely because of recent technological advances that have (as Scott likes to say) broken open the 'black box' of the human brain.  Even so, there's a sense in which the findings Scott brings to our attention are merely the latest chapter in a long story, a story that goes all the way back to the ancients.


Sextus Empiricus himself based many of his arguments on empirical evidence.  Though, of course, his 'evidence' was not the sort of thing that would pass muster in a modern scientific context, I believe there's every reason to think that, were he alive today, Sextus would be at least as fascinated by the growing body of evidence concerning human cognitive shortcomings as Scott is—and moreover, there's every reason to think that he would have made potent use of this evidence in his skeptical dialectic.


However, Sextus did not think that we require empirical evidence in order to arrive at the conclusion that we're all idiots.  That conclusion, he thought, can be arrived at purely a priori, that is, while lounging in our armchairs and merely thinking through our knowing.  Let's see how this works.


The question is this: What, if anything, do we know?  Knowledge is generally taken to be justified true belief.*  (This is a twentieth-century formulation, but the thought goes back at least to Plato.)  On the one hand, there are beliefs—all sorts of beliefs, many of them batshit crazy.  On the other hand, there is the way things actually are (truth).  How do we assure ourselves that a belief reflects how things actually are?  We do so, the thought goes, by justifying that belief.


So far, so good.  But any step we take from here is going to lead us into trouble, for the question immediately arises: What does and does not count as a genuine justification?  Right away, we find ourselves in the grip of what's called the problem of the criterion, which can be summed up this way: without an already-established criterion of truth/justification, we have no way to establish the truth/justification of a putative criterion of truth/justification.  Immediately, in other words, we've fallen into the difficulty of needing to justify that which makes justification possible.  It is no easy task—putting it mildly—to see our way around this epistemic impasse.


But even if we bracket out the problem of the criterion, our difficulties are hardly over.  For the sake of argument, let's all agree to construe justification in purely rationalistic terms.  Let us, in other words, agree to seek justification solely on the basis of the autonomous exercise of our capacity to reason.  (Let us, that is, become philosophers.)  Straight off, then, we can dismiss any putative justification that relies on appeals to authority (appeals that cannot be independently underwritten by reason alone, that is).  Appeals to authority (such as God, sacred texts, or your friendly neighborhood guru) can play a role in justification, but they cannot be its ground.  We can also dismiss things like divine revelation.  (Again, divine revelation can play a role in justification, but only if the truth of the revelation has been independently justified.)


In short, let's all agree to be 'rational.'  Now, there must exist constraints on what counts as rational; otherwise, the concept would be empty, indistinguishable from irrationality.  Ancient skeptics suggested the following as non-tendentious rational constraints:


(1)  If a person claims to know something, then that person opens herself up to the standing possibility of being asked how she knows, i.e., to being asked for the justification of her belief.


(2)  Successful justifications cannot involve:



Brute assumption


Infinite regress


Vicious circularity

(3)  If a claim to knowledge cannot be justified, then the claimant is rationally constrained to withdraw it (at least qua knowledge-claim).


The constraints on justification outlined in (2)–called the Agrippan Trilemma–come down simply to this: merely assuming that something is true is not a rational reason to maintain that it's true; therefore, any putative justifier must itself be justified, from which it follows that an infinite regress of justifications (where x is justified by y, which is justified by z, on and on forever) fails, as do circular justifications (where x is justified by y, which is itself justified by x).


There's a sense in which the Agrippan Trilemma sums up the problematic of the entire history of epistemology.  Foundationalist theories attempt to end the regress by appealing to some privileged class of self-justifying justifiers.  Coherence theories, on the other hand, attempt to make a virtue of circularity by claiming, roughly, that we are justified in holding a set of beliefs if those beliefs evince the requisite degree of internal coherence.


Despite centuries–millennia!–of ingenious epistemological tinkering by generations of staggeringly intelligent people, it is hard to see, on the face of it, how any theory can escape the Agrippan Trilemma without giving up on rational justification altogether.  The very idea of a self-justifying justifier is, if not incoherent, at least deeply suspect.  Such 'foundations' to our knowledge are often said to be 'self-evident.'  But as the Devil's Dictionary points out, 'self-evident' seems to mean that which is evident to oneself–and no one else.  (Making the same point with far more plausibility, and much less humor: 'self-evident' seems to mean nothing more than what a particular cultural tradition has taught its members to accept without reasons.)


As for coherence theories, it may be the case that the greater the coherence of a set of beliefs, the more reason we have, ceteris paribus, to think those beliefs true.  But the game of truth is not horseshoes or hand-grenades.  Given that knowledge means justified true belief, then by claiming knowledge of x, we're claiming that x is true, not that x is more or less likely to be true by virtue of belonging to a more or less coherent set of beliefs.  There might be all sorts of interesting uses for coherence theories, but they are not theories of truth.


Finally, some epistemologists endorse 'externalism,' according to which (roughly) knowledge does not require that the knowing subject know that she knows.  Here's one way of putting it: as long as a belief was acquired by means of a reliable mechanism (a mechanism that is known to 'track the truth'), then the belief is justified regardless of the 'internal' state of the subject.  Externalists will want to argue that pesky skeptics like me are demanding too much, namely, not just that we know x, but that we know that we know x.


Think about it for a minute, though.  What does 'externalism' come down to?  Just this: "It might very well be the case that many of our beliefs are justified even if we have no way of knowing that they are."  For consider: unless the externalist, or someone, is able to adopt the third-person perspective—the perspective from which it is possible to determine that Beatrice has arrived at belief x by means of a reliable, truth-tracking mechanism, and thus that she knows x (even though she does not know that she knows x)—then externalism amounts to saying, "It might be the case that we know all sorts of stuff."  Fine.  I accept that, Sextus accepts that—all ancient skeptics do (at least in the externalist's sense of having a true belief that is in fact justified in some way that escapes us).  But without specifying what we know and how we know it (what justifies it), then externalism simply does not answer the question.


On the other hand, if externalists think that they (or someone) can adopt the justification-identifying third-person perspective, can identify (e.g.) reliable truth-tracking mechanisms, then their account of justification would have to be an account of the justification of those mechanisms—that is, an account of how it's known that those mechanisms are truth-tracking.  Externalism, then, if it is to contribute anything to the conversation, must collapse into internalism.


It is not enough to 'know' something in the externalist's sense.  Unless we're in possession of a justification for a belief we hold, then we do not know that we know it, in which case we have no warrant for crowning it Knowledge.


Where does this leave us?  It seems to leave us with the conclusion that, as far as we know, we know nothing.


But that can't be right, for if we know that we do not know whether we know anything, then we know something.


We've run aground on peritropē: self-refutation.  I'll continue the story in a later post…


What I've tried to show here is just that, even sitting in our armchairs, reflecting on our epistemic predicament, it's possible to illuminate for ourselves the cognitive knots in which our thinking entangles itself—to know our unknowing.


We're all idiots.  The more we accept this—the more we become good at not knowing—the more learned we will be.


———————————————–


* = Those with a philosophical background might at this point protest, "But what of Gettier cases?"  I'm going to ignore Gettier here, partly to keep things simple, but also because I think Gettier's problematization of the standard conception of knowledge fails, that its failure has been demonstrated numerous times, and that epistemologists should just move on already.



Published on March 02, 2012 14:25

February 28, 2012

Freebasing Thaumazein

Aphorism of the Day I: When arguing, I always try to meet people in the middle, knowing that there, at least, I will be alone at last.


Aphorism of the Day II: Chase wonder through the mill of reason, and you find philosophy. Chase wonder through the mill of desire, and you find fantasy. Since desire always has its reasons, and since reason is never free of desire, there's no having the one without somehow committing the other.


It was summer. One of those days when a nimbus of white frames all the windows, and the breeze hisses through the screens. We lived in this little frame farmhouse not so far from the shore of Lake Erie, and far enough from any town or village that you could pass a day without hearing a car. I was sitting in the dining room – on a bean bag chair, I think. I was reading The Fellowship of the Ring. I was ten years old.


So that would make it about 1977.


I look up from the page.


The house is empty. The compressor on the fridge hums. Outside, the wind brushes the hair of the world–the maples out front and the giant willows out back and cornfields that square all creation.


I see the battered old couch in the living room. Branches waving no-no in the top corner of the adjacent window. Light smeared like wax across the paneling. Crumbs on the carpet.


And I feel something between static and vertigo climb into the interval that separates what I see from the fact of my seeing.


There's no words for it really, except, maybe…


What is this?


Seeing? Breathing? Being?


How could this be?


Then I hear my mother calling out to my brother… I can't remember what, only that it was out back, beneath the willows. And the horror threatening Middle-earth pulls my eyes back down to the narcotic lines on the page–drags me back in.


But I never forgot the moment: How could I when it would be the first of so very many? Even still, as a middle-aged man, though it often seems the glass has been scuffed to a fog. Sometimes daily.


What… the-fuck… is-going-on?


Plato called it thaumazein, wonder, and for him it was the origin of all philosophy. But sometimes I can't help but think that philosophy begins precisely when we forget to wonder, or even worse, confuse it with the will to answer. Sometimes, when I consider all the things I think that I know, that old feeling climbs into the interval yet again, and it seems so clear that I know nothing at all. And the strange and the weird and the mad all become possible… Beautiful.


And the urge to write fantasy rules me once again. It's in pursuing this urge that I feel I write my best stuff, when the story seems to drip from the tips of my fingers. And it's this urge that makes me smile and nod when I recognize it in the writing of others.



Published on February 28, 2012 11:19

February 24, 2012

The Mother of All Non-sequiturs

Aphorism of the Day: A question is friendly or insulting in direct proportion to the ignorance it reveals… Or is it?


Definition of the Day - Attitude: 1) the only thing cheaper than belief; 2) a popular brand of fact repellant; 3) something you need to lose a lot of to fly under others' radar, but to keep a little bit of to avoid crashing.


I want to apologize to anyone I might have failed to reply to over the past couple of weeks. I would also like to apologize, for much the same reason, if I came across as curt or too cute in the replies I did make. I'm not sure I've ever typed so many words in my entire life, rushing to give every devil his or her due – or just to cover my ass! I did become frustrated on occasion, especially if I felt I was simply answering the same questions. The trolls who came by are the exception of course. In their case, I would like to apologize to my argument for chasing them away so quickly.


Just a few cool things to update everyone on. First, I just signed contracts for audio book versions of both trilogies, something many people have asked me about over the years. I have no idea what the timetable is, or any of the other details for that matter, but I will be sure to provide updates as that information becomes available.


Second, I posted two pieces in the Speculative Fragments section, one old, the other new. The old one is simply the Bestiary of Future Literatures piece I posted a while back. The new one is a Bestiary of Consciousnesses. Both are pretty wank and over-the-top technical, and will likely interest only those keen on the philosophy of mind stuff.


The third is something of a biggie. I've decided to invite a couple of guest-bloggers to TPB. The idea is to continue discussing things pulp and philosophical, only bringing in voices possessing less bluff and bluster and far, far more expertise. Your opinions, as always, are most welcome. (I swear to Gad I won't try to pinch you in the nuts!)


It is presently 2:40PM Double Standard time. I expect to be selling coffee and bagels through the site before the year is through.



Published on February 24, 2012 11:42

February 20, 2012

A Eulogy for the Unconscious (1895 – 2012)

Aphorism of the Day I: It's not that sexists are more stupid, only that they aim their stupidity in a less intelligent direction.


Aphorism of the Day II: Feminists are primates too.


 


So I've been reading Guy Claxton's The Wayward Mind, an interesting (but curiously out-of-date–but then everything strikes me that way since reading Boyer and Atran) historical account of the Unconscious. At the same time I've been thinking about the debate we've been having the past few weeks, and all the times the unconscious/subconscious has been referenced as an accusation or argumentative tool. And I realized that I had, quite 'unconsciously,' stopped believing in the Unconscious.


I no longer think there's any such thing.


Just as a reminder, lest people read too much into my claims, for me, 'no longer thinking there's any such thing,' simply means, 'I think I've found a better cartoon.'


Claxton is a fantastic writer, and for this reason I heartily recommend the opening chapters of The Wayward Mind to anyone interested in secondary world-building: he does a great job evoking the ancient mindset, the way our ancestors, lacking our sophisticated nomenclature for interiority, had no choice but to turn to the external world. Post Boyer's Religion Explained, his accounts seem inadequate and even romantic (he actually relies quite heavily on Julian Jaynes), but the vividness of his writing makes these worlds come alive. And his master narrative could very well be true: that the Unconscious finds its historical origins beyond the horizon of the outer, objective world, then gradually migrates to its present locus beyond the horizon of our inner, subjective world.


The Unconscious, in other words, is of a piece with gods and underworlds, a way of comprehending What We Are Not in terms of What We Are. It's literally what happens when we rebuild Mount Olympus into our skull. This explains why it's such a curious double gesture, why, in the course of disempowering us, it allows us to own our abjection. My skull, after all, remains my skull, and if What We Are Not resides inside my skull, then 'I own it.' We bitch about our Unconscious to be sure, but we cluck and joke about it as well, the same way we do when our children are obstinate or wilful. 'A Freudian Slip' is almost always an occasion for smiles, if not laughter.


And now I want to argue there probably isn't any such thing. Why?


Well, as critics in the past have noted (most famously, Descartes), it seems incoherent to talk about 'having' experiences, memories, beliefs, desires, and so on that you don't have. The rejoinder, of course, is that we simply have to have these things if we're to make any sense of the fact that we can make implicit things explicit. Humans act out all the time: short of the Unconscious, how are we going to make sense of that?


One way to redescribe this dilemma is to say that we have this powerful intuition of sufficiency, that consciousness is something whole. But we find ourselves continually confronted with indirect evidence of insufficiency, ways that compel us to conclude that consciousness is incomplete.


The 'Unconscious,' I now think, is simply another way for us to have our cake and eat it too, to acknowledge insufficiency while endorsing a kind of orthogonal, crypto-sufficiency.


One of the things I find the most embarrassing about my old post-structuralism turns on precisely this point: I literally cannot count the number of times I've referred to the 'post-modern subject,' decentred, fragmented, and so on and so forth. I now see that this was little more than dogmatic window-dressing: surrendering the Cartesian subject is pretty damn cheap. You acknowledge that the sufficiency of the Self is illusory, and yet you blithely assume that all its constituents are quite sufficient, or at least sufficient enough to keep all the traditional discourses afloat–which is what you have to do to rationalize the institutions that make them possible (so much of the discourse you find in the humanities, if you think about it, is given over to justifying the institutional importance of the humanities).


It strikes me as laughable that I literally thought I was radical, that I had defected from the traditional game of giving and asking for reasons in any meaningful way. It seems little more than fashion, now, a product of an old ingroup self-identification. There is certainly nothing 'radical' about it, and even less that is courageous. If you buy into the 'decentred post-modern subject' you're cringing in the trenches with everyone else, bragging because at least you fired your rifle into the air. But you're as much an intellectual coward as those you critique–or at least far from the hero you think you are (but then, we're pretty much all cowards here in the post-industrial West, or any place where sales and the consensus of ingroup peers worry you more than the Mob or the Censors.)


Why? Because the Question of Sufficiency pertains to everything. Why should we suppose, for instance, that norms are sufficient? Or purpose or even intentionality more generally? What does it mean to yield the house when you leave the walls, floor, and roof intact?–except that you think you're cooler because your interior designer decorates in French.


The Unconscious is yet another concession to sufficiency. The prospect of Radical Insufficiency, the possibility that we're wrong not only about the subject, but everything subjective as well, suggests that very little might separate the projection of psychologies (gods, demons, spirits) beyond the rim of the world and the projection of intentionalities (beliefs, desires, memories) beyond the rim of consciousness. In other words, it suggests there's just no such thing as the Unconscious.


So what is there? I mean, there has to be something that explains all our neuroses…


And there is: the mad, biomechanical complexities that comprise the brain.


Note how dramatically this transforms the old landscape. Gone is the bipartite geography of consciousness and Unconscious, the strangely reassuring sense of some Cold War stand-off between antithetical rivals. If we see 'the mad, biomechanical complexities that comprise the brain' as a substitute for the Unconscious, then in a sense you would have to say that everything is 'unconscious,' insofar as those mad, biomechanical complexities exhaust the brain.


But if everything is 'unconscious,' what does it mean to be conscious?


This absurdity suggests that what we're actually talking about are different levels of description, one psychological, the other neural. The fact is, once we concede the possibility that the projection of the traditional/intuitive categories of consciousness to explain the insufficiencies of consciousness (the ways our actions exceed our awareness) is not quite coherent insofar as we're assuming the sufficiency of intentionality to explain its insufficiency, then the whole game changes. We can take what Dennett calls the 'intentional stance'–a psychological perspective–to get a grip on causal complexities that would boggle us otherwise, certainly, but as with taking the 'design stance' with reference to evolution, we always have to be ready to retreat, to acknowledge the gross, cartoonish nature of this heuristic way of speaking, and be ready to concede the biomechanical where necessary. In a sense, we would be talking about an 'As-if-unconscious,' one that, paradoxically, is completely coextensive with consciousness–leaving us with the suggestion that consciousness is itself, somehow, unconscious.


And this just goes to show that consciousness itself is every bit as much up for grabs as the Unconscious here–that perhaps we need to reserve a family plot.


But that's another fucking twisted story.


 



Published on February 20, 2012 05:55

February 18, 2012

Um, Does Anybody Got a Mint?

I'm e-stinky.


I know it. I've been told it too many times for it not to be true.


Isn't it funny the way no one will tell you if you're real stinky, but if you're e-stinky they just can't stop going on and on? It must be liberating, telling people how e-stinky they are, saying: "I hate to break it to you, bud, but your 'E' – peeeyew, does it stink. You smell like fish giblets wrapped in a shit taco, like a corpse's asshole–Do you know that? Man o' man does your E stink! Jesus is going to have to come up with a whole new brand of salvation to get your stinky E ass into heaven!"


That's the great thing about the internet, the way it liberates us from all those things, like civility, compassion, reasoned debate. Adulthood.


Because it's kinda like a schoolyard, isn't it? Name-calling. Mocking and jeering and sneering. Fook-you! No fook-you, motherfooker!


And if you happen to be e-stinky – look out!


As far as I can tell I've been e-stinky my whole life. I'm a veteran of a thousand flame wars. I've slain trolls (who are attracted to e-stink the way a rutting bull moose is attracted to female urine), scaled bastions of unbridled, wanton stubborn stupidity. I have seen grown men go Full Retard over the definition of 'illness.'


And I've been beaten down, hurt, mystified. I've turned to friends and asked, "Why does my E stink?"


"You come off as a pompous ass."


"What if I made my ass more humble… Would it stink less then?"


"Maybe."


"Maybe?"


"I don't know what to say. It's that whole philosophy thing, maybe. All that shit about facts and reasons and fallacies and cognitive shortcomings…"


"That's what makes me stinky?"


"Well… There's your personality."


"Fook-you!"


"No fook-you, motherfooker!"


So that was that. I was e-stinky. I was doomed to smell the smell that dare not waft its name. It was part of who I am. Part of my essence.


What was I to do? Withdraw from the e-world entirely, live in self-imposed e-exile, e-embittered?


Or should I e-embrace my e-brand? Stand e-tall, e-proud, and e-declare, "I e-stink, Goddammit… How do ya like me so far!"


"Fook-you!"


"No, fook-you, motherfooker!"


Sometimes I would become self-conscious, I admit. It's a hard thing, forcing yourself to be proud. I would do things like shove breath-mints and air-fresheners up my e-ass, just to see if that would work. I once even shaved my armpits. But to no avail. Take my advice: Never, ever, try soaking your balls in Old Spice. Those guys who write those commercials are a bunch of lying motherfookers!


Other times I exulted in my e-stink. I would breathe deep, and cry out with savage abandon. The heavens would resound, and the beasts would slink, eyes watering, into their lairs. Olfactory bulbs shrivelled at the mere rumour of my approach. My smell was my roar!


But it got to be too much, the schoolyard, the name-calling, the mocking, the jeering and the sneering. Even my heart seemed to e-stink after a while.


So I retired from the e-world, travelled the lonely ways of the e-stinky man, pensive, reflective, searching for someone wise who could tell me what to do. And so I found myself in Tibet, searching for a famed Tibetan Lama renowned for his wisdom regarding all things stinky and E.


I climbed mountain scarps. The wind whipped tails of white from the surrounding peaks. When I arrived at the monastery, I knocked on the great, bronze-gilded door. It opened of its own volition. I would have been awed, had I not been convinced it simply retreated from my odour.


I stepped into the gloom following a little dust-devil of snow. I blinked, my eyes adjusting. The heavens howled beyond the timbered dark. Even the Gods, it seemed, were, like, totally grossed out.


And there she sat, arrayed in woolen vestments chased in gold.


I said, "You're a chick!"


She shrugged. "Yeah, well…"


I swallowed. "Do you know why I've come?"


"Oh, my dear child," she said. "You are such a darling foo–oof!" Her face crumpled into waxen origami. "Fuck me! Is that you?"


"Please! Guru-chick! You must help me!"


She held her breath as if bottling a toke. "You smell like shit, but, hey…"


"Thank you!" I cried, falling to my knees.


She regarded me, her look stern, as ruthless as a Republican Primary. "I like the lay of your sausage, kid, so I'm afraid I'm going to have to be cruel."


"Yes! Yes! Please be cruuuel!"


"Well," she said, "there's two things. First, you gotta remember it's just the fucking internet. Christ! I mean, puh-leeease! People say all kinds of crap they wouldn't dare say in real life. They, like, even got names for it and shit. Syndromes and what-not."


I knew this already. I caught my breath in sudden worry. "And? And?"


She paused and crouched forward, made like a gagging cat. She leaned back gasping, caught a thread of spittle on her thumb. I heard her mutter something wise and arcane, but all I could make out was "demon dick" and "like anal sex with decomposing chimpanzees…"


Words that would haunt the subconscious galleries of my soul for, like, forever-ever-ever-ver


"So you reek. It's all relative, kid. Stick to arguing with those even e-stinkier than you."


With regal grace, I slowly lowered my forehead to the floor, thinking, What kind of loser would just leave a tack lying on the floor like that? Fucking owww


"Thank you, oh most wise Guru-chick–thank you!"


She nodded. "Oh, yeah… Have you ever tried washing your nads in Old Spice? Cause that's what you smell like–" She paused to simultaneously burp and cough.


"Balls?"


"Horseshit, kid. Horse. Shit."



Published on February 18, 2012 05:07

February 16, 2012

That Empty Place

So it was a suite party at a con and this guy sits opposite to me, tells me we've met before. I apologize, tell him about my crazy inability to recognize faces that I don't encounter on a regular basis, to the point of not being able to remember relatives, former students, and old friends.


He nods, not quite believing. "I just wanted to tell you how much I loved your books."


"Cool. Cool. Have you had a chance to read Neuropath yet?"


He makes a face, pokes his glasses against the bridge of his nose. "Yeah…"


"And?"


"I'm afraid I didn't like it."


"You're not the first to say that! What put you off?"


He scans the crowded room. "Well… Horror is a tricky thing… There's a fine line between being frightening and being… sordid, I think…"


This took me by surprise. "Sordid?"


It's funny, encountering criticisms that stick at these things. I never get offended, just… intent.


"Yeah… Hey, do you mind if I ask you if you have any kids?"


"I think you just did! But, ah, no."


A nod screwed too tight to really signal agreement. "I knew it."


"How?" I asked, already knowing the answer.


"I just didn't think anyone who had kids could have written that book."


Now I'm intent times two.


"Am I allowed to take that as a compliment?"


"Now, maybe. But when you have a kid, not so much."


This is a reconstruction: all I remember clearly are wire-rim glasses, the gist, and the way crowds of people can fence an illuminated conversation in dark. But it stuck, and I've thought about it more than once since I've become a father…


Like him.


Because the fact is, I totally agree. There's no way in bloody hell I could've written Neuropath now.


Why? Because some experiences are arguments. The difficulty lies in discovering just what that argument is…


I still can't get over the amount of traffic TPB has received over the past week. I'm still processing everything. It'll probably take a week or two before I feel confident enough to draw any solid conclusions. As it stands, I'm probably most troubled by the role life experience and reasoning play in all this, and how the two evolved into antagonists as the debate progressed.


This world is a hard one – certainly harder than any one life. And in this sense we're all victims. The pivotal question, given all the sound and fury that we've witnessed, is one of what this legitimizes. I was a victim, growing up, the victim of–you guessed it–another victim. We grow into our power, and time is prone to rob us of our retribution, but we hold onto that empty place, don't we? the place where our actual victimizer, the one raw with whiskey and an unshaven jaw, once stood. So we fill it with others, with groups or children or anything safe, so that the circuit might be complete. We laugh. We sneer. We shame and we strike. We pass piety around like a joint. We become as ugly as we are. This is the cycle, isn't it? great and miserable and so very sordid, playing across the generations, along all our degrees of separation.


And here we are playing it out. Again.



Published on February 16, 2012 06:57

February 10, 2012

The Halftime Show

Aphorism of the Day: Moral certainty is simply greed dressed as poverty.


I've been sitting on this for several days.


An example of how it should be done? Or yet more perfidious evidence of my insatiable need to appropriate?


Either way, it's both wonderful and wise.



Published on February 10, 2012 14:47

February 8, 2012

Gonads versus Nomads

Aphorism of the Day – Otto's Law: Thou shalt not cite Internet Laws, for they have as much logical force as laughing at an immigrant's clothing.


I lost this fight before I even started it – I assumed as much going in. I spent most of yesterday following responses across the web, and I've been quite entertained by the Bakker-slamming going on. I'm pretentious. I'm this, that, and the other thing, my actual argument nowhere to be seen (well, that's not quite true: someone raised the Criteria Question to make fun of it for being incomprehensible). The issue is who I am, and more importantly, what kind of group I belong to: white, male, university wanker, thin-skinned author, etc. For most of those condemning me, agreement simply is intelligence, and disagreement simply is idiocy. Elodie's troll posts, for instance, struck a handful as a decisive blow, something that somehow proves how much of an idiot or fool I am. (As much as the old practical reasoning instructor in me wants to scream, the kid who grew up arguing against the extreme right-wing views of his tanked father around the dinner table understands full well.) The way their judgments make my point for me is almost absolutely invisible to them. In fact, at one point, less than 1% of the people checking "Requires Only Haidt" even bothered clicking on the link to Haidt's interview. Something Haidt would likely have predicted.


But the fact is, there's another side to this problem, the side represented by all those who think I'm wasting my time. I live my life on the border of two very different worlds, one where I'm 'questionable' because of my gutter humour, another where I'm either 'out there' or 'pretentious' because of my vocabulary. What Haidt calls 'lifestyle enclaves' is very relevant to what seems to be happening to contemporary NA society – to what Murray's data tracks in Coming Apart. I could spend all my time writing and talking for people who already agree that, yes, Sexism is a devilishly difficult and complex thing, and so on. I could exchange all the slaps on the back of the head for pats on the back… Anyone can. All you have to do is say the right things to the right people. Be groupish.


The problem is, I really do buy my own bullshit. And now that confirming data is being published, from both the left and the right, I'm downright, well, elated and excited. As strange as it might seem, I literally feel renewed by all this, knowing that the guesses upon which I raised my career are probably true. Giving in to groupishness is inevitable. We're hardwired to gather about certain attitudinal fires, warm our hands over the recitation of certain words expressed in certain ways. Intellectuals make fun of evangelicals. Evangelicals make fun of intellectuals. Conservatives make fun of liberals. Liberals make fun of conservatives. Everybody is right, completely convinced their group has won the Magical Confirmation and Affirmation Lottery. But a few of us genuinely strive to be nomads, not in the boutique sense of philosophers like Deleuze, but in the sense of not really belonging to any institutionalized group, because we strive to belong to humanity at large, a humanity trapped in a game theory nightmare.


I'm romanticizing, I know. But that's because I'm actively recruiting. I'm trying to convince as many damn people as possible to be mindful of the ways their own psychology fucks them up. This isn't some foofy New Age claim: we are not what we intuitively think we are as a matter of scientific fact. And given that 'Believe!' is far and away the most pervasive slogan in our culture (after 'Buy!'), it follows that our culture is delusional – and that you, dear reader, live in a dreamworld the degree to which you buy into it. You can start here if you don't 'believe' me. But the data is becoming mountainous, and it keeps piling up. Politicians and corporations are making use of it because it works. You should too.


(And just to be clear, this applies just as much, if not more so, to the liberal intellectual types reading this. In some ways, you're the worst of the bunch, simply because you think you've already found your way past all the delusions, when in fact, you've simply found a way to fortify them. In a very real sense, all your 'critical training' is the product of the Middle Ages. Until you know what your brain is doing, you do not know what you are doing.)


I lost this battle before it began, if you tally up the brains, pro and con. Sure. And it definitely would have been better if I had picked someone who hadn't 'reviewed' me, the way I did with Vox (who only reviewed me afterward), simply because it would have closed down one obvious motivational liability. But the Dude was just such a golden example, and I'm as vengeful as the next guy.


But that's what it takes, isn't it? Losing battle after battle, changing a few minds every time. 1% here, 1% there. The research isn't going away. Neither are the institutions eager to manage our perceptions. The only question is whether we'll come to collective grips with it in time.


I spent about half an hour last night, lying in bed and pondering sexism and what I was attempting in my books, worrying all the different angles. What makes a book sexist? The perception of a certain percentage of a certain victim group? The intent or attitude of the author? The actual social consequences of the book? It can't be the first, because that would mean that works like the Bible, for instance, only became sexist once they were perceived to be so. Calling a thing something does not make it so. It can't be the second, because the author's intent, unfortunately, does not abide like magic pixie dust in the text. No. It has to be the third, the actual social consequences of the book. For me, that remains an open question, and a worrying one.


Then I fell asleep disgusted because the Leafs had lost to the Jets. What a pisser that was.


In other words, pile-on people. Condemning, lampooning, labelling: these things come so easy because they're so natural. Indulge away, if it makes you feel superior and connected. Just don't fool yourself into thinking you're any less of an idiot than those you target.


We're all idiots around here.



Published on February 08, 2012 09:02
