David Teachout's Blog
March 17, 2015
To Live Passionately Is to Be Open to Hurt
Passion burns. Love hurts. And in the immortal words of the J. Geils Band, “love stinks.” Limiting the topic of passion to love would portray an unfortunate falseness, though certainly the two terms are often interchangeable. We speak of having a passion for something almost in the same breath as declaring we love it. That so many of us do not have the gift of a poet limits our word choices, but it does not limit the underlying feeling. Our passions, as much as our frailties, perhaps more so, define the scope of our lives.
Situations that turn otherwise sane people into fever-eyed imps entice us in our fantasies and are often emulated in our media. We embrace passionate intention despite, perhaps even because of, the potential for a poetically narrated descent into pain and heartache. I’ve sat on those shores, watching the tide go out, wanting to stretch my arms and hold in the ocean, knowing even as I attempted it that the hopes and aspirations would seep from between my fingers and flow past the curve of my embrace. I’m not even talking here of love’s loss, though that has happened, but of the similar feeling of futility after following a dream of employment only to have the proverbial rug pulled out from under me. I’m hardly alone in either experience. Within five years, half of all new businesses will have failed. Despite the falsity of the claim that half of all marriages end in divorce, the stat is so ubiquitous it has achieved cult status as truth. For clarification, as of 2002 the divorce rate was around 33%, and that only after couples had been married for ten years.
These stats indicate a shelf-life of sorts for passion. It sustains us only for so long before circumstances, often out of our control, hurl us in another direction. This is, clearly, a simplification of social experiences, but the information still paints a picture of eventual, undeniable suffering. We push forward valiantly anyway. Why?
Are we all like the berserker of legend, leaping madly into a field of our passions, only to succumb to the pressures of the battlefield that is life? Are we all like whirling dervishes, arms outstretched to extend the power of our desire into the chaos of existence, only to eventually collapse from an emotionally riotous cessation of bodily function? From the foolishness of youth’s embrace of first love, to the entrepreneur who ends his life after his dreams crumble around him or, in some circumstances, even after feeling the brush of success, to the dizzying display of vengeful madness in divorce, all these experiences and more point to the answer to those questions being a resounding “YES!”
Life cries out to be embraced. We can be a dispassionate lump, but maintaining such a disposition often requires almost as much fervor as the person leaping about with seeming abandon. The vast number of our religious creations speaks not to an underlying truth, but to the very human desire to form a coherent narrative of nature’s whim. The varied and constantly evolving forms of romantic attachment speak not to a diminished quality of love, but to the deeply passionate belief that love chafes at artificial boundaries. The endeavors of entrepreneurs and scientists speak not to the frailty of our existential understanding, but to the soulful drive to shape out of disparate parts something that will last beyond our individual lives.
The constantly shifting variables in our interconnected universe, combined with the fleeting nature of our emotional states, make passion a profoundly human affair. From John Green, in The Fault in Our Stars:
“I believe the universe wants to be noticed. I think the universe is improbably biased toward consciousness, that it rewards intelligence in part because the universe enjoys its elegance being observed. And who am I, living in the middle of history, to tell the universe that it—or my observation of it—is temporary?”
Hurt and heartache may go with passion and love like interlocked genetic matrices of lived experience, but the acknowledgment of such should never stop us from continuing the barrel-run of our lives. As we are the universe thinking of itself, so too is it our elegance being observed in every pursuit of that which will live on beyond us.
© David Teachout

March 13, 2015
The False Help of Equality: Being Nice Can Limit Truth
There is a seemingly benevolent desire to give people the benefit of the doubt. This desire increases in direct proportion to how uncertain one’s own position is felt to be. We take the scientific empirical principle that all knowledge is tentative and use it to support the false conclusion that all opinions are therefore equally tentative. The equality is enticing; we’re a magnanimous species when it serves the greater value of getting along.
So then, on one hand we like equality and want to be nice, publicly supporting the notion that all opinions are equal in potential validity. However, on the other hand we continue to operate under the premise that we ourselves individually are usually correct. The situation allows for a double-barreled blast to progress in understanding, a self-reinforcing system of diminished judgment. The former public support allows people to dismiss the advice of those who may be better informed or offer better arguments in order to buttress the latter ego-centric notion that we’re likely correct anyway.
These findings were recently reported in the Washington Post by Chris Mooney, looking at a recent study published in Proceedings of the National Academy of Sciences (PNAS). The experimenters noted a certain criticism of cognitive bias research, that it has been based largely on western cultures, and set out to cross-check by using populations in Denmark, China and Iran. Participants were set up in two-person dyads and given basic observation tasks, with each person’s accuracy recorded and each asked how confident they were in their decisions. Later, after many trials, when faced with a new example of the task, each was asked whether they would continue with their own judgment or go with that of their partner.
Mooney notes from the study that:
…people have an “equality bias” when it comes to competence or expertise, such that even when it’s very clear that one person in a group is more skilled, expert, or competent (and the other less), they are nonetheless inclined to seek out a middle ground in determining how correct different viewpoints are.
This “equality bias” was consistent regardless of whether a dyad was given a running tally, visible to both members, noting how much better one person was at the task. When it came time to answer whether they’d continue with their own answer or take that of their partner, those who had been less correct under-estimated the other person and those who had been more correct over-estimated the other person. Even a monetary reward for giving the correct answer on the next trial did not change the behavior of the groups.
The authors of the study stated it this way:
Remarkably, dyad members exhibited this “equality bias”—behaving as if they were as good as or as bad as their partner—even when they (i) received a running score of their own and their partner’s performance, (ii) differed dramatically in terms of their individual performance, and (iii) had a monetary incentive to assign the appropriate weight to their partner’s opinion.
Further, and even more troubling, those who were consistently less sensitive to the variations in the trials reported higher levels of confidence in their answers, regardless of their historical lack of accuracy. In other words, those who are less accurate tend to have greater confidence whether or not they’ve been informed of that fact. This greater confidence, despite having no actual reality on which to base it, contributes to later decision-making by the group, as people tend to follow those who exhibit greater levels of confidence. The “fake it till you make it” advice is one example. From years of experience in security work, I can personally attest to situations where a projection of confidence, despite my being objectively overwhelmed by superior numbers, defused potential conflict.
Let’s clear up potential confusion. 1) At the group level we support the notion of equality in the validity of opinion, whereas 2) at the individual level we tend to support the ego-centric notion that we’re more correct than others, resulting in 3) a group dynamic in which, faced with displays of confidence and regardless of any objective facts marking someone as more or less accurate in their judgment, we tend to go along with the middle ground while leaning toward the greater display of confidence. A toy simulation after this paragraph illustrates the cost of that middle ground.
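To make that cost concrete, here is a minimal sketch in Python, assuming hypothetical individual accuracies of 0.8 and 0.6 on a binary judgment task. It illustrates only the logic of equality bias, not the study’s actual procedure:

```python
import random

# Toy dyad: one member is right 80% of the time, the other 60%.
# Both accuracies are made-up numbers, chosen only for illustration.
random.seed(42)
TRIALS = 100_000
P_GOOD, P_BAD = 0.8, 0.6

def observe(truth, accuracy):
    # The observer reports the truth with probability `accuracy`.
    return truth if random.random() < accuracy else 1 - truth

def dyad_accuracy(strategy):
    correct = 0
    for _ in range(TRIALS):
        truth = random.randint(0, 1)
        good = observe(truth, P_GOOD)
        bad = observe(truth, P_BAD)
        correct += strategy(good, bad) == truth
    return correct / TRIALS

# Strategy 1: always defer to the demonstrably better member.
defer_to_best = lambda good, bad: good

# Strategy 2: "equality bias" -- treat both opinions as equally
# credible, splitting disagreements down the middle (at random).
equal_weight = lambda good, bad: good if good == bad else random.randint(0, 1)

print(f"defer to better member: {dyad_accuracy(defer_to_best):.3f}")  # ~0.80
print(f"equality-biased dyad:   {dyad_accuracy(equal_weight):.3f}")   # ~0.70
```

Under these assumed numbers the equality-biased dyad lands near 0.70, noticeably below the 0.80 its better member would achieve alone, which is the sense in which the bias “impairs collective decision-making.”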
The authors conclude:
Social exclusion invokes strong aversive emotions that—some have even argued—may resemble actual physical pain. By confirming themselves more often than they should have, the inferior member of each dyad may have tried to stay relevant and socially included. Conversely, the better performing member may have been trying to avoid ignoring their partner.
Essentially we don’t like participating in actions that contribute to anyone’s suffering, particularly our own. This may not make much sense in a world of internet trolls and multimedia bullying, but notice that those actions are not about group decision-making. Instead, they deal with an “Us vs Them” mentality, where concern is for the establishment or realignment of a power dynamic.
Mooney, again taking from the study, notes:
…it also shows how our evolution in social groups binds us powerfully together and enforces collective norms, but can go haywire when it comes to recognizing and accepting inconvenient truths.
Inconvenient truths are any that make us uncomfortable, that go against the assumed foundations, conscious or otherwise, of a world-view. This is why how one seeks to interact with the world is so profoundly important. If knowledge is not based on special delivery from an authority, if all people are potentially capable, through generative dialogue and skeptical inquiry, of increasing their understanding of any aspect of experience, then there is no reason to hold unnecessarily to our opinions. Unfortunately we often don’t see the world that way, instead seeking through authority, real or imagined, to support the ego-centric mischaracterizations of our competence.
Regardless of which world-view we follow, the tendency to project a false degree of equality in group decision-making is unhelpful at the minimum, and potentially dangerous at the extreme. By allowing opinions that are demonstrably unequal to hold equal weight in decisions, as a society we stumble even faster down the path of negative returns for our actions. This doesn’t mean being deliberately mean, but it most certainly means recognizing that truth and reality are not concerned with a desire to be nice.
© David Teachout
________
Research article in PNAS (purchase required to view):
Equality bias impairs collective decision-making across cultures

March 10, 2015
Faith: It Doesn’t Mean What You Think It Means
As a human being I’m interested in broadening the understanding of my experiences and increasing my knowledge by identifying what I’m ignorant of and then looking to fill in the gaps. My humanity also determines the limits to fulfilling those desires. I have particular interests by virtue of being me; not every subject draws me the same way. I have time limitations, so I have to choose on a daily basis what to read and what to study, and plan accordingly for the future. I have career limits, in that my professional obligations concerning psychology direct me to continued education along paths associated with it and not, say, with electrical engineering. I also, though this is controversial and not without a great number of caveats, have limitations on my intelligence; there are items I study which I struggle to understand while other people have already passed me by. All of these limits are part of being human, but none of them determines, prior to the inquiry itself, whether I could understand by virtue of that very humanity; they are only particular limits of my own.
As an atheist I am often confronted by the simple declaration from religious adherents of “you have faith too,” or in its more arrogantly adolescent form: “it takes more faith to be an atheist.” The confusing nature of this argument becomes immediately obvious when I inquire as to just what is meant, resulting in some example of the form “you have faith that x will happen,” where “x” is filled in by the sun rising tomorrow, the continued love of friends and family, or the like. From the days of my own belief, I can recall the apologetic of referencing wind or air when attempting to describe how the Holy Spirit works. Then, as now, the response to such attempts is to point out that the examples being referenced are not at all comparable.
Faith, as used colloquially, is an indication of ignorance that has not yet been resolved. Proving a claim under this version means gathering more information and clarifying one’s thoughts. At all times it is accepted that the potential knowledge exists, though in the end any claim will likely be tentative. I can claim to have faith (if I want to use the term) that the sun will rise tomorrow, due to a history of experience and the demonstrated uniformity of nature, and, if further clarity is needed, I can endeavor to study astrophysics.
To believe by faith, in the supernatural religious sense of the term, means to accept that knowledge of who we are, what we can know and how to act is forever beyond the reach of our understanding as human beings. This is why there is the felt need for special revelation from a Deity. Knowledge and the ethics that come from it are no longer derived from human rational inquiry, but are at the mercy of deistic whim. In other words, knowledge claims and moral imperatives are true not because they are demonstrably true via social consequence or ethical philosophy, but because Deity says so. If at any time it could be stated that Deity declares something true or moral because it is true or moral in itself, not only is there then no need for a Deity to clarify, but there exists knowledge and morality above or beyond the Deity. Such a situation simply cannot be accepted by adherents to religious dogmatism.
Caught in the bind of not being able to rationally justify their beliefs, believers in the supernatural like to use faith in two different ways and then get confused when called on it.
Here’s the clarification:
1) Faith: I don’t know x, but I could know x with sufficient ability and study because the material/natural world is inherently understandable or amenable to rational inquiry. If faced with a situation where x is required, then I have faith that someone else understands x for the situation to have occurred.
Example: I do not understand structural integrity and engineering, but I have faith that a bridge I drive over will not suddenly collapse or turn into a dragon and fly away with me. Given ability and time I could learn engineering, and for the bridge to exist, someone else must know enough about it to have contributed to building it.
2) Faith: I can’t know x, nor could I ever know x despite dedicating time and energy to the study of it because x is inherently outside of the natural universe. I (pretend to) know x is true because I have faith that x exists precisely because I can’t nor ever could understand it.
Example: All claims of the supernatural by various religious ideologies and many metaphysical claims made by mystical traditions that are said to “simply be known” or “known by intuition or resonance with the universe”.
There you have it. The first is otherwise known as standard human ignorance; the latter is what claimants of the supernatural are using, and not all religions make such claims. This is why I separate religions that have them from those that don’t.
Let’s be even clearer. One cannot have the faith of the supernatural religionist without first positing the existence of the supernatural. However, the only justification for such a notion is the very concept of faith in the first place. Faith, for such a person, is not the trust of the scientist in the indomitable power of human inquiry, but a tremulous hope, a wish-filled thought, whispered into the dark expanse of ignorance, that the dark doesn’t really exist, that it has already been filled or in fact has always been filled. Faith, in this sense, is a completely made-up idea; it has no purpose and no meaning beyond propping up statements of belief in things that otherwise are not open to skeptical inquiry. Conveniently, the Grand-Filler, bastardizing Aristotle’s first cause, is always the very deity the believer wants others to believe in originally.
So yes, as an atheist I have faith, but it is not a faith that sets me apart from the rest of humanity. Nothing that I know now or could know in the future is due to some special revelatory experience; there are no ideas I hold now or will hold in the future that are not inherently open to skeptical inquiry and criticism, and that do not hold the potential for being discarded. Faith from within humanity exists as an identifier of ignorance and a push for continued questioning, with a wariness for declarations from an authority. Such is not the case for the religious dogmatist.
© David Teachout

March 6, 2015
My Journey from Faith to Freedom
Looking back at growing up, there was never a question of going to church on Sundays, attending Sunday School as a child and eventually graduating to the “adult” experience of sitting in pews, singing old songs and listening to a sermon. Beyond Sundays there was a mid-week youth group of some kind, where I attempted, often in vain, to bond with other believers of my age group.
The experiences varied through the years with new churches and leaders. Pews gave way to chairs, old hymns and organ gave way to contemporary music and a band, sermons delivered by reading a print-out gave way to emotional declarations and fiery speech. What didn’t change was the often stated and personally lived idea that Christianity and the love of God were not only a belief but a life choice. When I was around 11, the exact age escapes me, I wrote a small letter to my mother telling her I was ready to “invite Jesus into my heart.” The last I knew, decades later, she still had it.
While I was “saved” at that tender age, the push to be so was one of fear and concern. I remember asking my pastor at the time whether I’d still go to Heaven if I was struck by a car before I was baptized. I was assured that my soul was secure and God would understand. I don’t remember the degree of relief I felt upon being brought down into that large bin of water and back out, but I’m certain it was quite great. Water dripping down over my eyes, I can still vividly recall the seemingly endless rows of standing believers who I was told had become my extended spiritual family. Nobody was happier or more teary-eyed than my parents.
As a teen, a few years later, I “rededicated myself to Jesus,” a fact of life for many evangelicals who grow up in the church. Having found myself living a “backslidden” life, the fervency of “getting right with God” cannot be overstated. I went after this feeling with a dedication reserved for the self-righteousness of the teenage mind or the reformed sinner. Being a Christian was more than a statement, it was a living ideal.
In high school, even among church groups, I was referred to as “church boy.” That I would move on to pursue ministry as a pastor was more than a felt call from God; it was a foregone conclusion. The world needed to know the Truth. Unfortunately for those around me, this idealistic mania led to a degree of disregard for the feelings of others that I still flinch at when remembering. From high school to bible college I held close to the idea that all truth was God’s truth and therefore to study was to unlock more and more of God’s revealed nature. I was completely blind to the idea that such a mindset would lead to the end of my faith. Had I been told this was the case, I’d have undoubtedly dismissed it. That only happened, so I told myself, to lesser people of faith.
College brought a host of new experiences I had been dreading and wishing for. The dread was the continued perception that even at this level most believers were not interested in truly exploring the extent of their faith. I had thought that certainly at a bible college I would find a similar degree of fiery faith in others that I felt had so far belonged only to myself. In a campus of a couple hundred, I found only a few. Those few, however, became a supportive core.
I’d studied philosophy before, C.S. Lewis and Francis Schaeffer, but never to the extent I did now. Similarly with theology: I was introduced to the history of Christianity, studies in biblical literature and the changing history of theological ideas. I ate it up, even as more and more of the questions I was coming up with went unanswered. The boy who had written in his senior yearbook picture caption that “to live in Christ is gain” was now beginning to grasp at straws.
9/11 hit with the emotional weight of a freight-train. The towers fell, people died and suddenly I was confronted with this whole other religious faith called Islam. More importantly, I could not shake the similarity their fiery fervor had to mine. Faced with actions I found horrendous, but impressed nonetheless with the depth of their dedication, I had to figure out why they were wrong and I was right.
All of the defenses for Christianity I found could be applied to Islam. Their usage of the term “faith” shook me. I couldn’t determine why their usage was wrong and mine was right, not with any degree of self-honesty. Having long since seen Christian apologetics poorly portray other ideas, even lie about them, and now seeing the way faith was used to justify any act, I was left with little choice. Picking up George H. Smith’s “Atheism: The Case Against God,” a book I’d been warned against a year earlier, I became an atheist by page 50.
If that were the end, this story would never need writing. What began in November of 2001 with a declaration to friends changed my life in more ways than I can count. The sense of loss was overwhelming at times. I’d lie awake at night shaking with the deep fear of damnation and the perceived loss of love from a God I no longer believed in. All of my notions concerning politics, sexuality, friendship, morality and career were destroyed. Feeling as if I’d been lied to by my spiritual family and had lied to myself for so many years, there was a lot of anger to work through.
The depth of that hurt, and seeing the damage done to self and others by religion, has been instrumental in guiding me to where I am today. The healing took years, but through the love of friends and the support of others, I no longer lie awake at night. Rediscovering what it is to truly love life through an appreciation rather than condemnation of my humanity has been a journey of growth and happiness that will never end. I do not want it to. Where Christianity left me with nowhere to go, a dead-end for progress, atheism and the resulting love for life’s infinite pleasures leaves me full and unbroken. Where God could take anything away, his absence means there’s always more to explore.
© David Teachout

March 4, 2015
Removing the Disease from Mental Health
When hitting, catching or throwing a ball in a sport, there is an easy link made between action and effect. We can declare rather simply “I did that” and, depending on the skill level, feel a greater or lesser degree of personal pride. This mentality bleeds over into much of what we do as human beings, our ego as the “I” of our personal narrative or story constructing each and every situation from that mind’s eye view. Events happen “to” us, we “do” things, and the happenstance of positive or negative confluences is always about helping or hindering us. While this self-centered focus is most pronounced in the positive thinking movement, it did not simply arise out of nowhere. The social power of positive thinking is due to its inherent resonance with how we usually go about our lives.
This billiard-ball mental construct of causation becomes linked with mental health through the common disease model of psychiatry. Illnesses happen “to” us, as if they are invading organisms, and while there is certainly truth to this when it comes to germs and viruses, the distinction loses its validity when it comes to mental health. Just where does the mind start and end? The common belief is that there is an “I” and then an “other” that encompasses everyone and everything else that makes up experience. Attempt for a moment to think only of an “I” in complete isolation from anything and anyone else, and the futility of doing so should, if one is honest, lead to the stark conclusion that such easy boundaries are anything but.
The difficulty with viewing mental health through a disease or illness model is compounded and promoted by the common understanding of drugs. As Peter Kinderman says:
…while it is clear that medication (like many other substances, including drugs and alcohol) has an effect on our neurotransmitters, and therefore on our emotions and behavior, this is a long way from supporting the idea that distressing experiences are caused by imbalances in those neurotransmitters.
Were such a direct relationship to exist between drugs and behavioral change, there’d be no variation. We’d never have to worry about negative side-effects because we’d know exactly what y-behavior would result from x-drug. Thankfully most psychiatrists, when discussing this more openly, are well aware that such a simple explanation is patently false, but the lack of effort to dispel this myth only contributes to the already prevalent American mentality of “better living through pharmacology.”
The interconnectivity of lived reality positions us, our individual “I’s,” as one variable among many in the building of experience. It is only hubris and the continued disconnection from our social relationships that push us to believe we can pick and choose what and how events and people affect us.
We learn as a result of the events that happen to us, and there is increasing evidence that even severe mental health problems are not merely the result simply of faulty genes or brain chemicals. They are also a result of experience — a natural and normal response to the terrible things that can happen to us and that shape our view of the world.
I remember the Pinewood Derby in Boy Scouts, where, using a kit and with the assumed minimal help of our parent(s), we’d create a small car that would then be placed on a ramp and timed in races. Placing mine ever so carefully at the top of the ramp, I had a healthy understanding that the result was not entirely up to me, but the sense of failure upon not winning remained all mine. The same can be said for mental health. We can place our lives, through our perspective, at the top of the ramp of life, but this is by no means the only determinant of whether we will succeed in our projected goals. The structure of the car (our genes and familial history), the friction of the ramp itself (our continued relational dynamics) and the proximity of other cars (natural and social effects outside our immediate control) all play major roles in the race. To look at mental health through the lens of a disease model is to assume the only thing that exists is our perspective and whether something “out there” forces itself upon us.
We need to place people and human psychology central in our thinking. Psychological science offers robust scientific models of mental health and well-being, which integrate biological findings with the substantial evidence of the social determinants of health and well-being, mediated by psychological processes.
Much is being said recently about whether there can be a scientific means of determining well-being or human flourishing. Michael Shermer and Sam Harris, among others, have sought to explain how morality and therefore positive human community can be developed through the skeptical and steady inquiry of scientific study. Whatever the particulars of such endeavors may create, the central view cannot be ignored: we cannot come to a legitimate, helpful understanding of mental health and well-being without beginning with the study and appreciation of the human organism. Doing so means seeing humanity as an integrated organism within the holism of reality, not as something set apart.
We must offer services that help people to help themselves and each other rather than disempowering them — services that facilitate “personal agency” in psychological jargon. That means involving a wide range of community workers and psychologists in multidisciplinary teams, and promoting psychosocial rather than medical solutions.
A medical understanding of humanity is only one perspective, but it fails if it seeks to remove all others. This does not mean any and all perspectives are equal in validity, only that a full and proper articulation of human flourishing will be found when looking through a holistic lens. This means an integral model, multi-disciplinary studies and an appreciation that the psychological cannot be removed from the social any more than either can be removed from the material biology of our being.
© David Teachout

March 2, 2015
The Passing of the Most Human: Tribute to Leonard Nimoy
“A life is like a garden. Perfect moments can be had, but not preserved, except in memory. LLAP”
The death of Leonard Nimoy did not go unnoticed, as even a passing perusal of social media on Friday and through the weekend would have shown. The above quote, his last tweet, could not be surpassed in encapsulating his life were thousands more words added on. Like any memory, what Nimoy achieved says as much about those he touched as about the man who lived.
Star Trek, both in television and in film, has, in its many forms, sparked the imagination and wonder of countless people. That flame lit so many fires of the human spirit with its pursuit of an unabashed narrative of scientific discovery and the hopeful future of a humanity dedicated to peaceful exploration. Any violence, certainly at times heavy-handed, seemed always to remind us that the search for truth and the awe of discovery are always tempered by the acknowledged destruction of preconceived notions, not least of which concern ourselves as individuals and as a species.
Nimoy, playing the beloved character of ‘Spock,’ half-human and half-Vulcan, portrayed the humanity we all struggle with. The difficulty of his Vulcan, logical side attempting to get past his human, emotional side was increased by the sheer magnitude of the difference. The purity of his logical training made the emotional like unto a willful child screaming in the midst of a presentation on quantum mechanics. Ironically, nowhere was this struggle more evident than in those who called him friend and shipmate. The perpetual and at times silly desire of Spock’s human compatriots to get him to express his feelings juxtaposed brilliantly with their ever-present need to seek him out for analysis and information to make better decisions and understand perplexing situations.
In this, the quality of Nimoy as a person blazed beautifully. His aloofness in analysis displayed the brilliance we can all pursue, even as the ferocity of his emotional states reminded us of the passions that are always within. Nimoy, through Spock, and in his own life, attempted always to display the best of our shared humanity.
Growing up in the Hebrew faith, Nimoy told the story of how the Vulcan salute came from the blessing gesture of the Kohanim, which he witnessed as a boy in synagogue at a moment when the congregation was told not to look.
The Kohanim gesture forms a shape similar to the Hebrew letter ‘shin,’ the first letter in the name Shaddai, a name given for the Almighty or God. This is why the congregation was to look away, for the name of God is sacred and not to be despoiled by man. By making such a symbol a public greeting, I cannot help but ponder the significance. Here is a sacred symbol, meant to be kept hidden, being used as an acknowledgment of the connection shared in public communion. When returned, the two split-fingered signs create the whole of the original symbol; in essence, each greeting is a public awareness of the shared god between them.
That unity, our shared humanity and struggle with our nature, finds solace and purpose in the community of interconnected beings greeting one another, regardless of the length of time such a connection may exist. Whatever the term “god” may mean for some, the communal reality of a shared existence, pregnant with struggle and meaning, gives birth to the only form of transcendence we may ever know.
Leonard Nimoy has passed, but his memory lives on. The garden of his life continues to grow with every struggle of logic and emotion leading to a greater embrace of our humanity, and in every greeting where we acknowledge the shared divinity of our lives. To live long and prosper need not end at death; it certainly hasn’t for Spock.
© David Teachout

February 28, 2015
When Religion Kills: The Cowardice of the Dogmatic
As reported in the NY Times, Avijit Roy, a Bangladeshi-American blogger critical of religion, was murdered yesterday in Dhaka, Bangladesh, hacked to death by machete-wielding religious adherents. His wife was attacked as well and is currently in critical condition. If we are to follow in the footsteps of the current Pope, that bastion of progressive values championed by liberals ignorant of Catholic dogma, Roy got what was coming to him. Comparing criticism of religion to the cursing of one’s mother, an equivalence with playground childishness that is as ridiculous as it is inaccurate, the Pope declared such use of free speech wrong and said the person doing so should expect to be punched. That he disavowed murder as an appropriate response is completely undone by this rationalized approval of violence.
In recent polling done by Pew Research (May-June of 2014), when asked to describe, by reference to temperature, how positively or negatively a particular religious ideology is viewed, Americans scored atheists at 41 degrees, only one degree warmer than Muslims. Considering all the press concerning the possible rise of hate-crimes against Muslims, the lack of coverage concerning antipathy towards atheists seems to tacitly endorse the notion that such people deserve to be hated. This wanton disregard by leaders and social institutions shows the lie of their supposed dedication to making the world a better, more informed, place.
I write about religion. I have been and will continue to be critical of its social utility, the paucity of its ideas and the absurdity of self-righteous claims that its adherents are inherently more capable of living moral lives. I have not received death threats. That experience is not something I aspire to have. I like to believe that were such a result of criticizing religion to befall me, I would continue on, albeit with a greater concern for international travel. Thankfully I live in a nation where violence as an outgrowth of religious belief is considered hypocritical. Unfortunately, that social assumption blinds people to the reality of those who live day-to-day under the constant fear that daring to criticize will result in their death.
Before continuing, a moment to clarify about religion. Not all religious ideologies are the same. Some, like certain interpretations of Buddhism, allow for and praise intellectual rigor and skeptical inquiry. While even there a wall exists beyond which inquiry is discouraged, this is a far cry from dogmatic religions where subservience to authoritarian dictate is of the highest value. Christianity, Islam and orthodox Judaism are the largest examples of the latter. What sets them apart is the slavery of will and mind to Authority, where morality is considered such not by virtue of its relationship to relational reality, but by the decree of said Authority. In such a system, murder is not murder if done at the behest of Authority.
This abdication of skeptical inquiry and moral responsibility is why discussion of who the “true believers” are is absurd. The murder of Avijit Roy, those killed at Charlie Hebdo, still others blown up by Christian terrorists and the countless others killed day after day in religious warfare throughout the world, are all done by those screaming allegiance to the fundamental dogmas of their stated religions. If such abject dedication and avowed belief have nothing to do with their actions, then the same must be said of a believer’s positive actions.
The fact is, religion of the dogmatic type has no Authority because there is no Deity, which means the “Authority” in question always comes down to the individual and group espousing their allegiance. This is why faith, as a means of justifying belief, ends up in practice being nothing more than a statement of personal desire. The adherence to that fictional authority is why it is so easy to utilize religious ideology for the purpose of murder and mayhem. That gap, that absence, must be filled by something else and self-righteous anger is rather impressively capable of overwhelming empathy and reason.
Liberals and conservatives alike will decry the actions of those responsible, trotting out the judgment of “cowardice” to describe the murderers. Such a judgment is one step too far. The cowardice is not in their behavior, it is in their previous abdication of moral and intellectual responsibility by adherence to religious dogmatism.
Sartre declared that “Man is nothing else but what he makes of himself.” This is why some religious believers murder and others support equal rights. The capacity for murder, just as the capacity to do good, is not inherent within religion; it is indelibly part of human character. What dogmatic religion does is remove the possibility of reaching the good through the only principle possible: that of acknowledging the sanctity of life as itself. The road to destruction is not paved with good intentions; it is put down stone by stone through the refusal to humbly contemplate the uncertainty of our own existence.
Allegiance to a dogmatic authority, removing the need for such humble contemplation, is the original cowardice that breeds violence. It is the initial step towards separation of self from others, of making a world of ‘Us vs Them,’ where any questioning of that authority is to be inevitably met by violence in temperament and action. Our first move away from such a world…
…is to make every man aware of what he is and to make the full responsibility of his existence rest on him. And when we say that a man is responsible for himself, we do not only mean that he is responsible for his own individuality, but that he is responsible for all men.
Writers such as Avijit Roy, like the rising tide of “nones” in religious polling, want this kind of world. Questioning and the criticism that results, whatever the focus of that inquiry is, should never be looked at as justifying the murder or harm of another.
© David Teachout

February 25, 2015
Finding Our Own Character In Stories
Influence. Down through history stories have been used to convey moral imperatives, contribute to social cohesion and provide entertainment. From sitting around a fire in a group to the traveling minstrel to the literary author, our species’ story-telling has increased in scope even as it has changed in form. To reach thousands if not millions now takes only the steady click-clack of a keyboard followed by the eventual release of a digital file onto the internet. Many lament how language seems to be languishing in the land of tweets and internet slang, while others point, with notable examples, to bullying and trolling, but story-telling has not diminished in its potential to expand our understanding of self and others.
Stories are not simply what is found between pages or expressed in lengthy verbal expositions. They are the projections of our imagination, relationally connected with the world we perceive and the world we wish existed. Classic literature is one form, but so are novels, so-called non-fiction and even the comments we leave online in conversational threads. Each example is a behavioral extension of an identity brought up to deal with a particular social situation. An identity is a container for a particular set of mental constructs and learned behavior. We have many of them, like hats or outfits we put on for different occasions. The encounter with something or someone new is where the ebb and flow of personal change can and does occur. That encounter is a story or narrative.
A team led by Judith Lysaker of Purdue University conducted an experimental intervention with 22 second- and third-grade students who were exhibiting difficulties with both reading comprehension and social relationships. The children participated in a reading group that focused not only on understanding the text but also on exploring the thoughts, intentions, and emotions of the characters in the books. For example, the students were asked to write a letter from a particular character’s perspective.
This experiment, remarked upon in “Literary Character,” Observer, Vol. 27, No. 7, September 2014, led to increases in reading comprehension and in the ability to imagine the emotions of others. Active learning through the study of a story opened up the imaginative potential such that empathy increased. This was done through reading and good teaching. How much more could be done if narratives were seen in everything people do?
Further research, working with adults and focusing in particular on theory of mind (ToM: understanding other people’s mental states), noted:
The study suggests that not just any fiction helps foster ToM. Unlike popular fiction, literary fiction requires intellectual engagement and creative thought from its readers, Kidd and Castano assert.
“Features of the modern literary novel set it apart from most bestselling thrillers or romances,” they wrote. “Through the use of … stylistic devices, literary fiction defamiliarizes its readers. Just as in real life, the worlds of literary fiction are replete with complicated individuals whose inner lives are rarely easily discerned but warrant exploration.”
The key points here concern “intellectual engagement” and “creative thought.” These intentional behaviors broaden a person’s ideas about others through empathic union, utilizing the imagination to see connections otherwise unnoticed. Those “complicated individuals” “warrant exploration” because of the desire of the reader. The existence of that desire, not simply the form of the literature, is paramount, as the same article notes:
even books populated by wizards, dragons, vampires, and aliens can strive to depict important aspects of the human experience.
How often, in school or when meeting other people, has there been a feeling of disconnection, followed by suddenly finding oneself drawn to a subject or a person? That draw is the feeling of intentional connection. Such is not limited to academic pursuits, but pervades our entire lives, pushing us to pursue or not pursue one or another personal relationship or experience.
The dual meaning of character, as both a person and an attribute, is important to keep in mind. We express our self-character through the myriad identities used to interact within experiences. The depth of our attribute-character is determined by what we focus on when doing so. What do we see when someone tells a story, regardless of its length or depth? The friend or family member or lover is regarded positively, but even they are limited to the extent that the shared stories work at doing so. Far more limited, then, is the perceived enemy, often broken down to a mere caricature. Even the casual comment carries with it an identity and narrative that connects back and outward into a history we are not aware of.
The extent of our ability to respond to changing experiences is determined largely by our perceptual capacity. This is not the simplicity of the positive thinking movement; this is the experiencing of life as a tapestry of narrative threads, each with degrees of good and bad, positive and negative, often both at the same time. Books and the stories they contain are gateways into other worlds; so much more so, then, are people and the narratives they’re spinning. To pursue a broader understanding is to engage creatively, illuminating one’s own experience through the lens of another and, in so doing, finding out all that’s been missing.
© David Teachout

February 23, 2015
Is There A Limit To Skepticism?
Skepticism is in no small part related to judgment, or if that term is too emotionally charged, discernment. Both require an active conscious engagement with lived experience and a humility that rests on the acknowledgment that all knowledge is tentative and open to revision. This is not to say that conclusions cannot be reached, hence the problem with declaring judgment to be in some way inherently wrong. We reach conclusions or judgments almost every waking moment of every day, from the mundane act of walking out of our homes having judged that we will not immediately perish, to the complex issues of work ethics and the treatment of family and friends. Saying no to judgment is to say no to the relational reality of existence. What skepticism supplies is a general paradigm for shaping those judgments.
Skepticism, or rational inquiry, should be differentiated from denial. The qualifier of rational inquiry is the difference. For instance, “climate change skeptic” is a false characterization when in fact “denialist” is more proper. The difference is one of rational inquiry, and more so of doing it honestly. The mere asking of questions is not sufficient to count as skepticism, for such is determined at least equally by the purpose in doing so. The skeptic has investigated the material, understood the consensus and then offered further inquiry to search out possible variations in the understanding of that data. The denialist is not interested in understanding what has come before and very rarely offers possible variations; they simply declare other conclusions wrong by personal fiat. Doing so does not support skepticism, it utterly denies the validity of it.
With skepticism and its supporting structure of rational/scientific inquiry, there often is an inevitable question concerning where a door of inquiry should be opened. In the movie “Jurassic Park,” the chaos theorist, confronted with the genetic manipulation that had been going on, observes that the scientists always asked whether they could, never whether they should. With gene research, neurological studies into the extent or even existence of free will, particle accelerators, and A.I., this question has increased in frequency and, for many, in legitimacy.
As all group dynamics entail individual manifestations, whether skeptical inquiry has a limit depends on how you want to live your life, whether as a consciously active participant or as a passive observer. If the latter, then certainly skepticism is a truly awful value to hold and practice. Passivity can be found in non-judgment, but also in the realm of dogmatism. Ideological fundamentalism abhors skeptical inquiry, for any and all truth or knowledge is pre-determined in structure if not often also in specific form. Such is instantiated in the parental frustration of meeting incessant inquiry with “because I say so,” and relatedly in the religious sphere with declaring “God” to fill in any gaps where inquiry is not allowed. In both cases, and any other where an authority stops inquiry, the principle of wonder-filled skepticism is to be removed, resulting in the destructive notion that you have as much purpose in life as a rock. The difficulty of doing this is rather impressively huge, since even daily behavior requires some slight skeptical inquiry, even if almost wholly unconscious. To remove all inquiry is to cease living.
The issue of a limit to skeptical inquiry is not that black and white, however. Though it may come as a shock given what has been written so far, I find the limits of inquiry to be a rather personal question, at least in its practice. The value of skeptical inquiry is unassailable if one wishes to live life fully, but the practice of it is not the same for everyone. For some people, asking questions can lead to a paralysis and indecisive depression such that life becomes unbearable. This is not a situation I’d wish upon anyone.
I offer two criteria for engaging in skepticism:
1. Am I questioning because of a felt need to understand more fully, or to prove I am right? If the latter, cease and desist: you’re not inquiring, you’re simply rationalizing or walking down the path of denialism.
2. Does the questioning pertain to how I live my life, such that it may help my own well-being and flourishing and therefore that of others? Merely questioning for the sake of doing so often leads to emotional futility. Inquiry assumes a relationship between self and lived reality, where consequences will not simply belong to the individual but have ramifications for any who go down a similar journey. To avoid cynicism, a consequence of separating ego from relational reality, one must have a goal in mind. I can think of none better than the betterment of our living, or eudaimonia (the Greek term for “human flourishing”).
Skeptical inquiry serves life because it is the active process of seeking out the perceptual limits of our current mental paradigms. Call it the evolution of consciousness where adaptation is the means of growth and progress. Whether through dogmatism or lack of inquiry, the limits of present knowledge will lead to an environment inimical to life. Only through a deliberate, rational skeptical inquiry can our lives as individuals and as a species adapt to a dynamic existence.
© David Teachout

February 19, 2015
Non-Judgment Is Moral Abdication
Backing away from judgment and letting people walk their own path is the post-modern moral paradigm, a way of holding back from full engagement with ideas. This may be due to a deeply held though personally hidden concern that one’s own ideas haven’t actually been thought through all that well. However, being uncertain about one’s own ideas is not a valid justification for withholding judgment.
Notice that the fundamentalists of political and religious ideologies have no concern about passing judgment; there’s quite often a fanatical gleam in their eye as they engage in a favorite pastime. Baseball, apple pie, and fire and brimstone: it’s the new recipe for nationalist pride that often walks hand-to-heart with religious zealotry. True, often these crusaders have thought through their ideas about as much as a child caught with hand in cookie jar has considered possible excuses, but the felt belief is decidedly the opposite, hence the internal justification for passing judgment on anything that isn’t them.
Here lies the difference between the skeptic and the fundamentalist: the ability to recognize, accept and dwell in a world that constantly shows ideas to be forever tentative and constantly in need of revision. There’s little wonder that the fundamentalist of any kind is far more prone to violence than the skeptic or liberal. When faced with a world that doesn’t match their vision, the skeptic asks more questions whereas the fundamentalist begins flailing about in externalized frustration.
Remaining silent, as the fundamentalist is often shrilly noting, is tantamount to acceptance. Despite the opacity of many of their ideas, this point really should be considered. First, let’s be honest, it’s not as if the judgment isn’t happening; it’s just not being expressed for others to hear. Second, given the inevitably tentative nature of all knowledge claims, refusing to voice judgment truly is a form of tacit approval. In a world of a billion voices, the person who screams loudest isn’t always the winner; rather, it’s the person who keeps talking well past when others have stayed silent. Being uncertain is less about not being able to judge and more about accepting that such judgments could, with more inquiry and future expanded understanding, be deemed wrong, in part or in entirety.
At the heart of declaring one’s belief in not declaring judgment is the inability to see that such is itself a judgment, not necessarily about the belief in question, but about the ability to interact with reality. Beliefs are the means of interaction with reality at a personal level, the lenses through which our behavior manifests within and as experiences. If all beliefs could even remotely have equal footing in relating accurately with reality, then non-judgment would be a legitimate enterprise. Fortunately, this is not the case, as even a couple of examples can attest. Take the notion that a hot burner will cause damage upon touching it. An accurate belief certainly, but saying to someone who believes otherwise that their belief is perfectly alright since it’s their journey to go down seems at best irresponsible and at worst sadistic. Another example: someone believes that walking down an alley is perfectly safe, but you know a murderer is lying in wait there. Again, if all beliefs are inherently equal then of course you’d not say anything, but frankly any decent moral inquiry will determine that by not telling the person you have condemned them to pain and suffering otherwise avoided.
The difficulty with not wanting to judge or “honoring all paths” is one of false equivalence and conflation. The phrase has two distinct potential definitions: 1) an acceptance that due to knowledge being tentative, we should refrain from absolute judgments as to the entirety of belief structures and 2) all beliefs are inherently equal such that even contradictory statements from two or more people about reality cannot be determined as having a greater or lesser degree of accuracy. The first accepts that there is a singular reality and multiple perspectives that open up various aspects of it, rather like viewing through a window that is only partially clean in spots. The second says that there is no shared reality and any attempt at communication is based on this false assumption. Clearly the second cannot actually be lived, as we go about our lives with the embodied acceptance that some views of reality are more or less false than others. If you doubt this, remember the burner.
Which leads back to the moral issue. Declaring all paths equal in kind, in that they are all beliefs, is one thing, but declaring all paths equal in dealing with shared reality is quite something else altogether. Saying the latter is, as noted above, tantamount to abdicating all morality. A moral system is of little use if actions, of which ideas are the progenitors, have no consequence. A shared reality means social interconnectedness; it means that what one does is indeed a variable in determining the actions of others. Frankly, the notion of non-judgment is often found in people who have not faced the horrors of bad beliefs or think that they shouldn’t have to. Female genital mutilation, throwing acid on the faces of raped women to signify their uncleanness, flying planes into buildings, blowing up abortion clinics, beheading apostates, denying services to the poor and downtrodden, etc. These are all examples of beliefs that have effects beyond the personal; there is simply no way of referring to oneself as a moral person if non-judgment is the course of action when faced with such atrocities.
To say “don’t judge” may in fact be worse than the beliefs themselves; at least those shouting such horrors are willing to stand in the full passionate fervor of their self-righteousness. Non-judgment utterly lacks any conviction for caring for others. Instead, the refusal of judgment is promoted as being in some fashion better than attempting to diminish the very real pain and suffering of those succumbing to the bad beliefs of others.
Judgment, assessment, coming to conclusions, however one might refer to it, the lack of it is not morally superior. Judgment does not necessitate going down the path of zealotry. Truly it is the zealot who has given up judgment, as judgment requires constant engagement with reality. What the zealot has substituted is ideological dictatorship, a despotism of the mind where reality has no sway. We move forward in the world, as individuals, as social groups, as a species, not by ignoring reality, whether through zealotry or abdicated judgment. Rather, progress of any kind requires active engagement with the world and each other, which at times may mean, even as we keep ourselves open to further questions, noting that some ideas are simply not helpful and even wrong.
© David Teachout
